Top 8 Microsoft Teams security issues
Due to the COVID-19 pandemic and the shift to remote work, use of collaboration tools such as Microsoft Teams, Zoom and Google Meet has increased sharply. It is important to be aware of the security risks that deploying Microsoft Teams introduces into your IT environment and how to mitigate them.
Businesses and consumers are looking to leverage the video conferencing, collaboration, document sharing and instant messaging features of these tools. However, these tools come with their own security issues, especially for organizations, as they can be used to gain access to internal and sensitive or confidential data.
McAfee released a whitepaper in December 2020 on the top 10 issues affecting Microsoft Teams based on research conducted with more than 40 million McAfee MVISION Cloud users worldwide, and discussions with IT security, governance and risk teams.
This article provides a summary of the security issues identified with using Microsoft Teams and how to mitigate them in your IT environment.
Below are several security issues associated with deploying Microsoft Teams in your IT infrastructure.
1. Guest users
Microsoft Teams allows members of an organization to collaborate with guests (i.e., external users like vendors, clients, customers and contractors) by granting guests access to documents and resources in channels, chats and applications.
This access gives guests the ability to access and share files (from the Teams channel and SharePoint), chat, have online and live meetings, make calls, create tasks and more. Microsoft Teams allows tenant administrators to either enable or disable guest access. This means that teams are either open to all guests or are closed to all guests.
The issue here is that it is difficult to implement an enterprise-wide setting that lets tenant admins restrict channels to internal-only teams, public teams and teams with only authorized guests (as needed).
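Since Teams offers no built-in per-team guest allow-list, an organization would have to enforce one in its own audit tooling. The sketch below is hypothetical logic for such tooling (the policy table and function names are illustrative, not a Microsoft Teams API):

```python
# Hypothetical per-team guest policy: "open", "closed", or an explicit
# allow-list of authorized guest domains. This is NOT a Teams API --
# it sketches logic an org could run against its own membership audits.
TEAM_GUEST_POLICY = {
    "finance":     {"mode": "closed"},
    "marketing":   {"mode": "open"},
    "vendor-proj": {"mode": "allowlist", "domains": {"vendor.example.com"}},
}

def guest_allowed(team, guest_email):
    """Return True if the guest's email domain is permitted in this team."""
    policy = TEAM_GUEST_POLICY.get(team, {"mode": "closed"})  # default deny
    if policy["mode"] == "open":
        return True
    if policy["mode"] == "closed":
        return False
    domain = guest_email.rsplit("@", 1)[-1].lower()
    return domain in policy["domains"]
```

A periodic job could compare actual team membership against these policies and flag or remove unauthorized guests.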
2. Access from unmanaged devices or untrusted locations
Microsoft Teams allows users to connect to any Teams channel from any device (including unmanaged devices). The risk exists that users with unmanaged devices can connect to any Teams channel, download sensitive information (chat or files) and then compromise the data stored on their devices when they become victims of a cyberattack.
Login location can be an indicator of risk (e.g., compromised credentials or devices). Tenant administrators should consider implementing risk-based authentication: rules or policies that limit or block access from unmanaged devices or certain locations, or that restrict access to specific IP ranges or blocks.
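The IP-range restriction described above can be illustrated with Python's standard-library `ipaddress` module; the CIDR blocks below are documentation-reserved example ranges standing in for real corporate ranges:

```python
import ipaddress

# Hypothetical allow-list of corporate IP ranges (CIDR blocks). A
# risk-based rule could block, or require step-up authentication for,
# sign-ins originating outside these ranges.
TRUSTED_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def signin_from_trusted_range(client_ip):
    """True if the sign-in IP falls inside an approved corporate range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in TRUSTED_RANGES)
```

In practice this check would be delegated to the identity provider (e.g., named locations in Azure AD Conditional Access) rather than implemented by hand.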
3. Screen sharing and displaying sensitive/confidential data
Microsoft Teams allows users to share content in a Teams call or meeting via screen sharing from any location or device. This feature is quite handy as it enhances communication and boosts productivity and collaboration during meetings or calls.
However, the risk exists that, while screen sharing, other applications or communications platforms may inadvertently display sensitive message alerts or data on the presenter's screen, which are then visible to all attendees, compromising sensitive data.
IT and tenant administrators should also assess the risk of allowing guests to request control of a shared screen.
4. Malware uploaded via Microsoft Teams
Users and guests in a Teams channel are allowed to upload files. Guest devices are not managed by the organization; therefore, the status of the device is unknown. The risk exists that unmanaged devices (either from guests or unmanaged devices of internal users) may contain malware, or malicious files may be uploaded to Teams channels.
IT and tenant administrators should consider scanning files for malicious content before they are uploaded into Microsoft Teams.
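One simple pre-upload control is matching file hashes against a threat-intelligence blocklist. The sketch below assumes a hypothetical blocklist set; a real deployment would run a full antivirus engine (or rely on Office 365 ATP, discussed later), since hash matching alone misses novel malware:

```python
import hashlib

def file_digest(data):
    """SHA-256 digest of the file contents, hex-encoded."""
    return hashlib.sha256(data).hexdigest()

def upload_permitted(data, known_bad_digests):
    """Reject the upload when its digest matches a known-bad hash.
    `known_bad_digests` stands in for a threat-intelligence feed;
    hash matching is only a first layer, not a substitute for AV."""
    return file_digest(data) not in known_bad_digests
```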
5. Data Loss via Microsoft Teams chat, file shares and other apps
Microsoft Teams enables users to share and collaborate via files and chat. It can be integrated with other cloud-based applications such as OneDrive, Box, Dropbox and Google Drive.
The risk exists that sensitive or confidential data may be exposed or uploaded by internal users or guests. IT and tenant administrators should consider whitelisting approved cloud-based applications. In addition, they should consider using data loss prevention (DLP) policies for Microsoft Teams, either to be notified when sensitive data is posted in Microsoft Teams or other cloud-based applications, or to block the post outright.
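At their core, DLP policies pattern-match content before deciding to notify or block. A minimal sketch, assuming two illustrative detectors (real DLP engines use many more detectors plus validation such as Luhn checks on card numbers):

```python
import re

# Hypothetical DLP detectors: US SSN-like and 16-digit card-like strings.
# Illustrative only -- production DLP validates matches and covers far
# more sensitive-information types.
DLP_PATTERNS = {
    "ssn":  re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def scan_message(text):
    """Return the names of DLP detectors that matched the message.
    A non-empty result would trigger a notification or block."""
    return [name for name, pat in DLP_PATTERNS.items() if pat.search(text)]
```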
6. Data residency
With the increase in data privacy laws and regulations focusing on the collection, processing and transfer of personal data comes a focus on data residency (i.e., restricting the storage of data to a specific geolocation). Many of these laws restrict where personal data can be stored; examples include Russia's Data Protection Act No. 152-FZ (2006) and the UAE's Regulatory Framework for Stored Values and Electronic Payment Systems and Consumer Protection Regulations v1.3 (2017).
It is important to be aware of the geographic region where your Microsoft Teams data resides, and of how the data privacy laws of that region, your own region and your consumers' or clients' jurisdictions affect your requirements.
7. Inconsistent control across applications
Microsoft Teams is just one of the many enterprise applications deployed in an organization. Due to the number of enterprise applications, the risk exists that security policies may be inconsistent across them. For instance, users may be blocked from sharing certain data via emails, but the data can be shared via Microsoft Teams.
It is therefore important that security policies be consistent across these applications. This includes file transfers via email; encryption and password protection on laptops, mobile devices and USB devices; whitelisting approved cloud-based applications; and more.
8. Risky behavior patterns
Microsoft publishes APIs that allow security solutions to apply user behavior analytics (UBA) to users, sessions and events. UBA searches for patterns and detects unusual or anomalous behavior, and can be used to minimize damage from bad actors or insider threats. Behaviors such as abnormal traffic patterns or events can be identified, blocked and remediated.
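To make the idea concrete, here is a deliberately minimal UBA-style check (not a Microsoft API): flag a sign-in as anomalous when the user has never before been seen signing in during that hour of day. Real analytics model many more features (location, device, traffic volume) and use statistical baselines rather than exact history:

```python
from collections import Counter

def is_anomalous_hour(history_hours, signin_hour, min_events=20):
    """True if enough baseline history exists and this hour of day
    (0-23) was never observed for this user before. With too little
    history, decline to judge rather than raise noisy alerts."""
    if len(history_hours) < min_events:
        return False
    return Counter(history_hours)[signin_hour] == 0
```

A security tool would feed this kind of detector from sign-in logs and route hits to blocking or remediation workflows.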
How do I make Microsoft Teams secure?
- Train users on the features of Microsoft Teams and how to mitigate the risk of sharing sensitive information when screen sharing
- Ensure unauthorized external domains are not allowed in Microsoft Teams
- Make sure DLP policies are enabled for Microsoft Teams to detect or block the sharing of sensitive data
- Check if external file sharing in Microsoft Teams is enabled for only approved cloud storage services
- Ensure tenant administrators are aware of the data privacy laws or regulations impacting their organizations and of where Microsoft Teams data is stored
- Ensure access to Microsoft Teams is limited to only authorized and managed devices via Azure AD Conditional Access policy
- Make sure Azure AD Identity Protection sign-in risk policy is enabled. This ensures risky behaviors (e.g., suspicious sign-ins) are blocked or require additional actions, such as password change or multi-factor authentication (MFA)
- Require sensitivity labels to be created and defined. This allows tenant administrators to regulate access to sensitive data created within Teams
- Enforce granular permissions on files shared within Teams, such as granting access only to specific people and blocking data exfiltration
- Ensure Office 365 ATP for Microsoft Teams is enabled. This protects your organization from inadvertently sharing malicious files. When a malicious file is detected, the file is blocked until further actions are taken by the IT security team
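As a sketch of the Conditional Access recommendation above, a policy created through the Microsoft Graph API takes roughly this shape; the display name is illustrative and the application ID is a placeholder to verify against your tenant, not a value from this article:

```json
{
  "displayName": "Teams - require compliant device",
  "state": "enabled",
  "conditions": {
    "users": { "includeUsers": ["All"] },
    "applications": { "includeApplications": ["<Teams-application-id>"] }
  },
  "grantControls": {
    "operator": "OR",
    "builtInControls": ["compliantDevice", "domainJoinedDevice"]
  }
}
```

Consult the Microsoft Graph conditional access documentation for the full policy schema before deploying anything like this.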
Sources
- McAfee, Microsoft Teams: Top 10 Security Threats
- Data Breach Today, Is Teams Safe? Top Ten Teams Threats Explained
- CIS Security, CIS Microsoft Office Benchmarks
- Microsoft, Best practices for securely using Microsoft 365
- Okta, Risk-based Authentication
- InCountry, Data residency law by country: An overview