Zoom is an interesting case study in the various ways that software can fail. The Zoom team has had to learn a lot of lessons quickly, including the pitfalls of reusing components, how to build security engineering into their SDLC and DevOps processes, and the need for CISO-level leadership.
In this article I want to walk you through some of the issues that were recently publicized. I’ll break them into categories to understand the mistakes made and the subsequent decisions that were necessary. There has been a bit of a pile-on with security professionals each taking their turn to tell Zoom how they could have done better. Some of the issues that were uncovered are truly concerning, while others are natural tradeoffs between security and usability. In some cases, Zoom was actually following best practices (like reusing components), but got bitten anyway.
Thankfully Zoom has responded well to these issues with good messaging and quick patches. I don’t envy the Zoom dev team right now; they’ve released dozens of patches addressing these and other issues since the end of February!
The point of this article is not to add to the pile for or against Zoom, but to use the situation as a lens to understand how we can all improve our security posture. For each of the issues, I’ll summarize the problem and its impact, and then give a recommendation for how Zoom could have mitigated it with better security practices. I also want to add that while all of this may seem clear in hindsight, I’ve seen how hard these types of security/functionality/financial tradeoffs can be in the real world.
I’ve sorted the issues from what I see as worst to least bad. The worst issues pose real risks to end users. The issues at the bottom of the list stem from Zoom misunderstanding risk or user behavior, and carry less potential for harm.
Shaky Cryptography Practices
Citizen Lab found that instead of reusing standard cryptography components, Zoom built their own libraries. While the team did follow standards (using AES, for example), writing custom crypto libraries is never a good idea. Cryptography can fail in very subtle ways that could allow an attacker to recover part or all of the encrypted message or metadata. Additionally, it looks like Zoom intended to use AES-256 when they were in fact using 128-bit keys. Unfortunately, this shorter key length weakens the encryption relative to what was advertised. Finally, Zoom was using ECB mode, which can lead to data leakage, as two identical plaintext blocks will produce identical ciphertext blocks. If the data stream has repeating data, an attacker may be able to uncover information about the input.
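To make the ECB weakness concrete, here is a small illustrative Python snippet (my own example using the widely reviewed `cryptography` package, not Zoom’s code) showing that two identical plaintext blocks encrypt to identical ciphertext blocks under AES-ECB:

```python
# Illustration only (not Zoom's code): AES-ECB turns identical plaintext
# blocks into identical ciphertext blocks, leaking structure to an observer.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)                 # a 128-bit key, as Citizen Lab observed
block = b"ATTACK AT DAWN!!"          # exactly one 16-byte AES block
plaintext = block * 2                # the same block, twice

encryptor = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
ciphertext = encryptor.update(plaintext) + encryptor.finalize()

# The two ciphertext blocks are byte-for-byte identical, so an eavesdropper
# learns that the plaintext repeats even without recovering the key.
assert ciphertext[:16] == ciphertext[16:]
```

A mode like GCM or CTR with a unique nonce per message does not have this property.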
This one is certainly bad. Nobody should ever roll their own cryptography under any circumstances. One of my favorite guidelines is to use “Boring Cryptography.” Boring cryptography means you should use algorithms and cipher suites that are obviously secure. The fewer clever tricks you use in your cryptography, the fewer places there are to make mistakes.
There are great crypto libraries out there and lots of libraries that make good crypto easy. Bouncy Castle is an example of an exceptional library that is certified and has been developed and improved over the last 20 years! I’ve been a fan of NaCl for philosophical reasons (they aim to make cryptography boring and less prone to mistakes), but I can’t fully endorse it as it’s not certified and uses some uncommon algorithms that I’m not familiar enough with to recommend. Note: this issue is going to take time to fix, and as of this writing no patch has been issued.
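As a hedged sketch of what “boring” usage looks like, here is authenticated AES-256-GCM via the `cryptography` package; the library and parameter choices are my illustration, not a statement about what Zoom should ship:

```python
# A minimal sketch of "boring" authenticated encryption with a vetted library;
# key generation, key length, nonce handling, and authentication all come from
# a well-reviewed API instead of hand-rolled code.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # an actual 256-bit key
aead = AESGCM(key)

nonce = os.urandom(12)                      # unique per message; never reuse with the same key
ciphertext = aead.encrypt(nonce, b"meeting media frame", b"frame metadata")
plaintext = aead.decrypt(nonce, ciphertext, b"frame metadata")
assert plaintext == b"meeting media frame"
```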
Poor Security Hygiene
Earlier this week Peiter Zatko (aka Mudge) wrote a Twitter thread (which partially inspired this article) about some of the issues that he and his colleagues have found in the Linux Zoom client. These issues highlight a basic misunderstanding of how to protect Linux binaries and how to use build tooling to enable secure defaults. Namely, Mudge highlighted that the binaries were not built with basic exploit mitigation techniques like DEP, ASLR, stack canaries, and more.
These features make it more difficult for attackers to exploit binary vulnerabilities, but more than that, they are table stakes for good development practices. They should be enabled on every build by default. Mudge’s thread includes some great guidance that I don’t need to repeat here, but turning on these mitigations at build time, enabling unit tests, and creating secure and reliable toolchains are a must.
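On GCC/Clang toolchains these mitigations come from build flags such as -fstack-protector-strong, -fPIE/-pie, and full RELRO, and they are easy to verify after the fact. Below is a rough, checksec-style Python sketch of my own (not Mudge’s tooling) that shells out to readelf and looks for PIE, non-executable stack, and stack-canary markers:

```python
# A rough, checksec-style sketch (my own illustration, not Mudge's tooling):
# shell out to readelf and look for PIE, non-executable stack, and
# stack-canary indicators in an ELF binary.
import subprocess
import sys

def readelf(flag: str, binary: str) -> str:
    return subprocess.run(["readelf", flag, binary],
                          capture_output=True, text=True).stdout

binary = sys.argv[1]
header = readelf("-h", binary)      # ELF header: type "DYN" implies PIE
segments = readelf("-lW", binary)   # program headers: GNU_STACK flags
symbols = readelf("-sW", binary)    # symbols: __stack_chk_fail implies canaries

pie = "DYN" in header
stack = next((l for l in segments.splitlines() if "GNU_STACK" in l), "")
nx_stack = bool(stack) and "RWE" not in stack
canary = "__stack_chk_fail" in symbols

print(f"PIE: {pie}  NX stack: {nx_stack}  stack canary: {canary}")
```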
Installer Issues
Objective-See reported a number of issues in Zoom, but one of the most important ones in my mind is calling unsigned scripts from a signed binary on macOS. When macOS runs a binary, it checks that the binary is from a trusted source, which decreases the likelihood of running malware on your computer. If the signed binary then calls an unsigned script, an attacker could replace that script with a malicious one to execute code on your machine. This happens only at install time, which narrows the window for exploitation, but with Zoom’s business growing so quickly there are a lot of installs going on right now!
Zoom resolved this quickly in version 4.6.9. They didn’t release the code change, so I can’t say for sure how they fixed the issue, but it is important that trusted code only calls other trusted code. Verifying the signature of every binary and script it invokes is a must.
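As a hedged illustration (not Zoom’s actual fix), one simple pattern is for the signed installer to refuse to run a helper script unless its digest matches a value pinned inside the signed binary; the function name and digest below are hypothetical:

```python
# A hedged sketch (not Zoom's actual fix): a signed installer can refuse to
# run a helper script unless its digest matches a value pinned inside the
# signed binary. The digest below is a hypothetical placeholder.
import hashlib
import subprocess
import sys

PINNED_SHA256 = "<sha256 digest shipped inside the signed installer>"

def run_trusted_script(path: str) -> None:
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    if digest != PINNED_SHA256:
        sys.exit(f"Refusing to run {path}: digest does not match pinned value")
    subprocess.run(["/bin/sh", path], check=True)
```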
Leaking Email Addresses
Zoom is Leaking Peoples’ Email Addresses and Photos to Strangers - VICE
Vice reported that Zoom’s company directory feature was poorly deployed: it treated some less common public email domains as if they were corporate domains, grouping their users together and sharing email addresses and photos with a bunch of strangers who never intended to share them.
Zoom quickly addressed this issue, but could have mitigated this before release by doing proper threat modeling. It’s important to bring many different viewpoints to the table when threat modeling. It’s likely that one of the people on the Zoom security team could have predicted this issue.
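As a purely illustrative sketch of the kind of check threat modeling might have surfaced (the provider list and function name are my own assumptions, not Zoom’s design), the directory feature could refuse to group users whose email domain is a known public provider:

```python
# Illustrative only: the provider list and function name are my assumptions,
# not Zoom's design. Matching domains alone should not imply "same company"
# when the domain belongs to a consumer/ISP email provider.
PUBLIC_PROVIDERS = {"gmail.com", "outlook.com", "yahoo.com", "xs4all.nl"}

def same_company(email_a: str, email_b: str) -> bool:
    domain_a = email_a.rsplit("@", 1)[-1].lower()
    domain_b = email_b.rsplit("@", 1)[-1].lower()
    if domain_a in PUBLIC_PROVIDERS or domain_b in PUBLIC_PROVIDERS:
        return False
    return domain_a == domain_b
```

A static denylist is still brittle; an allowlist of verified corporate domains would be the safer design.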
Credential Stuffing Attack
Over 500,000 Zoom accounts sold on hacker forums, the dark web
Credential stuffing works by replaying credentials that have already been disclosed in other breaches. If an attacker has a list of email/password pairs collected from other data breaches, it’s likely that some of those people have an account on the target service and have reused the same password there. This is one of the biggest reasons why it’s so important to use a password manager like LastPass, Dashlane, or 1Password.
I don’t think this is really Zoom’s fault; it’s a password reuse issue which, at least partially, falls on the shoulders of the end user. However, Zoom could leverage a service like the HIBP Pwned Passwords API to check whether somebody is using a breached password. Credential stuffing is an attack that can affect any service out there, not just Zoom.
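Here is a minimal sketch of that kind of check at signup or login time, using HIBP’s k-anonymity range endpoint (only the first five hex characters of the password’s SHA-1 hash ever leave your servers); the function name is my own:

```python
# A minimal sketch of a breached-password check using HIBP's Pwned Passwords
# k-anonymity API; only a 5-character hash prefix is sent to the service.
import hashlib
import requests

def password_is_breached(password: str) -> bool:
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}", timeout=5)
    resp.raise_for_status()
    # Each response line is "HASH_SUFFIX:COUNT"
    return any(line.split(":")[0] == suffix for line in resp.text.splitlines())
```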
Automatic UNC Path Linking
Zoom Lets Attackers Steal Windows Credentials, Run Programs via UNC Links
Zoom reused a common Windows component for rich text that automatically detects links. One link type that it detects is a UNC path (the ones that start with \\). If a user clicks on such a link, Windows will automatically send the user’s hashed credentials to the linked server. If the user has a weak password, it may be possible to crack those hashes.
The security industry recommends reusing well-tested components instead of reinventing the wheel. Zoom did that here and got burned by some unintended consequences. However, most of the underlying issues are in the component they used or in Windows itself. Zoom did a good job of responding to this by disabling automatic UNC linking.
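A hedged sketch of that kind of mitigation on the application side (my own illustration, not Zoom’s patch): when the chat UI detects a link-like token, refuse to make UNC targets clickable:

```python
# A hedged sketch (my illustration, not Zoom's patch): decide whether a
# detected link should be clickable and refuse UNC paths outright.
import re

UNC_PATTERN = re.compile(r"^\\\\[^\s\\]+\\")   # e.g. \\server\share\file

def should_linkify(candidate: str) -> bool:
    if UNC_PATTERN.match(candidate):
        return False
    # Only turn well-understood web URLs into clickable links.
    return candidate.lower().startswith(("http://", "https://"))
```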
War Dialing & “ZoomBombing”
‘War Dialing’ Tool Exposes Zoom’s Password Problems — Krebs on Security
It’s possible to guess Zoom conference IDs and jump into somebody else’s conference session. This is a classic tradeoff between usability and security. If we want to support phone-only participants, the IDs must be numeric. Ten digits may have seemed long enough, but they may want to consider longer codes; the likelihood of guessing a valid ID increases with the number of active meetings on the system. I like the introduction of the waiting room feature for this, but clearly passwords are the way to go.
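A quick back-of-the-envelope sketch shows why guessing works at scale; the meeting and guess counts below are hypothetical placeholders, not Zoom figures:

```python
# Back-of-the-envelope math: how likely is a war-dialing attacker to hit a
# live meeting? The counts below are hypothetical, not Zoom's real numbers.
id_space = 10 ** 10          # 10-digit numeric meeting IDs
live_meetings = 1_000_000    # hypothetical concurrent unprotected meetings
guesses = 100_000            # attacker's guess budget

p_single = live_meetings / id_space
p_at_least_one = 1 - (1 - p_single) ** guesses

print(f"P(one guess hits a meeting): {p_single:.4%}")
print(f"P(at least one hit in {guesses:,} guesses): {p_at_least_one:.3%}")
```

With those made-up numbers the attacker is all but guaranteed at least one hit, which is why longer IDs, passwords, and rate limiting all matter.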
Summing Up
Security is a multi-faceted beast. There are security vulnerabilities to mitigate and remediate, there are perceptions of security and privacy that need to be considered, and there are abuse cases to think about, plus threat modeling, testing, code review, and scanning to be performed.
There’s a lot to think about, certainly, but when you find yourself in the public eye catching up on ten years of security lessons in a month, you are forced to grow up fast. Zoom is trying to do that now, and they’re doing a good job of it, but we’re certainly shaking out a lot of skeletons in their closet along the way. Any one of these issues taken alone could have been minimized.
As I wrote in our last newsletter, this does show a theme: Zoom, like many rapidly growing companies, didn’t think about security as much as they should have. That said, I appreciate the efforts they are making to shore up their security posture in nearly real time.