Millions of Android Devices Vulnerable to Remote Hijacking: Baidu Wrote the Code, But Google Made it Possible
Last month, Chinese security researchers uncovered a vulnerability in an Android software library developed by the Chinese search giant Baidu, and when it comes to security vulnerabilities, this one’s a whopper. It allows an attacker to remotely wreak all sorts of havoc on someone’s phone, from sending fake SMS messages to downloading arbitrary files to installing other apps without the user’s authorization.
The widespread deployment of the vulnerable software library makes things even worse. The library, known as the Moplus SDK, is used by over 14,000 separate Android apps. By some estimates, as many as 100 million unique Android devices were vulnerable. And that isn’t even the worst of it.
Further investigation by researchers at Trend Micro showed that this wasn’t just the result of some security bug. The Moplus SDK was actually designed to do all the terrible things described above. That’s right: Baidu apparently actually built the capability into its SDK to remotely upload files, install apps, and trigger all sorts of other actions—and this capability existed on every device on which an app that contained the Moplus SDK library had been installed.
Digging a little deeper, it’s obvious this was no accident. Back in 2013, Baidu applied for and received a Chinese patent on a method for triggering actions within a mobile app by having a web service send the mobile app a web request—in other words, precisely the backdoor functionality the Moplus SDK contained.
Obviously, the vast majority of the blame for this catastrophe should fall on Baidu. But we think a small part of the blame should fall on Android’s permission system as well.
Android’s Permission System Was Broken…
Up until Android 6 (Marshmallow), Android users were given an all-or-nothing choice: either give an app permission to do everything it requested, or don’t install the app at all. This kind of Hobson’s choice led to user apathy about permissions. (Why bother examining all the permissions an app is requesting if you don’t get a choice about specific permissions anyway?) At the same time, app developers realized they could ask for any permissions they wanted, since only about half of users have ever chosen not to install an app based on its permissions. (Why not have your flashlight app access the phone’s location if it means you can display higher-revenue location-targeted ads?)
The result was an environment in which apps could request permissions unrelated to their core functionality, and some percentage of users would install them anyway. This includes apps bundled with the Moplus SDK—every such app explicitly asked for all the permissions it needed in order to enable the Moplus SDK’s backdoor capabilities. The difference this time was that what most users would normally shrug off as a privacy-invasive but otherwise harmless request for excess permissions turned out to be a request by a vulnerable backdoor system that endangered users’ security.
Fortunately, there’s a light at the end of this long, dark tunnel. The latest version of Android features a completely new permission system. In Android 6, when an app tries to use a permission for the first time, Android will show a popup to the user. The user can then decide whether or not to let the app use that permission.
This means users finally have a real choice when it comes to what info or capabilities they want to give their apps. While this system wouldn’t stop all of the malicious effects of Moplus SDK (e.g., a map app which uses location data for both its main functionality as well as for sending to an attacker) it would stop some of the most egregious examples (e.g., a flashlight app that tries to access location data).
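The Android 6 flow described above requires an app to check for, and request, each sensitive permission at runtime before using it. A minimal sketch of what that looks like from the developer’s side (the method names are the real Android support-library APIs of the era; the request-code constant and method name are ours for illustration):

```java
import android.Manifest;
import android.content.pm.PackageManager;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;

// Inside an Activity: check whether the user has granted location
// access, and show the Android 6 permission popup only if not.
private static final int REQUEST_LOCATION = 1; // arbitrary request code

void useLocationIfAllowed() {
    if (ContextCompat.checkSelfPermission(this,
            Manifest.permission.ACCESS_FINE_LOCATION)
            != PackageManager.PERMISSION_GRANTED) {
        // Triggers the system permission dialog; the user's answer
        // arrives asynchronously in onRequestPermissionsResult().
        ActivityCompat.requestPermissions(this,
                new String[] { Manifest.permission.ACCESS_FINE_LOCATION },
                REQUEST_LOCATION);
    } else {
        // Permission already granted: safe to access location here.
    }
}
```

The key point for users is that the popup fires at the moment of first use, so a flashlight app suddenly asking for location has to do so in a context where the request looks obviously out of place.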
…But Android’s Permissions Still Have a Huge Security Hole
But Google could, and should, do more. For one thing, Android currently gives every app permission to access the Internet automatically—without even notifying the user, much less asking for their consent. In the past, Google has claimed that this is because apps without Internet access could still exfiltrate data to servers through other surreptitious means (via an intent, for the Android devs out there), so blocking Internet access would be pointless.
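For reference, the only trace of this access is a single line in the app’s manifest, which Android grants automatically at install time with no prompt (a standard AndroidManifest.xml fragment):

```xml
<!-- Declared by virtually every ad-supported app. Android grants this
     automatically, and the Android 6 runtime-permission popup never
     appears for it, because INTERNET is classified as a "normal"
     (not "dangerous") permission. -->
<uses-permission android:name="android.permission.INTERNET" />
```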
This argument is a red herring. If apps with Moplus SDK had to ask for Internet access, it’s entirely possible that many users would have denied giving some of the apps that permission—thus completely eliminating their risk. In other words, while it’s true that apps without Internet access could still find sneaky ways to leak data without telling users, it’s also true that blocking Internet access is a necessary first step to keep many apps from doing so.
So what’s the real reason Google won’t expose Internet access as a top-level permission?
It’s the same reason they’ve blocked apps like Disconnect from the Play Store. Google is worried that giving users a choice about which apps are communicating about them could put a dent in their lucrative advertising business. After all, a flashlight app without Internet access can’t display ads.
The problem is that security and privacy are two sides of the same coin. By refusing to give users a choice about whether or not apps have Internet access, Google is putting its users at risk and sending the message that it cares more about its bottom line than its users' security.
Fortunately for Google, this is an easy fix—just include Internet access as one of the permissions apps have to request in the next version of Android. Otherwise, Moplus SDK won't be the last major Android security catastrophe.