The Security of Mobile Apps Starts and Ends With Developers
Third-Party Development Adds to the App Security Dilemma
According to a story from Business Insider, mobile device users spend 86 percent of their time in mobile apps, as opposed to the mobile web. Businesses cite the rapid adoption of apps as a way to cut costs and increase productivity, but apps suffer from insecure development processes and the inherent difficulty of securing the mobile environment. Entrepreneurs should be aware of the risks before purchasing an enterprise app, developing an app in-house, or having an app developed by a third party.
The overarching problem with mobile apps, and with mobile app development, is that the process is seldom conducted with security in mind. Software is often built using agile methodologies, which, among other things, emphasize speed. Speed leads to shortcuts. In a Ponemon Institute survey of developers sponsored by IBM, 65 percent of the 640 respondents said, “The security of mobile apps is sometimes put at risk because of customer demand or need.” Almost 40 percent of those surveyed did not look for any security holes in their code.
The issue is not confined to the apps themselves. Back in May, the Center for Advanced Security Research Darmstadt (CASED) in Germany responsibly disclosed a cloud application security issue it discovered in mobile APIs and published its findings in a white paper. CASED found 56 million unprotected user records exposed through Facebook's Parse, Amazon, and other cloud data sources. The research highlights the gap between developers' mobile app security practices and the data those apps store in the cloud. It also shows how blind users are to these issues: the flaws sit behind the veil of app functionality, outside a user's direct knowledge or ability to do anything about them.
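The core of the flaw CASED described is an access-control gap: when every copy of an app ships with the same backend key, the server cannot tell one user from another, so any client can read any record. The toy model below sketches that contrast under assumptions of ours; the class and method names (ToyBackend, insecureRead, secureRead) are illustrative and do not correspond to any real backend-as-a-service API.

```java
import java.util.HashMap;
import java.util.Map;

// Toy in-memory model of the access-control gap behind exposed cloud
// records: a single app-wide key embedded in every client binary
// authorizes everything, while per-user tokens scope reads to the owner.
public class ToyBackend {
    private final Map<String, String> records = new HashMap<>(); // user -> data
    private final Map<String, String> tokens = new HashMap<>();  // token -> user

    // The kind of credential an attacker can extract from any app binary.
    static final String SHARED_APP_KEY = "key-shipped-in-every-app-binary";

    public void store(String user, String data, String userToken) {
        records.put(user, data);
        tokens.put(userToken, user);
    }

    // Misconfigured pattern: the embedded app key unlocks every record,
    // so any client can read any user's data.
    public String insecureRead(String user, String appKey) {
        return SHARED_APP_KEY.equals(appKey) ? records.get(user) : null;
    }

    // Server-enforced rule: a token only unlocks its own user's record.
    public String secureRead(String user, String token) {
        return user.equals(tokens.get(token)) ? records.get(user) : null;
    }
}
```

The fix is entirely server-side: the backend must check who is asking, not merely which app is asking.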
When It Comes to Mobile Apps, Security Is Too Often an Afterthought
At the heart of the matter: developers sometimes get lazy with encryption. Most apps collect user data in some way. Usually this consists of the login and password used to access the app, although location data might also be tracked. In a surprising number of cases, this data is stored directly on the mobile device without any encryption whatsoever. In early 2014, Computerworld reported that none other than the Starbucks mobile app was subject to this vulnerability, exposing roughly 12 million users. Anyone with access to a user's phone could reveal their username, password, and location data using simple tools. Although this loophole was quickly patched, the Ponemon report cited above reveals that only 42 percent of surveyed developers write their programs so that sensitive data is stored in an encrypted environment.
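Storing credentials encrypted rather than in plain text is a small amount of code. The sketch below is a minimal illustration using standard AES-GCM from the Java class library; the CredentialVault name is ours, and on a real device the key would live in a platform keystore (such as the Android Keystore) rather than in application memory as it does here.

```java
import java.security.SecureRandom;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

// Minimal sketch: seal a credential with AES-GCM before writing it to
// local storage, instead of saving it as readable plain text.
public class CredentialVault {
    private static final int IV_LEN = 12;    // 96-bit nonce, standard for GCM
    private static final int TAG_BITS = 128; // authentication tag length

    private final SecretKey key;
    private final SecureRandom random = new SecureRandom();

    public CredentialVault() throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        this.key = kg.generateKey(); // in production: fetch from a keystore
    }

    // Returns IV + ciphertext; this is what would be written to disk.
    public byte[] seal(byte[] plaintext) throws Exception {
        byte[] iv = new byte[IV_LEN];
        random.nextBytes(iv); // fresh nonce per encryption, never reused
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(TAG_BITS, iv));
        byte[] ct = c.doFinal(plaintext);
        byte[] out = new byte[IV_LEN + ct.length];
        System.arraycopy(iv, 0, out, 0, IV_LEN);
        System.arraycopy(ct, 0, out, IV_LEN, ct.length);
        return out;
    }

    // Recovers the plaintext; tampered ciphertext fails authentication.
    public byte[] open(byte[] sealed) throws Exception {
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.DECRYPT_MODE, key,
               new GCMParameterSpec(TAG_BITS, Arrays.copyOfRange(sealed, 0, IV_LEN)));
        return c.doFinal(Arrays.copyOfRange(sealed, IV_LEN, sealed.length));
    }
}
```

GCM mode also authenticates the data, so a tampered file is rejected outright rather than silently decrypted to garbage.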
Rushed development cycles aren't the only reason mobile app security suffers. Google, developer of the Android mobile operating system, does not require carriers or manufacturers to update automatically to the latest version of its software, allowing them to ship customized or modified versions of the operating system (OS). In practice, this has resulted in almost 19,000 distinct variations of Android, some of them desperately out of date. To test security thoroughly, an app developer would need to run their software on many versions of the Android platform, in an industry where few developers have the luxury of testing for security even once.
For the time being, there is no perfect solution to the problem of insecure apps. Secure development practices are beginning to infiltrate the industry, but progress is slow. At the enterprise level, containerization and mobile device management software can compensate for the security holes in most apps, although these solutions are still immature. Ultimately, the best solution is the common-sense one: lock down the phones of enterprise users, and only permit apps from thoroughly vetted developers.