
Do you really know what's inside your iOS and Android apps?

Nov 13, 2023 Hi-network.com

It's time to audit your code, as it appears that some no-code/low-code components used in iOS or Android apps may not be as secure as you thought. That's the big takeaway from a report explaining that disguised Russian software is being used in apps from the US Army, CDC, the UK Labour party, and other entities.

When Washington becomes Siberia

What's at issue is that code developed by a company called Pushwoosh has been deployed within thousands of apps from thousands of entities. These include the Centers for Disease Control and Prevention (CDC), which claims it was led to believe Pushwoosh was based in Washington when the developer is, in fact, based in Siberia, Reuters explains. A visit to the Pushwoosh Twitter feed shows the company claiming to be based in Washington, DC.

The company provides code and data-processing support that can be used within apps to profile what smartphone app users do online and to send personalized notifications. CleverTap, Braze, OneSignal, and Firebase offer similar services. Now, to be fair, Reuters has no evidence the data collected by the company has been abused. But the fact that the firm is based in Russia is problematic, as the information it handles is subject to Russian data law, which could pose a security risk.
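To make that concrete, here is a minimal Kotlin sketch of the kind of integration surface such a push/analytics SDK typically exposes. The interface and names below are invented for illustration, not Pushwoosh's or any real vendor's API; the point is how much user context routinely flows to whoever operates the service.

```kotlin
// Hypothetical integration surface of a push/analytics SDK (names are invented).
data class UserProfile(
    val deviceToken: String,        // lets the vendor address this specific device
    val userId: String,             // ties activity back to an individual account
    val tags: Map<String, String>   // behavioral attributes used for personalized pushes
)

interface ThirdPartyPushSdk {
    fun register(profile: UserProfile)                        // sent to the vendor's servers
    fun trackEvent(name: String, data: Map<String, String>)   // in-app behavior, also sent
}

fun onAppStart(sdk: ThirdPartyPushSdk) {
    // Everything passed here leaves the device and is processed under the
    // vendor's jurisdiction and data law, wherever that vendor happens to be based.
    sdk.register(
        UserProfile(
            deviceToken = "abc123",
            userId = "user-42",
            tags = mapOf("plan" to "premium", "last_screen" to "checkout")
        )
    )
    sdk.trackEvent("app_opened", mapOf("locale" to "en_US"))
}
```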

It may not, of course, but it's unlikely that any developer handling data that could be viewed as sensitive will want to take that risk.

What's the background?

While there are lots of reasons to be suspicious of Russia at this time, I'm certain every nation has its own third-party component developers that may or may not put user security first. The challenge is finding out which do, and which don't.

The reason code such as Pushwoosh's gets used in applications is simple: money and development time. Mobile application development can get expensive, so to cut costs some apps use off-the-shelf code from third parties for certain tasks. And given that we're moving quite swiftly toward no-code/low-code development environments, we're going to see more of this kind of building-block approach to app development.

That's fine, as modular code can deliver huge benefits to apps, developers, and enterprises, but it does highlight a problem any enterprise using third-party code must examine.

Who owns your code?

To what extent is the code secure? What data is gathered using the code, where does that information go, and what power does the end user (or enterprise whose name is on the app) possess to protect, delete, or manage that data?

There are other challenges: Is the code updated regularly? Does it remain secure over time? What depth of rigor is applied when testing it? Does it embed any undisclosed tracking scripts? What encryption is used, and where is data stored?

The problem is that if the answer to any of these questions is "don't know" or "none," then the data is at risk. This underlines the need for robust security assessments around the use of any modular component code.
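One practical control alongside those assessments is to never call a third-party component directly, but to route everything through a wrapper the enterprise owns, so outbound data can be inspected, filtered, gated by consent, and the vendor swapped out in one place. The Kotlin sketch below assumes a hypothetical VendorSdk interface; it is an illustration of the pattern, not any real library's API.

```kotlin
// A wrapper the enterprise owns, acting as the single auditable choke point
// between the app and a third-party component. "VendorSdk" is a stand-in.
interface VendorSdk {
    fun send(event: String, payload: Map<String, String>)
}

class AuditedAnalytics(
    private val vendor: VendorSdk,
    private val allowedKeys: Set<String> = setOf("screen", "plan"),  // explicit allow-list
    private val consentGranted: () -> Boolean
) {
    fun track(event: String, payload: Map<String, String>) {
        if (!consentGranted()) return                            // nothing leaves the device without consent
        val filtered = payload.filterKeys { it in allowedKeys }  // strip anything not approved
        println("AUDIT: $event -> $filtered")                    // central place to log and review outbound data
        vendor.send(event, filtered)
    }
}
```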

Data compliance teams must test this stuff rigorously; "bare minimum" tests aren't enough.

I'd also argue that an approach in which any data that is gathered is anonymized makes a lot of sense. That way, should any information leak, the chance of abuse is minimized. (The danger of personalized technologies that lack robust protection of the information being exchanged is that the data, once collected, becomes a security risk.)
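Here is a minimal Kotlin sketch of that idea, assuming identifiers are hashed with an app-held salt before they reach any third-party component. Strictly speaking this is pseudonymization rather than full anonymization, but it narrows the blast radius if the vendor's data ever leaks; the salt value shown is purely illustrative.

```kotlin
import java.security.MessageDigest

// Hash identifiers with an app-held salt before handing them to any third party,
// so a leak of the vendor's data doesn't directly expose real user identities.
fun pseudonymize(userId: String, salt: String = "app-secret-salt"): String {
    val digest = MessageDigest.getInstance("SHA-256")
    val bytes = digest.digest((salt + userId).toByteArray(Charsets.UTF_8))
    return bytes.joinToString("") { "%02x".format(it) }
}

fun main() {
    // The third party sees only the opaque token, never the real identifier.
    println(pseudonymize("jane.doe@example.com"))
}
```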

Surely the Cambridge Analytica scandal illustrates why data obfuscation is a necessity in a connected age?

Apple certainly seems to understand this risk. Pushwoosh is used in around 8,000 iOS and Android apps. The developer says the data it gathers is not stored in Russia, but that may not protect it from being exfiltrated, experts cited by Reuters explain.

In a sense, it doesn't matter much, as security is about pre-empting risk rather than waiting for danger to strike. Given the number of enterprises that go bust after being hacked, it's better to be safe than sorry in security policy.

That's why every enterprise whose dev teams rely on off-the-shelf code should ensure the third-party code is compatible with company security policy. Because it's your code, with your company name on it, and any abuse of that data because of insufficient compliance testing will be your problem.

Please follow me on Twitter, or join me in the AppleHolic's bar & grill and Apple Discussions groups on MeWe. Also, now on Mastodon.

