Dean Coclin



Today's smart phones offer unparalleled communication speed and connectivity. In some ways the experience is even better and faster than the desktop Internet, because your phone is always talking to the network, always available and open to send and receive information without you having to dial a number or open a browser.

However, this phenomenal capability also has a dark side: convenience has a price. The same smart phone that is always connected and serving you content can also serve you malicious content, or steal from you on someone else's behalf. People are buying and downloading applications to their smart phones by the millions. These applications can contain code that gives a hacker entry to your phone and access to your address book, identity information and more, with which they can damage your device and data. Much like a malicious browser toolbar, an ordinary-looking application can contain code that damages your phone and your wallet.

But it is not just your wallet. How about your company’s?

As more commercial applications develop a mobile interface, they open a new door to the corporate network for hackers.

Security Holes in the Mobile World
For many corporate employees today, mobile phones and PDAs have replaced their PCs.

Enterprise users now use their mobile devices to perform the same functions they previously performed on their desktop PCs. Only now, these tasks can be done from a much smaller device, virtually anywhere at any time. One of the hidden dangers to which people aren't paying much attention is rogue code infecting mobile phones. That's unfortunate because, although no major incidents have been reported yet, it's only a matter of time before some serious event occurs.

Depending on the type of application, a piece of malware could cause a phone to dial foreign numbers, send exorbitant volumes of text messages, copy keystrokes (a key logger) when owners log in to their financial institution, or cause some other form of disturbance for the end user. It might flood the network with meaningless messages or render the device inoperable, increasing help desk costs for the carrier and causing your phone to be refused service by the cell network. The same criminals spoofing websites to gain access to your personal information have figured out that access to enterprise information is far more rewarding. And while major hacks into corporate sites seem like monthly news, mobile device hacks are lurking in the wings.

This is possible because smart phones today can browse the Internet and download code from many different places. In fact, many carriers offer "download sites" for their customers to use as a one-stop shop. In addition, vendors such as Handango provide applications for many different operating systems. Scammers can also advertise rogue code and point browsers to their website to trick users into downloading an application that is not legitimate. Consider a phishing attack, for example, where an unsuspecting user receives an email with a link to "update" his bank account info. He is then directed to a rogue website where code can either be silently downloaded, or he is directed to a link to download a game, widget or some other application that looks legitimate but is really malware.

The fact is that mobile phones are here to stay and have become woven into the fabric of corporate information processing. Where once mobile devices existed simply as phones, they are now very intelligent data devices and are getting smarter and more robust every day. This is a classic case of balancing convenience against absolute security. Security professionals need to consider what steps and policies they can adopt to ensure that the applications being downloaded by employees are safe and do not wind up causing a material information breach.

How Vulnerable Are Smartphones?
Is there an answer? The answer today is the digital signature that accompanies the application: the developer digitally “signs” the application, and the third party that issues the digital signature vouches for the identity of the individual. This is much like a driver’s license, where you can see an individual’s photo and the fact that the license was issued by the state, which acts as the trusted third party. In this way, signed applications and content can be downloaded, and we know who signed them and that they have not been tampered with.
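The sign-then-verify mechanism described above can be sketched in a few lines of Python using the third-party `cryptography` package. This is a hypothetical illustration only: real mobile platforms wrap the public key in an X.509 certificate issued by the trusted third party and use platform-specific package formats, none of which is shown here.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The developer's key pair. In practice the public key would be
# embedded in a certificate vouched for by a trusted third party.
developer_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

app_package = b"...compiled application bytes..."

# The developer signs the application before distribution.
signature = developer_key.sign(app_package, padding.PKCS1v15(), hashes.SHA256())

def signature_is_valid(public_key, data, sig):
    """Device-side check: was this code signed by the key holder, unmodified?"""
    try:
        public_key.verify(sig, data, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False

# Untampered code verifies; changing even a single byte breaks the signature.
assert signature_is_valid(developer_key.public_key(), app_package, signature)
assert not signature_is_valid(developer_key.public_key(), app_package + b"!", signature)
```

The tamper check in the last line is the whole point: the signature binds the exact bytes of the application to the signer's identity.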

One example of this in the mobile device world is Symbian, the world's most popular mobile operating system, accounting for 50% of smart phone sales. To create applications for Symbian's mobile operating system, authors are required to fax identity information (passport, driver's license, etc.) to confirm they are who they say they are. They must also include information about their business and pay with a credit card. This process is called vetting, and it is what the trusted third party does to confirm identity.

Interestingly, other mobile operating systems aren’t quite so thorough. In fact, some only require that authors pay a certificate fee with a credit card, which could, of course, be stolen. There is no vetting or trusted third party. Little can be done to identify the perpetrator in such cases.

Beyond this, some operating system manufacturers like Symbian require that code be tested by a third-party test house before it gets signed by recognized commercial certificate authorities. The test house runs code through a battery of tests before it puts a seal of approval on it. Then it passes it back to the commercial certificate authorities to sign before being returned to the developer.

What are the others doing?

While Symbian has a robust process, technology and rigorous testing programs in place to prevent malicious code from being distributed globally and almost instantaneously, the approaches other large mobile platform providers take vary greatly. Here are a few examples.

  • Blackberry
    According to Research in Motion (RIM), it uses “IT policies, application control policies and code signing to contain malware by controlling third-party application access to the BlackBerry device resources and applications. These containment methods are designed to prevent malware that might gain access to the BlackBerry device.” That said, RIM allows developers to sign applications with keys it issues, which means they can sign whatever they choose without further testing by a test house. RIM does perform some vetting: developers have to register with RIM via a form and a $20 credit card payment, but no real ID check is done. This means you can theoretically register with a stolen credit card and publish under a false name. And even if a responsible developer signed code, if the laptop holding the key were stolen (and the key was not properly protected), the criminal could access the key and sign code in the future under the responsible developer’s identity.
  • iPhone
    To develop and sell applications for the iPhone, you join the Apple Developer Program. With $99, an email address and a working credit card, you can apply and distribute your applications via the App Store. So with a stolen credit card and an alternative email address, you can theoretically distribute any application you create without repercussions.
  • Google
    If security for the Blackberry and iPhone environments is somewhat lacking, it is practically nonexistent with Google. You can create your own certificate, sign the application and add it to the app store. There’s no charge. Anyone with a phony email address can theoretically create a rogue app, sign it and submit it. If you wish to publish to the Android Market, there is a registration and signup fee of $25, but this has nothing to do with signing the application. For example, someone recently developed a rogue Android phishing application that tried to gain access to consumers’ financial information. Called Droid09, it was launched from the Android Market. Although now removed, it’s a frightening example of how susceptible we are to fraud.

How Can We Better Protect Smart Phones?
So how do we better protect smart phones and their users? Here are a few steps:

Step 1: Make Sure Code Is Signed By Trusted Individuals

The first step in protecting mobile devices is to ensure that digital certificates are used to authenticate downloaded code. A digital certificate is an ID that contains information about the person, machine or program to whom the certificate was issued. Certificates provide you with assurance that what you are about to use comes from a reliable source. In short, a certificate enables digital trust.

If you are a developer, certificates enable you to sign your work and to verify that this program and version of code is the code that you wrote (i.e., it has not been tampered with). Mobile phone code developers use certificates today to ensure programs are valid before being downloaded to literally millions of devices globally.

The good news is that certificates are inexpensive and, in fact, most mobile device suppliers require that all code be signed before it is used. Certificates serve as a deterrent to malicious behavior, since we know both who signed the code and when they signed it. And since authors of malware don’t want this information to be known, protection is enhanced.

Step 2: Vetting

As noted, if a company allows workers to download “unsigned” programs from sites, rogue code could infect the device and then possibly the entire network. Digital signatures are a necessary component of the security solution, but aren’t enough. For example, how do you know that authors of code are who they say they are? In fact, the process of verifying the identity of authors varies widely.

Typically, certificates are issued to developers after an identity check, but the thoroughness of that check varies. More rigorous organizations use recognized commercial certificate authorities that follow the identity-validation standards of the OMTP (Open Mobile Terminal Platform, a mobile network operator forum) and conduct checks of the email address, a valid credit card and an identity card (passport or driver's license). In addition, these organizations may even translate foreign documents.

Step 3: More Vetting

Properly done, vetting is about tying all the disparate loose ends together to eliminate mischief, or at least make it extremely unlikely. But there's one more step that is often missing. Some OS vendors issue code-signing certificates directly to developers. In theory, that's fine: as long as the developer uses and stores the certificate properly, security directors can sleep at night. But what if that certificate is given to another developer? Or stolen? Or misplaced? Then the entire security process has been compromised. The proper way to ensure security is to maintain the signing key in a portal, so that developers must upload their code each and every time they create new software. That way, the portal ensures the security of the signing key and the integrity of the code. Only the portal can sign the code with a key that will allow it to run on the phone. And since criminals don't like to be identified, this greatly reduces the risk of rogue code.
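The portal model can be sketched as follows. This is a toy illustration, not any vendor's actual service: the hypothetical `SigningPortal` class keeps the key material server-side, accepts uploads only from vetted developers, and logs every signing event so each signature is attributable. (A real portal would sign the digest with its private key; here a SHA-256 digest stands in for the signed artifact.)

```python
import hashlib

class SigningPortal:
    """Hypothetical signing portal: the signing key never leaves the service."""

    def __init__(self):
        self._registered = set()   # developer IDs that passed identity vetting
        self._signing_log = []     # (developer_id, digest) for every signing event

    def register(self, developer_id):
        # In practice, identity vetting (credit card, passport, business
        # details) happens before this call succeeds.
        self._registered.add(developer_id)

    def sign_upload(self, developer_id, app_bytes):
        # Developers upload code every time; they never hold the key.
        if developer_id not in self._registered:
            raise PermissionError("developer has not been vetted")
        digest = hashlib.sha256(app_bytes).hexdigest()
        self._signing_log.append((developer_id, digest))
        return digest

portal = SigningPortal()
portal.register("alice")
token = portal.sign_upload("alice", b"app v1")       # succeeds, and is logged
```

An unvetted upload (say, from "mallory") raises `PermissionError`, which is exactly the property the article argues for: nobody signs anonymously.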

Another advantage of this approach is that bad applications can be rescinded by revoking the certificate for that application. Because each application has a unique certificate, the revocation of the certificate for one application has no effect on the other applications. If a single certificate, such as the developer certificate, is used for multiple applications, this granular revocation capability is lost.
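The granularity of per-application revocation can be shown with a toy example (the application names and certificate serial numbers below are invented): revoking one application's certificate blocks that application alone, because no other application shares its certificate.

```python
# Serial numbers of certificates the authority has revoked.
revoked_serials = {"4f:2a:91"}

applications = [
    {"name": "calendar",    "cert_serial": "9c:07:11"},
    {"name": "bad-phisher", "cert_serial": "4f:2a:91"},  # certificate revoked
    {"name": "weather",     "cert_serial": "d3:55:02"},
]

def may_run(app):
    # Each application carries its own certificate, so pulling one
    # certificate blocks only that application and no others.
    return app["cert_serial"] not in revoked_serials

runnable = [a["name"] for a in applications if may_run(a)]
# Only the application whose certificate was revoked is blocked.
```

If a single developer certificate had signed all three applications, revoking it would block all three; the per-application scheme avoids that collateral damage.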

Enterprises, too, can take a role in ensuring authenticity. For example, some OS providers do not require applications to be signed, but provide tools for enterprises to manage devices on their network. An enterprise could implement a policy that all code be signed before executing on the device.

Conclusion
Most of the major providers don't currently offer the level of security needed to protect smart phone users from unsavory developers. It will probably take a colossal failure or scam to move some of the more lax mobile operators to more rigorous processes and testing. For the safety of millions of businesses, digital certificates plus comprehensive vetting should be adopted to protect our networks.

Smart phones are not going away and won’t get dumber. By following these few simple and inexpensive steps – using certificates and proper vetting – consumer and business mobile users can be assured of safe application experiences.

More Stories By Dean Coclin

Dean Coclin is VP of Business Development at ChosenSecurity, where he is responsible for fostering industry partnerships, technology alliances and promoting the company's products to system integrators, consulting firms and other partners. He can be reached at dcoclin (at) chosensecurity.com.
