SmarTone plans to launch its innovative anti-cyberattack software, ST Protect, to Hong Kong enterprises soon, in anticipation of growing security threats targeting smartphones.
“One of the interesting trends last year is that we started to see ransomware going through mobile,” said Zuk Avraham, founder and chairman of Zimperium, which specializes in providing mobile security solutions.
The US-based company has invented the world’s first on-device artificial intelligence (AI) and machine-learning behavioral engine, which enables ST Protect to detect known and unknown attacks on smartphones in real time.
SmarTone is rolling out the enterprise version of ST Protect eight months after it introduced the service, which is delivered via an app, to its subscribers’ iOS and Android devices.
Using AI for mobile security
Based on data collected in the six months since the launch of ST Protect, the AI and machine-learning engine has detected 100% of known and unknown mobile threats, according to SmarTone CTO Stephen Chau.
“The AI engine learns very quickly. By understanding the behavior of the OS platform, it has the knowledge to predict and understand when something is going on that [it shouldn’t]. The beauty of ST Protect with the solution from Zimperium is that it is able to detect zero-day attacks. And while it is good that we can detect the known problems, it is even more important to actually detect that something bad will happen and we are still able to protect ourselves,” Chau said.
In order to use AI to secure the mobile platform, Zimperium had to invent a new approach.
“We had to because the traditional approach used with the PC does not work on mobile. On the PC, you can see the attack and you can see the traffic. We cannot do that on mobile because the app sandboxing is very strong and there are permission separations,” Avraham said.
He explained: “Let us say I am drinking water and I am raising my glass. The traditional approach is that you use your eyes to see that I am raising my glass. On mobile, we cannot see because we do not have the permission to sniff packets, etc. So what we are doing is, instead of looking at the glass, we measure the vibes of the table, and then we can tell you this table has moved and the temperature has changed because it is hot water. Then we can tell based on these behavioral data what is happening here.”
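The approach Avraham describes, inferring an attack from side effects in device behavior rather than from direct traffic inspection, is broadly similar to on-device anomaly detection. The sketch below is purely illustrative and is not Zimperium’s engine: it baselines a few hypothetical behavioral signals (process creation rate, CPU load, outbound connections) and flags readings that deviate sharply from the learned baseline.

```python
# Illustrative sketch of on-device behavioral anomaly detection.
# This is NOT Zimperium's engine; the signals and thresholds are hypothetical.
import math
from dataclasses import dataclass

@dataclass
class Snapshot:
    fork_rate: float        # new processes per minute
    cpu_load: float         # average CPU utilisation (0-1)
    outbound_conns: float   # outbound connections per minute

class BehaviorBaseline:
    """Learns the mean/variance of each signal, then scores deviations."""
    def __init__(self, z_threshold: float = 3.0):
        self.z_threshold = z_threshold
        self.n = 0
        self.means = [0.0, 0.0, 0.0]
        self.m2 = [0.0, 0.0, 0.0]   # running sum of squared deviations (Welford)

    def _features(self, s: Snapshot):
        return [s.fork_rate, s.cpu_load, s.outbound_conns]

    def learn(self, s: Snapshot):
        """Update the running baseline with one snapshot of normal behavior."""
        self.n += 1
        for i, x in enumerate(self._features(s)):
            delta = x - self.means[i]
            self.means[i] += delta / self.n
            self.m2[i] += delta * (x - self.means[i])

    def is_anomalous(self, s: Snapshot) -> bool:
        """Flag a snapshot whose signals sit far outside the learned baseline."""
        if self.n < 2:
            return False  # not enough history to judge
        for i, x in enumerate(self._features(s)):
            std = math.sqrt(self.m2[i] / (self.n - 1)) or 1e-9
            if abs(x - self.means[i]) / std > self.z_threshold:
                return True
        return False

# Usage: learn from normal activity, then flag a sudden burst of process
# creation and outbound connections as suspicious.
baseline = BehaviorBaseline()
for _ in range(100):
    baseline.learn(Snapshot(fork_rate=2.0, cpu_load=0.2, outbound_conns=5.0))
print(baseline.is_anomalous(Snapshot(fork_rate=40.0, cpu_load=0.9, outbound_conns=120.0)))  # True
```

The point of the analogy holds in the sketch: nothing inspects traffic directly; the detector only watches indirect behavioral measurements and reacts when they no longer look like the device’s normal pattern.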