The Future Of AI Is Mobile
A hands-on multimedia engineer, Ajit has interests spanning camera, video, virtual reality, computer vision and machine learning, covering algorithms and more.
In the last few years, the world has witnessed waves of advances in Artificial Intelligence (AI) technology. AI is already being deployed in many applications to assist humans in studying vast amounts of data to make observations, draw inferences or make decisions. AI algorithms have demonstrated their promise by surpassing human accuracy at tasks such as recognizing faces from images, predicting failure of machines, and detecting health conditions from medical data.
AI algorithms are typically trained using a large collection of labeled data samples ('training data') to arrive at an inference rule, also known as the model. This process of learning from data (rather than from rules explicitly programmed by a human) is called machine learning (ML). The model learnt through ML is then applied to unlabeled data samples to infer their labels.
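To make this concrete, here is a minimal sketch of that train-then-infer loop in Python; the toy dataset and the choice of a scikit-learn classifier are illustrative assumptions rather than any particular production pipeline.

```python
# Minimal sketch of the ML workflow: learn a model from labeled
# 'training data', then infer labels for unseen samples.
# The dataset and classifier choice are illustrative assumptions.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)                      # labeled samples
X_train, X_new, y_train, y_new = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = LogisticRegression(max_iter=2000)                 # the 'inference rule', i.e. the model
model.fit(X_train, y_train)                               # training: learn from labeled data

predicted_labels = model.predict(X_new)                   # inference: label new samples
print("accuracy on held-out samples:", (predicted_labels == y_new).mean())
```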
Mobile AI
Traditionally, both training and inference algorithms have been executed on massive servers in large data centers due to their high requirements of compute and data storage. However, in the last few years, an explosion of mobile use-cases and a rapid increase in mobile compute capabilities have opened up new possibilities. Inference algorithms are being increasingly deployed on smartphones, smart cameras and other mobile form-factors. The ubiquitous smartphone is expected to become the most pervasive AI platform in the world.
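As a rough illustration of what on-device inference looks like in code, the sketch below runs a single forward pass locally with TensorFlow Lite; the model file name and the input are placeholders, assuming a model has already been converted for mobile deployment.

```python
# Sketch of running inference locally with TensorFlow Lite.
# 'model.tflite' and the fabricated input are illustrative placeholders.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Fabricate an input with the shape and dtype the model expects (e.g. one image).
dummy_input = np.random.rand(*input_details[0]["shape"]).astype(input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()                                        # forward pass, entirely on-device
prediction = interpreter.get_tensor(output_details[0]["index"])
print("predicted scores:", prediction)
```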
On-device AI has many advantages over cloud-based AI, especially for inference. First, there is direct access to information via natural interactions with the user, e.g. facial gestures, hand movements and voice commands. In addition, contextual information such as the location and recent activities of the user can be leveraged securely on the device to make accurate inferences. Added to these are the advantages of low interaction latency and robust behavior even when the wireless connection is disrupted.
The AI-Enabled Mobile Personal Device
What can we expect well-designed, AI-enabled, personal mobile devices to look and feel like? While the smartphone will continue to be the primary mobile form-factor for many years, we expect that lightweight augmented reality glasses will also start becoming popular. Users will communicate with their personal device via gestures and voice.
The device, aided by its perception and knowledge of the context, will use AI to reason and provide the user with information and recommended actions for the situation at hand. Perhaps your phone, combined with additional body-worn sensors, could periodically monitor your health and predict events that will need medical intervention. In a social setting, your smart glasses could save you embarrassment by recognizing and reminding you of the name of a long-lost acquaintance. At the mall, your phone may instantly check whether a piece of furniture you are eyeing would fit in your living room and provide a visualization of the upgraded look. All through this, the device continuously learns to improve its own accuracy, aided by observations and its owner's feedback. Since all this happens on-device, the user is assured of the privacy and security of his or her personal data.
The Engineering Challenge
Running AI algorithms on mobile devices brings up a unique set of challenges that technologists are gearing up to solve. First, AI algorithms typically leverage a computational structure called a neural network, which is computationally complex and memory intensive. Such a network must also execute in real time while being always-on and allowing other workloads to run concurrently on the device. The device must consume little power and be thermally efficient to allow for sleek, ultra-light designs that can work all day (even multi-day in some cases) without the need for a recharge.
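A quick back-of-envelope calculation shows why memory alone is a concern; the layer sizes below are illustrative assumptions, not a real mobile network.

```python
# Back-of-envelope estimate of neural-network weight storage.
# Layer sizes are illustrative assumptions, not a real mobile model.
layers = [
    ("conv1", 3 * 3 * 3 * 64),        # kernel_h * kernel_w * in_channels * out_channels
    ("conv2", 3 * 3 * 64 * 128),
    ("fc",    7 * 7 * 128 * 1000),    # flattened feature map -> 1000 classes
]

total_params = sum(count for _, count in layers)
print(f"parameters: {total_params:,}")
print(f"float32 weights: {total_params * 4 / 2**20:.1f} MiB")
print(f"int8 weights:    {total_params * 1 / 2**20:.1f} MiB")   # roughly 4x smaller
```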
While ML engineers are adapting their inference models to fit the constraints of mobile architectures, chip architects are meeting them midway by evolving their system designs to handle AI workloads more efficiently. For example, dedicated engines to handle neural networks are increasingly showing up on mobile systems-on-chip (SoCs). In addition, mobile processors already offer efficient computation for many different workloads (e.g., fixed- and floating-point arithmetic, 2D and 3D filters), which ML engineers can exploit to invent better-performing AI architectures.
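One common way to adapt a trained model to such fixed-point hardware is post-training quantization. A hedged sketch using TensorFlow Lite (the saved-model path is a placeholder) might look like this:

```python
# Sketch of post-training quantization: shrink a trained model so it
# maps better onto fixed-point mobile hardware. Paths are placeholders.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("saved_model/")
converter.optimizations = [tf.lite.Optimize.DEFAULT]    # enable weight quantization
tflite_model = converter.convert()

with open("model_quantized.tflite", "wb") as f:
    f.write(tflite_model)
```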
AI for IoT
Beyond mobile personal devices, AI is rapidly expanding into cars, health monitors, shopping carts, refrigerators, traffic lights, vending machines and more - in short, everywhere. A pervasive and intelligent Internet of Things (IoT) scales the scope of AI from billions of phones and smart glasses to trillions of devices. These intelligent devices will have the capability to sense, infer and react to their environment. Applications of AI-enabled IoT abound in smart homes, industrial automation, wearable devices, healthcare, smart infrastructure for cities, extended reality, and automotive.
5G and the Interdependence of Mobile and Cloud AI
The growth of mobile AI does not imply that AI on the cloud will cease to exist. On the contrary, the quality of AI models will significantly improve as a result of cloud-based training on large anonymized data sets sourced from mobile devices. Since the mobile data generators are distributed, data storage and training can also be decentralized. While inference and user-specific fine-tuning of models will be in the domain of the personal device, broad-scale cross-user training will reside in the cloud.
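One concrete pattern for this split, sketched below under simplifying assumptions, is federated-style averaging: each device improves the model on its own data, and the cloud only aggregates the resulting weights. The tiny linear model and synthetic data are purely illustrative.

```python
# Toy sketch of federated-style aggregation: each device trains locally
# on its own data, and the cloud averages the resulting weights.
# The linear model and synthetic data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

def make_device_data():
    """Synthetic private data held by one device."""
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    return X, y

def local_update(weights, X, y, lr=0.1, steps=20):
    """A few steps of gradient descent on one device's private data."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

devices = [make_device_data() for _ in range(5)]
global_w = np.zeros(2)
for _ in range(10):
    # Each device fine-tunes the current global model on its own data ...
    local_ws = [local_update(global_w, X, y) for X, y in devices]
    # ... and the cloud aggregates without ever seeing the raw data.
    global_w = np.mean(local_ws, axis=0)

print("learned weights:", global_w)    # should approach [2, -1]
```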
The interdependence of on-device and cloud AI demands a high-speed wireless communications system that can efficiently scale beyond today's 4G. The next-generation 5G wireless system provides such a scalable infrastructure, offering higher throughputs at low latency. 5G ensures robust coverage even at the edge of the network and efficiency across the wide dynamic range of user devices. Reliable device-to-device communication via 5G enables distributed intelligence, where the task of inference is efficiently divided across multiple connected devices.
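Such distributed inference can be pictured as splitting a network at some layer: the handset runs the early layers and ships compact intermediate activations to a peer or edge node over the 5G link. The toy sketch below simulates that split; the miniature model and the 'link' function are stand-ins, not a real protocol.

```python
# Toy sketch of split inference: run the early layers on the handset,
# send the intermediate activations, and finish on a peer device.
# The tiny model and the simulated 'link' are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(128, 32))     # early layers (run on the phone)
W2 = rng.normal(size=(32, 10))      # later layers (run on a peer / edge node)

def run_on_phone(x):
    return np.maximum(x @ W1, 0)             # compact intermediate features

def send_over_link(activations):
    return activations.astype(np.float16)    # stand-in for a 5G transfer (compressed)

def run_on_peer(features):
    return features.astype(np.float32) @ W2  # final scores computed remotely

x = rng.normal(size=(1, 128))                # one input sample on the device
scores = run_on_peer(send_over_link(run_on_phone(x)))
print("class scores from the peer device:", scores)
```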
As the world adopts next-generation 5G technology, expect mobile AI use cases to explode and the full potential of mobile AI to be realized.