SEPTEMBER 2018

device, the user is assured of privacy and the security of his or her personal data.

The Engineering Challenge

Running AI algorithms on mobile devices brings up a unique set of challenges that technologists are gearing up to solve. First, AI algorithms typically leverage a computational structure called a neural network, which is computationally complex and memory intensive. Such a network must also execute in real time, while being always-on and allowing other workloads to run concurrently on the device. The device must consume little power and be thermally efficient to allow for sleek, ultra-light designs that can work all day (even multiple days in some cases) without the need for recharge.

While ML engineers are adapting their inference models to fit the constraints of mobile architectures, chip architects are meeting them midway by evolving their system designs to handle AI workloads more efficiently. For example, dedicated engines to handle neural networks are increasingly showing up on mobile systems-on-chip (SoCs). In addition, mobile processors already offer efficient computation on many different workloads (e.g., fixed or floating point, 2D and 3D filters), which ML engineers can exploit to invent better-performing AI architectures.

AI for IoT

Beyond mobile personal devices, AI is rapidly expanding into cars, health monitors, shopping carts, refrigerators, traffic lights, vending machines and more - in short, everywhere. A pervasive and intelligent Internet of Things (IoT) scales the scope of AI from billions of phones and smart glasses to trillions of devices. These intelligent devices will have the capability to sense, infer and react to the environment.
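The fixed-point workloads mentioned above are a good example of how ML engineers adapt models to mobile constraints: weights and activations are quantized to 8-bit integers so the SoC can run cheap integer arithmetic instead of floating point. The sketch below illustrates the idea with a plain NumPy stand-in; the symmetric scheme and the `quantize`/`dequantize` helpers are illustrative assumptions, not any framework's or vendor's API.

```python
# Illustrative sketch: post-training 8-bit quantization of one dense layer,
# the kind of fixed-point workload a mobile SoC executes efficiently.
import numpy as np

def quantize(x, num_bits=8):
    """Map float values onto symmetric signed integers (hypothetical helper)."""
    qmax = 2 ** (num_bits - 1) - 1          # 127 for int8
    scale = np.max(np.abs(x)) / qmax
    q = np.round(x / scale).astype(np.int8)
    return q, scale

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 32)).astype(np.float32)  # layer weights
x = rng.standard_normal(32).astype(np.float32)        # input activations

qw, sw = quantize(w)
qx, sx = quantize(x)

# Integer matrix-vector product (accumulated in int32), rescaled once at the end.
y_int8 = (qw.astype(np.int32) @ qx.astype(np.int32)) * (sw * sx)
y_fp32 = w @ x

# The int8 result tracks the float result with only a small quantization error.
print(np.max(np.abs(y_int8 - y_fp32)))
```

The point of the exercise: the inner loop touches only 8-bit operands, which cuts memory traffic by 4x versus fp32 and maps onto the integer units that mobile processors already provide.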
Applications of AI-enabled IoT abound in smart homes, industrial automation, wearable devices, healthcare, smart infrastructure for cities, extended reality, and automotive.

5G and the Interdependence of Mobile and Cloud AI

The growth of mobile AI does not imply that AI on the cloud will cease to exist. On the contrary, the quality of AI models will improve significantly as a result of cloud-based training on large anonymized data sets sourced from mobile devices. Since the mobile data generators are distributed, data storage and training can also be decentralized. While inference and user-specific fine-tuning of models will be in the domain of the personal device, broad-scale cross-user training will reside in the cloud.

The interdependence of on-device and cloud AI demands a high-speed wireless communications system that can efficiently scale beyond today's 4G. The next-generation 5G wireless system provides such a scalable infrastructure, offering higher throughputs at low latency. 5G ensures robust coverage even at the edge of the network and efficiency across the wide dynamic range of user devices. Reliable device-to-device communications via 5G enables distributed intelligence, where the task of inference is efficiently divided across multiple connected devices.

As the world adopts next-generation 5G technology, expect mobile AI use cases to explode and the full potential of mobile AI to be realized.

Ajit Rao
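The distributed intelligence described above, where one inference task is divided across connected devices, can be sketched in a few lines. In this toy NumPy model (the two-layer network, the split point, and the simulated link are all illustrative assumptions), device A runs the early layers, ships the intermediate activation over the device-to-device link, and device B finishes the pass:

```python
# Illustrative sketch: one inference pass split across two cooperating devices.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

rng = np.random.default_rng(1)
w1 = rng.standard_normal((16, 8))   # layers held by device A
w2 = rng.standard_normal((4, 16))   # layers held by device B
x = rng.standard_normal(8)          # sensor input captured on device A

# Device A: compute its partition; the activation is the only payload
# that needs to cross the (here simulated) device-to-device link.
activation = relu(w1 @ x)

# Device B: receive the activation and complete the inference.
y_split = w2 @ activation

# Reference: the same network run end-to-end on a single device.
y_single = w2 @ relu(w1 @ x)
print(np.allclose(y_split, y_single))  # True: partitioning changes nothing
```

The design choice this illustrates: only the activation at the cut point travels between devices, so a link with high throughput and low latency, as 5G promises, determines where the network can be cut without hurting responsiveness.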