Fintalk Newsletter Blogs

  • Tiny AI

    Published Date : April 03, 2020

    Tiny AI is the term used to describe the AI research community's efforts to shrink the size of algorithms, especially those that require large datasets and heavy computational power.

    Tiny AI researchers develop techniques known as distillation methods, which reduce the size of a model while accelerating inference and maintaining high levels of accuracy. Using these distillation methods, a model can be scaled down significantly, by factors of up to 10x. A much smaller model can then be deployed on the edge, making decisions on the device itself rather than sending data to the cloud.
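
    To make this concrete, here is a minimal sketch of knowledge distillation in PyTorch. The teacher and student networks, their sizes, and the temperature and weighting hyper-parameters below are illustrative assumptions rather than details from this article; the idea is simply that a small student model is trained to mimic the softened outputs of a large teacher as well as the hard labels.

```python
# Minimal sketch of knowledge distillation (illustrative only; the model sizes
# and hyper-parameters are assumptions, not details from the article).
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend the usual hard-label loss with a soft-label loss that pushes the
    small student to mimic the large teacher's softened output distribution."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # standard T^2 rescaling so both terms have comparable gradients
    return alpha * hard + (1 - alpha) * soft

# Hypothetical teacher (large) and student (roughly 10x fewer parameters).
teacher = nn.Sequential(nn.Linear(784, 1200), nn.ReLU(), nn.Linear(1200, 10)).eval()
student = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))

x = torch.randn(32, 784)           # dummy batch of flattened inputs
y = torch.randint(0, 10, (32,))    # dummy class labels
with torch.no_grad():
    teacher_logits = teacher(x)    # the frozen teacher only provides targets
loss = distillation_loss(student(x), teacher_logits, y)
loss.backward()                    # gradients flow only into the student's weights
```

    Only the compact student then needs to be shipped to the device, which is what lets decisions be made locally instead of in the cloud.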

    In their quest to build more powerful algorithms, researchers are using ever greater amounts of data and computing power, and relying on centralized cloud services. This not only generates alarming amounts of carbon emissions but also limits the speed and privacy of AI applications.

    But a counter-trend of tiny AI is changing that. Tech giants and academic researchers are working on new algorithms to shrink existing deep-learning models without losing their capabilities. Meanwhile, an emerging generation of specialized AI chips promises to pack more computational power into tighter physical spaces and train and run AI on far less energy.

    These advances are just starting to become available to consumers. Google announced that it can now run Google Assistant on users’ phones without sending requests to a remote server. As of iOS 13, Apple runs Siri’s speech recognition capabilities and its QuickType keyboard locally on the iPhone.

    IBM and Amazon now also offer developer platforms for making and deploying tiny AI.
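
    As a rough illustration of what such deployment toolchains involve, the sketch below uses TensorFlow Lite post-training quantization, one common route for shrinking a trained model so it can run on a phone or edge device. It is a stand-in example, not the specific IBM or Amazon tooling, and the SavedModel path is a placeholder.

```python
# Minimal sketch: shrink a trained TensorFlow model for on-device inference
# using post-training quantization. "path/to/saved_model" is a placeholder path.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("path/to/saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # quantize weights (float32 -> int8)
tflite_model = converter.convert()

# The resulting model file is typically several times smaller and can be run
# on-device with the TensorFlow Lite interpreter, with no round trip to the cloud.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```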

    As AI gets more accurate, there is a hidden environmental cost to that accuracy. According to one study, training a single algorithm might emit around 5x the lifetime carbon dioxide emissions of an average car, or the equivalent of about 300 round-trip flights between New York and San Francisco. In the search for high accuracy, we seem to have lost our focus on energy efficiency.

    The global market for artificial intelligence is projected to grow from $3.5 billion in 2018 to $26.1 billion by 2023, a compound annual growth rate (CAGR) of 49.1% over the 2018-2023 period.
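
    As a quick sanity check of that projection, the implied growth rate can be recomputed from the two endpoint figures; the small difference from the quoted 49.1% comes from rounding of the reported market sizes.

```python
# Back-of-the-envelope check of the quoted growth rate, using the rounded
# figures from the paragraph above (USD billions, 2018 -> 2023).
start, end, years = 3.5, 26.1, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~49.5% with the rounded figures
```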

    Credits: Akhil Handa, M. T. Rao
