Machine Learning Now at the Forefront of Embedded Applications
By Mark Papermaster, CTO, and SVP Technology and Engineering, AMD
Machine Intelligence (MI) is truly the next breakthrough in computing. Applications that recognize patterns in data, classify it, and identify anomalies with ever-improving accuracy are rapidly growing in number. As MI matures, it will transform how we work and play, and the embedded devices market will certainly be turned on its head over the next decade.
Embedded devices are quickly becoming imbued with MI, transforming them into even smarter devices that leverage historical and real-time data and increasingly deploy machine learning (ML) algorithms. Embedded devices often sit at the edge of the network; think cell towers, robotic arms, and smart city traffic sensors. With ML, these smart devices at the edge are rapidly becoming the front line of analysis and decision-making.
This transformation will touch virtually every vertical industry, with huge growth and capability ahead for embedded electronics. Data is so important that GE planned to spend $1 billion in 2017 alone to analyze data from sensors on gas turbines, jet engines, oil pipelines and other machines, and aims to triple sales of its software products to roughly $15 billion by 2020.
In this new, interconnected and more holistic world, we're already seeing improved efficiency and outcomes. Embedded ML applications start with the obvious, e.g., unlocking mobile devices with facial recognition for identity authentication.
Behind the scenes, security devices such as routers are rapidly incorporating smart technology. At the device and network levels, real-time analytics analyze traffic and behavior patterns to look for aberrations or signatures that could be security threats.
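The kind of real-time traffic analysis described above can be as simple as a statistical deviation check. The sketch below is purely illustrative (function names, window size, and threshold are assumptions, not any vendor's API): it flags intervals whose request count strays several standard deviations from recent history.

```python
# Hypothetical sketch: flagging aberrant traffic volumes with a rolling
# z-score, the kind of lightweight check a smart router might run on
# per-interval request counts. All names and values are illustrative.
from statistics import mean, stdev

def flag_anomalies(counts, window=10, threshold=3.0):
    """Return indices whose count deviates more than `threshold`
    standard deviations from the preceding window's mean."""
    anomalies = []
    for i in range(window, len(counts)):
        recent = counts[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(counts[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Steady traffic with one spike at the end:
traffic = [100, 102, 98, 101, 99, 103, 97, 100, 102, 98, 500]
print(flag_anomalies(traffic))  # [10] -- only the spike is flagged
```

In practice, production systems would replace the z-score with learned models of normal behavior, but the shape of the pipeline (observe, compare against a baseline, flag the aberration) is the same.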
The impact of ML is most apparent for embedded applications with “Digital Twins.” These are digital representations of physical objects, usually specific real-world devices active in the field, capturing an exact device configuration at a point in time. By adding sensors to the field device, product designers can gather real-world data and, in some cases, real-time asset performance. Thus enabled, the device becomes part of the Internet of Things (IoT), and its real-world data can feed ML training sets to improve generative design and facilitate design optimization. The twin can be used with real-life simulators or serve as a simulated stand-in before a physical design is completed, driving not only design improvements but whole new capabilities. When running a specific application, designers instantly have visibility into the simulation model in a real-life use case and can see which elements of the design are being stressed and may need more robust function.
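To make the idea concrete, a minimal digital twin can be sketched as a software object that mirrors a device's configuration, ingests its sensor readings, flags stressed operation, and exports the accumulated telemetry as training rows. Everything below (class name, fields, thresholds) is an illustrative assumption, not a real digital-twin API.

```python
# Minimal illustrative digital twin: mirrors one field device's
# configuration, records its telemetry, flags readings that stress
# the design, and exports rows that could seed an ML training set.
from dataclasses import dataclass, field

@dataclass
class MotorTwin:
    device_id: str
    firmware: str
    max_temp_c: float = 90.0          # design limit (assumed value)
    telemetry: list = field(default_factory=list)

    def ingest(self, reading: dict) -> bool:
        """Record a sensor reading; return True if it stresses the design."""
        self.telemetry.append(reading)
        return reading.get("temp_c", 0.0) > self.max_temp_c

    def training_rows(self):
        """Export accumulated telemetry as (temp, rpm) rows for ML training."""
        return [(r["temp_c"], r["rpm"]) for r in self.telemetry]

twin = MotorTwin("motor-42", firmware="1.3.0")
twin.ingest({"temp_c": 71.5, "rpm": 1800})          # within limits
stressed = twin.ingest({"temp_c": 95.2, "rpm": 2400})  # exceeds limit
print(stressed, len(twin.training_rows()))  # True 2
```

Real twins track far richer state and synchronize with simulators, but this captures the core loop the article describes: field sensors feed the twin, the twin reveals stressed elements, and its history feeds ML.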
Digital twins can spur innovation in any industry, helping companies create a virtual representation of everything they need to take an idea and turn it into a product. However, a digital twin by itself isn’t enough to drive innovation. The growing number of factors to test, especially given industry predictions that virtual testing will soon increase 10-fold or 100-fold, is forcing manufacturers such as automotive companies to recognize that traditional methods simply don’t cut it and that digital twins need a much higher level of fidelity.
Traditional data processing methods are inadequate in the face of IoT and big data. Enter ML technologies. By 2021, more than half of enterprise infrastructure will make use of AI technologies. Much will come from embedded devices at the edge of the network, with data pouring in from every conceivable direction: operational and transactional systems, scanning and facilities management systems, inbound and outbound customer contact points, and mobile media and the Web. IoT/Big Data/AI trends will drive the embedded semiconductor market to more than $20B by 2021.
Edge computing brings high-performance compute closer to data sources. Real-time processing is often required to handle massive sensor- and IoT-generated data. Often, too, the data is simply too large to tolerate the latency of transmission to the cloud and must be analyzed locally for pertinent trends or aberrations that need to be flagged and acted upon immediately.
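The local-analysis pattern above can be sketched in a few lines: the edge node keeps routine readings to itself and forwards only the aberrations upstream. The function name, thresholds, and summary format below are illustrative assumptions.

```python
# Hypothetical edge-side filter: analyze sensor readings locally and
# forward only out-of-range values, instead of streaming raw data to
# the cloud. Thresholds and field names are illustrative.
def filter_for_upload(readings, low=10.0, high=80.0):
    """Split readings into out-of-range values to upload and a
    local summary of everything else."""
    flagged = [r for r in readings if not (low <= r <= high)]
    summary = {"count": len(readings),
               "kept_locally": len(readings) - len(flagged)}
    return flagged, summary

readings = [42.0, 55.1, 9.2, 61.7, 88.4]
flagged, summary = filter_for_upload(readings)
print(flagged)   # [9.2, 88.4] -- only these cross the network
print(summary)   # {'count': 5, 'kept_locally': 3}
```

The payoff is exactly the latency and bandwidth saving the paragraph describes: the bulk of the data never leaves the edge node.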
These trends are driving demand for networking capacity and compute where it’s needed: at the edge of the network. It will take the form of edge analytics, mobile edge compute, smart IoT gateways, and fog compute nodes, all requiring more CPU and GPU compute capability with greater networking.
I am more excited now than ever before about the opportunities ahead of us. This new era of networked connections among people, processes, data and machines will dramatically change how we interact with people and technology. MI may give rise to an entirely new generation of smart products and technologies, embedded everywhere in nearly every manner of device, harnessing data for better productivity. It will lead to developments over the next five years that we can hardly imagine today. It is truly disruptive technology: the embedded market is transforming in this new and exciting time. It’s a great opportunity for innovation… embrace the disruption.