AI-related technologies have been around for decades; despite the recent hype that has elevated AI to the forefront of the "digital" discussion, it is nothing new. So why are we excited about AI now? Because it is becoming increasingly ubiquitous, thanks to pivotal advances in how and where AI can be deployed. TinyML is one of these advances, promising to bring AI to the tiniest of things.
neXt Curve is joined by Joe Hoffman, research director at SAR Insight & Consulting, to explore the following topics:
- What is TinyML and why is it important? – We discuss what TinyML is, its implications for the world of AI computing, and the new possibilities it opens up for how and where ML modeling and inference can be deployed.
- How does TinyML work in principle? – We talk about the key principles, methods, and technologies that make TinyML tick and tiny (see the sketch after this list).
- What does TinyML mean for AI computing models going forward? – We discuss how TinyML will change the way intelligent systems are designed and deployed, as well as how we think about computing and the applications we build today and in the future.
- What are some of the ideal early applications of TinyML? – We share our hypotheses on the TinyML applications that will make a difference today and the prospects for a new breed of intelligent applications and endpoint devices operating anywhere and everywhere.
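The sketch below is not from the episode, but it makes the "tiny" in TinyML concrete by showing one of the core techniques the space relies on: post-training full-integer (int8) quantization with the TensorFlow Lite converter, which shrinks a trained model so it can fit and run on a microcontroller. The toy model, input shape, and random calibration data are illustrative placeholders, not anything specific discussed in the podcast.

```python
import numpy as np
import tensorflow as tf

# Toy stand-in for a real trained model: a tiny net roughly sized for
# keyword spotting on MFCC-style audio features (shape is illustrative).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(49, 10, 1)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(4, activation="softmax"),
])

def representative_dataset():
    # A handful of typical inputs lets the converter calibrate int8
    # quantization ranges; random data stands in for real features here.
    for _ in range(100):
        yield [np.random.rand(1, 49, 10, 1).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force full integer quantization so the model can run on MCUs
# that lack floating-point hardware.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
print(f"Quantized model size: {len(tflite_model)} bytes")
```

The resulting .tflite file can then be compiled into firmware and executed on-device with a runtime such as TensorFlow Lite for Microcontrollers, which is what allows inference to happen on milliwatt-class endpoints rather than in the cloud.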
