
The company's core technology was developed at SRI International; the company has since spun out and closed its first round of funding, led by Future Ventures. The company can compress common AI models by 10x without a noticeable change in accuracy, enabling deployment on the inexpensive microcontrollers, DSPs and other processor cores typically found in edge devices. This allows intelligence to migrate to the edge for local processing (e.g., face-detection algorithms running locally within security cameras or appliances, or Siri-like voice interfaces working instantly even when network connectivity is missing).
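The post doesn't describe Latent AI's specific method, but post-training quantization is one of the standard techniques behind this kind of compression: mapping 32-bit floating-point weights to 8-bit integers shrinks storage 4x by itself, and combined with pruning and weight sharing can reach 10x. A minimal sketch (the `quantize`/`dequantize` helpers are illustrative names, not Latent AI's API):

```python
import numpy as np

def quantize(weights: np.ndarray):
    """Affine-quantize float32 weights to int8; return the scale for dequantization."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 representation."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((256, 256)).astype(np.float32)  # toy weight matrix
q, scale = quantize(w)

print(f"size: {w.nbytes} -> {q.nbytes} bytes ({w.nbytes / q.nbytes:.0f}x smaller)")
print(f"max round-trip error: {np.abs(w - dequantize(q, scale)).max():.4f}")
```

The round-trip error per weight is bounded by half the quantization step, which is why accuracy barely moves for most networks.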
Couple some local intelligence to each sensor, and the Internet of Things becomes the sensory cortex of the planet, with countless data-collecting devices. All of this 'big data' would be a big headache were it not for machine learning to find the patterns that make it actionable, and edge computing to shift processing to the periphery and avoid network overload. In short, the edge needs AI, and AI needs the edge. Latent AI integrates both with a portfolio of IoT edge compute optimizers and accelerators that bring an order-of-magnitude improvement to existing infrastructure. This is essential, as the majority of new software today is trained as a neural net, and most compute cycles will shift to the edge.
From the NVIDIA CEO: “We’ll soon see the power of computing increase way more than any other period. There will be trillions of products with tiny neural networks inside.”
The core technologies of Latent AI come from SRI International, where I have been an advisory board member for over a decade and seen technologies like Siri develop and then spin out of SRI.
Here is today’s announcement from the company.
And the company site: LatentAI.com
I took this photo of CEO and co-founder Jags Kandasamy presenting Latent AI to the IoT Consortium.