My most recent investment and board seat came out of stealth today. Mythic is taking a radically novel approach to building low-power, low-cost neural nets that can make any product more intelligent. Imagine any device having a voice interface like the Amazon Echo, but with local processing. It would be more responsive and secure, with ~100x power and cost savings (at a comparable manufacturing process node), and no internet connection needed. Consider a Roomba or a microwave oven: a rich voice interface could cost less than the clumsy buttons it replaces. Imagine a Dropcam with local intelligence to avoid the daily false alarms, or drones that could better track their targets.

Mythic implements the basic computation of machine intelligence — matrix multiply and add — using a standard flash memory array with modified peripheral circuitry, so that the memory cells store analog values: the activation level (a) of each neuron and the weight (w) of each synapse. In the digital domain, multipliers and adders take many transistors, each consuming power and time.

Bottom line: Mythic can do an 8-bit multiply and add in a single transistor!

Computation and memory are unified, as in the brain. And flash is a non-volatile memory that consumes no power in standby mode.
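To make the core operation concrete, here is a minimal sketch (my own illustration, not Mythic's actual circuit model) of the matrix multiply-and-add that a neural layer performs. In the analog array, each stored weight acts like a conductance G, each activation is applied as a voltage V, the cell current is I = G × V, and the currents sum on a shared readout wire per Kirchhoff's current law — so the dot product falls out as a current:

```python
def analog_mac(weights, activations):
    """Model one readout column: the sum of conductance * voltage products.

    Each term models one flash cell's output current; the shared
    wire sums them, so the whole dot product is one analog readout.
    """
    assert len(weights) == len(activations)
    return sum(g * v for g, v in zip(weights, activations))

def layer(weight_matrix, activations):
    """One neural layer = one analog MAC per output column."""
    return [analog_mac(row, activations) for row in weight_matrix]

# Hypothetical example: 2 neurons, 3 inputs
W = [[0.5, -1.0, 2.0],
     [1.0,  0.0, 0.5]]
x = [1.0, 2.0, 3.0]
print(layer(W, x))  # → [4.5, 2.5]
```

A digital chip would spend many transistor switchings per multiply and per add here; in the analog array, the multiply is a single cell and the add is the wire itself.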

Couple some local intelligence to each sensor and “The internet of things is becoming the sensory cortex of the planet” — TechCrunch

“‘When you start to see these hints that we re-create evolutionary biology in a computer, you get the same basic building blocks, the same developmental milestones. It’s kind of spooky!’ When discussing Mythic, Jurvetson compared the startup to Minecraft…” — VentureBeat

I was drawing a loose analogy to the redstone wire in Minecraft implementing a digital OR. Mythic uses a common memory readout wire for current accumulation. Addition in the analog domain is instantaneous, and just takes a wire. It needs none of the transistors or time consumed by a digital carry-adder. With just inverters and OR logic to build with, here is an 8-bit CPU that my son built back in grade school.
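As a hedged illustration of that analogy: {NOT, OR} is functionally complete, so a digital adder really can be built from inverters and ORs alone — but note how many gate evaluations (transistors and propagation time) even a single-bit add costs, versus the analog approach, where addition is just current summing on a wire. This sketch is mine, not the CPU my son built:

```python
# Build everything from only inverters (NOT) and OR gates.
def NOT(a): return 1 - a
def OR(a, b): return a | b
def AND(a, b): return NOT(OR(NOT(a), NOT(b)))     # De Morgan: NOT/OR only
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

def full_adder(a, b, cin):
    """One bit of addition: many gate evaluations per bit."""
    s = XOR(XOR(a, b), cin)
    cout = OR(AND(a, b), AND(cin, XOR(a, b)))
    return s, cout

def add8(x, y):
    """Ripple-carry 8-bit add: the carry must propagate bit by bit."""
    carry, result = 0, 0
    for i in range(8):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result  # wraps modulo 256, like real 8-bit hardware

print(add8(100, 55))  # → 155
```

Eight bits of digital addition take dozens of gates and a carry chain; the analog equivalent is a single shared wire.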

From the NVIDIA CEO: “We’ll soon see the power of computing increase way more than any other period. There will be trillions of products with tiny neural networks inside.”

The march to specialized silicon, from CPU to GPU to FPGA to ASIC, is now going further, to analog and quantum processing. At a high level, we are recapitulating our evolutionary computational march in silicon, and an ever-growing percentage of our compute will be massively parallel, in-memory processing, just like our cortex. I wrote a blog post when Intel acquired our deep learning chip company, Nervana.

Now it’s time to get Mythic.

10 responses to “Mythic Intelligence at the Edge”

  1. Mythic silicon team, Texas, March 2017. HQ is here in Silicon Valley.

  2. New ideas are powerful, as is observing nature’s process.
    Still, I can’t help picturing an unhappy couple arguing in front of a voice-activated blender: "on" "off" "stop" "don’t stop" etc.

  3. We "brainstormed" over a dinner salon that went late into the night. We spoke a bit about your scenario… Now imagine you also have a camera to help resolve intent. You could flamboyantly orchestrate the lights like a conductor.

    I was also happy to vet my prediction with Yann LeCun that "we’ll build an artificial brain before we reverse engineer our own." Thanks to Shahin and Mike for co-hosting!

  4. Steve,
    This advanced design is really exciting. Good luck to the team.
    It seems to me that the brain does a few things well. Visual processing — like that of the common Drosophila melanogaster — is amazing, and it is what folks building ASICs for autonomous driving (like your friend Elon is reportedly doing) are targeting. Mythic is inspired by another portion of the brain. Combine these together with a "classical" CPU and they do a great job of mimicking the systems of the brain. I also wonder how they would work in a system with a few hundred qubits.

    There was a great article in The Economist recently about ASICs and how they, along with FPGAs, will take over the throne from the CPU in terms of processing might.

    http://www.facebook.com/joelsapp/posts/10213189137670152?notif_t...

  5. Very interesting. Thanks. A nice writeup just came out on Mythic and AI investment in general: about.crunchbase.com/news/vc-interest-ai-focused-startups…

  6. And one year later, the Series B. Congrats to Mythic on their $40M Series B, led by SoftBank with Andy Bechtolsheim, and an order of magnitude appreciation.

  7. And now the market warms up to the radical notion of building edge intelligence and a voice interface to almost everything…

    From today’s WSJ: "We’re moving toward a world in which everything with a plug or battery can respond to a voice command.

    The CEO of Samsung’s consumer-electronics division recently told The Wall Street Journal that by 2020 his company plans to equip every single device it sells—from TVs to refrigerators—with microphones.

    While it’s easy to make fun of a high-tech trash can, [Simplehuman] tackles one of the biggest concerns that comes with smart assistants: the fact that they record what we tell them and send it back to their parent companies. Simplehuman’s trash can doesn’t do this, says Guy Cohen, the company’s director of electronics engineering. That’s because the latest microphones and their attached microprocessors process human speech in the gadget itself, without connecting to the cloud.

    At first, self-contained processing in gadgets will be limited to simple commands and wake words, like telling a device to turn on or setting a timer. In time, these commands will become more complex.

    One justification for adding voice control to everything we use is that it could ultimately be an easier and more elegant interface than the morass of buttons and menus we face today. Just think how hard it is to work a friend’s microwave, convection oven or thermostat, and imagine instead just telling it what you’d like it to do."

  8. In the sincerest form of flattery, IBM’s roadmap switches to “in-memory computing through Analog AI Cores” in 2023, which is “required for full realization of deep learning AI” — from "IBM Bets $2B Seeking 1000X AI Hardware Performance Boost" in HPCwire. Reminds me of the full-page WSJ ad from Apple in 1981: "Welcome, IBM. Seriously."

  9. P.S. a juicy prediction from Gartner’s Oct 2020 report:

    “By 2025, more than 50% of all data analysis by deep neural networks will occur at the point of capture in an edge system, up from less than 5% in 2020.”

  10. And from the May 2023 issue of WIRED: "Bringing back analog computers in much more advanced forms than their historic ancestors will change the world of computing drastically and forever."

    "Mythic’s analog chip uses less power by storing neural weights not in SRAM but in flash memory, which doesn’t consume power to retain its state. And the flash memory is embedded in a processing chip, a configuration Mythic calls “compute-in-memory.” Instead of consuming a lot of power moving millions of bytes back and forth between memory and a CPU (as a digital computer does), some processing is done locally."
