Canon EOS 5D · ƒ/8 · 400 mm · 1/500 s · ISO 320

The mountain biking in Montebello was brisk this morning.

I found myself pondering last night’s party. A friend of mine, who evolves neuronal simulations, just put together a mind-bogglingly powerful computer (36 teraflops) with $80K of equipment from Fry’s.

Gordon Moore happened to be next to us with a gleam in his eye. I had to ask, “Who would have predicted that in 1965?” =)

The key to this system’s performance is the parallel power of graphics processors, which maps well to cortical simulation. For $499, an NVIDIA board sports 240 cores running at 1.5 GHz.
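Why does cortical simulation map so well to GPUs? Because it is mostly the same small update applied independently to a huge number of neurons, which is exactly the data-parallel shape those 240 cores are built for. Here's a minimal sketch of that shape using a vectorized leaky integrate-and-fire model in NumPy (every size and constant below is an illustrative assumption, not a detail of the actual system described in the post):

```python
import numpy as np

# Hypothetical scale and constants -- the real simulation's parameters aren't given.
N = 1_000_000                          # neurons
DT = 0.1                               # ms per step
TAU = 20.0                             # membrane time constant (ms)
V_REST, V_THRESH, V_RESET = -65.0, -50.0, -65.0

rng = np.random.default_rng(0)
v = np.full(N, V_REST)                 # membrane potential of every neuron
current = rng.normal(1.5, 0.5, N)      # constant input current per neuron

def step(v, current):
    """One leaky integrate-and-fire step, applied to all N neurons at once."""
    v = v + DT * ((V_REST - v) / TAU + current)   # leak toward rest + drive
    spiked = v >= V_THRESH                        # which neurons fired
    v[spiked] = V_RESET                           # reset the ones that did
    return v, spiked

for _ in range(100):
    v, spiked = step(v, current)
```

Each line inside `step` is one identical arithmetic operation fanned out across a million independent neurons; on a GPU that same loop body becomes a kernel where each core owns a slice of the array.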

8 responses to “Redhead”

  1. 36 teraflops! crazy.

how many cards does he stuff in a case? I count 140 systems if you get to put 4 cards on a mobo in a $300 bare-bones system. At 600 watts or so a system, that’s not easy to cool!

    But, mmmm neuronal simulations. How are those distinguished from neural nets?

  2. Hey Steve!

    Is he posting his notes anywhere? I’d be interested in building a smaller version… I’ve been working on some of my own neural simulations lately.

  3. I don’t see the details online. http://www.evolvedmachines.com
    Want me to ask him?

    BenODen: rather than a simple sum-sigma node with back propagation or some abstraction like that, he started with a fairly detailed dynamic neuron model, with ion channels and synapse formation in 3D. Details:

    Mental Model Brainstorm

  4. Damn that’s beautiful Steve. I am really going to miss the bay area if we move. Brrrr.

  5. Oooh, neato, dendrite-level simulation! Cortex simulation… wild, wild cool stuff.

  6. That’s one FAST horse.

  7. You should hear her terra clops

  8. oy vey SJ. =D

    GPU power is the future. As a simple end user, I’m just looking forward to Snow Leopard, which should make better use of my GPU and multiple cores.
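Comment 3 above contrasts the usual sum-sigma-plus-backprop abstraction with a "detailed dynamic neuron model, with ion channels." As a rough sketch of what ion-channel-level modeling means in practice, here are the textbook Hodgkin–Huxley equations for a single compartment, integrated with forward Euler. These are the standard squid-axon constants, not anything from the simulation discussed in the post:

```python
import numpy as np

# Classic Hodgkin-Huxley single-compartment model, standard textbook parameters.
C_M = 1.0                            # membrane capacitance (uF/cm^2)
G_NA, G_K, G_L = 120.0, 36.0, 0.3    # peak channel conductances (mS/cm^2)
E_NA, E_K, E_L = 50.0, -77.0, -54.4  # reversal potentials (mV)

def rates(v):
    """Voltage-dependent opening/closing rates for the m, h, n gates."""
    am = 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
    bm = 4.0 * np.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * np.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
    an = 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
    bn = 0.125 * np.exp(-(v + 65.0) / 80.0)
    return am, bm, ah, bh, an, bn

def simulate(i_ext=10.0, t_ms=50.0, dt=0.01):
    """Euler-integrate one neuron under constant drive; return the voltage trace."""
    steps = int(t_ms / dt)
    v, m, h, n = -65.0, 0.05, 0.6, 0.32   # rest state
    trace = np.empty(steps)
    for i in range(steps):
        am, bm, ah, bh, an, bn = rates(v)
        m += dt * (am * (1 - m) - bm * m)     # sodium activation gate
        h += dt * (ah * (1 - h) - bh * h)     # sodium inactivation gate
        n += dt * (an * (1 - n) - bn * n)     # potassium gate
        i_na = G_NA * m**3 * h * (v - E_NA)   # per-channel ionic currents
        i_k = G_K * n**4 * (v - E_K)
        i_l = G_L * (v - E_L)
        v += dt * (i_ext - i_na - i_k - i_l) / C_M
        trace[i] = v
    return trace

trace = simulate()
```

Every neuron now carries four coupled differential equations instead of one dot product, which is why this kind of simulation eats teraflops, and why an array of identical per-neuron updates like this parallelizes so naturally onto GPU cores.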
