Malte Wagenbach — February 2026
There is a pattern that shows up repeatedly at the frontier of computing. The hardware arrives first — exotic, powerful, misunderstood. Then, sometimes years later, the software catches up. And the moment it does, everything changes.
We saw it with GPUs. Nvidia built graphics hardware for rendering pixels. Then CUDA arrived. And suddenly the same chip that rendered video games could train neural networks. The rest is history — a company worth more than most countries, a complete reorganization of the AI industry around parallel compute.
Neuromorphic computing is at that inflection point right now. The hardware is here. The software isn't. Not yet.
Vantar is the attempt to fix that.
⸻
What Neuromorphic Computing Actually Is
Most people, when they hear "neuromorphic," assume it means "AI that works like the brain" in the vague, metaphorical way marketers use the phrase. It doesn't. Neuromorphic computing is a specific architectural choice: chips where computation happens through the firing of artificial neurons, each one spiking when its inputs reach a threshold, propagating signals forward in time rather than executing instructions in sequence.
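To make the mechanism concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the simplest of the spiking models, in plain Python. The constants are illustrative and not tied to any particular chip; the point is that a neuron that receives nothing does nothing, which is where the power savings come from.

```python
# Minimal leaky integrate-and-fire (LIF) neuron. Illustrative constants only.
def lif_step(v, input_current, decay=0.9, threshold=1.0, v_reset=0.0):
    v = decay * v + input_current   # membrane leaks toward rest, integrates input
    if v >= threshold:              # threshold crossed: emit a spike and reset
        return v_reset, 1
    return v, 0                     # below threshold: stay silent, no spike

# Drive the neuron with a short input trace; idle timesteps cost almost nothing.
v, spike_train = 0.0, []
for current in [0.0, 0.4, 0.5, 0.6, 0.0, 0.0]:
    v, spike = lif_step(v, current)
    spike_train.append(spike)       # -> [0, 0, 0, 1, 0, 0]
```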
Intel's Loihi 2. IBM's TrueNorth. SpiNNaker 2 from TU Dresden and the University of Manchester. These are real silicon, shipping now, with extraordinary properties. They are asynchronous — they don't tick in lockstep with a clock. They are sparse — neurons only compute when they fire, so idle circuits draw almost no power. They are massively parallel — millions of artificial neurons operating simultaneously, communicating through spikes rather than matrix operations.
The energy numbers are almost offensive in how good they are. Researchers have demonstrated neuromorphic inference running at up to 1000x lower power than GPU equivalents on specific workloads. Not 10x. Not 100x. A thousand times. For edge AI — sensors, robotics, autonomous systems that can't be plugged into a wall — this changes everything.
The hardware exists. It works. It has been working for years.
The problem is nobody could program it.
⸻
The Missing Layer
Traditional neural networks — the kind you train in PyTorch, the kind behind GPT and Stable Diffusion and everything else — don't run natively on neuromorphic hardware. The mathematics is different. GPUs execute dense matrix multiplications. Neuromorphic chips execute sparse spike propagation. Getting a trained model onto a neuromorphic chip required bespoke low-level code, hardware-specific APIs, and the kind of expertise that lives in maybe a dozen research labs worldwide.
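To make that contrast concrete, here is a toy comparison of the two compute patterns, with illustrative shapes and a made-up firing rate: a dense matrix multiply touches every weight on every step, while spike propagation only touches the rows belonging to neurons that actually fired.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal((1000, 1000))   # illustrative layer size

# Dense pattern (GPU-style): every input value multiplies every weight, every step.
dense_input = rng.standard_normal(1000)
dense_out = dense_input @ weights             # ~1,000,000 multiply-adds

# Sparse pattern (neuromorphic-style): only neurons that spiked contribute,
# so the work scales with activity rather than with network size.
fired = np.flatnonzero(rng.random(1000) < 0.02)   # assume ~2% of neurons fire
sparse_out = weights[fired].sum(axis=0)           # ~20,000 adds
```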
This is the gap Vantar is built to close.
The Nuro SDK — the open-source core of the Vantar platform — gives you a Python API that looks and feels like PyTorch. You define your spiking neural network. You train it using surrogate gradient methods on a GPU, because that's where the training infrastructure exists. And then you deploy it to neuromorphic hardware without changing a single line of code.
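The source doesn't show Nuro's API, so the sketch below illustrates only the underlying trick that makes GPU training of spiking networks possible, in plain PyTorch: the forward pass applies a hard spike threshold, and the backward pass swaps in a smooth surrogate derivative so gradients can flow through the non-differentiable spike.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Hard threshold forward, smooth "fast sigmoid" surrogate gradient backward."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0.0).float()            # spike when membrane exceeds threshold

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        surrogate = 1.0 / (1.0 + 10.0 * v.abs()) ** 2   # smooth stand-in for the step's derivative
        return grad_output * surrogate

spike_fn = SurrogateSpike.apply

# One LIF layer unrolled over time; the weights train with ordinary backprop.
def lif_layer(inputs, weights, decay=0.9, threshold=1.0):
    # inputs: [time, batch, in_features], weights: [in_features, out_features]
    v = torch.zeros(inputs.shape[1], weights.shape[1])
    spikes = []
    for x_t in inputs:
        v = decay * v + x_t @ weights
        s = spike_fn(v - threshold)
        v = v - s * threshold               # soft reset after each spike
        spikes.append(s)
    return torch.stack(spikes)
```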
Write once. Deploy anywhere.
The same network definition compiles to CPU simulation, to GPU acceleration, to Intel Loihi 2, to SpiNNaker 2. The SDK handles the translation. The researcher, the engineer, the edge AI team — they think about their network, not about the hardware abstraction layer underneath it.
This sounds simple. It isn't. It's the kind of infrastructure that takes years to build correctly and looks obvious in retrospect.
⸻
Why This Matters Now
The conversation around AI energy consumption has shifted from background concern to front-page problem. Training a large language model consumes as much electricity as a small town. Running inference at scale isn't much better. As AI moves from the cloud to the edge — into sensors, wearables, vehicles, industrial equipment — the power budget collapses from kilowatts to milliwatts.
You cannot run a data-center GPU on a battery-powered device. The power budget doesn't exist.
Neuromorphic chips can do what GPUs cannot: always-on inference at milliwatt power budgets. A camera that never sleeps, processing every frame, drawing less power than an LED. A microphone that listens continuously and only wakes the system when something meaningful happens. Robotics that can perceive and react without a power tether.
The applications aren't speculative. They're waiting on software that makes the hardware accessible.
Vantar is that software.
⸻
What I Built
Vantar is three things:
Nuro is the open-source SDK — Apache 2.0, Python 3.10+, built on PyTorch. It provides the neuron models (LIF, Izhikevich, AdEx, IF), the training infrastructure, and the hardware compilation backends. It is the layer everything else stands on. 121 tests passing. A codebase built to be extended.
Vantar Cloud is the managed deployment platform — the place where you take a trained network and put it on neuromorphic hardware without buying and operating the hardware yourself. The cloud handles provisioning, routing, monitoring. You get an API endpoint. The neuromorphic chip does the work.
The Vantar Dev Kit is what comes next: an edge module pairing an event camera with a neuromorphic processor. Event cameras are another piece of hardware that has been waiting for its software moment. They don't capture frames — they detect changes in light intensity per pixel, asynchronously, at microsecond resolution. Paired with a neuromorphic processor, the two become something genuinely new: a visual system that processes the world the way retinas do, in continuous time, at almost no power cost.
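For readers who haven't handled event-camera output: there are no frames, only a stream of per-pixel events. Below is a minimal sketch of that data shape, with made-up values and nothing assumed about Vantar's dev kit.

```python
from collections import namedtuple

# An event camera emits a sparse, asynchronous stream of per-pixel events:
# timestamp in microseconds, pixel coordinates, and the polarity of the change.
Event = namedtuple("Event", ["t_us", "x", "y", "polarity"])

stream = [
    Event(t_us=1032, x=120, y=64, polarity=+1),   # pixel got brighter
    Event(t_us=1047, x=121, y=64, polarity=+1),
    Event(t_us=1119, x=300, y=18, polarity=-1),   # pixel got darker
]

# Each event maps naturally onto one input spike for one input neuron,
# which is why the pairing with a spiking processor is so direct.
def event_to_neuron_index(e, sensor_width=640):
    return e.y * sensor_width + e.x

spike_addresses = [event_to_neuron_index(e) for e in stream]
```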
⸻
The Deeper Pattern
I keep coming back to the CUDA analogy because I think it's the right one. Not because the outcomes will be identical — history doesn't repeat, it rhymes — but because the structure of the problem is the same.
Transformative hardware exists. It's inaccessible to most people who could use it. Someone builds the abstraction layer. The hardware becomes a platform. Everything built on top of it becomes possible.
Neuromorphic computing has been locked behind that abstraction layer for years. The chips got better. The energy numbers got more impressive. The research papers accumulated. But the software stayed hard, stayed specialized, stayed out of reach for anyone who wasn't already inside one of the handful of labs with the expertise to use it.
Vantar is the unlock.
The bet is that once you make neuromorphic hardware as easy to program as any other hardware — once you give researchers and engineers a Python API they already understand — the applications will arrive faster than anyone expects. Edge devices that actually work. Sensors that think. Systems that perceive without draining the battery in an hour.
The hardware was always ready. We just needed to write the software.
⸻
Vantar is available now. The Nuro SDK is open source under the Apache 2.0 license. Vantar Cloud is in early access. If you're working on edge AI or neuromorphic research, or are just curious what a 1000x energy improvement buys you — vantar.xyz