Researchers have developed a new kind of nanoelectronic device that could dramatically cut the energy consumed by artificial intelligence hardware by mimicking the human brain.
The researchers, led by the University of Cambridge, developed a form of hafnium oxide that acts as a highly stable, low‑energy ‘memristor’ — a component designed to mimic the efficient way neurons are connected in the brain.
It’d probably be far more appropriate for an analogue system, where the memristor isn’t being switched but is instead the medium the model is burned onto
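For what it's worth, the appeal of "burning the model on" is that a memristor crossbar computes a matrix–vector product in one analogue step, with the weights stored as device conductances. A minimal numerical sketch of that idea, assuming idealised devices (everything here is illustrative, not from the article):

```python
import numpy as np

# Idealised memristor crossbar: each cell's conductance G[i][j] (siemens)
# stores one weight. Driving the rows with input voltages V makes each
# column wire sum currents I_j = sum_i V_i * G[i][j] (Ohm's and
# Kirchhoff's laws), i.e. a matrix-vector product in a single analogue
# read, with no data movement between memory and compute.

rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 3))        # the "model", written in once
g_max = 1e-4                             # max device conductance (S), assumed

# Conductances can't be negative, so signed weights are commonly split
# across a differential pair of devices (details vary by design).
scale = np.abs(weights).max()
g_pos = np.clip(weights, 0, None) / scale * g_max
g_neg = np.clip(-weights, 0, None) / scale * g_max

v_in = np.array([0.1, -0.2, 0.05, 0.3])  # input voltages (V)

# Column currents: one crossbar read yields the whole product.
i_out = v_in @ g_pos - v_in @ g_neg

# Matches a digital matmul up to the conductance scaling factor.
expected = (v_in @ weights) * (g_max / scale)
assert np.allclose(i_out, expected)
```

The energy argument is that the multiply-accumulate happens "for free" in the physics of the array, instead of shuttling weights through a memory hierarchy on every inference step; the stability of the hafnium-oxide devices matters because the stored conductances drift in real hardware.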
This seems like such a glaringly obvious way to lower inference cost that surely there must be some fundamental flaw in it… otherwise all of the big AI firms would be doing it, right?
Right…?
It takes a while for the technology to become available in ASICs; we still don’t have purpose-designed silicon for AI. We’re still using repurposed GPUs with scaled-up tensor cores for pretty much all AI workloads