A new computer chip implements Bayesian logic with analog circuits, but still has a conventional digital computer “surrounding” the Bayesian logic components. In other words, it's a digital computer with an analog computer “accelerator” built right into it. The claim is that it reduces the circuit power and complexity needed to achieve a given level of computational performance.
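To make the "Bayesian logic" part concrete: the fundamental operation such a unit accelerates is a posterior update via Bayes' rule. This is just an illustrative sketch of that math (not the chip's actual circuit design), using a binary hypothesis and a noisy sensor with known likelihoods:

```python
def bayes_update(prior_h, p_obs_given_h, p_obs_given_not_h):
    """Return P(H | observation) given the prior P(H) and the two
    likelihoods P(obs | H) and P(obs | not H), via Bayes' rule."""
    numerator = p_obs_given_h * prior_h
    evidence = numerator + p_obs_given_not_h * (1.0 - prior_h)
    return numerator / evidence

# Example: prior of 0.5; the sensor fires 90% of the time when H is
# true and 20% of the time when it is false.
posterior = bayes_update(0.5, 0.9, 0.2)
print(round(posterior, 4))  # 0.45 / 0.55 ≈ 0.8182
```

The appeal of doing this in analog hardware is that the multiplications and normalization fall out of the circuit physics rather than requiring sequences of digital arithmetic operations.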
About a million years ago, when I first started learning about computers, I played with some electronic analog computers in the lab. These were implemented with (primitive) operational amplifiers and simple electronic circuitry. Later, when I was stationed on a ship, I was called in a few times to help troubleshoot an actual, production analog computer: the guidance computer for the Terrier anti-air missile system. This beast of a computer was electro-mechanical: it used synchros, gears, cams, etc. to do some very complex computations in real time.
These old analog computers were cantankerous and prone to non-obvious calculation errors. We thought of digital computers as better in every respect.
So it feels kind of weird to see them being proclaimed as the next great thing!