The old dream of Jacques Vidal, who coined the term brain-computer interface (BCI) in 1973, has taken on a concrete and spectacular dimension in recent years. While Neuralink, Elon Musk’s company, monopolizes attention with its chips implanted directly in the cerebral cortex, a less radical alternative is emerging from Chengdu, the capital of China’s Sichuan province. The startup Gestala promises to “read” the human mind using a technology we all know: ultrasound.
The concept rests on a whole-brain approach: unlike physical electrodes, which only capture signals in a restricted area, cranial ultrasound could map the activity of the entire brain. Phoenix Peng, CEO of Gestala, explains that “the electrical brain interface only records part of the brain; ultrasound, it seems, can give us the ability to access the entire brain.”
By analyzing variations in cerebral blood flow with great precision, these new devices pursue several ambitious goals: restoring mobility to paralyzed patients, but also treating certain mental disorders. Gestala ultimately plans a second-generation headset capable of detecting conditions linked to depression or chronic pain, then delivering targeted therapeutic stimulation.
Silicon Valley follows suit
However, the technical challenge remains colossal. As Popular Mechanics magazine points out, while using ultrasound to destroy malfunctioning neurons (notably in the context of Parkinson’s disease) is an established practice, extracting reliable data through the bony wall of the skull is another matter entirely. Bone tends to distort sound waves, blurring the fidelity of the information collected.
Still, optimism reigns in Silicon Valley as much as in Asia. This technological push is part of a global trend in which artificial intelligence is increasingly applied to medicine. In September 2025, UCLA researchers demonstrated that a simple electroencephalography (EEG) cap coupled with AI could translate thoughts into movements without any incision.
For Jonathan Kao, author of this study, the objective is clear: “By using artificial intelligence to complement brain-machine interface systems, we are aiming for much less risky and invasive paths.”
The market is not mistaken. OpenAI recently invested in Merge Labs, a startup also pursuing ultrasound in its research. If these technologies manage to clear the threshold of 1,000 microvariations, cited as necessary for smooth reading, we could soon control our computers by thought alone, without first passing through the operating room.