MIT’s robotic nose can detect early signs of illness
This article originally appeared on our sister site, Freethink, as part of The Future Explored, a weekly guide to world-changing technologies.
Our smartphones know a lot about us: they can hear us, see us and feel our touch.
What they can’t do is smell us – at least not yet. But MIT researchers are working toward a future where we carry a miniature olfactory system in our pockets.
Sniff the disease: Diseases often alter the smell of the human body – and being able to detect that smell could lead to earlier and more accurate medical diagnoses.
According to Business Insider, modern medical articles have described yellow fever as smelling like raw meat, typhoid like baked bread, and diabetic ketosis like bad apples. Two years ago, a woman swore she could smell her husband’s Parkinson’s disease.
And scientifically, this is true: when a virus attacks a healthy cell, a toxic byproduct is produced, which the body can emit via breath, sweat or urine.
“In theory, this should be the earliest possible detection of any possible infection event,” Josh Silverman, CEO of Aromyx, a biotech startup that focuses on digitally reproducing our sense of smell, told BI.
“You are measuring the production of an infected cell. It can happen long before you achieve viral replication.”
When a healthy cell is attacked by a virus, a toxic by-product is produced. This smell could be the earliest possible detection of a disease.
The problem is, a person’s sense of smell is weak and subjective, so we can’t rely on a human nose to diagnose a patient. A dog’s nose is much more sensitive, up to 100,000 times better than ours. (Sniffer dogs have even been trained to screen travelers for the coronavirus.)
“Dogs, for about fifteen years now, have proven to be the earliest and most accurate disease detectors for anything we have ever tried,” said Andreas Mershin, a scientist at MIT. “So far, many types of cancer have been detected earlier by dogs than by any other technology.”
The problem with dogs is that they must be trained to detect each specific disease – and training them is expensive and time-consuming. It’s also not very practical to bring a dog into every doctor’s office or airport, for example.
Dogs can detect illnesses, but training them is expensive and time consuming.
So Mershin and his team are creating a digital dog nose that could one day be built into every smartphone.
Nano-Nose: Earlier this year, Mershin and his team announced that they had created the Nano-Nose – an AI-powered robotic nose – that could identify cases of prostate cancer from urine samples with 70% accuracy. The study reports that the robotic nose performs as well as dogs trained to detect the disease.
“Once we build the machine nose for prostate cancer, it will be completely adaptable to other diseases,” Mershin told the BBC.
According to Mershin, the Nano-Nose is “200 times more sensitive than a dog’s nose” when it comes to detecting and identifying tiny traces of different molecules emitted by a human body.
But the device is “100% stupid” when it comes to interpreting those molecules. That’s where AI comes in: the data collected by the Nano-Nose’s sensors is run through a machine learning algorithm that interprets complex patterns of molecules.
Interpreting those patterns is essential to making an accurate diagnosis. The presence of one molecule, or even a group of molecules, does not necessarily mean cancer – but a complex pattern can. Scientists are still trying to work out these patterns, but dogs detect them naturally.
“Dogs don’t know any chemistry,” Mershin said. “They don’t see a list of molecules popping up in their heads. When you smell a cup of coffee, you don’t see a list of names and concentrations; you get an integrated sensation. This sensation is what dogs can exploit.”
MIT’s robotic nose has learned what cancer smells like.
To train the AI to harness this sensation as well as dogs do, the team had to perform a complex analysis of the individual molecules in the urine samples and to understand the urine’s genetic makeup. They trained the AI on this data, along with the data from the dogs’ own analyses, to see whether the machine could recognize a pattern across the different data sets.
And it did: the machine learned what cancer smells like.
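The pattern-matching idea behind this kind of system can be sketched with a toy classifier. To be clear, this is an illustrative sketch, not the team’s actual pipeline: it assumes each sample yields a fixed-length vector of sensor responses, uses a simple nearest-centroid rule to learn the joint pattern that separates classes, and all sample data and names below are hypothetical.

```python
# Illustrative sketch: classifying odor "fingerprints" from a chemical sensor
# array. Each sample is a vector of sensor responses; a nearest-centroid rule
# learns the combined pattern that separates classes. All data is synthetic.

import math
from collections import defaultdict

def train_centroids(samples):
    """Average the sensor vectors for each label into a class centroid."""
    sums, counts = defaultdict(lambda: None), defaultdict(int)
    for vec, label in samples:
        if sums[label] is None:
            sums[label] = list(vec)
        else:
            sums[label] = [a + b for a, b in zip(sums[label], vec)]
        counts[label] += 1
    return {label: [x / counts[label] for x in s] for label, s in sums.items()}

def classify(centroids, vec):
    """Assign the label of the nearest centroid (Euclidean distance)."""
    return min(centroids, key=lambda lab: math.dist(centroids[lab], vec))

# Synthetic 4-sensor readings: no single sensor decides the label on its own;
# it is the joint pattern across sensors that distinguishes the two classes.
training = [
    ([0.9, 0.1, 0.8, 0.2], "positive"),
    ([0.8, 0.2, 0.9, 0.1], "positive"),
    ([0.2, 0.9, 0.1, 0.8], "negative"),
    ([0.1, 0.8, 0.2, 0.9], "negative"),
]
centroids = train_centroids(training)
print(classify(centroids, [0.85, 0.15, 0.85, 0.15]))  # an unseen sample
```

The real system faces a far harder version of this problem – noisy sensors, thousands of overlapping molecules, and subtle patterns – which is why a trained machine learning model, rather than a hand-written rule, is needed.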
“What we haven’t shown before is that we can train artificial intelligence to mimic dogs,” Mershin said. “And now we’ve shown that we can do it.”
Outside the laboratory: Next, Mershin and his team need to replicate those results in a larger study – he hopes to run one with at least 5,000 samples. He also has to make sure the system works outside the pristine laboratory environment; to be useful in the real world, it will need to operate in settings where many odors are present.
Still, this proof of concept is exciting because it means that a powerful way to diagnose disease could possibly be found in everyone’s pocket.
Smartphones that can smell are close to reality, Mershin told Vox.
“I think we’re maybe five years away, maybe a little less,” he says, “to get it from where it is now to being fully integrated into a phone. And I’m talking [about deploying it] in a hundred million phones.”