What do oceans and brain scans have in common? Artificial intelligence designed for neuroimaging research has led to new ways of identifying individual whales in the wild. Parley SnotBot expeditions bring these innovations from lab to sea.
Forget faraway galaxies; some of the most persistent mysteries in our universe are right in front of us, or even inside of us. From the human brain to the deep oceans, there are some systems that have proved simply too complex to fully understand.
That’s beginning to change. Thanks to major advances in high-powered computing, increasingly robust machine learning algorithms are tackling challenges across a broad range of disciplines. And researchers are finding that breakthroughs in one area can lead to discoveries in seemingly unrelated fields.
Ted Willke, a principal engineer at Intel’s research arm, Intel Labs, has spent the last several years working alongside scientists at the Princeton Neuroscience Institute. He and his academic partners, who call the collaboration the Mind’s Eye Lab, have developed custom AI software that helps read functional magnetic resonance imaging (fMRI) scans of the brain.
Machine learning algorithms, supported by powerful custom servers, parse the enormous volume of data in each scan, identifying discrete neural activities in a matter of seconds. Researchers use the information to gain insight into cognitive functions such as memory or learning. They expect these tools will shed new light on little-understood conditions like Parkinson’s disease, post-traumatic stress disorder (PTSD), and Alzheimer’s disease.
This work has been challenging because of the complexity of brain scans. The signals picked up by fMRI are extremely weak compared to the background noise of a brain that is controlling thousands of bodily tasks at once. But after years of work, the team has reduced fMRI analysis from days to seconds.
From Brain to Baleen
Some of the dividends of this research have been unexpected. The same machine learning algorithms developed to help neuroscientists better understand the human mind are proving useful to oceanographers studying whales as a barometer of ocean health. After all, identifying an individual whale by the unique shape of its tail in the open ocean can be as challenging as searching for a pattern in the synaptic chaos of a brain at work.
Willke has applied much of his work at the Mind’s Eye Lab to an ongoing collaboration with Parley for the Oceans, which uses a modified drone, called a “SnotBot” because it collects whale mucus in petri dishes, to track whales at sea. He has developed similar algorithms that use drone images to identify individual whales; in the future, they will also provide volumetric analysis to measure whale health.
This leap—from evaluating brain activity to identifying whales—was not as difficult as one might expect.
“It's surprisingly a little bit easier to detect the important signals in whale imaging data,” Willke explained. “The task of the neuroimaging has prepared us well for the types of challenges we see with whales.”
Whales can be identified by color, the speckles across their bodies, and the tiny ridges along their flukes. Many of these details are imperceptible to the human eye, Willke said, but machine learning can pick out these subtle patterns. The artificial intelligence matches each animal by running similarity analysis against population databases, and it remains highly effective even when given only a partial view of a whale swimming and diving in rough seas.
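The similarity-analysis step can be sketched in a few lines. This is a minimal illustration, not the team’s actual system: the whale IDs and feature vectors below are hypothetical, and in practice the vectors would be produced by a learned vision model from fluke photographs (encoding speckle patterns, ridge shapes, and coloration).

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_whale(query, database):
    """Return the (whale_id, score) in the database most similar to the query vector."""
    best_id, best_score = None, -1.0
    for whale_id, features in database.items():
        score = cosine_similarity(query, features)
        if score > best_score:
            best_id, best_score = whale_id, score
    return best_id, best_score

# Hypothetical population database of feature vectors.
database = {
    "whale_A": [0.9, 0.1, 0.4],
    "whale_B": [0.2, 0.8, 0.5],
}

# A noisy observation of whale_A (e.g., a partial view in rough seas)
# still lands closest to the right entry.
query = [0.85, 0.15, 0.35]
print(match_whale(query, database))
```

Because cosine similarity compares the direction of feature vectors rather than exact values, a degraded or partial observation can still match the correct individual, which is the property that matters in murky, reflective open-water imagery.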
“The animals don't cooperate, the weather doesn't cooperate, the water doesn't cooperate,” said Willke. “We're looking into murky waters at an animal that's a very similar color to the water, where the textures are subtle. And the water itself creates reflection, refraction, and the whale creates wake.”
Body Beyond the Brain
There are many other cases where machine learning tools can isolate an elusive signal, the salient data, from a noisy environment. Willke says that AI research in neuroimaging may offer lessons for other seemingly unrelated fields, such as autonomous driving. But his work at the Mind’s Eye Lab may have more immediate applications, such as imaging techniques for other regions of the human body.
In healthcare, several businesses have already launched AI imaging solutions that promise to revolutionize medicine. One such company, Arterys, uses deep learning, a subset of machine learning inspired by the neural networks of the human brain, to automate cardiac imaging analysis.
“There are multiple applications within medical imaging that require some sort of automation, whether it’s in identification of certain disease, quantification, segmentation, or any sort of classification. These techniques can identify whether a tumor is benign or not, for example,” said Arterys founder John Axerio-Cilies. “We’re just at the start. There’s so much more that we can do to help augment physicians and radiologists so that they can focus on the patient, and less so on finding minute details in the images.”
These emerging technologies are transforming medicine, and the transformation is speeding along largely thanks to open-source approaches that shorten the innovation cycle.
Willke’s work with the Mind’s Eye Lab, for example, is available to the public so that the medical community can build on its progress, discovering new methods for the diagnosis and treatment of disease. Similarly, the whale research with Parley leverages a global network of scientists, designers, and environmentalists.
Dr. Iain Kerr, Parley for the Oceans’ chief science officer, said collaborative projects like SnotBot mark only the beginning of what’s possible when it comes to exploring complex systems.
“We're now at this fascinating tipping point,” said Kerr. “If we have the capacity to do analytics in the field with AI, it really gives us an opportunity to understand what’s going on in places we couldn’t access before.”
In the future, he explains, AI-powered whale identification will happen in real time, and technologies like 5G connectivity will allow a global audience to follow live footage from the drone. Kerr promises that this is only the beginning, and other remote ecosystems stand to benefit as well.
Whether we are looking into the human brain or the deep oceans, one thing is clear: Artificial intelligence will play a crucial role in the next chapter of scientific research.