Parley ‘SnotBot’ expedition brings machine learning and artificial intelligence to whale research

In more than five years with Intel Labs, I’ve had the good fortune to work with a team of extremely bright people researching large-scale machine learning and data mining techniques. And now a great job is about to get even better, as I head out to sea this week with colleagues from Intel Labs and a collaborative team of ocean experts to put those same techniques to work studying the health and well-being of humpback whales.

This expedition builds on our ongoing work with Parley for the Oceans, a group dedicated to advancing research and education focused on conserving whales and the oceans that sustain them. In early June, we highlighted some of the great work that Parley is doing in our blog on Parley ‘SnotBot’ and Intel AI Technology.

(Christian Miller Photography)

This week’s Parley SnotBot expedition, parts of which will be captured live by National Geographic, will build on the work being done by the Mind’s Eye Lab, a research initiative arising out of a collaboration between Intel Labs and the Princeton Neuroscience Institute. The Mind’s Eye Lab is focused on decoding digital brain data from functional magnetic resonance imaging (fMRI) scans to reveal how neural activity gives rise to learning, memory, and other cognitive functions. The technologies that make this decoding possible are a perfect match for sifting through the noisy data captured by drones hovering over whales in the ocean.

The drone innovation at play here is known as “SnotBot,” a colorful term that stems from the drones’ ability to collect the blow, or snot, exhaled from whales. That blow is rich with biological data, ranging from DNA, stress hormones and pregnancy hormones to viruses, bacteria, and toxins. These same drones also capture video of whales, which is the focus of the current expedition.

(Christian Miller Photography)

In our expedition on Alaska’s Frederick Sound, our team of computer scientists and marine biologists will employ sophisticated algorithms to analyze large amounts of video imagery captured by the SnotBot drones. This analysis will give us the ability to gain clarity from data that is inherently murky.

So why is that data murky? When you capture video of whales, in our case the lively humpback whale, you are gaining glimpses of a twisting and turning target whose full form is often obscured by water. Water absorbs color and reduces contrast, making parts of the animal hard to see. Waves further distort the view by reflecting and bending the light that gets through. Even the parts of the animal that surface, such as the whale’s flukes (the two lobes of the tail), are distorted by gravity, lighting conditions, and viewing angle.
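
As a rough illustration of the kind of correction involved, here is a minimal Python sketch, using OpenCV, of two generic steps that push back against exactly these effects: a gray-world white balance to offset water’s selective color absorption, and local contrast enhancement to recover detail lost to scattering. This is not the expedition’s actual pipeline, and the file names are hypothetical.

```python
# Illustrative only: generic underwater color/contrast restoration.
import cv2
import numpy as np

def restore_underwater_frame(frame: np.ndarray) -> np.ndarray:
    # Gray-world white balance: scale each channel so its mean matches the
    # overall mean, compensating for water absorbing reds faster than blues.
    means = frame.reshape(-1, 3).mean(axis=0)
    balanced = np.clip(frame * (means.mean() / means), 0, 255).astype(np.uint8)

    # CLAHE on the lightness channel restores local contrast lost to scattering.
    l, a, b = cv2.split(cv2.cvtColor(balanced, cv2.COLOR_BGR2LAB))
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return cv2.cvtColor(cv2.merge((clahe.apply(l), a, b)), cv2.COLOR_LAB2BGR)

frame = cv2.imread("drone_frame.jpg")  # hypothetical frame from drone video
cv2.imwrite("restored.jpg", restore_underwater_frame(frame))
```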

(Christian Miller Photography)

Naturally, it can be an extremely challenging endeavor to identify whales and assess their health using manual processes and conventional image processing. This is where machine learning comes into play. Applied to drone video, machine learning can model and remove the distortions caused by water, bringing out whale color patterns, speckling, and shape. Distinctive features can then be compared with images from whale databases, making it possible to automatically identify individual whales.
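
To make the matching step concrete, here is a hedged sketch of one generic way to compare a new fluke photo against a catalog. ORB keypoint matching from OpenCV stands in for the expedition’s actual models, which this post does not describe, and the image paths and whale IDs below are invented.

```python
# Illustrative only: match a fluke photo against a small catalog by
# counting strong local-feature correspondences.
import cv2

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def describe(path: str):
    # Extract binary ORB descriptors from a grayscale image.
    image = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, descriptors = orb.detectAndCompute(image, None)
    return descriptors

query = describe("new_fluke.jpg")  # hypothetical new sighting
catalog = {"whale_042": describe("catalog/whale_042.jpg"),
           "whale_107": describe("catalog/whale_107.jpg")}

# Score each known whale by its number of mutually consistent matches;
# the highest-scoring entry is the most likely individual.
scores = {whale_id: len(matcher.match(query, descriptors))
          for whale_id, descriptors in catalog.items()}
print(max(scores, key=scores.get))
```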

Machine learning algorithms can also provide researchers with a full view of the animal. With that view, the algorithms can help researchers assess the health of a whale by determining its shape and size and generating a body condition index (BCI) score. All of this is accomplished in a matter of seconds, even without clear images. Essentially, machine learning and AI give researchers X-ray vision that sees through the water.
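
The post doesn’t spell out the BCI calculation, so the following Python sketch only illustrates the general idea behind photogrammetric body condition scoring: compare the width measured at a point along the body to the width expected for an animal of that length. The ratio constant and the measurements are invented for illustration.

```python
# Illustrative only: a ratio-style body condition index (BCI).
def body_condition_index(length_m: float, width_m: float,
                         expected_width_ratio: float = 0.21) -> float:
    """Observed width divided by the width expected from body length."""
    expected_width = expected_width_ratio * length_m
    return width_m / expected_width  # > 1.0 suggests a well-nourished animal

# Measurements would come from drone imagery scaled by flight altitude;
# these numbers are made up.
print(round(body_condition_index(length_m=13.5, width_m=3.0), 2))
```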

(Christian Miller Photography)

In machine learning terms, we are talking about unsupervised learning. This means that researchers don’t need to mark up the images they feed to the algorithms to help them along. Instead, the algorithms make their own judgments about the images and continually build on what they learn. While this week’s expedition focuses on humpback whales, the same technologies can be applied to other animals to generate deeper insights than would be possible even with direct observation in the field.
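
To show what skipping the markup step looks like in practice, here is a minimal unsupervised-learning sketch using scikit-learn: k-means groups unlabeled feature vectors purely by similarity, with no human-provided labels. The random vectors stand in for real embeddings extracted from drone footage; the expedition’s actual algorithms are not described in this post.

```python
# Illustrative only: clustering unlabeled image features.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
features = rng.normal(size=(200, 64))  # stand-in for per-frame embeddings

# No labels are supplied: k-means groups frames purely by similarity,
# and researchers can then inspect what each cluster corresponds to.
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(features)
print(np.bincount(labels))  # frames per discovered cluster
```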

Needless to say, I’m honored to be a part of this mission and excited at the prospect of taking machine learning techniques out of the laboratory and applying them on the waters off the coast of Alaska to gain insights that can help us protect whales and the oceans that sustain them.

To catch a glimpse of the Parley SnotBot expedition, tune in to the National Geographic live event at 8 p.m. EDT on Sunday, July 9. To learn more about Intel’s machine learning technology, visit www.intel.com/machinelearning.