Source: bioRxiv
This past September a bioRxiv preprint caught my eye: “A principal odor map unifies diverse tasks in human olfactory perception.” Given my longstanding interest in odor maps and since one of the authors is Joel Mainland, a guy whose work I have long admired, it seemed ripe for a substack. But after skimming the paper, I lost interest.
Why? Because it relies on machine-learning (AI) algorithmic manipulation, which I find to be a profoundly misguided approach to smell; because its input data was an archival, low-resolution odor descriptor database; and because trying to make sense of the paper’s aims and methods was like unraveling a bowl of sticky spaghetti.
For example, I stalled out when I got to Figure 1 on page 3. It’s a mishmash of references to principal odor maps, graph neural networks, cFP structural fingerprints, and principal component statistics. And to what end? To support the idea that certain molecular properties tend to line up with certain odor descriptors. Hardly breaking news, with or without fancy math.
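If you want the gist without the fancy math, it fits in a few lines of Python: turn each molecule into a structural fingerprint, then fit a model that maps fingerprints to odor descriptors. What follows is a toy sketch, not the paper’s pipeline; the molecules are real odorants, but the descriptor labels are mine, purely for illustration.

```python
# Toy sketch of the structure-to-odor exercise: Morgan fingerprints in,
# odor descriptors out. Not the paper's GNN pipeline; labels are made up.
import numpy as np
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem
from sklearn.linear_model import LogisticRegression
from sklearn.multioutput import MultiOutputClassifier

N_BITS = 1024

def fingerprint(smiles):
    """Morgan bit fingerprint (a crude stand-in for the paper's cFPs)."""
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius=2, nBits=N_BITS)
    arr = np.zeros((N_BITS,), dtype=np.int8)
    DataStructs.ConvertToNumpyArray(fp, arr)
    return arr

# A handful of familiar odorants with illustrative [fruity, floral, sulfurous] labels.
data = [
    ("CCOC(=O)C",           [1, 0, 0]),  # ethyl acetate
    ("CCCC(=O)OCC",         [1, 0, 0]),  # ethyl butyrate
    ("CC(C)=CCCC(C)(O)C=C", [0, 1, 0]),  # linalool
    ("CC(C)=CCCC(C)=CCO",   [0, 1, 0]),  # geraniol
    ("CCS",                 [0, 0, 1]),  # ethanethiol
    ("SCCS",                [0, 0, 1]),  # 1,2-ethanedithiol
]
X = np.array([fingerprint(s) for s, _ in data])
y = np.array([labels for _, labels in data])

# One logistic-regression classifier per descriptor: structure in, descriptor out.
model = MultiOutputClassifier(LogisticRegression(max_iter=1000)).fit(X, y)

# "Predict" the odor profile of methyl butyrate (typically described as fruity).
print(model.predict(fingerprint("CCCC(=O)OC").reshape(1, -1)))
```

Swap the fingerprints for a graph neural network and my made-up labels for a big archival descriptor database and you are, roughly speaking, in the territory Figure 1 is staking out.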
Around the same time there began a drumbeat of stories about Google, AI, and predicting the smell of a molecule from its chemical structure.
“Can Google Smell? Why Digitizing Odor Could Be a Business Opportunity,” INC.com, September 8, 2022
“Google AI can tell what things smell like by the molecular structure,” New Scientist, September 12, 2022
“Machine Learning Highlights a Hidden Order in Scents,” Quanta Magazine, October 10, 2022
“AI Predicts What Chemicals Will Smell like to a Human,” Scientific American, October 27, 2022
“This Startup Is Using AI to Unearth New Smells,” Wired, January 24, 2023
“Google Spin-Off Osmo Develops AI to Create Aroma Molecules for Fragrances,” Perfumer & Flavorist, January 26, 2023
Only this week did it dawn on me that all the Google AI smell hoopla was connected to the “Principal Odor Map” paper and two related preprints on bioRxiv (none of which has yet been published in a peer-reviewed journal). The common link is co-author Alexander B. Wiltschko, a Harvard neuroscience PhD (2016) and member of Google Research’s Brain Team in Cambridge, Mass. He is now “Entrepreneur in Residence” at Google Ventures, which is one of the funders of his new company Osmo, of which he is CEO.
Wiltschko has several papers listed on PubMed but only one concerns smell. Published in early 2021, it appears in a special issue of the journal iScience—an issue Wiltschko himself guest-edited. The topic? “Building an interdisciplinary team set on bringing the sense of smell to computers,” or more specifically, “Dr. Wiltschko’s team.” It’s written in the form of an interview, but since the piece is credited to Wiltschko, we must assume he wrote (or at least approved) this passage:
Each new sensory modality created a whole field of AI. So, if there was a way scientists could give computers the sense of smell, a whole new sensory modality, it would spawn a new field of science.
In other words, “Wiltschko Spawns New Field of Science, Says Wiltschko.”
Wiltschko is not shy when it comes to self-promotion. In the glorified press release that appeared in Perfumer & Flavorist, he says (italics mine):
“The digitization of the human senses of vision and hearing have led to incredibly impactful leaps in technology that enrich our lives, from better healthcare to photography to digital music,” said Wiltschko. “We’re excited to play a part in unlocking the potential for olfaction to change the world in fields like flavor [and] fragrance, medical diagnostics, agriculture, and beyond.”
“To tackle this difficult, historic problem, we’ve brought together a founding team of world-class neuroscientists, machine learning experts, psychophysicists, hardware and software engineers, data scientists, and chemists. I’m looking forward to the journey with them and others who join us along the way,” said Wiltschko.
That sorta sounds like WeWork’s Adam Neumann with an assist from Buzz Lightyear. All that’s missing is “we’re making the world a better place.”
According to the Wired story, Osmo has big goals:
With $60 million in an initial funding round led by New York-based Lux Capital and GV (Google Ventures), Osmo aims to create the next generation of aroma molecules for perfumes, shampoos, lotions, candles, and other everyday products.
But Osmo’s mission is more profound than that. Says Wiltschko:
“There’s a huge opportunity to build safe and sustainable and renewable ingredients that don’t require that we harvest life.”
This man is truly special. He’s the Mahatma Gandhi of fragrance chemistry.
He is also something of a prophet, someone who believes deeply in the future:
“I deeply believe in a future where, in much the same way that computers can see, they can hear, they can smell,” says Wiltschko, who is now exploring commercializing this technology.
Am I selectively quoting Wiltschko to make him sound like a brilliant bullshit artist? Well, listen to the man in his own words and judge for yourself.
[Embedded tweet from @TwoSigmaVC with a video clip of Wiltschko]
Here’s the transcript:
That’s really the critical discovery that we’re building on top of at Osmo, is the map of odor. And it’s a really critical discovery because every other sense has a map. RGB is a map of color. It’s three numbers that tell you what any color is and how to mix them. And without RGB you can’t make a camera. And for sound, we have low to high frequency, and with it you can describe any sound or even store and transmit any song. And without that map you can’t build a microphone. And so we view that what we’ve done as the first step—we’re not saying we’ve solved the problem, but we’ve definitely taken the first step to building this map of odor. And we believe that with it we can build things that aren’t able to capture the world of light, like a photograph, but can capture the world of scent, which we’ll call an osmograph. So we think this map of odor is that first step.
This is a lot of bullshit crammed into a 62-second clip. RGB is no more a map of color than your hat size is a map of your brain. Frequency may describe any sound, but there are big differences between an A440 played on a flute, a violin, and a mandolin.
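To belabor that last point with a few lines of numpy (the harmonic recipes below are ones I made up, not measurements of any actual flute or mandolin): two tones with the exact same 440 Hz fundamental are still very different signals.

```python
# Two "instruments" playing the same A440: identical fundamental frequency,
# different harmonic content, plainly different waveforms. Harmonic recipes
# are invented for illustration, not measured from real instruments.
import numpy as np

SR = 44100              # sample rate, Hz
t = np.arange(SR) / SR  # one second of samples
F0 = 440.0              # concert A

def tone(harmonic_amplitudes):
    """Sum of harmonics of F0 with the given relative amplitudes, normalized."""
    wave = sum(a * np.sin(2 * np.pi * F0 * (k + 1) * t)
               for k, a in enumerate(harmonic_amplitudes))
    return wave / np.max(np.abs(wave))

flute_like    = tone([1.0, 0.2, 0.05])           # nearly pure fundamental
mandolin_like = tone([1.0, 0.8, 0.6, 0.5, 0.3])  # bright, harmonic-rich

# Same "frequency," far-from-identical signals.
print(np.corrcoef(flute_like, mandolin_like)[0, 1])  # well below 1.0
```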
Odor AI is nothing more than an automated data sifter. As such it’s only as good as the giant database you pour into it, such as a 75-year-old USDA compendium of mosquito repellants (as Wiltschko’s group did in the second preprint). Can an odor AI generate insight into human odor perception? Read the group’s third preprint and see if you can find the insight.
Still, you have to admire a guy who snags $60 million in funding based on three preprints and a glorified press release in iScience.
Brian K. Lee, Emily J. Mayhew, Benjamin Sanchez-Lengeling, Jennifer N. Wei, Wesley W. Qian, Kelsie Little, Matthew Andres, Britney B. Nguyen, Theresa Moloy, Jane K. Parker, Richard C. Gerkin, Joel D. Mainland and Alexander B. Wiltschko. (2022). A principal odor map unifies diverse tasks in human olfactory perception. bioRxiv preprint posted September 3, 2022.
Alexander B. Wiltschko. (2021). Building an interdisciplinary team set on bringing the sense of smell to computers. iScience 24, 102136.