This Startup Wants to Get in Your Ears and Watch Your Brain

Born from Alphabet's “moonshot” division, NextSense aims to sell earbuds that can collect heaps of neural data—and uncover the mysteries of gray matter.
Can a teeny-tiny earbud open a window to the body's most mysterious organ? Jen Dwyer, NextSense's medical director, and her team aim to find out.

Photograph: Christie Hemm Klok

Konstantin Borodin is an ear nerd. He’s been looking into them, literally and professionally, for more than a decade. Even in social situations, he’ll find his attention drifting lobeward. “Sometimes I get weird looks,” he says.

I met Borodin when he measured my ears and their outer canals for custom-fit buds that can pick up my brain waves. To create a mold, you usually have to fill the ear with a warm waxy substance, but Borodin uses a device called the eFit Scanner, which maps the ear’s precise dimensions with a laser. About the size of an Oculus Quest, the scanner has twin eyepieces and a metal camera nozzle that looks like a long stinger.

I swab my ears with rubbing alcohol—to make them less shiny, he says—and he positions me on a stool. At his urging, I wedge my head into a brace. “It helps to stabilize things,” says Borodin, who is now swooping toward me, gripping the gadget with both hands. He tilts my head and zeroes in on my left ear. “Hold that position,” he says.

“How many of these have you done?” I ask him.

“Over 30,000,” he replies. Even after all those ears, Borodin marvels at them—that no two are the same, that the nose and ears are the only organs that grow as one ages. But what has brought us together for this fitting is another useful property of ears: They are in the perfect spot to eavesdrop on the brain.


After acting as the scanner-in-chief for the company that invented the eFit, Borodin is now the lead ear spelunker for NextSense, which was born at Google and spun out of Alphabet’s X division. The startup’s focus is brain health—improving sleep, helping patients with epilepsy, and eventually enriching the lives of people with a range of mental conditions. The idea is to use its earbuds to capture an electroencephalogram, a standard tool for assessing brain activity. Just as an ECG tracks the fibrillations of the heart, an EEG is used to diagnose anomalies in brain activity. While some smart watches—Apple, Samsung, Fitbit—offer versions of an ECG and aim to spy on your sleep, collecting neural data has mostly been a can’t-try-this-at-home activity. Until now.

Standard EEGs are “a mess,” says Arshia Khan, a neurologist at the University of Minnesota, Duluth, who has done studies of those devices. To use the expensive machine in her lab, she has to fix electrodes to a person’s scalp. (“It leaves indentations on the head for a few hours, and if you use gel, it’s hard to shampoo it out.”) The device only works in a clinical setting and isn’t suitable for long-term studies. A handful of off-the-shelf, consumer EEG headsets are portable, but look incredibly awkward. If earbuds could deliver good results, that would be “fantastic,” she says. And not just for scientists.

For years, people have been shifting from tracking their health through sporadic visits to a doctor or lab to regularly monitoring their vitals themselves. The NextSense team is gambling that, with a gadget as familiar as an earbud, people will follow the same path with their brains. Then, with legions of folks wearing the buds for hours, days, and weeks on end, the company’s scientists hope they’ll amass an incredible data trove, in which they’ll uncover the hidden patterns of mental health.

For now, that’s the stuff of dreams. What’s real is that one day in 2019, a patient tucked a bud into each ear, fell asleep, and proceeded to astound NextSense’s scientists—by churning out brain waves that showed exactly how this product could save a person's life.

Jonathan Berent is the CEO of NextSense. On a recent evening, the 48-year-old was talking like a podcast at 1.5 speed while we waited for our entrées on the patio of an Italian restaurant in Mountain View, California. The subject of his filibuster was how he’d gotten into brain health. His obsession wasn’t ears or wellness; it was sleep.

Raised by his single mother and a coterie of relatives in Seymour, Indiana—the small town John Mellencamp sings about—Berent says he struggled to fit in and often got into trouble at school. He threw himself into his hobbies, among them writing tiny games that could fit into Commodore 64 cartridges. As a teenager, he stumbled upon a book on lucid dreaming, a semi-conscious middle ground where dreamers have a measure of control over their visions. Written by the field’s top expert, Stephen LaBerge, the book got Berent hooked on the slumbering mind. “The physical laws don’t apply, and the social laws don’t apply,” Berent says of sleep. At 18, he made his first journal entry in what has become a lifelong project of documenting his dreams.

Berent managed to get into Stanford, where he started studying computer science, only to freeze on a final in an introductory course. He switched to philosophy, figuring he’d catch up with his fellow geeks in the workplace later on. A philosophy major isn’t a gateway to great Silicon Valley jobs, however. After some searching, Berent landed an entry-level position at Sun Microsystems, in a backwater division reviewing contracts.

By 2011, he’d made his way to Google, where he joined a sales team supporting AdWords (now called Google Ads). He was good at it. He managed a large team and worked out of an office that he decorated like a wellness retreat, with a yoga mat and a “wisdom library” of be-here-nowness. (“I don’t think incense was burning there, but in my memory it kind of is,” says one visitor.) All the while, he was experimenting with polyphasic sleep—going to bed around 10 pm, waking up three or four hours later, and taking 20-minute naps throughout the day.

It wasn’t long before he crossed paths with another sleep obsessive, in a different AdWords office. Joe Owens had a doctorate in neuroscience focusing on sleep and circadian rhythms. Their first conversation was a marathon Google Meet session. Berent detailed his adventures in sleep hacking: As a morning person, he explained, his naps effectively gave him multiple fresh starts in which to energetically read about neuroscience, consume novels, and practice the drums. Owens was impressed. “I’d never met anybody going this hard at sleep from a personal point of view,” he says. Both wound up guest-lecturing in a famous Stanford course on the science of sleep, and soon the two men were batting around ideas for products that might enhance sleep. LaBerge, the lucid dreaming expert, had become a mentor, and he shared with Berent a research paper in which playing a sound to slumberers increased the slow waves that correspond to deeper sleep. Berent thought a product built on that insight might allow people to rest more efficiently—compressing eight hours of shut-eye into six.

In April 2016, Google announced it was starting an incubator called Area 120, its artisanal take on Y Combinator. Berent and Owens applied and got rejected, but they were pointed to X, Alphabet’s “moonshot” division, which takes on riskier, longer-term projects than Area 120. X picked up the project to supercharge sleep, and Owens started running it full-time. Berent stayed in the ads division but devoted some of his time to the project.

One of their first efforts was to launch a study with Phyllis Zee, a well-known neurologist at Northwestern University. They committed $500,000 to the experiment, in which they tried sending audio signals to earphone-wearing subjects to boost the slow waves of deeper sleep. That’s when they hit their first snag: Some participants responded as they’d hoped but others not at all, and they couldn’t figure out why.

Thinking again about the earphones from their sleep study, Berent wondered if he might be better off trying to collect brain data from the ear. That would help him observe not just sleep but perhaps everything happening inside our heads. He discovered that a Georgia Tech professor—who happened to have been the technical lead and manager of Google Glass—was working along those lines. The researcher put him in touch with United Sciences, where Konstantin Borodin was doing laser-guided earbud fittings. That company had tried to build a system to perform EEGs through the ear. It had even launched a Kickstarter campaign. But the product never shipped, and the company abandoned the effort.

Berent got in touch and arranged to get fitted for the device himself. Naturally, he tried to test it in his sleep, even though the ear molds were made of an uncomfortable hard plastic. To his delight, he was able to get some measurable brain data. Berent quickly made a deal with the company. Now the ad exec turned brain hacker was on the hook to somehow make it work.

Jonathan Berent, the CEO, set out to help people hack their sleep—and ended up with a way to collect unprecedented amounts of brain data.

Photograph: Christie Hemm Klok

An EEG is a finicky thing. In a gold-standard setup, a person’s scalp is covered in many electrodes smeared with a gooey gel to cut down on electrical noise. Once pasted to a person’s head, the electrodes can detect when huge cohorts of neurons fire together, producing signals in different frequency bands. That’s how an EEG can reveal roughly what the brain is up to—various frequencies correlate with stages of sleep, rest, or intense focus. It wasn’t obvious that Berent could do all that with only two electrodes (and no conductive goo). So he flew out to Atlanta to get some expert opinions.
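For the technically curious: NextSense hasn’t published its signal-processing pipeline, but the first step in virtually any EEG analysis is the same: estimate the signal’s power spectrum, then total up the power inside each conventional frequency band. Here is a minimal sketch in Python on synthetic data; the band edges, sampling rate, and function names are textbook conventions and illustrative assumptions, not NextSense’s.

```python
import numpy as np
from scipy.signal import welch

# Conventional EEG bands in hertz; exact boundaries vary by lab.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 12), "beta": (12, 30)}

def band_powers(eeg, fs):
    """Estimate the power in each band for one channel of EEG.

    eeg: array of voltage samples; fs: sampling rate in Hz.
    """
    # Welch's method averages spectra over overlapping windows,
    # giving a smoother estimate than a single FFT would.
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))
    df = freqs[1] - freqs[0]
    return {
        name: psd[(freqs >= lo) & (freqs < hi)].sum() * df  # integrate PSD over the band
        for name, (lo, hi) in BANDS.items()
    }

# Demo: a synthetic 10 Hz "alpha" rhythm buried in noise.
fs = 250  # a typical EEG sampling rate
t = np.arange(0, 30, 1 / fs)
eeg = 20 * np.sin(2 * np.pi * 10 * t) + 10 * np.random.randn(t.size)
print(band_powers(eeg, fs))  # alpha power should dominate
```

Roughly speaking, slow delta waves dominate deep sleep, alpha rises during relaxed rest, and faster beta activity accompanies alert focus; that mapping is how a chart of band powers can hint at what the brain is up to.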

Along with a team from United Sciences, Berent and a small group of renowned neurologists crowded into a tiny examination room at the Brain Health Center at Emory University. The head of the center, Allan Levey, was excited at the prospect of ear EEGs. “We know about our blood pressure, cholesterol, and respiratory system,” Levey says. “But the most important organ is our brain. We don’t assess that systematically.” He figured patients could get better care if they were also tracking the electrical activity inside the skull.

Levey had lured some colleagues to get fitted for earbuds with him; one professor had literally written the textbook on EEGs. But some of the scientists were skeptical. They weren’t convinced that the tiny sensors in earbuds could pick up the relatively weak electrical brain signals. If they could, though, the payoff could be huge, allowing for persistent and portable measurements. “The problem was squeezing in all the electronics that would make it work,” says Dan Winkel, an epilepsy researcher who participated in the demo.

The Emory scientists inserted their custom buds, closed their eyes … and thought. Then they turned to a computer monitor to see what data the buds had captured. “All of a sudden, the line begins to travel across the screen,” Winkel recalls—just as it would with a normal EEG setup. “I was pretty shocked, as were most of the people in the room.”

Levey told Berent that if he could eventually match the quality of a true EEG, he’d be on to something—a sort of Apple Watch for the brain. But, he added, the earbuds could be immediately put to use on an important problem: monitoring epilepsy.

In refining the earbuds, NextSense's team had to figure out how to amplify useful signals and cope with noise.

Photograph: Christie Hemm Klok

There is no easy, noninvasive way to observe a seizure, yet doing so is a critical step in treatment, both to assess the efficacy of drugs and to predict when the next seizure might strike. A patient might spend up to a week under observation at a hospital or get electrodes surgically implanted in the brain. The latter approach is expensive and painful. But by studying individuals who have undergone it, scientists have identified patterns of brain activity that seem to predict an impending seizure. With that kind of weather forecast for the brain, patients can better plan their lives, choosing not to get behind the wheel or climb tall ladders.

Berent left Atlanta feeling optimistic. A few months later, he decided to take a three-month transfer—a bungee, in Google parlance—to work full-time for X. But just as he arrived, the sleep project got axed.

Owens quickly moved to another team. Berent, however, had to scramble to remain at X, quickly picking up the pieces of his project and making a new case for himself. In February 2018, he met with one of X’s top moonshotters, John “Ivo” Stivoric, to see if he could salvage his dream of ear EEGs. But Stivoric was more interested in a brain device that could control a computer. Such a project would fit into an existing X initiative called Intent OS, which was exploring the future of how humans and computers might interact. Perhaps the earbuds could reveal what a person was focusing on, or provide other data useful for controlling a computer or augmented reality display. Berent was game, and the new project was dubbed Heimdallr, after the Norse god who used his keen eyesight and hearing to watch for invaders. His teammates started conducting an experiment on how they might use the earbuds to refocus a person’s attention. It involved streaming two audiobooks simultaneously, one in each ear.

Berent, however, was still obsessing over the idea of replicating medical-quality EEGs. He and his team had to figure out how to amplify more distant signals to make up for the fact that they had only two electrodes. The United Sciences prototype wasn’t quite up to snuff; it couldn’t pick up alpha waves, the oscillations that are most prominent when a person is awake but relaxed with eyes closed. The team also had to miniaturize the electronics of a traditional EEG to fit inside the two buds.

Berent felt that with Google’s knowledge, equipment, and talent, these tasks were possible. He also had on hand 5,000 ear scans from United Sciences, which revealed that it was critical to create a tight seal—to filter out electrical noise that could drown out the brain signals. He had to improve on United Sciences’ hard plastic molds. While casting about, Berent discovered a product called Tecticoat, a super-pliable, conductive coating. When he put it on the buds, the brain waves they collected suddenly became far sharper, and the earbuds far more comfortable. (Berent eventually acquired the intellectual property related to the polymer.)

Impatient with the rate of progress, Berent one day grabbed a lead from a $50,000 portable EEG machine, smeared some gel on it, and jammed it into his ear. To his relief, the electrode registered alpha waves—now he just had to make the same thing happen with his buds. A more definitive clinical test came months later, when a Heimdallr prototype performed roughly on par with an EEG.

Stivoric, who’d been skeptical of Berent’s obsession, was impressed. “One of the worst sensors in the world is an EEG sensor—there’s environmental noise, surface noise, motion of the body, and so forth,” Stivoric says. “I thought, OK, it shouldn’t work. But it does work. These signals are showing up. How is this even possible?”

On October 18, 2019, Berent took a meeting with Google’s chief economist to discuss the privacy implications of reading people’s brain waves. A few minutes in, Berent began feeling poorly. He looked at his Apple Watch, which informed him that he could be in atrial fibrillation. Berent went to the hospital for tests, and a few days later he underwent a cardiac version of a reboot, in which his heart was stopped and restarted. The experience made Berent view his work differently. To hell with Intent OS—he now realized that all he wanted was to build a device that could do for his brain what his watch had done for his heart.

On November 8, 2019, Jen Dwyer was working at her desk on the third floor of Moonshot Central, inside a converted shopping mall. Dwyer, who is the team's medical director, holds a doctorate in computational neuroscience and a medical degree, and she joined Berent's project because of a deep-seated interest in sleep and epilepsy. “I just got really fascinated with the electrophysiological waveform,” she says, calling it “mesmerizing and beautiful.”

She opened up a file of patient data from an earbud study she’d set up at Emory, under Winkel’s supervision. As one person’s brain waves marched across her screen, a pattern caught her eye. At first the lines on the chart were neatly spaced out and undulating. “Then, all of a sudden—boom,” she says. The lines started to jump wildly, as if the calm waters of the EEG had surged into an angry sea. It was the signature of a seizure—the first time ear monitoring had detected one. The subject, who had been sleeping, probably never knew anything had happened. But both the earbuds and the implanted electrodes confirmed the event. “We all gave each other high-fives,” says Berent. “This was what we really needed.” As the study progressed, the earbuds would log more of them, picking up 16 of the 17 seizures detected by the electrodes.
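What Dwyer saw is the kind of change that even simple seizure detectors are built to catch. One classic feature from the epilepsy-monitoring literature is “line length”: the summed absolute jump between consecutive samples, which stays small while the EEG undulates calmly and explodes when it starts thrashing. Here is a toy Python illustration of that idea; NextSense’s actual algorithm is unpublished, and real systems combine many features with per-patient tuning.

```python
import numpy as np

def flag_windows(eeg, fs, win_s=2.0, k=5.0):
    """Toy seizure flagger: mark windows whose line length
    (summed absolute sample-to-sample change) exceeds k times
    the recording's median. Illustrative only, not a clinical tool.
    """
    win = int(win_s * fs)
    n_windows = len(eeg) // win
    # Calm EEG undulates gently; a seizure's fast, high-amplitude
    # swings drive the line length sharply upward.
    line_len = np.array([
        np.abs(np.diff(eeg[i * win:(i + 1) * win])).sum()
        for i in range(n_windows)
    ])
    flagged = np.where(line_len > k * np.median(line_len))[0]
    return flagged * win_s  # window start times, in seconds
```

The threshold k trades sensitivity for false alarms; tuning that trade-off for each patient is the hard part.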

Jen Dwyer, the company's medical director, is conducting studies with scientists at Emory to prove out the earbuds.

Photograph: Christie Hemm Klok

But Heimdallr was in trouble. It was still an awkward fit at X. In June 2020, Berent learned that X would stop funding the project. So he spun out an independent company. He worked out a deal where X got a stake in the new firm in exchange for the intellectual property. Five people made the jump from X to the startup, including its medical director. The team hired a new head of product who had worked on the Apple Watch. Now called NextSense and touting itself as a platform for brain-health monitoring, the company got $5.3 million in funding.

In the months since, NextSense has struck up partnerships with universities and drug companies to explore the medical uses of its earbuds. A multinational pharmaceutical firm called Otsuka hopes to use NextSense’s earbuds to assess the efficacy of medication, not only for epilepsy but for depression and other mental health issues. NextSense plans to submit its device for FDA approval this year, and Emory is conducting more studies in hopes of developing an algorithm to predict seizures, ideally hours or days in advance. (The Emory doctors are now consultants for NextSense, and have some equity in the company.)

But while the immediate uses of NextSense’s earbuds are medical, Berent hopes to eventually build a mass-market brain monitor that, if enough people start using it, can generate enormous quantities of day-to-day brain performance data. The catch, of course, is that since no one has ever done that, it’s not yet obvious what most people would get out of the information. That’s also what’s exciting. “We don’t necessarily know what we would learn because we’ve never had access to that type of data,” says Emory’s Winkel.

Berent and his team envision a multipurpose device that can stream music and phone calls like AirPods; boost local sound like a hearing aid; and monitor your brain to provide a window into your moods, attention, sleep patterns, and periods of depression. He also hopes to zero in on a few sizes that would fit a vast majority of people, to dispense with all the ear-scanning.

Far along on the NextSense road map is something unproven, and kind of wild. If artificial intelligence can decode tons of brain data, the next step would be to change those patterns—perhaps by doing something as simple as playing a well-timed sound. “It’s almost a transformative moment in history,” says Gert Cauwenberghs, a bioengineer at UC San Diego, who licensed some of his own ear-EEG technology to NextSense. Like Berent, he is fascinated by the prospect of using audio to nudge someone into a deeper sleep state. “It’s so convenient, it doesn’t bother you,” he says. “People are wearing stuff in the ear typically anyway, right?” Yeah, but not to screw around with their brain waves.

NextSense's Richa Gujarati, Jonathan Berent, Stephanie Martin, and Jen Dwyer hope that patients will use the earbuds to keep an eye on their health and treatment.

Photograph: Christie Hemm Klok

Ten days after my scanning appointment, Berent introduces me to my custom set of earbuds. We are in NextSense’s Mountain View office, which consists of two cluttered rooms in a shared suite on the building’s first floor. I tuck the buds into my ears and find they fit perfectly—unlike my AirPods—and are much more comfortable than the molded hard-plastic hearing aid I sometimes wear.

Berent pulls out an Android phone and fires up NextSense’s app. It takes data from the buds and displays it on a number of charts and graphs—kind of like the display you see in a hospital room, the one where you hope that none of the lines goes flat. On the screen, I get an instant look at my brain waves, a thick, spiky green line on a chart logging their amplitude. He taps to pull up different views and to flip between the two buds. “That looks like a typical EEG,” Berent says, maybe as much to reassure me that I’m normal as to assert that his product is capturing brain waves.

Another exercise had me alternate between a semi-meditative state and alertness. In my alert stage, I sat on a small orange couch—Ikea, maybe?—and looked around the room, taking in the busy desktops and a low bookshelf jammed with self-help volumes, medical texts, and coding manuals. On top of the unit sat a turntable, two small speakers, and a life-size model of an ear; the cover of a Prince vinyl album leaned against the wall. Another wall was a giant whiteboard scrawled with equations and data readings. I soon learned that moving my head to take in this scene had messed up my readings. Apparently these prototypes still have some bugs to work out.

But the most interesting test, and certainly the one that excited Berent most, involved napping. He’s still obsessed with sleep, and his company has an ongoing study on it at Emory. “We’re really able to see clear changes between sleep stages,” says Dwyer, the medical director. If the earbuds can prove themselves as snooze detectors, patients who ordinarily get dispatched to a sleep clinic might be spared the trip, says Richa Gujarati, NextSense’s head of product and strategy. With earbuds, she says, “you can send patients home for a diagnosis.”

I, however, was to nap on a small couch in the office. Berent retreated to his Jeep to do the same. I scrunched myself into a semi-fetal position and willed myself into the Land of Nod. It felt like it took half my allotted 20 minutes to doze off, but when my watch alarm started to chirp, I had definitely been out. Berent popped back into the room and congratulated me on my dozing. After uploading the data, we sat in front of his computer and watched as several graphs popped up. I could see the splotchy color fields of my spectrogram darkening around minute five or six, as sleep set in. Berent’s nap had taken a similar trajectory. But as he is a polyphasic maestro of the daytime nap, the last few minutes of his slumber produced a waveform signature that was almost a solid block of burnt orange. “It looks like I’m dead here,” he says. For comparison, Berent uploaded data from the Oura ring he wears, a sleep tracker. It hadn’t registered the nap.
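Those splotchy color fields are a spectrogram: the recording is sliced into short windows, each window’s power spectrum is computed, and the spectra are stacked left to right over time, so deepening sleep shows up as power pooling in the low-frequency bands. Here is a sketch of how such a plot is typically produced; the sampling rate is an assumption, and random noise stands in for the real recording rather than NextSense’s data.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import spectrogram

fs = 250  # assumed sampling rate, in Hz
# Placeholder: random noise standing in for a 20-minute in-ear EEG trace.
eeg = np.random.randn(int(20 * 60 * fs))

# One spectrum per 4-second slice of signal, stacked over time.
freqs, times, power = spectrogram(eeg, fs=fs, nperseg=int(fs * 4))
plt.pcolormesh(times / 60, freqs, 10 * np.log10(power), shading="auto")
plt.ylim(0, 30)  # the action in sleep EEG sits below roughly 30 Hz
plt.xlabel("Minutes")
plt.ylabel("Frequency (Hz)")
plt.colorbar(label="Power (dB)")
plt.show()
```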

Of course, gazing at a chart of vibrant blotches wasn’t going to help me fortify my winks. That’s part of what NextSense is promising to one day deliver. But being able to so casually see what my brain was up to felt like a revelation. Just as some of us obsessively monitor our pulses and oxygen levels, we might regularly check our brain waves just to see what they’re up to. If enough of us do it, we may even figure out what they mean.

