
The Privacy Blog

Neurotechnology: Magic Heaven or Privacy Hell?

Everyone experiences magic in their lives. Small children experience magic frequently, using their guileless wonder to discover the world around them.


Most adults have suppressed their sense of wonder. Preoccupied with grown-up concerns, they sometimes feel the need for some magic and seek it out. And what better place to find magic than in the middle of a desert where a thriving town shouldn't exist?


Through a strange confluence of circumstances, the desert town of Las Vegas has all the water and electricity it needs. It sustains a super-size tourist population of over-indulging adults in its many energy-gobbling hotels and casinos. It must be magic.


In 2020, I sought out my magic at the Consumer Electronics Show (CES) in Las Vegas. CES is the Rolls Royce of consumer electronics events worldwide. Attended that year by 250,000 people, the show took up the Las Vegas Convention Center and three hotels.


With the excitement of children on Christmas morning, geeks swarm from all corners of the globe to attend this event. They are a sympathetic audience that loves consumer electronics, and the displays do not disappoint. The exhibit halls deliver multisensory video and audio assaults using the largest video displays you have ever seen.


I wanted to believe that CES would be heavenly and magical. And in some ways, it was. I got to see the latest technology before it came to market. There were companion robots that looked like dogs and cats. There were robots for kids to play with and learn about becoming inventors. There was a ping pong-playing robot. There was a GPS-enabled combine harvester that operated autonomously.


But I started seeing cracks in the magic façade. The first thing that tipped me off was the staircase. Proudly displayed on the riser of each step, an ad said: "SHOP WITH YOUR DNA Booth 43742". At first, I thought this was outrageous. Upon reflection, I decided it was wrong and evil. Why should anyone want to share their DNA with a stranger? What kind of a tchotchke could tempt someone to give away her DNA? What's wrong with these people? Haven't they ever heard of privacy? Strike one in maintaining the illusion.


Scientists and engineers love the challenge of building things. They sometimes forget to think about whether these things should be created and unleashed on the world. For example, the atomic scientists had second thoughts after their bombs were dropped on Hiroshima and Nagasaki. At CES, just because they had the technology to take DNA samples and amass the data doesn't mean they should. Unsuspecting donors would end up donating their most personal data while the vendor commoditized it.


The next thing was the headsets. Many vendors displayed wearable headsets that could sense neural activity in the brain or provide virtual/augmented reality experiences. The applications included many medical uses, like detecting and treating psychological conditions such as depression and addiction. Learning specialists could use them in clinical settings to determine if a child had special learning needs. A teacher could employ them in a classroom to determine the level of engagement of each child. Wait, what?


Neural activity is also biometric data. HIPAA protects the data collected from such headsets when they are used in a medical or clinical setting. In other settings, however, like a classroom, the data has neither privacy nor security protections. Once again, the technologists didn't stop to think about whether marketing this product outside the medical community posed any risks, like violating individual privacy and inviting prosecution for it.


Neuro and other biometric technologies hold out promise for alleviating people's suffering and improving their lives. But as new technologies, they're still being developed and refined. Many have no track record of performance. They rely on sophisticated algorithms that use artificial intelligence (AI) and machine learning (ML), so they inherit all the liabilities of AI and ML. For example, the technologists have to train the algorithms using real data, and the training design needs to be grounded in the best practices of the scientific method. Where do we get the data? Can old data be reused? Are the identities of the individuals who provided the data protected? Does the data need to come from a person who resembles the patient in specific ways? Will the system perform better if one trains the device on the actual patient's data? The point is that the technologists won't know until enough time has elapsed for an established performance record to exist. There is reason to be optimistic, but these nascent technologies are not yet the promised panaceas.
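To make the identity-protection question concrete, here is a minimal sketch (the names, records, and field layout are invented for illustration) of one common precaution: replacing donor identifiers with salted one-way hashes before the records ever enter a training set. Note that this alone is not full anonymization, since neural signals themselves can be identifying, but it keeps raw identities out of the data that trains the model.

```python
import hashlib
import os

def pseudonymize(subject_id: str, salt: bytes) -> str:
    """Replace a real identifier with a salted one-way hash so a training
    record can't be traced back to its donor without the secret salt."""
    return hashlib.sha256(salt + subject_id.encode()).hexdigest()[:16]

# Hypothetical raw headset-session records keyed by real subject IDs.
raw_records = [
    {"subject_id": "jane.doe@example.com", "signal": [0.12, 0.31, 0.08]},
    {"subject_id": "john.roe@example.com", "signal": [0.22, 0.05, 0.19]},
]

# The salt is generated once and stored separately from the training data.
salt = os.urandom(16)

training_set = [
    {"subject": pseudonymize(r["subject_id"], salt), "signal": r["signal"]}
    for r in raw_records
]

# No raw identifier survives into the training set.
assert all("@" not in r["subject"] for r in training_set)
```

The design choice here is deliberate: a salted hash is deterministic within one study (so repeat sessions from the same donor still link together) yet useless to anyone who obtains the training data without the salt.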


Humans often take the shortcut of seeing things as black and white. They want to believe that technology just solves problems without introducing others. In truth, the issues are not black and white but represent the yin and yang of complementary and competing forces. Many technological capabilities come at the expense of privacy. We should jealously protect all of our private information, especially our biometric data, as precious possessions.
