The Tyson Popplestone Show

Nita Farahany is a renowned professor at Duke University, specializing in the ethical, legal, and social implications of emerging technologies. Her latest book, "The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology," explores the transformative potential and profound ethical concerns of neurotechnologies like brain-computer interfaces and neural implants. Farahany emphasizes the need for robust protections to safeguard mental privacy and cognitive liberty in the face of these advancements.

EPISODE OUTLINE:

00:00 Introduction and Overview of Neurotechnology
03:42 The Progression of Neurotechnology and Brain Sensors
06:35 Neurotechnology in the Workplace
09:29 Productivity and Brainwaves
14:15 The Double-Edged Sword of Neurotechnology
25:12 Balancing Benefits and Intrusion in Neurotechnology
29:32 The Future of Neurotechnology and Brain Sensors
30:38 The Need for Regulation and Cognitive Liberty
34:38 The Ickiness of the Surveillance State
42:32 Protecting Self-Determination, Mental Privacy, and Freedom of Thought
49:31 Legislative Efforts and International Discussions
52:57 Bipartisan Support and Public Awareness

TRANSCRIPT:
https://share.transistor.fm/s/4697dc29/transcript.txt

PODCAST INFO:

YouTube: https://www.youtube.com/channel/UCdpxjDVYNfJuth9Oo4z2iGQ
Apple Podcasts: https://podcasts.apple.com/au/podcast/pop-culture/id1584438354
Spotify: https://open.spotify.com/show/2gWvUUYFwFvzHUnMdlmTaI
RSS: https://feeds.transistor.fm/popculture

SOCIALS:
- Instagram: https://www.instagram.com/tysonpopplestone/
- YouTube: https://www.youtube.com/@tysonpopplestone9467

What is The Tyson Popplestone Show?

Tyson Popplestone is a comedian from Melbourne, Australia. Join him for a brand new interview each week.

Tyson (00:00.798)
If for whatever reason we lose connection, just hit that link that I send you again and I'll just be back here. I mean, it's never happened, but I just like to give you that heads up in case you think I've bailed on you.

Nita Farahany (00:09.102)
Yeah, there. Good. Good. And I'm on a hard line, so we'll be all right.

Tyson (00:14.942)
Yeah, awesome. Awesome. Hey, it's really funny actually to sit down and start a conversation with someone in the scene that you're currently in, specifically in reference to the brain book. Because I'm the kind of guy... I'm a distance runner with a real appreciation for what a Garmin can do. I'm still pretty excited about the fact that a Garmin can tell me pace and distance, and I have a strap around my chest that gives me some idea of what my heart rate is. And I say that to tell you

in advance that I'm well and truly behind what is taking place in the world of technology. And to start flicking through the pages of your book and hear how far down the road we are with neurotechnology, it absolutely blows my mind. And I thought maybe, as a way of planting a bit of a foundation for this conversation, it might be good just to give us a bit of an overview of what it is that you're actually doing and where it is that we're actually at in terms of

the progression of this neurotechnology, because it seems as though it's like a weekly update. The speed's gone pretty quick.

Nita Farahany (01:19.182)
Yeah, it's going really fast. So, you know, to the extent that most people have heard of neurotechnology, it's more, you know, having heard about what's happening with Elon Musk and his, you know, implanted patient. And so that's what most people envision. They envision, you know, cutting a hole in your skull and putting electrodes deep into your brain and that the people who are using neurotechnology have lost their capacity to walk or talk or something like that, and that this might give them the hope of being able to overcome that.

And that's also neurotechnology, but that's not the kind of neurotechnology that I really focus on in the book, even though I do cover some of those aspects about what's coming. What I really focus on is neurotechnology for the rest of us. So you're an avid runner. It sounds like you're quite used to having sensors embedded in different products that you're wearing. And that's where neurotechnology is going, which is embedding sensors, which are brain sensors, into

earbuds, headphones, watches and other devices that are multifunctional devices. And so joining that kind of wearable industry, where instead of picking up what is your heart rate, it's picking up the electrical activity in your body. And there are different ways to pick up brain signals, but the predominant one that's really on the market right now picks up electrical activity. So what does that mean? It means,

you know, as you think, as you relax, as your mind wanders, as you're tired, neurons are firing in your brain that give off tiny electrical discharges. And when you have a dominant mental state, hundreds of thousands of neurons are firing concurrently. And the sum of those different electrical discharges shows up in what are called brainwaves, particular patterns that can be detected by these sensors. And then, thanks to the crazy rapid advances in AI, what

could in the past have been undecipherable signals can now, like any other massive data set, start to be decoded by AI. And so what's happening now is there are a lot of products that are already on the market. And what's coming in the next year or so is all of the major tech companies, like Meta and Apple, integrating those sensors into the devices that we've come to know and love, which suddenly are going to have, you know,

Nita Farahany (03:38.606)
brain sensors that can detect a lot more than they used to be able to detect.
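
To make the "brainwaves" idea concrete: a minimal sketch, in Python, of how software might turn raw EEG samples into power in the named frequency bands. The band ranges are standard textbook values; everything else here (the function name, the synthetic signal, the 256 Hz sampling rate) is an illustrative assumption, not any particular product's pipeline.

```python
import numpy as np

# Canonical EEG frequency bands in Hz. Band edges vary a little
# across the literature; these are common textbook ranges.
BANDS = {
    "delta": (0.5, 4.0),
    "theta": (4.0, 8.0),
    "alpha": (8.0, 13.0),
    "beta": (13.0, 30.0),
    "gamma": (30.0, 45.0),
}

def band_powers(signal, fs):
    """Average spectral power per band for one EEG channel.

    signal: 1-D array of raw samples (e.g. microvolts)
    fs:     sampling rate in Hz (consumer headsets are often ~256 Hz)
    """
    spectrum = np.abs(np.fft.rfft(signal)) ** 2        # power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)   # matching frequency axis
    return {
        name: spectrum[(freqs >= lo) & (freqs < hi)].mean()
        for name, (lo, hi) in BANDS.items()
    }

# Toy check: 10 seconds of synthetic "EEG" dominated by a 10 Hz alpha rhythm.
fs = 256
t = np.arange(0, 10, 1.0 / fs)
fake_eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(len(t))
print(band_powers(fake_eeg, fs))  # the alpha entry should dominate
```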

Tyson (03:42.174)
Yeah. And until fairly recently, I heard you say, you had to have something implanted below the actual skull in order to be able to pick up the brainwaves, well, with enough specificity for it to be relevant. Like, with all of this world of AI and technology, things are transforming pretty quickly. When was it that that was the case? Like how, how fast have things progressed in regards to, you know, how the...

Nita Farahany (04:04.686)
Well, I mean, it's still the case that you can pick up much more information with more electrodes deeply embedded in the brain. So to be clear, like what is wearable neurotechnology and what is implanted neurotechnology have very different use cases because lots and lots of electrodes deep in the brain, you can both get greater specificity and greater signal deeper in the brain.

But wearable sensors and dry sensors, so to the extent that anybody's heard of EEG in the past, they probably are thinking of like a big weird looking medical cap that has lots of wires that are coming out of it. Now just envision like your earbuds where there are invisible sensors inside and maybe it's one or maybe it's many that are inside your ear that can pick up brain activity. It's still far fewer sensors than what's deep in your brain. And so

the specificity with which it can detect a brain signal is lower than what you can detect deep in the brain. But thanks to advances in AI, it's already possible to not just tell if a person is happy or sad, if their mind is wandering or they're paying attention, but much, much more. I mean, increasingly, it's possible to do things like type by thinking about doing so and having those neural signatures picked up, with your intention of sending brain signals down your arm to your wrist,

and your intention to type or to swipe or to move. And so what a lot of these companies are really applying this technology to is: where you use a mouse and a keyboard, increasingly you can just use brain sensors and think about interacting with your technology, and then have that be a much more seamless interaction. So it's not like full-scale mind reading of everything that you're thinking, where there are little thought bubbles above your head, because those are still more complex signals. We're not there yet.

But if anybody's watching what's happening with AI, what you say today may not be true tomorrow. And so how much you can decode and how much you can detect, it's like exponential changes on a daily basis.

Tyson (06:05.758)
It's so wild. I mean, the idea that it might be hard to pick up accurately exactly what the brainwaves represent is really interesting. And it's something that I was sort of going to jump into with you soon, specifically in regards to what this might look like from a corporate sense a little way down the road. In terms of how effectively you are using your time, what kind of a vibe or what kind of a gauge can we get at the moment, based on brainwaves, as to what those brainwaves represent in terms of productivity?

Nita Farahany (06:35.886)
You don't have to go a little ways down the line. It's already happening; it just isn't happening at full scale. So already, in workplaces around the world, there are companies... there's actually an Australian-based company called SmartCap that's been selling a technology to companies for more than a decade that has what they call their LifeBand embedded in it. And the LifeBand has EEG sensors, electroencephalography sensors.

And they sell it to mining companies and trucking companies and aviation companies, for their employees to wear the LifeBand inside their hard hat or their baseball cap or something like that, which picks up, in this instance, just fatigue levels. So, you know, you can imagine if you have somebody who's in a mining situation, you want to know if, you know, they're being exposed down below in ways that put them at risk, or if you have a commercial

driver, you want to know if they're falling asleep at the wheel. And what this allows is a score from one to five: the algorithm reads the brainwave activity and can then report back, like, this is where the person is on a sleepiness scale. Other companies have been using products that are these, like, headsets that track whether an employee is paying attention or their mind is wandering, or if they are bored or engaged at work. In fact,

I was presenting on your brain at work at the World Economic Forum in Davos, on the ways in which lots of companies have started to integrate neurotechnology into the workplace as a form of basically icky surveillance of workers, I think, right? I mean, there's already so much surveillance of workers; this is next level. And one of the CEOs of a company came up to me afterwards to talk about how their company would be a great use case for the examples that I'm giving, because they'd already

Nita Farahany (08:29.294)
used it on thousands of their employees across Asia to track everything from their boredom, their engagement, their distraction, the extent to which they're paying attention at work. And then they were making decisions about how to design the workplace, or whether people could work from home or work in the office, based on what their brain data reveals. So that's already happening. And then there have been reports for years about how factory workers in China or train conductors in China are required to

wear helmets that have EEG sensors that track their brain activity at work. The second chapter of my book is called Your Brain at Work, and I do a really deep dive into a lot of what's already happening in workplaces and how that could evolve over time. And it's interesting, because I think that chapter has struck more of a nerve with people than any other chapter in the book. I think it's so relatable for all of us to imagine suddenly, like, what does it look like to have our employers have access to

our brain activity?
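
As an aside on that one-to-five sleepiness score: the scoring inside products like SmartCap's is proprietary, so purely as an illustration of the kind of mapping being described (band powers in, a coarse drowsiness score out), here is a toy sketch. Rising theta and alpha power relative to beta is a commonly cited drowsiness marker in sleep research, but the thresholds and function name below are invented, not the vendor's algorithm.

```python
def fatigue_score(theta, alpha, beta):
    """Map band powers to an illustrative 1-5 drowsiness score.

    Rising theta/alpha power relative to beta is a common drowsiness
    marker; the cut points below are invented for illustration only.
    """
    ratio = (theta + alpha) / max(beta, 1e-9)      # guard against divide-by-zero
    cut_points = [1.0, 2.0, 3.5, 5.0]              # hypothetical thresholds
    return 1 + sum(ratio > c for c in cut_points)  # 1 = alert ... 5 = nearly asleep

print(fatigue_score(theta=2.0, alpha=1.5, beta=4.0))  # low ratio  -> 1
print(fatigue_score(theta=6.0, alpha=5.0, beta=2.0))  # high ratio -> 5
```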

Tyson (09:29.854)
Yeah, I don't even really like my employer seeing my Google search history.

Nita Farahany (09:34.926)
Yeah, I don't either, right? I mean, like, the amount of surveillance that's happening... you know, I hate that Microsoft Office has a panel that shows, like, how much time did you spend on email and how much time did you spend on Microsoft Word and each of the different documents that you have up. That already feels incredibly intrusive to me, of how I'm spending my time. If you can go a level deeper and say, and, like,

What was the extent to which my brain was engaged in different activities or mind wandering or starting to get metrics? Or what about cognitive decline over time and being able to track your brain activity sequentially over time? What are employers likely to do with that kind of information? Or if I'm depressed or feeling emotionally distraught, is any of that information I'd want my employer to have access to? No, right?

Tyson (10:24.19)
Yeah. And how confident are we that we actually know what the most effective use is of, say, for the example of an eight-hour workday... like, how do we know what state we should be in for a certain amount of time? I do stand-up comedy here in Australia. And one of the things that I've noticed when I'm writing jokes is there's not necessarily a correlation between how active I feel, in terms of being on a computer typing, and the quality of a joke. Sometimes my best jokes are written when I'm sort of just sitting there and just going, like...

And I mean, like, surely that doesn't give that much away about, yeah, brainwaves, just looking at that. But I mean, to look at that physical posture, you go, that guy's not achieving much with his time, when the opposite is actually quite true.

Nita Farahany (10:55.886)
Yeah.

Nita Farahany (11:02.382)
No, I mean, so.

You're totally right. And that's a good insight, which is, you know, that facial expression you just did, I'm going to treat that as, like, your mind is wandering or you're daydreaming or something, right. And it turns out most creativity and strokes of insight and big ideas don't come from times that we are laser-focused on paying attention. It's when we kind of sit back and give our brain a break that it's able to integrate across different ideas. And, you know, it's that kind of free association which is

part of what enables us to have these strokes of insight and creativity. So I don't think we have a very good idea at all about what the thing is that we should be measuring, if we're measuring anything in the workplace. And so, you know, if employers start using how much time you are focused as a metric of success and of, like, employee performance, that's not necessarily going to translate into a better end product for a company. Now, it might be the case

that if you're tracking fatigue levels, it does. And I don't mean that for like the white collar worker or the, you know, person who's doing comedy or anybody else. I mean more, if somebody's driving behind the wheel, then you're going to want to know if they're awake or if they're tired because there is a direct correlation between fatigue levels and risk, right? Decreased motor coordination, ability to pay attention to the road and safety.

So that's one metric where I'd say maybe it's okay, and especially if it's narrowly constrained, if the only thing they're picking up is fatigue levels and no other data, that might actually be a good use of it. But for the rest of it, not only do I think it's not a good metric of productivity to measure the extent to which a person is paying attention versus their mind is wandering, it's likely to be counterproductive because it undermines morale, it makes people far...

Nita Farahany (12:56.142)
you know, more uneasy, it decreases trust levels between employers and employees. And those are good metrics of a successful workplace and work environment. So it's like measuring the wrong thing, and counterproductive, and likely to backfire against the employers, I think, who actually implement it. You know, is it bad for you and me, though? Like, if we were sitting down and we were like, okay, you know what, we're in the digital era,

notifications pop up all the time and distract us, it's hard for us to stay as focused as we'd like to stay. I use focus tools... not brainwaves, focus tools... but, for example, one of the ways that I force myself to pay attention and not be distracted if I'm going to write for an hour is I have a little cube that I keep on my desk, and it has a timer on it. And so I'll pop it up to, say, 25 minutes. And during that 25 minutes,

I will not let myself be distracted by anything else. So if a notification pops up, anything happens, I just silence all of that, and I write for 25 minutes. And then I will take a five minute break, and I can be distracted by all of those things. And then I write for 25 minutes. If we had tools, even neurotech tools that helped us improve our focus, and we used it just for ourselves, I don't think that would be a bad use either.
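
The cube she describes is a physical timer, but the same 25/5 routine is trivial to script. A throwaway sketch (the function name and defaults are ours, not a tool she endorses):

```python
import time

def focus_session(work_min=25, break_min=5, rounds=4):
    """Plain work/break cycling, like the physical timer cube."""
    for i in range(1, rounds + 1):
        print(f"Round {i}: focus for {work_min} minutes, notifications silenced.")
        time.sleep(work_min * 60)   # shorten these while trying it out
        print(f"Break: {break_min} minutes, be as distracted as you like.")
        time.sleep(break_min * 60)

if __name__ == "__main__":
    focus_session()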

Tyson (14:15.102)
For sure. It's always so interesting with these conversations... it really is the double-edged sword, and this is something that you've been quite vocal about, obviously, and spoken about to a great extent. Because what you just mentioned makes perfect sense from a health perspective, from a progression and productivity perspective. I mean, you can see the potential benefits, or the clear benefits, in a big way. But then the other side of the conversation comes when we start speaking about this surveillance thing.

I mean, the idea of trust, especially here in Australia, I feel is very low at the moment, particularly coming through the other side of the COVID pandemic, where our government just got incredibly excited and really made a big push to track data in a way that had never been done on a public scale here in Australia before. There was a lot of pushback and there was a lot of hesitancy. And particularly, I say, like, in hindsight: we look back now, and I think even people who maybe went along with it were like, yeah, what did I...

What did I agree to? At least a lot of the people that I've been spending time with. And I mean, the idea of rolling this out on a mass scale would be fascinating to me. Cause as you said, like, there's companies here in Australia, and I heard in your book speaking about the companies, the train drivers in China, and I'm sure there's so many other places. But is there much information out on, like, the public's response to something like this? Because I hear about it, and I don't know what the difference is,

exactly. Like, the idea of sharing heart rate data through a run with Strava or with Garmin, it doesn't feel overly personal. I'm like, I can see from a health perspective how that might actually be beneficial in improving your product, and I love your product, so I'm happy to help. But once they start tapping into brainwaves and, you know... I think it's almost the monitoring of it, like that almost dictatorial, like that dictator filter over

Nita Farahany (15:53.518)
Right.

Tyson (16:03.326)
what it is that you've agreed to take part in. I just... I don't think I necessarily believe in,

you know, the good of a lot of human intentions, a lot of the time. I think it can be used to manipulate, as you speak about clearly. So, personal perspective now: I can't ever imagine myself signing up for a product like this. But I mean, you look at the big companies that you mentioned, who'll be rolling out their own versions of this in the next 12 months, and, I mean, obviously they know a lot better than me, so it sounds as though there's a huge market for it.

Nita Farahany (16:35.886)
Yeah, so it's interesting. Tell me again all of the sensors that you wear when you're working out. So you have a watch, you have a heart rate sensor across the chest. What else do you use?

Tyson (16:41.022)
Yeah. So I've got a, yeah. So it's only the heart rate monitor and it's the Garmin watch.

Nita Farahany (16:48.462)
Okay, the Garmin watch, okay. So first imagine that it turns out that peak exercise is better measured through brain activity than it is through heart rate alone, right? So, you know, what is, like...

and I'm not a fitness guru, so forgive me. I'll throw out some imaginary language here, but if it's muscle max use or whatever, or glycogen use or things like that, the amount that you're using those things, and if you can get better measurements of that through brain activity. Or through EMG sensors instead of ECG sensors. So you probably have ECG sensors that are put into your Garmin watch and put across your chest.

And what's coming next in most of these watches is also EMG, so electromyography, that measures electrical activity at the muscle junctions. Now that's actually a measure of brain activity, because what it's measuring is peripheral brain activity at muscles. So it's measuring the neural signatures as they go from your brain down your arm to any part of your body, any muscle, so you can measure it anywhere.

I think for people who are significantly into fitness and are used to tracking different bodily metrics, those metrics will probably be more powerful, more precise, and give you more information. And then, what if you know when you are hitting runner's high, and you know when you're at peak, you're just in the flow? And what if you can see that in your brain, and you're trying to maximize that metric, right? Which is, like, you're trying to measure

VO2 max or whatever, you throw out some terms that make sense, right? Those are things that I would imagine are gonna be part of the brain sensors. They're gonna be integrated into these different devices. And then there's a whole bunch of other metrics too that I think the average individual is going to want access to. Most people are used to tracking things like heart rate. They're used to tracking things like sleep.

Nita Farahany (18:49.23)
data, or the number of steps they take per day, or anything like that. And I think there's a lot of cognitive fitness information that people are going to gain access to and start to see the use case for. But I also think it's going to replace a lot of peripheral devices. So say, for example, immersive technology like virtual reality and augmented reality takes off. It's very unlikely that those are going to be

operated through, like, joysticks or through a keyboard or a mouse; it's going to need to be much more portable. And so the way that, like, AR glasses are going to work is that you're going to have a sensor in your watch that enables you to think about moving, or in your ear, like earbuds, that allows you to think about interacting. So it'll be a combination of things like eye gaze and eye tracking, which picks up brain activity and intention to move, together with brain signals from your ears in other cases. And then what will happen is,

like, so you have a Garmin watch instead of an Apple Watch. Apple just quietly, at their latest Worldwide Developers Conference, stopped updating some of their earlier watches. And that's what's gonna happen: first generation will have brain sensors, and it'll be an option, just like the heart rate sensor in the watch was an option. Second generation, it may still be an option. Third generation, not gonna be an option. If you buy a watch or you buy AirPods, they're gonna have the sensors.

Tyson (20:11.166)
That's, yeah, it's wild to hear, isn't it? It's really funny. I was actually listening... I drove about half an hour in the car, I've got your book on audio, and I was going through the first chapter today, and just thinking about

how there's almost... there seems to be so much reluctance to sign up for any form of new technology. I mean, the reluctance that I just shared there. But I think in the early days of seat belts being implemented, there was a real fear, like it sort of trod on the freedom of people to drive the way they wanted to drive, despite the benefits that you see through it. And I mean, where do you stand with the... I know you would understand where I'm coming from, but the clear benefit of the product that you've just spoken about there versus the massive intrusion of

privacy? And as you say, it comes to a point where, okay, you know, in 15 years there's gonna be no questions asked about whether or not you get it. It's just, if you buy the product, it's gonna be a part of that. And with that, and with the numbers of people who are taking part and purchasing those products, it just becomes more and more normal. Which is, to me, where the conversation starts to get a little bit more scary, because the idea of that being normal, and the access that we're just freely giving whoever it is that wants the access,

starts to hand over quite a lot of power to people who can manipulate and whatever. I don't know. I'd just be interested to hear your personal thoughts on it.

Nita Farahany (21:35.406)
Well, you know, it's interesting. So the paperback version of my book comes out next month, and it has a new chapter in it. And it's all about this. It's all about the normalization of risk: what happens as it's normalized and it starts to disappear, which is when I actually think it becomes more frightening. Because at least right now, you can look at it and talk about it and say, like, I'm just not gonna get that. That doesn't sound like a good idea. I don't want to breach my mental privacy.

When people encounter it more and more, and they stop thinking about it because they're invisible sensors that are inside their ears, and they just focus on the benefits and don't focus on the fact that their mental privacy is now gone in many respects, right? Then I think it becomes much more frightening, because we can't do anything about it. We can't do anything about the risk. So where I come out on it is, I think this could be incredibly empowering technology, and I think that it could be incredibly oppressive technology.

Tyson (22:27.006)
Yeah.

Nita Farahany (22:27.438)
And in fact, I think it could be the most oppressive technology that we've ever introduced into society, because I believe that the space for mental reprieve, the space that... you know, the tiny bit that we hold back right now that is not obvious through behavioral, you know, engagement with anything around me. You can't read it from my face, you can't read it from my eye twitches. Like, there's a part of you that is internal to you.

And that part of you, I think, is fundamental to what it means to be human. It's fundamental to your capacity for self-actualization. To become who you are, you needed a space of privacy. And to continue to be who you are, you need a space that's just you, that's reserved for you, that you share when and if you want, that you select what you share, that you have a place to mull things over. And when we breach that, the question is, are we going to do it in a way

that preserves that space for you only, that other companies aren't using it to commodify and target you for advertisement, that they're not selling it to your employer, that your employer isn't using it to monitor your brain activity, that oppressive governments aren't using it to monitor it, or to your point, during a major pandemic, they don't suddenly say, well, you know what, actually, the first sign of, you know,

whatever this virus is shows up in brain activity, so we're going to start to collect data on what everybody's brain activity looks like, and we all cede control over that. Or, post 9/11 here in the US, we passed the Patriot Act. The Patriot Act resulted in listening to all of our devices and collecting mass amounts of data on humans. What if it's much easier to figure out if somebody is going to plan a terrorist attack by starting to subpoena all of the companies who sell the

brain sensor data and start to scan for any mental rumination about anything having to do with bombs or terrorism or anything like that, right? Like keywords are being searched all the time in our text messages. Are we going to be scanning everybody's brains for that kind of information as well? You know, that's a terrifying vision of the future from my perspective. And

Nita Farahany (24:38.446)
You know, it's like either we don't adopt the technology and then we don't get any of the benefits of it, or we put really different rules into place now and hard controls, not just like, let's trust they're not going to collect it. Actually, if I have like earbuds in that have brain sensors, I want a hard control where I can switch off the brain sensor and I can have a conference call or I can wear it to listen to music where my brain activity isn't being monitored if I don't want it to be monitored, right? We need to give people hard

Nita Farahany (25:06.926)
controls over their data, over their most sensitive data imaginable.

Tyson (25:12.382)
For sure. I struggle to... so a classic example that I have is, I want the iPhone. I mean, I've got the iPhone here, and when I carry it around, for whatever reason... like, I'm not that convinced that anyone's tracking my personal data or my personal location, but location services are always off. Cause I go, hey, I want the perks, I want Spotify, but I just don't want you to know where I'm driving. Do you know what I mean? But I...

Nita Farahany (25:35.502)
Yeah, no, I totally know what you mean. But, like, kids don't do that. You know, kids not only have their location data on all the time, but, you know, my students this year... I was shocked to learn this. Like, my freshman students, first-year university students, they have hundreds of people in their Find My Friends, like, that they're sharing their location data with all the time. Hundreds. Like, anybody can just look and see where they are at any given time.

Tyson (25:56.158)
Yeah. Yeah.

Tyson (26:01.63)
But see, I've just caught myself out in my own little lie, because going back to Garmin, and sorry, like, you can tell distance running is a thing, I've got no shame, I've got no qualms uploading my route for that run. You know what I mean? So anyone who goes onto Strava, you're going to know where I'm at at a particular time of the day. But yeah, it's very interesting, and I'm not sure how much trust... I'm not sure how convincing certain companies are. Like, when I turn that location setting off, I take it with a grain of salt

Nita Farahany (26:12.078)
Right. Right.

Tyson (26:29.982)
that it's actually been turned off. I don't know for sure. I mean, mentally I go, that's a good step. But the idea that this technology is, you know, being rolled out at far greater scale in the next couple of years makes me think there's no way that the companies who are using it aren't going to overstep the line in any way they possibly can. Like, I'm not sure... you know the law a lot better than me, obviously, but the idea is that if they can, they will. And

Nita Farahany (26:50.926)
Some of them may. I mean some of them may.

Yeah, I mean, that's why I think it can't just be that we like leave it up to them, right? I actually think we need a set of rights. Like I think we need to define the rights clearly as to what a right to cognitive liberty looks like, how we would recognize that in law, how we would enforce it. And I also think we need incentives for the companies to behave better. I think we need different business models that aren't based on commodification of the data. Like I think we need all of those things, but

Tyson (26:59.87)
I don't know if I'm just being overly cynical.

Mm.

Nita Farahany (27:29.49)
But I think we have to start with some hard... like, not just hard controls with it, which is a little switch on my device where, when I switch it off, it works, and I have to trust that it works, but also big sticks and punishment if it doesn't work, right? If my brain data is commodified, if there is misuse, if the terms of service are updated and somehow, you know, there are tying agreements which say, like, you want to use an AirPod, then you have to give away all of your brain data, or you want to use a watch, then you have to give away all of your brain data... I don't want that.

On your sharing of your Garmin data, one of the things I discovered in writing the book was that there are online communities where people are sharing their brain data already with each other, where you can measure the extent to which you're meditating. And there are different brain waves where, like if you have a lot of gamma activity, for example, that's supposed to show that you're really in this extraordinary state of meditation.

And they share their, like, raw brainwave data with each other during meditation sessions on these online sharing sites, places like Facebook and in Facebook groups, where, you know, they want to say, like, look at my gamma activity, or, how do I increase my gamma activity in this particular area, or, look at my beta and alpha brainwaves. That's already happening, right? And even though this isn't, like, widespread,

you know, adopted across all of society yet, for the people who are already early adopters of this technology, exactly what you're talking about is happening. And that same data could be taken, and, you know, more information could be extracted from it about what a person is feeling at the time, you know, even, like, mind wandering. Soon, even kind of the word-level and substantive content of what they're thinking could be decoded from that information that they're sharing online.

Tyson (29:21.406)
Yeah, how far do you think we are away from this being rolled out on a mass scale? Like, I remember in 2007, I think, I saw my first iPhone. And I think in 2008, everyone I knew had an iPhone. Like, it was fairly immediate. I jumped on the bandwagon there pretty quickly. Are we talking, sort of, like, the next 12 or 18 months? This is going to become far more of a... yeah, wow.

Nita Farahany (29:32.686)
Slide one. Yeah.

Nita Farahany (29:42.478)
I think so. Yeah. So I think first quarter 2025 is when Meta has said they plan to launch their EMG device, so their watch with brain sensors in it. I think Apple first is planning on launching likely earbuds with health sensors, but probably not EEG sensors yet. It'll have, like, heart rate sensors, and then EEG sensors are soon to follow. I think 18 months is the max for kind of wide-scale

launching. If you look at the number of companies that are launching even just this summer, there are a bunch of smaller but kind of big-player companies in the neurotech space that are launching products this summer that are this new wave. Neurotech has been on the market for a while, where it's an awkward headband or something like that. What's coming now, and what has already entered the marketplace, is quite a few products that are the earbuds and the headphones and the watches.

And that's the form factor transformation. And then the fact that large language models can now be on device is the technological breakthrough. Because one of the things that's tricky about brain data is everybody's brain data is a little bit different. So measuring heart rate from you and measuring heart rate from me is a pretty easy technological problem, relatively. Whereas you have to

calibrate brain data, because, like, how you solve a math problem and how I solve the same math problem looks a little bit different in our brains. Having a large language model on device allows, like, a device to learn you over time. So out of the box, the only thing it needs to be able to do is a really simple set of commands, like up, down, left, right, and then it can calibrate and learn you over time

so that it gets better and better at decoding your brain activity. So that's been one of the major advances: like, the ChatGPT moment is also the neurotech moment.
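
A toy illustration of the calibrate-then-keep-learning loop she describes. Real products use far richer on-device models; the nearest-centroid decoder, the feature vectors, and the fake per-user signal statistics below are all invented stand-ins for processed brain data:

```python
import numpy as np

class CommandDecoder:
    """Toy nearest-centroid decoder for four 'thought' commands.

    Only illustrates calibrating on a simple command set and then
    refining the per-user model with every confirmed example.
    """
    COMMANDS = ("up", "down", "left", "right")

    def __init__(self, n_features=8):
        self.centroids = {c: np.zeros(n_features) for c in self.COMMANDS}
        self.counts = {c: 0 for c in self.COMMANDS}

    def update(self, features, command):
        """Fold one confirmed example into that command's running mean."""
        self.counts[command] += 1
        n = self.counts[command]
        self.centroids[command] += (features - self.centroids[command]) / n

    def predict(self, features):
        """Pick the command whose centroid is closest to the features."""
        return min(self.COMMANDS,
                   key=lambda c: np.linalg.norm(features - self.centroids[c]))

# Out of the box: a short prompted calibration pass over the simple
# command set, after which every confirmed prediction keeps refining
# the model for this particular user.
MEANS = {"up": 0.0, "down": 2.0, "left": 4.0, "right": 6.0}  # fake per-user signal
rng = np.random.default_rng(0)
decoder = CommandDecoder()
for cmd in CommandDecoder.COMMANDS * 5:                      # calibration prompts
    decoder.update(rng.normal(MEANS[cmd], 0.5, 8), cmd)
print(decoder.predict(rng.normal(MEANS["left"], 0.5, 8)))    # -> "left"
```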

Tyson (31:36.574)
Yeah, it seems like it's just a big wave at the moment, it all seems to be happening at once. But they're all being rolled out pretty... I'm not going to say free of regulation, but in terms of what you think needs to be established... Yeah.

Nita Farahany (31:45.806)
Yeah, very little. Yeah, I mean, so, you know, across the world there's sort of a patchwork of laws that have started. Chile passed a law that had some specific protections with respect to the collection and use of neural data. Here in the US, Colorado passed a very narrow law, and California is considering a law that gives protection under their Privacy Act for kind of greater levels of control over

the collection and use of neural data. Right now, UNESCO has a major process underway to define the ethics of neurotechnology. And the first draft of that was issued a few months ago. The second draft will happen at the end of April of 2024. And then it'll go through a political process. And if that's adopted by the 194 member countries, that would have a strong norm as a baseline. And

The OECD has issued a set of recommendations about the responsible use of neurotechnology, but there's not a lot of hard law on the books yet, right? There's starting to be a lot more soft law that tries to set the norms and the standards, and that's at least better than what has happened with a lot of other technology. But given the stakes here, I'd like to see much stronger protections that happen as hard law across the world.

Tyson (33:06.462)
Yeah, and when you say the stakes, you're just referring to a surveillance state, essentially. Yeah, how realistic is it that we go in that direction of this being used? Like, if we were to continue down the road of the regulations that we currently have, I mean, it sounds like there's just a whole heap of freedom to use it in any nasty way that you want. The other side of that... just, I'll go on a little tangent for a minute and throw this at you. It's really interesting how...

So I was driving in Sydney here a couple of weeks ago and I got a letter in the mail to let me know that a camera had seen me. And I mean, I was in the wrong, there's no doubt about it, but I was on my phone and it said, you were caught by a traffic light camera, you're on your phone. And I thought, well, okay, I know I'd done the wrong thing. And if a police officer saw me, I would be in trouble and I'd fess up and I'd pay the fine. I don't know why it struck me so badly that a camera had caught me doing it. I mean, it was the same thing.

It was the same fine. It was the same problem. But the fact that it was handed out by a machine rather than a person, I go, hang on. Like, this is... it just... I'm not sure what the right words for it are. Like, surveillance state, obviously, a weird step in a direction that I'm certainly not used to here in Australia. But what is that? Like, there's something about the conversation around technology, like this stuff that we're speaking about now, that I put in that same category of, okay, it's amazing, but it's also super yuck.

Nita Farahany (34:11.762)
Yeah.

Nita Farahany (34:32.462)
Yeah, I think it's pretty super yuck, the fact that we've introduced such a broad surveillance state.

Tyson (34:38.462)
Sorry, just to clarify, when I say it's amazing, I'm not talking about the surveillance state, I'm talking about the perks of the technology in case you thought I meant celebrating Australia's traffic light cameras.

Nita Farahany (34:43.406)
No, I know. I know. I know. Yeah, I didn't take it that way. I think it's amazing in that way, right? So, you know, I remember in, like, 2007 or eight was when the UK first introduced cameras, like, everywhere. And, you know, but that was still mostly, like, humans watching the cameras. So the ubiquitousness of the cameras, like...

you're still limited by how many eyes you can have on the cameras and how much they can actually detect. When you layer AI on top of it, where it's looking for, and trained with computer vision to find, somebody who has a phone in front of their face in a car, it does a couple of things. One is it makes it much more powerful. Those cameras everywhere are now paired with AI that's analyzing every step. But for a lot of people, it also, in a way I think is

concerning, makes them think it's less problematic, because it's a computer that's watching you instead of a human who has literal eyes. And I think, you know, what they don't sort of recognize at that moment is what it is that we've implemented. Like, sure, maybe it's not a human who's been watching every camera and seeing you as you've gone by.

It means everybody is being watched all the time now, everybody's being watched all the time by cameras, which of course can be deciphered by humans, but we're putting into place punishment and signals for that surveillance state, right? Maybe it's sending you a ticket. And in some instances, maybe we're going to be okay with it, right? I mean, suppose what it's tuned to is seeing somebody whose eyes are closed for more than, you know, five seconds on the road, and it leads to

some kind of an alert system that prevents major accidents from happening. But when it's being used to send you tickets, it feels even ickier, right? It's being used to punish you, not to help you. Maybe that's supposed to be safety, because, like, it's bad for us to be on phones in cars. But part of this is about aligning technology to what is actually good and what we think is good. And we haven't done that well yet. Most of the technology that we have,

Nita Farahany (37:04.526)
you know, put out there in society, most of the surveillance that we put out there in society, hasn't made us feel safer or better or smarter or happier. It's led to more feelings of being watched, and more power imbalance, and more misalignment with what we think is flourishing. And so I think your example is probably you feeling the ickiness of, like, that's not how I would want technology to be used. That's not safety. That's not good for humanity.

Tyson (37:33.79)
Yeah. It is interesting how safety seems to be the key point for so many of these new technologies. I mean, not necessarily with what we're speaking about, but in terms of traffic cameras and CCTV, it seems to be rolled out with that as its banner of, hey, look, it's doing good for everybody. But you're right. And I think, as well, as you were speaking, I was thinking there's no real solid line in the sand about how far you can take that. Like, does that mean... because I'm not sure how... I haven't dug

Nita Farahany (37:59.214)
Yeah.

Tyson (38:04.734)
really deep into this Chinese social credit score. I'm sure you know so much more about it than what I do. But it seems as though, whether you're in the States or whether you're here in Australia, we're all taking pretty big steps towards something like that. Because the idea of that camera just taking the photo and sending a ticket direct to my house now, it eliminates a whole heap of the bureaucracy, or a whole heap of the middleman work. And I'm sure the revenue for them has just gone through the roof based on how broadly they've laid that out. But there's

Nita Farahany (38:10.094)
Yeah.

Tyson (38:33.662)
just no limits written down as to, okay, well, this is an acceptable place to take this kind of technology, but that's the line in the sand. Like, once we've sent out the ticket, we can't go any further than that. It seems like a really murky, messy place to try and set barriers in clearly, because I guess it's not until the technology has been rolled out that you see more opportunities to either, you know, further develop it or restrict it. It's a little bit of an ebb-and-flow kind of a feel to it.

Nita Farahany (39:01.614)
Yeah, I think that's a good insight, which is there's nobody really stepping back to say, OK, well, does all of this make sense together?

an analogy for you is, I like going to New York City. The streets are laid out nicely; there was a lot of urban planning. And so, you know, it was like, yeah, there's a huge amount of density, but it's really easy to find your way around, because it's, you know, like, here are the avenues and here are the streets, and it's all planned. I live in Durham, North Carolina, though, and it was not very planned. And there's a huge amount of growth that's happening here. And every time something pops up, there's, like,

new weird ways that you have to navigate the road, roads change names, you know, there's never enough parking anywhere, there's no new plan. And it's like one thing at a time, like one building at a time that's approved, as opposed to, let's have a really thoughtful plan and design for what we want a city to look like and how we're going to lay it out and how we're going to allow density to happen. And so this is, like, hodgepodge. And if you think about technology a little bit that way, each choice

might seem well intentioned in isolation, but it's the cumulative effect of all of these different choices together that starts to create a dystopian society. And so for the social credit system, maybe the first step was like, well, let's have a safer system where people are incentivized to behave well. And so we'll score their insurance where if they get caught by more than four cameras a year that see them on their phone while they're driving,

it'll cost them two points on their insurance and their insurance premium will go up. And then suddenly it was like, okay, and let's add one more behavior, which is like, what if they yell at the attendant at the counter for getting onto a plane and make a super unpleasant situation, let's make it more costly for them to fly. And then like suddenly humans are reduced to data points. And, you know, it's one choice after another, rather than

Nita Farahany (41:06.542)
looking at the system as a whole and saying, is this the kind of world we want to live in? Do we want to reduce ourselves to data points, where we have a surveillance society where everything we do is scored, and it's all scored automatically by a huge surveillance system watching every aspect of everything we do, including our brain activity? No, right? If anybody stepped back and said that from the get-go, and we got to vote on that, we would vote no. But we vote on these tiny little measures along the way

without thinking holistically about the kind of direction that we're taking in society. I'd like us to change that, right? I think this is a good moment. You know, I've been championing this idea of cognitive liberty to say, if you look at what we've given up in the first era of digital technology, it's the right to self-determination over our brain and mental experiences. Like, our self-actualization is at risk. And if we have now, like, if we have a...

Here's how we're trying to align choices: if we have a goalpost to say, like, flourishing depends on this, then every new choice we make, we could check it against that and say, does adding in an automatic ticket to Tyson advance or undermine cognitive liberty and humanity? And if the answer is it undermines it, then we shouldn't do it, right? You'd have some goalposts that you're trying to get to, which would be aligned with human values instead.

Tyson (42:32.606)
And have we got any, like, from your perspective, just solid foundational rules that we can put in place? Or is it... I don't understand the actual technology well enough to be able to know what it is that you need to put in as a boundary, but the idea that there should be some things in place makes perfect sense as well, for the exact reason that you just spoke about. Like, where do we start with that, from your perspective, in terms of some good solid rules around just maintaining that cognitive liberty?

Nita Farahany (42:36.334)
and

Nita Farahany (43:00.878)
So, I mean, I think the way I described it in the book as a starting place is to say, OK, well, we already have rights in place. Let's look at which rights are implicated by this and which ones we need to interpret more broadly in the digital era. And so those rights, from a human rights perspective, are things like self-determination. We generally recognize that as a political right, but it's also an individual right. And self-determination means,

are the things we're doing interfering with your capacity for autonomy, for competency and critical thinking, or for relatedness with other people? We can look at social media and see it fails on most of those. We can then look at mental privacy. So we have a right to privacy. We haven't had to protect mental privacy in the past, because it was just de facto protected by the lack of technology. OK, well, now it's threatened. So now we need to say, OK, well,

in implementing technology, what does that mean, for example, for workplaces? Well, it probably means you have a right to mental privacy in the workplace. And so employers can't require that you disclose brain activity as a condition of employment; they can't punish you, or fire you, or hire you based on that. And then you have a right to freedom of thought, where usually what has been persecuted has been religion or belief. Well, now it's any kind of thought, right? It's like, interception, manipulation, and punishment of your thoughts should just be off the table.

And if we start with those three rights, we can look at individual instances pretty easily and say, we have goalposts. The goalposts are: don't interfere with self-determination, don't interfere with mental privacy, and don't interfere with freedom of thought. Mental privacy, like all privacy, is a relative right, and so sometimes societal interests will be stronger. I do think truck drivers probably don't have a stronger interest in keeping their fatigue levels private than we as a society have in being able to say,

if you're a commercial driver, you don't get to fall asleep at the wheel, right? But it would say, there's no other information that we have a right to know about you other than, are you awake or falling asleep? Like, I don't need to know who you're fantasizing about while you're driving. I just need to know that you're awake while you're driving, right? And so it would just give us principles by which we make those decisions. And that's what we need in the modern era: we need to say, look, these exist, but we need to interpret them and apply them and say, here's the goalpost.

Tyson (45:07.23)
Yeah, sure.

Nita Farahany (45:21.838)
Now let's make critical choices in each of the different contexts where it's relevant to make sure that it is aligned with those goal posts.

Tyson (45:29.886)
man, you must feel like you're living in a sci-fi movie. I'm 37, and I've got pretty clear memories of the 90s, just learning how to type on a big box computer and thinking, like, this is great. And the idea of, any time in the 90s, or even 10 years ago, someone saying to me the things that you're saying to me now, or the things that we're talking about... it just sounds pretend, it sounds made up. Don't you think? How do you go?

Nita Farahany (45:33.806)
Yeah

Nita Farahany (45:52.558)
Yeah, it does. I mean, look, there were probably movies in the 90s about this. I think there was a movie that Mel Gibson had, What Women Want, where he could hear everything that the women were thinking. And yeah, right. And, like... but now, not so much, right?

Tyson (46:05.15)
Yeah, I remember thinking that would be so helpful. Yeah, it's incredible. Are you optimistic for where we're at with this? Do you think, with obviously everything that we've spoken about in mind, is there going to be regulation put in place which is solid enough to actually keep companies accountable? Because if there's anything I've learned in the last few years, it's that

money speaks to a really large degree. And it seems that the companies can overstep so many hurdles if they've got enough money to deal with the fines that they're given; it doesn't seem to matter. And that's one thing that I've found really frustrating: just the dishonesty with whatever it is that you're speaking about, because you can just pay off your fines as though it's almost a marketing cost. And I don't know if we're at that stage now with big tech, but it sounds as though it's well and truly

Nita Farahany (46:54.51)
Yeah.

Tyson (47:00.926)
heading in that direction where you could probably make a few little adjustments and just pay the fine based on the fact that it's earning so much money that it was worth it.

Nita Farahany (47:10.606)
Yeah, I mean, look, the odds are not in our favor, right? What was the saying in the Hunger Games? Like, may the odds be ever in your favor, or whatever that was, right? The odds are not in our favor right now, because the history, you know, the relatively short history, but the history of digital technologies has really been one about commodification of data. And commodification of brain data, like...

Tyson (47:20.414)
(laughs)

Nita Farahany (47:37.038)
It's, like, the last big source, and probably the most profitable potential source, of data that could be exploited. It's also the most costly to humanity. And I think what makes me optimistic is that a lot of times technologies come to market and then the conversations about regulation happen, as opposed to in advance, in anticipation, trying to put into place a responsible pathway forward.

This time, you know, there's a huge number of conversations and legislative efforts that are underway, and soft law that's underway across the world, and that makes me optimistic that the conversations are happening. What makes me pessimistic is that the general public is not aware of what's happening or how quickly this technology is coming, nor the potential risks of it and why the risks are different. Why mental privacy may,

in fact, be the most important kind of privacy to protect, even if every other kind of privacy is breached. And the companies have powerful incentives to the contrary. They also have powerful incentives to get it right, to have society adopt the technology. So all of that is to say, I am always an optimist, and I try to always be an optimist. And at this moment, I'm optimistic because I think we could make real choices.

that would enable the technology to align with the kinds of things that would be good for us rather than bad for us. But that requires each and every one of us to make those choices, to demand that, right? To not just let it be something the tech companies decide, but to say, no, no, no, I demand if I'm going to use this product that I have a right to the data being my data, that it can't be used, it can't be used for advertisement, it can't be used and sold to third parties. Like, it's mine. My brain data is my data.

And if people say that, I think we'll be in good shape. But we need everybody to say that.

Tyson (49:35.838)
Yeah, from what you've seen, is there a pretty clear line? Like, it seems like almost a political issue at the moment. Like, I'll assume that the stereotype of what an American, or a conservative anywhere in the world, is right now is they're probably the one with a lot more apprehension, you know, of taking big steps in this direction. Whereas, like, a classic... I get confused, because the Liberals here in Australia are our conservative party, but the liberals...

Nita Farahany (50:02.414)
Just to mix it up, right? Just to make it confusing.

Tyson (50:02.654)
You know what I mean? You know what I'm trying to say? Why would we have any clarity on what the names mean from, you know, Australia to America? What I'm trying to say is, at the moment, I can imagine a conservative being very apprehensive, and a liberal being like, hey, just do what you got to do, I trust you, no matter where you go. Like, is it, like, a 50% split on how people take this?

Nita Farahany (50:10.926)
Right.

Nita Farahany (50:24.782)
I mean, so, like, what we have here is probably, like, a stronger deference to businesses, and a desire to not interfere with businesses, from what are our political conservatives. And liberals here are more likely to advocate for rights for individuals around privacy. I think in terms of adoption, though, what's been interesting to me is that it doesn't seem to be political,

meaning everybody seems to recognize that cognitive liberty is important. So it seems to be one of those issues that could truly actually transcend political boundaries. Now, the devil's always in the details. And so, you know, what does that mean for regulation? What does it mean for rights? What does it mean for, you know, putting controls on companies? But what has been encouraging to me is to see, you know, in the few legislative efforts that have happened, for example, in the United States, there's been very strong

Tyson (50:58.526)
Just

Nita Farahany (51:23.662)
support across the aisle; it's been bipartisan. And that, to me, gives me hope that we could actually get somewhere, at least in this particular context. It's, like, here and with kids: those seem to be the two areas where you can actually get bipartisan support. And I think, you know, it's possible that everybody recognizes that having a space of mental reprieve, a space of mental privacy, is fundamental to being human, whether they're a political

conservative or liberal, no matter how that term is defined in any country.

Tyson (51:57.438)
Yeah, yeah, sure. Just to clarify, you're saying the... what was bipartisan? The actual... the seeing the need for something?

Nita Farahany (52:05.646)
Yeah, I mean, so for example, in Colorado, it was a bipartisan issue, you had Republicans and Democrats agreeing with each other about putting into place special protections around data that's collected by these devices. And similarly, in California, it's emerging as a bipartisan issue. And then when you talk to people about cognitive liberty and say like, hey, how do you feel about this? And then you do surveys, for example.

Tyson (52:17.15)
sure.

Nita Farahany (52:33.134)
A lot of times, different rights and different interests break down along political lines. And this seems to be one that doesn't, where you actually have broad interest across the board. No matter who people are politically or ideologically, there seems to be a lot of convergence. And that's encouraging, because, I mean, how many issues do we have left in the world that actually cross the political divide, where people feel the same about it? So that means...

Tyson (52:57.254)
I didn't know anyone in the world agreed on anything anymore. I thought those days were...

Nita Farahany (53:00.174)
Right, no, exactly, exactly. So I get to live in this utopia of like, the stuff that I work on happens to be something that you have both sides of the aisle agreeing on.

Tyson (53:10.654)
That sounds amazing. I might get involved; it sounds very refreshing. So, I mean, just before I let you go: the next 12 months for you, what are some of your big next steps? Like, what are you trying to achieve at the moment? Obviously we've established that, okay, there's a real need for some regulation, or some rules and laws, just to get started. What does that even look like, trying to get the ball rolling on that? Like, the world of law is

Nita Farahany (53:13.806)
Right. It's very refreshing. It's like, it's like you're living in a world where everybody disagrees. No, no, no. In my world, people all agree. It's great.

Nita Farahany (53:39.182)
Yeah.

Tyson (53:39.39)
one that I have no idea about. And so I mean, this makes sense to me, but I would have no idea what to do with this information.

Nita Farahany (53:46.35)
So, for example, I'm involved with the UNESCO process. So I'm co-chairing the process of the experts' draft that's being developed, which will then go through a political process after the expert process is done. So now through August, I'm really making a big push to try to get as much feedback and input on that draft from people across the world, to make sure it's as broadly reflective of different perspectives across the world.

You know, there are other major organizations that have kind of put their toe in the water as well. So, for example, the World Economic Forum is starting to launch a much bigger effort to try to put together a council that really looks at these issues and tries, across the world, to set global policy on this. The UN has been exploring this issue. And different countries... I've been meeting with, you know, government representatives and ministers of science and technology from across the world who are interested in engaging in these issues.

And then there's also a lot of political activity that's happening here in the United States. So there are, state by state, different states that are trying to develop laws, and I've been talking with a lot of legislators about that. But there are also these organizations, something called the Uniform Law Commission, and the American Law Institute. These are organizations that try to put together model laws for the entirety of the country.

And so I'm working with those organizations to develop what the model laws would look like so that instead of going one state at a time, you could actually have a model for what it should look like. Those are just some of the examples. And then, you know, I do a lot of public conversations around the world because I think it's so important to not just have this be a conversation among legislators and among experts, but to spend as much time trying to talk with people in the general public, people who were likely to adopt the technology so that they understand

what's coming, what the risks are, how to have those conversations, how to think about what it means for themselves as well. And then of course, I'm doing a lot of research for kind of the next steps in the process.

Tyson (55:53.406)
I would love to be a fly on the wall of the different age groups that you speak to. Like, the 18-year-olds and the 75-year-olds must have extremely different views. Man, my mom's going to enjoy this podcast more than me. I think I'm even further down the road of understanding your technology than she is, so it's going to be a really interesting one. Nita, I'm going to... I'll let you go. I've got my eye on the clock, and I know you've got plenty to do with your day, but really appreciate you making the time to come on and having a chat.

Nita Farahany (56:00.91)
Very different views. Yeah. Very different.

Nita Farahany (56:21.902)
Thanks for having me, really enjoyed it.

Tyson (56:24.054)
Awesome. Cheers. I'll cut that off there. Thank you so much. I hope my questions made sense. It's 12.30... Sorry, I got about halfway through and I was like, my gosh, I hope I'm making sense, because I feel like I'm rambling.

Nita Farahany (56:27.726)
Okay.

They did. They were great questions. Really thoughtful. Okay, I know it's late for you. So.

Nita Farahany (56:39.182)
You are, you are, it's so late at night. Thank you for making it work on your time zone.

Tyson (56:43.07)
No, of course. No, I was so excited. I know it's so difficult. Like, Eastern time is probably one of the most tricky, just because, what's that, like a 16-hour time difference? So I wasn't going to be fussy.

Nita Farahany (56:49.102)
Totally. Yes. Well, I'm grateful that you made it work on this time zone.

Tyson (56:57.374)
No, you're a legend. Hey, thank you so much, Nita. Cheers, you too, bye.

Nita Farahany (57:00.334)
Thanks, take care. Okay, bye.