Get Me to the Gray

In this episode of Get Me to the Gray, Paula speaks with researcher and decentralized technology advocate Wouter Constant about Nostr, an open protocol designed to move social media away from centralized platforms like Meta, Google, and X.
Instead of a single company controlling the platform, Nostr distributes communication across independent servers called relays, allowing anyone to build apps that connect to the same network.
Supporters argue this architecture reduces corporate control and protects free expression. But it also raises difficult questions: if no company is in charge, who is responsible when things go wrong?
What follows is a conversation about the trade-offs between freedom and accountability, the limits of corporate moderation, the risks of open systems, and what it might mean to rebuild the internet’s communication infrastructure from the ground up.
You can check out Wouter's Nostr page here.

Creators and Guests

Host
Paula Lehman-Ewing
Host, Founder of COJA Services
Composer
Chris Principe
Producer
James Ewing
Producer
Jamie Konegni
Marketing Director
Writer
Jason Masino
Programs and Partnerships

What is Get Me to the Gray?

Get Me to the Gray, presented by COJA Services Inc., is a podcast about the conversations we’re told we shouldn’t have. Hosted by journalist and author Paula Lehman-Ewing, the show brings people with fundamentally different ways of seeing the world into honest dialogue—where we name what divides us and keep talking anyway.

COJA Services Inc. works with mission-driven organizations and brands that are clear on their values but struggle to translate that clarity into public-facing language. We help teams align internal narratives, reduce confusion before it becomes mistrust, and translate complexity into public understanding without relying on scripts, rhetoric, or generic AI language that strips voice and judgment.

If you're in the greater Denver metro area, register for our LIVE events at tinyurl.com/COJAEvents

Paula (Host): [00:00:00] My guest today is Wouter Constant. He is a researcher and advocate of decentralized technologies. His work centers on the role of open technologies in reducing corporate control and enabling new forms of online communication and community. Welcome.
Wouter Constant: Thank you.
Paula (Host): So I think you and I agree, as a lot of people do, on the unease of having large corporate tech companies basically controlling speech, culture, and public conversation. Where we'll diverge, I think, is that removing centralized control raises a new set of questions, mostly around accountability, harm, and responsibility.
So when you imagine a world without centralized platforms, without companies making those moderation decisions, what do you think gets better and what do you think gets harder?
Wouter Constant: I actually think a lot of the problems are much easier to fix when we move to a Nostr-based [00:01:00] world, let's say, as opposed to how we approach things now.
Because the point of Nostr is not to create some ethereal common space that nobody controls. What it actually does is decentralize all these places where we function online. Subsequently, all the policy, moderation, accountability, and responsibility that come with hosting fall to whomever is facilitating at that particular time.
We're relying on centralized platforms to do all these things for us, and what we're actually noticing is that they're failing to keep up that responsibility.
And that's not necessarily any fault of theirs, because personally I'd reckon that what we're asking of them is unreasonable. It's unreasonable to carry that much responsibility, to cover all those bases. So I think that's where a large part of the problem comes from, and just splitting that up solves a lot of things right away, as well [00:02:00] as putting us as users, and
the ecosystem of the web, in a better position to tackle a lot of the challenges.
Paula (Host): Yeah. I mean, I do want to get into the technology, but it sounds like it moves beyond a technological conversation into a cultural one about responsibility and
autonomy. Am I understanding that correctly?
Wouter Constant: Yes. What Nostr does is tackle things on an architectural level, in how we construct things on the web. By architecture, I mean: imagine these platforms as one large building, right?
Nostr is just built differently; the way we move about is fundamentally different. From there a lot of other things flow: how incentives align, who is in a position to do what, that type of thing. That changes, and it touches on power relationships, so there is a political dimension to this.
There is a cultural dimension to this. There are all these various sociological dimensions to this [00:03:00] topic that are important, simply because the internet as such is
almost fully integrated into society. Almost every aspect of society involves the internet as a communication layer, whether we're communicating very personal things or communicating about business, doing transactions, doing trade, and everything in between.
Paula (Host): Is there a correlation between decentralization and neutral infrastructure? In that case, the idea that this is coming together as a way of incorporating how we already use the internet in our lives, and then creating a platform that gives us the autonomy to create it and raise it and make it what it is?
Are those two things equated in your mind, decentralization and neutrality?
Wouter Constant: Yes, but I'd have to expand a little on this notion of neutrality, because it might seem to imply being apolitical, and that's not what neutrality is. Neutrality as such is a political notion, but the [00:04:00] difference is that neutrality says the system itself does not favor any particular interest.
It allows any particular interest, whatever that interest may be, to defend itself, to stand up for itself. What neutrality implies, then, is that you're kind of forced to tolerate other interests expressing and manifesting themselves. That's a cost, right? There might be a lot of things you don't like and don't necessarily want to tolerate. But the reason you tolerate them is that the same neutrality allows you to defend your particular interests. So neutrality here is this space where everybody can defend their own interests, and we all have to tolerate that other interests also exist.
You could call it a game-theoretical position that it would be sensible for everyone to ultimately default to.
Paula (Host): So I have this, um, personal problem with the idea that a system can be neutral in any sense. In this realm, if someone were [00:05:00] to go to your website, they probably wouldn't understand a lot of it, because it's written with code.
And if someone wants to engage with this in a way that's effective, they're going to have to be fluent in this area. They're going to have to have access to technology, first of all. So it wouldn't be the kids in the classroom who don't have computers or iPads versus the kids who do.
So doesn't that accessibility gap, on a platform that is digital and code-based in nature, favor whoever is most coordinated, most aggressive? And if those hegemonic actors want to use the technology for that end, are there guardrails against that?
Wouter Constant: Um, no.
Fundamentally, there aren't guardrails against that dynamic. So what is this power game we're playing? It's the interest of the hegemon on the one end [00:06:00] and the value of the ecosystem on the other.
What Nostr provides is interoperability, which allows you to not be stuck with a particular place or server, not be stuck with a particular app. Subsequently an interoperable ecosystem evolves out of that where, similar to what I described earlier, you can
join in the interoperability, and then you also have to tolerate that there are others who are also interoperable. There's value there.
But there's always the danger of capture. That happened, for example, in the browser space. We have open web standards, and technically they're open; the most dominant browser right now, Chrome, is also open. But the fact that it's open is practically useless to anyone, because in order to write an implementation of that open standard, you're going to need something like a billion-dollar corporation.
And on the other hand we have Firefox, and Firefox is actually subsidized by Google.
So we already have the hegemony of Chrome. Um, and that means they [00:07:00] can dictate what the standard is. You can have this open standard, hello world, this is the open standard, and then tomorrow Google can decide, well, we kind of wanted it to work that way, so we're just going to go ahead and build that into Chrome.
And now ninety percent of users are using that, because they're using Chrome. And the ten percent who thought they were playing this open-standard interoperability game are forced to adapt to whatever the dominant player decided. So this is a fundamental dynamic that is always at play underneath these things.
The guardrail is the value of the ecosystem as such. If you don't have one implementation but thousands, and one large player comes into the game, it's going to have to compete, in terms of network effect, against an ecosystem of thousands of other apps and users and systems already running.
So if Google decided to, quote-unquote, adopt Nostr right now and be very aggressive, and we got into a similar situation where everybody [00:08:00] uses the Google Nostr apps, we'd again be in a precarious position where tomorrow Google could decide to change the Nostr standard. And that would become the de facto standard, simply because that's where the network effect and dominance are.
Paula (Host): Okay. Well, in that case, if Google steals my data, I know I can blame Google. So if there are bad actors on something like Nostr, how do you exact accountability? Here's a personal example: my son plays Roblox, right? And he plays on these open servers. Obviously not everybody is operating with an eight-year-old's mentality, and I've had to tell him, hey, people hide behind avatars.
They might not be who they say they are. Um, and that's kind of the best I can do with bad actors. But if something like human trafficking, to go to an extreme, or exploitation of [00:09:00] some sort is happening, how does decentralization help you find that bad actor?
Whereas, while I don't like that Google is making decisions for me, I at least know that if they make decisions I don't like, I know where to go to hold someone accountable.
Wouter Constant: Yeah. But there are a couple of things going on here at the same time that I'll have to untangle. Sure, you can hold Google accountable.
I don't know what use that's going to be. To the extent that they carry responsibility for a whole bunch of things, as I already mentioned: are they actually capable of carrying that responsibility in the first place, such that you can then hold them accountable? Is that going to change anything if they were never capable of carrying that responsibility to begin with?
So you're just increasing the demands on something that isn't going to happen anyway. As for these very nasty criminal things that occur, in a decentralized context the accountability works like this: because we're decentralizing,
all these individual actors, the people who run these servers that we call relays, are ultimately responsible for what is on those servers [00:10:00]. And they also have the means and the capability to moderate and control what is on those servers.
The reason this is a problem right now with a centralized platform is that everybody relies on the same party to do that. So we have to have a collective policy that applies to everything and everyone. And if that hurts the particular interest of one group,
we get this battle of interests over what the policy on the central platform should be. Whereas with Nostr, every relay can have its own policy.
In terms of accountability for things like losing data, if I can rely on three services at the same time, it's too bad if one of them fails, right?
Or if I get banned from one, or whatever arbitrary thing occurs, it doesn't matter. I no longer singularly rely on that particular service, the way I do in the context of these platforms. Now, if we go into the criminal, nasty stuff:
most people don't like it, right? So most people are going to have an interest in barring that stuff, for the most part.
The thing is, [00:11:00] ultimately these things happen in the physical world, right? These are physical crimes by physical people in physical locations, at particular moments in time; that is where these crimes fundamentally occur. So I'd much rather we put more focus there, on catching the bad guys, because that ultimately solves a lot of these issues.
It's also a fact of life that these people will always have some capacity. If the internet didn't exist, they would do it over telephone lines, and if telephone lines didn't exist, they would do it over physical mail.
We ultimately have this trade-off: yes, okay, there are things that are going to happen that we don't like, but there are also fundamental reasons, in relation to liberty and to the resilience of the system itself, why we want it to
be censorship resistant. I like to frame the internet itself as a piece of military technology, which originally it was; it was developed out of the military in order to [00:12:00] create a resilient communication network. And because we've integrated the internet so much into our
daily lives and into society as such, we want that resiliency, right? We don't want a fragile system that falls down at the least bit of pressure. We want the system to be resilient, because we rely on it so much. So that's one of the reasons we make the trade-off for censorship resistance.
I have my reasons for opting for a censorship-resistant infrastructure, accepting that we're never going to ban out the bad stuff, and also under the assumption that we weren't going to ban out the bad stuff anyway.
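The per-relay moderation described here, where each operator decides what their own server hosts, can be pictured with a toy acceptance policy. Everything below is hypothetical: the Nostr protocol itself specifies no policy at all, and real relay implementations each have their own configuration formats.

```python
# A toy relay-side acceptance policy. All names and rules here are
# illustrative, not part of any real relay software; the protocol
# leaves policy entirely up to each individual operator.

BLOCKED_PUBKEYS = {"deadbeef" * 8}   # authors this operator refuses to host
MAX_CONTENT_LENGTH = 10_000          # an arbitrary local size limit

def accept_event(event: dict) -> bool:
    """Return True if this relay is willing to store the event."""
    if event.get("pubkey") in BLOCKED_PUBKEYS:
        return False
    if len(event.get("content", "")) > MAX_CONTENT_LENGTH:
        return False
    return True

# A rejected author simply publishes to other relays instead; no single
# operator's policy governs the whole network.
print(accept_event({"pubkey": "deadbeef" * 8, "content": "hi"}))  # False
print(accept_event({"pubkey": "cafe" * 16, "content": "hi"}))     # True
```

The point of the sketch is the locality of the decision: the function runs on one operator's machine and binds no one else, which is exactly the "every relay can have its own policy" claim above.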
Paula (Host): And I would agree with you, and I want to be very clear that I'm not making an argument for government surveillance, because I am also very much against that.
But if we take crime and punishment out of it and we're just talking about someone and their safety: are you really taking power away from these conglomerates, or are you just [00:13:00] redistributing it to a number of people who operate these servers?
I mean, you mentioned there are tools. What do those tools look like? Because if it's just my kid on the internet, you know, are these accessible tools?
Wouter Constant: So again, I think there are a couple of things working together here.
First, let me try to explain this architecture thing I mentioned earlier. What actually happened with the internet is that it collapses time and space: things are practically instant, and as a result, space doesn't matter anymore.
So everybody in the world goes to Facebook, right? Across the entire planet, we're all congregating in the same place.
And the reason we do that is that it subsequently creates a network effect. Now most of the people are there because most of the people are there. What Nostr says is: okay, time and space collapsed on the internet, so there is no reason for us to all go to the same place.
The logic of Nostr is that everybody is where they are, wherever that happens to be, because [00:14:00] I can go there in an instant anyway.
I can just go to the place, wherever it is. Those relays, those servers where the content is hosted, are what we call trust-minimized, or trustless.
We don't rely on these servers for the authenticity of data. The only thing we rely on these servers for is hosting data; they need to make it accessible to me, and it doesn't matter what place it comes from.
This is different on the platforms. An Instagram post is only an Instagram post because it came from instagram.com. Say you have an account on Instagram named Paula: it is Instagram declaring that that account is you, Paula, and that those posts are your posts.
Maybe some employee of Instagram could, in their systems, make up a whole bunch of different posts and pretend that you posted them, and that would carry the same declaration of legitimacy as your real Instagram publications under your name, even though you never published them. So we're trusting these platforms for these kinds of things.
Whereas with Nostr, we [00:15:00] don't. We only use these relays to host content, in such a way that it doesn't matter anymore which relay I'm using. I can be using three of them today, and tomorrow I could use three different ones. The only thing I have to do is inform the people who follow me that I changed, right?
But because time and space collapsed, it's just as easy for them to go to wherever my stuff is as to go to one singular place.
There's no need for me to fight a battle over their accountability for hosting my data or not, because I can just as well go to another place. Then, separate from that, there is this notion of your kid, and I think with children we're running into something interesting; it's something I'm also working on.
What I'm describing with the internet, as well as with Nostr, is this notion of permission: these systems are permissionless. You don't have to ask permission to get on the internet or become part of the internet. In the same way, you don't have to ask permission to start using Nostr and [00:16:00] publishing on Nostr, et cetera.
The thing is, you're still responsible for whatever it is you're doing, regardless of the fact that you didn't have to ask permission in advance to do it. It also happens that we have a large part of the population that inherently is not responsible, or at least that we don't deem responsible: children.
So there is a bit of a disconnect here between children and the permissionless nature of these systems. That's the reason I'm trying to develop a permissionless method to create permissioned environments. What I'm working on is building the tools and the means for you, as a parent, to construct, in a practical sense, safe and adequate digital environments for your children.
But again, asking platforms to figure that out for us is not going to work, because the responsibility of carrying that for the entire globe is just too high.
Paula (Host): Yeah.
Wouter Constant: And they're failing already, so.
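What makes the trust-minimized relay model above work is that a Nostr event's identity comes from its content and its author's key, not from the server that delivered it. As a rough sketch of how that is computed (following the NIP-01 serialization; the Schnorr signature step is omitted since it needs a secp256k1 library, and the key below is a made-up placeholder, not a real key):

```python
import hashlib
import json

def nostr_event_id(pubkey: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    """Compute a Nostr event id per NIP-01: the SHA-256 of the compact
    JSON serialization [0, pubkey, created_at, kind, tags, content]."""
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),  # no whitespace, per the spec
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Hypothetical example event with a placeholder key and a fixed
# timestamp so the id is reproducible.
pubkey = "a" * 64          # 32-byte public key, lowercase hex
event_id = nostr_event_id(pubkey, 1700000000, 1, [], "hello, nostr")
print(event_id)

# If a relay altered even one character of the content, the id would no
# longer match (and the author's signature over that id would fail), so
# clients can detect tampering no matter which relay served the event.
tampered_id = nostr_event_id(pubkey, 1700000000, 1, [], "hello, nostr!")
print(event_id != tampered_id)  # True
```

This is why switching relays is cheap in the model Wouter describes: the relay only stores and forwards; it never vouches for authorship.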
Paula (Host): Sure. Yeah. I don't think what we're using now is working. [00:17:00] So I like the idea of something decentralized, or at least something that thinks outside the box and outside of that sort of corporate control.
Like I said, I do have these conversations with my son: this is an open server, and if someone says he's an eight-year-old who lives
down the street, that doesn't necessarily mean it's true. And I know organizations that are starting to run these kinds of trainings for parents. For me, it's not so much that I'm putting the responsibility on companies
to keep bad actors out, because I don't think you can do that. The problem is that there has been this increase in bad actors on open servers targeting children, and
when you have a server operator like Roblox, you can at least trace who they are, versus a decentralized space where you can just create an identity that cannot be tracked. For me, the accountability we have is [00:18:00] imperfect, but in the situation I'm describing, with a decentralized, anonymous user hiding behind an avatar and an IP address, I don't know what accountability is possible at all.
Wouter Constant: Okay, so this boils down to our ability to create spaces that are safe and adequate.
Do we expect predators to log in with their passport onto these systems to prey on little children? Of course not; they're not going to do it that way. If you want to use guarded environments, Nostr gives you all the tools and means to create such guarded environments.
That's not the issue. The difference is whether you, as a parent, or maybe a group of parents, a business, an NGO, whatever type of organization, are able to create such an environment.
Whereas now we're all looking at Meta, right? We're all looking at a handful of companies to do this stuff for us, because we're not able to do it [00:19:00] ourselves. The only thing we're able to do is ask Meta or these other platforms: please give us the means, please give us the tools, to do this stuff.
Paula (Host): Yeah, and I would say, like you said about holding Google accountable, it's not exactly like we're moving the needle on that end either. Where we entered this conversation is that a lot of the corporations themselves are bad actors, right?
So I guess: how do we get to a better internet? Let's stay in that realm, but not just the technology; this interpersonal culture and these communities too. What do you hope we get right, and what do you worry we might get wrong?
Wouter Constant: This is interesting, because we're faced with a lot of different challenges, and one of the most glaring is going to be the AI stuff, right? The ability to differentiate between what is real and what is fake is basically out the door already.
I have no idea if you even exist, Paula. [00:20:00] I see something on my screen and I hear a voice, but you could be a robot. All of that is completely possible. This is a big problem. What Nostr does is allow people to directly interact with each other, relate to each other. The fundamental difference between Nostr and the platforms we have right now is ultimately that you are responsible for your own identity.
The bottom line: if something goes wrong and you lose access, there's no help desk you can call.
Even worse, if somebody steals your identity and you no longer have access to it but they do, that's the worst-case scenario; there's a big problem there. What ultimately is the answer, to that problem, to the AI problem, to a large part of the child-safety problem, and to a lot of the things we're dealing with on the web, is social networks.
There is a reason I have a relationship with you: we're doing this recording, [00:21:00] and I have enough sense of who you are. So I trust you to some extent, right? And maybe there are a whole bunch of people I've shaken hands with, physically shaken hands with.
So I know these people, right? I know they are not robots, at least in a direct sense. So we are able to construct these social networks more directly than we did on the platforms, where the platforms facilitated all that stuff.
That should also give us heuristics to judge our environments, to control our environments: what do we allow into our field of view? And the closer you keep that to yourself, the safer it is. But there's more new stuff out there, right? Maybe there's some Chinese man who's very interesting to me but very far removed from my general social circle.
So I'd have to venture out there, and maybe I get a few bots in between, in order to contact that person. But I think ultimately, at the core, it's these [00:22:00] relationships we have. We're able to digitize the societal relationships and organizational structures we already have in society. These relationships are now directly expressed as cryptographic relationships that we can verify ourselves, without relying on a third party to verify for us. From there we are able to create bottom-up networks that are hopefully interwoven enough
to be resilient enough to tackle a lot of the problems we're discussing right here. I hope we're able to get that right.
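The "closer you keep it to yourself, the safer it is" heuristic can be made concrete as social distance over a follow graph. The graph below is invented purely for illustration; on Nostr, follow edges would come from users' signed contact-list events, so each hop is a claim you can verify cryptographically rather than take on a platform's word.

```python
from collections import deque
from typing import Optional

# A toy web-of-trust heuristic: trust decays with social distance.
# This follow graph is entirely made up for the example.
follows = {
    "me":    ["alice", "bob"],
    "alice": ["carol"],
    "bob":   ["carol", "dave"],
    "carol": ["eve"],
}

def social_distance(start: str, target: str) -> Optional[int]:
    """Breadth-first search: the number of follow hops from start to target."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == target:
            return dist
        for nxt in follows.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None  # unreachable: no chain of follows connects them

print(social_distance("me", "carol"))  # 2 hops: closer, easier to trust
print(social_distance("me", "eve"))    # 3 hops: farther out, more caution
```

A client could use a distance like this to rank or filter what enters your feed, which is one way the bottom-up networks described above might push back against bots without any central arbiter.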
Paula (Host): Yeah. And getting back to whether this is a socio-cultural question versus a technological one: from what you're describing, two things need to happen, right?
There needs to be a moment that no one ever took, where you step back and say: this is the reality of our social interactions right now. What does that mean? What am I doing, and what am I doing for my kids? Where is my responsibility, and where is theirs? So I think there needs to be a [00:23:00] moment of recognizing where we are socially and how we interact with each other.
And then I also think there needs to be a technological revolution where you arm yourself with the tools to participate online. And my question there is about accessibility, and whether the system can be neutral if there's a great disparity among the people participating in their access to that technology and their understanding of it.
Wouter Constant: Here's where Nostr scores points, Paula. The crux is the network effect, right? The reason we're on these places is that most other people are also on these places, and that also means that if you're not on these places, because you're banned or for whatever reason, you are now marginalized.
And the point of Nostr is to give people censorship resistance along with that network effect. So it doesn't matter where you are [00:24:00] anymore; you can still interface with everyone else, with as low a barrier to entry as if you
were all on the same platform. The nice thing is that this also means you can use whatever app you want. Because right now, you're stuck with the app from the platform, right? You have to download the Facebook app or the TikTok app or whatever it is, and
there are a number of problems there. The most obvious is all the signing-off you have to do: they're going to scrape your entire phone, and maybe you have to sign a blood contract for the soul of your firstborn or whatever it is. People just click yes, because they don't really have an option.
And the reason they don't have an option is that otherwise they marginalize themselves from the network effect. So this is one piece, right? You now do have a choice.
So if an app demands this stuff from you, you just say, no, I don't want to do this, because I have ten other options that don't demand it. Then, in terms of accessibility: some apps might be [00:25:00] very feature-rich and demand a very powerful computer, a very powerful phone, or a very good internet connection, but you could also develop an app that isn't as powerful but is very optimized to run on old hardware, slow computers, or a slow internet connection, that type of thing. And you can create different interfaces: most interfaces assume that people have two hands and ten fingers and good eyesight and good motor function and all this stuff.
And if you don't have these things,
you're going to need different interfaces. I can imagine there is a large variety of people with disabilities who are not being catered to, because they're just too marginal a market.
Within a Nostr context, the ability is there for those apps to be created, whether it's the interface or the hardware that's accessible to them, or the platforms they're using, right?
It all interfaces with this open protocol, this open standard. It's not tied to one particular platform that dictates that you have to use a particular app.
Paula (Host): Yeah. [00:26:00] And I will say, there's this great documentary, The Great Hack, about people not reading the terms and conditions, and then all of a sudden someone has 50 data points on every person in the United States and can manipulate them that way. So I do appreciate that the guardrails we think we have in place aren't necessarily there either.
I think that's an interesting, uncomfortable thing to sit with. So I think we've reached the gray, if you will, Wouter. Where can people find you? And you have a book coming out, or you're in the process of writing it?
Wouter Constant: I'm in the process of writing it.
So if you're interested, you can go to nostr.com. Nostr is an open system: there's no owner, no organization behind it, no business or anything. It just so happens that some guy who likes Nostr owns nostr.com and created a nice, open information repository there where you can learn more.
And if you're interested enough, you'll end up downloading some piece of software onto your phone or your computer or whatever it is. You can go from there, and maybe we'll meet each other; I go on there as, uh, Constant.
But technically you would need to know my public key in order to be sure that it's me.
Paula (Host): The real person, not the robot.
Wouter Constant: Right. Right.
Paula (Host): Sure. And they get that public key.
Wouter Constant: People have to figure out what the right place is, and what kind of way to interface with each other, given the context, because right now we're all converging on this TikTok doom-scrolling thing. That's a very weird way of interfacing, because you're not interfacing with each other anymore.
That's not what you're doing. You're just consuming a feed that somebody else constructs for [00:28:00] you, and you don't have an option there. You just have to consume that feed. And with Nostr, you do have options.
Paula (Host): Yeah, and I do appreciate that. I mean, I think that's the point of the show, right?
To get out of that feedback loop and have conversations with real people about real things and lived experiences rather than ideologies. I love that phrase you used, context-collapsing tweets. That's perfect; it's a great way of describing our state of communications right now:
they're all context-collapsing. That's great. Footnoted: context-collapsing communications. All right, well, thank you so much. I really appreciate you joining the show.
Wouter Constant: My pleasure.