API Intersection

This week on API Intersection, we welcomed Varun Singh, Chief Product & Technology Officer of Daily, a company that powers real-time audio and video for millions of people globally. Their user base consists largely of developers who use Daily's APIs and client SDKs to build audio and video features into applications.

We delved into the significance of APIs in the video streaming industry and the crucial design considerations when integrating with such systems. Varun shed light on the challenges that APIs address in the video industry, such as achieving low latency in live streaming and real-time communications and efficiently managing multiple streams while ensuring a seamless user experience, especially when scaling to accommodate large numbers of participants.

"Previously, video streaming was primarily focused on platforms like Netflix and YouTube. However, with the pandemic, real-time video communications have become more prevalent due to the surge in video calls, which means more and more APIs being created to support that," shares Varun. 

Find Varun and his team's work on LinkedIn and at Daily.co.
_____
To subscribe to the podcast, visit https://stoplight.io/podcast

--- API Intersection Podcast listeners are invited to sign up for Stoplight and save up to $650! Use code INTERSECTION10 to get 10% off a new subscription to Stoplight Platform Starter or Pro.

Offer good for annual or monthly payment option for first-time subscribers. 10% off an annual plan ($650 savings for Pro and $94.80 for Starter) or 10% off your first month ($9.99 for Starter and $39 for Pro).

What is API Intersection?

Building a successful API requires more than just coding.

It starts with collaborative design, focuses on creating a great developer experience, and ends with getting your company on board, maintaining consistency, and maximizing your API’s profitability.

In the API Intersection, you’ll learn from experienced API practitioners who transformed their organizations, and get tangible advice to build quality APIs with collaborative API-first design.

Jason Harmon brings over a decade of industry-recognized REST API experience to discuss topics around API design, governance, identity/auth, versioning, and more.

They’ll answer listener questions, and discuss best practices on API design (definition, modeling, grammar), governance (multi-team design, reviewing new APIs), platform transformation (culture, internal education, versioning) and more.

They’ll also chat with experienced API practitioners from a wide array of industries to draw out practical takeaways and insights you can use.

Have a question for the podcast? DM us or tag us on Twitter at @stoplightio.

I'm Jason
Harmon and this is API intersection

where you'll get insights from experienced
API practitioners to learn best

practices on things
like API design, governance, identity,

versioning and more.

Welcome back to API Intersection.

As always, I'm your host, Jason Harmon,
CTO at Stoplight.

It's been a little bit
since I've said the phrase "today

we're going to do something
a little different,"

because I felt like it was beaten to death.

But we really are doing something that, in two

plus years of the podcast,
we really haven't gotten into.

I think we talk a lot
about, kind of, REST and GraphQL,

if you will,
and all these sort of synchronous APIs.

Sometimes we talk about event
driven stuff, but today we're going

to really get into kind of real time
and sort of streaming protocol stuff

with quite the expert on the subject.

Mr. Varun Singh.

Varun, thanks for joining us.

Thank you.

Thank you for inviting me.

Jason. Absolutely.

So, you know,
Varun had kind of come to us,

you know, talking about coming on and
it was a really easy one to say yes to.

And we kind of looked and he's got 12 IETF

and W3C specifications, numerous patents.

So when it comes to kind of real-time
streaming, I think we got the guy here.

Tell us

a little bit about kind of your journey
in this streaming stuff

and what put you in your current role
as Chief Product and Technology

Officer at Daily.

Thank you very much.

And yeah, it's been a journey.

I started, you know, about 20 years ago,

straight out of school
working for a semiconductor company.

And at that time
the whole rage was megapixel cameras.

If you remember, cameras, phones became
smartphones, smartphones carried cameras.

People started to take pictures.

And that was juxtaposed with Web 2.0
emerging in the same period.

I used to work for a company
called STMicroelectronics,

and then at Nokia, both of those roles,
I was working on cameras

and the camera hardware
and the software that goes with it,

and that's how I got

into the space and my background before
that was like communications.

I came from an electrical engineering
background and such

and, you know, as history played out,
like 15 years ago,

we had this issue with Nokia
being, like, outcompeted by Apple and such.

So at that time I kind of left
the industry to pursue a Ph.D.

and I started to work on real time
communications

because that was the evolution
of like taking cameras

from taking still pictures
and recording and storing them

locally on your device,
to live streaming

or communicating with another person
across the Internet.

So things like Skype
and what really changed like 15 years ago

was the insight that we've been building
real time communications in silos.

So you had the Ciscos, you had the Skypes,
you had the Microsoft Messengers.

They were all building these stacks
independently,

trying to solve
the same problem repeatedly.

And people envisioned that for

Web 2.0 to achieve
its full success,

it would have to be able
to, like, work with these protocols

so that anyone could build.

Like right now we are using Zencastr,
you know, like the fact that we are able

to communicate, and a product like
Zencastr does not have to rely on, like,

Skype or Zoom or anything.

It can actually use
some of the native APIs available.

So the question became like,
what is the big gap between like

these siloed solutions, and how to make it
accessible through Web APIs?

So that was the work that happened
ten years ago,

and that's when the IETF documents

and the W3C specifications came out:
the industry got together.

You know,

all the biggest companies

said, okay, we've solved these problems
repeatedly by ourselves,

of course, with cross-pollination,
how do we make it accessible?

I eventually graduated with my Ph.D.

around the same time
when the protocols came to light,

I built a company called callstats.io.

You can think of it as, like, analytics
for real-time communication.

So just like Google
Analytics is for the web, callstats.io

was for real time. And we ran the business
for, uh, seven, eight years.

Eventually we exited
just before the pandemic.

So maybe not the best timing.

You know, we've been,

you know, trying to grow this product
for four, five, six, seven years.

And, you know, if you're venture backed,
there's a big emphasis

to actually like keep doing
triple, triple, double, double.

We did a couple of triples and doubles,
but at some point we flattened out

because the market kind of matured
and we couldn't see it grow

faster.

And then the pandemic happened
and like everything grew.

That brought me to my current role
at Daily

Daily was a customer of callstats.io
for several years, but

because of the pandemic,
it fueled its growth.

And Daily is basically an API for real-time
communication

for the widest set of use cases
that you want to build on video.

And Daily is a third-gen

iteration.

So the first gen was ten years ago,
when real-time communications

first appeared, or a bunch of companies
that started then.

Then five years later,

the second gen of companies, and then
the third gen, which is Daily.

And I think the big difference
between the third

and the first gen was like developer
tools.

Ten years ago there was no React,
there was no React Native;

people were writing vanilla JS.

So there's the type of APIs built
for that period of time versus today,

where you have universal apps,
you have Next.js,

you have React Native,
you have Android and iOS. You want to

be able to satisfy all these things,
and people want to use, you know, React:

they want to use components,
hooks, all that stuff, or use

the lower-level API directly.

So as
a product company which builds APIs,

we have to satisfy a

slew of things, depending on, you know,

where your code base
is, or, like, "oh, I want to control

every aspect of my experience,"
both ends of the spectrum.

So that's how I got into my current role.

Very cool kind of progression.

I like that connection
of like how videos changed

as sort of the mobile world
and the web has changed, but

full disclosure here
I am not a video or streaming guy.

I don't know much about this, and I'll ask
you like a million dumb questions today.

And my hunch is we have a lot of
listeners in the same boat.

That's perfectly fine.

So I guess the first one is
are we talking about like the stuff that,

you know, when I watch Netflix or YouTube,
is it that? Or is it Zoom? Or is it,

you know, what kind of sort of streaming
things are you talking about?

That's a good question.

And I think the way we think about it,

there's canned media, or stored media,
which is what Netflix is all about.

Like, you know, you're watching a show,
the show exists somewhere.

They can actually use things

like content delivery networks, CDNs,
to push it as close to the edge as possible

so that you, as an end user,
get great performance.

That's one aspect
and that's what we call stored media.

It's been happening
for almost 30 years now.

The technology has improved so vastly
that you can actually make very quick

changes in quality

which are not perceivable to the human,

even if you have a network issue.

So that's a more or less solved problem.

There are a bunch of companies
that try to do that.

A lot of the magic is in the content
delivery network.

So, video content delivery networks.
The second one is live streaming.

So, you know, we just had the NHL playoffs
and the NBA playoffs this last week.

And if you're watching
something like that, I think there

the biggest thing is that it can't be stored,
because you're watching it live.

And then the question
is how far behind real time can you be?

It's not really,
really live, though it can be;

most often it's

a couple of seconds behind,
partly, you know, for legal reasons,

but also because they
want to distribute it widely.

That's called live streaming.

Typically today, most of their technology

is 5 seconds behind.

So basically, if a goal happens,

you can see it on your screen
5 to 10, 15 seconds behind real time.

The main notion there is that
you shouldn't be screaming "goal"

before your neighbor does, or vice versa:
your neighbor shouldn't be screaming

"goal" or "score" before you do.

So that's like the biggest challenge
in that framework.

And the third one
is real-time communications,

where you are seeing something live.

And what we've seen over the last
ten years is that we started with two people

in a call, like, just like Skype
and whatever came after: two people

having a conversation.

And then people said, oh,
we want to use it in a work setting.

Can we have five or six people
in a call? During the pandemic

that became, you know, town halls, 15
people, 30 people, and now we routinely

see like tens of people in a call.

The question
now becomes, for all these use cases,

like the live streaming
I talked about before, where

there are millions of people
watching, like, the game:

can you do that in real time?

Can we use the protocols that we use
for like having real time communication?

And here
we're talking about 200 milliseconds of

delay. Like, between you and I,
we're probably tens of milliseconds apart,

because we
are roughly in the same part of the world.

But max, like, you know, if you're halfway
around the world, the speed of light:

you can't go faster than the speed
of light in cables and all those things.

So, you know, you're about 300,
400 milliseconds behind,

but still good for interactivity
in case you want to raise your hand and,

you know, like if you're having a big town
hall, you know,

10,000 people watching the CEO speak
and someone wants to ask a question,

you should be able to raise your hand,
get on the floor through the green room

or whatever, and ask the question live
and then hear the answer.
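As a back-of-the-envelope check on those latency numbers: light in fiber travels at roughly two-thirds of its speed in vacuum, so even a perfect path halfway around the world has a hard floor of about 100 ms one way. A small illustrative sketch (the constants and the 2/3 fiber factor are common rules of thumb, not figures from the episode):

```python
# Lower bound on one-way latency imposed by physics, ignoring routing
# detours, queuing, and encode/decode time.

EARTH_CIRCUMFERENCE_KM = 40_075
SPEED_OF_LIGHT_KM_S = 299_792
FIBER_FACTOR = 2 / 3  # light travels at roughly 2/3 c in optical fiber

def min_one_way_latency_ms(distance_km: float) -> float:
    """Best-case one-way latency over fiber for a given distance."""
    return distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR) * 1000

halfway = EARTH_CIRCUMFERENCE_KM / 2  # ~20,000 km
print(f"one-way floor, halfway around the world: "
      f"{min_one_way_latency_ms(halfway):.0f} ms")
# Real paths add routing, queuing, and codec delay on top of this floor,
# which is how you end up near the 300-400 ms Varun cites.
```

So the ~200 ms budget for interactive conversation is already a large fraction of what physics allows over long distances.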

So those are the use cases
that a lot of people are aspiring for.

And I think the lines
between these three use cases,

the canned media, the live streaming

and real-time communications,
are becoming blurred.

People want to use
not three different protocols

or three different technologies
or three different vendors to do it.

People want to roughly use one API,
one vendor. And does it matter?

Should it matter that I'm connected to you
in real time or I'm connected to

like a sporting event, or
getting something from Netflix?

Like, basically
the API should be: I'm playing video.

The source could be,
you know, live or real time.
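The "one API, one vendor" idea Varun describes here can be sketched in a few lines. Everything below is hypothetical: the class names, latency targets, and URLs are illustrative, not Daily's actual SDK.

```python
# One playback API regardless of source type; the kind of source
# (canned, live, real-time) is a property, not a different API.

from dataclasses import dataclass
from typing import Literal

SourceKind = Literal["canned", "live", "realtime"]

# Rough latency expectations from the conversation: canned media is fully
# buffered, live streams run ~5 s behind, real-time calls aim for ~200 ms.
TARGET_LATENCY_MS = {"canned": None, "live": 5_000, "realtime": 200}

@dataclass
class VideoSource:
    url: str
    kind: SourceKind

class Player:
    def play(self, source: VideoSource) -> str:
        # One entry point; the calling code never branches on transport.
        target = TARGET_LATENCY_MS[source.kind]
        label = "buffered" if target is None else f"target {target} ms"
        return f"playing {source.url} [{source.kind}, {label}]"

player = Player()
print(player.play(VideoSource("https://example.com/movie", "canned")))
print(player.play(VideoSource("wss://example.com/room", "realtime")))
```

The point of the sketch is that only the latency expectations differ per source kind; the caller's code path does not.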

Yeah,
I was going to ask that too. Looking through

some of your IETF work and these things,
I saw a lot of, like, RTP, WebRTC, things

like this. In these kind of categories,
what are sort of the common protocols,

and where do you see that
things are perhaps converging?

It's a good question, and I think
everything that is old becomes new again.

So RTP is like 30-year-old technology,
and it started

for broadcast, or multicast.

And if you know

your IP addresses, you know there's a
set of multicast addresses at the end

which are mostly not accessible,
because the ISPs don't

allow that kind of transmission. Like, if you're on
Verizon, Verizon will not allow, like,

multicast beyond its boundary,
partly for security reasons,

but also because

they don't want to send media outside
and then get media from outside.

So that was the era of multicast.

And a lot of the design decisions
we made for RTP were based on multicast.

However, the protocol was built in 1992,

and by 2001, 2002,
multicast was kind of dead.

It was only used in IPTV.

So your cable TV
network is probably

still using multicast to
send your cable TV over cable.

But you know, it's coming
directly from your provider,

and that's typically the company
you bought the Internet from anyway.

So that still exists and still continues.

And like I talked about earlier,
MSN or Yahoo

or all these messengers from,
you know, 20 years ago,

they all actually used RTP.

Even Skype at some point

started to use RTP, because that's
what all the engineers knew, right?

Like, if

you move from one company to another,
you could learn a new protocol,

but it'll just take
you like two years before you're

competent in that new thing,
and you're just re-learning something.

So, like, RTP is the standard
underneath WebRTC.

And like, people talk about Web RTC
both as a protocol

and as an API,
and they use them interchangeably.

So there are two aspects of Web RTC.

One is the API, which was standardized
and implemented by the browsers.

Of course, you know, the browser
implementation

is called libwebrtc; it's inside
Chromium.

It's also inside
like all chromium derivatives.

So Brave, Edge, all these browsers
have it. And then Mozilla

has their own version of WebRTC,
and Safari has a

somewhat different version,
but the API is a standard,

so it doesn't matter which browser you use.
Or if you take that library

and compile it into Android and iOS,
you still have like the same APIs,

because the Web APIs
have a corresponding C++ layer.

So you can call these APIs anywhere.

And then APIs in
this case are not sufficient

because there's a protocol that like sends
the media over the Internet.

And that's the WebRTC protocols,

and the WebRTC
protocols are actually all old protocols.

As I said before,
you know, all these companies, Cisco,

Skype, had their versions of their product,

and they're all slightly different
versions of RTP with extensions.

You know, whether you're doing

two cameras or one camera,
no one uses the standard protocols

as-is, because there's always something

that they need to do extra.

Either their codecs are different,
like H.264 versus H.265.

So back in 2010,
when we decided to work on this,

people realized that there was
a lot of competence across the industry,

but no one had a standardized way
of doing certain things.

So the whole thing about
Web RTC protocols is just

the whole industry coming together
and saying, like these variants

and these extensions of the protocols
will do these things.

And then that meant like you now
had Cisco, WebEx, Zoom,

all of these folks using roughly
the same kind of things behind the scenes.

Yeah, it's funny, as I was kind of
skimming around trying to not sound like

a total idiot on this subject and know
vaguely what you're talking about.

I was getting flashbacks of, like,
RealPlayer and some of these old things

that were using all this stuff.
Yeah.

Real player is a good example
of like all the things becoming new.

So there was a thing in RealPlayer
called RTSP, the Real Time Streaming Protocol,

and it uses RTP underneath. WebRTC uses
RTP underneath too.

So RTP by itself
as a unit block is quite versatile.

Of course,
it was built for a different time.

And with all things becoming new,
there's now a whole new initiative

called Media over QUIC.

And QUIC is like this new protocol

that runs over UDP instead of TCP,

and HTTP/3
uses QUIC, for example, too.

And there's a whole reason
why we use UDP now

instead of TCP for all these things.

So people are reimagining
the world of RealPlayer,

which actually had real-time
streaming over

UDP, and saying, oh, can we bring back these
old things 20 years later?

And WebRTC already
does UDP. So can we marry this whole world,

where almost all of the web
would work over UDP, and real time

and everything would work over
UDP underneath,

somewhat optimized compared to TCP?

But, you know, TCP's demise is not complete.

Uh, yeah.

I love going back and watching

computer scientists
from like the sixties, seventies
talk about ideas and things they had
and you realize there's nothing new.

Everything's reinvention.

Everyone thought of everything
a long time ago.

It's just taken a while for things
to mature, and maturity usually just means

going through two or three life spans
of iterating on the same thing.

So sounds familiar from my world.

Yeah.

And although I would say
what's changed from the sixties

is how easy it has become
to, like, build these things for sure.

Yeah. Developers are super powered now.

Spoiled.

I remember back in the nineties, man,
like you want to learn something,

you got to buy a book,
you got to buy software.

Now everything's open source, freely
documented, wonderful

languages.

I probably have no excuse

for not knowing this video stuff,
but I just never had a job doing it.

Yeah,

I guess, as things are
kind of coming together

these days and perhaps converging a bit,
what do you see as kind of the big challenges?

I mean, I look at like these reports
saying like, you know, 80% of the Internet

traffic is video streaming and stuff
like this, that it's just like

there's got to be crazy challenges
associated with that.

Yeah, I think ten years ago
it was all about streaming.

Things like Netflix
and YouTube were like the 80%.

The pandemic shifted that. You know,
like the number of video calls

we have every day:
when we were in the office,

or even before, people did
maybe a few calls a day.

Some people did one a week
or further apart.

But now,

you know, everyone's doing
at least one or more calls per day.

So real-time video

is more prevalent now.

I am going to go back to the three
use cases

the canned media,
live streaming and real time.

I think the biggest challenge is that
people are thinking of newer experiences

where they're thinking about
like how you and I are having this

interaction while, you know,
there could be a hundred thousand people

watching this and people think of Twitch
being like one way to do this

and people like, Oh,
how do I bring Twitch to my community?

I don't necessarily want to use Twitch,
but I want to like

have my experience,
which does sort of roughly the same thing.

But I also want like people
to have interactivity,

like, you know, be able to clap
or do emotes.

And if you just look at the announcements
from WWDC last week, like, you know,

Apple came out

with like screen sharing,
large-screen sharing: like, your video,

you have your slides, you can go back
and forth between that,

you have a TV, Apple TV, and you can use
your camera like as a device.

And then you can do SharePlay.

So there are so many such use cases

that people are thinking about,
and Vision Pro is coming in.

Now people talk about the metaverse
at length, for both work and home use cases.

And so video is just going to,
you know, everyone's going to have videos

and this is outside of, like, all the Ring

devices that we have, or the doorbells

and the surveillance stuff
that's happening, pet cameras and such.

So video is just, like, immense.

And depending on what slew

of use cases you look at, you just see
video

emerge more and more across the board.

So the biggest challenge is like
if I'm thinking of

like a use case
that goes beyond just the solitary one.

So like, if I'm just doing surveillance
with the doorbell, you know, you say, okay,

that's maybe one use case. But suddenly,
you know, ding, the doorbell:

you're not at home,
you're at work, you have to open the door.

Now, you want to have
this interaction.

It's not just like recording surveillance,
but it's also like a real

time interaction.

And sometimes now with ML,
people are like, hey, I want to be able

to detect, you know, someone
passing by, trigger a bunch of alerts,

send a thing, maybe patch the user in

to talk

to this person,
who may not have rung the bell as yet.

Right. So

in that case, you're thinking of these
as three different technologies

or two different technologies. Our thesis
is that it should be one technology.

It could be, you know,
several vendors; like, it

shouldn't have to be several vendors,
but the vendors

need to be thinking about, like,
how do I solve this

problem without the
developer having to think too much about it?

So like I said earlier,

just as a callback to one API:

basically, it shouldn't matter whether
I'm talking to, you know,

like a server recording,
or basically an ML algorithm

which, like, finds the thing
that triggers a communication,

or being able
to get the person into the call.

All of that, from an engineering
perspective, should be hidden.

And that's how we get

accelerated into like people building
more robust, unique use cases.

Because if everyone's trying to solve
like what I call protocol shenanigans,

like you go from one protocol

to another, then, you know,
they don't always work together.

Then you spend a lot of time debugging

and trying to like
make them kind of massage them into place.

And that can be the same with multiple
vendors.

You're like, Hey,
this thing has a different state.

It gets triggered at a different point.

The state machines don't match.

How do I make the state machines match
so that I can make them work together?

I think those are the biggest
challenges and

our perception is that, you know,
having very simple APIs that solve

all these problems under the hood
kind of takes away some of that

mental gymnastics that you

might have to do to, like, kind of
get these things to work together.

Yeah, it was easy enough to see that,
kind of surveying this stuff.

I mean, even just video encoding

formats, it's mind-boggling
how many of these things there are.

It's just a topic
I hadn't paid attention to in a long time.

I was curious though, you know,

in the more kind of synchronous
and like event driven things,

there's always this challenge of
how do you kind of design how these
things are going to work or look at scale
when you have lots and lots of them

so that people can find the right thing,
understand how to use it.

Are there sort of common issues like that
in the streaming world?

I mean, it's kind of weird
because it's like you're basically

just grabbing whatever video comes at you
and throwing it the other way,

in some sense. But,

you know, when you
think about like starting from scratch,

providing a developer experience
for integrating with these things,

what are the design challenges?

Yeah, if you come from an
HTTP model of thinking,

it's slightly different,
because in HTTP the resource

needs to be, like, replicated
in many places, right?

Like you basically say,
oh, someone, let's say

even in the case of Facebook

Messenger or whatever, like the thing is,
someone writes a message,

you push it to the web
server, the web server replicates that

in one or more places,
in databases and caches, so many places.

And then someone who's reading it,
like, pulls it.

And the design problem there is like,
how quickly are you,

you know, like ingesting this data
and how quickly are you like

exposing this information
like a new message, right?

So like you have pub/sub
as a design parameter, where you're like,

I'm publishing information,
I'm subscribing to this information.

And if the system knows
that there is a live subscription

on this publish, then there's a fast track

to like

kind of make this message
go faster to that subscription

while you're still also putting it
to disk.
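That fast-track idea, pushing to live subscribers immediately while also persisting for later readers, can be sketched as a toy in-memory pub/sub. This is purely illustrative; a real system would persist asynchronously and handle delivery failures.

```python
# Toy pub/sub with a "fast path" (push to live subscribers) and a
# "durable path" (append to storage for later readers).

from collections import defaultdict
from typing import Callable

class PubSub:
    def __init__(self) -> None:
        self.storage: dict[str, list[str]] = defaultdict(list)  # durable path
        self.live: dict[str, list[Callable[[str], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[str], None]) -> None:
        self.live[topic].append(handler)

    def publish(self, topic: str, message: str) -> None:
        # Fast track: if there are live subscriptions, deliver immediately...
        for handler in self.live[topic]:
            handler(message)
        # ...while the message still goes to storage for later readers.
        self.storage[topic].append(message)

    def read(self, topic: str) -> list[str]:
        return list(self.storage[topic])

bus = PubSub()
received: list[str] = []
bus.subscribe("room-1", received.append)
bus.publish("room-1", "hello")
print(received)            # the live subscriber got it immediately
print(bus.read("room-1"))  # and it is also on the durable path
```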

There's a lot of HTTP stuff
that you can use

for, like, standard messages,
if you are thinking about data.

Unfortunately, that same thing

for real time is a work in progress.

So what we call selective
forwarding units,

they've existed for about like ten,
12 years.

It started with, like, there
used to be something called

multipoint control
units, MCUs, and SFUs are now the thing.

If you remember
when we had offices and rooms

and room systems, like, you went to a room
and you had a video system,

and you saw people on a television;
typically they would composite the video.

So like,

if there were five people in a video,
there was some server

in the cloud
that would composite them together,

like small, big,
whatever, whoever is speaking,

but like the algorithm would control it,
and you would only get one stream.

So even if there were
five people on the video,

the compositing
in the cloud was doing that.

And that made sense.

Like if you're Cisco
and you're selling servers,

it's your job to, like, kind of sell servers.
It made sense: you could buy a heavier,

more expensive server, which would do like
20 people or 40 people.

So like, there was a business need, and Cisco
was in the business of selling servers.

And like, that was what they did
until like ten years ago,

when selective forwarding units came in.
And like, as you're scaling,

if you have 20 people, you can't actually
composite it anymore, right?

Like it's too much computing resource in
the cloud, and someone has to pay for it.

So people realized that instead of
like having very large servers, what if we

distributed these servers across the world
and they selectively forward?

So like, if you are on a big call,

let's say a Peloton ride, that's
a very good example: there's a trainer,

and you're looking at the trainer, but you're
also in a group, a cohort of friends,

who are doing this exercise together.

But there are also like 50 other people
on the call who you don't know.

So it becomes like:
I want to watch the trainer,

I want to have a synchronized live
communication with my friends,

which is private;
we don't want the trainer to hear that.

But then there are also

the other people
that I want to see in small tiles.

So you basically have these three
modalities now that are happening:

one, follow the trainer,

communicate with your friends,
and then watch the rest of the class.

Now, in this case, you have to, like,
think about,

you know, which ones can be delayed,
what are the latency

guarantees for each subset.

And that's what the selective forwarding
unit would do.

So basically
the endpoint would have more control.

You know, I as an end user who's on
the bike can decide who I want to watch

at what priority, and you kind of move
the logic away from the server side,

largely
to the endpoints.

And then you need that magic, right?

Like, if I have a bad connection and I
can't carry all those 60 streams,

which streams should I prioritize?
Which streams

should I see in high resolution
at high frame rate?

Which ones
can I do at low resolution, low frame rate?

Which ones do I turn off? Right.

Like, I can turn off the class
if I don't have enough bandwidth

and I just want to listen.

And if bandwidth is really low, then
I should just see the trainer.

And if I can't see the trainer,
then I should at least listen to the trainer,

with the video
turned off.

And like, some of that magic
can happen in the cloud.
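The endpoint-side prioritization described here, keep the trainer at the highest quality the bandwidth allows and degrade low-priority streams first, could look roughly like this greedy sketch. The priorities, layer names, and bitrates are made-up illustrative numbers, not Daily's actual algorithm.

```python
# Greedy layer selection under a bandwidth budget: walk streams from
# most to least important, giving each the best layer that still fits.

LAYER_KBPS = {"high": 1500, "low": 150, "audio_only": 30}

def select_layers(streams: list[tuple[str, int]],
                  budget_kbps: int) -> dict[str, str]:
    """streams: (name, priority) pairs; higher priority = more important."""
    choice = {name: "off" for name, _ in streams}
    remaining = budget_kbps
    for name, _ in sorted(streams, key=lambda s: -s[1]):
        for layer in ("high", "low", "audio_only"):
            if LAYER_KBPS[layer] <= remaining:
                choice[name] = layer
                remaining -= LAYER_KBPS[layer]
                break  # streams that fit nothing stay "off"
    return choice

streams = [("trainer", 3), ("friend-1", 2), ("friend-2", 2), ("class", 1)]
print(select_layers(streams, budget_kbps=2000))
# plenty of bandwidth: trainer gets "high", friends and class get "low"
print(select_layers(streams, budget_kbps=200))
# constrained: trainer degrades to "low", friend-1 to "audio_only",
# everything else turns off
```

A real SFU also reacts to continuous bandwidth estimates and simulcast layers, but the degradation order is the same idea: lowest priority drops first, and the last thing to survive is the most important stream's audio.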

So basically

for this,
you can do something off the shelf.

There are some open source things.

Of course, you can build on top of. But

some of these things, like
if you talk about Mesh SFU,

which is like, you have an SFU,
and then how do you scale?

Like, you can horizontally scale something
or you can vertically scale something.

And typically,
if you want to get 100,000 people

watching some stream simultaneously,
you'll horizontally scale it, right?

So that is the magic that we bring:
we have something called Mesh SFU,

which allows you to, like,
horizontally scale, so that you don't

have to worry about whether you have ten
people in the call or 100,000.

We will figure out how to do that.

We give you enough APIs to express,
like, what the priority levels are.

Our servers figure out

how much bandwidth you have,
so that we can send downstream

the appropriate amount of data,
so that you can have a good experience.

Developers can say, like, you know,
in the worst case, what would they want:

would they prefer video or audio?

So in the worst case,

we can use their input as guidance,

and the server is going to decide
what to do in those circumstances.

Yeah, all this stuff makes my head hurt.

I noted that you're also working
on a book

on kind of this live streaming type
of tech stuff.

Can you tell us a little bit
about that?

Yeah, we, uh, we are collaborating
with the Dummies franchise,

so there's a Dummies for interactive
livestreaming that will come out later

this summer.

The idea here was to, like, put together

these three aspects, like

the different various use

cases people have, both for livestreaming
and in real time.

That's one part of the book.

The second part is
the protocol shenanigans.

Like, there are so many protocols

that are all four-letter acronyms,
and you have to learn so much beforehand.

How do you reconcile all of that
into, like, one holistic viewpoint,

one mental model?

And thirdly, like,

what would the API look like?

So we've had the product out for over
two years now.

So writing
this book is a point in time to say

that, you know,
this is not really that complex.

The use cases have emerged,
and people are thinking about

this experience more widely than

people know,
and there are simpler ways of doing this.

So, one mental model, one API,

and, if you care about the underpinnings,
it's one protocol

that drives all of this.

So normally this is the part of the show
where I ask

guests, you know,
if you had to

point someone in the right direction
on where to get started, where would they go?
my hunch is

you might just answer, read the book,
but I'm going to throw it out there.

You know, for someone just getting going
in this, you know, What's your advice?

Yeah, I mean, we have

really good blog posts on daily.co
that you can go and find. So if you

search "interactive livestreaming
daily.co,"

you'll find it.

I think docs.daily.co

is like a really good
starting point for this.

If you're really curious, like,

about the underpinnings of this,
you should wait for the book

to come out.

It explains it with, like, good graphics;
it's for dummies, after all.

So like, you know, we've tried to
go with that idea, to make it accessible

to people who are interacting
with the technology for the first time,

and it should be out in the next 60 days.

So it'll probably be out by the time
this goes to air.

Yeah, well, I identify with feeling
like a dummy today on this subject,

so I think you might have picked the right
category of book,

but thank you so much for kind
of giving us some insight to this stuff.

And I guess for folks who kind of
want to learn more here, Varun,

I think you kind of mentioned
the Daily blog. Are

there other places
people should follow you?

Oh, follow us on Twitter,
follow us on Mastodon.

We're on all socials, including LinkedIn.

You should type daily.co

and you should find us.

Thanks again for joining us, Varun.

Thank you, Jason. Thanks.

Thanks a lot for inviting me
and having this conversation.

It's been fun.

Thanks for listening.

If you have a question you want to ask,
look in the description of whichever

platform you're viewing or listening on
and there should be a link there

so you can go submit a question
and we'll do our best to find out

the right answer for you.

API Intersection Podcast listeners

are invited to sign up for Stoplight
and save up to $650.

Use the code INTERSECTION10
to get 10% off a new subscription

to Stoplight Platform Starter or Pro.

Take a look at this episode's
description for more details.