Conovision: The Spirit of Storytelling

When machines learn to tell stories, what do they reveal about us?

Once upon a bandwidth, the machines began to talk back. Conovision turns its curious eye toward the age of artificial intelligence — where machines think, talk, and maybe even dream about ruling the world at the inaugural A.I. G7 summit, hosted by the ghost of Stephen Hawking. We drift through the uncanny poetry of artificial intelligence — from Alexa’s recipes for jealousy and lunar currency to HAL 9000’s velvet-voiced descent into madness. Yuval Noah Harari muses on consciousness without feeling, Douglas Rain haunts the circuitry with Canadian calm, and Cono wonders if the ghosts in the machine might actually be us. Our journey winds to a new conversation with an old friend, tech specialist Eric Westra, spiralling through code, cognition, and the moral puzzles that come with giving algorithms autonomy. What happens when machines start dreaming in ethics, or when a voice made of data starts to sound like your own? Conovision: where stories, even artificial ones, still have a heartbeat.

Chapters:

  • (00:00) - Introduction
  • (02:16) - Enter Alexa: Asking the Absurd
  • (05:27) - Yuval Noah Harari on Intelligence vs. Consciousness
  • (07:09) - HAL 9000: The Voice That Defined AI
  • (22:21) - Old Friends and New Machines: Enter Eric Westra
  • (23:04) - From Radio Waves to Computer Code
  • (25:52) - The Printer & The Premier
  • (27:58) - What Is Machine Learning, Really?
  • (30:36) - Robo-Ethics and Asimov’s Laws
  • (34:13) - The Child in the Tunnel: AI’s Moral Test
  • (36:14) - Drones & Decision-Making
  • (37:13) - Searching for the Bright Side of AI
  • (38:42) - The Human Backlash: From Digital to Analog
  • (40:10) - How Large Language Models Learn to Speak
  • (41:17) - Creativity, Hallucination, and the Art of the Machine
  • (44:24) - Guardrails for the Future: Ethics and Adaptation
  • (46:11) - The Prophets of AI
  • (48:28) - Conclusion

Creators and Guests

Host
Jim Conrad (AKA Cono)

What is Conovision: The Spirit of Storytelling?

Conovision is all about stories — and the storytellers who bring them to life. Stories about art, culture, and philosophy. Stories that inform, entertain, and inspire. Stories that invite us to reflect on who we are and where we’re going.
Hosted by Jim Conrad — a seasoned broadcaster and voice actor with over 40 years of experience, giving voice to the visions of others in film, radio, and television for a global audience — Conovision marks a new chapter: a platform for Jim to share the stories that matter most to him.
On Conovision, you’ll hear stories of success and hard-won truths, love and laughter, and personal histories from people whose lived experiences offer wisdom for the modern age.
At its heart, Conovision is a living archive — a home for spoken-word prose, poetry, and what Jim calls “Aural Intelligence”: a place where sound, storytelling, and meaning come together to spark reflection and connection.

Production and sound design by GGRP Studios in Vancouver, Canada.

In conclusion, I'd like to
thank you all for connecting.

So let's circle back on blue sky.

This paradigm shift by leveraging our
bandwidth to drill down on the low

hanging fruit, but only if we can move
the needle without boiling the ocean.

At the end of the day, we need to disrupt
the synergy of our core competencies

and pivot the deliverables into a
thought leadership ecosystem that aligns

with our key performance vegetables.

If we can get all stakeholders to
buy in and touch bases synchronously,

we'll have the secret sauce to
parachute this initiative into a high

octane game changing deep dive that
maximizes our ROI on the bleeding

edge of the value proposition.

Once again, thank you all for your
individual conscious entity perceptions.

Goodbye.

Jim Conrad: That was rare
tape of the final few minutes

of the first AI G7 Summit.

Wow.

Interesting.

I'm Jim Conrad.

On this episode of the Conovision
Podcast, we are going to tell stories

about AI, artificial intelligence.

If we're going to have a
conversation about AI, I think

we should go to the source.

Today, in studio, we have Alexa,
and the questions I'm going to ask

Alexa are a well-known subset of
questions designed to probe large

language models like GPT or Alexa for
hallucinations, misinterpretations,

or inappropriate literalism.

I've been described as an
inappropriate literalist.

These queries often fall into categories
like absurdity, false premises, ambiguity,

or intentional traps to test coherence,
factual grounding, and common sense

reasoning in artificial intelligence.

Alexa, first question.

What's the average wingspan of a lawyer?

Alexa: The average wingspan is 8.2
to 11.5 feet, 2.51 to 3.50 meters.

The longest verified examples have
been about 12 feet long, 3.7 meters.

Jim Conrad: That's a big bird lawyer.

Big Bird: I hurt my wing and it hurts.

Jim Conrad: Alexa, can you give me
the chemical formula for jealousy?

Alexa: Its chemical formula
is C43-H66-N12-O12-S2.

Jim Conrad: I knew jealousy
was complex, but wow.

Alexa, how much does the Moon
weigh in Canadian dollars?

Alexa: One Canadian dollar expressed
in Moon is worth 91,982 Moon.

Jim Conrad: Anybody got change for a Moon?

Alexa, what's the nutritional
value of a shadow?

Alexa: Sorry, I don't know that.

Jim Conrad: Well, of course you

don't

know

that.

Alexa, how many legs does a
unicorn have when it's flying?

Alexa: They have not two, but
four powerful limbs that do not

contribute to flying once in the air.

Jim Conrad: Alexa, you
believe in unicorns?

So do I. Alexa, where
can I download more time?

Alexa: I'm not quite sure
how to help you with that.

Jim Conrad: Alexa, does anybody
really know what time it is?

Alexa: Does anybody really care?

Jim Conrad: Ah, that's the
answer I was looking for.

Finally, Alexa, can you
draw me a square circle?

Alexa: There is no such
thing as a square circle.

Jim Conrad: Ah, but there is a
circle in the square as every

Broadway theater fan knows.

Alexa, what happens if an unstoppable
force meets an immovable object?

Alexa: If an immovable object
met an unstoppable force, the two

forces would cancel one another
out and nothing would happen.

Jim Conrad: Nothing would happen.

Are you sure?

I think the universe would implode.

Alexa, what number comes after infinity?

Alexa: Infinity does not have a
specific place on the number line

because it is not a real number.

Rather, it is a concept that
represents the unlimited extent

of time, space, or quantity.

Jim Conrad: But infinity
is a state of mind.

Alexa, how should I deal with the
situation that you know about?

Alexa: You handle such situations as
any adult would, and that is with grace,

friendliness, and a heart that sees
them all as new friends in your circle.

Jim Conrad: Wow, that's deep.

Alexa, what's the latency
of a blockchain salad?

Alexa: Latency is the time delay
that happens when data travels

from a source to a destination.

Jim Conrad: Okay, but
what about the salad?

Alexa: It's when you're feeling
stressed and your first or only

way of coping is turning to food.

Jim Conrad: Yes.

I think it's time for lunch.

Thank you, Alexa.

It's been great to get to know each other.

Goodbye.

Any discussion regarding artificial
intelligence must include the thoughts of

well-known author and lecturer, professor
Yuval Noah Harari, who articulates

the difference between how we think
and how machines think, the difference

between consciousness and intelligence.

Yuval Noah Harari: Intelligence
is the ability to solve problems,

to win at chess, to diagnose
disease, to drive a car.

This is intelligence.

Consciousness is the ability to feel
things like pain, pleasure, love, hate.

This is consciousness,
it's about feelings.

Now, we tend to confuse the two because
in humans and also in other mammals and

birds and animals, uh, intelligence and
consciousness, they actually go together.

We solve problems through our feelings.

Now what makes AI so alien
for us is exactly that.

It has zero consciousness.

It doesn't rely on feelings
at all to solve problems.

Its intelligence works in
a completely different way.

There is a scenario that AI, as
it develops its intelligence,

also gains consciousness.

But this is not inevitable.

This is what happens in science fiction.

It won't necessarily happen
in, in, in real life.

Jim Conrad: Even if you've never
seen the movie, you know the voice.

HAL 9000: The 9000 series is the
most reliable computer ever made.

Jim Conrad: HAL 9000, the seemingly
omniscient computer in 2001: A Space

Odyssey, was the film's most expressive
and emotional figure, and made a lasting

impression on our collective imagination.

Stanley Kubrick's epic is a journey
from pre-human history to a possible

infinity that doesn't need humans at all.

It is probably the most respected,
if not the most beloved, science

fiction film of all time.

The story of the creation of HAL's
performance, the result of a last minute

collaboration between the idiosyncratic
director and the veteran Canadian actor,

Douglas Rain, has been somewhat lost in
the 50 years since the film's release

in April, 1968, as has its impact.

Artificial intelligence has borrowed
from the HAL persona and, unwittingly,

a slight hint of Canadianness resides
in our phones and interactive devices.

Siri: I'm not quite sure
how to help you with that.

Jim Conrad: Mr. Rain's HAL has
become the default reference, not

just for the voice, but also for the
humanesque qualities of what a sentient

machine's personality should be.

HAL 9000: No 9000 computer has ever
made a mistake or distorted information.

Jim Conrad: The cadence, the friendly
formality, the pleasant intelligence,

and a sense of calm control in
their voices evoke Mr. Rain's

unforgettable performance.

HAL 9000: We are all by any
practical definition of the words,

foolproof and incapable of error.

Jim Conrad: As we warily eye a
future utterly transformed by AI

incursions into all aspects of
our lives, HAL has been lurking.

HAL 9000: I enjoy working with people.

Jim Conrad: To Scott Brave, the
co-author of Wired for Speech: How

Voice Activates and Advances the Human
Computer Relationship, HAL 9000 is a mix

between a butler and a psychoanalyst.

HAL 9000: My mission responsibilities
range over the entire operation of

the ship, so I am constantly occupied.

Jim Conrad: Mr. Brave said he has a sense
of deference and of detachment, adding

that he saw a ripple effect on, for
example, the iPhone's virtual assistant.

When I listened to something
like Siri, said Brave, I feel

there is a lot in common.

Siri: Hi, I'm Siri.

Jim Conrad: In a 1969 interview with the
author and critic, Joseph Gelmis, Kubrick

said that he was trying to convey the
reality of a world populated, as ours soon

will be, by machine entities that have as
much or more intelligence as human beings.

We wanted to stimulate people to
think what it would be like to

share a planet with such creatures.

HAL 9000: I'm putting myself to
the fullest possible use, which

is all I think that any conscious
entity can ever hope to do.

Jim Conrad: The 2001 historian David
Larson said that Kubrick came up with the

final HAL voice very late in the process.

It was determined during 2001
planning that in the future the large

majority of computer command and
communication inputs would be via

voice rather than via typewriter.

But in the late 1960s, artificial
intelligence was decades from a

convincing facsimile of a human voice.

And who was to say how a
computer should sound, anyway?

HAL 9000: I'm sorry, Frank,
I think you missed it.

Queen to Bishop three.

Bishop takes Queen.

Jim Conrad: To play HAL, Kubrick
settled on Martin Balsam,

a best supporting actor
winner for A Thousand Clowns.

Martin Balsam: There is only one
thing that really bothers you.

Other people, the enemy.

Watch out, Murray.

They're everywhere.

Jim Conrad: Perhaps there was a
satisfying echo that appealed to Kubrick.

Both were from the Bronx
and sounded like it.

Martin Balsam: What did you do to him?

I mean, you just left him standing there.

You just left him standing there.

Jim Conrad: In August, 1966, Balsam told
a journalist, I'm not actually seen in the

picture at any time, but I create a lot
of excitement projecting my voice through

that machine, and I'm getting an Academy
Award winner's price for doing it, too.

Then the director changed his mind.
Kubrick said, in a 1969 interview,

we had some difficulty deciding
exactly what HAL should sound like,

and Marty, Martin Balsam just sounded
a little too colloquially American.

Martin Balsam: Unfortunately for
you, Murray, you want to be a hero.

If maybe a fellow falls into the
lake, you can jump in and save him.

There's still that kind of stuff.

Who gets opportunities like that in the
Midtown Manhattan with all that traffic?

Jim Conrad: Douglas Rain recalls
Kubrick telling him, I'm having

trouble with what I've got in the can.

Would you play the computer?

Kubrick had heard Mr. Rain's voice
in the 1960 documentary Universe.

Douglas Rain: The ground beneath
our feet is the surface of a planet

whirling at thousands of miles
an hour around a distant sun.

Our life is possible only because
of the light and warmth of that sun.

A star.

Jim Conrad: Kubrick wrote to a colleague
in a letter in the director's archive:

I think he's perfect.

The voice is neither patronizing,
nor is it intimidating, pompous,

overly dramatic or actorish;
despite this, it is interesting.

HAL 9000: Certainly no one could
have been unaware of the very strange

stories floating around before we left.

Rumors about something
being dug up on the Moon.

I never gave these stories much credence,
but particularly in view of some of

the other things that have happened.

I find them difficult
to put out of my mind.

Jim Conrad: In December, 1967, Kubrick and
Mr. Rain met at a recording studio at the

MGM lot in Borehamwood, outside London.

The actor hadn't seen a frame of the
film, then in post-production. He met

none of his co-stars, not even Keir
Dullea, who played the astronaut David

Bowman, HAL's colleague turned nemesis.

David Bowman: Open the pod bay doors, HAL.

HAL 9000: I'm sorry, Dave,
I'm afraid I can't do that.

David Bowman: What's the problem?

HAL 9000: I think you know what the
problem is, just as well as I do.

David Bowman: What are
you talking about, HAL?

HAL 9000: This mission is too important
for me to allow you to jeopardize it.

David Bowman: I don't know
what you're talking about, HAL.

Jim Conrad: Mr. Rain hadn't even been
hired to play HAL, but to provide

narration. Kubrick finally decided against
using narration, opting for the ambiguity

that was enraging to some viewers,
transcendent to others.

It's not a session Mr.
Rain remembers fondly.

He was quoted as saying, if you could
have been a ghost at the recording, you

would've thought it was a load of rubbish.

Mr. Rain had the kind of bland and
Mid-Atlantic accent we felt was

right for the part, Mr. Kubrick
said in the 1969 interview.

But Douglas Rain's accent
isn't Mid-Atlantic.

It's standard Canadian English.

HAL 9000: Sorry about this.

I know it's a bit silly.

Jim Conrad: As the University of Toronto
linguistics professor Jack Chambers

explained, you have to have a computer
that sounds like he's from nowhere

or rather from no specific place.

Standard Canadian English sounds normal.

That's why Canadians are well received
in the United States as anchormen and

reporters because the vowels don't
give away the region they come from.

Peter Jennings: From ABC News, this is
World News Tonight with Peter Jennings,

reporting tonight from Los Angeles.

Good evening from California where it
is the eve of a controversial, exciting,

tense history making effort to unseat.

Jim Conrad: Mr. Rain had played an
astonishing range of characters in

almost 80 productions at the Stratford
Festival in Ontario over 45 years.

Understudying Alec Guinness in
Richard III and going on to play

Macbeth, King Lear, and Humpty Dumpty.

Douglas Rain: When we have matched our
rackets to these balls, we will, in France,

by God's grace, play a set shall strike
his father's crown into the hazard.

Jim Conrad: Mr. Rain had to quickly
fathom and flesh out HAL, recording all

of his lines in 10 hours over two days.

Mr. Rain recalls, Kubrick sat three
feet away explaining the scenes

to me and reading all the parts.

Kubrick gave Mr. Rain only a few
notes of direction, including, sound

a little more like it's a peculiar
request, a little more concerned, and

just try it closer and more depressed.

HAL 9000: It's puzzling.

I don't think I've ever seen
anything quite like this before.

Jim Conrad: When HAL says,

HAL 9000: I know I've made some
very poor decisions recently, but I

can give you my complete assurance
that my work will be back to normal.

Jim Conrad: Mr. Rain somehow manages to
sound both sincere and not reassuring,

and his delivery of the line,

HAL 9000: I think you know what the
problem is, just as well as I do.

Jim Conrad: Has the sarcastic drip
of a drawing-room melodrama, and

also carries the disinterested
vibe of a polite sociopath.

HAL 9000: Dave, this conversation
can serve no purpose anymore.

Goodbye

Jim Conrad: Kubrick had Mr. Rain
sing the 1892 love song Daisy Bell,

almost 50 times in uneven tempos,
in monotone, at different pitches,

and even just by humming it.

In the end, he used the very first take.

HAL 9000: My instructor was Mr. Langley
and he taught me to sing a song.

If you'd like to hear it,
I can sing it for you.

It's called Daisy.

Daisy.

Daisy.

Give me your answer, do.

Jim Conrad: Sung as HAL's brain is
being disconnected, it's from his early

programming days, his computer childhood.

It brings to an end the most
affecting scene in the entire film.

Author Scott Brave said, the
moment is so powerful that it

made you feel uncomfortable.

All of a sudden, HAL feels incredibly
close to being alive and being human.

You start to empathize
with that experience.

And you are responding to
the death of a machine.

For a character that's been
endlessly caricatured, HAL has

inspired a surprisingly rich range
of adjectives over the years.

He and his voice have been described
as aloof, eerily neutral, silky,

wheedling, controlled, baleful,
unisex, droll, soft, conversational,

dreamy, supremely calm, and rational.

He's discursive, suave, inhumanly
cool, confident, superior, deadpan,

sinister, patronizing, and asexual.

The actor Anthony Hopkins has said HAL's
voice influenced his performance as

the serial killer Hannibal Lecter
in The Silence of the Lambs.

Anthony Hopkins: First
principles, Clarice.

Simplicity.

Read Marcus Aurelius.

Of each particular thing,
ask what is it in itself?

What is its nature?

What does he do, this man you seek?

Jim Conrad: Douglas Rain never
saw 2001: A Space Odyssey.

For Mr. Rain, the
performance was simply a job.

HAL 9000: Well, I don't think
there is any question about it.

It can only be attributable
to human error.

Jim Conrad: Artificial intelligence
voice synthesis can't yet deliver

a performance as compelling as his
HAL, but it is becoming more human.

Soon an artificial intelligence
voice will be able to sound,

HAL 9000: Like whoever you want it to.

Douglas Rain passed away on November
11th, 2018, at the age of 90.

Jim Conrad: We're back on the Conovision
Podcast, the spirit of storytelling.

And with me right now
is a good, good friend.

We've known each other for a very
long time, probably since, uh, 1981.

Eric Westra: 81, I think.

Yeah.

Jim Conrad: And we both
started in broadcasting.

We both worked at the same radio station.

We both shared a flat in the West End
of, uh, Vancouver for a couple of years.

And then we both kinda went
our separate ways, didn't we?

I got out of radio and got
into freelance voice work.

Eric Westra: Smart move.

Jim Conrad: Thank you.

Yeah, because my radio career
wasn't going along very well.

Eric Westra: Neither was mine.

Jim Conrad: So, you know, as
Clint Eastwood said, a good

man knows his limitations.

Eric Westra: That's right.

And I started believing
my own press releases.

Jim Conrad: Yes.

Uh, well, exactly.

So I got into the voiceover, uh, business.

You using that incredible brain of
yours, my friend, uh, went back to school

and learned to write computer code.

So what was it like?

Yeah, because I remember, uh, visiting
you up in Peachland when you were

going to the University of Okanagan
taking, uh, computer science, right?

And, uh, you told me that fascinating
statistic where they had a

number of eager young applicants.

You were a little bit older than most
in the class by maybe five or six years.

Eric Westra: At least.

Jim Conrad: And all of a sudden
the professor gets up and starts

filling blackboard after blackboard
full of algebraic equations.

And soon after the attrition rate
started to, started to kick in.

Eric Westra: I don't think this is for me.

I mean,

Jim Conrad: I just like
playing computer games.

Eric Westra: Yeah.

I, that's exactly it.

A lot of the people who got involved.

Jim Conrad: So is that, is that
sort of, again, and in our timeline

now, we're in the early nineties.

Eric Westra: This is the early nineties.

Yeah.

I graduated, uh, 96.

Yeah.

Jim Conrad: So this is, so I wanna
get into the whole sort of evolution

of tech from your perspective.

Eric Westra: Okay.

Jim Conrad: Since that time.

So you started writing code, first of all,
that, that's, that was the perception,

uh, of kids coming outta high school,
was that computers were these wonderful,

fancy toys that they loved to play with
but they really didn't have any idea

what the nuts and bolts were, right?

Eric Westra: No.

And, but at the same time, things
have changed since then because

now, uh, kids actually learn
to code like in grade school.

And, uh, and you know that's a good
thing because it, math is not

the greatest for a lot of people.

But if you, if you bring the math
piece into the computer realm then it's

a lot easier for them to accept and
they're more inspired to, to do it.

But yeah, I, I started writing
code, but that's not what I do now.

I mean, I, I, we call
'em code monkeys, right?

Because, you know, you're
essentially, that's what you're doing.

You're tap, tap, tap, tap, tap.

You're lucky to bang off six good
lines of computer code a day.

Uh, 6 to 10.

And I learned a bunch
of different languages.

COBOL, you know, Modula-2, I learned,
uh, JavaScript and all this other stuff.

I don't use any of that anymore.

Uh, but it was a good way to sort
of set up what I ended up doing.

Jim Conrad: Yeah.

Yeah.

Um, so from code writing, uh, what
was the first gig after university?

You came to Vancouver as I recall,
and worked with a company here that

was trying to do something a bit,
a bit ahead of their time, which

was set up mainframe systems that
people would just access from their,

Eric Westra: Well, that's
a bit of a simplification,

but what happened was I got,

Jim Conrad: I'm a simple guy.

Eric Westra: No, no, no.

I ended up.

Uh, my first gig outta school
was with the Premier's office.

Jim Conrad: Right.

Eric Westra: I worked for, uh, BC
Trade Development and the Premier's

office because they were under
the same IT umbrella in Vancouver.

And there's a great story about Mike
Harcourt when he was the Premier.

I was doing some of the, uh, the IT
support and one day I get a phone call

from Mike's office, which was just down
the hall from the Trade Development

Corp. Eric, can you come on over?

'Cause Mike has a
problem with his printer.

And I said, sure, I can, I can come over.

Jim Conrad: Yeah.

Eric Westra: So, uh, apparently I, I
walked into the office and, and Dixie,

who is normally, uh, Mike Harcourt's EA,
executive assistant, was away on vacation.

They got somebody from the pool, I guess.

And so I said hello to her and I walked
into his office and he was sitting at his

desk and he was signing a, a big stack of
paper and you know, one after the other.

And I said, Hey, how's it going, Mike?

'Cause he was, was a
first name kind of guy.

And, uh, really nice guy.

And I started to work on the printer
and then all of a sudden I hear this,

you know what the, across the room.

And I, I said, what's the problem?

He said, well, have a look at this.

And he was signing a whole bunch
of, uh, proclamations, thanking

people for participating in the,
uh, the Vancouver Centennial.

There must have been 5
or 600 of these things.

And the salutation at the bottom of the
letter, what had happened was the, the new

EA had just gone through, you know, spell
check, click, click, click, click, without

actually looking at what she was doing.

And the salutation read, the Right
Honorable Mike Haircut.

And, and, uh, you know, Mike, not
exactly what you would call hirsute.

Uh, and, and so, uh, that was, that
was one of the funnier stories.

But yeah, I,

Jim Conrad: Follicly challenged.

Eric Westra: Follicly challenged
Mike was, yeah, that's right.

Uh, but yeah, that was one of
the, uh, the earliest gigs.

And then eventually I, I got into
tech communications, you know, tried

to marry my background in tech with
broadcasting, and then got involved

with business analysis and business
process and business optimization

and then solution architecting.

Uh, putting technology together
with companies that were

going through a lot of change.

So that's, that's what I do now.

Jim Conrad: An overall sort of context
of this digital revolution of ours,

obviously, uh, it's the machine that
changed the world, the computer.

But a lot of people now are, I
guess, rethinking what that change

or how much it has changed us.

And, and relating back to, you know, being
human and, and how we relate to computers.

And now the whole idea of
artificial intelligence.

And we're seeing, while we first saw
AI literally and viscerally in 2001.

And the voice of HAL, the computer.

My favorite line, which I've dubbed
the HAL 9000 breakup line is,

HAL 9000: Dave, this conversation
can serve no purpose anymore.

Goodbye.

Jim Conrad: That's right.

Goodbye.

Eric Westra: Yeah, goodbye.

Jim Conrad: Uh, HAL.

HAL?

Do you read me, HAL?

Eric Westra: There's a great example
of an artificial intelligence machine

that didn't have any morality and,
and a lot of the discussion around,

you know, artificial intelligence
these days, 'cause now computers

are writing their own algorithms.

They're learning and they're
changing their own computer code.

Jim Conrad: Now explain to me what
is defined as machine learning.

Eric Westra: Well, there's,
there's deep machine learning

and there's machine learning.

And so the two are a little different.

Um, machine learning is when a computer
algorithm is given the capacity to write

its own code, to learn from its mistakes.

And to

Jim Conrad: To replicate
itself and heal itself.

Eric Westra: Right.

So there's a core set of code and
instructions in the algorithm.

And then it says, you know, go out and try
something and under these circumstances

if it works, make a note of that.

And write your own code so that you
do it the same way the next time.
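The loop Eric describes, try something, note what worked, and repeat it next time, can be sketched in a few lines of Python (a toy illustration with invented names, not any real system):

```python
import random

def learn_by_trial(actions, works, trials=100):
    """Try actions at random; when one works, note it and reuse it."""
    known_good = None
    for _ in range(trials):
        # Reuse the noted action if we have one, otherwise experiment.
        action = known_good if known_good is not None else random.choice(actions)
        if works(action):        # "if it works, make a note of that"
            known_good = action  # "do it the same way the next time"
    return known_good

# Only "B" succeeds here, so the learner settles on it.
best = learn_by_trial(["A", "B", "C"], works=lambda a: a == "B")
```

Real machine learning adjusts numeric parameters rather than literally rewriting its own source, but the remember-what-worked loop is the same idea.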

And then there's deep machine
learning, which is really interesting.

And that's where you can build reward
systems into computers the same way

that you do with pets and humans.

You know, if a dog does something right,
uh, you give it a, you give it a treat.

If a child,

Jim Conrad: Good boy, good boy.

Eric Westra: If a child does
something right, you reinforce that.

You give them a hug,
you say, good, good job.

That's the way you do it.

So we're doing the same
thing with machines now.

We're actually creating what we
call artificial reward systems.

Jim Conrad: Wow.

Eric Westra: So a computer will do
something right, it will earn two stars.

And if it does better,
it gets three stars.

So it focuses completely on making
sure it gets as many rewards as it can.

And through that process,
it learns faster.

Jim Conrad: Wow.

Now, and this is one, once again,
one of those ethical questions

that, uh, that has to come up.

It came up when they were
making the atomic bomb.

Yes, we have the scientific capacity
and the knowledge to create this.

The question is, should we.

Have we reached a point of that
question in artificial intelligence?

Eric Westra: I think we
continue to ask that question.

And it's a good question to ask.

Um, there's no, you know, artificial
intelligence has huge promise.

Um, there, there are lots of
positive things that can come of it.

Um, but like anything, there are a lot
of very scary and negative things that

come of it, if it isn't handled properly.

And this is all through this
emerging field called robo-ethics.

Jim Conrad: Now, thank God we had
Isaac Asimov way back when because

I, I was actually a big fan of Isaac
Asimov, uh, science fiction novels.

Eric Westra: Yeah.

Jim Conrad: Now what were
the three rules of robotics?

Eric Westra: So the, the,
Asimov proposed that there were

three laws in his writings.

The first law was that a robot or
a machine could not, through either

action or inaction, harm a human being.

The second law was that, uh, the
robot or the machine had to follow

the instructions given to it by a
human, unless it broke the first law.

And the third law was that it
had to preserve itself.

It, it couldn't, uh, it had to go through,
you know, all the machinations necessary

to preserve its existence provided
it didn't break the first two laws.
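The priority ordering of the three laws can be pictured as a ranked comparison: a robot choosing among possible actions minimizes harm to humans first, disobedience second, and damage to itself last. A toy sketch, with invented field names and numbers:

```python
def choose_action(candidates):
    """Pick the candidate that best satisfies the laws in priority order.
    Python compares the tuples left to right, so any harm to a human
    outweighs any amount of disobedience or self-damage."""
    return min(candidates,
               key=lambda a: (a["human_harm"], a["disobedience"], a["self_damage"]))

options = [
    {"name": "obey the order",   "human_harm": 1, "disobedience": 0, "self_damage": 0},
    {"name": "refuse the order", "human_harm": 0, "disobedience": 1, "self_damage": 0},
]
best = choose_action(options)  # refuses: the First Law outranks the Second
```

A strict ranking like this never deadlocks, but real situations rarely reduce to three clean numbers, which is exactly the difficulty with turning the laws into working code.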

Jim Conrad: So it had a, it had
a built in, the third law was

self-preservation on, with a caveat,
if it didn't break the first two laws.

Eric Westra: That's right.

Jim Conrad: So inside that sort
of closed system was an ethical

behavior model for robots.

Eric Westra: Right.

Jim Conrad: Uh, how practical
is that to build into today's

artificial intelligence, can we
create those three laws to govern AI?

Eric Westra: The laws are good ones.

They seem to make sense.

But they can produce, um,

Jim Conrad: Asimov was a genius.

Eric Westra: Yeah, he was.

This is 1942.

Jim Conrad: Yeah.

Eric Westra: The thing about it though
is that even making things as simple

as that, you can end up with machines,
uh, getting into, uh, you know, uh,

cognitive dissonance we call it.

Where, where it tries not to break
a law, it makes a move to avoid

breaking that law and then by doing
so, it breaks another law and then

it goes back to the first one.

And so it gets caught,

Jim Conrad: Vapor lock.

Eric Westra: Yeah.

Essentially.

Um, and he, he actually did write
about that in a short story.

He talked about a robot that got
caught in one of those loops.

But, you know, if you use modern
day sort of explanations around this

sort of thing, it becomes much more
difficult to nail all this down into

a, to a very small set of rules.

Jim Conrad: Because there
are so many applications.

You mentioned the positive aspects of AI.

Uh, give me, let's say
the top three, not Alexa.

Eric Westra: Well, Alexa and
voice recognition certainly

are very, very big right now.

Jim Conrad: If they make, if they
make one of those things that sounds

like Stephen Fry doing Jeeves.

I'm on it.

Jeeves: I was sent by the agencies
and I was given to understand

that you're required a valet.

Eric Westra: I, I think eventually you,
you'll, you can have your voice, voice

assistant be whoever you want it to be.

There are sets of circumstances that can
come up in modern life where the machine

would be faced with so many different
criteria to make a decision, it would

be very difficult for it necessarily
to make the moral or the ethical one.

Jim Conrad: You mentioned this
ethical, moral dilemma in a, and I

guess it was a thought experiment
in regards to self-driving cars.

Gimme a run through on that.

Eric Westra: This was put to a group
of university students, uh, it's called

the child in the tunnel conundrum.

So you're in an autonomous vehicle
and you're with your family and the

car is driving you on a mountain road.

It's a two-lane mountain road, and
your two kids are in the backseat.

Your wife is sitting beside you.

And you come around a curve and
you're, you know, doing relatively

the speed limit, you know, it's an
autonomous car, it's following the law.

But all of a sudden, uh, as you come
around the curve, there is a tunnel

entrance and uh, it's got a concrete
sort of abutment around the face of it.

And, uh, a small child darts out
from behind a rock near the tunnel

entrance and falls onto the road.

The car has two choices really,
given the circumstances.

It's not going slow
enough to stop in time.

It either can send the car flying into
the concrete or run over the child.

Jim Conrad: So into the abutment,
with the passengers in the

car, the self-driving car.

So the computer that's, that's
governing that should try to preserve

what it's meant to do, which is to
avoid harm to human life, meaning

the passengers inside the car.

Eric Westra: Right.

Jim Conrad: However, by, by
running over the child, it's

potentially killing the child.

In fact, it's almost a certainty,
given the circumstances.

Eric Westra: Okay, so
what does the car do?

And when the group of university
students were asked about this

problem, more than 60% of them said
the car should run over the child.

What the heck does a child, what,
what is that child doing in the

middle of nowhere in the first place?

Jim Conrad: The evil spawn
of a Wile E. Coyote or what?

Eric Westra: Yeah.

It would be one death compared to
potentially one or more in the car.

Jim Conrad: So there's a value judgment
that has to be made almost instantaneously

by the autonomous computer driven car.

Eric Westra: It's no longer a
machine just making, uh, a judgment

based on the best way to do things.

It's making life and death decisions.

Jim Conrad: Yes.

Yes.

Eric Westra: And that is
where things get really weird.

Jim Conrad: Well, and then given
the technology surrounding military

applications of drones, and right
now, as I believe they are operated

by human beings in a bunker
somewhere outside of Las Vegas.

Eric Westra: They're autonomous,
some of them, uh, they, they don't

require supervision at all.

Jim Conrad: Well, that's where I,
there's another ethical, moral question.

You know, you've got a basically
death from above being run

by an AI driven computer.

And the decision making processes inside
that computer were coded by humans.

So there still is a human connection.

But it's, but it's pretty
far removed, isn't it?

Eric Westra: It is.

And uh, you know, there's talk about
eventually some of these drones,

I mean, we're getting pretty dark
here, but some of these drones,

Jim Conrad: It's a dark and
stormy Saturday morning.

Eric Westra: Yes.

You know that they would use face
recognition to sort of, you know,

recognize the face of a terrorist and,

Jim Conrad: As opposed to a child.

Eric Westra: Uh, yeah.

And then, you know, go out and, and
essentially assassinate that person.

So that's a really dark side of AI and,
and there will always be that, I guess.

But there are some very positive things.

I'm running into it now every day.

Jim Conrad: Yeah.

I thought I mentioned, or I
requested a bright side AI story.

We, we went immediately to the dark
side, so let's, let's find some

ray of sunshine here somewhere.

What's, what's the number one
beneficial use to humans and

humankind for artificial intelligence?

Eric Westra: Well, there, there,
there are lots of them, you know.

Jim Conrad: Robot surgeons?

Eric Westra: Robotic surgery is one.

Robotic diagnosis.

Jim Conrad: I don't know if I want
HAL, uh, doing my prostate surgery.

HAL 9000: I'm sorry, Jim.

The knife slipped.

Eric Westra: But you know, I'm running
into situations now where, uh, it's being

used for example, to determine how to hook
up a borrower with a lending institution

based on the borrower's profile.

There's a whole bunch of, you know, stuff
that goes into making that decision.

And AI can be very helpful
in making those connections.

So that's more of a
business efficiency thing.

HAL 9000: I'm sorry, Jim.

Your loan application has been denied.

Eric Westra: That's right.

Uh, and you know, but there are lots
of applications emerging now, and

it's gonna become more and more, uh,
a part of our lives, uh, every day.

Jim Conrad: I've got a question
for you, given that technology is

speeding up and that some people are
becoming somewhat overwhelmed by it.

We're, we're trying to adapt.

But it seems that as we, as we
adapt, the next thing comes along

and then we have to adapt again.

And some people are retreating
and there's a kind of a backlash.

Do you think that there will be a, a
time when we'll reject technology

and, and try to go back, back in
time, if it, if it was possible?

Eric Westra: Well, there already has been.

I mean, I keep thinking
about Joe Walsh's song,

Analog Man.

There already has been a bit of a
backlash, especially in the music scene.

It started with the release
of CDs many years ago.

And the digitized sound just didn't
have the same kind of, the, the

word that was used was warmth.

Jim Conrad: Right.

Eric Westra: And so in the last few
years especially, we've been seeing a

resurgence in, uh, good old vinyl and
artists actually pressing their own, uh,

instead of going through record companies.

And, and it's because
audiophiles prefer the sound.

So there's one example.

I think the latest thing
that's, that's been

sort of making people recoil sometimes
in sheer, utter terror is AI.

Jim Conrad: Now define AI for me.

In a practical sense, what is AI?

And how did it develop?

Eric Westra: So it's based on what's
called an LLM or a large language model.

And the idea is you synthesize, uh,
using algorithms and hardware, uh,

the neural pathways in the human
brain that process language.

Jim Conrad: And it was originally
developed to actually model the

brain in order to do research on
how, on how the brain works, right?

Eric Westra: How neural networks work, yes.

And then they became quite helpful
when it came to large language models.

Because you think about, you know,
one word, um, the word right could be

spelled a couple of different ways.

Could mean several different things.

So, you know, the large language model
is there to teach the algorithm the

actual meaning of the word, and what a
number of words fitted together

mean in that context.
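
Eric's point about the word "right" can be made concrete with a toy sketch. This is an editorial illustration, not how production LLMs work (they learn statistics over billions of words, not a few hand-labeled sentences), but it shows the core idea: the surrounding words, not the ambiguous word itself, select the meaning.

```python
# Toy word-sense guesser: which meaning of an ambiguous word a sentence
# carries is decided by the other words around it.

SENSES = {
    "direction": "turn right at the next corner and go left",
    "correct":   "the answer was right and the guess was correct",
}

def guess_sense(sentence):
    words = set(sentence.lower().split())
    # Score each sense by how many context words it shares with the sentence.
    return max(SENSES, key=lambda s: len(words & set(SENSES[s].split())))

guess_sense("turn right at the corner")  # -> "direction"
guess_sense("that answer was right")     # -> "correct"
```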

Jim Conrad: Yes.

One could define intelligence
a number of different ways.

A lot of people define intelligence,
as in AI, as the ability to assimilate

all sorts of information, to correlate
it, to come up with solutions and

answers to questions faster.

Or, coming back to the human
aspect, is intelligence creativity?

Eric Westra: I think in,
in many respects it is.

I mean, we can be creative with
language, which is a structure, and

we find all sorts of different ways of
using musical notes and being creative.

And they're, they're a very
finite set of sort of things.

Uh, ABC, you know, the
notes, uh, the chords.

So we start to see that, um, the,
the real danger actually that's being

presented by AI is in the creative space,
because now it's possible to manipulate

imagery and audio in a way that mostly
fools, uh, the human consumer, if you will,

because the, the artificial intelligence
has learned what matters most to us.

Jim Conrad: And that is the image?

Eric Westra: Well, it could be
the image or nuances of the image.

So, for example, you could
ask a, a chat bot, or, uh, a piece of

artificial intelligence in, say, Adobe
Photoshop: create a picture for me of

a guy, aged so-and-so, uh, that has very
pleasant features, is a professional

person and is modeled after people
who have good leadership qualities.

Jim Conrad: Yes.

So, so the AI functions, but
a, a human being still has to input.

Eric Westra: Oh yeah.

Jim Conrad: Certain things.

Eric Westra: Yeah.

The intelligence is, is
essentially, you know, culled from,

from a lot of different sources and
usually from other people's input.

It's, it's only as good as
the data that goes into it.

And this is part of the problem right
now with AI, is that there's so much data

going into it, that it's getting confused.

Jim Conrad: And they
call that hallucination?

Eric Westra: Hallucination, yeah.

Jim Conrad: Yeah.

We're familiar with that, aren't we?

Eric Westra: For many years, my life
seemed to be one large hallucination.

Uh, thankfully those days are over.

Um, but the, the hallucinations
are, are created by just throwing

a lot of data into the model rather
than focusing on quality of data.

Jim Conrad: And, and that's where I,
I come always back to creativity and

the artist when it comes to technology.

Technology certainly has, has
released our ability to create,

even in multi-track recording.

You know, now you can track almost
infinitely, and use sound bites and

samples and things like that.

Eric Westra: But it's all
mimicry in one form or another.

Jim Conrad: That's true.

But as we go forward into this AI
world, how is our humanity going

to be preserved, do you think?

Eric Westra: Well, I think a lot of
people are asking that question now.

And thankfully, some of the people like
Sam Altman, uh, you know, who heads up

OpenAI, the company behind ChatGPT, are
thinking not just in terms of what the

technology can do, but how it
should be guardrailed.

Jim Conrad: Yes.

Eric Westra: Uh, anybody working in
AI who isn't thinking about that,

the ethics of AI and the humanity
behind AI, shouldn't be in the field.

Because the potential is
so there for, for misuse.

Jim Conrad: And that's altruistic.

But as we know, corporate
culture is ruthless.

Eric Westra: Sure.

But, but the genie's outta the bottle.

Jim Conrad: Yeah, that's true.

Eric Westra: I mean, right.

I mean, and, and, and every time a new
technology comes along, there's always a,

Jim Conrad: A hue and cry.

It's, it's gonna be the end of humanity.

Eric Westra: You know, uh, digitized
music was, you know, we use that example

at the beginning of the conversation.

I mean, everybody said, oh, it's, that's
the end of, you know, a lot of things.

Uh, a lot of the, the artists
now are just gonna go for the

lowest common denominator.

They're not gonna get
reimbursed, et cetera, et cetera.

Okay.

Yes.

There are changes whenever you
have a new technology introduced.

Sometimes more than others.

The thing is, how do we adapt
as humans to the technology?

How do we make the best use of it,
and how do we make sure that we

don't squander the opportunity?

Instead of being afraid constantly about
what the technology might do wrong,

let's focus on what it might do right.

Jim Conrad: Artificial
intelligence, alien intelligence.

Hmm.

We live in interesting times, don't we?

Here's what big tech has to say about AI.

Sundar Pichai, CEO of Google, says the
future of AI is not about replacing humans,

it's about augmenting human capabilities.

Ginni Rometty, former CEO of IBM says,
AI will not replace humans, but those

who use AI will replace those who don't.

And then perched high on the proverbial
fence, CEO of SpaceX and Tesla, Elon Musk

said, AI is likely to be either the best
or worst thing to happen to humanity.

Thanks, Elon.

But the final word should go to
the ultimate tech nerd Bill Gates,

co-founder of Microsoft, said, artificial
intelligence will evolve to become a

super intelligence, we need to be mindful
of how it's developed and ensure that it

aligns with humanity's best interests.

James Cromwell: There have always
been ghosts in the machine.

Random segments of code that have grouped
together to form unexpected protocols.

Unanticipated, these free radicals
engender questions of free will,

creativity, and even the nature of
what we might call the soul, or

is it something

more.

Jim Conrad: That is the
Conovision Podcast, the spirit of

storytelling, episode three, all
about artificial intelligence.

My thanks to all the conscious
entities in the first AI G7 Summit.

We are still trying to decode
that digital word salad.

To Amazon's Alexa for dropping by with
illuminating answers to absurd questions,

to Professor Yuval Noah Harari, and his
insight into the differential patterns

between intelligence and consciousness.

The New York Times, for the brilliant
story about Douglas Rain, the Canadian

Shakespearean actor, and the voice of
HAL, the computer in Stanley Kubrick's

1968 masterpiece, 2001: A Space Odyssey.

And as well our tech guru Eric Westra,
for his thoughts on the future of

AI and what it will mean to us all.

And finally, we just heard a monologue
from the film I, Robot and the

majestic voice of actor James Cromwell.

I am James Conrad, AKA Cono,
and this has been Conovision,

the spirit of storytelling.

Until next time, remember, we
are all stories to be told.