State of Play

Stephen Haney has been quietly building design tools for years. Now he's betting that the canvas wants to talk to your agents.

Paper just shipped MCP support. I've been playing with it. It's wild.

We talked about why he thinks the future stack is just three tools, why his team canceled Figma four months ago, and what happens when your production site becomes your source of truth for design.

Get the UX Tools Newsletter (written by me)
Join 100,000+ designers for weekly insights on creative software and the people shaping it: https://uxtools.co

CHAPTERS:
00:00 - Everything changed in nine months
03:57 - Where the puck is going
05:08 - Figma's walled garden problem
06:18 - Agent + Code Review + Canvas
11:42 - Did agents kill collaboration?
15:18 - They canceled Figma 4 months ago
17:27 - What MCP actually means
21:03 - Live demo: production → canvas → code

LINKS:
Paper: https://paper.design
Stephen: https://x.com/stephenhaney

FOLLOW ME:
X / Twitter: https://x.com/designertom
Instagram: / itsdesignertom
LinkedIn: / tommygeoco

What is State of Play?

Conversations with designers, founders, and builders behind some of the best work


00:00:00.000 — 00:01:15.900 · Speaker 1
Stephen Haney has been building design tools for a while now: Radix, Modulz, and now Paper, which started as this beautiful graphics tool that teams like Korda were using to create these viral posters. We recorded a version of this conversation nine months ago, but so much has changed in the design tool space.

I decided to sit down with him last week to talk about it, and today Paper just shipped MCP support, which means that now your design canvas can talk to your AI tools: Claude Code, Cursor, or Codex, whatever you're using. You can now prompt an agent to go to your production website, pull the styles into the canvas, design something, and ship it back out to production in one tool.

It is quite remarkable, and we have a video coming out soon showing you a number of these AI-native workflows that we've been experimenting with. Stephen says the future of building is going to involve three tools: an agent, a code review tool, and a canvas. And he wants Paper to be the obvious choice for the third bucket.

We get into why he thinks Figma's walled garden approach is a liability, why spatial reasoning still matters even when you can prompt anything, and using the canvas as an interface to your agent. This is State of Play. Let's get into it.

00:01:23.060 — 00:01:45.860 · Speaker 1
You know, what's funny is for those who have kind of followed along with the shows that we produce, you and I spoke about nine months ago for an episode of this podcast, and we put a couple of those episodes on hold, and then I reached out and said, hey, man, we're about to put out your episode. I said, is there anything new that's happened?

And you're like, as a matter of fact, there is. Yeah.

00:01:45.900 — 00:03:57.350 · Speaker 2
I mean, so Paper is, uh, we're about a year and a half in now. Um, and it's going really well. I mean, we have a ton of production usage from, like, some of the best designers in the world. Uh, recently Korda. If you know the posters that always go viral on Twitter every time Korda drops one, they've started using Paper quite a bit for the new posters, which is just like, we're honored to be involved.

Stripe is starting to use it in some of the annual newsletter stuff that they publish, so we're going to see some of the Paper shaders and assets there. Some of these real tastemaker, leading designers in the world are starting to use Paper, both for the visual side, like graphics, and then also for the UI side.

And we really want to do both. It's an interesting problem for us. We're like, where does Paper fit into the world? If everyone's prompting in Claude Code, everyone's kind of shipping faster, does a tool that still cares a lot about craft and quality matter? And I firmly believe it does. I think, you know, sometimes you can prompt and get 80% of a job done in five minutes, and that's enough for some jobs.

I think there's a vast majority of jobs where the human touch and precision still matter, but the current canvas tools that we have are not moving at the speed of prompting, and they're not moving at the speed of, you know, Claude Code or what your developers are doing. And so that, I think, is an opportunity to make better tools.

And so we can say, you know, spatial reasoning still matters. Being able to alt-drag to clone things, see different designs right next to each other: still very important. Even just doing diagramming or brainstorming collaboration with humans on a canvas is important. But how do we then bring that up to speed with what developers are doing?

And so that's what I'm going to share today. We've done some pretty big things to connect Paper to any agent that you have. Um, this is a big point, actually: any agent, whatever, bring your own agent. We're not going to bake one in. We're going to let you pick as a professional. If you're using Codex or Cursor or whatever, you can integrate with Paper, and it just connects Paper to files, to data.

You can bring in your CSS tokens from your code base. You can prompt back from Paper to the code base. And it creates this full-circle design, dev, design, dev loop that I think is just super powerful, and it brings the canvas up to speed with, you know, what developers are doing.

00:03:57.350 — 00:05:08.890 · Speaker 1
I've gotten access to the MCP that Paper's rolling out. We've been playing with that. There's a lot of really interesting workflows in that, especially because we've become so AI-pilled over here using OpenClaw and all the things. Frankly, what's happened since Opus 4.5 is that coding, like strictly vibe coding, and I almost am moving away from that term now.

And I'll tell you why in a second. It is very production ready at this point. You still need eyes on, and now with 4.6 and Codex and everything that's happening, and obviously with a lot of the workflows with agents now that they can write code, how can we support that with some more manual control?

And now we're starting to see everybody getting into MCP. And context is king: how can I get access to the things already available to you? It kind of all started with that early exploration, with all the tools, whether it was Radix, whether it was Modulz, the things you've created before. Where do you think the puck's going, man?

I mean, in all of this, all the companies you've studied, how they're working with AI, what are you seeing?

00:05:08.930 — 00:06:16.709 · Speaker 2
No, I mean, the race is on, right? So I think there was a period of time where everyone was trying to figure out what their AI strategy was going to be, and you saw every company talking about AI, but maybe not in a very useful way. For design tools at least, um, you know, I think everyone is starting to see what's going to happen over the next few years.

And clearly your canvas needs to be connected. We're giant Figma fans, like, they've done a ton for the design industry, but I think traditionally they've been more of a walled garden approach, you know, where you have plugins and things. But Figma is a proprietary data format. They have their own rendering engine.

It's not web standards. And so it's actually just technically difficult for them to have an LLM write Figma designs or pull Figma designs back to code, because there's a translation step, and LLMs are pretty good at translating, but they're not perfect. And certainly when things get more complicated, it can be difficult.

I think the future of building will be: you're going to have an agent, probably multiple, you know, Claude Code, Cursor, or whatever harnesses you're using. You're going to have a code review tool, if you review your code. Not everyone does. But I think, you know, obviously in professional settings, you still probably want to,

00:06:18.430 — 00:07:10.560 · Speaker 2
uh, and then I think you need a canvas. I think you need a spatial relationship. You need to be able to, um, draw and interact, not just with your teammates, because, like, collaboration is still really important, multiplayer is still really important, but actually with your agent too. So if you think about it, like, right now we can only type to our agents.

Mostly. Some of us speak to them. Um, the canvas can become an interface in the same way that your IDE or your terminal is an interface. And I think for some jobs, typing or speaking to the agent is better. And then for other jobs, um, drawing or rearranging or sketching is a better way to actually prompt and get context into your agent.

I really see the future of building as: you're going to have these three tools. It's agent, code review, and a canvas. And so we want Paper to be the obvious choice, the only choice, in that new professional design stack, building stack, that everyone, you know, who's building, I think, will use. To be that agent interface where you can talk to your agent through Paper.

00:07:10.600 — 00:09:04.320 · Speaker 1
The insight to me has been this idea that vibe coding, which is about a year old now, is starting to feel less and less accurate, because on one hand, I've found myself, through the editor, being able to craft these really intricate interaction experiences, prompt after prompt, for hours, really tweaking every detail, not writing the code, but actually being able to get in there and be like, hey, I want it to look this way.

I want the motion to be this many milliseconds. I want these things to happen on hover. And to me, that's very hands-on prompting, right? I'm just not vibing. Whereas when I'm talking with one of my agents, an OpenClaw agent, something that was extremely cool with Paper was to be able to say, hey, we have a bunch of assets we need for our new tool Bender's website: a favicon, a thumbnail for the metadata, you know, the description when you post it on social media, a number of other tasks like that.

And I could tell it: we need these things, here's our brand guidelines, here's a number of things that you can pull from for reference. I wake up in the morning, and it has spun up these little boards all over Paper. Now, some of the ideas were great as one-shots and some of the ideas needed some love, but now I could open up a canvas and immediately start tweaking things and doing real, consequential work inside of something that's already started.

And that was really an eye-opening moment for me. Now, when I'm told that I'm among the 5% who are doing this, I want to ask you, because we hear about all of these major companies who mandate the use of AI, who talk about how they're AI native, right? I think it was Shopify that said you need to justify before you add headcount.

And I want to know from you someone who's been studying how these teams are working, what's the difference between how people say they're using AI and what's actually happening in design?

00:09:04.480 — 00:11:42.590 · Speaker 2
Yeah. Gosh, great question. There's about three things I wanted to talk about in there. Well, I think it's early days. Certainly with MCP, they're working on MCP 2 right now because MCP 1 is a Wild West a little bit. And if you're messing around with this stuff, you're an early adopter for sure, and you're probably figuring out things that kind of break, and then you need to fix them.

The workflows matter, though. The workflows are proven; when you have it working, the workflow is magic. And so over time, all those little problems, the MCP breaking, you have to open things in a certain order, or whatever it is, those things will get fixed. And this connected workflow, where you can just prompt, hey, go grab my assets from my production website, it's like you're summoning it onto the canvas.

Of course we need that. So we'll figure out the connections between things. And I think that agent layer, being able to have some intelligence when it's going out and pulling things for you, is just incredible. So yeah, I'm actually finding there's a lot you can prompt in your agent.

So say you're in Claude Code. You can say, go to my website, grab the styles, and put them on the canvas as a style guide. And then you can prompt more pages off of it. And honestly, it's great. I'm like, do you even need to maintain your design system, or do you just load your design system from your website, like, every time you need it?

Like, you know, we're on the verge of that being a possibility. And what's cool is it's not just a design system. It's the UI on your website, and it turns your production site, the things that users are actually interacting with, into your source of truth for design. Um, and I think that's really cool.
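For a concrete sense of what that style-guide pull involves, here is a rough Python sketch of the kind of token extraction an agent might perform on a site's CSS before drawing a style guide onto the canvas. The stylesheet text and token names are invented for illustration; this is not Paper's actual implementation.

```python
import re

# Hypothetical stylesheet text, standing in for CSS an agent might fetch
# from a production site.
css = """
:root {
  --color-primary: #635bff;
  --color-text: #0a2540;
  --space-2: 8px;
  --font-heading: "Sohne", sans-serif;
}
"""

# Collect CSS custom properties (design tokens) into a name -> value map.
tokens = dict(re.findall(r"(--[\w-]+)\s*:\s*([^;]+);", css))

# Group by prefix so colors, spacing, and type can land on separate boards.
colors = {name: value for name, value in tokens.items() if name.startswith("--color-")}

print(tokens["--color-primary"])  # -> #635bff
print(sorted(colors))             # -> ['--color-primary', '--color-text']
```

A real agent would then write these values back into the canvas as swatches and type samples; the point is only that "grab the styles" reduces to parsing standard CSS, with no proprietary format in between.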

And you see designers wanting that. They're taking screenshots of production and, like, drawing on top of them in Figma. And it's like, no, I think we can actually make it. It's almost hard to talk about because it's so magic to have everything connected. Another example of this: we're making a launch video for Paper Desktop, and I'll give you a demo of some of this stuff in a few minutes.

By the way, we can actually see it happening. We're making this launch video for Paper Desktop, and we're making a music player that's going to be the sample project. And one of the things we found that was really cool was, we're like, hey, go to Pitchfork and grab the top ten albums from this month and put that into our designs.

And it just, you know, works. And actually, Pitchfork happens to block agents, so that didn't work. But the agent is smart enough to go grab it from somewhere else instead; it looked for sites that aggregate Pitchfork. And so you're handing off this boilerplate work that you would otherwise copy and paste from Pitchfork's website.

That takes time and effort, and you should really be spending that on the creative aspects of your work. It's early days on how people are using this. And, um, I think the new connected canvas will really, for the first time, allow the design tool to be part of the building stack, and that's when AI is going to take off.

00:11:42.590 — 00:12:28.650 · Speaker 1
Well, and the crazy thing is, I mean, less than two years ago we were still trying to find a good solution for working with live data in static designs. That was kind of always the game: how do we get closer and closer to emulating a live data scenario? And then obviously, with a lot of these browser-based vibe coding tools and a lot of the tooling now, we can do so much more beyond that.

And I think you're the one who said we're back in local space, like, we really are operating on our machines, and we're pulling in all this context from the different tools that we use. Um, do you think it's accurate to say that AI broke the workflow that Figma, you know, got us all into?

00:12:28.970 — 00:14:44.220 · Speaker 2
Yeah. I mean, it's a question we're trying to figure out the answer to pretty quickly. I did say that. Um, I was surprised that designers moved to local space. I really thought that collaboration with humans was paramount. But I think the power of prompting against your actual code base, even for simple things, is like, hey, you can start to treat your code base as a source of truth.

The screens that your users are actually seeing, you can prompt based on them and get them out. Now, I don't think Figma is dead. I think a lot of people are still using Figma every single day, most people I talk to. You know, there are viral tweets: "I haven't touched Figma in three months."

Yeah, but I know a lot of people are still in Figma all the time, because you do need that ability to explore very quickly. And I think Figma put out a video this morning that was talking about the limitations of local space, which I thought were pretty accurate. You know, once you kind of know what you're doing, going into a prompt situation where you're dialing in a certain file and really narrowing down, I think, is really useful.

And stateful prototyping is great for handing off. But before you get there: exploring, deciding what to do, collaboration. But to answer your question, did agents kill collaboration with humans? I don't think so. I think we need both, and I just don't think we have both yet. And so that's why we're kind of tugging back and forth.

I just think we need a canvas that does both. And so that's what we're building: it's agent connected. It can see your local repo, if you want to do that. It can see the stateful prototypes you're making. You can prompt into the canvas. You can prompt back out of the canvas. And then, by the way, you have a Figma-plus, pro-level editor for direct manipulation and multiplayer.

So as you're doing this with the agent, you can have humans, you can have your PM or your CEO or whoever, and they can be watching you prompt if they want to. So we have the real multiplayer experience too. Everyone at Paper has coded forever, and we're all using agents, but we still use the canvas as a way to share ideas, to try 20 iterations of something right next to each other, and then to share that out and also make decisions.

It's really a tool for driving decisions within companies. Well, if you're watching this video today, it's available now, so go check it out, paper.design, and tell me what you think and tell us how to make it better. But I'm really not satisfied yet with this split between prompting and dev and design.

I think we really just need something that can, uh, connect and go across that boundary.

00:14:44.580 — 00:15:18.220 · Speaker 1
Yeah, I used to think I had to answer the question: am I going to be primarily a designer in a terminal or some sort of, you know, IDE, or am I going to be a designer in an infinite canvas? And I thought, oh wow, the world's going to go to a place where you're one or the other. And it's become increasingly clear that, no, it's going to be both.

And how do we best marry the two? Um, I want to get into the demo because I'm really excited about it. But before I do, I just have a question. The Paper team uses Paper to build Paper. Is that true?

00:15:18.260 — 00:15:20.220 · Speaker 2
Yes. Yeah, 100%. Yeah.

00:15:20.260 — 00:15:35.140 · Speaker 1
I find it really interesting because I think for a little while, earlier in the hype cycle, maybe two and a half, three years ago, people were claiming how great AI was. And then the question would be, oh, you know, are you using your AI product to build your product? And they'd be like,

00:15:35.140 — 00:15:35.660 · Speaker 2
Well.

00:15:35.780 — 00:15:56.680 · Speaker 1
a little bit, you know. And it's like, yeah, because the promise was still trying to be discovered. And we're hearing more and more now: the Claude Code team using Claude to write 95% of the code, and these other cases. So when you hear AI native, I'm an AI-native employee, I'm an AI-native designer, um, what is your definition of an AI-native designer?

00:15:57.200 — 00:16:03.840 · Speaker 2
To be honest, I haven't thought about it much. I think you just won't think about that in the future. You're just going to use the,

00:16:04.080 — 00:16:07.040 · Speaker 1
Like, a delineation of, I'm an AI designer versus I'm just a designer.

00:16:07.080 — 00:16:55.660 · Speaker 2
I'm just a designer. And you use the tools that are connected now, and that's better for everyone. And, um, I think it also provides opportunities to get more into code if you want to, for sure, and definitely more into shipping things to production, which is really cool. But the thing I'm most excited about is the connection between the tools, so that you can use the best tool for whatever task you're doing, and sometimes that'll be text and sometimes it'll be dragging.

Um, but yeah, so actually, again, we're big Figma fans, but it was a big moment for us: we canceled our Figma subscription three or four months ago now, because we had kept it around. We were kind of in both for a while. Design tools take forever to build, right? So we're still working on our pen tool, things like this.

But no, we're fully doing Paper in Paper now. And actually, I can show you, um, our upcoming website that'll be live if you're watching this, and I'll show you kind of how we built it in Paper, which I think is pretty interesting.

00:16:55.660 — 00:17:27.740 · Speaker 1
That would be fantastic. While you pull that up, for the people who are listening, Paper Desktop is launching with MCP today, I believe, on the release of this episode. Um, but for the people who don't know what MCP is, can you help us understand? What does it actually mean when a design tool can talk to your code, or your database, or your project management tool, Linear or whatever you use, through this MCP?

What is so special about that for people who haven't really used it?

00:17:27.780 — 00:21:03.290 · Speaker 2
I mean, MCP is just a way to have the tools talk to each other. Uh, Anthropic originally created the spec, and now it's become this kind of open spec that all the agents follow. And honestly, it needs work still. It's still a little rough around the edges, the MCP protocol itself, and they are working on this.

Uh, and so it's early days for it. But the power of connection: it basically lets your agents speak to your tools. And so Paper defines a bunch of tools that the agents can use. We define a tool called get screenshot, so the agent can take a screenshot of a certain layer. We have a tool called name layer,

so they can rename the layers. Um, kind of a unique thing to Paper compared to the other tools that are doing this is that Paper uses HTML and CSS to render, so everything is already browser native, browser standards. So our tool for the agent editing is called write HTML, because the agent can literally write HTML into Paper, and it just writes it out.

And so when it's pulling styles from your code base, it just writes CSS styles into Paper. And so there's no translation layer there. It's exactly what's in your code. Um, so yeah, MCP is just a connection mechanism to make that all possible. Uh, and it's still improving, still getting better.
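Under the hood, MCP messages are JSON-RPC 2.0, so a tool invocation is just a small JSON payload. As an illustration, the tool name and arguments below follow the pattern described here but are not Paper's published schema; an agent calling a write-HTML style tool might send something like:

```python
import json

# A minimal JSON-RPC 2.0 request of the kind an MCP client sends when an
# agent invokes a server-defined tool. Tool name and arguments are
# illustrative, not Paper's actual API.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "write_html",
        "arguments": {
            "layer": "style-guide",
            "html": "<h1 style='color: #635bff'>Headings</h1>",
        },
    },
}

wire = json.dumps(request)  # what actually travels between agent and tool

# The server answers with a result (or error) that echoes the request id.
response = {"jsonrpc": "2.0", "id": 1, "result": {"content": [{"type": "text", "text": "ok"}]}}

print(json.loads(wire)["method"])  # -> tools/call
```

Because the payload carries raw HTML and CSS, a browser-native canvas can apply it directly, which is the "no translation layer" point above.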

But I kind of view this as: your operating system is now intelligent. And what I mean by that is, when you have Claude Code open, you can ask it to make you a website. You can ask it to change your macOS settings. You can ask it to change your Bluetooth to a different codec. Uh, I used it the other day.

I had a long-running process I couldn't figure out how to kill, and I was like, I'll just ask Claude Code, and it figured out how to close it. And so if you start to think of it more open like that, Claude Code or Codex or whatever agent you're using can now talk to everything: it can talk to your operating system and your repo and your tooling.

Um, then you start to think about, how do I want to be able to prompt this thing? And that's, I think, where the canvas becomes a new way of prompting that we don't have yet. So I'll show you. So this is our new design for Paper. We have, like, a million things going on in here.

But, um, one of the things we did was, we were designing by hand. Designing by hand sounds funny, but we were designing by hand, and we were also prompting along the way. Um, Aigoo, who does our marketing work, was asking the agent: what do you think of this copy? How can you make this copy better? Can you generate a bunch more variations of this copy? And then, like you were talking about, the work becomes a little bit of review, or it's giving you inspiration to shoot off of.

We tried 20 different headlines, because we were trying to figure out how we talk about this. We're going to talk about it as a connected canvas; I really think the connection is the most important part of this. As we were building this in Paper with direct manipulation, we were having the agent build copy ideas for us off to the side.

And so Aigoo was literally dragging and dropping, looking at copy ideas, integrating them back together. You know, we have our Paper shaders, of course, the thing we're known for, so we have some shaders integrated in here. Having a starting place of a canvas where you can have these artboards and then do the direct manipulation.

And when he was done, he just prompted Claude: okay, take my artboard. He selected his artboard and said, okay, take this artboard and put it back into our code base using our code base conventions. And it was, like, 95% of the job done. Some of the SVGs he had to dial in, we had some crazy absolute positioning, but it was very close, and the copy and the assets were already carried over.

All that boilerplate work was saved. It's clear to me: this is the future. We're going to be doing this; this is how we're going to work in the future. Why would we do that stuff manually? There's no way. So here's Paper Desktop. I just have a blank file. And, um, I actually haven't thought a lot about our demo, but we'll just say,

00:21:04.410 — 00:21:19.370 · Speaker 2
can you go to stripe.com, grab the styles and visual look. Let's say we're working at Stripe and we need to launch a new mini site or some marketing announcement or something like this. Um, styles and visual look, and create a small,

00:21:20.410 — 00:21:24.689 · Speaker 2
uh, style guide in Paper. This is

00:21:25.690 — 00:21:28.910 · Speaker 2
for a demo, so you can keep it short.

00:21:29.230 — 00:21:33.390 · Speaker 1
And so for those who don't know, this is Claude Code you're using here on the left.

00:21:33.430 — 00:21:33.790 · Speaker 2
Thank you.

00:21:33.830 — 00:21:39.830 · Speaker 1
Yeah. And your MCP connection is already established between Claude Code and Paper.

00:21:40.070 — 00:22:18.110 · Speaker 2
Exactly. Yeah, and it's really easy to do that. Um, also, I have Codex open over here too; it's also connected. I have Cursor open; it's also connected. This is a big distinction we see from using the kind of turnkey tools, the Lovables or the v0s, where you don't need to wire anything together.

They just work. On the other hand, they make all the decisions for you about how the stack works. So we think professionals will want to use probably multiple agents. They'll want to use Codex, if only because each one jumps ahead of the other each month. So I don't know if you run into this, but, like, this month I'm using Codex,

last month I was using Claude Code.

00:22:18.150 — 00:22:19.310 · Speaker 1
Yes, yes.

00:22:19.350 — 00:23:00.360 · Speaker 2
So this lets you bring your own agent to a canvas interface, uh, which I think is really cool. Claude Code has gone out to the Stripe website, examined the styles, found the colors that are used, the spacing scale, the headlines. Now, if I was going to do this for real, I'd probably pull it out of my code base.

I wouldn't go to the website for the design system; I probably want exactly, you know, everything perfect. But for a demo, I think this is great. Or if you're just going to do something quickly, this is really, really good as a way to get your core assets into the design tool. Okay. So that was Claude Code; it pulled in our design system from stripe.com.

That's pretty cool. Let's ask it: uh, can you make a simple settings page to invite teammates,

00:23:01.000 — 00:23:02.880 · Speaker 3
Using this design system?

00:23:02.880 — 00:23:26.380 · Speaker 1
And one thing I do right here, by the way, is I like to create this style guide as it's done here, and then I'll also create a few layout templates, so it knows, like, these are some of our standard templates for different things. And now you have these sources of truth. And you point at the style guide,

you point at the template, and you say, you know, using your judgment, spin up a few options for a settings page.

00:23:26.660 — 00:24:30.240 · Speaker 2
Exactly. For the sake of the demo, I'm kind of skipping some steps here, and our quality may suffer for it, to be honest. But if you actually spend the time to get your system in place, uh, prompting off of it is just awesome. The other thing we're doing is we have a new tool that lets you grab live HTML off of your website and paste it into Paper, so you can go grab your actual settings page and then prompt, make me three variations, or whatever that job is that you're doing.
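A rough sketch of what "grab live HTML" can mean mechanically: pulling one element's markup out of a page by id, here using only Python's standard library. The page snippet and id are invented for the example, and real pages (void elements, scripts, duplicated ids) need more care than this handles.

```python
from html.parser import HTMLParser

class FragmentExtractor(HTMLParser):
    """Capture the raw markup of the first element with a matching id."""

    def __init__(self, target_id):
        super().__init__()
        self.target_id = target_id
        self.depth = 0        # nesting depth inside the captured element
        self.done = False     # stop after the first match closes
        self.parts = []

    def handle_starttag(self, tag, attrs):
        if self.depth:                                  # already inside: keep copying
            self.depth += 1
            self.parts.append(self.get_starttag_text())
        elif not self.done and dict(attrs).get("id") == self.target_id:
            self.depth = 1                              # found the target element
            self.parts.append(self.get_starttag_text())

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1
            self.parts.append(f"</{tag}>")
            if self.depth == 0:
                self.done = True

    def handle_data(self, data):
        if self.depth:
            self.parts.append(data)

# Invented page snippet standing in for live production HTML.
page = '<body><div id="settings"><h2>Invite teammates</h2></div><footer>...</footer></body>'

parser = FragmentExtractor("settings")
parser.feed(page)
fragment = "".join(parser.parts)
print(fragment)  # -> <div id="settings"><h2>Invite teammates</h2></div>
```

The extracted fragment is plain HTML, so a browser-native canvas can paste it in as an editable layer rather than a screenshot.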

Um, it's pretty cool, man, the amount of work you can save. And Paper is not excited about agents doing design work for you, right? There are a lot of players that are kind of trying to replace designers, and that's not our job at all. In fact, even this demo is a little contrived. Um, we think that we can save designers a tremendous amount of boilerplate work.

This design-to-dev loop, I think, is just so cool: we can pull stuff from code, we can pull stuff from production, change it. In fact, I'll even show you, as this is working, I can just clone it, you know, which is so fun. And then I can just start changing stuff, say, hey Tommy, um,

00:24:31.640 — 00:26:16.690 · Speaker 2
and, you know, the agent can keep working. Maybe it was already done, but they can keep working, the two at the same time. And you can have multiple agents working on multiple things at the same time. So then you start to think about your humans designing next to your agents. Uh, very, very cool. Um, and again, it's not like this is the most amazing design in the world.

And if we spent more time on the prompt, we could get it there. But it's more that now I can change it, you know? Now it's in a design tool. I can drag out just this card and work on just this card. Um, and all the direct manipulation benefits and sharing benefits that come with that. Um, so the way we're talking about this is like a continuous loop from Paper to code and back.

And that's because when I'm done, let's say I love this card, I can actually prompt to pull it back into my code base. Or maybe this came from my code base in the first place, and I dial it in visually, and then I prompt to bring it back into my code base. And so that's what we did with the actual Paper website.

Um, and it's just, yeah, it's pretty night and day compared to what we've had before. Designers ask me all the time about Google Docs integration, because a lot of times you get handed, like, a Google Sheet with a bunch of information in it, and then you're using some plugin that only half works, or you're copying and pasting.

Um, and so now, using agents to do that work for you, to pull that into designs, is just effortless. Uh, and to pull real content, and again, you know, go search the web to find news stories from today. Um, things that don't have plugins, things that don't have structured data, you can now prompt and pull into your designs. Go grab assets, go grab images, that kind of thing.

I'm pretty excited about this, too. And the last big story we talk about is handing off that boilerplate work and letting you focus more on the human touch, the human decisions.

00:26:16.690 — 00:27:39.770 · Speaker 1
The way I think about this is really interesting to me. Go back to the table here, which, by the way, for a quick demo, spun up in a matter of minutes. There are certain things I don't want to recreate again. I've created so many data tables in my life, with headers and actions and filters and search and rows and secondary data. There are so many things you can do, and we know what works and what doesn't work about a table.

The part that gets interesting, and that requires a human to get involved, is when your table has to start addressing edge cases, or when you really need to get in there and tweak stuff that doesn't meet brand guidelines. That's where I want to be living.

And then I can focus on a really interesting growth component, where I can drive better conversions and have some more fun. So I really like that part of it. Now I have a question, because I've used the demo version of the MCP, and one of the things I'm having a lot of fun with in my free time, now that I have more of it because of AI, is spending more time on micro-interactions: things where a lot of movement and motion and human interactivity can move the interface around.

How does Paper capture interactivity like that? Is that something that you guys are thinking about?

00:27:39.810 — 00:28:48.910 · Speaker 2
We kind of see it right now as: bring this back into Claude and ask it to create a prototype for you. We see the canvas as an interface to that. When you want to draw something, this is the best way to do it; when that's the best tool for the job, use this. And then Claude, or your agent in general, Codex, becomes your operating system that connects these things together.

And so if you want to use a tool like Motion, for instance, the animation library, the agents are really good at that. So you can say: grab this, and I don't know if this is the best example for motion, necessarily, but grab this, wire it up as a prototype, and now you're off and running.

And then if, while you're making your prototype, you need to make more visual changes, just come to the canvas, make the visual change, and ask the agent to pull it back into the prototype. I think we will go after stateful prototyping eventually, but right now I'm a little more interested in the fact that the agents are already really good at that.

And let's make this tool that lets you describe the visual changes to them a little bit more.

00:28:48.910 — 00:29:18.090 · Speaker 1
So I've got an idea. Here's what I'd like to see, because this is a practical use case that used to drive me nuts. We have a basic table, right? In a couple of minutes, using our style guide, we spun up a basic table. Fantastic. Now let's get to the stuff I really have to bang my head against, right?

I want to ask it to generate some variations of a nested row. I want to open a table with progressive disclosure. What are a couple of options? Like, create a nested row for me.

00:29:18.090 — 00:29:30.130 · Speaker 2
So now we're switching over to Codex, which is OpenAI's agent. We're using 5.3; I've actually heard 5.3 is really good at design, as far as agents go. We'll see.

00:29:30.930 — 00:29:37.330 · Speaker 1
Like, I've found that it works really well when given a very good reference or set of guidelines.

00:29:37.530 — 00:29:40.370 · Speaker 2
Um, okay. We'll see. So whereas like.

00:29:40.410 — 00:30:19.380 · Speaker 1
Opus, I've found, can freeform a little bit better. Because think about all the tedious things. You create this beautiful table, and it matches all the requirements; there are a lot of things here that make a good base. But then you've got to do all the other things.

Okay, well now we need three versions of table density: we want a minimal one, we want a high-detail one, we want something that's just comfortable, right? Then, when you think about spacing, maybe I need variations for nested children, I need inline actions, I need bulk actions.

And it's like you could kind of spin this stuff up and then spend your afternoon actually making sure it gets into the details the way that you intend it to.

00:30:19.900 — 00:31:28.720 · Speaker 2
Well, we're doing this live. I think if I was doing this for real, I would ask it to make ten variations, and I would go do another task, or go get lunch or whatever. I think that's the beauty of it: coming back to ten variations that you can then curate,

get inspiration from, and dial in. Certainly the agents will get better over time at designing, but I still want to be the human in the driver's seat. These are just ways to skip work that is kind of repetitive. The other big one is: if you already have your design system established, do you really want to create mobile copies of everything?

Do you really want to create the dark mode copy of everything? That's just the manual work of mapping to tokens, and the agents are great at that. In fact, we can even ask it; we don't have a dark mode established, and this is a very bad version of the Stripe design system, sorry, Stripe, for demo purposes. But the agent is really great at creating variations with different padding, for mobile, for dark mode: things where your system is established and it can just map those on. They really do well.

They really do really well. So why don't we do this to we'll have a little shoot off. We'll give the same prompt over to cloud code and we'll see.

00:31:29.840 — 00:31:31.080 · Speaker 2
We'll see who wins.

00:31:31.840 — 00:31:58.280 · Speaker 1
And if you've ever run a crazy eights exercise before you go into the meaty design work, this is in place of that. And I would argue it's a much stronger type of crazy eights exercise: we create eight different variations of an idea, and then you pick a couple that you think are worth exploring further.

That's exactly what this is. And to me, this is a very, very helpful way that I've been working.

00:31:58.280 — 00:32:17.340 · Speaker 2
And I think canvas tools are always for exploration, always for discovery of the problem space. Right. And that's what this is. We've found cases where you can actually just take a screenshot and put it in here and be like, hey, I want to make this more approachable, use my design system and redesign it.

And you get these editable layers. Again, I think I'm going to like Claude Code's results better already.

00:32:18.820 — 00:33:32.550 · Speaker 2
Again, once you find something you like, you clone it, you edit it, you have that direct manipulation, you have the full professional design tool. And then, by the way, everything is already code. So whether you ask the agent to grab it out or you just copy it as code, in Paper, this is the exact code that's running right here.

So there's nothing lost in translation. Things like tabular numbers, OpenType font settings, variable fonts: these are the things that always get lost when devs are building from designs. Here there's no translation layer, so whatever you're looking at right here, that's what you get out.

And that's what you can ask the agent to get out, or copy out yourself. In this particular case, I feel like Claude Code did the best job, but that's kind of the point: we don't want to be agent-specific, and in truth, you're probably going to use multiple agents. I use Claude Code and Cursor depending on what I'm doing, and both can talk to Paper at the same time, as we're looking at.

And I just think that's so cool. You get to know the agents over time: what Cursor is a little bit better at, and what Claude Code is a little bit better at. So you can even use that within Paper.

00:33:32.550 — 00:34:00.510 · Speaker 1
So, to come back to this and see it now. Here's the question: when you think about plugins, right, Sketch and Figma, I think one of the earlier big moves for both of them was these robust plugin marketplaces and supporting them. How do you think about plugins? Do you think about skills? What do you think about when you say, hey, we want this prepackaged functionality for users that other people can create?

00:34:00.550 — 00:34:19.409 · Speaker 2
Yeah, it saves a lot of work for us, doesn't it? Because we don't need to go create a Google Docs plugin or whatever. Even if you want to move your tokens from Figma: if you have the Figma MCP, just ask the agent to move your tokens from Figma to Paper. And the Figma MCP notoriously

00:34:20.929 — 00:35:51.270 · Speaker 2
suffers from some quality issues, or consistency issues, but it's really good at moving tokens. And so that's a use case we've already seen people using: just moving their stuff from one design tool to a different design tool. It saves us a tremendous amount of work in terms of that.
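For context on the mechanics being described here: agents like Claude Code discover an MCP server, and the tools it exposes, through a small config file. As a rough sketch, assuming a hypothetical `paper-mcp` launch command (the actual command and server name for Paper's or Figma's server may differ), a project-scoped `.mcp.json` for Claude Code would look something like this:

```json
{
  "mcpServers": {
    "paper": {
      "command": "paper-mcp",
      "args": ["--stdio"]
    }
  }
}
```

Once registered, the agent lists whatever tools the server exposes (for example, reading or writing canvas elements) and can call them alongside its usual file and shell tools, which is what makes the token-moving workflow above a single prompt.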

But what's so cool about that is just the possibility space. I don't understand the implications of this yet. We were playing around with a design system the other day, and here's part of their annual newsletter. I pasted it in like this; it has formatting issues and everything.

I grabbed it from a PDF, and I was like, hey, just grab this style guide, which, again, is a demo style guide, and apply it to this text. And this is what it came up with. You see maybe some contrast issues you would clean up; that's not a big deal. But the agent knew enough to understand that this was probably for an annual letter, even though it never said that.

I never said that. And it put it in this format, the block quote, whatever. So for data viz, I think you start to be able to just dump data at it, with an established style guide, and create data viz, create newsletters, that kind of stuff. It's so good at it, and the consistency is good enough.

But this is just one thing. I'm excited to see people doing research on the canvas. Our brand designer has been pasting these essays into Paper and then asking the agent to generate variations of the essay, or to go do research and prove out the things in the essay. So again, it's becoming this interface to the agent.

It's not just a design tool anymore.

00:35:51.310 — 00:37:08.510 · Speaker 1
Well, and then you think about it. You say, hey, I have a PRD that's located over here, let's say in Confluence, right? And let's say I'm using a research tool like Dovetail, or maybe in-flight, and they have an MCP I can pull from. And suddenly, when you use the phrase, and it's the first time I've heard this, I really like the messaging: connected canvas.

And you can say: we know that this bucket of research coming from in-flight pertains to this particular workspace in Paper, and you can start to map it and associate it almost automatically. And you wake up tomorrow, and your research team just kicked off some research; that research came in and was synthesized.

And you wake up tomorrow, and in your Paper canvas there's a V2 starting point mapped out from your design system, the PRD your product manager created, and all the feedback you got from one of the working group sessions yesterday. You immediately dive in and start tweaking and smoothing out the edges of the things it already created off that research. That opens up a whole new world of possibilities and, frankly, more ways I want to spend my time than, you know, order-taking on tiny minutiae.

00:37:09.510 — 00:38:07.760 · Speaker 2
Yeah. Moving data around is so time consuming, and the agents can just do it for you, right? The tools just need to support it, and they need to support it with accuracy: you need to be able to trust that the work it's doing is accurate and well done.

And so a lot of our work goes into making sure Paper is really intuitive for agents to use, so that the agents don't stumble, they don't hallucinate when they're working with Paper. You're going to get the most accurate, powerful agents on any design canvas, because we spend a lot of time making sure it's really intuitive for them.

I'm curious to see what people will come up with. Contra always holds contests for new design tools, and I think we're going to do one with them about what you can do with this. What's the coolest use of this new technology? Is it research? Obviously there are the design implications inside of product teams, which is great.

But like, what else can we do with this more connected tooling?

00:38:08.240 — 00:38:41.560 · Speaker 1
I would love to, if you do that with Ben, I would love to be a judge on that. I've judged a few of those, and what I'd be interested in seeing is, yes, obviously I want to see the output, but I also want to see you lift the veil. I want to see what the connectedness looks like and how you pulled things in, because that's the part I think a lot of people are interested in understanding: how should I think about this?

Because it's not just, oh, the tool can do a thing. It's, oh, I can reframe how I think about the tools so that I understand what's possible.

00:38:41.600 — 00:40:09.920 · Speaker 2
I think you're going to change the tools that you use for jobs like that. You will start to remap, and we're seeing this on our team already, where we're like, wait, we could use the agent in the canvas for that, and it would save a day of work, or it would let us explore. Actually, that was the thing we were trying to decide: how do we message this?

Is it about saving time? Because Paper is not a brand that's about going as fast as possible and shipping slop. In fact, we're talking about this as the anti-slop workflow, because you have the control; the human is still in control. I think it's letting you explore more. It's giving you time to explore the problem space, to really map out the options and make better decisions.

And there's definitely a time-saving component to that; you can move faster with this flow, for sure. But then it's: what are you going to spend that extra time on, and how does it let you build better things? Because the Paper mission is to help the most talented people in the world do their best work, so that we can have more beautifully designed things in the world.

And that's why Paper exists as a company. So we wanted to make sure the way we fit into this AI workflow encourages high-quality work and lets you spend less time on the boilerplate and more time on making something that's truly great and stands out in the world. So, you know, I hope we were able to achieve that.

And like I said, it's available today if you're watching this video. So head to paper.design, try out Paper Desktop, and please let me know what you build, because I'm so excited to see what people come up with.

00:40:10.120 — 00:41:43.770 · Speaker 1
So I made a tweet last week, I think, or the week before, right when I got access to this, and I said: the question I'm now asking all of my friends who are building design tools, literally, I sent texts to everybody, is, hey, do you have an MCP yet? Because when I saw what was possible with this, I had a moment where it just really clicked with me. And I thought about how many of my tools, design tools like Maybin, for example. I was talking with them, and they're working on something internally right now where you can pull referential data from a lot of these tried-and-true workflows. Because it's great that we can screenshot stuff, but I wish AI could read moments over a timeline.

Like, I clicked on a thing, I hovered over a thing for five seconds, which triggered an event. I want that type of fidelity to be captured and referenced so we can do things with it. All that to say, I have a video coming out pretty soon where we dig into the workflows we've been experimenting with in Paper, because there's truly a lot this enables that I don't think people have thought about yet.

Now, when Shopify says, we are including your usage of AI as part of your performance review, I don't really know what we're indexing on. What is a good use of AI? Is it speed? How would you review someone's performance based on their AI usage, if you had to?

00:41:43.810 — 00:42:00.030 · Speaker 2
Already within our team, you don't think of yourself as using AI. Even in our messaging, we talk about agents, and AI is a weird word. Nobody thinks of themselves as an agent user. I don't know, do you think of yourself as a user of agents? I don't, really.

00:42:01.310 — 00:42:05.270 · Speaker 1
Yeah, that's a hard one. I mean, you're asking me at a weird time,

00:42:06.510 — 00:42:09.110 · Speaker 1
but like, it's a weird thing to say out loud.

00:42:09.150 — 00:42:57.970 · Speaker 2
Yeah, totally. Totally. But I certainly use Claude Code, right? And I certainly use Cursor, and I certainly use Codex and opencode. It just disappears really fast when the tools are just doing the job. It's all about the job you're trying to accomplish and how the tools help you get there. So yeah, it's cool to talk about AI right now, but I don't think it's going to be a differentiator in five years.

It's just how the tools will work. And the connectedness is the key, you know? Design tools used to talk a lot about collaboration, and I think that's still very important; that's why we're bringing that part, too. But it's connected collaboration: humans, agents, data, code, all together in one place, all speaking to each other.

That's really cool. And yeah, AI helps us get there. And I hope that Shopify would count this as AI in their performance reviews.

00:42:58.890 — 00:43:33.010 · Speaker 1
I think for people like me, currently hands dirty, head in the car engine, it's easy to say, oh, I'm a user of agents, because I'm fixing all the things they regularly break. Yes. But I do think that a few layers up, people aren't going to think about it that way.

Well, the last question I have for you. Man, this has been fantastic, and I truly am very excited about the future of Paper. I know you just launched this feature, so I hate to ask you this question: what does this allow you to do next? What are you thinking about?

00:43:33.050 — 00:43:44.850 · Speaker 2
Oh yeah. I mean, this is really cool. One of the reasons we did desktop is that there's no latency to the agent. So the agent can be like, give me a screenshot, and Paper Desktop is like, here you go, a millisecond later.

00:43:45.930 — 00:44:47.110 · Speaker 2
But it requires you to have Paper open while you're working. And so when you start talking about those use cases of coming back to all kinds of things to review, I think doing a cloud version of this makes total sense, because then you can prompt it from your phone or whatever. You don't need to have Paper open.

Now, you're going to come back and want to do the direct editing, so it needs to show up here when you come back. But a cloud version of this, for sure. And then from there, we're still building a pen tool, so some of the core basic design tool stuff is still coming. The pen tool should land within a month or two.

We still care a lot about the graphics side of the business. For the Korda team, we just shipped CMYK. And we just shipped this new thing. Oh, this is going to be crazy. With our shaders now, there's this little eyedropper, and you can feed any element inside of Paper into the shader.

And so this is kind of a strange use case to use UI for, but it's really powerful. You can start to chain shaders together. Let me grab, like, a glass. We'll grab a glass effect and I don't.

00:44:47.110 — 00:44:51.150 · Speaker 1
know. But, you just eyedropped the component?

00:44:51.930 — 00:45:49.410 · Speaker 2
You just add anything, any element in Paper: you hit the eyedropper, you click on it, and it pulls it into the shader. Wow. And then you can just go to town doing whatever. That's actually kind of cool; I like this already. So, this Paper Desktop, the MCP, is a lot about UI.

But we care a lot about the graphics side of the business too, and we're shipping a bunch more shaders and new things, new capabilities with this. This is so fun because it lets you experiment. We started from this table, this business-looking table, and in a few clicks we're into this cool, abstract territory where you can start doing kind of crazy visuals.

So I'm excited about that, too. But yeah, we only plan in six-week chunks, so I'm really excited to see the response to Paper Desktop, and then we'll see what's the most useful thing we can do next in this space. But I believe in this a lot. I think this is going to be maybe our biggest launch yet in terms of impact.

00:45:49.450 — 00:46:49.780 · Speaker 1
I asked Stephen, what does AI native designer even mean? And he basically said, it won't mean anything. In a few years you'll just be a designer. All the tools will be connected. You're not going to be thinking about whether you're using AI. You'll just pick the best tool for the job. Sometimes that's typing, sometimes that's dragging.

And that reframe hit me, because right now we're all caught up in: are you AI-native, do you vibe code, all these identity questions. But Stephen's betting that all of that is going to disappear, and all that's going to remain is: can you ship? And do you have the tools that let you move between prompting and direct manipulation?

And he says Paper's team canceled their Figma subscription four months ago. They're designing Paper with Paper, and Korda is using it, Stripe is using it. And the MCP integration now means your canvas isn't a dead end; it's just a node in your workflow. If you want to try it, and I do think you should, Paper Desktop is live today.

The link is in the description. Stay curious and I'll see you next time.