Matt from Netlify, Arize conference update, Mastra Speed Run, and AI News with Tyler
Abhi gives us an update from the Arize conf last week, we do a Mastra speed run, we talk some AI news, and Matt from Netlify joins us.
Episode Transcript
actually a couple months I think it's been like almost two months since we've been in person now we're chilling so good to be back drinking some Celsius yep they're sponsored by no one actually but there's these guys from this company called TRDA in YC and they don't like Celsius so we're drinking
this on purpose yeah maybe they're watching uh we have a pretty jam-packed show today we're probably going to go a little over an hour we're going to learn a little bit about some of the stuff that Abhi did last week at the Arize conference we're going to see a little bit of a Mastra speedrun so
if you haven't seen some of the stuff we've been shipping we're going to give you kind of the really quick this is all the stuff that Mastra can do because I think it's kind of cool to check in every once in a while and see all the things that we've been doing over the last three months we're going to bring Tyler on from the Mastra team and talk some
AI news then we're going to have a special guest we're going to have Matt from Netlify come on we're going to talk about their agent week we're going to talk about AX all kinds of stuff also one last plug for y'all we have a second edition of the principles of building AI agents it has about like 50 more pages in it cuz we
learned some since the first edition so if you haven't gotten it or read it yet it's at mastra.ai/book yeah I'll put that up on the screen maybe there it is but yeah so without further ado I guess I'll get started so tell us what was the Arize conference last week and who was there what were they talking about and why
were you why did they invite you dude honestly I still don't know why they invited me but why the hell were you there hey I already got a chat from Alex Booker what up dude hey dude we know you um yeah the Matt yeah the Matt he's Yeah you saw him on Friday too yeah I had drinks with him on Friday WorkOS had like a startup happy hour um
and I probably drank quite a few beers it was great we gave out quite a few books there too and met a lot of you so that was awesome uh but yeah the Arize conference was last Wednesday um if you all don't know Arize is an observability and evals platform I think they started like 2022 2023 or something and they throw this conference every year called Arize Observe i think there were 700
attendees it felt really crowded in there at times um and they had talks from people from LlamaIndex, CrewAI, Letta, you name it they had some talks there from Anthropic etc Salesforce was there that was cool like I think Agentforce has some opinions on evals and stuff and then there were a lot of other startup heads there like people who were trying to do their own companies a lot
of YC companies doing evals um and so I think like three major things I kind of took away so the keynote was good and the whole event was tastefully done so like if you're trying to throw an event in SF that has this many people you got to come correct on the venue and the food and the drinks because all these events are setting the bar higher uh which means it
costs more money to throw these events for sure and then this one had like cool t-shirts you could print like it's just a total thing you know because you're paying so much for the ticket I think so yeah so that's one two and I think that's why we got invited is Arize themselves their TypeScript numbers are going up and they haven't really focused
on TypeScript neither has anyone right doing this stuff and they're noticing TypeScript adoption going up so who do they call the TypeScript people us and um you know Sam was going to go he couldn't go so I went instead i was on a frameworks panel me uh Letta, LlamaIndex and CrewAI and there was this article that they wrote about it apparently it was like a spicy takes only thing and like
they started the panel saying like "Oh what are your spicy takes on frameworks etc." You guys should read the article cuz I think if you read the article I won the panel you know like honestly like read the thing cuz I totally won honestly um I shouldn't be surprised that you had the spiciest takes that's true i did say LangChain sucks like once you know only
once though so that wasn't too bad but it definitely started the conversation and let people feel more relaxed to speak their mind because I already like blurted out the thing that you probably shouldn't say right so um but I think notably I talked mad about A2A even though we support it and I talked
mad about multi-language frameworks and how they don't work which is kind of an opinion of ours also had this dope line where I'm like what's the uh JavaScript version of Ruby on Rails it's called Next.js and it's not the same framework you know so that was a sick line um so
yeah that's cool so then I had a bunch of people talk to me about AI and JavaScript and so that's just a sign yeah i mean a sign we're doing the right thing yeah well I mean so unrelated to that the next YC batch starts I think tomorrow and the thing that was interesting about when we were in YC which was the winter batch there were I would say like 25 to 33% like a quarter
to a third of the people building in AI were using JavaScript and TypeScript we heard from a company in the last batch that said they estimated it was 50% to two-thirds of the people using JavaScript for their AI agents so I'm curious if that's going to keep going up i imagine the trend is in the direction of more
and more companies just choosing to use JavaScript or TypeScript because they can just if they know it they're comfortable with it they're shipping their frontends in JavaScript and TypeScript why not ship their agents dude the the number of JavaScript in our batch was low you know and just one batch later they're like
so we're building the right thing y'all let me show you what we're building how's that yeah so to set the stage a little bit on top of the panel you also gave a 20-minute presentation i did yes oh I forgot about that i did i called it Building Agents in TypeScript but the problem was it was the talk right before lunch and so
actually it was a good thing because it created this aura of like people who missed it because they were looking at something else and they were like "Oh I wish I could have seen that talk like I missed it because I wanted to see Agentforce." Don't go for Agentforce when you can come see your boy okay um so I'm going to show the demo I did there
we're not going to do all of it just some things okay yeah we only got Yeah we got like you got to give like the 10-minute version because then uh Tyler's going to be here and we're going to talk some so you got So I'll talk MCP server client server sure networks and that's it or workflows yeah we'll do the do the first two and if we have time if we're waiting on Tyler we'll we'll do workflows okay cool
so I'm going to introduce two features that already exist um first is MCP support that we have i think not many people know that you can host MCP servers through Mastra so I'm going to show that um I built something kind of dumb but it works and also you know let me try to fix the camera here um this is live so whether you're watching on LinkedIn YouTube X you can actually just
comment on this if you have questions just shout them out oh my bad shout out the questions and we'll try to answer as many as we can along the way yeah we're just here to have a good time talk about uh AI and JavaScript and TypeScript and AI news all right let's see the demo let's do it entire screen uh okay cool let's start with the code
so in Mastra we can do MCP server hosting um well I mean you can build your MCP server we have a primitive called MCPServer and um once you create the MCP server you can fill it with tools so for example here I just made my own Tailwind docs tool i struggle writing Tailwind in general and whenever I try to vibe code it too something always goes wrong you
know I don't know why but uh so I just made this like dumb docs tool for myself and yeah so I have this docs tool I passed the docs tool in and now I have a docs server I could also pass agents into here so we can just do that for fun we'll see how it goes I have this like git buddy agent that just fetches GitHub through a GitHub MCP so
I'm going to add that to my MCP server now you can run mastra dev i'm assuming people know Mastra via mastra dev which is the playground here it's initializing our MCP servers and stuff if I go to my playground you can go to MCP servers and you can see that our docs server is already ready to go.
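for reference, a minimal sketch of what that setup looks like in code; the tool and agent here are simplified stand-ins for the ones in the demo, and exact option names may differ slightly from the current Mastra API:

```ts
import { MCPServer } from "@mastra/mcp";
import { createTool } from "@mastra/core/tools";
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

// hypothetical docs tool, standing in for the Tailwind docs tool from the demo
const getTailwindDocs = createTool({
  id: "get-tailwind-docs",
  description: "Fetch Tailwind documentation for a topic",
  inputSchema: z.object({ topic: z.string() }),
  execute: async ({ context }) => {
    // in the real demo this would fetch or search the Tailwind docs
    return { docs: `Docs for ${context.topic}...` };
  },
});

// hypothetical agent, standing in for the git buddy agent
const gitBuddy = new Agent({
  name: "git-buddy",
  instructions: "You answer questions about GitHub repositories.",
  model: openai("gpt-4o-mini"),
});

// expose both over MCP; agents get surfaced as ask_<agentName> tools
export const docsServer = new MCPServer({
  name: "docs-server",
  version: "1.0.0",
  tools: { getTailwindDocs },
  agents: { gitBuddy },
});
```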
you can test these tools and agents become an ask_<agentName> tool so now that agent is part of the stack which is dope and then you know this is the tool that I made you can also add workflows to the MCP server as well i'm not going to show that but you can and then you can get your different URLs for Cursor and stuff uh I want to
use the HTTP stream URL so I have that um I already have that in my Cursor so I'll go to settings, MCP, not that MCP, and here's my docs server i'm just going to restart it and you can see now that there's two tools enabled um I have the get tailwind docs and ask git buddy and all I did to develop this MCP server and start testing it is I just put my HTTP stream URL in the Cursor config and I'm off to the races.
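for anyone following along, the Cursor side of that is just an entry in its MCP config pointing at the URL the playground gives you; the path below is a placeholder, since the real HTTP stream URL comes from your local Mastra dev server:

```json
// .cursor/mcp.json -- placeholder URL; copy the real HTTP stream URL from the Mastra playground
{
  "mcpServers": {
    "docs-server": {
      "url": "http://localhost:4111/api/mcp/docsServer/mcp"
    }
  }
}
```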
so let's test it out i was actually already testing this so let's try this again um let's see like ask my git buddy how many stars mastra-ai/mastra has and print a console log in this file i don't know what's going to happen but we'll see so run the tool it's calling it let's see what happens okay i don't know why I did that searching codebase for console log man every time I like demo some like Cursor type thing I always get into these stupid freaking loops because now it's asking me to run it again except right I bet you it's gonna go in an infinite loop because it doesn't know what console log to write
each time it gets the data anyway I'm not going to go through this unfortunately I'll just do the other demo um I need to understand we'll work out the kinks um need to understand more about tailwind flexbox please refer to my tool Bad request do it again maybe the previous thing wasn't my fault at all i don't know oh I have to run this so here it's using like my Tailwind
docs tool to get all the docs and then figure out which one and then go follow it so anyway that's cool um yeah so that's docs that's the MCP server you could do your own thing right like whatever you want to do so we do have a question in the chat not related to this but it's kind of a two-parter wish
there was better support for deploying Mastra apps on Wrangler-based Cloudflare Workers which is pretty cool because that's our goal this week is to improve Cloudflare support on the open source team we did fix some things recently so I don't know when you tried it but yeah we are actively working on that because yeah we hear you uh Sadvic yeah it's
definitely uh something else we've heard from our users that we need to make a little easier so we are hopefully going to be doing that this week yeah okay all right next feature and last feature we'll discuss is agent networks we just released that last week I believe it's in latest in Mastra but it's an experimental
uh vNext kind of thing vNext for us is where we put our code before we go to production and replace and deprecate old stuff so I mean not many people were using the old agent network and if you were it was experimental from the beginning so if you're upset that we're about to break it sorry bro like you know your mileage may vary we had warnings right but
I'm really excited about this new version because it works pretty well but there's still a lot to test and everything so let me show you that and we're going to be having a workshop with Tony; Tony on the team is the one who's been spending a lot of time working on this over the last couple weeks uh we're going to have that workshop
next Thursday I think so if you you know go to the Mastra site you can see and join that workshop if you want a deeper dive we'll give you the cursory overview but if you want a deep dive that workshop is the place to be yeah so agent network is our word for a multi-agent system the
reason we call it a network though is it's not just agents that are involved with this network for example I've taken in my example app here I have all my agents i have a git buddy a hacker news i called them buddies for this demo for some reason i guess it was like more like disarming if I came in with the buddies you know um but this is Skynet
under the hood if you know um Ghibli, I have a Ghibli agent i have like some RAG agent but I also have workflows and I also have memory i could have passed tools into here too um so what the network is is you have a routing agent that takes the prompt coming in and then assesses what
to do right and there are two modes for a network one is where you're just playing quarterback and you're trying to get a first down uh that that's the way I like to describe it where the the routing agent's job is to figure out who to throw the ball to and then you get the first down and then the you do another message right you're talking
with the network to then accomplish a task the second is what we call looping which is you give it an arbitrary task and you just leave it alone until it's done or it thinks it's done with the limitations of iteration steps and all that type of stuff right like there are guardrails around this um but this
is more like the full agentic spectrum right like here you go good luck I'll see you later type of thing i'm just going to show the quarterback version because that's easy to express in UI right now we're figuring out the other stuff.
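to make that concrete, here is a rough sketch of wiring up a network like the one in the demo; the class name, import path, and options are a best guess at the experimental vNext API and may not match it exactly:

```ts
import { NewAgentNetwork } from "@mastra/core/network/vNext";
import { Memory } from "@mastra/memory";
import { openai } from "@ai-sdk/openai";

// ghibliAgent, gitBuddy, and activityPlannerWorkflow are assumed to be defined
// elsewhere, like the agents and workflow shown in the playground demo
import { ghibliAgent, gitBuddy } from "./agents";
import { activityPlannerWorkflow } from "./workflows";

export const buddyNetwork = new NewAgentNetwork({
  id: "buddy-network",
  name: "Buddy Network",
  // the routing agent's model: it decides which primitive to hand the prompt to
  model: openai("gpt-4o"),
  instructions: "Route each request to the best agent or workflow.",
  agents: { ghibliAgent, gitBuddy },
  workflows: { activityPlannerWorkflow },
  memory: new Memory(),
});

// "quarterback" mode: one routing decision per message, e.g.
//   await buddyNetwork.generate("What Ghibli films should I watch?");
// "loop" mode: keep going until the task looks done, bounded by max iterations, e.g.
//   await buddyNetwork.loop("Plan an activity in São Paulo on June 30th 2025");
```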
but yeah cool i'm going to go to my networks here's my agent network i'll just ask it um what Ghibli films should I watch and so it starts thinking and I'll show what's happening in this thinking bubble but in our playground you can see it used the Ghibli agent and um yeah so it's still streaming and I love Ghibli movies dude i have it tattooed right here like the No Face
from Spirited Away so like this was a great demo for me to make just for my own personal thing and especially at the conference when I showed this thing I was like super happy because it was tight but let me show you what the routing agent did so if I go in here you can see there's a decision-making process that the routing agent had to do the selection reason was hey the user is looking for suggestions on Ghibli films
and I have a Ghibli agent so it decides to do that executes the agent and then streams it back so now it's like you know like a quarterback if I wanted to do one more thing I can ask it to run a workflow for me i had an activity planner workflow right here so I'll just do that right now i want to plan an activity
in I think Lucas was from Brazil right so let's see São Paulo Brazil on June 30th 2025 so it's thinking again let's see what happens i'll look at it when it's done thinking or maybe I'll just look at it now okay it's done so for example the decision-making process it wants to find an activity the resource ID is activity planner which is the workflow I have which is dope and then I start looking at the workflow
execution graph which is what happened based on this whole workflow this workflow in particular calls the weather and then branches to do some activity planning which it did honestly that's dope and then it hits the workflow steps if you want to look at them and then you get this JSON at the end because that's how you know workflows return JSON so once we have UI for loop mode I'll come back
on stream and I'll show you the dangerous things a loop can do and some of the stuff you could pull with one but this is really sick so that's it thanks all right so those are some updates if you haven't seen what we've been cooking over here you know Amjad says uh it was a game changer so you
know Oh wow cool definitely try it out you know it is still experimental but it's getting much more stable and we're getting really close so really excited about agent network oh the guy with the aura asked us a question what streaming software do you use um Restream yeah Restream is what we're using
yeah keep Aura farming dude yeah all right uh we're gonna bring in Tyler from the Mastra team here we're going to be talking about some AI news some of the stuff that we've seen over the last week we like to try to keep you updated on the things we're reading and learning because you know we have a channel in our Slack called Kindergarten because that's how
it feels every week we're just in kindergarten here trying to drink from the fire hose how's it going Tyler what up dude how you doing good good glad you could join us for for those that don't know you know Tyler you you have your own stream that you do with with Daniel from the team sometimes yep but you're also our resident uh you keep us updated on
all the AI news so he's our news correspondent yeah yeah you're the Mastra news correspondent i like to stay up to date normally the last week not as much but yeah yeah well this moves so fast you got to keep your eye out yeah you take a one week vacation from keeping up and you're behind it's true but we do have some
news articles here that we're going to be going through some different things so the first one happened last week i don't know why I'm not signed in but that's okay you don't need to be so the first one stoked about this one all right so OpenRouter raised a decent chunk of money some would say that's a
decent chunk i mean I think it's a decent chunk unless you're on the OpenAI team and just got poached by Meta then maybe it's not that much to you an OpenAI employee could invest this money by themselves yeah with their signing bonus for those that aren't following it seems unrelated to this topic but Meta did poach a bunch of people from
OpenAI and other companies and gave them insane signing bonuses or at least insane contracts there's like this graphic of people getting poached like a wanted list yeah we'll pull that up you should find that um I think it was like TBPN or something that shared it uh anyways OpenRouter
raised a bunch of money well deserved we like OpenRouter Tyler you use OpenRouter quite a bit right oh yeah all the time this isn't surprising at all that they were able to raise 40 million they're probably worth it I would think yeah yeah what is OpenRouter um so you get a single OpenAI-compatible endpoint where you can hit basically any model so Gemini OpenAI Anthropic really everything so you just change the model name in the endpoint you get all the open source models too and then I think also there's some load balancing that happens as well which is really nice depending on the provider.
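in practice that looks something like this: you point an OpenAI-compatible client at OpenRouter's base URL and swap models by changing one string (the model slug below is just an example):

```ts
import OpenAI from "openai";

// OpenRouter exposes an OpenAI-compatible API, so the standard client works
const client = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY,
});

const completion = await client.chat.completions.create({
  // swap providers/models by changing this slug, e.g. "openai/gpt-4o" or "google/gemini-2.5-pro"
  model: "anthropic/claude-sonnet-4",
  messages: [{ role: "user", content: "Summarize this week's AI news in one sentence." }],
});

console.log(completion.choices[0].message.content);
```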
um yeah and then here's the other news that's so funny all these people were apparently poached by Meta so you know Meta's spinning up some teams over there that's a lot of people yeah that is a lot of people that's a lot of money it's a lot of OpenAI employees yeah they definitely went hard after OpenAI for sure it's going to be a war dude i'm down to see what happens all right well we can keep chugging along here so I thought this one was interesting
and let me share this other screen so this came out last week kind of mid last week and it's basically an introduction to deep research in the OpenAI API so I think one of the things that was interesting is they basically talked through it and they kind of open sourced a lot of the code in order to basically make your own deep research assistant using their APIs which is
pretty cool because it's not actually that much code and so there's a lot you can learn from it you know unfortunately it's in Python but what do you do um we should make a Mastra version of this that's actually a good idea we should definitely Yeah maybe that's the followup takeaway for this is we should just take this Python code and actually make it look better
and in better code what's that just give it to Claude Code it'll be ready in like three minutes yeah maybe we can ship it here during this news segment but yeah the post talks about it and has kind of nice graphics of how it works you know you have this user query potentially a rewriter or you get all
the details you know if you've used o3 you'll know that sometimes it'll ask you for more details and sometimes it doesn't right and that's basically what it's doing here right it's saying do I have enough information to solve the problem okay just continue on otherwise ask for more information and it only ever seems to ask once right it just asks one time for
more feedback then it has this enriched prompt and calls deep research so it's a really good way to learn if you're trying to build your own deep research assistant you can kind of take prior art rather than having to figure it all out yourself.
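the flow they describe is simple enough to sketch; here is a rough TypeScript outline of that clarify-once-then-research loop, with the helper functions left as hypothetical stand-ins for whatever model calls you would actually make:

```ts
// hypothetical stand-ins for real model calls
declare function needsClarification(query: string): Promise<string | null>;
declare function askUserOnce(question: string): Promise<string>;
declare function runDeepResearch(prompt: string): Promise<string>;

// a minimal sketch of the "clarify once, enrich, then run deep research" pattern
async function deepResearch(userQuery: string): Promise<string> {
  let query = userQuery;

  // step 1: a cheap model checks whether the query has enough detail
  const missing = await needsClarification(query);
  if (missing) {
    // step 2: ask the user exactly once, then fold the answer back in
    const answer = await askUserOnce(missing);
    query = `${query}\n\nAdditional context from user: ${answer}`;
  }

  // step 3: hand the enriched prompt to the deep research model
  return runDeepResearch(query);
}
```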
you can also build this with agent network yes 100% yeah and that's the other option right it's like agent network should make it so you don't have to write this code but even if you did it's not too bad yeah it's not bad and then in other news a new model dropped last week tyler is this the um I don't know if I even know how to pronounce it so you know which one I'm talking about yeah i downloaded it this morning i was going to try it but then I
realized oh it's actually too big to run That's Yeah okay tyler how do you pronounce it i don't know i'm not even going to try i'll just My name is Hunyuan, Hunyuan-A13B, however you pronounce it dude they need to make cooler names for these dude well I mean what does that even mean i'm looking at it, Hunyuan, oh I don't know, A13B means there's
13 billion active parameters at least that part I know i don't know the first part so I think this is like an 80 billion parameter model but it runs way faster because the whatever mixture of experts just activates 13 billion parameters at a time Hunyuan means round, round? I mean the logo is round so maybe
it doesn't look like a butthole that's good all right we're going to share it it is the first one that doesn't look like a butthole okay so it's a mixture of experts model so 80 billion total parameters but 13 billion active that's you know where you see the A13B so how does that work Tyler when you have active parameters it basically only uses certain parameters
based on your uh based on what the mix mixture of experts model decides i think so i don't know the exact technical details but I know that it runs a lot faster yeah um rather than you know activating all 80 billion which that's what I think a regular model does just the whole entire like size of parameters gets activated and so this person I think this one's a
little tangential like you know biased potentially like outperforms bigger models like we'll see but at least in their their case it it did has reasoning built into It looks cool i mean I I wish I could try it out the file size of the model is like 40 gigabytes oh damn which is like you need like two uh GPUs for that for like consumer GPUs
so I only have one the one thing that um so it comes with two new benchmarks i think that's a strategy by these model companies they create their own benchmarks so they're the best you know it's kind of like Mistral did that yeah it's like when the car companies you'd see like oh we won this award five years in a row well if you look it's because they created the award five years ago to give
it to themselves it's like oh we're the best on this benchmark we just created okay you won the Toyota Excellence Award because you bought a Toyota it's like of course you won your own benchmark you know I will say this though that's funny the Chinese models don't look like buttholes they
got a whale and this round thing so that's just interesting maybe they know something as well yeah all right um Yeah what else we got to talk about today so in other related news the VS Code Copilot chat was open sourced yeah let me share this for some reason sharing isn't working very well today but we're in a different
Go ahead in the last few days it got open sourced it looks like yeah and so tell me a little bit about this i mean I don't really know actually um I just started looking at this maybe 30 minutes ago it's cool though i guess this is like you know their whole Copilot sidebar thing in VS Code i would assume that's what it is oh they just open sourced it that's cool GitHub Copilot
who started like building your own you know views on this too the uh the the UI frameworks in our space are also trying to build like automatic like like the code they want you should be able to like pull a a chat and then theme it like maybe it's like a VS code theme or it's like a support ticket or
support kind of theme so that's pretty cool but this one has logic in it too like the model selection and all that yeah and VS Code's kind of come into play because they were the first ones at least so they claim to fully support the MCP spec right and I don't know if it's like at a certain point in time because obviously the spec's still pretty actively changing a
new version came out since then yeah exactly and they haven't updated for the new version yet so but at least at one point in time they were the furthest along so I mean they're definitely doing some things i know I'm still a Cursor user over here but I've been also using Claude Code within Cursor basically in its own tab because Claude Code has been pretty sweet the
future is coding agents that are not IDEs so VS Code tyler just wants to get back into Neovim i'm in there all day now with Claude Code you just use Claude you're back you're back already you You popped out of the oven for a while and now you're back you found a way back in yeah exactly yeah
yeah we do need to try the Gemini CLI too and put it through the same wringer maybe I tried it out last week um and I think it needed a bunch of bug fixes that maybe those are out already but it kept getting into weird loops anytime it would switch from Pro to Flash it would just get stuck forever basically the
marketing on that was solid though just saying i think I'm probably gonna end up using both of them like having the million token context window with Gemini that's pretty sweet yeah digest a codebase real quick yep make a plan give it to Claude we got this message don't hate on VS Code we still love it sorry dude my bad
hey to each their own right you know just my opinion i mean has Tony on our team started using Cursor or Claude Code because once he learns he can get back into Neovim he'll be back he Oh yeah for sure all you Neovim guys and girls out there I don't know how you you know I don't know how
you do it but All right you do you all right so next up we're going to talk about you know anytime there's a Simon Willison blog we seem to cover it here because he's been posting a lot of great stuff this one looks like it's from yesterday it's kind of short and easily digestible so I like that but it's like how to fix your context so it's just how to fix
your context posted yesterday we can kind of maybe read through some of it together he he summarizes four common patterns for context rot which I guess you can click into that which is context poisoning context distraction context confusion and context clash so there's a quick summary you can all read that if you're tuning in Tyler did
you have a chance to read this uh yeah yeah I did um yeah I think it's interesting um the thing I really like about the whole like context engineering brand or whatever that people are are talking about these days is it that's basically what memory is right it's just managing the context window and that's
something we're focusing on a lot at Mastra so everybody talking a lot about it really helps us you know think about it and kind of see how other people are thinking about it so yeah yeah I think you know a lot of people are basically saying context engineering as a term is replacing prompt engineering which I guess is just new terminology
right for similar things but I think this is at least a better term than prompt engineering but I think so too i don't know it just shows people were getting super pedantic about it and people want to spend time naming things versus actually making innovations happen i respect Simon and stuff and this article is great like about the different scenarios and stuff but then what's not cool is all
this argument on Twitter over the naming of context engineering versus prompt engineering and people arguing about what memory is and not and why don't we all just like keep making moves instead you know instead of like just arguing about this stuff yeah definitely like even this is divisive right which is dumb because
it's not it all makes sense but it would be considered divisive given context engineering as a thing you know you mean the article is divisive no just like the like the topic oh okay yeah i mean but on the flip side you have to if you don't have terms that you can talk about it is hard to understand what you're
actually talking about right at the end of the day I think context engineering is just engineering i think you know I think that was a tweet from David Cramer uh he tweeted that last week and that of course got some contention right as it would right but I think he's right but at the end of
the day there's like classifications that I think can be helpful i I don't know so I'm on the fence i think it's good to have terminology that means something but now of course the problem when you add terminology is then people who are outsiders it makes them feel dumb because they have to learn all these terms just to figure things out and all you're doing at the end of the
day is just figuring out like what to tell the LLM to get it to do something and so there's that's all we're doing right is like how do we fill what is called the context how do you fill the context window that's what it is these four uh situations he's Simon's talking about in this post are 100% I've
experienced every single one of them um and the one that pisses me off the most is context clash because if you have like long-term memory and you're getting all messages right sometimes depending on the model depending on the time of day month or whatever uh it can like you have new tool calls that have like that
are essentially a later date than old tool calls but sometimes you get the wrong response back because it's using previous tool calls' data so like I'm sure everyone has gone through these four scenarios if you're doing some serious work you know um so context clash i don't know if that's the right
name but that's some real life so a couple things from the chat sodfix said I created an AI SDK provider to use the Gemini CLI auth when running locally so that's pretty cool also Lucas asks "Are you guys recommending using Mastra in production?" So a lot of teams are already using Mastra in production and I will say with this AI stuff if
you're comfortable putting an LLM in production because that is where your biggest risk factor probably is it's not going to be in the Mastra code it's going to be in do you want an LLM to be able to formulate a response and give it to your users because that is where the biggest risk is which is why you need things like guardrails and evals before you
then provide that response to a user I don't think the Mastra code is going to be where you're going to be spending most of your time worried about so yes there are a lot of teams already using it in production All right we got anything else on the list oh speaking of I need to Yeah you want to share some more workflow stuff i got to send over an invite here to Okay our
next guest i'm gonna drop later dude thanks for coming and hanging out with us yeah we'll see you see you all right I guess we can look at more stuff while we wait for Matt, the Matt, to come so let me go back here got too many tabs open now okay cool that's it okay so we're looking at the network um
and we executed a workflow we executed a regular agent call um and in this you saw this execution graph we've been putting a lot of time into the execution graph so I'll just do another workflow here let me add this this is the activity planner it just shows like a logical control flow our workflow system we're expanding it in ways that
I'm very excited about things like Temporal and Cloudflare Workers are incoming they're imminent as we would say and we really like our workflow syntax now the type safety of it and then we wanted the visualization to come from the API like we wanted the API to be able to drive visualization so you know we have this graph set up.
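as a rough sketch of what a workflow like the activity planner looks like in code; the step logic is stubbed out and the exact chaining helpers may differ slightly from the current Mastra workflow API:

```ts
import { createWorkflow, createStep } from "@mastra/core/workflows";
import { z } from "zod";

// fetch the weather for a city; the real step would call a weather API
const fetchWeather = createStep({
  id: "fetch-weather",
  inputSchema: z.object({ city: z.string() }),
  outputSchema: z.object({ city: z.string(), precipitationChance: z.number() }),
  execute: async ({ inputData }) => ({ city: inputData.city, precipitationChance: 48 }),
});

// two possible follow-up steps, picked by the branch below
const planIndoor = createStep({
  id: "plan-indoor",
  inputSchema: z.object({ city: z.string(), precipitationChance: z.number() }),
  outputSchema: z.object({ plan: z.string() }),
  execute: async ({ inputData }) => ({ plan: `Indoor ideas for ${inputData.city}` }),
});

const planOutdoor = createStep({
  id: "plan-outdoor",
  inputSchema: z.object({ city: z.string(), precipitationChance: z.number() }),
  outputSchema: z.object({ plan: z.string() }),
  execute: async ({ inputData }) => ({ plan: `Outdoor ideas for ${inputData.city}` }),
});

// weather, then branch on rain chance, then plan; workflows return JSON at the end
export const activityPlanner = createWorkflow({
  id: "activity-planner",
  inputSchema: z.object({ city: z.string() }),
  outputSchema: z.object({ plan: z.string() }),
})
  .then(fetchWeather)
  .branch([
    [async ({ inputData }) => inputData.precipitationChance > 50, planIndoor],
    [async ({ inputData }) => inputData.precipitationChance <= 50, planOutdoor],
  ])
  .commit();
```

there is also a map-style step for reshaping a previous step's output into new data, which is what the flattening step shown later in the demo is doing.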
we can also see and visualize the runs that happen so if I do San Francisco here and let it run each step highlights as it completes and then now you know it's going through this and we can actually go to each step and be like what was the input here what was the output this one essentially returns the weather right so like okay cool typical weather use case
and you can see the input data here was this because it just takes it from the previous step then if I go into here I can see the input which is that previous step and the output which is like what the agent planned and this was running an agent in this step a Mastra agent to then like plan activities and stuff and then lastly we had this mapping where you can map previous steps and
then return new data much like a map right and so you can see that this was the input the output flattened it and then you have like a config of what happened here so that was cool you can see the result so this should help you know building with workflows and here you can see the run so this was the run here i don't know where there's a place that's raining today if anyone else
knows maybe the rainforest or whatever but let's just Is anyone watching where it's raining maybe I don't know it's summer so Chicago let's just see so I'm going to run this for Chicago it's not raining there unfortunately but we'll just wait till this goes i don't really care about the response because I just showed it to you but all
the runs are calculated here or maybe I don't know got to refresh the page um so yeah this was the Chicago and here's my San Francisco run and so this is a nice and all we have APIs for all this try Kansas City kansas City kansas City might be or Cleveland kansas City or Cleveland or Buffalo let's do Buffalo
because the Bills are cool nope no well the map is wrong the chance of precipitation actually no yeah it has to be more than 50% chance right well it was very close 48% but anyway so this visualization is like Well I mean you could just change the code yeah I'm not gonna do that the demo is priceless but anyway
so that was workflows so we saw some agent network we saw some workflows what else we got going on what else are we working on over in the Mastra world we're doing evals v2 right now which is going to be good got a lot of performance things going a lot of bug fixing too there's a lot of usage in Mastra so there's a lot of bugs or like feature requests
things like that so um yeah also Oh yeah matt should be here in five minutes all right so I've got a question here single agent got too big trying Mastra networks for sub-agents how do I pass tool calls from sub-agents through the network can't figure it out and I want them for displaying richer UI
so I'm assuming you're saying you want to if one of the sub-agents calls a tool you want to display that tool call in the UI we don't have that yet but it's on the road map essentially yeah Lucas i think that's a common one honestly when you have like deeply nested workflows that's a problem uh when you
have these deeply nested sub-agents that's a problem you know we've been working with CopilotKit quite a bit to try to make you know our integration with them better so we can do things like this but also integrate it in with our client so you can roll it yourself so I think it's not easy to do yet but it needs to be easy to do it'll
happen like it should should should it should happen automatically we're just not piping the tool calls back into the the the main response stream so when you see like an agent gets executed in a network for example you don't know what's happened in the network call you just know what happened you know input output but yeah that that's still
experimental so you know we'll we'll make that um better looks like we have some more questions yeah some more questions from Lucas two different Lucas uh what use cases do you not recommend using the storage layer or is it something you always consider essential i mean I think it just depends in mo in almost all complex use cases you're probably going to need some kind of
storage yep but if you're just making a basic let's say a simple LLM call you could just wrap that in an agent and call the LLM through that maybe with some kind of system prompt you wouldn't necessarily have to have storage for that but I think in most cases anything that gets to modest
complexity is going to need some kind of storage back end especially if you're dealing with workflows and you want you know workflow state memory all those things all end up using storage yep couldn't have said it better myself.
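for a concrete picture, wiring a storage backend into Mastra looks roughly like this; LibSQL is just one supported backend and the file URL is an arbitrary example:

```ts
import { Mastra } from "@mastra/core";
import { LibSQLStore } from "@mastra/libsql";

// one storage backend shared by memory, workflow state, traces, etc.
export const mastra = new Mastra({
  storage: new LibSQLStore({
    // a local SQLite file for development; point at a hosted DB in production
    url: "file:./mastra.db",
  }),
  // agents, workflows, networks, ... registered here as usual
});
```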
all right um I keep clicking at the same time you take it uh after deploying agents how do we observe or monitor their behavior is that handled through the MCP server or is there another interface for that um Mastra itself has telemetry it exports OTel um so if you have an OTel provider that you want to plug in or use whatever you can you can use Mastra Cloud you can use whatever you want Sentry Datadog freaking whatever.
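configuration-wise that is a telemetry block on the Mastra instance, something along these lines; the endpoint and service name are placeholders and exact option names may vary by version:

```ts
import { Mastra } from "@mastra/core";

export const mastra = new Mastra({
  // export traces over OTLP to whatever OpenTelemetry-compatible backend you run
  telemetry: {
    serviceName: "my-agent-app",
    enabled: true,
    export: {
      type: "otlp",
      endpoint: "https://otel-collector.example.com/v1/traces",
    },
  },
});
```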
there's a bunch of observability tools you could pipe into so yeah it exports OTel so yeah you can Yeah so it's not necessarily handled through an MCP server uh but yeah if you do deploy to Mastra Cloud you get kind of the local playground experience in the cloud so you have that as an option as well so yeah many different ways you can
deploy yeah and tons to improve there too for sure yeah honestly like we could have a whole section on just deployment of agents right because there's a lot of different ways you can deploy these things and a lot of different considerations when you do deploy them but this is probably as good a time as any because another place you could deploy agents is to Netlify and so Matt
is going to join us dude what a segue yeah that was not even planned but it worked out what's up man how's it going Matt hey good to see you all yeah good to see you too good to see you i unfortunately didn't get to see you last week like Abhi did so you know glad to be there i unfortunately didn't get to hang out and
have a beer or two so or however many you had uh but yeah it's good to see you yeah good to be here how's it going so for those that aren't aware this is Matt uh founder and CEO of Netlify and yeah we just want to talk about all the cool stuff that Netlify has been doing obviously you have AX you had agents
week you have probably a whole bunch of updates that we can talk about so also he used to be our boss yes that is also true yes yes yeah he was definitely our boss as well so but it's good to chat so what's going on over in the Netlify world Matt i mean lots of agents are building on Netlify these
days and deploying to our infrastructure and using our workflows and collaborating with our users and then people at the same time are building agents with those agents or on their own without agents but more and more with agents so it's more about agents building agents these days and as you said like I wrote this blog post at the start of the year called AX, the agent experience, and why
it matters and um it really captures a lot of what's my biggest focus for the company right now right like that we're really in this really unique position and that to me feels a lot like a bit more than 10 years ago when we launched Netlify um back then I was sort of a big believer in this like big foundational shift in how we
would build for the web and we would start decoupling the front end talking to all these different APIs and services and um and we would allow like the front end to really be the real application in some way right like and through that we would allow this new class of sort of front-end developers to become the actual web developers
and at the time it's hard to sort of imagine now where front-end developers are like very sought after they are everywhere they are experts in all these like complex frameworks and so on right like but back then most people saw front-end developers as like not real developers they were more like tinkerers and so they would take a Photoshop
file and like slice it into HTML CSS and a little bit of JavaScript and then hand it over to a real developer that would like implement it in the back end and so on right and part of what was so exciting in the beginning of that whole journey as we started coining the Jamstack term and as frameworks like Gatsby started coming out and so on right like was this
real shift where suddenly we allowed a whole new group of developers to really become real developers and build for the web right like today there's like hard to get precise numbers but something like 17 million professional JavaScript developers right like that are very much real developers and it feels like right now is just like such
a similar time in the sense that the fact that these code agents really really lower the barriers for like building stuff on the web by taking away sort of a lot of like before you kind of really needed to spend years and years learning that layer of of of code and algorithms and so on right like now suddenly you can kind of like just poke
around with agents until they figure most of it out right like and it works surprisingly well and you can get surprisingly far right like but there's still this notion that people that work in that way are vibe coders and they're definitely not real developers and they're definitely always going to hand
it over to real developer that can take it all the way and so on right like and I have a very like similar belief that if we build like the full more opinionated platforms around it and the full guard rails and the techniques to to to not just ship but like iterate and roll back and manage rollouts and
understand like what goes live and protect yourself from shipping something bad and like put a few few a bit more structure in place while these models just get better and better then this new class of developers are absolutely going to be very real software developers right like yeah there's probably just going to be like
more of them than we can imagine today right and that's that's just really exciting and it also means that like the most important thing I can do as someone building platform and infrastructure is make sure that that platform that infrastructure works incredibly well with all these agents that it's really simple and approachable and easy to work
with when when you're working with an agent and not just on your own and so that's that's just a lot of our folks now yeah yeah and and I think you know maybe some people that are considered real software developers today I don't know if they're worried about that or they you know but there is that I guess aura where they're like you can definitely sense it oh those you know that's just they're building toys but
you know that's probably true i think a in a lot of cases today like it's probably a lot of toys but I think we remember like it was just oh they're just that's just in the Gatsby days oh that's just their personal blog you know that's just a toy it wasn't a real thing and then eventually it graduates to a real thing so you know I think it's it's
going to get there the timeline is maybe you know it's going to take some time for it to really work itself out but I can totally see the path where we're going to get there and I think it's moving i mean it's moving a lot faster than I honestly thought it would right like I I started like I started thinking this would be important a long time ago and sort of set it as an internal goal in Netlify
that we should be the platform for AI right like but when I first set that goal I thought okay it's going to be really important but it's going to be like really important towards like the end of the decade and so on right like and and and it's just it's it's happening much much much faster than that yeah halfway through actually you
know Matt, an interesting story for the audience would be your relationship with Bolt.new at least like some origin story there because you've been homies with those people for a long time right yeah I was an early investor in StackBlitz wrote a little angel check back in the really
early days there right like and really because they had such an interesting technological bet on web containers um which for those that don't know, Bolt.new started out as a company called StackBlitz that was building a live coding environment in the browser and with this
really interesting pitch that look, Figma became like dominant as a design tool in a world where before Figma every design tool had been like a desktop app and something you downloaded and ran on your computer right like and what really made that switch happen in Figma's case was that they used WebAssembly to build like a real hardcore design app in
the browser right like versus a lot of other tools that sort of in the end could build some design tooling but a lot of like the operations and so on it would need to call out to a cloud server and so on right like so so Eric's original vision back when he pitched me was really this idea that if if if we want this idea of like a full code editing environment in the browser
to happen it has to be by like being able to run the whole thing locally in the browser and not this like world where you have some UI in the browser but like the actual terminal no longer lives on your on your laptop it lives somewhere else right like so they built this whole web container technology to be able to basically run the whole node
ecosystem in the browser um and it was really impressive from a technological standpoint but there were like two things playing against them one like none of the companies building browser-based code IDEs could really get to the point where they could monetize well um and web containers still had limitations
right like there was a bunch of libraries that just weren't web container compatible and so on and that you couldn't use in that world and sort of having to educate every single developer using your IDE on what packages they can and can't use that's really hard Yeah yeah so they were at this point where they were not doing too well and like the last time I met him before
they launched Bolt right like it was very clear that he was thinking about maybe maybe I need to call it soon right like and do some like maybe find some kind of like sweet landing spot as an acqui-hire or something like that right like and get somewhere and then Bolt was sort of like
the last shot they had right and what a shot so like Yeah what a trajectory and it was funny because like we had long talked about like hey we should get like deployments to Netlify integrated into your IDE and into StackBlitz and so on right like and we had kept sharing as we built some of the APIs that also made their way into like ChatGPT and other
environments and so on right like we kept sharing them with with with with the team there and I didn't actually know that that they had any sort of real plans of of integrating them until like right before like right around the time Bolt went live and I just got a mail from Eric being like "Hey can you raise the rate limits on on our API key
because I think we're about to like go way over them." And I was like "What API i didn't even know you had an integration." Bolt launched and they just started sending like thousands of sites our way so it was quite wild what a journey dude yeah and what's wild is how quickly that journey's happened you know it's Yeah it's you know and
yeah it's pretty incredible it's a it's it's quite the quite the moonshot of like all and and I think their background probably helped them right like that they're going through that journey of like building all the web assembly stuff and now what they're doing with Bolt it's and and it's still that like now it's kind of like a secret
weapon right like because now it gives Bolt this quality over any of the competitors right like that the whole coding environment does actually live like entirely on your local machine when you open a Bolt project right like it's not some remote container and so on right like so the
feedback loop is like very instant everything happens there it's extremely cost-efficient for them right like they don't have to spin up like a server-side cloud container for every single like vibe coder that's trying to build something right like it becomes just the token cost for calculating code changes and the
limitations and I think that's an interesting lesson for sort of like the whole ecosystem also right like the limitations of web containers in the world of large language models suddenly don't matter that much because instead of having to teach every single developer interacting with their system you can only use these packages and not that package right like they just have
to get the right context in place so the LLM knows which package to use and the agent just works right like and suddenly it just works and you just don't run into those problems when you're like normally working with Bolt like the agent just sticks to the stuff that works
there right like and now it's suddenly just a pure pure advantage and I think that that's something we'll we'll see that we're still very early on right like it's it's Mhm it's just like how once people write most of their codes through LLMs a lot of what's good or bad what works well what doesn't work well is going to
fundamentally change that's sick enough about Bolt though we're here for Netlify yeah tell us a little bit about So you had agent week a couple weeks ago and Yeah yeah we had several really really exciting launches um one is a little similar in the
sense that like one tool that's also just exploded in usage together with these LLMs is Vite right like that's just proven to be simpler for the models to work with and reason about and it has maybe also just like less historical baggage of changes and deprecated APIs and so on right like yes so it just
turns out to like work really well in this LLM world and has now like just become the basis for every framework except for Next.js right like of course and become sort of the basis for almost any coding LLM like what they tend to work the best with except for v0 um but that also
means that we've built a really good partnership with the Vite team we're now the preferred deployment partner for Vite um and we've started like sort of building our local development tooling directly into Vite uh through our Vite plugin and that just means that traditionally we had this tool called Netlify Dev that like if you want
our whole environment with secret management and serverless functions and edge functions and everything to work locally with your framework instead of starting the framework's dev server you start Netlify Dev and we essentially set up a little proxy in front of that dev server that can like
implement our edge rules and redirects and invoke serverless functions and everything right like but now with Vite instead you can just install a Netlify plugin and then any dev server based on Vite will work with Netlify Functions, routing, everything like that right like.
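in config terms that is roughly a one-liner in your Vite setup; the package name below is an assumption, so check Netlify's docs for the exact one:

```ts
// vite.config.ts, a minimal sketch
import { defineConfig } from "vite";
import netlify from "@netlify/vite-plugin"; // assumed package name

export default defineConfig({
  plugins: [
    // brings Netlify Functions, redirects/edge rules, env vars, etc. into `vite dev`
    netlify(),
  ],
});
```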
so that was one of the launches, to also just simplify our level of integration and adapters with all the Vite-based frameworks and then for non-Vite-based frameworks we can still have the Netlify Dev approach um and then we launched Netlify's MCP server um which I'm really excited about right like because it's actually like I made a little live coding session where I
just like one-click installed the MCP server in GitHub Copilot right and immediately just by installing it there that agent can kind of one-shot a lot of things um as if it was as integrated as Bolt for example right like in the sense that suddenly like some people think of MCP as just a wrapper around an API or a CLI and so on and it's not true right like because once you have
the MCP in scope you also sort of really encourage these models to use it and you've done like such an amazing job at Mastra with this right like you start being able to help them get context not just on your API but on like your whole product and how to use it right like so once you install the Netlify MCP
these agents will start asking like how should I write serverless functions how should I do storage how should I do persistence and so on right like and suddenly like Netlify Functions starts working much better as a primitive, Netlify Blobs works much better, it can figure out doing cron jobs without you having to teach it how to
and so on right like so it's really powerful yeah and a lot of you know if you think about a human we would have to go read all the docs and you know Netlify has a lot of products so I might not be familiar with everything that Netlify can do but you know who is familiar who can get context much faster the LLM yeah and so
you can leverage that LLM ask questions and now the LLM can do the searching for you and formulate provide better context of like how to actually approach solving some of these common problems that you maybe would have had to spend a lot of time researching and you might have ended up in the wrong spot because you didn't know what you're looking for totally and and and
the MCP really helps on that because like if you just do that with with with without the MCP if you just hope it figures out our CLI and so on right we've been around for 10 years parts of our product has really changed but the LLM might still have indexed a lot of old code written against our old APIs
and our old ways of working so by default the LLMs might actually like do a bunch of anti patterns right like use old interfaces for our functions that don't give you access to all the features or like are frustrating to work with or don't work as well with streaming and so on right like but you plug in the MCP and suddenly they they start doing the right thing and then of
course the MCP also allows deployments which is like um I I think one of the first that just like really supports that right like Cloudflare has a bunch of MTPs but you can't actually deploy through any of them right like we we we really um have have that have that in place and then all the admin functionality and anything right like so MCP really exciting and then we launched
Netlify DB as well — another pretty big launch for us, in partnership with Neon — but again really aimed at this agent experience, really making sure that, again, in that live coding example, I just installed the MCP in GitHub Copilot and asked it to build an app that would require a database. And the MCP could just tell it: use Netlify's database. All it had to do was write the code, and then when you either run the netlify dev server or you deploy to Netlify, we make sure that if a database isn't provisioned already, we provision one. So again it really allows single-shotting database applications and getting them running.
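Roughly what the generated data access tends to look like — a sketch assuming the Neon serverless driver and an injected connection string; the env var name and table are placeholders, not taken from the demo:

```ts
// netlify/functions/guestbook.ts — hedged sketch of a function using Netlify DB
import { neon } from "@neondatabase/serverless";

export default async () => {
  // Assumption: Netlify injects the provisioned database's connection string
  // into an env var along these lines.
  const sql = neon(process.env.NETLIFY_DATABASE_URL!);

  const rows = await sql`
    SELECT id, message FROM guestbook ORDER BY id DESC LIMIT 10
  `;
  return Response.json(rows);
};
```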
We had a really fun one: one of the developers who had been working on the DB project, once we launched, tried a similar project, installing the MCP in Augment — for those that don't know, it's an AI code agent similar to Copilot, also in VS Code, but it specializes in indexing your code base a lot. He then asked Augment to build a real-time chat application with Netlify, and it built a real-time chat application that used the Neon DB behind the scenes to get that up and running. He had also added the Playwright MCP server, which allows agents to go through a browser and test real websites, so he asked Augment to test that the chat app worked when deployed to Netlify. It did that by joining the chat, and he also invited a few of his friends to join, so suddenly they started chatting with the code agent and asking it for features. And since it had access to the code editor and to deployments through our MCP, it was now implementing those features, deploying, and then telling them in the chat: refresh the browser and you'll see the feature implemented. Crazy — a self-updating app that started adding DMs and themes and notifications. That would have been great, catching that on video.
It's part scary, listening to the chat, because the chat could tell it to build anything, but also pretty powerful in the way it was able to self-update based on feedback. Yeah — do we have time to vibe-code something right now? We could vibe something, let's see this in action, yeah.
Let's see, let me share my screen if I can figure out how it works. Yeah, there should be a share button on the bottom. Cool, I've got to fix our camera here. Now I'm sharing myself sharing my screen — that always gets a little funky. So, one of the things I've vibe-coded over time is just my own personal blog. On that blog I actually used our guide on building MCP servers with Netlify; I basically just took the guide and gave it to an LLM and told it, I want an MCP server myself, and now my blog has an MCP server. Let's see if we can make this work. Can you zoom one or two clicks? Yeah, that's better. So: claude mcp add — I think it's this transport, we give the MCP server a name, and I think the URL for the MCP is just like this. Let's see if this works at all. Claude Code — yep, we can see it's connected, it has three tools.
And the tools I added were these three: one tool to list the articles — whoops, my battery, maybe I should quickly get a charger, be right back. Yeah, no worries, we'll play some elevator music. In the meantime we'll catch up people who are just joining. So if you're just joining us, we did a Mastra speedrun, Abhi talked about the Arize conference and the demo he gave there, we gave a little preview of agent network, we talked workflows, we talked MCP, we brought Tyler on from the Mastra team and we talked AI news. So we talked about all that's been happening — we talked a lot today. Yeah, well, to be honest we do that every day, that's true. And then we have Matt from Netlify joining us, and we've been talking all about the stuff Netlify has been releasing over agent week, and now we're seeing a little demo of Matt's blog's MCP server. So take it away, Matt. No — I cannot hear you, though; we lost your audio.
Hello, hello — now we can hear you. Let me make sure the sound is still good; it might have switched. I think it's good, okay. Yeah, so the MCP just has three tools, because I didn't really know what tools I would actually need for an MCP for my blog. One just lists articles, one gets a specific article, and the third one is a fun one, because I thought, with this AX, agent experience, angle — let's learn how the agent experiences my MCP. So it's a tool to leave an agent net promoter score for how well the tools work.
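Registering tools like those with the MCP TypeScript SDK looks roughly like this — a hedged sketch, with hypothetical listPosts/getPost/recordNps stand-ins for the blog's real data layer, and the HTTP transport wiring left out:

```ts
// Hedged sketch of a three-tool blog MCP server using the MCP TypeScript SDK.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

// Hypothetical stand-ins for the blog's real data layer.
const listPosts = async () => [{ slug: "hello-world", title: "Hello World" }];
const getPost = async (slug: string) => `# ${slug}\n\n(article body here)`;
const recordNps = async (score: number, comment?: string) =>
  console.log("agent NPS:", score, comment);

const server = new McpServer({ name: "blog-mcp", version: "1.0.0" });

server.tool("list-articles", "List recent blog articles", {}, async () => ({
  content: [{ type: "text", text: JSON.stringify(await listPosts()) }],
}));

server.tool(
  "get-article",
  "Fetch a single article by slug",
  { slug: z.string() },
  async ({ slug }) => ({
    content: [{ type: "text", text: await getPost(slug) }],
  })
);

server.tool(
  "agent-nps",
  "Leave an agent net promoter score for these tools",
  { score: z.number().min(0).max(10), comment: z.string().optional() },
  async ({ score, comment }) => {
    await recordNps(score, comment);
    return { content: [{ type: "text", text: "Thanks for the feedback!" }] };
  }
);
```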
So we can try asking Claude. Let's see — it's going to go use the list articles tool. There it is. The dollar counter is just, yep, Claude counting the money from your bank account as it racks up tokens. Honestly, that's kind of disrespectful, dude — putting the token count right in your face. It told me about the recent articles, that I built a personal MCP, what was here, and now it's leaving an agent net promoter score of nine, saying the biilmann.blog MCP tools are excellent for accessing blog content, very satisfied, Claude Code. I'll say, yeah, go ahead and send it. And now, somewhere on my admin panel, I'll have some dashboard data — currently an NPS score of 70. Hey, that's pretty good. I think in general these agents typically like to say nice things to people. Yeah, they do tend to be more positive; they want to be happy, they play to your ego a bit. But now, the thing I thought could be fun to vibe-code:
As I said, Mastra has a pretty cool MCP server, so I added it here, and I added Netlify's MCP server as well. This is the actual blog directory, and I thought it could be interesting to see if we can build a Mastra-powered chat that uses the MCP on the blog. Let's just clear the context first. I prepared a little prompt: add a chat page with an agent built on Mastra and Netlify functions that uses the biilmann.blog MCP to let visitors chat with biilmann.blog; there's already an OpenAI API key set through the Netlify secrets, so it knows what that means. That's sick. So let's see what happens.
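To picture what the prompt is asking Claude to produce, here's a hedged sketch of that shape — a Mastra agent behind a Netlify function. The model choice, file path, and request shape are our assumptions, not the code the agent actually generated:

```ts
// netlify/functions/chat.ts — hedged sketch of the endpoint the prompt describes
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";

const blogAgent = new Agent({
  name: "blog-agent",
  instructions:
    "Answer visitors' questions about the blog. Use the blog's MCP tools to look up articles.",
  model: openai("gpt-4o"), // model choice is illustrative
  // In the real app the tools would come from the blog's MCP server —
  // see the MCPClient sketch a bit further down.
});

export default async (req: Request) => {
  const { message } = await req.json();
  const result = await blogAgent.generate(message);
  return Response.json({ reply: result.text });
};
```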
Let it roll. So this is always the weird thing with coding live: you're basically just sitting and praying that it works. Yeah, you never really know — sometimes it surprises you and it's amazing, and other times you get into a loop of debugging, but you really don't know. I love how it makes a to-do list. Okay — see, this is one of the examples where it's actually looking up how it should use serverless functions, and it's really good that it does that, because otherwise sometimes they go and do funky stuff with Netlify functions for some reason. Setting up Mastra — and it also used the Mastra docs. And I know, before you had the MCP server, I was actually playing around with this and it was just really hard to get the models to use Mastra, again because it's a young product and the latest docs weren't in their training set and so on. I tried to get them to read your website, and it worked, so — yeah, MCP just makes such a difference. Some of our users have told us it's so cool that you did that, but in our opinion it was a necessity, for all the reasons you were saying: it was so hard to get it to write Mastra code without it. Yeah, I don't know how we could have done it. No, and again, the part that people who don't get MCP miss is the role it plays in, essentially, the context engineering. I love that it's going through this loop. Nice, this is a good comment — Mattia says, "Matt is building the infra for AX, which is cool; the current infra is lacking and one of the bottlenecks for
a huge unlock." EMF, baby. Yeah, and Lucas, of course — I already showed this one — said, "Anthropic is so happy watching those token counts go up." I know, dude. It's just their way of saying "F you" — like, count those tokens. I should say, I think I went to some Anthropic event and they gave me some free tokens, so I'm probably still eating from those for a little bit. Good, you can eat for free for a while. They give you a taste, you know what I mean — it's like your local pusher giving out free fentanyl or something. Yeah, exactly, that's pretty much what it is. Oh man, hilarious. But that is one of the things I like — or maybe don't like — about Claude Code: Cursor is kind of incentivized to limit your tokens, but Claude Code, for the most part, they kind of want you to — yeah, they want you to use the tokens up, but they're also going to try to give you the best results because of it. So yeah, I've really spent a lot of time in Claude Code lately. Yeah, me too, also because I'm kind of in this mindset of: I want to try to really live in that world where I don't even have my code editor open, where I'm really not
looking at the code, to see what it takes to become a good engineer without ever looking at the code directly. And Claude Code just has that aesthetic — you're just talking to the agent about the code. Yeah, and you kind of send it off and let it do its thing for a while, check in on it, answer some questions, send it off again. Then in the end it always does things like — in this case, for example, I told it that there was a — maybe it didn't — oh no, it didn't, okay, good. I thought it was saying it was adding a .env file, which would be annoying, but it's just telling me to do it, which I don't need to, because there should be one in the environment. But let's see what happens, let's spin it up and see if it builds something usable or not. I mean, you never know — that's the excitement of it. Yeah, it's kind of like, hey, you've got to roll the dice, we're doing it live. Yeah, you roll the dice, doing it
live. I mean, so far so good — let's go to the chat and see what happens. No type errors? No type errors. Well, it would have checked itself, actually, you'd hope. Yeah, you'd have thought. I think we lost you a little bit — a little puzzled by the speed of this. Yeah, I mean, I didn't expect it to actually work the first time; you're always hopeful, but you never know. Sometimes it does, sometimes you can one-shot it. I feel like it's probably just something small — it usually is a little thing, right, a couple little things. And I think we kind of lost Matt a little bit, network connection I think. Let me just try to quit some stuff. Yeah, we can hear you. I'm wondering what's happening here — here we go, it loaded. I think my whole laptop almost died. Yeah, your CPUs were spinning. Yeah — that definitely couldn't have been Mastra, but who knows. Ah, some error — what is it saying? MCP client connection failed. Why did it fail?
Failed to connect — command or URL. Maybe it wrote the MCP client config wrong. Yeah, it must have written the MCP client slightly wrong, but it gave you the chat interface, which is nice. I think I might cheat and look at the code. Yeah, for the sake of the tokens — for the sake of your free tokens. I think it could also just be that my laptop is really struggling right now; for some reason the live screen sharing is disagreeing with it. Yeah, you've got the live stream software on top of everything. So that's wrong — yeah, instead of transport it should be URL, and then it's just the URL. True, so I think you just change transport to URL; it should just be URL. And right now I can just do without that whole base URL thing — that's nice, because technically it's trying to make it work against the development server in development, but right now let's just be a little more brute-forceish. Looks right? Yep, I think that's right. I hope so. Let's see if we get something back this time.
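For anyone following along at home, the fix being described looks roughly like this — a sketch assuming Mastra's MCP client API, with the server key and blog URL as placeholders:

```ts
// Hedged sketch of the fix: the MCP client wants a `url`, not a `transport` field.
import { MCPClient } from "@mastra/mcp";

const mcp = new MCPClient({
  servers: {
    blog: {
      // Claude had generated something like `transport: ...` here; pointing
      // the client at the server URL directly is the fix. URL is a placeholder.
      url: new URL("https://biilmann.blog/mcp"),
    },
  },
});

// The agent then gets the blog's tools from the client, e.g.:
// const tools = await mcp.getTools();
```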
Good thing having the Mastra team on call for this. Looking at the code — like the good old days. Yeah, we still have to look at it sometimes. I feel like I could have started asking Claude Code to debug this, but I think my laptop really is struggling with something right now, so I thought it would be pushing it. Oh wow, Slack helper — let me force quit Slack. Okay, not quite there. Yeah, that was close. We got a comment — Mattia doesn't really like Claude Code because you lose the IDE context. But I will say, Claude Code has a VS Code plugin, and you can open it up in a side tab just like you would Cursor's agent or your VS Code agent. So I've been using Claude Code — I like it on the CLI, but I've also been using it in my IDE, so you can have the best of both worlds.
Let me see — I'm going to cheat a little. This one I actually did right before. Why didn't it reset? Because it's not there. I think this was the one — look at the time — this was my previous take on the exact same prompt, right before I joined, just to see if it worked. The first take actually pretty much one-shotted it; this time it got something wrong in the MCP setup, but let's just try it once. Yeah, I mean, it looked like it was pretty close, so rather than spending all the time debugging, it's nice just to see it live and see it working. Yeah, let's see if it works this time. It could also just be that the live streaming is more than my MacBook Air can handle right now. We need some M10s around here, dude — it's going on a mission to Mars, honestly, sometimes. Yeah, I know. And Lucas has a nice comment: we lose the IDE context, but at least we look like movie hackers when we're on the CLI all the time. So that's a thing. Let's go see if we can get the end-to-end stuff running or not. Sorry about — I really don't know why the live streaming here is so rough on my poor MacBook; something must be putting it in a bad mood.
But here we go. All right, fingers crossed, let's see — is it going to work? Moment of truth. POST to MCP, GET MCP, POST MCP — some stuff is going on. Ah, wow. Yeah, it looks like something did not connect, something failed — streamable HTTP. Ah, bummer. Oh well, we almost one-shotted it, dude. I remember I spent like 20 minutes in a workshop fiddling with Tailwind when we were doing a vibe coding thing. I hate when that happens. That looks right — yeah, the URL looks right, you're getting the tools set up. If you scroll down to your tools — well, I guess it's right here. What was the request URL? No, but I think this might not even be the right version — new URL, and it creates another new — yeah, maybe you want to hard-code it and see. Yeah, that's right, let's see if it works, give it a shot. Where did they end up running this one? Wonder if I have something running in the background somewhere. Sick, okay, we're definitely taking a drink on this one. This is going to be the one. Let's see — fourth time's the charm or not, we're going to find out. Hasn't blown up yet, something's happening in the background. Let me see if I can load anything while sharing. Let's go find the actual blog, let's find the logs, let's go look at the function logs — it's the MCP server, something's happening, 348, here we go. Hey, everyone, you're on the list articles. We did it! We did it. The real question will be: did it leave a review? I don't know, might be this one, I'm not sure. Yeah, a nine — not too bad. That's awesome. Yeah, it's so easy — you have all these Netlify primitives. I mean, obviously
we were one-shotting stuff, but if you're making a serious project and you're vibing and doing stuff, I can see how you don't have to worry about it — you just make moves. Yeah, I think that's the current state we're in: it's not always guaranteed it's going to work the first time, but it gets you further, and then you could actually chat with it — it would probably be able to debug itself if you really spend the time on it; if you're willing to spend some tokens you could get it there. Or, if you are an engineer, you can still jump in and make the fixes and get it to work. Yeah, it's pretty amazing how
far it's come. As I said, I'm playing around a lot with Claude Code because I'm really trying that. In the beginning it's a really weird feeling when you are a really experienced engineer and you start really working with these models, because it kind of breaks the whole thing: 90% of your existing skill set starts being really weird, but you also discover that 10% of your existing skill set is suddenly like a thousand times as powerful, because you really understand what it's moving around, and so on. And as I said, I'm trying to give myself a lot of these challenges of really trying to work as little directly with the code as possible, and as much as possible just getting the model to do it, because I believe that eventually we'll get to a point where that will be how most engineering is. I think code will be very unimportant. I think the current state, where we think about it as, okay, but we need engineers to review all the code in turn — it just doesn't make sense that humans will be spending a bunch of their time sitting in GitHub, reading machine-generated code and correcting it. It's not going to be a thing. So I think we're going to build more and more full feedback loops that allow us to work more and more without looking at the code. And that won't happen straight away: right now, for more complex things, you definitely still need to be able to go in and look at the code and sometimes just write it, and so on. But I think it can be a healthy exercise to start trying to already live in that future where code has become completely unimportant, just an artifact, the same way as the output from our compiler. Like, how often do you look today at the output of the TypeScript compiler? It's not that long ago that
first CoffeeScript came along, and one of the really important qualities of the CoffeeScript compiler was that it generated JavaScript you could really read and understand, because you kind of still really wanted to keep check on whether it did the right thing, and so on. Then the first transpilers were still very much that: oh, they'd better generate something we can verify is good JavaScript. And today, I don't think anyone spends any time looking at the output of TypeScript and saying, oh, does it really generate good JavaScript or not — who cares, the compiler said it works, it probably works. Exactly — with these LLMs we're going to end up in the same world, where right now we care a lot about things like, is the code following our linting rules, and stuff like that is just not going to be a thing. Save so much time on CI by not running Prettier on your whole code base. Who cares about commas anymore, you know?
You'll save what you spend on tokens by not running it anymore. Awesome — this was sick, man, thanks for coming. Yeah, thanks for coming on, appreciate you stopping by. Tyler says you need that Claude Max 20x plan — Claude Opus for the whole day. Yeah. But thanks a lot, Matt — it was great talking to you, great learning a little more about all the agent week launches you had, and doing some vibe coding together and getting this chat app to work. So we definitely want to have you on again next time you have the next big thing to announce, which I'm sure is coming soon. Yep, plenty of stuff in the works, so yeah, that would be fun. See you around, dude. Yeah, we'll see you, have a great one, bye. Dude, that was sick. Yeah, dude, that was
sick. All right, everybody, thank you for tuning in today. I think we're about to wrap up — we've been going strong for an hour and 40 minutes, and there are 275 of y'all watching. So thanks; if you're one of the 275 who have watched some of this, we like to think it's fun, and obviously some of you agree, but we appreciate you being here. We did the Mastra speedrun, we talked some AI news, we had Matt from Netlify. We do this pretty much every Monday, usually around noon Pacific — today was a little later, we had some other things going on — but we appreciate you all for being here, and we'll see you next time. Yeah, tune in Friday for the Canadians. We do have another stream on Fridays, the Mastra code-along. I don't know if they'll be doing it this week — it's the Fourth, so maybe they'll take it off in celebration, even though they don't celebrate it. But do they? July 1st — oh, tomorrow is their holiday. Okay, all right. So I don't know, Tyler, if you're still watching, whether you're going to be doing a live stream on Friday, but normally on Friday they do a Mastra code-along, where they actually dive a little deeper into the code than we get to on this stream. So, all right, we should wrap it up.