Build your first AI agent in 90 minutes
The guy who taught Abhi JavaScript is back! Guil Hernandez has spent 15+ years teaching developers. His courses at Treehouse, Scrimba, and LinkedIn Learning have reached over 500,000 learners, including Abhi and Shane, who both learned JavaScript and CSS from him. He just released Mastra's first video course at https://mastra.ai/learn, and it's free.

"Build Your First Agent in TypeScript" is a 90-minute, hands-on course that takes you from zero to a deployed agent. Fourteen lessons across five sections: agents, tools, workflows, memory, and production. The project is a theme park planner agent: it pulls live wait times, weather, and park hours, keeps track of what you like, and builds you an itinerary. Everything runs in Mastra Studio, so you can inspect traces, tool calls, and behavior as you go. You'll see how to wire up local tools and MCP servers side by side, how message history and observational memory change agent behavior, how to compose a workflow for a mock ticket purchase, and how to expose the whole thing as an HTTP server with one-click Slack integration.

Guil also shares his broader take on teaching AI engineering. The mechanics (syntax, boilerplate, wiring) are no longer the hard part. What matters now is how you think through a problem, whether you have the taste to spot bad output, and when to take the handoff from the AI instead of iterating forever. The gap between people who just generate output and people who can actually shape it keeps widening. This course is built for the second group.

Start here: https://mastra.ai/learn
Episode Transcript
Speaker 1: All right, we are bringing on Guil. Guil has relatively recently released the first Mastra video course. If you have been paying attention to Mastra, you know that a long time ago, we built kind of this Mastra MCP course. It was kind of innovative at the time, but it always lacked that appeal of the traditional course, where what's the best way to get started, and I want you to kind of walk me through it, teach me the basics, hold my hand a little bit so I know that I'm building that base of knowledge. I think that's something that I always look for in a really good course, and I think we really nailed it. We're really excited to work with Guil on this. Guil, do you want to give a quick introduction, and then we can definitely talk a bit more about the Mastra course, what it covers, and how you went about building it?

Speaker 2: Absolutely. I'm Guil. I've been working with the Mastra team for a few months now on the course. But yeah, I'm down in South Florida. I've been a developer and educator, and I'd say most of my work sits in that overlap. I've spent most of my career in developer education, working across many tech companies in the past.
Speaker 2: Back in the day at Treehouse and Scrimba, and I've done some content for LinkedIn Learning. I've built courses, workshops, and learning programs across front-end, full-stack, back-end, core CS, and more recently AI and agent systems. Overall, I've spent a lot of my time building, and teaching developers, beginners, and curious learners how to use these things.

Speaker 1: I'm pretty sure I saw a Treehouse course from you a long time ago.

Speaker 2: Probably a JavaScript course or React course. It's funny, the first day I joined the Mastra Slack, Abhi was like, hey, thanks for teaching me JavaScript. So I was like, whoa, the Abhi actually watched one of my courses back then?

Speaker 3: I learned JavaScript from you, dude, as well as CSS basics. I remember that one, too. Yeah, happy to have you teach the next generation. So tell us a little bit about the Mastra course. How did you break it down?

Speaker 2: Yeah, so you can check it out. It's live at mastra.ai/learn. Essentially, it's a hands-on course where you build an agent from scratch all the way to deployment with Mastra. And we really wanted to focus on how these things actually work when you wire them up, and clear up any ambiguity around agents. I know there's a lot of people out there hearing about what agents are and what they do, and they just don't know where to get started, or they feel the fear of missing out, right? Like they're not catching up. So this course was built to provide a smooth on-ramp onto that. So we start right with a basic agent. I'm from Orlando, so I grew up around theme parks, and I thought it'd just be fun and kind of simple to create, like, a theme park agent. And we'll get into that a little bit more. But essentially, we define what an agent is, we create a simple agent, we start adding tools, we connect it to MCP servers, right? We make the MCP servers' tools available alongside existing local tools. We get into the basics of multi-step workflows. We also introduce building with Mastra Skills along with the MCP docs server. We teach learners what Mastra Skills are, how to activate them, and how to consume those in your project.
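The progression Guil describes (define an agent, give it local tools, let it choose between them) can be sketched in plain TypeScript. To be clear, this is an illustrative stand-in, not Mastra's actual API: in Mastra you would define tools with `createTool` and schemas and register them on an `Agent`, and the model, not a keyword match, decides which tool to call. The tool names and `pickTool` heuristic below are invented for the sketch.

```typescript
// Illustrative sketch only -- not Mastra's real API. It shows the shape of
// "local tools an agent can choose between", with a keyword match standing
// in for the LLM's tool-selection step so the example stays self-contained.

type Tool = {
  id: string;
  description: string;
  execute: (input: { park: string }) => string;
};

// Two hypothetical local tools, like the park-data tools in the course.
const waitTimesTool: Tool = {
  id: "waitTimes",
  description: "Look up current ride wait times for a park",
  execute: ({ park }) => `Current wait times for ${park}: (live data here)`,
};

const parkHoursTool: Tool = {
  id: "parkHours",
  description: "Look up today's opening hours for a park",
  execute: ({ park }) => `${park} is open 9am-10pm today`,
};

// In a real agent the model reads the tool descriptions and the user query
// and decides which tool to invoke; this keyword check is a toy substitute.
function pickTool(query: string): Tool | undefined {
  if (/wait|ride/i.test(query)) return waitTimesTool;
  if (/hours|open/i.test(query)) return parkHoursTool;
  return undefined;
}

const tool = pickTool("What are the park hours for Epic Universe?");
console.log(tool?.execute({ park: "Epic Universe" }));
// -> "Epic Universe is open 9am-10pm today"
```

The useful intuition: tools are just described functions, and "agentic" behavior is the model choosing which one to run, with what arguments, based on the conversation.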
Speaker 2: And then we get into memory: message history, working memory, the new observational memory. And you see how the agent behavior changes as you go forward. And for this course in particular, we run everything in Mastra Studio, right? So you can get that immediate feedback loop and you can see it work. You can inspect what happened. You can use traces and observability, and you can see all the tool calls, everything in one place, right? You don't have to worry about adding a UI to it just yet. That might be another course. We use a very simple API, the Queue-Times API, but it can work with, like, Thrill Data and other more complex APIs. But essentially it just looks up park data, right? You might say, you know, I'm planning to go to Epic Universe today. What are the park hours? And then it extracts some data using MCP for the park hours, and then it looks up current wait times. It pulls live data and recommends, okay, what should I ride next? It puts an itinerary together based on the weather, based on the wait times, the park hours. Also, it uses Firecrawl to maybe go out and extract historical crowd data, like attendance, or what are the typical wait times on a Tuesday?
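The memory idea that comes up next (the agent remembering something like "I don't like roller coasters" and changing its suggestions) can be illustrated with a toy preference store. Everything here is made up for the sketch: the `Memory` type, the ride list, and the keyword-based `observe` function. Mastra's working and observational memory are richer mechanisms; this only shows why the same question gets a different answer once a preference has been remembered.

```typescript
// Toy sketch of preference memory -- an analogy for the course's memory
// lessons, not Mastra's actual memory API.

type Memory = { dislikes: string[] };

const rides = [
  { name: "Hagrid's Motorbike Adventure", type: "roller coaster", wait: 90 },
  { name: "Stardust Racers", type: "roller coaster", wait: 60 },
  { name: "Constellation Carousel", type: "family", wait: 10 },
];

// Observe a user message and store a preference. A real system would have
// the model extract this; a keyword check stands in here.
function observe(memory: Memory, message: string): void {
  if (/don't like roller coasters/i.test(message)) {
    memory.dislikes.push("roller coaster");
  }
}

// Recommendations consult memory, so agent behavior changes over time.
function recommend(memory: Memory): string[] {
  return rides
    .filter((r) => !memory.dislikes.includes(r.type))
    .sort((a, b) => a.wait - b.wait)
    .map((r) => r.name);
}

const memory: Memory = { dislikes: [] };
console.log(recommend(memory)); // all three rides, shortest wait first
observe(memory, "I don't like roller coasters");
console.log(recommend(memory)); // family rides only
```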
Speaker 2: Or what do you recommend that I ride in the morning? I have kids, for example. I don't like roller coasters. So we use memory to kind of keep track of those things and make suggestions based on that. And we also tie in a workflow, right? So we build a simple workflow that simulates, like, a mock ticket purchase. And it walks you through the whole flow of, like, is it the right park? And how much does a ticket to, like, Animal Kingdom cost? You know, and for what date? And then it prints out an itinerary and some suggested things to do that day after it confirms the purchase. And then we teach deployment, right? So the idea is to teach how you can actually call that behavior, and how Mastra can expose all of those via endpoints that you can consume in an app later. So yeah, essentially how Mastra can become an HTTP server. We also teach a one-click enablement of a Slack bot integration, so you can actually interact with it on the go via Slack, for example. So we created it because agents, as I said, are starting to show up everywhere, but it's not obvious what they are or how they actually work when you build one.
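The mock ticket purchase is a good example of the agent-versus-workflow distinction: a workflow is a fixed pipeline of typed steps, while an agent decides its own path. A minimal plain-TypeScript sketch of that pipeline, under stated assumptions: the step names, the `Order` shape, and the prices are all invented here, and Mastra has its own workflow primitives rather than bare function composition.

```typescript
// A workflow as a fixed sequence of typed steps, echoing the course's mock
// ticket purchase. Unlike an agent, the path through the steps never varies.

type Order = { park: string; date: string; price?: number; confirmed?: boolean };

// Hypothetical price table -- values are made up for the sketch.
const PRICES: Record<string, number> = { "Animal Kingdom": 119, "Epic Universe": 139 };

// Step 1: is it a park we know about?
function validatePark(order: Order): Order {
  if (!(order.park in PRICES)) throw new Error(`Unknown park: ${order.park}`);
  return order;
}

// Step 2: look up the ticket price.
function priceTicket(order: Order): Order {
  return { ...order, price: PRICES[order.park] };
}

// Step 3: confirm the (mock) purchase.
function confirmPurchase(order: Order): Order {
  return { ...order, confirmed: true };
}

// Compose the steps in a deterministic order.
function runTicketWorkflow(park: string, date: string): Order {
  return confirmPurchase(priceTicket(validatePark({ park, date })));
}

const result = runTicketWorkflow("Animal Kingdom", "2025-06-03");
console.log(`Ticket to ${result.park} on ${result.date}: $${result.price} (confirmed)`);
```

In the course's version, the agent hands off to the workflow when the user wants to buy, which is exactly the "when to hand off to a workflow" decision discussed below.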
Speaker 2: I think the goal from the start was to make that concrete and provide that smooth on-ramp for those wanting to learn about and build agents, and show what actually happens when an agent runs. We want people to walk away with a clear mental model of what agentic behavior is, what it looks like in practice versus, like, a chatbot, for example: how an agent can decide what to do via tools, when it uses those tools, when to hand off to a workflow, and a sense of how to structure one themselves, and then just build on top of it using TypeScript, the stack we all love.

Speaker 1: The stack that we have learned to love over the years.

Speaker 3: I was curious, Guil, how does education in these concepts differ from maybe what you were teaching before with React, or even lower-level things like languages and stuff? Is it a lot harder to teach, or is it easier?

Speaker 2: It's a little bit of both; it depends. I also work at an ed tech company out in Austin, and there I started by leading a collaboration on building an entire AI-native programming curriculum, right, that's used at universities.
Speaker 2: And I've also been building AI grading tools and some feedback mechanisms to help review student projects. But what's become clear after working with AI in education is that the mechanics of it were never, like, the hardest part, right? AI sure can handle a lot of that: the syntax, the boilerplate, the wiring everything together. But the part that still matters is how you think through a problem, how you might structure something, how you test it, and most importantly, how you decide if it's actually good. So my focus has been teaching how to use AI well. There's a lot of misuse right now. And, you know, when you're learning and experimenting, that's all good. But I think, like, good education can really make AI an equalizer. And that also includes helping people adapt without losing confidence in the sense of what they bring to the table. And yes, I've seen students in the past where it would take nine weeks to teach a student to build, like, a portfolio, right? One that maybe has, like, some JavaScript and carousels and Bootstrap or Tailwind.
Speaker 2: But now, like, our capstone project is the student builds an agent that calls APIs and does a whole lot more than what they would have done that whole semester building out a portfolio. So it definitely expedites their learning. And I think you can get something quickly working right now, but a lot of students are doing more and more of that, and it's starting to look a lot the same, right? It works, but it's kind of shallow. It's kind of the same. And after a while, I think the foundation still matters. Without some foundation, it's hard to tell, like, all right, is my code actually holding up, or does it just happen to work? You know, the better you understand something, the better questions you can ask, the faster you can spot bad output, the more you can steer it. And like I said, I think taste matters, right? You've got to know what looks good, and not stop at the first result, and have a clear definition, and teach students, all right, what does a clear definition of done look like? And another thing is actually knowing when to maybe step in and take that handoff, if you will, from the AI, and not just over-iterate forever and ever and fall into that trap.
Speaker 1: Yeah, I mean, I think I can definitely echo that from just talking with people that are a little bit more junior on the engineering side, where you maybe don't need to understand syntax anymore. The days of needing to know that there needs to be a semicolon here, and how to debug an obscure missing-semicolon issue or something in a language, are mostly gone, right? But now it comes down to clarity of thinking. Can you logically understand what's happening and describe it in words to get what you want? And I think one thing I've realized is people are starting to realize that maybe writing the code wasn't as big of a bottleneck as what people actually thought it was. It is a bottleneck, right? It does take time. But it wasn't just writing the code, it's the iteration loops, and coming up with creative ideas for how to solve these actual problems, and making sure you have them. And I think that now is where people spend more of the time. And I think that is sometimes scary for people too, because that's actually kind of hard.
Speaker 1: You can lose yourself in, I have to move this code around, I have to write this function, this is going to take me some time, I'm going to burn half a day just reorganizing this code. But at...

Speaker 2: Yeah, absolutely. And in learning itself, there's less friction. You can try things out quicker, get feedback faster, but people are spending less time on long tutorials. One of my favorite teaching approaches is the Socratic way of learning. So one thing I like is how AI really supports that Socratic approach, right? You can stay in that loop of asking questions, testing your ideas, refining what you think you know, and being more critical in your learning. So you're actively working through things, and AI really makes that loop faster.

Speaker 3: When we first started Mastra, we started doing our education pieces. And people at that time, let's just say, they were still writing code. Sonnet 3.5 was out, and people were still writing code. Or maybe they had code assistance through the AI agent. Since then, now it's like, you know, maybe most things are fully through your coding agent and stuff like that.
Speaker 3: But people kept getting tripped up with these concepts of AI engineering, or when should I do this or that. Have you also seen students like, oh, what is a workflow? Why do I want to use one? What is an eval? Why is it called an eval? Are there questions like that, which seem more foundational, but the foundation keeps changing as the industry is progressing?

Speaker 2: I had those same questions not too long ago. There's a video in the actual course, the Mastra course, that walks through the differences between agents and workflows, and when you might use one versus the other, or combine them together. But I have seen that the gap between people who just generate output and the people who can actually shape it and work with these tools has been widening. You still need to know what you're building in the end.

Speaker 1: Who would you say is the intended audience, right? It's probably not for someone who's already shipped an agent into production in maybe another language and is just getting started with Mastra, right? It's probably someone a bit more new to AI engineering in general.
Speaker 2: It's for anyone who's curious about agents, but yeah, it's geared more towards that audience, right? Maybe you're a TypeScript developer and you want to ship an agent, right? Or maybe you've built agents using other languages and want to get up to speed with building one in Mastra in TypeScript. So it's for everyone, really. But the goal was to help someone who's never built an agent before build their first agent, ship it, and then just build on top of it.

Speaker 1: I do feel like we naturally gravitate towards people who are kind of like us in some ways. I think, Guil, with your courses on React and CSS and web development, and Abhi and I having web development backgrounds coming from Gatsby, if you're that kind of person, this will resonate really well if you're just getting into AI engineering.
Speaker 1: And of course, if you're just interested in agents in general, I think you'll find value in it. But specifically, if you have built websites with JavaScript or TypeScript before, this is probably a really good on-ramp for getting started with what an agent is and why you might want to build one.

Speaker 2: Yeah, you bet. It's been fun trying to keep up with all the changes and the things you're quickly shipping in Mastra, but even so, I'm making sure that it's holding up and that there's nothing really out of date yet.

Speaker 1: There is something inherently valuable, in my opinion, around being able to relate to a person that's teaching you something, right? You have the actual teacher that is there. I can see the person. They can teach it. It feels like they're talking to me. But video is a little harder to keep up with. And AI makes things better; it's easier to build features and code, so the number of features we're shipping is probably higher now than it would have been two or three years ago. And it was always kind of hard to keep video courses up to date. What have you seen others doing? Do you have ideas?

Speaker 2: No, it's definitely been challenging, so I try to think about it more in discrete, sort of composable pieces, not having too many videos dependent on each other. But I think what's really helped with this Mastra course is, once we introduce Mastra Skills, I encourage the learner to lean on that, right? Maybe something that you're doing and that I'm doing doesn't quite line up. Well, you know, ask Mastra Skills, like, all right, give me the latest information about this workflow, or why this tool isn't working. So once you have that knowledge, and maybe with a little context you can provide in the notes, or whatever, in the code itself, I don't have to be backed into this corner. I can take full advantage of Mastra Skills or the MCP docs server and immediately get up to speed with the latest. But yeah, in traditional content creation, it's a little bit more challenging.

Speaker 3: We'll just have to make a Guil agent, just teach all the classes. We can give him a voice. Instead of a real person, we'll make him a cartoon.

Speaker 1: Yeah, the Guil agent cartoon trainer that keeps up to date. That's the future. We will see.
Speaker 1: All right, Guil. It was great having you on. We appreciate you coming on, talking about the course, mastra.ai/learn, and talking a little bit about AI and education in general. I have kind of an education, ed tech background as well. It's kind of fascinating just how it's changing, but it's still the same, right? People need to learn things. People like to learn things from people. And I think it's just a matter of, we now have different tools, maybe, to help make that process even faster. So appreciate you coming on. We're excited to have you back on, you know, for the Mastra course version two, or updates along the way.

Speaker 4: And we'll see you next time.

Speaker 4: Appreciate it, Shane.

Speaker 5: Take care.

Speaker 4: Yeah, thanks, Guil.

Speaker 5: That guy's a legend. Just want to throw that out there.