Look, we have only covered some of the basics here today, so I want to point out where you can find more information to continue your agent journey. Here are some of my favorite resources for diving deeper into everything we just talked about.

The LangChain documentation is a great place to start. Specifically, it provides a great high-level overview of all the packages and services in the LangChain ecosystem. We break down the LangChain core packages as well as all the LangChain community packages. We used Tavily as part of LangChain community, and we also used LangChain OpenAI, which is a separate partner package, but we have a ton of other integrations and you can explore those here (there's a quick sketch of what those imports look like just after this section). LangChain itself is more high level and covers agents, other chains, and different retrieval strategies that are good high-level entry points if you just want to get started.

We also have a bunch of templates that can be deployed easily with LangServe. LangServe is just a really easy way to turn your LangChain application into a web server (there's a minimal sketch of that below as well). Helping out with all of this is LangSmith. From the beginning, LangSmith can help with debugging, and then it can also help with monitoring in production. It also has a playground, which we saw earlier.

Looking at a few other sites: the LangChain GitHub repo has a lot of good resources, a lot of good cookbooks, and a lot of templates for getting started. The LangGraph repo, of course, has in-depth documentation on LangGraph and everything that we just covered, so great reference docs, great tutorials, and great how-to guides.

We've also done previous courses on LangChain with DeepLearning.AI that I'd highly recommend checking out. In particular, Functions, Tools and Agents with LangChain is a really good precursor to this lesson. We already saw the prompt hub once, but just to reiterate, it's a great place to go to get inspiration and see what other expert prompters are doing.
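As a quick illustration of that package split, here's roughly what the imports from the lesson look like: the Tavily search tool ships in the langchain-community package, while ChatOpenAI lives in the separate langchain-openai partner package. This is just a sketch, and the model name and max_results value are placeholders rather than anything prescribed by the course.

```python
# Community integration: the Tavily search tool is part of langchain-community.
from langchain_community.tools.tavily_search import TavilySearchResults

# Partner integration: the OpenAI chat model is in the separate langchain-openai package.
from langchain_openai import ChatOpenAI

tool = TavilySearchResults(max_results=2)   # needs TAVILY_API_KEY in the environment
model = ChatOpenAI(model="gpt-3.5-turbo")   # needs OPENAI_API_KEY; model name is a placeholder
```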
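And to give a feel for how little LangServe asks of you, here is a minimal sketch of serving a runnable as a web server. It assumes you have fastapi, uvicorn, and langserve installed; the path, title, and the bare chat model are placeholders, and in practice you'd pass in your own chain or agent.

```python
from fastapi import FastAPI
from langserve import add_routes
from langchain_openai import ChatOpenAI

app = FastAPI(title="My LangChain App")  # placeholder title

# Expose any runnable (a chain, an agent, or here just a bare chat model) at /chat.
# LangServe adds endpoints like /chat/invoke and /chat/stream, plus a playground UI.
add_routes(app, ChatOpenAI(), path="/chat")

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```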
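For LangSmith, the usual way to start getting traces for debugging and monitoring is to set a couple of environment variables before running your app. This is a sketch, assuming you already have a LangSmith API key; the project name is just a placeholder.

```python
import os

# With these set, LangChain and LangGraph runs are traced to LangSmith automatically.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "<your-langsmith-api-key>"
os.environ["LANGCHAIN_PROJECT"] = "my-agent-project"  # placeholder project name
```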