Working with complex codebases
1. Working with complex codebases
Welcome back! In this video, we'll talk about how to give more context to Cursor so it can understand and work better with complex codebases.
2. Introducing Max Mode and MCP servers
We'll look at two powerful tools that make this possible: Max Mode and MCP servers.
3. Max Mode
When working on smaller projects like our Next.js portfolio app, Cursor's normal context window works perfectly. But as our projects grow (say we're working on a monorepo, or our codebase has deeply connected components and shared modules), Cursor might need more context to reason accurately. That's exactly what Max Mode is for. We can enable Max Mode from the model selector, and it expands Cursor's context window, letting the AI "see" more of our codebase at once.
4. Max Mode
This is especially useful when we're refactoring code across multiple files, debugging logic that depends on shared utilities, or working on projects with nested dependencies. There's a small trade-off, though: because Max Mode reads more data, it can take slightly longer to respond. So we can use it for cross-file debugging or larger refactors, and switch back to normal mode when we just need quick edits. It's a choice between speed and depth, depending on what we're working on.
5. Model Context Protocol (MCP)
Next, let's talk about how to give Cursor even more context, not just from our local codebase, but from external sources. That's where MCP, the Model Context Protocol, comes in.
6. MCP: a standardized protocol
Think of MCP like a USB port for AI applications. Just as USB provides a standardized way to connect electronic devices, MCP provides a standardized way to connect AI applications to external systems.
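To make the analogy a little more concrete, here is a minimal sketch of what sits on the other end of the "plug": a tiny MCP server exposing one tool over the standard protocol. This is illustrative only; it assumes the official TypeScript SDK (@modelcontextprotocol/sdk) and zod, and the server name, tool name, and greeting are made up. We never need to write code like this just to connect Cursor to an existing server.

    import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
    import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
    import { z } from "zod";

    // A hypothetical server: one name, one version, one tool.
    const server = new McpServer({ name: "docs-helper", version: "1.0.0" });

    // Any MCP-aware client (such as Cursor) can discover and call this tool.
    server.tool(
      "greet",
      { name: z.string() },
      async ({ name }) => ({
        content: [{ type: "text", text: `Hello, ${name}!` }],
      })
    );

    // Communicate with the client over stdio, one of the standard transports.
    async function main() {
      await server.connect(new StdioServerTransport());
    }
    main();

The point of the standard is that the client side stays the same no matter what the server does internally, which is exactly why Cursor can plug into many different servers through one config file.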
7. MCP in action
To set up an MCP server inside Cursor, we first open Cursor Settings. Under Tools & MCP, clicking "Add a new MCP Server" opens the mcp.json file, where we add the server's configuration. Let's connect the Langbase Docs MCP server, which provides extended context from the Langbase documentation and is especially useful when we're shipping AI agents. We head to the Langbase Docs site, open the MCP section, choose the Cursor option, copy the config code, paste it into our mcp.json file inside Cursor, and save. Most public MCP servers ship ready-to-paste mcp.json snippets like this, so connecting them follows the same pattern. Once connected, Cursor automatically pulls in relevant context from Langbase while generating responses. For instance, if we ask Cursor, "Explain what memory agents are on Langbase," it pulls references from the connected Langbase documentation. This makes Cursor much smarter when handling large-scale or enterprise codebases, because it now understands both our code and the documentation behind it.
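For reference, a Cursor mcp.json entry generally follows the shape below. This is a sketch, not the actual snippet from the Langbase Docs page: the server name and URL are placeholders, so we always paste the real values provided by the docs site.

    {
      "mcpServers": {
        "langbase-docs": {
          "url": "https://example.com/langbase-docs/mcp"
        }
      }
    }

Remote servers are typically configured with a "url" field like this, while locally run servers instead use "command" and "args" to tell Cursor how to launch them.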
8. MCP in action
If we visit github.com/mcp, we'll find the popular MCP servers listed with configuration snippets we can copy and paste into Cursor. We can even connect multiple MCP servers, for design specs, APIs, or internal tools, giving Cursor a full picture of our development environment. Together, Max Mode and MCP servers make Cursor ready for advanced, real-world projects, where understanding the bigger picture really matters.
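Connecting several servers just means adding more entries under "mcpServers" in the same file. The names, URL, and package below are placeholders for illustration; the real snippets come from github.com/mcp or each provider's documentation.

    {
      "mcpServers": {
        "langbase-docs": {
          "url": "https://example.com/langbase-docs/mcp"
        },
        "design-specs": {
          "command": "npx",
          "args": ["-y", "example-design-specs-mcp-server"]
        }
      }
    }

Each entry shows up as its own server in Tools & MCP, and Cursor can draw on all of them when generating responses.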
9. Let's practice!
Now, let's turn on Max Mode and explore MCP servers in Cursor.