
Google Colab just made its Gemini integration considerably more useful for anyone who actually wants to learn to code rather than just get code written for them. Two new features – Custom Instructions and Learn Mode – change the dynamic between user and AI assistant in ways that matter for educators, students, and developers picking up new frameworks.
The headline feature is Learn Mode, which flips Gemini’s default behavior. Instead of generating a block of copy-paste code that solves your problem, it walks you through the solution step by step, explains the underlying concepts, and pushes you to develop the skill rather than just get the answer.
What Custom Instructions Actually Do
Custom Instructions are stored at the notebook level, which is the detail that makes this feature genuinely useful rather than just a settings menu. When you add instructions to a notebook – your preferred coding style, a specific library you want Gemini to use, context about the course syllabus a notebook is built around – those instructions travel with the notebook when you share it.
That sharing behavior is what separates this from a standard user preference. A developer building a teaching notebook can configure exactly how the Gemini assistant behaves for anyone who opens it. An educator designing coursework can ensure that every student interacting with the AI gets the same tailored experience.
Practical uses are straightforward: always use NumPy instead of base Python for array operations, maintain a specific level of explanation depth, stay aware of the topics covered in a particular course. Any context that would normally require re-explaining at the start of every session can now be encoded once and applied automatically.
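To make the first of those concrete, here is a minimal sketch of the difference a "prefer NumPy" instruction would produce in generated code. The function names are illustrative, not anything Gemini actually emits: the first is the base-Python answer a default assistant might give, the second is the vectorized version the instruction would steer toward.

```python
import numpy as np

# Base-Python approach a default assistant might suggest:
def scale_and_sum_plain(values, factor):
    return sum(v * factor for v in values)

# Vectorized version a "prefer NumPy" custom instruction
# would steer Gemini toward for array operations:
def scale_and_sum_numpy(values, factor):
    arr = np.asarray(values, dtype=float)
    return float((arr * factor).sum())
```

Both return the same result; the point of the instruction is that every response in the notebook defaults to the second style without the user asking each time.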
Learn Mode and How It Works
Learn Mode is built on top of Custom Instructions – it is essentially a pre-configured instruction set that tells Gemini to teach rather than solve. You turn it on from the Gemini chat window in Colab, and from that point the assistant shifts its approach entirely.
The practical difference is significant. A standard Gemini response to “how do I filter a list in Python” might produce a working list comprehension with a brief explanation. A Learn Mode response breaks down what a list comprehension is and why it works, walks through the logic step by step, and expects you to apply the understanding rather than just copy the output.
For developers brushing up on an unfamiliar framework, this mode accelerates actual learning rather than just task completion. For students new to programming, it prevents the pattern where AI assistance becomes a crutch that produces working code without building any underlying understanding.
Google has published example notebooks with Gemini pre-configured to use Learn Mode – one focused on list exercises, one on string exercises – that demonstrate what the guided experience looks like in practice.
Why the Notebook-Level Storage Matters
Most AI assistant customization lives at the account or application level. You configure it once for yourself and it applies everywhere. The problem with that model for educational contexts is that it assumes everyone using a shared resource has the same preferences and needs.
Storing Custom Instructions in the notebook changes the unit of customization from the user to the artifact. A Colab notebook is already a shareable, forkable document – adding AI configuration to that document means the intelligence about how to use the notebook travels with it. When someone in the Colab community shares a notebook designed to teach a specific topic, they can now also share exactly how the AI assistant should behave while working through it.
This is a more composable model for AI-assisted learning tools. The notebook author becomes the designer of the AI experience, not just the content. For the Colab community specifically, where sharing and collaboration are central to how the platform is used, this unlocks a category of more intentionally designed learning resources.
Who This Is For
Students and beginners get the most direct benefit from Learn Mode. The default behavior of AI coding assistants – generate working code quickly – is actively counterproductive for people trying to build foundational skills. Learn Mode reorients the assistant toward explanation and guided problem-solving.
Educators now have a tool for embedding AI behavior directly into course materials. A notebook designed to walk students through data structures can specify that Gemini should always explain time complexity, always use a particular syntax style, always connect examples back to the course concepts. That consistency was previously impossible to enforce when students were interacting with a general-purpose assistant.
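As a hypothetical example of what that embedded instruction set might contain for the data structures notebook described above (the wording is illustrative, not Google’s):

```
Always state the time complexity of any operation you suggest.
Use snake_case and type hints in all Python examples.
When demonstrating a data structure, connect the example back to
the concepts covered in this course's lectures.
Do not provide complete solutions to the graded exercises; guide
the student toward them instead.
```

Because the instructions live in the notebook, every student who opens it gets an assistant operating under these same constraints.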
Experienced developers learning new frameworks can use Custom Instructions to calibrate the level of explanation they actually need – skip the basics, focus on idiomatic patterns, always compare the new approach to whatever they already know well.
Both features are available now in Google Colab through the Gemini chat window. The example notebooks Google has published are a reasonable starting point for understanding what a well-configured Learn Mode experience looks like before building your own.
https://blog.google/innovation-and-ai/technology/developers-tools/colab-updates/