
To use the Gemini models via the google.generativeai library in Colab, we need an API key from Google AI Studio. Visit Google AI Studio – API Keys, sign in with your Google account, and generate a new API key if you haven’t already. Once generated, this key can be securely stored using Colab’s secrets manager, allowing us to authenticate and interact with Gemini models programmatically.
This script uses the google.generativeai library to list all available models supported by our API key. It retrieves the key from Colab’s userdata securely and prints model names and their supported generation methods.
import google.generativeai as genai
from google.colab import userdata

try:
    GOOGLE_API_KEY = userdata.get('GOOGLE_API_KEY')
    genai.configure(api_key=GOOGLE_API_KEY)

    # List available models
    for m in genai.list_models():
        print(f"Name: {m.name}")
        print(f"Supported generation methods: {m.supported_generation_methods}")
        print("-" * 20)
except Exception as e:
    print(f"An error occurred: {e}")
    print("Check your API key with the name 'GOOGLE_API_KEY'.")
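The model list can be long; a small helper can narrow it to models that support a particular generation method. This is a sketch that works on any objects exposing `name` and `supported_generation_methods`, the shape yielded by `genai.list_models()`; the demo uses stand-in objects so it runs without an API key:

```python
from types import SimpleNamespace

def models_supporting(models, method):
    """Return names of models whose supported_generation_methods
    include the given method (e.g. 'generateContent')."""
    return [m.name for m in models if method in m.supported_generation_methods]

# Stand-in entries shaped like genai.list_models() results:
sample = [
    SimpleNamespace(name='models/embedding-gecko-001',
                    supported_generation_methods=['embedText', 'countTextTokens']),
    SimpleNamespace(name='models/gemini-1.5-pro-latest',
                    supported_generation_methods=['generateContent', 'countTokens']),
]
print(models_supporting(sample, 'generateContent'))
# → ['models/gemini-1.5-pro-latest']
# In Colab you would call: models_supporting(genai.list_models(), 'generateContent')
```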
Part of the output:
Name: models/embedding-gecko-001
Supported generation methods: ['embedText', 'countTextTokens']
--------------------
Name: models/gemini-1.5-pro-latest
Supported generation methods: ['generateContent', 'countTokens']
--------------------
Name: models/gemini-1.5-pro-002
Supported generation methods: ['generateContent', 'countTokens', 'createCachedContent']
--------------------
This script uses the google.generativeai library to initialize the gemini-1.5-flash-latest model and generate a short poem about the stars. The API key is accessed via userdata.get(), and the generate_content() method handles the text generation task.
import google.generativeai as genai
from google.colab import userdata

try:
    GOOGLE_API_KEY = userdata.get('GOOGLE_API_KEY')
    genai.configure(api_key=GOOGLE_API_KEY)

    # Initialize a model that supports generateContent.
    # You can choose another model from the list if you prefer,
    # e.g. 'gemini-1.5-pro-latest'.
    model = genai.GenerativeModel('gemini-1.5-flash-latest')

    # Generate content
    response = model.generate_content("Write a short poem about the stars.")
    print(response.text)
except Exception as e:
    print(f"An error occurred: {e}")
    print("Check that your API key is stored under the name 'GOOGLE_API_KEY'.")
Output:
Distant suns, a scattered gleam,
Across the velvet, midnight stream.
A million points of silver light,
Igniting darkness, pure and bright.
Silent stories, ages old,
In cosmic dust, their tales unfold.
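If you want more control over the output, `GenerativeModel` also accepts a `generation_config`. The parameter values below are illustrative, not recommendations; the API-calling lines are shown as comments because they need a configured key:

```python
# Illustrative sampling settings (hypothetical values; tune per task):
gen_params = {
    "temperature": 0.7,        # higher = more varied wording
    "top_p": 0.9,              # nucleus sampling cutoff
    "max_output_tokens": 120,  # cap on response length
}

# In Colab, with the API key configured as above:
# import google.generativeai as genai
# model = genai.GenerativeModel(
#     'gemini-1.5-flash-latest',
#     generation_config=genai.GenerationConfig(**gen_params))
# print(model.generate_content("Write a short poem about the stars.").text)
```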
This script uses the google.generativeai library and allows the user to enter a custom prompt for generating text with the gemini-1.5-flash-latest model. The API key is retrieved via userdata.get(), and the response is printed dynamically based on user input.
import google.generativeai as genai
from google.colab import userdata

try:
    GOOGLE_API_KEY = userdata.get('GOOGLE_API_KEY')
    genai.configure(api_key=GOOGLE_API_KEY)

    # Initialize a model that supports generateContent.
    # You can choose another model from the list if you prefer,
    # e.g. 'gemini-1.5-pro-latest'.
    model = genai.GenerativeModel('gemini-1.5-flash-latest')

    # Get prompt from user input
    user_prompt = input("Enter your prompt for the model: ")

    # Generate content based on user input
    response = model.generate_content(user_prompt)
    print(response.text)
except Exception as e:
    print(f"An error occurred: {e}")
    print("Please check your API key.")
Input prompt:
Enter your prompt for the model: Tell me why flight path are not straight line.
Output (part only):
Flight paths aren't straight lines for a variety of reasons, all boiling down to efficiency, safety, and regulatory compliance:
* **Jet Streams:** These high-altitude air currents can significantly impact flight time and fuel consumption. Pilots often plan routes to take advantage of tailwinds (flying with the jet stream) and avoid headwinds (flying against it). This results in a curved path, even if the ground distance appears longer.
To run outside Colab, store your API key in a local .env file:

GOOGLE_API_KEY=your_actual_api_key_here

Here is the code to read the API key from the .env file and configure the model with it.
import os
import google.generativeai as genai
from dotenv import load_dotenv

# Load API key from .env file
load_dotenv()
GOOGLE_API_KEY = os.getenv("GOOGLE_API_KEY")

try:
    # Configure Gemini API
    genai.configure(api_key=GOOGLE_API_KEY)

    # Load and initialize the model
    model = genai.GenerativeModel('gemini-1.5-flash-latest')

    # Get prompt from user input
    user_prompt = input("Enter your prompt for the model: ")

    # Generate content based on user input
    response = model.generate_content(user_prompt)
    print(response.text)
except Exception as e:
    print(f"An error occurred: {e}")
    print("Please check your API key.")
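When the key comes from a .env file, a misspelled variable name silently yields None; a small guard gives a clearer error up front. `require_env` is a helper name invented for this sketch:

```python
import os

def require_env(name):
    """Return the value of an environment variable, or raise a clear error."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Environment variable {name!r} is missing or empty; "
                           "check your .env file.")
    return value

# Demonstration with a variable set in-process:
os.environ['DEMO_KEY'] = 'abc123'
print(require_env('DEMO_KEY'))  # → abc123
# In the scripts above you would write:
# GOOGLE_API_KEY = require_env('GOOGLE_API_KEY')
```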
This script uses the start_chat() method to hold a multi-turn conversation: the chat object keeps the message history, so each send_message() call sees the earlier turns.

import os
import google.generativeai as genai
from dotenv import load_dotenv

# Load API key from .env file
load_dotenv()
GOOGLE_API_KEY = os.getenv("GOOGLE_API_KEY")

try:
    # Configure Gemini API
    genai.configure(api_key=GOOGLE_API_KEY)

    # Initialize chat model
    model = genai.GenerativeModel('gemini-1.5-flash-latest')
    chat = model.start_chat()

    print("🤖 Gemini AI Chat (type 'exit' to quit)\n")
    while True:
        user_input = input("You: ")
        if user_input.lower() in ['exit', 'quit']:
            print("👋 Exiting chat. Goodbye!")
            break
        response = chat.send_message(user_input)
        print(f"Gemini: {response.text}\n")
except Exception as e:
    print(f"An error occurred: {e}")
    print("Please check your API key.")
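A chat session records its turns in `chat.history`; a helper can render them as a transcript. The sketch below assumes each history entry has a `role` and a list of `parts` with `text`, matching the `google.generativeai` chat API; the demo uses stand-in objects so it runs without an API key:

```python
from types import SimpleNamespace

def format_history(history):
    """Render chat history entries (role + text parts) as a transcript."""
    lines = []
    for entry in history:
        text = ''.join(part.text for part in entry.parts)
        lines.append(f"{entry.role}: {text}")
    return '\n'.join(lines)

# Stand-in history shaped like chat.history from the script above:
demo = [
    SimpleNamespace(role='user', parts=[SimpleNamespace(text='Hi!')]),
    SimpleNamespace(role='model', parts=[SimpleNamespace(text='Hello there.')]),
]
print(format_history(demo))
# After a real session you could call: print(format_history(chat.history))
```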
The google.colab.ai library is designed specifically for use within Google Colab. It provides a simple interface to state-of-the-art language models directly inside a notebook, and it is not intended for use outside the Colab platform.
from google.colab import ai
# Use Colab AI to ask a factual question and get a response
response = ai.generate_text("What is the capital of India?")
# Display the response
print(response)
List of available Models
from google.colab import ai
# List all available models accessible through Google Colab's AI interface
ai.list_models()
Using a specific Model
# @title Choose a different model
from google.colab import ai
# Generate response using a specific Gemini model
response = ai.generate_text(
    "What is the capital of England",
    model_name='google/gemini-2.5-flash-lite'
)
print(response)
Passing stream=True to ai.generate_text() streams the generated text in real time, chunk by chunk, as it is produced.
from google.colab import ai
# Request AI to generate a short story and stream the output as it's being created
stream = ai.generate_text(
    "Tell me a short story.",
    stream=True
)
for text in stream:
    print(text, end='')
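If you also need the complete text after streaming, collect the chunks while printing them. The demo below substitutes a plain generator for the real stream so it runs anywhere:

```python
def consume_stream(stream):
    """Print chunks as they arrive and return the full concatenated text."""
    chunks = []
    for text in stream:
        print(text, end='')
        chunks.append(text)
    return ''.join(chunks)

# Stand-in for ai.generate_text("Tell me a short story.", stream=True):
fake_stream = iter(["Once ", "upon ", "a time."])
full_text = consume_stream(fake_stream)
# full_text == "Once upon a time."
```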
# get the sample student data from the url
!wget https://www.plus2net.com/python/download/student.xlsx
Creating the Pandas DataFrame.
import pandas as pd
df=pd.read_excel('student.xlsx')
df.head()
Generating a string with the pandas DataFrame.to_string() method.
df_string = df.to_string()
print(df_string)
Passing the string to the AI model and getting the output.
from google.colab import ai
question = "From the following student data, which student has the highest mark? Please only provide the student's name.\n\n" + df_string
ai_response = ai.generate_text(question, model_name='google/gemini-2.5-flash')
print(ai_response)
Ask for full details of the student who got the highest mark.
question = "Give me the name, class , mark for the student who has the highest mark.\n\n " + df_string
ai_response = ai.generate_text(question, model_name='google/gemini-2.5-flash')
print(ai_response)
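Since the DataFrame has to be re-sent with every question, a small helper keeps the prompt construction in one place. This is a sketch assuming pandas is installed, as in the steps above; the sample DataFrame is a stand-in for the downloaded student data:

```python
import pandas as pd

def build_question(prompt, df):
    """Append a DataFrame rendered with to_string() to a prompt."""
    return prompt + "\n\n" + df.to_string()

# Small stand-in for the student data:
df = pd.DataFrame({'name': ['Alex', 'Ravi'],
                   'class': ['Four', 'Five'],
                   'mark': [78, 94]})
q = build_question("Which student has the highest mark? Name only.", df)
print(q)
# In Colab: ai.generate_text(q, model_name='google/gemini-2.5-flash')
```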
Each ai.generate_text() call is independent and keeps no conversation history: if df_string is not passed again, the model cannot answer from the previously supplied data.
question = "Give me the name, class , mark for the student who has the highest mark.\n\n"
#question="Give me the name, class , mark for the student who has the highest mark.\n\n" +df_string
ai_response = ai.generate_text(question, model_name='google/gemini-2.5-flash')
print(ai_response)
from google.colab import ai
countries = ['India', 'USA', 'Japan']
# Convert list to a comma-separated string
my_str = ",".join(map(str, countries))
# Create a prompt asking AI to generate a dictionary with country names as keys and capitals as values
question = "Create one Python dictionary by using the list names as keys \
and capitals of the countries as values.\n\n" + my_str
ai_response = ai.generate_text(
    question,
    model_name='google/gemini-2.5-flash'
)
print(ai_response)
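The model returns the dictionary as text, often inside a fenced code block; a small parser can turn it into a real Python dict. This is a sketch assuming the reply contains a single dict literal, and the sample reply below is an invented example of typical model output:

```python
import ast
import re

def extract_dict(reply):
    """Pull the first {...} literal out of a model reply and parse it safely."""
    match = re.search(r'\{.*\}', reply, re.DOTALL)
    if match is None:
        raise ValueError("No dictionary literal found in the reply.")
    return ast.literal_eval(match.group(0))

# Example reply shaped like typical model output:
reply = ("```python\n"
         "capitals = {'India': 'New Delhi', 'USA': 'Washington, D.C.', "
         "'Japan': 'Tokyo'}\n```")
capitals = extract_dict(reply)
print(capitals['India'])  # → New Delhi
```

Using ast.literal_eval instead of eval means only plain Python literals are accepted, so unexpected model output cannot execute arbitrary code.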
The google.colab.ai module itself does not provide built-in functionality for creating charts or data visualizations; you would typically use libraries like matplotlib or seaborn with pandas for such tasks.
Each ai.generate_text call is independent. As observed above, if you don't include df_string again, the model cannot answer questions based on previously provided data.
The ai.generate_text function is designed for text-based interactions and cannot directly process or generate images. While some advanced models support multimodal inputs, generate_text focuses on textual data.
The google-generativeai library, by contrast, allows developers to interact with Google's Gemini models from Python and generate content from text or image prompts.
How do I install the required packages?
You can install the package using pip install google-generativeai. For handling environment variables, use pip install python-dotenv.
How should I store my API key for local scripts?
Store your API key in a .env file and use the python-dotenv package to load it securely in your script.
Can I maintain context across multiple messages?
Yes, by using the start_chat() method you can maintain context in a conversation and send follow-up messages using send_message().
Which model should I choose?
You can use 'gemini-1.5-pro-latest' for full capabilities or 'gemini-1.5-flash-latest' for faster, lightweight responses, depending on your use case.
Can these scripts run outside Colab?
Yes, the code runs on your local machine after replacing the Colab-specific parts and installing the necessary packages via pip.
A related tutorial shows how to build a desktop chat application with Tkinter and Gemini AI, complete with settings and chat history.
Read Tutorial
Author
Passionate about coding and teaching, I publish practical tutorials on PHP, Python, JavaScript, SQL, and web development. My goal is to make learning simple, engaging, and project-oriented with real examples and source code. 🎥 Join me live on YouTube.