Getting Started with Gemini API in Google Colab

API Call to the AI Model

To use the Gemini models via the google.generativeai library in Colab, we need an API key from Google AI Studio. Visit Google AI Studio – API Keys, sign in with your Google account, and generate a new API key if you haven’t already. Once generated, this key can be securely stored using Colab’s secrets manager, allowing us to authenticate and interact with Gemini models programmatically.
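Once the key is stored, a minimal setup cell looks like the sketch below (assuming the secret is saved under the name GOOGLE_API_KEY, which is the name used throughout this tutorial):

# The client library is usually pre-installed in Colab; if not, run:
# !pip install -q google-generativeai

import google.generativeai as genai
from google.colab import userdata

# Read the key from Colab's secrets manager and configure the client
GOOGLE_API_KEY = userdata.get('GOOGLE_API_KEY')
genai.configure(api_key=GOOGLE_API_KEY)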




Gemini AI with Python on Google Colab using API | Complete Beginner's Guide #colab #aitools


Listing Available Models from Google Generative AI

This script uses the google.generativeai library to list all available models supported by our API key. It retrieves the key from Colab’s userdata securely and prints model names and their supported generation methods.

import google.generativeai as genai
from google.colab import userdata

try:
  GOOGLE_API_KEY = userdata.get('GOOGLE_API_KEY')
  genai.configure(api_key=GOOGLE_API_KEY)

  # List available models
  for m in genai.list_models():
    print(f"Name: {m.name}")
    print(f"Supported generation methods: {m.supported_generation_methods}")
    print("-" * 20)

except Exception as e:
  print(f"An error occurred: {e}")
  print("Check your API key with the name 'GOOGLE_API_KEY'.")
Part of the output is shown here:
Name: models/embedding-gecko-001
Supported generation methods: ['embedText', 'countTextTokens']
--------------------
Name: models/gemini-1.5-pro-latest
Supported generation methods: ['generateContent', 'countTokens']
--------------------
Name: models/gemini-1.5-pro-002
Supported generation methods: ['generateContent', 'countTokens', 'createCachedContent']
--------------------
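If you only need the models that work with generate_content(), you can filter the same listing on supported_generation_methods. A minimal sketch, assuming the API key is already configured as above:

# Keep only models that advertise the generateContent method
for m in genai.list_models():
  if 'generateContent' in m.supported_generation_methods:
    print(m.name)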

Text Generation with Gemini API in Google Colab

This script uses the google.generativeai library to initialize the gemini-1.5-flash-latest model and generate a short poem about the stars. The API key is accessed via userdata.get(), and the generate_content() method handles the text generation task.

import google.generativeai as genai
from google.colab import userdata

try:
  GOOGLE_API_KEY = userdata.get('GOOGLE_API_KEY')
  genai.configure(api_key=GOOGLE_API_KEY)

  # Initialize the model that supports generateContent
  # You can choose another model from the list if you prefer, e.g., 'gemini-1.5-pro-latest'
  model = genai.GenerativeModel('gemini-1.5-flash-latest')

  # Generate content
  response = model.generate_content("Write a short poem about the stars.")
  print(response.text)

except Exception as e:
  print(f"An error occurred: {e}")
  print("Check your API key in the name 'GOOGLE_API_KEY'.")
The output is shown here:
Distant suns, a scattered gleam,
Across the velvet, midnight stream.
A million points of silver light,
Igniting darkness, pure and bright.
Silent stories, ages old,
In cosmic dust, their tales unfold.

Generating Content Dynamically Based on User Prompt

This script uses the google.generativeai library and allows the user to enter a custom prompt for generating text with the gemini-1.5-flash-latest model. The API key is retrieved via userdata.get(), and the response is printed dynamically based on user input.

import google.generativeai as genai
from google.colab import userdata

try:
  GOOGLE_API_KEY = userdata.get('GOOGLE_API_KEY')
  genai.configure(api_key=GOOGLE_API_KEY)

  # Initialize the model that supports generateContent
  # You can choose another model from the list if you prefer,
  # e.g., 'gemini-1.5-pro-latest'
  model = genai.GenerativeModel('gemini-1.5-flash-latest')

  # Get prompt from user input
  user_prompt = input("Enter your prompt for the model: ")

  # Generate content based on user input
  response = model.generate_content(user_prompt)
  print(response.text)

except Exception as e:
  print(f"An error occurred: {e}")
  print("Please check your API key.")
Input prompt:
Enter your prompt for the model: Tell me why flight paths are not straight lines.
Output (partial):
Flight paths aren't straight lines for a variety of reasons, all boiling down to efficiency, safety, and regulatory compliance:

* **Jet Streams:**  These high-altitude air currents can significantly impact flight time and fuel consumption.  Pilots often plan routes to take advantage of tailwinds (flying with the jet stream) and avoid headwinds (flying against it). This results in a curved path, even if the ground distance appears longer.

💻 Local Python Script for VS Code or any other platform.

We keep the API key in a local .env file in the same directory and read it from the script.

🔐 .env File (Same Directory)

GOOGLE_API_KEY=your_actual_api_key_here

  • Do not add quotes around the key value.
  • If you're using Git, add .env to your .gitignore to avoid exposing your API key.

Here is the code to read the API key from the local .env file and configure the model with it.
import os
import google.generativeai as genai
from dotenv import load_dotenv

# Load API key from .env file
load_dotenv()
GOOGLE_API_KEY = os.getenv("GOOGLE_API_KEY")

try:
    # Configure Gemini API
    genai.configure(api_key=GOOGLE_API_KEY)

    # Load and initialize the model
    model = genai.GenerativeModel('gemini-1.5-flash-latest')

    # Get prompt from user input
    user_prompt = input("Enter your prompt for the model: ")

    # Generate content based on user input
    response = model.generate_content(user_prompt)
    print(response.text)

except Exception as e:
    print(f"An error occurred: {e}")
    print("Please check your API key.")

Chat Mode


Gemini AI Chat Mode with Python in VS Code | Interactive Prompt Handling Tutorial

In this script, we use chat mode with the Gemini AI model, which allows for multi-turn conversations where the context is preserved across messages. Unlike single-prompt interactions, chat mode enables the model to understand follow-up questions based on earlier inputs—so you don’t need to repeat the full context each time. This makes it ideal for building conversational tools, assistants, or interactive AI applications.
import os
import google.generativeai as genai
from dotenv import load_dotenv

# Load API key from .env file
load_dotenv()
GOOGLE_API_KEY = os.getenv("GOOGLE_API_KEY")

try:
    # Configure Gemini API
    genai.configure(api_key=GOOGLE_API_KEY)

    # Initialize chat model
    model = genai.GenerativeModel('gemini-1.5-flash-latest')
    chat = model.start_chat()

    print("🤖 Gemini AI Chat (type 'exit' to quit)\n")

    while True:
        user_input = input("You: ")

        if user_input.lower() in ['exit', 'quit']:
            print("👋 Exiting chat. Goodbye!")
            break

        response = chat.send_message(user_input)
        print(f"Gemini: {response.text}\n")

except Exception as e:
    print(f"An error occurred: {e}")
    print("Please check your API key.")

Frequently Asked Questions

Q1: What is the purpose of the google-generativeai library?

This library allows developers to interact with Google's Gemini AI models using Python to generate content from text or image prompts.

Q2: How do I install the required packages?

You can install the package using pip install google-generativeai. For handling environment variables, use pip install python-dotenv.
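For reference, both commands together (prefix each with ! when running inside a Colab cell):

pip install google-generativeai
pip install python-dotenv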

Q3: How do I keep my API key secure?

Store your API key in a .env file and use the python-dotenv package to load it securely in your script.

Q4: Can I send follow-up messages to the model?

Yes, by using the start_chat() method, you can maintain context in a conversation and send follow-up messages using send_message().
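A minimal sketch of a follow-up question, assuming the API key has already been configured as shown earlier:

model = genai.GenerativeModel('gemini-1.5-flash-latest')
chat = model.start_chat()

# The first message sets the context
print(chat.send_message("Name one planet that has rings.").text)

# The follow-up relies on the earlier turn, so the context is not repeated
print(chat.send_message("How far is it from Earth?").text)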

Q5: Which Gemini model should I use?

You can use 'gemini-1.5-pro-latest' for full capabilities or 'gemini-1.5-flash-latest' for faster, lightweight responses depending on your use case.

Q6: Can this code run outside of Google Colab?

Yes. The same code runs on your local machine once you replace the Colab-specific parts (such as userdata) with the .env approach shown above and install the necessary packages via pip.

Conclusion

Integrating Gemini AI with Python opens up powerful possibilities for generating and interacting with natural language content directly from your code. Whether you're creating simple prompts or building full chat-based applications, Gemini provides a flexible API that works seamlessly in environments like Google Colab or your local system. By securely managing your API key and choosing the right model variant, you can start building intelligent, context-aware applications with minimal setup. As you explore further, you’ll find this integration to be a valuable addition to your Python development toolkit.
Next: Working with Gemini AI Image »
