Open Codex CLI brings the power of large language models (LLMs) directly to your terminal, making advanced AI coding assistance accessible from the command line. This guide shows developers, API engineers, and technical leads how to use Open Codex CLI with Google’s Gemini 2.5 Pro model—one of the most capable LLMs available today.
Whether you’re automating code generation, refactoring scripts, or streamlining repetitive tasks, integrating Gemini into your CLI workflow can boost productivity, code quality, and team efficiency—all without leaving your terminal.
What Is Open Codex CLI and Why Use It with Gemini?
Open Codex CLI is an open-source tool (npm: open-codex) originally forked from OpenAI’s Codex CLI, but now supports multiple AI providers, including Google Gemini, OpenAI, OpenRouter, and Ollama. Its terminal-native design allows you to:
- Ask technical questions and generate code directly in your CLI
- Automate boilerplate, refactoring, and code review tasks
- Execute shell commands in a sandboxed, reviewable way
- Work securely alongside version control (Git)
Combining Open Codex CLI with Google Gemini 2.5 Pro offers:
- Cutting-Edge AI: Gemini 2.5 Pro provides deep reasoning, clear explanations, and advanced code generation.
- Terminal-First Workflow: No context switching—get AI-powered help as you code, test, and build.
- Open Source Flexibility: Switch between AI providers or models as needed, with customizable configuration.
- Enhanced Security: All commands run in a sandbox by default, with approval prompts and Git integration for change safety.
- Team Productivity: Coders, QA, and backend engineers can iterate faster while maintaining code quality.
“In just a few hours my friend forked OpenAI Codex to work with Gemini 2.5 Pro 🤯”
Sawyer Hood, April 17, 2025
Prerequisites
Before getting started, make sure you have:
- Operating System: macOS 12+, modern Linux (Ubuntu 20.04+ or Debian 10+), or Windows 10/11 via WSL2. (No native Windows support.)
- Node.js: Version 22 or newer (run `node -v` to check; see the quick check after this list). Download Node.js if needed.
- Google Gemini API Key: Obtain one from Google AI Studio or the Google Cloud Console. Keep it secure.
- Terminal Skills: Familiarity with your shell (Terminal, Bash, Zsh, etc.)
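Before installing, it helps to confirm the toolchain is ready. A minimal check might look like this:

```bash
# Open Codex CLI requires Node.js v22 or newer
node -v
# npm ships with Node.js and is used for the install in Step 1
npm -v
```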
Step 1: Install Open Codex CLI
Install Open Codex CLI globally using npm:
```bash
npm install -g open-codex
```
Or, if you use Yarn:
```bash
yarn global add open-codex
```
Verify installation:
```bash
open-codex --version
```
If you run into permission issues, follow npm's guide to fixing permissions.
Step 2: Configure Gemini Access
To use Gemini, you need to provide your API key and set Gemini as the provider.
Method 1: Environment Variable (Quick Start)
Set your Gemini API key as an environment variable:
```bash
export GOOGLE_GENERATIVE_AI_API_KEY="YOUR_API_KEY_HERE"
```
This only lasts for the current terminal session. To make it permanent, add the export line to your shell config (e.g., `~/.zshrc` for Zsh, `~/.bashrc` or `~/.bash_profile` for Bash), then run `source ~/.zshrc` to reload.
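On Zsh, for example, that persistence step might look like this (swap in your real key, and use `~/.bashrc` if you are on Bash):

```bash
# Append the export to your shell config so new sessions pick it up,
# then reload the current session. Replace the placeholder with your real key.
echo 'export GOOGLE_GENERATIVE_AI_API_KEY="YOUR_API_KEY_HERE"' >> ~/.zshrc
source ~/.zshrc
```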
Each time you run Open Codex CLI, specify Gemini as the provider:
```bash
open-codex --provider gemini "What is the capital of France?"
```
Method 2: Configuration File (Recommended for Persistent Setup)
Set Gemini as the default provider in a config file:
- Create the config directory:
```bash
mkdir -p ~/.codex
```
- Create or edit `~/.codex/config.json` using your preferred editor:
```bash
nano ~/.codex/config.json
```
- Add:
```json
{
  "provider": "gemini",
  "model": "gemini-2.5-pro-preview-03-25"
}
```
- Save and exit.
You still need to set `GOOGLE_GENERATIVE_AI_API_KEY` as an environment variable as described above: the config file sets the provider and model, while the environment variable supplies the credentials.
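To confirm both pieces are in place before your first run, a quick check like this can help (a minimal sketch; the `${VAR:+...}` expansion reports whether the key is set without printing it):

```bash
# Confirm the default provider/model and that the key is present
# (prints "key is set" only if the variable is non-empty).
cat ~/.codex/config.json
echo "${GOOGLE_GENERATIVE_AI_API_KEY:+key is set}"
```

With both in place, running open-codex now defaults to Gemini: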
```bash
open-codex "Explain the concept of API versioning."
```
Step 3: Basic Usage Examples with Gemini
Interactive REPL Mode
Start a conversational session:
```bash
open-codex
```
or, to set the provider explicitly:
```bash
open-codex --provider gemini
```
You'll see:
```
Welcome to Open Codex CLI! Type your request, then hit Enter twice to send.
>
```
Type your prompt (e.g., "How does Gemini 2.5 Pro differ from earlier models?"), then hit Enter twice to send.
Direct Prompting
For quick, one-off answers:
```bash
open-codex "Show a Python example of API pagination."
```
To specify the model or provider explicitly:
```bash
open-codex --provider gemini --model gemini-2.5-pro-preview-03-25 "Write a bash script to find all files larger than 10MB."
```
Or use a faster, cheaper model for lightweight tasks:
```bash
open-codex --provider gemini --model gemini-2.0-flash "Summarize the key points of RESTful API design."
```
Step 4: Advanced Features for AI-Powered Development from Your Terminal
File System Interaction
Open Codex CLI can read and modify files, enabling real-world coding tasks:
- Read a file for suggestions
```bash
open-codex --provider gemini "Read 'api/routes.js' and suggest improvements for maintainability."
```
- Patch a file (with approval)
```bash
open-codex --provider gemini "Add error handling to 'api/routes.js' for all endpoints."
```
The CLI will show a diff and ask: `Apply patch? [y/N]`
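Since the CLI is designed to work alongside Git, one reasonable habit (not mandated by the tool) is to review and, if needed, revert applied patches with ordinary Git commands:

```bash
git status                      # see which files the patch touched
git diff api/routes.js          # review the exact changes that were applied
git checkout -- api/routes.js   # discard the patch if it misses the mark
```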
Command Execution (with Safety)
Open Codex CLI can run shell commands suggested by Gemini, but only after your approval (unless you change approval mode):
- suggest (the default): You approve every file change and shell command.
- auto-edit: File changes are applied automatically; shell commands still require approval.
- full-auto: Both file changes and commands run automatically (use with caution).
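To switch modes for a single run, the fork generally keeps the upstream Codex CLI's `--approval-mode` flag, but confirm with `open-codex --help` on your installed version before relying on it:

```bash
# Assumes the upstream --approval-mode flag is available in this build;
# check `open-codex --help` to confirm.
open-codex --provider gemini --approval-mode auto-edit "Add JSDoc comments to every exported function in 'api/routes.js'."
```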
Example (in the default suggest mode):
```bash
open-codex --provider gemini "Install the 'requests' library using pip."
```
Gemini may propose `pip install requests`, and the CLI will ask: `Run command? [y/N]`
All commands are sandboxed—network disabled and directory-confined for safety.
Project Context with codex.md
Guide Gemini with persistent project instructions:
- `~/.codex/instructions.md`: global personal instructions
- `codex.md` at the repository root: project-wide context
- `codex.md` in the current working directory: subdirectory-specific context
Use these files to describe API conventions, libraries, or team practices—so you don’t have to repeat them in every prompt.
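As a sketch of what such a file might contain (the conventions below are hypothetical placeholders, not requirements of the tool):

```bash
# Create an illustrative codex.md at the repository root; adapt the
# conventions to your own project.
cat > codex.md << 'EOF'
# Project context for Open Codex CLI
- REST API built with Express; route handlers live in api/
- Prefer async/await over raw promise chains
- Return errors as JSON with an error code and message field
EOF
```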
Troubleshooting & Best Practices
- API Key Issues: Double-check that `GOOGLE_GENERATIVE_AI_API_KEY` is set and accessible in your current session.
- Model Not Found: Make sure the model name matches a supported one (e.g., `gemini-2.5-pro-preview-03-25`).
- Provider Errors: Confirm `provider` is set to `gemini` in the config file or via the `--provider` flag.
- Verbose Debugging: Prepend `DEBUG=true` to your command for detailed logs (see the example after this list).
- Start Simple: Begin with basic prompts; increase complexity as needed.
- Always Review Changes: Never approve file or shell command changes blindly, especially in production codebases.
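For instance, a verbose run might look like this (the prompt text is just a placeholder):

```bash
# DEBUG=true enables detailed logging for a single invocation
DEBUG=true open-codex --provider gemini "Explain the difference between PUT and PATCH."
```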
Conclusion
Integrating Open Codex CLI with Google Gemini 2.5 Pro empowers API developers, QA teams, and backend engineers to harness advanced AI directly within the terminal. From generating API code snippets to refactoring scripts and automating repetitive tasks, this combination enhances productivity, code quality, and security.
By following this guide, you can set up a powerful, developer-centric AI workflow—while keeping control over your codebase and workflow. Experiment with prompt styles, approval modes, and project context for best results.
And if you’re looking to bring this level of efficiency to your API lifecycle—testing, documentation, and team collaboration included—consider exploring Apidog as your all-in-one platform.



