Codex Plugins Are Here: Slack, Figma, Google Drive, and More

OpenAI rolled out plugins for Codex on March 26, 2026. The integrations cover Slack, Figma, Notion, Gmail, and Google Drive, and they work out of the box across the Codex app, Codex CLI, and IDE extensions. This is a real expansion of what Codex can do inside a development workflow, not just inside a code file.

What Plugins Actually Do

Each plugin bundles two things: authentication to your tools and skills that make Codex immediately capable within them. You don't configure credentials from scratch or prompt-engineer your way around how a given app works; each plugin arrives ready to operate.
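The auth-plus-skills pairing can be sketched as a simple data model. This is purely illustrative: the class names, fields, and scope strings below are assumptions for explanation, not any published Codex plugin API.

```python
from dataclasses import dataclass, field

# Illustrative model only: Plugin, AuthConfig, and Skill are hypothetical
# names, not part of a documented Codex plugin API.
@dataclass
class AuthConfig:
    provider: str                                  # e.g. "google", "slack"
    scopes: list[str] = field(default_factory=list)

@dataclass
class Skill:
    name: str
    description: str

@dataclass
class Plugin:
    name: str
    auth: AuthConfig     # credentials handled for you
    skills: list[Skill]  # ready-made capabilities inside the tool

drive = Plugin(
    name="google-drive",
    auth=AuthConfig(provider="google", scopes=["drive.readonly"]),
    skills=[
        Skill("read_doc", "Pull context from a spec document"),
        Skill("read_sheet", "Reference a spreadsheet"),
    ],
)
print(f"{drive.name}: {len(drive.skills)} skills, auth via {drive.auth.provider}")
```

The point of the sketch is the separation of concerns: the auth half is configured once per tool, while the skills half is what makes Codex useful inside that tool on day one.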

The Google Drive plugin is a clear example. It gives Codex access across Drive, Docs, Sheets, and Slides. That means Codex can pull context from a spec document, reference a spreadsheet, or work with a slide deck as part of a larger task. The same applies to Slack and Notion. Codex can participate in the planning and coordination work that happens before code gets written, not just the code itself.

That distinction matters. A lot of real development work isn’t writing code. It’s reading a brief, checking a design file, updating a ticket, writing a summary. Codex has handled the coding part well for a while now. Plugins extend that reach into the surrounding workflow.

Where Plugins Are Available

Surface          Plugin Support
---------------  --------------
Codex App        Yes
Codex CLI        Yes
IDE Extensions   Yes

The IDE extension support is worth paying attention to. Codex is natively integrated into JetBrains IDEs as of v2025.3 with the AI Assistant plugin. Authentication works through a JetBrains AI subscription, a ChatGPT account, or a standard OpenAI API key, and you can control how much autonomy Codex has over things like network access and command execution. Plugins extend that further, bringing external tool access directly into the IDE without switching contexts.

The Figma integration stands out on its own. It connects Codex to your actual design files so you can move between implementation and Figma designs without leaving the agent. For teams where the design-to-code handoff is a persistent friction point, having Codex read directly from Figma removes a real step from the process.

The Tools Covered at Launch

[Chart: Codex plugin launch integrations coverage]

The chart above reflects how broadly each tool touches a typical development workflow. Google Drive and Figma sit at the top because they tend to be involved across the most stages, from specs and briefs through to design handoffs. Slack and Notion cover coordination and documentation. Gmail rounds out the set for communication.

Building Your Own Plugins

OpenAI is also opening up custom plugin development. Developers can build and share plugins locally, with documentation at developers.openai.com/codex/plugins#use-plugins-locally. There’s already at least one open-source example in the community, a plugin called codex-last-prompt-footer, which gives a sense of what people are starting to build.
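As a rough sketch of what a local plugin might declare, here is a hypothetical manifest for the community example mentioned above. The file name, field names, and values are assumptions for illustration; the documented format lives at the link above.

```json
{
  "name": "codex-last-prompt-footer",
  "version": "0.1.0",
  "description": "Appends the last prompt to Codex output as a footer",
  "skills": ["append-footer"]
}
```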

This is the right move. The most useful plugins for any given team are going to be specific to that team’s stack. If your workflow runs through an internal ticketing system, a proprietary data source, or a custom deployment pipeline, a general plugin library won’t cover that. Letting teams build and share their own integrations extends the value well beyond what OpenAI can ship directly.

The release notes on the infrastructure side also show active work on the setup experience. Recent app-server updates included smoother plugin setup flows, ChatGPT device-code sign-in, and better real-time handling. The rough edges that tend to appear in early rollouts are being addressed quickly, which is good to see.

The Underlying Model

Codex runs on GPT-5.3-Codex across all surfaces, including the app, CLI, and IDE extensions. That model is tuned for agentic coding workflows, handles multi-turn conversations well, and is built for long-horizon tasks. It also runs 25% faster than GPT-5.2-Codex while using 48% fewer tokens for equivalent results, which matters when you’re running agents over extended tasks with multiple tool calls involved. For a broader look at how GPT-5.3-Codex compares to other frontier models right now, the earlier coverage on OpenAI’s model roadmap is worth reading alongside this.
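Those two numbers compound over long agentic runs. A back-of-envelope calculation, reading "25% faster" loosely as 25% less wall-clock time (one possible interpretation) and using made-up baseline figures:

```python
# Hypothetical baseline for one long-horizon task on GPT-5.2-Codex:
old_latency_s = 120.0   # assumed wall-clock time, seconds
old_tokens = 50_000     # assumed tokens consumed

# Apply the stated GPT-5.3-Codex improvements:
new_latency_s = old_latency_s * (1 - 0.25)  # 25% less wall time
new_tokens = old_tokens * (1 - 0.48)        # 48% fewer tokens

print(f"{new_latency_s:.0f} s, {new_tokens:.0f} tokens")  # → 90 s, 26000 tokens
```

With many tool calls per task, the token reduction in particular scales directly into lower cost per agent run.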

For context on how coding agents are being evaluated more rigorously, CursorBench-3 covers the kinds of real developer tasks that benchmarks are starting to measure. It’s a useful frame for thinking about what plugin-enabled Codex workflows will eventually be judged against.

What Comes Next

OpenAI has said more plugins are coming for additional use cases, with the skills library growing alongside them. The current set already covers most of the major productivity and collaboration tools that developers use daily, so the next wave will likely go deeper into more specialized or team-specific integrations.

The bigger question is how fast custom plugin adoption picks up and whether teams start sharing plugins openly the way they share other tooling. The codex-last-prompt-footer example is a small signal that the community is already moving in that direction.

Plugins don’t change what Codex is at its core. It’s still an agentic coding assistant. What they do is remove the wall between the code editor and the rest of the work that surrounds writing code, which is where a lot of time actually goes.


Adam Holter

Founder of Ironwood AI. Writing about AI stuff!