Your GIS, now AI‑supercharged.
ArcGIS Pro AI Toolbox is the only BYOK, open‑source plugin that brings conversational AI, code generation, and prompt‑driven geoprocessing natively into ArcGIS Pro—no cloud hop, no proprietary credits.
- Install in minutes. Prompt, generate, map—directly inside ArcGIS Pro.
- Preconfigured with OpenRouter out of the box, while still working with OpenAI, Azure, Claude, DeepSeek, local LLMs, and more.
- BYOK: Bring your own API key, keep your data private.
- No cloud detour, no extra Esri credits, no code required.
- Add AI Generated Field: Create rich text from attributes using AI.
- Get Map Info: Extract map context to JSON for smarter prompts.
- Generate Python Code: ArcPy snippets tuned to your map.
- Interpret Map: Summarize the active view with an AI-driven interpretation that combines a screenshot and spatial context.
- Create AI Feature Layer: Describe data, get a layer.
- Convert Text to Numeric: Standardize messy columns fast.
There are two ways to get started with the ArcGIS Pro AI Toolbox:
- The Simple Way (Recommended):
  - Download the toolbox directly from the arcgispro_ai website.
  - Set up the required environment variables for your chosen AI provider(s).
  - Add the downloaded `.pyt` file to ArcGIS Pro and start using the tools immediately.
- The Python Way (For Advanced Users):
  - Install the package via pip: `pip install arcgispro_ai`
  - Set up the required environment variables for your chosen AI provider(s).
  - Use the tools programmatically or within ArcGIS Pro by referencing the installed package. This requires importing the toolbox from a path like `C:\Users\<username>\AppData\Local\Programs\Python\Python<version>\Lib\site-packages\arcgispro_ai\toolboxes` (see the sketch below).
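If you prefer loading the toolbox from code rather than through the Catalog pane, a minimal sketch looks like the following. The site-packages path is illustrative; fill in `<username>` and `<version>` for your machine.

```python
import arcpy

# Illustrative only: adjust the path to wherever pip installed the package.
toolbox_path = (
    r"C:\Users\<username>\AppData\Local\Programs\Python"
    r"\Python<version>\Lib\site-packages\arcgispro_ai\toolboxes"
    r"\arcgispro_ai_tools.pyt"
)

# Load the toolbox into the current ArcGIS Pro session; its tools then show
# up in the Geoprocessing pane and are callable from the Python window.
arcpy.ImportToolbox(toolbox_path)
```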
Set up the required environment variables for your chosen AI provider(s):

```cmd
setx OPENROUTER_API_KEY "your-key-here"
setx OPENAI_API_KEY "your-key-here"
setx AZURE_OPENAI_API_KEY "your-key-here"
setx ANTHROPIC_API_KEY "your-key-here"
setx DEEPSEEK_API_KEY "your-key-here"
```

OpenRouter is the default provider for every tool, so configuring `OPENROUTER_API_KEY` is enough to start running prompts immediately. Set the other environment variables only if you plan to switch to those providers.
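Because `setx` only affects processes started after it runs, restart ArcGIS Pro and then confirm the keys are visible. A quick check from the Python window:

```python
import os

# Print which provider keys the current process can actually see.
provider_keys = [
    "OPENROUTER_API_KEY",
    "OPENAI_API_KEY",
    "AZURE_OPENAI_API_KEY",
    "ANTHROPIC_API_KEY",
    "DEEPSEEK_API_KEY",
]
for name in provider_keys:
    status = "set" if os.environ.get(name) else "missing"
    print(f"{name}: {status}")
```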
- Deploy a compatible LLM server that implements the OpenAI chat completions API. How you host that server is up to you.
- Make sure to configure the endpoint URL to `http://localhost:8000`, or you'll have to override it every time you want to run a tool.
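To verify your local server speaks the OpenAI chat completions dialect before pointing the toolbox at it, a quick smoke test might look like the sketch below. It assumes the common `/v1/chat/completions` path and the `requests` package; the model name is whatever your server exposes.

```python
import requests

# Minimal OpenAI-style chat completions request against a local server.
response = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "local-model",  # placeholder: use your server's model name
        "messages": [{"role": "user", "content": "Say hello"}],
    },
    timeout=60,
)
print(response.json()["choices"][0]["message"]["content"])
```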
- Leave the Source dropdown at OpenRouter (the default) or switch to another provider if you've configured its API key
- Configure any provider-specific settings (model, endpoint, etc.)
- Enter your prompt or query
- Execute the tool
Each tool starts on OpenRouter, so you can run with zero extra configuration. When you choose another provider, make sure its API key and settings are in place before executing the tool.
- OpenRouter (default): Unified API for multiple models including OpenAI, Gemini, Claude, Llama, and more (requires `OPENROUTER_API_KEY`)
- OpenAI: GPT-4 and more (requires `OPENAI_API_KEY`)
- Azure OpenAI: Microsoft-hosted (requires `AZURE_OPENAI_API_KEY`)
- Claude (Anthropic): requires `ANTHROPIC_API_KEY`
- DeepSeek: requires `DEEPSEEK_API_KEY`
- Local LLM: No API key needed, OpenAI-compatible API
- Wolfram Alpha: For math/computation (requires `WOLFRAM_ALPHA_API_KEY`)
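If you use the pip-installed package, providers can also be selected through the client factory described under Key modules below. A hedged sketch; the exact source strings accepted by `get_client` may differ from the dropdown labels shown here:

```python
import os
from arcgispro_ai.core.api_clients import get_client

# Illustrative provider-to-key mapping mirroring the list above.
PROVIDER_ENV_VARS = {
    "OpenRouter": "OPENROUTER_API_KEY",
    "OpenAI": "OPENAI_API_KEY",
    "Azure OpenAI": "AZURE_OPENAI_API_KEY",
    "Claude": "ANTHROPIC_API_KEY",
    "DeepSeek": "DEEPSEEK_API_KEY",
    "Local LLM": None,  # no key required
}

source = "OpenRouter"
env_var = PROVIDER_ENV_VARS[source]
api_key = os.environ.get(env_var, "") if env_var else ""
client = get_client(source, api_key)  # returns the matching provider client
```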
OpenRouter provides a single API that gives you access to dozens of AI models from various providers:
- OpenAI models (GPT-4, GPT-3.5, etc.)
- Google models (Gemini 2.0 Flash, etc.)
- Anthropic models (Claude 3.5 Sonnet, etc.)
- Meta models (Llama variants)
- DeepSeek models
- And many more
To use OpenRouter:
- Sign up for an API key at openrouter.ai
- Set your API key: `setx OPENROUTER_API_KEY "your-openrouter-key"`
- OpenRouter is already selected in every tool—leave the Source dropdown on OpenRouter (or switch back to it) to start using it immediately
- The Model dropdown shows a curated set of OpenRouter options; if you need something else, you can type any OpenRouter model ID manually. Refer to the full catalog at https://openrouter.ai/models?fmt=table
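To confirm your OpenRouter key works independently of ArcGIS Pro, a minimal smoke test looks like the sketch below. The model ID is just an example (any ID from the catalog works), and the `requests` package is assumed to be available.

```python
import os
import requests

# Standalone OpenRouter chat completions call using the key set above.
response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "openai/gpt-4o",  # example model ID from the catalog
        "messages": [{"role": "user", "content": "One sentence about GIS."}],
    },
    timeout=60,
)
print(response.json()["choices"][0]["message"]["content"])
```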
This project is organized for both maintainability and ease of distribution:
- Modular Source Structure:
  - The codebase is organized into multiple Python modules and packages (see `arcgispro_ai/` and `arcgispro_ai/toolboxes/`).
  - This modular design makes the code easy to maintain, test, and extend.
  - Utility functions, API clients, and tool logic are separated for clarity and reusability.
- Monolithic `.pyt` for Distribution:
  - For end users, a single-file, monolithic Python Toolbox (`.pyt`) is generated (`arcgispro_ai.pyt`).
  - This file contains all required code inlined—no dependencies on the rest of the repo structure.
  - Users can simply download the `.pyt` and add it to ArcGIS Pro, with no need to install Python packages or clone the repo.
  - The monolithic `.pyt` is auto-generated by the `build_monolithic_pyt.py` script, which inlines all code and strips out internal imports.
  - The version of the `.pyt` always matches the package version (from `setup.py`), ensuring consistency with PyPI releases.
- Release Management:
  - The `release.sh` script automates the version management and release process.
  - It automatically increments the patch version in `setup.py` (sketched below), commits the change, and tags the release.
  - The script creates a Git tag with the new version number and pushes changes to the repository.
  - After pushing, it attempts to open the GitHub release page to facilitate release note creation.
  - This ensures consistent versioning between the codebase, PyPI releases, and GitHub tags.
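For orientation, the patch-bump step looks roughly like this in Python terms. This is a hypothetical sketch only: the actual `release.sh` is a shell script, and the `version="X.Y.Z"` pattern in `setup.py` is an assumption.

```python
import re
from pathlib import Path

setup_py = Path("setup.py")
text = setup_py.read_text()

def bump_patch(match: re.Match) -> str:
    # Increment the last component of a X.Y.Z version string.
    major, minor, patch = match.group(1).split(".")
    return f'version="{major}.{minor}.{int(patch) + 1}"'

new_text = re.sub(r'version="(\d+\.\d+\.\d+)"', bump_patch, text, count=1)
setup_py.write_text(new_text)
```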
Summary:
- Developers benefit from a clean, modular codebase.
- Users benefit from a simple, single-file download for ArcGIS Pro.
- Releases are managed systematically with automated versioning.
See `build_monolithic_pyt.py` for details on how the monolithic `.pyt` is built.
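Conceptually, the build concatenates the package modules ahead of the toolbox code and drops the package-internal imports (since that code is now inlined). A simplified sketch—the module list and regex are illustrative, not the script's actual logic:

```python
import re
from pathlib import Path

# Modules to inline, in dependency order (illustrative list).
MODULES = [
    "arcgispro_ai/toolboxes/arcgispro_ai/arcgispro_ai_utils.py",
    "arcgispro_ai/core/api_clients.py",
    "arcgispro_ai/toolboxes/arcgispro_ai_tools.pyt",
]
# Matches imports that reference the package itself.
INTERNAL_IMPORT = re.compile(r"^\s*(from|import)\s+arcgispro_ai\b.*$", re.MULTILINE)

parts = []
for module in MODULES:
    source = Path(module).read_text()
    # Strip package-internal imports; the code they pulled in is inlined.
    parts.append(INTERNAL_IMPORT.sub("", source))

Path("arcgispro_ai.pyt").write_text("\n\n".join(parts))
```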
The codebase is organized to keep Esri toolbox glue minimal while centralizing reusable logic:
```
arcgispro_ai/
├─ toolboxes/
│  ├─ arcgispro_ai_tools.pyt   # Esri Python Toolbox (UI, params, execute orchestration)
│  └─ arcgispro_ai/
│     └─ arcgispro_ai_utils.py # Shared utilities and helpers used by tools
├─ core/
│  └─ api_clients.py           # Provider abstraction + client factory (`get_client`)
├─ docs/                       # Static docs site content
├─ build_monolithic_pyt.py     # Builds single-file .pyt for distribution
├─ arcgispro_ai.pyt            # Generated monolithic toolbox (checked in for convenience)
└─ README.md
```
- Toolbox file focuses on: parameter definitions, validation messages, and high-level `execute()` orchestration.
- Implementation details live in utilities and core modules to keep the `.pyt` readable and maintainable.
- Examples of logic moved out of the toolbox into utilities (a skeleton illustrating the split follows this list):
- Map context capture and screenshot handling
- Interpretation instructions (system prompt) for the AI
- Markdown → HTML rendering for rich, theme-friendly output
- Feature counting, API key resolution, model parameter wiring, and vision capability checks
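To make the split concrete, here is a hypothetical skeleton of how a tool in the `.pyt` stays thin while the logic lives in the utilities module. The tool name, parameter, helper function, and import path are made up for illustration and do not match the real tools.

```python
import arcpy


class Toolbox:
    def __init__(self):
        self.label = "Example AI Toolbox"
        self.alias = "exampleai"
        self.tools = [ExampleTool]


class ExampleTool:
    def __init__(self):
        self.label = "Example Tool"
        self.description = "Thin wrapper; real logic lives in the utilities module."

    def getParameterInfo(self):
        prompt = arcpy.Parameter(
            displayName="Prompt",
            name="prompt",
            datatype="GPString",
            parameterType="Required",
            direction="Input",
        )
        return [prompt]

    def execute(self, parameters, messages):
        # Delegate to shared utilities (hypothetical helper and import path).
        from arcgispro_ai.toolboxes.arcgispro_ai import arcgispro_ai_utils as utils
        result = utils.run_example(parameters[0].valueAsText)  # hypothetical
        messages.addMessage(result)
```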
Key modules:
- Utilities: `arcgispro_ai/toolboxes/arcgispro_ai/arcgispro_ai_utils.py`
  - `get_interpretation_instructions()` – central system prompt for Interpret Map
  - `render_markdown_to_html()` – tasteful Markdown renderer (headings, lists, tables, code, links) with minimal, theme-inheriting styles
  - `get_feature_count_value()`, `resolve_api_key()`, `update_model_parameters()`, `model_supports_images()`
  - Map and layer helpers used across tools
- Provider abstraction: `arcgispro_ai/core/api_clients.py`
  - `get_client(source, api_key, **kwargs)` – returns the right provider client (OpenRouter, OpenAI, Azure OpenAI, Anthropic/Claude, DeepSeek, local)
- Environment variables: configure the provider(s) you use
  - `OPENROUTER_API_KEY`, `OPENAI_API_KEY`, `AZURE_OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `DEEPSEEK_API_KEY`
- Toolbox UI wiring:
  - `update_model_parameters()` enables/disables fields (e.g., `endpoint`, `deployment`) and dynamically populates `model` options (OpenRouter catalog supported).
  - `resolve_api_key()` maps the selected provider to the appropriate environment variable and surfaces helpful errors.
  - `model_supports_images()` gates use of map screenshots only when a provider/model can accept images.
- Vision usage:
  - If enabled and supported, Interpret Map sends both the textual context and a low-detail screenshot to multimodal models (see the payload sketch below).
  - Interpret Map responses can include tasteful Markdown (headings, lists, tables, links, fenced code).
  - Output is rendered to HTML via `render_markdown_to_html()` with minimal CSS that inherits ArcGIS Pro’s theme—no hardcoded text colors—so it looks good in both light and dark modes.
  - A screenshot preview (if captured and supported) is shown with neutral borders/shadows that work across themes.
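For reference, the OpenAI-style multimodal message format that a low-detail screenshot plus textual context maps onto looks roughly like the sketch below. This is illustrative, not the tool's exact payload; the file path, system prompt, and context text are placeholders.

```python
import base64

# Encode a saved screenshot as a data URL (placeholder file path).
with open("map_screenshot.png", "rb") as f:
    screenshot_b64 = base64.b64encode(f.read()).decode("ascii")

messages = [
    {"role": "system", "content": "Interpret the active map for the user."},
    {
        "role": "user",
        "content": [
            {"type": "text", "text": "Map context JSON and layer summary go here."},
            {
                "type": "image_url",
                "image_url": {
                    "url": f"data:image/png;base64,{screenshot_b64}",
                    "detail": "low",  # low-detail image keeps token usage down
                },
            },
        ],
    },
]
```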
Open an issue or create a branch for your feature or bug fix, and submit a pull request.
