# Getting Started
Positron Assistant supports the following language model providers:
| Provider | Chat | Code Completions | Authentication |
|---|---|---|---|
| Anthropic | ✓ | | API Key |
| GitHub Copilot (Preview) | ✓ | ✓ | GitHub OAuth |
| Amazon Bedrock (Preview) | ✓ | | AWS CLI or Posit Workbench Managed Credentials |
| Snowflake Cortex (Preview) | ✓ | | API Key or Posit Workbench Managed Credentials |
| OpenAI (Preview) | ✓ | | API Key |
| Custom Provider (Experimental) | ✓ | | API Key |
We’re actively expanding provider support and welcome your feedback and issue reports!
## Step 1: Enable Positron Assistant
- Enable the setting `positron.assistant.enable` (see the snippet below).
- Restart Positron or run the Developer: Reload Window command in the Command Palette.
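If you prefer to edit your settings file directly, the equivalent entry is a single boolean. Here is a minimal snippet, shown as a fragment like the other settings.json examples in this guide; open the file with the Preferences: Open User Settings (JSON) command:

settings.json
```json
// Turn on Positron Assistant; reload the window for the change to take effect
"positron.assistant.enable": true
```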
## Step 2: Configure language model providers
### Anthropic

Positron Assistant requires a Claude Console account with API access. See Anthropic’s Claude API and Console documentation for more information.
Positron Assistant does not support logging into Anthropic with the Claude Pro Plan, Max Plan, or other Claude subscription plans.
#### How to get an Anthropic API key
To use Anthropic’s Claude models in Positron Assistant, you need to bring your own API key (BYOK). To obtain an API key from Anthropic:
- Log in to or create an account for Anthropic’s Claude Console.
- Navigate to the API keys management page.
- Click the Create Key button.
- Fill out any required information and click Add to generate your API key.
- Copy and save the API key to a password manager or another secure location.
#### Add Anthropic as a language model provider
- Run the command Positron Assistant: Configure Language Model Providers.
- Select Anthropic as the model provider.
- Paste your Claude Console API key into the input field and click Sign in.
Alternatively, set the `ANTHROPIC_API_KEY` environment variable to authenticate with Anthropic in Positron Assistant.
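For example, in a shell startup file (the value below is a placeholder); Positron picks up the variable when it is launched from an environment where it is set:

```bash
# Make the key available to Positron at launch; replace the placeholder with your key
export ANTHROPIC_API_KEY="<your-claude-console-api-key>"
```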
### GitHub Copilot

Positron Assistant requires a GitHub account with Copilot enabled.
#### How to get GitHub Copilot access
GitHub Copilot is a proprietary tool from GitHub. To use it, you need a GitHub Copilot subscription on your personal GitHub account, or a seat assigned to you by an organization with a GitHub Copilot for Business subscription.
Students and faculty can use GitHub Copilot for free as part of the GitHub Education program. For more information, see the GitHub Education page.
#### Add GitHub Copilot as a language model provider
- Run the command Positron Assistant: Configure Language Model Providers.
- Select GitHub Copilot as the model provider.
- Click the Sign in button to initiate GitHub’s OAuth authentication flow.
- Complete the authentication flow in your browser, and return to Positron when finished.
If you are using Positron with a remote SSH session, you will need to authenticate to GitHub on the remote server as well. Follow along on GitHub as we make improvements in this area.
### Amazon Bedrock

To use Amazon Bedrock in Positron Desktop, Positron Assistant requires an AWS account with Amazon Bedrock access, and you must be signed in using the AWS CLI.
#### Login with the AWS CLI
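The details depend on how your AWS credentials are managed. The sketch below is only one common pattern, assuming an IAM Identity Center (SSO) setup and a hypothetical profile name; any approach that leaves valid credentials available to the AWS CLI should also work:

```bash
# One-time setup of an SSO profile (follow the interactive prompts)
aws configure sso

# Log in whenever your session expires; the profile name is a placeholder
aws sso login --profile my-sso-profile
```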
#### Enable Amazon Bedrock as a provider
Amazon Bedrock provider support is currently in preview and must be manually enabled.
Run the command Preferences: Open User Settings (JSON) and add the following to your settings file:
settings.json
"positron.assistant.enabledProviders": [
"amazon-bedrock"
](Optional) Configure AWS region and profile
By default, Positron Assistant uses the default AWS CLI profile and the us-east-1 region to connect to Amazon Bedrock.
To specify a different AWS CLI profile or region, update the `positron.assistant.providerVariables.bedrock` setting.
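As a rough sketch of what that might look like (the profile and region field names are assumptions for illustration; check the setting's description in the Settings editor for the exact schema):

settings.json
```json
"positron.assistant.providerVariables.bedrock": {
  // Hypothetical field names, shown for illustration only
  "profile": "my-sso-profile",
  "region": "us-west-2"
}
```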
#### Add Amazon Bedrock as a language model provider
- Run the command Positron Assistant: Configure Language Model Providers.
- Select Amazon Bedrock as the model provider.
- Click Sign in so Positron Assistant can verify your AWS CLI authentication.
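If the sign-in check fails, you can confirm from a terminal that the AWS CLI sees valid credentials:

```bash
# Prints your account ID and ARN if your CLI credentials are valid
aws sts get-caller-identity

# Or, if you use a named profile (placeholder name)
aws sts get-caller-identity --profile my-sso-profile
```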
### Snowflake Cortex

To use Snowflake Cortex in Positron Desktop, Positron Assistant requires a Snowflake account with Cortex access, as well as a Snowflake account identifier and programmatic access token (PAT).
#### Enable Snowflake Cortex as a provider
Snowflake Cortex provider support is currently in preview and must be manually enabled.
Run the command Preferences: Open User Settings (JSON) and add the following to your settings file:
settings.json
"positron.assistant.enabledProviders": [
"snowflake-cortex"
]Add Snowflake Cortex as a language model provider
- Add your `SNOWFLAKE_ACCOUNTID` to the `positron.assistant.providerVariables.snowflake` setting (see the sketch after these steps).
- Run the command Positron Assistant: Configure Language Model Providers.
- Select Snowflake Cortex as the model provider.
- Paste your Snowflake Cortex PAT into the API key field, then click Sign in.
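As a rough sketch of the account setting referenced in the first step (the field name is carried over from the text above and may not match the exact schema; confirm it in the Settings editor):

settings.json
```json
"positron.assistant.providerVariables.snowflake": {
  // Replace the placeholder with your Snowflake account identifier
  "SNOWFLAKE_ACCOUNTID": "<your-account-identifier>"
}
```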
### OpenAI

#### Enable OpenAI as a provider
OpenAI provider support is currently in preview and must be manually enabled.
Run the command Preferences: Open User Settings (JSON) and add the following to your settings file:
settings.json
"positron.assistant.enabledProviders": [
"openai-api"
]Add OpenAI as a language model provider
- Run the command Positron Assistant: Configure Language Model Providers.
- Select OpenAI as the model provider.
- Paste your OpenAI API key into the input field and click Sign in.
You can also use the OpenAI provider to connect to any service that implements the OpenAI Responses API by replacing the base URL with your compatible custom endpoint.
### Custom Provider

Positron Assistant’s custom provider support is intended for any OpenAI-compatible API that serves chat through the `/v1/chat/completions` endpoint.
At this time, we don’t recommend using a local model as a custom provider. Read more about why local models are not there (yet) on Posit’s blog.
#### Enable Custom Provider as a provider
Custom provider support is currently experimental and must be manually enabled.
Run the command Preferences: Open User Settings (JSON) and add the following to your settings file:
settings.json
"positron.assistant.enabledProviders": [
"openai-compatible"
]Add Custom Provider as a language model provider
- Run the command Positron Assistant: Configure Language Model Providers.
- Select Custom Provider as the model provider.
- Enter your API key and base URL into the input fields and click Sign in.
Some OpenAI-compatible providers may not implement the `/models` endpoint, which Positron Assistant uses to list available models. If this is the case for your provider, you can manually configure a model listing using the `positron.assistant.models.custom` setting.
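As a minimal sketch of such a listing (the entry fields are assumptions for illustration; consult the setting's description in the Settings editor for the schema your Positron version expects):

settings.json
```json
"positron.assistant.models.custom": [
  {
    // Hypothetical fields: the identifier your endpoint expects and a display name
    "id": "my-hosted-model",
    "name": "My Hosted Model"
  }
]
```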
## Step 3: Use Positron Assistant!
Once you’ve authenticated with at least one language model provider, you’re all set to use Positron Assistant.
Click on the chat robot icon in the sidebar, or run the command Chat: Open Chat in the Command Palette to open the chat.

Chat with Assistant by typing your question or request in the chat input box at the bottom of the chat pane, then pressing Enter or clicking the send button.
To learn more about Positron Assistant’s core features, check out the following guides: