Getting Started

Configure and customize Positron Assistant by connecting it to your preferred AI model provider, such as Anthropic or GitHub Copilot. Choose the provider that best fits your workflow and security needs.

Positron Assistant supports the following language model providers:

Provider                        Chat  Code Completions  Authentication
Anthropic                                               API Key
GitHub Copilot (Preview)                                GitHub OAuth
Amazon Bedrock (Preview)                                AWS CLI or Posit Workbench Managed Credentials
Snowflake Cortex (Preview)                              API Key or Posit Workbench Managed Credentials
OpenAI (Preview)                                        API Key
Custom Provider (Experimental)                          API Key

We’re actively expanding provider support and welcome your feedback and issue reports!

Step 1: Enable Positron Assistant

  1. Enable the setting positron.assistant.enable.
  2. Restart Positron or run the Developer: Reload Window command in the Command Palette.
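
If you prefer to edit your configuration directly, the same toggle can be set in your settings file. A minimal sketch, assuming positron.assistant.enable is a simple boolean setting:

settings.json
"positron.assistant.enable": true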

Step 2: Configure language model providers

Important

To use Anthropic models, Positron Assistant requires a Claude Console account with API access. See Anthropic’s Claude API and Console documentation for more information.

Positron Assistant does not support logging into Anthropic with the Claude Pro Plan, Max Plan, or other Claude subscription plans.

How to get an Anthropic API key

To use Anthropic’s Claude models in Positron Assistant, you need to bring your own API key (BYOK). To obtain an API key from Anthropic:

  1. Log in to or create an account for Anthropic’s Claude Console.
  2. Navigate to the API keys management page.
  3. Click the Create Key button.
  4. Fill out any required information and click Add to generate your API key.
  5. Copy and save the API key to a password manager or another secure location.

Add Anthropic as a language model provider

  1. Run the command Positron Assistant: Configure Language Model Providers.

  2. Select Anthropic as the model provider.

    Screenshot: Anthropic selected in the provider modal
  3. Paste your Claude Console API key into the input field and click Sign in.

Tip

Alternatively, set the ANTHROPIC_API_KEY environment variable to authenticate with Anthropic in Positron Assistant.

Important

To use GitHub Copilot, Positron Assistant requires a GitHub account with Copilot enabled.

How to get GitHub Copilot access

GitHub Copilot is a proprietary tool from GitHub. To use it, you need either a GitHub Copilot subscription on your personal GitHub account or a seat assigned by an organization with a GitHub Copilot for Business subscription.

Students and faculty can use GitHub Copilot for free as part of the GitHub Education program. For more information, see the GitHub Education page.

Add GitHub Copilot as a language model provider

  1. Run the command Positron Assistant: Configure Language Model Providers.

  2. Select GitHub Copilot as the model provider.

    Screenshot: GitHub Copilot selected in the provider modal
  3. Click the Sign in button to initiate GitHub’s OAuth authentication flow.

    • Complete the authentication flow in your browser, and return to Positron when finished.
Tip

If you are using Positron with a remote SSH session, you will need to authenticate to GitHub on the remote server as well. Follow along on GitHub as we make improvements in this area.

Important

To use Amazon Bedrock in Positron Desktop, Positron Assistant requires an AWS account with Amazon Bedrock access, and you must be signed in using the AWS CLI.

Log in with the AWS CLI

  1. Download and install the AWS CLI.
  2. Configure your AWS credentials for the AWS CLI.
  3. Log in with the AWS CLI.

Enable Amazon Bedrock as a provider

Amazon Bedrock provider support is currently in preview and must be manually enabled.

Run the command Preferences: Open User Settings (JSON) and add the following to your settings file:

settings.json
"positron.assistant.enabledProviders": [
    "amazon-bedrock"
]

(Optional) Configure AWS region and profile

By default, Positron Assistant uses the default AWS CLI profile and the us-east-1 region to connect to Amazon Bedrock.

To specify a different AWS CLI profile or region, update the positron.assistant.providerVariables.bedrock setting.
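
As an illustration, such an override might look like the sketch below; the profile and region key names inside this setting are assumptions rather than a documented schema, so check the setting’s description in Positron for the exact fields it expects:

settings.json
"positron.assistant.providerVariables.bedrock": {
    // hypothetical keys, shown for illustration only
    "profile": "my-aws-profile",
    "region": "us-west-2"
}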

Add Amazon Bedrock as a language model provider

  1. Run the command Positron Assistant: Configure Language Model Providers.

  2. Select Amazon Bedrock as the model provider.

    Screenshot: Amazon Bedrock selected in the provider modal
  3. Click Sign in so Positron Assistant can verify your AWS CLI authentication.

Important

To use Snowflake Cortex in Positron Desktop, Positron Assistant requires a Snowflake account with Cortex access, as well as a Snowflake account identifier and programmatic access token (PAT).

Enable Snowflake Cortex as a provider

Snowflake Cortex provider support is currently in preview and must be manually enabled.

Run the command Preferences: Open User Settings (JSON) and add the following to your settings file:

settings.json
"positron.assistant.enabledProviders": [
    "snowflake-cortex"
]

Add Snowflake Cortex as a language model provider

  1. Add your SNOWFLAKE_ACCOUNT ID to the positron.assistant.providerVariables.snowflake setting (see the sketch after these steps).

  2. Run the command Positron Assistant: Configure Language Model Providers.

  3. Select Snowflake Cortex as the model provider.

    Screenshot: Snowflake Cortex selected in the provider modal
  4. Paste your Snowflake Cortex PAT into the API key field, then click Sign in.
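
For step 1, the account identifier can be supplied through your settings file. A hedged sketch, where the account key name inside positron.assistant.providerVariables.snowflake is an assumption rather than a documented schema:

settings.json
"positron.assistant.providerVariables.snowflake": {
    // hypothetical key, shown for illustration only
    "account": "<your-snowflake-account-identifier>"
}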

Enable OpenAI as a provider

OpenAI provider support is currently in preview and must be manually enabled.

Run the command Preferences: Open User Settings (JSON) and add the following to your settings file:

settings.json
"positron.assistant.enabledProviders": [
    "openai-api"
]

Add OpenAI as a language model provider

  1. Run the command Positron Assistant: Configure Language Model Providers.

  2. Select OpenAI as the model provider.

    Screenshot: OpenAI selected in the provider modal
  3. Paste your OpenAI API key into the input field and click Sign in.

Tip

Use the OpenAI provider to connect to any service that implements the OpenAI Responses API by replacing the base URL with your compatible endpoint.

Important

Positron Assistant’s custom provider support is intended for any OpenAI-compatible API that serves chat through the /v1/chat/completions endpoint.

At this time, we don’t recommend using a local model as a custom provider. Read more about why local models are not there (yet) on Posit’s blog.

Enable Custom Provider as a provider

Custom provider support is currently experimental and must be manually enabled.

Run the command Preferences: Open User Settings (JSON) and add the following to your settings file:

settings.json
"positron.assistant.enabledProviders": [
    "openai-compatible"
]
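
If you have enabled more than one preview or experimental provider, keep all of their IDs in a single array rather than repeating the setting. A sketch combining the provider IDs shown earlier on this page:

settings.json
"positron.assistant.enabledProviders": [
    "amazon-bedrock",
    "snowflake-cortex",
    "openai-api",
    "openai-compatible"
]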

Add Custom Provider as a language model provider

  1. Run the command Positron Assistant: Configure Language Model Providers.

  2. Select Custom Provider as the model provider.

    Screenshot: Custom Provider selected in the provider modal
  3. Enter your API key and base URL into the input fields and click Sign in.

Tip

Some OpenAI-compatible providers may not implement the /models endpoint, which Positron Assistant uses to list available models. If this is the case for your provider, you can manually configure a model listing using the positron.assistant.models.custom setting.
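
The snippet below is only a sketch of what a manual model listing might look like; the entry fields are assumptions rather than a documented schema, so consult the positron.assistant.models.custom setting’s description in Positron for the exact fields it expects:

settings.json
"positron.assistant.models.custom": [
    {
        // hypothetical fields, shown for illustration only
        "name": "My Hosted Model",
        "identifier": "my-model-id"
    }
]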

Step 3: Use Positron Assistant!

Once you’ve authenticated with at least one language model provider, you’re all set to use Positron Assistant.

  1. Click on the chat robot icon in the sidebar, or run the command Chat: Open Chat in the Command Palette to open the chat:

    Screenshot: Assistant Sidebar Icon
  2. Chat with Assistant by typing your question or request in the chat input box at the bottom of the chat pane, then pressing Enter or clicking the send button.

To learn more about Positron Assistant’s core features, check out the following guides: