This guide explains how to configure API keys for Amazon Bedrock to enable Discourse AI features that require third-party LLM keys.
Required user level: Administrator
In this example, we are using Amazon Bedrock to generate the keys.
Note: You will need a paid plan and configured API keys.
Platforms change frequently, so this guide may not reflect the current process exactly.
Obtain API keys
Configuring keys on Amazon Bedrock is somewhat more complicated than with most other providers. You will likely need to be familiar with concepts such as IAM roles and policies. Instructions on how to obtain keys are available at:
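As a rough starting point, the IAM user whose access keys you generate needs permission to invoke Bedrock models. A minimal policy sketch (an assumption about your setup; scope the `Resource` down to specific model ARNs in production):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}
```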
What Bedrock models does Discourse AI support?
Discourse AI supports all Anthropic models (Haiku 3.5 / Sonnet 3.5 / Opus 3.5) and all Nova language models (Micro / Lite / Pro).
These models support images, video, and tool calling (XML and native).
As of December 2024 here are the key model ids:
Nova:
- amazon.nova-pro-v1:0
- amazon.nova-lite-v1:0
- amazon.nova-micro-v1:0
Claude:
- anthropic.claude-3-5-haiku-20241022-v1:0
- anthropic.claude-3-5-sonnet-20241022-v2:0
Keep in mind that not all models are available in all regions; check your Bedrock configuration page.
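If you are scripting against these ids, note they follow a `vendor.model-name:revision` shape. A small stdlib-only sketch (the helper name is illustrative, not part of Discourse or the AWS SDK):

```python
def parse_model_id(model_id: str) -> dict:
    """Split a Bedrock model id like 'amazon.nova-pro-v1:0' into its parts."""
    base, _, revision = model_id.partition(":")
    vendor, _, model = base.partition(".")
    return {"vendor": vendor, "model": model, "revision": revision}

# Example: parse_model_id("anthropic.claude-3-5-haiku-20241022-v1:0")
# yields vendor "anthropic", model "claude-3-5-haiku-20241022-v1", revision "0".
```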
Using API keys for Discourse AI
- Go to Admin → Plugins → AI → LLMs tab
- Click the Set up button on “Manual configuration”
- Enter all model settings:
Note: you will need an API key, a Bedrock access key, and a region.
For the tokenizer, you can use OpenAiTokenizer for Nova models and AnthropicTokenizer for Claude-based models.
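That tokenizer choice follows the model family, which can be sketched as a tiny helper (illustrative only; the returned names are the tokenizer options you pick in the Discourse LLM form):

```python
def tokenizer_for(model_id: str) -> str:
    """Pick the tokenizer setting based on the Bedrock model id prefix."""
    if model_id.startswith("amazon.nova"):
        return "OpenAiTokenizer"
    if model_id.startswith("anthropic.claude"):
        return "AnthropicTokenizer"
    raise ValueError(f"no documented tokenizer for: {model_id}")
```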
Should I disable native tool support or not?
Discourse ships with both native and XML tool-based configurations. In some cases XML tools outperform the native tool implementation, so experiment to find which configuration works best for you.