---
title: AWS Bedrock Chat Model node documentation
description: Learn how to use the AWS Bedrock Chat Model node in n8n. Follow technical documentation to integrate AWS Bedrock Chat Model node into your workflows.
contentType: [integration, reference]
---

# AWS Bedrock Chat Model node

The AWS Bedrock Chat Model node allows you to use LLM models hosted on the AWS Bedrock platform.

On this page, you'll find the node parameters for the AWS Bedrock Chat Model node, and links to more resources.

```{note} Credentials
You can find authentication information for this node [here](/08-0-0-Workflow/integrations/builtin/credentials/aws.md).
```

```{include} ../../../../../_snippets/integrations/builtin/cluster-nodes/sub-node-expression-resolution.md
```

## Node parameters

* **Model**: Select the model that generates the completion. Learn more about available models in the [Amazon Bedrock model documentation](https://docs.aws.amazon.com/bedrock/latest/userguide/models-supported.html).

## Node options

* **Maximum Number of Tokens**: Enter the maximum number of tokens to generate, which limits the completion length.
* **Sampling Temperature**: Use this option to control the randomness of the sampling process. A higher temperature creates more diverse sampling, but increases the risk of hallucinations.

## Proxy limitations

This node doesn't support the [`NO_PROXY` environment variable](/08-0-0-Workflow/hosting/configuration/environment-variables/deployment.md).

## Templates and examples

## Related resources

Refer to [LangChain's AWS Bedrock Chat Model documentation](https://js.langchain.com/docs/integrations/chat/bedrock/) for more information about the service.

```{include} ../../../../../_snippets/integrations/builtin/cluster-nodes/langchain-overview-link.md
```
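To build intuition for how the Sampling Temperature option affects output, here is a minimal, conceptual sketch of temperature-scaled sampling. This is illustration only, not code from the node or the Bedrock API; the `sample_with_temperature` helper and its example logits are hypothetical.

```python
import math
import random

def sample_with_temperature(logits, temperature):
    """Sample a token index from logits scaled by temperature.

    Hypothetical helper for illustration: lower temperature sharpens
    the distribution (more deterministic output); higher temperature
    flattens it (more diverse output, more risk of drift).
    """
    scaled = [l / temperature for l in logits]
    # Softmax with max-subtraction for numerical stability.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(range(len(probs)), weights=probs, k=1)[0]

# With a very low temperature, the highest-scoring token dominates;
# with a high temperature, lower-scoring tokens are sampled more often.
low_temp_pick = sample_with_temperature([5.0, 1.0, 0.0], 0.01)
```

At `temperature=0.01` the first token is chosen essentially every time, while at a temperature well above 1 the three tokens are drawn with nearly equal probability.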