Project Structure#

This note shows how to create a Next.js route handler that runs on the server side to communicate with Amazon Bedrock.

Let's create a new Next.js project

npx create-next-app@latest .

Then install the Bedrock runtime SDK

npm i @aws-sdk/client-bedrock-runtime
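The SDK needs AWS credentials and a region. Here is a minimal sketch of a .env.local, assuming you authenticate with static keys locally (the variable names match what the later code reads from process.env; the values are placeholders, so adjust to your own credential setup):

AWS_REGION=us-east-1
AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_SESSION_TOKEN=your-session-token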

Here is the project structure. With the Next.js App Router, app/bedrock/route.ts serves the /bedrock endpoint and app/chat/page.tsx renders the chat page.

|--app
   |--bedrock
      |--route.ts
   |--chat
      |--page.tsx
   |--page.tsx

Bedrock Image#

Let's generate images using the Stable Diffusion XL model in Bedrock

import {
  BedrockRuntime,
  InvokeModelCommand,
} from "@aws-sdk/client-bedrock-runtime";
import * as fs from "fs";

const bedrock = new BedrockRuntime({ region: "us-east-1" });

const bedrockGenImage = async ({ prompt }: { prompt: string }) => {
  const body = {
    text_prompts: [{ text: prompt, weight: 1 }],
    seed: 3,
    cfg_scale: 10,
    steps: 50,
    style_preset: "anime",
    height: 1024,
    width: 1024,
  };
  const command = new InvokeModelCommand({
    body: JSON.stringify(body),
    modelId: "stability.stable-diffusion-xl-v1",
    contentType: "application/json",
    // with accept image/png the response body contains the raw PNG bytes
    accept: "image/png",
  });
  try {
    const response = await bedrock.send(command);
    // response.body is a Uint8Array, so it can be written straight to disk
    fs.writeFile("sample.png", response.body, () => {
      console.log("OK");
    });
    console.log(response);
    console.log(response.body);
  } catch (error) {
    console.log(error);
  }
};

const main = async () => {
  await bedrockGenImage({ prompt: "a big whale swimming high in anime style" });
};

main();
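Alternatively, if accept is set to application/json, Stability models return the image base64-encoded inside an artifacts array. Here is a minimal sketch of decoding it, reusing the body from above (this variant is an assumption based on the Stability response format, not part of the original code):

const response = await bedrock.send(
  new InvokeModelCommand({
    body: JSON.stringify(body),
    modelId: "stability.stable-diffusion-xl-v1",
    contentType: "application/json",
    accept: "application/json",
  })
);
// the JSON body carries the image as a base64 string in artifacts[0]
const parsed = JSON.parse(new TextDecoder().decode(response.body));
fs.writeFileSync("sample.png", Buffer.from(parsed.artifacts[0].base64, "base64"));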

Bedrock Chat#

Let's generate text using the claude-v2 model in Bedrock

import {
  BedrockRuntime,
  InvokeModelCommand,
} from "@aws-sdk/client-bedrock-runtime";

const bedrock = new BedrockRuntime({ region: "us-east-1" });
const decoder = new TextDecoder();

const bedrockGenText = async ({ prompt }: { prompt: string }) => {
  const formattedPrompt = `\n\nHuman: ${prompt} \n\nAssistant:`;
  const body = {
    prompt: formattedPrompt,
    max_tokens_to_sample: 2048,
    temperature: 0.5,
    top_k: 250,
    top_p: 1,
    stop_sequences: ["\n\nHuman:"],
  };
  const command = new InvokeModelCommand({
    body: JSON.stringify(body),
    modelId: "anthropic.claude-v2",
    accept: "application/json",
    contentType: "application/json",
  });
  try {
    const response = await bedrock.send(command);
    console.log(response["body"]);
    console.log(JSON.parse(decoder.decode(response["body"])).completion);
  } catch (error) {
    console.log(error);
  }
};

const main = async () => {
  await bedrockGenText({ prompt: "How to cook chicken soup?" });
};

main();

Bedrock Streaming#

Let's call Bedrock with a streaming response. Run the example with ts-node:

npx ts-node --skip-project test/bedrock.ts

Here is how to call the Bedrock service with a streaming response:

import {
  BedrockRuntime,
  InvokeModelWithResponseStreamCommand
} from '@aws-sdk/client-bedrock-runtime'

const bedrock = new BedrockRuntime({ region: 'us-east-1' })

export const callBedrock = async () => {
  const decoder = new TextDecoder()
  const prompt = 'how to cook chicken soup?'
  const claudePrompt = `\n\nHuman: ${prompt} \n\nAssistant:`
  const config = {
    prompt: claudePrompt,
    max_tokens_to_sample: 2048,
    temperature: 0.5,
    top_k: 250,
    top_p: 1,
    stop_sequences: ['\n\nHuman:']
  }
  const command = new InvokeModelWithResponseStreamCommand({
    body: JSON.stringify(config),
    modelId: 'anthropic.claude-v2',
    accept: 'application/json',
    contentType: 'application/json'
  })
  try {
    console.log('call bedrock ...')
    const response = await bedrock.send(command)
    if (response.body) {
      console.log(response.body)
      for await (const chunk of response.body) {
        if (chunk.chunk) {
          console.log(
            String(JSON.parse(decoder.decode(chunk.chunk.bytes)).completion)
          )
        }
      }
    }
  } catch (error) {
    console.log(error)
  }
}

const main = async () => {
  await callBedrock()
}

main()

Bedrock Prompt#

Let's create a simple prompt for claude-v2. The prompt format should be '\n\nHuman: abc \n\nAssistant:'. To build it we need to:

  • Chain or concatenate the conversation turns
  • Format the prompt for the specific model

let mprompt = "";
for (let i = 0; i < messages.length; i++) {
  if (messages[i]["role"] == "user") {
    mprompt += "\n\nHuman:" + " " + messages[i]["content"];
  }
  if (messages[i]["role"] == "assistant") {
    mprompt += "\n\nAssistant:" + messages[i]["content"];
  }
}
mprompt += "\n\nAssistant:";
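For example, a short conversation would be flattened into a single prompt string like this (a sketch with the newlines shown literally and the assistant answers elided):

\n\nHuman: how to cook chicken soup?
\n\nAssistant: Here is a basic recipe for homemade chicken soup: ...
\n\nHuman: how to customize it for a little 3 years old girl?
\n\nAssistant: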

Here is the detailed code

bedrock-prompt.ts
// haimtran 02/02/2024
// how to build a prompt for bedrock anthropic
import {
  BedrockRuntimeClient,
  InvokeModelWithResponseStreamCommand,
} from "@aws-sdk/client-bedrock-runtime";
import * as dotenv from "dotenv";

dotenv.config();

const decoder = new TextDecoder();

const bedrockClient = new BedrockRuntimeClient({
  region: process.env.AWS_REGION ?? "us-east-1",
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID ?? "",
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY ?? "",
    sessionToken: process.env.AWS_SESSION_TOKEN ?? "",
  },
});

const callBedrock = async ({ prompt }: { prompt: string }) => {
  const response = await bedrockClient.send(
    new InvokeModelWithResponseStreamCommand({
      modelId: "anthropic.claude-v2",
      contentType: "application/json",
      accept: "application/json",
      body: JSON.stringify({
        prompt: prompt,
        max_tokens_to_sample: 300,
      }),
    })
  );
  if (response.body) {
    for await (const chunk of response.body) {
      if (chunk.chunk) {
        console.log(JSON.parse(decoder.decode(chunk.chunk.bytes)).completion);
      }
    }
  }
  console.log(response);
};

const messages = [
  { role: "user", content: "how to cook chicken soup?" },
  {
    role: "assistant",
    content:
      " Here is a basic recipe for homemade chicken soup:\n\nIngredients:\n- 1 whole chicken or 2-3 pounds chicken pieces (breasts, thighs, legs)\n- 2 tbsp olive oil\n- 1 onion, diced \n- 3 carrots, peeled and sliced\n- 3 stalks celery, sliced\n- 6 cups chicken broth\n- 2 bay leaves\n- 1 tsp thyme\n- Salt and pepper to taste\n- Egg noodles or rice (optional)\n\nInstructions:\n\n1. If using a whole chicken, remove skin and fat and cut the chicken into pieces. If using chicken pieces, trim off any excess fat or skin.\n\n2. Heat olive oil in a large pot over medium heat. Add the chicken pieces and cook for 3-4 minutes on each side until browned. Remove chicken to a plate. \n\n3. Add the onion, carrots and celery to the pot. Sauté for 5 minutes until vegetables are softened. \n\n4. Return the chicken to the pot and add the broth, bay leaves, thyme, salt and pepper. Bring to a boil, then reduce heat and simmer for 45 mins to 1 hour.\n\n5. Remove the chicken pieces and shred or cut into bite-sized pieces. Discard bay leaves.\n\n6. Add the shredded chicken back to the pot and add noodles or rice if desired. Cook for 8-10 minutes until noodles or rice are tender. \n\n7.",
  },
  {
    role: "user",
    content: "how to customize it for a little 3 years old girl?",
  },
  {
    role: "assistant",
    content:
      " Here are some tips for customizing homemade chicken soup for a 3 year old girl:\n\n- Use a mild tasting broth or reduce the amount of herbs/seasonings so it's not too strong flavored. You can use low sodium chicken broth.\n\n- Add small pasta shapes like stars, alphabets or mini noodles. Toddlers enjoy fun shaped pastas. \n\n- Dice the vegetables into small pieces so they are easy to eat. Good veggie additions include carrots, peas, corn, green beans.\n\n- Shred or cut the chicken into very small, bite-sized pieces. Remove any bones or skin to be safe. \n\n- Add a spoonful of uncooked rice to thicken the broth slightly. This makes it easy for a toddler to eat.\n\n- Stir in some spinach or kale at the end for extra nutrients. Blanch quickly in the hot soup to wilt.\n\n- Keep the soup on the milder side for spiciness. Avoid pepper or hot spices.\n\n- Mix in a dollop of plain yogurt or sour cream to provide thickness and tang.\n\n- Garnish with a sprinkle of shredded cheddar cheese for added flavor and nutrition. \n\n- Let the soup cool slightly before serving to prevent burns. Check temperature first.\n\n- Serve with soft bread sticks or rolls that are easy to dip and chew.\n\nThe key is tailoring the textures and flavors to a",
  },
  {
    role: "user",
    content: "should I add some spicy ingredients for the 3 years old one? ",
  },
];

// chain the conversation turns into a single claude-v2 prompt
let buffer = "";
for (let i = 0; i < messages.length; i++) {
  if (messages[i]["role"] == "user") {
    buffer += "\n\nHuman:" + " " + messages[i]["content"];
  }
  if (messages[i]["role"] == "assistant") {
    buffer += "\n\nAssistant:" + messages[i]["content"];
  }
}
buffer += "\n\nAssistant:";
console.log(buffer);

const main = async () => {
  await callBedrock({ prompt: buffer });
};

main();

Route Handler#

Let's create a Next.js route handler that runs on the server side to handle prompts from the client. This shows how to convert the chunks of a Bedrock response into a response stream.

import {
  BedrockRuntime,
  InvokeModelWithResponseStreamCommand
} from '@aws-sdk/client-bedrock-runtime'
import { NextRequest } from 'next/server'

const bedrock = new BedrockRuntime({ region: 'us-east-1' })

// async generator that yields the raw chunk bytes from the Bedrock stream
async function* makeIterator(prompt: string) {
  const claudePrompt = `\n\nHuman: ${prompt} \n\nAssistant:`
  const config = {
    prompt: claudePrompt,
    max_tokens_to_sample: 2048,
    temperature: 0.5,
    top_k: 250,
    top_p: 1,
    stop_sequences: ['\n\nHuman:']
  }
  const command = new InvokeModelWithResponseStreamCommand({
    body: JSON.stringify(config),
    modelId: 'anthropic.claude-v2',
    accept: 'application/json',
    contentType: 'application/json'
  })
  try {
    console.log('call bedrock ...')
    const response = await bedrock.send(command)
    if (response.body) {
      console.log(response.body)
      for await (const chunk of response.body) {
        if (chunk.chunk) {
          yield chunk.chunk.bytes
        }
      }
    }
  } catch (error) {
    console.log(error)
  }
}

// wrap the iterator in a ReadableStream so it can back a Response
function iteratorToStream(iterator: any) {
  return new ReadableStream({
    async pull(controller) {
      const { value, done } = await iterator.next()
      if (done) {
        controller.close()
      } else {
        controller.enqueue(value)
      }
    }
  })
}

function sleep(time: number) {
  return new Promise(resolve => {
    setTimeout(resolve, time)
  })
}

const encoder = new TextEncoder()

export async function GET() {
  const iterator = makeIterator('how to cook chicken soup?')
  const stream = iteratorToStream(iterator)
  return new Response(stream)
}

export async function POST(request: NextRequest) {
  const res = await request.json()
  console.log(res)
  const iterator = makeIterator(res.prompt)
  const stream = iteratorToStream(iterator)
  return new Response(stream)
}
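Note that this handler enqueues the raw Bedrock chunk bytes, so each value the client reads is a small JSON document such as {"completion": "..."} and the client has to parse it. Here is a sketch of a variant (an assumption, not part of the original code) that decodes each chunk server-side and streams plain completion text instead:

// hypothetical variant: yield plain text so the client skips JSON parsing
const chunkDecoder = new TextDecoder()
const chunkEncoder = new TextEncoder()

async function* makeTextIterator(prompt: string) {
  for await (const bytes of makeIterator(prompt)) {
    // each Bedrock chunk decodes to JSON with a completion field
    const { completion } = JSON.parse(chunkDecoder.decode(bytes))
    yield chunkEncoder.encode(completion)
  }
}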

Frontend#

  • The standard useState approach with a string somehow does not display line-break characters, so a workaround is to write to document.getElementById().innerText directly.

  • Disable the default form submit event and send a POST request to the route handler. Here is the form

<div className="min-h-screen">
  <div className="max-w-3xl mx-auto px-10 py-10">
    <div>
      <form className="relative mb-3">
        <input
          type="text"
          className="border-2 border-blue-500 w-full p-3"
          id="prompt"
        ></input>
        <button
          type="submit"
          className="bg-orange-500 px-10 py-2.5 rounded-md cursor-pointer absolute top-0 right-0 translate-y-1 mr-2"
          onClick={event => {
            event.preventDefault()
            callBedrock()
          }}
        >
          Submit
        </button>
      </form>
    </div>
    <p id="story-output"></p>
  </div>
</div>

Here is the full code

chat.tsx
'use client'

const ChatPage = () => {
  const callBedrock = async () => {
    const prompt = (document.getElementById('prompt') as HTMLInputElement).value
    const story = document.getElementById('story-output')
    story!.innerText = ''
    console.log('call bedrock ', prompt)
    try {
      const response = await fetch('/bedrock', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json'
        },
        body: JSON.stringify({ prompt: prompt })
      })
      console.log(response)
      const reader = response.body!.getReader()
      const decoder = new TextDecoder()
      while (true) {
        const { done, value } = await reader.read()
        if (done) {
          break
        }
        try {
          const json = JSON.parse(decoder.decode(value))
          console.log(decoder.decode(value))
          story!.innerText += json.completion
        } catch (error) {
          console.log(error)
          story!.innerText += 'ERROR'
        }
      }
    } catch (error) {
      console.log(error)
    }
  }

  return (
    <div className="min-h-screen">
      <div className="max-w-3xl mx-auto px-10 py-10">
        <div>
          <form className="relative mb-3">
            <input
              type="text"
              className="border-2 border-blue-500 w-full p-3"
              id="prompt"
            ></input>
            <button
              type="submit"
              className="bg-orange-500 px-10 py-2.5 rounded-md cursor-pointer absolute top-0 right-0 translate-y-1 mr-2"
              onClick={event => {
                event.preventDefault()
                callBedrock()
              }}
            >
              Submit
            </button>
          </form>
        </div>
        <p id="story-output"></p>
      </div>
    </div>
  )
}

export default ChatPage
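The missing line breaks with useState are likely just HTML whitespace collapsing. Here is a sketch of an alternative (an assumption, not what the code above uses) that keeps the text in React state and renders it with Tailwind's whitespace-pre-wrap so newlines survive; ChatPageWithState and its button are hypothetical names for illustration:

'use client'
import { useState } from 'react'

const ChatPageWithState = () => {
  const [story, setStory] = useState('')

  const callBedrock = async (prompt: string) => {
    setStory('')
    const response = await fetch('/bedrock', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ prompt })
    })
    const reader = response.body!.getReader()
    const decoder = new TextDecoder()
    while (true) {
      const { done, value } = await reader.read()
      if (done) break
      // note: like the original, this assumes each read is one complete JSON chunk
      const json = JSON.parse(decoder.decode(value))
      // functional update appends each completion chunk to the state string
      setStory(prev => prev + json.completion)
    }
  }

  return (
    <div>
      <button onClick={() => callBedrock('how to cook chicken soup?')}>Ask</button>
      {/* whitespace-pre-wrap preserves the \n characters in the state string */}
      <p className="whitespace-pre-wrap">{story}</p>
    </div>
  )
}

export default ChatPageWithState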

Vercel AI#

Let's use the Vercel AI SDK to create a simple chat app with Bedrock. The SDK provides

  • useChat, which handles messages between client and server
  • Helpers to create a streaming response for the client
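
Install the SDK first (the package name matches the imports in the code below; note these examples use the older experimental_ helpers, so a matching version of the ai package may be needed):

npm i ai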

Let's create a route handler with the project structure below.

|--app
   |--page.tsx
   |--api
      |--route.ts
|--package.json
|--tsconfig.json
|--tailwind.config.

The route handles POST requests from the client, reading the messages from the request body. It will:

  • Create a prompt for Anthropic using experimental_buildAnthropicPrompt

  • Create a streaming response for the client

export async function POST(req: Request) {
  // Extract the messages from the body of the request
  const { messages } = await req.json();
  console.log(messages);
  console.log(experimental_buildAnthropicPrompt(messages));
  // Ask Claude for a streaming chat completion given the prompt
  const bedrockResponse = await bedrockClient.send(
    new InvokeModelWithResponseStreamCommand({
      modelId: "anthropic.claude-v2",
      contentType: "application/json",
      accept: "application/json",
      body: JSON.stringify({
        prompt: experimental_buildAnthropicPrompt(messages),
        max_tokens_to_sample: 300,
      }),
    })
  );
  // Convert the response into a friendly text-stream
  const stream = AWSBedrockAnthropicStream(bedrockResponse);
  // Respond with the stream
  return new StreamingTextResponse(stream);
}

Here is the detailed code

route.ts
import {
  BedrockRuntimeClient,
  InvokeModelWithResponseStreamCommand,
} from "@aws-sdk/client-bedrock-runtime";
import { AWSBedrockAnthropicStream, StreamingTextResponse } from "ai";
import { experimental_buildAnthropicPrompt } from "ai/prompts";

// IMPORTANT! Set the runtime to edge
export const runtime = "edge";

const bedrockClient = new BedrockRuntimeClient({
  region: process.env.AWS_REGION ?? "us-east-1",
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID ?? "",
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY ?? "",
    sessionToken: process.env.AWS_SESSION_TOKEN ?? "",
  },
});

export async function POST(req: Request) {
  // Extract the messages from the body of the request
  const { messages } = await req.json();
  console.log(messages);
  console.log(experimental_buildAnthropicPrompt(messages));
  // Ask Claude for a streaming chat completion given the prompt
  const bedrockResponse = await bedrockClient.send(
    new InvokeModelWithResponseStreamCommand({
      modelId: "anthropic.claude-v2",
      contentType: "application/json",
      accept: "application/json",
      body: JSON.stringify({
        prompt: experimental_buildAnthropicPrompt(messages),
        max_tokens_to_sample: 300,
      }),
    })
  );
  // Convert the response into a friendly text-stream
  const stream = AWSBedrockAnthropicStream(bedrockResponse);
  // Respond with the stream
  return new StreamingTextResponse(stream);
}

Finally, let's create a simple frontend with useChat, which handles the chat messages on the client

"use client";
import { Message } from "ai";
import { useChat } from "ai/react";
export default function Chat() {
const { messages, input, handleInputChange, handleSubmit } = useChat({
api: "./api",
});
// Generate a map of message role to text color
const roleToColorMap: Record<Message["role"], string> = {
system: "red",
user: "black",
function: "blue",
tool: "purple",
assistant: "green",
data: "orange",
};
return (
<div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
{messages.length > 0
? messages.map((m) => (
<div
key={m.id}
className="whitespace-pre-wrap"
style={{ color: roleToColorMap[m.role] }}
>
<strong>{`${m.role}: `}</strong>
{m.content || JSON.stringify(m.function_call)}
<br />
<br />
</div>
))
: null}
<form onSubmit={handleSubmit}>
<input
className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
value={input}
placeholder="Say something..."
onChange={handleInputChange}
/>
</form>
</div>
);
}
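
Note that useChat posts to /api/chat by default; the api option here overrides that so requests go to the app/api/route.ts handler created above.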