
Commit 9898f92

yaron2 committed
update conversation api
Signed-off-by: yaron2 <[email protected]>
1 parent 336dcdf

File tree

2 files changed (+36 -34)


daprdocs/content/en/developing-applications/building-blocks/conversation/howto-conversation-layer.md (+15 -7)

@@ -52,8 +52,6 @@ spec:
     value: <REPLACE_WITH_YOUR_KEY>
   - name: model
     value: gpt-4-turbo
-  - name: cacheTTL
-    value: 10m
 ```
 
 ## Connect the conversation client
@@ -114,12 +112,12 @@ func main() {
     }
 
     input := dapr.ConversationInput{
-        Message: "Please write a witty haiku about the Dapr distributed programming framework at dapr.io",
-        // Role: nil, // Optional
-        // ScrubPII: nil, // Optional
+        Content: "Please write a witty haiku about the Dapr distributed programming framework at dapr.io",
+        // Role: "", // Optional
+        // ScrubPII: false, // Optional
     }
 
-    fmt.Printf("conversation input: %s\n", input.Message)
+    fmt.Printf("conversation input: %s\n", input.Content)
 
     var conversationComponent = "echo"
 
@@ -163,7 +161,7 @@ async fn main() -> Result<(), Box<dyn std::error::Error>> {
     let request =
         ConversationRequestBuilder::new(conversation_component, vec![input.clone()]).build();
 
-    println!("conversation input: {:?}", input.message);
+    println!("conversation input: {:?}", input.content);
 
     let response = client.converse_alpha1(request).await?;
 
@@ -224,6 +222,16 @@ dapr run --app-id=conversation --resources-path ./config --dapr-grpc-port 3500 -
 
 {{< /tabs >}}
 
+## Advanced features
+
+The conversation API supports the following features:
+
+1. Prompt caching - Allows developers to cache prompts in Dapr, leading to much faster response times and saving costs on egress, as well as the associated costs of inserting the prompt into the LLM provider's cache.
+
+2. PII scrubbing - Allows for the obfuscation of sensitive data going into and coming out of the LLM.
+
+To learn how to enable these features, see the [API reference]({{< ref conversation_api.md >}}) page.
+
 ## Related links
 
 Try out the conversation API using the full examples provided in the supported SDK repos.
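
The new "Advanced features" section above describes per-request options (prompt caching and PII scrubbing) that pair with the renamed Go input fields in this commit. The following is a minimal sketch of the updated `dapr.ConversationInput` with the optional fields filled in; the import path and the plain `string`/`bool` types for `Role` and `ScrubPII` are assumptions inferred from the commented-out lines in the diff above, not part of this commit.

```go
package main

import (
	"fmt"

	dapr "github.com/dapr/go-sdk/client" // assumed import path for the Dapr Go SDK client
)

func main() {
	// Build an input using the renamed Content field from the diff above,
	// with the optional Role and ScrubPII fields filled in.
	input := dapr.ConversationInput{
		Content:  "Please write a witty haiku about the Dapr distributed programming framework at dapr.io",
		Role:     "user", // possible values per the API reference: 'user', 'tool', 'assistant'
		ScrubPII: true,   // obfuscate any sensitive information found in Content
	}

	fmt.Printf("conversation input: %s\n", input.Content)
}
```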

daprdocs/content/en/reference/api/conversation_api.md (+21 -27)

@@ -30,40 +30,34 @@ POST http://localhost:<daprPort>/v1.0-alpha1/conversation/<llm-name>/converse
 
 | Field | Description |
 | --------- | ----------- |
-| `conversationContext` | The ID of an existing chat room (like in ChatGPT). |
-| `inputs` | Inputs for the conversation. Multiple inputs at one time are supported. |
-| `metadata` | [Metadata](#metadata) passed to conversation components. |
+| `inputs` | Inputs for the conversation. Multiple inputs at one time are supported. Required |
+| `cacheTTL` | A time-to-live value for a prompt cache to expire. Uses Golang duration format. Optional |
+| `scrubPII` | A boolean value to enable obfuscation of sensitive information returned from the LLM. Optional |
+| `temperature` | A float value to control the temperature of the model. Used to optimize for consistency (lower values) or creativity (higher values). Optional |
+| `metadata` | [Metadata](#metadata) passed to conversation components. Optional |
 
-#### Metadata
+#### Input body
 
-Metadata can be sent in the request’s URL. It must be prefixed with `metadata.`, as shown in the table below.
-
-| Parameter | Description |
+| Field | Description |
 | --------- | ----------- |
-| `metadata.key` | The API key for the component. `key` is not applicable to the [AWS Bedrock component]({{< ref "aws-bedrock.md#authenticating-aws" >}}). |
-| `metadata.model` | The Large Language Model you're using. Value depends on which conversation component you're using. `model` is not applicable to the [DeepSeek component]({{< ref deepseek.md >}}). |
-| `metadata.cacheTTL` | A time-to-live value for a prompt cache to expire. Uses Golang duration format. |
-
-For example, to call for [Anthropic]({{< ref anthropic.md >}}):
-
-```bash
-curl POST http://localhost:3500/v1.0-alpha1/conversation/anthropic/converse?metadata.key=key1&metadata.model=claude-3-5-sonnet-20240620&metadata.cacheTTL=10m
-```
-
-{{% alert title="Note" color="primary" %}}
-The metadata parameters available depend on the conversation component you use. [See all the supported components for the conversation API.]({{< ref supported-conversation >}})
-{{% /alert %}}
+| `content` | The message content to send to the LLM. Required |
+| `role` | The role for the LLM to assume. Possible values: 'user', 'tool', 'assistant'. Optional |
+| `scrubPII` | A boolean value to enable obfuscation of sensitive information present in the content field. Optional |
 
-### Request content
+### Request content example
 
 ```json
 REQUEST = {
-  "inputs": ["what is Dapr", "Why use Dapr"],
-  "metadata": {
-    "model": "model-type-based-on-component-used",
-    "key": "authKey",
-    "cacheTTL": "10m",
-  }
+  "inputs": [
+    {
+      "content": "What is Dapr?",
+      "role": "user", // Optional
+      "scrubPII": "true", // Optional. Will obfuscate any sensitive information found in the content field
+    },
+  ],
+  "cacheTTL": "10m", // Optional
+  "scrubPII": "true", // Optional. Will obfuscate any sensitive information returned from the LLM
+  "temperature": 0.5 // Optional. Optimizes for consistency (0) or creativity (1)
 }
 ```
 
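
As a usage illustration of the updated request schema, the sketch below POSTs the example body above to the documented `/v1.0-alpha1/conversation/<llm-name>/converse` endpoint using only the Go standard library. The HTTP port (3500) and the component name (`echo`, borrowed from the how-to page) are assumptions; the `// Optional` comments are dropped because they are not valid JSON, and the string form of `scrubPII` simply mirrors the example above.

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Request body following the updated schema above: per-input content,
	// role, and scrubPII, plus request-level cacheTTL, scrubPII, and temperature.
	body := []byte(`{
	  "inputs": [
	    {
	      "content": "What is Dapr?",
	      "role": "user",
	      "scrubPII": "true"
	    }
	  ],
	  "cacheTTL": "10m",
	  "scrubPII": "true",
	  "temperature": 0.5
	}`)

	// The endpoint shape comes from this reference page; the HTTP port (3500)
	// and the component name ("echo") are assumptions for this sketch.
	url := "http://localhost:3500/v1.0-alpha1/conversation/echo/converse"

	resp, err := http.Post(url, "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	fmt.Println("conversation output:", string(out))
}
```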
