The conversation API supports the following features:
1. Prompt caching - Allows developers to cache prompts in Dapr, leading to much faster response times and reducing both egress costs and the cost of inserting the prompt into the LLM provider's cache.
2. PII scrubbing - Allows for the obfuscation of sensitive data going into and out of the LLM.
To learn more about how to enable these features, see the [conversation API reference]({{< ref conversation_api.md >}}) page.
## Related links
Try out the conversation API using the full examples provided in the supported SDK repos.
```
POST http://localhost:<daprPort>/v1.0-alpha1/conversation/<llm-name>/converse
```
| Field | Description |
| --------- | ----------- |
|`inputs`| Inputs for the conversation. Multiple inputs at one time are supported. Required |
|`cacheTTL`| A time-to-live value for a prompt cache to expire. Uses Golang duration format. Optional |
|`scrubPII`| A boolean value to enable obfuscation of sensitive information returned by the LLM. Optional |
|`temperature`| A float value to control the temperature of the model. Used to optimize for consistency (0) or creativity (1). Optional |
|`metadata`| [Metadata](#metadata) passed to conversation components. Optional |

#### Input body

| Field | Description |
| --------- | ----------- |
|`content`| The message content to send to the LLM. Required |
|`role`| The role for the LLM to assume. Possible values: `user`, `tool`, `assistant`. Optional |
|`scrubPII`| A boolean value to enable obfuscation of sensitive information present in the content field. Optional |

### Request content example

```json
REQUEST = {
  "inputs": [
    {
      "content": "What is Dapr?",
      "role": "user", // Optional
      "scrubPII": "true" // Optional. Will obfuscate any sensitive information found in the content field
    }
  ],
  "cacheTTL": "10m", // Optional
  "scrubPII": "true", // Optional. Will obfuscate any sensitive information returning from the LLM
  "temperature": 0.5 // Optional. Optimizes for consistency (0) or creativity (1)
}
```