How to update OpenAI Prompts for document generation
Configuration Scope: Environment-Specific
This setting is environment-specific and must be configured separately in each environment (dev, test, prod). Changes here will not be included in configuration exports.
Overview
System prompts can now be stored and managed in a storage table instead of being hardcoded in the application. Prompt retrieval also leverages Redis caching for improved performance.
Storage Configuration
The storage table OpenAIPrompts has been created in the briefconnectsa storage resource. Prompts are stored under the PartitionKey OpenAiPrompt and uniquely identified by the following RowKeys:
| RowKey | Description |
|---|---|
| prompt-default | Default system prompt |
| prompt-qon | Q&A prompt |
| prompt-wordfile | Word file prompt |
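As an illustration, a prompt entity could be read with the azure-data-tables Python SDK. This is a minimal sketch only: the column name Prompt that holds the prompt text is an assumption, and the connection string placeholder must be replaced with the briefconnectsa connection string.

```python
from azure.data.tables import TableClient

# "<storage_connection_string>" is a placeholder for the briefconnectsa connection string.
table_client = TableClient.from_connection_string(
    conn_str="<storage_connection_string>",
    table_name="OpenAIPrompts",
)

# Fetch the default system prompt by its PartitionKey and RowKey.
entity = table_client.get_entity(partition_key="OpenAiPrompt", row_key="prompt-default")
prompt_text = entity["Prompt"]  # column name is an assumption; adjust to the actual schema
```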
New AppSettings
The following AppSettings have been added to the Function App Service:
| Setting Name | Description | Default Value |
|---|---|---|
| AzureOpenAIServiceStorage | Connection string for the storage account | N/A |
| AzureOpenAIServicePromptCacheTime | Cache duration in minutes | 10 |
Example AppSettings Configuration
{
  "AzureOpenAIServiceStorage": "<storage_connection_string>",
  "AzureOpenAIServicePromptCacheTime": 10
}
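In a Python-based Function App, these settings are surfaced as environment variables. A minimal sketch of reading them, falling back to the documented default of 10 minutes when the cache setting is absent:

```python
import os

# App settings are exposed to the Function App as environment variables.
storage_connection_string = os.environ["AzureOpenAIServiceStorage"]

# Fall back to the documented default of 10 minutes when the setting is not configured.
cache_time_minutes = int(os.environ.get("AzureOpenAIServicePromptCacheTime", "10"))
```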
Functionality
- The AI generation process retrieves prompts from the storage table instead of using hardcoded values.
- Prompts are cached in Redis for 10 minutes by default. The cache duration can be configured via AzureOpenAIServicePromptCacheTime.
- If retrieval fails, the application logs the error. Common causes include connection problems or incorrect storage configuration. A sketch of this retrieval flow follows the list.
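The sketch below shows how such a cached retrieval could look in Python, using the redis and azure-data-tables packages. It is an illustration under stated assumptions, not the application's actual implementation: the Redis connection details, the get_prompt helper, and the Prompt column name are all hypothetical.

```python
import logging
import os

import redis
from azure.data.tables import TableClient

# Cache duration comes from the app setting (minutes), default 10.
CACHE_TTL_SECONDS = int(os.environ.get("AzureOpenAIServicePromptCacheTime", "10")) * 60

# Hypothetical Redis connection; real host and credentials come from the app's Redis configuration.
redis_client = redis.Redis(host="localhost", port=6379, decode_responses=True)


def get_prompt(row_key: str) -> str | None:
    """Return a prompt by RowKey, serving from Redis when a cached copy exists."""
    cache_key = f"OpenAiPrompt:{row_key}"
    cached = redis_client.get(cache_key)
    if cached is not None:
        return cached

    try:
        table_client = TableClient.from_connection_string(
            conn_str=os.environ["AzureOpenAIServiceStorage"],
            table_name="OpenAIPrompts",
        )
        entity = table_client.get_entity(partition_key="OpenAiPrompt", row_key=row_key)
        prompt_text = entity["Prompt"]  # column name is an assumption
    except Exception:
        # Retrieval failures (bad connection string, missing entity, network issues) are logged.
        logging.exception("Failed to retrieve prompt '%s' from OpenAIPrompts", row_key)
        return None

    # Cache the prompt for the configured duration before returning it.
    redis_client.setex(cache_key, CACHE_TTL_SECONDS, prompt_text)
    return prompt_text
```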