Logging of AI requests
This functionality is available only in IDE Services On-Premises.
If you need a detailed retrospective log of communications between AI Enterprise and external AI services (such as OpenAI Platform, Azure OpenAI, Google Vertex AI, and so on), IDE Services allows you to track all request-response interactions. The log data can be stored in your organization's object storage or a dedicated object storage for logs.
warning
Since private data may be logged, your company must ensure compliance with regional regulations on data processing and retention.
Additionally, you can configure the formatting style for AI logs, choosing from JSON, HTTP, or Splunk. Examples of different formatting styles:
JSON
{
  "correlation": "a56996318975f2f8",
  "protocol": "HTTP/1.1",
  "method": "GET",
  "path": "/api/ai/user/v5/llm/profiles",
  "attributes": {
    "userId": "5caeeae2-eb90-441d-9cfb-67bc9ec48d3c"
  },
  "headers": {
    "Authorization": [
      "XXX"
    ]
  },
  "body": "<body>"
}
HTTP
Incoming Request: 96cd14698f9ff001
GET https://<serverURL>/api/ai/user/v5/llm/profiles HTTP/1.1
Authorization: XXX
Request Attribute `userId`: 6092281d-947a-4cf4-bb76-70332c94b595
<body>
Splunk
origin=remote type=request correlation=baa0fe4c12598729 protocol=HTTP/1.1 method=GET path=/api/ai/user/v5/llm/profiles attributes={userId=dca1270f-1e6d-4518-a49d-b48cf38ad7f0} headers={Authorization=[XXX]} body={<body>}
You can configure IDE Services to store logs with AI requests in your existing object storage connected to IDE Services, or in a separate bucket.
tip
Before you start, make sure you have Write access to the intended log storage.
You can choose to use the same object storage as configured for your IDE Services. For this purpose, you need to provide a specific path-prefix value.
Add the following configuration to your server configuration file:
application.yaml
tbe:
  ai:
    platform:
      logging:
        enabled: true
        path-prefix: <PATH_PREFIX>
        format: <json|http|splunk>

values.yaml (Helm)
ides:
  configCustomization:
    ai:
      platform:
        logging:
          enabled: true
          path-prefix: <PATH_PREFIX>
          format: <json|http|splunk>
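For illustration, here is a sketch of the application.yaml variant with a hypothetical prefix filled in. With a path-prefix of ai-logs, log files are grouped by date under that path in your existing object storage (for example, ai-logs/2024-05-14/a56996318975f2f8-response.json), following the $pathPrefix/${yyyy-MM-dd}/$correlationId-response.json layout described below:

application.yaml
tbe:
  ai:
    platform:
      logging:
        enabled: true
        # "ai-logs" is a hypothetical prefix; choose any path inside your existing object storage
        path-prefix: ai-logs
        format: json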
If you'd like to store AI requests in a separate object storage, you need to create a dedicated S3 bucket or Azure Storage account and specify its details in the server configuration file.
Add the following configuration to your server configuration file:
application.yaml
tbe:
  ai:
    platform:
      logging:
        enabled: true
        format: <json|http|splunk>
        storage-type: <azure|s3>
        azure:
          container: <CONTAINER_NAME>
          connection-string: <CONNECTION_STRING>
        s3:
          bucket: <BUCKET_NAME>
          url: <ENDPOINT_URL>
          access-key: <ACCESS_KEY>
          secret-key: <ACCESS_SECRET>

values.yaml (Helm)
ides:
  configCustomization:
    ai:
      platform:
        logging:
          enabled: true
          format: <json|http|splunk>
          storage-type: <azure|s3>
          azure:
            container: <CONTAINER_NAME>
            connection-string: <CONNECTION_STRING>
          s3:
            bucket: <BUCKET_NAME>
            url: <ENDPOINT_URL>
            access-key: <ACCESS_KEY>
            secret-key: <ACCESS_SECRET>
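As a minimal sketch, the application.yaml variant might look as follows for a dedicated S3 bucket. The bucket name, endpoint URL, and credentials are placeholders, not defaults; replace them with the details of the bucket you created:

application.yaml
tbe:
  ai:
    platform:
      logging:
        enabled: true
        format: splunk
        storage-type: s3
        s3:
          # placeholder values for a hypothetical dedicated bucket
          bucket: ide-services-ai-logs
          url: https://s3.eu-west-1.amazonaws.com
          access-key: <ACCESS_KEY>
          secret-key: <ACCESS_SECRET>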
- ai.platform.logging.format
  Specify one of Logbook's formatting styles for logs. Possible values: json, http, and splunk.
- ai.platform.logging.path-prefix
  To store AI logs together with other IDE Services Server data, specify the path for these log files in your object storage. The resulting layout is as follows: $pathPrefix/${yyyy-MM-dd}/$correlationId-response.json.
- ai.platform.logging.storage-type
  If you use a separate object storage for logs, specify its type: s3 or azure. Depending on the specified type, set up a connection to the storage of your choice.
- ai.platform.logging.azure.connection-string
  Provide a connection string to authorize requests to Azure Storage.
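For reference, an Azure Storage connection string typically has the following shape. The container name, account name, and account key below are placeholders and must be replaced with the values from your own Storage account:

application.yaml
tbe:
  ai:
    platform:
      logging:
        enabled: true
        format: json
        storage-type: azure
        azure:
          # placeholder container name and connection string
          container: ai-request-logs
          connection-string: "DefaultEndpointsProtocol=https;AccountName=<ACCOUNT_NAME>;AccountKey=<ACCOUNT_KEY>;EndpointSuffix=core.windows.net"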