Settings reference
Last modified: 20 February 2025

Settings | Tools | AI Assistant
Use this page to configure the general behavior of AI Assistant.

Features
Item | Description |
---|---|
Enable smart chat mode | Allows AI Assistant to send additional context data to the LLM. This helps AI Assistant to provide better responses and enables you to ask questions about your files, classes, and functions. For additional information, refer to Manage the smart chat mode. |
Enable cloud completion suggestions | Enables AI Assistant to use cloud completion models for autocompleting single lines, blocks of code, or even entire functions based on the project context. For additional information, refer to Configure cloud completion. |
Allow attaching database schemas to AI Assistant chat | Allows AI Assistant to get access to database schemas. This helps AI Assistant to enhance the quality of generated SQL queries. For additional information, refer to Attach database schema. |
Provide AI-generated name suggestions | Enables AI Assistant to suggest names when renaming symbols. For additional information, refer to Get help with name suggestions. |
Suggest converting pasted code to the language of the target file | When you paste code written in another language, AI Assistant offers to convert it to the language of the target file. |
Generate a title for the shelved changelist | When enabled, AI Assistant automatically generates a title for silently shelved changes. For additional information, refer to Generate shelf title. |
GitHub Plugin: Generate a summary upon opening a Pull Request | Enables AI Assistant to automatically generate a summary of changes when opening a pull request. |
Natural Language
Item | Description |
---|---|
Receive AI Assistant chat responses in a custom language | Specify the language in which you want to receive chat responses. For additional information, refer to Change the chat response language. |
Third-party AI providers
Item | Description |
---|---|
Enable LM Studio | Specify the URL to connect to the local LM Studio service. For additional information, refer to Connect AI Assistant chat to your local LLM. |
Enable Ollama | Specify the URL to connect to the local Ollama service. For additional information, refer to Connect AI Assistant chat to your local LLM. |
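Before entering a URL for a third-party provider, it can help to confirm that the local service is actually listening. The sketch below is a minimal check, assuming the commonly used default addresses (Ollama on port 11434, LM Studio on port 1234); adjust the URLs if your local setup uses different ports.

```python
# Minimal sketch: check whether a local LLM service is reachable before
# pointing AI Assistant at it. The URLs are assumed defaults; change them
# to match your own LM Studio or Ollama configuration.
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

CANDIDATE_URLS = {
    "Ollama": "http://localhost:11434",
    "LM Studio": "http://localhost:1234",
}

for name, url in CANDIDATE_URLS.items():
    try:
        with urlopen(url, timeout=2) as response:
            print(f"{name}: reachable at {url} (HTTP {response.status})")
    except HTTPError as exc:
        # An HTTP error response still means a server is listening there.
        print(f"{name}: reachable at {url} (HTTP {exc.code})")
    except URLError as exc:
        print(f"{name}: not reachable at {url} ({exc.reason})")
```

If a service reports as not reachable, start it locally (or correct the port) before enabling the corresponding option in the settings above.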