Chat with AI
Use the AI Assistant tool window to have a conversation with the LLM (Large Language Model), ask questions about your project, or iterate on a task.
AI Assistant takes into consideration the language and technologies used in your project, as well as local changes and version control system commits. You can search for files, classes, and element usages.
Start a new chat
Click AI Assistant on the right toolbar to open the AI Assistant tool window.
In the input field, type your question.
If you have a piece of code selected in the editor tab, use the /explain and /refactor commands to save time when typing your query.
Use the /docs command to ask PhpStorm-related questions. If applicable, AI Assistant will provide a link to the corresponding setting or documentation page.
If you want to attach a particular file or function to your query to provide more context, use #:
#thisFile refers to the currently open file.
#localChanges refers to the uncommitted changes.
#file: invokes a popup with a selection of files from the current project. You can select the necessary file from the popup or type the name of the file (for example, #file:Foo.md).
#symbol: adds a symbol to the prompt (for example, #symbol:FieldName).
#schema: refers to a database schema. You can attach a database schema to enhance the quality of generated SQL queries with your schema's context.
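For example, you can combine several references in one prompt: Explain how #symbol:FieldName is used in #file:Foo.md. The file and symbol names here are placeholders; replace them with elements from your own project.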
Alternatively, click the icon above the input field and select the files, symbols, or commits that you want to add to the context of the current chat.
In the input field, select your preferred AI chat model from the list of currently available models.
If you want to connect the AI Assistant chat to your local model, refer to Connect AI Assistant chat to your local LLM.
Press Enter to submit your query.
Click Regenerate this response at the beginning of the AI Assistant's answer to get a new response to your question.
AI Assistant keeps the chat history for each project separately and preserves it across IDE sessions. You can find the saved chats in the All Chats list.
Names of the chats are generated automatically and contain a summary of the initial query. Right-click a chat name to rename it or delete it from the list.
Manage the smart chat mode
To provide more precise answers, AI Assistant uses the smart chat mode, which is enabled by default.
In this mode, AI Assistant might send additional details, such as file types, frameworks used, and any other information that may be necessary for providing context to the LLM.
To disable the smart chat mode, clear the Enable smart chat mode checkbox in the AI Assistant settings.
Connect AI Assistant chat to your local LLM
If you do not want to use cloud-based models while working with the AI Assistant chat, you can connect your local LLM available through Ollama.
Press Ctrl+Alt+S to open settings and then select Tools | AI Assistant.
In the Third-party AI providers section, select the Enable Ollama checkbox, specify your local host URL, and click Test Connection.
When working with the AI Assistant chat, select your model from the list of available LLMs.
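For example, if Ollama runs locally with its default configuration, the host URL is typically http://localhost:11434. Note that only models you have already downloaded in Ollama (for example, with the ollama pull command) can appear in the list; the exact URL and model names depend on your local setup.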
Use AI Assistant to retrieve context-based answers
Click AI Assistant on the right toolbar to open the AI Assistant tool window.
Use natural language to request information based on the context of your workspace. Here are some examples:
Request recent files to retrieve a list of files you have recently viewed.
Ask for the current file to display the full content of the currently opened file.
Request the visible code to retrieve the code currently visible in your editor.
Ask for local changes to display uncommitted changes in your file tree.
Search within README files to find relevant information in them.
Check recently changed files to list the files modified in the ten latest commits.
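For example, you might phrase such requests as: What files have I viewed recently? or What are my uncommitted local changes? These are only sample phrasings; AI Assistant interprets free-form natural language.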
Create file from snippet
You can create a new file with the AI-generated code right from the AI Assistant chat.
In the upper-right corner of the field with the generated code, click Create File from Snippet.
AI Assistant will create a file with the AI-generated code.
If you have a file opened or selected in the Project tool window (Alt+1), the new file will be created in the same folder as that file.
In other cases, the new file will be created in the project root folder.
Attach database schema
You can enhance the quality of generated SQL queries with the context of a database schema that you are working with. To do that, attach the schema in the AI Assistant tool window. AI Assistant will get access to the structure of the attached schema, providing the LLM with information about it.
To use this feature, you need to grant AI Assistant consent to access the database schema.
Attaching a schema will also improve the results of the actions from the AI Actions context menu group, for example, Explain Code, Suggest Refactoring, and so on. For more information about these actions, refer to Use AI prompts to explain and refactor your code.
In the AI Assistant tool window input field, enter your prompt with # followed by the schema name. For example: Give me a query to get all actor names from #public.
Press Enter.
AI Assistant will analyze your schema and generate the result.
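For the example prompt above, assuming the attached public schema contains an actor table with first_name and last_name columns, the generated query might look similar to:
SELECT first_name, last_name FROM public.actor;
The actual query depends on your schema and on what the model returns.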
You can see which schema was attached to your message and navigate to that schema in the Database tool window. To do that, click Attached elements in your message, then click the schema name.
If you want to allow AI Assistant to always attach the selected schemas, select the Always allow attaching database schemas checkbox in the Attach Schema dialog. Alternatively, enable the Allow attaching database schemas setting in the AI Assistant settings.