T Svoboda
Yesterday
Needs fixes & improvements.
Instead of a JSON config, it should have IntelliJ-native configuration in Settings.
I often get glitched views because of the inline buttons; please give me an option to hide these buttons too, as I don't use the mouse.
Configuring which model is used for code completion requires editing that JSON config (in case you use Ollama with local models); could it be autodetected and have its own configuration section, please?
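(For context on the complaint above: Continue currently reads the autocomplete model from its config.json rather than from an IDE settings page. A minimal sketch of that section, with field names as documented by Continue and the model name only as an example:)

```json
{
  "tabAutocompleteModel": {
    "title": "Local autocomplete",
    "provider": "ollama",
    "model": "qwen2.5-coder:1.5b"
  }
}
```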
Nathan Spotten
3 days ago
Had bugs, but fixed them. Best AI plugin out there for customizability, like choosing your own model.
yutu1225
3 days ago
Does Ollama support a custom host, without requiring localhost?
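(For reference on the question above: Continue's Ollama provider accepts an `apiBase` field, so the server does not have to run on localhost. A sketch assuming a reachable remote host; the address is only an example:)

```json
{
  "models": [
    {
      "title": "Remote Ollama",
      "provider": "ollama",
      "model": "llama3",
      "apiBase": "http://192.168.1.10:11434"
    }
  ]
}
```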
javadarjmandi
6 days ago
The chat feature works fine, but the autocomplete feature is completely broken; sometimes it doesn't work at all. I feel like this plugin is not even tested for IntelliJ. As you can see, even the screenshots are from VS Code.
Michael Bladowski
7 days ago
Inline code completions are not working: they run for a few minutes after a restart, then stop, and there is no error or anything; it just stops working. Even if you disable auto-completion, JetBrains' own AI Assistant refuses to enable code completion because the Continue plugin does not use the internal JetBrains API that was created specifically for code completions. So using both is impossible, which is super, super annoying ;-((( Read for yourself: https://youtrack.jetbrains.com/issue/LLM-14495/Request-to-Allow-Concurrent-Use-of-Multiple-AI-Code-Completion-Plugins#focus=Comments-27-11434189.0-0
kore
1 week ago
Does not work!
weqwe qweqwe
1 week ago
It just crashes the whole IDE, can't even get it started lol
Cihad Turhan
28.01.2025
Chat works great, however autocomplete never works. I used LM Studio and loaded my own model without issues.
pestrige
21.01.2025
Autocomplete does not work in WebStorm 2024.3.1.1, so I switched back to Copilot.
Shane Messer
16.01.2025
Q: can you see the codebase now?
A: No, I can't see the codebase. If you provide a snippet or specific code block you want help with, I'd be happy to assist!
This appears to be one of those apps where they pile on features, but the most fundamental things (you know, like indexing and being able to read your codebase) just don't work at all. It's basically like having models you can talk to, but they cannot read your codebase; even after properly setting up the config JSON and adding files and the codebase to it... it doesn't work. Waste of time.
Hamilton Turner
14.01.2025
Works well for me. The speed enhancements of using a solid local model far outweigh the setup annoyances.
One minor issue at the moment is that my "sessions" history is not being updated from Android Studio. It is convenient that my VS Code and my Android Studio use the same config file. It's a bit inconvenient that there is only one "index" for all usages instead of one index per project: I see some odd files and some slow RAG lookups coming from having a huge index, and I regularly have to manually re-index, which then removes all other indexed project files.
To set up, I needed to use a JCEF-compatible runtime and change the registry, as described in
Medium: @Syex/getting-continue-plugin-running-in-android-studio-4683b1b09115
and
GitHub: continuedev/continue/issues/596
horomneacristian
14.01.2025
Nothing works. Not even deleting the text from the chat input. Installed on a fresh MacBook with a fresh installation of Rider.
nordine.ait-ouhmad
13.01.2025
Does not work on PhpStorm 2024.3.1.1. Blank window.
anhe
13.01.2025
If it worked, it would be really nice. I tried this on PhpStorm and it has just been buggy from the get-go.
Igor Loskutov
03.01.2025
Not usable on IDEA right now. Settings don't work, indexing progress is nowhere to be found, the "reset" button in Settings doesn't work, and command+enter to augment with project files doesn't work (I have to click with the cursor). The augmentation sent is several random files, including some unrelated files from node_modules.
Floris Luiten
03.01.2025
The tool looks promising, but lacks in execution. The UI is very different from the IDE, the settings buttons do not work, and the plugin hangs the IDE when used with Ollama.
I tried running local LLMs via Ollama. The plugin can initially connect to my local Ollama, since it complains about the model not being installed. It seems the model name in the plugin is incorrect, but I cannot change it. Clicking the cogs icon for the model does nothing. There is only a finite list of models, and since I cannot enter a model name as text, it is impossible to select the model I want to use.
Trying another model from the list no longer shows errors, but it also does not generate anything for Autocomplete. Trying Chat with the same model produces no output at all. No error is shown, making the issue very hard to debug. The UI does not match the IDE, making working with the plugin even harder. Once it started hanging my IDE, I uninstalled the plugin.
lexus
29.12.2024
PyCharm 2024.3.1.1 (Professional Edition) Build #PY-243.22562.220, built on December 18, 2024
Autocomplete does not work. Chat works. The checkbox in the settings is checked. Other plugins are disabled.
Michael Bladowski
16.12.2024
Best plugin available right now. Why? Because you can choose your LLM completely freely via a config; that's a killer feature. All other plugins force you to choose from a fixed list, which is stupid as hell considering that every day a new model comes to life!
SL
16.12.2024
The best plugin for AI development at the moment. It's a shame that it's not developing as quickly as the one for VSCode, and indeed, there are still a lot of bugs (but the functionality it offers still allows you to do wonders, and I like to code in JetBrains more than in VSCode). Thanks for developing this.
Shingha Manish
14.12.2024
I encountered numerous bugs with this plugin, and it failed to perform its intended purpose.