Continue

Compatible with IntelliJ IDEA (Ultimate, Community), Android Studio and 17 more


Continue is the leading open-source AI code assistant.

You can connect any models and any context to build custom autocomplete and chat experiences inside VS Code and JetBrains IDEs.
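For example, connecting a local model and a couple of context sources is done in Continue's JSON configuration file. The sketch below assumes the config.json format stored at ~/.continue/config.json; the Ollama model and the context-provider names are illustrative, and key names may differ between plugin versions.

{
  "models": [
    {
      "title": "Llama 3 (local)",
      "provider": "ollama",
      "model": "llama3"
    }
  ],
  "contextProviders": [
    { "name": "codebase" },
    { "name": "diff" }
  ]
}

Each entry in "models" becomes selectable in the chat view, while "contextProviders" controls what extra information (here, the indexed codebase and the current git diff) can be attached to a request.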

Chat

Chat makes it easy to ask for help from an LLM without needing to leave the IDE.

You send it a task, including any relevant information, and it replies with the text or code most likely to complete it. If the reply isn't what you want, you can send follow-up messages to clarify and adjust its approach until the task is completed.

Autocomplete

Autocomplete provides inline code suggestions as you type.

To enable it, simply click the "Continue" button in the status bar at the bottom right of your IDE or ensure the "Enable Tab Autocomplete" option is checked in your IDE settings.
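If you want a specific local model to serve completions, that is currently configured in the same JSON config. The snippet below is a minimal sketch assuming a tabAutocompleteModel entry and an Ollama-served model; the model name is only an example.

{
  "tabAutocompleteModel": {
    "title": "StarCoder2 3B",
    "provider": "ollama",
    "model": "starcoder2:3b"
  }
}

A small, fast model is usually preferred here, since suggestions have to arrive within the latency budget between keystrokes.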

Edit

Edit is a convenient way to modify code without leaving your current file.

Highlight a block of code, describe your changes, and a diff will be streamed inline into your file, which you can accept or reject.

Actions

Actions are shortcuts for common use cases.

For example, you might want to review code, write tests, or add a docstring.
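Custom actions can be defined in the same JSON config. The sketch below assumes a customCommands section; the command name, description, and prompt text are illustrative.

{
  "customCommands": [
    {
      "name": "test",
      "description": "Write unit tests for the highlighted code",
      "prompt": "Write a comprehensive set of unit tests for the selected code, covering edge cases and explaining each test in a short comment."
    }
  ]
}

A command defined this way is typically invoked as a slash command (for example, /test) from the chat input with the relevant code highlighted.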

License

Apache 2.0 © 2023-2024 Continue Dev, Inc.

What’s New

Version 0.0.88 (Feb 08, 2025)
View the latest release notes on GitHub

Rating & Reviews

Rated 2.8 out of 5 from 81 ratings (278,667 downloads)

T Svoboda

Yesterday

Needs fixes & improvements.

Instead of a JSON config, it should have IntelliJ-native configuration in the settings.
I often get glitched views because of the inline buttons; please give me an option to hide those buttons too, as I don't use the mouse.

Configuring which model is used for code completion requires editing that JSON config (for example, when using Ollama with local models). Could it be autodetected and given its own configuration section, please?


Nathan Spotten

3 days ago

Had bugs, but they fixed them. The best AI plugin out there for customizability, like choosing your own model.


yutu1225

3 days ago

Does the Ollama integration support a custom host, without requiring localhost?


Additional Information

Vendor:
Continue Dev (Trader)
Plugin ID:
com.github.continuedev.continueintellijextension