Generative AI-powered code generation is getting both more powerful and more compact.
Stability AI, the vendor still perhaps best known for its Stable Diffusion text-to-image generative AI technology, today announced its first new AI model of 2024: the commercially licensed (via membership) Stable Code 3B.
As the name implies, Stable Code 3B is a 3-billion-parameter model, and it is focused on code completion capabilities for software development.
At only 3 billion parameters, Stable Code 3B can run locally on laptops without a dedicated GPU while still delivering performance competitive with larger models such as Meta's Code Llama 7B.
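For readers who want to experiment with that local setup, the following is a minimal sketch using the Hugging Face transformers library. The "stabilityai/stable-code-3b" repository ID and the loading options shown are assumptions, not details confirmed in Stability AI's announcement, and usage is subject to the company's membership terms.

```python
# Illustrative only: a minimal CPU-inference sketch with Hugging Face transformers.
# The repo ID "stabilityai/stable-code-3b" is an assumption; check Stability AI's
# model card for the official identifier and license terms.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stable-code-3b"  # assumed repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # ~6 GB of weights, small enough for laptop RAM
    trust_remote_code=True,
)
model.to("cpu")  # runs on CPU; no dedicated GPU required

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```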
The push toward smaller, more compact yet capable models is one Stability AI began at the end of 2023 with models like StableLM Zephyr 3B for text generation.
Stability AI first previewed Stable Code with the code generation LLM's initial release in August 2023 and has been steadily improving the technology ever since.
How Stability AI improved Stable Code 3B
Stability AI has improved Stable Code in a number of ways since the initial release.
With the new Stable Code 3B, the model not only suggests new lines of code but can also fill in larger missing sections of existing code.
The ability to fill in missing sections of code is an advanced code completion capability known as Fill in the Middle (FIM).
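In practice, FIM works by giving the model the code before and after a gap and asking it to generate what goes in between. The short sketch below shows how such a prompt is typically assembled; the StarCoder-style control tokens used here are an assumption, and the exact tokens Stable Code 3B expects should be checked against its model card.

```python
# Illustrative FIM prompt construction, assuming StarCoder-style control tokens
# (<fim_prefix>, <fim_suffix>, <fim_middle>). The actual token names used by
# Stable Code 3B may differ; consult the official model card.
prefix = "def average(values):\n    total = sum(values)\n"
suffix = "\n    return result\n"

fim_prompt = f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"
# The model is then asked to generate the missing middle, e.g. a line like
# "    result = total / len(values)", conditioned on both sides of the gap.
print(fim_prompt)
```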
The model's training was also optimized with an expanded context size using a technique known as Rotary Position Embeddings (RoPE), which optionally allows a context length of up to 100K tokens. RoPE is a technique other LLMs also use, including Meta's Llama 2 Long.
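For context on what RoPE actually does, here is a compact, textbook-style sketch of the general technique, not Stability AI's implementation: each pair of feature dimensions is rotated by an angle that grows with token position, so relative positions show up directly in the attention dot products, which is what makes extending the context window practical.

```python
# Generic illustration of Rotary Position Embeddings (RoPE); not Stability AI's code.
import numpy as np

def rope(x: np.ndarray, base: float = 10000.0) -> np.ndarray:
    """Apply rotary position embeddings to x of shape (seq_len, dim), dim even."""
    seq_len, dim = x.shape
    half = dim // 2
    # One rotation frequency per feature pair, decreasing geometrically.
    freqs = base ** (-np.arange(half) / half)           # (half,)
    angles = np.arange(seq_len)[:, None] * freqs[None]  # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    # Rotate each (x1, x2) pair by its position-dependent angle.
    return np.concatenate([x1 * cos - x2 * sin, x1 * sin + x2 * cos], axis=-1)

# Example: query embeddings for a 4-token sequence with 8 features per token.
q = rope(np.random.randn(4, 8))
```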
Stable Code 3B is built on Stability AI’s Stable LM 3B natural language model. With further training focused on software engineering data, the model gained code completion skills while retaining strengths in general language tasks.
Its training data included code repositories, programmer forums, and other technical sources.
It was also trained on 18 different programming languages, and Stability AI claims that Stable Code 3B demonstrates leading performance on benchmark tests across multiple languages.
The model covers popular languages like Python, Java, JavaScript, Go, Ruby, and C++. Early benchmarks indicate it matches or exceeds the completion quality of models over twice its size.
The market for generative AI code generation tools is competitive, with Meta's Code Llama 7B being one of the larger and more popular options.
On the 3-billion-parameter side, the StarCoder LLM, co-developed as an open-source effort with participation from IBM, Hugging Face, and ServiceNow, is another popular option.
Stability AI claims Stable Code 3B outperforms StarCoder across Python, C++, JavaScript, Java, PHP and Rust programming languages.
Part of Stability AI’s membership subscription offering
Stable Code 3B is being made available for commercial use as part of Stability AI’s new membership subscription service that was first announced in December.
Members gain access to Stable Code 3B alongside other AI tools in Stability AI's portfolio, including the SDXL Stable Diffusion image generation models, StableLM Zephyr 3B for text generation, Stable Audio for audio generation, and Stable Video for video generation.