The Ignition milestone has been achieved. It brings large language models (LLMs) to the Internet Computer, allowing developers to integrate them into canister smart contracts with just a few lines of code.
What Is the Ignition Milestone?
The Ignition milestone is a key part of the Decentralized AI track of the ICP Roadmap, dedicated to connecting canisters with off-chain LLMs. It lets developers add AI capabilities to their dapps without complex setup, with AI workers handling the connection.
What’s New in Ignition
LLM Libraries for Streamlined Integration
To make it easier to connect canisters to LLMs, we have released libraries in three programming languages widely used on the Internet Computer: Motoko, Rust, and TypeScript.
These libraries significantly speed up the process of integrating LLMs into your dapps. For instance, here’s a basic example of how a canister can interact with Llama 3.1 using a few lines of Motoko code:
import LLM "mo:llm";

await LLM.chat(#Llama3_1_8B).withMessages([
  #system_ {
    content = "You are a helpful assistant.";
  },
  #user {
    content = "How big is the sun?";
  },
]).send();
This example is available for exploration on ICP Ninja, demonstrating how a canister can submit a query to an LLM and receive a response with minimal setup.
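For context, in a deployed canister the call above lives inside an actor's async method. Here is a minimal sketch of such a wrapper; the actor shape and the ask method are illustrative, and the response.message.content field is an assumption about the library's response type:

import LLM "mo:llm";

actor {
  // Illustrative wrapper: forwards a question to Llama 3.1 8B and returns
  // the assistant's reply, or an empty string if the model returned none.
  public func ask(question : Text) : async Text {
    let response = await LLM.chat(#Llama3_1_8B).withMessages([
      #system_ {
        content = "You are a helpful assistant.";
      },
      #user {
        content = question;
      },
    ]).send();
    // Assumption: the response carries an optional message.content text.
    switch (response.message.content) {
      case (?text) { text };
      case (null) { "" };
    };
  };
}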
Here’s another example demonstrating how to call LLMs with tools:
import LLM "mo:llm";

actor {
  public func example() : async () {
    let response = await LLM.chat(#Llama3_1_8B)
      .withMessages([
        #system_ {
          content = "You are a helpful assistant.";
        },
        #user {
          content = "What's the weather in Zurich?";
        },
      ])
      .withTools([
        LLM.tool("get_weather")
          .withDescription("Get current weather for a location")
          .withParameter(
            LLM.parameter("location", #String)
              .withDescription("The location to get weather for")
              .isRequired()
          )
          .build()
      ])
      .send();
  };
}
You can view a live demonstration of the tool usage here, along with the source code of the demo here.
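Note that when the model decides to call a tool, your canister still has to execute it and report the result back before the model can give a final answer. The following is only a sketch of that step, continuing inside example() after send(); the fields message.tool_calls, function.name, and toolCall.id are assumptions about the library's response type rather than confirmed API:

    // Sketch only: iterate over the tool calls the model requested.
    for (toolCall in response.message.tool_calls.vals()) {
      switch (toolCall.function.name) {
        case ("get_weather") {
          // Run the real lookup here (for example via an HTTPS outcall),
          // then send the result back to the model as a tool message,
          // referencing toolCall.id, so it can produce its final answer.
          ()
        };
        case (_) { () };
      };
    };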
AI Workers
To connect canisters with off-chain LLMs, we built a minimum viable product of AI workers: a lightweight system that allows canisters to query off-chain LLMs. AI workers handle the communication, so canisters can send requests to LLMs and receive their responses in real time.
Currently, AI workers are compatible with Llama 3.1 8B, Llama 4 Scout, and Qwen 3 32B, granting developers the flexibility to choose the most suitable LLM for their projects.
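In the libraries, the target model is selected by a variant tag, so switching models is a one-line change. The sketch below assumes a one-shot LLM.prompt helper and assumes the tag for Qwen 3 32B is spelled #Qwen3_32B; check the library documentation for the exact identifiers.

import LLM "mo:llm";

actor {
  // Same integration as above, different model: the variant tag selects
  // the LLM. #Qwen3_32B is an assumed spelling of the Qwen 3 32B tag.
  public func summarize(text : Text) : async Text {
    await LLM.prompt(#Qwen3_32B, "Summarize in one sentence: " # text);
  };
}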
Below is a diagram illustrating how AI workers operate to relay prompts to off-chain LLM providers.
How it works:
- Canisters send prompts to an “LLM canister” (a canister specifically designed to receive LLM prompts) through the LLM libraries described above.
- The LLM canister stores these prompts in a queue.
- AI workers continuously check the LLM canister for new prompts.
- AI workers execute the prompts and return the responses to the LLM canister, which then forwards them back to the originating canister (a conceptual sketch of this queue follows below).
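Purely as an illustration of that flow, here is a stripped-down sketch of the queueing pattern at the heart of the LLM canister. It is not the actual LLM canister's interface: every name is hypothetical, and the real canister also deals with worker authentication, time-outs, and routing responses back to the callers.

import Buffer "mo:base/Buffer";

// Hypothetical sketch of the prompt queue described above.
actor {
  type Prompt = { id : Nat; text : Text };

  var nextId : Nat = 0;
  let pending = Buffer.Buffer<Prompt>(0);      // prompts waiting for a worker
  let answers = Buffer.Buffer<(Nat, Text)>(0); // (prompt id, LLM response)

  // Called, via the LLM libraries, by application canisters.
  public func submitPrompt(text : Text) : async Nat {
    let id = nextId;
    nextId += 1;
    pending.add({ id = id; text = text });
    id;
  };

  // Polled by off-chain AI workers looking for new prompts.
  public func takePrompt() : async ?Prompt {
    if (pending.size() == 0) { null } else { ?pending.remove(0) };
  };

  // Called by an AI worker once the off-chain LLM has produced a response.
  public func submitResponse(id : Nat, response : Text) : async () {
    answers.add((id, response));
  };
}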
Real-World Use Cases
Developers are actively utilizing the LLM libraries to craft innovative dapps, including:
- Wallets with chat functionality: incorporating conversational AI into crypto wallets for an improved user experience (set to launch soon in OISY).
- DAOs employing sentiment analysis: Utilizing LLMs to assess community sentiment and inform portfolio decisions, as seen in the Alice DAO.
These examples illustrate how AI workers can facilitate a variety of applications on the Internet Computer.
Why This Matters and How to Get Started
The Ignition milestone simplifies the process for developers to incorporate LLMs into their Internet Computer projects, paving the way for new dapps, such as chatbots, analytics tools, and AI-driven DeFi applications. By merging LLMs with Internet Computer capabilities like Chain Fusion, HTTPS outcalls, and on-chain randomness, developers can create innovative and robust solutions.
Ready to explore? Check out the LLM Chatbot project on ICP Ninja, experience the live demo, or delve into the code and examples in our repository.