# AI Integration
Reox makes Large Language Models first-class citizens in your code.
## The `ai` Keyword
Define AI assistants directly in your application code. Reox handles the context management and API calls to the system-level LLM service.
```reox
ai assistant {
    model: "neolyx-gpt"
    temperature: 0.7

    fn suggest_reply(email: string) -> string {
        return ai.prompt("Draft a polite reply to: {email}");
    }
}
```

## Async AI Operations
AI operations can be performed asynchronously using `async fn` and `await`, keeping your UI responsive while waiting for model responses.
```reox
async fn generate_code(prompt: string) -> string {
    let result = await ai.prompt(prompt);
    return result;
}

// Usage in an async context
async fn handle_button_click() {
    let code = await generate_code("Write a sorting function");
    display_result(code);
}
```

## Native Performance
Unlike REST API calls in other languages, Reox AI calls are optimized IPC (Inter-Process Communication) messages to the local NeolyxOS Neural Engine service, ensuring low latency and keeping prompts on-device.
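The REST-versus-IPC distinction can be illustrated outside Reox. The sketch below uses Python (since Reox tooling is assumed unavailable) to perform a round trip over a Unix domain socket, the kind of local transport an IPC-based service typically uses. The engine stub and its echo protocol are invented for illustration; they are not the actual Neural Engine API.

```python
# Conceptual sketch (Python, not Reox): a local IPC round trip over a
# Unix domain socket. The "engine" thread stands in for the NeolyxOS
# Neural Engine service, which is not publicly documented; it simply
# echoes a canned reply so the example is runnable anywhere.
import os
import socket
import tempfile
import threading
import time

SOCK_PATH = os.path.join(tempfile.mkdtemp(), "engine.sock")

def engine_stub() -> None:
    # Stub service: accept one connection, read a prompt, send a reply.
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(SOCK_PATH)
    srv.listen(1)
    conn, _ = srv.accept()
    prompt = conn.recv(4096).decode()
    conn.sendall(f"reply to: {prompt}".encode())
    conn.close()
    srv.close()

t = threading.Thread(target=engine_stub)
t.start()

# Client side: connect to the local socket (retrying until the stub is
# listening), send the prompt, and read the response. No network stack,
# TLS handshake, or remote round trip is involved, which is where the
# latency advantage of local IPC comes from.
cli = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
for _ in range(100):
    try:
        cli.connect(SOCK_PATH)
        break
    except (FileNotFoundError, ConnectionRefusedError):
        time.sleep(0.01)
cli.sendall(b"hello")
reply = cli.recv(4096).decode()
cli.close()
t.join()
print(reply)  # prints "reply to: hello"
```

Besides latency, the same property gives the privacy benefit the docs mention: the prompt never leaves the machine, because the socket is a filesystem object only local processes can reach.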