AI & Prompts
Register AI models, manage reusable prompts, and power assistants and workflows.
Bosca includes a lightweight AI layer to standardize model access and prompt management. You can register one or more LLMs, store reusable prompts, and call them from workflows or services. This keeps AI usage consistent across your apps.
What you get:
- Central model registry with API keys and defaults
- Reusable prompts with input variables and versioning
- A simple service layer for chat, completion, and tool‑style responses
- Integration points for search‑augmented generation and content workflows
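The "reusable prompts with input variables and versioning" idea can be sketched as a small versioned template. The class and field names below are illustrative assumptions, not Bosca's actual API:

```python
from dataclasses import dataclass
from string import Template

@dataclass
class Prompt:
    """Illustrative versioned prompt template (hypothetical, not Bosca's API)."""
    name: str
    version: int
    template: str  # uses $variable placeholders for input variables

    def render(self, **variables: str) -> str:
        # substitute() raises KeyError if a required input variable is missing.
        return Template(self.template).substitute(**variables)

summarize = Prompt(
    name="summarize",
    version=2,
    template="Summarize the following $content_type in $max_words words:\n$body",
)

text = summarize.render(content_type="article", max_words="50", body="...")
```

Storing the version alongside the template lets callers pin a known-good prompt while editors iterate on newer revisions.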
Typical uses:
- Draft Assistance: Generate summaries, tags, or titles automatically during Workflows.
- Semantic Search: Create embeddings that help users find content by meaning, not just keywords.
- Chat Experiences: Power chatbots and assistants in your applications using shared, versioned prompts.
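Semantic search works by comparing embedding vectors for similarity rather than matching keywords. A minimal cosine-similarity sketch, using toy 3-dimensional vectors in place of real model embeddings:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings"; a real embedding model produces hundreds of dimensions.
docs = {
    "cooking": [0.9, 0.1, 0.0],
    "recipes": [0.8, 0.2, 0.1],
    "astronomy": [0.0, 0.1, 0.9],
}
query = [0.85, 0.15, 0.05]

# Rank documents by similarity to the query vector, most similar first.
ranked = sorted(docs, key=lambda name: cosine_similarity(query, docs[name]),
                reverse=True)
```

Here the query lands closest to the food-related documents even though no keyword is shared, which is the behavior keyword search cannot provide.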
For developers
Bosca provides a unified service layer for interacting with LLMs from providers such as OpenAI and Gemini. Instead of hardcoding prompts in your code, you manage them centrally in Bosca.
Key features:
- Model Registry: Configure providers and parameters (temperature, tokens) in one place.
- Prompt Management: Store prompts as versioned templates with input variables.
- Unified API: Switch models or update prompts without changing your application code.
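The "switch models without changing application code" pattern amounts to a registry lookup by logical name: callers reference a name, and the registry entry holds the provider and parameters. A minimal sketch, with hypothetical names that are assumptions rather than Bosca's actual interface:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ModelConfig:
    """Illustrative registry entry (field names are hypothetical)."""
    provider: str
    model: str
    temperature: float
    max_tokens: int

# Central registry: swapping an entry here leaves all callers unchanged.
REGISTRY: dict[str, ModelConfig] = {
    "default-chat": ModelConfig("openai", "gpt-4o-mini", 0.7, 512),
}

def complete(logical_name: str, prompt: str,
             send: Callable[[ModelConfig, str], str]) -> str:
    """Resolve the configured model, then delegate to a provider-specific sender."""
    config = REGISTRY[logical_name]
    return send(config, prompt)

# A stub sender stands in for a real provider SDK call.
reply = complete("default-chat", "Hello",
                 send=lambda cfg, p: f"[{cfg.provider}/{cfg.model}] echo: {p}")
```

Because the application only knows the logical name "default-chat", repointing it at a different provider or adjusting temperature and token limits is a registry change, not a code change.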
Related:
- Architecture: AI/ML overview
- Search: Semantic search
- Source code: backend/framework/core-ai, backend/framework/ai