
LLM / AI Development

Professional LLM / AI development from experienced developers based in Graz, Austria.

Artificial intelligence and Large Language Models are no longer futuristic concepts - they are concrete tools we integrate into client projects today. At dectria, we combine solid software engineering with practical AI experience to embed LLMs meaningfully and securely into existing business processes.

With our product NetCero, we use AI productively: through the Vercel AI SDK, we run AI-powered text generation directly within the application - for text summarization, tone adjustment and multi-language content generation. The SDK's provider abstraction lets us switch seamlessly between LLM providers. Azure OpenAI analyzes ESG reports, extracts relevant data from complex documents and supports companies with EU CSRD compliance. This experience from operating our own product flows directly into our client projects.
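The provider abstraction mentioned above can be illustrated with a minimal, self-contained sketch. The `TextProvider` interface and the stub classes below are hypothetical stand-ins for real SDK adapters - the point is the design idea, not the actual Vercel AI SDK API surface:

```typescript
// Illustration of the provider-abstraction idea: application code depends
// on a narrow interface, so the underlying LLM provider can be swapped
// without touching call sites. Interface and stubs are hypothetical.
interface TextProvider {
  generateText(prompt: string): Promise<string>;
}

// Stub standing in for an Azure OpenAI adapter.
class AzureOpenAIStub implements TextProvider {
  async generateText(prompt: string): Promise<string> {
    return `[azure] response to: ${prompt}`;
  }
}

// Stub standing in for an Anthropic Claude adapter.
class ClaudeStub implements TextProvider {
  async generateText(prompt: string): Promise<string> {
    return `[claude] response to: ${prompt}`;
  }
}

// Application-level helper: knows nothing about the concrete provider.
async function summarize(provider: TextProvider, text: string): Promise<string> {
  return provider.generateText(`Summarize briefly: ${text}`);
}

async function main() {
  const report = "Quarterly ESG disclosure draft";
  // Switching providers is a one-line change at the call site.
  console.log(await summarize(new AzureOpenAIStub(), report));
  console.log(await summarize(new ClaudeStub(), report));
}

main();
```

Because summarization, tone adjustment and translation helpers all target the same interface, a provider change (for cost, latency or data-residency reasons) never ripples through feature code.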

We implement RAG systems (Retrieval-Augmented Generation) that connect LLMs with company-specific data, develop intelligent assistants for specialist departments and automate document-based workflows. We pay particular attention to data privacy, traceability and controlled integration into existing IT landscapes.

Capabilities

What We Build with LLM / AI

Azure OpenAI & Claude Integration
Ollama On-Premise Deployments
RAG Systems (Retrieval-Augmented Generation)
Document Analysis & Extraction
Vercel AI SDK & Streaming
Prompt Engineering & Evaluation
Embedding-Based Vector Search
LLM-Powered Workflow Automation
Model Selection & Benchmarking
Privacy-Compliant AI Architectures

Use Cases

Typical Use Cases

Document Analysis & Compliance

AI-powered extraction and analysis of business documents, contracts and reports - as in our ESG platform NetCero, which processes EU CSRD reports automatically.

Intelligent Assistants & Chatbots

Custom AI assistants trained on your company data that help employees with research, summarization and decision-making - with RAG technology for precise, source-based answers.

Process Automation with AI

Integration of LLMs into existing workflows for automating classification, summarization, translation and data extraction - from email triage to automated report generation.
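For automation like email triage, the critical engineering step is constraining and validating the model's output before it drives a workflow. A minimal sketch, with a hypothetical label set, of the prompt-and-parse pattern:

```typescript
// Sketch of the validation side of LLM-powered email triage: the model is
// prompted to answer with exactly one label, and its raw reply is checked
// against an allow-list so downstream automation only ever sees known
// categories. The label set here is a hypothetical example.
const LABELS = ["invoice", "support", "sales", "spam"] as const;
type Label = typeof LABELS[number];

// Build a classification prompt that asks for the label only.
function buildTriagePrompt(email: string): string {
  return `Classify this email as exactly one of: ${LABELS.join(", ")}.\n` +
         `Reply with the label only.\n\nEmail:\n${email}`;
}

// Normalize and validate the model's reply; route anything unexpected to
// manual review instead of trusting free-form output.
function parseLabel(reply: string): Label | "needs-review" {
  const cleaned = reply.trim().toLowerCase();
  return (LABELS as readonly string[]).includes(cleaned)
    ? (cleaned as Label)
    : "needs-review";
}

console.log(parseLabel("Support"));        // well-formed reply
console.log(parseLabel("Probably spam?")); // free-form reply falls back to review
```

The "needs-review" fallback keeps humans in the loop for ambiguous cases, which matters wherever a misclassification has business consequences.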

FAQ

LLM / AI FAQ

Which AI models does dectria use?
We work with Azure OpenAI (GPT), Anthropic Claude, and open-source models via Ollama for on-premise deployments. We select models per project, weighing requirements for accuracy, speed, cost and data privacy. For privacy-sensitive projects, we use Azure OpenAI with European data residency or self-hosted solutions with Ollama.
How does dectria ensure data privacy in AI projects?
Data privacy is non-negotiable for us. We use Azure OpenAI with EU data residency, never use customer data for model training and implement clear data processing architectures. All AI integrations are implemented in a GDPR-compliant manner, with full transparency about data flows.
What is a RAG system and when do you need one?
RAG (Retrieval-Augmented Generation) connects LLMs with your own data. The model first searches for relevant information in your knowledge base and then generates an answer based on those sources. This reduces hallucinations and enables precise, source-based answers - ideal for customer support, knowledge management and compliance.
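The retrieve-then-generate flow described above can be sketched in a few lines. In production the embeddings come from an embedding model and live in a vector store; here they are tiny hand-made vectors so the example stays self-contained, and the document contents are invented:

```typescript
// Sketch of the retrieval step in a RAG pipeline: rank documents by
// cosine similarity to the query embedding, then assemble a grounded
// prompt from the top results. Embeddings are toy values for illustration.
type Doc = { id: string; text: string; embedding: number[] };

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the k documents most similar to the query embedding.
function retrieve(queryEmbedding: number[], docs: Doc[], k: number): Doc[] {
  return [...docs]
    .sort((x, y) =>
      cosineSimilarity(queryEmbedding, y.embedding) -
      cosineSimilarity(queryEmbedding, x.embedding))
    .slice(0, k);
}

// Assemble the grounded prompt: retrieved sources first, question last,
// so the model answers from the provided context with citable source IDs.
function buildPrompt(question: string, sources: Doc[]): string {
  const context = sources.map(d => `[${d.id}] ${d.text}`).join("\n");
  return `Answer using only these sources:\n${context}\n\nQuestion: ${question}`;
}

const knowledgeBase: Doc[] = [
  { id: "csrd-1", text: "CSRD reporting applies from fiscal year 2024.", embedding: [0.9, 0.1, 0.0] },
  { id: "hr-7", text: "Vacation requests are filed via the HR portal.", embedding: [0.0, 0.2, 0.9] },
];

const queryEmbedding = [0.8, 0.2, 0.1]; // stand-in embedding of the user question
const top = retrieve(queryEmbedding, knowledgeBase, 1);
console.log(buildPrompt("When does CSRD reporting apply?", top));
```

Keeping source IDs in the prompt is what makes answers traceable: the model can cite `[csrd-1]`, and the application can link that ID back to the original document.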

Every project starts with a conversation.

Let us talk about your individual needs and goals.

Start a project