This case study is also available in German

TL;DR

The Challenge

Sprengnetter Books' traditional keyword-based search could find relevant documents, but it couldn’t generate direct answers to specific questions. Users had to manually sift through results, which was especially inefficient for specific technical questions or loosely phrased, exploratory queries. The goal was to develop a solution that could understand real estate valuation knowledge and provide users with precise, natural language answers.

Our Solution

INNOQ implemented a Retrieval-Augmented Generation (RAG) assistant that combines the generative power of the GPT-4o large language model with targeted information retrieval. The solution was developed by a small INNOQ team in just a few months.

The Result

An AI-powered chat system now provides subscribers with natural language access to Sprengnetter Books, going far beyond the capabilities of traditional search functions.

Challenge: Providing Answers, Not Just Search Results

Real estate valuation demands in-depth expertise across many areas—ranging from building regulations to valuation methods and market analyses. Sprengnetter makes this knowledge accessible through its digital database, “Sprengnetter Books.” While the existing full-text search could guide users to relevant documents, it lacked the ability to understand and independently formulate answers. Users had to go through the documents themselves—an often time-consuming task, especially when faced with specific technical questions.

Approach: Agile and Focused Implementation

Working closely with Sprengnetter, INNOQ first analyzed the specific requirements and developed a prototype to demonstrate the capabilities of modern generative AI. A small, dedicated development team then worked in short iterations to implement the solution. This agile approach also provided Sprengnetter with a straightforward and hands-on introduction to generative AI.

The collaboration with INNOQ was highly focused from the start. The team not only brought technical expertise but also developed a deep understanding of how our users work with technical information.

Tina Uhlig, Product Owner Sprengnetter Books

Technical Approach: State-of-the-Art AI Meets Professional Knowledge Base

The implemented system is built on the RAG architecture (Retrieval-Augmented Generation), which combines the power of large language models with precise information retrieval. Using OpenAI GPT-4o via API and MongoDB Atlas for vector search and PDF storage, the solution delivers real value to users: precise, source-backed answers to technical questions. As one of the leading “world models,” GPT-4o has the necessary generalization capabilities to address a wide range of questions in combination with the retrieved sources. Furthermore, hallucination rates are significantly lower compared to smaller models.
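
The retrieval side of such a setup can be sketched in a few lines. The example below is illustrative only: the embedding model, connection string, index name, and the collection and field names (books_vector_index, books, chunks, embedding, text, source) are assumptions, not the actual implementation.

```python
# Illustrative retrieval sketch: embed the user question with the OpenAI API
# and fetch the closest text chunks via a MongoDB Atlas $vectorSearch stage.
# All names below are hypothetical placeholders.
from openai import OpenAI
from pymongo import MongoClient

openai_client = OpenAI()                  # reads OPENAI_API_KEY from the environment
mongo = MongoClient("mongodb+srv://...")  # Atlas connection string (placeholder)

def retrieve_passages(question: str, k: int = 5) -> list[dict]:
    # 1. Turn the question into an embedding vector.
    embedding = openai_client.embeddings.create(
        model="text-embedding-3-small",   # assumed embedding model
        input=question,
    ).data[0].embedding

    # 2. Look up the k most similar chunks in Atlas Vector Search.
    pipeline = [
        {
            "$vectorSearch": {
                "index": "books_vector_index",
                "path": "embedding",
                "queryVector": embedding,
                "numCandidates": 100,
                "limit": k,
            }
        },
        {"$project": {"_id": 0, "text": 1, "source": 1}},
    ]
    return list(mongo["books"]["chunks"].aggregate(pipeline))
```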

What is RAG?

RAG architecture (Retrieval-Augmented Generation) addresses a core issue with using large language models: the tendency to “invent” facts when they encounter knowledge gaps. Instead of making up information, the system first retrieves relevant documents to extract matching information. Only these verified details serve as context for generating answers. The result is clear: the system can harness the language model’s expressive power while remaining grounded in verifiable knowledge.

This architecture has distinct advantages. Unlike a simple language model or traditional full-text search, users receive precise, source-based answers in natural language. RAG thus blends the precision of document-based search with the AI’s ability to understand relationships and convey them clearly.
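
To make the retrieve-then-generate flow concrete, here is a simplified sketch of the generation step. The prompt wording and function names are assumptions for illustration; the key idea is that only the retrieved excerpts are handed to GPT-4o as context.

```python
# Simplified generation sketch: the retrieved passages become the only
# context the model may use, and it is asked to cite them by number.
# Prompt wording and structure are illustrative, not the production code.
from openai import OpenAI

client = OpenAI()

def answer_with_context(question: str, passages: list[dict]) -> str:
    # Number the excerpts so the model can reference them in its answer.
    context = "\n\n".join(
        f"[{i + 1}] ({p['source']}) {p['text']}" for i, p in enumerate(passages)
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {
                "role": "system",
                "content": (
                    "Answer questions about real estate valuation using only "
                    "the numbered excerpts provided. Cite the excerpts you use "
                    "by their number, e.g. [1]. If the excerpts do not contain "
                    "the answer, say so instead of guessing."
                ),
            },
            {
                "role": "user",
                "content": f"Excerpts:\n{context}\n\nQuestion: {question}",
            },
        ],
    )
    return response.choices[0].message.content
```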

For a quick technical introduction to RAG architecture, check out our free primer.

One challenge with large language models is their tendency to “hallucinate”: when working around gaps in their knowledge, they produce factually incorrect but plausible-sounding statements. Our solution mitigates this issue by strictly requiring that every generated answer be supported by specific references from Sprengnetter Books. This ensures that the system delivers verifiable, technically sound answers.
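
One possible way to enforce such a requirement, sketched here purely as an illustration, is to check each generated answer for citations of the retrieved excerpts before it is shown to the user; the helper below is hypothetical, not the actual implementation.

```python
import re

def has_valid_citations(answer: str, num_passages: int) -> bool:
    # Accept an answer only if it cites at least one excerpt and every
    # cited index points to an excerpt that was actually retrieved.
    cited = {int(m) for m in re.findall(r"\[(\d+)\]", answer)}
    return bool(cited) and all(1 <= i <= num_passages for i in cited)
```

If such a check fails, the assistant can re-query the model or respond that no supported answer was found, rather than returning an unsupported statement.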

Special attention was also given to preparing technical information in a context-sensitive way: the system not only understands user questions but can specifically extract relevant details from the knowledge base and present them comprehensibly—even for complex mathematical formulas.

With the new AI-powered assistant, we’ve achieved a quantum leap in the usability of our digital knowledge database. Our customers can now engage in genuine dialogue with our expertise.

Andreas Kadler, CEO Sprengnetter Real Estate

Technologies Used

OpenAI GPT-4o (via API)
MongoDB Atlas (vector search and PDF storage)
Retrieval-Augmented Generation (RAG)

Conclusion

The Result: More Than Just an Intelligent Search

The new assistant transforms how users interact with Sprengnetter Books. Instead of clicking through search results, they can now ask questions in natural language and receive precise, contextual answers. The system understands complex relationships and can even ask follow-up questions to clarify users' intentions. For practical work, every statement is supported by specific source citations. Users can click through to the relevant section in the original text—whether to verify the answer or explore further.

The project’s quick and successful implementation demonstrates how generative AI can be leveraged to create real value for users. For Sprengnetter, this represents not only a technical upgrade to their product but also a significant step in the digital transformation of knowledge sharing within the real estate industry.

Christopher Stolle
Principal Consultant

We’d love to assist you in your digitalization efforts from start to finish. Please do not hesitate to contact us.

Get in touch!