Christian Weyer

Semantic AI: Why Embeddings Might Matter More Than LLMs

Are we too focused on LLMs? This talk argues that embeddings are the true foundation of modern AI, enabling powerful, deterministic systems for retrieval and routing.

#1 · about 1 minute

Moving beyond hype with real-world generative AI

An internal company tool serves as a practical case study for applying language and embedding models to solve real business problems.

#2 · about 3 minutes

Integrating AI with existing enterprise data sources

The system combines API-based data from a third-party planning tool with document-based data from a Git-based knowledge base.

#3 · about 4 minutes

Building language-enabled universal interfaces for software

Instead of extending traditional GUIs, a universal interface lets users interact with systems in natural language through channels such as Slack or voice.

#4 · about 3 minutes

Demonstrating a multi-system AI chat interface

A live demo shows how a single chat interface can query both a knowledge base and an employee availability system, providing source links to verify information.

#5 · about 3 minutes

Contrasting language models and embedding models

Language models are non-deterministic and generative, while embedding models are deterministic and create vector representations for comparison and retrieval.
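
A minimal sketch of that contrast on the embedding side: the same input always maps to the same vector, and two vectors can be compared by cosine similarity. The library and model name below are illustrative assumptions, not ones named in the talk.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # any embedding model works here

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Deterministic: encoding the same text twice yields the same vector.
query = model.encode("Who is available for the next sprint?")
related = model.encode("Employee availability and resource planning")
unrelated = model.encode("How to configure the build pipeline")

print(cosine_similarity(query, related))    # high score: semantically close
print(cosine_similarity(query, unrelated))  # lower score: different topic
```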

#6 · about 4 minutes

Implementing retrieval-augmented generation for documents

The retrieval-augmented generation (RAG) pattern uses embeddings and a vector database to find the document chunks most relevant to a question, which are then passed to the LLM as context for its answer.
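
A hedged sketch of that flow: chunks are embedded once at indexing time, the question is embedded at query time, and the top-scoring chunks are placed into the prompt. The chunk texts, model name, and in-memory "vector store" below are placeholders for whatever the real system uses.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

# 1. Indexing time: chunk the documents and embed each chunk.
chunks = [
    "Vacation requests must be filed two weeks in advance.",
    "The knowledge base is versioned in a Git repository.",
    "Project staffing is managed in the planning tool.",
]
chunk_vectors = model.encode(chunks, normalize_embeddings=True)

def retrieve(question: str, k: int = 2) -> list[str]:
    # 2. Query time: embed the question and rank chunks by similarity.
    q = model.encode(question, normalize_embeddings=True)
    scores = chunk_vectors @ q                # cosine similarity on normalized vectors
    top = np.argsort(scores)[::-1][:k]
    return [chunks[i] for i in top]

# 3. Put the retrieved chunks into the LLM prompt as grounding context.
question = "How early do I need to request vacation?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # this prompt would then be sent to the language model
```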

#7 · about 4 minutes

Using LLMs for structured data and API calls

Including a technical schema in the prompt constrains the language model to generate structured, machine-readable output, enabling reliable API integration.
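
A sketch of the schema-in-prompt idea under stated assumptions: the JSON schema, field names, and the `call_llm` function are hypothetical placeholders for the planning tool's API and whichever chat-completion client is actually used.

```python
import json

# Hypothetical schema for a planning-tool availability query.
SCHEMA = """{
  "type": "object",
  "properties": {
    "employee": {"type": "string"},
    "from": {"type": "string", "format": "date"},
    "to": {"type": "string", "format": "date"}
  },
  "required": ["employee", "from", "to"]
}"""

def extract_availability_query(user_text: str, call_llm) -> dict:
    prompt = (
        "Extract the parameters for the availability API from the user request.\n"
        f"Respond only with JSON matching exactly this schema:\n{SCHEMA}\n\n"
        f"User request: {user_text}"
    )
    raw = call_llm(prompt)   # call_llm is a placeholder for the actual LLM client
    data = json.loads(raw)   # fails loudly if the model ignored the schema
    return data              # machine-readable input for the planning-tool API
```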

#8 · about 4 minutes

How semantic routing directs user queries

Semantic routing uses embeddings to classify a user's intent by finding the closest cluster of example questions, directing the request to the correct backend system.
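
A minimal sketch of semantic routing under these assumptions: the route names and example questions are invented for illustration, and each route is represented by the centroid of its example embeddings.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

# Illustrative routes: a document knowledge base and an availability API.
routes = {
    "knowledge_base": [
        "What is our vacation policy?",
        "Where is the onboarding documentation?",
    ],
    "availability_api": [
        "Who is free next week?",
        "Is Anna available for the new project?",
    ],
}

def centroid(examples: list[str]) -> np.ndarray:
    # One normalized centroid vector per cluster of example questions.
    vecs = model.encode(examples, normalize_embeddings=True)
    c = vecs.mean(axis=0)
    return c / np.linalg.norm(c)

centroids = {name: centroid(examples) for name, examples in routes.items()}

def route(query: str) -> str:
    q = model.encode(query, normalize_embeddings=True)
    # Send the request to the backend whose example cluster is closest.
    return max(centroids, key=lambda name: float(np.dot(q, centroids[name])))

print(route("Does Bob have capacity in March?"))  # -> "availability_api"
```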

#9 · about 1 minute

Why embeddings are the foundation of AI systems

Embeddings are crucial not just within LLMs but also for encoding meaning and enabling core architectural patterns like semantic routing and guarding.
