Podcast
2026-04-23

Utbenat: Context

Sebastian Mildgrim

About the episode

At your company, everyone is speaking "Danish" to each other without knowing it. The words you throw around every day depend entirely on who's listening in order to land correctly. This problem has always existed, but now that you're trying to get an AI to understand and act on your business data, the consequences show up in a completely new way.

In this episode of Syntra Utbenat, Sebastian unpacks what context actually means and why AI, no matter how smart it is, has never set foot in your office. It knows nothing about your culture, your internal concepts or your unspoken rules. It lacks your Danish. And that can be the difference between an AI project that delivers sharp insights and one that fires off with the wrong question from the very first second.

In this episode, you’ll learn:

Why your internal terminology is your "Danish" and what that means when you start using AI on business data
Why AI can give technically correct answers that are completely useless for you and your department
How the same word can mean entirely different things to different parts of the organization and why that matters
A concrete exercise you can do as early as Monday to start giving your AI the right context

Voices in this episode

Sebastian Mildgrim
Technical Advisor

Listen to the episode

Transcript of the episode

Introduction

Mitra:
Welcome to Syntra Utbenat. This is the podcast where, in ten minutes, we take the words and concepts being thrown around in the world of data and AI and break down what they actually mean, together with Sebastian Mildgrim.

What Is Context, Really?

Sebastian:
It's often said that Danish is one of the hardest languages in the world to learn, even for Danish children. Aside from being quite funny, that claim raises the question: why? Danish is distinguished by something called vowel reduction: endings and consonants get swallowed entirely, and words blend together until they sound almost identical. So what does that actually do?

Let's use an example. Imagine you're at a dinner party. You're sitting next to a friendly Danish person and you're talking about work. She says she's a "Lä-e." You lean in, impressed, and ask: "Isn't it incredibly heavy to carry responsibility for other people's lives?"

Your dinner companion looks at you strangely, takes a sip of water and replies: "Well, it's mostly the sixth-graders who act up at recess."

You thought she said Læge, doctor. But she said Lærer, teacher. Phonetically, they were almost identical. What you were missing wasn't vocabulary, it was context.

Today we're going to talk about what happened at that dinner table. We're going to talk about context. Because the truth is that at your company, right now, everyone is speaking "Danish" to each other without knowing it. You throw around words you think are clear but that depend entirely on who's listening. The kind of thing where the right report, in the wrong hands, can create real confusion. And now you're also facing the challenge of getting an AI to understand that language. That's where it gets interesting, and that's where there is money to be made and saved.

The Problem: AI Has No Backpack

Sebastian:
Let's move from the dinner table to your business. You've been handed a directive: "Become data-driven" or maybe "Use AI to drive efficiency." It sounds great, and you've probably heard no shortage of exciting pitches about what AI can do for you. But this is exactly where context matters so much. AI doesn't work like a human. It has no idea how your business operates. It doesn't know that "Lars in the warehouse" is the one who actually decides the order of outbound shipments, regardless of what the system says.

We humans carry a backpack full of tacit knowledge, full of context. We know what industry we're in, we know what a POC means when Johan in IT says it and we know what a POC means when Josefin in Logistics says it, even though they're talking about two completely different things. For Johan, it's a Proof of Concept, a smaller project to validate whether an idea holds up. For Josefin, it's a Point of Contact. Meaning: who do I call when the truck is in the ditch? That's a person, not a test.

An AI doesn't carry the same backpack. Instead, it's packed with all of Wikipedia, a large chunk of every website and billions of forum threads from the internet. It can be incredibly smart. But it has never set foot in your office. It knows nothing about your culture, your unspoken rules or your Danish. It lacks context.

Same Word, Completely Different Reality

Sebastian:
Let's walk through an example using a word that comes up every single day: "customer."

You work in logistics. There have been problems with deliveries from the central warehouse out to your stores. The store managers in Örebro and Gävle are furious because they now have to tell their customers that their orders won't arrive on the promised date. You turn to your new AI assistant for a quick analysis and type: "We have a lot of unhappy customers right now due to late deliveries. Pull together the data we have that could explain why."

Here's where the collision happens, invisibly. For you, "customer" means the internal recipients, the stores. It's specific and bounded. But in your data, "customer" shows up in a dozen different places: end consumers, visitors, contracts and sometimes you're the customer of a third party.

The AI sees no problem and gets to work. It pulls data based on the assumption that you want information about deliveries to the end consumer, because that's the most likely interpretation in its context.

For someone on the e-commerce team, that might have been gold. For you, in logistics, it's completely worthless. You don't need data showing that pickup times from Bring don't line up well with when orders are placed. You need data that can help you reroute trucks or look at how pick lists are being prioritized.

The AI didn't do anything wrong, technically speaking. It understood the words. But it missed your reality. It thought you were talking about "Göran waiting for a package" when you were actually talking about "Store manager Karin waiting for a pallet."

And this isn't just about the word "customer." It's a stretch to think that all fields related to customers sit in tables called "customer." It gets even messier with concepts like "product" or "agreement." The path forward is the same: we need to make the right context available so that AI can understand what we give it and return the results we actually need.

The Solution: Give Context Where It Matters

Sebastian:
So how do we solve this? Many people jump to: "We need to structure all our data" or "We need to build an ontology for the entire company."

Take a breath. You don't need to do that. In fact, it's often the wrong approach: too big, too expensive and it takes too long. More importantly, context is contextual. The context you need in a given situation depends on the situation. Preparing is great, but only if you know what you're preparing for. And I don't think anyone can know that given how fast things are moving right now. The solution is much simpler and more concrete. It's about giving context where it matters.

Don't just take my word for it. Test it yourself: open a chat with your favorite AI, whether that's Claude, Gemini, Copilot or Le Chat, and ask without any context what POC means. Then try adding context, something like: "You are a logistics director at a larger group company responsible for the central warehouse serving all branches. What does the abbreviation POC mean to you?" Then compare the difference in responses.

How to Do It in Practice (Starting Monday)

Sebastian:
Getting better answers from AI is hardly the most exciting thing we can achieve with context. What people are talking about now is the concept of "chat with your data," the ability to explore your data in a conversation with an AI to find new insights and quickly validate ideas for new reports. And that, more than anything, requires context. So how do you get there?

Start by choosing one small, well-defined area. A specific report works well. Maybe the weekly deviation report you always end up discussing at Monday's meeting.

Then look at it with "stranger's eyes." Imagine you have a new colleague who just graduated and has never been to the company before. If she reads the report, what's unclear? Does it say "customer"? Clarify that it means "receiving store." Does it say "priority"? Clarify what criteria determine what gets prioritized. Write it all down. A separate document works perfectly, though there are of course dedicated tools for this.

That is invaluable context for an AI to use when interpreting, understanding and working with data correctly.
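The "stranger's eyes" exercise can be sketched in a few lines: a small glossary for one report, rendered as a context block you can paste ahead of any question to an AI. The terms and definitions below are illustrative examples in the spirit of the episode, not taken from a real report.

```python
# A minimal sketch of the glossary exercise: clarify what each ambiguous
# term in one report actually means, then render it as a context preamble.
# The terms and definitions are hypothetical examples.

GLOSSARY = {
    "customer": "the receiving store (not the end consumer)",
    "priority": "pick-list order, set by promised delivery date, then order size",
    "deviation": "any delivery that misses the promised date by one day or more",
}

def glossary_context(glossary: dict[str, str], report: str) -> str:
    """Turn a term glossary into a context preamble for questions about a report."""
    lines = [f"In the report '{report}', these terms have specific meanings:"]
    lines += [f"- {term}: {definition}" for term, definition in glossary.items()]
    return "\n".join(lines)

preamble = glossary_context(GLOSSARY, "Weekly deviation report")
question = "Why did deviations increase last week?"
prompt = preamble + "\n\n" + question
print(prompt)
```

The point is not the code but the artifact: a written-down glossary that travels with the report, whether you keep it in a plain document or a dedicated tool.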

Closing Thoughts

Sebastian:
Getting value out of AI isn't about coding. It's about explaining your own reality, your Danish. You need to teach the AI to tell the difference between Læge and Lærer. That's how you go from simply "having a lot of data" to actually getting help finding insights that move the needle on the bottom line.

Outro

Sebastian:
Thanks for listening to today's episode of Syntra Utbenat. Want to talk more about today's topic, or just talk data? Reach out to us. We're on LinkedIn or at fiwe.se. Until next time, take care out there.

Listen to more episodes

Deepen your knowledge. Explore more episodes of Syntra.


Ready to take the next step together with your data?

We help you transform data into information and communication that truly makes a difference – for your workflows, decision-making and product offering.