Strong take, Oliver. "Not a lack of AI capabilities — a lack of governed access" — that really nails it.
What I find interesting about Muster: you're building the infrastructure layer that sits between enterprise systems and AI agents. Most organizations are completely missing this, and they don't even realize it yet.
I'd take the thought one step further though: before you can govern access, you need to understand what you're actually protecting. And it's not just data — it's the value creation embedded in your processes, domain knowledge, and competitive intelligence. Most AI vendor agreements are essentially IP transfer agreements wrapped in Terms of Service.
We're working on something that picks up exactly here — not "who sees my data?" but "who profits from my intelligence?" More on that soon.
Governed access and sovereignty assessment are two sides of the same coin. Exciting to see so much movement in this space.
Hi Oliver, will you open-source Muster?
there you go: https://github.com/giantswarm/muster