MC · Algorithmic Sovereignty · 2026
Module MC — Digital Sovereignty
Who truly controls the models?
Explainability · Auditability · Ownership · Bias · EU AI Act
Fundamental Rights #2 & #4 — ORBii Framework

"Understand how your systems work and operate them freely: an opaque, non-auditable AI model owned by a third party simultaneously violates both of these rights."

What this module covers

Decision explainability, model auditability, fine-tuned model ownership, bias detection, EU AI Act Art. 13-17 requirements.

Target audience

CDO, AI Officer, risk managers, DPO, MLOps teams, compliance officers, internal auditors.

Core regulation

EU AI Act Art. 9-17 (high-risk systems), Art. 50 (transparency), GDPR Art. 22 (automated decisions), DORA Art. 28.

Recommended duration

Half day (3.5 hours) — 2 sessions + 1 model audit workshop.

ORBii.Academy — Digital Sovereignty & AIMC · P.01
The 4 dimensions of algorithmic sovereignty

What mastering a model truly means.

Deploying an AI model without controlling all four dimensions simultaneously exposes the organization to regulatory, operational, and strategic risks.

D1 — Explainability

Understanding why a decision was made

The ability to explain in business terms why a model produced a given result. Not just technically (SHAP, LIME), but operationally: "This loan was denied because..."

GDPR Art. 22 · EU AI Act Art. 13
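A minimal sketch of the idea behind D1: translating SHAP-style signed feature contributions into a business-language reason. The feature names, weights, and reason phrasings below are hypothetical, for illustration only.

```python
def explain_decision(contributions, top_n=2):
    """Return the top_n features that pushed the decision score down,
    phrased as business reasons. `contributions` maps feature name to
    its signed impact on the score (as a SHAP-style attribution would)."""
    # Hypothetical mapping from feature names to business-language reasons
    reasons = {
        "debt_to_income": "debt-to-income ratio is too high",
        "credit_history_months": "credit history is too short",
        "late_payments": "recent late payments on record",
    }
    # Keep only features with a negative impact, strongest first
    negative = sorted(
        (f for f in contributions if contributions[f] < 0),
        key=lambda f: contributions[f],
    )
    return [reasons.get(f, f) for f in negative[:top_n]]

# Illustrative attributions for a denied loan application
contribs = {"debt_to_income": -0.42, "credit_history_months": -0.10,
            "late_payments": 0.05}
print(explain_decision(contribs))
# ['debt-to-income ratio is too high', 'credit history is too short']
```

This is the "operationally, not just technically" step: the attribution layer (SHAP, LIME) produces the numbers, and a curated mapping turns them into the "This loan was denied because..." sentence.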
D2 — Auditability

Being able to inspect the model at any time

Access to model weights, training data, performance metrics by subgroup, and inference logs. Auditability is an enforceable right under the EU AI Act for high-risk systems.

EU AI Act Art. 17 · DORA Art. 30
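The inference logs mentioned above can be sketched as structured audit records. This is an illustrative example using only the Python standard library; the field names are assumptions, not a schema mandated by the EU AI Act or DORA.

```python
import datetime
import hashlib
import json

def log_inference(model_version, features, prediction):
    """Build one audit-trail record for a single inference.
    The input is hashed (not stored raw) so the record can prove
    *which* input produced the output without retaining personal data."""
    return {
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "model_version": model_version,
        # Canonical JSON (sorted keys) makes the hash reproducible
        "input_hash": hashlib.sha256(
            json.dumps(features, sort_keys=True).encode("utf-8")
        ).hexdigest(),
        "prediction": prediction,
    }

record = log_inference("credit-v1.2", {"income": 50000, "age": 41}, "approved")
print(record["model_version"], record["input_hash"][:12])
```

Recording the model version alongside a reproducible input hash is what lets an auditor later tie a contested decision to the exact model weights and inputs in force at the time.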
D3 — Ownership

Owning fine-tuned models and artifacts

Does a model fine-tuned on internal data belong to the organization or to the training platform vendor? Intellectual property over model weights is both a contractual and strategic issue.

Right 3 — Reversibility · Data Act
D4 — Bias detection

Identifying and correcting algorithmic discrimination

A model that discriminates without being detected exposes the organization to regulatory sanctions. The EU AI Act mandates bias assessment before deployment and continuous monitoring in production.

EU AI Act Art. 9.7 · GDPR Art. 22.3
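One common bias-detection check is comparing selection rates between subgroups. A minimal sketch, with hypothetical data; the 0.8 threshold reflects the "four-fifths rule" commonly used as a disparate-impact red flag, not a threshold set by the EU AI Act.

```python
def selection_rate(outcomes):
    """Share of positive outcomes (1 = selected/approved) in a subgroup."""
    return sum(outcomes) / len(outcomes)

def disparate_impact(group_a, group_b):
    """Ratio of the two subgroups' selection rates.
    Values well below ~0.8 are a conventional signal to investigate."""
    return selection_rate(group_a) / selection_rate(group_b)

# Hypothetical loan approvals (1 = approved) for two subgroups
approved_a = [1, 0, 1, 0, 0]   # selection rate 0.4
approved_b = [1, 1, 1, 0, 1]   # selection rate 0.8
print(disparate_impact(approved_a, approved_b))  # 0.5 -> flag for review
```

A ratio this far below 0.8 does not prove discrimination by itself, but it is exactly the kind of subgroup metric the pre-deployment assessment and continuous monitoring are meant to surface.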