Aligning the Regulatory Framework with Technological Realities and Architectural Design
NEWS
3/9/2026 · 2 min read
Effective regulation must accurately reflect the underlying technological architecture it seeks to govern; laws that rely on outdated technical proxies or fail to recognize the realities of software engineering inadvertently stifle beneficial innovation.
The Digital Fitness Check offers an opportunity to ensure the digital rulebook remains technologically sound and future-proof.
A vital first step is shifting AI regulation from compute-based proxies to capabilities-based triggers.
The AI Act presumptively assigns systemic risk to General-Purpose AI models based on a training compute threshold of 10^25 FLOPs. Technologically, this is an obsolete proxy; modern techniques such as quantization, distillation, and post-training compute scaling allow smaller models to exhibit frontier capabilities at far lower training cost. This dynamic creates regulatory false positives for standard large models and safety gaps for highly capable small models.
The Commission should raise this computational threshold to 10^26 FLOPs to reflect current hardware realities, while developing an agile, capabilities-based evaluation matrix to accurately assess systemic risks.
Furthermore, the intersection of the AI Act and the GDPR creates severe architectural conflicts regarding data minimization. While the GDPR mandates processing as little personal data as possible, the AI Act requires massive, diverse datasets to ensure model accuracy and mitigate biases.
Regulators must provide joint guidance that allows for the legal processing of data at the scale necessary for training modern AI, treating AI development as a specified, legitimate purpose under the GDPR.
The regulatory framework should also structurally incentivize the adoption of Privacy-Enhancing Technologies (PETs).
Currently, data protection laws apply the same regulatory scrutiny to companies deploying highly advanced privacy technologies as to those using basic encryption.
Because there is no formal compliance dividend for adopting these technologies, businesses are caught in a low-uptake trap: the significant costs of PETs deter their use without any offsetting regulatory benefit.
Data protection laws must be amended to explicitly give significant legal weight to PETs, creating a safe harbor effect that drives the tech industry toward privacy-by-default architectures.
Technological realities must also dictate the modernization of telecommunications rules.
The ePrivacy Directive was drafted for a world of simple voice calls and text messages, and its outdated rules currently chill innovation.
Today, the automated processing of communications is technically essential for detecting spam and malware, as well as enabling accessibility features like text-to-speech.
The technological understanding of confidentiality must be modernized to explicitly recognize that automated detection tools designed to protect user security are not invasions of privacy, but necessary technical safeguards.
In parallel, the EU should repeal the outdated traffic and location data provisions of the ePrivacy Directive and transition their governance entirely to the horizontal, technologically neutral framework of the GDPR.
Finally, to prevent non-security regulations, such as data-sharing mandates, from inadvertently introducing cyber vulnerabilities, the EU should empower ENISA to conduct Security Impact Assessments during the legislative design of all digital laws.
Contribution by Techno Polis to the consultation “Digital Fitness Check: Testing the cumulative impact of the EU’s digital rules”, available at: https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/15554-Digital-fitness-check-testing-the-cumulative-impact-of-the-EUs-digital-rules/feedback_en?p_id=21274