Regulating Intelligence: How Legal Borders Shape Apple’s Smart Tech

By Rabani Malhotra



You may have heard about Apple’s latest technological advancement. Live Translation for AirPods does exactly what it appears to do, namely, translates speech in real time as you listen. It’s a sleek, futuristic feature that reflects Apple’s growing investment in artificial intelligence. If you happen to live in the EU, however, you won’t be getting the feature - at least, not yet.


Apple has blocked the feature for EU users, citing regulatory complications under the Digital Markets Act (DMA). The DMA, introduced to curb the power of “gatekeeper” tech firms, requires companies like Apple to ensure interoperability with rival devices and also mandates fair access for third parties to Apple’s platforms. In theory, this promotes competition. In practice, it complicates Apple’s ability to roll out AI features seamlessly across markets, especially when those features rely on tight hardware-software integration and heavy data protection safeguards. 


The company argues that these requirements create security and privacy risks. The crux of its position is that opening up its ecosystem undermines the very trust that Apple’s brand depends on. But beyond the PR narrative, the delay reveals something deeper: the growing friction between innovation and regulation, and the way legal borders are quietly reshaping the commercial contracts that power global tech.


The EU’s approach to AI


The EU’s regulatory stance is characterised by its risk-based model. The Artificial Intelligence Act, which began phasing in this year, classifies AI systems according to the level of risk they pose. High-risk systems, such as those affecting employment, credit, or law enforcement, face strict transparency and oversight requirements.


AI is also increasingly embedded in the EU’s broader governance frameworks. AI has already been utilised for rule-making and legal interpretation, speeding up analysis and improving access to information. But questions surrounding accountability, particularly in situations where algorithms influence consequential decisions, remain unanswered.


For companies like Apple, these layered obligations mean every AI-driven feature must be evaluated not only for functionality but also for regulatory classification. The DMA and AI Act together create overlapping compliance burdens affecting contracts with developers, suppliers, and service partners. Each agreement must now specify how data is processed, who bears liability for AI malfunctions, and how interoperability will be managed.


The UK’s stance


The UK has taken a more flexible, “pro-innovation” approach. Rather than adopting a single AI Act, it relies on sector-specific regulators and a set of guiding principles including transparency, fairness, accountability, and contestability.


Yet, within this flexible system, regulatory gaps are being tested. The Text and Data Mining exemption under UK copyright law, for instance, remains under judicial review. This exemption determines whether AI systems can legally mine copyrighted material for training, raising the same questions Apple now faces about how its translation models process and store user data. If your AirPods are listening and learning from your speech, what happens to that data, and which jurisdiction controls its future?


Commercial contracts in a divided regulatory landscape


The regulatory divergence between the EU and UK forces companies like Apple to restructure their commercial contracts for AI features region by region. Terms that are compliant in London may not be compliant in Brussels.


This legal fragmentation doesn’t just affect Apple: it’s shaping how law firms and startups respond to emerging technology more broadly. Programs such as Allen & Overy’s FUSE and legal AI tools like Harvey illustrate how firms are experimenting with technology within regulatory “sandboxes” - controlled environments that balance innovation with compliance. Keeping pace with these shifts is no longer optional; it’s central to how modern commercial law operates.


Apple’s delayed rollout of Live Translation might seem a small inconvenience for EU consumers, but it symbolises a larger transformation in global tech governance. As AI becomes integral to everyday devices, the rules that govern it are no longer abstract; rather, they directly shape the products we use and the contracts that make them possible.


In the race to regulate intelligence, the challenge for both lawmakers and companies is to strike a balance: to foster innovation without fragmenting it across borders.


Edited by Artyom Timofeev

© 2025 by UCL LAW FOR ALL SOCIETY 