How New Regulations are Shaping Europe’s Digital Landscape

New regulatory frameworks to address new competition challenges

Regulators in Europe have taken significant steps to address the inadequacy of existing legislation in the face of the rapid pace of development in the digital and technology sectors, which, to date, has left the European Commission (EC) and national governments with limited enforcement capabilities. With the enactment of both the Digital Markets Act (DMA) and the Digital Services Act (DSA), Europe may be entering a new era of enforcement. Before analysing the potential of these new pieces of regulation and their main targets, there are a number of key themes to highlight relating to regulation and enforcement in the technology sector.

Increasing responsibility put on digital service providers

The DMA and the DSA put significant onus on digital and technology players through ex ante obligations to take greater responsibility for transparency, interoperability, interconnection and governance. These efforts particularly target gaps left by the E-Commerce Directive from 2000,[2] under which platforms were disincentivised from taking proactive measures to combat illegal content for fear of losing their safe harbour protection. The success and widespread use of platforms built on this foundation led to significant user safety risks, rightsholder remuneration demands and monopolisation of certain services. Through new regulations, governments and competition authorities are working to balance the responsibility placed on platforms against the need to safeguard fair and vibrant competition, which is a driving factor for innovation and growth.

Broader scope of regulation considering future challenges

Digital innovations have a clear impact on interaction and identity, and on distribution, production and consumption systems. Yet they cannot be anticipated with any certainty, and their impact is often wide-reaching and equally unpredictable. Governments and politicians face a number of challenges in their attempts to regulate transformative technologies, in particular because these technologies rely on unpredictable business models and raise new privacy, security, ownership and control issues. New regulation, by placing obligations on digital and technology players, aims to pre-empt issues, particularly those relating to:

  • increasing reliance by digital actors on algorithms, artificial intelligence (AI) and robotics;
  • the significance of the cloud for public and private sectors and related questions around the security of networks; and
  • the roll-out of impactful technologies, such as facial recognition and predictive, data-driven law enforcement and justice initiatives, and the related privacy issues.

Diverging legislative approaches

European national and transnational legislators are at various stages of addressing these issues. The EC in particular finds itself facing competing pressures from national regulators, interest groups and powerful multinationals as it attempts to draft and implement legislation in the digital and technology sectors. These challenges can largely be grouped into three themes: the impact on democracy and human rights; national security concerns; and preserving corporate value while encouraging the growth of European businesses.

Navigating these challenges and the specificities of the various Member States has led some, notably Germany, to take matters into their own hands. In Germany, Section 19a of the Act against Restraints of Competition, enacted in 2021, is often referred to as the DMA’s blueprint. However, the trend of Member States enacting individual national legislation has the potential to disrupt the harmonisation efforts of the EC.

Nevertheless, a growing consensus on a common blueprint has begun to emerge across the European Union's new digital and tech-focused regulation. It relies on three key pillars:

  • accountability: making digital services providers responsible for harm and third-party infringement;
  • transparency: increased reporting requirements placed on digital services providers; and
  • proactive obligations: filtering, blocking and gating requirements placed on digital services providers.

A more notable distinction in approaches can be seen between Europe and the United Kingdom. While the overall goals of the EU and UK regimes are similar, there are some important differences in the approach taken by the respective regulators.

The European Union has embarked on a regulation-driven approach that designates certain online platforms as ‘gatekeepers’ based on their size. The United Kingdom, by contrast, will give its regulator discretionary powers to impose obligations on digital platforms designated with strategic market status (SMS).

In the United Kingdom, where the Competition and Markets Authority (CMA) is in the process of setting up its Digital Markets Unit (DMU), a bespoke code of conduct will be drafted for each individual company with an SMS designation that addresses the particular harms associated with that company’s activities. Under the DMA, a uniform list of obligations and prohibited behaviours will apply universally to all gatekeepers, though the EC retains the right to further specify measures to be undertaken by a specific gatekeeper to ensure effective compliance.

Such distinctions risk creating a complex web of parallel and overlapping obligations and may lead to conflicting outcomes. In turn, this could result in a highly disruptive environment for both industry players and consumers.

Another example of diverging regulatory approaches can be found in EU and UK efforts to regulate AI and address AI-related risks. In the European Union, the proposed draft AI Act sets horizontal standards for developing, commercialising and using AI-powered products, services and systems within the European Union. It is intended to act in conjunction with other current and planned data-related policies, including the General Data Protection Regulation (GDPR), the DSA, the DMA, the proposed Data Act[3] and the proposed Cyber Resilience Act.

The draft AI Act sets out a risk-based strategy with multiple enforcement mechanisms. Limited-risk AI systems would be required to meet minimal transparency requirements to allow users to make informed decisions about interacting with AI, whereas AI systems with unacceptable risks that are considered a threat to people would be banned.

In the United Kingdom, the CMA has recently proposed principles to guide the AI market while protecting consumers.[4] They stem from a report based on a review of foundation models, which are AI systems with broad capabilities that can be adapted to a range of different, more specific purposes. The CMA identified potential barriers to entry in the development of AI that may affect competition, including:

  • data requirements, in particular increasing reliance on proprietary data, data advantages for certain firms and difficulties in obtaining alignment data;
  • computational resources, particularly access to computational infrastructure via the cloud;
  • technical expertise, including knowledge of machine learning, data engineering and high-performance computing;
  • access to funding;
  • open-source models, given that most high-performing pre-trained foundation models are being kept closed source and are trained on larger data sets with more powerful hardware; and
  • uncertainties surrounding the future development of foundation models.

The CMA also examined concerns about the impact of AI as an important input in a wide range of markets, how that could create new or entrench existing positions of market power for firms that develop that product or service, and the potential risks from vertical integration.

On the basis of these identified concerns, the CMA articulated proposed principles that place accountability on AI developers and deployers for outputs provided to consumers, and that recommend access, diversity, choice and flexibility to switch between or use multiple AI systems according to need. The principles also promote fair dealing, opposing anticompetitive conduct such as self-preferencing, tying or bundling, as well as transparency, so that consumers and businesses are given information about the risks and limitations of AI-generated content and can make informed choices.

The CMA plans to engage with stakeholders to develop these principles further in the coming months and will publish an update on its views on AI in early 2024, including on how the principles have been received and adopted.

The UK government also published a White Paper, ‘A pro-innovation approach to AI regulation’, for consultation in July 2023,[5] which comes on the heels of the reintroduction of the Data Protection and Digital Information Bill to Parliament. The Bill proposes measures to use AI responsibly while reducing compliance burdens on businesses to boost the economy.[6]

The DMA and the DSA

The recently enacted DMA and DSA have the potential to transform the European Union’s digital regulatory environment with the aim of creating a ‘safer digital space in which the fundamental rights of all users of digital services are protected’[7] and ‘a level playing field to foster innovation, growth and competitiveness, both in the European Single Market and globally’.

The DSA and DMA represent the most significant pieces of European technology regulation to date. Both aim to reduce the burden currently placed on digital antitrust enforcers by requiring compliance from digital players, including extensive technology builds and process updates to address obligations set out by the EC.

DMA

The DMA sets out specific obligations that apply to very large digital companies identified, through thresholds and other metrics, as gatekeepers providing certain categories of core platform services (CPSs), which include search engines, social media networks, web browsers, operating systems, online intermediation services, video-sharing services and online advertising services. The DMA aims to prevent gatekeepers from imposing unfair conditions on businesses and end users and to ensure the openness of important digital services.

Having entered into force on 1 November 2022,[8] the DMA applied as of 2 May 2023. Following extensive discussions with potential gatekeepers regarding thresholds and designation, on 6 September 2023, the EC designated six companies as gatekeepers: Alphabet, Amazon, Apple, ByteDance, Meta and Microsoft.

In total, 22 CPSs have been listed in the designation decision, on the basis of meeting the quantitative thresholds for those particular services:

  • social network: TikTok, Facebook, Instagram and LinkedIn;
  • intermediation services: Google Maps, Google Play, Google Shopping, Amazon Marketplace, App Store and Meta Marketplace;
  • ads: Google, Amazon and Meta;
  • browser: Chrome and Safari;
  • operating system: Google Android, iOS and Windows PC OS;
  • video sharing: YouTube;
  • search: Google Search; and
  • messaging services: WhatsApp and Messenger.

In parallel, the EC opened four market investigations to further assess Microsoft’s and Apple’s submissions arguing that, despite meeting the thresholds, some of their CPSs do not qualify as gateways:

  • Microsoft: Bing, Edge and Microsoft Advertising; and
  • Apple: iMessage.

The EC expects these investigations to be complete within a maximum of five months, by February 2024.

As the DMA regulates large online platforms that provide an important gateway between business users and consumers, whose position can grant them the power to create a bottleneck in the digital economy, it also allows the EC to designate gatekeepers that have these characteristics but may not meet the quantitative thresholds. The EC has therefore also opened a market investigation to further assess whether Apple’s iPadOS should be designated as a gatekeeper, despite not meeting the thresholds. This investigation should be completed within a maximum of 12 months.

Following their designation, gatekeepers need to comply with the behavioural rules for their CPSs within six months, by March 2024. Two obligations are effective from the day of designation, namely gatekeepers must:

  • put in place a DMA compliance function; and
  • report on any intended concentrations in the digital sector within the meaning of the EU Merger Regulation (EUMR) prior to their implementation.

The obligations contained in the DMA relate to a mix of practices that are traditionally subject to competition rules or have recently been on the competition authorities’ radar (e.g., exclusivity, tying and self-preferencing) and others that clearly fall outside of that prerogative (e.g., data collection, usage and portability, transparency and alternative dispute resolution). These obligations will likely require gatekeepers to undertake significant changes to the design and operation of some of their key services to ensure compliance. Gatekeepers have six months to submit a detailed compliance report in which they outline how they comply with each of the relevant obligations of the DMA.

The DMA provides for an extensive toolkit for the EC to monitor and enforce compliance with the new obligations. Gatekeepers are obliged to report annually on the measures they have taken to comply with the obligations. The EC will also have the power to launch market investigations into systemic non-compliance by particular gatekeepers or to assess whether further digital services should be added to the list of CPSs, as it has already done in the case of Microsoft and Apple.

The EC can impose fines of up to 10 per cent of worldwide consolidated turnover, rising to 20 per cent in cases of repeat infringement of DMA obligations, as well as behavioural and structural remedies if it finds that a gatekeeper’s compliance efforts fall short of these obligations.
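To illustrate how these caps scale with turnover, the arithmetic can be sketched as follows (a minimal illustration only: the function name and turnover figure are hypothetical, and within these caps the actual fine is set by the EC case by case):

```python
def dma_fine_cap(worldwide_turnover_eur: float, repeat_infringement: bool = False) -> float:
    """Upper bound on a DMA fine: 10 per cent of worldwide consolidated
    turnover, rising to 20 per cent for repeat infringements."""
    rate = 0.20 if repeat_infringement else 0.10
    return worldwide_turnover_eur * rate

# Hypothetical gatekeeper with EUR 200 billion worldwide turnover:
print(dma_fine_cap(200e9))                            # cap: EUR 20 billion
print(dma_fine_cap(200e9, repeat_infringement=True))  # cap: EUR 40 billion
```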

DSA

The DSA largely maintains the safe harbours set out in the E-Commerce Directive but imposes significant new duties on each of the newly defined online service categories.

New online service categories

The DSA sets out six categories of online services: ‘intermediary’, ‘hosting service’, ‘online platform’, ‘online search engine’, ‘very large online platforms’ (VLOPs) and ‘very large online search engines’ (VLOSEs). Each category is subject to expanded obligations, with the most extensive obligations and the highest potential fines falling on VLOPs and VLOSEs.

The DSA came into effect for VLOPs and VLOSEs on 25 August 2023 and will apply to all other intermediaries from 17 February 2024.

Safe harbours

The E-Commerce Directive safe harbours are largely replicated in the DSA, with the addition of a ‘Good Samaritan’ provision for intermediaries that carry out investigations to detect illegal content or take measures to comply with the DSA. This is a welcome change that the technology industry has long advocated. However, the hosting safe harbour defence has been narrowed to exclude consumer law violations where it is reasonable for consumers to believe that the intermediary itself is providing the information, goods or services they receive. In other words, clarity as to with whom a consumer is engaging is paramount, and going forward this will have an impact on product and customer contracting strategies and related business structures.

Notice and takedown

The DSA aims to harmonise notice and takedown mechanisms for the first time in the European Union. However, the mechanism is fairly general and, in practice, unlikely to result in significant changes for the majority of platforms and marketplaces, most of which already have sophisticated notice and takedown processes in place. The most significant change is the requirement to provide a statement of reasons explaining why a host has removed, disabled or restricted content, and to provide an internal complaint handling mechanism through which recipients can appeal these decisions. These statements then have to be made publicly available on a database, mirroring a parallel obligation in the Platform to Business Regulation.[10]

Another change is the recognition of ‘trusted flaggers’, who will be appointed by the newly established digital service coordinators in Member States, on the basis of their expertise in flagging illegal content. However, given some of the current political tensions within the European Union relating to divergent views on the rule of law, there will likely be material variance between Member State approaches to trusted flagging, with no EC-level harmonisation mechanism.

Know-your-trader requirements

In an effort to clamp down on illegal and harmful goods and services available online, the EC has included ‘know-your-trader’ requirements, requiring online platforms to obtain proof of trader identity and to actively verify its accuracy. While some of this information is already collected by platforms in the ordinary course of business, a legal duty to verify it has not previously existed outside contexts where anti-money laundering requirements apply. These requirements echo proposals in other jurisdictions, including the United States, and are a bid by the EC to force marketplaces to take greater responsibility for their platforms without – automatically – bearing liability for the actual listings.

VLOPs and systemic risks

The DSA requires VLOPs and VLOSEs to carry out an annual review to identify the ‘systemic risks’ stemming from the use and provision of their services, to take measures to address those risks and to have their risk report audited by an independent auditor. This approach invokes the spirit of self-regulation but with sharper legal teeth.

Transparency reporting

One of the strongest themes emanating from the DSA is the push for greater transparency. While many intermediaries already provide some of the information the DSA is asking for, the DSA requires more. All intermediaries must publish transparency reports at least once a year that include the number of Member State orders to remove content, the number of notice and takedown requests, the time taken to action them, and the content moderation measures that have been implemented.

VLOPs are also required to specify the human resources dedicated to content moderation, their qualifications and linguistic expertise, and indicators of accuracy broken down by the official languages of the Member States. This extensive transparency reporting will be required every six months. For context, this is a significantly expanded requirement compared with what is expected of a data protection officer under the GDPR.

Online Safety Bill[11]

In light of the United Kingdom’s exit from the European Union, the DSA will not apply in the United Kingdom. The UK government has therefore drafted its own legislation with the aim of making ‘the UK the safest place in the world to be online while defending free expression’.[12] The Online Safety Bill (OSB) has been passed by Parliament with the expectation that it will receive royal assent in autumn 2023.[13]

The OSB will bring in new statutory duties of care for online user-to-user and search services. Services in scope of the OSB will therefore include social media networks, search engines and video sharing platforms. Companies will be under a duty to put systems and processes in place that protect users by limiting or removing any harmful or illegal content. The greatest obligations will fall on Category 1, Category 2A and Category 2B companies, with less burdensome obligations on the remaining regulated services, which will generally be smaller companies whose user numbers do not meet the thresholds necessary to be designated as a higher-risk service.

The OSB aims to prevent the spread of illegal content by requiring platforms to remove this content as soon as they see it. This is described under the Bill as ‘priority illegal content’ and includes content relating to terrorism, child sexual exploitation and 134 other offences listed in a Schedule to the Bill.

Another key aim of the government is to protect children by ensuring that they are not exposed to harmful or inappropriate online content. To this end, the Bill requires stricter age verification processes and requires that services prevent children from being exposed to certain harmful content, either through their functionalities or by blocking child users. Unlike the DSA, the OSB draws a distinction between ‘harmful’ and ‘illegal’ content, and the largest user-to-user services will be under an obligation to allow adult users to choose whether they are exposed to certain categories of legal but harmful content.

The Bill also imposes numerous transparency duties on services, including the obligation to conduct regular risk assessments in relation to both illegal content and children’s safety duties, and to provide annual transparency reports to Ofcom.[14] The Bill further includes obligations to protect users from fraudulent advertising and requires that services consider the impact of measures adopted to comply with the Bill on users’ rights to freedom of speech and privacy.

Ofcom will enforce the Bill and will be able to impose fines of up to £18 million or 10 per cent of a company’s annual global revenue, whichever is greater. The Bill also imposes criminal sanctions on senior managers and directors for serious information offences, such as the failure to comply with an information notice.

AI Act[15]

AI technologies present a multitude of benefits to society but also raise certain risks, such as the possibility of categorising individuals based on appearance or behaviour. According to the EC, the AI Act will be the first comprehensive legal framework to address and regulate AI. The EC proposes to follow the logic of the New Legislative Framework[16] (i.e., the EU approach of ensuring that a range of products comply with the applicable legislation when they are placed on the EU market, through conformity assessments and the use of CE marking). The extent of regulatory intervention under the Act is based on the level of threat that the respective use of an AI system poses to the fundamental rights and security of citizens.

Unacceptable risk

The unacceptable risk category includes systems that pose a threat to individuals’ rights and safety and are therefore banned (e.g., cognitive behavioural manipulation of people or specific vulnerable groups, such as voice-activated toys that encourage dangerous behaviour in children).

High risk

AI systems that negatively affect safety or fundamental rights will be considered high risk and will be divided into two categories: (1) AI systems that are used in products falling under the European Union’s product safety legislation[17] (e.g., toys and aviation systems); and (2) AI systems falling into eight specific areas that will have to be registered on an EU database (including biometric identification and categorisation of natural persons and educational and vocational training).

Generative AI

Generative AI such as ChatGPT would have to comply with transparency requirements: (1) disclosing that the content was generated by AI; (2) designing the model to prevent it from generating illegal content; and (3) publishing summaries of copyrighted data used for training.

Limited risk

Within the limited risk category, AI systems are required to be transparent, notifying users that they are interacting with a machine (e.g., chatbots). Finally, where there is minimal or no risk, the Act permits free use.

The AI Act sets out a four-tier structure for fines against infringements. The largest fines are imposed for violating the prohibition of specific AI systems and can be up to €40 million or up to 7 per cent of annual worldwide turnover. The lowest penalties are for providing incorrect, incomplete or misleading information and can be up to €500,000 or up to 1 per cent of annual worldwide turnover.
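Under the draft’s GDPR-style formulation, the cap for each tier is generally the higher of the fixed amount and the turnover-based amount. The top and bottom tiers can be sketched as follows (a hypothetical illustration only: the function name and turnover figures are invented, and the draft’s two middle tiers are omitted for brevity):

```python
def ai_act_fine_cap(annual_turnover_eur: float, tier: str) -> float:
    """Cap for the highest and lowest of the draft AI Act's four fine
    tiers: the greater of the fixed amount and the turnover percentage."""
    tiers = {
        "prohibited_ai": (40_000_000, 0.07),  # banned AI practices
        "incorrect_info": (500_000, 0.01),    # incorrect or misleading information
    }
    fixed_amount, turnover_rate = tiers[tier]
    return max(fixed_amount, annual_turnover_eur * turnover_rate)

# For a provider with EUR 10 billion annual worldwide turnover, the
# turnover-based amount dominates; for a small provider, the fixed floor applies:
print(ai_act_fine_cap(10e9, "prohibited_ai"))   # ~EUR 700 million (7 per cent)
print(ai_act_fine_cap(100e6, "prohibited_ai"))  # EUR 40 million (fixed amount)
```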

On 14 June 2023, members of the European Parliament adopted Parliament’s negotiating position on the AI Act.[18] Negotiations with the Member States in the Council on the final form of the law will now begin, with the aim of reaching an agreement by the end of 2023.

Data Act

In February 2022, the EC published a proposal for the Data Act,[19] with the stated aim of maximising the value of data (both personal and non-personal) in the digital economy. An updated provisional agreement on the Data Act was reached on 27 June 2023.[20] The Data Act will impose new obligations on digital players, including the following:

  • Providers of connected devices and any related services (including virtual assistants) must make data generated by their use available to users (both businesses and consumers). Users can also ask these providers to make data available to third parties (so that they can, for example, access a wider range of after-sales services such as repair and maintenance).
  • Where data holders are required to make data available to third parties acting in a professional capacity (under either the Data Act or other EU legislation), they must do so on fair, reasonable and non-discriminatory terms. Any terms concerning data that are unilaterally imposed on micro, small and medium-sized enterprises will be subject to a fairness test, similar in some respects to the fairness test for terms in consumer contracts under the Unfair Contract Terms Directive.[21]
  • Data holders must make data available to public sector and EU bodies where those bodies can demonstrate an exceptional need to use the data requested (e.g., in the case of public health emergencies or major cybersecurity incidents); such bodies may further share data for certain research or statistical purposes.
  • Cloud services providers must take measures to ensure that customers can switch to another data processing service of the same type offered by a third-party provider, ensuring continuity of service during transition. They must also take reasonable steps to prevent unlawful access to non-personal data by third-country governments (similar in some respects to the international transfer provisions of the GDPR).[22]

The Data Act is currently expected to be adopted nearer the end of 2023 and to become enforceable midway through 2025.

Regulatory framework revisions

In response to the dynamic nature and growing importance of digital services and products, the EC has initiated an evaluation and review of its existing regulatory frameworks. Since 2021, the Directorate-General for Competition has been revising the Market Definition Notice of 1997. In November 2022, the EC published a draft for public consultation, in which Section 4 deals exclusively with the digital sector and pays particular attention to factors such as research and development and innovation, globalisation, digital ecosystems, the treatment of data, free services and multi-sided markets. The revised notice is expected to be adopted by the end of 2023.

Key areas of focus in technology cases

Merger control

The European Union’s referral policy has led to the review of more digital transactions, including, notably, the following:

  • In February 2023, the EC launched an investigation into Adobe’s proposed acquisition of Figma on the basis of an Article 22(1) referral. The transaction was initially notified for clearance in Austria and Germany, where it met the national notification thresholds. Austria then requested a referral to the EC, which was later joined by 15 other Member States. The EC preliminarily raised concerns about the consolidation of the markets for the provision of interactive product design tools and digital asset creation tools, as well as the potential foreclosure of competing providers of interactive product design tools via the bundling of Figma with Adobe’s Creative Cloud.
  • In March 2023, the EC unconditionally cleared the acquisition by Google of Photomath, a provider of an online homework and study help application. The case originated from an Article 4(5) referral. The EC found that (1) the parties’ combined market shares were limited, (2) the potential integration of Photomath in Google Search would not strengthen Google’s position, (3) access to Google Search was not of significant importance for maths tools users, and (4) maths tools do not depend on access to Google’s in-app store search.
  • In August 2023, the EC started investigating Qualcomm’s proposed acquisition of Autotalks, an Israeli semiconductor manufacturer. Unlike Adobe/Figma, the transaction did not meet any national notification thresholds, but referral requests were made by six Member States, including France and the Netherlands. Later, eight other Member States joined the referral request. The EC’s preliminary concerns were that the transaction would combine two of the main suppliers of vehicle-to-everything semiconductors in the European Economic Area (EEA).
  • In August 2023, the EC accepted requests submitted by three EU Member States and one European Free Trade Association Member State to assess the proposed acquisition of Nasdaq Power by European Energy Exchange AG (EEX), a German subsidiary of Deutsche Börse AG. The transaction did not reach the notification thresholds set out in the EUMR, and it was not notifiable in any Member State. Denmark and Finland submitted initial referral requests, which were joined by Sweden and Norway. The EC expressed concerns that the transaction appeared to combine the only two providers of services facilitating the on-exchange trading and subsequent clearing of Nordic power contracts.

The use of the Article 22 EUMR referral mechanism for digital transactions is set to increase under the new DMA. In September 2020, EU Competition Commissioner Margrethe Vestager announced that the EC would accept referral requests even from Member States that have no jurisdiction over a particular transaction under their national rules, provided that the transaction (1) affects trade between Member States and (2) threatens to significantly affect competition within the territory of the referring Member State. This has led to increased use of the Article 22 referral mechanism, with the EC considering each such referral individually.

In two recent decisions, EEX/Nasdaq Power and Qualcomm/Autotalks (detailed above), the EC accepted that the acquisitions concerned, which did not meet EU merger control thresholds and were not notified in any Member States, met the criteria for referral under Article 22 EUMR. Following the EC’s landmark decision in Illumina/GRAIL, these decisions represent only the second and third cases in which the EC has accepted jurisdiction under Article 22.

The designated gatekeepers’ obligation under Article 14 of the DMA to notify the EC of any digital or data-related transactions will lead to a greater visibility of transactions that do not meet EU or national merger thresholds, which can be used to assess whether referral would be appropriate.

Another important trend is that competition authorities continue to experiment and develop their understanding of theories of harm in the digital sector. This is particularly the case with regard to those that consider not only the individual products and services but also the surrounding digital ecosystem.

For instance, in referring Microsoft’s proposed acquisition of Activision Blizzard to a Phase II investigation on 1 September 2022, the CMA preliminarily considered that cloud gaming service providers could be foreclosed through leveraging of Microsoft’s ecosystem. Although the CMA ultimately abandoned this theory of harm in its final report of 26 April 2023, the ecosystem component was still used to analyse incentives and effects in a more traditional input foreclosure theory, which was ultimately relied on to prohibit the transaction as initially notified. On a similar basis, the EC looked into Microsoft’s ecosystem to assess the potential foreclosure of PC OS providers.

Other investigations that considered an ecosystem theory of harm are as follows:

  • In July 2023, the EC launched a Phase II investigation into Amazon’s planned acquisition of iRobot. The EC preliminarily found that Amazon would obtain access to iRobot users’ data, providing Amazon with an important advantage not only in the market for online marketplace services but also in other data-related markets. The CMA, by contrast, ultimately cleared the transaction on 16 June 2023, although it was initially concerned that iRobot vacuum cleaners could be an important input for smart home platforms and that Amazon could disadvantage its smart home rivals post-transaction.
  • In September 2023, the EC prohibited Booking’s proposed acquisition of eTraveli, a global online travel agency (OTA). The EC found that the transaction would enable Booking to expand its ecosystem of travel services by adding eTraveli’s flight OTA services. The EC found that this would strengthen Booking’s dominant position in the hotel OTA market. It would increase traffic to and sales by Booking’s platforms, hence raising barriers to entry and expansion.

Companies considering a digital transaction also need to be aware that competition authorities remain protective of ‘nascent’ or ‘innovative’ markets.

  • In 2023, both the EC and the CMA remained highly concerned about Microsoft’s proposed acquisition of Activision Blizzard due to the potential foreclosure of competitors in the cloud gaming market, which market participants and internal documents described as nascent and growing. While the EC approved the transaction on 15 May 2023, subject to behavioural licensing remedies and a monitoring trustee framework, the CMA prohibited it on 26 April 2023. The prohibition was not as final as it appeared, however: on 22 August 2023, while an appeal against the prohibition was still pending before the Competition Appeal Tribunal (CAT), Microsoft submitted a restructured deal to the CMA. To resolve the concerns about cloud gaming, Microsoft proposed to divest the global (excluding the EEA) cloud streaming rights for Activision Blizzard’s current and future games to Ubisoft, a French video game developer and publisher.
  • On 25 May 2023, the EC unconditionally cleared the proposed acquisition by Viasat of Inmarsat. Both companies own and operate geostationary earth orbit satellites, which are used to provide, inter alia, broadband in-flight connectivity (IFC) services to commercial airlines. The EC considered the supply of IFC services a nascent market and found that the transaction would not restrict competition given the parties’ moderate market position and the significant number of remaining competitors.

Divergent approaches between the competition authorities are a noteworthy trend of the post-Brexit years. In a growing number of cases, the CMA and the EC diverge on (1) whether to open an in-depth investigation, (2) the relevant markets and theories of harm, (3) the design of remedies and (4) whether to approve the deal.

Difference in initiating in-depth investigation

On 16 June 2023, the CMA cleared Amazon’s proposed acquisition of iRobot following its Phase I investigation due to a lack of incentive for Amazon to foreclose competitors. This differs from the EC, which opened a Phase II investigation on 6 July 2023 after its preliminary investigation found that Amazon may have not only the ability but also the incentive to foreclose. Moreover, while the EC prohibited the Booking/eTraveli transaction, the CMA did not refer it to a Phase II investigation and cleared it unconditionally on 29 September 2022.

Difference in substantive assessment

During its investigation into Amazon’s proposed acquisition of iRobot, the CMA initially identified three areas of potential concern: (1) whether Amazon could use its online store to disadvantage iRobot’s rivals post-transaction; (2) whether Amazon could have entered the market as an alternative supplier of robot vacuum cleaners absent the merger, and whether this loss of potential competition would be substantial; and (3) whether iRobot vacuum cleaners could be an important input for smart home platforms, and whether Amazon could disadvantage its smart home rivals post-transaction. As noted above, the CMA ultimately decided that none of these concerns would result in a substantial lessening of competition and approved the deal on 16 June 2023. In contrast, while the EC is also investigating whether Amazon could abuse its online marketplace to foreclose iRobot’s competitors post-transaction, its other theories of harm appear to differ from those of the CMA. In particular, the EC is examining whether, through access to iRobot users’ data, Amazon would be able to strengthen its position in the market for online marketplace services, in other data-related markets, or both, thereby raising barriers to entry and expansion.

Microsoft’s proposed acquisition of Activision Blizzard is an even more prominent illustration of the enforcement tensions between the UK and EU competition authorities. While the CMA’s focus was on the risks of foreclosure of Microsoft’s rivals only in the console gaming and cloud gaming markets, the EC was additionally concerned about a potential restriction of competition in the PC OS market.

Difference in remedies

While the EC approved the Microsoft/Activision Blizzard merger subject to behavioural commitments, the CMA did not consider the same remedy package adequate to resolve its concerns and prohibited the deal. Unlike the EC, the UK regulator found that the proposed remedy duration (10 years) and monitoring instruments were insufficient given the technological unpredictability and immaturity of the cloud gaming market. Similarly, in Broadcom’s proposed acquisition of VMware, the EC allowed the parties to proceed on 12 July 2023 only subject to access and interoperability remedies, whereas the CMA unconditionally cleared the transaction on 21 August 2023.

Abuse of dominance

The interplay between data and competition law has been at the forefront of many new national abuse of dominance investigations. In May 2023, the Italian competition authority (AGCM) opened an investigation into Apple’s alleged abuse of a dominant position in the market for app distribution platforms for iOS users. The AGCM found that Apple had adopted a privacy policy for third-party app developers that was more restrictive than the policy it applied to itself. Apple was therefore accused of reducing the advertising revenues of third parties to the benefit of its own digital ecosystem. In France, Apple is similarly accused of abuse of dominance in relation to certain data practices. In July 2023, the French competition authority (FCA) issued a statement of objections regarding unfair conditions relating to the use of user data for advertising purposes. The investigation originated from a referral by online advertising associations, whose request for interim measures was initially rejected but served as a basis for opening the investigation.

Close cooperation with the investigating authority and a transparent discussion on commitments are often the best way to address concerns.

  • In December 2022, the EC accepted commitments by Amazon, among others, barring it from using non-public marketplace seller data for its retail business.
  • In July 2023, the AGCM closed its investigation into an alleged abuse of a dominant position by Google consisting of obstacles to interoperability in data sharing with other platforms. To resolve the concerns, Google offered three commitments, including allowing third parties to test, prior to its official release, a new solution for direct data portability between services.
  • In January 2023, the Bundeskartellamt issued a statement of objections accusing Google of combining user data from across its services without giving users sufficient choice as to whether, and to what extent, they agree to this cross-service processing of their data.

An important issue that is currently being considered is whether competition authorities can investigate and sanction a GDPR infringement as a violation of competition law. In July 2023, the Court of Justice of the European Union (CJEU) confirmed that a competition authority may take data protection rules into consideration when weighing interests in decisions under competition law. In particular, personal data was considered of importance for competition. However, the CJEU also clarified that competition authorities do not replace data protection authorities. In particular, they should neither monitor nor enforce the application of the GDPR. National competition authorities are further obliged to consult with data protection authorities when assessing and applying the GDPR.

Tied to data is competition authorities’ increased focus on anticompetitive behaviour in the digital advertising space. The most prominent case is the EC’s investigation into Google’s practices in the ad tech sector. In June 2023, the EC issued a statement of objections preliminarily finding that Google had abused its dominant position by favouring its own online advertising exchange, AdX. Notably, the only remedy the EC currently considers capable of resolving its competition concerns is a mandatory divestiture of part of Google’s services.

In December 2022, as part of its statement of objections on abuses in relation to Facebook Marketplace, the EC preliminarily found that Meta had unilaterally imposed unfair trading conditions on competing online classifieds services that advertise on Facebook and Instagram. On a similar basis, in May 2023, the FCA imposed interim measures on Meta because of unfair and discriminatory conditions on ad verification service providers for accessing its ecosystem.

As with mergers, a focus on the broader ecosystem remains a key trend. In February 2023, the EC revised its preliminary views on the anticompetitiveness of Apple’s App Store rules for music streaming providers. It sent a new statement of objections confirming that it would no longer investigate the legality of the in-app purchase (IAP) obligation and would focus only on the anti-steering obligations.

Several digital players are also now being investigated for anticompetitive tying and bundling.

  • In December 2022, the EC sent a statement of objections to Meta on abusive practices favouring Facebook Marketplace. Among other things, the EC provisionally found that Meta abused its dominant position in the personal social networking market by tying Facebook Marketplace to its Facebook social network.
  • In June 2023, the Bundeskartellamt issued a statement of objections against Google in respect of (1) the bundling of apps into the ‘Google Automotive Services’ suite, (2) agreements on advertising revenue sharing and obligations to give preference to Google Automotive Services, and (3) the obstruction of interoperability with third-party services.
  • In July 2023, the EC opened a formal investigation into Microsoft for tying or bundling its communication and collaboration product Teams to Office 365 and Microsoft 365.

Apart from tying, online marketplaces have also raised other competition concerns. In July 2023, the Spanish competition authority (CNMC) fined Apple and Amazon a total of €194 million for restricting inter- and intra-brand competition in the sale of Apple products on Amazon Marketplace in Spain. The relevant clauses were found to be anticompetitive by object because they (1) unjustifiably restricted the number of resellers of Apple products on Amazon Marketplace in Spain, (2) limited the space in which products competing with Apple’s could be advertised on Amazon Marketplace in Spain, and (3) restricted Amazon’s ability to target marketing campaigns at customers of Apple products on its website.

Finally, the video games sector is not only a focus of merger investigations but has also been subject to antitrust scrutiny. On 27 September 2023, the General Court dismissed Valve’s appeal seeking annulment of the EC’s 2021 geo-blocking decision, in which the EC had fined Valve and five games publishers a total of €7.8 million for restricting cross-border sales of PC games through territorial control features in the Baltic States and countries in Central and Eastern Europe. The Court confirmed that the geo-blocking on Valve’s Steam platform infringed EU competition law and agreed with the EC that the geo-blocking did not have the objective of protecting intellectual property rights but was instead used to prevent parallel imports.

Market investigations

Competition authorities have been conducting market investigations to better understand the cloud sector and potential anticompetitive behaviour. In April 2023, in its interim report on UK cloud services, Ofcom proposed to refer the market to the CMA for an in-depth investigation. In particular, Ofcom raised concerns about (1) charges for transferring data out of the cloud (egress fees), (2) technical restrictions on interoperability and (3) committed spend discounts. A final report is due to be published in October 2023.

The CMA is already examining a specific segment of the cloud. In November 2022, it launched an in-depth market investigation into, among other things, the distribution of cloud gaming services through app stores on mobile devices. However, on 18 January 2023, the investigation was challenged by Apple before the CAT, which suspended the investigation and ordered the CMA to reconsider its decision. Apple’s successful argument was that the CMA had failed to comply with the statutory time frame set out in the Enterprise Act 2002. On 13 April 2023, the CMA decided to appeal the CAT’s ruling; a hearing will be held in October 2023.

In addition to the UK authorities, the FCA published its market study on the cloud sector in June 2023. The FCA identified certain risks, such as cloud credits and egress fees, that could be addressed under competition law, as well as market failures likely to be addressed by new regulation such as the EU Data Act. Pointing to future enforcement, the FCA mentioned several developments, including large language models, edge computing, cloud gaming and cybersecurity, as potentially having an impact on competition in the sector.

The DMA is expected to lead to more market investigations into digital companies and their products. On 6 September 2023, the EC opened four market investigations to assess whether certain products of Microsoft (Bing, Edge and Microsoft Advertising) and Apple (iMessage) qualify as core platform services. In addition, the EC opened a market investigation to assess whether Apple’s iPadOS should be designated under the gatekeeper regime, even though it does not meet the quantitative thresholds.

Foreign direct investment review

In addition to traditional merger control review, digital-related deals are also scrutinised under foreign direct investment (FDI) screening rules. In particular, many European FDI regimes treat transactions involving, for example, 5G, cloud technology or semiconductors as likely to pose security and defence risks. In this context, the following cases from the past year are of particular interest:

  • On 16 November 2022, the UK government exercised its powers under the national security and investment regime to unwind Chinese-owned Nexperia’s acquisition of Newport Wafer Fab (NWF), a semiconductor wafer factory located in Newport, Wales. The UK government’s concerns related to the risks of technology transfers outside the United Kingdom and access to sensitive technological expertise and know-how. In December 2022, Nexperia and NWF filed a claim for judicial review with the High Court to challenge the decision; the claim is still pending at the time of writing.
  • On 5 December 2022, the UK government cleared the acquisition of UK eSIM company Truphone by German billionaire Hakan Koç, Swiss-based Pyrros Koussios, TP Global and HMFK Capital, subject to information security remedies. The UK government’s national security concerns related to the risks of the disruption of UK mobile services and potential exploitation of the target’s UK user data.
  • On 19 December 2022, the UK government ordered LetterOne, a Russian-founded investment firm, to unwind its £1 billion acquisition of UK broadband provider Upp due to national security and defence concerns. In addition, on 19 December 2022, the UK government adopted another blocking decision that prohibited Chinese semiconductor company SiLight from acquiring UK rival HiLight Research.
  • On 16 March 2023, the Italian government blocked the acquisition by Nebius BV, a Dutch cloud services provider, of its Italian tech rival Tecnologia Intelligente Srl due to the ties of the buyer with Russia.
  • On 8 May 2023, the Japan-based optical sensors and devices manufacturer Hamamatsu Photonics confirmed that its acquisition of Danish NKT Photonics, a supplier of fibre lasers and photonic-crystal fibre, had been prohibited by the Danish government. The block was likely caused by the target’s highly sensitive activities and links to the defence industry.
  • On 15 June 2023, the Italian government exercised its national security powers to limit the corporate powers of the Chinese state-owned conglomerate Sinochem as a shareholder of Pirelli, an Italian tyre manufacturer. The government’s main concerns related to the ability of sensors embedded in tyres to collect vehicle data regarding, among other things, road layouts, geolocation and the state of infrastructure, which might be used in a way that would harm national security.
  • On 13 September 2023, the German government prohibited the complete takeover of KLEO Connect, a German satellite start-up, by Shanghai Spacecom Satellite Technology (SSST), a Chinese firm. While SSST is already the majority shareholder of KLEO Connect, the German government was concerned that increasing its stake from 53 per cent to 98 per cent might pose risks to Germany’s national security.


1 Paul Johnson and Ben Allgrove are partners and Rebecca Bland and Ola McLees are associates at Baker McKenzie. The authors would also like to thank Beau Maes, Pavlo Prokhorov and Lilli Meldrum for their contributions to this chapter.

2 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) [2000] OJ L178/1.

3 The Data Act was formally proposed by the EC on 23 February 2022. A political agreement between the European Parliament and the Council of the European Union was reached on 28 June 2023 and is now subject to formal approval by the two co-legislators. Once adopted, the Data Act will enter into force on the 20th day following its publication in the Official Journal and will become applicable 20 months after its entry into force. Press release, ‘Data Act: Commission welcomes political agreement on rules for a fair and innovative data economy’, 28 June 2023:

5 UK Department for Science, Innovation and Technology, ‘Establishing a pro-innovative approach to AI regulation’, 18 July 2023:

6 The UK government introduced the Data Protection and Digital Information (No.2) Bill on 8 March 2023, withdrawing the Data Protection and Digital Information Bill that was introduced in June 2022. Government Bill, Data Protection and Digital Information Bill, 8 March 2023:

8 In April 2022, the EU institutions reached a compromise on the final text of the DMA, which was then approved by the European Parliament and Council of the European Union in July 2022.

10 This database went live on 24 September 2023 and can be viewed at

11 Online Safety HC Bill (2022-23) [121].

14 Sections 78 and 79, Online Safety Bill (HL Bill 164 (as amended on Report)).

15 Texts adopted – Artificial Intelligence Act – Wednesday, 14 June 2023 (

19 Proposal for a Regulation of European Parliament and of the Council on harmonised rules on fair access to and use of data (Data Act) COM (2022) 68 final:

20 Proposal for a Regulation of the European Parliament and of the Council on harmonised rules on fair [access to] and use of data (Data Act), 2022/0047(COD):

21 Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts:

22 Chapter V, Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation):
