The Digital Markets, Competition and Consumers Act 2024 (“DMCC Act”) is widely regarded as the most significant overhaul of UK consumer protection and competition law since the Consumer Rights Act 2015. The DMCC Act introduces wide-ranging reforms to digital markets, competition enforcement and consumer protection. Although the Competition and Markets Authority (“CMA”) has suggested it will offer businesses some initial breathing room to adjust to the changes, organisations need to understand how the rules are changing and where they may need to adapt proactively.
While the DMCC Act rests on several pillars of reform, the one likely to matter most to businesses is the overhaul of consumer protection. The DMCC Act gives the CMA stronger enforcement powers, including the ability to impose penalties directly, and introduces new rules on unfair commercial practices, hidden fees (drip pricing), fake reviews and subscription traps. These changes are consumer-facing and are where businesses of all sizes are most likely to feel the immediate impact.
For the first time, the CMA can investigate and penalise consumer law breaches through administrative proceedings, bringing consumer protection enforcement into closer alignment with its existing competition law powers. It can now impose substantial financial penalties without going through the courts, including up to 10% of global annual turnover (or £300,000, whichever is greater) for consumer protection breaches. These penalties are split into categories reflecting different types of infringement, ranging from procedural or investigatory failures to non-compliance with CMA directions. These powers apply to a broad range of existing consumer protection legislation, much of which has now been consolidated in the DMCC Act.
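The “greater of” penalty ceiling described above can be illustrated with a short calculation. This is a sketch only: the CMA sets actual penalties case by case within this ceiling, and the figures used below are hypothetical.

```python
def max_penalty_gbp(global_annual_turnover_gbp: float) -> float:
    """Illustrative statutory ceiling for a consumer protection breach
    under the DMCC Act: the greater of 10% of global annual turnover
    or a fixed GBP 300,000 floor. Sketch only, not legal advice."""
    return max(0.10 * global_annual_turnover_gbp, 300_000)

# A business with GBP 50m global turnover faces a ceiling of GBP 5m,
# while a business with GBP 1m turnover still faces the GBP 300,000 floor.
print(max_penalty_gbp(50_000_000))  # 5000000.0
print(max_penalty_gbp(1_000_000))   # 300000
```

Note that the floor means small businesses cannot assume their exposure is capped at a modest fraction of turnover.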
Although there is no formal statutory grace period, the CMA has indicated it will focus initially on more serious breaches. Businesses must therefore treat compliance as a priority, rather than a simple operational tick-box exercise.
The new consumer protections that businesses must address include:
All pricing information must now display the total upfront cost in adverts and product listings, including any booking fees, taxes, delivery charges or other payments that the consumer will incur. If there are charges that cannot reasonably be calculated in advance, these can be excluded from the headline price but must nonetheless be clearly disclosed. Hidden mandatory fees are a key risk area that the CMA is actively policing.
The CMA’s guidance gives the example that if a gym membership is subject to a minimum contract term of six months, then the advertised price must set out the total six-month cost, not just the monthly fee.
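The arithmetic behind the gym example can be sketched as follows. The figures (a monthly fee and a joining fee) are hypothetical; the point is simply that the advertised headline price must reflect the full minimum-term cost plus all mandatory fees.

```python
def headline_price_gbp(monthly_fee: float, min_term_months: int,
                       mandatory_fees: list[float]) -> float:
    """Total upfront cost that must be advertised: the full cost of the
    minimum contract term plus any mandatory fees (e.g. a joining fee).
    Hypothetical figures for illustration only."""
    return monthly_fee * min_term_months + sum(mandatory_fees)

# A GBP 30/month membership with a 6-month minimum term and a GBP 25
# joining fee must be advertised at the 6-month total, not "GBP 30/month":
print(headline_price_gbp(30.0, 6, [25.0]))  # 205.0
```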
Publishing or facilitating fake or misleading reviews is banned. Crucially, the DMCC places a positive obligation on traders to take “reasonable and proportionate steps” to prevent fake reviews from appearing on their platforms.
Businesses that use online platforms or digital marketing will need to implement checks on reviews, alongside clear terms of engagement, in order to demonstrate that they are actively preventing and removing fake reviews.
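What “reasonable and proportionate steps” look like will vary by platform, but the kind of automated screening involved can be sketched as below. The specific signals and thresholds here (verified purchase status, account age, incentivised-review phrases) are illustrative assumptions, not requirements drawn from the Act or CMA guidance.

```python
# Illustrative sketch of automated review checks a platform might run
# before publication; signals and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Review:
    reviewer_verified_purchase: bool
    reviewer_account_age_days: int
    text: str

SUSPECT_PHRASES = {"paid to post", "in exchange for this review"}

def flag_for_moderation(review: Review) -> list[str]:
    """Return the reasons (if any) a review should be held for human review."""
    reasons = []
    if not review.reviewer_verified_purchase:
        reasons.append("no verified purchase")
    if review.reviewer_account_age_days < 1:
        reasons.append("brand-new account")
    if any(p in review.text.lower() for p in SUSPECT_PHRASES):
        reasons.append("incentivised-review language")
    return reasons

suspect = Review(False, 0, "Amazing! I was paid to post this.")
print(flag_for_moderation(suspect))
```

Automated flagging alone is unlikely to be sufficient; a documented process for human review and removal, together with clear terms of engagement, would help evidence the “reasonable and proportionate steps” the Act requires.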
The CMA allowed a three-month adjustment period for businesses to digest the guidance, which concluded earlier this month. Since then it has completed a website review of more than 100 businesses, including Viagogo, StubHub, AA Driving School and Wayfair, and found that more than half of those reviewed may be failing to comply with the guidance.
Rules about consumer subscriptions were due to come into force in Spring 2026, but it is expected that this may be pushed back by a further 6 months. Once in place, businesses offering subscription services will face additional requirements designed to combat “subscription traps”. These include:
The obligations will become implied terms of consumer contracts and will give consumers additional cancellation rights if traders fail to comply.
Consumer-facing business models are under new and heightened scrutiny, particularly where online platforms, digital marketing, subscriptions or renewal models are used. Consumer protection has been elevated, so businesses must ensure their terms of sale (including terms covering subscriptions, cancellations and refunds) comply with the new rules.
In relation to M&A, the CMA’s enhanced review powers, coupled with its greater willingness to intervene, mean that businesses engaging in M&A must consider the broader digital market context when carrying out their due diligence.
Although the CMA has indicated it will initially focus on and prioritise more egregious breaches, it is clear that it intends to act swiftly and stop unlawful conduct. The CMA may also consider previous conduct when setting monetary penalties, especially where the business has been non-compliant with CMA enforcement in the past.
The DMCC Act represents a fundamental change in the UK’s consumer law and digital markets landscape. The CMA now has the power and the resources to take swift and decisive action against businesses that fall short of their consumer law obligations.
For businesses, this shift brings greater penalties for non-compliance but also greater rewards for transparency and fair dealing. Organisations operating in the UK market should act now to review existing practices and strengthen internal governance. The regulatory environment is changing, and businesses that adapt now will be far better placed to grow and build consumer trust in this new era.
The UK’s data protection landscape is continuing to evolve. The Data (Use and Access) Bill (the “DUA Bill”) received Royal Assent last month and has been enacted as the Data (Use and Access) Act 2025 (the “DUA Act”). The DUA Act aims to complement existing UK data protection laws by enhancing transparency, promoting responsible data sharing, and reinforcing the protection of individual rights.
This article explores some of the key changes being introduced by the DUA Act and outlines practical steps organisations can take to prepare for the new legislation and ensure ongoing compliance.
One of the most hotly contested issues that stalled the DUA Bill’s progress was around the treatment of artificial intelligence (“AI”). The emergence of AI models raised significant concerns, particularly around copyright materials being used by developers for training their Large Language Models.
The House of Lords pushed for amendments to the DUA Bill to include stricter provisions on the use of copyrighted content, advocating for mandatory transparency requirements. Some of the UK’s leading music artists, including Sir Elton John, Sir Paul McCartney and Dua Lipa, spoke out in support of these changes. (Dua Lipa’s high profile involvement even led to the legislation being jokingly referred to as the “DUA Lipa Bill”). These artists warned that, without such safeguards, tech companies could exploit intellectual property and be given free rein to use content without having to compensate the creators.
However, the House of Lords’ efforts were unsuccessful. The Government ultimately resisted the proposed changes, arguing that the DUA Act was not the appropriate legislative vehicle to address such complex and evolving issues. Eventually a compromise was reached and a government report on AI and copyright is due to be published later this year, which will explore possible changes and enforcement measures.
The DUA Act establishes a more robust legal framework for data access and sharing. It updates and reforms existing UK data provisions and e-privacy laws and includes broader data policy initiatives aimed to encourage use of data in the public interest, while maintaining safeguards for individual rights to privacy.
While some critics argue the DUA Act simply reinforces and codifies existing legislation, the cumulative effect of the changes could be significant from a compliance and operational perspective.
The new Information Commission will issue guidance on the DUA Act, but none is scheduled imminently and it may not arrive until next year. As we await further details and secondary legislation, organisations should take this opportunity to proactively review and assess their existing documentation and policies to ensure a smooth transition.
The DUA Act is part of a broader trend towards a more flexible and accountability-driven approach to how data is being governed in the UK. While some key aspects, such as AI and copyright, are subject to secondary legislation and further guidance to be published, the direction the Government is taking towards modernisation is clear.
Organisations that begin reviewing and assessing their processes and provisions now will be better placed to ensure legal compliance and avoid regulatory risk in the future.
Our experienced Data Protection & Privacy team is available to provide further advice or answer any questions you may have about the DUA Act. Please do not hesitate to get in touch.
It is because of AI’s potential to change the world – for both good and bad – that many feel it needs to be regulated. In May of this year, Google CEO Sundar Pichai said that “AI is too important not to regulate, and too important not to regulate well.”
The explosion of AI, and the hype around it, has led to increasing regulatory scrutiny. Around the world we are seeing different approaches to regulating AI, but also some common threads.
Watch Ann-Maree Blake’s keynote presentation at the Consulegis conference in Cardiff.
Both the European Union and the United Kingdom have stepped up to the AI regulation plate with enthusiasm but have taken different approaches:
The EU has put forth a broad and prescriptive proposal in the AI Act which aims to regulate AI by adopting a risk-based approach that increases the compliance obligations depending on the specific use case. The UK, in turn, has decided to abstain from new legislation for the time being, relying instead on existing regulations and regulators with an AI-specific overlay.
In the EU the thinking is about “empowering people with a new generation of technologies” whereas in the UK the thinking is about “driving growth and unlocking innovation.”
The EU is looking to put in place what is probably the most ambitious framework in the world in terms of AI regulation.
It is very much aiming to be a global leader in AI regulation, in much the same way as it has with data protection via the GDPR.
The EU AI Act looks at the risk posed by AI systems and addresses it in a practical way by categorising AI systems into four levels of risk: unacceptable, high, limited, and minimal or no risk.
If it is passed, it will require companies to assess AI system risks before those systems are put into use.
Companies will be required to obtain permits for high-risk AI, and to provide transparency and accountability for those high-risk AI systems.
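The tiered structure described above can be summarised as a simple mapping from risk tier to the broad obligations attached to it. This is a simplified illustration of the framework as described in this article, not a legal classification of any particular system.

```python
# Illustrative summary of the EU AI Act's four risk tiers as described
# above; the obligation wording is a simplified paraphrase, not the
# statutory text.
RISK_TIERS = {
    "unacceptable": "prohibited outright",
    "high": "pre-deployment risk assessment, permits, transparency and accountability",
    "limited": "lighter transparency obligations",
    "minimal or no risk": "no new obligations",
}

def obligations_for(tier: str) -> str:
    """Look up the broad obligations attached to a risk tier."""
    return RISK_TIERS[tier]

print(obligations_for("high"))
```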
The UK is not proposing to create umbrella legislation. Instead, its approach, set out in a white paper published in March 2023, articulates the ambition of making the UK “the best place in the world to build, test and use AI technology”.
Broadly speaking, the approach in the White Paper, Establishing a pro-innovation approach to AI regulation, rests on two main elements: firstly, AI principles that existing regulators (such as the ICO and the Financial Conduct Authority (FCA)) will be asked to implement; and secondly, a set of new ‘central functions’ to support this work.
The regulation of AI is a complex area, and whether there will eventually be a true global standard for AI regulation remains to be seen.
However, companies should not wait until there is a clear regulatory framework in place in their jurisdiction as there are key things they can do now including:
To discuss any of the points raised in this article, please contact Ann-Maree Blake or fill in the form below.