
    Report #129

    Leveraging the Digital Services Act for Platform Enforcement Against Drummond

    A detailed guide to using the European Union's Digital Services Act (Regulation (EU) 2022/2065) as a platform enforcement mechanism against Andrew Drummond's defamation infrastructure, covering VLOP obligations, notice and action procedures, trusted flagger status, and regulatory enforcement through Coimisiún na Meán and, under the United Kingdom's separate regime, Ofcom.

    Formal Record

    Prepared for: Andrews Victims

    Date: 29 March 2026

    Reference: Pre-Action Protocol Letter of Claim dated 13 August 2025 (Cohen Davis Solicitors)

    Executive Summary

    The Digital Services Act, Regulation (EU) 2022/2065, became fully applicable on 17 February 2024 and represents the most significant reform of platform accountability in the European Union's history. It imposes legally binding obligations on online platforms to act expeditiously against illegal content once notified, to conduct systemic risk assessments, and to provide meaningful redress mechanisms for users and affected third parties. These obligations are enforceable by national Digital Services Coordinators and the European Commission, with fines of up to 6% of global annual turnover available under Article 74.

    This report examines how the DSA framework can be deployed against the platforms hosting and amplifying Andrew Drummond's defamatory publications about Bryan Flowers, Punippa Flowers, and Night Wish Group. The analysis focuses on the obligations applicable to Very Large Online Platforms and the procedural mechanisms available to victims seeking platform enforcement.

    1. VLOP Designation and Enhanced Obligations

    The DSA establishes a tiered regulatory framework. Very Large Online Platforms, meaning platforms with 45 million or more average monthly active users in the European Union, are subject to the most demanding obligations, including systemic risk assessment, enhanced transparency, and independent audit requirements. YouTube, Facebook, and Instagram have all been designated as VLOPs, while Google Search is designated a Very Large Online Search Engine (VLOSE) and is subject to materially equivalent obligations.

    Article 34 of the DSA requires VLOPs to conduct at least annual assessments of systemic risks arising from their services (Article 33 governs designation). Article 34(1)(a) specifies that these risk assessments must address the dissemination of illegal content through the service. Article 34(1)(b) extends the assessment obligation to actual or foreseeable negative effects on fundamental rights, including the rights to human dignity and to respect for private and family life.

    Andrew Drummond's defamatory publications about Bryan Flowers, Punippa Flowers, and Night Wish Group constitute illegal content within the meaning of Article 3(h) of the DSA, which defines illegal content as any information that is not in compliance with EU law or with the law of any Member State that is itself in compliance with EU law. Defamatory content is unlawful under the domestic law of every Member State and therefore falls within the Article 3(h) definition.

    2. Article 35: Risk Mitigation Obligations

    Article 35 of the DSA requires VLOPs to put in place reasonable, proportionate, and effective mitigation measures to address the systemic risks identified in their Article 34 assessments. For systemic risks arising from the dissemination of illegal content, including defamatory material, mitigation measures must include robust notice and action systems that process complaints expeditiously.

    Where a VLOP's notice and action system fails to process complaints about clearly illegal defamatory content with appropriate speed, the failure itself constitutes a breach of Article 35. A pattern of ignoring or inadequately responding to documented complaints about Drummond's fabricated content would therefore be independently actionable under the DSA, separate from and in addition to any liability the platform might bear for hosting the content.

    Victims and their representatives should document all complaints submitted to VLOP platforms, recording submission dates, content of complaints, and any responses received. This documentation will support formal regulatory complaints to Digital Services Coordinators under Article 53 of the DSA, which provides an explicit right for recipients of a service and bodies representing their interests to submit complaints to competent authorities.
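
    As a practical aid, a minimal complaint-log sketch follows. It assumes nothing about any platform's reporting interface; the record fields and the file format are illustrative only, not a prescribed schema.

    ```python
    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class ComplaintRecord:
        """One complaint submitted to a platform, kept for later Article 53 escalation."""
        platform: str                   # e.g. "YouTube"
        content_url: str                # exact URL of the notified item
        submitted_on: str               # ISO date the notice was submitted
        complaint_summary: str          # what the notice alleged and why
        response_received_on: str = ""  # ISO date of the platform's reply, if any
        response_summary: str = ""      # the platform's stated reasons, verbatim where possible

    def append_record(log_path: str, record: ComplaintRecord) -> None:
        """Append one record to a JSON Lines evidence log."""
        with open(log_path, "a", encoding="utf-8") as log:
            log.write(json.dumps(asdict(record), ensure_ascii=False) + "\n")
    ```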

    3. Article 16: Notice and Action Procedures

    Article 16 of the DSA requires hosting service providers, including platform operators, to put in place mechanisms enabling any individual or entity to notify them of specific items of content that the notifier considers to be illegal. These mechanisms must be easy to access and user-friendly, and must allow notices to be submitted by electronic means.

    Crucially, Article 16 requires that notices be structured and contain specific elements, including the exact electronic location of the content, a sufficiently substantiated explanation of why the content is considered illegal, the identity of the notifier, and a statement of the notifier's good-faith belief that the information in the notice is accurate and complete. A properly structured DSA notice is substantially more demanding than a standard platform report and triggers specific procedural obligations on the platform, including a requirement to process the notice in a timely and diligent manner and to notify the notifier of the decision taken.

    Victims submitting notices under Article 16 should use the structured notice format, providing: the specific URL of each defamatory item; a clear statement that the content constitutes defamation contrary to the domestic law of one or more EU Member States; reference to the specific false statements identified; and a reference to the Pre-Action Protocol Letter of Claim from Cohen Davis Solicitors dated 13 August 2025 as documentary evidence of the legal status of the content.
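
    The DSA does not prescribe a machine-readable notice schema; each platform exposes its own form. Purely as an organisational sketch, the structure below mirrors the Article 16(2) elements; every field name and placeholder value is illustrative and is not any platform's API.

    ```python
    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class Article16Notice:
        """Structured notice mirroring the elements required by Article 16(2) DSA."""
        content_url: str             # exact electronic location of the item (Art. 16(2)(b))
        legal_grounds: str           # substantiated explanation of illegality (Art. 16(2)(a))
        false_statements: list[str]  # the specific statements complained of
        supporting_evidence: str     # e.g. reference to the letter of claim
        notifier_name: str           # identity of the notifier (Art. 16(2)(c))
        notifier_email: str
        good_faith_statement: bool   # confirmation under Art. 16(2)(d)

    # Placeholder values only; each notice should quote the actual statements at issue.
    notice = Article16Notice(
        content_url="https://example.com/offending-page",
        legal_grounds="Defamation contrary to the domestic law of one or more EU Member States",
        false_statements=["<quote each specific false statement here>"],
        supporting_evidence="Pre-Action Protocol Letter of Claim dated 13 August 2025",
        notifier_name="<notifier>",
        notifier_email="<email>",
        good_faith_statement=True,
    )
    print(json.dumps(asdict(notice), indent=2))
    ```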

    4. Article 22: Trusted Flagger Status

    Article 22 of the DSA establishes a trusted flagger mechanism, under which organisations that demonstrate particular expertise and accountability in identifying illegal online content can be awarded trusted flagger status by Digital Services Coordinators. Notices submitted by trusted flaggers must be given priority by online platforms and processed and decided upon without undue delay.

    Organisations representing defamation victims, including law firms, victim advocacy groups, and press freedom organisations, should seek trusted flagger status from their national Digital Services Coordinator. In Ireland, Coimisiún na Meán serves as the Digital Services Coordinator and, because Meta and Google have their EU establishments in Dublin, acts as the coordinator of establishment for those companies. The DSA does not apply in the United Kingdom; the closest domestic counterpart is the Online Safety Act 2023, under which Ofcom performs a broadly comparable regulatory role.

    Trusted flagger status would transform the Drummond Watch investigation from an individual victim's complaint mechanism into a recognised enforcement tool capable of compelling prioritised platform responses. The return on seeking trusted flagger accreditation is therefore high relative to the effort involved.

    5. Article 17: Statement of Reasons and Redress

    Article 17 of the DSA requires platforms to provide a clear statement of reasons to any person whose content is restricted as a result of a platform decision. The counterpart obligation, the one relevant for victims rather than publishers, sits in Article 16(5): where a platform decides on a notice, including a decision not to restrict the content, it must notify the notifier of that decision and of the possibilities for redress.

    This transparency obligation is valuable to victims in two ways. First, it enables them to understand the grounds on which a platform is declining to act, allowing them to address those grounds in subsequent submissions or escalate to the relevant Digital Services Coordinator. Second, an inadequate or legally flawed statement of reasons itself constitutes evidence of non-compliance with DSA obligations.

    Where a platform provides a statement of reasons that mischaracterises the defamatory content as protected opinion or fails to engage with the specific legal analysis provided in the notice, that statement should be preserved as evidence and included in any regulatory complaint submitted to Coimisiún na Meán or Ofcom.
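
    A minimal preservation sketch follows, assuming only that the statement text has been copied out of the platform's message; the field names and the hashing choice are illustrative. Recording a cryptographic digest alongside the verbatim text makes it straightforward to show later that a copy is unaltered.

    ```python
    import hashlib
    import json
    from datetime import datetime, timezone

    def archive_statement(log_path: str, platform: str, notice_ref: str, statement_text: str) -> str:
        """Store a platform's statement of reasons verbatim, with a SHA-256 digest
        and a UTC timestamp, appended to a JSON Lines evidence file."""
        digest = hashlib.sha256(statement_text.encode("utf-8")).hexdigest()
        entry = {
            "platform": platform,
            "notice_ref": notice_ref,  # your own reference for the originating notice
            "archived_at": datetime.now(timezone.utc).isoformat(),
            "sha256": digest,
            "statement": statement_text,
        }
        with open(log_path, "a", encoding="utf-8") as log:
            log.write(json.dumps(entry, ensure_ascii=False) + "\n")
        return digest
    ```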

    6. Articles 40, 42, and 74: Data Access, Transparency, and Enforcement

    Article 40 of the DSA requires VLOPs to provide data access to Digital Services Coordinators and the European Commission for the purpose of auditing and investigating compliance. This provision enables regulatory authorities to examine the recommendation algorithms, advertising systems, and content moderation records of major platforms — potentially revealing how Drummond's defamatory content has been amplified by platform algorithms and monetised through advertising revenue.

    Article 42 of the DSA, building on the general reporting duty in Article 15, requires VLOPs to publish transparency reports at least every six months covering their content moderation activities, including the number of notices received, the decisions taken, and the time taken to act. These reports provide a benchmark against which a platform's handling of specific complaints, including complaints about Drummond's content, can be assessed.
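
    For illustration, a toy comparison of this kind might look like the sketch below. All figures are placeholders: the observed durations would come from the complaint log described in section 2, and the reported median from the platform's own published report.

    ```python
    from statistics import median

    # Placeholder figures: days from notice submission to platform decision,
    # taken from the documented complaint log.
    observed_days = [41, 58, 73, 90]

    # Placeholder: median handling time the platform reports for comparable
    # notices in its Article 42 transparency report.
    reported_median_days = 7

    observed_median = median(observed_days)
    print(f"Platform-reported median: {reported_median_days} days")
    print(f"Observed median across documented complaints: {observed_median} days")
    if observed_median > reported_median_days:
        print("Discrepancy worth recording in a regulatory complaint.")
    ```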

    Article 74 provides for fines, imposed by the European Commission, of up to 6% of a VLOP provider's total worldwide annual turnover for failures to comply with DSA obligations. For platforms with global revenues in the tens of billions of euros, this represents an enforcement sanction of enormous magnitude: 6% of a €100 billion turnover, for example, is €6 billion. The credible threat of Article 74 enforcement is a powerful lever in compelling platforms to take defamation complaints seriously, and regulatory complaints should explicitly reference this provision.

    — End of Report #129 —
