UK Expands Online Safety Act to Mandate Preemptive Scanning

(reclaimthenet.org)

56 points | by aftergibson 4 hours ago

18 comments

  • munksbeer 1 hour ago
    > A major expansion of the UK’s Online Safety Act (OSA) has taken effect, legally obliging digital platforms to deploy surveillance-style systems that scan, detect, and block user content before it can be seen.

    If this is implemented as it reads, just a note to everyone else, everywhere in the world:

    For this policy to work, everything must be scanned. So now, every time you communicate with someone in the UK, your communications are no longer private.

    • flumpcakes 40 minutes ago
      Well, yes, because it is designed to protect UK citizens. As much as GDPR applies "everywhere in the world" when interacting with EU citizens.

      Just as my communications are scanned when interacting with US citizens under PRISM. I'd argue that is exponentially more dangerous and nefarious given its apparent illegality and (once) top secrecy.

      • cheeseomlit 19 minutes ago
        >it is designed to protect UK citizens

        Is that really what it's designed for?

        And as far as the PRISM comparison goes, I'd rather mass surveillance not be done at all, but if it's being done no matter what I'd rather it be illegitimate than official policy. At least they have to jump through some hoops for parallel construction that way, and it doesn't normalize the practice as morally/socially acceptable - it's a "secret" because it's embarrassing and shameful for it to exist in a "free" society. If it's not a secret and nobody is ashamed of it, then you don't even have the pretense of a free society anymore.

  • ghusto 2 hours ago
    > The UK Department for Science, Innovation and Technology (DSIT) unveiled the changes through a promotional video showing a smartphone scanning AirDropped photos and warning the user that an “unwanted nude” had been detected.

    "Unwanted"

    • mosura 2 hours ago
      Cryptographically signed with proof of the sender’s bank balance to enable appropriate filtering.
    • soco 2 hours ago
      I can imagine a setting in the app/phone like "allow nudes only from contacts", or a whitelist or something? I get unsolicited shit on Tumblr all the time - not necessarily bad looking, but no thanks, I can take care of myself.
    • rdm_blackhole 2 hours ago
      > The UK Department for Science, Innovation and Technology (DSIT)

      It should be called the Ministry of Truth at this point.

      > Unwanted

      How do you know if a nude is unwanted? The premise itself makes no sense. The only way this could potentially work is if you had the whole context of the relationship somehow embedded in the messages and then deciphered the intent behind them. Even then, what about sarcasm or double entendre?

      • potato3732842 2 hours ago
        >How do you know if a nude is unwanted? The premise itself makes no sense

        If the app has sufficient permissions to infer user demographics, a sufficiently jaded person should be able to come up with a set of rules that gets you a 99% solution pretty easily.
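
        Purely as an illustration - a crude version might look like the sketch below, where every signal and threshold is invented for the example:

            # Illustrative only: a crude heuristic for flagging an incoming
            # image as likely "unwanted". Signals and thresholds are made up.
            from dataclasses import dataclass

            @dataclass
            class IncomingImage:
                nudity_score: float       # 0..1 from some on-device classifier
                sender_in_contacts: bool
                prior_messages: int       # messages previously exchanged with sender
                recipient_opted_in: bool  # e.g. "allow nudes from this contact"

            def likely_unwanted(img: IncomingImage) -> bool:
                if img.nudity_score < 0.8:
                    return False          # probably not a nude at all
                if img.recipient_opted_in:
                    return False          # explicit consent wins
                # No real relationship with the sender: unwanted by default
                return (not img.sender_in_contacts) or img.prior_messages < 10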

        • mikkupikku 2 hours ago
          In the future, phones will refuse to take pictures of dicks unless men register their height and income levels so that useful and relevant information can be added to the image metadata.
      • flumpcakes 2 hours ago
        Perhaps there should be a setting "Allow X" that has to be set on a contact. By default it is set to disallow nudes.

        I think this already exists, by the way - potentially pornographic images are screened and you have to explicitly confirm that you want to view them.

        • akikoo 2 hours ago
          "Allow X" now that they are planning to ban X :)
      • pacifika 58 minutes ago
        Unwanted by DSIT
  • captain_coffee 2 hours ago
    So wait - would this be something like... you try to send a dickpic via WhateverMessenger, the content gets scanned first, and you're presented with a message along the lines of "This message cannot be sent as it violates our T&Cs"?
    • imdsm 2 hours ago
      Scanned locally or externally? That's what I care about.
      • Phemist 2 hours ago
        Don't buy into the framing. No scanning at all is what I care about.
        • like_any_other 1 hour ago
          Don't buy into that framing either. Optional scanning - if a user wants to, they are free to download government spyware onto their phone/computer and do all the scanning they want, local or otherwise. No new laws needed.
          • rdm_blackhole 55 minutes ago
            I agree. If someone is happy for a government worker/algorithm to snoop through everything they send to anyone, they're free to opt in - just don't force the rest of us to participate.
            • Natfan 39 minutes ago
              If it provably isn't networked and is ephemeral with no logging, then I potentially don't have an issue with it.
              • like_any_other 31 minutes ago
                You have no issue with censorship, as long as there's no surveillance to go with it?
      • doublerabbit 39 minutes ago
        Externally. When is anything ever scanned internally?
      • ChrisRR 2 hours ago
        Preferably not scanned at all
    • like_any_other 2 hours ago
      More likely it would just silently not be sent, and potentially a week later you get a visit from the cops. Censors hate drawing attention to their actions; that is why you never see a "this message was censored on government request" notice as sender or recipient.

      This is where someone conflates it with anti-spam and acts confused, because showing such a notice for every spam message would make a service unusable. As if spam is equivalent, as if users cannot be given the choice to opt in/out of however much anti-spam and other filtering that they want as recipients, and as if "This was censored" messages cannot be collapsed/shown per category, e.g. "Messages blocked: 12 spam, 4 unwanted sexual content, 5 misinformation/lacking context, 7 hate/harmful content". As a rule, when someone raises an objection that can be resolved with less than 60 seconds of thought, they are not being genuine.
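
      (For what it's worth, the per-category summary really is a sixty-second job - a throwaway sketch, with the categories obviously invented:)

          # Throwaway sketch: collapse blocked-message notices into one
          # summary line per category instead of one notice per message.
          from collections import Counter

          def blocked_summary(blocked):
              """blocked: iterable of (message_id, category) pairs."""
              counts = Counter(category for _, category in blocked)
              return "Messages blocked: " + ", ".join(
                  f"{n} {category}" for category, n in counts.most_common())

          # blocked_summary([(1, "spam"), (2, "spam"), (3, "unwanted sexual content")])
          # -> "Messages blocked: 2 spam, 1 unwanted sexual content"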

      But more importantly, it would make it illegal to provide any kind of messaging software without government approval, which is only given by letting government-designated censorship and surveillance services act as middle-men. And then the law can be more or less strictly applied, depending how much the government dislikes the general sentiment that is spread on your network, regardless of its legality, thus controlling discourse.

      I am not speculating here - this is what the UK government has admitted they want:

      First, we are told, the relevant secretary of state (Michelle Donelan) expressed “concern” that the legislation might whack sites such as Amazon instead of Pornhub. In response, officials explained that the regulation in question was “not primarily aimed at … the protection of children”, but was about regulating “services that have a significant influence over public discourse”, a phrase that rather gives away the political thinking behind the act. - https://archive.md/2025.08.13-190800/https://www.thetimes.co...

  • imdsm 2 hours ago
    > To meet the law’s demands, companies are expected to rely heavily on automated scanning systems, content detection algorithms, and artificial intelligence models trained to evaluate the legality of text, images, and videos in real time.

    This means either devices need to evolve to do this locally, or the items need to be sent to external service providers, usually based outside of the UK, to be scanned unencrypted.

    I also assume this means the government here in the UK are okay with all the WhatsApp messages they send being sent to an LLM outside the UK to be scanned for legality?
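
    Either way the scanner has to see the plaintext. As a rough sketch (nothing here is any real messenger's code - the endpoint and the on-device model are stand-ins):

        # Rough sketch: a client-side scanning hook sits between "send" and
        # end-to-end encryption, so whatever does the scanning sees plaintext.
        # The remote variant ships that plaintext off-device before encryption.
        import requests

        def local_model_classify(data: bytes) -> dict:
            # Stand-in for an on-device classifier; always allows in this sketch.
            return {"blocked": False}

        def send_message(plaintext: bytes, encrypt, transmit, scan_remotely=False):
            if scan_remotely:
                # Plaintext leaves the device before encryption ever happens.
                verdict = requests.post("https://scanner.example/v1/scan",
                                        data=plaintext, timeout=5).json()
            else:
                verdict = local_model_classify(plaintext)
            if verdict.get("blocked"):
                return None  # message is dropped before it is ever sent
            return transmit(encrypt(plaintext))  # encryption only after the scan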

  • 6LLvveMx2koXfwn 2 hours ago
    I understand the rage generated here, but what is the alternative?

    If a service implements privacy invading 'features' then we have the choice not to use that service. Letting tech companies self-regulate has failed, and too many people leave morality at the door when engaging online, something which doesn't happen at scale IRL.

    What are we to do if not monitor? And how do we make that scalable if not by introducing automation?

    • enderforth 2 hours ago
      I don't know what the alternative is, but I don't think I've ever found a situation yet where the solution has been His Majesty's Government being able to exercise more control over what people can see and hear.
    • Bender 1 hour ago
      > but what is the alternative?

      If an app can be installed on someone's hardware without their intervention, launch the hardware into the air and use it for target practice. If a website requires some crypto-crap to verify objects were scanned, then upload to smaller platforms and let others link to the objects from the big platform. The big platforms can play whack-a-mole removing links; it's a fun game. The smaller sites can give the crawler alternate images. Better yet, just use small semi-private self-hosted platforms. Even better, ensure those platforms are only accessible via .onion domains, requiring a Tor-enabled browser. People can then make sites that proxy/cache objects from Tor onion sites to easier-to-access sites.

    • flumpcakes 2 hours ago
      > Letting tech companies self-regulate has failed, and too many people leave morality at the door when engaging online, something which doesn't happen at scale IRL.

      I completely agree with this point.

      We also have some tech companies (X) openly hostile to the UK Government. At what point does a sovereign country say "you're harming the people here and refuse to change, so you're blocked"?

      • cmxch 39 minutes ago
        Well, X seems to only be “hostile” in the sense that it airs the uncomfortable truths that the UK would rather not have heard.
    • jpfromlondon 2 hours ago
      >too many people leave morality at the door

      Yep, that's life. If something bothers you and it's already a crime, then report it.

      There is precious little in life that can be undertaken without some risk, however small, of something unwanted (hah).

      • flumpcakes 2 hours ago
        > Yep, that's life. If something bothers you and it's already a crime, then report it.

        I think that's the issue with this, and why we are seeing new laws introduced.

        If someone is assaulted in real life, the police can intervene.

        If people are constantly assaulted at a premises, that premises can lose its licence (a pub or bar, for example).

        When moving to the online space, you are now potentially in contact with billions of people, and 'assaults' can be automated. You could send a dick pic to every woman on a platform for example.

        At this point the normal policing, and normal 'crime', goes out of the window and becomes entirely unenforceable.

        Hence we have these laws pushing this onto the platforms - you can't operate a platform that does not tackle abuse. And for the most part, platforms know this and have tried to police it themselves, probably because they saw themselves more like 'pubs' in real life, where people would interact in mostly good faith.

        We've entered an age of bad faith by default: every interaction is now framed as 'free speech', but the speakers never face the consequences. I have a feeling that's how the US ended up with its current administration.

        And now the tech platforms are sprinting towards the line of free speech absolutism and removing protections.

        And now countries have to implement laws to solve issues that should have just been platform policy enforcement.

        • jpfromlondon 1 minute ago
          Believe it or not, when a crime has been committed these providers universally defer to the police, whose remit is enforcement - a role they seem reluctant to undertake. I'm unconvinced this is anything other than a convenient revenue stream, an opportunity to steer public opinion, and a means of quashing dissent.

          Frankly, a few dick pics here and there seems wildly low-stakes for such expensive draconian authoritarianism.

    • cmxch 41 minutes ago
      The US model, where hurty words don't get a SWAT team sent to your door the way they do in the UK.
    • polski-g 46 minutes ago
      The Internet has worked fine for the past 30 years without this. There is no reason for such filtering.
    • cft 2 hours ago
      Goodbye, all small independent forums with no AI budgets. An attacker posts a nude picture, and it's an £18m fine from Ofcom ("whichever is larger", not proportional to revenue).
      • flumpcakes 2 hours ago
        I don't think the fine is automatic like that; it's more for not having an appropriate mechanism to manage it. In other words, you need a content policy that is enforced.

        A mod who deletes nude pictures is probably enough to not get fined.

        I think the real issue is what I just said: "probably enough". That's the problem with the Online Safety Act. Most people are not legally literate, and asking them to understand a complex law and implement it (even when the actual implementation takes little or no effort for well-run spaces) is the real burden.

        • jaffa2 53 minutes ago
          As far as I am aware, 'probably' is about the best you can do; the OSA is so vaguely defined that it's genuinely difficult to know what is and isn't valid.
    • rdm_blackhole 1 hour ago
      > What are we to do if not monitor?

      Simple: you can choose to only use platforms that apply the most stringent scanning technologies for you and your family.

      You give the UK government (or the equivalent that applies to you) the right to continuously scan everything from pictures to emails to messages, and then obviously you give them the right to prosecute you and come after you when one of their AI algorithms mistakenly detects child porn on your device or in your messages, just like this guy: https://www.theguardian.com/technology/2022/aug/22/google-cs...

      For the rest of us, we should be free to opt out from being surveilled by machines 24/7.

      Then everyone is happy.

      Edited: typos

      • btasker 1 hour ago
        Personally, I think this is the answer too - rather than mandating it across all platforms, they could have created a service which provides scanning so that there was an additional app people could choose to install (and would, presumably, present as an accessibility addon so it could access content in other apps).

        That's not without its own issues though - creating external deps is more or less what they did the first time they tried to mandate age verification.

        Although their plans fell through, they created an industry that expected a captive market and started lobbying heavily. Eventually it worked, and we've ended up with mandatory age verification.

    • like_any_other 2 hours ago
      > but what is the alternative?

      We already have alternatives; this legislation is taking them away. If I want heavily censored discourse, I can go to Reddit. If I want the wild west, I can go to 4chan. If I want privacy, I can use Signal. And there are lots of services at different points on that spectrum, where different things are allowed.

      But the UK government wants to eliminate that choice and decide for me. And most importantly, they don't want to call it censorship, but "safety". To keep women and girls "safe" (but nobody is allowed to opt out, even if they're not a woman or girl, or don't want this "safety").

  • doublerabbit 2 hours ago
    How's that lawsuit with 4chan going, Ofcom? I checked just now and the site is still online.

    Time to move my colocated servers out of the UK.

    • HeckFeck 2 hours ago
      If they're really keen, they could just ask the hacker known as Soyjak.party to knock it offline again.
      • westmeal 2 hours ago
        They'd better make sure there are no conspicuously placed yellow vans nearby either, lest they explode.
    • pelagicAustral 2 hours ago
      It will get sorted in two more weeks
  • flumpcakes 2 hours ago
    Most of these comments, I think, are off the mark. For some reason anything to do with the EU or the UK legislating to protect the citizenry is seen as some Orwellian conspiracy to mind-control people. I agree some of the policies feel like always reaching for a hammer - but I strongly suspect that's because the tech industry is clearly not playing ball.

    Children being sent dick pics, or AI generated nudes of them being sent around schools, etc. are real problems facing real normal people.

    I think people here need to be reminded that the average person doesn't care about technology. They will be happy for their phones to automatically block nude pictures by Government rule if the tech companies do not improve their social safety measures. This is the double-edged sword: these same people are not tech-savvy enough to lock down their children's phones, they expect them to be safe, they paid money for them to be "safe", and even if you lock a phone down, it doesn't stop their classmates sending them AI porn of other classmates.

    Musk is living proof that a non-zero number of these giant tech companies are happy for child porn ("fake" or not) to be posted on their platforms. If I were in his shoes, it would be pretty high up on my list to make sure Grok isn't posting pornography. It's not hard to be a good person.

    • HPsquared 2 hours ago
      The things you mention are already illegal. The effective proven solution is to enforce existing laws, to punish and deter bad behaviour like any other crime.

      This incongruence is why a lot of people don't take the reasoning at face value and see it as only rhetorical justification for increased surveillance, which is widely understood as something the state wants to do anyway.

      • yladiz 2 hours ago
        How do you deal with a crime that isn’t reported due to things like shame?

        Not to say that we need to scan messages to stop nudes from being sent, but I don't think you can say "just enforce existing laws" and be done with it; it's not that simple.

        • iamnothere 3 minutes ago
          Perhaps His Majesty’s Government could establish mandatory thought scanning using cutting edge technology[0] to ensure that no crimes go unreported due to shame, dishonesty, or threats. Just step into the scanning booth once a week, a minor inconvenience to ensure your safety. Surely you have nothing to hide?

          [0] https://www.nature.com/articles/d41586-025-03714-0

        • t0bia_s 24 minutes ago
          Maybe we should ban shame?
        • HPsquared 59 minutes ago
          The externalities of this policy don't justify that small benefit.
          • yladiz 46 minutes ago
            Define small benefit.
      • flumpcakes 2 hours ago
        I posted a reply here https://news.ycombinator.com/item?id=46599842 that addresses why I think "this is already a crime" doesn't go far enough, and why these laws are being introduced.
    • polski-g 43 minutes ago
      Adobe isn't the creator of child porn when Photoshop is used by a child pornographer.

      So why are you considering xAI the creator when it's just the tool being used?

      The human child pornographer using tools is the one who's creating it, not the tools.

  • miroljub 2 hours ago
    The Sex Pistols are more relevant than ever.

        God save the Queen
        The fascist regime
        It made you a moron
        Potential H-bomb
        God save the Queen
        She ain't no human being
        There is no future
        In England's dreaming
    
        Don't be told what you want to want to
        And don't be told what you want to need
        There's no future, no future
        No future for you
    • mosura 2 hours ago
      They were also about the only people to call out Savile while he was alive.

      Actual abusers are fine. Talking about it is the problem.

  • anal_reactor 1 hour ago
    Literally China
  • HeckFeck 2 hours ago
    Nothing any government in my lifetime has done has arrested this feeling of decay, decline and desperation. It's like the occupational political class has a miserable vendetta and must inflict it upon the population. But I'm not actually miserable like you, I don't want to feel like you; we invented liberty in this country, now fuck off, the lot of you, thank you.
  • Popeyes 2 hours ago
    Tech industry walked right into this one, well done Musk.
    • mikkupikku 2 hours ago
      UK government publicly making a fool of itself is probably not counter to the interests of Elon Musk at all... His political faction have been keen to insult the British government whenever possible. The more absurd their public enemies act, the more reasonable they look in comparison.
      • flumpcakes 2 hours ago
        Musk is implicitly allowing child pornography on his platform. There's no way around that. Apple/Google should have removed X a while ago.
        • mikkupikku 2 hours ago
          Come on now. That's obviously not true. CSAM is absolutely banned on twitter, and all other American platforms.
          • cjs_ac 2 hours ago
            Grok AI generating child pornography has been a leading news story in the British press for the past few days.
            • mikkupikku 2 hours ago
              All images uploaded to Twitter, or any other lawful American social media platform, are perceptually hashed and checked against databases of known CSAM. Computer-generated pornography, while obviously odious, is not technically illegal in the US. And in either case, Twitter has been a dumping ground for such crap for as long as it has existed. In short, with all due respect, get a grip.
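
              For anyone unfamiliar: "perceptual hashing" means hashing what the image looks like rather than its exact bytes, so re-encoded or lightly edited copies still match. A toy sketch of the idea (real systems use robust hashes like PhotoDNA or PDQ against curated databases, not this):

                  # Toy difference hash (dHash): fingerprints an image's appearance,
                  # so small edits or re-encoding produce a nearby hash. Illustrative
                  # only - not what production CSAM matching actually uses.
                  from PIL import Image

                  def dhash(path: str, size: int = 8) -> int:
                      img = Image.open(path).convert("L").resize((size + 1, size))
                      px = list(img.getdata())
                      bits = 0
                      for row in range(size):
                          for col in range(size):
                              left = px[row * (size + 1) + col]
                              right = px[row * (size + 1) + col + 1]
                              bits = (bits << 1) | (1 if left > right else 0)
                      return bits

                  def matches(h1: int, h2: int, max_distance: int = 10) -> bool:
                      # Hamming distance between the two 64-bit hashes.
                      return bin(h1 ^ h2).count("1") <= max_distance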
              • btasker 1 hour ago
                All of what you said could be true and it'd *still* be wrong for Grok to be allowed to generate it.

                All Musk actually needed to say was "oh fuck, we'll fix that". Instead, he responded with laughing emojis and nothing's changed.

                > is not technically illegal in the US

                Bully for you.

                X is operating in the UK and it *is* illegal here (and not just here). X can either comply with our laws (and the associated moral standards) or it can cease operating here.

                There's weird nerds diving in front of Musk to defend him, and then there's defending his AI generating CSAM. Neither's a good look, but one is much worse than the other.

              • cjs_ac 1 hour ago
                What makes you think this is an appropriate response to someone telling you what's been published in newspapers?
            • jpfromlondon 2 hours ago
              To make the censorship more palatable to the general public.
          • flumpcakes 2 hours ago
            Then why does Musk refuse to fix 'Grok' and allow it to produce CSAM? Is this what the billionaire class want? AI everywhere and then hide behind "it's not the AI doing it, it's the users prompting it!".
            • pirates 2 hours ago
              > Is this what the billionaire class want?

                The billionaire class loves this type of shit; just look at the Epstein files.

        • rdm_blackhole 2 hours ago
          > Musk is implicitly allowing child pornography on his platform.

          That is blatantly false and you know it. Musk has a lot to answer for, but we don't need to start making up imaginary crimes here.

          > Apple/Google should have removed X a while ago.

          Those who ask willy-nilly for censorship always end up being surprised when the system comes after them in the end, as it always does.

          If tomorrow Apple and Google ban an app that you like, will you still agree that censorship is ok?

  • enderforth 2 hours ago
    Okay, everyone here is talking about dick pics, but let's be clear about the goal:

    >A major expansion of the UK’s Online Safety Act (OSA) has taken effect, legally obliging digital platforms to deploy surveillance-style systems that scan, detect, and block user content before it can be seen.

    Do we really believe that no government will ever use this to prevent certain "misinformation" from circulating?

    And by misinformation we mean things like MPs breaking COVID lockdown rules, or "problematic" information about the PM being involved in a scandal - the list is endless.

    Let's be clear: this isn't at all, and never has been, about dick pics. This is 100% about being able to control what you can see and share.

    • rdm_blackhole 2 hours ago
      I don't understand the downvotes that you are getting.

      There is a clear intent in Europe to muzzle the population, with this new legislation and then with Chat Control. Those who can't see that need to take off their blinders.

      First it's the nudes, and then it's something else. Once there is a capability to filter what can be shared between two adults in a private message, can anyone say that any government is not going to come back and ask for more and more things to be removed or censored?

  • PunchyHamster 3 hours ago
    UK has fallen
  • Mistletoe 3 hours ago
    Pre-cog, you say?
  • ajsnigrutin 3 hours ago
    How will it know if the dick pic is wanted or unwanted?
    • netsharc 3 hours ago
      The recipient will be required to fill in a form to confirm desire for the dick pic, and the ministry will issue a dispensation allowing the taking and sending of said dick pic.

      Please allow 3-4 weeks to process the request.

      • bubblethink 3 hours ago
        Yes, it's an amended version of form 27B/6.
        • kitd 2 hours ago
          Can I get that one in the Post Office?
          • HeckFeck 2 hours ago
            No, but you can get the form to request the form. Then it must be stamped by an official in the [strikethrough]Ministry of Information[/strikethrough] Ofcom. Please allow 4-5 months for processing, thanks to our partner in delivering the efficient intersection of Government and Industry, Capita.
          • mikkupikku 2 hours ago
            Those forms are kept in the Displays Department, in the basement.
    • hexbin010 3 hours ago
      The uncomfortable truth: I know and have met plenty of women who have invited and welcomed dick pics. As a gay guy, I can tell you that lots of women are actually very interested in dick pics. They don't need a minister protecting them from themselves.
      • netsharc 1 hour ago
        Come on, even knowing this, you have to admit there's probably a lot more unsolicited than solicited ones...
        • hexbin010 1 hour ago
          Who knows, I haven't searched for unbiased data to be honest

          But they can be more judicious with whom they share contact details, and use the block button. They are not forced to be the recipient of any message.

            Do you think the only solution is a government backdoor?

          • netsharc 12 minutes ago
            Yeah, the government's idea is moronic. But making it the victims' responsibility ("don't share your number indiscriminately") is depressing too. How about making it easier to prosecute the unsolicited sending? How about educating people not to be fuckwits?

            Yeah, prosecution makes lives more difficult, and it's ripe for abuse (the recipient could fake evidence, the sender could claim he was hacked / his friend took his unlocked phone...).

          • doublerabbit 32 minutes ago
            No, I don't. But does the government think the only solution is a backdoor? Yes.

            They are the ones in power, not you & I.

  • hexbin010 3 hours ago
    Wow nobody saw this coming /s

    They whipped up a mini pandemic of people being subjected to an onslaught of unwanted dick pics (never once mentioning the "block" feature on every single platform) to justify it.

    This is the Ministry of Truth building up their toolset

  • 10xDev 2 hours ago
    [flagged]
    • ChrisRR 2 hours ago
      Oh this is a very loaded statement if I've ever seen one. What's your issue with the "demographic of your street" and what does it have to do with scanning your messages?
    • amelius 2 hours ago
      Which city?
      • _fzslm 2 hours ago
        And which people?