So, you've hit an age gate. What now?

(eff.org)

230 points | by hn_acker 3 hours ago

31 comments

  • ryandrake 2 hours ago
    My kid recently quit playing Roblox because of the sketchy facial age check process. She said that she and all her friends know never to upload a picture of themselves to the Internet (good job, fellow Other Parents!!), so they're either moving on to other games or just downloading stock photos of people from the internet and uploading those (which apparently works).

    What a total joke. These companies need to stop normalizing the sharing of personal private photos. It's literally the opposite direction from good Internet hygiene, especially for kids!

    • btown 2 hours ago
      One aspect of this normalization of photo uploading is that, if a platform allows user-generated content that can show a modal to kids, a bad actor can say things like “you need to re-verify or you’ll lose all your in-game currency, go here” and then collect photo identification without even needing to compromise an identity verification provider!

      I truly fear the harm that will be done before legislators realize what they’ve created. One only hopes that this prevents the EU and US from doing something similar.

      • kspacewalk2 1 hour ago
        The fundamental question that needs answering is: should we actually prevent minors below the age of X from accessing social media site Y? Is the harm done significant enough to warrant providing parents with a technical solution that gives them control over which sites their X-aged child signs up for, and one that actually works? Obviously pinky-swear "over 13?" checkboxes don't work, so this currently does not exist.

        You can work through robustness issues like the one you bring up (photo uploading may not be a good method), and we can discuss privacy trade-offs like adults without pretending this is the first time we've legitimately needed to trade privacy against functionality or a societal need. Heck, you can come up with various methods where not much privacy needs trading off: something pseudonymous and/or cryptographic, and/or legislated OS-level device flags checked on signup and login.

        But it makes no sense to jump to the minutiae without addressing the fundamental question.

        • Aurornis 49 minutes ago
          > The fundamental question that needs answering is: should we actually prevent minors below the age of X from accessing social media site Y?

          I suspect if you ask Hacker News commenters if we should put up any obstacles to accessing social media sites for anyone, a lot of people will tell you yes. The details don't matter. Bashing "social media" is popular here and anything that makes it harder for other people to use is viewed as a good thing.

          What I've found to be more enlightening is to ask people if they'd be willing to accept the same limitations on Hacker News: Would they submit to ID review to prove they aren't a minor just to comment here? Or upvote? Or even access the algorithmic feed of user-generated content and comments? There's a lot of insistence that Hacker News would get an exception or doesn't count as social media under their ideal law, but in practice a site this large with user-generated content would likely need to adhere to the same laws.

          So a better question might be: Would you be willing to submit to ID verification for the sites you participate in, as a fundamentally good thing for protecting minors from bad content on the internet?

        • ryandrake 53 minutes ago
          > The fundamental question that needs answering is: should we actually prevent minors below the age of X from accessing social media site Y?

          This is only an interesting question if we can prevent it. We couldn't prevent minors from smoking, and that was in a world where you had to physically walk into a store to buy cigarettes. The internet is even more anonymous, remote-controlled, and wild-west. What makes us think we can actually age gate the Internet effectively, when even in 1993 "nobody knows you're a dog"[1]?

          1: https://en.wikipedia.org/wiki/On_the_Internet,_nobody_knows_...

        • array_key_first 1 hour ago
          The real solution, IMO, is a second internet. Domain names will be whitelisted, not blacklisted, and you must submit an application to some body or something.
          • anon84873628 1 hour ago
            I agree. There were attempts to do something like this with porn sites via the .xxx TLD I believe, but that inverts the problem. Don't force the public to go to a dark alley for their guilty pleasures. Instead, the sites that want to target kids need to be allowlisted. That is much more practical and palatable.
            • tracker1 1 hour ago
              Yeah... the opposition was just a bad take IMO: "but it will create a virtual red light district," which is EXACTLY what you want online. Unlike in a physical city, you aren't going to accidentally take a wrong turn, and if you're blocking *.xxx then it's even easier to avoid.

              Then require all nudity to be on a .edu, .art or .xxx, problem mostly solved.

              • MarsIronPI 1 hour ago
                > Then require all nudity to be on a .edu, .art or .xxx, problem mostly solved.

                Who's doing the requiring here? Sounds like yet another path to censorship dystopia.

                • tracker1 47 minutes ago
                  In the case of ccTLDs, the respective government... In the case of other TLDs, ICANN.

                  edit: .edu provides for educational content, .art for artistic expression, .xxx for explicit content.

          • jvanderbot 48 minutes ago
            I don't see why phones can't come with a browser that does this. Parents could curate a whitelist like people curate playlists, and share it, and the browser would honor that.

            Combined with some blacklisted apps (e.g., all other browsers), this would be a passable opt-in solution. I'm sure there's either a subscription or a small incentive for someone to build this that hopefully isn't "Scam children".

            It's not like kids are using PCs, and if they use someone else's phone, that's at least a severely limiting factor.

          • goopypoop 1 hour ago
            sounds like an app store
        • immibis 6 minutes ago
          Can we actually prevent children under 16 from buying beer?
        • anal_reactor 1 hour ago
          It's never been about porn. By marking certain part of the internet "adult-only" you imply that the rest is "family-friendly" and parents can feel less bad about themselves leaving their children with iPads rather than actually parenting them, which is exactly what Big Tech wants for obvious reasons. If I had a child I'd rather have it watch porn than Cocomelon, which has been scientifically developed so that it turns your child's brain into seedless raspberry jam. Yet nobody's talking about the dangers of that, because everyone's occupied with <gasp> titties.
      • thewebguyd 46 minutes ago
        > I truly fear the harm that will be done before legislators realize what they’ve created.

        Not defending the legislation, as I overwhelmingly disagree with it, but if I recall correctly, none of the age verification legislation specifies how age must be verified.

        Requiring photos, or photo ID, or any of the other methods being employed, was decided on by the various private companies. All the legislators did was tell everyone "you must verify age." The fault here is on Roblox as much as it is on the legislature, and they should share the blame equally.

        • idopmstuff 38 minutes ago
          How would you suggest they verify age? I am not aware of a good way to do it from a privacy and security perspective.
          • thewebguyd 7 minutes ago
            It doesn't have to be exclusively digital. You can be pseudonymous, using some form of key as verification. To get a key, you have to present your ID in person at, for example, the social security office or local DOL.

            All the key does is attest that "this person is over X years old" with no other identifying information associated with it.

            I think blending in person & digital together is going to be the best way forward. Like going to the store and buying alcohol. I have little privacy risk from the cashier glancing at my ID for a second to check my birth date.

            • idopmstuff 3 minutes ago
              But that would require the government to set up the system that lets you present your ID and get a key. They haven't done that, so it's not valid to blame businesses for not using it.
          • kreco 30 minutes ago
            You can take a look at what Switzerland is about to do:

            https://www.homburger.ch/de/insights/swiss-voters-approve-ne...

            • SoftTalker 14 minutes ago
              Would be very tough to implement in the US, as proposing any sort of "national ID" is pretty much a nonstarter, at least up to this point.

              States could do it, and maybe agree on some protocols so that things like privacy-preserving "age verification" could be done.

              Maybe the feds could push it like they did with speed limits: make federal funding contingent upon adopting e-ID. Would still get a lot of pushback.

              • thewebguyd 4 minutes ago
                The problem with e-ID is that it's focused on identity verification, not just age verification, and that's where the trouble lies.

                We still need the ability to be pseudonymous online. We should be able to verify age without divulging any identifying information to the service requesting age verification.

                An e-ID registry could work on a sort of public/private key system, so long as the service requesting information from the registry receives only a yes-or-no answer to "is this person old enough" and no further information.
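
                As a rough illustration (not any real e-ID API; the names and the choice of Ed25519 are assumptions), the whole exchange can be as small as one signed boolean, sketched here in Python with the cryptography package:

                  import json
                  from cryptography.exceptions import InvalidSignature
                  from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

                  # Registry side: after checking ID out-of-band, sign a bare claim
                  # that contains no name, no birth date, nothing identifying.
                  registry_key = Ed25519PrivateKey.generate()
                  registry_pub = registry_key.public_key()        # published for everyone to verify against
                  claim = json.dumps({"over_18": True}).encode()
                  token = claim + registry_key.sign(claim)        # Ed25519 signature is always 64 bytes

                  # Service side: verify the registry's signature and learn only yes/no.
                  def is_old_enough(token: bytes) -> bool:
                      claim, sig = token[:-64], token[-64:]
                      try:
                          registry_pub.verify(sig, claim)         # raises InvalidSignature if forged
                      except InvalidSignature:
                          return False
                      return bool(json.loads(claim).get("over_18"))

                  print(is_old_enough(token))  # True

                A real system would also need expiry, revocation, and some binding of the token to its holder so it can't simply be copied around; this toy version leaves all of that out.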

      • pfraze 2 hours ago
        I’m sorry to say that a number of US states have instituted age verification laws over the past year
        • pixl97 1 hour ago
          Aka, morality laws mostly.
      • jofla_net 1 hour ago
        I call this slipstreaming; it can even occur during signup. Once bouncing around between many domains and uploading photos is psychologically normalized, havoc can ensue. This is the greater evil.
      • ryandrake 1 hour ago
        I'm optimistic, actually. I think "Gen Alpha" is gonna be alright and sufficiently wary of Internet sharing and privacy, unlike the previous few generations, esp. Millennials and, to a somewhat lesser extent, Gen Z and Boomers, who have massively over-shared and are now reaping some of the horrible harvest that comes from that oversharing. Today's teens and tweens seem to finally be getting the message.

        I also actually think AI might be a savior here. The ability to fake realistic 18+ year old selfies might help put the nail in the coffin of these idiotic "share a photo with the Internet" verification methods.

        • sublinear 6 minutes ago
          I otherwise agree with what you're saying, but I think the ratio of conscientious people has fluctuated over time across all generations. It has more to do with what year it is than how old they are.
    • bigfatkitten 43 minutes ago
      My kids also know their date of birth is 1 January 1970, as far as the internet is concerned.
    • cortesoft 30 minutes ago
      Having to manage my kids' online accounts has been a nightmare. So many different rules, with arbitrary age limits on things that go completely against my own rules for what my kids can do at different ages, and with weird methods for linking or verifying or sharing/transferring purchases. I have gotten so frustrated trying to get accounts set up so we can play together.
    • doctorpangloss 16 minutes ago
      > She said that her and all her friends know not to ever upload a picture of themselves to the Internet (good job, fellow Other Parents!!)

      It's a video game, an aesthetic experience; if uploading a photo of yourself doesn't feel good, it's valid to say it's a bad game, or whatever.

      But by more objective criteria, this photo-upload thing doesn't really matter in the way you're saying. They're uploading photos of themselves to the Internet all the time (what do you think Apple Photos is?). Of course, with kids, I can understand the challenges of making nuanced guidelines, but by that measure it's simpler to just say that playing Roblox is kind of a waste of time, or to suggest better games to play, rather than making it about some feel-good I'm-a-savvy-Internet-user rule. That's what this whole article is about, providing real answers, but who under 18 is going to read the whole thing?

    • turblety 2 hours ago
      There seems to be a big movement (UK specifically) of governments using age gating as an excuse to increase surveillance and online tracking. I don't know where Roblox is based or what its policies are, but it's likely they are just implementing what the government has forced them to do.

      We need to push back against governments that try to restrict the freedom of the internet and educate them on better regulations. Why can't sites dictate the content they provide, and then let device providers offer optional parental controls?

      Governments forcing companies to collect your passport/ID, or pictures/videos of your face, is dangerous. We are going to see a huge increase in fraud and privacy breaches, all while our freedoms and rights online are reduced.

      • phatfish 0 minutes ago
        I see lots of claims about governments using age gating to "track" people, but no evidence. Your last point about uploading ID documents to random online services (which I agree is a privacy risk) would be solved with a government digital ID.

        That is never going to happen it seems, as -- in the UK at least -- people go crazy whenever it is mentioned. Despite "the government" having the ability to track whatever they wanted already, should they care to.

        Age gating discussions always devolve into some fantasy land where people are arguing for children to have access to porn and other inappropriate material, and happily construct some straw man where age gates lead to censorship for everyone.

        If your government wanted to censor the internet they can do it without age gates. As a parent I am happy to have society agree on some basic rules around what children can do online, as there are rules on what children can do in the real world.

        Yes, I know all the come back arguments about how it is my responsibility as a parent. Don't worry, I will be responsible for what my children do online when they are older. But in the end a society raises children, and society should agree a limit on what children can be exposed to online.

      • anon84873628 1 hour ago
        IMO it should not be hard for large services like Roblox and Instagram to get together with device makers to come up with a sensible solution.

        When you create a new profile on Netflix you mark it as "kids" and voila. Devices should have kid profiles with lots of sane defaults. The parent profiles should have thorough monitoring and governance features that are dead simple to use.

        As always it's not perfect but it will go a long way. Just getting a majority of parents on sane defaults will help unknot the broader coordination problems.

    • irusensei 1 hour ago
      I think the way Roblox is doing it right now, separating users into age groups, just makes it easier for predators to find victims.
      • jacobsenscott 29 minutes ago
        Governments and corporations are never interested in protecting children - they don't vote, and they don't have money. So making it "easier for predators to find victims" is not a failure of the policy.
    • next_xibalba 2 hours ago
      [flagged]
      • drnick1 1 hour ago
        Age verification on mainstream porn sites does absolutely zilch against teenagers accessing porn. There are countless other ways of obtaining porn. Even DDG with the safety off will provide plenty of it.
      • pixl97 1 hour ago
        >it might prevent that

        On the global internet... good luck with that.

        Oh, they'll ban us from looking at other countries net's soon enough for our safety.

      • Barrin92 1 hour ago
        >and this seems like it might prevent that

        sorry but we're on the internet. You can type the literal words 'hardcore pornography' into any search engine of your choice and find about fifteen million bootleg porn sites hosted on some micro-nation that don't care about your age verification.

        In fact ironically, this will almost certainly drive people to websites that host anything.

      • polski-g 2 hours ago
        What evidence led you to believe this, when controlling for heritability?
        • gjsman-1000 1 hour ago
          How about that 38% of young women in the UK have experienced asphyxiation; combined with studies showing there is zero safe threshold without brain damage markers in the blood?

          https://www.bbc.com/news/articles/c62zwy0nex0o

          https://www.theguardian.com/society/2025/nov/18/sexually-act...

          https://wecantconsenttothis.uk/blog/2020/12/21/the-horrifyin...

          https://www.nytimes.com/2024/04/12/opinion/choking-teen-sex-...

          https://www.psychologytoday.com/us/blog/consciously-creating...

          https://www.itleftnomarks.com.au/wp-content/uploads/2024/07/...

          Before the widespread adoption of pornography, this rate was near 0%. Now literally a significant minority of women have permanent brain damage induced by widespread pornography, with unknown long-term harms and studies already suggesting increased risk of random stroke decades afterwards.

          • john01dav 1 hour ago
            > combined with studies showing there is zero safe threshold without brain damage markers in the blood?

            Are you saying that there's zero safe threshold of choking, or for viewing porn?

            (To be clear, choking someone without consent is assault and unacceptable, whether a blood test shows damage or not.)

            • gjsman-1000 1 hour ago
              A. There is zero safe threshold for choking.

              B. Choking is inherently, obviously, dangerous.

              C. Pornography has caused choking behaviors among youth to go from negligible to over 38%.

              D. Brain damage is measurable in anyone who has been choked.

              E. As such, pornography does, in fact, have blame for encouraging this kind of experimentation.

              F. If "fighting words" and "misinformation" shouldn't be free speech, who is to say pornography does not incite risk, when other things can?

              • array_key_first 1 hour ago
                The argument I commonly hear of pornography causing more extreme sexual experimentation is a very weak one. I know, for sure, pornography did not cause me to be a homosexual.

                Kinks, BDSM, and what have you, have always existed and will continue to exist. The solution is teaching safe ways to participate, and the importance of consent. A desire to just wipe them out is naive, and will not work.

              • d1sxeyes 1 hour ago
                I have a lot of concerns about your presentation of this.

                A. It’s also true that there is no safe level of alcohol consumption and yet we sort of see experimentation with alcohol as a rite of passage.

                B. I mean, so is walking out your front door. I don’t see this as adding much to point A.

                C. This is a big jump. First, we see more openness about sexual behaviour. While I’m prepared to agree that it has likely gone up, I would not be comfortable with the degree you imply. Second, while I do think it is likely that pornography has indeed contributed to this, pornography has also likely contributed to an increase in experimentation in general, with other sexual behaviours also likely seeing an increase (for example oral/anal sex, water play, etc).

                D. I find this very hard to accept at face value. Do you have studies/evidence to support this claim?

                E. Yes, I would likely agree, although whether “encourages sexual experimentation” is a bad thing or not is a question for further debate.

                F. This conflates some very weird things. “Fighting words” are a specific type of restricted speech (i.e. you can’t go round shouting “I’ll kill you”). Sharing misinformation is broadly not illegal (except in very specific sets of circumstances-fraud, inciting violence, etc.). It’s also broadly speaking not against the law to tell the truth. “Some people like to choke each other during sex” is a true statement, even if it’s harmful.

                Do you support a ban on porn all together? That’s quite a radical view.

              • ulrikrasmussen 53 minutes ago
                I don't believe that choking leads to brain damage in every single individual who has been choked, for whatever duration. If that is the case, then holding your breath should lead to brain damage too, no? You really need to back up that claim with some evidence.
              • stickfigure 1 hour ago
                > Pornography has caused choking behaviors among youth to go from negligible to over 38%.

                That which is asserted without evidence, can be dismissed without evidence.

              • terminalshort 1 hour ago
                Can you explain point A? It seems fundamentally flawed unless there is also brain damage from breath holding, hiking at high altitude, and other normal activities that involve operating at lower oxygen levels.
              • gruez 1 hour ago
                >and "misinformation" shouldn't be free speech

                That worked so well during covid, right?

          • irusensei 1 hour ago
            I'm trying to find the contact for the does-not-imply-causation dept but I think I lost my slashdot account in 2004.
            • gjsman-1000 1 hour ago
              Nobody studying this issue, from the UK government to independent researchers to NGOs, says this anymore. PornHub in legal filings never uses this argument, but instead focuses on rights to expression rather than dispute the claim.

              The causation is clear, documented, proven. Increased exposure to pornography featuring dangerous behaviors causes those dangerous behaviors to be repeated, even when participants are warned of the risk.

              At this point, denial is like saying flat earth has merit.

              • iamnothere 36 minutes ago
                The extreme danger of marijuana and its role as a “gateway drug” was also extensively studied and “proven” by a handful of moralist researchers and groups who had an agenda to push. The highly biased “researchers” who pursued this were often directly funded by the US government.

                And now? This research has been debunked. It’s likely bad for people prone to mental illness, especially when taken regularly and in excess, and even stable people shouldn’t overdo it, much like alcohol. But it’s not going to cause lasting harm to most people.

                Regarding porn, your argument from authority is extremely suspect. Porn is considered morally suspect due to lingering Puritan values, and if there is a research deficit (which I doubt) then it is likely because reputable researchers avoid the topic due to reputational damage. Sex researchers in general have often faced harassment, targeted government inquiries, and threats. So in short, I don’t believe you here.

                I haven’t personally met anyone whose life was negatively affected by porn, except for a couple of people who were in relationships where one partner considered porn to be a form of infidelity. Utterly ridiculous from my perspective.

                Edit: Total bunk. After looking into it, reputable meta-studies have shown no link between porn and sexual violence, ED, or mental health issues. It’s trivial to find this research; search for it if you care.

              • idiotsecant 1 hour ago
                So what? The problem here would be if these activities are nonconsensual. I've seen no evidence of that. If you're just trying to thought police ideas that lead to people doing risky things you better drop your clutching pearls and pick up a pencil cause that's a long list, some of which are probably things you do.
                • irishcoffee 1 hour ago
                  The internet in a nutshell: I’m right and if you don’t agree, you’re wrong. Facts need not apply.
          • stickfigure 1 hour ago
            > Before the widespread adoption of pornography, this rate was near 0%

            Bullshit. Men and women have been dying of autoerotic asphyxiation long before the internet. And we only hear about the ones that fuck up badly enough to make the news.

            I'm puzzled by this phenomenon myself, but there is apparently a significant minority of women who enjoy getting choked in bed:

            https://link.springer.com/article/10.1007/s13178-025-01247-9

            This doesn't excuse people who choke without consent, but there's something going on here waaaay more complex than "see it in porn, do it". Humans are weird.

            • gjsman-1000 1 hour ago
              Nobody is saying that nobody did this before. We are saying now that it is a health crisis, objectively.

              You're the guy saying that 110 MPH speed limits can't be responsible for crashes because people also died at 20 MPH.

              • stickfigure 1 hour ago
                You did in fact just say that nobody did it before - or very strongly implied it based on how sloppy you want to be with the phrase "near 0%".

                Stop pretending you know what that number is.

          • dangus 1 hour ago
            > Before the widespread adoption of pornography, this rate was near 0%.

            Big giant citation needed on that one. How would it ever have been near 0%?

            First, I’d like to point out that we don’t make other media illegal or age gated with privacy-compromising tactics because it depicts harmful things. There’s no age verification gate for watching movies and TV that depict murder and other serious crimes. You can watch Gaston drink beer and fall to his death and the Beast bleed in a kids movie rated G.

            Watching NFL football, boxing, and UFC fighting isn’t illegal even though those sports conclusively cause brain damage.

            Pornography is singled out because it’s taboo and for no other reason. People won’t politically defend it because nobody can publicly admit that they like watching it, even though most people consume it.

            Over 90% of men and over 60% of women consumed it in the last month. [1]

            Second, what I see missing from your links is a really solid, well-studied link between an increase in choking injuries and changes in pornography trends and viewership. Were these kinks just underreported in the past? Heck, I read 4 of your linked articles and none of them actually compared the rate of choking injury over time; they just sort of pointed it out as something that exists and jumped to blaming pornography.

            I am perfectly willing to accept your hypothesis but I don’t think we’ve been anywhere near scientific enough about evaluating it, and even if that was the case, we don’t really treat pornography the same as other media just like I mentioned.

            We need a lot more information. Personally, I think there’s nothing wrong with sexual pleasure and believe it’s stigmatized way too much. I also believe that normalizing sexual pleasure helps people talk about consent and avoids issues like doing a sexual act when you don’t enjoy it.

            [1] https://pubmed.ncbi.nlm.nih.gov/30358432/

    • kevmo 2 hours ago
      I was getting a haircut last week and chatting about our kids with the stylist, who said (basically): "I just started letting my 7 year old on Roblox. I know its full of pedophiles. I told him to come to me or his older brother if anyone tries to talk to him."

      If the million reports of Mark Zuckerberg enabling pedophiles and scam artists haven't made it clear, the executives of these tech companies just don't care. They will sell children into sexual slavery if it improves next quarter's numbers.

      • rhplus 39 minutes ago
        The drip-feed of mindless brain-rot, micro-payments, and cyber-bullying should be much higher up the list of reasons for not letting a 7 year old use Roblox (and YouTube and FaceBook and…)
  • cons0le 3 hours ago
    >If Google can guess your age, you may never even see an age verification screen. Your Google account is typically connected to your YouTube account, so if (like mine) your YouTube account is old enough to vote, you may not need to verify your Google account at all.

    This has been proven false a bunch of times, at least if the 1000s of people complaining online about it are to be believed. My google account is definitely old enough to vote, but I get the verification popup all the time on YouTube.

    I think the truth is, they just want your face. The financial incentive is to get as much data as possible so they can hand it to third parties. I don't believe for a second that these social networks aren't selling both the data and the metadata.

    • xmprt 1 hour ago
      I think the reality is a lot less nefarious. They don't want your face. But they also don't care enough to not take your face. Why would Google spend lobbying and legal money trying to fight this requirement when it doesn't hurt their bottom line? On the other hand, requirements like storing ID cards do hurt their bottom line because they mean:

      1. They need additional security measures to avoid leaking government documents (leaking face photos doesn't hurt them as much).
      2. Not every person has a valid government document.
      3. They need additional customer support staff to verify the age on documents rather than just using some fuzzy machine learning model with "good enough" accuracy.

      The bottom line is that companies are lazy and will do the easiest thing to comply with regulations that don't hurt them.

    • AshamedCaptain 3 hours ago
      My Google account is more than 18 years old and I hit an age prompt when I was trying to watch some FPGA video (out of all things). So no, account age is not necessarily a factor.
      • stonemetal12 2 hours ago
        They probably need to account for parents allowing kids to use their account, so account age can be a factor but not an automatic pass.
      • dlcarrier 2 hours ago
        Field programmable gatorade is an adult-only beverage.
      • inopinatus 2 hours ago
        That makes sense. Golf has a minimum age of 35.
        • pixl97 1 hour ago
          Did you hear they are letting kids play pickleball these days! How scandalous.
      • RobotToaster 2 hours ago
        Can't allow any underage synthesis.
      • raverbashing 2 hours ago
        Yeah, they could/*should* infer your age just by the fact you're watching an FPGA video
        • bluGill 1 hour ago
          I would have watched those at 10 if the internet was a thing when I was 10. I think most people here would have. (I may or may not have understood it, but I would have tried)
    • qweiopqweiop 47 minutes ago
      This comes across as incredibly paranoid. Most places use 3rd party age verification anyway. They're following the law/playing safe with the law in certain countries, and it's just easier to apply it everywhere.
    • blacksmith_tb 2 hours ago
      I agree they want the face data, but I think it's less clear they want to "hand it" (presumably that's really "sell it"?) to third parties. My sense is Google and Apple and Meta are amassing data for their own uses, but I haven't gotten the impression they're very interested in sharing it?
      • llbbdd 2 hours ago
        Sharing it is bad for business; selling insights derived from it for ad placement is the game. Faces definitely contain some useful information for that purpose.
      • testing22321 2 hours ago
        They’ll do whatever makes money.

        Sell it and use it internally.

      • 121789 2 hours ago
        You are correct. Having that data is one of their competitive advantages; it makes no sense to sell it. They will collect as much as possible and monetize it through better ads, but they don't sell it.
    • zahlman 2 hours ago
      I haven't gotten it yet on my account from 2006. Maybe it matters whether it's a brand account? Maybe it matters whether the accounts actually are connected?
      • mythrwy 2 hours ago
        well as long as it's you logging in, they know you are minimum 20 years old!
        • fuzzzerd 27 minutes ago
          As opposed to a child uploading a selfie of an adult on a new account.
    • jama211 2 hours ago
      They definitely already have your face though…
      • ambicapter 2 hours ago
        The more examples in various situations they can get, the higher their accuracy.
      • zahlman 2 hours ago
        From where? Not everyone even puts selfies on the Internet.
        • pixl97 1 hour ago
          Honestly, it's probably already happening, but I would not be surprised if retail stores that check your ID also have cameras snapping your face and selling that to data brokers.

          Whatever you can imagine being bad for privacy, figure that what is actually occurring is far worse.

    • gosub100 2 hours ago
      I just got glasses yesterday and the optician needed to take a pic of my face to "make sure my glasses fit". The first thing I thought of was they are probably selling the data.
      • rolph 1 hour ago
        Just say, "No thank you, I will manage like everyone else has for decades."

        Otherwise, you and your money go elsewhere.

    • SilasX 2 hours ago
      I wrote an April Fool's parody in 2021 that Google is going to get rid of authentication because they're following you around enough to know who you are anyway (modeling it after their No Captcha announcement[1]):

      http://blog.tyrannyofthemouse.com/2021/04/leaked-google-init...

      Edit:

      >I think the truth is, they just want your face.

      I just realized the parody also predicted that part (emphasis added):

      >>In cases where our tracking cookies and other behavioral metrics can't confidently predict who someone is, we will prompt the user for additional information, increasing the number of security checkpoints to confirm who the user really is. For example, you might need to turn on your webcam or upload your operating system's recent logs to give a fuller picture.

      [1] https://security.googleblog.com/2014/12/are-you-robot-introd...

    • shevy-java 2 hours ago
      > I think the truth is, they just want your face.

      Agreed. They treat people as data points and cash cows. This is also one reason why I think Google needs to be disbanded completely. And the laws need to be returned to The People; right now Trump is just the ultimate Mr. Corporation guy ever. Lo and behold, ICE reminds us of a certain merc-like group in a world war (and remember what Mussolini said about fascism: "Fascism should more appropriately be called Corporatism because it is a merger of state and corporate power." - of course in Italian; I don't know the Italian sentence, only the English translation).

  • dakiol 2 hours ago
    I’ve noticed that many people struggle to simply let things go. Take a hypothetical case where HN requires ID verification. I'd just stop using HN, even if that meant giving up checking tech news. Sometimes things end, and that's fine.

    I used to watch good soccer matches on public TV. When services like DAZN appeared, only one major match was available each weekend on public TV. Later, none were free to watch unless you subscribed to a private channel. I didn't want to do that, so I stopped watching soccer. Now I only follow big tournaments like the World cup, which still air on public TV (once every 4 years).

    Sometimes you just have to let things go

    • mystifyingpoi 1 hour ago
      > I’ve noticed that many people struggle to simply let things go

      Because it's not always about their entertainment. I know churches that post info about events only in WhatsApp groups; if you don't use it, you're screwed. I know kindergartens that use Facebook Messenger groups to send announcements to the children's parents; if you don't use it, you will miss important info.

      For most people, letting go of such things is very impractical. One can try to persuade people toward a better way to do something, but then you become the problem.

      • array_key_first 1 hour ago
        People need to be more comfortable being the problem more often. Even if people actually use these solutions, they're almost always suboptimal anyway. We shouldn't be relying on them the way we do.
        • xmprt 1 hour ago
          Or to flip it on its head, be the solution. If a church or some other activity is requiring Whatsapp, then come up with a better alternative that does more than Whatsapp ever could.
          • milkytron 37 minutes ago
            I've tried this. It's hard to get people to switch platforms when they don't perceive any major existing problems with their current platform.

            My neighborhood, whose HOA board I'm on, has been entirely on a Facebook group. When I joined, I made sure that we send all necessary communication via email as well (for others like me not in the group or on FB). I created a website for the neighborhood that does everything the FB group does and more, but people don't see a reason to visit another website when FB has everything they want, so they still only engage on Facebook.

            I'm okay with being the problem (green bubbles are a whole nother thing for friends and family), but without sufficient pressure to switch, people generally prefer what they're comfy with.

          • devilsdata 8 minutes ago
            I guess this means a return to websites.
      • inkcapmushroom 34 minutes ago
        I have a similar problem. I do swing dancing, and all the information for dances in my area is exclusively posted on Facebook by a wide variety of people who are putting on the dances. I could go to each individual organizing a dance and try to get them off Facebook, but that would make their job harder when we've already had lots of people stop organizing events post-COVID, and the system they have now seems to really work for getting new people into dancing who haven't done it before, with lots of new faces at each dance. So I just go along with it.
    • zackmorris 1 hour ago
      Funny, I'm the opposite. Since information wants to be free, and storage/compute get more affordable every year, then really everything ever posted on the web should be mirrored somewhere, like Neocities.

      I grew up in the 80s when office software and desktop publishing were popular. Arguably MS Access, FileMaker and HyperCard were more advanced in some ways than anything today. There was a feeling of self-reliance before the internet that seems to have been lost. To me, there appears to be very little actual logic in most websites, apps and even games. They're all about surveillance capitalism now.

      Now that AI is here, I hope that hobbyists begin openly copying websites and apps. All of them. Use them as templates and to automate building integration tests. Whatever ranking algorithm that HN uses, or at least the part(s) they haven't disclosed, should be straightforward to reverse engineer from the data.

      That plants a little seed in the back of every oligopoly's psyche that ensh@ttification is no longer an option.

      • terminalshort 53 minutes ago
        If "information wants to be free," doesn't that cut both ways? It applies equally to the personal data that I don't want to upload to an age gate as it does to the information that people want to keep behind an age gate.
        • irishcoffee 1 minute ago
          One person's age is a data point.

          Everyone’s age is information.

          Data doesn’t want to be free.

    • layer8 1 hour ago
      Many people don’t struggle to let privacy go.
  • firefoxd 2 hours ago
    My main concern is that there isn't a reliable way to know your information is securely stored[0].

    > A few years ago, I received a letter in the mail addressed to my then-toddler. It was from a company I had never heard of. Apparently, there had been a breach and some customer information had been stolen. They offered a year of credit monitoring and other services. I had to read through every single word in that barrage of text to find out that this was a subcontractor with the hospital where my kids were born. So my kid's information was stolen before he could talk. Interestingly, they didn't send any letter about his twin brother. I'm pretty sure his name was right there next to his brother's in the database.

    > Here was a company that I had no interaction with, that I had never done business with, that somehow managed to lose our private information to criminals. That's the problem with online identity. If I upload my ID online for verification, it has to go through the wires. Once it reaches someone else's server, I can never get it back, and I have no control over what they do with it.

    All those parties are copying and transferring your information, and it's only a matter of time before it leaks.

    [0]: https://idiallo.com/blog/your-id-online-and-offline

    • pixl97 1 hour ago
      Honestly that main concern should be two main concerns.

      You/your kid/your wife goes to hàckernews.com and is prompted for age verification again; evidently, based on the message, the earlier verification has expired. So they submit their details. Oops, that was typosquatting, and now who-knows-who has your information. Good luck.

  • devilsdata 10 minutes ago
    I'm 32 and submitted a photo of myself for age verification on Instagram and Threads. I was promptly banned, with no recourse.

    I do look a little younger than 32, due to a healthy lifestyle and religious use of sunscreen, but I have a beard and moustache. It's a little insane that I was instantly banned with no way to move forward.

  • JoshTriplett 3 hours ago
    I'm surprised that the EFF does not highlight the best option, here: use a VPN to a jurisdiction that doesn't have such ridiculous laws.
    • j-krieger 4 minutes ago
      VPNs are increasingly useless, with Cloudflare in front of 80% of the public net. I always wonder if people giving this advice try it themselves; most major sites are unusable with a common VPN provider.
    • kristjank 3 hours ago
      It might be bad for an activist group to advocate just ignoring the problem into a different jurisdiction.
      • paulddraper 2 hours ago
        They could sell it as "if your IP geolocation is inaccurate, or if the statute does not apply to you."

        But FWIW VPNs can get flagged for suspicious behavior. YMMV

    • hamdingers 2 hours ago
      "Give up" is not the best option. Certainly not from the EFF's perspective.
      • JoshTriplett 1 hour ago
        I mean, the best option is to fight this legislation, and AIUI they're doing that too. But this article is not about that, it's about how to minimize the harm if you encounter it.
    • Retr0id 2 hours ago
      In many cases, using a VPN is a great way to get your account flagged as suspicious.
      • stavros 4 minutes ago
        Then more people need to use a VPN!
      • iamnothere 22 minutes ago
        Care to share more details about this? Which account? What do you mean by “suspicious”? What specific effects does this have?

        I use a VPN 24/7 on one machine. Zero issues even with banking, although sometimes I have to answer CAPTCHAs.

        • j-krieger 3 minutes ago
          Your VPN provider shares their IP lists publicly. For a lot of website owners, blocking those is a simple way of getting rid of 80% of spam.
    • cedws 2 hours ago
      This technique's days are numbered. Once enough countries enact their own age verification laws, tech companies will just make that the global default policy, and I'm sure the opportunity to harvest user data will not be left to waste. Many sites already block and throttle VPNs.

      When that day comes I'll stop casually using the internet or search for the underground alternative.

    • omoikane 2 hours ago
      I think EFF does not recommend for or against VPN in general because it's not always a clear win, depending on the VPN and the use case.

      https://ssd.eff.org/module/choosing-vpn-thats-right-you

    • SoftTalker 2 hours ago
      Next step: the same government that is demanding the age verification will ban VPNs.
      • ohemorange 16 minutes ago
        Yep.

        > For example, in 2025, Wisconsin lawmakers escalated their war on privacy by targeting VPNs in the name of “protecting children” in A.B. 105/S.B. 130. It’s an age verification bill that requires all websites distributing material that could conceivably be deemed “sexual content” to both implement an age verification system and also to block the access of users connected via VPN. Another proposed Michigan bill requires “An internet service provider providing internet service in this state [to] actively monitor and block known circumvention tools.” Circumvention tools being: VPNs.

        https://www.eff.org/pages/vpns-are-not-solution-age-gating-m...

      • JoshTriplett 2 hours ago
        Not especially feasible if you want to support businesses. More likely is trying to demand that VPNs also enforce age verification, which business-targeted VPNs might do, and then ban the ones that don't.
      • pc86 2 hours ago
        Everyone seems to forget that using VPNs to violate your local laws gives lots of good ammo to the authoritarians that want to ban VPNs. The answer isn't to use a VPN to get around it (and thus give fodder to your enemies) but to change the law.
        • SoftTalker 23 minutes ago
          But it's easier to ask a relative few ISPs to block VPNs than it is to police the behavior of millions of individuals.
        • luke727 2 hours ago
          While I agree with this in spirit, here in the UK both major parties along with the public at large generally support these types of laws.
          • JoshTriplett 1 hour ago
            Two of the major parties support it, but it's not entirely obvious how much public support there is; it's not most people's top issue, and it's easy to make polls say what you want depending on the question you ask.

            You'd get different answers if, for instance, you ask "do you want to have to show ID or submit a picture of your face in order to access many sites on the Internet".

            • terminalshort 51 minutes ago
              The entire concept of public support breaks down when the majority of the public doesn't actually know what a VPN is.
          • SoftTalker 21 minutes ago
            I would guess the vast majority of parents support these laws. They are disgusted with the social media platforms who shrug and pretend they are just dumb pipes when it comes to filth, predators, and harmful content, while at the same time keeping users engaged with addictive algorithms and tracking everything every user does and knowing everything about them.
      • Jigsy 1 hour ago
        I doubt this would be workable.

        They could, sadly, however, make it a crime to bypass things like The Online Safety Bill. Downloading or using Tor, for example.

        At that point, the only sane option is to become a criminal.

  • marssaxman 3 hours ago
    I have never clicked "accept" on a cookie banner, as a matter of principle; I zap them away with uBlock Origin. Should the plague of age verification reach my jurisdiction, I'm sure I will handle it in like fashion.
    • RankingMember 2 hours ago
      Zapping only works if the site lets you continue/pull content without verification.
      • marssaxman 2 hours ago
        I expect I'll need to employ some other technical means of circumvention, but the principle of refusing to engage with the thing on its own terms will remain the same.
        • kube-system 2 hours ago
          These things are integrated into the authentication systems of these services. They aren't implemented client side. Refusing to engage with them means you cannot use the service.
          • BanAntiVaxxers 1 hour ago
            Then it wasn't meant to be. Let it go.
            • pixl97 1 hour ago
              Fun and games until your government makes getting access to the internet at all work that way.
            • RankingMember 1 hour ago
              The problem there is when it's inescapable, on every site.
    • antonvs 2 hours ago
      The difference is that the cookie banner is not a gate. uBlock Origin is unlikely to be able to satisfy a website about your age without submitting the info that the site expects. (Assuming the age check has any teeth at all.) You're unlikely to be able to continue as usual if these kinds of measures become ubiquitous.
    • goopypoop 2 hours ago
      ignoring the banner is the same as agreeing to all the opt-out "legitimate interest" shit
  • cloudfudge 2 hours ago
    This makes me wonder if there's a business case for a privacy-preserving identity service which does age verification. Say you have a strong identity provider that you have proven your age to. Just as the 3rd party site could use SSO login from your identity provider, perhaps the identity provider could provide signed evidence to the 3rd party site that asserts "I have verified that this person is age X" but not divulge their identity. Sidestep the privacy issue and just give the 3rd party site what they need to shield them from liability.
    • izacus 1 hour ago
      This is how Swiss e-ID was proposed to work: https://www.eid.admin.ch/en
    • enahs-sf 1 hour ago
      I’ve been noodling on this idea for a while but I think getting commercial acceptance would be hard. People have tried it with crypto albeit with lukewarm results. I think to have the network effects required to be successful in such an endeavor, it would have to come from a vendor like apple or google unfortunately.

      You kind of want an mTLS for the masses with a chain of trust that makes sense.

    • awkward 2 hours ago
      The article does go into this and pays lip service to the idea that a secure third party could attest to age without exposing identity. Ultimately, there's still the problem that even if the point of verification can be done in a zero-trust way, you are still entrusting very sensitive information to a third party, which is subject to data breach.
      • tzs 37 minutes ago
        If you do it right the only sensitive information exposed to the age gated site is that your age is above their threshold.

        The party that actually has to at some point verify who you really are of course has your sensitive information, and there is no obvious way to work around that. However, there is a way to make it so that it doesn't matter.

        That is by making them a party that already has that information. Probably the simplest would be to make it the same government agency that issues your physical identity documents, like passports or driver's licenses. If we don't want it to be a government agency, or we want to have competition, banks would be a possibility.

    • triceratops 1 hour ago
      Yes. In fact the 3rd party doesn't even need to know who you are.

      https://news.ycombinator.com/item?id=46447282

      • cloudfudge 1 hour ago
        That's quite an elaborate system. It goes through a lot of gyrations (not the least of which is inventing a whole new type of crime and passing laws about it) and doesn't sound even as strong as the age verification "required" to buy cigarettes in the US. I'd think "welcome to pornhub. Either log in or do Privacy-enhanced Age Verification by Auth0 (TM)" would be a lot easier to get off the ground.
    • MiddleEndian 1 hour ago
      I'm more interested in a business that reliably provides fraudulent IDs to services that unnecessarily want IDs that I cannot avoid for some reason.
    • dakiol 2 hours ago
      The question is: why would services like Google and others want to use such privacy-preserving identity solutions? They wouldn't gain anything from a non-invasive, user-friendly system, so I don't think they'd use it. They want more data, so they are going for it.
      • SJC_Hacker 12 minutes ago
        > The question is: why would services like Google and others want to use such privacy-preserving identity solutions? They wouldn't gain anything from a non-invasive, user-friendly system, so I don't think they'd use it. They want more data, so they are going for it.

        Consumer pressure and/or laws

      • tzs 15 minutes ago
        Considering that Google is releasing open source software they developed to facilitate such systems [1], apparently they are OK with the idea.

        It could simply be that they realize that online age verification becoming required for some online activities is inevitable, for the same reasons age checks are required for some non-online activities, and when that comes to pass they want to be able to do it in a way that doesn't expose them to too much risk.

        Yes, Google loves data, but that doesn't mean they don't care about risk. The data they would get from some of the age verification methods probably wouldn't improve their ability to advertise much, but would cause a lot of problems if leaked.

        Another possibility is that they have no choice. My understanding is that EU member states that enact online age verification laws will have to require that verification can be done using the privacy-preserving system that the EU Digital Identity Wallet will support. Sites will be able to use other methods too (as long as they don't violate GDPR), so they could support something that gives them more information for advertising, but they will still have to support the privacy-preserving option.

        [1] https://news.ycombinator.com/item?id=44457390

      • cloudfudge 2 hours ago
        I was thinking someone like Auth0 might want to offer it. They are not in the business of invasive user tracking but are in the business of trust.
    • tzs 45 minutes ago
      You've almost got it right. You just need to modify this part:

      > Just as the 3rd party site could use SSO login from your identity provider, perhaps the identity provider could provide signed evidence to the 3rd party site that asserts "I have verified that this person is age X" but not divulge their identity

      The way you compared it to SSO login makes it sound like there would be interaction between the 3rd party site and the identity provider. That's bad, because if someone got hold of the records from both the site and the identity provider they might be able to match access-time logs and figure out who you are.

      A fix is to make it so you get your signed document from the identity provider ahead of time, and that document is not tied to doing age verification with any particular site(s). You get it once and then use it with as many sites as you want.

      When you use it with a site to demonstrate your age, we need to do that in such a way that neither you nor the site has to communicate with the identity provider. If the site needs to verify a signature of the identity provider on something you present, it uses the provider's previously published public key.

      We need to make it so that when you use the signed document from the identity provider to show your age to a site they don't see enough from the document to identify you, even if they have been compromised and are collaborating with the identity provider to try to identify you.

      Finally, the signed document should be bound to you in some way so that you can't just make copies and give them to others or sell them on the black market to people who want to evade age checks.

      BTW, since under this approach the identity provider isn't actively involved after they issue your signed document, what probably makes the most sense is to have your government be the identity provider. In particular, the same agency that issues your driver's license, passport, or national ID (if your country has those).

      Such a system can in fact be built. The EU is including one in their EU Digital Identity Wallet project, which has been in development for several years and is now undergoing large-scale field testing in several countries. It is supposed to be deployed to the public this year or next.

      The first version handles the binding of the document to you by tying it to your smart phone's hardware security element. They plan to later support other types of hardware security elements. 90+% of adults in the EU have smart phones (95-98% for adults under 54), and it is going up, so the first version will already cover most cases.

      Google has published some libraries for implementing a similar system. Both the Google libraries and the EU system are open source.
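
      To make the offline-verification and key-binding parts of that concrete, here is a deliberately simplified sketch (mine, not the actual EU wallet or Google library protocol): the issuer signs an "over 18" claim tied to a key the holder controls, and a site later checks both signatures using only the issuer's previously published public key, never contacting the issuer. Note that this toy version is linkable across sites; the real systems layer selective disclosure or zero-knowledge techniques on top to prevent that.

      ```python
      # Simplified illustration: issuer-signed age claim, bound to a holder key,
      # verified offline by a site. Not the real EU wallet or Google protocol.
      import json

      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives import serialization
      from cryptography.hazmat.primitives.asymmetric.ed25519 import (
          Ed25519PrivateKey,
          Ed25519PublicKey,
      )

      # --- One-time issuance by the identity provider (e.g. a government agency) ---
      issuer_key = Ed25519PrivateKey.generate()
      issuer_pub = issuer_key.public_key()       # published ahead of time

      holder_key = Ed25519PrivateKey.generate()  # in practice: kept in the phone's secure element
      holder_pub_raw = holder_key.public_key().public_bytes(
          serialization.Encoding.Raw, serialization.PublicFormat.Raw
      )

      credential = json.dumps(
          {"claim": "age_over_18", "holder_pub": holder_pub_raw.hex()}
      ).encode()
      credential_sig = issuer_key.sign(credential)  # the issuer is never contacted again

      # --- Later, at an age-gated site (no call back to the issuer) ---
      challenge = b"fresh-random-nonce-from-the-site"
      proof_of_possession = holder_key.sign(challenge)  # only the key holder can produce this

      try:
          issuer_pub.verify(credential_sig, credential)  # claim really came from the issuer
          claim = json.loads(credential)
          holder_pub = Ed25519PublicKey.from_public_bytes(bytes.fromhex(claim["holder_pub"]))
          holder_pub.verify(proof_of_possession, challenge)  # credential is bound to this holder
          print("age check passed:", claim["claim"])
      except InvalidSignature:
          print("age check failed")
      ```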

  • numpad0 1 hour ago
    Isn't age guesstimation by appearance, even with advanced machine learning techniques, even when attempted by a real person making an honest effort, just total snake oil? This ongoing age verification push, with its weird emphasis on generating name-face pairs, is beyond fishy.
  • torcete 2 hours ago
    I thought the article was about finding a job when you reach a certain age, which is my problem.
    • nocman 49 minutes ago
      Yeah, I didn't notice where the article was located at first, and I thought that's what it was going to be about also.
  • drnick1 1 hour ago
    If this is about porn or other content deemed age-sensitive, the moment it becomes difficult to source through "official," mainstream platforms, the content will move underground (P2P networks), making it even more difficult to analyze and regulate. So this is a very shortsighted move.
  • izzydata 2 hours ago
    If my options are upload a picture of myself for Google to monetize through ads or not use Google / Youtube then I will be moving on regardless of the inconvenience to myself.
  • aleksandrm 1 hour ago
    OpenAI uses AI to scan your ChatGPT conversations to determine your age. And even though I've been using ChatGPT for mostly work-related stuff, it has identified me, a man in my 40s, as under 18 and demanded government ID to prove my age. No thank you.
  • Retr0id 2 hours ago
    There were some amusing headlines a while back about Discord's verification being fooled with game screenshots. Does anyone know if that's still the case?
    • everyday7732 2 hours ago
      saw a recent screenshot of someone doing it yesterday, so I think it still is a thing.
  • einpoklum 14 minutes ago
    Either the platform is trying to age-gate anonymously, in which case it is likely you (or your child) can just circumvent that with fake details; or it's some corporation with ongoing access to large government databases, and probably the government can tap the data it collects in some ways, and you (or your child) should probably be worried about being there in the first place.
  • shevy-java 2 hours ago
    States need to stop sniffing for age really. This is age discrimination.
    • kube-system 2 hours ago
      Basically every government on the planet has laws that apply specifically to children. The term "age discrimination" typically refers to disadvantaging someone for being of old age.
  • tracker1 1 hour ago
    I'm honestly a bit mixed on this... I don't think that (especially young) children should have access to explicit, graphic sexual content, especially kink. If you as a parent want your kids to have access, so be it... but then the onus should be on the parent.

    On similar lines, I think that something between an unrestricted smart phone and the classic dumb phone is a market segment that is needed.

  • dlcarrier 2 hours ago
    How well does the selfie test detect AI-generated photos? That seems easy to bypass, especially if you copy the metadata over from a real photo.
    • kube-system 2 hours ago
      The ones I have used do not accept photos, they require real-time video with the front-facing camera and they prompt you to move your head to face different directions on command. Not impossible to attack, I'm certain, but it's tougher than simply uploading a photo.
      • pzo 1 hour ago
        On desktops you can have a virtual camera, and if you can generate video fast enough with AI you can have it edited to follow the verifier's instructions. Definitely tougher, but I'm sure someone will offer services or software to do exactly that.
  • drnick1 1 hour ago
    Switch VPN region or upload a random picture generated by AI, problem solved.
  • irusensei 2 hours ago
    Face scan: download and install Garry's Mod.
  • bloppe 1 hour ago
    Estonia basically got this completely right in 2002 with their e-ID. I'm kinda shocked nobody else has figured it out yet. Age verification could be simple, secure, robust, and require only the disclosure of your age, nothing more.

    Instead, the rest of us have systems that are both far more vulnerable to privacy breaches and far easier to circumvent anyway.

  • neilv 1 hour ago
    > At some point, you may have been faced with the decision yourself: should I continue to use this service if I have to verify my age?

    An excellent question, which I didn't see the article really get into.

    > If you’re given the option of selecting a verification method and are deciding which to use, we recommend considering the following questions for each process allowed by each vendor:

    Their criteria imply a lot of understanding on the part of the user -- regarding how modern Web systems work, widespread industry practices and motivations, how 'privacy policies' are often exceeded and assurances are often not satisfied, how much "audits" should be trusted, etc.

    I'd like to see advice that starts by communicating that the information will almost certainly be leaked and abused, in n different ways, and goes from there.

    > But unless your threat model includes being specifically targeted by a state actor or Private ID, that’s unlikely to be something you need to worry about.

    For the US, this was better advice pre-2025, before the guy who did salutes from the capitol was also an AI bro who then went around hoovering up data from all over government. Followed by a new veritable army and camps being created for domestic action. Paired with a posture from the top that's calling harmless ordinary citizens "terrorists", and taking quite a lot of liberties with power.

    We'll see how that plays out, but giving the old threat model advice, without qualification, might be doing a disservice.

  • miki123211 2 hours ago
    > Even though there’s no way to implement mandated age gates in a way that fully protects speech and privacy rights

    I think the EFF would have more success spreading their message if they didn't outright lie in their blog posts. While cryptographic digital ID schemes have their problems (which they address below), they do fully protect privacy rights. So do extremely simple systems like selling age-verification scratchcards in grocery stores, with the same age restrictions as cigarettes or alcohol.

    • autoexec 1 hour ago
      > So do extremely simple systems like selling age-verification scratchcards in grocery stores

      Which stores sell age-verification scratchcards? How do you make sure they can't be traced back to the person who paid for them or where they were purchased from? How would a website know the person using the card is the same person who paid for them? It may be a simple system, but it still sounds ineffective, dangerous, and unnecessary.

      • triceratops 1 hour ago
        > Which stores sell age-verification scratchcards?

        Stores that sell other age-restricted products.

        > How do you make sure they can't be traced back to the person who paid for them

        How would they be traced? Pay cash. I've never had my ID scanned or recorded when I buy alcohol. And now I look old enough that I don't even have to show ID.

        If someone can trace the store they're bought from and you're that paranoid, rotate between stores. Buy them from a third-party. Drive to another state and buy them there. So many options.

        > How would a website know the person using the card is the same person who paid for them?

        They don't. How does Philip Morris know the person who bought the cigarettes is the same person lighting up? It's clearly not that important when selling actual poisons so why would it matter for accessing a website? The system works well enough to keep most kids from smoking.

        Rate-limit sales in a store (one per visit) and outlaw selling or transferring them to a minor (same penalties as giving alcohol or tobacco to a child). Require websites to implement one code per account policies with a code TTL of 6 months or a year, and identify and disallow account sharing. It's Good Enough verification with nearly perfect anonymity.
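
        As a rough sketch of how a website could enforce the "one code per account, with a TTL" part (all names and policy values here are illustrative, not from any real deployment): codes come from the card issuer, expire after a fixed period, and each code can only ever be bound to a single account.

        ```python
        # Illustrative website-side redemption for an age-verification scratchcard:
        # unknown codes are rejected, codes expire, and a redeemed code stays bound
        # to the first account that used it. Values and names are hypothetical.
        import time

        CODE_TTL_SECONDS = 180 * 24 * 3600  # roughly the six-month TTL suggested above

        issued_codes = {"ABCD-1234": {"issued_at": time.time()}}  # fed in from the card issuer
        redeemed = {}                                             # code -> account that used it

        def redeem(code: str, account_id: str) -> bool:
            record = issued_codes.get(code)
            if record is None:
                return False  # fabricated or never-issued code
            if time.time() - record["issued_at"] > CODE_TTL_SECONDS:
                return False  # expired
            if code in redeemed and redeemed[code] != account_id:
                return False  # already bound to a different account
            redeemed[code] = account_id
            return True
        ```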

        • autoexec 1 hour ago
          > Stores that sell other age-restricted products.

          So far, I've never seen an age verification scratch card sold anywhere

          > How would they be traced?

          Your ID is collected at retail and its barcode scanned along with a barcode on the card; your personal data and the card ID get uploaded to a server operated by the entity that created the cards and/or the state. The ID barcode scan can be replaced by, or used alongside, facial recognition, data collected (directly or passively) from your cell phone, your credit card info, etc. Even just being able to link a used card back to the time and place it was purchased could be enough to ID someone and put them at risk.

          > It's clearly not important when selling actual poisons so why would it matter for social media?

          The main difference is that I can't upload 1 million cigarettes to the internet for anyone of any age to anonymously download and smoke, but I could upload a spreadsheet of 1 million unredeemed scratch off codes to the internet for anyone to use. It seems highly likely that codes would get sold, shared online, generated, or leaked which means cards would be ineffective at keeping children from using them.

          Why should we be okay with jumping through a bunch of hoops that don't even do what they're supposed to in the first place while costing us money and opening ourselves up to new risks in the process? I reject the premise that proving my identity to a website is necessary let alone being worth the costs/risks. Scratch cards seem likely to fail at being private or effective. Of course, "Think of the children" is really only the excuse. Surveillance and control is the real motivation and any system that doesn't meet that goal is doomed to be replaced by one that does.

  • deadbabe 1 hour ago
    It is very easy to lie about age through age gates. I have yet to find one that is actually able to get strong proof of age; fake IDs are easy to upload.
  • jmclnx 2 hours ago
    >should I continue to use this service if I have to verify my age?

    Simple answer: never accept this. If everyone selected "cancel" you can be sure these sites would stop age gating; they want $ more than anything else.

    If a site asks me one question about me, I stop using it.

  • jimbob45 3 hours ago
    Is there a throwaway identity that people are using? A dead person unchecked in Mississippi somewhere? Like every teen in America using the same identity like everyone's extended family does with their uncle's Netflix account?

    I don't want to google it because I don't want to be put on a list, but I also feel somewhat confident that this is being done. Apparently, HN feels like a safe enough place for me to ask questions like that.

    • glitcher 2 hours ago
      > I don't want to google it because I don't want to be put on a list

      Of all the controversial things out there we've become afraid to even google in order to learn more about the world around us, this one strikes me as not all that controversial.

      But you're not wrong, just making a comment about how sad the world has become.

    • bee_rider 2 hours ago
      That’s an interesting question.

      Actually, a follow up. PII leaks are so common, I guess there must be millions of identities out there up for grabs. This makes me wonder: we’ve got various jurisdictions where sites are legally required to verify the age of users. And everybody (including the people running these sites) knows that tons of identities are out there on the internet waiting to be used.

      How does a site do due diligence in this context? I guess just asking for a scan of somebody’s easily fabricated ID shouldn’t be sufficient legal cover…

      • kube-system 2 hours ago
        These ID laws typically require a solution to be "commercially practical" or similar. The standard is not "impenetrable and impossible to circumvent"

        That's why some of them don't even ask for ID but just guess the age based on appearance. That's good enough per the law, usually.

    • everyday7732 2 hours ago
      It would probably flag that multiple people are using the same photo or the same person's name/ID, but I expect you could get away with using the identity of someone known to you. IIRC the reason people are using game screenshots is that they won't match any image the recogniser has seen before. Use Tor for the things you don't want to google and have associated with you.
    • acka 2 hours ago
      Netflix has been checking accounts against public IP addresses and local networks for ages, at least in the Netherlands. If I use my Dad's account, I get flagged as being "not on the same home network" immediately. I think that using a VPN and having Netflix detect it would only make matters worse, like termination of service.
      • reincarnate0x14 2 hours ago
        I gave up on netflix years ago for unrelated reasons but never had any sort of issue both VPNing between various countries and traveling between them. My wife would pretty regularly want to watch netflix as if she was in Japan or the UK and so we'd turn a VPN on for the TV network and their own TV app never complained at all that it was suddenly on a different continent.
    • shiandow 2 hours ago
      Last time I tried I could find a photo ID just with a basic image search. It is an unavoidable consequence of teaching people that scanning an ID is not utterly insane.

      Ironically there was no way to report the image anonymously to the service hosting it.

    • Jblx2 56 minutes ago
      >I don't want to google it because I don't want to be put on a list

      You might think about using something like the Tor Browser for anonymous web surfing:

      https://www.torproject.org/download/

      ...If you are worried about getting on a list by downloading the Tor browser, then take a trip to the next-town-over public library and download it from there. I guess your ISP could still guess that you were using Tor, and you might end up on a list of people using Tor. Also: If everyone is on the list, then no one is on the list.

  • AndyMcConachie 2 hours ago
    Why can't the EFF tell people to lie? Because if you can get away with it, lying is almost always your best option, unless there are actual real-world consequences to lying, like angering the police.

    And maybe consider using a VPN.

    • kube-system 2 hours ago
      I'd imagine it is because several of the obvious options for "lying" here may violate criminal law. And also because the EFF is a civil liberties advocacy group; they want to change the law, not circumvent it.
    • HotGarbage 2 hours ago
      For real. This should be an article about circumvention, not compliance.
      • nottorp 1 hour ago
        That's not EFFs job, just ask your kids how they circumvent age gates for that :)
  • maximgeorge 2 hours ago
    [dead]
  • iLoveOncall 2 hours ago
    What a piss poor article.

    "We disagree with age gates but our recommendation is to comply". Fuck this.

  • mlinster 2 hours ago
    I think that age verification is important. While it's not perfect, it is one tool to help protect kids.
    • benbristow 33 minutes ago
      In an ideal world, parents would be good parents, know what their kids are up to, install parental controls on their digital devices (software solutions out there range from free/bundled to not expensive), have conversations with kids about what's on the internet and what to avoid.

      Government overreach is not the answer, it's a plaster (and an excuse for more surveillance which is arguably the primary factor) over bad parenting. In the UK at least, all major ISPs and mobile providers have a basic parental/adult-content control package that is set-up by default (opt-out by the bill payer). Albeit trivial to get around with a VPN/proxy or changing DNS servers etc.

      Kids will be kids as well. They'll get around restrictions, they're clever, they talk with their mates in the playground about this sort of thing. Especially teens.

    • unglaublich 2 hours ago
      Against what? How much struggle and pain are we actually seeing in the world because children have unrestricted internet access?
    • MiddleEndian 1 hour ago
      I would say that normalizing giving random websites photos of yourself is harmful to children.
    • t-3 1 hour ago
      Think back to when you were a child. Did age verification ever stop you from doing anything? The automated, technologically-implemented age-verification is even less interested in properly verifying anything than the ID-checking bouncers at a bar. None of these things protect kids, they just annoy them and teach them that authority is stupid and lying is a convenient way to deal with stupid people.
    • anthk 2 hours ago
      Call your ISP and ban any NSFW/NSFL access by DNS, both in your children's phones and your home connection. Problem solved.
      • drnick1 1 hour ago
        This does not work; browsers like Firefox don't even always use the system DNS by default.
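
        For example, any DoH-capable client can sidestep a resolver-level block by querying a public DNS-over-HTTPS endpoint directly, which is roughly what Firefox does when DoH is enabled. A tiny illustration against Cloudflare's public DoH JSON endpoint (assumes the requests library):

        ```python
        # Resolve a name over HTTPS instead of the system resolver, bypassing any
        # DNS-level filtering configured at the ISP or home router.
        import requests

        resp = requests.get(
            "https://cloudflare-dns.com/dns-query",
            params={"name": "example.com", "type": "A"},
            headers={"accept": "application/dns-json"},
            timeout=10,
        )
        print(resp.json().get("Answer"))  # records resolved without touching the system DNS
        ```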
        • pixl97 1 hour ago
          Ah, blocking porn from your devices does not work. But age gating porn in your country somehow fixes the fucking global internet....

          Please explain that to me.

          I'm sorry for getting a little steamed here, but I have to wonder if you've put any thought into what you're asking for in the name of kids safety. And worse, if you think it will work globally what are you going to do when Saudi Arabia wants anything they don't like banned in the US, for example.