Congress Wants to Hand Your Parenting to Big Tech

(eff.org)

59 points | by hn_acker 2 hours ago

5 comments

  • sdoering 1 hour ago
    I am always wondering if initiatives like these are a way to get a system in place that enables governments (by proxy of these platforms) to ensure that any online activity is tied to a government ID.

    Because if you want to use these platforms, you would have to prove your age.

    Then I ask myself whether I am just wearing a tinfoil hat.

    Sadly, nowadays, I am just not sure anymore.

    • V__ 1 hour ago
      I am more and more sure that isn't the case. That would imply long-term planning, strategy, and intelligence, which are obviously missing nowadays.

      It's just bribery, sorry I mean lobbying. Push this through, we make money and will fund your reelection.

    • analog31 1 hour ago
      Let the industry regulate itself until people get angry enough. Keep pumping out addictive, manipulative content that's targeted at kids. Then we can see what the political reaction will be. That's assuming the industry hasn't already blown its chances. If it has, then it can hang on for a few more years by buying favoritism from the regime in power.
    • iLoveOncall 47 minutes ago
      This is obviously the case, and I don't understand why anybody falls for "it's for the children" even for a second.

      They don't give a flying fuck about the children; they want total control over the citizens, because all Western countries are more or less slowly slipping towards authoritarianism.

      Dictatorships in 21st-century first-world countries will be impossible to topple. Once the government can reliably link your ID to your online activity, you'll be arrested before you even know you're going to commit an anti-government act.

    • seneca 32 minutes ago
      I genuinely don't understand how anyone can think it's anything other than governments trying to destroy online anonymity. "Think of the children" is a cliche for a reason.
      • ocdtrekkie 24 minutes ago
        CSAM is not an overstated problem. If anything, the amount of child abuse behavior online is an epidemic. The world's richest man sells a CSAM generator, and the most popular game for kids under 12, Roblox, is besieged by predators.

        Are governments good at regulating technology? Generally no. Is there a real problem that needs to be regulated? Oh my God, yes.

  • skybrian 1 hour ago
    If there were a store selling cigarettes to children, then naturally you'd want the store to stop doing it. It's their responsibility. But they do need information about who they're serving. (Just enough information.)

    Making a website adults-only should be as easy as setting a web server's config parameter. The fact that the industry has taken so long to come up with a decent Internet standard for this is pretty ludicrous. It doesn't have to be perfect. Even a minimal implementation, like requiring an "X-adult: yes" HTTP header from the browser (rough sketch below), would work for a locked-down client like an iPhone.

    Sure, older kids will get around it but that's okay; they probably learned something.
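
    Something like this is all I mean on the server side (the "X-Adult" header name, the port, and the rest are hypothetical, just to illustrate the shape of it):

      # Hypothetical sketch: serve adult content only when the browser explicitly
      # opts in with an "X-Adult: yes" request header. A locked-down client
      # (e.g. a child's iPhone) would simply never send that header.
      from http.server import BaseHTTPRequestHandler, HTTPServer

      class AdultGate(BaseHTTPRequestHandler):
          def do_GET(self):
              if self.headers.get("X-Adult", "").lower() != "yes":
                  self.send_response(403)   # no opt-in header: refuse to serve
                  self.end_headers()
                  self.wfile.write(b"Adults only.\n")
                  return
              self.send_response(200)       # header present: serve as usual
              self.send_header("Content-Type", "text/plain")
              self.end_headers()
              self.wfile.write(b"Adult content would be served here.\n")

      if __name__ == "__main__":
          HTTPServer(("localhost", 8080), AdultGate).serve_forever()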

    • netsharc 21 minutes ago
      Hah, 30 years old: https://en.wikipedia.org/wiki/Platform_for_Internet_Content_...

      It uses HTML meta tags and correct configuration of the browser to block/allow different ratings. I suppose one could use wget, curl, or lynx to bypass that stuff and download the HTML files, and then find the links to the JPEGs in them...

    • hypeatei 1 hour ago
      I don't think it's that simple. Since the header mechanism is easy to bypass, there would be:

      1) software that makes it easy to do for the layman (browser extensions etc.), and

      2) scams and malware that target children offering a "bypass" to access adult websites

      Then parents, teachers, and administrators need to be aware of the latest bypass mechanism, which sends them on a wild goose chase. I think this would end up similar to the Do Not Track header, which ultimately no one cared about or took seriously.

      • goalieca 27 minutes ago
        In a case like this, perfect is the enemy of good.

        A locked down iPhone or Chromebook is going to thwart everyone but the most determined without compromising any privacy.

      • pessimizer 2 minutes ago
        The Do Not Track header didn't die because of an arms race; it died because there wasn't any legislation making it criminal to track people who had explicitly indicated to you that they did not wish to be tracked.

        Kids (especially ones close to the age of legal access anyway) will try, and succeed, at bypassing any sort of restriction on adult content, including any of the digital ID garbage. There are any number of software scams targeting everybody, and your hypothetical would just be another one; I doubt that it would increase the total number of such scams.

        But requiring sites with adult content, by law, to require what would be sort of the opposite of a Do Not Track flag (Let Me In?) would at least mean that kids would have to do something illicit on the client side to access adult websites, something they would have to hide from their parents. If you made sure their phone or Chromebook was nerfed, you could make sure they couldn't install extensions or software that added the flag; you could strip it from their network requests; you could even strip it at the router. You as a parent, and people who have nothing to do with kids, could trivially opt in.

      • idle_zealot 27 minutes ago
        > 1) software that makes it easy to do for the layman (browser extensions etc.), and

        It's already a given that this only works on a locked-down device. Making it a simple binary "is this device owned by a minor" switch means parents will actually be able to understand it.

        > 2) scams and malware that target children offering a "bypass" to access adult websites

        And advertising to children should also be banned, so they won't be exposed to such scams, among other things. Thankfully this header lets the site know if they're breaking the law by showing scam ads, which makes prosecution super easy.

        > I think this would end up similar to the Do Not Track header which ultimately no one cared about or took seriously.

        Oh, of course none of this works unless it has the teeth of law to back it up.

      • gjsman-1000 32 minutes ago
        Also, it already exists. It's called the RTA header, and it was invented by the porn industry decades ago to try to appear as a responsible, self-regulating industry. (Total failure at that.)
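
        A filter only has to look for one well-known string. A rough sketch (the label string is the real RTA value; the function and the idea of checking both a "Rating" response header and the page body are just illustrative assumptions about where a site might expose it):

          # Rough sketch: does a site label itself "Restricted To Adults"?
          import urllib.request

          RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

          def is_rta_labeled(url: str) -> bool:
              with urllib.request.urlopen(url) as resp:
                  # Some sites expose the label as a response header...
                  if RTA_LABEL in resp.headers.get("Rating", ""):
                      return True
                  # ...others put it in an HTML meta tag; a crude substring
                  # check over the first chunk of the page catches that too.
                  head = resp.read(65536).decode("utf-8", errors="replace")
                  return RTA_LABEL in head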
    • Retr0id 14 minutes ago
      If we're requiring a locked-down client, why not have the server advertise the age rating in a header and let the client decide whether it'll display the response or not? That way the server doesn't get to see any age information whatsoever.
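
      A sketch of what I mean, with a made-up "Content-Rating" header name and values (the point is that the age check happens entirely on the client):

        # Hypothetical client-side gate: the server advertises a rating in a
        # response header and the locked-down client decides whether to render it.
        import urllib.request
        from typing import Optional

        def fetch_if_allowed(url: str, viewer_is_minor: bool) -> Optional[bytes]:
            with urllib.request.urlopen(url) as resp:
                rating = resp.headers.get("Content-Rating", "unrated").lower()
                if viewer_is_minor and rating in ("adult", "18+"):
                    return None   # drop the response before it is ever displayed
                return resp.read()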
    • 2OEH8eoCRo0 28 minutes ago
      Do they really need a standard or should they make sites liable for allowing children on?

      There is no standard ID check protocol at liquor stores. If you're old they can just look at ya; some just look at your ID, others scan it. The govt didn't need to provide a standard. Just don't sell to kids. Figure it out! It's not on the govt to figure it out for you!

    • iLoveOncall 49 minutes ago
      > If there were a store selling cigarettes to children, then naturally you'd want the store to stop doing it.

      No, I would want children to know better than to buy cigarettes.

  • lateforwork 1 hour ago
    The problem with "let the parents decide" is that if all other kids in the neighborhood have phones and are on social media then unless you want your kid to grow up with no friends you don't have a choice but to let your kid also use social media.

    The government imposes many basic restrictions to protect children: parents can't give their children drugs, alcohol, porn, guns, etc. Social media definitely fits in this category because it has been shown to cause mental harm.

    • armenarmen 47 minutes ago
      Sure, but is using the full force of the State, in the process tying all online activity to government IDs, really the best alternative to having a harder conversation with little Johnny and Sally?
    • koolba 48 minutes ago
      > The problem with "let the parents decide" is that if all other kids in the neighborhood have phones and are on social media then unless you want your kid to grow up with no friends you don't have a choice but to let your kid also use social media.

      This is why you find a circle of friends and like-minded neighbors who raise their kids in a manner that makes you comfortable. It’s never 1:1, but it doesn’t have to be you against the entire world either. (Though it can certainly feel like that at times.)

    • mattmaroon 44 minutes ago
      Parents actually can legally give their kids alcohol and guns in most states. Porn I’m not sure about. You can’t give anyone drugs, unless they’re legal, in which case you can give them to your kids.
    • seneca 30 minutes ago
      > The problem with "let the parents decide" is that if all other kids in the neighborhood have phones and are on social media then unless you want your kid to grow up with no friends you don't have a choice but to let your kid also use social media.

      Sorry, no, this is just abdicating your responsibility as a parent. "It's hard" isn't an excuse for throwing your hands up and handing your responsibility over to the state.

    • michaelmrose 52 minutes ago
      In the US, parents can mostly give their kids porn, guns, and alcohol at home. Where the drug isn't itself illegal, you are for practical purposes also able to give your kids drugs.

      Being shown to cause harm is also a meaninglessly low standard. Bathtubs, pools, and bikes can cause harm. You would need an actually useful standard. Let's propose one: causes an unacceptable level of harm that cannot be mitigated by less restrictive means.

      I don't buy the argument that you are unacceptably harmed because you aren't capable of denying your kid social media, nor do I buy the idea that social media couldn't be regulated to be less shitty and harmful.

      • lateforwork 21 minutes ago
        Exposing children to pornography is illegal federally and in all states, treated as distribution of obscene material or child exploitation with no parental exemptions. Federal law (18 U.S.C. § 2252) prohibits such exhibition to minors under 18, carrying severe penalties like imprisonment.

        So precedent exists. Social media is at least as harmful as porn.

  • notatoad 1 hour ago
    >Big Tech is somehow both the problem and the solution

    Not sure why they're framing this like it doesn't make sense. Of course the people who've created the problem would be in a position to solve it.

  • altacc 1 hour ago
    The problem with "let the parents decide" is that so many parents take the option of least resistance and currently that's a terrible option. From what I see of my childrens' peers, it's not parents are deciding to let their children run wild on social media, it's that they don't even think about it, they just hand over a phone or tablet, often with their own login, and don't think much about it.

    One way of solving this would be to make everything locked down by default, with effort needed to give the children anything, forcing parents to consider each permission.

    However, I also see that parents are addicted to their own devices and social media, so they don't see the problem.

    • ls612 1 hour ago
      I’m still not convinced that anything about today's social media is fundamentally different from violent video games, which were the supposed evil my parents obsessed over when I was a kid. This is just the “sex, drugs, and rock & roll” for the 21st century’s control freaks.
      • zugi 1 hour ago
        And before sex, drugs, and rock 'n roll it was that sinful Lindy Hop those kids were doing.
      • sylens 36 minutes ago
        You can’t tell the difference between a finite experience like GoldenEye or Doom and an endlessly scrolling, network-connected app like TikTok, optimized to feed you whatever it thinks will keep you scrolling?
        • SpicyLemonZest 10 minutes ago
          I'd encourage you to read some of the stuff that people wrote back then. Some choice quotes from a Senate hearing (https://www.govinfo.gov/content/pkg/CHRG-106shrg78656/pdf/CH...):

          > Kids as young as 3 years old can use mounted guns to shoot people to pieces and watch blood splatter on the screen. Kids get points for killing people. Parents eat pizza while their kids blow somebody up. I have friends who play them. Their eyes look crazy when they play them, and they get excited when the blood splatters and parts of bodies fly.

          > The project is going to continue for a long time, because it is really hard to convince some people about the dangers. Some will not even listen. Some parents do not think it is harmful for a child to make blood splatter and body parts explode. I do not understand why they think it is okay to do this killing.

          > Mortal Kombat series, Mortal Kombat Ultimate—This has joysticks. You use your fists and legs and feet. Bodies explode blood when you hit them. Mortal Kombat Ultimate says on the screen—“There is no Knowledge that is not Power.” Does that mean that if you know how to kill someone, then you will have power?

          It's very hard for me to read commentary on social media and not be reminded of this kind of rhetoric. All of the individual facts are true, it's hard to explain exactly what's wrong, and it's clear that everyone in this hearing passionately believed that disaster was incoming if we didn't take action. Yet I'm very confident that video games do not have the negative effects they thought were obvious.

        • ls612 14 minutes ago
          Obviously the particulars of each generation’s moral panic are different, but the fundamental nature of moral panics remains the same.
      • michaelmrose 46 minutes ago
        I don't think rock and roll taught fundamentally bad values, nor did playing Mario or Doom.

        Social media, by contrast, is practically designed to spread 17 different kinds of poisonous stupidity. So you liked $conspiracy_theory... how about 10 more, 3 of which suggest genocide!

        • verdverm 40 minutes ago
          Disney is worse in some ways: subtle sexual imagery in their cartoons and interpersonal drama in their teen shows. Kids are learning these patterns before they even get to social media.
    • PlatoIsADisease 42 minutes ago
      While I am quite laissez-faire and not sure how much I care about this particular issue, I have seen this mentality around teaching: "It's the parents' fault the kids can't read in college."

      No... They spent 13 years in government school; it is not the parents' fault if they can't read. If we assume it's the parents' job to educate their kids, there will be some 1-5% of kids that fall through the cracks, damning millions of kids to failure.

      For policy that we care about, it is not good enough to have parents decide.

      • bigbadfeline 1 minute ago
        > No... They spent 13 years in government school

        If that school didn't take parents' preferences into account, it would be a farm, not a school.

        > If we assume it's the parents' job to educate their kids

        We should assume it's the school's job to educate kids approximately in alignment with the wishes of their parents.

        > For policy that we care about, it is not good enough to have parents decide.

        "Good enough" for whom? Who is supposed to decide to the exclusion of parents? How such a decision is going to be made? Who is going to be responsible for the inevitable failures which are now called "successes"?

        > "Its the parents fault the kids can't read in college."

        If you understand what I'm trying to say here, you'll know that parents will always get the blame; no other party is willing to accept even the slightest hint of responsibility.