Why did people in the 90s/early 00s say that the internet "couldn't be taken down"?

Or am I the only one remembering this opinion? I felt like it was common for people to say that the internet couldn’t be taken down, or censored, or whatever. This has obviously been proven false by the Great Firewall of China and Russia’s latest attempts to disconnect completely from the global internet. Where did this idea come from?

  • Nyxicas@kbin.melroy.org · 2 days ago

    I think it’s mostly because of how rapidly the internet became more accessible. It seemed inevitable how big it’d become.

    And the opinion then changed from that to “The internet never forgets”, which was more of a mid-2000s and early-2010s thing. That one is 50/50, because it really depends. Some sites shut down for good, so if anything or anyone was only on them, we can safely say the internet forgot. But the saying mostly applies to whenever someone becomes a lolcow or does something so stupid online that it’s everywhere. Hence the internet communities not forgetting.

  • jet@hackertalks.com · 4 days ago

    The internet was originally designed to withstand nuclear war, so that a functioning military network could coordinate a retaliation quickly.

    The network protocols themselves are self-healing: they route around failures and are very resilient.
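
    A toy sketch of that “routing around failures” idea (not a real routing protocol like BGP, just a made-up five-node mesh): knock a node out and traffic still finds another path, until every path is gone.

    ```python
    # Toy illustration, not a real routing protocol: a small mesh where
    # traffic can still reach its destination after a node is knocked out.
    from collections import deque

    links = {
        "A": {"B", "C"},
        "B": {"A", "D"},
        "C": {"A", "D"},
        "D": {"B", "C", "E"},
        "E": {"D"},
    }

    def find_path(src, dst, down=frozenset()):
        """Breadth-first search that ignores any 'down' nodes."""
        queue = deque([[src]])
        seen = {src}
        while queue:
            path = queue.popleft()
            node = path[-1]
            if node == dst:
                return path
            for nxt in links[node] - set(down):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(path + [nxt])
        return None  # no route left: that part of the network is cut off

    print(find_path("A", "E"))                   # ['A', 'B', 'D', 'E']
    print(find_path("A", "E", down={"B"}))       # ['A', 'C', 'D', 'E'] - routed around B
    print(find_path("A", "E", down={"B", "C"}))  # None - A is fully isolated
    ```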

    The internet itself, even today, is incredibly difficult to destroy. It is nearly impossible to take it down.

    However, the internet that most people think of as the internet (Facebook, Google, etc.) consists of centralized services that are trivial to take down.

    Decentralized and peer-to-peer protocols like email and torrents are also nearly impossible to take down.

    The examples of Russia and China isolating themselves are different: that’s the network’s own operators isolating it, not a third party trying to destroy it.

  • Captain Aggravated@sh.itjust.works · 3 days ago

    The basic building blocks of the internet were designed by DARPA with that military mentality of “If the ruskies nuke any part of our infrastructure, the rest of it should keep running.” You can chop large parts of the internet off and those parts stop working, but the rest of it keeps going. Here’s an extreme example: I can unplug my cable modem and disconnect my house from the internet completely, yet I can still access the web pages hosted by my switch, Wi-Fi router and NAS through my local area network.
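
    A rough way to see this for yourself, sketched in Python (the 192.168.1.10 NAS address is made up; substitute whatever your own gear uses): with the modem unplugged, the LAN device still answers while anything outside the house fails.

    ```python
    # Quick reachability check: LAN device vs. the outside world.
    import socket

    def reachable(host, port=80, timeout=3):
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print("NAS on the LAN :", reachable("192.168.1.10"))  # hypothetical local address
    print("Outside world  :", reachable("example.com"))   # fails with the modem unplugged
    ```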

    • Phoenixz@lemmy.ca · 3 days ago

      Mind you, a lot of that no longer works.

      In the past, traffic could be routed over whatever was available. If one node went down, the traffic would go over another.

      Now we have a few very fast backbones, and if even one goes down, bye bye internet.

      What you have cached locally or on your own network doesn’t count, because that’s only what you’ve already seen.

  • TrickDacy@lemmy.world · 4 days ago

    I’m confused. You’re citing a country’s actions against its own internet as evidence that the internet can be taken down?

    That’s like saying me disconnecting my microwave proves that I can take down the power grid.

  • BananaTrifleViolin@lemmy.world · 4 days ago

    This opinion remains largely correct - the Internet as a network is very difficult to take down.

    However, things have happened that have undermined the Internet in favour of commercial priorities.

    Net Neutrality was a major principle of the Internet, but it is under attack, particularly in the US, where infrastructure providers want to maximise profit by tying their income to each GB used rather than being paid as a utility. Their costs are largely fixed in infrastructure, but they push the lie that they need to be paid according to how busy that infrastructure is. A network router doesn’t care whether it’s transferring 1 GB or 10 GB; it only matters if you hit capacity and the network needs to be expanded. The providers instead want profit, profit, profit, so they are pushing for ways to maximise it.

    The other major issue has been consolidation, and that’s thanks to monopolies being allowed to form and dictate how the Internet works. Google, Facebook, Microsoft, Amazon and Apple - they’ve all used their services to try to manipulate customers into their walled gardens and prevent competition.

    So the Internet as many people think of it is very vulnerable - big centralised services can have outages that affect everyone because people don’t have much choice.

    But the reality is that the underlying protocols and infrastructure remain robust. Google might have an outage, but the Web itself is still functional. Email protocols and file transfer protocols still work. The problem is that people sitting in Google’s walled garden of services are locked out of everything. And with Google’s huge monopoly on search and advertising, it means lots of other major services are out too.

    So the Internet itself is fine. It’s the services and monopolies built on it that are the problem.

    • Swordgeek@lemmy.ca · 3 days ago

      Net Neutrality was a major principle of the Internet but that is under attack, particularly in the US[…]

      It’s not really just the US, though. Every nation, every corporation, every venal special interest group is fighting against net neutrality.

    • subtext@lemmy.world · 3 days ago

      To expand upon “walled gardens”: the customers are not just you and me; they include the majority of the Internet, since it’s all running on the cloud, a.k.a. AWS, MS Azure, and Google Cloud.

  • xmunk@sh.itjust.works · 4 days ago

    The internet couldn’t and still can’t be taken down - but countries can certainly restrict it within their locale (though it is insanely difficult).

    The opinion is that the internet, as a concept and a set of protocols, was and is too widespread to ever fully dismantle, and that one dude with a mission can capture and preserve an immense amount of data.

    That’s all still true, but it doesn’t hold for the social media walled gardens that have come to control a huge proportion of communication.

  • Swordgeek@lemmy.ca · 3 days ago

    “The internet sees censorship as damage and routes around it.”

    From a very primitive perspective, this is true. Many of the infrastructure protocols (DNS, BGP, etc.) that the internet sits on are designed to be resilient and fault tolerant. Block access to a DNS server, and the system will find another one. Usually. Depending on circumstances.

    Firewalling an entire country is incredibly difficult. From a technical point of view, the GFoC is only modestly successful. It blocks casual and accidental access to the ‘outside world’ just fine, but for the determined operator there are absolutely ways around it - VPNs, cellular networks, satellite relays, you name it.

    But do you want to risk having the police show up at your door with orders to kill on sight?

    This is fundamentally no different than content filtering in a typical office. From my work computer, I can’t get to porn sites. If I really wanted to, I could find a way - but the odds are pretty good that HR would be at my desk with termination papers and a security escort out of the building.

  • Nollij@sopuli.xyz · 4 days ago

    A 1993 Time Magazine article quotes computer scientist John Gilmore, one of the founders of the Electronic Frontier Foundation, as saying “The Net interprets censorship as damage and routes around it.”[7]

    That applied a whole lot more when most connections were using a phone line, and a decent-sized city could have hundreds of ISPs. But part of the design of a redundant mesh network is that there are tons of different paths to any destination. Cutting any of those links would simply force traffic to other routes.

    The early Internet was decentralized in other ways, too. Rather than flock to corporate platforms like Facebook, people spent a lot of time on federated and independent platforms. This included Usenet, IRC, and BBSes. If the feds, lawyers, etc. managed to take one down, a dozen more could spring up overnight. The barrier to entry was so small, and many were run by hobbyists.

    It’s somewhat true today. There are countless Lemmy instances that are completely independent. The Pirate Bay famously references the Hydra, and it applies to their peers as well. But these are limited in scope.

    Xitter has shown us just how quickly and thoroughly a platform can collapse through hostile admins, and how slowly people will reject it.

    • FourPacketsOfPeanuts@lemmy.world · 4 days ago

      I moan about it regularly but this…

      Rather than flock to corporate platforms like Facebook, people spent a lot of time on federated and independent platforms. This included Usenet, IRC, and BBSes

      is just tragic, isn’t it? We really had it: a global free flow of hobbies, interests, research, debate, exploration.

      I don’t know what’s so fundamentally flawed about human nature that a) something that started as well as Facebook did gets enshittified to the extent that it has, and b) people flock to it like flies round a steaming turd.

      • Swordgeek@lemmy.ca · 3 days ago

        The answer to your second point is simple.

        Meta’s properties (FB, Insta) have something that most other social networks are lacking: A network of real-world family and friends.

        Twitter, Reddit, Mastodon, Lemmy, Tiktok, and the rest all tend to have communities built from the platform’s population, based on shared interests. Meanwhile, FB is the platform that you use to connect with your oddball uncle and high school friends from way back. That’s the sunk cost that makes it so much harder to leave than the strangers on reddit who share your love of lime jello.

      • Monkey With A Shell@lemmy.socdojo.com · 4 days ago

        That’s a big part of the appeal of the fediverse for me. Setting up a personal site used to be fairly easy, but it was largely isolated and unidirectional. With the ActivityPub protocol, and frankly a lot of self-hostable apps in general these days, you can make something to converse with the whole globe, and you don’t even need to make a big effort to help people find it.
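
        That “help people find it” part mostly rides on WebFinger, which ActivityPub servers use for account discovery. A rough sketch (the handle below is made up; point it at a real account on a real instance):

        ```python
        # Rough sketch of fediverse account discovery via WebFinger.
        # Any server can look up any account this way, which is what makes
        # even a tiny self-hosted instance findable by the rest of the network.
        import json
        import urllib.parse
        import urllib.request

        def webfinger(handle):
            """handle looks like 'user@example.social' (no leading @)."""
            host = handle.split("@", 1)[1]
            query = urllib.parse.urlencode({"resource": f"acct:{handle}"})
            url = f"https://{host}/.well-known/webfinger?{query}"
            with urllib.request.urlopen(url) as resp:
                data = json.load(resp)
            # The 'self' link points at the ActivityPub actor document.
            for link in data.get("links", []):
                if link.get("rel") == "self":
                    return link.get("href")
            return None

        print(webfinger("someone@example.social"))  # hypothetical account
        ```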

        Webrings still exist, but finding them is less than trivial when they get drowned out by the noise of corporate sites. I’ve used IRC within the last year, but I had to look up the proper NickServ commands. The old web mentality is still out there, but for the most part people want simplicity. Few want to go through the learning curve of the more esoteric parts when they can just auth into a site and do a thing.

      • Dagwood222@lemm.ee · 4 days ago

        It’s a truism in writing; villains act and heroes react. If someone looks at the internet and sees a way to exploit it they will. They don’t care that it’s working fine for everyone else; they want the money.

    • Kelly@lemmy.world · 4 days ago

      “The Net interprets censorship as damage and routes around it.”

      As an example of this, one of the easiest and most performant methods a nation has of blocking a website is dictating which DNS records its ISPs return for domains.

      This has the advantages that it doesn’t require traffic inspection and doesn’t slow traffic at all.

      But it has the disadvantage of having an all-or-nothing effect on the domain, e.g. it can’t be used to block specific pages.

      It can also be bypassed by simply using an international DNS server. There are people bypassing this kind of censorship without even knowing they are doing so.
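
      A rough sketch of what that bypass looks like, using the third-party dnspython package (example.com is just a stand-in domain; 9.9.9.9 is Quad9’s public resolver): ask the system resolver and a public one for the same name and compare what comes back.

      ```python
      # Compare the answer from the default (ISP-assigned) resolver with the
      # answer from a public resolver. A DNS-blocked domain will typically
      # fail or return a bogus address on the ISP side only.
      import dns.exception
      import dns.resolver  # pip install dnspython

      def lookup(domain, nameserver=None):
          resolver = dns.resolver.Resolver()
          if nameserver:
              resolver.nameservers = [nameserver]
          try:
              return sorted(r.to_text() for r in resolver.resolve(domain, "A"))
          except dns.exception.DNSException as exc:
              return f"lookup failed: {exc}"

      domain = "example.com"  # stand-in; use a domain you suspect is filtered
      print("ISP / system resolver:", lookup(domain))
      print("Public resolver      :", lookup(domain, "9.9.9.9"))
      ```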

  • DarkThoughts@fedia.io · 3 days ago

    If Russia disconnects, or gets disconnected, then it’s just not part of the internet anymore. The internet itself will continue to exist though - and would probably be quite a bit better.

  • neidu3@sh.itjust.works · 4 days ago

    Because back then it was a robust network with myriad websites, not just those four websites linking to each other. Also, they weren’t all dependent on AdSense or Akamai to function.

  • HubertManne@moist.catsweat.com · 3 days ago

    Because it’s decentralized, so you can take a part of it out but not the whole thing. Unfortunately, in some ways it’s become centralized.

  • Maggoty@lemmy.world · 3 days ago

    Those countries are controlling access at the very few origin points, and they can still be foiled by tunnels, VPNs, and encryption. The only counter is to actually cut the network at that origin point. But that still leaves a country-sized internet that’s very resilient. Could they start isolating cities? Depends on their infrastructure. I know the mid-sized town I lived in could be shut down with one cable. (Because road construction hit it at least once a year, and 80,000 people lost internet for a couple of days each time.)

    When it was first envisioned, it was supposed to be an actual web, with multiple points of contact at each place. Instead we’ve consistently done the bare minimum to bring the Internet to each place, meaning in many places there’s only one connection. For an international look at connection points, there are undersea cable maps. It becomes clear quite quickly how easy it is to isolate a single country’s web.

  • Ziggurat@sh.itjust.works · 4 days ago

    A big change between the internet of the 90s/00s and today is that we don’t really have that internet where “all computers are equal” anymore; we have a dozen massive sites like Facebook, Google, Reddit and TikTok, and it’s relatively easy to close one of these.

    In the 90s, a judge could ask an ISP to take down someone’s homepage without impacting the whole internet.

      • Daemon Silverstein@thelemmy.club · 3 days ago

        Nowadays, there are some efforts to try to bring back the good old web:

        • Neocities: it tries to mimic GeoCities, so people can host HTML+JS+CSS sites that are meant to be static
        • Geminispace capsule hosting services: similar, but without CSS or JS; it goes even further in trying to return to the grand old web of the Mosaic browser era, as it’s highly content-focused.

  • NutWrench@lemmy.world · 4 days ago

    Authoritarian regimes must control the flow of information if they want to continue to exist. Just because they can disconnect themselves from the rest of the world doesn’t mean they’ve “taken down the Internet.”

  • MudMan@fedia.io · 4 days ago

    There is a lot of confused misinterpretation in this thread. “Can’t be taken down” was a thing, but it was about how you can lose big chunks of the network and have the rest of it still work. That was misinterpreted at the time, in fairness, and it’s even less true now, when centralization in both the infrastructure and the hosting means a lot of things drop at once and get disrupted, but it’s still technically true. Ukrainian drones are out there beaming up to satellite internet and being used in active warfare in the middle of a battlefield. Which, hey, in that context, robust military communication was the original intent of the network to begin with. Given that the previous baseline was the wired telephone, the characterization isn’t that far off.

    Censorship is different, but also true. You can isolate a chunk of the Internet, and once you’ve done that, if you have very centralized control, you can monitor it, but that’s a high bar. And of course, outside of those cases, people struggle to limit communications they don’t want, from nazi chatter to piracy.

    At the time I used to think that was a good thing; now… yeah, harder to justify. Turns out “free information” didn’t automatically make everyone smarter. I have lots of apologies to give to the teachers and professors of communication theory who were trying to explain this to us in the 90s while we were all “nah, man, their only crime was curiosity, hack the planet, free the information” and all that.