Original Vintage WWII Poster We'll Take Care of the Rising Sun Inflation - US WWII War Bond ad - 1943 Pacific - eviltoast
    • ZombiFrancis@sh.itjust.works
      2 months ago

      Those are two very different things.

Medical technology advancements aside, the insurance industry, which began to expand from the 1920s onward, gradually supplanted the notion that health care could come without insurance.

      • Corkyskog@sh.itjust.works
        2 months ago

It just wasn’t as necessary… most ailments don’t reveal symptoms until it’s too late. And now there are so many perverse incentives that the whole industry is diseased itself.

      • friend_of_satan@lemmy.world
        2 months ago

Health care has absolutely gotten better. I honestly don’t know about insurance, though, which is why I asked. The two are tightly related, so I included both in my question.