It's true. - eviltoast
    • HelixDab2@lemm.ee · 2 months ago

      Calculators also say that dividing by 0 is an error, but logic says that the answer is infinite. (If I recall correctly, it's more accurately 'undefined', but I'm years out of math classes now.)

      That is, as you divide a number by a smaller and smaller number, the quotient increases: 1/.1=10, 1/.01=100, 1/.001=1000, etc. As the denominator approaches 0, the quotient approaches infinity. But infinity isn't a quantity you can pin down as a single number, which is why the result is undefined.
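
      A quick sketch of that sequence (hypothetical snippet, just illustrating the pattern above):

      ```python
      # As the denominator shrinks toward 0, the quotient grows without bound.
      for d in (0.1, 0.01, 0.001, 0.0001):
          print(f"1 / {d} = {1 / d}")

      # At exactly zero, Python refuses, just like a calculator does:
      try:
          1 / 0
      except ZeroDivisionError as e:
          print("1 / 0 ->", e)
      ```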

      If someone who's a mathematician wants to explain this correctly, I'm all ears.

      • teletext@reddthat.com · 2 months ago

        It approaches positive or negative infinity, depending on which side the denominator approaches zero from. A limit can't be two different values at once, so division by zero is left undefined.
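
        You can see the two-sided behavior numerically (hypothetical snippet illustrating the point above):

        ```python
        # Approach zero from both sides: the quotients race off in opposite directions,
        # so there is no single value that 1/0 could sensibly be assigned.
        for d in (0.01, -0.01, 0.0001, -0.0001):
            print(f"1 / {d} = {1 / d}")
        ```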

        There are other reasons, too, but I forgot about them.