TIL Goodhart's Law: "Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes."
  • nodimetotie@lemmy.world (OP)

    Thanks, yes, I saw that one too, but I liked the emphasis on relationships. The shorter version is easier to grasp, but it doesn't explain why this happens. E.g., you observe some relationship (say, between test results and a student's intelligence) and then you make test scores the target. That gives teachers an incentive to teach to the test, which breaks down the relationship between test results and intelligence. Other people here gave great examples of relationships that can fail like this.

    • fubo@lemmy.world

      Imagine an antivirus program that looks at a piece of code and outputs either “Yes, this is malware” or “No, this is not malware.” It is not perfect, but it is pretty good.

      If the malware authors have access to this program, they can test their malware with it. They can keep modifying their malware until it passes the antivirus program.

      Once the antivirus people publish a function AV(code)→boolean, the malware people can use that function to make malware that the function mistakes for non-malware.

      If you publish the exact metric that you promise to use to make a decision, then people who want to control your decision can use that metric to test their methods of manipulating you.
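
      A toy sketch of that oracle effect (Python; av_scan, the signature list, and the mutate step are all made up for illustration, and real obfuscation would preserve the program's behaviour rather than just splitting strings). The attacker never needs to understand the detector's internals; having it available to test against is enough:

        # Hypothetical detector: flags code containing known signature strings.
        # Stands in for any fixed, fully published metric AV(code) -> bool.
        SIGNATURES = ["eval(base64", "CreateRemoteThread", "rm -rf /"]

        def av_scan(code: str) -> bool:
            """Return True if the code is classified as malware."""
            return any(sig in code for sig in SIGNATURES)

        def mutate(code: str) -> str:
            """Stand-in for a semantics-preserving rewrite (packing, renaming,
            string splitting, ...); here it just breaks each flagged substring in two."""
            for sig in SIGNATURES:
                if sig in code:
                    mid = len(sig) // 2
                    code = code.replace(sig, sig[:mid] + "<junk>" + sig[mid:])
            return code

        def evade(code: str, max_tries: int = 100) -> str:
            """Keep mutating until the published detector answers 'not malware'."""
            for _ in range(max_tries):
                if not av_scan(code):
                    break
                code = mutate(code)
            return code

        sample = "payload = eval(base64.b64decode(blob)); CreateRemoteThread(h)"
        print(av_scan(sample))         # True  -> caught by the published metric
        print(av_scan(evade(sample)))  # False -> same intent, metric defeated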

      • nodimetotie@lemmy.world (OP)

        That’s what’s happening with Google and Instagram search algorithms. People figure out how to manipulate them and start spamming. Then the search results deteriorate and you have to modify the algorithm.

        • fubo@lemmy.world

          Partly, yeah. But eventually the fake-news people write a narrative that looks prosodically identical to real news; a bot can't tell that it isn't real, because the bot doesn't interact with the real world, only with text on the web.

          Ultimately, fact-checkers and anti-spam systems have to touch grass too.