@brucethemoose - eviltoast
  • 8 Posts
  • 867 Comments
Joined 8 months ago
Cake day: March 22nd, 2024

  • NFT artists kind of hide behind the old-school art world to justify it…

    I think it shows how stupid that world is.

    The point of art is to move people, to make them think and feel something by conveying the artist’s thoughts, not to be a store of value. And while I’ve been to museums with gorgeous pieces, if you’re paying millions for a painting, at some point most of that value is the gratification of hoarding it. That same money could buy you an incredible experience in today’s art landscape, but it’s not about the experience, is it? And NFTs are like the perfect deconstruction of that.

    That being said, OP, I am downvoting your post because eyeballs are exactly what crypto bros want, no offense :P

  • It’s even simpler than that: it’s people being told what to think.

    I think “people,” speaking very generally, used to not read a ton of news; they heard stuff through the grapevine, and so on. “Elites” and news junkies had somewhat more monolithic sources.

    And that’s not true anymore. Nearly every “average” person’s life is now dominated by a personalized feed, a podcast, TV, radio, a chatroom, whatever, and that feed is having an outsized influence compared to their own observations of reality.

    It’s my belief that there’s basically nothing Biden could have done to alter this (other than regulating algorithms, and it’s far too late for that), and ultimately it’s the DNC’s fault for “taking the high road” and not playing the propaganda game.

  • How useful would the training data be?

    Open datasets are getting much better (Tulu, as an instruct dataset/recipe, is a great example), but it’s clear the giants still have “secret sauce” that gives them at least a small edge over open datasets.

    There also seems to be some vindication of using massively multilingual datasets, as the hybrid Chinese/English models are turning out very good.

  • It turns out these clusters are being used very inefficiently, seeing how Qwen 2.5 was trained with a fraction of the GPUs yet is clobbering models from much larger clusters.

    One could say Facebook, OpenAI, X, and such are “hoarding” H100s, but they are not pressured to utilize them efficiently since they are so GPU-unconstrained.

    Google is an interesting case, as Gemini is getting better quickly, but they presumably use much more efficient/cheap TPUs to train.

  • You are spouting assertions about what sounds right while ignoring bodies of scientific evidence contradicting your viewpoint. Just because Lemmy users are unwilling to spend 20-30 minutes digging through arXiv to refute you doesn’t make you right.

    You don’t want to go look up and analyze the evidence that fluoride in the water supply is beneficial; you want to assert the hypothesis you’ve formed as likely truth, without evidence or research into related work. And I can confidently say this because the experts who spend their lives reading and writing papers on this very topic are the ones qualified to make these assertions.

    …This “sounds right” line of thinking has been the bane of civilization for eons. You aren’t breaking up some scientific fallacy like the church believing the Earth is the center of the universe; you are perpetuating one.