How does Lemmy feel about "open source" machine learning, akin to the Fediverse vs Social Media? - eviltoast

Obviously there’s not a lot of love for OpenAI and other corporate API generative AI here, but how does the community feel about self hosted models? Especially stuff like the Linux Foundation’s Open Model Initiative?

I feel like a lot of people just don’t know there are Apache/CC-BY-NC licensed “AI” they can run on sane desktops, right now, that are incredible. I’m thinking of the most recent Command-R, specifically. I can run it on one GPU, and it blows expensive API models away, and it’s mine to use.
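As a back-of-the-envelope sketch of why a ~35B model like Command-R fits on a single GPU (the 24 GiB card and quantization bit-widths here are my illustrative assumptions, not figures from the post):

```python
# Rough VRAM needed for the model weights alone at common quantization
# widths (KV cache and activations add more; numbers are illustrative).
PARAMS = 35e9  # Command-R is a ~35B-parameter model

def weight_gib(bits_per_param: float) -> float:
    """Memory in GiB for the weights at a given quantization width."""
    return PARAMS * bits_per_param / 8 / 1024**3

for name, bits in [("fp16", 16), ("8-bit", 8), ("4-bit", 4)]:
    verdict = "fits" if weight_gib(bits) < 24 else "does not fit"
    print(f"{name}: {weight_gib(bits):.1f} GiB -> {verdict} on a 24 GiB GPU")
```

At 4-bit quantization the weights come to roughly 16 GiB, which is what makes a single consumer GPU workable; full fp16 would need about 65 GiB.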

And there are efforts to kill the power cost of inference and training with stuff like matrix-multiplication-free models, open source and legally licensed datasets, cheap training… and OpenAI and such want to shut down all of this because it breaks their monopoly, where they can just outspend everyone scaling, stealing data and destroying the planet. And it’s actually a threat to them.

Again, I feel like corporate social media vs the Fediverse is a good analogy, where one is kinda destroying the planet and the other, while still niche, problematic and a WIP, kills a lot of the downsides.

  • brucethemoose@lemmy.worldOP
    4 months ago

    I dunno, I keep a 35B open on my desktop all day just to bounce ideas off it, ask it stuff, easy queries, like an instant personal assistant.

    And the feel is totally different when it’s yours. Long context responses on huge documents are instant because it’s cached, and I can repeat queries over and over again without any worry. I can dig in and mess with the system prompt, even the manual formatting, in ways that API models just don’t like. I can finetune smaller models for styles, though I don’t do this a ton. And I don’t feel weird about sending certain things over the internet to be datamined.

    The visual media models tend to be more for crude entertainment, yeah.

    Matmul-free LLMs are theoretically incredibly power efficient, if accelerators for them ever come out.
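    A toy sketch of the idea behind matmul-free models (in the spirit of ternary-weight work like BitNet; this code is my own illustration, not from the post): when every weight is constrained to −1, 0, or +1, a "matrix multiply" reduces to additions and subtractions, which is why dedicated accelerators could be so power efficient.

    ```python
    from typing import List

    def ternary_matvec(W: List[List[int]], x: List[float]) -> List[float]:
        """Matrix-vector product where every weight is -1, 0, or +1.
        No multiplications: each weight selects add, skip, or subtract."""
        out = []
        for row in W:
            acc = 0.0
            for w, xi in zip(row, x):
                if w == 1:
                    acc += xi      # +1: add the input
                elif w == -1:
                    acc -= xi      # -1: subtract the input
                # 0: skip entirely (sparsity for free)
            out.append(acc)
        return out

    W = [[1, 0, -1],
         [-1, 1, 1]]
    x = [2.0, 3.0, 5.0]
    print(ternary_matvec(W, x))  # [2-5, -2+3+5] = [-3.0, 6.0]
    ```

    Multipliers dominate the energy budget of conventional matmul hardware, so swapping them for adders (and skipping zeros outright) is where the claimed savings come from.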