AMD Wants To Know If You'd Like Ryzen AI Support On Linux

Please leave a comment or react with an emoji there.

(IMO, they should’ve limited comments and gone with reaction counts there; it looks like a mess right now.)

  • Tibert@jlai.lu · 1 year ago (edited)

    It requires powerful GPUs, yes, but not always; it depends a lot on how fast you want it to run. Microsoft and OpenAI need powerful AI GPUs because they handle a huge number of requests and a lot of data, and they want it to go fast. The model may also need to be kept in RAM or GPU memory for fast access while the AI is running.
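    As a rough back-of-the-envelope illustration of why that matters (my own numbers, not from the comment): the memory needed just to hold a model's weights is roughly the parameter count times the bytes per parameter, which is why quantization makes such a difference on consumer hardware.

    ```python
    # Rough estimate of the memory needed to hold model weights alone,
    # ignoring activations and the KV cache. The 7B parameter count is
    # illustrative (the smallest LLaMA model), not a measured figure.

    def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
        """Return the approximate size of the weights in gigabytes."""
        return num_params * bytes_per_param / 1e9

    params_7b = 7e9  # a 7-billion-parameter model

    print(f"fp16 : {weight_memory_gb(params_7b, 2.0):.1f} GB")  # ~14 GB
    print(f"8-bit: {weight_memory_gb(params_7b, 1.0):.1f} GB")  # ~7 GB
    print(f"4-bit: {weight_memory_gb(params_7b, 0.5):.1f} GB")  # ~3.5 GB
    ```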

    Llama, on the other hand, has been released as open source. And what is amazing about open source is the community: an implementation of Llama written entirely in C++ already exists, https://github.com/ggerganov/llama.cpp .

    And someone even managed to make it run fast enough on a phone with 8 GB of available RAM: https://github.com/ggerganov/llama.cpp/discussions/750 . Though with a smaller model.
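    For anyone who wants to try this without touching the C++ code directly, here is a minimal sketch using the llama-cpp-python bindings around llama.cpp. The model path is a placeholder for whichever quantized model file you have downloaded, and the exact parameters may differ between versions of the bindings.

    ```python
    # Minimal sketch with llama-cpp-python (pip install llama-cpp-python).
    # The model path is a placeholder; point it at a quantized model file
    # you have downloaded yourself.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./models/7B/ggml-model-q4_0.gguf",  # placeholder path to 4-bit weights
        n_ctx=2048,  # context window size
    )

    output = llm(
        "Q: What is the capital of France? A:",
        max_tokens=32,        # keep the test generation short
        stop=["Q:", "\n"],    # stop before the model starts a new question
        echo=False,
    )

    print(output["choices"][0]["text"].strip())
    ```

    A 4-bit quantized 7B model is the usual starting point on CPU-only or low-memory hardware, which lines up with the ~3.5 GB estimate above.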