Lugh@futurology.today (mod) to Futurology@futurology.today · English · 1 year ago
Intel says its new Meteor Lake chips will allow users to run some LLM AIs locally on their laptops. (www.reuters.com)
cross-posted to: artificial_intelligence@burggit.moe
Pennomi@lemmy.world · 1 year ago
Weird, because you can run some LLM AIs locally on even your phone. That's not a very impressive claim.
Scratch@sh.itjust.works · 1 year ago
You can add huge numbers! Want to know what a million plus a million is? Intel has the answer!
morrowind@lemmy.ml · 1 year ago
Not very well, though; the idea is to make them more efficient.
[deleted by creator]