Mind-bending new programming language for GPUs just dropped... - Code Report
  • eveninghere@beehaw.org · edited 6 months ago

    Yeah, and still… the example code on GitHub is also bad. The arithmetic is so tiny that the parallel execution can easily end up slower than a serial one. It gives the impression that the language parallelizes everything it can, in which case execution could get bogged down in parallel sections that aren’t worth parallelizing.
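
    To make that concrete, here’s a rough sketch of the effect I mean (plain Python with multiprocessing, not the language from the video; the names and sizes are just made up for illustration): when each unit of work is this tiny, the cost of farming it out to workers dwarfs the arithmetic itself.

    ```python
    # Hypothetical illustration: trivially small per-item work, so the
    # parallel version typically loses to a plain serial loop because
    # scheduling/IPC overhead dominates the arithmetic.
    import time
    from multiprocessing import Pool

    def square(x):
        # "Work" comparable in size to the tiny arithmetic in the example code.
        return x * x

    if __name__ == "__main__":
        data = list(range(10_000))

        t0 = time.perf_counter()
        serial = [square(x) for x in data]
        t1 = time.perf_counter()

        with Pool() as pool:
            parallel = pool.map(square, data)
        t2 = time.perf_counter()

        assert serial == parallel
        # On a typical machine the parallel timing comes out worse here,
        # which is the whole point: parallelism only pays off when the
        # work per task outweighs the cost of distributing it.
        print(f"serial:   {t1 - t0:.4f}s")
        print(f"parallel: {t2 - t1:.4f}s")
    ```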

    There’s a huge chunk of technical information missing for an expert to work out what’s actually going on. And too many comments here praise the language without saying anything concrete. That makes me REALLY skeptical of this post.

    Edit: there are plenty of posts out there that make up BS for job interviews. I sure hope this isn’t one of those.