It wouldn't be hyperbole to say that *every* ML paper and project of significance prior to 2023 relied on CUDA. ROCm was a nightmare to deal with back then and had very little adoption within academic or industry circles. Custom hardware is definitely the fastest-growing alternative, but a lot has changed in just a few years.
People have lost sight of what LLMs are. They are chatbots. Really decent chatbots.
They work by guessing which words have the best chance of satisfying you, based on the input prompt, their training data, and weights shaped by human feedback during training.
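To put the "guessing" part in concrete terms, here's a toy sketch of the sampling step. The scores are completely made up and stand in for what a real model computes over ~100k tokens from billions of learned weights:

```python
import math
import random

# Invented scores ("logits") for a few candidate next words, given the
# prompt "The capital of France is". A real LLM produces these from its
# learned weights; here they're hard-coded for illustration only.
logits = {" Paris": 9.0, " Lyon": 4.0, " a": 3.0, " not": 1.5}

def softmax(scores):
    # Convert raw scores into probabilities that sum to 1.
    exps = {tok: math.exp(s) for tok, s in scores.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

probs = softmax(logits)
# Pick the next word in proportion to its probability: the "best chance" guess.
next_word = random.choices(list(probs), weights=list(probs.values()), k=1)[0]
print(probs)            # " Paris" dominates
print(repr(next_word))
```

That's the whole trick, repeated one token at a time until the reply is done.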
They're a very useful tool. But surely it's obvious this is not the path to any form of sentience.
ML more generally is even better. It's very useful for iterating over a complex problem with many parameters, such as searching for new drug candidates. But it's not capable of thinking. It can't invent something genuinely out of the box, only derive through iteration. Super useful, but this isn't the Matrix.
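That "iterating over many parameters" point is basically search: score candidates, keep the best, perturb, repeat. A bare-bones sketch with a toy objective (nothing like a real drug-discovery pipeline, where the scoring function would be a learned model or a simulation):

```python
import random

def score(params):
    # Toy objective standing in for "how good is this candidate?".
    target = [0.3, -1.2, 2.5]
    return -sum((p - t) ** 2 for p, t in zip(params, target))

best = [random.uniform(-5, 5) for _ in range(3)]
best_score = score(best)

for _ in range(10_000):
    # Derive a new candidate by nudging the current best (iteration, not invention).
    candidate = [p + random.gauss(0, 0.1) for p in best]
    s = score(candidate)
    if s > best_score:
        best, best_score = candidate, s

print(best, best_score)  # lands near the target purely by iteration
```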
Yeah, I don’t get the narrative of “LLMs don’t work” or “LLMs are underdelivering”.
There are always companies and grifters that overpromise and hype way too much. But LLMs add real value, which is different from the “AI/ML blockchain” crap of the 2010s.
there are tons of applications of transformer-based ML models. the entire digital visual space has been transformed by them
every white-collar job uses them extensively now
every student uses them to cheat and is completely dependent on them