A new paper from Apple's artificial intelligence scientists has found that engines based on large language models, such as those from Meta and OpenAI, still lack basic reasoning skills.
AI in general is a shitty term. It’s mostly PR. The term “intelligence” is very fuzzy and difficult to define - especially for people who are not in the field of machine learning.
So for those in ML it’s easier?