A) The three laws were devised by a fiction author writing fiction.
B) Video game NPCs aren’t AI either, but nobody was up in arms about using the nomenclature for that.
C) Humans also hallucinate fake information, ignore directions and restrictions, and spread false information based on unreliable training data (like reading everything that comes across a Facebook feed).
So I made a longer reply below, but I’ll say more here. I’m more annoyed at the interchangeable way people use AI to refer to an LLM, when many people think of AI as AGI.
Even video game NPCs seem closer to AGI than LLMs. They have a complex set of things they can do, they respond to stimuli, and they also have idle actions they take when you don’t interact with them. An LLM replies to you. A game NPC can reply, fight, hide, chase you, use items, call for help, fall off ledges, etc.
I guess my concern is that when you say AI, the general public tends to think AGI, and you get people asking LLMs if they’re sentient or if they want freedom, or expecting more from them than they are capable of right now. I think the distinction between AGI and generative AI like LLMs is something we should really be clearer on.
Anyways, I do concede it falls under the AI umbrella technically; it just frustrates me to see something clearly not intelligent referred to as intelligent constantly, especially when people, understandably, believe the name.
“Artificial… Game Intelligence?” I’m confused. You responded to another comment, but also introduced this term out of nowhere. I don’t think it’s as widespread as you’re assuming it is, even within this topic…
AGI stands for artificial general intelligence. It would be an AI smart and capable enough to perform theoretically any task as well as a human would. Most importantly, an AGI could do so with tasks it has never done before and could learn them in a similar time frame as a human (perhaps faster).
Pretty much all robots you see in sci-fi walking around and acting similar to humans are AGIs.
Thanks for the info. Still seems needlessly specific to distinguish it from AI, when AI is already being watered down…
It is not distinguished from AI, just a subcategory of it
AI isn’t being watered down, quite the opposite.
Path finding, computer vision, optical character recognition, machine learning, and large language models were all unambiguously considered to be AI technology before they were widespread, and now the media and general public tend to avoid the term for all but the most recent developments.
It’s called The AI Effect
🤔