I know there are other plausible reasons, but thought I’d use this juicy title.
What does everyone think? As someone who works outside of tech I’m curious to hear the collective thoughts of the tech minds on Lemmy.
I mean, I don’t think AGI necessarily implies the singularity, and I doubt the singularity will ever come from LLMs. But when you look at human intelligence, one could make the argument that it’s a glorified input-output system, just like LLMs.
I’m not sure. There are a lot of things going on in the background of even human intelligence that we don’t understand.
Yes, except human brains can learn things without the manual training and tweaking you typically see in ML. In other words, LLMs can’t just start from an initial “blank” state and train themselves autonomously. A baby starts from an initial state and learns about objects, calibrates their eyes, proprioception, and movement, then learns to roll over, crawl, stand, walk, and grasp, learns to understand language and then speak it, etc. Of course there’s parental involvement and all that, but it’s nothing like someone training an LLM on a massive dataset.
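To make that concrete, here’s a toy sketch (not how a real LLM is trained, obviously, but the same basic idea): a bigram “model” that learns next-character statistics purely from a fixed, human-curated corpus. Everything it can ever predict comes from the data someone handed it; anything outside that corpus is simply unknown to it, which is the contrast with a baby learning from open-ended experience.

```python
from collections import defaultdict

def train_bigram(corpus):
    """Toy 'training': count which character follows which,
    using only the curated corpus the model is given."""
    counts = defaultdict(lambda: defaultdict(int))
    for text in corpus:
        for a, b in zip(text, text[1:]):
            counts[a][b] += 1
    return counts

def predict_next(counts, ch):
    """Most frequent continuation seen during training, or None
    if the character never appeared -- the model has no mechanism
    to go out and learn it on its own."""
    if ch not in counts:
        return None
    return max(counts[ch], key=counts[ch].get)

model = train_bigram(["the cat", "the hat"])
print(predict_next(model, "h"))  # 'e' ('he' seen twice, 'ha' once)
print(predict_next(model, "q"))  # None: 'q' was never in the training data
```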
Good point
Spin up AI Dungeon with ChatGPT and see how compelling it is once you run out of script.
Really good point. I’ve actually messed around a lot with GPT as a 5e DM, and you’re right—as soon as it needs to generate unique content, it just leads you in an infinite loop that goes nowhere.
I’ve had some amazing fantasy conversations with LLMs running on my own GPU. Family and world history, tribal traditions, flora and fauna, etc. It’s quite amazing and fun.