• 0 Posts
  • 174 Comments
Joined 1 year ago
Cake day: August 11th, 2023

  • I grew up as a PC gamer (if you can call 8-bit computers PCs too) and never had a console as a kid. I got an Xbox One when it came out, just because of the Kinect, and never played anything on it other than Just Dance. Playing on my PC is more convenient. I got a Switch and played some Pokémon, but couldn’t get in the habit of playing on a device instead of a PC. When I got a Switch emulator on my PC, I played more on that than I did on the actual Switch in all the time I owned it.

  • Hallucinations are an issue for generative AI. This is a classification problem, not gen AI. This type of use of AI predates gen AI by many years. What you describe is called a false positive, not a hallucination.

    For this type of problem you use AI to narrow a large set down to a manageable size, e.g. you have tens of thousands of images and the AI identifies a few dozen that are likely what you’re looking for. Humans would have taken forever to review all those images manually. Instead you have humans verifying just the reduced set, and confirming the findings through further investigation. A rough sketch of that workflow follows below.
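
    A minimal sketch of that triage pattern in Python. Every name here is an illustrative placeholder rather than a real library; `classifier` stands in for whatever model scores an image:

```python
# Hypothetical triage workflow: score everything, hand only the
# high-scoring few to human reviewers.

def triage(images, classifier, threshold=0.9):
    """Narrow tens of thousands of images to a small candidate set."""
    candidates = []
    for image in images:
        score = classifier(image)  # e.g. probability of the target class
        if score >= threshold:
            candidates.append((image, score))
    # Humans verify only this reduced set, so a false positive here
    # costs a few minutes of review, not a wrong conclusion.
    return sorted(candidates, key=lambda c: c[1], reverse=True)

# Usage (names made up): tens of thousands in, a few dozen out.
# suspects = triage(all_images, my_model.predict_proba, threshold=0.95)
```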

  • Make a large enough model, and it will seem like an intelligent being.

    That was already true in previous paradigms: a non-fuzzy, non-neural-network algorithm that is large and complex enough will also seem like an intelligent being. But that kind of “large enough” is beyond our resources, and the processing time for each response would be too long.

    And then you get into the Chinese room problem: is there a difference between “seems intelligent” and “is intelligent”?

    But the main difference between an actual intelligence and various algorithms, LLMs included, is that an intelligence works on its own: it’s always thinking, not just reacting to external prompts. You ask a question and you get an answer, but the question remains at the back of its mind, and it might come back to you ten minutes later and say, “You know, I’ve given it some more thought, and I think it’s actually like this.” A toy sketch of that contrast is below.
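
    A toy sketch of that contrast in Python, with entirely made-up names (no real system works this way): `ask()` is the reactive, prompt-driven path an LLM has, while a background thread keeps mulling over old questions and can volunteer a follow-up later.

```python
import queue
import threading
import time

class PonderingAgent:
    """Hypothetical agent that keeps thinking between prompts."""

    def __init__(self):
        self.open_questions = queue.Queue()
        self.followups = queue.Queue()
        # The "always thinking" part: runs continuously, unprompted.
        threading.Thread(target=self._ponder, daemon=True).start()

    def ask(self, question):
        """Immediate, prompt-driven reply: this part is all an LLM does."""
        self.open_questions.put(question)  # keep the question in mind
        return f"First thoughts on: {question!r}"

    def _ponder(self):
        # The question stays "at the back of its mind" and may
        # resurface as an unprompted follow-up later.
        while True:
            question = self.open_questions.get()
            time.sleep(600)  # ten minutes of background mulling (placeholder)
            self.followups.put(
                f"You know, I've given {question!r} some more thought..."
            )
```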