    1. I think you’re on the wrong community for this question.

    2. The thing regularly referred to as “AI” of late is more accurately called generative AI, or large language models. There’s no capacity for learning from humans; it’s pattern matching over large sets of data, boiled down to a series of vectors that yield the most likely next word in a response to a prompt (see the toy sketch after this comment). You could argue that that’s what people do too, but that’s a massive oversimplification. You’re right to say it does not have the ability to form thoughts and views. That said, like a broken clock, an LLM can put out words that match up with existing views pretty darn easily!

    You may be talking about general AI, which is something we’ve not seen yet and have no timeframe for. That may be able to have beliefs… but again, there’s not even a suggestion of that being close to happening. LLMs are (in my opinion) not even a good indicator of, or precursor to, that arriving soon.

    TL;DR: An LLM (or generative AI) can’t have or form beliefs.
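
    For anyone curious what “most likely next word” means in practice, here’s a deliberately tiny sketch of my own in Python. It is nothing like how a real LLM is implemented (real models use learned vector representations and neural networks trained on enormous datasets, not raw word counts); it just shows that you can predict plausible text with no understanding at all:

    ```python
    from collections import Counter, defaultdict

    # Toy "training": count which word follows which in a tiny corpus.
    corpus = "the clock is broken but the clock is right twice a day".split()

    next_word_counts = defaultdict(Counter)
    for current, following in zip(corpus, corpus[1:]):
        next_word_counts[current][following] += 1

    def most_likely_next(word):
        """Return the statistically most likely next word, or None if unseen."""
        counts = next_word_counts.get(word)
        return counts.most_common(1)[0][0] if counts else None

    print(most_likely_next("the"))  # -> 'clock' (it follows "the" both times)
    ```

    The toy “model” above holds no beliefs about clocks; it just emits the statistically likeliest continuation of whatever it was trained on. An LLM is vastly more sophisticated, but the same basic point stands.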






  • To me, the difference there is that the jokes about snake oil, homeopathy, healing crystals, or essential oils are all roughly the same - e.g. “what do you call X that works and has been peer-reviewed? Medicine.”

    So far, there has been no equivalent positive usage in the crypto sphere. Medicine, though often administered unevenly, is a good idea in itself.

    Actually, for most uses of crypto, it’s attempting to muddle in and “add” value to a previously known-good thing. Is the comparison here that crypto is snake-oil currency, snake-oil databases, or snake-oil contracts? In every case, to me, crypto is the snake oil salesman trying to sell you a brighter tomorrow - without adding anything positive, and often getting the heck out of Dodge (or folding a company and moving on to, e.g., LLMs) before delivering on promises.