Is anyone actually surprised by this?

    • UnderpantsWeevil@lemmy.world · 4 hours ago

      Building my entire data model around the Tiananmen Square copypasta. I can run this thing on a Raspberry Pi plugged into a particularly starchy potato and it reliably returns the only answer I’ve thought to ask it.

        • dreadbeef@lemmy.dbzer0.com · 2 hours ago

          Ah, just acquire such hardware, very simple, anyone can do it without supply chain knowledge or advantage.

          • webghost0101@sopuli.xyz · edited 5 minutes ago

            Sorry, but you’re just making assumptions without having looked at the facts.

            It’s not cheap, but a single top-tier gaming desktop with an additional graphics card (or two) is basically all you need.

            I know multiple people in ordinary IT jobs who have already started setting up their own. They plan to run them for their whole family, serving many users at a time from the same machine.
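
            To make “many users at a time” concrete, here is a minimal sketch, assuming the machine runs a local inference server such as Ollama (HTTP API on localhost:11434 by default) with some DeepSeek model already pulled; the model tag and prompts are placeholders, not a recipe:

            ```python
            # Sketch: several "family members" hitting one local inference box at once.
            # Assumes an Ollama server on its default port with a DeepSeek model already
            # pulled; the tag "deepseek-r1:14b" is an example, not a recommendation.
            from concurrent.futures import ThreadPoolExecutor

            import requests

            OLLAMA_URL = "http://localhost:11434/api/generate"

            def ask(prompt: str) -> str:
                # Ollama's non-streaming generate endpoint returns one JSON object.
                resp = requests.post(
                    OLLAMA_URL,
                    json={"model": "deepseek-r1:14b", "prompt": prompt, "stream": False},
                    timeout=300,
                )
                resp.raise_for_status()
                return resp.json()["response"]

            prompts = [
                "Plan a week of dinners.",
                "Explain photosynthesis to a 7-year-old.",
                "Draft a polite complaint email.",
            ]

            # The server queues the requests; clients just see slower answers under load.
            with ThreadPoolExecutor(max_workers=len(prompts)) as pool:
                for answer in pool.map(ask, prompts):
                    print(answer[:80], "...")
            ```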

            Here is someone who got it working on a cluster of Mac Minis. Again, not cheap, but clearly within reach of dedicated consumer enthusiasts: https://digialps.com/deepseek-v3-on-m4-mac-blazing-fast-inference-on-apple-silicon/

            And this is before even considering how fast open source moves; I’m expecting quantized models that double the speed with negligible quality impact any day now.
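
            On the quantization point, a minimal sketch of what running a quantized model locally looks like, assuming llama-cpp-python and a GGUF file quantized to 4 bits; the file name is hypothetical, and the actual speed/quality trade-off depends on the quant level (Q4_K_M, Q8_0, etc.):

            ```python
            # Sketch: running a 4-bit-quantized GGUF model locally with llama-cpp-python.
            # The model path is a placeholder; any DeepSeek distill in GGUF form loads the
            # same way. n_gpu_layers=-1 offloads every layer to the GPU if one is present.
            from llama_cpp import Llama

            llm = Llama(
                model_path="./deepseek-r1-distill-qwen-14b.Q4_K_M.gguf",  # hypothetical file
                n_ctx=4096,       # context window
                n_gpu_layers=-1,  # offload all layers; lower this to fit smaller VRAM
            )

            out = llm.create_chat_completion(
                messages=[{"role": "user", "content": "Summarize why quantization helps."}],
                max_tokens=128,
            )
            print(out["choices"][0]["message"]["content"])
            ```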

    • quant@leminal.space · 15 hours ago

      By extension, anything that’s not self-hosted means third-party actors snooping: American, Chinese, whoever happens to operate that machine.
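
      Which is the appeal of self-hosting: the client code barely changes, only the endpoint does. A minimal sketch, assuming a local server exposing an OpenAI-compatible API (Ollama serves one under /v1); the base URL and model tag are examples:

      ```python
      # Sketch: the same OpenAI client, but pointed at your own machine, so prompts
      # never leave it. Assumes a local server with an OpenAI-compatible endpoint;
      # the api_key is ignored by the local server but required by the client.
      from openai import OpenAI

      client = OpenAI(base_url="http://localhost:11434/v1", api_key="unused")

      resp = client.chat.completions.create(
          model="deepseek-r1:14b",  # example tag of a locally pulled model
          messages=[{"role": "user", "content": "Who can read this prompt?"}],
      )
      print(resp.choices[0].message.content)
      ```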