PayPay, in particular, seems like a good target for consumer action. Just saying…

    • corroded@lemmy.world · 3 days ago

      Personally, I’d really like the option of running LLMs locally, but the hardware requirements make it hard. Small models run okay on a CPU or a low-end GPU, but anything approaching the complexity and usefulness of GPT-4 or DeepSeek requires a hefty GPU setup. Considering how much even old hardware like the P40 has gone up in price, it’s hard to justify the cost.
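
      (For anyone wondering what "runs okay on CPU" looks like in practice, here's a minimal sketch using llama-cpp-python with a small quantized GGUF model; the model path, thread count, and prompt are just placeholders for illustration, not a specific recommendation.)

          # Minimal sketch: CPU-only inference with a small quantized model via llama-cpp-python.
          # The model file below is a placeholder; any small GGUF model works the same way.
          from llama_cpp import Llama

          llm = Llama(
              model_path="./models/small-model-q4_k_m.gguf",  # placeholder path to a quantized GGUF file
              n_ctx=2048,    # context window size
              n_threads=8,   # tune to your CPU's core count
          )

          out = llm(
              "Q: Why does 4-bit quantization make CPU inference feasible?\nA:",
              max_tokens=128,
              stop=["Q:"],
          )
          print(out["choices"][0]["text"])

      With 4-bit quantization, a 7B-class model fits in well under 8 GB of RAM, which is roughly the point where CPU-only inference becomes tolerable for short prompts; anything much bigger is where the GPU cost problem kicks in.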