If you didn't already know, you can run some small models locally on an entry-level GPU.
For example, I can run Llama 3 8B or Mistral 7B on a GTX 1060 3GB with Ollama. It's about as bad as GPT-3.5 Turbo, so mildly useful overall.
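If you want to poke at it programmatically rather than through the CLI, here is a minimal sketch that queries Ollama's local HTTP API. It assumes Ollama is running with its default endpoint (http://localhost:11434) and that the llama3 model has already been pulled; swap in mistral or whatever you have.

```python
# Minimal sketch: query a locally running Ollama instance over its HTTP API.
# Assumes the default endpoint and that `ollama pull llama3` was already done.
import json
import urllib.request

payload = {
    "model": "llama3",
    "prompt": "Summarize the difference between open source and open weight models.",
    "stream": False,  # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```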
Although there is quite a bit of controversy over what counts as an “open source” model, most of them are really only “open weight”.
I personally think Tesla's plan is to force a change to the IRA bill that mandates CCS connectors on all federally funded charging stations.
Under that bill, Tesla could lose their charging network advantage in the medium term, or, even worse, be burdened by an obsolete “non-standard” in the long term.
Except Tesla, where regen power is always at the maximum level.
I can't find the source anymore, but I saw a life-cycle analysis of sodium-ion batteries. Overall they come out slightly worse than lithium-ion because of the higher energy input required during fabrication, despite better mineral availability.
The most common Na-ion batteries use Prussian Blue.
This map is great, both time- and space-wise.
It is incredible how bloated their apps are. I have no idea why every app needs to integrate a social media feed and be able to book a taxi, order takeout, or whatever.
They seriously need to take a look at the KISS principle.
There are some countries where you can reclaim a few cents by returning your plastic PET bottles to a dedicated collection machine.
In this particular case, since the plastic isn't mixed with other incompatible plastics, the recyclability is actually really good, about as good as paper/cardboard.
This isn't true anymore: Intel dropped AVX-512 when they moved to their big+little (P-core/E-core) hybrid design, while AMD actually implemented it with Zen 4.
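If you want to see what your own CPU exposes, here is a rough Linux-only sketch that parses the flags line in /proc/cpuinfo; the flag names (avx512f, avx512vl, etc.) are the standard ones the kernel reports, though the exact set varies by CPU.

```python
# Rough Linux-only sketch: list which AVX feature flags the kernel
# reports for the local CPU by parsing /proc/cpuinfo.

def cpu_flags():
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                # "flags : fpu vme de ..." -> set of individual flag names
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for feature in ("avx2", "avx512f", "avx512vl", "avx512bw"):
    print(f"{feature}: {'yes' if feature in flags else 'no'}")
```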