Why was Web 2.0 a mistake and what does that have to do with centralization?
Maybe that’s what you believe, but allowing commercial use has been a core tenet of free and open source software.
This is one of the funnier things I see frequently on here. People both champion free and open source code and data that can be used for anything… until it is used for anything they even mildly dislike.
Sure but the model is already trained. I’m not talking about using any sort of specialized model.
Why is it idiotic? Your tests will let you know if it is correct. Suppose I have 100 interface functions to implement. I let the AI write the boilerplate and implementations and get a 90% pass rate after a few revision loops where errors are fed back into the LLM to fix. Then I spend a small amount of time sorting out the last 10%. This is a viable workflow today.
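The workflow described above can be sketched as a simple loop. This is only an illustration, not anyone's real tooling: `generate` and `run_tests` are hypothetical stand-ins for your LLM call and test runner.

```python
def revision_loop(spec, generate, run_tests, max_rounds=3):
    """Sketch of the describe-test-revise workflow.

    generate(prompt) -> code string (stand-in for an LLM call)
    run_tests(code)  -> list of failure messages (empty = all pass)
    Returns (code, remaining_failures); anything left over is the
    "last 10%" you sort out by hand.
    """
    code = generate(f"Implement this interface:\n{spec}")
    for _ in range(max_rounds):
        failures = run_tests(code)
        if not failures:
            break  # all tests pass; nothing left for the LLM to fix
        # feed the failing-test output back into the model
        code = generate(f"Fix this code:\n{code}\nFailing tests:\n{failures}")
    return code, run_tests(code)
```

The key point is that the tests, not the model, are the arbiter of correctness; the loop just automates the retry.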
This sounds pretty typical for a hobbyist project but is not the case in many industries, especially regulated ones. It is not uncommon to have engineers whose entire job is reading specifications and implementing them. In those cases, it’s often the case that you already have compliance tests that can be used as a starting point for your public interfaces. You’ll need to supplement those compliance tests with lower level tests specific to your implementation.
Time to go stock up on chargers
Yeah this doesn’t hold up against the $200+ options but it’s also not priced that way.
Wide area network. It’s basically the “internet” side of the router.
States don’t have rights, people have rights.
This implies TikTok would have some incentive to propagandize their users that Google wouldn’t also have. Google does corporate American propaganda, which many Americans have been acclimated to and thus don’t perceive as propaganda.
Because the definition of what is and isn’t technology is arbitrary. Wikipedia says “Technology is the application of conceptual knowledge to achieve practical goals, especially in a reproducible way.” By that definition, social media is a technology (uses knowledge of computers and networking to enable online communication), but also so are most human creations.
Find me a laptop that has 10GbE. I’ve only seen 1Gb and recently 2.5GbE. Note that Thunderbolt 3 is 40 Gb/s, or 4 times that. Thunderbolt 5 is up to 120 Gb/s, or 12 times that. If you’re editing video directly from your high-bandwidth NAS on a several-thousand-dollar laptop, I find it extremely unlikely that a sub-$500 dock would be a concern. Even more, anybody who actually does that on a day-in-day-out basis would clearly see the benefit of using a dock in the first place, given the convenience of having so much bandwidth and power provided over a single cable.
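A quick sanity check on the ratios quoted above (all figures in Gb/s, using the nominal link rates rather than real-world throughput):

```python
# Nominal link bandwidth in Gb/s
ethernet_10g = 10
thunderbolt_3 = 40
thunderbolt_5 = 120

# Thunderbolt bandwidth relative to a hypothetical 10GbE laptop NIC
print(thunderbolt_3 / ethernet_10g)  # 4.0  -> Thunderbolt 3 is 4x 10GbE
print(thunderbolt_5 / ethernet_10g)  # 12.0 -> Thunderbolt 5 is 12x 10GbE
```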
Sure, but that’s not a common scenario anymore. We have wifi that is faster than most client Ethernet installations, and if you’re at a desk anyway you probably want a dock. I suppose there is the network engineer who needs to plug directly into various switches and such, but if that’s your debug mechanism, well, I am very sorry; look into remote management options to make your life so much easier.
When a port is extremely high bandwidth, the number of them stops mattering much. I’m plugging everything into a dock via a single cable anyways, the rest go largely unused. We used to need a dozen ports because each one could only handle a single task and all were relatively low bandwidth.
Strongly disagree. I use a laptop with a thunderbolt dock. Being able to plug in a single cable to provide power, connect my monitor, all of my input devices, Ethernet, and anything else in a single cable is awesome. If I had to plug 10 things in manually it would be quite cumbersome. I disconnect the laptop daily as I bring it between work and home, as well as use it, well, as a portable laptop.
iTerm2 works well enough
What exactly does Signal have to offer if one already uses iMessage with contact key verification?
Not the same thing, since the device is still partially decrypted.
You should verify this, but I think there is a consortium of sorts made up of tech companies that picks a standard they all must follow. So in the future, it’s possible for them to pick a new standard, and after a transition period everything would be required to switch (though of course you could still continue using old devices; they just could no longer be sold new).