So is it … the “whole thing”? I.e., Lemmy was invented / created / etc. by a bunch of authoritarian weirdos?
I’m guessing that the question itself reveals that I don’t even understand what Lemmy is, but hey. Any help appreciated.
Well this is news to me. The whole thing; created by Marxists, etc, etc.
Wtf?
Honestly it just doesn’t sound like a legit app to me. Sounds like some guy’s personal project.
Calling your cool new app “TARD,” for example — and then insisting (with a straight face) that it is simply an acronym and “people should get over it” — is just being stupid and missing the entire point…while failing miserably.
If it flies, look for a huge spike in STDs.
Good luck. You’re going to get nothing but “popular = bad” here.
But the suicide part … that confirmed as well?
Honestly don’t know, just surprised to read that.
Wat
To be fair
NO! No fair.
I delivered a season of 4k animations for a network show using Motion, AE, C4D, Ps, AI…all using a base model M1 Mini (8/256), with zero problems.
Of course more would be better, but unless you’ve actually used one, it’s hard to imagine how well it works. I tried mentioning this in another post, but it’s all Apple hate all the way down here.
Point taken! Clearly more is always better. Don’t have any experience with the M2 or 3.
I’m just adding a personal experience with having the minimum be plenty to get big jobs done.
…And yet…?
My point is that while of course more is better, 8 sufficed for me…a professional, doing demanding…professional…work.
Do the journalists (and researchers, editors, verifying staff, etc.) all work for free?
Can’t help but think that the all-too-prevalent attitude (here, at least) that one shouldn’t have to pay anything, or very, very little, for quality content has a lot to do with it.
To be fair, M-series Macs are pretty insanely efficient with memory. Unless you’ve actually used one extensively, I can understand the attitudes here…BUT:
I’ve done broadcast animation for many years, and back in ‘21 delivered an entire season of info/explainer-type pieces for a network show — using Motion, Cinema 4D, and After Effects (+ Ai and Ps) — all of it running on a base-level, first-gen M1 Mini (8/256). Workflow was fast and smooth; I even left memory-pig apps running in the background most of the time…not one hiccup. Oh, and everything was delivered in 4K.
So 8GB actually is plenty for most folks…even professionals doing some heavy lifting. Sure, I’d go for 16 next time, but damn, I was/am still impressed. (Maybe it sucks for gaming; I don’t do that, so I have no clue.)
Are you going to share any of your wisdom here? Or rebut misinformation?
It seems to me that the long experiment playing out may include simply waiting to see if there is a critical mass threshold to be reached (ie, of this LLM “simply repeating what everyone agrees on” idea) that allows the process to evolve into something closer to “thinking.”
I’m sure I don’t know enough about LLMs, but as others here are pointing out, this seeming regurgitation of the already-known does seem to provide the foundation or potential for generating hypotheses and/or “new” ideas.
why people are still buying Apple products
For me, unmatched user experience.
From day one, the focus and sensibilities have all been on making things that are intuitive, useful, and pleasing to the eye. Things that feel “engineered by designers” instead of “designed by engineers.”
“Viral” is completely meaningless now.
Also, sometimes some people just decide that the content provided is worth the price charged.
Should totally keep a cup of pencils next to it.
I’d just like to know how the same fucking company that makes Illustrator and Photoshop can come up with something as astonishingly shitty as Acrobat.