Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec::In an interview with Bloomberg, Dave Limp said that he “absolutely” believes that Amazon will soon start charging a subscription fee for Alexa
Alexa is so bad though. Who’s going to pay for that?
“By the way, you can now pay for the Alexa AI option if you want me to reply in a slightly smarter way, but I will still cut you off with ads and other useless things. To activate Alexa AI, say ‘activate.’”
“Welcome to the PiHole, Alexa.”
Just made the switch to NextDNS. For $2/month I get a lot of the same features but also on my phone when not on WiFi. Still love my pihole though!
“No”
“I heard ‘activate’. Thank you! Your credit card will be charged $129 annually. To cancel, please log on to the website because there’s no way we’re letting you get out of this mess the same way we got you into it.”
To cancel, please log on to the website because there’s no way we’re letting you get out of this mess the same way we got you into it.
Unless you’re in California
Guess we’ll find out when they finally pull the trigger
Mine can’t ever seem to tell the difference between on and off if there is any sound in my house
I use “turn on ___” and “kill ___”. Much more reliable matches.
Do you change the names of all your devices to people names? The living room lamp is Steve, the bedroom fan is Maryanne…
Which one’s Bill?
Good to see people already training ai to kill.
Still better than Siri …
Siri was always shit but somehow managed to devolve even further lately. I never trusted her to do more than turn lights on or off, but now this shit happens:
Me: Siri, turn off the lights in the living room
Siri: OKAY, WHICH ROOM? BATHROOM, BEDROOM, KITCHEN, HALLWAY, LIVING ROOM?
Imagine living in a mansion with this cunt
I use Google to turn on my TV by saying ‘turn on TV’, easily done. But then when I ask it to adjust the volume, it asks me which TV… I only have one TV online, and it had just turned it on.
We need to move AI from the cloud to our own hardware running in our homes. Free, open source, privacy focused hardware. It’ll eventually be very affordable.
That’s already here. Anyone can run AI chatbots similar to, but not as intelligent as, ChatGPT or Bard.
Llama.cpp and koboldcpp allow anyone to run models locally, even with only a CPU if there’s no dedicated graphics card available (although more slowly). And there are numerous open source models available that can be trained for just about any task.
Hell, you can even run llama.cpp on Android phones.
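Just as a sketch of how simple local inference has become, here’s roughly what it looks like through the llama-cpp-python bindings for llama.cpp (the model filename is a placeholder; you’d download a GGUF-format model yourself, e.g. from Hugging Face):

```python
# Minimal sketch: CPU-only inference via llama-cpp-python,
# the Python bindings for llama.cpp. The model path below is
# a placeholder; point it at any GGUF model you've downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-2-7b-chat.Q4_K_M.gguf",  # placeholder path
    n_threads=4,  # runs on plain CPU cores, no GPU required
)
out = llm("Q: Name the planets in the solar system. A:", max_tokens=64)
print(out["choices"][0]["text"])
```

It runs slower than on a GPU, but it runs, which is the point.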
This has all taken place in just the last year or so. In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections in terms of looking up information.
Yes, and you can run a language model like Pygmalion AI locally on koboldcpp and have a naughty AI chat as well. Or non-sexual roleplay.
Absolutely and there are many, many models that have iterated on and surpassed Pygmalion as well as loads of uncensored models specifically tuned for erotic chat. Steamy role play is one of the driving forces behind the rapid development of the technology on lower powered, local machines.
Never underestimate human ingenuity
When they’re horny
And where would one look for these sexy sexy AI models, so I can avoid them, of course…
Hugging Face is where the models live. Anything that’s uncensored (and preferably based on Llama 2) should work.
Some popular suggestions at the moment might be HermesLimaRPL2 7B and MythomaxL2 13B for general roleplay that can easily include nsfw.
There are lots of talented people releasing models every day, tuned to assist with coding, translation, roleplay, general assistance (like ChatGPT), writing, all kinds of things, really. Explore and try different models.
General rule: if you don’t have a dedicated GPU, stick with 7B models. Otherwise, the bigger the better.
Which models do you think beat Pygmalion for erotic roleplay? Curious for research haha
Hey, I replied below to a different post with the same question, check it out.
Oh I see, sorry for the repeat question. Thanks!
lol nothing to be sorry about, I just wanted to make sure you saw it.
GPT4All is a neat way to run an AI chat bot on your local hardware.
Thanks for this, I haven’t tried GPT4All.
Oobabooga is also very popular and relatively easy to run, but it’s not my first choice, personally.
it does have a very funny name though
In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections in terms of looking up information.
You’re probably right, but I kinda hope you’re wrong.
Why?
Call it paranoia if you want. Mainly I don’t have faith in our economic system to deploy the technology in a way that doesn’t eviscerate the working class.
Don’t these models require rather a lot of storage?
Storage is getting cheaper every day and the models are getting smaller with the same amount of data.
I’m just curious - do you know what kind of storage is required?
13B quantized models, generally the most popular for home computers with dedicated GPUs, are between 6 and 10 gigs each. 7B models are between 3 and 6. So, no, not really?
It is relative, so I guess if you’re comparing that to an Atari 2600 cartridge then, yeah, it’s hella huge. But you can store multiple models for the same storage cost as a single modern video game install.
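Those figures line up with simple arithmetic. Here’s a rough sketch; the ~4.5 bits per weight is an assumption, typical of popular 4-bit quantization formats (the exact overhead varies by format):

```python
# Rough estimate of a quantized model's file size from its
# parameter count. Assumes ~4.5 bits per weight, which is in
# the ballpark for common 4-bit quantization schemes.
def model_size_gb(params_billions: float, bits_per_weight: float = 4.5) -> float:
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

print(round(model_size_gb(7), 1))   # 7B model  -> ~3.9 GB
print(round(model_size_gb(13), 1))  # 13B model -> ~7.3 GB
```

Both estimates fall inside the 3–6 GB and 6–10 GB ranges quoted above.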
God I wish, I would just love local voice control to turn my lights and such on and off… but noooooooooooo
I have that with just my phone, using WiZ lights and IFTTT. It’s the only home automation I even have because it’s the only one I found that doesn’t necessarily need a special base station like an Alexa or Google Home.
But you want a local base station, else there’s no local control. You want to use local-only networks like Z-Wave, Zigbee, Thread, Bluetooth, etc., even though they require a base station, because that’s what gives you a local-only way of controlling things.
Matter promises a base station may no longer be necessary for smart devices to control each other, but it is rolling out very slowly
I also wonder what I’ll be able to do with the Thread radio in the iPhone 15 Pro
The base stations are what uses the cloud/AI shit. The setup I have doesn’t even require an Internet connection or wifi; it’s entirely bluetooth. Why in the hell would I want a base station that costs money, is controlled by Amazon or Google, and requires an Internet connection for my local shit?
I don’t want a piece of hardware that does nothing but act like a fucking middleman for no good reason.
I’m a huge fan of Home Assistant. You might look into it
It’s the year of the voice for Home Assistant. Given their current trajectory, I’m hopeful they’ll have a pretty darn good replacement for the most common use cases of Google Home/Alexa/Siri in another year. Setting timers, shopping list management, music streaming, doorbell/intercom management. If you’re on the fence about a Nabu Casa subscription, pull the trigger as it helps them stay independent and not get bought out or destroyed by commercial interests.
Thumbs up for Nabu Casa and Home Assistant!
I haven’t yet played with the local voice stuff but have been following it with interest. Actually, now that Raspberry Pis are starting to become available again, I’m on the fence between buying a few more vs. finding something with a little more power, specifically for voice processing.
While you may have points against Apple and how effective Siri may be, with the latest generation of products even the Watch has enough processing power to do voice processing on-device. No ads. No cloud services.
Pretty much. If you want a voice assistant right now, Siri is probably the best in terms of privacy. I bought a bunch of echos early, then they got a little shitty but I was in, and now I just want them out of my house except for one thing - music. Spotify integration makes for easy multi-room audio in a way that doesn’t really work as well on the other platform that I’ll consider (Apple/Siri) and basically adds sonos-like functionality for a tiny fraction of the price. The Siri balls and airplay are just not as good, and of course, don’t work as well with Spotify.
But alexa is so fucking annoying that at this point I mostly just carry my phone (iPhone) and talk to that even though it’s a little less convenient because I’m really goddamned tired of hearing “by the way…”
I do wonder how much of those voice assistants could run on-device. Most of what I use Bixby for (I know. I KNOW.) is setting timers. I think simple things like that can run entirely on the phone. It’s got a shocking amount of processing in it.
AI is being touted as the solution to everything these days. It’s really not, and we are going to find that out the hard way.
I get what you’re saying, but voice assistants are one of the main places LLMs belong.
Yes, but so much more. An actually useful assistant that could draft emails, set reminders appropriately, create automations, etc. would be worth A LOT of money to me.
I think if there ends up actually being a version of AI that is privacy focused and isn’t screwing over creators it’d be so much less controversial. Also, everyone (including me) is really, really fucking sick of hearing about it all of the time in the same way that everyone is/was sick of hearing about the blockchain. As in: “Bro your taco stand needs AI/the blockchain.”
You wouldn’t need any kind of special training for this. Just the ability to do simple things like make calendar appointments, draft emails/responses, and set reminders based on time/locations/etc. It really doesn’t seem very complicated but as far as I know no one has figured out how to do it yet. All the existing “assistants” are so bad that I don’t even bother trying to use them anymore. They can’t even do something simple like turning on a light with any degree of reliability.
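To get a sense of how simple the reliable core of this could be, here’s a toy sketch of fully on-device intent matching; the phrasings and intent names are made up for illustration, not any real assistant’s API:

```python
import re

# Toy on-device intent matcher: maps a handful of fixed phrasings
# to structured commands, with no cloud round-trip involved.
PATTERNS = [
    (re.compile(r"turn (on|off) (?:the )?(.+)", re.I), "power"),
    (re.compile(r"set a timer for (\d+) (second|minute|hour)s?", re.I), "timer"),
    (re.compile(r"remind me to (.+) at (.+)", re.I), "reminder"),
]

def parse(utterance: str):
    """Return (intent, captured args), or ('unknown', ()) on no match."""
    for pattern, intent in PATTERNS:
        m = pattern.fullmatch(utterance.strip())
        if m:
            return (intent, m.groups())
    return ("unknown", ())

print(parse("turn off the living room lamp"))
print(parse("set a timer for 10 minutes"))
```

Obviously a real assistant needs speech recognition and far looser phrasing, but the command-dispatch part genuinely is this small.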
Hey, that’s only because Amazon, Google, and Microsoft (et al) just don’t have the Money to Make it good!!
So what about 9.99 a month?
4.99 if you pay up front for a year?
Euh, or how much can you cough up, like for a year or at least for Q4, I’m literally on a bad roll here.
I’m not going to buy into a subscription model for something I’ve already paid for. This subscription model crap is complete bullshit.
They even tried to do it with heated seats recently. Like, install heated seats in your car, but disable them in software. It’s crazy that companies think they can get away with this.
I think there’s a massive difference between unlocking a feature that’s already there and requires no maintenance, and a cloud-based service that demands 24/7 uptime, constant developer support, and ongoing feature development.
While I agree with you, they are 💯 going to get away with it, because your average consumer just doesn’t care.
If IBM actually manages to convert COBOL into Java like they’re advertising, they’ll end up killing their own cash cow
So much still runs on COBOL
It’s not even A.I. either
Alexa is more like a telemarketer disguised as an assistant. Every interaction is followed by a “by the way…”. It’s a shit experience, so I stopped using mine.
Alexa was designed explicitly for that purpose. They lose money on every Echo sold, the whole idea was they would make money selling you stuff. Turns out people would rather use their Echo to check the weather, get recipes, etc. rather than voice shop.
I just can’t see a use case for voice shopping. There are almost zero instances where I want to buy something without having a visual of that thing in front of me at time of purchase.
I could possibly see something like “buy another stick of deodorant”, but even then I want to see if there are deals or some other options and would want to check the price at a minimum.
Seems like yet another MBA idea.
Yeah, it seems the execs who had the idea for Alexa never used Amazon for shopping. It’s a shit shopping site full of scammy products. I’d never buy anything from them without checking the prices, reviews, etc.
It’s really only good for re-ordering things you’ve already ordered. It will let you know that it found something in your order history and then you can decide whether you want to order again.
And this makes sense, but I’d still want to check prices to make sure that my $3 deodorant didn’t get discontinued and priced at $30/stick.
Well, you think this way because you’ve seen what happened to Amazon in the past 10 years. 10 years ago, when they were getting ready to launch the Echo, Amazon was a great retailer that people trusted. Now, after a decade of sellers gaming listings and reviews and Amazon customer service deteriorating, we’ve been trained not to trust Amazon’s defaults.
Ha, I use mine almost exclusively as a light switch. I don’t have to get out of bed to turn off my lights or turn on my fan. I’m sure they’re losing a bunch of money on me
It’s a great lightswitch. Also, thermostat adjustor (my husband is very particular and changes it about a dozen times a day).
It’s also good for setting a timer. But yeah, I’m not buying shit from it.
Setting all my Alexas to UK English got rid of all the marketing “by the ways.” I still regret going with the Alexa ecosystem, but at least for now there is a workaround for the most rage-inducing part of it.
By the way, did you know that you can find out more about telemarketing with an audiobook from Audible on the subject? Would you like to hear a preview of that now?
So they get massive amounts of free data for Machine Learning, but want to charge users for supplying it?
It’s like charging you for cable and then shoving ads down your throat.
It’s like charging you for Prime Video and then shoving ads down your throat.
It’s like RAAAAIIAAIIIN on your wedding day
That’s often the case. They can have their cake and eat it too. Shareholders would expect nothing less.
I think the data is probably less valuable than people think, especially if the users expect an AI response whenever a data point can be collected from them.
I wake my Echo and then grind coffee beans.
So they expect that people pay for being spied upon and seriously data mined?
Yes and people will pay for it.
Sometimes I can’t help thinking some people deserve to be taken advantage of.
I don’t know about that. They never delivered on Smart Home promises and the only truly useful thing my Google AI does is to give me the forecast. Otherwise it’s just a wifi speaker.
If they finally integrate Bard, I would actually consider paying for the service.
We have a Google mini. They listen to my six year old request the song Poopy Bum Bum all day. The ads get interesting.
Alexa has a feature where you tell it you’re leaving the house and it will listen for smoke detectors or breaking glass, alerting you through your phone if it detects something. Amazon is putting that behind a paywall next year.
Let me in your house and I’ll observe it for you for less money
Google did that with Nest Aware years ago as well. It’s super annoying.
I already don’t use it, you don’t have to sell me on it.
Alexa isn’t nearly good enough to pay for. I basically use it for timers, math, weather, and conversions.
Yep, it used to be much better. There was SO much potential with it too. I wish there was a smart speaker with integration into ChatGPT. I’d love to stand in the shower and ask it shit.
You can do this with a Siri shortcut.
It still falls short because LLMs aren’t smart, they’re just approximately not wrong most of the time. I thought it would be a lot cooler than it actually is.
Building your own dystopia I see…
Not following
Yeah, they’re all pretty disappointing. I’d love to have something that feels like how movies portray digital assistants. Movie assistants never misunderstand you or say “I’m sorry, I couldn’t recognize your voice”. I’ve mostly used the Google one and it’s so bad at doing what I feel like is feasible even with inaccuracy.
E.g., I’ve tried to tell my assistant to like a song that was currently playing on YTM but could not find a voice command that worked (and some commands backfired by making it skip to the next song). I’ve had very poor success with getting the assistant to cast something to my Chromecast with my voice. It sometimes works, but it fails or gets it wrong so often that it’s not worth the time.
Sometimes I use it for rewinding (e.g., “OK Google, rewind 30 seconds”) because many apps don’t have granular rewind buttons and scrubbing on the track bar is way too inaccurate. But lol, it’s so slow! It takes a few seconds to figure out what I said (so I have to ask it to rewind more than I wish), and it seems every app is unoptimized for rewinding, as it usually takes several seconds of loading.
It can’t really do any kind of research either. You basically can just ask it to google things, and it sometimes is able to extract the meaningful part from simple questions. It’s a far cry from how Hollywood thinks a digital assistant will work.
…charge me to use Alexa?
I already avoid it like the plague.
I have doubts Alexa is using AI, seeing how dumb it is, but well…
From the article:
Amazon has bet big on AI, with the company unveiling a new, AI-powered version of Alexa alongside updated versions of its Echo Frames and Carrera smart glasses last week.
Good luck, I guess? Got the first Google home, at first it was great, I was asking it tons of questions. Then the questions stopped, used it for turning on the lights and other automations. Then I installed Home Assistant and the only command Google Home got was to set a timer to know when to pull things out of the oven. Eventually I stopped doing that.
At the moment all Google/Nest Homes have their mic cut off, I only use them to stream music in my house from my NAS via Plex. So yeah…
I still use mine for voice commands with home assistant. Works great.
How to make me go back to buying shit in person, by Amazon.com
I use Alexa as a way to use an old speaker system. I wouldn’t pay to use any “smart” speaker system. They’re pretty dumb, and I’ve already paid once.