
RAM

Aries I Acquisition Corp - Class A

Price Data Unavailable

About Aries I Acquisition Corp - Class A

View all WallStreetBets trending stocks

Premarket Buzz: 2 comments today (12am to 9:30am EST)

Comment Volume (7 days): 10 total comments on WallStreetBets, 18 total comments on 4chan's /biz/


Recent Comments

Wait till they find out most of the phones they are buying won't have enough RAM to run AI.
OP thinks people invest in cars, probably because it's the only thing OP hasn't lost yet. Intelligent people invest in things that go UP in value over time: houses, land, businesses. This is where the rich put their money. Inflation hits cars when supply drops (Covid supply chain fuck show) or when money is printed and given to poors, who run out and buy $80,000 trucks they can't afford on a payment plan they don't understand, and put hot wheels mods on them so they will never be able to recover any value from the vehicle, unless they can find an equally stupid poor to buy out their car note on a hot-wheels-modded rusty Dodge Ram. Rate cuts will be great for people who own houses. Everyone else, please continue to enjoy 68-cent Walmart-brand macaroni and cheese. Boil, mix cheese packet with water. We're not splurging on milk. Do not add butter! You ain't got butter money. The stimmy checks program was a failure and has been discontinued indefinitely.
Still has the same performance as, if not better than, an RTX 3060 Ti, with more RAM. I was looking to upgrade, and at 1080p it just ain't worth it lol
Unless you want it stealing data and running RAM-inefficient, shitty software, don't trust Google.
This is great, but I don't think it will have a big financial impact on their revenue. In the grand scheme of things, Facebook isn't spending any significant money on Llama training runs (they don't use much of its compute capacity) and isn't getting any revenue from those models, since it's the inference API providers and users that reap the benefits.

I think the Llama 70B and 405B models started a trend of third-party companies competing in the inference game. Earlier, only Anthropic, Google, OpaqueAI, and MistralAI were able to host LLMs. As in, you had to be the author of a model to host it, because weights represent a competitive advantage and those companies don't want them to be public. Llama 70B and 405B allowed hardware companies such as Groq, SambaNova, Etched, and Cerebras to compete in the game. This brought down model inference cost, since there is no overhead to pay for model training or the author's revenue, and it allowed fast inference innovations to be brought to market.

So Llama is creating new businesses and market segments and allowing for faster tech innovation. Every laptop with 8GB of RAM can run Llama 3.1 8B locally; phones can even run Nvidia's Llama 3.1 4B. And for bigger models, APIs are cheap and plentiful.
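The 8GB-laptop claim checks out with quick back-of-the-envelope arithmetic. A minimal sketch, assuming the common 4-bit quantization used by local runners (it ignores KV cache and activation overhead, and the helper name is illustrative, not any real library's API):

```python
# Rough memory estimate for running an 8B-parameter model locally.
# Assumes weights dominate memory use; runtime overhead (KV cache,
# activations, OS) is not modeled.

def model_weight_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate size of the model weights in gigabytes."""
    return params_billions * 1e9 * bytes_per_param / 1e9

fp16 = model_weight_gb(8, 2.0)   # full half-precision weights
q4 = model_weight_gb(8, 0.5)     # 4-bit quantized weights

print(f"8B model, fp16:  ~{fp16:.0f} GB")  # ~16 GB: too big for an 8 GB laptop
print(f"8B model, 4-bit: ~{q4:.0f} GB")    # ~4 GB: fits alongside the OS
```

So the fp16 weights alone would overflow 8GB of RAM, but a 4-bit quantized 8B model leaves headroom, which is why such laptops can run it.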
The only way to do that is to move the network from flash to RAM layer by layer for processing, keeping the chain going as processing moves on. There will definitely be more lag.
A lot of iPhone 13 and 14 Pro owners are mad their phone can't use AI because Apple was cheap and only gave them 6GB of RAM. Expect major blowback on this.
Yeah, I've got an 11, and I'll probably snag a 16 soon because 5G would be nice; that, and the 11's RAM is kind of near its limit.
Not just that. The new iPhone simply doesn't seem future-proof. Everyone expects AI to become increasingly important, and people already claimed that for that reason the iPhone 17 might come with 16 GB of RAM. And yet the iPhone 16 only comes with 8 GB. It's the same story as USB-C not coming early enough, but RAM will have the bigger impact on the future.

The new camera button is beyond stupid: it's expensive, works like crap with a case, and has never been needed, since you can use the volume buttons to take a photo. And their software, iOS, has become increasingly buggy and bloated since iOS 15. And with pictures and videos getting ever higher quality, the storage becomes too expensive. If the iPhone is to become ever more of a camera, then SD cards should become the norm here as well.

Apple had managed to make the worst product that people would still like for the longest time possible. Huge respect for that. But in the last few years they made many mistakes. While increasingly establishing services, they had the chance to improve the hardware.
Yea, Siri has been dumb for ages. The only benefit I see is that maybe the AI features will require more RAM, which Apple has always been cheap to part with. I wouldn't mind an Android level of RAM on an iPhone, not that you need a ton. I'll be interested to see if they're caught taking things off the phone for the AI, either secretly or by keeping all the best features behind sending information off the phone for processing. Either way, others can lead the charge to figure that out. I'm much more the 3-5 year upgrade cycle type, and I just upgraded last year after 5 models of waiting.
