this post was submitted on 30 Mar 2025
66 points (100.0% liked)

Hardware

top 22 comments
[–] [email protected] 10 points 3 days ago

I don't care what "side" you're on. This is bad for all of us. Competition is the only reason Nvidia isn't already charging 2k for a GPU. If AMD walks away, we are all fucked.

[–] [email protected] 16 points 4 days ago (3 children)

Not surprising that both AMD and Intel have given up on the consumer high-end segment.

At the prices in that segment, people are going to go with Nvidia almost by default.

[–] [email protected] 22 points 4 days ago

I hope this is just fake news, because Intel has the foundation: their Battlemage is pretty damn good. They just need to scale it up a bit more. I don't care if it's flagship level or not; if they can squeeze RTX 5070 Ti or RX 9070 XT performance out of an Arc card, that would be awesome.

[–] [email protected] 10 points 4 days ago

Eh, the 9070 XT is only a little bit slower than the top-of-the-line 7900 XTX. In my mind it's still high-end; the 5090 should get its own super-ultra-high-end designation lol

[–] [email protected] 9 points 4 days ago (2 children)

I don’t think AMD has. My understanding was that the next gen UDNA architecture has the full range. It’s meant to launch next year.

The two RDNA 4 cards were just to have something out at the same time as the 5000 series.

[–] [email protected] 9 points 4 days ago (3 children)

I bought the RX 9070 XT and it's freaking amazing. Like, who gives a shit if NVIDIA holds the performance crown when their cards are entirely unobtainable and so expensive they're out of reach of most people? Fuck that metric. On top of that, their cards have missing ROPs and melting power connectors because they run so close to the power limit, and they've kept denying that shit for a second generation now.

And don't be mistaken, the RX 9070 XT is a VERY high performing card, often beating the RTX 5080. AMD also improved FSR4 to a level that mostly beats DLSS3's CNN model in terms of quality, which is huge. They brought ray tracing basically on par with NVIDIA, and AFMF (AMD Fluid Motion Frames) is AMAZING framegen tech. I love it so much; it just works in all games, and it works really well. RDNA4 was a huge leap for AMD.

Not to mention the Radeon software is SIGNIFICANTLY better than NVIDIA's stupid control panel from 2006, and the same goes for the new NVIDIA App. It's just such a stupid piece of software, missing settings that are available in the old panel, and with the dumbest overclocking panel I've ever seen.

[–] [email protected] 5 points 4 days ago (2 children)

I will probably be buying a 9070 XT to replace my 3080. I don’t need to be sold on its capabilities.

It is categorically not beating a 5080. It does, however, go toe to toe with the 5070 Ti.

[–] [email protected] 2 points 3 days ago

That's the exact upgrade I made. Totally worth it, and I get to ditch Nvidia finally.

[–] [email protected] 2 points 3 days ago

I went from an RTX 3080 to an RX 9070 XT and it's a huge upgrade. I didn't quite expect it to be this large. I know it's not beating the RTX 5080 everywhere, but there are some games where it does and some where it's very close. Which is crazy considering the 600-800€ price gap back then. The gap has now fallen to "just" 400€, but I can't justify spending the value of a whole low-to-mid-range graphics card for a few more frames. It's insane.

[–] [email protected] 2 points 4 days ago (1 children)

I wish AMD hadn't botched the UI after Adrenalin; it was so much sleeker and more usable.

[–] [email protected] 2 points 3 days ago

It has a few annoying elements like the stupid AI tab, but beyond that it's actually really nice and powerful. I especially like the Tuning panel, so I don't need MSI Afterburner as a separate overclocking app anymore; it's a much more responsive panel and has a bunch of very useful settings, like Radeon Chill if I want to keep power consumption down without really compromising framerate. And AFMF is an insane feature: it works in every game and makes motion so much smoother, and I can't really notice any input lag, unlike FSR3 FrameGen, which always creates massive input lag that totally ruins the whole "higher fps number" thing. I also really like the in-game overlay that lets me open the entire Radeon panel from within a game if needed.

I was last on Radeon back when the HD 7950 was relevant and have dealt with the stupid, ancient NVIDIA Control Panel ever since. The NVIDIA App certainly didn't impress me, because it's dumbed down and stupid, but at least applying settings in it doesn't take 5 seconds anymore, which is a plus I guess.

[–] [email protected] 2 points 4 days ago (2 children)

Where did you get it though? 😭

[–] [email protected] 1 points 3 days ago

At a local retail computer store. I see there are plenty of them in online stores here in Europe (using Geizhals as a reference) and the price has dropped a bit. I did overpay for it a bit, but eh, fuck it, I can live with that since I got it basically 2 days after launch. I couldn't justify 1600€ for an RTX 5070 Ti, which is why I never went for that. If the RTX 5080 were 1000€ I'd still go for it, but for the price NVIDIA was asking then and is asking now, no fucking chance.

[–] [email protected] 1 points 3 days ago

I bought mine in a Newegg bundle a week after launch with a PSU. Sold the PSU separately for the difference.

Those bundles weren't there on launch day; I just happened to be looking for other stuff when I found it. It was the only bundle in stock. I didn't think it would actually ship, but it did.

[–] [email protected] 1 points 3 days ago

Yeah, AMD basically said "tactical retreat" and tried to make a popular midrange card to get market share back up.

I don't think they have long-term plans to stay midrange only. They'll keep working, and once they have something that can compete, they'll launch it.

[–] [email protected] 6 points 3 days ago

I really hope this isn’t true. I’m probably a unicorn here, but I have an Arc A750 and I really like it. I was actually considering getting the next high end Intel GPU.

[–] [email protected] 7 points 4 days ago (1 children)

Intel is getting bought by Nvidia. It is inevitable I think. It just makes sense.

[–] [email protected] 2 points 3 days ago (1 children)

I don't get where you think it makes sense. Intel's foundries can't do the process nodes that Nvidia would need, so owning those wouldn't be of use. NVIDIA already has its own CPU/DPU/GPU architectures, and all the x86 stuff isn't a help to them when they use RISC-V or Arm. So would you please inform me what NVIDIA would see in acquiring Intel?

[–] [email protected] 2 points 3 days ago* (last edited 3 days ago) (1 children)

Think in terms of the military. x86 is dead in its present form. The GPU is a hack. The real future is a single product that can do all compute tasks. Taking over Intel would make the US government happy, and it gives Nvidia a real future beyond the present GPU boom.

You're thinking about the present. I'm thinking about 10 years from now, where the real design edge is. Nvidia has had a lot of luck and good leadership. Intel has had the exact opposite and is stuck in the past.

[–] [email protected] 2 points 3 days ago (1 children)

You think selling Intel to anyone would make them happy? And I was thinking of the future. There isn't going to be an all-in-one chip that does it all. There could be FPGAs with instanced layouts, but that would still require huge money to get the fastest ones (and I don't know if Intel has really been pushing that as hard as it could). And as I already stated, Nvidia is not just in the GPU market; they have DPUs and CPUs too, and those aren't anything Intel is working on, so that wouldn't be a great thing to buy into.

And no, Intel was the product of letting bean counters run the show, and they milked it. I wouldn't say that NVIDIA was lucky at all; they pushed and invested to get where they are. I don't even buy their cards, but the gaming sector isn't even a blip on their money radar now. They have all the networking, CPU, NPU and DPU; it's almost a full-rack NVIDIA solution at this point. Also, if I were to bring the military into it, they are working on models, which is the AI boom you want to look past. I'm not sure what you think the military buys that is Intel-based, but outside of networking and servers (which they said they were moving to a 50/50 solution on), there really isn't anything the government buys that's Intel. TI maybe, but not Intel.

[–] [email protected] 2 points 3 days ago* (last edited 3 days ago)

The only fundamental issue with the CPU and tensors is the L2-to-L1 cache bus width. That cannot be altered while maintaining speed. This is not a real issue in the grand scheme of things; it is only an issue with the total design cycle. Don't get sucked into the little world of marketing nonsense surrounding specific fab nodes and whatever spin the sales fools are peddling. Real hardware takes 10 years from initial concept to first market availability. Nvidia was lucky because their plans happened to align with the AI boom. They could make a few minor packaging tweaks to tailor the existing designs in the pipeline to the present market, but they had no prescient genius about how AI would explode over the last two years. Such a premise assumes they began the 40 series knowing about the AI boom in 2012, nearly 4 years before OpenAI was founded.

The FPGA does not work for AI. It does not scale the way you assume, and the power required is untenable. You can find information about well-funded Intel/Altera AI researchers who traversed this path before the constraints were discovered. You need a simpler architecture with a lower transistor count. This is like the issue with static RAM versus DRAM: static RAM is functionally superior in nearly every way, but it simply can't scale due to power and space requirements.
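
A rough back-of-the-envelope on that SRAM/DRAM comparison (a minimal sketch: the 6T SRAM and 1T1C DRAM cell structures are standard, but treating cell area as proportional to transistor count is a simplification):

```python
# Back-of-the-envelope: why SRAM can't match DRAM density.
# A classic SRAM cell uses 6 transistors per bit; a DRAM cell uses a single
# transistor plus a capacitor (typically stacked above or trenched below the
# transistor, so it adds little footprint). Treating cell area as roughly
# proportional to transistor count is a deliberate simplification.

SRAM_TRANSISTORS_PER_BIT = 6  # classic 6T SRAM cell
DRAM_TRANSISTORS_PER_BIT = 1  # 1T1C DRAM cell

area_ratio = SRAM_TRANSISTORS_PER_BIT / DRAM_TRANSISTORS_PER_BIT

bits_per_gb = 8 * 10**9
print(f"SRAM needs roughly {area_ratio:.0f}x the silicon area per bit of DRAM")
print(f"1 GB of SRAM: ~{SRAM_TRANSISTORS_PER_BIT * bits_per_gb:.1e} transistors, "
      f"vs ~{DRAM_TRANSISTORS_PER_BIT * bits_per_gb:.1e} for DRAM")
```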

With tensors, all that is needed is throughput, and that is a solvable problem. Single-thread speed in CPUs is a sales gimmick and nothing more. Your brain is a much more powerful biological computer, and it operates on 3 main clocks, the fastest of which is only around 100 Hz. Parallelism can be used to create an even faster and richer user experience than the present one. This is the future. The dual-processor paradigm has been done before, in the 286/386 era, and it failed because data centers rejected it in favor of slightly better hardware that was nearly good enough. This is the reality of the present too: any hardware that is good enough to do both workloads will be adopted by data centers and therefore the market. That is where the real design edge is made, and all consumer products are derived from it.
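
To make the throughput-versus-single-thread point concrete, here's a minimal sketch (plain NumPy; the matrix size and timings are purely illustrative) comparing a naive single-threaded Python loop against the same multiplication dispatched to NumPy's vectorized, typically multi-threaded BLAS backend:

```python
# Minimal sketch: tensor workloads are about throughput, not single-thread speed.
# The same matrix multiply, once as a naive Python triple loop and once via
# NumPy's BLAS backend (vectorized and usually multi-threaded).
import time
import numpy as np

n = 128
a = np.random.rand(n, n)
b = np.random.rand(n, n)

def naive_matmul(x, y):
    """Straightforward O(n^3) triple loop, one scalar operation at a time."""
    size = x.shape[0]
    out = np.zeros((size, size))
    for i in range(size):
        for j in range(size):
            acc = 0.0
            for k in range(size):
                acc += x[i, k] * y[k, j]
            out[i, j] = acc
    return out

t0 = time.perf_counter()
slow = naive_matmul(a, b)
t_naive = time.perf_counter() - t0

t0 = time.perf_counter()
fast = a @ b
t_blas = time.perf_counter() - t0

assert np.allclose(slow, fast)
print(f"naive loop: {t_naive:.3f}s  |  BLAS matmul: {t_blas:.6f}s  "
      f"|  speedup ~{t_naive / t_blas:.0f}x")
```

The absolute numbers will differ per machine, but the gap is orders of magnitude, which is the point: feeding wide parallel hardware beats squeezing one fast thread.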

None of Nvidia's current products will be relevant 8 years from now. They are a temporary hack. This is why they must use their enormous capital to buy a new future beyond the GPU, and they will.

[–] [email protected] 3 points 4 days ago

But why would they? There is likely no money to be made.