this post was submitted on 31 Jan 2024
511 points (100.0% liked)


AMD’s new CPU hits 132fps in Fortnite without a graphics card: it also gets 49fps in BG3, 119fps in CS2, and 41fps in Cyberpunk 2077 using the new AMD Ryzen 8700G, all without the need for an extra CPU cooler.

top 50 comments
[–] BombOmOm@lemmy.world 117 points 1 year ago (2 children)

I have routinely been impressed with AMD integrated graphics. For my last laptop I specifically went for one, as it meant I didn't need a dedicated GPU, which adds significant weight, cost, and power draw.

It isn't my main gaming rig of course, but I have had no complaints.

[–] prole@sh.itjust.works 19 points 1 year ago* (last edited 1 year ago)

Same. I got a cheap Ryzen laptop a few years back and put Linux on it last year, and I've been shocked by how well it can play some games. I just recently got Disgaea 7 (mostly to play on Steam Deck) and it's so well optimized that I get a steady 60fps, at full resolution, on my shitty integrated graphics.

[–] empireOfLove2@lemmy.dbzer0.com 13 points 1 year ago

I have a Lenovo ultralight with a 7730U mobile chip in it, which is a pretty mid CPU... it happily plays Minecraft at a full 60fps while using around 10W on the package. I can play Minecraft on battery for about 4 hours. It's nuts.

AMD does the right thing and uses full CUs from their current graphics uArch for the iGPU on a new die, instead of trying to cram some poorly designed iGPU inside the CPU package like Intel does.

[–] CalcProgrammer1@lemmy.ml 59 points 1 year ago

AMD's integrated GPUs have been getting really good lately. I'm impressed by what they're capable of in gaming handhelds, and it only makes sense to put the same extra GPU power into desktop APUs. Hopefully this leads to true gaming laptops that don't require power-hungry discrete GPUs and the workarounds/render offloading of hybrid graphics. That said, to truly replace a gaming laptop I want to see a solid 60fps minimum at 1080p or higher, but the fact that we're seeing numbers close to that is impressive nonetheless.

[–] PerogiBoi@lemmy.ca 58 points 1 year ago (4 children)

I was sold on AMD once I got my Steam Deck.

[–] EndHD@lemm.ee 12 points 1 year ago

Same here. Or at least I finally recognized their potential. But it's not just the performance, it's the power efficiency too!

[–] prole@sh.itjust.works 8 points 1 year ago

Everything I see about AMD makes me like them more than Intel or Nvidia (for CPU and GPU respectively). You can't even use an Nvidia card with Linux without running into serious issues.

[–] fosstulate@iusearchlinux.fyi 41 points 1 year ago (1 children)

I hope red and blue both find success in this segment. Ideally the strengthened APU share of the market exerts pressure on publishers to properly optimize their games instead of cynically offloading the compute cost onto players.

[–] Rai@lemmy.dbzer0.com 19 points 1 year ago (3 children)

Hell yeah, I want EVERYONE to make dope ass shit. I’ve made machines with both sides, and I hate tribal…ness. My current machine is a 9900k that’s getting to be… five years old?! I’d make an AMD machine today if I needed a new machine. AMD/Intel rivalry is so good for us all. Intel slacked so hard after the 9000-series. I hope they come back.

[–] Vlyn@lemmy.zip 20 points 1 year ago (1 children)

Intel has slacked hard since the 2000-series. One shitty 4-core release after another, until AMD kicked things into gear with Ryzen.

And during that time you couldn't buy Intel due to security flaws (Meltdown, Spectre, etc.).

Even now they are slacking; just look at the power consumption. The way they currently produce CPUs isn't sustainable (AMD pays way less per chip with its chiplet design and is far more flexible).

[–] Dehydrated@lemmy.world 35 points 1 year ago

Common W for AMD

[–] RememberTheApollo_@lemmy.world 32 points 1 year ago (6 children)

The only downside, if integrated graphics becomes the norm, is that you can't upgrade if the next gen needs a different motherboard. It's pretty easy to swap from a 2080 to a 3080.

[–] olympicyes@lemmy.world 40 points 1 year ago (2 children)

Integrated graphics is already a thing; Intel iGPUs have over 60% market share. This is really competing with Intel and low-end discrete GPUs. Nice to have the option!

[–] tonyravioli@lemm.ee 33 points 1 year ago (1 children)

AMD has been pretty good about this though; AM4 lasted 2016-2022. Compare that to Intel, which seems to change the socket every 1-2 years.

[–] the_q@lemmy.world 20 points 1 year ago (3 children)

Actually, AMD is still releasing new AM4 CPUs; the 5700X3D was just announced.

[–] T156@lemmy.world 6 points 1 year ago (1 children)

Could you not just slot in a dedicated video card if you needed one, keeping the integrated as a backup?

[–] GhostFence@lemmy.world 5 points 1 year ago

And the shared RAM. Games like Star Trek Fleet Command will crash your computer by messing with that; memory leaks galore. It's far less crashy with a dedicated GPU. How many other games interact poorly with integrated GPUs?

[–] aluminium@lemmy.world 30 points 1 year ago (1 children)

Oh, ok, I thought one of the new Threadrippers was so powerful that the CPU could do all those graphics in software.

[–] sardaukar@lemmy.world 23 points 1 year ago (1 children)

It's gonna take decades to be able to render 1080p CP2077 at an acceptable frame rate with just software rendering.

[–] modeler@lemmy.world 14 points 1 year ago (1 children)

It's all software, even the stuff on the graphics cards: the rasterisers, shaders and so on. In fact, graphics cards are extremely good at running these (relatively) simple programs in an absolutely staggering number of threads at the same time, which has been taken advantage of by both Bitcoin mining and neural-net algorithms like GPT and Llama.
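To make that concrete, here's a minimal, hypothetical sketch in plain C++ (no real graphics API; the shadePixel gradient is made up) of what a fragment shader boils down to: a tiny function invoked once per pixel. A GPU runs the ~2 million invocations per 1080p frame in parallel across thousands of hardware threads; the serial loop below is what a CPU is stuck doing instead.

```cpp
#include <cstdint>
#include <vector>

struct Color { uint8_t r, g, b, a; };

// Toy "fragment shader": one small pure function, run once per pixel.
// The gradient here is invented for illustration; real shaders are
// longer, but the idea is the same.
Color shadePixel(int x, int /*y*/, int width) {
    uint8_t v = static_cast<uint8_t>(255 * x / width);
    return {v, v, 255, 255};
}

int main() {
    const int width = 1920, height = 1080;
    std::vector<Color> framebuffer(static_cast<size_t>(width) * height);

    // A GPU "dispatch" of this shader is ~2 million invocations per frame,
    // all run in parallel across thousands of hardware threads. A CPU has
    // to grind through them one (or a few) at a time:
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            framebuffer[static_cast<size_t>(y) * width + x] =
                shadePixel(x, y, width);
    return 0;
}
```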

[–] pivot_root@lemmy.world 7 points 1 year ago

It's a shame you're being downvoted; you're not wrong. Fixed-function pipelines haven't been a thing for a long time, and shaders are software.

I still wouldn't expect a Threadripper to pull off software-rendering a modern game like Cyberpunk, though. Graphics cards have a ton of dedicated hardware for things like texture decoding or ray tracing, and a CPU would need to waste even more cycles doing those in software.
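For a rough sense of the scale involved, here's a hedged back-of-envelope sketch; every number in it is an assumption for illustration, not a measurement:

```cpp
#include <cstdio>

// Hedged back-of-envelope for software rendering cost.
// All constants below are assumptions chosen for illustration only.
int main() {
    const double pixels         = 1920.0 * 1080.0; // one 1080p frame
    const double fps            = 60.0;
    const double cyclesPerPixel = 100.0;           // assumed modest shader cost
    const double coreHz         = 5.0e9;           // one fast CPU core

    const double cyclesPerSecond = pixels * fps * cyclesPerPixel;
    // ~12.4e9 cycles/s, i.e. ~2.5 entire 5 GHz cores, and that's just
    // shading: rasterization, texture filtering, and ray tracing aren't
    // counted yet.
    printf("Cores needed for shading alone: %.1f\n", cyclesPerSecond / coreHz);
    return 0;
}
```

And real game shaders cost far more than 100 cycles per pixel on a CPU, which is exactly why dedicated GPU hardware for texture filtering and ray tracing matters so much.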

For people like me who game once a month, mostly stupid little games, this is great news. I bet many people could use this; it would reduce demand for graphics cards and let those who want them buy them cheaper.

[–] inclementimmigrant@lemmy.world 20 points 1 year ago

Mind you, it gets these frame rates at low settings. While that is pretty damn impressive for an APU, it's still a very niche type of APU at this point and I don't see it getting all that much traction myself.

[–] AlmightySnoo@lemmy.world 17 points 1 year ago (5 children)

A bit misleading: what is meant is that no dedicated GPU is being used. The integrated GPU in the APU is still a GPU. But yes, AMD's recent APUs are amazing for folks who don't want to spend too much to get a reasonable gaming setup.

[–] ysjet@lemmy.world 46 points 1 year ago (2 children)

Wow, it's almost like that's why they said you didn't need a graphics card, instead of saying you didn't need a GPU!

[–] AlmightySnoo@lemmy.world 6 points 1 year ago* (last edited 1 year ago) (1 children)

Because the title is still vague, and yes, GPU and "graphics card" are often used interchangeably on the internet (examples: https://www.hp.com/gb-en/shop/tech-takes/integrated-vs-dedicated-graphics-cards and https://www.ubisoft.com/en-us/help/connectivity-and-performance/article/switching-to-your-pcs-dedicated-gpu/000081045 ).

"New CPU hits 132fps" could wrongly suggest software rendering, which is very different (see for example https://www.gamedeveloper.com/game-platforms/rad-launches-pixomatic----new-software-renderer ) and died more than a decade ago.

[–] blueday@lemmy.world 8 points 1 year ago (1 children)

Up until the G in 8700G I totally thought 'software renderer' and was hella impressed. So yea, totally plausible it could have been described better.

[–] fuckwit_mcbumcrumble@lemmy.world 6 points 1 year ago (1 children)

Software rendering hasn't worked in 99% of games made in the last 15+ years. Only the super low-fi hipster stuff would be fine without 3D acceleration.

[–] Max_P@lemmy.max-p.me 10 points 1 year ago (1 children)

Yeah, slightly misleading, but I guess they did mention a card specifically, not a GPU.

But for a moment I was like wow, 100FPS in software rendering, that's impressive even for an EPYC.

[–] AlmightySnoo@lemmy.world 6 points 1 year ago

> But for a moment I was like wow, 100FPS in software rendering

Thank you, that was exactly my point.

[–] LazaroFilm@lemmy.world 7 points 1 year ago

I can see single-board computers with this in them for powerful TV boxes. Hello emulators and SteamOS‽

[–] PersonalDevKit@aussie.zone 6 points 1 year ago

They state graphics card in the title and description, not GPU, implying a dedicated graphics solution, not an integrated one.

[–] flintheart_glomgold@lemmy.world 15 points 1 year ago* (last edited 1 year ago) (1 children)

$US330 for the top 8700G APU with 12 RDNA 3 compute units (compare to 32 RDNA 3 CUs in the Radeon RX 7600). And it only draws 88W at peak load and can be passively cooled (or overclocked).

$US230 for the 8600G with 8 RDNA 3 CUs. It falls about 10-15% short of 8700G performance in games, but with a much bigger gap on the CPU side (per Tom's Hardware benchmarks), so I'm pretty meh on that one.

Given the higher costs of AM5 boards and DDR5 RAM, for about the same or $100-200 more than an 8700G build you could combine a cheaper CPU with a better GPU and get way more bang for your buck. But I see the 8700G being a solid option for gamers on a budget, or for parents wanting to build younger kids their first cheap-but-effective PC.

I also see this as a lazy man's solution to building small-form-factor mini-ITX home theatre PCs that run silent and don't need a separate GPU to receive 4K live streams. I'm exactly in this boat right now: I literally don't wanna fiddle with cramming a GPU into some tiny box, but I also don't want some piece-of-crap iGPU in case I use the HTPC for some light gaming from time to time.

[–] bonus_crab@lemmy.world 5 points 1 year ago

It'll be a great upgrade for those little NUC-like things, thin laptops, and Steam Deck competitors.

[–] sturmblast@lemmy.world 14 points 1 year ago (2 children)

That's pretty damn impressive. AMD is changing the game!

[–] SpaceCadet@feddit.nl 5 points 1 year ago* (last edited 1 year ago) (3 children)

Meh. It's also a $330 chip...

For that price you can get a 12th-gen i3/RX 6600 combination which will obliterate this thing in gaming performance.

[–] barsoap@lemm.ee 6 points 1 year ago (1 children)

Your i3 has half the cores. Spending more on GPU and less on CPU gives better fps, news at 11.

[–] SpaceCadet@feddit.nl 5 points 1 year ago (2 children)

So what's the point of this thing then?

If you just want 8 cores for productivity and basic graphics, you're better off getting a Ryzen 7 7700, which isn't gimped by half the cache and less than half the PCIe bandwidth. And for gaming, even the shittiest discrete GPUs of the current generation will beat it if you give them a half-decent CPU.

This thing seems to straddle a weird position between gaming and productivity, where it can't do either really well. At that price point, I struggle to see why anyone would want it.

It's like that old adage: there are no bad CPUs, only bad prices.

[–] northendtrooper@lemmy.ca 10 points 1 year ago (4 children)

So will this be an HTPC king? The article kind of skimped on the temps. I assume HWU will go over it; I'll watch that soon.

[–] Poutinetown@lemmy.ca 10 points 1 year ago (5 children)

It's about the same performance as a 1050 Ti, which is a 2016 GPU. It's still very much behind entry-level discrete GPUs like the RX 6600.

Might make sense for a laptop or mini PC, but I don't really see the point on desktop considering the price.

[–] surewhynotlem@lemmy.world 19 points 1 year ago (2 children)

I think I might be the target market. I'm very happy with my 1070. I need a CPU and mobo upgrade imminently. I might just snag this and not think about a discrete GPU for a while.

[–] cyberpunk007@lemmy.ca 8 points 1 year ago (1 children)

It's a CPU... so that's pretty impressive.

[–] asdfasdfasdf@lemmy.world 4 points 1 year ago (1 children)

It's a GPU. It's just integrated with the CPU, so you don't need a dedicated graphics card.

[–] echo64@lemmy.world 9 points 1 year ago (1 children)

The PlayStation 5 also does this.

[–] Toes@ani.social 7 points 1 year ago

I wonder how well it does AI workloads.

[–] XEAL@lemm.ee 6 points 1 year ago (1 children)

Aaaaand the 7950X3D is not top tier anymore

[–] SpookyLegs@lemmy.world 9 points 1 year ago (1 children)

Back in my day the 7950 was a GPU!

Yelling at clouds
