Thousand times this. For actual builders who care about the nuance it all probably makes sense, but then there's me over here looking at pre-builts, wondering why the fuck two seemingly identical machines have a $500 difference between them.
I’m spending so much time poring over spec sheets to find “oh, the non-z version discombobulator means this cheaper one is gonna be trash in three years when I can afford to upgrade to a 6megadong tri-actor unit”.
I’m in this weird state of too cheap to buy a Mac and can’t be arsed to build my own.
Yeah, and when you check the detail pages of the games and other software you are upgrading it for it’ll turn out the 6 megadong tri-actor unit should work well in general, but there’s a certain crashing bug near the end of this game I already bought that the devs haven’t patched yet…
And even after all those considerations modded Minecraft will be just about functional.
Just go here and check the charts for the kind of work you want the PC to do. If one looks promising you can check specific reviews on YouTube.
For gaming the absolute best cpu/gpu combo currently is the 9800x3d and a rtx 4090, if you don’t have a budget.
Yes the part naming is confusing but it’s intentional.
Gamers Nexus
It’s funny that you wrote the wrong GPU name while agreeing that the naming is confusing.
R*TX 4090
Yes the part naming is confusing but it’s intentional
Yes, that’s what people are upset about.
For very broad definitions of “convention”
Just don’t rent one from NZXT.
I saw a video on Gamers Nexus about how shitty a company they are. Hopefully word spreads amongst gamers & builders that they’re no good and they should be avoided.
What’s the deal with them? The only NZXT component I’ve had is my current case, which has awful airflow (an old model of the H710 I think, bought 5-ish years ago).
Apparently their PC rental program is a worse value than illegal loans that are likely mafia-backed.
Apparently they very recently got acquired or invested in and are probably looking to increase profits tenfold in under a year so the company can be dumped before it all crashes.
If you are blindly renting things without doing numbers you have bigger issues.
Always read the terms and do the long-term calculations.
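Even a rough sketch makes the point; every number below is a placeholder I made up, not actual rental pricing:

```python
# Hypothetical rent-vs-buy comparison -- every number here is a placeholder,
# not actual rental pricing.
monthly_rent = 160     # $/month for a rented mid-range gaming PC (assumed)
months = 24            # how long you'd realistically keep it (assumed)
buy_price = 1500       # comparable PC bought outright (assumed)
resale_value = 500     # what you could sell it for afterwards (assumed)

total_rent = monthly_rent * months            # $3840
net_buy_cost = buy_price - resale_value       # $1000

print(f"Renting for {months} months: ${total_rent}")
print(f"Buying and reselling later:  ${net_buy_cost}")
print(f"Renting costs {total_rent / net_buy_cost:.1f}x as much")
```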
Problem is that a lot of “influencers” advertise it to teens as an easy way to get a new computer.
I think that is more on the teens and their parents.
I recently had to go through this maze. I hate it. And I’m glad that my PCs tend to live ~10y, this means that I’m not doing it again in the foreseeable future.
Meanwhile the data I care about, efficiency, is not readily available. I’m not gonna put a 350-watt GPU in the 10-liter case if I can have the same performance at 250 watts.
At least Tom’s Hardware now includes efficiency in tests for newer cards, and Gamers Nexus has started adding an efficiency score in frames per joule. They also have full write-ups of their videos on the website.
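You can also compute frames per joule yourself from any average-FPS benchmark plus a measured board power; the numbers below are invented just to show the arithmetic:

```python
# Frames per joule: fps is frames/second, board power is joules/second,
# so dividing the two gives frames per joule. Example numbers are made up.
def frames_per_joule(avg_fps: float, board_power_w: float) -> float:
    return avg_fps / board_power_w

card_a = frames_per_joule(avg_fps=120, board_power_w=350)  # ~0.34 frames/J
card_b = frames_per_joule(avg_fps=110, board_power_w=250)  # ~0.44 frames/J
print(f"Card A: {card_a:.2f} frames/J, Card B: {card_b:.2f} frames/J")
```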
Tell me about it. The numbers that I’m interested in - “decibels under full load”, “temperature at full load” - might as well not exist. Will I be able to hear myself think when I’m using this component for work? Will this GPU cook all of my hard drives, or can it vent the heat out the back sufficiently?
I wish this data was more available. I upgraded my GPU to a 6800 XT and it’s so loud. I can’t enjoy sitting at my desk without hearing a loud whine and a bunch of other annoying noises. It’s probably because the card is second-hand, but still.
Maybe not, cuz I have a first-hand 7900 XTX and if I load it up it whines horribly lol.
Temperature is meaningless unless you want OC headroom. A watt into your room is the same no matter what temp the part runs at.
That’s not correct, I’m afraid.
Thermal expansion is proportional to temperature change; it’s quite significant for ye olde spinning-rust hard drives, but the mechanical stress affects all parts in a system. Especially for a gaming machine that’s not run 24/7 - it will experience thermal cycling. Mechanical strength also decreases with increasing temperature, which makes it worse.
Second law of thermodynamics is that heat only moves spontaneously from hotter to colder. A 60° bath can melt more ice than a 90° cup of coffee - it contains more heat - but it can’t raise the temperature of anything above 60°, which the coffee could. A 350W graphics card at 20° couldn’t raise your room above that temperature, but a 350W graphics card at 90° could do so. (The “runs colder” card would presumably have big fans to move the heat away.)
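For a sense of scale on the bath-vs-coffee comparison, here’s a back-of-envelope check; the 100 L bath and 300 mL cup are sizes I assumed:

```python
# Heat available cooling down to 0 C, converted into ice melted.
# Volumes are assumptions: a 100 L bath and a 0.3 L cup of coffee.
C_WATER = 4186      # J/(kg*K), specific heat of liquid water
L_FUSION = 334_000  # J/kg, latent heat of melting ice

def ice_melted_kg(water_kg: float, temp_c: float) -> float:
    heat_available = water_kg * C_WATER * temp_c  # J released cooling to 0 C
    return heat_available / L_FUSION

print(f"60 C bath (100 kg):   {ice_melted_kg(100, 60):.1f} kg of ice")  # ~75 kg
print(f"90 C coffee (0.3 kg): {ice_melted_kg(0.3, 90):.2f} kg of ice")  # ~0.34 kg
```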
That is fundamentally not how PC cooling works. Each part is a closed system, with the PC an open system so long as you have fans. Heat sink temp over ambient could be what you are looking for, but even that doesn’t work that way if you are looking at hot-spot temps. If you ran a Threadripper at 500W in a closed space, the air temp would end up hotter than with a 350W graphics card, even though the CPU (if not throttling) might only sit about 30°C over ambient while the GPU core sits about 45°C over ambient. The effect on your room is that the 500W CPU raises the ambient temp more than the 350W GPU over the same period of time. The air in your room is what is cooling the components; air at a given humidity has a specific heat capacity, and that is your limiting factor. As for your bath example, you would need a much larger volume of 60°C water to melt the ice, since the specific heat of water doesn’t change while it’s a liquid.
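To put rough numbers on the “only the wattage matters” point; the 30 m³ room and the perfectly-sealed assumption are mine, just to illustrate:

```python
# Temperature rise of the air in a sealed, perfectly insulated room.
# A 30 m^3 room and zero heat loss are assumptions -- unrealistic, but they
# show that only wattage and time matter, not the chip's die temperature.
AIR_DENSITY = 1.2   # kg/m^3
AIR_CP = 1005       # J/(kg*K)
air_mass = 30 * AIR_DENSITY  # ~36 kg of air in a 30 m^3 room

def room_temp_rise_c(power_w: float, hours: float) -> float:
    energy_j = power_w * hours * 3600
    return energy_j / (air_mass * AIR_CP)

print(f"500 W CPU for 1 h: +{room_temp_rise_c(500, 1):.0f} C")  # ~+50 C, ideal case
print(f"350 W GPU for 1 h: +{room_temp_rise_c(350, 1):.0f} C")  # ~+35 C, ideal case
```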
You have a fundamental misunderstanding of the 1st law of thermodynamics and what a “system” is as relating to the 2nd.
For your HDDs you want them to run at 45-60°C; running them colder will impact their lifespan. The drive will try to heat itself up if it’s under 30°C to prevent damage.
60% or 60 percentage points?
This is why I love Lemmy (it’s a reference to another thread btw)
That post is older than Lemmy
Wouldn’t that be the same thing with no other percentages in sight because we’re subtracting from 100%?
I have no idea, that was just a tongue-in-cheek reference to that other thread.
Fortunately there are resources that make a good starting point because I agree; naming schemes are a shit show. I generally start with this and go from there research wise. https://www.logicalincrements.com/
I’d be very careful relying on that site… I just flipped through some of the builds and it was very strange.
E.g. they were recommending a $500 or $900 CASE at the highest tiers - not even good cases, you can get something less than half the price with better performance. They recommended a single PCIe 4.0 SSD and a SPINNING HARD DRIVE for a motherboard with PCIe 5.0 M.2 slots. They recommend CPU coolers that are far, far in excess of requirements (a 3x140mm radiator for a 100W chip? Nonsense). Memory recommendations for AMD builds are also sus - DDR5-6000 CL30 is what those CPUs do best with, but they were recommending DDR5-5600 CL32 kits for no reason (quick latency math below).
Just strange… makes me question the rest of their recommendations.
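For what it’s worth, the memory point checks out with the usual first-word-latency math (kit specs taken from my comment above):

```python
# First-word latency in nanoseconds: CAS cycles divided by the memory clock.
# DDR transfers twice per clock, so clock (MHz) = transfer rate (MT/s) / 2.
def first_word_latency_ns(transfer_rate_mt_s: int, cas_latency: int) -> float:
    return cas_latency / (transfer_rate_mt_s / 2) * 1000

print(f"DDR5-6000 CL30: {first_word_latency_ns(6000, 30):.2f} ns")  # 10.00 ns
print(f"DDR5-5600 CL32: {first_word_latency_ns(5600, 32):.2f} ns")  # ~11.43 ns
```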
Mind you, recommending a PCIe 4.0 SSD is the one part that makes sense. Right now very few people will gain noticeable benefits from a PCIe 5.0 SSD, AFAIK. The rest though… yikes.
The price differential doesn’t really exist anymore, though. If they were recommending 4TB, then I’d agree (there are only a few 4TB 5.0 drives and they’re quite pricey), but at 2TB you’re looking at like a $10 difference between something like the MP700 and the SN850X they recommend (not counting all the Black Friday sales going on).
Ah, good to know. Thanks.
…Just the other day I ordered all the components to make the first “Extremist” tier build, nearly verbatim.
I guess I made some of the right choices, then.
Power consumption is part of the equation now too. You’ll often see newer-generation hardware that has comparable performance to a last-gen model but is a lot more power efficient.
Or you’ll see something equally efficient and equally performant at the same power levels… except newer gens or upgraded SKUs are allowed to pull more power.
Just buy AMD 😜
Honestly my preferred manufacturer since I started putting together my own machines.
Make sure to get your 5900x3d with your 7900XTX. Note that one is a CPU and the other is a GPU. For extra fun, their numbers should eventually overlap given their respective incrementation schemes. The 5900x3d is the successor to the 5900xd, which is a major step down in performance even though it has more cores.
I’m gonna give this award to Intel, which has increased the numbers on their CPU line by 1000 every generation since before the 2008 housing crash.
It’s so annoying when you buy a GPU instead of a CPU.
Or when you buy a GPU inside of your CPU.
They already do overlap, 7000 series CPUs have been out for a while. As have the 5000 series GPUs.
…don’t worry, I’m sure Intel won’t change things up on us… right? (Just pretend the last year of Intel CPUs didn’t happen)
I assume you haven’t seen the latest series of processors from Intel…
You still need to understand their naming convention if you plan on comparing hardware.
The only thing you should realistically take from the naming conventions is the relative generation and which price/performance bracket the part targets. Assuming more than that is just a mistake.
Is it not still “higher is better” at AMD? With the obvious X or “M” suffixes, but usually price reflects the specs when the numbers are the same.
Just ordered another CPU from them. Downside is that there isn’t any modern AMD desktop platform that works with coreboot, which seems to be the only workable way to deactivate the Management Engine/Platform Security Processor after boot.
Was really considering swapping to Intel for that, but I got a good deal on a Ryzen 9 that fits my socket, so…
Is there anything from the last 10 years that runs coreboot?
They want you to fork over some cash for the most current binaries, though.
You can of course just build it from source.
The most current AMD boards that are supported are FM2+. I actually have an FM2+ processor lying around somewhere, an Athlon II X4 860K, but that thing uses a lot of power for not very much performance.
Oh is this a different project to libreboot?
Yeah, it’s a different coreboot fork. They seem to be kinda focused on selling their implementation to corporate users, but if that finances open source development, I’m not gonna complain.
You are mixing coreboot up with libreboot
Also, libreboot now ships some proprietary firmware, so it is more compatible than it used to be.
AMD is one of the worst with naming
They had at least two or three halfway sensible naming schemes, which they then proceeded to abandon after like one generation.
I fault the marketing departments at the chipmakers, who are trying to somehow justify their existence.
Explain yourself.
Explain how AMD naming works. I’m so confused, as it is pretty hard to understand, plus they will randomly violate their own conventions.
I always go by the rule that the larger the number/the more letters, the better. The exception being M, which usually means it’s made for mobile devices.
I’ll trade you my GeForce 9500 for your 4090.
Ok maybe also look at the year the card was released too.
Q. E. D.
quantum electrodynamics
How about my GeForce 9500 for your Vega 64?
The other exception being monitors, which are named by connecting three keyboards to one computer and then rolling a bowling ball across all three.
No one really knows how that method was established, but it’s industry standard now.
They know people like you are the majority; that’s why, especially when it comes to low-end hardware, they up the price while selling you the same or worse performance just because the part is newer.
I occasionally “refresh” my PC with new board, CPU etc. I never buy the top of the line stuff and quite honestly there is little reason to. Games are designed to play perfectly well on mid range computers even if you have to turn off some graphics option that enables some slight improvement in the image quality.
I agree. Another good trick: don’t buy a 4K screen. GPUs last much longer that way.
For many games you can set the rendering resolution to, for example, 1080p but run the whole game at 4K, so text, menus and so on are super crisp while the game still runs very light. But maybe it’s good advice to never even start, because I can’t imagine going back to 1080p after using 2K and 4K screens.
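The reason that trick is so light is simple pixel math:

```python
# Pixels the GPU has to shade per frame at each rendering resolution.
native_4k = 3840 * 2160      # 8,294,400 pixels
render_1080p = 1920 * 1080   # 2,073,600 pixels

print(f"4K has {native_4k / render_1080p:.0f}x the pixels of 1080p")  # 4x
# Rendering the 3D scene at 1080p while text and menus stay at native 4K
# cuts most of the shading work to roughly a quarter.
```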
I just go by PassMark ratings for CPU and GPU. It may not be the most nuanced rating, but it does give numbers that can be easily compared.
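If you want to go one step further, score-per-dollar is easy too; the scores and prices here are placeholders, not real PassMark results:

```python
# Crude value metric: benchmark score per dollar. Scores and prices below
# are invented for illustration, not real PassMark results.
def score_per_dollar(benchmark_score: int, price_usd: float) -> float:
    return benchmark_score / price_usd

cpu_a = score_per_dollar(benchmark_score=45_000, price_usd=480)  # ~94 pts/$
cpu_b = score_per_dollar(benchmark_score=38_000, price_usd=320)  # ~119 pts/$
print(f"CPU A: {cpu_a:.0f} pts/$, CPU B: {cpu_b:.0f} pts/$")
```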
And it sucks! Sorry, I mean it SUX.
They periodically run out of integers so they have to reuse old ones.