I wonder if my system is good or bad. My server needs 0.1kWh.
The PC I’m using as a little NAS usually draws around 75 watts. My Jellyfin and general home server draws about 50 watts while idle but can jump up to 150 watts. Most of the components are very old. I know I could get the power usage down significantly with newer components, but I’m not sure the electricity savings outweigh the cost of sending the old parts to the landfill and creating demand for more new components to be manufactured.
Pulling around 200W on average.
- 100W for the server: Xeon E3-1231v3 with 8 spinning disks + HBA, and a couple of SATA SSDs
- ~80W for the UniFi PoE 48 Pro switch. Most of this is PoE power for half a dozen cameras, downstream switches and APs, and a couple of Raspberry Pis
- ~20W for a Protectli Vault running OPNsense
- Total usage measured via Eaton UPS
- Subsidised during the day with solar power (Enphase)
- Tracked in home assistant
kWh is a unit of energy, not power
Wasn’t it stated as the usage during November? 60 kWh for November seems logical to me.
Edit: forget it, he’s saying his server needs 0.1kWh which is bonkers ofc
Only one person here has posted their usage for November. The OP has not talked about November or any timeframe.
Yeah, mixed up posts; I thought one depended on the other because it was under it. Again, forget my post :-)
I was really confused by that, and by the fact that the chosen unit wasn’t just W (even 0.1 kW would be pretty weird).
Wh shouldn’t even exist tbh, we should use Joules, less confusing
Watt hours makes sense to me. A watt hour is just a watt draw that runs for an hour, it’s right in the name.
Maybe you’ve just whooooshed me or something, I’ve never looked into Joules or why they’re better/worse.
Joules (J) are the official unit of energy. 1W=1J/s. That means 1Wh=3600J or that 1J is kinda like “1 Watt second”. You’re right that Wh is easier since everything is rated in Watts and it would be insane to measure energy consumption by seconds. Imagine getting your electric bill and it says you’ve used 3,157,200,000J.
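If you want to sanity-check that bill figure, here’s a quick Python sketch (the 3,157,200,000 J number is just the hypothetical from the comment above):

```python
# Convert the hypothetical bill from joules back to kWh (1 Wh = 3600 J).
joules = 3_157_200_000
kwh = joules / 3600 / 1000
print(f"{kwh:.1f} kWh")  # ~877.0 kWh, a plausible monthly household bill
```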
Thanks for the explainer, that makes a lot of sense.
3,157,200,000J
Or just 3.1572GJ.
Which apparently is how this Canadian natural gas company bills its customers: https://www.fortisbc.com/about-us/facilities-operations-and-energy-information/how-gas-is-measured
I guess it wouldn’t make sense to measure energy used by gas-powered appliances in Wh since they’re not rated in Watts. Still, measuring volume and then converting to energy seems unnecessarily complicated.
At least in the US, the electric company charges in kWh, computer parts are advertised in terms of watts, and batteries tend to be in amp hours, which is easy to convert to watt hours.
Joules just overcomplicates things.
Wow, the US education system must be improved. 1 Wh is 3600 J. That’s literally the same thing, but the name is less confusing because people tend to confuse W and Wh
Running an old 7th-gen Intel. It has a 2070 and a 1080 in it, six mechanical hard drives, and 3 SSDs. Then I have an 8th-gen laptop with a 1070 Ti Mobile, but the laptop’s a camera server so it’s always running balls to the wall. Also running a UniFi Dream Machine Pro, a 24-port PoE switch, a 16-port PoE, and an 8-port PoE.

Because of the overall workload and the age of the CPU, it burns about 360 watts continuously.

I can save a few watts by putting the disks to sleep, but I’m in the camp where spinning the disks up and down costs more wear than continuous running.
Edit: cleaned up the slaughter from the dictation, after I cleaned up my physical space from Christmas festivities.
You might have your units confused.
0.1kWh over how much time? Per day? Per hour? Per week?
Watt-hours refer to the total energy used to do something, from a starting point to an ending point. It makes no sense to say that a device needs a certain amount of Wh, unless you’re talking about something like charging a battery to full.

Power being used by a device (like a computer) is just watts.
Think of the difference between speed and distance. Watts is how fast power is being used, watt-hours is how much has been used, or will be used.
If you have a 500 watt PC, for example, it uses 500Wh, per hour. Or 12kWh in a day.
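As a rough sketch of that speed/distance analogy in Python (the constant 500 W draw is just the example above, not a real measurement):

```python
# Watts are the rate; watt-hours are the accumulated total.
power_w = 500                       # constant draw, per the example
energy_wh = power_w * 1             # one hour -> 500 Wh
energy_kwh_day = power_w * 24 / 1000
print(energy_wh, energy_kwh_day)    # 500 Wh per hour, 12.0 kWh per day
```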
I forgive 'em cuz watt hours are a disgusting unit in general
| idea | what | unit |
| --- | --- | --- |
| speed | change in position over time | meters per second, m/s |
| acceleration | change in speed over time | meters per second, per second, m/s/s = m/s² |
| force | acceleration applied to each unit of mass | kg · m/s² |
| work | acceleration applied along a distance, which transfers energy | kg · m/s² · m = kg · m²/s² |
| power | work over time | kg · m²/s³ |
| energy expenditure | power level during units of time | (kg · m²/s³) · s = kg · m²/s² |

Work over time, × time, is just work! kWh are just joules (J) with extra steps! Screw kWh, I will die on this hill!!! Raaah
Power over time could be interpreted as power/time. Power x time isn’t power, it’s energy (=== work). But otherwise I’m with you. Joules or gtfo.
Whoops, typo! Fixed c:
Could be worse, could be BTU. And some people still use tons (of heating/cooling).
kWh is the stupidest unit ever. kWh = 1000 J/s × 60 × 60 s = 3.6×10⁶ J, so 0.1 kWh = 360 kJ
If you have a 500 watt PC, for example, it uses 500Wh, per hour. Or 12kWh in a day.
A maximum of 500 watts. Fortunately your PC doesn’t actually max out your PSU or your system would crash.
0.1kWh per hour? Day? Month?
What’s in your system?
A computer with a GPU and 50TB of drives. I will measure the computer on its own in the next couple of days to see where the power consumption comes from.
You are misunderstanding the confusion, Kwh is an absolute measurement of an amount of power, not a rate of power usage. It’s like being asked how fast your car can go and answering it can go 500 miles. 500 miles per hour? Per day? Per tank? It doesn’t make sense as an answer.
Does your computer use 100 watt hours per hour? Translating to an average of 100 watts power usage? Or 100 watt hours per day maybe meaning an average power use of about 4 watts? One of those is certainly more likely but both are possible depending on your application and load.
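To make those two readings concrete, a quick sketch (both durations are hypothetical):

```python
# Same 100 Wh of energy, very different average power depending on the window.
energy_wh = 100
print(energy_wh / 1)   # 100.0 W average if it was consumed over one hour
print(energy_wh / 24)  # ~4.2 W average if it was consumed over one day
```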
You’re adding to the confusion.
kWh (as in kW*h) and not kW/h is for measurement of energy.
Watt is for measurement of power.

Lol thank you, I knew that, I don’t know why I wrote it that way; in my defense it was like 4 in the morning.
They said kilowatt-hours per hour, not kilowatts per hour.
kWh/h = kW
The h can be cancelled, resulting in kW. They’re technically right, but kWh/h shouldn’t ever be used haha.
Yeah but tbh it’s understandable that OP got confused. I think he just means 100W
Which GPU? How many drives?
Put a Kill A Watt meter on it and see what it says for consumption.
Do you mean 0.1kWh per hour, so 0.1kW or 100W?
My N100 server needs about 11W.
The N100 is such a little powerhouse, and I’m sad they haven’t managed to produce anything better. All of the “upgrades” are either not enough of an upgrade for the money or just more power hungry.
To my understanding 0.1kWh means 0.1 kW per hour.
It’s the other way around. 0.1 kWh means 0.1 kW times 1 h. So if your device draws 0.1 kW (100 W) of power for an hour, it consumes 0.1 kWh of energy. If your factory draws 360,000 W for a second, it consumes the same 0.1 kWh of energy.

Thank you for explaining it.
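A quick sketch of both cases, if it helps (the 360,000 W factory is just the illustration above):

```python
# Two very different devices consuming the same 0.1 kWh of energy.
energy_j = 0.1 * 3.6e6     # 0.1 kWh = 360,000 J
print(energy_j / 100)      # a 100 W device takes 3600 s (one hour)
print(energy_j / 360_000)  # a 360,000 W factory takes just 1 s
```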
My computer uses 1 kWh per hour.
It does not yet make sense to me; it just feels wrong. I understand that you may normalize 4 Wh in 15 minutes to 16 Wh, because it would use 16 Wh per hour if it ran that long.

Why can’t you simply assume that I mean 1 kWh per hour when I say 1 kWh, and not 1 kWh per 15 minutes?
A watt is 1 Joule per Second (1 J/s). E.g. Every second, your device draws 1 Joule of energy. This energy over time is called “Power” and is a rate of energy transfer.
A watt-hour is (1 J/s) * (1 hr)
This can be rewritten as (3600 J/hr) * (1 hr). The “per hour” and “hour” cancel themselves out which makes 1 watt-hour equal to 3600 Joules.
1 kWh is 3,600 kJ or 3.6 MJ
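You can also let a units library do the bookkeeping; a minimal sketch, assuming the `pint` package is installed:

```python
import pint

# pint tracks the dimensions, so the hour/second cancellation is automatic.
ureg = pint.UnitRegistry()
print(ureg("1 Wh").to("J"))    # 3600.0 joule
print(ureg("1 kWh").to("MJ"))  # 3.6 megajoule
```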
kWh is a unit of energy consumed. It doesn’t say anything about time and you can’t assume any time period; that wouldn’t make any sense. If you want to say how much power a device consumes, just state how many watts (W) it draws.
Thanks!
0.1kWh per hour can be written as 0.1kWh/h, which is the same as 0.1kW.
Thanks. Hence, in the future I can say that it uses 0.1kW?
If this was over an hour, yes. Though you’d typically state it as 100W ;)
Yes. Or 100W.
My whole setup, including 2 Pis and one fully specced-out AM4 system with 100TB of drives, an Intel Arc, and 4x 32GB ECC RAM, uses between 280W and 420W. I live in Germany and pay 25ct per kWh; my whole apartment uses 600W at any given time and approximately 15kWh per day 😭
9 spinning disks and a couple SSDs - right around 190 watts, but that also includes my router and 3 PoE WiFi APs. PoE consumption is reported as 20 watts, and the router should use about 10 watts, so I think the server is about 160 watts.
Electricity here is pretty expensive, about $.33 per kWh, so by my math I’m spending $38/month on this stuff. If I didn’t have lots of digital media it’d be worth it to get a VPS probably. $38/month is still cheaper than Netflix, HBO, and all the other junk I’d have to subscribe to.
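For anyone curious, the math behind that $38 (a sketch using the ~160 W server-only figure and the $0.33/kWh rate from above):

```python
# Monthly cost = average watts -> kWh per month -> dollars.
watts = 160
rate = 0.33                         # USD per kWh
kwh_month = watts * 24 * 30 / 1000  # ~115 kWh
print(f"${kwh_month * rate:.2f}")   # ~$38.02/month
```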
That’s true. And the children of my family see no ads, which is priceless. Yet I am looking into ways to cut costs in half by using an additional lower-powered mini PC which is always on, with the main computer only running in the evening - maybe.
Same here. 300W with 12 disks, switches, and router. But electricity only costs $.12/kWh. I wouldn’t trust having terabytes of data in the cloud.
Last I checked with a Kill A Watt, I was drawing an average of 2.5kWh after a week of monitoring my whole rack. That was about three years ago, and the following was running in my rack.
- r610 dual 1kw PSU
- homebuilt server Gigabyte 750w PSU
- homebuilt Asus gaming rig 650w PSU
- homebuilt Asus retro(xp) gaming/testing rig 350w PSU
- HP laptop as dev env/warmsite ~ 200w PSU
- Amcrest NVR 80w (I guess?)
- HP T610 65w PSU
- Terramaster F5-422 90w PSU
- TP-Link TL-SG2424P 180w PSU
- Brocade ICX6610-48P-E dual 1kw PSU
- Misc routers, rpis, poe aps, modems(cable & 5G) ~ 700w combined (cameras not included, brocade powers them directly)
I also have two battery systems split between high priority and low priority infrastructure.
Ugh, I need to get off my ass and install a rack and some fiber drops to finalize my network buildout.
Would love to add more fiber to my diet, if I had the time and money! The next four years are going to get pricey, so I’m solidifying my stack now with backup hardware and planning for failures.

The Brocade is running my private LAN since it’s the most important; physically cut off from public access. Just upgraded most of my servers to use 10GbE and would love to run fiber to my office about 60-70 feet away.

The Brocade I’m using was unlocked by the eBay seller I got it from, so it can theoretically transfer up to 40G. Would be great for the AI rig I keep in the office.
I was drawing an average of 2.5kWh after a week of monitoring my whole rack
That doesn’t seem right; that’s only ~15W average. Each one of those systems alone will exceed that at idle running 24/7. I’d expect 1-2 orders of magnitude more.
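The arithmetic, in case it helps (assuming that “2.5 kWh” really was cumulative over the week):

```python
# Average power implied by 2.5 kWh consumed over one week.
kwh = 2.5
hours = 7 * 24
print(kwh * 1000 / hours)  # ~14.9 W, implausibly low for a full rack
```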
IDK, after a week of runtime it told me 2.5kWh average. Could be average per hour?
The highest power bill I ever saw was summer of 2022: $1800. Temps outside were into the 110-120 range, and it was the hottest ever here.

Maybe I’ll hook it back up, but I’ve got different (newer) hardware now.
I think 200W at max? It runs a collection of fedi/self-hosted stuff.

I also run a Pi 3 with a couple of apps that sips power.
It’s a legitimate issue because it’s 50+ cents per kilowatt-hour where I live, so power is very expensive…
That seems really high; I think power where I live is about 12-14 cents per kilowatt-hour. What makes it so expensive where you live?
0.50 $/kWh is a normal price in Europe now
Mostly just that they can. It’s more expensive per tier actually.
https://www.pge.com/assets/pge/docs/account/rate-plans/residential-electric-rate-plan-pricing.pdf
Take a look, this is the old pricing. They just voted to up it again.
There’s legislation that is moving along to charge people with solar because…idk.
They’re charging for solar because PGE is a greedy fuck.
Yes. There was talk locally of the local government taking control of the power, but it’s just talk…
It gets over 110 where I live in the summer…so air conditioning can make it very expensive.
Wait, this is in the US? How? This is even more expensive than Hawaii, and they have obvious reasons for power to be more expensive there.
PGE serves Northern California. They keep raising rates like 10-15% each year to cover their losses after all the wildfires a couple years ago and because of the greed.
Take note, folks. Watch this trend spread.
Yep. And they are talking about a couple more price hikes next year. Significant ones.
Damn, I wish ours was that cheap. We’re roughly $.30/kWh, mostly because our local poco is a reseller of SCE and we’re in a rural area.
Holy shit. I’m paying less than 10¢ per kWh even in the “high usage” tier.
I wish that was ours…
That’s insane, I pay like 5¢ a kWh
Want to switch?
I’m idling at 120W with eight drives, but I’m currently looking into how to lower it.
Idles at around 24W. It’s amazing that your server only needs .1kWh once and keeps on working. You should get some physicists to take a look at it, you might just have found perpetual motion.
.1kWh is 100Wh
I ate sushi today.
Good point. Now it does make sense. I know the secret to the perpetual motion machine now.
This is a factual but irrelevant statement
Mate, kWh is a measure of electricity volume, as gallons are to liquid. Also, 100 watt hours would be a much more sensical way to say the same thing. What you’ve said in the title is like saying your server uses 1 gallon of water. It’s meaningless without a unit of time. Watts is a measure of current flow (pun intended), similar to a measurement like gallons per minute.

For example, if your server uses 100 watts for an hour it has used 100 watt hours of electricity. If your server uses 100 watts for 100 hours it has used 10,000 watt hours of electricity, aka 10 kWh.
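Same idea in code, sticking with the flow analogy (both rates are made-up examples):

```python
# Rate x time = total, whether it's water or electricity.
gallons_per_minute = 2.0
watts = 100
print(gallons_per_minute * 60)  # 120 gallons drawn in an hour
print(watts * 100 / 1000)       # 100 W for 100 hours -> 10.0 kWh
```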
My NAS uses about 60 watts at idle, and near 100W when it’s working on something. I use an old laptop for a Plex server; it probably uses like 50 watts at idle and like 150 or 200 when streaming a 4K movie, I haven’t checked tbh. I did just acquire a BEEFY network switch that’s going to use 120 watts 24/7 though, so that’ll hurt the pocketbook for sure. Soon all of my servers should be in the same place with that network switch, so I’ll know exactly how much power it’s using.
My server rack has
- 3x Dell R730
- 1x Dell R720
- 2x Cisco Catalyst 3750x (IP Routing license)
- 2x Netgear M4300-12x12f
- 1x Unifi USW-48-Pro
- 1x USW-Agg
- 3x Framework 11th Gen (future cluster)
- 1x Protectli FE4B
All together that draws… 0.1 kWh… in about 327 seconds.
In real-time terms, measured at the UPS, I have a running steady-state load of 900-1100W depending on what I have at load. I call it my computationally efficient space heater because it generates more heat than is required for my apartment in winter, except for the coldest of days. It has a dedicated 120V 15A circuit.
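For the curious, the 327-second figure is just 0.1 kWh divided by the top of that load range; a quick check:

```python
# Time for a ~1100 W rack to consume 0.1 kWh.
energy_j = 0.1 * 3.6e6     # 0.1 kWh in joules
power_w = 1100             # upper end of the stated steady-state load
print(energy_j / power_w)  # ~327 s, about five and a half minutes
```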
Good lord, how much does electricity cost where you are? Combined with the air conditioning to keep the space livable, that would be prohibitively expensive for me
It’s always wild reading the power draws people post here.

I know it’s because this is a US & Europe centric site and many homelab people actually run enterprise-size rigs, but my 4-member household runs on 2kW for the entire house lol, and 75% of that is just the A/C we use at night.
My household of 7 averages 900 watts year-round.