I wonder if my system is good or bad. My server needs 0.1kWh.
Mate, kWh is a measure of electricity volume, like gallons are to liquid. Also, 100 watt hours would be a much more sensible way to say the same thing. What you've said in the title is like saying your server uses 1 gallon of water. It's meaningless without a unit of time. Watts is a measure of current flow (pun intended), similar to a measurement like gallons per minute.
For example, if your server uses 100 watts for an hour it has used 100 watt hours of electricity. If your server uses 100 watts for 100 hours it has used 10,000 watt hours of electricity, aka 10 kWh.
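To make the arithmetic concrete, here's a minimal sketch (the helper name is just for illustration; the wattages are the example numbers above):

```python
# Energy (Wh) = power (W) x time (h); same numbers as the example above.
def watt_hours(power_watts: float, hours: float) -> float:
    return power_watts * hours

print(watt_hours(100, 1))    # 100 W for 1 hour    -> 100.0 Wh
print(watt_hours(100, 100))  # 100 W for 100 hours -> 10000.0 Wh = 10 kWh
```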
My NAS uses about 60 watts at idle, and near 100 W when it's working on something. I use an old laptop for a Plex server; it probably uses like 50 watts at idle and like 150 or 200 when streaming a 4K movie, I haven't checked tbh. I did just acquire a BEEFY network switch that's going to use 120 watts 24/7 though, so that'll hurt the pocketbook for sure. Soon all of my servers should be in the same place, with that network switch, so I'll know exactly how much power it's using.
My home rack draws around 3.5 kW steady-state, but it also has more than 200 spinning disks.
For the whole month of November: 60 kWh. This is for all my servers and network equipment. On average, it draws around 90 watts.
How are you measuring this? Looks very neat.
I use Unraid with a 5950X and it wouldn't stop crashing until I disabled C-states.
So that plus 18 HDDs and 2 SSDs, it sits at 200 watts 24/7.
kWh is a unit of energy, not power
Wasn't it stated for the usage during November? 60 kWh for November. Seems logical to me.
Edit: forget it, he’s saying his server needs 0.1kWh which is bonkers ofc
Only one person here has posted their usage for November. The OP has not talked about November or any timeframe.
I was really confused by that, and by the choice of units not just being W (even 0.1 kW would be pretty weird).
Wh shouldn’t even exist tbh, we should use Joules, less confusing
Watt hours makes sense to me. A watt hour is just a watt draw that runs for an hour, it’s right in the name.
Maybe you’ve just whooooshed me or something, I’ve never looked into Joules or why they’re better/worse.
At least in the US, the electric company charges in kWh, computer parts are advertised in terms of watts, and batteries tend to be in amp hours, which is easy to convert to watt hours.
Joules just overcomplicates things.
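For anyone who wants that battery conversion spelled out: watt hours are just amp hours times the battery's nominal voltage. A minimal sketch, assuming a 12 V battery (the function name is just for illustration):

```python
# Wh = Ah x nominal voltage (V). The 12 V here is an assumed example;
# use your battery's actual nominal voltage.
def amp_hours_to_watt_hours(amp_hours: float, nominal_volts: float) -> float:
    return amp_hours * nominal_volts

print(amp_hours_to_watt_hours(100, 12))  # a 100 Ah, 12 V battery -> 1200.0 Wh
```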
Idles at around 24W. It’s amazing that your server only needs .1kWh once and keeps on working. You should get some physicists to take a look at it, you might just have found perpetual motion.
.1kWh is 100Wh
This is a factual but irrelevant statement
Good point. Now it does make sense. I know the secret to the perpetual motion machine now.
My server rack has
- 3x Dell R730
- 1x Dell R720
- 2x Cisco Catalyst 3750x (IP Routing license)
- 2x Netgear M4300-12x12f
- 1x Unifi USW-48-Pro
- 1x USW-Agg
- 3x Framework 11th Gen (future cluster)
- 1x Protectli FE4B
All together that draws… 0.1 kWh… in about 327 s.
In real-time terms, measured at the UPS, I have a steady-state load of 900-1100 W depending on what I have at load. I call it my computationally efficient space heater because it generates more heat than is required for my apartment in winter except for the coldest of days. It has a dedicated 120 V 15 A circuit.
Good lord, how much does electricity cost where you are? Combined with the air conditioning to keep the space livable, that would be prohibitively expensive for me
Running an old 7th-gen Intel. It has a 2070 and a 1080 in it, six mechanical hard drives, and 3 SSDs. Then I have an 8th-gen laptop with a 1070 Ti Mobile. But the laptop's a camera server, so it's always running balls to the wall. Also running a UniFi Dream Machine Pro, a 24-port PoE, a 16-port PoE, and an 8-port PoE switch.
Because of the overall workload and the age of the CPU, it burns about 360 watts continuous.
I can save a few watts by putting the disks to sleep, but I'm in the camp where the spin-up and spin-down of the disks cost more wear than continuous running.
Edit: cleaned up the slaughter from the dictation, after I cleaned up my physical space from Christmas festivities.
17 W for an N100 system with 4 HDDs
That's pretty low with 4 HDDs. One of my servers uses 30 watts. Half of that is from the 2 HDDs in it.
@meldrik @qaz I’ve got a bunch of older, smaller drives, and as they fail I’m slowly transitioning to much more efficient (and larger) HGST helium drives. I don’t have measurements, but anecdotally a dual-drive USB dock with crappy 1.5A power adapter (so 18W) couldn’t handle spinning up two older drives but could handle two HGST drives.
Which HDDs? That’s really good.
Seagate IronWolf "ST4000VN006"
I do have some issues with read speeds but that’s probably networking related or due to using RAID5.
50W-ish idle? Ryzen 1700, 2 HDDs, and a GTX 750ti. My next upgrade will hopefully cut this in half.
Around 18-20 watts on idle. It can go up to about 40 W at 100% load.
I have an Intel N100; I'm really happy about the performance per watt, to be honest.
0.1kWh per hour? Day? Month?
What’s in your system?
Computer with a GPU and 50 TB of drives. I will measure the computer on its own in the next couple of days to see where the power consumption comes from.
You are misunderstanding the confusion. kWh is an absolute measurement of an amount of energy, not a rate of power usage. It's like being asked how fast your car can go and answering it can go 500 miles. 500 miles per hour? Per day? Per tank? It doesn't make sense as an answer.
Does your computer use 100 watt hours per hour? Translating to an average of 100 watts power usage? Or 100 watt hours per day maybe meaning an average power use of about 4 watts? One of those is certainly more likely but both are possible depending on your application and load.
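To spell that out as arithmetic (a minimal sketch; the helper name is mine and the numbers are the hypotheticals above):

```python
# Average power (W) = energy used (Wh) / elapsed time (h).
def average_watts(energy_wh: float, hours: float) -> float:
    return energy_wh / hours

print(average_watts(100, 1))   # 100 Wh in one hour -> 100.0 W average
print(average_watts(100, 24))  # 100 Wh in one day  -> ~4.2 W average
```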
You’re adding to the confusion.
kWh (as in kW*h), not kW/h, is for measurement of energy. Watt is for measurement of power.
They said kilowatt hours per hour, not kilowatts per hour.
kWh/h = kW
The h can be cancelled, resulting in kW. They’re technically right, but kWh/h shouldn’t ever be used haha.
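Spelled out, the hour cancels dimensionally:

$$\frac{\mathrm{kW\,h}}{\mathrm{h}} = \mathrm{kW} \cdot \frac{\mathrm{h}}{\mathrm{h}} = \mathrm{kW}$$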
Lol thank you, I knew that, I don't know why I wrote it that way; in my defense it was like 4 in the morning.
Which GPU? How many drives?
Put a Kill A Watt meter on it and see what it says for consumption.
You might have your units confused.
0.1kWh over how much time? Per day? Per hour? Per week?
Watt-hours refer to total energy used to do something, from a starting point to an ending point. It makes no sense to say that a device needs a certain amount of Wh, unless you're talking about something like charging a battery to full.
Power being used by a device (like a computer) is just watts.
Think of the difference between speed and distance. Watts is how fast power is being used, watt-hours is how much has been used, or will be used.
If you have a 500 watt PC, for example, it uses 500 Wh per hour. Or 12 kWh in a day.
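A sketch of that speed-vs-distance idea, assuming the PC really drew a constant 500 W (which, as the reply below points out, it doesn't):

```python
# Watts are a rate (like speed); watt-hours are an amount (like distance).
# Assumes a constant 500 W draw, which is an idealization (see the reply below).
POWER_W = 500
print(POWER_W * 1)          # one hour -> 500 Wh
print(POWER_W * 24 / 1000)  # one day  -> 12.0 kWh
```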
> If you have a 500 watt PC, for example, it uses 500 Wh per hour. Or 12 kWh in a day.
A maximum of 500 watts. Fortunately your PC doesn’t actually max out your PSU or your system would crash.
I forgive 'em cuz watt hours are a disgusting unit in general
| idea | what | unit |
|---|---|---|
| speed | change in position over time | meters per second, m/s |
| acceleration | change in speed over time | meters per second, per second, m/s/s = m/s² |
| force | acceleration applied to each unit of mass | kg * m/s² |
| work | acceleration applied along a distance, which transfers energy | kg * m/s² * m = kg * m²/s² |
| power | work over time | kg * m² / s³ |
| energy expenditure | power level during units of time | (kg * m² / s³) * s = kg * m²/s² |

Work over time, × time, is just work! kWh are just joules (J) with extra steps! Screw kWh, I will die on this hill!!! Raaah
Power over time could be interpreted as power/time. Power x time isn’t power, it’s energy (=== work). But otherwise I’m with you. Joules or gtfo.
Whoops, typo! Fixed c:
Could be worse, could be BTU. And some people still use tons (of heating/cooling).
kWh is the stupidest unit ever. kWh = 1000 J/s * 60 * 60 s = 3.6*10^6 J, so 0.1 kWh = 360 kJ
My whole setup, including 2 Pis and one fully specced-out AM4 system with 100 TB of drives, an Intel Arc, and 4x 32 GB ECC RAM, uses between 280 W and 420 W. I live in Germany and pay 25 ct per kWh, and my whole apartment uses 600 W at any given time, approximately 15 kWh per day 😭
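As a rough sanity check on those numbers (the 600 W average and the 25 ct/kWh tariff are from the comment; everything else is back-of-the-envelope):

```python
# Back-of-the-envelope cost estimate; the 600 W average draw and
# EUR 0.25/kWh tariff are the figures from the comment above.
AVG_POWER_KW = 0.6
PRICE_EUR_PER_KWH = 0.25

kwh_per_day = AVG_POWER_KW * 24                 # -> 14.4 kWh, roughly the ~15 quoted
cost_per_day = kwh_per_day * PRICE_EUR_PER_KWH  # -> EUR 3.60/day
print(f"{kwh_per_day:.1f} kWh/day, EUR {cost_per_day:.2f}/day, EUR {cost_per_day * 30:.0f}/month")
```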
Do you mean 0.1kWh per hour, so 0.1kW or 100W?
My N100 server needs about 11W.
The N100 is such a little powerhouse, and I'm sad they haven't managed to produce anything better. All of the "upgrades" are either not enough of an upgrade for the money or just more power hungry.
To my understanding 0.1kWh means 0.1 kW per hour.
It's the other way around. 0.1 kWh means 0.1 kW times 1 h. So if your device draws 0.1 kW (100 W) of power for an hour, it consumes 0.1 kWh of energy. If your factory draws 360,000 W for a second, it consumes the same amount of 0.1 kWh of energy.
Thank you for explaining it.
My computer uses 1 kWh per hour.
It does not yet make sense to me. It just feels wrong. I understand that you may normalize 4 Wh used in 15 minutes to 16 Wh per hour, because it would use 16 Wh in an hour if it ran that long.
Why can’t you simply assume that I mean 1kWh per hour when I say 1kWh? And not 1kWh per 15 minutes.
A watt is 1 joule per second (1 J/s). That is, every second your device draws 1 joule of energy. This energy per unit of time is called "power" and is a rate of energy transfer.
A watt-hour is (1 J/s) * (1 hr)
This can be rewritten as (3600 J/hr) * (1 hr). The “per hour” and “hour” cancel themselves out which makes 1 watt-hour equal to 3600 Joules.
1 kWh is 3,600 kJ or 3.6 MJ
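Or as one line:

$$1\,\mathrm{Wh} = 1\,\frac{\mathrm{J}}{\mathrm{s}} \times 3600\,\mathrm{s} = 3600\,\mathrm{J}, \qquad 1\,\mathrm{kWh} = 1000\,\mathrm{Wh} = 3.6 \times 10^{6}\,\mathrm{J} = 3.6\,\mathrm{MJ}$$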
kWh is a unit of energy consumed. It doesn't say anything about time, and you can't assume any time period; that wouldn't make any sense. If you want to say how much power a device consumes, just state how many watts (W) it draws.
Thanks!
0.1kWh per hour can be written as 0.1kWh/h, which is the same as 0.1kW.
Thanks. So in the future I can say that it uses 0.1 kW?
Yes. Or 100W.
If this was over an hour, yes. Though you’d typically state it as 100W ;)