Alright, all this talk about the XBOX 360 eating a lot of electricity got me wondering. I'm on my computer a lot, about 3-5 hours a day, with a 19-inch monitor. Around how much would that cost monthly? Just wondering.
Name:
Anonymous2005-12-05 15:28
Let's say the computer is consuming 400W (it sounds like a lot, but with today's hotplate CPUs and GPUs that's not unusual). A 19" LCD panel will draw a further ~40W, so let's say it uses 440W in total. Averaging four hours a day, that's 1.76 kWh (or "units" of electricity) per day, or 52.8 units per 30-day month. Multiply that by how much each unit costs you (check your electricity bill) and there's your magic figure. This of course varies with how much power your computer actually draws; a setup consisting of, for example, a Cool'n'Quiet Athlon 64 with onboard graphics instead of a high-powered graphics card will cost a bit less.
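Same arithmetic as a back-of-envelope Python sketch; the 440W draw, four hours a day, 30-day month, and the 0.10/kWh tariff are all assumptions, so plug in your own numbers:

```python
# Rough monthly electricity cost for a PC; every figure is an assumption.
draw_watts = 440        # PC (~400W) + 19" LCD (~40W)
hours_per_day = 4
days_per_month = 30
price_per_kwh = 0.10    # hypothetical tariff; check your own bill

kwh_per_day = draw_watts * hours_per_day / 1000   # 1.76 kWh
kwh_per_month = kwh_per_day * days_per_month      # 52.8 kWh
cost = kwh_per_month * price_per_kwh

print(f"{kwh_per_month:.1f} kWh/month, costing {cost:.2f} a month")
```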
440W is a bit thick. Watch: MB 25W, CPU 70W, graphics 25W, RAM 10W, sound 5W, HDD 15W, LCD 50W.
In total, 200W. I mean an average computer.
Name:
Anonymous2005-12-06 19:51
Bullpies. You try running a modern PC on a 150W power supply and see how far you get. Prescott-core P4s consume >100W on their own, and high-powered graphics cards aren't far behind. The motherboards suck up quite a bit too - some north bridges require active cooling - plus you have four or more fans churning away in the case (at least another 25W in fans alone). And PSUs aren't generally the most efficient devices either. You could call 200W average a few years ago, but not now. Things are getting under control again, what with AMD's Cool'n'Quiet CPUs and Intel's forthcoming plans to reduce power consumption (basing their next generation of CPUs on the same design philosophies as the laptop-friendly Pentium-M), but current stuff is still very thirsty.
Name:
Anonymous2005-12-06 20:29
A 150W power supply rating has nothing to do with actual consumption.
Name:
Anonymous2005-12-06 20:38
>>5 A power supply has no consumption of its own. But your numbers may be right if we consider 100% usage of all components (e.g. gaming). >>4 is right if we consider normal usage (e.g. office work or surfing).
Name:
Anonymous2005-12-07 3:42
>>6
A-whu? When did I say a power supply always draws its maximum rated wattage? Yes, a PSU will only draw what power it needs, but if the components demand more than the PSU can deliver, chaos ensues. That's why a 150W PSU won't drive a modern PC in a stable manner, if at all.
>>7
Then I suppose the heat the power supply produces just comes from nowhere? On this board we obey the laws of thermodynamics! My calculations were based on fairly intense usage.
Name:
Anonymous2005-12-07 5:08
>>8
My computer uses about 300W. The good thing is that I save about 200W on heating.
Name:
Anonymous2005-12-07 8:19
My notebook can run from battery for about 3 hours. The battery label says 14.8V and 4000mAh.
So if I = 4000mAh / 3h, it draws about 1.3A. And since P = V * I (for DC), that's a consumption of about 19W. Assume an adapter efficiency of about 70% -> P ≈ 28W at the wall!
Amazing... Or what have I miscounted?
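The same sums checked in Python; the voltage and capacity come from the battery label, and the 70% adapter efficiency is just an assumption:

```python
# Laptop power estimated from the battery label: 14.8 V, 4000 mAh, ~3 h runtime.
voltage = 14.8             # V, from the label
capacity_ah = 4.0          # 4000 mAh
runtime_h = 3.0            # observed runtime
adapter_efficiency = 0.70  # assumed

current = capacity_ah / runtime_h           # ~1.33 A average draw
power_dc = voltage * current                # P = V * I, ~19.7 W
power_wall = power_dc / adapter_efficiency  # ~28 W pulled from the socket

print(f"DC draw ~{power_dc:.1f} W, wall draw ~{power_wall:.1f} W")
```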
Name:
Anonymous2005-12-07 11:51
Notebooks are generally designed for lower power usage, particularly when running on battery power - they reduce the clock speed of their CPU, shut down hard disks and optical drives when not in use, and the screens generally draw less power than an equivalent-sized desktop LCD panel too (usually by reducing the intensity of the backlight). If you were hammering the laptop with a high-powered game or video encoding or something, the battery would be drained much faster.
Name:
Anonymous2005-12-17 16:26
Doesn't matter whether it is 100W or 500W. I'm gonna immediately change all five of my lightbulbs from 100W to 50W.
Name:
Anonymous2005-12-17 16:32
Better still to change 100W lightbulbs for 15W energy saving ones. Then you can feel less guilty about running your PC for hours on end!
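The bulb savings are easy to estimate with the same kWh arithmetic; the five hours a day of use and the 0.10/kWh tariff are assumptions:

```python
# Monthly savings from swapping one 100W bulb for a 15W energy saver.
old_w, new_w = 100, 15
hours_per_day = 5        # assumed usage
days_per_month = 30
price_per_kwh = 0.10     # hypothetical tariff

saved_kwh = (old_w - new_w) * hours_per_day * days_per_month / 1000
print(f"Each swapped bulb saves {saved_kwh:.2f} kWh/month, "
      f"about {saved_kwh * price_per_kwh:.2f} off the bill")
```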
Name:
Anonymous2005-12-17 17:06
So how much power can you save on your PC? I know switching from CRT to TFT saves around 30-50W depending on size. Then there's AMD's power-saving technology, which you can use while chatting or bittorrenting, but how much does it save? Are ATI video cards still 'low power'? And I read that a large amount of power is wasted by cheap PSUs; how much could you save there?
My computer uses about 140W under normal/idle usage, of which 105W is my old CRT monitor; that's not too bad for a Pentium 4! If you really wanna save power, get a shitty slow CPU and a fanless Nvidia GeForce MX series card. My ancient Pentium 4 takes nearly 10W less at full steam than some equivalent-speed (the speed rating, not MHz) AMD cores, but there are several power-efficient old AMDs you can go for too. If you don't want a shitty old computer, though, you're just gonna have to improvise: switch to TFT, use one big hard drive instead of multiple, underclock your processor. There are many fun things to do!
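On the cheap-PSU question: the waste scales with efficiency, and you can estimate it like this (the 65% vs 80% figures are assumed examples of a cheap vs decent unit, not measured values):

```python
# Wall draw for the same 150W DC component load under two PSU efficiencies.
dc_load = 150  # watts the components actually use
efficiencies = {"cheap PSU": 0.65, "decent PSU": 0.80}  # assumed figures

for label, eff in efficiencies.items():
    wall = dc_load / eff  # the PSU pulls load/efficiency from the wall
    print(f"{label}: ~{wall:.0f}W from the wall, "
          f"{wall - dc_load:.0f}W lost as heat")
```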
Name:
Anonymous2005-12-18 1:03 (sage)
i have a power supply with a 3.5" power usage meter. on boot it uses like 250-300 watts and in use it's at 150. that's a P4 3.0 (socket 775), X700 Pro, 8 HDDs, DVD-RW, 4 fans, and dual monitors on.
Name:
Anonymous2005-12-18 12:32
>>13
energy saving lightbulbs suck because their production uses much more natural resources than the production of normal ones.
Name:
Anonymous2005-12-18 13:50 (sage)
>>18
Well, duh. More efficient equipment generally is harder to manufacture. But after a certain period of use, these factors balance out, and after that.. PROFIT!
Name:
Anonymous2005-12-18 13:58
>>18
But they last longer, and use less energy over their lifespan. It's still win-win.