rayzoredge

How Much Power (in Watts) Does Hardware Draw?


Out of curiosity, how much power does a hard drive draw? How about a CPU? Graphics card?

These questions came to my mind during my Laptop vs. Desktop argument in which I somehow convinced myself not to build the desktop machine that I've been planning on building for years. :P After posting 3 unique posts and receiving no replies, I let the topic lie...

... until now.

I'm looking to purchase a Hitachi 320GB Travelstar 2.5" hard drive, and I'm planning on coupling it with an external enclosure that does not require an external power supply. Inadvertently, on a separate venture to look up the backwards compatibility of SATA 3.0Gbps with SATA 1.5Gbps, I saw that USB 2.0 offers 2.5W (500mA @ 5V) to connected devices. I then looked up the Hitachi TS7K320 datasheet... which states that the hard drive takes 1.8W of power during a read/write operation and idles at 0.2W. (It's interesting to know this, as I have two 3.5" 500GB Samsung Spinpoint T Series drives @ 7200RPM consuming 10-10.6W on seek/read/write, 8.2W on idle, and just over half a watt during sleep or standby.) Amazing how a USB port can supply enough power to a 320GB hard drive...
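To sanity-check the bus-powered enclosure idea, here is a minimal sketch comparing the USB 2.0 per-port budget against the drive figures quoted above (2.5W budget; 1.8W read/write, 0.2W idle from the TS7K320 datasheet):

```python
# Does the USB 2.0 power budget cover the drive's draw?
# A USB 2.0 high-power port supplies 500 mA @ 5 V = 2.5 W.
USB2_BUDGET_W = 5.0 * 0.500

# Hitachi TS7K320 figures quoted in the post.
ts7k320 = {"read_write": 1.8, "idle": 0.2}

for state, watts in ts7k320.items():
    headroom = USB2_BUDGET_W - watts
    print(f"{state}: draws {watts} W, headroom {headroom:.1f} W")
```

One caveat worth knowing: spin-up current is usually higher than the steady-state read/write figure, which is why some bus-powered enclosures ship a dual-headed USB cable to pull power from two ports.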

Anyway, I did some more digging in the power department and found out that Joe Schmoe's Intel Core 2 Duo E6600 draws shy of 120W under load and his Q6600 quad-core draws about 202W under load (per the source I found; those are likely whole-system figures, since the chips' rated TDPs are 65W and 105W respectively). Other components, I would assume, would draw a maximum wattage of the device's voltage rating multiplied by its amperage (e.g. a +12V, 1.4A DVD burner would consume up to 16.8W at peak usage). Are my numbers skewed in thinking, making my laptop cost-of-ownership argument incorrect, or are these numbers accurate? Would anyone like to shed some light on this?
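The volts-times-amps rule of thumb above can be sketched as a tiny helper; the DVD-burner numbers are just the example values from the post, not measured data:

```python
# Peak-draw rule of thumb: P = V x I, summed over each rail a device uses.
def peak_watts(rails):
    """rails: list of (volts, amps) tuples -> worst-case wattage."""
    return sum(v * a for v, a in rails)

dvd_burner = [(12.0, 1.4)]       # +12 V @ 1.4 A, as in the post
print(peak_watts(dvd_burner))    # worst case: roughly 16.8 W
```

Keep in mind this is a label-rating ceiling; real devices rarely sit at peak draw for long.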

(And for those of you who think that this is tech jargon that doesn't apply to you... think about the power consumption of your computers and add up how much it costs to run your machines.) :)
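For anyone who wants to actually add it up, here is a rough cost-of-ownership sketch. The wattages, hours, and electricity rate below are illustrative assumptions, not figures from this thread; plug in your own:

```python
# Rough yearly electricity cost for a machine at a given average draw.
def yearly_cost(avg_watts, hours_per_day, usd_per_kwh=0.12):
    kwh_per_year = avg_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# Assumed example loads -- substitute your own measurements.
print(f"Desktop, 200 W avg, 8 h/day: ${yearly_cost(200, 8):.2f}/yr")
print(f"Laptop,   40 W avg, 8 h/day: ${yearly_cost(40, 8):.2f}/yr")
```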

Indeed, your math is OK: provided the voltage is fixed, amps × volts = watts. Modern devices such as drives have come a long way in a short time as far as power consumption is concerned, with most now using high-efficiency motors and CMOS.

The other important trick for efficiency is cooling in a balanced manner. The more amps a device draws (especially processors), the hotter it runs and consequently the less efficient it becomes. Ideally, you should draw fresh air in at the front bottom, say, and exhaust the hot air at the top rear; circulation is one of the key factors, rather than just sucking hot air out. Efficient cooling will also reduce the current draw for a given load...

Your calculations are right, but processors aren't always using that much power, and for that matter you can practically never run every attached device at its peak load (at least I can't), so most of the unused devices are either in a low-power state or shut off. Most laptops offer a power setting where all unused devices are shut off completely to give you more battery time. For example, my Vaio can give me 6-7 hours if I'm just reading/editing documents: I can switch to maximum power saving, where it shuts off my DVD drive, WLAN card, Ethernet card, Bluetooth, etc. and just lets me work on the document, which I believe is loaded in main memory (HDD in a low-power state).

But then (I've not read your Laptop vs. Desktop thread) most desktop PCs are capable of the same thing. I can use the same features on my desktop and save power. My motherboard is an ASUS P5Q (it's supposed to have this EPU unit; I don't know if it actually works or not) and I have a 17" CRT (can't afford a TFT :) ), and I've seen that when I leave my system idle for some time, say 4-5 minutes, it shuts off my monitor, spins down all the idle hard disks (I have three) and even reduces the power for my processor (a Q6600), i.e. the CPU uses only 9-10W in the idle state.

So I guess both a desktop and a laptop can effectively use the same amount of power in a low-power state. But you can never achieve the performance of a desktop on a laptop without spending at least double. If we don't consider portability, desktops are always better.
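The idle-versus-load gap described above dominates real energy use. A quick sketch using the Q6600 figures (9-10W idle from this post; 105W is that chip's rated TDP) with an assumed daily duty cycle:

```python
# Daily energy for a Q6600 box that idles most of the day.
IDLE_W, LOAD_W = 10, 105          # idle figure from the post; 105 W = Q6600 rated TDP
hours_idle, hours_load = 20, 4    # assumed split -- adjust to taste

wh_per_day = IDLE_W * hours_idle + LOAD_W * hours_load
print(f"{wh_per_day} Wh/day ({wh_per_day / 1000:.2f} kWh)")
```

Even with four hours a day at full load, the machine spends most of its energy budget in the loaded hours, which is why idle-power management matters so much.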

@bluedragon: But portability is a large deciding factor... which, including power savings, is why I lean more toward a laptop than a desktop in my other thread (Laptop vs. Desktop). A laptop also includes a "monitor," although you are limited to the form factor of the laptop.

One wonderment I've had as far as the performance argument goes: if you take a desktop machine with similar specifications and hardware configuration, compare it to a similarly spec'd laptop, run the same programs, play the same games, and whatnot, would you draw more power using the desktop than the laptop? I did some dummy math and said that a desktop-replacement laptop with a 120W power supply would draw a maximum of 120W (give or take a few watts) to power CPU- and GPU-intensive tasks like gaming and 3D modeling, whereas a 500W (or more) power supply in a desktop is able to breach that 120W to feed those same desktop-variant components. Maybe I did some bad math in my previous thread, as it doesn't make sense that a laptop with virtually the same hardware as a desktop would draw less power than its desktop variants; but then again, mobile hardware may be designed to sip less from the power grid... which also raises the question of why we don't use the same technologies in desktops (if mobile hardware actually uses less power to do the same thing). I've seen that NVIDIA put out a sort of "green-ish" technology for desktops with its GeForce 9800 series, where it would be much more selective in drawing power only when needed (some sort of Hybrid technology... I can't remember), but otherwise I haven't seen much else on using less to do more.

I wish I had a desktop system similar to one of my laptops so that I could get a Kill-A-Watt and compare the two, but I don't have the moola. :) Can anyone tell me whether I'm wrong as far as the power consumption goes?
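The dummy math above can be made concrete as a back-of-the-envelope sketch: cap the laptop at its 120W brick, let the desktop draw more from its PSU, and look at the energy gap over a year of heavy use. Both load figures below are assumptions, not measurements:

```python
# Energy gap between a desktop and a desktop-replacement laptop
# running the same heavy workload. Loads are assumed, not measured.
LAPTOP_LOAD_W = 110    # near the 120 W adapter ceiling
DESKTOP_LOAD_W = 300   # well within a 500 W PSU's capacity
HOURS_PER_DAY = 4      # assumed daily gaming/modeling time

def kwh_per_year(watts, hours=HOURS_PER_DAY):
    return watts / 1000 * hours * 365

gap = kwh_per_year(DESKTOP_LOAD_W) - kwh_per_year(LAPTOP_LOAD_W)
print(f"Desktop uses ~{gap:.0f} kWh more per year for the same workload")
```

A Kill-A-Watt on each machine would replace both assumed loads with real numbers, which is exactly the experiment proposed above.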

You are not wrong on this: a desktop with the same specs would absolutely leech more power. It's not only the PSU; it's the different components altogether. A CPU in a desktop is completely different from a CPU in a laptop, because the laptop chip needs to run on batteries and cannot drain 120 watts. :) Similarly, laptop RAM modules are optimized to take less power (I can't comment on their performance). The graphics cards, I think, are almost similar, but you don't find the same graphics cards in laptops as in desktops (I'm talking about the high-end cards); if you notice, both NVIDIA and ATI Radeon have different series for laptops and desktops. The rest is almost the same.

Oh... has anyone else noticed that the DVD writers in laptops are generally slower on write speed than on read speed?

Why run hyper cpu

There is such a thing as MoDT - mobile on desktop, that run the mobile series processors: see the motherboards at http://www.logicsupply.com/

It was helpful, when I chose my system, to use this link for the maximum power consumption of Intel CPUs:

http://ark.intel.com/

-reply by grainman

 
