Published on: 01.08.2017
I was surprised how little electricity is used by my computer and monitor.
All together it is a maximum of 50 Watts, so 20 hours of work are needed to use 1 kilowatt-hour.
I remember how a 15-inch CRT monitor used 70 Watts.
A Pentium 3 used 55 Watts and a Pentium 4 used 130 Watts at 100% CPU load.
Even my 19-inch HP L1950g uses 30 Watts.
That is more than my 24-inch monitor uses, even though the screen is much smaller.
Planned obsolescence from Apple
In my MacBook I have 10 GB of DDR3 1067 MHz memory.
Originally I had 4 GB, but that was too little, so my idea was to put in 16 GB as 2x 8 GB.
But for whatever reason (I think Apple has some software/hardware lock, so that you need to buy a new laptop and cannot just install more memory) 2x 8 GB did not work; the laptop would not boot.
With trial and error, I found that if I leave the original 2 GB in one slot and put 8 GB in the other slot, everything works fine, and that is how I have 10 GB.
Basics of electricity consumption
This will be just a basic introduction to electricity consumption, so that you can understand what a Watt is.
For electricity consumption you always have two parts:
– electric current, measured in amperes
– voltage, measured in volts
To get power, you multiply electric current (amperes) by voltage (volts).
The unit of measurement for power is called the Watt.
Electrical devices are usually rated in Watts; energy use is then measured in Watt-hours (Wh), which is simply how much energy a device uses in 1 hour of work at that power.
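The two formulas above can be sketched in a few lines of Python. The voltage, current, and hours below are illustrative numbers, not measurements from this article:

```python
# Power (W) = voltage (V) * current (A); energy (Wh) = power (W) * hours.

def power_watts(volts, amps):
    """Power drawn by a device, in Watts."""
    return volts * amps

def energy_wh(watts, hours):
    """Energy used over time, in Watt-hours."""
    return watts * hours

# A device drawing 0.25 A at 220 V uses 55 W of power:
print(power_watts(220.0, 0.25))  # 55.0
# Running a 55 W device for 2 hours uses 110 Wh of energy:
print(energy_wh(55.0, 2.0))      # 110.0
```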
Electricity consumption of monitor
The electricity consumption of a monitor depends on how white/bright your screen is.
So you can separate it by brightness and by the main color of the screen.
The table shows some real measurements.
| Main Background Color | Brightness (low) | Brightness (medium) | Brightness (high) |
|---|---|---|---|
| White | 10.9 Watt | 20.3 Watt | 22.7 Watt |
More brightness uses more electricity, and the same is true for white color.
I also tested contrast, but contrast does not have a big impact.
With brightness and contrast set to 0%, electricity consumption was 7.9 Watts, regardless of the background color.
But with brightness and contrast at 0% you just have a black screen with a small back-light, so it is not useful.
With brightness and contrast set to 100%, consumption for white was 22.9 Watts and for black 19.9 Watts.
I have my monitor brightness and contrast set to 75%, and consumption was never above 20 Watts.
When the Flurry screensaver is running, it is 17.6 Watts.
For one kilowatt-hour to be used at the biggest possible consumption of 20 Watts, the monitor needs to be on for 50 hours.
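The 50-hour figure follows directly from dividing 1 kWh (1000 Wh) by the power draw. A quick check, using the 20 W worst case and the 7.9 W minimum measured above:

```python
# Hours needed to consume 1 kWh (1000 Wh) at a constant draw in Watts.
def hours_per_kwh(watts):
    return 1000.0 / watts

print(hours_per_kwh(20.0))  # 50.0 hours at the 20 W worst case
print(hours_per_kwh(7.9))   # ~126.6 hours at the 7.9 W minimum
```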
Electricity consumption of MacBook
A MacBook is a laptop, so it has a screen, but screen brightness had little effect on electricity consumption.
The maximum difference between 0% and 100% brightness was 4 Watts, and that was at 100% CPU load.
When CPU load was at 10%, which is when the computer is not in active use, the difference was 2-3 Watts.
The table shows some real measurements.
| | Brightness 0% | Brightness 100% |
|---|---|---|
| CPU 10% | 15 Watt | 17 Watt |
| CPU 100% | 28 Watt | 32 Watt |
| Flurry screensaver | 8 Watt | 20 Watt |
My regular work is at about 25% CPU.
As can be seen from the table, the main part of a computer's electricity consumption is CPU load.
All these measurements were done with a Dell 24-inch monitor connected to the MacBook.
I did the same measurements with the Dell 24-inch monitor disconnected; the difference was just 2-3 Watts.
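From these numbers you can estimate monthly energy for the whole setup. The 40 W average draw (laptop plus monitor), the 8-hour day, and the 22-workday month below are my assumptions for illustration, not measurements from this article:

```python
# Rough monthly energy for the whole setup. The 40 W average draw
# (laptop plus monitor), 8-hour day, and 22 workdays per month are
# assumptions chosen for illustration.
avg_watts = 40
hours_per_day = 8
workdays = 22

wh_per_day = avg_watts * hours_per_day
kwh_per_month = wh_per_day * workdays / 1000.0

print(wh_per_day)               # 320 Wh per day
print(round(kwh_per_month, 2))  # 7.04 kWh per month
```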
Electricity consumption of some of my other equipment
An iPhone 4 uses 10 Watt-hours to charge the battery from 0% to 100%.
An iPad Retina Mini uses 35 Watt-hours to charge the battery from 0% to 100%. After 20 charges that is about 0.7 kilowatt-hours, which is roughly what I spend on it per month; I use it every day for a few (around 3) hours.
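Checking the iPad arithmetic: 35 Wh per full charge times 20 charges comes to 0.7 kWh, a bit under a full kilowatt-hour:

```python
# One full iPad charge measured at 35 Wh; roughly 20 charges per month.
full_charge_wh = 35
charges_per_month = 20

kwh_per_month = full_charge_wh * charges_per_month / 1000.0
print(kwh_per_month)  # 0.7 kWh per month
```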
Additional explanation of measurements
I used a Velleman NETBSEM5 ENERGY METER to measure electricity consumption.
For the white main background color I used https://www.google.com.
For the black main background color I used http://www.blackle.com/; it looks like they really do save electricity.
To get the CPU to 100%, this shell command was used (it fully loads one CPU core; run one such process per core to load them all):

```shell
yes > /dev/null &
```