I was just thinking: if you are on your computer anyway, would charging your phone from it just use some of the excess electricity that your computer would have wasted, or would it be worse than charging your phone from a charger while using your laptop separately?
The other answers have touched upon the relative efficiencies between a phone charger and a desktop computer’s PSU. But I want to also mention that the comparison may be apples-to-oranges if we’re considering modern smartphones that are capable of USB Power Delivery (USB PD).
Without any version of USB PD – or its competitors like Quick Charge – the original USB specification only guaranteed 5 V and up to 500 mA. That’s 2.5 W, which was enough for USB keyboards and mice, but is pretty awful to charge a phone with. But even an early 2000s motherboard would provide this amount, required by the spec.
The USB Battery Charging (USB BC) spec brought the limit up to 1500 mA, but that’s still only 7.5 W. And even in 2024, there are still (exceedingly) cheap battery banks that don’t even support USB BC rates. Motherboards are also a mixed bag, unless they specifically say what they support.
So if you’re comparing with, for example, a Samsung S20 (from the last smartphone era that shipped a charger in the box): the included charger is capable of 25 W charging, and so is the phone. Unless you bought the S20 Ultra, which ships with the same charger but supports 45 W charging.
Charging the S20 Ultra on a 2004-era computer will definitely be slower than the stock charger. But charging with a 2024-era phone charger would be faster than the included charger. And then your latest-gen laptop might support 60 W charging, but because the phone maxes out at 45 W, it makes no difference.
You might think that faster and faster charging should always be less and less efficient, but it’s more complex since all charging beyond ~15 Watts will use higher voltages on the USB cable. This is allowable because even the thinnest wire insulation in a USB cable can still tolerate 9 volts or even 20 volts just fine. Higher voltage reduces current, which reduces resistive losses.
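To put rough numbers on that last point, here’s a quick sketch of how higher voltage cuts resistive loss for the same delivered power. The 0.25 Ω round-trip cable resistance is just an assumed example value, not a measured one:

```python
# Sketch: resistive cable loss for the same delivered power at different USB
# voltages. The 0.25-ohm round-trip cable resistance is an assumed example value.
POWER_W = 27.0        # power delivered to the phone
CABLE_R_OHM = 0.25    # assumed round-trip resistance of the cable conductors

for volts in (5.0, 9.0, 20.0):
    current = POWER_W / volts          # I = P / V
    loss = current**2 * CABLE_R_OHM    # P_loss = I^2 * R
    print(f"{volts:4.0f} V: {current:.2f} A, cable loss ~{loss:.2f} W")
```

In this toy model, delivering 27 W drops from roughly 7.3 W of cable loss at 5 V to under half a watt at 20 V, which is why fast-charging standards move to higher voltages rather than higher currents.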
The gist is: charging is a patchwork of compatibility, so blanket statements on efficiency are few and far between.
There is no such thing as ‘excess electricity’ in a modern (switching) power supply unit. They use as much power as is needed. There is a few percent of loss in the device, no big deal.
Some desktop computers are less efficient because they have oversized PSUs built in (lots of reserve for your future “gaming” graphics card).
This would only affect the 12V rail though, no? It’s not like they are beefing up the 5V rail that supplies your USB ports to excessive ratings. Picking a random pair from PCPartPicker, the Corsair RM650e vs RM1200e (650W vs 1200W) both have a +5V@20A rail. There would be no need for a larger 5V rail to support gaming cards.
Also, correct me if I am wrong, but most PSUs are most efficient at 20-50% utilization, not 100%. I’m basing this off the higher mid-load thresholds in the 80 Plus ratings.
I’ve previously spoken with PSU engineers for enterprise power supplies – specifically for 48-54v PoE equipment – who described to me that today’s switch mode power supplies (SMPS) tend to get more efficient with increasing load. The exception would be when the efficiency gains from higher loading start to become offset by the heating losses from higher input currents.
This graph for a TDK PSU shows that North American 120 VAC nominal (see here for the small difference between nominal and utilization voltages) will cause a small efficiency hit above 75% or so. And this is exactly why data centers – even in North America – will run with “high line” voltage, which is 200 VAC or higher (eg North American 208VAC delta supplies, British 240/415 wye, European 230/400 wye).
I guess being on a laptop I don’t have to worry about the PSU being less efficient.
It’ll depend on how efficient your phone charger is vs your PC PSU. Looking at some charts, it’s a very close battle but generally the phone charger seems to win out. Probably because it’s more optimized for its max power output, whereas the PSU needs to support a wider range of loads.
https://silentpcreview.com/power-lost-a-better-way-to-compare-psu-efficiency/
needs to support a wider range of loads.
Are we still doing phrasing?
Interesting.
If you’re bothered about overall waste, consider that some batteries degrade more slowly if you charge them more slowly. I tend to prefer a slow charger when I can.
I remember seeing an experiment saying that the difference is negligible. Even if it isn’t, it’s far more important to keep your battery between 20 and 80 percent at all times.
Your computer doesn’t have “excess energy”
Charging at higher voltage is more efficient (at least on electric vehicles) but also your phone is such a relatively small battery it barely matters. Avoid wireless charging as it is extremely inefficient.
USB PD (Power Delivery) actually does use a higher voltage for more wattage. Standard USB is limited to 5V at 0.5A and sometimes up to 5V @ 2A on quick chargers. But PD chargers can give 20V and 3A for 60w or even 5A (100w) with properly rated cables. There’s even a proposal for up to 48V at 5A to get 240w. This is all determined by a negotiation between the charger, the cable (which does have a small chip for this purpose), and the device, therefore PD chargers must support multiple voltages.
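As a rough sketch, here are some of the common PD fixed supply levels with their wattages. Which profiles a given charger actually advertises varies from charger to charger, so treat this list as illustrative:

```python
# Illustrative list of common USB PD fixed supply levels; which profiles a
# real charger advertises depends on the charger. Power is simply V * I.
profiles = [(5, 3), (9, 3), (15, 3), (20, 3), (20, 5)]  # (volts, amps)
for volts, amps in profiles:
    note = " (needs a 5 A e-marked cable)" if amps > 3 else ""
    print(f"{volts} V @ {amps} A = {volts * amps} W{note}")
```

Note how every step above 15 W raises voltage, not current, and 5 A is only allowed once the cable itself has identified as rated for it.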
excess electricity that your computer would have wasted
Ya this is not how electricity or computers work.
Your computer doesn’t “waste” electricity; power usage is on-demand. A PSU generally has 3 “rails”: 12V (this powers most of the devices), 5V (for peripherals/USB) and 3.3V (IIRC memory modules use this). Modern PSUs are switched-mode power supplies, which use a switching voltage regulator that is more efficient than a traditional linear regulator.
The efficiency of the PSU/transformer would be what determines if one or the other is more wasteful. Most PSUs (I would argue any PSU of quality) will have an 80 Plus rating that defines how efficiently it can convert power. I am not familiar enough with modern wall chargers to know what they’re tested at… I could see low-end wall chargers using more wasteful designs, but a high-quality rapid wall charger is probably close to, if not on par with, a PC PSU. Hopefully someone with more knowledge of these can weigh in.
Gallium Nitride based modern phone chargers are 95% efficient.
The very best, most expensive PC power supplies on 115v AC will only reach 94% at the very specific 50% load of power supply rated wattage. So if you have a 500 watt power supply, and aren’t using almost exactly 250 watts, you aren’t getting that 94% efficiency. Regular power supplies under normal variable load conditions are going to be somewhere in the 80% efficient range. If the PC is idle, that efficiency can drop to 20% (but it’s fine because it’s only a few watts).
https://forum.level1techs.com/t/super-high-efficiency-300-400w-psu/184589/2
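The certification floors behind those numbers can be sketched as follows. These are the published 80 Plus Titanium minimums at 115 V input; real units vary, and other tiers (Bronze, Gold, etc.) have lower floors:

```python
# 80 Plus Titanium minimum efficiencies at 115 V input, keyed by load fraction.
# These are certification floors, not measured values for any specific PSU.
titanium_115v = {0.10: 0.90, 0.20: 0.92, 0.50: 0.94, 1.00: 0.90}
for load, eff in sorted(titanium_115v.items()):
    print(f"{load:.0%} load: at least {eff:.0%} efficient")
```

The table makes the point above visible: the 94% headline only has to hold at the 50% load point, and efficiency is allowed to fall off at both ends.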
So using a modern gallium nitride standalone charger will be more efficient. It will be far more efficient if you use that standalone charger instead of charging off your PC while your PC is idle.
Counterpoint: most computer power supplies have an efficiency curve that peaks somewhere north of 50% load. If your PC is substantially below the peak of that curve, then adding load (the phone) could raise the PSU’s efficiency, say from 80% to 85% (I’m making up numbers), which would affect the overall efficiency of the entire PC’s load.
I think your answer is still probably correct, but it’s an interesting nuance to think about.
Side notes: some PSUs use gallium nitride, e.g. the Corsair AX1600i, though by and large most do not. Also, if you’re in the EU, then you’re working with 220/240V PSUs, which adds some efficiency, but that would apply to the phone charger as well.
Adding a 30 watt phone is going to be maybe 5% of the PC load. So it could bring the efficiency up by a few percent. But that is insignificant compared to the normal swings of 200+ watts between idle and load.
Agreed, especially for gaming PCs. On laptops and generic PCs I think there could be a bigger swing, but you’d also need a USB port capable of pushing 30W, which is not a standard feature by any means on a PC/laptop.
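Putting invented numbers on this exchange (the loads and efficiencies below are made up, in the same spirit as the parent comment’s):

```python
# Toy model (numbers invented): adding a phone's 30 W DC load can nudge a PSU
# up its efficiency curve, so the marginal power drawn from the wall can be
# less than the DC load added.
def wall_draw(dc_load_w, efficiency):
    """AC power drawn from the wall for a given DC load and conversion efficiency."""
    return dc_load_w / efficiency

before = wall_draw(100, 0.80)   # PC alone: 125.0 W at the wall
after = wall_draw(130, 0.85)    # PC + 30 W phone, slightly better efficiency
print(f"marginal wall power for the 30 W phone: {after - before:.1f} W")
```

With these made-up numbers, the phone’s marginal draw is about 27.9 W at the wall for 30 W delivered, because the shift along the curve more than pays for the conversion loss. With a PSU already near its peak, the marginal draw would exceed 30 W instead.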
It reaaaaaally doesn’t matter. If you charged your phone from empty to full every day it might cost you a dollar. Per year.
I was just curious also I was more thinking of energy efficiency for environmental reasons.
Not to discourage such thoughts in the future, but your single post asking here probably used up more electricity than what you would save over the course of the next ten years.
I know the fediverse needs a green push and some instances are using renewable energy but really 10 years worth from one post?
It was a bit of hyperbole; I have no idea about the exact amount.
Let’s say you charge your 2000mAh battery every day and your PSU is 10% more efficient than your charger (the difference is most likely not even that big).
2Ah × 5V × 365d = 3.65kWh
3.65kWh × 0.1 = 365Wh
365Wh would be the difference per year; that’s about 12ct per year.
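The same back-of-the-envelope estimate as a script. The 0.33 EUR/kWh electricity price is an assumption; adjust for your tariff:

```python
# Back-of-the-envelope yearly cost difference between charging from a PSU
# that is 10% more efficient than a wall charger. Price is an assumed value.
battery_ah = 2.0       # 2000 mAh battery
volts = 5.0
days = 365
efficiency_gap = 0.10  # assumed 10% efficiency difference
price_per_kwh = 0.33   # assumed EUR/kWh

yearly_kwh = battery_ah * volts * days / 1000   # ~3.65 kWh/year of charging
diff_kwh = yearly_kwh * efficiency_gap          # ~0.365 kWh/year difference
print(f"difference: {diff_kwh * 1000:.0f} Wh/year, ~{diff_kwh * price_per_kwh * 100:.0f} ct/year")
```

(This ignores the battery’s own charging losses, which would nudge both options up equally.)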
Now, estimating the power usage for fediverse messages is very hard to do, since it depends on a lot of different factors (your device, cellular or WiFi data, number of hops needed to reach you, general state of your nearby network, your instance’s infrastructure).
The only even remotely similar things I could find were emails with pictures producing about 20-40g CO2, which only slightly increases with more recipients, and Reddit usage coming in at about 2.5g per minute. Comparing these two numbers just shows that all such estimates are pretty much useless for us, since we have no idea how they were done.
But if we go with a low estimate of 0.1g (slightly above SMS and somewhere around spam-mail level) per user seeing it, and a few hundred to a thousand users seeing this even if they just scroll past, we reach the CO2 equivalent of 1kWh pretty fast, without even talking about long-term storage and future indexing. Not to mention that comments produce something too, since they need to be federated, albeit not as much as the post itself.
So while 10 years was a bit much, 2-3 years would be very much in the realm of possibilities, but no one knows or can even properly estimate the actual numbers.
I think that’s more about the scale than it is about the fediverse.
Effectively you’re asking about a quarter cup of water where the answer isn’t even clear. Wireless charging is a bit wasteful though.
I still appreciate your asking, because there’s been interesting discussion in the answers.
In that case, you can make it a point to charge when the grid is “cleaner” - usually overnight. Your electricity costs may be cheaper then anyway.
The Apple Home app shows a grid forecast for your location, with cleaner times highlighted in green. I’m sure they pull this info from the utility company, so the info should be available in other smart home apps or maybe even your utility’s website.
But like others said, phone charging is very minimal. We’re talking about a 20W charger vs. say, a 1500W air fryer. Running larger appliances off-hours is a bigger deal - dishwasher, laundry, etc.
Overnight? I thought it would be cleaner during the day, because that is when the sun shines. I haven’t had an iPhone in a while, but I will have a look into grid forecasts. I still use an air fryer; not sure what the wattage is, though I would assume it is similar to an oven.
Kinda depends on where you live but there often is an excess of hydro and wind power overnight.
I’d like to remind everyone of the “vampire effect” of wall-wart chargers - if you just leave them plugged into the wall waiting for you to connect a device, you’re constantly wasting a bit of electricity. That should also be involved in the efficiency decision of using the already plugged in computer or laptop.
Someone correct me, but unless your charger is warm to the touch, this is a very insignificant amount of power per year.
Your TV pretending to be off probably draws more every couple of days.
Anything with a remote, anything with a screen, way too many “computerized” appliances. Leaving a computer on counts, too.
If you care to minimize standby power “comfortably”, libraries or power companies will usually let you borrow an AC power meter free of charge.
You can use that to inspect your various devices’ standby power. For example, I have an amplifier that pulls nearly 15W in standby; since finding that out, it lives on a smart plug.
However my TV pulls less than 1W, and at that point I prefer the convenience of just being able to use the remote to turn it on.
(Also keep in mind with the smart plug solution that the plug itself will pull a little bit of power too, this will pretty much always be <1W though.)
The vampire effect was a real problem with older transformer-based wall warts. If you still have one that feels heavy and solid, or has a fixed energy rating on the label, this is you.
If it feels light, almost empty, or the label has a wide range of frequencies and voltages, it uses an order of magnitude less and really doesn’t add up anymore. I believe it’s something like a couple cents per year.
Edit: found something from 14 years ago calculating the worst case scenario of leaving a wall wart plugged in as 7¢/month but I don’t believe it’s anywhere near that expensive for a modern one …. Although it’s probably more telling that I didn’t find anything more recent with a price estimate.
Unless your chargers are generating a noticeable amount of heat (and they shouldn’t be), the amount of electricity they are using is simply negligible. Electricity cannot simply evaporate into thin air.
It is a little bit more efficient, because your PC’s power supply is not very efficient when you don’t use much power. By using more of it, it becomes more efficient, peaking somewhere around 50% load. But: we are talking about very small differences here, and only for desktop PCs. When you use a laptop, your power supply is far less powerful, so you use more of it just by using the laptop. So in that case, I would rather use the charger. To be perfectly honest with you: all that is not really worth thinking about. It’s like opening your fridge for only 6 seconds instead of 8. Yes, it saves power, but there are better ways to do so. Driving only 1 kilometer less with your electric car is about the same as charging your phone 10 times.
IF you’ve a high-efficiency charger, then I’d say it’s probably more efficient to use that charger.
The warmer you run your computer, the less efficient it becomes, & the shorter the lifespan of the hottest chips in it (this effect shouldn’t be significant).
e.g. increasing a CPU’s temperature by 10 Celsius should cut its lifespan roughly in half.
By having more heat-generating stuff going on in your computer, you impair the cooling of your CPU & GPU (slightly, probably), & that may affect your computer’s time-to-failure.
Fan-bearings may dry-out sooner, too.
hth, eh?
_ /\ _
The amount of heat this will add to your case is negligible. We’re talking 15% waste on a 20W load, so 3W worth of extra heat. And that heat is produced in the PSU.
(PSU efficiency standards: https://i.imgur.com/WSWrsCm.gif)
If the heat is negligible, I would assume it should not matter as long as you do not charge while your PC is doing a task that uses too many resources?
Thank you, I will charge them separately. There are some good ways to save electricity, but I guess this is not one of them.