I've been asked (e.g. at a talk on LED lighting), and have seen comments (most recently at Brad's site), suggesting the use of DC in home wiring in addition to 110V (or 220V) AC. This does not make sense from an engineering and safety point of view: the resistive power loss in the wiring results in an impractical waste of power and excessive wiring cost. You really do need the higher voltage to carry the power loads found in typical rooms.
The following table shows the maximum safe power delivered, the power wasted heating the wires, and the necessary source voltage for various delivered voltages. It assumes normal house wiring that meets current safety standards for overheating and overcurrent (the typical 20-amp home circuit; the delivered-power figures correspond to a 15A continuous load).
| Target Voltage | Power Delivered | Power Wasted | Source Voltage |
| --- | --- | --- | --- |
| 5 V | 75 W | 66 W (~47% waste) | ~9.4 V |
| 12 V | 180 W | 66 W (~27% waste) | ~16.4 V |
| 36 V | 540 W | 66 W (~11% waste) | ~40.4 V |
| 100 V | 1500 W | 66 W (~4% waste) | ~104.4 V |
| 200 V | 3000 W | 66 W (~2% waste) | ~204.4 V |
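As a sanity check, here's a short Python sketch that reproduces these figures. The 15A continuous load and the ~0.29 ohm round-trip wiring resistance aren't from any code book; they're simply the values implied by the 66W loss figure above.

```python
# Resistive loss in a DC branch circuit: reproduces the table above.
# Assumptions (implied by the table, not from any wiring standard):
#   - 15 A continuous load on a typical 20 A breaker
#   - round-trip wire resistance such that I^2 * R = 66 W at 15 A

CURRENT_A = 15.0
WASTED_W = 66.0
RESISTANCE_OHM = WASTED_W / CURRENT_A ** 2   # ~0.29 ohm round trip
DROP_V = CURRENT_A * RESISTANCE_OHM          # ~4.4 V lost in the wire

for target_v in (5, 12, 36, 100, 200):
    delivered_w = target_v * CURRENT_A
    waste_pct = 100 * WASTED_W / (delivered_w + WASTED_W)
    source_v = target_v + DROP_V
    print(f"{target_v:>3} V: {delivered_w:5.0f} W delivered, "
          f"{waste_pct:4.1f}% wasted, source ~{source_v:.1f} V")
```

Note that the wasted power is fixed by the current, not the voltage, which is exactly why low-voltage distribution loses: the same 66W comes out of a much smaller delivered total.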
I'll analyze the use of DC for three realistic scenarios.
First, suppose the goal is powering all those little devices (cell phone chargers, clocks, etc.). A quick look shows that my cordless phone consumes 8W, my projector clock 5W, and so on. These devices want a variety of voltages, ranging from 5V to 15V, and I would want some operational margin; it would be really annoying to blow the circuit breaker by plugging in a cell phone charger. If I allow a dozen devices, I'm already above the limit for 5V. A 12V system could make it, but I'd need to go even higher to serve more than one room per breaker.
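To make the arithmetic concrete, here's a quick sketch; the 8W phone and 5W clock are from the paragraph above, and the other ten wattages are illustrative guesses of mine.

```python
# A dozen hypothetical small loads (W); the 8 W cordless phone and
# 5 W projector clock are from the text, the rest are illustrative.
loads_w = [8, 5, 5, 7, 6, 9, 8, 7, 5, 6, 8, 7]
total = sum(loads_w)
print(f"Total load: {total} W")                    # 81 W
for target_v, budget_w in [(5, 75), (12, 180)]:
    status = "over" if total > budget_w else "under"
    print(f"{target_v:>2} V circuit ({budget_w} W budget): {status}")
```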
Next, if I add my laptop, inkjet printer, wireless router, etc., the load jumps. The laptop wants 65W max; the router wants 20W. I might survive at 12V but probably need to move up to 36V power.
If I add the load for LED lighting, that's another 100-300W per room (LED lighting is roughly the same efficiency as fluorescent lighting). The 36V system might make it, but a 100V system is probably needed.
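Stacking the three scenarios shows the progression. In this sketch the small-device total comes from the previous example, while the inkjet wattage and the LED midpoint are assumptions of mine.

```python
# Cumulative per-room load across the three scenarios above.
# The 81 W small-device total is from the previous sketch; the inkjet
# (~30 W) and the LED lighting midpoint (~200 W) are assumed figures.
gadgets = 81
office = gadgets + 65 + 30 + 20          # + laptop, inkjet, router
lighting = office + 200                  # + LED lighting midpoint
budgets = {5: 75, 12: 180, 36: 540, 100: 1500}
for name, watts in [("gadgets", gadgets), ("office", office),
                    ("lighting", lighting)]:
    fits = [f"{v} V" for v, b in budgets.items() if watts <= b]
    print(f"{name:>9}: {watts:4d} W -> fits on: {', '.join(fits) or 'none'}")
```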
From an efficiency perspective, there is a real gain if you can increase the size of the AC/DC converters to 50W or above. The current target for affordable, efficient (Energy Star) low-power AC/DC supplies is 50% efficiency at 1W, rising rapidly to 85% at 50W; efficiency levels off above 50W. Using USB power (which provides a maximum of about 1W) can cut down on the number of very low power adapters needed. But notice the high losses for low voltage distribution in the table above: distributing low voltage DC is very inefficient.
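To see why the distribution loss swamps the converter gain, compare end-to-end efficiency for a small per-device wall adapter against a central 85% converter feeding a fully loaded low-voltage DC circuit. The converter figures are the Energy Star targets quoted above; the distribution figures come from the table.

```python
# End-to-end efficiency: per-device adapter vs. central low-voltage DC.

def distribution_eff(target_v, current_a=15.0, wasted_w=66.0):
    """Fraction of source power reaching the load at full circuit load."""
    delivered = target_v * current_a
    return delivered / (delivered + wasted_w)

adapter_eff_1w = 0.50    # small wall wart, ~1 W (Energy Star target)
central_eff = 0.85       # efficient 50 W+ central supply

print(f"wall wart @ 1 W:          {adapter_eff_1w:.0%}")
# Central supply: good conversion, then a lossy 5 V or 12 V bus.
for v in (5, 12):
    total = central_eff * distribution_eff(v)
    print(f"central 85% -> {v:>2} V bus: {total:.0%}")
```

At full load, the centralized 5V bus lands around 45% end to end, i.e. worse than even the cheapest wall wart, and the 12V bus only reaches about 62%.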
A better approach would be steps like standardizing DC power connectors (much as AC connectors have been standardized for a long time), which would enable more power flexibility. USB is becoming the de facto very low power connector (for under 1W). But USB wiring is very thin (for flexibility), and pushing the power level much higher brings substantial power loss and a real fire hazard. Another connector for 5V at up to 5W, and one for 12V at up to 20W, would cover most of the higher load devices.
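A quick way to see why those tiers are reasonable is to compute the current each connector must carry (I = P/V); the thin USB wiring is fine at 0.2A, but the higher tiers need sturdier conductors.

```python
# Current each proposed DC connector tier must carry (I = P / V).
# The tiers are the ones proposed above; none of this is a standard.
tiers = [("USB-class", 5.0, 1.0),
         ("5 V tier",  5.0, 5.0),
         ("12 V tier", 12.0, 20.0)]
for name, volts, watts in tiers:
    amps = watts / volts
    print(f"{name:>9}: {watts:4.0f} W @ {volts:4.1f} V -> {amps:.1f} A")
```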
Customized power supplies, and PCs acting as power hubs, might emerge much more quickly if the plugs were standardized and equipment started using them.
The trend at the moment seems to be in another direction. Some devices are driving their total power requirements down so that they can be USB or battery powered. An odd example I found recently when replacing my old KVM switch: the new one pulls its power from the mouse and keyboard connectors, which are designed to provide up to 1W (old keyboard technology). The electronics have improved a lot; the old KVM consumed 4W, while the new one uses only 100mW. So the wall wart is eliminated entirely.
Other devices are taking advantage of improvements in power semiconductors to build wide-range AC/DC power supplies in internally, so that they run from an ordinary wall outlet.