A rather late post. After working on this for an hour or so last night, it just wasn't coming across clearly. To me. I thought discretion was in order and I'd rework it to make it more understandable - and make it twice as long.
Wireless charging is one of those things that seems like a great idea but really hasn't taken off as much as many had expected. I touched on that briefly last month, but while wireless charging fits into many people's lives, it isn't revolutionary. I recall writing:
Yeah, they allow you to put your phone on a mat, or a
special pad like the Samsung charger pictured, but the bottom line is
that it doesn't buy you much. Your phone isn't very usable
while it's charging - at least, not as a phone. How
different is leaving your phone on a pad overnight from leaving it
plugged in overnight?
Within a week of that, I ran into talk about another wireless charger that went beyond "not revolutionary" to a bit scary. EE Times reported:
The Cota wireless charging technology from Cairo-based
Si-Ware Systems (SWS) and Ossia Inc. departs from coil-based systems
to serve up to eight devices simultaneously at a distance of 30 feet
using 2.4 GHz multi-path radio frequency.
...
Ossia and SWS envision a charging technology that mimics
Wi-Fi, automatically connecting devices to power without the need for
charging mats. The Cota system includes a scalable receiver and
transmitter with between 1,000 and 8,000 tiny antennas. [emphasis added - SiG]
The idea itself is pretty elegant, and you have to admit it has some aspects that sound cool. Instead of charging inductively like all current systems do (implementing the two halves of a transformer as a coil in the charger and another in the thing being charged), this system uses radio transmitters to send power to the thing you're charging. As a user, you never worry about plugging in the device or putting it on a charging coil. You simply go about your life and the phone (or whatever) is automagically charged while you use it. In their system, one transmitter connects with up to 8 devices with no action required from the user, much like a WiFi network recognizing a device that has been there before, and sends power over radio frequencies to charge the devices' batteries. The article reports that the system relies on custom components, which it describes:
The SWS1410 MIMO transmitter chip can deliver more than
10 Watts with support for up to four antennas, a central CPU that can
store location data for different clients and on-chip RAM. Abid
Hussein, chief commercialization officer at Ossia, which developed
Cota’s antenna technology, made a distinction between his
technology and beam forming.
Each antenna emits a few milliwatts of ambient power he
said, then uses massive multi-path technology to process a receiver’s
signal and send power to a chip or chips. Ossia and SWS will
demonstrate a consumer-scale personal area transmitter at CES in
January. [emphasis added - SiG]
The way I read their description, they intend to radiate "more than 10 Watts" for the charging, but the system first emits only a few milliwatts as a sort of probe, and when a receiver replies, it then sends the full charging power.
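Just to make that sequence concrete, here's how I picture the cycle in a few lines of Python. To be clear, this is my interpretation, not anything from Ossia or SWS; the device names, the even power split, and the "known device" filtering are all my assumptions.

```
# Hypothetical sketch of the probe -> reply -> charge cycle as I read the
# article.  Nothing here comes from Ossia or SWS; the names and the even
# power split are my own assumptions, just to make the sequence concrete.

MAX_DEVICES = 8        # "up to eight devices simultaneously"
TOTAL_BUDGET_W = 10.0  # "more than 10 Watts" - I'm capping it at 10 W

def allocate_power(replies, known_ids):
    """Given the devices that answered the low-power probe, decide how much
    charging power each one gets (even split assumed)."""
    targets = [dev for dev in replies if dev in known_ids][:MAX_DEVICES]
    if not targets:
        return {}
    watts_each = TOTAL_BUDGET_W / len(targets)
    return {dev: watts_each for dev in targets}

# Example: three known devices answer the probe; a stranger's reply is
# ignored, like a WiFi network only talking to devices it has seen before.
print(allocate_power(["phone_a", "phone_b", "tablet_c", "stranger"],
                     known_ids={"phone_a", "phone_b", "tablet_c"}))
# {'phone_a': 3.33..., 'phone_b': 3.33..., 'tablet_c': 3.33...}
```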
So what's the problem? The amount of power they need to send, and the way the system is described, sound like they land in the realm of the potentially dangerous.
I find numbers help me visualize a system and make it easier to see how it works, so let's play with some numbers. An iPhone 5 or 6 has a battery rated at about 6 or 7 watt-hours. An iPad has a battery rated at about 25 watt-hours. A Samsung Galaxy S5 upgrades that to almost 11 watt-hours, and the Nexus 10 tablets are rated at close to 34 watt-hours. A rating like that is generally charged or discharged at a 10 hour rate (I talk about this in more detail here), which implies a charger would deliver about a tenth of that figure, in watts, for a bit over 10 hours. (Why more than 10 hours to put back 10 hours' worth of use? There are always inefficiencies - things that generate heat - that make it take more than 10 hours to put 10 hours of capacity into a battery.)
Since they say their system will charge up to eight devices, we could assume a mix of these things, or "worst case" it by assuming a room with eight Nexus 10 tablets in it, but I'm going to be generous and say four Nexus 10 tablets (4 times 3.4 watts) and four iPhones (4 times 0.7 watts). That's 16.4 watts, quite a bit more than the "more than 10 Watts" rating they talk about. Based on reading engineering sales pitches for 30 years, I believe that if they really could do 16.4 watts they'd say "more than 15 watts," if not "more than 16 watts." So I'm going to go from here and limit my calculations to 10 watts. ("More than 10 Watts" could mean 10.001.)
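If you'd like to check my arithmetic, here's the whole thing in a few lines of Python. The watt-hour figures are the rough ones I quoted above, and the 10 hour charge rate is the assumption from the previous paragraph.

```
# Back-of-the-envelope check of the charging numbers above.  Battery
# capacities are in watt-hours; charging at a 10 hour (C/10) rate means
# the charge power is roughly capacity / 10 hours.

CAPACITY_WH = {          # the rough figures quoted in the text
    "iPhone 5/6":  7.0,
    "iPad":       25.0,
    "Galaxy S5":  11.0,
    "Nexus 10":   34.0,
}
CHARGE_HOURS = 10.0

charge_w = {name: wh / CHARGE_HOURS for name, wh in CAPACITY_WH.items()}
print(charge_w)   # {'iPhone 5/6': 0.7, 'iPad': 2.5, 'Galaxy S5': 1.1, 'Nexus 10': 3.4}

# The "generous" room: four Nexus 10 tablets plus four iPhones.
total_w = 4 * charge_w["Nexus 10"] + 4 * charge_w["iPhone 5/6"]
print(round(total_w, 1))   # 16.4 watts - already past the "more than 10 Watts" spec
```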
The problem is the electromagnetics of filling a room with 10 Watts of radio frequency (RF) energy, aiming it at a handful of small devices, and actually transferring the charge to them. In particular, can it be done safely? RF safety is a pretty big and contentious subject.
Way back in 2011, I put down a lot of thoughts on the subject, but the 25 cent summary is that while there are many, many accusations that RF causes all sorts of injuries, cancers, or other problems, the only effect everyone agrees on is that it causes heating. We all know that - didn't Robin Williams joke about putting Mr. Hamster in the microwave oven in the mid '70s?
Over the last half year or so of my career, I became the default guy to go to with questions about RF safety. I reviewed what the US, European Union, Canada, and Australia had for their RF safety limits, and then looked into what the Environmental Defense Fund had to say about smart meters (I found it interesting that the EDF essentially used the US limits). In general, the US limits are about typical of everyone's. There are a couple of countries in the European Union that reject the EU limits and impose limits about 1/10 of those, but they don't really justify it. I will stick with the US limits here.
The US has a safe exposure limit (for the general public) in the 2.4 GHz frequency range (cited in the first quoted paragraph) of 0.001 W per square centimeter, or 1 mW/sq. cm. Recall from the charging discussion above that we're assuming 10 watts, or 10,000 mW. To get down to 1 mW/sq. cm, we have to spread that over 10,000 sq. cm to be safe! 10,000 sq. cm is 1,550 square inches. An antenna beam doesn't have to be circular in cross section, but a circle is a convenient way to picture how big a thing we're talking about. A circular beam covering 1,550 square inches would be almost 44 inches across - almost four feet in diameter.
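Here's that arithmetic spelled out, if you want to play with the numbers yourself:

```
# The exposure-limit arithmetic from the paragraph above.
import math

TOTAL_POWER_MW = 10_000.0     # 10 watts expressed in milliwatts
LIMIT_MW_PER_SQCM = 1.0       # US general-public limit near 2.4 GHz

min_area_sqcm = TOTAL_POWER_MW / LIMIT_MW_PER_SQCM   # 10,000 sq. cm
min_area_sqin = min_area_sqcm * 0.155                # ~1,550 square inches

# If the beam cross section were a circle, how wide would it have to be?
diameter_in = 2 * math.sqrt(min_area_sqin / math.pi)
print(round(min_area_sqin), round(diameter_in, 1))   # 1550  44.4 (inches)
```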
Think now about pointing that 44 inch diameter beam at one device to charge. 44 inches is much bigger than an iPad, so most of that energy misses the device, charging efficiency goes down, and it takes even more hours to charge. The only way to get the efficiency up would be to make the beam smaller, but that would push the power density past the FCC safety rules. It gets more complicated. That 10 watts needs to be split into eight different directions to put power into each device. In a way that's good, since it means lower power per beam, which eases the safety hazard and lets the beams be smaller. On the other hand, consider a room with eight people carrying devices and the system trying to charge them. How does it track them? How does it keep a beam from crossing someone's eyes? How fast could someone move and still allow the system to charge their device?
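To put a number on the "split eight ways" case, here's the same safety arithmetic with the power divided evenly across eight devices. The even split is my assumption; the article doesn't say how the power would actually be divided.

```
# The same safety arithmetic with the 10 watts split evenly across eight
# devices.  The even split is my assumption, not something from the article.
import math

TOTAL_POWER_MW = 10_000.0
LIMIT_MW_PER_SQCM = 1.0
N_DEVICES = 8

per_beam_mw = TOTAL_POWER_MW / N_DEVICES              # 1,250 mW per beam
min_area_sqcm = per_beam_mw / LIMIT_MW_PER_SQCM       # 1,250 sq. cm per beam
diameter_cm = 2 * math.sqrt(min_area_sqcm / math.pi)  # ~40 cm across
print(per_beam_mw, round(diameter_cm, 1))             # 1250.0  39.9
```

A 40 cm (roughly 16 inch) spot is smaller than the single 44 inch beam, but it's still far bigger than a phone, so most of the power still misses the thing it's supposed to charge.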
I hate to be the guy who told the Wright Brothers "it'll never fly", but it doesn't seem to be an approach that can really work. Too many conflicting requirements. They say that they want to use the 2.4 GHz band for this power distribution, but your smart device has a 2.4 GHz WiFi receiver in it and putting 10 Watts in the vicinity of a receiver designed to handle millionths of a watt just isn't a good idea. That much power would very likely physically damage the receiver. If it doesn't physically damage the receiver, it will jam it and make it useless.
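For a rough sense of the mismatch, here's a back-of-the-envelope comparison in dBm. The receiver levels are typical values I'm assuming for illustration, not specs from any particular WiFi chipset, and the 1 watt arriving at the device is just a round number in the range the system would need to deliver to be useful.

```
# Rough comparison of charging power to what a WiFi front end expects.
# The receiver levels are typical values I'm assuming for illustration;
# check a real chipset datasheet for actual numbers.
import math

def watts_to_dbm(watts):
    return 10 * math.log10(watts * 1000)   # dBm = 10 * log10(milliwatts)

charging_at_device_w = 1.0    # assume ~1 W actually arrives at the device
wifi_max_input_dbm = -30.0    # assumed typical maximum input level
wifi_signal_dbm = -70.0       # a typical received WiFi signal

charging_dbm = watts_to_dbm(charging_at_device_w)   # +30 dBm
over_max = charging_dbm - wifi_max_input_dbm        # ~60 dB over the max input
over_signal = charging_dbm - wifi_signal_dbm        # ~100 dB over the WiFi signal
print(round(charging_dbm), round(over_max), round(over_signal))
```

That's a factor of a million above what the receiver is designed to tolerate, and ten billion times stronger than the WiFi signal it's trying to hear.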
I'm especially concerned about the RF safety requirements. No, I don't think 1 mW/sq. cm is dangerous, but computing what an actual electromagnetic field looks like, especially one produced by "between 1,000 and 8,000" radiating antennas, is so computationally horrific that it would choke a really fast computer for days. The amount of simulation and measurement needed to show the system is safe would run to many sets of calculations, and those calculations go out the window unless everything in the room can be modeled, including the people carrying the devices. Don't say they can just measure the fields instead, because putting any probe into a field distorts it. None of that makes the safety analysis any easier.
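To give a feel for "computationally horrific," here's a crude estimate of the mesh a brute-force field simulation (something like FDTD) would need for one ordinary room. The room dimensions and the ten-cells-per-wavelength rule of thumb are my assumptions.

```
# A crude estimate of the mesh a brute-force field simulation would need
# for one room at 2.4 GHz.  Room size and the ten-cells-per-wavelength
# rule of thumb are my assumptions.
C_M_PER_S = 3.0e8
FREQ_HZ = 2.4e9

wavelength_m = C_M_PER_S / FREQ_HZ        # ~0.125 m (about 5 inches)
cell_m = wavelength_m / 10                # ~1.25 cm mesh cells
room_volume_m3 = 5 * 5 * 3                # assume a 5 m x 5 m x 3 m room

cells = room_volume_m3 / cell_m ** 3
print(f"{cells:.1e}")                     # ~3.8e+07 - tens of millions of cells
```

And that's a single snapshot of an empty room; add thousands of time steps, up to 8,000 transmit antennas, the furniture, and the people, and "choke a really fast computer for days" starts to look optimistic.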