The Wireless Home Digital Interface specification has reached its 1.0 milestone. It covers a protocol for forming a local wireless network that can shuffle 1080p broadcasts among devices, complete with copy protection and quality-of-service prioritization.
Tired of being forced to run wires or shuffle large files around in order to enjoy some HD content in the room of your choice? A new protocol has been formalized that will allow royalty-free implementations of a wireless mesh capable of handling 1080p broadcasts out to about 30m (100 feet). The Wireless Home Digital Interface will use the same 5GHz area of the spectrum that's currently employed by the N version of WiFi, but its backers promise that this won't create any problems: supposedly, the system has the brains to identify the portions of the signal that contain the most important visual information and give them priority.
The WHDI standard is being pushed by AMIMON, which makes the chips that power it. Still, the group has attracted some serious backers in the home electronics area, including Hitachi, LG, Samsung, and Sharp. The details of the protocol are only available to companies that have formally signed on as "early adopters," but the WHDI website has a fair amount of information on the technology behind the standard scattered among different pages.
For starters, it specifies that devices will broadcast on the unlicensed area of the spectrum around 5GHz, which is currently occupied by 802.11n WiFi signals. Right now, that area of the spectrum isn't especially crowded, since it doesn't allow backwards compatibility with B and G devices, but traffic will undoubtedly increase as those earlier devices get retired. As with just about anyone else bringing new wireless devices onto the market, WHDI's backers claim that their equipment will be able to sense areas of the spectrum that are already in use and avoid them, maintaining an extremely low-latency signal.
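To make the channel-avoidance idea concrete, here's a minimal sketch of that kind of scan-then-pick logic in Python. The channel list, the occupancy measurement, and the selection rule are all illustrative assumptions for this example; none of them come from the WHDI spec.

```python
import random

# Illustrative 5GHz center frequencies (MHz), roughly the territory
# 802.11n also occupies. Not taken from the WHDI spec.
CHANNELS_MHZ = [5180, 5200, 5220, 5240, 5745, 5765, 5785, 5805]

def measure_occupancy(channel_mhz: int) -> float:
    """Stand-in for a real energy/carrier-sense measurement.

    Returns the fraction of time (0.0-1.0) the channel looked busy;
    here it's simply randomized so the sketch runs on its own.
    """
    return random.random()

def pick_channel(channels=CHANNELS_MHZ) -> int:
    """Pick the least-occupied channel from a fresh scan."""
    readings = {ch: measure_occupancy(ch) for ch in channels}
    return min(readings, key=readings.get)

print("transmitting on", pick_channel(), "MHz")
```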
But, even if that fails, the protocol can apparently overcome momentary hiccups in its signal. The trick is separating the video information into chunks of data with greater and lesser visual importance, and ensuring that the most important ones get through. As the WHDI site describes it, the system takes the most significant bits from a pixel's color data and groups them into a packet, assigning it the highest priority. The next few bits get a lower priority, and so on down the line until you reach the last few bits, which merely convey very subtle differences in color for a given pixel.
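As a rough illustration of that grouping, here's what splitting a single 8-bit color sample by bit significance might look like. The 4/2/2 grouping and the priority labels are assumptions made for the example; the real packetization is only described to licensees.

```python
def split_by_significance(sample: int) -> list[tuple[int, int]]:
    """Split an 8-bit color sample into (priority, bits) groups, MSBs first.

    Priority 0 is the most visually important group.
    """
    assert 0 <= sample <= 0xFF
    return [
        (0, (sample >> 4) & 0xF),  # top 4 bits: coarse value, highest priority
        (1, (sample >> 2) & 0x3),  # middle bits: lower priority
        (2, sample & 0x3),         # last 2 bits: subtle detail, lowest priority
    ]

# A mid-gray sample (182) splits into three groups of decreasing importance.
print(split_by_significance(0b10110110))  # [(0, 11), (1, 1), (2, 2)]
```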
If everything goes well, all the bits are reassembled into a glorious, 60Hz 1080p signal at the receiving end. If it doesn't, the first things to go are subtle details that aren't likely to be noticeable. Only if the network runs into serious performance problems should obvious color problems become apparent. WHDI considers this superior to compressed signals, since compression commits to a known amount of loss in advance and doesn't allow any sort of recovery if some of the data goes missing in transit.
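Continuing the same sketch, the receiving side can rebuild each sample from whichever groups actually arrived, treating missing low-priority bits as zero, so the penalty for a lost packet is a slightly coarser color rather than a dropped frame. Again, this illustrates the principle rather than WHDI's actual reconstruction.

```python
def reassemble(received: dict[int, int]) -> int:
    """Rebuild an 8-bit sample from whichever (priority -> bits) groups arrived."""
    shifts = {0: 4, 1: 2, 2: 0}  # mirrors the 4/2/2 split used when sending
    sample = 0
    for priority, shift in shifts.items():
        sample |= received.get(priority, 0) << shift
    return sample

print(reassemble({0: 11, 1: 1, 2: 2}))  # 182: every group arrived, exact value
print(reassemble({0: 11, 1: 1}))        # 180: lowest-priority group lost, barely off
```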
Privacy advocates will be happy to note that the system enables 128-bit AES encryption, while Hollywood may take comfort in the fact that it implements version 2 of the HDCP copy protection scheme.
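How WHDI derives and exchanges its keys isn't public, but for reference, this is roughly what protecting a payload with 128-bit AES looks like using the widely used Python cryptography package; the key and nonce handling here is purely illustrative and not part of the standard.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)    # a 128-bit (16-byte) session key
nonce = os.urandom(16)  # per-stream counter-mode nonce
payload = b"one prioritized packet of pixel data"

encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
ciphertext = encryptor.update(payload) + encryptor.finalize()

decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
assert decryptor.update(ciphertext) + decryptor.finalize() == payload
```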
By giving away the technical details of the standard, AMIMON clearly hopes that multiple vendors will implement it, and then turn to the company to buy the chips for doing so. It promises that the system will form an open mesh, allowing any company's device to hop on without any sort of preconfigured pairing. Of course, nothing's stopping the vendors in question from making pairing with foreign devices a UI nightmare.
The group claims that devices using the technology will be on the market late next year. The companies that have already signed on make devices that include high-def displays, cable boxes, and disc players, so the potential is there for an entire wireless HD ecosystem if all of them follow through with actual products.
Source: Ars Technica