Thursday, July 10, 2008

A Patch to Fix the Net

A major flaw in the basic design of the Internet is being repaired by a large group of vendors working in concert.
On Tuesday, major vendors released patches to address a flaw in the underpinnings of the Internet, in what researchers say is the largest synchronized security update in the history of the Web. Vendors and security researchers are hoping that their coordinated efforts will get the fix out to most of the systems that need it before attackers are able to identify the flaw and begin to exploit it. Attackers could use the flaw to control Internet traffic, potentially directing users to phishing sites or sites loaded with malicious software.
Discovered six months ago by security researcher Dan Kaminsky, director of penetration testing services at IOActive, the flaw is in the domain name system, a core element of the Web that helps systems connected to the Internet locate each other. Kaminsky likens the domain name system to the telephone company's 411 system. When a user types in a Web address--technologyreview.com--the domain name system matches it to the numerical address of the corresponding Web server--69.147.160.210. It's like giving a name to 411 and receiving a phone number, Kaminsky says.
The flaw that Kaminsky found could allow attackers to take control of the system and direct Internet traffic wherever they want it to go. The worst-case scenario, he says, could look pretty bleak. "You'd have the Internet, but it wouldn't be the Internet you expect," Kaminsky says. A user might type in the address for the Bank of America website, for example, and be redirected to a phishing site created by an attacker.
Details of the flaw are being kept secret for now. After Kaminsky discovered it, he quietly notified the major vendors of hardware and software for domain name servers. In March, he was one of 16 researchers who met at Microsoft's Redmond, WA, campus to plan how to deal with the flaw without releasing information that could help attackers. The researchers began working with vendors to release patches simultaneously. Also, since patches are known for giving away information that can help attackers reverse-engineer malicious software, the researchers chose a fix that kept the exact nature of the problem hidden. "We've done everything in our power up to and including selecting an obscure fix to provide the good guys with as much of an advantage as possible," Kaminsky says. "The advantage won't last forever. We think--we hope--it'll last a month."
Since the flaw is in the design of the domain name system itself, it afflicts products made by a variety of vendors, including Microsoft, Cisco, Sun Microsystems, and Red Hat, according to a report released by the U.S. Department of Homeland Security's Computer Emergency Readiness Team. The flaw also poses more problems for servers than it does for Web surfers, so vendors are focusing on getting patches to Internet service providers and company networks that might be vulnerable. Most home users will be covered by automatic updates to their operating systems.
Rich Mogull, an analyst with Securosis, says, "This is something that absolutely affects everyone who uses the Internet today." While he notes that most home users won't have to take action to address the flaw, he stresses that it's very important for businesses to make sure that they've covered their bases. "It is an absolutely critical issue that can impede the ability of any business to carry out their normal operations," he says.
Although Kaminsky was careful to avoid giving out too much information about the flaw that he discovered, he did say a few things about the nature of the fix. When a domain name server responds to a request for a website's location, it provides a confirmation code that is one of roughly 65,000 possible numbers, as assurance that the transaction is authentic. "What has been discovered," Kaminsky says, "is that, for undisclosed reasons, 65,000 is just not enough, and we need a source of more randomness." The new system will require the initial request to include two randomly generated identifiers, instead of the one it now contains. Both identifiers will automatically be returned in the server's response. Kaminsky likens this to sending mail. Before the patch, it was possible to send a letter signed on the inside, but without a return address. After the patch, all "mail" sent from domain name system servers must include both a "signature"--the confirmation code--and the "return address"--the source port information.
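The arithmetic behind that fix can be sketched in a few lines. The numbers below are illustrative only, assuming a full 16-bit space for both the transaction ID and the randomized source port (in practice the usable port range is somewhat smaller):

```python
TXID_SPACE = 2 ** 16   # 16-bit DNS transaction ID: ~65,000 possible values
PORT_SPACE = 2 ** 16   # randomized UDP source port: up to ~16 more bits

def spoof_success_probability(forged_replies, id_space):
    """Chance that a burst of forged replies matches the right identifier(s)."""
    return min(1.0, forged_replies / id_space)

# Before the patch: an attacker only has to guess the transaction ID.
before = spoof_success_probability(1000, TXID_SPACE)

# After the patch: the transaction ID AND the source port must both match.
after = spoof_success_probability(1000, TXID_SPACE * PORT_SPACE)

print(before)          # ~0.015 per burst of 1,000 forged replies
print(after)           # ~2.3e-07 -- tens of thousands of times harder
```

Under these assumptions, adding the second identifier multiplies the attacker's required work by a factor of about 65,000, which is why the researchers hope the fix buys the "good guys" a head start even after its nature becomes known.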
Jeff Moss, CEO of Black Hat, a company that organizes conferences on security, stresses the importance, not only of the vulnerability, but also of the approach taken to patching it. "I don't even want to ask Dan [Kaminsky] how much money he could have gotten for this bug had he decided to sell it," Moss says.
Kaminsky says he's glad that vendors were willing to work together to address the flaw. "Something of this scale has not yet happened before," he says. "It is my hope that for any issue of this scale, especially design issues of this scale, this is the sort of thing that we can do in the future." He plans to release full details of the vulnerability next month at the Black Hat security conference in Las Vegas.

A Picowatt Processor

A low-power chip could be used for implantable medical sensors.
Pico power: This tiny processor, called the Phoenix, uses 90 percent less energy than the most efficient chip on the market today. It could enable implantable medical sensors powered by tiny batteries.


Before long, sensors may be implanted in our bodies to do things like measure blood-glucose levels in diabetics or retinal pressure in glaucoma patients. But to be practical, they'll have to both be very small--as tiny as a grain of sand--and use long-lasting batteries of similarly small size, a combination not commercially available today.
Now researchers at the University of Michigan have made a processor that takes up just one millimeter square and whose power consumption is so low that emerging thin-film batteries of the same size could power it for 10 years or more, says David Blaauw, professor of electrical engineering and computer science at Michigan and one of the lead researchers on the project.
When this processor, dubbed the Phoenix, is coupled with a battery, the whole package would be only a cubic millimeter in volume. At this scale, Blaauw says, it could be feasible to build the chip into a thick contact lens and use it to monitor pressure in the eye, which would be useful for glaucoma detection. It could also be implanted under the skin to sense glucose levels in subcutaneous fluid. More broadly, this low-power approach to processor design could be used in environmental sensors that monitor pollution, or structural health sensors, for instance.
The processor uses only about 30 picowatts (a picowatt is one-millionth of one-millionth of a watt) of power when idle. When active, the processor consumes only 2.8 picojoules of energy per computing cycle. That amount is about a tenth of the energy used by the most energy-efficient chips on the market, says Jan Rabaey, a professor of electrical engineering and computer science at the University of California, Berkeley, who was not involved in the research.
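A back-of-the-envelope calculation shows why a battery this small could last a decade at these power levels. The thin-film cell capacity below is a hypothetical figure for illustration, not a specification from the Michigan team:

```python
PICO = 1e-12
SECONDS_PER_YEAR = 365.25 * 24 * 3600

idle_power_w = 30 * PICO                 # the article's ~30 pW idle draw

# Energy drained by a full decade spent idling
ten_year_energy_j = idle_power_w * 10 * SECONDS_PER_YEAR
print(ten_year_energy_j)                 # ~9.5e-3 J: under 10 millijoules

# A hypothetical 1-microamp-hour thin-film cell at 4 V stores
# charge (coulombs) times voltage (volts) worth of energy:
battery_energy_j = (1e-6 * 3600) * 4.0
print(battery_energy_j)                  # 0.0144 J, comfortably more
```

Even with generous margins for active bursts and battery inefficiency, the idle floor is low enough that a millimeter-scale cell is plausible as a ten-year power source.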
The Michigan team's main idea was to design a chip that runs at an extremely low voltage. While microprocessors for personal computers may run at around two volts, the Phoenix needs only 500 millivolts, or 75 percent less.
At this voltage, parts of the chip don't operate well, explains Blaauw, so his team redesigned the chip's memory, which is smaller than most processor memory, and its internal clock so that it could operate with minimal electrical input. The chip's clock--the timepiece that synchronizes number-crunching operations--has been reduced to an extremely slow rate of 100 kilohertz, as opposed to the gigahertz rates of personal computers. This approach makes sense for sensors, says Blaauw. "If we wanted to monitor pressure in the eye . . . we only need to take readings every few minutes," he says.
Additionally, the researchers paid close attention to the energy loss that occurs while the chip is in sleep mode, or not collecting or processing data. Transistors in the newest computers are made using a 45-nanometer process in which features on a chip are 45 nanometers in size. While this allows for more transistors on a smaller chip, it also results in electrical leakage, due to the physics of the materials at this scale. Blaauw and his team opted for larger transistors made using a 180-nanometer process, from a previous generation of chips. These transistors are in a "sweet spot," says Blaauw. They are big enough to have minimal leakage and yet small enough for the researchers to fit a large number on a one-millimeter-square chip.
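Duty cycling is what makes the slow clock workable. As a sketch, assume a hypothetical sensor reading of 1,000 cycles taken every five minutes, combined with the article's 2.8 pJ/cycle active and 30 pW idle figures:

```python
PICO = 1e-12

idle_power_w = 30 * PICO           # the article's idle figure
energy_per_cycle_j = 2.8 * PICO    # the article's active figure

# Hypothetical workload: a 1,000-cycle pressure reading every 5 minutes
cycles_per_reading = 1000
interval_s = 5 * 60

active_power_w = cycles_per_reading * energy_per_cycle_j / interval_s
average_power_w = idle_power_w + active_power_w
print(average_power_w / PICO)      # ~39.3 pW, barely above the idle floor
```

Under these assumed numbers, the occasional bursts of computation add only about 9 pW to the average draw, so the idle floor, not the active work, dominates the power budget.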
To further minimize leakage, the researchers added special transistors that completely shut off the power supply to the processing transistors when the chip is in standby mode. This is a common approach, says Blaauw, but his team took it to the extreme and dedicated much more of the chip than usual to these "power-gating" transistors. "If a normal [chip] designer would look at this, he'd say, 'You're out of your mind,'" Blaauw says. "But it gives us the power-savings trade-off we need." In sum, the researchers combined a number of already existing tricks and fine-tuned them to achieve the record-breaking low power consumption.
The Michigan team, which is also led by Dennis Sylvester, professor of electrical engineering and computer science, still must add a battery to the Phoenix, and it needs to develop a way for data to be offloaded from the chip for further analysis. Once this is done, the researchers can work on full integration within a biological system, which could take years.
Berkeley's Rabaey, who is writing a book on low-power processors, says that the work is significant. "What has impressed me is that they've driven this to quite extreme numbers," he says. "The energy consumption is extremely low. Nobody else has come even close to this." Rabaey notes that this processor is intended for specialty sensor applications and that it won't show up in a cell phone anytime soon. However, it's an important step toward building implantable medical sensors whose batteries can last for years.
The idea of a low-voltage chip is not new, says Rabaey: it's been used successfully in the watch industry for decades. But within the past few years, academic and industry interest in such designs has blossomed as engineers explore more varied and ubiquitous uses of sensors, devices that require energy-saving tricks in order to be practical.

More-Searchable Flash

Information from millions of Web pages that use the animation software is now available to search engines.
The Web would be useless without search engines. But as good as Google and Yahoo are at finding online information, much of it remains hidden or difficult to rank in search results. On Tuesday, however, Adobe took a major step toward opening up tens of millions of pages to Google and Yahoo. The company has provided the search engines with a specialized version of its Flash animation player that reveals information about text and links in Flash files. It's a move that could be a boon to advertisers, in particular, who have traditionally had to choose between building a site that's aesthetically pleasing and one that can be ranked in a Web search.
The new software is required only to index Flash files, not to play them, says Justin Everett-Church, senior product manager for Adobe Flash Player. Web surfers don't need to download a new Flash player, and content providers don't have to change the way they write applications. "For end users, they're going to see a lot more results and a lot better results," says Everett-Church. "The perfect result may have been out there but trapped in a SWF [Shockwave Flash file]. But now they can find it."
Currently, Google indexes nearly 71 million Flash files on the Internet (this number can be acquired by searching "filetype:swf"). These files have, to a limited degree, always been searchable. Before Adobe's announcement, search engines were able to look at a Flash file and extract static text and links from it. But they couldn't tell where on the Flash site the text fell--on the main page, for instance, or deep within the site--which made it difficult to evaluate its importance. Search engines would also miss moving text inside animations.
Adobe gave Google and Yahoo new Flash player technology that works in conjunction with the "spiders" that search engines use to index Web pages. (Microsoft, which has developed its own competitor to Flash, called Silverlight, is not publicly involved in Adobe's initiative.) Spiders are autonomous programs that browse through the Web in a systematic fashion. Adobe's new player allows these spiders to load Flash files, read the text and links, and click any buttons or tabs. This allows the spider to make inferences about the context in which a word or link occurs--something it couldn't do before.
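A spider's systematic browsing can be sketched as a breadth-first traversal of the link graph. The toy "site" below is purely illustrative: the page reachable only through a Flash button stands in for content that Adobe's new headless player now exposes to the crawler:

```python
from collections import deque

def crawl(start, get_links, limit=100):
    """Browse pages breadth-first, the way a search spider does,
    returning pages in the order they were indexed."""
    seen, order = {start}, []
    queue = deque([start])
    while queue and len(order) < limit:
        page = queue.popleft()
        order.append(page)
        for link in get_links(page):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return order

# Toy link graph standing in for real fetched pages
site = {
    "home": ["about", "flash-demo"],
    "about": ["home"],
    "flash-demo": ["deep-page"],   # reachable only via a Flash button
    "deep-page": [],
}
print(crawl("home", lambda page: site.get(page, [])))
# ['home', 'about', 'flash-demo', 'deep-page']
```

Before the new player, `get_links` would have returned nothing for the Flash page, and "deep-page" would never have entered the index; the ability to "click" through Flash is what extends the spider's reach.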
"Previously, content providers have had to make a trade-off between using a SWF [pronounced 'swiff'] and searchability," says Everett-Church. But now, he says, Adobe hopes that more people will feel comfortable developing visually appealing sites without forgoing search rank.
Analysts agree that it's important to make more of the Web searchable, and Adobe's move is crucial. However, it's an intermediate step, says Peter Elst, a Flash platform consultant. While the move opens up more text and links to search engines, site designers should have "control over what exactly gets indexed and how it should be interpreted by a search engine," Elst says. With conventional Web pages, designers exert that control by adding metadata and tags that describe their sites. But, Elst says, that's not yet possible with the new Flash tools.
At this stage, says Elst, many Flash programmers are concerned about how Google and Yahoo will use their newly acquired information to rank sites. "As far as we know," he says, "the data that gets indexed is just a raw dump, and no context is applied, making it difficult to figure out how you can actually use this to do search-engine optimization and get higher page ranks."
Google has dropped some hints about how it will handle Flash searches. For instance, its spiders currently will not load Flash applications that use the language JavaScript, so those applications may not get indexed. But in the end, people and businesses that want to promote their websites may need to use trial and error to figure out how to build Flash sites that search engines will rank highly, adjusting their tactics as Google's and Yahoo's algorithms change. Then again, that's what they had to do with traditional HTML sites anyway.