Newswise — A little over a year ago, a Swedish scientist learned the hard way that laptop computers do not quite live up to their name. According to the British medical journal The Lancet, the mercifully anonymous man spent an evening writing a report, periodically shifting position to avoid heat from the machine. The next day he woke to find himself blistered in a very sensitive place. He'd been well and truly fried.

Computer users, lulled by the ubiquitous hum of their workstations' fans, can be forgiven for thinking that the heat thrown off by a computer's innards is not a burning issue. But it is, writes Philip E. Ross in the May issue of IEEE Spectrum. Chip designers, computer makers, assorted university researchers, and chip-packaging specialists are uniting to tackle one of the most urgent, but overlooked, of the litany of potential showstoppers looming for the global semiconductor industry: the soaring densities of heat on integrated circuits, particularly high-performance microprocessors.

Researchers are studying exotic new kinds of heat-conducting "goop" that suck the heat out of a chip and convey it to heat sinks, which radiate it into the air. Still, it is a measure of the seriousness of the problem that engineers are also pursuing concepts that have been considered too elaborate and far too expensive for such a mass-produced consumer product as a personal computer. Possibilities on the horizon include tiny self-contained evaporative cooling systems and even devices that capture the heat and turn it directly into electricity.

What has led researchers to such measures? Basic physics: virtually all of the power that flows into a chip comes out of it as waste heat. Today's standard-issue Pentium 4 throws off 100 watts, the same as the bulb in a child's Easy-Bake Oven and, as the hapless Swede learned, more than enough to cook meat. Divide by the area and you get a heat flux of around 30 watts per square centimeter--a power density several times higher than that of a kitchen hot plate.
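The arithmetic behind that figure is a simple power-over-area calculation. A minimal sketch follows; the die area used here is not stated in the article but is inferred from its quoted numbers (100 W and roughly 30 W/cm² imply an area of about 3.3 cm²), so treat it as an illustrative assumption rather than a real Pentium 4 specification:

```python
# Back-of-the-envelope heat-flux estimate from the article's figures.
# ASSUMPTION: the ~3.3 cm^2 effective area is inferred from the quoted
# 100 W and ~30 W/cm^2; actual die sizes varied by Pentium 4 revision.
power_w = 100.0        # total power the chip dissipates as heat
area_cm2 = 3.3         # assumed effective chip area (not from the article)

flux_w_per_cm2 = power_w / area_cm2
print(f"heat flux = {flux_w_per_cm2:.0f} W/cm^2")  # about 30 W/cm^2
```

The same one-line division shows why the problem worsens with each process generation: shrinking the die while holding power constant drives the flux up even when total wattage stays flat.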