One of the fatal flaws of plasma technology is the short life of the panel. The typical number I used to hear was that the panel will go to half brightness in around 2 years of use. This is mainly due to ion bombardment of the phosphor in the individual plasma cells. Anyone who used to work on color TVs remembers the "ion trap" magnet that was strapped around the picture tube neck. Its job was to deflect the relatively heavy ions and keep them from bombarding the center of the screen and causing a dark spot. Because these ions are heavier than electrons, they are not effectively deflected by the deflection system, and so they would repeatedly strike the same area of the screen, causing accelerated wear. The same thing happens in plasma TVs. Each plasma "cell" has a low pressure gas, usually xenon and krypton if I remember correctly, a pair of electrodes, and the phosphor. When the cell switches on, the gas is ionized and illuminates the phosphor by bombarding it with electrons and, you guessed it, ions. I theorize that this is the primary mechanism of wear in the panel.
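For what it's worth, if the wear really is a steady exponential decay, that "half brightness in around 2 years" figure pins down the whole curve. A quick back-of-the-envelope sketch in Python (both the exponential model and the 2-year half-life are assumptions based on the anecdotal figure above, not measured data):

```python
# Back-of-the-envelope brightness decay model for a plasma panel.
# ASSUMPTION: wear is exponential with a half-life of 2 years of use
# (the anecdotal figure quoted above), i.e. output halves every 2 years.

HALF_LIFE_YEARS = 2.0  # assumed half-life, from the figure quoted above

def brightness_fraction(years_of_use: float) -> float:
    """Fraction of original brightness remaining after some years of use."""
    return 0.5 ** (years_of_use / HALF_LIFE_YEARS)

if __name__ == "__main__":
    for years in (0, 1, 2, 4):
        print(f"{years} yr: {brightness_fraction(years):.0%} of original brightness")
```

Under that model the set is already down to about 71% after one year, which matches how quickly people noticed these panels dimming.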
The reason the panels radiate so much is the length of the conductive strips that connect all the cells in row and column fashion. On a 60" set, some of those wires are three feet long, and they carry large amounts of current when the cells are in conduction. Because shielding the glass would reduce the brightness of the set, it's very difficult to keep the EMI in check. The best manufacturers can do is soften the initial conduction pulse of the cell as it switches on, which also reduces brightness. Just like a switch mode power supply, the cells are either on or off. To get variations in brightness, the percentage of time the cell is lit is varied, and our eyes fill in the rest. DLP systems work on the same principle.
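That on/off dimming scheme is just pulse-width modulation: the perceived brightness is the fraction of each frame the cell spends lit. A minimal sketch, assuming a simple linear duty-cycle model (real plasma and DLP drive schemes layer weighted subfields and gamma correction on top of this):

```python
# Duty-cycle dimming: the cell is only ever fully on or fully off,
# so its average light output over a frame is on_time / frame_time.
# ASSUMPTION: simple linear model; actual panels split the frame into
# weighted subfields and apply gamma correction as well.

def average_brightness(on_time_us: float, frame_time_us: float) -> float:
    """Average output as a fraction of full brightness over one frame."""
    if not 0 <= on_time_us <= frame_time_us:
        raise ValueError("on time must be between 0 and the frame time")
    return on_time_us / frame_time_us

# A 60 Hz frame is roughly 16,667 microseconds long.
FRAME_US = 1e6 / 60

# A cell lit for a quarter of the frame reads as 25% brightness.
print(average_brightness(FRAME_US / 4, FRAME_US))  # 0.25
```

The eye integrates the rapid flicker, so a 25% duty cycle simply looks like a dimmer pixel rather than a blinking one.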
I could be wrong, but after being around a few of them, that's my theory.
+-RH