What’s been holding researchers back up to this point is that, unlike x-rays and other forms of high-energy, ionizing radiation, low levels of radio-wave exposure don’t have the power to penetrate cells and blow apart bits of DNA. For radio waves from cell phones to cause genetic mutations in brain tissue, the tissue would have to absorb an obscene dose of them, and scientists simply don’t know whether that’s happening, even in people who spend the better part of their days holding the little energy-emitting devices flush against the sides of their heads. To get a definitive answer, researchers need to measure exactly how much radiation the brain absorbs during normal cell phone use, and they might finally have a way of doing it.

Radio-frequency energy is converted to heat when it’s absorbed by brain tissue, and those heat signatures can be detected by magnetic resonance imaging. Unfortunately, because of the intense magnetic fields involved, you can’t just put someone inside an MRI scanner with a metal-laden cell phone and measure how much warmer her brain gets. In the past, researchers have used electrical probes to emit energy inside model brains and then measured the resulting heat signatures, but those simulations have never been close enough to the real thing to yield conclusive results.

Now a group of researchers in New York and New Jersey have designed an antenna that emits radio frequencies the same way a cell phone does, but without any of a phone’s pesky metal parts. They’ve already placed the antenna next to a cow’s brain inside an MRI scanner and tracked the resulting hot spots in the brain. In the future, the antenna system should allow scientists to build an accurate 3-D map of cell phone radiation in the human brain: a crucial step in determining how much energy the organ is exposed to at a time, and whether those little doses might add up to a real threat.
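To get a feel for how subtle these heat signatures are, here is a back-of-the-envelope sketch of the temperature rise involved. The numbers are assumptions, not from the article: the U.S. regulatory limit on specific absorption rate (SAR) for phones is 1.6 W/kg, and the specific heat of brain tissue is roughly 3600 J/(kg·K). The estimate also ignores cooling from blood flow and conduction, so it’s an upper bound for short exposures.

```python
# Rough, adiabatic estimate of tissue heating from absorbed RF energy.
# Assumption: SAR limit of 1.6 W/kg (FCC) and brain specific heat
# of ~3600 J/(kg*K); heat carried away by blood flow and conduction
# is ignored, so real temperature rises would be smaller.

SPECIFIC_HEAT_BRAIN = 3600.0  # J/(kg*K), approximate

def temperature_rise(sar_w_per_kg: float, seconds: float) -> float:
    """Adiabatic temperature rise: dT = SAR * t / c."""
    return sar_w_per_kg * seconds / SPECIFIC_HEAT_BRAIN

if __name__ == "__main__":
    # Six minutes of exposure at the 1.6 W/kg regulatory limit:
    dt = temperature_rise(1.6, 6 * 60)
    print(f"Estimated rise: {dt:.3f} K")  # prints "Estimated rise: 0.160 K"
```

A rise of a few tenths of a degree at most is why an ordinary thermometer won’t do, and why researchers turn to MRI-based thermometry, which can map temperature changes of this size throughout the tissue.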