Wireless Networking

Wi-Fi Cards Can Now Detect Microwave Ovens

An anonymous reader writes "Researchers at UW-Madison have used regular WiFi cards to detect non-WiFi interference sources like microwave ovens, Bluetooth devices, cordless phones, Xbox controllers and video cameras. They call their software Airshark. Current products like Wi-Spy and Spectrum Expert are expensive and need extra hardware, whereas Airshark is a software-only solution that works directly on the WiFi cards in your laptops and APs. This also paves the way for several interesting applications. For example, your WiFi network need no longer suffer just because your neighbor switched on a microwave oven or a cordless phone; newer WiFi APs will be able to switch channels and adapt to the interference accordingly."
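The adaptive behavior described in the summary is easy to caricature. Below is a hypothetical sketch (not Airshark's actual code: the function name, detection interface, and channel map are all invented for illustration) of how an AP might hop away from a detected interferer:

```python
# Hypothetical sketch of interference-aware channel switching.
# The detections list is a stand-in for an Airshark-style classifier's
# output; the real API is not public, so names and maps are guesses.

AFFECTED_CHANNELS = {
    "microwave": {6, 7, 8, 9, 10, 11},   # magnetrons sit near 2.45 GHz
    "cordless_phone": {1, 2, 3},
    "bluetooth": set(range(1, 12)),      # BT hops across the whole band
}

def pick_channel(current, detections, channels=range(1, 12)):
    """Return the first 2.4 GHz channel clear of every detected source."""
    blocked = set()
    for source in detections:
        blocked |= AFFECTED_CHANNELS.get(source, set())
    for ch in channels:
        if ch not in blocked:
            return ch
    return current  # nothing better available; stay put

print(pick_channel(6, ["microwave"]))  # -> 1
```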
  • by Anonymous Coward

    If you live in a city or in the suburbs, you can see LOTS of WiFi access points already, so switching to a different frequency won't get you to an interference-free channel! Maybe you'll get to a slightly less populated one, but not an interference-free one!

    • Or use 5 or 5.8 GHz. Even if there are tons of devices, it has many times more spectrum than the 802.11b/g band.
      • Or use 5 or 5.8 GHz. Even if there are tons of devices, it has many times more spectrum than the 802.11b/g band.

        What's the deal with 802.11n and the 5 GHz band? I've seen a lot of N equipment that only seems to support the 2.4 GHz band (I'm guessing because it's cheaper to only have one radio?), so is it the case that N equipment doesn't *have* to support 5 GHz? (*)

        At any rate, it sounds like buying 5 GHz-supporting N equipment would be worth the extra money. I made sure my first router and card (circa 2005) supported 802.11a even though it cost more because I suspected congestion might become a problem and the less-p

        • Yup, cheapo 802.11n stuff doesn't support 5GHz. The only reason I'm still running 2.4GHz is my Android smartphone... stupid cheapo Broadcom BCM4329 doesn't support 5GHz.

          At least all my Thinkpads support 5GHz and the router's dual band...

          I can definitely recommend going with 5GHz if you don't need all too much range (through walls and such) and the airwaves around 2.4GHz are crowded in your neighborhood.

          Look at this crap... my neighborhood at 2.4GHz: http://dl.dropbox.com/u/7086491/pictures/2.4ghz.PNG [dropbox.com]

          And 5GH

  • by Skinkie ( 815924 ) on Saturday September 24, 2011 @05:59PM (#37504910) Homepage
    Then the entire spectrum problem is solved, and everything would be autoconfigured for the basic paradigm: connectivity. Now, I don't expect a microwave to give me food-over-IP, but I would expect a neighboring WiFi cell to help my AP extend the signal if my client moved out of range (i.e., sees more noise).
    • by morcego ( 260031 )

      The technology allows for this already. However, the security and privacy implications are big. Not to mention bandwidth limitations. And switching capability. And routing tables. And ARP tables. And those are just the problems I thought of while typing this. I'm sure there are several others.

      • Not an expert on this, but couldn't the other hub just relay the encrypted signal, without decrypting it? Then there would be no security problem.
        • by morcego ( 260031 )

          You are talking about point to point encryption. Yes, it would be possible, assuming you are able to establish encrypted point to point connections to all services you connect to. Or you can setup a VPN to a "relay" server. The first is simply not a reality at this point, and the second is something most people wouldn't know how to do.

          Privacy would still be a problem, because anything up to Layer 3 would not be encrypted in the scenario you propose. Depending on how it is implemented, even Layer 4 would not

    • by CityZen ( 464761 )

      It's called mesh networking. If everyone did it, we wouldn't need telcos (or ISPs) so much.

      • In this ideal world of yours, what incentive is there for ISPs to maintain their networks, as you have effectively cut them out of the loop?

        You do realize that you would need a backbone SOMEWHERE unless you wanted horrific latency, for example to get traffic from coast to coast?

        • by CityZen ( 464761 )

          Yes, which is why I added "so much".

    • Well, it's not just that, it's that the devices are designed to give the most range possible without going with a unidirectional antenna. And the problem is that it worked fine when the 802.11b devices were first rolling out because few people had them, but as they've become common, you have to deal with a dozen WAPs competing for scarce spectrum.

      And all is well and good if you have a large property, but if you're in an apartment and just need something that's fast and can let you move from the de

      • say 15' away, having a device cranking out enough power to go 200' is way too much power

        The WiFi routers I've seen have a setting where one can adjust the output power. Of course, almost no one sets it to something sensible, which means my neighbour's signals are stronger than my own in parts of the apartment (on the other hand, that same inability to configure WiFi means they all fight over channels 1-4, so I can avoid them).

        Routers should have automatic power adjustment for the devices they have connected. Is this technically possible?

        • LTE does something like this, where the base station continuously analyzes the radio spectrum and figures out the optimal frequency (it uses multiple separate ones for the same cell) and the optimal transmission power (for both downlink and uplink transmissions).

          Part of the definition of "optimal" here is "not so strong as to interfere with neighbouring cells more than necessary".

          In theory, you could implement some inter-AP protocol as part of a WiFi standard to allow them to determine their respective
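For what it's worth, the automatic power adjustment asked about above can be sketched as a simple control loop in the spirit of the LTE behaviour just described. The driver hooks a real AP would need aren't standardized, so everything here is an invented illustration:

```python
# Toy version of automatic AP power control: aim the weakest client at
# a target RSSI. Thresholds and limits are made up for illustration.

TARGET_RSSI_DBM = -65   # comfortable margin for mid-range 802.11n rates
STEP_DB = 1

def next_tx_power(tx_power_dbm, client_rssis_dbm, lo=0, hi=20):
    """One control step: nudge AP TX power toward the target RSSI."""
    weakest = min(client_rssis_dbm)
    if weakest > TARGET_RSSI_DBM + STEP_DB:
        return max(lo, tx_power_dbm - STEP_DB)   # too hot: back off
    if weakest < TARGET_RSSI_DBM - STEP_DB:
        return min(hi, tx_power_dbm + STEP_DB)   # too weak: boost
    return tx_power_dbm                          # within band: hold

print(next_tx_power(20, [-50, -58]))  # -> 19 (everyone is loud; step down)
```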
      • by tlhIngan ( 30335 )

        Well, it's not just that, it's that the devices are designed to give the most range possible without going with a unidirectional antenna. And the problem is that it worked fine when the 802.11b devices were first rolling out because few people had them, but as they've become common, you have to deal with a dozen WAPs competing for scarce spectrum.

        The WiFi devices actually cooperate; it's part of the spec. A WiFi network consists of more than just the clients attached to it - it includes *all* clie

    • Re: (Score:3, Informative)

      by tuxicle ( 996538 )
      802.11a devices (operating in the 5 GHz band) are already supposed to detect radar signals and switch channels if one is found. This is particularly a problem in Europe, where most weather-monitoring radars are C-band and share the same frequency band as 802.11a.
    • Hi, I'm working for The Serval Project [servalproject.org], and like other projects related to wifi mesh routing, we do have high level goals like this. And we're actively trying to make them a reality.

      One of our staff just returned from a presentation to IEEE, to propose a more open standard for the next 802.11 spec.

      The basic premise of our proposal is that the protocol for using WiFi devices to route traffic should be dealt with in kernel or user space, not in the radio spec, and that ad-hoc mode and 802.11s are useless for this task (damn you, BSSID, why you change?). We also think that security and perhaps even error correction should be dealt with via a VPN or baked into the application layer.

      We want the next wireless spec to include a basic packet radio mode, operating in any unlicensed white-space spectrum, that gives as much control as possible to higher levels of the OS. So that new interesting ideas are easier to experiment with and implement.

      And we've been invited to the next IEEE working group to help make it happen.

    • Yup, Shannon be damned. Frequency agility, which has been used in various radios for decades, will solve all these pesky spectrum crowding problems.
  • "Honey, did you leave the microwave on?"

    "I don't know, Dear, let me log into my PC and check."

    • by maxume ( 22995 )

      Right, because no one has neighbors in close proximity or anything silly like that.

    • Erm.. Is it even possible to "leave the microwave on?" Aren't they all on timers?

      • by grub ( 11606 )

        Normally, yes.

        ~20 years ago I had a microwave with a dial timer that stopped working but still kept the microwave running. Sometimes it would blow an internal fuse if you opened the door while it was operating; I got tired of changing that internal fuse, so I wired it out. The part bypassed was the door safety assembly. Basically the thing would run with the door open and never stop.

        I had to warn anyone that was over to hit RESET before opening the door. Girlfriend moved in and we kept her microwave. Wimp.

        Serious end note: I called my microwave "Ol' Auschwitz".
        • by ae1294 ( 1547521 )

          Normally, yes.

          ~20 years ago I had a microwave with a dial timer that stopped working but still kept the microwave running. Sometimes it would blow an internal fuse if you opened the door while it was operating; I got tired of changing that internal fuse, so I wired it out. The part bypassed was the door safety assembly. Basically the thing would run with the door open and never stop.

          I had to warn anyone that was over to hit RESET before opening the door. Girlfriend moved in and we kept her microwave. Wimp.

          Serious end note: I called my microwave "Ol' Auschwitz".

          Why not just wire the fuse part outside the unit?

        • Creepy. I myself have an old Sharp microwave (built like a tank) and it has this designed behavior where it runs the fan (along with the light bulb) even when the door is open, creating the impression that the machine is running. The turntable and magnetron are not active then, though...
  • by vadim_t ( 324782 ) on Saturday September 24, 2011 @06:09PM (#37504978) Homepage

    I thought that shielding was well understood, and in fact part of the reason why microwave ovens are a common household item.

    Could anybody with experience in these matters explain where the leak comes from, and why leaks still exist? Is it impractical or physically impossible to have perfect shielding for some reason?

    • by maxume ( 22995 )

      My el-cheapo microwave says on the box that it has 1500 watts of cooking power. WiFi stuff works with a couple of watts at most.

      I see those numbers and I'm not surprised that the one can bother the other, but I don't have the EM smarts to know how easy it would be to shield the microwave enough to get the emissions down to (the equivalent of) a watt or two.

      • by bbn ( 172659 ) <baldur.norddahl@gmail.com> on Saturday September 24, 2011 @06:30PM (#37505068)

        WiFi is only allowed to transmit 100 mW (0.1 watt).

        Even if only 0.01% of the microwave's power leaks, that is 0.15 W, still more powerful than the WiFi. And even less is required if you do not have a perfect WiFi signal to begin with.

          Even if only 0.01% of the microwave's power leaks, that is 0.15 W, still more powerful than the WiFi.

          Usually it's not leakage from the cooking chamber (those are pretty well shielded) but from the power supply, which isn't well shielded.

          I've read that it has to do with the AC duty cycle, but I previously had a Panasonic that ran on a DC inverter (supposed to make the microwave cooking better, but didn't) and it had terrible WiFi interference.
          'Microwave robustness' and 802.11g didn't do a darn bit of good (maybe

      • According to Wikipedia, the safety regulation is "5 milliwatts per square centimeter, measured 5 cm from the surface" (although it cites this as "over the lifetime of the device"; I'm unsure if that means the total amount over its life or the maximum released at any time during its life). Given the 28x38x25 cm measurements for a "standard" microwave, the surface area alone is about 5400 cm^2, so the limit works out to a worst-case bound on the order of 27 watts (sketched below). Output power on most WiFi devices is 100-200 mW. So yeah, it's completely plausible that a microwave can leak enough radiation
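The arithmetic above, as a back-of-the-envelope script (it uses the oven's own surface for simplicity; the regulation measures 5 cm out, so the true surface and bound are a bit larger):

```python
# Back-of-the-envelope bound on allowed leakage, from the figures above.
# Uses the oven's own surface area for simplicity; the regulation
# measures 5 cm out, so the real surface (and bound) is somewhat larger.

w, d, h = 28, 38, 25                    # cm, the "standard" oven quoted
area_cm2 = 2 * (w*d + w*h + d*h)        # 5428 cm^2
limit_mw_per_cm2 = 5                    # the quoted 5 mW/cm^2 limit
bound_w = area_cm2 * limit_mw_per_cm2 / 1000
print(bound_w)                          # ~27 W, vs 0.1-0.2 W for WiFi
```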

      • by mpe ( 36238 )
        My el-cheapo microwave says on the box that it has 1500 watts of cooking power. WiFi stuff works with a couple of watts at most.

        In order to get that amount of microwave energy you'd need something in excess of 2 kW of electrical input. If you are in North America, 1.5 kW makes rather more sense as the power consumption. The term "cooking power" could turn out to be as much a fiction as "music power".
    • My parents bought a microwave oven this summer and we placed a cell phone inside and called it. It rang, so I doubt microwave ovens are properly shielded. It was a Samsung, btw, but all brands are probably the same.
      • Really? I installed my Whirlpool microwave and neither my cell nor my GF's cell (AT&T & Verizon) receives any text, call, or email while in the microwave. I test it often. If I were you I would stop using your Samsung microwave, or at least get a remote control for it so you can cook from down the street.
        • by cffrost ( 885375 )

          If a microwave oven were leaking enough energy to be harmful, wouldn't that energy be perceptible as warmth/pain? Let's suppose that's not the case; what type of injury could be inflicted that would not be apparent to the exposed person?

          • http://www.lifeenergies.com/symptoms/sysrmfi.htm [lifeenergies.com] Symptoms of radio wave sickness Neurological: headaches, dizziness, nausea, difficulty concentrating, memory loss, irritability, depression, anxiety, insomnia, fatigue, weakness, tremors, muscle spasms, numbness, tingling, altered reflexes, muscle and joint pain, leg/foot pain, "flu-like" symptoms, fever. More severe reactions can include seizures, paralysis, psychosis and stroke. [Possibly MS as well.]
          • by Agripa ( 139780 )

            Leaking enough to be harmful to what? You can certainly feel 5 watts of RF under the right conditions, like sticking your finger into a helical resonator, but not from a diffuse source like a leaking seam or an inadequate feedthrough.

          Really? I installed my Whirlpool microwave and neither my cell nor my GF's cell (AT&T & Verizon) receives any text, call, or email while in the microwave. I test it often.

          [joke]
          Well, I repeated your tests and I can confirm: not only does my cell not receive any text, call, or email while in the microwave, it doesn't for an infinite time afterwards either. Maybe due to the apparent design change which occurred during the microwave exposure? You did not mention any visible changes...

          Methodology used: put cell into microwave, set at least 700 W power, apply for at least 1 minute. Visible changes come after the first 10 secs or so. My neighbor even reported (using the same methodology) e

      • by Osgeld ( 1900440 )

        So what? Cellphones work on a different frequency; shielding is not the same as a Faraday cage.

      • It doesn't work that way. Microwave ovens are designed to block microwave radiation at the wavelength they produce; sticking a completely different EM source inside one and noting that its radiation isn't blocked doesn't show you anything. Wavelength plays a big part in it.

      • by mpe ( 36238 )
        My parents bought a microwave oven this summer and we placed a cell phone inside and called it. It rang, so I doubt microwave ovens are properly shielded.

        All that proves is that it doesn't block 1.9 GHz, 1.8 GHz, 850 MHz, 900 MHz, 450 MHz or whatever the phone was using.
        You might just as well put a battery-powered radio inside. Or even declare it was not "properly shielded" because you could see inside!
    • by stms ( 1132653 )

      I don't have much experience with this kind of thing, but from what I can extrapolate from this [wikihow.com] wikiHow article, it's just that some microwaves are poorly designed or old.

    • by tuxicle ( 996538 )

      Perfect anything is impossible to achieve when dealing with microwave devices, in my opinion. Particularly shielding: you can reduce leakage down to a point, but it gets more expensive the lower you go. For instance, "quiet chambers" used when testing for EMI compliance typically have doors with beryllium-copper fingers that try to create a Faraday shield to keep out external interference. I've never seen any commercial microwave oven that uses these - for good reason too, since they're expensive and won't last very

      • Aren't the fingers there to avoid reflection of the inside radiation? In that case, they would be counterproductive for microwave ovens; you definitely do not want the internal radiation to be absorbed by the walls.

        • by tuxicle ( 996538 )
          You must be referring to the foam absorbers [djmelectronics.com] used in an anechoic chamber; those are used for absorption. I was referring to the finger stock [majr.com] that's used to minimize leakage out of (or into) an enclosure with a door.
        • by Agripa ( 139780 )

          Finger stock breaks up seams which otherwise would act as slot antennas. Sometimes mesh wire tubing is used.

    • by Osgeld ( 1900440 )

      Yes, it is impossible to have perfect shielding in your average $30 microwave, and even if you had it, the oven would probably blow something open from the steam pressure inside.

    • Re: (Score:3, Informative)

      Why else do you think 2.4 GHz WiFi is unlicensed spectrum? It's mainly *because* microwave ovens make it useless for much else.
  • by ModernGeek ( 601932 ) on Saturday September 24, 2011 @06:10PM (#37504982)
    When my downloads get slow and I can't refresh slashdot, it means it's time to take a break because mom is making me a snack upstairs.
  • Most microwave ovens are supposed to work right at 2450 MHz, so if you want to avoid microwave ovens (which you should if there's one in the area and it's used often), you want to use the highest- or lowest-numbered WiFi channel -- you don't need a spectrum analyzer to tell you that.

    But if you want to see the measured results for many microwave ovens ... here you go [bldrdoc.gov]. It looks like the exact bandwidth used by their signals varies quite a bit, though my advice above still stands in the majority of cases.

    Of cou
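The channel-picking advice above takes only a few lines to automate: 2.4 GHz channel centres sit at 2412 + 5*(n-1) MHz, so just maximize the distance from 2450 MHz. A trivial sketch (channel 14's odd 2484 MHz centre is ignored for simplicity):

```python
# Pick the 2.4 GHz channel whose centre is farthest from a microwave
# oven's nominal 2450 MHz, per the advice above.

MAGNETRON_MHZ = 2450

def centre_mhz(ch):
    """Centre frequency of 2.4 GHz channel ch (1-11), in MHz."""
    return 2412 + 5 * (ch - 1)

best = max(range(1, 12), key=lambda ch: abs(centre_mhz(ch) - MAGNETRON_MHZ))
print(best, centre_mhz(best))  # channel 1 at 2412 MHz, 38 MHz away
```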

  • by MindPrison ( 864299 ) on Saturday September 24, 2011 @06:34PM (#37505092) Journal

    If you have one of those cheap pocket "WiFi detectors" (not to be confused with the WIFE Detector(tm) ), then you can indeed detect a microwave oven, and pretty much anything else that spews out parasitic signals from 1500-3000 MHz.

    *Technical explanation coming up*
    This is due to the cheap construction of those so-called WiFi detectors: they're not digital, they're in fact analog receivers that merely detect any modulation on the band (very, VERY wide-band / broadband reception)... it's just a glorified crystal radio with a small half-coil, 3-4 transistors to amplify anything... any signal picked up by the small 1 cm internal antenna... and 1 transistor to switch on an LED (or 3-4 resistors, if it's sophisticated and has 3-4 LEDs... ha ha) ;)

    There...now the Chinese can mass-produce them, I just literally gave you the schematics for it... ...oh wait!

    • This reminds me of my poor-man's EMI-RFI detector. Whenever someone asks me why their software threw an error I tell them it is probably EMI from the fluorescent lights or their improperly shielded power strip. When they call bullshit I grab the wand from my tone kit and presto! As it gets closer to the field it gets louder. They think I'm a genius and of course, I am! Also, they don't even know what a tone kit is so the genius bar isn't very high. I hope I don't get sued for using that term.
    • by jovius ( 974690 )

      Would it be possible to create a 3D image of your surroundings by some triangulation and delicate signal processing of WiFi signals? The more microwave sources there are, the better the result.

      • by Arlet ( 29997 )

        That would be tricky. A lot of things tend to reflect 2.4 GHz signals, and the reflections interfere with the signal. A fun experiment is to attach a 2.4 GHz antenna to a spectrum analyzer and move it around. Just moving it a few inches back and forth has an enormous effect on the signal strength (that's why a lot of gear has two antennas).
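As a taste of why this is hard, here is a minimal 2D (not 3D) trilateration sketch from RSSI. It assumes an ideal log-distance path-loss model, and the indoor multipath described above is exactly what breaks that assumption; the anchor positions and RSSI values are made up:

```python
# Minimal 2D RSSI trilateration sketch. Assumes an ideal log-distance
# path-loss model; indoor multipath reflections break this in practice.

import math

def rssi_to_distance(rssi_dbm, ref_dbm=-40.0, n=3.0):
    """Invert rssi = ref - 10*n*log10(d); d in metres, ref taken at 1 m."""
    return 10 ** ((ref_dbm - rssi_dbm) / (10 * n))

def trilaterate(anchors, grid=10.0, step=0.1):
    """anchors: [((x, y), rssi_dbm)]. Least-squares fit via grid search."""
    best, best_err = (0.0, 0.0), float("inf")
    steps = int(grid / step) + 1
    for i in range(steps):
        for j in range(steps):
            x, y = i * step, j * step
            err = sum((math.hypot(x - ax, y - ay) - rssi_to_distance(r)) ** 2
                      for (ax, ay), r in anchors)
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Three APs at known spots; RSSIs roughly consistent with a device at (3, 4).
print(trilaterate([((0, 0), -61), ((10, 0), -67), ((0, 10), -65)]))
```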

  • I was going to say how this sounds like a potential invasion of privacy, but then I realized that the police knowing you have a microwave or Xbox controller probably isn't something to worry about.

    • by CAIMLAS ( 41445 )

      You're correct; however, if I can detect someone's 2.4GHz wireless security camera(s), it's another story. It really would be an invasion of privacy at that point, assuming I'm intent upon invading your privacy.

  • The only escape is the 5 GHz band, but if you could use that, you'd be doing so in the first place.

  • So where's the source code for this?
  • I would love to use my Bionic, headset, GV and sipdroid [sipdroid.org] over wifi, currently not possible b/c of wifi interference [google.com].
  • I had a microwave oven that consistently stopped my Netflix videos streaming over WiFi every time someone made a cup of tea.
    I was able to prove that a contributing issue was its poor door seal letting microwaves out, using the free WiFi tool NetStumbler [wikipedia.org] (also known as "Network Stumbler").

    NetStumbler can graph the signal-to-noise ratio of a WiFi station over time. If you put a laptop running NetStumbler in a microwave (don't turn on the microwave!) you should see the signal-to-noise ratio drop about 30 dB [wikipedia.org] a
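On Linux, a rough stand-in for that NetStumbler graph is to poll /proc/net/wireless while the oven runs. The column layout varies somewhat by driver, so treat this as a sketch:

```python
# Crude Linux stand-in for NetStumbler's SNR-over-time view: poll
# /proc/net/wireless once a second and print signal minus noise.
# Column layout can vary by driver; adjust the field indices if needed.

import time

def read_level_noise(iface="wlan0"):
    with open("/proc/net/wireless") as f:
        for line in f:
            if line.strip().startswith(iface + ":"):
                fields = line.split()
                # fields: iface, status, link, level, noise, ...
                # values often carry a trailing dot, e.g. "-50."
                return (float(fields[3].rstrip(".")),
                        float(fields[4].rstrip(".")))
    return None

while True:
    sample = read_level_noise()
    if sample:
        level, noise = sample
        print(time.strftime("%H:%M:%S"), "SNR:", level - noise, "dB")
    time.sleep(1)
```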

  • But can the researchers detect the government mind control rays transmitted from cell phone towers? If it could do that, I wouldn't have to wear this tin-foil hat all the time!

  • Handy-dandy microwave signal detector! Hey, if your app can tell the difference between WiFi and a microwave, then that would be a neat thing to be able to track.

  • Nice, I know the second author of that, since I just got my master's there in CS. Also, screw 'UW-Madison'. It is 'UW' and I hate that people put it on the same level as all the other bullshit UW-* schools (http://www.wisconsin.edu/campuses/ [wisconsin.edu]).
  • by russotto ( 537200 ) on Saturday September 24, 2011 @09:12PM (#37505810) Journal

    First of all, switching channels to avoid a microwave is futile... the magnetron isn't all that frequency stable and the peak tends to wander across the band as a result.

    Second, 802.11g/n uses OFDM. If you get narrowband interference, you reduce the rate on the affected subcarriers. It's built in.

    Third, I'm fairly sure using a wifi card as a spectrum analyzer has been done before.

    • by Agripa ( 139780 )

      First of all, switching channels to avoid a microwave is futile... the magnetron isn't all that frequency stable and the peak tends to wander across the band as a result.

      It is futile for another reason as well. The direct-conversion receivers used in consumer-level 802.11 equipment have atrocious out-of-channel rejection and dynamic range, so the magnetron output only needs to be in the same band to cause significant problems.
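On the spectrum-analyzer point: given raw baseband samples (which stock drivers don't hand out; Airshark reportedly works from vendor-specific per-bin RSSI measurements instead), spotting a narrowband interferer is a single FFT. A purely synthetic sketch:

```python
# Synthetic sketch of spotting a narrowband interferer with an FFT.
# Real consumer WiFi cards don't expose raw I/Q samples; this just
# illustrates the principle on generated data.

import numpy as np

fs = 20e6                                  # one 20 MHz-wide channel
n = 4096
t = np.arange(n) / fs
noise = (np.random.randn(n) + 1j * np.random.randn(n)) * 0.1
tone = 0.8 * np.exp(2j * np.pi * 3e6 * t)  # interferer 3 MHz off-centre

spectrum = np.abs(np.fft.fftshift(np.fft.fft(noise + tone)))
freqs = np.fft.fftshift(np.fft.fftfreq(n, 1 / fs))
print(f"strongest bin at {freqs[np.argmax(spectrum)] / 1e6:.1f} MHz")  # ~3.0
```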

  • This is cool but I would be far more interested in a card that does NOT detect microwave signals.
  • by flatulus ( 260854 ) on Saturday September 24, 2011 @11:15PM (#37506190)

    In 1992 I was at an IEEE 802.11 meeting (that's WiFi, if you didn't know it by that name). My company was presenting a "pre-standard" wireless LAN design that we were developing, to be considered as a contribution to the standard.

    Someone asked "Why does your design have so much error correction coding? Are you expecting the RF environment to be that bad?"

    I replied, "Well, I haven't seen any 'Listen Before Cook' microwave ovens out there!"

    This got a few chuckles and we moved along.

    Many years later, I was doing some patent searches, and I came upon patent number 6,346,692, titled "Adaptive Microwave Oven".

    I'll be damned! Somebody actually patented the "Listen Before Cook" microwave oven!

    So now we have WiFi devices detecting microwave ovens. That seems obvious to me. But I'm still waiting for a commercially available microwave oven that will avoid stomping on my WiFi signal :-)

    FWIW, The 802.11 Media Access Control (MAC) protocol effectively avoids microwave ovens most of the time, because the magnetrons in consumer microwave ovens only operate on a "half wave" basis. This means they're off at least half the time. A microwave oven during its "on" time looks indistinguishable from another WiFi transmitter, and so your WiFi device simply waits until the microwave oven turns off before transmitting the next packet. This results in slower throughput, but isn't a show stopper.

    The bigger problem is that since the microwave oven doesn't listen before turning on its magnetron, it tends to "stomp on" your WiFi signal occasionally. This, combined with the fact that the majority of IP based communications is TCP (and TCP sees every packet loss as congestion, causing it to slow down for the next few-to-tens of seconds), results in more throughput loss than is strictly related to the number of packets "stomped upon."

    • But I'm still waiting for a commercially available microwave oven that will avoid stomping on my WiFi signal :-)

      That'd be nice, but I'm waiting for a reverse microwave: instead of heating food by exciting water molecules with microwaves, it cools stuff by sucking out the microwaves... so you put your warm beer in there and a minute later it's frosty cold and completely drained of microwaves.
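flatulus's two effects above are easy to caricature in a toy simulation: a magnetron that is on for half of each 60 Hz mains cycle, CSMA deferral while it is on, and a TCP-like sender that halves its rate whenever a packet gets stomped. Every number here is invented for illustration:

```python
# Toy model of the duty-cycled magnetron + TCP backoff interaction.
# All timing and rate constants are illustrative, not measured.

CYCLE_MS = 1000.0 / 60      # one mains cycle
PKT_MS = 0.3                # airtime per packet
t, sent, lost, cwnd = 0.0, 0.0, 0, 1.0

while t < 1000.0:           # simulate one second
    phase = t % CYCLE_MS
    if phase < CYCLE_MS / 2:
        t += CYCLE_MS / 2 - phase     # defer: the oven looks like a talker
        continue
    if CYCLE_MS - phase < PKT_MS:     # oven fires again mid-packet
        lost += 1
        cwnd = max(0.1, cwnd / 2)     # TCP reads the loss as congestion
    else:
        sent += cwnd
        cwnd = min(1.0, cwnd + 0.01)  # slow additive recovery
    t += PKT_MS

print(f"delivered ~{sent:.0f} units with {lost} stomped packets")
```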

  • I could so see an application that lets you detect and pinpoint a phone or any Bluetooth device that you lost around the house. The problem is that a thief could also use it to pinpoint which houses have the good electronic shit just by driving around, or which houses are empty just by looking for who's not using anything! Hell, if you want to steal a dozen cell phones, just drive to the nearest parking lot with this thing.
  • They should have done a little Googling before they decided on a name...there's already a commercial product (albeit a very different one) that uses the name "AirShark":

    http://www.itrbo.com/airshark/airshark.html [itrbo.com]

    • by led7 ( 1219712 )
      IIRC there was a program back in the Ethereal/early Wireshark days called airshark that was for analysing WiFi traffic. A Google & Bing search turns up nothing. Anyone else remember this?
  • Let me get this straight. These researchers managed to prove, on what was likely a proprietary platform, what we already proved on an open platform? The Ubertooth has been proving and showing this interference for months now. [Hak5 - Youtube [youtube.com]]
