Has 2.4 GHz Reached Maximum Capacity?

An anonymous reader writes "There's been a lot of talk lately about the concept of Personal Area Networks. At CES, Intel and Connectify both released software that turns Windows laptops into Access Points for file transfers, wirelessly syncing pictures from cameras, and Internet sharing. This is good, maybe great, if you're a road warrior, but what about the rest of us holed up in apartment buildings and small neighborhoods? We already have to deal with the wireless chatter of the 50 or so other Linksys routers in the vicinity. What will happen when every laptop also acts as a software router? To add fuel to the fire, Intel and Netgear also announced the Push2TV device that allows you to stream your display, including Netflix videos, straight to your television. Isn't this going to kill lower-powered 2.4 GHz devices, like Bluetooth mice and headsets? When does the 2.4 GHz band collapse completely? Why can't we push all this short range, high bandwidth stuff onto 5 GHz?"
  • Channel 14 (Score:5, Informative)

    by Ungulate ( 146381 ) on Wednesday January 27, 2010 @05:26PM (#30924984)
    If you're in a place with too much WiFi noise, try using the Japanese-only channel 14 - it doesn't overlap with any other channels, and you're pretty much guaranteed to be the only person using it. DD-WRT and other alternative firmwares will broadcast on 14, and Windows just requires a simple registry hack to receive on 14. Macs, I believe, can connect without hassle. It's technically illegal, but the likelihood of being caught is pretty much nil. I've still never found an explanation for what this frequency is used for in the US, if anything.
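    If you would rather stay inside the legal channels, the same idea works: scan and pick the least-crowded one. A minimal sketch in Python, with invented scan results (on Linux you might gather real ones with `iwlist` or `nmcli`):

    ```python
    # Hypothetical sketch: score each candidate 2.4 GHz channel by how many
    # nearby APs overlap it. Channel centers are 5 MHz apart but signals are
    # ~22 MHz wide, so channels within 4 of each other interfere.

    def least_congested(scan, channels=range(1, 12)):
        """Return the channel (1-11) with the fewest overlapping APs."""
        def score(ch):
            return sum(1 for _ssid, ap_ch in scan if abs(ap_ch - ch) < 5)
        return min(channels, key=score)

    # Invented scan data: (SSID, channel) pairs.
    scan = [("home1", 1), ("home2", 1), ("cafe", 6), ("office", 6), ("guest", 6)]
    print(least_congested(scan))  # channel 11: no APs within 4 channels of it
    ```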
  • Obligatory (Score:5, Informative)

    by toastar ( 573882 ) on Wednesday January 27, 2010 @05:31PM (#30925072)
    Boom! Headshot. [xkcd.com]
  • by Duradin ( 1261418 ) on Wednesday January 27, 2010 @05:42PM (#30925274)
    It is fun to watch a spectrum analyzer when someone fires up a microwave. Old cordless phones are entertaining too. One is a multi-megaton nuke, the other a tac-nuke.
  • Re:Obligatory (Score:5, Informative)

    by jgtg32a ( 1173373 ) on Wednesday January 27, 2010 @05:47PM (#30925382)

    The Alt text is perfect as well
    'Cheater!' 'Hey, gaming on wifi? You have only yourself to blame.'

  • by jgtg32a ( 1173373 ) on Wednesday January 27, 2010 @05:51PM (#30925476)

    Doesn't 802.11n come with the option of running in the 5 GHz band?

  • by dtmos ( 447842 ) * on Wednesday January 27, 2010 @05:56PM (#30925580)

    The devices and the protocol should be smart enough to optimize spectrum, both by analyzing the noise at various frequencies and choosing the band with the least automatically[...]

    You'll be comforted to know that ZigBee devices, and other devices that conform to the IEEE 802.15.4 standard, do this today, and have done so since 2003.

    The second part of your wish, that devices dial-down their power to the minimum needed, is certainly possible, but requires that the receiving device send an RSSI (Received Signal Strength Indication) or, better, an RQI (Received Quality Indication) value back to the transmitting device (perhaps as part of the ACK), and that the transmitting device be capable of lowering its transmit power in the first place (it's an extra expense). I don't know of any protocol that does this (other than CDMA protocols, for a different reason), although I expect newer protocols will -- but to save power (extend battery life), not to reduce spectral pollution.
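    The feedback loop described above can be sketched in a few lines. Everything here is illustrative, not from any real protocol: the sensitivity floor, margin, and step size are assumed numbers, and the RSSI is modeled as simply echoed back in the ACK.

    ```python
    # Hypothetical power-control loop: the receiver reports RSSI in its ACK,
    # and the transmitter steps its power toward the minimum that keeps a
    # safety margin above the receiver's sensitivity floor.

    SENSITIVITY_DBM = -85   # assumed receiver sensitivity
    MARGIN_DB = 10          # assumed headroom kept for fading

    def adjust_tx_power(tx_power_dbm, reported_rssi_dbm, step_db=1,
                        min_dbm=0, max_dbm=20):
        """Return the next transmit power given the RSSI echoed in an ACK."""
        headroom = reported_rssi_dbm - SENSITIVITY_DBM
        if headroom > MARGIN_DB + step_db:
            tx_power_dbm -= step_db   # link stronger than needed: back off
        elif headroom < MARGIN_DB:
            tx_power_dbm += step_db   # link marginal: turn it up
        return max(min_dbm, min(max_dbm, tx_power_dbm))

    # With a fixed path loss, RSSI tracks TX power, so the loop settles at
    # the lowest power that preserves the margin.
    path_loss_db = 88
    power = 20
    for _ in range(30):
        rssi = power - path_loss_db
        power = adjust_tx_power(power, rssi)
    print(power)  # settles at 14 dBm for this assumed path loss
    ```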

  • Re:Channel 14 (Score:5, Informative)

    by vlm ( 69642 ) on Wednesday January 27, 2010 @06:08PM (#30925854)

    I've still never found an explanation for what this frequency is used for in the US, if anything.

    Short summary, nothing important, just other unlicensed ISM stuff.

    ISM is supposed to be for bulk raw stuff like industrial heating, cooking, diathermy, NMR/MRI basically the kind of stuff where the specs need be no tighter than "a couple gigs and a zillion watts". But wouldn't it be funny to try to use the same band for communications stuff? Thus Wifi. A stupid idea from a frequency coordination standpoint, but it was the best solution to a bad situation etc etc.

    Channel 14 is going to cover 2.473-2.495 GHz. The 2.4G ISM band is 2.4 to 2.5 GHz, so superficially, channel 14 is all good.

    However, the FCC thought it would be fun to regulate the unlicensed ISM spectrum into segments, probably to avoid chaos like pre-rebanded 800 MHz Nextel interference. Theoretically, it should be possible for wifi and 2.4G wireless mikes and other 2.4G stuff to coexist. Theoretically. And channel 14 just happens to live outside the proper band segment, so it's not going to cooperate. The idea is a joke because "Industrial" users like microwave ovens tend to crap all over the entire band, so any interference to a subband is generally blamed on the high power industrial stuff rather than a "mistuned" piece of communications gear.

    So, you won't be knocking out military radar or interfering with satellites, but you will be interfering with other unlicensed ISM users. Are there any local communications users other than phones and WiFi? Frankly, probably not. And the industrial ISM guys are not going to care.

    Wikipedia has links to the CFRs. Good Luck with reading those.

    http://en.wikipedia.org/wiki/ISM_band [wikipedia.org]
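    The band math in the comment is easy to reproduce: channels 1-13 are centered at 2407 + 5n MHz, while channel 14 jumps to 2484 MHz, and with ~22 MHz-wide signals that puts its edges at the 2.473-2.495 GHz quoted above. A rough sketch:

    ```python
    # Compute 2.4 GHz Wi-Fi channel edges. Channel 14 is the odd one out:
    # its center is 12 MHz above channel 13 instead of the usual 5 MHz.

    def channel_edges_mhz(ch, width_mhz=22):
        """Return (low, high) edge frequencies in MHz for a 2.4 GHz channel."""
        center = 2484 if ch == 14 else 2407 + 5 * ch
        return (center - width_mhz // 2, center + width_mhz // 2)

    for ch in (1, 6, 11, 14):
        lo, hi = channel_edges_mhz(ch)
        print(f"channel {ch:2d}: {lo}-{hi} MHz")
    # channel 14 comes out as 2473-2495 MHz, matching the figure above
    ```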

  • Re:Apartment Wifi (Score:5, Informative)

    by snowraver1 ( 1052510 ) on Wednesday January 27, 2010 @06:14PM (#30925954)
    The theoretical max for wireless G is 27.4 Mbps. Reading a DVD at 1X is 10.5 Mbps. As I understand it, when you are transmitting from one wireless device to another, you are talking to the AP, which then retransmits to the other device, effectively doubling the airtime your data consumes.

    So, starting with 10.5 Mbps, then doubling the data and adding overhead, you are dangerously close to the theoretical 27.4 Mbps max.
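    The arithmetic above, spelled out (figures taken from the comment itself):

    ```python
    # Back-of-the-envelope airtime math for streaming a DVD over 802.11g
    # when both endpoints are wireless: the relay through the AP means
    # every bit crosses the air twice.

    WIRELESS_G_MAX_MBPS = 27.4   # theoretical 802.11g maximum
    DVD_1X_MBPS = 10.5           # DVD read at 1X

    airtime_needed = DVD_1X_MBPS * 2          # client -> AP, then AP -> client
    utilization = airtime_needed / WIRELESS_G_MAX_MBPS
    print(f"{airtime_needed} Mbps on air, {utilization:.0%} of the G maximum")
    # 21.0 Mbps on air, 77% of the theoretical maximum, before any overhead
    ```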
  • Um. Yes it does.

    http://en.wikipedia.org/wiki/IEEE_802.11n-2009 [wikipedia.org]

    Coupling MIMO architecture with wider bandwidth channels offers increased physical transfer rate over 802.11a (5 GHz) and 802.11g (2.4 GHz).

  • WRONG (Score:3, Informative)

    by rickb928 ( 945187 ) on Wednesday January 27, 2010 @06:27PM (#30926192) Homepage Journal

    Besides the FCC specs that WiFi devices meet (which are really a fairly bare minimum to avoid really crappy interference), manufacturers take some time to make sure their devices behave. You can be certain Cisco and other well-regarded makers pay attention, because they intend for their enterprise clients to deploy these in a mesh and will definitely suffer if they are out of spec.

    Now, there are plenty of shady outfits, but since everyone pretty much uses the same few radio chipsets, the makers of those chips do their work and produce decent product. No one wants to take back a few thousand routers because they behave so badly they can't coexist with anything. Besides, does anyone buy D-Link routers any more after their NTP fiasco? Well, actually, we have short memories, and there is fresh meat being minted every day.

    Claiming this is a technical failure by the manufacturers is bogus. More likely, the WiFi spec doesn't really accommodate an infestation of routers. There just aren't enough channel spacing options to solve this in WiFi, and I doubt there is a fix beyond a new spec.

    Hey, that's it, call IEEE! We need a new 5GHz or higher spec, dudes. We can wait 4 or 5 years. Get on it, ok?

    Not that moving to a higher band will solve range problems. It's a morass of options and expectations.

  • As you get to the edge of the range, noise is generally dominated by receiver noise, interference, etc., not by transmitter noise (which, don't forget, is attenuated just as much as the signal).

    So I would expect cranking up the power to increase range, unless the transmitter gets so noisy that even a close-by receiver has trouble.

    Still, use it wisely and watch out for the FCC ;)
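    The point that the signal (and any transmitter noise riding on it) falls off together with distance can be illustrated with free-space path loss, which grows 20 dB per decade of distance at a fixed frequency. A quick sketch at 2.4 GHz:

    ```python
    # Free-space path loss: FSPL(dB) = 20*log10(d) + 20*log10(f) - 147.55,
    # with d in meters and f in Hz. Purely illustrative; real indoor links
    # lose considerably more to walls and multipath.

    import math

    def fspl_db(distance_m, freq_hz=2.4e9):
        return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

    for d in (1, 10, 100):
        print(f"{d:4d} m: {fspl_db(d):5.1f} dB")
    # each tenfold increase in distance costs another 20 dB
    ```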
