Wireless Networking / Communications / Intel

Has 2.4 GHz Reached Maximum Capacity?

An anonymous reader writes "There's been a lot of talk lately about the concept of Personal Area Networks. At CES, Intel and Connectify both released software that turns Windows laptops into Access Points for file transfers, wirelessly syncing pictures from cameras, and Internet sharing. This is good, maybe great, if you're a road warrior, but what about the rest of us holed up in apartment buildings and small neighborhoods? We already have to deal with the wireless chatter of the 50 or so other Linksys routers in the vicinity. What will happen when every laptop also acts as a software router? To add fuel to the fire, Intel and Netgear also announced the Push2TV device that allows you to stream your display, including Netflix videos, straight to your television. Isn't this going to kill lower-powered 2.4 GHz devices, like Bluetooth mice and headsets? When does the 2.4 GHz band collapse completely? Why can't we push all this short-range, high-bandwidth stuff onto 5 GHz?"
This discussion has been archived. No new comments can be posted.

  • by Goalie_Ca ( 584234 ) on Wednesday January 27, 2010 @04:12PM (#30924712)
    Pretty much every manufacturer sets the default power output levels to FUCKING LOUD. This means that I can get a clean signal from your router 100m away. Worse yet, the channels overlap: each one shares frequencies with several of its neighbours on either side.
    • by Scutter ( 18425 ) on Wednesday January 27, 2010 @04:19PM (#30924856) Journal

      God help you if you want to use your microwave. You'll kill the connection to every device in your house.

    • by default luser ( 529332 ) on Wednesday January 27, 2010 @04:27PM (#30925004) Journal

      Yup, none of the access points are actually smart enough to switch channels and optimize frequency/power usage. So you end up with the problem I encountered this December, when a neighbor across the way got a wireless router, and suddenly my internet stopped working. You couldn't even *see* my access point anymore; it was just overpowered.

      My access point configuration was set to "Auto," but this just meant it kept trying to use channel 1 like an idiot. So I forced it to use channel 6, and the problem was solved.

      But this isn't how things should be. The devices and the protocol should be smart enough to optimize spectrum, both by analyzing the noise at various frequencies and choosing the band with the least automatically, and by playing nice with other devices and dialing down the power to that needed by the connected device furthest from the access point.

      Too bad the above is just a pipe dream. I can't imagine how bad it is living in dense residential/apartments, where these users still don't know how to configure things, but there are 2 dozen within range instead of 5.
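
      To make the "choose the band with the least noise" idea concrete, here is a minimal sketch in Python (hypothetical scan data and helper names, not any real router's firmware): score each candidate channel by the summed power of every overlapping access point seen in a site survey, then pick the quietest.

      ```python
      # Sketch: pick the least-congested 2.4GHz channel from a (hypothetical) site survey.
      # Channels 1-13 are centered at 2412 + 5*(ch-1) MHz and are about 22 MHz wide,
      # so two channels interfere whenever their centers are closer than 22 MHz apart.

      def center_mhz(ch):
          return 2412 + 5 * (ch - 1)

      def overlaps(a, b, width_mhz=22):
          return abs(center_mhz(a) - center_mhz(b)) < width_mhz

      def least_congested(scan, candidates=range(1, 12)):
          """scan: list of (channel, rssi_dbm) pairs reported by a survey."""
          def score(ch):
              # Sum the linear power of every AP whose channel overlaps this one.
              return sum(10 ** (rssi / 10) for other, rssi in scan if overlaps(ch, other))
          return min(candidates, key=score)

      # Example survey: a loud neighbour on channel 1, weaker APs on 6 and 11.
      survey = [(1, -35), (1, -50), (6, -80), (11, -75)]
      print(least_congested(survey))  # -> 6 (only the faint -80 dBm AP overlaps it)
      ```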

      • Re: (Score:3, Insightful)

        by Facegarden ( 967477 )

        Yup, none of the access points are actually smart enough to switch channels and optimize frequency/power usage. So you end up with the problem I encountered this December, when a neighbor across the way got a wireless router, and suddenly my internet stopped working. You couldn't even *see* my access point anymore; it was just overpowered.

        My access point configuration was set to "Auto," but this just meant it kept trying to use channel 1 like an idiot. So I forced it to use channel 6, and the problem was solved.

        But this isn't how things should be. The devices and the protocol should be smart enough to optimize spectrum, both by analyzing the noise at various frequencies and choosing the band with the least automatically, and by playing nice with other devices and dialing down the power to that needed by the connected device furthest from the access point.

        Too bad the above is just a pipe dream. I can't imagine how bad it is living in dense residential/apartments, where these users still don't know how to configure things, but there are 2 dozen within range instead of 5.

        That all works in my favor. Nearly everyone in my complex has their router set to channel 6 or something. I set mine to channel 1 and I've got all the reception I could ask for!
        -Taylor

      • by dtmos ( 447842 ) * on Wednesday January 27, 2010 @04:56PM (#30925580)

        The devices and the protocol should be smart enough to optimize spectrum, both by analyzing the noise at various frequencies and choosing the band with the least automatically[...]

        You'll be comforted to know that ZigBee devices, and other devices that conform to the IEEE 802.15.4 standard, do this today, and have done so since 2003.

        The second part of your wish, that devices dial down their power to the minimum needed, is certainly possible, but requires that the receiving device send an RSSI (Received Signal Strength Indication) or, better, an RQI (Received Quality Indication) value back to the transmitting device (perhaps as part of the ACK), and that the transmitting device be capable of lowering its transmit power in the first place (it's an extra expense). I don't know of any protocol that does this (other than CDMA protocols, for a different reason), although I expect newer protocols will -- but to save power (extend battery life), not to reduce spectral pollution.
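
        To illustrate how small that control loop would be, here is a hypothetical sketch (made-up constants and function names, no real protocol implied): the receiver reports RSSI in each ACK, and the transmitter nudges its power toward the minimum that still meets a target level.

        ```python
        # Hypothetical transmit-power control loop driven by RSSI feedback.
        # The numbers are illustrative; real radios adjust power in coarse steps.

        TARGET_RSSI_DBM = -70    # weakest level we want the far receiver to see
        STEP_DB = 1              # adjustment applied per ACK
        MIN_TX_DBM, MAX_TX_DBM = 0, 20

        def adjust_tx_power(tx_power_dbm, reported_rssi_dbm):
            """Return the new transmit power after one ACK carrying an RSSI report."""
            error = reported_rssi_dbm - TARGET_RSSI_DBM
            if error > STEP_DB:          # louder than needed at the receiver: back off
                tx_power_dbm -= STEP_DB
            elif error < -STEP_DB:       # too quiet: turn it up
                tx_power_dbm += STEP_DB
            return max(MIN_TX_DBM, min(MAX_TX_DBM, tx_power_dbm))

        # Start loud; as the receiver keeps reporting a hot signal, power ratchets down.
        power = 20
        for rssi in (-55, -56, -57, -58, -60, -63, -66, -69, -70):
            power = adjust_tx_power(power, rssi)
        print(power)  # ends several dB below the 20 dBm starting point
        ```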

        • The second part of your wish, that devices dial down their power to the minimum needed, is certainly possible, ...

          The issue being, of course, that if your currently connected "closest device" is in the living room near the router and you want to connect from your bedroom, you'll have to enter the SSID and channel manually because your bedroom device won't hear any broadcasts.

          And, of course, when you connect from your bedroom the power will go up and may start interfering with your neighbor.

          • by dtmos ( 447842 ) *

            Yep. Problems like this are what created the FCC and licensed spectrum in the first place.

        • by jo_ham ( 604554 )

          You can control the power level on Apple's AirPort stations, but not automatically - you have to manually adjust a slider and (I think) reboot the router for it to take effect. It is possible to dial down the level though if you don't need the full strength, and you want to make your network tougher to connect with, or your neighbour is complaining that you are melting her brain with your evil microwave signals.

          It's cheaper than buying her a tinfoil hat, but much less hilarious.

        • Re: (Score:3, Interesting)

          by AB3A ( 192265 )

          The IEEE 802.15.4 specification defines a way to reduce power, but it does not enshrine this at the MAC layer. Perhaps ZigBee does this, but it isn't in '15.4 as far as I have read.

          The feature you're talking about is called Clear Channel Assessment (CCA) and it is part of most of the wireless specifications. The problem with CCA is that the threshold is shockingly low. And what you hear at the transmitting end isn't necessarily what the receiver hears. In other words, the receiver coul

        • by AmiMoJo ( 196126 )

          The problem is that access point manufacturers will always set their devices to FUCKING LOUD because all they care about is their own product's range and signal strength.

          So what if their router stops the neighbour's router working and makes your Bluetooth mouse have a fit? At least their product looks good with its 3 MIMO antennas and TurboBoost which uses all 11 channels and gets you a couple of extra signal bars at the other end of the house. Unfortunately most people won't even realise that interference

      • by skids ( 119237 )

        Well, enterprise APs are.

        I run a Cisco LWAPP (now CAPWAP) system that auto adjusts channels and power levels. On the enterprise side the challenge isn't the AP technology, it's the fact that just about every gadget, and even to this day a lot of laptops, do not have a 5GHz radio/antenna. The customer saves pennies, then ends up competing for bandwidth.

        Also I suspect most drivers are not smart enough to realize that a just-good-enough 5GHz signal should be held onto instead of leaping to a stronger, but cro

    • Pretty much every manufacturer sets the default power output levels to FUCKING LOUD.

      Well, not quite all of them. One of the main benefits of tinkering with DD-WRT on Linksys b/g hardware is being able to boost the output from the rather puny (but effective for most) factory 28mW to well over 200mW. Makes for some pretty expansive WDS setups, and a HELL of a lot cheaper than the higher-end commercial hardware...
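
      For a sense of scale, the standard dBm conversion puts that jump from 28mW to 200mW at roughly 8.5 dB, as this quick Python check shows:

      ```python
      import math

      def mw_to_dbm(p_mw):
          """dBm = 10 * log10(P / 1 mW)."""
          return 10 * math.log10(p_mw)

      stock, boosted = 28, 200  # mW, the figures from the post above
      print(f"{stock} mW = {mw_to_dbm(stock):.1f} dBm")      # 14.5 dBm
      print(f"{boosted} mW = {mw_to_dbm(boosted):.1f} dBm")  # 23.0 dBm
      print(f"boost = {mw_to_dbm(boosted) - mw_to_dbm(stock):.1f} dB")  # ~8.5 dB
      ```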

      • Re: (Score:2, Interesting)

        by poopdeville ( 841677 )

        Well, not quite all of them. One of the main benefits of tinkering with DD-WRT on Linksys b/g hardware is being able to boost the output from the rather puny (but effective for most) factory 28mW to well over 200mW. Makes for some pretty expansive WDS setups, and a HELL of a lot cheaper than the higher-end commercial hardware...

        DD-WRT's power settings are great. For setting the power level lower...

        Don't use them to set the power levels higher than the chipset's specs. The signal-to-noise ratio drops.

        • Re: (Score:3, Informative)

          by petermgreen ( 876956 )

          As you get to the end of the range, noise is generally dominated by receiver noise, interference, etc., not by transmitter noise (which, don't forget, is attenuated as much as the signal).

          So I would expect cranking up the power to increase range unless the transmitter gets so noisy that even a close-by receiver is having trouble.

          Still, use it wisely and watch out for the FCC ;)
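
          As a rough, free-space-only sanity check (real walls and floors eat far more signal, so treat this as an optimistic upper bound), extra transmit power buys range at about 6 dB per doubling of distance:

          ```python
          import math

          def fspl_db(distance_m, freq_hz=2.437e9):
              """Free-space path loss: 20*log10(d) + 20*log10(f) - 147.55, in dB."""
              return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

          def range_multiplier(extra_tx_db):
              # Free-space loss grows 20 dB per decade of distance,
              # so range scales as 10 ** (extra_dB / 20): +6 dB doubles it.
              return 10 ** (extra_tx_db / 20)

          print(f"{fspl_db(100):.0f} dB of path loss at 100 m on channel 6")    # ~80 dB
          print(f"{range_multiplier(8.5):.1f}x the range for +8.5 dB of power")  # ~2.7x
          ```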

        • In addition, you're cramming more wattage through the wireless ICs. If you don't take steps to mitigate this (heatsinks, perhaps even a fan), you _WILL_ burn out the router sooner or later.

          On WRT hardware, the wireless ICs are under the 'shield' and are not actually the Broadcom chip.

      • So what you're saying is DD-WRT goes to 11?

        I don't see how that helps solve the 'loudness' problem for anyone but yourself, which really doesn't solve the problem at all.

  • Why can't we push all this short range, high bandwidth stuff onto 5 GHz?

    Pretty simple really: people are tight-arses and won't pay extra for the licensing to use non-ISM spectrum.

    • Re: (Score:3, Informative)

      by jgtg32a ( 1173373 )

      Doesn't 802.11n come with the option of operating in 5GHz?

      • Re: (Score:3, Informative)

        Um. Yes it does.

        http://en.wikipedia.org/wiki/IEEE_802.11n-2009 [wikipedia.org]

        Coupling MIMO architecture with wider bandwidth channels offers increased physical transfer rate over 802.11a (5 GHz) and 802.11g (2.4 GHz).

        • Here's the problem though: My router is draft-N and it does both, but the default setting is 2.4GHz "for maximum compatibility". It has to be manually set to 5GHz, and of course that setting is buried in the router config pages. Being a good geek I went through all the settings and made this change right away (since it doubles the maximum speed) but most users aren't going to know that they can change it. I expect the same would apply for most other routers designed for the masses.

          • since it doubles the maximum speed

            You sure about that... ?

            The 2.4GHz/5GHz figure is the frequency of the carrier wave. It doesn't necessarily have anything to do with the modulation (which carries the actual data).
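
            Shannon's capacity formula makes that point concrete: the achievable rate depends on channel bandwidth and SNR, and the carrier frequency never appears in it (a toy calculation, not a model of real 802.11n rates):

            ```python
            import math

            def shannon_capacity_mbps(bandwidth_mhz, snr_db):
                """C = B * log2(1 + SNR); note the carrier frequency appears nowhere."""
                return bandwidth_mhz * math.log2(1 + 10 ** (snr_db / 10))

            # The same 20 MHz channel at 25 dB SNR has the same ceiling whether the
            # carrier sits at 2.4GHz or 5GHz; a wider (40 MHz) channel is what doubles it.
            print(round(shannon_capacity_mbps(20, 25)))  # ~166 Mbit/s
            print(round(shannon_capacity_mbps(40, 25)))  # ~332 Mbit/s
            ```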

      • Yes, 802.11n can operate in 2.4GHz, 5GHz, or both.

        Parenthetically, I recently purchased a Bullet M and outdoor antenna from Ubiquiti, and wanting to avoid the whole 2.4GHz zoo, I bought the 5GHz version. This was serendipitous, I thought, when I recently had my internet suspended (bad Telus). Not fearing, I got up on the roof with my 802.11n laptop to search for signals from charitable neighbours. Enabling just 2.4GHz mode on the laptop I could pick up around 10 APs. Switching to 5GHz-only mode, not a single one

        • Parenthetically, I recently purchased a Bullet M and outdoor antenna from Ubiquiti, and wanting to avoid the whole 2.4GHz zoo, I bought the 5GHz version.

          I actually did something similar when I bought my first wireless router five years back.

          Wireless networking was just starting to really take off around that time and even then I was worried about potential congestion issues. Most devices being sold then were 2.4GHz 802.11g/b models, with 802.11a support being far less common.

          Anyway, I bought a tri-format 802.11a/b/g laptop card and router precisely *because* 802.11a was far less popular and less likely to suffer congestion. Unfortunately, such devices w

  • Channel 14 (Score:5, Informative)

    by Ungulate ( 146381 ) on Wednesday January 27, 2010 @04:26PM (#30924984)
    If you're in a place with too much WiFi noise, try using the Japanese-only channel 14 - it doesn't overlap with any other channels, and you're pretty much guaranteed to be the only person using it. DD-WRT and other alternative firmwares will broadcast on 14, and Windows just requires a simple registry hack to receive on 14. Macs, I believe, can connect without hassle. It's technically illegal, but the likelihood of being caught is pretty much nil. I've still never found an explanation for what this frequency is used for in the US, if anything.
    • by Chapter80 ( 926879 ) on Wednesday January 27, 2010 @04:31PM (#30925066)

      If you're in a place with too much WiFi noise, try using the Japanese-only channel 14 - it doesn't overlap with any other channels, and you're pretty much guaranteed to be the only person using it.

      And then use Google Translate to convert to English. Perfect.

      (I know someone who didn't buy a Japanese radio back in the 70's because they were worried that they wouldn't hear any American music.)

      --

      The world would be a much better place if every single person tried to be just a little tiny bit more tolerant of my bigotry.

    • If you're in a place with too much WiFi noise, try using the Japanese-only channel 14 - it doesn't overlap with any other channels, and you're pretty much guaranteed to be the only person using it. DD-WRT and other alternative firmwares will broadcast on 14, and Windows just requires a simple registry hack to receive on 14. Macs, I believe, can connect without hassle. It's technically illegal, but the likelihood of being caught is pretty much nil. I've still never found an explanation for what this frequency is used for in the US, if anything.

      I've always considered doing that, but having to do a registry edit every time a guest wants to use my wireless is just not acceptable. I'm the resident techy and my housemates would hate me.
      -Taylor

      • Put the regkey on a thumbstick; if your guests have a problem, they don't get wireless.
         
        It may help your case if you tweak the setting in the router so for a month everyone gets terrible internet; then come to the rescue with this fix.

        • Put the regkey on a thumbstick; if your guests have a problem, they don't get wireless.

          It may help your case if you tweak the setting in the router so for a month everyone gets terrible internet; then come to the rescue with this fix.

          Hah, true.
          Also, doesn't help on closed devices like phones.

          Either way, everyone in our complex seems to use the same default channel, so I changed ours and it's pretty good.
          -Taylor

      • by vlm ( 69642 )

        I've always considered doing that, but having to do a registry edit every time a guest wants to use my wireless is just not acceptable. I'm the resident techy and my housemates would hate me.

        You use your "special" wifi channel. Your house guests use your neighbor's wifi. No problemo!

    • by PPH ( 736903 )
      That's used by the Gundam. You're about to get your ass kicked by a giant robot.
    • by Algan ( 20532 )

      You'd still have a lot of overlap with people running on channel 11

      • by dissy ( 172727 )

        You'd still have a lot of overlap with people running on channel 11

        The channel overlap extends a few channels in each direction.

        The channels sit 5MHz apart but are about 22MHz wide, so channel 11 overlaps with roughly channels 7 through 13.
        (I include the neighbours on both sides to illustrate how far the overlap reaches)

        This means 11 does not overlap with anything much below channel 7, and channel 14 sits far enough above 13 that its overlap with 11 is essentially nil.

        Technically however, 14 is special. It isn't allocated in the same way, and is a little bit higher in the frequency range than all the others. What that means is while all the other channels are 22MHz
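
        Putting numbers on it with the standard channel plan (channels 1-13 on a 5MHz grid starting at 2412MHz, channel 14 parked at 2484MHz, each about 22MHz wide), a small Python check of which channels actually collide with 11:

        ```python
        # 2.4GHz channel centers: 1-13 sit on a 5 MHz grid; 14 is offset 12 MHz above 13.
        def center_mhz(ch):
            return 2484 if ch == 14 else 2412 + 5 * (ch - 1)

        def overlaps(a, b, width_mhz=22):
            # Two ~22 MHz-wide channels collide when their centers are < 22 MHz apart.
            return abs(center_mhz(a) - center_mhz(b)) < width_mhz

        print([ch for ch in range(1, 15) if ch != 11 and overlaps(11, ch)])
        # -> [7, 8, 9, 10, 12, 13]: the overlap reaches four channels either side,
        #    but channel 14 (centered a full 22 MHz above 11) stays clear of it.
        ```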

    • Re: (Score:3, Interesting)

      by dattaway ( 3088 ) *

      The Atheros wifi card in my Acer One can cover 2.1-2.8GHz, in 58 channels with a simple config. But I won't do it, because all kinds of things in the sky use slices of those frequencies.

      • by RoboRay ( 735839 )

        I don't mind if the guys on the space station piggyback off my wireless network, so long as they're not torrenting.

    • by cjb658 ( 1235986 )

      Yeah I've been using it for years and so far the authorities haven't ...

      NO CARRIER

    • Re:Channel 14 (Score:5, Informative)

      by vlm ( 69642 ) on Wednesday January 27, 2010 @05:08PM (#30925854)

      I've still never found an explanation for what this frequency is used for in the US, if anything.

      Short summary, nothing important, just other unlicensed ISM stuff.

      ISM is supposed to be for bulk raw stuff like industrial heating, cooking, diathermy, and NMR/MRI; basically the kind of stuff where the specs need be no tighter than "a couple gigs and a zillion watts". But wouldn't it be funny to try to use the same band for communications stuff? Thus WiFi. A stupid idea from a frequency coordination standpoint, but it was the best solution to a bad situation etc etc.

      Channel 14 is going to cover 2.473-2.495 GHz. The 2.4G ISM band is 2.4 to 2.5 GHz, so superficially, channel 14 is all good.

      However, the FCC thought it would be fun to regulate the unlicensed ISM spectrum into segments, probably to avoid chaos like pre-rebanded 800 MHz Nextel interference. Theoretically, it should be possible for wifi and 2.4G wireless mikes and other 2.4G stuff to coexist. Theoretically. And channel 14 just happens to live outside the proper band segment, so it's not going to cooperate. The idea is a joke because "Industrial" users like microwave ovens tend to crap all over the entire band, so any interference to a subband is generally blamed on the high-power industrial stuff rather than a "mistuned" piece of communications gear.

      So, you won't be knocking out military radar or interfering with satellites, but you will be interfering with other unlicensed ISM users. Are there any local communications users other than phones and WiFi? Frankly, probably not. And the industrial ISM guys are not going to care.

      Wikipedia has links to the CFRs. Good Luck with reading those.

      http://en.wikipedia.org/wiki/ISM_band [wikipedia.org]

    • by dissy ( 172727 )

      It's technically illegal, but the likelihood of being caught is pretty much nil. I've still never found an explanation for what this frequency is used for in the US, if anything.

      According to (PDF warning) http://www.ntia.doc.gov/osmhome/allochrt.PDF [doc.gov]
      3rd line from the bottom with the end label '300GHz', with the top label of 2483.5 - 2500 (it is listed in MHz)

      2.484GHz (what channel 14 centers on) is allocated in the USA to two things: mobile satellites, and "Radiodetermination satellite"

      The color code indicates "Government / Non-government shared"

      My guess is the mobile satellite is the non-government bit, and the radio determination satellite is the government controlled part.
      That

    • by Malc ( 1751 )

      I wish I could figure out how to get my US laptop to use the two higher channels. I've moved to London, and our office WAP has dreadful problems due to the interference from so many other WAPs. We can't get it on to cleaner frequencies because of the crippled American laptops.

  • But I am on 5 GHz (Score:2, Insightful)

    by greed ( 112493 )

    Since I _don't_ deal with a lot of Windows PCs, I _am_ running everything on 5.0 GHz. Well, OK, there's a "guest WiFi" at 2.4 GHz for friends, but I don't care if that's broken.

    Most PCs I've seen with 802.11n only have the 2.4GHz support, and same with a lot of add-on cards and access points. The dual-band stuff is starting to be a little easier to get, though still a bit of a price-premium. Of course, since my machines are either Macs on Wi-Fi or Linux on gigabit copper, I'm already used to a price prem

  • 2.4GHz raises the temperature of dihydrogen monoxide molecules, so it's also contributing to global warming!

  • Ah yes, the quaint wire: Your own personal air domain, a veritable Ether network. Copper's getting scarce and we'll never run out of glass, but optical transceivers are still thousands of dollars. It's getting cramped on this little planet!

    • I'm fairly certain aluminum wires would work fine for what most people use. If you aren't running maximal-length wires, you can probably stand a bit more resistance on the wire... and if you are, well - I'd imagine that a copper shortage would be a big enough event they could draft an ethernet sub-standard to raise the voltage a bit to compensate.

  • by femto ( 459605 ) on Wednesday January 27, 2010 @04:57PM (#30925590) Homepage

    The problem is dumb receivers, not lack of spectrum.

    Channel capacity is determined by the MIMO form [wikipedia.org] of Shannon's Theorem [wikipedia.org]. Add more antennas and smarter processing in the receivers, and the capacity in a channel with lots of multipath (e.g. in an apartment) increases approximately linearly.

    The problem is that most hardware is a decade (or more) behind the state of the art, and that people aren't prepared to pay for a more complex and costly receiver.
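
    For anyone who wants to see the "approximately linearly" claim fall out of the math, here is a toy Monte Carlo of the MIMO capacity formula, C = log2 det(I + (SNR/Nt) H H^H), over random Rayleigh (rich multipath) channels; it is an idealized model, not a simulation of real 802.11 hardware:

    ```python
    import numpy as np

    def mimo_capacity_bits_per_hz(n, snr_db, trials=2000, seed=0):
        """Mean capacity of an n x n i.i.d. Rayleigh MIMO channel, in bit/s/Hz."""
        rng = np.random.default_rng(seed)
        snr = 10 ** (snr_db / 10)
        total = 0.0
        for _ in range(trials):
            # Rich multipath modeled as a complex Gaussian channel matrix H.
            h = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
            m = np.eye(n) + (snr / n) * (h @ h.conj().T)
            total += np.log2(np.linalg.det(m).real)
        return total / trials

    for n in (1, 2, 4):
        print(n, round(mimo_capacity_bits_per_hz(n, snr_db=20), 1))
    # Capacity grows roughly in proportion to the antenna count at a fixed SNR.
    ```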

    • Re: (Score:3, Interesting)

      by cdrguru ( 88047 )

      You are missing the fact that inherently there can only be one transmitter on at a time within a given physical space.

      You might be able to expand the physical space beyond the transmitter's range and call that having two transmitters in the same space - but that isn't the point.

      There are some tricks with polarization that can be used, but these are not currently being done and are basically incompatible with the omnidirectional nature of most uses of the 2.4GHz frequency.

      Anything beyond one transmitter at a

      • by femto ( 459605 )

        That's just plain wrong.

        The whole point of MIMO and space-time coding is that you can have multiple transmitters and receivers in proximity to each other and everyone still gets data through. The "space" in space-time means we can pick out transmitters and receivers based on their location, not just their frequency (FDM) or when they transmit (TDM).

        99% of "old school" radio guys I've come across just cannot get their brain around MIMO. They are too used to the idea that we need separation in frequency. So

  • by RoboRay ( 735839 ) on Wednesday January 27, 2010 @05:55PM (#30926736)

    "Why can't we push all this short range, high bandwidth stuff onto 5GHz?"

    I really wish you wouldn't. I prefer to keep the 5GHz band wide-open for the unimpeded use of my own 5GHz devices. I bought them specifically to avoid 2.4GHz clutter!

  • by sillivalley ( 411349 ) <sillivalleyNO@SPAMcomcast.net> on Wednesday January 27, 2010 @06:04PM (#30926900)
    ...And currently, it's 2.4 GHz.

    Before that, it was the 900 MHz band -- until it filled up with cordless crap.

    As others have posted, 5 GHz is still pretty clean, so use it while you can. In our residential area, 2.4 GHz is full (even 14 is in use), but there's little activity on 5 GHz, so that's where our macbooks connect.

    Same at work -- dual-mode phones, Bluetooth, microwave ovens, old laptops and more all on 2.4, with the newer equipment connecting on 5 GHz.
  • Leave all that trash on 2.4GHz. I want to be able to continue to use 5.1, 5.2 and (to a lesser degree, since phones use it too) 5.8. As soon as we get a proliferation of $25 devices on the 5GHz spectrum, we'll fill those up too.
