
Fast Wi-Fi's Slow Road To Standardization

CWmike contributes this excerpt from Computerworld: "For a technology that's all about being fast, 802.11n Wi-Fi sure took its sweet time to become a standard, writes Steven J. Vaughan-Nichols. In fact, until September 2009, it wasn't, officially, even a standard. But that didn't stop vendors from implementing it for several years beforehand, causing confusion and upset when networking gear that used draft standards from different suppliers wouldn't always work at the fastest possible speed when connected. It wasn't supposed to be that way. But, for years, the Wi-Fi hardware big dogs fought over the 802.11n protocol like it was a chew toy. The result: it took five drama-packed years for the standard to come to fruition. The delay was never over the technology. In fact, the technical tricks that give 802.11n its steady connection speeds of 100Mbps to 140Mbps have been well-known for years."
  • Wi-fight?

    Guess not.

    Are we due for a new, faster standard now, since it took five years for this one to come to fruition?

  • Well, I bought a WRT-600n and got worse coverage than with the older WRT-54GS I had been using; totally unimpressed. I now run an Asus WL-520gu with Tomato firmware, very nice.
    • An awful lot of N routers are still shit; either wait or do a lot of research.

      • by Andy Dodd ( 701 ) <atd7NO@SPAMcornell.edu> on Thursday December 10, 2009 @05:42PM (#30394732) Homepage

        Basically any 802.11n system is going to be shit with legacy devices nearby. I believe this was one of the final sticking points between Draft-N and the finalized standard.

        So to get decent performance you MUST be in the 5 GHz range. However, there are almost no 5 GHz 802.11n routers out there with external antennas! (Curse Linksys for their move to "saucer" form factors across their entire 802.11 product line... 100% driven by form and aesthetics/marketing, not technical function.) The end result is that in most situations, a G router with an external antenna (especially an upgraded one) will outperform nearly any 11n device on the market.

      • by jomcty ( 806483 )
        I got the WRT-600n since it is a true dual-band router that can run DD-WRT.
  • Not the first time (Score:5, Insightful)

    by causality ( 777677 ) on Thursday December 10, 2009 @04:41PM (#30393804)
    Betamax vs. VHS, HD-DVD vs. Blu-ray, and now draft-N Wi-Fi versus the finalized standard.

    Open standards are a good thing. They avoid these kinds of problems. They promote interoperability. They also force vendors to compete on the merits of their implementations of those standards instead of competing on the basis of who is better at customer lock-in. They also lessen, though do not eliminate, competition over who is best at marketing.

    If you care about assigning blame, it lies squarely on the people who purchased draft-N hardware. Whether they realized it or not, they were using their wallets to vote for this behavior. Those purchasing decisions reward this kind of behavior and make it profitable. Give companies the choice of agreeing on a standard or making no sales and they will agree on a standard every time.
    • by daveime ( 1253762 ) on Thursday December 10, 2009 @04:56PM (#30394020)

      So I guess you'll not be using any of that "not-yet-finalized" html5 stuff, or any beta software from Google?

      After all, no one should invent anything until it's been discussed in committee for a minimum of 10 years, until the technology it is attempting to standardize has already been superseded by something better!

      Thank [deity-of-your-choice] they didn't invent the wheel using open standards. It probably would have had 6 sides, none of which are equal in length, a 100-page operating manual, a concession to Pantone that it should only be made in RGB color 255,147,97, and an alternative implementation involving Microsoft's .innerHTML

      Anything that takes longer to describe than it does to make is probably better off not described. Just use the bloody thing and be done with it.

      • So I guess you'll not be using any of that "not-yet-finalized" html5 stuff, or any beta software from Google?

        Not if it requires a hardware purchase.

      • by causality ( 777677 ) on Thursday December 10, 2009 @05:34PM (#30394556)

        So I guess you'll not be using any of that "not-yet-finalized" html5 stuff, or any beta software from Google?

        Terrible examples. I really don't think you appreciate the difference between open standards and proprietary "standards". That, or you understand it perfectly well but find it inconvenient for your argument, PR-style.

        HTML5 is intended to be an open standard, so in this case you're making my point for me. There was a draft standard of HTML5 released January 2008. This too was produced openly. A vendor who produces something based on this draft standard is using the same specifications that are available to all other vendors. The same will be the case with the finalized standard.

        That has not been the case with the proprietary draft-N implementations. Each vendor has its own version of draft-N. It's very similar to Microsoft's practice of embrace-and-extend. Interoperability with another vendor's implementation is not guaranteed. If you can't get Vendor X's equipment to work with Vendor Y's, or you suffer reduced performance, neither vendor will file that as a bug and fix it. Instead, both will tell you "we recommend you use our products for all your networking needs". You think this is just like HTML5, that you're really comparing an apple to an apple here?

        Most of the beta software that Google has released for download has been open source (Chromium, for example). Open source is no good if you want to implement a proprietary standard. It's great when you want the world to see precisely how something was done so they can interoperate with your software or port it to other platforms. Google obviously understands the value of this. That again serves to reinforce my point.

        This is just another example of a phony debate tactic. If there's not a term for this, there should be. The procedure goes like this:

        1. Ignore any points that the other person made. This is important. If anything the other guy said contradicts your position, just pretend that you didn't notice. Best foot forward, even at the expense of intellectual honesty. Besides, this way you don't have to waste your time with refutation and can get right down to expressing your predetermined conclusion.
        2. Proceed to find anything the other person said that is generally true and applies to the specific examples that person gave. Then take the general truth to an absurd extreme.
        3. Pretend like this says something about the validity of the general truth. Whatever you do, don't acknowledge that it says anything about your ability to interpret the general truth within a reasonable perspective.
        4. Declare that the general truth is inherently absurd. State outright or imply strongly that it must be false in all cases. It was false when you took it to an absurd extreme well beyond its intended scope, so it must be totally useless in all cases. Right?
        5. Congratulate yourself for your ability to handle argumentation. For extra points, assume that the other guy was a total idiot, that your trivial objections never occurred to him, and that the existence of such trivial objections could not possibly have indicated that you missed his point.
        • I'll try to respect your handy punchlist of debating etiquette, and just focus on one aspect of your reply.

          That has not been the case with the proprietary draft-N implementations. Each vendor has their own version of draft-N. It's very similar to Microsoft's practice of embrace-and-extend. Interoperability with another vendor's implementation is not guaranteed. If you can't get Vendor X's equipment to operate with Vendor Y's equipment, or suffer reduced performance, neither vendor will file that as a bug an

          • by causality ( 777677 ) on Thursday December 10, 2009 @07:32PM (#30396428)
            I agree that a five-year wait is a big problem, especially for things that develop at a fast pace like software or networking. But, to me that doesn't mean we should scrap the whole idea of open standards and open protocols. It means we should improve the processes by which those open standards are produced. We should profile them like any algorithm and look for bottlenecks. We should do that with a ruthless willingness to eliminate those bottlenecks.

            I bet that there are no technical reasons why it takes 5 years or more to come up with an HTML standard. I bet that there are lots of political reasons for that. I bet that a small team of engineers could do a better job in less time than a bureaucratic committee.

            And for sure, even if and when the standard *is* finalized, that won't be before all the big players have bartered with the committee for concessions on alternative allowable formats

            I think you identified the problem right there.

            unfortunately we live in the real world where innovators cannot wait 5 years for technology to be debated, formalized, bartered, compromised and generally muddied into yet another worthless piece of documentation that is out of date before it's ever released.

            Nothing is stopping them from innovating. They just can't legitimately call their independent innovations "HTML 5". That doesn't bother me. But what we get for that are ubiquitous yet proprietary things like Flash and all of the problems that come with them. I still think it'd be better to fix what's wrong with the processes we use to create open standards.

            • Re: (Score:3, Interesting)

              by daveime ( 1253762 )

              It's funny you mention Flash.

              It's one of the few things that really "just works" across all browsers, regardless of the underlying O/S. Perhaps that is why the web as a whole has adopted Flash so readily (for better or worse), and why I feel we'll never be rid of it even when html5 is "live".

              Now it's proprietary, which means the owner gets the control over what the user can see or not see (source), but let's face it, it doesn't take Adobe 5 years to bang out an even better version, what are we on now, vers

              • It's one of the few things that really "just works" across all browsers

                Try telling that to the iPhone (or a lot of other mobile devices that only have the embedded version of Flash). The difference between HTML 5 and Flash is that Flash's functionality is binary: it either works completely or it doesn't work at all (more or less: it might just run too slowly on your CPU). HTML 5 can have bits working perfectly and other bits not working at all. Done well, a site using HTML 5 can fail a lot more gracefully than one using Flash. It's a lot easier, however, to have a site tha

              • It's funny you mention Flash.

                It's one of the few things that really "just works" across all browsers, regardless of the underlying O/S.

                <img> works perfectly across all browsers, last I checked. So does pretty much all actual HTML markup that people use, as long as you don't get into weird stuff like <keygen>. And a fair bit of CSS and JS. There's no reason to expect <video> and <canvas> won't be the same, eventually.

                Perhaps that is why the web as a whole has adopted Flash so readily, (for better or worse), and why I feel we'll never be rid of it even when html5 is "live".

                Flash becomes a heck of a lot less useful as soon as its market share drops below 80%. Suddenly to use your page, users have to click through a whole bunch of buttons and install some extra software,

            • I think you identified the problem right there.
              The real underlying problem is that a standard is most useful when all the major vendors* involved use it, and it becomes less useful the fewer of them use it.

              Steering a path between a load of conflicting interests takes a lot of time and effort, but if a standards body doesn't do it then (unless forced by government) its standards will not gain acceptance.

              *I include major FOSS projects in the definition of vendors here.

            • I bet that there are no technical reasons why it takes 5 years or more to come up with an HTML standard. I bet that there are lots of political reasons for that. I bet that a small team of engineers could do a better job in less time than a bureaucratic committee.

              HTML5 is developed by the WHATWG [whatwg.org], which more or less is a small team of engineers rather than a bureaucratic committee. It's driven almost entirely by implementers. The reason things aren't happening immediately is because the various browser developers only have a limited amount of development resources; they can only assign some of them to implementing new markup-level features (as opposed to security features, UI, bug fixes, etc.); and HTML5 defines a huge number of new features (latest draft is 696 pa

          • Even IF they fully implement the video tag correctly, who exactly will reformat all their existing video collections in AVI/FLV into an open-source format (Ogg?), and what benefit will it give them? Absolutely none.

            And for sure, even if and when the standard *is* finalized, that won't be before all the big players have bartered with the committee for concessions on alternative allowable formats, and we'll end up with a video tag that needs to play not just open source formats but also AVI, WMV, FLV and all the popular formats of the day.

            Provided of course in the next N years, an even better video compression format comes along and despite the agreed upon video tag, we'll end up with an <ms-video> tag, an <flv-video> tag etc etc.

            Please try consulting the actual spec [whatwg.org]. HTML5 does not define or require any particular format for video or audio. It's open-ended and works with any format the browser supports, just like <img> works with any image format. Currently Firefox supports only Ogg Theora, Chrome supports both Theora and H.264, and Safari uses the system codecs. You can provide the video in multiple formats, and the browser will use the first one that it supports. Not flawless, but nothing like the chaos you hypothesize.
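            Roughly, the source-selection logic works like the following sketch (Python for illustration; the support sets are illustrative snapshots of late 2009, not an authoritative list):

                # The browser walks the listed sources in order and plays the
                # first format it can decode.
                BROWSER_SUPPORT = {
                    "Firefox": {"video/ogg"},                # Ogg Theora only
                    "Chrome":  {"video/ogg", "video/mp4"},   # Theora and H.264
                    "Safari":  {"video/mp4"},                # system codecs
                }

                def pick_source(browser, sources):
                    # Return the first listed source the browser supports.
                    supported = BROWSER_SUPPORT.get(browser, set())
                    for url, mime in sources:
                        if mime in supported:
                            return url
                    return None  # nothing playable: fall back or show a link

                sources = [("clip.ogv", "video/ogg"), ("clip.mp4", "video/mp4")]
                for browser in BROWSER_SUPPORT:
                    print(browser, "->", pick_source(browser, sources))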

        • There was a draft standard of HTML5 released January 2008.

          The first drafts of HTML5 were published by the WHATWG [whatwg.org] in 2004; 2008 was only when the W3C got involved. The W3C's endorsement has made little to no practical difference to the development or implementation of HTML5 so far – Mozilla, Apple, Google, and Opera were all implementing based on the WHATWG spec to begin with, and the W3C editor is the same as the WHATWG editor. The W3C was brought into the picture as a political measure, and its involvement shouldn't be viewed as a milestone from a spec

      • "After all, no one should invent anything until it's been discussed in committee for a minimum of 10 years, until the technology it is attempting to standardize has already been superseded by something better !"

        The irony is that routers and wifi network cards are a CASE IN POINT against your argument: many router and wifi network card manufacturers were allowed to release hardware based on unfinished specs, which worked intermittently and made a mess of the home router market. Almost all early home routers

    • What customer lock-in is there with Wi-Fi routers?
      • Re: (Score:3, Informative)

        Well, the first thing that comes to mind is D-Link and Linksys with their "airboost" and whatever the other one was called. If you want to take advantage of the faster speeds, you need a specific router and matching wifi card/USB dongle/whatever...
    • Open standards are a good thing. They avoid these kinds of problems.

      They do? I guess so, since, obviously, it's only evil corporations that disagree on things....

    • by LWATCDR ( 28044 )

      Actually, this problem was caused by the lack of a finished open standard.
      1. You cannot forbid them from selling non-standardized WiFi. The FCC granted unlicensed spectrum for this, so each company can do what it wants. If not, you would really stifle innovation.
      2. Customers wanted faster wifi and companies supplied it. Hey, it said draft on the box. It is called a free market.
      The standards committee needs to start working on the next several standards now. The problem is the speed and planning of the standards commit

  • Drama...? (Score:4, Funny)

    by ScoLgo ( 458010 ) <scolgo@g m a i l . c om> on Thursday December 10, 2009 @04:41PM (#30393808) Homepage

    I love this line: "The result: it took five drama-packed years for the standard to come to fruition."

    Yep, this has definitely kept me on the edge of my seat waaayyyy more than watching Lost or Heroes.

    • Re:Drama...? (Score:5, Interesting)

      by RichMan ( 8097 ) on Thursday December 10, 2009 @04:47PM (#30393896)

      I went through two small companies working on pre-standard N devices. Both went under; as a little company, you can't pre-run a standard to market. We were ready for production 7 years ago.

      So yes, drama that personally affected me as I went through two collapsing companies.

      • by IANAAC ( 692242 )
        So yes, drama that personally affected me as I went through two collapsing companies.

        That's a perfectly valid point.

        Just realize that it hasn't affected the VAST majority of, well, anyone else (consumers).

      • Re: (Score:2, Interesting)

        by stnt ( 730547 )
        Can you name the companies? I was part of Wipro Newlogic in France, designing 802.11n hardware IP. We collapsed too, 6 months ago. It seems no phone or TV/STB manufacturer was ready to integrate 11n into their chips. The late standardization didn't help, for sure.
    • by Monkeedude1212 ( 1560403 ) on Thursday December 10, 2009 @05:11PM (#30394204) Journal

      I hear HBO has hired someone to write a season or two based on the whole ordeal. Rumours are it will be called "N" and the tagline will be "Wi the Fi is this taking so long?". It's not above any of the normal problems that HBO shows have. You know, the kind where there is a secret love plot between two characters that have no influence on the story whatsoever. Or the writers write in a love scene, and then it gets cut short for commercial breaks.

    • Definitely better than Lost, and also better than Heroes after season 1.

  • Wi-fail (Score:2, Insightful)

    by Anonymous Coward

    Between crap home routers and microwaves knocking out my signal, I'm just sticking to good ole fashioned cables. Wifi has been nothing but a headache in the years I've used it. Give me a good ethernet cable any day.

    • I agree, although you might look into a newer microwave; it is supposed to make a big difference.

      • by jedidiah ( 1196 )

        ...and perhaps one day it will equal the speed, reliability and security of an ancient wired standard.

    • by afidel ( 530433 )
      If your microwave knocks you offline, throw it out! No correctly functioning device should be emitting outside the cooking area. When I supported Cisco's wireless division, we bought every microwave available for sale in our area and every microwave from every Goodwill in the area, and could never measure any significant amount of emissions. Also, 802.11n supports operating in the 5 GHz spectrum, so if you don't have concrete walls you can generally avoid collisions with neighbors, as there is much less gear using
  • B/G is good enough when tied to a residential/office network; it's hard to notice the benefit of N.
    • Re:Blueray of Wifi (Score:4, Informative)

      by The Ancients ( 626689 ) on Thursday December 10, 2009 @04:55PM (#30393994) Homepage

      B/G is good enough when tied to a residential/office network; it's hard to notice the benefit of N.

      Do you do much copying of files at all? We can see a huge difference on our network (have had n since the first 11n Airport Extreme), with speeds 3-5 times faster than G in the same environment. For us, 11n actually made wireless an acceptable alternative.

      • Re:Blueray of Wifi (Score:4, Insightful)

        by h4rr4r ( 612664 ) on Thursday December 10, 2009 @05:07PM (#30394158)

        I have this magic technology called wired networking. Even the copper stuff goes all the way up to 10Gb.

        Wireless is 99% of the time more a pain than it is worth.

        • Also, wires let you multiply the spectrum. With wireless you just get one, whereas with wires you get as many spectrums as you can lay down.
        • Re: (Score:3, Insightful)

          I also have a girlfriend who bitches when I place wires all round the house. Doesn't stop me doing it though.
        • You say that because you're not running two wires to every office/cubicle in the building and attaching them to an expensive Cisco router... and then doing it all again when you have to reconfigure the cubes in three years. And you're not trying to run stuff through an older home that didn't come with built-in 10gigE wall jacks.
          • by jasonwc ( 939262 )
            The point is that you don't have to rewire. You could likely obtain Gigabit performance with Cat 5 cabling installed 15 years ago, and certainly could obtain 100 Mbit speeds with very old Cat 5 wiring - which is more equivalent to 802.11n performance.

            A 15 year-old wired technology in most cases will beat the cutting-edge wireless technology. While rewiring for 10 GbE may be a bitch, you're doing so to obtain a level of performance totally unachievable over wireless.

            BTW, when you wire for 10 GbE, are you usi
            • Cat 6 became the bastard standard; turns out it was only good for short runs of 10GbE and 5e could already do gigabit at 100 meters. I'd personally stick with 6a (the new 5e) lest 7 share the same fate as 6 for whatever comes after 10GbE. Remember when they said we'd be using fiber to the desktop for high speed Ethernet? Well, they keep coming up with ways to make good ol' copper stay the course.

          • by h4rr4r ( 612664 )

            No, we run 4 to every office/cube. They go back to a very expensive cisco switch. Why would any desk be plugged into a router?

            Wall jacks are easy to put in as well.

        • As one of the guys who does run the expensive Cisco routers with all those drops connected to them, I would still much rather have them wired - faster, more reliable, and easier to troubleshoot - plus the wired Cisco switches and routers are still cheaper than the Cisco Access Points, controllers, and PoE switches and routers that you need to provide office-caliber wireless to a floor full of 100+ people.

        • Re:Blueray of Wifi (Score:5, Informative)

          by jasonwc ( 939262 ) on Thursday December 10, 2009 @06:40PM (#30395680)
          I completely agree with the parent. Wireless is often more trouble than it's worth. It's great for casual internet access on laptops, but it's really unreliable for HD video streaming, system backups, or large file transfers. If you live in a single-family home, you probably don't have many interference issues to deal with, but in a multi-unit apartment building there is often significant interference on the 2.4 GHz spectrum, not merely from other wireless routers but from phones, microwaves, baby monitors and other devices. The end result is often dropouts, unreliable connectivity, and slow speeds.

          Wired Ethernet is a reliable and mature technology that pretty much always works. While the vast majority of wireless routers are crap, it's quite easy to purchase a quality Gigabit Switch, and pretty much any wireless router will provide stable wired connections. People seem to ignore the fact that wireless only provides shared speeds while wired ethernet provides dedicated bidirectional bandwidth per port. In addition, the stated maximums for wired ethernet provide a sense of real world speeds while wireless does not.

          For example, 802.11g provides real-world speeds of 20-25 Mbit/sec - not the 54 Mbit theoretical speed pasted on the box. 802.11n is advertised as 300 Mbit/sec but generally provides 100-130 Mbit/sec at best. However, those top speeds can only be obtained with a line-of-sight connection at a short distance - a distance so short that you could easily connect over wired ethernet and obtain 10x the speed! At 100-BaseTX I am able to obtain 90 Mbit/sec after network overhead (94 Mbit/sec with Jumbo Frames). With Jumbo Frames, Gigabit can achieve 950 Mbit/sec or higher.

          So, 802.11n in a best-case scenario (little interference, 10 ft from the AP) provides speeds only slightly in excess of 100-BaseTX, a standard formed 14 years ago (1995)! If you actually connect from any significant distance, 100-BaseTX will provide better speeds. In addition, you get dedicated upstream and downstream per-port bandwidth on 100-BaseTX.

          Gigabit switches with Jumbo frame support can now be had for less than $30 - and in some cases $20. Nearly all laptops and desktops now come with Gigabit NICs and support Jumbo frames. Modern dual and quad-core CPUs can easily take the overhead of transferring at gigabit speeds with or without jumbo frames. Furthermore, modern OS's are more efficient at high-speed network transfers.

          Yet, I see 802.11n routers advertised for streaming HD video, system backups, and large file transfers. Would you really want to backup 100 GB over 802.11n? Sounds like fun watching your entire network come to a crawl for 3 hours. Gigabit ethernet over Cat 5e/6 is generally limited only by your hard drive setup, and can be used for all the aforementioned tasks without any impact on network performance.

          The following is my recent experience with wireless networking:

          I use a WRT54GL with the Tomato (Linux) firmware. This provides a rock-solid solution, yet I was getting dropouts on wireless at my current apartment. I know it's not the router as it worked fine in my cinderblock college dorm, and I have another WRT54GL running Tomato at my home in NJ - which usually is up for months at a time - until a power outage.

          At my latest apartment, I was getting constant dropouts on wireless. I ran a second AP to get a stronger connection. The connection was indeed stronger, but I still got intermittent dropouts. I tried changing the wireless channel, antenna placement, etc., but nothing worked. Finally, I just said fuck it, bought a 100' Cat 5e cable for $8 and ran it directly from the ADSL router to the router in my room. No dropouts since then - and I get more consistent and faster speeds.

          I purchased an 8 port Gigabit switch with Jumbo Frames for my internal network and now my network speeds are limited only by my hard drives. Transferring between a 1 TB eSATA and 1 TB SATA drive, I was able to transfer a 12 GB file in 2:15 at an average speed of 95 MB/sec - around 800 Mbit/sec. With
          • by Movi ( 1005625 )

            Despite being a design that is over 5 years old, the WRT54GL is still one of the best selling routers on Newegg. It has nearly 3,000 user reviews, the best wireless router rating on the website, and has been nominated for the "best router" category for the last 30 consecutive months.

            There's one problem tho. Once you get to faster speeds (above 25 Mbit), even the wired packet switching in that baby can't handle it. Right now I have a 50 Mbit DOCSIS 3.0 connection from my ISP, yet when using my trusty 54GL as the router for that connection, when 3 clients try to use the network they only get about 11-12 Mbit per client. ON WIRE. The G-generation hardware just wasn't built for these kinds of speeds.

            There is an n-generation router that in my eyes serves as the 54gl (which I still use succ

        • "I have this magic technology called wired networking. Even the copper stuff goes all the way up to 10Gb.

          Wireless is 99% of the time more a pain than it is worth."

          Smart Engineer: If you pay me 2 million dollars I can make it go to 11Gb.

      • B/G is good enough when tied to a residential/office network; it's hard to notice the benefit of N.

        Do you do much copying of files at all? We can see a huge difference on our network (have had n since the first 11n Airport Extreme), with speeds 3-5 times faster than G in the same environment. For us, 11n actually made wireless an acceptable alternative.

        From what I've seen, higher speeds reduce range.

        Yes, I know, N is supposed to do fancy things with multiple antennas and all that good stuff... Supposed to actually have a better range... But, honestly, I haven't seen it.

        Typically, with the installations we do, range is more important than connection speed. I will normally still set WAPs to do B - it generally has better range than G or N.

        If you're really concerned about speed, N will get you up around 100 Mbps... Which is certainly better than 11 or 54

      • My whole house is N, because my T-Mobile blackberry uses UMA to tunnel GSM over IP, and the access points need to be G. I also use a wireless PS3 for streaming HD content to my TV, and a Roku box for the same. Have never had any problems with G, although if you need something to transfer FAST, N may be a requirement. For the vast majority of folks, G is sufficient, especially since their internet connection isn't going to be anywhere near 54Mb/s (theoretical G limitation), except for FTTH folks.
        • Typo: "My whole house is N" should read "My whole house is G"
        • by jasonwc ( 939262 )
          802.11g can only reliably do 20 Mbit/sec or so. It maxes out at about 25 Mbit/sec. My gf has 25/15 FTTH from Verizon, and I had to cap the connection at 16-20 Mbit on 802.11g (20 feet from the AP) as 25 Mbit/sec downloads would cause erratic speeds.

          However, the solution wasn't to upgrade to 802.11n. Since every room in the apartment is wired for 1000-BaseT, I'm just going to hook up a Cat 6 cable, and then it won't be an issue.
          • I'm not sure if cost is still an issue, but Cat 5e works just fine for 1000-BaseT. Also, if stringing the cords through walls, 5e lets you drill smaller holes.

            • by jasonwc ( 939262 )
              The apartment is less than a year old and was pre-wired with either Cat 5e or Cat 6; I'm not sure, as I haven't checked which. I'm just using a Cat 6 patch cable from the wall to the other side of the room. Cat 6 is less prone to interference from power cables and other network cables, so it seems the better choice for patch cables. I paid $1.50 for 10' Cat 6 cables, about 20 cents more than Cat 5e, and a 50' Cat 6 cable costs $6. So, price isn't an issue.
    • Re: (Score:1, Funny)

      by Anonymous Coward

      How's that HD video streaming working out for you?

    • by tisch ( 1371229 )

      B/G is good enough when tied to a residential/office network; it's hard to notice the benefit of N.

      B is quite slow, dude. G is fine. The benefit of N is faster wireless transfers between nodes in your network: send/receive files 5-6 times faster within your network. Internet access, as always, is only as fast as your ISP allows.

  • Lack of Demand (Score:3, Insightful)

    by Isaac-1 ( 233099 ) on Thursday December 10, 2009 @05:15PM (#30394254)

    Maybe the problem is lack of demand. How many people need the speed? For that matter, how many people need the speed of 802.11g? These days everything seems to be about streaming media: at home people stream media off the internet, or, for the more geeky, off a media server. So do they really need a wireless connection that is 50 times faster than a typical home broadband connection, particularly when N routers are over twice the price of their G counterparts?

    Ike

    • When a measly 1GB file takes half an hour to copy over 802.11g, you'll WANT 802.11n to work at your home.

      Who doesn't have 3 or 4 computers at home these days?
      • Re: (Score:2, Insightful)

        by Xeno man ( 1614779 )
        Lots of people have more than 2 or 3 computers, but hardly anyone sends large files back and forth. Just because you and your Slashdot friends all push gigs of data over your networks doesn't mean everyone does. Most people only use a network so all of their computers can go online, that's it. Some will venture into printer sharing, and maybe a few into network storage or backup, but that is the extent of it. Go hang out in Future Shop and stand next to the wireless routers and see what regular custome
        • by jasonwc ( 939262 )
          Doesn't that just support our point?

          If you just need internet access, print sharing, and light file transfers, 802.11g is fine*
          If you need high reliability, very fast speeds, and want to engage in very large file transfers (HD video) or networked backup solutions, you should get Gigabit ethernet.

          The only plausible benefit 802.11n may have is for consumers that want to stream HD video and are unable or unwilling to wire their home. And 802.11n may not do so well at farther distances, as others have mentioned
      • by RulerOf ( 975607 )
        I pull down 1.1(ish) GB files in about 7 minutes on my G connection. I'd prefer N, but it's fast enough if you're only doing 1 of those files.
      • Re:Lack of Demand (Score:4, Interesting)

        by jasonwc ( 939262 ) on Thursday December 10, 2009 @07:07PM (#30396114)
        That's an exaggeration. Here are real-world best-case-scenario speeds:

        1 GB file:
        802.11b - 27 minutes at 5 Mbit/sec (625 KB/sec)
        802.11g - 7 to 8.5 minutes at 20 Mbit/sec (2.5 MB/sec)
        802.11n - 1 minute 15 seconds at 110 Mbit/sec (13 MB/sec)

        100-BaseTX - 1 minute 30 seconds at 92 Mbit/sec (11.5 MB/sec)
        1000-BaseT, from/to laptop drive - 17 seconds at 480 Mbit/sec (60 MB/sec)
        1000-BaseT, between high-performance desktop drives - 11 seconds at 800 Mbit/sec (95 MB/sec)
        1000-BaseT, RAM to RAM - 9 seconds at 945 Mbit/sec (118 MB/sec)

        60 MB/sec is realistic to expect when transferring to or from a laptop with a 5400 RPM drive. 85-95 MB/sec or even 100 MB/sec+ is achievable when transferring between high-performance 7200 RPM desktop drives, at the beginning of the drive.

        However, 1 GB is small. A typical HD tv show is 1.1-1.4 GB. A typical 720p x264 encode is 5-8 GB. A typical 1080p x264 encode is 8-15 GB. A system backup can be anything from > 10 GB for incremental backups to 60-100 GB for full backups of system partitions.

        At 85 MB/sec you can transfer a DVD (4.37 GB) in 60 seconds.
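        For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope script in Python (the throughputs are the assumed real-world figures from the list above, not official ratings; small differences from the quoted times come down to GB-vs-GiB rounding):

            # Time to move a 1 GB file at various real-world link speeds.
            links_mbit = {
                "802.11b":                       5,
                "802.11g":                      20,
                "802.11n":                     110,
                "100-BaseTX":                   92,
                "1000-BaseT (laptop drive)":   480,
                "1000-BaseT (desktop drives)": 800,
                "1000-BaseT (RAM to RAM)":     945,
            }

            file_bits = 1024**3 * 8    # 1 GiB file, in bits

            for name, mbit in links_mbit.items():
                secs = file_bits / (mbit * 1e6)
                print(f"{name:<28} {secs / 60:4.1f} min ({secs:7.1f} s)")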
  • One quote from the article:

    Before the IEEE will approve any given standard, everyone with a patent that touches that standard must sign a LoA (Letter of Agreement). The LoA states that the patent holder won't sue anyone using his or her patent in a standard-compatible device. In this case, the holdout was CISRO (Commonwealth Scientific and Industrial Research Organization), an Australian government research group that held a patent that concerned the development of a wireless LAN. CISRO refused to sign the 8

    • Re: (Score:1, Insightful)

      by Anonymous Coward

      an Australian government

      and one from the US Constitution: ... Try again, Congress.

      Does this not point out a flaw in your logic?

      • Re: (Score:2, Funny)

        by HoppQ ( 29469 )

        an Australian government

        and one from the US Constitution: ... Try again, Congress.

        Does this not point out a flaw in your logic?

        He's obviously suggesting that U.S. Congress failed because it didn't order an invasion of Australia to promote the progress of science.

        • by Thud457 ( 234763 )

          an Australian government

          and one from the US Constitution: ... Try again, Congress.

          Does this not point out a flaw in your logic?

          He's obviously suggesting that U.S. Congress failed because it didn't order an invasion of Australia to promote the progress of science.

          seconded!
          for Science!

    • IEEE = an international standards body.

      CISRO = Australian.

      US Constitution = United States.

      Tell me again how Congress failed us when the standard was held up in an international standards body by an agency of the Australian government?

  • by organgtool ( 966989 ) on Thursday December 10, 2009 @05:21PM (#30394338)
    The same thing happened with 802.11g. I remember going through four 802.11g PCI cards before finding one that could communicate with my 802.11g router at a distance of more than three feet. I was not aware that the devices were pre-draft (they didn't state "pre-draft" on the packaging like they do now), so I did not realize that was causing my problem. Eventually the standard was ratified, and if my memory is correct, the manufacturers released firmware updates so that the devices complied with the ratified standard. I doubt that this practice will go away since the manufacturers want to release bleeding-edge technology to stay ahead of the competition, but at least their packaging now states "pre-draft" so that cautious consumers will know to avoid it.
    • Re: (Score:3, Interesting)

      by RedLeg ( 22564 )

      Indeed it does.... and you don't know the half of it.

      IEEE operates with a completely different dynamic from what most internet folks are used to.

      One of the big motivations for a company to sponsor a participant (an engineer, by paying him to prepare and to attend) is to get the company's intellectual property incorporated into the standard under development as a MUST. This is all above board, and the companies must declare up front if they believe they have IP in a proposal, and agree that if adopted the

  • Despite the moniker of "open standard", every vendor who contributes to these standards and has "voting" authority on them still has to look after its own business interests. 802.11n was held up more for business reasons (members are competitors, remember), with some not yet having product available. They obviously want to make sure that their engineering and pre-manufacturing ramp-ups are in line before the standard is released. Like 802.11n, this didn't stop many vendors from releasing "pre standard" products as

  • by serviscope_minor ( 664417 ) on Thursday December 10, 2009 @06:35PM (#30395602) Journal

    One of the tricks is low-density parity-check codes (LDPC), which are the best currently known error-correcting codes. They're decoded with a wonderfully elegant algorithm which is embarrassingly parallel, so it works very well in hardware.

    In fact, you can pretty much implement the belief network in hardware directly.

    The codes are used in 10G Ethernet, too.

    Funny thing is that they date from the 60's, but were impractical because of the amount of computation required to decode them. The decoding algorithm was then rediscovered for inference on Bayes nets.

    If you like this sort of thing, it is worth reading MacKay's book on inference, which is free online. I have no affiliation with MacKay, btw.
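    A minimal sketch of the idea, using hard-decision bit flipping (a simpler cousin of the belief-propagation decoder described above; the tiny parity-check matrix is a made-up toy, nothing like a real 802.11n code):

        import numpy as np

        # Toy parity-check matrix H: each row is a parity check, each
        # column a codeword bit. (Hypothetical, for illustration only.)
        H = np.array([[1, 1, 0, 1, 0, 0],
                      [0, 1, 1, 0, 1, 0],
                      [1, 0, 0, 0, 1, 1],
                      [0, 0, 1, 1, 0, 1]], dtype=np.uint8)

        def bit_flip_decode(received, H, max_iters=20):
            # Repeatedly flip the bits involved in the most unsatisfied
            # checks. Every check and every bit updates independently,
            # which is the embarrassingly parallel property that maps so
            # well onto hardware.
            word = received.copy()
            for _ in range(max_iters):
                syndrome = H.dot(word) % 2         # which checks fail
                if not syndrome.any():
                    break                          # valid codeword found
                votes = H.T.dot(syndrome)          # failing checks per bit
                word[votes == votes.max()] ^= 1    # flip worst offenders
            return word

        rx = np.zeros(6, dtype=np.uint8)
        rx[2] = 1                                  # corrupt one bit
        print(bit_flip_decode(rx, H))              # -> [0 0 0 0 0 0]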

  • Despite expanding the acronym once, and linking to the organisation, the article manages to spell it incorrectly 3 times out of 4.

    It's CSIRO, you numbnuts!

    Also, IIRC, the CSIRO patents referred to pre-date any work on 802.11n, and their reluctance to release the patents for use by the WiFi consortium was due to the fact that they were still involved in outstanding suits and countersuits with IBM, Dell, HP, Microsoft, Netgear, Buffalo, etc. When all that was cleared up / dropped, CSIRO agreed to sign off on
