Verizon LTE Can Use the Monthly Data Allotment In 32 Minutes

adeelarshad82 writes "Verizon's new 4G LTE network is so fast that you can use up your entire 5GB in as little as 32 minutes. The 2010-era speeds are soured by the 2005-era thinking on data plans. Verizon has priced LTE pretty much like 3G to encourage data sipping, not guzzling. As soon as you start using the latest high-bandwidth Internet services, your whole month's allotment can evaporate in no time. According to a test, the network's speed maxed out at 21Mbps, which means that it takes only 32 minutes to smoke up the 5GB monthly data cap on the plan. While the 21Mbps speed was hit on a low traffic network, Verizon estimates you'll be able to get around 8.5Mbps with a loaded network which still means that the cap can be exhausted in about an hour and a half."
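The summary's figures are easy to sanity-check. A minimal sketch, using only the 5GB cap and the 21Mbps / 8.5Mbps speeds quoted above (and assuming decimal gigabytes):

```python
# Back-of-the-envelope check of the summary's numbers. The 5 GB cap and
# the 21 / 8.5 Mbps rates come from the article; decimal units assumed.

def minutes_to_exhaust(cap_gb, rate_mbps):
    """Minutes to transfer cap_gb gigabytes at a sustained rate_mbps."""
    bits = cap_gb * 8 * 1000**3          # GB -> bits
    seconds = bits / (rate_mbps * 1000**2)
    return seconds / 60

print(round(minutes_to_exhaust(5, 21)))    # ~32 minutes at peak speed
print(round(minutes_to_exhaust(5, 8.5)))   # ~78 minutes on a loaded network
```

At 8.5Mbps the cap lasts roughly 78 minutes, which is where the "about an hour and a half" figure comes from.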
  • by meerling ( 1487879 ) on Thursday December 02, 2010 @07:30PM (#34425750)
    What if you are watching a streaming video? (Which of course means you don't know what size it actually is...)
    There are lots of data-chugging activities on the net that don't tell you how large they are. Combine that with a greedy provider that wants you to go over your limits so it can charge you more, and your wallet is going to take the hit sooner or later.
  • by Firehed ( 942385 ) on Thursday December 02, 2010 @07:47PM (#34425966) Homepage

    Still cheaper than a teenager without an unlimited texting plan.

  • by DragonWriter ( 970822 ) on Thursday December 02, 2010 @07:52PM (#34426024)

    Why the hell are you measuring bandwidth in seconds? And why do you need over 5 GB on your phone?

    Verizon's 4G LTE network isn't even available on phones initially; it's limited to USB modems for computers (and Verizon's marketing of those USB modems and the associated plans is targeted primarily at business users). It will be rolled out to phones later.

    So the more relevant question should be, "why do you need over 5GB/month in network data transfer to a computer, especially one you use for business?"

  • by damnbunni ( 1215350 ) on Thursday December 02, 2010 @09:23PM (#34426836) Journal

    Sprint used to, then they amended their Unlimited plan to Unlimited* with an asterisk and a 5 gig cap.

    Virgin Mobile Prepaid, however, is $40/month for unlimited 3G internet, and uses Sprint's network. It's actually Sprint's prepaid brand.

    I've pulled down 20 gigs and up per month (downloaded all my Steam games!) without issues.

    However, this VM Unlimited plan is pretty new.

  • by Caerdwyn ( 829058 ) on Thursday December 02, 2010 @09:45PM (#34426986) Journal

    The real bottleneck that wireless carriers worry about is not their network. It's the capacity of a single cell tower to carry a finite number of simultaneous connections.

    Have a look at the info about LTE frequency assignments. OK, all you hams out there: how many MHz of the frequency band does it take to carry a data rate of 21Mbps at the various assigned frequencies? How much frequency spectrum is available? Divide the available spectrum by the per-connection requirement and you get the number of simultaneous full-speed downloads. Exceed that, and you have to start some sort of time-sharing scheme in which individual users grab a few milliseconds of exclusive ownership of each channel at a time. (Token Ring, anyone?)

    Because of the way radio works, you can only get so much network bandwidth out of a particular frequency spectrum. You can do phasing tricks and subcarrier acrobatics to squeeze more out, but there will be a point at which you can't handle more devices per cell tower, no matter how much (wired/fiber) network there is behind it. And putting two cell phone towers right next to each other doesn't double the number of connections that can be handled; a phone connecting at 2410MHz to one cell phone tower will be putting out radio noise that a second tower right next to it will pick up. This is why AT&T is getting hammered in places like San Francisco and New York where there is a very high density of 3G users; they just can't add more cell towers. They're saturated; it's not because they're cheap bastards (they are), it's physics. That's how radio works.

    Think of it this way: your FM radio has channels from 88.1MHz to 107.9MHz in 200kHz steps. Once all 100 channels are allocated, just "adding more towers" doesn't get you anything.

    Smart phones differ from traditional cell phones in that they are "on the air" more than voice-only phones (insert teenage-girl joke here). A voice call might need 50kbit/sec for the duration of the call, and thus consume very little radio spectrum during that call (a handful of kHz). But a data session is a steady high-bitrate stream that can consume several MHz. Yes, interlacing occurs, but it really comes down to this: the limitation is how many Mbits per second an allocated frequency spectrum can carry, divided by the number of simultaneous users of that frequency and their data demands. Once it's all in use, there ain't no more. Users get timesliced to slower and slower connections, until the granularity demanded by timeslicing and channel-juggling among X-thousand users of a single tower is so small that you can't even get a voice call through.

    So yeah, I understand why wireless carriers would want to cap data usage. It sucks, but physics doesn't care how angry a consumer is, you can't sue to force 1000MHz of in-use spectrum to fit into 200MHz of allocated spectrum, and carriers can't throw money at physics until it goes away. Radio spectrum is a finite resource, data at a given rate requires a specific portion of that spectrum, and that's it. Something has to be capped. Data rate or data cap; something has to throttle usage, because there's not enough to go around for everyone to max it at once.
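The divide-X-by-Y arithmetic the parent describes can be sketched in a few lines. Note the carrier width (10 MHz) and the average spectral efficiency (1.5 bit/s/Hz) below are illustrative assumptions, not Verizon's actual figures:

```python
# Illustrative only: the 10 MHz carrier width and 1.5 bit/s/Hz average
# spectral efficiency are assumed values, not Verizon's real numbers.

def full_speed_users(carrier_hz, bits_per_hz, peak_rate_bps):
    """How many sustained peak-rate streams one cell's spectrum can carry."""
    cell_capacity_bps = carrier_hz * bits_per_hz
    return cell_capacity_bps / peak_rate_bps

# One assumed 10 MHz LTE carrier versus the article's 21 Mbps peak rate:
n = full_speed_users(10e6, 1.5, 21e6)
print(f"{n:.2f} simultaneous full-speed downloads per cell")
```

Under these assumptions the answer comes out below one: a single cell cannot even sustain one continuous peak-rate stream at average spectral efficiency, which is exactly why time-sharing (and caps) enter the picture.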

  • by MonMotha ( 514624 ) on Friday December 03, 2010 @12:26AM (#34427918)

    It doesn't quite work this way. This is going to be a bit technical, but you asked a technical question, so bear with me. Yes, I am a ham (since you asked for one), and I've also done some commercial RF data systems.

    As others have pointed out, cellular telephone systems aren't like broadcast systems. You really can "put up more towers" to increase the amount of "service" (available data transfer per unit time, number of simultaneous voice calls, etc.) in a given geographic area without using more RF bandwidth. The reason for this is that you can turn the power on the base and handset down to reduce the coverage of the cell allowing reuse of the RF bandwidth more frequently within a certain geographical space. This is already done: cells on rural highways are much larger than cells within a city. In fact, the cells on rural highways would often be capable of covering an entire city from a geographic point of view, but there wouldn't be enough capacity to handle all that traffic, so smaller (lower power, lower antenna angle, etc.) cells are placed in cities allowing reuse of that RF bandwidth. Broadcast services can be thought of as "cellular" with very large cells (depending on the service, up to and including the entire planet for HF "shortwave" radio, for example) if you want, but that's not a traditional interpretation.

    As for how much bandwidth it takes to attain a certain information rate, that varies with a number of factors. Assuming a uniform RF environment (noise, propagation, etc., which of course isn't true but is handy for discussion), the key tradeoff is made by how "aggressive" your modulation scheme is. A more aggressive modulation scheme packs more data into a certain amount of RF bandwidth, but it requires a higher signal-to-noise ratio at the receiver to demodulate and recover the data. The exact relationship between how much data you can chuck into a given amount of RF bandwidth and the required receiver SNR varies with your chosen modulation scheme and receiver design. The reason data rates have been increasing with time is that newer, better modulation schemes (easier to demodulate) and better (mostly less noisy, but also more cost effective for a given complexity) receivers are being developed. More cells are also being added (see above) to lessen "competition" for the channel's bandwidth, but we're also seeing a lot more users and demand, so that probably averages out. The amount of RF bandwidth allocated to the cellular telephone services has remained roughly constant since the late 90s (800MHz cellular band + 1900MHz PCS band, though other bands are also used regionally, and some of these are new).

    In a two-way scenario like a cellular telephone, you also get to play with the fact that the two directions don't behave equally. The base-to-handset link (downlink) has the advantage of no access contention (there's just one base, and it knows everything it's doing), expensive equipment (there's only one, so the company can pump some money into it), and lots of power available (it's plugged into the wall). The handset-to-base link (uplink) is messier: it has access contention (multiple handsets coordinated remotely by the base), cost sensitive equipment (consumers don't like to pay thousands of dollars for their handsets), and limited power (batteries). Antennas are something of a wash since antennas are effective about equally in both directions. What all this means is that it's easier to use a more aggressive modulation scheme (and hence cram more bits per second into a given chunk of RF MHz) on the downlink than the uplink. Fortunately, this is roughly in-line with consumer demand: most consumers want to transfer large stuff to their phones, not from them. FWIW, Cable Modems have similar concerns, and a similar situation results.

    You also seem to assume a TDMA based uplink channel. Modern standards are all CDMA based. While the theory of operation is totally different, the effect is the same: multiple people contend for the same resource. Various
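The bits-per-Hz-versus-SNR tradeoff described above has a hard ceiling: Shannon's channel capacity. A quick sketch with assumed bandwidth and SNR values shows how much a stronger received signal buys you:

```python
# Shannon's limit on the modulation-aggressiveness tradeoff: more bits
# per Hz requires a higher SNR at the receiver. The 5 MHz channel width
# and the SNR values below are assumptions chosen for illustration.
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Upper bound on error-free data rate for a bandwidth and receiver SNR."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

for snr_db in (0, 10, 20, 30):
    mbps = shannon_capacity_bps(5e6, snr_db) / 1e6
    print(f"{snr_db:>2} dB SNR -> at most {mbps:.1f} Mbps in 5 MHz")
```

This is why the downlink (strong base-station signal, expensive receiver) can run a more aggressive scheme than the battery-powered, contended uplink.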
