NightKhaos on Digital Freedom

Digital Rights and Technology Blog

Posts Tagged ‘wifi’

Wireless is the future, so roll on FTTP.


I want to take the time to acknowledge and explain a few points that keep coming up around the NBN policy. I may be repeating a few points I made in previous blogs, but considering how long it has been since I posted I feel you won’t mind. First off, before I crack into it, some house cleaning and updates of relevance:

  • The NBN policy seems to have addressed a few issues that concerned me. For example the CAN will not be shut down in remote areas anymore.
  • The NBN policy appears to be able to meet its ROI projections due to the surprising underestimation of the number of 100Mbps service users.
  • The Coalition have put together a workable alternative, which Renai LeMay from Delimiter sums up fairly well in a position I agree with here.
  • Telstra have released a simply awesome LTE network which appears to be delivering real world speeds of around 5-25 Mbps, comparable to ADSL services.

Right, on to the post:

Yes, the future is wireless. There, I admit it. The world is going increasingly mobile; smartphone and tablet usage has exploded. I am not immune to this either: I have an HTC One XL, which I could not live without, and an Asus Google Nexus 7. I love the freedom this technology presents. I love the fact that I can sit in the middle of a park and write this article. I can see a lot of these mobile devices in my future. And I think that FTTP, like what the NBN intends to deliver, is a very important part of this future.

Confusion, I suspect. You must have questions. So let me attempt to answer them. First, some evidence, and since we’re talking about Australian broadband here we might as well use statistics relevant to Australia: ABS 8153.0. Please note I have modified their reporting style to something more pertinent to this article; however, the values have not been changed.

Connections

Technology          Dec 2011  Jun 2012  Movement    Growth    Share  Share Movement
                       ('000)    ('000)    ('000)
Dial-Up                   473       439       -34    -7.19%    1.56%          -0.21%
DSL                     4,553     4,632        79     1.74%   16.41%          -0.59%
Cable                     900       917        17     1.89%    3.25%          -0.11%
Fibre                      37        52        15    40.54%    0.18%           0.05%
  Fixed Line            5,963     6,040        77     1.29%   21.40%          -0.86%
Satellite                 100        94        -6    -6.00%    0.33%          -0.04%
Fixed Wireless             35        30        -5   -14.29%    0.11%          -0.02%
  Rural Options           135       124       -11    -8.15%    0.44%          -0.06%
Mobile Wireless         5,491     5,862       371     6.76%   20.77%           0.27%
Smartphone/Tablet      15,190    16,192     1,002     6.60%   57.36%           0.65%
  Mobile Networks      20,681    22,054     1,373     6.64%   78.13%           0.92%
Other                       8        10         2    25.00%    0.04%           0.01%
Total                  26,787    28,228     1,441     5.38%

As you can see, despite the massive uptake of Mobile Wireless and Smartphone/Tablet connections, with Mobile Networks growing at 6.64%, there is still marked growth in the Fixed Line sector of 1.29%.

Volume Downloaded

                                  Jun 2011  Dec 2011  Jun 2012
                                      (TB)      (TB)      (TB)
Fixed line                         254,947   322,280   389,130
Wireless                            19,149    23,142    25,301
Smartphone/Tablet                    3,695     5,000     6,610
Total broadband                    277,791   350,422   421,041
Total volume of data downloaded    277,897   350,518   421,047

Please note that in the above table Wireless includes Fixed Wireless and Satellite, and the Total includes Dial-Up and other connection types (not shown). Also, the amount of usage reported is for 3 months, not 6. This is a limitation of the ABS 8153.0 data set.

However, even more telling than the continuing growth in Fixed Line is the amount of data being downloaded on fixed-line networks. In June 2012 we were up to an average of 64.42 GB per user for the period, as opposed to 1.44 GB per user for Wireless.
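Those per-user figures can be reproduced from the two tables above (a quick sanity check; the connection counts are in thousands and the volumes in terabytes):

```python
# Reproduce the per-user averages from the ABS 8153.0 figures above.
# Connection counts are in thousands; quarterly download volumes in TB.

fixed_line_tb = 389_130           # Fixed line downloads, Jun 2012 quarter
fixed_line_users = 6_040 * 1000   # Fixed Line connections, Jun 2012

mobile_tb = 25_301 + 6_610        # Wireless + Smartphone/Tablet downloads
mobile_users = 22_054 * 1000      # Mobile Networks connections, Jun 2012

# 1 TB = 1000 GB, so TB * 1000 / users gives GB per user.
fixed_gb_per_user = fixed_line_tb * 1000 / fixed_line_users
mobile_gb_per_user = mobile_tb * 1000 / mobile_users

print(f"Fixed line: {fixed_gb_per_user:.2f} GB per user")   # ~64.4 GB
print(f"Mobile:     {mobile_gb_per_user:.2f} GB per user")  # ~1.45 GB
```

(The quoted figures of 64.42 and 1.44 are the same numbers with the decimals truncated rather than rounded.)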

So why are we seeing this trend? Well, considering 4G networks like Telstra’s in most cases outperform ADSL2+ and Cable connections, it is clearly no longer about the fact that “fixed-line is faster”, although I do not doubt that particular trend will re-establish itself if the NBN continues past the election. No, it has to do with the capacity that each type of network allows for. With fixed-line connections the quotas you can expect range from dozens of gigabytes to completely unlimited, with the typical plan being in the range of hundreds of gigabytes.

Granted, this is more than the average usage; however, that is probably due to the tendency of most users to overestimate their requirements because they do not want to suffer a loss in speed or excess usage charges. I remember when we first got broadband, and we would consistently go over our 1GB quota. Downloading a 300MB game demo managed to burn through a large chunk of it, and that would only take an hour or so to do. Ever since then my family have insisted that we get capacity well over what we use, especially after experiencing the perks of an unlimited connection in the UK. Right now, with 5 people in the house, we are on a 200GB plan and we probably never go over 100GB.

In the mobile space, however, the quotas available are significantly smaller, at between 200MB and around 10GB. And no, this isn’t a money-grabbing exercise by the providers; there is a legitimate technical reason why, and this reason can be seen the world over. AT&T, for example, refused to offer tethering on their unlimited iPhone data plans unless you first downgraded to a limited quota plan. The reason is quite simple: network congestion. Mobile networks in particular suffer from this problem.

Why do you think companies like Telstra and Optus have invested in 4G technology? It isn’t because they see fixed line as dead; it is because they are fighting a losing battle with their customers over data usage on their wireless networks. They need you to use as little as possible so that they can provide a consistent experience to all users of the network. This is why every smartphone in existence has the ability to switch to WiFi. And this, ladies and gentlemen, is where we start to get to the substance of my initial statement.

Our wireless future does not in fact rely on the advances of wireless wide area networks (WWAN) like 3G and 4G technology, although they will be important; our wireless future relies on the advances of wireless local area networks (WLAN), like 802.11g, 802.11n and 802.11ac. It is with these technologies, along with others like mesh networks, that our mobile nirvana will occur. Short range, low power, high bandwidth networks. The backbone of these WLAN networks will be some form of high-bandwidth fixed-line technology, which is not limited by spectrum and does not suffer from congestion as easily as WWAN networks do. And the best technology for this job is currently FTTP, and there have been no sudden leaps in the technological space that suggest otherwise.

There are technologies out right now, like Fon, that can further reduce the load on our WWAN networks by sharing our WLAN networks with the public at large (a practice that has probably proved unpopular in Australia due to our ISPs still having usage quotas). High penetration of FTTP and WLAN, mixed in with a little bit of WWAN to “fill in the blanks”, is the shape I see our wireless future taking. Just remember: the fact that you are getting an FTTP connection does not mean you’re tethered to a desktop computer to use it. I don’t know anyone who doesn’t have a WLAN access point in their home today, and FTTP is not going to change that.

Written by NightKhaos

December 8, 2012 at 11:27 am

The Problem with Mobile Broadband


With the NBN policy in full swing there has been some concern from the industry, and from political followers of the issues, that it costs too much, with the political opposition going so far as to call it a “White Elephant”. Some telecoms even put forward a new proposal, dubbed “NBN 3.0”, that was supposed to reduce the overall cost of the project while delivering similar results.

The problem with those advocating the use of wireless lies with a fundamental misconception about telecommunications: that everything is “going wireless”. It is easy to understand why people would think this; almost everyone has WiFi in their home, at their university, sometimes even at their local café. The trend is to buy and use mobile broadband, since it’s more flexible and you can use it just about anywhere. With devices like the iPad and smartphones starting to take up major ground in the Australian market, the argument seems flawless.

Everyone wants wireless, so why should we invest in fibre? Why should Australia spend around A$40 billion on installing a network that will deliver speeds of up to 1Gbps with a minimum delivery target of 25Mbps into the home?

I’ll tell you why: wireless is a terrible technology, and increasing its usage numbers will only serve to make the situation worse, likely driving up the cost to consumers as networks try to keep their usage numbers down. This is opposed to fibre, where increased usage numbers will serve to improve the technology, as providers try to innovate with new service delivery models and content delivery methods.

Now this, at first glance, seems odd: why would a technology get worse the more people use it? Wouldn’t the increased usage numbers allow for better innovation and push companies to deliver more robust and stable networks, thus removing the problems of the current network?

You would think so, and a lot of self-styled “industry experts” have been saying this. In ten years’ time there will be a new fibre-killer technology which makes going down the fibre road look like a waste of money, they say. While we’re at it, for the select few who want a fixed-line connection, there are new DSL variants which offer speeds comparable to what the NBN is offering and do not require us to replace the copper infrastructure with fibre.

If you take what these “experts” say at face value, then the opposition are right: this is truly a waste of money. However, now for the punch line. Let us look in detail at what wireless technology actually does.

A wireless connection needs three things. The first is an allocation of spectrum. This is the “space” where the data and voice signals will travel. For 3G technologies the spectrum used sits in a few spots, generally 850MHz, 900MHz, 1800MHz, and 2100MHz. For in-home WiFi and technologies like Bluetooth it is generally at 2.4GHz, with some WiFi standards also reaching into the 5GHz band, and DECT cordless phones usually working at either 2.4GHz or 5.8GHz.

Once you have worked out where you’re going to put it, you need a transmitter. This transmitter needs to be powerful enough to cover a given geographic area, and weak enough that it doesn’t interfere with other transmitters running on the same frequency. This is why in-home WiFi has a maximum range of around a hundred metres, while mobile phone towers have a range of a few kilometres. It all depends on how big an area you want to serve.

And finally, you need a channel. This is an allocation of the spectrum specific to you. Without channels you couldn’t have multiple networks running close together from different carriers, and every time you connected to your WiFi you would get problems because of your mate next door using WiFi. (Of course, many home WiFi users are not aware of channels, and you often get conflicts anyway. If you are having bad signal issues at home and there are a couple of WiFi networks nearby, I suggest trying a different channel to see if you get better performance.)
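The channel conflict problem is easy to see with a bit of arithmetic. In the 2.4GHz band, channel n is centred at 2407 + 5n MHz, and two roughly 20MHz-wide signals interfere when their centres are less than about 25MHz apart — which is why 1, 6 and 11 are the usual non-overlapping choices (a minimal sketch of that rule of thumb):

```python
# Centre frequency (MHz) of 2.4 GHz WiFi channel n (1-13): 2407 + 5n.
def centre_mhz(channel):
    return 2407 + 5 * channel

# Two ~20 MHz-wide WiFi signals overlap when their centres are less
# than ~25 MHz (five channel numbers) apart.
def overlaps(a, b):
    return abs(centre_mhz(a) - centre_mhz(b)) < 25

print(centre_mhz(1), centre_mhz(6), centre_mhz(11))  # 2412 2437 2462
print(overlaps(1, 6))   # False: 1, 6 and 11 don't interfere with each other
print(overlaps(1, 3))   # True: neighbouring networks on 1 and 3 will clash
```

So if your neighbour is on channel 3 and you are on 1, moving to 6 or 11 is the usual fix.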

Now this seems fine and dandy, but let us continue with our story: unless the communication method is direct line of sight, or there is only ever going to be one client on the network, you will never achieve the maximum theoretical speeds your network is designed for. This is known as contention.

Contention happens because every device on the same network uses the same channels (usually two: one for transmitting, known as TX, and one for receiving, known as RX; some technologies will use more, often in an attempt to bias speeds in a certain direction). This means that if you want to send a packet from your device, say part of an email or an MMS, you need to wait until no one else on the network is transmitting before you do. And to avoid the situation of two clients both noticing “no one is transmitting right now, so it must be safe for me to”, this is often done via a rotating queue, with the access point, or tower, deciding who is trying to send the most packets and thus needs the most “spots” in the queue. If you have ever downloaded a file over WiFi you will notice that it often starts slow and speeds up as the download progresses.

So, the more clients you have, the smaller the “window” you will have allocated to you to transmit, and the less bandwidth you will have. Even if another client is not doing anything, it will still be allocated a “window” so that it can inform the network that it needs a bigger one. There are even empty spots where clients can announce “hey, I want to connect”, which, depending on the network, can sometimes take up a significant part of the window. This is why in busy areas it can sometimes take a long time to connect to the network. Engineers have a hard time making all this work out such that it is seamless for the user.

Now, hopefully you can understand where I am going with this: the more users you have on a given tower, or cell, the slower your connection will be, and thus the less deliverable bandwidth you will receive. And this is before we even consider issues like signal degradation over distance, where a phone closer to the tower gets a better signal than one on the fringes of the cell. Scale this to many hundreds, or even thousands, of users, and those theoretical speeds of over 100Mbps start to look like less than 1Mbps per client.
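A toy model makes the scale of the problem obvious. Assume an idealised cell that divides its capacity evenly among active clients (real schedulers are smarter, and signalling overhead makes things worse, but the shape of the curve holds):

```python
# Toy contention model: an idealised cell shares its capacity evenly
# among all active clients. Real networks also lose capacity to the
# signalling "windows" described above, so this is an optimistic bound.
def per_client_mbps(cell_capacity_mbps, active_clients):
    return cell_capacity_mbps / active_clients

for clients in (1, 10, 100, 150):
    print(f"{clients:>4} clients: {per_client_mbps(100, clients):.2f} Mbps each")
```

With a 100Mbps cell, one client gets the full 100Mbps; at 150 active clients everyone is under 1Mbps.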

This is not only worse than what we currently get on fixed-line broadband (1.5Mbps being the minimum delivery target), it is also worse than what we get on existing, less efficient, wireless technology. And there are only two ways we can fix this, short of someone working out how to cram more data into the limited spectrum, which often requires a complete replacement of the current infrastructure to achieve.

Option one: we increase the spectrum we have to work with. However, there is very little unused spectrum in the right area (microwave) that we can use for this purpose.

Option two: increase the cell density, thus reducing the number of clients per cell. This would require more backbone fibre to be laid, comparable to what the current NBN is going to lay out, meaning that we would end up spending tens of billions of dollars anyway.

Fibre, on the other hand: if you say it is going to deliver 100Mbps, it will deliver 100Mbps. Not only that, but upgrading it is much cheaper than the initial roll out, because you simply replace the equipment at either end and leave the existing cable in place. This is almost exactly the same as fixed-line connections on copper today. The only difference is in maintenance costs, which I understand may be slightly higher; however, this is offset by fibre cable being much cheaper to buy and produce than copper.

While we’re on the subject of copper: as I mentioned previously, some advocates have pointed to advances in DSL technology that negate the need for fibre. These rely on a common deception from telecoms today: maximum theoretical speed, also known as the dreaded “up to”.

DSL technology delivers slower and slower speeds the further you are from the exchange. In fact, in some areas, if you are far enough out you can’t get DSL at all, because the signal isn’t strong enough to reach you. No new DSL technologies have fixed this issue. All they have managed to do is increase the bandwidth for the select few who are close to the exchange. One of the options pushed, known as UniDSL, will even fall back to an older technology if you cannot receive one of the higher bandwidth signals.

Fibre, on the other hand, will deliver the same bandwidth no matter how far out you are from the exchange, because its losses over distance are negligible compared to copper. (That isn’t to say it doesn’t have losses, but they are so small that for cables less than 5km, or the length you expect in a typical telephone exchange setting, you can expect to see effectively no losses, and there are methods to extend that distance further, such as reducing the number of clients per splitter.)
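The gap is easy to see with ballpark attenuation figures (illustrative numbers, not measurements of any particular cable: roughly 15 dB/km for a thin copper pair at ADSL frequencies, versus about 0.35 dB/km for single-mode fibre at 1310 nm):

```python
# Rough attenuation comparison. These per-km figures are illustrative
# ballparks: ~15 dB/km for a 0.4 mm copper pair at ADSL frequencies,
# ~0.35 dB/km for single-mode fibre at 1310 nm.
COPPER_DB_PER_KM = 15.0
FIBRE_DB_PER_KM = 0.35

def loss_db(db_per_km, km):
    # Attenuation in dB scales linearly with cable length.
    return db_per_km * km

for km in (1, 3, 5):
    print(f"{km} km: copper {loss_db(COPPER_DB_PER_KM, km):.1f} dB, "
          f"fibre {loss_db(FIBRE_DB_PER_KM, km):.2f} dB")
```

At 5km the copper pair has lost around 75 dB, which is why long loops can’t sync at all, while the fibre has lost under 2 dB, which the optics barely notice.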

Now, you will remember I also said that I agree the trend is wireless. So if this is the case, how do we continue to service this trend while overcoming the issues I’ve mentioned here? Well, as I said, one of the methods is to increase cell density, which involves pretty much the same thing we have been doing until now: a select few who actually need the portability can pay for a (relatively expensive) mobile broadband option, and the rest of us can use fixed-line connections with a WiFi access point at the end. It has worked pretty well so far, so why do we all need to suddenly make the switch to completely wireless?

There are other technologies that can help us, like public WiFi from every home. In the UK, BT currently takes a small amount of bandwidth from every customer’s ADSL connection (with an option to opt out) and dedicates it to a separate network, BT Openzone, which anyone who uses their ADSL, or has an Openzone account, can use. AT&T in the US currently has MicroCells, little devices that boost 3G signal in the home by acting as repeaters, and T-Mobile in the US has recently started offering WiFi calling on their Android G2, where you can make a call over the WiFi network via their VoIP services.

The truth is, we need fibre, and it’s not an if, it’s a when. And it has been projected that if we wait any longer it will cost us more to deliver. So Australia, suck it up, otherwise we really will be in the dark ages when it comes to telecommunications. I’m not saying the NBN 2.0 is a perfect plan, but I have yet to see a better one offered, and, if pressed, I’d say this project would be perfectly fine, provided Labor stopped trying to tell us that you will be able to get a strong ROI at the end of it, because if that were the case, someone would have started already.

Written by NightKhaos

October 12, 2010 at 12:11 pm

Posted in NBN, Technology
