Monday 23 April 2012

LTE efficiency @ 10MHz


A recent 4G throughput test campaign from PC World in the US produced the graph above. The whole report can be found here, but the interesting point to note is that it is essentially a comparison of the efficiency of LTE vs HSPA+ using the same bandwidth. Allow me to explain...

Both AT&T and Verizon are currently operating LTE networks using 10MHz of spectrum in the 700MHz band. T-Mobile on the other hand is operating an HSPA+ network using the Dual Cell/Carrier feature which it labels 4G (or fauxG if you are a purist). As the Dual Cell/Carrier feature essentially aggregates two 5MHz carriers the total amount of spectrum is the same (i.e. 10MHz).

Looking at the results, it is clear that LTE performs better than HSPA+, but the difference is not that big. Both technologies are using 10MHz of spectrum and are capable of 64QAM. LTE has the advantage of MIMO, which in good radio conditions can double the data rate.
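To see roughly where the MIMO gain comes from, here is a back-of-the-envelope sketch of the LTE peak rate at 10MHz with 64QAM. The overhead figure is my own rough assumption for control and reference signals, not an exact 3GPP number:

# Rough peak-rate estimate for LTE at 10MHz (50 resource blocks), 64QAM.
PRBS = 50                  # resource blocks in a 10MHz carrier
SUBCARRIERS_PER_PRB = 12
SYMBOLS_PER_MS = 14        # normal cyclic prefix: 2 slots of 7 symbols per 1ms subframe
BITS_PER_SYMBOL = 6        # 64QAM
OVERHEAD = 0.27            # assumed share of resource elements lost to control/reference signals

def peak_mbps(mimo_layers):
    raw_kbps = PRBS * SUBCARRIERS_PER_PRB * SYMBOLS_PER_MS * BITS_PER_SYMBOL * mimo_layers
    return raw_kbps * (1 - OVERHEAD) / 1000   # bits per ms == kbps; /1000 -> Mbps

print(f"Single stream: ~{peak_mbps(1):.0f} Mbps")   # ~37 Mbps
print(f"2x2 MIMO:      ~{peak_mbps(2):.0f} Mbps")   # ~74 Mbps, close to the 73Mbps peak quoted below

With two spatial layers the raw rate simply doubles, which is why 2x2 MIMO roughly doubles the peak in good radio conditions.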

The difference between T-Mobile and AT&T is 3.6Mbps and the difference between T-Mobile and Verizon is 1.8Mbps.

But what about the difference between the theoretical peak and the actual average rate? LTE @ 10MHz should deliver 73Mbps. In the tests the best network delivered 9.12Mbps. That is 12.5% of the max.

HSPA+ Dual Cell/Carrier should deliver a theoretical peak of 42Mbps. In the tests T-Mobile achieved 5.53Mbps. That is 13.2% of the max, which is a little better than LTE. We also need to remember that T-Mobile's HSPA+ network is a mature network with a lot of existing traffic. As LTE devices become more popular, we can expect the LTE networks to slow down due to resource sharing.
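Just to show the arithmetic behind those percentages, using the figures quoted above:

# Measured averages vs theoretical peaks from the test figures quoted above.
networks = {
    "LTE (best network, 10MHz)":     {"measured": 9.12, "peak": 73.0},
    "T-Mobile HSPA+ DC (2 x 5MHz)":  {"measured": 5.53, "peak": 42.0},
}

for name, n in networks.items():
    pct = 100 * n["measured"] / n["peak"]
    print(f"{name}: {n['measured']} of {n['peak']} Mbps -> {pct:.1f}% of the theoretical max")

# -> 12.5% for LTE, 13.2% for HSPA+ Dual Cell/Carrier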

One last point to mention is that it is possible to combine Dual Cell/Carrier with MIMO. Although T-Mobile doesn't offer this capability, it would have been interesting to see how close the results would have been. Maybe HSPA+ would even out-perform LTE?!

Monday 9 April 2012

LTE Default Bearer concept


If you have started looking into LTE you will no doubt soon come across the concept of the default bearer. Even though in the beginning I was a bit confused by it, it turns out to be quite simple to understand, especially if you have some background in UMTS or GPRS, as it is very similar to a primary PDP context.

The picture above highlights the differences in terminology quite well, and as you can see, besides the naming conventions the concepts are identical.

In UMTS/GPRS, the UE requests the establishment of a PDP context. This creates a logical end-to-end "pipe" between the UE and the GGSN. The process of establishing a PDP context also assigns the UE an IP address. Once a primary PDP context is established, secondary PDP contexts can also be established, which use the same IP address but require different QoS. Even though these secondary PDP contexts have been defined in the specs since the beginning of GPRS, to my knowledge they were never really implemented.

In LTE, the UE requests the establishment of a PDN connection. This creates a logical end-to-end "pipe" between the UE and the PGW. The UE is also assigned an IP address (which can be IPv4 or IPv6) and the default bearer is set up. This default bearer is always best effort. Unlike UMTS/GPRS, where the UE can request a PDP context at any time, in LTE the PDN connection for the establishment of the default bearer is always requested at power-on. This ensures that the UE always has an IP address, which in a packet-based system like LTE is a necessity. If the UE then requires QoS different from best effort, a dedicated bearer can be set up. This will be a necessity when things like voice services are offered over LTE, but it could also be used when, for example, a streaming session or a Skype/FaceTime session is set up.
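As a minimal sketch of the idea (the class and function names below are mine, not 3GPP signalling; the only standardised values used are QCI 9 for a best-effort bearer and QCI 1 for conversational voice):

from dataclasses import dataclass, field

@dataclass
class Bearer:
    qci: int            # QoS Class Identifier: 9 = non-GBR best effort, 1 = conversational voice
    dedicated: bool

@dataclass
class PdnConnection:
    apn: str
    ip_address: str     # assigned when the default bearer is set up
    bearers: list = field(default_factory=list)

def attach(apn: str, ip_address: str) -> PdnConnection:
    """At power-on the UE attaches and always gets a PDN connection
    with one best-effort default bearer (and hence an IP address)."""
    pdn = PdnConnection(apn=apn, ip_address=ip_address)
    pdn.bearers.append(Bearer(qci=9, dedicated=False))    # default bearer, best effort
    return pdn

def add_dedicated_bearer(pdn: PdnConnection, qci: int) -> Bearer:
    """Later, a dedicated bearer with different QoS (e.g. QCI 1 for voice)
    can be added to the same PDN connection and IP address."""
    bearer = Bearer(qci=qci, dedicated=True)
    pdn.bearers.append(bearer)
    return bearer

pdn = attach(apn="internet", ip_address="10.0.0.1")   # default bearer at power-on
add_dedicated_bearer(pdn, qci=1)                      # e.g. when a voice call starts
print([(b.qci, b.dedicated) for b in pdn.bearers])    # [(9, False), (1, True)]

The key point the sketch tries to capture is that the default bearer (and the IP address) exists from attach onwards, while dedicated bearers come and go with the services that need them.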

Finally, if you are wondering how the network will know that a dedicated bearer is needed, this is done by deep packet inspection, most likely driven by the PCRF (Policy and Charging Rules Function) node.
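Purely as an illustration of that decision (the rule table and function below are hypothetical, not an actual PCRF interface; the QCI values are the standardised ones for those service types):

from typing import Optional

# Hypothetical mapping from a detected service type to the QCI of the
# dedicated bearer the network would set up for it.
SERVICE_TO_QCI = {
    "voice":      1,   # conversational voice
    "video_call": 2,   # conversational video
    "streaming":  6,   # buffered streaming (non-GBR)
}

def bearer_for_detected_flow(service_type: str) -> Optional[int]:
    """Return the QCI for a dedicated bearer, or None to stay on the
    best-effort default bearer (QCI 9)."""
    return SERVICE_TO_QCI.get(service_type)

print(bearer_for_detected_flow("voice"))          # 1 -> set up a dedicated bearer
print(bearer_for_detected_flow("web_browsing"))   # None -> stays on the default bearer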