Tuesday 27 March 2012

LTE worlds apart



Back in 2009, Telia in Sweden launched the first commercial LTE network. Three years down the line, their website carries little more than a mention of some mobile broadband packages and that is about it. Looking at some other European LTE operators, things don't get much more exciting.

Jump over the pond to the USA and things are much different. LTE 4G is plastered all over Verizon's and AT&T's websites: there are flames, direct comparisons of download speeds between the two operators, commercials about LTE, smartphones, tablets, MiFi units, the iPad, the lot!

Sure, most of it is just commercial hype, but LTE desperately needed this American-style push, as the European marketing approach was very boring. Now it seems to have started to trickle through, as a few European operators have begun announcing LTE-capable smartphones. Let's see how things develop..


Monday 26 March 2012

Power consumption in different RRC states


A few posts ago I looked at the current drain of 3G Vs 2G during a voice call. For this post I thought it would be interesting to quantify the current drain of a UE in the different RRC states. This helps explain the need for fast dormancy, DRX optimisation and careful management of RRC state transitions, all from a UE battery perspective.

Looking at the graph (again obtained using Nokia's energy profiler app), we start off in RRC IDLE with the current drain at approx. 60mA. The web browser is launched and the UE transitions to RRC DCH via the RRC Connection Setup procedure. A PDP context is requested and, following the authentication procedure, the UE is assigned an HSPA radio bearer. Current drain is approx. 260mA. The actual web page requested was the Google homepage, so the data transfer completes very quickly. From then on the RRC inactivity timer on the RNC runs, and after 5s the UE is transitioned to RRC FACH. The current drain is now approx. 150mA. The reduction is due to the fact that in CELL_FACH the UE can switch off its transmitter; it still has to keep its receiver on, however, as the RNC can schedule a data transfer at any time.

On this network the timer to transition from FACH to URA_PCH is 30s, which is quite long. This clearly shows that operators can sometimes have network settings that are not very battery friendly. If the UE supported fast dormancy (the specific one in question did not), it could have released the RRC connection and moved back to IDLE much sooner. As it didn't, after the 30s of inactivity the UE is instructed to move to RRC PCH (URA_PCH on this network). As the PCH DRX cycle is configured at the same value as the IDLE DRX cycle (on this specific network), the current drain is the same at approx. 60mA.
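
To put some numbers on this, here is a minimal back-of-the-envelope sketch in Python using the currents and timers quoted above. The state currents and timer values are specific to this network and handset, and the 2s fast dormancy figure is a hypothetical value for illustration.

```python
# Rough energy cost of the "idle tail" after a short data burst, using the
# approximate figures from the graph above (illustrative, not general values).
STATE_CURRENT_MA = {"DCH": 260, "FACH": 150, "URA_PCH": 60, "IDLE": 60}

def tail_energy_mah(dwell_times):
    """Energy spent in each RRC state after the data transfer has finished.

    dwell_times: list of (state, seconds spent in that state) tuples.
    Returns energy in mAh (current x time / 3600).
    """
    return sum(STATE_CURRENT_MA[state] * secs for state, secs in dwell_times) / 3600

# This network: 5s in DCH, then 30s in FACH before dropping to URA_PCH.
no_fast_dormancy = tail_energy_mah([("DCH", 5), ("FACH", 30)])

# Hypothetical fast dormancy: release the connection after ~2s and go to IDLE.
fast_dormancy = tail_energy_mah([("DCH", 2)])

print(f"Tail energy without fast dormancy: {no_fast_dormancy:.2f} mAh")
print(f"Tail energy with fast dormancy:    {fast_dormancy:.2f} mAh")
```

Roughly 1.6mAh versus 0.1mAh per burst; multiply that by the many short background transfers a smartphone generates every hour and the impact of these timers on battery life becomes obvious.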

Saturday 10 March 2012

73Mbps LTE for new iPad?



I was watching the Apple presentation last week and found it a bit odd that Philip Schiller quoted an LTE maximum speed of 73Mbps for the new iPad. It is common knowledge that Qualcomm currently produce category 3 LTE chipsets, which support a theoretical maximum speed of 100Mbps, so why 73Mbps?

Well, it turns out it is related to the messy spectrum situation in the U.S. and the fact that the two biggest carriers that will launch the new iPad (Verizon and AT&T) only have 10MHz blocks for LTE at the moment.

LTE allows for flexible carrier bandwidths (up to 20MHz) and the number of Resource Blocks is tied to the amount of spectrum, with each Resource Block occupying 180kHz. With a 10MHz block, 50 Resource Blocks are available.

Looking at 3GPP TS 36.213, we find a table that indicates the maximum transport block size for a given number of Resource Blocks per 1ms TTI.



If we look at the column for 50 Resource Blocks, we find that the maximum TB size is 36696 bits.

As there are 1000 TTIs per second, that gives 36696 x 1000 = 36696000 bits per second. And as LTE allows for 2x2 MIMO, this is doubled to 73392000 bits per second, or roughly 73Mbps, which is the figure Apple quoted.
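
The same arithmetic in a few lines of Python, should you want to repeat it. The Resource Block counts per bandwidth are the standard LTE values; the 36696-bit transport block size is the one read from the 36.213 table above, and other bandwidths would need their own TBS looked up in the same table.

```python
# Peak LTE downlink rate from the 36.213 maximum transport block size.
RESOURCE_BLOCKS = {1.4: 6, 3: 15, 5: 25, 10: 50, 15: 75, 20: 100}  # MHz -> RBs

TTIS_PER_SECOND = 1000   # one transport block per 1ms TTI
MIMO_LAYERS = 2          # 2x2 MIMO doubles the peak rate

max_tb_bits = 36696      # 36.213 max TBS for 50 RBs (10MHz carrier)

peak_bps = max_tb_bits * TTIS_PER_SECOND * MIMO_LAYERS
print(f"10MHz ({RESOURCE_BLOCKS[10]} RBs): {peak_bps / 1e6:.1f} Mbps")
# -> 73.4 Mbps, the figure Apple quoted
```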

Of course this is a theoretical maximum speed, which requires perfect RF conditions and a single user occupying all the resources in the cell. Both are quite unlikely (outside a lab), so real speeds will vary.


Friday 9 March 2012

3G Vs 2G power consumption


Most people are aware of the increased power consumption of 3G Vs 2G. In fact, quite a lot of people "lock" their phones to 2G as a way of increasing battery life. For this post I wanted to quantify the difference between the two, but in a more novel way than the usual comparison charts. So I thought it would be a good idea to measure the current drain of a voice call on 3G, perform a handover to 2G while still measuring, and then compare the two. Using Nokia's energy profiler and the right radio conditions, the chart above was produced (x axis is seconds, y axis is current in mA)..

The various phases are:

1. UE is in idle mode on 3G, current drain is approx. 50mA

2. CS call is set up on 3G, current drain increases to approx. 220mA

3. Screen dims, current drain falls to 200mA

4. Radio conditions start to deteriorate. The UE is instructed to power up while, at the same time, compressed mode is activated, which requires higher transmit power due to the spreading factor reduction

5. The UE performs a handover to 2G and the current drain reduces to approx. 80mA. Finally, the call is terminated right at the end of the graph

Although battery life has improved over the years, it is some inherent differences between 2G and 3G that account for the increased current drain. The main one is that in 3G the UE is required to transmit/receive the DPCCH continuously, while in 2G the UE only transmits/receives during one timeslot out of 8 (the GSM TDMA frame structure) and can shut down its transmitter/receiver in between. The difference in modulation scheme (GMSK Vs QPSK) also helps 2G: GMSK's constant envelope allows a more power-efficient amplifier.
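
As a quick sanity check of the duty-cycle argument, here is a minimal sketch using the figures read off the graph. The 2G burst current is a made-up illustrative value chosen so the numbers line up; the baseline and 3G figures are from the measurement above.

```python
# Duty-cycle view of the 2G vs 3G gap (illustrative assumptions throughout).
baseline_ma = 50          # measured idle drain (screen, CPU, etc.)
wcdma_call_ma = 220       # measured 3G call drain: continuous DPCCH TX/RX

# GSM: 1 TX slot + 1 RX slot out of 8 in a TDMA frame, radio off in between.
gsm_duty_cycle = 2 / 8
gsm_burst_ma = 120        # hypothetical radio current while a slot is active

gsm_call_ma = baseline_ma + gsm_burst_ma * gsm_duty_cycle
print(f"3G call (continuous radio):  {wcdma_call_ma} mA (measured)")
print(f"2G call (duty-cycled radio): {gsm_call_ma:.0f} mA (measured: ~80 mA)")
```

Even if the 2G radio drew as much instantaneous current as the 3G radio, the 2/8 duty cycle alone would cut the average radio contribution by 75%.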

So what about LTE? Even though it is early days for voice calls over LTE, the standard allows for DRX/DTX in connected mode, so it is possible that current consumption will actually improve over 3G. Let's see..

Sunday 4 March 2012

Femtocell bandwidth utilisation


Since femtocells launched a few years back, I have been truly amazed by them. To me they offer a glimpse into the future, where wireless communications will be low power, self-regulating & self-optimising. Vodafone in the UK (and in other markets) has been offering a femto product for a while now, mainly aimed at the consumer segment and especially at people who don't get macro coverage at home. I happen to have one at home (they only cost 50GBP) and have been looking at it a bit more closely lately.

For this post, I "hubbed out" of the femto and monitored the bandwidth that CS calls consume on my ADSL line. The femto product that Vodafone use (a.k.a. Sure Signal) can support a maximum of 4 simultaneous users, so I tested it up to that limit.

As can be seen from the Wireshark graph below, a single CS call generates about 60kbps of throughput (in one direction). Adding more calls increases this roughly linearly, to approx. 250kbps when the maximum of 4 CS calls are in progress. Considering a UMTS CS call using the standard AMR codec runs at 12.2kbps, this means the overhead is approx. 48kbps, or around 400%!

Whether this is an issue or not very much depends on someone's ADSL package. With a low-end ADSL package, where the uplink can be as low as 512kbps, this represents a considerable portion of it. With better ADSL packages things obviously improve, and I guess the probability that 4 people in your home will be on the phone at the same time is rather slim..

For those that are wondering, the uplink vs downlink utilisation is almost the same, and the actual payload from the femto is encapsulated using IPsec.
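
The ~60kbps figure is plausible once you stack up the per-packet headers. Here is a minimal sketch, assuming 20ms AMR frames carried over RTP/UDP/IP inside an IPsec ESP tunnel; the exact framing the femto uses will differ, so the header sizes below are rough assumptions.

```python
# Rough per-call bandwidth estimate for AMR 12.2 inside an IPsec tunnel.
# Header sizes are typical values, assumed here purely for illustration.
FRAME_INTERVAL_S = 0.02      # AMR produces one speech frame every 20ms
amr_payload_bytes = 31       # ~244 bits of AMR 12.2 speech data per frame

header_bytes = {
    "RTP": 12,
    "UDP": 8,
    "inner IP": 20,
    "ESP tunnel (outer IP + ESP header/IV/padding/auth)": 53,
}

packet_bytes = amr_payload_bytes + sum(header_bytes.values())
packets_per_s = 1 / FRAME_INTERVAL_S        # 50 packets per second
rate_kbps = packet_bytes * 8 * packets_per_s / 1000

print(f"Packet size: {packet_bytes} bytes -> {rate_kbps:.0f} kbps per call")
# -> ~50 kbps; link-layer framing and signalling take it towards the
#    ~60 kbps per call observed in Wireshark
```

The 12.2kbps codec payload is simply dwarfed by the tunnelling overhead, which is why four calls eat a quarter of a megabit.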

More femto insights will follow in future posts..