Sunday 25 November 2012

LTE RF conditions classification


It is common sense that the performance of any wireless system depends directly on the prevailing RF conditions. To aid performance analysis, we typically define ranges of RF measurements that correspond to the typical conditions a user might find themselves in.

When it comes to LTE, I came across the above table, which presents a good classification. The source of this table is an E-UTRAN vendor, and it was compiled during the RF tuning process for a major US operator. Of course there are no fixed rules as to how various RF conditions are classified, so different tables will exist, but to a great extent you can expect them to align.

In this particular example, three measurement quantities are used: RSRP (Reference Signal Received Power), RSRQ (Reference Signal Received Quality) and SINR (Signal to Interference & Noise Ratio).

RSRP is a measure of signal strength. It is the most important of the three, as it is used by the UE for the cell selection and reselection process and is reported to the network to aid in the handover procedure. For those used to working in UMTS WCDMA, it is equivalent to CPICH RSCP.

The 3GPP spec description is "The RSRP (Reference Signal Received Power) is determined for a considered cell as the linear average over the power contributions (Watts) of the resource elements that carry cell specific Reference Signals within the considered measurement frequency bandwidth."

In simple terms the Reference Signal (RS) is mapped to Resource Elements (RE). This mapping follows a specific pattern (see below). So at any point in time the UE will measure all the REs that carry the RS and average the measurements to obtain an RSRP reading.
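To make the "linear average" part of the definition concrete, here is a minimal Python sketch. The RE power values are made up for illustration; the point is that the averaging happens in the linear (mW) domain, not directly on the dBm figures.

```python
import math

def dbm_to_mw(dbm):
    """Convert a dBm value to milliwatts."""
    return 10 ** (dbm / 10.0)

def mw_to_dbm(mw):
    """Convert milliwatts back to dBm."""
    return 10.0 * math.log10(mw)

def rsrp_dbm(rs_re_powers_dbm):
    """RSRP: linear average of the per-RE Reference Signal powers."""
    linear = [dbm_to_mw(p) for p in rs_re_powers_dbm]
    return mw_to_dbm(sum(linear) / len(linear))

# Example: four RS REs measured at slightly different levels (hypothetical).
print(round(rsrp_dbm([-95.0, -96.0, -94.0, -95.0]), 1))  # -94.9
```

Note the result sits slightly above the plain dB average of -95, because the stronger REs dominate a linear average.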


RSRQ is a measure of signal quality. It is measured by the UE and reported back to the network to aid in the handover procedure. For those used to working in UMTS WCDMA, it is equivalent to CPICH Ec/N0. Unlike UMTS WCDMA though, it is not used for the process of cell selection and reselection (at least in the Rel-8 version of the specs).

The 3GPP spec description is "RSRQ (Reference Signal Received Quality) is defined as the ratio: N×RSRP/(E-UTRA carrier RSSI) where N is the number of Resource Blocks of the E-UTRA carrier RSSI measurement bandwidth."

The new term that appears here is RSSI (Received Signal Strength Indicator). RSSI is effectively a measurement of all of the power contained in the applicable spectrum (1.4, 3, 5, 10, 15 or 20MHz). This could be signals, control channels, data channels, adjacent cell power, background noise, everything. As RSSI applies to the whole spectrum, we need to multiply the RSRP measurement by N (the number of resource blocks), which effectively applies the RSRP measurement across the whole spectrum and allows us to compare the two.
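In the dB domain the ratio turns into a simple subtraction, which is a handy sanity check on drive test logs. Here is a small sketch with hypothetical input values (a 10MHz carrier has N = 50 resource blocks):

```python
import math

def rsrq_db(rsrp_dbm, rssi_dbm, n_rb):
    """RSRQ = N * RSRP / RSSI, expressed in dB:
    10*log10(N) + RSRP(dBm) - RSSI(dBm)."""
    return 10.0 * math.log10(n_rb) + rsrp_dbm - rssi_dbm

# Example: RSRP -95 dBm, RSSI -65 dBm over a 10 MHz (50 RB) carrier.
print(round(rsrq_db(-95.0, -65.0, 50), 1))  # -13.0
```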

Finally SINR is a measure of signal quality as well. Unlike RSRQ, it is not defined in the 3GPP specs but defined by the UE vendor. It is not reported to the network. SINR is used a lot by operators, and the LTE industry in general, as it better quantifies the relationship between RF conditions and throughput. UEs typically use SINR to calculate the CQI (Channel Quality Indicator) they report to the network.

The components of the SINR calculation can be defined as:

S: indicates the power of measured usable signals. Reference signals (RS) and physical downlink shared channels (PDSCHs) are mainly involved

I: indicates the power of measured signals or channel interference signals from other cells in the current system

N: indicates background noise, which is related to measurement bandwidths and receiver noise coefficients
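Since SINR is a ratio of powers, the S, I and N terms have to be summed in the linear domain before converting back to dB. A quick sketch, with made-up example levels:

```python
import math

def db_to_lin(db):
    """Convert dBm to linear milliwatts."""
    return 10 ** (db / 10.0)

def sinr_db(s_dbm, i_dbm, n_dbm):
    """SINR = S / (I + N), with I and N summed in the linear (mW) domain."""
    ratio = db_to_lin(s_dbm) / (db_to_lin(i_dbm) + db_to_lin(n_dbm))
    return 10.0 * math.log10(ratio)

# Example: signal -90 dBm, other-cell interference -100 dBm, noise -105 dBm.
print(round(sinr_db(-90.0, -100.0, -105.0), 1))  # 8.8
```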

So that is it! I have also included a real-life measurement from a Sierra Wireless card that includes the above-mentioned metrics, so you can see what the typical output from a UE looks like. Using that and the table above, you should be able to deduce the RF condition category it is in at the time of measurement.
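Since the table itself is shown as an image, here is a sketch of the lookup in Python. The threshold values below are purely illustrative, chosen to resemble commonly used ranges; the actual table's numbers may differ.

```python
# Illustrative thresholds only (name, min RSRP dBm, min RSRQ dB, min SINR dB).
# These are NOT the values from the vendor table in the post.
BANDS = [
    ("Excellent", -80.0, -10.0, 20.0),
    ("Good",      -90.0, -15.0, 13.0),
    ("Mid cell", -100.0, -20.0,  0.0),
]

def classify(rsrp_dbm, rsrq_db, sinr_db):
    """Return the first band whose thresholds are all met, else cell edge."""
    for name, rsrp_min, rsrq_min, sinr_min in BANDS:
        if rsrp_dbm >= rsrp_min and rsrq_db >= rsrq_min and sinr_db >= sinr_min:
            return name
    return "Cell edge"

# Example reading, similar in spirit to a datacard's output.
print(classify(-85.0, -12.0, 15.0))  # Good
```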





Sunday 18 November 2012

Category 4 LTE UEs in the market


It is interesting to see that the first category 4 LTE UEs are appearing in the market. Up to now the vast majority (if not all) of LTE devices have been category 3. The device in question is Huawei's E3276 USB dongle. Category 4 LTE UEs can reach theoretical physical layer DL throughputs of up to 150Mbps. The uplink throughput stays the same at 50Mbps.

The E3276 is also LTE penta-band capable (LTE FDD 800/900/1800/2100/2600) which means that it will operate in pretty much every country that has launched LTE.

The complete 3GPP release 8 LTE UE category table can be seen below.

Moving to the next category, 5, is not going to be that easy, as it requires 4x4 MIMO in the DL. This means operators will need to deploy two additional antennas on the base station, and UE vendors will need to squeeze another two antennas into the device. Possible on a tablet, but anything smaller is going to be a challenge. In case you are wondering, the increase in the uplink performance of a category 5 device is due to the use of 64QAM as opposed to 16QAM.
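A quick back-of-the-envelope check on that uplink point: moving from 16QAM (4 bits per symbol) to 64QAM (6 bits per symbol) scales the peak rate by 6/4, all else being equal, which matches the 75Mbps category 5 uplink figure.

```python
# Category 4 uplink peak rate with 16QAM (4 bits/symbol).
cat4_ul_mbps = 50

# Category 5 swaps in 64QAM (6 bits/symbol): rate scales by 6/4.
cat5_ul_mbps = cat4_ul_mbps * 6 / 4
print(cat5_ul_mbps)  # 75.0
```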

Wednesday 7 November 2012

Unhappy with 4G? Lock to 3G!

For years now, buying a 3G handset and locking it to 2G has been quite popular. As reported here, this was mainly done to get around poor battery life in 3G. Of course this would have an obvious impact on data performance, as GPRS/EDGE networks struggle to deliver anything faster than a couple of hundred kbps at best. From an operator (carrier for our US readers) point of view this was bad, as it meant the legacy 2G network was still seeing quite a lot of traffic.

Considering the above it was interesting to see what the options are for 4G capable devices. A screenshot from the Network Mode menu of a Samsung SIII LTE is shown below.


As you can see, gone is the option to lock to 2G; the two options now are either 2G/3G/4G (i.e. auto) or 3G only. I guess the option to disable 4G has to be there, given battery life, the stability of early 4G networks and the problems with voice call support. But at least in my opinion, the second choice should be 3G/2G as opposed to 3G only. I guess this will change depending on the device manufacturer, so it will be interesting to see what others do.

Tuesday 6 November 2012

Chance encounter with a femto cell



I was recently performing some drive tests in a busy city centre and came across a femto cell in closed access mode. It was a chance encounter, and I was only alerted to its presence by the automated TEMS voice indicating a failed Location Update. The first thing to note, as indicated by the map below, is that this location should in theory have perfect coverage. What this shows (perhaps of no surprise) is that even in city centres coverage holes do exist, and people are forced to install femto cells for in-house coverage.


The second thing to note is that this is probably the worst place to install a closed access femto cell. The road is a busy one, and to make things even worse there is a bus stop just outside! The amount of failed location updates must be in the hundreds, if not thousands, per day as a continuous stream of cars, buses and pedestrians passes through. A true closed access femto cell nightmare. The signalling trace below shows the procedures involved as the UE tries to camp on the femto cell and gets rejected (cause code 13).

The time elapsed from reselecting the femto cell, failing to perform a LAU, returning to the macro and performing a further LAU/RAU (not shown in the trace) was approx. 6 seconds. So quite a large "outage" from the customer's point of view. From a femto point of view, we can expect the LAU attempt to be handled locally (i.e. by the femto itself). The subsequent LAU/RAU back on the macro however is handled normally, and as such the load on the core network is measurable.

So what can be done in cases like this? Femto cells can be installed anywhere, so how can the operator protect themselves? A few things come to mind. First, detecting the problem. This could be easily solved by looking at failed LAU attempt counters from the femto (if the manufacturer has implemented them). For an even more obvious detection method, an NMC alarm could be created for when the amount of failed LAUs exceeds a particular threshold. Once the problem femto is detected, its CPICH could be reduced so it is not over-propagating into the street.
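The alarm idea is simple enough to sketch. Everything below is hypothetical: the counter source, the femto IDs and the threshold value are assumptions for illustration, not a real vendor API.

```python
# Illustrative daily threshold for the NMC alarm (assumed value).
FAILED_LAU_THRESHOLD = 500

def femtos_to_flag(failed_lau_counts):
    """Return the femto IDs whose daily failed-LAU count breaches the threshold."""
    return [femto_id for femto_id, count in failed_lau_counts.items()
            if count > FAILED_LAU_THRESHOLD]

# Made-up daily counters: the bus-stop femto stands out immediately.
counters = {"femto-A": 12, "femto-B": 1800, "femto-C": 340}
print(femtos_to_flag(counters))  # ['femto-B']
```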

Sunday 4 November 2012

WCDMA code tree fragmentation and what to do about it

Managing the downlink code tree in WCDMA is an important Radio Resource Management function. As a WCDMA system starts accepting traffic, various branches of the code tree will be blocked. Ensuring that the code tree is as compact as possible enables the system to freely allocate higher branches (lower SF). For HSDPA-heavy networks this is key in ensuring that the scheduler has the maximum number of SF16 codes available in any TTI.

Most UTRAN vendors manage this during the set-up phase. So, for example, a CS 12.2kbps voice call (SF128) will get the left-most SF128 code available. This ensures that the right-hand side of the code tree can be allocated to HSDPA. In high traffic situations however, as calls continuously get set up and released, even this approach might lead to fragmentation. As an example, assume that at time X, SF128,48 was the left-most SF128 code available and it gets allocated to a UE. At time X+1 however, a number of connections might have been released so that SF128,30 is now available. If the original call on SF128,48 is still ongoing, the space between codes 30 and 48 cannot be utilised.

A solution to this problem is to use dynamic code tree management, which is what this network is using. As the trace extract shows, at RRC Radio Bearer Setup the UE was allocated SF128-37.




Then as time progresses, a number of calls get released and the RNC instructs the UE to switch to SF128-30 as this is the left most SF128 available. This switch is signalled to the UE using the RRC Physical Channel Reconfiguration message, as the spreading operation in WCDMA is covered by the physical layer.




This procedure can then further repeat itself, depending on how much traffic there is and how long the connection lasts.
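To make the two strategies concrete, here is a toy Python sketch of a single-SF code pool. It is purely illustrative: real OVSF allocation works on the whole tree across spreading factors, not a flat list, but the left-most allocation and the repacking logic are the same in spirit.

```python
class CodePool:
    """Toy model of the SF128 codes of a cell (flat list, illustrative only)."""

    def __init__(self, sf=128):
        self.sf = sf
        self.used = set()

    def allocate(self):
        # Set-up phase strategy: take the left-most free code.
        for code in range(self.sf):
            if code not in self.used:
                self.used.add(code)
                return code
        return None  # pool exhausted

    def release(self, code):
        self.used.discard(code)

    def repack(self):
        # Dynamic management: move the right-most used code into the
        # left-most free slot until the pool is compact. In the real
        # system each move is signalled to the UE via an RRC Physical
        # Channel Reconfiguration message.
        moves = []
        while self.used:
            free = min(set(range(self.sf)) - self.used, default=None)
            busy = max(self.used)
            if free is None or free > busy:
                break  # already compact
            self.used.remove(busy)
            self.used.add(free)
            moves.append((busy, free))
        return moves

pool = CodePool()
for _ in range(5):
    pool.allocate()          # codes 0..4 in use
pool.release(1)
pool.release(3)              # fragmentation: holes at 1 and 3
print(pool.repack())         # [(4, 1)] - code 4 is moved left into slot 1
```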