By Tiffany Fox, (858) 246-0353, email@example.com
San Diego, Calif., May 17, 2010 — Smartphones have become powerful pocket PCs, and with usage skyrocketing, you might think industry giant Qualcomm, Inc., would be over the moon.
And you'd be right. Except for the small matter of what happens when everyone tries to use his or her smartphone at once.
"San Diego, we have a problem," joked Roberto Padovani, executive vice president and chief technology officer at Qualcomm and an adjunct professor of electrical and computer engineering at UC San Diego’s Jacobs School of Engineering. For a May lecture at UCSD, Padovani described “A Day in the Life of a Smartphone.” He outlined some of the challenges facing the macro-cellular networks, which must cope with the increasing number of data connected devices. Over 1 billion 3G devices are in use today and the smartphone segment, which represents just over 10 percent of total handset sales today, is expected to reach 38 percent of total annual handset sales by 2014, a staggering 600 million units per year.
"It's true that most of us in the industry like to see this kind of growth," Padovani explained during the lecture, which was organized by graduate students in the Electrical and Computer Engineering (ECE) department and co-sponsored by ECE, the UCSD Division of the California Institute for Telecommunications and Information Technology (Calit2), and UCSD’s Center for Wireless Communications (CWC). "The fastest-growing segment of data devices is the smartphone, which is expected to outnumber the PC by 2011.”
But it turns out that when thousands of smartphones are connecting to a cellular network in one concentrated area — at a football stadium during game time, for example, or in major metropolitan areas like New York City — the networks become stressed, leading to blocked Internet access and dropped connections.
Explained Padovani: "In these cases the networks weren't failing because of too much data transfer, since 40 percent of connections transferred less than 100 bytes. They were failing because the total number of connections exceeded expectations." Original industry network simulation models were built on the idea that PCs would be connecting infrequently, and for the purpose of transferring large amounts of data. While this happens, smartphones more often do the opposite: they connect very frequently and transfer only small amounts of data.
Last year, Qualcomm began looking for ways to mitigate this network stress by researching the problem at its source. By logging usage data at a number of college football games and in heavily populated cities, Qualcomm was able to refine the network simulation models.
"Some of these events — the Superbowl, concerts, etc. — can be easier to plan for because they are both predictable and temporary. We can guess at the total number of connections per event per operator. But every single device and application, from the iPhone to Windows Mobile to the Android to the Blackberry, does something different, and that can make it challenging to build an accurate simulation model."
Part of the problem, Qualcomm discovered, was that the industry never expected smartphones, with their emphasis on data transfer, to become so popular. Padovani recalled telling a group of executives in 1997 that mobile communications systems would become data- rather than voice-centric and "they almost threw me out of the room."
Padovani and his colleagues at Qualcomm knew something needed to be done, and quickly. The processing power of smartphones has grown by a factor of 100 since 2000, and worldwide monthly mobile data traffic is expected to rise by a factor of 39, to over 3 million terabytes per month, by 2014. Current usage patterns, Padovani noted, average 41.8 minutes per day for voice applications, 33.1 minutes for texting and 65.5 minutes for data.
One solution proved to be relatively simple: leveraging the more accurate simulations, existing network parameters could be re-optimized for the new paradigm of a "smartphone world." With further optimization, Qualcomm estimates that existing networks could accommodate up to five times as many of these "chatty" applications, making more connections available more often.
"But the expectation is that the volume of data traffic is just going to go higher and higher," Padovani remarked, adding that a number of other measures must be taken to allow for not only higher connection efficiency but also for higher spectral efficiency per unit area. Optimizing voice capacity to free up spectrum for data is one option, he noted, as is off-loading to Wi-Fi or femtocells and operator deployed pico-cells (which are basically small cellular base stations for high traffic-density areas).
"But even though these user or operator deployable nodes introduce more hotspots," he added, "that brings up the question of signal interference. Leveraging the topology and bringing the network closer to the user makes sense, but it's not easy. Traditional optimization in wireless technology is reaching its theoretical limit. There's quite a number of challenging problems to be solved in this new paradigm."
Coming to the rescue, in part, is the U.S. government, which issued its comprehensive National Broadband Plan in March, calling for spectrum to be freed up, and fast. As the plan notes, spectrum re-allocation for cellular technologies alone took 11 years. And every moment wasted is money down the drain: the contribution of wireless services to overall GDP grew over 16 percent annually from 1992 to 2007, compared with less than 3 percent for the rest of the economy.
"This is definitely a good step forward," Padovani said of the plan, which calls for the Federal Communications Commission (FCC) to free up 300 MHz of spectrum within the next five years and make a total of 500 MHz available within the next 10 years.
"This means job security for the radio-frequency engineers forever," he laughed. "We're already supporting 11 different frequency bands on a single device. With the introduction of the plan, there will be plenty more to come."
The National Broadband Plan also calls on television broadcasters to free up spectrum, something the nationwide switch from analog to digital TV has already partly achieved. At issue now is another 300 MHz, with 120 MHz to come from reclaimed television bandwidth.
Another of the plan's proposals is to enable and accelerate deployment in 90 MHz of Mobile Satellite Spectrum (MSS), a significant amount of bandwidth in the L and S bands with characteristics suitable for mobile broadband.
Still, Padovani cautioned, "this is not going to happen overnight. There are going to be an awful lot of fights but hopefully it will move forward quickly."