March 31, 2015

Hockey crowd melted down Levi’s Stadium network and app, overwhelmed light rail

Levi’s Stadium scoreboard during Stadium Series hockey game. Credit all images: Paul Kapustka, MSR (click on any photo for larger image).

From a financial and publicity standpoint Saturday’s Coors Light Stadium Series hockey game at Levi’s Stadium was a success, with 70,205 fans packing the football facility to watch the San Jose Sharks lose to the Los Angeles Kings, 2-1. But while the sellout crowd contributed to the general electricity that filled the venue, the mass of people also caused problems with the stadium’s vaunted wireless network, knocking out some parts of the Wi-Fi and cellular networks and overwhelming the unique feature of the stadium app designed to allow fans to have food and drinks delivered to their seats.

Hockey fans also swamped the VTA light rail system, causing some fans to wait as long as two hours before they could catch a bus or train to get home from the stadium. Though light rail officials said they will work on correcting the problems, the commuting jam does not bode well for a facility that is scheduled to host Super Bowl 50 in less than a year’s time, especially since many Super Bowl fans are expected to be traveling from San Francisco to the Santa Clara, Calif., neighborhood where Levi’s Stadium sits.

According to Roger Hacker, senior manager for corporate communications for the San Francisco 49ers, the Levi’s Stadium network team identified “isolated interruptions” of the Wi-Fi network, due to “frequency coordination issues” that the network team had not seen at previous events. Hacker also said that one unnamed wireless carrier had “issues” with its base station firmware, but said that the problems were resolved by game’s end. (For the record, I am a Verizon Wireless customer and I had “issues” getting cellular connectivity Saturday, so draw your own conclusions.)

Since the Niners’ full explanation is somewhat light on facts and numbers, we will first offer a “fan’s view” of the events Saturday night, with the caveat that Mobile Sports Report was not attending the game as press, but instead as just a regular hockey fan (one who purchased two full-price tickets) looking forward to using the stadium’s technology to enhance the game experience. Unfortunately for this fan, the Levi’s Stadium network, app and transit services all fell down on the job.

Light show a dud

Though the MSR team had no problems getting to the stadium — our light rail train out of Mountain View at about 5:30 p.m. was relatively empty — I noticed some irregularities in network connections during the pregame ceremonies, when I tried to join in the fan-participation light show, a technology feature recently added to the Levi’s Stadium app especially for the Stadium Series game. Like many people in our area, I couldn’t get the app to work, leaving me staring at a spinning graphic while others in the stadium saw their phones contribute flashing lights during pre-game music.

After the light show segment ended, I noticed that the Levi’s app was performing erratically, quitting on its own and kicking my device off the Wi-Fi network. After rebooting the device (a new Apple iPhone 6 Plus) I still couldn’t connect to the Wi-Fi, an experience I’ve never had at Levi’s. Turning off the Wi-Fi didn’t help, as cellular service also seemed poor. Since I wasn’t really there to work — I just wanted to enjoy the game with my older brother, who was in town for the event — I posted a quick tweet and went back to just watching the Sharks play poorly for the first 20 minutes.

One of the benefits of being a close follower of Levi’s Stadium technology is that when you tweet, people listen. By the middle of the first intermission, I was visited personally by Anoop Nagwani, the new head of the Levi’s Stadium network team, along with a technician from Aruba Networks, the Wi-Fi gear supplier at the stadium. Even with laptops and scanners, my visitors couldn’t immediately discern the network problem; they were, however, visited by a number of other nearby fans, who figured out who they were and relayed their own networking problems to them.

To be clear: I didn’t spend the game as I usually do at Levi’s, wandering around to see how the network is performing at as many spots as I can. But even if the outage was only in our area, that’s a significant problem for Levi’s Stadium, which has touted its technology every chance it gets. I also noticed problems with cellular connectivity all night, which leads me to believe that the network issues were more widespread than just at my seating area.

The official statement from Hacker describing the problems doesn’t pin any specific blame, but our guess is that something in the mix of systems used by the entertainment performers (there was a small stage to one side of the rink where musicians performed) and by media new to the facility caused the Wi-Fi problem. Here is the official statement on the Wi-Fi issues:

The Levi’s Stadium network team identified isolated interruptions of the WiFi system in specific sections on Saturday night due to frequency coordination issues previously unseen at the venue and unique to this event. Saturday’s event featured extra radio systems not typical to previous stadium events, some of which were found to be unauthorized by event frequency coordinators. To avoid similar situations in the future, Levi’s Stadium management will be initiating additional frequency control protocols for all events.

Hacker said the network team did not track exactly how widespread the outages were, so it could not provide a number of fans affected. But enough fans apparently did connect, since according to Hacker the Levi’s network saw near-record traffic Saturday night, with a total of 3.0 terabytes of data carried over Wi-Fi, second only to the 3.3 TB used at the season-opening Niners game back in September. Hacker said 24,792 unique devices connected to Wi-Fi during Saturday’s event, with a peak of 17,400 concurrent users, also the second-highest mark behind the season opener’s 19,000. The Stadium Series game did set a new record for throughput, with 3.5 Gbps on the network just before the start of the game, a surge that seems to be behind some of the other problems.
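
For a sense of what those topline numbers mean on a per-fan basis, here is a quick back-of-the-envelope calculation (a rough Python sketch using only the figures reported above; the derived averages are our own arithmetic, not stadium-reported statistics):

    # Back-of-envelope math from the reported Stadium Series Wi-Fi figures.
    TOTAL_WIFI_TB = 3.0       # total Wi-Fi data carried, per the Niners
    UNIQUE_DEVICES = 24_792   # unique devices seen on the Wi-Fi network
    PEAK_CONCURRENT = 17_400  # peak concurrent users
    ATTENDANCE = 70_205       # announced attendance

    avg_mb_per_device = TOTAL_WIFI_TB * 1_000_000 / UNIQUE_DEVICES
    take_rate = UNIQUE_DEVICES / ATTENDANCE

    print(f"~{avg_mb_per_device:.0f} MB per connected device")                # ~121 MB
    print(f"~{take_rate:.0%} of attendees connected at least once")           # ~35%
    print(f"~{PEAK_CONCURRENT / ATTENDANCE:.0%} connected at the same time")  # ~25%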

Food ordering overwhelmed

During the intermission, my brother and I went out on the 300-level concourse to get something to eat and drink — and encountered one of the untold stories of Levi’s Stadium: the incredibly long and slow lines for concessions. While I haven’t researched this problem in depth, after 10 minutes of inertia in our line I told my brother I would use the app’s food and drink ordering function to get us some vittles and beverages. Finally able to connect via Wi-Fi while on the concourse, I placed an order for two beers and two hot dogs, and didn’t worry that the delivery time was 20 minutes. That would put it at the very latest near the end of the second period, which was fine by me since it meant I didn’t have to wait in lines. Or so I thought.

Back in my seat, I was troubled by the fact that even halfway through the period, the app had not yet switched from “ordered” to “en route.” I also got some error messages I had never seen at Levi’s Stadium before.

When the period ended and there was still no movement from the app (which I only checked sporadically, since Wi-Fi never fully connected at my seat), I went back onto the concourse, where I found a small, angry crowd around the food-runner window at the closest concession stand. Pretty much everyone there had the same problem I had: we’d ordered food, the app said the order had been taken, and nothing had happened since.

Fans trying to figure out why their food orders weren’t delivered

The situation wasn’t good since nobody at the food-runner window had any technology that would allow them to communicate with the app or network team; they couldn’t even cancel orders or make sure credit card refunds would be processed, which only served to increase the frustration for the fans who were just trying to use the services as advertised.

In the end, the staff at the delivery window did the best they could — at one point someone produced slips of paper so the waiting fans could write down their orders, and one staffer then tried to fulfill those orders as best he could, going to the concession stand and bringing them out one by one. After waiting nearly the full intermission (and missing Melissa Etheridge), I was handed two cold hot dogs and two draft beers. Since there were no food holders left at the stand, I had to put the hot dogs into my jacket pockets and hold both beers. At least I didn’t starve or go thirsty, but it was a far cry from the delivered-to-the-seat service I had raved about to my brother.

During this process I sent an email to Louise Callagy, vice president of marketing at stadium app developer VenueNext. Her in-game response was:

“Levi’s Stadium app usage exceeded any previous event and set new records, causing delivery and order fulfillment delays. As always, we will do a post mortem after the event, and make the necessary adjustments to operational and staffing support, including systems performance analysis. We apologize to any fans who were inconvenienced.”

According to Hacker, the Levi’s Stadium food-runner staffing was at the same level as for a regular-season Niners game; however, Hacker said the hockey fans broke the previous ordering records before the first period was over. Here is the official statement on the food ordering snafu:

With more than 31,000 new downloads of the Levi’s Stadium App – 20 percent more than had ever been seen at any previous stadium event – the [food ordering] system experienced 50 percent higher order volume in just the first hour of the game than had been seen during any previous event. The dramatic increase led to the extended wait times and cancelled orders experienced by some fans.

In a separate email, Hacker did not provide an exact number for how many fans were represented by the term “some,” but he did confirm that “no customers were charged for unfulfilled orders.”

Still, the system shouldn’t have had any unfulfilled orders, at least not according to the Niners’ consistent hype of the network and the app. Remember, Niners officials had long been confident that their network would be able to stand up to any load. Such was not the case Saturday night.

The long wait home

VTA line following Levi’s Stadium hockey game

After an exciting third period and a game that went down to the final horn, we left the stadium and were immediately greeted by a mass of people packing into the VTA departure area. With too many people and not enough trains and buses, we spent almost an hour moving like slow cattle until we eventually got on a train to Mountain View. We considered ourselves lucky, since it looked like the folks heading south on VTA were in for an even longer wait.

When we got to the Mountain View station, we waited almost another hour to leave, since Caltrain (nicely) kept its last train at the station until two more VTA trains brought the stragglers in from Levi’s. Though VTA has since said it carried more than twice the number of riders it saw at Niners games this season, there was no explanation of why VTA didn’t or couldn’t add capacity once it saw how many fans used the service to get to the game. What was most unpleasant was the disorganized boarding process: just one massive line, with a lone VTA employee on a bullhorn telling everyone to make sure they had bought a ticket.

In the end, it took three hours to get from the start of the VTA line to my house in San Mateo — almost as long as the game itself. With other “special” events like Wrestlemania and concerts coming up at Levi’s, and Super Bowl 50 next year, it’s clear there is a lot of work to be done to make these events a good experience for all who purchase a ticket, especially those looking to use public transport and the app features to enhance their game-day experience.

Sharks and Kings on the ice at Levi’s Stadium

New Atlanta football stadium picks IBM as lead technology integrator

New Atlanta football stadium under construction. Credit all images: New Atlanta Stadium

The yet-to-be-named new football stadium under construction in Atlanta has selected IBM as its lead technology integrator, somewhat officially welcoming the new 800-pound gorilla to the stadium technology marketplace.

While computing giant IBM has dabbled in sports deployments before — mainly contributing technology as part of its corporate sponsorship of events like The Masters in golf and the U.S. Open for tennis — only recently has Big Blue gotten into the large-venue technology integration game. And while IBM’s recent deal as technology integrator for the revamp of Texas A&M’s Kyle Field was probably its true debut, for the new Atlanta stadium IBM will lead the selection, design and deployment of a wide range of technologies, including but not limited to the core Wi-Fi and DAS networks that will provide in-venue wireless connectivity.

Due to open in March of 2017, the new $1.4 billion stadium is expected to hold 71,000 fans for football, and up to 83,000 fans for other events like basketball or concerts. And while soccer, concerts and basketball will certainly be part of its event schedule, the NFL’s Atlanta Falcons and owner Arthur Blank are driving the bus on the new building, picking IBM in part to help satisfy a desire to build a venue that will be second to none when it comes to fan experience.

IBM’s size and experience a draw for Atlanta

Interior stadium design rendering

In addition to Wi-Fi and DAS network buildouts, IBM will design systems to control the expected 2,000-plus digital displays in the planned stadium and will also oversee other technology-related parts of the stadium, including video security, physical door controls and a video intercom system, according to an announcement made today. IBM will also partner with the stadium owners to develop as-yet-undetermined applications to “leverage the power of mobility to create a highly contextual, more personalized game day experience for fans, all through the integration of analytics, mobile, social, security and cloud technologies.”

In a phone interview Thursday, Jared Miller, chief technology officer for Blank’s namesake AMB Sports and Entertainment (AMBSE) group, said IBM’s depth and breadth in technology, applications and design made it a somewhat easy choice as lead technology partner.

Miller said the stadium developers looked at the number of different technology systems that would exist within the building, and ideally wanted to identify a single partner to help build and control them all, instead of multiple providers who might just have a single “silo” of expertise.

Proposed stadium exterior

“IBM is unique with its span of technology footprint,” Miller said. He also cited IBM’s ability to not just deploy technology but to also help determine what the technology could be used for, with analytics and application design.

“They’ve looked at the [stadium] opportunity in a different manner, thinking about what we could do with the network once it’s built,” Miller said.

IBM, which also has a sizable consulting business, created a group targeting “interactive experiences” about two years ago, according to Shannon Miller, the North America Fan Experience Lead for the IBM Interactive Experience group. Miller (no relation to Jared Miller), also interviewed by phone Thursday, said IBM had been working with Arthur Blank and the Falcons for more than a year to determine how to make the new stadium “the best fan experience in the world.”

And while IBM is somewhat of a newcomer to the stadium-technology integration game, IBM’s Miller said the company not only understands “how to make digital and physical work together,” but also has resources in areas including innovation, technology development and design that smaller firms may not have. While the Kyle Field project was ambitious, IBM’s Miller said the Atlanta operation will be much bigger.

“The size and scale of what we’re going to do [in Atlanta] will be unique,” he said.

No suppliers picked yet for Wi-Fi or DAS

Of note for industry watchers: IBM and the Falcons team have not yet picked technology suppliers for discrete parts of the coming wireless network, such as Wi-Fi access points and DAS gear. (Daktronics has already been announced as the supplier of the planned Halo Screen video board.) But those vendor decisions will likely be coming soon, since the stadium is under a hard deadline to open for the first game of the Major League Soccer season in March of 2017.

“We’re working fast and furious on design, and we want to identify [the gear suppliers] as early as possible,” said AMBSE’s Miller.

IBM and AMBSE did announce that the stadium’s network will be fiber-based, and Corning is a likely candidate as the fiber and Passive Optical Network (PON) technology provider, though that choice has not been announced or confirmed. IBM and Corning partnered to install a fiber network core for Wi-Fi and DAS at Texas A&M’s Kyle Field, believed to be the first large-scale fiber deployment of its kind in a stadium anywhere.

The Atlanta deal puts IBM solidly into the rapidly expanding field of stadium technology integration, which includes companies like CDW (which led network deployments at the University of Nebraska and the University of Phoenix Stadium) as well as stadium ownership groups, like the San Francisco 49ers, and technology name sponsors like AT&T, which has partnered with owners for technology and network deployments at venues like AT&T Park and AT&T Stadium.

Overhead view

Super Bowl XLIX sets new stadium Wi-Fi record with 6.2 Terabytes of data consumed

University of Phoenix Stadium. Credit: Arizona Cardinals.

The Super Bowl is once again the stadium Wi-Fi champ, as fans at Sunday’s Super Bowl XLIX in Glendale, Ariz., used 6.23 terabytes of data during the contest, according to the team running the network at the University of Phoenix Stadium.

The 6.23 TB mark blew past the most recent entrant in the “most Wi-Fi used at a single-day, single-stadium event” sweepstakes, the 4.93 TB used at the Jan. 12 College Football Playoff championship game at AT&T Stadium. Prior to that, pro football games this past season at Levi’s Stadium in Santa Clara, Calif., and at AT&T Stadium had pushed past the 3 TB mark, among the highest totals ever reported.

The live crowd watching the New England Patriots’ 28-24 victory over the Seattle Seahawks used about as much cellular data as well, with Verizon Wireless, AT&T and Sprint claiming a combined total of 6.56 TB used in and around the stadium on game day. All three carriers were on the in-stadium and outside-the-stadium DAS deployments run by neutral host Crown Castle. If those figures are correct (more on this later), it would put the total wireless data usage for the event at 12.79 TB, far and away the biggest single day of wireless data use we’ve ever heard of.

Apple OS updates still the application king

Handrails with Wi-Fi antenna enclosures from AmpThink. Credit: Arizona Cardinals.

Mark Feller, vice president of information technology for the Arizona Cardinals, and Travis Bugh, senior wireless consultant for CDW, provided Mobile Sports Report with the final Wi-Fi usage numbers, which are pretty stunning for anyone in the stadium networking profession. According to Feller the new CDW-deployed Wi-Fi network with Cisco gear at the UoP Stadium saw 2.499 TB of data downloaded, and 3.714 TB uploaded, for a total of 6.213 TB of Wi-Fi usage. Bugh of CDW said there were 25,936 unique devices connecting to the network on game day, with a peak concurrent usage of 17,322, recorded not surprisingly at halftime.

Peak download usage of 1.3 Gbps was recorded before the game’s start, while peak upload usage of 2.5 Gbps was hit at halftime. The top applications by bandwidth use, Feller said, were Apple (mobile update), Facebook, Dropbox and Snapchat.
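
To put those totals in rough per-device terms, here is a short worked example (a Python sketch based solely on the figures above; the derived averages are our own arithmetic, not official stadium statistics):

    # Derived figures from the reported Super Bowl XLIX Wi-Fi totals.
    DOWNLOADED_TB = 2.499
    UPLOADED_TB = 3.714
    UNIQUE_DEVICES = 25_936

    total_tb = DOWNLOADED_TB + UPLOADED_TB     # 6.213 TB, matching the reported total
    upload_share = UPLOADED_TB / total_tb      # uploads dominated this crowd's traffic
    avg_mb_per_device = total_tb * 1_000_000 / UNIQUE_DEVICES

    print(f"Total Wi-Fi: {total_tb:.3f} TB")
    print(f"Uploads were {upload_share:.0%} of all Wi-Fi traffic")  # ~60%
    print(f"~{avg_mb_per_device:.0f} MB per unique device")         # ~240 MB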

DAS numbers also set new record, but clarification needed

The only reason we aren’t yet trumpeting the 6.564 TB of reported DAS use as a verified record is the difference in clarity from each of the reporting providers. We also haven’t yet heard any usage totals from T-Mobile, so it’s likely that the final wireless data use number is somewhere north of 13 TB, if all the reports can be believed.

Parking lot light poles, Westgate entertainment district. Can you spot the DAS?

As reported before, AT&T said it saw 1.7 TB of cellular wireless activity from its customers on game day, with 696 GB of that happening inside the stadium and the balance coming from the outside areas before and after the game. We’d also like to welcome Sprint to the big-game reporting crew (thanks Sprint!), with its reported total of 754 GB of 4G LTE traffic used in and around the stadium on game day. According to Sprint representatives, its Super Bowl coverage efforts included five COWs (cell towers on wheels) as well as expanded DAS and macro placements in various Phoenix-area locations. The Sprint coverage included the 2.5 GHz spectrum, which uses TDD LTE technology.

As also previously reported, Verizon Wireless claimed 4.1 TB of customer traffic in and around the stadium on game day, which Verizon says was all cellular traffic and does not reflect any Verizon Wireless customer use of the stadium Wi-Fi network. Verizon also reported some other interesting activity tidbits, including 46,772 Verizon Wireless devices used at the game, of which just 59.7 percent were smartphones. Verizon also said it saw 10 million emails sent on its networks that day and 1.9 million websites visited, along with 122,308 videos sent or received over wireless connections.
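
Summing the per-carrier figures shows how the combined cellular total is reached (a quick Python sketch using only the carrier-reported numbers above; the rounded inputs sum slightly below the 6.564 TB figure cited earlier):

    # Carrier-reported cellular traffic for Super Bowl Sunday, in terabytes.
    carrier_tb = {
        "AT&T": 1.7,      # 696 GB inside the stadium, the balance outside
        "Sprint": 0.754,  # 4G LTE traffic in and around the stadium
        "Verizon": 4.1,   # cellular only, per Verizon; excludes Wi-Fi use
    }

    cellular_total = sum(carrier_tb.values())   # ~6.55 TB from the rounded inputs
    wifi_total = 6.213                          # Wi-Fi total reported above

    print(f"Cellular total: ~{cellular_total:.2f} TB")
    print(f"Cellular + Wi-Fi: ~{cellular_total + wifi_total:.1f} TB")  # ~12.8 TB, with T-Mobile still unreported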

We’re still waiting to see if we can get usage numbers from the Super Bowl stadium app (we’re especially interested to see if the instant replay feature caught on) but the warning for stadium owners and operators everywhere seems to be clear: If you’re hosting the big game (or any BIG game), make sure your network is ready for 6 TB and beyond!

Super DAS: AT&T, Verizon beefed up Phoenix area with mobile cell towers and more DAS

AT&T Cell on Wheels (COW) deployment outside the ESPNZone in Phoenix. Credit all photos: AT&T (click on any photo for a larger image)

While we wait for the traffic stats from the incredibly exciting Super Bowl XLIX, here’s the final installment of our Super DAS series — in which the two major U.S. wireless carriers, AT&T and Verizon Wireless, provide some details about how they beefed up coverage in and around Phoenix to handle the expected Super Bowl communications crush.

The lengths to which AT&T and Verizon went to ensure no signals were dropped are interesting from several business points of view; to be sure, no major carrier wants Twitter to erupt with reports of dropped calls from a major event. (AT&T folks still grimace when you bring up the historical benchmark for this type of problem, SXSW and Twitter.)

The flood-the-zone temporary enhancements now brought in on a regular basis for big events also point out the ongoing need for distributed antenna system (DAS) deployments: the basic fact of our ever more connected lives means that for large public venues, or anywhere lots of people gather at once, legacy cellular network designs simply can’t keep up.

ESPNZone DAS gear in underground garage

To make sure its networks could keep up, AT&T said it deployed 10 cell towers on wheels (aka “COWs”) to the Phoenix area in advance of the weekend, while also upgrading its equipment at DAS installations like the one Crown Castle had at the University of Phoenix Stadium, as well as at other points around town. AT&T folks were kind enough to supply us with plenty of photos of the deployments — we especially like the DAS built in an underground garage near the ESPNZone outlet in Phoenix.

Verizon also said it deployed 13 COWs and upgraded many DAS deployments in the Phoenix area prior to the Super Bowl, and even said it had a team of network technicians on hand to make sure traffic kept running smoothly.

How did it all work out? So far, we haven’t seen any reports of missed cellular connections during Super Bowl weekend (which also included the Waste Management golf tournament in the area, further adding to cellular pressure). What it does make us wonder about is how big-crowd wireless demand, which clearly isn’t limited to the inside of the event venue anymore, will be handled economically in the future. Are more portable deployments the way forward, or will we see more DAS installations that can be upgraded quickly on the fly?

More photos below!

AT&T COW with box on roof

Another AT&T COW

AT&T COW at Wild Horse Pass

AT&T COW deployment in downtown Phoenix

Downtown COW on a roof

ESPNZone DAS cabling run

Hyatt Gainey Ranch COW

Sonim’s rugged LTE phones get public-safety trials at Super Bowl, World Ski Championships

Sonim XP7 handset

The new XP7 ruggedized LTE smartphone from Sonim Technologies will get some on-the-scene testing by public safety professionals at both the Super Bowl as well as the upcoming World Ski Championships in Vail, Colo., according to Sonim, a San Mateo, Calif.-based maker of ruggedized devices.

Expected to be publicly announced Friday, the news that Sonim’s newest ruggedized LTE handset will be tested by firefighters from the Phoenix Fire Department during their Super Bowl deployments is significant for those concerned with public-safety operations around large public venues, since it offers a new way for industry-standard applications to be shared in a potentially “extreme” environment. With support for both standard wireless-carrier LTE networks as well as the emerging “FirstNet” public safety LTE frequency, the Sonim XP7 also offers one potential path toward the long-desired goal of having communication devices that can allow different first-responder agencies to communicate with each other, or to more simply share information from different devices, applications or networks.

While the Phoenix test deployment of XP7 handsets will use AT&T LTE airwaves, a similar test process scheduled to take place in Vail and Beaver Creek at the Feb. 2-15 FIS World Ski Championships will use a demonstration version of the Band Class 14 LTE public safety broadband network, according to a release from FirstNet Colorado.

The Vail demonstration will make use of a Distributed Antenna System (DAS) that was built for the town of Vail by neutral host provider Crown Castle, as well as the Sonim phones, among other devices and services.

While it has the first-glance look of a regular smartphone, the Sonim XP7 has a host of ruggedized features, including long battery life, an extra-loud speaker, protection against drops and weather, a touchscreen that works with gloves, and a screen viewable in bright sunlight. In a recent interview with Sonim CEO Bob Plaschke, MSR got to see and hold Plaschke’s XP7, a bulky device that certainly feels like it could stand up to extreme weather and rough handling. In addition to its obvious target market of first responders and other extreme-condition businesses, the XP7 is also being targeted at extreme athletes and outdoor-lifestyle customers, who should be able to purchase the device from major U.S. wireless carriers later this year.

In Phoenix, the Sonim phone will be compared to consumer-grade smartphones in a test using a custom-built firehouse alert app, according to Sonim. The Phoenix firefighters will also test the XP7’s ability to act as a Wi-Fi hotspot, as well as its compatibility and interoperability with other mobile devices.

Super DAS, Part 2: Super Bowl stadium DAS expands to address increased demand for cellular connectivity

Editor’s note: This story is part 2 of a series of profiles of the providers of the extensive Distributed Antenna System (DAS) deployment for Super Bowl XLIX at and around the University of Phoenix Stadium in Glendale, Ariz., and other parts of the greater Phoenix area. Stay tuned all week as we highlight how DAS will keep Super Bowl fans connected, no matter where they roam in and around Phoenix and Glendale this week and weekend.

DAS antenna inside the University of Phoenix Stadium. Credit all photos: TE Connectivity

Two years ago, the University of Phoenix Stadium had a pretty good distributed antenna system (DAS) network to handle cellular communications inside the building. But with Super Bowl XLIX coming to the Glendale, Ariz., facility this year, pretty good wouldn’t be good enough — so the stadium’s network operators expanded the DAS by almost 50 percent in preparation for the game-day network surge expected on Feb. 1.

For fans attending the big game with cellular devices in hand, that information may be comforting enough; thanks to a bigger, better DAS built to serve all the major U.S. wireless carriers, they should have no problem getting a signal. Stadium technology professionals, however, usually want to know more about such expansion plans: What does it really mean to increase DAS capacity? How does the new DAS stack up to those in other stadiums and arenas?

More sectors means more capacity

For the Crown Castle neutral-host DAS at the University of Phoenix Stadium, there is one quick measure of how much the DAS expanded: More sectors. In DAS parlance, a “sector” is an area that has a dedicated amount of base station capacity; for the University of Phoenix Stadium DAS, the number of sectors increased from 33 two years ago to 48 sectors now, according to John Spindler, director for product management at DAS gear maker TE Connectivity. TE’s FlexWave Prism and FlexWave Spectrum DAS gear are part of the infrastructure deployed by neutral host Crown Castle in the UoP network.
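
As a rough illustration of what that sector count implies, the jump from 33 to 48 sectors works out to the “almost 50 percent” expansion described above (a simple Python sketch; the assumption that usable capacity scales roughly linearly with sector count is a simplification of real RF planning):

    # Simplified view: each DAS sector gets its own slice of base-station capacity,
    # so, all else equal, capacity grows roughly in proportion to the sector count.
    SECTORS_BEFORE = 33  # University of Phoenix Stadium DAS, two years ago
    SECTORS_AFTER = 48   # expanded DAS for Super Bowl XLIX

    growth = SECTORS_AFTER / SECTORS_BEFORE - 1
    print(f"Sector count (and rough capacity) up ~{growth:.0%}")  # ~45%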

John Spindler, TE Connectivity

Without getting too deep into telecom physics, more sectors in the same amount of space means more capacity. And when it comes to all the different flavors of phones and carrier spectrum, there’s a lot that goes into a DAS to use up that capacity. With all four major U.S. carriers (AT&T, Verizon Wireless, Sprint and T-Mobile) using the DAS, the infrastructure must support a full range of cellular spectrum, from 700 MHz LTE signals to LTE, CDMA, UMTS and EVDO signals in the 800 MHz, 850 MHz, 1900 MHz and 2100 MHz bands. The DAS inside the stadium will use 228 remote antenna units, according to Crown Castle.

“More frequencies, more MIMO [multiple-in, multiple-out antenna-enhancement technology] and heavier sectoring,” is how Spindler described the general needs for most DAS upgrades, including the one at UoP, where he foresees another big number for Super Bowl stadium DAS traffic on Feb. 1.

“I would expect to see record [DAS] numbers,” Spindler said.

One DAS to rule them all

DAS active integration panel

Last year, the DAS situation at MetLife Stadium in New Jersey was especially tough to explain, since both AT&T and Verizon built their own separate infrastructures. According to AT&T, its DAS customers at Super Bowl XLVIII used 624 gigabytes of traffic, a record then but a figure that has been surpassed many times this past football season at both college and pro football venues (the recent College Football Playoff championship game, for instance, saw 1.4 TB of DAS traffic for AT&T customers at AT&T Stadium). Verizon claimed last year that its customers used 1.9 TB of wireless data during the Super Bowl, but Verizon never provided specifics on whether that number represented just DAS traffic, or Verizon customer usage of the MetLife Wi-Fi network as well.

Either way, the guess is that the DAS at the University of Phoenix Stadium will set new Super Bowl traffic records on Feb. 1, and by all accounts the infrastructure seems ready to handle it. Spindler, for one, said the Crown Castle DAS is “definitely well designed.” And Travis Bugh, senior wireless consultant for CDW (which installed the new Wi-Fi system at UoP), said he was also impressed by the performance of the Crown Castle DAS, which he said seems more than ready for the coming Super Bowl crush.

NEXT: What are the carriers doing to supplement the DAS coverage?