February 26, 2015

Hockey crowd melted down Levi’s Stadium network and app, overwhelmed light rail

Levi’s Stadium scoreboard during Stadium Series hockey game. Credit all images: Paul Kapustka, MSR (click on any photo for larger image).

From a financial and publicity standpoint Saturday’s Coors Light Stadium Series hockey game at Levi’s Stadium was a success, with 70,205 fans packing the football facility to watch the San Jose Sharks lose to the Los Angeles Kings, 2-1. But while the sellout crowd contributed to the general electricity that filled the venue, the mass of people also caused problems with the stadium’s vaunted wireless network, knocking out some parts of the Wi-Fi and cellular networks and overwhelming the unique feature of the stadium app designed to allow fans to have food and drinks delivered to their seats.

Hockey fans also swamped the VTA light rail system, causing some fans to wait as long as two hours before they could catch a bus or train to get home from the stadium. Though light rail officials said they will work on correcting the problems, the commuting jam does not bode well for a facility that is scheduled to host Super Bowl 50 in less than a year’s time, especially since many Super Bowl fans are expected to be traveling from San Francisco to the Santa Clara, Calif., neighborhood where Levi’s Stadium sits.

According to Roger Hacker, senior manager for corporate communications for the San Francisco 49ers, the Levi’s Stadium network team identified “isolated interruptions” of the Wi-Fi network, due to “frequency coordination issues” that the network team had not seen at previous events. Hacker also said that one unnamed wireless carrier had “issues” with its base station firmware, but said that the problems were resolved by game’s end. (For the record, I am a Verizon Wireless customer and I had “issues” getting cellular connectivity Saturday, so draw your own conclusions.)

Since the Niners’ full explanation is somewhat light on facts and numbers, we will first offer a “fan’s view” of the events Saturday night, with the caveat that Mobile Sports Report was not attending the game as press, but instead as just a regular hockey fan (one who purchased two full-price tickets) looking forward to using the stadium’s technology to enhance the game experience. Unfortunately for this fan, the Levi’s Stadium network, app and transit services all fell down on the job.

Light show a dud

Though the MSR team had no problems getting to the stadium — our light rail train out of Mountain View at about 5:30 p.m. was relatively empty — I noticed some irregularities in network connections during the pregame ceremonies, when I tried to join in the fan-participation light show, a technology feature recently added to the Levi’s Stadium app especially for the Stadium Series game. Like many people in our area, I couldn’t get the app to work, leaving me staring at a spinning graphic while others in the stadium saw their phones contribute flashing lights during pre-game music.

After the light show segment ended, I noticed that the Levi’s app was performing erratically, quitting on its own and kicking my device off the Wi-Fi network. After rebooting the device (a new Apple iPhone 6 Plus) I still couldn’t connect to the Wi-Fi, an experience I’ve never had at Levi’s. Turning off the Wi-Fi didn’t help, as cellular service also seemed poor. Since I wasn’t really there to work — I just wanted to enjoy the game with my older brother, who was in town for the event — I posted a quick tweet and went back to just watching the Sharks play poorly for the first 20 minutes.

One of the benefits of being a close follower of Levi’s Stadium technology is that when you tweet, people listen. By the middle of the first intermission, I was visited personally by Anoop Nagwani, the new head of the Levi’s Stadium network team, along with a technician from Aruba Networks, the Wi-Fi gear supplier at the stadium. Even with laptops and scanners, my visitors couldn’t immediately discern the network problem; they were, however, visited by a number of other nearby fans, who figured out who they were and relayed their own networking problems to them.

To be clear: I didn’t spend the game as I usually do at Levi’s, wandering around to see how the network is performing at as many spots as I can. But even if the outage was only in our area, that’s a significant problem for Levi’s Stadium, which has touted its technology every chance it gets. I also noticed problems with cellular connectivity all night, which leads me to believe that the network issues were more widespread than just at my seating area.

The official statement from Hacker describing the problems doesn’t pin any specific blame, but our guess is that something in the mix of systems used by the entertainment performers (there was a small stage to one side of the rink where musicians performed) and media new to the facility caused the Wi-Fi problem. Here is the official statement on the Wi-Fi issues:

The Levi’s Stadium network team identified isolated interruptions of the WiFi system in specific sections on Saturday night due to frequency coordination issues previously unseen at the venue and unique to this event. Saturday’s event featured extra radio systems not typical to previous stadium events, some of which were found to be unauthorized by event frequency coordinators. To avoid similar situations in the future, Levi’s Stadium management will be initiating additional frequency control protocols for all events.

Hacker said the network team did not track exactly how widespread the outages were, so could not provide a number of fans affected. But enough fans apparently did connect, since according to Hacker the Levi’s network saw near-record traffic Saturday night, with a total of 3.0 terabytes of data carried, second only to the 3.3 TB used during the season-opening Niners game back in September. Hacker said there were 24,792 unique devices connected to Wi-Fi during Saturday’s event, with a peak of 17,400 concurrent users, also second highest behind the season-opener total of 19,000. The Stadium Series game did set a new mark for throughput, with 3.5 Gbps on the network just before the start of the game, a surge that seems to be behind some of the other problems.
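
As a back-of-the-envelope check, those figures can be turned into per-fan averages (a sketch using only the numbers quoted above; the derived averages are our own arithmetic, not stadium-supplied stats):

```python
# Back-of-the-envelope math on the reported Stadium Series Wi-Fi numbers.
# All inputs come from the figures Hacker provided, as quoted above;
# the derived averages are illustrative only.

total_tb = 3.0                 # total Wi-Fi data carried, in terabytes
unique_devices = 24_792        # unique devices on Wi-Fi
peak_concurrent = 17_400       # peak concurrent users
attendance = 70_205            # announced attendance

total_mb = total_tb * 1_000_000          # TB -> MB (decimal units)
avg_per_device_mb = total_mb / unique_devices
take_rate = unique_devices / attendance  # share of fans who connected

print(f"Average per connected device: {avg_per_device_mb:.0f} MB")
print(f"Wi-Fi take rate: {take_rate:.0%}")
print(f"Peak concurrency vs. unique devices: {peak_concurrent / unique_devices:.0%}")
```

In other words, roughly a third of the crowd got on the Wi-Fi, averaging about 120 MB each, which makes the connection failures that much more notable.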

Food ordering overwhelmed

During the intermission, my brother and I went out on the 300-level concourse to get something to eat and drink — and encountered one of the untold stories of Levi’s Stadium: the incredibly long and slow lines for concessions. While I haven’t researched this problem in depth, after 10 minutes of inertia in our line I told my brother I would use the app’s food and drink ordering function to get us some vittles and beverages. Finally able to connect via Wi-Fi while on the concourse, I placed an order for two beers and two hot dogs, and didn’t worry that the delivery time was 20 minutes. That would put it at the very latest near the end of the second period, which was fine by me since it meant I didn’t have to wait in lines. Or so I thought.

Back in my seat, I was troubled by the fact that even halfway through the period, the app still had not switched from “ordered” to “en route.” I also got some error messages I had never seen at Levi’s Stadium before.

When the period ended and there was still no movement from the app (which I checked only sporadically, since Wi-Fi never fully connected at my seat), I went back out on the concourse, where I found a small, angry crowd around the food-runner window at the closest concession stand. Pretty much everyone there had the same problem I had: we’d ordered food, the app had said the order had been taken, and nothing had happened since.

Fans trying to figure out why their food orders weren’t delivered

The situation wasn’t good since nobody at the food-runner window had any technology that would allow them to communicate with the app or network team; they couldn’t even cancel orders or make sure credit card refunds would be processed, which only served to increase the frustration for the fans who were just trying to use the services as advertised.

In the end, the staff at the delivery window did the best they could — which at one point resulted in someone producing slips of paper that the waiting fans used to write down their orders; one staffer then tried to fulfill those orders as best he could, going to the concession stand and bringing them out one by one. After waiting nearly the full intermission (missing Melissa Etheridge), I was given two cold hot dogs and two draft beers. Since there were no food holders left at the stand, I had to put the hot dogs in my jacket pockets and hold both beers. At least I didn’t starve or go thirsty, but it was a far cry from the delivered-to-the-seat functionality I had raved about to my brother.

During this process I sent an email to Louise Callegy, vice president of marketing at stadium app developer VenueNext. Her in-game response was:

“Levi’s Stadium app usage exceeded any previous event and set new records, causing delivery and order fulfillment delays. As always, we will do a post mortem after the event, and make the necessary adjustments to operational and staffing support, including systems performance analysis. We apologize to any fans who were inconvenienced.”

According to Hacker, the Levi’s Stadium food-runner staffing was at the same level as a regular-season Niners’ game; however, Hacker said the hockey fans broke the previous ordering records before the first period was over. Here is the official statement on the food ordering snafu:

With more than 31,000 new downloads of the Levi’s Stadium App – 20 percent more than had ever been seen at any previous stadium event – the [food ordering] system experienced 50 percent higher order volume in just the first hour of the game than had been seen during any previous event. The dramatic increase led to the extended wait times and cancelled orders experienced by some fans.

In a separate email, Hacker did not provide an exact number for how many fans were represented by the term “some,” but he did confirm that “no customers were charged for unfulfilled orders.”

Still, the system shouldn’t have had any unfulfilled orders, at least not according to the Niners’ consistent hype of the network and the app. Remember, Niners officials had long been confident that their network would be able to stand up to any load. Such was not the case Saturday night.

The long wait home

VTA line following Levi’s Stadium hockey game

After an exciting third period and a game that went down to the final horn, we left the stadium and were immediately greeted by a mass of people packing into the VTA departure area. With too many people and not enough trains and buses, we spent almost an hour moving like slow cattle until we eventually got on a train to Mountain View. We considered ourselves lucky, since it looked like the folks heading south on VTA were in for an even longer wait.

When we got to the Mountain View station, we waited almost another hour to leave, since Caltrain (nicely) kept its last train at the station until two more VTA trains brought the stragglers in from Levi’s. Though VTA has since claimed it carried more than twice the “normal” number of riders it saw at Niners games this season, there was no explanation of why VTA didn’t or couldn’t provide more capacity after it saw more fans use the service to get to the game. What was most unpleasant was the overall disorganized method of boarding the trains: just a massive group line, with one VTA person on a bullhorn telling everyone to make sure they had bought a ticket.

In the end, the time it took to get from the start of the VTA line to my house in San Mateo was three hours — almost as long as the game itself. With other “special” events like Wrestlemania and concerts coming up at Levi’s, and Super Bowl 50 next year, it’s clear there is a lot of work to be done to make the stadium a good experience for everyone who purchases a ticket, especially those looking to use public transport and the app features to enhance their game-day experience.

Sharks and Kings on the ice at Levi’s Stadium

Artemis announces DISH spectrum lease, setting up San Francisco pCell service trial; also makes venue-specific hub available for trial

Artemis Networks founder Steve Perlman. Credit all photos: Artemis Networks

Artemis Networks moved one step closer to a real-world offering of its pCell wireless service with the announcement of a spectrum lease deal with satellite provider DISH that will give Artemis the means to offer commercial services in San Francisco, perhaps as early as later this year, pending FCC approval.

In the meantime, owners of large public venues (like sports stadiums) can now test the Artemis technology for themselves, via an Artemis I Hub and antenna combination in a trial arrangement with the company. Announced last year, Artemis’ pCell technology claims to solve two of the biggest problems in wireless networking, namely bandwidth congestion and antenna interference, by turning much of the current technology thinking on its head. If the revolutionary networking idea from longtime entrepreneur Steve Perlman pans out, stadium networks in particular could become more robust while also being cheaper and easier to deploy.

In a phone interview with Mobile Sports Report prior to Tuesday’s announcement, Perlman said Artemis expects to get FCC approval for its pCell-based wireless service sometime in the next 6 months. When that happens, Artemis will announce pricing for its cellular service, which will work with most existing LTE phones by adding in a SIM card provided by Artemis. Phones with dynamic SIMs like some of the newer devices from Apple, Perlman said, will be able to simply choose the Artemis service without having to add in a card.

Though he wouldn’t announce pricing yet, Perlman said Artemis services would be less expensive than current cellular plans. He said that there will likely be an option for local San Francisco service only, and another that includes roaming ability on other providers’ cellular networks for use outside the city.

More proof behind the yet-untasted pudding

When Perlman, the inventor of QuickTime and WebTV, announced Artemis and its pCell technology last year, it was met with both excitement — for its promise of delivering faster, cheaper wireless services — and no shortage of skepticism, about whether it would ever become a viable commercial product. Though pCell’s projected promise of using cellular interference to produce targeted, powerful cellular connectivity could be a boon to builders of large public-venue networks like those found in sports stadiums, owners and operators of those venues are loath to build expensive networks on untested, unproven technology. So it’s perhaps no surprise that Artemis has yet to name a paying customer for its revolutionary networking gear.

Artemis I Hub

But being able to name names and talk about spectrum deals are steps bringing Artemis closer to something people can try, and perhaps buy. VenueNext, the application development firm behind the San Francisco 49ers’ Levi’s Stadium app, confirmed that it is testing Artemis technology, and the San Francisco network will provide Perlman and Artemis with a “beta” type platform to test and “shake out the system” in a live production environment.

“We need to be able to move quickly, get feedback and test the network,” said Perlman about Artemis’ decision to run its own network first, instead of waiting for a larger operator to implement it. “We need to be able to move at startup speed.”

For stadium owners and operators, the more interesting part of Tuesday’s news may be the Artemis I Hub, a device that supports up to 32 antennas (indoor-only for now, with outdoor units due later this year). The trial testing will allow venue owners and operators to kick the tires on pCell deployment and performance on their own, instead of just taking Artemis’ word for it. Artemis also has published a lengthy white paper that fleshes out the explanation of its somewhat radical approach to cellular connectivity, another step toward legitimacy, since publishing such a document publicly means Artemis is confident in its claims.

If networking statistics from recent “big” stadium events are any barometer, the field of stadium networking may need significant help soon, since fans are using far more data than ever before, including the 13-plus terabytes of combined traffic at the Super Bowl in Phoenix and the 6-plus TB figure from the college football playoff championship game. To Perlman, the idea of trying to use current Wi-Fi and cellular technology to address a crowded space doesn’t make sense.

“You simply cannot use interfering technology in a situation where you have closely packed transmitters,” said Perlman. “You just can’t do it.”

Artemis explained

pCell antenna from Artemis Networks

If you’re unfamiliar with the Artemis idea, at its simplest level it’s a new idea in connecting wireless devices to antennas that — if it works as advertised — turns conventional cellular and Wi-Fi thinking on its head. What Perlman and Artemis claim is that they have developed a way to build radios that transmit signals “that deliberately interfere with each other” to establish a “personal cell,” or pCell, for each device connecting to the network.

(See this BusinessWeek story from 2011, which explains the Artemis premise in detail. This EE Times article has more details, and this Wired article is a helpful read as well.)

Leaving the complicated math and physics to the side for now: if Artemis’ claims hold true, its technology could solve two of the biggest problems in wireless networking, namely bandwidth congestion and antenna interference. In current cellular and Wi-Fi designs, devices share signals from antenna radios, meaning bandwidth is reduced as more people connect to a cellular antenna or a Wi-Fi access point. Adding more antennas is one way to solve congestion problems, but especially in stadiums and other large public venues, you can’t place antennas too close to each other because of signal interference.

The Artemis pCell technology, Perlman said, trumps both problems by delivering a centimeter-sized cell of coverage to each device, one that can follow the device as it moves around in an antenna’s coverage zone. Again, if the company’s claim of delivering full bandwidth to each device “no matter how many users” are connected to each antenna holds true, stadium networks could theoretically support much higher levels of connectivity at possibly a fraction of the current cost.
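
The difference between the two models can be sketched with a toy calculation (our own illustration of the shared-medium math described above, not Artemis code; the 100 Mbps capacity figure is a made-up example):

```python
# Toy model: shared-medium bandwidth vs. Artemis' claimed pCell behavior.
# The 100 Mbps "capacity" is a hypothetical number for illustration; the
# pcell function encodes Artemis' claim as described above, not measured data.

def shared_per_user(capacity_mbps: float, users: int) -> float:
    """Conventional cell or AP: all users split one radio's capacity."""
    return capacity_mbps / users

def pcell_per_user(capacity_mbps: float, users: int) -> float:
    """Artemis' claim: each user gets a personal cell at full rate,
    'no matter how many users' are attached to the antenna."""
    return capacity_mbps

for users in (1, 10, 100, 1000):
    print(f"{users:>5} users: shared {shared_per_user(100, users):8.2f} Mbps, "
          f"pCell (claimed) {pcell_per_user(100, users):.2f} Mbps")
```

At stadium scale, the shared-medium number collapses toward zero as the crowd connects, which is exactly the congestion problem the article describes; the pCell claim is that the second column holds regardless of crowd size.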

The next step in Artemis’ evolution will be to see if (or how well) its technology works in the wild, where everyday users can subject it to the unplanned stresses that can’t be tested in the lab. With any luck and FCC willing, we won’t have to wait another year for the next chapter to unfold.

AT&T upgrades DAS at University of Washington hoops arena

Alaska Airlines Arena at Hec Edmundson Pavilion. Credit: University of Washington

AT&T’s deal to bring better wireless connectivity to Pac-12 conference schools has expanded into the basketball zone, with the announcement of DAS upgrades at the University of Washington’s Alaska Airlines Arena.

Officially and somewhat long-windedly called the Alaska Airlines Arena at Hec Edmundson Pavilion, the cozy 10,000-seat facility is home not only to the Husky men’s and women’s basketball teams but also to other indoor sports like volleyball and gymnastics. According to AT&T, the new DAS deployment has more than 80 antennas spread throughout the facility to enhance cellular coverage, especially 4G LTE signals.

According to AT&T, the new DAS is already up and running and is seeing an average of 17 gigabytes of traffic used by AT&T customers per basketball game so far this season. Last fall AT&T also announced DAS upgrades at the U-Dub football stadium, where it has roughly 340 DAS antennas and saw an average 190 GB of traffic per game.

DGP gets deal to extend DAS outside Levi’s Stadium

Franks and DAS: DGP DAS antennas above food station at Levi’s Stadium. Photo credit: Paul Kapustka, MSR

DAS Group Professionals, the company that installed the neutral-host DAS inside Levi’s Stadium, now has a deal to extend the DAS outside the Levi’s walls, covering parts of the city of Santa Clara, Calif., that surround the stadium.

With next year’s Super Bowl set to take place at Levi’s Stadium, it makes sense that city officials would want to make sure the parking lots and other pre-game gathering areas outside the venue had good cellular connectivity. At the most recent Super Bowl in Glendale, Ariz., neutral host provider Crown Castle did an extensive job of building the “oDAS” or outside DAS in the spaces surrounding the University of Phoenix Stadium.

According to DGP, it will design, build and maintain an oDAS for the City of Santa Clara, initially targeting the area around the Great America theme park and the Santa Clara Convention Center, which sit on the other side of the main Levi’s Stadium parking lots. Like the DAS inside the stadium, access to the network outside the stadium will be offered to all major wireless carriers, who must pay DGP and the city for access to the network.

While the network will definitely come in handy for pre- and post-game connectivity around Levi’s Stadium events, it will also improve overall cellular performance in the area, which is also home to several large corporate office buildings as well as the busy convention center.

Super Bowl XLIX sets new stadium Wi-Fi record with 6.2 Terabytes of data consumed

University of Phoenix Stadium. Credit: Arizona Cardinals.

The Super Bowl is once again the stadium Wi-Fi champ, as fans at Sunday’s Super Bowl XLIX in Glendale, Ariz., used 6.23 terabytes of data during the contest, according to the team running the network at the University of Phoenix Stadium.

The 6.23 TB mark blew past the most recent entrant in the “most Wi-Fi used at a single-day single-stadium event” sweepstakes, the 4.93 TB used at the Jan. 12 College Football Playoff championship game at AT&T Stadium. Prior to that, pro football games this past season at Levi’s Stadium in Santa Clara, Calif., and at AT&T Stadium had pushed into the 3-plus TB mark to be among the highest totals ever reported.

The live crowd watching the New England Patriots’ 28-24 victory over the Seattle Seahawks also used about as much cellular data, with Verizon Wireless, AT&T and Sprint claiming a combined total of 6.56 TB used in and around the stadium on game day. All three carriers were on the in-stadium and outside-the-stadium DAS deployments run by neutral host Crown Castle. If those figures are correct (more on this later), it would put the total wireless data usage for the event at 12.79 TB, far and away the biggest single day of wireless data use we’ve ever heard of.

Apple OS updates still the application king

Handrails with Wi-Fi antenna enclosures from AmpThink. Credit: Arizona Cardinals.

Mark Feller, vice president of information technology for the Arizona Cardinals, and Travis Bugh, senior wireless consultant for CDW, provided Mobile Sports Report with the final Wi-Fi usage numbers, which are pretty stunning for anyone in the stadium networking profession. According to Feller the new CDW-deployed Wi-Fi network with Cisco gear at the UoP Stadium saw 2.499 TB of data downloaded, and 3.714 TB uploaded, for a total of 6.213 TB of Wi-Fi usage. Bugh of CDW said there were 25,936 unique devices connecting to the network on game day, with a peak concurrent usage of 17,322, recorded not surprisingly at halftime.

Peak download usage of 1.3 Gbps was recorded before the game’s start, while peak upload usage of 2.5 Gbps was hit at halftime. The top applications by bandwidth use, Feller said, were Apple (mobile update), Facebook, Dropbox and Snapchat.
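
For a sense of scale, a quick equal-share calculation (our own arithmetic on the figures above, assuming, unrealistically, that every concurrent device got an identical slice) puts the halftime upload peak at under 150 kbps per device:

```python
# Rough per-device share of the reported Super Bowl XLIX Wi-Fi peaks.
# Inputs are the CDW/Cardinals figures quoted above; the equal-share
# split is our own simplification, not a measured per-user rate.

peak_upload_gbps = 2.5      # upload peak, hit at halftime
peak_concurrent = 17_322    # peak concurrent devices, also at halftime

per_user_kbps = peak_upload_gbps * 1_000_000 / peak_concurrent
print(f"Equal-share upload per concurrent device: {per_user_kbps:.0f} kbps")

# Sanity check: download + upload should match the reported 6.213 TB total.
total_tb = 2.499 + 3.714
print(f"Download + upload = {total_tb:.3f} TB")
```

That roughly 144 kbps equal share is enough for photos and social posts but thin for video uploads, which helps explain why peak-hour performance is the hard part of stadium Wi-Fi.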

DAS numbers also set new record, but clarification needed

The only reason we aren’t yet trumpeting the 6.564 TB of reported DAS use as a verified record is the differences in clarity from each of the reporting providers. We also haven’t yet heard any usage totals from T-Mobile, so it’s likely that the final wireless data use number is somewhere north of 13 TB, if all can be believed.

Parking lot light poles, Westgate entertainment district. Can you spot the DAS?

As reported before, AT&T said it saw 1.7 TB of cellular wireless activity from its customers on game day, with 696 GB of that happening inside the stadium, and the balance coming from the outside areas before and after the game. We’d also like to welcome Sprint to the big-game reporting crew (thanks, Sprint!), with its total of 754 GB of 4G LTE traffic used in and around the stadium on game day. According to Sprint representatives, its Super Bowl coverage efforts included five COWs (cell towers on wheels) as well as expanded DAS and macro placements in various Phoenix-area locations. The Sprint coverage included the 2.5 GHz spectrum that uses TDD LTE technology.

As also previously reported, Verizon Wireless claimed 4.1 TB of customer traffic in and around the stadium on game day, which Verizon says is all cellular traffic and does not reflect any Verizon Wireless customer use of the stadium Wi-Fi network. Verizon also reported some other interesting activity tidbits, including 46,772 Verizon Wireless devices used at the game, of which just 59.7 percent were smartphones. Verizon also said it saw 10 million emails sent on its networks that day, 1.9 million websites visited, and 122,308 videos sent or received over wireless connections.

We’re still waiting to see if we can get usage numbers from the Super Bowl stadium app (we’re especially interested to see if the instant replay feature caught on) but the warning for stadium owners and operators everywhere seems to be clear: If you’re hosting the big game (or any BIG game), make sure your network is ready for 6 TB and beyond!

AT&T sets new DAS traffic records for Super Bowl with 1.7 Terabyte mark

University of Phoenix Stadium

AT&T said its customers set new records for Super Bowl and professional football game wireless data consumption, with a total of 1.7 terabytes of traffic used in and around the University of Phoenix Stadium in Glendale, Ariz., Sunday night.

In a blog post from AT&T senior executive vice president John Donovan, AT&T said it saw 696 gigabytes of wireless data used on its in-stadium DAS Sunday night, with an additional 1 TB used in the surrounding parking lots and the Westgate entertainment district, a mall/restaurant complex connected to the UoP stadium area. The 1.7 TB mark surpasses the 1.4 TB DAS mark AT&T saw at the recent College Football Playoff championship game in Arlington, Texas, on Jan. 12.

Donovan’s blog post contains some interesting looks back — with a peak usage of 125 GB per hour Sunday, AT&T saw another new high mark, one that seems to say that usage of wireless data at stadiums is still climbing with no roof (retractable or not) in sight. Here are a couple of quotes:

Since 2011 – inclusive of the last five Big Games – the total data usage on AT&T’s in-stadium network has increased from 177GB to 696GB and peak hour data usage has increased from 30GB to 125GB.

And:

These numbers don’t come as a total shock as we experienced several high marks this season. In total, from 253 games at 31 stadiums, our customers have used more than 85.7TB of mobile data on our venue-specific cellular networks. That’s equivalent to more than 245M social media posts with photos from 253 games (an average of almost 1M social media posts per game).
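
Those quoted figures imply a steep compound growth rate; here is a quick calculation (our own arithmetic on AT&T’s numbers above, treating the five Big Games from 2011 to 2015 as four yearly intervals):

```python
# Implied annual growth of AT&T's in-stadium Big Game data usage,
# computed from the totals quoted above (2011: 177 GB -> 2015: 696 GB;
# peak hour: 30 GB -> 125 GB). The interval count is our assumption.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate over `years` intervals."""
    return (end / start) ** (1 / years) - 1

total_growth = cagr(177, 696, 4)   # total in-stadium data per Big Game
peak_growth = cagr(30, 125, 4)     # peak-hour data

print(f"Total data per Big Game: {total_growth:.0%} per year")
print(f"Peak-hour data: {peak_growth:.0%} per year")
```

Both series are compounding at roughly 40 percent a year, which is why network planners keep talking about designing for the next Super Bowl rather than the last one.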

We are still waiting for results from the stadium Wi-Fi network… will the total break the 6 TB mark set at the CFP championship game? Stay tuned! More AT&T infographic fun below.

click on photo for larger image