Can virtualization help venues meet growing mobile capacity demands?

By Josh Adelson, director, Portfolio Marketing, CommScope

U.S. mobile operators reported a combined 50 terabytes of cellular traffic during the 2018 Super Bowl, nearly double the previous year’s total. In fact, Super Bowl data consumption has doubled every year for at least the past six years, and it shows no sign of slowing down.

Clearly, fans love their LTE connections almost as much as they love their local team. Fans can use either cellular or Wi-Fi, but cellular is the default connection, whereas Wi-Fi requires a manual connection step that many users don’t bother with.[1] The same dynamic is playing out on a smaller scale in every event venue and commercial building.

Whether you are a venue owner, part of the team organization or in the media, this heightened connectivity represents an opportunity to engage more deeply with fans, and to extend your audience to the fans’ own social connections beyond the venue walls.

But keeping up with the demand is also a challenge. High capacity can come at a high cost, and high-capacity in-building systems require significant real estate for head-end equipment. Can you please your fans and leverage their connectedness while keeping equipment and deployment costs from breaking the capex scoreboard?

Virtualization and C-RAN to the rescue?

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements or infomercials. See our explanation of the feature to understand how it works.

Enterprise IT departments long ago learned that centralizing and virtualizing their computing infrastructure is a way to grow capacity while reducing equipment cost and space requirements. Can sports and entertainment venues achieve the same by virtualizing their in-building wireless infrastructure? To answer this question, let’s first review the concepts and how they apply to wireless infrastructure.

In the IT domain, virtualization refers to replacing multiple task-specific servers with a centralized resource pool that can be dynamically assigned to a given task on demand. Underlying this concept is the premise that, while each application has its own resource needs, at any given time only a subset will be active, so the total shared resource can be less than the sum of the individual requirements.
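To make that premise concrete, here is a toy sketch in Python — with invented demand numbers — that compares sizing dedicated resources for each application’s individual peak against sizing one shared pool for the peak of the combined demand. It illustrates the general statistical-multiplexing argument, not any particular vendor’s software.

```python
# Toy illustration of resource pooling: invented demand samples
# (in arbitrary capacity units) for three applications, measured
# across the same five time intervals.
demands = {
    "app_a": [4, 9, 2, 3, 1],
    "app_b": [1, 2, 8, 7, 2],
    "app_c": [3, 1, 2, 9, 4],
}

# Dedicated servers: each application must be sized for its own peak.
dedicated = sum(max(series) for series in demands.values())

# Virtualized pool: size once for the peak of the *combined* demand.
pooled = max(sum(interval) for interval in zip(*demands.values()))

print("dedicated capacity needed:", dedicated)  # 9 + 8 + 9 = 26
print("pooled capacity needed:   ", pooled)     # max(8, 12, 12, 19, 7) = 19
```

Because the three applications never peak at the same time, the shared pool (19 units) comes in well under the sum of the individual peaks (26 units) — the same effect C-RAN exploits for baseband capacity.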

How does this translate to in-building wireless? Centralizing the base station function is known as C-RAN, which stands for centralized (or cloud) radio access network. C-RAN involves not only the physical pooling of base stations into a single location — which is already the practice in most venues — but also a digital architecture and the software intelligence to allocate baseband capacity to different parts of the building in response to shifting demand.

C-RAN brings immediate benefits to large-venue in-building wireless deployments. The ability to allocate capacity across the venue via software rather than hardware adds flexibility and ease of operation. This is especially important in multi-building venues that include not only a stadium or arena but also surrounding administrative buildings, retail spaces, restaurants, hotels and transportation hubs. As capacity needs shift between these spaces by time of day or day of week, you need a system that can “point” the capacity at the current hot spots.
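As a minimal sketch of that “pointing” idea — with invented zone names and numbers, not CommScope’s actual allocation logic — the Python below divides a fixed pool of baseband capacity units across venue zones in proportion to current demand, so the same pool serves a packed bowl on game day and the office buildings at midweek:

```python
from typing import Dict

def allocate(pool_units: int, demand: Dict[str, int]) -> Dict[str, int]:
    """Split a fixed pool of baseband capacity units across zones in
    proportion to current demand, using largest-remainder rounding."""
    total = sum(demand.values()) or 1
    shares = {zone: pool_units * d / total for zone, d in demand.items()}
    alloc = {zone: int(share) for zone, share in shares.items()}
    # Hand leftover units to the zones with the largest remainders.
    leftover = pool_units - sum(alloc.values())
    for zone in sorted(shares, key=lambda z: shares[z] - alloc[z],
                       reverse=True)[:leftover]:
        alloc[zone] += 1
    return alloc

# Game day vs. a midweek afternoon: same pool, different hot spots.
print(allocate(24, {"bowl": 90, "concourse": 30, "offices": 5, "retail": 10}))
print(allocate(24, {"bowl": 5, "concourse": 5, "offices": 60, "retail": 30}))
```

The point is simply that in a digital C-RAN system this reassignment is a software operation; in an analog system it would mean physically re-cabling the head-end.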

C-RAN can even go a step further and remove the head-end from the building campus altogether. Mobile network operators are increasingly deploying base stations in distributed locations known as C-RAN hubs. If there is a C-RAN hub close to the target building, the in-building system can take a signal directly from the hub via a fiber connection. Even if the operator needs to add capacity to the hub for this building, this arrangement gives the operator the flexibility to use that capacity elsewhere in its network when the building doesn’t need it. It also simplifies maintenance and support, since the base station equipment stays within the operator’s facilities.

For the building owner, this architecture can reduce the footprint of the on-campus head-end by as much as 90 percent.

Once the baseband resources are centralized, the next logical step is to virtualize them into software running on PC server platforms. As it turns out, this is not so simple. Mobile baseband processing is a real-time, compute-intensive function that today runs on embedded processors in specialized hardware platforms. Much progress is being made toward virtualization onto more standard computers, but as of today most mobile base stations are still purpose-built.

Perhaps more important for stadium owners is the fact that the base station (also called the signal source) is selected and usually owned by the mobile network operator. Therefore, the form it takes has at most an indirect effect on the economics for the venue. And whether the signal source is virtual or physical, the signal must still be distributed by a physical network of cables, radios and antennas.

The interface between the signal source and the distribution network provides another opportunity for savings. The Common Public Radio Interface (CPRI) establishes a digital interface that reduces space and power requirements while allowing the distribution network — the DAS — to take advantage of the intelligence in the base station. To leverage these advantages, the DAS also needs to be digital.

To illustrate this, consider the head-end for a 12 MIMO sector configuration with 4 MIMO bands per sector, as shown below. In this configuration, a typical analog DAS is compared with a digital C-RAN antenna system, with and without a CPRI baseband interface. In the analog system, points of interface (POIs) are needed to condition the signals from the different sources to an equal level before they are combined and converted to optical signals via an e/o (electrical-to-optical) transceiver. In a digital system, signal conditioning and digitization are integrated into a single card, providing significant space savings. The three options dimension as follows (a short script after the list reproduces the arithmetic):

* A typical analog system requires 8 POIs (two per MIMO band, with 4 MIMO bands per sector) and 2 o/e transceivers per sector, for 10 cards per sector. A typical subrack (chassis) supports up to 10-12 cards, so one subrack supports 1 MIMO sector. For 12 MIMO sectors, 12 subracks are needed, each typically 4 rack units (RU) high, for a total of 48 rack units.

* For a digital system[2] without CPRI, each subrack supports 32 SISO ports. Each MIMO sector with 4 MIMO bands requires 8 SISO ports, so one subrack supports 4 MIMO sectors. For 12 MIMO sectors, 3 subracks of 5 RU each are needed, for a total of 15 rack units.

* For a digital system with CPRI, each subrack supports 48 MIMO ports. Each MIMO sector with 4 MIMO bands requires 4 ports, so one subrack supports all 12 MIMO sectors. A single 5-RU subrack therefore suffices, for a total of 5 rack units.
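The arithmetic in those three bullets reduces to a few lines of code. Here is a short sketch that reproduces the 48 / 15 / 5 rack-unit totals, with all per-subrack capacities and heights taken directly from the bullets above:

```python
import math

SECTORS = 12  # MIMO sectors in the example configuration
BANDS = 4     # MIMO bands per sector

# Analog DAS: 10 cards per sector fill a 10-12 card subrack,
# so one 4-RU subrack per sector.
analog_ru = SECTORS * 4                                    # 12 * 4 = 48 RU

# Digital, no CPRI: 32 SISO ports per 5-RU subrack; each sector
# needs 8 SISO ports (2 per MIMO band), so 4 sectors per subrack.
digital_ru = math.ceil(SECTORS / (32 // (BANDS * 2))) * 5  # 3 * 5 = 15 RU

# Digital with CPRI: 48 MIMO ports per 5-RU subrack; each sector
# needs 4 MIMO ports, so all 12 sectors fit in one subrack.
cpri_ru = math.ceil(SECTORS / (48 // BANDS)) * 5           # 1 * 5 = 5 RU

print(analog_ru, digital_ru, cpri_ru)  # 48 15 5
```

In other words, moving from analog to digital to digital-plus-CPRI shrinks the head-end from 48 to 15 to 5 rack units for the same 12-sector configuration.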

One commercial example of this is Nokia’s collaboration with CommScope to offer a CPRI interface between Nokia’s AirScale base station and CommScope’s Era C-RAN antenna system. With this technology, a small interface card replaces an array of remote radio units, reducing space and power consumption in the C-RAN hub by up to 90 percent. This also provides a stepping-stone to future Open RAN interfaces when they become standardized and commercialized.

The Benefits in Action — A Case Study

Even without virtualization, the savings from digitization, C-RAN and CPRI at stadium scale are significant. The table below shows a recent design that CommScope created for a large stadium complex in the U.S. For it, we compared three alternatives: a traditional analog DAS, a digital C-RAN antenna system with RF base station interfaces, and a C-RAN antenna system with CPRI interfaces. As the table illustrates, digital C-RAN and CPRI produce a dramatic reduction in space requirements.

The amount of equipment is reduced because a digital system does in software what analog systems must do in hardware, and CPRI eliminates still more hardware. Energy savings are roughly proportional to the space savings, since both are a function of the amount of equipment the solution requires.

Fiber savings, shown in the table below, are similarly significant.

The amount of fiber is reduced because digitized signals can be encoded and transmitted more efficiently.

But these savings are only part of the story. This venue, like most, is used for different types of events — football games, concerts, trade shows and even monster truck rallies. Each type of event has its own traffic pattern and timing. With analog systems, re-sectoring to accommodate these changes requires physically re-wiring head-end units on site. But with a digital C-RAN-based system, it’s possible to re-sector from anywhere through a browser-based, drag-and-drop interface.
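To see why that matters operationally, consider a hypothetical sketch of re-sectoring as configuration (the zone names and function below are invented for illustration, not CommScope’s actual interface). In an analog DAS the zone-to-sector mapping is frozen into the head-end cabling; in a digital C-RAN system it is just data that can be swapped per event:

```python
# Hypothetical zone-to-sector maps for two event types. In an analog
# DAS this mapping is fixed by physical cabling; in a digital C-RAN
# system it is configuration that can be pushed remotely.
FOOTBALL = {
    "lower_bowl_north": "sector_1", "lower_bowl_south": "sector_2",
    "upper_deck": "sector_3", "concourse": "sector_4",
}
CONCERT = {
    # Floor seating concentrates demand where the field usually is.
    "floor": "sector_1", "lower_bowl_north": "sector_2",
    "lower_bowl_south": "sector_2", "upper_deck": "sector_3",
    "concourse": "sector_4",
}

def resector(zone_map: dict) -> None:
    """Stand-in for pushing a new zone-to-sector map to the head-end."""
    for zone, sector in sorted(zone_map.items()):
        print(f"{zone:>18} -> {sector}")

resector(CONCERT)  # an event changeover is a config push, not a rewiring job
```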

The Bottom Line

It’s a safe bet that mobile data demand will continue to grow. But the tools now exist to let you control whether this growth is a burden or an opportunity to realize new potential. C-RAN, virtualization and Open RAN interfaces all have a role to play in making venue networks more deployable, flexible and affordable. By understanding what each one offers, you can make the best decisions for your network.

Josh Adelson is director of portfolio marketing at CommScope, where he is responsible for marketing the award-winning OneCell Cloud RAN small cell, Era C-RAN antenna system and ION-E distributed antenna system. He has over 20 years’ experience in mobile communications, content delivery and networking. Prior to joining CommScope, Josh held product marketing and management positions with Airvana (acquired by CommScope in 2015), PeerApp, Brooktrout Technology and Motorola. He holds an MBA from the MIT Sloan School of Management and a BA from Brown University.

FOOTNOTES
1: 59 percent of users at the 2018 Super Bowl attached to the stadium Wi-Fi network.
2: Dimensioning for digital systems is based on CommScope Era.

New Report: Texas A&M scores with new digital fan-engagement strategy

In the short history of in-stadium mobile fan engagement, a team or stadium app has been the go-to strategy for many venue owners and operators. But what if that strategy is wrong?

That question gets an interesting answer in the lead profile of our most recent STADIUM TECH REPORT, the Winter 2018-19 issue! These quarterly long-form reports are designed to give stadium and large public venue owners and operators, and digital sports business executives, a way to dig deep into the topic of stadium technology, via exclusive research and profiles of successful stadium technology deployments, as well as news and analysis of topics important to this growing market.

Leading off this issue is an in-depth report on a new browser-based digital game-day program launched this football season at Texas A&M, where some longtime assumptions about mobile apps and fan engagement were blown apart by the performance of the Aggies’ new project. A must-read for all venue operations professionals! We also have in-person visits to Atlanta’s Mercedes-Benz Stadium and the renovated State Farm Arena, the venue formerly known as Philips Arena. A Q&A with NFL CIO Michelle McKenna-Doyle and a report on a CBRS network test by the PGA round out this informative issue! DOWNLOAD YOUR REPORT today!

We’d like to take a quick moment to thank our sponsors, which for this issue include Mobilitie, JMA Wireless, Corning, Huber+Suhner, Boingo, Oberon, MatSing, Neutral Connect Networks, Everest Networks, and ExteNet Systems. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to welcome readers from the Inside Towers community, who may have found their way here via our ongoing partnership with that excellent publication. And we thank the SEAT community for its continued interest and support.

As always, we are here to hear what you have to say: Send me an email to kaps@mobilesportsreport.com and let us know what you think of our STADIUM TECH REPORT series.

Mercedes-Benz Stadium Wi-Fi saw 12 TB of data used at January’s college championship

The iconic ‘halo board’ video screen below the unique roof opening at Atlanta’s Mercedes-Benz Stadium. Credit all photos: Paul Kapustka, MSR (click on any picture for a larger image)

The Wi-Fi network at Atlanta’s Mercedes-Benz Stadium saw 12 terabytes of data used at the 2018 College Football Playoff championship on Jan. 8, 2018, according to officials from the Atlanta Falcons, owners and operators of the city’s distinctive new venue.

We’d long suspected that Mercedes-Benz Stadium, which opened in August 2017, had seen big data days inside the 71,000-seat venue with its innovative technology, but until Sunday the Falcons had never made any network-performance data publicly available. Then, a day after the venue saw another 8.06 TB of Wi-Fi used during the SEC Championship game, Danny Branch, chief information officer for AMB Sports & Entertainment, revealed the statistics during a live MSR visit to an Atlanta Falcons home game. The 12 TB mark (which was an estimate — we’ll check back with the Falcons for exact numbers) is the second-highest we’ve ever seen in our unofficial research of single-day Wi-Fi totals, trailing only the 16.31 TB recorded at Super Bowl LII in February at U.S. Bank Stadium.

“We’re confident and ready for the Super Bowl,” said Branch during a pregame stadium tour, details of which we’ll dig into deeper in a full profile for our upcoming Winter Stadium Tech Report. Multiple network speed tests taken by MSR during Sunday’s 26-16 Falcons loss to the visiting Baltimore Ravens showed robust Wi-Fi performance on the network that uses gear from Aruba, a Hewlett Packard Enterprise company, in a design from AmpThink.

DAS renovation complete

An under-seat DAS antenna in the 300 seating section at Mercedes-Benz Stadium

According to Branch, the cellular distributed antenna system (DAS) network inside Mercedes-Benz — a deployment that is at the center of a current lawsuit filed by contractor IBM against gear supplier and designer Corning — is also now at full deployment, with the completion of 700 new under-seat DAS antenna deployments, mostly in the upper seating deck.

MSR speed tests taken during Sunday’s game showed a wide range of DAS results, from single-digit tests in some tough-deployment areas to results near 100 Mbps directly in front of what looked like some new antenna deployments. Again, look for more details in our upcoming profile in the Winter Stadium Tech Report (due out in mid-December).

“We’re in a good place [with the DAS],” said Branch, though he did say more DAS work will be done on the outside of Mercedes-Benz Stadium before Super Bowl LIII comes to the venue on Feb. 3, 2019, mainly to help ensure that the move toward more digital Super Bowl tickets goes smoothly. Mercedes-Benz Stadium also now has a couple of MatSing ball antennas in its rafters, there to bring DAS coverage to the sidelines of the playing field.

On Sunday, Mercedes-Benz Stadium staffers were hosting the second game of a rare big-event back-to-back, following Saturday’s packed-house tilt between SEC powers Alabama and Georgia, a championship-game rematch won by Alabama 35-28 after a dramatic comeback.

“That was a massive flip,” said Branch of the two-day stretch, which saw another huge data day Saturday with 8.06 TB of Wi-Fi used. The network, sponsored by backbone provider AT&T, averages about a 50 percent take rate from event attendees, according to Branch, who gave praise to Aruba and AmpThink for their combined deployment efforts.

“The expectation for fans now is that there will be Wi-Fi [in a sports venue],” said Branch. “But I love it when friends come to me after a game and tell me ‘the Wi-Fi is so fast!’ ”

THE MSR TOP 17 FOR WI-FI

1. Super Bowl 52, U.S. Bank Stadium, Minneapolis, Minn., Feb. 4, 2018: Wi-Fi: 16.31 TB
2. 2018 College Football Playoff Championship, Alabama vs. Georgia, Mercedes-Benz Stadium, Atlanta, Ga., Jan. 8, 2018: Wi-Fi: 12.0 TB*
3. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
4. Atlanta Falcons vs. Philadelphia Eagles, Lincoln Financial Field, Philadelphia, Pa., Sept. 6, 2018: Wi-Fi: 10.86 TB
5. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
6. Taylor Swift Reputation Tour, Gillette Stadium, Foxborough, Mass., July 27, 2018: Wi-Fi: 9.76 TB
7. Minnesota Vikings vs. Philadelphia Eagles, NFC Championship Game, Lincoln Financial Field, Philadelphia, Pa., Jan. 21, 2018: Wi-Fi: 8.76 TB
8. Jacksonville Jaguars vs. New England Patriots, AFC Championship Game, Gillette Stadium, Foxborough, Mass., Jan. 21, 2018: Wi-Fi: 8.53 TB
9. Taylor Swift Reputation Tour, Broncos Stadium at Mile High, May 25, 2018: Wi-Fi: 8.1 TB
10. Kansas City Chiefs vs. New England Patriots, Gillette Stadium, Foxborough, Mass., Sept. 7, 2017: Wi-Fi: 8.08 TB
11. SEC Championship Game, Alabama vs. Georgia, Mercedes-Benz Stadium, Atlanta, Ga., Dec. 1, 2018: Wi-Fi: 8.06 TB*
12. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
13. Stanford vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Sept. 29, 2018: Wi-Fi: 7.19 TB
14. (tie) Southern California vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Oct. 21, 2017: Wi-Fi: 7.0 TB
Arkansas State vs. Nebraska, Memorial Stadium, Lincoln, Neb., Sept. 2, 2017: Wi-Fi: 7.0 TB
15. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
16. Wisconsin vs. Nebraska, Memorial Stadium, Lincoln, Neb., Oct. 7, 2017: Wi-Fi: 6.3 TB
17. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB

* = pending official exact data

NFL CIO: Mercedes-Benz Stadium’s wireless is ‘ready for the Super Bowl’

The entry concourse at Atlanta’s Mercedes-Benz Stadium. Credit all photos: Paul Kapustka, MSR

The wireless networks at Atlanta’s Mercedes-Benz Stadium are “ready for the Super Bowl,” according to Michelle McKenna, senior vice president and chief information officer for the NFL, who spoke to Mobile Sports Report via phone last week.

Though McKenna would not comment on any of the particulars of the recent lawsuit filed by IBM against Corning that revolves around issues with the stadium’s distributed antenna system (DAS) cellular network, she did assert that any past problems have since been fixed, and that the league is confident the venue’s wireless systems will stand up to the stress test that will likely arrive when Super Bowl LIII takes place on Feb. 3, 2019.

“The [Atlanta] Falcons have been super-cooperative in remedying one of the challenges they had,” said McKenna. “The networks will be ready for the Super Bowl.”

Mercedes-Benz Stadium also has an Aruba-based Wi-Fi network, which has not been the subject of any lawsuit; however, stadium officials have never released any performance statistics for that network since the stadium opened. According to IBM’s lawsuit documents, the company said it had to pay extra to fix the DAS network, a task it said was completed before the end of the 2017 NFL season.

Outside connectivity a challenge as well

While the Super Bowl is almost always the biggest single-day sports event for wireless connectivity, McKenna added that this year’s version will be a little more challenging than most, since the league is in the process of moving fans to digital ticketing for its championship event.

“This year one of the new challenges is the move to paperless ticketing,” said McKenna in a wide-ranging interview about NFL technology issues (look for a full breakdown of the interview in our upcoming Winter Stadium Tech Report). Though this year’s game will still have some paper-based ticket options, McKenna said the lessons learned in ensuring good connectivity outside the stadium gates will help prepare for future Super Bowls, which will likely use all-digital ticketing.

One Super Bowl technology not yet decided is the game-day app, which for the past two years has been built by the NFL. In previous years, the league used versions of local game-day apps with Super Bowl additions, a direction McKenna said the league might still take this year. Designed mainly as a way to help visitors find their way around an unfamiliar stadium and city, the Super Bowl app this year might need to lean on the local app to help integrate the digital ticket functionality, McKenna said. The Falcons’ app for Mercedes-Benz Stadium was built by IBM.

IBM sues Corning over ‘botched’ Mercedes-Benz Stadium DAS deployment

The entry concourse at Atlanta’s new Mercedes-Benz Stadium. Credit all photos: Paul Kapustka, MSR (click on any photo for a larger image)

IBM has filed suit against Corning in Georgia federal court, claiming that Corning “botched” the design for the cellular distributed antenna system installed at the Atlanta Falcons’ Mercedes-Benz Stadium, according to court documents first reported on Law360.com.

According to the court filing, IBM alleges that Corning failed to deliver a working DAS for the Falcons’ new home, which opened last summer. IBM also said that it, the Falcons and the NFL had to spend millions more to make the system work. The topic is of special interest to the NFL and the Falcons, since Mercedes-Benz Stadium will host Super Bowl LIII in February.

In the court documents IBM does not list an amount it is seeking as compensation, but instead said it will seek “damages in an amount to be proven at trial” for several categories of claims it identified. In its filing, IBM claimed that it had purchased approximately $20 million in equipment and materials from Corning to build the DAS.

Wi-Fi and DAS antennas inside Mercedes-Benz Stadium

Corning representatives declined to be interviewed about the subject Friday, but a company spokesperson provided the following official reply via email:

“Corning is a company of the highest integrity. We are confident that the company has conducted itself in an honorable manner and has been fully compliant in meeting its contractual obligations.”

Among the claims in IBM’s filing is that Corning’s design for the DAS was flawed, especially in its ability to predict the correct placement and orientation of DAS antennas. IBM also claimed that Corning did not have enough engineers on hand during deployment times to ensure the DAS was working correctly. According to IBM’s filing, during last fall’s opening season there were “many areas of Mercedes-Benz Stadium that had little or no usable cellular services” until IBM fixed the system later in the year.

IBM also declined to make anyone available for an interview with MSR. A company spokesperson provided the following official reply via email:

“IBM successfully works with partners on major projects around the world. On this project, however, Corning delivered a flawed cellular system to the Falcons and IBM, and then failed to fix it. IBM stepped in and spent a year to deliver state-of-the-art cellular performance for fans, and Corning is now accountable for failing to live up to its obligations.”

MSR has heard that Verizon Wireless is currently working on enhancing the DAS network at the stadium for the upcoming Super Bowl, but Verizon executives would not comment on any specifics. One source has told MSR that some of the improvements include new under-seat DAS antenna placements, which have required core drilling through the existing concrete floors for installation.

We’ll be following and developing this story as we can, so stay tuned for more info (and please contact us if you know any of the particulars).

Wi-Fi upgrade producing solid results for Denver Broncos at Mile High

A fan walks by a railing wireless enclosure in the upper deck of Broncos Stadium at Mile High during the Oct. 1 game against the Kansas City Chiefs. Credit all photos: Paul Kapustka, MSR (click on any picture for a larger image)

As the Denver Broncos’ Wi-Fi network upgrade nears completion, solid coverage around the venue now known as Broncos Stadium at Mile High is producing Wi-Fi data totals averaging more than 6 terabytes per game, according to statistics from the team.

During a recent game-day visit to Mile High, Mobile Sports Report got consistently strong Wi-Fi readings throughout the venue and out into the parking lots as well. Multiple speed tests recorded bandwidth marks in the high double digits of megabits per second, even at the top reaches of the stands and in other hard-to-cover areas like concourses and plazas.

And even as Russ Trainor, the Broncos’ senior vice president for information technology, and his networking team put the final tuning touches on an expansion that will end with somewhere near 1,500 Cisco Wi-Fi APs installed throughout the building, the football (and concert) fans who have shown up lately are already finding ways to use lots of Wi-Fi data. In the first three home games of the Broncos’ current regular season, Trainor said, the Wi-Fi network saw total single-day usage of 6.4 TB, 6.3 TB and 6.2 TB, the last coming during the exciting Monday Night Football game Oct. 1 against the Kansas City Chiefs.

More APs coming for gate areas, concourses

“We still have a few more APs to add,” said Trainor in a quick interview during the Chiefs game, which MSR attended. And while Trainor added that the team is also planning to step up its promotion of the network, many fans are finding it already, as shown by some other high-water marks this year: a peak of 32,837 concurrent users during the home opener on Sept. 9; peak throughput of 10.83 Gbps on Sept. 16; and the most unique connections, 42,981, on Oct. 1.

Parking lots are well-covered at Mile High

Because many of the newly added APs are Cisco 3800 Series units with two radios each, Trainor is confident the Broncos Stadium network is far from maxing out.

“We still have room to grow folks onto the system, and we’ll continue to advertise that network for the fans,” Trainor said.

During our visit at the Oct. 1 game, MSR was impressed the moment we got out of our car in the parking lot, where we recorded a Wi-Fi mark of 28.3 Mbps down and 56.5 Mbps up. As Verizon customers we were automatically connected to the stadium’s Wi-Fi network, one of the perks that came with Verizon’s investments in the Wi-Fi and DAS networks at the stadium.

Inside the premium-seating United Club area, we got a Wi-Fi mark of 48.0 Mbps / 70.3 Mbps, even as fans crowded the open dining hall during pregame. We also saw some cool new food-station kiosks along one wall, each with its own connected display for menu items as well as a touchscreen payment system (a turnkey deployment from Centerplate, Tapin2, and PingHD) that eliminated the need for additional concessions staffers.

Up on the top-level concourse we saw APs on every other wall section, each with two antennas pointing in opposite directions, coverage that produced one mark of 31.8 Mbps / 68.2 Mbps even as fans crowded in for food and drink before kickoff. According to Trainor, the concourse areas will get roughly double the coverage with more APs next year, to support a plan to move to more digital payment methods.

A good look at the hardened, single-cable Wi-Fi APs in the walkway ramps area. According to the Broncos, these use PoE (Power over Ethernet), cutting down on the conduit needed.

Out in the upper-level stands (Section 541, row 5) we got a Wi-Fi mark of 36.0 Mbps / 29.6 Mbps, in an area where we could see APs pointing down on the seats from the top-level light standards as well as in railing enclosures. Some areas in the upper deck are also covered by under-seat APs, which also are used in the south end zone stands where there is no overhang infrastructure.

We also got good connectivity in an often-overlooked area, the walkway ramps and escalators behind the seats, where the Broncos installed APs with power over Ethernet and hardened enclosures, since those areas are exposed to the elements. While riding up an escalator we not only stayed connected but got a test mark of 26.4 Mbps / 37.6 Mbps.

Keeping crowds of fans connected

In perhaps one of the biggest stress tests we could find, the Mile High Wi-Fi had no problem keeping fans connected. Just before halftime we planted ourselves on the outdoor plaza behind the south stands and waited for fans to crowd the area during the break. With a Wi-Fi mark of 38.4 Mbps / 35.7 Mbps five minutes into the halftime break, we were still able to easily view video highlights of the first half even as everyone around us was using their phones to check email or connect with friends and family.

As the second-half kickoff neared, we walked into the main concourse underneath the west stands and still stayed solidly connected, with a mark of 33.0 Mbps / 59.1 Mbps in the middle of a thick crowd of fans who were either waiting for concessions or walking back to their seats.

With a high-water mark of 8.1 TB for a Taylor Swift concert earlier this spring, the new Wi-Fi network in Broncos Stadium at Mile High showed that it’s more than ready for big games or other big events. Some more photos from our visit below!

Nothing like Monday Night Football!

Fans gather on the south stands plaza during halftime

Close-up of an AP install on the back wall facing out into the south stands plaza

United Club dining area with single-stand kiosks in back

Single-stand food kiosk with its own display and self-service payment terminal (from PingHD)

AP deployment on top-level concourse

AP deployment (on post) in lower concourse area