Average per-fan Wi-Fi data use jumps again at Super Bowl 52

Seen in the main concourse at U.S. Bank Stadium: Two IPTV screens, one Wi-Fi AP and a DAS antenna. Credit: Paul Kapustka, MSR

After a year in which the average amount of Wi-Fi data used per connected fan at the Super Bowl dropped, the per-fan trend reversed course again at Super Bowl 52, hitting a new peak with an average of 407.4 megabytes per user.

Even though the number of unique connections to the Wi-Fi network at U.S. Bank Stadium for Super Bowl 52 also increased, to a record 40,033 users (according to the official statistics compiled by Extreme Networks), the jump from 11.8 terabytes of Wi-Fi data used at Super Bowl 51 to 16.31 TB at Super Bowl 52 pushed the per-user average to a new high, surpassing both the 333 MB per user seen at Super Bowl 51 and the 370 MB per user recorded at Super Bowl 50.
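For readers who want to check the math, here is a minimal sketch of the per-fan calculation using the totals reported above; it assumes the decimal convention (1 TB = 1,000,000 MB) that these figures appear to follow, and the function name is ours, purely for illustration.

```python
def mb_per_user(total_tb: float, unique_users: int) -> float:
    """Average megabytes per connected user, assuming 1 TB = 1,000,000 MB."""
    return (total_tb * 1_000_000) / unique_users

# Super Bowl 52: 16.31 TB across 40,033 unique Wi-Fi users
print(round(mb_per_user(16.31, 40_033), 1))  # ~407.4 MB per user

# Super Bowl 51: 11.8 TB across 35,430 unique Wi-Fi users
print(round(mb_per_user(11.8, 35_430), 1))   # ~333 MB per user
```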

While this statistic has never been called out in the Extreme Networks Super Bowl compilations, we here at MSR think it is a vital mark, since it shows that even with more users on the network, each connected user is still consuming more data. That means IT departments at venues everywhere should still plan for no letup in the continued growth in demand for bandwidth at large-venue events, especially at “bucket list” events like the Super Bowl.

Last year we guessed that the drop in per-user totals from Super Bowl 50 to Super Bowl 51 might have been due to a larger number of autoconnected users, but we never got an answer from the Extreme Networks team when we asked. U.S. Bank Stadium also had a Wi-Fi autoconnect feature for Verizon Wireless customers, but it didn’t seem to depress the per-user average.

Connectivity at the core of Little Caesars Arena, District Detroit

Little Caesars Arena, the new home for the Detroit Red Wings and the Detroit Pistons. Credit: Olympia Entertainment (click on any photo for a larger image)

Bringing great wireless connectivity to a new stadium is almost table stakes these days. But building up a nearby commercial district — and keeping connectivity high outside the venue’s walls — is a bet at another level entirely, especially in Detroit, where networks extend beyond the new Little Caesars Arena into the 50-block District Detroit.

Following the arena’s opening in September 2017, the prognosis is so far, so good, with solid reports of high network performance on both Wi-Fi and cellular in and around the new home of the NHL’s Detroit Red Wings and the NBA’s Detroit Pistons. But for John King, vice president of IT and innovation for venue owner Olympia Entertainment, the responsibilities for him and his network team extend far beyond the new stadium’s walls.

“We’re focused on the [wireless] signal not just in the bowl, but also in the surrounding elements — the streets, the outdoor arenas, and the Little Caesars Arena garage,” said King in an interview shortly after the arena opened. “The vision is, to be connected wherever you are. And to share that experience.”

An ambitious revival in downtown Detroit

Editor’s note: This profile is from our most recent STADIUM TECH REPORT for Winter 2018, which is available for FREE DOWNLOAD from our site. This issue has an in-depth look at the wireless networks at U.S. Bank Stadium in Minneapolis, as well as profiles of network deployments at the Las Vegas Convention Center and Orlando City Stadium! DOWNLOAD YOUR FREE COPY today!

The inside concourse at Little Caesars Arena. Credit: Olympia Entertainment

Built near the Detroit Lions’ Ford Field and the Tigers’ Comerica Park, the new hoops/hockey stadium seats 19,515 for hockey and 20,491 for basketball. Unlike many stadiums of the past, which rise up from the ground, Little Caesars Arena is built into the ground, 40 feet below street level. The innovations in construction and accessibility, including an outdoor arena adjacent to the indoor one, may merit another full profile and an in-person visit.

For now, we’ll concentrate on the wireless deployment in and around Little Caesars Arena, which was funded in part by a sponsorship from Comcast Business, which provides backbone bandwidth to the arena and the district in the form of two 100 Gbps connections. The Wi-Fi network design and deployment, done by AmpThink, uses Cisco Wi-Fi gear; Cisco’s Vision for Sports and Entertainment (formerly known as StadiumVision) is used to synchronize video output to the 1,500 TV screens located in and around the venue.

On the cellular side, Verizon Wireless built a neutral-host DAS, which was getting ready to welcome AT&T as its second carrier shortly after the opening. According to King, the Wi-Fi network has approximately 1,100 total APs inside and outside the arena, many of them from Cisco’s 3802 series, each of which has two radios. For many of the 300 APs located in the main seating bowl, Little Caesars Arena went with an under-seat deployment, with some others placed in handrail enclosures, especially in the basketball floor-seating areas.

“AmpThink did a really nice job with the deployment,” said King, who noted that the arena’s open-air suite spaces helped provide “lots of flow” for wireless gear, without the traditional overhangs blocking signals between levels. One early visitor to the arena saw many Wi-Fi speed tests in the 50-60 Mbps range for both download and upload, as well as several in the 80-to-100 Mbps range, signs that a strong signal was available right at the start.

“We’ve still got a lot of tuning, but early on we’re getting great results,” said King of the Wi-Fi performance. “Our goal is to make it the best it can be.”

Staying connected outside the walls

Like The Battery area surrounding the Atlanta Braves’ new SunTrust Park, the District Detroit is meant to be a stay-and-play kind of space, with restaurants, clubs, office spaces and residences seeking to lure visitors and residents to do more than just see a game. For King and his team, one of their tasks is to ensure that visitors can stay connected no matter where they are inside the district, including inside restaurants, offices and other indoor spaces.

Connectivity blends well with the architecture inside Little Caesars Arena. Credit: Tod Caflisch, special to MSR

“We want the [network] signal to be robust, to carry into outdoor spaces, restaurants and many other areas” inside the District Detroit, King said. “We want to push the envelope a little bit and create a useful opportunity.”

Back inside Little Caesars Arena, the team and stadium apps are built by Venuetize, which built a similar integrated app for the Buffalo Bills and the Buffalo Sabres, one that also extends outside arenas to support connectivity in city areas. King said that Little Caesars Arena will be testing pre-order and express pickup concession ordering through the app, with a focus on seating areas that don’t have ready access to some of the club facilities.

Like any other new facility, Little Caesars Arena will no doubt go through some growing pains in its debut season, but for King and others who spent time getting the venue ready it’s fun to have the doors open.

“It’s really great seeing it all come to life,” King said.

Fans use 16.31 TB of Wi-Fi data during Super Bowl 52 at U.S. Bank Stadium

A Wi-Fi handrail enclosure at U.S. Bank Stadium in Minneapolis. Credit: Paul Kapustka, MSR (click on any photo for a larger image)

It is now official — we have a new record for most Wi-Fi data used at a single-day event, as fans at U.S. Bank Stadium in Minneapolis for Super Bowl 52 used 16.31 terabytes of data on the Wi-Fi network.

According to statistics compiled by Extreme Networks during the Philadelphia Eagles’ thrilling 41-33 victory over the New England Patriots Sunday night, the AmpThink-designed network, which uses Cisco Wi-Fi gear, also saw 40,033 unique users — 59 percent of the 67,612 in attendance — the highest take rate for any single-game network we’ve been told about. (The Dallas Cowboys saw approximately 46,700 unique Wi-Fi users during a playoff game last season, about 50 percent of attendance at AT&T Stadium.)

The network also saw a peak of 25,670 concurrent users and a peak data transfer rate of 7.867 Gbps, according to the numbers released by Extreme. Though Extreme gear was not used in the operation of the network, Extreme has a partnership deal with the NFL under which it provides the “official” network analytics reports from the Super Bowl.

The final total of 16.31 TB easily puts Super Bowl 52 ahead of the last two Super Bowls when it comes to Wi-Fi data use. Last year at NRG Stadium in Houston, there was 11.8 TB of Wi-Fi use recorded, and at Super Bowl 50 in 2016 there was 10.1 TB of Wi-Fi data used at Levi’s Stadium in Santa Clara, Calif. So in reverse chronological order, the last three Super Bowls are the top three Wi-Fi events, indicating that data demand growth at the NFL’s biggest game shows no sign of slowing down. Combined with the 50.2 TB of cellular data used in and around the stadium on game day, Super Bowl 52 saw a total of 66.51 TB of wireless traffic Sunday in Minneapolis.

Confetti fills the air inside U.S. Bank Stadium after the Philadelphia Eagles defeated the New England Patriots in Super Bowl LII. Credit: U.S. Bank Stadium

Super Bowl 52 represented something of a leap of faith, in that the handrail-enclosure Wi-Fi design had not yet seen a stress test like the one found at the NFL’s biggest event. Now looking ahead to hosting the 2019 Men’s NCAA Basketball Final Four, David Kingsbury, director of IT for U.S. Bank Stadium, can be forgiven for wanting to take a bit of a victory lap before we set our Wi-Fi sights on Atlanta’s Mercedes-Benz Stadium, home of Super Bowl 53.

“AmpThink, CenturyLink and Cisco designed and built a world-class wireless system for U.S. Bank Stadium that handled record-setting traffic for Super Bowl LII,” Kingsbury said. “AmpThink president Bill Anderson and his team of amazing engineers were a pleasure to work with and the experts at Cisco Sports and Entertainment supported us throughout the multi-year planning process required for an event of this magnitude. High-density wireless networking is such a challenging issue to manage, but I am very happy with our results and wish the team in Atlanta the best next year. The bar has been raised.”


1. Super Bowl 52, U.S. Bank Stadium, Minneapolis, Minn., Feb. 4, 2018: Wi-Fi: 16.31 TB
2. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
3. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
4. Minnesota Vikings vs. Philadelphia Eagles, NFC Championship Game, Lincoln Financial Field, Philadelphia, Pa., Jan. 21, 2018: Wi-Fi: 8.76 TB
5. Kansas City Chiefs vs. New England Patriots, Gillette Stadium, Foxborough, Mass., Sept. 7, 2017: Wi-Fi: 8.08 TB
6. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
7. Southern California vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Oct. 21, 2017: Wi-Fi: 7.0 TB
8. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
9. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB
10. Georgia vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Sept. 9, 2017: Wi-Fi: 6.2 TB

U.S. Bank Stadium in Minneapolis before the start of Super Bowl LII

Eagles see 8.76 TB of Wi-Fi data for NFC Championship game on new Panasonic network

Panasonic Everest Wi-Fi APs (lower left, middle right) mounted underneath an overhang at Lincoln Financial Field in Philadelphia. Credit: Panasonic (click on any photo for a larger image)

The Philadelphia Eagles saw 8.76 terabytes of Wi-Fi data used at Lincoln Financial Field on Jan. 21 during the Eagles’ 38-7 win over the Minnesota Vikings in the NFC Championship game, a new single-day high among reported Wi-Fi marks for games not called the Super Bowl.

Though the game’s position as No. 3 on our unofficial “top Wi-Fi” list (see below) may change as we get reports from other recent NFL playoff games, the mark is nevertheless impressive, and perhaps a big confirmation metric for Panasonic’s nascent big-venue Wi-Fi business. According to Panasonic, its 654-AP network inside “The Linc” saw 35,760 unique connections during the game, out of 69,596 in attendance, along with a peak of 29,201 concurrently connected devices (which came during the post-game trophy presentation) and peak throughput of 5.5 Gbps.

What’s most interesting about the new Panasonic network in Philadelphia is that it is a completely top-down deployment, meaning that most of the APs (especially the 200 used in the seating bowl) shoot signals down toward seats from above. While most new networks at football-sized stadiums (and some smaller arenas) have turned to under-seat or railing-mounted APs to increase network density in seating areas, Panasonic claims its new “Everest” Wi-Fi gear has antennas that can provide signals up to 165 feet away, with “electronically reconfigurable directional beam profiles” that allow specific tuning of where the Wi-Fi signal points.

By putting four separate Wi-Fi radios into each access point, Panasonic also claims it can save teams and venues money and time on Wi-Fi deployments, since fewer physical devices are needed. By comparison, other big new network deployments like Notre Dame’s often have a thousand or more APs; Notre Dame, which uses railing-mounted APs in the seating bowl, has 685 there out of a total 1,096 APs. Many of the Notre Dame APs are Cisco 3800 devices, each of which has two Wi-Fi radios.
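As a rough back-of-the-envelope comparison (not a like-for-like engineering analysis), here is a sketch of how the radio counts compare, assuming four radios per Panasonic AP and, for simplicity, two radios per AP at Notre Dame; the real mix of models would shift these numbers.

```python
# Rough comparison of AP and radio counts, using the figures cited above.
# Radios-per-AP values are simplifying assumptions (4 for Panasonic Everest,
# 2 for the Cisco 3800-class APs that make up much of Notre Dame's network).

deployments = {
    "Lincoln Financial Field (Panasonic)": {"aps": 654, "radios_per_ap": 4},
    "Notre Dame Stadium (Cisco)": {"aps": 1096, "radios_per_ap": 2},
}

for venue, d in deployments.items():
    radios = d["aps"] * d["radios_per_ap"]
    print(f"{venue}: {d['aps']} APs -> ~{radios} radios")
```

Fewer physical boxes can still yield a comparable radio count, which is the core of Panasonic’s cost argument.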

‘The Linc’ before last week’s NFC Championship game. Credit: Kiel Leggere, Eagles

Atlanta’s new Mercedes-Benz Stadium, which uses Aruba Wi-Fi gear mainly deployed under seats in the bowl, has nearly 1,800 APs, with 1,000 of those in the seating bowl.

Antennas close to fans vs. farther away

From a design and performance standpoint, the under-seat or railing-mounted “proximate” networks are built with many APs close together, with the idea that fans’ bodies will soak up some of the Wi-Fi signal, an effect network designers use to their advantage to help limit interference between radios. The under-seat AP design, believed to have been first widely used at AT&T Park in San Francisco and then at a larger scale at Levi’s Stadium in Santa Clara, Calif., was developed to bring better signals to seats where overhang-mounted APs couldn’t deliver strong connectivity. Older concrete-bowl stadiums like Notre Dame’s also went with a proximate railing design because of a similar lack of overhangs.

Though the Eagles’ IT team has repeatedly turned down interview requests from MSR since this summer, Danny Abelson, vice president of connectivity for Panasonic Enterprise Solution Company, met with MSR last week to provide details of the deployment. Citing new, patented antenna technology developed by Panasonic specifically to overcome the limitations of prior overhead gear, Abelson claims Panasonic can deliver a similar stadium experience for “two-thirds the cost” of an under-seat or railing-mount network design, with savings realized both in construction costs (since overhead-mounted equipment is usually cheaper to install than under-seat or railing mounts, which require drilling) and in the smaller number of APs needed, since Panasonic puts four radios in each of its main Wi-Fi APs.

Eagles fans cheering their team to the Super Bowl. Credit: Hunter Martin, Eagles

Abelson, however, declined to provide the exact cost of the Panasonic network at Lincoln Financial Field, citing non-disclosure agreements. There are also more questions to be answered about a Panasonic deployment’s cost, including charges for management software and/or administration services. Currently, Abelson said, Panasonic includes the costs for management software and management personnel in its bids.

When it comes to how the Eagles found Panasonic, the team and the company already had a relationship, as Panasonic’s video-board division had previously supplied displays for the Linc. According to Abelson, Panasonic went through a performance test at several Eagles games last season, bringing in Wi-Fi gear to see if the new technology could provide coverage in areas where the Eagles said they had seen lower-quality coverage before. One of the NFL’s forerunners in bringing Wi-Fi to fans, the Eagles had previously used Extreme Networks gear to build a fan-facing network in 2013. Though the Eagles would not comment on the selection process, the team issued an RFP this past offseason and chose Panasonic for a new network, which Abelson said was deployed in three months.

Re-opening the debate for antenna placement?

Though Mobile Sports Report has not yet been able to get to Philadelphia to test the new network in a live game-day situation, if Panasonic’s new gear works as promised, the company may find many interested customers, especially those who had shied away from deploying under-seat networks due to the construction issues or costs.

The Panasonic system may be of particular interest to indoor arenas, like those used for hockey and basketball, where the gear could potentially be mounted in catwalk areas to cover seating. John Spade, CTO for the NHL’s Florida Panthers and BB&T Center in Sunrise, Fla., has tweeted favorably about a Panasonic deployment going in at the arena whose networks he oversees.

But even as the impressive 8.76 TB mark from the NFC Championship game now sits third among the highest reported Wi-Fi data totals we’ve heard of (behind only the 10.1 TB at Super Bowl 50 and the 11.8 TB at Super Bowl 51), that number may slip down the list if we ever get verified numbers for some network totals we’ve heard rumors about lately. (Or even any older ones! C’mon, network teams: Check out the list below and let us know if we’ve missed any.)

So far this season, we haven’t gotten any reports of Wi-Fi usage out of the network team at Atlanta’s Mercedes-Benz Stadium (which recently hosted the college football playoff championship game), and we’ve only heard general talk about oversized playoff-game traffic at U.S. Bank Stadium in Minneapolis, home of Sunday’s Super Bowl 52. Like Notre Dame Stadium, U.S. Bank Stadium uses a mostly railing-mounted AP deployment in its seating bowl; both networks were designed by AmpThink. We are also still waiting for reports from last week’s AFC Championship game at Gillette Stadium, where the previous non-Super Bowl top mark of 8.08 TB was set in September; and from any games this fall at AT&T Stadium in Arlington, Texas, where the NFL’s biggest stadium has 2,567 Wi-Fi APs.

Will overhead designs still be able to keep up as demand for bandwidth keeps growing? Will Panasonic’s claims of lower costs for equal performance hold up? At the very least, the performance in Philadelphia could re-open the debate about whether you need to deploy APs closer to fans to provide a good Wi-Fi experience. If all goes well, the winners in this renewed competition will be venues, teams, and ultimately, fans.


1. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
2. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
3. Minnesota Vikings vs. Philadelphia Eagles, NFC Championship Game, Lincoln Financial Field, Philadelphia, Pa., Jan. 21, 2018: Wi-Fi: 8.76 TB
4. Kansas City Chiefs vs. New England Patriots, Gillette Stadium, Foxborough, Mass., Sept. 7, 2017: Wi-Fi: 8.08 TB
5. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
6. Southern California vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Oct. 21, 2017: Wi-Fi: 7.0 TB
7. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
8. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB
9. Georgia vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Sept. 9, 2017: Wi-Fi: 6.2 TB
10. Alabama vs. Texas A&M, Kyle Field, College Station, Texas, Oct. 17, 2015: Wi-Fi: 5.7 TB

NFL exec: U.S. Bank Stadium Wi-Fi network ‘in a strong place’ ahead of Super Bowl LII

A Wi-Fi handrail enclosure at U.S. Bank Stadium in Minneapolis. Credit: Paul Kapustka, MSR (click on any photo for a larger image)

Like many football fans, I was jaw-droppingly excited while watching the Minnesota Vikings’ dramatic walk-off touchdown win in last Sunday’s playoff game against the New Orleans Saints. Unlike many football fans, but probably much like our readership, my next thought while watching the celebrations was: I hope the Wi-Fi holds up!

According to a top NFL IT executive who was at the game, the Wi-Fi network at U.S. Bank Stadium was more than up to the load applied to it by the Vikings’ exciting win and victory celebration, a good stress test ahead of the stadium’s hosting of Super Bowl LII on Feb. 4. “There were an amazing amount of [Wi-Fi] connections” after the game’s end, said Aaron Amendolia, vice president of IT in the NFL’s office of the CIO, in a phone interview Thursday.

The “massive spike” in connectivity after the game’s exciting conclusion produced numerous social media posts from fans present, mainly on Facebook and Snapchat, Amendolia said. Though he didn’t have full networking statistics from the game, Amendolia did share one interesting number: there were approximately 37,000 unique connections to the Wi-Fi network during the game — more than at last year’s Super Bowl LI in Houston, where 35,430 fans out of 71,795 in attendance at NRG Stadium used the Wi-Fi at some point. Attendance at Sunday’s playoff game in Minneapolis was 66,612.

“I feel we’re in a strong place now” with the Wi-Fi network at U.S. Bank Stadium, Amendolia said. “We’re hoping to set some new records.”

Still no sign of bandwidth demand decline

Amendolia, part of the NFL’s networking team that ensures good connectivity at the league’s championship event, said testing work on the AmpThink-designed network (which uses Cisco Wi-Fi gear) started last year, and then ramped up through the current season.

Seen in the main concourse at U.S. Bank Stadium: Two IPTV screens, one Wi-Fi AP and a DAS antenna. Credit: Paul Kapustka, MSR

“Starting with the preseason [games] we had staff sitting in seats, doing Facebook, visiting websites,” said Amendolia. “The unique architecture in each stadium makes Wi-Fi [performance] unique. We had people sitting in odd corners, and next to big concrete structures.”

Ever since Wi-Fi became a part of Super Bowls, the total data used and the number of fans connecting have steadily increased each year, setting new records every time for single-day use of a large-venue network. At Super Bowl 49 in 2015, fans used 6.23 terabytes of data on the Wi-Fi network at the University of Phoenix Stadium in Glendale, Ariz.; the next year, it was 10.1 TB of Wi-Fi at Levi’s Stadium in Santa Clara, Calif.; and last year at NRG Stadium in Houston there was 11.8 TB of Wi-Fi data used. (Cellular data use on stadium DAS networks has also increased apace, from almost 16 TB at Super Bowl 50 to more than 25.8 TB last year.)

What’s interesting is that networking usage totals for games the following NFL season usually increase as well, not to Super Bowl levels but surpassing marks from years before. For this season’s opening game at the New England Patriots’ Gillette Stadium, the Wi-Fi network there saw 8.08 TB of data used, a mark that trails only the last two Super Bowls.

“Super Bowls set the benchmark for the next season,” said Amendolia, who agrees that there may never be an end to the growth.

“Even if [current] usage levels off, there’s new technology like augmented reality and wearable glasses,” Amendolia said. “How does that change the future?”

‘Super’ Wi-Fi and DAS at U.S. Bank Stadium ready for Super Bowl 52

A look at downtown Minneapolis from inside U.S. Bank Stadium. Credit all photos: Paul Kapustka, MSR (click on any photo for a larger image)

After Sunday’s stunning last-second victory, the Minnesota Vikings are one step closer to becoming the first team to play a Super Bowl in its own home stadium. Should the Vikings beat the Eagles in Philadelphia this weekend, Super Bowl 52 visitors should prepare for a true Norse experience inside U.S. Bank Stadium, with repeated blasts from the oversize “Gjallarhorn” and a fire-breathing dragon ship that will launch the home team onto the field. Skol!

But even if the hometown team falls short of making the big game this season, on Feb. 4, 2018 the stadium itself should do Minneapolis proud, especially when it comes to wireless connectivity. With two full regular seasons of football and numerous other events to test the networks’ capacity, both the Wi-Fi and DAS networks inside the 66,655-seat U.S. Bank Stadium appear more than ready to handle what is usually the highest single-day bandwidth stress test, namely the NFL’s yearly championship game. (Though the selfies and uploads following Sunday’s walk-off touchdown toss may have provided an early indicator of massive network use!)

In a mid-November visit to U.S. Bank Stadium for a Vikings home game against the Los Angeles Rams, Mobile Sports Report found robust coverage on both the Wi-Fi and cellular networks all around the inside of the stadium, with solid performance even amidst thick crowds of fans and even in the highest reaches of the seating bowl. Speedtests on the Wi-Fi network, built by AmpThink using Cisco gear, regularly hit marks of 40 to 50-plus Mbps in most areas, with one reading reaching 85 Mbps for download speeds.

And on the DAS side of things, Verizon Wireless, which built the neutral-host network inside U.S. Bank Stadium, said in December that it has already seen more cellular traffic on its network for a Vikings home game this season than it saw at NRG Stadium for Super Bowl LI last February. With 1,200 total antennas — approximately 300 of which were installed this past offseason — Verizon said it is ready to handle even double the traffic it saw at last year’s game, when it reported carrying 11 terabytes of data on stadium and surrounding macro networks.

Good connectivity right inside the doors

Editor’s note: This profile is from our most recent STADIUM TECH REPORT for Winter 2017-18, which is available for FREE DOWNLOAD from our site. This issue has an in-depth look at the wireless networks at U.S. Bank Stadium in Minneapolis, as well as profiles of network deployments at the brand-new Little Caesars Arena, the Las Vegas Convention Center, and Orlando City Stadium! DOWNLOAD YOUR FREE COPY today!

A new Verizon DAS antenna handrail enclosure (right) at U.S. Bank Stadium in Minneapolis. (The enclosure lower left is for Wi-Fi).

James Farstad, chief technology advisor for the Minnesota Sports Facilities Authority (MSFA), the entity that owns U.S. Bank Stadium, said he and his group are “very pleased” with the state of the wireless networks inside the venue heading toward its Super Bowl date.

“You’re never really satisfied, because you want it to be the best it can be,” said Farstad in an interview during our November visit to Minneapolis. “But generally speaking, we’re very pleased with the state of the networks.”

Those networks are tested the very moment the Vikings open the doors for home games, especially in warmer weather when the signature big glass doors — five of them, all 55 feet wide and ranging in height from 75 to 95 feet — swing out to welcome fans. As the entry that points toward downtown, the west gate can account for as much as 70 percent of the fans arriving, according to the Vikings, putting a big crush on the wireless networks in the doorway area.

To help keep people connected in crowded situations, Verizon deployed extra DAS antennas on short poles in front of both the west and east end zone concourse areas, part of a 48 percent increase in overall DAS antenna numbers added during the football offseason. Even with thick crowds streaming into the stadium, we still got a DAS speedtest of 77.35 Mbps download and 32.40 Mbps upload on the concourse just inside the west doors, and just below the Gjallarhorn.

Walking around the main level concourse, connectivity hardware is easy to see if you know what you’re looking for; part of the extensive DAS coverage includes dual antennas hanging off a single pole above wide walkway segments. In one instance, we saw a good example of aesthetic integration: a Wi-Fi AP tucked just behind two IPTV screens, a beacon attached to the side, and a DAS antenna mounted just above everything else.

First big test of railing-mounted Wi-Fi?

Moving into the seating bowl, visitors may not know that many of the Wi-Fi network’s 1,300 APs are hiding in plain sight — inside silver handrail enclosures, many of which now sport bright, bold section numbers to help fans find their seats. Believed to be the first big football-sized stadium to rely mainly on railing-mounted APs, U.S. Bank Stadium shows AmpThink’s proximate network design to be a winner in performance, producing regular-season game data totals of around 3 terabytes per event and, maybe more importantly, keeping an optimal number of fans attached to the AP closest to them for the speediest connection.

Top-down antennas provide coverage for suite seating

Sitting next to AmpThink president Bill Anderson in the stadium’s press box, you get a great view of the field, but it’s doubtful Anderson watches much football action, given that he spends most of a game day glued to a screen that shows live, detailed performance data for every Wi-Fi AP in the building. While the analytics program produces a wealth of interesting data, the one metric that keeps Anderson’s attention is the one showing how many fans are connected to each AP, a number that should be no more than 50, and ideally somewhere around 25, if the network is performing as it should.

On the day we visited, Anderson’s screen showed one AP with more than 200 devices trying to connect to it, an issue Anderson noted for immediate problem-solving. But with only a handful of others showing more than 50 connections, Anderson was confident that AmpThink has figured out how to solve the main dilemma for Wi-Fi in large enclosed structures, namely keeping APs from interfering with each other. The large clear-plastic roof and wall areas at U.S. Bank Stadium don’t help, since they reflect RF signals, adding to the degree of difficulty of the network design.
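Here is a minimal sketch of the kind of per-AP load check described above; the AP names and client counts are invented for illustration, and the thresholds simply mirror the 50-connection ceiling and roughly 25-connection target mentioned in the article, not AmpThink’s actual tooling.

```python
# Illustrative per-AP load check; AP names and client counts are made up.
# Thresholds mirror the figures in the article: ~25 clients is the ideal
# load per AP, and anything over 50 deserves immediate attention.

TARGET_CLIENTS = 25
MAX_CLIENTS = 50

ap_clients = {
    "bowl-rail-101-3": 22,
    "bowl-rail-224-7": 48,
    "concourse-west-12": 212,
}

for ap, clients in sorted(ap_clients.items(), key=lambda kv: -kv[1]):
    if clients > MAX_CLIENTS:
        print(f"FLAG  {ap}: {clients} clients (over {MAX_CLIENTS})")
    elif clients > TARGET_CLIENTS:
        print(f"watch {ap}: {clients} clients (above ~{TARGET_CLIENTS} target)")
    else:
        print(f"ok    {ap}: {clients} clients")
```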

But the railing-mount network design – which AmpThink duplicated at Notre Dame, whose new network has produced the highest data totals ever seen at collegiate events – seems to be fulfilling AmpThink’s goal of producing networks with steady AP loads and consistent, high-density throughput in extremely challenging environments. The railing-mounted APs provide connectivity that couldn’t be delivered by overhead antennas, as in Notre Dame’s open concrete bowl and in U.S. Bank Stadium’s similar wide-open seating area, where no overhead structure is within 300 feet of a seat.

Two DAS antennas hang from a pole above the main concourse

“I think we have a network strategy that produces good uniform performance” in venues like U.S. Bank Stadium, Anderson said. “It’s pretty darn exciting to have a formula that works.”

More antennas get DAS ready for big game

And even though Verizon knew the Super Bowl was coming to U.S. Bank Stadium when it built the neutral-host DAS for the 2016 opening, it came back this past offseason and added approximately 300 more antennas (mainly for its own use and not for the shared DAS), all to keep up with the unstoppable demand for mobile bandwidth from fans attending events.

Diana Scudder, executive director for network assurance at Verizon, said in a phone interview that “the consumer appetite [for wireless data] is insatiable,” especially at the NFL’s biggest game, where DAS use has grown at a fast clip the past few years. Scudder said these days Verizon pretty much plans to see double whatever the last Super Bowl saw for each following big game, and adds network capacity accordingly. Verizon’s numbers from the past three Super Bowls are a good guide, with the carrier reporting 4.1 TB used at Super Bowl 49, 7 TB at Super Bowl 50, and 11 TB at Super Bowl 51.
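As a sketch of the planning rule Scudder describes (size the next game for roughly double the previous one), here are the reported Verizon DAS totals and an illustrative projection; the projection is ours, not Verizon’s actual capacity plan.

```python
# Sketch of the "plan for double" rule described above. The historical
# figures are the Verizon DAS totals reported in the article; the
# projection for the next game is purely illustrative.

verizon_das_tb = {49: 4.1, 50: 7.0, 51: 11.0}  # TB used at each Super Bowl

bowls = sorted(verizon_das_tb)
for prev, curr in zip(bowls, bowls[1:]):
    growth = verizon_das_tb[curr] / verizon_das_tb[prev]
    print(f"Super Bowl {prev} -> {curr}: x{growth:.2f} growth")

planned_capacity = 2 * verizon_das_tb[bowls[-1]]
print(f"Capacity target for the next game under a 2x rule: ~{planned_capacity:.0f} TB")
```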

AmpThink’s handrail-mounted AP enclosures appear to have influenced part of Verizon’s DAS upgrade, as some of the new DAS mounts mimic the smaller silver Wi-Fi enclosures. Scudder did say that Verizon used contractors to assist with the new antenna enclosures and mounts, but did not cite AmpThink by name. Verizon also deployed some under-seat antenna enclosures for its upgrade, a tactic the company also used for Super Bowl 50 at Levi’s Stadium in Santa Clara, Calif.

Even up in the most nosebleed of seats (in U.S. Bank Stadium’s case, section 345, where the seats almost touch the roof in the southwest corner), we got a DAS speedtest on the Verizon network of 60.87 Mbps / 44.22 Mbps, most likely from antennas we could see mounted on ventilation pipes just above the seats, a bit toward the field. And hanging from the middle of U.S. Bank Stadium’s roof are a pair of Matsing Ball antennas, which point down to provide cellular service for media and photographers on the sidelines, as well as for floor seating at concerts and other events.

Ready to add more bandwidth on the fly

Largely unseen, and probably not appreciated until it’s needed, is the stadium’s backbone bandwidth, provided by sponsoring partner CenturyLink.

A Wi-Fi enclosure in section 345, near the stadium’s roof

Though some stadiums are touting 100 Gbps pipes coming in, the U.S. Bank Stadium setup makes the venue its own ISP, according to Farstad.

With six 10-Gbps pipes that are always active — and running on two separate network infrastructures for redundancy — the stadium can turn up its bandwidth on the fly, an ability that got a real test at the venue’s first public event.
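For a back-of-the-envelope sense of that headroom, here is a tiny sketch of the aggregate capacity; the even split across the two redundant infrastructures is our assumption for illustration, not a stated detail of the CenturyLink setup.

```python
# Rough capacity math for the always-on backbone described above.
# The even split across the two redundant infrastructures is an
# assumption for illustration, not a documented detail.

CIRCUITS = 6            # always-active 10 Gbps pipes
GBPS_PER_CIRCUIT = 10
REDUNDANT_PATHS = 2

aggregate_gbps = CIRCUITS * GBPS_PER_CIRCUIT
per_path_gbps = aggregate_gbps / REDUNDANT_PATHS

print(f"Aggregate always-on capacity: {aggregate_gbps} Gbps")
print(f"Per redundant path (even split assumed): {per_path_gbps:.0f} Gbps")
```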

According to Farstad, when U.S. Bank Stadium opened for the first time with a soccer game on Aug. 3, 2016, the stadium operators expected about 25,000 fans might show up for a clash between Chelsea and AC Milan. But a favorable newspaper article about the stadium led to more than 64,000 fans in the house, a surge that backed up the light-rail trains and saw the concession stands run out of food.

“We were watching the Wi-Fi system during the first break [in the soccer game] and it was coming down fast,” Farstad said. But thanks to the ability to increase capacity quickly — Farstad said the stadium was able to provision new bandwidth within 45 seconds, a task that in other situations could take weeks — the Wi-Fi survived the unexpected demand, proof that it should be able to handle whatever happens on Super Bowl Sunday.

“I think we can handle the Super Bowl traffic,” Farstad said.