Small company delivers big Wi-Fi for Minnesota United at Allianz Field

The standing section at Allianz Field for the opening game this spring. Credit: Minnesota United

Fans at the new Allianz Field in St. Paul are the beneficiaries of a big project by a small company to bring solid fan-facing Wi-Fi to the 19,400-seat home of Minnesota United FC of MLS.

The striking new $250 million facility, opened in April just off the highway that connects Minneapolis to St. Paul, is a looker from first sight, especially at night when the multi-colored lights in its curved outside shell are lit. Inside, the clean sight lines and close-to-the-pitch seating that seem a hallmark of every new soccer-specific facility are accompanied by something that’s not as easy to detect: a solid fan-facing Wi-Fi network with approximately 480 Cisco access points, in a professional deployment that wouldn’t seem out of place at a larger facility, like an NFL stadium.

Actually, the Wi-Fi network inside Allianz Field is somewhat more conspicuous than many other deployments, mainly because instead of hiding or camouflaging the APs, most have very visible branding, letting visitors know that the Wi-Fi is “powered by” Atomic Data.

Who is Atomic Data? Though perhaps better known for its data-center and enterprise managed-services business, the 215-person Minneapolis-based firm also has a developing track record in stadium technology deployments, including a role on the IT support team for the launch of U.S. Bank Stadium two years ago. In an undeniably unique arrangement, Atomic Data paid for and owns the network infrastructure at Allianz Field, providing fan-facing Wi-Fi as well as back-of-house connectivity as a managed service to the team and to internal venue vendors like concessionaires.

LOCAL PARTNER EARNS TEAM’S TRUST

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new Wi-Fi network at Chesapeake Energy Arena in Oklahoma City, and an in-depth research report on the new Wi-Fi 6 standard! DOWNLOAD YOUR FREE COPY now!

While most new stadium builds often look for network and technology firms with a bigger name or longer history, Atomic Data was well known to the Minnesota team, having been a sponsor even before the club moved up to MLS.

One of the Cisco Wi-Fi APs installed by Atomic Data inside the new Allianz Field in St. Paul. Credit: Paul Kapustka, MSR

Chris Wright, CEO of MNUFC, credited a longtime relationship with Atomic Data CEO Jim Wolford, whom Wright had known from his days with the NBA’s Timberwolves and the WNBA’s Lynx.

“They [Atomic Data] are a very strong local company and we knew of their work, including at U.S. Bank Stadium,” Wright said. “Jim has also been a huge advocate of the [soccer] club, even before they moved to MLS. Their history is solid, and they [Atomic Data] have an incredible reputation.”

As the team prepared to move into its under-construction home, Wright said, a high-capacity wireless network originally wasn’t in the cards.

“The original plan was not to have a robust Wi-Fi network,” Wright said, citing overall budget concerns. But when he was brought in as CEO, he was looking for a way to change direction toward a more digital-focused fan experience – and by expanding Atomic Data’s partnership, he said, the company and the team found a way to make it happen.

As described by both Wright and Atomic Data, the deal includes having Atomic Data pay for and own the Wi-Fi network components, and also to act as the complete IT outsourcer for the team, providing wired and wireless connectivity as a managed service.

“When you look at the demographic of our fans, they’re mostly millennials and we wanted to have robust connectivity to connect with them,” Wright said. “Over time we were able to negotiate a deal [with Atomic Data] to build what I think is the most capable Wi-Fi network ever for a soccer-specific venue. I think we’ve turned some heads.”

UNDER SEAT AND OUTSIDE THE DOORS

Just before the stadium hosted its first league game, Mobile Sports Report got a tour of the facility from Yagya Mahadevan, enterprise project manager for Atomic Data and sort of the live-in maestro for the network at Allianz Field. Mahadevan, who worked on the U.S. Bank Stadium network deployment before joining Atomic Data full-time, was clearly proud of the company’s deployment work, which fit in well with the sleek designs of the new facility.

An under-seat AP deployment at Allianz Field. Credit: Paul Kapustka, MSR

For the 250 APs in the main seating bowl, Atomic Data relied heavily on under-seat AP deployments, since many of the seats have no overhang. A mix of overhead APs covers the seating areas that do have structures overhead, and more APs – clearly noticeable, with some painted white to pop against black walls and vice versa – are mounted along concourse walkways as well as on the outside of the main entry gates. Since Allianz Field is a paperless-ticketing venue, Mahadevan said, Atomic Data paid special attention to the entry gates to make sure fans could connect to Wi-Fi to access their digital tickets.

Wright, who called Atomic Data’s devotion to service “second to none,” noted that before the first three games at the new stadium, Atomic Data had staff positioned in a ring around the outside of the field, making sure fans knew how to access their tickets via the team app and the Wi-Fi network.

“The lines to get in were really minimized, and that level of desire to deliver a high-end experience is just the way they think,” Wright said of Atomic Data.

According to Atomic Data, the network is backed by two redundant 10-Gbps backbone pipes (from CenturyLink and Consolidated Communications) and is also set up to provide secure Wi-Fi connectivity to the venue’s many independent retail and concession partners. Mahadevan also said the network has a number of spare cable drops already built in, in case more APs need to be added in the future. The stadium also has a cellular distributed antenna system (DAS) built by Mobilitie, but as of early this spring none of the carriers had yet deployed gear on it.

Even the chilly temperatures at the team’s April 13 home opener didn’t keep fans from trying out the new network, as Atomic Data said it saw 85 gigabytes of Wi-Fi data used that day, with 6,968 unique Wi-Fi device connections, a 35 percent take rate from the sellout 19,796 fans on hand. According to the Atomic Data figures, the stadium’s Wi-Fi network saw peak Wi-Fi bandwidth usage of 1.9 Gbps on that opening day; of the 85 GB Wi-Fi data total, download traffic was 38.7 GB and upload traffic was 46.3 GB.
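For readers who like to check the math, the opening-day figures hang together; below is a quick back-of-the-envelope verification sketch in Python, using only the Atomic Data numbers quoted above:

```python
# Opening-day Wi-Fi stats at Allianz Field, as reported by Atomic Data.
unique_devices = 6_968   # unique Wi-Fi device connections
attendance = 19_796      # sellout crowd
download_gb = 38.7       # download traffic, GB
upload_gb = 46.3         # upload traffic, GB

take_rate = unique_devices / attendance
total_gb = download_gb + upload_gb
upload_share = upload_gb / total_gb

print(f"Take rate: {take_rate:.1%}")           # ~35.2%, matching the reported 35 percent
print(f"Total Wi-Fi data: {total_gb:.1f} GB")  # 85.0 GB
print(f"Upload share: {upload_share:.1%}")     # uploads outweighed downloads
```

Note the upload-heavy split: more than half of the traffic was fans pushing data out of the venue rather than pulling it down.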

According to Wright, the stadium has already had several visits from representatives of other clubs, all interested in the networking technology. Wright’s advice to other clubs thinking about or building new stadiums: get on the horn with Atomic Data.

“I tell them if you’re from Austin or New England, you should be talking to Atomic,” Wright said. “They should try to replicate the relationship we have with them.”

Temporary courtside network helps set Final Four Wi-Fi records

A temporary under-seat Wi-Fi network helped bring connectivity to courtside seats at this year’s Final Four. Credit all photos: Paul Kapustka, MSR

One of the traditional characteristics of the Final Four is the yearly travel scramble of the fortunate fans and teams who have advanced to the championship weekend. Somehow, with only a week’s notice, plane flights, road trips and hotel rooms get scheduled and booked, leading to packed houses at college basketball’s biggest event.

On the stadium technology side, a similar last-minute fire drill happens just about every year as well, as the hosting venues reconfigure themselves to host basketball games inside cavernous buildings built mainly to hold football crowds. At this year’s NCAA Men’s Final Four at U.S. Bank Stadium in Minneapolis, the stadium tech team and partner AmpThink were able to quickly construct a temporary Wi-Fi network to cover the additional lower-bowl seating. The new capacity was part of a record-setting Wi-Fi performance at the venue, with single-day numbers surpassing those from Super Bowl 52, held in the same building the year before.

The Wi-Fi numbers, both staggering and sobering, especially to venues next in line for such bucket-list events, totaled 31.2 terabytes over the two days of game action, according to figures provided by the NCAA. For the semifinal games on Saturday, April 6, U.S. Bank Stadium’s Wi-Fi network saw 17.8 TB of traffic, topping the 16.31 TB used during Super Bowl 52 on Feb. 4, 2018. The Saturday semifinals also set an attendance record for the venue, with 72,711 on hand, topping the 67,612 in attendance for Super Bowl 52.

During the championship game on April 8, U.S. Bank Stadium saw an additional 13.4 TB of data used on the Wi-Fi network, giving the venue three of the top four single-day Wi-Fi numbers we’ve reported, with this year’s mark of 24.05 TB at Super Bowl 53 in Atlanta the only bigger number. Saturday’s connectivity at U.S. Bank Stadium, however, surpassed even the most recent Super Bowl, with 51,227 unique users on the network, a 70 percent take rate.

‘Like building an arena network inside a football stadium’

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new Wi-Fi network at Allianz Field in St. Paul, Minn., and an in-depth research report on the new Wi-Fi 6 standard! DOWNLOAD YOUR FREE COPY now!

Switches for the temporary network were deployed under the seat scaffolding.

There’s no doubt that the temporary network installed by AmpThink and the U.S. Bank Stadium IT team contributed a great deal to the final Wi-Fi totals, with 250 access points installed in the additional seats. Like other football venues transformed into basketball arenas, U.S. Bank Stadium had temporary seating installed on all four sides of the stadium, with risers stretching down over football seating and built behind both baskets. More seats were installed on the “floor” of the football field, right up to the elevated court set in the middle. The temporary APs, like the stadium’s existing ones, are from Cisco.

“There are a lot more moving parts to a Final Four than to a Super Bowl,” said David Kingsbury, director of IT for U.S. Bank Stadium, describing the difference in providing the networking and technical underpinnings for each event. While planning for the networks was obviously done far in advance, the actual buildout of the temporary Wi-Fi couldn’t even begin until the additional seating was in place, a task that finished just five days before the first game was played.

That’s when AmpThink deployed a staff of 12 workers to start connecting cables to APs and to switches, while also adding in another 700 wired network connections to the courtside areas for media internet and TV monitor connections. Like it does for every venue network it designs and deploys, AmpThink came to the stadium equipped with a wide assortment of lengths of pre-terminated cables, preparation that made the fast deployment possible.

“If we had to spin raw cable and terminate it on site, we never would have been able to finish in five days,” said AmpThink president Bill Anderson.

AmpThink’s previous experience in deploying such temporary networks under temporary seating — including at the previous year’s Final Four in San Antonio — taught the company that it would also need protection for under-seat switch deployments, to fend off the inevitable liquid spills from the seats above. That requirement was potentially even more necessary at U.S. Bank Stadium, since this year’s Final Four was the first to allow in-venue sales of alcoholic beverages.

Some temporary seats were deployed on top of existing lower bowl seats.

With some of the temporary seating installed over existing seating, there were 95 APs in the existing handrail-enclosure design that had to be turned off for the Final Four, according to Kingsbury. The 250 new APs added were all installed under the folding chairs, in enclosures that simply sat on the floor.

According to AmpThink’s Anderson, the company did learn one lesson at U.S. Bank Stadium: at future events it will need to secure the actual enclosures, since over the weekend curious fans opened a few of the boxes and one AP disappeared, perhaps as an interesting IT souvenir.

In San Antonio, AmpThink had zip-tied the enclosures to chairs, which meant extra labor to detach the devices during the post-event breakdown. While having no such measures at U.S. Bank Stadium made for a fast removal — AmpThink said it had removed all the temporary network elements just seven hours after the championship-game confetti had settled — for next year’s Final Four AmpThink plans to at least zip-tie the enclosures shut so that fans can’t attempt any ad hoc network administration.

More APs for back of house operations

Another difference between the Final Four and the Super Bowl is that four teams, not two, are in attendance for a full weekend, necessitating temporary “work rooms” adjacent to each school’s locker room area. The media work center for a Final Four is also typically larger than that of a Super Bowl, again because more cities and their attendant media outlets are on site with four teams involved.

A concourse speed test taken just after halftime of the final game.

“We had to cover a lot of places in the stadium that we don’t normally cover” with wireless and wired network access, Kingsbury said; an additional 30 APs were needed for the team rooms and the main media workspace, located in the back hallways on the field level of the stadium. An interesting note at U.S. Bank Stadium: the yards and yards of fabric used as curtains to cover the clear-plastic roofing and wall areas proved beneficial to Wi-Fi operations, since the fabric cut off some of the reflective interference caused by the ETFE surfaces.

According to Kingsbury the final count of active APs for the Final Four was 1,414, a number reached by adding in the temporary APs while deducting the ones taken offline. Not included in the official NCAA traffic numbers was an additional 3 TB of traffic seen during the free-admission Friday practice sessions, when 36,000 fans visited the stadium, with 9,000 joining the Wi-Fi network.

From the official stats, the peak concurrent user number from Final Four Saturday, 31,141, was also an overall record, beating Super Bowl 53’s mark of 30,605. (Super Bowl 53 had 70,081 fans in attendance for the Feb. 3 game between the New England Patriots and the Los Angeles Rams.) Monday’s championship game (won by Virginia, 85-77 in overtime over Texas Tech) produced big numbers of its own: 13.4 TB of total data used, 48,449 unique connections and 29,487 peak concurrent users, out of 72,062 in attendance. Monday’s game also produced a peak throughput of 11.2 Gbps just after the game ended.

None of those totals could have been reached without the temporary network, which AmpThink’s Anderson compared to “building a 10,000-seat arena network inside a football stadium.” Next stop for a temporary Wi-Fi network is Mercedes-Benz Stadium in Atlanta, where the 2020 Final Four awaits.

This is what your football stadium looks like with a championship basketball game inside of it.

The temporary center-hung scoreboard was able to play video programming onto the court surface.

The NBA on TBS crew was courtside for the Final Four.

The secret to keeping your network operations room running? All kinds of energy inputs.

New Report: Wi-Fi 6 research report, record Wi-Fi at the Final Four, and more!

MOBILE SPORTS REPORT is pleased to announce the Summer 2019 issue of our STADIUM TECH REPORT series, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace.

Our latest issue contains a research report on the new Wi-Fi 6 standard and what it means to stadium networks, as well as three separate profiles of Wi-Fi network deployments, including a look at how a temporary network helped fans use record data totals at the Final Four! Download your FREE copy today!

Inside the report our editorial coverage includes:

— A Wi-Fi 6 research report that looks into the new standard’s technology improvements that make it a great bet for in-venue networks;
— An in-person report from the NCAA Men’s 2019 Final Four at U.S. Bank Stadium, where the weekend saw a record 31+ terabytes of Wi-Fi data used;
— How Minnesota United’s new home, Allianz Field, got a big Wi-Fi network from a small company, Atomic Data;
— A look at the new Wi-Fi network at Chesapeake Energy Arena, home of the NBA’s Oklahoma City Thunder.

Download your free copy today!

We’d like to take a moment to thank our sponsors, which for this issue include Mobilitie, JMA Wireless, Corning, Boingo, MatSing, Cox Business/Hospitality Network, ExteNet, Neutral Connect Networks, Atomic Data, Oberon, and American Tower. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to welcome readers from the Inside Towers community, who may have found their way here via our ongoing partnership with that excellent publication. And we’d like to thank the SEAT community for your continued interest and support.

Broncos Stadium at Mile High sees 12.63 TB of Wi-Fi during Garth Brooks show

The Garth Brooks show in Denver on June 8 saw fans use a venue-best 12.63 terabytes of data on the Wi-Fi network at Broncos Stadium at Mile High, according to figures provided by the team’s IT department.

In what sounds like a great time for both attendees and the performer, there were 48,442 unique devices connected at some point during the night, according to figures sent our way by Russ Trainor, senior vice president of IT for the Broncos. That’s a take rate of about 58 percent, with some 84,000 fans in the stadium that night. Trainor also said that there was a peak concurrent connection total of 34,952 devices, and that the network saw a throughput peak of 17.65 Gbps, also a record for the venue.
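The take rate Trainor describes checks out against the raw numbers; a small sketch in Python, using only the figures quoted above:

```python
# Garth Brooks show at Broncos Stadium at Mile High,
# figures from the Broncos' IT department as reported here.
unique_devices = 48_442   # devices that connected at some point
attendance = 84_000       # approximate crowd that night
peak_concurrent = 34_952  # most devices connected at once

take_rate = unique_devices / attendance
concurrency_ratio = peak_concurrent / unique_devices

print(f"Take rate: {take_rate:.1%}")                 # ~57.7%, the "about 58 percent" in the text
print(f"Peak concurrency: {concurrency_ratio:.1%}")  # ~72% of unique devices online at the peak
```

The concurrency ratio is a useful planning number for network designers: at the busiest moment, roughly seven in ten of the devices that ever joined the network were on it simultaneously.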

The previous top Wi-Fi event at the Broncos’ home was a Taylor Swift concert last year, when the Wi-Fi network saw just over 8 TB of traffic. The continued improvements to the venue’s Wi-Fi network appear to be paying off in its ability to carry more traffic.

THE MSR TOP 21 FOR WI-FI

1. Super Bowl 53, Mercedes-Benz Stadium, Atlanta, Ga., Feb. 3, 2019: Wi-Fi: 24.05 TB
2. NCAA Men’s 2019 Final Four semifinals, U.S. Bank Stadium, Minneapolis, Minn., April 6, 2019: Wi-Fi: 17.8 TB
3. Super Bowl 52, U.S. Bank Stadium, Minneapolis, Minn., Feb. 4, 2018: Wi-Fi: 16.31 TB
4. NCAA Men’s 2019 Final Four championship, U.S. Bank Stadium, Minneapolis, Minn., April 8, 2019: Wi-Fi: 13.4 TB
5. Garth Brooks Tour, Broncos Stadium at Mile High, June 8, 2019: Wi-Fi: 12.63 TB
6. 2018 College Football Playoff Championship, Alabama vs. Georgia, Mercedes-Benz Stadium, Atlanta, Ga., Jan. 8, 2018: Wi-Fi: 12.0 TB*
7. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
8. Atlanta Falcons vs. Philadelphia Eagles, Lincoln Financial Field, Philadelphia, Pa., Sept. 6, 2018: Wi-Fi: 10.86 TB
9. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
10. Taylor Swift Reputation Tour, Gillette Stadium, Foxborough, Mass., July 27, 2018: Wi-Fi: 9.76 TB
11. Minnesota Vikings vs. Philadelphia Eagles, NFC Championship Game, Lincoln Financial Field, Philadelphia, Pa., Jan. 21, 2018: Wi-Fi: 8.76 TB
12. Jacksonville Jaguars vs. New England Patriots, AFC Championship Game, Gillette Stadium, Foxborough, Mass., Jan. 21, 2018: Wi-Fi: 8.53 TB
13. Taylor Swift Reputation Tour, Broncos Stadium at Mile High, May 25, 2018: Wi-Fi: 8.1 TB
14. Kansas City Chiefs vs. New England Patriots, Gillette Stadium, Foxborough, Mass., Sept. 7, 2017: Wi-Fi: 8.08 TB
15. SEC Championship Game, Alabama vs. Georgia, Mercedes-Benz Stadium, Atlanta, Ga., Dec. 1, 2018: Wi-Fi: 8.06 TB*
16. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
17. Stanford vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Sept. 29, 2018: Wi-Fi: 7.19 TB
18. (tie) Southern California vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Oct. 21, 2017: Wi-Fi: 7.0 TB
Arkansas State vs. Nebraska, Memorial Stadium, Lincoln, Neb., Sept. 2, 2017: Wi-Fi: 7.0 TB
19. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
20. Wisconsin vs. Nebraska, Memorial Stadium, Lincoln, Neb., Oct. 7, 2017: Wi-Fi: 6.3 TB
21. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB

* = pending official exact data

Federated Wireless completes ESC network for CBRS

One of the coastal sensors deployed in Federated Wireless’ ESC network. Credit: Federated Wireless

Federated Wireless announced Monday the completion of its environmental sensing capability (ESC) network, in what may be one of the final stepping stones toward commercial deployments of networks in the CBRS band.

Under the unique shared-spectrum licensing structure of the CBRS (Citizens Broadband Radio Service) band, a 150-MHz swath in the 3.5 GHz range, an ESC network must be in place to sense when U.S. Navy ships are using the band. Monday’s announcement means Federated’s ESC network is ready to go, one of the last requirements before commercial customers of Federated’s products and services can formally start operating their networks.

Though the Federated ESC network is still pending final FCC approval, Federated president and CEO Iyad Tarazi said in a phone interview that the company “expects to get the green light [from the FCC] in June,” with the commercial customer launches following soon behind. Federated, a pure-CBRS startup with $75 million in funding, also offers Spectrum Access Services (SAS), another part of the CBRS puzzle to help ensure that any network operators who want to play in the shared-space sandbox that is CBRS are only using spectrum chunks that are free of any higher-priority traffic.

According to Tarazi, Federated already has 25 customers testing its gear and services as they prepare to launch CBRS networks, a yet-unnamed group of entities that Tarazi said includes wireless carriers, enterprise companies looking to launch private networks, and even some large public venues.

Private networks first for venues?

The early thinking on CBRS use cases for sports stadiums includes the possibility of using private LTE networks for sensitive internal operations like ticketing and concessions, or even for closed-system video streaming and push-to-talk voice support. Longer term, CBRS has been touted as a potential way to provide a neutral-host network that could support fan-facing carrier offload much like a current distributed antenna system (DAS), but getting there will likely require more-advanced SIM technology to be developed and deployed in client devices like cellphones.

But the potential of a new, huge chunk of spectrum — and the possibility of teams, leagues and venues being able to own and operate their own networks — has created a wide range of interest in CBRS among sports operations. While many of those same entities already operate stadium Wi-Fi networks, CBRS’s support for the cellular LTE standard theoretically could support faster, more secure networks. However, the emerging Wi-Fi 6 standard may close the performance gap between cellular and Wi-Fi; many networking observers now seem to agree that most venues will see a continued mix of Wi-Fi and cellular systems in the near future, possibly including CBRS.

Already, the PGA and NASCAR have live tests of CBRS networks underway, and the NFL and Verizon have kicked the ball around with CBRS tests, reportedly for possible sideline headset network use.

While CBRS will get more interesting when the commercial deployments become public, network geeks can already appreciate some of the work Federated did to get its ESC network operational, starting with the deployment of sensors on coastal structures as varied as “biker bars and luxe beach resorts,” according to a Federated blog post.

Tarazi, who was most recently vice president of network development at Sprint, said the Federated ESC network is “triple redundant,” since losing just one sensor could render a big chunk of spectrum unusable.

“If you lose a sensor, you lose hundreds of square miles of [available] network,” Tarazi said. “That’s a big deal.”

And ensuring network availability is part of what Federated’s clients will be paying the company for, one piece of a puzzle that, when put together, should theoretically open up wireless spectrum at a much lower cost than purchasing licensed spectrum at auction. As one of the pick-and-shovel providers in the CBRS gold rush, Federated may be the only ESC game in town for a while, since the joint effort between CommScope and Google to build another ESC is not expected to be completed until later this year at the earliest.

“I feel like we’re at an inflection point now,” Tarazi said. “It feels good to be leading this wave.”

AT&T sees 2.5 TB of DAS traffic at men’s Final Four championship game

The concourses at U.S. Bank Stadium were well covered by DAS and Wi-Fi antennas for the recent Final Four. Credit: Paul Kapustka, MSR

In addition to the big Wi-Fi numbers seen at the NCAA men’s 2019 basketball championship game, AT&T said it saw 2.5 terabytes of data used by its customers on its DAS network at U.S. Bank Stadium in Minneapolis for the final game of the men’s Final Four weekend.

The neutral-host DAS in U.S. Bank Stadium, which is operated by Verizon, tested strong during MSR’s visit to the Final Four; on a Verizon phone we saw 37.5 Mbps on the download and 45.0 Mbps on the upload during the championship game. Verizon, however, declined to provide any data totals from the Final Four.

In addition to its championship game numbers, AT&T said it saw 44.6 TB of data used on its networks in and around U.S. Bank Stadium for the entire men’s Final Four weekend.

Women’s Final Four sees 1.1 TB of DAS

At the NCAA women’s Final Four weekend in Tampa, Fla., AT&T said it saw a total of 1.1 TB of traffic used by its customers on the new MatSing Ball-powered DAS at Amalie Arena. That number includes traffic from both semifinal games as well as the championship game on April 7.