Oklahoma leads the way with Wi-Fi 6 network at football stadium

An AmpThink handrail enclosure for Wi-Fi APs at Oklahoma. Credit all photos: Paul Kapustka, MSR (click on any picture for a larger image)

In the long history of college football, the University of Oklahoma is a name that always seems to be in the discussion when it comes to top teams and Heisman-quality talent. Now you can add stadium Wi-Fi to the list of things Oklahoma does well, after a 100 percent Wi-Fi 6 network at Gaylord Family-Oklahoma Memorial Stadium was in place for most of the recent football season.

Formerly among the most conspicuous Wi-Fi have-nots among big-school stadiums, the Sooners have now moved to the front of the class with a network of approximately 1,350 access points in their 80,126-seat stadium, all new models that support the emerging Wi-Fi 6 standard, also known as 802.11ax. The deployment was led by AT&T, using gear from Aruba, a Hewlett Packard Enterprise company, with design and deployment by AmpThink, using mainly handrail-mounted enclosures in the main bowl seating areas. OU fans now have the ability to connect wirelessly at the most advanced levels, with a technology base that will support even better performance as the balance of attendee handsets catches up to the network with support for Wi-Fi 6.

“We’re very excited” about the new network, said David Payne, senior technology strategist for athletics at the University of Oklahoma’s information technology department. Payne, who has been at Oklahoma since 2003, has spent the last several years shepherding the overall stadium Wi-Fi plan into place, starting with Wi-Fi coverage for the stadium RV parking lots, then making an initial foray into in-stadium Wi-Fi when Oklahoma renovated the south part of the stadium three years ago. But this past offseason brought the big push to full stadium coverage, a trek that included a switch in equipment vendors prompted by Oklahoma’s solid commitment to the emerging Wi-Fi 6 standard.

Committed to Wi-Fi 6 for the future

Editor’s note: This profile is from our latest STADIUM TECH REPORT, which is available to read instantly online or as a free PDF download! Inside the issue are profiles of the new Wi-Fi and DAS networks at Chase Center, as well as profiles of wireless deployments at Fiserv Forum and the University of Florida! Start reading the issue now online or download a free copy!

A water-sealed connection for the bottom of a handrail enclosure.

If there was a tricky time to pull the trigger on Wi-Fi 6, it was last summer, when not every vendor in the market could ensure it would have enough gear on hand to fully supply a big stadium like Oklahoma’s. And even though Wi-Fi 6 gear is new and generally more expensive than previous versions, for Payne and Oklahoma the long-term benefits, combined with how rarely a school gets the chance to refresh something as significant as a football stadium network, made committing to Wi-Fi 6 somewhat of a no-brainer.

Payne, like many other big-school IT leaders, has spent years helping administrators and others at budget-deciding levels of leadership at his school understand the benefits of stadium-wide Wi-Fi connectivity. For many of those years, it just didn’t make sense to try to push through the multi-million-dollar expense of a project “that would only be used six or seven Saturdays a year,” Payne said. “There’s always a difficulty in telling the story of what value you receive in this since it’s different from traditional revenue streams. There isn’t a direct dollar seen from Wi-Fi users.”

But with the late-2018 approval of a capital expenditure project to revamp the football stadium’s lower-bowl seating with new handrails, wider seats and other ADA-related improvements, Payne and the IT team were able to weave in the extra $3 million (out of a total project cost of $14.9 million) it would cost to bring full Wi-Fi coverage to the entire stadium.

“It’s just taking advantage of the timing to get economies of scale,” said Payne. Because of the already-planned work on the handrails, Oklahoma was able to add the AmpThink-designed handrail Wi-Fi enclosures (which use the handrail pipes to carry cabling) for a fraction of the cost of doing that work as a separate project, Payne said. The university had also installed new backbone gear and cabling during the south end zone renovation, so that cost was already paid for.

The decision to commit to Wi-Fi 6, Payne said, was based on standard release projections from manufacturers. “We paid close attention to projected order availability and ship dates,” Payne said. “We felt that if we were able to receive the gear by June, we could complete the project on time.”

Though some manufacturers were not sure of being able to fully deliver Wi-Fi 6 gear, Aruba, Payne said, had “high confidence” in meeting the deadlines, and won the deal. According to Payne, all the Aruba gear was shipped in time to begin construction in June.

A handrail enclosure in the lower bowl

“It’s important for us to get the full life cycle of technology, so that’s why we decided to go 100 percent Wi-Fi 6,” Payne said.

Attention to detail an AmpThink hallmark

On a visit before and during a home game against Texas Tech in late September 2019, Mobile Sports Report was able to test the live network in all parts of the stadium, with strong performance at even the highest seating levels as well as in sometimes overlooked spots like the long ramps that fans walk up to get in and out of the venue.

The Oklahoma deployment was part of a very busy summer for AmpThink, which handled similar Wi-Fi designs and deployments at Ohio State and Arkansas. Like those two, Oklahoma’s main bowl AP deployment used the patented AmpThink handrail enclosures, each stamped with the distinctive “OU” logo.

The handrail deployment system, which typically includes a core drill through the concrete floor to bring wiring into the handrail tubing, is now a standard process for AmpThink, following similar deployments at the Minnesota Vikings’ U.S. Bank Stadium and at Notre Dame Stadium, among others. At Oklahoma, AmpThink said it used 10 different handrail enclosure designs to fit all the necessary spaces.

AmpThink president Bill Anderson was present during our visit and took great pride in showing off some of the finer points of an AmpThink deployment, including a method of using a metal sleeve and some clever waterproof paint and sealant to ensure that no moisture finds its way into the holes used for cable delivery.

“We spend a tremendous amount of time [during deployments] making sure there isn’t any water leakage under the stands,” Anderson said. “Because you never know what is going to be below. This is a big part of what we do. We don’t just sell an enclosure.”

Concourse APs visible high on concrete posts

The same can be said of AmpThink’s overall network designs, which it monitors, tests and tweaks as fans use the system. On the game day we visited, no fewer than four AmpThink employees were at the stadium in the network control room, checking AP performance and network usage.

“We’re pretty proud of what we can do,” Anderson said about the company’s track record for network design in large venues. “We have proven formulas which we reliably implement.”

Solid speed tests throughout the venue

At 10:20 a.m. local time, just ahead of the early 11 a.m. kickoff, Mobile Sports Report started our testing inside the main-level concourse, where fans were already lining up to purchase cold beer, another first at the stadium this past season. In the midst of the entering crowds we got a speedtest of 55.9 Mbps on the download side and 43.7 Mbps on the upload side, an inkling of the strong tests we were to see everywhere we walked. In the concourses and near concession stands, a mix of overhead and wall-mounted APs provided coverage.

Up in the stands, we took our first test among the railing-mounted enclosures in section 6, row 51, just about at the 50-yard line. We got a mark of 68.2 Mbps / 58.7 Mbps before the stands were completely full. We then hiked up to row 67, which was underneath the press box overhang and served by overhead APs, not railing enclosures. There we got a speedtest of 27.8 Mbps / 49.5 Mbps, a half hour before kickoff.

One more speedtest in the lower bowl (around the 30-yard line, in row 19) netted a mark of 68.9 Mbps / 61.2 Mbps; then as we walked around to the south end zone, we got a mark of 38.7 Mbps / 64.3 Mbps in the south concourse, busy with fans getting food and drink ahead of the imminent kickoff.

The recently renovated south end of the stadium has a series of loge boxes and other premium seating options, and has an overhang which provides additional real estate for Wi-Fi AP mounting options. Ducking into a loge box (covered by overhead APs) for a quick test we got a mark of 36.8 Mbps / 54.2 Mbps just before kickoff. Moving around to the corner of the south stands for the pregame ceremonies we got a mark of 33.7 Mbps / 63.8 Mbps even as all the phones were out to capture the team run-on and school song rendition. After kickoff, we went into the crowded main east concourse and got a mark of 43.2 Mbps / 46.6 Mbps amidst all the late-arrivers.

Good coverage in the stairwells

Wi-Fi antennas in an overhang deployment

If there is one area where stadiums sometimes skimp on wireless coverage it’s in the stairwells and pedestrian ramps, which may not seem like an important place to have connectivity. But at Oklahoma, the multiple switchbacks it takes to climb from ground level to the top seating areas are all well covered with Wi-Fi, as we got a mark of 39.9 Mbps / 29.5 Mbps during a brief rest stop on our hike to the top of the east stands.

At a concession stand on the top-level concourse we got a mark of 61.3 Mbps / 70.5 Mbps, as we admired the neatness of the core drilling we could see that got the cabling to the underside of the seating areas above. In the stands we got a mark of 57.5 Mbps / 69.5 Mbps at one of the highest rows in the stadium, row 24 of section 226, a half hour after the game’s start.
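For a rough overall picture, the eleven Oklahoma spot tests quoted above can be averaged in a few lines of Python. This is purely an illustrative back-of-envelope calculation on the numbers reported in this story, not an official network statistic:

```python
# Illustrative only: averaging the spot speedtests quoted in this story.
# Figures in Mbps, as (download, upload) pairs, in the order reported.
tests = [
    (55.9, 43.7), (68.2, 58.7), (27.8, 49.5), (68.9, 61.2),
    (38.7, 64.3), (36.8, 54.2), (33.7, 63.8), (43.2, 46.6),
    (39.9, 29.5), (61.3, 70.5), (57.5, 69.5),
]
avg_down = sum(d for d, _ in tests) / len(tests)
avg_up = sum(u for _, u in tests) / len(tests)
print(f"average: {avg_down:.1f} Mbps down / {avg_up:.1f} Mbps up")
```

The averages land near 48 Mbps down and 56 Mbps up, comfortably above what most fans need for streaming video or social sharing.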

According to Payne, our visit coincided with the first live game with the Wi-Fi 6 software fully turned on, part of a rolling start to the network deployment, which wasn’t fully live at the first game on Aug. 31.

“It wasn’t without some hiccups and headaches,” said Payne of the overall deployment, which included a small number of temporary black handrail enclosures from AmpThink after the company’s single source of handrail molding material ran out of supply late in the summer. According to Payne, Oklahoma started the season with 966 radios working on the network, adding more at each home game until reaching full capacity later in the season. By the time of our visit, AmpThink had also replaced the black enclosures with its standard silver ones.

Oklahoma also experienced what other venues deploying Wi-Fi 6 may find – that some of the very oldest devices still in use may have issues connecting to the Wi-Fi 6 equipment. Payne said one such problem surfaced in the press box (where reporters were using older laptops), but it was solved by creating some virtual APs tuned to an older version of the Wi-Fi standard.

Oklahoma fans during pregame ceremonies

OU also didn’t widely promote the network early in the season, but by the Oct. 19 home game with West Virginia not only was the school promoting the network on the stadium’s big video boards, the IT team also added the ability for students to automatically join the stadium network via their regular WiFi@OU SSID used around campus.

With 82,620 in attendance for the West Virginia game the total number of Wi-Fi users took a big jump from the previous high, with 25,079 unique connections, according to numbers provided by Payne. When Iowa State came to Norman on Nov. 9, the network saw its highest usage with 32,673 unique users, who used approximately 4.2 terabytes of data while in the stadium.
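Those usage totals also allow a quick per-fan estimate. The division below is ours, not the school's, and assumes decimal units (1 TB = 10^12 bytes):

```python
# Back-of-envelope math on the Iowa State game numbers reported above.
# Assumes decimal units: 1 TB = 1e12 bytes, 1 MB = 1e6 bytes.
total_bytes = 4.2e12       # ~4.2 TB of Wi-Fi data used in the stadium
unique_users = 32_673      # unique Wi-Fi connections reported
avg_mb_per_user = total_bytes / unique_users / 1e6
print(f"{avg_mb_per_user:.1f} MB per connected user")
```

That works out to roughly 128 MB per connected user over the course of the game.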

What was also interesting to Payne was the number of devices connected using the Wi-Fi 6 standard, which currently is only supported by a small number of phones. Payne noted that the first week OU had the Wi-Fi 6 working in the stadium was the same week Apple started delivery of its new iPhone 11 line, which includes support for the new Wi-Fi 6 standard. After seeing 941 devices connect on Wi-Fi 6 at the Texas Tech game, Payne said Oklahoma saw a steady increase of Wi-Fi 6 devices at each following home game, with 1,471 at the West Virginia game and 2,170 at the Iowa State game.
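The game-over-game trend in those counts is easy to quantify; the percentages below are our own illustrative calculation on the figures Payne provided:

```python
# Growth of Wi-Fi 6 (802.11ax) client counts across the home games
# cited above -- an illustrative trend calculation, not official analysis.
ax_clients = {"Texas Tech": 941, "West Virginia": 1471, "Iowa State": 2170}
games = list(ax_clients.values())
for prev, curr in zip(games, games[1:]):
    print(f"{prev} -> {curr}: +{(curr - prev) / prev:.0%}")
```

That is roughly a 50 percent jump per home game, consistent with new Wi-Fi 6 handsets steadily arriving in fans' hands.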

Is AX coming ‘sooner’… rather than later?

Though most consumer handsets being used today do not support the Wi-Fi 6 standard, Apple’s decision to include Wi-Fi 6 support in its latest iPhone 11 line, as well as Wi-Fi 6 support from other new Android phone models, suggests that device support for the standard may be coming sooner, rather than later, to the fans in the stands. When that happens and the Wi-Fi 6 network starts utilizing its new capabilities, Oklahoma’s network will be among the first to make use of the new standard’s ability to support more clients at higher connection speeds, critical features for big networks in small places like football stadiums.

The not-insignificant number of AX devices already seen by the stadium network, Payne said, felt like good justification for the school’s decision to commit to Wi-Fi 6. Later analysis of the network also caught Payne’s eye: Wi-Fi 6 clients used nearly 10 times as much data per client as older Wi-Fi 5 devices.

Looking ahead to next season, Payne said he will be working with school network officials to see how to more closely tie the stadium network with the overall campus wireless infrastructure, and to see how the school might be able to incorporate a stadium app or web-based sites to increase the ability of the network to improve the fan experience. Currently Oklahoma uses a portal from AmpThink to get email addresses from network guests, which Payne said will be used by marketing and ticketing departments to try to increase engagement.

The good news, Payne said, is that “we are no longer looking at what it costs to put a network in place” to drive any new digital experience ideas.

For Oklahoma athletics director Joe Castiglione, it was important for the school to deliver an amenity that provided a consistent fan experience whether a fan was in a suite or in the upper deck, a goal our tests seem to have validated.

“We feel that the Oklahoma tradition is among the strongest in the nation and really want to provide a top-notch fan experience to celebrate that tradition,” Castiglione said. “Wi-Fi is just the beginning of enhancing that experience. We hope to be able to use it to engage our fans through in-venue activations and experiences that would not be available without the addition of Wi-Fi.”

The scoreboard touts the new Wi-Fi network (credit this photo: University of Oklahoma)

A panoramic view of the stadium


Wi-Fi enclosure above a concessions stand

‘Best of Breed’ wireless drives Chase Center experience

An under-seat Wi-Fi AP enclosure at Chase Center, foreground, with a DAS enclosure visible to the left. Credit all photos (except where otherwise noted): Paul Kapustka, MSR (click on any picture for a larger image)

As stunning as Chase Center is visually, what you can’t see is equally powerful in adding to the fan experience. Namely, the wireless networks, and the gear that supports the connectivity.

Inside the shiny new home of the NBA’s Golden State Warriors, which sits on the edge of the San Francisco Bay, is a cellular DAS deployment from Verizon using Corning gear that may be the new forward-thinking model for cellular infrastructure for large public venues like stadiums and arenas. The 18,000-seat arena also has a Wi-Fi network using gear from Aruba, a Hewlett Packard Enterprise company, which supports the emerging Wi-Fi 6 standard for communications inside the main seating bowl.

But if you’re attending a Warriors game, or one of the many concerts scheduled at Chase Center, you may not ever see the equipment that brings the world-class connectivity to the fans. Both the DAS and the Wi-Fi networks utilize an under-seat antenna deployment method, just part of an aesthetic plan that does its best to minimize the visual impact of antennas and other wireless gear. Even deeper into the building is all the optical fiber supporting the networks, with capacity for future needs already in place.

During a mid-October 2019 visit before all the networks were fully tuned, Mobile Sports Report still got strong test results from both Wi-Fi and DAS networks in most areas in and around the arena, clear confirmation that the Warriors’ goal of having excellent wireless connectivity at their new home was right on track. And with the Corning ONE system behind a DAS design built from the ground up with future needs in mind, as well as the expected capacity gains coming from Wi-Fi 6, the Warriors and their partners are confident they’ve built a wireless system worthy of their world-class venue goals.

“We feel extremely proud” of the venue’s wireless systems, said Brian Fulmer, director of information technology for the Golden State Warriors. Though the inevitable construction delays led to some late nights leading up to the arena’s Sept. 6, 2019 public debut, according to Fulmer all wireless systems were fully online for the opening Metallica concert, where the arena saw 2.58 terabytes of data used on the Wi-Fi network, with another 2.69 TB used at a second Metallica show a couple days later.

“It was a race to the finish line but we did it, and the performance speaks for itself,” said Fulmer.

Searching for ‘Best in Breed’

Editor’s note: This profile is from our latest STADIUM TECH REPORT, which is available to read instantly online or as a free PDF download! Inside the issue are profiles of the new Wi-Fi deployment at the University of Oklahoma, as well as profiles of wireless deployments at Fiserv Forum and the University of Florida! Start reading the issue now online or download a free copy!

If there was ever a chance to build the best-ever new arena, Chase Center was probably a once-in-a-lifetime opportunity. When you combine the championship run of the team on the court with a devoted fan base centered in one of the hottest economic markets ever, you have the liberty to search for quality instead of bargains on every level.

A Wi-Fi AP hovers over a concourse gathering area.

(Case in point: The Warriors were able to sell out of their new court-level luxury suites, which have rooms just under the stands that include private wine lockers and can cost up to $2 million per year. Clearly, this is a model that may not work in places that aren’t Silicon Valley.)

For the privately financed $1.4 billion building, the Warriors turned to consulting firm Accenture to help determine the “best in breed” technology partners, especially on the wireless front. Several Warriors executives interviewed for this story all agreed on one main point: The team was not trying to install any technology to win imaginary awards for being the best or fastest building out there. Instead, it was all about how technology, especially wireless, could help bring about a world-class experience during every visit.

“Nobody shows up [at an arena] just looking for fast wireless speeds,” said Mike Kitts, the Warriors’ senior vice president for partnerships. “They want to interact. We wanted to create unforgettable experiences in an engaging environment. With the end in mind of a world-class experience, we knew great technology would absolutely play a role.”

Like a team drafting top players, the Warriors ended up choosing Verizon to lead the distributed antenna system (DAS) for cellular wireless, and Aruba for Wi-Fi. To build its neutral-host system, Verizon chose Corning and the Corning ONE platform, with an installation led by Communication Technology Services (CTS).

“We certainly leveraged the expertise of Verizon, as well as AT&T (which is also on the DAS as a client),” said Fulmer. “They’ve done this countless times, and they have the lessons learned of painful experiences.”

Building a DAS that can handle growth

Anyone in the stadium business in Northern California doesn’t have to look too far or remember too long ago to recall one such example of the pain that the nonstop growth in cellular demand can cause. After the San Francisco 49ers’ brand-new home, Levi’s Stadium, opened in 2014, the also brand-new DAS had to be upgraded the very next season to ensure it had enough capacity for the upcoming Super Bowl 50. Verizon, which basically invented under-seat DAS antennas for that deployment, said it had a goal at Chase Center to build a DAS that didn’t need upgrading for at least a few years.

A Wi-Fi AP painted to blend into the outside facade.

Terry Vance, senior manager for Verizon’s Pacific market network performance group, said “the plan from day 1 was to build a DAS with capacity for today and tomorrow. We needed to build this DAS so that for the next 3 to 4 years, we won’t have to touch it.”

Verizon also had to build the DAS in a way that complied with the Warriors’ stringent requirements for clear sight lines, especially in the main bowl seating area. According to the Warriors’ Fulmer, the team “looked at handrail [enclosure] designs,” but rejected them in favor of an under-seat approach. Though more costly in both equipment and construction, the under-seat approach was Verizon’s favored method as well to get more density in the arena.

What Verizon ended up with was a design that currently uses 71 active sectors, with 42 of those in the seating bowl. According to Vance, all the sectors in the bowl area can basically be split into two parts if needed, for a total of 84 potential bowl sectors. Currently, Vance said there are 598 under-seat DAS antennas in use.

According to Vance the Corning ONE system’s extensive use of optical fiber makes it easier to add capacity to the system as needed.

“The fiber to the edge [in the Corning system] is especially useful as you go to 5G,” Vance said. Though it’s not part of the shared DAS system, Verizon also has full 5G bowl coverage at Chase Center, one of the first arena deployments in California. Verizon also is using a couple of MatSing ball antennas, mounted in the rafters to provide cellular coverage to the floor area for concerts and other non-basketball events.

Right now AT&T is the only other carrier on the DAS, with participation from T-Mobile and/or Sprint pending the outcome of those two companies’ proposed merger.

A Verizon 5G speedtest. Credit: Verizon

Jessica Koch, sports and entertainment director of business development for Corning Optical Communications, gave praise to integrator CTS for its deployment know-how, which she said was “critical to the success of this project.” Corning, Koch said, knows that for fans in large venues like Chase Center, “reliable connectivity without restriction – all the time, at full speed, on any device, from anywhere – has become the expectation in our connected world.”

For Warriors president and COO Rick Welts, the best wireless system is one fans don’t see or worry about, but just use without concern.

“The best thing is if the phone just works, and I don’t have to think about it,” said Welts, who led a stadium tour during MSR’s October visit.

Though Verizon said the system went through some necessary optimization during the hectic early events schedule at Chase Center, Verizon engineers in December were getting DAS download speeds in excess of 100 Mbps in most locations, according to Philip French, vice president of network engineering for Verizon. Download speeds for 5G connections, he said, are breaking the 1 Gbps mark.

“This DAS is unique since it was the first one we’ve built with 5G in mind from the ground up,” French said. “It’s a very robust design, and for us this is the design of the future.”

Leading the way with Wi-Fi 6

Like several other stadiums that were being finished this past summer, Chase Center was able to take advantage of the release of Wi-Fi equipment that supports the emerging Wi-Fi 6 standard. Though all the new capabilities won’t be fully realized until most end-user devices also support the new version of Wi-Fi, having support for the technology inside the arena was key for the Warriors’ plans.

“You can never really be ‘future proofed’ but we were extremely fortunate with the timing [of Wi-Fi 6 gear arriving],” said the Warriors’ Fulmer. “We were right in the sweet spot for an initial deployment.”

Wi-Fi and DAS gear on the catwalk.

According to Aruba, Chase Center has approximately 250 Aruba 500 Series APs (which support Wi-Fi 6) deployed in the main seating bowl, mostly in under-seat enclosures. Overall, there are approximately 852 total APs used in the full Chase Center network, which includes coverage inside the building as well as in the connected outdoor plaza areas.

During our October visit, MSR got Wi-Fi speedtests of 27.3 Mbps on the download side and 18.2 Mbps on the upload side while standing outside the east entry doors near the big mirror balls that are selfie central for fans visiting the new arena. Inside the doors, our speedtest in the lobby got a mark of 55.8 Mbps / 68.6 Mbps.

On one upper concourse area, near several concession stands outside portal 57, we got a speedtest of 10.5 Mbps / 11.2 Mbps. In the seats in upper section 220 just before tipoff we got a mark of 46.0 Mbps / 28.0 Mbps, and in a lower-bowl concourse area outside portal 9 we got a test mark of 53.7 Mbps / 71.5 Mbps.

According to Aruba, several events other than the Metallica concerts have passed the 2 TB Wi-Fi data mark so far, with several events seeing more than 8,000 unique clients connected and marks of 6,000+ concurrent connected devices and 2.6 Gbps of throughput.
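For scale, those peak aggregate figures imply a fairly modest average share per device. The aggregate numbers are Aruba's; the division below is our own back-of-envelope sketch (decimal units assumed, 1 Gbps = 10^9 bits/s):

```python
# Rough per-device math on the peak figures Aruba reported above.
# Illustrative only; assumes decimal units (1 Gbps = 1e9 bits/s).
peak_throughput_bps = 2.6e9    # ~2.6 Gbps aggregate Wi-Fi throughput
concurrent_devices = 6_000     # 6,000+ concurrent connected devices
per_device_mbps = peak_throughput_bps / concurrent_devices / 1e6
print(f"~{per_device_mbps:.2f} Mbps per concurrent device at peak")
```

About 0.43 Mbps per device on average at peak, a reminder that headline throughput is shared across thousands of simultaneous users.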

The Warriors’ Fulmer praised not just the Wi-Fi gear but the full “end to end network solutions” available from Aruba as well as from parent Hewlett Packard Enterprise, which is a founding partner at Chase Center.

“We’re still only three months in, and there’s a lot more that we want to do,” Fulmer said. “It was not a small undertaking. But I think we can let the technology speak for itself.”

Fiserv Forum’s wireless networks ready for the Democratic Convention

Milwaukee’s Fiserv Forum, home of the NBA’s Milwaukee Bucks and also the locale for this summer’s Democratic Convention. Credit all photos: Paul Kapustka, MSR (click on any picture for a larger image)

With one of the most demanding arena-sized events headed its way this upcoming summer, the wireless networks at Milwaukee’s Fiserv Forum appear to be more than ready to handle any audience demand for mobile connectivity.

With a full-featured distributed antenna system (DAS) deployed and operated by ExteNet Systems using gear from JMA Wireless, as well as a Wi-Fi network using Cisco gear, Fiserv Forum shows both the expertise of wireless providers with a long history of knowing what works, and the foresight to add new techniques and technologies that combine high performance with the quality aesthetics that are the hallmark of the new home of the NBA’s Milwaukee Bucks.

And while a Mobile Sports Report visit this past fall for a Bucks game found all the wireless elements in top working order, the big event for the venue’s second year of operation will be the Democratic National Convention in July 2020. While the four-day nomination gathering is a test for any locale, Fiserv Forum’s forethought on how to prepare for numerous types of events in and around its uniquely designed structure has it well prepared to handle whatever wireless needs the convention will require.

It all starts with the DAS

Editor’s note: This profile is from our latest STADIUM TECH REPORT, which is available to read instantly online or as a free PDF download! Inside the issue are profiles of the new Wi-Fi deployment at the University of Oklahoma, as well as profiles of wireless deployments at Chase Center and the University of Florida! Start reading the issue now online or download a free copy!

Even in these days of predictions of the death of DAS, Fiserv Forum is proof that for high-profile venues, carriers will still participate in a quality deployment. And while many venues have just two or three cellular providers on their DAS, according to ExteNet, the Fiserv Forum DAS has five major carriers participating — AT&T, Verizon, T-Mobile, Sprint and U.S. Cellular.

Wi-Fi AP on an outdoor plaza light pole

Unlike some new arenas, where wireless is an afterthought to construction, ExteNet was involved early on, according to Manish Matta, vice president of marketing at ExteNet.

“Getting in sooner rather than later is always better,” said Matta, who said ExteNet was well involved in the overall construction plans, ensuring that there were no delays associated with wireless deployments holding up construction of other parts of the building.

During a pregame tour in October with a team from ExteNet as well as with Robert Cordova, chief technology and strategy officer for the Bucks, Mobile Sports Report got an up-close look at some of the inside parts of the DAS network design, including the headend room and multiple antenna installations that were hard to find given their well-designed placements and camouflaging.

In addition to regular enclosures that were painted or otherwise placed in areas out of the main sight lines, ExteNet and JMA also utilized some of the newer circular flat-panel antenna enclosures that fit flush to ceilings, minimizing the exposure.

The 215 DAS antennas are powered by 40 remote units. According to JMA, the remotes are connected to the backbone with optical fiber, and use digital power to bring power to elements up to a mile away. With 16 sectors in the bowl design, the DAS is able to segment coverage to all parts of the arena, including the bowl as well as concourses and other in-house areas.

DAS antenna in a concourse location

ExteNet, which owns and operates the DAS as a neutral host, also installed 10 extra MatSing ball antennas in the rafters for additional top-down coverage. Though only AT&T is using the MatSings right now, ExteNet said they are integrated into the DAS design if other carriers should wish to utilize them in the future.

During a short walk-around before the Bucks game started, MSR got a DAS speedtest of 85.8 Mbps on the download and 14.9 Mbps on the upload, even though our older iPhone (on the Verizon network) doesn’t support all the latest DAS capabilities. Near the start of the game, as the pregame introductions were at their peak, we got a DAS mark of 18.0 Mbps / 15.7 Mbps in the middle of an upper-deck seating area (Section 227) and then a little bit after the game started, we got a mark of 21.3 Mbps / 12.5 Mbps near a bar area on the upper-level concourse.

Wi-Fi inside and out

On the Wi-Fi side of things, a visitor to Fiserv Forum can connect to the network even before coming in the doors, as the 623-AP Cisco installation includes Wi-Fi APs mounted on light poles in the “Deer District,” the plaza area on the stadium’s east side that connects to an outdoor beer garden and several bars and restaurants, all part of the planned environment built in sync with the arena’s opening.

Before we went inside, we got a Wi-Fi speedtest of 40.5 Mbps / 40.2 Mbps in the middle of the Deer District plaza, which was hosting a pop-up haunted house attraction sponsored by Jack Daniels.

Inside the building, we again needed some guidance from the Bucks’ Cordova to locate some of the Wi-Fi APs, which are inside triangular enclosures that are either painted to match wall surfaces, or utilized as high-visibility section number signs, hiding the gear in plain sight.

Wi-Fi AP blended in to the wall covering

In the seating bowl, Fiserv Forum again shows its commitment to aesthetics with the smallest handrail enclosures we’ve ever seen, a discreet hand-sized enclosure that tucks the antenna components neatly into the top part of a railing, with the AP electronics hidden below the seating areas. Designed by integrator Johnson Controls and its ecosystem partners, Abaxent and AccelTex, the 28 special enclosures are also designed to be easy to detach and re-attach (with something Johnson Controls calls a simple two-click “dart connector”), which helps keep the network working when the lower-bowl seating areas need to be reconfigured for different events.

Sitting in a courtside seat near one of the handrail enclosures about 20 minutes before tipoff, we got a Wi-Fi speedtest mark of 15.8 Mbps / 33.2 Mbps. On the main concourse just after the game’s start we got a Wi-Fi mark of 28.6 Mbps / 60.4 Mbps, and later on at that same upper-concourse bar we got a mark of 39.9 Mbps / 61.1 Mbps.

Later on during the second quarter of the game, we watched another fan in our lower-bowl seating area spend most of the period keeping one eye on Monday Night Football streaming on his phone. “The Wi-Fi is really good here,” he noted.

Looking ahead to CBRS and 5G

As ExteNet and JMA prepare for the onslaught of the convention’s needs, the Bucks are already looking farther ahead in many areas, to future communications improvements including 5G millimeter wave deployments and a possible introduction of CBRS services. Cordova, who is an advocate of the capabilities of private LTE networks over the CBRS spectrum, said the flexibility of provisioning services in a CBRS environment could be extremely useful for temporary needs, like during last year’s NBA playoffs when the NBA on TNT crew set up a temporary stage out in the plaza.

The Bucks have already prepared for connectivity of all sorts out on the plaza space, from the top-level outdoor Panorama deck at Fiserv Forum that lets fans look out over the city down to several metal boxes in the plaza that Cordova pointed out, which carry home-run fiber connections for broadcast TV as well as remote power. Even so, there will be all sorts of temporary connectivity needs when the convention media tents set up in the empty lot next door where the previous arena, the Bradley Center, used to stand.

The fact that the Bucks and ExteNet were already deep into planning for a July event the previous October is just another sign of a networking operation that is well positioned now and already thinking about the next necessary steps.

Robert Cordova, chief technology and strategy officer for the Bucks, in the headend room

MatSing ball antennas point down from the rafters

The Daktronics centerhung video board

Verizon sees 21.5 TB of cellular data used at Super Bowl LIV in Miami

Verizon, which led the DAS effort at Miami’s Hard Rock Stadium, said it saw 21.5 terabytes of data used on its network “in and around the stadium” during Sunday’s Super Bowl LIV.

While we are still circling back with Verizon to see if we can get more granular details, we are guessing that this number comes from the network inside and immediately adjacent to the venue, and not the “2-mile radius” that other carriers are also reporting. We will update as we get more info. With AT&T’s report of 10.2 TB inside/around the stadium and 14.5 TB total including the two-mile radius, we are somewhere north of 30 TB of cellular traffic for the big game, even before we have any stats from other carriers.
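For those keeping score, the carrier figures reported above roll up as follows. This is just a quick sketch of the arithmetic, with the caveat that the carriers’ geographic boundaries differ (“in and around” versus a two-mile radius), so the sum is a rough floor rather than an exact stadium total:

```python
# Rough roll-up of the carrier-reported Super Bowl LIV data totals above.
# Boundaries differ by carrier, so treat these sums as a floor, not an
# exact in-stadium total.
reported_tb = {
    "verizon_in_and_around": 21.5,  # Verizon's in/around-stadium figure
    "att_inside": 10.2,             # AT&T inside the stadium
    "att_two_mile_extra": 4.3,      # AT&T's additional two-mile-radius traffic
}

att_total = reported_tb["att_inside"] + reported_tb["att_two_mile_extra"]
combined = reported_tb["verizon_in_and_around"] + att_total

print(f"AT&T total: {att_total:.1f} TB")        # 14.5 TB
print(f"Two-carrier floor: {combined:.1f} TB")  # 36.0 TB
```

Even counting only the two carriers that have reported so far, the combined traffic lands well above 30 TB.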

We are also asking Verizon to provide a total number of users on the 5G network the carrier deployed inside the stadium, but don’t hold your breath since total number of users is not a statistic carriers like to provide (as opposed to Wi-Fi stats, which almost always report the unique connections).

On an interesting note, Verizon said in an email to MSR that it saw 20.5 TB of data traffic on its networks around the stadium at Super Bowl 53 in Atlanta last year — as far as we know, this is the first official number from Verizon about last year’s Super Bowl, when the carrier declined to provide MSR with any post-game numbers directly after the event. According to the Falcons IT staff, there was only 12 TB total of DAS traffic on the in-stadium network during last year’s Super Bowl, so the 20.5 TB of Verizon traffic must have included a lot of “around the stadium” traffic to get to that number.

AT&T sees 10.2 TB of cell data used by customers at Super Bowl LIV

Super Bowl LIV fans who are AT&T wireless customers used 10.2 terabytes of data inside Miami’s Hard Rock Stadium during Sunday’s game, according to statistics provided by AT&T.

Also according to AT&T, the carrier saw an additional 4.3 TB of data used Sunday in a two-mile radius around the Super Bowl venue, bringing AT&T’s Super Sunday total to 14.5 TB. Sunday’s totals capped a week of Miami-related wireless activity that saw 172 TB of data used on AT&T’s networks in a two-mile radius around the NFL festivities areas.

Interestingly, the numbers have dipped a bit from the previous year’s Super Bowl totals, when AT&T said it saw 11.5 TB used in and around Atlanta’s Mercedes-Benz Stadium and 23.5 TB used within a two-mile radius of the Super Bowl 53 venue.

However, we’ve found that these numbers aren’t always an apples-to-apples comparison from year to year, especially on the cellular front, where it’s simply harder to define boundaries. Case in point was last year, when the only number we got from Sprint apparently included all of downtown Atlanta. Verizon Wireless and T-Mobile did not report any Super Bowl numbers last year.

With any luck, we will get more carrier numbers during the week and then hopefully before too long, the official Wi-Fi numbers from Hard Rock Stadium as well. Though we did hear some reports Sunday night about the Wi-Fi struggling in some places during halftime, we also heard other reports of solid coverage, so we will wait to see what the final numbers say. We are also hoping to get some 5G performance figures, to see if the pre-game hype was realized. Stay tuned!

Stadium Tech Report: New Wi-Fi network soars at Ohio State

Ohio Stadium set records this season for single-day Wi-Fi use inside a venue. Credit all photos: Paul Kapustka, MSR (click on any picture for a larger image)

With its long tradition of excellence in all things pertaining to college football, is it any surprise that when the Ohio State University finally got Wi-Fi installed at Ohio Stadium the network would instantly be one of the best around?

Over this past offseason, the school oversaw the first comprehensive installation of a fan-facing Wi-Fi network inside the venerable “Horseshoe,” with almost 2,000 access points, some 600 of which were installed in handrail enclosures that all sport the Ohio State logo engraved on each side. Live and operational for the Buckeyes’ home opener on Aug. 31, the network saw just more than 47,000 unique users its first day and carried more than 13 terabytes of data, instantly lifting Ohio State to the front of the class in single-day collegiate football Wi-Fi records. In subsequent home dates this fall, Ohio State went on to record more big-data days, including the highest-ever single-day use of Wi-Fi in a stadium, 25.6 TB on Oct. 5 for a game against Michigan State.

Impressive as its first season might be, the network will only get significantly better in the near future as device technology catches up with it. A decision to use the new Wi-Fi 6 standard, also known as 802.11ax, in as many of the APs as possible, will let Ohio State take advantage of the technology’s promise of higher throughput and the ability to handle more clients per AP when more fans get their hands on devices that support Wi-Fi 6 and bring them to games.

During a visit by Mobile Sports Report for the Aug. 31 game, close-up inspection of many of the APs in a pre-game walkaround saw no evidence of the frenetic summer of hard work getting the equipment installed. Using Wi-Fi gear from Aruba, a Hewlett Packard Enterprise company, installed with a design by AmpThink (which also manufactured the AP enclosures), the deployment does an excellent job of looking like it’s been part of the almost 100-year-old stadium for a long time, with discreet wall and overhead antenna placements complementing the standout handrail enclosures. And with connectivity finally in their house, the Ohio State fans wasted no time jumping on the network, with many fans expressing great joy at being able to use their wireless devices at the game.

A bumpy road to Wi-Fi

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new converged fiber network at Dickies Arena, and an in-person research report on the new Wi-Fi network at Las Vegas Ballpark. You can either VIEW THE REPORT LIVE (no registration needed) or DOWNLOAD YOUR FREE COPY now!

Handrail enclosures brought Wi-Fi gear close to the fans.

Built in 1922 as one of the then-largest poured-concrete structures, the building known officially as Ohio Stadium (and also as “the Horseshoe,” or just “the Shoe”) is among the biggest of the big, with capacity reaching 104,944 after renovations in 2014. That number actually decreased a bit with a recent round of renovations that removed some seats in favor of some new suite areas, but even with capacity of around 102,000, Ohio Stadium is still among the top echelon of Saturday afternoon shrines for its scarlet- and grey-clad followers.

While the venue has long been held in reverence not just by Ohio State fans but by football fans in general, the things that make it a great place to watch a game – the big, open seating bowl and the historic concrete structure – also make it a challenge to equip with modern wireless technology. Back in 2012, it looked like the school had solved the problem by signing a deal with Verizon to bring Wi-Fi to the football stadium and basketball arena. But according to several reports, the installation never occurred, and the school and Verizon are still involved in a lawsuit over the non-deployment.

Fast forward to 2018, and the school finally approved a measure that would bring connectivity not just to the stadiums, but to many other places across campus as well. Jim Null, senior associate athletic director and chief information officer for Ohio State, noted that as a digital program partner with Apple, the school gives all students iPads as freshmen, leading to demands for coverage not just in classrooms but anywhere students may wander.

“There were a lot of coverage gaps on campus,” Null said. The new deal, reached in the spring of 2018, approved $18.6 million in spending for wireless coverage in the stadiums and across campus. According to Null, the sports stadiums’ portion of that deal was approximately $10 million. Null also said the stadium has a 30 Gbps backbone pipe, courtesy of the Ohio Academic Resources Network (OARnet), the 100 Gbps network that connects the state’s major cities and research institutions.

Handrails and Wi-Fi 6

With a bill of materials in hand for the deployment, Null said that Aruba asked if the school wanted to use Wi-Fi 6 gear, which became available this spring, just as construction was set to begin.

The big video board at Ohio Stadium helped fans find the Wi-Fi.

“It was good timing in a sense – Aruba came back to us and said, why not go with Wi-Fi 6, and everyone here [at the school] decided that was a good idea,” Null said. While the new version of the standard will improve Wi-Fi performance in any kind of network, at large sports venues the improvements will likely be significant. AmpThink president Bill Anderson, who is urging most new-construction Wi-Fi clients to install Wi-Fi 6 if possible, calls the new standard “a significant game-changer” for in-venue networks.

AmpThink’s Anderson, whose company has designed and helps run networks in the biggest stadiums that host the biggest events – including last year’s Super Bowl and last year’s men’s NCAA Final Four – says that over the past year or so, networks based on older Wi-Fi standards have been reaching theoretical limits, mostly around spectrum re-use. “We are getting to the cutting edge of what we can support” with the older Wi-Fi 5 technology (also known as 802.11ac), Anderson said.

Wi-Fi 6, however, promises to deliver more capacity per access point, along with better techniques for communication between devices and access points, which most industry followers agree should produce significant benefits, especially in venues where spectrum re-use is necessary given the large numbers of APs needed to provide coverage. While it’s true that it may take some time before Wi-Fi 6 technology is on both the access point and the balance of user devices in stadiums (both sides of the equation need to support Wi-Fi 6 for the full range of benefits to be realized), the fact that many new devices – including the recently announced Apple iPhone 11 line – contain support for Wi-Fi 6 means that the full improvements will likely be seen sooner rather than later.
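As a rough illustration of that more-clients-per-AP point (our own sketch, not anything from Aruba or the Ohio State deployment): one of Wi-Fi 6’s headline features is OFDMA, which lets an access point split a channel into smaller “resource units” (RUs) and address several devices in a single transmit opportunity, where a Wi-Fi 5 AP talks to one client at a time per channel. In a 20 MHz 802.11ax channel, the standard RU sizes are 26, 52, 106, or 242 tones:

```python
# Illustrative sketch: why OFDMA in Wi-Fi 6 raises clients-per-AP.
# In a 20 MHz 802.11ax channel, the number of equal-size resource
# units (RUs) that fit depends on the RU size the scheduler picks.
RU_CAPACITY_20MHZ = {26: 9, 52: 4, 106: 2, 242: 1}  # RUs per 20 MHz channel

def stations_served_per_txop(ru_tones: int) -> int:
    """Stations one 20 MHz AP radio can address in a single transmit
    opportunity when the scheduler uses a uniform RU size."""
    return RU_CAPACITY_20MHZ[ru_tones]

# Wi-Fi 5 (802.11ac) has no OFDMA: effectively one full-width RU.
print(stations_served_per_txop(242))  # 1 station, the Wi-Fi 5 behavior
print(stations_served_per_txop(26))   # up to 9 stations at once
```

In a packed bowl where most traffic is small bursts (scores, social posts, messaging), being able to serve many small transmissions in parallel is exactly the regime where that scheduling gain matters.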

“Ohio State made the right choice to go with Wi-Fi 6,” Anderson said.

Putting the APs into handrail enclosures was another decision point, but one Null said the school was unified on. Though Aruba has traditionally preferred to deploy Wi-Fi in under-seat placements, like in deployments at Levi’s Stadium and Mercedes-Benz Stadium, Null said the combination of aesthetics, performance and cost made railing enclosures the preferred choice at Ohio State.

“The combination of all three led us to the handrails,” Null said, noting that with the ability to place two APs into a single handrail enclosure, Ohio State was able to approximately cut in half the number of holes it would have to drill into the concrete to string cable to the devices, a huge savings in cost and construction time. With bleachers in most of its seating areas, Ohio Stadium would have cut into under-seat spaces significantly with under-seat APs, Null said.

Wi-Fi enclosures in the handrails at Ohio Stadium’s upper deck.

Though some lower-bowl areas without handrails did get under-seat AP placements, the 600 handrail enclosures – all manufactured by AmpThink and custom-stamped with an Ohio State logo – now wrap around the entire seating bowl, from near the field to way up at the top of Deck C. Null said performance from some other recent AmpThink deployments that primarily used handrail enclosures – including Notre Dame Stadium and U.S. Bank Stadium – led Ohio State to believe that handrail installation techniques would be “very comparable in performance” to under-seat.

According to stats compiled this season, the Ohio Stadium handrail enclosures are working just fine. The school said the network saw 47,137 unique connections out of 103,228 in attendance for the home opener against Florida Atlantic on Aug. 31, with a peak concurrent connection number of 28,900. Total bandwidth tonnage for the first game was 13.3 terabytes, a mark which put Ohio State in fifth place on the unofficial all-time Wi-Fi single-day record list kept by MSR. But Ohio Stadium’s network was just getting started.

Ohio State’s second home game of the season, a week later versus Cincinnati, was nearly equal in performance statistics. According to figures provided by Ohio State, on Sept. 7 the network saw 47,579 unique connections out of 104,089 in attendance, with a peak concurrent connection mark of 28,900. Total tonnage for the second game was 12.7 TB, good enough for then sixth place on the MSR list. Peak bandwidth rates were just over 10 Gbps for the home opener, and just above 6 Gbps during the second game.
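Translated into take rates and per-fan averages, those two games look remarkably consistent. A quick sketch from the figures above (decimal units, so 1 TB = 1,000 GB):

```python
# Take rate and average data per connected fan, from the figures above.
games = {
    "Aug. 31 (Florida Atlantic)": {"unique": 47_137, "attendance": 103_228, "tb": 13.3},
    "Sept. 7 (Cincinnati)":       {"unique": 47_579, "attendance": 104_089, "tb": 12.7},
}

for name, g in games.items():
    take_rate = g["unique"] / g["attendance"]     # share of fans on Wi-Fi
    gb_per_user = g["tb"] * 1000 / g["unique"]    # decimal TB -> GB per user
    print(f"{name}: {take_rate:.1%} take rate, {gb_per_user:.2f} GB per user")
```

Both games work out to roughly a 46 percent take rate and a bit over a quarter-gigabyte per connected fan, a sign the opener was no fluke.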

Later in the year, the network heated up even more as OSU hosted its biggest games. On Oct. 5, Ohio State shattered the all-time Wi-Fi record with a mark of 25.6 TB, with an astonishing 74,940 unique connections and a peak concurrent connectivity number of 45,200 users. Hosting Wisconsin on Oct. 25, Ohio State saw 17.0 TB of data used on the network (during a full-day rainstorm) and then saw another 16.1 TB used on Nov. 9 against Maryland. Then on Nov. 23 against Penn State the network saw 20.7 TB of data, giving Ohio Stadium seven of the top-10 Wi-Fi days we’ve ever heard of.

Solid tests throughout the venue

An unofficial walk-around testing process by MSR before and during the home opener showed solid performance in just about every part of the venue, from outside the entry gates to all the seating areas low and high, and on concourses and other busy walkways. Inside Gate 14, we got one of the highest Wi-Fi speedtest marks in the stadium, at 62.7 Mbps on the download side and 72.1 Mbps for upload. According to Null, the entryways are well covered, with four access points hidden behind a directional sign that simply blends into the structure.

A good look at the spread of handrail enclosures in the lower bowl.

Inside the stadium, we got a mark of 49.2 Mbps / 42.9 Mbps in the seats in the lower bowl around the 45-yard line, an area covered primarily by handrail enclosures. Closer to the field in seats along the goal line on the press box side of the stadium we got a mark of 51.2 Mbps / 32.0 Mbps; in the same spot we tested the DAS coverage for cellular and got a Verizon network speedtest of 20.1 Mbps / 1.34 Mbps. According to Null, Verizon runs a neutral-host DAS inside the stadium, with AT&T as a client.

Back on Wi-Fi with the stadium still closed to fans we went up into the metal bleachers in the non-curved end zone and got a speed test of 38.6 Mbps / 18.7 Mbps. In the concourse below these same stands we got a test mark of 47.2 Mbps / 48.5 Mbps.

An elevator ride to Deck C and a hike up the steep steps found us at the top row of the stadium, where the Wi-Fi was still strong, with a mark of 42.0 Mbps / 35.6 Mbps in row 41. We then went down to Deck B on the non-press box side of the stadium, where some concrete overhangs make for interesting placements. There, we saw Wi-Fi APs mounted above the seating areas pointing down. With fans starting to come into the stadium we got a mark there of 24.3 Mbps / 45.2 Mbps; in the same area the DAS provided a test of 21.8 Mbps / 12.6 Mbps, again on the Verizon network.

The one place we found with poor Wi-Fi coverage – down near the field in section 28AA – was one of the few areas where Null said that the network deployment was not yet complete early in the season. (The Speedtest.net app we use for testing dropped during the test here; the same area did have DAS coverage, with a mark of 16.9 Mbps / 4.66 Mbps on the Verizon network.)

That the network was nearly complete for the opening game was a testament to extra work from all suppliers. AmpThink, which outfitted three major college stadiums this summer, ran overtime shifts to manufacture enough enclosures, while Aruba had to produce enough Wi-Fi 6 APs not just to fill Ohio State, but also Oklahoma, whose stadium is of similar size.

“It was quite a ballet dance the last nine months,” said Jeff Weaver, director of high density consulting at Aruba. “Hats off to the construction team.”

Perhaps the most impressive tests we got were taken during live game action, one just after an Ohio State touchdown. In section 13 up on the C deck we wandered out into the middle of celebrating fans and got a speedtest of 59.9 Mbps / 57.9 Mbps. Walking down to section 27AA on the press box side after yet another OSU touchdown we sat in the aisle and got a speed test of 54.7 Mbps / 70.2 Mbps, from an area covered by handrail enclosures.

Fans happy now, likely to be even happier in the future

If Ohio State is known widely for its football excellence (the school has eight national championship titles to its name, and is in the playoffs for this year’s title), its fans have known mostly wireless frustration in recent years, a situation that has now changed 180 degrees. In several conversations with fans, MSR heard how happy OSU fans were now “that we can actually use our phones!” And as good as the network speed tests and overall performance are now, it’s worth noting that the Wi-Fi 6 advancements are not yet even being used – meaning that when more fans have Wi-Fi 6-enabled devices, the network should perform even better, leading to faster connections and more capacity for all.

Null said that Ohio State will also deploy Passpoint in the future, which allows for automatic sign-on to the Wi-Fi network and better support for device roaming. Ohio State does not ask fans to log in with any sort of email information or personal identification – all they need to do is select the OSUfanWiFi SSID and connect. And if the first season is any indication, many Ohio State fans will continue to do so with great appreciation for the foreseeable future.
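For readers curious what Passpoint (also known as Hotspot 2.0) involves under the hood, the gist is that the AP advertises network and venue information via 802.11u so capable devices can discover and join automatically. A minimal hostapd-style configuration sketch of the relevant knobs might look like the fragment below; this is illustrative only, since Aruba gear is configured through its own controller interface, and the domain and realm values here are hypothetical:

```
# hostapd.conf fragment (illustrative sketch of Passpoint basics)
interface=wlan0
ssid=OSUfanWiFi          # the open fan SSID named above
interworking=1           # enable 802.11u interworking / ANQP
access_network_type=3    # free public network
internet=1
venue_group=1            # Assembly
venue_type=2             # Stadium
venue_name=eng:Ohio Stadium
hs20=1                   # enable Hotspot 2.0 (Passpoint)
domain_name=wifi.example.edu              # hypothetical
nai_realm=0,wifi.example.edu,13[5:6]      # hypothetical realm, EAP-TLS
```

In practice a production Passpoint network would also pair this with WPA2/WPA3-Enterprise authentication and provisioned device profiles; the fragment above only shows the discovery side.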

Editor’s note: You can now read our Stadium Tech Report profile of the new Ohio State network (with all our great photos) instantly online, with no registration or email address needed! JUST CLICK RIGHT HERE and start reading our latest report today!