Top-down approach brings Wi-Fi to OKC Thunder’s Chesapeake Energy Arena

Chesapeake Energy Arena, home of the NBA’s Thunder. Credit all photos: Oklahoma City Thunder

If there’s one sure thing about stadium Wi-Fi deployments, it’s that pretty much no two networks are ever exactly the same. So even as there is a growing large-venue trend for putting Wi-Fi access points under seats or in handrails, sometimes the traditional top-down method is still the one that works best.

Such was the case for the first full fan-facing Wi-Fi network at Chesapeake Energy Arena in Oklahoma City, home of the NBA’s Thunder. With a large amount of retractable seating in the 18,000-seat venue, an under-seat approach to Wi-Fi would prove too costly and disruptive, leading the team to look for connectivity from above.

While a solid in-building cellular distributed antenna system (DAS) had done a good job of keeping fans connected over the last few years, the team’s desire for more mobile insight into fan activity, as well as a switch to a Wi-Fi-centric point-of-sale system, led Oklahoma City to finally install fan-facing Wi-Fi throughout the venue.

Chris Nelson, manager of information technology for venue manager SMG, and Tyler Lane, director of technology for the Thunder, spoke with Mobile Sports Report about the recent Wi-Fi deployment at Chesapeake Energy Arena, which went live during the most recent NBA season.

An AP placement in the rafters

Though the venue looked at all options, Nelson said that going under-seat with APs would have been “very costly” to do, given the large number of retractable seats in the arena.

“We wanted to hang them [APs] from the top if we could,” Nelson said.

After testing the top equipment brands available, the Thunder settled on Ruckus gear for what they said was a simple reason, one involving the 96 feet of air space from the catwalk to the arena floor.

“Ruckus was the only one whose gear could reach down all the way,” Nelson said.

Adding to the fan experience

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new Wi-Fi network at Allianz Field in St. Paul, Minn., and an in-depth research report on the new Wi-Fi 6 standard! DOWNLOAD YOUR FREE COPY now!

According to the team, the deployment used 410 total APs, with 350 in the arena proper and another 60 deployed across the street at the Cox Convention Center. According to the Thunder’s Lane, the team rolled out the service slowly at first, with some targeted testing and feedback from season ticket holders.

Close-up of an AP placement

“We got some good feedback and then when we went to a full rollout we had signage in the concourses, communications via ticketing services and announcements over the PA and on the scoreboard,” to tell fans about the system, said Lane.

According to statistics provided by the team, the Wi-Fi was getting good traction as the season went on, with a March 16 game vs. the Golden State Warriors seeing 589.3 gigabytes of traffic, from 2,738 clients that connected to the network. Lane said the team employed Jeremy Roach and his Rectitude 369 firm to assist with the network design; Roach in the past helped design networks at Levi’s Stadium and Sacramento’s Golden 1 Center.

Now that the Wi-Fi network is in place, Lane said the Thunder is starting to expand the ways it can add to the fan experience digitally, including app-based features like live press conferences and an artificial intelligence chatbot that helps provide fans with arena information.

“It’s really all about enhancing the fan experience,” Lane said, with an emphasis on driving digital ticketing use in the YinzCam-developed team app. Lane said that the system also drives a lot of mobile concessions traffic, and added that “Ruckus did a fantastic job of asking all the right questions for our food and beverage partners.”

Temporary courtside network helps set Final Four Wi-Fi records

A temporary under-seat Wi-Fi network helped bring connectivity to courtside seats at this year’s Final Four. Credit all photos: Paul Kapustka, MSR (click on any picture for a larger image)

One of the traditional characteristics of the Final Four is the yearly travel scramble of the fortunate fans and teams who have advanced to the championship weekend. Somehow, with only a week’s notice, plane flights, road trips and hotel rooms get scheduled and booked, leading to packed houses at college basketball’s biggest event.

On the stadium technology side, a similar last-minute fire drill happens just about every year as well, as the hosting venues reconfigure themselves to host basketball games inside cavernous buildings built mainly to hold football crowds. At this year’s NCAA Men’s Final Four at U.S. Bank Stadium in Minneapolis, the stadium tech team and partner AmpThink were able to quickly construct a temporary Wi-Fi network to cover the additional lower-bowl seating. The new capacity was part of a record-setting Wi-Fi network performance at the venue, with single-day numbers surpassing those from Super Bowl 52, held in the same building the year before.

The Wi-Fi numbers, both staggering and sobering, especially to venues next in line for such bucket-list events, totaled 31.2 terabytes for the two days of game action, according to figures provided by the NCAA. For the semifinal games on Saturday, April 6, U.S. Bank Stadium’s Wi-Fi network saw 17.8 TB of traffic, topping the 16.31 TB used during Super Bowl 52 on Feb. 4, 2018. The Saturday semifinals also set an attendance record for the venue, with 72,711 on hand, topping the 67,612 in attendance for Super Bowl 52.

During the championship game on April 8, U.S. Bank Stadium saw an additional 13.4 TB of data used on the Wi-Fi network, giving the venue three of the top four single-day Wi-Fi numbers we’ve reported, with this year’s mark of 24.05 TB at Super Bowl 53 in Atlanta the only bigger number. Saturday’s take rate at U.S. Bank Stadium, however, surpassed even the most-recent Super Bowl, with 51,227 unique users on the network, a 70 percent take rate.
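For readers who want to verify the take-rate arithmetic, here is a minimal sketch (the figures are pulled from the totals above; the function name is ours):

```python
# Sanity-check the take-rate figures reported above.
def take_rate(unique_users: int, attendance: int) -> float:
    """Wi-Fi take rate: share of attendees who joined the network, in percent."""
    return unique_users / attendance * 100

# Final Four Saturday at U.S. Bank Stadium
print(f"{take_rate(51_227, 72_711):.0f}%")  # prints 70%
```

The same calculation applied to any of the events below gives a quick read on how heavily a crowd actually used a network, independent of raw tonnage.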

‘Like building an arena network inside a football stadium’


Switches for the temporary network were deployed under the seat scaffolding.

There’s no doubt that the temporary network installed by AmpThink and the U.S. Bank Stadium IT team contributed a great deal to the final Wi-Fi totals, with 250 access points installed in the additional seats. Like at other football venues that are transformed into basketball arenas, U.S. Bank Stadium had temporary seating installed on all four sides of the stadium, with temporary risers stretching down over football seating as well as with risers built behind both baskets. More seats were installed on the “floor” of the football field, right up to the elevated court set in the middle. The temporary APs, like the existing ones in the stadium, are from Cisco.

“There are a lot more moving parts to a Final Four than to a Super Bowl,” said David Kingsbury, director of IT for U.S. Bank Stadium, describing the difference in providing the networking and technical underpinnings for each event. While planning for the networks was obviously done far in advance, the actual buildout of the temporary Wi-Fi couldn’t even begin until the additional seating was in place, a task that finished just five days before the first game was played.

That’s when AmpThink deployed a staff of 12 workers to start connecting cables to APs and to switches, while also adding in another 700 wired network connections to the courtside areas for media internet and TV monitor connections. Like it does for every venue network it designs and deploys, AmpThink came to the stadium equipped with a wide assortment of lengths of pre-terminated cables, preparation that made the fast deployment possible.

“If we had to spin raw cable and terminate it on site, we never would have been able to finish in five days,” said AmpThink president Bill Anderson.

AmpThink’s previous experience in deploying such temporary networks under temporary seating — including at the previous year’s Final Four in San Antonio — taught the company that it would also need protection for under-seat switch deployments, to fend off the inevitable liquid spills from the seats above. That requirement was potentially even more necessary at U.S. Bank Stadium, since this year’s Final Four was the first to allow in-venue sales of alcoholic beverages.

Some temporary seats were deployed on top of existing lower bowl seats.

With some of the temporary seating installed over existing seating, there were 95 APs in the existing handrail-enclosure design that had to be turned off for the Final Four, according to Kingsbury. The 250 new APs added were all installed under the folding chairs, in enclosures that simply sat on the floor.

According to AmpThink’s Anderson, the company did learn a lesson at U.S. Bank Stadium: at future events it will need to secure the actual enclosures, since during the weekend curious fans opened a few of the boxes, with one AP disappearing, perhaps as an interesting IT souvenir.

In San Antonio, AmpThink had zip-tied the enclosures to chairs, which led to increased labor to detach the devices during the post-event breakdown. While having no such measures at U.S. Bank led to a fast removal — AmpThink said it had removed all the temporary network elements just seven hours after the championship game confetti had settled — for next year’s Final Four AmpThink plans to at least zip-tie the enclosures shut so that fans can’t attempt any ad hoc network administration.

More APs for back of house operations

Another difference between the Final Four and the Super Bowl is that four teams, not two, are in attendance for a full weekend, necessitating temporary “work rooms” adjacent to each school’s locker room area. The media work center for the Final Four is also typically larger than that of a Super Bowl, again because more cities and their attendant media outlets are on site thanks to there being four, not just two, teams involved.

A concourse speed test taken just after halftime of the final game.

“We had to cover a lot of places in the stadium that we don’t normally cover” with wireless and wired network access, Kingsbury said, noting that an additional 30 APs were needed for team rooms and the main media workspace, which were located on the field level of the stadium in the back hallways. An interesting note at U.S. Bank Stadium was that the yards and yards of fabric used as curtains to cover the clear-plastic roofing and wall areas were actually beneficial to Wi-Fi operations, since they cut off some of the reflective interference caused by ETFE surfaces.

According to Kingsbury the final count of active APs for the Final Four was 1,414, a number reached by adding in the temporary APs while deducting the ones taken offline. Not included in the official NCAA traffic numbers was an additional 3 TB of traffic seen during the free-admission Friday practice sessions, when 36,000 fans visited the stadium, with 9,000 joining the Wi-Fi network.

From the official stats, the peak concurrent user number from Final Four Saturday of 31,141 was also an overall record, beating Super Bowl 53’s mark of 30,605. (Super Bowl 53 had 70,081 fans in attendance for the Feb. 3 game between the New England Patriots and the Los Angeles Rams.) The Wi-Fi network for Monday’s championship game (won by Virginia 85-77 over Texas Tech in overtime) produced big numbers itself, with 13.4 TB of total data used, 48,449 unique connections and 29,487 peak concurrent users (out of 72,062 in attendance). Monday’s game also produced a peak throughput number of 11.2 Gbps just after the game ended.

None of those totals could have been reached without the temporary network, which AmpThink’s Anderson compared to “building a 10,000-seat arena network inside a football stadium.” Next stop for a temporary Wi-Fi network is Mercedes-Benz Stadium in Atlanta, where the 2020 Final Four awaits.

This is what your football stadium looks like with a championship basketball game inside of it.

The temporary center-hung scoreboard was able to play video programming onto the court surface.

The NBA on TBS crew was courtside for the Final Four.

The secret to keeping your network operations room running? All kinds of energy inputs.

New Report: Wi-Fi 6 research report, record Wi-Fi at the Final Four, and more!

MOBILE SPORTS REPORT is pleased to announce the Summer 2019 issue of our STADIUM TECH REPORT series, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace.

Our latest issue contains a research report on the new Wi-Fi 6 standard and what it means to stadium networks, as well as three separate profiles of Wi-Fi network deployments, including a look at how a temporary network helped fans use record data totals at the Final Four! Download your FREE copy today!

Inside the report our editorial coverage includes:

— A Wi-Fi 6 research report that looks into the new standard’s technology improvements that make it a great bet for in-venue networks;
— An in-person report from the NCAA Men’s 2019 Final Four at U.S. Bank Stadium, where the weekend saw a record 31+ terabytes of Wi-Fi data used;
— How Minnesota United’s new home, Allianz Field, got a big Wi-Fi network from a small company, Atomic Data;
— A look at the new Wi-Fi network at Chesapeake Energy Arena, home of the NBA’s Oklahoma City Thunder.

Download your free copy today!

We’d like to take a moment to thank our sponsors, which for this issue include Mobilitie, JMA Wireless, Corning, Boingo, MatSing, Cox Business/Hospitality Network, ExteNet, Neutral Connect Networks, Atomic Data, Oberon, and American Tower. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to welcome readers from the Inside Towers community, who may have found their way here via our ongoing partnership with that excellent publication. And we’d like to thank the SEAT community for your continued interest and support.

Cisco brings fan-facing Wi-Fi to Pebble Beach for U.S. Open

This year’s U.S. Open featured a fan-facing Wi-Fi network at Pebble Beach. Credit: Keith Newman, MSR

Fans at the recent U.S. Open golf championship at Pebble Beach were treated to an on-course Wi-Fi network from Cisco, as part of a sponsorship partnership between Cisco and the U.S. Golf Association.

As the official technology partner for the USGA and its championships, Cisco said it set out with the goal to make this year’s 119th U.S. Open the “most connected” in the event’s history, mainly through the deployment of about 400 Meraki Wi-Fi APs throughout the famed seaside course.

According to the USGA, the network saw 25 total terabytes of data used during the championship, but the USGA did not break out daily totals. The USGA also said it saw more than 100,000 connections to the network, but did not specify whether that number represents unique connections or includes multiple connections from the same devices. Our special correspondent Keith Newman spent tournament Saturday at the course and found the network provided good connectivity in many places around the grounds. In addition to putting APs in obvious spots like the edges of seating areas and the tops of hospitality and other temporary structures, Cisco also had some mobile AP placements on towers in strategic locations.

According to Cisco, it brought in gear to create a 10 Gbps backbone for the Wi-Fi network, which also carried tournament back-of-house operations. Static signage at the event directed fans to the Wi-Fi network, and since Cisco also sponsored this year’s U.S. Open mobile app, users of the app were also alerted to the free Wi-Fi on the property.

Cisco Vision on the driving range

On the display side of things, Cisco also utilized its Cisco Vision IPTV display management system to help bring more interesting information to fans at the venue. Especially interesting was the incorporation of the Toptracer shot-tracking graphics to show live player performances on the driving range, with the ability to map multiple players and provide a range of stats on shot distance and speed.

The tournament, especially Sunday’s thrilling victory by Gary Woodland over the close-finishing Brooks Koepka, no doubt presented many networking challenges, especially when fans randomly thronged to tee-box areas to try to get a photo or a video of players teeing off.

“Our digital integration with Cisco provided us the opportunity to elevate the fan experience and provide more connectivity than any previous U.S. Open,” said Navin Singh, chief commercial officer of the USGA, in a prepared statement. “We also learned a lot and recognize that mobile consumption demands are only going to continue to grow. We are excited to get to work on providing an even better experience in 2020 at the 120th U.S. Open at Winged Foot.”

More photos from Pebble Beach below.

An on-course mobile AP placement. Credit: Cisco


Digital device use soared at the U.S. Open whenever Tiger Woods was around. Credit: Paul Kapustka, MSR (Screen shot of Fox TV broadcast)

A leaderboard provided space for an AP placement. Credit: Cisco

Toptracer shot-tracking graphics at the driving range, powered by Cisco Vision. Credit: Cisco

Fans clustered around tee boxes, putting extra stress on the network. Credit: Keith Newman, MSR

Broncos Stadium at Mile High sees 12.63 TB of Wi-Fi during Garth Brooks show

The Garth Brooks show in Denver June 8 saw fans use a venue-best 12.63 terabytes of data on the Wi-Fi network at Broncos Stadium at Mile High, according to figures provided by the team’s IT department.

In what sounds like a great time for both attendees and the performer (see Tweet below) there were 48,442 unique devices connected at some point during the night, according to figures sent our way by Russ Trainor, senior vice president of IT for the Broncos. That’s a take rate of about 58 percent, with some 84,000 fans in the stadium that night. Trainor also said that there was a peak concurrent connection total of 34,952 devices, and that the network saw a throughput peak of 17.65 Gbps, also a record for the venue.
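As a back-of-the-envelope check on those figures, here is a quick sketch (using decimal units, i.e. 1 TB = 1,000,000 MB; the variable names are ours):

```python
# Rough averages for the Garth Brooks show at Broncos Stadium at Mile High.
total_tb = 12.63          # total Wi-Fi traffic for the night, in terabytes
unique_devices = 48_442   # unique devices seen on the network
attendance = 84_000       # approximate fans in the stadium

take_rate = unique_devices / attendance * 100          # share of fans on Wi-Fi
mb_per_device = total_tb * 1_000_000 / unique_devices  # average load per device

print(f"take rate: {take_rate:.0f}%, avg per device: {mb_per_device:.0f} MB")
# prints: take rate: 58%, avg per device: 261 MB
```

The roughly 261 MB per connected device is a useful planning number: it suggests concert crowds now push per-device consumption into the range once reserved for marquee football games.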

The previous top Wi-Fi event at the Broncos’ home was a Taylor Swift concert last year, when the Wi-Fi network saw just over 8 TB of traffic. It appears the continued improvements to the venue’s Wi-Fi network are paying off in its ability to handle more traffic.

THE MSR TOP 21 FOR WI-FI

1. Super Bowl 53, Mercedes-Benz Stadium, Atlanta, Ga., Feb. 3, 2019: Wi-Fi: 24.05 TB
2. NCAA Men’s 2019 Final Four semifinals, U.S. Bank Stadium, Minneapolis, Minn., April 6, 2019: Wi-Fi: 17.8 TB
3. Super Bowl 52, U.S. Bank Stadium, Minneapolis, Minn., Feb. 4, 2018: Wi-Fi: 16.31 TB
4. NCAA Men’s 2019 Final Four championship, U.S. Bank Stadium, Minneapolis, Minn., April 8, 2019: Wi-Fi: 13.4 TB
5. Garth Brooks Tour, Broncos Stadium at Mile High, June 8, 2019: Wi-Fi: 12.63 TB
6. 2018 College Football Playoff Championship, Alabama vs. Georgia, Mercedes-Benz Stadium, Atlanta, Ga., Jan. 8, 2018: Wi-Fi: 12.0 TB*
7. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
8. Atlanta Falcons vs. Philadelphia Eagles, Lincoln Financial Field, Philadelphia, Pa., Sept. 6, 2018: Wi-Fi: 10.86 TB
9. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
10. Taylor Swift Reputation Tour, Gillette Stadium, Foxborough, Mass., July 27, 2018: Wi-Fi: 9.76 TB
11. Minnesota Vikings vs. Philadelphia Eagles, NFC Championship Game, Lincoln Financial Field, Philadelphia, Pa., Jan. 21, 2018: Wi-Fi: 8.76 TB
12. Jacksonville Jaguars vs. New England Patriots, AFC Championship Game, Gillette Stadium, Foxborough, Mass., Jan. 21, 2018: Wi-Fi: 8.53 TB
13. Taylor Swift Reputation Tour, Broncos Stadium at Mile High, May 25, 2018: Wi-Fi: 8.1 TB
14. Kansas City Chiefs vs. New England Patriots, Gillette Stadium, Foxborough, Mass., Sept. 7, 2017: Wi-Fi: 8.08 TB
15. SEC Championship Game, Alabama vs. Georgia, Mercedes-Benz Stadium, Atlanta, Ga., Dec. 1, 2018: Wi-Fi: 8.06 TB*
16. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
17. Stanford vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Sept. 29, 2018: Wi-Fi: 7.19 TB
18. (tie) Southern California vs. Notre Dame, Notre Dame Stadium, South Bend, Ind., Oct. 21, 2017: Wi-Fi: 7.0 TB
Arkansas State vs. Nebraska, Memorial Stadium, Lincoln, Neb., Sept. 2, 2017: Wi-Fi: 7.0 TB
19. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
20. Wisconsin vs. Nebraska, Memorial Stadium, Lincoln, Neb., Oct. 7, 2017: Wi-Fi: 6.3 TB
21. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB

* = pending official exact data

Federated Wireless completes ESC network for CBRS

One of the coastal sensors deployed in Federated Wireless’ ESC network. Credit: Federated Wireless

Federated Wireless announced Monday the completion of its environmental sensing capability (ESC) network, in what may be one of the final stepping stones toward commercial deployments of networks in the CBRS band.

Under the unique shared-spectrum licensing structure of the CBRS (Citizens Broadband Radio Service) band, a swath of 150 MHz in the 3.5 GHz range, an ESC network must be in place to sense when U.S. Navy ships are using the band. What Federated is announcing Monday is that its ESC network is ready to go, one of the final pieces needed before commercial customers of Federated’s products and services can formally start operating their networks.

Though the Federated ESC network is still pending final FCC approval, Federated president and CEO Iyad Tarazi said in a phone interview that the company “expects to get the green light [from the FCC] in June,” with the commercial customer launches following soon behind. Federated, a pure-CBRS startup with $75 million in funding, also offers Spectrum Access Services (SAS), another part of the CBRS puzzle to help ensure that any network operators who want to play in the shared-space sandbox that is CBRS are only using spectrum chunks that are free of any higher-priority traffic.

According to Tarazi, Federated already has 25 customers testing its gear and services as they get ready to launch CBRS networks, a yet-unnamed group of entities that Tarazi said includes wireless carriers, enterprise companies looking to launch private networks, and even some large public venues.

Private networks first for venues?

The early thinking on CBRS use cases for sports stadiums includes the possibility of using private LTE networks for sensitive internal operations like ticketing and concessions, or even for closed-system video streaming and push-to-talk voice support. Longer term, CBRS has been touted as a potential way to provide a neutral-host network that could support fan-facing carrier offload much like a current distributed antenna system (DAS) does, but getting there will likely require more advanced SIM technology to be developed and deployed in client devices like cellphones.

But the potential of a new, huge chunk of spectrum — and the possibility of teams, leagues and venues being able to own and operate their own networks — has created a wide range of interest in CBRS among sports operations. While many of those same entities already operate stadium Wi-Fi networks, CBRS’s support for the cellular LTE standard theoretically could support faster, more secure networks. However, the emerging Wi-Fi 6 standard may close the performance gaps between cellular and Wi-Fi in the near future; many networking observers now seem to agree that most venues will likely see a continued mix of Wi-Fi and cellular systems in the near future, possibly including CBRS.

Already, the PGA and NASCAR have live tests of CBRS networks underway, and the NFL and Verizon have kicked the ball around with CBRS tests, reportedly for possible sideline headset network use.

While CBRS will get more interesting when the commercial deployments become public, network geeks can already appreciate some of the work Federated did to get its ESC network operational, starting with deploying sensors on coastal structures as varied as “biker bars and luxe beach resorts,” according to a Federated blog post.

Tarazi, who was most recently vice president of network development at Sprint, said the Federated ESC network is “triple redundant,” since losing just one sensor could render a big chunk of spectrum unusable.

“If you lose a sensor, you lose hundreds of square miles of [available] network,” Tarazi said. “That’s a big deal.”

And ensuring network availability is part of what Federated’s clients will be paying the company for, a piece of the puzzle that, when assembled, should open up wireless spectrum at a much lower cost than purchasing licensed spectrum at auction. As one of the pick-and-shovel providers in the CBRS gold rush, Tarazi and Federated may be the only ESC game in town for a while, since the joint effort by CommScope and Google to build another ESC is not expected to be completed until later this year at the earliest.

“I feel like we’re at an inflection point now,” Tarazi said. “It feels good to be leading this wave.”