Top-down approach brings Wi-Fi to OKC Thunder’s Chesapeake Energy Arena

Chesapeake Energy Arena, home of the NBA’s Thunder. Credit all photos: Oklahoma City Thunder

If there’s one sure thing about stadium Wi-Fi deployments, it’s that no two networks are ever exactly the same. So even as there is a growing large-venue trend toward putting Wi-Fi access points under seats or in handrails, sometimes the traditional top-down method is still the one that works best.

Such was the case for the first full fan-facing Wi-Fi network at Chesapeake Energy Arena in Oklahoma City, home of the NBA’s Thunder. With a large amount of retractable seating in the 18,000-seat venue, an under-seat approach to Wi-Fi would prove too costly and disruptive, leading the team to look for connectivity from above.

While a solid in-building cellular distributed antenna system (DAS) had done a good job of keeping fans connected over the last few years, the team’s desire for more mobile insight into fan activity, as well as a switch to a Wi-Fi-centric point-of-sale system, led Oklahoma City to finally install fan-facing Wi-Fi throughout the venue.

Chris Nelson, manager of information technology for venue manager SMG, and Tyler Lane, director of technology for the Thunder, spoke with Mobile Sports Report about the recent Wi-Fi deployment at Chesapeake Energy Arena, which went live during the most recent NBA season.

An AP placement in the rafters

Though the venue looked at all options, Nelson said that going under-seat with APs would have been “very costly” to do, given the large number of retractable seats in the arena.

“We wanted to hang them [APs] from the top if we could,” Nelson said.

After testing the top equipment brands available, the Thunder settled on Ruckus gear for what they said was a simple reason, one involving the 96 feet of air space from the catwalk to the arena floor.

“Ruckus was the only one whose gear could reach down all the way,” Nelson said.

Adding to the fan experience

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new Wi-Fi network at Allianz Field in St. Paul, Minn., and an in-depth research report on the new Wi-Fi 6 standard! DOWNLOAD YOUR FREE COPY now!

According to the team, the deployment used 410 total APs, with 350 in the arena proper and another 60 deployed across the street at the Cox Convention Center. The Thunder’s Lane said the team rolled out the service slowly at first, with some targeted testing and feedback from season ticket holders.

Close-up of an AP placement

“We got some good feedback, and then when we went to a full rollout we had signage in the concourses, communications via ticketing services and announcements over the PA and on the scoreboard” to tell fans about the system, Lane said.

According to statistics provided by the team, the Wi-Fi was getting good traction as the season went on, with a March 16 game vs. the Golden State Warriors seeing 589.3 gigabytes of traffic, from 2,738 clients that connected to the network. Lane said the team employed Jeremy Roach and his Rectitude 369 firm to assist with the network design; Roach in the past helped design networks at Levi’s Stadium and Sacramento’s Golden 1 Center.
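For context, a quick back-of-the-envelope calculation (our arithmetic, not the team’s) shows what those totals work out to per fan:

```python
# Back-of-the-envelope check of the reported March 16 figures:
# 589.3 GB of Wi-Fi traffic spread across 2,738 connected clients.
total_gb = 589.3
clients = 2738

avg_mb_per_client = total_gb * 1024 / clients  # GB -> MB, then per client
print(f"~{avg_mb_per_client:.0f} MB per connected client")  # → ~220 MB per connected client
```

In an 18,000-seat venue, those 2,738 clients also work out to roughly a 15 percent take rate.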

Now that the Wi-Fi network is in place, Lane said the Thunder is starting to expand the ways it can add to the fan experience via digital means, including app-based features like live streaming of press conferences and an artificial intelligence chatbot that helps provide fans with arena information.

“It’s really all about enhancing the fan experience,” Lane said, with an emphasis on driving digital ticketing use in the YinzCam-developed team app. Lane said that the system also drives a lot of mobile concessions traffic, and added that “Ruckus did a fantastic job of asking all the right questions for our food and beverage partners.”

Introducing: The VENUE DISPLAY REPORT!

Mobile Sports Report is pleased to announce our latest editorial endeavor, the VENUE DISPLAY REPORT!

A new vertical-specific offering of MSR’s existing STADIUM TECH REPORT series, the VENUE DISPLAY REPORT series will focus on telling the stories of successful venue display technology deployments and the business opportunities these deployments enable. No registration or email address required — just click on the image below and start reading!

Like its sibling Stadium Tech Report series, the Venue Display Report series will offer valuable information about cutting-edge deployments that venue owners and operators can use to inform their own plans for advanced digital-display strategies.

Our reporting and analysis will be similar to that found in our popular STR series, with stadium and venue visits to see the display technology in action, and interviews and analysis with thought leaders to help readers better inform their upcoming technology purchasing decisions. And in case you are new to the MSR world, rest assured that all our VDR reports will be editorially objective, done in the old-school way of real reporting. We do not accept paid content and do not pick profiles based on any sponsorship or advertising arrangements.

Our inaugural issue contains profiles of a new concourse display strategy at the San Jose Sharks’ SAP Center, powered by new LED screens from Daktronics and the Cisco Vision IPTV digital display management system; a look at the Utah Jazz’s decision to use Samsung’s system-on-a-chip displays at Vivint Smart Home Arena; and the San Francisco 49ers’ decision to use Cisco Vision to control displays at Levi’s Stadium.

Start reading the first issue now! No download or registration necessary.

As venues seek to improve fan engagement and increase sponsor activation, display technology offers powerful new ways to improve the in-stadium fan experience. While these topics are of prime interest to many in our long-term audience of stadium tech professionals, we suggest you share the link with colleagues on the marketing and advertising sales side of the house, as they will likely find great interest in the ROI enabled by strategic display system deployments.

Sponsorship spots are currently available for future VDR series reports; please contact Paul at kaps at mobilesportsreport.com for media kit information.

Levi’s Stadium sees 5.1 TB of Wi-Fi data used at college football championship

Fans and media members at Monday night’s College Football Playoff championship game used a total of 5.1 terabytes of data on the Wi-Fi network at Levi’s Stadium, according to figures provided by the San Francisco 49ers, who own and run the venue.

With 74,814 in attendance for Clemson’s 44-16 victory over Alabama, 17,440 of those in the stands found their way onto the stadium’s Wi-Fi network. According to the Niners the peak concurrent connection number of 11,674 users was seen at 7:05 p.m. local time, which was probably right around the halftime break. The peak bandwidth rate of 3.81 Gbps, the Niners said, was seen at 5:15 p.m. local time, just after kickoff.

In a nice granular breakout, the Niners said about 4.24 TB of the Wi-Fi data was used by fans, while a bit more than 675 GB was used by the more than 925 media members in attendance. The Wi-Fi data totals were recorded during an 8-1/2 hour period on Monday, from 1 p.m. to 9:30 p.m. local time.

Added to the 3.7 TB of DAS traffic AT&T reported inside Levi’s Stadium Monday night, we’re up to 8.8 TB total wireless traffic so far, with reports from Verizon, Sprint and T-Mobile still not in. The top Wi-Fi number at Levi’s Stadium, for now, remains Super Bowl 50, which saw 10.1 TB of Wi-Fi traffic.
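The running total above is simple addition; as a sketch, using the figures reported in the article (in terabytes):

```python
# Running wireless-traffic total from the article, in terabytes.
wifi_total = 5.1   # Levi's Stadium Wi-Fi, per the 49ers
att_das = 3.7      # AT&T in-building DAS traffic

running_total = wifi_total + att_das
# Verizon, Sprint and T-Mobile figures were still pending at press time.
print(f"{running_total:.1f} TB so far")  # → 8.8 TB so far
```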

AT&T: Lots of DAS traffic for college football championship

DAS on a cart: DAS Group Professionals deployed mobile DAS stations to help cover the parking lots at Levi’s Stadium for the college football playoff championship. Credit: DGP

This may not be a news flash to any stadium network operations team but the amount of mobile data consumed by fans at college football games continues to hit high levels, according to some new figures released by AT&T.

In a press-release-style blog post where AT&T said it saw 9 terabytes of cellular data used over the college football playoff championship-game weekend in the Bay Area, the carrier also crowned a cellular “data champion,” reporting that Texas A&M saw 36.6 TB of data used on the AT&T networks in and around Kyle Field in College Station, Texas.

(Actually, AT&T pointedly does NOT declare Texas A&M the champs — most likely because of some contractual issue, AT&T does not identify actual stadiums or teams in its data reports. Instead, it reports the cities where the data use occurred, but we can figure out the rest for our readers.)

For the College Football Playoff championship, AT&T was able to break down some specific numbers for us, reporting 3.7 TB of that overall total was used inside Levi’s Stadium on game day. Cell traffic from the parking lots and tailgating areas (see photo of DAS cart to left) added another 2.97 TB of traffic on AT&T’s networks, resulting in a game-day area total of 6.67 TB. That total is in Super Bowl range of traffic, so we are excited to see what the Wi-Fi traffic total is from the game (waiting now for the college playoff folks to get the statistics finalized, so stay tuned).

DAS antennas visible at Levi’s Stadium during a Niners game this past season. Credit: Paul Kapustka, MSR

A footnote explains the additional 2-plus TB of traffic a bit further: “Data includes the in-venue DAS, COWs, and surrounding macro network for AT&T customers throughout the weekend.”

Any other carriers who want to add their stats to the total, you know where to find us.

Back to Texas A&M for a moment — in its blog post AT&T also noted that the stadium in College Station (which we will identify as Kyle Field) had the most single-game mobile usage in the U.S. this football season, with nearly 7 TB used on Nov. 24. Aggie fans will remember that as the wild seven-overtime 74-72 win over LSU, an incredible game that not surprisingly resulted in lots of stadium cellular traffic.

Niners, SAP announce stadium-operations management application

A sample screen shot from the new Executive Huddle stadium operations management platform, developed by SAP for the San Francisco 49ers. Credit: San Francisco 49ers (click on any photo for a larger image)

A desire by the San Francisco 49ers to see stadium operations information in real time has become a real product, with today’s announcement of Executive Huddle, a stadium operations management application developed for the Niners by SAP.

In use at the Niners’ Levi’s Stadium since the start of the current football season, Executive Huddle brings transaction information from nine different stadium operations systems, including parking, concessions, retail sales, weather and fan opinions, into a visual output that allows team executives to make real-time decisions on how to fix problems or otherwise enhance the game-day experience.

Demonstrated at Sunday’s home game against the Los Angeles Rams, the software not only reports raw data like concession sales or parking lot entries, but also provides a layer of instant feedback to let team executives make immediate changes to operations if necessary. The cloud-based application, developed by SAP and Nimbl, is currently only in use at one upper-level suite at Levi’s Stadium, where the output runs during Niners’ game days on several video screens. SAP, however, plans to make the system available to other teams in the future, according to SAP executives at Sunday’s demonstration.

Fixing issues in real time

Al Guido, president of the 49ers, said Executive Huddle was the end product of his desire to be able to fix any game-day experience issues on the day of the game, instead of in the days or weeks after. According to Guido, the Niners have been passionate about collecting fan-experience data since Levi’s Stadium opened in 2014. But in the past, the compilation of game-day data usually wasn’t complete until a day or two after each event, meaning any issues exposed were lessons that had to wait until the next game to be fixed.

Executives huddle: from left, SAP’s Mark Lehew, Niners’ Moon Javaid, SAP’s Mike Flannagan and Niners president Al Guido talk about the Executive Huddle system at a Sunday press event at Levi’s Stadium. Credit: Paul Kapustka, MSR

Things like slower sales at concession stands, or issues with parking-lot directions, Guido said, wouldn’t be known as they were happening, something he wanted to change.

“I really wanted to be able to act on it [the operations data] in real time, instead of waiting until the Wednesday after a Sunday game,” Guido said.

Now, with Executive Huddle, the Niners’ operations team can sit in a single room and watch as operations events take place, and can make in-game moves to fix things, like calling on the radio to a parking lot to tell gate operators of traffic issues.

“It’s like having an air traffic control system,” said Mark Lehew, global vice president for sports and entertainment industry solutions at SAP. Lehew said SAP worked with the Niners’ list of operations vendors, including Ticketmaster, ParkHub, caterer Levy and point-of-sale technology provider Micros to provide back-end application links so that Executive Huddle could draw information from each separate system into the uber-operations view that Executive Huddle provides. According to SAP, Executive Huddle is based on SAP’s Leonardo and Analytics platform.

The manager of managers

Though the system doesn’t currently monitor some other key stadium operations information, like performance of the Levi’s Stadium Wi-Fi network, Michael Pytel, chief innovation officer for Nimbl, said the system could conceivably add “any information we can get from an API.”
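The pattern Pytel describes — pulling each vendor system’s numbers through an API into one merged view with instant red/green markers — can be sketched roughly as follows. This is a minimal illustration with hypothetical metric names and figures, not SAP’s actual implementation:

```python
# Minimal sketch of a "huddle" dashboard pattern: poll several vendor
# systems and merge the results into one view, flagging each metric
# against a benchmark. All names and numbers here are hypothetical.

def fetch_metrics():
    # In a real deployment these would be live API calls to each vendor
    # system (ticketing, parking, concessions point-of-sale, etc.).
    return {
        "parking_entries": 9500,
        "concession_sales_usd": 182000,
        "tickets_scanned": 61000,
    }

# Benchmarks might come from averages of comparable past events.
BENCHMARKS = {
    "parking_entries": 10000,
    "concession_sales_usd": 175000,
    "tickets_scanned": 60000,
}

def huddle_view(current, benchmarks):
    """Return each metric paired with a red/green marker vs. its benchmark."""
    return {
        name: (value, "green" if value >= benchmarks[name] else "red")
        for name, value in current.items()
    }

for name, (value, flag) in huddle_view(fetch_metrics(), BENCHMARKS).items():
    print(f"{name}: {value} [{flag}]")
```

The merge-and-flag step is deliberately simple; the hard part in practice is the per-vendor API integration the article describes.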

The Levi’s Stadium suite where the Niners monitor Executive Huddle information. Credit: San Francisco 49ers

Moon Javaid, the Niners’ vice president of strategy and analytics, said the continued robust performance of the stadium’s wireless networks makes them a lower-priority need for the kind of oversight Executive Huddle provides.

Javaid, the quarterback of the program’s development from the Niners’ side of the equation, noted that part of its power comes not just from surfacing the data, but also from providing instant intuitive markers — like red for declining metrics and green for positive — and the ability to compare current data to figures from other events, so the data could not just be seen but also understood within seconds.

And while SAP plans to make Executive Huddle available to other teams, it’s clear that the program — as well as education and training for the decision-making staff who will use it — will need different care and feeding for each stadium that might want to use it. But SAP’s Lehew noted that being able to provide real-time data in an exposed fashion was becoming table stakes for operations providers, who would have to move past old ways of doing things if they wanted to be a part of the next generation of stadium service providers.

Massive MIMO is Sprint’s path to 5G, says CTO Saw

Dr. John Saw, CTO of Sprint, at an IEEE keynote speech. Credit all photos: Sprint

Sprint chief technical officer John Saw has seen the future of cellular wireless, and according to him it was at a sports event.

“I was at the [Winter] Olympics where KT [Korea Telecom] and Intel set up the first 5G network,” said Saw in a recent phone interview. “Stadiums will be a good showplace for the capabilities of 5G. It’s pretty impressive what you can do with 5G that you can’t do today.”

Saw, who was CTO at WiMAX play Clearwire before that company became part of Sprint, will be the first to admit that the network built for the PyeongChang Olympics wasn’t “true” 5G, but said it was a good precursor. He also added that it wasn’t a cost-conscious deployment, something MSR had heard from other sources who said Intel and KT didn’t hold back when it came to spending.

“They spent a lot of money [on the network],” Saw said.

But some of the services the Olympic network was able to support included local viewing of replays using Intel’s True View technology, which gives fans the ability to watch a play or action from a 360-degree angle. While Intel has had limited deployments of the technology at some U.S. sporting events, for the Olympics Saw said they used hundreds of cameras linked over millimeter wave frequencies, which can offer very low latency.

“They needed [to have the images] in real time,” Saw said, and built the millimeter wave network to do just that. While the network “wasn’t fully compliant to the subsequent 5G standards, a lot of what they built is the forerunner to 5G,” Saw said. “It was a pretty cool showcase, and will certainly find a home in stadiums.”

No Millimeter Wave spectrum for Sprint

Editor’s note: This profile is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of a new MatSing ball DAS deployment at Amalie Arena, a new DAS for the Chicago Cubs at Wrigley Field, and a look at the networks inside the new Banc of California Stadium in Los Angeles! DOWNLOAD YOUR FREE COPY now!

Millimeter wave networks, however, won’t be part of Sprint’s early push toward 5G, said Saw. Instead, he said Sprint will concentrate on deploying “Massive MIMO” networks in its rich space of spectrum at the 2.5 GHz frequency, where Sprint controls upwards of 150 MHz of spectrum in most major U.S. metro markets.

Without trying too hard here to explain exactly how Massive MIMO works — think splitting up transmissions between multiple antennas, then using lots of compute power to bring the data back together — the key here is Sprint’s spectrum holdings, which Saw said are still only about half used.

“When we launched LTE [on the 2.5 GHz spectrum] we used less than half the spectrum we had,” Saw said. “With 5G, we will use all the spectrum we have in market. We’ll be one of very few carriers who launch 5G in the same [spectrum] footprint [as LTE].”

With the ability to carry “four to 10 times the capacity of regular LTE,” Saw sees Massive MIMO 5G as something perfect for large public venues like stadiums and shopping malls.
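That capacity multiple lines up with textbook MIMO scaling: with enough antennas and rich scattering, capacity grows roughly linearly with the number of parallel spatial streams. A rough sketch under an idealized Shannon model — our illustration, with hypothetical stream counts and SNR, not Sprint’s engineering numbers:

```python
import math

def mimo_capacity_mbps(bandwidth_mhz, snr_db, streams):
    """Idealized MIMO link capacity: each spatial stream contributes
    roughly B * log2(1 + SNR) of Shannon capacity on its own."""
    snr_linear = 10 ** (snr_db / 10)
    per_stream = bandwidth_mhz * math.log2(1 + snr_linear)  # Mbps per stream
    return streams * per_stream

# Hypothetical comparison: a 2-stream LTE baseline vs. a 16-stream
# Massive MIMO configuration in the same 20 MHz channel at 20 dB SNR.
base = mimo_capacity_mbps(bandwidth_mhz=20, snr_db=20, streams=2)
massive = mimo_capacity_mbps(bandwidth_mhz=20, snr_db=20, streams=16)
print(f"{massive / base:.0f}x the 2-stream baseline")  # → 8x the 2-stream baseline
```

An 8x gain from eight times the streams sits comfortably inside Saw’s “four to 10 times” range; real-world gains depend on channel conditions and on how many streams user devices can actually resolve.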

Dr. John Saw

“When you have sports events with 50,000 people in the stadium, you need this kind of capacity,” Saw said. “We’re building out the footprint for [5G] this year, and we’ll launch next year.”

Saw said that part of the infrastructure support for 5G networks will be different as well.

“It’s more than just speed, or more capacity. It’s more than tonnage,” Saw said. “We’ll have a different way of deploying the new network, with a more distributed core, one [with more resources] out to the edge of the network.”

Why is such equipment redistribution necessary? According to Saw, a network with more components at the edge can help with content delivery for the new bandwidth-hungry apps like virtual-reality replays.

“Say you want VR at a hockey game, where you want to give real time [replay] viewing to customers, with different camera angles,” Saw said. “You literally have to have the 5G core inside the stadiums, so it can process [the content] without having to go back to the cloud.”

Will DAS trail in the path to 5G?

One type of network Saw doesn’t see leading the way to 5G is the traditional DAS, or distributed antenna system.

“DAS is going to have to migrate to 5G,” Saw said. “It’s not going to lead the pack.”

In fact, Saw said Sprint has been somewhat of a reluctant DAS participant at times, including at the most recent Super Bowls. In the last two of the NFL’s “big game” events, Super Bowl 51 in Houston and Super Bowl 52 in Minneapolis, Saw said Sprint used small cell deployments instead of the neutral DAS systems to augment its coverage.

“We had hundreds of small cells, inside and outside [the venues],” Saw said. “We got the same performance, maybe better, for a lot less money.”

Part of the issue for Sprint and DAS, Saw said, is that the carrier usually has to pay more for its unique spectrum bands, especially the 2.5 GHz frequencies which are not used by any of the other major wireless carriers.

“We always think through before we sign up for DAS fees… there’s more than one way to skin a cat,” Saw said. While in many cases there is no alternative except to participate in a neutral-host configuration, Saw said “we do prefer small cells.”

Will CBRS help?

One of the more hyped platforms being pushed this year is the use of the CBRS spectrum in the 3.5 GHz range, not just for more carrier networks but even for “private” LTE networks for venues or campuses.

“It’s an interesting concept because it opens things up to more than just four operators,” Saw said. But he also called out the need for an online database to make sure CBRS spectrum use doesn’t interfere with systems run by the U.S. Navy, and added that without any definitive FCC action yet, the rules for future CBRS use are still unclear.

“There’s quite a lot of work to be done, and not a lot of spectrum there,” said Saw. While claiming that Sprint is “watching CBRS with interest,” he added that with its 2.5 GHz holdings, Sprint most likely won’t be at the front of any CBRS deployments.

“At the end of the day, CBRS is not 5G,” Saw said.

How will a merger with T-Mobile help?

Since our conversation took place just a day after Sprint and T-Mobile announced their renewed plans to merge, Saw didn’t have a lot of details to share, beyond his opinion that the two companies’ different spectrum holdings would build a more powerful competitor when put together.

“When you put our 2.5 (GHz) with their 600 MHz it gives you a much larger footprint with higher capacity,” Saw said. “There’s tremendous synergy. Both [companies] are enthusiastic about this deal.”

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.