Little Caesars Arena revs the engine on wireless

Little Caesars Arena in Detroit is revving its engine with wireless deployments of Wi-Fi and DAS. Credit all photos: Terry Sweeney, MSR

Detroit has made an ambitious bet on the sports entertainment model with its 50-block District Detroit development – which embraces Ford Field (where the NFL’s Lions play), Comerica Park (MLB’s Tigers) and most recently, Little Caesars Arena (NBA’s Pistons and NHL’s Red Wings).

In fact, Motor City might just as easily be renamed Stadium City as Detroit looks to professional sports as one cornerstone of economic re-development.

The city has all four major pro sports teams competing within a few blocks of each other, noted John King, vice president of IT and innovation for Olympia Entertainment and the Detroit Red Wings. District Detroit plays host to more than 200 events, welcoming some 3 million visitors annually – not bad for an area that’s barely 18 months old.

Detroit’s hardly alone in riding this development wave. Sports entertainment districts are a proven engine to boost local economies and are popping up all over the country:
– Los Angeles’s LA Live complex uses the Staples Center as its hub but includes restaurants, hotels and plenty of retail;
– Houston’s Avenida district groups together Minute Maid Park, BBVA Compass Stadium and NRG Stadium, along with a convention center and hotels;
– Battery Atlanta houses the Atlanta Braves’ SunTrust Park and a Coca-Cola entertainment facility, along with retail, residences and hotels;
– Westgate Entertainment District in the greater Phoenix area houses State Farm Stadium (NFL’s Cardinals) and Gila River Arena (NHL’s Coyotes), plus the obligatory retail, restaurants and hotels.

San Francisco, Kansas City, Cincinnati, Sacramento and other cities are building out similar sports entertainment developments in their downtown areas, encouraging sports fans to make a night of it, or even a weekend. Even venerable venues like Green Bay’s Lambeau Field and Chicago’s Wrigley Field are getting in on the act, building up areas outside the parks to keep fans engaged (and spending) before and after events, or even when there are no games being played.

Robust DAS, Wi-Fi in LCA

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new Wi-Fi and DAS networks being planned for the University of Colorado, as well as a profile of Wi-Fi at Vivint Smart Home Arena in Salt Lake City! DOWNLOAD YOUR FREE COPY now!

John King oversees the IT operations at Little Caesars Arena

King is pleased with the performance of the IT infrastructure at Little Caesars Arena since the $863 million venue opened in the fall of 2017. With a backbone of two 100-Gbps fiber connections, the arena counts more than 700 Cisco Wi-Fi access points. There are 364 APs in the bowl itself; the bulk of those – 300 APs – have been installed under seats to get the signals closer to where the users are.

Mobile Sports Report put LCA’s Wi-Fi network and DAS system to the test this season during a Red Wings home game against the New York Rangers. Because of the limits of our own test devices, we were only able to test Verizon’s portion of the DAS deployment; the Wi-Fi network tested was the District Detroit Xfinity SSID.

The good news is that both network types performed admirably. No surprise that bandwidth was most plentiful and speeds were fastest on concourses near concessions, as well as in the private clubs scattered around LCA. Fastest measured speeds: 139.68 Mbps download / 33.24 Mbps upload on the DAS network outside the MotorCity Casino Club. The Wi-Fi was also well engineered there – 51.89 Mbps download and 72.34 Mbps upload were plenty fast for hockey’s power users.

We measured comparable speeds by the Rehmann Club with 134.4 Mbps/36.25 Mbps on the DAS, and 21.56 Mbps/120.8 Mbps on Wi-Fi. Similarly, connectivity was not an issue while standing in front of the impossible-to-miss Gordie Howe statue in LCA’s main concourse, where we clocked DAS at 102.95 Mbps/22 Mbps, and Wi-Fi at 43.34 Mbps/43.72 Mbps.

Speeds around the arena were generally in double-digit megabits, both for Wi-Fi and DAS. The Wi-Fi signal got a little sluggish in Section M7 (0.79 Mbps/3.03 Mbps) and Section M33 (1.68 Mbps/29 Mbps). Lowest measured throughput on the DAS network was in Suite 17 with 16.18 Mbps/17.41 Mbps, still plenty fast to handle most fan requirements.

Lighting Things Up in District Detroit

In tandem with LCA, approximately 1,000 more APs attached to the network either handle District Detroit’s public Wi-Fi or serve its 34 parking lots and garages.

Wireless gear painted to blend in

“Our goal is to bring life and excitement throughout the District and not just focus on Little Caesars Arena,” King said. Video and digital signage are essential to that effort, both inside and outside LCA. The network enables more than 1,500 IPTV connections distributed across the arena, but also externally to LED boards and electronic parking signs. “We want to take the excitement from the event and run it out to the city – ‘5 minutes to puck drop’ on all those signs, as one example,” King explained. “We can leverage [signage] for more than just the price of parking.”

The network uses the Cisco Vision IPTV digital display management system to control display programming, including advertising that appears on video screens in LCA’s many hospitality suites. With five TV screens per suite, LCA deploys an L-shaped “wrapper” around the main video image used for advertising. “We rotate that content in the suites and run loops in concourse before and after events,” King said. “It allows us to put scripting in different zones or post menus and dynamically update prices and items for sale.” LCA’s concessionaires can change the price or location of food and beverage items, all through the networked point-of-sale system.
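King’s description of zones, loops and dynamic price updates boils down to a content scheduler keyed by display groups. Below is a minimal Python sketch of that idea, assuming a simple zone-to-content mapping; it is illustrative only and is not the Cisco Vision API (every class and method name here is invented):

```python
# Hypothetical sketch of zone-based signage scheduling (not the Cisco
# Vision API). It shows how a price change can re-render one zone of
# screens while loops running in other zones are left untouched.
from dataclasses import dataclass

@dataclass
class MenuItem:
    name: str
    price: float

class SignageController:
    """Maps named display zones (suites, concourse, parking) to content."""
    def __init__(self):
        self.zones: dict[str, list[str]] = {}

    def post_menu(self, zone: str, items: list[MenuItem]) -> None:
        # Render the menu once; every screen in the zone shows the result.
        self.zones[zone] = [f"{i.name}  ${i.price:.2f}" for i in items]

controller = SignageController()
controller.post_menu("suite-level", [MenuItem("Craft beer", 12.00),
                                     MenuItem("Coney dog", 8.50)])
# A point-of-sale price change triggers a re-render of just that zone:
controller.post_menu("suite-level", [MenuItem("Craft beer", 11.00),
                                     MenuItem("Coney dog", 8.50)])
```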

Tunable Wi-Fi

The District Detroit app is divided into three “buckets,” according to King: Detroit Red Wings, Detroit Pistons and 313 Presents — all the events and entertainment outside of sporting events (313 is Detroit’s area code). When configured for hockey, LCA can accommodate up to 19,515 Red Wings fans; as a basketball arena for the Pistons, LCA holds 20,491. But some events may draw fewer people and King and his team adjust accordingly.

“We’re an arena for 20,000 fans and as we looked at that density, we found that 10,000 fans behave differently and we’ve had to tune the arena differently based on traffic flows,” he said. When the building is completely full, Wi-Fi signals must pass through thousands of “bags of water,” as RF engineers sometimes describe human spectators. With half as many fans, signals propagate differently, and a fan may end up connecting to a less-than-ideal AP, which can affect both user experience and system performance.

An under-seat Wi-Fi enclosure

“We’ve looked at some power tweaks and tuning; we also have the ability to tune [the arena] on the fly,” King said, but emphasized that the venue’s Wi-Fi doesn’t get re-tuned for every event. “We try to find the sweet spot and not do that too much. On an event day, we try not to touch anything that isn’t broken,” he said.
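As a back-of-the-envelope illustration of that density-based tuning, here is a simple attendance-driven power heuristic in Python. It is purely a sketch, not the arena’s actual RF management tooling; the thresholds and power levels are invented:

```python
# Toy heuristic for crowd-aware AP power tuning. A packed bowl absorbs
# more RF (the "bags of water" effect), so under-seat APs can run hotter;
# at lower attendance, smaller cells discourage far-away AP associations.
def target_tx_power_dbm(expected_attendance: int, capacity: int = 20_000) -> int:
    fill = expected_attendance / capacity
    if fill > 0.85:   # near-sellout: bodies attenuate the signal, raise power
        return 14
    if fill > 0.5:    # moderate crowd
        return 11
    return 8          # sparse crowd: keep cells small to force sane roaming

print(target_tx_power_dbm(19_515))  # hockey sellout -> 14 dBm
print(target_tx_power_dbm(10_000))  # half-full      -> 8 dBm
```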

Previews of coming attractions

Like any sports and entertainment IT exec, King is looking at ways to improve the fan experience and derive more performance and revenue from Olympia’s IT investment. Buoyed by the success of mobile ticketing at LCA, King said he’d like to find some way to use biometrics to help speed up transactions at counters and pedestals throughout the arena. And he’s excited about 5G cellular deployment, which he believes could compete with Wi-Fi if 5G delivers on all that’s been promised by carriers.

LCA’s app uses Bluetooth for navigation, letting fans input their seat information for directions. “Right now, we have pre-order pickup, but in-seat service is something we’re looking at. What other line-busting technologies can we do?” King said.

And while fans can pre-order food and beverages at LCA, King also wonders if pre-ordering of team merchandise (“merch”) is something that would appeal to fans and be easy to execute. “We’re looking at a Cincinnati venue where they have compartments for food, hot or cold, that’s been pre-ordered,” he said, wondering if a similar compartmentalized pickup system could be used for merch.

King sees plenty of room for improvement in overall management reporting across IT systems at LCA and the 12,000 active ports that keep systems humming.

“Everything is connected and our electricians can use their iPads to dim or turn on lights anywhere in the building,” he said, adding that everything’s monitored — every switch, every port. “It would be nice to see more information around traffic flow and performance patterns. We’re seeing a little bit of that. But I’d like to see network information on people tracking and doors, and correlate visual information with management data.”

Another set of metrics King can’t get at the moment: performance data from AT&T, T-Mobile and Verizon about LCA’s 8-zone DAS. King said he’s talking with Verizon, the lead DAS operator at the venue, about getting automated reports in the future, but for now he and his team don’t have much visibility there. The DAS uses the Corning ONE system.

Venue Display Report: Sharks bring ‘excitement’ to SAP Center concourses with new digital display technology from Daktronics and Cisco

A long LED board lights up the main concourse at the San Jose Sharks’ home, SAP Center. Credit all photos: Paul Kapustka, MSR

If you’re an ice hockey fan, you are no doubt somewhat addicted to the excitement of seeing games live and in person. Yet one historical drawback of attending games has been the dread of leaving your seat: the fear of missing unpredictable action makes waiting in lines excruciating.

While many teams in all kinds of sports have been busy installing television screens in concourses and concession areas to help keep fans connected to the live action, the NHL’s Sharks have taken concourse display technology to a new level at SAP Center in San Jose. With cutting-edge LED displays from Daktronics and the Cisco Vision IPTV display management system from Cisco, the Sharks have turned what used to be basically a dark concrete tunnel into a well-lit, display-laden walkway that brings live game action and engaging marketing messages to fans while they are outside the bowl, keeping the excitement level high no matter where in the building a fan might be.

The most visible part of the new display deployment, installed in phases over the last two seasons, is the set of concourse LED boards from Daktronics, displays that were custom designed for the stadium’s walkways. Robin Hall, a regional manager for the Brookings, S.D.-based Daktronics, said a total of 17 displays were added to the main concourse at SAP Center, all 3 1/2 feet tall but in many different widths, with one measuring almost 66 feet wide.

Narrow Pixel Pitch LEDs make a difference

Editor’s note: This profile is from our new VENUE DISPLAY REPORT series, a vertical-specific offering of MSR’s existing STADIUM TECH REPORT series. The VENUE DISPLAY REPORT series will focus on telling the stories of successful venue display technology deployments and the business opportunities these deployments enable. No registration or email address required — just click on this link and start reading!

John Castro, vice president of corporate partnerships for the Sharks, said the concourse displays are just the latest step in an ongoing process to “keep the venue updated and modernized.” Now celebrating its 25th year in existence, SAP Center recently hosted the NHL’s All-Star Game and is a regular stop for such big-ticket events as NCAA basketball regionals and U.S. Figure Skating championships.

Castro said the arena added a new Daktronics center-hung video board in 2010, with distinctive circular ribbon boards above and below that synchronize with the ribbon board that circles the arena in the middle of the seating areas. A few years ago, the arena put out an RFP to bring Wi-Fi to the stadium, and when it picked Cisco as the gear supplier, it also decided to use Cisco Vision to synchronize a new display strategy for the building’s main concourse.

“The idea was, let’s emulate what people see in the seats and bring it to the concourse,” Castro said.

A new LED screen above an entryway

What was eventually installed over the past two seasons were the new wall-mounted displays, which joined the 240 TV screens and the 16 hanging pendant displays (with six screens each) that were already in the concourses. According to Castro, the Sharks took down eight static signs to make room for the new, interactive displays.

All the new displays make use of Daktronics’ new Narrow Pixel Pitch (NPP) technology, which features 2.5-millimeter spacing between pixels. The close spacing of the LEDs keeps the displays sharp even at close viewing distances, with a look and feel more like a traditional TV screen than an LED ribbon board.
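A rough industry rule of thumb (an approximation, not a Daktronics spec) is that the minimum comfortable viewing distance in meters is about equal to the pixel pitch in millimeters, the range at which individual LEDs blend into a continuous image:

```python
# Rule-of-thumb check: why 2.5 mm pitch works in a concourse, where fans
# may stand only a few meters from the board.
def min_viewing_distance_m(pixel_pitch_mm: float) -> float:
    return pixel_pitch_mm * 1.0  # ~1 meter of distance per mm of pitch

print(min_viewing_distance_m(2.5))   # NPP concourse boards: sharp from ~2.5 m
print(min_viewing_distance_m(10.0))  # a coarse outdoor board needs ~10 m
```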

By using LED technology, not only are the boards more flexible in what kind of content they can carry, but they are also cheaper and more resilient than TV screens, something Hall said matters a lot to venues like SAP Center that may see up to 300 live events a year.

“If you have TVs, you have to replace them often, and over a lot of hours [the expense] is hard to justify,” said Hall. With its LED technology, Daktronics was able to create custom size boards to fit different areas in the concourse (like above the entry and exit doorways, or above the main entry openings to the seating bowl), giving the Sharks lots of flexibility to build their new concourse viewing experience.

Bringing Cisco Vision to control displays

To make fans take notice of the new displays, the Sharks turned to Cisco and its Cisco Vision IPTV display management system, which allows teams and venues to program and run multiple displays from a single management system. Cisco also brings to the table years of experience in designing, deploying and selling display systems and system content, which can help teams like the Sharks not only keep fans more engaged but also help the team improve its digital ad sales.

Cisco, which supplied the Wi-Fi gear when SAP Center got its new wireless networks a couple years ago, teamed up with network deployment partner AmpThink to deploy a new display system at the same time, often doubling up on infrastructure. At many points inside the arena, a display screen is mounted in the same space as a Wi-Fi access point, an efficient design that combines aesthetics (the APs are hidden behind the screens) with cost savings.

Menu screens and live action are side-by-side to keep fans engaged

According to Ken Martin, executive director of digital transformation for the consumer industries in the Americas and for the sports and entertainment industry globally at Cisco, the Sharks’ previous display system was limited in its capabilities, especially in the ability to change things like menu boards easily between events. Martin also said the Sharks had four different signage solutions for the various boards and displays throughout the stadium, making it hard to coordinate programming across screens.

Now with Cisco Vision in place, the Sharks can build “shows” of content and advertising that flow from screen to screen, or arrive simultaneously on multiple screens to increase the visual effect. Inside the SAP Center concourses, the new Daktronics panels combine with the previously existing screens hanging over the walkways to create a visual “wall” that draws the eye.

“The way [the screens] are positioned, you cannot stand in the SAP Center concourses without being hit by something,” Cisco’s Martin said.

The Sharks’ Castro said there “was a lot of discussion and research” about the placement of the signs.

“Whether you turn left or right, you’re always going to see an LED,” Castro said.

How to use digital displays to entertain and inform

Through its professional services that are part of the Cisco Vision deal, Cisco also helps the Sharks brainstorm with potential sponsors to create digital display advertising ideas, and then also helps create, produce and run the “show” of ads that streams across all the stadium’s displays. A current campaign with BMW is an example of using all concourse screens simultaneously to create an immersive feel to the advertising.

A look at the hanging pendant screens in sync with the LED wall boards

“Part of what we do is show customers the art of the possible,” said Martin, who said many demonstrations of digital-display potential can happen in his team’s extensive demo room at Cisco, where they have 27 different types of screens to model just about any possible stadium deployment. Though much of the digital advertising industry in venues is still in an adolescent stage, Martin said that sponsors are “way more educated than they have ever been,” and know now that they can ask for particulars like having ads shown at certain times, or to have advertising content “wrapped” around live action on partial screen real estate, like an “L-wrap.”

With Cisco Vision, the Sharks can not only coordinate a “show” of ads and other content during the game; they can also break in and trigger special screen content when something happens live, like a goal being scored. Such “takeover” moments are another asset that can be added to the ROI case for a smart digital display solution, something not possible with static display systems. Timely messages can really catch fans’ eyes, especially at hockey games, where people keep close track of the action even when they aren’t in their seats.

“If you’re a true hockey fan, you have your concourse timing down to a science,” said Daktronics’ Hall. “You never want to go to the game and feel like you’re missing something.”

To help those fans, one of the live action content pieces run across most of the concourse boards at SAP Center is a live clock that counts down the time until live action starts again.

“It can really be a showstopper, to use the screens and video walls, especially when they are all synchronized to the same message,” Cisco’s Martin said. “You’re going to get people to stop and pay attention.”

For the Sharks, the new system is already returning dividends; according to Castro, some 80 percent of all new digital display sponsorship business includes Cisco Vision integration as part of the opportunity.

“It helps [ads] rise above the clutter,” Castro said of the new display system. “You can see the impact on the brands as well as on the fans.”

“It’s like putting on a show in the concourse,” Daktronics’ Hall said of the new system. “It really extends the in-bowl experience through the whole venue.”

Super Bowl recap: 24 TB for Wi-Fi, 12 TB for DAS

Pats fans celebrate with a selfie at the end of Super Bowl 53. Credit all photos: Mercedes-Benz Stadium

Super Bowl 53 at Atlanta’s Mercedes-Benz Stadium rewrote the record book when it comes to single-day stadium Wi-Fi, with 24.05 terabytes of traffic seen on the stadium’s network. That is a huge leap from the official 16.31 TB seen at last year’s Super Bowl 52 in Minneapolis at U.S. Bank Stadium.

According to official statistics provided by Extreme Networks, new high-water marks were set last Sunday in every category of network measurement, including an amazing 48,845 unique users on the network, a take rate of 69 percent out of the 70,081 who were in attendance to watch the New England Patriots beat the Los Angeles Rams 13-3. The average Wi-Fi data use per connected fan also set a new record, with the per-fan mark of 492.3 megabytes per user eclipsing last year’s mark of 407.4.
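The headline numbers check out against one another; here is a quick recomputation from the figures reported above, with Python used simply as a calculator:

```python
# Recomputing Extreme Networks' headline stats from the numbers in this story.
total_tb = 24.05
unique_users = 48_845
attendance = 70_081

take_rate = unique_users / attendance
per_fan_mb = total_tb * 1_000_000 / unique_users  # TB -> MB, decimal units

print(f"take rate: {take_rate:.1%}")      # 69.7%, reported as 69 percent
print(f"per fan:   {per_fan_mb:.1f} MB")  # ~492.4 MB vs. the reported 492.3,
                                          # likely from an unrounded traffic total
```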

While fans might have preferred some more scoring excitement during the game, the lack of any tense moments in network operations was a perfect outcome for Danny Branch, chief information officer for AMB Sports & Entertainment.

“I was ecstatic on how [the network] executed, but honestly it was sort of uneventful, since everything went so well,” said Branch in a phone interview the week after the game. Though network performance and fan usage during some of the big events leading up to the Super Bowl had Branch thinking the Wi-Fi total number might creep near the 20-terabyte range, the early network use on game day gave Branch a clue that the final number might be even higher.

“When I saw the initial numbers that said we did 10 [terabytes] before kickoff we didn’t know where it would end,” Branch said. “When we were watching the numbers near the end of the game, we were just laughing.”

Aruba APs and AmpThink design shine

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new Wi-Fi and DAS networks being planned for the University of Colorado, as well as a profile of Wi-Fi at Little Caesars Arena in Detroit! DOWNLOAD YOUR FREE COPY now!

Digital device use once again set records at the NFL’s championship game.

Some 1,800 APs are installed inside Mercedes-Benz Stadium, with most of the bowl-seating APs located underneath the seats. The Wi-Fi gear from Aruba, a Hewlett Packard Enterprise company, deployed in a design from AmpThink, also saw a peak throughput rate of 13.06 Gbps at halftime. The peak number of concurrent network users, 30,605, also came during the halftime show, which featured the band Maroon 5 (whose show played to mixed reviews).

Extreme Networks, which provides Wi-Fi analysis in a sponsorship deal with the NFL, had a great list of specific details from the event. Here are some of the top-line stats:

Need proof that people still watch the game? Out of the 24.05 TB total, Extreme said 9.99 TB of the traffic took place before the kickoff, followed by 11.11 TB during the game and halftime, and another 2.95 TB after the game concluded.

On the most-used apps side, Extreme said the top social apps were, in order of usage, Facebook, Instagram, Twitter, Snapchat and Bitmoji; on the streaming side, the most-used apps were iTunes, YouTube, AirPlay, Spotify and Netflix. The most-used sports apps by fans at the game were, in order, ESPN, NFL, the Super Bowl LIII Fan Mobile Pass (the official app for the game), CBS Sports (which broadcast the game live) and Bleacher Report.

Did Verizon’s offload spike the total?

While Super Bowl Wi-Fi traffic has grown significantly each year since we started reporting the statistics, one reason for the bigger leap this year may be that Verizon Wireless used its sponsorship relationship with the NFL to acquire its own SSID on the Mercedes-Benz Stadium Wi-Fi network.

Hard copy signage in the stadium helped direct fans to the Wi-Fi.

According to Andrea Caldini, Verizon vice president for network engineering in the Eastern U.S., Verizon had “autoconnect in play,” which meant that any Verizon customer with Wi-Fi active on their device would be switched over to Wi-Fi when inside the stadium.

“It’s going to be a good offload for us,” said Caldini in a phone interview ahead of the Super Bowl. And while Verizon claimed to have seen “record cellular traffic” as well during Super Bowl Sunday, a spokesperson said Verizon will no longer release such statistics from the game.

According to Branch, the NFL helped fans find the Wi-Fi network with additional physical signage that was put up just for the Super Bowl, in addition to rotating messages on the digital display screens around the stadium.

“The venue was well signed, we really liked what they [the NFL] did,” Branch said, adding that the league also promoted the Wi-Fi link throughout the week, with a common ID at all the related Super Bowl activity venues, something that may have helped fans get connected on game day.

No issues with the DAS

One part of the wireless mix at Mercedes-Benz Stadium, the cellular distributed antenna system, was under scrutiny after a lawsuit emerged last fall in which technology supplier IBM sued Corning over what IBM said was faulty installation. While Corning has disputed the claims, over the past year IBM, the Falcons and the NFL all said they got the DAS in working order, and according to Branch “all the carriers were pleased” with its operation during the Super Bowl.

According to Branch, the Falcons saw 12.1 TB of traffic on the in-stadium DAS on Super Bowl Sunday, including some traffic that went through the Matsing Ball antennas. Branch said the two Matsing Balls, which hang from the rafters around the Halo Board video screen, were turned back on to assist with wireless traffic on the field during the postgame awards ceremony.

Overall, the record day of Wi-Fi traffic left Branch and his team confident their infrastructure is ready to support the wireless demands of more big events into the future, including next year’s NCAA men’s Final Four.

“Until you’ve taken the car around the track that fast, you don’t really know how it will perform,” Branch said. “But so much work was done beforehand, it’s great to see that it all paid off.”

New Report: Record Wi-Fi at Super Bowl 53, and Wi-Fi and DAS for Colorado’s Folsom Field

MOBILE SPORTS REPORT is pleased to announce the Spring 2019 issue of our STADIUM TECH REPORT series, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace.

Our string of historical in-depth profiles of successful stadium technology deployments continues with reports from the record-setting Wi-Fi day at Super Bowl 53, a look at the network performance at Little Caesars Arena, plans for Wi-Fi and DAS at the University of Colorado and more! Download your FREE copy today!

We’d like to take a quick moment to thank our sponsors, which for this issue include Mobilitie, JMA Wireless, Corning, Boingo, MatSing, and Cox Business/Hospitality Network. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to welcome readers from the Inside Towers community, who may have found their way here via our ongoing partnership with that excellent publication. And we’d like to thank the SEAT community for its continued interest and support.

Raiders sign Cox Communications as Wi-Fi partner for Las Vegas Stadium

Cox president Pat Esser, left, and Raiders president Marc Badain in front of the under-construction Las Vegas Stadium. Credit, photo and renderings: Cox Communications/Raiders

Cox Communications signed on with the NFL’s Raiders as a founding partner and “official Wi-Fi and Internet provider” for the under-construction Las Vegas Stadium, which is scheduled to open next summer.

Though no details are available yet on whose gear Cox will use for the venue’s Wi-Fi network or how many APs it will place in the 65,000-seat stadium, a press release did say that Cox would provide “multiple gig-speed bandwidth” to the venue, which at the very least should ensure good connectivity when the now-Oakland Raiders and their fans arrive.

“With a rich history of powering the largest stadiums, hotels and convention centers – many right here in Las Vegas – we’re excited to work with Cox on the next evolution of the connected fan experience,” said Marc Badain, president of the Raiders, in a prepared statement. Cox, which is targeting the large public venue space more aggressively lately, also is the lead technology provider at T-Mobile Arena in Las Vegas and recently took over in the same role at State Farm Stadium in Glendale, Ariz., home of the NFL’s Arizona Cardinals and multiple big events.

“Las Vegas is the sports and entertainment capital of the world, but it’s also becoming one of the smartest, and most connected cities in the world,” said Pat Esser, president, Cox Communications, in a prepared statement. “The new Las Vegas Stadium will perfectly complement this progression and could become the smartest gridiron yet.” Cox is also the wireless provider at the huge Las Vegas Convention Center, where shows like CES drive lots of Wi-Fi and cellular traffic.

As the bones of the stadium are now rising into the Las Vegas skyline, MSR will keep tabs on the construction development and how all the technology is coming together, so stay tuned. Some renderings of what the finished product is supposed to look like are below.

What the new stadium is supposed to look like inside

And from the outside

PGA Tour gives CBRS a test

Volunteers track shots with lasers on the fairways of PGA Tour tournaments. Credit: Chris Condon/PGA TOUR

CBRS technology doesn’t need spiked shoes to gain traction on the fairways, if early results from technology tests undertaken by the PGA Tour at courses around the country are any indication.

A recent 14-state test run by the top professional U.S. golf tour tapped the newly designated Citizens Broadband Radio Service (CBRS), which comprises 150 MHz of spectrum in the 3.5 GHz band. Golf courses, which typically lack the dense wireless coverage of more populated urban areas, are easily maxed out when thousands of fans show up on a sunny weekend to trail top-ranked players like Brooks Koepka, Rory McIlroy or perennial favorite Tiger Woods.

To cover the bandwidth needs of tournaments, the PGA Tour has over time used a mix of technologies, many portable in nature given the short stay of a tournament at any given course. As with the Wi-Fi and temporary cellular infrastructure it has used in the past, the Tour hopes CBRS will help support the public safety, scoring and broadcast applications required to keep its events operating smoothly and safely.

“We’re looking at replacing our 5 GHz Wi-Fi solution with CBRS so we can have more control over service levels,” said Steve Evans, senior vice president of information systems for the PGA Tour. Unlike 5 GHz Wi-Fi, CBRS is licensed spectrum, making it less prone to the interference the Tour occasionally experienced.

CBRS will also make a big difference with the Tour’s ShotLink system, the platform the PGA Tour uses to gather data on every shot made during competition play – distance, speed and other scoring data.

“CBRS would help us get the data off the golf course faster” than Wi-Fi can, Evans explained. “And after more than 15 months of testing we’ve done so far, CBRS has better coverage per access point than Wi-Fi.”

The preliminary results are so encouraging that the Tour is also looking to CBRS to carry some of its own voice traffic and has already done some testing there. “We need to have voice outside the field of play, and we think CBRS can help solve that problem,” Evans added.

But CBRS is still an emerging technology, and it’s important to acknowledge its limitations. Compatible handsets aren’t widely available; the PGA Tour has been testing CBRS prototypes from Essential. Those units operate only in bands 42 and 43; a third, band 48, is expected to be added by device makers sometime in the first half of 2019.

“We’re waiting for the phones to include band 48 and then we’ll test several,” Evans told Mobile Sports Report. “I expect Android would move first and be very aggressive with it.”

CBRS gear mounted on temporary poles at a PGA Tour event. Credit: PGA Tour

The PGA Tour isn’t the only sports entity looking at CBRS’s potential. The National Football League is testing coach-to-coach and coach-to-player communications over CBRS at all the league’s stadiums; the NBA’s Sacramento Kings are testing it at Golden 1 Center with Ruckus; NASCAR has been testing video transmission from inside cars using CBRS along with Nokia and Google; and ISM Raceway in Phoenix, Ariz., recently launched a live CBRS network that it is currently using for backhaul to remote parking lot Wi-Fi hotspots.

Outside of sports and entertainment, FedEx, the Port of Los Angeles and General Electric are jointly testing CBRS in Southern California. Love Field Airport in Dallas is working with Boingo and Ruckus in a CBRS trial; service provider Pavlov Media is testing CBRS near the University of Illinois at Urbana-Champaign with Ruckus gear. Multiple service providers from telecom, cable and wireless are also testing the emerging technology’s potential all around the country.

Where CBRS came from, where it’s going

Editor’s note: This profile is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new game-day digital fan engagement strategy at Texas A&M, as well as a profile of Wi-Fi at Mercedes-Benz Stadium, home of Super Bowl LIII in Atlanta! DOWNLOAD YOUR FREE COPY now!

CBRS has undergone a six-year gestation period; 150 MHz of bandwidth was carved out of the 3.5 GHz band, and it must be shared with (and not interfere with) U.S. government radar operations already using that same spectrum.

From a regulatory perspective, CBRS’s experimental status is expected to give way to full commercial availability in the near future. Consequently, wireless equipment vendors have been busy building – and marketing – CBRS access points and antennas for test and commercial usage. But entities like the PGA Tour have already identified the benefits and aren’t waiting for the FCC to confer full commercial status on the emerging wireless technology.

CBRS equipment vendors and would-be service providers were hard to miss at last fall’s Mobile World Congress Americas meeting in Los Angeles. More than 20 organizations – all part of the CBRS Alliance – exhibited their trademarked OnGo services, equipment and software in a day-long showcase event. (Editor’s note: “OnGo” is the alliance’s attempt to “brand” the service as something more marketable than the geeky CBRS acronym.)

The CBRS Alliance envisions five potential use cases of the technology, according to Dave Wright, alliance president and director of regulatory affairs and network standards at Ruckus:
• Mobile operators that want to augment capacity of their existing spectrum
• Cable operators looking to expand into wireless services instead of paying a mobile virtual network operator (MVNO)
• Other third-party providers looking to offer fixed broadband services
• Enterprise and industrial applications: extending or amplifying wireless in business parks and remote locations; Internet of Things data acquisition.
• Neutral host capabilities, which some have likened to LTE roaming, an important development as 5G cellular services ramp up.

Previously, if customers wanted to extend cell coverage inside a building or a stadium, their best option was often distributed antenna systems (DAS). But DAS is complicated, expensive and relies on carrier participation, according to Wright. “Carriers also want to make sure your use of their spectrum doesn’t interfere with their macro spectrum nearby,” he added.

CBRS uses discrete spectrum not owned by a mobile operator, allowing an NFL franchise, for example, to buy CBRS radios and deploy them around the stadium, exclusively or shared, depending on their requirements and budgets.

More CBRS antenna deployment. Credit: PGA Tour

On a neutral host network, a mobile device would query the LTE network to see which operators are supported. The device would then exchange credentials between the two networks – CBRS and cellular – after which permissions are granted, the user is authenticated, and usage information gets passed back to the carrier, Wright explained.
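In outline form, the exchange Wright describes looks something like the sketch below. This is only a schematic of the sequence, not an actual LTE/CBRS signaling implementation, and every name in it is invented for illustration:

```python
# Schematic of a neutral-host attach sequence; illustrative only.
def report_usage_to(carrier: str, session: dict) -> None:
    # 4. Usage records flow back to the home carrier for accounting.
    print(f"usage for {session['imsi']} reported to {carrier}")

def attach_via_neutral_host(device_imsi: str, home_carrier: str,
                            supported_operators: set[str]) -> bool:
    # 1. Device queries the broadcast network info for supported operators.
    if home_carrier not in supported_operators:
        return False
    # 2. Credentials are exchanged between the CBRS network and the carrier;
    # 3. the carrier grants permission and the user is authenticated.
    session = {"imsi": device_imsi, "carrier": home_carrier, "authenticated": True}
    report_usage_to(home_carrier, session)
    return True

print(attach_via_neutral_host("310-012-0000000001", "Verizon",
                              {"Verizon", "AT&T", "T-Mobile"}))
```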

With the PGA Tour tests, the Essential CBRS devices get provisioned on the network, then connect to the CBRS network just like a cell phone connects to public LTE, Evans explained. The Tour’s custom apps send collected data back to the Tour’s network via the CBRS access point, which is connected to temporary fiber the Tour installs. And while some of Ruckus’s CBRS access points also support Wi-Fi, the Tour uses only CBRS. “When we’re testing, we’re not turning Wi-Fi on if it’s there,” Evans clarified.

While the idea of “private LTE” networks supported by CBRS is getting lots of headline time, current deployments require a new SIM card for any device wanting to use a private CBRS network, something that may slow down deployments until programmable SIM cards move from good idea to reality. But CBRS networks can also be used for local backhaul, with Wi-Fi connecting to client devices, a tactic currently in use at ISM Raceway in Phoenix.

“It’s an exciting time… CBRS really opens up a lot of new opportunities,” Wright added. “The PGA Tour and NFL applications really address some unmet needs.”

CBRS on the Fairways

Before deploying CBRS access points at a location, the PGA Tour surveys the tournament course to create a digital image of every hole, along with other data used to calculate exact locations and distances between any two coordinates, like the tee box and the player’s first shot, or a shot location and the location of the hole. The survey also helps the Tour decide how and where to place APs on the course.
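The distance piece of that survey reduces to great-circle math between two surveyed points. Here is a standard haversine calculation as a sketch; the coordinates are invented, and nothing here should be read as the Tour’s actual code:

```python
# Great-circle distance between two surveyed (lat, lon) points, e.g. a tee
# box and the spot where the first shot landed. Standard haversine formula.
from math import radians, sin, cos, asin, sqrt

def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Distance in meters between two WGS84 coordinates."""
    R = 6_371_000  # mean Earth radius, meters
    p1, p2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Hypothetical tee box and first-shot coordinates, a fairway apart:
print(distance_m(36.1716, -115.3254, 36.1739, -115.3258))  # ~258 m
```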

Courses tend to be designed in two different ways, according to the PGA Tour’s Evans. On some courses, the majority of holes are adjacent to each other, creating a more compact course; other courses are routed through neighborhoods and may snake around, end to end.

“In the adjacent model, which is 70 percent of the courses we play, we can usually cover the property with about 10 access points,” Evans explained.

Adjacent-style courses where the PGA Tour has tested CBRS include Ridgewood Country Club in Paramus, N.J.; Aronimink Golf Club in Newtown Square, Penn.; and East Lake Golf Club in Atlanta.

In the second model, where the holes are strung back to back, the PGA Tour may have to deploy as many as 18 or 20 APs to get the coverage and throughput it needs. That’s the configuration used during a recent tournament at the TPC Summerlin course in Las Vegas, Nev., Evans told Mobile Sports Report.

On the course, CBRS APs get attached to some kind of structure where possible, Evans added. “Where that doesn’t make sense, we have portable masts we use – a tripod with a pole that goes up 20 feet,” he said. The only reason he’d relocate an AP once a tournament began is if it caused a problem with the competition or fan egress. “We’re pretty skilled at avoiding those issues,” he said.

A handful of PGA Tour employees operates its ShotLink system, which also relies on an army of volunteers – as many as 350 at each tournament – who help with data collection and score updates (that leaderboard doesn’t refresh itself!). “There’s a walker with each group, recording data about each shot. There’s technology for us on each fairway and green, and even in the ball itself, as the ball hits the green and as players hit putts,” said Evans.

The walker-volunteers relay their data back to a central repository; from there, ShotLink data gets sent to PGA Tour management and is picked up by a variety of organizations: onsite TV broadcast partners, the pgatour.com website, players, coaches and caddies, print media, and mobile devices.
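As a toy model of that relay path, the sketch below fans a single shot record out from a central repository to downstream consumers. The record fields and consumer names are invented for illustration; ShotLink’s real schema is not described in this story:

```python
# Toy model of the ShotLink relay: walker -> central repository -> consumers.
from dataclasses import dataclass

@dataclass
class ShotRecord:
    player: str
    hole: int
    shot_number: int
    distance_m: float

repository: list[ShotRecord] = []
subscribers = ["broadcast truck", "pgatour.com", "print media"]

def relay(shot: ShotRecord) -> None:
    repository.append(shot)  # central collection point
    for s in subscribers:    # fan-out to downstream consumers
        print(f"-> {s}: {shot.player}, hole {shot.hole}, shot {shot.shot_number}")

relay(ShotRecord("R. McIlroy", 1, 1, 258.2))
```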

In addition to pushing PGA Tour voice traffic over to CBRS, the organization is also looking to the technology to handle broadcast video. “We think broadcast video capture could become a [CBRS] feature,” Evans said. The current transport method, UHF video, is a low-latency way to get video back to a truck where it can be uploaded for broadcast audiences.

A broadcast program produced by the organization, PGA Tour Live, follows two groups on the course; each group has four cameras, and producers cut between each group and each camera. That video needs to be low latency and highly reliable, but UHF transmission makes it expensive.

Once 5G standards are created for video capture, the PGA Tour could use public LTE to bond a number of cell signals together. Unfortunately, that method has higher latency. “It’s fine for replay but not for live production,” Evans said, though he expects it to improve. “The idea is eventually to move to outside cameras with CBRS and then use [CBRS] for data collection too,” he added. “If we could take out the UHF cost, it would be significant for us.”

In the meantime, the Tour will continue to rely largely on Cisco-Meraki Wi-Fi and use Wi-Fi as an alternate route if something happens to CBRS, Evans said. “But we expect CBRS to be primary and used 99 percent of the time.”