Colorado brings Wi-Fi and DAS to Folsom Field

Folsom Field at night. Credit: University of Colorado (click on any picture for a larger image)

There will be a change in the air at Folsom Field this fall, and not just from the team that new head coach Mel Tucker will lead onto the gridiron. For the first time, the mile-high atmosphere inside the University of Colorado’s historic venue will be filled with fan-facing Wi-Fi and cellular signals, thanks to new networks being installed this offseason by third-party host Neutral Connect Networks (NCN).

In a deal that will also bring Wi-Fi and a cellular DAS to the school’s basketball arena, NCN will use Cisco gear for the Wi-Fi network and JMA Wireless gear for the cellular networks. A centrally located head-end will serve both venues via fiber connections, some run through existing tunnels from the campus’ old steam-heating infrastructure.

Due to be live before the 2019 football season begins on Sept. 7 when CU hosts Nebraska, the Wi-Fi network will use 550 APs in a mostly under-seat deployment at Folsom Field, where there are no overhangs over any of the seating areas. DAS deployment in Colorado’s historic football stadium — which first hosted games in 1924 — will use antennas pointing down from the stadium’s top edges, with some new flagpoles scheduled to help provide antenna-mounting locations.

While its incredibly picturesque location at the edge of the Rocky Mountains has historically made Folsom Field a fan-favorite place to visit (at least for photos), the lack of comprehensive wireless coverage of any sort has produced some grumbling from Buffs fans in recent years. According to Matt Biggers, CU’s chief marketing officer and associate athletic director for external affairs, wireless coverage inside the sports venues has been a topic of internal research for more than six years.

“It was all about finding a partner and a financial model that works for us,” said Biggers. “It finally got to a point where it made sense to pull the trigger.”

Neutral host model appealing to schools

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the Wi-Fi records set at Super Bowl 53, as well as a profile of Wi-Fi at Vivint Smart Home Arena in Salt Lake City! DOWNLOAD YOUR FREE COPY now!

The CU Events Center, home of Colorado hoops teams. Credit: Paul Kapustka, MSR

The model brought to CU is a classic neutral-host operation: a provider like NCN (which bought the former sports-stadium practice from 5 Bars) builds a school’s Wi-Fi and DAS networks under a revenue-sharing deal, with the carriers contributing some upfront payments and then paying over a long-term lease to operate on the DAS.

The neutral-host option is one good way for schools or teams with smaller budgets or lightly used facilities to bring connectivity to arenas. CU’s Folsom Field, for example, doesn’t see much use other than the six home games per football season. This year, the stadium will see big crowds beyond football only at a few events, including the Memorial Day Bolder Boulder 10K footrace (which ends inside the stadium), a Fourth of July fireworks celebration, and a couple of July concerts featuring the Dead & Company tour.

According to James Smith, vice president of carrier services for NCN, AT&T will be the anchor tenant on the DAS, and will be first to be operational. Verizon Wireless and T-Mobile, Smith said, are still negotiating long-term agreements but are expected to be on the DAS by 2020.

NCN [then under its old name of 5 Bars] negotiated a similar neutral-host deal with CU’s neighbor to the north, Colorado State University, for CSU’s new football stadium which opened in 2017. Now known as Canvas Stadium, the 41,000-seat venue had 419 total Wi-Fi access points when it opened, with approximately 250 of those used in the bowl seating area. Like CSU’s deployment, the Wi-Fi network at Folsom Field will use primarily under-seat AP deployments, mainly because the stadium’s horseshoe layout has no overhangs.

DAS gear already installed in the CU Events Center

According to NCN’s Smith, the current plan sees a deployment of 550 APs in Folsom Field, with another 70 APs in the basketball arena, the CU Events Center. Both venues’ networks will be served by a central head-end room located in an old telephone PBX space near the center of campus. Fiber links will run from there to both Folsom Field and the Events Center.

At Folsom, the NCN team will have a long list of deployment challenges, mainly having to navigate the construction particulars of a stadium that has been gradually expanded and added onto over the years.

“Sometimes it’s hard to know what’s behind a brick,” said NCN director of program management Bryan Courtney, speaking of existing infrastructure that has been around for decades. Smith said the Folsom Field DAS will make use of overhead antennas, including some that will require new flagpole-type structures that will need to match Folsom Field’s architectural heritage.

Basketball arena is all top-down

At the 11,064-seat CU Events Center, formerly known as the Coors Events Center, deployment of both Wi-Fi and DAS will be somewhat easier, as all the gear servicing the seating area will be suspended from the catwalks. With the main concourse at stadium entry level and all the seats in a single rectangular bowl flowing down from there, the ceiling is close enough for good top-down coverage for both Wi-Fi and cellular, NCN’s Smith said.

The Golden Buffalo Marching Band on a CU game day. Credit: Paul Kapustka, MSR

Though deployment of both networks in the Events Center is currently underway, neither will be active until after the current college basketball season is completed. However, the Events Center stays somewhat busier than the football stadium, with events like local high school graduations and other special events (like a Republican Party debate in 2015) making use of the space. Both networks should be fully up and running by the next basketball season, according to NCN.

Unlike some other universities that are aggressively pursuing digital fan-connection strategies, CU’s Biggers said the school will start slowly with its fan-facing networks, making sure the experience is a solid one before trying to do too much.

“We’re pretty conservative, and this is a complicated project and we want to make sure we get it right,” said Biggers. Though Biggers said CU fans haven’t been extremely vocal about connectivity issues inside the sports venues, he does admit to hearing about “some frustration” about signals in some areas of the stadium (which until now has only been served by a couple of dedicated macro antennas from the outside).

“There’s definitely a hunger [for wireless service],” Biggers said.

On the business side, Biggers said CU will also be taking more time to evaluate any additions to its game-day digital operations. Though CU recently introduced a mobile-only “buzzer beater” basketball ticket package that offered discounted passes that would deliver an assigned seat to a device 24 hours before game time, Biggers said that for football, a longtime paper-ticket tradition for season ticket holders would likely stay in place.

Colorado will also “re-evaluate” its game-day mobile application strategy, Biggers said, with the new networks in mind. “But the real game-changer for us is data collection,” he said. “We’re most excited about having data to better serve the fans.”

Commentary: Cheer, Cheer for old Wi-Fi

A hoops fan records action during the Final Four at U.S. Bank Stadium. Credit all photos: Paul Kapustka, MSR (click on any picture for a larger image)

News item: Super Bowl 53 sees 24 terabytes of Wi-Fi data used.

Second news item: Final Four weekend sees 31.2 terabytes of Wi-Fi data used.

Even as people across the wireless industry seem ready to dig Wi-Fi’s grave, the view from here is that not only are reports of Wi-Fi’s imminent death greatly exaggerated, but things may actually be heading in the other direction: Wi-Fi’s last-mile and in-building dominance may just be getting started.

The latest ironic put-down of Wi-Fi came in a recent Wall Street Journal article headlined “Cellphone Carriers Envision World Without Wi-Fi,” in which a Verizon executive calls Wi-Fi “rubbish.” While the article itself presents plenty of facts about why Wi-Fi is already the dominant last-mile wireless carrier (and may just get stronger going forward), it doesn’t mention the Super Bowl at all, where Verizon itself basically turned to Wi-Fi to make sure fans at the big game who were Verizon customers could stay connected.

Wi-Fi speedtest from U.S. Bank Stadium during the Final Four championship game.

As readers of MSR know, the performance of the cellular DAS at Mercedes-Benz Stadium in Atlanta has been a question mark since its inception, and the emergence of competing lawsuits between lead contractor IBM and supplier Corning over its implementation means we may never learn publicly what really happened, and whether or not it was ever fixed. Though stadium tech execs and the NFL said publicly that the DAS was fine for the Super Bowl, Verizon’s actions perhaps spoke much louder — the carrier basically paid extra to secure part of the Wi-Fi network bandwidth for its own customers, and used autoconnect to get as many of its subscribers as it could onto the Wi-Fi network.

While we did learn the Wi-Fi statistics in detail — thanks to the fact that Wi-Fi numbers are controlled by the venue, not the carriers — it’s interesting to note that none of the four top cellular providers in the U.S. would give MSR a figure of how much cellular traffic they each saw in the stadium on Super Sunday. For the record, stadium officials said they saw 12.1 TB of data used on the Mercedes-Benz Stadium DAS on Super Bowl Sunday, a figure that represents the total traffic from all four carriers combined. But how that pie was split up will likely forever remain a mystery.

AT&T did provide a figure of 23.5 TB for Super Bowl traffic inside the venue as well as within a 2-mile radius around the stadium, and Sprint provided a figure (25 TB) but put an even less measurable geographic boundary on it, meaning Sprint could basically have been reporting all the traffic it saw anywhere in greater Atlanta. Verizon and T-Mobile, meanwhile, both refused to report any Super Bowl cellular statistics at all.

An under-seat Wi-Fi AP placement in the end zone seating at the Final Four.

Verizon also did not reply to a question about how much traffic it saw on the Verizon-specific Wi-Fi SSID inside the venue. While we get the marketing reasons for not reporting disappointing stats (why willingly report numbers that make you look bad?), it seems disingenuous at best for one Verizon executive (Ronan Dunne, executive vice president and president of Verizon Wireless) to call Wi-Fi “rubbish” when another part of the company is relying heavily on that same rubbish technology to make sure its customers can stay connected when the cellular network can’t keep up. One man’s trash, I guess, is another division’s treasure.

Wi-Fi 6 and more spectrum on the way

For venue owners and operators, the next few years are likely going to be filled with plenty of misinformation regarding the future of wireless. The big carriers, who pull in billions each quarter in revenue, are staking their near-term future on 5G, a label for a confusing mix of technologies and spectrum chunks that is unlikely to be cleared up anytime soon. Unlike the cellular industry’s change from 3G to 4G, a relatively straightforward progression to a new and unified type of technology, the change to 5G has already seen carriers willing to slap the marketing label on any number of different implementations, which portends many headaches for those in the venue space who have to figure out what will work best for their buildings and open spaces.

There’s also the imminent emergence of networks that will use the CBRS spectrum at 3.5 GHz, which will support communications using the same LTE technology used for 4G cellular. Though CBRS has its own challenges and hurdles to implementation, because it is backed by carriers and the carrier equipment-supply ecosystem, you can expect a blitz of 5G-type marketing to fuel its hype, with poor old Wi-Fi often the target for replacement.

While the Wi-Fi Alliance and other industry groups rallying around Wi-Fi might seem like the Rebel Alliance against a First Order dreadnought, if I’ve learned anything in my career of technology reporting it’s that you should never bet against open standards. I’ve been around long enough to see seemingly invincible empires built on proprietary schemes collapse and disappear under the relentless power of open systems and standards (think Ethernet vs. DEC or IBM networking protocols, or TCP/IP vs. Novell), so I won’t count out Wi-Fi in a battle, even against the cellular giants. In fact, with the improvements that are part of Wi-Fi 6 (also known as 802.11ax in the former parlance), Wi-Fi is supposed to eventually become more like LTE, with more secure connections, better support for roaming connections, and the ability to connect more clients per access point. What happens then if LTE’s advantages go away?

With Wi-Fi 6 gear only now starting to arrive in the marketplace, such claims still need to be proven in the real world, especially in the demanding and special-case world of wireless inside venues. But the same hurdles (and maybe even more) exist for CBRS and 5G technologies, with big unanswered questions about device support and the need for vast numbers of antennas that are usually ignored in the “5G will take over the world soon” hype stories. I also wonder where the time and talent will come from to install a whole bunch of new technologies with new learning curves; meanwhile, as far as I can tell, the companies supporting Wi-Fi continue to add technology pros at ever-growing user and education conferences.

So as we ready for the inevitable challenge of sifting through cellular FUD and hype, let’s have a cheer for good old Wi-Fi — for now the champion of the biggest data-demand days in venues, and maybe the leader for years to come.

Little Caesars Arena revs the engine on wireless

Little Caesars Arena in Detroit is revving its engine with wireless deployments of Wi-Fi and DAS. Credit all photos: Terry Sweeney, MSR (click on any picture for a larger image)

Detroit has made an ambitious bet on the sports entertainment model with its 50-block District Detroit development – which embraces Ford Field (where the NFL’s Lions play), Comerica Park (MLB’s Tigers) and most recently, Little Caesars Arena (NBA’s Pistons and NHL’s Red Wings).

In fact, Motor City might just as easily be renamed Stadium City as Detroit looks to professional sports as one cornerstone of economic re-development.

The city has all four major pro sports teams competing within a few blocks of each other, noted John King, vice president of IT and innovation for Olympia Entertainment and the Detroit Red Wings. District Detroit plays host to more than 200 events, welcoming some 3 million visitors annually – not bad for an area that’s barely 18 months old.

Detroit’s hardly alone in riding this development wave. Sports entertainment districts are a proven engine to boost local economies and are popping up all over the country:
–Los Angeles’s LA Live complex uses the Staples Center as its hub but includes restaurants, hotels and plenty of retail;
–Houston Avenida gangs together Minute Maid Park, BBVA Compass Stadium and NRG Stadium, along with a convention center and hotels;
–Battery Atlanta houses the Atlanta Braves’ SunTrust Park and a Coca-Cola entertainment facility, along with retail, residences and hotels;
–Westgate Entertainment District in the greater Phoenix area houses State Farm Stadium (NFL’s Cardinals) and Gila River Arena (NHL’s Coyotes), plus the obligatory retail, restaurants and hotels.

San Francisco, Kansas City, Cincinnati, Sacramento and other cities are building out similar sports entertainment developments in their downtown areas, encouraging sports fans to make a night of it, or even a weekend. Even venerable venues like Green Bay’s Lambeau Field and Chicago’s Wrigley Field are getting in on the act, building areas outside the parks to keep fans engaged (and spending) before and after events, or even when there are no games being played.

Robust DAS, Wi-Fi in LCA

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new Wi-Fi and DAS networks being planned for the University of Colorado, as well as a profile of Wi-Fi at Vivint Smart Home Arena in Salt Lake City! DOWNLOAD YOUR FREE COPY now!

John King oversees the IT operations at Little Caesars Arena

King is pleased with the performance of the IT infrastructure at Little Caesars Arena since the $863 million venue opened in the fall of 2017. With a backbone of two 100-Gbps fiber connections, the arena counts more than 700 Cisco Wi-Fi access points. There are 364 APs in the bowl itself; the bulk of those – 300 APs – have been installed under seats to get the signals closer to where the users are.

Mobile Sports Report put LCA’s Wi-Fi network and DAS system to the test this season during a Red Wings home game against the New York Rangers. Due to personal technical constraints, we were only able to test Verizon’s portion of the DAS deployment; the Wi-Fi network tested was the District Detroit Xfinity SSID.

The good news is that both network types performed admirably. No surprise that bandwidth was most plentiful and speeds were fastest on concourses near concessions, as well as in the private clubs parceled around LCA. Fastest measured speeds: 139.68 Mbps download/33.24 Mbps upload on the DAS network outside the MotorCity Casino Club. The Wi-Fi was also well engineered there – 51.89 Mbps download and 72.34 Mbps upload were plenty fast for hockey’s power users.

We measured comparable speeds by the Rehmann Club with 134.4 Mbps/36.25 Mbps on the DAS, and 21.56 Mbps/120.8 Mbps on Wi-Fi. Similarly, connectivity was not an issue while standing in front of the impossible-to-miss Gordie Howe statue in LCA’s main concourse, where we clocked DAS at 102.95 Mbps/22 Mbps, and Wi-Fi at 43.34 Mbps/43.72 Mbps.

Speeds around the arena were generally in double-digit megabits, both for Wi-Fi and DAS. The Wi-Fi signal got a little sluggish in Section M7 (0.79 Mbps/3.03 Mbps) and Section M33 (1.68 Mbps/29 Mbps). Lowest measured throughput on the DAS network was in Suite 17 with 16.18 Mbps/17.41 Mbps, still plenty fast to handle most fan requirements.

Lighting Things Up in District Detroit

In tandem with LCA, approximately 1,000 additional APs on the network either handle District Detroit’s public Wi-Fi or connect 34 parking lots and garages.

Wireless gear painted to blend in

“Our goal is to bring life and excitement throughout the District and not just focus on Little Caesars Arena,” King said. Video and digital signage are essential to that effort, both inside and outside LCA. The network enables more than 1,500 IPTV connections distributed across the arena, but also externally to LED boards and electronic parking signs. “We want to take the excitement from the event and run it out to the city – ‘5 minutes to puck drop’, on all those signs as one example,” King explained. “We can leverage [signage] for more than just the price of parking.”

The network uses the Cisco Vision IPTV digital display management system to control display programming, including advertising that appears on video screens in LCA’s many hospitality suites. With five TV screens per suite, LCA deploys an L-shaped “wrapper” around the main video image used for advertising. “We rotate that content in the suites and run loops in concourse before and after events,” King said. “It allows us to put scripting in different zones or post menus and dynamically update prices and items for sale.” LCA’s concessionaires can change the price or location of food and beverage items, all through the networked point-of-sale system.
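King’s description of zones, loops and dynamic pricing maps naturally onto a simple data model. The sketch below is illustrative only; Cisco Vision’s actual configuration interfaces aren’t described in the article, and all names here are hypothetical:

```python
# Illustrative sketch of the zone/loop/dynamic-pricing idea described above.
# This is NOT Cisco Vision's API; all classes and names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Zone:
    """A display zone: a named group of screens running a content loop."""
    name: str
    screens: int
    playlist: list[str] = field(default_factory=list)


def update_price(menu: dict[str, float], item: str, new_price: float) -> None:
    """Push a price change, as the networked point-of-sale integration allows."""
    menu[item] = new_price


# Five screens per suite, with an L-shaped sponsor "wrapper" in the loop
suite = Zone("Suite wrapper", screens=5, playlist=["sponsor_ad", "score_feed"])
concourse = Zone("Main concourse", screens=40, playlist=["pregame_loop"])

menu = {"hot dog": 6.50, "soda": 4.00}
update_price(menu, "soda", 3.50)  # e.g. concessionaires dropping a price late in the game
print(menu["soda"])
```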

Tune-able Wi-Fi

The District Detroit app is divided into three “buckets,” according to King: Detroit Red Wings, Detroit Pistons and 313 Presents — all the events and entertainment outside of sporting events (313 is Detroit’s area code). When configured for hockey, LCA can accommodate up to 19,515 Red Wings fans; as a basketball arena for the Pistons, LCA holds 20,491. But some events may draw fewer people and King and his team adjust accordingly.

“We’re an arena for 20,000 fans, and as we looked at that density, we found that 10,000 fans behave differently, and we’ve had to tune the arena differently based on traffic flows,” he said. When the arena is completely full, Wi-Fi signals must pass through many “bags of water,” as RF engineers sometimes describe human spectators. With half as many fans, Wi-Fi signals behave differently; consequently, a fan may connect to an AP that’s less than ideal, which can affect both user experience and system performance.
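The “bags of water” effect can be sketched with the standard free-space path-loss formula plus an assumed per-body attenuation. The 3 dB-per-person figure below is illustrative, not a measurement from LCA:

```python
import math


def fspl_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in meters, frequency in MHz)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55


# Illustrative only: each body between AP and phone adds a few dB of loss.
# 3 dB per person is an assumed figure, not a measured LCA value.
body_loss_db = 3.0

loss_empty = fspl_db(10, 5200)             # 10 m at 5.2 GHz, clear line of sight
loss_full = loss_empty + 4 * body_loss_db  # same path through four standing fans

print(f"Clear path: {loss_empty:.1f} dB")
print(f"Through a crowd: {loss_full:.1f} dB")
```

A few extra dB of loss is enough to tip a phone toward associating with a farther, less ideal AP, which is why the tuning changes with attendance.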

An under-seat Wi-Fi enclosure

“We’ve looked at some power tweaks and tuning; we also have the ability to tune [the arena] on the fly,” King said, but emphasized that the venue’s Wi-Fi doesn’t get re-tuned for every event. “We try to find the sweet spot and not do that too much. On an event day, we try not to touch anything that isn’t broken,” he said.

Previews of coming attractions

Like any sports and entertainment IT exec, King is looking at ways to improve the fan experience and derive more performance and revenue from Olympia’s IT investment. Buoyed by the success of mobile ticketing at LCA, King said he’d like to find some way to use biometrics to help speed up transactions at counters and pedestals throughout the arena. And he’s excited about 5G cellular deployment, which he believes could compete with Wi-Fi if 5G delivers on all that’s been promised by carriers.

LCA’s app uses Bluetooth for navigation, letting fans input their seat information for directions. “Right now, we have pre-order pickup, but in-seat service is something we’re looking at. What other line-busting technologies can we do?” King said.

And while fans can pre-order food and beverages at LCA, King also wonders if pre-ordering of team merchandise (“merch”) is something that would appeal to fans and be easy to execute. “We’re looking at a Cincinnati venue where they have compartments for food, hot or cold, that’s been pre-ordered,” he said, wondering if a similar compartmentalized pickup system could be used for merch.

King sees plenty of room for improvement in overall management reporting across IT systems at LCA and the 12,000 active ports that keep systems humming.

“Everything is connected and our electricians can use their iPads to dim or turn on lights anywhere in the building,” he said, adding that everything’s monitored — every switch, every port. “It would be nice to see more information around traffic flow and performance patterns. We’re seeing a little bit of that. But I’d like to see network information on people tracking and doors, and correlate visual information with management data.”

Another set of metrics King can’t get at the moment: Performance data from AT&T, T-Mobile and Verizon about LCA’s 8-zone DAS system. King said he’s talking with Verizon, the lead DAS operator at the venue, about getting automated reports in the future, but for the time being King and his team don’t have much visibility there. The DAS uses the Corning ONE system.

Super Bowl recap: 24 TB for Wi-Fi, 12 TB for DAS

Pats fans celebrate with a selfie at the end of Super Bowl 53. Credit all photos: Mercedes-Benz Stadium (click on any picture for a larger image)

Super Bowl 53 at Atlanta’s Mercedes-Benz Stadium rewrote the record book when it comes to single-day stadium Wi-Fi, with 24.05 terabytes of traffic seen on the stadium’s network. That is a huge leap from the official 16.31 TB seen at last year’s Super Bowl 52 in Minneapolis at U.S. Bank Stadium.

According to official statistics provided by Extreme Networks, new high-water marks were set last Sunday in every category of network measurement, including an amazing 48,845 unique users on the network, a take rate of 69 percent out of the 70,081 who were in attendance to watch the New England Patriots beat the Los Angeles Rams 13-3. The average Wi-Fi data use per connected fan also set a new record, with the per-fan mark of 492.3 megabytes per user eclipsing last year’s mark of 407.4.
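The headline figures are easy to sanity-check against each other. A quick back-of-the-envelope script using the numbers above (decimal units assumed, i.e. 1 TB = 1,000,000 MB):

```python
# Sanity check of the Super Bowl 53 Wi-Fi figures quoted above.
# Assumes decimal units (1 TB = 1e6 MB), which matches the reported per-fan number.

total_tb = 24.05       # total Wi-Fi traffic, terabytes
unique_users = 48_845  # unique devices seen on the network
attendance = 70_081    # announced attendance

take_rate = unique_users / attendance
per_fan_mb = (total_tb * 1_000_000) / unique_users

print(f"Take rate: {take_rate:.1%}")          # 69.7%
print(f"Per-fan usage: {per_fan_mb:.1f} MB")  # 492.4 MB
```

The computed per-fan figure lands within rounding of the reported 492.3 MB (the small gap comes from the TB total itself being rounded).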

While fans might have preferred some more scoring excitement during the game, the lack of any tense moments in network operations was a perfect outcome for Danny Branch, chief information officer for AMB Sports & Entertainment.

“I was ecstatic on how [the network] executed, but honestly it was sort of uneventful, since everything went so well,” said Branch in a phone interview the week after the game. Though network performance and fan usage during some of the big events leading up to the Super Bowl had Branch thinking the Wi-Fi total number might creep near the 20-terabyte range, the early network use on game day gave Branch a clue that the final number might be even higher.

“When I saw the initial numbers that said we did 10 [terabytes] before kickoff we didn’t know where it would end,” Branch said. “When we were watching the numbers near the end of the game, we were just laughing.”

Aruba APs and AmpThink design shine

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new Wi-Fi and DAS networks being planned for the University of Colorado, as well as a profile of Wi-Fi at Little Caesars Arena in Detroit! DOWNLOAD YOUR FREE COPY now!

Digital device use once again set records at the NFL’s championship game.

With some 1,800 APs installed inside Mercedes-Benz Stadium (most of the bowl seating APs are located underneath the seats), the Wi-Fi network, built on gear from Aruba, a Hewlett Packard Enterprise company, in a design from AmpThink, also saw a peak throughput rate of 13.06 Gbps at halftime. The peak number of concurrent network users, 30,605, also came during the halftime show, which featured the band Maroon 5 (whose show played to mixed reviews).

Extreme Networks, which provides Wi-Fi analysis in a sponsorship deal with the NFL, had a great list of specific details from the event. Here are some of the top-line stats:

Need proof that people still watch the game? Out of the 24.05 TB total, Extreme said 9.99 TB of the traffic took place before the kickoff, followed by 11.11 TB during the game and halftime, and another 2.95 TB after the game concluded.

On the most-used apps side, Extreme said the most-used social apps were, in order of usage, Facebook, Instagram, Twitter, Snapchat and Bitmoji; on the streaming side, the most-used apps were iTunes, YouTube, Airplay, Spotify and Netflix. The most-used sporting apps by fans at the game were, in order, ESPN, NFL, the Super Bowl LIII Fan Mobile Pass (the official app for the game), CBS Sports (which broadcast the game live) and Bleacher Report.

Did Verizon’s offload spike the total?

While Super Bowl Wi-Fi traffic has grown significantly each year since we started reporting the statistics, one reason for the bigger leap this year may be that Verizon Wireless used its sponsorship relationship with the NFL to acquire its own SSID on the Mercedes-Benz Stadium Wi-Fi network.

Hard copy signage in the stadium helped direct fans to the Wi-Fi.

According to Andrea Caldini, Verizon vice president for network engineering in the Eastern U.S., Verizon had “autoconnect in play,” which meant that any Verizon customer with Wi-Fi active on their devices would be switched over to Wi-Fi when inside the stadium.

“It’s going to be a good offload for us,” said Caldini in a phone interview ahead of the Super Bowl. While Verizon claimed to have seen “record cellular traffic” as well during Super Bowl Sunday, a spokesperson said Verizon will no longer release such statistics from the game.

According to Branch, the NFL helped fans find the Wi-Fi network with additional physical signage that was put up just for the Super Bowl, in addition to rotating messages on the digital display screens around the stadium.

“The venue was well signed, we really liked what they [the NFL] did,” Branch said. Branch said the league also promoted the Wi-Fi link throughout the week, with a common ID at all the related Super Bowl activity venues, something that may have helped fans get connected on game day.

No issues with the DAS

One part of the wireless mix at Mercedes-Benz Stadium, the cellular distributed antenna system, was under scrutiny after a lawsuit emerged last fall in which technology supplier IBM sued Corning over what IBM said was a faulty installation. While Corning has disputed the claims, over the past year IBM, the Falcons and the NFL all said they got the DAS in working order, and according to Branch, “all the carriers were pleased” with its operation during the Super Bowl.

There was only one, but it helped increase the wireless traffic.

According to Branch, the Falcons saw 12.1 TB of traffic on the in-stadium DAS on Super Bowl Sunday, including some traffic that went through the Matsing Ball antennas. Branch said the two Matsing Balls, which hang from the rafters around the Halo Board video screen, were turned back on to assist with wireless traffic on the field during the postgame awards ceremony.

Overall, the record day of Wi-Fi traffic left Branch and his team confident their infrastructure is ready to support the wireless demands of more big events into the future, including next year’s NCAA men’s Final Four.

“Until you’ve taken the car around the track that fast, you don’t really know how it will perform,” Branch said. “But so much work was done beforehand, it’s great to see that it all paid off.”

PGA Tour gives CBRS a test

Volunteers track shots with lasers on the fairways of PGA Tour tournaments. Credit: Chris Condon/PGA TOUR (click on any photo for a larger image)

CBRS technology doesn’t need spiked shoes to gain traction on the fairways, if early results from technology tests undertaken by the PGA Tour at courses around the country are any indication.

A recent 14-state test run by the top professional U.S. golf tour tapped the newly designated Citizens Broadband Radio Service (CBRS), which comprises 150 MHz of spectrum in the 3.5 GHz band. Golf courses, which typically lack the dense wireless coverage of more populated urban areas, are easily maxed out when thousands of fans show up on a sunny weekend to trail top-ranked players like Brooks Koepka, Rory McIlroy or perennial favorite Tiger Woods.

To cover the bandwidth needs of tournaments, the PGA Tour has over time used a mix of technologies, many portable in nature given a tournament’s short stay at any given course. As with the Wi-Fi and temporary cellular infrastructure used in the past, the hope is that CBRS will help support the public safety, scoring and broadcast applications required to keep its events operating smoothly and safely, according to the PGA Tour.

“We’re looking at replacing our 5 GHz Wi-Fi solution with CBRS so we can have more control over service levels,” said Steve Evans, senior vice president of information systems for the PGA Tour. Unlike 5 GHz Wi-Fi, CBRS uses licensed spectrum and is less prone to the interference the Tour occasionally experienced.

CBRS will also make a big difference with the Tour’s ShotLink system, the wireless system the PGA Tour uses to gather data on every shot made during competition play: distance, speed and other scoring data.

“CBRS would help us get the data off the golf course faster” than Wi-Fi can, Evans explained. “And after more than 15 months of testing we’ve done so far, CBRS has better coverage per access point than Wi-Fi.”

The preliminary results are so encouraging that the Tour is also looking to CBRS to carry some of its own voice traffic and has already done some testing there. “We need to have voice outside the field of play, and we think CBRS can help solve that problem,” Evans added.

But as an emerging technology, it’s important to acknowledge the limitations of CBRS. Compatible handsets aren’t widely available; the PGA Tour has been testing CBRS prototypes from Essential. Those units only operate in CBRS bands 42 and 43; a third, band 48, is expected to be added by device makers sometime in the first half of 2019.

“We’re waiting for the phones to include band 48 and then we’ll test several,” Evans told Mobile Sports Report. “I expect Android would move first and be very aggressive with it.”

CBRS gear mounted on temporary poles at a PGA Tour event. Credit: PGA Tour

The PGA Tour isn’t the only sports entity looking at CBRS’s potential. The National Football League is testing coach-to-coach and coach-to-player communications over CBRS at all the league’s stadiums; the NBA’s Sacramento Kings are testing it at Golden 1 Center with Ruckus; NASCAR has been testing video transmission from inside cars using CBRS along with Nokia and Google; and the ISM Raceway in Phoenix, Ariz., recently launched a live CBRS network that it is currently using for backhaul to remote parking lot Wi-Fi hotspots.

Outside of sports and entertainment, FedEx, the Port of Los Angeles and General Electric are jointly testing CBRS in Southern California. Love Field Airport in Dallas is working with Boingo and Ruckus in a CBRS trial; service provider Pavlov Media is testing CBRS near the University of Illinois at Urbana-Champaign with Ruckus gear. Multiple service providers from the telecom, cable and wireless industries are also testing the emerging technology’s potential all around the country.

Where CBRS came from, where it’s going

Editor’s note: This profile is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new game-day digital fan engagement strategy at Texas A&M, as well as a profile of Wi-Fi at Mercedes-Benz Stadium, home of Super Bowl LIII in Atlanta! DOWNLOAD YOUR FREE COPY now!

CBRS has undergone a 6-year gestation period; 150 MHz worth of bandwidth was carved out of the 3.5 GHz spectrum, which must be shared with – and not interfere with – U.S. government radar operations already using that same band.

From a regulatory perspective, CBRS’s experimental status is expected to give way to full commercial availability in the near future. Consequently, wireless equipment vendors have been busy building – and marketing – CBRS access points and antennas for test and commercial usage. But entities like the PGA Tour have already identified the benefits and aren’t waiting for the FCC to confer full commercial status on the emerging wireless technology.

CBRS equipment vendors and would-be service providers were hard to miss at last fall’s Mobile World Congress Americas meeting in Los Angeles. More than 20 organizations – all part of the CBRS Alliance – exhibited their trademarked OnGo services, equipment and software in a day-long showcase event. (Editor’s note: “OnGo” is the alliance’s attempt to “brand” the service as something more marketable than the geeky CBRS acronym.)

The CBRS Alliance envisions five potential use cases of the technology, according to Dave Wright, alliance president and director of regulatory affairs and network standards at Ruckus:
• Mobile operators that want to augment the capacity of their existing spectrum
• Cable operators looking to expand into wireless services without paying for a mobile virtual network operator (MVNO) agreement
• Other third-party providers looking to offer fixed broadband services
• Enterprise and industrial applications: extending or amplifying wireless in business parks and remote locations, and Internet of Things data acquisition
• Neutral host capabilities, which some have likened to LTE roaming – an important development as 5G cellular services ramp up

Previously, if customers wanted to extend cell coverage inside a building or a stadium, their best option was often distributed antenna systems (DAS). But DAS is complicated, expensive and relies on carrier participation, according to Wright. “Carriers also want to make sure your use of their spectrum doesn’t interfere with their macro spectrum nearby,” he added.

CBRS uses discrete spectrum not owned by a mobile operator, allowing an NFL franchise, for example, to buy CBRS radios and deploy them around the stadium, exclusively or shared, depending on its requirements and budget.

More CBRS antenna deployment. Credit: PGA Tour

On a neutral host network, a mobile device would query the LTE network to see which operators are supported. The device would then exchange credentials with the mobile carriers – CBRS and cellular – after which permissions are granted, the user is authenticated, and their usage info gets passed back to the carrier, Wright explained.
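The attach sequence Wright describes can be sketched roughly as follows. This is a minimal illustration of the flow – query, credential exchange, authentication, usage reporting – and every name and check in it is hypothetical, not drawn from any real CBRS stack:

```python
# Hypothetical sketch of a neutral-host CBRS attach flow.
# All function names, IMSI checks and return shapes are illustrative only.

def authenticate_with_carrier(imsi: str, operator: str) -> bool:
    # Stand-in for the real credential exchange with the home carrier;
    # here we just pretend any U.S. IMSI (MCC 310) authenticates.
    return imsi.startswith("310")

def neutral_host_attach(device_imsi: str, supported_operators: set, home_operator: str) -> dict:
    """Walk a device through the neutral-host attach sequence."""
    # 1. Device queries the LTE broadcast to see which operators are supported
    if home_operator not in supported_operators:
        return {"attached": False, "reason": "home operator not supported"}
    # 2. Credentials are exchanged and the user is authenticated by the home carrier
    if not authenticate_with_carrier(device_imsi, home_operator):
        return {"attached": False, "reason": "authentication failed"}
    # 3. Permissions granted; usage records flow back to the home carrier
    return {"attached": True, "usage_reported_to": home_operator}

print(neutral_host_attach("310150123456789", {"AT&T", "Verizon"}, "AT&T"))
```

The point of the sketch is the ordering: the operator query happens before any credential exchange, and usage reporting back to the home carrier is what makes the "roaming-like" billing model work.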

With the PGA Tour tests, the Essential CBRS devices get provisioned on the network, then connect to the CBRS network just like a cell phone connects to public LTE, Evans explained. The Tour’s custom apps send collected data back to the Tour’s network via the CBRS access point, which is connected to temporary fiber the Tour installs. And while some of Ruckus’s CBRS access points also support Wi-Fi, the Tour uses only the CBRS radios. “When we’re testing, we’re not turning Wi-Fi on if it’s there,” Evans clarified.

While the idea of “private LTE” networks supported by CBRS is gaining lots of headline time, current deployments would require a new SIM card for any device wanting to use the private CBRS network, something that may slow adoption until programmable SIM cards move from good idea to reality. But CBRS networks could also be used for local backhaul, with Wi-Fi connecting to client devices – a tactic currently in use at ISM Raceway in Phoenix.

“It’s an exciting time… CBRS really opens up a lot of new opportunities,” Wright added. “The PGA Tour and NFL applications really address some unmet needs.”

CBRS on the Fairways

Prior to deploying CBRS access points at a location, the PGA Tour surveys the tournament course to create a digital image of every hole, along with other data to calculate exact locations and distances between any two coordinates, like the tee box and the player’s first shot or the shot location and the location of the hole. The survey also helps the Tour decide how and where to place APs on the course.
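The distance side of that survey math is straightforward great-circle arithmetic between two GPS coordinates. Here is a minimal sketch using the standard haversine formula – the coordinates and function name are illustrative, not the Tour’s actual survey software:

```python
# Great-circle distance between two survey points (haversine formula).
# Coordinates below are made up for illustration.
from math import radians, sin, cos, asin, sqrt

def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Distance in meters between two lat/long coordinates."""
    R = 6371000  # mean Earth radius, meters
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# e.g., a hypothetical tee box to a first-shot location
tee = (40.0075, -105.2659)
shot = (40.0089, -105.2671)
print(f"{distance_m(*tee, *shot):.0f} m")
```

Run repeatedly against surveyed reference points, this is the same kind of calculation that turns raw coordinates into the tee-to-shot and shot-to-hole distances the system reports.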

Courses tend to be designed in two different ways, according to the PGA Tour’s Evans. With some courses, the majority of the holes are adjacent to each other, creating a more compact course; other courses are routed through neighborhoods and may snake around, end to end.

“In the adjacent model, which is 70 percent of the courses we play, we can usually cover the property with about 10 access points,” Evans explained.

Adjacent-style courses where the PGA Tour has tested CBRS include Ridgewood Country Club in Paramus, N.J.; Aronimink Golf Club in Newtown Square, Penn.; and East Lake Golf Club in Atlanta.

In the second model, where the holes are strung back to back, the PGA Tour may have to deploy as many as 18 or 20 APs to get the coverage and throughput it needs. That’s the configuration used during a recent tournament at the TPC Summerlin course in Las Vegas, Nev., Evans told Mobile Sports Report.

On the course, CBRS APs get attached to some kind of structure where possible, Evans added. “Where that doesn’t make sense, we have portable masts we use – a tripod with a pole that goes up 20 feet,” he said. The only reason he’d relocate an AP once a tournament began is if it caused a problem with the competition or fan egress. “We’re pretty skilled at avoiding those issues,” he said.

A handful of PGA Tour employees operates the ShotLink system, which also relies on an army of volunteers – as many as 350 at each tournament – who help with data collection and score updates (that leaderboard doesn’t refresh itself!). “There’s a walker with each group, recording data about each shot. There’s technology for us on each fairway and green, and even in the ball itself, as the ball hits the green and as players hit putts,” said Evans.

The walker-volunteers relay their data back to a central repository; from there, ShotLink data gets sent to PGA Tour management and is picked up by a variety of organizations, from onsite TV broadcast partners and the pgatour.com website to players, coaches and caddies, print media and mobile devices.
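A per-shot record relayed back to that repository might look something like the sketch below. To be clear, the field names and JSON shape here are entirely hypothetical – ShotLink’s real schema is not public in the source – but it shows the kind of structured payload a walker’s device would push over the CBRS (or Wi-Fi) link:

```python
# Hypothetical shape of a per-shot record; not ShotLink's actual schema.
from dataclasses import dataclass, asdict
import json

@dataclass
class ShotRecord:
    player_id: str
    hole: int
    shot_number: int
    distance_yards: float
    lat: float
    lon: float

record = ShotRecord("player-123", 7, 2, 186.4, 40.0089, -105.2671)
payload = json.dumps(asdict(record))  # serialized for relay to the central repository
print(payload)
```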

In addition to pushing PGA Tour voice traffic over to CBRS, the organization is also looking to the technology to handle broadcast video. “We think broadcast video capture could become a [CBRS] feature,” Evans said. The current transport method, UHF video, is a low-latency way to get video back to a truck where it can be uploaded for broadcast audiences.

A broadcast program produced by the organization, PGA Tour Live, follows two groups on the course; each group has four cameras, and producers cut between each group and each camera. That video needs to be low latency and highly reliable, but is expensive to deliver via UHF transmission.

Until 5G standards are created for video capture, the PGA Tour could use public LTE to bond a number of cell signals together. Unfortunately, that method has higher latency. “It’s fine for replay but not for live production,” Evans said, though the approach is expected to eventually improve performance-wise. “The idea is eventually to move to outside cameras with CBRS and then use [CBRS] for data collection too,” he added. “If we could take out the UHF cost, it would be significant for us.”

In the meantime, the Tour will continue to rely largely on Cisco-Meraki Wi-Fi and use Wi-Fi as an alternate route if something happens to CBRS, Evans said. “But we expect CBRS to be primary and used 99 percent of the time.”

Super Bowl cellular report: AT&T, Sprint combine for almost 50 TB of game-day traffic

An under-seat DAS antenna in the 300 seating section at Mercedes-Benz Stadium. Credit: Paul Kapustka, MSR

Let the cellular traffic reports begin! AT&T is the first to report numbers for our annual unofficial tabulation of wireless traffic from the Super Bowl, with 11.5 terabytes of data in and around Atlanta’s Mercedes-Benz Stadium for Sunday’s Super Bowl 53.

While the New England Patriots’ 13-3 victory over the Los Angeles Rams can and will be debated for its entertainment value (or lack thereof), fans attending the “bucket list” event apparently upheld the trend of ever-growing mobile wireless traffic. AT&T also saw a total of 23.5 TB of traffic on its network within a 2-mile radius of the stadium Sunday. Both the near-stadium and wider metro numbers were records for AT&T; it had previously seen a high of 9.8 TB of near-stadium traffic at Super Bowl 51 in Houston, and a wider metro total of 21.7 TB last year at Super Bowl 52 in Minneapolis.

Next in with numbers is Sprint, which said it saw 25 TB of traffic “in and around” the stadium on game day; with Sprint, however, this figure usually covers the bigger geographic area of downtown around the stadium, not just inside and directly outside the venue. Right now Sprint is declining to provide any more granularity on the size of its reporting area “for competitive reasons,” so feel free to speculate whether the 25 TB comes from network activity actually close to the stadium or includes all of downtown Atlanta.

It’s worth noting that Sprint’s reported total grew from 9.7 TB last year to 25 TB this year. That puts the big-area total at 48.5 TB, and that is all the reporting we are going to get this year. A Verizon spokesperson said that while the company saw “record-breaking” traffic at the event, it has “decided to no longer release specific performance statistics around this event.” T-Mobile also declined to provide any traffic figures.
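The arithmetic behind that combined figure is simple enough to verify – it is just the sum of the two carriers that reported, and the labels in this tally come straight from the figures above:

```python
# Tallying the reported Super Bowl 53 game-day cellular totals (in terabytes).
# Verizon and T-Mobile declined to report, so the real total is certainly higher.
reported_tb = {
    "AT&T (2-mile metro radius)": 23.5,
    "Sprint ('in and around' the stadium)": 25.0,
}
total_tb = sum(reported_tb.values())
print(f"Reported metro total: {total_tb} TB")  # 48.5 TB from just two carriers
```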

Sprint did have more to say this year about upgrading Atlanta-area infrastructure, adding its massive MIMO technology in an effort to boost performance.

Even without actual numbers from Verizon or T-Mobile it’s clear that last year’s total of 50.2 TB of total metro cellular traffic was most likely surpassed, by a huge margin.

Wi-Fi numbers for Super Bowl 53, reported Friday at 24.05 TB, are an indication that traffic overall is still climbing year to year, with no ceiling in sight.

Going into Sunday’s game there had been some lingering questions about whether the Mercedes-Benz Stadium DAS would hold up to the demands, given that its initial deployment is now the subject of a lawsuit between IBM and Corning. As usual, all the wireless carriers said they had made substantial improvements to infrastructure in the stadium and the surrounding metro Atlanta area ahead of the game to make sure Super Bowl visitors stayed connected; for now, it appears any DAS issues were corrected before kickoff.

An interesting factoid from AT&T: the carrier said it saw more than 237 GB of data cross its network within 15 minutes at halftime. Sprint also said it saw the most data cross its network at halftime. More as we hear more! Any in-person reports are welcome as well.