Mercedes-Benz Wi-Fi (and DAS) ready for Super Bowl LIII

Mercedes-Benz Stadium’s Wi-Fi network is ready for its moment in the Super Bowl sun. Credit all photos: Paul Kapustka, MSR

With less than two weeks to go before Mercedes-Benz Stadium hosts Super Bowl LIII, there’s no longer any doubt that the venue’s Wi-Fi network should be ready for what is historically the biggest Wi-Fi traffic day of the year.

Oh, and that DAS network you were wondering about? It should be fine too, but more on that later. On a recent game-day visit to the still-new roost of the NFL’s Atlanta Falcons (and the latest MLS champions, Atlanta United), Mobile Sports Report found that the stadium’s Wi-Fi network, using gear from Aruba, a Hewlett-Packard Enterprise company, in a design by AmpThink for lead technology provider IBM, was strong on all levels of the venue, including some hard-to-reach spots in the building’s unique layout.

And in our game-day interview with Danny Branch, chief information officer for AMB Sports & Entertainment, we also finally got some statistics about Wi-Fi performance that should put any Super Bowl capacity fears to rest. According to Branch, Mercedes-Benz Stadium saw 12 terabytes of Wi-Fi used during the College Football Playoff Championship Game on Jan. 8, 2018, the second-highest single-game Wi-Fi total we’ve seen, beaten only by the 16.31 TB recorded at Super Bowl LII on Feb. 4, 2018, at U.S. Bank Stadium in Minneapolis.

“We’re confident, and we’re ready for the Super Bowl,” said Branch about his stadium’s network preparedness, during an interview before the Dec. 2 Falcons home game against the visiting Baltimore Ravens. The night before our talk, Mercedes-Benz Stadium had hosted the SEC Championship Game, where a classic comeback by Alabama netted the Tide a 35-28 win over Georgia, while fans packing the stadium used another 8.06 TB of Wi-Fi data, according to Branch.

Along with lawsuit, DAS gets 700 new antennas

Editor’s note: This profile is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new game-day digital fan engagement strategy at Texas A&M, as well as a profile of Wi-Fi at the renovated State Farm Arena in Atlanta! DOWNLOAD YOUR FREE COPY now!

An under-seat DAS antenna in the 300 seating section at Mercedes-Benz Stadium

The Wi-Fi totals revealed by Branch were the first such statistics reported by Mercedes-Benz Stadium since its opening in August 2017. While the initial absence of reports was chalked up to some kind of Southern modesty, MSR had been hearing back-channel industry questions about the wireless coverage in the venue since its opening, particularly about the performance of the DAS network.

Those whispers finally became public when IBM filed a lawsuit on Oct. 31 in the U.S. District Court in Atlanta, alleging that subcontractor Corning had failed to deliver a working DAS. In its complaint, IBM said the DAS had not worked correctly throughout 2017, and that IBM had to spend large amounts of money to fix it. Corning has since countered with its own legal claims, asking that IBM’s claims be dismissed.

While that battle is now left to the lawyers, inside the stadium, Branch said in December that the DAS was getting its final tuning ahead of the Super Bowl. In addition to (or as part of) the IBM DAS improvements, Branch said that an additional 700 under-seat DAS antennas have been installed in the seating bowl. In our walk-around review during the Falcons’ game, MSR noticed multiple DAS antenna placements that seemed to be new since our last visit in August of 2017, during the stadium’s press day.

“IBM addressed the DAS issues, and we’re in a good place,” said Branch. The NFL’s CIO, Michelle McKenna, also gave her office’s approval of the readiness of the Mercedes-Benz Stadium networks in a separate phone interview. And MSR even got to witness a live opening of the stadium’s unique camera-shutter roof, another technology that ran into some bugs during last year’s football season but now appears to be working.

Selfies and speedtests

So how do the networks perform at a live event? The short answer is, on the Wi-Fi side we saw steady speeds wherever we tested, typically in a range between 20 Mbps on the low end and 60+ Mbps on the high end, for both download and upload. On the DAS side, our Verizon phone saw a wide range of speed results, from some single-digit marks all the way up to 99 Mbps in one location; so perhaps the best answer is that on cellular, your speedtest may vary, but you will most likely always have a strong enough signal for just about any task you might want to perform at a stadium, even on Super Sunday.

All four major wireless carriers, Verizon, AT&T, Sprint and T-Mobile, use the Mercedes-Benz Stadium DAS. You can also expect all of them to beef up local bandwidth with a combination of permanent and temporary upgrades, to ensure good connectivity throughout downtown Atlanta during Super Bowl week. Sprint and AT&T have already made announcements about their local upgrades, and we are sure Verizon and T-Mobile will follow suit soon.

The iconic ‘halo board’ video screen below the unique roof opening at Atlanta’s Mercedes-Benz Stadium.

Though we didn’t get any tests during the brief on-field part of our tour, Branch did point out some Wi-Fi APs on the sidelines for media access. Mercedes-Benz Stadium also now has a pair of MatSing ball antennas perched way up near the roof openings, to help with cellular coverage down to the sidelines.

MSR started our speedtest tour in the place where most Falcons fans probably pull out their phones, in front of the metal falcon structure outside the main entry gate. Even with digital ticketing activities taking place close by and groups of fans taking selfies in front of the bird, we still got a high Wi-Fi test of 35.8 Mbps on the download side and 41.6 Mbps on the upload. On cellular our top speeds in the same area were 3.94 Mbps / 17.2 Mbps.

Just inside the stadium doors from the Falcon is what the team calls the stadium’s “front porch,” an extended concourse with a clear view down to the field. On the Sunday we visited there was a stage with a DJ and rapping crew providing pregame entertainment, in front of two of the stadium’s more distinctive Daktronics digital displays, the 101-foot-tall “Mega Column” and the 26-foot-tall (at its highest point) triangular “Feather Wall” display, which frame part of the porch.

In the middle of a slowly moving crowd that was taking selfies in multiple directions, MSR still got good connectivity, with Wi-Fi speeds of 22.4 Mbps / 12.3 Mbps, and a cellular mark of 5.38 Mbps / 12.0 Mbps. As far as we could see, the wide-open space was being served by antennas mounted on walls on two sides of the opening.

Bridges, nosebleeds and concourses

Looking for some tough-to-cover spots, we next headed to one of the two “sky bridges,” narrow walkways that connect over the main entry on both the 200 and 300 seating levels. Out in the exact middle of the 200-level sky bridge we still got a Wi-Fi test of 14.6 Mbps / 8.19 Mbps; cellular checked in at 4.07 Mbps / 4.59 Mbps.

For some more fan-friendly speeds we wandered in front of the nearby concourse watering hole, the Cutwater Spirits bar, where our Wi-Fi signal tested at 35.8 Mbps / 42.4 Mbps, and the DAS signal (directly in front of an antenna mounted above the concourse) reached 99.2 Mbps / 25.4 Mbps even with heavy foot traffic coming by.

The roof opens at Mercedes-Benz Stadium.

Right before kickoff, we wandered into the top sections of the Falcons’ new roost, where about halfway up in section 310 (near the 50-yard line) we got Wi-Fi speeds of 11.6 Mbps / 1.86 Mbps, and cellular speeds of 13.1 Mbps / 2.50 Mbps, during the height of the on-field pregame festivities. In that section and in others we walked through, fans were busy on their phones during pregame, with many watching live video.

One interesting technology note: The stadium’s unique Daktronics halo video board, a 58-foot-high screen that circles around underneath the roof, is partially obscured in the uppermost sideline seats. But that’s pretty much the only place you aren’t wowed by the screen’s spectacle, which from most of the rest of the stadium offers multiple-screen views no matter where you are looking up from.

One final speedtest on the 300-level concourse saw Wi-Fi speeds of 35.8 Mbps / 38.2 Mbps, while another one of those new-looking DAS antennas gave us a speed test of 77.0 Mbps / 21.4 Mbps. During the third quarter we visited the AT&T Perch, a section above the end zone opposite the entry porch, where large displays with multiple TV screens and even some recliner-type chairs let fans get their other-game viewing in while inside the stadium. Wi-Fi in the Perch tested at 42.1 Mbps / 61.0 Mbps.

Fans are finding the Wi-Fi

Though we haven’t yet seen any more detailed network use statistics, like unique game-day connections or peak concurrent connections for any events, Branch said fans are definitely finding the network. Sponsored by AT&T with an “ATTWifi” SSID, there is no landing page or portal for the network asking for any information — once fans find the network and connect, they’re on.

Helping fans find the network may be even more necessary at the Super Bowl.

“In the first year we didn’t promote it [the Wi-Fi] heavily, because we were making sure everything worked well,” Branch said. But this year, he said the team has been promoting the network in emails to season ticket holders, and with video board messages on game days. At a high school football weekend this past fall, Branch said the Falcons saw 75 percent of attendees connect to the Wi-Fi network.

“AmpThink and Aruba did a really good job” on the Wi-Fi network, Branch said. “I love it when my friends tell me how fast the Wi-Fi is.”

By adding solid wireless connectivity to the host of other amenities found inside Mercedes-Benz Stadium — including fan-friendly food and drink prices that are simply the lowest you’ll see anywhere — Branch said he felt like the Falcons’ ownership had succeeded in creating a venue that was “an experience,” where fans would want to come inside instead of tailgating until the last minute.

With the Super Bowl looming on the horizon, Branch knows there’s still no rest until the game is over, with new challenges ahead. The Sunday we visited, the Falcons debuted a new footbridge over the road outside the back-door Gate 1 entry, and Branch knows there will be networking challenges to make sure fans can still connect when the NFL erects its Super Bowl security perimeter far out from the actual stadium doors.

“Our motto is be prepared for anything,” said Branch, noting that there is really no template or historical model for a building as unique as Mercedes-Benz Stadium.

“Sometimes it feels like you’re changing tires on a car going 100 miles per hour,” Branch said, only partially in jest. “But we’re confident we’ll be ready for the Super Bowl.”

The metal falcon is selfie central for visitors new and old

Wi-Fi and DAS antennas cover the ‘front porch’ landing area inside the main entry

Under-seat Wi-Fi AP enclosure

A shadowy look at one of the MatSing ball antennas in the rafters

The gear behind the under-seat DAS deployments

The view toward downtown

Texas A&M’s mobile browser end-around: How the Aggies and AmpThink changed the game-day fan engagement process

A look at the 12thmanlive.com site at a Texas A&M home game this past season. Credit: Texas A&M

In the short history of in-stadium mobile fan engagement, a team or stadium app has been the go-to strategy for many venue owners and operators. But what if that strategy is wrong?

You can always count on team and stadium apps to be introduced with a long list of bells and whistles, from in-seat food ordering and delivery to digital ticketing, instant replay options and venue wayfinding services. Yet after those apps are bought and released, very few teams or stadium-app vendors are willing to provide statistics on how those features are — or are not — being used. As such, the business benefits of almost every stadium app ever launched remain a mystery.

In fact, the only statistic that emerges with any regularity regarding stadium apps in their still-young lifetime is that their game-day usage usually trails that of general-purpose mobile applications by a large margin, far behind social media apps like Facebook, Snapchat, Twitter and Instagram, as well as email and text messaging. So why is the conventional wisdom of having a game-day app still so conventional?

To seek an answer to that question and in part to “question every underlying assumption” involving fan digital engagement, Texas A&M University partnered with AmpThink this fall on a wide-ranging experiment centered around using mobile web, as well as a captive Wi-Fi portal, to see if it was possible to find a better way to digitally engage fans, for far less than the cost of a custom app. And so far, it looks like they did.

Via its “12thmanlive.com” digital game-day program website and a gated entry to access the Wi-Fi network at Kyle Field, Texas A&M was able to gather more than 150,000 fan emails this football season, as well as 60,000-plus additional opt-ins for phone numbers, addresses and permission for more messages from the school. In addition to the marketing lead generation, a “Black Friday” ticket sale promotion, sent to fans who had opted in for more emails, produced 2,285 tickets sold for a late-season game against LSU, $137,100 in additional revenue that Texas A&M might not otherwise have realized.

And unlike app-based programs, the simple WordPress headless CMS behind 12thmanlive.com allowed for fast updates for content and graphics, letting AmpThink and Texas A&M customize the site’s look repeatedly, to test — and measure — the success or failure of different offers and promotions during the seven-game 2018 home season. The 12thmanlive.com program is already slated for more experiments during the basketball season, with an eye to covering as many of the school’s sports as possible.

‘Don’t treat it like plumbing’

Editor’s note: This profile is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the Wi-Fi network at Mercedes-Benz Stadium in Atlanta, as well as the renovated State Farm Arena, also in Atlanta! DOWNLOAD YOUR FREE COPY now!

It’s worthwhile to note here that such a forward-thinking experiment is not a huge surprise for the partnership of Texas A&M and AmpThink. While AmpThink may be best known for its expertise in large-venue Wi-Fi design (including at Texas A&M’s Kyle Field), the firm over the past few years has expanded into many other segments of the overall stadium connectivity market, including taking on full-stadium technology integration, optical fiber network design and deployment, enclosure design and manufacture, as well as digital-signage programming and related marketing activities. And Texas A&M was one of the first big stadiums to go all-in on fiber backbone connectivity for its Wi-Fi and DAS networks, which are still at the top level of performance three years after their debuts.

Initially, Texas A&M followed one emerging market strategy for engaging fans via its wireless networks: It didn’t require fans to give any identifying information (like email, or name and address) to connect. Some venues, like the Atlanta Falcons’ Mercedes-Benz Stadium, consider it a point of pride to make network connections as easy as possible, with no login information needed. In Atlanta, AT&T’s sponsorship of the Wi-Fi service makes it easier for the Falcons to offer it with no strings attached.

The team at Texas A&M concluded that teams should put a higher value on connectivity, since there aren’t any measurable business metrics to be found that prove that fans are happier or more engaged simply because they have “frictionless” access to Wi-Fi. And by allowing fans to use Wi-Fi anonymously, teams give away opportunities to generate a return on their technology investment.

“Some people say the network’s just plumbing, but they don’t say why,” AmpThink president Bill Anderson said in a recent interview. “Two or three years ago, having Wi-Fi with no hurdles and getting big usage numbers gave you something to brag about. But now, we’re seeing more teams ask, ‘are we getting any return on investment for our technology?’ ”

Texas A&M took the first step in that direction for the 2018 football season, introducing a Wi-Fi login portal that required a name and a valid email address to connect. While the school acknowledged the portal might lower overall Wi-Fi usage, it served Texas A&M’s goal of identifying more attendees, by granting access only to those willing to share some information.

For Texas A&M, using a Wi-Fi portal was an opportunistic business decision. With robust Wi-Fi and cellular networks at Kyle Field, fans who didn’t want to share their information for Wi-Fi had the choice of using the cellular DAS, which has superb coverage from multiple carriers, including Verizon, AT&T and T-Mobile.

Mobile web instead of an app

For the 2018 football season, Texas A&M added another twist: the debut of a new digital game-day program, called 12thmanlive.com, which uses HTML5 to create an app-like web page with a simple menu of activity buttons beneath a live scoreboard feed.

According to Pat Coyle, Texas A&M’s new senior associate athletic director and chief revenue officer, the mobile-web game-day program was another important cog in the school’s broader data collection and monetization strategy, which he paints as a “digital flywheel” where Texas A&M can use a multitude of data points to “adjust and improve service to our key customers.” But key to that strategy was getting live attendees to engage with the network in greater numbers than previously seen. Enter 12thmanlive.com.

What made 12thmanlive.com interesting from one perspective was not what it had, but what it didn’t have. With no app to download, the site was quickly available to anyone attending a game simply by entering the URL into a mobile-device browser. Its simple design (no photos or videos, for example) made it fast to load and easy to understand.

What the site did offer was a set of activities much different from those of most team or stadium apps, which generally focus on content or on interactive services like ticketing or loyalty programs. Among the 10 buttons on the site’s main interface were game-day rosters, a stats tracker and a way to send chat messages to stadium personnel; the site also included a number of sponsored promotions, including a giveaway contest for a helmet signed by new head coach Jimbo Fisher, future ticket giveaways, coupons for food and beverages, and a link to join the Wi-Fi network for fans who might have started on a cellular connection.

While team apps might seem the obvious vehicle for game-day interaction, Coyle said previous statistics from Kyle Field’s Wi-Fi network showed that fewer than 1 percent of fans used the school’s old downloadable app while attending a game.

With a web platform, the idea was that Texas A&M would have the ability to quickly add or change more game-day centric features and to integrate them with third-party services. But in the face of historic non-participation via the app, could Texas A&M and AmpThink get fans to click on a mobile website instead? And would it be worth the cost of trying?

A much cheaper experiment than an app

One obvious factor in the idea’s favor from the beginning was the low cost of development for a web-based project, especially when compared to that of a custom app. AmpThink estimates that most custom apps cost teams somewhere in the range of $1 million. Total costs for the 12thmanlive.com project were “in the mid-five figures,” according to the school, including not just the site and tools design but some “shoulder to shoulder” help from AmpThink during the season, according to Anderson.

A Kyle Field ribbon board advertises the stadium’s Wi-Fi network. Credit: Texas A&M

Launched at the start of the 2018 football season, the site was promoted in several ways, including messages on the big video board at Kyle Field as well as on smaller TV screens and ribbon boards throughout the stadium. The big screens also promoted individual contests, allowing fans to text a code word to a short numerical code, an action that would take them directly to the 12thmanlive.com site.

The Wi-Fi portal also helped, as a “welcome” email sent after a valid login to the network contained a prominent link to the 12thmanlive.com site.

Starting with the first game, the 12thmanlive.com site showed consistent user numbers, with an average visit total of approximately 8,500 fans per game over the seven-game season, close to 10 percent of all attendees and a 10x improvement over historic app interaction.

According to the school, Texas A&M started the season assuming it did not know exactly what fans wanted. The 12thmanlive.com site featured some interesting content, like a stadium clock that was close to real time and game-day rosters. But analysis of site visits found that this game-related content had almost zero dwell time and high abandonment rates. Contests and giveaways, however, saw very high engagement.

According to statistics provided by Coyle, a repeated contest to win a signed helmet was the most popular with 31,379 registrations over the seven games. That was followed in popularity by a milkshake coupon (14,261 registrations) and a free ticket contest (9,233 registrations).

Measurable and repeatable results

With the site only turned on during game days — and only promoted inside the stadium — the 12thmanlive.com efforts did not affect traffic to the team’s regular website, Coyle said.

Overall, the Wi-Fi portal and the 12thmanlive.com site garnered 156,543 total emails for Texas A&M, with 61,607 of those emails being new to the school’s database, according to figures from Coyle. Of that number, 44,894 came from the Wi-Fi portal, and another 16,713 unique emails came from registrations on 12thmanlive.com activities.

“While it’s natural to focus on 61,607 new records, the 156,543 number is also important,” said Coyle. “These are all fans who were anonymous but are now identified as ‘in attendance’ at particular games. Now we know more of the identities of folks who bought and attended games. So we can figure out which games the season ticket holders sold on secondary, for example.”

Coyle noted that Texas A&M’s overall strategy goes far beyond just the mobile web site, with power from the Wi-Fi network analytics also helping to spin the “flywheel.” For example, the school tested proximity marketing to educate fans about a new food stand on the 600 level of the stadium by using Wi-Fi location information to detect devices on that level, sending them an email promoting the food stand if they were registered in the system.

“We essentially used the Wi-Fi APs like beacons, and the difference is we didn’t need Bluetooth or a downloaded app to do this,” Coyle said.
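As a rough illustration of that AP-as-beacon flow, here is a hypothetical sketch; the data shapes and function name are invented for illustration and do not represent Texas A&M’s actual system or any Wi-Fi vendor’s API:

```python
def proximity_promo_targets(ap_associations, registry, target_level):
    """Return registered emails for devices currently associated with
    a Wi-Fi AP on the target concourse level."""
    return sorted(
        registry[mac]
        for mac, level in ap_associations.items()
        if level == target_level and mac in registry
    )

# Devices currently seen by the Wi-Fi location system, keyed by MAC address.
associations = {
    "aa:11": "600",  # on the 600 level and registered -> gets the promo
    "bb:22": "300",  # registered, but on a different level
    "cc:33": "600",  # on the 600 level, but never registered via the portal
}
# Devices whose owners shared an email through the Wi-Fi portal.
registry = {"aa:11": "fan@example.com", "bb:22": "other@example.com"}

print(proximity_promo_targets(associations, registry, "600"))
# ['fan@example.com']
```

The point of the sketch is the one Coyle makes: because registration happened at the Wi-Fi portal, targeting needs only the AP association data the network already has, with no Bluetooth beacons or downloaded app.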

When users who had previously logged in to the Wi-Fi network at an earlier game arrived for a new one, Coyle said the school was able to automatically trigger an email welcoming them back; other network data collected included arrival and departure times, as well as DNS information showing which other apps fans were using, Coyle said.

“All of these data are more valuable when we can connect them to real people,” Coyle said. “When we know who these people are, we can use the data to adjust and improve service to our key customers. This will enhance loyalty, and eventually, profits.”

For Anderson, further proof was the opt-in information fans were willing to share in the contests, giveaways and food coupon offers. On top of the email addresses, another 60,055 fans gave the school permission to send them follow-up marketing messages, a key indicator that people are willing to engage when they perceive value.

“Compared with other venues we work in, we saw better than expected opt-in rates,” AmpThink’s Anderson said. “I think it’s because Texas A&M gave fans a better value proposition.”

With actionable data already in hand, Texas A&M is iterating the 12thmanlive.com program for basketball season, with an eye toward next year’s football season and all the new ideas it can try. The WordPress content management strategy allows teams and schools to do much of the work themselves, since experience with WordPress is fairly widespread. In fact, Anderson said teams don’t even need to pick up the phone to call AmpThink, since what Texas A&M and AmpThink did is easily replicable from a DIY perspective.

“Anybody can just go out and get a good web person and build their own successes [with this model],” Anderson said.

Levi’s Stadium sees 5.1 TB of Wi-Fi data used at college football championship

Fans and media members at Monday night’s College Football Playoff championship game used a total of 5.1 terabytes of data on the Wi-Fi network at Levi’s Stadium, according to figures provided by the San Francisco 49ers, who own and run the venue.

With 74,814 in attendance for Clemson’s 44-16 victory over Alabama, 17,440 of those in the stands found their way onto the stadium’s Wi-Fi network. According to the Niners the peak concurrent connection number of 11,674 users was seen at 7:05 p.m. local time, which was probably right around the halftime break. The peak bandwidth rate of 3.81 Gbps, the Niners said, was seen at 5:15 p.m. local time, just after kickoff.

In a nice granular breakout, the Niners said about 4.24 TB of the Wi-Fi data was used by fans, while a bit more than 675 GB was used by the more than 925 media members in attendance. The Wi-Fi data totals were recorded during an 8-1/2 hour period on Monday, from 1 p.m. to 9:30 p.m. local time.

Added to the 3.7 TB of DAS traffic AT&T reported inside Levi’s Stadium Monday night, we’re up to 8.8 TB total wireless traffic so far, with reports from Verizon, Sprint and T-Mobile still not in. The top Wi-Fi number at Levi’s Stadium, for now, remains Super Bowl 50, which saw 10.1 TB of Wi-Fi traffic.

Can virtualization help venues meet growing mobile capacity demands?

By Josh Adelson, director, Portfolio Marketing, CommScope

U.S. mobile operators reported a combined 50 terabytes of cellular traffic during the 2018 Super Bowl, nearly double the previous year’s total. In fact, Super Bowl data consumption has doubled every year for at least the past six years, and it shows no sign of slowing down.

Clearly, fans love their LTE connections almost as much as they love their local team. Fans can choose between cellular and Wi-Fi, but cellular is the default connection, whereas Wi-Fi requires a manual connection step that many users may not bother with.[1] The same dynamic is playing out on a smaller scale in every event venue and commercial building.

Whether you are a venue owner, part of the team organization or in the media, this heightened connectivity represents an opportunity to connect more with fans, and to expand your audience to the fans’ own social connections beyond the venue walls.

But keeping up with the demand is also a challenge. High capacity can come at a high cost, and these systems also require significant real estate for head-end equipment. Can you please your fans and leverage their connectedness while keeping equipment and deployment costs from breaking the capex scoreboard?

Virtualization and C-RAN to the rescue?

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.

Enterprise IT departments have long since learned that centralizing and virtualizing their computing infrastructures has been a way to grow capacity while reducing equipment cost and space requirements. Can sports and entertainment venues achieve the same by virtualizing their in-building wireless infrastructures? To answer this question, let’s first review the concepts and how they apply to wireless infrastructure.

In the IT domain, virtualization refers to replacing multiple task-specific servers with a centralized resource pool that can be dynamically assigned to a given task on demand. Underlying this concept is the premise that, while each application has its own resource needs, at any given time only a subset will be active, so the total shared resource can be less than the sum of the individual requirements.
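That premise can be shown with a toy calculation; the demand numbers below are invented for illustration, not drawn from any real venue:

```python
# Per-area demand (in arbitrary baseband units) over four time slots.
demand = {
    "bowl":      [8, 9, 3, 2],
    "concourse": [2, 3, 8, 7],
    "retail":    [1, 1, 2, 6],
}

# Dedicated resources: each area must be provisioned for its own peak.
sum_of_peaks = sum(max(series) for series in demand.values())

# Shared pool: only the peak of the combined demand must be provisioned.
peak_of_sum = max(sum(slot) for slot in zip(*demand.values()))

print(sum_of_peaks, peak_of_sum)  # 23 15
```

Because the areas do not peak at the same time, the shared pool (15 units) comes in well under the sum of the individual peaks (23 units), which is the saving virtualization banks on.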

How does this translate to in-building wireless? Centralizing the base station function is known as C-RAN, which stands for centralized (or cloud) radio access network. C-RAN involves not only the physical pooling of the base stations into a single location — which is already the practice in most venues — but also digital architecture and software intelligence to allocate baseband capacity to different parts of the building in response to shifting demand.

C-RAN brings immediate benefits to a large venue in-building wireless deployment. The ability to allocate capacity across the venue via software rather than hardware adds flexibility and ease of operation. This is especially important in multi-building venues that include not only a stadium or arena but also surrounding administrative buildings, retail spaces, restaurants, hotels and transportation hubs. As capacity needs shift between the spaces by time of day or day of week, you need a system that can “point” the capacity to the necessary hot spots.

C-RAN can even go a step further to remove the head-end from the building campus altogether. Mobile network operators are increasingly deploying base stations in distributed locations known as C-RAN hubs. If there is a C-RAN hub close to the target building, then the in-building system can take a signal directly from the hub, via a fiber connection. Even if the operator needs to add capacity to the hub for this building, this arrangement gives them the flexibility to use that capacity in other parts of their network when it’s not needed at the building. It also simplifies maintenance and support as it keeps the base station equipment within the operator’s facilities.

For the building owner, this architecture can reduce the footprint of the on-campus head-end by as much as 90 percent. Once the baseband resources are centralized, the next logical step is to virtualize them into software running on PC server platforms. As it turns out, this is not so simple. Mobile baseband processing is a real-time, compute-intensive function that today runs on embedded processors in specialized hardware platforms. A lot of progress is being made toward virtualization onto more standard computers, but as of today, most mobile base stations are still purpose-built.

Perhaps more important for stadium owners is the fact that the base station (also called the signal source) is selected and usually owned by the mobile network operator. Therefore the form it takes has at most only an indirect effect on the economics for the venue. And whether the signal source is virtual or physical, the signal still must be distributed by a physical network of cables, radios and antennas.

The interface between the signal source and the distribution network provides another opportunity for savings. The Common Public Radio Interface (CPRI) establishes a digital interface that reduces space and power requirements while allowing the distribution network — the DAS — to take advantage of the intelligence in the base station. To leverage these advantages, the DAS also needs to be digital.

To illustrate this, consider the head-end for a 12-MIMO-sector configuration with 4 MIMO bands per sector, as shown below. In this configuration a typical analog DAS is compared with a digital C-RAN antenna system, with and without a CPRI baseband interface. In the analog system, points of interface (POIs) are needed to condition the signals from the different sources to an equal level before they are combined and converted to optical signals via an e/o (electrical-to-optical) transceiver. In a digital system, signal conditioning and digitization are integrated into a single card, providing significant space savings.

* A typical analog system requires 8 POIs per sector (two per MIMO band, 4 bands per sector) plus 2 e/o transceivers, for a total of 10 cards per sector. A typical subrack (chassis) supports 10 to 12 cards, so one subrack supports 1 MIMO sector. For 12 MIMO sectors, 12 subracks are needed, each typically 4 rack units high, for a total of 48 rack units.

* For a digital system[2] without CPRI, each subrack supports 32 SISO ports. Each MIMO sector with 4 MIMO bands requires 8 ports, so each subrack supports 4 MIMO sectors. For 12 MIMO sectors, 3 subracks of 5 rack units each are needed, for a total of 15 rack units.

* For a digital system with CPRI, each subrack supports 48 MIMO ports. Each MIMO sector with 4 MIMO bands requires 4 ports, so a single subrack supports all 12 MIMO sectors. Only 1 subrack of 5 rack units is needed, for a total of 5 rack units.
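The rack-unit arithmetic above can be worked through in a short script. This is a sketch for illustration only, using the card, port and subrack figures from the examples above (it is not a general dimensioning tool):

```python
# Head-end space comparison for 12 MIMO sectors, 4 MIMO bands per sector.
# Figures follow the worked example above and are illustrative only.

SECTORS = 12
BANDS_PER_SECTOR = 4

def analog_rack_units():
    # 2 POIs per MIMO band (one per MIMO stream) + 2 e/o transceivers
    cards_per_sector = 2 * BANDS_PER_SECTOR + 2   # 10 cards
    subracks = SECTORS                            # one 10-12 card subrack per sector
    return subracks * 4                           # each subrack is 4 rack units high

def digital_rack_units():
    ports_per_sector = 2 * BANDS_PER_SECTOR       # 8 SISO ports
    sectors_per_subrack = 32 // ports_per_sector  # 32 SISO ports per subrack -> 4
    subracks = -(-SECTORS // sectors_per_subrack) # ceiling division -> 3
    return subracks * 5                           # each subrack is 5 rack units high

def digital_cpri_rack_units():
    ports_per_sector = BANDS_PER_SECTOR           # 4 MIMO ports
    sectors_per_subrack = 48 // ports_per_sector  # 48 MIMO ports per subrack -> 12
    subracks = -(-SECTORS // sectors_per_subrack) # -> 1
    return subracks * 5

print(analog_rack_units())        # 48 rack units
print(digital_rack_units())       # 15 rack units
print(digital_cpri_rack_units())  # 5 rack units
```

The roughly 10x reduction from analog to digital-with-CPRI falls directly out of the port density per subrack.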

One commercial example of this is Nokia’s collaboration with CommScope to offer a CPRI interface between Nokia’s AirScale base station and CommScope’s Era C-RAN antenna system. With this technology, a small interface card replaces an array of remote radio units, reducing space and power consumption in the C-RAN hub by up to 90 percent. This also provides a stepping-stone to future Open RAN interfaces when they become standardized and commercialized.

The Benefits in Action — A Case Study

Even without virtualization, the savings from digitization, C-RAN and CPRI at stadium scale are significant. The table below shows a recent design that CommScope created for a large stadium complex in the U.S., comparing three alternatives: a traditional analog DAS, a digital C-RAN antenna system with RF base station interfaces, and a C-RAN antenna system with CPRI interfaces. Digital C-RAN and CPRI produce a dramatic reduction in space requirements.

The amount of equipment is reduced because a digital system does in software what analog systems must do in hardware, and CPRI eliminates still more hardware. Energy savings are roughly proportional to the space savings, since both are a function of the amount of equipment required for the solution.

Fiber savings, shown in the table below, are similarly significant.

The amount of fiber is reduced because digitized signals can be encoded and transmitted more efficiently.

But these savings are only part of the story. This venue, like most, is used for different types of events — football games, concerts, trade shows and even monster truck rallies. Each type of event has its own traffic pattern and timing. With analog systems, re-sectoring to accommodate these changes requires on-site physical re-wiring of head-end units. But with a digital C-RAN based system, it's possible to re-sectorize from anywhere through a browser-based, drag-and-drop interface.
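Conceptually, software re-sectoring is just a remapping of antenna zones to baseband sectors. The sketch below is purely illustrative (the zone and sector names are hypothetical, and a real system would apply the change through the vendor's controller, not a dict):

```python
# Hypothetical data model: re-sectoring as a zone -> sector remapping.
zones = ["bowl-north", "bowl-south", "concourse", "parking"]

# Football game: capacity spread across the seating bowl.
football = {"bowl-north": "sector-1", "bowl-south": "sector-2",
            "concourse": "sector-3", "parking": "sector-3"}

# Concert: a stage at the south end pulls most of the traffic,
# so the south bowl gets a dedicated sector.
concert = dict(football, **{"bowl-south": "sector-1",
                            "bowl-north": "sector-2"})

def resector(current, new_map):
    """Return only the zone -> sector changes the controller would apply."""
    return {z: new_map[z] for z in zones if current[z] != new_map[z]}

print(resector(football, concert))
# -> {'bowl-north': 'sector-2', 'bowl-south': 'sector-1'}
```

In an analog DAS, each of those changed mappings would be a physical re-wire at the head-end; here it is a configuration push.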

The Bottom Line

It’s a safe bet that mobile data demand will continue to grow. But the tools now exist to let you control whether this will be a burden, or an opportunity to realize new potential. C-RAN, virtualization and open RAN interfaces all have a role to play in making venue networks more deployable, flexible and affordable. By understanding what each one offers, you can make the best decisions for your network.

Josh Adelson is director of portfolio marketing at CommScope, where he is responsible for marketing the award-winning OneCell Cloud RAN small cell, Era C-RAN antenna system and ION-E distributed antenna system. He has over 20 years' experience in mobile communications, content delivery and networking. Prior to joining CommScope, Josh held product marketing and management positions with Airvana (acquired by CommScope in 2015), PeerApp, Brooktrout Technology and Motorola. He holds an MBA from the MIT Sloan School of Management and a BA from Brown University.

FOOTNOTES
1: 59 percent of users at the 2018 Super Bowl attached to the stadium Wi-Fi network.
2: Dimensioning for digital systems is based on CommScope Era.

AT&T: Lots of DAS traffic for college football championship

DAS on a cart: DAS Group Professionals deployed mobile DAS stations to help cover the parking lots at Levi’s Stadium for the college football playoff championship. Credit: DGP

This may not be a news flash to any stadium network operations team, but the amount of mobile data consumed by fans at college football games continues to hit high levels, according to new figures released by AT&T.

In a press release blog post where AT&T said it saw 9 terabytes of cellular data used over the college football playoff championship-game weekend in the Bay area, AT&T also crowned a cellular “data champion,” reporting that Texas A&M saw 36.6 TB of data used on the AT&T networks in and around Kyle Field in College Station, Texas.

(Actually, AT&T pointedly does NOT declare Texas A&M the champs; most likely because of some contractual issue, AT&T does not identify actual stadiums or teams in its data reports. Instead, it reports the cities where the data use occurred, but we can figure out the rest for our readers.)

For the College Football Playoff championship, AT&T was able to break down some specific numbers for us, reporting that 3.7 TB of that overall total was used inside Levi's Stadium on game day. Cell traffic from the parking lots and tailgating areas (see photo of DAS cart to left) added another 2.97 TB on AT&T's networks, for a game-day area total of 6.67 TB. That total is in Super Bowl territory, so we are eager to see the Wi-Fi total from the game (we're waiting for the college playoff folks to finalize the statistics, so stay tuned).

DAS antennas visible at Levi’s Stadium during a Niners game this past season. Credit: Paul Kapustka, MSR

For the additional 2+ TB of traffic, a footnote explains it somewhat more: “Data includes the in-venue DAS, COWs, and surrounding macro network for AT&T customers throughout the weekend.”

Any other carriers who want to add their stats to the total, you know where to find us.

Back to Texas A&M for a moment — in its blog post AT&T also noted that the stadium in College Station (which we will identify as Kyle Field) had the most single-game mobile usage in the U.S. this football season, with nearly 7 TB used on Nov. 24. Aggie fans will remember that as the wild seven-overtime 74-72 win over LSU, an incredible game that not surprisingly resulted in lots of stadium cellular traffic.

Notre Dame Stadium sees 4 TB of Wi-Fi used at NHL Winter Classic

It wasn’t a record-setting day for the Wi-Fi network at Notre Dame Stadium, but the 4 terabytes of data used by fans at Tuesday’s NHL Winter Classic continued a string of healthy wireless data use by attendees at the recently renovated venue.

With 76,126 in attendance to watch the Boston Bruins beat the Chicago Blackhawks in the latest of the NHL's outdoor games, some 17,000 unique devices connected to the stadium's Wi-Fi network, according to statistics provided by the university. Peak concurrent Wi-Fi connections hit 14,355, and peak throughput was 5.81 Gbps, according to the school. Given the winter weather and the fact that many fans were no doubt visiting Notre Dame for the first time, it's no surprise that data usage trailed all the other events at Notre Dame Stadium this fall and winter (see chart below). Still, 4 TB is a big number, made perhaps more impressive by the conditions. Earlier this fall, Notre Dame Stadium saw 7.19 TB of Wi-Fi used for a game against Stanford.

According to some reports, including this story from CBS News, the second-largest ever Winter Classic crowd saw some issues arise on the concessions side, with some fans reporting that the stadium had run out of food and beer and that many concession lines were extremely long. Another report quoted a Notre Dame spokesperson as saying the reports of being out of food or beer were untrue.

More Wi-Fi stats from Notre Dame’s fall and winter events below. Thanks again to Notre Dame’s IT crew for providing the figures.