Corning seeks dismissal of IBM claims in Mercedes-Benz Stadium DAS lawsuit

An under-seat DAS antenna in the 300 seating section at Mercedes-Benz Stadium

Corning asked a Georgia court last week to dismiss the negligence and other claims filed against it last year by IBM in a lawsuit over the distributed antenna system (DAS) at Mercedes-Benz Stadium, claiming in part that IBM was responsible for any performance issues because it, among other things, “failed to follow Corning’s design.”

Since neither side is speaking publicly yet about the issue, we are pretty much left with the court filings as the only way to figure out what exactly went wrong with the DAS installation at Mercedes-Benz Stadium, which opened in 2017. While IBM has claimed the issues were Corning’s fault, Friday’s filing (which we have not yet seen) has Corning putting the blame back on IBM and its outside DAS deployment contractors.

According to a story posted on Law360 (subscription required), Corning’s motion in the U.S. District Court for the Northern District of Georgia also had a brief which said “IBM attempts to ignore the terms specified in its contract with Corning — and the work it actually contracted Corning to do — by bringing its own fraud and negligence claims along with purportedly assigned tort claims.” The Law360 story said Corning further “argued that the claims either are barred by the economic loss rule or a so-called merger clause in the contracts, or cannot be assigned to the hardware giant by the NFL team or the prime contractor on the project.”

A Corning spokesperson provided the following official comment from the company:

“Corning has provided successful DAS network solutions for dozens of major sports venues around the world. At Mercedes Benz Stadium, Corning performed under its contract with IBM, while IBM failed to follow Corning’s design, failed to provide the DAS for commissioning on time due to hundreds of IBM installation errors, and then failed to optimize the DAS; an area that was IBM’s responsibility. Regarding IBM’s lawsuit, Corning denies the allegations asserted by IBM, and Corning will vigorously defend its work at Mercedes Benz Stadium and its world-class reputation in Court.”

The Law360 story had the following comment from an IBM spokesperson:

“Corning delivered a flawed cellular system to the Falcons and IBM, and then failed to fix it. IBM stepped in and spent a year to deliver state-of-the-art cellular performance for fans, and Corning is now accountable for failing to live up to its obligations.”

Levi’s Stadium sees 5.1 TB of Wi-Fi data used at college football championship

Fans and media members at Monday night’s College Football Playoff championship game used a total of 5.1 terabytes of data on the Wi-Fi network at Levi’s Stadium, according to figures provided by the San Francisco 49ers, who own and run the venue.

With 74,814 in attendance for Clemson’s 44-16 victory over Alabama, 17,440 of those in the stands found their way onto the stadium’s Wi-Fi network. According to the Niners, the peak concurrent connection number of 11,674 users was seen at 7:05 p.m. local time, probably right around the halftime break. The peak bandwidth rate of 3.81 Gbps, the Niners said, was seen at 5:15 p.m. local time, just after kickoff.

In a nice granular breakout, the Niners said about 4.24 TB of the Wi-Fi data was used by fans, while a bit more than 675 GB was used by the more than 925 media members in attendance. The Wi-Fi data totals were recorded during an 8-1/2 hour period on Monday, from 1 p.m. to 9:30 p.m. local time.

Added to the 3.7 TB of DAS traffic AT&T reported inside Levi’s Stadium Monday night, we’re up to 8.8 TB total wireless traffic so far, with reports from Verizon, Sprint and T-Mobile still not in. The top Wi-Fi number at Levi’s Stadium, for now, remains Super Bowl 50, which saw 10.1 TB of Wi-Fi traffic.

Can virtualization help venues meet growing mobile capacity demands?

By Josh Adelson, director, Portfolio Marketing, CommScope

U.S. mobile operators reported a combined 50 terabytes of cellular traffic during the 2018 Super Bowl, nearly double the previous year’s total. In fact, Super Bowl data consumption has doubled every year for at least the past six years, and it shows no sign of slowing down.

Clearly, fans love their LTE connections almost as much as they love their local team. Fans have the option for cellular or Wi-Fi, but cellular is the default connection whereas Wi-Fi requires a manual connection step that many users may not bother with.[1] The same dynamic is playing out on a smaller scale in every event venue and commercial building.

Whether you are a venue owner, part of the team organization or in the media, this heightened connectivity represents an opportunity to connect more with fans, and to expand your audience to the fans’ own social connections beyond the venue walls.

But keeping up with the demand is also a challenge. High capacity can come at a high cost, and these systems also require significant real estate for head-end equipment. Can you please your fans and leverage their connectedness while keeping equipment and deployment costs from breaking the capex scoreboard?

Virtualization and C-RAN to the rescue?

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.

Enterprise IT departments have long since learned that centralizing and virtualizing their computing infrastructure is a way to grow capacity while reducing equipment cost and space requirements. Can sports and entertainment venues achieve the same by virtualizing their in-building wireless infrastructures? To answer this question, let’s first review the concepts and how they apply to wireless infrastructure.

In the IT domain, virtualization refers to replacing multiple task-specific servers with a centralized resource pool that can be dynamically assigned to a given task on demand. Underlying this concept is the premise that, while each application has its own resource needs, at any given time only a subset will be active, so the total shared resource can be less than the sum of the individual requirements.
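As a rough sketch (with made-up demand numbers), the pooling premise can be expressed in a few lines of Python: the shared pool only has to cover the largest combined demand at any instant, not the sum of every application’s individual peak.

```python
# Illustrative sketch with hypothetical numbers: why a shared, dynamically
# assigned resource pool can be smaller than the sum of per-app peaks.

# Peak resource need of each task-specific server (arbitrary units).
dedicated_peaks = {"app_a": 10, "app_b": 8, "app_c": 12}

# Demand snapshots over time -- at any given instant only a subset of
# applications is running near its peak.
snapshots = [
    {"app_a": 10, "app_b": 2, "app_c": 1},
    {"app_a": 1, "app_b": 8, "app_c": 3},
    {"app_a": 2, "app_b": 1, "app_c": 12},
]

# Dedicated servers must each be sized for their own peak.
dedicated_total = sum(dedicated_peaks.values())

# A shared pool only needs to cover the worst simultaneous demand.
pooled_total = max(sum(snapshot.values()) for snapshot in snapshots)

print(dedicated_total, pooled_total)  # pooled total is half the dedicated total here
```

With these (invented) demand patterns the pool needs 15 units where dedicated servers would need 30, which is the whole argument for pooling in one line of arithmetic.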

How does this translate to in-building wireless? Centralizing the base station function is known as C-RAN, which stands for centralized (or cloud) radio access network. C-RAN involves not only the physical pooling of the base stations into a single location — which is already the practice in most venues — but also digital architecture and software intelligence to allocate baseband capacity to different parts of the building in response to shifting demand.

C-RAN brings immediate benefits to a large venue in-building wireless deployment. The ability to allocate capacity across the venue via software rather than hardware adds flexibility and ease of operation. This is especially important in multi-building venues that include not only a stadium or arena but also surrounding administrative buildings, retail spaces, restaurants, hotels and transportation hubs. As capacity needs shift between the spaces by time of day or day of week, you need a system that can “point” the capacity to the necessary hot spots.

C-RAN can even go a step further to remove the head-end from the building campus altogether. Mobile network operators are increasingly deploying base stations in distributed locations known as C-RAN hubs. If there is a C-RAN hub close to the target building, then the in-building system can take a signal directly from the hub, via a fiber connection. Even if the operator needs to add capacity to the hub for this building, this arrangement gives them the flexibility to use that capacity in other parts of their network when it’s not needed at the building. It also simplifies maintenance and support as it keeps the base station equipment within the operator’s facilities.

For the building owner, this architecture can reduce the footprint of the on-campus head-end by as much as 90 percent.

Once the baseband resources are centralized, the next logical step is to virtualize them into software running on PC server platforms. As it turns out, this is not so simple. Mobile baseband processing is a real-time, compute-intensive function that today runs on embedded processors in specialized hardware platforms. A lot of progress is being made toward virtualization onto more standard computers, but as of today, most mobile base stations are still purpose-built.

Perhaps more important for stadium owners is the fact that the base station (also called the signal source) is selected and usually owned by the mobile network operator. Therefore the form it takes has at most only an indirect effect on the economics for the venue. And whether the signal source is virtual or physical, the signal still must be distributed by a physical network of cables, radios and antennas.

The interface between the signal source and the distribution network provides another opportunity for savings. The Common Public Radio Interface (CPRI) establishes a digital interface that reduces space and power requirements while allowing the distribution network — the DAS — to take advantage of the intelligence in the base station. To leverage these advantages, the DAS also needs to be digital.

To illustrate this, consider the head-end for a 12 MIMO sector configuration with 4 MIMO bands per sector, as shown below. In this configuration a typical analog DAS is compared with a digital C-RAN antenna system, with and without a CPRI baseband interface. In the analog system, points of interface (POIs) are needed to condition the signals from the different sources to an equal level before they are combined and converted to optical signals via an e/o (electrical-to-optical) transceiver. In a digital system, signal conditioning and conversion from electrical to digital are integrated into a single card, providing significant space savings.

* A typical analog system requires 8 POIs (4 MIMO bands per sector) and 2 e/o transceivers per sector, resulting in 10 cards per sector. A typical subrack (chassis) supports up to 10-12 cards, so a subrack will support 1 MIMO sector. For 12 MIMO sectors, 12 subracks are needed, each typically 4 rack units high. This results in a total space of 48 rack units.

* For a digital system[2] without CPRI, each subrack supports 32 SISO ports. Each MIMO sector with 4 MIMO bands requires 8 ports, so each subrack supports 4 MIMO sectors. For 12 MIMO sectors, 3 subracks of 5 rack units each are needed, resulting in a total space of 15 rack units.

* For a digital system with CPRI, each subrack supports 48 MIMO ports. Each MIMO sector with 4 MIMO bands requires 4 ports, so a single subrack supports all 12 MIMO sectors. For 12 MIMO sectors, only 1 subrack of 5 rack units is needed, resulting in a total space of 5 rack units.
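The rack-unit arithmetic in the three bullets above can be sketched in a few lines of Python. The sector, band, port and rack-unit figures come from the article itself; the helper functions are only an illustration of the sizing logic, not any vendor’s tool.

```python
import math

# Head-end sizing per the example above: 12 MIMO sectors, 4 MIMO bands each.
SECTORS = 12
BANDS_PER_SECTOR = 4

def analog_rack_units():
    # 8 POI cards + 2 e/o transceivers = 10 cards per sector; one 4-RU
    # subrack (10-12 card slots) therefore holds exactly 1 MIMO sector.
    subracks = SECTORS
    return subracks * 4

def digital_rack_units():
    # 32 SISO ports per 5-RU subrack; a MIMO sector with 4 bands needs
    # 4 bands x 2 MIMO streams = 8 SISO ports -> 4 sectors per subrack.
    sectors_per_subrack = 32 // (BANDS_PER_SECTOR * 2)
    subracks = math.ceil(SECTORS / sectors_per_subrack)
    return subracks * 5

def digital_cpri_rack_units():
    # 48 MIMO ports per 5-RU subrack; 4 ports per sector -> 12 sectors
    # fit in a single subrack.
    sectors_per_subrack = 48 // BANDS_PER_SECTOR
    subracks = math.ceil(SECTORS / sectors_per_subrack)
    return subracks * 5

print(analog_rack_units(), digital_rack_units(), digital_cpri_rack_units())
# 48, 15 and 5 rack units, matching the three bullets above
```

Running the three helpers reproduces the 48 RU / 15 RU / 5 RU totals, i.e. roughly a 10x head-end space reduction from analog DAS to digital C-RAN with CPRI.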

One commercial example of this is Nokia’s collaboration with CommScope to offer a CPRI interface between Nokia’s AirScale base station and CommScope’s Era C-RAN antenna system. With this technology, a small interface card replaces an array of remote radio units, reducing space and power consumption in the C-RAN hub by up to 90 percent. This also provides a stepping-stone to future Open RAN interfaces when they become standardized and commercialized.

The Benefits in Action — A Case Study

Even without virtualization, the savings of digitization, C-RAN and CPRI at stadium scale are significant. The table below shows a recent design that CommScope created for a large stadium complex in the U.S. For this, we compared 3 alternatives: traditional analog DAS, a digital C-RAN antenna system with RF base station interfaces, and a C-RAN antenna system with CPRI interfaces. Digital C-RAN and CPRI produce a dramatic reduction in the space requirements, as the table below illustrates.

The amount of equipment is reduced because a digital system does in software what analog systems must do in hardware, and CPRI even further eliminates hardware. Energy savings are roughly proportional to the space savings, since both are a function of the amount of equipment required for the solution.

Fiber savings, shown in the table below, are similarly significant.

The amount of fiber is reduced because digitized signals can be encoded and transmitted more efficiently.

But these savings are only part of the story. This venue, like most, is used for different types of events — football games, concerts, trade shows and even monster truck rallies. Each type of event has its own unique traffic pattern and timing. With analog systems, re-sectoring to accommodate these changes literally requires on-site physical re-wiring of head-end units. But with a digital C-RAN based system it’s possible to re-sectorize from anywhere through a browser-based, drag and drop interface.

The Bottom Line

It’s a safe bet that mobile data demand will continue to grow. But the tools now exist to let you control whether this will be a burden, or an opportunity to realize new potential. C-RAN, virtualization and open RAN interfaces all have a role to play in making venue networks more deployable, flexible and affordable. By understanding what each one offers, you can make the best decisions for your network.

Josh Adelson is director of portfolio marketing at CommScope, where he is responsible for marketing the award-winning OneCell Cloud RAN small cell, Era C-RAN antenna system and ION-E distributed antenna system. He has over 20 years of experience in mobile communications, content delivery and networking. Prior to joining CommScope, Josh held product marketing and management positions with Airvana (acquired by CommScope in 2015), PeerApp, Brooktrout Technology and Motorola. He holds an MBA from the MIT Sloan School of Management and a BA from Brown University.

FOOTNOTES
1: 59 percent of users at the 2018 Super Bowl attached to the stadium Wi-Fi network.
2: Dimensioning for digital systems is based on CommScope Era.

AT&T: Lots of DAS traffic for college football championship

DAS on a cart: DAS Group Professionals deployed mobile DAS stations to help cover the parking lots at Levi’s Stadium for the college football playoff championship. Credit: DGP

This may not be a news flash to any stadium network operations team, but the amount of mobile data consumed by fans at college football games continues to hit high levels, according to some new figures released by AT&T.

In a blog post in which AT&T said it saw 9 terabytes of cellular data used over the college football playoff championship-game weekend in the Bay Area, AT&T also crowned a cellular “data champion,” reporting that Texas A&M saw 36.6 TB of data used on the AT&T networks in and around Kyle Field in College Station, Texas.

(Actually, AT&T pointedly does NOT declare Texas A&M the champs — most likely because of some contractual issue, AT&T does not identify actual stadiums or teams in its data reports. Instead, it reports the cities where the data use occurred, but we can figure out the rest for our readers.)

For the College Football Playoff championship, AT&T was able to break down some specific numbers for us, reporting that 3.7 TB of the overall total was used inside Levi’s Stadium on game day. Cell traffic from the parking lots and tailgating areas (see photo of DAS cart to left) added another 2.97 TB on AT&T’s networks, for a game-day area total of 6.67 TB. That total is in Super Bowl territory, so we are excited to see the Wi-Fi traffic total from the game (we’re waiting for the college playoff folks to finalize the statistics, so stay tuned).

DAS antennas visible at Levi’s Stadium during a Niners game this past season. Credit: Paul Kapustka, MSR

For the additional 2+ TB of traffic, a footnote explains it somewhat more: “Data includes the in-venue DAS, COWs, and surrounding macro network for AT&T customers throughout the weekend.”

Any other carriers who want to add their stats to the total, you know where to find us.

Back to Texas A&M for a moment — in its blog post AT&T also noted that the stadium in College Station (which we will identify as Kyle Field) had the highest single-game mobile data usage in the U.S. this football season, with nearly 7 TB used on Nov. 24. Aggie fans will remember that as the wild seven-overtime 74-72 win over LSU, an incredible game that not surprisingly resulted in lots of stadium cellular traffic.

New Report: Texas A&M scores with new digital fan-engagement strategy

In the short history of in-stadium mobile fan engagement, a team or stadium app has been the go-to strategy for many venue owners and operators. But what if that strategy is wrong?

That question gets an interesting answer with the lead profile in our most recent STADIUM TECH REPORT, the Winter 2018-19 issue! These quarterly long-form reports are designed to give stadium and large public venue owners and operators, and digital sports business executives a way to dig deep into the topic of stadium technology, via exclusive research and profiles of successful stadium technology deployments, as well as news and analysis of topics important to this growing market.

Leading off for this issue is an in-depth report on a new browser-based digital game day program effort launched this football season at Texas A&M, where some longtime assumptions about mobile apps and fan engagement were blown apart by the performance of the Aggies’ new project. A must read for all venue operations professionals! We also have in-person visits to Atlanta’s Mercedes-Benz Stadium and the renovated State Farm Arena, the venue formerly known as Philips Arena. A Q&A with NFL CIO Michelle McKenna-Doyle and a report on a CBRS network test by the PGA round out this informative issue! DOWNLOAD YOUR REPORT today!

We’d like to take a quick moment to thank our sponsors, which for this issue include Mobilitie, JMA Wireless, Corning, Huber+Suhner, Boingo, Oberon, MatSing, Neutral Connect Networks, Everest Networks, and ExteNet Systems. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to welcome readers from the Inside Towers community, who may have found their way here via our ongoing partnership with the excellent publication Inside Towers. We’d also like to thank the SEAT community for your continued interest and support.

As always, we are here to hear what you have to say: Send me an email to kaps@mobilesportsreport.com and let us know what you think of our STADIUM TECH REPORT series.

BYU scores with new Wi-Fi, app for LaVell Edwards Stadium

BYU’s LaVell Edwards Stadium. Credit all photos: photo@byu.edu (click on any picture for a larger image)

At Brigham Young University, the wait for Wi-Fi was worth it.

After a selection and deployment process that took almost three years, the first full season of Wi-Fi at BYU’s LaVell Edwards Stadium was a roaring success, with high fan adoption rates and a couple of 6-plus terabyte single-game data totals seen during the 2018 football season. Using 1,241 APs from gear supplier Extreme Networks, the Wi-Fi deployment also saw high usage of the new game-day app, built for BYU by local software supplier Pesci Sports.

Duff Tittle, associate athletic director for communications at Brigham Young University, said the school spent nearly 2 1/2 years “studying the concept” of bringing Wi-Fi to the 63,470-seat stadium in Provo, Utah. After looking at “five different options,” BYU chose to go with Extreme, based mainly on Extreme’s long track record of football stadium deployments.

“We visited their stadiums, and also liked what they offered for analytics,” said Tittle of Extreme. “They had what we were looking for.”

According to Tittle, the deployment was actually mostly finished in 2017, allowing the school to do a test run at the last game of that season. Heading into 2018, Tittle said the school was “really excited” to see what its new network could do — and the fans went even beyond those expectations.

Opener a big success

For BYU’s Sept. 8 home opener against California, Tittle said the Wi-Fi network saw 27,563 unique connections out of 52,602 in attendance — a 52 percent take rate. BYU’s new network also saw a peak of 26,797 concurrent connections (midway through the fourth quarter) en route to a game-day data total of 6.23 TB. The network also saw a peak bandwidth rate of 4.55 Gbps, according to statistics provided by the school.

Sideline AP deployment

“It blew us away, the number of connections [at the Cal game],” Tittle said. “It exceeded what we thought we’d get, right out of the gate.”

With almost no overhangs in the stadium — there is only one sideline structure for media and suites — BYU and Extreme went with mostly under-seat AP deployments, Tittle said, with approximately 1,000 of the 1,241 APs located inside the seating bowl. Extreme has used under-seat deployments in many of its NFL stadium networks, including at Super Bowl LI in Houston.

Another success story was the new BYU app, which Tittle said had been in development for almost as long as the Wi-Fi plan. While many stadium and team apps struggle for traction, the BYU app saw good usage right out of the gate, finishing just behind the ESPN app for total number of users (2,306 for the BYU app vs. 2,470 for ESPN) during the same Cal game. The BYU app just barely trailed Instagram (2,327) in number of users seen that day, and outpaced Snapchat (1,603) and Twitter (1,580), according to statistics provided by Tittle. The app also supports instant replay video, as well as a service that lets fans order food to be picked up at a couple of express-pickup windows.

What also might have helped fuel app adoption is the presence of a “social media” ribbon board along the top of one side of the stadium, where fan messages get seen in wide-screen glory. Tittle said the tech-savvy locals in the Provo area (which has long been the home to many technology companies, including LAN pioneer Novell) are also probably part of the app crowd, “since our fan base loves that kind of stuff.”

Tittle also said that Verizon Wireless helped pay for part of the Wi-Fi network’s construction, and like at other NFL stadiums where Verizon has done so, it gets a separate SSID for its users at LaVell Edwards Stadium. Verizon also built the stadium’s DAS (back in 2017), which also supports communications from AT&T and T-Mobile. (More photos below)

Under-seat AP enclosure

A peek inside

The social media ribbon board above the stands

LaVell Edwards Stadium at night, with a view of the press/suites structure