College championship game at AT&T Stadium breaks 6 Terabyte wireless data mark, with almost 5 TB of Wi-Fi traffic

AT&T Stadium before the college football playoff championship game. (Click on any photo for larger image) Credit all photos: Paul Kapustka, MSR

Not only did Monday night’s College Football Playoff championship game crown a new national title team — it also broke the unofficial record for most wireless traffic at a single sporting event, with more than 6 terabytes of data used by the 85,689 fans in attendance at AT&T Stadium in Arlington, Texas.

John Winborn, chief information officer for the Dallas Cowboys, said the AT&T-hosted Wi-Fi network at AT&T Stadium carried 4.93 TB of traffic during Monday’s game between Ohio State and Oregon, a far higher total than any we’ve heard of before for a single-game, single-venue event. AT&T cellular customers, Winborn said, used an additional 1.41 TB of wireless data on the stadium DAS network, for a measured total of 6.34 TB of traffic. The real total is likely another terabyte or two higher, since these figures don’t include any traffic from the other carriers (Verizon, Sprint, T-Mobile) on the neutral-host DAS that AT&T operates. (Other carrier reps, please feel free to send us your data totals as well!)
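
For those keeping score, the arithmetic behind those totals is simple; here is a minimal Python sketch that reproduces it (the other-carrier figure is our own rough estimate, not a reported number):

```python
# Reported wireless traffic from the CFP championship at AT&T Stadium (in TB)
wifi_tb = 4.93       # AT&T-hosted Wi-Fi network
att_das_tb = 1.41    # AT&T cellular customers on the stadium DAS

measured_tb = wifi_tb + att_das_tb
print(f"Measured total: {measured_tb:.2f} TB")   # 6.34 TB

# Verizon, Sprint and T-Mobile traffic on the neutral-host DAS was not
# reported; "another terabyte or two" is our estimate, not carrier data.
low, high = measured_tb + 1, measured_tb + 2
print(f"Likely real total: {low:.1f}-{high:.1f} TB")
```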

The national championship numbers blew away the data traffic totals from last year’s Super Bowl, and also eclipsed the previous high-water Wi-Fi mark we knew of, the 3.3 TB number set by the San Francisco 49ers during the opening game of the season at their new Levi’s Stadium facility. Since we’ve not heard of any other event even coming close, we’ll crown AT&T Stadium and the college playoff championship as the new top dog in the wireless-data consumption arena, at least for now.

University of Phoenix Stadium, already with Super Bowl prep under way

Coincidentally, MSR on Tuesday was touring the University of Phoenix Stadium and the surrounding Westgate entertainment district, where Crown Castle is putting the final touches on a new complex-wide DAS. The new DAS includes antennas on buildings and railings around the restaurants and shops of the mall-like Westgate complex, as well as inside and outside the UoP Stadium. (We’ll have a full report soon on the new DAS installs, including antennas hidden behind fake air-vent fans on the outside of the football stadium to help handle pre-game crowds.)

The University of Phoenix Stadium also had its entire Wi-Fi network ripped out and replaced this season, the better to serve the wireless appetites coming for the big game on Feb. 1. At AT&T Stadium on Monday we learned that the network there had almost 300 new Wi-Fi access points and a number of new DAS antennas installed since Thanksgiving, in anticipation of a big traffic event. Our exclusive on-the-scene tests of the Wi-Fi and DAS networks found no glitches or holes in coverage, which is probably part of the reason why so many people used so much data.

UPDATE: Here is the official press release from AT&T, which basically says the same thing our post does.

Stadium Tech Report: AT&T Stadium network a winner at CFP Championship game

Inside AT&T Stadium at the College Football Championship game. Credit all photos: Paul Kapustka, MSR

It’s late here in North Texas and you know by now that Ohio State beat Oregon to win the first non-mythical college football championship. Behind the scenes Monday night, the wireless network at AT&T Stadium was also a winner, standing up to the challenge of the 85,000-plus crowd on both the DAS and Wi-Fi fronts.

We’ll have a more thorough stadium report when we get time to digest all the info we gathered at the game (and get the network stats back from the AT&T Stadium tech crew), but one thing we learned before the game was that since November, the Wi-Fi network at AT&T Stadium grew by more than 280 access points, on top of a total already somewhere in the 1,200 range. According to AT&T network folks, the stadium here in Arlington, Texas, has been seeing game-day totals of 3.3 terabytes of data carried on the Wi-Fi network — leading some here to believe that Monday’s championship game could well surpass 4 TB of data used at a single game, an unofficial record as far as we know for a single-day, single-facility network.

As guests of AT&T we also got a quick demonstration of LTE Broadcast technology, which basically slices the available cellular spectrum into a channel that can provide live streams of video. We’ll have more on this technology in a separate report, but it is something to watch for facilities that want video options but don’t want to go whole hog on Wi-Fi.

AT&T LTE Broadcast demo, showing a live streaming broadcast of the game

Even though we were housed in a field-level suite, your intrepid MSR crew wandered all over the massive facility and found great connectivity basically everywhere. Two places stick out in my mind: at the very top of the nosebleed section in the south end zone, the Wi-Fi dipped to just 1 Mbps, probably because the roof is so high there is no nearby place to mount an access point. However, at that same spot the AT&T 4G LTE signal was around 7 Mbps, providing great connectivity in a tough-to-configure spot.

The other notable spot was in a “star level” suite (on about the sixth level of the building), where we got a Wi-Fi signal of 28 Mbps on the download and 59 (no typo!) Mbps on the upload. Yes, suite people have it better, but wherever we went we got consistent Wi-Fi signals in the high teens or low 20s, and LTE cellular signals (including Verizon 4G LTE) just under 10 Mbps. Like the Ohio State offense, the network at AT&T Stadium works really well and may have set a new record Monday night. More soon, and more images soon as well. For now, Elvis has left the building.

Outside in the frozen tundra of North Texas, aka Arlington

This place was humming all night long

AT&T 4G LTE speedtest, from the top of the stadium

The view from the nosebleed section

Some “suite” Wi-Fi speeds

MSR at the College Football Playoff Championships: Send us your speedtests!

ESPN’s College Football Playoff Championships stage in downtown Fort Worth, Sunday night. Credit: Paul Kapustka, MSR

As Twitter followers found out yesterday, MSR is in “north Texas,” aka the Dallas-Fort Worth “Metroplex,” attending tonight’s inaugural College Football Playoff Championship game at AT&T Stadium.

We’re here to see a test of AT&T’s LTE Broadcast technology, which will ostensibly make it easier for venues to deliver live video streams via a cellular connection. But we are also going to take advantage of the event to walk around “Jerry’s World,” take speed tests and see how the network at AT&T Stadium performs for a big game. If you are in attendance and know how to do a Wi-Fi or cellular speed test, send us the results (a tweet at @paulkaps is the best way, or send email to kaps at mobilesportsreport.com). Check Twitter for updates during the game.
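
If you’ve never scripted one, here is one way to run a speed test from a laptop, a minimal sketch using the open-source speedtest-cli Python module (assuming you have it installed via `pip install speedtest-cli`; the Speedtest.net mobile apps report the same numbers):

```python
# Minimal Wi-Fi/cellular speed test using the open-source
# speedtest-cli module (pip install speedtest-cli).
import speedtest

st = speedtest.Speedtest()
st.get_best_server()             # picks the lowest-latency test server

down_mbps = st.download() / 1e6  # results come back in bits per second
up_mbps = st.upload() / 1e6

print(f"Down: {down_mbps:.1f} Mbps, Up: {up_mbps:.1f} Mbps, "
      f"Ping: {st.results.ping:.0f} ms")
```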

Stadium Tech Report: Corning, IBM bringing fiber-based Wi-Fi and DAS to Texas A&M’s Kyle Field

Kyle Field, Texas A&M University. Credit all photos: Texas A&M

Editor’s note: The following is an excerpt from our recent Stadium Tech Report series COLLEGE FOOTBALL ISSUE, a 40-page in-depth look at Wi-Fi and DAS deployment trends at U.S. collegiate football stadiums. You can download the full report for free, to get more stadium profiles as well as school-by-school technology deployment capsules for both the SEC and Pac-12 conferences.

When Texas A&M’s newly renovated Kyle Field opens for the 2015 football season, its outside appearance will have changed dramatically. But from a networking perspective, what’s really different is hidden on the inside – namely, an optical fiber infrastructure designed to bring a new level of performance, cost savings and future-proofing to stadium network deployments.

While the use of optical fiber instead of copper cable in large networks isn’t exactly “new” in the core telecom or enterprise networking worlds, fiber has yet to make large inroads in the still-nascent field of stadium network deployments. But fiber’s promise of much higher performance and greater future-proofing at lower installation cost may get a very visible poster child when Texas A&M’s football facility kicks off the 2015 season with a technology infrastructure designed to be among the most comprehensive in any stadium, collegiate or professional.

With a Wi-Fi network designed to support 100,000 concurrent connections, a robust DAS network with more than 1,000 antennas, and an IPTV deployment with more than 1,000 screens, the IBM-designed network based largely on Corning’s fiber-optic systems is incredibly impressive on paper – and it already produced some eye-popping statistics this past season, when just a part of it came online during “Phase 1” of the two-phase, $450 million Kyle Field renovation.

The final phase of the renovation, Phase 2, just now getting under way, began with the implosion of the stadium’s west stands; reconstruction is scheduled to finish in time for the 2015 season with a new, enclosed-bowl structure that will seat 102,512 fans. And if the new network delivers as planned, those fans will be among the most-connected anywhere, with plenty of future-proofing to make sure it stays that way – thanks to fiber.

Driving on the left side of the street

What’s going to be new about Kyle Field? According to news reports some of the creature comforts being added include redesigned concession stands, so-called “Cool Zones” with air conditioning to beat the Texas heat, well-appointed luxury suites and new restrooms – including 300 percent more women’s bathrooms.

Scoreboard, Kyle Field

According to representatives from the school, the decision to make the new stadium a standout facility extended to its network infrastructure. “Our leadership decided that [the stadium renovation] would be leading edge,” said Matthew Almand, the IT network architect for the Texas A&M University System, the administrative entity that oversees university operations, including those at the flagship school in College Station, Texas. “There were some leaps of faith and there was a decision to be leading edge with technology as well.”

Though Phase 1 planning had started with a traditional copper-cable design for the network, Almand said a presentation by IBM and its “smarter stadium” team changed the thinking at Texas A&M.

“The IBM team came in and did a really good job of presenting the positive points of an optical network,” Almand said.

Todd Christner, now director of wireless business development at Corning, was previously part of the IBM team that brought the optical idea to Texas A&M. While talking about fiber to copper-cable veterans can sometimes be “like telling people to drive on the left side of the street,” Christner said the power, scalability and flexibility of a fiber network fit in well with the ambitious Kyle Field plans.

“The primary driving force [at Texas A&M] was that they wanted to build a state of the art facility, that would rival NFL stadiums and set them apart from other college programs,” Christner said. “And they wanted the fan [network] experience to be very robust.”

With what has to be one of the largest student sections anywhere – Christner said Texas A&M has 40,000 seats set aside for students – the school knew it would need extra support for the younger fans’ heavy data use on smartphones. School officials, he said, were also concerned about DAS performance, which in the past had been left to outside operators with less than satisfactory results. So IBM’s presentation of a better, cheaper alternative for all of the above found accepting ears.

“It was the right room for us to walk into,” Christner said.

IBM’s somewhat radical idea was that instead of having separate copper networks for Wi-Fi, DAS and IPTV, there would be a single optical network with the capacity to carry the traffic of all three. Though the pitch for better performance, far more capacity, use of less space, and cheaper costs might sound a bit too good to believe, most of it is just the combination of the simple physics advantages of using fiber over copper, which are well known in the core telecom and large-enterprise networking worlds, applied to a stadium situation.

Deploying now and for the future

Corning ONE DAS headend equipment.

Without going too deeply into the physics or the technology, the benefits stem from the fact that optical fiber can carry far more bandwidth than copper, over longer distances, using less power. That advantage is one reason why fiber is used extensively in core backbone networks, and has been creeping slowly closer to the user’s destination through deployments like Verizon’s FiOS.

Why hasn’t fiber won over completely? Mainly because in single-user deployments – like to a single home or office – it is still costly to replace systems already in the ground or in the wall with fiber, and for many users fiber’s capacity can be a bit of overkill. Fiber’s main benefits come when lots of bandwidth is needed, and the scale of a project is large, since one main benefit is the elimination of a lot of internal switching gear, which takes up space and consumes lots of power.

Those reasons accurately describe the perfect bandwidth storm happening in networked stadiums these days, where demand seems to keep increasing on a daily basis. Some stadiums that were at the forefront of the wireless-networking deployment trend, like AT&T Park in San Francisco and AT&T Stadium in Arlington, Texas, have been in a near-constant state of infrastructure upgrades due to the ever-increasing needs for more bandwidth. And Isaac Nissan, product manager for Corning ONE, said new equipment like Wi-Fi access points with “smart” or multiple-input antennas are also going to help push the LAN world into more fiber on the back end.

But there’s another drawback to using fiber, which has less to do with technology and more to do with history: Installers, integrators and other hands-on networking folks in general are more comfortable with copper, which they know and have used for decades. Fiber, to many, is still a new thing, since it requires different skills and techniques for connecting and pulling wires, as well as for managing and administering optical equipment.

“There’s definitely a learning curve for some of the RF [industry] people, who have been doing coax for 20 years,” Nissan said. “Fiber is a little different.”

Texas A&M’s Almand admitted that bringing the stadium’s networking group into a new technology – fiber – was a challenge, but one with a worthy payoff.

Copper cable tray hardly filled by optical fiber

“There’s definitely been a gear-up cycle, getting to a new confidence level [with fiber],” Almand said. But he added that “sometimes it’s good to break out of your comfort zone.”

Lowering the IDF count

Christner said the Corning optical gear is at the center of the Kyle Field deployment, providing support for the fan-facing Wi-Fi as well as Wi-Fi for back-of-house operations like point of sale; it also supports the stadium DAS and a network of more than 1,000 IPTV screens. Aruba Networks is the Wi-Fi gear supplier, and YinzCam is helping develop a new Kyle Field app that will include support for using smartphones as remote controls for the IPTVs in suites.

On the Wi-Fi side, Christner said the finished network will have 600 APs in the bowl seating areas, and another 600 throughout the facility, with a stated goal of supporting 100,000 concurrent 2 Mbps connections. The DAS, Christner said, is slated to have 1,090 antennas in 50 sectors.
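
To put that design goal in perspective, here is a quick back-of-the-envelope calculation (our own simplification, not Corning’s or IBM’s design math):

```python
# Back-of-the-envelope math on the stated Kyle Field Wi-Fi goal
concurrent_users = 100_000
per_user_mbps = 2
access_points = 600 + 600      # bowl seating + rest of the facility

aggregate_gbps = concurrent_users * per_user_mbps / 1_000
print(f"Aggregate target: {aggregate_gbps:.0f} Gbps")   # 200 Gbps

# Averaged evenly across all APs (real load is never this uniform):
print(f"~{concurrent_users / access_points:.0f} users and "
      f"~{concurrent_users * per_user_mbps / access_points:.0f} Mbps per AP")
```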

Christner said the fiber network in Kyle Field needs no intermediate switching gear at all, so only 12 intermediate distribution frames (the usually wall-mounted racks that support network-edge gear, also called IDFs) are required, as opposed to 34 IDFs in a legacy fiber/coax system. In addition to using less power, the cabling needed to support the fiber network is a fraction of what would have been needed for coax.

One of the more striking pictures of the deployment is a 36-inch-wide cable tray installed for the original copper-network plan, now carrying just 10 inches of fiber-optic cable. Christner said the fiber network also provides a cleaner signal for the DAS, which already had a test run in 2014, when 600 DAS antennas were deployed and lit.
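
The space savings are easy to quantify from those numbers (a sketch of the arithmetic only):

```python
# Claimed hardware and cabling reductions for the fiber design
idf_fiber, idf_copper = 12, 34          # intermediate distribution frames
tray_fiber_in, tray_copper_in = 10, 36  # inches of cable tray used

print(f"IDF reduction: {1 - idf_fiber / idf_copper:.0%}")      # ~65%
print(f"Cable tray fill: {tray_fiber_in / tray_copper_in:.0%} "
      f"of the copper plan")                                   # ~28%
```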

“At the Ole Miss game we had 110,663 fans at the stadium, and according to AT&T on the DAS all their lights were green,” Christner said. “Via our completely integrated fiber optic solution, we are now able to provide the DAS with much higher bandwidth as well,” said Texas A&M’s Almand, who also said that the carriers have responded very positively to the new DAS infrastructure.

Up from the dust – a model for the future?

Antenna and zone gear box near top of stands

Also included in the design – but not yet in use – are an additional 4,000 spare fibers at 540 zone locations, which Christner said can be immediately tapped for future expansion needs. And all of this functionality and flexibility, he added, was built for somewhere between one-third and 40 percent less than the cost of a traditional copper-based solution.

The proof of the network’s worth, of course, will have to wait until the west stands are imploded, the new ones built, and the final pieces of the network installed. Then the really fun part begins for the fans, who will get to play with things like 38 channels of high-def TV on the IPTV screens, multiple-angle replay screens and other features planned for the mobile app. At Texas A&M, IBM’s support squad will include some team members who work on the company’s traditionally excellent online effort for the Masters golf tournament, as well as the “smarter stadium” team.

For Texas A&M’s Almand, the start of the 2015 season will mark the beginning of the end, and a start to something special.

“If I were a country singer, I’d write something about looking forward to looking back on this,” Almand said. “When it’s done, it’s going to be something great.”

Report excerpt: SEC moving slowly on stadium Wi-Fi deployments

Jordan-Hare Stadium, Auburn University

Editor’s note: The following is an excerpt from our recent Stadium Tech Report series COLLEGE FOOTBALL ISSUE, a 40-page in-depth look at Wi-Fi and DAS deployment trends at U.S. collegiate football stadiums. You can download the full report for free, to get more stadium profiles as well as school-by-school technology deployment capsules for both the SEC and Pac-12 conferences.

When it comes to college football, the Southeastern Conference – usually just known as “the SEC” – is second to none in terms of the product on the field.

But what about the product in the stands, namely the wireless technology deployments in SEC stadiums? With just two of the conference’s 14 schools currently offering fan-facing Wi-Fi in their main venues, the SEC isn’t pushing any technology envelopes as a whole. And according to one SEC athletic director, there probably won’t be a wholesale march by the conference to the technology forefront – simply because the SEC’s in-stadium fans have other priorities on what needs fixing first.

Scott Stricklin, the AD at SEC member Mississippi State, leads a conference-wide group that is taking a close look at the in-stadium fan experience, a concern for the SEC even as the conference enjoys NFL-like popularity for its teams and games.

“We are proud that we have a pretty special product in our stadiums, and we want to take steps to keep it that way,” said Stricklin in an interview with MSR. A recent conference-wide fan survey, he said, did highlight the fact that when it comes to wireless connectivity, “none of us from a performance standpoint scored very well.”

Wi-Fi not as important as parking, good food

But Stricklin also noted that the same fan survey didn’t place stadium connectivity at the top of the list of things to fix: Instead, it fell well down, trailing issues like parking, clean restrooms, stadium sound and good food. That lack of pressing concern, combined with Stricklin’s still-common belief that fans should be cheering instead of texting while at the stadium, means that the SEC will probably take a measured approach to Wi-Fi deployments in stadiums, and continue to rely on carrier-funded DAS networks to carry the game-day wireless load.

Scott Stricklin, Mississippi State AD

“I take more of a Mark Cuban approach – I’d rather people in the stands not be watching video [on their phones],” Stricklin said. “It takes away from the shared experience.”

Stricklin also noted that the two schools that have installed Wi-Fi in their stadiums – Auburn and Ole Miss – haven’t had resounding success with their deployments.

“Some [SEC schools] have done [Wi-Fi], and they’re not completely happy with the results,” said Stricklin, saying the lack of success has reinforced the cautious approach to Wi-Fi, conference-wide. “Those are the issues all of us are facing and grappling with,” he added.

SEC fans setting DAS traffic records

Even as they trail on Wi-Fi deployments, SEC schools aren’t exactly putting in dial-up phone booths. Indeed, Stricklin noted the huge video boards that have been installed in most conference stadiums, and said the recent installations of carrier-funded DAS deployments have somewhat eased the no-signal crunch of the near past.

At his own school, Stricklin said his office got a lot of complaints about fans not being able to get a cellular signal before AT&T updated the stadium’s DAS in 2013.

“Last year, we got very few negative comments [about cellular service],” Stricklin said. “AT&T customers were even able to stream video.”

Vaught-Hemingway Stadium, Ole Miss

AT&T’s aggressive plan to install as many DAS networks as it can has helped bring the SEC to a 100 percent DAS coverage mark, and the fans seem to be enjoying the enhanced cellular connectivity. According to AT&T statistics, fans at SEC schools have regularly led the carrier’s weekly DAS traffic totals for most of the football season, especially at the “big games” between SEC schools like Alabama, Auburn, Ole Miss, Mississippi State and Georgia.

During Alabama’s 25-20 home victory over then-No. 1 Mississippi State, AT&T customers at Bryant-Denny Stadium used 849 gigabytes of traffic, the second-highest total that weekend for stadiums where AT&T has a DAS. The next two highest data-usage marks that weekend came at games at Georgia (676 GB) and Arkansas (602 GB), highlighting that SEC games typically have huge crowds, and those crowds like to use their cellphones, no matter how good the game on the field is.

Would Wi-Fi help with some of the traffic crunches? Possibly, but only two schools in the conference – Ole Miss and Auburn – currently have fan-facing Wi-Fi in their stadiums. Texas A&M, which is in the middle of a $450 million renovation of Kyle Field, is leaping far ahead of its conference brethren with a fiber-based Wi-Fi and DAS network and IPTV installation that will be among the most advanced anywhere when it is completed this coming summer.

But most of the SEC schools, Stricklin said, will probably stay on the Wi-Fi sidelines, at least until there is some better way to justify the millions of dollars in costs needed to bring Wi-Fi to a facility that might not see much regular use.

“If you only have 6 home games a year, it’s hard to justify,” said Stricklin of the cost of a Wi-Fi stadium network.

Other sports may move before football

Stricklin, the man who wants fans to keep their phones in their pockets at football games, is no stranger to technology-enhanced experiences in stadiums. He claims to “love” the in-seat food delivery options at MSU baseball and basketball games, and notes that the conference athletic directors will meet soon with game-experience experts, who will walk the ADs through the facets of wireless technology deployments.

“They’re going to lay out what are the challenges, and what are the costs” of wireless deployments, Stricklin said. What Stricklin doesn’t want to see at MSU or at any SEC school is the return of the “no signal” days.

“When fans from other schools come here, we want them to have a good experience,” Stricklin said.

But he’d still prefer that the experience be real, not virtual.

“I still just wonder, is anybody really doing this?” he asked. “Are you going to pay what you pay to come to our place, and then watch your phone? What I hope is that we produce such a great experience, you’re not going to want to reach for your phone.”

Washington dropping Huawei for Cisco/Verizon Wi-Fi at FedEx Field, report says

Ming He, Country General Manager for Huawei in the U.S. (left), and Rod Nenner, Vice President of the Washington Redskins (right), pictured together when Huawei announced the team sponsorship and partnership.

According to a report from Bill Gertz at the Washington Times, the Washington, D.C. NFL franchise is apparently scrapping a recent deal with Chinese networking gear supplier Huawei to put fan-facing Wi-Fi into FedEx Field, turning instead to U.S. companies Cisco and Verizon.

Gertz, in the “Inside the Ring” column at the Times, reported that the Washington team’s senior vice president Tony Wyllie said in an email that “We [Washington] are in the process of deploying a stadium-wide Wi-Fi network working with Verizon and Cisco.” Gertz said the team did not elaborate on why the recent deal with Huawei was apparently scrapped before it got started.

Huawei, which claims to have installed Wi-Fi networks in many stadiums worldwide, had not had any large-scale installations at major U.S. venues before announcing the FedEx Field deal. A major competitor to large U.S. networking firms like Cisco, Huawei has been at the center of controversy in recent years, including being tabbed as a security threat by U.S. government officials, and later as a reported target for N.S.A. surveillance.

Under the announced terms of the deal, Huawei was supposed to install Wi-Fi in suite areas this December; a company spokesman said that while there was no official deal announced, Huawei was also supposed to follow that install up with a full-stadium deployment before the 2015 season started. In the initial announcement, the team announced Huawei Enterprise USA as a multi-year team sponsor and “Official Technology Partner.”

We have calls and emails in to all the interested parties, and will update this story as we hear more.