AT&T Stadium sees 7.25 TB of Wi-Fi for Packers vs. Cowboys playoff game

The Dallas Cowboys before taking the field against the Green Bay Packers in a Jan. 15 playoff game. Credit: James D. Smith/Dallas Cowboys

Pro football’s biggest stadium had the biggest non-Super Bowl Wi-Fi traffic day we’ve heard of this season, as the Dallas Cowboys reported seeing 7.25 terabytes of Wi-Fi data on the AT&T Stadium network during the Packers’ thrilling 34-31 victory on Jan. 15.

John Winborn, chief information officer for the Dallas Cowboys, sent us the info on the stadium’s biggest Wi-Fi day ever, surpassing the previous record of 6.77 TB seen on the AT&T Stadium Wi-Fi network for WrestleMania 32 back on April 3, 2016. The new Wi-Fi record was set by fewer fans, too, with attendance for the Jan. 15 playoff game at 93,396, compared to 101,763 at WrestleMania.

Though he didn’t provide an exact client count, Winborn said the take rate of unique devices on the Wi-Fi network for the Packers game was 50 percent of attendees, roughly 46,700, easily one of the biggest numbers we’ve seen anywhere. During the Cowboys’ excellent regular season, Winborn said Wi-Fi data use averaged 5.28 TB per game, an increase of 33 percent over the 2015 season.
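For readers who want to check the math, the take-rate arithmetic above can be sketched as follows. This is a minimal illustration; the function name is ours, and only the 50 percent figure and the attendance come from the team:

```python
# Take rate = unique Wi-Fi clients / game attendance.
def take_rate(unique_clients: int, attendance: int) -> float:
    """Fraction of attendees who connected to the Wi-Fi network."""
    return unique_clients / attendance

attendance = 93_396        # reported Jan. 15 playoff attendance
rate = 0.50                # 50 percent take rate reported by Winborn
unique_clients = round(attendance * rate)
print(unique_clients)      # -> 46698, i.e. "roughly 46,700"
```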

UPDATE: The AT&T folks have provided the DAS stats for the same game, with an additional 3 TB of data used on the AT&T cellular networks inside the stadium. So we’re up to 10.25 TB for a non-Super Bowl game… doubt we will get any other carriers to add their totals but sounds to me like this is the biggest non-Super Bowl event out there in terms of total data.

Any other NFL teams (or college teams) out there with peak games and/or season averages, send them in! Let’s keep updating this list!

THE NEW TOP 7 FOR WI-FI

1. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
2. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
3. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
4. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
5. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB
6. Alabama vs. Texas A&M, Kyle Field, College Station, Texas, Oct. 17, 2015: Wi-Fi: 5.7 TB
7. Pittsburgh Steelers vs. New England Patriots, AFC Championship Game, Gillette Stadium, Foxborough, Mass., Jan. 22, 2017: Wi-Fi: 5.11 TB

Arizona State upgrades DAS, Wi-Fi at Sun Devil Stadium

Sun Devil Stadium at Arizona State. Credit all photos: ASU

When Arizona State University started renovating Sun Devil Stadium three years ago, the project wasn’t so much a simple wireless refresh as it was a total reset of what technology, sports and academia could co-create.

In addition to expanded Wi-Fi and DAS for the stadium (a venue that includes classrooms, meeting rooms and retail outlets), ASU activated a virtual beacon trial. The university also joined up with Intel to explore how Internet of Things devices might yield better environmental information about the bowl, including acoustic data, Jay Steed, assistant vice president of IT operations, told Mobile Sports Report.

The university’s IT department understood that a richer fan experience for football and other events would require a robust network. Steed and his colleagues visited other venues like Levi’s Stadium, AT&T Stadium, Stanford and Texas A&M to get a better handle on different approaches to networking, applications and services.

Regardless, some sort of refresh was overdue. Wedged between two buttes in the southeastern Phoenix suburb of Tempe, the 71,000-seat Sun Devil Stadium was completed in 1958 and needed infrastructure and technology updates. Wi-Fi access was limited to point-of-sale systems and stadium suites; fans generally relied on a DAS network.

Time for an upgrade

Editor’s note: This profile is from our latest STADIUM TECH REPORT, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace. Read about the Sacramento Kings’ new Golden 1 Center and the new Wi-Fi network for the Super Bowl in our report, which is available now for FREE DOWNLOAD from our site!

“The stadium needed a lot of facelifting, not just from a technology perspective but also for the fan experience, like ADA compliance and overall comfort,” Steed said. “We didn’t just want to rebuild a venue for six football games a year, but extend its availability to 365 days and make it a cornerstone and anchor for the whole campus.”

The 'Inferno' student section got a priority for better connectivity.

The reset included tearing out the lower bowl to “punch some new holes” — new entry points to the stadium — and to add conduits and cabling for the new 10-gigabit fiber backbone for the stadium. The network can be upgraded as needed to 40- and even 100-gigabit pipes, according to Steed.

“We wanted to make sure it could support fans’ [connectivity needs] and all the facility’s operations with regard to video and StadiumVision, learning and education, and Pac-12 needs as well,” he said.

The overall stadium renovation was budgeted at $268 million; the technology upgrades will total about $8 million.

The university added 250 new DAS antennas. The vendor-neutral network includes AT&T, Verizon, Sprint and T-Mobile, which share 21 DAS sectors to keep cell service humming inside the stadium.

On the Wi-Fi side, ASU opted for Cisco’s access points. The networking vendor was already entrenched across the 642-acre campus; Steed and the IT department prefer the simplicity of a single-vendor network. Cisco helped with the hardware and RF engineering for Sun Devil Stadium. CenturyLink offered guidance on the networking and fiber pieces of the project, while Sundt was the contractor for most of the physical construction.

Wireless service for ‘The Inferno’

When the renovation is officially completed later in 2017 (most of the network is already live), there will be 1,100 APs in and around Sun Devil Stadium. The student sections, also known as The Inferno, get more APs and bandwidth since historical data has shown students to be the biggest bandwidth consumers in the stadium. Consequently, the ratio in the student sections is one AP to every 50 users; the rest of the bowl’s APs each handle about 75 users on average, Steed said.
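The AP-density planning Steed describes can be sketched with some back-of-the-envelope arithmetic. The seat counts below are illustrative assumptions on our part; only the 1:50 and 1:75 ratios come from ASU:

```python
import math

def aps_needed(seats: int, users_per_ap: int) -> int:
    """Round up: every seat needs coverage from some AP."""
    return math.ceil(seats / users_per_ap)

# Hypothetical section sizes within the 71,000-seat bowl:
student_seats = 10_000   # assumed size of 'The Inferno'
bowl_seats = 61_000      # assumed remainder of the bowl

print(aps_needed(student_seats, 50))  # denser 1:50 ratio for students
print(aps_needed(bowl_seats, 75))     # 1:75 ratio for the rest of the bowl
```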

Breakaway look at an under-seat AP

ASU’s new Wi-Fi network was engineered to deliver 1.5 Mbps upstream and 3 Mbps downstream, but Steed said so far users are getting better performance – 8 Mbps up and 12 Mbps down. “We’re getting about 25 percent saturation,” he added. “Many users put their phones away during the games, but we see spikes at halftime and during commercial breaks.” Regardless, ASU continually monitors Wi-Fi and DAS usage and adjusts bandwidth as needed.

Another big challenge is the desert climate – temperatures regularly soar into triple digits. With about 450 under-seat APs in the bowl, Steed and his team had to make sure the enclosures could withstand heat and didn’t obstruct the walkways. “We’ll see how well the electronics do, baking at 120 degrees six months out of the year,” he laughed.

ASU is also working with Intel, using the stadium’s APs as part of an Internet of Things trial. As Steed described it, IoT sensors work alongside stadium APs to measure temperature, noise, vibration and other environmental data. “We also look at lighting control and water distribution and flow,” he said.

Concourses also got expanded Wi-Fi and DAS coverage.

Automating the control of environmental functions like heating, cooling, power usage and facilities management will help the university toward its goal of being carbon-neutral by 2025, Steed added. The trials are designed so that the technology can be expanded across the university, possibly for campus transportation kiosks or student concierge services. IoT devices could give students and visitors information about adjacent buildings or landmarks around campus.

Separate but related, the university is also testing cloud-based, Bluetooth low energy (BLE) technology from Mist Systems. These “virtual beacons” use sensors attached to an AP to flag information or a point of interest for students or stadium visitors. “The virtualized beacon technology helps us understand where people are walking around and what they’re looking at in the stadium and elsewhere around campus,” Steed said.

The beacons are currently being tested in some of Sun Devil Stadium’s suites; Steed foresees expanding them to the student union to help guide people to meeting rooms, retail facilities or food vendors, for example.

Steed credited excellent communication and collaboration among the university’s athletic and IT departments and other players in the upgrade equation. “Our athletic director, Ray Anderson, brought the CIO and me into his office and really worked together with us,” he explained. “The biggest piece of our success was knowing that the AD supported our recommendations and brought us in as valued advisors.”

New Report: First look at Sacramento’s Golden 1 Center

MOBILE SPORTS REPORT is pleased to announce the Winter 2016-2017 issue of our STADIUM TECH REPORT series, with a first look at the pervasive stadium technology built into the Sacramento Kings’ new home, the Golden 1 Center.

Also in our latest in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace is a profile of a new Wi-Fi deployment at the Indiana Pacers’ Bankers Life Fieldhouse, and a profile of new Wi-Fi and DAS networks deployed at Arizona State’s Sun Devil Stadium. We also provide an update on how the new Wi-Fi network at Houston’s NRG Stadium is getting ready for the upcoming Super Bowl LI.

Renting a Wi-Fi network?

In addition to our historical in-depth profiles of successful stadium technology deployments, our fourth issue for 2016 has additional news and analysis, including a look at whether or not stadiums will soon be able to lease their Wi-Fi networks. Download your FREE copy today!

We’d like to take a quick moment to thank our sponsors, which for this issue include Mobilitie, Crown Castle, SOLiD, CommScope, JMA Wireless, Corning, Samsung Business, Xirrus, Huber+Suhner, ExteNet Systems, and Extreme Networks. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to thank you for your interest and support.

As always, we are here to hear what you have to say: Send me an email to kaps@mobilesportsreport.com and let us know what you think of our STADIUM TECH REPORT series.

Vikings hit peak of 4.32 TB for Wi-Fi use at U.S. Bank Stadium, with average 43 percent take rate

Game day at U.S. Bank Stadium. Credit all photos: Vikings.com (click on any photo for a larger image)

While the football season may not have gone exactly to Vikings fans’ wishes, the Wi-Fi network at U.S. Bank Stadium performed well during its inaugural NFL season, with a peak single-game data total of 4.32 terabytes used, part of a season average of 2.89 TB used during Vikings games.

According to statistics provided to MSR by Tod Caflisch, vice president and chief technical officer for the Vikings, the biggest data-use day was Sept. 18, 2016, during the regular-season home opener for the Vikings against the rival Green Bay Packers, a 17-14 Vikings victory. That contest also saw season highs for unique Wi-Fi users, with 31,668 fans connecting to the Wi-Fi at some point of the game day, and for most concurrent users, with 17,556 users connected at the same time. The 31,668 number represented a 49 percent take rate against the game’s reported attendance of 64,786.

Even though Caflisch said the Vikings didn’t heavily promote the AmpThink-designed Wi-Fi network — which uses Cisco Wi-Fi gear in mostly handrail-mounted AP locations to serve the main bowl seating areas — the average take rate during the season was at the high end of numbers we’ve seen, with a 43 percent average over the two preseason and eight regular-season Vikings games.

And even though the total data-used number only crested 3 TB one other time in the season — a 3.16 TB mark during a 30-24 Vikings win over the Arizona Cardinals on Nov. 20, 2016 — the average mark of 2.89 TB per game showed solid, consistent use.

Caflisch said that the Vikings and U.S. Bank Stadium were also able to correct the train-snafu issue that arose at some of the early events at the new venue, which has a light-rail station right outside the stadium doors. While some of the first events had big lines of riders and not enough trains, Caflisch said that during the season extra trains were held in reserve at the transit station that is close to Target Field (a few stops down the line from U.S. Bank) and then filtered in as Vikings games neared their end.

“We were able to clear the [train] platform in 40 minutes after the last game,” Caflisch said. “The fans really loved the trains.” (More U.S. Bank Stadium images below)

Vikings fans gather outside the stadium for pregame activities.

Great nighttime view with city skyline visible through windows.

A look at the handrail Wi-Fi antenna mounts (this photo credit: Paul Kapustka, MSR)

Will stadiums soon be able to rent their Wi-Fi networks from equipment vendors?

Nationwide Arena. Credit: Columbus Blue Jackets

If it costs too much to buy a Wi-Fi network for your stadium, why not rent one instead?

A fairly common option in the world of enterprise networking, the ability to rent, or lease, a fully operational Wi-Fi network may soon be coming to the world of sports stadiums and other large public venues, if it isn’t here already. Two of the largest Wi-Fi gear suppliers, Aruba and Ruckus, already publicly offer network leasing arrangements, where venue owners pay some kind of monthly or recurring fee for network setup and operation instead of buying equipment outright. And Cisco, another leader in the marketplace, is rumored to be offering full-control lease-type arrangements for stadium Wi-Fi networks, possibly beginning with the new Wi-Fi network being built at the SAP Center in San Jose.

Though no large sports stadium has yet publicly announced a deal to lease, or in networking lingo, to buy a “Network as a service,” or NaaS, the idea is potentially attractive to stadium owners and operators, many of whom have struggled with the return-on-investment question ever since the idea of putting wireless networks in stadiums emerged. While cellular carriers have so far borne the lion’s share of the costs of deploying enhanced cellular systems like DAS (distributed antenna system) in stadiums, the question of “who will pay for the Wi-Fi” is still a big one for many venues, especially those that are only filled several times a year.

The benefits of moving to opex vs. capex

Bart Giordano, vice president of business development for the Ruckus business unit at Brocade, said the idea of leasing a stadium network could be attractive especially to venue owners who don’t have the upfront capital necessary to pay for Wi-Fi, a cost that could run into the tens of millions of dollars.

Under new parent Brocade, Ruckus Wi-Fi gear can be obtained via something called the Brocade Network Subscription, a NaaS program that Giordano said “converts it all to opex — you subscribe to the network and pay a monthly service fee.” Under the subscription program, Giordano said Brocade/Ruckus will actually own the equipment, allowing the venue owner the flexibility of being able to return it or upgrade it as needed.

With many stadiums that deployed Wi-Fi several years ago already going through significant upgrades, the idea of a leased network that could be more easily refreshed with new technology might soon gain favor. Though no Ruckus stadium subscribers have yet been announced, Giordano said “some are coming.”

Aruba, now a Hewlett Packard Enterprise company, has a similar subscription model plan for enterprise wireless deployments, one which company representatives said could be used by stadiums as well. Both companies said such deals could possibly come via consulting partnerships, where the consultant firm manages the relationship and deployment/operation details.

Cisco also has a leasing option available for wireless networks, but so far has not made any public announcements of such deals in the sports stadium marketplace. However, there are reports of Cisco taking a more active role in the ownership, deployment and operation of stadium networks, like the Cisco-powered Wi-Fi currently being installed at the SAP Center in San Jose, home of the NHL’s Sharks. So far, neither Cisco nor the Sharks will comment on any business specifics of the new Wi-Fi network other than its use of Cisco gear.

Can leasing work for stadiums?

While the leasing idea for stadiums isn’t new, the business model has met some challenges over the short history of wireless networks in large venues. So far, third-party integrators like Mobilitie, ExteNet Systems and Boingo have crafted lease-like deals in which the venue does not pay the full cost of the network but instead allows the operators to run networks (typically both DAS and Wi-Fi), earning money by leasing space on those networks to wireless carriers or by selling advertising or sponsorships.

Another leasing model, one that crashed and burned, was the one employed by SignalShare, a company now in bankruptcy proceedings with legal claims of fraudulent business practices against it. SignalShare, which also offered venues networks for a monthly cost, may have been hampered by a lack of financial resources, something that shouldn’t be an issue for companies the size of Cisco, HP and Brocade, who will mainly be offering leases on equipment they manufacture themselves. The larger equipment vendors may also not be under as much pressure as SignalShare was to earn revenues on the network operations, which may make them better able to succeed in the NaaS space.

And while the idea sounds good in theory, there are still unanswered questions about how the leases would work, and whether they will make good business sense for both sides. Unlike enterprise operations in traditional offices, stadium networks are far more complex to install and operate, especially those being retrofitted in stadiums built decades ago. Stadium networks also have a much different operational profile, with traffic coming in large spikes rather than daily workday routines.

But stadium networks can also act as public advertisements of sorts, gaining more attention for vendors in PR than perhaps in direct profits. As the market matures and vendors seek out potential customers who have shied away from Wi-Fi in the past due to upfront costs, leasing may be a way forward for both sides — as long as both can find a benefit to the deal.

Washington upgrades Wi-Fi backbone at FedExField

Fans at FedExField now have more bandwidth to use on the stadium Wi-Fi network thanks to a recent upgrade that moved the backbone capacity to 10 gigabits per second, according to a release from the team.

Though the Washington, D.C., NFL franchise has never previously announced it publicly, sources close to the team say the network, reportedly built by Verizon Wireless using Cisco Wi-Fi gear, has been in full operation since last season, albeit in a sort of “soft launch” mode without any promotion.

With more fans now using the network, the team saw a need to increase bandwidth from the 1 Gbps pipe that had been supplying all of the Wi-Fi network’s needs. For many football-stadium Wi-Fi networks, 10 Gbps pipes are a popular choice, with several stadiums opting for multiple such connections. Levi’s Stadium in Santa Clara, Calif., home of the San Francisco 49ers, has four 10 Gbps pipes as part of its backbone configuration, while U.S. Bank Stadium in Minneapolis, the new home of the Minnesota Vikings, has six 10 Gbps links.

For those who are new to stadium Wi-Fi (or who don’t know the difference between backhaul and actual end-user network speeds), the backbone bandwidth is shared by the entire network, which may have multiple tens of thousands of client devices running off any of hundreds of Wi-Fi access points. Typical user bandwidth speeds on Wi-Fi in large networks can range from single-digit megabits per second anywhere up to 20, 40 or even 60 Mbps, depending upon network configuration and density of users. Any fans attending games at FedExField who want to send us a speedtest please do!
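To see why backbone size matters, here is a rough sketch of the shared-pipe arithmetic. The concurrent-user count is an illustrative assumption, not a FedExField figure:

```python
# The shared backbone caps aggregate throughput, so the average
# per-user share shrinks as concurrent users grow.
def per_user_mbps(backbone_gbps: float, concurrent_users: int) -> float:
    """Average backbone share per concurrent user, in Mbps."""
    return backbone_gbps * 1000 / concurrent_users

# A single 10 Gbps pipe split evenly among 20,000 concurrent clients:
print(per_user_mbps(10, 20_000))   # -> 0.5 Mbps each, on average
```

In practice, observed per-user speeds run far higher than this even split suggests, because only a fraction of connected devices are actively transmitting at any given moment.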