Stadium Tech Report: Network finishes season strong at Niners’ Levi’s Stadium

Arriving at Levi’s Stadium for last 2014 season game

While the football season didn’t turn out like 49ers fans wanted, the wireless network at the team’s new Levi’s Stadium closed out the year with strong performances to complete a largely glitch-free stretch of events at what is one of the world’s most technologically advanced stadiums.

With more than 2 terabytes of data used by fans at each of the last two San Francisco 49ers home games, the Wi-Fi and DAS networks at Levi’s Stadium closed out a season of superb connectivity, one that eventually allowed the team to offer not just in-seat food and beverage delivery but also ordering and delivery of merchandise like hats and T-shirts, an option available for the Dec. 20 game against the San Diego Chargers and the Dec. 28 closer against the Arizona Cardinals.

According to the Levi’s Stadium network crew, the Wi-Fi network carried 2.34 TB of data for the Chargers game and another 2.11 TB for the Cardinals game, with 20,096 fans using the network on Dec. 20 and 20,164 on Wi-Fi on Dec. 28. Peak concurrent user numbers were 13,700 for the Chargers game, and 14,400 for the season closer.
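
For a rough sense of scale, here is a quick back-of-envelope calculation based on those reported totals (our own arithmetic, not the network crew’s figures):

```python
# Back-of-envelope averages from the reported Levi's Stadium Wi-Fi totals.
# Illustrative only; decimal units (1 TB = 1,000,000 MB).

games = {
    "Dec. 20 vs. Chargers":  {"data_tb": 2.34, "users": 20_096, "peak": 13_700},
    "Dec. 28 vs. Cardinals": {"data_tb": 2.11, "users": 20_164, "peak": 14_400},
}

for name, g in games.items():
    mb_per_user = g["data_tb"] * 1_000_000 / g["users"]
    concurrency = g["peak"] / g["users"]
    print(f"{name}: ~{mb_per_user:.0f} MB per connected fan, "
          f"peak concurrency {concurrency:.0%} of unique users")
# Roughly 116 MB and 105 MB per connected fan, with peaks near 70 percent.
```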

All season long, our speed tests in various parts of the stadium showed strong signals for both the Wi-Fi and the distributed antenna system (DAS) enhanced cellular network. The final game was no different; we found Wi-Fi download speeds of 16 Mbps on the rooftop deck, 25 Mbps in a suite and a scorching 39 Mbps in the Levi’s 501 Club seats (no doubt in part because there was a Wi-Fi antenna under the seat next to us).

Both Verizon and AT&T 4G LTE services also worked well, with download speeds consistently in the 4-6 Mbps range in most places and much higher in others. In short, we didn’t find any flaws in the network coverage over five games of walking all around the stadium, testing everywhere we went.

Caltrain to VTA a smooth ride

Caltrain crowd en route to Arizona game

At the final game, Mobile Sports Report (me) tested out the full public-transit method of getting to the game, starting from San Mateo on Caltrain at 10:51 a.m. The parking lot at the station was almost completely empty, and free since it was Sunday; it’s possible that crowds were lighter since the Niners had been eliminated from postseason play, but the ride to Mountain View went without a hitch, a good sign for next year, when many fans in town for Super Bowl 50 will no doubt be using Caltrain to get from San Francisco to Levi’s.

At the Mountain View Caltrain/VTA station, operations were the best I’ve seen, with more neon-vested VTA helpers offering clear instructions on why you might want to take an express bus instead of the light rail. Insider tip: If the express bus is available, take it; in our testing it got to Levi’s in about half the time of the train trip (roughly 20 minutes, as opposed to almost 40 minutes for the light rail option).

Express bus option at Mountain View

The only thing that remains to be ironed out is the fare confusion in switching from Caltrain to VTA, which are two different operators. On Caltrain there was advertising for a $6 “combo ticket” that would let you ride VTA and could be purchased at the same time you bought your Caltrain pass. But a single-day ticket purchased online via the VTA app was only $4.50, so it’s not clear why you would buy the more expensive combo ticket. Especially for the Super Bowl, it would help fans if there were one price and one place to buy a single “Get to Levi’s” public-transit ticket.

Food order arrives as promised

Another thing I tried at the season closer was the in-seat food ordering feature on the Levi’s Stadium app. Sitting in the Levi’s Club section seats on the third level, I consulted the app to order a cold beer and a warm pretzel, which the app said could be delivered in 10 minutes.

Food runner bringing me my cold beer and warm pretzel

After entering credit-card information into the app and hitting the order button, the app updated itself with timely notices that the order was being prepared and that it was on its way. I found that information very reassuring, a sign that things were indeed happening; a large order number even appeared on screen, apparently to make it easier for the food runner to confirm the order.

The order arrived in exactly 10 minutes, as the app predicted, though it came with a lot of extra packaging: a separate plastic bag for the steel bottle of beer and a paper sack holding a cellophane-wrapped pretzel. Since there is no way to add a gratuity in the app, I gave the runner a cash tip, which seemed appropriate even though a $5 delivery charge is added to the order cost. I have to admit it felt a little weird to have someone bring me my food and drink, but in the time it took to order and deliver I stayed in my seat and watched the Niners’ game-winning TD drive, so it’s clearly a fan-friendly option.

Video replays work well, for a small number of viewers

Another part of the Levi’s Stadium technology that was at peak performance by year’s end was the app’s instant replay feature. Though it started slowly and had some hiccups early on, by the final game instant replays were appearing in the app even before the next play had concluded (see our feature on how the VenueNext team gets the replays to the app so quickly).

While it’s an impressive addition to the in-game experience, the replays are a feature that only a small number of fans are watching. According to the team network stats there were only 1,253 unique users watching replays on Dec. 20, and 1,019 during the Dec. 28 game. Total replays viewed for the Chargers game were 6,285, while 4,310 replays were watched during the season closer.
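
To put those numbers in context, a quick calculation (again, our own arithmetic based on the figures above) shows both how small the audience was relative to the Wi-Fi user base and how engaged the fans who did use the feature were:

```python
# Replay uptake vs. the overall Wi-Fi audience, from the team's reported figures.
# Illustrative math only.

replay_stats = {
    "Dec. 20": {"replay_users": 1_253, "replays": 6_285, "wifi_users": 20_096},
    "Dec. 28": {"replay_users": 1_019, "replays": 4_310, "wifi_users": 20_164},
}

for date, g in replay_stats.items():
    take_rate = g["replay_users"] / g["wifi_users"]
    per_viewer = g["replays"] / g["replay_users"]
    print(f"{date}: {take_rate:.0%} of Wi-Fi users watched replays, "
          f"~{per_viewer:.1f} replays per viewer")
# Roughly 5-6 percent uptake, but 4-5 replays viewed per fan who tried it.
```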

Why aren’t the replays catching on? Our main guess is that the Levi’s Stadium big screens are so clear and so quick to show replays (they also show live action as it happens) that fans don’t find it necessary to use their phones to watch them. It’s also possible that many fans in the stadium who are using the network aren’t using the Levi’s Stadium app. Indeed, according to the team network stats, the team app hasn’t yet cracked the top four apps used at any of the games this season; for the Dec. 20 game the top apps on the network were Amazon cloud drive, Facebook, Google APIs (probably Gmail) and Apple; for Dec. 28 the list was Amazon, Google, Facebook, then Apple.

We’ll try to get some season-long stats to share for both the network and the app features, but our quick conclusion after five live-game visits to Levi’s Stadium this year is that the wireless network and the app both pretty much lived up to their pre-season billing and hype, delivering a wireless game-day experience that sets the standard other large public facilities will be measured against going forward. More photos from our last visit below.

The Microsoft Surface/sideline Wi-Fi unit

close-up of cable connection

Niners’ Flickr promotion on scoreboard — very popular

Sideline Surface tablets for Niners players and coaches

Colin Kaepernick exchanges his radio helmet for his flat-brimmed hat after throwing a TD pass

View from the Levi’s skydeck out over Santa Clara

If you throw a rooftop party, the cheerleaders might visit

View from the Levi’s 501 Club section seats

Wi-Fi antenna right under the seat next to me (probably why my speed test was 40+ Mbps)

In-stadium signage to help get fans to light rail

End of game view from skydeck

A final toast to the season at the BNY Club

VTA train line. Only took 15 minutes from here to get on bus.

Caltrain platform at Mountain View. Extra trains helped make ride home easy

Stadium Tech Report: Corning, IBM bringing fiber-based Wi-Fi and DAS to Texas A&M’s Kyle Field

Kyle Field, Texas A&M University. Credit all photos: Texas A&M

Editor’s note: The following is an excerpt from our recent Stadium Tech Report series COLLEGE FOOTBALL ISSUE, a 40-page in-depth look at Wi-Fi and DAS deployment trends at U.S. collegiate football stadiums. You can download the full report for free, to get more stadium profiles as well as school-by-school technology deployment capsules for both the SEC and Pac-12 conferences.

When Texas A&M’s newly renovated Kyle Field opens for the 2015 football season, its outside appearance will have changed dramatically. But from a networking perspective, what’s really different is hidden on the inside – namely, an optical fiber infrastructure designed to bring a new level of performance, cost savings and future-proofing to stadium network deployments.

While the use of optical fiber instead of copper cable in large networks isn’t exactly “new” in the core telecom or enterprise networking worlds, in the still-nascent field of stadium network deployments fiber has yet to make large inroads. But the promise of fiber’s ability to deliver much higher performance and greater future-proofing at lower installation costs in stadium situations may get a very visible poster child when Texas A&M’s football facility kicks off the 2015 season with a technology infrastructure designed to be among the most comprehensive in any stadium, collegiate or professional.

With a Wi-Fi network designed to support 100,000 concurrent connections, a robust DAS network with more than 1,000 antennas, and an IPTV deployment with more than 1,000 screens, the IBM-designed network based largely on Corning’s fiber-optical systems is incredibly impressive on paper – and it has already produced some eye-popping statistics this past season, when just a part of it came online during the “Phase 1” period of the two-phase $450 million Kyle Field renovation.

The final phase of the renovation, Phase 2, just now getting underway, began with the implosion of the stadium’s west stands, with reconstruction scheduled to finish in time for the 2015 season with a new, enclosed-bowl structure that will seat 102,512 fans. And if the new network delivers as planned, those fans will be among the most-connected anywhere, with plenty of future-proofing to make sure it remains that way for years to come, thanks to fiber.

Driving on the left side of the street

What’s going to be new about Kyle Field? According to news reports some of the creature comforts being added include redesigned concession stands, so-called “Cool Zones” with air conditioning to beat the Texas heat, well-appointed luxury suites and new restrooms – including 300 percent more women’s bathrooms.

Scoreboard, Kyle Field

According to representatives from the school, the decision to make the new stadium a standout facility extended to its network infrastructure. “Our leadership decided that [the stadium renovation] would be leading edge,” said Matthew Almand, the IT network architect for the Texas A&M University System, the administrative entity that oversees university operations, including those at the flagship school in College Station, Texas. “There were some leaps of faith and there was a decision to be leading edge with technology as well.”

Though Phase 1 planning had started with a traditional copper cable design for the network, Almand said a presentation by IBM and its “smarter stadium” team changed the thinking at Texas A&M.

“The IBM team came in and did a really good job of presenting the positive points of an optical network,” Almand said.

Todd Christner, now director of wireless business development at Corning, was previously at IBM as part of the team that brought the optical idea to Texas A&M. While talking about fiber to copper-cable veterans can sometimes be “like telling people to drive on the left side of the street,” Christner said the power, scalability and flexibility of a fiber network fit well with the ambitious Kyle Field plans.

“The primary driving force [at Texas A&M] was that they wanted to build a state of the art facility, that would rival NFL stadiums and set them apart from other college programs,” Christner said. “And they wanted the fan [network] experience to be very robust.”

With what has to be one of the largest student sections anywhere – Christner said Texas A&M has 40,000 seats set aside for students – the school knew it would need extra support for the younger fans’ heavy data use on smartphones. School officials, he said, were also concerned about DAS performance, which in the past had been left to outside operators with less than satisfactory results. So IBM’s presentation of a better, cheaper alternative for all of the above found receptive ears.

“It was the right room for us to walk into,” Christner said.

IBM’s somewhat radical idea was that instead of having separate copper networks for Wi-Fi, DAS and IPTV, there would be a single optical network with the capacity to carry the traffic of all three. Though the pitch for better performance, far more capacity, use of less space, and cheaper costs might sound a bit too good to believe, most of it is just the combination of the simple physics advantages of using fiber over copper, which are well known in the core telecom and large-enterprise networking worlds, applied to a stadium situation.

Deploying now and for the future

Corning ONE DAS headend equipment.

Without going too deeply into the physics or technology, the benefits stem from the fact that optical fiber can carry far more bandwidth than copper, over longer distances, using less power. That advantage is one reason why fiber is used extensively in core backbone networks, and has been creeping slowly closer to the user’s destination through deployments like Verizon’s FiOS.

Why hasn’t fiber won over completely? Mainly because in single-user deployments – like to a single home or office – it is still costly to replace systems already in the ground or in the wall with fiber, and for many users fiber’s capacity can be a bit of overkill. Fiber’s main benefits come when lots of bandwidth is needed, and the scale of a project is large, since one main benefit is the elimination of a lot of internal switching gear, which takes up space and consumes lots of power.

Those reasons accurately describe the perfect bandwidth storm happening in networked stadiums these days, where demand seems to keep increasing on a daily basis. Some stadiums that were at the forefront of the wireless-networking deployment trend, like AT&T Park in San Francisco and AT&T Stadium in Arlington, Texas, have been in a near-constant state of infrastructure upgrades due to the ever-increasing needs for more bandwidth. And Isaac Nissan, product manager for Corning ONE, said new equipment like Wi-Fi access points with “smart” or multiple-input antennas are also going to help push the LAN world into more fiber on the back end.

But there’s another drawback to using fiber, which has less to do with technology and more to do with history: Installers, integrators and other hands-on networking folks in general are more comfortable with copper, which they know and have used for decades. Fiber, to many, is still a new thing, since it requires different skills and techniques for connecting and pulling wires, as well as for managing and administering optical equipment.

“There’s definitely a learning curve for some of the RF [industry] people, who have been doing coax for 20 years,” Nissan said. “Fiber is a little different.”

Texas A&M’s Almand admitted that bringing the stadium’s networking group into a new technology – fiber – was a challenge, but one with a worthy payoff.

Copper cable tray hardly filled by optical fiber

“There’s definitely been a gear-up cycle, getting to a new confidence level [with fiber],” Almand said. But he added that “sometimes it’s good to break out of your comfort zone.”

Lowering the IDF count

Christner said the Corning optical gear is at the center of the Kyle Field deployment, providing support for the fan-facing Wi-Fi as well as Wi-Fi for back of the house operations like point of sale; it also supports the stadium DAS, as well as a network of more than 1,000 IPTV screens. Aruba Networks is the Wi-Fi gear supplier, and YinzCam is helping develop a new Kyle Field app that will include support to use smartphones as remote-control devices for IPTVs in suites.

On the Wi-Fi side, Christner said the finished network will have 600 APs in the bowl seating areas, and another 600 throughout the facility, with a stated goal of supporting 100,000 concurrent 2 Mbps connections. The DAS, Christner said, is slated to have 1,090 antennas in 50 sectors.
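
Taken at face value, those design targets imply some striking aggregate numbers. Here is a rough sketch of the arithmetic (our own back-of-envelope estimates, not Corning’s or IBM’s published engineering figures):

```python
# Rough capacity arithmetic implied by the stated Kyle Field Wi-Fi design goals.
# Illustrative estimates only, assuming the targets quoted above.

concurrent_users = 100_000      # stated goal for concurrent connections
per_user_mbps = 2               # stated per-connection throughput goal
total_aps = 600 + 600           # bowl APs plus the rest of the facility

aggregate_gbps = concurrent_users * per_user_mbps / 1_000
users_per_ap = concurrent_users / total_aps

print(f"Aggregate design throughput: ~{aggregate_gbps:.0f} Gbps")    # ~200 Gbps
print(f"Average load: ~{users_per_ap:.0f} concurrent users per AP")  # ~83 per AP
```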

With no intermediate switching gear at all, Christner said that for the fiber network in Kyle Field only 12 intermediate distribution frames (the usually wall-mounted racks that support network-edge gear, also called IDFs) would be needed, as opposed to 34 IDFs in a legacy fiber/coax system. In addition to using less power, the cabling needed to support the fiber network is a fraction of what would have been needed for coax.

One of the more striking pictures of the deployment is a 36-inch-wide cable tray installed for the original copper-network plan, which is carrying just 10 inches of fiber-optic cable. Christner said the fiber network also provides a cleaner signal for the DAS, which already had a test run during the 2014 season, when 600 DAS antennas were deployed and lit.

“At the Ole Miss game we had 110,663 fans at the stadium, and according to AT&T on the DAS all their lights were green,” Christner said. “Via our completely integrated fiber optic solution, we are now able to provide the DAS with much higher bandwidth as well,” said Texas A&M’s Almand, who also said that the carriers have responded very positively to the new DAS infrastructure.

Up from the dust – a model for the future?

Antenna and zone gear box near top of stands

Also included in the design – but not being used – are an additional 4,000 spare fibers at 540 zone locations, which Christner said can be immediately tapped for future expansion needs. And all of this functionality and flexibility, he added, was being built for somewhere between one-third and 40 percent less than the cost of a traditional copper-based solution.

The proof of the network’s worth, of course, will have to wait until after the west stands are imploded, the new ones built, and the final pieces of the network installed. Then the really fun part begins, for the users who will get to play with everything from 38 channels of high-def TV on the IPTV screens to the multiple-angle replays and other features planned for the mobile app. At Texas A&M, IBM’s support squad will include some team members who work on the company’s traditionally excellent online effort for the Masters golf tournament, as well as the “smarter stadium” team.

For Texas A&M’s Almand, the start of the 2015 season will mark the beginning of the end, and a start to something special.

“If I were a country singer, I’d write something about looking forward to looking back on this,” Almand said. “When it’s done, it’s going to be something great.”

Report excerpt: SEC moving slowly on stadium Wi-Fi deployments

Jordan-Hare Stadium, Auburn University

Editor’s note: The following is an excerpt from our recent Stadium Tech Report series COLLEGE FOOTBALL ISSUE, a 40-page in-depth look at Wi-Fi and DAS deployment trends at U.S. collegiate football stadiums. You can download the full report for free, to get more stadium profiles as well as school-by-school technology deployment capsules for both the SEC and Pac-12 conferences.

When it comes to college football, the Southeastern Conference – usually just known as “the SEC” – is second to none in the product on the field.

But what about the product in the stands, namely the wireless technology deployments in SEC stadiums? With just two of the conference’s 14 schools currently offering fan-facing Wi-Fi in their main venues, the SEC isn’t pushing any technology envelopes as a whole. And according to one SEC athletic director, there probably won’t be a wholesale march by the conference to the technology forefront – simply because the SEC’s in-stadium fans have other priorities on what needs fixing first.

Scott Stricklin, the AD at SEC member Mississippi State, leads a conference-wide group that is taking a close look at the in-stadium fan experience, a concern for the SEC even as the conference enjoys NFL-like popularity for its teams and games.

“We are proud that we have a pretty special product in our stadiums, and we want to take steps to keep it that way,” said Stricklin in an interview with MSR. A recent conference-wide fan survey, he said, did highlight the fact that when it comes to wireless connectivity, “none of us from a performance standpoint scored very well.”

Wi-Fi not as important as parking, good food

But Stricklin also noted that the same fan survey didn’t place stadium connectivity at the top of the list of things to fix: Instead, it fell well down, trailing issues like parking, clean restrooms, stadium sound and good food. That lack of pressing concern, combined with the still-common belief, which Stricklin shares, that fans should be cheering instead of texting while at the stadium, means the SEC will probably take a measured approach to Wi-Fi deployments in stadiums, and continue to rely on carrier-funded DAS networks to carry the game-day wireless load.

Scott Stricklin, Mississippi State AD

“I take more of a Mark Cuban approach – I’d rather people in the stands not be watching video [on their phones],” Stricklin said. “It takes away from the shared experience.”

Stricklin also noted that the two schools that have installed Wi-Fi in their stadiums – Auburn and Ole Miss – haven’t had resounding success with their deployments.

“Some [SEC schools] have done [Wi-Fi], and they’re not completely happy with the results,” said Stricklin, saying the lack of success has reinforced the cautious approach to Wi-Fi, conference-wide. “Those are the issues all of us are facing and grappling with,” he added.

SEC fans setting DAS traffic records

Even as SEC schools trail on Wi-Fi deployments, that doesn’t mean they are putting in dial-up phone booths. Indeed, Stricklin pointed to the huge video boards that have been installed in most conference stadiums, and said that the recent installations of carrier-funded DAS deployments have somewhat eased the no-signal crunch of the near past.

At his own school, Stricklin said his office got a lot of complaints about fans not being able to get a cellular signal before AT&T updated the stadium’s DAS in 2013.

“Last year, we got very few negative comments [about cellular service],” Stricklin said. “AT&T customers were even able to stream video.”

Vaught-Hemingway Stadium, Ole Miss

AT&T’s aggressive plan to install as many DAS networks as it can has helped bring the SEC to a 100 percent DAS coverage mark, and the fans seem to be enjoying the enhanced cellular connectivity. According to AT&T statistics, fans at SEC schools have regularly led the carrier’s weekly DAS traffic totals for most of the football season, especially at the “big games” between SEC schools like Alabama, Auburn, Ole Miss, Mississippi State and Georgia.

During Alabama’s 25-20 home victory over then-No. 1 Mississippi State, AT&T customers at Bryant-Denny Stadium used 849 gigabytes of traffic, the second-highest total that weekend for stadiums where AT&T has a DAS. The next two highest data-usage marks that weekend came at games at Georgia (676 GB) and Arkansas (602 GB), highlighting that SEC games typically draw huge crowds, and those crowds like to use their cellphones, no matter how good the game on the field is.

Would Wi-Fi help with some of the traffic crunches? Possibly, but only two schools in the conference – Ole Miss and Auburn – currently have fan-facing Wi-Fi in their stadiums. Texas A&M, which is in the middle of a $450 million renovation of Kyle Field, is leaping far ahead of its conference brethren with a fiber-based Wi-Fi and DAS network and IPTV installation that will be among the most advanced anywhere when it is completed this coming summer.

But most of the SEC schools, Stricklin said, will probably stay on the Wi-Fi sidelines, at least until there is some better way to justify the millions of dollars in costs needed to bring Wi-Fi to a facility that might not see much regular use.

“If you only have 6 home games a year, it’s hard to justify,” said Stricklin of the cost of a Wi-Fi stadium network.

Other sports may move before football

Stricklin, the man who wants fans to keep their phones in their pockets at football games, is no stranger to technology-enhanced experiences in stadiums. He claims to “love” the in-seat food delivery options at MSU baseball and basketball games, and notes that the conference athletic directors will have a meeting soon where the game-experience panel experts will walk the ADs through the facets of wireless technology deployments.

“They’re going to lay out what are the challenges, and what are the costs” of wireless deployments, Stricklin said. What Stricklin doesn’t want to see at MSU or at any SEC school is the return of the “no signal” days.

“When fans from other schools come here, we want them to have a good experience,” Stricklin said.

But he’d still prefer that experience is real, not virtual.

“I still just wonder, is anybody really doing this?” he asked. “Are you going to pay what you pay to come to our place, and then watch your phone? What I hope is that we produce such a great experience, you’re not going to want to reach for your phone.”

AT&T: Bills fans using almost 400 GB of data per game on Ralph Wilson Stadium DAS

Ralph Wilson Stadium

The new DAS deployment at Buffalo’s Ralph Wilson Stadium is getting a workout from Bills fans, according to data from DAS operator AT&T. According to AT&T, fans on AT&T’s cellular network are using an average of 397 gigabytes of data per game so far this season, a figure that might drift a bit higher after the Bills’ big upset of Green Bay this past weekend.

The DAS, part of a $130 million stadium renovation project at Ralph Wilson Stadium for this season that also saw the installation of new HD video boards (but no Wi-Fi), has 33 sectors with 11 cell sites worth of AT&T equipment, according to news reports.

One of just 10 NFL facilities without fan-facing Wi-Fi, Ralph Wilson Stadium clearly now has less of a “no signal” problem, if fans are finding ways to use nearly 400 GB of data per game. We’ll circle back with the Buffalo folks to see if there is any news on future Wi-Fi plans.

Stadium Tech Report: Arizona Cardinals get stadium ready for Super Bowl with Wi-Fi upgrade

University of Phoenix Stadium. Credit all photos: Arizona Cardinals.

As they get set to host their second Super Bowl this February, the IT team at the University of Phoenix Stadium in Glendale, Ariz., knows now what they didn’t know then: The big game requires a big wireless network. Bigger than you think.

“It’s funny to look back now on that first Super Bowl,” said Mark Feller, vice president of information technology for the Arizona Cardinals, speaking of Roman numeral game XLII, held in the barely 2-year-old facility on Feb. 3, 2008. With a couple of Fiesta Bowls and one BCS championship game (2007) under his belt in a facility that opened with Wi-Fi and DAS, Feller said he and his team “thought we had a good handle” on what kind of network was required for a Super Bowl crowd.

The NFL, he said, begged to differ. Those college games might have been big, but the Super Bowl was bigger.

“We had the Fiesta Bowl that year at night, and when the game was over there were people from the NFL there wanting to know when they could set up,” said Feller in a recent phone interview. “This year, we’re much better prepared. We know what the water temperature is this time.”

Rip and replace, with more and better gear

Wi-Fi railing antennas

For Super Bowl XLIX, scheduled to take place in Glendale on Feb. 1, 2015, Feller and his team have not just tuned up their network — they have done a full rip and replace of the Wi-Fi system, installing new Cisco gear from back end to front, in order to support a wireless game-day demand that is historically second to none. Integrator CDW has led the Wi-Fi effort, Daktronics and ProSound installed the new video screens, and neutral host Crown Castle has overseen a revamp of the DAS, again with more antennas added to bolster coverage. In all, Feller said, there has been more than $8 million in wireless improvements ahead of this season, as well as another $10 million for two new video boards that are each three times larger than the boards they replaced.

“The last three or four years there have been things we knew we needed to improve [before the Super Bowl],” Feller said. After extensive work with the NFL’s technical team — this time well before the Fiesta Bowl — Feller oversaw a “top to bottom” refurbishment that included replacing core Cisco networking gear with newer gear, and new and more Wi-Fi access points that now total somewhere north of the 750 mark, with some more to be added before the big game. The new network, which was in place for the start of the current NFL season, has undergone testing by CDW at each home game, Feller said. CDW also plans to expand the network outside the stadium before the Super Bowl, in part to handle the extra events that take place not just on game day but in the days leading up to the game.

“The plan is to install more [coverage] outside, in the plaza areas,” Feller said.

When it opened in 2006, the $455 million University of Phoenix Stadium was one of the first with full-bowl Wi-Fi, using Cisco gear from the inside out. “Cisco was in here before they called it [their solution] ‘connected stadium’,” Feller said. From core network switches to firewalls to edge switches, this year there is all new Cisco gear in the venue, as well as new 3700 series APs, with panel antennas and antennas in handrails.

“Handrail [antennas] are sometimes a bit of a challenge, because you need to drill through concrete that’s 40 feet up in the air, behind another ceiling,” said Feller, describing one particular design challenge. Another one was mounting antennas on drop rods from the catwalks below the stadium’s retractable roof, to serve the upper-area seating. There are also some new Wi-Fi APs on the front row of the seating bowl, pointing up into the crowd.

“It was a fun project,” Feller said.

Stadium with roof open

All on board for the DAS

The upgrade for the stadium’s DAS, led by Crown Castle, was just finished a few weeks ago, Feller said, and included more coverage outside the stadium as well, with antennas placed on light poles and on the stadium’s shell.

“Crown Castle did a great job of managing the carriers” on what is a 48-sector DAS, Feller said. “It [the upgrade] really required a lot of creative thinking from their engineers.”

Since the stadium was originally designed with wireless in mind, Feller and his team didn’t need to build new head end room for the DAS upgrades. “But I wouldn’t say we have plenty of space left,” he added. “We’ve got a lot of new equipment.”

Though all the major carriers are expected to be on the DAS by the big game, league partner Verizon Wireless should also have some special projects up its sleeve, including another demonstration of its LTE Broadcast technology, which optimizes things like live video over LTE cellular links.

New Cardinals app a preview of Super Bowl version?

The Cardinals also had a new version of the game-day team app for this season, built by stadium-app leader YinzCam. According to Feller the new app supports three different live video feeds, as well as instant replays.

Wi-Fi antenna on railing

“It’s really cool to have that ability to watch things like a touchdown pass at the end of the game,” Feller said. And while no details have yet been revealed, in an interview with NFL CIO Michelle McKenna-Doyle earlier this year MSR learned that the league and YinzCam are working on a Super Bowl app with its own new bells and whistles. (Stay tuned for more info on the Super Bowl app.)

In addition to two more regular-season home games in December, the University of Phoenix Stadium will have at least a couple more dry runs to help test the network, during the Dec. 31 Fiesta Bowl and the NFL’s Pro Bowl, scheduled for Jan. 25. And though the Cardinals lost to the Atlanta Falcons Sunday, at 9-3 they are still tied with the Green Bay Packers for the best record in the NFC, something that has the Phoenix faithful optimistic about the postseason.

“We’re going to get some more test runs, on New Year’s Eve and during the Pro Bowl,” Feller said. “And maybe some home playoff games as well!”

(more photos below)

Wi-Fi antenna in roof rafters

More antennas in rafters

Wi-Fi antenna under overhang

Big AT&T DAS weekend in Miami: 2.7 TB of traffic for two mid-November games

We’re a couple of weeks behind in catching up here, but it’s worth backtracking to look at a huge weekend of DAS traffic at Miami’s Sun Life Stadium earlier this month. According to DAS traffic figures from AT&T, the two games held at Sun Life on Nov. 13 (Miami Dolphins vs. Buffalo Bills) and Nov. 15 (Florida State vs. Miami) generated a total of 2.735 terabytes of traffic on the AT&T-specific cellular DAS in the stadium — a pretty high mark for cellular-only traffic.

Since we know there’s also a high-capacity Wi-Fi network at Sun Life, it’s interesting to wonder how much total traffic there was for the two events. While we wait to see if the fine folks who run the stadium network will eventually provide us with the Wi-Fi details, we can drill down a bit more into the DAS numbers that AT&T is seeing across the largest stadiums this fall.

The two games in Miami that weekend were the tops for AT&T DAS traffic in both college and pro football, which according to AT&T is the first time one town has held the DAS crown in both categories. The FSU-Miami game, where the Hurricanes kept it close to the end, was the biggest single DAS traffic event of that weekend, college or pro, with 1,802 GB of data crossing the AT&T DAS network. What’s kind of stunning is to remember that these stats are for AT&T customer traffic only; total traffic from the 76,530 in attendance at the FSU-Miami game was likely much higher, but alas, we get no comparable stats from other cellular providers.
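
For perspective on what “AT&T customers only” means at that scale, here is a rough per-attendee figure (our own estimate; the per-user number for AT&T subscribers would be higher, since only a fraction of the 76,530 fans are AT&T customers):

```python
# Rough per-attendee math for the FSU-Miami DAS total reported by AT&T.
# AT&T traffic only; total traffic across all carriers would be higher.

att_das_gb = 1_802       # AT&T DAS traffic for FSU-Miami, in GB
attendance = 76_530      # reported attendance

mb_per_attendee = att_das_gb * 1_000 / attendance
print(f"~{mb_per_attendee:.0f} MB of AT&T DAS data per attendee")  # ~24 MB
```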

Other big games between highly ranked teams also scored high in AT&T’s DAS rankings that particular weekend — Alabama’s home win over then No. 1 Mississippi State was second on the list with 849 GB of DAS traffic, while Georgia’s win over visiting Auburn that Saturday recorded 676 GB of DAS traffic.

On the pro side, the second-highest AT&T DAS traffic came interestingly from San Diego, where the Chargers eked out a 13-6 win over the Raiders. We’re wondering if the DAS mark from San Diego — 730 GB, which trailed only Miami’s Thursday night mark of 933 GB in its win over Buffalo — was higher because Qualcomm Stadium still doesn’t have Wi-Fi. And again, remember that traffic at some other stadiums might have been higher — these numbers reflect only AT&T stats from venues where AT&T has an operating DAS.

Stay tuned as the football seasons come to their conclusions — with any luck we’ll get some more DAS and Wi-Fi stats to get a more complete picture of stadium traffic this season, which — surprise! — seems to be continually growing. Verizon, Sprint and T-Mobile… lend us your stats!