Super DAS: Crown Castle’s neutral host infrastructure aims to keep Super Bowl XLIX fans connected, inside and outside the stadium

Editor’s note: This story is part 1 of a series of profiles of the providers of the extensive Distributed Antenna System (DAS) deployment for Super Bowl XLIX at and around the University of Phoenix Stadium in Glendale, Ariz., and other parts of the greater Phoenix area. Stay tuned all week as we highlight how DAS will keep Super Bowl fans connected, no matter where they roam in and around Phoenix and Glendale this week and weekend.

University of Phoenix Stadium getting its Super Bowl on. (Click any photo for a larger image) Credit all photos: Paul Kapustka, MSR

When is a big air-fan vent not an air-fan vent? When it’s a fake vent covering a hidden cellular antenna, put there to keep people from noticing the technology that’s keeping their cell phones connected. Before kickoff at Super Bowl XLIX Feb. 1 in Glendale, Ariz., many fans outside the University of Phoenix Stadium will walk right by a faux vent and its sheltered equipment, never knowing the attention to detail that goes into a major-venue Distributed Antenna System (DAS) deployment.

But to stadium technology connectivity professionals, such leaps of aesthetic deception are just part of a day’s, or perhaps a month’s, DAS deployment work. For neutral host DAS provider Crown Castle, the fake vents on the shell of the University of Phoenix Stadium — and the powerful antennas behind them — are just one part of a massive project to ensure there is excellent mobile-device connectivity both inside and outside the Super Bowl stadium, so that fans never get a dropped signal anywhere between the parking lot and their prized seat.

During a recent press tour, a small team of Crown Castle employees showed off some of the upgraded DAS network deployed at the University of Phoenix Stadium as well as in the surrounding Westgate Sports and Entertainment District, a sort of open-air mall that stretches from the UoP Stadium past numerous attached restaurants and stores and also encompasses the Gila River Arena, home of the NHL’s Arizona Coyotes. Over the past year or so, Crown Castle has been upgrading the DAS inside and outside the arena, throughout the mall areas and out into the huge parking lots that surround it and the football stadium, bringing connectivity to phones used by customers of all four major U.S. wireless carriers.

Since the mall and all its food outlets are conveniently located a short stroll from the stadium, it’s a good bet that a large portion of the Super Bowl crowd will spend time wandering around the Westgate area before and after the big game. Thanks to Crown Castle’s efforts, there shouldn’t be many connectivity problems, as antenna deployments on light poles, building rooftops and — yes, even behind fake vents — should be able to keep devices on the cellular networks without a glitch.

Game day connectivity starts in the parking lot

Since we couldn’t actually spend much time wandering around the stadium itself — even three weeks before the big game, the facility was already on NFL security lockdown — most of the Crown Castle tour consisted of walking around the Westgate mall/neighborhood, hearing about the various methods Crown Castle used to locate the necessary DAS antennas. In all, there are five separate DAS networks Crown Castle is responsible for in the area around the stadium: The football stadium itself; the Gila River Arena (which we will profile in an upcoming feature on hockey stadiums); the Westgate shops and restaurants; the nearby Renaissance Hotel; and the surrounding parking lots.

Parking lot light poles, Westgate entertainment district. Can you spot the DAS?

The curious start of the tour in a far-flung parking lot made sense when we found ourselves next to a small DAS equipment box and a light pole with multiple antennas (which had not yet been covered with their final aesthetic sheaths). Aaron Lamoureux, program manager for Crown Castle’s small cell solutions, served as tour guide, and said that for the Westgate area alone there were 18 individual node locations, with about 52 antennas total. Some were located on light poles, some on rooftops, and some along walkways between buildings, to conquer the unique RF characteristics of Westgate’s mix of open-air walkways and large buildings. (See photos for DAS geek views.)

For the University of Phoenix Stadium itself, Crown Castle deployed 228 DAS antennas inside (more on this in an upcoming profile) and at 21 different locations outside the stadium, 13 of those on parking lot poles and 8 mounted on the building itself. Why building-mounted antennas? If you’ve never been there, the University of Phoenix Stadium has a large plaza area on one side, which is used for pre-game activities like rallies, bands and other walk-up amenities where fans gather before entering. The challenge for Crown Castle was finding places to deploy antennas at a low enough height to cover crowds of people standing in one location. While some parts of the building allowed for regular antenna placements, a big part of the plaza faces a stretch of stadium wall that is a sheer, featureless surface, with no unobtrusive place to mount a DAS antenna — unless you add a fake vent or two to the existing design, that is.

Keeping everyone happy is part of the neutral host job

See the big air vents? Nobody would tell us which ones were ‘faux vents,’ there to hide DAS antennas

To people outside the industry it might seem silly to go to such lengths just to keep folks from noticing antennas, but anyone who’s deployed a network for a detail-oriented building owner knows why aesthetics are important. That’s why you paint antenna enclosures to match the surrounding walls, or build sheaths to keep wires and other obvious gear out of plain sight. It’s part of the art of wireless network deployment, and not as simple as it sounds. Experience counts.

The complex owner and operator relationships involved in the stadium and surrounding-area DAS also seem tailor-made for a big, experienced provider like Crown Castle, which has a long history of deploying and operating multiple-tenant networks. With five different landlords and four different carriers, being the neutral DAS host for this year’s Super Bowl is a task with many moving parts; but, as Mike Kavanagh, president of sales for Crown Castle’s small cell solutions, said, “We understand how to run networks, how to manage them and deal with carriers. It’s high touch and very fluid. But we know that business.”

COMING UP NEXT: What’s inside the network inside the stadium.

MORE PICTURES BELOW! (Click on any picture for a larger image.)

Sky Harbor Airport: Ready for Super XLIX

Verizon’s NFL Mobile ads were in airport walkways well before the Big Game

If you stumble off the escalator, Bud Light is there to catch you

The Westgate uber-mall should see a lot of fan activity (and connectivity) on game day

Here’s the official Super Bowl replay HQ (actually a place with DAS antennas on the roof that you can’t see)

Mama Gina’s will offer you pizza and DAS on the roof

More DAS antennas, on a Westgate walkway

Outside UoP Stadium, where the architecture allows for DAS antenna placement

Close-up of that placement. Still pretty well hidden.

Parking lot light mounts. These will have sheaths by Super Sunday.

Here’s the remote equipment box that powers the light pole antennas. Also scheduled for more concealment.

Every artist leaves a signature…

Stadium Tech Report: AT&T Stadium’s massive antenna deployment delivers solid Wi-Fi, DAS performance

The old saw that says “everything’s bigger in Texas” is not just a stereotype when it comes to wireless networking and AT&T Stadium. Though our visit was brief and we didn’t have the opportunity to do a deep-dive technology tour, the MSR team on hand at the recent College Football Playoff championship game came away convinced that even if it’s not the fastest fan-facing stadium network, the Wi-Fi and DAS deployments at AT&T Stadium are certainly the biggest, at least the largest we’ve ever heard of.

Inside AT&T Stadium at the College Football Playoff championship game. (Click on any photo for larger image) Credit all photos: Paul Kapustka, MSR

And in many ways we found, bigger is better, at least when it comes to staying connected inside one of the world’s truly humongous indoor spaces.

If you’ve not seen the stats, allow us to remind you that during the Jan. 12 championship game between the University of Oregon and THE Ohio State University the AT&T Stadium network carried more than 6 terabytes of wireless data, with almost 5 TB of that over the in-house Wi-Fi network. Another 1.4 TB was used by AT&T customers on the AT&T-hosted neutral DAS, which almost certainly carried another terabyte or two from the other carriers on the system, none of whom reported statistics. Any way you add it up, it’s the biggest single-day wireless data figure we’ve ever heard of for a sports arena, professional or college, in any sport at any time.
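For a rough sense of how those pieces add up, here is a minimal back-of-the-envelope tally; the 4.9 TB figure is just our reading of “almost 5 TB,” and the other-carrier range is the estimate from the paragraph above, not a reported number.

```python
# Back-of-the-envelope tally of the reported wireless data from the Jan. 12
# CFP championship game at AT&T Stadium. The Wi-Fi figure is our reading of
# "almost 5 TB"; the other-carrier range is an estimate, not a reported number.
reported_tb = {
    "in-house Wi-Fi": 4.9,            # "almost 5 TB"
    "AT&T customers on the DAS": 1.4,
}
other_carriers_estimate_tb = (1.0, 2.0)  # "another terabyte or two", unreported

reported_total = sum(reported_tb.values())
low = reported_total + other_carriers_estimate_tb[0]
high = reported_total + other_carriers_estimate_tb[1]

print(f"Reported total: {reported_total:.1f} TB")                   # ~6.3 TB
print(f"Likely total with other carriers: {low:.1f}-{high:.1f} TB")
```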

Flooding the zone with more antennas and APs

How do you get such a big data number? One way is to make sure that everyone can connect, and one way to get to that point is to flood the zone with antennas and access points. Already the leader in the number of Wi-Fi access points and DAS antennas, AT&T Stadium had another 280 Wi-Fi antennas installed between Thanksgiving and the college championship game, according to John Winborn, CIO for the Dallas Cowboys. Some of those antennas, the staff said, were housed in new under-the-seat enclosures that AT&T’s Foundry designed specifically for the lower bowl of AT&T Stadium, which, like other venues, had previously had trouble getting connectivity to seats close to field level.

According to Winborn, AT&T Stadium now has more than 1,600 Wi-Fi APs in use for football games, and 1,400 antennas in its DAS network. By comparison, Levi’s Stadium in Santa Clara, Calif., perhaps the newest and one of the most technologically savvy venues out there, has 1,200 Wi-Fi APs and 700 DAS antennas in its deployments. Winborn also said that the antenna/AP count at AT&T Stadium can scale up as necessary, especially for events that use more of the building’s space, like the Final Four basketball tournament held there last spring.

“We scaled up to 1,825 [Wi-Fi] antennas for the Final Four last year,” said Winborn in a recent email, where he guessed that the venue might deploy up to 2,000 Wi-Fi APs when the Academy of Country Music Awards holds its yearly event at AT&T Stadium on April 19.

Hiding Wi-Fi APs an aesthetic priority

For all the extra numbers, one thing we noticed in walking around the building on Jan. 12 was that seeing an exposed Wi-Fi AP is about as common as seeing an albino deer. When we asked Winborn what the toughest thing was about network deployment in the venue, he responded quickly: “Finding ways to hide the APs so Jerry [Jones] doesn’t see them.”

With the price-is-no-object Jones on one side, and AT&T’s corporate image on the other, it’s clear there aren’t too many budgetary concerns when it comes down to spending more to make the network work, or look, better. Put it this way: You are never likely to have a “no signal” problem in a building that has on its outside an AT&T logo the size of the moon, and where AT&T CEO Randall Stephenson can be found wandering around the suite level during big events.

Though the immense space could probably be covered by fewer antennas if connectivity had been part of the original design, it’s worth remembering that when the building opened in 2009, it wasn’t designed with high-speed networking in mind. That means that almost all of the Wi-Fi and DAS deployments are a retrofit, including the ingenious circle of Wi-Fi antennas halfway up the seating bowl, which are covered by a tented ring of fiberglass designed and built specifically for the stadium.

According to Winborn, the Wi-Fi network is supported by its own 2 Gbps backbone, with separate backbones in place for media networks and stadium application use. Winborn also noted that the stadium network runs 3,500 TVs via the Cisco StadiumVision system. Other records from this season include a peak concurrent Wi-Fi user mark of 27,523 (set at the Lions playoff game) and 38,534 unique Wi-Fi connections, a mark set at the season opener against the San Francisco 49ers.

Performance solid, even at rooftop level

The view from the nosebleed section

So how fast are the Wi-Fi and DAS networks? In our limited testing time at the CFP game, we found solid connections almost everywhere we tried, including outside the stadium while we (freezingly) waited for the doors to open. Just outside the main ticket gate, we got a Wi-Fi signal of 23.93 Mbps on the download and 39.67 Mbps on the upload. At the same location a Verizon 4G LTE device got a 5.93 Mbps download speed, and a 2.59 Mbps upload speed, but it’s unclear if that was on the stadium DAS or just on the local macro network.

When the doors finally opened at 5:30 p.m. (no idea why Jerry kept us all out in the cold all afternoon) we went inside and got solid connections inside the foyer of the pro shop — 18.23/21.74 on Wi-Fi, 21.05/14.84 on an AT&T 4G LTE device, and 12.65/4.61 on a Verizon 4G LTE phone. (It’s worthwhile to note that all our Wi-Fi speeds were recorded on the Verizon device, a new iPhone 6 Plus.)

Down in our field-level suite, where we were the guests of AT&T, we got marks of 19.43/25.31 on the Wi-Fi, 7.35/11.04 on AT&T 4G and 5.71/4.05 on Verizon 4G. An interesting note here: When Oregon scored a touchdown on its opening drive, we took another Wi-Fi speedtest right after the play and got readings of 4.38/7.79, suggesting that there were many Ducks fans communicating the good news.

Later during the game we wandered up to the “Star Level” suites (floor 6 on the stadium elevator) and got a Wi-Fi mark of 11.57/30.51, and 19.31/13.46 on AT&T 4G. The only place we didn’t get a good Wi-Fi signal was at the nosebleed-level plaza above the south end zone, where we weren’t surprised by the 1.41/1.98 Wi-Fi mark since we didn’t see any place you could put an AP. We did, however, get an AT&T 4G signal of more than 7 Mbps on the download in the same location, meaning that even fans way up at the top of the stadium were covered by wireless, no small feat in such a huge space.
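For readers who want to run similar spot checks, here is a minimal sketch of how readings like ours could be logged and summarized; the numbers simply restate the Wi-Fi figures above, and the 5 Mbps “usable” floor is an arbitrary threshold of our own, not anything AT&T uses.

```python
# Minimal sketch: log speed-test readings (download/upload in Mbps) by
# location, then flag any spot below an arbitrary "usable" floor.
readings = [
    ("Outside main ticket gate", 23.93, 39.67),
    ("Pro shop foyer",           18.23, 21.74),
    ("Field-level suite",        19.43, 25.31),
    ("Star Level suites",        11.57, 30.51),
    ("South end zone plaza",      1.41,  1.98),
]

FLOOR_MBPS = 5.0  # our own threshold, not an AT&T target

for location, down, up in readings:
    status = "OK" if down >= FLOOR_MBPS else "weak"
    print(f"{location:26s} {down:6.2f} / {up:6.2f} Mbps  [{status}]")
```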

Bottom line: Network in place for whatever’s next

If there is a place where AT&T Stadium falls behind other venues, it’s in the synchronization of network and app; since the building wasn’t designed with food delivery in mind, it’s doubtful that AT&T will match Levi’s Stadium’s innovative delivery-to-any-seat feature anytime soon. And even though AT&T Stadium is dominated by the massive over-the-field TV set, fans at the CFP championship game were left in the dark during questionable-call replays, since they weren’t shown on the big screen and aren’t supported in the AT&T Stadium app.

What could be interesting is if the technology demonstrated by AT&T at the big college game – LTE Broadcast, which sends a streaming channel of live video over a dedicated cellular link – becomes part of the AT&T Stadium repertoire. From experience, such a channel could be extremely helpful during pregame events, since many fans at the college championship were wandering around outside the stadium unsure of where to go or where to find will-call windows. A “pre-game info” broadcast over LTE Broadcast could eliminate a lot of pain points of getting to the event, while also introducing fans to the network and app for later interaction.

At the very least, AT&T Stadium’s network puts it among the top three most-connected football stadiums, alongside Levi’s Stadium and Sun Life Stadium in Miami. Here’s looking forward to continued competition among the venues, with advancements that will only further improve the already excellent wireless fan experience.

More photos from our visit below. Enjoy!

Fans freezing outside waiting for the CFP game to start

Creative OSU fan

Plug for the app

AT&T Stadium NOC aka “the Fishbowl”

Sony Club. Now we know where Levi’s Stadium got its “club” ideas

Panoramic view (click on this one!)

A glass (cup?) of bubbly to celebrate the 6 TB event

Seahawks hit local Wi-Fi record during playoff game with 2.6 Terabytes of traffic; Verizon maintains cone of silence over its investment in network

The Wi-Fi network that debuted in CenturyLink Field this season had its highest Wi-Fi traffic day last weekend, when 2.6 terabytes of data was carried during the Seattle Seahawks’ Jan. 10 playoff victory over the Carolina Panthers.

According to the Seahawks’ tech staff, 18,899 of the 68,524 fans in attendance used the Wi-Fi network at some point, with a peak concurrent user mark of 15,662. Peak bandwidth utilization of 1.4 Gbps was reached just after Kam Chancellor sealed the win with his electric 90-yard interception return for a touchdown, the tech staff said.
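A couple of quick figures can be derived from those reported numbers; the short worked example below is our own arithmetic, with decimal units (1 TB = 1,000,000 MB) assumed since the team did not specify.

```python
# Derived metrics from the Seahawks' reported Jan. 10 Wi-Fi numbers.
# Decimal units (1 TB = 1,000,000 MB) are assumed; the team did not specify.
wifi_data_tb = 2.6
wifi_users = 18_899
attendance = 68_524
peak_concurrent = 15_662

take_rate = wifi_users / attendance
avg_mb_per_user = wifi_data_tb * 1_000_000 / wifi_users
concurrency_ratio = peak_concurrent / wifi_users

print(f"Wi-Fi take rate: {take_rate:.1%}")                               # ~27.6%
print(f"Average data per Wi-Fi user: {avg_mb_per_user:.0f} MB")          # ~138 MB
print(f"Peak concurrent share of unique users: {concurrency_ratio:.0%}") # ~83%
```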

The Wi-Fi numbers represent traffic on both of the separate Wi-Fi networks in the stadium, one of which is reserved exclusively for Verizon Wireless customers. Verizon, which has declined to comment publicly on the specifics of its partnership with the Seahawks, is believed to have bankrolled a major portion of the Wi-Fi deployment at CenturyLink. Before the Wi-Fi partnership between the Seahawks, Wi-Fi gear provider Extreme Networks and Verizon was officially announced on Oct. 29, Verizon claimed it had “added an in-stadium Wi-Fi system” at CenturyLink prior to the start of the 2014 football season, as part of a national football stadium Wi-Fi map Verizon published on Sept. 4.

The Wi-Fi deployment was something of a surprise, since team officials had long said they were looking at 2015 as the year they might pull the trigger on a Wi-Fi expenditure. Apparently, having available funding from Verizon helped push the project forward faster than expected; but again, we have no official confirmation or explanation of the exact fiscal participation level of the partners involved.

For Seattle fans, having Verizon as a Wi-Fi partner has additional perks — in addition to a separate Wi-Fi network for Verizon customers, all fans at the stadium also have access to the NFL Network’s popular RedZone channel, via the new Seahawks stadium app created by YinzCam. Because of various conflicting rights contracts, RedZone isn’t available in most stadiums for fans to watch. The Seahawks also have a number of live-action and replay views available via the app; however, the stadium’s tech team did not have any metrics on fan use of the app or the number of video replays watched.

Wi-Fi access point antennas visible on poles at CenturyLink Field, Seattle. Credit: Extreme Networks

Verizon’s reluctance to comment publicly on its Wi-Fi deployments is no surprise; repeated MSR requests for interviews with Verizon executives about Wi-Fi deployments are routinely ignored by Verizon representatives, and public quotes like the one in the official press release from Bobby Morrison, Verizon Wireless president for the Pacific Northwest and Alaska, don’t offer any details about Verizon’s level of fiscal commitment to the CenturyLink deployment. Verizon has also declined to comment on its Wi-Fi network deployments at Ford Field in Detroit and at the Staples Center in Los Angeles.

Verizon executives were also conspicuously absent from a stadium-technology event centered on the Wi-Fi network earlier this week, leading some reports to omit Verizon’s considerable participation in the network’s deployment, something NFL CIO Michelle McKenna-Doyle told MSR about earlier this year.

Though no statistics were available from Verizon about the DAS deployment it also put in at CenturyLink Field this year, AT&T did share some DAS stats from CenturyLink for the Seahawks’ Dec. 14 victory over the San Francisco 49ers. During that game, AT&T customer traffic on the AT&T DAS at CenturyLink was 395 GB, according to AT&T. It will be interesting to see how much wireless traffic this weekend’s NFC championship game between the Seahawks and the Packers generates — we’ll track down as much of it as we can to see how it compares to the 6 TB mark set at the recent College Football Playoff championship game at AT&T Stadium in Arlington, Texas.

AT&T: Miami Dolphins, Oklahoma State are tops for season-long DAS traffic for pro, college stadiums

The Miami Dolphins’ Sun Life Stadium and the Oklahoma State Cowboys’ Boone Pickens Stadium were the top professional and collegiate venues, respectively, when it came to cellular data traffic used by fans on AT&T networks during this past football season, according to AT&T.

As a caveat we need to say that the numbers used here do not reflect all wireless data users — they only represent totals from AT&T customers at stadiums where AT&T has a DAS, or distributed antenna system, installed. Still, since AT&T is the runaway leader when it comes to DAS installations — and so far, the ONLY wireless provider willing to share stadium statistics — for now its numbers are the only ones we have, so we’ll use them to award the honorary crystal iPhone trophy, or whatever top-traffic winners get.

Among NFL stadiums, Miami’s Sun Life took the top average honors by a wide margin, with an average of 1 terabyte of DAS data per game. Next highest was Dallas and AT&T Stadium, with an average of 830 GB per game; the next three were San Diego’s Qualcomm Stadium (735 GB per game), the San Francisco 49ers’ Levi’s Stadium (601 GB per game) and the New Orleans Superdome (596 GB per game). What’s interesting about the top five is that all except San Diego also have fan-facing Wi-Fi, so even with Wi-Fi available there are still high cellular data totals.

On the collegiate side, OSU’s Boone Pickens Stadium racked up an average of 769 GB per home game, pretty impressive when you consider the capacity is only around 60,000 fans. In second place was the University of Miami and Sun Life Stadium, whose network must just stay warm all the time; the U’s fans used an average of 745 GB per game, according to AT&T. The next three highest averages were all close, led by Texas A&M’s Kyle Field (668 GB per game), then Baylor’s McLane Stadium (661 GB) and the University of Alabama’s Bryant-Denny Stadium (660 GB). For more numbers, see the AT&T blog post on the collegiate data season.

Stadium Tech Report: Network finishes season strong at Niners’ Levi’s Stadium

Arriving at Levi’s Stadium for last 2014 season game

While the football season didn’t turn out like 49ers fans wanted, the wireless network at the team’s new Levi’s Stadium closed out the year with strong performances to complete a largely glitch-free stretch of events at what is one of the world’s most technologically advanced stadiums.

With more than 2 Terabytes of data used by fans at each of the last two home games for the San Francisco 49ers, the Wi-Fi and DAS networks at Levi’s Stadium closed out a season of superb connectivity that eventually allowed the team to not just offer in-seat food and beverage delivery, but also ordering and delivery of merchandise like hats and T-shirts, an option that was available for the Dec. 20 game against the San Diego Chargers and the Dec. 28 closer vs. the Arizona Cardinals.

According to the Levi’s Stadium network crew, the Wi-Fi network carried 2.34 TB of data for the Chargers game and another 2.11 TB for the Cardinals game, with 20,096 fans using the network on Dec. 20 and 20,164 on Wi-Fi on Dec. 28. Peak concurrent user numbers were 13,700 for the Chargers game, and 14,400 for the season closer.

All season long, our speed tests in various parts of the stadium showed strong signals for both the Wi-Fi network and the distributed antenna system (DAS)-enhanced cellular network. At the final game it was no different; we found Wi-Fi download speeds of 16 Mbps on the rooftop deck, 25 Mbps in a suite and a scorching 39 Mbps in the Levi’s 501 Club seats (no doubt in part because there was a Wi-Fi antenna under the seat next to us).

Both Verizon and AT&T 4G LTE services also worked well, consistently scoring download speeds in the 4-6 Mbps range in most places and much higher in others. In short, we didn’t find any flaws in the network coverage in five games of walking all around the stadium, testing everywhere we went.

CalTrain to VTA a smooth ride

Caltrain crowd en route to Arizona game

At the final game, Mobile Sports Report (me) tested out the full public-transit method of getting to the game, starting from San Mateo on CalTrain at 10:51 a.m. The parking lot at the station was almost completely empty, and free since it was Sunday; it’s possible that crowds were lighter because the Niners had been eliminated from postseason play, but nevertheless the ride to Mountain View went without a hitch, a good sign for next year, when many fans in town for Super Bowl 50 will no doubt be using CalTrain to get from San Francisco to Levi’s.

At the Mountain View CalTrain/VTA station, operations were the best I’ve seen, with more neon-vested VTA helpers offering clear instructions on why you might want to take an express bus instead of the light rail. Insider tip: If the express bus is available, take it, because in our testing it arrived at Levi’s in about half the time of the train trip (roughly 20 minutes, as opposed to almost 40 minutes for the light rail option).

Express bus option at Mountain View

The only thing that still remains to be ironed out is the fare confusion in switching from CalTrain to VTA, which are two different operators. On CalTrain there was advertising for a $6 “combo ticket” that would let you ride VTA and could be purchased at the same time you bought your CalTrain pass. But an online single-day ticket purchased via the VTA app was only $4.50, so it’s not clear why you would buy the combo ticket. Especially for the Super Bowl, it’d help fans if there were one price and one place to buy a single “Get to Levi’s” public-transit ticket.

Food order arrives as promised

Another thing I tried at the season closer was the in-seat food ordering feature on the Levi’s Stadium app. Sitting in the Levi’s Club section seats on the third level, I consulted the app to order a cold beer and a warm pretzel, which the app said could be delivered in 10 minutes.

Food runner bringing me my cold beer and warm pretzel

After I entered credit-card information into the app and hit the order button, the app updated itself with timely notices that the order was being prepared, and then that it was on its way. I found that information very reassuring, a sign that things were indeed happening; a big number was even associated with my order, apparently to make it easier for the food runner to confirm the order.

The order arrived in exactly 10 minutes, as predicted by the app — it also arrived in a lot of extra packaging, a separate plastic bag for the steel bottle of beer and a paper sack holding a cellophane-wrapped pretzel. Since there is no way to add a gratuity in the app, I gave the runner a cash tip, which seemed appropriate even though a $5 delivery charge is added to the order cost. I have to admit it felt a little weird to have someone bring me my food and drink, but in the time it took to order and deliver I sat in my seat and watched the Niners’ game-winning TD drive, so it’s clearly a fan-friendly option.

Video replays work well, for a small number of viewers

Another part of the Levi’s Stadium technology that was at peak performance by year’s end was the app’s instant replay feature. Though it started slowly and had some hiccups early on, by the final game instant replays were appearing in the app even before the next play had concluded (see our feature on how the VenueNext team gets the replays to the app so quickly).

While it’s an impressive addition to the in-game experience, the replays are a feature that only a small number of fans are watching. According to the team’s network stats there were only 1,253 unique users watching replays on Dec. 20, and 1,019 during the Dec. 28 game. Total replays viewed for the Chargers game were 6,285, while 4,310 replays were watched during the season closer.
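Set against the Wi-Fi user counts reported above, those replay numbers work out to a fairly small slice of the connected crowd; the quick calculation below is our own arithmetic from the team’s figures.

```python
# Replay-feature adoption at the final two Levi's Stadium home games,
# derived from the team's reported figures (our arithmetic).
games = {
    "Dec. 20 vs. Chargers":  {"wifi_users": 20_096, "replay_viewers": 1_253, "replays": 6_285},
    "Dec. 28 vs. Cardinals": {"wifi_users": 20_164, "replay_viewers": 1_019, "replays": 4_310},
}

for name, g in games.items():
    adoption = g["replay_viewers"] / g["wifi_users"]
    per_viewer = g["replays"] / g["replay_viewers"]
    print(f"{name}: {adoption:.1%} of Wi-Fi users watched replays, "
          f"about {per_viewer:.1f} replays each")
```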

Why aren’t the replays catching on? Our main guess is that the Levi’s Stadium big screens are so clear and so quick to show replays (they also show live action as it’s happening) that fans don’t find it necessary to use their phones to watch them. It’s also possible that many fans in the stadium who are using the network aren’t using the Levi’s Stadium app. Indeed, according to the team network stats, the team app hasn’t yet cracked the top four apps used at any of the games this season; for the Dec. 20 game the top apps on the network were Amazon cloud drive, Facebook, Google APIs (probably Gmail) and Apple; for Dec. 28 the list was Amazon, Google, Facebook, then Apple.

We’ll try to get some season-long stats to share for both the network and the app features, but our quick conclusion after five live-game visits to Levi’s Stadium this year is that the wireless network and the app both pretty much lived up to their pre-season billing and hype, delivering a wireless game-day experience that is the standard other large public facilities will be measured against, going forward. More photos from our last visit below.

The Microsoft Surface/sideline Wi-Fi unit

close-up of cable connection

Niners’ Flickr promotion on scoreboard — very popular

Sideline Surface tablets for Niners players and coaches

Colin Kaepernick exchanges his radio helmet for his flat-brimmed hat after throwing a TD pass

View from the Levi’s skydeck out over Santa Clara

If you throw a rooftop party, the cheerleaders might visit

View from the Levi’s 501 Club section seats

Wi-Fi antenna right under the seat next to me (probably why my speed test was 40+ Mbps)

In-stadium signage to help get fans to light rail

End of game view from skydeck

A final toast to the season at the BNY Club

VTA train line. Only took 15 minutes from here to get on bus.

Caltrain platform at Mountain View. Extra trains helped make ride home easy

Stadium Tech Report: Corning, IBM bringing fiber-based Wi-Fi and DAS to Texas A&M’s Kyle Field

Kyle Field, Texas A&M University. Credit all photos: Texas A&M

Editor’s note: The following is an excerpt from our recent Stadium Tech Report series COLLEGE FOOTBALL ISSUE, a 40-page in-depth look at Wi-Fi and DAS deployment trends at U.S. collegiate football stadiums. You can download the full report for free, to get more stadium profiles as well as school-by-school technology deployment capsules for both the SEC and Pac-12 conferences.

When Texas A&M’s newly renovated Kyle Field opens for the 2015 football season, its outside appearance will have changed dramatically. But from a networking perspective, what’s really different is hidden on the inside – namely, an optical fiber infrastructure designed to bring a new level of performance, cost savings and future-proofing to stadium network deployments.

While the use of optical fiber instead of copper cable in large networks isn’t exactly “new” in the core telecom or enterprise networking worlds, in the still-nascent field of stadium network deployments fiber has yet to make large inroads. But the promise of fiber’s ability to deliver much higher performance and greater future-proofing at lower installation costs in stadium situations may get a very visible poster child when Texas A&M’s football facility kicks off the 2015 season with a technology infrastructure designed to be among the most comprehensive in any stadium, collegiate or professional.

With a Wi-Fi network designed to support 100,000 concurrent connections, a robust DAS network with more than 1,000 antennas, and an IPTV deployment with more than 1,000 screens, the IBM-designed network based largely on Corning’s fiber-optical systems is incredibly impressive on paper – and it has already produced some eye-popping statistics this past season, when just a part of it came online during the “Phase 1” period of the two-phase $450 million Kyle Field renovation.

The final phase of the renovation, Phase 2, just now getting underway, began with the implosion of the stadium’s west stands, with reconstruction scheduled to finish in time for the 2015 season with a new, enclosed-bowl structure that will seat 102,512 fans. And if the new network delivers as planned, those fans will be among the most-connected anywhere, with plenty of future-proofing to make sure it remains that way — thanks to fiber.

Driving on the left side of the street

What’s going to be new about Kyle Field? According to news reports some of the creature comforts being added include redesigned concession stands, so-called “Cool Zones” with air conditioning to beat the Texas heat, well-appointed luxury suites and new restrooms – including 300 percent more women’s bathrooms.

Scoreboard, Kyle Field

According to representatives from the school, the decision to make the new stadium a standout facility extended to its network infrastructure. “Our leadership decided that [the stadium renovation] would be leading edge,” said Matthew Almand, the IT network architect for the Texas A&M University System, the administrative entity that oversees university operations, including those at the flagship school in College Station, Texas. “There were some leaps of faith and there was a decision to be leading edge with technology as well.”

Though Phase 1 planning had started with a traditional copper-cable design for the network, Almand said a presentation by IBM and its “smarter stadium” team changed the thinking at Texas A&M.

“The IBM team came in and did a really good job of presenting the positive points of an optical network,” Almand said.

Todd Christner, now the director, wireless business development at Corning, was previously at IBM as part of the team that brought the optical idea to Texas A&M. While talking about fiber to copper-cable veterans can sometimes be “like telling people to drive on the left side of the street,” Christner said the power, scalability and flexibility of a fiber network fit in well with the ambitious Kyle Field plans.

“The primary driving force [at Texas A&M] was that they wanted to build a state of the art facility, that would rival NFL stadiums and set them apart from other college programs,” Christner said. “And they wanted the fan [network] experience to be very robust.”

With what has to be one of the largest student sections anywhere – Christner said Texas A&M has 40,000 seats set aside for students – the school knew it would need extra support for the younger fans’ heavy data use on smartphones. School officials, he said, were also concerned about the DAS, which in the past had been left to outside operators with less than satisfactory results. So IBM’s presentation of a better, cheaper alternative for all of the above found accepting ears.

“It was the right room for us to walk into,” Christner said.

IBM’s somewhat radical idea was that instead of having separate copper networks for Wi-Fi, DAS and IPTV, there would be a single optical network with the capacity to carry the traffic of all three. Though the pitch for better performance, far more capacity, use of less space, and cheaper costs might sound a bit too good to believe, most of it is just the combination of the simple physics advantages of using fiber over copper, which are well known in the core telecom and large-enterprise networking worlds, applied to a stadium situation.

Deploying now and for the future

Corning ONE DAS headend equipment.

Without going too deeply into the physics or technology, a simple explanation of the benefits stems from the fact that optical fiber can carry far more bandwidth than copper, over greater distances, using less power. That advantage is one reason why fiber is used extensively in core backbone networks, and has been creeping slowly closer to the user’s destination through deployments like Verizon’s FiOS.

Why hasn’t fiber taken over completely? Mainly because in single-user deployments – like to a single home or office – it is still costly to replace systems already in the ground or in the wall with fiber, and for many users fiber’s capacity can be overkill. Fiber’s main benefits come when lots of bandwidth is needed and the scale of a project is large, since one main advantage is the elimination of a lot of internal switching gear, which takes up space and consumes lots of power.

Those reasons accurately describe the perfect bandwidth storm happening in networked stadiums these days, where demand seems to keep increasing on a daily basis. Some stadiums that were at the forefront of the wireless-networking deployment trend, like AT&T Park in San Francisco and AT&T Stadium in Arlington, Texas, have been in a near-constant state of infrastructure upgrades due to the ever-increasing needs for more bandwidth. And Isaac Nissan, product manager for Corning ONE, said new equipment like Wi-Fi access points with “smart” or multiple-input antennas are also going to help push the LAN world into more fiber on the back end.

But there’s another drawback to using fiber, which has less to do with technology and more to do with history: Installers, integrators and other hands-on networking folks in general are more comfortable with copper, which they know and have used for decades. Fiber, to many, is still a new thing, since it requires different skills and techniques for connecting and pulling wires, as well as for managing and administering optical equipment.

“There’s definitely a learning curve for some of the RF [industry] people, who have been doing coax for 20 years,” Nissan said. “Fiber is a little different.”

Texas A&M’s Almand admitted that bringing the stadium’s networking group into a new technology – fiber – was a challenge, but one with a worthy payoff.

Copper cable tray hardly filled by optical fiber

“There’s definitely been a gear-up cycle, getting to a new confidence level [with fiber],” Almand said. But he added that “sometimes it’s good to break out of your comfort zone.”

Lowering the IDF count

Christner said the Corning optical gear is at the center of the Kyle Field deployment, providing support for the fan-facing Wi-Fi as well as Wi-Fi for back-of-house operations like point of sale; it also supports the stadium DAS, as well as a network of more than 1,000 IPTV screens. Aruba Networks is the Wi-Fi gear supplier, and YinzCam is helping develop a new Kyle Field app that will include support for using smartphones as remote controls for IPTVs in suites.

On the Wi-Fi side, Christner said the finished network will have 600 APs in the bowl seating areas, and another 600 throughout the facility, with a stated goal of supporting 100,000 concurrent 2 Mbps connections. The DAS, Christner said, is slated to have 1,090 antennas in 50 sectors.
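To put that stated goal in perspective, here is a rough sketch of the arithmetic it implies; the per-AP figures are our own derivation from the numbers above, not design targets published by Corning or IBM.

```python
# Rough arithmetic implied by the stated Kyle Field design goal: 100,000
# concurrent connections at 2 Mbps each, over roughly 1,200 Wi-Fi APs.
# The per-AP figures are our derivation, not published design targets.
concurrent_users = 100_000
per_user_mbps = 2
total_aps = 600 + 600  # bowl APs plus the rest of the facility

aggregate_gbps = concurrent_users * per_user_mbps / 1_000
users_per_ap = concurrent_users / total_aps
mbps_per_ap = users_per_ap * per_user_mbps

print(f"Aggregate throughput goal: {aggregate_gbps:.0f} Gbps")      # 200 Gbps
print(f"Roughly {users_per_ap:.0f} concurrent users per AP")        # ~83
print(f"Roughly {mbps_per_ap:.0f} Mbps per AP at the stated goal")  # ~167
```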

With no intermediate switching gear at all, Christner said that for the fiber network in Kyle Field only 12 intermediate distribution frames (the usually wall-mounted racks that support network-edge gear, also called IDFs) would be needed, as opposed to 34 IDFs in a legacy fiber/coax system. In addition to using less power, the cabling needed to support the fiber network is a fraction of what would have been needed for coax.

One of the more striking pictures of the deployment is a 36-inch wide cable tray installed for the original copper-network plan, which is carrying just 10 inches of fiber-optic cable. Christner said the fiber network also provides a cleaner signal for the DAS network, which already had a test run this past season, when 600 DAS antennas were deployed and lit during the 2014 season.

“At the Ole Miss game we had 110,663 fans at the stadium, and according to AT&T on the DAS all their lights were green,” Christner said. “Via our completely integrated fiber optic solution, we are now able to provide the DAS with much higher bandwidth as well,” said Texas A&M’s Almand, who also said that the carriers have responded very positively to the new DAS infrastructure.

Up from the dust – a model for the future?

Antenna and zone gear box near top of stands

Also included in the design – but not being used – are an additional 4,000 spare fibers at 540 zone locations, which Christner said can be immediately tapped for future expansion needs. And all of this functionality and flexibility, he added, was being built for somewhere between one-third and 40 percent less than the cost of a traditional copper-based solution.

The proof of the network’s worth, of course, will have to wait until after the west stands are imploded, the new ones built, and the final pieces of the network installed. Then the really fun part begins, for the users who will get to play with everything from 38 channels of high-def TV on the IPTV screens to the multiple-angle replay views and other features planned for the mobile app. At Texas A&M, IBM’s support squad will include some team members who work on the company’s traditionally excellent online effort for the Masters golf tournament, as well as the “smarter stadium” team.

For Texas A&M’s Almand, the start of the 2015 season will mark the beginning of the end, and a start to something special.

“If I were a country singer, I’d write something about looking forward to looking back on this,” Almand said. “When it’s done, it’s going to be something great.”