Texas A&M’s Kyle Field: A network built for speed

Full house at Kyle Field. All Photos: Paul Kapustka, MSR (click on any photo for a larger image)


Is there a combined stadium Wi-Fi and DAS deployment that is as fast as the one found at Texas A&M’s Kyle Field? If so, we haven’t seen or heard of it.

In fact, after reviewing loads of live network-performance data of Kyle Field’s new Wi-Fi and DAS in action, and after maxing out the top levels on our speed tests time after time during an informal walk-around on a game day, we’ve come to the conclusion that Kyle Field has itself a Spinal Tap of a wireless deployment. Meaning, that if other stadium networks stop at 10, this one goes to 11.

Movie references aside, by the numbers Kyle Field’s wireless network performance is quite simply unequaled by any other large public venue we’ve tested in terms of raw speed and the ability to deliver bandwidth. With DAS and Wi-Fi speed measurements ranging between 40 Mbps and 60+ Mbps pretty much everywhere we roamed inside the 102,512-seat venue, it’s safe to say the school’s desire to “build the best network” in a stadium hit its mark.

Editor’s note: This story is part of our most recent STADIUM TECH REPORT, the COLLEGE FOOTBALL ISSUE. The 40+ page report, which includes profiles of stadium deployments at Texas A&M, Kansas State, Ole Miss and Oklahoma, is available for FREE DOWNLOAD from our site. Get your copy today!

In one sense, the network’s top-line performance is not that much of a surprise, since as part of an overall Kyle Field renovation that has already cost an estimated $485 million, the optical-based Wi-Fi, DAS and IPTV deployment inside the Aggies’ football palace is probably among the most expensive and expansive in-venue networks ever built. According to Phillip Ray, Vice Chancellor for Business Affairs at The Texas A&M University System, the total cost of the network was “somewhere north of $20 million.”

Remote optical cabinet and Wi-Fi AP at Kyle Field.


And with the nation’s biggest cellular carriers, AT&T and Verizon Wireless, paying nearly half the network’s cost – $10 million, according to Ray – plus the dedication and work crews brought to the table by main suppliers IBM and Corning and Wi-Fi gear vendor Aruba, you have components, expertise and budgetary freedom that perhaps only a small group of venue owners could hope to match.

But just throwing money and technology at a stadium doesn’t necessarily produce a great network. In a venue the size of the new Kyle Field there needs to be great care and innovative thinking behind antenna placement and tuning, and in that arena Texas A&M also had the guiding hand of AmpThink, a small firm with oversized smarts in Wi-Fi deployment, as evidenced by its impressive track record of helping wireless deployments at the biggest events including several recent Super Bowls.

The core decision to go with optical for the network’s guts, and a tactical decision to put a huge chunk of the Wi-Fi APs in under-seat deployments, are just part of the strategy that produced a network that – in A&M fan parlance – can “BTHO” (Beat The Hell Out of) most challengers.

Since it’s almost impossible to directly compare stadiums and venue network performances due to all the possible variables, you’ll never hear us at Mobile Sports Report declare a “champion” when it comes to click-bait themes like “the most connected stadium ever.” Given its remote location some three hours south of Dallas in College Station, Texas, Kyle Field will almost certainly never face the ultimate “big game” pressures of a Super Bowl or a College Football Playoff championship, so the network may never know the stress such large, bucket-list gatherings can produce. And so far, there aren’t many ambitious fan-facing applications that use the network, like in-seat food delivery or wayfinding apps found in other stadiums.

But as part of the football-crazy SEC, and as the altar of pigskin worship for some of the most dedicated fans seen anywhere, Kyle Field is sure to see its share of sellout contests against SEC rivals that will push wireless usage to new heights, especially as more fans learn about and use the still-new system. Though total Wi-Fi usage at the Nov. 7 game we attended versus Auburn (a 26-10 Texas A&M loss) was “only” 2.94 terabytes – a total hampered by cold, windy and rainy conditions – an Oct. 17 game earlier in the season against Alabama saw 5.7 TB of Wi-Fi usage on the Kyle Field network, a number surpassed only by last year’s Super Bowl (with 6.2 TB of Wi-Fi use) in terms of total tonnage.

At the very least, the raw numbers of total attendees and the obvious strength of the still-new network are sure to guarantee that Kyle Field’s wireless deployment will be one of the most analyzed stadium networks for the foreseeable future.

Texas A&M student recording the halftime show.


What follows are some on-the-spot observations from our visit, which was aided by the guidance and hospitality of Corning project manager Sean Heffner, who played “tour guide” for part of the day, giving us behind-the-scenes access and views of the deployment that are unavailable to the general fan audience.

An off-campus DAS head end

This story starts not inside Kyle Field, but in a section of town just over three miles away from the stadium, on a muddy road that curves behind a funky nursery growing strange-looking plants. A gray metal box, like a big warehouse, is our destination, and the only clue as to what’s inside is the big antenna located right next to it. This structure is the Kyle Field DAS head end, where cellular carrier equipment connects to the fiber network that will bring signals to and from fans inside the stadium.

Why is the head end so far away? According to Corning’s Heffner, there was no room for this huge space inside the stadium. But thanks to the use of optical fiber, the location is not a problem, since signals traveling at the speed of light make 3.3 miles an insignificant span.

It might be helpful to back up a bit if you haven’t heard the full story of the Kyle Field deployment, which we told last year when the job was halfway completed. Though the stadium rebuild started with copper-based networks as the original plan, a last-minute audible championed by Texas A&M chancellor John Sharp sent the school down a decidedly untraditional path: building a stadium network with a single optical-based core for the Wi-Fi, DAS and IPTV networks. The kicker? Not only would this network have huge capacity and be future-proof against growth, it would actually cost less than a comparable copper-based deployment. If it got built on time, that is.

Though the pitch for better performance, far more capacity, use of less space, and cheaper costs might sound a bit too good to believe, most of it is just the combination of the simple physics advantages of using fiber over copper, which are well known in the core telecom and large-enterprise networking worlds, applied to a stadium situation.

One of the many maxed-out speed tests we took at Texas A&M's Kyle Field.


Without going too deeply into the physics or technology, the benefits stem from the fact that optical fiber can carry far more bandwidth than copper, over longer distances, using less power. Those advantages are why fiber is used extensively in core backbone networks, and has been creeping slowly closer to the end user, through deployments like Verizon’s FiOS.

And that’s also the reason why Texas A&M could put its DAS head end out in a field where it’s easier to add to (no space constraints), because the speed of fiber makes distance somewhat irrelevant. Corning’s Heffner also said that the DAS can be managed remotely, so that staff doesn’t need to be physically present to monitor the equipment.

Of course, there was the small matter of digging trenches for optical fibers to get from the head end to the stadium, but again, for this project it is apparent that getting things done was more important than strictly worrying about costs. Beyond the cash that the carriers all put in, other vendors and construction partners all put in some extra efforts or resources – in part, probably because the value of positive publicity for being part of such an ambitious undertaking makes any extra costs easy to justify.

Keeping the best fans connected and happy

From the head end, the fiber winds its way past apartment buildings and a golf course to get to Kyle Field, the center of the local universe on football game days. Deep inside the bowels of the venue is where the fiber meets networking gear, in a room chilled to the temperature of firm ice cream. Here is where the human element that helps keep the network running spends its game days, wearing fleece and ski jackets no matter what the temperature is outside.

See the white dots? Those are under-seat Wi-Fi APs


In addition to Corning, IBM and AmpThink employees, this room during our visit also had a representative from YinzCam in attendance, a rarity for a company that prides itself on being able to have its stadium and team apps run without local supervision. But with YinzCam recently named as a partner to IBM’s nascent stadium technology practice, it’s apparent that the Kyle Field network is more than just a great service for the fans in the seats – it’s also a proof of concept network that is being closely watched by all the entities that helped bring it together, who for many reasons want to be able to catch any issues before they become problems.

How big and how ambitious is the Kyle Field network? From the outset, Corning and IBM said the Wi-Fi network was designed to support 100,000 connections at a speed of 2 Mbps each, so that if everyone in the stadium decided to log on, they’d all have decent bandwidth. That upper level hasn’t been tested yet.

What happened through the first season was a “take rate” averaging in the 35,000-37,000 range, meaning that during a game day, roughly one-third of the fans in attendance used the Wi-Fi at some point. The average concurrent user peaks – the highest numbers of fans using the network at the same time – generally averaged in the mid-20,000 range, according to figures provided by Corning and AmpThink; so instead of 100,000 fans connecting at 2 Mbps, this season there was about a quarter of that number connecting at much higher data rates, if our ad hoc speed tests are any proof.
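The design and take-rate figures above reduce to simple arithmetic. Here is a quick sketch using only the numbers quoted in the story (decimal units assumed):

```python
# Back-of-the-envelope math on the Kyle Field Wi-Fi design and take rate,
# using the figures quoted in the story (decimal prefixes: 1 Gbps = 1e9 bps).

DESIGN_USERS = 100_000       # design target: concurrent connections
DESIGN_RATE_MBPS = 2         # guaranteed per-user rate at full load
CAPACITY = 102_512           # seats in the rebuilt bowl

# Aggregate bandwidth the design target implies
aggregate_gbps = DESIGN_USERS * DESIGN_RATE_MBPS / 1_000
print(f"Design aggregate: {aggregate_gbps:.0f} Gbps")   # 200 Gbps

# First-season take rate: roughly 35,000-37,000 unique users per game
take_rate = 36_000 / CAPACITY
print(f"Take rate: {take_rate:.0%}")                    # about one-third

# Peak concurrency ran in the mid-20,000s, about a quarter of the design target
headroom = 25_000 / DESIGN_USERS
print(f"Peak concurrency vs. design: {headroom:.0%}")   # 25%
```

So even on a busy night, the network was running at roughly a quarter of its designed concurrency, which is consistent with the much-higher-than-2-Mbps rates our speed tests recorded.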

Our first test that Saturday [Nov. 7, 2015], just inside a lower-level service entryway, hit 41.35 Mbps for download and 18.67 on the upload, on a Verizon iPhone 6 Plus over the stadium’s DAS. And yes, that download speed was the slowest we’d record all day, either on the DAS or the Wi-Fi.

Inside the control room we spent some time with AmpThink CEO Bill Anderson, who could probably use up an entire football game talking about Wi-Fi network deployment strategies if he didn’t have a big network to watch. The top thing we learned on this Saturday is that Anderson and AmpThink are solid believers in under-seat AP placements for performance reasons; according to Anderson, fully 669 of Kyle Field’s 1,300 APs can be found underneath seats. Anderson is also a stickler for “real” Wi-Fi usage measurements – weeding out of the “unique user” totals any devices that may have autoconnected to the Wi-Fi network but never used it, and taking bandwidth measurements at the network firewall, to truly see how much “live” bandwidth is coming and going.

On the road to College Station, Aggie pride is everywhere. Whoop!


AmpThink’s attention to detail includes deploying and configuring APs differently depending on which section they are located in – student sections, for example, are more densely packed with people than other sections so the APs need different tuning. Corning’s Heffner also said that the oDAS – the DAS just outside the stadium – got special attention due to the large numbers of tailgating fans, both before and during the games. At the Alabama game, Heffner said there were some 30,000 fans who remained outside the stadium during the contest, never coming inside but still wanting to participate in the scene.

AmpThink, Corning, IBM and others involved at Kyle Field all seem keen on finding out just how much bandwidth stadium fans will use if you give them unlimited access. The guess? According to Corning’s Heffner, the mantra of stadium networks these days seems to be: “If you provide more capacity, it gets consumed.”

The ‘real’ 12th man

After walking through a tunnel with a nearly full cable tray overhead (“It’d be even more loaded if we were using copper,” Heffner said) we went out into the stadium itself, which was just starting to fill. Though the overcast day and intermittent rain squalls might have kept other teams’ fans from showing up for a 5:30 p.m. local start time, that simply wasn’t the case at an A&M home game.

Some of the Wi-Fi and DAS download measurements we took at Kyle Field.

Some of the Wi-FI and DAS download measurements we took at Kyle Field.

As someone who’s attended countless football games, small and large – including a Super Bowl and last year’s inaugural College Football Playoff championship game – I can honestly say that the level of fan participation at Texas A&M is like nothing I’d seen before. The student section alone spans two decks on the stadium’s east side and takes up 40,000 seats, according to stadium officials – simply dwarfing anything I’d ever witnessed. (Out of an enrollment of 57,000+, having 40,000 students attend games is incredible.) And outside of small high school crowds I’d never seen an entire full stadium participate in all the school songs, the “yells” (do NOT call them “cheers” here) and the locked-arms back-and-forth “sawing” dance without any need for scoreboard instruction.

Part of the stadium renovation that closed the structure into a bowl was, according to school officials, designed to make Kyle Field even more intimidating than it already was, by increasing the sound levels possible. Unfortunately the night of our visit some early Auburn scores took some of the steam out of the crowd, and a driving, chilling rain that appeared just before halftime sent a good part of the crowd either home or into the concourses looking for warmth and shelter. (The next day, several columnists in the local paper admonished the fans who left early for their transgressions; how dare they depart a game whose outcome was still in doubt?)

But I’ll never forget the power of the synchronized “yells” of tens of thousands of fans during pregame, and the roar that surfaced when former Aggie QB Johnny Manziel made a surprise appearance on the field before kickoff. Seattle Seahawks fans may stake the pro claim to fan support, but if you want to determine the “real” 12th man experience you need to stop by Kyle Field and give your ears a taste of loud.

Controlling the TV with the app

If the students and alumni and other fans outside provide the vocal power, the money power that helped get the stadium rebuilt can be found in the new Kyle Field suites and premium seating areas, some of which are found on the venue’s west side, which was blown up last December and rebuilt in time for this past season.

Conduit reaching to an under-seat AP


Inside the All American Club – a behind-the-walls gathering area with catered food and bars that would not seem out of place in Levi’s Stadium or AT&T Stadium – we tested the Wi-Fi and got speeds of 63 Mbps down, 69 Mbps up; Verizon’s 4G LTE service on the DAS hit 48 Mbps/14.78 Mbps, while AT&T’s 4G LTE DAS checked in at 40 Mbps/22 Mbps.

In an actual suite where we were allowed to check out the IPTV displays, the speed tests got 67/67 for Wi-Fi and 57/12 for Verizon 4G LTE. So the well-heeled backers of A&M football shouldn’t have any problems when it comes to connectivity.

As for the IPTV controls, the new system from YinzCam solves a problem that’s plagued stadium suites for as long as there have been suites: what do you do with the TV remote? What YinzCam did for Texas A&M was link the TV controls to a Texas A&M “TV Remote” app; by simply punching in a numerical code that appears on the bottom of the screen in front of you, anyone with access to a suite or club area with TVs can change the channel to a long list of selections, including multiple live game-day views (stadium screen, broadcast view) as well as other channels, like other games on the SEC Network.

By pairing a static code number for each TV with a second number that scrambles randomly over time, the system smartly builds security into channel changing, and prevents someone who had been in a suite previously from changing the channels after they leave. The whole remote-control process took less than a minute to learn, and we had fun wandering through the club-level areas our pass gave us access to, changing screens as we saw fit.
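We don’t know the internals of YinzCam’s implementation, but the behavior described – a fixed per-TV identifier paired with a number that rotates over time – matches a standard time-windowed one-time-code pattern. The sketch below is purely illustrative: the function names, the five-minute rotation window, the four-digit code length and the HMAC construction are all our own assumptions, not YinzCam’s actual design.

```python
# Hypothetical sketch of a rotating TV-pairing code, in the spirit of the
# scheme described above. NOT YinzCam's actual implementation; the rotation
# interval, code length and HMAC construction are assumptions.
import hashlib
import hmac
import time

ROTATION_SECONDS = 300  # assumed: code on screen changes every 5 minutes


def current_code(tv_id: str, secret: bytes, now=None) -> str:
    """4-digit code shown on the TV, derived from its static ID plus the
    current time window."""
    window = int((now if now is not None else time.time()) // ROTATION_SECONDS)
    digest = hmac.new(secret, f"{tv_id}:{window}".encode(), hashlib.sha256)
    return f"{int.from_bytes(digest.digest()[:4], 'big') % 10_000:04d}"


def can_control(tv_id: str, secret: bytes, entered_code: str) -> bool:
    """Server-side check: the app user must type the code currently on screen."""
    return hmac.compare_digest(current_code(tv_id, secret), entered_code)
```

A guest who wrote a code down loses access once the window rolls over, which matches the behavior described: old codes stop working after you leave the suite.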

Our favorite places to watch the game at Kyle Field were the loge-level lounges, where you could first purchase food and beverages, including alcoholic ones, at an inside bar and then sit at an outside seat with a small-screen TV in front of you for information overload. The Wi-Fi in the southwest corner loge lounge checked in at 67.03/62.93, so it was no problem being connected via mobile device, either.

What comes next for the Kyle Field network?

Even though the rain had started coming down harder, we left the comfort and warmth of the club levels to wander around the stadium’s upper decks, including the student section, where we watched numerous fans taking pictures or videos of the band’s halftime performance. Clearly most everyone in Kyle Field had gotten the message, and wasn’t afraid they wouldn’t be able to connect if they used their mobile devices at the game, even among 102,000 of their closest friends.

Antennas on flag poles atop seating


The question now for Kyle Field is: what does it do next with its network? The most obvious place for innovation or new features is a stadium-centric app, one that could provide services like a wayfinding map. Maybe it was just our round-the-stadium wandering, but any building that seats 102,000-plus could use an interactive map. It might also be interesting to tie a map to concessions – the night we visited, there were long lines at the few hot chocolate stands due to the cold weather; in such situations you could conceivably use the network to find out where hot chocolate stands were running low, maybe open new ones and alert fans through the app.

We’re guessing parking and ticketing functions might also be tied to the app in the future, but for now we’ll have to wait and see what happens. One thing in Kyle Field’s favor for the future: thanks to the capacity of the optical network buildout, the stadium already has thousands of spare fiber connections that aren’t currently being used. That means when it’s time to upgrade or add more DAS antennas, Wi-Fi APs or whatever comes next, Kyle Field is already wired to handle it.

For the Nov. 7 game at Kyle Field, the final numbers included 37,121 unique users of the Wi-Fi network, and a peak concurrent user number of 23,101 taken near the end of the 3rd quarter. The total traffic used on the Wi-Fi network that night was 2.94 TB, perhaps low or average for Kyle Field these days but it’s helpful to remember that just three years ago that was right around the total Wi-Fi data used at a Super Bowl.

Until the next IBM/Corning network gets built in Atlanta (at the Falcons’ new Mercedes-Benz Stadium, slated to open in 2017), the Kyle Field network will no doubt be the center of much stadium-technology market attention, especially if they ever do manage to get 100,000 fans to use the Wi-Fi all at once. While A&M’s on-the-field fortunes in the competitive SEC are a yearly question, the performance of the network in the Aggies’ stadium isn’t; right now it would certainly be one of the top four seeds, if not No. 1, if there was such a thing as a college stadium network playoff.

What we’re looking forward to is more data and more reports from a stadium with a network that can provide “that extra push over the edge” when fans want to turn their connectivity dial past 10. Remember, this one goes to 11. It’s one more.

(More photos below! And don’t forget to download your copy of the STADIUM TECH REPORT for more!)

Panoramic view of Kyle Field before the 102,000 fans fill the seats.

Some things at Kyle Field operate at ‘traditional’ speeds.


Outside the south gate before the game begins.


Overhang antenna in the middle section of the stadium.

Texas A&M’s fiber-backed Wi-Fi at Kyle Field records 5.7 TB of data during Alabama game

Scoreboard, Kyle Field. Photos: Texas A&M


We’ve been hearing rumors about how much data was flowing at the new fiber-based Wi-Fi network at Texas A&M’s Kyle Field this fall, and now we finally have some verified numbers that are sure to pop some eyeballs: According to the networking crew at Corning, fans at Kyle Field used 5.7 terabytes of Wi-Fi data during the Oct. 17 game against Alabama, which the Aggies lost 41-23.

In case you are keeping score, the 5.7 TB mark is the second-largest single-game Wi-Fi usage number we’ve seen, trailing only the 6.2 TB recorded at Super Bowl XLIX in Glendale, Ariz., earlier this year. Before you pin it all on the network, however, be aware that the newly refurbished Kyle Field can hold a whole lotta fans — the announced attendance for the ‘Bama game was 105,733, which is 35,000+ more fans than the 70,288 who attended the Super Bowl at the University of Phoenix Stadium on Feb. 1. Still, building a network to support basically another baseball stadium’s worth of fans is pretty cool, too.

Other related numbers from the Wi-Fi network are in Super Bowl territory as well, including the 37,823 unique clients recorded during pre-game and game time, and the 26,318 peak concurrent user count. We’re not sure why only 10 people tweeted about the Wi-Fi (8 good, 2 bad), but the 3.2 Gbps peak throughput should also turn some heads.
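For scale, those totals reduce to some interesting per-user and throughput arithmetic. The sketch below uses only the figures quoted above (decimal units assumed):

```python
# Quick arithmetic on the Oct. 17 Alabama-game Wi-Fi numbers from the story
# (decimal units: 1 TB = 1e12 bytes, 1 Gbps = 1e9 bits/second).

TOTAL_BYTES = 5.7e12        # 5.7 TB of Wi-Fi data
UNIQUE_CLIENTS = 37_823     # unique clients, pre-game plus game
PEAK_GBPS = 3.2             # reported peak throughput

per_user_mb = TOTAL_BYTES / UNIQUE_CLIENTS / 1e6
print(f"Average per unique client: {per_user_mb:.0f} MB")     # ~151 MB

# Even at a constant 3.2 Gbps, moving 5.7 TB takes this long --
# a floor showing the network sustained heavy load for hours.
floor_hours = TOTAL_BYTES * 8 / (PEAK_GBPS * 1e9) / 3600
print(f"Minimum time at peak rate: {floor_hours:.1f} hours")  # ~4.0 hours
```

In other words, the average fan who connected moved roughly 150 MB of data, and the network would have needed to run at its peak rate for about four hours straight to carry that total, so sustained throughput clearly stayed high across the whole event.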

Corning ONE DAS headend equipment at Texas A&M's Kyle Field deployment


The question this all raises for us is, has the availability of a fiber backbone allowed fans to simply use more traffic? And is the demand for mobile data at big events perhaps even higher than we thought? With a regular-season game at Nebraska hitting 4.2 TB earlier this season, it’s pretty clear that data demands are showing no signs of hitting a plateau. Or maybe we can deduce that the better the network, the more traffic it will carry?

It’s also worthwhile to note that stats this season from AT&T have shown several 1+ TB data totals for games at Kyle Field on the AT&T DAS network, which uses the same fiber backbone as the Wi-Fi. This “fiber to the fan” infrastructure, built by IBM and Corning, will also be at the core of the network being built at the new home of the NFL’s Falcons, the Mercedes-Benz Stadium in Atlanta, scheduled to open in 2017.

We’ll have more soon from Kyle Field, as Mobile Sports Report is scheduled to make a visit there for the Nov. 7 game against Auburn. If you plan to be in College Station that weekend give us a holler. Or a yell, right? We are looking forward to seeing the stadium and the network firsthand, and to doing some speed tests to see how well all areas are covered. With 5.7 TB of Wi-Fi, it’s a good guess the coverage is pretty good.

(Detailed statistics for the Oct. 17 game were provided by Corning.)


Stadium Tech Report: Corning, IBM bringing fiber-based Wi-Fi and DAS to Texas A&M’s Kyle Field

Kyle Field, Texas A&M University. Credit all photos: Texas A&M


Editor’s note: The following is an excerpt from our recent Stadium Tech Report series COLLEGE FOOTBALL ISSUE, a 40-page in-depth look at Wi-Fi and DAS deployment trends at U.S. collegiate football stadiums. You can download the full report for free, to get more stadium profiles as well as school-by-school technology deployment capsules for both the SEC and Pac-12 conferences.

When Texas A&M’s newly renovated Kyle Field opens for the 2015 football season, its outside appearance will have changed dramatically. But from a networking perspective, what’s really different is hidden on the inside – namely, an optical fiber infrastructure designed to bring a new level of performance, cost savings and future-proofing to stadium network deployments.

While the use of optical fiber instead of copper cable in large networks isn’t exactly “new” in the core telecom or enterprise networking worlds, in the still-nascent field of stadium network deployments fiber has yet to make large inroads. But the promise of fiber’s ability to deliver much higher performance and greater future-proofing at lower installation costs in stadium situations may get a very visible poster child when Texas A&M’s football facility kicks off the 2015 season with a technology infrastructure designed to be among the most comprehensive in any stadium, collegiate or professional.

With a Wi-Fi network designed to support 100,000 concurrent connections, a robust DAS network with more than 1,000 antennas, and an IPTV deployment with more than 1,000 screens, the IBM-designed network based largely on Corning’s fiber-optical systems is incredibly impressive on paper – and it has already produced some eye-popping statistics this past season, when just a part of it came online during the “Phase 1” period of the two-phase $450 million Kyle Field renovation.

The final phase of the renovation, Phase 2, just now getting underway, began with an implosion of the stadium’s west stands; reconstruction is scheduled to finish in time for the 2015 season with a new, enclosed-bowl structure that will seat 102,512 fans. And if the new network delivers as planned, those fans will be among the most-connected anywhere, with plenty of headroom to make sure it remains that way for the foreseeable future – thanks to fiber.

Driving on the left side of the street

What’s going to be new about Kyle Field? According to news reports some of the creature comforts being added include redesigned concession stands, so-called “Cool Zones” with air conditioning to beat the Texas heat, well-appointed luxury suites and new restrooms – including 300 percent more women’s bathrooms.

Scoreboard, Kyle Field


According to representatives from the school, the decision to make the new stadium a standout facility extended to its network infrastructure. “Our leadership decided that [the stadium renovation] would be leading edge,” said Matthew Almand, the IT network architect for the Texas A&M University System, the administrative entity that oversees university operations, including those at the flagship school in College Station, Texas. “There were some leaps of faith and there was a decision to be leading edge with technology as well.”

Though Phase 1 planning had started with a traditional copper-cable network design, Almand said a presentation by IBM and its “smarter stadium” team changed the thinking at Texas A&M.

“The IBM team came in and did a really good job of presenting the positive points of an optical network,” Almand said.

Todd Christner, now director of wireless business development at Corning, was previously at IBM as part of the team that brought the optical idea to Texas A&M. While talking about fiber to copper-cable veterans can sometimes be “like telling people to drive on the left side of the street,” Christner said the power, scalability and flexibility of a fiber network fit in well with the ambitious Kyle Field plans.

“The primary driving force [at Texas A&M] was that they wanted to build a state of the art facility, that would rival NFL stadiums and set them apart from other college programs,” Christner said. “And they wanted the fan [network] experience to be very robust.”

With what has to be one of the largest student sections anywhere – Christner said Texas A&M has 40,000 seats set aside for students – the school knew it would need extra support for the younger fans’ heavy data use on smartphones. School officials, he said, were also concerned about DAS performance, which in the past had been left to outside operators with less than satisfactory results. So IBM’s presentation of a better, cheaper alternative for all of the above found accepting ears.

“It was the right room for us to walk into,” Christner said.

IBM’s somewhat radical idea was that instead of having separate copper networks for Wi-Fi, DAS and IPTV, there would be a single optical network with the capacity to carry the traffic of all three. Though the pitch for better performance, far more capacity, use of less space, and cheaper costs might sound a bit too good to believe, most of it is just the combination of the simple physics advantages of using fiber over copper, which are well known in the core telecom and large-enterprise networking worlds, applied to a stadium situation.

Deploying now and for the future

Corning ONE DAS headend equipment.


Without going too deeply into the physics or technology, the benefits stem from the fact that optical fiber can carry far more bandwidth than copper, over longer distances, using less power. That advantage is one reason why fiber is used extensively in core backbone networks, and has been creeping slowly closer to the end user, through deployments like Verizon’s FiOS.

Why hasn’t fiber won over completely? Mainly because in single-user deployments – like to a single home or office – it is still costly to replace systems already in the ground or in the wall, and for many users fiber’s capacity can be overkill. Fiber pays off when lots of bandwidth is needed and the scale of a project is large, since it eliminates a lot of internal switching gear, which takes up space and consumes lots of power.

Those reasons accurately describe the perfect bandwidth storm happening in networked stadiums these days, where demand seems to keep increasing on a daily basis. Some stadiums that were at the forefront of the wireless-networking deployment trend, like AT&T Park in San Francisco and AT&T Stadium in Arlington, Texas, have been in a near-constant state of infrastructure upgrades due to the ever-increasing needs for more bandwidth. And Isaac Nissan, product manager for Corning ONE, said new equipment like Wi-Fi access points with “smart” or multiple-input antennas are also going to help push the LAN world into more fiber on the back end.

But there’s another drawback to using fiber, which has less to do with technology and more to do with history: installers, integrators and other hands-on networking folks are generally more comfortable with copper, which they know and have used for decades. Fiber, to many, is still a new thing, since it requires different skills and techniques for connecting and pulling wires, as well as for managing and administering optical equipment.

“There’s definitely a learning curve for some of the RF [industry] people, who have been doing coax for 20 years,” Nissan said. “Fiber is a little different.”

Texas A&M’s Almand admitted that bringing the stadium’s networking group into a new technology – fiber – was a challenge, but one with a worthy payoff.

Copper cable tray hardly filled by optical fiber

“There’s definitely been a gear-up cycle, getting to a new confidence level [with fiber],” Almand said. But he added that “sometimes it’s good to break out of your comfort zone.”

Lowering the IDF count

Christner said the Corning optical gear is at the center of the Kyle Field deployment, providing support for the fan-facing Wi-Fi as well as Wi-Fi for back-of-the-house operations like point of sale; it also supports the stadium DAS, as well as a network of more than 1,000 IPTV screens. Aruba Networks is the Wi-Fi gear supplier, and YinzCam is helping develop a new Kyle Field app that will include support for using smartphones as remote controls for IPTVs in suites.

On the Wi-Fi side, Christner said the finished network will have 600 APs in the bowl seating areas, and another 600 throughout the facility, with a stated goal of supporting 100,000 concurrent 2 Mbps connections. The DAS, Christner said, is slated to have 1,090 antennas in 50 sectors.
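Those stated goals imply some useful back-of-envelope numbers. A quick sketch (the AP and connection figures come from the article; the aggregate and per-AP averages are our own derivation):

```python
# Back-of-envelope check of the stated Kyle Field Wi-Fi goals.
# Figures from the article: 600 bowl APs plus 600 elsewhere, and a goal
# of 100,000 concurrent connections at 2 Mbps each.
total_aps = 600 + 600
concurrent_users = 100_000
per_user_mbps = 2

# Aggregate capacity the goal implies, converted from Mbps to Gbps
aggregate_gbps = concurrent_users * per_user_mbps / 1000

# Average concurrent clients each AP would need to carry
users_per_ap = concurrent_users / total_aps

print(f"Aggregate capacity goal: {aggregate_gbps:.0f} Gbps")
print(f"Average clients per AP: {users_per_ap:.0f}")
```

That works out to a 200 Gbps aggregate goal and roughly 83 concurrent clients per AP, which helps explain why fiber’s backbone capacity matters in a building this size.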

With no intermediate switching gear at all, Christner said, the fiber network in Kyle Field needs only 12 intermediate distribution frames (IDFs, the usually wall-mounted racks that support network-edge gear), as opposed to the 34 IDFs a legacy fiber/coax system would require. In addition to using less power, the cabling needed to support the fiber network is a fraction of what would have been needed for coax.

One of the more striking pictures of the deployment is a 36-inch-wide cable tray, installed for the original copper-network plan, now carrying just 10 inches of fiber-optic cable. Christner said the fiber network also provides a cleaner signal for the DAS, which already had a test run when 600 DAS antennas were deployed and lit during the 2014 season.

“At the Ole Miss game we had 110,663 fans at the stadium, and according to AT&T on the DAS all their lights were green,” Christner said. “Via our completely integrated fiber optic solution, we are now able to provide the DAS with much higher bandwidth as well,” said Texas A&M’s Almand, who also said that the carriers have responded very positively to the new DAS infrastructure.

Up from the dust – a model for the future?

Antenna and zone gear box near top of stands

Also included in the design – but not yet in use – are an additional 4,000 spare fibers at 540 zone locations, which Christner said can be immediately tapped for future expansion needs. And all of this functionality and flexibility, he added, was being built for 33 to 40 percent less than the cost of a traditional copper-based solution.

The proof of the network’s worth, of course, will have to wait until after the west stands are imploded, the new ones built, and the final pieces of the network installed. Then the really fun part begins for the users, who will get to play with everything from 38 channels of high-def TV on the IPTV screens to the multiple-angle replays and other features planned for the mobile app. At Texas A&M, IBM’s support squad will include some team members who work on the company’s traditionally excellent online effort for the Masters golf tournament, as well as the “smarter stadium” team.

For Texas A&M’s Almand, the start of the 2015 season will mark the beginning of the end, and a start to something special.

“If I were a country singer, I’d write something about looking forward to looking back on this,” Almand said. “When it’s done, it’s going to be something great.”

Report excerpt: SEC moving slowly on stadium Wi-Fi deployments

Jordan-Hare Stadium, Auburn University

Editor’s note: The following is an excerpt from our recent Stadium Tech Report series COLLEGE FOOTBALL ISSUE, a 40-page in-depth look at Wi-Fi and DAS deployment trends at U.S. collegiate football stadiums. You can download the full report for free, to get more stadium profiles as well as school-by-school technology deployment capsules for both the SEC and Pac-12 conferences.

When it comes to college football, the Southeastern Conference – usually just known as “the SEC” – is second to none in the product on the field.

But what about the product in the stands, namely the wireless technology deployments in SEC stadiums? With just two of 14 conference schools currently with fan-facing Wi-Fi in their main venues, the SEC isn’t pushing any technology envelopes as a whole. And according to one SEC athletic director, there probably won’t be a wholesale march by the conference to the technology forefront – simply because the SEC’s in-stadium fans have other priorities on what needs fixing first.

Scott Stricklin, the AD at SEC member Mississippi State, leads a conference-wide group that is taking a close look at the in-stadium fan experience, a concern for the SEC even as the conference enjoys NFL-like popularity for its teams and games.

“We are proud that we have a pretty special product in our stadiums, and we want to take steps to keep it that way,” said Stricklin in an interview with MSR. A recent conference-wide fan survey, he said, did highlight the fact that when it comes to wireless connectivity, “none of us from a performance standpoint scored very well.”

Wi-Fi not as important as parking, good food

But Stricklin also noted that the same fan survey didn’t place stadium connectivity at the top of the list of things to fix: Instead, it fell well down, trailing issues like parking, clean restrooms, stadium sound and good food. That lack of pressing concern, combined with Stricklin’s still-common belief that fans should be cheering instead of texting while at the stadium, means that the SEC will probably take a measured approach to Wi-Fi deployments in stadiums, and continue to rely on carrier-funded DAS networks to carry the game-day wireless load.

Scott Stricklin, Mississippi State AD

“I take more of a Mark Cuban approach – I’d rather people in the stands not be watching video [on their phones],” Stricklin said. “It takes away from the shared experience.”

Stricklin also noted that the two schools that have installed Wi-Fi in their stadiums – Auburn and Ole Miss – haven’t had resounding success with their deployments.

“Some [SEC schools] have done [Wi-Fi], and they’re not completely happy with the results,” said Stricklin, saying the lack of success has reinforced the cautious approach to Wi-Fi, conference-wide. “Those are the issues all of us are facing and grappling with,” he added.

SEC fans setting DAS traffic records

Even though they trail on Wi-Fi deployments, SEC schools aren’t exactly putting in dial-up phone booths. Indeed, Stricklin noted the huge video boards that have been installed in most conference stadiums, and said that the recent installations of carrier-funded DAS deployments have somewhat eased the no-signal crunch of the near past.

At his own school, Stricklin said his office got a lot of complaints about fans not being able to get a cellular signal before AT&T updated the stadium’s DAS in 2013.

“Last year, we got very few negative comments [about cellular service],” Stricklin said. “AT&T customers were even able to stream video.”

Vaught-Hemingway Stadium, Ole Miss

AT&T’s aggressive plan to install as many DAS networks as it can has helped bring the SEC to a 100 percent DAS coverage mark, and the fans seem to be enjoying the enhanced cellular connectivity. According to AT&T statistics, fans at SEC schools have regularly led the carrier’s weekly DAS traffic totals for most of the football season, especially at the “big games” between SEC schools like Alabama, Auburn, Ole Miss, Mississippi State and Georgia.

During Alabama’s 25-20 home victory over then-No. 1 Mississippi State, AT&T customers at Bryant-Denny Stadium used 849 gigabytes of traffic, the second-highest total that weekend for stadiums where AT&T has a DAS. The next two highest data-usage marks that weekend came at games at Georgia (676 GB) and Arkansas (602 GB), highlighting that SEC games typically have huge crowds, and those crowds like to use their cellphones, no matter how good the game on the field is.

Would Wi-Fi help with some of the traffic crunches? Possibly, but only two schools in the conference – Ole Miss and Auburn – currently have fan-facing Wi-Fi in their stadiums. Texas A&M, which is in the middle of a $450 million renovation of Kyle Field, is leaping far ahead of its conference brethren with a fiber-based Wi-Fi and DAS network and IPTV installation that will be among the most advanced anywhere when it is completed this coming summer.

But most of the SEC schools, Stricklin said, will probably stay on the Wi-Fi sidelines, at least until there is some better way to justify the millions of dollars in costs needed to bring Wi-Fi to a facility that might not see much regular use.

“If you only have 6 home games a year, it’s hard to justify,” said Stricklin of the cost of a Wi-Fi stadium network.

Other sports may move before football

Stricklin, the man who wants fans to keep their phones in their pockets at football games, is no stranger to technology-enhanced experiences in stadiums. He claims to “love” the in-seat food delivery options at MSU baseball and basketball games, and notes that the conference athletic directors will soon hold a meeting at which game-experience experts will walk the ADs through the facets of wireless technology deployments.

“They’re going to lay out what are the challenges, and what are the costs” of wireless deployments, Stricklin said. What Stricklin doesn’t want to see at MSU or at any SEC school is the return of the “no signal” days.

“When fans from other schools come here, we want them to have a good experience,” Stricklin said.

But he’d still prefer that experience is real, not virtual.

“I still just wonder, is anybody really doing this?” he asked. “Are you going to pay what you pay to come to our place, and then watch your phone? What I hope is that we produce such a great experience, you’re not going to want to reach for your phone.”

Big AT&T DAS weekend in Miami: 2.7 TB of traffic for two mid-November games

We’re a couple weeks behind in catching up here, but it’s worth backtracking to look at a huge weekend of DAS traffic at Miami’s Sun Life Stadium that took place earlier this month. According to DAS traffic figures from AT&T, the two games held at Sun Life on Nov. 13 (Miami Dolphins vs. Buffalo Bills) and Nov. 15 (Florida State vs. Miami) generated a total of 2.735 terabytes of traffic on the AT&T-specific cellular DAS in the stadium — a pretty high mark for cellular-only traffic.

Since we know there’s also a high-capacity Wi-Fi network at Sun Life, it’s interesting to wonder how much total traffic there was for the two events. While we wait to see if the fine folks who run the stadium network will eventually provide us with the Wi-Fi details, we can drill down a bit more into the DAS numbers that AT&T is seeing across the largest stadiums this fall.

The two games in Miami that weekend were the tops for DAS traffic in both college and pro for AT&T networks, which according to AT&T is the first time one town has held the DAS crown for both sports. The FSU-Miami game, where the Hurricanes kept it close to the end, was the biggest single DAS traffic event of that weekend, college or pro, with 1,802 GB of data crossing the AT&T DAS network. What’s stunning to remember is that these stats cover AT&T customer traffic only; total traffic from the 76,530 in attendance at the FSU-Miami game was likely much higher, but we get no comparable stats from other cellular providers.
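For scale, the article’s own numbers yield a rough per-attendee figure; a quick sketch (attendance and traffic totals from the article, counting AT&T customers only):

```python
# Rough per-attendee DAS traffic at the FSU-Miami game, using the
# article's figures: 1,802 GB of AT&T DAS traffic, 76,530 in attendance.
# The true per-fan average is higher, since only AT&T customers are counted.
att_das_gb = 1802
attendance = 76_530

# Convert GB to MB, then divide across everyone in the building
mb_per_attendee = att_das_gb * 1024 / attendance

print(f"AT&T DAS traffic per attendee: {mb_per_attendee:.1f} MB")
```

Even counting only AT&T customers, that is roughly 24 MB for every person in the stadium over a single game.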

Other big games between highly ranked teams also scored high in AT&T’s DAS rankings that particular weekend — Alabama’s home win over then No. 1 Mississippi State was second on the list with 849 GB of DAS traffic, while Georgia’s win over visiting Auburn that Saturday recorded 676 GB of DAS traffic.

On the pro side, the second-highest AT&T DAS traffic interestingly came from San Diego, where the Chargers eked out a 13-6 win over the Raiders. We’re wondering if the DAS mark from San Diego — 730 GB, which trailed only Miami’s Thursday night mark of 933 GB in its win over Buffalo — was higher because Qualcomm Stadium still doesn’t have Wi-Fi. And again, remember that traffic at some other stadiums might have been higher; these numbers reflect only AT&T stats from venues where AT&T has an operating DAS.

Stay tuned as the football seasons come to their conclusions — with any luck we’ll get some more DAS and Wi-Fi stats to get a more complete picture of stadium traffic this season, which — surprise! — seems to be continually growing. Verizon, Sprint and T-Mobile… lend us your stats!

AT&T: Getting busy with multiple college football DAS deployments

In an interview with AT&T’s John Donovan earlier this year, the company’s senior executive vice president told us that AT&T would continue to be aggressive in its deployment of stadium DAS systems. True to his word, here are announcements of new AT&T DAS deployments at no fewer than eight top U.S. universities (plus one announced earlier in the year), all in place in time for this fall’s football season.

Included in the list of DAS deployments that AT&T either is leading or has joined another operator’s infrastructure are Baylor University, which has a whole new stadium and a new stadium Wi-Fi network as well; Big Ten schools Indiana University, Ohio State University, Michigan State University, the University of Minnesota and the University of Wisconsin (where AT&T also installed a new Wi-Fi network and some IPTV systems); the University of Missouri from the SEC; and Pac-12 schools the University of Washington as well as the University of California, an installation plan that we covered last year. AT&T also participated alongside Verizon in a unique joint DAS deployment at the University of Oregon, also announced earlier this year.

Why so much DAS? As we are finding out in the process of doing a lot of reporting for our upcoming Q4 Stadium Tech Report on college football stadium technology deployments, Wi-Fi deployments are still somewhat of a rarity, even at some of the biggest schools. As we’ve said before, bringing in a DAS deployment makes a lot of sense for schools since A) you can usually get the carrier to pay for most if not all of the cost of building the DAS; and B) a good DAS goes a long way toward eliminating the feared “no signal” problem that can still be found at many major college facilities.

How much have fans already been using the new networks? According to AT&T, the new Mizzou DAS has carried the most traffic so far, with 290 gigabytes crossing the system and its 150+ antennas at one game this season. Cal was close behind with an average of 253 GB per game so far in 2014, while up in Seattle at UDub the fans are generating an average of 190 GB per game. Remember, these stats represent ONLY AT&T traffic on the AT&T part of the DAS; since we still can’t convince Verizon to provide similar statistics, we’ll just have to guess what the total totals are.

Stay tuned for more information about college stadium deployments… look for our Q4 STR report in early December!