Cowboys hit 2+ TB, Texas A&M sees 1.8+ TB in first AT&T DAS stats for 2016 football season

With the first few football games of the season now under our belts, stats from stadium wireless networks are filtering in with a refrain we’ve heard before: Fan use of wireless data is still growing, with no top reached yet.

Thanks to our friends at AT&T we have the first set of cellular network stats in hand, which show 2.273 terabytes of data used on the AT&T network at AT&T Stadium during the Cowboys’ home opener, a 20-19 loss to the New York Giants on Sept. 11. That same weekend the AT&T network at Kyle Field in College Station, Texas, home of the Texas A&M Aggies, carried 1.855 TB of data during the Aggies’ home opener, a 31-24 overtime win over UCLA.

Remember, these stats are for AT&T traffic only, carried over the AT&T portion of the DAS installations in and around the stadiums. Any other wireless carriers out there who want to send us statistics, please do so… as well as team Wi-Fi network totals. Look for more reports soon! AT&T graphics on the first-week results are below; we trust you can figure out which stadiums they’re talking about from the town locations.

[AT&T graphics: first-week DAS data totals]

Betting the Under (Part 2): Putting Wi-Fi antennas under seats is the hot new trend in stadium wireless networks

Under-seat Wi-Fi AP at Levi’s Stadium. Photo: Paul Kapustka, MSR (click on any photo for a larger image)

Part 2 of this story picks up with the decision to put Wi-Fi APs under seats at Levi’s Stadium. If you missed it, here is the link to Part 1.

According to Chuck Lukaszewski, now vice president of wireless strategy and standards at Hewlett Packard Enterprise (formerly very high density architect in the CTO Office of Aruba Networks), Aruba had been testing under-seat AP designs “in one form or another” since around 2010. There were some initial tests of under-seat AP deployments at Turner Field in Atlanta and at American Airlines Center in Dallas, but nothing on the scale of AT&T Park’s 2013 deployment, or on the scale Aruba planned for Levi’s Stadium when it opened in 2014.

Some of the first under-seat Wi-Fi deployments in other arenas were actually installed completely under the stands, Lukaszewski said, with signals shooting up through the concrete. Though “you could get reasonably good throughput through concrete,” especially at 2.4 GHz frequencies, installing antennas above the concrete was “considerably better,” Lukaszewski said.

One of the biggest problems in stadium Wi-Fi deployment, especially for designs heavy on overhead antennas, is managing interference between antennas. Clients can sometimes “see” APs clear across the stadium and will try to connect to those instead of the AP closest to them, a problem that leads to inefficient bandwidth use. Interference also means you can’t place APs too close together, making it something of an art to increase coverage without increasing interference.
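For a rough sense of why a faraway AP is even hearable, here’s a free-space path-loss sketch in Python; the transmit power, frequency and sensitivity figures are illustrative assumptions on our part, not measurements from any venue in this story.

    import math

    def fspl_db(distance_m: float, freq_mhz: float) -> float:
        """Free-space path loss in dB, for distance in meters and frequency in MHz."""
        return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_mhz) + 32.44

    TX_POWER_DBM = 17   # assumed AP transmit power
    FREQ_MHZ = 5200     # a 5 GHz Wi-Fi channel

    for d in (10, 100, 300):   # a nearby AP vs. an AP across the bowl
        print(f"{d:4d} m: ~{TX_POWER_DBM - fspl_db(d, FREQ_MHZ):.0f} dBm received")

    # Prints roughly -50, -70 and -79 dBm. Even the across-the-bowl signal
    # sits above a typical phone's ~-90 dBm sensitivity floor, so clients
    # really can "see" distant APs; the bodies and seats around an under-seat
    # AP supply the extra attenuation that free space doesn't.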

Dan Williams, former VP of technology for the San Francisco 49ers, talking networking at Levi’s Stadium. Photo: Paul Kapustka, MSR

What Aruba found in its testing, Lukaszewski said, was that under-seat Wi-Fi AP deployments could be far more dense than overhead-centric designs, mainly because the human bodies in the seats would provide beneficial “blocking” of signals, allowing network designers to place APs more closely together, and to be able to re-use the same Wi-Fi channels in more antennas.

“If you can use human bodies to contain signals, you can have much smaller cells,” Lukaszewski said. Under-seat deployments, he said, “allows us to re-use the same channel less than 100 yards away.”

With the same channels re-used across more of the network’s APs, the difference in the metric Lukaszewski calls “megabytes per fan” can be “profound” for an under-seat design versus an overhead design, he said.

“We do see trends [in stadium network data] of under-seat being able to deliver well over 100 MB per fan per event, while overhead designs [deliver] significantly under 100 MB per fan per event,” said Lukaszewski.
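For the curious, here’s that metric as simple arithmetic, applied in Python to two totals reported in this story. Attendance figures are approximate, and note the divisor matters: dividing by everyone in the building gives a lower number than dividing by fans who actually connected.

    def mb_per_fan(total_tb: float, fans: int) -> float:
        """An event's Wi-Fi tonnage as a per-fan average (1 TB = 1,000,000 MB, decimal)."""
        return total_tb * 1_000_000 / fans

    # Totals from this story; attendance figures are approximate
    events = {
        "Super Bowl 50 at Levi's Stadium": (10.1, 71_000),
        "Kyle Field vs. Alabama": (5.7, 105_733),
    }
    for name, (tb, fans) in events.items():
        print(f"{name}: ~{mb_per_fan(tb, fans):.0f} MB per attendee")

    # About 142 MB and 54 MB respectively. Divide by actual Wi-Fi users
    # instead and the Kyle Field figure climbs past 150 MB (5.7 TB across
    # the 37,823 unique clients reported elsewhere in this issue).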

Dan Williams, the former vice president of technology for the San Francisco 49ers, said he and Lukaszewski were in agreement that under-seat was the best method to deploy at Levi’s Stadium.

Kyle Field at Texas A&M. White spots in stands are under-seat AP locations. Photo: Paul Kapustka, MSR

“I just did not believe in overhead,” said Williams, who said he brainstormed with Aruba’s Lukaszewski on the under-seat idea, which they both brought to the Wi-Fi design at Levi’s. By using under-seat APs, Williams said, the Levi’s Stadium design looked to provide “cones [of bandwidth] around the audience, immersing [fans] in a signal.”

After beating the previous year’s Super Bowl Wi-Fi total at its NFL regular-season opener in 2014, Levi’s Stadium’s Wi-Fi network passed its biggest test yet this year, carrying a record 10.1 terabytes of Wi-Fi data during Super Bowl 50. Those numbers back up Lukaszewski’s claim: “By far, under seat is better.”

New deployments trending to under-seat

Editor’s note: This excerpt is from our latest STADIUM TECH REPORT, our long-form PDF publication that combines in-depth stadium tech reports with news and analysis of the hottest topics in the world of stadium and large public venue tech deployments. Enjoy this PART 2 of our lead feature, or DOWNLOAD THE REPORT and read the whole story right now!

Even though under-seat deployments can be considerably more expensive, especially in a retrofit situation where deployment requires coring through concrete, many stadium operators now seem to agree with another Lukaszewski claim, that “the return absolutely justifies the investment.”

At AT&T Stadium in Arlington, Texas, the Cowboys quickly followed their sister park’s lead and installed under-seat APs in force ahead of that venue’s hosting of the inaugural College Football Playoff championship game in January 2015. John Winborn, chief information officer for the Dallas Cowboys Football Club, said the team worked with AT&T’s “Foundry” innovation centers to produce a smaller, sleeker under-seat AP enclosure that fit well with the stadium’s commitment to aesthetics.

Back on the baseball side, the Giants now have 1,628 Wi-Fi APs in their park, with the vast majority of them under-seat, in all three decks of seating. And the Giants’ main rival to the south, the Los Angeles Dodgers, also used under-seat APs in a recent Wi-Fi upgrade.

Close-up of conduit running to under-seat AP at Kyle Field. Photo: Paul Kapustka, MSR

And if Levi’s Stadium led the way for under-seat Wi-Fi, the new mainly under-seat network at the refurbished Kyle Field at Texas A&M might be the QED on the debate, with ultra-fast network speeds and big data-consumption numbers (including 5.7 TB of Wi-Fi at a game versus Alabama) adding measurable momentum to the under-seat trend. Bill Anderson, CEO of Wi-Fi deployment strategy firm AmpThink, said he was an early disbeliever in under-seat Wi-Fi — until he saw the numbers.

“At first we mocked it, made fun of it,” said Anderson, whose firm has been called in to produce Wi-Fi network designs for several recent Super Bowls, as well as for the Kyle Field design. But when Aruba showed AmpThink the data from under-seat tests and deployments, “that was the ‘a-ha’ moment for us,” Anderson said.

Working with Aruba at Kyle Field, AmpThink was able to collect its own data, which convinced Anderson that under-seat was the way to go if you wanted dense, high-performing networks.

“The really important thing is to get APs closer to the people,” said Anderson. “That’s the future.”

Anderson said some doubters may remain, especially those who try to mix a small number of under-seat APs with existing overhead deployments, a recipe for diminished results due to the potential interference issues. At Texas A&M, Anderson said AmpThink was able to build a design with far less interference and much greater density than an overhead solution, producing numbers that people have to pay attention to.

“We only know what we’ve observed, but we’re evangelistic supporters” of under-seat designs, Anderson said. “If someone says to you under-seat is hocus-pocus, they’re not looking at the data.”

Not for everyone, but more are trying under-seat

Though proponents of under-seat Wi-Fi all agree on its ability to deliver denser, faster networks, they all also agree that under-seat can be considerably more costly than overhead Wi-Fi, especially in a retrofit situation.

In addition to having to core through concrete seating areas to get conduit to the under-seat APs, the devices themselves need to be sealed, to guard them from weather, drink spills, and the power-washing equipment employed by most stadiums to clean seating areas.

Aruba’s Lukaszewski also noted that under-seat deployments generally use more linear feet of cabling to connect the APs than overhead designs, which also drives up the cost. And since under-seat designs tend to use more APs overall, budgets must stretch to cover a higher number of devices.

A row shot of the under-seat APs at AT&T Stadium. Photo: Dallas Cowboys

For some stadiums, the construction materials used prohibit the under-seat option from even being tried. At the Green Bay Packers’ legendary Lambeau Field, a late-1950s structure built with lots of concrete and rebar, and with part of the stadium’s bottom sitting directly in the ground, under-seat Wi-Fi simply wasn’t an option, according to Wayne Wichlacz, director of information technology for the Packers.

Other stadiums, like the University of Nebraska’s Memorial Stadium, don’t have enough space between the stadium’s bleacher seats and the floor for under-seat APs to be safely installed. And many schools or teams simply don’t have big IT budgets like the $20-million-plus available to Texas A&M that allowed the Kyle Field design to seek the best result possible.

But many of the new stadiums under construction, as well as existing venues that are planning for new best-of-breed networks, have already committed to under-seat Wi-Fi designs, including the Sacramento Kings’ Golden 1 Center, where Ruckus Wireless will implement its first under-seat stadium Wi-Fi network.

Steve Martin, senior vice president and general manager at Ruckus, said the Golden 1 Center design, planned to be the densest anywhere, will “primarily be underseat,” a choice he said “helps in a lot of ways.”

Foremost is the performance, something Martin said Ruckus has been testing at the Kings’ current home, the Sleep Train Arena. “It [under seat] does give you the isolation for frequency re-use,” he said.

The under-seat design also makes sense in Golden 1 Center since the stadium’s overall design is very open, with lots of glass walls and unobstructed views.

And under-seat deployment is even making inroads into the distributed antenna system (DAS) world, with Verizon Wireless deploying more than 50 under-seat DAS antennas at Levi’s Stadium prior to Super Bowl 50. Mainly installed to cover the bottom-of-the-bowl rows, the under-seat antennas helped Verizon manage a record day for DAS traffic, with 7 TB reported on its in-stadium cellular network during the game.

“To get a quality signal, we had to go under seat,” said Brian Mecum, vice president of network for Verizon Wireless, who explained that in that area of the stadium, under seat was the only way to get the signal close to the subscriber’s phone. Verizon, he said, helped design the under-seat DAS antenna, and is looking to deploy it in other stadiums soon.

“It’s the first of more,” he said.

END PART 2… HERE IS THE LINK TO PART 1… TO READ THE WHOLE STORY NOW, DOWNLOAD OUR REPORT!

Texas A&M’s Kyle Field: A network built for speed

Full house at Kyle Field. All Photos: Paul Kapustka, MSR (click on any photo for a larger image)

Is there a combined stadium Wi-Fi and DAS deployment that is as fast as the one found at Texas A&M’s Kyle Field? If so, we haven’t seen or heard of it.

In fact, after reviewing loads of live network-performance data of Kyle Field’s new Wi-Fi and DAS in action, and after maxing out the top levels on our speed tests time after time during an informal walk-around on a game day, we’ve come to the conclusion that Kyle Field has itself a Spinal Tap of a wireless deployment. Meaning: if other stadium networks stop at 10, this one goes to 11.

Movie references aside, by the numbers Kyle Field’s wireless network performance is quite simply unequaled by any other large public venue we’ve tested in terms of raw speed and the ability to deliver bandwidth. With DAS and Wi-Fi speed measurements ranging between 40 Mbps and 60+ Mbps pretty much everywhere we roamed inside the 102,512-seat venue, it’s a safe bet that the school’s desire to “build the best network” in a stadium hit its mark.

Editor’s note: This story is part of our most recent STADIUM TECH REPORT, the COLLEGE FOOTBALL ISSUE. The 40+ page report, which includes profiles of stadium deployments at Texas A&M, Kansas State, Ole Miss and Oklahoma, is available for FREE DOWNLOAD from our site. Get your copy today!

On one hand, the network’s top-line performance is not that much of a surprise: as part of an overall Kyle Field renovation that has already cost an estimated $485 million, the optical-based Wi-Fi, DAS and IPTV deployment inside the Aggies’ football palace is probably among the most expensive and expansive in-venue networks ever built. According to Phillip Ray, Vice Chancellor for Business Affairs at The Texas A&M University System, the network’s total cost was “somewhere north of $20 million.”

Remote optical cabinet and Wi-Fi AP at Kyle Field.

And while the nation’s biggest cellular carriers, AT&T and Verizon Wireless, paid nearly half the network’s cost – $10 million, according to Ray – the dedication and work crews brought to the table by main suppliers IBM and Corning, along with Wi-Fi gear vendor Aruba, gave Texas A&M components, expertise and budgetary freedom that perhaps only a small group of venue owners could hope to match.

But just throwing money and technology at a stadium doesn’t necessarily produce a great network. In a venue the size of the new Kyle Field there needs to be great care and innovative thinking behind antenna placement and tuning, and in that arena Texas A&M also had the guiding hand of AmpThink, a small firm with oversized smarts in Wi-Fi deployment, as evidenced by its impressive track record of helping wireless deployments at the biggest events, including several recent Super Bowls.

The core decision to go with optical for the network’s guts, and a tactical decision to put a huge chunk of the Wi-Fi APs in under-seat deployments are just part of the strategy that produced a network that – in A&M fan parlance – can “BTHO” (Beat The Hell Out) of most challengers.

Since it’s almost impossible to directly compare stadiums and venue network performances due to all the possible variables, you’ll never hear us at Mobile Sports Report declare a “champion” when it comes to click-bait themes like “the most connected stadium ever.” Given its remote location some three hours south of Dallas in College Station, Texas, Kyle Field will almost certainly never face the ultimate “big game” pressures of a Super Bowl or a College Football Playoff championship, so the network may never know the stress such large, bucket-list gatherings can produce. And so far, there aren’t many ambitious fan-facing applications that use the network, like in-seat food delivery or wayfinding apps found in other stadiums.

But as part of the football-crazy SEC, and as the altar of pigskin worship for some of the most dedicated fans seen anywhere, Kyle Field is sure to see its share of sellout contests against SEC rivals that will push wireless usage to new heights, especially as more fans learn about and use the still-new system. Though total Wi-Fi usage at the Nov. 7 game we attended versus Auburn (a 26-10 Texas A&M loss) was “only” 2.94 terabytes – a total hampered by cold, windy and rainy conditions – an Oct. 17 game earlier in the season against Alabama saw 5.7 TB of Wi-Fi usage on the Kyle Field network, a number surpassed only by last year’s Super Bowl (with 6.2 TB of Wi-Fi use) in terms of total tonnage.

At the very least, the raw numbers of total attendees and the obvious strength of the still-new network is sure to guarantee that Kyle Field’s wireless deployment will be one of the most analyzed stadium networks for the foreseeable future.

Texas A&M student recording the halftime show.

What follows are some on-the-spot observations from our visit, which was aided by the guidance and hospitality of Corning project manager Sean Heffner, who played “tour guide” for part of the day, giving us behind-the-scenes access and views of the deployment that are unavailable to the general fan audience.

An off-campus DAS head end

This story starts not inside Kyle Field, but in a section of town just over three miles away from the stadium, on a muddy road that curves behind a funky nursery growing strange-looking plants. A gray metal box, like a big warehouse, is our destination, and the only clue as to what’s inside is the big antenna located right next to it. This structure is the Kyle Field DAS head end, where cellular carrier equipment connects to the fiber network that will bring signals to and from fans inside the stadium.

Why is the head end so far away? According to Corning’s Heffner there was no room for this huge space inside the stadium. But thanks to the use of optical fiber, the location is not a problem, since signals traveling at nearly the speed of light make 3.3 miles an insignificant span.
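A quick sanity check on that claim: at light-in-glass speeds the 3.3-mile run costs only microseconds. A minimal sketch (the 0.68 velocity factor is a typical value for optical fiber, our assumption):

    C_VACUUM = 299_792_458   # speed of light in a vacuum, m/s
    FIBER_FACTOR = 0.68      # light in glass travels at roughly two-thirds of c
    MILES_TO_M = 1609.34

    distance_m = 3.3 * MILES_TO_M
    delay_us = distance_m / (C_VACUUM * FIBER_FACTOR) * 1e6
    print(f"~{delay_us:.0f} microseconds one way")   # about 26 microseconds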

It might be helpful to back up a bit if you haven’t heard the full story of the Kyle Field deployment, which we told last year when the job was halfway completed. Though the stadium rebuild originally called for copper-based networks, a last-minute audible championed by Texas A&M chancellor John Sharp sent the school down a decidedly untraditional path: building a stadium network with a single optical-based core for Wi-Fi, DAS and IPTV. The kicker? Not only would this network have huge capacity and be future-proof against growth, it would actually cost less than a comparable copper-based deployment. If it got built on time, that is.

Though the pitch of better performance, far more capacity, less space and lower costs might sound a bit too good to believe, most of it is just the simple physics advantage of fiber over copper, well known in the core telecom and large-enterprise networking worlds, applied to a stadium situation.

One of the many maxed-out speed tests we took at Texas A&M’s Kyle Field.

Without going too deeply into the physics or technology, the benefits stem from the fact that optical fiber can carry far more bandwidth than copper, over longer distances, using less power. Those advantages are why fiber is used extensively in core backbone networks, and has been creeping slowly closer to the user’s doorstep through deployments like Verizon’s FiOS.

And that’s also why Texas A&M could put its DAS head end out in a field, where it’s easier to expand (no space constraints): fiber’s reach makes the distance largely irrelevant. Corning’s Heffner also said the DAS can be managed remotely, so staff doesn’t need to be physically present to monitor the equipment.

Of course, there was the small matter of digging trenches to get the optical fiber from the head end to the stadium, but again, for this project it is apparent that getting things done was more important than strictly worrying about costs. Beyond the cash the carriers put in, other vendors and construction partners contributed extra effort or resources, in part, probably, because the positive publicity of being part of such an ambitious undertaking makes any extra costs easy to justify.

Keeping the best fans connected and happy

From the head end, the fiber winds its way past apartment buildings and a golf course to get to Kyle Field, the center of the local universe on football game days. Deep inside the bowels of the venue is where the fiber meets networking gear, in a room chilled to the temperature of firm ice cream. Here is where the human element that helps keep the network running spends its game days, wearing fleece and ski jackets no matter what the temperature is outside.

See the white dots? Those are under-seat Wi-Fi APs

In addition to Corning, IBM and AmpThink employees, this room during our visit also had a representative from YinzCam in attendance, a rarity for a company that prides itself on being able to have its stadium and team apps run without local supervision. But with YinzCam recently named as a partner to IBM’s nascent stadium technology practice, it’s apparent that the Kyle Field network is more than just a great service for the fans in the seats – it’s also a proof of concept network that is being closely watched by all the entities that helped bring it together, who for many reasons want to be able to catch any issues before they become problems.

How big and how ambitious is the Kyle Field network? From the outset, Corning and IBM said the Wi-Fi network was designed to support 100,000 connections at a speed of 2 Mbps each, so that if everyone in the stadium decided to log on, they’d all have decent bandwidth. But so far, that upper limit hasn’t been tested.

What happened through the first season was a “take rate” averaging in the 35,000-37,000 range, meaning that during a game day, roughly one-third of the fans in attendance used the Wi-Fi at some point. The average concurrent user peaks – the highest numbers of fans using the network at the same time – generally landed in the mid-20,000 range, according to figures provided by Corning and AmpThink. So instead of 100,000 fans connecting at 2 Mbps, this season saw about a quarter of that number connecting at much higher data rates, if our ad hoc speed tests are any proof.
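The back-of-the-envelope math, using the figures above (the 25,000 concurrency value is our rounding of the reported mid-20,000 range):

    # Kyle Field Wi-Fi design target: 100,000 users at 2 Mbps each.
    DESIGN_USERS, DESIGN_MBPS = 100_000, 2
    aggregate_gbps = DESIGN_USERS * DESIGN_MBPS / 1_000
    print(f"Design aggregate: {aggregate_gbps:.0f} Gbps")   # 200 Gbps

    # With observed peaks in the mid-20,000s of concurrent users, the same
    # aggregate leaves each active user far more than the 2 Mbps floor.
    PEAK_CONCURRENT = 25_000
    print(f"~{aggregate_gbps * 1_000 / PEAK_CONCURRENT:.0f} Mbps per concurrent user")   # ~8 Mbps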

Our first test that Saturday [Nov. 7, 2015], just inside a lower-level service entryway, hit 41.35 Mbps for download and 18.67 on the upload, on a Verizon iPhone 6 Plus over the stadium’s DAS. And yes, that download speed was the slowest we’d record all day, either on the DAS or the Wi-Fi.

Inside the control room we spent some time with AmpThink CEO Bill Anderson, who could probably fill an entire football game talking about Wi-Fi network deployment strategies if he didn’t have a big network to watch. Among the top things we learned on this Saturday: Anderson and AmpThink are solid believers in under-seat AP placements for performance reasons; according to Anderson, fully 669 of Kyle Field’s 1,300 APs can be found underneath seats. Anderson is also a stickler for “real” Wi-Fi usage measurements, weeding out devices that may have autoconnected to the Wi-Fi network but never used it from the “unique user” totals, and taking bandwidth measurements at the network firewall to see how much “live” bandwidth is truly coming and going.
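Anderson didn’t spell out his exact filtering rules, but the idea is easy to sketch: count only clients whose firewall-measured traffic clears some threshold. Everything below (MAC addresses, byte counts, the cutoff) is hypothetical:

    from dataclasses import dataclass

    @dataclass
    class Client:
        mac: str
        bytes_transferred: int   # measured at the network firewall

    clients = [
        Client("aa:bb:cc:00:00:01", 450_000_000),   # heavy user
        Client("aa:bb:cc:00:00:02", 12_000),        # auto-joined, barely used the network
        Client("aa:bb:cc:00:00:03", 80_000_000),
    ]

    MIN_BYTES = 1_000_000   # hypothetical cutoff separating real users from auto-joins

    real_users = [c for c in clients if c.bytes_transferred >= MIN_BYTES]
    print(f"{len(real_users)} of {len(clients)} clients counted as real users")   # 2 of 3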

On the road to College Station, Aggie pride is everywhere. Whoop!

AmpThink’s attention to detail includes deploying and configuring APs differently depending on which section they are located in – student sections, for example, are more densely packed with people than other sections so the APs need different tuning. Corning’s Heffner also said that the oDAS – the DAS just outside the stadium – got special attention due to the large numbers of tailgating fans, both before and during the games. At the Alabama game, Heffner said there were some 30,000 fans who remained outside the stadium during the contest, never coming inside but still wanting to participate in the scene.

AmpThink, Corning, IBM and others involved at Kyle Field all seem keen on finding out just how much bandwidth stadium fans will use if you give them unlimited access. The guess? According to Corning’s Heffner, the mantra of stadium networks these days seems to be: “If you provide more capacity, it gets consumed.”

The ‘real’ 12th man

After walking through a tunnel with a nearly full cable tray overhead (“It’d be even more loaded if we were using copper,” Heffner said) we went out into the stadium itself, which was just starting to fill. Though the overcast day and intermittent rain squalls might have kept other teams’ fans from showing up for a 5:30 p.m. local start time, that simply wasn’t the case at an A&M home game.

Some of the Wi-Fi and DAS download measurements we took at Kyle Field.

As someone who’s attended countless football games, small and large – including a Super Bowl and last year’s inaugural College Football Playoff championship game – I can honestly say that the level of fan participation at Texas A&M is like nothing I’d seen before. The student section alone spans two decks on the stadium’s east side and takes up 40,000 seats, according to stadium officials, simply dwarfing anything I’d ever witnessed. (Out of an enrollment of 57,000+, having 40,000 students attend games is incredible.) And outside of small high school crowds I’d never seen an entire stadium participate in all the school songs, the “yells” (do NOT call them “cheers” here) and the locked-arms back-and-forth “sawing” dance without any need for scoreboard instruction.

Part of the stadium renovation that closed the structure into a bowl was, according to school officials, designed to make Kyle Field even more intimidating than it already was, by increasing the possible sound levels. Unfortunately, the night of our visit some early Auburn scores took some of the steam out of the crowd, and a driving, chilling rain that appeared just before halftime sent a good part of the crowd either home or into the concourses looking for warmth and shelter. (The next day, several columnists in the local paper admonished the fans who left early for their transgressions; how dare they depart a game whose outcome was still in doubt?)

But I’ll never forget the power of the synchronized “yells” of tens of thousands of fans during pregame, and the roar that surfaced when former Aggie QB Johnny Manziel made a surprise appearance on the field before kickoff. Seattle Seahawks fans may stake the pro claim to fan support, but if you want to determine the “real” 12th man experience you need to stop by Kyle Field and give your ears a taste of loud.

Controlling the TV with the app

If the students and alumni and other fans outside provide the vocal power, the money power that helped get the stadium rebuilt can be found in the new Kyle Field suites and premium seating areas, some of which are found on the venue’s west side, which was blown up last December and rebuilt in time for this past season.

Conduit reaching to an under-seat AP

Inside the All American Club – a behind-the-walls gathering area with catered food and bars that would not seem out of place in Levi’s Stadium or AT&T Stadium – we tested the Wi-Fi and got speeds of 63 Mbps down, 69 Mbps up; Verizon’s 4G LTE service on the DAS hit 48 Mbps/14.78 Mbps, while AT&T’s 4G LTE DAS checked in at 40 Mbps/22 Mbps.

In an actual suite where we were allowed to check out the IPTV displays, the speed tests got 67/67 for Wi-Fi and 57/12 for Verizon 4G LTE. So the well-heeled backers of A&M football shouldn’t have any problems when it comes to connectivity.

As for the IPTV controls, the new system from YinzCam solves one of the problems that has plagued stadium suites for as long as there have been suites: What do you do with the TV remote? What YinzCam did for Texas A&M was link the TV controls to a Texas A&M “TV Remote” app. By simply punching in a numerical code that appears at the bottom of the screen in front of you, anyone with access to a suite or club area with TVs can change the channel to a long list of selections, including multiple live game-day views (stadium screen, broadcast view) as well as other channels, like other games on ESPN’s SEC Network.

By pairing a static code number for each TV with another set of numbers that scrambles randomly over time, the system smartly builds security into channel changing, preventing someone who had been in a suite earlier from changing the channels after they leave. The whole remote-control process took less than a minute to learn, and we had fun wandering through the club-level areas our pass gave us access to, changing screens as we saw fit.
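YinzCam hasn’t published the mechanics of its code scheme, but the general pattern it suggests (a static per-TV identifier paired with a short code that rotates over time, checked server-side) can be sketched as follows; every name, parameter and interval here is our own invention, not YinzCam’s implementation:

    import hashlib
    import hmac
    import time

    ROTATION_SECONDS = 300   # hypothetical: how often the on-screen code changes

    def rotating_code(tv_secret: bytes, now=None) -> str:
        """Derive the short code currently displayed on a given TV."""
        window = int((now if now is not None else time.time()) // ROTATION_SECONDS)
        digest = hmac.new(tv_secret, str(window).encode(), hashlib.sha256).digest()
        return f"{int.from_bytes(digest[:4], 'big') % 10_000:04d}"

    def authorize(tv_id: str, submitted_code: str, secrets: dict) -> bool:
        """Grant channel-change access only while the viewer can see the screen."""
        return hmac.compare_digest(submitted_code, rotating_code(secrets[tv_id]))

    secrets = {"suite-214-tv-1": b"per-tv-provisioned-secret"}
    code_on_screen = rotating_code(secrets["suite-214-tv-1"])
    print(authorize("suite-214-tv-1", code_on_screen, secrets))   # True within the window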

Our favorite places to watch the game at Kyle Field were the loge-level lounges, where you could first purchase food and beverages, including alcoholic ones, at an inside bar and then sit at an outside seat with a small-screen TV in front of you for information overload. The Wi-Fi in the southwest corner loge lounge checked in at 67.03/62.93, so it was no problem being connected via mobile device, either.

What comes next for the Kyle Field network?

Even though the rain had started coming down harder, we left the comfort and warmth of the club levels to wander around the stadium’s upper decks, including the student section, where we watched numerous fans taking pictures or videos of the band’s halftime performance. Clearly most everyone in Kyle Field had gotten the message, and wasn’t afraid of being unable to connect while using a mobile device at the game, even among 102,000 of their closest friends.

Antennas on flag poles atop seating

The question now for Kyle Field is: What does it do next with its network? The most obvious place for innovation or new features is a stadium-centric app, one that could provide services like a wayfinding map. Maybe it was our round-the-stadium wandering that left us confused, but any building that seats 102,000-plus could use an interactive map. It might also be interesting to tie a map to concessions. The night we visited, there were long lines at the few hot chocolate stands due to the cold weather; in such situations you could conceivably use the network to find out where hot chocolate stands were running low, maybe open new ones and alert fans through the app.

We’re guessing parking and ticketing functions might also be tied to the app in the future, but for now we’ll have to wait and see what happens. One thing in Kyle Field’s favor for the future: thanks to the capacity of the optical network buildout, the stadium already has thousands of spare fiber connections that aren’t currently being used. That means when it’s time to upgrade or add more DAS antennas, Wi-Fi APs or whatever comes next, Kyle Field is already wired to handle it.

For the Nov. 7 game at Kyle Field, the final numbers included 37,121 unique users of the Wi-Fi network, and a peak concurrent user count of 23,101, taken near the end of the third quarter. The total traffic on the Wi-Fi network that night was 2.94 TB, perhaps low or average for Kyle Field these days, but it’s helpful to remember that just three years ago that was right around the total Wi-Fi data used at a Super Bowl.
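Run those final numbers through the “megabytes per fan” lens from earlier in this issue, and the rainy night still looks respectable:

    total_tb, unique_users = 2.94, 37_121
    print(f"~{total_tb * 1_000_000 / unique_users:.0f} MB per connected fan")   # about 79 MB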

Until the next IBM/Corning network gets built in Atlanta (at the Falcons’ new Mercedes-Benz Stadium, slated to open in 2017), the Kyle Field network will no doubt be the center of much stadium-technology market attention, especially if the school ever does manage to get 100,000 fans using the Wi-Fi all at once. While A&M’s on-the-field fortunes in the competitive SEC are a yearly question, the performance of the network in the Aggies’ stadium isn’t; right now it would certainly be one of the top four seeds, if not No. 1, if there were such a thing as a college stadium network playoff.

What we’re looking forward to is more data and more reports from a stadium with a network that can provide “that extra push over the edge” when fans want to turn their connectivity dial past 10. Remember, this one goes to 11. It’s one more.

(More photos below! And don’t forget to download your copy of the STADIUM TECH REPORT for more!)

Panoramic view of Kyle Field before the 102,000 fans fill the seats.

Some things at Kyle Field operate at ‘traditional’ speeds.

Outside the south gate before the game begins.

Overhang antenna in the middle section of the stadium.

Last-minute audible to optical made Texas A&M’s stadium network a winner

Texas A&M student at recent Aggies football game. Photo: Paul Kapustka, MSR (click on any photo for a larger image)

The original game plan for the new wireless networks at Texas A&M’s Kyle Field called for copper, not optical fiber, at the network core. Then came a last-minute audible that changed the game not just for the Aggies but maybe for stadium networks overall.

After initially designing the network with a traditional copper wiring system, a late spring 2014 decision by Texas A&M chancellor John Sharp reversed field, switching instead to an all-optical network for DAS, Wi-Fi and IPTV combined. The new network, now in full operational mode, is already being hailed as a future-proof path for stadium network technology, with other schools and pro teams beating a path to College Station to see what they might learn.

With screaming speeds on both the Wi-Fi and DAS networks and plenty of capacity for now and the future, Sharp’s line-of-scrimmage call to go with an IBM and Corning optical-based network seems to be a huge score, according to a school official who brought the idea to Sharp’s attention.

Editor’s note: This story is part of our most recent STADIUM TECH REPORT, the COLLEGE FOOTBALL ISSUE for 2015. The 40+ page report, which includes profiles of stadium deployments at Texas A&M, Kansas State, Ole Miss and Oklahoma, is available for FREE DOWNLOAD from our site. Get your copy today!

A sample of the Wi-Fi and DAS speed tests we took at Kyle Field.

Last-minute switch from copper to optical

“We had got pretty far down the road with an older, but tried and true [network] architecture,” said Phillip Ray, Vice Chancellor for Business Affairs at The Texas A&M University System. But after hearing and reading about the potential of an optical fiber-based network, Ray brought in Corning and IBM representatives over school spring break in 2014 to discuss making the switch at Kyle Field, even though the network would have to be ready for the 2014 football season.

“We had some face to face meetings with chancellor Sharp and discussed all the pros and cons,” said Ray, who had been charged by Sharp with overseeing the network deployment part of the $485 million Kyle Field renovation. Though Ray said he was under a “lot of pressure” to stick with the older-type design, he quickly got a green light from Sharp to take the optical choice and run with it.

“If we had gone copper, we knew that we would have had a network in the stadium for game 1,” said Ray. “But the pros of optical far outweighed the cons. Chancellor Sharp instead took a big risk, and took a leap of faith for all the right reasons. He said, ‘this is the chance of a lifetime, to really move the ball and shoot for the top!’ “

According to Ray, the total cost of the combined Wi-Fi, DAS and IPTV network ended up being “just north of $20 million,” but that cost was softened when the two largest cellular carriers, AT&T and Verizon Wireless, ponied up $10 million, almost half the cost.

“The carriers embraced it, funded it, and want to be with us down the road,” said Ray. “It was a paradigm shift for them, but they wanted to be involved.” While AT&T and Verizon are live on the DAS now, Ray said that Texas A&M already has a commitment from T-Mobile to join the DAS soon, and hopes to also add Sprint before long.

Aside from the leap of faith to go optical was the on-the-ground necessity of building the network quickly, since Sharp didn’t want to start the 2014 season without it. Ray said that Todd Chrisner – a former IBM employee who moved to Corning during the past year – “helped lead a Herculean effort” of gear suppliers, service providers and construction workers who finished Phase 1 of the network in time for the first game of last season. Phase 2 also required quick work, since it didn’t get started until Texas A&M blew up and then rebuilt the entire west side of the stadium between December 2014 and the 2015 season.

On the road to College Station, Aggie pride is everywhere. Whoop!

Again, the network (and the building) were finished on time.

“We had a lot of Aggies involved [in the construction],” Ray said. “They knew they were going to be sitting in those seats for the next 35 years, so they worked hard.”

Now that it’s finished and working incredibly well, Ray said the Kyle Field network has already been visited by representatives from other colleges, as well as professional football and hockey stadium-networking types.

“We get calls every week, and we have people down to share what we learned – we’re an open book,” said Ray. And they’re able to tell a success story mainly because Ray, Sharp and others trusted themselves to switch from an OK play to one that could score a touchdown.

“If we had gone with copper we’d be so regretting it now,” Ray said. Having an optical-based network, he said, “sets us up for many years, and eventually will save us money. It was a lot of hard work and risk, and if it had fallen on its head, chancellor Sharp would have taken the heat. Instead, it’s one of the best decisions, ever.”

IBM formally launches sports consulting practice to bring tech to stadiums

Texas A&M student at recent Aggies football game. Photo: Paul Kapustka, MSR (click on any photo for a larger image)

IBM formally cemented its entrance into the sports-stadium tech deployment market with the announcement of a sports and fan experience consulting practice, and a “global consortium” of tech and service suppliers who may help IBM in future stadium and entertainment venue deployments.

For industry watchers, the Nov. 19 debut of the IBM “Sports, Entertainment and Fan Experience” consulting practice was not a surprise, since its leader, Jim Rushton, had already appeared at tech conferences this past summer, talking about IBM’s plans to deploy a fiber-based Wi-Fi and DAS network at the new Mercedes-Benz Stadium being built for the Atlanta Falcons. IBM was also publicly behind a similar network build over the last two years at Texas A&M’s Kyle Field. For both networks, IBM is using Corning optical gear.

Still, the formal creation of the IBM practice (you can read all about it at the new IBM sports website) means that the 800-pound gorilla is now firmly inside the competitive ring of the stadium-tech marketplace, a landscape that already has multiple players, many with several stadium deployments under their belts. However, IBM’s vast experience in big-time sports technology (Big Blue is behind such endeavors as the truly wonderful online experience of The Masters, as well as the technical underpinnings of three of tennis’ Grand Slam events: Wimbledon, the U.S. Open and the Australian Open), plus its considerable tech and monetary resources, probably makes it a No. 1 contender for the biggest projects, and possibly for smaller ones too.

Artist’s rendering of planned overhead view of new Atlanta NFL stadium

Rushton, who spoke with Mobile Sports Report earlier this year in one of his first public appearances as an IBMer, said in a phone interview this week that IBM’s fiber-to-the-fan network model isn’t just for large-scale deployments like the one at 105,000-seat Kyle Field or the Falcons’ new $1.4 billion nest, which will seat 71,000 for football and up to 83,000 for other events after it opens in 2017.

“That type of system [the optical network] is scalable,” Rushton said, and even in smaller venues he said it could potentially save customers 30 percent or more compared to the cost of a traditional copper-based cabled network. The flip side to that equation is that purchasers have fewer gear suppliers to choose from on the fiber-based side of things, and according to several industry sources it’s still sometimes a problem to find enough technical staffers with optical-equipment expertise.

How much of the market is left?

The other question facing IBM’s new consulting practice is the size of the market left for stadium tech deployments, an answer we try to parse each year in our State of the Stadium survey. While this year’s survey and our subsequent quarterly reports found a high number of U.S. professional stadiums with Wi-Fi and DAS networks already deployed, there are still large numbers of college venues as well as international stadiums and other large public venues like concert halls, race tracks and other areas that are still without basic connectivity.

Full house at Kyle Field. Photo: Paul Kapustka, MSR

With its new “global consortium” of companies that supply different parts and services of the connected-stadium experience, IBM could be an attractive choice to a customer that doesn’t have its own technical expertise, providing a soup-to-nuts package that could conceivably handle tasks like in-stadium IPTV, DAS and Wi-Fi, construction and stadium design, and backbone bandwidth solutions.

However, IBM will be going up against vendors who have led deployments on their own, and league-led “consortium” type arrangements like MLBAM’s project that brought Wi-Fi to almost all the Major League Baseball stadiums, and the NFL’s list of preferred suppliers like Extreme Networks for Wi-Fi and YinzCam for apps. Also in the mix are third-party integrators like CDW, Mobilitie, 5 Bars, Boingo Wireless and others who are already active in the stadium-technology deployment space. And don’t forget HP, which bought Wi-Fi gear supplier Aruba Networks earlier this year.

Certainly, we expect to hear more from IBM soon, and perhaps right now it’s best to close by repeating what we heard from Jared Miller, chief technology officer for Falcons owner Arthur Blank’s namesake AMB Sports and Entertainment (AMBSE) group, when we asked earlier this year why the Falcons picked IBM to build the technology in the new Atlanta stadium:

Remote optical cabinet and Wi-Fi AP at Kyle Field. Photo: Paul Kapustka, MSR

“IBM is unique with its span of technology footprint,” Miller said. He also cited IBM’s ability to not just deploy technology but to also help determine what the technology could be used for, with analytics and application design.

“They’ve looked at the [stadium] opportunity in a different manner, thinking about what we could do with the network once it’s built,” Miller said.

From the IBM press release, here is the IBM list of companies in its new “global consortium,” which IBM said is not binding, meaning that none of the companies listed is guaranteed any business yet, and others not on the list may end up in IBM deployments, like Kyle Field, which uses Aruba gear for the Wi-Fi:

Founding members of the consortium include:

· Construction and Design: AECOM, HOK, Whiting Turner

· Infrastructure Technology/Carriers: Alcatel/Lucent, Anixter, Commscope, Corning, Juniper Networks, Ruckus Wireless, Schneider Electric, Smarter Risk, Tellabs, Ucopia, Zebra Technologies, YinzCam (IPTV), Zayo, Zhone

· Communications Solutions Providers: Level 3, Verizon Enterprise Solutions, AT&T

· Fan Experience Consulting & Data Management Integration: IBM

Texas A&M’s fiber-backed Wi-Fi at Kyle Field records 5.7 TB of data during Alabama game

Scoreboard, Kyle Field. Photos: Texas A&M

We’ve been hearing rumors about how much data was flowing at the new fiber-based Wi-Fi network at Texas A&M’s Kyle Field this fall, and now we finally have some verified numbers that are sure to pop some eyeballs: According to the networking crew at Corning, fans at Kyle Field used 5.7 terabytes of Wi-Fi data during the Oct. 17 game against Alabama, which the Aggies lost 41-23.

In case you are keeping score, the 5.7 TB mark is the second-largest single-game Wi-Fi usage number we’ve seen, trailing only the 6.2 TB recorded at Super Bowl XLIX in Glendale, Ariz., earlier this year. Before you pin it all on the network, however, be aware that the newly refurbished Kyle Field can hold a whole lotta fans — the announced attendance for the ‘Bama game was 105,733, which is 35,000+ more fans than the 70,288 who attended the Super Bowl at the University of Phoenix Stadium on Feb. 1. Still, building a network to support basically another baseball stadium’s worth of extra fans is pretty cool, too.

Other related numbers from the Wi-Fi network are in Super Bowl territory as well, including the 37,823 unique clients recorded during pre-game and game time, and the 26,318 peak concurrent user count. We’re not sure why only 10 people tweeted about the Wi-Fi (8 good, 2 bad), but the 3.2 Gbps peak throughput should also turn some heads.
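One more back-of-the-envelope check: spread 5.7 TB over a plausible game-day window and the average rate sits believably below that 3.2 Gbps peak (the five-hour window is our assumption):

    total_bits = 5.7e12 * 8       # 5.7 TB in bits (decimal units)
    WINDOW_HOURS = 5              # assumed pre-game-through-final window
    avg_gbps = total_bits / (WINDOW_HOURS * 3600) / 1e9
    print(f"~{avg_gbps:.1f} Gbps average vs. the 3.2 Gbps peak")   # about 2.5 Gbps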

Corning ONE DAS headend equipment at Texas A&M’s Kyle Field deployment

The question this all raises for us: Has the availability of a fiber backbone simply allowed fans to use more data? And is the demand for mobile data at big events perhaps even higher than we thought? With a regular-season game at Nebraska hitting 4.2 TB earlier this season, it’s pretty clear that data demands are showing no signs of hitting a plateau. Or maybe we can deduce that the better the network, the more traffic it will carry?

It’s also worthwhile to note that stats this season from AT&T have shown several 1+ TB data totals for games at Kyle Field on the AT&T DAS network, which uses the same fiber backbone as the Wi-Fi. This “fiber to the fan” infrastructure, built by IBM and Corning, will also be at the core of the network being built at the new home of the NFL’s Falcons, the Mercedes-Benz Stadium in Atlanta, scheduled to open in 2017.

We’ll have more soon from Kyle Field, as Mobile Sports Report is scheduled to visit for the Nov. 7 game against Auburn. If you plan to be in College Station that weekend, give us a holler. Or a yell, right? We are looking forward to seeing the stadium and the network firsthand, and to running speed tests to see how well all areas are covered. With 5.7 TB of Wi-Fi, it’s a good guess the coverage is pretty good.

(Statistics provided by Corning for the Oct. 17 game are below.)

[Corning statistics graphic for the Oct. 17 game]
