MSR at the College Football Playoff Championship: Send us your speed tests!

ESPN’s College Football Playoff Championship stage in downtown Fort Worth, Sunday night. Credit: Paul Kapustka, MSR

As Twitter followers found out yesterday, MSR is in “north Texas,” aka the Dallas-Fort Worth “Metroplex,” attending tonight’s inaugural College Football Playoff Championship game at AT&T Stadium.

We’re here to see a test of AT&T’s LTE Broadcast technology, which will ostensibly make it easier for venues to deliver live video streams via a cellular connection. But we’re also going to take advantage of the event to walk around “Jerry’s World,” take speed tests and see how the network at AT&T Stadium performs during a big game. If you are in attendance and know how to run a Wi-Fi or cellular speed test, send us the results (a tweet to @paulkaps is the best way, or email kaps at mobilesportsreport.com). Check Twitter for updates during the game.
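For fans with a laptop on the stadium network, one repeatable way to run a test is the open-source speedtest-cli Python package; this is just a suggestion on our part, and any phone speed-test app works equally well:

```python
# A minimal sketch using the third-party speedtest-cli package
# (pip install speedtest-cli). Just one option; any speed-test app works.
import speedtest

st = speedtest.Speedtest()
st.get_best_server()              # picks the lowest-latency test server
down = st.download() / 1_000_000  # results are in bits/sec; convert to Mbps
up = st.upload() / 1_000_000

print(f"Download: {down:.1f} Mbps, Upload: {up:.1f} Mbps")
```

Tweet us the download and upload numbers along with your section and row.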

Stadium Tech Report: Network finishes season strong at Niners’ Levi’s Stadium

Arriving at Levi's Stadium for last 2014 season game

Arriving at Levi’s Stadium for last 2014 season game

While the football season didn’t turn out like 49ers fans wanted, the wireless network at the team’s new Levi’s Stadium closed out the year with strong performances to complete a largely glitch-free stretch of events at what is one of the world’s most technologically advanced stadiums.

With more than 2 terabytes of data used by fans at each of the last two San Francisco 49ers home games, the Wi-Fi and DAS networks at Levi’s Stadium closed out a season of superb connectivity, one that eventually allowed the team to offer not just in-seat food and beverage delivery but also ordering and delivery of merchandise like hats and T-shirts, an option available for the Dec. 20 game against the San Diego Chargers and the Dec. 28 closer vs. the Arizona Cardinals.

According to the Levi’s Stadium network crew, the Wi-Fi network carried 2.34 TB of data for the Chargers game and another 2.11 TB for the Cardinals game, with 20,096 fans using the network on Dec. 20 and 20,164 on Wi-Fi on Dec. 28. Peak concurrent user numbers were 13,700 for the Chargers game, and 14,400 for the season closer.
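For perspective, a little back-of-the-envelope math turns those totals into per-fan averages (the derived figures below are our arithmetic, not team-reported stats):

```python
# Average data per connected fan, derived from the reported totals.
# These per-user figures are our arithmetic, not official team stats.
games = {
    "Dec. 20 vs. Chargers":  {"wifi_tb": 2.34, "users": 20_096, "peak": 13_700},
    "Dec. 28 vs. Cardinals": {"wifi_tb": 2.11, "users": 20_164, "peak": 14_400},
}

for name, g in games.items():
    mb_per_user = g["wifi_tb"] * 1_000_000 / g["users"]  # TB -> MB (decimal)
    print(f"{name}: ~{mb_per_user:.0f} MB per Wi-Fi user, "
          f"peak concurrency {g['peak']:,}")
```

That works out to better than 100 MB per connected fan per game, a healthy load for any venue network.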

All season long, our speed tests in various parts of the stadium showed strong signals for both the Wi-Fi and the distributed antenna system (DAS) enhanced cellular network. The final game was no different; we found Wi-Fi download speeds of 16 Mbps on the rooftop deck, 25 Mbps in a suite and a scorching 39 Mbps in the Levi’s 501 Club seats (no doubt in part because there was a Wi-Fi antenna under the seat next to us).

Both Verizon and AT&T 4G LTE services also worked well, consistently scoring download speeds in the 4-6 Mbps range in most places and much higher in others. In short, we didn’t find any flaws in the network coverage in five games of walking all around the stadium, testing everywhere we went.

Caltrain to VTA a smooth ride

Caltrain crowd en route to Arizona game

At the final game, Mobile Sports Report (me) tested the full public-transit route to the game, starting from San Mateo on Caltrain at 10:51 a.m. The parking lot at the station was almost completely empty, and free since it was Sunday; it’s possible crowds were lighter because the Niners had been eliminated from postseason play, but the ride to Mountain View went without a hitch, a good sign for next year, when many fans in town for Super Bowl 50 will no doubt use Caltrain to get from San Francisco to Levi’s.

At the Mountain View Caltrain/VTA station, operations were the best I’ve seen, with more neon-vested VTA helpers offering clear instructions on why you might want to take an express bus instead of the light rail. Insider tip: If the express bus is available, take it; in our testing it arrived at Levi’s in about half the time of the train trip (roughly 20 minutes, as opposed to almost 40 minutes for the light rail option).

Express bus option at Mountain View

The one thing that still needs to be ironed out is the fare confusion in switching from Caltrain to VTA, which are two different operators. On Caltrain there was advertising for a $6 “combo ticket” that would let you ride VTA and could be purchased at the same time as your Caltrain pass. But a single-day ticket purchased online via the VTA app was only $4.50, so it’s not clear why you would buy the combo ticket. Especially for the Super Bowl, it would help fans if there were one price and one place to buy a single “Get to Levi’s” public-transit ticket.

Food order arrives as promised

Another thing I tried at the season closer was the in-seat food ordering feature on the Levi’s Stadium app. Sitting in the Levi’s Club section seats on the third level, I consulted the app to order a cold beer and a warm pretzel, which the app said could be delivered in 10 minutes.

Food runner bringing me my cold beer and warm pretzel

After I entered credit-card information and hit the order button, the app updated itself with timely notices that the order was being prepared, then that it was on its way. I found that information very reassuring, a sign that things were indeed happening; a big number was even attached to my order, apparently to make it easier for the food runner to confirm delivery.

The order arrived in exactly 10 minutes, as the app predicted. It also arrived in a lot of extra packaging: a separate plastic bag for the steel bottle of beer, and a paper sack holding a cellophane-wrapped pretzel. Since there is no way to add a gratuity in the app, I gave the runner a cash tip, which seemed appropriate even though a $5 delivery charge is already added to the order cost. I have to admit it felt a little weird to have someone bring me my food and drink, but in the time it took to order and deliver I sat in my seat and watched the Niners’ game-winning TD drive, so it’s clearly a fan-friendly option.
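For readers curious how such an ordering flow hangs together, here is a minimal sketch of the order lifecycle as I experienced it; the state names and code are entirely our hypothetical reconstruction, not VenueNext’s actual implementation:

```python
# Hypothetical sketch of the in-seat order lifecycle described above.
# States, names and fee placement are our assumptions, not VenueNext code.
from enum import Enum, auto

class OrderState(Enum):
    PLACED = auto()
    PREPARING = auto()
    OUT_FOR_DELIVERY = auto()
    DELIVERED = auto()

class InSeatOrder:
    DELIVERY_FEE = 5.00  # flat delivery charge noted in the article

    def __init__(self, order_number: int, items: list[str], seat: str):
        self.order_number = order_number  # the big number shown to the runner
        self.items = items
        self.seat = seat
        self.state = OrderState.PLACED

    def advance(self) -> None:
        """Move to the next state and push a notice to the fan's phone."""
        states = list(OrderState)
        self.state = states[states.index(self.state) + 1]
        print(f"Order #{self.order_number}: {self.state.name} -> {self.seat}")

order = InSeatOrder(412, ["cold beer", "warm pretzel"], "Club level seat")
for _ in range(3):  # PREPARING -> OUT_FOR_DELIVERY -> DELIVERED
    order.advance()
```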

Video replays work well, for a small number of viewers

Another part of the Levi’s Stadium technology that was at peak performance by year’s end was the app’s instant replay feature. Though it started slowly and had some hiccups early on, by the final game instant replays were appearing in the app even before the next play had concluded (see our feature on how the VenueNext team gets the replays to the app so quickly).

While it’s an impressive addition to the in-game experience, the replays are a feature that only a small number of fans are watching. According to the team network stats there were only 1,253 unique users watching replays on Dec. 20, and 1,019 during the Dec. 28 game. Total replays viewed were 6,285 for the Chargers game and 4,310 for the season closer.
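Set against the Wi-Fi user counts, the engagement rate is easy to derive (again, our arithmetic rather than a team-reported stat):

```python
# Replay engagement derived from the team-reported numbers above.
# The percentages and per-viewer counts are our arithmetic.
games = [
    ("Dec. 20", 1_253, 6_285, 20_096),   # date, viewers, replays, wifi users
    ("Dec. 28", 1_019, 4_310, 20_164),
]

for date, viewers, replays, wifi_users in games:
    share = viewers / wifi_users * 100
    print(f"{date}: {share:.1f}% of Wi-Fi users watched replays, "
          f"~{replays / viewers:.1f} clips each")
```

In other words, only around 5 to 6 percent of connected fans touched the feature, though those who did watched roughly four to five clips apiece.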

Why aren’t the replays catching on? Our main guess is that the Levi’s Stadium big screens are so clear and so quick to show replays (they also show live action as it happens) that fans don’t find it necessary to use their phones. It’s also possible that many fans using the network aren’t using the Levi’s Stadium app at all. Indeed, according to the team network stats, the team app hasn’t yet cracked the top four apps used at any game this season; for the Dec. 20 game the top apps on the network were Amazon Cloud Drive, Facebook, Google APIs (probably Gmail) and Apple; for Dec. 28 the list was Amazon, Google, Facebook, then Apple.

We’ll try to get some season-long stats to share for both the network and the app features, but our quick conclusion after five live-game visits to Levi’s Stadium this year is that the wireless network and the app both pretty much lived up to their pre-season billing and hype, delivering a wireless game-day experience that is the standard other large public facilities will be measured against, going forward. More photos from our last visit below.

The Microsoft Surface/sideline Wi-Fi unit

close-up of cable connection

Niners’ Flickr promotion on scoreboard — very popular

Sideline Surface tablets for Niners players and coaches

Colin Kaepernick exchanges his radio helmet for his flat-brimmed hat after throwing a TD pass

View from the Levi’s skydeck out over Santa Clara

If you throw a rooftop party, the cheerleaders might visit

View from the Levi’s 501 Club section seats

Wi-Fi antenna right under the seat next to me (probably why my speedtest was 40+ Mbps)

In-stadium signage helps get fans to light rail

End of game view from skydeck

A final toast to the season at the BNY Club

VTA train line. Only took 15 minutes from here to get on bus.

Caltrain platform at Mountain View. Extra trains helped make ride home easy

Stadium Tech Report: How VenueNext generates replays for Levi’s Stadium app

During the final home game for the San Francisco 49ers this season, I was somewhat amazed to see a replay appear on the Levi’s Stadium app even before the next play had concluded. Clearly, the VenueNext team behind the app had progressed significantly from the early season, when the app’s replay function struggled somewhat.

During the Dec. 28 game against the Arizona Cardinals, the VenueNext team invited Mobile Sports Report to Levi’s Stadium for a behind-the-scenes look at how the replays are generated; here’s a quick take on how the video moves from playing field cameras to fans’ phones in a matter of seconds.

Step 1: Capturing what the cameras see

How do you make sure you have replays ready? You start by basically capturing all the action from all the cameras in the building. Those views are shuttled back to the main Levi’s Stadium video room, where 15 people have the somewhat enviable job of watching views from all the live cameras in the stadium. (We still haven’t heard a response to our offer of bringing a cooler in exchange for a spare seat next season.)

Inside the video room at Levi’s Stadium. (Click on any photo for larger image) Credit all photos: Paul Kapustka, MSR

According to VenueNext CEO John Paul, who led our tour at Levi’s, VenueNext has its own servers to cache video — here is a look at the Elemental gear used to store the video for the app’s replay function. Paul said that so far, VenueNext has only needed a little more than half of the capacity of the four servers pictured.

Elemental gear in data room next to video room at Levi’s Stadium

Step 2: Labeling and thumbnailing the replay

While most of this kind of gear is not new to broadcast operations, the interesting part from an app perspective is the next layer: the choosing and human editing. One of my bigger questions was how exactly a replay gets set up, labeled with a headline and a thumbnail, and into the app within seconds. The answer: You write another app, and put two Android-based tablets with football-savvy operators in the press box, where they can see the plays live and quickly attach the appropriate info to the replay clips.

In the two pictures below, you can see an over-the-shoulder view of the two tablet apps; the first one is a sort of play-by-play generator, which the operator uses to label the next replay clip as either a run, pass, punt, incomplete, etc., so that a fan looking at the app can quickly figure out what the play might be (“45-yard TD pass,” or something like that). On the second tablet app, another operator sees instant thumbnails gleaned from the video room from all the different camera angles, and when the play is over, that operator picks a thumbnail with a click and the replay is on its way to the app.

First replay tablet app, which adds info about the play

Second replay tablet app, which adds a thumbnail to the replay
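Pieced together from that walkthrough, the two-step tagging flow looks roughly like the sketch below; the data fields and publish step are our assumptions for illustration, not VenueNext’s actual code:

```python
# Hypothetical reconstruction of the two-tablet replay tagging flow.
# Field names and the publish gate are our assumptions, not VenueNext code.
from dataclasses import dataclass, field

@dataclass
class ReplayClip:
    play_id: int
    angles: list[str] = field(default_factory=list)  # cached camera feeds
    label: str = ""       # tablet 1: "45-yard TD pass", "run", "punt", ...
    thumbnail: str = ""   # tablet 2: thumbnail picked from a camera angle

    def ready(self) -> bool:
        # A clip ships to the app only after both operators have tagged it.
        return bool(self.label and self.thumbnail and self.angles)

clip = ReplayClip(play_id=88, angles=["sideline", "end zone", "high", "cart"])
clip.label = "45-yard TD pass"   # operator 1, the play-by-play tablet
clip.thumbnail = "end zone"      # operator 2 picks with a single click
if clip.ready():
    print(f"Publishing play {clip.play_id}: {clip.label} ({clip.thumbnail})")
```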

Since we only had a few minutes to watch the replay team in action (and since Paul was simultaneously hosting a group of representatives from a prospective future sports-team client), I didn’t get to ask detailed questions about how it all works, but the above should give you some idea. What’s impressive about the Levi’s Stadium app is that it offers four different camera angles of the same play, as well as a multi-cam view that shows all the angles at once. Though I’m not a fan of the all-at-once option, I did find it enjoyable to watch a replay from multiple perspectives, like seeing a pass caught from both a sideline and an end-zone view.

Step 3: Educating fans how to use replay

Initially one of the most-hyped features of the planned stadium app, the replay function has seen only limited use this season, making it somewhat of a disappointment to Niners CEO Jed York. From our perspective, two big things held back wider use of the replay function. The first was the ever-changing appearance of the app itself, which felt like a beta project for most of the season, with a different interface at almost every game. The second was the pair of excellent, humongous video boards at Levi’s, which not only show live game action but also quickly show replays, sometimes from those multiple angles as well.

There might even be a third thing holding fans back from trying out the replay function — that would be just getting fans to use the stadium app in the first place. While usage of the cellular and Wi-Fi networks started at record levels and remained strong from game to game, most fans at Levi’s Stadium were using their devices for other apps, like Twitter, Instagram, Facebook and good old email. As the season progressed, the team started promoting the app and the network more aggressively, adding more-frequent messages to the main video boards and dressing its stadium tech support crew (the “NiNerds”) in neon vests for easier recognition.

What’s next for next season? My guess is that with a year under their belt, the Niners and VenueNext will have a more-aggressive campaign in 2015 to steer more fans to the replay function, which is truly a wonderful enhancement to the game-day experience.

Stadium Tech Report: Corning, IBM bringing fiber-based Wi-Fi and DAS to Texas A&M’s Kyle Field

Kyle Field, Texas A&M University. Credit all photos: Texas A&M

Editor’s note: The following is an excerpt from our recent Stadium Tech Report series COLLEGE FOOTBALL ISSUE, a 40-page in-depth look at Wi-Fi and DAS deployment trends at U.S. collegiate football stadiums. You can download the full report for free, to get more stadium profiles as well as school-by-school technology deployment capsules for both the SEC and Pac-12 conferences.

When Texas A&M’s newly renovated Kyle Field opens for the 2015 football season, its outside appearance will have changed dramatically. But from a networking perspective, what’s really different is hidden on the inside – namely, an optical fiber infrastructure designed to bring a new level of performance, cost savings and future-proofing to stadium network deployments.

While the use of optical fiber instead of copper cable in large networks isn’t exactly “new” in the core telecom or enterprise networking worlds, in the still-nascent field of stadium network deployments fiber has yet to make large inroads. But the promise of fiber’s ability to deliver much higher performance and greater future-proofing at lower installation costs in stadium situations may get a very visible poster child when Texas A&M’s football facility kicks off the 2015 season with a technology infrastructure designed to be among the most comprehensive in any stadium, collegiate or professional.

With a Wi-Fi network designed to support 100,000 concurrent connections, a robust DAS network with more than 1,000 antennas, and an IPTV deployment with more than 1,000 screens, the IBM-designed network based largely on Corning’s fiber-optical systems is incredibly impressive on paper – and it has already produced some eye-popping statistics this past season, when just a part of it came online during the “Phase 1” period of the two-phase $450 million Kyle Field renovation.

The final phase of the renovation, Phase 2, just now getting underway, began with the implosion of the stadium’s west stands; reconstruction is scheduled to finish in time for the 2015 season with a new, enclosed-bowl structure that will seat 102,512 fans. And if the new network delivers as planned, those fans will be among the most-connected anywhere, with plenty of future-proofing to make sure it remains that way for the foreseeable future – thanks to fiber.

Driving on the left side of the street

What’s going to be new about Kyle Field? According to news reports some of the creature comforts being added include redesigned concession stands, so-called “Cool Zones” with air conditioning to beat the Texas heat, well-appointed luxury suites and new restrooms – including 300 percent more women’s bathrooms.

Scoreboard, Kyle Field

According to representatives from the school, the decision to make the new stadium a standout facility extended to its network infrastructure. “Our leadership decided that [the stadium renovation] would be leading edge,” said Matthew Almand, the IT network architect for the Texas A&M University System, the administrative entity that oversees university operations, including those at the flagship school in College Station, Texas. “There were some leaps of faith and there was a decision to be leading edge with technology as well.”

Though Phase 1 planning had started with a traditional copper cable design for the network, Almand said a presentation by IBM and its “smarter stadium” team changed the thinking at Texas A&M.

“The IBM team came in and did a really good job of presenting the positive points of an optical network,” Almand said.

Todd Christner, now director of wireless business development at Corning, was previously at IBM as part of the team that brought the optical idea to Texas A&M. While talking fiber to copper-cable veterans can sometimes be “like telling people to drive on the left side of the street,” Christner said the power, scalability and flexibility of a fiber network fit well with the ambitious Kyle Field plans.

“The primary driving force [at Texas A&M] was that they wanted to build a state of the art facility, that would rival NFL stadiums and set them apart from other college programs,” Christner said. “And they wanted the fan [network] experience to be very robust.”

With what has to be one of the largest student sections anywhere – Christner said Texas A&M sets aside 40,000 seats for students – the school knew it would need extra support for younger fans’ heavy data use on smartphones. School officials, he said, were also concerned about DAS performance, which in the past had been left to outside operators with less than satisfactory results. So IBM’s presentation of a better, cheaper alternative for all of the above found accepting ears.

“It was the right room for us to walk into,” Christner said.

IBM’s somewhat radical idea was that instead of having separate copper networks for Wi-Fi, DAS and IPTV, there would be a single optical network with the capacity to carry the traffic of all three. Though the pitch for better performance, far more capacity, use of less space, and cheaper costs might sound a bit too good to believe, most of it is just the combination of the simple physics advantages of using fiber over copper, which are well known in the core telecom and large-enterprise networking worlds, applied to a stadium situation.

Deploying now and for the future

Corning ONE DAS headend equipment.

Without going too deeply into the physics or technology, the benefits stem from a simple fact: optical fiber can carry far more bandwidth than copper, over longer distances, using less power. That advantage is one reason fiber is used extensively in core backbone networks, and has been creeping slowly closer to the user’s destination through deployments like Verizon’s FiOS.

Why hasn’t fiber won over completely? Mainly because in single-user deployments – like to a single home or office – it is still costly to replace systems already in the ground or in the wall with fiber, and for many users fiber’s capacity can be a bit of overkill. Fiber’s main benefits come when lots of bandwidth is needed, and the scale of a project is large, since one main benefit is the elimination of a lot of internal switching gear, which takes up space and consumes lots of power.

Those reasons accurately describe the perfect bandwidth storm happening in networked stadiums these days, where demand seems to keep increasing on a daily basis. Some stadiums that were at the forefront of the wireless-networking deployment trend, like AT&T Park in San Francisco and AT&T Stadium in Arlington, Texas, have been in a near-constant state of infrastructure upgrades due to the ever-increasing needs for more bandwidth. And Isaac Nissan, product manager for Corning ONE, said new equipment like Wi-Fi access points with “smart” or multiple-input antennas are also going to help push the LAN world into more fiber on the back end.

But there’s another drawback to using fiber, one that has less to do with technology and more to do with history: Installers, integrators and other hands-on networking folks are generally more comfortable with copper, which they know and have used for decades. Fiber, to many, is still a new thing, requiring different skills and techniques for connecting and pulling wires, as well as for managing and administering optical equipment.

“There’s definitely a learning curve for some of the RF [industry] people, who have been doing coax for 20 years,” Nissan said. “Fiber is a little different.”

Texas A&M’s Almand admitted that bringing the stadium’s networking group into a new technology – fiber – was a challenge, but one with a worthy payoff.

Copper cable tray hardly filled by optical fiber

“There’s definitely been a gear-up cycle, getting to a new confidence level [with fiber],” Almand said. But he added that “sometimes it’s good to break out of your comfort zone.”

Lowering the IDF count

Christner said the Corning optical gear is at the center of the Kyle Field deployment, supporting the fan-facing Wi-Fi as well as Wi-Fi for back-of-the-house operations like point of sale; it also supports the stadium DAS and a network of more than 1,000 IPTV screens. Aruba Networks is the Wi-Fi gear supplier, and YinzCam is helping develop a new Kyle Field app that will include support for using smartphones as remote controls for the IPTVs in suites.

On the Wi-Fi side, Christner said the finished network will have 600 APs in the bowl seating areas, and another 600 throughout the facility, with a stated goal of supporting 100,000 concurrent 2 Mbps connections. The DAS, Christner said, is slated to have 1,090 antennas in 50 sectors.
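Those stated goals translate into eye-opening aggregate numbers; the quick math below is our derivation from Christner’s figures, not a published Corning or IBM spec:

```python
# Aggregate load implied by the stated Kyle Field design goals.
# All derived values are our arithmetic, not vendor specifications.
concurrent_users = 100_000
per_user_mbps = 2
print(f"Wi-Fi design load: {concurrent_users * per_user_mbps / 1_000:.0f} Gbps")

das_antennas, das_sectors = 1_090, 50
print(f"DAS density: ~{das_antennas / das_sectors:.0f} antennas per sector")

bowl_aps, seats = 600, 102_512
print(f"Bowl coverage: ~{seats / bowl_aps:.0f} seats per access point")
```

A 200 Gbps design target helps explain why a fiber backbone, rather than copper, anchors the whole deployment.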

With no intermediate switching gear at all, Christner said that for the fiber network in Kyle Field only 12 intermediate distribution frames (the usually wall-mounted racks that support network-edge gear, also called IDFs) would be needed, as opposed to 34 IDFs in a legacy fiber/coax system. In addition to using less power, the cabling needed to support the fiber network is a fraction of what would have been needed for coax.

One of the more striking pictures of the deployment is a 36-inch-wide cable tray, installed for the original copper-network plan, carrying just 10 inches of fiber-optic cable. Christner said the fiber network also provides a cleaner signal for the DAS, which already had a test run during the 2014 season, when 600 DAS antennas were deployed and lit.
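The space and hardware savings are easy to quantify from those numbers (our derivation, for illustration):

```python
# Savings implied by the fiber design, from the figures Christner cited.
# The percentages are our derivation.
idf_copper, idf_fiber = 34, 12
print(f"IDF reduction: {(1 - idf_fiber / idf_copper) * 100:.0f}%")   # ~65%

tray_width_in, fiber_fill_in = 36, 10
print(f"Cable tray fill: {fiber_fill_in / tray_width_in * 100:.0f}% "
      f"of the tray originally sized for copper")                    # ~28%
```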

“At the Ole Miss game we had 110,663 fans at the stadium, and according to AT&T on the DAS all their lights were green,” Christner said. “Via our completely integrated fiber optic solution, we are now able to provide the DAS with much higher bandwidth as well,” said Texas A&M’s Almand, who also said that the carriers have responded very positively to the new DAS infrastructure.

Up from the dust – a model for the future?

Antenna and zone gear box near top of stands

Also included in the design – but not being used – are an additional 4,000 spare fibers at 540 zone locations, which Christner said can be immediately tapped for future expansion needs. And all of this functionality and flexibility, he added, was being built for somewhere between one-third and 40 percent less than the cost of a traditional copper-based solution.

The proof of the network’s worth, of course, will have to wait until after the west stands are imploded, the new ones built, and the final pieces of the network installed. Then the really fun part begins for users, who will get to play with everything from 38 channels of high-def TV on the IPTV screens to the multiple-angle replays and other features planned for the mobile app. At Texas A&M, IBM’s support squad will include some team members who work on the company’s traditionally excellent online effort for the Masters golf tournament, as well as the “smarter stadium” team.

For Texas A&M’s Almand, the start of the 2015 season will mark the beginning of the end, and a start to something special.

“If I were a country singer, I’d write something about looking forward to looking back on this,” Almand said. “When it’s done, it’s going to be something great.”

Report excerpt: SEC moving slowly on stadium Wi-Fi deployments

Jordan-Hare Stadium, Auburn University

Editor’s note: The following is an excerpt from our recent Stadium Tech Report series COLLEGE FOOTBALL ISSUE, a 40-page in-depth look at Wi-Fi and DAS deployment trends at U.S. collegiate football stadiums. You can download the full report for free, to get more stadium profiles as well as school-by-school technology deployment capsules for both the SEC and Pac-12 conferences.

When it comes to college football, the Southeastern Conference – usually just known as “the SEC” – is second to none in the product on the field.

But what about the product in the stands, namely the wireless technology deployments in SEC stadiums? With just two of 14 conference schools currently offering fan-facing Wi-Fi in their main venues, the SEC isn’t pushing any technology envelopes as a whole. And according to one SEC athletic director, there probably won’t be a wholesale march by the conference to the technology forefront, simply because the SEC’s in-stadium fans have other priorities on what needs fixing first.

Scott Stricklin, the AD at SEC member Mississippi State, leads a conference-wide group that is taking a close look at the in-stadium fan experience, a concern for the SEC even as the conference enjoys NFL-like popularity for its teams and games.

“We are proud that we have a pretty special product in our stadiums, and we want to take steps to keep it that way,” said Stricklin in an interview with MSR. A recent conference-wide fan survey, he said, did highlight the fact that when it comes to wireless connectivity, “none of us from a performance standpoint scored very well.”

Wi-Fi not as important as parking, good food

But Stricklin also noted that the same fan survey didn’t place stadium connectivity at the top of the list of things to fix: Instead, it fell well down, trailing issues like parking, clean restrooms, stadium sound and good food. That lack of pressing concern, combined with Stricklin’s still-common belief that fans should be cheering instead of texting while at the stadium, means that the SEC will probably take a measured approach to Wi-Fi deployments in stadiums, and continue to rely on carrier-funded DAS networks to carry the game-day wireless load.

Scott Stricklin, Mississippi State AD

“I take more of a Mark Cuban approach – I’d rather people in the stands not be watching video [on their phones],” Stricklin said. “It takes away from the shared experience.”

Stricklin also noted that the two schools that have installed Wi-Fi in their stadiums – Auburn and Ole Miss – haven’t had resounding success with their deployments.

“Some [SEC schools] have done [Wi-Fi], and they’re not completely happy with the results,” said Stricklin, saying the lack of success has reinforced the cautious approach to Wi-Fi, conference-wide. “Those are the issues all of us are facing and grappling with,” he added.

SEC fans setting DAS traffic records

Even as they trail on Wi-Fi deployments, that doesn’t mean SEC schools are putting in dial-up phone booths. Indeed, Stricklin noted the huge video boards that have been installed in most conference stadiums, and did say that recent installations of carrier-funded DAS deployments have somewhat eased the no-signal crunch of the near past.

At his own school, Stricklin said his office got a lot of complaints about fans not being able to get a cellular signal before AT&T updated the stadium’s DAS in 2013.

“Last year, we got very few negative comments [about cellular service],” Stricklin said. “AT&T customers were even able to stream video.”

Vaught-Hemingway Stadium, Ole Miss

AT&T’s aggressive plan to install as many DAS networks as it can has helped bring the SEC to a 100 percent DAS coverage mark, and the fans seem to be enjoying the enhanced cellular connectivity. According to AT&T statistics, fans at SEC schools have regularly led the carrier’s weekly DAS traffic totals for most of the football season, especially at the “big games” between SEC schools like Alabama, Auburn, Ole Miss, Mississippi State and Georgia.

During Alabama’s 25-20 home victory over then-No. 1 Mississippi State, AT&T customers at Bryant-Denny Stadium used 849 gigabytes of traffic, the second-highest total that weekend for stadiums where AT&T has a DAS. The next two highest data-usage marks that weekend came at games at Georgia (676 GB) and Arkansas (602 GB), highlighting that SEC games typically have huge crowds, and those crowds like to use their cellphones, no matter how good the game on the field is.

Would Wi-Fi help with some of the traffic crunches? Possibly, but only two schools in the conference – Ole Miss and Auburn – currently have fan-facing Wi-Fi in their stadiums. Texas A&M, which is in the middle of a $450 million renovation of Kyle Field, is leaping far ahead of its conference brethren with a fiber-based Wi-Fi and DAS network and IPTV installation that will be among the most advanced anywhere when it is completed this coming summer.

But most of the SEC schools, Stricklin said, will probably stay on the Wi-Fi sidelines, at least until there is some better way to justify the millions of dollars in costs needed to bring Wi-Fi to a facility that might not see much regular use.

“If you only have 6 home games a year, it’s hard to justify,” said Stricklin of the cost of a Wi-Fi stadium network.

Other sports may move before football

Stricklin, the man who wants fans to keep their phones in their pockets at football games, is no stranger to technology-enhanced experiences in stadiums. He claims to “love” the in-seat food delivery options at MSU baseball and basketball games, and notes that the conference athletic directors will have a meeting soon where the game-experience panel experts will walk the ADs through the facets of wireless technology deployments.

“They’re going to lay out what are the challenges, and what are the costs” of wireless deployments, Stricklin said. What Stricklin doesn’t want to see at MSU or at any SEC school is the return of the “no signal” days.

“When fans from other schools come here, we want them to have a good experience,” Stricklin said.

But he’d still prefer that experience is real, not virtual.

“I still just wonder, is anybody really doing this?” he asked. “Are you going to pay what you pay to come to our place, and then watch your phone? What I hope is that we produce such a great experience, you’re not going to want to reach for your phone.”

AT&T: Bills fans using almost 400 GB of data per game on Ralph Wilson Stadium DAS

Ralph Wilson Stadium

The new DAS deployment at Buffalo’s Ralph Wilson Stadium is getting a workout from Bills fans, according to data from DAS operator AT&T. According to AT&T, fans on AT&T’s cellular network are using an average of 397 gigabytes of data per game so far this season, a figure that might drift a bit higher after the Bills’ big upset of Green Bay this past weekend.

The DAS, part of a $130 million renovation project at Ralph Wilson Stadium for this season that also saw the installation of new HD video boards (but no Wi-Fi), has 33 sectors with 11 cell sites’ worth of AT&T equipment, according to news reports.

One of just 10 NFL facilities without fan-facing Wi-Fi, Ralph Wilson Stadium clearly has less of a “no signal” problem now, if fans are finding ways to use nearly 400 GB of data per game. We’ll circle back with the Buffalo folks to see if there is any news on future Wi-Fi plans.