From overhead to under seat: A short history of the hows and whys of stadium Wi-Fi network design

Wi-Fi handrail enclosures at U.S. Bank Stadium, Minneapolis, Minn. Credit: Paul Kapustka, MSR

By Bill Anderson, AmpThink

The history of high density (HD) Wi-Fi deployments in stadiums and arenas is short, yet the amount of change that has occurred is significant, both in terms of how these networks are deployed and why.

Venue operators, manufacturers, and integrators are still grappling with the particulars of HD Wi-Fi in large open environments, even though there are a substantial number of high-quality deployed implementations. Below, I’ve shared our perspective on the evolution of HD Wi-Fi design in stadiums and arenas and put forth questions that venue operators should be asking to find a solution that fits their needs and their budget.

AmpThink’s background in this field

Over the past five years, our team has been involved in the deployment of more than 50 high-density Wi-Fi networks in stadiums throughout North America. In that same period, the best practices for stadium HD Wi-Fi design have changed several times, resulting in multiple deployment methodologies.

Each major shift in deployment strategy was intended to increase total system capacity [1]. The largest gains have come from better antenna technology or deployment techniques that better isolated access point output, resulting in gains in channel re-use.

What follows is a summary of what we’ve learned from the deployments we participated in and their significance for the future. Hopefully, this information will be useful to others as they embark on their journeys to purchase, deploy, or enhance their own HD Wi-Fi networks.

In the beginning: All about overhead

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.


Designers of the first generation of HD Wi-Fi networks were starting to develop the basic concepts that would come to define HD deployments in large, open environments. Their work was informed by prior deployments in auditoriums and convention centers and focused on using directional antennas. The stated goal of this approach was to reduce co-channel interference [2] by reducing the effective footprint of an individual access point’s [3] RF output.

However, the greatest gains came from improving the quality of the link between clients and the access point. Better antennas allowed client devices to communicate at faster speeds, which decreased the time required to complete their communication, making room for more clients on each channel before the channel became saturated or unstable.
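The airtime arithmetic behind this is worth making concrete. Here is a back-of-the-envelope sketch (all figures are hypothetical assumptions, not measurements from any deployment) of how a faster link rate frees channel time for more clients:

```python
# Hypothetical airtime sketch: faster links let each client finish its
# transfer sooner, so one channel can serve more clients per minute.
PAYLOAD_MB = 2.0         # data each client moves per minute (assumed)
CHANNEL_BUDGET_S = 60.0  # seconds of airtime available per minute

def clients_per_channel(phy_rate_mbps, efficiency=0.5):
    """Clients one channel can carry if each moves PAYLOAD_MB per minute.
    `efficiency` is a rough hedge for protocol overhead (ACKs, contention)."""
    goodput_mbps = phy_rate_mbps * efficiency         # usable throughput
    airtime_s = PAYLOAD_MB * 8 / goodput_mbps         # seconds per client
    return int(CHANNEL_BUDGET_S / airtime_s)

for rate in (6, 24, 54):  # a legacy rate vs. rates a better link enables
    print(f"{rate} Mbps PHY -> ~{clients_per_channel(rate)} clients/channel")
```

Under these assumed numbers, moving the average link from 6 Mbps to 54 Mbps carries roughly nine times as many clients on the same channel, which is the effect the early antenna work was chasing.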

Under seat Wi-Fi AP at Bank of America Stadium. Credit: Carolina Panthers

The concept was simple, but limited by the fact that there were few antennas available that could do the job effectively. Creative technicians created hybrid assemblies that combined multiple antennas into arrays that rotated polarization and tightened the antenna beam to paint the smallest usable coverage pattern possible. In time, this gap was addressed and today there are antennas specifically developed for use in overhead HD deployments – Stadium Antennas.

Typically, Stadium Antennas are installed in the ceilings above seating and/or on the walls behind seating because those locations are relatively easy to cable and minimize cost. We categorize these deployments as Overhead Deployments.

From overhead to ‘front and back’

First-generation overhead deployments generally suffered from a lack of overhead mounting locations, making it hard to produce sufficient coverage across the entire venue. In football stadiums, the front rows of the lower bowl are typically not covered by an overhang that can be used for antenna placement.

These rows are often more than 100 feet from the nearest overhead mounting location. The result is that pure overhead deployments leave some of the most expensive seats in the venue with little or no coverage. Further, due to the length of these sections, antennas at the back of the section potentially service thousands of client devices [4].

As fans joined these networks, deployments quickly became over-loaded and generated service complaints for venue owners. The solution was simple — add antennas at the front of long sections to reduce the total client load on the access points at the back. It was an effective band-aid that prioritized serving the venues’ most important and often most demanding guests.

This approach increased the complexity of installation as it was often difficult to cable access points located at the front of a section.

And for the first time, antennas were placed where they were subject to damage by fans, direct exposure to weather, and pressure washing [5]. With increased complexity came increased costs, as measured by the average cost per installed access point across a venue.

Because these systems feature antennas at the front and rear of each seating section, we refer to these deployments as ‘Front-to-Back Deployments.’ While this approach solves specific problems, it is not a complete solution in larger venues.

‘Filling In’ the gaps

Data collected from Front-to-Back Deployments proved to designers that moving the antennas closer to end users:
— covered areas that were previously uncovered;
— increased average data rates throughout the bowl;
— used the available spectrum more effectively; and
— increased total system capacity.

The logical conclusion was that additional antennas installed between the front and rear antennas would further increase system capacity. In long sections these additional antennas would also provide coverage to fans that were seated too far forward of antennas at the rear of the section and too far back from antennas at the front of the section. The result was uniform coverage throughout the venue.

In response, system designers experimented with hand rail mounted access points. Using directional antennas, coverage could be directed across a section and in opposition to the forward-facing antennas at the rear of the section and rear-facing antennas at the front of a section. These placements filled in the gaps in a Front-to-Back Deployment, hence the name ‘In-Fill Deployment.’

While these new In-Fill Deployments did their job, they added expense to what was already an expensive endeavor. Mounting access points on handrails required that a hole be drilled through the concrete at each access point location to cable the installed equipment. With the access point and antenna now firmly embedded in the seating, devices were also exposed to more traffic and abuse. Creative integrators came to the table with hardened systems to protect the equipment – handrail enclosures. New costs included: using ground-penetrating radar to prepare for coring; enclosure fabrication costs; and more complex conduit and pathway considerations. A typical handrail placement could cost four times as much as a typical overhead placement, and a designer might call for 2 or 3 handrail placements for every overhead placement.

Getting closer, better, faster: Proximate Networks

In-Fill strategies substantially solved the coverage problem in large venues. Using a combination of back-of-section, front-of-section, and handrail-mounted access points, wireless designers had a toolbox to deliver full coverage.

But with that success came a new problem. As fans discovered these high density networks and found new uses for them, demands on those networks grew rapidly, especially where teams or venue owners pushed mobile-device content strategies that added to the network load. In spite of well-placed access points, fan devices did not attach to the in-fill devices at the same rate that they attached to the overhead placements [6]. In-fill equipment remained lightly used while overhead placements absorbed hundreds of clients. Gains in system capacity stalled.

Close-up look at U.S. Bank Stadium railing enclosure during final construction phase, summer 2016. Credit: Paul Kapustka, MSR

To overcome uneven system loading, designers needed to create a more even distribution of RF energy within the deployment. That required a consistent approach to deployment, rather than a mix of deployment approaches. The result was the elimination of overhead antennas in favor of access points and antennas installed within the crowd, closest to the end user; hence the name ‘Proximate Networks.’

Proximate networks come in two variations: handrail-only and under-seat-only. In the handrail-only model, the designer eliminates overhead and front-of-section placements in favor of a dense deployment of handrail enclosures. In the under-seat model, the designer places the access point and antenna underneath the actual seating (but above the steel or concrete decking). In both models, the crowd becomes an important part of the design. The crowd attenuates the signal as it passes through their bodies, resulting in consistent signal degradation and even distribution of RF energy throughout the seating bowl. The result is even access point loading and increased system capacity.

An additional benefit of embedding the access points in the crowd is that the crowd effectively constrains the output of the access point, much as a wall constrains the output of an access point in a typical building. Each radio therefore hears fewer of its neighbors, allowing each channel to be re-used more effectively. And because the crowd provides an effective mechanism for controlling the spread of RF energy, the radios can be operated at higher power levels, which improves the link between the access point and the fan’s device. The result is more uniform system loading, higher average data rates, increased channel re-use, and increased total system capacity.

While Proximate Networks are still a relatively new concept, the early data (and a rapidly growing number of fast followers) confirms that if you want the densest possible network with the largest possible capacity, a Proximate Network is what you need.

The Financials: picking what’s right for you

From the foregoing essay, you might conclude that the author’s recommendation is to deploy a Proximate Network. However, that is not necessarily the case. If you want the densest possible network with the largest possible capacity, then a Proximate Network is a good choice. But there are merits to each approach described and a cost benefit analysis should be performed before a deployment approach is selected.

For many venues, Overhead Deployments remain the most cost-effective way to provide coverage. In smaller venues, and in venues where system utilization is expected to be low, an Overhead Deployment can be ideal.

Front-to-Back deployments work well in venues where system utilization is low and the available overhead mounting assets can’t cover all areas. The goal of these deployments is ensuring usable coverage, not maximizing total system capacity.

In-fill deployments are a good compromise between a coverage-centric high density approach and a capacity-centric approach. This approach is best suited to venues that need more total system capacity but have budget constraints that prevent selecting a Proximate approach.

Proximate deployments provide the maximum possible wireless density for venues where connectivity is considered to be a critical part of the venue experience.

Conclusion

If your venue is contemplating deploying a high density network, ask your integrator to walk you through the expected system demand, the calculation of system capacity for each approach, and finally the cost of each approach. Make sure you understand their assumptions. Then, select the deployment model that meets your business requirements — there is no “one size fits all” when it comes to stadium Wi-Fi.

Bill Anderson, AmpThink

Bill Anderson has been involved in the design and construction of wireless networks for over 20 years, pre-dating Wi-Fi. His first experience with wireless networking was as a software developer building software for mobile computers communicating over 400 MHz and 900 MHz base stations developed by Aironet (now part of Cisco Systems).

His work with mobile computing and wireless networks in distribution and manufacturing afforded him a front-row seat to the emergence of Wi-Fi and its transformation from a niche technology to a business-critical system. Since joining AmpThink in 2011, Bill has been actively involved in constructing some of the largest single-venue wireless networks in the world.

Footnotes

^ 1. A proxy for the calculation of overall system capacity is developed by multiplying the average speed of communication of all clients on a channel (avg data rate or speed) by the number of channels deployed in the system (available spectrum) by the number of times we can use each channel (channel re-use) or [speed x spectrum x re-use]. While there are many other parameters that come into play when designing a high density network (noise, attenuation, reflection, etc.), this simple equation helps us understand how we approach building networks that can support a large number of connected devices in an open environment, e.g. the bowl of a stadium or arena.
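As a sketch, this proxy can be written out directly. The figures below are illustrative assumptions, not measurements from any real venue:

```python
def capacity_proxy(avg_rate_mbps, channels, reuse):
    """Footnote 1's proxy, [speed x spectrum x re-use], as aggregate Mbit/s."""
    return avg_rate_mbps * channels * reuse

# Illustrative comparison: isolating APs better (e.g. within the crowd)
# can raise both the average data rate and the channel re-use factor.
overhead = capacity_proxy(avg_rate_mbps=20, channels=20, reuse=3)
proximate = capacity_proxy(avg_rate_mbps=40, channels=20, reuse=8)
print(overhead, proximate)
```

With these assumed inputs, the proxy suggests the denser design yields several times the aggregate capacity even though the available spectrum is unchanged, which is why the essay's deployment shifts focus on rate and re-use rather than on adding channels.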

^ 2. Co-channel interference refers to a scenario where multiple access points are attempting to communicate with client devices using the same channel. If a client or access point hears competing communication on the channel they are attempting to use, they must wait until that communication is complete before they can send their message.

^ 3. Access Point is the term used in the Wi-Fi industry to describe the network endpoint that client devices communicate with over the air. Other terms used include radio, AP, or WAP. In most cases, each access point is equipped with 2 or more physical radios that communicate on one of two bands – 2.4 GHz or 5 GHz. HD Wi-Fi deployments are composed of several hundred to over 1,000 access points connected to a robust wired network that funnels guest traffic to and from the internet.

^ 4. While there is no hard and fast rule, most industry experts agree that a single access point can service between 50 and 100 client devices.
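Applied as a sizing sketch, the rule of thumb brackets an AP count; the seat count and take-rate below are assumptions for illustration only:

```python
import math

def ap_range(expected_devices, per_ap=(50, 100)):
    """Bracket the AP count implied by the 50-100 clients-per-AP rule."""
    low, high = per_ap
    return math.ceil(expected_devices / high), math.ceil(expected_devices / low)

# e.g. an 18,000-seat arena where roughly 60% of fans connect at once:
print(ap_range(int(18000 * 0.6)))
```

For those assumed inputs the bracket is 108 to 216 APs for the bowl, which is broadly consistent with the 400-plus AP counts reported for full venues (bowl plus concourses and back of house) elsewhere in this report.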

^ 5. Venues often use pressure washers to clean a stadium after a big event.

^ 6. Unlike cellular systems which can dictate which mobile device attaches to each network node, at what speed, and when they can communicate, Wi-Fi relies on the mobile device to make the same decisions. When presented with a handrail access point and an overhead access point, mobile devices often hear the overhead placement better and therefore prefer the overhead placement. In In-Fill deployments, this often results in a disproportionate number of client devices selecting overhead placements. The problem can be managed by lowering the power level on the overhead access point at the expense of degrading the experience of the devices that the designer intended to attach to the overhead access point.

Indiana Pacers upgrade Wi-Fi at Bankers Life Fieldhouse

Bankers Life Fieldhouse, home of the Indiana Pacers. Credit all photos: Frank McGrath/Indiana Pacers

Whenever you undertake a Wi-Fi retrofit project, one thing is for certain: You can always expect surprises along the way.

For the Indiana Pacers, the biggest surprise in their recent renovation of the Wi-Fi network at Bankers Life Fieldhouse was finding out that their venue already had holes drilled in the concrete under the seats, greatly simplifying (and reducing the cost) of the mainly under-seat deployment that just went live in December.

The new 400-plus AP network, using gear from Ruckus, replaces one of the NBA’s first in-stadium Wi-Fi networks, one built and run by SignalShare using gear from Xirrus. With SignalShare now in bankruptcy and facing legal charges of fraudulent behavior, the Pacers went a different route for their new network, which is part of a plan to bring more digital-based fan services to the 17-year-old venue in downtown Indianapolis, which seats roughly 18,000 for basketball games.

According to Kevin Naylor, vice president of information technology, Pacers Sports and Entertainment, that plan got an unexpected (and welcome) boost when the Pacers’ IT team looked and found pre-drilled holes underneath many of the seats, covered up with temporary aluminum plates. With Ruckus able to use the pre-drilled holes for its under-seat Wi-Fi design, the Pacers were able to save “hundreds of thousands of dollars” in deployment costs, Naylor said.

A new digital plan for fans

Editor’s note: This profile is from our latest STADIUM TECH REPORT, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace. Read about the Sacramento Kings’ new Golden 1 Center and the new Wi-Fi network for the Super Bowl in our report, which is available now for FREE DOWNLOAD from our site!

Leading the venue’s new digital direction is Ed Frederici, chief technology officer, Pacers Sports and Entertainment, who joined the organization in the fall of 2015, after spending almost 6 years as the CTO of ExactTarget, a marketing automation provider that was acquired by Salesforce in 2013.

Though he came into the job “relatively ignorant of sports,” Frederici said he saw “a really interesting problem to solve” revolving around the ongoing evolution of the live-event fan, and who the new attendee was. With a plan to drive fan engagement through technology, Frederici, Naylor and the Pacers’ organization began a thorough assessment of Wi-Fi gear providers as part of their plan to bring a new network to Bankers Life Fieldhouse, replacing one that didn’t stand up to current use patterns.

“The old network tapped out when it got to about 3,000 [concurrent] users,” Frederici said.

Pacers director of IT Kevin Naylor shows off a new under-seat Wi-Fi AP

According to Frederici, the Pacers looked at “all the major providers” of Wi-Fi gear, testing implementations live by putting gear into mobile merchandise-selling stands in use on the stadium concourses. The final decision, Frederici said, came down to a battle between Ruckus and Xirrus, with Ruckus the final winner.

Under seat the best option

According to Bart Giordano, vice president for business development and strategic partnerships for Brocade’s Ruckus business unit, going under-seat with Wi-Fi seems to be the direction large public venues are all headed in.

“It [under seat deployment] is sort of standard now,” said Giordano. “You really need to have users close to the APs, and it’s hard to achieve that with overhead.”

With just over 430 APs in the new network, Frederici was worried about the drilling costs — until it turned out that most of the drilling had already been done, apparently as part of the arena’s original electrical configuration.

“Seventeen years ago, cables were much thinner, and it looks like [the holes] were cored for electrical,” Frederici said. “But it worked out fabulously.”

And like several other venues have done recently, the Pacers have decided to scrap support for fan-facing services on the 2.4 GHz spectrum, which makes administration of the fan Wi-Fi network easier and cheaper. The team will still keep some 2.4 GHz connections for back of house use.

With 2.4 GHz, Naylor said, “the noise level just got really bad in the lower bowl. It’s much easier to go to [only] 5 GHz. Every phone made now has 5 GHz.” For the older phones, Naylor said, the arena’s neutral-host DAS run by ExteNet Systems can provide connectivity, with AT&T and Verizon Wireless already on the system with plans to add more carriers in 2017.

While the Pacers currently have a basic YinzCam-based game-day app, Frederici is looking forward to more services in the future, including the possibility of having amenities like live parking and traffic information available via the app, as well as blue-dot wayfinding to the seat. For this year, the Pacers have already added concession and restroom wait time alerts to the app, the first step in a planned process of greater digital engagement.

“We want to own the experience from your driveway to the stadium, then back home,” Frederici said. Part of the new network deal includes analytics software services from Ruckus partner Purple, which helps teams mine data from fan interaction with the Wi-Fi network.

“We’re excited to see what kind of data we can pull from them [Purple],” Naylor said.

Optical fiber, under-seat Wi-Fi will power wireless connectivity at Atlanta’s Mercedes-Benz Stadium

Aerial photo of Mercedes-Benz Stadium under construction. Credit all photos and artist renderings: Mercedes-Benz Stadium (Click on any photo for a larger image)

Once just a series of drawings on a blueprint, Atlanta’s new Mercedes-Benz Stadium is getting more real by the day, with walls being added to steel beams, and wires for the internal networks being pulled into place.

Though the June 2017 opening day is still many months away, many elements of the stadium’s network have already been tested, thanks to a facility created by stadium network officials to test components under situations as close to “live” as they could possibly get. That lab environment helped the network team make its final decisions on vendors and deployment methods, like going under-seat for deployment of most of the 1,000 Wi-Fi APs that will be in the stadium’s bowl area, part of a planned total of 1,800 APs in the entire venue.

In a recent interview with Jared Miller, chief technology officer at AMB Sports and Entertainment (the entity named for Arthur Blank, the owner of the Atlanta Falcons), Mobile Sports Report got an exclusive update on the construction progress so far for the new $1.5 billion facility, along with new details about the internal network deployment, which will be using more optical fiber than any previous stadium network we know of.

Like the network built at Texas A&M’s Kyle Field, the network inside Mercedes-Benz Stadium will have a single optical core for Wi-Fi, cellular and video, using the Corning ONE platform and deployed by lead network integrator IBM along with Corning.

Wall panels being added to Mercedes-Benz Stadium in Atlanta

Miller also confirmed our earlier report that YinzCam software would be used to host the stadium’s IPTV deployment, but vendor choices for Wi-Fi gear and a stadium app have yet to be named.

As construction teams continue to hustle toward completion of the building, here are more details from our conversation with Miller about how the Falcons’ tech team went through the process of determining the products and methods that would allow them to construct a network able to “push the limits” on fan connectivity.

Under-seat for Wi-Fi, with handrail heat sinks

In our early August conversation with Miller, he was happy to report that the planned 4,000 miles of optical fiber were finally starting to be threaded into the new building. “We’re making great progress with a ton of yellow cable,” Miller said.

While the overall architecture at the network core in Mercedes-Benz Stadium will be similar to the one IBM and Corning deployed at Kyle Field, Miller said that in Atlanta his team is pushing fiber even farther to the edge, “with only the last couple feet at most being copper.”

Interior suite construction with fiber cable visible

Miller said optical fiber, which can carry more data traffic at faster speeds than copper cable, is a necessary infrastructure underpinning for facilities like Mercedes-Benz Stadium that expect to host the biggest events like the Super Bowl and college football championship games. Mercedes-Benz Stadium is already slated to host Super Bowl LIII, the 2018 College Football Playoff Championship, and the 2020 Final Four.

“I really believe [fiber] gives us the foundation to grow and react in the future, to handle technologies we don’t even know about yet,” Miller said.

On the Wi-Fi side of things, Miller said that Mercedes-Benz Stadium will also mimic Kyle Field’s extensive use of under-seat APs in the bowl seating areas. Miller said the stadium will have 1,000 APs serving the seating areas and another 800 for the rest of the venue, for a total Wi-Fi AP count of 1,800.

Since the Mercedes-Benz Stadium network will be using more optical equipment closer to the edge, Miller said that his team used 3D printing experiments to craft custom enclosures for the under-seat APs, both to ensure they didn’t act as debris “traps” and also to add elements like an internal heat sink to diffuse the warmth from the extra electrical components. The heat sink solution involved attaching the AP elements to metal chair railings to dissipate heat, Miller said.

Testing the network before the building is built

After announcing its partnership with IBM in early 2015 as lead technology integrator, the stadium network team spent 6 months reworking the network design, Miller said, a process that confirmed the choice of optical networking at the core. Then to help the network team select gear and components, the Mercedes-Benz Stadium organization built a “full-scale lab facility” that Miller said allowed his team to build multiple live networks to test gear for performance and interaction with other network elements.

Artist rendering of outside of building

“The lab enabled us to see firsthand how gear behaved, not just alone but together [with other products],” said Miller, who added that at one time the network team had three simultaneous running stadium networks inside the lab.

“We were able to bring in different endpoint devices, like POS systems, and know how it’s going to behave [in a network],” Miller said. Plus, the network gave eventual business users of the planned gear time to get hands-on experience and training well before the stadium opens its doors.

On the DAS side of the network buildout, Miller said the stadium has an on-site, raised-floor room for DAS gear with “ample room” for future growth.

“One of those things we learned was that DAS [needs] always double,” Miller said.

YinzCam software for IPTV

Though the stadium hasn’t yet announced a provider for a game-day stadium application, Miller did confirm that Mercedes-Benz Stadium will use YinzCam software to control its IPTV system, which will cover the 2,500 or so TV screens inside the building.

Artist rendering of Falcons game configuration with roof open and ‘halo’ video board visible

“YinzCam is just the most intuitive and capable content management system,” Miller said.

Video is going to be a big part of the stadium from all angles, beginning with the one-of-a-kind “halo board,” a circular screen that will sit inside the retractable roof lines. For standard TV placements, Miller said Mercedes-Benz Stadium will use mainly 50-inch screens and will work with YinzCam to ensure the screens can be seen.

In the stadium’s suites, TV screens will be controlled by a tablet application; Miller said that Mercedes-Benz Stadium is also “contemplating adding the ability to control TV screens with a mobile app,” like the system YinzCam deployed at Texas A&M.

Friendly food pricing and more to come

Though Miller’s concerns are mostly technological in nature, he said there are still a lot of improvements coming to the stadium “that are not always reliant on brute technology,” like the new lower-priced food menus the Falcons announced earlier this year, which hark back to another era with $2 Cokes and $2 hot dogs. Miller said the stadium team continues to get feedback from a fans’ council, which has tagged the arrival and departure experience as one of the main pain points that needs fixing.

Artist rendering of window wall with view to city

Mercedes-Benz Stadium will try to alleviate ingress and egress issues by doing things like creating “ticketed spaces” perhaps on the big outdoor plazas where many fans can congregate even before entering the stadium doors. By creating such spaces, Miller said fans might be able to enter the stadium more rapidly without the logjams that sometimes occur.

“We’re going to study arrival patterns and see what it looks like,” Miller said. “We have one more season to test those kind of things.”

Another amenity that may emerge is the use of wireless charging stations at a number of locations, to combat a scenario that Miller said often happens at marquee events: fans’ phones draining their batteries as they compete with other devices to connect to a wireless network.

“We are focusing on providing amazing connectivity and pushing the limits,” Miller said. “We are looking at all kinds of options to allow fans to stay connected and not be separated from their device.”

S.F. Giants add more Wi-Fi, ‘virtual reality experience’ to AT&T Park for 2016 season

The view from AT&T Park’s left field corner. All photos: Paul Kapustka, MSR (click on any photo for a larger image)

The first ballpark to bring Wi-Fi to its fans is still padding its networking lead, as AT&T Park will have 543 new or upgraded Wi-Fi access points for the 2016 season, according to the San Francisco Giants.

Most of the new APs are of the under-seat variety, completing the team’s three-year plan to put more APs under seats to increase network density and capacity. According to Bill Schlough, senior vice president and chief information officer for the Giants, the park now has a total of 1,628 Wi-Fi APs, the most of any MLB stadium and more than most big football stadiums as well. With 78.2 terabytes of data used during the baseball season and another 20+ TB used during other events, Schlough said AT&T Park’s Wi-Fi network carried more than 100 TB of data in calendar 2015.

Since it’s an even year, the Giants expect to win the World Series again, so the action on the field should be pretty good. If you want to leave reality, however, the Giants can accommodate you in that realm this season with the addition of a “virtual reality experience” at the team’s @Cafe social media spot, located on the concourse behind the left-field bleachers.

Since it’s an even year, does that mean another one of these is on order for the Giants?

According to the Giants, fans can be “transported” to Scottsdale Stadium to view practice from spring training, or they can see views from the AT&T Park field, the batting cages and “even Sergio Romo’s car” through a VR headset.

The Giants said fans will also notice an upgrade to the stadium’s LED ribbon boards, which circle the park on the facings of the upper decks. The new Mitsubishi screens, the Giants said, offer 150 percent more pixels than their predecessors, meaning that you might not need those reading glasses to get stat updates or read advertising messages.

On the DAS side of things, AT&T Park finally has all four major U.S. wireless carriers on its in-house cellular network, with the DAS and Wi-Fi serviced by 13 1-Gbps backbone pipes from AT&T.

New Report: Super Bowl 50’s super wireless, under-seat Wi-Fi feature and more!

The record-setting wireless network consumption at Super Bowl 50 is one of the lead topics in our latest STADIUM TECH REPORT, our long-form publication that takes an in-depth look at the most important news of the stadium technology world, alongside some great in-depth profiles of successful stadium technology deployments. Download your free copy today!

With fans consuming 26 terabytes of wireless data — 15.9 TB on the stadium’s distributed antenna system (DAS) and another 10.1 TB on the Wi-Fi network — the Super Bowl provided the ultimate test for the Levi’s Stadium wireless infrastructure, one that the venue passed with flying colors. One unique factor of the stadium’s wireless deployment, under-seat antennas for both the DAS and the Wi-Fi networks, is covered in-depth in our most recent issue, with a feature story about how under-seat deployments got started, and why they may become the default antenna placement for large public venues going forward.

Also in the issue: A profile of Wi-Fi and associated mobile device strategies at the University of Wisconsin, including geo-fencing for fan marketing at away games; a close-up look at the wireless infrastructure at the Denver Broncos’ Sports Authority Field at Mile High; a profile of the new Wi-Fi network at the Montreal Canadiens’ Bell Centre; and a look at some new social-media strategies deployed by the Miami Dolphins. All this information is available now for FREE DOWNLOAD so get your copy today!

We’d like to thank our Stadium Tech Report sponsors, who make this great content free for readers thanks to their support. For our Q1 issue our sponsors include Mobilitie, Crown Castle, CommScope, Samsung, Corning, JMA Wireless, Aruba, SOLiD, Xirrus and 5 Bars.

Stadium Tech Report: Upgrades keep San Francisco Giants and AT&T Park at front of stadium DAS and Wi-Fi league

Outside AT&T Park. All photos, Paul Kapustka, Mobile Sports Report. (Click on any photo for larger image)

What’s it like when the best-connected park in Major League Baseball loses its cellular mojo for a month? This winter the San Francisco Giants found out how fun it isn’t to revisit the days of “no signal,” when a DAS upgrade meant about 30 days of little to no connectivity inside AT&T Park.

“It was painful,” said Bill Schlough, senior vice president and chief information officer for the Giants, during a recent in-person interview at AT&T Park. Though no big sporting events took place during the February-to-March overhaul of the main AT&T distributed antenna system (DAS) head end, Schlough said during that time many of the roughly 200 to 300 employees who work at AT&T Park every day were forced to find daylight to make a call, just like the bad old days before DAS.

“We never really knew how much we rely on DAS [for internal operations], but having it down really drove it home,” said Schlough. The good news on the DAS front was that once the upgrade was complete, the Giants had a lot more space in their previously cramped head-end headquarters. According to Schlough, the new back-end equipment for AT&T’s DAS operations takes up less than 50 percent of the previous gear footprint, room that is likely to be filled with gear from yet another carrier slated to join the AT&T neutral-host DAS later this season.

Painful, but worth it.

Second major upgrade in 5 years of DAS

Giants CIO Bill Schlough (left) talks with workers in the park’s main DAS head end facility.

If you’re not familiar with a neutral-host DAS like the one at AT&T Park, it’s an implementation with one shared set of antennas and internal wiring, plus a “head end” where each carrier installs its cellular-specific networking gear, the equipment that identifies and authorizes callers and then connects those calls or messages to fiber links back out to the Internet and beyond. As the lead DAS provider and the namesake sponsor of the park, AT&T unsurprisingly has the biggest DAS requirement on site. Verizon, which has been on the AT&T Park DAS for two years now, actually houses most of its head-end gear in a separate facility nearby and links to the AT&T Park system via fiber.
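The shared-antenna, per-carrier-head-end arrangement described above can be sketched as a small data model. This is a minimal illustration only; the class names, antenna count and rack counts are all hypothetical, not figures from the Giants:

```python
from dataclasses import dataclass, field

@dataclass
class HeadEnd:
    """Carrier-specific gear: authenticates subscribers, hands traffic to backhaul fiber."""
    carrier: str
    rack_count: int

@dataclass
class NeutralHostDAS:
    """One shared antenna/cabling plant; each carrier plugs its own head end into it."""
    antennas: int
    head_ends: dict = field(default_factory=dict)

    def add_carrier(self, carrier: str, rack_count: int) -> None:
        # A new carrier reuses the existing antennas; only head-end space is added.
        self.head_ends[carrier] = HeadEnd(carrier, rack_count)

das = NeutralHostDAS(antennas=1000)  # hypothetical antenna count
for name in ("AT&T", "Verizon", "Sprint"):
    das.add_carrier(name, rack_count=4)  # hypothetical rack counts

# All three carriers share the same antenna plant but keep separate head ends.
print(sorted(das.head_ends))
```

The point the model captures is the economics Schlough describes: adding a carrier means finding head-end floor space, not re-wiring the bowl.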

Part of this year’s DAS renovation is a new room being built specifically for Sprint’s DAS equipment, a re-arrange-the-house construction project that saw the ballpark wall off half of its painting-services workshop to make space for Sprint’s gear. During our visit we saw workers putting up the racks that will hold the Sprint head end gear, as thick fiber cables snaked in the doorway.

Additional carrier(s) would likely be placed in the same room as AT&T and Verizon, on floor space that used to hold AT&T racks before those were un-drilled from the concrete floor and new racks were installed during the February-March overhaul. According to Schlough, the DAS upgrade (which required minimal tweaks to the previously installed DAS antennas) was the second major rip-and-replace action in the 5 years the DAS has been live at AT&T Park.

DAS performance improves over time; Wi-Fi is good too

White box at bottom is one of the under-the-seat Wi-Fi access points at AT&T Park.

Though Wi-Fi service in stadiums gets a lot of the technology headlines, in many big arenas the DAS is an equal workhorse, connecting people who either don’t know how to connect to Wi-Fi or prefer not to. Through the first 18 games of the 2014 season, Schlough said, AT&T Park was seeing average AT&T traffic loads on the DAS of 150 megabytes on the download side (fans requesting data) and 50 MB on the upload side (fans sending data). Figures for the Wi-Fi network (which is free to all customers) over the same span of games were an average of 400 MB download and 200 MB upload per game.

Schlough said performance stats for the AT&T portion of the DAS have improved vastly since the distributed antenna system was first put in.

“Just four or five years ago, 97 percent [connection rate] was actually relatively respectable,” Schlough said. Now, Schlough said network connect rates regularly hover in the “four nines” region, with a recent report showing a success rate of 99.9925 percent of all calls or texts going through.
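As a quick sanity check on that “four nines” claim, a 99.9925 percent success rate works out to roughly 7.5 failed attempts per 100,000 calls or texts:

```python
# Convert the reported success rate into failures per 100,000 attempts.
success_rate = 99.9925 / 100.0
failures_per_100k = (1 - success_rate) * 100_000
print(round(failures_per_100k, 1))  # -> 7.5
```

Compare that with the “relatively respectable” 97 percent of four or five years ago, which would have meant 3,000 failures per 100,000 attempts.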

The Wi-Fi network at AT&T Park, the first in any major sporting arena and still among the world’s most expansive, has more than 1,200 access points, many of which are now located beneath the seats. According to Schlough this coming offseason will likely represent the final phase of a stadium-wide deployment effort for the new, under-seat access points, which are installed symmetrically under the seats that are out in the open air.

Giants senior VP and CIO Bill Schlough, at the office

Since AT&T Park doesn’t have many railings alongside the seats “in the bowl” or those in the upper decks, the under-the-seat APs were the only choice to extend Wi-Fi connectivity, he said. Though the box-like antennas do take away some under-seat storage area from approximately every 40th seat, Schlough said there haven’t been many complaints from fans about the gear.
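One AP under roughly every 40th open-air seat adds up quickly across a full bowl. A back-of-envelope estimate, with the caveat that the seat count below is an assumption for illustration rather than a figure from the Giants:

```python
seats = 41_915        # assumed bowl capacity, for illustration only
seats_per_ap = 40     # article: an AP box under approximately every 40th seat

# Ceiling division: the last partial group of seats still needs an AP.
aps_needed = -(-seats // seats_per_ap)
print(aps_needed)  # -> 1048
```

That rough figure covers only the open-air seating; concourses, suites and covered areas account for the rest of the park’s 1,200-plus access points.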

What he has seen, however, are many compliments about the network services, especially from fellow professionals in the sports IT world.

“I get friends in the business who come here and send me texts with Speedtests attached, showing how great the Wi-Fi is,” said Schlough. My own ad hoc testing before our interview (albeit during non-game hours) showed speeds of greater than 40 Mbps for Wi-Fi just outside the park near McCovey Cove, and speeds of 25+ Mbps just outside the main gate. Schlough also showed us some of the new iBeacon antennas, which are being tested at MLB parks this summer to provide near-field communication marketing opportunities, like automatically checking fans in to the official At Bat app when they pass by a beacon. It’s just another way the best-connected park in baseball seeks to continue to improve the fan experience.

According to Schlough, the connectivity at AT&T Park doesn’t hurt when it comes to ticket sales.

“People do come here more frequently, I think, because they know there will be good connectivity,” said Schlough. “There’s no compromise. I do think that’s part of why we’re currently riding the third longest sellout streak in MLB history.”

MORE PHOTOS BELOW — CLICK ON IMAGES TO SEE LARGER VERSION

Can you find the iBeacon in the bowels of AT&T Park? It’s the small grey box to the left of the other antenna.

Sprint’s new DAS room at AT&T Park.

A close-up of the under-seat AP. Each AP requires holes drilled through concrete to provide wiring access. APs are weather-sealed, according to the Giants.

Bill Schlough’s “old phones” collection. How many of these can you identify?