New Report: New Wi-Fi, app and digital displays for San Jose Sharks’ SAP Center

MOBILE SPORTS REPORT is pleased to announce the Spring 2017 issue of our STADIUM TECH REPORT series, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace.

Our profiles for this issue include a first-look visit to the San Jose Sharks’ newly wired SAP Center, where a Cisco Wi-Fi and StadiumVision network (deployed by AmpThink) has brought high-definition connectivity to the old familiar “Shark Tank.” We also have a profile of new DAS and Wi-Fi deployments at the Utah Jazz’s Vivint Smart Home Arena, as well as a recap of the wireless record-setting day at Super Bowl LI at Houston’s NRG Stadium. Plus, our first “Industry Voices” contribution, a great look at the history and progression of Wi-Fi stadium networks from AmpThink’s Bill Anderson. DOWNLOAD YOUR COPY today!

We’d also like to invite you to join our first-ever “live interview” webinar, which will take place next Tuesday at 11 a.m. Pacific Time, 2 p.m. Eastern Time. All the details are here, so register now and listen in next week for more in-depth views from Vivint Smart Home Arena and its technology partners, Boingo and SOLiD.

We’d like to take a quick moment to thank our sponsors, which for this Stadium Tech Report issue include Mobilitie, Crown Castle, SOLiD, CommScope, Corning, Huber+Suhner, American Tower, and Aruba, a Hewlett Packard Enterprise company. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to welcome new readers from the Inside Towers community, who may have found their way here via our new partnership with that excellent publication. Finally, thanks to our growing list of repeat readers for your continued interest and support.

College Football Playoff championship sees 2.4 TB of Wi-Fi — big decline from 2016

We finally have numbers for the Wi-Fi usage at the most recent College Football Playoff championship game, and in a first for the event, the total data used was much lower than the previous year’s: just 2.4 terabytes of data used on Jan. 9 at Raymond James Stadium in Tampa, Fla., compared to 4.9 TB of Wi-Fi used at the 2016 championship game, held at the University of Phoenix Stadium in Glendale, Ariz.

The reason for the decline is probably not due to any sudden dropoff in user demand, since usage of in-stadium cellular or DAS networks increased from 2016 to 2017, with AT&T’s observed network usage doubling from 1.9 TB to 3.8 TB in Tampa. More likely the dropoff is due to the fact that the Wi-Fi network at the University of Phoenix Stadium had been through recent upgrades to prepare for both the college championship game and Super Bowl XLIX, while the network in Raymond James Stadium hasn’t seen a significant upgrade since 2010, according to stadium officials. At last check, the Wi-Fi network at University of Phoenix Stadium had more than 750 APs installed.

Joey Jones, network engineer/information security for the Tampa Bay Buccaneers, said the Wi-Fi network currently in use at Raymond James Stadium has a total of 325 Cisco Wi-Fi APs, with 130 of those in the bowl seating areas. The design is all overhead placements, Jones said in an email discussion, with no under-seat or handrail enclosure placements. The total unique number of Wi-Fi users for the college playoff game was 11,671, with a peak concurrent connection of 7,353 users, Jones said.

Still tops among college playoff championship games in Wi-Fi is the first one held at AT&T Stadium in 2015, where 4.93 TB of Wi-Fi was used. Next year’s championship game is scheduled to be held at the new Mercedes-Benz Stadium in Atlanta, where one of the latest Wi-Fi networks should be in place and operational.

Seahawks see big jump in Wi-Fi usage at CenturyLink Field for 2016-17 season

The Seattle Seahawks saw almost every metric associated with the Wi-Fi network at CenturyLink Field just about double from the 2015-16 to the 2016-17 NFL regular season, according to statistics provided by the team.

Chip Suttles, vice president of technology for the Seahawks, sent us over some excellent season-long charts of Wi-Fi activity, including unique and concurrent-user peaks, top throughput, and a couple of comparison charts mapping this most recent season’s activity compared to that a year before.

With a capacity crowd attendance total of 69,000, the always sold-out CenturyLink saw a take rate nearing 50 percent for most of the season, with a top unique-user number of 35,808 for the Nov. 7, 31-25 win over the Buffalo Bills. Interestingly, the biggest day for total data usage wasn’t the Bills game (3.259 terabytes) but the 26-15 win over the Philadelphia Eagles on Nov. 20, when the Wi-Fi network saw 3.526 TB of data used. If you look at the comparative graphs, both peak usage and total usage numbers pretty much doubled from what was seen the year before.

According to Suttles, there wasn’t much in the way of upgrades to the Extreme Networks-based network before this past season — just some firmware and software updates, and “about a half-dozen” new APs to support additional seating added in the south end zone area. “Overall, I think it [the data increases] is more to do with familiarity,” Suttles said. Thanks to Chip and the Seahawks for sharing the numbers!

From overhead to under seat: A short history of the hows and whys of stadium Wi-Fi network design

Wi-Fi handrail enclosures at U.S. Bank Stadium, Minneapolis, Minn. Credit: Paul Kapustka, MSR

By Bill Anderson, AmpThink

The history of high-density (HD) Wi-Fi deployments in stadiums and arenas is short, yet the amount of change that has occurred is significant, both in terms of how these networks are deployed and why.

Venue operators, manufacturers, and integrators are still grappling with the particulars of HD Wi-Fi in large open environments, even though there are a substantial number of deployed high quality implementations. Below, I’ve shared our perspective on the evolution of HD Wi-Fi design in stadiums and arenas and put forth questions that venue operators should be asking to find a solution that fits their needs and their budget.

AmpThink’s background in this field

Over the past five years, our team has been involved in the deployment of more than 50 high-density Wi-Fi networks in stadiums throughout North America. In that same period, best practices for stadium HD Wi-Fi design have changed several times, resulting in multiple deployment methodologies.

Each major shift in deployment strategy was intended to increase total system capacity [1]. The largest gains have come from better antenna technology and from deployment techniques that better isolate access point output, yielding gains in channel re-use.

What follows is a summary of what we’ve learned from the deployments we participated in and their significance for the future. Hopefully, this information will be useful to others as they embark on their journeys to purchase, deploy, or enhance their own HD Wi-Fi networks.

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.

In the beginning: All about overhead

Designers of the first generation of HD Wi-Fi networks were starting to develop the basic concepts that would come to define HD deployments in large, open environments. Their work was informed by prior deployments in auditoriums and convention centers and focused on using directional antennas. The stated goal of this approach was to reduce co-channel interference [2] by reducing the effective footprint of an individual access point’s [3] RF output.

However, the greatest gains came from improving the quality of the link between clients and the access point. Better antennas allowed client devices to communicate at faster speeds, which decreased the amount of time required to complete their communication, making room for more clients on each channel before a given channel became saturated or unstable.

Under seat Wi-Fi AP at Bank of America Stadium. Credit: Carolina Panthers

The concept was simple, but limited by the fact that there were few antennas available that could do the job effectively. Creative technicians created hybrid assemblies that combined multiple antennas into arrays that rotated polarization and tightened the antenna beam to paint the smallest usable coverage pattern possible. In time, this gap was addressed and today there are antennas specifically developed for use in overhead HD deployments – Stadium Antennas.

Typically, Stadium Antennas are installed in the ceilings above seating and/or on the walls behind seating because those locations are relatively easy to cable and minimize cost. We categorize these deployments as Overhead Deployments.

From overhead to ‘front and back’

First generation overhead deployments generally suffer from a lack of overhead mounting locations to produce sufficient coverage across the entire venue. In football stadiums, the front rows of the lower bowl are typically not covered by an overhang that can be used for antenna placement.

These rows are often more than 100 feet from the nearest overhead mounting location. The result is that pure overhead deployments leave some of the most expensive seats in the venue with little or no coverage. Further, due to the length of these sections, antennas at the back of the section potentially service thousands of client devices [4].

As fans joined these networks, deployments quickly became over-loaded and generated service complaints for venue owners. The solution was simple — add antennas at the front of long sections to reduce the total client load on the access points at the back. It was an effective band-aid that prioritized serving the venues’ most important and often most demanding guests.

This approach increased the complexity of installation as it was often difficult to cable access points located at the front of a section.

And for the first time, antennas were placed where they were subject to damage by fans, direct exposure to weather, and pressure washing [5]. With increased complexity, came increased costs as measured by the average cost per installed access point across a venue.

Because these systems feature antennas at the front and rear of each seating section, we refer to these deployments as ‘Front-to-Back Deployments.’ While this approach solves specific problems, it is not a complete solution in larger venues.

‘Filling In’ the gaps

Data collected from Front-to-Back Deployments proved to designers that moving the antennas closer to end users:
— covered areas that were previously uncovered;
— increased average data rates throughout the bowl;
— used the available spectrum more effectively; and
— increased total system capacity.

The logical conclusion was that additional antennas installed between the front and rear antennas would further increase system capacity. In long sections these additional antennas would also provide coverage to fans that were seated too far forward of antennas at the rear of the section and too far back from antennas at the front of the section. The result was uniform coverage throughout the venue.

In response, system designers experimented with hand rail mounted access points. Using directional antennas, coverage could be directed across a section and in opposition to the forward-facing antennas at the rear of the section and rear-facing antennas at the front of a section. These placements filled in the gaps in a Front-to-Back Deployment, hence the name ‘In-Fill Deployment.’

While these new In-Fill Deployments did their job, they added expense to what was already an expensive endeavor. Mounting access points on handrails required that a hole be drilled in the stadia at each access point location to cable the installed equipment. With the access point and antenna now firmly embedded in the seating, devices were also exposed to more traffic and abuse. Creative integrators came to the table with hardened systems to protect the equipment – handrail enclosures. New costs included: using ground-penetrating radar to prepare for coring; enclosure fabrication costs; and more complex conduit and pathway considerations. A typical handrail placement could cost four times the cost of a typical overhead placement and a designer might call for 2 or 3 handrail placements for every overhead placement.

Getting closer, better, faster: Proximate Networks

In-Fill strategies substantially solved the coverage problem in large venues. Using a combination of back of section, front of section, and hand-rail mounted access points, wireless designers had a tool box to deliver full coverage.

But with that success came a new problem. As fans discovered these high-density networks and found new uses for them, demands on those networks grew rapidly, especially where teams or venue owners pushed mobile-device content strategies that added to the network load. In spite of well-placed access points, fan devices did not attach to the in-fill devices at the same rate that they attached to the overhead placements [6]. In-fill equipment remained lightly used while overhead placements absorbed hundreds of clients. Gains in system capacity stalled.

Close-up look at U.S. Bank Stadium railing enclosure during final construction phase, summer 2016. Credit: Paul Kapustka, MSR

To overcome uneven system loading, designers needed to create a more even distribution of RF energy within the deployment. That required a consistent approach to deployment, rather than a mix of deployment approaches. The result was the elimination of overhead antennas in favor of access points and antennas installed within the crowd, closest to the end user; hence the name ‘Proximate Networks.’

Proximate networks come in two variations: handrail only and under seat only. In the hand rail only model, the designer eliminates overhead and front of section placements in favor of a dense deployment of hand rail enclosures. In the under seat model, the designer places the access point and antenna underneath the actual seating (but above the steel or concrete decking). In both models, the crowd becomes an important part of the design. The crowd attenuates the signal as it passes through their bodies resulting in consistent signal degradation and even distribution of RF energy throughout the seating bowl. The result is even access point loading and increased system capacity.

An additional benefit of embedding the access points in the crowd is that the crowd effectively constrains the output of each access point, much as a wall constrains the output of an access point in a typical building. Each radio therefore hears fewer of its neighbors, allowing each channel to be re-used more effectively. And because the crowd provides an effective mechanism for controlling the spread of RF energy, the radios can be operated at higher power levels, which improves the link between the access point and the fan’s device. The result is more uniform system loading, higher average data rates, increased channel re-use, and increases in total system capacity.
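The crowd-as-attenuator effect can be sketched with a quick link-budget calculation. The free-space path loss formula is standard; the 3 dB-per-body attenuation figure, the distances, and the body count below are illustrative assumptions, not measured values from any deployment.

```python
import math

def fspl_db(distance_m, freq_mhz):
    """Free-space path loss in dB, for distance in meters and frequency in MHz."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

# Illustrative scenario: two under-seat APs 20 m apart on the same 5 GHz
# channel. With an empty bowl, the only loss between them is free space;
# once fans arrive, the signal also passes through bodies (assumed here
# at 3 dB of attenuation each), so each radio hears its neighbor far more
# faintly and the channel can be re-used more aggressively.
empty_bowl_loss = fspl_db(20, 5000)        # roughly 72 dB with a clear path
crowd_loss = empty_bowl_loss + 6 * 3       # roughly 90 dB through ~6 bodies
print(round(empty_bowl_loss), round(crowd_loss))
```

The extra ~18 dB of assumed crowd loss is the designer's friend here: it shrinks each AP's interference footprint without shrinking its coverage of the fans sitting directly around it.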

While Proximate Networks are still a relatively new concept, the early data (and a growing number of fast followers) confirms that if you want the densest possible network with the largest possible capacity, then a Proximate Network is what you need.

The Financials: picking what’s right for you

From the foregoing essay, you might conclude that the author’s recommendation is to deploy a Proximate Network. However, that is not necessarily the case. If you want the densest possible network with the largest possible capacity, then a Proximate Network is a good choice. But there are merits to each approach described and a cost benefit analysis should be performed before a deployment approach is selected.

For many venues, Overhead Deployments remain the most cost-effective way to provide coverage. For many smaller venues, and in venues where system utilization is expected to be low, an Overhead Deployment can be ideal.

Front-to-Back deployments work well in venues where system utilization is low and the available overhead mounting assets can’t cover all areas. The goal of these deployments is ensuring usable coverage, not maximizing total system capacity.

In-Fill Deployments are a good compromise between a coverage-centric high-density approach and a capacity-centric approach. This approach is best suited to venues that need more total system capacity but have budget constraints that prevent selecting a Proximate approach.

Proximate deployments provide the maximum possible wireless density for venues where connectivity is considered to be a critical part of the venue experience.

Conclusion

If your venue is contemplating deploying a high density network, ask your integrator to walk you through the expected system demand, the calculation of system capacity for each approach, and finally the cost of each approach. Make sure you understand their assumptions. Then, select the deployment model that meets your business requirements — there is no “one size fits all” when it comes to stadium Wi-Fi.

Bill Anderson, AmpThink

Bill Anderson has been involved in the design and construction of wireless networks for over 20 years, pre-dating Wi-Fi. His first experience with wireless networking was as a software developer building software for mobile computers communicating over 400 MHz and 900 MHz base stations developed by Aironet (now part of Cisco Systems).

His work with mobile computing and wireless networks in distribution and manufacturing afforded him a front row seat to the emergence of Wi-Fi and the transformation of Wi-Fi from a niche technology to a business critical system. Since 2011 at AmpThink Bill has been actively involved in constructing some of the largest single venue wireless networks in the world.

Footnotes

^ 1. A proxy for the calculation of overall system capacity is developed by multiplying the average speed of communication of all clients on a channel (avg data rate or speed) by the number of channels deployed in the system (available spectrum) by the number of times we can use each channel (channel re-use) or [speed x spectrum x re-use]. While there are many other parameters that come into play when designing a high density network (noise, attenuation, reflection, etc.), this simple equation helps us understand how we approach building networks that can support a large number of connected devices in an open environment, e.g. the bowl of a stadium or arena.
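The proxy can be written as a one-line calculation. The inputs below are purely illustrative assumptions, not figures from any particular venue.

```python
def system_capacity_mbps(avg_client_rate_mbps, channels, reuse_factor):
    """Capacity proxy from footnote 1: speed x spectrum x re-use."""
    return avg_client_rate_mbps * channels * reuse_factor

# Hypothetical inputs: clients averaging 20 Mbps, 25 usable 5 GHz channels,
# and each channel re-used 4 times across the bowl.
print(system_capacity_mbps(20, 25, 4))  # 2000 Mbps of aggregate capacity
```

The equation also shows why the deployment strategies above chased channel re-use: doubling the re-use factor doubles the proxy capacity without requiring any new spectrum or faster clients.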

^ 2. Co-channel interference refers to a scenario where multiple access points are attempting to communicate with client devices using the same channel. If a client or access point hears competing communication on the channel it is attempting to use, it must wait until that communication is complete before it can send its message.

^ 3. Access Point is the term used in the Wi-Fi industry to describe the network endpoint that client devices communicate with over the air. Other terms used include radio, AP, or WAP. In most cases, each access point is equipped with 2 or more physical radios that communicate on one of two bands – 2.4 GHz or 5 GHz. HD Wi-Fi deployments are composed of several hundred to over 1,000 access points connected to a robust wired network that funnels guest traffic to and from the internet.

^ 4. While there is no hard and fast rule, most industry experts agree that a single access point can service between 50 and 100 client devices.
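That rule of thumb translates directly into a first-pass AP count for a venue. The concurrent-client load below is a hypothetical example, not a recommendation.

```python
import math

def estimate_ap_count(concurrent_clients, clients_per_ap):
    """First-pass AP count from the 50-100 clients-per-AP rule of thumb."""
    return math.ceil(concurrent_clients / clients_per_ap)

# Hypothetical venue expecting 30,000 concurrently connected devices:
print(estimate_ap_count(30_000, 100))  # 300 APs at the optimistic end
print(estimate_ap_count(30_000, 50))   # 600 APs at the conservative end
```

The two-to-one spread between those answers is one reason the deployment approaches described above differ so much in cost.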

^ 5. Venues often use pressure washers to clean a stadium after a big event.

^ 6. Unlike cellular systems which can dictate which mobile device attaches to each network node, at what speed, and when they can communicate, Wi-Fi relies on the mobile device to make the same decisions. When presented with a handrail access point and an overhead access point, mobile devices often hear the overhead placement better and therefore prefer the overhead placement. In In-Fill deployments, this often results in a disproportionate number of client devices selecting overhead placements. The problem can be managed by lowering the power level on the overhead access point at the expense of degrading the experience of the devices that the designer intended to attach to the overhead access point.

AT&T Stadium sees 7.25 TB of Wi-Fi for Packers vs. Cowboys playoff game

The Dallas Cowboys before taking the field against the Green Bay Packers in a Jan. 15 playoff game. Credit: James D. Smith/Dallas Cowboys

Pro football’s biggest stadium had the biggest non-Super Bowl Wi-Fi traffic day we’ve heard of this season, as the Dallas Cowboys reported seeing 7.25 terabytes of Wi-Fi data on the AT&T Stadium network during the Packers’ thrilling 34-31 victory on Jan. 15.

John Winborn, chief information officer for the Dallas Cowboys, sent us the info on the stadium’s biggest Wi-Fi day ever, surpassing the previous record of 6.77 TB seen on the AT&T Stadium Wi-Fi network for WrestleMania 32 back on April 3, 2016. The new Wi-Fi record was set by fewer fans, too, with attendance for the Jan. 15 playoff game at 93,396, compared to the 101,763 at WrestleMania.

Though he didn’t provide an exact number, Winborn also said that the take rate of unique clients on the Wi-Fi network for the Packers game was 50 percent of attendees, roughly 46,700, easily one of the biggest numbers we’ve seen anywhere. During the Cowboys’ excellent regular season, Winborn said the average of Wi-Fi data used per game was 5.28 TB, an increase of 33 percent over the 2015 season.
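For readers who want to check the arithmetic, take rate is simply unique Wi-Fi users divided by attendance; the sketch below reproduces the Cowboys' roughly 50 percent figure.

```python
def take_rate(unique_users, attendance):
    """Share of attendees who connected to the Wi-Fi network at least once."""
    return unique_users / attendance

# Packers-Cowboys playoff game: ~46,700 unique users out of 93,396 fans.
print(round(take_rate(46_700, 93_396) * 100, 1))  # ~50.0 percent
```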

UPDATE: The AT&T folks have provided the DAS stats for the same game, with an additional 3 TB of data used on the AT&T cellular networks inside the stadium. So we’re up to 10.25 TB for a non-Super Bowl game… we doubt we will get any other carriers to add their totals, but this sounds like the biggest non-Super Bowl event out there in terms of total data.

Any other NFL teams (or college teams) out there with peak games and/or season averages, send them in! Let’s keep updating this list!

THE NEW TOP 7 FOR WI-FI

1. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
2. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
3. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
4. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
5. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB
6. Alabama vs. Texas A&M, Kyle Field, College Station, Texas, Oct. 17, 2015: Wi-Fi: 5.7 TB
7. Pittsburgh Steelers vs. New England Patriots, AFC Championship Game, Gillette Stadium, Foxborough, Mass., Jan. 22, 2017: Wi-Fi: 5.11 TB

Super Bowl LI Wi-Fi sees drop in average per-fan use total

Under seat Wi-Fi APs visible down seating row at NRG Stadium. Credit: 5 Bars

While Super Bowl LI in Houston set records for most total Wi-Fi used in a single day event, the actual amount of average Wi-Fi data used per connected fan actually dropped from the previous year’s game, from about 370 megabytes per user at Super Bowl 50 to about 333 MB per user for Super Bowl 51.

Using official totals provided by the NFL’s official analytics provider, Extreme Networks, there was a total of 11.8 TB of data used on the Wi-Fi network at NRG Stadium in Houston during Super Bowl 51, compared to 10.1 TB used during Super Bowl 50 at Levi’s Stadium in Santa Clara, Calif.

While the total Wi-Fi data number represents approximately a 17 percent increase from Super Bowl 50 to Super Bowl 51, the most recent game had 35,430 users who connected at least once to the network, an almost 30 percent leap from Super Bowl 50’s 27,316 unique users. So while Super Bowl 51 had more unique users (and more peak concurrent users as well) and a higher data total, the average amount of data used per connected fan decreased, from about 370 MB per user to about 333 MB per user.
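The per-fan averages quoted above follow directly from the reported totals (using 1 TB = 1,000,000 MB):

```python
def avg_mb_per_user(total_tb, unique_users):
    """Average Wi-Fi data per connected fan, in MB (1 TB = 1,000,000 MB)."""
    return total_tb * 1_000_000 / unique_users

print(round(avg_mb_per_user(11.8, 35_430)))  # Super Bowl 51: ~333 MB
print(round(avg_mb_per_user(10.1, 27_316)))  # Super Bowl 50: ~370 MB
```

In other words, unique users grew faster (about 30 percent) than total data (about 17 percent), which is exactly what pulls the per-user average down.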

Data for Super Bowls in years past is thin (mainly because stadium Wi-Fi didn’t really exist), but it’s certainly the first time in very recent history that the per-user average has dropped from one Super Bowl to the next. Super Bowl 49, held at the University of Phoenix Stadium in Glendale, Ariz., saw a total of 6.23 TB of Wi-Fi used by 25,936 unique users, for a per-user average of 240 MB. We don’t have any stats for unique users at Super Bowl XLVIII at MetLife Stadium, but with the total Wi-Fi used there at just 3.2 TB, the per-user average was presumably much lower as well, unless there were 50 percent fewer connected users.

Did autoconnect drop the average?

Wi-Fi gear visible above concourse kiosk at NRG Stadium. Credit: 5 Bars

The drop in per-user average data for Wi-Fi is curious when compared to the huge leap in overall DAS stats for the last two Super Bowls, with Super Bowl 51 checking in at 25.8 TB of data, a figure that does not include statistics from T-Mobile, which is declining to report its data total from the game. At Super Bowl 50, all four top wireless carriers combined saw 15.9 TB, so the total for Super Bowl 51 is about 62 percent higher — and if you add in the estimated 3-4 TB that was likely recorded by T-Mobile, that leap is even bigger.

Unfortunately cellular carriers do not provide the exact number of connected users, so there is no per-user average data total available. It would be interesting to know if the expanded DAS preparations made at Super Bowl 50 and at Super Bowl 51 actually connected more total users, or allowed users to use more data per user. We have a request with Verizon for more stats, but it may be a long wait.

One theory we have here at MSR is that it’s possible that a large number of autoconnected devices may have increased the unique-user total while not necessarily adding to the overall Wi-Fi data-used total. In our reporting about the NRG Stadium network we noted that Verizon, which helped pay for the Wi-Fi deployment, had reserved 40 percent of the Wi-Fi capacity for its customers, many of whom could have been autoconnected to the network even without them knowing. We have asked both Extreme and Verizon for a breakdown on Verizon users vs. other wireless customer users on the Wi-Fi network, but have not yet received a response.