From overhead to under seat: A short history of the hows and whys of stadium Wi-Fi network design

Wi-Fi handrail enclosures at U.S. Bank Stadium, Minneapolis, Minn. Credit: Paul Kapustka, MSR


By Bill Anderson, AmpThink

The history of high density (HD) Wi-Fi deployments in stadiums and arenas is short. Yet the amount of change that has occurred is significant, both in terms of how these networks are deployed and why.

Venue operators, manufacturers, and integrators are still grappling with the particulars of HD Wi-Fi in large open environments, even though there are a substantial number of deployed high quality implementations. Below, I’ve shared our perspective on the evolution of HD Wi-Fi design in stadiums and arenas and put forth questions that venue operators should be asking to find a solution that fits their needs and their budget.

AmpThink’s background in this field

Over the past 5 years, our team has been involved in the deployment of more than 50 high-density Wi-Fi networks in stadiums throughout North America. In that same period, the best practices for stadium HD Wi-Fi design have changed several times, resulting in multiple deployment methodologies.

Each major shift in deployment strategy was intended to increase total system capacity [1]. The largest gains have come from better antenna technology and deployment techniques that better isolate access point output, resulting in gains in channel re-use.

What follows is a summary of what we’ve learned from the deployments we participated in and their significance for the future. Hopefully, this information will be useful to others as they embark on their journeys to purchase, deploy, or enhance their own HD Wi-Fi networks.

In the beginning: All about overhead

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.


Designers of the first generation of HD Wi-Fi networks were developing the basic concepts that would come to define HD deployments in large, open environments. Their work was informed by prior deployments in auditoriums and convention centers and focused on using directional antennas. The stated goal of this approach was to reduce co-channel interference [2] by shrinking the effective footprint of an individual access point’s [3] RF output.

However, the greatest gains came from improving the quality of the link between clients and the access point. Better antennas allowed client devices to communicate at faster speeds, which decreased the amount of time required to complete their communication, making room for more clients on each channel before a given channel became saturated or unstable.
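To see why faster links matter, consider a toy airtime calculation. All figures below are illustrative assumptions, not field measurements, and protocol overhead is ignored:

```python
# Faster links free up channel airtime, so more clients fit per channel.

def clients_per_channel(payload_mbits, link_rate_mbps, channel_budget_s=1.0):
    """How many clients can each send `payload_mbits` within one second
    of shared airtime on a single channel (overhead ignored)."""
    airtime_per_client = payload_mbits / link_rate_mbps  # seconds on air
    return round(channel_budget_s / airtime_per_client)

# The same 2 Mbit transfer occupies ~167 ms of airtime at 12 Mbps,
# but only ~37 ms at 54 Mbps.
print(clients_per_channel(2, 12))  # 6
print(clients_per_channel(2, 54))  # 27
```

Roughly halving airtime per client doubles the clients a channel can carry before saturation, which is the kind of gain the better antennas delivered.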

Under seat Wi-Fi AP at Bank of America Stadium. Credit: Carolina Panthers


The concept was simple, but limited by the fact that there were few antennas available that could do the job effectively. Creative technicians created hybrid assemblies that combined multiple antennas into arrays that rotated polarization and tightened the antenna beam to paint the smallest usable coverage pattern possible. In time, this gap was addressed and today there are antennas specifically developed for use in overhead HD deployments – Stadium Antennas.

Typically, Stadium Antennas are installed in the ceilings above seating and/or on the walls behind seating because those locations are relatively easy to cable and minimize cost. We categorize these deployments as Overhead Deployments.

From overhead to ‘front and back’

First generation overhead deployments generally suffer from a lack of overhead mounting locations to produce sufficient coverage across the entire venue. In football stadiums, the front rows of the lower bowl are typically not covered by an overhang that can be used for antenna placement.

These rows are often more than 100 feet from the nearest overhead mounting location. The result is that pure overhead deployments leave some of the most expensive seats in the venue with little or no coverage. Further, due to the length of these sections, antennas at the back of the section potentially service thousands of client devices [4].

As fans joined these networks, deployments quickly became over-loaded and generated service complaints for venue owners. The solution was simple — add antennas at the front of long sections to reduce the total client load on the access points at the back. It was an effective band-aid that prioritized serving the venues’ most important and often most demanding guests.

This approach increased the complexity of installation as it was often difficult to cable access points located at the front of a section.

And for the first time, antennas were placed where they were subject to damage by fans, direct exposure to weather, and pressure washing [5]. With increased complexity, came increased costs as measured by the average cost per installed access point across a venue.

Because these systems feature antennas at the front and rear of each seating section, we refer to these deployments as ‘Front-to-Back Deployments.’ While this approach solves specific problems, it is not a complete solution in larger venues.

‘Filling In’ the gaps

Data collected from Front-to-Back Deployments proved to designers that moving the antennas closer to end users:
— covered areas that were previously uncovered;
— increased average data rates throughout the bowl;
— used the available spectrum more effectively; and
— increased total system capacity.

The logical conclusion was that additional antennas installed between the front and rear antennas would further increase system capacity. In long sections these additional antennas would also provide coverage to fans that were seated too far forward of antennas at the rear of the section and too far back from antennas at the front of the section. The result was uniform coverage throughout the venue.

In response, system designers experimented with hand rail mounted access points. Using directional antennas, coverage could be directed across a section and in opposition to the forward-facing antennas at the rear of the section and rear-facing antennas at the front of a section. These placements filled in the gaps in a Front-to-Back Deployment, hence the name ‘In-Fill Deployment.’

While these new In-Fill Deployments did their job, they added expense to what was already an expensive endeavor. Mounting access points on handrails required that a hole be drilled through the concrete seating deck at each access point location to cable the installed equipment. With the access point and antenna now firmly embedded in the seating, devices were also exposed to more traffic and abuse. Creative integrators came to the table with hardened systems to protect the equipment – handrail enclosures. New costs included: using ground-penetrating radar to prepare for coring; enclosure fabrication costs; and more complex conduit and pathway considerations. A typical handrail placement could cost four times as much as a typical overhead placement, and a designer might call for 2 or 3 handrail placements for every overhead placement.
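The cost ratios above can be put into a back-of-envelope model. The normalized unit cost and the helper function below are assumptions for illustration, not AmpThink pricing:

```python
# Normalized cost model for an in-fill section, using the ratios from
# the text: a handrail placement runs roughly 4x an overhead placement,
# and a design might call for 2-3 handrail placements per overhead one.

OVERHEAD_UNIT_COST = 1.0  # cost of one overhead placement, normalized

def infill_section_cost(overhead_placements, handrail_ratio=2.5,
                        handrail_multiplier=4.0):
    """Normalized cost of a section combining overhead and handrail APs."""
    handrail_placements = overhead_placements * handrail_ratio
    return (overhead_placements * OVERHEAD_UNIT_COST
            + handrail_placements * handrail_multiplier * OVERHEAD_UNIT_COST)

# One overhead placement plus 2.5 handrail placements costs roughly
# 11x a single overhead placement for the same section.
print(infill_section_cost(1))  # 11.0
```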

Getting closer, better, faster: Proximate Networks

In-Fill strategies substantially solved the coverage problem in large venues. Using a combination of back of section, front of section, and hand-rail mounted access points, wireless designers had a tool box to deliver full coverage.

But with that success came a new problem. As fans discovered these high density networks and found new uses for them, demands on those networks grew rapidly, especially where teams or venue owners pushed mobile-device content strategies that added to the network load. In spite of well-placed access points, fan devices did not attach to the in-fill devices at the same rate that they attached to the overhead placements [6]. In-fill equipment remained lightly used while overhead placements absorbed hundreds of clients. Gains in system capacity stalled.

Close-up look at U.S. Bank Stadium railing enclosure during final construction phase, summer 2016. Credit: Paul Kapustka, MSR


To overcome uneven system loading, designers needed to create a more even distribution of RF energy within the deployment. That required a consistent approach to deployment, rather than a mix of deployment approaches. The result was the elimination of overhead antennas in favor of access points and antennas installed within the crowd, closest to the end user; hence the name ‘Proximate Networks.’

Proximate networks come in two variations: handrail only and under seat only. In the hand rail only model, the designer eliminates overhead and front of section placements in favor of a dense deployment of hand rail enclosures. In the under seat model, the designer places the access point and antenna underneath the actual seating (but above the steel or concrete decking). In both models, the crowd becomes an important part of the design. The crowd attenuates the signal as it passes through their bodies resulting in consistent signal degradation and even distribution of RF energy throughout the seating bowl. The result is even access point loading and increased system capacity.

An additional benefit of embedding the access points in the crowd is that the crowd effectively constrains the output of the access point much as a wall constrains the output of an access point in a typical building. Each radio therefore hears fewer of its neighbors, allowing each channel to be re-used more effectively. And because the crowd provides an effective mechanism for controlling the spread of RF energy, the radios can be operated at higher power levels, which improves the link between the access point and the fan’s device. The result is more uniform system loading, higher average data rates, increased channel re-use, and increases in total system capacity.
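A rough link-budget sketch illustrates the crowd’s effect. The free-space path-loss model and the per-body loss figure (~4 dB) below are assumptions for illustration, not measured values from any deployment:

```python
import math

# Free-space path loss plus a fixed per-body attenuation, showing how
# the crowd shrinks an AP's effective interference footprint.

def rx_power_dbm(tx_dbm, distance_m, freq_mhz=5200,
                 bodies=0, body_loss_db=4.0):
    """Received power under free-space path loss, minus an assumed
    loss for each human body the signal passes through."""
    fspl = (20 * math.log10(distance_m)
            + 20 * math.log10(freq_mhz) - 27.55)  # dB, d in meters
    return tx_dbm - fspl - bodies * body_loss_db

# At 20 m on 5 GHz, a signal crossing 5 bodies arrives ~20 dB weaker
# than one with clear line of sight, so same-channel neighbors hear
# each other far less often and the channel can be re-used more densely.
clear = rx_power_dbm(17, 20)
through_crowd = rx_power_dbm(17, 20, bodies=5)
print(round(clear - through_crowd, 1))  # 20.0
```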

While Proximate Networks are still a relatively new concept, the early data (and a rapidly growing number of fast followers) confirms that if you want the densest possible network with the largest possible capacity, then a Proximate Network is what you need.

The Financials: picking what’s right for you

From the foregoing essay, you might conclude that the author’s recommendation is to deploy a Proximate Network. However, that is not necessarily the case. If you want the densest possible network with the largest possible capacity, then a Proximate Network is a good choice. But there are merits to each approach described and a cost benefit analysis should be performed before a deployment approach is selected.

For many venues, Overhead Deployments remain the most cost effective way to provide coverage. For many smaller venues and in venues where system utilization is expected to be low, an Overhead deployment can be ideal.

Front-to-Back deployments work well in venues where system utilization is low and the available overhead mounting assets can’t cover all areas. The goal of these deployments is ensuring usable coverage, not maximizing total system capacity.

In-fill deployments are a good compromise between a coverage-centric high density approach and a capacity-centric approach. This approach is best suited to venues that need more total system capacity, but have budget constraints that prevent selecting a Proximate approach.

Proximate deployments provide the maximum possible wireless density for venues where connectivity is considered to be a critical part of the venue experience.

Conclusion

If your venue is contemplating deploying a high density network, ask your integrator to walk you through the expected system demand, the calculation of system capacity for each approach, and finally the cost of each approach. Make sure you understand their assumptions. Then, select the deployment model that meets your business requirements — there is no “one size fits all” when it comes to stadium Wi-Fi.

Bill Anderson, AmpThink


Bill Anderson has been involved in the design and construction of wireless networks for over 20 years, pre-dating Wi-Fi. His first experience with wireless networking was as a software developer building software for mobile computers communicating over 400 MHz and 900 MHz base stations developed by Aironet (now part of Cisco Systems).

His work with mobile computing and wireless networks in distribution and manufacturing afforded him a front row seat to the emergence of Wi-Fi and the transformation of Wi-Fi from a niche technology to a business critical system. Since 2011 at AmpThink Bill has been actively involved in constructing some of the largest single venue wireless networks in the world.

Footnotes

^ 1. A proxy for the calculation of overall system capacity is developed by multiplying the average speed of communication of all clients on a channel (avg data rate or speed) by the number of channels deployed in the system (available spectrum) by the number of times we can use each channel (channel re-use) or [speed x spectrum x re-use]. While there are many other parameters that come into play when designing a high density network (noise, attenuation, reflection, etc.), this simple equation helps us understand how we approach building networks that can support a large number of connected devices in an open environment, e.g. the bowl of a stadium or arena.
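The proxy in this footnote can be expressed as a small calculation. The deployment figures below are invented for illustration, not taken from any real venue:

```python
# Capacity proxy from footnote 1: speed x spectrum x re-use.

def capacity_proxy(avg_data_rate_mbps, channels, reuse_factor):
    """Aggregate system capacity estimate in Mbps."""
    return avg_data_rate_mbps * channels * reuse_factor

# Hypothetical comparison: a proximate design improves both the average
# data rate (better links) and the re-use factor (crowd attenuation),
# while the available spectrum stays fixed.
overhead = capacity_proxy(avg_data_rate_mbps=24, channels=20, reuse_factor=3)
proximate = capacity_proxy(avg_data_rate_mbps=72, channels=20, reuse_factor=8)
print(overhead)   # 1440
print(proximate)  # 11520
```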

^ 2. Co-channel interference refers to a scenario where multiple access points are attempting to communicate with client devices using the same channel. If a client or access point hears competing communication on the channel they are attempting to use, they must wait until that communication is complete before they can send their message.

^ 3. Access Point is the term used in the Wi-Fi industry to describe the network endpoint that client devices communicate with over the air. Other terms used include radio, AP, or WAP. In most cases, each access point is equipped with 2 or more physical radios that communicate on one of two bands – 2.4 GHz or 5 GHz. HD Wi-Fi deployments are composed of several hundred to over 1,000 access points connected to a robust wired network that funnels guest traffic to and from the internet.

^ 4. While there is no hard and fast rule, most industry experts agree that a single access point can service between 50 and 100 client devices.

^ 5. Venues often use pressure washers to clean a stadium after a big event.

^ 6. Unlike cellular systems which can dictate which mobile device attaches to each network node, at what speed, and when they can communicate, Wi-Fi relies on the mobile device to make the same decisions. When presented with a handrail access point and an overhead access point, mobile devices often hear the overhead placement better and therefore prefer the overhead placement. In In-Fill deployments, this often results in a disproportionate number of client devices selecting overhead placements. The problem can be managed by lowering the power level on the overhead access point at the expense of degrading the experience of the devices that the designer intended to attach to the overhead access point.
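A minimal sketch of the client-side behavior described in this footnote, assuming a hypothetical scan-result list of (name, RSSI) pairs; real client association logic varies by vendor:

```python
# Wi-Fi clients, not the network, decide which AP to join, and most
# simply pick the strongest signal they hear (highest RSSI in dBm).

def pick_ap(scan_results):
    """Return the (name, rssi) pair with the strongest signal."""
    return max(scan_results, key=lambda ap: ap[1])

# An overhead AP with clear line of sight often sounds "louder" than a
# nearby handrail AP whose signal is attenuated by the crowd, so the
# client prefers the overhead placement the designer meant to offload.
scan = [("handrail-ap", -72), ("overhead-ap", -58)]
print(pick_ap(scan)[0])  # overhead-ap
```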

Stadium Tech Report: Wi-Fi, DAS and live video get good reception at Barclays Center

Concessions feature of Barclays Stadium app. Credit: Barclays Center


Sometimes, the best surprise is no surprise. That’s the case when it comes to technology deployments at the still-new Barclays Center in Brooklyn, where the Wi-Fi, DAS and live video on both fixed and mobile platforms are all performing pretty much as expected.

According to Chip Foley, vice president of building technology for Forest City Ratner Companies (the developer of Barclays Center), perhaps the only mild surprise so far at the just-over-a-year-old Barclays is that the biggest Wi-Fi usage came not during a sporting event, but instead at the MTV Video Music Awards ceremony this past August.

“We had 7,000 people using the Wi-Fi network at the VMAs, and I was a little surprised at that,” said Foley. At Brooklyn Nets games, Foley said, the average Wi-Fi load in the 17,500-seat arena is somewhere between 4,000 and 5,000 users per game. In a recent phone interview, Foley recapped the performance of the stadium’s cutting-edge technology, which also includes one of the first deployments of Cisco’s StadiumVision Mobile, which brings live video feeds to fans using the stadium app. There’s also Cisco-powered digital displays throughout the arena, and a robust DAS deployment to make sure regular cellular connections don’t fail.

HD Wi-Fi attracts 20 percent of attendees

Barclays Center, which opened in September of 2012, had a benefit few NBA arenas have: it was built from the ground up with networking as a key component. If Foley has any regrets about the Cisco Connected Stadium Wi-Fi deployment, it’s that it hasn’t really been fully tested yet. Even during the VMAs, Foley said he was using the in-building Wi-Fi to watch 10 different streaming video views on his laptop, from the red carpet cameras to the behind-the-scenes views of stars getting their awards.

Chip Foley


“Our goal was to build as robust a network as possible, so that we can handle big needs of one-off events [like the VMAs] as well as the 41+ Nets games every season,” Foley said. With two 1-gigabit backbone lines providing Internet access, Foley said the Barclays network is meeting its goal of being “as fast as your fiber connection at home.”

The only drawback so far seems to be getting more fans to try out the network connection when they are at the games or events. According to Foley, despite advertising and promotions, Nets crowds almost always land between 20 percent and 25 percent of attendees online, a “Groundhog Day” situation that has Foley wondering whether it’s a natural limit.

“That may just be the number of fans who want to use it [the network]” at a game, Foley said.

The Barclays Center DAS, deployed by ExteNet Systems using gear in part from TE Connectivity, is another non-surprise center for Foley.

“The DAS is great, we never get complaints [about cellular connectivity],” Foley said. “You dread hearing that people can’t send texts. That just hasn’t happened.”

Digital displays, both mobile and fixed

One of the more compelling features of the Barclays tech experience is the implementation of Cisco’s StadiumVision Mobile technology, which brings several live “channels” of video to any fan using the Wi-Fi connection and the stadium app, which was built by WillowTree. With views from the benches, behind the basket and quick replays, Barclays can bring an up-close and personal view to even those far away from the court.

StadiumVision Mobile app being used in Barclays Center. Credit: Barclays Center


“StadiumVision Mobile is great for the upper pavilion seats, you can now get a view from a different perspective, and get replays,” Foley said. According to Foley, Cisco engineers tested the technology’s performance to ensure that it worked at every seat in the house.

Fixed digital displays are also a key technology at Barclays, starting with the unique Oculus display built into the striking exterior of the building, and continuing to the hundreds of digital displays inside. Using the Cisco Stadium Vision digital display technology, Barclays Center is able to change and update information on single screens or on all screens on the fly, allowing for greater flexibility in terms of messaging and information like concession-stand prices. Barclays also uses its displays to show train schedules, giving fans better information to plan their departures from events.

“The Stadium Vision displays have been nothing but great for us — we sold a lot of advertising on them even before launch,” Foley said. “It’s fun for our content group to build out content for the L-boards [displays where an L-shaped advertisement brackets other information on the screen], and keep it changing. Restaurant operators can use an iPad to change prices [on their screens] right before an event. They don’t have to talk to us. Overall, it’s a lot less maintenance than I expected or anticipated.”

If Foley had one chance to do anything over again with displays, it would be to add more of them to the original mix. His lesson to future stadium display builders is: If you’re in doubt, put up more.

“We must have had 30-plus meetings regarding [internal] TV locations, with 3D modeling and fly-throughs,” Foley said. “For the most part, I’m happy. But if I could, I would have more clusters [of screens]. Wherever there is one screen now, I wish I had three. People always look at a cluster.”

Adding new screens after the fact, Foley said, isn’t as simple as going to Best Buy to pick up a discount TV.

“You might be able to buy a TV for a couple hundred bucks on Black Friday, but no one tells you that to put that in a venue, once you get past union costs, connectivity and everything else, it’s about $5,000,” Foley said. “It’s way more money to add them now.”

What’s next: iBeacon, Google Glass and more analytics

What’s in the future for Barclays technology? For starters, Foley will oversee deployment of Wi-Fi services for the outside spaces surrounding the arena.

“It’d be nice to have Wi-Fi for ticket scanning outside the venue,” said Foley. “That’s one of those things that you don’t understand the need for it until you open the stadium and see what happens.”

Barclays is also looking into testing the Apple iBeacon technology, which can send text messages to devices in very close proximity. Technologies like iBeacon and even digital signage must also cross internal administrative hurdles, such as simply training sales forces and alerting advertisers to the opportunities.

“For some of the streams, there’s the question of ‘how do we sell this’ — the team has never done this and sponsors may not be aware,” Foley said. “You also have to figure out things like how many notifications and emails should we send out. You don’t want to send out too many, because that turns people off.”

Foley said the Barclays social media team is also at the start of a process of mining statistics from places like Twitter, Facebook and other social media streams, to get a better handle on what fans are using the technology for and how the experience might be improved. One possible way is through a Google Glass application, something Foley agreed might not be for everyone.

“I’m fascinated by the possibility of something like an XML stats feed [in Google Glass] where you’d still be able to watch the game,” Foley said. “We’re getting closer! It’s not for everybody, but some portion of the population is probably thinking that way.”
