From overhead to under seat: A short history of the hows and whys of stadium Wi-Fi network design

Wi-Fi handrail enclosures at U.S. Bank Stadium, Minneapolis, Minn. Credit: Paul Kapustka, MSR

By Bill Anderson, AmpThink

The history of high density (HD) Wi-Fi deployments in stadiums and arenas is short. Yet the amount of change that has occurred is significant, both in terms of how these networks are deployed and why.

Venue operators, manufacturers, and integrators are still grappling with the particulars of HD Wi-Fi in large open environments, even though there are a substantial number of high-quality deployed implementations. Below, I’ve shared our perspective on the evolution of HD Wi-Fi design in stadiums and arenas and put forth questions that venue operators should be asking to find a solution that fits their needs and their budget.

AmpThink’s background in this field

Over the past five years, our team has been involved in the deployment of more than 50 high-density Wi-Fi networks in stadiums throughout North America. In that same period, the best practices for stadium HD Wi-Fi design have changed several times, resulting in multiple deployment methodologies.

Each major shift in deployment strategy was intended to increase total system capacity [1]. The largest gains have come from better antenna technology or deployment techniques that better isolate access point output, resulting in gains in channel re-use.

What follows is a summary of what we’ve learned from the deployments we participated in and their significance for the future. Hopefully, this information will be useful to others as they embark on their journeys to purchase, deploy, or enhance their own HD Wi-Fi networks.

In the beginning: All about overhead

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.


Designers of the first generation of HD Wi-Fi networks were starting to develop the basic concepts that would come to define HD deployments in large, open environments. Their work was informed by prior deployments in auditoriums and convention centers and focused on using directional antennas. The stated goal of this approach was to reduce co-channel interference [2] by reducing the effective footprint of an individual access point’s [3] RF output.

However, the greatest gains came from improving the quality of the link between clients and the access point. Better antennas allowed client devices to communicate at faster speeds, which decreased the amount of time required to complete their communication, making room for more clients on each channel before a given channel became saturated or unstable.

Under seat Wi-Fi AP at Bank of America Stadium. Credit: Carolina Panthers

The concept was simple, but limited by the fact that there were few antennas available that could do the job effectively. Creative technicians built hybrid assemblies that combined multiple antennas into arrays that rotated polarization and tightened the antenna beam to paint the smallest usable coverage pattern possible. In time, this gap was addressed, and today there are antennas specifically developed for use in overhead HD deployments – Stadium Antennas.

Typically, Stadium Antennas are installed in the ceilings above seating and/or on the walls behind seating because those locations are relatively easy to cable and minimize cost. We categorize these deployments as Overhead Deployments.

From overhead to ‘front and back’

First generation overhead deployments generally suffer from a lack of overhead mounting locations to produce sufficient coverage across the entire venue. In football stadiums, the front rows of the lower bowl are typically not covered by an overhang that can be used for antenna placement.

These rows are often more than 100 feet from the nearest overhead mounting location. The result is that pure overhead deployments leave some of the most expensive seats in the venue with little or no coverage. Further, due to the length of these sections, antennas at the back of the section potentially service thousands of client devices [4].

As fans joined these networks, deployments quickly became over-loaded and generated service complaints for venue owners. The solution was simple — add antennas at the front of long sections to reduce the total client load on the access points at the back. It was an effective band-aid that prioritized serving the venues’ most important and often most demanding guests.

This approach increased the complexity of installation as it was often difficult to cable access points located at the front of a section.

And for the first time, antennas were placed where they were subject to damage by fans, direct exposure to weather, and pressure washing [5]. With increased complexity, came increased costs as measured by the average cost per installed access point across a venue.

Because these systems feature antennas at the front and rear of each seating section, we refer to these deployments as ‘Front-to-Back Deployments.’ While this approach solves specific problems, it is not a complete solution in larger venues.

‘Filling In’ the gaps

Data collected from Front-to-Back Deployments proved to designers that moving the antennas closer to end users:
— covered areas that were previously uncovered;
— increased average data rates throughout the bowl;
— used the available spectrum more effectively; and
— increased total system capacity.

The logical conclusion was that additional antennas installed between the front and rear antennas would further increase system capacity. In long sections these additional antennas would also provide coverage to fans that were seated too far forward of antennas at the rear of the section and too far back from antennas at the front of the section. The result was uniform coverage throughout the venue.

In response, system designers experimented with hand rail mounted access points. Using directional antennas, coverage could be directed across a section and in opposition to the forward-facing antennas at the rear of the section and rear-facing antennas at the front of a section. These placements filled in the gaps in a Front-to-Back Deployment, hence the name ‘In-Fill Deployment.’

While these new In-Fill Deployments did their job, they added expense to what was already an expensive endeavor. Mounting access points on handrails required that a hole be drilled through the concrete at each access point location to cable the installed equipment. With the access point and antenna now firmly embedded in the seating, devices were also exposed to more traffic and abuse. Creative integrators came to the table with hardened systems to protect the equipment – handrail enclosures. New costs included: using ground-penetrating radar to prepare for coring; enclosure fabrication costs; and more complex conduit and pathway considerations. A typical handrail placement could cost four times as much as a typical overhead placement, and a designer might call for 2 or 3 handrail placements for every overhead placement.

Getting closer, better, faster: Proximate Networks

In-Fill strategies substantially solved the coverage problem in large venues. Using a combination of back-of-section, front-of-section, and handrail-mounted access points, wireless designers had a toolbox to deliver full coverage.

But with that success came a new problem. As fans discovered these high density networks and found new uses for them, demands on those networks grew rapidly, especially where teams or venue owners pushed mobile-device content strategies that added to the network load. In spite of well-placed access points, fan devices did not attach to the in-fill devices at the same rate that they attached to the overhead placements [6]. In-fill equipment remained lightly used while overhead placements absorbed hundreds of clients. Gains in system capacity stalled.

Close-up look at U.S. Bank Stadium railing enclosure during final construction phase, summer 2016. Credit: Paul Kapustka, MSR

To overcome uneven system loading, designers needed to create a more even distribution of RF energy within the deployment. That required a consistent approach to deployment, rather than a mix of deployment approaches. The result was the elimination of overhead antennas in favor of access points and antennas installed within the crowd, closest to the end user; hence the name ‘Proximate Networks.’

Proximate networks come in two variations: handrail-only and under-seat-only. In the handrail-only model, the designer eliminates overhead and front-of-section placements in favor of a dense deployment of handrail enclosures. In the under-seat model, the designer places the access point and antenna underneath the actual seating (but above the steel or concrete decking). In both models, the crowd becomes an important part of the design. The crowd attenuates the signal as it passes through fans’ bodies, resulting in consistent signal degradation and even distribution of RF energy throughout the seating bowl. The result is even access point loading and increased system capacity.

An additional benefit of embedding the access points in the crowd is that the crowd effectively constrains the output of the access point much as a wall constrains the output of an access point in a typical building. Each radio therefore hears fewer of its neighbors, allowing each channel to be re-used more effectively. And because the crowd provides an effective mechanism for controlling the spread of RF energy, the radios can be operated at higher power levels, which improves the link between the access point and the fan’s device. The result is more uniform system loading, higher average data rates, increased channel re-use, and increases in total system capacity.

While Proximate Networks are still a relatively new concept, the early data (and a rapidly growing number of fast followers) confirms that if you want the densest possible network with the largest possible capacity, then a Proximate Network is what you need.

The Financials: picking what’s right for you

From the foregoing essay, you might conclude that the author’s recommendation is to deploy a Proximate Network. However, that is not necessarily the case. If you want the densest possible network with the largest possible capacity, then a Proximate Network is a good choice. But there are merits to each approach described and a cost benefit analysis should be performed before a deployment approach is selected.

For many venues, Overhead Deployments remain the most cost-effective way to provide coverage. In smaller venues, and in venues where system utilization is expected to be low, an Overhead Deployment can be ideal.

Front-to-Back deployments work well in venues where system utilization is low and the available overhead mounting assets can’t cover all areas. The goal of these deployments is ensuring usable coverage, not maximizing total system capacity.

In-fill deployments are a good compromise between a coverage-centric high density approach and a capacity-centric approach. This approach is best suited to venues that need more total system capacity, but have budget constraints that prevent selecting a Proximate approach.

Proximate deployments provide the maximum possible wireless density for venues where connectivity is considered to be a critical part of the venue experience.

Conclusion

If your venue is contemplating deploying a high density network, ask your integrator to walk you through the expected system demand, the calculation of system capacity for each approach, and finally the cost of each approach. Make sure you understand their assumptions. Then, select the deployment model that meets your business requirements — there is no “one size fits all” when it comes to stadium Wi-Fi.

Bill Anderson, AmpThink

Bill Anderson has been involved in the design and construction of wireless networks for over 20 years, pre-dating Wi-Fi. His first experience with wireless networking was as a software developer building software for mobile computers communicating over 400 MHz and 900 MHz base stations developed by Aironet (now part of Cisco Systems).

His work with mobile computing and wireless networks in distribution and manufacturing afforded him a front row seat to the emergence of Wi-Fi and its transformation from a niche technology to a business-critical system. Since joining AmpThink in 2011, Bill has been actively involved in constructing some of the largest single-venue wireless networks in the world.

Footnotes

^ 1. A proxy for the calculation of overall system capacity is developed by multiplying the average speed of communication of all clients on a channel (avg data rate or speed) by the number of channels deployed in the system (available spectrum) by the number of times we can use each channel (channel re-use) or [speed x spectrum x re-use]. While there are many other parameters that come into play when designing a high density network (noise, attenuation, reflection, etc.), this simple equation helps us understand how we approach building networks that can support a large number of connected devices in an open environment, e.g. the bowl of a stadium or arena.
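The proxy in this footnote is simple enough to sketch in a few lines. In the sketch below, all of the input numbers are hypothetical, chosen only to illustrate the equation, not measurements from any deployment discussed above.

```python
# Capacity proxy from footnote 1: [speed x spectrum x re-use].
# All input numbers are hypothetical, for illustration only.

def capacity_proxy(avg_rate_mbps: float, channels: int, reuse: float) -> float:
    """Aggregate-capacity proxy in Mbps: average client data rate,
    times channels deployed, times average re-use per channel."""
    return avg_rate_mbps * channels * reuse

# A hypothetical overhead design: modest data rates, little re-use.
overhead = capacity_proxy(avg_rate_mbps=20, channels=20, reuse=2)   # 800 Mbps

# A hypothetical proximate design: faster links and more re-use,
# since the crowd isolates each access point from its neighbors.
proximate = capacity_proxy(avg_rate_mbps=50, channels=20, reuse=6)  # 6,000 Mbps
```

Note how the proxy rewards data rate and re-use equally, which is why both of the big shifts described above (better antennas, then crowd-isolated placements) moved the needle.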

^ 2. Co-channel interference refers to a scenario where multiple access points are attempting to communicate with client devices using the same channel. If a client or access point hears competing communication on the channel they are attempting to use, they must wait until that communication is complete before they can send their message.

^ 3. Access Point is the term used in the Wi-Fi industry to describe the network endpoint that client devices communicate with over the air. Other terms used include radio, AP, or WAP. In most cases, each access point is equipped with 2 or more physical radios that communicate on one of two bands – 2.4 GHz or 5 GHz. HD Wi-Fi deployments are composed of several hundred to over 1,000 access points connected to a robust wired network that funnels guest traffic to and from the internet.

^ 4. While there is no hard and fast rule, most industry experts agree that a single access point can service between 50 and 100 client devices.

^ 5. Venues often use pressure washers to clean a stadium after a big event.

^ 6. Unlike cellular systems which can dictate which mobile device attaches to each network node, at what speed, and when they can communicate, Wi-Fi relies on the mobile device to make the same decisions. When presented with a handrail access point and an overhead access point, mobile devices often hear the overhead placement better and therefore prefer the overhead placement. In In-Fill deployments, this often results in a disproportionate number of client devices selecting overhead placements. The problem can be managed by lowering the power level on the overhead access point at the expense of degrading the experience of the devices that the designer intended to attach to the overhead access point.

AT&T Stadium sees 7.25 TB of Wi-Fi for Packers vs. Cowboys playoff game

The Dallas Cowboys before taking the field against the Green Bay Packers in a Jan. 15 playoff game. Credit: James D. Smith/Dallas Cowboys

Pro football’s biggest stadium had the biggest non-Super Bowl Wi-Fi traffic day we’ve heard of this season, as the Dallas Cowboys reported seeing 7.25 terabytes of Wi-Fi data on the AT&T Stadium network during the Packers’ thrilling 34-31 victory on Jan. 15.

John Winborn, chief information officer for the Dallas Cowboys, sent us the info on the stadium’s biggest Wi-Fi day ever, surpassing the previous record of 6.77 TB seen on the AT&T Stadium Wi-Fi network for WrestleMania 32 back on April 3, 2016. The new total for Wi-Fi was even set by fewer fans, with attendance for the Jan. 15 playoff game at 93,396, compared to the 101,763 at WrestleMania.

Though he didn’t provide an exact number, Winborn also said that the take rate of unique clients on the Wi-Fi network for the Packers game was 50 percent of attendees, roughly 46,700, easily one of the biggest numbers we’ve seen anywhere. During the Cowboys’ excellent regular season, Winborn said the average of Wi-Fi data used per game was 5.28 TB, an increase of 33 percent over the 2015 season.

UPDATE: The AT&T folks have provided the DAS stats for the same game, with an additional 3 TB of data used on the AT&T cellular networks inside the stadium. So we’re up to 10.25 TB for a non-Super Bowl game… doubt we will get any other carriers to add their totals but sounds to me like this is the biggest non-Super Bowl event out there in terms of total data.

Any other NFL teams (or college teams) out there with peak games and/or season averages, send them in! Let’s keep updating this list!

THE NEW TOP 7 FOR WI-FI

1. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
2. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
3. Green Bay Packers vs. Dallas Cowboys, Divisional Playoffs, AT&T Stadium, Arlington, Texas, Jan. 15, 2017: Wi-Fi: 7.25 TB
4. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
5. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB
6. Alabama vs. Texas A&M, Kyle Field, College Station, Texas, Oct. 17, 2015: Wi-Fi: 5.7 TB
7. Pittsburgh Steelers vs. New England Patriots, AFC Championship Game, Gillette Stadium, Foxborough, Mass., Jan. 22, 2017: Wi-Fi: 5.11 TB

Super Bowl LI Wi-Fi sees drop in average per-fan use total

Under seat Wi-Fi APs visible down seating row at NRG Stadium. Credit: 5 Bars

While Super Bowl LI in Houston set records for most total Wi-Fi used at a single-day event, the average amount of Wi-Fi data used per connected fan actually dropped from the previous year’s game, from about 370 megabytes per user at Super Bowl 50 to about 333 MB per user for Super Bowl 51.

Using official totals provided by the NFL’s official analytics provider, Extreme Networks, there was a total of 11.8 TB of data used on the Wi-Fi network at NRG Stadium in Houston during Super Bowl 51, compared to 10.1 TB used during Super Bowl 50 at Levi’s Stadium in Santa Clara, Calif.

While the total Wi-Fi data number represents approximately a 17 percent increase from Super Bowl 50 to Super Bowl 51, the most recent game had 35,430 users who connected at least once to the network, an almost 30 percent leap from Super Bowl 50’s 27,316 unique users. So while Super Bowl 51 had more unique users (and more peak concurrent users as well) and a higher data total, the average amount of data used per connected fan decreased, from about 370 MB per user to about 333 MB per user.

Data for Super Bowls in years past is thin (mainly because stadium Wi-Fi didn’t really exist), but it’s certainly the first time in very recent history that the per-user average has dropped from one Super Bowl to the next. Super Bowl 49, held at the University of Phoenix Stadium in Glendale, Ariz., saw a total of 6.23 TB of Wi-Fi used by 25,936 unique users, for a per-user average of 240 MB. We don’t have any stats for unique users at Super Bowl XLVIII at MetLife Stadium, but with the total Wi-Fi used there at 3.2 TB, the per-user average was presumably much lower, unless there were also 50 percent fewer connected users.
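The per-user averages quoted above are easy to reproduce; the only assumption in the sketch below is that the reported totals use decimal units (1 TB = 1,000,000 MB).

```python
# Per-connected-fan average: total Wi-Fi data divided by unique users.
# Assumes decimal units (1 TB = 1,000,000 MB), which is our assumption
# about how the reported totals were measured.

def mb_per_user(total_tb: float, unique_users: int) -> float:
    return total_tb * 1_000_000 / unique_users

sb49 = mb_per_user(6.23, 25_936)  # Super Bowl 49, Glendale      -> ~240 MB
sb50 = mb_per_user(10.1, 27_316)  # Super Bowl 50, Levi's Stadium -> ~370 MB
sb51 = mb_per_user(11.8, 35_430)  # Super Bowl 51, NRG Stadium    -> ~333 MB
```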

Did autoconnect drop the average?

Wi-Fi gear visible above concourse kiosk at NRG Stadium. Credit: 5 Bars

The drop in per-user average data for Wi-Fi is curious when compared to the huge leap in overall DAS stats for the last two Super Bowls, with Super Bowl 51 checking in at 25.8 TB of data, a figure that does not include statistics from T-Mobile, which is declining to report its data total from the game. At Super Bowl 50, all four top wireless carriers combined saw 15.9 TB, so the total for Super Bowl 51 is about 62 percent higher — and if you add in the estimated 3-4 TB that was likely recorded by T-Mobile, that leap is even bigger.

Unfortunately cellular carriers do not provide the exact number of connected users, so there is no per-user average data total available. It would be interesting to know if the expanded DAS preparations made at Super Bowl 50 and at Super Bowl 51 actually connected more total users, or allowed users to use more data per user. We have a request with Verizon for more stats, but it may be a long wait.

One theory we have here at MSR is that it’s possible that a large number of autoconnected devices may have increased the unique-user total while not necessarily adding to the overall Wi-Fi data-used total. In our reporting about the NRG Stadium network we noted that Verizon, which helped pay for the Wi-Fi deployment, had reserved 40 percent of the Wi-Fi capacity for its customers, many of whom could have been autoconnected to the network without knowing it. We have asked both Extreme and Verizon for a breakdown of Verizon users vs. other carriers’ customers on the Wi-Fi network, but have not yet received a response.

Update: Super Bowl LI breaks 37 TB wireless mark

NRG Stadium during Super Bowl LI. Credit: AP / Morry Gash/ Patriots.com

It’s official now, and without any doubt Super Bowl LI broke the single-day wireless data use mark, with at least 37.6 terabytes used.

The official stats for Wi-Fi at NRG Stadium are finally in, with a mark of 11.8 TB, which is a bit more than the 10.1 TB recorded at last year’s Super Bowl at Levi’s Stadium, the previous top mark. The official stats were reported Thursday by Wi-Fi gear provider Extreme Networks, which posted them on the company website.

New DAS records even without any T-Mobile stats

On the cellular side Verizon Wireless, AT&T and Sprint all set new records, with Verizon reporting 11 TB of use and AT&T reporting 9.8 TB, while Sprint (which ran on its own DAS at NRG Stadium) hit 5 TB. At last year’s Super Bowl Verizon (7 TB) and AT&T (5.2 TB) had set their respective previous high-water marks, while Sprint had reported 1.6 TB at Levi’s Stadium. Even without numbers from T-Mobile the current DAS count is 25.8 TB, much higher than the 15.9 TB cellular total from Super Bowl 50.

(Unfortunately, T-Mobile right now is refusing to provide a total data number — a spokesperson who didn’t want to be quoted claimed on a phone call that the total data number was “not relevant,” and that T-Mobile would not provide a final number. However, we did see a blog post from the company claiming it passed its 2.1 TB total from last year by halftime, so at the very least we could probably accurately add at least another 2.2 TB to the overall DAS total. So we may see a combined total of all cellular and Wi-Fi nearing 40 TB before it’s all counted up, approved or not.)

One of our close friends in the business was at the game, and was kind enough to send us a bunch of Wi-Fi speedtests from NRG Stadium (go check our Twitter timeline at @paulkaps to see the tests linked).

What was interesting was watching the speeds go down when “spike” events occurred, like touchdowns and the end of Lady Gaga’s halftime show. The incredible comeback by the New England Patriots to claim a 34-28 overtime victory kept the network busy through the night, and after the game as well during the awards ceremony.

Tom Brady with the Lombardi Trophy. Credit: AP / Patriots.com

New record for take rate

According to Extreme, fans at NRG Stadium also set new high-water marks for unique connections to the network as well as for peak concurrent connections. At Super Bowl LI, Extreme said it saw 35,430 fans connect to the network, a 49 percent take rate with attendance of 71,795. Last year at Super Bowl 50 at Levi’s Stadium, a total of 27,316 fans connected to the network out of 71,088 attending, a 38 percent take rate.

On the peak concurrent-connection side, Super Bowl LI set a new mark with 27,191 fans connected at one time, according to Extreme. At Super Bowl 50, the top concurrent-connected mark was 20,300.
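For reference, the take rates above are simply unique Wi-Fi users divided by announced attendance:

```python
# Take rate: percentage of attendees who connected to Wi-Fi at least once.

def take_rate_pct(unique_users: int, attendance: int) -> float:
    return 100 * unique_users / attendance

sb51 = take_rate_pct(35_430, 71_795)  # ~49% at Super Bowl LI
sb50 = take_rate_pct(27_316, 71_088)  # ~38% at Super Bowl 50
```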

Extreme also released some social-media statistics, claiming that 1.7 TB of the Wi-Fi total was social media traffic. Leading the way, in order of most users to fewest, were Facebook, Instagram, Snapchat and Twitter. Interestingly, Snapchat consumed almost as much data as Facebook, according to pie graphs in the Extreme infographic, which did not provide any actual numbers for those totals. Extreme also did not report what is typically the highest use of bandwidth in any stadium situation: Apple iOS updates and Google Gmail activity.

The NFL, which had its own game-day application for Super Bowl LI, has not released any statistics about app use.

Congrats to all the carriers, integrator 5 Bars and Wi-Fi gear supplier Extreme Networks.

THE NEW TOP 6 FOR WI-FI

1. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
2. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
3. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
4. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB
5. Alabama vs. Texas A&M, Kyle Field, College Station, Texas, Oct. 17, 2015: Wi-Fi: 5.7 TB
6. Pittsburgh Steelers vs. New England Patriots, AFC Championship Game, Gillette Stadium, Foxborough, Mass., Jan. 22, 2017: Wi-Fi: 5.11 TB

THE NEW TOP 4 FOR TOTAL USAGE

1. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8; DAS: 25.8 TB**; Total: 37.6 TB
2. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB; DAS: 15.9 TB; Total: 26 TB
3. Super Bowl XLIX, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB; DAS: 6.56 TB**; Total: 12.79 TB**
4. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB; DAS: 1.9 TB*; Total: 8.6 TB*

* = AT&T DAS stats only
** = AT&T, Verizon Wireless and Sprint DAS stats only

Verizon goes under concrete to bolster NRG Stadium DAS for Super Bowl LI

Nodes on wheels, or NOWs, provide extra coverage for Verizon Wireless in Houston for Super Bowl LI. Credit: Verizon Wireless

In a slight twist from its strategy for last year’s Super Bowl, Verizon Wireless has installed DAS antennas underneath the concrete flooring of lower-tier seats at Houston’s NRG Stadium, to provide extra bandwidth for the expected high wireless data usage at Super Bowl LI.

Last year at Levi’s Stadium in Santa Clara, Calif., Verizon beefed up its distributed antenna system (DAS) with under-seat antennas it designed specifically for use in stadiums. The idea of mounting antennas under seats, a growing trend in the stadium Wi-Fi world, is gaining traction as another method of bringing signals closer to fans, especially in places (like lower bowl seats) where there are no overhangs or other places to mount gear.

And while Verizon has been preparing for Sunday’s big game at NRG Stadium for years, that didn’t stop the company from “continually tweaking” its network preparations, according to Leo Perreault, executive director of network operations for Verizon’s South Central market, a region that stretches from west of Florida to Arizona, including Houston. In a phone interview this week, Perreault said that Verizon installed the under-concrete antennas during the middle of the 2016 football season, giving the company “some good experience” with the deployment ahead of Sunday’s game.

Under concrete = easier install and maintenance

A view inside the head end room that runs Verizon's NRG Stadium DAS. Credit: Verizon Wireless

It might not be well known outside of wireless networking circles, but signals will travel through concrete; many early stadium Wi-Fi designs (and some current ones, including a new network installed at the Pepsi Center in Denver) use antennas mounted under concrete floors, pointing up. Though fixed under-seat antennas can provide better coverage, Perreault said the ease of deployment made putting the additional DAS antennas underneath the floor a better option in Houston.

“This way [under the concrete] is non-intrusive,” Perreault said, noting that the devices are also not affected by stadium power-washing units. The decision may have been influenced by the fact that NRG Stadium’s new Wi-Fi network had a big issue with moisture in under-seat AP placements, forcing a mid-season rip and replace for all the under-seat Wi-Fi APs.

Even though antennas under concrete are not as powerful, Perreault said Verizon is “very pleased with the performance. It’s a good compromise.”

Biggest stadium DAS?

Between the game being the Super Bowl and it being in Texas, there’s no shortage of hyperbole surrounding the game and all its attendant facets, including the network technology. But when Perreault claims that the DAS Verizon has installed for NRG Stadium “might be the largest we have anywhere,” that might be true since it also serves adjacent properties including the NRG Convention Center, the NRG Arena and an outdoor DAS in the surrounding spaces. In addition to Houston Texans games, NRG Park (which includes the stadium) is also host to the Houston Livestock Show and Rodeo, which humbly bills itself as the “world’s largest livestock show and richest regular-season rodeo.”

An alien spaceship, or a temporary cell tower from Verizon? You choose.

Inside NRG Stadium, Perreault said the new Verizon DAS (built before the 2015 season) has more than 900 antennas. As neutral host, Verizon will also provide access to AT&T and T-Mobile on its network; Sprint, which built a previous DAS at NRG, will continue to run on that system.

Outside the stadium and around Houston, Verizon has done the usual big-event preparations, with lots of permanent and temporary macro network improvements, and portable units like COWs (cells on wheels) and smaller NOWs (nodes on wheels). You can review all the Verizon preparations in a company blog post.

As previously reported in MSR, Verizon also helped foot part of the bill for the new NRG Stadium Wi-Fi network, a deal that will give Verizon a reserved claim to 40 percent of the Wi-Fi network’s capacity, according to Perreault.

Whether or not Super Bowl LI breaks the wireless data consumption records set at last year’s game remains to be seen, but Perreault said there doesn’t seem to be any slowing down yet of the perpetual growth in wireless data use at stadiums, especially at big events like the Super Bowl.

“Fans just seem to find ways to consume whatever additional bandwidth you provide,” he said.

NFL builds its own Super Bowl app, with no concessions delivery and fewer replays

Screen shot of map function on NFL Super Bowl LI app.

The NFL has built its own Super Bowl mobile app, breaking with a recent history of using stadium-app specialists like VenueNext and YinzCam to develop specific apps for Super Sunday.

Also unlike recent years, the NFL’s Super Bowl app will not feature instant replays or any kind of food or drink delivery services. Instead, there appears to be a big focus on promoting Super Bowl events (especially those for this weekend) and on helping out-of-town tourists find their way to Super Bowl events and to the game itself.

Curiously, an interview about the app with the NFL’s CIO claimed that this year’s app will also be the first to include the ability for fans at the game to watch Super Bowl commercials. The story also claims without any attribution that “In the past, commercials weren’t on the app in order to avoid using too much bandwidth in the stadium.” However, at the most recent Super Bowls, including the past two, stadium bandwidth has been more than sufficient to stream plenty of video. And in fact, both of the last two Super Bowl apps have included the ability for fans at the game to see Super Bowl commercials.

Last year’s app, developed by VenueNext for the Levi’s Stadium hosting of Super Bowl 50, definitely showed Super Bowl commercials, part of what the San Francisco 49ers network team said was a record-breaking day of app-based video watching. The Super Bowl 49 app, built by YinzCam, also included Super Bowl commercials according to this NFL video and according to our previous reporting.

Fewer replays, no food or beverage delivery service

Screen shot of transportation info links from Super Bowl LI app.

And even though NFL CIO Michelle McKenna-Doyle is quoted in the previous story about the new app as saying “You won’t feel like you’re using two separate apps as fans have in the past,” the Super Bowl LI app contains a link to download the separate NFL Mobile app, which is apparently where Super Bowl highlights and replays will live. There was no confirmation from the NFL or Verizon about whether or not fans in the stands would be able to watch the live broadcast of the game via NFL Mobile. Fans not at the game will be able to use NFL Mobile to watch the game on cellular devices; fans can also stream the game from the FoxSports website, for PCs or tablet devices.

This year’s app will also not include any way for fans to use the app to order food or beverage delivery to their seats; last year’s app did have the ability to order in-seat delivery of beverages or to place an order for food and beverage express pickup, a service used for 3,284 orders. NRG Stadium, however, does not offer full-stadium in-seat ordering like Levi’s Stadium does; the stadium does have serving staff with wireless devices providing in-seat ordering services for club sections, which will likely be in use at the Super Bowl as well.