Comcast brings new 3-Gig backbone to Memphis Grizzlies’ FedExForum

Comcast Business, which has supplied backbone bandwidth to many sports arenas, announced that it has installed a new 3-gigabit fiber backbone at the NBA’s Memphis Grizzlies’ FedExForum, which will support the arena’s existing fan-facing Wi-Fi network.

According to our past research, the FedExForum Wi-Fi was previously supported by a wide-area wireless link from Ubiquiti, the firm led by Grizzlies owner Robert Pera. We are guessing here, but we suspect the fan-facing Wi-Fi will soon carry a new SSID, xfinitywifi, the one Comcast uses in other arenas where it provides backbone services.

Though we haven’t updated our specific information since our original report, news reports today claiming that Comcast is supplying Wi-Fi to FedExForum for the first time are incorrect. According to Comcast, the new deal also includes internet service and other communications services for the team’s front-office operations.

A building for the future: Tech shines through at Sacramento’s new Golden 1 Center

Golden 1 Center, the new home of the Sacramento Kings. Credit: Sacramento Kings

If you’re building for the future, it’s best to start with a building for the future.

That’s what has happened in downtown Sacramento, where the Sacramento Kings have built a technology-laden future-proof arena, a venue designed not just to host basketball games but to be the centerpiece of a metro revival for years to come.

Now open for business, the Golden 1 Center is a living blueprint for the arena of the future, especially from a technology perspective. And while some technology inside the venue is impossible to ignore — starting with the massive 4K scoreboard that overhangs the court — there’s also a wealth of less-apparent technology woven throughout the building’s core and pervasive in its operating functions.

Led by Kings majority owner and former software company founder Vivek Ranadive, the technology-focused direction of the new arena blends the latest thinking in venue experiences and operations. Among the many must-have staples: high-quality wireless connectivity and multiple mobile device-based services, including food ordering and delivery, map-based parking, wayfinding help, and digital ticketing. While its already-available options easily place Golden 1 Center among the top tier of connected stadiums today, what may be more impressive is the internal planning for future technologies and services, a sign that its owners and operators clearly understand the ever-changing nature of digital systems.

The purple lights are on in the Golden 1 Center data room. Credit all following photos: Paul Kapustka, MSR (click on any photo for a larger image)

While the arena is open today, it’s still somewhat of a diamond in the rough, as planned surrounding structures, including adjacent hotel and retail outlets, are still in the concrete-and-cranes phase, with “coming soon” signs on the area’s many construction fences. As they wait for their team to show signs of on-court improvement, Sacramento citizens must also be patient for the full plan of the downtown arena to emerge, along with its promise to revive an area once stuck in the past.

The good news? With Golden 1 Center, Sacramento fans already have a winner, a venue that will provide fans with some of the best digital-based services and amenities found anywhere, now and for the foreseeable future. What follows are our first impressions from an early December 2016 visit to a Kings home game, hosted by representatives of the Kings’ technical staff along with representatives from Wi-Fi gear provider Ruckus and cellular DAS deployment firm DAS Group Professionals.

Showing off the data center

Editor’s note: This profile is from our latest STADIUM TECH REPORT, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace. Read about new networks at the Indiana Pacers’ Bankers Life Fieldhouse and the new Wi-Fi network used for the Super Bowl in our report, which is available now for FREE DOWNLOAD from our site!

Data center guards. Small, but well armed.

If you had any doubts about how proud the Kings are of their stadium technology, those are erased the moment you enter the stadium via the VIP doorway; after the metal detectors but before you hit the new-wave ticket scanners, you see a set of floor-to-ceiling glass walls and doors to your left, showing off the impressive racks of the venue’s main data equipment room.

How can gear racks be impressive? Each one is impeccably encased in its own white metal-and-glass enclosure, a technique that lets the Kings cool each rack separately while leaving the rest of the room at a temperature more suitable for humans. You don’t have to be a network equipment operator to recognize the over-the-top attention to detail here: even the exposed fiber cabling that stretches up and across the ceiling is color-coded in the Kings’ team purple. Another level of coolness appears when the main lights in the room are turned off and LEDs come on to bathe the room in a completely purple hue.

This room is also where you see the first hints of how the team is preparing for the future, with two 100 Gbps incoming bandwidth pipes (from Comcast), as well as two full rows of racks left empty, waiting for whatever innovation needs arise next. While the backbone bandwidth will eventually also support the nearby hotel and retail locations, twin 100-Gbps connections should provide adequate throughput for now and the foreseeable future.

Walk a few steps past the mini-sized Imperial Stormtroopers who guard the facility and you are in a hallway that separates the “mission control” room, with its monitors for a huge number of operational services, from the video control room. The innovation here starts simply with the side-by-side proximity of the network, operations and video administration rooms, a rarity especially in older stadiums, where coordination between people working in such rooms often meant walkie-talkies and lots of running around.

Multiple live video inputs in the “control room” at G1C.

While the video control room, which must supply coordinated content to more than 800 monitors in the building (as well as to the app), is impressive, what’s really interesting is the “mission control” room, where Kings employees, network types and public safety personnel can track multiple inputs on a wall of monitors. In addition to security and public service video monitoring (Kings reps talk about seeing fans spill a drink and hustling to deploy clean-up crews before anyone can ask for them), there are also displays for real-time social media mentions and live traffic information, which the Kings can monitor and respond to as needed.

Another “unseen” technology innovation is an operational app that provides real-time access to a huge list of game-day business statistics, like live ticket-scan numbers and live updates to concession purchases. This app is also available to Kings execs on their mobile devices, and it’s addictive to watch the numbers update in real time, especially the fast-moving alcoholic beverage purchase totals; according to the Kings, during a Jimmy Buffett concert at the arena, adult-beverage purchases were pushing the $1,000-per-minute mark.

When it comes to the fan experience, such “hidden” technologies may be the services that provide the best examples for how high-quality networks can bring real ROI to stadiums and large public venues. Fans may never know the guts of the system, but when a stand doesn’t run out of hot dogs or a clean-up squad arrives quickly to mop up a spilled beer, it’s a good bet that customer satisfaction will keep increasing. With massively connected systems and attached real-time analytics, such services become easier to deploy and manage; at Golden 1 Center, it’s easy to see how multiple stakeholders in the venue benefit from the decision to make networked technology a primary core of the building’s operations.

The huge scoreboard dominates the view at Golden 1 Center.

A scoreboard that stretches from hoop to hoop

Taking an elevator up to the main concourse floor, the initial impression of Golden 1 Center is its openness — it is built so that the main, ground-level entrance sits at the top of the lower bowl of seats, with court level below. Because the venue is open all the way around, the ability to see across it gives the building an airy feeling, more like a large enclosed football stadium than a basketball arena. On the night we toured the venue its unique glass entryway windows were closed, but they can be opened to let in the breeze on milder days — adding another degree of difficulty for wireless network administration, since LTE signals can both enter and leave the building when the windows are open.

The next thing to catch your eye is the main scoreboard, which the Kings bill as the biggest 4K screen for a permanent indoor arena, with 35 million pixels. If it were lowered during a game, the Kings folks claim the screen would touch both baskets, so without any other numbers you get the idea: This thing is huge.

New entry kiosks from SkiData move more fans inside more quickly, the Kings claim.

It’s also incredibly clear, thanks in part to the 4K resolution and in part to the fact that it is tilted at just the right angles, making it easy to glance up from live action for a look at either the main screens or the bordering screens on both sides. Citing only clarity or size, I think, misses a critical factor for video boards — what really matters is whether the screen is a positive or a negative factor for during-game viewing, a subjective measurement that may take time to sink in. First impressions during the live action between the Kings and Knicks, however, were incredibly positive, with the screen not interfering with views of the live action but incredibly clear for replays and live statistics.

The next part of our tour was to see if we could spot any of the 931 Ruckus Wi-Fi APs installed inside the venue. With the clear emphasis on clean aesthetics it was hard to spot any wall- or ceiling-mounted units, but we were able to locate several of the many under-seat AP enclosures, including some on retractable seats. According to the Ruckus folks on hand, the retractable-seat APs took a little extra engineering to allow the devices to be disconnected during seat movements.

The JMA Wireless DAS equipment was a little easier to spot, since like at Levi’s Stadium there are a number of antenna placements around the main concourse, pointing down into the lower bowl seating. The DAS Group Professionals representatives on hand also pointed out more antennas up in the rafters, as well as some specially designed “antenna rocks” that hide cellular equipment outside the stadium in the open-air plaza. According to DGP and the Kings there are 136 DAS remote placements housing 213 antennas; right now only AT&T and Verizon Wireless are active on the DAS, with T-Mobile scheduled to join before the end of the NBA season. Negotiations with Sprint are ongoing.

Blazing Wi-Fi in the basement of the building… and the rafters

When we dropped back down to the court-level to see the locker room entrances and one of the premium-seat club areas, we took our first Wi-Fi speed test at Golden 1 Center, and almost couldn’t believe the result: We got 132 Mbps for the download speed and 98 Mbps for upload. Greeted a few minutes later by owner Ranadive himself, we congratulated him on getting what he wanted in terms of connectivity, a theme he relentlessly promoted during the arena’s construction phases.

That’s good Wi-Fi. Taken in the Lexus Club on court level at Golden 1 Center.

The Wi-Fi connectivity was superb throughout the venue, with readings of 51.35 Mbps down / 22.21 Mbps up on press row (located at the top of the main lower bowl, just in front of the main concourse) and 42.14 / 38.83 in the crowded Sierra Nevada brewpub club at the top level of the arena. In section 220 in the upper deck we got Wi-Fi readings of 53.39 Mbps for download and 36.27 Mbps for upload. Throughout the stadium the Verizon LTE signal was in the low teens to 20 Mbps range on the download side and usually between 20 and 30 Mbps on the upload side.

One of the decisions the Kings made on the Wi-Fi side was to drop 2.4 GHz coverage for fan devices in the main bowl area. According to both Ruckus and the Kings, fan devices now are almost 90 percent 5 GHz capable, meaning that it makes administrative sense to take 2.4 GHz out of the main fan Wi-Fi equation (while still keeping it for back-of-house operations like POS and wireless wristbands and cameras, which all still use 2.4 GHz technology). Other teams in the NBA, including the Indiana Pacers (who also recently installed a Ruckus Wi-Fi network) have also said that they are getting rid of 2.4 GHz coverage for fans since most devices used today have 5 GHz connectivity.

While we didn’t have time during this visit to explore all the numerous services available through the team’s app — including a game that lets fans bet loyalty points on predictions about which players will score the most points — it was clear that many fans were taking advantage of the connectivity, especially in the brewpub area where handy lean-up railings with small shelves made it easier to operate a mobile device while still being somewhat engaged with the court action below.

Team execs can get live feeds of fan-related stats on their internal app.

According to the Kings, during the first regular-season home game on Oct. 27, 2016, there were 8,307 unique users of the Wi-Fi network, out of 17,608 fans in attendance. The connected fans used a total of 1.4 terabytes of data on the Wi-Fi network that night, with a peak concurrent connection mark of 7,761 users. The highest sustained traffic to the Internet that night was 1.01 Gbps, over the 15-minute period between 7:45 and 8:00 p.m., according to the Kings.

Another technology twist we saw in the brewpub was the use of Appetize’s flip-screen POS terminals, which allow for faster order taking simply by letting fans sign on screens with their fingers. Back at the front gates, the new ticket-scanning kiosks from SkiData may take some time for fans to get used to, but even obvious first-timers seemed to quickly understand the kiosk’s operation without much help needed, thanks to the helpful instructions on the wide screen that greets fans as they encounter the device. According to the Kings, tests of the new kiosks at other venues have shown that they can be as much as three times faster than previous technologies, good news to anyone who’s ever had to wait in line just to have their ticket checked.

A building for the future, whenever it comes

While we here at MSR clearly focus on venue technology, it was clear even during our brief stay at Golden 1 Center that while Sacramento fans may be immediately enjoying the amenities, they are still first and foremost concerned about the product on the court. In the upper deck two men spent several minutes questioning why Kings star DeMarcus “Boogie” Cousins (who has since been traded to the New Orleans Pelicans) didn’t seem to get the kind of refereeing treatment allotted to other NBA leaders; on an escalator another fan interrupted one of my speedtests by loudly requesting a fan-to-fan fistbump while saying, “Kings basketball, right baby?”

A view outside the stadium’s main entrance, with one of the two large vertical video boards visible.

Even in the face of multiple years without playoff teams, Sacramento fans still turn out for the Kings; the point here in regards to technology is that it may take time for fans to notice and embrace the finer points of all the technological attributes of their new arena, which should become more than just an NBA venue as more concerts and civic events are held in and around its environs.

Our quick take is that fans may turn faster to services like the traffic, parking and seat-wayfinding features in the app, simply due to the newness of the building to everyone, as well as its tightly sandwiched downtown location. As in other new arenas, the jury is still out on other app-based services like the loyalty-points voting game, and in-seat concessions ordering and delivery; the Kings declined to provide any statistics for in-seat ordering and delivery, a service which became available to the entire stadium on the night of our visit. The Kings, like many other teams, also offer instant replays via the app, but with the numerous high-quality big-screen displays (including two arena-sized screens outside the main entryway) it will be interesting to see if fans ever feel an overwhelming need to check their devices for live action while attending a game.

The good news for the Kings is that they based their stadium and team app on a new flexible platform from a company called Built.io, which the Kings say allows for easier addition (or deletion) of services through an API layer. Like the future-proof parts of the building itself, the app also shows the Kings’ dedication to building something now that will almost certainly change going forward. As we look to the future it will be interesting to see which parts of the technology base contribute most to the fan experience and business operations at Golden 1 Center — and to see how many other existing or new arenas follow the lead.

More photos from our visit below!

Under seat Wi-Fi AP on a moveable section of stands.

The view from upper-deck seats.

A Wi-Fi speed test from those same seats.

One of the “rocks” hiding DAS antennas on the outside walkway.

College Football Playoff championship sees 2.4 TB of Wi-Fi — big decline from 2016

We finally have numbers for Wi-Fi usage at the most recent College Football Playoff championship game, and in somewhat of a first, the total data used during the event was much lower than at the previous year’s game: just 2.4 terabytes of Wi-Fi data on Jan. 9 at Raymond James Stadium in Tampa, Fla., compared to 4.9 TB at the 2016 championship game, held at the University of Phoenix Stadium in Glendale, Ariz.

The reason for the decline is probably not a sudden dropoff in user demand, since usage of in-stadium cellular (DAS) networks increased from 2016 to 2017, with AT&T’s observed network usage doubling from 1.9 TB to 3.8 TB in Tampa. More likely, the dropoff is due to the fact that the Wi-Fi network at the University of Phoenix Stadium had recently been upgraded to prepare for both the college championship game and Super Bowl XLIX, while the network at Raymond James Stadium hasn’t seen a significant upgrade since 2010, according to stadium officials. At last check, the Wi-Fi network at University of Phoenix Stadium had more than 750 APs installed.

Joey Jones, network engineer/information security for the Tampa Bay Buccaneers, said the Wi-Fi network currently in use at Raymond James Stadium has a total of 325 Cisco Wi-Fi APs, with 130 of those in the bowl seating areas. The design is all overhead placements, Jones said in an email discussion, with no under-seat or handrail enclosure placements. The total unique number of Wi-Fi users for the college playoff game was 11,671, with a peak concurrent connection of 7,353 users, Jones said.

Still tops among college playoff championship games in Wi-Fi is the first one held at AT&T Stadium in 2015, where 4.93 TB of Wi-Fi was used. Next year’s championship game is scheduled to be held at the new Mercedes-Benz Stadium in Atlanta, where one of the latest Wi-Fi networks should be in place and operational.

Super Bowl LI Wi-Fi sees drop in average per-fan use total

Under seat Wi-Fi APs visible down seating row at NRG Stadium. Credit: 5 Bars

While Super Bowl LI in Houston set a record for the most total Wi-Fi data used at a single-day event, the average amount of Wi-Fi data used per connected fan actually dropped from the previous year’s game, from about 370 megabytes per user at Super Bowl 50 to about 333 MB per user at Super Bowl 51.

According to totals provided by Extreme Networks, the NFL’s official analytics provider, there was a total of 11.8 TB of data used on the Wi-Fi network at NRG Stadium in Houston during Super Bowl 51, compared to 10.1 TB used during Super Bowl 50 at Levi’s Stadium in Santa Clara, Calif.

While the total Wi-Fi data number represents approximately a 17 percent increase from Super Bowl 50 to Super Bowl 51, the most recent game had 35,430 users who connected at least once to the network, an almost 30 percent leap from Super Bowl 50’s 27,316 unique users. So while Super Bowl 51 had more unique users (and more peak concurrent users as well) and a higher data total, the average amount of data used per connected fan decreased, from about 370 MB per user to about 333 MB per user.

Data for Super Bowls in years past is thin (mainly because stadium Wi-Fi didn’t really exist), but it’s certainly the first time in very recent history that the per-user average has dropped from one Super Bowl to the next. Super Bowl 49, held at the University of Phoenix Stadium in Glendale, Ariz., saw a total of 6.23 TB of Wi-Fi used by 25,936 unique users, a per-user average of 240 MB. We don’t have unique-user stats for Super Bowl XLVIII at MetLife Stadium, but with total Wi-Fi use there at 3.2 TB, the average was presumably much lower, unless there were also 50 percent fewer connected users.
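For readers who want to check the math, the per-user averages above follow directly from the reported totals. A quick sketch, using the figures from this article and the decimal convention (1 TB = 1,000,000 MB), which matches the cited averages:

```python
# Per-user Wi-Fi averages for recent Super Bowls, from the totals
# reported in the article (TB of Wi-Fi data, unique connected users).
GAMES = {
    "Super Bowl 49": {"wifi_tb": 6.23, "unique_users": 25_936},
    "Super Bowl 50": {"wifi_tb": 10.1, "unique_users": 27_316},
    "Super Bowl 51": {"wifi_tb": 11.8, "unique_users": 35_430},
}

def mb_per_user(wifi_tb: float, unique_users: int) -> float:
    """Average megabytes consumed per connected user (1 TB = 1,000,000 MB)."""
    return wifi_tb * 1_000_000 / unique_users

for game, stats in GAMES.items():
    print(f"{game}: {mb_per_user(**stats):.0f} MB per user")
```

Running this reproduces the roughly 240, 370 and 333 MB-per-user figures cited above, confirming that the drop comes from unique users growing faster than total data.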

Did autoconnect drop the average?

Wi-Fi gear visible above concourse kiosk at NRG Stadium. Credit: 5 Bars

The drop in per-user average data for Wi-Fi is curious when compared to the huge leap in overall DAS stats for the last two Super Bowls, with Super Bowl 51 checking in at 25.8 TB of data, a figure that does not include statistics from T-Mobile, which is declining to report its data total from the game. At Super Bowl 50, all four top wireless carriers combined saw 15.9 TB, so the total for Super Bowl 51 is about 62 percent higher — and if you add in the estimated 3-4 TB that was likely recorded by T-Mobile, that leap is even bigger.

Unfortunately cellular carriers do not provide the exact number of connected users, so there is no per-user average data total available. It would be interesting to know if the expanded DAS preparations made at Super Bowl 50 and at Super Bowl 51 actually connected more total users, or allowed users to use more data per user. We have a request with Verizon for more stats, but it may be a long wait.

One theory we have here at MSR is that it’s possible that a large number of autoconnected devices may have increased the unique-user total while not necessarily adding to the overall Wi-Fi data-used total. In our reporting about the NRG Stadium network we noted that Verizon, which helped pay for the Wi-Fi deployment, had reserved 40 percent of the Wi-Fi capacity for its customers, many of whom could have been autoconnected to the network without even knowing it. We have asked both Extreme and Verizon for a breakdown of Verizon users versus other wireless customers on the Wi-Fi network, but have not yet received a response.

Arizona State upgrades DAS, Wi-Fi at Sun Devil Stadium

Sun Devil Stadium at Arizona State. Credit all photos: ASU

When Arizona State University started renovating Sun Devil Stadium three years ago, the project wasn’t so much a simple wireless refresh as it was a total reset of what technology, sports and academia could co-create.

In addition to expanded Wi-Fi and DAS for the stadium (a venue that includes classrooms, meeting rooms and retail outlets), ASU activated a virtual beacon trial. The university also joined up with Intel to explore how Internet of Things devices might yield better environmental information about the bowl, including acoustic data, Jay Steed, assistant vice president of IT operations, told Mobile Sports Report.

The university’s IT department understood that a richer fan experience for football and other events would require a robust network. Steed and his colleagues visited other venues like Levi’s Stadium, AT&T Stadium, Stanford and Texas A&M to get a better handle on different approaches to networking, applications and services.

Regardless, some sort of refresh was overdue. Wedged between two buttes in the southeastern Phoenix suburb of Tempe, the 71,000-seat Sun Devil Stadium was completed in 1958 and needed infrastructure and technology updates. Wi-Fi access was limited to point-of-sale systems and stadium suites; fans generally relied on a DAS network.

Time for an upgrade

Editor’s note: This profile is from our latest STADIUM TECH REPORT, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace. Read about the Sacramento Kings’ new Golden 1 Center and the new Wi-Fi network for the Super Bowl in our report, which is available now for FREE DOWNLOAD from our site!

“The stadium needed a lot of facelifting, not just from a technology perspective but also for the fan experience, like ADA compliance and overall comfort,” Steed said. “We didn’t just want to rebuild a venue for six football games a year, but extend its availability to 365 days and make it a cornerstone and anchor for the whole campus.”

The ‘Inferno’ student section was given priority for better connectivity.

The reset included tearing out the lower bowl to “punch some new holes” — new entry points to the stadium — and to add conduits and cabling for the new 10-gigabit fiber backbone for the stadium. The network can be upgraded as needed to 40- and even 100-gigabit pipes, according to Steed.

“We wanted to make sure it could support fans’ [connectivity needs] and all the facility’s operations with regard to video and StadiumVision, learning and education, and Pac-12 needs as well,” he said.

The overall stadium renovation was budgeted at $268 million; the technology upgrades will total about $8 million.

The university added 250 new DAS antennas. The vendor-neutral network includes AT&T, Verizon, Sprint and T-Mobile, which share 21 DAS sectors to keep cell service humming inside the stadium.

On the Wi-Fi side, ASU opted for Cisco’s access points. The networking vendor was already entrenched across the 642-acre campus; Steed and the IT department prefer the simplicity of a single-vendor network. Cisco helped with the hardware and RF engineering for Sun Devil Stadium. CenturyLink offered guidance on the networking and fiber pieces of the project, while Hunt-Sundt, a joint venture, was the contractor for most of the physical construction.

Wireless service for ‘The Inferno’

When the renovation is officially completed later in 2017 (most of the network is already live), there will be 1,100 APs in and around Sun Devil Stadium. The student sections, also known as The Inferno, get more APs and bandwidth since historical data has shown students to be the biggest bandwidth consumers in the stadium. Consequently, the ratio in the student sections is one AP to every 50 users; the rest of the bowl’s APs each handle about 75 users on average, Steed said.
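As a rough illustration of how those AP-to-user ratios translate into AP counts, here is a back-of-the-envelope sizing sketch. The seat splits below are hypothetical placeholders, not ASU’s actual section counts:

```python
# Rough AP-count sizing from the ratios quoted above: about one AP per
# 50 users in the student sections versus one per 75 in the rest of the bowl.
def aps_needed(seats: int, users_per_ap: int) -> int:
    """Round up so every seat is covered by the budgeted ratio."""
    return -(-seats // users_per_ap)  # ceiling division

# Hypothetical split of the 71,000-seat bowl (not ASU's real numbers).
student_aps = aps_needed(seats=9_000, users_per_ap=50)
general_aps = aps_needed(seats=62_000, users_per_ap=75)
print(student_aps, general_aps, student_aps + general_aps)
```

With these assumed splits the sketch lands at a bowl total in the neighborhood of 1,000 APs, which is at least consistent with the roughly 1,100 APs planned in and around the stadium as a whole.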

Breakaway look at an under-seat AP

ASU’s new Wi-Fi network was engineered to deliver 1.5 Mbps upstream and 3 Mbps downstream, but Steed said so far users are getting better performance – 8 Mbps up and 12 Mbps down. “We’re getting about 25 percent saturation,” he added. “Many users put their phones away during the games, but we see spikes at halftime and during commercial breaks.” Regardless, ASU continually monitors Wi-Fi and DAS usage and adjusts bandwidth as needed.

Another big challenge is the desert climate – temperatures regularly soar into triple digits. With about 450 under-seat APs in the bowl, Steed and his team had to make sure the enclosures could withstand heat and didn’t obstruct the walkways. “We’ll see how well the electronics do, baking at 120 degrees six months out of the year,” he laughed.

ASU is also working with Intel, using the stadium’s APs as part of an Internet of Things trial. As Steed described it, IoT sensors work alongside stadium APs to measure temperature, noise, vibration and other environmental data. “We also look at lighting control and water distribution and flow,” he said.

Concourses also got expanded Wi-Fi and DAS coverage.

Automating the control of environmental functions like heating, cooling, power usage and facilities management will help the university toward its goal of being carbon-neutral by 2025, Steed added. The trials are designed so that the technology can be expanded across the university, possibly for campus transportation kiosks or student concierge services. IoT devices could give students and visitors information about adjacent buildings or landmarks around campus.

Separate but related, the university is also testing cloud-based, Bluetooth low energy (BLE) technology from Mist Systems. These “virtual beacons” use sensors attached to an AP to flag information or a point of interest for students or stadium visitors. “The virtualized beacon technology helps us understand where people are walking around and what they’re looking at in the stadium and elsewhere around campus,” Steed said.

They’re currently being tested in some of Sun Devil Stadium’s suites; Steed foresees expanding that to the student union to help guide people to meeting rooms, retail facilities or food vendors, for example.

Steed credited excellent communication and collaboration among the university’s athletic and IT departments and other players in the upgrade equation. “Our athletic director, Ray Anderson, brought the CIO and me into his office and really worked together with us,” he explained. “The biggest piece of our success was knowing that the AD supported our recommendations and brought us in as valued advisors.”

Update: Super Bowl LI breaks 37 TB wireless mark

NRG Stadium during Super Bowl LI. Credit: AP / Morry Gash / Patriots.com

It’s official now, and without any doubt Super Bowl LI broke the single-day wireless data use mark, with at least 37.6 terabytes used.

The official stats for Wi-Fi at NRG Stadium are finally in, with a mark of 11.8 TB, which is a bit more than the 10.1 TB recorded at last year’s Super Bowl at Levi’s Stadium, the previous top mark. The official stats were reported Thursday by Wi-Fi gear provider Extreme Networks, which posted them on the company website.

New DAS records even without any T-Mobile stats

On the cellular side Verizon Wireless, AT&T and Sprint all set new records, with Verizon reporting 11 TB of use and AT&T reporting 9.8 TB, while Sprint (which ran on its own DAS at NRG Stadium) hit 5 TB. At last year’s Super Bowl Verizon (7 TB) and AT&T (5.2 TB) had set their respective previous high-water marks, while Sprint had reported 1.6 TB at Levi’s Stadium. Even without numbers from T-Mobile the current DAS count is 25.8 TB, much higher than the 15.9 TB cellular total from Super Bowl 50.

(Unfortunately, T-Mobile right now is refusing to provide a total data number — a spokesperson who didn’t want to be quoted claimed on a phone call that the total data number was “not relevant,” and that T-Mobile would not provide a final number. However, we did see a blog post from the company claiming it passed its 2.1 TB total from last year by halftime, so at the very least we could probably accurately add at least another 2.2 TB to the overall DAS total. So we may see a combined total of all cellular and Wi-Fi nearing 40 TB before it’s all counted up, approved or not.)
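Adding up the reported carrier numbers, with T-Mobile’s figure as our hedged estimate rather than an official stat, shows how the totals above are reached:

```python
# Super Bowl LI wireless totals as reported in this article (in TB).
reported_das_tb = {"Verizon": 11.0, "AT&T": 9.8, "Sprint": 5.0}
wifi_tb = 11.8
tmobile_estimate_tb = 2.2  # estimate only: T-Mobile said it passed 2.1 TB by halftime

das_total = sum(reported_das_tb.values())             # reported DAS total
official_total = das_total + wifi_tb                  # the 37.6 TB mark
with_estimate = official_total + tmobile_estimate_tb  # "nearing 40 TB"
print(das_total, official_total, with_estimate)
```

The sum of the reported carrier figures gives the 25.8 TB DAS total, 37.6 TB with Wi-Fi added, and close to 40 TB once the unofficial T-Mobile estimate is included.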

One of our close friends in the business was at the game, and was kind enough to send us a bunch of Wi-Fi speedtests from NRG Stadium (go check our Twitter timeline at @paulkaps to see the tests linked).

What was interesting was watching the speeds go down when “spike” events occurred, like touchdowns and the end of Lady Gaga’s halftime show. The incredible comeback by the New England Patriots to claim a 34-28 overtime victory kept the network busy through the night, and after the game as well during the awards ceremony.

Tom Brady with the Lombardi Trophy. Credit: AP / Patriots.com

New record for take rate

According to Extreme, fans at NRG Stadium also set new high-water marks for unique connections to the network as well as for peak concurrent connections. At Super Bowl LI, Extreme said it saw 35,430 fans connect to the network, a 49 percent take rate against the attendance of 71,795. Last year at Super Bowl 50 at Levi’s Stadium, a total of 27,316 fans connected to the network out of 71,088 attending, a 38 percent take rate.

On the peak concurrent-connection side, Super Bowl LI set a new mark with 27,191 fans connected at one time, according to Extreme. At Super Bowl 50, the top concurrent-connection mark was 20,300.
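The take-rate figures quoted by Extreme are simple ratios of unique users to attendance; a quick check against the reported numbers:

```python
# Take rate = unique Wi-Fi users / game attendance, per the figures above.
def take_rate(unique_users: int, attendance: int) -> float:
    """Percentage of attendees who connected to the Wi-Fi at least once."""
    return 100 * unique_users / attendance

print(f"Super Bowl 51: {take_rate(35_430, 71_795):.0f}%")  # ~49%
print(f"Super Bowl 50: {take_rate(27_316, 71_088):.0f}%")  # ~38%
```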

Extreme also released some social-media statistics, claiming that 1.7 TB of the Wi-Fi total was social media traffic. Leading the way, in order of most users to fewest, were Facebook, Instagram, Snapchat and Twitter. Interestingly, Snapchat consumed almost as much data as Facebook, according to pie graphs in the Extreme infographic, which did not provide actual numbers for those totals. Extreme also did not report what is typically the biggest use of bandwidth in any stadium situation: Apple iOS updates and Google Gmail activity.

The NFL, which had its own game-day application for Super Bowl LI, has not released any statistics about app use.

Congrats to all the carriers, integrator 5 Bars and Wi-Fi gear supplier Extreme Networks.

THE NEW TOP 6 FOR WI-FI

1. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8 TB
2. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB
3. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB
4. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB
5. Alabama vs. Texas A&M, Kyle Field, College Station, Texas, Oct. 17, 2015: Wi-Fi: 5.7 TB
6. Pittsburgh Steelers vs. New England Patriots, AFC Championship Game, Gillette Stadium, Foxborough, Mass., Jan. 22, 2017: Wi-Fi: 5.11 TB

THE NEW TOP 4 FOR TOTAL USAGE

1. Super Bowl 51, NRG Stadium, Houston, Feb. 5, 2017: Wi-Fi: 11.8; DAS: 25.8 TB**; Total: 37.6 TB
2. Super Bowl 50, Levi’s Stadium, Santa Clara, Calif., Feb. 7, 2016: Wi-Fi: 10.1 TB; DAS: 15.9 TB; Total: 26 TB
3. Super Bowl 49, University of Phoenix Stadium, Glendale, Ariz., Feb. 1, 2015: Wi-Fi: 6.23 TB; DAS: 6.56 TB**; Total: 12.79 TB**
4. WrestleMania 32, AT&T Stadium, Arlington, Texas, April 3, 2016: Wi-Fi: 6.77 TB; DAS: 1.9 TB*; Total: 8.6 TB*

* = AT&T DAS stats only
** = AT&T, Verizon Wireless and Sprint DAS stats only