Remote worker support at forefront for venue IT during coronavirus shutdowns

With almost all work now being done remotely, it’s no surprise that team and venue IT staffs have virtual operations support at the forefront as the coronavirus shuts down most business operations.

In emails and calls to a small group of venue, team and school IT leaders, the one common response from every person who replied was the task of making sure that staffs could work online in a virtual fashion. According to our short list of respondents, that task included getting mobile devices into the hands of those who needed them, and setting up systems like virtual private networks (VPNs) and virtual desktop infrastructure (VDI) so that work could proceed in an orderly, secure fashion.

Since many of the people we asked for comments couldn’t reply publicly, we are going to keep all replies anonymous and surface the information only. The other main question we asked was whether or not the virus shutdowns had either delayed or accelerated any construction or other deployment projects; we got a mix of replies in both directions, as some venues are taking advantage of the shutdowns to get inside arenas that don’t have any events happening now. In addition to some wireless-tech projects that are proceeding apace, we also heard about other repairs to systems like elevators and escalators, which are more easily done when venues are empty.

But we also heard from some venues that shutdowns right now will likely push some projects back, maybe even a year or more. One venue that is largely empty in the summer will have to skip a planned network upgrade because it expects that normally empty dates in the fall and winter will be filled by cancelled events that will need to be rescheduled. Another venue said that it has projects lined up ready to go, but has not yet gotten budget approval to proceed.

Following our editorial from earlier this week, when we encouraged venues to make their spaces available for coronavirus response efforts, it was clear that many venues across the world had already started down that path. One of the quickest uses to surface was using venues’ wide-open parking lots as staging areas for mobile coronavirus testing; Miami’s Hard Rock Stadium, Tampa’s Raymond James Stadium and Washington D.C.’s FedEx Field were among those with testing systems put in parking lots.

Some venues have already been tabbed as places for temporary hospitals, with deployments at Seattle’s CenturyLink Field and New York’s Billie Jean King National Tennis Center already underway. Other venues, including Rocket Mortgage Fieldhouse in Cleveland and State Farm Stadium in Glendale, Ariz., have hosted blood drives.

Using venues to support coronavirus response efforts is a worldwide trend, with former Olympic venues in London and former World Cup venues in Brazil being proposed as support sites. Perth Stadium in Australia is being used as a public safety command center, while Chicago’s United Center is serving as a logistics hub.

Many other venues are stepping forward to offer free public Wi-Fi access in parking lots so that people who don’t have internet access at home can safely drive up and connect. Ball State University and the Jackson Hole Fairgrounds are just two of many venues doing this.

Venues are also offering their extensive kitchen and food-storage capabilities for the response effort. The Green Bay Packers have been preparing and delivering meals for schools and health-care workers, while the Pepsi Center in Denver offered cooler space to store food. Many other venues have contributed existing stores of food to charitable organizations and support efforts, since those items won’t be used at any of the many cancelled events.

DIY method brings Wi-Fi to Rutgers basketball arena

The Rutgers Scarlet Knights men’s basketball team takes on the Indiana Hoosiers at Rutgers Athletic Center on Jan. 15, 2020. Credit: Ben Solomon/Rutgers Athletics

It was a bit more complicated than a trip to Home Depot, but when the Rutgers University IT team wanted to bring fan-facing Wi-Fi to the school’s basketball arena but didn’t have the budget for a big-name contractor or vendor deal, it did what many weekend warriors do when faced with the same build vs. buy decision:

They did it themselves.

By purchasing lower-cost Wi-Fi gear and doing almost all of the design and deployment work in-house, the Rutgers IT team was able to bring a satisfactory level of coverage to the 8,000-seat Rutgers Athletic Center for a total price tag of about $62,000, according to representatives from the school’s athletic IT department. The Rutgers team first told their story at this year’s College Athletics IT peer conference in Ann Arbor, Michigan, and then provided more details in a follow-up interview with Mobile Sports Report.

The success of the DIY Wi-Fi deployment now has the Rutgers IT team looking at a similar method for bringing Wi-Fi to the school’s football stadium, starting with a localized deployment in the student section where it anticipates needs will be the highest. While fans at events in the “RAC” are probably happy for the connectivity, what might even be more important is the confidence and experience gained by the IT team by rolling up its sleeves and finding a way to deliver the network at a very reasonable price.

“The practical experience of doing this ourselves was just so much more interesting than attending conferences or networking classes,” said Jonathan Beal, systems administrator for the Rutgers athletics IT team. “I’d encourage smaller schools to look into something like this.”

Turnkey system prices ‘out of range’

Editor’s note: This profile is from our latest STADIUM TECH REPORT, which is available to read instantly online or as a free PDF download! Inside the issue is a profile of Dickies Arena in Fort Worth and a recap of a record Wi-Fi day at Super Bowl LIV! Start reading the issue now online or download a free copy!

A look at the tilt angles for the Wi-Fi APs. Credit: Rutgers Athletics

Though Rutgers isn’t exactly small (enrollment is just more than 50,000 at the main campus in New Brunswick, N.J.) and while its teams are part of the major Big Ten conference, the school simply doesn’t have the athletic-department budgets that some of its conference brethren do. And while Beal said that the school is regularly approached by technology vendors with stadium Wi-Fi pitches, the million-dollar-plus price tags for deployments are a non-starter for Rutgers.

“We get approached year after year, but the quotes are always out of our [budget] range,” Beal said. But at the college IT conference in 2019, Beal said the Rutgers team was interested in a presentation from the IT department at the University of Virginia, where that school used lower-cost equipment from Wi-Fi gear provider Ubiquiti to bring Wi-Fi to Virginia’s football stadium.

While Beal said the Virginia team detailed some initial failures in its deployment program, it eventually got the project on track, which inspired the Rutgers crew to see if it could chart a similar path.

“We took notes, came back to New Jersey, made some phone calls, and asked ‘how far could we go?’,” Beal said. At the beginning, the team guessed they might be able to get the school to “absorb the cost” of a test deployment either in the basketball arena or the football stadium. What tipped the project in the basketball arena’s favor was the existence of some recently installed conduits leading to the rafters, where some biometric tracking equipment and some previous DAS gear had been installed.

“For the football stadium, the [conduit] pathways are challenging – it’s going to be costly when we do that,” Beal said.

After trying out a few test APs sent over by Ubiquiti, the Rutgers team felt confident in their choice of hardware, and submitted a budget for $60,000 – which was quickly approved. “It was an easier sell than we thought,” said Beal. “They [the administration] trusted us.”

Overhead vs. under seat

Choosing to put Wi-Fi in the rafters pointing down, instead of under the seats pointing up, was another conscious choice Rutgers made after noticing a difference between how football fans and basketball fans use in-venue wireless.

“We noticed that at football games fans download [data] and watch stuff, then go back to watching the game,” Beal said. “For basketball it’s a totally different user experience. People aren’t watching things on their phones, but they are uploading to Instagram.”

A look up at some of the Wi-Fi APs. Credit: Rutgers Athletics

So instead of solving for density and coverage (where under-seat offers a generally better experience) the Rutgers team aimed for the best upload experience for the money – which meant they could do top-down APs using line-of-sight tuning.

With a blend of a 3D rendering of the entire seating bowl (done with 360-degree cameras) and some help from Ekahau survey tools, the Rutgers team pinpointed the optimal placement points for the APs in the rafters. Since the seating in “The RAC” is mostly only on the two sides of the court – and not behind the baskets – the deployment became a fairly uncomplicated tale of two halves, with two APs for each sector.
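For readers curious about the line-of-sight math behind that top-down approach, here is a minimal sketch of the downtilt calculation; the rafter height and row distances are hypothetical placeholders, not figures from Rutgers’ actual survey.

```python
import math

def downtilt_deg(ap_height_m: float, target_dist_m: float) -> float:
    """Tilt below horizontal for an AP aimed at a seat target_dist_m away."""
    return math.degrees(math.atan2(ap_height_m, target_dist_m))

# Hypothetical geometry: a rafter AP roughly 20 m above seat level,
# aimed at rows 15 to 35 m away horizontally.
for dist_m in (15, 25, 35):
    print(f"{dist_m} m out -> tilt down {downtilt_deg(20, dist_m):.1f} degrees")
```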

Some tuning revealed a need to tilt the top AP down from a straight horizontal mount, since the tin roof of the RAC (which contributes to the venue’s historic reputation for being loud and an intimidating place to play) also reflects RF signals.

“Everything bounces around up there off the roof, including the RF,” said Beal. With 20 APs in the rafters (and four more down at court level for other areas) Rutgers was able to get the kind of coverage they wanted. After installing the APs with help from campus technicians – including installing backup chains to keep APs from falling onto any guests – it was time for the next step: Seeing what happened when fans joined the network.

Captive portal or free access?

Like almost every other venue that has installed Wi-Fi for guests, Rutgers struggled with how to make access available. Should it just be free to use with no restrictions, or should the school try some kind of captive portal to gather an email address or other identifying information so that it could market to event attendees?

Joe Vassilatos, unit computing manager for the Rutgers athletics IT team, said the Rutgers marketing team favored a Facebook sign-in method because of the ease of identification. But Vassilatos said the IT team was “wary” of the Facebook approach, something Beal agreed with.

“We got some feedback from other schools that if you put that [Facebook sign-in] in, nobody uses the network,” said Beal.

Instead, the team opted for a sign-in method that sends fans a one-time 4-digit SMS code that they must enter to get access to the network. But both Beal and Vassilatos hoped that in the future there might be other ways to monetize the network – like doing offload for cellular carriers – that would allow them to make access even easier.
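As a generic illustration of how that kind of sign-in flow typically works (this is a sketch, not the actual portal code Rutgers deployed), a portal back end might issue and verify the one-time 4-digit codes like this:

```python
import secrets
import time

CODE_TTL_SECONDS = 300  # assume codes expire after five minutes
_pending: dict[str, tuple[str, float]] = {}  # phone -> (code, issued_at)

def issue_code(phone: str) -> str:
    """Generate a one-time 4-digit code and remember when it was issued."""
    code = f"{secrets.randbelow(10_000):04d}"
    _pending[phone] = (code, time.time())
    # A real portal would hand the code to an SMS gateway for delivery here.
    return code

def verify_code(phone: str, attempt: str) -> bool:
    """Grant network access only if the code matches and hasn't expired."""
    entry = _pending.pop(phone, None)  # single use: removed on first attempt
    if entry is None:
        return False
    code, issued_at = entry
    return attempt == code and time.time() - issued_at < CODE_TTL_SECONDS
```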

A top-down look at the mounting solution for the APs. Credit: Rutgers Athletics

With the network in place during this past basketball season, Rutgers saw good numbers on the usage side, with anywhere from 600 to 800 people using the network at games this winter. Beal said network statistics showed that at most games, 20 percent of the visitors connected to the network at least once, with 10 percent having dwell times in the 20- to 50-minute range.

“That shows they’re a real user, and not just a visitor,” Beal said.
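Pulling that kind of dwell-time cohort out of a controller’s session export is a quick scripting job; here is a back-of-the-envelope sketch with invented session records:

```python
# Hypothetical session export: (client_id, dwell time in minutes)
sessions = [("a1", 2), ("b2", 24), ("c3", 47), ("d4", 8), ("e5", 35)]

engaged = [cid for cid, minutes in sessions if 20 <= minutes <= 50]
share = len(engaged) / len(sessions)
print(f"{share:.0%} of clients dwelled 20-50 minutes: {engaged}")
```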

For the last three games of the season, the Rutgers network got a promotional boost from a pregame light show that included fans using their mobile devices. Part of the promotion included instructions to log on to the Wi-Fi.

But according to Beal, the network wasn’t ever a secret.

“The first thing people do in any place is check for free Wi-Fi,” Beal said. “And if people are happy with it, it’s good enough.”

Next steps: Planning for football

For this offseason, the new project for the Rutgers IT team is bringing Wi-Fi to the student section of the football stadium, where they are planning an under-seat approach. According to both Beal and Vassilatos, the deployment there is going to be more of a tuning challenge, since Rutgers students rarely sit in one place, instead crowding the area and even standing on bleachers to cram in.

But with a functional Wi-Fi network now inside the basketball arena, a place known as “The Trapezoid of Terror” (for its unique sloped-walls architecture), the Rutgers IT team is confident in its deployment chops, and takes great pride in knowing that more events can be held there with good connectivity, including more potential money-making events like career fairs and concerts.

“In the past when we had graduation ceremonies or other events [in the RAC] we had to bring out portable Wi-Fi,” Beal said. “Now we can take that load on the stadium network.”

For Vassilatos, the Wi-Fi is a reason for a little bit of chest-beating.

“IT is usually very inward-facing, and this was our chance to utilize our skill set to add to the bravado of the athletic experience,” Vassilatos said. “We took this on our own to implement, and we’re better from the experience.”

New Report: Dickies Arena sets a new standard for arena excellence

MOBILE SPORTS REPORT is pleased to announce the Spring 2020 issue of our STADIUM TECH REPORT series, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace.

Our latest issue contains an in-person report on the new Dickies Arena in Fort Worth, which may have just set the new standard for excellence in an arena experience. We also recap another record Wi-Fi day at Super Bowl LIV, as well as a DIY Wi-Fi network at Rutgers University.

You can READ THE REPORT right now in our new flip-page format, with no registration required!

For those who prefer the PDF, you can also download a copy of the report for free as well!

We’d like to take a quick moment to thank our sponsors, which for this issue include Corning, Boingo, MatSing, Cox Business/Hospitality Network, Comcast Business, Samsung, and American Tower. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to welcome readers from the Inside Towers community, who may have found their way here via our ongoing partnership with the excellent publication Inside Towers. We’d also like to thank the SEAT community for your continued interest and support.

Oklahoma leads the way with Wi-Fi 6 network at football stadium

An AmpThink handrail enclosure for Wi-Fi APs at Oklahoma. Credit all photos: Paul Kapustka, MSR

In the long history of college football, the University of Oklahoma is a name that is always somehow in the discussion when it comes to top teams and Heisman-quality talent. And now you can add stadium Wi-Fi to the list of things Oklahoma does well, after a deployment of a 100 percent Wi-Fi 6 network at Gaylord Family-Oklahoma Memorial Stadium was in place for most of the recent football season.

Formerly among the most conspicuous Wi-Fi have-nots among big-school stadiums, the Sooners have now moved to the front of the class with a network of approximately 1,350 access points in their 80,126-seat stadium, all new models that support the emerging Wi-Fi 6 standard, also known as 802.11ax. The deployment was led by AT&T, using gear from Aruba, a Hewlett Packard Enterprise company, with design and deployment from AmpThink, using mainly handrail-mounted enclosures in the main bowl seating areas. OU fans now have the ability to connect wirelessly at the most advanced levels, with a technology base that will support even better performance as the balance of attendee handsets catches up to the network with support for Wi-Fi 6.

“We’re very excited” about the new network, said David Payne, senior technology strategist for athletics at the University of Oklahoma’s information technology department. Payne, who has been at Oklahoma since 2003, has spent the last several years shepherding the overall stadium Wi-Fi plan into place, starting first with Wi-Fi coverage for the stadium RV parking lots, then adding initial forays into stadium Wi-Fi deployment when Oklahoma renovated the south part of the stadium three years ago. But this past offseason was the big push to full stadium coverage, a trek that included a switch in equipment vendors that was prompted by Oklahoma’s solid commitment to the emerging Wi-Fi 6 standard.

Committed to Wi-Fi 6 for the future

Editor’s note: This profile is from our latest STADIUM TECH REPORT, which is available to read instantly online or as a free PDF download! Inside the issue are profiles of the new Wi-Fi and DAS networks at Chase Center, as well as profiles of wireless deployments at Fiserv Forum and the University of Florida! Start reading the issue now online or download a free copy!

A water-sealed connection for the bottom of a handrail enclosure.

If there was a tricky time to pull the trigger on Wi-Fi 6, it was last summer, when not every vendor in the market could ensure it would have enough gear on hand to fully supply a big stadium like Oklahoma’s. And even though Wi-Fi 6 gear is new and generally more expensive than previous versions, for Payne and Oklahoma the long-term benefits, combined with how rarely the chance comes to refresh something as significant as a football stadium network, made committing to Wi-Fi 6 somewhat of a no-brainer.

Payne, like many other big-school IT leaders, has spent years helping administrators and others at budget-deciding levels of leadership at his school try to understand the benefits of stadium-wide Wi-Fi connectivity. For many of those years, it just didn’t make sense to try to push through the multi-million-dollar expense of a project “that would only be used six or seven Saturdays a year,” Payne said. “There’s always a difficulty in telling the story of what value you receive in this, since it’s different from traditional revenue streams. There isn’t a direct dollar seen from Wi-Fi users.”

But with the late-2018 approval of a capital expenditure project to revamp the football stadium’s lower-bowl seating with new handrails, wider seats and other ADA-related improvements, Payne and the IT team were able to weave in the extra $3 million (out of a total project cost of $14.9 million) it would cost to bring full Wi-Fi coverage to the entire stadium.

“It’s just taking advantage of the timing to get economies of scale,” said Payne. Because of the already-planned work on the handrails, Oklahoma was able to add the AmpThink-designed handrail Wi-Fi enclosures (which use the handrail pipes to carry cabling) for a fraction of the cost of having to do that work as a separate project, Payne said. The university had also installed new backbone gear and cabling during the south end zone renovation, so that cost was already covered.

The decision to commit to Wi-Fi 6, Payne said, was based on standard release projections from manufacturers. “We paid close attention to projected order availability and ship dates,” Payne said. “We felt that if we were able to receive the gear by June, we could complete the project on time.”

Though some manufacturers were not sure of being able to fully deliver Wi-Fi 6 gear, Aruba, Payne said, had “high confidence” in meeting the deadlines, and won the deal. According to Payne, all the Aruba gear was shipped in time to begin construction in June.

A handrail enclosure in the lower bowl

“It’s important for us to get the full life cycle of technology, so that’s why we decided to go 100 percent Wi-Fi 6,” Payne said.

Attention to detail an AmpThink hallmark

On a visit before and during a home game against Texas Tech in late September 2019, Mobile Sports Report was able to test the live network in all parts of the stadium, with strong performance at even the highest seating levels as well as in sometimes overlooked spots like the long ramps that fans walk up to get in and out of the venue.

The Oklahoma deployment was part of a very busy summer for AmpThink, which also led similar Wi-Fi designs and deployments at Ohio State and Arkansas. Like those two, Oklahoma’s main bowl AP deployment used the patented AmpThink handrail enclosures, each stamped with the distinctive “OU” logo.

The handrail deployment system, which typically includes a core drill through the concrete floor to bring wiring into the handrail tubing, is now a standard process for AmpThink, following similar deployments at the Minnesota Vikings’ U.S. Bank Stadium and at Notre Dame Stadium, among others. At Oklahoma, AmpThink said it used 10 different handrail enclosure designs to fit all the necessary spaces.

AmpThink president Bill Anderson was present during our visit and took great pride in showing off some of the finer points of an AmpThink deployment, including a method of using a metal sleeve and some clever waterproof paint and sealant to ensure that no moisture finds its way into the holes used for cable delivery.

“We spend a tremendous amount of time [during deployments] making sure there isn’t any water leakage under the stands,” Anderson said. “Because you never know what is going to be below. This is a big part of what we do. We don’t just sell an enclosure.”

Concourse APs visible high on concrete posts

The same can be said of AmpThink’s overall network designs, which it monitors and tests and tweaks as fans use the system. On the game day we visited, no fewer than four AmpThink employees were at the stadium in the network control room, checking AP performance and network usage.

“We’re pretty proud of what we can do,” Anderson said about the company’s track record for network design in large venues. “We have proven formulas which we reliably implement.”

Solid speed tests throughout the venue

At 10:20 a.m. local time, just ahead of the early 11 a.m. kickoff, Mobile Sports Report started our testing inside the main-level concourse, where fans were already lining up to purchase cold beer, another first at the stadium this past season. In the midst of the entering crowds we got a speedtest of 55.9 Mbps on the download side and 43.7 Mbps on the upload side, an inkling of the strong tests we were to see everywhere we walked. In the concourses and near concession stands, a mix of overhead and wall-mounted APs provided coverage.

Up in the stands, we took our first test among the railing-mounted enclosures in section 6, row 51, just about at the 50-yard line. We got a mark of 68.2 Mbps / 58.7 Mbps before the stands were completely full. We then hiked up to row 67, which was underneath the press box overhang and served by overhead APs, not railing enclosures. There we got a speedtest of 27.8 Mbps / 49.5 Mbps, a half hour before kickoff.

One more speedtest in the lower bowl (around the 30-yard line, in row 19) netted a mark of 68.9 Mbps / 61.2 Mbps; then as we walked around to the south end zone, we got a mark of 38.7 Mbps / 64.3 Mbps in the south concourse, busy with fans getting food and drink ahead of the imminent kickoff.

The recently renovated south end of the stadium has a series of loge boxes and other premium seating options, and has an overhang which provides additional real estate for Wi-Fi AP mounting options. Ducking into a loge box (covered by overhead APs) for a quick test we got a mark of 36.8 Mbps / 54.2 Mbps just before kickoff. Moving around to the corner of the south stands for the pregame ceremonies we got a mark of 33.7 Mbps / 63.8 Mbps even as all the phones were out to capture the team run-on and school song rendition. After kickoff, we went into the crowded main east concourse and got a mark of 43.2 Mbps / 46.6 Mbps amidst all the late-arrivers.

Good coverage in the stairwells

Wi-Fi antennas in an overhang deployment

If there is one area where stadiums sometimes skimp on wireless coverage it’s in the stairwells and pedestrian ramps, which may not seem like an important place to have connectivity. But at Oklahoma, the multiple switchbacks it takes to climb from ground level to the top seating areas are all well covered with Wi-Fi, as we got a mark of 39.9 Mbps / 29.5 Mbps during a brief rest stop on our hike to the top of the east stands.

At a concession stand on the top-level concourse we got a mark of 61.3 Mbps / 70.5 Mbps, as we admired the neatness of the core drilling we could see that got the cabling to the underside of the seating areas above. In the stands we got a mark of 57.5 Mbps / 69.5 Mbps at one of the highest rows in the stadium, row 24 of section 226, a half hour after the game’s start.
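Taken together, our eleven readings from the walk-through sketch out the network’s range; here is a quick summary using the figures quoted above:

```python
from statistics import median

# (download, upload) pairs in Mbps, as quoted in the walk-through above
tests = [(55.9, 43.7), (68.2, 58.7), (27.8, 49.5), (68.9, 61.2),
         (38.7, 64.3), (36.8, 54.2), (33.7, 63.8), (43.2, 46.6),
         (39.9, 29.5), (61.3, 70.5), (57.5, 69.5)]

downs, ups = zip(*tests)
print(f"download: min {min(downs)} / median {median(downs)} / max {max(downs)} Mbps")
print(f"upload:   min {min(ups)} / median {median(ups)} / max {max(ups)} Mbps")
```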

According to Payne, our visit coincided with the first live game with the Wi-Fi 6 software fully turned on, part of a rolling start to the network deployment, which wasn’t fully live at the first game on Aug. 31.

“It wasn’t without some hiccups and headaches,” said Payne of the overall deployment, which included a small number of temporary black-colored handrail enclosures from AmpThink, whose single source of handrail molding material ran out of supply late in the summer. According to Payne, Oklahoma started the season with 966 radios working on the network, ramping up with more at each home game until reaching full capacity later in the season. By the time of our visit, AmpThink had replaced the black enclosures with the standard silver ones.

Oklahoma also experienced what other venues deploying Wi-Fi 6 may find – that some of the very oldest devices still in use may have issues connecting to the Wi-Fi 6 equipment. Payne said one such problem surfaced in the press box (where reporters were using older laptops), but it was solved by creating some virtual APs tuned to an older version of the Wi-Fi standard.

Oklahoma fans during pregame ceremonies

OU also didn’t widely promote the network early in the season, but by the Oct. 19 home game against West Virginia, not only was the school promoting the network on the stadium’s big video boards, but the IT team had also added the ability for students to automatically join the stadium network via the regular WiFi@OU SSID used around campus.

With 82,620 in attendance for the West Virginia game, the total number of Wi-Fi users took a big jump from the previous high, with 25,079 unique connections, according to numbers provided by Payne. When Iowa State came to Norman on Nov. 9, the network saw its highest usage with 32,673 unique users, who used approximately 4.2 terabytes of data while in the stadium.

What was also interesting to Payne was the number of devices connected using the Wi-Fi 6 standard, which currently is only supported by a small number of phones. Payne noted that the first week OU had the Wi-Fi 6 working in the stadium was the same week Apple started delivery of its new iPhone 11 line, which includes support for the new Wi-Fi 6 standard. After seeing 941 devices connect on Wi-Fi 6 at the Texas Tech game, Payne said Oklahoma saw a steady increase of Wi-Fi 6 devices at each following home game, with 1,471 at the West Virginia game and 2,170 at the Iowa State game.
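Using the counts Payne shared (unique-user totals weren’t quoted for the Texas Tech game, so it is omitted here), the Wi-Fi 6 share of connected devices can be tracked game over game:

```python
# Game-by-game counts provided by Oklahoma's David Payne
games = {
    "West Virginia (Oct. 19)": {"unique_clients": 25_079, "wifi6_clients": 1_471},
    "Iowa State (Nov. 9)":     {"unique_clients": 32_673, "wifi6_clients": 2_170},
}

for game, n in games.items():
    share = n["wifi6_clients"] / n["unique_clients"]
    print(f"{game}: {share:.1%} of unique clients connected over Wi-Fi 6")
```

By that math, Wi-Fi 6 devices grew from roughly 5.9 percent of unique clients at the West Virginia game to about 6.6 percent three weeks later.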

Is AX coming ‘sooner’… rather than later?

Though most consumer handsets in use today do not support the Wi-Fi 6 standard, Apple’s decision to include Wi-Fi 6 support in its latest iPhone 11 line, along with Wi-Fi 6 support in new Android phone models, suggests that device support for the standard may be coming sooner rather than later to the fans in the stands. When that happens, Oklahoma’s network will be among the first to make use of the new standard’s ability to support more clients at higher connection speeds, critical features for big networks in small places like football stadiums.

The not-insignificant number of AX devices already seen by the stadium network, Payne said, felt like good justification of the school’s decision to commit to Wi-Fi 6. Also interesting to Payne was some later analysis of the network, which showed Wi-Fi 6 clients using nearly 10 times the data per client of older Wi-Fi 5 devices.

Looking ahead to next season, Payne said he will be working with school network officials to see how to more closely tie the stadium network with the overall campus wireless infrastructure, and to see how the school might be able to incorporate a stadium app or web-based sites to increase the ability of the network to improve the fan experience. Currently Oklahoma uses a portal from AmpThink to get email addresses from network guests, which Payne said will be used by marketing and ticketing departments to try to increase engagement.

The good news, Payne said, is that “we are no longer looking at what it costs to put a network in place” to drive any new digital experience ideas.

For Oklahoma athletics director Joe Castiglione, it was important for the school to deliver an amenity that provided a consistent fan experience whether a fan was in a suite or in the upper deck, a goal our tests seem to have validated.

“We feel that the Oklahoma tradition is among the strongest in the nation and really want to provide a top-notch fan experience to celebrate that tradition,” Castiglione said. “Wi-Fi is just the beginning of enhancing that experience. We hope to be able to use it to engage our fans through in-venue activations and experiences that would not be available without the addition of Wi-Fi.”

The scoreboard touts the new Wi-Fi network (credit this photo: University of Oklahoma)

A panoramic view of the stadium


Wi-Fi enclosure above a concessions stand

‘Best of Breed’ wireless drives Chase Center experience

An under-seat Wi-Fi AP enclosure at Chase Center, foreground, with a DAS enclosure visible to the left. Credit all photos (except where otherwise noted): Paul Kapustka, MSR

As stunning as Chase Center is visually, what you can’t see is equally powerful in adding to the fan experience. Namely, the wireless networks, and the gear that supports the connectivity.

Inside the shiny new home of the NBA’s Golden State Warriors, which sits on the edge of the San Francisco Bay, is a cellular DAS deployment from Verizon using Corning gear that may be the new forward-thinking model for cellular infrastructure for large public venues like stadiums and arenas. The 18,000-seat arena also has a Wi-Fi network using gear from Aruba, a Hewlett Packard Enterprise company, which supports the emerging Wi-Fi 6 standard for communications inside the main seating bowl.

But if you’re attending a Warriors game, or one of the many concerts scheduled at Chase Center, you may not ever see the equipment that brings the world-class connectivity to the fans. Both the DAS and the Wi-Fi networks utilize an under-seat antenna deployment method, just part of an aesthetic plan that does its best to minimize the visual impact of antennas and other wireless gear. Even deeper into the building is all the optical fiber supporting the networks, with capacity for future needs already in place.

During a mid-October 2019 visit before all the networks were fully tuned, Mobile Sports Report still got strong test results from both Wi-Fi and DAS networks in most areas in and around the arena, clear confirmation that the Warriors’ goal of having excellent wireless connectivity at their new home was right on track. And with the Corning ONE system behind a DAS design built from the ground up with future needs in mind, as well as the expected capacity gains coming from Wi-Fi 6, the Warriors and their partners are confident they’ve built a wireless system worthy of their world-class venue goals.

“We feel extremely proud” of the venue’s wireless systems, said Brian Fulmer, director of information technology for the Golden State Warriors. Though the inevitable construction delays led to some late nights heading up to the arena’s Sept. 6, 2019 public debut, according to Fulmer all wireless systems were fully online for the opening Metallica concert, where the arena saw 2.58 terabytes of data used on the Wi-Fi network with another 2.69 TB used at another Metallica show a couple days later.

“It was a race to the finish line but we did it, and the performance speaks for itself,” said Fulmer.

Searching for ‘Best in Breed’

Editor’s note: This profile is from our latest STADIUM TECH REPORT, which is available to read instantly online or as a free PDF download! Inside the issue are profiles of the new Wi-Fi deployment at the University of Oklahoma, as well as profiles of wireless deployments at Fiserv Forum and the University of Florida! Start reading the issue now online or download a free copy!

If there was ever a chance to build the best-ever new arena, Chase Center was probably a once-in-a-lifetime opportunity. When you combine the championship run of the team on the court with a devoted fan base centered in one of the hottest economic markets ever, you have the liberty to search for quality instead of bargains on every level.

A Wi-Fi AP hovers over a concourse gathering area.

(Case in point: The Warriors were able to sell out of their new court-level luxury suites, which have rooms just under the stands that include private wine lockers and can cost up to $2 million per year. Clearly, this is a model that may not work in places that aren’t Silicon Valley.)

For the privately financed $1.4 billion building, the Warriors turned to consulting firm Accenture to help determine the “best in breed” technology partners, especially on the wireless front. Several Warriors executives interviewed for this story all agreed on one main point: The team was not trying to install any technology to win imaginary awards for being the best or fastest building out there. Instead, it was all about how technology, especially wireless, could help bring about a world-class experience during every visit.

“Nobody shows up [at an arena] just looking for fast wireless speeds,” said Mike Kitts, the Warriors’ senior vice president for partnerships. “They want to interact. We wanted to create unforgettable experiences in an engaging environment. With the end in mind of a world-class experience, we knew great technology would absolutely play a role.”

Like a team drafting top players, the Warriors ended up choosing Verizon to lead the distributed antenna system (DAS) for cellular wireless, and Aruba for Wi-Fi. To build its neutral-host system, Verizon chose Corning and the Corning ONE platform, with an installation led by Communication Technology Services (CTS).

“We certainly leveraged the expertise of Verizon, as well as AT&T (which is also on the DAS as a client),” said Fulmer. “They’ve done this countless times, and they have the lessons learned of painful experiences.”

Building a DAS that can handle growth

Anyone in the stadium business in Northern California doesn’t have to look too far or remember too long ago to recall one such example of the pain that the nonstop growth in cellular demand can cause. After the San Francisco 49ers’ brand-new home, Levi’s Stadium, opened in 2014, the also brand-new DAS had to be upgraded the very next season to ensure it had enough capacity for the upcoming Super Bowl 50. Verizon, which basically invented under-seat DAS antennas for that deployment, said it had a goal at Chase Center to build a DAS that didn’t need upgrading for at least a few years.

A Wi-Fi AP painted to blend into the outside facade.

Terry Vance, senior manager for Verizon’s Pacific market network performance group, said “the plan from day 1 was to build a DAS with capacity for today and tomorrow. We needed to build this DAS so that for the next 3 to 4 years, we won’t have to touch it.”

Verizon also had to build the DAS in a way that complied with the Warriors’ stringent requirements for clear sight lines, especially in the main bowl seating area. According to the Warriors’ Fulmer, the team “looked at handrail [enclosure] designs,” but rejected them in favor of an under-seat approach. Though more costly in both equipment and construction, the under-seat approach was Verizon’s favored method as well to get more density in the arena.

What Verizon ended up with was a design that currently uses 71 active sectors, with 42 of those in the seating bowl. According to Vance, all the sectors in the bowl area can basically be split into two parts if needed, for a total of 84 potential bowl sectors. Currently, Vance said there are 598 under-seat DAS antennas in use.
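The headroom in that design is easy to see with some rough arithmetic; the per-sector seat counts below are simple averages (real sectors vary in size), using the arena’s approximately 18,000-seat bowl:

```python
BOWL_SEATS = 18_000        # approximate Chase Center bowl capacity
bowl_sectors_now, bowl_sectors_max = 42, 84  # each bowl sector can split in two

for sectors in (bowl_sectors_now, bowl_sectors_max):
    print(f"{sectors} bowl sectors -> ~{BOWL_SEATS // sectors} seats per sector")
```

Splitting every bowl sector would roughly halve the number of seats contending for each slice of cellular capacity.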

According to Vance, the Corning ONE system’s extensive use of optical fiber makes it easier to add capacity to the system as needed.

“The fiber to the edge [in the Corning system] is especially useful as you go to 5G,” Vance said. Though it’s not part of the shared DAS system, Verizon also has full 5G bowl coverage at Chase Center, one of the first arena deployments in California. Verizon also is using a couple of MatSing ball antennas, mounted in the rafters to provide cellular coverage to the floor area for concerts and other non-basketball events.

Right now AT&T is the only other carrier on the DAS, with participation from T-Mobile and/or Sprint pending the outcome of those two companies’ potential merger.

A Verizon 5G speedtest. Credit: Verizon

Jessica Koch, sports and entertainment director of business development for Corning Optical Communications, praised integrator CTS for its deployment know-how, which she said was “critical to the success of this project.” Corning, Koch said, knows that for fans in large venues like Chase Center, “reliable connectivity without restriction – all the time, at full speed, on any device, from anywhere – has become the expectation in our connected world.”

For Warriors president and COO Rick Welts, the best wireless system is one fans don’t see or worry about, but just use without concern.

“The best thing is if the phone just works, and I don’t have to think about it,” said Welts, who led a stadium tour during MSR’s October visit.

Though Verizon said the system went through some necessary optimization during the hectic early events schedule at Chase Center, Verizon engineers in December were getting DAS speed tests in excess of 100 Mbps on download links in most locations, according to Philip French, vice president of network engineering for Verizon. Download speeds for 5G connections, he said, are breaking the 1 Gbps mark.

“This DAS is unique since it was the first one we’ve built with 5G in mind from the ground up,” French said. “It’s a very robust design, and for us this is the design of the future.”

Leading the way with Wi-Fi 6

Like several other stadiums that were being finished this past summer, Chase Center was able to take advantage of the release of Wi-Fi equipment that supports the emerging Wi-Fi 6 standard. Though all the new capabilities won’t be fully realized until most end-user devices also support the new version of Wi-Fi, having support for the technology inside the arena was key for the Warriors’ plans.

“You can never really be ‘future proofed’ but we were extremely fortunate with the timing [of Wi-Fi 6 gear arriving],” said the Warriors’ Fulmer. “We were right in the sweet spot for an initial deployment.”

Wi-Fi and DAS gear on the catwalk.

According to Aruba, Chase Center has approximately 250 Aruba 500 Series APs (which support Wi-Fi 6) deployed in the main seating bowl, mostly in under-seat enclosures. Overall, there are approximately 852 total APs used in the full Chase Center network, which includes coverage inside the building as well as in the connected outdoor plaza areas.

During our October visit, MSR got Wi-Fi speedtests of 27.3 Mbps on the download side and 18.2 Mbps on the upload side while standing outside the east entry doors near the big mirror balls that are selfie central for fans visiting the new arena. Inside the doors, our speedtest in the lobby got a mark of 55.8 Mbps / 68.6 Mbps.

On one upper concourse area, near several concession stands outside portal 57, we got a speedtest of 10.5 Mbps / 11.2 Mbps. In the seats in upper section 220 just before tipoff we got a mark of 46.0 Mbps / 28.0 Mbps, and in a lower-bowl concourse area outside portal 9 we got a test mark of 53.7 Mbps / 71.5 Mbps.

According to Aruba, several events other than the Metallica concerts have passed the 2 TB Wi-Fi data mark so far, with several events seeing more than 8,000 unique clients connected and marks of 6,000+ concurrent connected devices and 2.6 Gbps of throughput.

The Warriors’ Fulmer praised not just the Wi-Fi gear but the full “end to end network solutions” available from Aruba as well as from parent Hewlett Packard Enterprise, which is a founding partner at Chase Center.

“We’re still only three months in, and there’s a lot more that we want to do,” Fulmer said. “It was not a small undertaking. But I think we can let the technology speak for itself.”

Fiserv Forum’s wireless networks ready for the Democratic Convention

Milwaukee’s Fiserv Forum, home of the NBA’s Milwaukee Bucks and also the locale for this summer’s Democratic Convention. Credit all photos: Paul Kapustka, MSR

With one of the most demanding arena-sized events headed its way this upcoming summer, the wireless networks at Milwaukee’s Fiserv Forum appear to be more than ready to handle any audience demand for mobile connectivity.

With a full-featured distributed antenna system (DAS) deployed and operated by ExteNet Systems using gear from JMA Wireless, as well as a Wi-Fi network using Cisco gear, Fiserv Forum shows both the expertise of wireless providers with a long history of knowing what works and the foresight to add new techniques and technologies, combining high performance with the quality aesthetics that are the hallmark of the new home of the NBA’s Milwaukee Bucks.

And while a Mobile Sports Report visit this past fall for a Bucks game found all the wireless elements in top working order, the big event for the venue’s second year of operation will be the Democratic National Convention in July 2020. While the four-day nomination gathering is a test for any locale, Fiserv Forum’s forethought on how to prepare for numerous types of events in and around its uniquely designed structure has it well prepared to handle whatever wireless needs the convention will require.

It all starts with the DAS

Editor’s note: This profile is from our latest STADIUM TECH REPORT, which is available to read instantly online or as a free PDF download! Inside the issue are profiles of the new Wi-Fi deployment at the University of Oklahoma, as well as profiles of wireless deployments at Chase Center and the University of Florida! Start reading the issue now online or download a free copy!

Even in these days of predictions of the death of DAS, Fiserv Forum is proof that for high-profile venues, carriers will still participate in a quality deployment. And while many venues have just two or three cellular providers on their DAS, according to ExteNet, the Fiserv Forum DAS has five major carriers participating — AT&T, Verizon, T-Mobile, Sprint and U.S. Cellular.

Wi-Fi AP on an outdoor plaza light pole

Unlike at some new arenas, where wireless is an afterthought to construction, at Fiserv Forum ExteNet was involved early on, according to Manish Matta, vice president of marketing at ExteNet.

“Getting in sooner rather than later is always better,” said Matta, who said ExteNet was well involved in the overall construction plans, ensuring that there were no delays associated with wireless deployments holding up construction of other parts of the building.

During a pregame tour in October with a team from ExteNet as well as with Robert Cordova, chief technology and strategy officer for the Bucks, Mobile Sports Report got an up-close look at some of the inside parts of the DAS network design, including the headend room and multiple antenna installations that were hard to find given their well-designed placements and camouflaging.

In addition to regular enclosures that were painted or otherwise placed in areas out of the main sight lines, ExteNet and JMA also utilized some of the newer circular flat-panel antenna enclosures that fit flush to ceilings, minimizing the exposure.

The 215 DAS antennas are powered by 40 remote units. According to JMA, the remotes are connected to the backbone with optical fiber, and use digital power to bring power to elements up to a mile away. With 16 sectors in the bowl design, the DAS is able to segment coverage to all parts of the arena, including the bowl as well as concourses and other in-house areas.

DAS antenna in a concourse location

ExteNet, which owns and operates the DAS as a neutral host, also installed 10 extra MatSing ball antennas in the rafters for additional top-down coverage. Though only AT&T is using the MatSings right now, ExteNet said they are integrated into the DAS design if other carriers should wish to utilize them in the future.

During a short walk-around before the Bucks game started, MSR got a DAS speedtest of 85.8 Mbps on the download and 14.9 Mbps on the upload, even though our older iPhone (on the Verizon network) doesn’t support all the latest DAS capabilities. Near the start of the game, as the pregame introductions were at their peak, we got a DAS mark of 18.0 Mbps / 15.7 Mbps in the middle of an upper-deck seating area (Section 227) and then a little bit after the game started, we got a mark of 21.3 Mbps / 12.5 Mbps near a bar area on the upper-level concourse.

Wi-Fi inside and out

On the Wi-Fi side of things, a visitor to Fiserv Forum can connect to the network even before coming in the doors: the 623-AP Cisco installation includes Wi-Fi APs mounted on light poles in the “Deer District,” the plaza area on the stadium’s east side that connects to an outdoor beer garden and several bars and restaurants, all part of the planned environment built in sync with the arena’s opening.

Before we went inside, we got a Wi-Fi speedtest of 40.5 Mbps / 40.2 Mbps in the middle of the Deer District plaza, which was hosting a pop-up haunted house attraction sponsored by Jack Daniels.

Inside the building, we again needed some guidance from the Bucks’ Cordova to locate some of the Wi-Fi APs, which are inside triangular enclosures that are either painted to match wall surfaces, or utilized as high-visibility section number signs, hiding the gear in plain sight.

Wi-Fi AP blended in to the wall covering

In the seating bowl, Fiserv Forum again shows its commitment to aesthetics with the smallest handrail enclosures we’ve ever seen: a discreet hand-sized enclosure that tucks the antenna components neatly into the top part of a railing, with the AP electronics hidden below the seating areas. Designed by integrator Johnson Controls and its ecosystem partners, Abaxent and AccelTex, the 28 special enclosures are also built to be easy to detach and re-attach (via something Johnson Controls calls a simple two-click “dart connector”), which helps keep the network working when the lower-bowl seating areas need to be reconfigured for different events.

Sitting in a courtside seat near one of the handrail enclosures about 20 minutes before tipoff, we got a Wi-Fi speedtest mark of 15.8 Mbps / 33.2 Mbps. On the main concourse just after the game’s start we got a Wi-Fi mark of 28.6 Mbps / 60.4 Mbps, and later on at that same upper-concourse bar we got a mark of 39.9 Mbps / 61.1 Mbps.

Later on during the second quarter of the game, we watched another fan in our lower-bowl seating area spend most of the period keeping one eye on Monday Night Football streaming on his phone. “The Wi-Fi is really good here,” he noted.

Looking ahead to CBRS and 5G

As ExteNet and JMA prepare for the onslaught of the convention’s needs, the Bucks are already looking farther ahead in many areas, to future communications improvements including 5G millimeter wave deployments and a possible introduction of CBRS services. Cordova, an advocate of the capabilities of private LTE networks over CBRS spectrum, said the flexibility of provisioning services in a CBRS environment could be extremely useful for temporary needs, like during last year’s NBA playoffs, when the NBA on TNT crew set up a temporary stage out in the plaza.

The Bucks have already prepared for connectivity of all sorts out on the plaza space: from the top-level outdoor Panorama deck at Fiserv Forum, which lets fans look out over the city, Cordova pointed out several metal boxes in the plaza with home-run fiber connections for broadcast TV as well as remote power. Still, there will be all sorts of temporary connectivity needs when the convention media tents set up in the empty lot next door, where the previous arena, the Bradley Center, used to stand.

The fact that the Bucks and ExteNet were already well into planning for a July event the previous October is just another sign of a networking operation that is well positioned now and already thinking about the next necessary steps.

Robert Cordova, chief technology and strategy officer for the Bucks, in the headend room

MatSing ball antennas point down from the rafters

The Daktronics centerhung video board
