Top-down approach brings Wi-Fi to OKC Thunder’s Chesapeake Energy Arena

Chesapeake Energy Arena, home of the NBA’s Thunder. Credit all photos: Oklahoma City Thunder

If there’s one sure thing about stadium Wi-Fi deployments, it’s that pretty much no two networks are ever exactly the same. So even as there is a growing large-venue trend for putting Wi-Fi access points under seats or in handrails, sometimes the traditional top-down method is still the one that works best.

Such was the case for the first full fan-facing Wi-Fi network at Chesapeake Energy Arena in Oklahoma City, home of the NBA’s Thunder. With a large amount of retractable seating in the 18,000-seat venue, an under-seat approach to Wi-Fi would prove too costly and disruptive, leading the team to look for connectivity from above.

While a solid in-building cellular distributed antenna system (DAS) had done a good job of keeping fans connected the last few years, the team’s desire for more mobile insight into fan activity as well as a switch to a Wi-Fi-centric point-of-sale system led Oklahoma City to finally install fan-facing Wi-Fi throughout the venue.

Chris Nelson, manager of information technology for venue manager SMG, and Tyler Lane, director of technology for the Thunder, spoke with Mobile Sports Report about the recent Wi-Fi deployment at Chesapeake Energy Arena, which went live during the most recent NBA season.

An AP placement in the rafters

Though the venue looked at all options, Nelson said that going under-seat with APs would have been “very costly” to do, given the large number of retractable seats in the arena.

“We wanted to hang them [APs] from the top if we could,” Nelson said.

After testing the top equipment brands available, the Thunder settled on Ruckus gear for what they said was a simple reason, one involving the 96 feet of air space from the catwalk to the arena floor.

“Ruckus was the only one whose gear could reach down all the way,” Nelson said.

Adding to the fan experience

Editor’s note: This report is from our latest STADIUM TECH REPORT, an in-depth look at successful deployments of stadium technology. Included with this report is a profile of the new Wi-Fi network at Allianz Field in St. Paul, Minn., and an in-depth research report on the new Wi-Fi 6 standard! DOWNLOAD YOUR FREE COPY now!

According to the team the deployment saw 410 total APs used, with 350 in the arena proper and another 60 deployed across the street at the Cox Convention Center. According to the Thunder’s Lane, the team rolled out the service slowly at first, with some targeted testing and feedback from season ticket holders.

Close-up of an AP placement

“We got some good feedback and then when we went to a full rollout we had signage in the concourses, communications via ticketing services and announcements over the PA and on the scoreboard,” to tell fans about the system, said Lane.

According to statistics provided by the team, the Wi-Fi was getting good traction as the season went on, with a March 16 game vs. the Golden State Warriors seeing 589.3 gigabytes of traffic, from 2,738 clients that connected to the network. Lane said the team employed Jeremy Roach and his Rectitude 369 firm to assist with the network design; Roach in the past helped design networks at Levi’s Stadium and Sacramento’s Golden 1 Center.

Now that the Wi-Fi network is in place, Lane said the Thunder is starting to increase the ways it can add to the fan experience via digital means, including app-based features like showing press conferences live and by having an artificial intelligence chatbot to help provide fans with arena information.

“It’s really all about enhancing the fan experience,” Lane said, with an emphasis on driving digital ticketing use in the YinzCam-developed team app. Lane said that the system also drives a lot of mobile concessions traffic, and added that “Ruckus did a fantastic job of asking all the right questions for our food and beverage partners.”

Can virtualization help venues meet growing mobile capacity demands?

By Josh Adelson, director, Portfolio Marketing, CommScope

U.S. mobile operators reported a combined 50 terabytes of cellular traffic during the 2018 Super Bowl, nearly doubling over the previous year. In fact, Super Bowl data consumption has doubled every year for at least the past 6 years and it shows no sign of slowing down.

Clearly, fans love their LTE connections almost as much as they love their local team. Fans have the option for cellular or Wi-Fi, but cellular is the default connection whereas Wi-Fi requires a manual connection step that many users may not bother with.[1] The same dynamic is playing out on a smaller scale in every event venue and commercial building.

Whether you are a venue owner, part of the team organization or in the media, this heightened connectivity represents an opportunity to connect more with fans, and to expand your audience to the fans’ own social connections beyond the venue walls.

But keeping up with the demand is also a challenge. High capacity can come at a high cost, and these systems also require significant real estate for head-end equipment. Can you please your fans and leverage their connectedness while keeping equipment and deployment costs from breaking the capex scoreboard?

Virtualization and C-RAN to the rescue?

Editor’s note: This post is part of Mobile Sports Report’s new Voices of the Industry feature, in which industry representatives submit articles, commentary or other information to share with the greater stadium technology marketplace. These are NOT paid advertisements, or infomercials. See our explanation of the feature to understand how it works.

Enterprise IT departments long ago learned that centralizing and virtualizing their computing infrastructures is a way to grow capacity while reducing equipment cost and space requirements. Can sports and entertainment venues achieve the same by virtualizing their in-building wireless infrastructures? To answer this question, let’s first review the concepts and how they apply to wireless infrastructure.

In the IT domain, virtualization refers to replacing multiple task-specific servers with a centralized resource pool that can be dynamically assigned to a given task on demand. Underlying this concept is the premise that, while each application has its own resource needs, at any given time only a subset will be active, so the total shared resource can be less than the sum of the individual requirements.
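To make that premise concrete, here is a small Python sketch with invented demand numbers: it compares dedicated provisioning, where each application (or zone) is sized for its own peak, against a shared pool sized for the peak of the combined load. The traces and parameters are illustrative only.

```python
import random

random.seed(42)

# Invented example: 8 zones, each with bursty demand that peaks at 100
# capacity units but sits between 5 and 30 units most of the time.
def demand_trace(steps=1000):
    return [100 if random.random() < 0.1 else random.randint(5, 30)
            for _ in range(steps)]

traces = [demand_trace() for _ in range(8)]

# Dedicated provisioning: size each zone for its own peak demand.
dedicated = sum(max(t) for t in traces)

# Pooled (virtualized) provisioning: size one shared pool for the peak
# of the combined load -- the zones rarely all burst at the same time.
pooled = max(sum(step) for step in zip(*traces))

print(f"dedicated: {dedicated} units, pooled: {pooled} units")
```

Because simultaneous bursts across all zones are rare, the pooled figure comes in well under the dedicated one, which is exactly the saving that centralized baseband pooling aims for.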

How does this translate to in-building wireless? Centralizing the base station function is known as C-RAN, which stands for centralized (or cloud) radio access network. C-RAN involves not only the physical pooling of the base stations into a single location — which is already the practice in most venues — but also digital architecture and software intelligence to allocate baseband capacity to different parts of the building in response to shifting demand.

C-RAN brings immediate benefits to a large venue in-building wireless deployment. The ability to allocate capacity across the venue via software rather than hardware adds flexibility and ease of operation. This is especially important in multi-building venues that include not only a stadium or arena but also surrounding administrative buildings, retail spaces, restaurants, hotels and transportation hubs. As capacity needs shift between the spaces by time of day or day of week, you need a system that can “point” the capacity to the necessary hot spots.

C-RAN can even go a step further to remove the head-end from the building campus altogether. Mobile network operators are increasingly deploying base stations in distributed locations known as C-RAN hubs. If there is a C-RAN hub close to the target building, then the in-building system can take a signal directly from the hub, via a fiber connection. Even if the operator needs to add capacity to the hub for this building, this arrangement gives them the flexibility to use that capacity in other parts of their network when it’s not needed at the building. It also simplifies maintenance and support as it keeps the base station equipment within the operator’s facilities.

For the building owner, this architecture can reduce the footprint of the on-campus head-end by as much as 90 percent. Once the baseband resources are centralized, the next logical step is to virtualize them into software running on PC server platforms. As it turns out, this is not so simple. Mobile baseband processing is a real-time, compute-intensive function that today runs on embedded processors in specialized hardware platforms. A lot of progress is being made toward virtualization onto more standard computers, but as of today, most mobile base stations are still purpose-built.

Perhaps more important for stadium owners is the fact that the base station (also called the signal source) is selected and usually owned by the mobile network operator. Therefore the form it takes has at most only an indirect effect on the economics for the venue. And whether the signal source is virtual or physical, the signal still must be distributed by a physical network of cables, radios and antennas.

The interface between the signal source and the distribution network provides another opportunity for savings. The Common Public Radio Interface (CPRI) establishes a digital interface that reduces space and power requirements while allowing the distribution network — the DAS — to take advantage of the intelligence in the base station. To leverage these advantages, the DAS also needs to be digital.

To illustrate this, consider the head-end for a 12-MIMO-sector configuration with 4 MIMO bands per sector, as shown below. In this configuration a typical analog DAS is compared with a digital C-RAN antenna system, with and without a CPRI baseband interface. In the analog systems, points of interface (POIs) are needed to condition the signals from the different sources to an equal level before they are combined and converted to optical signals via an e/o (electrical to optical) transceiver. In a digital system, signal conditioning and electrical-to-digital conversion are integrated into a single card, providing significant space savings.

* A typical analog system requires 8 POIs (for 4 MIMO bands per sector) and 2 e/o transceivers per sector, or 10 cards per sector. A typical subrack (chassis) supports up to 10-12 cards, so one subrack supports 1 MIMO sector. For 12 MIMO sectors, 12 subracks are needed, each typically 4 rack units high, for a total of 48 rack units.

* For a digital system[2] without CPRI, each subrack supports 32 SISO ports. Each MIMO sector with 4 MIMO bands requires 8 ports, so one subrack supports 4 MIMO sectors. For 12 MIMO sectors, 3 subracks of 5 rack units each are needed, for a total of 15 rack units.

* For a digital system with CPRI, each subrack supports 48 MIMO ports. Each MIMO sector with 4 MIMO bands requires 4 ports, so one subrack supports all 12 MIMO sectors. For 12 MIMO sectors, only 1 subrack of 5 rack units is needed, for a total of 5 rack units.
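The three head-end sizings above reduce to simple arithmetic. The short Python sketch below reproduces them; the per-subrack port counts and rack-unit heights are taken from the text, and `head_end_rack_units` is just an illustrative helper, not a real dimensioning tool.

```python
import math

def head_end_rack_units(sectors, units_per_sector, units_per_subrack, subrack_ru):
    """Rack units needed, given how much one subrack holds."""
    subracks = math.ceil(sectors * units_per_sector / units_per_subrack)
    return subracks * subrack_ru

SECTORS = 12  # 12 MIMO sectors, 4 MIMO bands per sector

# Analog DAS: 10 cards per sector (8 POIs + 2 e/o transceivers),
# ~10 cards per 4 RU subrack, i.e. one subrack per sector.
analog_ru = head_end_rack_units(SECTORS, 10, 10, 4)

# Digital system, RF interface: 8 SISO ports per sector,
# 32 SISO ports per 5 RU subrack.
digital_rf_ru = head_end_rack_units(SECTORS, 8, 32, 5)

# Digital system with CPRI: 4 MIMO ports per sector,
# 48 MIMO ports per 5 RU subrack.
digital_cpri_ru = head_end_rack_units(SECTORS, 4, 48, 5)

print(analog_ru, digital_rf_ru, digital_cpri_ru)  # 48 15 5
```

Run with the figures from the text, the helper returns 48, 15 and 5 rack units respectively, matching the roughly 90 percent space reduction claimed for the CPRI case.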

One commercial example of this is Nokia’s collaboration with CommScope to offer a CPRI interface between Nokia’s AirScale base station and CommScope’s Era C-RAN antenna system. With this technology, a small interface card replaces an array of remote radio units, reducing space and power consumption in the C-RAN hub by up to 90 percent. This also provides a stepping-stone to future Open RAN interfaces when they become standardized and commercialized.

The Benefits in Action — A Case Study

Even without virtualization, the savings of digitization, C-RAN and CPRI at stadium scale are significant. The table below shows a recent design that CommScope created for a large stadium complex in the U.S. For this, we compared 3 alternatives: traditional analog DAS, a digital C-RAN antenna system with RF base station interfaces, and a C-RAN antenna system with CPRI interfaces. Digital C-RAN and CPRI produce a dramatic reduction in the space requirements, as the table below illustrates.

The amount of equipment is reduced because a digital system does in software what analog systems must do in hardware, and CPRI even further eliminates hardware. Energy savings are roughly proportional to the space savings, since both are a function of the amount of equipment required for the solution.

Fiber savings, shown in the table below, are similarly significant.

The amount of fiber is reduced because digitized signals can be encoded and transmitted more efficiently.

But these savings are only part of the story. This venue, like most, is used for different types of events — football games, concerts, trade shows and even monster truck rallies. Each type of event has its own unique traffic pattern and timing. With analog systems, re-sectoring to accommodate these changes literally requires on-site physical re-wiring of head-end units. But with a digital C-RAN based system it’s possible to re-sectorize from anywhere through a browser-based, drag and drop interface.

The Bottom Line

It’s a safe bet that mobile data demand will continue to grow. But the tools now exist to let you control whether this will be a burden, or an opportunity to realize new potential. C-RAN, virtualization and open RAN interfaces all have a role to play in making venue networks more deployable, flexible and affordable. By understanding what each one offers, you can make the best decisions for your network.

Josh Adelson is director of portfolio marketing at CommScope, where he is responsible for marketing the award-winning OneCell Cloud RAN small cell, Era C-RAN antenna system and ION-E distributed antenna system. He has over 20 years experience in mobile communications, content delivery and networking. Prior to joining CommScope, Josh has held product marketing and management positions with Airvana (acquired by CommScope in 2015) PeerApp, Brooktrout Technology and Motorola. He holds an MBA from the MIT Sloan School of Management and a BA from Brown University.

FOOTNOTES
1: 59 percent of users at the 2018 Super Bowl attached to the stadium Wi-Fi network.
2: Dimensioning for digital systems is based on CommScope Era.

CommScope buying Arris (and Ruckus) for $7.4 B

CommScope announced today its intent to acquire Arris International for $7.4 billion, a move that would also bring in Wi-Fi vendor Ruckus, an asset that could help turn CommScope into a single-stop shop for large public venue connectivity products and services.

Financial details are available in the official announcement, but what will be more interesting will be how CommScope now markets itself to large public venues, like stadiums, for connectivity solutions. Previously known for its cellular distributed antenna system (DAS) infrastructure and back-end operations, with Ruckus in its portfolio CommScope could now offer an end-to-end solution for stadiums including DAS and Wi-Fi and also for those interested in Citizens Broadband Radio Service (CBRS) services in the near future.

Though commercial deployments for CBRS systems have not yet emerged, Ruckus has been a leader in the testing and certification process and is expected to be at the forefront of CBRS hardware providers when systems are ready to come online, possibly sometime next year.

If you’re keeping score (like we are), this is the third time Ruckus has been acquired in less than three years. The list:

Feb. 22, 2017: Arris to acquire Ruckus from Brocade as part of $800 million deal

April 4, 2016: Brocade buys Ruckus for $1.2 B

That’s a lot of email changes and a closet full of new logowear…

AT&T beefs up ski resort reception with stealthy DAS

AT&T DAS antenna stand (right) near the American Eagle lift at Copper Mountain. Credit all photos: Paul Kapustka, MSR (click on any photo for a larger image)

In order to improve cellular reception at the Copper Mountain ski area, AT&T this winter installed a stealthy seven-antenna DAS in several base-area locations, including inside ski-lodge buildings and inside a rooftop cupola.

According to Quin Gelfand, a senior real estate and construction manager for AT&T’s Antenna Solutions Group, the mountain had previously been served only by a single macro tower located up near the slopes of the popular Colorado resort, which sits just off Interstate 70 between Frisco and Vail.

On heavy skier-visit days, Gelfand said, the macro tower had recently caused some “capacity concerns,” leading AT&T to design a DAS solution for the several base areas at Copper Mountain. Beyond being saturated by demand, Gelfand said the single macro site often didn’t provide strong signals inside buildings at the base areas.

“In a lot of areas around the resort, there were low bars for LTE,” Gelfand said.

AT&T’s Quin Gelfand shows off the main head end DAS gear rack.

But on Feb. 23 this year, that situation changed for AT&T cellular customers, as the DAS went live and immediately started moving lots of cellular traffic. By the time of our visit in early April, Gelfand said the DAS installation (which has the capacity equivalent of a single large macro tower) had already seen more than 7 terabytes of data moved, averaging about 175 GB per day. Like at many Colorado ski areas, March is a busy month at Copper with lots of spring break skiers and locals driving up on weekends from Denver.

Hiding antennas in a cupola

Brad Grohusky, senior IT manager for Copper Mountain, said AT&T approached the resort a couple of years ago to discuss the idea of a DAS. “When we had a dense population of guests, it was pretty easy to saturate a signal,” Grohusky said.

On weekends, Grohusky said Copper could often see as many as 10,000 guests, and might even see as many as 14,000 visitors on popular days or holidays. Wireless communications, he said, could get even more stress if the weather turned nasty or cold, driving more people inside buildings.

DAS antenna (upper top left) in Copper Station lodge

Starting from an existing telecom service room located in an underground garage, AT&T ran fiber this past offseason to three different antenna locations. The closest and most obvious is a three-antenna stand near the “Burning Stones” gathering area and the American Eagle chairlift base. As one of the resort’s main first chairs the American Eagle often has crowds at its base, and the Burning Stones area is a small clearing between the slopes and the base area buildings that is used often for concerts and other public gatherings.

“There was lots of digging last summer,” said Grohusky of the fiber-trenching effort, which gained some extra time thanks to a warmer-than-usual fall that kept the snow at bay. “We took advantage of that extra week,” Grohusky said.

If the American Eagle-area antennas are in plain sight, the two antennas at the Union Creek Schoolhouse base area to the west would be impossible to find if you didn’t know where they were. On the roof of one building, AT&T built custom-designed baffling for a rooftop cupola that completely hides the antennas while allowing cellular signals to pass through.

“You would never know the antennas were up there,” Grohusky said. “AT&T really accommodated our architecture there.”

Closer look at DAS tower near American Eagle lift

Back farther to the east, two more antennas were located at the top windows of the Copper Station lodge building, pointed outward to cover the lift base areas and the condos and other buildings in that area. According to Gelfand AT&T used Nokia RAN gear as well as Corning fiber equipment, CommScope cabling components and antennas from JMA Wireless in the deployment. The DAS is powered by a 100 Mbps fiber link from CenturyLink, and supports three cellular bands — 700 MHz, AWS and PCS, according to Gelfand.

Even though ski season is all but over, the network will still get use in the non-snowy months as Copper Mountain, like many Colorado resorts, has an active summer schedule of on-mountain activities. The resort also has a limited free public Wi-Fi network in certain base area buildings, including in and around the Starbucks location right next to the Burning Stones area. Grohusky said there are no current plans to expand the Wi-Fi, and also said that none of the other major cellular carriers are planning to add their own DAS deployments.

But for AT&T customers, Grohusky said connectivity is vastly improved. “The feedback has been great,” he said. “Connectivity used to be poor inside buildings, but now it’s great.”

Look back toward the Burning Stones gathering area, near American Eagle lift

Union Creek Schoolhouse building — cupola with AT&T antennas is the one closest to ski hill

JMA Wireless antenna mounted high up inside Copper Station lodge

CommScope gear inside the Copper Station node equipment room

Corning optical gear inside the Copper Station node equipment room

Copper Station lodge building (with DAS antennas) on far right, showing proximity to eastern base area

Stadium Tech Report: Carolina Panthers take ownership of DAS, Wi-Fi at Bank of America Stadium

James Hammond, director of IT for the Panthers, poses next to an under-seat Wi-Fi AP. Credit all photos: Carolina Panthers

“The fan is the most valuable member of our team,” Jerry Richardson, owner of the Carolina Panthers, is fond of saying.

And it’s become the virtual mission statement for the Charlotte, N.C.-based National Football League franchise. So even though its home field, Bank of America Stadium, was built relatively recently (1996), technology has come a long way in two decades. And as the Panthers began a four-phase renovation in 2014, they did it with fans’ MVP status in mind, according to James Hammond, director of IT for the Panthers. “It was time for some changes,” he said.

While Carolina was among the first NFL stadiums to install fan-facing Wi-Fi and enhanced cellular networks, the previous DAS and Wi-Fi systems weren’t keeping up with demand and that was starting to adversely impact the Panthers fan experience, Hammond said.

“We chose to perform a rip-and-replace on both DAS and Wi-Fi and take ownership in-house,” Hammond explained. Because the Panthers own and operate BofA Stadium, making those moves was a lot easier than if they were tenants.

Time for an upgrade

Editor’s note: This profile is part of our latest STADIUM TECHNOLOGY REPORT, which includes more stadium profiles as well as looks at Wi-Fi at the Mall of America, and analytics software being used by the Cleveland Browns. DOWNLOAD YOUR FREE COPY of the report today!

The first “fan-centric improvements,” as Hammond called them, came in 2014 in the form of escalators, video big boards and a distributed audio system. As part of the second phase of upgrades, the Panthers then used the 2015 offseason to renovate the club-level suites and tore out the old DAS system while they were at it. And after a careful evaluation of different DAS solutions, they shortlisted two vendors: CommScope and Corning.

CommScope ultimately got the nod; the Panthers then had to decide between the vendor’s ION-B and ION-U DAS systems. “We went with the ION-U, which was quite new and cutting edge at that point, since it had NEMA-rated remotes,” Hammond said. Other systems lacked that kind of weatherproofing and would require additional enclosures – and expense.

CommScope’s ION-U powers the new DAS at Bank of America Stadium.

“We started over with all new fiber and coax. We did the decommissioning and construction in 90 days, which was pretty quick for a ground-up project,” he said. Beam Wireless Inc. did the design, integration and optimization and is handling the ongoing maintenance of the DAS system; Optical Telecom installed the DAS gear. BofA Stadium now has 256 DAS remotes and more than 600 DAS indoor and outdoor antennas.

AT&T, Verizon and Sprint are the participating DAS carriers; T-Mobile is weighing whether to join the mix during the 2017 off-season.

The Panthers have divided BofA Stadium into 48 DAS zones: 16 zones for the upper bowl, 16 in the lower bowl, and another 16 for concourses, suites, clubs, and offices. Not all zones are used exclusively; carriers choose simulcast patterns that place multiple zones into sectors, and can change them as capacity requirements dictate, Hammond told Mobile Sports Report.

“With some minor design changes to the interior areas, we can accommodate nearly 70 zones,” he explained. “At present the most sectors in use by a carrier is 32. This means the carrier simulcasts across a mix of our 48 zones in order to match them up to 32 carrier sectors.”
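One hypothetical way to picture that zone-to-sector mapping is sketched below: 48 venue zones are folded onto 32 carrier sectors by simulcasting pairs of zones. The zone names and the round-robin grouping policy are invented for illustration; real simulcast patterns are chosen by each carrier based on observed traffic, not by this algorithm.

```python
# 48 invented zone labels matching the bowl/interior split described above.
ZONES = ([f"upper-{i}" for i in range(1, 17)]
         + [f"lower-{i}" for i in range(1, 17)]
         + [f"interior-{i}" for i in range(1, 17)])

def simulcast_plan(zones, carrier_sectors):
    """Greedy sketch: give each carrier sector one zone, then fold the
    leftover zones into existing sectors (simulcast) round-robin."""
    sectors = [[z] for z in zones[:carrier_sectors]]
    for i, zone in enumerate(zones[carrier_sectors:]):
        sectors[i % carrier_sectors].append(zone)
    return sectors

plan = simulcast_plan(ZONES, 32)
print(len(plan), "sectors covering", sum(len(s) for s in plan), "zones")
```

With 48 zones and 32 sectors, this toy plan ends up with 16 sectors simulcasting two zones each and 16 sectors carrying a single zone, which is the kind of many-to-fewer mapping the quote describes.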

Once the new DAS was built and the first couple of events were analyzed, carriers began asking for more frequencies and additional DAS sectors to continue meeting ever-growing demand. In response, the first round of DAS upgrades was implemented in the spring of 2016, Hammond said. During the 2015 season, DAS bandwidth was running around 2 GB during games. “With these latest DAS upgrades, we expect the bandwidth numbers to be even higher,” Hammond said.

A DAS remote in a NEMA-rated enclosure.

The impact of the new DAS system was felt immediately upon its debut in July 2015. “It was a much better experience for fans who noticed the improved cellular experience,” Hammond said. Another unexpected benefit: The upgraded DAS helped mitigate bottlenecks with the old Wi-Fi system, which Hammond characterized as “under-designed.”

Going under seat for Wi-Fi upgrade

Unfortunately, there wasn’t time to address any Wi-Fi upgrades before the 2015 football season began, but the Panthers issued an RFP for new Wi-Fi in August 2015 in preparation for Phase 3 renovations that would also include security upgrades and renovations to the upper concourse.

Interested vendors needed to ensure high bandwidth as well as high take-rates, with three different ISPs (Time Warner Cable/Charter, Level 3 Communications and Windstream) able to deliver in excess of 10 Gbps, though Hammond said they’re starting at a 7-Gbps threshold.

The Wi-Fi award went to Aruba, now HP Enterprise, in December 2015, and construction began in January 2016 after the last postseason game, when the Panthers beat the Arizona Cardinals to win the NFC championship and a trip to Super Bowl 50.

Similar to Levi’s Stadium and Dodger Stadium, the Panthers chose underseat AP enclosures; BofA Stadium sports 770 underseat AP enclosures in the upper and lower bowls out of a total of 1,225 APs, all to ensure maximum coverage and minimal dead spots. The Panthers selected AmpThink to do the Wi-Fi integration and construction; the turnkey contractor also designed and fabricated a custom enclosure for the APs.

Indoor access point inside the stadium.

One other innovation in the Panthers’ Wi-Fi installation is that the underseat enclosure is mounted to the riser — the vertical part of the step — but looks like it’s on the tread, the horizontal part, which is intended as a waterproofing measure. “The riser is easier to seal and isn’t affected by pressure washers, which you’re doing constantly with an outdoor stadium,” Hammond said. “And by running pipe through the riser, you don’t have gravity working against you,” which helps keep out water, he explained.

Panthers fans access the stadium Wi-Fi through a portal page after accepting the team’s terms and conditions. From there, they are whitelisted and can automatically join the Wi-Fi network for the rest of the season. Hammond said a fan’s email is requested but not required by the portal page, and there’s a small incentive offered to encourage fans’ email subscriptions.

The new Wi-Fi system got a workout with a soccer game at BofA Stadium at the end of July 2016, then with a Panthers’ Fan Fest the following week. “All the indicators were good, and fan feedback about the system was excellent,” Hammond said. But he cautioned that the two events were not “full bowl” events with smaller attendance numbers (~50,000) than a regular season football game (75,000+). “We will continue to optimize and tune settings as we learn more during events with higher attendance,” Hammond said.

The technology upgrades total about $16 million so far: the DAS build-out was just under $10 million, and Wi-Fi was a little more than $6 million, which included additional wired infrastructure, according to the team.

Beacons coming next

And the Panthers aren’t done making technology improvements to their stadium. Phase 4 looks to add Bluetooth beacons and do some refinement of the Panthers app. “My goal during the upcoming season is to look at options for location-aware services,” Hammond said. Some APs have beacons built in; others may need to be added to get the granularity the Panthers want for location awareness.

Hammond also wants to give fans more things to do with the Panthers app and also optimize it for push notifications, even with something as basic and useful as restroom and concessions location information. “As we learn more about fans individually, we can direct them to things of particular interest to them,” he added.

“So far, we are very pleased with the performance of the Wi-Fi and DAS systems,” Hammond said, noting the Panthers will continue to tune frequencies, add zones and increase bandwidth where needed. It’s the sort of attention that smart sporting franchises pay to their most valued team members.

New Report: Carolina Panthers build new Wi-Fi and DAS; Mercedes-Benz Stadium update, and more!

Mobile Sports Report is pleased to announce the Q3 issue of our STADIUM TECH REPORT series, the ONLY in-depth publication created specifically for the stadium technology professional and the stadium technology marketplace.

In addition to our historical in-depth profiles of successful stadium technology deployments, our Q3 issue for 2016 has additional news and analysis, including a look at Wi-Fi analytics at the Mall of America, and a story about how the Cleveland Browns found $1 million in ROI using new analytics software from YinzCam. Download your FREE copy today!

Inside the report our editorial coverage also includes:

— Bank of America Stadium profile: An in-depth look at the Carolina Panthers’ decision to bring new Wi-Fi and DAS networks in-house;
— Mercedes-Benz Stadium profile: An early look at the technology being built into the new home of the Atlanta Falcons, with an emphasis on fiber;
— T-Mobile Arena photo essay: A first look at the newest venue on the famed Las Vegas Strip;
— Avaya Stadium profile: How the stadium’s Wi-Fi network became the star of the MLS All-Star game.

We’d like to take a quick moment to thank our sponsors, which for this issue include Mobilitie, Crown Castle, SOLiD, CommScope, JMA Wireless, Corning, Samsung Business, Xirrus, Huber+Suhner, ExteNet Systems, DAS Group Professionals and Boingo Wireless. Their generous sponsorship makes it possible for us to offer this content free of charge to our readers. We’d also like to thank you for your interest and support.