Artemis announces DISH spectrum lease, setting up San Francisco pCell service trial; venue-specific hub also available for testing

Artemis Networks founder Steve Perlman. Credit all photos: Artemis Networks

Artemis Networks moved one step closer to a real-world offering of its pCell wireless service with the announcement of a spectrum lease deal with satellite provider DISH. Pending FCC approval, the deal will give Artemis the means to offer commercial service in San Francisco, perhaps as early as later this year.

In the meantime, owners of large public venues (like sports stadiums) can now test the Artemis technology for themselves by trying out an Artemis I Hub and antenna combination in a trial arrangement with the company. Announced last year, Artemis’ pCell technology claims to solve two of the biggest problems in wireless networking, namely bandwidth congestion and antenna interference, by turning much of the current technology thinking on its head. If the revolutionary networking idea from longtime entrepreneur Steve Perlman pans out, stadium networks in particular could become more robust while also being cheaper and easier to deploy.

In a phone interview with Mobile Sports Report prior to Tuesday’s announcement, Perlman said Artemis expects to get FCC approval for its pCell-based wireless service sometime in the next six months. When that happens, Artemis will announce pricing for its cellular service, which will work with most existing LTE phones via a SIM card provided by Artemis. Phones with dynamic SIMs, like some of the newer devices from Apple, will be able to simply select the Artemis service without needing an added card, Perlman said.

Though he wouldn’t announce pricing yet, Perlman said Artemis services would be less expensive than current cellular plans. He said there would likely be an option for local San Francisco service only, and another that includes roaming on other providers’ cellular networks for use outside the city.

More proof behind the yet-untasted pudding

When Perlman, the inventor of QuickTime and WebTV, announced Artemis and its pCell technology last year, it was met with both excitement — for its promise of delivering faster, cheaper wireless services — and no shortage of skepticism about whether it would ever become a viable commercial product. Though pCell’s promise of harnessing cellular interference to produce targeted, powerful connectivity could be a boon to builders of large public-venue networks like those found in sports stadiums, owners and operators of those venues are loath to build expensive networks on untested, unproven technology. So it’s perhaps no surprise that Artemis has yet to name a paying customer for its revolutionary networking gear.

Artemis I Hub

But being able to name partners and talk about spectrum deals brings Artemis closer to something people can try, and perhaps buy. VenueNext, the application development firm behind the San Francisco 49ers’ Levi’s Stadium app, confirmed that it is testing Artemis technology, and the San Francisco network will give Perlman and Artemis a “beta” platform to test and “shake out the system” in a live production environment.

“We need to be able to move quickly, get feedback and test the network,” said Perlman about Artemis’ decision to run its own network first, instead of waiting for a larger operator to implement it. “We need to be able to move at startup speed.”

For stadium owners and operators, the more interesting part of Tuesday’s news may be the Artemis I Hub, a device that supports up to 32 antennas (indoor only for now, with outdoor units due later this year). The trial program will let venue owners and operators kick the tires on pCell deployment and performance themselves, instead of just taking Artemis’ word for it. Artemis has also published a lengthy white paper that fleshes out the explanation of its somewhat radical approach to cellular connectivity, another step toward legitimacy, since putting such a document on the public record opens the company’s claims to outside scrutiny.

If networking statistics from recent “big” stadium events are any barometer, the field of stadium networking may need significant help soon, since fans are using far more data than ever before, including the 13-plus terabytes of total wireless traffic at the Super Bowl in Phoenix and the 6-plus TB figure from the college football playoff championship game. To Perlman, the idea of trying to use current Wi-Fi and cellular technology to address a crowded space doesn’t make sense.

“You simply cannot use interfering technology in a situation where you have closely packed transmitters,” said Perlman. “You just can’t do it.”

Artemis explained

pCell antenna from Artemis Networks

If you’re unfamiliar with the Artemis idea, at its simplest level it’s a new way of connecting wireless devices to antennas that — if it works as advertised — upends conventional cellular and Wi-Fi thinking. What Perlman and Artemis claim is that they have developed a way to build radios that transmit signals “that deliberately interfere with each other” to establish a “personal cell,” or pCell, for each device connecting to the network.

(See this BusinessWeek story from 2011 for a thorough explanation of the Artemis premise. This EE Times article digs deeper into the technology, and this Wired article is also a helpful read.)

Leaving the complicated math and physics aside for now: if Artemis’ claims hold true, its technology could solve two of the biggest problems in wireless networking, namely bandwidth congestion and antenna interference. In current cellular and Wi-Fi designs, devices share signals from antenna radios, meaning per-user bandwidth shrinks as more people connect to a cellular antenna or a Wi-Fi access point. Adding more antennas is one way to ease congestion, but especially in stadiums and other large public venues, antennas can’t be placed too close together because their signals interfere.

The Artemis pCell technology, Perlman said, sidesteps both problems by delivering a centimeter-sized cell of coverage to each device, a cell that follows the device as it moves around an antenna’s coverage zone. Again, if the company’s claim of delivering full bandwidth to each device “no matter how many users” are connected holds true, stadium networks could theoretically support much higher levels of connectivity at possibly a fraction of the current cost.
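How can deliberately interfering transmissions add up to a clean, private channel per device? Artemis hasn’t published its algorithm, but the broader family of techniques its description evokes, multi-user precoding with channel knowledge, can be sketched in a few lines. The toy example below uses zero-forcing precoding, a textbook method that is not necessarily what pCell does: every antenna transmits a mixture of all users’ data, and the mixtures are chosen so that their superposition at each handset reconstructs only that handset’s stream. It assumes a perfectly known, noise-free channel, which no real system has.

```python
# Toy sketch of precoded, deliberately "interfering" transmissions.
# Zero-forcing is a stand-in for illustration, not Artemis' actual method.
import numpy as np

rng = np.random.default_rng(0)

n_antennas = 4  # cooperating transmit antennas
n_users = 4     # handsets, one "personal cell" each

# H[i, j] = complex channel gain from antenna j to user i, assumed known
# at the transmitter (e.g., estimated from uplink signals).
H = rng.normal(size=(n_users, n_antennas)) + 1j * rng.normal(size=(n_users, n_antennas))

# One QPSK symbol intended for each user.
symbols = np.exp(1j * np.pi * rng.choice([0.25, 0.75, 1.25, 1.75], size=n_users))

# Zero-forcing precoder: pick antenna outputs x so that H @ x == symbols.
# Each antenna radiates a blend of every user's data -- pure interference
# in the air -- yet the blends sum coherently only at the intended user.
x = np.linalg.pinv(H) @ symbols

received = H @ x  # what each handset hears (noise ignored for clarity)
print(np.allclose(received, symbols))  # True: each user sees only its own symbol
```

The catch, and a source of the skepticism noted above, is that everything depends on knowing the channel matrix precisely and recomputing the precoder as people move; that is exactly the hard part Artemis claims to have solved.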

The next step in Artemis’ evolution will be to see if (or how well) its technology works in the wild, where everyday users can subject it to the unplanned stresses that can’t be simulated in the lab. With any luck, and the FCC willing, we won’t have to wait another year for the next chapter to unfold.

New Atlanta football stadium picks IBM as lead technology integrator

New Atlanta football stadium under construction. Credit all images: New Atlanta Stadium

The yet-to-be-named new football stadium under construction in Atlanta has selected IBM as its lead technology integrator, somewhat officially welcoming the new 800-pound gorilla to the stadium technology marketplace.

While computing giant IBM has dabbled in sports deployments before — mainly contributing technology as part of its corporate sponsorship of events like The Masters in golf and the U.S. Open in tennis — only recently has Big Blue gotten into the large-venue technology integration game. And while IBM’s recent deal as technology integrator for the revamp of Texas A&M’s Kyle Field was probably its true debut, for the new Atlanta stadium IBM will lead the selection, design and deployment of a wide range of technologies, including but not limited to the core Wi-Fi and DAS networks that will provide in-venue wireless connectivity.

Due to open in March 2017, the new $1.4 billion stadium is expected to hold 71,000 fans for football, and up to 83,000 fans for other events like basketball or concerts. And while soccer, concerts and basketball will certainly be part of its events schedule, the NFL’s Atlanta Falcons and owner Arthur Blank are driving the bus on the new building, picking IBM in part to satisfy a desire to build a venue that is second to none when it comes to fan experience.

IBM’s size and experience a draw for Atlanta

Interior stadium design rendering

In addition to Wi-Fi and DAS network buildouts, IBM will design systems to control the expected 2,000-plus digital displays in the planned stadium and will also oversee other technology-related parts of the venue, including video security, physical door controls and a video intercom system, according to an announcement made today. IBM will also partner with the stadium owners to develop as-yet-undetermined applications to “leverage the power of mobility to create a highly contextual, more personalized game day experience for fans, all through the integration of analytics, mobile, social, security and cloud technologies.”

In a phone interview Thursday, Jared Miller, chief technology officer for Blank’s namesake AMB Sports and Entertainment (AMBSE) group, said IBM’s depth and breadth in technology, applications and design made it a somewhat easy choice as lead technology partner.

Miller said the stadium developers looked at the number of different technology systems that would exist within the building, and ideally wanted to identify a single partner to help build and control them all, instead of multiple providers who might just have a single “silo” of expertise.

Proposed stadium exterior

“IBM is unique with its span of technology footprint,” Miller said. He also cited IBM’s ability to not just deploy technology but to also help determine what the technology could be used for, with analytics and application design.

“They’ve looked at the [stadium] opportunity in a different manner, thinking about what we could do with the network once it’s built,” Miller said.

IBM, which also has a sizable consulting business, created a group targeting “interactive experiences” about two years ago, according to Shannon Miller, the North America Fan Experience Lead for the IBM Interactive Experience group. Miller (no relation to Jared Miller), also interviewed by phone Thursday, said IBM had been working with Arthur Blank and the Falcons for more than a year to determine how to make the new stadium “the best fan experience in the world.”

And while IBM is somewhat of a newcomer to the stadium-technology integration game, IBM’s Miller said the company not only understands “how to make digital and physical work together,” but also has resources in areas including innovation, technology development and design that smaller firms may not have. Though the Kyle Field project was ambitious, IBM’s Miller said the Atlanta operation will be much bigger.

“The size and scale of what we’re going to do [in Atlanta] will be unique,” he said.

No suppliers picked yet for Wi-Fi or DAS

Of note for industry watchers: IBM and the Falcons team have not yet picked technology suppliers for discrete parts of the coming wireless network, such as Wi-Fi access points and DAS gear. (Daktronics has already been announced as the supplier of the planned Halo Screen video board.) But those vendor decisions will likely come soon, since the stadium is under a hard deadline to open for the first game of the Major League Soccer season in March 2017.

“We’re working fast and furious on design, and we want to identify [the gear suppliers] as early as possible,” said AMBSE’s Miller.

IBM and AMBSE did announce that the stadium’s network will be fiber-based, and it will probably use Corning as the fiber and Passive Optical Network (PON) technology provider, though that choice has not been officially announced or confirmed. IBM and Corning partnered to install a fiber network core for Wi-Fi and DAS at Texas A&M’s Kyle Field, believed to be the first fiber deployment of that scale in any large stadium.

The Atlanta deal puts IBM solidly into the rapidly expanding field of stadium technology integration, which includes companies like CDW (which led network deployments at the University of Nebraska and the University of Phoenix Stadium) as well as stadium ownership groups, like the San Francisco 49ers, and technology name sponsors like AT&T, which has partnered with owners for technology and network deployments at venues like AT&T Park and AT&T Stadium.

Overhead view

Levi’s Stadium app adds special features for Sharks-Kings outdoor hockey game

Mocked-up screen shot of what the Levi’s Stadium app will look like for Saturday’s outdoor hockey game. Credit: VenueNext

Other than mobile ticketing, all the regular features of the Levi’s Stadium mobile app will be active for Saturday’s outdoor hockey game between the San Jose Sharks and the Los Angeles Kings. Fans will be able to use the app over the free Wi-Fi network or the enhanced cellular DAS to watch instant replays, or to order food, drinks and merchandise and have those items delivered to any seat in the 68,500-seat venue.

New to the app, as a special treat for fans at the Coors Light NHL Stadium Series event, is a “live, crowd-generated light show” experience using technology from Baltimore, Md.-based Wham City Lights that synchronizes smartphones to produce a mass lighting effect. The app feature will, according to the NHL and Levi’s app producer VenueNext, “blanket the stadium with a synchronized, multi-colored visualization of the live musical entertainment on the field,” if, of course, enough fans download the app and activate it at the right time.
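Wham City Lights hasn’t published how its synchronization works, so the sketch below is purely illustrative: it assumes each phone learns a clock offset from a venue server and shares a timeline of color cues, both hypothetical details. Once the phones agree on the time, a mass light show reduces to every device looking up the color scheduled for the current instant.

```python
# Minimal sketch of a synchronized crowd light show. The clock-sync method
# and the cue format are assumptions for illustration only; they are not
# Wham City Lights' published protocol.
import time
from bisect import bisect_right

# Hypothetical shared timeline: (seconds from show start, hex color).
CUES = [(0.0, "#000000"), (2.0, "#ff0000"), (2.5, "#ffd700"), (4.0, "#00bfff")]
CUE_TIMES = [t for t, _ in CUES]

def current_color(show_start: float, clock_offset: float) -> str:
    """Return the color this phone should display right now.

    clock_offset corrects the phone's clock to the venue's reference clock
    (e.g., measured via a server round trip), so thousands of phones that
    agree on show_start flip colors at the same instant.
    """
    elapsed = (time.time() + clock_offset) - show_start
    idx = bisect_right(CUE_TIMES, elapsed) - 1
    return CUES[max(idx, 0)][1]
```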

Just like Niners fans this past football season, hockey fans at Levi’s Stadium on Saturday will be able to download the free app and use it to watch live streaming video of the event, as well as instant replays from several angles. Fans can also use the app to purchase parking tickets and get directions to the stadium as well as their seating section once inside the venue.

What will be interesting to see is whether hockey fans generate more wireless data than football fans, a possibility since hockey has two built-in intermissions as opposed to football’s single halftime. Since the event is also more of a “bucket list” game than a regular-season football contest, Sharks, Kings and general hockey fans in attendance may well break the previous Levi’s data record set at the Niners’ home opener. Stay tuned to MSR next week, when with any luck we’ll get wireless usage stats from the Levi’s Stadium network team.

AT&T upgrades DAS at University of Washington hoops arena

Alaska Airlines Arena at Hec Edmundson Pavilion. Credit: University of Washington

AT&T’s deal to bring better wireless connectivity to Pac-12 conference schools has expanded into basketball, with the announcement of DAS upgrades at the University of Washington’s Alaska Airlines Arena.

Officially and somewhat longwindedly called the Alaska Airlines Arena at Hec Edmundson Pavilion, the cozy 10,000-seat facility is home not only to the Husky men’s and women’s basketball teams but also to other indoor sports like volleyball and gymnastics. According to AT&T, the new DAS deployment has more than 80 antennas spread throughout the facility to enhance cellular coverage, especially 4G LTE signals.

According to AT&T, the new DAS is already up and running and is seeing an average of 17 gigabytes of traffic from AT&T customers per basketball game so far this season. Last fall AT&T also announced DAS upgrades at the U-Dub football stadium, where it has roughly 340 DAS antennas and saw an average of 190 GB of traffic per game.

DGP gets deal to extend DAS outside Levi’s Stadium

Franks and DAS: DGP DAS antennas above food station at Levi’s Stadium. Photo credit: Paul Kapustka, MSR

DAS Group Professionals, the company that installed the neutral-host DAS inside Levi’s Stadium, now has a deal to extend the DAS outside the Levi’s walls, covering parts of the city of Santa Clara, Calif., that surround the stadium.

With next year’s Super Bowl set to take place at Levi’s Stadium, it makes sense that city officials would want to make sure the parking lots and other pre-game gathering areas outside the venue had good cellular connectivity. At the most recent Super Bowl in Glendale, Ariz., neutral host provider Crown Castle did an extensive job of building the “oDAS” or outside DAS in the spaces surrounding the University of Phoenix Stadium.

According to DGP, it will design, build and maintain an oDAS for the City of Santa Clara, initially targeting the area around the Great America theme park and the Santa Clara Convention Center, which sit on the other side of the main Levi’s Stadium parking lots. Like the DAS inside the stadium, the outdoor network will be offered to all major wireless carriers, which will pay DGP and the city for access.

While the network will definitely come in handy for pre- and post-game connectivity around Levi’s Stadium events, it will also improve overall cellular performance in the area, which is also home to several large corporate office buildings as well as the busy convention center.

Super Bowl XLIX sets new stadium Wi-Fi record with 6.2 Terabytes of data consumed

University of Phoenix Stadium. Credit: Arizona Cardinals.

The Super Bowl is once again the stadium Wi-Fi champ, as fans at Sunday’s Super Bowl XLIX in Glendale, Ariz., used 6.23 terabytes of data during the contest, according to the team running the network at the University of Phoenix Stadium.

The 6.23 TB mark blew past the most recent entrant in the “most Wi-Fi used at a single-day, single-stadium event” sweepstakes, the 4.93 TB used at the Jan. 12 College Football Playoff championship game at AT&T Stadium. Prior to that, pro football games this past season at Levi’s Stadium in Santa Clara, Calif., and at AT&T Stadium had pushed past the 3 TB mark, among the highest totals ever reported.

The live crowd watching the New England Patriots’ 28-24 victory over the Seattle Seahawks used nearly as much cellular data as well, with Verizon Wireless, AT&T and Sprint claiming a combined total of 6.56 TB used in and around the stadium on game day. All three carriers were on the in-stadium and outside-the-stadium DAS deployments run by neutral host Crown Castle. If those figures are correct (more on this later), that would put the total wireless data usage for the event at 12.79 TB, far and away the biggest single day of wireless data use we’ve ever heard of.

Apple OS updates still the application king

Handrails with Wi-Fi antenna enclosures from AmpThink. Credit: Arizona Cardinals.

Mark Feller, vice president of information technology for the Arizona Cardinals, and Travis Bugh, senior wireless consultant for CDW, provided Mobile Sports Report with the final Wi-Fi usage numbers, which are pretty stunning for anyone in the stadium networking profession. According to Feller, the new CDW-deployed Wi-Fi network with Cisco gear at the UoP Stadium saw 2.499 TB of data downloaded and 3.714 TB uploaded, for a total of 6.213 TB of Wi-Fi usage. Bugh of CDW said 25,936 unique devices connected to the network on game day, with a peak concurrent usage of 17,322, recorded, not surprisingly, at halftime.

Peak download usage of 1.3 Gbps was recorded before the game’s start, while peak upload usage of 2.5 Gbps was hit at halftime. The top applications by bandwidth use, Feller said, were Apple OS updates, Facebook, Dropbox and Snapchat.

DAS numbers also set new record, but clarification needed

The only reason we aren’t yet trumpeting the 6.564 TB of reported DAS use as a verified record is that the reporting carriers differ in how clearly they break out their numbers. We also haven’t yet heard any usage totals from T-Mobile, so it’s likely that the final wireless data tally is somewhere north of 13 TB, if all the claims can be believed.

Parking lot light poles, Westgate entertainment district. Can you spot the DAS?

As reported before, AT&T said it saw 1.7 TB of cellular wireless activity from its customers on game day, with 696 GB of that happening inside the stadium and the balance coming from the outside areas before and after the game. We’d also like to welcome Sprint to the big-game reporting crew (thanks, Sprint!), with its total of 754 GB of 4G LTE traffic used in and around the stadium on game day. According to Sprint representatives, its Super Bowl coverage efforts included five COWs (cell towers on wheels) as well as expanded DAS and macro placements in various Phoenix-area locations. Sprint’s coverage included its 2.5 GHz spectrum, which uses TDD LTE technology.

As also previously reported, Verizon Wireless claimed 4.1 TB of customer traffic in and around the stadium on game day, which Verizon says was all cellular traffic and does not reflect any Verizon Wireless customer use of the stadium Wi-Fi network. Verizon also reported some other interesting activity tidbits, including 46,772 Verizon Wireless devices used at the game, of which just 59.7 percent were smartphones. Verizon also said its networks carried 10 million emails and 1.9 million website visits that day, along with 122,308 videos sent or received over wireless connections.

We’re still waiting to see if we can get usage numbers from the Super Bowl stadium app (we’re especially interested to see if the instant replay feature caught on) but the warning for stadium owners and operators everywhere seems to be clear: If you’re hosting the big game (or any BIG game), make sure your network is ready for 6 TB and beyond!