Betting the Under: Putting Wi-Fi antennas under seats is the hot new trend in stadium wireless networks

White box at bottom is one of the under-the-seat Wi-Fi access points at AT&T Park. Photo: Paul Kapustka, MSR (click on any photo for a larger image)

What do you typically find under stadium seats on game days? The traditional list might include bags and purses, and get-out-of-the-way items like empty popcorn tubs, used hot dog wrappers and drink cups no longer filled with fluids.

And now you can add Wi-Fi access points and DAS antennas to the list.

A growing trend is emerging to use under-seat antenna placements to bring wireless signals closer to fans, for both Wi-Fi networks and cellular distributed antenna system (DAS) deployments. First used to compensate for a lack of overhang or railing placement spaces, under-seat deployments are now winning favor in all sorts of arenas for their ability to use human bodies to help build a denser network, one that proponents say can carry far more capacity than an infrastructure that relies mainly on overhead antenna placements.

With proof points emerging quickly at venues like the San Francisco 49ers’ Levi’s Stadium and Texas A&M’s Kyle Field, as well as at pioneers AT&T Park and AT&T Stadium, under-seat Wi-Fi deployments may soon become more common, as more integrators and equipment suppliers embrace the under-seat method.

New stadiums under construction including the Sacramento Kings’ Golden 1 Center as well as new Wi-Fi deployments at existing stadiums like Houston’s NRG Stadium and the Bank of America Stadium in Charlotte are also planning to primarily use under-seat Wi-Fi deployments, both for the performance and aesthetic benefits. With such high-profile deployments embracing the method, under-seat APs may become the default placement position going forward, especially as stadium mobile-device usage by fans keeps growing.

History: a need driven by architecture

Editor’s note: This excerpt is from our latest STADIUM TECH REPORT, our long-form PDF publication that combines in-depth stadium tech reports with news and analysis of the hottest topics in the world of stadium and large public venue tech deployments. Enjoy this PART 1 of our lead feature, or DOWNLOAD THE REPORT and read the whole story right now!

Giants senior VP and CIO Bill Schlough, at the office

When Wi-Fi first arrived in stadiums, the answer to questions about antenna and access point placement seemed obvious — just mount them on ceilings, overhangs and walls, where such gear had always gone. Mostly that decision kept the antennas out of sight, and provided good-enough reception for most network deployments.

But as fan Wi-Fi usage started growing, poor reception areas cropped up, most often in the most expensive seats near the courts or playing fields, where there was often little architectural infrastructure other than the seats themselves. At the San Francisco Giants’ AT&T Park, the stadium where fan-facing Wi-Fi was first installed in 2004, the need for more bandwidth was a big problem that needed to be solved during the 2012-13 offseason, after the team’s second World Series title run in two years had produced record wireless usage.

Even with Wi-Fi APs placed just about everywhere they could go, the San Francisco Giants’ IT team couldn’t keep up with demand. And the trick that has been tried at some stadiums — putting AP enclosures on handrails — wasn’t an option at AT&T Park, since its lower-bowl seating areas have no railings.

With options limited, that’s when an internal battle commenced around the new idea of placing APs under seats, a plan that met fierce resistance on many fronts.

“We got beaten up pretty bad over the idea [of under-seat APs],” said Bill Schlough, senior vice president and chief information officer for the Giants, who described the 2012-13 offseason as “a very stressful time,” with lots of internal strife and discord. With multiple stakeholders checking in on the plan, including the Giants’ facilities, marketing and ticketing groups, concerns about the loss of under-seat space and about potential health effects fueled opposition to putting APs under chairs.

But without any railings or overhangs for most of the park’s lower-bowl seats, Schlough and his team had “no other alternative” than to try placing Wi-Fi APs under seats. On the possible health issue, Schlough said the Giants were assured by technology partner (and ballpark title sponsor) AT&T that the deployment would be safe and comply with all FCC regulations; “We were assured that having [an antenna] 18 inches from your butt was the [radio] equivalent of having a cell phone in your pocket,” Schlough said.

On the storage-space concern side, Schlough said the Giants’ IT team made models of the antennas out of cardboard and duct tape, and placed them under seats to see how they worked.

“The [walking] flow through the aisles was good, with the AP models tucked under we never kicked them” during testing, Schlough said. With AT&T assuring the Giants that under-seat was “the way of the future,” the team took a leap of faith and added a large number of under-seat Wi-Fi APs in preparation for the 2013 season, more than doubling the number of APs in the park (to 760 total) in the process.

Under-seat Wi-Fi enclosure at Dodger Stadium. Photo: Terry Sweeney, MSR

Though Schlough and his team “spent a lot of time” communicating with season-ticket holders about the new technology, there was still consternation about what might happen on opening day, when fans would “find this box under their seat, and not have a place to put their garlic fries,” Schlough said.

As it turns out, there was almost no resistance to the method; according to Schlough, the Giants fielded only two complaints about the under-seat APs on the first day of deployment, which he called “the biggest relief day of my life.”

The success of the under-seat idea was particularly noted at that time by another IT team in the Bay Area, the one putting together the wireless plan for the San Francisco 49ers’ new home, Levi’s Stadium, which was being built just to the south in Santa Clara. Testing some under-seat placements of their own at Candlestick Park during that venue’s final season as the Niners’ home, the team building the Levi’s Stadium network became convinced that going under seat was the best way to build the high-density deployment they wanted to have.

END PART 1… TO READ THE WHOLE STORY NOW, DOWNLOAD OUR REPORT!

SF Giants fans used 78.2 TB of Wi-Fi data at AT&T Park during 2015 season

The view from AT&T Park's left field corner. Photo: Paul Kapustka, MSR

It didn’t end with a World Series championship, but the 2015 season for the San Francisco Giants did see fans use 78.2 terabytes of Wi-Fi data during home games at AT&T Park, the most ever at the venue, according to the Giants.

Bill Schlough, senior vice president and CIO for the Giants, sent over a bunch of wireless data statistics from the Giants’ season, and on both Wi-Fi and AT&T DAS usage, numbers were up significantly from the year before. In addition to the 78.2 TB of Wi-Fi data used during baseball games, Schlough said additional data used during preseason games, concerts and private parties (like the SEAT 2015 softball game!) probably added another 20+ TB to the total, putting the AT&T Park Wi-Fi usage for the year in the 100 TB range.

Anyone else out there with numbers that challenge for the total Wi-Fi season crown?

Here are some more precise measurements from the AT&T Park 2015 season, with comparisons to 2014 in parentheses:

— Average Wi-Fi Take-Rate: 34.8% (33.9% in 2014)

— Wi-Fi Traffic/Game: 966 GB (591 GB)

— AT&T DAS Traffic/Game: 264 GB (196 GB)

— Wi-Fi Traffic/Connection: +59% vs. 2014

— DAS Traffic/Connection: +35%
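As a sanity check, the year-over-year growth implied by the per-game figures above can be computed directly (a quick sketch; the GB values are the ones listed, and the variable names are ours):

```python
# Per-game traffic figures reported for AT&T Park, in GB per game
wifi = {"2014": 591, "2015": 966}
das = {"2014": 196, "2015": 264}

def growth(series):
    """Year-over-year growth, as a percentage rounded to one decimal."""
    return round((series["2015"] / series["2014"] - 1) * 100, 1)

print(f"Wi-Fi traffic/game growth: {growth(wifi)}%")  # ~63.5%
print(f"DAS traffic/game growth: {growth(das)}%")     # ~34.7%
```

Per-game Wi-Fi traffic growing faster than the take-rate is consistent with the reported +59% jump in traffic per connection: each connected fan simply used more data.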

IBM formally launches sports consulting practice to bring tech to stadiums

Texas A&M student at recent Aggies football game. Photo: Paul Kapustka, MSR (click on any photo for a larger image)

IBM formally cemented its entrance to the sports-stadium tech deployment market with the announcement of a sports and fan experience consulting practice, and a “global consortium” of tech and service suppliers who may help IBM in its future stadium and entertainment venue deployments.

For industry watchers, the Nov. 19 debut of the IBM “Sports, Entertainment and Fan Experience” consulting practice was not a surprise, since its leader, Jim Rushton, had already appeared at tech conferences this past summer, talking about IBM’s plans to deploy a fiber-based Wi-Fi and DAS network at the new Mercedes-Benz Stadium being built for the Atlanta Falcons. IBM was also publicly behind a similar network build over the last two years at Texas A&M’s Kyle Field. For both networks, IBM is using Corning optical gear.

Still, the formal creation of the IBM practice (you can read all about it at the new IBM sports website) means that the 800-pound gorilla is now firmly inside the competitive ring of the stadium-tech marketplace, a landscape that currently has multiple players, many of which have multiple stadium deployments under their belts. However, IBM’s vast experience in big-time sports technology deployments — Big Blue is behind such endeavors as the truly wonderful online experience of The Masters, as well as the technical underpinnings of three of tennis’ Grand Slam events (Wimbledon, the U.S. Open and the Australian Open) — as well as its considerable tech and monetary resources probably makes it a No. 1 contender for all of the biggest projects, and possibly smaller ones too.

Artist's rendering of planned overhead view of new Atlanta NFL stadium

Rushton, who spoke with Mobile Sports Report earlier this year in one of his first public appearances as an IBMer, said in a phone interview this week that IBM’s fiber-to-the-fan network model isn’t just for large-scale deployments like the one at 105,000-seat Kyle Field or the Falcons’ new $1.4 billion nest, which will seat 71,000 for football and up to 83,000 for other events after it opens in 2017.

“That type of system [the optical network] is scalable,” Rushton said, and even in smaller venues he said it could potentially save customers 30 percent or more compared to the cost of a traditional copper-based cabled network. The flip side to that equation is that purchasers have fewer gear suppliers to choose from on the fiber-based side of things, and according to several industry sources it’s still sometimes a problem to find enough technical staffers with optical-equipment expertise.

How much of the market is left?

The other question facing IBM’s new consulting practice is the size of the market left for stadium tech deployments, an answer we try to parse each year in our State of the Stadium survey. While this year’s survey and our subsequent quarterly reports found a high number of U.S. professional stadiums with Wi-Fi and DAS networks already deployed, there are still large numbers of college venues as well as international stadiums and other large public venues like concert halls, race tracks and other areas that are still without basic connectivity.

Full house at Kyle Field. Photo: Paul Kapustka, MSR

With its new “global consortium” of companies that supply different parts and services of the connected-stadium experience, IBM could be an attractive choice to a customer that doesn’t have its own technical expertise, providing a soup-to-nuts package that could conceivably handle tasks like in-stadium IPTV, DAS and Wi-Fi, construction and stadium design, and backbone bandwidth solutions.

However, IBM will be going up against vendors who have led deployments on their own, and league-led “consortium” type arrangements like MLBAM’s project that brought Wi-Fi to almost all the Major League Baseball stadiums, and the NFL’s list of preferred suppliers like Extreme Networks for Wi-Fi and YinzCam for apps. Also in the mix are third-party integrators like CDW, Mobilitie, 5 Bars, Boingo Wireless and others who are already active in the stadium-technology deployment space. And don’t forget HP, which bought Wi-Fi gear supplier Aruba Networks earlier this year.

Certainly, we expect to hear more from IBM soon, and perhaps right now it’s best to close by repeating what we heard from Jared Miller, chief technology officer for Falcons owner Arthur Blank’s namesake AMB Sports and Entertainment (AMBSE) group, when we asked earlier this year why the Falcons picked IBM to build the technology in the new Atlanta stadium:

Remote optical cabinet and Wi-Fi AP at Kyle Field. Photo: Paul Kapustka, MSR

“IBM is unique with its span of technology footprint,” Miller said. He also cited IBM’s ability to not just deploy technology but to also help determine what the technology could be used for, with analytics and application design.

“They’ve looked at the [stadium] opportunity in a different manner, thinking about what we could do with the network once it’s built,” Miller said.

From the IBM press release, here is the list of companies in IBM’s new “global consortium.” IBM said the list is not binding: none of the companies listed is guaranteed any business yet, and companies not on the list may still end up in IBM deployments (Kyle Field, for instance, uses Aruba gear for its Wi-Fi).

Founding members of the consortium include:

· Construction and Design: AECOM, HOK, Whiting Turner

· Infrastructure Technology/Carriers: Alcatel-Lucent, Anixter, CommScope, Corning, Juniper Networks, Ruckus Wireless, Schneider Electric, Smarter Risk, Tellabs, Ucopia, Zebra Technologies, YinzCam (IPTV), Zayo, Zhone

· Communications Solutions Providers: Level 3, Verizon Enterprise Solutions, AT&T

· Fan Experience Consulting & Data Management Integration: IBM

Kansas City Royals score with jump in postseason stadium Wi-Fi and DAS traffic

Royals fans at Kauffman Stadium enjoying the postseason. Credit all photos: Kansas City Royals (click on any photo for a larger image)

If you need a reason to justify Wi-Fi network installs or improvements in your stadium, here’s an optimistic rationale: If your team makes it to the playoffs and the championship, you can expect a big surge in postseason wireless traffic.

That idea was proven again this fall by the Kansas City Royals, who racked up big postseason Wi-Fi and DAS traffic numbers at Kauffman Stadium during their march to the 2015 World Series championship, including a 3.066 terabyte night on the Wi-Fi network for Game 1 of the World Series. That’s a 1 TB jump from last season, when Kansas City saw 2+ TB of Wi-Fi traffic during Game 7 of the 2014 World Series.

According to numbers provided by Brian Himstedt, senior director of information systems for the Royals, the Kauffman Stadium Wi-Fi network saw an average of 1.9 TB of aggregate throughput for the eight home games Kansas City hosted in the playoffs.

Fans cheering the Royals at Kauffman Stadium

The average peak user count over those games was 11,850, with a high peak of 13,900 during Game 2 of the World Series. The stadium’s capacity for the postseason games, Himstedt said, was 40,500.

The postseason Wi-Fi traffic, Himstedt said, was approximately 34 percent upload and 66 percent download. During the regular season, the Kauffman Wi-Fi network had upload/download averages of 22 percent and 78 percent, respectively, meaning that during the playoffs fans were probably busier sharing content than consuming it.

Overall, the postseason Wi-Fi numbers were much larger than the Royals’ regular-season stats, Himstedt said. Here are some of the regular-season stats during a summer that saw the network serve more than 180,500 unique clients on the Wi-Fi network:

— Average throughput per game: 625 GB
— High throughput, single game: 1.05 TB (Sept. 26)
— Average peak concurrent users per game: 5,150
— High peak concurrent users, single game: 7,500 (Opening Day, April 6)

The average attendance of Kauffman Stadium during the regular season was 33,900, Himstedt said.
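Comparing these regular-season figures with the postseason numbers above gives a rough "peak concurrency" rate, that is, peak concurrent Wi-Fi users as a share of attendance (a back-of-the-envelope sketch using the reported averages):

```python
# Peak concurrent Wi-Fi users divided by attendance,
# using the Royals' reported averages
regular_rate = 5150 / 33900      # average regular-season game
postseason_rate = 11850 / 40500  # average postseason game

print(f"regular season: {regular_rate:.0%}")  # 15%
print(f"postseason:     {postseason_rate:.0%}")  # 29%
```

By this rough measure, the share of the crowd on the network at peak roughly doubled in the playoffs, not just the raw user count.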

Sprint DAS numbers also jump

And while the DAS at Kauffman Stadium is still awaiting full participation by all of the top wireless carriers, hometown favorite Sprint was active on the new system deployed by Advanced RF Technologies before the start of the season, and according to Sprint there were huge increases in DAS traffic compared to 2014.

Here are some DAS numbers from Sprint about the playoff traffic at Kauffman Stadium:

— Total tonnage for the eight-game 2015 postseason was 2.6 terabytes

— Average tonnage per postseason game increased 4,000% in 2015 compared to 2014

— Sprint fans talked on their phones a total of 178,954 minutes in the postseason

— LTE connection rates for the 2015 postseason improved by approximately 40% compared to 2014

According to Sprint, the DAS supported all the frequencies used by Sprint devices, including the 1.9GHz, 2.5GHz and 800MHz bands.

Wi-Fi stats left on the bench in RootMetrics’ baseball stadium network performance scores

The folks at RootMetrics have another network research project out, one that claims to determine the best wireless connectivity in all the U.S. Major League Baseball stadiums. However, the report doesn’t include Wi-Fi network performance in any of its scoring processes, and it doesn’t publicly reveal the limits of its network tests, which are based on just one day’s results from a handful of devices in each venue and do not include any results from Apple iOS devices.

According to the RootMetrics survey, Fenway Park in Boston ended up atop their results, with strong scores for all the four major U.S. wireless carriers, a list that includes AT&T, Verizon Wireless, Sprint and T-Mobile. But the caveat about those “scores” is that they are composite results devised by RootMetrics itself and not a direct reflection of numerical network performance.

At Fenway, for instance, RootMetrics’ own results show that T-Mobile’s median upload and download speeds are 3.0 Mbps and 3.5 Mbps, respectively, while Verizon’s are 20.7 Mbps and 13.0 Mbps. Yet RootMetrics gives T-Mobile third place at Fenway with an 89.5 “Rootscore,” compared to Verizon’s winning mark of 97.9, meaning that in RootMetrics’ scoring system a network roughly six times as fast scores only about 10 percent better.
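That comparison can be reproduced from RootMetrics' published Fenway figures (a quick sketch; the dictionary keys are our labels, not RootMetrics terminology):

```python
# Median speeds (Mbps) and Rootscore for each carrier at Fenway
tmobile = {"up": 3.0, "down": 3.5, "score": 89.5}
verizon = {"up": 20.7, "down": 13.0, "score": 97.9}

# How much faster Verizon's median upload is
upload_ratio = verizon["up"] / tmobile["up"]  # ~6.9x

# How much higher Verizon's composite score is, in percent
score_gap_pct = (verizon["score"] / tmobile["score"] - 1) * 100  # ~9.4%

print(f"upload speed ratio: {upload_ratio:.1f}x")
print(f"score gap: {score_gap_pct:.1f}%")
```

The mismatch between a near-7x upload gap and a sub-10% score gap is exactly why composite scores shouldn't be read as proportional to network performance.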

While it’s not included in the scoring or ranking, the Wi-Fi network at Fenway as measured by RootMetrics delivered speeds of 23.1 Mbps down and 22.0 Mbps up, besting all the cellular networks in the stadium. In its blog post RootMetrics does not explain why it doesn’t include Wi-Fi networks in its network measurements or scoring, even though its testing does show Wi-Fi performance at individual stadiums. Over the past year, Major League Baseball led a $300 million effort to install Wi-Fi networks in all MLB parks.

Unlike its metro-area tests, where RootMetrics uses “millions of data points,” the baseball stadium tests were calculated using just one device from each carrier — and all are Android-based, since RootMetrics’ internal testing system doesn’t run on iOS devices. And while RootMetrics said that for its results each park was visited “at least once,” in going through all 29 stadium reports there was only a single visit date mentioned for each one. RootMetrics also did not visit Rogers Centre in Toronto, home of the American League’s Blue Jays.

Stadium Tech Report: Los Angeles Dodgers hit it out of the park with Cisco, Aruba Wi-Fi

Dodger Stadium, the SoCal baseball shrine. All photos: Terry Sweeney, MSR (click on any photo for a larger image)

Growing up in the Los Angeles suburb of Norwalk, Ralph Esquibel recalled playing outdoors while, inside the house, the Dodger game was on the radio. “I knew from the kinds of noises coming out of the house how the game was going,” he laughed. Esquibel, now vice president of IT for the Los Angeles Dodgers, may have wished for some similar indicators or guideposts as he began the wireless retrofitting of Major League Baseball’s third-oldest stadium (after Boston’s Fenway Park and Chicago’s Wrigley Field) in early 2011.

Esquibel faced multiple challenges with Dodger Stadium. First, there was all that concrete to push signals through or around. There was the size of the Chavez Ravine venue and its far-flung parking lots, spanning more than 350 acres. The stadium also has few overhangs, a favorite place to attach Wi-Fi access points or distributed antenna system (DAS) gear. Then there’s Dodger Stadium’s capacity of 56,000 seats, the largest in the league and almost 30 percent larger than the average MLB stadium (42,790).

Esquibel’s biggest hurdle? “Trying to achieve the network that we wanted but also maintain an appropriate budget for the solution,” he said. While Esquibel would not specify what the Dodgers spent, he did allow that it was “an 8-figure project.”

Coverage challenges in the best seats

Initially, the best seats in the house presented a coverage challenge; field and club level seats along the third- and first-base lines and the dugout lack any overhangs. So while phones in those sections could receive a short, directional beam sent from across the outfield, the upstream signal couldn’t get back to the AP across the field, said Esquibel.

Ralph Esquibel, VP of IT for the Dodgers, with the new Wi-Fi relief pitcher mobile.

“We wanted to guarantee a premium experience, regardless of the seat,” said Esquibel, who joined the Dodgers six years ago after working in IT at Toyota and Honda. So the Dodgers took what he calls “a hybrid approach”: Wi-Fi APs and antennas are installed overhead where possible, but also under seats and in the staircase handrails that divide the stadium’s steep aisles.

All told, nearly 1,000 APs from Cisco and Aruba Networks blanket Dodger Stadium, its concession areas and parking lots. Horizon Communications helped the Dodgers with design and installation of the Wi-Fi and DAS.

The under-seat APs/Wi-Fi antennas on the club level are housed in NEMA enclosures about every 15 seats, set eight rows apart. Esquibel was concerned about losing real estate under those seats; he also didn’t want to create any potential trip hazard for fans. In addition, the Dodgers use Cat 6A cabling, which is too thick and rigid to run up a stepped incline. Consequently, they drilled through concrete to snake the cabling in from the clubhouse underneath. “There’s no visible conduit leading into the enclosure,” Esquibel explained. The profile and footprint of the enclosure still leave space for fans to place belongings.

Handrail Wi-Fi enclosure

It’s the same modus operandi for the enclosures housed in the stair rails, except there are two APs in larger enclosures at the top of each staircase on the reserve level and upper deck, then a single AP per enclosure as the stairways descend. Some 290 APs offer coverage on the reserve level, which by itself has a greater capacity than nearby Staples Center (18,118 seats), Esquibel told Mobile Sports Report. After two years of use, there have been no issues with the AP enclosures. “We power-wash the seats and stands after games and [the enclosures] are very resilient against the sun, water and wind,” Esquibel said.

He also acknowledged some early challenges with Wi-Fi. Part of the issue was working with Cisco’s CleanAir technology, which is supposed to minimize RF interference, if not eliminate it altogether. If an AP starts broadcasting over a frequency in use by another AP, for example, CleanAir helps it find another frequency. It took a few months to fully tune the network; some directional antennas needed a 10-degree adjustment, Esquibel said. Another challenge was having APs from more than one vendor. “If your network is 100 percent Cisco and all leveraging the same controllers, [CleanAir] will work perfectly,” Esquibel said. “If you have a mixed environment that pushes Wi-Fi in certain locations, it becomes a problem — there’s competition for frequencies.”

Coordinating the APs

A third-party leveraging a non-public frequency would switch channels, for example, causing the APs for public use to also switch channels. “What we had was a lot of bouncing back and forth,” Esquibel said, which affected performance. “So we assigned channels and frequencies for each AP, which still requires a lot of coordination.”

Under-seat Wi-Fi enclosure

Since 2013, the stadium has been carved into 24 DAS sectors. AT&T, T-Mobile USA and Verizon Wireless are the carriers presently using the DAS; Ericsson makes the DAS antennas. Stubborn Sprint relies on a tower adjacent to the stadium.

Dodger fans average anywhere from 500 to 655 megabytes of data use per game, according to Esquibel. During a busy game, the wireless network accommodates 16,000 concurrent users; a slower event clocks in at 4,000 to 8,000. To test upload speed, Esquibel will push a 50 MB video to Facebook. When there’s lots of available bandwidth, he gets 60 Mbps performance; on the low end, it’s closer to 4 Mbps. Esquibel said users are mostly streaming and posting videos and photos to social media; Dodger Stadium is the second-most-Instagrammed site in Southern California, after Disneyland, he added.
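For a rough sense of what those speeds mean for the 50 MB Facebook test, idealized transfer times work out as follows (a sketch that ignores protocol overhead and assumes MB means megabytes):

```python
FILE_MB = 50  # size of the test video, in megabytes
file_megabits = FILE_MB * 8  # 400 megabits to move

def upload_seconds(link_mbps: float) -> float:
    """Idealized upload time at a given link rate, ignoring overhead."""
    return file_megabits / link_mbps

print(round(upload_seconds(60), 1))  # ~6.7 s on a lightly loaded network
print(round(upload_seconds(4), 1))   # 100.0 s at the low end
```

In other words, the same test video that uploads in under seven seconds on a quiet network takes well over a minute and a half when the stadium is saturated.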

The Dodgers have their own version of Ballpark, the in-stadium MLB app, which offers video replay and highlights; in-seat ordering of food and drink in certain areas; and stadium mapping. Check-ins on Ballpark are handled through a network of 44 iBeacons, which take advantage of Bluetooth Low Energy (BTLE) technology. Between Ballpark and social media activity, Dodger fans have run up as much as 700 MB of data usage during games, and the network is ready if demand grows.

“We don’t do any rate limiting, so if we consume all our bandwidth we get a free upgrade, thanks to a clause in our agreement with our ISP, AT&T,” Esquibel explained.

To ensure a family-friendly and wholesome environment, the Dodgers use Palo Alto Networks 5020 firewalls for content filtering. “As we developed our SLAs, it was one of the first issues to pop up — no sexual content, no malware/phishing, and no illegal drug sites,” he said.

What’s on his wish list for the future? “I’d like geo-fencing within the Wi-Fi network so if I see someone enter a club, I can say hi or welcome them, notify them of specials, or flag points of interest around the stadium,” Esquibel said, like the World Series trophy case or giveaway locations for promotional items. Alongside all the other applications, wireless can serve as a guidepost for fans and visitors to Dodger Stadium.
