SVG Special Report, Part 1: ESPN’s Digital Center 2 Offers Treasure Chest of Core Technology

Years of planning, millions of dollars, and countless worker-hours will finally pay off next month when ESPN launches Digital Center 2 at its Bristol, CT, campus. The 194,000-sq.-ft. technological behemoth will serve as the new home of SportsCenter, ESPN’s NFL studio programming, and a host of other studio shows. Built around a first-of-its-kind JPEG 2000–based IP routing core and almost entirely fiber-based connectivity, ESPN’s 18th Bristol building promises to be a format-agnostic facility prepared to handle not only end-to-end 1080p production but whatever may come down the technological path — be it 4K, 8K, or beyond.

“There are still pockets of baseband [workflows], but we were aiming for a format-agnostic facility that preps us for any media that is going to be coming — whether it be television, mobile, Web-based, IP video, or anything else,” says ESPN CTO/EVP Chuck Pagano, who, now that DC2 is all but finished, will retire next year after 35 years at the company. “I really don’t want to cement ourselves into one format, such as 720p or 1080p. We have a cardio-pulmonary system that is almost entirely IP over fiber and basically format-agnostic, so that, if we ever get into the 4K arena at some point, then we have at least the pipes and infrastructure to build for a new generation with minimal disruption.”

DC2 houses five studios (two for SportsCenter, one for NFL programming, one for various pre/postgame and halftime programming, and one whose use is yet to be determined), six production-control rooms, 16 craft edit suites, and four audio-control rooms.

“We now have over 400,000 sq. ft. of production space on the Bristol campus. In 2003, before we built DC1, we had 8,500 sq. ft. of studio space — just think about that,” says ESPN SVP of Content Information and Technology Kevin Stollworthy. “With DC2, we feel that we are truly prepared for the foreseeable future, as well as the unforeseeable future, in terms of both technology and studio space.”

Hub and Spokes
The facility features a hub-and-spoke infrastructure: each control room, studio, edit bay, and audio room is tied to a central routing core through which all signals are distributed. Tying it all together are more than 1,100 miles of fiber-optic cable and 247 miles of copper. Although Hughes Integration provided systems-integration services, the entire facility was custom-designed by ESPN’s engineering and technology staff.

“The entire design was done in-house by our engineering staff,” says Mitch Rymanowski, VP, technology and engineering, ESPN. “That was a big difference from DC1, which was mostly outsourced. You really feel connected when you design it yourself, and there is a certain sense of ownership. We knew this was going to be a once-in-a-lifetime opportunity, so you might as well get dirty and design it all internally.”

A One-of-a-Kind Routing Core
At the core of the facility, a behemoth EXE-X2 IP router custom-designed by Evertz provides 2,304 ports (currently half populated) and is capable of routing more than 6,000 1080p streams, as much as 92 Tbps of total throughput. ESPN is using JPEG 2000 (J2K) compression exclusively for all IP routing, which keeps latency ultra-low (less than a frame for encode and decode combined). In addition, ESPN is using the Evertz MAGNUM unified control system and VIP multiviewers.
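The capacity gain from routing compressed rather than baseband signals is easy to see with back-of-envelope math. The short Python sketch below assumes a 10-Gigabit Ethernet router port, uncompressed 1080p60 at roughly 3 Gbps (a 3G-SDI payload), and a J2K mezzanine encode of roughly 200 Mbps; these figures are illustrative assumptions, not published ESPN or Evertz specifications.

```python
# Back-of-envelope stream math for a compressed IP routing core.
# All rates below are illustrative assumptions, not ESPN/Evertz specs.

PORT_MBPS = 10_000          # assumed 10 GbE router port
UNCOMPRESSED_MBPS = 3_000   # ~3G-SDI payload for uncompressed 1080p60
J2K_MBPS = 200              # assumed JPEG 2000 mezzanine rate

uncompressed_per_port = PORT_MBPS // UNCOMPRESSED_MBPS   # 3 streams
j2k_per_port = PORT_MBPS // J2K_MBPS                     # 50 streams

print(f"Uncompressed 1080p streams per port: {uncompressed_per_port}")
print(f"J2K 1080p streams per port:          {j2k_per_port}")
print(f"Capacity gain from J2K:              ~{j2k_per_port / uncompressed_per_port:.0f}x")
```

Under those assumptions, each port carries roughly 17 times as many J2K streams as uncompressed ones, which is why a compressed IP fabric can move thousands of 1080p signals through a router of this size.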

“If we just went baseband here, we would need a 2,304-squared router, which is not really feasible,” says Jonathan Pannaman, senior director, technology, ESPN. “And we would have tie-line problems with other buildings. So we decided, if we want to do 1080p right now and there’s talk of 4K and 8K, we didn’t want to be tied down. That is where the [Evertz EXE] router and [J2K] compression became interesting.”
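Pannaman’s “2,304-squared” remark refers to the quadratic cost of a nonblocking baseband crosspoint matrix, which needs N×N switch points for N ports; the quick arithmetic below shows how fast that grows.

```python
# A classic nonblocking crosspoint matrix needs N x N switch points,
# so baseband router complexity grows quadratically with port count.
for n in (1_152, 2_304):
    print(f"{n:>5} x {n:>5} = {n * n:>9,} crosspoints")
# 1152 x 1152 = 1,327,104 crosspoints
# 2304 x 2304 = 5,308,416 crosspoints
```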

In addition, ESPN installed a 1,152×1,152 Evertz EQX-X2 router — a monster of a router in its own right — to handle remaining baseband workflows.

“We believe that this is truly future-proofing our facility,” says Rymanowski. “But there is a certain feeling of safety that goes along with knowing that we have a regular baseband network that people can be comfortable with.”

To cover itself in case of a catastrophe, ESPN has also deployed redundant EXE and EQX routers, doubling the potential horsepower of the already ultra-high-powered DC2. To put things in perspective: just one of DC2’s EXE routers could route the entire Bristol campus six times over at 1080p and just under two times at 4K — and ESPN now has two of them.
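Read at face value, those multiples imply that a 4K stream costs roughly three times the bandwidth of a 1080p stream, a plausible figure given that 4K carries four times the pixels of 1080p; the sketch below works through the arithmetic (the multiples come from the article, but reading them as a bitrate ratio is an assumption).

```python
# Implied 4K-to-1080p per-stream bandwidth ratio from the capacity
# multiples quoted above. Treating the multiples as a bitrate ratio
# is an assumption, not a published figure.
campus_multiples_1080p = 6.0   # "six times over at 1080p"
campus_multiples_4k = 2.0      # "just under two times" at 4K

ratio = campus_multiples_1080p / campus_multiples_4k
print(f"Implied 4K/1080p per-stream bandwidth: ~{ratio:.0f}x")   # ~3x
# Plausible: 4K has 4x the pixels of 1080p, and mezzanine
# compression can narrow that gap somewhat.
```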

“One major issue is, our Digital Center 1 will have been on the air for 10 years in June,” says Rymanowski. “At some point, you have to think about how you are going to retrofit that, and we can’t shut things down over there. So we needed this to be capable of scaling up to support all of that when the time comes. Until we decide what our plans are [for the retrofit], this will buy us some time.”

As for retrofitting DC1, Pannaman says, “I think we will make a very easy decision not to have baseband in at all. We will have had a couple years running on this and know categorically that it is all OK.”

ESPN Goes Green With Power, Water
DC2’s technological prowess is not just about content. The facility is also a highly “green-friendly” building and was submitted to the U.S. Green Building Council for consideration at the “Certified” level of the LEED New Construction program.

Among the techniques reducing overall energy consumption are high-efficiency condensing boilers, carbon dioxide monitoring for demand-control ventilation, high-performance CFL and LED lighting, occupancy light sensors and electronic dimming during daylight hours, variable-frequency–drive (VFD) controls on fan motors and pumps, and high-thermal-property glazing to keep the building cooler in summer and warmer in winter.

In addition, DC2 employs a number of water-saving measures. A water-efficient landscape design features native plants selected for tolerance to dry conditions, so no irrigation system is needed. To reduce the use of municipal water, groundwater is captured, treated, and used for toilet flushing, and air-handler condensate is captured and treated for use in cooling-tower makeup.
