
Monday, 25 April 2011 20:07

From the Sidelines to the Live 3D Game

Written by Carl Mrozek

While Avatar’s success has triggered a lemming migration into the third dimension by directors of dramatic fare, the ripples are also lapping against the bleachers of spectator sports and other live events. But while a movie can always be fixed in post, a live event is a beast of a different stripe that has, at best, a 10-second time-delay buffer. Nevertheless, ESPN’s early commitment to 3D seems to have spurred many to get off the sidelines and into the live 3D game.

Joe Signorino, Sr., is an engineer at NEP Supershooters, which specializes in covering sports and other live events for broadcast, and he designed one of the first 3D production trucks in the U.S. “We rebuilt our first 3D truck several times in 10 months and replaced it with the two 3D trucks we now operate,” Signorino reports. “Based on that experience, we can now modify a 2D truck for 3D use fairly quickly, if necessary.” Today, one of NEP’s full-time 3D trucks is dedicated entirely to ESPN productions, while the other handles DIRECTV and other projects, ranging from live music concerts to poker tournaments and even cooking shows. For ESPN, Signorino and the NEP Supershooters team have covered a range of sporting events in 3D, including the Summer and Winter X Games, football, golf and boxing.

Madison Square Garden (MSG) Networks, also an NEP client, first tried 3D in 2007 as part of a test of 3D delivery to homes with Cablevision. However, they did their first bona fide live 3D sports broadcast in March 2010 with a hockey game between the N.Y. Islanders and N.Y. Rangers. “Since [MSG is] the host network for both teams, this enabled us to control the entire production in 3D and 2D,” explains Jerry Passaro, SVP of network operations and distribution. Before the game, Passaro had a number of concerns about 3D. “We were concerned about the speed of the game, the glass around the rink, and the protective netting around the goals,” he recalls. As a solution, Passaro rethought the camera placement in order to maximize low and close camera positions, which yield the best 3D images. “Typically our high cameras are on the ninth floor, but we brought them down to the sixth floor, one floor above the arena,” he explains. “It was still a bit high for 3D but the elevated ‘game cameras’ dovetailed pretty well with our wide rink-level cameras. We got plenty of ‘wow shots’ from the low cameras, despite juggling them to work around the glass, the goals and corner posts.”

According to Passaro, one particular sequence captured the full power of 3D: Early in the game, there was a fight on the ice that was covered up close in wide angle. “The fight was spectacular in 3D,” Passaro remembers. “It captured all of the drama and action hockey fans love and made you feel like you were right there. In our 3D theater, the audience response was incredibly visceral. They were standing up and yelling as if they were in the arena. That brawl alone made the game for us.”

Similarly, Jack Kestenbaum, director of operations for the N.Y. Yankees’ YES Network, believes that, if done well, 3D has great potential to add excitement to televised baseball. This stems, in part, from a game played in Seattle in 2010, which was covered in 3D. “We chose Safeco [Field] Stadium [for 3D coverage], partly because they have a low home [plate] camera position,” Kestenbaum explains. “From there you can feel 95 mile-per-hour fastballs whizzing at you, just like the batter does! It’s amazing! We’re convinced that 3D and baseball can make a good match, but camera placement and production technique are critical. As an industry, we have a lot to learn about 3D production technique, but we do know that fast camera moves and quick, short cuts just don’t work. In 3D, it’s more about [capturing] the fan’s perspective.”

The problem for producers and DPs is that good 3D shooting technique runs counter to the training many cameramen receive for covering live events in 2D. “Fast pans and zooms make you sick in 3D,” Signorino notes. “We train our camera operators to go slow whenever panning, tilting, zooming … [as] a good 3D [viewer] experience is paramount.” Achieving a painless but exciting viewer experience is made even more challenging by the fact that 3D directors have to cover games and concerts with fewer cameras than the 2D crews shooting the same event. “Today, all of the money is in the 2D broadcast, [which is] where the audience is,” explains Signorino. “We’re usually the second or third truck in at most venues. They get first crack at all of the camera positions, and we have to work around them, often with a half or even a third as many cameras.”

Luckily, this deficit is partly offset by the fact that some camera angles simply don’t work in 3D and are thus superfluous. Medium and close-up shots of athletes on the field, made possible by ultra-long (up to 100X) box lenses, are a cliché of sports TV coverage. Today, these extra-long 2D lenses don’t have 3D counterparts. And even if they did, the resulting image would look essentially flat, with little real depth. Hence, most directors don’t bother attempting shots like that in 3D. “There is only a limited number of positions in a stadium that work well for 3D anyways,” says Signorino. “Most of them are fairly close to the action.” One trick Signorino has learned to compensate is the selective use of the host network’s 2D cameras, whenever feasible: “Sometimes you grab a wide scenic, a POV or fixed shot in 2D and it blends in, as long as you don’t overdo it.”
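
The geometry behind that flatness is simple: perceived depth comes from the small left/right disparity between the two views, and that disparity shrinks rapidly with subject distance. The rough sketch below is only an illustration of the principle, not anything from the productions described here; it assumes a simple parallel rig where on-sensor disparity is roughly interaxial x focal length / distance, and every number in it is made up for the example.

# Illustrative sketch only: why distant, long-lens shots read as flat in 3D.
# Assumes a parallel stereo rig; disparity ~= interaxial * focal_length / distance.

def disparity_mm(interaxial_mm, focal_mm, distance_mm):
    """Approximate on-sensor disparity for a subject at the given distance."""
    return interaxial_mm * focal_mm / distance_mm

INTERAXIAL = 65.0  # roughly eye-width baseline, in mm (assumed value)

# One metre of depth seen from rink-side with a 50 mm lens, about 8 m away:
near = disparity_mm(INTERAXIAL, 50.0, 8_000) - disparity_mm(INTERAXIAL, 50.0, 9_000)

# The same metre of depth seen from 120 m away with an 800 mm long lens:
far = disparity_mm(INTERAXIAL, 800.0, 120_000) - disparity_mm(INTERAXIAL, 800.0, 121_000)

print(f"depth cue up close:  {near:.3f} mm")   # ~0.045 mm of relative disparity
print(f"depth cue long lens: {far:.4f} mm")    # ~0.0036 mm -- an order of magnitude less

Even with a lens 16 times longer, the distant position delivers less than a tenth of the relative disparity, which is why such shots tend to look like flat cutouts rather than 3D.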

In fact, some directors believe that intercutting a modest amount of 2D footage actually heightens the impact of bona fide 3D shots. “Your brain gets fatigued from repeated 3D shots. Strategically intercutting some 2D shots rests it, unconsciously,” says Nagamitsu Endo, producer for NHK’s 3D production unit NHK Cosmomedia America. This special division of NHK has been doing live production in 3D for nearly two decades. Cosmomedia sometimes provides the 2D crew and other times it utilizes 2D shots from the host network’s cameras. “Every situation is different, so you have to see what is feasible and what makes the most sense,” says Endo.

However, the reality is that in most stadiums and arenas camera positions are pre-established and limited in number. Moreover, they’re all designed for standard broadcasts in 2D, which is what virtually all terrestrial and satellite TV is. Hence, 3D live-event directors have to make the most of however many (or few) 3D camera positions they’re allotted for a given event. Within Japan, NHK Cosmomedia 3D crews have an advantage in that respect. “In Japan, the 2D crew is often with NHK also, so we can work in cooperation for access to good [3D] camera positions,” says Endo.

The ratio of 2D to 3D camera positions may be as low as 2:1 or as high as 5:1 at major events. That’s partly why an opportunity to shoot professional soccer exclusively in 3D for Japan’s J League was so welcome. It meant there was no other camera crew to compete with for access, and no being locked into established 2D camera positions. Even so, the number of 3D camera positions wasn’t multiplied. “Soccer is more forgiving in terms of camera positions than other sports like football, baseball [or] hockey, where camera positions are very restricted,” says Endo. “I rarely use more than six 3D camera positions because the key to spectacular 3D is to get the cameras as close to the action as possible.”

Whether his crew is the only one at an event or one of several, Endo always prefers to work with his own seasoned 3D crews. “If the budget allows, we always try to keep the 3D and 2D crews separate, because they have different mindsets and styles,” Endo explains. However, if a client insists, Endo will train local crews for 3D, provided he can do it properly. “It takes time and experience to shoot good 3D, but I have trained many local crews to do it,” Endo reports. “The key is spending enough time, including a dress rehearsal before the big event. It makes a big difference if they do some serious shooting and can review it on a 3D monitor before the real event, so they can see what works and what doesn’t.” Endo recalls some live 3D broadcast disasters that resulted when 2D crews were put on 3D cameras with too little advance training: “In some cases, people got headaches, left the theater and demanded their money back. It takes more than an hour of training to expect good results.”

New technology may help to bridge this training gap. PACE’s new Shadow rigs key on the 2D cameraperson to drive the 3D cameras linked to them. “The Shadow-D system reads the zoom and focus from the 2D operator’s control to drive the 3D zoom and focus,” says PACE President Patrick Campbell. “3D convergence and tilt can also be driven automatically from the 2D operator’s focus control. They were first used on ESPN’s Summer X Games and later at the U.S. Open for CBS Sports and boxing for HBO Sports. The future of 3D production lies in integrating the 3D show with the 2D show as much as possible. The Shadow-D system allows the use of shared resources, camera positions and operators.”
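
To make the idea concrete: a slaved rig of this kind mirrors the 2D operator’s lens settings and derives a convergence (toe-in) angle from the focus distance. The sketch below is only a generic illustration of that approach under assumed numbers, not PACE’s Shadow-D code; the function name, parameters and values are hypothetical.

import math

# Illustration only -- NOT PACE's Shadow-D implementation. It shows how a slaved
# stereo rig could mirror a 2D operator's lens data and converge on the focus plane.

def follow_2d_lens(zoom_mm, focus_distance_m, interaxial_m=0.065):
    """Copy the 2D zoom/focus to both eyes and toe them in toward the focus plane."""
    eye_settings = {"zoom_mm": zoom_mm, "focus_m": focus_distance_m}  # same for left and right

    # Toe each camera in by half the convergence angle so the optical axes
    # cross at the distance the 2D operator is focused on.
    toe_in_deg = math.degrees(math.atan((interaxial_m / 2.0) / focus_distance_m))

    return eye_settings, toe_in_deg

# Example: the 2D operator zooms to 40 mm and focuses at 12 m.
settings, toe = follow_2d_lens(zoom_mm=40.0, focus_distance_m=12.0)
print(settings, f"toe-in per camera: {toe:.3f} degrees")   # ~0.155 degrees

A production system would presumably also have to deal with rig offsets and lens-metadata latency, which is where the real engineering effort in such a product lies.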

Nevertheless, not all live-event coverage requires the latest, highest-tech gear available. This was demonstrated at Florida State University, home of pioneering work in 3D live-event production. “Initially we used a pair of Canon’s XL H1 HDV cameras on a custom-made parallel rig,” says Mark Rodin, executive director of Seminole Productions at Florida State University. “We did most of the alignment by eye and in post, as we had no 3D field monitors and were happy to get two to three minutes of usable footage per game.” Shooting this way, Rodin was able to cobble together an impressive 15-minute annual highlights reel, which was projected in the school’s 3D theater for recruiting and alumni events. Today, Rodin gets far more game footage and saves weeks in post with Panasonic’s self-contained AG-3DA1 3D camcorder. “Panasonic’s [stereo] camcorder is so portable that we now get shots we never attempted before, like crowd reaction, behind the scenes in the locker room [and] cutaways,” Rodin reports. “Now we can get as much footage as we need, around 30 minutes per game, and it’s all perfectly aligned!” Rodin has also doubled the number of camera positions to two, one in each end zone. Moreover, each cameraman can now preview the footage live in 3D on Panasonic’s 24-inch 3D monitors, and when editing 3D footage in [Apple] Final Cut Pro with CineForm’s Neo3D plug-in, the crew can preview it on JVC’s new 46-inch passive 3D monitor.

One thing that’s high on everyone’s 2011 NAB list is a smaller, portable 3D camera for handheld use; however, Signorino says the small self-contained camcorders like Panasonic’s and Sony’s aren’t the best choice for them. “We need something portable but stable that you can connect to a switcher and which performs well in low light,” says Signorino. “The small camcorders don’t work for us.”
