FINE – Objectives

Recently there has been an explosive re-emergence of stereoscopic content and related technologies, driven mainly by the widespread distribution of digital cinema and recent advances in 3D projectors and stereoscopic display technologies. At the time this project proposal is being written, all major Hollywood studios and the most renowned producers and directors are engaged in 3D films; 46 major 3D movies, both animated and live-action, are expected to be released in 2009. It is important to note that the term "3D" often refers to two very different concepts. In the statement above, "3D" refers to stereoscopic imaging: a display technique that presents two synchronized views of an object or scene, which are perceived as a single three-dimensional image. In other contexts, "3D" refers to a truly three-dimensional representation of the geometry of a scene (i.e. "3D computer graphics" or "3D reconstruction").
The growing industry interest in stereoscopic content has raised many research challenges, which are being addressed by a number of research projects on topics such as 3D stereoscopic displays or 3DTV. FINE is not focused on stereoscopic technology; instead, it focuses on the creation of truly three-dimensional content, which is therefore independent of the representation or display technology used for its exhibition: live free-viewpoint content could be shown on regular 2D devices (current TVs, mobile phones, computer monitors), stereoscopic displays and projectors, current auto-stereoscopic monitors, and future holographic displays or other devices. The proposed high-quality view interpolation, live-action 3D reconstruction and coding of free-viewpoint content require computationally expensive algorithms, and the performance of current approaches is far from real time. In fact, most current tools for high-quality view interpolation and live-action 3D reconstruction used by the visual effects community in the postproduction of commercials and feature films require many hours of both manual labor and intensive computation.
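To illustrate the cost structure of view interpolation, the sketch below shows the simplest conceivable baseline: a per-pixel cross-fade between two aligned camera views. This is a hypothetical illustration, not the project's method; even this crudest form touches every pixel of every interpolated frame, and high-quality methods add per-pixel correspondence search (depth or optical flow) on top, which is what makes them so expensive.

```python
# Minimal sketch of the simplest view-interpolation baseline: a per-pixel
# cross-fade between two aligned camera views. High-quality view
# interpolation additionally estimates per-pixel correspondences
# (depth / optical flow), which dominates its computational cost.

def interpolate_views(view_a, view_b, t):
    """Blend two equally sized images (flat pixel lists).

    t = 0.0 reproduces view_a, t = 1.0 reproduces view_b; values in
    between give a virtual view "between" the two cameras.
    """
    if len(view_a) != len(view_b):
        raise ValueError("views must have the same size")
    return [(1.0 - t) * a + t * b for a, b in zip(view_a, view_b)]

# Tiny 1-D "images" standing in for two camera views.
left = [0.0, 0.2, 0.4, 0.6]
right = [1.0, 0.8, 0.6, 0.4]
mid = interpolate_views(left, right, 0.5)
print(mid)  # midway blend: every pixel ends up at (approximately) 0.5
```

Note that the per-pixel loop is independent across pixels, which is also why this family of algorithms is a natural candidate for the parallel acceleration discussed below.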

Recent advances in HPC, in particular the introduction in 2008 of the first generation of affordable personal supercomputers, which pack the computational power of 960 processing cores (3.7 teraflops) into a desktop-sized workstation, open new and exciting possibilities for accelerating such algorithms by harnessing the massively parallel processing power of GPUs.
One focus of the FINE project will be the optimization and parallelization of existing methods, together with the development of new parallel algorithms for the 3D reconstruction and coding of free-viewpoint content, specially designed and optimized for GPUs.
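To illustrate why live-action 3D reconstruction maps well onto GPUs, the sketch below uses silhouette-based visual-hull carving, a classic reconstruction technique chosen here only as an example (not necessarily the method FINE will adopt): every voxel is tested independently against every camera's silhouette, so the work is embarrassingly parallel and could run as one GPU thread per voxel. The toy orthographic cameras and silhouettes are invented for the illustration.

```python
# Sketch of silhouette-based visual-hull carving, a classic live-action
# 3D-reconstruction technique. A voxel is kept only if its projection
# falls inside the foreground silhouette of every camera. Because each
# voxel is tested independently, the loop body maps naturally onto one
# GPU thread per voxel; here it is plain sequential Python for clarity.

def carve(voxels, cameras):
    """voxels: iterable of 3-D points; cameras: list of (project, silhouette).

    project maps a 3-D point to 2-D pixel coordinates; silhouette is the
    set of foreground pixels in that view. Returns the voxels consistent
    with all views (the visual hull).
    """
    hull = []
    for v in voxels:  # each iteration is independent -> trivially parallel
        if all(project(v) in silhouette for project, silhouette in cameras):
            hull.append(v)
    return hull

# Toy setup: two orthographic "cameras" looking along the z- and y-axes.
sil_front = {(x, y) for x in range(2) for y in range(2)}  # sees the x/y plane
sil_top = {(x, z) for x in range(2) for z in range(1)}    # sees the x/z plane
cameras = [
    (lambda p: (p[0], p[1]), sil_front),
    (lambda p: (p[0], p[2]), sil_top),
]
voxels = [(x, y, z) for x in range(3) for y in range(3) for z in range(3)]
hull = carve(voxels, cameras)
print(hull)  # only voxels with x < 2, y < 2 and z == 0 survive both views
```

On a GPU the voxel loop would be replaced by a kernel launch over the voxel grid, which is exactly the kind of restructuring the project's parallelization work targets.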
Another important focus of the project will be the development of efficient coding and transmission algorithms that enable the delivery of live free-viewpoint content over present and future networks, taking full advantage of the high speed, increased bandwidth and other characteristics of Next Generation Networks (NGN). FINE will thus address the growing demand for personalized multimedia content and services by allowing optimal real-time delivery. The research work will help guarantee that the increased throughput does not degrade the QoS required by Future Internet home users.

FINE envisions a new form of live media content that will offer the consumer an unprecedented free-viewpoint experience. In particular, FINE will enable the delivery of this new interactive and immersive content to a very large number of consumers through Next Generation Networks. Viewers will be able to place a virtual camera in a real live-action scene and move it freely, in space and time, around the photorealistic scene in real time, heightening their sense of presence and conveying a true feeling of immersion. The project will exploit the latest developments in parallel processing technology to achieve a highly realistic image-based rendition of the scene while keeping the real-time performance required for live events such as sporting events, concerts, car or horse races, etc. A football match is an especially interesting scenario for evaluating the technologies developed in this project, for two main reasons: first, its large viewer base and potential impact; second, the many scientific challenges it presents. Two teams of 11 players dressed alike play in a very large outdoor arena, often under uncontrolled and changing lighting conditions, so 22 players plus several referees run freely around the field, bumping into each other and creating all kinds of occlusions from the viewpoint of the capture cameras. A more controlled environment, such as a boxing ring, a tennis court or a studio stage, would also provide an interesting but less challenging scenario.

We expect to produce a significant scientific, technical and socio-economic impact, which can be summarized by the following two achievements:
• New tools for professional media producers
• Home applications and new immersive experiences for home users