Assistant Teaching Professor

Assistant Teaching Professor Daniele Profeta introduces students to digital animation techniques as a way to construct immersive environments

Daniele Profeta is an Italian architect and designer. He is currently an Assistant Professor at the Syracuse University School of Architecture. He received a Master of Architecture from Princeton University and split his undergraduate studies between La Sapienza University in Rome and the KTH School of Architecture in Stockholm, where he graduated with Honors.

His current research explores the potential of digital mapping as a practice of world-building, one that collapses heterogeneous forms of information to develop a thicker understanding of contemporary architectural discourse. Daniele’s past projects range from investigative geography used to construct spatial histories within the context of Guatemala’s civil war, to projective narratives speculating on the role that contemporary surveying technologies play in shaping the political landscape of our cities.

Q & A with Professor Profeta

Can you give us some background about the computer rendering that you do? 

I use rendering to produce immersive 360° digital animations. More specifically, we have been rendering short videos (approximately 9 minutes long) of speculative environments imagining possible futures for our cities. View samples of student work below.

What is the nature of the content being rendered, and what software or applications are being used to create it?

We have been working with photogrammetry to three-dimensionally capture elements of our current built environment, which are then brought together to construct a digital set within animation packages such as Autodesk Maya. Ultimately, the videos are rendered to still frames on the new render farm using Arnold, a rendering engine for Maya. These frames are then composited within post-processing video applications such as Adobe After Effects and Premiere.

Can you tell us about the courses you teach that incorporate this technology?

ARC500 SEC M009 – ‘360° – Speculations on Google Street View’
The research in this course will continue this coming semester (Spring 2019) with an additional ARC500 seminar, ‘360° – Future Archeologies’. This ongoing research would not have been possible without the support of the Research Computing Infrastructure.

In this course, students were able to independently use the Arnold render farm set up within the Research Computing infrastructure to render their immersive animations. Students can test-render low-resolution single frames from the Computer Lab here at the School of Architecture to tweak and adjust their videos. When the videos are ready, they submit jobs directly to the render farm through Backburner, which leverages the farm’s large computing capacity to quickly process the submitted material. The availability of the SU computing infrastructure immensely streamlined their workflow and allowed us to produce higher-resolution renderings of complex environments.
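Under the hood, farm submissions of this kind typically boil down to splitting an animation into frame ranges and issuing one batch-render command per range. The sketch below illustrates the idea; the scene name, output path, and chunk size are hypothetical examples (not the course’s actual configuration), and a real Backburner submission would go through its own job manager rather than these raw command strings.

```python
# Sketch: splitting a Maya/Arnold batch render into per-node frame chunks.
# Scene path, output directory, and chunk size are hypothetical examples.

def chunk_ranges(start, end, chunk):
    """Yield inclusive (first, last) frame ranges of at most `chunk` frames."""
    frame = start
    while frame <= end:
        yield frame, min(frame + chunk - 1, end)
        frame += chunk

def render_commands(scene, out_dir, start, end, chunk=100):
    """Build one Maya batch-render command line per chunk, using the
    Arnold renderer (-r arnold) and a start/end frame range (-s/-e)."""
    return [
        f"Render -r arnold -s {s} -e {e} -rd {out_dir} {scene}"
        for s, e in chunk_ranges(start, end, chunk)
    ]

# A 450-frame animation split into chunks of 200 frames yields three jobs.
for cmd in render_commands("city_set.mb", "./frames", 1, 450, chunk=200):
    print(cmd)
```

Chunking this way lets the farm process many ranges in parallel, which is what turns a multi-day single-machine render into something that finishes quickly.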

How was your rendering done before SUrge became available?

Rendering before SUrge was quite complex. For each second of the final videos, we had to produce 30 still images, since digital animations run at a frame rate of 30 frames per second. You can imagine how this multiplies the amount of rendering necessary; we quickly found ourselves with over 16,000 images to be rendered. Often this large amount of computing had to be sent off to commercial render farms, which would charge a significant amount to render out our material.
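The arithmetic behind that figure is straightforward: at 30 frames per second, each minute of video requires 1,800 still images, so a single nine-minute animation already calls for 16,200 frames. A minimal sketch:

```python
# Frame-count arithmetic for a 30 fps digital animation.
FPS = 30  # frames per second

def frames_needed(minutes):
    """Total still images required for `minutes` of video at 30 fps."""
    return minutes * 60 * FPS

# One ~9-minute video: 9 * 60 * 30 = 16,200 frames,
# consistent with the "over 16,000 images" figure above.
print(frames_needed(9))  # 16200
```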

How did the rendering process change as a result of SUrge? 

Now, using SUrge, the process is a lot smoother, and it has facilitated the workflow immensely. First of all, the economic advantage is significant. Secondly, and in my opinion most importantly, by controlling the final output process through the SUrge render farm, we were able to maintain a higher level of quality control over the material being produced, which ultimately eased the final production of the videos.

Examples of Student Work

Work by Lionel J. Thompson, Kevin Manuel Anton and Raymond Guo


Work by Miao Hui and Sibo Fei

Browsing en la Bodega from Pattaraporn Kittisapkajon and Geraldine Vargas on Vimeo.

Anthro-Park from Doria P. Miller and Priscilla Almendariz on Vimeo.

Work ‘Machine Maniac’ by Miao Hui and Sibo Fei.

More About SUrge

The deployment of a Graphics Processing Unit (GPU)-based computing cluster, known on the Syracuse University campus as SUrge, is maturing in both its physical components and in how the jobs fed to it are managed.

In an earlier article we saw how the Newhouse School’s exploration of photogrammetry projects benefited from the unique capabilities of GPU processing. Here we will further explore how Syracuse University’s SUrge GPU cluster is facilitating the rendering of immersive digital environments by faculty and students alike in SU’s School of Architecture.

SUrge Hardware:

  • Intel Core i9-7980XE 2.6GHz 18-Core Processor
  • 3 x Samsung 970 Pro 1.0TB M.2-2280 Solid State Drive
  • 128GB DDR4-2666 Memory
  • 1 x NVIDIA GeForce GTX 1080 Ti GPU
  • 10 Gb Network Interface
  • Operating System:
    Microsoft Windows 10
  • Software:
    Agisoft PhotoScan

SUrge hardware


This page’s header image was sourced from the Anthro-Park video produced by Doria P. Miller and Priscilla Almendariz.