The Team behind the Hal Saflieni Hypogeum Virtual Reality Experience


To be born and to live in Malta is, in effect, to be born and to live in a museum. Every town and village is steeped in history spanning centuries and millennia. It would be unnatural for any Maltese citizen not to appreciate the countless artefacts left behind as evidence of the tumultuous lives of our ancestors, who had to strive for survival.

As a tribute to our nation, we at Saint Martin's Institute of Higher Education are dedicating our research and innovation efforts to portraying the cultural wealth of our country. We chose #LoveMyCountry as our tagline, because it captures what we believe has driven our nondescript nation throughout the millennia to become what we are today.

This particular project is unique, locally and globally.

Transforming the Hal Saflieni Hypogeum into a Virtual Reality Model

This project entailed substantial investment in research and development, stretching the technology as it emerged in order to achieve as realistic an experience as possible. Virtual Reality is an immersive experience in which the user's (player's) senses are tricked and transposed into the virtual model itself. The user is no longer a mere spectator of the gameplay, navigating with a controller and viewing a 2D image on a screen, but is actually inside the game, with all the senses triggering the emotions that the producer is aiming for.

The Institute is a pioneer in the education of digital game design and development. With this project, faculty and final-year undergraduate students want to show how, by using readily available game design and development tools alongside original software developed by the department itself, an educational experience can be created that closely replicates a real monument.

The team members who worked on this project have decades of experience in their respective fields and have published in international peer-reviewed journals. The virtual reality film of the Hal Saflieni Hypogeum is just the tip of the iceberg in the overall work carried out by the Institute. It is a pleasure to showcase the efforts of Maltese academics and final-year computing students, and to whet the aspirations of the younger generations of computer science students to join a degree programme that is an international leader in its field rather than a follower.

In the following sections, each team member explores his role in this particular phase of the Hal Saflieni Hypogeum project, followed by a peer reviewed publication that highlights the research area of the individual.

The Making of the Hal Saflieni Hypogeum Virtual Reality Model using the Unity® Game Engine

I was one of the first people to join this project. The brief was to create an experience that would allow one to navigate a 3D representation of the Hypogeum in a user-friendly way using Virtual Reality.

Fixing the 3D model to be dual-sided

One of the first tasks tackled was that of making the 3D model dual-sided. Heritage Malta had created the 3D model of the Hal Saflieni Hypogeum by scanning the real site with a laser scanner, producing a point cloud of millions upon millions of separate points in 3D space, some of them only 3 mm apart. This point cloud then had to be simplified into a usable 3D model to allow real-time rendering. Even the simplified model is still quite complex, comprising around 3 million polygons.

When this model was imported into the Unity® game engine, certain portions of the model were not being rendered because some polygons were facing the wrong way. For context, Unity® renders only those polygon faces that are oriented towards the player (back-face culling), which improves performance. Since some of the faces pointed away from the viewer, they simply disappeared.

Fig 1. The sheer number of polygons facing the wrong way meant that each polygon had to be duplicated so that both sides, front and back, would render.

Since the 3D model was very complex, and given the sheer number of polygons that needed to be reversed, it was decided that the best approach was to duplicate the model and reverse all the normals of each facet in the duplicate.
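For illustration, a minimal sketch of this duplicate-and-flip approach in Unity® C# might look like the following. This is not the project's actual tooling; the class and method names are hypothetical.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// A minimal sketch (hypothetical, not the project's actual code) of making
// a mesh double-sided: append a copy of every triangle with reversed
// winding order and flipped normals, so both faces render.
public static class DoubleSidedMesh
{
    public static Mesh MakeDoubleSided(Mesh source)
    {
        var mesh = Object.Instantiate(source);
        mesh.indexFormat = IndexFormat.UInt32;   // needed for multi-million-polygon meshes

        var vertices = mesh.vertices;
        var normals = mesh.normals;
        var triangles = mesh.triangles;
        int vertexCount = vertices.Length;

        // Duplicate every vertex, flipping the normal on the copies.
        var newVertices = new Vector3[vertexCount * 2];
        var newNormals = new Vector3[vertexCount * 2];
        vertices.CopyTo(newVertices, 0);
        vertices.CopyTo(newVertices, vertexCount);
        for (int i = 0; i < vertexCount; i++)
        {
            newNormals[i] = normals[i];
            newNormals[vertexCount + i] = -normals[i];
        }

        // Duplicate every triangle with reversed winding so it faces backwards.
        int triCount = triangles.Length;
        var newTriangles = new int[triCount * 2];
        triangles.CopyTo(newTriangles, 0);
        for (int t = 0; t < triCount; t += 3)
        {
            newTriangles[triCount + t] = triangles[t] + vertexCount;
            newTriangles[triCount + t + 1] = triangles[t + 2] + vertexCount; // swapped pair
            newTriangles[triCount + t + 2] = triangles[t + 1] + vertexCount; // reverses winding
        }

        mesh.vertices = newVertices;
        mesh.normals = newNormals;
        mesh.triangles = newTriangles;
        return mesh;
    }
}
```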

Mr Jeremy Grech describing the processes behind developing the Hal Saflieni Hypogeum VR model.

Modifying the Textures

The 3D model provided by Heritage Malta also came with several textures captured from images taken on-site with a high-definition camera. To make these textures react to light more convincingly, they were modified by introducing normal maps and height maps into each texture. Normal maps affect how light scatters off a surface, while height maps modify the apparent surface geometry of a polygon, giving the shape more depth and thus more realism.

Fig 2. From left to right: (2a) untextured model; (2b) default textures applied; (2c) normal maps and height maps added to the texture; (2d) all of the above combined with in-game lighting.
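As a minimal sketch of the idea, assuming Unity®'s built-in Standard shader, normal and height maps can be attached to a material like this (the component and field names are hypothetical):

```csharp
using UnityEngine;

// A minimal sketch (hypothetical names): attach a normal map and a height
// map to a Standard-shader material so its surface reacts to light.
public class TextureEnhancer : MonoBehaviour
{
    public Texture2D normalMap;   // imported with the "Normal map" texture type
    public Texture2D heightMap;   // greyscale height map

    void Start()
    {
        var material = GetComponent<Renderer>().material;

        material.SetTexture("_BumpMap", normalMap);      // normal map slot
        material.EnableKeyword("_NORMALMAP");

        material.SetTexture("_ParallaxMap", heightMap);  // height map slot
        material.SetFloat("_Parallax", 0.02f);           // height effect strength
        material.EnableKeyword("_PARALLAXMAP");
    }
}
```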

Navigation within the 3D environment

One of the drawbacks of navigating 3D models in traditional 3D editors is that it involves a lot of panning, zooming and orbiting of the scene camera. Needless to say, this is not exactly user-friendly, as it requires a good grasp of the 3D editor's controls as well as a good sense of general orientation.

Given that navigation needed to happen while wearing a VR headset, users would not be able to see what they were doing with their hands. And since this experience was designed to be accessible to everyone, a simple and straightforward approach was needed.

One of the significant improvements implemented in more recent versions of the Oculus® headset is hand-tracking controls. These allow users to see a virtual representation of how their hands are placed on the Oculus® hand controllers, and thus which buttons are being pressed.

The navigation system implemented used one controller for forward and backward movement and the other for pivoting. This meant that one would move forward (or backwards) in the direction one was facing, while also being able to rotate without physically turning around to see the surroundings. The result was a drone-like style of movement which proved quite intuitive, as test runs with new users seemed to confirm.
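A minimal sketch of this two-controller scheme, assuming Unity®'s XR input API (the speeds and field names are illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR;

// A minimal sketch: the left thumbstick moves the rig along the headset's
// gaze direction, drone-style, while the right thumbstick pivots it.
public class DroneLocomotion : MonoBehaviour
{
    public Transform head;           // the tracked HMD camera
    public float moveSpeed = 1.5f;   // metres per second
    public float turnSpeed = 60f;    // degrees per second

    void Update()
    {
        var left = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
        var right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

        if (left.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 move))
        {
            // Move in the direction the visitor is looking.
            transform.position += head.forward * move.y * moveSpeed * Time.deltaTime;
        }

        if (right.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 turn))
        {
            // Pivot on the spot, so the visitor need not physically turn.
            transform.Rotate(0f, turn.x * turnSpeed * Time.deltaTime, 0f);
        }
    }
}
```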

Plugging up the holes in the 3D models

One other problem encountered with the 3D model provided was that of voids: holes in the 3D model. These voids arose because it was impossible for the laser scanning equipment to capture all of the nooks and crannies present on-site; the equipment was too cumbersome to fit into such tight spaces.

Fig 3. In some cases the laser-scanned point cloud did not reach certain spaces, leaving a gaping black hole in the model. In such cases the hole was plugged with a rock model to keep the experience complete.

These voids were plugged with highly detailed rock models which blended in well with the surrounding environment, allowing one to remain immersed in the Virtual Reality experience for longer. The challenge of this task lay in the number, position and shape of the voids. It is also worth mentioning that some of the rock models used did not have textures that blended well with the rest of the textures around the hypogeum model; in such cases, the textures of these models were modified so that their hue would blend in as much as possible.

To help users navigate the space more efficiently, a pop-up map was included, indicating the user's position within the 3D environment and which way they were facing. It also showed points of interest in and around the site, helping users explore the site more thoroughly. Whenever the user came close enough to one of these points of interest, an information board would automatically pop up and turn to face the viewer. These information panels could contain images or video with accompanying text, and the text would also be read out by a narrator, with a diegetic reverberation filter applied to the audio.
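A minimal sketch of such a proximity trigger, assuming a standard Unity® trigger collider (the field names are hypothetical, and the player rig needs a Rigidbody for trigger events to fire):

```csharp
using UnityEngine;

// A minimal sketch: a sphere trigger around a point of interest shows the
// information board when the visitor enters, and keeps it facing them.
[RequireComponent(typeof(SphereCollider))]   // set isTrigger = true in the Inspector
public class PointOfInterest : MonoBehaviour
{
    public GameObject infoBoard;   // panel holding text, image or video
    public Transform player;       // the VR rig (needs a kinematic Rigidbody)

    void OnTriggerEnter(Collider other)
    {
        if (other.transform == player) infoBoard.SetActive(true);
    }

    void OnTriggerExit(Collider other)
    {
        if (other.transform == player) infoBoard.SetActive(false);
    }

    void Update()
    {
        if (infoBoard.activeSelf)
        {
            // Keep the board turned towards the viewer.
            infoBoard.transform.LookAt(player);
            infoBoard.transform.Rotate(0f, 180f, 0f);   // UI canvases face -Z
        }
    }
}
```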

Adding realistic sunlight

In the past, there was speculation that on certain days of the year, sunlight would flood into the middle level of the Hypogeum and hit the back of the area commonly known as the 'Holy of Holies'. The idea was a popular one and seemed entirely plausible, as similar astronomical alignments can be seen in other temples around Malta, and the upper layer of the Hypogeum was above ground when the site was still in use. Given that the Hypogeum is now entirely covered by relatively modern buildings, it is impossible to know for certain how sunlight would have interacted with these spaces. The only way this can be explored is through virtual means.

To explore this hypothesis, a sunlight script was incorporated into the 3D scene, allowing one to mimic the sun's position in the sky relative to the 3D model at any time of day on any day of the year. After extensive experimentation, it was concluded that no direct sunlight could ever have filtered far down into the middle layer of the Hypogeum. However, further experiments are being conducted to see how indirect sunlight would have interacted with these spaces, using the latest real-time ray-tracing technologies.
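As a minimal sketch of what such a script involves, a directional light can be rotated using a simplified solar-position model; the formulas below are a standard approximation and not necessarily the ones the project used:

```csharp
using UnityEngine;

// A minimal sketch (approximate, hypothetical settings): aim a directional
// light using the sun's elevation and azimuth for a given latitude, day of
// year and local solar hour.
public class SunlightSimulator : MonoBehaviour
{
    public Light sun;                              // a directional light
    public float latitude = 35.87f;                // Hal Saflieni, degrees north
    [Range(1, 365)] public int dayOfYear = 172;    // e.g. around the summer solstice
    [Range(0f, 24f)] public float solarHour = 12f;

    void Update()
    {
        // Approximate solar declination in degrees.
        float declination = -23.45f * Mathf.Cos(Mathf.Deg2Rad * 360f / 365f * (dayOfYear + 10));
        float hourAngle = 15f * (solarHour - 12f);  // degrees from solar noon

        float lat = latitude * Mathf.Deg2Rad;
        float dec = declination * Mathf.Deg2Rad;
        float ha = hourAngle * Mathf.Deg2Rad;

        // Elevation above the horizon.
        float elevation = Mathf.Asin(
            Mathf.Sin(lat) * Mathf.Sin(dec) +
            Mathf.Cos(lat) * Mathf.Cos(dec) * Mathf.Cos(ha));

        // Azimuth measured from north, turning east.
        float azimuth = Mathf.Atan2(
            Mathf.Sin(ha),
            Mathf.Cos(ha) * Mathf.Sin(lat) - Mathf.Tan(dec) * Mathf.Cos(lat)) + Mathf.PI;

        sun.transform.rotation = Quaternion.Euler(
            elevation * Mathf.Rad2Deg, azimuth * Mathf.Rad2Deg, 0f);
    }
}
```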

Creating light barriers

To make the lighting effects more realistic, and to prepare the project for further improvements such as the introduction of real-time ray-tracing, the 3D model needed to be modified yet again to guard against a rendering anomaly known as light bleed.

Fig 4. The light bleed illustrated on the left-hand side was resolved by enclosing the whole model in a thick-walled cube, as seen on the right-hand side.

Light bleed occurs when light appears to pass through a 3D object, leaving bright spots in areas that should be in shadow. The anomaly is typically caused by a lack of depth (thickness) in the 3D model: when light encounters a surface with no thickness, it tends to bleed through it. To prevent this as much as possible, light blockers were created and placed in and around the hypogeum model so that no stray sunlight could illuminate the interior of the model without a direct line of sight.
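One convenient way to build such a blocker in Unity®, sketched below under the assumption that the blocker has its own mesh, is a renderer that casts shadows but is never itself visible:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// A minimal sketch: an invisible light blocker. The mesh still occludes
// light (it casts shadows) but is never rendered to the visitor's eyes.
public class LightBlocker : MonoBehaviour
{
    void Start()
    {
        var meshRenderer = GetComponent<MeshRenderer>();
        meshRenderer.shadowCastingMode = ShadowCastingMode.ShadowsOnly;
    }
}
```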

Conclusion

When developing such an experience, the devil is always in the detail; however, even though the road was long and frequently tedious, the result is gratifying. One could see tangible improvements from one iteration to the next, making the overall experience more realistic, visually appealing and immersive. Improvements can always be made, and at the time of writing, this controllable VR experience is only accessible to those with very powerful gaming rigs. However, work has already started on other, more lightweight experiences to increase the accessibility of this World Heritage site to the masses.

A Case Study into the User Experience of an Application of Virtual Reality at the Saint Paul's Catacombs, Malta (Grech et al., 2020)

Several historic sites in the Maltese Islands are not easy to explore; their construction typology or fragile state of conservation can impede accessibility to all. The Saint Paul’s Catacombs in Rabat is one such historic site. With this in mind, can Virtual Reality (VR) be used to add value to the user experience (UX) of someone exploring these heritage sites? And if so, how can this value be added? This chapter attempts to answer these questions through a qualitative study of a VR system created for the St Paul’s Catacombs in Malta. An assessment of similar applications created for other cultural heritage sites around the world is conducted. This chapter also describes how this VR experience was created through the use of Game Engines, Laser Scanning and other technologies.

Grech J., Bugeja M., Seychell D. (2020) A Case Study into the User Experience of an Application of Virtual Reality at the Saint Paul’s Catacombs, Malta. In: Seychell D., Dingli A. (eds) Rediscovering Heritage Through Technology. Studies in Computational Intelligence, vol 859. Springer, Cham.

The final authenticated version is available online.

The Aural Dimension of the Hypogeum VR Experience

When I first joined the project to handle the aural aspect, I was presented with the sound of the drone representing the user’s avatar in the virtual world, helping to justify the navigational agency provided to the visitor.

However, this drone sound was not contextualised to the environment: there was no echo or reverberation. Moreover, the continuous drone was found to be noisy and almost unbearable after a few minutes, amounting to noise pollution within the complex. This brings us to another dimension of analysis: the noise coming from the drone seemed to impose the user's presence onto the environment, diminishing the sanctity of the place.

Taking this perspective helped me set my aims for the aural experience. I wanted sounds that were representative of the environment, rather than the visitor; sounds that were contextualised within the complex, with echoes and reverberations that mimic the acoustics of the place; therefore sounds that assisted in understanding the aural context of any rituals and behaviours carried out in this hypogeum, and ultimately, sounds that helped immerse the virtual visitor.

Mr Jonathan Barbara, who was key to the aural element of the VR experience, describing how the project developed.

Audio Profiles

To this end, we needed to capture the acoustics of the hypogeum complex. With special permission and under the supervision of the curator, we set out to record a sound profile of the Hypogeum which we could then apply to any sounds we wanted to place into the virtual experience.

Figure 1: Audio Profile recording locations.

Inspired by previous research carried out to analyse the acoustics of the Oracle chamber, we carried out binaural recordings on-site, with the source playing inside the Oracle chamber at the spot marked S, and recordings taken both within the Oracle chamber, at the spot marked R1, and outside it, at the spot marked R2 (Fig 1).

Given the low frequencies of resonance reported in the literature, the sounds emitted for recording were male basso voice chants by yours truly, and a 16 Hz–20 kHz sweep. I must admit, it was hard to pitch the notes at these low frequencies until I realised that in Neolithic times there was no classical music! I had to forgo the frequencies of the piano notes and just chant lower and lower until I could feel the Oracle reverberate. The resulting resonance was… beyond description.
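For readers curious about the sweep itself, here is a minimal sketch of generating an exponential (logarithmic) sine sweep, the standard test signal for measuring a room's impulse response; the parameters are illustrative, not the project's actual settings:

```csharp
using UnityEngine;

// A minimal sketch: build an exponential sine sweep as a Unity AudioClip.
// The instantaneous frequency rises from f1 to f2 over the clip's length.
public static class SweepGenerator
{
    public static AudioClip Create(float f1 = 16f, float f2 = 20000f,
                                   float seconds = 10f, int sampleRate = 48000)
    {
        int samples = (int)(seconds * sampleRate);
        var data = new float[samples];
        float k = Mathf.Log(f2 / f1);

        for (int n = 0; n < samples; n++)
        {
            float t = (float)n / sampleRate;
            // Standard exponential-sweep phase formula.
            float phase = 2f * Mathf.PI * f1 * seconds / k *
                          (Mathf.Exp(t / seconds * k) - 1f);
            data[n] = Mathf.Sin(phase);
        }

        var clip = AudioClip.Create("Sweep", samples, 1, sampleRate, false);
        clip.SetData(data, 0);
        return clip;
    }
}
```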

Selection of Sound assets

Our next phase was to choose the audio elements that would make up the aural experience for the virtual visit to the Hypogeum.

We split the aural experience into two: action sound effects and environmental audio.

For the sound effects, we focused on three specific activities: the painting of the place, the use of fire to light up the paintings, and the chanting in the Oracle chamber. We thus selected paintbrush strokes, crackling fire, and the recordings of the chanting for the Oracle chamber.

The different versions of the experience have different aural requirements in terms of background sound. The interactive free-roam version allows the visitor to navigate freely, so locative sounds are needed to help them immerse themselves in the environment: inspired by the humidity levels within the hypogeum, faint water droplets indicate the presence of water, while a continuous reverberative hum conveys the spatial volume of the complex. The 360° rail-track video takes the visitor on a pre-scripted ride with a voice-over narrative that accompanies them throughout the experience. We felt that the gaps between the narration needed to be complemented by music to help maintain the immersion, so we looked for patterned sounds reminiscent of the era. We were lucky to come across the works of Serbian artist Paleowolf, who had an album of four tracks which “invoke the visions of the ancient Megalithic culture, that once spread across the Old Europe and throughout the world. Primeval drums and prehistoric flutes fused with dark, atmospheric drones and megalithic shamanic chants are awakening the archetypal images from the mysterious past of civilization’s origins” (Paleowolf, Megalitheon, 2019). The tracks were purchased and the author made aware of their use. The tracks were then cut and edited to make sure the proper segment accompanies the specific part of the narrative.

Audio Placement 2D Prototyping

To assist in the placement of sounds within the complex, use was made of an in-house audio design engine which allows quick modelling of audio: audio files are dragged and dropped onto a 2D map, and a hearing range is set, along which attenuation occurs with distance from the source. This allowed 2D prototyping of audio placement without spending hours in 3D VR space, and made it possible to work in parallel without occupying the VR development machine. Screenshots from the audio design engine were then used as a specification for the VR engine developer to place the sound files, after which a couple of hours of collaboration were needed to fine-tune their placement within the 3D scene, as per Figures 2 and 3.

Figure 2: Audio Design Engine for 2D Prototyping of sound placement in the lower level
Figure 3: Audio Design Engine for 2D Prototyping of sound placement in the lower level
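A minimal sketch of how such 2D placements might be translated into Unity® audio sources (the data structure and names are hypothetical, not the in-house engine's actual format):

```csharp
using System;
using UnityEngine;

// A minimal sketch: turn prototype placements (position + hearing range)
// into looping, fully 3D audio sources that fade out with distance.
public class LocativeSoundPlacer : MonoBehaviour
{
    [Serializable]
    public struct Placement
    {
        public AudioClip clip;
        public Vector3 position;     // taken from the 2D map, plus a height
        public float hearingRange;   // radius at which the sound dies out
    }

    public Placement[] placements;

    void Start()
    {
        foreach (var p in placements)
        {
            var holder = new GameObject("Sound: " + p.clip.name);
            holder.transform.position = p.position;

            var source = holder.AddComponent<AudioSource>();
            source.clip = p.clip;
            source.loop = true;
            source.spatialBlend = 1f;                       // fully positional 3D sound
            source.rolloffMode = AudioRolloffMode.Linear;   // attenuate along the radius
            source.maxDistance = p.hearingRange;            // silent beyond this range
            source.Play();
        }
    }
}
```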

Audio Contextualisation

The 360° video has constant non-diegetic music playing, ducking whenever the narration takes over to explain the current surroundings as the visitor progresses along the experience. Contextualising the audio effects mentioned above to the underground space, with its multiple echoes, would have extended and amplified the sound effects to a level that would occlude and obscure the narration; it was thus decided to keep the sound effects clean.
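Ducking of this kind can be sketched very simply; the following is a minimal illustration (field names hypothetical), assuming the music and narration play through separate Unity® AudioSources:

```csharp
using UnityEngine;

// A minimal sketch of audio ducking: while the narrator speaks, the music
// eases down to a lower volume; when narration stops, it eases back up.
public class MusicDucker : MonoBehaviour
{
    public AudioSource music;
    public AudioSource narration;
    public float duckedVolume = 0.2f;   // music level under narration
    public float fadeSpeed = 2f;        // volume units per second

    void Update()
    {
        float target = narration.isPlaying ? duckedVolume : 1f;
        music.volume = Mathf.MoveTowards(music.volume, target, fadeSpeed * Time.deltaTime);
    }
}
```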

In the case of the interactive experience, however, the placement of audio within the virtual experience was enhanced by implementing the acoustics of the site. A sound source emanates sound that reaches the listener's ears directly, but it also bounces once off the walls, creating echoes. Those echoes in turn bounce off the walls, creating echoes of echoes, reverberations, which extend the sound in time while each bounce adds another layer of presence, amplifying it. Faithful playback of these reverberated sounds invites a number of different approaches.

One approach would be an on-site recording of 20 Hz–20 kHz sweeps in each room, similar to the recordings described above (see Figure 1), with local and remote recordings of the sound source. This would require multiple recordings from multiple locations for multiple sound sources, introducing intrusive equipment into the complex and putting stress on the environment's delicate internal climate.

Another approach would involve digitally calculating the reverberations from the laser scans. The precision of these scans would allow a very faithful calculation of the resulting resonances. However, this appears impossible with current technology: one such software package crashed when loading just the lowest level (the smallest space of the three) because of the high polygon count produced by the high-resolution scanning.

Thus the chosen approach was to make do with the audio profiles recorded inside and outside the Oracle chamber, and to use them to add reverberation to the sound effects within the interactive VR experience. On the visitor's initial approach towards a sound source, the audio profile from the ex-chamber recording was used, presenting the sound with the reverberations that occur within the complex's corridors.

Once the visitor was within the same chamber as the sound source, the audio profile from the in-chamber recording would be used instead. As the visitor will most probably cross between the outside and the inside of the chamber midway through a sound clip, a synchronised transition between the ex- and in-chamber recordings was used to make the shift as seamless as possible. However, the change between the ex- and in-chamber acoustics is still very noticeable and merits further study and research, including audio profile interpolation.
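A minimal sketch of such a synchronised transition, assuming each sound effect has been pre-rendered twice, once with each audio profile (the field names are hypothetical):

```csharp
using UnityEngine;

// A minimal sketch: two AudioSources play the same clip in lockstep, one
// rendered with the ex-chamber profile and one with the in-chamber
// profile, crossfading as the visitor crosses the chamber threshold.
public class ChamberCrossfade : MonoBehaviour
{
    public AudioSource outsideVersion;   // clip with ex-chamber reverb
    public AudioSource insideVersion;    // clip with in-chamber reverb
    public float fadeSeconds = 0.5f;

    float blend;          // 0 = fully outside, 1 = fully inside
    bool visitorInside;   // set from a trigger at the chamber entrance

    public void SetVisitorInside(bool inside) => visitorInside = inside;

    void Start()
    {
        // Start both in sync so the crossfade never drifts.
        outsideVersion.Play();
        insideVersion.Play();
        insideVersion.timeSamples = outsideVersion.timeSamples;
    }

    void Update()
    {
        float target = visitorInside ? 1f : 0f;
        blend = Mathf.MoveTowards(blend, target, Time.deltaTime / fadeSeconds);
        outsideVersion.volume = 1f - blend;
        insideVersion.volume = blend;
    }
}
```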

Concluding Remarks

Good audio is never noticed; it is bad audio that attracts scorn and criticism. This is the difference between what media scholars call hypermediacy (the medium makes itself felt) and immediacy (the medium effaces itself). For the greatest immersion, immediacy is what is sought, and thus much attention was given to providing as faithful an aural experience as possible. What sounds are to be heard (the physical nature of sounds) and how these sounds are heard (their perceptual nature) were the most important elements of this aural design process. The complexity of the complex's architecture and its fragility made big demands, with little leeway for experimentation and repeat recordings. I am quite happy with the level of aural fidelity we have reached so far, and confident that this success will help push the boundaries of research further, with opportunities for projects that contribute to the preservation and presentation of our great cultural heritage using the technology of today and tomorrow.

Classification of Gameplay Interaction in Digital Cultural Heritage (Barbara, 2020)

Digital heritage has matured over the past twenty years, and calls are now being made for interactive experiences that augment digital representation with digital performance. The paper considers sources for such a performance, whether documented sources, contemporary cultures, or gameplay from other entertainment game genres. It considers the needs of various stakeholders, the archaeologist, the historian, the game designer and the target audience, and suggests thematically consistent multiple gameplay options that serve their different needs while reusing game assets and characters. The aim is to open up collaboration with the DiGRA community on serious cultural heritage game development, focusing on the player as performer rather than as a mere observer.

This paper will be available shortly in DiGRA's Digital Library.

A Film-Maker’s Dream

The Dream

The undying dream of every film-maker is to excite as many of the senses as possible, in unimaginably original ways, even though film has for many decades been technically confined to two main senses: vision and hearing. But here one must stop and think. Film has come a long way over the past century. Technology has enabled not just the use of sound and visuals in far more elaborate ways, through three-dimensional special effects (FX) and elaborate sound mixing, but has also brought forth elements which were unheard of before the last couple of decades.

The culmination of the Hal Saflieni Hypogeum VR project, 'The Awakening', was meant to bring about an experience which could be viewed in the comfort of one's home, like any other film experience, yet with the highlight of virtual reality (VR), without the shackles of heavy and expensive computing equipment.

This last phase in this project was intended to create an audio-visual experience which was to include 5 main elements:

Story, Effects, Narration, Visuals and Editing.

The Story

Every film needs a story. The storyteller in charge needs a plot, or more than one, which not only serves to guide all the other pieces of the project into place, but also makes the viewer want to keep watching. The story of 'The Awakening' is complex in its simplicity: with a linear plot and a final twist, it is the culmination of sadness wrought in loss and of hope born through that same loss.

The storyteller’s role, executed to perfection, enabled the buildup of a strong script, followed by the ensuing filming and montage of sound and visual FX, music scores and voice-over.

Mr Joseph Camilleri, the director/producer of the Awakening, the 360° short film shot through the VR experience.

The Effects

While aural elements are technically not meant to be noticed in themselves, but rather to complement the visuals of a film production and in doing so increase the perception and understanding of viewers, such elements need to be thought out in impeccable detail and with intricate planning to ensure a seamless insertion that leans not towards the obvious, but towards the complementary.

One of the fears innate in today's film-making is the overt use of special FX to enhance the visuals and create ever more expansive and unearthly scenery, larger-than-life explosions and beings from nightmares. But the subtle use of such elements can help a film-maker increase the viewers' state of immersion, capturing their interest and enrapturing them to a degree where time passes quickly as the experience transforms itself into an exciting reality.

This project made use of very particular visual FX, chosen strictly in a manner that shunned exaggeration. Flame-like lighting, with subtle spark-like elements, was placed at preplanned points of interest to engage the audience's vision, especially given the liberty viewers were allowed in forming their own perspective on the plot woven by the storyteller. These lighting effects are very similar to the subtle lighting usually employed in 360° filming, where the main difficulty is to create lighting without giving away the source, or to source light in the most natural manner, without the tungsten and HMI lights used in most feature films to shape light and shadow for mood creation.

Fig 1 — With a myriad of available lighting equipment, a traditional film director is spoilt for choice. A 360-degree film director needs to look beyond such equipment

The smoke introduced in the Oracle room was meant as a complement to the words of the script. When the viewer enters the space, he/she is told of the feelings one would experience in a place full of smoke, and this is reflected in a smoke effect which further immerses the viewer. The lack of such an effect would have jarred the viewer's senses; its presence enhanced those same senses, and in fact invited the imaginary use of the others, such as smell and touch.

The Narrator

While sound and music are complementary to the visual, a script void of dialogue and action needs an alternative, for in any film the lack of a human voice results in a space lacking empathy. It is not unusual for films to be made without dialogue, or any human voice for that matter. The consensus for this production, however, was that a human voice, a narrator, the 'god voice', was a requisite: the essence that would guide the viewer throughout the experience, an inner voice representing the Hypogeum itself, a personification of sorts, giving an emotive aspect to something which would otherwise be described simply as holes in the ground.

The Visuals

The visuals for this film were captured from the Unity® VR model that had been built before this final phase commenced. Following the script, the camera was guided through certain points of interest, with timing gauged to the millisecond, along a minutely studied 'roller-coaster' track using a dolly system which, albeit virtual, worked exactly like the dollies used when filming with real cameras. Hence all the tricks of real-life video shooting were employed in a virtual ambience. This could have been a trap of sorts, as the virtual world is one with few limitations; extra care was therefore taken to ensure that any and all movement was choreographed in a manner suited to the human psyche and physique, ensuring the retention of immersion.

The Editing

Editing in a 360° ambience is not unlike editing 'flat' footage. In fact, the same tools were used. Adobe® applications such as After Effects® (AE) and Premiere® were the main tools of choice, owing to the facilitating plug-ins recently acquired from Mettle®. Mettle® was the company that built a studio equipped for 360° and virtual reality film aficionados. At a time when such tools were few and far between, Mettle® came up with Skybox®, a 360°/VR editing suite which enabled film to progress into the third dimension. Since then, Skybox® has been integrated into AE®, ensuring a seamless editing process for 360°/VR visuals together with music, sound FX and other visual FX.

Fig 2. 360° title editing in After Effects.

The ability to edit 360°/VR footage gave rise to viewer control. Whereas traditional filming dictated that the director chooses the point-of-view (POV) available to the viewer, the viewer was now given a liberty which had never existed before, except in some very serious attempts in the 1980s by the likes of Jaron Lanier, attempts that were avant-garde even then, yet lacked the technology and level of graphics available nowadays.

This liberty is both an asset and a shackle, for director and viewer alike. The viewer deems it an asset because the director's shackles have been removed: one can look in any direction and see what he/she wants to see, irrespective of the director's plans. Yet these same liberties can easily produce scenarios where important visual moments are lost, because the viewer was looking in the opposite direction when the action was underway. For the same reasons, the director is shackled. The director needs to move beyond presenting a single POV and instead think up ways and means of attracting the viewer's gaze towards the desired direction and framing. This is an important element of immersion. Once the viewer's 'liberty' has been gently shackled once more, the director can start to direct. Such practices are not infallible, yet this past decade has been a prolonged study of them, establishing an effective measure of success through time, practice and experimentation.

Fig 3. Working on a 2D edit in AE Skybox for ease of manipulation.

Conclusion

Keeping the above in mind, one enters the realm of 360° video editing with a particular excitement. You are not just narrating a story visually; you are doing much more. You are guiding the viewer's gaze throughout the experience and playing all your trump cards as you split tracks, position timeframes and duck sound in a way that leads the viewer to inadvertently observe what you want him/her to observe, keeping his/her gaze fixed on the elements that will ultimately provide the knowledge necessary to become immersed in the story. Simultaneously, you are allowing the viewer to look around, appreciating the full architecture and the varying shadows and lighting of the chambers.

Techniques of Filming and Audio Recording in 360-Degree Ambiences (Camilleri, 2020)

Education has always been an important factor in life, continuously analysed in the attempt to improve its delivery in today's classrooms. Although much has been done to make educational delivery more engaging, there are several instances when the techniques used in today's classrooms are deemed outdated, both by educators and by their students. Subjects meant to enhance the knowledge and appreciation of a culture's heritage are at times not presented to students in the most exciting way possible, so as to enhance learning and maximise understanding. A country's heritage is the map to its history. The accumulation of its languages, including its artistic endeavours and representations, stands as a reminder of our ancestors, who toiled hard to create the story that we are now striving to keep alive and further enrich through contemporary means. Technology has become a tool which stands alongside the brushes and rasps of the artists and sculptors of antiquity. Today, computers and their burgeoning peripherals have given art new twists and further methods of expression, which can in turn augment the way students are drawn into the magical world of their country's heritage. This project endeavours to capture film language and transpose it into a 360-degree film environment which, combined with the enrapturing use of spatial sound, will recreate an epic moment in the fairly unknown initial stages of the Great Siege of Malta. This immersion aims not only to excite the young minds of students through the narrative techniques used, but also to create compassion through an increased sense of empathy.

Camilleri J. (2020) Techniques of Filming and Audio Recording in 360-Degree Ambiences. In: Seychell D., Dingli A. (eds) Rediscovering Heritage Through Technology. Studies in Computational Intelligence, vol 859. Springer, Cham.

The final authenticated version is available online.

I was brought onto the Hypogeum project to assist with the recording of the 360° video. During the project I worked on two main areas: the camera movement and recording system, and the introduction of graphical effects such as fire particles, fog and rendering pipelines.

Cinematography

To record the Hal Saflieni Hypogeum VR film, tests first needed to be done to gauge the quality of recording that could be exported from the Unity® Game Engine. A simple test scene, as in Figure 1, paired with a simple user-controlled camera movement script, showed that recording in 360° was a viable option, especially at high resolutions and framerates of 4K at 60 Hz, allowing for smooth, high-quality video.

Fig 1. Preliminary 360° video test scene.
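One way to capture such footage from within Unity®, sketched below as an assumption rather than the project's confirmed pipeline, is to render the camera into a cubemap each frame and convert it to an equirectangular image that video tools understand:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// A minimal sketch: grab a 360° frame by rendering the camera into a
// cubemap, then converting it to an equirectangular render texture.
public class Capture360Frame : MonoBehaviour
{
    public Camera captureCamera;
    RenderTexture cubemap;
    RenderTexture equirect;

    void Start()
    {
        cubemap = new RenderTexture(4096, 4096, 24)
        {
            dimension = TextureDimension.Cube
        };
        equirect = new RenderTexture(4096, 2048, 24);   // 4K equirectangular frame
    }

    void LateUpdate()
    {
        // 63 is the bit mask selecting all six cubemap faces.
        captureCamera.RenderToCubemap(cubemap, 63, Camera.MonoOrStereoscopicEye.Mono);
        cubemap.ConvertToEquirect(equirect, Camera.MonoOrStereoscopicEye.Mono);
        // equirect can then be read back and appended to a video file.
    }
}
```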

Following this, work began on camera movement techniques. Various options were considered, including a manually controlled camera; however, owing to the length of shooting and the precision needed for the recording, the results were not up to the expected standard, the imprecision stemming from human error during the recording session.

Fig 2. Once set, the camera tracks guaranteed a smooth flow of the camera to meet the wishes of the creative director.

We therefore set up a camera track and dolly system, ensuring that the camera always followed a pre-set path around the hypogeum without deviating from what the creative director wished to communicate. This allowed fine-grained control over the speed, direction, position and roll of the camera. The smoothness of the path was further improved by interpolating between the key points making up the path.
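A minimal sketch of such a dolly, assuming Catmull-Rom interpolation between the key points (one common smoothing choice, not necessarily the one used here):

```csharp
using UnityEngine;

// A minimal sketch: move a camera along ordered waypoints, smoothed with
// a Catmull-Rom spline so the motion stays fluid between key points.
public class CameraDolly : MonoBehaviour
{
    public Transform[] keyPoints;   // ordered waypoints along the track
    public float speed = 0.1f;      // segments traversed per second

    float t;   // position along the path, in segment units

    void Update()
    {
        t = Mathf.Min(t + speed * Time.deltaTime, keyPoints.Length - 1.001f);
        int i = Mathf.Min((int)t, keyPoints.Length - 2);
        float u = t - i;

        // Clamp neighbour indices at both ends of the track.
        Vector3 p0 = keyPoints[Mathf.Max(i - 1, 0)].position;
        Vector3 p1 = keyPoints[i].position;
        Vector3 p2 = keyPoints[i + 1].position;
        Vector3 p3 = keyPoints[Mathf.Min(i + 2, keyPoints.Length - 1)].position;

        transform.position = CatmullRom(p0, p1, p2, p3, u);
    }

    static Vector3 CatmullRom(Vector3 p0, Vector3 p1, Vector3 p2, Vector3 p3, float u)
    {
        // Standard Catmull-Rom spline passing through p1 and p2.
        return 0.5f * (2f * p1
            + (p2 - p0) * u
            + (2f * p0 - 5f * p1 + 4f * p2 - p3) * (u * u)
            + (3f * p1 - p0 - 3f * p2 + p3) * (u * u * u));
    }
}
```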

Graphical Effects & Enhancements

Performance was significantly hampered by the complexity of 360° recording, with 5 minutes of finished video potentially requiring several days of recording and processing. To improve efficiency, the project was ported to Unity's new URP graphics pipeline, which improved performance by at least 7x over what was possible before. This upgrade also allowed the use of GPU-based particle effects through Unity's VFX Graph, which enabled the addition of the immersive fog and fire particles present throughout the experience.

These particle effects were complemented by light sources placed around the scene, in a balance between atmospheric lights that bring the scene to life and lights designed to draw attention to the immediate surroundings, such as the Oracle chamber and the 'ante-chamber' of the 'Holy of Holies'.

Watch 'The Awakening' online. For the best experience, use a headset: set up your smartphone by downloading the VR app, and a cheap cardboard headset is enough to appreciate the immersive effect of VR. And remember to look around you: this is a 360° film you can enjoy in your living room.

Saint Martin’s Institute of Higher Education, established in 1985, is licensed by the NCFHE with license #196 ● Postal Address: Saint Martin’s Institute Foundation Building, 2, Schembri Street, Hamrun HMR 1541 ● Telephone: +356 2123 5451 ● eMail: infodesk@stmartins.edu
