Time Perception during Walking in Virtual Environments – Reflection

The article's abstract notes that differences in distance, speed, and spatial relations between virtual reality environments and the natural world have been observed in experiments, but that how time works within these experiences hasn't been considered. It's unknown whether time is expanded or contracted within virtual reality experiences.

It also discusses how the brain often bases its sense of time within these virtual reality experiences on internal biological or psychological events, or on external signals such as sunlight or tiredness. This in turn means that in virtual reality the designer can alter the player's perception of time by changing the external signals the body relies on to confirm the passage of time.

There are also no established measures for how best to assess time judgements in VR, which makes it difficult to accurately analyse and measure the differences between real life and VR.

In an experiment, participants wearing a VR headset were given a space to walk through and asked, once they reached the end, how long they felt it took them. On average they judged it to be 2.6 seconds shorter than the actual time. The article doesn't comment on whether this is a good or bad thing; I think it's mainly pointing out a phenomenon that requires further study. But it's an aspect of VR I hadn't considered: being able to alter the feeling of time and space is something powerful. Now I wonder how we could do this with sound. Music can make time feel like it's going faster if you're listening to an album you enjoy, so how could we bring this phenomenon into a virtual reality space?

Collaboration – Receiving video and final mix + sound effects + music for submission

We have managed to get Unity working and have exported a video of the game being played. We had our final meeting last Thursday and decided that we would all email each other the video and add our new sounds to it. Alongside this, Jingya has managed to get the sounds working in Unity. They've now emailed the video to me, and all I have to do is add the last new atmospheric sound effects before our crit this Wednesday.

Additive Synthesis attempt

I watched a few videos on additive synthesis after finding out that Old School Runescape used additive synthesis to compose its sound effects. From the videos I understand that it's about building any sound out of sine waves, and that mathematically you can in principle recreate any real-world sound this way. I have both Ableton Live 11 Suite and Logic Pro X, and I found myself using Logic's Alchemy as it has a friendlier interface. I watched these videos and took notes.

I found using Alchemy and additive synthesis difficult, and I wasn't really making anything interesting. I exported the sounds I made but didn't find them very useful. I might spend a bit more time digging into additive synthesis, perhaps even trying Ableton's Operator instead, but for now I'm burnt out and a bit frustrated. I have, however, made some sounds that are okay and can be manipulated further, so let's see!

This was a patch (if that's what it's called) that I made in Alchemy. Each number on the graph is a sine wave, and you shape the sound with volume, tune, pan, and phase parameters for each one. I also automated the parameters with my mouse while recording into a separate audio channel rather than playing MIDI notes, so I have a long performance recorded that I want to try to manipulate.
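To get my head around what Alchemy is doing under the hood, here is a minimal sketch of additive synthesis in Python using NumPy and the standard wave module. The partial list, amplitudes, detune, and phase values are made-up illustrative numbers, not anything taken from Alchemy or the Runescape engine; it's just the sum-of-sine-waves idea in its simplest form.

```python
import numpy as np
import wave

SAMPLE_RATE = 44100
DURATION = 2.0        # seconds
BASE_FREQ = 110.0     # fundamental in Hz

# Each partial: (harmonic number, amplitude, detune in Hz, phase in radians).
# These values are placeholders chosen for illustration.
partials = [
    (1, 1.00,  0.0, 0.0),
    (2, 0.50,  0.3, 0.5),
    (3, 0.33, -0.2, 1.0),
    (5, 0.20,  0.1, 0.0),
    (8, 0.10,  0.0, 2.0),
]

t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)
signal = np.zeros_like(t)

# Additive synthesis: the output is simply a sum of sine waves.
for harmonic, amp, detune, phase in partials:
    freq = BASE_FREQ * harmonic + detune
    signal += amp * np.sin(2 * np.pi * freq * t + phase)

# Normalise and write out as 16-bit mono WAV.
signal /= np.max(np.abs(signal))
pcm = (signal * 32767).astype(np.int16)

with wave.open("additive_test.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(SAMPLE_RATE)
    f.writeframes(pcm.tobytes())
```

In a synth like Alchemy the amplitudes and phases would also change over time (which is what my mouse automation was doing), but even this static version makes the concept much clearer to me.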

Song made – forgot to update the blog

Just quickly updating my blog with a song I made a few weeks back, before we received any final information about the game. At the time, the music reference we were given was Cuphead, which has a lot of jazz bebop style music.

I originally attempted to use Ableton to create some music but felt it sounded corny. Instead of using MIDI instruments, I went back to what I know how to use, which is sampling. I used my SP404MK2, a sampler with lots of effects built in.

I loaded some jazz brush drum loops and got a pattern going, then went through some jazz records I own at home, recorded them through my turntable into the sampler, and started chopping guitar sounds to layer over the drums. I then processed it all with filters and other bits to create a new dirty, hissing, analogue-sounding loop.

In the end it won't be used, as we have gone with a more sci-fi, futuristic sort of composition, which I will now be making music for.

For reference here is the machine I use.

And here is the song below

Collaboration – Research into games: Minecraft

I'm continuing my research into games and have chosen Minecraft as the last one. I've spent hundreds of hours in this game and find that its music and sound effects are executed very well: the music comes and goes without obvious cues, and the sound effects are really unique and fitting for the game world.

The music, as shown above, has a lot of variety in it, from heavily reverbed tracks showcasing themes of wonder and exploration to sadder, more horrible music in the Nether (Minecraft's hell). I want to explore how they decided to use the music like this, as well as what synths or instruments they used.

The sound effects also have great variation, as shown in this video. They aren't high fidelity, but they are distinctive. I think this kind of music and sound could fit into our game, so I'm curious how they were made. Are they samples like in other games, synthesis, or a combination?

So after watching the above video I understand the concepts of sound design in Minecraft a lot more.

Music should come in randomly. Adding music to a film, as they say, is easier: you know exactly when the audio is going to play and what is happening on screen. In a game it is much harder to predict what's happening, so the score plays randomly, with lots of silence between songs to build anticipation.

Minecraft is very laid-back with its music, as you are usually doing something relaxing. In the Warped Forest biome in the Nether, music doesn't even play, just an ambience of glitchy, distorting electronic sounds. Some of the Nether sounds were made using balloons to create these stretching sounds of horrible tension.

At night, Minecraft shifts from a peaceful building game into monsters trying to kill you, and dark tunnels filled with monsters are also a common occurrence. The enemies are mixed loud, so in dark environments you can hear them approaching even before you see them, which gives the player cues. They also play scary music when you are close to darkness, purely to scare you.

They also talk about not creating any dialogue for the character you play, as they want you to feel like it's you in the game, not a character they've created. Villagers don't speak either, as they don't want to make anyone feel offended.

Overall, I find the point about anticipation interesting, as well as the use of silence to build it. I discussed with Ingrid an idea I had: rather than having the music on a constant loop, we could use FMOD to stack multiple tracks on the randomiser function so it picks a random song from the selection, with a few silent tracks included as well so there is sometimes a pause between songs (see the sketch below). It's something I definitely want to implement if possible.
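Before building this in FMOD, here is a rough Python sketch of the logic I have in mind. The track names, weights, and gap lengths are placeholders I've made up; in practice this behaviour would be set up inside FMOD itself rather than written as code.

```python
import random

# Placeholder playlist: (entry, weight). The "silence" entries mean no music
# plays for that slot, which is where the anticipation-building pauses come from.
playlist = [
    ("factory_theme_a.wav", 3),
    ("factory_theme_b.wav", 3),
    ("ambient_drone.wav",   2),
    ("silence",             4),   # weighting silence higher means more gaps
]

def pick_next_entry():
    names, weights = zip(*playlist)
    return random.choices(names, weights=weights, k=1)[0]

def simulate_music_session(slots=10):
    for _ in range(slots):
        entry = pick_next_entry()
        if entry == "silence":
            gap = random.uniform(20, 60)   # made-up pause length in seconds
            print(f"silence for {gap:.0f}s")
        else:
            print(f"play {entry}")

if __name__ == "__main__":
    simulate_music_session()
```

Running it a few times shows how unpredictable the order and spacing become, which is exactly the Minecraft-style effect I'm after.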

Collaboration – Research into games: Halo Infinite

Another game I want to research is Halo Infinite. I remember all the trailers when it was coming out, and they released a lot of videos explaining the sound design. At the time we were doing the specialising module and I was working on foley and sound effects for a short film. The videos showed them doing very unique things to record sounds that were then manipulated into monsters and weapons for the game. I want to do further research into it and perhaps use what I learn for the game I'm currently working on.

Firstly I watched this video, which went through a few of the sound design field recordings they made. It started with rockets being fired outdoors, then moved on to using electromagnetic microphones from LOM to record an Xbox console, recording a pug eating for monster sounds, using hydrophones on glass and then breaking the glass, and recording old machinery, bears at a zoo, and the manufacturing of Xbox controllers.

In this video they use a piano for sound effects, starting by placing a huge bass speaker on top of it and recording the result, then breaking the piano, hammering it, tightening the strings until they snap, and cutting the strings as well. They even place dry ice on the piano strings; you can hear the dry ice fizz like a screaming alien. Really cool stuff!

I also watched this interview with the sound design team at Halo Infinite.

They start by talking about the guns, one of the most important parts of the game; since it's first-person, the gun is effectively the main character. The sound designers explain how they design the weapons in layers. First, the thud indicates the power and impact of a weapon. Second, the mechanical layer offers tactile feedback with satisfying clicks and clanks of metal. Third, the tail end creates atmosphere, with the gunshot bouncing off the environment; this gives a sense of where the gun is being fired, whether it's an interior or an outdoor space.

Adaptive sound design is something they use throughout the game: they tailor sounds to the outcome of the player's actions as a form of feedback. For example, the mechanical sounds of a weapon get louder as the gun begins to run out of ammo, and the sound of an empty magazine is a typewriter, communicating to the player that they have nothing left to shoot.
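To make sense of how the layering and the adaptive side fit together, here is a small Python sketch of the kind of mixing logic they describe. The layer names, gain values, and the ramp curve are my own guesses for illustration, not anything from Halo Infinite's actual implementation.

```python
# Sketch: a gunshot built from three layers, where the mechanical layer
# gets louder as the magazine empties. All names and numbers are made up.

def gunshot_layer_gains(ammo_in_mag: int, mag_size: int) -> dict:
    ammo_fraction = max(0.0, min(1.0, ammo_in_mag / mag_size))

    # The mechanical layer ramps from 0.4 (full magazine) up to 1.0 (nearly
    # empty), so the player hears the gun "running dry" before it actually does.
    mechanical_gain = 0.4 + 0.6 * (1.0 - ammo_fraction)

    return {
        "thud":       1.0,              # power/impact layer, always full
        "mechanical": mechanical_gain,  # tactile clicks and clanks
        "tail":       0.8,              # environment reflections, set per space
    }

if __name__ == "__main__":
    for ammo in (30, 15, 5, 1):
        print(ammo, gunshot_layer_gains(ammo, mag_size=30))
```

Something similar could work for our machinery sounds: driving a layer's level from a game parameter rather than baking one static mix into the sound.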

Vehicle sound design was grounded in real-world vehicles: they recorded real helicopters and dune buggies, as well as a vintage 1918 tractor, because they didn't want a futuristic tank to sound like a current one.

“Making appropriate sounds from inappropriate objects.”

They also spoke about sound fatigue, something I hadn't really considered. Because, unlike a film, players will be putting hundreds of hours into the game, the sounds need to avoid having too much high- or low-frequency content so the mix doesn't become fatiguing or muddy.

Following this, I want to record more outdoors and spend a full day making sound effects for our current game; there is so much room for cool ambient, technological, and other sounds. I want to take this further than it already is.

Collaboration – Research into another game's sound: Old School Runescape

I've decided to do further research into other games I like, to understand a bit more about the process behind their sound and music creation, to expand my knowledge and learn more techniques and ways of working, and potentially to attempt them for our game.

Old School Runescape is the 2007 version of Runescape. It's a game I played back in 2007, and seeing them re-release the same version is amazing. The music is very nostalgic and uses old-school MIDI instruments, while the sound effects sound like synths. I want to do further research to find out how it was made. For reference, here is an example of the music in the game.

I find the music really beautiful and captivating. Maybe it's the nostalgia, but I find the compositions really fitting for the game, and I want to learn what they used. The sound effects especially could almost fit into the game we are working on, particularly as machine or ambient sounds; they sound a bit crushed, or low bit rate. See below for examples.

I searched online but only seemed to find information about the current version of Runescape, not the older one, and what I did find mainly just stated that it was MIDI. Eventually I found a Reddit post that Ian Taylor replied to, explaining how he made the audio.

I also managed to find a tweet where Ian shows the original synthesizer they created to make all the Runescape sound effects.

I also found another Reddit post explaining that the Runescape 2 sound effects were mainly made with additive synthesis, using the synthesis engine shown above that they created for making their own sounds.

I want to explore additive synthesis and see what I can create: to dig into its sound design possibilities and make some machinery sound effects for the child being created, as well as the ambience of the factory in the game.

Unity Issues

I've been trying to use Unity to record a full playthrough of the game, to use as a backup video in case implementing things directly in Unity doesn't work. I've had numerous issues, and every time I figure out one problem another occurs.

The first was the game not running or even opening. I figured out I was using the wrong Editor version and had to use the same one the MA developers used while creating it: 2019.4.11f1.

Once I had the project open, I managed to get it running by clicking the play button. But now, whenever I open the game, the image seems out of place and not lined up correctly. On top of this, entering the password in the game doesn't progress to the next screen, which is frustrating; I can see in the debug console at the bottom left of the screen that it says "correct", meaning I've entered the password correctly and it should progress. I'm going to speak to Jingya to find out whether she has managed to get it working, and whether we can record a playthrough to use as video.

Otherworld VR reflection

OTHERWORLD - 2022 What to Know BEFORE You Go (London)

Otherworld VR was truly spectacular. I've used VR before at a friend's place and found it good, but it fell well short of the ideas behind it; I felt that physical space was the big thing it needed. At Otherworld they had this, and being able to walk around a space was really beneficial to the experience.

The beginning section, where we all went down an elevator and a portal opened up, was very immersive; I think the sound design really helped give the feeling of falling into an abyss. I also noticed the directionality of the space, with sounds moving around me and following where I looked, as if I were inside an ambisonic bubble.

I spent most of my time playing a zombie game and found it enjoyable, although the movement was weird: you couldn't walk, you had to hold a button and teleport to an area. Still fun, though. All the reading I've done about immersion seemed to play out in my experience; I found myself truly immersed and totally forgot where I was, in this circular pod. I ended up finding the overall experience really worthwhile. Having it has definitely helped me understand the scope of current VR technology, where it is, and where it can go.

Collaboration – Hand-in work to do / where I am

Currently, I am almost at the hand-in. I've used FMOD in some experiments and I get why it's useful. If it works, I want to attempt to use Unity for implementing the sound, and FMOD for the music.

I need to make another music track, or perhaps two, and record and make sound effects for the atmosphere and for the robot's brain being built.

We have been sent a trailer of the game but not an entire run-through. We have received the actual Unity project, but I can't get the game playing, so I need to discuss this with the rest of my group to see what we can do.

Finally, for the mixing process: are we going to make a video and edit the sounds onto it in Final Cut Pro, or are we going to implement them in the game itself?

As things stand, I think I will make the extra song and the sound effects, and use Final Cut Pro to edit the sound to the video, as it might be difficult to actually implement the new sounds. We can still keep the old sounds they used and just layer in the new ones. I also don't mind mixing and mastering the audio.