Bed Warmers – Production/Post Production


By Niall

For the set of this film, we chose my bedroom – I live a pretty minimal life and don’t have many possessions, so it was easy to turn my barren room into a post-party hellhole. We collected the oddly large amount of alcohol in the house, scattered it around the table-tops, and threw around some old clothes. Hey presto – your life’s a mess in no time!



The Set



Andreas brought down his GoPro for the shoot, which we attached to a belt and wrapped around my chest. We thought about affixing the camera to a helmet instead, but it just made me seem unnaturally tall, with arms too low down my torso. The big light in my room was turned on to counteract the harsh light coming in from the window.

Ben’s makeup was a particularly funny thing to do – instead of real makeup, we used tacky face-paint, in keeping with the try-hard party-animal student characters we created. It was quite horrifying really. More than quite. Especially when he rubbed ketchup on his chest and put tomato paste on his teeth. Yeesh.



Ben showing the girls how it’s done


For the ‘leave’ and ‘psycho’ endings, we GENUINELY dropped a GoPro out of a window. We had Andreas and Dom down in the garden with a sheet, ready to catch on my cue. It felt strange to be dropping hundreds of pounds’ worth of camera out of a window so casually, but the result was ace in the end. It was a struggle not to get a shot of either the sheet or me sticking my head out of the window, but we managed it after I perfected my release.

The audio taken from the GoPro was fairly hissy and tinny, so we spent a couple of hours in the Newton studios re-dubbing Ben’s and my dialogue. We used our favourite vocal setup – a Sennheiser 441 and a U87 – to do all the lines, and as usual it worked a treat. While we were in there, we hashed together three ‘radio broadcasts’, with each member of the team contributing to one. Ben and I made a Rick and Morty style advert for wigs, Andreas did a wacky music show announcement, and Dom did a BBC-style news bulletin.


Post Production

By Ben Jackson

The game was made in Unity, using code that I spent far too long trying to make work. But more on that later…

First, I took the footage into Adobe Premiere Pro and edited each ‘scene’ as an individual subsequence. This meant that I could quickly jump in and out of scenes, as well as having a master sequence I could use to check the edits flowed properly. I also used the master sequence to add a colour grade over the project to give it a more cinematic feel than the washed-out GoPro look. A slight de-warp was applied to make the wide-angle lens effect less extreme.



The Master sequence with multiple paths as layers



Once the editing was done, it was time to head back into the studio to dub over Niall’s dialogue (see above). This was then exported from Pro Tools and dropped into the corresponding sequence. In Pro Tools, we added compression and a slight reverb to give the voice that ‘inside your head’ feel. We also used Pro Tools for the music track at the start, as well as the radio adverts. It was important to get all the clips to the same level so that the user wasn’t constantly having to change their volume, which would have broken their immersion. Finally, I exported each subsequence as an h.264 file, ready to import into Unity.

In Unity, I added all the files to the project. This converts them into Ogg Theora format, which makes them easier to play in the Unity engine. It does mean a loss in quality, but from my own experience with games, quality is pointless without a good story. (*Cough* Crysis *Cough*). To enable the videos to be seen, I used the old-school method of projecting the image onto a cube as a texture. This meant making unlit textures for every video file I had. Note the word ‘unlit’ – in my first tests I lit them, and they came out sunset red. It took a whole night of panicking before I realised… The video textures are fired using a script called run.cs, which simply plays the video when the scene opens. Another script, Wait.js, then waits a set amount of time and moves to a different scene. I used public ‘vars’, or variables, to set the time and destination so that I could reuse the script rather than rewriting it. Much as Bill Gates said, “Choose a lazy person to do a hard job. Because a lazy person will find an easy way to do it.”
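Engine specifics aside, the Wait.js idea – expose the delay and destination as public variables so one script serves every scene – boils down to something like this sketch (the names here are mine, not the original script’s):

```python
def make_scene_switcher(delay_seconds, destination):
    """One reusable 'wait, then move on' behaviour, configured per scene."""
    def on_update(seconds_since_scene_opened):
        # Once the clip has had time to play out, report where to go next.
        if seconds_since_scene_opened >= delay_seconds:
            return destination
        return None  # keep playing the current video
    return on_update

# Each scene gets its own configured copy of the same logic:
leave_scene = make_scene_switcher(12.0, "options_menu")
```

In Unity terms, the delay and the destination would be the public variables filled in per scene in the Inspector.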

An interactive video that follows on automatically isn’t interactive

So, to fix this, I added the options scene. This is just the last frame of the clip behind some 3D text showing the options available. It is again powered by a script that can be reused to go to different places. The script also accesses the “rend.material.color” property to change the colour of the text as the user’s mouse hovers over each menu item, and I added a beep tone to the mouse-up action. These are both important, as they give user feedback and help guide the user through the game without needing extra text – much like the pull side of a door having a handle and the push side a push plate, rather than both having a blooming handle!!!

Next, I chose to tackle the radio scene. This took two nights to program and is still a little broken, but the effect was worth it. It uses the user’s mouse position to drive the X translation of the cube acting as the tuning stick. I then use the X position to test whether the stick is over one of the channels (2, 5, 8). If it is, the white noise is turned off and the radio show is played. There is a slight lag as the files swap over, but this helps the user work out where the channels are, so I kept it in. While a channel plays, I also enable the click function, which moves the player to the next sequence. (This is easier said than done – see below.)
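Stripped of the Unity specifics, the channel test described above is just a range check on the stick’s X position. A sketch (the channel width here is my guess, not a value from the project):

```python
CHANNELS = (2, 5, 8)   # stick X positions that carry a broadcast
TUNING_WIDTH = 0.5     # how close counts as 'on channel' (assumed value)

def tune(stick_x):
    """Return the channel under the stick, or None for white noise."""
    for channel in CHANNELS:
        if abs(stick_x - channel) <= TUNING_WIDTH:
            return channel  # static off, show on, click-through enabled
    return None             # static on
```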



What my nightmares look like now


The final bit of coding was the camera, which is just a slideshow cut up to act as camera playback. It cycles between images as the user hits the mouse button. At the start, if they choose to remember, they go to ending ‘A’; if they click through to the end of the photos, they end at ‘B’. I also had to double back on myself when I realised I had disabled the mouse cursor during the radio scene, so I quickly re-enabled the mouse after the radio scenes.
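The branching in that slideshow can be sketched as a tiny state machine (the function and ending names are mine, for illustration only):

```python
def camera_playback(clicks, photo_count, remembered_on_click=None):
    """Advance one photo per click; branch to an ending at either exit."""
    index = 0
    for click in range(clicks):
        if click == remembered_on_click:
            return "ending A"   # the player chose to remember
        index += 1
        if index >= photo_count:
            return "ending B"   # clicked through every photo
    return index                # still browsing the playback
```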

Finally, I made an over-the-top credit scene of the kind we’ve all grown to love in ‘indie’ games. To do this, I used After Effects to create multiple text layers and then automated a camera to move around the 3D space. Voilà – a cheesy outro sequence!

The Code that was forgotten

Originally, we added some background music to the piece. This was done using an audio source that never got destroyed as the scenes changed. We then realised that it clashed with other music and sounds, so I automated the mixer to turn a fader up and down to kill the music. After all that work, we decided to scrap the music completely. I left the code in as a little tribute to that one guitar loop.


RIP KillSound.CS




A Grand Day Out: In Spite of Better Judgement

“What a weird, yet incredibly well-executed film”-You

Let me explain the film you just watched. After being stuck with writer’s block, I suddenly remembered a joke I made in the assignment briefing: “Don’t film the number 50 or making cups of tea!” Why not both? I joked, but the idea slowly grew on me as I realised how well the film would lend itself to the task. When I found the music track, I had to make it.

The film opens with footage from the 8mm Vintage Camera app by Nexvio. I used this app to mimic all the ‘artsy-fartsy’ hipster films on Vimeo, because I wanted the viewer to be misled into thinking it was a serious film, not a parody. I also feel the Super 8 look helps hide the tattiness of the mobile footage – seeing the video in its original high-definition format makes the viewer less forgiving when it comes to technical errors.

The time-lapse was created using the 8mm app. I did this so that I wasn’t stuck with a certain length of clip in the edit. (Pro tip: if you need to add padding to a piece, a time-lapse is an easy way to make it longer. This trick is probably why I passed TV in the first year…) To keep the camera(phone) stable, I used my shoulder rig and attached it to the bus seat. This worked with varying degrees of success. I did try hyperlapse apps, but due to the limited background movement they didn’t work as well.

Speaking of which, I used the Microsoft Hyperlapse app, because I personally feel it does the best job the majority of the time. I may also be biased, as I was a Microsoft Insider tester for it and the PC software. The reason the app works better is that it analyses the environment, creates a 3D mesh, and projects the image onto it. This allows the software to view the scene through a virtual camera, eliminating any movement from the input media. The only downsides are the limited resolution and the watermark it adds at the end of the clip, but anyone with a vague understanding of the crop tool can remove the latter, so it’s not really a problem.

The final shot of the film is the most technical and took a night of planning to pull off. I knew I wanted a slow-motion shot, but I had to work out how to time the action to the music, as you can’t speed up gravity. I worked out the number of frames it took each water effect to reach the centre of the frame, and from that created a click track the performers could follow to sync the effect perfectly. Because I was using the iPhone 6 Plus, I had access to 240 frames per second. That is 10x normal speed (for 60Hz electrical grids), which would have left only around 18 visual frames – just over half a second of real time – in which to perform: near impossible to film. So I elected to film the clip at 240fps but treat it like 120fps. This gave the performers 1.5 seconds to do the stunt, and gave me headroom in the edit to slow the footage down even more after the musical stabs had been hit.
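The trade-off involved can be sketched with a little frame arithmetic (the durations below are illustrative, not measured from the shoot):

```python
def slowmo(real_seconds, capture_fps, playback_fps):
    """On-screen duration and slow-down factor for a burst of action."""
    frames_captured = real_seconds * capture_fps
    on_screen_seconds = frames_captured / playback_fps
    factor = capture_fps / playback_fps
    return on_screen_seconds, factor

# 1.5 s of real action captured at 240 fps and conformed to 24 fps playback
# stretches to 15 s on screen at 10x; treating the same window as a 120 fps
# capture halves the stretch, which is why the stunt window doubles.
```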

The film was shot solely on the iPhone 6 Plus, as it has fantastic optical stabilisation, meaning that when paired with my shoulder rig I had reasonably stable shots. I also chose the iPhone because it has a much narrower field of view than my OnePlus 3T, which has an almost fisheye effect that detracted from the Super 8 look. It also made framing easier, as I could view the 8mm footage live on the iPhone – the app is only available for iOS, so otherwise I would have had to send the footage from the OnePlus to the iPhone and back.

Apps used:

8mm Vintage Camera:
The best 8mm emulator out there. The Oscars can’t be wrong.
Here is some test footage I shot with it. I was originally going to use this for my credit scene, but apparently running over by 50 seconds was unacceptable.

Microsoft Hyperlapse:
Powerful and user-friendly. The only downside is the lack of an iOS app.

Honourable mentions:

YayCam Retro:
Nice features, but more expensive and ultimately not as good as the 8mm app. It also kept sending me annoying adverts via notifications. (It’s a sad day when you get more ad notifications than social media ones.)

Instagram has its own stabilisation app, but after my own experimentation I had a better time with the Microsoft Hyperlapse app. I would still recommend it for Instagrammers, as it allows for a streamlined workflow.

SloPro allows iPhone users to simulate shooting at 1000fps, but unfortunately it’s too good to be true. The app uses frame blending to create the extra frame data needed, which creates very obvious artefacts in the moving areas of the frame. It also requires a pro account to remove the watermark. However, it is a great app for messing about with and has provided many a laugh between friends.

So that’s it! You know my trade secrets, so what are yours? Why not comment below?

Meet My Baby

A year ago, a very nice post lady delivered me a brand spanking new, second-hand TLR! The camera dates from around the late 1950s and had for many years just been left in an attic. It’s such a shame to see a great camera left as rubbish when all it needs is a little love and care to outshine most modern cameras.
For those who don’t know, the Yashica-A is a TLR medium format camera that shoots 120 roll film. But what makes that so good, I hear you cry? Well, apart from the fact that older cameras are just naturally better – they produce a 1:1 representation of the image, rather than a computer deciding what is and isn’t needed – the camera also shoots a larger format than most. For instance, professional DSLRs shoot what is known as full frame, or 35mm, which is roughly three and a half times smaller in area than a frame of 120 film.

Bigger is better.

I could talk about compression and other technical nonsense, but I’ll simplify it: bigger is better. In fact, you can get an even bigger film standard called large format, but that costs money I don’t have.
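As a quick sanity check on the ‘three and a half times’ figure above (assuming the Yashica-A’s 6×6 frame, which gives roughly 56×56 mm of usable image on 120 film):

```python
# Nominal image areas in mm^2
full_frame = 36 * 24    # a 35mm 'full frame' negative
six_by_six = 56 * 56    # a 6x6 frame on 120 roll film

ratio = six_by_six / full_frame
print(round(ratio, 2))  # -> 3.63, i.e. roughly three and a half times larger
```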

Shall I continue about the film? Oh okay. Another key property of film stock is its ASA rating, which describes how large the light-sensitive crystals are. Think of it like ISO on modern cameras, but the result looks a lot more natural, as the chemistry is completely random.

Another reason I love the camera is the pure simplicity of the device. There are no batteries and no meters – it’s just you and the image. That’s it. Yes, your first roll of film will probably be out of focus and not exposed properly, but by the third attempt it will have become second nature. The only other negative is the cost. But (and this is a big but) a 50MP digital camera costs three grand, and that’s not including lenses! For the combined cost of this camera and its film, you can easily shoot 1,200 frames (and that’s buying the expensive stuff).


Cheap 120 film looking beautiful