Bed Warmers – Production/Post Production


By Niall

For the set of this film, we chose my bedroom – I live a pretty minimal life and don’t have many possessions, so it was easy to turn my barren room into a post-party hellhole. We collected the oddly large amount of alcohol in the house, scattered it around the table-tops, and threw around some old clothes. Hey presto – your life’s a mess in no time!



The Set



Andreas brought down his GoPro for the shoot, which was attached to a belt and strapped around my chest. We considered using a helmet to affix the camera to me, but it just made me seem unnaturally tall, with arms too low down my torso. The big light in my room was turned on to counteract the harsh light coming in from the window.

Ben’s makeup was a particularly funny thing to do – instead of real makeup, we used tacky face-paint, adhering to the traits of the try-hard party-animal student characters we created. It was quite horrifying really. More than quite. Especially when he rubbed ketchup on his chest and put tomato paste in his teeth. Yeesh.



Ben showing the girls how it’s done


For the ‘leave’ and ‘psycho’ endings, we GENUINELY dropped a GoPro out of a window. We had Andreas and Dom down in the garden with a sheet, ready to catch it on my cue. It felt strange to be dropping hundreds of pounds’ worth of camera out of a window so casually, but the result was ace in the end. It was a struggle not to get a shot of either the sheet or me sticking my head out of the window, but we managed it after I perfected my release.

The audio taken from the GoPro was fairly hissy and tinny, so we spent a couple of hours in the Newton studios re-dubbing Ben’s and my dialogue. We used our fave vocal set-up – a Sennheiser 441 and a U87 – for all the lines, and as usual it worked a treat. While we were in there, we hashed together three ‘radio broadcasts’, with each member of the team contributing to one. Ben and I made a Rick and Morty-style advert for wigs, Andreas did a wacky music show announcement, and Dom did a BBC-style news bulletin.


Post Production

By Ben Jackson

The game was made in Unity, using code that I spent far too long trying to make work. But more on that later…

First I took the footage into Adobe Premiere Pro and edited each ‘scene’ as an individual subsequence. This meant that I could quickly jump in and out of scenes, as well as having a master sequence that I could use to check that the edits flowed properly. I also used the master sequence to add a colour grade over the project, to give it a more cinematic feel over the washed-out GoPro look. A slight de-warp was applied to make the wide-angle lens effect less extreme.



The Master sequence with multiple paths as layers



Once the editing was done, it was time to head back into the studio to dub over Niall’s dialogue (see above); this was then exported from Pro Tools and dropped into the corresponding sequence. In Pro Tools, we added compression and a slight reverb to give the voice that ‘inside your head’ feel. We also used Pro Tools for the music track at the start, as well as the radio adverts. It was important to get all the clips to the same level so that the user wasn’t constantly having to change their volume, which would have broken their immersion. Finally, I exported each subsequence as an H.264 file ready to import into Unity.

In Unity, I added all the files to the project. This converts them into Ogg Theora format, which makes them easier to play in the Unity engine. It does mean a loss in quality, but from my own experience with games, quality is pointless without a good story. (*Cough* Crysis *Cough*.) To enable the videos to be seen, I used the old-school method of projecting the image onto a cube as a texture. This meant making unlit textures for every video file I had. Note the word ‘unlit’ – in my first tests I ended up lighting them, and they came out sunset red. It took a whole night of panicking before I realised…

The video textures are fired using a script called run.cs, which basically says: when the scene opens, play the video. Another script, called Wait.js, then waits for a set amount of time before moving to a different scene. I used public variables to set the time and destination so that I could reuse the script rather than rewriting it. Much like Bill Gates said: “Choose a lazy person to do a hard job. Because a lazy person will find an easy way to do it.”
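Stripped down, that pair of scripts amounts to something like this – a reconstructed sketch in C# using Unity 5’s MovieTexture API, not the exact files, with illustrative names and timings:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch of run.cs: start the MovieTexture on the cube as soon as the scene loads.
public class Run : MonoBehaviour
{
    void Start()
    {
        // The cube's unlit material holds the imported Ogg Theora clip.
        MovieTexture movie = (MovieTexture)GetComponent<Renderer>().material.mainTexture;
        movie.Play();

        // Play the clip's audio track through an AudioSource on the same cube.
        AudioSource source = GetComponent<AudioSource>();
        source.clip = movie.audioClip;
        source.Play();
    }
}

// Sketch of the wait script: the public variables mean the same component can
// be dropped into every scene with a different time and destination.
public class Wait : MonoBehaviour
{
    public float waitTime = 10f;          // illustrative: length of the clip in seconds
    public string nextScene = "Options";  // illustrative scene name

    void Start()
    {
        Invoke("LoadNext", waitTime);
    }

    void LoadNext()
    {
        SceneManager.LoadScene(nextScene);
    }
}
```

Both components sit on the video cube, and waitTime and nextScene are set per scene in the Inspector – that is the whole trick that makes the script reusable.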

An interactive video that follows on automatically isn’t interactive

So to fix this I added the options scene. This is just the last frame of the clip behind some 3D text showing the options available. It is again powered by a script that can be reused to go to different places. The script also uses the “rend.material.color” property to change the colour of the text as the user’s mouse hovers over each menu item, and I added a beep tone on the mouse-up action. These are both important, as they give user feedback and help guide the user through the game without needing extra text – much like the pull side of a door having a handle and the push side having a push plate, rather than both having a blooming handle!!!
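As a rough sketch (hypothetical names; the 3D text object needs a collider for the OnMouse events to fire), the reusable option script could look like this:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical sketch of the reusable menu-option script: highlight the text
// on hover, then beep and change scene on mouse-up.
public class OptionButton : MonoBehaviour
{
    public string destination;  // scene this option leads to (set per option)
    public AudioClip beep;      // feedback tone played on click

    private Renderer rend;

    void Start()
    {
        rend = GetComponent<Renderer>();
    }

    // These callbacks require a collider on the 3D text object.
    void OnMouseEnter() { rend.material.color = Color.yellow; }
    void OnMouseExit()  { rend.material.color = Color.white; }

    void OnMouseUp()
    {
        AudioSource.PlayClipAtPoint(beep, transform.position);  // audible feedback
        SceneManager.LoadScene(destination);
    }
}
```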

Next, I chose to tackle the radio scene. This took two nights to program and is still a little broken, but the effect was worth it. It uses the user’s mouse position to drive the X translation of the cube acting as the tuning stick. I then use the X position to test whether the stick is over one of the channels (2, 5, 8). If it is, the white noise is turned off and the radio show is played. There is a slight lag as the files swap over, but this helps the user work out where the channels are, so I kept it in. As the channel plays, I also enable the click function, which moves the player to the next sequence. (This is easier to say than to do – see below.)
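The tuning logic boils down to something like this hypothetical sketch – the dial range and tolerance are illustrative, and only the channel positions (2, 5, 8) come from the actual scene:

```csharp
using UnityEngine;

// Hypothetical sketch of the radio-tuning logic: the mouse's X position drags
// the tuning stick, and while the stick sits over a channel position the
// static is muted and that channel's broadcast plays.
public class RadioTuner : MonoBehaviour
{
    public Transform stick;                           // cube acting as the tuning needle
    public AudioSource whiteNoise;
    public AudioSource[] broadcasts;                  // the three radio shows
    public float[] channelPositions = { 2f, 5f, 8f };
    public float tolerance = 0.3f;                    // how close counts as "tuned in"

    void Update()
    {
        // Map the mouse X (viewport 0..1) onto an assumed 0..10 dial range.
        float x = Camera.main.ScreenToViewportPoint(Input.mousePosition).x * 10f;
        stick.position = new Vector3(x, stick.position.y, stick.position.z);

        // Is the stick over one of the channels?
        int tuned = -1;
        for (int i = 0; i < channelPositions.Length; i++)
            if (Mathf.Abs(x - channelPositions[i]) < tolerance)
                tuned = i;

        whiteNoise.mute = tuned >= 0;  // kill the static while tuned in

        for (int i = 0; i < broadcasts.Length; i++)
        {
            if (i == tuned && !broadcasts[i].isPlaying) broadcasts[i].Play();
            else if (i != tuned && broadcasts[i].isPlaying) broadcasts[i].Stop();
        }
    }
}
```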



What my nightmares look like now


The final bit of coding was the camera, which is just a slideshow cut up to act as camera playback. It cycles between images as the user hits the mouse button. If they choose to remember at the start, they go to ending ‘A’, but if they flick through to the end of the photos they end up at ‘B’. I also had to double back on myself when I realised that I had disabled the mouse cursor during the radio scene, so I quickly re-enabled the mouse after the radio scenes.
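A minimal sketch of that playback logic, assuming (hypothetically) that one mouse button advances the photos and the other is the ‘remember’ choice – the button mapping and scene names are illustrative:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical sketch of the camera-playback scene: each left click shows the
// next photo; running out of photos leads to ending B, while a right click
// ("remember") jumps to ending A.
public class PhotoPlayback : MonoBehaviour
{
    public Texture[] photos;            // the "camera roll" images, in order
    public string endingA = "EndingA";  // illustrative scene names
    public string endingB = "EndingB";

    private int index;
    private Renderer rend;

    void Start()
    {
        Cursor.visible = true;          // re-enable the cursor hidden in the radio scene
        rend = GetComponent<Renderer>();
        rend.material.mainTexture = photos[0];
    }

    void Update()
    {
        if (Input.GetMouseButtonDown(0))              // advance one photo
        {
            index++;
            if (index >= photos.Length)
                SceneManager.LoadScene(endingB);      // flicked through everything
            else
                rend.material.mainTexture = photos[index];
        }
        else if (Input.GetMouseButtonDown(1))         // choose to remember
        {
            SceneManager.LoadScene(endingA);
        }
    }
}
```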

Finally, I made one of those over-the-top credit scenes which we’ve all grown to love in ‘indie’ games. To do this I used After Effects to create multiple text layers, then automated a camera to move around the 3D space. Voilà – a cheesy outro sequence!

The Code that was forgotten

Originally we added some background music to the piece. This was done using an audio source that never got destroyed as the scenes changed. We then realised that it clashed with other music and sounds, so I automated the mixer to turn a fader up and down to kill the music. After all that work we decided to scrap the music completely. I left the code in as a little tribute to that one guitar loop.
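In the spirit of the tribute, the leftover system probably looked something along these lines – a sketch, assuming an exposed mixer parameter named “MusicVol” (not the original name):

```csharp
using UnityEngine;
using UnityEngine.Audio;

// Sketch of the abandoned music system: the AudioSource's object survives
// scene loads, and the mixer fader is automated down to kill the loop.
// "MusicVol" is an assumed exposed-parameter name, not the original one.
public class KillSound : MonoBehaviour
{
    public AudioMixer mixer;   // mixer with an exposed "MusicVol" fader
    public bool killMusic;     // set true in scenes where the music clashed

    void Awake()
    {
        DontDestroyOnLoad(gameObject);  // keep the guitar loop alive between scenes
    }

    void Update()
    {
        // Glide the fader towards silence (-80 dB) or full level (0 dB).
        float target = killMusic ? -80f : 0f;
        float current;
        mixer.GetFloat("MusicVol", out current);
        mixer.SetFloat("MusicVol", Mathf.MoveTowards(current, target, 40f * Time.deltaTime));
    }
}
```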


RIP KillSound.CS


Giphy. (2017) ‘The Hunger Games’ [Online] Available at: [Accessed 27 Apr. 2017]


Unity Manual. (2017) [Online] Available at: [Accessed up to 27 Apr. 2017]

Unity Scripting API. (2017) [Online] Available at: [Accessed up to 27 Apr. 2017]


Interactive Film – Pre Production & Planning

For our CMP group project, our initial goal was to try something different. Faced with three options, we quickly turned to interactive film.
We set up a meeting that same day and had a brainstorming session.
A murder mystery was the first feasible idea – we thought the simple concept would balance well against the complexity of the interactivity. We played around with a few ideas, but ended up with something even simpler.

The main character wakes up in a room. There are bottles and empty beer cans everywhere. He’s obviously feeling very rough from the night before.
There’s a bunch of objects in the room that might serve as clues to where he is and why he’s there. As icing on the cake, there’s a woman lying next to him, and he doesn’t know who she is.

At this point the character stands up, panicking; the camera switches from first-person view to third person, and he’s now faced with a series of decisions. This forms the basis of our project.


I personally have always liked the idea of mixing game and film. I almost find the story progression and cut scenes in games more intriguing than the actual playing, and I’m not sure I have the patience for a lot of films out there. This left me baffled for a few years, not really knowing what was wrong. In stepped Telltale Games from California, and put into words exactly what I’d been after. Every single one of their games is heavily focused on the story and the characters, but don’t think for a second you can sit still – because if your interaction with the game isn’t swift enough, your favourite character might die and not come back.

To make the interaction smooth, and to improve the feel of the user experience, we took direct inspiration from the aforementioned Telltale Games, whose specialty is interactive pieces of film (games). A thing they do a lot in their games, and specifically in Batman: The Telltale Series (Telltale Games, 2016), is have a set of choices displayed over a screen that is moving slightly. The game won’t continue until you’ve made your pick.

Another good Telltale example is The Wolf Among Us (Telltale Games, 2013), where they offer the user small, non-meaningful choices as the story progresses to enhance the experience. These result in something happening, but don’t alter the meaning or progression of the story.

We wanted to include some of these styles of interactivity in our film. One plan for doing this was filming the character for 20–30 seconds and finding an appropriate place to loop the footage, serving as one place for the user to make a choice.

Another feature we wanted to implement was a slideshow-type sequence displayed on a camera. The player is supposed to scroll through the pictures, and can at any time right-click to move on. The twist is that there’s a situation escalating in these photos, and if the player flicks through too many, he/she will be thrown into a completely different ending. By using different types of interactive features, we’re hoping to immerse the viewer/player further in the experience.

For further immersion we made use of high-quality audio throughout the game. We overdubbed the voices, we each made our own radio advertisement, and we even wrote a little intro song for the start of the film/game. We felt it’s important to make use of our strengths, and audio production is something we’re all comfortable doing, so that influenced our final outcome for the better.

For ease when coding all this, we planned it all as sections in time: grids 1, 2, 3 and 4 correspond to a certain order of things happening, and every grid has its own set of choices that will alter the story, and even let the user experience different endings based on the choices he/she goes for.


This is an excerpt from our pre-production sheet, which was conveniently shared with the team via Google Drive. This is grid 2.1 in the story, where the user has chosen to examine a radio. The idea is that scrolling through frequencies will play different radio messages, which will lead the user to a new place in the story.
This particular piece of interactivity is more ambitious than just clicking through to the next bit of the story; we felt it important to showcase our ideas and the interactivity in different ways.

In the end, we’re hoping all these things add up to a piece of enjoyable and playable content. Having done a lot of research and planning, we’re confident we’ll achieve our goals!

Telltale Games (2016) Batman: The Telltale Series [video game], PlayStation 4. San Rafael, California, United States.

Telltale Games (2013) The Wolf Among Us [video game], PlayStation 4. San Rafael, California, United States.


In the second CMP film, I felt like I should show my more sensitive side. I wanted to make a video with a strong cause, so I cast my mind back to old charity adverts – the ones done in black and white, slowed down a little too much, with an overdramatic title card over the top. I also wanted to make a spreadable piece of content. To do this I looked at the groups I’m in on Facebook and found that the group ‘Stagehand Humour’ had the best view-to-share ratio. It was then I realised that I could do a mic-dropping advert: while the mic drop has almost disappeared from the mainstream media, for the people spending £400 every time one happens it’s still a real concern.

To create the advert I used multiple sources from all over the world, so that everyone got at least a couple of the references I make. I also originally just had proper microphone-cleaning tutorials at the end, but then I realised that the ASMR ones were much more over the top and fitted the style of the video better. The keen-eyed viewer may notice that I start talking about the UK, then switch to dollars and use an American toll number. This was because I was aiming at both markets. (And the phone-number joke works better in the US format.)

Now is also a good time to mention why I’ve done the editing so badly and over the top. I’m sure you have all heard of a meme subculture called ‘shitposting’ – despite the name, its followers are rather brilliant in that they will share and enjoy bad videos, and the worse the better. (That makes sense, right?) So I whipped out the speed-correction tool in Premiere Pro and cranked it down to 10%; at this point an old-fashioned cartoon had more frames per second than I did. I also overdid the black-and-white levels and over-saturated the ‘happy’ clips. This also helped me not get sued by anyone, as it is very clearly a joke, but it also masks the fact that some clips were very low quality.



Now that I had the video, I had to wait to release it into the world at the most effective time. Because America had the bigger audience, I knew I had to wait for 2 AM, which, being a student, was just in time for my tea! Once 2 AM hit, I shared the video on my Facebook page and to several groups, cracked open a can of Irn-Bru (yes, second-year PSVT does have a terrible addiction to it) and joked to my housemates that I’d finish the can before I hit one hundred views. One quarter of a can later, I had hit 200 views. I almost spat out my drink! From there it rocketed up to 2,000 views in the first hour, and subsequent hours got about 1,000 each, which meant that in the first 24 hours I got 25k views – which, by my very biased maths, means I was just 98% away from being deemed viral… The Facebook Insights tool makes it very apparent that I made the video to perfectly match my audience: 25–34-year-olds were the biggest group, with 18–24s not too far behind. I can also see that the USA had the biggest audience, followed by the UK. (Then the Netherlands, which I don’t quite get.)


My one day of fame


YouTube. (2017) ‘[ASMR] Microphone Brushing Touching Tapping – No Talking Upclose’ [Online] Available at: [Accessed 7 Mar. 2017].

YouTube. (2017) ‘5 BEST Moments From Lady Gagas Super Bowl 51 Halftime Performance’ [Online] Available at: [Accessed 7 Mar. 2017].

YouTube. (2017) ‘ASMR Binaural Sound Series 41 – Cleaning Your Ears ( the Microphone)’ [Online] Available at: [Accessed 7 Mar. 2017].

YouTube. (2017) ‘HOW TO clean you microphone grill so you dont get sick from it’ [Online] Available at: [Accessed 7 Mar. 2017].

YouTube. (2017) ‘How to Clean Shure Microphones’ [Online] Available at: [Accessed 7 Mar. 2017].

YouTube. (2017) ‘King Bob Drops Mic’ [Online] Available at: [Accessed 7 Mar. 2017].

YouTube. (2017) ‘Gary Jules-Mad World (song + lyrics)’ [Online] Available at: [Accessed 7 Mar. 2017].

YouTube. (2017) ‘Metallica – Ride The Lightning (Live – Mexico City, Mexico) – MetOnTour’ [Online] Available at: [Accessed 7 Mar. 2017].

YouTube. (2017) ‘Mic Drop-2’ [Online] Available at: [Accessed 7 Mar. 2017].

YouTube. (2017) ‘Mic Drop’ [Online] Available at: [Accessed 7 Mar. 2017].

YouTube. (2017) ‘Nerd HQ 2015 Bryan Cranston – Mic Drop (Supermansion Panel Clip)’ [Online] Available at: [Accessed 7 Mar. 2017].

YouTube. (2017) ‘President Obama Does Mic Drop…Obama Out!’ [Online] Available at: [Accessed 7 Mar. 2017].

YouTube. (2017) ‘Randy Watson Sexual Chocolate’ [Online] Available at: [Accessed 7 Mar. 2017].

YouTube. (2017) ‘Thomas Middleditch Drop the Mic Verizon Commercial 2017’ [Online] Available at: [Accessed 7 Mar. 2017].

YouTube. (2017) ‘UFC 202 Conor McGregor Drops Mic After Ripping Diaz Fans’ [Online] Available at: [Accessed 7 Mar. 2017].

A Grand Day Out: In Spite of Better Judgement

“What a weird, yet incredibly well-executed film”-You

Let me explain the film you just watched. After being stuck with writer’s block, I suddenly remembered a joke I made in the assignment briefing: “Don’t film the number 50 or making cups of tea!” “Why not both?” I joked, but the idea slowly grew on me as I realised how well the film would lend itself to the task. When I found the music track, I had to make it.

The film opens with footage from the 8mm Vintage Camera app by Nexvio. I used this app to mimic all the ‘artsy-fartsy’ hipster films on Vimeo, because I wanted the viewer to be misled into thinking it was a serious film, not a parody. I also feel the Super 8 look helps hide the roughness of the mobile footage; seeing the video in its original high-definition format makes the viewer less forgiving when it comes to technical errors.

The time-lapse was created using the 8mm app. I did this so that I wasn’t stuck with a clip of a fixed length in the edit. (Pro tip: adding a time-lapse gives you padding if you need to make a piece longer. This trick is probably why I passed TV in the first year…) To keep the camera(phone) stable, I used my shoulder rig and attached it to the bus seat. This worked with varying degrees of success; I did try hyperlapse apps, but due to the limited background movement they didn’t work as well.

Speaking of which, I used the Microsoft Hyperlapse app, because I personally feel it does the best job the majority of the time. (I may also be biased, as I was a Microsoft Insider tester for it and for the PC software.) The reason the app works better is that it analyses the environment, creates a 3D mesh, and projects the image onto it. This allows the software to view the scene through a virtual camera, eliminating any movement from the input media. The only downsides are the limited resolution and the watermark it adds at the end of the clip, but anyone with a vague understanding of the crop tool can remove that, so it’s not really a problem.

The final shot of the film is the most technical and took a night of planning to pull off. I knew I wanted a slow-motion shot, but I had to work out how to time the action to the music, as you can’t speed up gravity. I worked out the number of frames it took each water effect to reach the centre of the frame, and using this knowledge I created a click track for the performers to follow, so the effect would sync perfectly. Because I was using the iPhone 6 Plus, I had access to 240 frames per second. That is 10x normal speed (for 60Hz electrical grids), which equates to 18 visual frames, or just over half a second – which would have been near impossible to film. So I elected to film the clip at 240fps but treat it like 120fps; this gave the performers 1.5 seconds to do the stunt, and gave me the headroom in the edit to slow the footage down even more after the musical stabs had been hit.

The film was shot solely on the iPhone 6 Plus, as it has fantastic optical stabilisation, meaning that when paired with my shoulder rig I had reasonably stable shots. I also chose the iPhone because it has a much narrower field of view than my OnePlus 3T, which has an almost fisheye effect that detracted from the Super 8mm look. It also made framing easier, as I could view the 8mm footage live on the iPhone – the app is only available for iOS, so otherwise I would have had to send the footage from the OnePlus to the iPhone and back.

Apps used:

8mm Vintage Camera:
The best 8mm emulator out there. The Oscars can’t be wrong.
Here is some test footage I shot with it. I was originally going to use this for my credit scene, but apparently running over by 50 seconds was unacceptable.

Microsoft Hyperlapse:
Powerful and user-friendly. The only downside is the lack of an iOS app.

Honourable mentions:

YayCam Retro:
Nice features, but more expensive and ultimately not as good as the 8mm app. It also kept sending me annoying adverts via notifications. (It’s a sad day when you get more ad notifications than social media ones.)

Instagram has its own stabilisation app, but after my own experimentation I had a better time with the Microsoft Hyperlapse app. I would still recommend it for Instagrammers, though, as it allows for a streamlined workflow.

SloPro allows iPhone users to simulate shooting at 1000fps, but unfortunately it’s too good to be true. The app uses frame blending to create the extra frame data needed, which creates very obvious artefacts in the moving areas of the frame. It also requires a pro account to remove the watermark. However, it is a great app for messing about with, and has provided many a laugh between friends.

So that’s it! You know my trade secrets, so what are yours? Why not comment below?