How To Use Your Ears



09 MARCH 2017

written by Mike



I know that we sound engineers are proud of our listening devices (ears), and we are keen to offer listening advice to other people. I guess it comes with the job. One of my tutors, an old-school guy, said he was worried about us.

He said that the new generation of producers, composers and engineers tends to spend a lot of time looking at the computer screen.

And it can distract us from the “real” job.

He told us that we should approach sound just as an artist looks at a painting: you need to step back to appreciate it at full scale. I agree with his ideas, but on the other hand, I thought to myself, “I like looking at the screen too!”

Where is the correct answer then?

Just like with sound, in balance.

For a period, I was responsible for quality control (QC) of mixes in a post production department. I had a team of people under my supervision, and our job was to listen to stuff that came out of dubbing theatres. And to find any errors and blunders.

The way we worked was with the picture on one screen and the edit window on the other. I would listen for mistakes but also watch the waveforms for possible errors. After a while, I found myself catching most of the errors by looking. I could recognise a click, a drop-out or a reverb cut.

It reminded me of the scene from The Matrix where the guy says he doesn’t see the numbers anymore.

Once, my boss asked me what I thought about the idea of doing QC without looking at the screens.

“Oh dear…” I thought to myself.

His suggestion was inspired by the “old” school. That was how they learnt back then.

“Mhmmm… OK… I mean, I look at waveforms as helpful guidance, but let’s ask the rest of the team at the next meeting. Let’s see what they say.”

I tried to be as diplomatic as I could.

To cut to the chase, we had our meeting and the idea was dropped in a flash. The truth is, we adopted looking and listening as one. To us, breaking it would make our job harder and more prone to error.






Listening and recording at the same time is a tough cookie. It will be stressful, as you want to get it right the first time. And in live recording, you won’t have the luxury of a second take.

What can you do to make the whole process stress-free?

Look at the big picture.

During recording, it’s easy to get caught up in details. You start to listen to individual elements and spend time on one thing while neglecting everything else. If you spend a whole day getting your drums to sound “right”, you will not have enough time to get great vocals.

Another classic example of losing the “big picture” stuff is adding new elements. It is so easy to add another layer of bass or pad.

Just one more plugin. 

All you need is a bit of extra RAM and a good enough CPU. But too much is too much. It doesn’t matter that you can run three hundred tracks in your DAW without any problems. Sometimes four tracks are enough.

Listen in balance.

Always do a quick mix during recording to get a feel for the final product. A good monitor balance will make the whole process smooth, and it can help you decide whether to add another element to the mix.

That lead melody sounds great on its own?

Listen to it in balance with everything else. Do you still need it?

Listening in context will help you to answer all these questions. It is also less stressful when you know that you won’t have to do as much “fix in the mix” stuff later on.

Make the headphone mix as good as possible.

There is a simple rule: if the headphones sound good, the musicians will sound good too. It’s not only a practical point. Yes, the drummer must hear the bass and vice versa; it helps. But it is also a psychological trick. If the recording already sounds pretty good, just imagine how awesome the final mix will be!

A listening job is not just for the engineer; correct balance during overdubbing is crucial too.

And a few more quick tips:

Headphone bleed can be a pain. Invert the polarity to cancel it out.

A vocalist needs a good headphone balance to pitch against the track. Sometimes taking one ear off during the recording can help.

You can affect the vocalist’s pitch with simple foldback tricks.

If the singer is flat, turn the vocal monitor down in the headphones.

If the singer is sharp, turn the vocal monitor up in the headphones.

Moving a bass track a few milliseconds back or forth can help to find a groove “on the go.” 
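The polarity and timing tricks above can be sketched in a few lines. This is a toy illustration, not studio code: samples are plain Python floats, and the 44.1 kHz sample rate is my own assumption.

```python
import math

SAMPLE_RATE = 44100  # assumed session sample rate

def invert_polarity(samples):
    """Flip the sign of every sample -- the polarity switch on a console."""
    return [-s for s in samples]

def shift_ms(samples, ms):
    """Nudge a track later (positive ms) or earlier (negative ms)."""
    offset = round(ms / 1000 * SAMPLE_RATE)
    if offset >= 0:
        return [0.0] * offset + samples   # pad the start: plays later
    return samples[-offset:]              # trim the start: plays earlier

# A signal summed with its polarity-inverted copy cancels completely,
# which is why flipping polarity can knock down correlated bleed:
tone = [math.sin(2 * math.pi * 440 * n / SAMPLE_RATE) for n in range(100)]
summed = [a + b for a, b in zip(tone, invert_polarity(tone))]
print(max(abs(s) for s in summed))  # 0.0
```

The same `shift_ms` idea is what you do by hand when you drag a bass region a few milliseconds to sit better against the kick.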






Editing comes before mixing. You don’t need to be in a dubbing theatre to do a good editing job (it would be nice, though!). When you edit, you prepare sounds for the mix. That is why how you listen to them matters here too.

Watch that screen.

I know, I have just talked about the relevance of listening, and the first point is to look at the screen. Why? Well, with editing, looking at the screen is as important as listening. You need to see where to cut the audio and where to move it.

If you need to draw out some clicks, zooming in on waveforms helps.

I would say it’s 50-50 for me. It’s a draw.
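Zoomed in far enough, a click is just a few rogue samples, and drawing it out is simple interpolation. Here is a minimal sketch of the idea behind a pencil-tool repair (the function name and the toy clip are mine, not from any particular editor):

```python
def redraw_click(samples, start, end):
    """Replace samples[start:end] with a straight line drawn between the
    neighbouring samples -- roughly what a pencil-tool repair does."""
    left, right = samples[start - 1], samples[end]
    patched = list(samples)
    length = end - start
    for i in range(length):
        # walk evenly from the sample before the click to the one after it
        patched[start + i] = left + (right - left) * (i + 1) / (length + 1)
    return patched

clip = [0.0, 0.1, 0.2, 0.9, -0.8, 0.3, 0.4]  # samples 3 and 4 are the click
smooth = redraw_click(clip, 3, 5)
```

Real de-click tools are far smarter about matching the surrounding waveform, but the eyes-plus-mouse workflow is the same.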

A quiet environment helps, but headphones are OK too.

Working in a peaceful environment is great but with editing, you don’t need to be as strict. I mean, yes, it will be hard to hear lip smacks or clicks if you are working near a building site. But normal house conditions will be all right.

Worst case, you can also do a little editing on headphones.

It’s not ideal, but it’s better than nothing.

Listen in context.

We editors tend to dwell on small stuff. Everything needs to be perfect! Guess what. You won’t be able to hear most of that stuff in the full mix.

Listen in context when possible. Do a little temp mix. Balance the tracks so they resemble the final mix. It will give you an idea of what you should focus on.

What does the mixer want?

And the last one. And if only I knew the answer.

Jokes aside, communication between the parties is necessary. Maybe the mixer doesn’t care about your fades, but they want you to colour the clips. Maybe they don’t need your work with clip gain at all.

Find out what they need in the session. Do it at the beginning. It can save you a lot of time.

The last point is still about listening. To the other person.



Mixing sounds together is all about listening, right? You can just go with the feel of the moment.

Yes and no.

Mixing is an art; I do agree. Every mixer works in a different way. But there are a few helpful tips that I want to suggest.

Listen on different systems.

So you got your expensive monitors, you soundproofed your bedroom.

Are you sitting in the perfect listening position?

Good, but guess what? No one is going to listen to your mix that way. Most people who hear your work will be listening on their phones, in their cars or at home while doing the dishes.

Buy a pair of cheap USB speakers; listen to your mix from your phone, with and without headphones. Play it on your laptop. If you want your work to be good, it needs to sound great through these systems. If you can only appreciate it on high-end studio monitors, well, you’ve got a problem.

One more thing.

Mixers from the older generation will sometimes say, “no one mixes on headphones!” That may be true, but everyone listens on headphones. Have a couple of different pairs ready. Test your mixes on them.

Listen in different environments.

You got the previous point; that’s great. Now it is time to shake it up. Go and listen to the mix in a car. Go outside your room and listen through the door. Take your phone with you to the gym and listen while you work out.

These are only a few ideas; try to come up with other weird scenarios. Just think where other people listen to stuff.

Read a book and listen at the same time?

It won’t hurt to try.

Turn the volume down.

Everyone wants to hear their work loud, on the biggest speakers. And yes it can be good if you want to EQ some stuff.

But to analyse the balance, it is best to turn the volume down. My sound production teacher used to say, “If it sounds good at a low level, then it’s a start.”

The other good thing about low volume is that your room acoustics won’t play such a big role. And that is an important point for all the bedroom mixers out there. Smaller monitors and a lower level; you can’t go wrong with that.

Listen musically and sonically.

I understand it this way: when I want to listen to my mix musically, I close my eyes and try to hear it as a whole. I may not pick up the details, but I will hear if something is off balance.

Try it for yourself. Computer screens tend to lie when it comes to sound. The second method is sonic listening. Bring up your meters and frequency analysers.

Is everything in balance? Is the spectrum looking good?

Use your eyes, ears and mouse. Make all these plugins work. I try to jump back and forth between these two ideas. What you can’t hear, you will see. And hopefully, vice versa.
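As a sketch of what a frequency analyser is doing under the hood, here is a naive DFT in list form. A real plugin uses an FFT with windowing and smoothing; this is only the bare idea, with made-up test values:

```python
import math

def magnitude_spectrum(samples, sample_rate):
    """Naive DFT magnitudes -- the numbers a frequency analyser draws.

    Returns (frequency_hz, magnitude) pairs for each bin up to Nyquist.
    """
    n = len(samples)
    spectrum = []
    for k in range(n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        spectrum.append((k * sample_rate / n, math.hypot(re, im)))
    return spectrum

# A 1 kHz sine should peak in the 1 kHz bin:
rate = 8000
sig = [math.sin(2 * math.pi * 1000 * i / rate) for i in range(256)]
peak_freq, _ = max(magnitude_spectrum(sig, rate), key=lambda b: b[1])
print(peak_freq)  # 1000.0
```

Seeing the energy distribution like this is exactly the “what you can’t hear, you will see” half of the jump.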

Have a reference material ready.

We all have mixes that inspire us. All-time favourites. The ones that we want to copy. Have them near you. When in doubt about your work, put your cherished piece on.

In a heartbeat, you will know where you need to improve.

Watch that bass.

The unloved child.

Why do you sound so good in my mixing room, but when I play you on a TV system you betray me?


Get the bass under control. It’s easier said than done, but with some basic acoustic treatment, you should be all right. After a while, you will learn your speakers, and you will know how to tame the beast. Also, there is nothing better than a clear and strong bass in your mix.




So, as I mentioned before, I used to be responsible for Quality Control in a post house. QC listening is somewhat different, as you focus only on mistakes. Even if a particular mix is rubbish, it’s not your problem. You are there to point out blunders only.

Know the guidelines.

Before you start work, you need to know what you are looking for.

What are the most common mistakes? What gets fixed and what doesn’t?

Every project will be different, but once you know the protocol you are off to a better start.

Watch (sync) and listen.

With quality control, at least in movies, you need to both listen to the mix and watch the screen. Why?

Mistakes such as sounds out of sync, missing Foley or low dialogue are harder to spot if you are just listening. If there is a missing sound effect for a dog, the only way you will notice is if you see the dog on the screen. It can get a little tricky because you will also need to watch the waveforms, as some things you won’t necessarily hear.


Let’s say there is a two-second drop-out in the left surround channel. When you listen to a loud mix blasting from L-C-R, you won’t hear the error. The only way to spot it is to see it on the screen. So, yeah.

You need to watch waveforms on one screen, the picture on the other and also listen to the mix. At the same time.
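A drop-out like that is trivial to find by machine, which is exactly why eyes and meters beat ears here. A rough sketch, assuming the channel has already been decoded into a list of float samples (the window size and silence threshold are arbitrary choices of mine, not any QC spec):

```python
import math

def find_dropouts(samples, sample_rate, window_ms=100, threshold=1e-4):
    """Return (start_sec, end_sec) spans whose RMS is near silence --
    the kind of gap you would see on a waveform long before hearing it."""
    win = max(1, int(sample_rate * window_ms / 1000))
    spans = []
    for start in range(0, len(samples), win):
        chunk = samples[start:start + win]
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        if rms < threshold:
            t0, t1 = start / sample_rate, (start + len(chunk)) / sample_rate
            if spans and abs(spans[-1][1] - t0) < 1e-9:
                spans[-1] = (spans[-1][0], t1)  # merge adjacent silent windows
            else:
                spans.append((t0, t1))
    return spans

# Toy surround channel: two seconds of silence buried in the middle.
rate = 1000
channel = [0.5] * 2000 + [0.0] * 2000 + [0.5] * 2000
print(find_dropouts(channel, rate))  # [(2.0, 4.0)]
```

A real QC chain would run this per channel and flag the spans for a human to audition.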

Don’t obsess about the small stuff; listen to the final mix.

“There is a little sound click in left surround, audible when you play the isolated channel at half speed.”

Don’t be that guy. Believe me, I was that guy when I started. I felt like a hero for noting these little flaws, until people much smarter than I am put me in my place. Try to see the big picture.

Can you hear this minor issue in a final mix?

Will fixing it make the product better?


If the answer to both is no, then it is probably OK. After a while, you will learn what is and what isn’t relevant.





OK, so for the end I have something unusual: listening during deliverables.

What? Why would you listen to stuff when you are just prepping files?

Try to see it as the last stage of Quality Control. It is better to be safe than sorry, especially once you have clicked the “send” button.

What do you deliver?

Is it a 5.1 mix? Stereo? Or just the stems?

Have a quick listen and make sure you included the right material. A good practice is to spot-check the mix. Play a few sections at random.

Listen to different elements and individual channels.

Check the beginning and the end.

Check the fade in at the beginning and fade out at the end. When you work with a lot of data on a daily basis, it is easy to cut something short by mistake.

If the start and the end are both in the right place, the rest should be all right too.
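The start-and-end check is easy to automate as a last line of defence. A crude sketch, again assuming the delivered file is a list of floats (the window length and tolerance here are invented numbers, not part of any delivery spec):

```python
def check_ends(samples, fade_samples=100, tolerance=0.01):
    """Crude 'did the fades survive?' check for a delivered file.

    The very first and last stretch should sit at or near silence;
    a file cut short mid-waveform fails at full level instead.
    """
    head_ok = all(abs(s) <= tolerance for s in samples[:fade_samples])
    tail_ok = all(abs(s) <= tolerance for s in samples[-fade_samples:])
    return head_ok, tail_ok

# A mix with proper fades passes; one truncated mid-note fails at the tail.
faded = [0.0] * 100 + [0.8] * 1000 + [0.0] * 100
truncated = [0.0] * 100 + [0.8] * 1000
print(check_ends(faded))      # (True, True)
print(check_ends(truncated))  # (True, False)
```

It won’t replace a listen, but it catches the “easy to cut something short by mistake” case before the client does.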

Unusual places to check.

What are the unusual places you can check?

Maybe listen to the LFE in isolation. Or check the overlaps between the reels of a movie. These are the places where mistakes often happen.


It’s because no one checks them. Every project will have an extra element or something different. Don’t forget to listen to these exceptions too.

Is everything in sync?

Make sure that all the tracks and all the stems are in sync with each other. Down-mixes tend to have an induced delay, so make sure you move them back into place. An unfortunate mouse click can shift the whole mix out of sync, so make sure everything is intact before you send.
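One way to catch an induced down-mix delay is to cross-correlate a stem against the reference and see where the peak lands. A brute-force sketch; the little LCG noise generator is just a deterministic stand-in for real programme material:

```python
def noise(n, seed=1):
    """Deterministic pseudo-random test signal (a tiny LCG, not audio-grade)."""
    out, x = [], seed
    for _ in range(n):
        x = (1103515245 * x + 12345) % 2**31
        out.append(x / 2**30 - 1.0)
    return out

def estimate_offset(reference, stem, max_lag=200):
    """Return the lag (in samples) where `stem` best lines up with
    `reference`: positive means the stem arrives late."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(
            reference[i] * stem[i + lag]
            for i in range(len(reference))
            if 0 <= i + lag < len(stem)
        )
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

ref = noise(1000)
late_stem = [0.0] * 64 + ref  # down-mix delivered 64 samples late
print(estimate_offset(ref, late_stem))  # 64
```

Divide the reported lag by the sample rate and you know exactly how many milliseconds to slide the stem back.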

All right, that’s it for today!

I hope you now have a better understanding of how to use your ears at the different stages of sound engineering.

And remember; take care of your listening devices. You only have a pair!

Liked the article? Follow me! 🙂

Subscribe for the latest updates

What is Sound Design?



05 MARCH 2017

written by Mike



Sound design is the art of creating a soundscape for a movie, game or any other creative production. This short definition may resemble the art of Foley, but sound design is a bit different.

Sound design is what you turn to for something that you can’t record or that does not exist. It would be hard to record the sound of a dragon or a spaceship. To fill that gap, we need a team of skilled sound designers.

Sound designer is a creative job. It is down to your imagination how certain things will sound. Just think of Ben Burtt’s design work on Star Wars or WALL-E. Those sounds alone have had a huge cultural impact.

Of course, sound design is not just ideas and creative freedom. It can be a laborious and tiresome task too. Designing a great-sounding product requires a lot of time. Time during post-production is scarce, but there are always a few steps that are worth knowing before you start.





There are several ways to create great sound effects. You can record sounds and then process them with different plugins. You can create sounds from scratch, or you can use available sound libraries as your foundation.

A sound library is a collection of already recorded and prepared sounds you can use. It is important to remember that libraries are usually licensed, and you must pay for them. There are also websites like Freesound with freely available sounds that people upload; these places can be a good start when you are in need of sounds.

Sounds from a library can usually benefit small and low-budget productions. They are inexpensive and don’t take too much time to put in place.

The downside is that using a sound library can lead to a generic and bland sound design. It is always better to strive for originality and innovation. But sometimes using the same audio can be fun too.

Recording a basis for a sound effect is the most common and also the most fun way to design a great-sounding scene. Imagine you need to create a cave ambience. First, you can record drops of water from your tap. Then the wind and surroundings of an empty garage, and maybe the movement of your clothes.

It may not sound or look like much in the beginning. But when you layer it in your editing software and process some of the effects, it will be a different story. Drops of water coming from the walls, a quiet wind passing in the distance and an atmosphere of obscure movements will make the whole place come alive.

Another option is to create the sounds from scratch on the computer. For this, you need a MIDI keyboard and a collection of different software synthesisers. For example, software from Spectrasonics is awesome for designing sounds. By connecting your MIDI keyboard to the software synthesiser, you can design anything you want. You can start with already saved presets.

You can create laser gun or magic spell sounds by playing with the synth’s different options.


After the sounds are ready, it is time for editing and implementation. This stage is where you will synchronise sounds to the scenes in a project.

Editing is the straightforward task of cutting, cleaning and moving audio around. For example, say a scene in the eerie cave turns out shorter than anticipated. The atmosphere and ambience that you designed before need to be shortened. And what if a fast-paced car chase scene has changed?

And the director wants an extra off-screen sound in there too?

You will need to sync the sounds again and add another layer of effects too. Such is the life of a sound designer. Implementation of sound design is a bit of a different undertaking. When we talk about implementing sounds, it usually means placing them in a game or application. The same rules for recording, design and editing apply. You can do it all in software like Avid Pro Tools.

Implementation requires programs called sound engines. A sound engine can be a bit more complicated to use. It is a sort of stop in the middle that connects your audio editing software to your final product, such as a video game. The reason sound engines exist is that implementing sound into an interactive project like an app is different from linear movie sound design.

The game or app has to send a signal when a certain sound needs to be played, at what volume and in what context. Imagine playing a game where you walk on a grass field, then pass over a wooden bridge and then into a mountain cave. Listen to the footsteps of the character. You will notice that they sound different in each environment.
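The footstep example boils down to a lookup the engine performs on every step event. A toy sketch of the idea; the surface names, file paths and volume values are all invented for illustration, not taken from any real sound engine:

```python
# Hypothetical per-surface banks of footstep variations.
FOOTSTEP_BANKS = {
    "grass": ["steps/grass_01.wav", "steps/grass_02.wav"],
    "wood":  ["steps/wood_01.wav", "steps/wood_02.wav"],
    "stone": ["steps/cave_01.wav", "steps/cave_02.wav"],
}

def footstep_event(surface, step_count, running=False):
    """Pick the footstep sample the engine should fire for this step.

    Cycling through a small bank of variations avoids the machine-gun
    effect of one file repeating; volume follows the movement state.
    """
    bank = FOOTSTEP_BANKS.get(surface, FOOTSTEP_BANKS["stone"])
    sound = bank[step_count % len(bank)]
    volume = 1.0 if running else 0.6
    return sound, volume

print(footstep_event("wood", 3))  # ('steps/wood_02.wav', 0.6)
```

Real engines layer on randomised pitch, distance attenuation and room reverb, but the surface-driven dispatch is the core of it.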





Once the sound effects are ready, it is time to mix them into the project. Now, creating a linear movie mix is different from an organic game mix. A movie sound effects mixer will work together with the music and dialogue mixers. As a team under the supervision of a director, they will decide on the volume and impact of the SFX in the project.

A big explosion sound can take up the whole space and shake a room. Other times, the sound is muffled, distorted and presented from the point of view of the main character.

For games and applications, the mix works in a different way. A set of scripts is written and implemented, and the mix happens in real time. One player can fight a monster, and dynamic, heroic music kicks in. Another player can hide in the shadows and avoid the fight altogether. Different kinds of sound effects will come into play then.

In the design field, you can find a lot of different specialisations. Big projects will have a monster sound designer, an ambience sound designer, an interface sound designer and so on. In each of these positions, you will have to come up with innovative and exciting ideas for the project. And then implement them into the scene.

Next time you are playing a game or watching a movie, listen to the sound design.

What creative decisions did the sound designers make?

How different does it feel when you mute the sound for a bit?

You will realise how big a role great sound design plays in the success of a production.


What is DAW?



01 MARCH 2017

written by Mike


DAW stands for Digital Audio Workstation. It is the computer and software that we use when working with audio. There are different types of programs that we can call a DAW. Each of them has a set of unique functions and options that we can use.

But what is the best DAW?

Today a DAW is the absolute foundation of any project that uses processed audio. Of course, you can use the simple functions of video programs to add your sounds. But you will find them limited, narrow and boring. If you just want to add a music track or some sound effects to your YouTube video, you don’t need extra software. The editor in Creator Studio will be enough for you.

But where is the fun in that?

Audio programs can be a little daunting when you first try one. But after a while, you will realise that they all share the same basic concepts. Once you learn how to operate one of them, it will be easy to use another. You will still need to learn the new interface, of course.

Let’s dive into the four most basic functions of a DAW.

To choose the right music software, you need to check how the programs work, as each will be a little different and aimed at different things.


Pretty much every single audio program has an option to record sound.

It is rather a question of what you will record and how you will process it later on. If you are going to send recordings to someone else, it is a good idea to use the same software. It will be easier to save the work as a ready session with the correct settings.

It is also critical to decide what kind of recording it will be. If it is a live recording of a band or orchestra, then Avid Pro Tools is the standard. The reason Pro Tools is so popular is its stability and control over audio input options. Recording a large band requires a lot of power, and it is not great if the whole thing crashes during the performance. Stability matters in that instance.

Recording digital sounds is different. A MIDI keyboard is required to send a signal to the computer to generate sounds. And not much else. You’ll find MIDI controllers in all different shapes and sizes. It’s easy to get overwhelmed, so here are a few tips to help you with the decision.

Quick Tip For Selecting a MIDI Controller

The best way to decide which controller is best for you is to make a list of all of the tasks that a controller would simplify for you. Once you’re done, you’ll have a much easier time pinpointing a controller that fits your needs. For example, if you’re simply a piano player looking for a realistic way to record piano progressions into your DAW, you might opt for an 88-key keyboard that supports MIDI, such as the Yamaha P115. If you’re a beat maker who will be playing some simple chord progressions but will be focused on percussion, a 49-key MIDI controller with drum pads, such as the MPK249, would be a great choice. At the end of the day, there isn’t a one-size-fits-all MIDI controller. It all depends on your personal studio workflow.

A program with a focus on working with MIDI will be a much better choice for these kinds of projects.

Apple Logic, Steinberg Cubase and Nuendo are just a few from a vast range of choices. For example, Logic comes with a selection of synths that can create a whole orchestra on a computer. And with a focus on MIDI control, it is much easier to use for that purpose. Of course, a lot of sound engineers work on multiple platforms. We do like to live on the edge.



We know sampling from electronic and rap music, but today it can be a part of any great production, and in most cases it is. Samples are short snippets of audio: a musical theme, a sound effect or a drumbeat. Bands such as Daft Punk rely on sampling. And they do it quite well.

Not only during the creation of their music but during live performances too. Software with a focus on sampling will be useful if your project requires a responsive interface. DAWs such as Ableton Live or FL Studio aspire to be easy to operate when it comes to producing, or performing, music that uses a lot of samples.

MIDI instruments such as the Novation Launchpad are great tools when it comes to the art of sampling, because they are easy to program and connect to your DAW.


We describe editing as cleaning recordings of unwanted noise, cutting sounds into smaller pieces, moving sounds around or just creating a clear session layout for later work. All of the above is editing.

And you can get an Oscar for it too. When it comes to editing audio, there are three things that you want to take into consideration.

Stability, speed, and interface.

Stability is important when it comes to a big project. Editing audio for a full feature movie or game means working through thousands of sounds. An unexpected system crash is the last thing you want, especially if you forgot to save for a while. It happens to the best of us. Sometimes you will also need simple processing such as reversing, adjusting volume or slowing down/speeding up.

Stability and processing power will make the whole process smooth and error-free. Speed is important, especially when you are on a tight schedule. In fact, speed is linked to the interface, our third variable on the list.

A well-designed interface results in greater speed. That results in meeting deadlines. That results in you sleeping better at night. And less coffee and a longer life (probably!). By speed, we mean how fast you can edit, not the actual speed of your system. And that is where Avid Pro Tools is the most popular system.

A vast set of keyboard shortcuts and a clear interface are the selling points of fast-paced editing in Pro Tools. Of course, other programs such as Apple Logic, Cakewalk Sonar or Steinberg Cubase are well designed too, so it is best to try out different DAWs before deciding on one.



The art of mixing sounds is different today than in the past. Back then, you needed an expensive analogue console with many buttons and faders. These days, you can do all the mixing on your laptop. Maybe not all, but that is not the point.

Each DAW is capable of creating a sound mix, but as with everything else, some are better than others. The industry standard is, once again, Avid Pro Tools, due to its stability, clear controls and the hardware that comes with it. When it comes to creating a mix at home, I recommend trying a range of different programs.

Some might be easier to use than others, but in the end, everyone works in a different way. Delivering a good mix depends on many variables, and choosing the right software is only one of them. Deciding on the right DAW can cause you more than a headache. There are many on the market, some more expensive than others. Most of them have as many supporters as bad reviews, so download a demo and test it yourself.

In the end, it is important to remember that audio software is just a tool; learning how to use it in the best way is another story.
