Understanding Sound for Music Producers
A Soft Synced Companion Guide
Core Track → The Setup → Lesson 1
How to Use This Guide
This guide contains essential concepts you'll need for Your Turn and beyond. Read it at your own pace and take breaks when needed, but don't skip it. The depth here is what makes the play and practice work.
Introduction
Before reading further, write down what sound is. One or two sentences. If you don’t have your workbook or a notepad handy, just close your eyes for a brief moment and try to define in your own words what sound is. No looking it up.
Most people write something about vibrations or air molecules or waves. That's not wrong, but it's incomplete in a way that matters for production work. Sound is not just a physical phenomenon you observe. It's the raw material you manipulate every time you open your DAW. Every plugin you use, every fader you move, every mixing decision you make is an intervention in the physics described in this lesson.
This guide covers what sound is, how frequency and amplitude work, and why understanding these concepts matters for the decisions you make dozens of times per session. The physics here is not trivia. It's the foundation for everything that follows.
1. What Sound Actually Is
Sound is pressure variation in a medium, usually air. Something vibrates (a speaker cone, a guitar string, your vocal cords), which compresses and expands the air around it. Those pressure changes travel outward as waves. When they reach your ear, your brain interprets them as sound.
That's the physics. Here's what it means for production: every sound you work with started as a physical vibration, was converted into an electrical signal, then into digital data, and will eventually be converted back into vibration through speakers or headphones. Every step in that chain involves choices about how to represent or alter those pressure variations.
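If you're comfortable reading a little code, here's a minimal sketch of the last link in that chain: digital data standing in for pressure variation. It uses Python and NumPy, which this course doesn't require, and the names are purely illustrative.

```python
import numpy as np

# A digital stand-in for pressure variation: one second of a 440 Hz
# sine wave sampled 44,100 times per second. Each number in `samples`
# is the pressure deviation at one instant in time.
sample_rate = 44100   # samples per second (a common audio rate)
duration = 1.0        # seconds
t = np.linspace(0, duration, int(sample_rate * duration), endpoint=False)
samples = np.sin(2 * np.pi * 440 * t)   # 440 Hz: the A above middle C

print(samples[:5])    # the first few pressure values in the chain
```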
Why This Matters
When you boost 3 kHz with an EQ, you're not "adding brightness." You're amplifying pressure variations that occur 3,000 times per second. When you compress a vocal, you're not "making it sound better." You're reducing the difference between the largest and smallest pressure variations in the signal. Understanding what you're actually manipulating helps you make more deliberate choices instead of following presets or guessing.
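Here's the compression half of that idea as a toy sketch: reducing the gap between the largest and smallest pressure variations. This is not how real compressors are implemented (they track the signal's envelope over time, with attack and release controls); the `reduce_dynamics` name and the numbers are invented for illustration.

```python
import numpy as np

def reduce_dynamics(samples, threshold=0.5, ratio=4.0):
    """Toy downward compression: shrink anything above the threshold."""
    out = samples.copy()
    over = np.abs(out) > threshold
    # Above the threshold, only 1/ratio of the excess gets through.
    excess = np.abs(out[over]) - threshold
    out[over] = np.sign(out[over]) * (threshold + excess / ratio)
    return out

peaks = np.array([1.0, 0.1])    # a loud peak and a quiet one
print(reduce_dynamics(peaks))   # [0.625  0.1]: the gap has shrunk
```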
2. Frequency Determines Pitch
Frequency is how often a pressure wave repeats per second, measured in Hertz (Hz). Higher frequency means more repetitions per second, which we perceive as higher pitch. Lower frequency means fewer repetitions, which we perceive as lower pitch.
The human hearing range is roughly 20 Hz to 20,000 Hz (20 kHz). Below 20 Hz, you feel vibration more than you hear it. Above 20 kHz, most people cannot perceive it at all, and that threshold drops with age or hearing damage.
Octaves and Doubling
Doubling a frequency raises the pitch by one octave. If 220 Hz is an A, then 440 Hz is also an A, one octave higher. The pattern repeats: 440 Hz, 880 Hz, and 1760 Hz are all A notes in different octaves. This relationship is why pitch is not linear across the frequency spectrum. The distance from 100 Hz to 200 Hz is an octave. So is 1000 Hz to 2000 Hz. But the second interval is 1000 Hz wide while the first is only 100 Hz wide.
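This doubling rule is what standard equal-temperament tuning encodes: each of the 12 semitones in an octave multiplies frequency by the same ratio, so 12 of them exactly double it. Here's the formula as a sketch (the `midi_to_freq` name is made up here, but the formula itself is the standard convention, with MIDI note 69 defined as A4 = 440 Hz):

```python
def midi_to_freq(note):
    """Equal temperament: each octave (12 semitones) doubles the frequency."""
    return 440.0 * 2 ** ((note - 69) / 12)

print(midi_to_freq(57))   # 220.0 Hz: A3, one octave down
print(midi_to_freq(69))   # 440.0 Hz: A4, the tuning reference
print(midi_to_freq(81))   # 880.0 Hz: A5, one octave up
```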
Frequency Ranges in Production
Different instruments and sounds occupy different frequency ranges:
Sub-bass (20–60 Hz): Mostly felt rather than heard. Adds physical impact. Too much here makes mixes muddy on larger systems.
Bass (60–250 Hz): Kick drums, bass guitar, low synth notes. This range provides weight and power.
Low mids (250–500 Hz): Body and warmth. Also where mud accumulates if too many elements compete here.
Mids (500 Hz–2 kHz): Core of most instruments and vocals. This range carries intelligibility and presence.
High mids (2–4 kHz): Attack, clarity, edge. Also where harshness lives.
Treble (4–8 kHz): Definition and bite. Sibilance in vocals happens here.
Air (8–20 kHz): Sparkle and space. Cymbals, breath, room reflections.
These ranges matter because mixing is largely about managing frequency overlap. When too many elements occupy the same frequency range, they mask each other. The result sounds cluttered or muddy. Understanding where each element lives helps you carve space and make deliberate choices about what to emphasize.
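If seeing those boundaries as data helps, here's a small lookup sketch built from the band edges listed above. The function name and structure are just one way to write it.

```python
# Band edges (in Hz) taken from the ranges listed above.
BANDS = [
    (20, 60, "sub-bass"),
    (60, 250, "bass"),
    (250, 500, "low mids"),
    (500, 2000, "mids"),
    (2000, 4000, "high mids"),
    (4000, 8000, "treble"),
    (8000, 20000, "air"),
]

def band_name(freq_hz):
    """Return the production band a frequency falls into."""
    for low, high, name in BANDS:
        if low <= freq_hz < high:
            return name
    return "outside the audible range"

print(band_name(3000))   # high mids: where attack and harshness live
print(band_name(100))    # bass
```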
See the instrument frequency chart below for reference.
Instrument Frequency Chart: a visualization of the frequency ranges of musical instruments across families (Percussion, Brass, Woodwinds, Strings, Voice & Keys).
3. Amplitude Determines Loudness
Amplitude is the size of the pressure variation. Larger amplitude means the air is compressed and expanded more dramatically, which we perceive as louder. Smaller amplitude means less dramatic changes, which we perceive as quieter.
Amplitude Is Not Linear
Here's something that matters: doubling amplitude does not make something sound twice as loud. Human hearing is logarithmic, not linear. To perceive something as roughly twice as loud, you need about ten times the power, which works out to only about three times the amplitude (a 10 dB increase). This is why mixing is harder than it looks. Small fader movements can create large perceptual changes, and large fader movements sometimes do less than expected.
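The decibel scale exists precisely because hearing works this way. For amplitude, the standard conversion is dB = 20 × log10(ratio). A few values show why fader math feels counterintuitive:

```python
import math

def amplitude_ratio_to_db(ratio):
    """Convert an amplitude ratio to decibels: dB = 20 * log10(ratio)."""
    return 20 * math.log10(ratio)

print(amplitude_ratio_to_db(2.0))    # ~6.02 dB: doubling amplitude
print(amplitude_ratio_to_db(3.16))   # ~10 dB: perceived as roughly twice as loud
print(amplitude_ratio_to_db(10.0))   # 20 dB: ten times the amplitude, heard as
                                     # roughly four times as loud, not ten
```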
Headroom and Clipping
In digital audio, 0 dB (full scale) is the maximum level. Anything above that clips, which means the waveform gets chopped off and distortion occurs. This is not usually desirable. Producers leave headroom (space between the loudest peaks and 0 dB) to avoid clipping and to maintain flexibility during mixing and mastering.
Peak level and perceived loudness are different things. A sound can have high peaks but feel quiet if most of the signal sits at lower levels. This is why dynamics (the difference between the loudest and quietest parts) matter. Compressors reduce dynamics, which allows you to increase overall loudness without clipping. But overcompression removes dynamic contrast and makes mixes feel flat or fatiguing.
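Here's a minimal sketch of clipping, assuming floating-point audio where full scale sits at ±1.0 (the usual convention inside a DAW, though exact representations vary):

```python
import numpy as np

# A sine wave pushed 6 dB past full scale gets its peaks chopped flat.
t = np.linspace(0, 0.01, 441, endpoint=False)
hot_signal = 2.0 * np.sin(2 * np.pi * 440 * t)   # peaks near 2.0, over full scale
clipped = np.clip(hot_signal, -1.0, 1.0)         # everything past ±1.0 is lost

print(hot_signal.max(), clipped.max())   # ~2.0 vs 1.0: the peak detail is gone
```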
Interactive Frequency Explorer
This hands-on tool lets you explore the relationship between frequency (measured in hertz, Hz) and pitch. By dragging your mouse or finger across the pad, you generate sound in real time and see exactly how frequency affects what you hear.
How to Use It
Click and drag anywhere on the black pad to generate a tone. Moving left to right changes the frequency (pitch) – left is low, right is high. Moving up and down changes the volume. The display below shows you the exact frequency in Hz, the closest musical note, and the current volume percentage.
Notice how the frequency scale isn't evenly spaced – it's logarithmic, just like how our ears work. Moving the same distance always creates the same musical interval, not the same Hz jump. That's why an octave from 100 Hz to 200 Hz feels the same as an octave from 1000 Hz to 2000 Hz, even though one is a 100 Hz difference and the other is 1000 Hz.
Pay attention to the reference points marked on the grid. A4 at 440 Hz is the standard tuning reference. The lowest note, E0 at 20 Hz, is right at the edge of human hearing – more felt than heard. The highest frequencies above 10 kHz are where brightness and air live in your mixes.
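This isn't the app's actual code, but the mapping the pad describes probably looks something like this sketch: equal pad distances produce equal frequency ratios, not equal Hz steps.

```python
def pad_to_freq(x, f_min=20.0, f_max=20000.0):
    """Map a pad position x in [0, 1] to a frequency on a logarithmic scale."""
    return f_min * (f_max / f_min) ** x

print(pad_to_freq(0.0))   # 20 Hz: left edge
print(pad_to_freq(0.5))   # ~632 Hz: the ratio midpoint, not 10,010 Hz
print(pad_to_freq(1.0))   # 20000 Hz: right edge
```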
4. How Waves Interact: Phase
When two or more sounds play simultaneously, their pressure waves combine. This combination can reinforce or cancel the original signals, depending on how the waves align.
Constructive Interference
When two waves are in phase (their peaks and troughs line up), they add together. The result is louder than either wave alone. If you duplicate a track and play both copies at the same time with no delay, the combined signal will be louder.
Destructive Interference
When two waves are out of phase (one wave's peaks align with the other's troughs), they cancel each other partially or completely. If you flip the polarity of one track and play it with the original, they cancel perfectly and you hear silence.
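Both cases are easy to verify numerically. This sketch (Python and NumPy, purely illustrative) adds a wave to itself, then to its polarity-flipped copy:

```python
import numpy as np

t = np.linspace(0, 0.01, 441, endpoint=False)
wave = np.sin(2 * np.pi * 440 * t)

in_phase = wave + wave      # duplicate track: peaks align, amplitude doubles
inverted = wave + (-wave)   # polarity flipped: peaks meet troughs

print(np.abs(in_phase).max())   # ~2.0: constructive interference
print(np.abs(inverted).max())   # 0.0: complete cancellation
```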
Why Phase Matters in Production
Phase issues occur frequently in multi-mic recordings. If you record a drum kit with multiple microphones, some mics will capture the same sound at slightly different times because of their different positions. When you mix those mics together, certain frequencies might cancel while others reinforce, resulting in a thin or hollow sound. This is not always bad, but it should be intentional.
Phase issues also occur when layering synthesizers or samples. Two synth patches playing the same note might sound great individually but thin or weak when layered if their waveforms cancel each other at certain frequencies.
Understanding phase does not require complex tools. Listen critically when combining sources. If layering makes something quieter or thinner instead of bigger, check phase. Most DAWs include a phase inversion button. Flip the polarity of one track and see if the combined sound improves.
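The multi-mic case is subtler than a full polarity flip because the cancellation is frequency-dependent. Here's a sketch of the idea with made-up numbers: a 1 kHz tone reaching a second mic half a period (0.5 ms) late, roughly what a 17 cm path difference causes, cancels almost completely when the two are summed. Frequencies with different periods would survive or even reinforce; that frequency-selective pattern is usually called comb filtering.

```python
import numpy as np

sample_rate = 44100
t = np.arange(0, 0.02, 1 / sample_rate)
close_mic = np.sin(2 * np.pi * 1000 * t)            # the direct capture
far_mic = np.sin(2 * np.pi * 1000 * (t - 0.0005))   # same tone, 0.5 ms later

mixed = close_mic + far_mic
print(np.abs(mixed).max())   # near 0.0: this frequency almost vanishes
```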
Your Turn: Frequency and Amplitude in Practice
This exercise helps you connect the physics to what you actually hear. Set aside 20 minutes.
Step 1: Choose a Track
Pick a professionally mixed track in a genre you work in. Listen to it on decent speakers or headphones. No phone speakers.
Step 2: Identify Frequency Ranges
Listen specifically for:
The lowest element. What frequency range does it occupy? (Sub-bass? Bass?)
The most prominent midrange element. (Usually vocals, lead synth, or guitar.)
The highest elements. (Cymbals? Hi-hats? Vocal air?)
Write down your observations. Don't worry about being precise. The goal is to start hearing frequency as distinct ranges rather than a single "sound."
Step 3: Notice Amplitude Behavior
Listen for:
Which elements stay constant in loudness and which vary.
How the loudest moment compares to the quietest moment. Is the dynamic range wide or narrow?
Whether any element feels too loud or too quiet relative to the others.
Write down what you notice.
Step 4: Test Phase Awareness
If you have access to a DAW, import a simple loop or beat. Duplicate the track. Play both at the same time. Notice it gets louder. Now flip the polarity of one track (most DAWs have a phase invert button on the mixer; look up where yours keeps it). Notice what happens. The sound should cancel significantly or completely.
If you don't have DAW access, just make a mental note to try this when you do.
Expected Outcome
A clearer sense of how frequency and amplitude work in actual music, not just as abstract concepts. Use this awareness every time you mix.
Next time you open the app, mark this Turn complete!
Bonus Tip: Add a reflection to Your Turn to earn Depth points, which unlock Extension Courses!
Producer FAQs
Why do a piano and a synth playing the same note sound different?
Same frequency, different timbre. Timbre is the unique character of a sound, determined by its harmonic content (which frequencies are present above the fundamental) and its envelope (how the sound starts, sustains, and ends). A piano string produces a rich series of harmonics. A synth sine wave produces only one frequency. Different harmonic content creates different timbres, even at the same pitch.
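If you want to see this in code (a toy sketch, not a piano model; the harmonic weights are invented), compare a bare sine with the same pitch stacked with a few overtones:

```python
import numpy as np

t = np.linspace(0, 1, 44100, endpoint=False)
f0 = 220.0   # the same fundamental pitch for both sounds

# A pure sine: just the fundamental, no harmonics.
sine = np.sin(2 * np.pi * f0 * t)

# A richer tone: the same fundamental plus a few weaker harmonics,
# loosely imitating how a plucked string stacks overtones.
rich = sum((1 / n) * np.sin(2 * np.pi * f0 * n * t) for n in range(1, 6))

# Same pitch (220 Hz), different harmonic content: different timbre.
```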
How low can humans actually hear?
Most people hear down to 20–25 Hz under ideal conditions, though this varies. Below that, sound is felt more than heard. Sub-bass in music production often includes content below 30 Hz for physical impact, but too much energy in this range can make mixes sound muddy on larger systems or cause problems in clubs and venues.
How do I know if my mix is too loud?
If the meter turns red or shows levels above 0 dB, it's clipping. Fix this by turning down the track or the master fader. Even if nothing clips, a mix can still feel too loud if there's no dynamic contrast. Loud becomes meaningless without quiet. This is why dynamic range matters.
What does a phase problem sound like?
Layered sounds that should feel bigger sound thin or hollow instead. Certain frequencies disappear or feel weak. The most common scenario is multi-mic recordings (drums, acoustic guitar) where mics capture the same source from different positions. Phase problems are harder to hear than frequency or amplitude issues because the effect is more subtle. If something sounds wrong but you can't identify why, check phase.
Quick Reference
Frequency
How fast pressure waves repeat; determines pitch. Production uses ranges from 20 Hz (sub-bass) to 20 kHz (air).
Amplitude
How large pressure variations are; determines loudness. Digital audio clips at 0 dB; humans hear logarithmically.
Phase
How waves align when combined. In phase reinforces; out of phase cancels. Common issue in multi-mic setups.
Next Steps
Sound is pressure variation shaped by frequency and amplitude. This lesson covered what sound is physically, how frequency and amplitude work, and why understanding these concepts matters for the decisions you make in production.
The next lesson addresses how your computer captures and represents sound. Physical vibration does not translate directly into digital data. The process involves specific constraints and compromises that affect quality, file size, and workflow. Lesson 2 explains how digital audio works and what producers need to know about sample rate and bit depth.
The Guides are your reference. The app is your journey.