
What to Know About Mastering

Photo Courtesy of Lacquer Channel Mastering
July 31, 2019

The final stage of the recording process is key to the listener experience.

When it comes to making albums, the artist is, naturally, the center of attention—the one whose performance will ultimately sell the song, and the one who will reap the most glory if the record’s a hit. And while producers operate behind the scenes (or, rather, behind the glass), the best ones—be it George Martin or Timbaland—can become almost as famous and iconic as the artists they record. Mastering engineers, however, rarely enjoy the same prestige, their names known only to the most dedicated liner-note readers. (To wit, as of this writing, the term “superstar mastering engineer” generates rather thin Google results.) But as the final waystation on a song’s long journey from the artist’s head to the public’s ears, mastering plays a crucial role in determining how that music will be received.

If the producer’s main job is to capture recordings that sound good to the artist, the mastering engineer is entrusted with the task of making those recordings sound good to the listener, wherever they happen to be listening. But because the process is ostensibly more technical than creative, many music fans and fledgling artists have a limited understanding of what mastering actually is and why it’s so important. To help shed some light on this indispensable piece of the record-making puzzle, we spoke to Noah Mintz, a 21-year veteran of the trade. Currently a senior engineer at Toronto’s Lacquer Channel Mastering, Mintz has mastered releases for The National, Feist, and Wolf Parade, among many others. Here, he decodes the mysteries of mastering, while offering his perspective on how the job is changing along with the rest of the music industry.

Mastering is not just about making the music louder.

There’s a common perception that a mastering engineer simply takes tracks and gives them a volume boost, so that music meticulously captured in a hermetic recording studio will sound more vibrant and vivacious when blasted through your car stereo. But in reality, mastering is less a sonic steroid injection and more about achieving a delicate balance among the discrete recordings that make up an album. “Mastering is about taking the individual songs of an album and making them sound like a consistent piece—setting the tone and the volume and making it all just work together,” Mintz says. “And in terms of a single song, it’s about setting the volume at a level that’s consistent with other songs in its genre. It’s about rebalancing the audio to make the song, for all intents and purposes, better.”

Mastering is a lot more complex than it used to be.

A big part of a mastering engineer’s job is anticipating where and how a listener will be hearing the music. Before the ’80s, it was safe to assume that the majority of listening would be done on home-stereo turntables. But as technology has evolved, so too has mastering. With the advent of compact discs, engineers were able to boost volume levels far beyond what old analog formats like vinyl and cassettes could handle, ushering in what would become known as the loudness war—i.e., the push to make CDs as loud as possible to maximize sonic impact. However, as Mintz notes, in the 21st century, “there are more playback sources than at any other time in history,” from traditional home stereos to desktop computer speakers to phones with wireless earbuds. “A mastering engineer needs to know how every song is playing on every kind of device,” says Mintz.

Streaming has also introduced another set of variables. Services like Spotify employ what’s called loudness normalization, a default filter that ensures every song you stream—whether it’s an original-master recording from the 1950s or a modern pop track—is heard at the same volume without you having to manually adjust it. (The feature is especially helpful in leveling out the wild variances in sound level on curated playlists that combine older and newer tracks.) However, since users have the option of turning off the feature in their personal settings, “you can’t really master according to loudness normalization,” Mintz says. “And because all the different streaming services have different loudness-normalization protocols, you have to find a level that works for everything.”
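To make the idea concrete, here is a minimal, purely illustrative sketch (in Python) of how a playback app might compute the gain needed to bring tracks of different loudness to a single playback level. The -14 LUFS target and the example track measurements are assumptions for illustration, not any particular service’s published specification.

# Hypothetical sketch of loudness normalization; not any streaming service's actual code.
# A player measures each track's integrated loudness (in LUFS) and applies a gain
# so that everything plays back near one target level.

def normalization_gain_db(track_loudness_lufs: float, target_lufs: float = -14.0) -> float:
    """Return the gain, in dB, needed to move a track to the target loudness."""
    return target_lufs - track_loudness_lufs

# A quiet older master gets a boost; a loud modern pop master gets turned down,
# so both reach the listener at roughly the same volume.
print(normalization_gain_db(-19.0))  # +5.0 dB for the older, quieter master
print(normalization_gain_db(-8.0))   # -6.0 dB for the hot modern master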

You get what you pay for.

Computers have enabled artists to record and disseminate their music with little more than the push of a button (or the click of a trackpad), and, in recent years, mastering has likewise become a clickable, drag-and-drop task. There are a number of DIY algorithmic services (such as TuneCore, LANDR, CloudBounce, and BandLab) that allow users to upload their tracks and receive instant masters at rates as low as $5 per track (roughly one-tenth of what a pro might charge an indie artist).

But Mintz isn’t worried about the robots coming for his job just yet. “It’s like coffee,” he says. “I use a Nespresso when I’m at work, and it’s fine for that. But when I go to my local coffee shop, it’s perfect, because it’s done manually. I have no problem with Nespresso, but every time you feed it, it’s going to be exactly the same.” Likewise, he points out, an algorithm won’t give you thoughtful results, and most likely not the best possible ones. “If you put your song into an algorithm, sometimes, it sounds good—but sometimes a monkey playing a piano sounds good, too,” Mintz says. “Because most people don’t understand what mastering is, they’re satisfied with that, even though they don’t realize what their song could sound like.

“And,” he adds, “if they don’t like how it sounds, they can’t tell the algorithm how they want it to be different.”

—Stuart Berman

Spotify for Artists helps you develop the fan base you need to reach your goals.
