How much does it cost to record a hit record in 2019? You might be surprised to find out that the answer is not much. Take Grammy-winning music producer Oak Felder, who helped create a string of recent hits by artists such as Nicki Minaj, Alessia Cara and Demi Lovato.
The equipment used on some of these tracks consists of a consumer-grade Apple laptop ($1499), an external sound card ($650), music recording software ($199), and a few software plugins ($600) — a combined cost of less than $3,000.
Of course, the claim that “anyone” can record a hit song neglects a crucial fact: not everyone has the talent or technical skills required. But for those who do, the financial barriers to becoming a music producer are lower than ever before. In terms of production at least, the music business has come close to being a meritocracy, as more and more bedroom producers make the leap into the mainstream. The biggest breakout hit of the past 12 months, Old Town Road by Lil Nas X, gives us a taste of what’s to come. The underlying trap beat was created by Kiowa Roukema, a 16-year-old bedroom producer from the Netherlands, who sold the rights to use his work online through the BeatStars portal.
Dials, buttons and sliders
In the past, the charts were dominated by experienced producers with track records of hits, like Phil Spector, Mickie Most and Trevor Horn. So what changed? The move away from recording and editing on magnetic tape towards fully digital music production in the mid-90s was crucial. Digitization radically reduced the amount of equipment needed to record a song and fundamentally changed the way music producers work.
If you try to conjure a mental image of a recording studio, you probably still think of a large wood-paneled room with a glass booth in the corner containing a vast array of volume faders, dials, switches and electronic gadgets. Many studios like that still exist, and many producers still prefer the tactile controls of a physical mixing desk, but they are not strictly necessary to produce much of the music we hear in the charts.
The centerpiece of every producer’s toolbox in the analog era was the mixing console — a large desk of volume faders and dials used to manage the mix. In the days prior to automation, studios sometimes needed four or five people to “ride the faders” — the process of controlling the volumes as all the tracks were combined and recorded to the final mix.
Mixing consoles were beautiful pieces of bespoke engineering with a price to match. An installation from a manufacturer like Neve, Solid State Logic (SSL), or Automated Processes Inc. (API) would have cost the equivalent of $50,000 in the 1980s, adjusted for inflation. And that is before taking the cost of all the additional gear into account — the compressors, equalizers, reverbs, and synths mounted in the studio rack. In total, the cost of a moderately well-equipped studio could easily exceed $100,000.
Point & mix
Everything changed when digital recording became mainstream in the 1990s. Digital music had a long history prior to this period. The technical foundation, pulse-code modulation (PCM), was invented at Bell Labs in the 1930s, and the first commercial digital recording was released in 1971. So if digital recording had been technically possible for two decades, why did studios not go digital sooner?
Firstly, replacing the mixing console required a virtual equivalent. While this may have been theoretically possible with the command-line operating systems of the 70s and 80s, it only became a practical alternative with the advent of graphical interfaces in the 90s. Secondly, audio files required a lot of storage, and the hard drives of the 1980s did not have the capacity to cope.
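To see why storage was such a bottleneck, it helps to run the numbers for uncompressed CD-quality audio. The figures below are the standard CD specification (44.1 kHz, 16-bit, stereo), not from the article; a quick back-of-the-envelope calculation in Python:

```python
# Uncompressed CD-quality audio: 44,100 samples/s, 16-bit depth, 2 channels
sample_rate = 44_100        # samples per second
bytes_per_sample = 2        # 16 bits = 2 bytes
channels = 2                # stereo

bytes_per_second = sample_rate * bytes_per_sample * channels
mib_per_minute = bytes_per_second * 60 / 1024**2

print(bytes_per_second)          # 176400 bytes/s
print(round(mib_per_minute, 1))  # 10.1 MiB per stereo minute
```

At roughly 10 MiB per stereo minute, a four-minute multitrack session quickly runs to hundreds of megabytes — far beyond the 10–40 MB hard drives typical of 1980s personal computers.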
The first widely used digital audio workstation (DAW) was ProTools. It all started when two UC Berkeley graduates and bandmates wanted to experiment with new drum sounds. They bought a drum machine called the “Drumulator”, but quickly became frustrated with the range of drum sounds on offer.
This led them to form a business called Digidesign, which sold upgrades to provide the machine with new drum sounds. Initially, creating these samples was a laborious process that involved manually editing the raw hexadecimal data of the audio files. So in order to produce the samples more efficiently, they created Sound Designer, a software program to graphically edit and manipulate waveform files. As computer hard-drive capacity expanded during the 90s, this evolved into ProTools, a multi-track digital recording program.
Much of the hardware that had previously been necessary to record music was now replaced with software. The entire mixing console was replaced with an on-screen virtual equivalent, while the various boxes used to apply compression, equalization, distortion, and other audio effects were replaced by software plugins.
Too much of a good thing?
It was now much easier to precisely edit and merge audio samples with original recordings. This was showcased spectacularly by Beck’s 1996 album Odelay. Recorded with ProTools, it was a dramatic collage of live recordings and samples that vividly demonstrated the power and creativity digital editing could unleash. The album made extensive use of vintage analog synthesizers, drum machines, and looped guitar phrases, forming a bridge between rock and the emerging electronic music genre.
However, some producers feel that digital recording caused music to lose its groove. A process known as quantization, for example, makes it possible to alter drum beats so that they align exactly with a tempo grid. This makes the beat technically “perfect”, but it also gives it a mechanized feel more akin to a drum machine than a human drummer — a phenomenon demonstrated expertly by veteran producer Rick Beato on his YouTube channel. Other critics argue that digital mastering led to the “loudness wars”, in which producers competed to make their tracks as loud and punchy as possible at the expense of subtlety and dynamic range.
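In code terms, quantization amounts to snapping each note’s onset time to the nearest point on a tempo grid. A minimal sketch (the function name, tempo, and timings are illustrative, not taken from any particular DAW):

```python
def quantize(onsets, bpm=120, subdivision=4):
    """Snap onset times (in seconds) to the nearest grid line.

    subdivision=4 divides each quarter note into four,
    i.e. a sixteenth-note grid at the given tempo.
    """
    beat = 60.0 / bpm           # seconds per quarter note
    grid = beat / subdivision   # spacing between grid lines
    return [round(t / grid) * grid for t in onsets]

# A slightly "loose" human hi-hat pattern at 120 BPM...
loose = [0.02, 0.13, 0.24, 0.38]
# ...snapped exactly onto the 16th-note grid (0.125 s apart):
print(quantize(loose))  # [0.0, 0.125, 0.25, 0.375]
```

The mechanized feel critics describe comes precisely from this rounding step: every subtle push or drag in the player’s timing is discarded, which is why many DAWs also offer partial (“strength”-based) quantization.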
For good or ill, the digitization of music was the first step towards the music industry we have today. With time, internet speeds improved to the point that files could be quickly shared online. In combination with digital recording software, this enabled musicians on different continents to collaborate without ever setting foot in the same studio.
However, digitization also had downsides. The piracy enabled by services like Napster, followed by ubiquitous copyright infringement on user-upload services like YouTube, led to plummeting revenues in the early 2000s, and the industry never fully recovered.

At Utopia Genesis Foundation, our vision is to use digital technology to increase music industry revenues and empower the next generation of musical innovators. We approach artists’ copyrights from several angles, tokenizing their music as digital assets. The foundation is also working to support new musicians at the start of their careers by creating pools of loans and crowdfunding. New to our product range is Genesis Arts, our NFT platform, which guarantees that every creator of a digital artwork is included in a smart contract that pays them fairly. If you have a great idea for using blockchain in the music industry, join the Utopia Genesis Discord and Telegram; brilliant ideas that we put to use will be rewarded.