I was approached by the makers of Beat Painter to take a look at their web application. Beat Painter is an intriguing take on the old concept of a music visualizer. It borrows from various visual artists and couples this with a modern way of operating (e.g. using your computer’s microphone for input and enabling social sharing), making for a fascinating way to enhance one’s listening experience. The website in question is https://www.beatpainter.com; I encourage you to check it out.
Here is Ilan, the web developer behind the project (who collaborated with Oran, an architect and friend of his), explaining the approach behind Beat Painter.
Did you ever try putting some music on, closing your eyes, and painting on a piece of paper or canvas while the music’s playing?
When we gave people this chance as part of our research for building Beat Painter – an online music visualizer that we recently built – we found that there are a lot of similarities in the way different people do it.
Above: Beat Painter’s interpretation of Johannes Brahms’s Hungarian Dance No. 5
Usually they would start and stop lines according to the beat of the music. When they perceived the music as fast, they frequently changed the direction of their lines or drew wavy ones; when they perceived it as slow, the brush or pen cruised leisurely across the page. When the volume increased, they made thicker lines; and when we gave them the ability to change colors, they usually chose to do so when they felt the music was changing dramatically.
While there were also many differences in the way people drew, we tried to take the unifying elements and model our music visualizer’s behavior accordingly.
Beats start new lines, volume controls thickness, and the color changes when the music changes – sounds easy enough, right? To a human, yes. But a human’s understanding of these things is intuitive and emotional. To transform these intuitive understandings into concrete computer rules turned out to be very complex.
Volume was the most straightforward: an easily measurable parameter. The concept of a beat, by contrast, proved much trickier. To pick out the drumbeat, our app would have to “hear” different frequencies separately, and only recognize starts and stops in the music within the range that corresponds to the drum. And what should it do when the beat is created by more than one drum type (a rather common occurrence!), when another instrument crosses into the drums’ frequency range, or when the beat is defined by maracas or a xylophone?
We decided on a compromise: the length of a line is determined by whatever the computer can detect of the beat, but since we couldn’t make this detection reliable enough, we made it overridable by a user-controlled slider.
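A minimal sketch of this kind of band-limited beat detection, with the slider override layered on top. Everything here is hypothetical (function names, the frequency band, the history length, the spike threshold); the post doesn't show Beat Painter's actual detector, so this just illustrates the "energy spike above a trailing average" idea:

```typescript
// Hypothetical sketch: flag a "beat" when the energy in the drum's frequency
// band jumps well above its own recent average. The input is assumed to be
// one energy value per animation frame, already filtered to the drum band.
function detectBeats(
  bandEnergies: number[],
  history = 8,     // how many previous frames form the trailing average
  threshold = 1.5  // how large a spike counts as a beat
): number[] {
  const beats: number[] = [];
  for (let i = history; i < bandEnergies.length; i++) {
    const recent = bandEnergies.slice(i - history, i);
    const avg = recent.reduce((a, b) => a + b, 0) / history;
    if (avg > 0 && bandEnergies[i] > threshold * avg) beats.push(i);
  }
  return beats;
}

// The compromise from the text: a user-set slider value, when present,
// overrides whatever line length the beat detection produced.
function strokeLength(beatInterval: number, sliderOverride?: number): number {
  return sliderOverride ?? beatInterval;
}
```

The weakness the authors describe falls out of this structure directly: a second drum, or any instrument bleeding into the band, inflates the trailing average and masks real beats, which is why the manual override exists.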
Two other things threatened to derail the experience we wanted to give our users: their microphones and speakers. We made Beat Painter rely on the user’s microphone for input, instead of a list of pre-selected songs as some other online visualizers do, because we wanted it to feel reactive to everything that happens – be that a song played through the speakers, a conversation you’re having in the room, or a dog barking on the floor above.
But this made the experience radically different for people with different setups, even when they were listening to the same songs. We got reports of people hearing powerful metal music with wailing electric guitars while our app responded with tiny, feeble lines. Their microphone, their speakers, or the placement of the two relative to each other simply produced too weak a signal, making Beat Painter “think” the music was quiet.
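The "volume" a browser app perceives from a microphone is typically computed as the RMS of time-domain samples; in a Web Audio setup one would read those samples from an `AnalyserNode` each frame (an assumption here; the post doesn't show the implementation). The measurement itself is a pure function, which makes the weak-signal problem easy to see: a quiet mic capture of loud music yields a small RMS, full stop.

```typescript
// Hypothetical sketch of per-frame loudness measurement. In the browser the
// samples would come from an AnalyserNode, roughly:
//   analyser.getFloatTimeDomainData(buf);  // buf: Float32Array in [-1, 1]
// The perceived volume of that frame is then the root mean square of buf.
// A badly placed or insensitive microphone shrinks every sample, so the RMS
// (and hence the brush stroke) shrinks with it, regardless of how loud the
// music actually sounds in the room.
function rmsVolume(samples: Float32Array): number {
  let sum = 0;
  for (const s of samples) sum += s * s;
  return Math.sqrt(sum / samples.length);
}
```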
Above: Beat Painter’s B&W interpretation of Ahinoam Nini’s Tarnegol Ben Kelev
This behavior wasn’t what people expected, and therefore felt “wrong” (it was also ugly). We had to fix it. The solution we settled on was to normalize the input volume across all computers and songs. We added a small confirmation window that asks the user to start a song in the background and only then click the button. In the couple of seconds between the beginning of the song and the click on this confirmation button, Beat Painter finds the average volume of the music it’s hearing and maps that average to a specific, pre-defined line width. In doing so, we effectively made Beat Painter react to relative changes in volume instead of absolute values, and eliminated wildly big or wildly small brush strokes (except when warranted by changes in the music).
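The calibration step described above can be sketched in a few lines. The names and the reference width are placeholders (the post only says the average maps to "a specific width of line that we pre-defined"), but the shape is the point: the baseline captured before the confirmation click turns every later volume reading into a ratio, so the same song drives similar stroke widths on a quiet or a loud setup:

```typescript
// Hypothetical sketch of the normalization. REFERENCE_WIDTH stands in for
// whatever pre-defined width the averaged calibration volume maps to.
const REFERENCE_WIDTH = 10;

// Average the volumes heard during the couple of seconds before the user
// clicks the confirmation button.
function calibrate(volumes: number[]): number {
  return volumes.reduce((a, b) => a + b, 0) / volumes.length;
}

// Map a later frame's volume to a line width *relative to the baseline*,
// instead of relative to the absolute (setup-dependent) signal level.
function lineWidth(volume: number, baseline: number): number {
  if (baseline <= 0) return REFERENCE_WIDTH; // guard: no usable calibration
  return (volume / baseline) * REFERENCE_WIDTH;
}
```

With this mapping, a frame at exactly the calibration average always paints at the reference width, and only deviations from that average change the stroke.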
However, our biggest challenge was giving our users a sense of creative freedom. We wanted the experience to resemble that intuitive, eyes-shut painting, while also giving people the pleasure of seeing things move and react to sound without any intervention on their part. We were trying to strike a delicate balance between freedom and control.
We decided to give our users a limited degree of control over the parameters. They can drag sliders to change the width and length of the paint strokes, but they can’t set or see an actual number for either setting. They can change the number of colors in the painting, but they can’t choose which colors.
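One way this "control without numbers" can be structured (a sketch under assumed ranges and a placeholder palette; none of this is Beat Painter's actual code): the UI only ever deals in a unitless slider position, and the mapping to concrete values stays internal, so there is nothing numeric to show the user. Likewise, the user picks how many colors but the app owns which ones:

```typescript
// Hypothetical: map a 0..1 slider position to an internal stroke width.
// The min/max range is never surfaced in the UI, so the user drags a
// dimensionless handle and only sees the painting change.
function sliderToWidth(position: number, min = 1, max = 40): number {
  const t = Math.min(1, Math.max(0, position)); // clamp to [0, 1]
  return min + t * (max - min);
}

// Hypothetical: the user chooses a *count*; the app supplies the colors
// from a palette it controls (these hex values are placeholders).
const PALETTE = ["#e63946", "#457b9d", "#f4a261", "#2a9d8f", "#264653"];
function activeColors(count: number): string[] {
  return PALETTE.slice(0, Math.max(1, Math.min(count, PALETTE.length)));
}
```

Keeping the mapping one-directional (position in, width out, nothing displayed) is what preserves the "behind the curtain" feel the authors describe.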
We hope we managed to give the people who will use our visualizer the sensation of self-expression without showing them what’s behind the curtain; we hope we enabled their computer to correctly “guess” how their hand would’ve reacted to the music, and therefore delight and excite them every time.