# Blog

## test d3

A quick test: the classic D3 v3 scatterplot example (it expects a `data.tsv` with `sepalLength`, `sepalWidth`, and `species` columns):

```html
<!DOCTYPE html>
<meta charset="utf-8">
<style>

body {
  font: 10px sans-serif;
}

.axis path,
.axis line {
  fill: none;
  stroke: #000;
  shape-rendering: crispEdges;
}

.dot {
  stroke: #000;
}

</style>
<body>
<script src="//d3js.org/d3.v3.min.js"></script>
<script>

var margin = {top: 20, right: 20, bottom: 30, left: 40},
    width = 960 - margin.left - margin.right,
    height = 500 - margin.top - margin.bottom;

var x = d3.scale.linear()
    .range([0, width]);

var y = d3.scale.linear()
    .range([height, 0]);

var color = d3.scale.category10();

var xAxis = d3.svg.axis()
    .scale(x)
    .orient("bottom");

var yAxis = d3.svg.axis()
    .scale(y)
    .orient("left");

var svg = d3.select("body").append("svg")
    .attr("width", width + margin.left + margin.right)
    .attr("height", height + margin.top + margin.bottom)
  .append("g")
    .attr("transform", "translate(" + margin.left + "," + margin.top + ")");

d3.tsv("data.tsv", function(error, data) {
  if (error) throw error;

  data.forEach(function(d) {
    d.sepalLength = +d.sepalLength;
    d.sepalWidth = +d.sepalWidth;
  });

  x.domain(d3.extent(data, function(d) { return d.sepalWidth; })).nice();
  y.domain(d3.extent(data, function(d) { return d.sepalLength; })).nice();

  svg.append("g")
      .attr("class", "x axis")
      .attr("transform", "translate(0," + height + ")")
      .call(xAxis)
    .append("text")
      .attr("class", "label")
      .attr("x", width)
      .attr("y", -6)
      .style("text-anchor", "end")
      .text("Sepal Width (cm)");

  svg.append("g")
      .attr("class", "y axis")
      .call(yAxis)
    .append("text")
      .attr("class", "label")
      .attr("transform", "rotate(-90)")
      .attr("y", 6)
      .attr("dy", ".71em")
      .style("text-anchor", "end")
      .text("Sepal Length (cm)");

  svg.selectAll(".dot")
      .data(data)
    .enter().append("circle")
      .attr("class", "dot")
      .attr("r", 3.5)
      .attr("cx", function(d) { return x(d.sepalWidth); })
      .attr("cy", function(d) { return y(d.sepalLength); })
      .style("fill", function(d) { return color(d.species); });

  var legend = svg.selectAll(".legend")
      .data(color.domain())
    .enter().append("g")
      .attr("class", "legend")
      .attr("transform", function(d, i) { return "translate(0," + i * 20 + ")"; });

  legend.append("rect")
      .attr("x", width - 18)
      .attr("width", 18)
      .attr("height", 18)
      .style("fill", color);

  legend.append("text")
      .attr("x", width - 24)
      .attr("y", 9)
      .attr("dy", ".35em")
      .style("text-anchor", "end")
      .text(function(d) { return d; });

});

</script>
```

## Synesthetic Piano

Have you ever wondered what Beethoven's 5th symphony would look like as a painting? Of course not, that doesn't even make sense. Beethoven is music, meant to be enjoyed by the ears, not the eyes. But then, there are a lot of things we modify in order to see: vibrations of the earth's crust seen as graphs, magnetic resonance images, the gamma rays of a distant supernova, and on and on. So why can't we paint a song?

The Hill City Keys project seems like the perfect opportunity to find out. Since I can't play piano, I'll instead leverage the talent of the Lynchburg community. And since I've often heard some impressive music played at the community market piano, I decided to make it my painting piano. So I wired it up to a Raspberry Pi that detects the notes being played in real time and sends them to a second Raspberry Pi at the Academy Center of the Arts. That Pi controls a robotic arm that moves a paintbrush according to each incoming note, painting a picture of the music as it is played.

You can watch the arm paint in real time below, followed by a more detailed description of the project.

## The Piano

### Detecting Notes

There are many ways to detect the notes played on the piano. A few options: put a touch sensor on each key, watch the keys being pressed with a camera, or detect the sound being played.

I decided to take the audio sampling route. Music notes are simply letters we assign to specific audio frequencies: when the air vibrates at about 262 Hz, that's middle C. So in order to detect the note played on the piano, we first need to record audio samples. The Synesthetic Piano does this using a simple USB microphone connected to a Raspberry Pi. The Pi records an audio sample for a short period of time, which contains a waveform of amplitude versus time. Unfortunately, amplitude is not what we're looking for. Any note can be played at any amplitude; what we need is the frequency. Fortunately, Joseph Fourier figured out that any function can be represented by a series of sine waves. Along with this realization, he came up with a clever way to transform a function of time into a function of frequency, which is exactly what we need. In essence, the Pi records an audio sample of amplitude versus time, then recreates that sample as a series of sine waves, each with a known amplitude, frequency, and phase. The Pi takes the sine wave with the highest amplitude, as it most likely represents the note being played, and looks up the note corresponding to that frequency. Mathematically, it looks like this, with $f(t)$ being the amplitude-versus-time function and $\hat{f}(\xi)$ being the resulting function of frequency.

$$\hat{f}(\xi) = \int_{-\infty}^{\infty} f(t)\,e^{-2\pi i t \xi}\,dt$$

Python's wonderful SciPy library comes with a discrete Fourier transform function that does all the hard work for us. I just chose a sampling rate that keeps the notes accurate while reading one note every tenth of a second.
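A minimal sketch of that pipeline (the sampling rate, window length, and note table below are assumptions for illustration; the real project reads from the USB microphone rather than the synthesized tone used here as a stand-in):

```python
import numpy as np
from scipy.fft import rfft, rfftfreq

SAMPLE_RATE = 44100  # samples per second (assumed)
WINDOW = 4410        # one tenth of a second of audio

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def dominant_frequency(samples, rate=SAMPLE_RATE):
    """Return the frequency (Hz) of the strongest sine component."""
    spectrum = np.abs(rfft(samples))
    freqs = rfftfreq(len(samples), d=1.0 / rate)
    return freqs[np.argmax(spectrum)]

def frequency_to_note(freq):
    """Map a frequency to the nearest note name, using A4 = 440 Hz."""
    semitones = int(round(12 * np.log2(freq / 440.0)))
    return NOTE_NAMES[(semitones + 9) % 12]  # "A" sits at index 9

# Synthesize a tenth of a second of middle C (261.63 Hz) as a stand-in
t = np.arange(WINDOW) / SAMPLE_RATE
samples = np.sin(2 * np.pi * 261.63 * t)
print(frequency_to_note(dominant_frequency(samples)))  # → C
```

With a 4410-sample window the frequency bins are 10 Hz apart, which is coarse but enough to separate adjacent piano notes through most of the keyboard.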

## Sending and Receiving Data

Once the piano figures out what note you played, the data needs to be sent to the painter. Since the piano and the painter are very far from one another, I chose to transfer the data over the internet. Another challenge to consider is that these two devices are on two different networks, neither of which I have administrative rights on. This meant that I couldn't just send data directly from the piano to the painter. Instead, I decided to use the MQTT protocol and route all the data through CloudMQTT, a hosted broker that both Pis can connect out to.
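The real project talks to CloudMQTT with an MQTT client library, but the publish/subscribe pattern itself can be sketched with an in-process queue standing in for the broker (the topic name and message format here are hypothetical):

```python
import json
import queue
import time

# Stand-in for the MQTT broker: in the real setup this queue is replaced
# by a CloudMQTT topic that both Raspberry Pis connect to over the internet.
broker = queue.Queue()
TOPIC = "piano/notes"  # hypothetical topic name

def publish(topic, note):
    """Piano side: package a detected note with a timestamp and send it."""
    broker.put((topic, json.dumps({"note": note, "ts": time.time()})))

def receive():
    """Painter side: block until the next note arrives, then decode it."""
    topic, payload = broker.get()
    return json.loads(payload)["note"]

publish(TOPIC, "C#")
print(receive())  # → C#
```

The timestamp lets the painter discard stale notes if the connection lags, so the arm doesn't replay a backlog of old movements.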

## The Painter

The painter's job is to be inspired by music to paint something beautiful.

### The Arm

The arm is made of aluminum, with stepper motors as the joints, and a 3D printed hand. When C, C#, D, or D# is played, the shoulder moves clockwise or counter-clockwise, quickly or slowly, depending on which of the four notes it hears. Similarly, E, F, F#, and G do the same for the elbow.
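The note-to-motion mapping amounts to a small lookup table. A sketch (the particular direction and speed assigned to each note here are illustrative guesses, not the actual wiring):

```python
# Hypothetical mapping: each of the eight joint notes picks a joint,
# a direction, and a speed. The exact assignments are illustrative.
JOINT_MOVES = {
    "C":  ("shoulder", "cw",  "fast"),
    "C#": ("shoulder", "cw",  "slow"),
    "D":  ("shoulder", "ccw", "fast"),
    "D#": ("shoulder", "ccw", "slow"),
    "E":  ("elbow", "cw",  "fast"),
    "F":  ("elbow", "cw",  "slow"),
    "F#": ("elbow", "ccw", "fast"),
    "G":  ("elbow", "ccw", "slow"),
}

def move_for(note):
    """Look up which joint to move, and how, for a given note."""
    return JOINT_MOVES.get(note)  # None for the pump notes G# through B

print(move_for("D#"))  # → ('shoulder', 'ccw', 'slow')
```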

### The Paint

There are four different colors of paint, each pumped by a 3D printed peristaltic pump. These are positive displacement pumps, which means they move the same volume of paint with every step of the motor. When G#, A, A#, or B is played, the corresponding paint is pumped onto the canvas.
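Because the dose is proportional to motor steps, converting a desired paint volume into steps is simple division (the volume-per-step figure below is an assumption for illustration, not a measured value):

```python
# Peristaltic (positive-displacement) dosing: each motor step moves a
# fixed volume of paint, so volume and step count are interchangeable.
ML_PER_STEP = 0.05  # assumed mL of paint moved per motor step

def steps_for(volume_ml):
    """Number of motor steps needed to pump a given volume of paint."""
    return round(volume_ml / ML_PER_STEP)

print(steps_for(1.0))  # → 20
```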

## The Big Picture

There are two things I look forward to learning from this experiment: are the paintings repeatable, and do good songs look better than a random mashing of the keys?