Finally, we’ve been able to access the recording of our presentation at Rich Mix on the 28th of May. Watching ourselves speaking in public is a valuable experience, and it will be very helpful for future exercises and meetings of this kind. The talk ran smoothly from beginning to end, and the audience found the topic of using sensors as an accessible feature for musical instruments quite interesting and innovative. Before our presentation, there was also an informative introductory video from our course leader Annie Goh, explaining in more depth the content of BA Sound Arts, but for some reason it wasn’t added to this excerpt. The whole event was presented by Rob Parton, the DMLab project manager & associate fundraiser, who introduced the speeches brilliantly and asked some questions too, and Deborah Borg Brincat, programme delivery manager at Drake Music, who organised and relayed questions from the Zoom chat. The event was also brilliantly interpreted for deaf attendees by two BSL interpreters. It was a great evening, and here we have the recording of our presentation.
Category: Collaborating
DMLab at Rich Mix
The very final test of this project was to give a presentation at a public event organised by Drake Music. The event hosted another presentation apart from ours, related to accessible DJing, which was really interesting. Our presentation went well, and we enjoyed talking about the different stages and concepts of our project. We received loads of engaging questions and various suggestions for future improvements to our instrument. The attendees were very interested in the device, and the conversation branched into diverse areas of accessibility, physicality and technology. We accompanied the talk with pictures of the development process as well as a little demonstration of the instrument, and some people even tried it after the presentation. Here is a short video recorded by one of the attendees.
The whole presentation was recorded on Zoom and will probably be shared soon. However, despite Annie and us having contacted Drake Music via email, I’m afraid that at the time of writing, on the submission day, we still don’t have access to these recordings. I will make another blog post with them, so they might be available by the time this unit is marked. I hope this can still be taken into consideration, given that the presentation happened just two days before the submission deadline, as I think it is valuable documentation of our progress during this unit.
Anyway, here I will leave Drake Music’s website for any further information; they usually upload the full presentations to YouTube too. We really enjoyed this experience; it was a really interesting project. Here are some pictures that I took at the event.






Wiring, Coding and finishing up the Air Soundscape Generator
Our patch in Pure Data is now finished, and we need to load our own samples into the project so it’s ready to generate soundscapes. After finishing the design and laser cutting together, we decided to split the next steps: Lucas would be in charge of the sound design, and I would handle the coding, as I’ve shown previously. He sent me the samples, and I programmed them to run in PD using the message [open filename.wav]; the object [readsf~] then plays the sound file after being triggered with a bang. The files are played and looped from the beginning using the object [loadbang], and their volume is controlled with the distance sensors, as we have seen before. The initial volume is 0, so the instrument stays silent until it detects activity on the sensors. This is the Bela IDE, the built-in application where we load PD patches onto the Bela Mini.
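As a rough summary of what each voice in the patch does, here is a sketch in Python rather than Pd (the class and the file name are made up purely for illustration): each sample is opened and looped as soon as the patch loads, but stays silent until a sensor raises its volume.

```python
# Hypothetical sketch of one sample voice, mirroring the Pd logic:
# [loadbang] -> [open filename.wav( -> bang to [readsf~] (looped),
# with the output gain starting at 0 until a sensor moves.
class Voice:
    def __init__(self, filename: str):
        self.filename = filename  # name assumed for illustration
        self.volume = 0.0         # silent until a sensor detects activity
        self.playing = False

    def loadbang(self):
        # In the patch, [loadbang] opens the file and starts looped playback.
        self.playing = True

    def set_volume(self, v: float):
        # The distance sensor drives this value, clamped to 0..1.
        self.volume = max(0.0, min(v, 1.0))

voice = Voice("ambience.wav")
voice.loadbang()
print(voice.playing, voice.volume)  # True 0.0
```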

The Bela IDE helps us load patches, samples and other customisations, as well as set up a project that runs standalone when the Bela boots. In the patch we can see the final programming for each sensor: on the left side, the sensor equations that calculate distance; on the right side, the sample file system; and in the middle, the volume control.
Now I was ready to wire up the four sensors and attach them to the instrument’s enclosure. Here are some pictures of the main wiring system.






As mentioned before, our instrument plays three different sounds commonly used in Sound Arts (noise, ambience and field recordings), and the fourth sensor adds an effect to the master channel. I also added two hinges to the top side in order to create a lid that can be opened and closed whenever we want to access the wiring inside. And this is the amazing look of our finished accessible instrument:

I did a little jam to share on social media, which received good feedback, with plenty of intrigued people asking questions about it. I also tagged Bela Platform, and they shared it on their main profile. Here is that jam, where you’ll be able to see and hear how the Air Soundscape Generator works.
Ultrasonic Sensors

One of the key elements of our accessible instrument, and something that Lucas and I were clear about from the beginning, was the idea of building sensors into the functionality of the device. These sensors, the HC-SR04, known as ultrasonic or distance sensors, are commonly used in interactive installations, and they have been widely combined with the Bela Board and Pure Data, tools that we studied at the beginning of this course in the unit “Expanded Studio Practice for 21st Century Sound Artists”.
The way these sensors work is similar to the echolocation that allows bats to navigate in space. One of the round transducers emits an ultrasonic signal, while the second receives the echo. By sending a trigger from PD to the Bela Board and applying a space/time equation available in Bela’s online documentation, the sensor can output a value equivalent to the distance of any object placed in front of it. The sensor also needs some wiring and a couple of resistors to work, and I built this circuit on a breadboard to start experimenting with one of the sensors.
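The space/time equation is simple to sketch: the echo pulse duration is multiplied by the speed of sound and halved, since the ping travels to the object and back. Here is a minimal Python sketch of that calculation (the function name and the example duration are mine, for illustration):

```python
# Convert an HC-SR04 echo pulse duration into a distance.
# The speed of sound is roughly 343 m/s at room temperature, i.e. 0.0343 cm/us.
SPEED_OF_SOUND_CM_PER_US = 0.0343

def pulse_to_distance_cm(echo_duration_us: float) -> float:
    """Halve the round trip: the pulse travels to the object and back."""
    return echo_duration_us * SPEED_OF_SOUND_CM_PER_US / 2

# A 1000 us echo corresponds to roughly 17 cm.
print(round(pulse_to_distance_cm(1000), 2))  # 17.15
```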

But using these sensors to control parameters is not as straightforward as it might look at first sight, and it took me quite a lot of time to find the best possible patch for a natural, smooth feel when controlling the volume of a sample, which is the feature we wanted for this instrument. Here is a short video of the first experiments with the volume control of a sample; in Bela’s console we can also see the printed distance, calculated every 60ms.
To make this volume control more natural, I added some smoothing to the values with the object [line~], sending it the message [$1 800], where $1 is the incoming distance value: [line~] then ramps to each new value over 800ms, making the volume changes smoother. I also used the objects [samphold~] and [snapshot~] to hold the distance at any point; otherwise the distance, and therefore the volume, kept increasing after removing the hand. The maximum distance is also capped at 25cm to keep the movement range reasonable. Once I had a good patch for the first sensor, I just needed to duplicate it for the four sensors, the only difference being that the last one controls a filter I made with [vcf~], through which the other three samples are routed to the master output. Here is the final patch in Pure Data, ready to be tested with four sensors.
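To illustrate the idea behind the smoothing, here is a small Python sketch (not the actual Pd patch): the distance is capped at 25cm and normalised into a 0–1 volume, and a constant-rate ramp stands in for [line~]’s 800ms glide. The function names and defaults are mine, for illustration only.

```python
# Hypothetical sketch of the volume mapping used in the patch.
MAX_DIST_CM = 25.0  # distances beyond this are clamped

def distance_to_target_volume(distance_cm: float) -> float:
    """Clamp the distance to 0..25 cm and normalise it to a 0..1 volume."""
    clamped = min(max(distance_cm, 0.0), MAX_DIST_CM)
    return clamped / MAX_DIST_CM

def ramp(current: float, target: float, dt_ms: float, ramp_ms: float = 800.0) -> float:
    """Move `current` towards `target` at a constant rate,
    reaching any full-scale jump in ramp_ms (like [line~] with [$1 800()."""
    step = dt_ms / ramp_ms
    if abs(target - current) <= step:
        return target
    return current + step if target > current else current - step

# After 400 ms, a jump from silence to full volume is only halfway there.
print(ramp(0.0, 1.0, 400.0))  # 0.5
```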

Laser-cutting the enclosure in the 3D Workshop
Once our design for an accessible instrument was created in Adobe Illustrator, it was time to join the 3D Workshop staff to run the laser cutter and build the structure of the Air Soundscape Generator. The Illustrator file is loaded into dedicated laser-cutting software, and after setting up the machine and placing the selected material, the laser cutter is ready to run.

The design is cut and engraved on a 3mm plywood board, a resistant material normally used for these kinds of projects. The cutting process lasted 11 minutes, and our enclosure was ready to be assembled. We are really happy with how the engraving came out: the design looks precise and delicate, and the titles clearly explain the different functions of the instrument. The next step will be coding and wiring up the sensors.


First design for our accessible instrument
For the accessible instrument that we’ll be creating for Drake Music, following path 2 of this collaborative unit, we are going to use the laser cutters in the 3D Workshop to create a wooden enclosure for the device. I’m teaming up with my BA Sound Arts classmate Lucas Yoshimura on this project, and our first step has been creating this design together in Illustrator, ready to be cut and engraved on plywood with the laser cutter. The Air Soundscape Generator, as we’ve called it, is a kind of sampler controlled with sensors, which can be played with any part of the body; it will feature three different kinds of atmospheres common in Sound Arts, plus an effect. This is the final design, ready to be assembled.

Sessions with Megan Steinberg from Drake Music
Following our visit to Drake Music and ahead of our accessible instrument project, we’ve also had some support lectures with Megan Steinberg, an accessibility specialist who works with Drake Music and holds a PhD in accessible instruments. These lectures focused on explaining different kinds of disabilities and how to adapt musical elements to make them accessible. They were helpful and inspiring in terms of instrument design, and we also discovered several disability-related projects by both Megan and Drake Music. We also did an exercise choosing between a couple of options, where I chose creating a musical score for a pianist with anxiety. I really enjoyed this exercise and learnt a bit more about how to make musical notation accessible. In this design I used a piano-roll pattern with different relaxing icons that could make a piano player feel more comfortable. This was the result:

Week 22: Visiting Drake Music
This week we visited the site where Drake Music is located, and they showed us their work on accessible instruments. The talk was very interesting: we were able to play the instruments, watch some videos of performances, and we also had a Q&A discussion. This is my chosen option for this unit, so I kept focused throughout the lecture to get the best outcome. Here are some pictures and videos from this off-site lesson.






These were some of the amazing instruments that Drake Music builds on site for disabled people; as we learned while talking to them, they make these instruments using C++, Arduino and HTML. For our project, Lucas and I will be using Pure Data and a Bela Board, and our instrument will consist of at least four sample tracks controlled by sensors placed vertically. This would allow a person with reduced or no mobility in their hands to control these samples with their arms, feet or even head. The sounds we are looking to load into the instrument will relate to Sound Arts, for example field recordings, drones or noise. These are some sketches at a very early stage of development.
Follow up, choose one: develop a sketch or diagram for an accessible instrument design that you will be able to discuss in next week’s session with Megan Steinberg.

Week 19
- Research 2 designs that already exist for accessible instruments in your area of sound and music and discuss them in a blog post.
The SynLimb
I would like to include Bertolt Meyer here. He was born without an arm, and his passion for electronic music and synthesisers led him to create a special instrument that attaches to his prosthesis: using the electrical signals from his nerves that normally control his prosthetic hand, he is able to control parameters of his modular synth. This accessible instrument, the SynLimb, was created in collaboration with KOMA Elektronik and allows him to perform live using both arms.
Playtronica Touch Me
This device is not designed exclusively for people with disabilities, but it can be used in so many ways that it could adapt to different kinds of disabilities: it allows us to transform any conductive object, like plants, fruits and metallic utensils (or even our own body), into a musical instrument. The way it works is very simple: it measures the electrical resistance between two points and turns it into MIDI notes, so it can be used with any DAW or VST instrument via USB.
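To illustrate the principle, here is my own sketch of how a resistance reading could become a note; this is not Playtronica’s actual algorithm, and the resistance range and note range are assumptions chosen only for the example.

```python
# Hypothetical mapping from a measured resistance between two contact
# points to a MIDI note number: lower resistance (firmer touch / more
# conductive object) plays a higher note.
def resistance_to_midi_note(r_ohms: float,
                            r_min: float = 1e3, r_max: float = 1e6,
                            note_low: int = 48, note_high: int = 72) -> int:
    r = max(r_min, min(r_ohms, r_max))       # clamp to the usable range
    span = (r - r_min) / (r_max - r_min)     # normalise to 0..1
    return round(note_low + (1 - span) * (note_high - note_low))
```

In a real device this note number would then be sent out as a MIDI note-on message over USB.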

Week 18
- Consider the criteria within the Learning Outcomes on the assignment brief and the ways in which your initial ideas can meet them
- Expand on last week’s blog post incorporating the Learning Outcomes
- Prepare a link of your work to share with students and collaborators from other disciplines
Enquiry
Engage in practice-based research and demonstrate confidence in creative problem-solving.
I think I have plenty of experience building instruments like synths, pedals and other mods and gadgets; I’m also well versed in Pure Data and the Bela Board, and I’m currently learning other programming languages. Creating an instrument for Drake Music will be a challenge, but I think I can contribute my experience to carry this project forward.
Knowledge
Demonstrate proficiency in the application of subject knowledge alongside the sharing and exchange of knowledge with other disciplines and art-forms.
I will apply my knowledge as well as I can and will also learn new things in order to achieve the project’s goal. I’m also quite interested in the visual aspects, such as design, sculpture and DIY.
Communication
Demonstrate confidence in the communication and presentation of ideas.
Explaining the process is, for me, one of the most important parts of a creative project. I’m always keen to write, sketch or talk about my techniques, and I will be happy to document every aspect of my work.
Process
Show evidence of engagement with the principles of enterprise and entrepreneurship.
I chose this pathway because I think working with a company on an established project is a great opportunity; however, I also have experience developing my own projects and collective works, like organising events, releases and other kinds of collaborative ideas.
Realisation
Evidence the ability to collaborate with those of a different discipline in undertaking a common project, demonstrating an interdisciplinary perspective.
I will certainly enjoy working with other individuals and artists in this unit, and I hope we can learn from and help each other; this is probably the most important outcome of this assignment.
Links:
Here are a couple of links that I could share for a collaboration.
This is a more professional portfolio with diverse works on sound design, graphics and photography:
https://danielmarindesign.wordpress.com
I can also show my personal projects on synths, electronic music and sound design under my alias “Dasero” through this link: