Impacts of Increased Accessibility in Music Technology

By Noah F-K

Abstract


This study was conducted to highlight areas of creative possibility that electronic technology opens up to musicians, and the impacts of that access. Technology, defined as the practical application of scientific knowledge (Suchman, 1999), partnered with access to and engagement with electronics, can provide a flexible and rewarding framework for contemporary musical instruments and controllers. Literature review and self-directed practical experimentation have revealed systems that can serve as foundations for the evolution of music technology. Exploration of programmable electronic platforms has exposed further debate and research into emergent designs that allow for new forms of human-machine interaction. The direction of the research has been guided in part by the desire to find solutions to barriers in human-machine interaction (Wessel and Wright, 2002), and by the therapeutic advantages of electronics education in encouraging critical thinking, autonomy and motivation, and as a means to counter the negative impacts of hyper-technologicalization (Crowe and Rio, 2004). Further guidance comes from the desire to validate hybrid electronic-musical education and to draw attention to omissions in educational policy; the combination of electronic technology and music could serve as a platform for electronic design. A re-evaluation of human-machine interfacing also reveals new forms of human-human interaction made possible by technology (Fels et al., 2002). Finally, there is a divide in the community over the philosophies underpinning instrument design: how accessible should an instrument be, and at what cost to virtuosic depth of expression (Cook, 2004)?

Existing Research on Accessibility in Music Technology


The literature proposes a need for the music technology equipment industry to investigate new methods of electronic instrument control (Wessel and Wright, 2002). Commercial equipment designers and manufacturers have not fully evolved with technological capabilities, still relying on arguably outdated systems such as rotary controllers, knobs, keys and sliders borrowed from data-entry systems (Linn, 2016). These discrete (Cook, 2004) control methods force musicians into inexpressive and lifeless instruments, resulting in repetitive and restricted performance. There is an argument for moving to a continuous style of control: continuously variable rather than limited to a finite number of positions like an on/off switch, allowing music to be made by digital means while keeping the full expressive potential of the human motor system available (Wessel and Wright, 2002), not confined to a static interface. This area has been researched and developed since the early days of digital systems (Johnson, 1991). Using programmable electronics such as Arduino, it is possible to create new forms of musical interaction and control; a controller no longer has to be a button. This amounts to a revolution in human expression and in communicating emotion and ideas (Edstrom, 2016).
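
To make the contrast between discrete and continuous control concrete, the short sketch below is a minimal, hypothetical Arduino example (the pin choice, controller number and wiring are assumptions, not drawn from the sources above): it reads a continuously variable voltage from a sensor and streams it out as a MIDI continuous controller, so a bend of a finger or a change in light can sweep a parameter smoothly rather than stepping between fixed positions.

// Minimal sketch (assumed wiring): a potentiometer or other continuous
// sensor on analog pin A0, MIDI out on the hardware serial port.
const int SENSOR_PIN = A0;   // any continuously variable voltage source
const int CC_NUMBER  = 1;    // modulation wheel, chosen arbitrarily
int lastValue = -1;

void setup() {
  Serial.begin(31250);       // standard MIDI baud rate
}

void loop() {
  int reading = analogRead(SENSOR_PIN);          // 0-1023, continuous
  int ccValue = map(reading, 0, 1023, 0, 127);   // scale to MIDI CC range
  if (ccValue != lastValue) {                    // only send on change
    Serial.write(0xB0);                          // Control Change, channel 1
    Serial.write(CC_NUMBER);
    Serial.write(ccValue);
    lastValue = ccValue;
  }
  delay(5);
}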

Advanced human interfaces can be adapted to suit how the body performs through a three-dimensional field. The LinnStrument (Linn, 2013), an effort in intuitive contemporary design, maximises the capabilities of the human hand: an array of pressure-sensitive pads tracks finger movement across three axes, pressure (Z axis), front-back (Y axis) and left-right (X axis), along with strike and release velocity. These five types of touch sensing are harnessed for musical control, maximising expressive potential. The aim is a polyphonic instrument with continuous control, two factors that have only recently been integrated (Johan, 2013), in order to free musicians from the constraints and expressive limitations of human-machine interfaces built on discrete control systems. A similar design appears in the Roli Seaboard, a multidimensional controller laid out like a traditional keyboard but offering five dimensions of control as opposed to the single dimension found traditionally (Roli, 2017). The producers of these instruments propose their designs as revolutionising electronic music by redefining the means of production, exploiting digital capacity to generate broader expression.
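
As a purely illustrative way of picturing those five dimensions (a hypothetical data structure, not taken from the LinnStrument or Seaboard firmware), each touch on such a surface can be thought of as carrying five continuous values:

// Hypothetical representation of one touch on a multidimensional surface.
struct TouchEvent {
  float x;               // left-right position on the pad (X axis)
  float y;               // front-back position (Y axis)
  float pressure;        // continuous pressure (Z axis)
  float strikeVelocity;  // how fast the finger landed
  float releaseVelocity; // how fast it lifted off
};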

Roger Linn has been shown to be a key innovator in the development of music technology and human interfacing (Johan, 2013). In the referenced interview, the concepts he discusses are at the forefront of the electronic music technology research community and may come to influence the future of music and performance. There is potential bias in this literature due to the commercial nature of music technology: designers such as Linn and Roli propose their new systems as remedies for a cripplingly flawed and creatively limiting status quo in digital instruments. Whilst there may be some validity to this claim, the intention to promote and sell their products introduces apparent bias into their propositions.

Further experimentation with the possibilities of music and electronics is shown by the MIT Media Lab (Weinberg et al., 2000), which created the Embroidered Musical Ball, a soft, tactile MIDI instrument with an emphasis on accessibility for children and novices. Via pressure sensors within the ball, the user is able to interact with music in an intuitive and immediate way not found in traditional instruments. This allows a degree of musical expression and exploration without the need for extended study or dedication. This casual platform provides a new framework for musical interaction, widening access to the known therapeutic, cognitive, emotional, social and psychological benefits of music (Sacks, 2007), including increased happiness, memory and confidence (McPherson, 2006), and social cohesion (Hallam et al., 2012).

Wessel and Wright (2002) have invested effort in developing computer-based instrumentation for live musical performance, with a focus on ease of use while placing no limit on virtuosity, using gesture and motion systems to enable musicians to 'play' the computer with the same beauty and expression as traditional continuous, acoustic instruments. Further exploration of continuous control systems by Fels et al. (2002) interprets the benefits of mapping human gesture and its importance in developing expressive devices. Hollinger and Wanderley (2006) report on the potential uses of force-sensing resistors, which have since been coupled with haptic feedback (Berdahl and Kontogeorgakopoulos, 2013) to give users a physical response to their input. This is a vital exploration of technology's facets and an example of how designers can create reactive instruments that further connect players to the music they are producing.
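
A rough sketch of the idea, assuming a force-sensing resistor wired as a voltage divider on analog pin A0 and a small vibration motor on a PWM pin (an illustration of force sensing coupled with haptic feedback in general, not the FireFader design itself):

// Illustrative only: the harder the player presses, the stronger the buzz.
const int FSR_PIN   = A0;  // FSR + fixed resistor divider (assumed wiring)
const int MOTOR_PIN = 9;   // PWM pin driving a vibration motor via a transistor

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  int pressure = analogRead(FSR_PIN);            // 0-1023
  int strength = map(pressure, 0, 1023, 0, 255); // PWM duty cycle
  analogWrite(MOTOR_PIN, strength);              // physical response to input
  delay(10);
}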

The independent instrument designers Soma (Soma, 2018) have created a new breath-based instrument, 'The Soma Pipe' (Kreimer, 2018). Described as a voice expander, it combines a contact microphone for the voice with a dynamic synthesis and processing engine, aiming to create sounds that are impossible to achieve with conventional microphones (Kreimer, 2018). The instrument further demonstrates the alchemical capabilities of electronics in music, giving users transmutational powers over sound.

Soma proposes a new design philosophy: instruments that invite you to listen to yourself, focused on balance and interaction rather than control and linearity, reflecting deep nature (Soma, 2018). The humanistic approach of the instrument allows anybody with a voice to play. This inclusive style could be the future of musical instrument design: rather than forcing the player to learn hand positions and control as with traditional instruments such as guitar or woodwind, the instrument works like an additional controller for the human voice, making it accessible to many more players and allowing instant expression over required dedication. Whether this is necessarily a positive thing remains open to debate; other academics remark on the necessity of difficulty in an instrument to allow for virtuosity and mastery in performance (Wessel and Wright, 2002).

Research into the methods by which an individual can create new musical devices reveals the open-source electronics platform Arduino (Anderson and Cervo, 2013). Arduino is a programmable microcontroller board that allows control of mechanical, electrical and software systems via basic computer code (Blum, 2013). It can be used for human-computer interaction, creating musical instruments whose control and response come from sensor-based interaction. The platform is flexible in its design potential and accessible to novices in coding and electronics (Edstrom, 2016). Arduino has a variety of functions of use to musicians, including sending and receiving MIDI information, audio synthesis, analogue-to-digital conversion and signal processing. It can be used to create motion- or pressure-based controllers, enabling human-machine interaction that converts senses and energies into other forms. Arduino for Musicians is published by Oxford University Press, indicating the strength and academic acceptance of the book. The contents and projects presented rely on practical, objective results; the functions themselves are objective, although the effectiveness of their performance is left to the author's interpretation. The book claims it is possible to engage with Arduino with no electrical engineering experience, yet this has not been reliably tested.
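
For instance, one of the simplest musical uses of the platform needs nothing beyond the standard Arduino functions. The hypothetical sketch below (sensor and speaker pins are assumptions for illustration) converts an analogue sensor reading into an audible square-wave pitch with tone(), touching on the analogue-to-digital conversion and basic synthesis mentioned above.

// A hypothetical starting point: turn a sensed voltage directly into sound.
const int SENSOR_PIN  = A0;  // any analogue sensor (assumed on pin A0)
const int SPEAKER_PIN = 8;   // piezo or small speaker via a series resistor

void setup() {
}

void loop() {
  int reading = analogRead(SENSOR_PIN);          // 0-1023
  int pitch   = map(reading, 0, 1023, 110, 880); // roughly A2 to A5
  tone(SPEAKER_PIN, pitch);                      // square-wave output
  delay(20);
}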

The Arduino platform can be used to design digital musical interfaces and instruments specialised for musicians with disabilities, enabling their participation in creative audio production. The development of accessible electronics empowers the creation of unique, accessible music devices, strengthening inclusion (Samuels, 2015). Publication by MIT Press lends support to these claims, endorsing the beneficial potential and encouraging further development. The argument proposed is valid, as positive results have been seen reliably in experimentation and research. Arduino presents the opportunity to create devices with new control mechanisms, including breath, touch, light, sound, motion and pressure (Anderson and Cervo, 2013).

Research Methodology


This research study has utilised triangulation (Jick, 1979) to gather a variety of source material: primary, secondary, quantitative and qualitative, using a combination of methodologies to reach a solid and accurate view of the current debates. Initially, a short practical observation study (Lofland, 2006) prompted a subsequent literature review period, in which the research findings inspired practical experimentation with technology (Diamond, 2001). Cross-examining the debate from both epistemological and ontological perspectives (Killam, 2013) ensures balanced and holistic interpretations of the paradigms presented.

An informal pilot study (Anthony, 2014) was conducted to gauge the effectiveness of a further, larger-scale study and to gain an insight into the impact accessible practical audio electronics has on encouraging further autonomous learning. It involved three voluntary participants: one with deep experience in both music and electronics (P1); a second with extensive musical experience but very little prior electronics experience (P2); and a third with minor musical experience and essentially no experience in electronics (P3). The study consisted of multiple stages. The first assigned the participants the practical task of assembling a basic synth circuit (Wong, 2018), soldering seven components to a circuit board, noting their level of success and requirements for assistance. Once this task had been accomplished, a discussion with the participant measured whether their interest had been raised. The third stage was a review, four weeks after the initial practical task, to evaluate the development of interest in electronics provoked by that task. Ethical considerations remained at the forefront of the study approach, including participant anonymity, informed consent, and clarity with the participants about the nature of the research (Blaxter et al., 2010). The intention of the study was to gather information on the effects that engagement with electronics has on other people's autonomous learning, continuation of study and inspiration.

Literature review (Ridley, 2013) has been used to gather information and direct the research, evaluating the reliability, validity and reputation of the source material in order to determine the strongest arguments. Instructional books on electronics and Arduino were analysed to investigate the possibilities of practical experimentation, and interviews and publications of industrially acclaimed designers were evaluated to judge the current debate paradigms. Schematics and code have been studied to understand the practical requirements and process. Observation and imitation of pre-made designs (Ntosound, 2018) allowed the fundamentals of program design to be understood and explored. Broad literature review evinces appropriate and accessible fields of continued study (Killam, 2013), provided attention is given to source reputation.

Practical experimentation (Diamond, 2001) was carried out with Arduino to create a prototype musical device using a control system different from the traditional keyboard input, mapping the integer control values (Edstrom, 2016) of frequency modulation synthesis parameters (Inakage, 1987), such as pitch, modulation depth and modulation speed, to the level of light measured by light-dependent resistors (Boysen and Kybett, 2012). This creates a light-controlled synthesizer: the pitch and timbral qualities of the tone are affected directly by light energy. The aim of this practical experimentation was to prototype a device demonstrating the simplicity of producing an alternative control system inspired by sensory transduction (Wangemann, 2006) and synaesthesia (Bragança, Fonseca and Caramelli, 2015). The experimentation was also conducted to gather primary data on the experience and difficulty of engaging with electronics. Experimentation was governed by decision-making guided by successful results, as an effective form of practical experimentation (Anthony, 2014).
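
For context, the relationship being manipulated is the standard two-operator FM synthesis equation, which in its usual textbook form can be written as

$y(t) = A \sin\big(2\pi f_c t + I \sin(2\pi f_m t)\big)$

where $f_c$ is the carrier frequency (heard as pitch), $f_m$ the modulator frequency, and $I$ the modulation index governing timbral brightness. In the prototype described here, light readings rather than knobs or keys are mapped onto these control values; the exact assignment of sensors to parameters follows the code in appendix A.1.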

Research Results


The results of the observation study are as follows. In the initial practical task, P1, the experienced participant, completed the task with minor assistance, as expected given their knowledge of soldering and reading schematics. Their response to the task was positive and they expressed further interest; the four-week review revealed a continued interest in electronics and a moderate amount, around two hours per week, of practical electronics experience. P2 required more assistance in the practical task: they did not struggle with the soldering, but the placement of components and schematic reading presented a few issues, and the task took more attempts to complete than for P1. P2 expressed further interest and a desire for continued engagement in DIY electronics, and the four-week review revealed P2 dedicating around ten hours a week to practical electronics, plus additional time researching electronics and its application in music technology. P3 also required assistance in the practical task, mostly with placement and schematic reading and some prompts with soldering, but soon developed their technique successfully. Their initial response showed interest in the practical soldering side, but little interest in the application of the technology. The four-week review revealed that P3 had had no further engagement with either theoretical or practical electronics.

Interpreting the results of this study presents slight evidence that DIY music technology can promote interest and engagement in electronics, with both P1 and P2 showing further engagement. P3 did not continue to engage; this may be because they have little musical involvement, whereas P1 and P2 are heavily involved with music, so the applications hold more benefit and use for them. It appears there may be a requirement for some existing involvement in either electronics or music. The study also suggests that exposure to new pathways in a field of interest can promote autonomous study. The study requires a larger participant pool to improve reliability and validity, and further quantitative elements need incorporating for accuracy.

The secondary, practical experiment (Diamond, 2001) into producing a digital musical instrument, to demonstrate the simplicity of doing so, yielded the desired results. Through a process of trial, error and correction, a frequency modulation synthesiser was built using Arduino's synthesis and control capacity, generating sound with the Mozzi library (Ntosound, 2018). The pitch and modulation-based timbre of the instrument are controlled by light. It is built using as few components as possible: one Arduino board, a ¼-inch audio output and three light-dependent resistors. The simple schematic and code requirements produce a device that demonstrates Arduino's potential for musicians, while also showing the effectiveness of a rudimentary, beginner-accessible design; see A.1 and A.2 of the appendix for the Arduino code and schematic. The light control system for the synthesizer demonstrates Arduino's ability to create new forms of musical control, more suited to the functions of the human body, and shows how limitations to creativity, such as a one-dimensional controller, can be removed by using alternative methods of human-machine interaction that harness the full potential of human motor flexibility, enabling deeper expression through gesture and motion control. A prototype device (A.3) was created initially to work out the mechanisms of the design, and a revised, finalised version of the device was then made (A.4.i and A.4.ii), utilising variable resistors and light-dependent resistors to show the difference in control possibilities.
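
As a brief note on how the light readings arise (component values taken from the A.2 schematic, with nominal LDR behaviour assumed): each light-dependent resistor is paired with a 5.1 kΩ resistor as a voltage divider, so the voltage seen by the analogue input is approximately V_pin = 5 V × 5100 / (R_LDR + 5100). As more light falls on the LDR its resistance drops, the measured voltage rises, and Mozzi's AutoMap objects then scale the resulting 0–1023 reading into the frequency, intensity and speed ranges used in the code in A.1.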

Discussion


Research has highlighted the power and potential inherent in this field, prompting further dedication to understanding the available technology. Guided by cross-referencing instructional guides and current literature, attempts have been made to create devices using both discrete analogue components (Wong, 2018) and digital processors (Edstrom, 2016). Success was found in both instances, inspiring further investigation into the possibilities available to musicians. First-hand experience of the process has demonstrated the learning curve associated with entering this field with little prior experience. To achieve the end products, one instance being an Arduino-powered frequency modulation synthesizer (Locascio, Chowning and Bristow, 1987), a great deal of time was spent initially studying the topic, followed by much more trial-and-error experimentation, owing to a background of relatively little electronics experience. This has drawn attention to the learning curve associated with engaging in electronics and Arduino: to truly harness the power of digital microcontroller systems such as Arduino, one must understand computer programming and coding in conjunction with physical electronic practice, soldering skills and fundamental electronic knowledge. Once these barriers have been surpassed, there is access to unlimited creative potential and freedom, but acquiring the skills or knowledge to follow a pre-designed build, or to modify a design correctly, is not instantaneous. The claims made in Arduino for Musicians (Edstrom, 2016) about the accessibility of the platform without prior electronic engineering or coding experience are not wholly unwarranted, as a user without that experience will acquire the skills and knowledge needed to achieve the desired function along the way. This demonstrates that Arduino, with a bearing on musical application, can promote education in electronic technology. There is some parallel support from the results of the participant study: some participants were shown to dedicate themselves further to electronics once they had been exposed to it. This was shown to be effective only in participants with a strong musical interest; a larger sample population is required for a more accurate view, which is a recommendation for further study. It could highlight the advantages of coupling secondary interests with the field of electronics to encourage participation. The results of the participant study indicate that electronic technology may provoke further engagement if there is a useful function or application for the individual. In essence, hybrid audio-electronics education can provide a pathway to discovery, provided there is a degree of predisposed interest.

Partnered with concepts revealed by the literature review, several further debates have arisen over the effects of accessible technology in music. There is a proposition for creating more accessible instruments that are universally playable and instantaneously accessible, made possible by technology assisting the human body in operations at which it is already naturally skilled. The benefits of this are clear (Samuels, 2015): it can improve inclusion and community, whilst allowing new forms of expression for the musically untrained (Crowe and Rio, 2004). This has in some ways been rebuffed by members of the musical community, as it could be seen to undermine values held in high regard, dedication, practice and technique, by allowing virtuosity through perfunctory, digitally assisted performance (Green, 2002).

From the research, it appears one approach would be to continue investigating instrument design methodology and to develop a system that is universally accessible whilst maintaining the capacity for extended technique developed through dedication. This approach is reinforced by developing movements in commercial technology (Linn, 2013) that emphasise the need to restructure perceptions of human-machine interfacing in musical application, driven by a post-modern awakening to the limitations of the present, prevailing, yet antiquated organisation. Expanding and developing modern accessibility frameworks, in conjunction with promoting education in emergent viewpoints and strategies, may in turn allow for extended human expression, communication and bonding. A re-evaluation, and dismantling, of the outdated contemporary disposition towards musical control may be necessary to highlight areas of potential improvement through accessible technology in musical application.

References:


Anderson, R. and Cervo, D. (2013). Pro Arduino. Berkeley, CA: Apress.

Anthony, J. (2014). Design of Experiments for Engineers and Scientists. 2nd ed. Elsevier, London.

Blaxter, L., Hughes, C., and Tight, M. (2010), How to Research. Maidenhead. Open University

Berdahl, E. and Kontogeorgakopoulos, A. (2013). The FireFader: Simple, Open-Source, and Reconfigurable Haptic Force Feedback for Musicians. Computer Music Journal, 37(1), pp.23-34.

Blum, J. (2013). Exploring Arduino: Tools and Techniques for Engineering Wizards. Wiley. Indianapolis

Bragança, G., Fonseca, J. and Caramelli, P. (2015). Synesthesia and music perception. Dementia & Neuropsychologia, 9(1), pp.16-23.

Boysen, E. and Kybett, H. (2012). Complete Electronics Self-Teaching Guide with Projects. Hoboken, USA: Wiley.

Cook, P. R. (2004). Remutualizing the Musical Instrument: Co-Design of Synthesis Algorithms and Controllers. Journal of New Music Research, 33(3), pp.315-320. Routledge.

Crowe, B. and Rio, R. (2004). Implications of Technology in Music Therapy Practice and Research for Music Therapy Education: A Review of Literature. Journal of Music Therapy, 41(4), pp.282-320.

Diamond, W. (2001). Practical Experiment Designs for Engineers and Scientists. New York: Wiley.

Edstrom, B. (2016) Arduino for Musicians. Oxford University Press.

Fels, S., Gadd, A. and Mulder, A. (2002). Mapping transparency through metaphor: towards more expressive musical instruments. Organised Sound, 7(2).

Green, L. (2002). How popular musicians learn: A Way Ahead for Music Education . Aldershot, Hants: Ashgate, pp.111 – 115.

Hallam, S., Creech, A., Varvarigou, M. and McQueen, H. (2012). Perceived benefits of active engagement with making music in community settings. International Journal of Community Music, 5(2), pp.155-174.

Hollinger, A. and Wanderley, M. (2006). Evaluation of commercial force-sensing resistors. International Conference on New Interfaces for Musical Expression, Paris, France.

Jick, T. (1979). Mixing Qualitative and Quantitative Methods: Triangulation in Action. Administrative Science Quarterly, 24(4), p.602.

Johan, J. (2013). Roger Linn Interview. [online] Available at: https://drive.google.com/file/d/0BwNS4xvTBVTDMTh1OVVlLVMtTHM/view?ts=5ae84c7a. dBs Archive, Bristol.

Johnson, M. L. (1991). Toward an expert system for expressive musical performance. Computer, 24(7), pp.30-34.

Inakage M. (1987) Frequency Modulation Synthesis. In: Kunii T.L. (eds) Computer Graphics 1987. Springer, Tokyo

Killam, L. (2013). Research Terminology Simplified. Laura Killam.

Kreimer, V. (2018). The PIPE – SOMA laboratory. [online] SOMA laboratory. Available at: https://somasynths.com/pipe/ [Accessed 15 May 2018].

Linn, R. (2013). LinnStrument and other new expressive musical controllers. The Journal of the Acoustical Society of America, 134(5), pp.4053-4053.

Locascio, M., Chowning, J. and Bristow, D. (1987). FM Theory and Applications: By Musicians for Musicians. Computer Music Journal, 11(4), p.48.

Lofland, J. (2006). Analyzing Social Settings: A Guide to Qualitative Observation and Analysis. Wadsworth/Thomson Learning

McPherson, E. and McCormick, J. (2006). Self-efficacy and music performance. Sage Journals.

Ntosound (2018). FM Synth Mozzi con Arduino. [online] sound object. Available at: https://soundobject.wordpress.com/2015/02/04/fm-synth-mozzi-con-arduino/ [Accessed 14 May 2018].

Paradiso, J. (1997). Electronic Music Interfaces: New Ways to Play. IEEE Spectrum, 34(12), pp.18-30.

Ridley, D. (2013). The literature review. London: SAGE Pub.

Roli (2017). Roli Seaboard Rise User Manual. Roli.

Sacks, O. (2008) Musicophilia: Tales of Music and the Brain.  Random House, New York.

Soma (2018). About SOMA – SOMA laboratory. [online] SOMA laboratory. Available at: https://somasynths.com/about-soma/ [Accessed 15 May 2018].

Suchman, L. (1994). Plans and Situated Actions: The Problem of Human-Machine Communication. 4th ed. Cambridge: Cambridge University Press.

Samuels, K. (2015). The Meanings in Making: Openness, Technology and Inclusive Music Practices for People with Disabilities. Leonardo Music Journal, 25(25), pp.25-29.

Wangemann, P. (2006), Supporting sensory transduction: cochlear fluid homeostasis and the endocochlear potential. The Journal of Physiology, 576: 11-21.

Weinberg, G., Orth, M. and Russo, P. (2000). The Embroidered Musical Ball: A Squeezable Instrument for Expressive Performance. MIT Media Lab.

Wessel, D. and Wright, M. (2002). Problems and Prospects for Intimate Musical Control of Computers. Computer Music Journal, 26(3), pp.11-22.

Wong, K. (2018). BJT In Reverse Avalanche Mode. [online] Available at: http://www.kerrywong.com/2014/03/19/bjt-in-reverse-avalanche-mode/ [Accessed 19 Mar. 2018]

Appendix:


A.1: Light Controlled Frequency Modulation Synthesizer Arduino Code:

Adapted from Ntosound (2018)

#include <MozziGuts.h>
#include <Oscil.h> // oscillator
#include <tables/cos2048_int8.h> // table for Oscils to play
#include <Smooth.h>
#include <AutoMap.h> // maps unpredictable inputs to a range

// int freqVal;

// desired carrier frequency max and min, for AutoMap
const int MIN_CARRIER_FREQ = 22;
const int MAX_CARRIER_FREQ = 440;

const int MIN = 1;
const int MAX = 10;

const int MIN_2 = 1;
const int MAX_2 = 15;

// desired intensity max and min, for AutoMap, note they're inverted for reverse dynamics
const int MIN_INTENSITY = 700;
const int MAX_INTENSITY = 10;

// desired mod speed max and min, for AutoMap, note they're inverted for reverse dynamics
const int MIN_MOD_SPEED = 10000;
const int MAX_MOD_SPEED = 1;

AutoMap kMapCarrierFreq(0, 1023, MIN_CARRIER_FREQ, MAX_CARRIER_FREQ);
AutoMap kMapIntensity(0, 1023, MIN_INTENSITY, MAX_INTENSITY);
AutoMap kMapModSpeed(0, 1023, MIN_MOD_SPEED, MAX_MOD_SPEED);
AutoMap mapThis(0, 1023, MIN, MAX);
AutoMap mapThisToo(0, 1023, MIN_2, MAX_2);

const int KNOB_PIN = 0; // set the input for the knob to analog pin 0
const int LDR1_PIN = 1; // set the analog input for fm_intensity to pin 1
const int LDR2_PIN = 2; // set the analog input for mod rate to pin 2
const int LDR3_PIN = 4; // option for additional control sources
const int LDR4_PIN = 3; // further control source

Oscil<COS2048_NUM_CELLS, AUDIO_RATE> aCarrier(COS2048_DATA);
Oscil<COS2048_NUM_CELLS, AUDIO_RATE> aModulator(COS2048_DATA);
Oscil<COS2048_NUM_CELLS, CONTROL_RATE> kIntensityMod(COS2048_DATA);

int mod_ratio = 5;  // brightness (harmonics)
long fm_intensity; // carries control info from updateControl to updateAudio

// smoothing for intensity to remove clicks on transitions
float smoothness = 0.95f;
Smooth <long> aSmoothIntensity(smoothness);

void setup() {
  Serial.begin(115200); // set up the Serial output so we can look at the light level
  startMozzi(); // :))
}

void updateControl() {
  // freqVal = map(LDR3_PIN, 0, 1023, 1, 100);
  int freqVal = mozziAnalogRead(LDR3_PIN); // value is 0-1023
  int FRQ = mapThis(freqVal);

  int knob2 = mozziAnalogRead(LDR4_PIN); // value is 0-1023
  int knob2Val = mapThis(knob2);

  // read the knob
  int knob_value = mozziAnalogRead(KNOB_PIN); // value is 0-1023

  // map the knob to carrier frequency
  int carrier_freq = kMapCarrierFreq(knob_value);

  // calculate the modulation frequency to stay in ratio
  int mod_freq = carrier_freq * mod_ratio * FRQ;

  // set the FM oscillator frequencies
  aCarrier.setFreq(carrier_freq);
  aModulator.setFreq(mod_freq);

  // read the light dependent resistor on the width Analog input pin
  int LDR1_value = mozziAnalogRead(LDR1_PIN); // value is 0-1023

  // print the value to the Serial monitor for debugging
  Serial.print("LDR1 = ");
  Serial.print(LDR1_value);
  Serial.print("\t"); // prints a tab

  int LDR1_calibrated = kMapIntensity(LDR1_value);
  Serial.print("LDR1_calibrated = ");
  Serial.print(LDR1_calibrated);
  Serial.print("\t"); // prints a tab

  // calculate the fm_intensity
  fm_intensity = ((long)LDR1_calibrated * knob2Val * (kIntensityMod.next() + 128)) >> 8; // shift back to range after 8 bit multiply
  Serial.print("fm_intensity = ");
  Serial.print(fm_intensity);
  Serial.print("\t"); // prints a tab

  // read the light dependent resistor on the speed Analog input pin
  int LDR2_value = mozziAnalogRead(LDR2_PIN); // value is 0-1023
  Serial.print("LDR2 = ");
  Serial.print(LDR2_value);
  Serial.print("\t"); // prints a tab

  // use a float here for low frequencies
  float mod_speed = (float)kMapModSpeed(LDR2_value) / 1000;
  Serial.print("   mod_speed = ");
  Serial.print(mod_speed);
  kIntensityMod.setFreq(mod_speed);

  Serial.println();
}

int updateAudio() {
  long modulation = aSmoothIntensity.next(fm_intensity) * aModulator.next();
  return aCarrier.phMod(modulation);
}

void loop() {
  audioHook();
}


A.2: Light Controlled Frequency Modulation Synthesizer Arduino Assembly

Schematic:

Adapted from Ntosound (2018)

  • Audio output on digital pin 9
  • Variable resistor or potentiometer connected to analog pin 0:
        Center pin of the potentiometer goes to the analog pin
        Side pins of the potentiometer go to +5V and ground
  • Light-dependent resistor (LDR) and 5.1k resistor on analog pin 1:
        LDR from analog pin to +5V
        5.1k resistor from analog pin to ground
  • Light-dependent resistor (LDR) and 5.1k resistor on analog pin 2:
        LDR from analog pin to +5V
        5.1k resistor from analog pin to ground


A.3: Prototype Arduino Device created in practical experimentation


A.4.i: Finalised Arduino alternative Human-Machine Interface demonstration device


A.4.ii: Finalised Arduino alternative Human-Machine Interface demonstration device

