React has revolutionized the way client-side developers think about their applications. It makes heavy use of the principles of functional reactive programming. How can these design concepts aid in the development of live music applications? With React, events affect state, and the library implements the side effects that keep the presentation synchronized. This allows for a simplicity and determinism that restores joy to the process of building complex interactions. We will look at how this powerful idea can be applied to real-time music synthesis, rendering music as a side effect of state change.

Timing is everything when rendering music. We will look at some different approaches to achieving precise timing in a reactive programming model. These approaches include push-based, where changes to state precipitate re-rendering, and pull-based, where state is rendered continuously and changes to state are reflected on the next render. The pros and cons of these approaches will be examined. In terms of the Web Audio API, all of these changes sooner or later come down to automating the values of an AudioParam. We will discover some potential improvements to this class that could simplify and clarify implementations in this problem domain.

This will be a practical approach. There will be examples in ClojureScript and (hopefully) Elm. Live code execution and glorious rhythmic beeping will keep things fascinating.
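The "events affect state, rendering as a side effect" idea can be sketched as a tiny reducer-plus-subscriber store. This is an illustrative sketch only, not code from the talk; all names (`createStore`, `reduce`, the event shapes) are assumptions made up for the example:

```typescript
// Minimal push-style model: events update state; subscribers
// "render" (here, just log) as a side effect of each state change.
type State = { tempo: number; notes: number[] };

type MusicEvent =
  | { kind: "setTempo"; bpm: number }
  | { kind: "addNote"; midi: number };

// Pure reducer: same event + same state always yields the same new state.
function reduce(state: State, event: MusicEvent): State {
  switch (event.kind) {
    case "setTempo":
      return { ...state, tempo: event.bpm };
    case "addNote":
      return { ...state, notes: [...state.notes, event.midi] };
  }
}

// A store that pushes every new state to its render subscribers.
function createStore(initial: State) {
  let state = initial;
  const subscribers: Array<(s: State) => void> = [];
  return {
    dispatch(event: MusicEvent) {
      state = reduce(state, event);           // events affect state...
      subscribers.forEach((fn) => fn(state)); // ...which triggers side effects
    },
    subscribe(fn: (s: State) => void) {
      subscribers.push(fn);
    },
    getState: () => state,
  };
}

// Usage: "rendering" is nothing more than a side effect of state change.
const store = createStore({ tempo: 120, notes: [] });
const log: string[] = [];
store.subscribe((s) => log.push(`render ${s.notes.length} notes at ${s.tempo} bpm`));
store.dispatch({ kind: "addNote", midi: 60 });
store.dispatch({ kind: "setTempo", bpm: 90 });
// log is now ["render 1 notes at 120 bpm", "render 1 notes at 90 bpm"]
```

Because the reducer is pure, the same event stream always reproduces the same musical state, which is the determinism the talk refers to.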
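The push-based and pull-based timing models can be contrasted in a few lines. Again, a hypothetical sketch under assumed names (`makePush`, `makePull`), not an implementation from the talk:

```typescript
// Push: each state change immediately triggers a render.
// Pull: a clock-driven loop samples the current state on every tick.

type SynthState = { freq: number };

// --- Push-based: the setter itself calls render. ---
function makePush(render: (s: SynthState) => void) {
  let state: SynthState = { freq: 440 };
  return {
    set(freq: number) {
      state = { freq };
      render(state); // a change to state precipitates an immediate re-render
    },
  };
}

// --- Pull-based: the render loop reads state each tick; a change made
// --- between ticks is only heard on the *next* tick.
function makePull() {
  let state: SynthState = { freq: 440 };
  return {
    set(freq: number) {
      state = { freq }; // no render here
    },
    tick(render: (s: SynthState) => void) {
      render(state); // the loop pulls whatever the state is right now
    },
  };
}

// Usage: compare when each model "hears" a change.
const pushRenders: number[] = [];
const push = makePush((s) => pushRenders.push(s.freq));
push.set(220); // renders immediately; pushRenders is [220]

const pullRenders: number[] = [];
const pull = makePull();
pull.tick((s) => pullRenders.push(s.freq)); // tick 1 hears 440
pull.set(220);                              // silent until the next tick
pull.tick((s) => pullRenders.push(s.freq)); // tick 2 hears 220
// pullRenders is [440, 220]
```

The trade-off in miniature: push gives minimal latency between change and sound, while pull gives a steady clock that is easier to keep sample-accurate.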
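The Web Audio API really does schedule parameter changes through `AudioParam` methods such as `setValueAtTime` and `linearRampToValueAtTime`, but the class itself only exists in the browser. The toy timeline below is a hypothetical stand-in (assumed name `ToyAudioParam`) that mimics those two scheduling methods to clarify the semantics:

```typescript
// Toy model of AudioParam's automation timeline (illustrative only;
// the real AudioParam lives in the browser's Web Audio API).
// Assumes events are scheduled in ascending time order.
type ScheduledEvent =
  | { type: "set"; value: number; time: number }
  | { type: "linearRamp"; value: number; time: number };

class ToyAudioParam {
  private events: ScheduledEvent[] = [];
  constructor(public defaultValue: number) {}

  setValueAtTime(value: number, time: number): void {
    this.events.push({ type: "set", value, time });
  }

  linearRampToValueAtTime(value: number, time: number): void {
    this.events.push({ type: "linearRamp", value, time });
  }

  // Value at time t: hold after a "set"; interpolate linearly from the
  // previous event's value while inside a ramp.
  valueAt(t: number): number {
    let value = this.defaultValue;
    let prevTime = 0;
    for (const e of this.events) {
      if (e.time <= t) {
        value = e.value;   // past events (set or completed ramp) just apply
        prevTime = e.time;
      } else if (e.type === "linearRamp") {
        // mid-ramp: interpolate from the previous event toward the target
        const frac = (t - prevTime) / (e.time - prevTime);
        return value + (e.value - value) * frac;
      } else {
        break;             // a future "set" has no effect yet
      }
    }
    return value;
  }
}

// Usage: fade a gain from 0 up to 1 over two seconds.
const gain = new ToyAudioParam(0);
gain.setValueAtTime(0, 0);
gain.linearRampToValueAtTime(1, 2);
// gain.valueAt(1) is 0.5 (halfway up the ramp)
// gain.valueAt(3) is 1   (holds after the ramp ends)
```

Both timing models ultimately compile down to calls like these: the reactive layer's job is deciding when and with what values to write to this timeline.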