I'm Ben. We're talking about a project we did a little while back called remixer, or however you pronounce that collection of letters, and we're going to focus especially on the lessons we learned as they relate to the Web Audio graph. As I said, I'm Ben Fields; this is Sam Finn. We both do sort of data-heavy engineering and data science consulting with a company called Fun and Plausible Solutions, out of London. I also moonlight as a postdoc researcher at Goldsmiths. And to make this really explicit: this talk represents our experience using the Web Audio APIs to go from zero to a reasonably complete prototype in forty-eight hours flat, and we did that a year ago. Your actual API usage may vary; follow our advice at your own risk. Really, really seriously: your API usage may vary.

So let's talk about the context a little. We did this project at an event called the MIDEM Hack Day. MIDEM is an event that takes place in the very lovely Cannes, in the south of France, every year, and, according to the brochures they stack up there, it's "the leading international business event for the music ecosystem". It's a very large and well-established music business conference that is in many ways a husk of its former glory: it's a few thousand people now, and it used to be close to ten thousand in the '80s, when the music business actually made money. What that all means is that about thirty of us get to hack on projects while looking out a window at this. It's almost lovely.

To give you an idea of the kind of things that come out of it, besides what we're going to talk about in depth, here are two projects that have done well at the same event, one of them last year and one the year before. On your left is the system diagram for DJ Spotify, which, as you might guess, allows you to DJ
from Spotify. It does this through some very impressive shenanigans with virtual machines running multiple streams of Spotify at the same time, so that Spotify will allow a single account to pull two streams, which then get mixed through Ableton Live. On the right is a fabric interface for your phone, so that you can navigate a music festival; it has a sort of soft-circuitry interface integrated into the bag, and it's called Festival Bag.

So we made this thing called remixer. It's white-label remixing in the browser. There are lots of other remix competitions that existed before, but they all suck, because in order to participate in them you have to use something like this, and that's gross, and lots of people don't want to license it. The people in this room are all obsessed, but no one else can use it. So we're trying to do something in the browser: a DAW. And that's our pitch, so let's get into it.

Part of doing any kind of remixing is being able to do some kind of effects processing, effects chains, and the Web Audio API actually makes really solid ground for us to build on top of. If you click on any of the individual channels in the mixer UI, you can build an effects chain. You can see here that we have an effect called Convolver and another one called three-band EQ, and basically this is just a standard effects chain: audio flows from source to sink, being processed through each node. The way that works in the Web Audio API is this. If we have no effects, we have a source and a sink; we connect the source to the sink, and the audio goes through. If we have an effect, we connect the source to the effect, we connect the effect to the sink, and the audio goes through, being processed. This is great in theory. But in practice, we were building this at very, very high speed while drinking in the south of France.
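The wiring just described is simple enough to sketch. This is an illustrative sketch, not the project's code: the function names are invented, and in a real page the source, effect, and sink nodes would come from an AudioContext. The functions rely only on the nodes having a connect method.

```javascript
// No effects: audio flows source -> sink.
function connectDirect(source, sink) {
  source.connect(sink);
}

// One effect: audio flows source -> effect -> sink, being processed
// as it passes through the effect node.
function connectThroughEffect(source, effect, sink) {
  source.connect(effect);
  effect.connect(sink);
}
```

That's the whole mental model: connect builds directed edges in the audio graph, and audio flows along those edges while the page runs.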
We ran into some problems. This Convolver effect is actually not from Web Audio itself; it's not just a node that you can drop in there. It comes from a third-party library called Tuna, and this is where we began encountering problems. The way you interact with Tuna is that you new up an effect object, in this case one called Convolver, and then you invoke connect methods on it like you would with the native nodes. Except there's a small difference, and it's right here: instead of connecting to the Convolver object directly, you have to connect to this mystical input property. This is different and horrible and we hate it. To show you what that looks like in the diagram: this is what we have with standard Web Audio API nodes, and this is what we have with Tuna. These are fundamentally different, and when you're trying to build a generic system where you can use third-party effects alongside the ones the Web Audio API gives you by default, you kind of have to hack around it.

This is how we hacked around it. Basically, when we replace an effects chain on a channel, we first disconnect the source and the sink, which throws away and garbage-collects the entire existing Web Audio graph for us in the browser, so we don't have to worry about that, and then we can rebuild our effects chain. We set it up so that the source of our channel is the actual source audio tag, and then we loop over each of the effects in our channel. For every single effect in the chain we create a gain node with a gain of 1.0. That's not something we should be doing, but it was the only way to give ourselves a way to normalize to browser nodes. We then connect up the source and the sink, which we know are definitely browser nodes and not third-party Tuna nodes, we set the source of the current effect to be the sink of the next effect, and we sort of clean up at the end. So that's how we deal with the effects: we create a bunch of extraneous gain nodes to normalize the API between native and third-party nodes.
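The hack can be sketched roughly like this. To be clear, this is a reconstruction under assumptions, not the actual project code: it assumes a Tuna-style node exposes an input property that you must connect to, while native nodes are their own connection target, and all the names are invented.

```javascript
// inputOf() papers over the interface difference: Tuna-style nodes
// want you to connect to node.input, while native AudioNodes are
// their own connection target.
function inputOf(node) {
  return node.input || node;
}

// Rebuild a channel's chain, putting a unity-gain GainNode (audibly
// a no-op) in front of every effect, so that every hop is either
// native -> native or native -> tuna.input, never tuna -> tuna.
function buildChain(ctx, source, effects, sink) {
  let upstream = source;
  for (const effect of effects) {
    const gain = ctx.createGain();
    gain.gain.value = 1.0; // unity gain: does not change the audio
    upstream.connect(inputOf(gain));
    gain.connect(inputOf(effect));
    upstream = effect;
  }
  upstream.connect(inputOf(sink));
}
```

The extra gain nodes cost a little memory and nothing audible, and in exchange every link in the chain has a node with a predictable connect interface on at least one side.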
It's a bit of a disaster, but whatever. So how do we do the user interface for effects? Well, you have a button you can click, which pops out a window. You can click any of the buttons in that window, which puts that effect back in the main window while the audio is still processing. You can then click the edit button on one of those effects to pull out another window, and drag up and down to make changes. So what we've basically got here is emulating something like Logic, where modifying how effects work pops out individual windows while the audio keeps processing. How do you achieve this cross-window communication? Some of you might think we used the window postMessage API, but when we went to the documentation there was a big scary warning, and also the icon font didn't load.

So instead we came up with something that looks like this. Basically, in the UI of each of the effects there's a setInterval running once every thirty milliseconds that, any time any of the state of the UI changes, invokes this method on the template base class. The template base class looks like this, and basically the only important line of code is here: we're using the window localStorage API as a message bus. On the other side we have this poll that reads information out of the localStorage API in order to update the state, which Ben's going to explain in just a little bit. The important takeaway here is that you should totally use the localStorage API as a message bus, because last-write-wins is a great consistency model.
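For the curious, the localStorage bus boils down to something like this. This is a hedged sketch with invented names (publishIfChanged, consumeIfChanged, the key name), not the template base class itself; the storage argument stands in for window.localStorage so the logic is visible on its own.

```javascript
// One publish step: serialize the UI state and write it to the shared
// key only when it has changed since the last write.
function publishIfChanged(storage, key, state, lastWritten) {
  const current = JSON.stringify(state);
  if (current !== lastWritten) {
    storage.setItem(key, current); // last write wins
  }
  return current;
}

// One consume step on the other window: read the shared key and apply
// the state if it differs from what we last saw.
function consumeIfChanged(storage, key, lastSeen, applyState) {
  const current = storage.getItem(key);
  if (current !== null && current !== lastSeen) {
    applyState(JSON.parse(current));
  }
  return current;
}

// Wire both sides up on the ~30 ms poll the talk describes.
function startBus(storage, key, getState, applyState) {
  let written = null;
  let seen = null;
  setInterval(() => { written = publishIfChanged(storage, key, getState(), written); }, 30);
  setInterval(() => { seen = consumeIfChanged(storage, key, seen, applyState); }, 30);
}
```

Note that last-write-wins really is the consistency model here: if both windows write within one polling interval, one update is silently lost.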
So, the idea here is that we're going to have a DAW for various web use cases, and that means we have to be able to share what we're doing. That might be submitting to a contest; it might be pointing out to all your friends that you totally made a mix, which you're going to fire off in a tweet for some reason. So it's very important for a URL to encapsulate all the information about the state of the mixer, and as Sam just detailed, that's potentially quite complicated for setting up an audio graph. Additionally, we just want to point out that server-side audio for a web page is totally for suckers. It's not something I have any interest in doing, because we don't want our AWS bills to be that high. Those things together basically mean that we need to serialize out everything in the state and associate it with a page location.

Cool. So what we do is serialize our state and associate it with a page location. Every time you change something (which is this line here, roughly; what "something" is depends on what you plug in there), the state changes, and every time the state changes you serialize it out as a JSON object and push it onto the URL. When I say "push onto the URL", I mean literally: you serialize the hash into JSON and then put that JSON onto the fragment ID of the URL. So what we've done is turned the fragment ID of the URL into a document store, because that totally makes sense and nothing bad will happen. We generalized this to every single UI change you can possibly make anywhere in the entire system. So for instance, here is a general "go through all the channels and set them all up", and every time you mute or unmute a channel you affect the state, therefore you re-serialize it out and change that fragment ID hash.
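The serialize-to-fragment step looks roughly like this. Again a sketch with invented names, not the shipped code; the encodeURIComponent call is added here for safety and may not match exactly what the project did.

```javascript
// Turn the whole mixer state into a URL fragment, and back. The state
// object's shape is hypothetical; any JSON-serializable value works.
function stateToFragment(state) {
  return "#" + encodeURIComponent(JSON.stringify(state));
}

function fragmentToState(fragment) {
  if (!fragment || fragment === "#") {
    return null;
  }
  return JSON.parse(decodeURIComponent(fragment.slice(1)));
}

// In the browser, every UI change would then do something like:
//   window.location.hash = stateToFragment(mixerState);
```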
Same thing here: this is just setting up the hashes. We can go here and look at where we're setting up all of the channel states; every time you touch one, you have to serialize the state back out. For instance, what we have here is that the input of a channel is the channel strip's state, and it fires off a particular object for each channel. Here we see an individual channel, where we're checking whether it's muted and building up any of the effects chains, and indeed it's JSON encoding the entire way down. We keep doing this through the effects chains and effects UIs and so on.

So what does that mean in practice? It means this. This is a pretty typical URL for our system. The top of it, those first, I don't know, two dozen characters or fewer, is the actual thing you're loading, and everything past the hash is the state of the application. In this way we don't have a database and we don't have any audio rendering; the server is doing nothing except handing over a little bit of static HTML, so the fragment ID is the entire application state engine. Right, so at least reading and sharing a mix is easy, right? Except it turns out that Twitter and most URL shorteners totally blow up: they can't preserve the hash. There's also this thing we have to do for Google, which is these shenanigans, and that's completely gross, but it makes the fallback work. And all of this preserves the back button, which is totally worth it.

And we're supposed to attempt a demo. How much time is there? Is there time? Yeah, I'll do it really quick. So here we have Queen; this is the Queen song that was available for... [live demo: loading the mix, toggling channels, and applying effects; audio not transcribed].
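The back-button behavior mentioned a moment ago falls out of this design for free, and is worth a tiny sketch of its own (invented names again; applyState stands for whatever rebuilds the mixer UI from a state object). Each write to location.hash pushes a history entry, and going back fires a hashchange event, so restoring old state is just re-reading the fragment.

```javascript
// Re-apply mixer state whenever the user navigates history and the
// fragment changes. Assumes the fragment holds URI-encoded JSON.
function watchHistory(applyState) {
  window.addEventListener("hashchange", () => {
    const fragment = window.location.hash;
    if (fragment.length > 1) {
      applyState(JSON.parse(decodeURIComponent(fragment.slice(1))));
    }
  });
}
```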
[Demo continues; audio not transcribed.]

Great, thanks very much. Maybe we'll take a question really quickly.

Q: Why the Tuna Convolver? What made you pick that one versus the built-in?

A: That effect got used because, of the other ways to do reverb, that one actually seemed to be the most cooperative. There wasn't a particular reason, and anyway, the interface problem it highlighted would have cropped up with a bunch of other pieces too. There seem to be nearly as many ways to interface with other objects in the audio graph as there are third-party plugins, so the general problem of making everything homogeneous, so that we could quickly build generic pieces, would have come up regardless. Tuna we just went with because of its generality.

Q: Thank you. I was just wondering about the middle nodes you were sending everything through. What was the latency? Were there issues around the actual processing?

A: No, there were no issues around the actual processing. It was more that, with a native Web Audio node, you connect to it and then call its connect method on a sink; with a Tuna node you can't do that, so we had to invent a solution. We started building on, like, the first day of the hack day, and so it was: I'm going to put a gain node between every single effect in this graph, because they all do exactly the same thing, I know how they work, and they're not going to affect the audio. The gain node has a uniform interface.

Q: So if I understand correctly, the Tuna node directly connects to the sink?

A: As I said, what you have to do is connect to the input of the Tuna node.

Q: But why is the interface like that?

A: Because there is no [inaudible]
in the browser; the web presents things differently to what you might expect, and that means you have to come up with some abstraction to deal with it. Ours isn't so much an abstraction as it was a giant pile of hacks, the same thing just repeated, basically. I think, yes. All right, let's thank the presenters.