
  • Hello and welcome to the first video of a sporadic series where I'm going to be messing about with MIDI.

  • Now, some of you may not be familiar with what MIDI is.

  • It stands for Musical Instrument Digital Interface, and you may be aware that it is some sort of music format from decades ago.

  • You'd be right.

  • But what's interesting about MIDI is it doesn't transmit audio at all.

  • There are no sorts of waveforms or samples sent via the MIDI protocol; instead, MIDI is a sequence of timing events that can be supplied to musical instruments and equipment as and when necessary.

  • I think the original creators of the MIDI specification were exceptionally clever and very forward-thinking, and as a result, the MIDI format has really stood the test of time, so much so that it can be modified with ease to accommodate new developments in the industry.

  • And it has also become ubiquitous across the industry.

  • It fully satisfies the need to make sure that musical instruments and equipment can talk to each other in a sensible way.

  • Now, if you're not interested in music, do note that MIDI can also be used as a general-purpose timing protocol.

  • Um, maybe I'll talk about that a little bit later on.

  • But from a programming perspective, I think that the MIDI file protocol is quite interesting, and I wanted to explore it in a video.

  • So let's get out the synthesizers and bags full of cables.

  • Let's get started.

  • And I thought we'd start with a very brief MIDI primer.

  • Here I have some sort of digital instrument.

  • It could be a synthesizer, but these days it could really be anything at all.

  • And in principle, the user presses the keys on the keyboard.

  • And it emits some sort of sound that we can hear.

  • But unlike an actual instrument, it's completely synthetic.

  • The keys are electronic switches, which switch something on or off, and accordingly a sound is produced.

  • Now, because it's digital, we don't need to rely on the clumsiness of human hands to press the switches.

  • We can, of course, send an electronic signal instead, so we could send something like an on/off event for a particular key.

  • Has the key been pressed, or has it been released?

  • We need to identify the key, so that could be a particular note, and there may be other parameters that we want to send too, such as: how hard did we press the key?

  • This is called velocity, and these additional parameters allow all sorts of expression in the musical playing.

  • We can package up all of this information as some sort of event and send it to the instrument.

  • Now, as the name implies, this event only does one thing, but of course, for a musical piece, you want to do lots of things, so it's likely we're going to want to send a sequence of events.

  • But it's no good just smashing out these events as fast as we possibly can, else the entire song will be played, well, instantaneously.

  • It wouldn't sound very good, and as music is a temporal medium, it's important that we include timing information in these events.

  • So let's say at the beginning of the song, I want to press this particular note, but I only want to hold it down for a certain amount of time, maybe 100 milliseconds or so.

  • Therefore, I augment the packet with the information to say that this event is occurring 100 milliseconds after the previous event, and subsequently we might want this event to happen 50 milliseconds after the previous event.

  • In effect, we send the delta times between our events, and sending the delta time is quite an important thing, because what if we wanted two events to occur simultaneously?

  • Well, we just specify a delta time of zero.

  • So even though these events have been transmitted in series, whatever is receiving these events at the other end can accumulate them based on their delta times and perform them simultaneously.

  • Now, on the actual hardware itself, it doesn't work quite like that; it's very difficult to do variable numbers of things simultaneously in electronics hardware.

  • But the principle is that you run your electronics hardware far faster than you're sending these events.

  • So let's say we're operating in milliseconds here.

  • The processor on our digital instrument could quite happily be operating in nanoseconds, and so the end result is that it feels like these are happening at precisely the same time.

  • It's imperceptible to us rather boring and limited human beings, and that introduces two interesting properties of the MIDI format.

  • Firstly, it only has to be sufficient for the bounds of human perception, but secondly, it is undeniably real-time, and we'll see as we're exploring the MIDI format that, where possible, keeping things synchronized and quick takes priority.

  • Now, as well as being a programming nerd, I'm also a bit of an audio nerd, and I have quite a sophisticated setup for dabbling about with home audio.

  • I don't consider myself a skilled musician by any means, but I like to think that I am, and I try and play my best.

  • Whenever you start working with audio, one of the primary concerns you have is latency.

  • In my experience, when I'm clumsily hammering my keyboard with my fists, I start to notice that I can become out of sync if there's a latency greater than 30 milliseconds, and that doesn't seem like a lot.

  • But it does demonstrate how good the brain is at trying to synchronize audio to real-time actions within your body.

  • It actually becomes very difficult to play anything as the latency gets quite large.

  • And even though I'm not that good on a keyboard, I am actually quite okay with a guitar.

  • Guitar is typically quite a fast instrument to play, and when I've dabbled with real-time audio effects in the past, one of the things I noticed is that plucking a note on the guitar, I could only tolerate a far lower latency.

  • It had to be about 15 milliseconds before hearing the sound, but past that my fingers would become out of sync with what my brain thinks they're doing.

  • There are numerous sources of latency in an audio setup, but as you go towards the higher end, that latency tends to reduce.

  • But where is this latency coming from? Well, first of all, physics: you can't have zero latency in any kind of system.

  • But originally, when digital audio workstations were starting to be developed and become available to the consumer, the latency was in the cables and the connections between the pieces of equipment; you could only transmit digital signals at low speeds.

  • Throw into the mix the latency of converting those signals from digital to analogue and then, at the time, some rudimentary effects processing, and eventually you end up pressing a key and only a little bit later on hearing the final processed sound.

  • Interestingly, as we've progressed with the technology, the communications bandwidth has got much faster.

  • We've got things like USB 2 and USB 3 now, but at the same time, we're demanding a lot more from our CPUs to produce more interesting effects and more realistic sounds.

  • The one thing we have going for us in audio is that we're still talking about milliseconds, and to a computer, a millisecond is an eternity.

  • Anyway.

  • I'm getting sidetracked.

  • Let's get back to the MIDI video.

  • There are several layers to a MIDI sequence.

  • At the bottommost layer is the concept of a channel, where each MIDI event is associated with issuing information to a certain channel, and that channel typically reflects a specific instrument; it's usually down to your studio setup or dedicated MIDI processors to make sure that these events are sent to the appropriate destinations.

  • In the MIDI message, you can have up to 16 channels, which doesn't seem like very many, but this is where the second layer comes in.

  • The concept of tracks: tracks are sequences of MIDI events that are occurring in parallel.

  • In this new track, some of these events may be mapped onto the same channels that we already have up here.

  • But other events can be mapped to, well, new channels, so we can quickly build up quite a large library of virtual instruments.

  • Before we go and dissect the MIDI file format for ourselves, I thought it would be useful to actually look at a MIDI file, and I found this piece of software (it's free to download, but they encourage a donation) simply called MidiEditor, by Markus Schwenk.

  • It's very nice.

  • It's very simple, but it allows us to analyze the tracks and the channels and the messages in some detail.

  • And it's typical of MIDI editors to look a bit like this: on the Y axis of this frame, we've got which particular note on the keyboard is being played.

  • Now, there are always keyboards, even if the instrument doesn't have a keyboard.

  • Say it's a drum kit.

  • Of course, the drums can't play notes, but you could assign different types of drum to different keys.

  • In this editor, the tracks are all overlaid on top of each other, and they're all different colors.

  • Here, I've isolated track one, and I'll click on this big event in the middle, and you probably heard it just made a sound.

  • The nice thing about this editor is that it allows us to analyze the event information that might be relevant to a programmer, so we can see we've got this on tick and off tick.

  • That's the Delta time from the previous event.

  • It's actually visualized here in what is called wall time.

  • This is real time, but the event itself will only contain the delta to the previous event.

  • It tells us that it's playing note 77, and it's using a velocity of 100, so we really hit that key hard, and the channel it's associated with is channel zero.

  • Now, don't forget, MIDI doesn't contain any audio information.

  • It's simply a way of routing these event packets around a system.

  • So channel zero needs to be associated with something that produces sound.

  • And in this case, channel zero has been mapped onto String Ensemble 1.

  • In fact, we can have a listen.

  • These instruments on the side are virtual instruments provided by my Windows operating system.

  • They sound a little bit flat and old-fashioned, typical MIDI sounds.

  • If you're familiar with those, that's because on Windows you get this Microsoft GS Wavetable Synth, which is a basic selection of virtual instruments.

  • In fact, I can change the channel to something else.

  • So let's emulate a piano sound instead.

  • One of the nice things about events in time and pitches is they can be visualized on a 2D graph like this, and so I think that's what we'll aim for.

  • First we'll try reading in a MIDI file and visualizing it, and I just love watching the complexity of things like this.

  • When it all comes together, it produces something rather wonderful; I like it a lot.

  • As with anything that is considered a standard, somewhere there exists a specification for that standard, and this is the MIDI specification's official website; it contains lots of information about how we can interpret what's going on inside a file. Files are a little bit different to just regular MIDI streams between instruments, as they contain additional information that the instruments simply don't care about.

  • Regardless, there's nothing we'll see in my code today that we can't find listed somewhere on the specification's website.

  • Now, as usual, I'll be using the olcPixelGameEngine for the visualization.

  • Quite a high-definition one this time, at 1280 by 960, with one-to-one screen pixels.

  • However, interpreting the MIDI file we're going to be doing from scratch, so it doesn't require the pixel game engine to understand how the MIDI file format works.

  • So we'll be coming back to all that later on because I want to develop a class that I can use in subsequent projects.

  • I'm going to create a simple MIDI file class.

  • I'll give it a constructor, and all of the work is going to happen in a function called ParseFile, which takes in a path to that particular MIDI file.

  • So just for convenience, I'll add an additional constructor, which can take in the path and just call the ParseFile function.
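As a rough sketch, the class shape being described might look like this (only the names MidiFile and ParseFile come from the narration; the exact signatures are assumptions):

```cpp
#include <string>

// Sketch of the class being described; all the parsing work happens in ParseFile
class MidiFile
{
public:
	MidiFile() {}

	// Convenience constructor: takes the path and just calls ParseFile
	MidiFile(const std::string& sFileName) { ParseFile(sFileName); }

	bool ParseFile(const std::string& sFileName);
};
```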

  • Now, I'm going to treat the MIDI file as a stream because, in effect, MIDI messages are sent as a stream.

  • You don't really know what's going to happen next, and so maybe by constraining ourselves to work with a file stream, we'll produce better code for working with MIDI streams in the future.

  • So I'm just going to create a standard input file stream object, open the file in binary mode, and if there's a problem, I'm going to return false.

  • The first curiosity when it comes to MIDI is understanding how to read values from the file.

  • MIDI was developed way back when, literally billions of years ago, back when processors used a different byte order to store numbers than what we're familiar with on our modern desktop computers, so every time we read, say, a four-byte integer, we need to swap all of the bytes around.

  • So I'm going to create some little Lambda functions to do just that.

  • I've got one here called Swap32; it takes in an unsigned integer.

  • Now, that will be in a different byte order than what we actually want, so the contents of the lambda function simply swap all of the bytes over.

  • I'm also going to create one for 16-bit numbers too.
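A sketch of those two byte-swapping lambdas; Swap32 and Swap16 are the names used in the narration, and the bodies are the standard endian-swap idiom:

```cpp
#include <cstdint>

// MIDI stores multi-byte numbers big-endian; modern desktop CPUs are
// little-endian, so reverse the byte order after reading
auto Swap32 = [](uint32_t n)
{
	return ((n >> 24) & 0x000000FF) | ((n >> 8) & 0x0000FF00) |
	       ((n << 8) & 0x00FF0000) | ((n << 24) & 0xFF000000);
};

// Same idea for 16-bit values
auto Swap16 = [](uint16_t n)
{
	return (uint16_t)((n >> 8) | (n << 8));
};
```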

  • As I mentioned before, MIDI files contain lots of additional information that the instruments don't care about.

  • But perhaps the editing system for that file does, so they can contain things like strings.

  • Fortunately, strings in the file are going to be stored sequentially, and we'll know the length of the string in advance, so I'll create a little lambda function, ReadString, which takes in that length, accesses the input file stream, and just successively reads the characters and appends them to a standard string object.
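A sketch of that ReadString lambda, assuming the input file stream is a std::ifstream called ifs:

```cpp
// Reads nLength characters from the stream and appends them to a std::string
auto ReadString = [&ifs](uint32_t nLength)
{
	std::string s;
	for (uint32_t i = 0; i < nLength; i++)
		s += (char)ifs.get();
	return s;
};
```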

  • Now we get to the second curiosity with MIDI files.

  • Most MIDI files will never contain information that requires more than seven bits to store, and that might seem like an unusual design choice for platforms which routinely work with eight bits.

  • But in some ways it does make sense.

  • Firstly, you want to minimize the amount of bandwidth you're going to transmit, as hopefully this will reduce the latency.

  • But also, 127, in terms of musical equipment, is quite a lot of different things.

  • For example, if you had a keyboard with 127 keys on it, that's going to be quite some keyboard.

  • Alternatively, if you want some sort of expressive tool included in your MIDI event, well, 127 levels of differentiation within that expression is quite fine and high resolution.

  • So this comes back to that human perception and real-time thing we were talking about earlier.

  • It's good enough to get by, but limiting yourself to seven bits does start to raise some interesting problems with the file format.

  • So what if we wanted to, for example, read a 16-bit number or a 32-bit number? There are several occasions and situations where 127 isn't enough.

  • Most specifically, timing: the delta time between events could be many thousands of ticks, clock cycles, milliseconds, whatever we want to call it.

  • And so we can't store that in just seven bits alone.

  • And this is where I think the creators of MIDI did something rather clever.

  • They have a variable-length numerical type.

  • And so, since I know that I'm going to need to read one of these types, I'm going to create yet another little utility helper lambda function.

  • As mentioned, the bulk of MIDI information can be represented in the bottom seven bits, but it's convenient for computer systems to read eight bits.

  • This means we have a bit here that doesn't really do anything.

  • Well, what if we used this bit to signal to the parsing system that the value we're trying to read is in fact greater than seven bits, and that we need to read another byte in order to read the full value?

  • So when we're reading a value, if that bit is set to one, we now need to read another byte.

  • Again, only the bottom seven bits are interesting to us, but we can now construct a 14-bit word out of these two successive seven-bit reads.

  • Now, what do you think happens if the next byte that we read also had its most significant bit set to one?

  • Well, the process repeats itself.

  • We read another byte, take the bottom seven bits, and form a 21-bit word, and we can keep doing this until we see that the most significant bit of the last byte we've just read isn't one.

  • And that tells us we've got our complete word.

  • Now, 7, 14, and 21 bits seem like unusual sizes, so I'm just going to stuff all of those into a 32-bit word.

  • Now, this may seem like a very rudimentary compression, but it works very well.

  • When we've got low-value numbers, we only need to transmit a fewer number of bytes; great for a system where transmitting things increases latency.

  • I'll store my accumulated value in this variable nValue, and I'm going to need a little helper variable which represents one byte.

  • When instructed to read a value, the first thing I'm going to do is read one byte from the stream.

  • Now, that byte could be completely sufficient to give us the final value, so I'll read that into the nValue variable.

  • But I need to check the most significant bit of this byte, because if it's set, then I need to keep reading.

  • So I will take the bottom seven bits of the byte that I've just read and then proceed to continuously read bytes until I read one where the most significant bit isn't set.

  • So in this loop, I now read the byte, and I'll take my currently existing value, shift it all along by seven bits, and OR into that value the new bottom seven bits I've just read.

  • Don't forget to return the final value from the lambda function.
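Putting those steps together, a sketch of the variable-length-value reader (again assuming the stream is called ifs):

```cpp
// Reads a MIDI variable-length quantity: each byte contributes 7 bits,
// and a set most-significant bit means "another byte follows"
auto ReadValue = [&ifs]()
{
	uint32_t nValue = 0;
	uint8_t nByte = 0;

	// Read the first byte; it may be completely sufficient on its own
	nValue = (uint8_t)ifs.get();

	// If the MSB is set, more bytes follow
	if (nValue & 0x80)
	{
		nValue &= 0x7F; // keep only the bottom seven bits
		do
		{
			// Read the next byte, shift the accumulated value along by
			// seven bits, and OR in the new bottom seven bits
			nByte = (uint8_t)ifs.get();
			nValue = (nValue << 7) | (nByte & 0x7F);
		} while (nByte & 0x80); // stop once the MSB isn't set
	}

	return nValue;
};
```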

  • Now we're ready to parse the MIDI file.

  • For convenience, I'm going to add in two temporary variables, a 32-bit and a 16-bit unsigned integer.

  • MIDI files begin with a MIDI header, and this header contains information about how to read the rest of the file.

  • The first thing I'm going to read in is the file ID.

  • Now, if I open a MIDI file in a binary editor, the file ID is always this "MThd".

  • It isn't actually a number that means anything; it's just four bytes that can be instantly recognized as a MIDI file.

  • Once I've read it in, I'm going to swap the bytes around and then immediately forget about it, because I don't care.

  • The next four bytes represent the length of the header, and today I don't care about that either.

  • The next two bytes tell us things about the format of the MIDI file, and, it's getting embarrassing now, again I don't really care about that.

  • What I do care about, though, is this next one, which is the number of track chunks, and we'll come back to that in a minute, because we've almost finished reading the header.

  • There's finally another two bytes to read, which ultimately I don't care about.

  • So out of all of that header, all I'm really interested in is this value here: track chunks, the number of MIDI tracks this file contains, and those MIDI tracks will contain the MIDI events.
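A sketch of that header read, using the two temporary variables and the swap lambdas from above (the local variable names are assumptions):

```cpp
uint32_t n32 = 0;
uint16_t n16 = 0;

// "MThd" file ID: read, byte-swapped, then ignored
ifs.read((char*)&n32, sizeof(uint32_t));
uint32_t nFileID = Swap32(n32);

// Header length: ignored
ifs.read((char*)&n32, sizeof(uint32_t));
uint32_t nHeaderLength = Swap32(n32);

// Format: ignored
ifs.read((char*)&n16, sizeof(uint16_t));
uint16_t nFormat = Swap16(n16);

// The one value we care about: how many track chunks follow
ifs.read((char*)&n16, sizeof(uint16_t));
uint16_t nTrackChunks = Swap16(n16);

// Division (timing): ignored for now
ifs.read((char*)&n16, sizeof(uint16_t));
uint16_t nDivision = Swap16(n16);
```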

  • Notice we've not mentioned anything about channels here; that's included in the MIDI event itself, but as we'll see, there are ways to augment what's happening in the MIDI ecosystem of whatever software is interpreting the MIDI file.

  • The MIDI header is a fixed size, so as soon as we finish reading precisely that amount of information, we're into reading the data itself, and we'll start by reading the track data.

  • I know how many tracks there are going to be; that's my track chunks.

  • So I'm going to create a loop which goes through all of the tracks, and just for convenience, I'm going to output to the console that this is a new track.

  • Did you know you can still output to the console in the pixel game engine? That's why those two windows appear.

  • Tracks themselves also have some track header information: two 32-bit values, one which identifies the track ID.

  • And if we go back to our binary MIDI file, we can see that here.

  • Again, it's just a text string, "MTrk", that identifies "I am a MIDI track", and the next value is how many bytes are contained within that track.

  • Now, if you parse your MIDI file properly, you're not going to need that value either, because from now on, everything that we read is going to be a MIDI event.

  • And all MIDI events are deterministic in size and content.

  • I guess you could use the track length to check for corruption in your MIDI file, but it also allows you to completely skip reading a track.

  • If you wanted to, by just skipping the next track-length bytes, you could get to the next track; however, we're not interested in doing that.

  • There is one particular MIDI event, which signifies the end of the track, so I'll create a Boolean flag just to keep track of whether that's happened.

  • And then I will sit in a loop, making sure that we don't prematurely reach the end of the file, but also keeping an eye out for that event to occur, and now we're going to read in and process the MIDI messages.
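Continuing the sketch inside ParseFile, the per-track loop just described might be shaped like this:

```cpp
for (uint16_t nChunk = 0; nChunk < nTrackChunks; nChunk++)
{
	std::cout << "===== NEW TRACK" << std::endl;

	// Track header: an "MTrk" ID and the track length in bytes
	ifs.read((char*)&n32, sizeof(uint32_t));
	uint32_t nTrackID = Swap32(n32);
	ifs.read((char*)&n32, sizeof(uint32_t));
	uint32_t nTrackLength = Swap32(n32); // unused if we parse properly

	bool bEndOfTrack = false;
	while (!ifs.eof() && !bEndOfTrack)
	{
		// ...read and process one MIDI event (shown below)...
	}
}
```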

  • I'll start by assuming that all MIDI events contain a time delta value and a status value.

  • The status value will tell us what type of event it is. You'll notice, however, there's a little asterisk next to the status byte, because we'll see later on that not all MIDI events do include a status byte.

  • But let's keep it simple for now.

  • So the first information is always going to be the delta time from the previous event for this particular track, and that's going to be one of these variable-length values.

  • So I'll call our lambda function to read that in.

  • And then I'm going to read in the status value of the MIDI message.

  • Now, there are lots of different types of MIDI message, so it's time to start bringing in some information from the specification.

  • As part of my MIDI file class, I'm going to add two enums, and I've primed the values of this enum with the values necessary to represent each event type.

  • So in this case, we see things like voice note off and voice note on, so that's pressing a key and releasing a key.

  • We've got other controls which change the expression involved with the notes; a pitch bend will slightly change the note whilst it's playing.

  • We also have another type of event called System Exclusive, and we'll be using that too today.

  • System Exclusive is not directly involved in producing sound; it's more about configuring the environment or the instrument.

  • Note that it is the most significant nibble of this byte that really contains the information.

  • The bottom four bits of the status byte indicate which channel the MIDI event is being targeted at.

  • So we only need to compare the top four bits of our status byte to see what the event means.

  • There's no real way around this.

  • We're just going to need to successively check each permutation.

  • So: note off, note on, aftertouch, and so on.

  • There we go.

  • I've handled all of the event types now.
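The numeric values behind that enum are the standard MIDI status nibbles; the identifier names here are assumptions:

```cpp
enum EventName : uint8_t
{
	VoiceNoteOff         = 0x80,
	VoiceNoteOn          = 0x90,
	VoiceAftertouch      = 0xA0,
	VoiceControlChange   = 0xB0,
	VoiceProgramChange   = 0xC0,
	VoiceChannelPressure = 0xD0,
	VoicePitchBend       = 0xE0,
	SystemExclusive      = 0xF0,
};

// Only the top nibble identifies the event type; the bottom nibble is
// the channel, so comparisons mask with 0xF0:
//   if ((nStatus & 0xF0) == EventName::VoiceNoteOn) { ... }
```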

  • Fortunately for you guys today, I'm not going to go into detail for each and every single one of these, just the ones that I'm interested in for visualization.

  • But we can't fully ignore them, either.

  • Don't forget, we're reading sequentially from a file stream.

  • So let's, for example, say we get a voice channel pressure event.

  • We need to read the data associated with that event or else our stream will become corrupted.

  • I guess what I'm saying is we still have to read in all of the information in a valid way.

  • But for my visualization, I'm really only interested in capturing the voice note on and voice note off events.

  • Now it's time to talk about the third curiosity of the MIDI file format, and it certainly caused some confusion for me when I was attempting to reverse-engineer all of this.

  • It's called MIDI running status, and it's a way of compressing the data stream even further.

  • Let's assume we wanted to send a sequence of MIDI events to an instrument.

  • We would send the event time delta, the event's type, and the event data itself, and we would send that repeatedly, over and over again.

  • So if I wanted to press eight keys simultaneously on a keyboard, I'd send delta, event ID, event data; delta, event ID, event data; et cetera, eight times.

  • But the event ID is the same for all of them.

  • So that seems like eight bytes I don't need to send.

  • Recall that most MIDI data is in the range 0 to 127.

  • Therefore, we only need seven bits, and the most significant bit is superfluous again.

  • Well, you may have noticed already that all of our event IDs are greater than 127.

  • So the most significant bit indicates that the byte we've just read is an event ID.

  • So if we read some time delta and then immediately read a byte that didn't have its most significant bit set, we know that actually what we've got is some active MIDI data.

  • So which ID do we use? Well, we use the previous valid ID that was sent.

  • So this means our stream of eight events for the eight keys being pressed would look more like: the time delta, the event ID of the first one, and then the event data; and then it simply becomes time delta, data; time delta, data; et cetera, et cetera.

  • So this is just another way the creators tried to optimize the bandwidth requirements, and this caught me out at first, because there are plenty of valid MIDI files out there that don't care about this optimization.

  • But in order to be compliant and make sure we can read any MIDI file, we do need to care about it.

  • So, as soon as I've read this status byte from the stream, I'm going to check its value, because if the most significant bit isn't set, then we've entered this MIDI running status mode.

  • This requires us to keep track of the previous status byte that we read.

  • And so, if the byte doesn't have its most significant bit set, I'm going to set our current status to the previous status.

  • But in order to get this far, I've had to read the byte from the stream; I've effectively just lost it.

  • And so when it comes to reading the data of the individual events later on, I'm one byte out of place.

  • Now, I'm going to cheat a little bit here and simply instruct the stream to go backwards one byte.

  • Just to recap: in a standard MIDI message, we would have read the time delta, and then we read the status byte.

  • We then interrogate that status byte to determine how we're going to read the data for that event.

  • We've not done that bit yet.

  • However, it's possible that a status byte simply isn't sent, but in order to know that, we've already needed to read it as if it was a status byte, when in fact it was the first byte of the data of the current event.

  • So this line here just makes sure that everything gets back in sequence.
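A sketch of that running-status handling, assuming a variable nPreviousStatus that caches the last status byte seen:

```cpp
uint32_t nStatusTimeDelta = ReadValue(); // always present
uint8_t nStatus = (uint8_t)ifs.get();    // may or may not be a status byte

if (nStatus < 0x80)
{
	// MSB not set: this is MIDI running status. The byte we just read is
	// really event data, so reuse the previous status...
	nStatus = nPreviousStatus;

	// ...and step the stream back one byte so the data reads line up again
	ifs.seekg(-1, std::ios_base::cur);
}
```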

  • So how do we read the data?

  • Well, let's assume a key is being pressed: a voice note on event.

  • Rather lazily, I'm going to cache the previous status and start parsing this data now.

  • The bottom four bits of the status byte reflect which channel the event was being targeted at, so that's easy enough to extract.

  • And for voice note on, there are two more bytes to read.

  • The first byte indicates which note has been pressed, and the second byte indicates how hard the note was pressed.

  • So the status byte told us that it was a voice note on event, and the specification says the next two bytes represent the note ID and the velocity.

  • We'll see this is a common pattern throughout a lot of these events.

  • So for the voice note off, that's the key being released, we read the same two things.

  • Now, even though for this video I don't care about what these other packets actually do, I do need to read the right amount of information from them, because we're reading them from a stream.

  • And it turns out the voice aftertouch is exactly the same.

  • Voice control change is exactly the same; the voice control changes are the other dials and buttons on your MIDI instrument.

  • However, program change isn't the same; it just requires us to read one additional byte, as does reading the channel pressure.

  • Pitch bend is a little bit different again.

  • It does require us to read two bytes, but this time they mean something different, and we'll come back to System Exclusive in a little bit.

  • I should also just add in here an else for unrecognized status bytes, just in case something has happened, like corruption, or the MIDI file hasn't been rendered properly.
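Sketching those branches (aftertouch, control change, and channel pressure follow the same two-byte and one-byte patterns and are omitted here for brevity):

```cpp
if ((nStatus & 0xF0) == EventName::VoiceNoteOn)
{
	nPreviousStatus = nStatus;
	uint8_t nChannel  = nStatus & 0x0F;     // bottom nibble = target channel
	uint8_t nNoteID   = (uint8_t)ifs.get(); // which key
	uint8_t nVelocity = (uint8_t)ifs.get(); // how hard it was pressed
}
else if ((nStatus & 0xF0) == EventName::VoiceNoteOff)
{
	nPreviousStatus = nStatus;
	uint8_t nChannel  = nStatus & 0x0F;
	uint8_t nNoteID   = (uint8_t)ifs.get(); // same two reads as note on
	uint8_t nVelocity = (uint8_t)ifs.get();
}
else if ((nStatus & 0xF0) == EventName::VoiceProgramChange)
{
	nPreviousStatus = nStatus;
	uint8_t nChannel   = nStatus & 0x0F;
	uint8_t nProgramID = (uint8_t)ifs.get(); // just one additional byte
}
else if ((nStatus & 0xF0) == EventName::VoicePitchBend)
{
	nPreviousStatus = nStatus;
	uint8_t nChannel = nStatus & 0x0F;
	uint8_t nLS7B = (uint8_t)ifs.get(); // two bytes again, but they mean
	uint8_t nMS7B = (uint8_t)ifs.get(); // something different this time
}
else
{
	std::cout << "Unrecognised Status Byte: " << (int)nStatus << std::endl;
}
```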

  • So now we're reading in this stuff.

  • What am I going to do with it?

  • Well, my intention is to render the MIDI file on the screen in a way similar to what the MIDI editor I showed earlier is doing.

  • I don't really care too much about using this as some sort of MIDI processor, but I do care about translating the events I'm reading from the file into information that I can use in my application.

  • I know potentially I'm going to have multiple tracks, so I create a structure called MidiTrack, in which I'm going to have a string for the name and a string for the instrument.

  • And I'm also going to have two vectors; these vectors are going to store the events in two different ways.

  • The first one, MidiNote, is just simply the key, the velocity, when it starts, and how long it's going to be.

  • So that will effectively be what I can draw on the screen.

  • The other is going to be a MidiEvent, which is sort of a cruder form of storing the MIDI information.

  • I'm only interested in notes being pressed and notes being released; everything else I'm not bothered about.
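A sketch of the structures just described (member names are assumptions; the max/min note members mentioned a little later are included here too):

```cpp
#include <string>
#include <vector>

struct MidiNote
{
	uint8_t  nKey = 0;       // which note
	uint8_t  nVelocity = 0;  // how hard it was pressed
	uint32_t nStartTime = 0; // when it starts, in accumulated ticks
	uint32_t nDuration = 0;  // how long it is held for
};

struct MidiEvent
{
	enum class Type { NoteOff, NoteOn, Other } event;
	uint8_t  nKey = 0;
	uint8_t  nVelocity = 0;
	uint32_t nDeltaTick = 0; // delta time from the previous event
};

struct MidiTrack
{
	std::string sName;
	std::string sInstrument;
	std::vector<MidiEvent> vecEvents; // the cruder, event-level storage
	std::vector<MidiNote>  vecNotes;  // what we will actually draw
	uint8_t nMaxNote = 64;            // for scaling the visualization
	uint8_t nMinNote = 64;
};
```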

  • So as we now read through the file all of the different MIDI events, I'm going to add them to this MidiTrack, and I'll store all of these tracks in my MIDI file class in a vector of MidiTracks.

  • At the start of our loop, we know we've got a new track, so I'm going to push a blank MidiTrack into that vector.

  • Let's take the voice note off event as an example.

  • I've read in the relevant information, and I'm now going to construct a MidiEvent using that information: in this case, note off, the note ID, the velocity, and the time delta, and I'm going to push it into my vector of MIDI events for that particular track.

  • I'll do something very similar for voice note on; however, here is another little quirk of the MIDI protocol.

  • It turns out that any voice note on event where the velocity is zero is actually going to be considered a note off event.

  • So I'll check what the note velocity is, and then add to my vector of events, well, the appropriate event.
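So the note on branch grows a check, a sketch (vecTracks being the vector of MidiTracks held by the class):

```cpp
// Quirk: a voice note on with zero velocity is really a note off
if (nVelocity == 0)
	vecTracks[nChunk].vecEvents.push_back(
		{ MidiEvent::Type::NoteOff, nNoteID, nVelocity, nStatusTimeDelta });
else
	vecTracks[nChunk].vecEvents.push_back(
		{ MidiEvent::Type::NoteOn, nNoteID, nVelocity, nStatusTimeDelta });
```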

  • Now, as I've mentioned already, I don't particularly care about the other events, but I'm just going to push something to my vector for now.

  • So far, so good, and reasonably simple.

  • It would be great if MIDI files just contained this basic event information, but sadly, they don't, and this now adds some complexity to what we need to read in, because we have to read in everything to make sure that all of the data we're reading is valid. This complexity comes in the guise of the system exclusive events. Now, in the source code that accompanies this video on GitHub, I've gone through these events in some detail.

  • I'm not going to do that for this video, other than to look at one or two of them.

  • All we care about for now is knowing how to read them properly.

  • So I'm going to add another enum to my MIDI file class for all of the system event types, meta events in this case, and this contains all sorts of, well, meta information.

  • So: generic text, copyright.

  • What is the name of a particular track?

  • What is the name of a particular instrument?

  • Are there any lyrics associated with this?

  • This is the kind of stuff that the editors will use in order to configure a project around a particular MIDI file.

  • Instruments don't care about this sort of thing.

  • There are some critical ones, however: setting the tempo, setting the time signature, that sort of thing could be interpreted by the instrument to change its speed.

  • System Exclusive is a way of making the file format flexible enough to handle future changes, and so in the future we may have messages which are not defined by the specification.

  • These messages will be guarded by these two status bytes.

  • Meta messages, however, are defined in the specification, and these always have 0xFF specified as the status byte.

  • When we know that our MIDI event is a meta message, we're going to read in what type it is, and we're going to read in the length of the message (that could be a variable-length value); then it's simply a case of just processing the type.

  • Now, I've just copy and pasted all of these in, all of these different meta message types.

  • I'm not going to go through them in detail, but we'll have a look at some of them.

  • So once we've recognized the type, for example the copyright information of the MIDI file, and we've read in the length of the message, the specification says that this is going to be a string of that particular length, so we're just going to read in that string with our ReadString lambda function.

  • The same applies to things like the track name; that's totally decided by the composer.

  • I think it's quite interesting to read the track name in, because we can visualize it later with the MIDI events.

  • Believe it or not, that's that.

  • Read the header, read how many track chunks there are going to be.

  • Then read each track chunk individually: read the track chunk header and start parsing the events.

  • The events all have a delta time.

  • They all have a status, and they all have an event body.

  • If the event happens to be a system exclusive event, then it could be anything that we want that needs to be defined specifically for a particular system.

  • Or it's going to be one of these meta events, which is defined by the standard across all MIDI files.

  • And that's it.

  • So we can start to test our parser by loading up a MIDI file, and I'm going to do that in OnUserCreate.

  • I'm just going to load the MIDI file that I was playing earlier.

  • You'll notice that in the meta events I've output a lot of stuff to the console.

  • So if I run the application, the pixel game engine doesn't do anything.

  • But if I click on the console window behind it, we can see what's been output.

  • There's been a whole bunch of tracks there; we've managed to determine the time signature and the tempo, 185 beats per minute.

  • We can see that each track that we read in has a name associated with it.

  • Typically, which instrument is going to be played. There has also been some stuff in there that shouldn't be there.

  • And this is one of the final curiosities I've learned about MIDI files: they contain all sorts of junk that doesn't necessarily adhere to any particular known standard.

  • So that's why we have to be careful about making sure that we do read everything that we need to read.

  • Well, I'm satisfied that there is certainly MIDI data being loaded, but right now our MIDI file class contains vectors per track, and those vectors contain MIDI events.

  • I'm now going to convert these MIDI events into something I can visualize.

  • Don't forget, the MIDI event works with delta time, and delta time is a little bit tricky when you want to draw things in the future.

  • So I'm going to convert the relevant MIDI events into discrete MIDI things that have happened.

  • So I'm going to turn them into these MidiNotes.

  • I know the key.

  • I know the velocity.

  • I know when it starts in real time, and how long that note is held for.

  • My MidiNote doesn't care about on or off.

  • Once I've finished parsing the MIDI file, I'm now going to do this conversion from events into something which is more useful to me.

  • So I'm going to create a little auto for loop that goes through all of the tracks for this MIDI file.

  • Tracks can be treated in isolation.

  • They run in parallel to each other.

  • So timing information doesn't happen across tracks.

  • I'll create a variable called wall time, which is my real-time value.

  • And it's my intention here to look at all of the MIDI events as we go through time and reduce them into my MidiNote events.

  • And this has a little bit of complexity, because the MIDI event is far simpler: it's on or it's off.

  • But my note is on with a duration, so I need to keep track of which notes are effectively being held down at any given time and when they are released.

  • I'll do this by creating just a quick temporary list of MIDI notes, my list of notes being processed, and now it's time to iterate through all of the events in my vector of events.

  • We know where we are up to in real time simply by accumulating the delta of every event.

  • Now, the deltas could be zero; that's okay, real time hasn't progressed.

  • That just means two things are happening at the same time.

  • If the event is a note on, I'm going to add one of my MIDI notes to my list of notes being processed.

  • The problem is, I don't know what the duration of this note is until I find the corresponding MIDI note off event for that particular note.

  • I'm really sort of hacking this together; I'm sure there are far more effective ways.

  • If the event happens to be a MIDI note off, I'm going to search my list of currently processed notes to see if one has the same key.

  • And if that search yields something, that means this is a note off event for a key that's already being processed.

  • I now know its duration: I can work it out by taking the current wall time minus the found note's start time.

  • And since this note is no longer being processed, it graduates to being put into my track's vector of notes, and I can erase it from the list of notes being processed.
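A sketch of that whole reduction, including the min/max note bookkeeping described next (assuming <list> and <algorithm> are included):

```cpp
// Convert delta-timed on/off events into notes with a start time and duration
for (auto& track : vecTracks)
{
	uint32_t nWallTime = 0; // accumulated "real" time, in ticks
	std::list<MidiNote> listNotesBeingProcessed; // keys currently held down

	for (auto& event : track.vecEvents)
	{
		nWallTime += event.nDeltaTick; // a delta of 0 = simultaneous events

		if (event.event == MidiEvent::Type::NoteOn)
		{
			// Duration is unknown until the matching note off arrives
			listNotesBeingProcessed.push_back(
				{ event.nKey, event.nVelocity, nWallTime, 0 });
		}
		else if (event.event == MidiEvent::Type::NoteOff)
		{
			// Find the held-down note with the same key
			auto note = std::find_if(
				listNotesBeingProcessed.begin(), listNotesBeingProcessed.end(),
				[&](const MidiNote& n) { return n.nKey == event.nKey; });

			if (note != listNotesBeingProcessed.end())
			{
				note->nDuration = nWallTime - note->nStartTime;
				track.vecNotes.push_back(*note); // the note "graduates"
				track.nMinNote = std::min(track.nMinNote, note->nKey);
				track.nMaxNote = std::max(track.nMaxNote, note->nKey);
				listNotesBeingProcessed.erase(note);
			}
		}
	}
}
```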

  • Those of you really paying attention will have noticed that in my MidiTrack structure I've got two additional variables, max note and min note, and I've set them to 64.

  • Potentially, for a MIDI event, there are 127 notes.

  • It is very rare to see something going from zero to 127, and if I were to visualize all of this, I would have a lot of blank screen with just a small line of things happening in the middle of it.

  • So I'm going to record what the maximum note pressed was and what the minimum note pressed was.

  • That way I can scale the height of my track visualization, and I'll make a check for that here.

  • And this will mean that when I'm visualizing the track, I don't need to draw lots of blank, empty space above and below where, really, all the action is happening in the MIDI file.

  • Let's quickly construct the visualization.

  • First, I'm going to clear the whole screen to black, and I'll add to the class a 32-bit unsigned variable, nTrackOffset.

  • What I'm going to do is use nTrackOffset to move us forwards and backwards through time.

  • I'm also going to need some variables that tell me how to actually visualize what I'm seeing.

  • So per column, that is, one pixel across, is going to represent 50 ticks in the MIDI domain.

  • Now, ticks we've not really talked about.

  • But fundamentally, the tick is the clock that is operating behind the scenes for the MIDI system.

  • So when we talk about these delta times, I've sort of roughly mentioned milliseconds; they're actually in MIDI clock ticks, and these ticks can be converted into real time by looking at some of the meta messages that have come through the MIDI file.

  • Fortunately, most systems have the same definition for ticks.

  • It is important that the frequency of the MIDI clocks used across your system is the same.

  • Not only does it keep your instruments synchronized, but it means that one instrument is playing stuff at the same speed as the other.

  • I'm avoiding going into timing control directly by maintaining the idea that a tick is an arbitrary time duration specified by something in the system.

  • So I'm just going to make sure everything is relative to the tick.

  • So per pixel on the screen, I've got 50 ticks, and each note I'm going to display is going to be two pixels high.

  • For each track in my MIDI system, I'm going to draw its contents, but I'm only going to do that if the track actually contains something.

  • You quite often see in MIDI files that tracks contain additional information, such as composers and orchestration and musical notes, that sort of thing: stuff that we can't visualize.

  • I'm only interested in note on and note off events.

  • Because we know the minimum and maximum note of a particular track, I know what the range is for that track, and so I'm going to draw a grey rectangle across the screen, using the FillRect function, the same height as the range of that particular track.

  • I'm also going to draw the name of that track too.

  • I've just introduced this nOffsetY variable; we're going to need to change this as we draw the tracks, because each track is going to have a different height, and we're going to have several tracks going down the screen.

  • Now it's time to draw the notes, and I'm not going to optimize this in the slightest.

  • I'm just going to draw them all, even if we can't see the majority of them; each note is going to be drawn as a filled rectangle.

  • The X coordinate for the starting point of that rectangle we already know, because we've extracted it from the event sequence, and we're going to offset it by our track offset value.

  • And don't forget that one column of pixels represents 50 ticks, so I need to divide this number by 50 to effectively scale it in the X direction.

  • The Y position of the note is determined by where we're starting to draw the track on the screen and the key value of the note; that affects its height, so as the notes get higher, we want to see them moving up the screen.

  • The width of the note we've already determined, because it's the duration of the note divided by the ticks per column, and the note height is just a constant that we've specified, and I'll draw that.
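A sketch of that drawing loop; FillRect, DrawString, and ScreenWidth are real olcPixelGameEngine calls, but the layout arithmetic is an assumption (midi stands for the loaded MidiFile instance), and no scrolling underflow or clipping is handled:

```cpp
// Inside OnUserUpdate: one pixel column = 50 ticks, each note 2 pixels high
uint32_t nTimePerColumn = 50;
uint32_t nNoteHeight = 2;
uint32_t nOffsetY = 0;

for (auto& track : midi.vecTracks)
{
	if (!track.vecNotes.empty())
	{
		uint32_t nRange = track.nMaxNote - track.nMinNote;

		// Grey background rectangle the height of the track's note range
		FillRect(0, nOffsetY, ScreenWidth(), (nRange + 1) * nNoteHeight,
			olc::DARK_GREY);
		DrawString(1, nOffsetY + 1, track.sName);

		for (auto& note : track.vecNotes)
		{
			// X: start time (offset by the scroll position), scaled by
			// ticks per column; Y: higher keys drawn further up the band
			FillRect((note.nStartTime - nTrackOffset) / nTimePerColumn,
				nOffsetY + (track.nMaxNote - note.nKey) * nNoteHeight,
				note.nDuration / nTimePerColumn,
				nNoteHeight, olc::WHITE);
		}

		nOffsetY += (nRange + 1) * nNoteHeight + 4; // next track further down
	}
}
```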

  • There's our little white rectangle. I'll just add in a couple of user controls so we can change the track offset manually with the left and right keys.

  • And that's our complete OnUserUpdate function.

  • As you can see, there's not much to it.

  • And that's because we took the time to change our MIDI events into notes, something that we can immediately work with in our own framework.

  • Let's take a look.

  • I've loaded up the MIDI file from before, and we can see we've got tracks going down the screen.

  • Each grey rectangle is a track, and they're labeled with the name of that particular track.

  • And inside the tracks we can see there are lots of notes, and if I press the left and right keys, we can scroll forwards and backwards in time.

  • In a way, it's very similar to the MIDI editor software we saw at the beginning.

  • Now this has been a bit of a roller coaster so far.

  • I'm really pleased that we've accurately visualized the contents of a MIDI file, but I'm feeling a little bit empty and unfulfilled.

  • It's MIDI! We're meant to be hearing things; we're meant to be making lots of strange noises.

  • So to end now, I'm just really quickly hacking something in.

  • I'm sorry, it's Windows-specific.

  • It's a total botch, a complete hack.

  • What I'm going to do is take our notes, convert them back into MIDI events, and send them to one of my synthesizers so we can hear it.

  • And this time I really do mean we're going to do this quickly, cause I'm not going to go into any details.

  • What I'm going to do is load the Windows Multimedia Library, which will allow me to access the MIDI drivers built into Windows.

  • I've no doubt people out there will be able to come up with a Linux alternative.

  • I've just added some additional variables to the main class that will maintain timing.

  • The Windows Multimedia Library is considered a bit old hat now for real time audio, but it's still fantastic for Midi.

  • It's very simple to just quickly open an instrument.

  • In the OnUserUpdate function, after we've drawn everything, I'm just going to throw in a quick test where we're going to be sensitive to the space bar being pressed and released.

  • The Windows library easily allows us to send a MIDI message to an instrument, and it's the same format that we've been working with already.

  • Here we've got the status byte, here we've got the note, and here we've got the velocity.

  • So when I press the space bar, I'm setting a note to be on, and when I release the space bar, I'm turning it off.
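For reference, a minimal Windows Multimedia sketch of that space-bar test; midiOutOpen, midiOutShortMsg, and midiOutClose are the real winmm calls, while the device index and note values are arbitrary:

```cpp
#include <windows.h>
#pragma comment(lib, "winmm.lib")

HMIDIOUT hInstrument = nullptr;

// Open MIDI output device 0 (often the Microsoft GS Wavetable Synth)
midiOutOpen(&hInstrument, 0, 0, 0, CALLBACK_NULL);

// A short message packs status | (note << 8) | (velocity << 16) into a DWORD.
// 0x90 = note on, channel 0; note 60 = middle C; velocity 100
midiOutShortMsg(hInstrument, 0x90 | (60 << 8) | (100 << 16));

// ...and 0x80 = note off for the same key when the space bar is released
midiOutShortMsg(hInstrument, 0x80 | (60 << 8) | (0 << 16));

midiOutClose(hInstrument);
```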

  • So here you can see one of my synthesizers if I press a key.

  • Beautiful. And I've connected my synthesizer to my computer via USB, and it will transfer MIDI information over the USB.

  • So let me start my application, and we can see the MIDI track that's been drawn; I can press a key on my synthesizer, but I can also press my space bar.

  • Okay, so the space bar is linked to my synthesizer via MIDI.

  • Now, I'm not kidding when I say this is botched code at its best; it violates all sorts of rules and principles.

  • I do not recommend that you do anything this way, but it's just to allow us to prove a point.

  • So what I'm effectively going to do is take our note sequence, convert it back into MIDI, and issue those events to my synthesizer.

  • So let's take a listen.

  • Now, my synthesizer can only play one track at a time, so I've chosen track one, which is sort of the lead string ensemble from before, and I'm going to play it.

  • Hopefully, you can hear that: in real time, we're manipulating sound from my synthesizer.

  • The MIDI file I've been working with is quite a simple one, so I want to try a slightly more complicated one.

  • Now, let me just choose an appropriate patch on the synth; that one will do.

  • Let's play this file, and this time I'm going to play both tracks simultaneously.

  • So I have a small loop here instead of just one track.

  • Let's have a listen, and as we can see, it's quite happily loaded in absolutely every single MIDI note.

  • There's an additional track at the top here that contains tempo change information, which I'm not using for playback, but this is a very sophisticated file with lots of different events in it.

  • So that's that: a quick and simple look at the MIDI file format.

  • Now, I like MIDI, I like sound, and I like programming, and I think I'm going to merge it all together.

  • You don't have to just use MIDI for instruments.

  • It can be used as general-purpose timing control too.

  • And I know for a fact that there exist firework displays and lighting shows which are completely coordinated by MIDI.

  • Also, that actually gives me an idea for a game, so I'll have a think about that.

  • Anyway, if you've enjoyed this video, give it a big thumbs up.

  • Please have a think about subscribing.

  • The source code will be available on the GitHub.

  • I'll see you next time.

  • Take care.

