Monday, October 27, 2014

Five Insightful Music Producer / DJ interviews and web posts from some of our favorite EDM Artists.

Refracture (Paul Dobson)

Interview on DJs Arena

1. How long have you been into music and what made you get into it?

I always did some form of music at a young age at school, mainly playing the piano, with some experience with the viola. My real interest in music began, however, when a friend introduced me to Nine Inch Nails & Aphex Twin when I was 15. I just fell in love with the melodies and intelligent production. From there I delved further into electronic music, becoming interested in trance, which then led me to breaks, and the rest is history!

2. If you had to choose one element that defines your music as a whole, what would it be and why?


I think the main element of my music is usually its deep melodic content. I just love deep, moving melodies and that is where I always start. This combined with a heavy low end and lots of energy is what I think the ‘Refracture’ sound is mostly about.

3. What other artists/DJs influence your work? What have you learned from them?


The likes of Nine Inch Nails, Aphex Twin & Sigur Ros have definitely been a big influence from a melodic point of view. From a production point of view I do very much like Deadmau5’s sound, it’s just very warm and while his tracks might not always be to my taste, everything is very well produced. Feed Me is also a producer I’m very impressed with. As far as performers though it goes hands down to the Stanton Warriors, their tracks always have a very cool yet simple way about them and they always get the crowd going when I’ve seen them play.

4. What defines a good producer and what direction is house music heading in? Who are some of the DJs or producers you currently appreciate?


I think, most importantly, a good producer has their own sound and doesn’t just attempt to copy others because of what’s fashionable. Production-wise it’s important to make sure everything has its own space and to take your time in making something sound as full and round as possible, as well as paying good attention to detail and not being lazy with it. As far as direction, I think things are going to start getting more melodic again. More nostalgic melodies that make tracks more recognisable and less abrasive dubstep basslines. Currently I think guys like Porter Robinson, Feed Me and Zedd are doing good things, because they are just heading whichever direction they wish, are extremely good producers and continue to push their sound.

5. What is the key ingredient in a track? Breakdown? Style of production? Bassline?


For me, personally, everything starts with a good melody. If I’ve got a really good melodic breakdown then everything seems to fall into place naturally from there. If the track is either evoking emotion or making me bounce around the room like an idiot, or both then I’ve got it right!


6. Do you agree that electronic music has become hyper-productive?


It was bound to happen, it’s become extremely popular so obviously a load of kids were gonna want to be the next superstar dj and start producing. There are so many producers at the moment trying to make it, but in my opinion this has only driven the standard higher as there’s so many great producers out there now you really have to be producing tracks to the very best of your ability and can’t get away with being lazy. Obviously there is more crap out there than ever but it generally isn’t competing. So, to answer your question, yes it’s hyper productive, but it’s not necessarily a bad thing.


7. Your favorite producer at the moment.


I don’t think I have one really, although whenever there is a new Feed Me or Mord Fustang track out I will always give it a listen!

8. What kind of equipment did you start out with? How much has that changed to this date?


I used to just use a Sony laptop, a copy of FruityLoops and some headphones. I got my first few releases out on Beatport producing on that, until I decided I really wanted to take everything to a higher level and got to work on building a studio with lots of acoustic treatment, some KRK monitors and a sub, and a MIDI keyboard. For DJing I used to always DJ on vinyl and then CD decks, but have made the move to laptop recently as I have more control over everything and can get a bit more playful with effects etc. at live performances.

9. Your favourite piece of equipment/gadget?


My MacBook Pro. I’ve had it 3 years, have played gigs all over the world with it, and it is yet to let me down.


10. What are the top 5 tracks we should check out?

1. Planet Perfecto – Bullet In The Gun (Refracture Remix)
2. Elite Force – Be Strong (Hirshee Remix)
3. Motioned – Right Here (Miles Dyson edit)
4. Refracture – Burn It Down
5. Deenk – Funky Shit (Refracture Remix)


11. What do you hope to achieve in the future?

I hope to achieve lots of things, but most importantly I would just love to be able to make music as a living for the rest of my life, that’s my dream.

http://djsarena.com/refracture/


Far Too Loud
Production tips from Far Too Loud on Black Octopus Sound


1. FREQUENCY CONTENT: It’s well known that you want to aim for an evenly balanced frequency spectrum in your mixes. When adding a sound to a track I always consider where it will fit in the frequency spectrum and if there are any other sounds it will compete with. Before applying EQ I’ll play about with the octaves different sounds play in to make sure they can all be heard clearly. I’ll often use EQ automation to keep the mix full but clean. For example, if a track drops with just a bass sound and drums, then the bass sound (most often in my productions) will have an even frequency content over the spectrum so that it sounds big and full, however if a lead sound comes in later on, I’ll automate an EQ on the bass channel to allow space in the frequency spectrum for it.

2. PANORAMA/WIDTH: I rarely touch the channel pan pots when mixing my tracks as I don’t think it works well in clubs to have a sound permanently louder in one channel (although I will automate the pan pot or use autopan sometimes). Instead I’ll sometimes give a sound width by somehow varying the L signal from the R. There are loads of ways in which this can be done, the simplest of which is to apply a small delay to one channel. Many synths allow you to pan unison voices which is a technique I like to use often. Once I have made a sound “wide”, I may use some kind of processing on the S (side) signal to further control the width (check this link if you’re not familiar with mid/side processing –http://www.bluecataudio.com/Tutorials/Tutorial_MidSideProcessing/). This may be simple gain adjustment for overall width control or perhaps I’ll use DMGAudio Equality (http://dmgaudio.com/products_equality.php) to EQ the S signal such that some frequencies (generally higher ones) are wider than others (generally lower ones). I think it’s important to have a good balance of wide and narrow or mono sounds so the whole panorama is filled. If I have a lead sound and a pad sound playing together, I’ll generally make one wider and one narrower or mono and sometimes play about with switching which is the wide one and pick what I think sounds best. With bass sounds I sometimes like to layer a mono sound with low and low-mid frequency energy with a wider sound with more high frequency energy to create a sound which fills the panorama. Note that it will help a lot in judging panorama and width if you set up your speakers and listening position properly (read this for more info positioning –http://www.soundonsound.com/sos/feb06/articles/studiosos.htm).
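If you want to experiment with the mid/side idea yourself, here is a minimal Python sketch of the underlying encode/decode math. This is a generic illustration, not the plug-in chain described above; the function names and example numbers are made up for demonstration.

# Mid/side sketch: encode a stereo pair into mid/side, scale the side signal
# for width, then decode back to left/right.
def ms_encode(left, right):
    # Return (mid, side) for one pair of stereo samples.
    mid = (left + right) * 0.5
    side = (left - right) * 0.5
    return mid, side

def ms_decode(mid, side):
    # Rebuild (left, right) from mid and side.
    return mid + side, mid - side

def widen(left, right, width=1.5):
    # width > 1.0 exaggerates the side signal; width = 0.0 collapses to mono.
    mid, side = ms_encode(left, right)
    return ms_decode(mid, side * width)

# Example: a sample that is louder in the left channel gets pushed wider.
print(widen(0.8, 0.2, width=1.5))   # roughly (0.95, 0.05)

Scaling the side signal toward zero is also a quick way to check mono compatibility.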

3. DEPTH: The depth of a sound is generally controlled by adding reverb, although you could use some sort of modulation or delay too. I have acoustic treatment in my studio which deadens the sound in the room – this is invaluable for listening to reverb tails. If you don’t have a treated room, use some trusted headphones to check your reverb settings. I apply reverb to nearly all elements in my tracks, even if only a little, and use a number of different reverbs in one track to give a range of depths to the different sounds. Even if I am going for an overall very dry, up-front sound, I still like to use very short reverbs, delays or chorus to give a sense of space.

4. SIDE-CHAIN COMPRESSION: Side-chain compression from the kick drum is a well-known technique, but I have become obsessed with LFOTool from Xfer Records for this (http://xferrecords.com/products/lfo-tool). The reason is that it allows me to easily tailor the release curve for each individual sound. I can go for a quick release, so that the sound is ducked only for the “click” of the kick drum, or a long release which can completely remove the sound for the duration of the kick, or anywhere in between. I’ll spend a bit of time with each sound playing about to get the release curve sounding right and sometimes, particularly with basses, split the frequency spectrum with EQs on parallel channels and apply a different release curve to each band. LFOTool also has a filter which you can control with the LFO which can be useful for fitting a sound in the mix.
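As a rough illustration of that tailored-release idea, here is a hypothetical Python sketch of a per-beat ducking envelope. It is not LFOTool, and the parameter names and numbers are invented for the example; it just shows how the release length and curve shape change how long a sound stays ducked after the kick.

# Build a gain envelope (values 0..1) for one beat: fully ducked at the kick,
# then easing back up over an adjustable release length and curve shape.
def duck_envelope(beat_len=100, depth=1.0, release=30, curve=2.0):
    env = []
    for i in range(beat_len):
        if i >= release:
            env.append(1.0)                      # past the release: full level
        else:
            t = i / release                      # progress through the release, 0..1
            env.append(1.0 - depth * (1.0 - t) ** curve)
    return env

short = duck_envelope(release=10)    # duck only for the "click" of the kick
long_ = duck_envelope(release=90)    # sound almost removed for the whole kick
print(short[:5], long_[:5])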

5. GAIN. On many tracks in my sessions you’ll see Sonalksis Free G (http://www.sonalksis.com/freeg.htm) in the last plug-in slot. It’s simply a really nice software fader and I’ll automate it throughout the track on many sounds to make sure the mix is always well balanced and that the prominent sounds have space. I use Free G so that I can still adjust the overall level of the sound with the channel fader without having to adjust the gain automation.

http://blackoctopus-sound.com/interviews/far-too-loud/

Favorite VST Instrument at the moment? FXPansion DCAM Synth Squad

Favorite VST effect at the moment? It’s a bit more than an effect, but I need to mention NI Kore…it’s so essential to my work flow.

Favorite Sound Library? Tough to pick a fave, but here’s a cool one I discovered a while back – http://www.drivenmachinedrums.com/

DAW of choice? Cubase 6

Do you use any hardware? I have a MIDI controller (Novation Impulse 49), a sound card (Focusrite Saffire 24 DSP), and some speakers (KRK VXT8s) which are all nice, but apart from that it’s just a computer.

Producer you are digging at the moment? Culprate

How long have you been producing for? 8 years I think

Biggest mistake beginners make? Not investing in some decent ear plugs and wearing them at gigs.



 King Felix

KING FELIX – PRODUCER INTERVIEW SERIES
by Mike Balk (Balkstar)

Six Foot Three introduces you to inspirational music producers from around the globe.

This week we are checking out King Felix, a live electro band based in Los Angeles, California. Over the past few weeks we’ve been chatting with Jason Toth, one of the original founders of King Felix, to find out how things have evolved since the band’s inception back in 2010.


“WE WANT TO GIVE THEM A SHOW THAT MAKES THEM FEEL ALIVE.”

The band began around one and a half years ago when friends Jason and Shaun were looking at opportunities to make a living out of computer-based music production. A conversation in a good old American grocery store resulted in Edgar becoming the third member of the crew. Edgar, a violinist, didn’t know much about the computer side of things, but his wealth of music theory helped to widen the group’s sound palette. Three months ago Denis, a friend and mentor to Jason, joined forces and has been working hard to improve the playability of King Felix in live performances. The final member of the crew is Raymond, a guitar major at California State University, who has been jamming with King Felix since the beginning.


“EXPERIMENTATION AND A LOVE FOR WICKED VISUALS IS HOW WE GET IT DONE”

We build designs upon designs in mainly Photoshop from scratch with lots of triangles and shapes based on ancient geometry. We then throw those designs into other programs to add mind boggling effects, for example Final Cut or Adobe Premiere. A lot of times we make backdrops or morphing designs that are rendered 10 times over each other and we don’t even know how we get to where we end up most of the time. Experimentation and a love for wicked visuals is how we get it done, no online tutorials just us literally figuring out each program on our own. We’ve gotten way faster at conceptualizing our process for video in the past two months just by doing it!


“WE MAKE EVERYTHING FROM SCRATCH HERE AT KING FELIX.”

“We learned to master our tracks by reading forums, listening to music, and analyzing other producers vigorously. We are always checking out new artists and sounds.” When they listen to other artists’ beats, Jason and Dennis love to deconstruct the sound design, while Edgar loves to figure out the chord progressions and the melodies.

“We love our music and our fans. Nothing means more to us than the comments and reactions we get from our fans. We want to give them a show that makes them feel alive. Recently we have focused on visuals and music videos for our production.” “When people watch us, or listen to our beats, the goal is to have the fans forget about their usual worries and problems for a moment. It’s a beautiful thing when people are present and enjoying themselves and their surroundings.”
“I WAKE UP EVERY DAY AND CREATE MUSIC UNTIL I PASS OUT ABOUT 24 HOURS LATER.”

http://instantmemoryaccess.com/king-felix/



Lazy Rich 

5 Production tips from Lazy Rich on Black Octopus

1.  Sample packs are a great way to easily bring new ideas and depth into your track, but it’s very important that you know how to use them properly. I try to always change any samples I use before putting them in a track, even if it’s something as simple as pitch shifting or adding a delay.

2.  Try to make sure that your track is always changing and evolving to keep the listener interested. I try to make sure that something changes every four bars, and I use uplifters and downlifters to build up to those change points and make them an interesting event.

3. I find a lot of drum loop samples found in sample packs to be too busy. A great way to make them more usable is to run a simple gate over them; this way only the loudest sounds from the sample will be included.

4. Never underestimate the importance of taking regular breaks in the studio – your ears get tired after prolonged exposure to the same music. I often find that if I’m struggling with a track, leaving it and then returning the next morning makes a huge amount of difference.

5.  You should always start with the biggest and most powerful part of the track, as this is the focal point for any listener, and is the point in the track when the crowd should have their hands in the air! When starting a remix I have a very particular order that I do things in – first off I lay out all the elements contained in the sample pack and pick out those that I think will be useful. I will then work a short breakdown, adding new chords or melodies onto any vocals, followed by a build. Once the build is successful, this then gives me the perfect opportunity for taking a step back and thinking ‘what comes next’, as it gives you a reference with which to write your bassline, even if none of the elements from the sample pack are used at that point in the track.

The Lightning Round…

Favorite VST Instrument at the moment? Bit old now, but I’m really enjoying Sylenth for chords and adding some depth with background arps.

Favorite VST effect at the moment? Character, which is included with the TC PowerCore, is a must for getting the most out of your bass synths.

Favorite Sound Library? Toby Emerson Essential FX of course, use it in EVERY track!

DAW of choice? Cubase 5

Do you use any hardware? If so, favorite gear? None, hardware scares me!

Producer you are digging at the moment?  R3hab

How long have you been producing for? 5 years now

Biggest mistake beginners make? Not comparing their tracks using a variety of different speakers – just because it sounds ok in your studio doesn’t mean it’s going to work in a club.

Is there anything else you want to share? Yes I'd love to invite any label owners to check out www.label-engine.com.


Feed Me -

Jon Gooch's Personal Post about his music, process and why he does what he does.

'This is the greatest game in the world.'
I've almost stopped doing interviews because I'm achieving nothing. If you want to find something out about me, ask me personally. If it catches my eye, I'll respond, but dragging through another interview that no one thought about for more than two minutes seems like treading very boring water. Not that they've all been that way; but it's the trend.
A well known electronic music magazine recently wanted to do a few page spread about my production techniques. They sent me a list of preliminary questions; what plugins do I use for 'dirty' sounds, what makes a good 'drop', how much 'filth is too much filth'? Who wrote this? I could play the system; give away minimal information in exchange for some printed coverage, but at this point, fuck it. 
The Mau5hax thing was great; I got to interface with talented people and enjoy making music. I learnt as well as got involved. I didn't sit and have my mechanical techniques picked at while my actual motivation was ignored; we made decisions together.
I don't mind the occasional production Q, but what happened to mystery in music and art? There's YouTube tutorials for days now online. Look it up; these production conversations are redundant. The truth and effect comes in the sincerity and composition of the actual piece. If I read an interview with an artist of any type, what I want to know is the 'why' - not the 'how'. Why as electronic artists are we constricted to being quizzed monotonously about our techniques, and not ever our motivation? The reason anything I made sounds the way it did is because I sat and worked out every single piece of it myself. Give every one of us the same tools, and see what we all end up with - it's our differences in expression and decision making that makes us.
I'm doing this because I honestly don't know what else I can do. Music and art for me is a necessary release, and once people picked up on what I was making I was thrown into it. I was a bottled up, angry teenager, and I was completely consumed by the satisfaction I'd found in this new idea of making my own music. It consumed my life and I found I loved what it brought to it, and now I'm on an endless journey to see where it takes me, and where I can take it. Because of it, my entire late teenage and adult life I've been travelling the world, from Spor to Feed Me, constantly humbled by the people I've met, things I've seen, extremes I've lived through - I'm nothing but overwhelmingly grateful, it's almost too much.
Some of it has been physically and mentally tough, but so far I've never quit. It's never left my mind that should I drop dead, there's a million people who would kill to take my place. I don't believe in luck necessarily; I carved this out myself, but I am honoured to have what I have. If you're going to complain about your reality when you're living another person's dream, then I think you need a massive reality check. No one's forcing you. Music is magic; and I think as artists we have a duty to keep it that way, not dissolve it down into presets, complaints, one-upmanship and catering to the market. It's not all pink candy-floss cloud rides, and I think it looks fake if you depict it that way, but it really could be a lot fucking worse.
I used to lie and listen to my favourite records and daydream about how they were thought up, get lost in the sounds. There was no one to ask or study, and the resulting domino effect of speculation led me to my own ideas. It's always been the unknown that's motivated me. Spor was what I fell in to, but Feed Me is my world, a projection of a piece of me, and a way of expressing whatever I feel like. I couldn't have built what I have without you guys supporting me, but I'll always be creating and writing it none the less. I love you all for letting me take it this far.
I don't normally post my opinions on here, but I've never got anywhere by playing the game, and sometimes I just feel I need to 1) say thanks, and 2) say why. TLDR.
Feeeed.


Sunday, October 26, 2014

Florian Born's Modulares Interface, a B.A. project, turns the iPad touchscreen into a modular mosaic of physical buttons, sliders, and knobs.

Florian Born's Modulares Interface -

Multi-touch devices like the iPad have become more and more popular over the last couple of years. Nowadays they are not only used for browsing and sending e-mails, but also as a medium for new fields of application. One particular aspect of multi-touch devices is in need of improvement: the lack of haptic feedback, which makes it difficult to set parameters precisely.

To address this problem, the project provides a variety of physical controllers. These controllers expand the usage of a touch device with haptic feedback while adjusting parameters. By using magnets, the different controllers can easily be arranged on the iPad, creating a modular interface for a given device such as the iPad.
The system contains three different parts:
- The physical controllers (button, slider, and knob), made out of conductive aluminium to pass on the electrical charge of the human skin.
- A frame, made out of aluminium and plastic, in which the iPad is inserted. The edge of the frame has embedded magnets, making it possible to position the controller precisely and easily.
- The software, running as an app on the iPad. It organizes the control elements and sends the parameters to the corresponding software, which is controlled by the modular interface.



Modulares Interface from Florian Born on Vimeo.




Sources:
http://florianborn.com/projects/modulares_interface/
http://www.ItchyTastyRecords.com
http://www.geek.com/tablets/modulares-interface-adds-big-metal-knobs-and-sliders-to-an-ipad-screen-1607311/

Thursday, October 23, 2014

Dynamite Jacksin and Pairodox DJ Sets

Dynamite Jacksin Live DJ Sets









Dynamite Jacksin's Beatport Page
Dynamite Jacksin's Mixcloud Page


Pairodox Live Break-Beat DJ Sets








Pairodox's Mixcloud Page
Pairodox's SoundCloud Page 

Beat Matching on CDJs: BPM Matching % (Pitch Fader), aka the BPM Time-Stretch Calculation for Beat Mixing

Of course, every DJ should just learn how to beat-match by ear and feel out the pitch fader when using CDJs.

However, I often get asked what the exact math is: is there an equation or formula to go by? So for the most nerdy DJs out there, this post is for you, plus some simpler formulas for the less math-oriented DJs as well.


In fact, there are a couple of mathematical formulas you can go by if you want to be technical about the pitch fader percentage values for beat matching.



Formula 1:


The pitch percentage you need is:

(New BPM - Old BPM) / Old BPM x 100

where Old BPM is the track's original tempo and New BPM is the tempo you want it to play at.


So for a pitch increase (taking a 128 BPM track up to 129):

[ (129 - 128) / 128 ] x 100

+0.78 (rounded for CDJs)


And for a pitch decrease (taking a 128 BPM track down to 127):

[ (127 - 128) / 128 ] x 100

-0.78 (rounded for CDJs)


* Tip: it's easier to think of this equation as the small tempo difference in the first place, rather than subtracting big numbers. *


For example, just think of 128 to 129 as a 1 BPM difference in the first place. The same goes for 128 to 127.

(Also, multiplying any value by 100 just moves the decimal point over 2 places, no matter what the number. So that is always a given.)

So when going up 1 BPM from 128:

1 / 128 = 0.0078
0.0078 = +0.78 %

When going up 3 BPM from 128:

3 / 128 = 0.0234

+2.34 %

When going down 1 BPM from 128:

1 / 128 = 0.0078, so -0.78 %

When going down 3 BPM from 128:

3 / 128 = 0.0234, so -2.34 %


So the simplified formula, whether you're going up or down in tempo, is:

Tempo difference / Original tempo

positive when you're speeding the track up, negative when you're slowing it down.
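If you'd rather let a few lines of code do the arithmetic, here is a small Python sketch of the same calculation (the function name is just for illustration):

def pitch_percent(original_bpm, target_bpm):
    # Percentage to set on the pitch fader: positive = speed up, negative = slow down.
    return (target_bpm - original_bpm) / original_bpm * 100

print(round(pitch_percent(128, 129), 2))   # 0.78  -> speed up by one BPM
print(round(pitch_percent(128, 127), 2))   # -0.78 -> slow down by one BPM
print(round(pitch_percent(128, 125), 2))   # -2.34 -> slow down by three BPM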


Or just use one of these web sites

http://www.mikemackay.co.uk/playground/dj-pitch-calculator/

http://musiccalculator.com/#time-stretch-bpm

http://legalize.org.il/asp/bpm.htm


You Itchin to hear some Tasty Beats?
▲---------Itchy Tasty Records---------▲
http://www.ItchyTastyRecords.com


Thursday, June 26, 2014

Dynamite Jacksin - Miami (Official Music Video)




Dynamite Jacksin - Miami (Official Music Video)

This music video was entirely created by the duo Dynamite Jacksin themselves.



Get Dynamite Jacksin's Cd here - http://amzn.to/1rE6zVF

Get The Digital Album Here http://www.beatport.com/artist/dynamite-jacksin/355611



You Itchin to hear some Tasty Beats?

▲---------Itchy Tasty Records---------▲

http://www.ItchyTastyRecords.com


http://www.beatport.com/label/itchy-tasty-records/33841

http://www.facebook.com/ItchyTastyRecords

http://www.Twitter.com/ItchyTastyRec

http://www.soundcloud.com/ItchyTastyRecords

http://www.yellowpages.com/los-angeles-ca/mip/itchy-tasty-records-481382845?lid=481382845

https://www.etsy.com/shop/ItchyTastyRecords

http://amzn.to/1pnlJ40

Wednesday, June 25, 2014

Itchy Tasty Records Music and Fashion Samples

Press kit for Itchy Tasty Records Musical Releases


Press kit for Itchy Tasty Records EDM clothing line and fashion apparel

Itchy Tasty Records is a Los Angeles independent recording label

*other links*
​​http://www.ItchyTastyRecords.com
​http://www.beatport.com/label/itchy-tasty-records/33841
http://www.rdio.com/label/Itchy_Tasty_Records/
https://www.etsy.com/shop/ItchyTastyRecords
http://itchytastyrecords.bigcartel.com/​
​http://amzn.to/1pnlJ40
http://www.soundcloud.com/ItchyTastyRecords
http://itchytastyrecords.blogspot.com/
http://instagram.com/ItchyTastyRecords

Sunday, June 15, 2014

Veilless - Electronic Dark Metal

A new genre of music, Electronic Dark Metal, is about to be revealed to the world by the newly formed group Veilless. Stay tuned for their single release in July and full album release in September 2014.
Veilless

Veilless Electronic Dark Metal feat. Jason Laszlo Toth and Raymond Milco

Wednesday, June 4, 2014

How-To Guide for Changing/Configuring MIDI Channel Messages, Controls, Data, CC, Implementation, Editing, Routing, Note-On, Note-Off, and Other MIDI Parameters :)


How MIDI Models Performances
MIDI data is simply a series of number values, from 0 to 255, that allow control events to be universally understood by different hardware and software. When you press a key on a keyboard connected to your computer, the keyboard transmits a series of numbers to the computer that represent which key you played and how hard you hit it. (See the sidebar "Behind the Scenes: Anatomy of a MIDI Message" for details.) It's then up to your computer software (an electric piano plug-in, for instance) to decide what (if anything) to do with that numeric data.
It may be hard to believe that simple messages with 256 values can describe musical performances. However, by combining different messages for different events (a specific message type for notes, another for turning a certain kind of knob, and so on), MIDI is versatile enough for a surprisingly wide range of musical uses. Also, because the message structure is so simple, it's easy to understand MIDI messages and manipulate them fluidly. Most MIDI users seldom bother with the numbers. Except in a few specific situations, the numerical data is hidden from you to make your life easier. The essentials are described later in this chapter.

MIDI Specification and Implementation

Interpreting MIDI is a little like interpreting HTML on Web pages: Everything is supposed to work the same way everywhere, but your mileage may vary. Some elements are fixed (notes and pitch-bend usually work in a standard way), whereas others are more flexible. The MIDI Specification itself is published by the MIDI Manufacturers Association. When people say "MIDI," they're usually referring to the MIDI 1.0 Specification. Manufacturers use the technical documentation of this specification to insure that their MIDI hardware and software will work with other MIDI products in an error-free manner.

Essential MIDI Tool Belt

MIDI data is basic enough that once you're used to MIDI messages, the easiest way to troubleshoot is to watch the MIDI events themselves. In addition to a sequencer event list, which displays recorded MIDI as events labeled in English (with number values where needed), some utilities let you view and modify MIDI messages as they're transmitted. Tools for Windows and Mac can display incoming data in real time and even modify it on its way into your software for tasks as simple as a quick transposition or as complex as MIDI data splitting and conversion. Even beginners will want to have these free/donationware utilities on their hard drive. You'll find all four on the included DVD.
Mac OS X:
Nico Wald's MidiPipe (http://homepage.mac.com/nicowald/SubtleSoft) provides a set of simple tools for modifying MIDI in real time. With this software you can drag filters, splits, modifiers, tuners, players, and more into a "pipe" for custom MIDI setups (Figure 8.14).
Figure 8.14. MidiPipe lets you assemble custom modifiers into "pipes" of commands for real-time modification to MIDI input. It's easy to drag around the widgets you need for quick MIDI slicing and dicing.


Snoize's MIDI Monitor (www.snoize.com/MIDIMonitor) is perfect for tracking down strange MIDI problems or testing equipment. It simply displays and filters incoming MIDI messages.
Windows:
MIDI-OX (www.midiox.com) is an all-purpose MIDI utility that performs diagnostics, provides a MIDI message display, does filtering, mapping, scripting, logging, and recording, and can export MIDI data as a MIDI file. Yes, you can leave it running in the background so that brilliant idea you improvised won't be lost forever (Figure 8.15). A companion utility called MIDI Yoke lets you route MIDI between Windows apps (fairly easy on the Mac thanks to Core MIDI but not otherwise possible in most Windows software).
Figure 8.15. MIDI-OX is a general-purpose MIDI utility for Windows that does just about everything you can think of with real-time MIDI data.



Hardware usually comes with a MIDI Implementation Chart (check the back of the manual) that shows how the instrument sends and responds to different MIDI messages. Unfortunately these charts are not easy to read. Software sometimes comes with an implementation chart, but not always. If you can't find information in the manual, you'll have to resort to trial-and-error to troubleshoot MIDI problems.

Notes, Pitch, and Velocity

The MIDI messages you'll use most often are, naturally, notes. Note messages include three elements:
  • Note-on or note-off on a channel (1–16)
  • Note number (0–127), which is often used by a MIDI receiver to determine what pitch to play
  • Velocity (1–127), or how hard the note is hit. (For technical reasons, a note-on message with a velocity of 0 is interpreted as a note-off.)
The most common form of velocity is attack velocity, the velocity with which the note is hit, though some keyboards also send release velocity, indicating how quickly you let go of a note. Release velocity is not often used, since there's no such equivalent on an acoustic piano, but it can add expressivity. Keyboards whose sensors don't detect release velocity always send a release velocity value of 64.
Note-on and note-off
MIDI was developed for use with keyboards, so you'll have an easier time understanding it if you're a keyboardist, or can think like a keyboardist. When you play a keyboard, you press down on a key to make sound, and then release the key to end the sound. Accordingly, MIDI notes are divided into two events ( Figure 8.16 ):
  1. Note-on: Starts a note (press a key, and the sound begins).
  2. Note-off: Turns off a note (let go of the key, and the sound stops).
Figure 8.16. Two notes played on a keyboard produce four MIDI messages: each note has a note-on and note-off message (A), accompanied by a channel number (B), note number (C), and velocity (how hard the note was played) (D). (Shown in Subtlesoft MIDI Monitor)


 A key is pressed, producing a note number and attack velocity value.
 A key is released, producing a note number and a release velocity value.
Note-off versus a "zeroed" note-on: Sending a note-on with velocity 0 is equivalent to sending a note-off, so some devices use this message instead of a true note-off. Usually the distinction is invisible to the user.


Both note-on and note-off are accompanied by a note number value (so the receiving MIDI device knows which note to play) and velocity amount (generated by a sensor that determines how hard you hit the key).
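To make that concrete, here is a small hypothetical Python helper (not from the book or any particular library) that packs the message type and channel, note number, and velocity into the raw bytes a device would send; the byte layout itself is explained in the sidebar below.

# Status 0x90 = note-on, 0x80 = note-off; the low four bits of the status byte
# carry the channel (0-15 internally, shown to users as channels 1-16).
def note_message(on, channel, note, velocity):
    status = (0x90 if on else 0x80) | (channel - 1)
    return bytes([status, note, velocity])

print(list(note_message(True, 1, 60, 100)))    # [144, 60, 100] -> Middle C on, channel 1
print(list(note_message(False, 1, 60, 64)))    # [128, 60, 64]  -> Middle C off, channel 1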

Behind the Scenes: Anatomy of a MIDI Message

Like most forms of digital data, MIDI messages are made up of bytes. As a musician, you'll seldom need to worry about the bytes themselves. But if you know a little about how computers work, you may be curious: How can so many different types of messages be crammed into such tiny packages? And why does the number 128 keep coming up in discussions of MIDI?
First, let's cover the rock-bottom basics. Computers work with bits (short for binary digits). A bit is a number that has only two possible values: 0 or 1. A byte is a larger unit that contains eight bits. So there are only 256 possible bytes in the world: the numbers from 0000 0000 to 1111 1111. (256 is equal to 2 to the 8th power.)
The MIDI Specification defines two types of bytes: status bytes and data bytes. A status byte always begins with a 1, and a data byte always begins with a 0. Of the 256 available bytes, then, 128 are status bytes and 128 are data bytes. Thus any individual data byte can have a value between 0 and 127.
Each MIDI message consists of a status byte followed (in most cases) by one or two data bytes. The specific meaning of the data bytes depends entirely on which status byte preceded them. So with 128 possible data bytes, MIDI can represent many different things.
Most MIDI messages contain three bytes: a status byte followed by two data bytes. Here's how such a message breaks down:
  1. Status byte: Describes the kind of message and the channel for which the message is intended. Notes, control changes, and program changes are all different message types, so they use different status bytes. For instance, the status byte might indicate "here's a played note on channel 14." At this point, the note that was struck is still unknown.
  2. First data byte: The number after the status byte, which can range from 0 to 127, provides more information. If the status byte is a note-on, the first data byte indicates the note number, such as 60 for Middle C.
  3. Second data byte: The third and last number, which again can range from 0 to 127, either adds more detail to the first data byte or provides some other related piece of information. If the status byte is a note-on, the first data byte provides a note number to specify which key was struck, and the second data byte describes the velocity (speed) with which the key was struck. For instance, the complete three-byte message might specify that Middle C on channel 14 was played with a velocity value of 64, which would be a medium amount of force. Note that a few MIDI messages, notably channel pressure (defined later in this sidebar), don't have a second data byte. They're two-byte messages (one status byte and one data byte).
Inside the actual MIDI messages, all three bytes are binary numbers (strings of 0's and 1's). Since long strings of binary digits are hard to read, programmers usually work with this data using the hexadecimal numeral system, which represents numbers using six letters (A–F) and ten numbers (0–9). For instance, the decimal number 26 becomes 1A in hexadecimal. There's not much reason for you to worry about the hex codes as an end user. (For an excellent explanation of how to read hex codes if you are coding software or DIY MIDI hardware, see http://en.wikipedia.org/wiki/Hexadecimal.)
Here's an example of a MIDI message: press the note E2 and you generate three numbers:
144 52 42
The first number (status byte, 144) indicates note-on, channel 1. The second number (first data byte, 52) is the note number corresponding to E2. The third number (second data byte, 42) is the velocity, how hard the E was hit.
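Going the other way, a few lines of Python can pull that example apart again. This is just an illustrative sketch, not any particular library's API:

# Split a three-byte channel message into its type, channel, and data bytes.
def decode_channel_message(b0, b1, b2):
    kind = b0 & 0xF0           # top four bits of the status byte: message type
    channel = (b0 & 0x0F) + 1  # bottom four bits: channel 1-16
    names = {0x80: "note-off", 0x90: "note-on", 0xB0: "control change",
             0xC0: "program change", 0xE0: "pitch-bend"}
    return names.get(kind, "other"), channel, b1, b2

print(decode_channel_message(144, 52, 42))   # ('note-on', 1, 52, 42)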
Since all MIDI messages must begin with a status byte, messages can be categorized technically according to the different status bytes:
  • Note-on
  • Note-off
  • Pitch-bend
  • Key pressure (sometimes called poly aftertouch or poly pressure)
  • Channel pressure (more commonly called aftertouch; the pressure after a note is played)
  • Control change (control number and value, for one of 128 possible controllers)
  • Program change (for changing an instrument from one sound program to another)
These are all MIDI Channel messages, which means you can use them on up to 16 channels at the same time within one MIDI data stream. The remaining messages, called System messages, control an entire device or an entire music system or perform some other specialized function rather than being used to generate music. They include system-realtime, system-common, and MIDI Time Code, all of which are related to system timing synchronization, and system-exclusive. Most system-exclusive messages are defined by individual manufacturers. (MIDI Machine Control [MMC] is a type of system-exclusive data.)
System-exclusive messages can be much longer than three data bytes. By sending a "start of exclusive" byte, followed by data bytes, followed by an "end of exclusive" byte, the manufacturer of a piece of gear can make system-exclusive messages as long as needed.


Duration and "stuck notes"
MIDI has no way to represent the duration of notes: an instrument simply begins making sound when it receives a note-on message and then ends the sound shortly after it receives a corresponding note-off. This may sound limiting, but it uses a lot less data. Imagine for a moment that you're a MIDI device. If you couldn't just say, "I've pressed G" and then, somewhat later, "I've let go of G" by using note-on and note-off messages, you'd have to say something like, "I've pressed G, and now I'm still pressing G, and now I'm really still pressing G, still pressing G . . ." and so on; you'd need a lot more than just two simple messages.
If an instrument fails to receive a note-off message, it will keep producing sound indefinitely (or until the sound naturally decays, or possibly until the instrument is shut off). This phenomenon is called a stuck note . There are a variety of reasons why a MIDI device might fail to send note-off. Playback of a MIDI sequence might be stopped after a note-on has been sent but before the corresponding note-off has been sent. Most sequencers handle this situation automatically. But if the sequencer rudely crashes between the note-on and the note-off, a receiving hardware synth will get a stuck note. Most MIDI hardware and software includes an "all notes off" or panic button so you can recover from stuck notes. The panic command turns off all notes on all devices on all channels.
The MIDI hold pedal (also called a sustain or damper pedal) can also be a source of stuck notes, if for some reason a synth has received a hold pedal on message but no hold pedal off message. This can happen, for instance, if the pedal is pressed as it is connected. Try toggling your hold pedal if notes are sticking.
Note number
MIDI defines 128 note numbers, from 0 to 127; note 60 defaults to Middle C on the keyboard. Note numbers are assigned to half-steps on the keyboard, so B is 59, C# is 61, D is 62, and so on, though your software will probably display notes using musical note names instead of numbers. (Middle C is often shown as C3. Octave numbering isn't standardized in MIDI, though, so some equipment refers to Middle C as C4 instead.)
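Here is a small sketch of how software might turn note numbers into names; the octave offset is a parameter precisely because the C3/C4 convention varies. The function is hypothetical, for illustration only.

NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def note_name(number, middle_c_octave=3):
    # Map a MIDI note number (0-127) to a name, letting you pick which octave
    # number Middle C (note 60) should be displayed as.
    octave = number // 12 - (5 - middle_c_octave)
    return f"{NAMES[number % 12]}{octave}"

print(note_name(60))                      # C3 (the convention used in this chapter)
print(note_name(60, middle_c_octave=4))   # C4 (the other common convention)
print(note_name(59), note_name(61))       # B2 C#3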
Don't panic! Memorizing control change (CC) numbers isn't always necessary, but here's a really important one to have handy. To stop stuck notes, send a CC 123 (all notes off) or a CC 120 (all sound off, which also turns off the hold pedal). (See the section "Adding Expressivity.")
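A panic routine is easy to sketch from those two CC numbers. In this illustrative Python snippet, midi_send is a placeholder for whatever output call your MIDI library actually provides:

# Send All Notes Off (CC 123) and All Sound Off (CC 120) on every channel.
def panic(midi_send):
    for channel in range(16):
        midi_send(bytes([0xB0 | channel, 123, 0]))  # all notes off
        midi_send(bytes([0xB0 | channel, 120, 0]))  # all sound off

panic(lambda msg: print(list(msg)))   # demo: just print the raw bytes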


Velocity
MIDI velocity data indicates how hard a key has been struck. A velocity-sensitive device senses the speed with which a key travels downward when you strike it and sends a number from 0–127 as the attack velocity. (Some instruments also respond to the speed with which you release the note: the release velocity.)
The most obvious application of velocity is to make a sound louder when you hit a key harder. But since both the loudness and the timbre often change on real instruments as you strike them with more force, velocity may make other adjustments to the sound as well. A virtual piano plug-in, for instance, might provide samples of a real piano played at different dynamics, and trigger the appropriate recorded sample based on the incoming MIDI velocity for a more realistic sound. The nature and amount of velocity response is determined by the receiving instrument, not by the MIDI message itself.
Velocity does have one major limitation, and it comes back to the fact that MIDI models keyboard performance. On wind and string instruments, you can adjust the loudness of a note while playing it: a string player can adjust the pressure of the bow in the middle of a note, for instance. Not so on a piano: the dynamic level of the note is determined by the beginning of the note; the player can't affect the sound after that. Since MIDI note messages use only attack and release velocity, MIDI instruments tend to behave more like the piano. To add expressivity in the middle of the note, you'll need a separate message. (See "Adding Expressivity," p. 293.)
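To picture how a receiving instrument might use velocity, here is a hypothetical mapping from attack velocity to gain. Real instruments define their own response, so the curve exponent here is made up for the example:

def velocity_to_gain(velocity, curve=2.0):
    # Map attack velocity (1-127) to a gain between 0.0 and 1.0.
    return (velocity / 127.0) ** curve

print(round(velocity_to_gain(64), 2))    # about 0.25 with this made-up curve
print(round(velocity_to_gain(127), 2))   # 1.0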
Hands-on: Try playing a keyboard


We'll use SampleTank2 FREE with Ableton Live, both of which are included on the DVD, so we can set up a software instrument and try playing it. (You can use SampleTank Free with any host you have, not just Ableton Live; see the installation instructions on the DVD for details.) First, let's try setting up a keyboard and playing around a bit to see how MIDI works in practice, and to make sure we're getting sound.
  1. Verify your MIDI settings: To make sure you can receive and send MIDI in Live, check Preferences > MIDI/Sync > Active Devices. You'll see inputs and outputs listed by port; double-check that each input and output you want to use is lit green by clicking the square next to the port. (Notice that you won't see individual devices connected to those ports. So if you have a Kurzweil keyboard connected to the MIDI input of an M-Audio interface, you'll select the M-Audio interface to send MIDI to the keyboard, not the keyboard itself.) If you don't have a MIDI device available for input and you want to test your setup, select the Computer Keyboard item to use your QWERTY keyboard.
  2. Add a MIDI instrument: MIDI instruments in Live work just like effects inserts: drag them from the Device Browser to a MIDI track to use them. (If you don't have a free MIDI track, create one by selecting Insert > Insert MIDI Track.) You can use one of the built-in instruments in the Live Device Browser, but here let's try SampleTank2 FREE: drag it to your MIDI track. Then enable input by arming session recording on that track: click the arm button (for MIDI, it looks like a 5-pin MIDI port). Try playing your MIDI input or computer keyboard. You should see the MIDI track meters move as you play, and a small indicator in the upper right corner of the screen (MIDI input indicator) should light up.
  3. Edit SampleTank's settings: Even though you see the MIDI indicators light up, you won't hear any sound because you haven't yet loaded a program or patch. To see SampleTank's full interface, click the Edit Plug-in Panel button on the Device title bar. (It's the icon that looks like a wrench.)
  4. Load a program: In SampleTank's Browser, you should see all the preloaded SampleTank2 Free patches included on the DVD. (If not, check the DVD installation instructions and troubleshooting advice.) Double-click the 73 EPiano for an electric piano sound reminiscent of a classic, early '70s Rhodes. The program is automatically loaded into the active part/channel (channel 1). For an acoustic piano, try double-clicking HQ Free Piano mk II.

Make Your Playing More Musical

As you play MIDI instruments, try experimenting to make them more expressive ( especially if you're using a keyboard or other MIDI instrument).
Velocity: Notice how hitting the keys harder on the 73 EPiano changes the sound, especially in the lower register. It should change from a mellower, rounder sound to a harder-edged, slightly distorted sound as you increase your attack velocity.
Range: On many instruments, the use of different samples in different ranges allows SampleTank to more closely emulate the sound of the original instrument. On the electric piano, the upper register is more bell-like in sound. On the acoustic piano the effect is subtler, but the bass is unmistakably richer in sound, as it should be. Try playing in the different registers to take advantage of their unique timbres.
Hold pedal: If you have a MIDI hold pedal, connect it to your instrument's hold pedal jack. You won't get the same resonant sound that you get on a real acoustic piano, but by using the pedal in a pianistic way, you will be able to more closely mimic the musical sound of a traditional piano technique.


Adding Expressivity

Specifying which note a synth or other electronic instrument should play (with MIDI note number) and how hard to play it (with MIDI velocity) are a start, but to make a MIDI instrument expressive, you'll want some additional control. MIDI provides other kinds of controls, some of which have fixed definitions and produce standard types of musical effects, whereas others can be assigned more flexibly. You can add these effects to your sequencing software by drawing them in with a pencil tool or other editing features, or by using physical hardware during performance. To use these controls as you play, you'll need both a hardware controller capable of sending the appropriate message and an instrument (hardware or software) capable of receiving the message and responding to it. For instance, to control vibrato, you might use an expression pedal attached to a keyboard in conjunction with a software plug-in in which vibrato amount is assigned to the expression pedal. Your MIDI setup for keyboard playing should ideally include at least a pitch wheel and modulation wheel ("mod wheel") at the left end of your MIDI keyboard, and a pedal or two plugged into jacks at the rear of the keyboard. (Non-keyboard MIDI instruments may send similar data using other physical controls.) Even with just these few controls you can make your performances more expressive ( Figure 8.17 ).
Figure 8.17. To start adding more expression to your performance than a keyboard alone can provide, explore the uses of the pitch wheel (which sends pitch-bend data) and "mod" wheel (control change 1, modulation), as shown here on the Alesis Fusion. (Photo courtesy Alesis, Inc.)


Although pitch-bend, control change, and aftertouch are often called "continuous controllers," there's really nothing continuous about the MIDI data. Your hardware will generate a series of discrete numbers quickly enough and close enough together that the result will sound continuous to your ear as the receiving instrument gradually changes the pitch or another attribute of the sound.
Pitch-bend
Pitch-bend data allows you to move the pitch of notes smoothly up or down while they're sounding. Pitch-bend is usually controlled by a pitch wheel or some type of stick controller. In the neutral, centered position of the wheel or stick, pitch is unchanged. As you push the wheel or stick (pitch wheels most often move toward and away from the player, whereas pitch sticks move left and right), the pitch changes gradually within a specified range (typically a couple of semitones). This effect is called "bending," by analogy with the way guitarists bend their strings to pull the pitch of the string upward.
Coarse versus fine: A coarse setting is a more significant change of value, whereas the fine setting is a more detailed fraction of the same value. For example, in monetary dollars and cents, the dollar would be the "coarse" amount and the cent would be the "fine" amount. In MIDI, coarse and fine refer to amounts of values like pitch-bend.


Pitch-bend has a full data range of 0–16,383, not 0–127 (see the sidebar "This One Goes to Eleven: Beyond 128"). Since this range offers more precision than is usually needed, however, some software simply refers to the 0–127 "coarse" range for convenience.
Some music software displays pitch-bend data using the values 0–127, with a center value of 64, meaning "no change in pitch." More often you'll see it displayed from −63 to +64, with 0 as the center value. If your software displays the full data range (0–16,383), it will probably show a full downward bend as −8191 and a full upward bend as +8192, again with 0 as the center value (Figure 8.18).
Figure 8.18. Pitch-bend data is a specialized MIDI message. As you turn the wheel, you generate a series of MIDI values like the data shown on the left. With the wheel centered, the pitch is unchanged (1). Turning the wheel all the way down (2) bends the pitch downward, and turning it up (3) bends the pitch upward.


Aftertouch
Some keyboards are capable of sensing how hard the player presses down on a key after it reaches the bottom of its travel. This pressure during the middle of a note is called aftertouch (other terms that mean the same thing are key pressure and channel pressure ). On a keyboard, you generate aftertouch data by pushing down on the note after you've played it. Aftertouch is often used to add vibrato or open a synthesizer's filter for a brighter sound.
Which type of aftertouch?
Affects individual notes: Polyphonic aftertouch (also known as key pressure or polyphonic key pressure)
Affects a whole channel: Channel aftertouch (also known as channel pressure or "mono" aftertouch/pressure)


There are two types of aftertouch: channel pressure and polyphonic pressure. When a keyboard is equipped to sense and transmit channel pressure, there is only one sensor, which runs horizontally under the keys from one end of the keyboard to the other. When a keyboard is equipped to sense and transmit poly pressure, each key has its own pressure sensor, and transmits pressure data in a format that also includes key number as part of the MIDI message. Poly pressure sensors are more expensive to build, so poly pressure is not found on nearly as many keyboards. To find out whether your software or hardware instrument will respond to poly pressure (or for that matter to channel pressure), you'll have to consult the manual.
With polyphonic aftertouch, you can press down harder with one finger while holding a chord to add vibrato to that one note.
Control change messages
MIDI provides additional controls via control change (CC) data. Control change messages are numbered (you guessed it) from 0–127. Some of the controller numbers are assigned to particular tasks. A piano-style hold pedal, for instance, always transmits CC 64, whereas a modulation wheel usually transmits CC 1. CC 7 is used for master volume. Other CC numbers are left open in the MIDI spec; some MIDI devices ignore them, while others use them for different purposes (Table 8.1).
Table 8.1. Important MIDI CC Messages

Modulation (CC 1): Commonly assigned to the keyboard's mod wheel, but the nature of the sound changes produced by the modulation is not defined. The mod wheel is often used to add vibrato, change the speed of a Leslie rotary speaker simulator in an organ sound, or add a phaser effect to an electric piano (Figure 8.19).

Breath (CC 2): Seldom used with actual breath controllers, which are not common; sometimes transmitted by a joystick or a third wheel in the left-hand controller section of a keyboard. As with modulation, the nature of the sound changes produced by breath controller data is left up to the designers of the synthesizer and its sound programs.

Volume (CC 7): Controls the overall loudness of the musical part assigned to a channel. Unlike velocity, which controls one note at a time and can be mapped to different timbral qualities as well as note loudness, volume data controls the loudness of an entire channel all at once; it's essentially a volume knob for that instrument (in the case of a monotimbral instrument) or for a multitimbral part (in a multitimbral instrument). 0 is silent; 127 is the maximum.

Pan (CC 10): Left/right pan for a channel. A CC 10 message with a value of 0 pans the sound hard left, 64 pans it to the center, and 127 pans it hard right.

Expression (CC 11): Used for dynamic expression within a part by controlling a percentage of the volume setting. (In other words, setting volume with CC 7 is akin to moving a mixer fader, whereas expression data would be used for crescendos and decrescendos on a note or chord while it is being played.)

Hold pedal (CC 64): Also known as the sustain or damper pedal, the hold pedal acts (more or less) the way it does on the piano: hold it down and notes continue to sound even after their keys are released. You'll find the other two piano pedals defined in MIDI, too: the sostenuto pedal is controller 66 and the soft pedal is controller 67. Not all instruments respond to 66 and 67, but most respond to 64.


Control change or continuous controller?
Control change or CC messages in general are sometimes called continuous controllers, but this is a misleading term for two reasons. First, MIDI works only in whole numbers. You can't send a CC message with a value of 5.5 or 19.327. As a result, the "continuous" range of the data is actually stepped. Second, the CC message group includes on/off switches as well. The term "control change" or simply "controller" is preferred.


Since there are so many controllers available, you'll use some to add color to your musical performances, while others remain available for controlling other elements, like effects settings, synthesis parameters, mixer faders, or anything else you want to assign. With a hundred or more choices per device, you have a lot of flexibility.
Some controllers, like the hold pedal, act simply as on/off switches: Press the pedal and the CC 64 data value goes to 127; lift your foot and the CC 64 value drops back to 0. Other controllers are called continuous controllers because they can take advantage of the full range of values between 0 and 127. This is true of faders and knobs ( Figure 8.19 ).
Figure 8.19. The modulation wheel is a typical controller, capable of generating a stream of control change messages. As you turn the mod wheel on a Korg MS2000 keyboard (1), you generate a series of numbers from 0 (down) to 127 (up) (2), which will appear in your sequencer as a modulation contour (3). The modulation data can be assigned to a synthesis parameter like vibrato amount. (Photo courtesy Korg USA)


This One Goes to Eleven: Beyond 128

In some cases, MIDI needs more than 128 data values. Fortunately, there's an easy solution: if you can only use numbers from 0–127, use two numbers. Instead of 128 possibilities, you now have all of the combinations of two numbers from 0–127, for up to 16,384 combinations (128 x 128).
The first byte in a two-byte chunk of data is called the Most Significant Byte (MSB). It's followed by a Least Significant Byte (LSB). The standard computing terminology is a little misleading, because one isn't necessarily more important than the other; it's just a convenient means of remembering which is which.
The standard MIDI pitch-bend message has two data bytes. (Pitch-bend messages use just the status byte, the "here comes a pitch-bend!" byte, followed by two numbers for the amount, instead of just one.) This allows for the full 16,384 data levels, although some devices and software will ignore the finer resolution and just use the information in the MSB, ignoring the LSB entirely.
Control change messages have only one free data byte, because the first byte after the status byte is used to specify which controller is being transmitted. To get two data bytes, you need two control change data messages, one for the MSB (the coarse setting) and one for the LSB (the fine setting). Two data bytes are used for values that need more detail, like tuning. Many instruments don't respond to CC LSB data, but in case you run into one that does, you might like to know that the CC message containing the LSB has a CC number that's higher than the MSB message by 32. For instance, if the MSB is a mod wheel message (CC 1), the LSB will be transmitted as CC 33 (because 1 + 32 = 33).
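The MSB/LSB arithmetic itself is easy to sketch. Here is an illustrative Python version for pitch-bend, re-centered around the standard center value of 8192 (as noted above, individual programs may display the range slightly differently):

# Combine the LSB and MSB (7 bits each) into the full 0-16,383 range,
# then re-center so 0 means "no bend".
def pitch_bend_value(lsb, msb):
    raw = (msb << 7) | lsb          # same as 128 * MSB + LSB
    return raw - 8192

print(pitch_bend_value(0, 64))      # 0, wheel centered (MSB 64, LSB 0)
print(pitch_bend_value(127, 127))   # 8191, full upward bend
print(pitch_bend_value(0, 0))       # -8192, full downward bend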


For more information on the MIDI specification, see (among a number of sites):
www.midi.org
http://users.chariot.net.au/~gmarts/midi.htm
www.borg.com/~jglatt/tutr/miditutr.htm


A complete version of the MIDI 1.0 Specification is available for sale at www.midi.org.
Hands-on: Realistic-sounding reeds with mod, pitch-bend, and range


The keyboard instruments in the previous hands-on example ("Try playing a keyboard," p. 291) didn't allow you to use the modulation or pitch wheels, with good reason. When was the last time you heard a piano that could bend pitch? With an instrument like a saxophone, though, your playing will sound unrealistic if it lacks elements like vibrato and pitch-bend; it'll sound too much like it was played with a keyboard instead of the real instrument. (Photos are of the Novation X-Station, which, like many keyboards, combines pitch and mod on a single X/Y joystick. Although the physical hardware is different, the MIDI messages function exactly as on other instruments.)
To try adding some expressivity using MIDI controls, load the Alto Sax patch in SampleTank2 FREE. (Double-click Alto Sax in the Browser.)
  • Modulation: Play a sustained note; sounds a little lifeless, right? Now, gradually add a little bit of vibrato by moving the modulation wheel on your controller keyboard. Real reed instruments vary the amount of vibrato over time, so use different amounts of modulation for a more authentically organic, expressive sound. You'll find vibrato sounds more musical if you add it to some notes and not others, or add vibrato just to the end of a note. For some inspiration, listen to your favorite sax recordings and also think about how singers use vibrato. (Many sax players claim their instrument is closest to a human voice.)
  • Pitch-bend: Sax players can easily perform short "scoops" up to a note or down from a note by relaxing their jaw as they play, a technique more common in jazz than in classical playing. To create a scoop up in MIDI, shift the pitch-bend wheel just below center (or to the left of center, depending on the orientation of your pitch-bend hardware), play a note, and then release the pitch wheel. Since it's spring-loaded, it will return to center pitch on its own. For a fast scoop, let go of the pitch wheel as you hit the note. With practice, and some attention paid to recordings by real sax players, you can make this sound realistic. Scoops down use the same technique in reverse: play a note at pitch, then scoop down at the end by moving the pitch wheel just before you release the note. (To make sure the release portion of the note doesn't scoop up again, hold the pitch wheel in place while the note decays to silence.) A rough byte-level sketch of this scoop appears after this list.
  • Range: An essential way to make a virtual instrument sound realistic is to play it in the same range as the real instrument. The alto sax sounds most natural when played in the range from F below the treble clef to the F two octaves above (F2–F4).
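For the curious, here's a rough sketch of what that scoop up to a note looks like as pitch-bend data. The starting value ("just below center") and the step size are arbitrary choices for illustration; in practice you'd shape the gesture by ear with the wheel.

```python
# Sketch: a short pitch "scoop" up to a note, expressed as raw pitch-bend
# messages. 8192 is center (no bend); smaller values bend the pitch flat.
def pitch_bend(channel, value):
    lsb, msb = value & 0x7F, (value >> 7) & 0x7F
    return bytes([0xE0 | (channel & 0x0F), lsb, msb])

START = 7000    # arbitrary "just below center" value
CENTER = 8192   # wheel at rest

# Ramp back up to center in a few steps, imitating the spring-loaded
# wheel returning to its detent after the note is struck.
scoop = [pitch_bend(0, v) for v in range(START, CENTER, 200)] + [pitch_bend(0, CENTER)]
print(len(scoop), scoop[-1].hex(' '))   # ends at e0 00 40 (center)
```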


[Photo: Pitch and mod wheels on the SampleTank display]
[Photo: Optimal range]
This general advice isn't just for replicating acoustic instruments. If you've created a synthetic instrument or analog sound, think about what makes it musical. In effect, you've invented a new instrument, so it'll sound more musical if it behaves as though it were a physical instrument.

Programs and Banks

When an instrument receives a program change message, it switches to a different sound program, choosing the program from among those stored in its internal memory. (Note that some very good software synthesizers, such as those in Propellerhead Reason, don't respond to program change messages.) Like everything else in MIDI, program change messages are assigned a number between 0 and 127. Program changes can be transmitted directly from the front panel of most keyboards, which is useful in live performance. They can also be stored in a sequencer track to ensure that the instrument being played by that track makes the correct sound each time the sequence is played. Program changes are often inserted at the beginning of every MIDI track in a song for precisely this reason.
Program change messages are just numbers, however, with no particular meaning. Program 12 in one synth might be a flute sound, whereas program 12 in another synth might be a distorted electric guitar. As a result, if you create a track in a MIDI sequencer and add a program change to the track, you're likely to run into problems if you should later decide to send that track to a different MIDI instrument for playback. More than likely, the sound of the new instrument will be incorrect.
The General MIDI (GM) format, found on many hardware synths and a handful of specialized software synths, is a way of dealing with this problem. If an instrument has the GM logo on its panel (or can be switched to a GM mode), then the result of sending it a program change message with a given value becomes predictable. Program 7, for instance, will always be a harpsichord, and program 36 will always be a fretless bass. Although this is genuinely useful, General MIDI is designed more for consumer music applications such as playing prerecorded arrangements of popular tunes on a home keyboard than for serious musicians. With the explosion in the variety of software and hardware available and a wide variety of instrument libraries, you'll find most instruments use their own patch numbering scheme.
Sound programs are organized into banks, so that instruments can include more than 128 sounds and so that patches can be organized into useful categories. (You might find an instrument bank and a percussion bank, for instance.)
To change programs, you'll use the program change message and select a patch (program) number from 0 to 127. At the time when MIDI was first developed (the early 1980s), nobody imagined that synthesizers would ever have enough memory to store hundreds of sound programs or that musicians would have a need for so many programs. So the number of possible program change messages is limited. To switch from one bank of sounds to another, the bank select message must be used. Bank select was grafted onto (or spliced into) the control change message area. A CC 0 message gives the coarse (MSB) value for the bank, and a CC 32 message provides the fine (LSB) value. Thus, up to 16,384 banks of 128 sounds each can be chosen via MIDI, which ought to be enough to make anybody happy. Unfortunately, various instruments respond to CC 0 and CC 32 messages in different ways. You may need to experiment and read the synth's manual in order to select the desired program. To get to, let's say, sound bank H, one synth might use a CC 0 value of 7, while another uses a CC 0 value of 0 followed by a CC 32 of 7.
To change from a patch in the current bank to a patch in another bank, you'll need to send two messages: first send a bank select message, then send a program change. (You have to send the messages in that order, because an instrument waits to change to the new bank until it receives the next program change number. So even if you're changing from bank 1, patch 1 to bank 2, patch 1, on most instruments you'll need to send a bank select followed by a program change.)
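As a byte-level sketch of that two-step sequence (bank select, then program change), here's a small Python example. The bank and program numbers are arbitrary, and, as noted above, how a given synth interprets the CC 0/CC 32 values is up to its manufacturer.

```python
# Sketch: switching to a patch in another bank - bank select (CC 0 / CC 32)
# first, then the program change that actually triggers the switch.
def control_change(channel, controller, value):
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

def program_change(channel, program):
    # Program change carries a single data byte (0-127).
    return bytes([0xC0 | (channel & 0x0F), program & 0x7F])

def select_patch(channel, bank_msb, bank_lsb, program):
    return (control_change(channel, 0, bank_msb)      # bank select, coarse
            + control_change(channel, 32, bank_lsb)   # bank select, fine
            + program_change(channel, program))

# Arbitrary example values; how a synth maps CC 0/CC 32 to its banks varies.
print(select_patch(0, 0, 7, 12).hex(' '))   # b0 00 00 b0 20 07 c0 0c
```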
Program numbers: 127 or 128? Confusingly, some manufacturers number MIDI messages 1–128 instead of 0–127.


System Messages

MIDI also uses system messages. These messages apply to the entire music system that's interconnected using MIDI, not just to individual instruments. The three types of system messages are system-realtime, system-common, and system-exclusive.
The most important system-realtime messages are used for synchronization among devices (sequencers, for example) that engage in real-time recording or playback. In this category are clock messages (sent 24 times for each quarter-note), start, stop, and continue. You can even use MIDI sync data to synchronize sequencers running on two or more computers by connecting them with MIDI cables. (See Chapter 12 for details.)
System-common messages include song select and song position pointer (again, for use with sequencers) and a few other kinds of messages. But the best known and yet most mysterious of MIDI's system messages is system-exclusive, also known as sys-ex or SysEx.
SysEx
System-exclusive messages are designed to address specific hardware devices. These messages are a "back door" through which each manufacturer can create whatever MIDI features a particular instrument may need. The beginning of a SysEx message specifies that the data is intended for a specific model built by a specific manufacturer. All of the other MIDI gear in the system should ignore a SysEx message that's not addressed to it. What SysEx is used for is entirely dependent on the equipment's manufacturer.
One advantage of SysEx messages is that they aren't limited to a specific length, because they simply use a "stop" (also called EOX, or "end-of-exclusive") message to indicate that the message has concluded. Since SysEx can involve a lot of data, it's usually used for initializing a device, downloading an updated operating system from the computer to a piece of hardware, or sending other large chunks of information. Data is "dumped" to a device via SysEx in a single large chunk, rather than by sending values for individual parameters during a performance as with channel messages like notes and controllers. That said, some manufacturers use SysEx messages instead of channel messages for real-time controls, and specialized SysEx messages called MIDI Machine Control and MIDI Show Control (see sidebar) allow for real-time commands not possible in the channel message area of MIDI.
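To make the framing concrete, here's a minimal sketch of the general shape of a SysEx message: a 0xF0 start byte, a manufacturer ID, the manufacturer-defined payload, and the 0xF7 end-of-exclusive byte. The payload below is a made-up placeholder; 0x7D is the ID reserved for non-commercial use, and real manufacturer IDs are one or three bytes.

```python
# Sketch: the general shape of a System Exclusive message - 0xF0 start byte,
# manufacturer ID, manufacturer-defined data, then 0xF7 (EOX).
def sysex(manufacturer_id, payload):
    return bytes([0xF0, manufacturer_id]) + bytes(payload) + bytes([0xF7])

msg = sysex(0x7D, [0x01, 0x02, 0x03])   # 0x7D: "non-commercial" ID; payload is a placeholder
print(msg.hex(' '))                     # f0 7d 01 02 03 f7
```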
MIDI Time Code and MIDI clock
Timing information is essential to synchronizing playback systems such as sequencers when they're running on different computers or hardware instruments. When synchronization is used, one device or piece of software is designated the master clock source (Figure 8.20), and any other hardware or software should follow the clock signals coming from the master. (In common parlance, the other devices are called slaves.) MIDI provides two methods of synchronization: MIDI clock and MIDI Time Code (MTC).
Figure 8.20. MIDI sync settings in Ableton Live let you synchronize the tempo and start/stop times of multiple devices.


MIDI Machine Control/MIDI Show Control

A specialized form of SysEx message, MIDI Machine Control (MMC) is designed for remotely controlling recording and playback systems, and is used by devices like the Tascam DA-88. MMC includes more sophisticated commands than the usual system-realtime start/stop/continue commands. MMC has been expanded into a protocol called MIDI Show Control (MSC), intended for use with other equipment like lighting and special effects. MSC can even be found powering the multimedia elements of rides at Disney World.


MIDI clock is the simpler of the two types. It's sent as a regular, repeating message, 24 times for each quarter-note of the master clock source; thus, it acts like a virtual metronome. Receiving devices count the clock messages being received. After 24 clocks, they should reach the next beat. If the tempo of the master slows down or speeds up, any slaves that are synced using MIDI clock should also slow down or speed up at the same rate. MIDI clock provides tempo-dependent synchronization.
When playback starts not at the beginning of the song but somewhere in the middle, the master should also send out a message called song position pointer (SPP), which tells the slave devices where to start playback. The slave will advance its transport to the point indicated by the song position pointer and then wait for a start message to begin playback.
You can use MIDI clock to synchronize applications running on multiple computers. Since sequencers are capable of syncing tempo and song position via MIDI clock, all you have to do is connect the computers via MIDI, choose one program as the master clock source, make sure it's sending sync information to the correct MIDI output, switch the other program into external clock mode, and make sure it's receiving its sync signal from the appropriate MIDI input. You don't have to be using MIDI in any other way for this technique to be useful: tempo-synced audio effects, for example, will automatically synchronize, making this solution perfect for collaboration or performance. (See Chapter 13 for more on real-time performance.)
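A small sketch of the arithmetic involved: the spacing of clock messages at a given tempo, and a song position pointer message built from a count of sixteenth-notes (SPP values count "MIDI beats" of six clocks each, split into LSB and MSB).

```python
# Sketch: MIDI clock spacing at a given tempo (24 clocks per quarter-note)
# and a song position pointer message built from a sixteenth-note count.
def clock_interval_seconds(bpm):
    # Seconds between consecutive clock (0xF8) messages.
    return 60.0 / bpm / 24.0

def song_position_pointer(sixteenths):
    # SPP counts "MIDI beats" (sixteenth-notes, six clocks each) as a
    # 14-bit value sent LSB first after the 0xF2 status byte.
    lsb, msb = sixteenths & 0x7F, (sixteenths >> 7) & 0x7F
    return bytes([0xF2, lsb, msb])

print(round(clock_interval_seconds(120), 4))    # 0.0208 s per clock at 120 BPM
print(song_position_pointer(16).hex(' '))       # f2 10 00 -> start of bar 2 in 4/4
```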
For clock synchronization that is not tempo-dependent but is based on actual minutes, seconds, and film/video frames, you'll use MTC. MTC uses standard SMPTE (Society of Motion Picture and Television Engineers) time references, in hours, minutes, seconds, and frames, and it's configurable to different frame rates for different media (film, video, etc.). (See Chapter 12 for more information on working with film and video.) Even if you're not using film or video, MTC can be useful for handling time in nonmusical increments.
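And a tiny sketch of the kind of frame arithmetic that SMPTE-style time implies; drop-frame formats need extra handling and are ignored here, and 25 fps is just an example rate.

```python
# Sketch: converting an MTC/SMPTE-style time (hours:minutes:seconds:frames)
# into an absolute frame count at a given frame rate (drop-frame ignored).
def smpte_to_frames(hours, minutes, seconds, frames, fps=25):
    return ((hours * 60 + minutes) * 60 + seconds) * fps + frames

print(smpte_to_frames(0, 1, 30, 12))   # 1 min 30 s and 12 frames -> 2262 frames
```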

*Sources*


http://flylib.com/books/en/1.500.1.41/1/
http://www.ItchyTastyRecords.com