The universal MIDI bus for sound applications is not finished yet. It is on the way (I think all developers agree that the project needs to be done), but it will probably take some time. (See below.)
To have something to play around with, and to test things, I have written midisend. It's a program that simply takes whatever arrives on /dev/midi00 and sends it to aRts.
It requires that
It is also capable of sending from other devices, if you use

    midisend /dev/some_other_device
All in all, you can then play on an external keyboard and listen to the sound aRts generates. On my system I have not noticed any latency problems; it just feels like playing a normal keyboard.
Finally, you can use Cantor and synthesize its output with aRts. It is alpha, incomplete, and only a first step, but it works. So it's proven: we can do that.
Other sequencers might follow; contact me if you are writing one and want midibus support.
Quickstart:
Voilà, some sound... what it sounds like is of course a question of the
synthesis model you use. If you like, you can start midisend
as well, to do some nice live improvisation on your keyboard while
africa.mid is playing... and if it sounds really great, make an mp3
and send it to me, so we can show off on the homepage how nice aRts is ;)
BTW: Thanks Paul, Cantor is really a great aid now, and will become more and more important for further development. It will also make it possible to really start composing with aRts before long.
BTW2: aRts and Cantor are available from the KDE CVS in the kmusic module.
aRts is distributed code already. This means you have
There is even a non-GUI program, artsshell, that can make the synthesizer (among other things) start its synthesis without a GUI. (So if you don't have KDE, you can still execute synthesis models, though it's of course less fun without being able to edit them.)
What you have with aRts is:
Now, what midisend basically does is add another process (process 3), which sends MIDI data into the synthesizer (also using CORBA):
So, what I'd like to have is (logically) that:
Physically (the way you would implement it), it might look like this:
Note that every box here is a separate process, and that you can plug in new processes as you like (e.g. a virtual keyboard on the screen that lets you send MIDI events by clicking on it).
Here are the IDL files you might want to have a look at: synth.idl (which is not so interesting for this discussion) and midibus.idl:
    interface MidiChannel {
        oneway void noteOn(in octet channel, in octet note, in octet volume);
        oneway void noteOff(in octet channel, in octet note);
    };
    // This of course doesn't look finished ;) where are parameters, such as
    // echo depth, where is midi mapping, where is instrument changing, where
    // is...
    //
    // I hope to complete that later.
There is currently ongoing work which will probably lead to another standard. The idea of the midibus is quite okay, but one problem is that it doesn't guarantee that MIDI events arrive just in time. For realtime data (live performance) there is probably no way to guarantee this anyway, but for sequenced data this is not optimal.
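One possible direction for sequenced data, sketched here purely as an illustration (this interface and all its names are hypothetical, not part of midibus.idl or any planned standard), would be to attach a time stamp to each event so the synthesizer can schedule it itself instead of relying on just-in-time delivery:

```idl
// Hypothetical, for illustration only: timestamped note events.
interface TimedMidiChannel {
    // time is, say, microseconds on a clock shared by both processes
    oneway void noteOnAt(in long long time, in octet channel,
                         in octet note, in octet volume);
    oneway void noteOffAt(in long long time, in octet channel,
                          in octet note);
};
```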
Another point is that the standard gives us no easy way to move towards harddisk recording, which is probably interesting anyway.
So there will probably be another standard, also CORBA based, built by the makers of aRts, Cantor and kooBase together, which will try to combine the best of all. Stay tuned.