The universal midi bus for sound applications is not finished yet. It is on the way (I think all developers agree that the project needs to be done), but it will probably take some time. (See below.)
To have something to play around with, and to test the stuff, I have written midisend. It's a program that simply takes whatever arrives on /dev/midi00 and sends it to aRts.
It requires that
It is also capable of sending from other devices, when you use

midisend /dev/some_other_device
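To give an idea of what a midisend-like program has to do with the raw bytes it reads from /dev/midi00, here is a small hand-written sketch of the note event decoding. All names here are invented for illustration; this is not the actual midisend source:

```cpp
// Sketch only: decode one three-byte midi voice message into a note
// event, the way a midisend-like program would before handing the
// event to the synthesizer.

struct NoteEvent {
    bool on;               // true = note on, false = note off
    unsigned char channel; // 0..15
    unsigned char note;    // 0..127
    unsigned char volume;  // velocity, 0..127
};

// Returns true if buf contained a note event.  A note-on with
// velocity 0 counts as note-off, as the midi specification requires.
bool decodeNoteEvent(const unsigned char buf[3], NoteEvent &ev)
{
    unsigned char status = buf[0] & 0xF0;

    if (status != 0x80 && status != 0x90)   // note off / note on only
        return false;

    ev.channel = buf[0] & 0x0F;
    ev.note    = buf[1];
    ev.volume  = buf[2];
    ev.on      = (status == 0x90 && buf[2] > 0);
    return true;
}
```

The real program of course also has to deal with running status and the other midi message types; this only shows the basic idea.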
All in all, you can then play on an external keyboard and listen to the sound aRts generates. On my system I have not noticed any latency problems; it just feels like playing a normal keyboard.
Finally, you can use Cantor and synthesize its output with aRts. It is alpha, incomplete and just a first step, but it works. So it's proven: we can do that.
Other sequencers might follow; contact me if you are writing one and want it to have midibus support.
Quickstart:
Voilà, some sound... how it sounds is of course a question of the synthesis model you use. If you like, you can start midisend as well, to do some nice live improvisation on your keyboard while africa.mid is playing... and if it sounds really great, make an mp3 and send it to me, so we can show off on the homepage how nice aRts is ;)
BTW: Thanks Paul, Cantor is really a great aid now, and will become more and more important for further development. It will also make it possible to really start composing with aRts in some time.
BTW2: Arts and Cantor are available from the KDE CVS in the kmusic module.
aRts is already distributed code. This means you have
There is even a non-GUI program, artsshell, that can make the synthesizer (among other things) start its synthesis without a GUI. (So if you don't have KDE, you can still execute synthesis models, but it's of course less fun without being able to edit them.)
What you have with aRts is:
 ___________
|           |   user interaction
| process 2 | <==============>  User
|___________|      [X11/Qt]
     |^
     || [Corba] (synth.idl)
     v|
 ___________
|           |     audio data
| process 1 | ===============>  OSS-Free
|___________|    [device IO]
Now, what midisend basically does is add another process (process 3), which sends midi data into the synthesizer (using CORBA as well):
 ___________
|           |   user interaction
| process 2 | <==============>  User
|___________|      [X11/Qt]
     |^
     || [Corba] (synth.idl)
     v|
 ___________
|           | audio data (/dev/dsp)
| process 1 | ===============>  OSS-Free
|___________|    [device IO]
     ^
     | [Corba] (midibus.idl)
     |
 ___________
|           |     midi data
| process 3 | <==============  OSS-Free
|___________|    [device IO]
 _____________       _____________       ________________
|             |     |             |     |                |
|  ksynth (1) |     |  sequencer  |     | io to OSS/free |
|_____________|     |_____________|     |________________|
      ||                  ||                   ||
======================================================================
    midi bus (midi events, requests for clients, midi mapping,...)
======================================================================
 _____________       _____________       ________________
|             |     |             |     |                |
|  ksynth (1) |     |  sequencer  |     | io to OSS/free |
|_____________|     |_____________|     |________________|
       ^                   ^                   ^
       |                   |                   |
       +-----------+       |       +-----------+    CORBA i/o
                   |       |       |
                   v       v       v
               _________________
              |                 |
              | midi bus server |
              |_________________|
Note that every box here is a separate process, and that you can plug in new processes as you like (e.g. a virtual keyboard on the screen that lets you send midi events by clicking on it).
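The plug-in idea behind the diagram can be sketched in plain C++ (no CORBA, all names invented for illustration): the bus server just keeps a list of connected clients and forwards every midi event to each of them.

```cpp
#include <cstddef>
#include <vector>

// Illustrative sketch of the midi bus fan-out, not the real code.

struct MidiEvent {
    int channel;
    int note;
    int volume;   // 0 means note off
};

// Anything that plugs into the bus (synthesizer, sequencer, virtual
// keyboard, ...) looks like this to the bus server:
class MidiBusClient {
public:
    virtual ~MidiBusClient() {}
    virtual void processEvent(const MidiEvent &e) = 0;
};

// The bus server itself: clients register once, and every incoming
// event is forwarded to all of them.
class MidiBusServer {
public:
    void addClient(MidiBusClient *c) { clients.push_back(c); }

    void sendEvent(const MidiEvent &e) {
        for (std::size_t i = 0; i < clients.size(); ++i)
            clients[i]->processEvent(e);
    }

private:
    std::vector<MidiBusClient *> clients;
};

// A trivial client that just counts the events it receives:
class CountingClient : public MidiBusClient {
public:
    CountingClient() : count(0) {}
    void processEvent(const MidiEvent &) { ++count; }
    int count;
};
```

In the real design each client is a separate process and the forwarding happens over CORBA, but the registration/fan-out structure is the same.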
Here are the idl files you might want to have a look at:
synth.idl (which is not so interesting for this discussion), and midibus.idl:
interface MidiChannel {
    oneway void noteOn(in octet channel, in octet note, in octet volume);
    oneway void noteOff(in octet channel, in octet note);
};

// This of course doesn't look finished ;) where are parameters, such as
// echo depth, where is midi mapping, where is instrument changing, where
// is...
//
// I hope to complete that later.
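On the C++ side, an IDL interface like this essentially becomes an abstract class. The real stubs and skeletons are generated by the idl compiler and look different in detail, so the following hand-written sketch is illustrative only:

```cpp
// Rough hand-written picture of what midibus.idl means for a C++
// implementation; not actual generated CORBA code.

typedef unsigned char octet;

class MidiChannel {
public:
    virtual ~MidiChannel() {}
    // oneway operations: the caller does not wait for a reply,
    // which matters for low-latency midi event delivery.
    virtual void noteOn(octet channel, octet note, octet volume) = 0;
    virtual void noteOff(octet channel, octet note) = 0;
};

// A toy implementation that just remembers the last note, so the
// interface can be exercised without a running synthesizer:
class TestChannel : public MidiChannel {
public:
    TestChannel() : lastNote(0), playing(false) {}
    void noteOn(octet, octet note, octet) { lastNote = note; playing = true; }
    void noteOff(octet, octet note) { if (note == lastNote) playing = false; }
    octet lastNote;
    bool playing;
};
```

A synthesizer process would implement this interface as a CORBA servant; midisend (or a virtual keyboard) would hold a reference to it and call noteOn/noteOff for each decoded event.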