The universal midi bus for sound applications is not finished yet.
It is on the way (I think all developers agree that the project has to
be done), but it will likely take some time. (See below.)
To have something to play around with, and to test things, I have
written midisend. It's a program that simply takes what arrives on
/dev/midi00 and sends it to aRts.
It requires that
- the synthesizer is running
- there is a structure inside the synthesizer that processes the midi
events (it will work without one, but you won't hear anything)
It is also capable of sending from other devices; just start it as
midisend /dev/some_other_device.
All in all, you can then play on some external keyboard and listen to the
sound aRts generates. On my system I have not noticed any latency problems;
it just feels like playing on a normal keyboard.
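If you are curious what such a forwarder has to do, here is a minimal
sketch (not the actual midisend source, which may well differ): it opens
the raw midi device, decodes note on/off messages, and would then hand
them to the synthesizer over CORBA using the MidiChannel interface shown
further below. For brevity this sketch only prints the decoded events.

  // Minimal sketch of a midisend-like forwarder (not the real midisend code).
  // It reads raw midi bytes from the device given on the command line
  // (default /dev/midi00) and decodes note on/off messages. A real forwarder
  // would pass them on to the synthesizer via CORBA instead of printing them.
  #include <cstdio>
  #include <fcntl.h>
  #include <unistd.h>

  int main(int argc, char **argv)
  {
      const char *device = (argc > 1) ? argv[1] : "/dev/midi00";
      int fd = open(device, O_RDONLY);
      if (fd < 0) { perror(device); return 1; }

      unsigned char status = 0, data[2], byte;
      int have = 0;

      while (read(fd, &byte, 1) == 1) {
          if (byte & 0x80) {            // a status byte starts a new message
              status = byte;
              have = 0;
              continue;
          }
          data[have++] = byte;          // data byte (running status keeps 'status')
          if (have < 2)
              continue;
          have = 0;

          int channel = status & 0x0f;
          if ((status & 0xf0) == 0x90 && data[1] > 0)           // note on
              printf("noteOn  channel=%d note=%d volume=%d\n",
                     channel, data[0], data[1]);
          else if ((status & 0xf0) == 0x80                      // note off
                || ((status & 0xf0) == 0x90 && data[1] == 0))   // (or velocity 0)
              printf("noteOff channel=%d note=%d\n", channel, data[0]);
          // everything else is ignored in this sketch
      }
      close(fd);
      return 0;
  }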
The reincarnation of kooBase, Brahms, has midi bus support now. It works
quite well, and can be used together with other midi output ports (such
as the conventional SoundBlaster output).
Brahms is also likely to support more interaction with aRts than just
midi. Playing wave tracks is already implemented; hard disk recording
and other features might follow.
As a brief summary, we intend to make Brahms/aRts as fully featured as
CuBase/VST, with the relation between Brahms and aRts being about
the same as the relation between CuBase and VST.
To get an aRts-enabled Brahms (it will also compile without aRts), first
install aRts and then Brahms. You can force Brahms to build aRts support
with --enable-arts, but normally it should autodetect that aRts is there.
Cantor, another Qt-based sequencer, also has midibus support, but Brahms
currently seems to be the more usable choice for musicians.
Other sequencers might follow; contact me if you are writing one and want
to have midibus support.
The following steps work for both Brahms and Cantor (simply substitute
Cantor for Brahms if you are trying Cantor):
- compile and install this aRts release
- get a Brahms version with midi bus support; every recent Brahms should
have that (it should be available at
http://linux.twc.de/arts)
- compile it; normally it should detect that you have aRts installed and
build midibus support automatically
- install it
- start the synthesizer (if you have not given artswrapper root privileges,
start artsserver as root, otherwise your realtime output won't be realtime
and things get noisy)
- start artsbuilder
- use File/Retrieve server to get some usable midi mapping structure,
like example_mixer_eqfx
- start it using File/Execute Structure
- click on the add button to add an instrument (or more than one); note
that there are somewhat conflicting ideas about how midi channels are
numbered: aRts starts counting at 1, Brahms starts counting at 0. So if
you, for instance, assigned a drum map to channel 10 in aRts, it will be
channel 9 as seen from Brahms.
- start Brahms (or cantor from the examples subdirectory of cantor)
- compose (or load a file, or similar)
- set the output device of the tracks you want to be played by
aRts to midibus
- hit the play button
Voilà, some sound... what it sounds like is of course a question of the
synthesis model you use. If you like, you can start midisend
as well, to do some nice live improvisation on your keyboard while you
are using the sequencer ... so now you can compose a nice demosong, and
if it sounds really great, make an mp3 and send it to me, so we can show
off how nice aRts is on the homepage ;)
BTW: Thanks to the two authors, Jan (Brahms) and Paul (Cantor); both programs
have really helped me a lot during aRts development.
BTW2: aRts and Cantor are available from the KDE CVS in the kmusic module;
Brahms will probably follow soon.
aRts is already distributed code. This means you have
1. a gui that is used to design which modules in aRts are connected to
which others
2. the synthesizer (running in a different process), which does the
synthesis; it is capable of building an arbitrary structure of
interconnected synthesis modules and executing it
Processes 1 and 2 communicate all the time. In fact, process 1 can't exist
without process 2, while process 2 can exist without process 1.
There is even a non-gui program, artsshell, that can (among other things)
make the synthesizer start its synthesis without a gui. (So if you don't
have KDE, you can still execute synthesis models, but it's of course less
fun without being able to edit them.)
What you have with aRts is:
Now, what midisend basically does is add another process (process 3)
that sends midi data into the synthesizer (using CORBA as well):
So, what I'd like to have, logically, is this:
Physically (the way you would implement it), it would perhaps look like this:
Note that every box here is a separate process, and that you can plug in
new processes as you like (e.g. a virtual keyboard on the screen that
lets you send midi events by clicking on it).
Here are the idl files you might want to have a look at:
arts.idl (which is not so interesting for this discussion now) and
midibus.idl:
interface MidiChannel {
  oneway void noteOn(in octet channel, in octet note, in octet volume);
  oneway void noteOff(in octet channel, in octet note);
};
// This of course doesn't look finished ;) where are parameters, such as
// echo depth, where is midi mapping, where is instrument changing, where
// is...
//
// I hope to complete that later.
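To give an idea how a client talks to this interface (be it midisend, a
sequencer, or a virtual keyboard process like the one mentioned above),
here is a rough sketch using the standard CORBA C++ mapping. How the
MidiChannel object reference is obtained is an assumption here (a
stringified IOR passed on the command line); the real programs may
resolve it differently.

  // Rough sketch of a midibus client using the standard CORBA C++ mapping.
  // How the MidiChannel reference is obtained is an assumption (here a
  // stringified IOR on the command line); midisend/Brahms may do it differently.
  #include "midibus.h"   // stub header generated from midibus.idl (name may differ)
  #include <unistd.h>

  int main(int argc, char **argv)
  {
      CORBA::ORB_var orb = CORBA::ORB_init(argc, argv);

      // argv[1] is assumed to hold the stringified reference of the server object
      CORBA::Object_var obj = orb->string_to_object(argv[1]);
      MidiChannel_var midi = MidiChannel::_narrow(obj);

      // play middle C on channel 0 for half a second
      midi->noteOn(0, 60, 100);
      usleep(500000);
      midi->noteOff(0, 60);

      return 0;
  }

The _var types and _narrow come from the stubs that the idl compiler
generates from midibus.idl.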
There is currently ongoing work which will probably lead to another standard.
The idea of the midibus is quite ok, but one problem is that it doesn't
guarantee that the midi events arrive just in time. For realtime data
(live performance) there is probably no way to guarantee that anyway, but
for sequenced data this is not optimal.
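Purely as an illustration of the problem (this is not part of midibus.idl,
and not what the coming standard will look like): for sequenced data you
would want each event to carry a time stamp, so a sequencer can deliver
events ahead of time and the synthesizer can schedule them exactly, instead
of relying on each noteOn call arriving at the right moment.

  // Illustration only: not midibus.idl and not the planned standard.
  // A time stamp per event would let a sequencer hand over events early,
  // leaving the exact timing to the synthesizer.
  #include <cstdio>

  struct TimedNoteOn {
      double when;            // seconds from the start of playback
      unsigned char channel;
      unsigned char note;
      unsigned char volume;
  };

  int main()
  {
      // a sequencer could deliver a whole block like this ahead of time
      TimedNoteOn events[] = { { 0.0, 0, 60, 100 },
                               { 0.5, 0, 64, 100 },
                               { 1.0, 0, 67, 100 } };
      for (unsigned i = 0; i < sizeof(events) / sizeof(events[0]); i++)
          printf("at %.1fs: noteOn channel=%d note=%d volume=%d\n",
                 events[i].when, events[i].channel, events[i].note,
                 events[i].volume);
      return 0;
  }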
Another point is that the standard gives us no easy way to move towards
hard disk recording, which is probably interesting anyway.
We will be working on that issue, and on the aRts CORBA interface. Stay tuned.