aRts is built on a number of concepts. Some of them are new in aRts-0.3.0, others
have been there for quite a while now.
The important part is that you will probably understand better how
midi synthesis works if you know the concepts that drive aRts. You should
also be able to do more with aRts knowing how it works.
The idea of aRts is that synthesis can be done using small modules, which
each do only one thing, and then recombining them into complex structures. The
small modules normally have inputs, where they receive signals or parameters,
and outputs, where they produce signals.
One module (Synth_ADD), for instance, just takes the two signals at its inputs
and adds them together. The result is available as an output signal. The places
where modules provide their input/output signals are called ports.
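As an illustration only (this is not aRts code, just a sketch of the module-and-port idea), a hypothetical Synth_ADD analogue could look like this:

```python
# Illustrative sketch of the module/port model - not actual aRts code.
# A module reads signals from its input ports and writes the result to
# its output ports; here, addition is the module's one job.

class SynthAdd:
    """Hypothetical analogue of Synth_ADD: out = in1 + in2, sample by sample."""

    def process(self, in1, in2):
        # Sum the two input signals and expose the result on the output port.
        return [a + b for a, b in zip(in1, in2)]

adder = SynthAdd()
mixed = adder.process([1, 2], [3, 4])  # [4, 6]
```

A real module would run continuously inside the synthesis engine; this sketch only shows the one-input-one-output idea.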
A structure is a combination of connected modules, some of which may have
parameters coded directly into their input ports, others which may be connected,
and others which are not connected at all.
What you can do with ArtsBuilder is describe structures. You describe
which modules you want to be connected with which other modules. When you
are done, you can save that structure description to a file, or tell aRts
to create the structure you described (Execute).
Then, if you did everything the right way, you'll probably hear some
sound.
If you like, you can give your structure a name, which can be used
to reference it (use File/Rename in ArtsBuilder).
Then, you can choose to publish your structure description on the aRts
server. What happens is that a copy of your description is put
into aRts and made available to other "users".
The purpose is to have modules which can create structures as they are
needed. For instance, you may design a structure which synthesizes a bass
drum.
Then, some midi handler could create such a structure whenever a certain
midi event is sent to the synthesizer. On every incoming midi event
you would then hear your synthesized bass drum.
Of course this requires that your audio data is mixed with the other
audio output and played. It also requires that your structure knows
when to go away, otherwise you would have more and more structures being
synthesized.
And finally, you should have some way of getting more data about the midi
event that arrived, such as volume or frequency, if you want to synthesize
"real" instruments as well.
The next points deal with these problems.
Busses are dynamically built connections that transfer audio. Basically,
there are some uplinks and some downlinks. All signals from the uplinks
are added and sent to the downlinks.
Busses as currently implemented operate in stereo, so you can only transfer
stereo data over busses. If you want mono data, transfer it over one
channel only and set the other to zero.
What you need to do is create one or more Synth_BUS_UPLINK objects
and give them the name of the bus they should talk to (e.g. "audio" or
"drums"). Simply feed the data in there.
Then, you'll need to create one or more Synth_BUS_DOWNLINK objects and
give them the bus name ("audio" or "drums" ... if it matches, the data
will get through), and the mixed data will come out again.
The uplinks and downlinks can reside in different structures; you can
even have different ArtsBuilders running, start an uplink in one
and receive the data in the other with a downlink.
What is nice about busses is that they are fully dynamic. Clients can
plug in and out on the fly. There should be no clicking or noise as
this happens.
(Of course, you should not unplug a client while it is playing a signal,
since its level will probably not be zero when it is unplugged from the
bus, and then it will click.)
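The summing behaviour described above can be sketched as follows (an illustration, not aRts code; the names Bus, uplink and downlink are made up for this example):

```python
# Illustrative sketch of the bus idea - not actual aRts code.
# All uplink signals on a named bus are summed, and the mix is what
# every downlink registered under the same name receives.

class Bus:
    def __init__(self):
        self.uplinks = []  # each uplink contributes one block of samples

    def mix(self):
        # Sum all uplink blocks sample by sample.
        if not self.uplinks:
            return []
        length = len(self.uplinks[0])
        return [sum(block[i] for block in self.uplinks) for i in range(length)]

buses = {}

def uplink(name, block):
    """Feed one block of samples into the named bus (created on demand)."""
    buses.setdefault(name, Bus()).uplinks.append(block)

def downlink(name):
    """Receive the mixed signal of the named bus; silence if nothing uplinked."""
    return buses[name].mix() if name in buses else []

uplink("drums", [5, 0, 5, 0])
uplink("drums", [1, 1, 1, 1])
# downlink("drums") now yields the sum of both uplink blocks: [6, 1, 6, 1]
```

The dynamic part is the `setdefault`: any client can join or leave a bus at any time, just by uplinking or no longer uplinking.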
Now the second point for advanced instruments is how to get rid of structures
automagically.
There is a Synth_STRUCT_KILL object for that purpose, which removes
the structure as soon as it gets an input signal > 0.5 (read: 1).
Of course, other modules have been tuned to generate such signals in
reasonable situations:
both Synth_ENVELOPE_ADSR and Synth_PLAY_WAV have an output parameter called
done, which is set to one as soon as they are finished.
This means that if you want an instrument (e.g. a bass drum) which is just a
wav, it will probably be enough to have a Synth_PLAY_WAV which plays
the wav, and then connect the done parameter to a Synth_STRUCT_KILL.
The structure will be removed as soon as the wav has been played (and no
non-zero output would be generated anymore anyway).
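The kill mechanism can be sketched like this (an illustration, not aRts internals; the Structure class and registry are invented for the example):

```python
# Illustrative sketch - not actual aRts code. A structure removes itself
# from the set of running structures once a "done"-style signal crosses
# the 0.5 threshold, which is what Synth_STRUCT_KILL is described as doing.

class Structure:
    def __init__(self, name, registry):
        self.name = name
        self.registry = registry
        registry.add(self)          # structure starts out running

    def struct_kill(self, signal):
        # Mimics Synth_STRUCT_KILL: any input > 0.5 removes the structure.
        if signal > 0.5:
            self.registry.discard(self)

running = set()
drum = Structure("bassdrum", running)
drum.struct_kill(0.0)   # wav still playing: structure stays
drum.struct_kill(1.0)   # done = 1: structure is removed from `running`
```

In the real system the done output of Synth_PLAY_WAV would be wired to this input, so no manual call is needed.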
Interfaces are the way of getting data into structures. They are like
modules, but they normally have only inputs (or only outputs).
Their outputs come from the world outside the structure, and their
inputs go back there.
A structure is said to provide an interface if it contains an interface
module of that type.
For instance, midi routers expect instrument structures to provide an
interface for midi data. Then, they can pass the midi data in, and the
structure can use that data for synthesis.
Note that in recent aRts versions, interfaces only exist because they
allow some internal extra tricks - they will probably go away and be
replaced by conventional input and output ports of structures (see
"using structures as modules again" below). So the only interface
supported by aRts is Interface_MIDI_NOTE, and once it is replaced, no other
interfaces will come up. But currently, you need it.
The idea is that you have structures which provide an Interface_MIDI_NOTE.
You are passed a frequency, a velocity (volume) and a parameter that tells
you whether this key is still pressed.
Your structure should now synthesize exactly that note with that volume,
and react to the pressed parameter (where pressed = 1 means the user still
holds down that key and pressed = 0 means the user has released that key).
To create and use such a structure, you should do the following:
- To process the pressed parameter, it is convenient to use Synth_ENVELOPE_ADSR,
or, in case of playing a drum wav, to just play it anyway and ignore the
pressed parameter.
- The structure should kill itself when it is no longer needed;
both wavs and envelopes provide the done parameter when they are finished,
so that's easy.
- You'll need to publish your structure under some name.
- You can tell a Synth_MIDI_ROUTER object that it should create
such a structure for every midi event arriving on, for instance,
channel 1.
- Oh, and of course your structure should play the audio data
it generates to a bus, which makes it convenient to postprocess with effects
and finally direct to the speakers.
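How an envelope might react to the pressed parameter and raise done can be sketched like this (illustration only; the Envelope class and its linear release are simplifications, not the actual Synth_ENVELOPE_ADSR behaviour):

```python
# Illustrative sketch - not actual aRts code. A minimal envelope that
# sustains while pressed = 1, fades out when pressed = 0, and sets its
# "done" output to 1 once the release has finished (which is the signal
# you would connect to Synth_STRUCT_KILL).

class Envelope:
    def __init__(self, release_steps):
        self.release_steps = release_steps
        self.level = 1.0          # pretend attack/decay already happened
        self.done = 0.0           # output parameter, 1 when finished

    def step(self, pressed):
        if pressed:
            return self.level     # sustain while the key is held
        # Key released: fade out linearly over release_steps.
        self.level = max(0.0, self.level - 1.0 / self.release_steps)
        if self.level == 0.0:
            self.done = 1.0       # ready to be killed
        return self.level

env = Envelope(release_steps=4)
env.step(1)   # key held: full level
env.step(0)   # key released: release phase begins
```

The real ADSR module shapes attack, decay, sustain and release as time-based curves; the point here is only the pressed/done contract.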
It is possible to use a complex structure as a module again. To make this
even more interesting, you can give these structures ports, so that they can
process signals like every other module. Say you have found a great way to
produce a fantastic reverb. You could put that reverb alone into a
structure and give it some ports, like two (stereo) input ports, another
one for the reverb depth, and two output ports, where the reverberated
signal comes out again.
Now you could give it a nice name (such as Synth_COOL_REVERB) and publish
it. At the same time, it appears as a new module in the Modules menu, and you
can use it as a module in other structures.
When you think you want to keep it, save it as Synth_COOL_REVERB.arts in
your $HOME/arts/structures directory, and aRts will load it every time it
starts. (It is important that the filename is of the form <structurename>.arts.)
See "autoloading at startup".
The idea of designing an instrument is that you do it once and reuse it.
Basically, the idea of designing any component in aRts, such as an
equalizer, an effect, or a device like a mixer, is that you do it once
and reuse it. (Well, there is some work left to do.)
There are some things to ensure to make it reusable.
- The first thing that is important is that your instrument is a structure
that contains an Interface_MIDI_NOTE (as described above: midi synthesis).
- The structure should terminate itself when appropriate (see: midi synthesis).
- The structure should have a port named bus, which is an in string property.
This port should be connected to a Synth_BUS_UPLINK, which gets the data
your instrument produces.
- The instrument should be named instrument_somenicenamehere; the important
part is that it starts with instrument_.
If you want to have tunable parameters, which can be changed by the user on
the fly, you need some more things:
- Your instrument should have an in audio signal port for every parameter that
is changeable. If, for instance, the parameters "attack", "decay" and "cutoff"
are changeable, your structure would have three in audio signal ports.
- For your instrument structure, there should be another structure which will
be the "control panel" for the instrument. It should have three incoming
ports (parent, which is in audio signal, and x and y which are audio property).
- The "control panel" structure should be called instrument_somenicenamehere_GUI.
- The "control panel" structure should display itself at position x,y in the
widget parent. Simply use a Gui_SUB_PANEL to achieve this.
- The "control panel" structure should have an output for every parameter the
instrument needs, in our example it would have three output audio signal
ports, "attack","decay" and "cutoff".
Does that sound complicated? Well - yes - a little, but the resulting
instruments are really reusable. If you have seen the examples with the
INSTRUMENT_MAPPER, you know what I mean.
When ArtsBuilder starts, it tries to load & publish every structure that you
have in your $HOME/arts/structures directory. It will only publish them if
they have the same name as the filename (e.g. the structure in
the file $HOME/arts/structures/test.arts should be called test).
This means that you won't have to load the twenty instruments you designed
recently, their "control panels" and the ten effects you like so much
every time you begin to work with aRts. Just save them in that directory
and they will already be there whenever you start aRts.
Note that this is also the only way you can reasonably use structures
as modules. Assume that you have created a Synth_SPECIAL_FX structure and
used it in another structure (e.g. in an instrument). To load that
instrument, you would otherwise need to load & publish Synth_SPECIAL_FX
manually first. Instead of doing that, just save the structure in
$HOME/arts/structures (that is where ArtsBuilder saves by default anyway)
and it will work automagically.
You can then also use "retrieve from server" if you want to edit
Synth_SPECIAL_FX again, since it has already been loaded at program
startup.
These are documented under gui elements - they are a nice concept as well,
for making a sixteen channel mixer by describing one channel and saying "I
need sixteen of them". This uses publishing, dynamic instantiation and all
that stuff to achieve it, but as you don't have the gui basics yet, it is
documented once you have them ;)
If you really work with aRts - that is, for instance, compose a song using
aRts and Brahms - you will probably set up lots of information after
starting your synthesis structure. You will assign which instruments
are on which channels, tune the parameters of your mixer, fine-tune those
instruments that are configurable and set up effects.
Session management is supposed to save all those configurations you need
for, say, your new Brahms-Pop-Song1. So if you save the song and want to
work on it again later, hit the "Save..." button in aRts _as well_ (it
won't save the song in Brahms, of course). Save the current setup under a
name ending in .arts-session.
If you want to work on the same song again the next day, simply start
ArtsBuilder and choose "Open Session ..." from the menu. Open your session
again - and everything will be where you left off: every mixer setting,
every instrument parameter.
That's what professional studios pay thousands of dollars for ;). (But
they, of course, get real sliders that are even moved back to their
original positions by small motors.)
Important: session management only works if you keep the structures
that a session uses untouched. As soon as you, for instance, add a module
(or slider) to one of the instruments/mixers/etc., there is no guarantee
that your saved sessions will restore properly. There might be a bugfix
for that one day (such as saving all the used structures with the session).
It is possible to use aRts for full duplex audio processing. That means
you can record data (from line in, a microphone or similar), process
it through aRts, and replay it with very low latency. That way, you can
use aRts as an effect processor.
It is however required that you decide what you want to do with aRts when
designing the structure that will be accessing the sound card.
Either you use Synth_PLAY as the last module, which sends the data to the
soundcard - then you have decided on play-only usage of aRts. Or you use
Synth_FULL_DUPLEX_PLAY and Synth_FULL_DUPLEX_RECORD
to access the soundcard. If you do so, make sure that both of them are
contained in the same structure, so that they get started at the same
time.
If you want to access the recorded data or other things from outside
that structure, use busses, as always.
Unfortunately, there are many soundcards out there which don't have correct
full duplex support. For instance, most SoundBlaster 16, AWE and similar
cards can only do one direction in 16 bit - recording or playing. The other
direction is done with 8 bits when running in full duplex mode, which
sounds ugly (noisy).
Full duplex support also lets you record data from your sound card even
while aRts is running - for instance as your audio server.
Mapped instruments are instruments that behave differently depending on
the pitch. You could, for instance, build a piano of 5 octaves using one
sample per octave (pitch-shifting it accordingly). That sounds a whole
lot better than using only one sample.
You could also build a drum map that plays one specific drum sample per
key.
To get these parameters, your instrument structure must be "incomplete";
that is, for instance, it may be a normal instrument that plays a wave
sample, but the wave sample isn't given in the structure. Instead, there
should be an in string port that connects to the filename of the wave
player. That way, the structure is parameterized.
I know that sounds complicated. Simply look at play_wav, play_akai or
play_akai_stereo to see what I mean.
Now - suppose you use play_wav for your instrument; then your structure
has the remaining parameter sample. You can now create a file named
909drums.arts-map with the following contents:
structure=play_wav
keygroup=35-35
{
sample=/usr/local/samples/909select/bt0a0a7.wav
}
keygroup=36-36
{
sample=/usr/local/samples/909select/bt0a0da.wav
}
keygroup=37-37
{
sample=/usr/local/samples/909select/bt3a0da.wav
}
keygroup=38-38
{
sample=/usr/local/samples/909select/bt7a0d7.wav
}
keygroup=39-39
{
sample=/usr/local/samples/909select/btaa0d0.wav
}
As you see, the missing parameter is specified differently for different
groups of keys (here each keygroup covers only a single key, but you
could easily say keygroup=30-50 to specify a range of keys).
If you have multiple parameters, it works the same way - you just
put more than one line in the keygroup definitions. Here is a small extract
of an akai playing map (for a piano).
structure=play_akai_stereo
keygroup=0-30
{
rightsample=/var/samples/BOESEND.LOUD/BOE_LD_F#0-R
leftsample=/var/samples/BOESEND.LOUD/BOE_LD_F#0-L
}
keygroup=31-33
{
rightsample=/var/samples/BOESEND.LOUD/BOE_LD_G#0-R
leftsample=/var/samples/BOESEND.LOUD/BOE_LD_G#0-L
}
keygroup=34-34
{
rightsample=/var/samples/BOESEND.LOUD/BOE_LD_A#0-R
leftsample=/var/samples/BOESEND.LOUD/BOE_LD_A#0-L
}
keygroup=35-36
{
rightsample=/var/samples/BOESEND.LOUD/BOE_LD_C1_-R
leftsample=/var/samples/BOESEND.LOUD/BOE_LD_C1_-L
}
keygroup=37-38
{
rightsample=/var/samples/BOESEND.LOUD/BOE_LD_D1_-R
leftsample=/var/samples/BOESEND.LOUD/BOE_LD_D1_-L
}
keygroup=39-39
{
rightsample=/var/samples/BOESEND.LOUD/BOE_LD_D#1-R
leftsample=/var/samples/BOESEND.LOUD/BOE_LD_D#1-L
}
keygroup=40-40
{
rightsample=/var/samples/BOESEND.LOUD/BOE_LD_E1_-R
leftsample=/var/samples/BOESEND.LOUD/BOE_LD_E1_-L
}
These map files will automatically appear in the instrument mapper, as soon
as you copy them to the arts/maps directory (in your home).
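To make the keygroup format concrete, here is a small parser for the map syntax shown above (an illustration, not the instrument mapper's actual implementation; `parse_map` and `lookup` are invented names):

```python
# Illustrative sketch - not aRts code. Parses the .arts-map format shown
# above: one structure= line, then keygroup=low-high blocks whose braces
# enclose parameter assignments for that key range.

def parse_map(text):
    structure = None
    keygroups = []            # list of (low, high, {param: value})
    current = None
    for line in text.splitlines():
        line = line.strip()
        if not line or line == "{":
            continue
        if line == "}":
            current = None    # close the current keygroup block
            continue
        key, _, value = line.partition("=")
        if key == "structure":
            structure = value
        elif key == "keygroup":
            low, high = value.split("-")
            current = (int(low), int(high), {})
            keygroups.append(current)
        elif current is not None:
            current[2][key] = value   # e.g. sample=/path/to/file.wav
    return structure, keygroups

def lookup(keygroups, note):
    """Return the parameters for the first keygroup covering this midi note."""
    for low, high, params in keygroups:
        if low <= note <= high:
            return params
    return None
```

With the 909drums.arts-map example above, `lookup` would resolve note 36 to the bt0a0da.wav sample; notes outside every keygroup get no parameters, and so no structure would be created for them.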