4.5. GUI Elements

GUI elements are currently in an experimental state. However, this section will describe what is supposed to happen here, so if you are a developer, you will be able to understand how aRts will deal with GUIs in the future. There is some code there already, too.

GUI elements should be used to allow synthesis structures to interact with the user. In the simplest case, the user should be able to modify some parameters of a structure directly (such as a gain factor which is used before the final play module).

In more complex settings, one could imagine the user modifying parameters of groups of structures and/or structures which are not yet running, such as the ADSR envelope of the currently active MIDI instrument. Another example would be setting the filename of some sample-based instrument.

On the other hand, the user might like to monitor what the synthesizer is doing. There could be oscilloscopes, spectrum analyzers, volume meters and "experiments" that figure out the frequency transfer curve of some given filter module.

Finally, the GUI elements should be able to control the whole structure of what is running inside aRts, and how. The user should be able to assign instruments to MIDI channels, start new effect processors, and configure his main mixing desk (which is built of aRts structures itself) to have one more channel and use another strategy for its equalizers.

You see: the GUI elements should bring all the possibilities of the virtual studio that aRts is supposed to simulate to the user. Of course, they should also interact gracefully with MIDI input (for example, sliders should move when they receive MIDI events that change the same parameter), and probably even generate events themselves, to allow the user interaction to be recorded via the sequencer.

Technically, the idea is to have an IDL base class for all widgets (Arts::Widget), and to derive a number of commonly used widgets from there (like Arts::Poti, Arts::Panel, Arts::Window, ...). Then, one can implement these widgets using a toolkit, for instance Qt or Gtk. Finally, effects should build their GUIs out of existing widgets. For instance, a freeverb effect could build its GUI out of five Arts::Poti thingies and an Arts::Window. So if there is a Qt implementation for these base widgets, the effect will be able to display itself using Qt. If there is a Gtk implementation, it will also work with Gtk (and more or less look/work the same).
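
As a sketch of how this might fit together, here is a small C++ mock-up of the proposed widget hierarchy and of how a freeverb effect could assemble its GUI from it. Only the class names Arts::Widget, Arts::Poti and Arts::Window come from the design above; every method and member below is an assumption for illustration, not the actual aRts API.

  #include <vector>

  namespace Arts {

  // Stand-in for the proxy class MCOP would generate from the
  // Arts::Widget IDL interface.
  class Widget {
  public:
      virtual void show() {}
      virtual ~Widget() {}
  };

  // A rotary knob bound to one float parameter of an effect.
  class Poti : public Widget {
  public:
      void setRange(float min, float max) { _min = min; _max = max; }
      void setValue(float v) { _value = v; }
  private:
      float _min = 0, _max = 1, _value = 0;
  };

  // A toplevel container; show() propagates to all children.
  class Window : public Widget {
  public:
      void addChild(Widget *w) { _children.push_back(w); }
      void show() override { for (Widget *w : _children) w->show(); }
  private:
      std::vector<Widget*> _children;
  };

  } // namespace Arts

  int main() {
      // Freeverb exposes five user parameters (room size, damping,
      // wet level, dry level, stereo width), hence five knobs.
      Arts::Window window;
      Arts::Poti knobs[5];
      for (Arts::Poti &knob : knobs) {
          knob.setRange(0.0f, 1.0f);
          window.addChild(&knob);
      }
      window.show(); // a Qt or Gtk implementation would render this
      return 0;
  }

A Qt implementation would realize show() in terms of QWidget, a Gtk implementation in terms of its own widgets; the freeverb code above would stay the same either way, which is the whole point of the design.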

Finally, as we're using IDL here, artsbuilder (or other tools) will be able to plug GUIs together visually, or autogenerate GUIs given hints for parameters, based only on the interfaces. It should be relatively straightforward to write a "create GUI from description" class which takes a GUI description (containing the various parameters and widgets) and creates a living GUI object out of it. Based on IDL and the aRts/MCOP component model, extending the set of objects which can be used for the GUI should be just as easy as adding a plugin implementing a new filter to aRts.
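
To make the "create GUI from description" idea more concrete, here is a minimal C++ sketch under assumed names (WidgetDescription, WidgetFactory and createGUI are all made up for illustration, not part of aRts): a description lists which widget type controls which parameter, and a factory registry turns that into living widget objects.

  #include <functional>
  #include <map>
  #include <memory>
  #include <stdexcept>
  #include <string>
  #include <vector>

  struct Widget { virtual ~Widget() {} };
  struct Poti : Widget { std::string parameter; };

  // One entry of a GUI description: which widget type controls
  // which parameter of the effect.
  struct WidgetDescription {
      std::string type;       // e.g. "Poti"
      std::string parameter;  // e.g. "roomsize"
  };

  // Registry mapping widget type names to factory functions, so a
  // new widget type can be registered like a new plugin.
  class WidgetFactory {
  public:
      using Creator =
          std::function<std::unique_ptr<Widget>(const WidgetDescription &)>;

      void registerType(const std::string &type, Creator create) {
          creators[type] = std::move(create);
      }

      std::unique_ptr<Widget> create(const WidgetDescription &d) const {
          auto it = creators.find(d.type);
          if (it == creators.end())
              throw std::runtime_error("unknown widget type: " + d.type);
          return it->second(d);
      }

  private:
      std::map<std::string, Creator> creators;
  };

  // "Create GUI from description": turn a list of widget
  // descriptions into a list of living widget objects.
  std::vector<std::unique_ptr<Widget>>
  createGUI(const WidgetFactory &factory,
            const std::vector<WidgetDescription> &description) {
      std::vector<std::unique_ptr<Widget>> gui;
      for (const WidgetDescription &d : description)
          gui.push_back(factory.create(d));
      return gui;
  }

  int main() {
      WidgetFactory factory;
      factory.registerType("Poti", [](const WidgetDescription &d) {
          auto p = std::make_unique<Poti>();
          p->parameter = d.parameter;
          return std::unique_ptr<Widget>(std::move(p));
      });
      auto gui = createGUI(factory,
                           {{"Poti", "roomsize"}, {"Poti", "damping"}});
      return gui.size() == 2 ? 0 : 1;
  }

The registry is what makes the scheme extensible: adding a new kind of widget means registering one more factory function, in the same spirit as adding a new filter plugin to aRts.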