Over the past few years I’ve spent a large chunk of time trying to replace the old FLTK(link) based user interface for the ZynAddSubFX(link) musical synthesizer. When I say replace, I mean a full 100% rewrite of the interface. The rewrite was done for a variety of reasons which I may touch upon in a later post, but for now let’s look at one part of the rewrite, namely QML.

QML, or the Qt Modeling Language, is a domain-specific language commonly used to describe a group of components within a user interface. More generally, QML defines a tree of objects, methods on object instances, a set of interrelated properties, and bindings between those properties. Within Qt, QML runs on JavaScript on top of the normal tools that Qt provides.

Through the use of a dynamic language, QML gains a number of properties that make interface development easier. When trying to rewrite ZynAddSubFX’s interface I had a few properties in mind. I wanted the toolkit to be:

  1. Scriptable: Implementation uses a first class higher level language

  2. Dynamically Resizable: Fluid layouts which do not have any fixed sizes

  3. Hot Reloadable: Reloads a modified implementation without restarting

  4. Recordable: Capable of portable/serialized state capture/replay

  5. Embeddable: Can be placed within another UI without conflicts

QML within Qt was scriptable, its layout routines were flexible enough that resizability wasn’t a major issue, and it was built with hot reloading in mind. I don’t recall any major support for recordability, though I expect it exists somewhere in Qt. Embeddability is where Qt falls short: loading two plugins built against different Qt versions (e.g. Qt4/Qt5) into the same host is known to cause problems, a common complaint in audio plugin development (or at least it’s complained about on Linux plenty). So even when the initial prototyping was done with QML, it was acknowledged that the project might eventually need to move away from Qt.

While prototyping ZynAddSubFX’s UI, I frequently ended up dropping into the C++-to-QML layer of Qt, which received much less documentation than the pure QML layer. Some of the program’s logic/drawing routines ended up in the C++ portion, which couldn’t be effectively hot loaded, and that slowed development. (I’m also pretty darn terrible at using JavaScript, which didn’t help.) After a lull in development, I revisited the floundering rewrite and realized a few key points.

  • I rarely used Qt’s tools over libstdc++ features

  • I avoided using QML’s javascript

  • I had written all layout algorithms for the UI myself

  • I didn’t use any of the components from QML beyond primitives (rectangles, component-repeaters, etc)

QML at a high level was useful, concise, and easy to manipulate dynamically. That’s when I asked, "Why does QML need to be tied to Qt, and to JavaScript specifically as its scripting language?" I have always been partial to Ruby, so why not Ruby?

With Ruby methods/callbacks, QML would look virtually the same. Indeed, parsing all of the QML I had written thus far didn’t depend upon the scripting language at all. With Ruby it was possible to use QML to create something like:

Rectangle {
    id: window

    property String fooVar: "foo"
    property Bool   barVar: true

    Structure { id: structure }
    Model     { id: model }

    function fn(args) {
        puts args
    }
}

And translate it to something similar to:

class Instance < Rectangle
    attr_reader :structure, :model
    attr_property(:fooVar, String)
    attr_property(:barVar, Bool)

    def initialize()
        add_child(@structure = Structure.new)
        add_child(@model     = Model.new)
        set_property(:fooVar, "foo")
        set_property(:barVar, true)
    end

    def fn(args)
        puts args
    end
end
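
The `attr_property` helper above isn’t standard Ruby. A minimal sketch of how such a class-level macro might be defined is below; the helper name, the type check, and the `PropertyHost` base class are my own assumptions for illustration, not ZynAddSubFX code:

```ruby
# Sketch of a hypothetical attr_property macro: defines a typed
# getter/setter pair on the class, roughly mirroring QML's
# "property Type name" declarations. Names here are assumptions.
class PropertyHost
  def self.attr_property(name, type)
    define_method(name) { instance_variable_get("@#{name}") }
    define_method("#{name}=") do |value|
      unless value.is_a?(type)
        raise TypeError, "#{name} expects #{type}, got #{value.class}"
      end
      instance_variable_set("@#{name}", value)
    end
  end

  def set_property(name, value)
    public_send("#{name}=", value)
  end
end

class Rectangle < PropertyHost
  attr_property(:fooVar, String)
end
```

A `Rectangle.new` then accepts `set_property(:fooVar, "foo")` and exposes a plain `fooVar` reader, while rejecting values of the wrong type.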

While this transformation may seem trivial, the organizational structure that QML provides is helpful for understanding complex widget hierarchies at a glance. So, with the idea of using QML through Ruby formed, how is this accomplished?
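
One way to see why this is feasible: the QML object-tree syntax (ignoring expressions, imports, and function bodies) is regular enough that even a toy recursive-descent parser can turn it into a plain data structure. The sketch below is an illustration of that idea, not the parser actually used, and it handles only type names, `key: value` pairs, and nested children:

```ruby
# Toy parser for a QML-like object tree. Tokenizes braces,
# "key:" property names, bare words, and quoted strings, then
# builds nested hashes via recursive descent.
def parse_qml(src)
  tokens = src.scan(/[{}]|[\w.]+:?|"[^"]*"/)
  parse_object(tokens)
end

def parse_object(tokens)
  type = tokens.shift
  raise "expected '{' after #{type}" unless tokens.shift == "{"
  node = { type: type, properties: {}, children: [] }
  until tokens.first == "}"
    if tokens.first.end_with?(":")            # property binding
      key = tokens.shift.chomp(":")
      node[:properties][key] = tokens.shift.delete('"')
    else                                      # nested child object
      node[:children] << parse_object(tokens)
    end
  end
  tokens.shift # consume "}"
  node
end

tree = parse_qml('Rectangle { id: window Structure { id: structure } }')
# tree[:type] is "Rectangle"; tree[:children] holds the Structure node
```

From a tree like this, emitting the Ruby class shown above is a straightforward walk over `properties` and `children`.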

In the never-ending maintenance of ZynAddSubFX, copious developer hours have been dedicated to the graphical user interface. For something like a synth engine, a GUI has a critical benefit: it lets you manipulate program state quickly (and often single-handedly). Even granting the GUI’s advantages and worth, I’m quite ready to see the current one thrown to the side, as it’s a key point of frustration for both myself and the users.

Getting a UI to work and look like you’d expect is a tedious loop of

  1. Build

  2. Open

  3. Navigate

  4. Close

  5. Modify

In most toolkits I feel like I end up fighting the architecture of the library quite quickly, and metaprogramming my way out of the existing interface doesn’t seem like a fair use of my own time, nor of the extra CPU cycles spent in the added abstraction layers.

For the 3.x.x UI replacement for ZynAddSubFX, a few key features are needed for smooth development: the five properties listed above (scriptable, dynamically resizable, hot reloadable, recordable, and embeddable).

The initial efforts were based upon loading basic widgets from YAML every frame and rendering them with OpenGL, which fulfilled requirements 2, 3, and 5. Writing all of the UI code from scratch got tiring, so Qt’s QML was added, which covered part of the first requirement, and demos indicated that hot reloading should be possible. Embedding Qt inside other applications, though, isn’t a task which is easily accomplished.
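
Reloading the layout every frame is the brute-force form of hot reloading; a slightly gentler variant only re-reads the file when its mtime changes. A minimal sketch of the idea follows, with the file name, YAML structure, and class name all placeholders rather than the actual ZynAddSubFX code:

```ruby
require "yaml"

# Sketch of poll-based hot reloading: re-parse a YAML layout file
# only when its modification time changes. Called once per frame.
class HotLoader
  def initialize(path)
    @path   = path
    @mtime  = nil
    @layout = nil
  end

  # Returns the current layout, reloading from disk if the file changed.
  def layout
    mtime = File.mtime(@path)
    if mtime != @mtime
      @mtime  = mtime
      @layout = YAML.safe_load(File.read(@path))
    end
    @layout
  end
end
```

Each frame the renderer asks the loader for `layout`; edits saved to the file on disk show up on the next frame without restarting the program, which is exactly the edit/observe loop the build/open/navigate/close/modify cycle above is meant to collapse.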