Building a Beat Machine in Flutter

How to build a beat machine with sequencer in Dart / Flutter

Kenneth Reilly
ITNEXT

Screen recording of the example app

Introduction

Flutter has been rising in popularity since the initial 1.0 release in late 2018. As companies and entrepreneurs work to keep development costs low, the search continues for better and more efficient ways to build mobile apps and cross-platform software in general. Flutter supports all major mobile platforms, and support for the web and all major desktop operating systems is well under development.

In this article, we’ll check out a simple beat machine demo, and learn about the underlying design patterns used in the development of this app.

To get a Flutter environment up and running, visit the installation page. For a copy of the demo project source code, check out this repository.

Concepts

There are several key concepts within Flutter that are used extensively throughout the design of this demo app, to leverage the features of Dart and avoid writing unnecessary boilerplate or copy-pasted code. This has a profound effect on code readability, reliability, and performance.

Getting the most out of a language can mean the difference between producing a clumsy, bug-prone app and a work of art. Dart has plenty of features to assist with building highly interactive, asynchronous UX with state management.

These root concepts are:

- Inheritance: shared base classes that factor out common widget behavior
- Streams: asynchronous channels used to signal state changes to the UI
- Generics: type-parameterized event handling within the audio engine
- Immutability: stateless widgets with final properties wherever possible

These concepts are combined in various ways throughout the demo app, to achieve a design in which the UI rendering and control logic are neatly organized into classes with easy-to-use interfaces and properties.

Overview

The beat machine architecture and UX are kept as simple as possible, emulating the retro drum machines of the ’70s and ’80s, in which resources such as mechanical switches, copper, and fancy new 8-bit CPUs were limited, and building machines that musicians could afford required keeping design and build costs to a minimum.

The core UI scaffold is split into four “front panel” widgets, each of which provides some interactivity, and the machine logic is contained within a sample playback service and an audio engine service.

This is analogous to components on a real hardware instrument passing around data via patch or MIDI cables (both of which are still used heavily within audio engineering and music production to this day).

App Entry Point

The application is initialized within main.dart:

The main function locks the app into portrait mode by first ensuring the widget bindings are initialized and then setting the preferred device orientation. The UI scaffold is simple, with a Column displaying the four main interface widgets. Let’s check out the widgets and classes used to handle user input and efficiently render and refresh the UI as necessary.
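Since the embedded source isn’t reproduced here, a minimal sketch of main.dart might look like this (BeatMachine is a stand-in class name; the four panel widgets are described in the sections below):

```dart
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';

void main() {
  // Bindings must be initialized before calling platform services
  WidgetsFlutterBinding.ensureInitialized();
  // Lock the app into portrait orientation, then launch
  SystemChrome.setPreferredOrientations([DeviceOrientation.portraitUp])
      .then((_) => runApp(BeatMachine()));
}

class BeatMachine extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        body: Column(
          children: [DisplayPanel(), Sequencer(), Transport(), PadBank()],
        ),
      ),
    );
  }
}
```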

Base Widget

The four primary widgets in the Scaffold extend a common base class to connect to the audio engine. This class is located in views/base-class.dart:

The BaseWidget and BaseState classes extend StatefulWidget and State respectively, and implement an internal Stream that attaches a listener to AudioEngine when initialized, and refreshes state when a signal is received from the engine. Each widget that extends BaseWidget will therefore rebuild any time the audio engine signals that an event has occurred within the engine and that the UI needs to be rebuilt.
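A rough sketch of these classes, assuming the engine exposes a broadcast Stream of Signal objects (Signal and AudioEngine are covered in the Audio Engine section):

```dart
import 'dart:async';
import 'package:flutter/material.dart';

abstract class BaseWidget extends StatefulWidget {}

abstract class BaseState<T extends BaseWidget> extends State<T> {
  StreamSubscription<Signal>? _subscription;

  @override
  void initState() {
    super.initState();
    // Rebuild this widget whenever the engine broadcasts a Signal
    _subscription = AudioEngine.stream.listen((signal) => setState(() {}));
  }

  @override
  void dispose() {
    _subscription?.cancel();
    super.dispose();
  }
}
```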

Display Panel

The top-most component on the scaffold column is views/display.dart:

The DisplayPanel renders BPM and step-position indicators at the top of the screen, and will auto-refresh when the base class receives a signal. Tapping the BPM indicator opens a BPMSelector dialog with a list of choices from 1 through 256; selecting one sets the BPM on the audio engine.

Step indicators are generated, each of which will “light up” when the engine is running and the current step matches the index on each render cycle.
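A sketch of how the indicators might be generated, assuming the engine exposes the state, step, and resolution properties described later:

```dart
// Inside DisplayPanel: generate one indicator per step in the pattern
List<Widget> buildStepIndicators() {
  return List<Widget>.generate(AudioEngine.resolution, (index) {
    // Light up when the engine is running and on this step
    bool lit = AudioEngine.state != MACHINE_STATE.stopped &&
        AudioEngine.step == index;
    return Container(
      width: 16,
      height: 16,
      margin: EdgeInsets.all(2),
      color: lit ? Colors.redAccent : Colors.black26,
    );
  });
}
```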

Pattern Sequencer

The pattern sequence editor widget is in views/sequencer.dart:

The Sequencer is a StatelessWidget since it does not provide any interactivity itself, but instead renders Track widgets that provide UX per track.

An expanded row is generated for each sample, with a label to the left and a Track that will auto-expand to fit the remaining space on the row.
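A sketch of the Sequencer, assuming one DRUM_SAMPLE value per track (the DRUM_SAMPLE enum is described in the Sampler section):

```dart
class Sequencer extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Column(
      // One expanded row per sample: label on the left, Track on the right
      children: DRUM_SAMPLE.values.map((sample) => Expanded(
        child: Row(
          children: [
            SizedBox(width: 64, child: Text(sample.name)),
            Expanded(child: Track(sample: sample)),
          ],
        ),
      )).toList(),
    );
  }
}
```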

Sequencer Track

The sequence editor Track is described in views/track.dart:

The Track widget extends BaseWidget, so it will auto-rebuild when it receives a signal from the audio engine. Each track wraps a generated list of eight note indicators that pass an event to the audio engine when tapped; the engine then toggles the note’s on/off state internally and signals a refresh. The color of each note-block indicator is determined by whether a note exists at the current position and whether the note is currently being played. When a note is not present, the color varies every other column for visibility.
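A sketch of the note-block generation, assuming Track declares a final sample property, the engine stores track data as a map of boolean lists, and events are passed through an on() method (all covered in the Audio Engine section):

```dart
// Inside Track's State class (which extends BaseState)
List<Widget> buildNotes() {
  return List<Widget>.generate(8, (step) {
    bool active = AudioEngine.tracks[widget.sample]![step];
    bool playing = active && AudioEngine.step == step;
    // Played notes flash white, idle notes use the sample color, and
    // empty cells alternate shade every other column for visibility
    Color color = playing
        ? Colors.white
        : active
            ? Sampler.colors[widget.sample]!
            : (step.isEven ? Colors.black38 : Colors.black12);
    return Expanded(
      child: GestureDetector(
        // Ask the engine to toggle this note on/off
        onTap: () => AudioEngine.on(EditEvent(widget.sample, step)),
        child: Container(margin: EdgeInsets.all(2), color: color),
      ),
    );
  });
}
```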

Transport Control

Let’s check out the transport control widget in views/transport.dart:

The Transport class builds a row of transport control buttons. Tapping a button fires a state-change event to the engine, which in turn signals a refresh on the widget through its base class. When a button matches the current engine state, it is disabled by passing null to the MaterialButton’s onPressed.
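A sketch of the button row, assuming a MACHINE_STATE enum for the transport states:

```dart
// Inside Transport's build method: one button per transport state
Row(
  mainAxisAlignment: MainAxisAlignment.center,
  children: MACHINE_STATE.values.map((s) => MaterialButton(
    // Disable the button that matches the current engine state
    onPressed: AudioEngine.state == s
        ? null
        : () => AudioEngine.on(ControlEvent(s)),
    child: Text(s.name.toUpperCase()),
  )).toList(),
)
```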

Pad Bank

The drum pad bank is defined in views/pad-bank.dart:

The PadBank extends StatelessWidget since it has no mutable properties and therefore does not require a State. This widget renders a Container at 1/3 the height of the available space on the parent with two rows of Pad widgets, each with a defined size, and a value derived from the current List index.
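A sketch of the PadBank, assuming four pads per row and using MediaQuery to approximate the available space:

```dart
class PadBank extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    Size screen = MediaQuery.of(context).size;
    double size = screen.width / 4; // four pads per row assumed
    return Container(
      height: screen.height / 3, // one third of the available space
      child: Column(
        children: [0, 1].map((row) => Row(
          // Each pad's value is derived from its row and column index
          children: List<Widget>.generate(
              4, (column) => Pad(size: size, value: row * 4 + column)),
        )).toList(),
      ),
    );
  }
}
```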

Drum Pad

The drum pad widget is defined in views/pad.dart:

The Pad widget is stateless and takes three final (immutable) parameters as arguments. Three get properties are defined to pull DRUM_SAMPLE along with corresponding sample name and color. When a pad is tapped, a PadEvent is passed to the audio engine for further processing.
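A sketch of the Pad; the exact third constructor parameter is an assumption here, filled in as a margin:

```dart
class Pad extends StatelessWidget {
  // Three final (immutable) parameters
  final int value;
  final double size;
  final EdgeInsets margin;

  Pad({required this.value, required this.size,
      this.margin = const EdgeInsets.all(4)});

  // Getters pull the sample plus its corresponding name and color
  DRUM_SAMPLE get sample => DRUM_SAMPLE.values[value];
  String get name => sample.name;
  Color get color => Sampler.colors[sample]!;

  @override
  Widget build(BuildContext context) {
    return GestureDetector(
      // Pass a PadEvent to the engine for further processing
      onTap: () => AudioEngine.on(PadEvent(sample)),
      child: Container(
        width: size,
        height: size,
        margin: margin,
        color: color,
        child: Center(child: Text(name)),
      ),
    );
  }
}
```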

Next, let’s take a look at the internal workings of the beat machine.

Sampler

Sample definitions and load/playback are in services/sampler.dart:

Sample types are defined with DRUM_SAMPLE, and the corresponding filenames and colors are initialized on the service, which loads the audio files during app initialization. When play is called on the sampler from the audio engine, the corresponding cached audio file is played.
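A sketch of the sampler; the sample set, file names, colors, and playback package (the pre-1.0 audioplayers AudioCache API) are all assumptions here:

```dart
import 'package:flutter/material.dart';
import 'package:audioplayers/audioplayers.dart'; // playback package assumed

// The actual sample set is an assumption for this sketch
enum DRUM_SAMPLE { kick, snare, hatClosed, hatOpen, tomLow, tomHigh, clap, rim }

class Sampler {
  // Filename for each sample, bundled as a Flutter asset
  static const Map<DRUM_SAMPLE, String> files = {
    DRUM_SAMPLE.kick: 'kick.wav',
    DRUM_SAMPLE.snare: 'snare.wav',
    DRUM_SAMPLE.hatClosed: 'hat-closed.wav',
    DRUM_SAMPLE.hatOpen: 'hat-open.wav',
    DRUM_SAMPLE.tomLow: 'tom-low.wav',
    DRUM_SAMPLE.tomHigh: 'tom-high.wav',
    DRUM_SAMPLE.clap: 'clap.wav',
    DRUM_SAMPLE.rim: 'rim.wav',
  };

  // Display color for each sample's pad and track notes
  static const Map<DRUM_SAMPLE, Color> colors = {
    DRUM_SAMPLE.kick: Colors.red,
    DRUM_SAMPLE.snare: Colors.orange,
    DRUM_SAMPLE.hatClosed: Colors.yellow,
    DRUM_SAMPLE.hatOpen: Colors.green,
    DRUM_SAMPLE.tomLow: Colors.blue,
    DRUM_SAMPLE.tomHigh: Colors.indigo,
    DRUM_SAMPLE.clap: Colors.purple,
    DRUM_SAMPLE.rim: Colors.teal,
  };

  // Cache for the bundled audio assets (API version assumed)
  static final AudioCache _cache = AudioCache(prefix: 'assets/samples/');

  // Called once during app initialization to pre-load every audio file
  static Future<void> init() => _cache.loadAll(files.values.toList());

  // Called by the audio engine to play the cached file for a sample
  static void play(DRUM_SAMPLE sample) => _cache.play(files[sample]!);
}
```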

Audio Engine

Let’s check out the audio service in services/audio-service.dart:

The AudioEngine service manages transport control state, handles input events, performs quantization on incoming notes when recording, stores track data, and signals all listening widgets to refresh the UI as required.

Event classes are defined for each type of audio engine event required, and a placeholder Signal class is defined to be used as a general-purpose signal for refreshing the UI. In more complex scenarios, the Signal class could be extended to pass varying types of signals to the UI.
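As a sketch, the event hierarchy might look like this (the exact fields are assumptions based on how the events are used below):

```dart
// Placeholder signal used to tell listening widgets to refresh
class Signal {}

// Base type for all audio engine events
abstract class Event {}

// Transport state change (play, stop, record)
class ControlEvent extends Event {
  final MACHINE_STATE state;
  ControlEvent(this.state);
}

// Toggle a note on/off at a given track and step position
class EditEvent extends Event {
  final DRUM_SAMPLE sample;
  final int step;
  EditEvent(this.sample, this.step);
}

// A drum pad was tapped
class PadEvent extends Event {
  final DRUM_SAMPLE sample;
  PadEvent(this.sample);
}
```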

Pattern resolution and step are defined, along with the control state, BPM, initial track data, the Timer / Watch / _tick calculation, and a StreamController that allows listening widgets to receive signals.
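A sketch of that state, with concrete values assumed (eight steps per pattern, eighth-note timing):

```dart
import 'dart:async';

// Transport states assumed for this sketch
enum MACHINE_STATE { stopped, playing, recording }

class AudioEngine {
  static const int resolution = 8;                 // steps per pattern
  static int step = 0;                             // current step position
  static MACHINE_STATE state = MACHINE_STATE.stopped;
  static int BPM = 120;

  // One list of on/off note flags per sample track
  static final Map<DRUM_SAMPLE, List<bool>> tracks = {
    for (var s in DRUM_SAMPLE.values) s: List.filled(resolution, false),
  };

  static Timer? _timer;                            // sequencer clock
  static final Stopwatch _watch = Stopwatch();     // quantization timing

  // Milliseconds per step, assuming two steps per beat (eighth notes)
  static int get _tick => 60000 ~/ BPM ~/ 2;

  // Broadcast stream that listening widgets subscribe to
  static final StreamController<Signal> _controller =
      StreamController<Signal>.broadcast();
  static Stream<Signal> get stream => _controller.stream;
}
```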

When a control surface (such as a drum pad) calls on() with an instance of Event, the method makes use of generics to switch on the event type and take the correct action. This way, all incoming messages are routed through one location and handled accordingly. Each event type corresponds to some set of operations within the engine. The control, edit, next, and synchronize methods each fire a Signal to the UI once all updates are complete.
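A sketch of the router, assuming the event classes above; the control, edit, and process method signatures are assumptions:

```dart
// Inside the AudioEngine class: route each event by its generic type
static void on<T extends Event>(T event) {
  switch (T) {
    case ControlEvent:
      control((event as ControlEvent).state);
      break;
    case EditEvent:
      edit(event as EditEvent);
      break;
    case PadEvent:
      var pad = event as PadEvent;
      Sampler.play(pad.sample);            // always audition the pad
      if (state == MACHINE_STATE.recording) process(pad);
      break;
  }
}
```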

The design of the audio engine allows updates to occur on the fly without restarting the engine: enabling recording is a simple state change that causes future incoming notes to be passed through to the process method, and the tempo can be adjusted in the middle of a pattern, with the running timer synchronized to the new BPM.
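For example, a tempo change might look like this (method names assumed):

```dart
// Inside the AudioEngine class: re-time a running pattern without stopping
static void setBPM(int value) {
  BPM = value;
  if (_timer != null) synchronize();
}

static void synchronize() {
  // Replace the running clock with one timed to the new BPM
  _timer?.cancel();
  _timer = Timer.periodic(Duration(milliseconds: _tick), (_) => next());
  _controller.add(Signal()); // refresh the UI with the new tempo
}
```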

When an EditEvent is received, the event data is used to flip the boolean on/off value for the note at that track and step position. When the audio engine is started, a periodic Timer is created that advances the sequencer once per _tick interval by invoking next, which increments or resets the step counter, checks the note for each track at the current step, and finally resets the quantization _watch and signals the UI.
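A sketch of the clock and step logic, following the description above:

```dart
// Inside the AudioEngine class: the sequencer clock and step logic
static void start() {
  _watch.start();
  _timer = Timer.periodic(Duration(milliseconds: _tick), (_) => next());
}

static void next() {
  // Increment the step, or reset it at the end of the pattern
  step = (step + 1) % resolution;
  // Trigger every sample with an active note on the current step
  for (var sample in DRUM_SAMPLE.values) {
    if (tracks[sample]![step]) Sampler.play(sample);
  }
  _watch.reset();            // restart quantization timing
  _controller.add(Signal()); // signal listening widgets to refresh
}
```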

Conclusion

This project illustrates the power available within the Flutter SDK for rapid application design and development.

Flutter is a great choice for building highly interactive cross-platform applications with excellent performance and reliability. With support for mobile, desktop, and web targets, developers can build high-quality apps that run great everywhere and are easy to maintain, with common syntax and nearly-universal support for libraries and packages.

This greatly simplifies the task of keeping a large multi-platform application up-to-date with consistent features, rock-solid reliability, and ultra-fast performance.

Thanks for reading and good luck with your next Flutter project!

~ 8_bit_hacker
