MainStage User Guide
- Welcome
- Overview of Edit mode
- Select patches and sets in the Patch List
- Copy, paste, and delete patches
- Reorder and move patches in the Patch List
- Add and rename patches
- Create a patch from several patches
- Overview of the Patch Settings Inspector
- Select patch settings in the Patch Library
- Set the time signature for patches
- Change the tempo when you select a patch
- Set program change and bank numbers
- Defer patch changes
- Instantly silence the previous patch
- Change patch icons
- Transpose the pitch of incoming notes for a patch
- Change the tuning for a patch
- Add text notes to a patch
- Overview of channel strips
- Add a channel strip
- Change a channel strip setting
- Configure channel strip components
- Show signal flow channel strips
- Hide the metronome channel strip
- Create an alias of a channel strip
- Add a patch bus
- Set channel strip pan or balance positions
- Set channel strip volume levels
- Mute and solo channel strips
- Use multiple instrument outputs
- Use external MIDI instruments
- Reorganize channel strips
- Delete channel strips
- Overview of the Channel Strip Inspector
- Choose channel strip settings
- Rename channel strips
- Change channel strip colors
- Change channel strip icons
- Use feedback protection with channel strips
- Set keyboard input for a software instrument channel strip
- Transpose individual software instruments
- Filter MIDI messages
- Scale channel strip velocity
- Set channel strips to ignore Hermode tuning
- Override concert- and set-level key ranges
- Add text notes to a channel strip in the Channel Strip Inspector
- Route audio via send effects
- Screen Control Inspector overview
- Replace parameter labels
- Choose custom colors for screen controls
- Change background or grouped screen control appearance
- Set screen controls to show the hardware value
- Set parameter change behavior for screen controls
- Set hardware matching behavior for screen controls
- Reset and compare changes to a patch
- Override concert- and set-level mappings
- Overview of mapping screen controls
- Map to channel strip and plug-in parameters
- Map screen controls to actions
- Map a screen control to multiple parameters
- Use screen controls to display PDF document pages
- Edit the saved value for a mapped parameter
- Set drum pads or buttons to use note velocity
- Map screen controls to all channel strips in a patch
- Undo screen control parameter mappings
- Remove screen control mappings
- Work with graphs
- Create controller transforms
- Share patches and sets between concerts
- Record the audio output of a concert
- Overview of concerts
- Create a concert
- Open and close concerts
- Save concerts
- How saving affects parameter values
- Clean up concerts
- Consolidate assets in a concert
- Rename the current concert
- Overview of the Concert Settings Inspector
- Set MIDI Routing to channel strips
- Transpose incoming note pitch for a concert
- Define the program change message source
- Send unused program changes to channel strips
- Set the time signature for a concert
- Change the tuning for a concert
- Set the pan law for a concert
- Add text notes to a concert
- Control the metronome
- Silence MIDI notes
- Mute audio output
- Layout mode overview
- Screen control parameter editing overview
- Lift and stamp screen control parameters
- Reset screen control parameters
- Common screen control parameters
- Keyboard screen control parameters
- MIDI activity screen control parameters
- Drum pad screen control parameters
- Waveform screen control parameters
- Selector screen control parameters
- Text screen control parameters
- Background screen control parameters
- How MainStage passes through MIDI messages
- Export and import layouts
- Change the aspect ratio of a layout
- Before performing live
- Use Perform mode
- Screen controls in performance
- Tempo changes during performance
- Tips for performing with keyboard controllers
- Tips for performing with guitars and other instruments
- Tune guitars and other instruments with the Tuner
- The Playback plug-in in performance
- Record your performances
- After the performance
- Tips for complex hardware setups
- Overview of keyboard shortcuts and command sets
- Concerts and layouts keyboard shortcuts
- Patches and sets (Edit mode) keyboard shortcuts
- Editing keyboard shortcuts
- Actions keyboard shortcuts
- Parameter mapping (Edit mode) keyboard shortcuts
- Channel strips (Edit mode) keyboard shortcuts
- Screen controls (Layout mode) keyboard shortcuts
- Perform in Full Screen keyboard shortcuts
- Window and view keyboard shortcuts
- Help and support keyboard shortcuts
- Use MIDI plug-ins
- Arpeggiator overview
- Arpeggiator control parameters
- Note order parameters overview
- Note order variations
- Note order inversions
- Arpeggiator pattern parameters overview
- Use Live mode
- Use Grid mode
- Arpeggiator options parameters
- Arpeggiator keyboard parameters
- Use keyboard parameters
- Assign controllers
- Modifier controls
- Note Repeater controls
- Randomizer controls
- Use Scripter
- Use the Script Editor
- Scripter API overview
- MIDI processing functions overview
- HandleMIDI function
- ProcessMIDI function
- GetParameter function
- SetParameter function
- ParameterChanged function
- Reset function
- JavaScript objects overview
- Use the JavaScript Event object
- Use the JavaScript TimingInfo object
- Use the Trace object
- Use the MIDI event beatPos property
- Use the JavaScript MIDI object
- Create Scripter controls
- Transposer controls
- Alchemy overview
- Alchemy interface overview
- Alchemy Name bar
- Alchemy file locations
- Alchemy source overview
- Source master controls
- Import browser
- Source subpage controls
- Source filter controls
- Source filter use tips
- Source elements overview
- Additive element controls
- Additive element effects
- Spectral element controls
- Spectral element effects
- Pitch correction controls
- Formant filter controls
- Granular element controls
- Sampler element controls
- VA element controls
- Source modulations
- Morph controls
- Alchemy master voice section
- Alchemy Extended parameters
- Playback plug-in overview
- Add a Playback plug-in
- Playback interface
- Use the Playback waveform display
- Playback transport and function buttons
- Playback information display
- Playback Sync, Snap To, and Play From parameters
- Use the Playback group functions
- Use the Playback Action menu and File field
- Use markers with the Playback plug-in
- Sample Alchemy overview
- Interface overview
- Add source material
- Edit mode
- Play modes
- Source overview
- Synthesis modes
- Granular controls
- Additive effects
- Additive effect controls
- Spectral effect
- Spectral effect controls
- Filter module
- Low and highpass filter
- Comb PM filter
- Downsampler filter
- FM filter
- Envelope generators
- Mod Matrix
- Modulation routing
- Motion mode
- Trim mode
- More menu
- Sculpture overview
- Sculpture interface
- Global parameters
- Amplitude envelope parameters
- Use the Waveshaper
- Filter parameters
- Output parameters
- Define MIDI controllers
- Extended parameters
- Copyright
Digital synthesizers
Modern digital synthesizers, featuring variable polyphony, memory, and completely digital sound generation, follow a semi-polyphonic approach. The number of voices these instruments can generate, however, no longer depends on the number of built-in monophonic synthesizers. Rather, polyphony depends entirely on the performance capability of the computers that power them.
The rapid pace of development in the digital world is best illustrated by the following example. The first program to emulate sound generation entirely on a computer was Music I, written in 1957 by the American programmer Max Mathews. It ran on a university mainframe, an exorbitantly expensive IBM 704. Its sole claim to fame was that it could compute a triangle wave, although doing so in real time was beyond its capabilities.
This lack of real-time capability is why early digital technology was used solely for control and storage purposes in commercial synthesizers. Digital control circuitry debuted in 1971 in the form of the digital sequencer found in the Synthi 100 modular synthesizer from the English company EMS—in all other respects an analog instrument. Priced out of reach of all but the wealthiest musicians, the Synthi 100 sequencer could store a total of 256 events.
Ever-increasing processor performance made it possible to integrate digital technology into parts of the sound generation engine itself. The monophonic Harmonic Synthesizer, manufactured by Rocky Mountain Instruments (RMI), was the first instrument to do so. This synthesizer had two digital oscillators, combined with analog filters and amplifier circuits.
The Synclavier, introduced in 1976 by New England Digital Corporation (NED), was the first synthesizer with completely digital sound generation. Instruments like the Synclavier were based on specialized processors that had to be developed by the manufacturers themselves. This development cost made the Synclavier an investment that few could afford.
An alternative solution was to use processors made by third-party computer processor manufacturers. These processors, designed specifically for the multiplication and accumulation operations common in audio processing tasks, are called digital signal processors (DSPs). Peavey’s DPM-3, released in 1990, was the first commercially available synthesizer based entirely on standard DSPs. The instrument was 16-note polyphonic and built mainly around three Motorola 56001 DSPs. It featured an integrated sequencer and sample-based subtractive synthesis, with preset storage and user-definable samples.
Another solution was to design the synthesizer as a computer peripheral rather than as a standalone unit. The growing popularity of personal computers from the early 1980s made this option commercially viable. The Passport Soundchaser and the Syntauri alphaSyntauri were the first examples of this concept. Both systems consisted of a processor card with a standard musical keyboard attached to it; the card was inserted into an Apple II computer, and the synthesizer was programmed via the Apple keyboard and monitor. Both were polyphonic and offered programmable waveforms, envelopes, and sequencers. Today’s sound cards, available in countless variants since 1989, follow this concept.
Exploiting the ever-increasing processing power of today’s computers, the next evolutionary step for the synthesizer was the software synthesizer, which runs as an application on a host computer.
The sound card (or built-in audio hardware) is needed these days only for audio input and output. The actual process of sound generation, effects processing, recording, and sequencing is performed by your computer’s CPU—using the MainStage software and instrument collection.