MainStage User Guide
- Welcome
- Overview of Edit mode
- Select patches and sets in the Patch List
- Copy, paste, and delete patches
- Reorder and move patches in the Patch List
- Add patches
- Create a patch from several patches
- Overview of the Patch Settings Inspector
- Select patch settings in the Patch Library
- Set the time signature for patches
- Change the tempo when you select a patch
- Set program change and bank numbers
- Defer patch changes
- Instantly silence the previous patch
- Change patch icons
- Transpose the pitch of incoming notes for a patch
- Change the tuning for a patch
- Add text notes to a patch
- Overview of channel strips
- Add a channel strip
- Change a channel strip setting
- Configure channel strip components
- Show signal flow channel strips
- Hide the metronome channel strip
- Create an alias of a channel strip
- Add a patch bus
- Set channel strip pan or balance positions
- Set channel strip volume levels
- Mute and solo channel strips
- Use multiple instrument outputs
- Use external MIDI instruments
- Reorganize channel strips
- Delete channel strips
- Overview of the Channel Strip Inspector
- Choose channel strip settings
- Rename channel strips
- Change channel strip colors
- Change channel strip icons
- Use feedback protection with channel strips
- Set keyboard input for a software instrument channel strip
- Transpose individual software instruments
- Filter MIDI messages
- Scale channel strip velocity
- Set channel strips to ignore Hermode tuning
- Override concert- and set-level key ranges
- Add text notes to a channel strip in the Channel Strip Inspector
- Route audio via send effects
- Screen Control Inspector overview
- Replace parameter labels
- Choose custom colors for screen controls
- Change background or grouped screen control appearance
- Set screen controls to show the hardware value
- Set parameter change behavior for screen controls
- Set hardware matching behavior for screen controls
- Reset and compare changes to a patch
- Override concert- and set-level mappings
- Overview of mapping screen controls
- Map to channel strip and plug-in parameters
- Map screen controls to actions
- Map a screen control to multiple parameters
- Use screen controls to display PDF document pages
- Edit the saved value for a mapped parameter
- Set drum pads or buttons to use note velocity
- Map screen controls to all channel strips in a patch
- Undo screen control parameter mappings
- Remove screen control mappings
- Work with graphs
- Create controller transforms
- Share patches and sets between concerts
- Record the audio output of a concert
- Overview of concerts
- Create a concert
- Open and close concerts
- Save concerts
- How saving affects parameter values
- Clean up concerts
- Consolidate assets in a concert
- Rename the current concert
- Overview of the Concert Settings Inspector
- Set MIDI Routing to channel strips
- Transpose incoming note pitch for a concert
- Define the program change message source
- Send unused program changes to channel strips
- Set the time signature for a concert
- Change the tuning for a concert
- Set the pan law for a concert
- Add text notes to a concert
- Control the metronome
- Silence MIDI notes
- Mute audio output
- Layout mode overview
- Screen control parameter editing overview
- Lift and stamp screen control parameters
- Reset screen control parameters
- Common screen control parameters
- Keyboard screen control parameters
- MIDI activity screen control parameters
- Drum pad screen control parameters
- Waveform screen control parameters
- Selector screen control parameters
- Text screen control parameters
- Background screen control parameters
- How MainStage passes through MIDI messages
- Export and import layouts
- Change the aspect ratio of a layout
- Before performing live
- Use Perform mode
- Screen controls in performance
- Tempo changes during performance
- Tips for performing with keyboard controllers
- Tips for performing with guitars and other instruments
- Tune guitars and other instruments with the Tuner
- The Playback plug-in in performance
- Record your performances
- After the performance
- Tips for complex hardware setups
- Overview of keyboard shortcuts and command sets
- Concerts and layouts keyboard shortcuts
- Patches and sets (Edit mode) keyboard shortcuts
- Editing keyboard shortcuts
- Actions keyboard shortcuts
- Parameter mapping (Edit mode) keyboard shortcuts
- Channel strips (Edit mode) keyboard shortcuts
- Screen controls (Layout mode) keyboard shortcuts
- Perform in Full Screen keyboard shortcuts
- Window and view keyboard shortcuts
- Help and support keyboard shortcuts
- Learn about Effects
- Learn about Amps and Pedals
- Bass Amp Designer overview
- Bass amplifier models
- Bass cabinet models
- Build a custom combo
- Amplifier signal flow
- Pre-amp signal flow
- Use the D.I. box
- Amplifier controls
- Bass Amp Designer effects overview
- Bass Amp Designer EQ
- Bass Amp Designer compressor
- Bass Amp Designer Graphic EQ
- Bass Amp Designer Parametric EQ
- Bass Amp Designer microphone controls
- Learn about Delay effects
- Echo controls
- MainStage Loopback overview
- Add a Loopback instance in MainStage
- MainStage Loopback interface
- MainStage Loopback waveform display
- MainStage Loopback transport and function controls
- MainStage Loopback information display
- MainStage Loopback Sync, Snap To, and Play From parameters
- Use the MainStage Loopback group functions
- MainStage Loopback Action menu
- Sample Delay controls
- Stereo Delay controls
- Tape Delay controls
- Use MIDI plug-ins
- Arpeggiator overview
- Arpeggiator control parameters
- Note order parameters overview
- Note order variations
- Note order inversions
- Arpeggiator pattern parameters overview
- Use Live mode
- Use Grid mode
- Arpeggiator options parameters
- Arpeggiator keyboard parameters
- Use keyboard parameters
- Assign controller parameters
- Modifier MIDI plug-in controls
- Note Repeater MIDI plug-in controls
- Randomizer MIDI plug-in controls
- Use the Scripter MIDI plug-in
- Use the Script Editor
- Scripter API overview
- MIDI processing functions overview
- HandleMIDI function
- ProcessMIDI function
- GetParameter function
- SetParameter function
- ParameterChanged function
- Reset function
- JavaScript objects overview
- Use the JavaScript Event object
- Use the JavaScript TimingInfo object
- Use the Trace object
- Use the MIDI event beatPos property
- Use the JavaScript MIDI object
- Create Scripter controls
- Transposer MIDI plug-in controls
- Learn about included Instruments
- Alchemy overview
- Name bar
- Alchemy source overview
- Source master controls
- Import browser
- Source subpage controls
- Source filter controls
- Source filter use tips
- Source elements overview
- Additive element controls
- Additive element effects
- Spectral element controls
- Spectral element effects
- Pitch correction controls
- Formant filter controls
- Granular element controls
- Sampler element controls
- VA element controls
- Source modulations
- Morph controls
- Master voice section
- Alchemy extended parameters
- MainStage Quick Sampler overview
- Add content to MainStage Quick Sampler
- MainStage Quick Sampler waveform display
- Use Flex in MainStage Quick Sampler
- MainStage Quick Sampler Pitch controls
- MainStage Quick Sampler Filter controls
- Quick Sampler filter types
- MainStage Quick Sampler Amp controls
- MainStage Quick Sampler extended parameters
- MainStage Playback plug-in overview
- Add a MainStage Playback plug-in
- MainStage Playback interface
- Use the MainStage Playback waveform display
- MainStage Playback transport and function buttons
- MainStage Playback information display
- MainStage Playback Sync, Snap To, and Play From parameters
- Use the MainStage Playback group functions
- Use the MainStage Playback Action menu and File field
- Use markers with the MainStage Playback plug-in
- Sculpture overview
- Sculpture interface
- Global parameters
- Amplitude envelope parameters
- Use the Waveshaper
- Filter parameters
- Output parameters
- Assign MIDI controllers
- Extended parameters
Precursors to the synthesizer
The earliest seeds of modern electronic synthesizers began in the twilight years of the 19th century. In 1897, an American inventor named Thaddeus Cahill was issued a patent to protect the principle behind an instrument known as the Telharmonium, or Dynamophone. Weighing in at 200 tons, this mammoth electronic instrument was driven by 12 steam-powered electromagnetic generators. It was played in real time using velocity-sensitive keys and, amazingly, was able to generate several different sounds simultaneously. The Telharmonium was presented to the public in a series of “concerts” held in 1906. Christened “Telharmony,” this music was piped into the public telephone network, because no public address systems were available at the time.
In 1919, Russian inventor Leon Theremin took a markedly different approach. Named after the man who masterminded it, the monophonic Theremin was played without actually touching the instrument. It gauged the proximity of the player’s hands as they were waved about in an electrostatic field between two antennae, and used this information to generate sound. This unorthodox technique made the Theremin enormously difficult to play. Its eerie, spine-tingling—but almost unvarying—timbre made it a favorite on countless horror movie soundtracks. R. A. Moog, whose synthesizers would later garner worldwide fame, began to build Theremins at the age of 19.
In Europe, Frenchman Maurice Martenot devised the monophonic Ondes Martenot in 1928. The sound generation method of this instrument was akin to that of the Theremin, but in its earliest incarnation it was played by pulling a wire back and forth.
In Berlin during the 1930s, Friedrich Trautwein and Oskar Sala worked on the Trautonium, an instrument that was played by pressing a steel wire onto a bar. Depending on the player’s preference, it enabled either infinitely variable pitches—much like a fretless stringed instrument—or incremental pitches similar to those of a keyboard instrument. Sala continued to develop the instrument throughout his life, an effort culminating in the two-voice Mixturtrautonium in 1952. He scored numerous industrial films, as well as the entire soundtrack of Alfred Hitchcock’s masterpiece The Birds, with this instrument. Although the movie does not feature a conventional musical soundtrack, all bird calls and the sound of beating wings heard in the movie were generated on the Mixturtrautonium.
In Canada, Hugh Le Caine began to develop his Electronic Sackbut in 1945. The design of this monophonic instrument resembled that of a synthesizer, but it featured an enormously expressive keyboard that responded not only to key velocity and aftertouch but also to lateral motion.
The instruments discussed thus far were all designed to be played in real time. Relatively early, however, people began to develop instruments that combined electronic sound generators and sequencers. The first instrument of this kind—named the Automatically Operating Musical Instrument of the Electric Oscillation Type—was presented by the French duo Edouard Coupleux and Joseph Givelet in 1929. This hybrid married electronic sound generation to mechanically punched tape control. Its name was unofficially shortened to the Coupleux-Givelet Synthesizer by its builders, marking the first time a musical instrument was called a “synthesizer.”
The term was formally introduced in 1956 with the debut of the RCA Electronic Music Synthesizer Mark I, developed by American engineers Harry F. Olson and Herbert Belar. Its dual-voice sound generation system consisted of 12 tuning forks, which were stimulated electromagnetically. For its time, the instrument offered relatively sophisticated signal-processing options. The output signal of the sound generator could be monitored by loudspeakers and, amazingly, recorded directly onto two records. A single motor powered both turntables and the control unit of the Mark I. The synthesizer was controlled by information punched onto a roll of paper tape, enabling continuous automation of pitch, volume, timbre, and envelopes. It was extremely complicated to use and unreliable, and spontaneous playing was impossible.