One of the two possible standards for interpreting (scaling) the control signal input to a VCO or VCF. In the volts/octave standard, a given change in the input voltage produces a change in the circuit's frequency relative to musical octaves, in which a given note in each octave is double the cycles/second (Hz) of the same note in the octave below. (This mimics the way that the human ear perceives musical intervals; for any given musical note at a given frequency, that same note an octave higher will be at double that frequency.) For example, if a 1 V input results in a 1000 Hz setting, then 2 V yields 2000 Hz, 3 V yields 4000 Hz, 4 V yields 8000 Hz, and so on. Nearly all equipment which uses volts/octave adheres to the 1 volt per 1 octave standard shown here. Some known exceptions are:
- Modulars built by Polyfusion, which use 0.1 volts/octave
- The Micromoog at about 0.9 volts/octave (rumored to be a design error)
- All synthesizers built by EML (except the Synkey), which use 1.2 volts/octave
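The exponential relationship described above can be sketched in a few lines of code. This is a minimal illustration, not any particular instrument's implementation; the reference point (1 V maps to 1000 Hz) is taken from the example in the text, and the function name is made up for this sketch.

```python
def volts_to_hz(volts, ref_volts=1.0, ref_hz=1000.0):
    """1 volt/octave scaling: each additional volt doubles the frequency."""
    return ref_hz * 2 ** (volts - ref_volts)

print(volts_to_hz(1.0))  # 1000.0
print(volts_to_hz(2.0))  # 2000.0
print(volts_to_hz(4.0))  # 8000.0
```

A non-integer voltage lands between octaves: for example, 1/12 V steps give equal-tempered semitones under the 1 V/octave standard.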
The volts/octave method is more difficult to implement in hardware, but it is generally considered more musically useful, and it avoids the voltage range problems of the volts/Hz method.
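The voltage range issue can be made concrete with a rough comparison. The scalings below are assumptions for illustration only (1 V/octave referenced at 1 V = 27.5 Hz, and a hypothetical volts/Hz law of 1 mV per Hz); actual instruments vary.

```python
import math

def control_volts_per_octave(freq_hz, ref_hz=27.5, ref_volts=1.0):
    """Control voltage under an assumed 1 V/octave law."""
    return ref_volts + math.log2(freq_hz / ref_hz)

def control_volts_per_hz(freq_hz, volts_per_hertz=0.001):
    """Control voltage under an assumed linear 1 mV/Hz law."""
    return freq_hz * volts_per_hertz

low, high = 27.5, 27.5 * 2 ** 10  # a 10-octave span

# Volts/octave: the 10-octave span fits in a 10 V range (1 V to 11 V).
print(control_volts_per_octave(low), control_volts_per_octave(high))

# Volts/Hz: the same span requires a 1024:1 voltage ratio,
# so the lowest octaves are crammed into millivolts.
print(control_volts_per_hz(low), control_volts_per_hz(high))
```

Under volts/Hz, each octave up doubles the required voltage, so low notes sit in a region where noise and offset errors dominate; volts/octave spreads every octave over the same comfortable 1 V step.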