Because of the limitations of digital computing in the 1960s and 1970s, technicians, scientists, and engineers used analog computers to solve complex problems. As digital technology advanced, analog computing faded away by the end of the 20th century. Although digital and analog computers solve similar problems, they differ in several important ways.
Digital computers produce numbers as output and use printers, disc drives, display screens, and other peripherals to present it. Analog computers, on the other hand, generate voltage signals as outputs and rely on oscilloscopes and meters to display those voltages.
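The number-versus-voltage distinction can be sketched in a few lines. The snippet below shows how a continuous voltage (what an analog computer would display on a meter) maps onto the integer a digital computer would report; the reference voltage and bit width are assumptions for illustration.

```python
# Hypothetical illustration: the same quantity as a continuous analog
# voltage versus the discrete number a digital machine reports.
# The 5 V reference and 8-bit resolution are assumed values.

def quantize(voltage, v_ref=5.0, bits=8):
    """Convert a continuous voltage into the integer code an ADC would report."""
    levels = 2 ** bits
    code = int(voltage / v_ref * levels)
    return min(max(code, 0), levels - 1)   # clamp to the valid code range

analog_reading = 3.1415                    # continuous voltage on a meter
digital_reading = quantize(analog_reading) # integer a digital computer prints

print(analog_reading, digital_reading)
```

The analog reading can take any value in its range, while the digital reading is one of a fixed set of integers, which is exactly why digital outputs go to printers and displays as numbers.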
Another major difference between digital and analog computers involves their electronic circuits. Analog computer circuits use signal generators, op amps, and networks of capacitors and resistors; these circuits process voltage signals continuously. Digital computers, by contrast, use on-off switching circuits such as clock pulse generators, microprocessors, and logic gates.
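The continuous processing described above can be illustrated with the op-amp integrator, a standard building block of analog computers built from an op amp, a resistor, and a capacitor. The following is a minimal numerical sketch of that circuit's behavior; the component values and time step are assumptions, not taken from any particular machine.

```python
# Minimal sketch (assumed component values) of an op-amp integrator,
# a core circuit of analog computers. Its output is the running
# integral of the input voltage, scaled by -1/(R*C).

R = 1e6      # 1 megohm input resistor (assumption)
C = 1e-6     # 1 microfarad feedback capacitor (assumption), so R*C = 1 s
dt = 1e-3    # simulation time step: 1 ms

v_out = 0.0
for step in range(1000):              # simulate one second
    v_in = 1.0                        # constant 1 V input signal
    v_out += -(v_in / (R * C)) * dt   # Euler step of the integrator equation

print(round(v_out, 3))   # integrating 1 V for 1 s gives about -1 V
```

Chaining integrators, summers, and coefficient pots like this is how analog computers solved differential equations directly in voltage.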
Although both digital and analog computers can be programmed, the methods of programming differ. To program an analog computer, you electrically connect its sub-systems with patch cables. A digital computer instead executes a stored list of instructions, such as moving data from one place to another, comparing two numbers, or multiplying two numbers together.
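A stored instruction list of the kind described can be sketched as follows. The instruction names and register layout here are invented for illustration, not drawn from any real machine.

```python
# Hypothetical sketch: a digital program as a stored list of simple
# instructions (move, compare, multiply), executed one at a time.
# Register names and opcodes are made up for this example.

registers = {"A": 0, "B": 0, "FLAG": 0}

program = [
    ("MOVE", "A", 6),        # load the number 6 into register A
    ("MOVE", "B", 7),        # load the number 7 into register B
    ("COMPARE", "A", "B"),   # set FLAG to 1 if A equals B, else 0
    ("MULTIPLY", "A", "B"),  # replace A with A * B
]

for op, x, y in program:
    if op == "MOVE":
        registers[x] = y
    elif op == "COMPARE":
        registers["FLAG"] = 1 if registers[x] == registers[y] else 0
    elif op == "MULTIPLY":
        registers[x] = registers[x] * registers[y]

print(registers["A"], registers["FLAG"])   # 42 0
```

Changing the program means editing the list, whereas reprogramming an analog computer means physically re-patching the cables between its circuits.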