The Dawn of the Digital Age

The history of computers begins about two thousand years ago with the birth of the abacus, a wooden rack holding two horizontal wires with beads strung on them. When these beads are moved around according to "programming" rules memorized by the user, all ordinary arithmetic problems can be solved.

Blaise Pascal is usually credited with building the first digital calculator in 1642. It added numbers entered with dials and was made to help his father, a tax collector. Gottfried Wilhelm von Leibniz invented a machine, built in 1694, that could add and, after some of its parts were rearranged, multiply. Leibniz devised a special stepped-gear mechanism for introducing the addend digits, a mechanism that remained in use long afterward.

The prototypes made by Pascal and Leibniz saw little use, and were even considered a little strange, until a little more than a century later, when Charles Xavier Thomas created the first successful mechanical calculator. Thomas' calculator could add, subtract, multiply, and divide. Many improved versions of the desktop calculator followed. By about 1890, the improvements included accumulation of partial results, storage and automatic re-entry of past results (memory functions), and printing of results. These improvements were made mainly for commercial users rather than for the needs of science.

While Thomas was developing the desktop calculator, a series of very interesting developments in computing began in Cambridge, England. In 1812, Charles Babbage, a mathematics professor, realized that many long calculations, especially those needed to produce mathematical tables, were really a series of predictable actions that were constantly repeated. From this he reasoned that it should be possible to perform these actions automatically. Babbage began to design his automatic mechanical calculating mac...