I’m sure most of you have heard, or even used, the term “bug” and its derivative verb “to debug” in a technological context. However, have you ever wondered about the origins of the term “bug” as applied to computer technology?
Use of the term “bug” to describe inexplicable defects has been a part of engineering jargon for many decades and predates computers and computer software; it may have originally been used in hardware engineering to describe mechanical malfunctions. For instance, Thomas Edison wrote the following words in a letter to an associate in 1878:
It has been just so in all of my inventions. The first step is an intuition, and comes with a burst, then difficulties arise—this thing gives out and [it is] then that ‘Bugs’ — as such little faults and difficulties are called—show themselves and months of intense watching, study and labour are requisite before commercial success or failure is certainly reached.
During World War II, problems with radar electronics were also referred to as bugs (or glitches). “Bug” is also a term, most probably American in origin, for “pest,” and since engineering pioneers were constantly pestered by mechanical and electronic hiccups, the label stuck.
Nevertheless, in September 1947, while investigating a glitch in an early computer system, the Mark II Aiken Relay Calculator at Harvard University, a team of electromechanical engineers found that an actual moth trapped in the machine was causing the malfunction.
The operators removed the moth and affixed it to the logbook (see image below). The entry reads: “First actual case of bug being found.” Word went out that they had “debugged” the machine, and the phrase “debugging a computer program” was born.