World War II was raging and the U.S. had firepower. But was it accurate?
It “had battleships that could lob shells weighing as much as a small car over distances up to 25 miles,” according to ComputerScienceLab.com.
Moreover, there were equations to determine how wind, gravity, muzzle velocity and other factors would affect trajectory.
“But solving such equations was extremely laborious,” the website said. The work fell to physicists and mathematicians, and the men in those overwhelmingly male professions had been drafted.
Female math majors, rare in the 1940s, filled in, but more computing power was needed.
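The kind of calculation those human computers ground through can be sketched numerically. Below is a minimal, illustrative simulation of a shell's flight under gravity, air drag, and wind; the parameter values (drag coefficient, muzzle velocity) are assumptions for illustration, not historical firing-table data.

```python
import math

def shell_range(v0, angle_deg, drag_coeff=5e-5, wind=0.0, dt=0.01):
    """Crude Euler integration of a shell's trajectory.

    v0         -- muzzle velocity in m/s (illustrative value)
    angle_deg  -- elevation angle in degrees
    drag_coeff -- quadratic air-drag constant (assumed, not historical)
    wind       -- constant wind along the firing direction (positive = tailwind)
    Returns the horizontal distance travelled when the shell lands.
    """
    g = 9.81
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while y >= 0.0:
        # Drag acts against the shell's motion relative to the air.
        rvx = vx - wind
        speed = math.hypot(rvx, vy)
        ax = -drag_coeff * speed * rvx
        ay = -g - drag_coeff * speed * vy
        x += vx * dt
        y += vy * dt
        vx += ax * dt
        vy += ay * dt
    return x

# A headwind shortens the range; a tailwind extends it.
print(shell_range(800, 45, wind=-20.0), shell_range(800, 45, wind=20.0))
```

Each firing solution required thousands of such small time steps, done by hand with desk calculators, which is why a machine that could grind through arithmetic automatically was so badly needed.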
Innovation, in this case, was the mother of the Harvard Mark I, the “first programmable digital computer made in the U.S.,” according to Computer Science Lab. (The British had built one a year earlier as part of their war effort.)
The Mark I was an electro-mechanical device that used rotating shafts and clutches to deliver power.
Some of the computer’s stats:
- Multiplication: 6 seconds. Division: 15.3 seconds. A logarithm or trigonometric function: More than 1 minute.
- 765,000 components and 800km (500 miles) of wire.
- 16 m (51 ft) long x 2.4 m (8 ft) high x 61 cm (2 ft) deep.
Howard Aiken, the Mark I’s designer, antagonized Thomas Watson of IBM by claiming sole credit for the machine. Nevertheless, Watson attended the ceremony when the computer was presented to Harvard on 7 August 1944.
A literal bug, a moth for those of you asking, was famously found by the team of computer scientist Grace Hopper, not in this machine but in its successor, the Mark II. Although “bug” was a term already in use for glitches, Hopper is credited with popularizing the term “debugging.”
Meanwhile, in a quote that often makes it into collections of bad predictions, Aiken once said that six computers would be enough to satisfy the computing needs of the United States.
In fairness to Aiken, the inventor of the 4.5 tonne (5 ton) monster didn’t foresee the transistor revolution that would happen just 10 years later.
Transistor radio
On August 7, 1955, a company called Tokyo Telecommunications Engineering Corporation (now Sony) started selling transistor radios.
But it was the extent to which the Japanese took over the electronics market that has long since become a case study in innovation, commercialization and manufacturing.
During a trip to the U.S. in 1952, Sony co-founder Masaru Ibuka discovered that AT&T was about to license the transistor. With backing from the Japanese government, the company was able to pay the $25,000 licensing fee, about $220,000 in today’s money.
Touring the States, “borrowing ideas from the American transistor manufacturers,” Ibuka improved the designs and Sony made “its first functional transistor radio in 1954,” according to Wikipedia.
The introduction of the transistor radio is now seen as a disruptive innovation that led to the demise of vacuum tube technology.
World Wide Web
August 7, 1991, marks the day Tim Berners-Lee posted the first web pages summarizing a little something he had been working on: the World Wide Web.
Berners-Lee, then at CERN, and his collaborator Robert Cailliau wanted a way to store and share scientific documents. They developed concepts such as hyperlinking and built the first browser.
They piggybacked on the internet’s existing infrastructure but also developed web servers.
The web has since become the key driver of disruptive innovation of this generation. Some industries, such as travel agencies and newspapers, have been decimated, while a multitude of others have been spawned.
Image of Grace Hopper via Wikimedia Commons.