The Hall of Innovation
"Every great movement must experience three stages:
Ridicule, discussion, adoption"
John Stuart Mill
Ideas are the fuel of innovation. Throughout history, many of the great ideas that changed the world were initially dismissed, misunderstood, or outright rejected. Here is an account of some of those within the field of communications. If you know of others, please send me an email.
A few wretched wires.
The construction of lines of relay towers, posited since ancient times for long-distance communication,
was undertaken by the brothers Chappe in the 1790s. Spaced 10-20 km apart, the towers featured mechanical arms
whose positions could be varied to indicate distinct messages, which were relayed from tower to tower.
This communication network eventually spanned France and neighboring countries, and was used extensively by Napoleon.
Initially baptized as “tachygraph”, the system ended up being termed “telegraph”. In 1838, the American inventor
Samuel Morse visited France to propose an alternative telegraph system based on electric wires.
The scientist appointed to evaluate his proposal, Jules Guyot, commented:
“What can one expect from a few wretched wires?”
Morse was turned down. A decade later, though, France committed to electric telegraphy and the towers were abandoned.
The telephone patent controversy.
The telephone, a momentous invention, is credited to several individuals.
Antonio Meucci, an Italian immigrant in New York, developed a voice communication apparatus in 1857 to connect his basement laboratory with the upstairs bedroom of his invalid wife.
In 1870, Meucci reportedly transmitted voice one mile away. He called his device "telettrofono", but was either unable or unwilling to pay for a patent application.
He instead filed a caveat (intention to file) entitled "Sound Telegraph", which expired in 1874.
Also in the 1860s, the German Johann P. Reis had developed a similar "telephon".
Then, on Feb. 14th 1876, two men, Elisha Gray and Alexander Graham Bell, incredibly submitted telephone designs at the US Patent Office within a span of only two hours!
According to most versions, Bell's application arrived earlier. In other versions, Gray's caveat arrived first but remained in the in-basket until that afternoon.
Bell's application was filed later by his lawyer, who requested that it be taken to an examiner at once. Gray's caveat, on the other hand, was not examined until the following day.
In any event, Gray was convinced to abandon his caveat, clearing the way for Bell to be granted US Patent 174,465 on March 7th. Three days later, Bell, in one room, called his assistant in another room:
“Mr. Watson, come here; I want to see you.”
What followed is the history of the founding of the Bell Telephone Company (later AT&T), which grew to be the largest telephone company in the world. Elisha Gray later challenged Bell's patent but, after years of litigation, was defeated. Meucci also took Bell to court claiming priority and was near victory when he died in 1889, at which point the case was dropped. In 2002, 113 years later, the US House of Representatives passed a resolution recognizing Meucci's accomplishment and stating that:
“If Meucci had been able to pay the $10 fee to maintain his caveat after 1874, no patent could have been issued to Bell."
This 'telephone' device is of no value.
In 1876, before starting the Bell Telephone Company, Alexander Graham Bell offered his brand new patent to the Telegraph Company (ancestor of Western Union)
and a committee was appointed to investigate the offer. The ensuing report read in part:
“The idea of installing ‘telephones’ in every city is idiotic... Why would any person want to use this ungainly and impractical device when he can send a messenger to the telegraph office and have a clear written message sent to any large city in the US? This 'telephone' has too many shortcomings to be seriously considered as a means of communication. The device is inherently of no value to us."
Girl-less wait-less telephone switching.
Almon B. Strowger owned a mortuary business in Kansas City.
As the story goes, he had a competitor whose wife was an operator at the local manual telephone exchange.
Often, calls for Strowger were deliberately put through to his competitor.
This frustrated Strowger who, after years of complaining to the telephone company, decided to do away with human operators.
With help from his nephew and others, he developed the first automated telephone switch out of electromagnets and hat pins.
A patent was issued in 1891 and the Strowger Automatic Telephone Exchange Company was formed.
In 1892, the first Strowger exchange was opened for public service with about 75 subscribers and capacity for 99.
Strowger bragged that his exchanges were "girl-less, cuss-less, out-of-order-less and wait-less."
Over time, the company was consumed by giants of the telecommunications industry such as AT&T, Verizon, GTE and Lucent Technologies.
(Entry contributed by Rodolfo Feick.)
Radio waves have no practical application.
In 1864, the Scottish mathematician James C. Maxwell formulated the laws that govern electromagnetism.
Heinrich Hertz, a German physicist, was later able to transmit and receive radio waves under controlled laboratory conditions.
Hertz, though, seemed uninterested in the practical importance of his work. He stated that:
“This is just an experiment that proves Maestro Maxwell was right - we have these mysterious electromagnetic waves that we cannot see with the naked eye; but they are there … I do not think that the wireless waves I have discovered will have any practical application.”
His discoveries would be taken up by entrepreneurs looking to make their fortunes.
Tesla v. Marconi: clash of titans.
Following up on Hertz's work, the genius Nikola Tesla first demonstrated wireless communication in 1893.
One year later, the Bengali physicist Sir Jagadish C. Bose and the British physicist Sir Oliver J. Lodge followed suit.
Meanwhile, in England, a young Italian named Guglielmo Marconi had been hard at work building a wireless device.
Tesla filed the first US radio patent in 1897. Marconi's first patent application in the US, in 1900, was turned down.
His revised applications were repeatedly rejected because of the priority of Tesla.
Nonetheless, his Marconi Wireless Telegraph Company Ltd. began thriving. Then, on Dec. 12th, 1901, Marconi for the first time transmitted signals across the Atlantic. Tesla commented:
"Marconi is a good fellow. Let him continue. He is using 17 of my patents."
In 1904, surprisingly, the US Patent Office reversed previous decisions and gave Marconi a patent for the invention of radio. Marconi later won the Nobel Prize and Tesla sued his company for infringement. In 1943, a few months after Tesla's death, the US Supreme Court finally overturned Marconi's patent in favor of Tesla. The Russian Alexander Popov, who had also presented a radio receiver in 1895, reportedly said:
"The emission and reception of signals by Marconi by means of electric oscillations is nothing new. In America, the famous engineer Nikola Tesla carried out the same experiments in 1893."
Marconi's reputation rests largely on his accomplishments in commercializing a practical system. His demonstrations of the use of radio, equipping ships with life-saving wireless communications, establishing the first transatlantic radio service, and building the first stations for the British short-wave service marked his place in history. Upon Marconi's death at age 63, as a tribute, radio stations throughout the world observed two minutes of silence.
Who cares about broadcasting?
David Sarnoff, an employee of the Marconi Wireless Telegraph Company of America, demonstrated the broadcast of music in New York circa 1915.
This and other demonstrations of long-distance wireless telephony inspired many memos to his superiors on applications of radio technologies.
In 1920, he proposed that the company develop a radio music box for the amateur market of radio enthusiasts. The proposal was rejected:
"The radio music box has no imaginable commercial value. Who would pay for a message sent to nobody in particular?"
Sarnoff ended up heading the Radio Corp. of America and founding NBC.
Armstrong jazzes the world in FM.
Countering the common tenet of bandwidth conservation, Edwin H. Armstrong, a student and later a professor at Columbia University, had the brilliant insight that increasing bandwidth through frequency modulation (FM) could actually reduce noise. Patented by Armstrong in 1933, FM delivers a much clearer sound than AM, which was dominant at the time. Invited by the Federal Communications Commission (FCC), Armstrong played a jazz record over AM radio, then switched to FM and shocked the audience with the leap in quality. He then financed construction of the first FM radio station at 42.8 MHz. This was too revolutionary for the Radio Corporation of America (RCA), which lobbied for a change in the law that would prevent FM from becoming dominant. David Sarnoff and RCA got the FCC to move FM radio from 42-50 MHz to 88-108 MHz, thereby rendering all existing FM sets useless overnight and protecting RCA's AM-radio stronghold. RCA further claimed the invention of FM and won its own patent, leaving Armstrong unable to claim royalties. Ruined and in despair, he jumped to his death in 1954. His widow, who had been Sarnoff's secretary before marrying Armstrong, renewed the legal fight against RCA and finally prevailed. Although it took decades, FM ended up dominating radio broadcasting.
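Armstrong's trade, more bandwidth for less noise, rests on the idea that the message shifts the carrier's instantaneous frequency rather than its amplitude. A minimal FM modulator can be sketched in Python; all parameters are illustrative, not from Armstrong's system:

```python
import math

# Toy FM modulator: the message m(t) shifts the carrier's instantaneous
# frequency around fc by up to kf Hz. Parameters are illustrative.
fs = 10000   # sample rate (Hz)
fc = 1000    # carrier frequency (Hz)
kf = 500     # peak frequency deviation (Hz)

def fm_modulate(message):
    phase, out = 0.0, []
    for m in message:
        # integrate the instantaneous frequency to get the phase
        phase += 2 * math.pi * (fc + kf * m) / fs
        out.append(math.cos(phase))
    return out

# One second of a 50 Hz message tone
message = [math.sin(2 * math.pi * 50 * n / fs) for n in range(fs)]
signal = fm_modulate(message)
```

Because the information rides in the frequency rather than the amplitude, additive amplitude noise disturbs it far less than it disturbs AM.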
Frequency hopping: made in Hollywood.
George Antheil, a film composer in Hollywood, met actress Hedy Lamarr in 1940.
Discussing torpedo control, Lamarr suggested that, by transmitting signals on rapidly changing frequencies, radio-guided weapons would be more resilient to detection and jamming.
The sequence of frequencies would be known a priori by transmitter and receiver, but to the enemy the message would seem like noise.
Antheil proposed coordinating the frequency changes the way he synchronized player pianos. This early version of frequency hopping thus used a piano roll to switch among 88 frequencies (the number of keys on a piano).
The pair patented the idea in 1942, but it was ahead of its time. The Navy found the mechanism too bulky to fit in a torpedo.
In 1957, however, the concept was taken up by engineers at Sylvania Electronic Systems. Their implementation, using electronics rather than piano rolls, became a basic tool for secure military communications.
Other patents in frequency changing have cited Lamarr-Antheil as the basis of the field, and the concept underlies many anti-jamming devices used today.
It has also been employed in wireless communication, e.g., in GSM. The patent was little-known till 1997, when the Electronic Frontier Foundation honored Lamarr with a special Pioneer Award.
In 2005, the first Inventor's Day in German-speaking countries was held on her 92nd birthday.
(Entry suggested by Chandru Raman)
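The hopping scheme described in this entry can be sketched in a few lines of Python. The seed value and channel bookkeeping below are illustrative assumptions, not details from the Lamarr-Antheil patent:

```python
import random

# Transmitter and receiver derive the same hop sequence from a shared
# secret seed (the "piano roll"), hopping among 88 channels, one per
# piano key. An eavesdropper without the seed sees apparent noise.
NUM_CHANNELS = 88

def hop_sequence(seed, length):
    rng = random.Random(seed)   # deterministic: same seed, same sequence
    return [rng.randrange(NUM_CHANNELS) for _ in range(length)]

bits = [1, 0, 1, 1, 0]
tx_hops = hop_sequence(seed=1942, length=len(bits))
rx_hops = hop_sequence(seed=1942, length=len(bits))

# Transmitter pairs each bit with its channel; the receiver, knowing
# the same sequence, listens on the right channel at the right time.
transmitted = list(zip(tx_hops, bits))
received = [bit for (ch, bit), rx_ch in zip(transmitted, rx_hops) if ch == rx_ch]
```

Since both ends share the seed, every hop matches and the message is recovered intact.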
The picturephone fiasco.
After years of development at AT&T, commercial picturephone service debuted in 1970 in Pittsburgh, and AT&T's executives predicted a million units in use by 1980. As it turned out, though, people did not like the picturephone. No matter how much AT&T tried, it was one of the biggest flops in the history of communication technology. Fewer than 500 people had signed up by the time the plug was pulled in 1974, at a loss of $1 Billion. What happened? Part of the reason was the high cost, given how little the images seemed to add to the communication. There was also the problem of how to use a picturephone when you are one of the few people to have one. Often, people simply did not want that much intimate contact. It is only now, with improvements in speed, resolution, and miniaturization, and the advent of another piece of equipment, the computer, that the promise of personal video communication is being fulfilled.
The cellular concept: 30 years early.
The concept of cellular telephony was first articulated in 1947 by D. H. Ring, in a Bell Labs Technical Memorandum that started with the prescient statement:
“an adequate mobile radio system should provide service to any equipped vehicle at any point in the whole country.”
The idea was to divide a large area into a grid of small zones called cells, each having a low-power transceiver to send and receive calls to mobile units within the cell. There would be automatic handoff from cell to cell and reuse of frequencies. The idea was visionary, but the technology to implement it simply did not exist, and the bandwidth needed was not available. It lay dormant until the 1960s, when Joe Engel and Richard Frenkiel of Bell Labs leveraged computers and electronics to make it work. The first call from a handheld phone, however, was placed by their rival, Motorola's Martin Cooper, on April 3rd 1973, who, to the amazement of passersby, phoned Engel while walking in New York. The handset weighed 1 kg and cost $4000. The first trial system was deployed by AT&T in Chicago in 1978 and went commercial in 1983. By 2011, there were 261,392 cells in the US alone.
(Richard Frenkiel's chronicle "Cellular dreams and cordless nightmares" tells the story in detail.)
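The power of the frequency-reuse idea in Ring's memo is easy to illustrate with a back-of-the-envelope calculation; every number below is an illustrative assumption, not a figure from the memo:

```python
# Frequency reuse: a cluster of cells splits the licensed channels,
# and the cluster pattern repeats across the whole service area.
total_channels = 400   # channels licensed to the operator (assumed)
cluster_size = 7       # classic 7-cell reuse pattern
num_cells = 210        # cells covering the service area (assumed)

channels_per_cell = total_channels // cluster_size   # 57 channels per cell
capacity = channels_per_cell * num_cells             # simultaneous calls

# Without reuse, at most 400 calls fit in the whole area; with reuse,
# the same spectrum supports thousands of simultaneous calls.
```

Smaller cells mean more reuse and hence more capacity from the same spectrum, which is why cell splitting became the standard way to grow a network.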
Cellular telephony: just a niche market.
In 1980, McKinsey & Company was commissioned by AT&T (whose Bell Labs had invented cellular telephony) to forecast cell phone penetration in the US by 2000. The consultants' prediction, 900,000 subscribers, was less than 1% of the actual figure, 109 Million. Based on this legendary mistake, AT&T decided there was not much future in these toys. A decade later, to rejoin the cellular market, AT&T had to acquire McCaw Cellular for $12.6 Billion. By 2011, the number of subscribers worldwide had surpassed 5 Billion and cellular communication had become an unprecedented technological revolution.
The accidental success of SMS.
The short message service was conceived in the 1980s to alert individual mobile users of network disruptions, deposited voice mails, etc. Few believed it would serve for users to communicate with each other and, in fact, operators were slow to even set up charging mechanisms. (Early on, roaming customers rarely received bills for messages sent while abroad, which made SMS an alternative to voice calls.) Since the late 1990s, the SMS phenomenon has grown to epic proportions. The average number of monthly messages per user, which was 0.4 in 1995, reached 47 by 2006. The total number of monthly messages worldwide is now well over 100 Billion, generating revenues that double Hollywood's box office and global music sales combined! Second to none in fervor is the Philippines, which single-handedly generates almost 10% of the world's SMS traffic. Also, the need to convey more while staying within the 160-character limit has led to the emergence of a new communication language that uses abbreviations to save time, space and effort. Anyone can get creative: "Go ahead & GIAG BBFN & TC". If you are puzzled, it means "go ahead and give it a go; bye-bye for now and take care". Opinion is divided on this contortion of language, but it hardly seems to matter to the users.
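The 160-character limit itself follows from simple arithmetic: an SMS payload is 140 octets, and the GSM 7-bit default alphabet packs 8 characters into every 7 octets.

```python
# Why 160 characters? 140 octets of payload, 7 bits per character.
payload_octets = 140
payload_bits = payload_octets * 8   # 1120 bits
chars = payload_bits // 7           # 160 seven-bit characters
```

With 8-bit or 16-bit (UCS-2) alphabets, the same 140 octets yield only 140 or 70 characters per message.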
Cellphone meets computer and they live happily ever after.
In 1993, a revolutionary device was launched: a cellphone with touchscreen, email, fax, graphics, and text editor.
This first smartphone was not the iPhone, still years away, but the IBM Simon.
The idea was visionary, but ahead of its time; technology wasn’t ripe enough yet.
With only 1 MB of memory and bit rates of a few kbit/s, it couldn't quite deliver. It was bulky and, at $899, expensive.
Others followed and developed the concept: the Nokia 9000 in 1996, with 8 MB of memory; Ericsson's P800 in 2002, with camera and MP3 player; and RIM's BlackBerry in 2003, with a full keyboard.
The debate raged among engineers as to whether users really wanted a multifunctional device or preferred dedicated phones, cameras, audio players, etc.
The debate was settled by the iPhone in 2007. With GBs of memory and Mbit/s bit rates, smartphones are finally taking over: 450 Million were shipped in 2011 alone. In fact, smartphones are causing a massive surge in wireless traffic, with a 40-fold increase forecast between 2009 and 2014, of which 2/3 will be video. Two decades later, the descendants of the IBM Simon are fulfilling the dream.
The incomprehensible work of a precocious mathematician.
In 1828, while still a teenager, Évariste Galois solved the long-standing problem of determining the condition for a polynomial to be solvable by radicals.
He produced a memoir that was submitted several times to the Academy of Sciences but never published in his lifetime.
His first attempt was refused by Cauchy. He re-submitted it to Fourier, the Academy's secretary, but Fourier died soon after and the memoir was lost.
Siméon Poisson asked him to submit his work once more, but found it incomprehensible, declaring that
“Galois' argument is neither sufficiently clear nor sufficiently developed to allow us to judge its rigor.”
Galois reacted violently to the rejection letter and decided to publish his papers privately. While in prison for political activism, he polished his ideas. Set for a duel after his release, and convinced of his impending death, he stayed up all night composing what would become his mathematical testament. On May 30th, 1832, at age 20, he was shot in the duel and died the following day. Galois' mathematical contributions were published in full in 1843. His work led to what is now called Galois field theory, with direct applications in coding and cryptography.
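As a taste of where Galois' ideas ended up, here is a minimal sketch of multiplication in the finite field GF(2^8), the field underlying Reed-Solomon codes and AES. The routine is a standard textbook shift-and-XOR algorithm, not anything from Galois' memoir; 0x11b is the irreducible polynomial AES happens to use.

```python
# Multiplication in GF(2^8): polynomials over GF(2), reduced modulo
# an irreducible polynomial (here 0x11b = x^8 + x^4 + x^3 + x + 1).
def gf256_mul(a, b, modulus=0x11b):
    result = 0
    while b:
        if b & 1:
            result ^= a      # addition in GF(2^m) is bitwise XOR
        b >>= 1
        a <<= 1
        if a & 0x100:
            a ^= modulus     # reduce when the degree reaches 8
    return result
```

For example, 0x53 and 0xCA are multiplicative inverses in this field, a fact exploited in the construction of the AES S-box.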
Coding is dead.
In a 1971 workshop on coding theory, Ned Weldon famously stated that:
“Too many equations had been generated with too few consequences... Coding theorist professors had begotten more coding theory Ph.D.'s in their own image... no one else cared; it was time to see this perversion for what it was. Give up this fantasy and take up a useful occupation... Coding is dead.”
The 1990s witnessed a major renaissance of coding theory with the discovery (and in some cases rediscovery) of codes defined on graphs, which play a central role in modern communication systems. Even newer forms of coding were conceived in the 2000s, including polar codes, the first type to provably achieve capacity.
Turbo codes: the French revolution.
In 1993, at the IEEE Int'l Conf. on Communications, some engineers working in France made an incredible claim.
They claimed to have invented a "turbo coding" method that could come breathtakingly close to Shannon's capacity.
Their simulation curves claimed unbelievable performance, far beyond what was deemed possible.
Many experts at the conference scoffed at the claims and either did not bother attending the engineers' talk or else contended that the simulations were in error.
Prof. McEliece later said:
"What blew everyone away about turbo codes is not just that they get so close to Shannon capacity but that they're so easy. How could we have overlooked them? Berrou and Glavieux didn't know the problem was supposed to be hard, so they managed to find a new way to go about it."
Today, turbo codes are at the heart of countless communication devices.
LDPC: a piece of 21st-century coding in the 20th century.
By 1960, Robert Gallager, a Ph.D. student at MIT, had already created a class of capacity-approaching codes: low-density parity-check (LDPC) codes.
Gallager's simulations showed that, indeed, his LDPC codes came very close to Shannon's capacity.
However, they presented a computational challenge far too complex for the computers of the day.
Most coding theorists seemed to regard LDPC codes as intriguing but quixotic and
promptly forgot about them.
How was that possible? Well, in 1968, Robert Gallager wrote his classic textbook “Information Theory and Reliable Communication,” in which he explained most of the coding theory known at the time but neglected to mention LDPC codes!
Perhaps he thought he had already covered the material in a 1963 monograph based on his thesis, or maybe he too did not regard LDPC codes as practical.
In the late 1990s, with the coding world buzzing about turbo codes, several researchers independently rediscovered Gallager's work and LDPC codes rapidly became the
state of the art.
Amazingly, most important concepts about LDPC codes were already in Gallager’s 1960 thesis.
In the words of Prof. David Forney:
"LDPC codes are a piece of 21st-century coding that happened to fall in the 20th century".
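The core mechanism behind LDPC codes, checking a received word against a sparse parity-check matrix, can be sketched with a toy matrix. The matrix below is made up for illustration; real LDPC matrices are far larger and much sparser.

```python
# A word c is a valid codeword iff every parity check is satisfied,
# i.e., H @ c = 0 (mod 2). Decoding tries to restore this condition.
H = [[1, 1, 0, 1, 0, 0],
     [0, 1, 1, 0, 1, 0],
     [1, 0, 0, 0, 1, 1]]

def syndrome(H, c):
    # each entry is one parity check: sum (mod 2) of the bits it touches
    return [sum(h * x for h, x in zip(row, c)) % 2 for row in H]

codeword = [1, 0, 1, 1, 1, 0]    # satisfies all three checks
corrupted = [1, 1, 1, 1, 1, 0]   # one bit flipped in transmission
```

A nonzero syndrome flags the corruption; Gallager's iterative decoder passes such check information back and forth on the code's graph until the syndrome is zero again.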
Gauss and the Fast Fourier Transform.
In 1805, Gauss was trying to compute asteroid orbits from observations of their positions. Rather than solving a system of equations by hand, he looked for a shortcut. He discovered how to split the equations into subproblems that were easier to solve, and then how to recombine the solutions. In essence, he computed the discrete Fourier series of the data with the splitting technique that forms the basis of modern FFT algorithms. His work was published posthumously in 1866 and lay dormant for a century. In 1965, Cooley and Tukey published a modern version of the algorithm and revolutionized digital signal processing. The FFT dramatically speeds up the discrete Fourier transforms that are at the core of today's data compression, processing and communication. FFT-based OFDM (Orthogonal Frequency Division Multiplexing), in particular, is today's preferred choice for wireless communication, digital broadcasting, and DSL.
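Gauss' split-and-recombine trick survives essentially unchanged in the radix-2 Cooley-Tukey algorithm. A minimal recursive sketch:

```python
import cmath

# A DFT of size n splits into two DFTs of size n/2 (even- and
# odd-indexed samples), recombined with "twiddle factor" rotations.
# Input length must be a power of two.
def fft(x):
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    twiddled = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return [even[k] + twiddled[k] for k in range(n // 2)] + \
           [even[k] - twiddled[k] for k in range(n // 2)]
```

The splitting turns the O(n^2) direct computation into O(n log n), which is what makes real-time OFDM modems practical.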
ISDN: Innovation Subscribers Didn't Need.
In 1979, Bell Labs was designing 4.8-kbit/s modems and discussing progress towards an all-digital network
(termed ISDN) delivering about 160 kbit/s to the subscribers.
A junior team member named John Cioffi calculated, using Shannon’s capacity formula,
that speeds of 1.5 Mbits/s should be possible on a 4-mile twisted-pair line.
In a contentious meeting, he audaciously made such a proposition.
He was laughed at and embarrassed, and recalls being told:
"We're shooting for 144 to 192 kbit/s for ISDN. Your calculations are off; stop dreaming."
Even his own boss told him to
"Shut up and sit down."
160-kbit/s ISDN became a reality, and a commercial failure, earning the substitute acronym "Innovation Subscribers Didn't Need". Too slow to excite consumers, ISDN did nonetheless trigger interest in really fast Digital Subscriber Lines (DSL). Meanwhile, by 1986, Cioffi had become a professor at Stanford University and was researching adaptive OFDM transmission, implemented using FFTs, for Asymmetric DSL (ADSL). Funded by a soon-to-be-bankrupt venture capital firm, he founded Amati Communications. In 1993, Bellcore organized an ADSL competition, the so-called Bellcore Olympics. Amati's OFDM system faced single-carrier designs from Bellcore, GTE and British Telecom. With 1/10th the budget, Amati's system ran four times faster (6 Mbits/s) and was selected as the US standard for ADSL. In 1998, Texas Instruments bought Amati for $395 Million. A rerun of the Bellcore Olympics took place in 2003 and was again won by OFDM, which became the standard for Very high-speed DSL (VDSL). DSL is featured in over 1/3 of the twisted-pair lines worldwide.
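The kind of calculation Cioffi made is easy to reproduce with Shannon's formula C = B·log2(1 + SNR). The bandwidth and SNR below are illustrative assumptions, not his actual line measurements:

```python
import math

# Shannon capacity of a band-limited channel with additive noise.
bandwidth_hz = 500e3    # usable bandwidth on a short twisted pair (assumed)
snr_db = 20             # signal-to-noise ratio in dB (assumed)
snr = 10 ** (snr_db / 10)

capacity_bps = bandwidth_hz * math.log2(1 + snr)   # ≈ 3.3 Mbit/s
```

Even under modest assumptions, the capacity of a short twisted pair dwarfs the 144-192 kbit/s ISDN target, which is exactly the gap Cioffi was pointing at.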
The origins of carrier multiplexing.
Carrier multiplexing allows large numbers of signals to be transmitted over a common line (wire, cable, optical fiber, etc.) by shifting them in frequency. The first demonstration of carrier multiplexing was carried out by George O. Squier of the US Army Signal Corps on Sept. 18th, 1910. He multiplexed two analog signals over a single 7-mile telephone circuit. Dr. Frank Jewett, later president of Bell Telephone Laboratories, indicated that he saw no great commercial value in the idea because of the attenuation suffered by high-frequency signals. The Bell System's engineers, however, proved him wrong, and by 1918 they had developed the first such commercial system for AT&T. As it turns out, Jewett had assumed excessively long lines, which led him to a pessimistic assessment. Today, carrier multiplexing and its variants are ubiquitous.
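The frequency-shifting idea can be sketched with two tones, each modulating its own carrier so that both share one line in disjoint frequency bands. All frequencies below are illustrative:

```python
import math

# Toy frequency-division multiplexing: each baseband tone amplitude-
# modulates its own carrier, and the line carries the sum.
fs = 8000                               # sample rate (Hz)
t = [n / fs for n in range(fs)]         # one second of samples

def tone(f):
    return [math.sin(2 * math.pi * f * ti) for ti in t]

baseband_a, baseband_b = tone(300), tone(400)
carrier_a, carrier_b = tone(2000), tone(3000)

# The two signals now occupy separate bands around 2 kHz and 3 kHz,
# so a bandpass filter at the far end can recover either one.
line = [a * ca + b * cb
        for a, ca, b, cb in zip(baseband_a, carrier_a, baseband_b, carrier_b)]
```

The receiver's job is the reverse: filter out one band, then shift it back down to baseband.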
Outside the realm of communications, there are many other examples of breakthrough ideas that were ahead of their time.
Music goes portable.
In 1972, Andreas Pavel pushed a button on his newly invented 'Stereobelt' and the song “Push Push” started on an attached headset.
Music had become portable. Realizing that his 'Stereobelt' could
"multiply the aesthetic potential of any situation,"
Pavel approached companies such as ITT, Grundig, Yamaha and Philips, but they felt people would never wear headphones in public. Frustrated, Pavel filed a patent for the 'Stereobelt' in 1977. And then, in 1979, Sony released a similar personal stereo, built initially for one of Sony's chairmen, who wanted to listen to operas during his frequent flights. The name 'Walkman' was proposed, but the company's leadership was skeptical and other names were entertained, e.g., 'Walky'. In the end, 'Walkman' prevailed. It would go down in history as the generic way of describing a portable cassette player, as listed in the Oxford English Dictionary. A legal dispute ensued with Pavel, and in 1986 Sony agreed to pay him limited royalties. A second dispute was dismissed in 1996, leaving Pavel with over $3 Million in debt. With Pavel threatening new lawsuits, in 2003 Sony settled, paying over $10 Million plus certain royalties. Pavel was also recognized as the original inventor of the Walkman; this apparently could only be achieved after the death of Akio Morita, the founder of Sony. The Stereobelt and the Walkman were succeeded by the MiniDisc, the Discman, and today's phenomenon: the iPod.
There is a market for maybe 5 computers.
Thomas J. Watson Sr. presided over the growth of IBM into an international force from the 1920s to the 1950s, largely on the basis of punched-card tabulating machines.
In 1943, however, he is alleged to have famously said:
"I think there is a world market for maybe five computers."
Who wants to copy a document on paper?
Working at a New York patent department, Chester F. Carlson frequently had to copy specifications and drawings by hand.
Tired of this frustrating method, he decided to come up with a more convenient alternative.
Despite suffering from arthritis, he set up a lab, first in the closet and later in the kitchen of his tiny apartment, and started experimenting.
On October 22nd, 1938, he made the first electrophotographic image on wax paper.
Over the next 6 years, he offered the rights to the process to every major equipment company in the country, only to be turned down every time. One of the rejection letters read:
"Who the hell wants to copy a document on plain paper?"
Even the National Inventors Council dismissed the idea. Eventually, the Haloid company acquired the rights to the copying process and changed the name 'electrophotography' to 'xerography'. Soon, Haloid changed its own name to the Xerox Corporation.
Heating food with radar technology.
In 1945, Percy Spencer, a self-taught engineer, was investigating alternative uses for the magnetron,
a unit developed to generate microwaves for radar transmissions.
While testing a magnetron, Spencer found that a chocolate bar in his pocket had melted.
He experimented further with an egg, which exploded, and with popcorn, which popped all over the room.
Spencer then constructed a metal box and fed microwaves in from a magnetron, and the microwave oven was born.
In 1947, his employer (Raytheon) built the "Radarange", the first commercial water-cooled microwave oven.
It was almost 2 m tall, weighed 340 kilos, and cost US$5000. Not till the 1970s did microwave ovens begin to outsell traditional
ovens but, by now, 90% of US households feature a microwave oven.
(Entry contributed by Julia Hall.)
Flying machines are impossible.
In 1895, Lord Kelvin, president of the Royal Society, said:
"Heavier-than-air flying machines are impossible."
On Dec. 17th, 1903, the Wright brothers successfully made the first controlled heavier-than-air human flight.
Who the hell wants to hear actors talk?
Harold M. "Harry" Warner was one of the founders of Warner Bros. and a major contributor to the development of the film industry.
In 1925, Harry's brother Sam acquired a radio station, KFWB, and decided to attempt to use synchronized sound in future Warner Bros. Pictures.
When the idea was pitched to him in 1927, Harry memorably said:
"Who the hell wants to hear actors talk?"
Despite these reservations, under Harry and his brothers' leadership the company became a successful pioneer of the sound film industry.
The painful birth of a pain reliever.
It was known by the 19th century that salicin, a compound derived from willow bark, relieved pain effectively.
However, its derivative, salicylic acid, was tough on stomachs, and a means of buffering it was needed.
In 1853, the chemist Charles F. Gerhardt neutralized salicylic acid with sodium and acetyl chloride, thereby creating acetylsalicylic acid.
Gerhardt, however, had no desire to market the product and abandoned the discovery.
In 1899, Felix Hoffmann, who worked for a German company called Bayer, rediscovered Gerhardt's formula.
Hoffmann tried it on his father, who was suffering from arthritis.
When he proposed marketing the drug, the head of Bayer’s pharmacological institute sent a reply letter that included:
"This is typical Berlin hot air. The product is worthless."
Ultimately, Bayer's chairman interceded and the company marketed the drug. Aspirin was patented on Feb. 27th, 1900.
Compilers and bugs.
Software for early computers was written in assembly.
High-level languages were not invented until the benefits of reusing software across CPUs became greater than the cost of writing a compiler. In the 1950s, machine-independent languages were proposed and compilers developed, the first by Grace Hopper. A US Navy volunteer who had needed an exemption to enlist (she was below the Navy minimum weight of 54 kg), Grace joined the team developing the UNIVAC I, the second commercial computer produced in the US, and by 1952 she had an operational compiler:
"Nobody believed that. I had a running compiler and nobody would touch it. They told me computers could only do arithmetic."
Grace, who went on to become an admiral, is also famous for popularizing the term "debugging" for fixing computer glitches. She once discovered a moth stuck in a relay and impeding operation; upon its removal, she remarked that they were "debugging" the system. The moth's remains are at the Smithsonian Institution's Nat'l Museum of American History.
"The reasonable man adapts to the world around him.
The unreasonable man tries to change the world.
Therefore, all progress is the result of unreasonable men."
George Bernard Shaw