The First Chapter of Computer Languages

The Tortuous Path of Early Programming

In the perpetual darkness more than two miles below the surface of the North Atlantic, a submersible sled slowly traced the alpine contours of the ocean bottom in the summer of 1985. Named the Argo after the ship in which the legendary Greek hero Jason sought the Golden Fleece, the 16-foot-long craft resembled a section of scaffolding flung on its side and stuffed with equipment: powerful lights, sonar, video cameras. Far above, arrayed in front of a video screen in the control room of the U.S. Navy research vessel Knorr, members of a joint French-American scientific expedition intently watched the images transmitted by the submersible as it was towed above a desolate landscape of canyons and mud slides.

After 16 days of patient search, a scattering of metallic debris appeared on the screen, followed by the unmistakable outline of a ship’s boiler. A jubilant cry arose from the scientists. The ocean liner Titanic, sunk 73 years earlier with more than 1,500 of its 2,200 passengers on board, had finally been found.

The quest for the remains of the Titanic in the crushing depths of the sea was a remarkable application of computer technology, as exotic in its means as in its venue. Not least of the keys to the successful outcome was the agility of modern computer programming.

The Argo’s ensemble of sonar, lights and cameras was orchestrated by an array of computers, each programmed in a different computer language. The computer on the unmanned Argo itself was programmed in FORTH, a concise but versatile language originally designed to regulate the movement of telescopes and also used to control devices and processes ranging from heart monitors to special-effects video cameras. The computer on the Knorr was programmed in C, a powerful but rather cryptic language capable of precisely specifying computer operations. The telemetry system at either end of the finger-thick coaxial cable connecting the vessels, which in effect enabled their computers to “talk” to each other, was programmed in a third, rudimentary tongue known as assembly language.

Programming languages are the carefully and ingeniously conceived sequences of words, letters, numerals and abbreviated mnemonics used by people to communicate with their computers. Without them, computers and their allied equipment would be useless hardware. Each language is governed by its own grammar and syntax. A programming language that approximates human language and can generate more than one instruction with a single statement is deemed to be “high-level.” But computer languages tend to be much more sober and precise than human languages. They do not indulge in multiple meanings, inflections or twists of irony. Like computers themselves, computer languages have no sense of humour.

Today there are several hundred such languages, considerably more than a thousand if their variations, called dialects, are counted. They enable their users to achieve a multitude of purposes, from solving complex mathematical problems and manipulating (or “crunching”) business statistics to creating musical scores and computer graphics. No existing language is perfect for every situation. One or more of three factors usually determines the choice among them: the language is convenient to the programmer; it is usable on the available computer; it is well suited to the problem at hand. The multiple tongues employed on the Titanic expedition are a case in point. For the computers aboard the surface ship Knorr, C was the preferred language because it provided more direct control of the computer’s hardware. FORTH was the only high-level language that could be used on the submersible Argo’s computer. And the precise timing required of the signals passed by cable between the two vessels was best accomplished by rigid assembly language.

As varied as languages have become, they all build on a common base. At their most fundamental level, computers respond to only a single language: the high and low electric voltages representing the ones and zeros of binary code. Depending on how these signals are fed into a computer’s memory, one collection of them might represent a finished result; another might be a piece of data yet to be processed.

Yet another collection of binary digits, or bits, might command the machine to perform a certain action, such as adding two numbers. The circuitry of each type of computer is designed to respond to a specific and finite set of these binary-encoded machine instructions, which may be combined and recombined to enable the machine to perform a vast range of tasks. Though straightforward enough in principle, this so-called machine code is a forbidding, alien language to human beings. A computer program of any size, in its machine-code form, consists of thousands or even millions of ones and zeros, strung together like beads on a seemingly interminable string. A mistake in even one of these digits can make the difference between a program’s success and failure.

Less than half a century ago, machine code was the only means of communicating with computers. Since then, generations of language designers have harnessed the power of the computer to make it serve as its own translator. Now, when a programmer uses the command PRINT “Hello” or the statement LET A = B * (C – D) in a program, a translating program is called into action, converting those commands into the ones and zeros that the machine can understand.

The methods used to program the world’s first general-purpose computers were as cumbersome and primitive as the machines they served. The historic Mark 1, assembled at Harvard University during World War II, was a five-ton conglomeration of relays, shafts, gears and dials, 51 feet long. It received its instructions for solving problems from spools of punched paper tape that were prepared and fed into the computer by a small corps of technicians. A more advanced machine, ENIAC (for Electronic Numerical Integrator and Computer), was completed in 1945 at the University of Pennsylvania’s Moore School of Electrical Engineering. Unlike the Mark 1, which was electromechanical, ENIAC was fully electronic. But it was still devilishly difficult to program. Its primary developers, physicist John W. Mauchly and engineer J. Presper Eckert, had responded to the urgencies of wartime by concentrating on ENIAC’s hardware; programming took a back seat. ENIAC was not even equipped to receive instructions on paper tape. To prepare it for operation, a team of technicians had to set thousands of switches by hand and insert hundreds of cables into plugboards until the front of the computer resembled a bowl of spaghetti. Not surprisingly, ENIAC’s users tried to squeeze the last drop of information out of any given configuration before they undertook to change it.

These early experiences made it all too plain that a better means of communicating with the machine was needed if computers were to approach their potential. And even as ENIAC hummed through its first electronic calculations, some forward-looking work on higher-level programming was being done elsewhere. In at least one case, however, many years would pass before the results came to light.

Konrad Zuse’s world was crashing down around him early in 1945 as the Allied military noose tightened on Berlin, his home city. The young German engineer had been working since before the war on a series of relatively small, general-purpose computers, using the living room of his parents’ apartment as his laboratory. Zuse’s efforts were a notable example of parallel yet independent developments in science; he had no idea of the similar progress being made in other nations, and his own government had shown little interest in his computer work. Shortly before the fall of Berlin, Zuse loaded his only surviving computer, dubbed the Z4, onto a wagon and fled with a convoy of other refugees to a small town in the Bavarian Alps.

During the grim years immediately after the war, Zuse found himself without either funds or facilities to work on computer hardware. Turning his energies to theory instead, he sought a better way to program a computer, not specifically the Z4, but any similar machine. What was needed, he decided, was a system of symbolic and numeric notations based on a logical sequence, in effect a calculus of problem-solving steps.

Working alone, Zuse devised a programming system that he named Plan Calculus, or, in German, Plankalkül. He wrote a manuscript explaining his creation and applying it to a variety of problems, including sorting numbers and doing arithmetic by means of binary notation (other computers of the day operated in decimal). He also taught himself to play chess and then produced 49 pages of program fragments in Plankalkül that would allow a computer to assess a player’s position. “It was interesting for me to test the efficiency and general scope of the Plankalkül,” Zuse later wrote, “by applying it to chess problems.”

Zuse never expected to see his language actually run on a computer. “The Plankalkül,” he wrote, “arose purely as a piece of desk work, without regard to whether or not machines suitable for Plankalkül programs would be available in the foreseeable future.” Although he briefly visited the United States in the late 1940s, only small portions of his manuscript were published, much less implemented, in the decade after the war; many of his ideas for a systematic, logical language remained unknown to an entire generation of computer linguists. Not until 1972 was the manuscript published in full, leading experts to wonder what effect Plankalkül would have had if it had been disseminated earlier. “It shows us how different things might have been,” one critic of subsequent languages has noted, “how what we have today is not necessarily the best of all possible worlds.”

While Zuse was labouring in isolation, a collegial effort to develop programming languages for real machines was under way at academic centres in Great Britain and the United States, where the earliest computers were beginning to be used. But progress was slow. Not only did each computer have its own machine code and programming method, but developing the machines themselves required the lion’s share of the scientists’ time and talent.

During the years immediately after the war, most programmers continued to work in machine code, the binary digits that correspond to a computer’s circuits. To make the job slightly easier, some of them began using shorthand number systems to denote combinations of bits, a method akin to a stenographer’s using symbols to represent words when taking dictation. The first of these systems was base eight, also known as octal. Just as there are only two digits, 0 and 1, in the binary system, there are eight in octal, the numerals 0 through 7. Each of these octal numerals is used to represent one of the eight possible combinations of three bits (000, 001, 010, 011, 100, 101, 110 and 111). A more ambitious numbering system that followed is base 16, or hexadecimal (hex, to programmers). Here the bits are gathered into groups of four, and the 16 possible combinations of four bits are represented by the numerals 0 through 9 and the letters A through F. Hexadecimal survives to this day, for example in colour codes such as #000000 for black and #FFFFFF for white.
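
The arithmetic of these shorthand systems can be sketched in a few lines of modern Python. The 12-bit word below is an arbitrary example, not a historical machine instruction; the point is simply that grouping bits by threes yields octal digits, and by fours yields hexadecimal digits.

```python
# A 12-bit machine word, chosen arbitrarily for illustration.
word = "101111010110"

# Octal: each group of three bits becomes one digit, 0 through 7.
octal = "".join(str(int(word[i:i + 3], 2)) for i in range(0, len(word), 3))

# Hexadecimal: each group of four bits becomes one digit, 0-9 or A-F.
hex_digits = "0123456789ABCDEF"
hexa = "".join(hex_digits[int(word[i:i + 4], 2)] for i in range(0, len(word), 4))

print(octal)  # the same word, three bits per digit
print(hexa)   # the same word, four bits per digit
```

Either shorthand names exactly the same pattern of bits; the programmer merely reads it in larger chunks.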

To at least one frustrated American programmer, the modest progress offered by such number systems seemed grossly insufficient. Grace Murray Hopper was accustomed to being in the vanguard. She had grown up fascinated by things mechanical, “gadgets,” she called them. As a girl of seven she had taken apart all the wind-up alarm clocks in her family’s summer home in New Hampshire to discover how they worked, though she could not put them back together. The spanking that followed failed to dim her scientific enthusiasm. After graduating with honours from Vassar College in 1928, she earned a Ph.D. in mathematics at Yale, a rare achievement for a woman of her day, and then returned to Vassar to teach.

At the height of World War II, Hopper joined the U.S. Naval Reserve, and in June 1944 she earned her commission. Her contribution through the years would be prodigious. Lieutenant Hopper was assigned to the navy team that was developing programs for the Mark 1 at Harvard. “Mark 1 was the biggest, prettiest gadget I’d ever seen,” she later said.

The programming team Hopper joined consisted of two male ensigns. She subsequently learned that when the men heard a “grey-haired old college professor” was coming, one of them bribed the other so that he would not have to take the desk next to hers. Hopper soon proved her worth as a programmer, however. “I had an edge,” she said. “I had studied engineering as well as mathematics, and I knew how the machine worked from the beginning. Of course, I was lucky. When I graduated in 1928, I didn’t know there was going to be a computer in 1944.”

In 1949, a civilian again, Hopper joined the fledgling Eckert-Mauchly Computer Corporation, which was operating out of an old factory in North Philadelphia. Mauchly and Eckert had left the University of Pennsylvania’s Moore School in 1946 after a bitter fight over patent rights to their electronic computers. Once in business for themselves, they secured several contracts and set about building a new machine that they hoped would prove the commercial viability of computing. They called the machine the Universal Automatic Computer, or UNIVAC.

Grace Hopper had learned how to work in octal, teaching herself to add, subtract, multiply and even divide in the strange system. “The entire establishment was firmly convinced that the only way to write an efficient program was in octal,” she later lamented. (The prevailing view was that the computer’s time was more valuable than the programmer’s; if a program could be executed swiftly, the difficulty of writing it was immaterial.) And indeed octal proved very helpful in getting the company’s prototype computer up and running. But it had its drawbacks: Hopper found she was having trouble balancing her personal bank account, an embarrassing dilemma for a trained mathematician. Finally, she appealed to her brother, who was a banker, and after several evenings’ work he solved the mystery. Occasionally she had been subtracting a cheque in octal rather than in the decimal system used by the bank and everyone else. “I face a problem of living in two different worlds,” Hopper said. “That may have been one of the things that motivated me to get rid of octal as much as possible.”

Hopper’s efforts to ease the programmer’s burden (and keep her chequebook balanced) would eventually shape the course of computing. But she was not alone in the attempt. Shortly before she came to Philadelphia, John Mauchly made a suggestion that would take programming a first tentative step beyond octal and hexadecimal. He directed his programmers to devise a computer language that would allow a person to enter problems into the machine in algebraic terms, an approach that Konrad Zuse would have approved of. By the end of 1949, the system, known as Short Code, was operational. Later promoted as an “electronic dictionary,” it was a primitive high-level language and a definite improvement over machine code. A programmer first wrote the problem to be solved in the form of mathematical equations and then used a printed table to translate the equation symbols into two-character codes. For instance, a parenthesis became 09, while the plus symbol became 07. A separate program in the computer then converted these codes to ones and zeros, and the machine performed the appropriate functions.
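
The translation step can be sketched in Python. Only the codes 09 (open parenthesis) and 07 (plus) come from the account above; the rest of the table, and the variable codes, are a hypothetical illustration, not the historical Short Code book.

```python
# Hypothetical fragment of a Short Code-style translation table.
# 09 and 07 are from the text; the other entries are invented stand-ins.
TABLE = {"(": "09", ")": "0A", "+": "07", "=": "03",
         "A": "S0", "B": "S1", "C": "S2"}

def translate(equation):
    """Replace each symbol of an equation with its two-character code."""
    return " ".join(TABLE[ch] for ch in equation if ch != " ")

print(translate("A = ( B + C )"))
```

A separate program would then turn these character pairs into ones and zeros, just as the text describes.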

Short Code’s partner program was essentially a primitive “interpreter”, a language translator that converts the high-level statements in which a program is written into simpler instructions for immediate execution. As programming languages evolved, interpreters would become one of the two basic categories of language translators.

New advances in languages soon overtook Short Code, but its central idea endured. Far from being simply glorified adding machines, computers are consummate manipulators of symbols, whether those symbols represent numbers, letters, colours or even musical notes. A computer has no difficulty taking the code 07 and performing the sequence of steps that leads it to add two numbers, as long as it has been programmed to recognise 07 as the symbol for addition. In the same manner, it can take a complete statement, such as IF N * 100 THEN PRINT N/47, and translate it into the basic machine instructions that will enable the hardware to carry out the desired task. This purposeful manipulation of symbols is the fundamental principle behind all programming languages.
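
The principle can be made concrete in a few lines: the machine treats 07 purely as a symbol and performs addition only because it has been told to associate that symbol with addition. Here 07 as the plus code comes from the text; the second entry is an invented extension for illustration.

```python
import operator

# 07 = plus, per the text above; 08 = minus is a hypothetical addition.
OPS = {"07": operator.add, "08": operator.sub}

def execute(code, x, y):
    """Look up the operation a code symbolises and apply it."""
    return OPS[code](x, y)

print(execute("07", 2, 3))  # addition, because 07 is defined to mean 'add'
```

Change the table, and the very same symbol would mean something else entirely; the meaning lives in the program, not in the symbol.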

Although Short Code was never a commercial success, the language made a deep impression on Grace Hopper. “Short Code was the first step toward something which gave a programmer the power to write a program in a language that bore no resemblance whatsoever to the original machine code,” she said. But before the promise of Short Code could be realised, much more had to be done.

The pace of progress in computer languages was tightly bound to advances in computer hardware, and during the late 1940s there were few such advances. Most of them were influenced by Mauchly and Eckert’s early work and could in fact trace their origins to a specific event: a series of lectures held at the Moore School in the summer of 1946. There, Mauchly and Eckert discussed the successor to ENIAC they were planning. Dubbed the Electronic Discrete Variable Automatic Computer, or EDVAC, it would dramatically reduce the labour involved in changing from one program to another by storing its programs and data electronically in an expanded internal memory.

One participant in that summer’s lectures was Maurice V. Wilkes, then head of the Mathematical Laboratory at Cambridge University. Inspired by the lectures, Wilkes returned to England and set about designing a machine based on the EDVAC concept; construction began in 1947. Named the Electronic Delay Storage Automatic Calculator, or EDSAC, it became operational in 1949, well before Mauchly and Eckert’s firm produced its first commercial computer.

Like many early computers, EDSAC was a finicky performer. One programmer recalled that even the sound of an airplane flying overhead could bring it to a halt. Whenever EDSAC was shut down for any reason, a set of “initial orders” had to be loaded into the machine to enable it to accept programs again. This process made a whirring sound, which was a signal for everyone who wanted to use the computer to come running, programs in hand. Those fortunate enough to have offices nearest the computer usually ended up at the front of the queue. The others might have to wait a long time.

At first, EDSAC could perform 18 basic operations (modern computers usually have a repertoire of about 200), each of them triggered by a particular sequence of ones and zeros. Early on, EDSAC’s designers decided not to force its programmers to use this machine code in their programs. Instead, they set up a system of mnemonics in which each machine instruction was represented by a single capital letter. Thus S meant “subtract,” I meant “read the next row of holes,” T meant “transfer information to storage” and Z meant “stop the machine.” When a programmer typed a mnemonic on a specially adapted keyboard, the corresponding binary instruction was punched into a paper tape, which could then be fed to the machine.
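
The scheme amounts to a lookup table from letters to bit patterns, as this sketch shows. The mnemonics S, I, T and Z are from the text; the five-bit patterns assigned to them here are invented for illustration, not EDSAC’s real encodings.

```python
# One capital letter per machine instruction, as on EDSAC.
# The bit patterns below are hypothetical stand-ins.
MNEMONICS = {
    "S": "00101",  # subtract
    "I": "01000",  # read the next row of holes
    "T": "00110",  # transfer information to storage
    "Z": "11111",  # stop the machine
}

def punch(program):
    """Turn a string of mnemonics into bit patterns for the paper tape."""
    return [MNEMONICS[letter] for letter in program]

print(punch("SITZ"))
```

The programmer thinks in letters; the tape punch, and ultimately the machine, sees only the ones and zeros.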

Even more valuable than the mnemonics devised for EDSAC was the library of subroutines set up for the machine. Subroutines, independent sections of a computer program that are used over and over and are called by the main program when needed, were already a familiar concept in computing: Grace Hopper and her group had used them on the Harvard Mark 1. But they continued to pose their own peculiar problems. Early programmers often kept notebooks containing the commonly used subroutines so that they did not have to start from scratch when one was needed. The problem was that the addresses designating where each of a subroutine’s instructions and variables were to reside in memory changed according to where the subroutine occurred in the program.
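
The address problem can be illustrated with a toy relocation routine. The instruction format here is invented for illustration; the point is that a subroutine copied from a notebook must have every address adjusted to suit its new position in memory.

```python
# A toy subroutine whose addresses are relative to its own start.
subroutine = [("LOAD", 2), ("ADD", 3), ("STORE", 2)]

def relocate(routine, base):
    """Rewrite relative addresses as absolute ones for a given load address."""
    return [(op, base + addr) for op, addr in routine]

print(relocate(subroutine, 100))  # the routine placed at address 100
print(relocate(subroutine, 200))  # the identical routine placed at 200
```

Doing this adjustment by hand for every use of every subroutine was exactly the tedium that assembly systems set out to eliminate.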

Maurice Wilkes called the EDSAC scheme of mnemonics and subroutines an assembly system, commonly known today as assembly language.

Assembly code remains in use today because of its close relation to the machine. An assembly language is machine-specific, designed to correspond to the set of machine-code instructions wired into a particular computer’s CPU. Thus, assembly language is a favourite of programmers who want to compress their programs into the smallest possible space in memory and have them run as fast and efficiently as possible. These attributes made it ideal for programming the telemetry system used by the Titanic’s finders.

Anyone writing in assembly language has to be intimately familiar with how a computer does things, to know, for example, the many steps required simply to add two numbers. And assembly code written for one computer is total gibberish to another. An early version of the language was the creation of the brilliant English mathematician Alan M. Turing.

By 1948, Turing was in charge of programming the prototype of a computer called the Mark 1, a machine being constructed at the University of Manchester. (It was not related to the Mark 1 at Harvard.) The Manchester Mark 1 used combinations of five binary digits to represent the machine’s different instructions, with each instruction requiring four such combinations, or 20 bits. Intending to make the Mark 1 easier to program, Turing installed a system in which a mnemonic symbol was substituted for each of the 32 combinations of zeros and ones possible with a five-bit code. The symbols Turing assigned to the combinations were the letters, numerals and punctuation marks of a standard teleprinter keyboard. For example, a slash (/), or “stroke” to the British, stood for 00000, or zero; an R stood for 01010; and so on up to the symbol representing 11111.
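
Turing’s scheme, too, is a simple table: 32 five-bit combinations, one teleprinter symbol each. In the sketch below, the slash for 00000 and R for 01010 come from the text; the rest of the 32-symbol set is an invented stand-in for the real teleprinter keyboard.

```python
# Hypothetical 32-symbol teleprinter keyboard; only '/' at 00000 and
# 'R' at 01010 are taken from the account above.
symbols = "/ABCDEFGHIRJKLMNOPQSTUVWXYZ.,!?*"

def mnemonic(bits):
    """Map a five-bit instruction chunk to its single-character symbol."""
    return symbols[int(bits, 2)]

print(mnemonic("00000"), mnemonic("01010"))
```

A 20-bit instruction thus collapses into four typed characters, which is precisely what made the Mark 1 easier to program.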

That is the end of part one of the four-part series.
Bibliography:
Time-Warner Books