CSS Dixieland
Probing the depths of knowledge
These essays by P. A. Stonemann, CSS Dixieland, cover a wide range of
historical, philosophical, scientific and technical subjects. Each page
deals with a particular topic, divided into sections and explained by itself.
Every page shows at its top hyperlinks to every other page. The Start page
also gives short descriptions of the other pages. CSS Dixieland expresses
gratitude to the readers who make this work meaningful.
This Web document has been tested with KDE Konqueror, graphic HTML interpreter
for Linux. It may not be rendered correctly by other graphic HTML interpreters.
It will probably be correct when rendered by text-only HTML interpreters (visual,
aural, or Braille tactile interpreters), but if feasible, please use KDE Konqueror.
Uniform Resource Locator:
Computing History page
The dawn of cybernetic evolution
Sections in this page
History of Computing
Technical note: In languages other than English or Latin, but which use mainly
Latin characters, some characters are taken from other alphabets, or some Latin
characters are modified with diacritic marks for representing different phonemic
sounds or other orthographic conventions of those languages. Those characters,
when used in this document, have been encoded as entities of Hyper Text Mark-up
Language or sometimes in Unicode UTF-8. Therefore computers using other character
encodings may render some characters inaccurately, but hopefully, it will still
be possible to read non-English words without too much difficulty.
History of Computing
Including advances in Mathematics, Physics, Electronics and related disciplines
The false notion of the computer as a "modern" invention
The impressive boom of computers in the last few years leads many people to
think that they are a totally new invention, but that is not so. Although
improvements in Computing advance at an incredible speed, devices or
machines that could be considered primitive computers have been designed,
and tentatively or successfully built, for the last two hundred to four
hundred years. Taking the word "computer" in its etymological sense of
"counter" or "calculator", some of those primitive computers are:
-The calculation devices of John Napier in 1617, of William Oughtred in
1621-1627, and of Bissaker in 1654.
-The calculator machines of Wilhelm Schickard in 1623, of Blaise Pascal in
1642-1652, of Sir Samuel Morland about 1660, of Gottfried Wilhelm von
Leibniz in 1694, and of Matthäus Hahn in 1779.
Besides those purely mathematical calculators, the first automatic weaving
machines were built by M. Falcon in 1728, by Basile Bouchon in those same
years, and by Joseph Marie Jacquard in 1801-1804.
The first mechanical computers were tentatively built by Charles Babbage in
1821-1834 and in 1834-1871, although he never finished them. The first
operational mechanical computer was built by Vannevar Bush in 1930.
The first electro-mechanical tabulator was built by Herman Hollerith in
1880-1889.
The first electro-mechanical computers were tentatively built by George
Stibitz with Samuel Williams in 1937-1940, although they were never finished.
The first operational electro-mechanical computers were built by Konrad Zuse
in 1936-1938, by Howard Aiken in 1937-1943, and by Alan Mathison Turing with
Max Newman in 1941-1942.
The first electronic computers were tentatively built by John Atanasoff with
Clifford Berry in 1937-1942, although they were never finished. The first
operational, partly electro-mechanical and partly electronic computers were
built by Konrad Zuse with Helmut Schreyer in 1941-1943.
The first operational, fully electronic computers were built by Alan Mathison
Turing with Max Newman and other collaborators in 1941-1943, and by Presper
Eckert with John Mauchly and John Von Neumann in 1943-1946.
Other geniuses existed who deserve a place in Computing History. This essay
not only offers mention of a number of them, it also presents a good amount
of information about the many resources that computers can offer to those who
tackle the intellectual challenge of learning seriously how to operate these
wonderful inventions.
The true origins of computers or their ancestors
The History of Computing begins with devices used as aids for counting.
Fingers of the hand, stone pebbles or wooden sticks can be used for this.
The Inca of South America also used the Quipu, made of strings with knots,
for keeping record of quantities. The abacus, a device for counting, adding,
subtracting, multiplying, dividing, raising to a power or extracting a root
by means of stones lined in a groove or of balls threaded on a wire, is of
unknown origin. The original idea might have come from ancient Mesopotamian
sources, but there is no historical or archaeological evidence for this.
Probably different types of abaci were independently invented in different
places and at different times. There is documented use in the Roman Empire,
China, Japan and a few other countries since the 1st century or before, and
in more countries afterwards.
The Maya of Central America seem to have used a type of abacus for counting,
in their numbering base of twenty. The abacus is still used today in parts of
Russia and Asia, mostly by vendors in street markets or in shops. In other
countries its use is today almost restricted to the teaching of elementary
Arithmetic, or to keeping score in games, especially billiards, but with
good training the abacus is an efficient calculator.
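The bead arithmetic described above can be sketched in code. The following is a minimal model of abacus-style addition, assuming a Japanese soroban layout in which each rod holds a digit as one "five" bead and up to four "one" beads; the function names are illustrative, not from any historical source.

```python
def to_beads(digit):
    """Split a decimal digit 0-9 into (five_beads, one_beads) on one rod."""
    return digit // 5, digit % 5

def from_beads(fives, ones):
    """Recombine bead positions into a decimal digit."""
    return 5 * fives + ones

def abacus_add(a, b):
    """Add two numbers digit by digit with carries, as an operator
    would work right to left along the rods."""
    result, carry = [], 0
    da = list(map(int, str(a)))[::-1]   # least significant rod first
    db = list(map(int, str(b)))[::-1]
    for i in range(max(len(da), len(db))):
        s = (da[i] if i < len(da) else 0) + (db[i] if i < len(db) else 0) + carry
        carry, digit = divmod(s, 10)
        result.append(digit)
    if carry:
        result.append(carry)
    return int("".join(map(str, result[::-1])))
```

The skilled operator performs exactly this rightward-to-leftward carry propagation, which is why a trained abacus user could keep pace with an early desktop calculator.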
A famous arithmetical competition was organised in 1946, pitting against each
other two well known specialists in calculation: a soldier of the United
States Army using an electro-mechanical desktop calculator, and a clerk of
the Japanese Postal Service using a typical Japanese abacus. The competition
consisted of five complex operations, all of them involving adding,
subtracting, multiplying and dividing. Both men gave accurate results, but
the Japanese clerk with his abacus was faster in four of the operations, and
slower in only one.
About the 1st or 2nd century after Christ, a system of numerals in base ten
was developed in India, introducing the concept of zero and of values that
depend on the position of each cipher within a number. The system reached
Persia, probably during the reign of the Shah Xusraw I Anushiravan (also
known as Khusraw, Khosro, or Chosroes), from the year 531 to 578 or 579,
when relations between India and Persia were frequent. From Persia the
system was copied by Arab mathematicians about 720, who carried it to
territories under Islamic control. The Indian numeral system thus became
known to Muslim scholars in the Iberian Peninsula in the 8th century, and
thence it slowly spread to the rest of Europe, becoming predominant over the
Roman numeral system about the 12th century (although Roman numerals are
still used today).
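The positional principle just described, in which the value of each cipher depends on its place and zero serves as a placeholder, can be illustrated with a short sketch (illustrative code, not historical notation):

```python
def positional_value(digits, base=10):
    """Interpret a list of digits under place-value notation:
    each further position multiplies the running total by the base."""
    value = 0
    for d in digits:
        value = value * base + d
    return value

# In 4005 the zeros are placeholders marking empty hundreds and tens,
# something the additive Roman notation (MMMMV) cannot express by position.
print(positional_value([4, 0, 0, 5]))        # base ten
print(positional_value([1, 0, 1, 1], base=2))  # the same rule in base two
```

The same loop works in any base, which is why the later move from base ten to base two (see year 1930 below) changed the hardware but not the arithmetic principle.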
Chronology
About 1550: nonius (also called vernier), by Pedro Nunes. It is commonly used
today in precision instruments.
1610-1614: Mirifici Logarithmorum Canonis Descriptio (Description of the
Marvellous Canon of Logarithms), logarithmic tables for multiplying,
dividing, raising to a power or extracting a root, by John Napier
(1550-1617). They took four years of laborious calculation by pencil and
paper, and were published in 1614. Once published, they became very useful
to astronomers, engineers, mariners and other scientists and professionals.
Later perfected and enlarged by other mathematicians, logarithmic tables
remained in common use until the 1970's, when they were gradually superseded
by electronic calculators.
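The labour such tables saved rests on one identity: log(a x b) = log a + log b, so a multiplication becomes two table look-ups, one addition, and one inverse look-up. A minimal sketch of the procedure, using a computed logarithm in place of a printed table:

```python
import math

def multiply_by_logarithms(a, b):
    """Multiply two positive numbers the way a table user would:
    look up both logarithms, add them, then take the antilogarithm."""
    return 10 ** (math.log10(a) + math.log10(b))

# The result is exact only up to the precision of the "table"
# (here floating point; historically, the printed decimal places).
product = multiply_by_logarithms(37, 52)
print(product)  # close to 37 * 52 = 1924
```

Division, powers and roots follow the same pattern with subtraction, multiplication and division of the logarithms respectively, which is also the working principle of the slide rules in the next entry.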
1617: Napier's Bones, manually handled ivory rods for multiplying, by adding
the partial products written on the Bones, by John Napier (1550-1617).
1621-1627: circular slide rule by William Oughtred and others, based on
Napier's logarithmic tables. This was the forerunner of all the slide rules
of different forms that were built for three and a half centuries. Even
today they can be found in good stationery shops, but as with logarithmic
tables, slide rules gradually fell out of favour in the 1970's, with the
advent of electronic calculators.
1623: calculator machine of pinion wheels for adding, using numbering base of
ten, by Wilhelm Schickard.
1642-1652: Pascaline, calculator machine of pinion wheels for adding two or
three numbers, up to the number 999 999, using numbering base of ten, by
Blaise Pascal (1623-1662). Several of them were built.
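The pinion wheels of the Pascaline behave like the digit wheels of an odometer: each wheel holds a cipher from 0 to 9, and a complete revolution advances the next wheel by one. A hypothetical sketch of that carry mechanism (the function and its layout are illustrative, not Pascal's actual design):

```python
def pascaline_add(wheels, amount):
    """Add a small amount to the lowest wheel, propagating carries
    leftward as each full wheel revolution advances its neighbour.
    `wheels` lists ciphers least significant first, six wheels as on
    the Pascaline's 999 999 capacity."""
    wheels = wheels[:]          # do not disturb the caller's wheels
    carry = amount
    for i in range(len(wheels)):
        carry, wheels[i] = divmod(wheels[i] + carry, 10)
        if carry == 0:
            break
    return wheels               # overflow past the last wheel is lost

# 999 + 1 ripples a carry through three wheels.
print(pascaline_add([9, 9, 9, 0, 0, 0], 1))
```

The hard mechanical problem Pascal solved was precisely this ripple: making a single turn of the units wheel able, in the worst case, to move every wheel at once.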
1654: slide rule by Bissaker.
About 1660: calculator machine by Sir Samuel Morland.
1666: Dissertatio de Arte Combinatoria (Dissertation on the Combinatorial
Art), essay on mathematical logic, by Gottfried Wilhelm von Leibniz
(1646-1716).
1668: cylindrical version of the Napier Bones, twelve cylinders operated by
manual rotation, each cylinder representing a cipher of a number in numbering
base of ten.
1671-1673: theory of a calculator machine for adding, subtracting,
multiplying or dividing, using numbering base of ten, by Gottfried Wilhelm
von Leibniz. The machine was built in 1694.
1676-1679: first known explanation of the concept of numbering base of two
(the binary base), by Gottfried Wilhelm von Leibniz (see year 1930).
1694: calculator machine for adding, subtracting, multiplying (directly, not
just as a sequence of additions) or dividing, using numbering base of ten, by
Gottfried Wilhelm von Leibniz, according to his own theory of 1671-1673.
1728: automatic weaving machine, using perforations of 10 millimetres in
diameter through wooden boards of 200 x 500 millimetres, by M. Falcon. About
those years, another automatic weaving machine, using a roll of perforated
paper, was built by Basile Bouchon.
1779: calculator machine for adding, subtracting, multiplying or dividing,
by Matthäus Hahn.
1801-1804: automatic weaving machine using perforated cardboard cards, by
Joseph Marie Jacquard (1752-1834).
1812: theory for a differential calculator machine, by Charles Babbage
(1791-1871). He attempted the construction of this machine between 1821-1834,
and of another machine based on different principles between 1834-1871.
1820: Hans Christian Oersted (1777-1851) discovers the mutual influence of
magnets and electricity.
1820: Arithmometer (also called Arithmograph), simplified calculator machine
for adding, subtracting, multiplying or dividing, using numbering base of
ten, by Charles Xavier Thomas de Colmar (based on the machine that had been
built by Gottfried Wilhelm von Leibniz). By 1850 fifteen hundred
Arithmometers (or Arithmographs) had been sold, about fifty per year.
1821-1834: Difference Engine, logarithmic calculator machine for polynomials
of up to eight terms, using numbering base of ten, by Charles Babbage,
according to his own theory of 1812. Tentatively built between 1821-1834 but
never finished; a modified version was built between 1840-1854 and a
perfected version between 1859-1860, both by Pehr Georg Scheutz.
About 1830: serial production of various machines for arithmetic calculation,
all of them using numbering base of ten.
1831: Michael Faraday (1791-1867) discovers that a variable magnetic field
generates an electric current, inventing the magneto.
1834-1871: Analytical Engine using perforated cardboard cards, for processing
logic symbols (which is the basis of Artificial Intelligence) and up to 100
numbers of 40 ciphers each, using numbering base of ten, by Charles Babbage.
An Italian mathematician, Luigi Menabrea, wrote a description of this
machine, which Lady Ada Augusta, Countess of Lovelace, enlarged in 1843,
explaining how the machine could be programmed. Tentatively built between
1834-1871 but never finished by Babbage, the machine was known only from
technical drawings and from part of the printer and of the arithmetic
logical unit, built after the death of Babbage. In 1991, specialists of the
Science Museum in Kensington finally built a complete Babbage machine (his
Difference Engine Number 2), which worked correctly and with a precision of
31 ciphers. The Analytical Engine would have been the first programmable
mechanical arithmetic (digital) computer, but because it was never completed
in its time, that honour corresponds to computers built in the 1930's. The
majority of operational computers of advanced concept, in numbering base of
two, which were built in the 1930's and until the mid 1940's, were
electro-mechanical rather than purely mechanical. The German Z-3 and Z-4
computers, built by Konrad Zuse with Helmut Schreyer, went a step farther
and were partly electronic. Most advanced computers built since the mid
1940's are fully electronic, although purely mechanical or
electro-mechanical counters or calculators were built until the 1970's.
1847: The Mathematical Analysis of Logic, essay by George Boole. Together
with his essay of 1854, it is the origin of the Boolean Logic used today for
different purposes.
1840-1854: modified version of the Difference Engine logarithmic calculator,
by Pehr Georg Scheutz (based on the never finished machine that had been
projected by Charles Babbage).
1854: An Investigation of the Laws of Thought, an algebraic system for
propositions of formal logic, by George Boole. In 1867 Charles Sanders
Peirce suggested that the system could be applied to electric circuits, and
Claude Shannon explained in 1936 how this application could be done. Boolean
Logic is today used by search engines and for many other purposes.
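The application that Peirce suggested and Shannon later worked out can be shown in miniature: switches wired in series behave as Boolean AND, and switches wired in parallel as Boolean OR. An illustrative truth-table check, not Shannon's own notation:

```python
def series(a, b):
    """Current flows through two switches in series only if both are closed."""
    return a and b

def parallel(a, b):
    """Current flows through two switches in parallel if either is closed."""
    return a or b

# De Morgan's law, one of Boole's identities, holds for circuits too:
# interrupting a series pair is the same as paralleling the inverted switches.
for a in (False, True):
    for b in (False, True):
        assert (not series(a, b)) == parallel(not a, not b)
print("De Morgan verified over the full truth table")
```

This correspondence between switch networks and Boolean algebra is what let later designers analyse relay and valve circuits on paper instead of by trial and error.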
1859-1860: perfected version of the Difference Engine logarithmic calculator,
by Pehr Georg Scheutz (based on the never finished machine that had been
projected by Charles Babbage).
1865: James Clerk Maxwell (1831-1879) discovers that electricity and
magnetism are one single force, which according to him propagates in the
form of waves. The Quantum Theory of Max Planck would later add the concept
of tiny packets of energy (quanta) of electro-magnetic radiation, detected
as heat, as visible or invisible light, as Roentgen X-Rays, Gamma Rays, and
other forms of radiation.
About 1870: Rack and Pinion Calculator Machine, by George Barnard Grant.
1872: analogue computer used to calculate the tides, designed by Lord Kelvin
with James Thomson, built by J. White.
1875: calculator machine by Frank Baldwin.
1883: the Edison effect, thermionic emission observed in a lamp fitted with
a metal plate (the principle of the thermionic diode valve or vacuum diode
tube), by Thomas Edison. Discovered by Edison, but not further developed by
him.
1884-1890: adding machine that shows each added amount and prints the result,
by William S. Burroughs. It was the first calculator machine sold by the
thousands, by his American Arithmometer Company, founded in 1886, later
renamed the Burroughs Corporation, which merged with Sperry in 1986 to form
Unisys.
1885-1887: Comptometer calculator machine, by Dorr Felt.
1887: multiplying calculator machine by Léon Bollée.
1888: Heinrich Hertz (1857-1894) demonstrates that electro-magnetic waves
and visible or invisible light are one single phenomenon (see year 1865).
1880-1889: electro-mechanical tabulator machine using perforated cards, by
Herman Hollerith (1860-1929). The use of perforated cards had been suggested
by John Shaw Billings, inspired by automatic weaving machines and by other
similar mechanical devices. This tabulator was the first important
application of a computer in History: in competition against a few other
inventions, Hollerith's machine won the contract for the United States
census of 1890. The machine was in service until the 1930's. In 1896 Herman
Hollerith founded the Tabulating Machine Company (TMC), renamed in 1911 the
Computing Tabulating Recording Company (CTR), and in 1924 International
Business Machines (IBM).
1896: Guglielmo Marconi (1874-1937) sends a wireless signal over a distance
of three kilometres, using sparks reflected by a partly parabolic emitter.
1900: thermionic diode valve (vacuum diode tube), developed by J. A. Fleming
from the discovery that had been made by Edison in 1883.
1906: Audion (thermionic triode valve or vacuum triode tube), by Lee De
Forest.
1907: electro-mechanical tabulator using perforated cards, by James Powers.
It was the second important application of a computer in History: the United
States census of 1910. The machine was in service until the 1940's.
1914: machine to play the chess end game of King and Rook against King, by
Leonardo Torres y Quevedo. The original, in perfect operational condition,
is preserved in the museum of the Polytechnic University, Madrid.
About 1920: Karel Čapek uses the term "robot" (from the Czech "robota",
meaning "drudgery" or "serf labour") in his science fiction play R.U.R.
(Rossum's Universal Robots).
1930: Differential Analyser, analogue computer for solving equations, using
numbering base of ten, by Vannevar Bush (Massachusetts Institute of
Technology). The Differential Analyser was inspired by the analogue computer
of Lord Kelvin, James Thomson and J. White, of 1872. In the 1940's Doctor
Bush was Director of the United States Office of Scientific Research and
Development, and coordinated war time research in the application of Science
to military purposes. In his essay "As We May Think", he describes his vision
of a computer aided text system that he named "Memex". His description of
browsing the Memex of linked information includes the ability for anyone to
insert new information easily, adding to the growing Memex, as hypertext
systems do today in the Gopher Protocol, or in the Hyper Text Transfer
Protocol and Mark-up Language used by the World Wide Web.
1930: following the ideas that had been explained by Gottfried Wilhelm von
Leibniz in 1676-1679, Louis Couffignal suggests that calculator machines (or
computers) should use a numbering base of two instead of a numbering base of
ten, because in base two it is easy to represent numbers by giving either an
"off" or an "on" status to memory locations. Most previous computers or
calculators had been only mechanical, some had been electro-mechanical, but
all of them used numbering base of ten by means of pinion wheels (in the
mechanical devices) or of electric relays (in the electro-mechanical
devices). Electro-mechanical computers predominated from the 1930's to the
early 1950's. They were programmed by hardware connections.
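Couffignal's argument is easy to see in code: a numeral in base two maps directly onto a row of two-state devices (one relay, off or on, per cipher), whereas base ten needs a ten-position wheel for every cipher. A small sketch:

```python
def to_base_two(n):
    """Express a non-negative integer as a list of 0/1 states,
    most significant first - one two-state relay or valve per cipher."""
    if n == 0:
        return [0]
    bits = []
    while n:
        n, r = divmod(n, 2)
        bits.append(r)
    return bits[::-1]

# The number 13 needs four two-state devices: on, on, off, on.
print(to_base_two(13))
```

The cost is length (13 needs four binary ciphers against two decimal ones), but each position now needs only the simplest possible physical element, which is why base two won out once relays and valves replaced wheels.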
1936: On Computable Numbers, essay that develops the concept of the stored
programme (as opposed to programming by hardware connections), by Alan
Mathison Turing (1912-1954) (Cambridge University). He and other pioneers
declare that advances in Robotics and in Automatics (from the Greek word
"automaton", meaning "that which acts by itself") will make possible the
construction of thinking machines. The Turing Test would be passed by a
machine that could fool a human into thinking that he is in communication
with another human.
1936: essay explaining the application of Boolean Logic to electric circuits,
by Claude Shannon (Massachusetts Institute of Technology). This essay was
inspired by the analogue Differential Analyser of Vannevar Bush, which
Shannon had studied in detail, and by the suggestion proposed in 1867 by
Charles Sanders Peirce. Since 1937 the ideas of Mister Shannon have been
applied to telephone switches. See year 1948 for another important theory
written by him.
1936-1938: Z-1, electro-mechanical computer using magnetic relays, a keyboard
for input and a panel of lights for output, by Konrad Zuse. FIRST COMPUTER
USING NUMBERING BASE OF TWO.
1938: Z-2, electro-mechanical computer using magnetic relays, a keyboard for
input and a panel of lights for output, plus perforated film strips for input
or output, by Konrad Zuse. SECOND COMPUTER USING NUMBERING BASE OF TWO.
1937-1940: Complex Number Calculator, electro-mechanical computer for adding,
subtracting, multiplying or dividing, using numbering base of two and
magnetic relays, by George Stibitz (Bell Telephone), in collaboration with
Samuel Williams. It was never finished, but it served as a model for some
other electro-mechanical computers in numbering base of two (although some
purely mechanical calculators in numbering base of ten also continued being
made until the 1970's).
1937-1942: ABC, Atanasoff-Berry Computer, by John Atanasoff (Iowa State
College), in collaboration with Clifford Berry. It used numbering base of two
and 300 vacuum tubes. It was presented to the public in 1939, although it was
unfinished and so it remained. Had it been finished, it would have become
the first electronic digital computer. A historic court decision in 1972
recognised that this computer had at least been an inspiration for building
some other computers.
1937-1943: Harvard Mark I, electro-mechanical computer using magnetic relays,
perforated cardboard cards and numbering base of ten, operational in 1943 and
presented to the public in 1944, by the group of Howard Aiken (Harvard
University and International Business Machines), with the support of the
United States Navy. It had 200 000 components, a height of almost 3 metres
and a length of almost 20 metres. In service until 1959, it could add or
subtract numbers of 23 ciphers in 0.2 seconds, multiply them in 4 seconds,
or divide them in 10 seconds.
1940: Enigma, German machine to cipher or decipher military codes, used for
secret communications between headquarters and field commanders.
1940-1942: ambitious project for an electronic digital computer using vacuum
tubes and numbering base of two, by Konrad Zuse in collaboration with Helmut
Schreyer. Never built: the expensive project was rejected by the German
Government in 1940, and the two inventors had to abandon it definitively in
1942, in order to concentrate on perfecting the more modest Z-3 computer and
on creating the Z-4 computer.
1941: Z-3, partly electro-mechanical and partly electronic computer using
magnetic relays and some vacuum tubes, of numbering base of two, by Konrad
Zuse with Helmut Schreyer. Used mostly to design flying machines, it was in
service until 1944, when it was destroyed during the attacks against Berlin.
FIRST PARTLY ELECTRONIC DIGITAL COMPUTER THAT BECAME FULLY OPERATIONAL (the
ABC of Atanasoff and Berry had never been finished).
1941-1942: Ultra, electro-mechanical computer using numbering base of ten and
magnetic relays, by Alan Mathison Turing in collaboration with Max Newman and
others. It was applied in Britain to decipher German secret communications,
mostly those produced by the German Enigma machine.
1942: printed circuit by Paul Eisler (1907-1992). It gradually replaced
connections by internal wires.
1943: Z-4, partly electro-mechanical and partly electronic computer using
magnetic relays and some vacuum tubes, of numbering base of two, by Konrad
Zuse with Helmut Schreyer. Used mostly to design flying machines, it was in
service until 1945, when it was carried out of Berlin and hidden for many
years. SECOND PARTLY ELECTRONIC DIGITAL COMPUTER THAT BECAME FULLY
OPERATIONAL (the ABC of Atanasoff and Berry had never been finished).
1941-1943: Colossus I, FIRST FULLY ELECTRONIC DIGITAL COMPUTER (of bigger
size than the ABC, the Z-3 or the Z-4), using numbering base of ten,
perforated paper tapes, and 2 000 vacuum tubes, by Alan Mathison Turing with
Max Newman and others. It was applied in Britain to decipher German secret
communications, mostly those produced by the German Enigma machine. Ten
Colossus I were built, all of them disassembled in 1946.
1944: the London Times uses the term "computer", in reference to machines
capable of performing complex calculations or other intellectual operations.
1943-1946: ENIAC, Electronic Numerical Integrator And Computer, SECOND FULLY
ELECTRONIC DIGITAL COMPUTER (of much bigger size than the ABC, the Z-3, the
Z-4 or the Colossus I), by Presper Eckert in collaboration with John Mauchly
(-1980) (Moore Engineering School, University of Pennsylvania) and with John
Von Neumann (1903-1957) (Institute for Advanced Studies, Princeton, not to
be confused with Max Newman). It used numbering base of ten, perforated
cards and 17 474 vacuum tubes at 100 Kilohertz, consuming 150 Kilowatts in
operation, plus the consumption of the refrigeration system (necessary to
extract the heat generated by the vacuum tubes), and was programmable by
hardware connections. Presented to the public in 1946, the ENIAC had a
height of over 4 metres, a length of almost 30 metres, and a weight of 4
Megagrammes for its core only, almost 30 Megagrammes counting its
peripherals and support systems. Faster than the Harvard Mark I by a factor
of over a thousand, the ENIAC could perform operations in 200 microseconds.
Computers of this kind are called "first generation", and they predominated
from the 1940's to the 1950's.
John Von Neumann developed between 1945 and 1950 the theory of logic circuits
(also called "Von Neumann Architecture"), in collaboration with Arthur Burks
and Herman Goldstine. This theory was applied to the EDVAC in 1948 (renamed
UNIVAC in 1951), to the EDSAC in 1949, and to many other computers
afterwards.
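The stored-programme idea behind the Von Neumann Architecture can be pictured with a toy machine: instructions and data share one memory, and the processor repeatedly fetches, decodes and executes. The instruction set below is invented for illustration and does not correspond to the EDVAC or to any historical machine:

```python
def run(memory):
    """Fetch-decode-execute loop of a toy stored-programme machine.
    `memory` holds instructions (opcode, argument) and plain data
    side by side, as in the Von Neumann Architecture."""
    acc, pc = 0, 0                   # accumulator and programme counter
    while True:
        op, arg = memory[pc]         # fetch the next instruction
        pc += 1
        if op == "LOAD":             # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Programme and data in the same memory: cells 0-3 hold code, cells 4-6 data.
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 20, 22, 0]
print(run(memory)[6])
```

Because the programme lives in the same memory as the data, it can be loaded, replaced, or in principle even modified like any other data, which is the decisive difference from programming by hardware connections.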
1945: a real bug, an insect, temporarily stops a Mark II computer at the
Naval Center in Virginia. Since that time some programmers use the term "bug"
in reference to different kinds of unexpected programming errors.
About 1945: As We May Think, essay by Vannevar Bush (Massachusetts Institute
of Technology, Director of the United States Office of Scientific Research
and Development), describing a computer aided hypertext system that he named
"Memex", able to find linked information and to let its different users
easily insert new information.
December 1947: solid state point-contact transistor, made of germanium, by
John Bardeen and Walter Brattain (Bell Telephone). Presented to the public in
1948.
1948: Norbert Wiener coins the term "Cybernetics" (from the Greek word
"kybernetes", meaning "steersman" or "governor"), defined as "the Science of
control and communication in the animal and the machine".
1948: A Mathematical Theory of Communication, essay explaining how to apply
the numbering base of two to computers, by Claude Shannon (Massachusetts
Institute of Technology). It is the first use of the term "bit" (binary
digit), although the concept of a minimal unit of information based on one
of two possible states had already been proposed by Konrad Zuse, who called
it a "JA - NEIN" ("YES - NO" in German). Claude Shannon greatly influenced
the further development of computers using numbering base of two, and
definitively provoked the demise of the numbering base of ten for nearly all
advanced computers. He also speculated on how computers might play chess.
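Shannon's bit quantifies exactly the two-state choice of Zuse's "JA - NEIN": to distinguish among n equally likely possibilities, a code needs log2(n) binary digits, rounded up to whole bits. A small sketch:

```python
import math

def bits_needed(n_states):
    """Whole bits needed to give each of n states a distinct code."""
    return math.ceil(math.log2(n_states))

# One relay distinguishes 2 states; eight relays distinguish 256.
print(bits_needed(2))
print(bits_needed(256))
print(bits_needed(1000))
```

This logarithmic measure is why adding one more two-state device doubles, rather than merely increments, the number of representable values.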
1948: Manchester Mark I (not to be confused with the Harvard Mark I),
electronic digital computer using numbering base of two, phosphor screens
and perforated paper tapes, by Max Newman (not to be confused with John Von
Neumann). Alan Mathison Turing developed a kind of assembly language for it.
FIRST COMPUTER USING STORED CODE (as opposed to programming by hardware
connections).
1948: BINAC, first computer using magnetic tapes (of big format), by John
Mauchly in collaboration with Presper Eckert.
1947-1949: EDSAC, Electronic Delay Storage Automatic Calculator, electronic
digital computer by Maurice Wilkes (Cambridge University), using numbering
base of two and mercury delay line tubes. SECOND COMPUTER USING STORED CODE.
The EDSAC was based on the project of the EDVAC, but was finished earlier.
1948-1951: EDVAC, Electronic Discrete Variable Automatic Computer, electronic
digital computer using numbering base of two, perforated cards and vacuum
tubes, by Presper Eckert with John Mauchly, in collaboration with John Von
Neumann. The EDVAC definitively introduced the concept of stored code in
computer programming. This project was modified by Eckert and Mauchly to
accept big format magnetic tapes instead of perforated paper tapes. In 1951
those modifications were released in a new model renamed UNIVAC I, Universal
Automatic Computer I (see further below).
1949: Short Order Code, by Mandy (Univac), first scientific programming
language.
1944-1950: Whirlwind, digital computer by Jay Forrester (Massachusetts
Institute of Technology), the first computer operable in real time. It began
in 1944 as an analogue computer, but it was shortly later modified to be
digital.
1950: Pilot ACE, Automatic Computing Engine, electronic digital computer by
Alan Mathison Turing with Max Newman and others, using numbering base of two
and programmable by a kind of assembly language. THIRD COMPUTER USING STORED
CODE. Like the EDSAC, the Pilot ACE was also based on the project of the
EDVAC.
1950: Silhouette of a Scottish Dancer, first artistic image in a computer
screen (an oscilloscope), made by an anonymous operator in the EDSAC of
Maurice Wilkes at Cambridge University.
1947-1951: UNIVAC I, Universal Automatic Computer I, by Presper Eckert with
John Mauchly (Sperry Rand), in collaboration with John Von Neumann, marketed
by the Univac Division of Remington Rand. It occupied 20 square metres and
had a weight of 5 Megagrammes. In spite of being a tenth of the size of the
ENIAC, the UNIVAC I had a hundred times the memory of the ENIAC. Fifteen
Univac I were built. Computers of this kind are called "second generation",
and they predominated from the 1950's to the 1960's.
1951: the point-contact transistor of Bardeen and Brattain is modified by
William Shockley (Bell Telephone) and named the junction transistor, made of
germanium. The three scientists were awarded the Nobel Prize in 1956.
1951: LEO, electronic digital computer using numbering base of two, serially
built and commercially marketed.
1951: A-0 Coding Translator, by Captain Grace Hopper (United States Navy and
Univac), first compiler of routines (repetitive tasks done many times by the
computer in the same or in different programmes).
1951: programme to play draughts, made by Christopher Strachey in the Mark I
of Max Newman (not to be confused with John Von Neumann) at Manchester
University.
1952: proposal of an integrated circuit, by G. W. A. Dummer. This was hardly
possible in 1952, when point-contact transistors or junction transistors
were the only kinds of transistor available. Integrated circuits only became
a possibility when Jean Hoerni invented the planar transistor in December
1958.
1952: IBM 700, by International Business Machines, built of vacuum tubes.
1953: IBM 701, by International Business Machines. Successful with the few
big corporations, research institutions or government agencies that could
afford its price, the IBM 701 inaugurated a kind of electronic digital
computer serially built and commercially marketed, which used vacuum tubes,
perforated cards or paper tapes for input or output, and big format magnetic
tapes for internal storage.
1953: Speed Coding, by Seldon and John Backus (International Business
Machines), second scientific programming language.
1953: magnetic core memory, by Jay Forrester (Massachusetts Institute of
Technology). It gradually replaced memory based on vacuum tubes.
1954: IBM 650, by International Business Machines. The series continued in
later years with the IBM 702, IBM 704, IBM 709, IBM 790 and IBM 794. Cards
and paper tapes gradually disappeared, and were almost gone by the 1980's,
but big format magnetic tapes continued in use in some big computers even
after the year 2000. This IBM series covered a wider range of applications
than its computer predecessors had (which had been mostly used for long
mathematical calculations), in the sense that these IBM computers were also
commonly used, for example, as electronic data bases, storing documents or
other information in electronic form. Fewer than 50 computers in operation
existed in the world before the IBM 650, but over 1 000 computers of this
model were sold.
1954: junction transistor made of silicium (instead of germanium, which is
much more expensive), by Gordon Teal (Texas Instruments). It gradually
replaced vacuum tube valves.
1955: Tradic, first computer entirely built of transistors, by Bell Telephone.
1956: TEC, Transistor Experimental Computer, by the Massachusetts Institute
of Technology.
1956: system of batch processing (before that, programmes had been processed
one by one).
1956: at Dartmouth College, ten experts in diverse disciplines meet to create
the basis for what they call Artificial Intelligence (to distinguish it from
Robotics, Automatics and Cybernetics). John Mc Carthy (Stanford University)
presented the Lisp programming language and the Mc Carthy Test for measuring
Artificial Intelligence (playing games, following conversation, receiving
information or performing other activities through a terminal). Other experts
presented programmes for playing chess or for proving mathematical theorems.
1958: first elementary but complete programme for playing chess against a
computer, by Doctor A. L. Samuel (International Business Machines).
1958: Advanced Research Projects Agency begins studies for a military Arpa
Network, that in time would become the origin of the Interconnected Networks
(also known as Internetting or Internet).
July 1958: a system of time sharing is proposed at the Massachusetts Institute
of Technology to replace batch processing. Initially the big TX-0 computer was
used, made of transistors and equipped with a cathode ray tube screen and a
light pen. On this computer was programmed the first action game (not counting
computerised board games like draughts or chess): Mouse in the Labyrinth, by a
teacher of the institution. Time sharing became definitively established in
1962.
December 1958: flat transistor by Jean Hoerni (Fairchild Semiconductor, Palo
Alto, High California). It makes possible the insertion of elements to form
an integrated circuit, as had been proposed by G. W. A. Dummer in 1952.
Integrated circuits gradually replaced printed circuits as main computer
processors, although printed circuits remained in use for simpler purposes.
1958-1959: integrated circuit with base of germanium, by Jack Saint Clair
Kilby (Texas Instruments). Made possible by the flat transistor recently
invented by Jean Hoerni, the integrated circuit of Mister Kilby had about
five elements, which could be resistors, condensers, or also transistors.
1959: integrated circuit with base of silicium and chemical etching, by
Robert Noyce (Fairchild Semiconductor), based on the flat transistor of Jean
Hoerni and on the P-N Multiple Semiconductor Junction of Kurt Lehovec.
1959: Computer Sciences Corporation, created by Roy Nutt and Fletcher Jones.
First software company in History.
1959: PDP-1, Programmed Data Processor-1, by Kenneth Olsen (Digital Equipment
Corporation). First minicomputer, in fact the size of a very big wardrobe,
but "mini" when compared to computers that occupied large rooms.
1960: IBM 7090, SOS operating system, produced by Share and IBM.
1960: it is estimated that about 5 000 computers exist in the world.
1960-1962: Space War, action game by Stephen "Slug" Russell, with Wayne Witanen
and Martin Graetz, based on the Minskytron action game of Marvin Minsky. Both
programmes were created on the PDP-1 minicomputer of the Massachusetts
Institute of Technology.
1960: first joy stick for playing action games, built in a wooden box by two
students of the Massachussetts Institute of Technology.
1960: Mac Project, first computer network.
1961: field effect transistor by Steven Hofstein, which made possible the
development of the MOS transistor (Metal Oxide Semiconductor) by R. C. A.
July 1961: essay on the theory of packet interchange for computer networks,
by Leonard Kleinrock (Massachusetts Institute of Technology).
1961-1962: Texas Instruments and Fairchild Semiconductor separately start the
serial production of integrated circuits of silicium (shortly later called
chips), to replace printed circuits in arithmetic logical units. In 1964 a
little over ten elements could fit in a square centimetre. In 1970 over a
thousand elements fitted into that surface, each element separated from its
neighbour by only a few micrometres. These microcircuits inaugurated a kind of
computers called "of third generation", which predominated from the 1960's to
the 1970's.
1962: magnetic disk for memory storage.
August 1962: essay on a "Galactic Network" of computers, by J. C. R.
Licklider (Massachusetts Institute of Technology).
October 1962: J. C. R. Licklider becomes the first Director of a DARPA
computer project at the Advanced Research Projects Agency. Later directors of
the project include Ivan Sutherland, Bob Taylor and Lawrence G. Roberts.
1963: PDP-5, Programmed Data Processor-5, by Kenneth Olsen (Digital Equipment
Corporation). The PDP-5 was sold at 120 000 Dollars.
1963: creation of the Augmentation Research Center by Douglas Engelbart
(Stanford Research Institute, Director of the Bootstrap Institute). He later
helped to develop hyper text.
1964: IBM 360 computer and OS/360 operating system, by International Business
Machines. It introduced bytes of 8 bits, allowing up to 256 EBCDIC characters
(Extended Binary Coded Decimal Interchange Code), which replaced bytes of 6
bits that allowed up to 64 BCD or BCI characters (these had themselves
replaced bytes of 4 bits, which allowed only 16 characters). Shortly later
appeared the IBM 370 and the Basic Operating System, Tape Operating System
and Disk Operating System (unrelated to the Basic programming language, or to
later DOS operating systems such as PC-DOS, MS-DOS, DR-DOS, et cetera).
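The character counts in the entry above follow directly from the byte width: a
field of N bits distinguishes 2 to the power of N distinct codes. A minimal
sketch of that arithmetic (the function name is our own illustration, not
anything from the historical systems):

```python
# Illustrative sketch: how many distinct character codes fit in a byte of a
# given bit width. The widths mentioned above are 4-bit, 6-bit (BCD/BCI)
# and 8-bit (EBCDIC) bytes.
def representable_characters(bits: int) -> int:
    """Return how many distinct codes a field of the given bit width holds."""
    return 2 ** bits

for bits in (4, 6, 8):
    print(bits, "bits ->", representable_characters(bits), "characters")
```

Run as written, this reproduces the figures in the text: 16, 64 and 256
characters respectively.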
1964: first computer with main processor entirely built as integrated circuit
(of little over ten elements per square centimetre).
1964: essay on the application of packet interchange for secret military
communications, by Paul Baran and others (R. A. N. D.).
1964: Leonard Kleinrock publishes a book on the theory of packet interchange
for computer networks. This book convinces Lawrence G. Roberts to use
packet interchange instead of circuits in the Arpanet project.
1965: PDP-8, Programmed Data Processor-8, by Kenneth Olsen (Digital Equipment
Corporation). The PDP-8 was the first commercially successful minicomputer,
sold at 18 000 Dollars.
1965: Thomas Merrill and Lawrence G. Roberts connect a Q-32 computer at the
University of California with a TX-2 computer at the Massachusetts Institute
of Technology, using a dial-up low speed telephone line, and thus making the
FIRST LONG DISTANCE COMPUTER NETWORK IN HISTORY. The experiment is a success,
but it shows that available telephone lines are still inadequate, confirming
Leonard Kleinrock's proposal for packet interchange instead of circuits.
1965: computer mouse, invented by Douglas Engelbart (Stanford Research
Institute, Director of the Bootstrap Institute).
1966: Eliza, first conversational programme (also known as talking robot), by
Joseph Weizenbaum.
December 1966: Lawrence G. Roberts becomes part of DARPA.
1967: Lawrence G. Roberts presents an Arpanet project. At the same event,
Donald Davies and Roger Scantlebury (National Physical Laboratory, in Great
Britain) also present a project on a computer network based on packet
interchange. It became clear that three organisations were independently
developing similar ideas: one was the group of Leonard Kleinrock and J. C. R.
Licklider at M. I. T. from 1961 to 1967, another was the group of Paul Baran
at R. A. N. D. from 1962 to 1965, and a third one was the group of Mister
Davies and Mister Scantlebury at British N. P. L. from 1964 to 1967. The term
"packet interchange" came from the N. P. L. project, while the proposed line
speed for the Arpanet project was increased from 2.4 to 50 Kilobits per second.
1967: branching motion picture presented by Ted Nelson at the Czechoslovakian
Pavilion of Expo 67. He coined the term "hyper-text" in his book "Literary
Machines and Dream Machines", which described hyper media and advocated for a
global hyper text system that he named "Xanadu".
1968: B 2500 and B 3500, computers built with integrated circuits, by
Burroughs Corporation.
August 1968: after Lawrence G. Roberts and his DARPA group had defined the
specifications for Arpanet, a group led by Frank Heart at Bolt Beranek and
Newman (B. B. N.) is chosen for developing an Interface Message Processor.
Robert Kahn (B. B. N.) designs the network architecture. Topology and network
economy are handled by Lawrence G. Roberts and by the group of Howard Frank
(Network Analysis Corporation). The network measuring system falls under the
responsibility of Leonard Kleinrock and his group of the University of
California at Los Angeles.
1969: Fairchild 4100, integrated circuit of 256 bits in Read Only Memory
(Fairchild Semiconductor).
1969: Unics operating system. Created by Kenneth Thompson and Dennis Ritchie
(Bell AT & T), who were unhappy with Multics, Unics became one of the first
time-sharing operating systems. Renamed Unix, it was rewritten by its original
authors: by Kenneth Thompson in 1972 and by Dennis Ritchie in 1974, becoming
fully operational in 1974 and open source in 1978. From the 1970's to the
early 2000's a number of open source systems based on Unics were created, such
as various BSD systems, plus GNU Hurd, Linux, Minix, Open Solaris, and others.
1969: Request for Comments, method developed by S. Crocker (U. C. L. A.) for
interchange of ideas and proposals among researchers. Initially distributed
by the physical postal service, the requests for comments became commonly
distributed in later years through File Transfer Protocol. Since the 1990's,
they can also be accessed as hyper text documents via World Wide Web. They
are edited and coordinated by Jon Postel (Stanford Research Institute), and
have become the technical standards on which the Internet is based.
2nd September 1969: first short distance message transmitted between two
computers, both inside of U. C. L. A. and connected by a 5 metre cable, by
Leonard Kleinrock. The computers interchanged meaningless data while about
twenty people watched the historical event, which MARKS THE START OF INTERNET.
Because of its success, the Centre for Network Measuring of Leonard Kleinrock
at U. C. L. A. is chosen for the installation of the first Interface Message
Processor of B. B. N., THUS CREATING THE FIRST COMPUTER HOST-SERVER.
Days later, the Human Intellect project of Douglas Engelbart (that included
NLS, a forerunner of hyper text) is installed at the Stanford Research
Institute, becoming the second computer host-server. Stanford maintains the
Host Name list for the mapping of directory and addresses of Request for
Comments.
October 1969: first long distance message transmitted between host-servers,
from the host of U. C. L. A. to that of Stanford. The message consisted only
of the two letters "LO". It was intended as "LOG IN", but the connection
crashed when trying to send the "G".
January 1970: The University of Utah and the University of California at
Santa Barbara soon join the recently born Arpanet. The group of Glen Culler
and Burton Fried at the U. C. S. B. takes up research on mathematical
functions for network visualisation, while that of Robert Taylor and Ivan
Sutherland at Utah concentrates on methods for three-dimensional
representation of the network.
The four host servers (U. C. L. A., Stanford, U. C. S. B. and Utah) start a
long lived tradition of research into the network itself as well as into its
possible applications, a tradition that continues until today.
1970: it is estimated that about 100 000 computers exist in the world.
1970: Intel 1103, integrated circuit of 1 Kilobit in Random Access Memory
(Integrated Electronics, Santa Clara, High California).
1970 to 1972: the Learning Research Group assembled by Xerox at the Palo Alto
Research Centre produces the "Alto", the first computer whose operating system
featured icons in a graphic interface, as opposed to a text line in a command
prompt. Only 150 copies of the Alto were privately released (never sold), but
it inspired systems such as Lisa, IBM OS/2, Apple Macintosh (1984), or
Microsoft Windows (1985).
1971: Intel 4004, microprocessor of 4 Kilobytes of 4 bits at 60 Kilohertz, by
Marcian Edward Hoff, Stanley Mazor and Federico Faggin (Integrated Electronics).
1971: twenty-three North American universities or research institutions have
computers connected to Arpanet.
December 1971: Network Control Protocol, first host to host protocol of
Arpanet, by S. Crocker (Network Work Group). It made it possible for any
member of Arpanet to develop his own applications.
1971: Project Gutenberg. In the course of time it would become the biggest
library in the Internet, with over fifty thousand books in 2016 (most of them
in English, although over forty languages are represented with at least one
book). Those books are available in electronic format in at least two on-line
collections: Project Gutenberg and Many Books, hyper linked below.
Project Gutenberg
Many Books
1972: Intel 8008, microprocessor of 16 Kilobytes of 8 bits. At a price of 200
Dollars, the Intel 8008 was used in scientific calculators and in two very
early microcomputers: Scelbi-8 H of 1973 and Personal Minicomputer Mark-8 of
1974.
March 1972: first elementary application of electronic post for Arpanet, only
with the functions "Read" and "Send", by Ray Tomlinson (B. B. N.).
July 1972: Lawrence G. Roberts added to electronic post the functions "List",
"Read selectively", "Save", "Forward" and "Reply", making the Mailto Protocol
the most important Arpanet and Internet application until the mid 1990's,
with File Transfer Protocol and Telnet Protocol following suit, all of them
surpassed since the mid 1990's only by Hyper Text Transfer Protocol.
October 1972: at the International Conference on Computer Communication,
Robert Kahn presented Arpanet for the first time to the public, as a network
of open architecture with the name of "Interconnected Networks", "Internet"
or "Internetting". It became obvious that the addressing system of Network
Control Protocol, based on the Interface Message Processor of Arpanet, was
insufficient for an open network, and therefore another protocol with control
of transmission errors should eventually have to be devised. This new one
would be later called Transmission Control Protocol / Internet Protocol,
whose fundamental characteristics were defined by Mister Kahn in a report named
"Communication Principles for Operating Systems":
-Each local area network is independent, internal changes are not mandatory
for its connection to Internet.
Other characteristics under evaluation at that time were:
-Pipelining, making possible to route many packets simultaneously.
1972: Pong, first commercially successful computer action game (installed in
coin-operated machines), by Nolan Bushnell (Atari).
1972: Unics operating system re-written in assembly language by Kenneth
Thompson, using a PDP-11 computer of Digital Equipment Corporation.
1972: Usenet for News Groups of Duke University, working through Unics
operating system.
1973: first message transmitted by electronic post through Arpanet.
April 1973: Robert Kahn summons Vinton Cerf (Stanford University), an
expert in Network Control Protocol and in existing operating systems, for
developing the new protocol for the Internet project, called just TCP but in
reality having the two protocols in one, TCP as well as IP.
September 1973: Robert Kahn and Vinton Cerf present the new TCP/IP protocol
at a special meeting of the International Network Work Group, formed at
Sussex University and led by Mister Cerf. Its main characteristics are:
-Communication through a long chain of bytes (called "octets", because bytes
composed of eight bits were by then usual).
The original idea for the protocol is definitively separated into two
protocols: TCP to control flow or recover lost packets, and IP to address or
route packets. A User Datagramme Protocol is added for programmes that need
only IP, choosing not to use the control that is provided by TCP. Thus, a
UDP/IP communication is less reliable but much cheaper and faster than a
TCP/IP one. The Ethernet network system was in 1973 under development by Bob
Metcalfe at the Xerox Palo Alto Research Centre, but the proliferation of
local area networks or of microcomputers was not yet foreseen in late 1973,
therefore the original model of Internet was based on the concept of a few
national networks such as Arpanet, Packet Radio or Packet Satellite. An
Internet Protocol of 32 bits was thus devised, the first 8 bits defining the
network and the other 24 bits defining the host inside that network, giving a
maximum of only 256 possible networks. This was a limitation reconsidered at
the end of the 1970's, when microcomputers and local area networks began
appearing in huge numbers. The new protocol was applied to Arpanet, Packet
Radio and Packet Satellite.
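The original 32-bit address layout described above can be sketched in a few
lines. This is a hedged illustration assuming only what the text states (8
bits naming the network, 24 bits naming the host inside it); the helper names
are invented for the example:

```python
# Sketch of the original 32-bit Internet Protocol address layout:
# the top 8 bits identify the network, the bottom 24 bits the host.
def split_address(address: int) -> tuple[int, int]:
    """Split a 32-bit address into its (network, host) fields."""
    network = (address >> 24) & 0xFF    # top 8 bits
    host = address & 0xFFFFFF           # bottom 24 bits
    return network, host

MAX_NETWORKS = 2 ** 8    # only 256 possible networks, the limitation noted above
MAX_HOSTS = 2 ** 24      # 16 777 216 possible hosts inside each network
```

The 256-network ceiling visible in MAX_NETWORKS is exactly the limitation
that had to be reconsidered when local area networks multiplied.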
1973: Arpanet connects host-servers in England and in Norway to those already
connected in North America.
1973: CP/M, Control Programme for Microprocessors, operating system of 8 bits
by Gary Kildall (Integrated Electronics and Intergalactic Digital Research).
It was initially offered to Integrated Electronics, that showed only a very
limited interest. After three years trying to convince Intel, Mister Kildall
formed his own company, Intergalactic Digital Research, which marketed CP/M
in 1976 with the name of Control Programme for Microcomputers or Control
Programme Monitor.
1973: Scelbi-8 H, prototype of microcomputer of 8 bits based on Intel 8008,
by David Ahl (Digital Corporation). Never sold, it incorporated the first
magnetic floppy disk.
1974: DARPA signs contracts with Stanford (represented by Vinton Cerf),
U. C. L. (by Peter Kirstein) and B. B. N. (by Ray Tomlinson), for implementing
TCP/IP. The group of Mister Cerf at Stanford defined a detailed specification
for it, making possible the inter-operability of Arpanet, Packet Radio and
Packet Satellite. These first implementations of TCP/IP were for systems like
Tenex or Tops 20, before the commercial release of the first microcomputers
in 1975. Doubts were then raised as to whether or not TCP/IP might be too
complex for microcomputers.
1974: Unics operating system re-written in C language by Dennis Ritchie, using
a PDP-11 computer of Digital Equipment Corporation.
1974: Personal Minicomputer Mark-8, microcomputer of 8 bits based on Intel
8008, by Jonathan Titus. It had 2 Kilobytes, extensible to 16 Kilobytes. It
only accepted code in numbering base of two, and was sold just as paper plans,
without physical parts.
1974: microprocessor Motorola 6800 of 64 Kilobytes of 8 bits (used in Tandy
Radio Shack).
1974: microprocessors Intel 8080, Intel 8084 and Intel 8085, all of 64
Kilobytes of 8 bits (used in Altair 8800). Composed of 4 500 elements, they
were capable of adding two numbers of 8 bits in less than 3 microseconds.
They were the first microprocessors for general purposes, and became the
standard for many microcomputers.
June 1975: Altair 8800, microcomputer of 8 bits based on Intel 8080, by
Edward Ted Roberts (Micro Instrumentation and Telemetry Systems). It had 256
bytes of Random Access Memory and 64 Kilobytes of storage, input by manual
switches and output by a panel of lights. Units began to be sold in 1975 at a
price of 300 Dollars as a kit with all its physical parts and assembling
instructions, or else sold as a finished microcomputer at the price of 400
Dollars. Initially it accepted only code in numbering base of two for all of
its input or output operations. Later it added elements to reach 7 Kilobytes
of Random Access Memory and accepted Basic programming language, adapted to
Altair 8800 by Paul Allen and William Bill Gates (Micro-Soft Corporation).
Another team incorporated as peripherals a reader of perforated paper tape
and a keyboard. This modified microcomputer was sold at 500 Dollars.
1975: P-System, operating system of 8 bits by the University of California in
San Diego, marketed by Softech Microsystems. Several versions.
1975: Cray I, first supercomputer, by Seymour Cray (Cray Research). Circuits
of Very Large Scale Integration, several processors in parallel architecture,
vectorial processing, and non-Von Neumann structure in numbering base of two.
The Cray I inaugurated a kind of computers called "of fifth generation", which
exists only in the form of a few hundred supercomputers, too expensive
for the commercial market (between 4 000 000 and 20 000 000 Dollars).
1975: microprocessor MOS Technology 6502 of 64 Kilobytes of 8 bits (used in
Apple). With 4300 elements, it could add two numbers of 8 bits in 1
microsecond.
1975: microprocessor Zilog Z-80 of 64 Kilobytes of 8 bits (used in many
microcomputers).
1976: first book on Arpanet, by Leonard Kleinrock.
1976: Apple I, microcomputer of 8 bits (Apple Computer Corporation), based on
MOS Technology 6502.
1976: CP/M, Control Programme for Microcomputers or Control Programme Monitor
is marketed by Gary Kildall (Intergalactic Digital Research). In a short time
and until the early 1980's most microcomputers had the CP/M system or were
programmable in Basic, resident in Read Only Memory. Some microcomputers had
the P-System, and a few top of the line had one of the variants of Unics.
1976: Electric Pencil, line text editor by Michael Shrayer. Versions for many
microcomputers were made in the following years.
1976: microprocessor Texas Instruments TMS-9900 of 16 bits.
1976: microcomputer PET Personal Electronic Transactor (Commodore), based on
MOS Technology 6502.
1977: microprocessor Signetics 2650.
1977: microprocessor Fairchild F8.
1977: microprocessor Mostek 3870.
1977: microprocessors Motorola 6801, Motorola 6802 and Motorola 6809.
1977: microcomputer TK (Sinclair), based on Zilog Z-80 A.
1977: microcomputer TRS-80 (Tandy Radio Shack), based on Zilog Z-80 A, input
by keyboard or cassette tape, output by cathodic ray tube screen or cassette
tape. The series would continue with microcomputers named CP and DGT.
1977: Apple II, microcomputer of 8 bits (Apple Computer Corporation), based
on MOS Technology 6502.
1977: David Clark and his group at the Massachusetts Institute of Technology
demonstrate that TCP/IP can be simplified for operating with microcomputers,
producing a first implementation for the Alto of Xerox at Palo Alto Research
Centre (first personal work station and graphic operating system), and in
1981 a second implementation for IBM Personal Computer.
1977: the growth of computer hosts connected to Internet forces the eventual
adoption of a Domain Name System, invented in 1983 by Paul Mockapetris
(U. S. C. and I. S. I.).
Likewise, the growth of local area networks forces the implementation of a
new routing system: Interior Gateway Protocol (inside an Internet region),
and Exterior Gateway Protocol (among regions).
1977: implementation by the University of California at Berkeley of TCP/IP
for Unics BSD operating system.
About 1978: a diversity of computers built by Burroughs, Siemens or other
companies.
Late 1970's: networks for specific purposes make their appearance in North
America, such as MFEnet and HEPnet (Department of Energy), Span (National
Aeronautics and Space Administration), CSnet (by Rick Adrion, David Farber
and Larry Landweber, initially subsidised by the National Science Foundation),
Usenet (not limited to a community of specialists, it was based on the Unix
to Unix Copy Communications Protocol for Unix operating system of Bell
American Telegraph & Telephone).
Late 1970's: Vinton Cerf (Director of the Internet Project of DARPA) formed
some specific groups: the Internet Cooperation Board, led by Peter Kirstein
(U. C. L.), for coordination with certain European nations involved with
Packet Satellite; the Internet Research Group, for research and exchange of
ideas on the network; and the Internet Configuration Control Board, led by
David Clark, for helping Mister Cerf with management of the growing Internet.
1978: Unics operating system becomes open source, and the basis of many future
operating systems.
1978: Wordstar, text editor by Rob Barnaby.
1978: microprocessor Intel 8086 of 1 Megabyte of 16 bits (its 8-bit-bus
variant, the Intel 8088, was used in the first IBM Personal Computer).
1979: Visi Calc, Visible Calculator spread sheet, first commercial programme
for microcomputers, by Daniel Bricklin and Robert Frankston (Software Arts).
Originally distributed in magnetic floppy disks of 5.25 inches, it became
incorporated into Apple II computers.
1979: Usenet news groups start from Duke University.
1979: microprocessor Zilog Z-8000 of 16 Megabytes of 16 bits.
1979: microprocessor Motorola 68000 of 16 Megabytes of 16 bits at 8 Megahertz
(in Apple, using a version of Unics operating system). Made of 70 000 elements,
it multiplied two numbers of 16 bits in less than 4 microseconds.
1979: microprocessor Intel 8088 of 1 Megabyte of 8 bits.
1980: microprocessor National 16032 of 16 Megabytes of 32 bits.
1980: it is estimated that many millions of computers exist in the world,
most of them microcomputers.
1980: Vulcan data base, by Wayne Ratliff (National Aeronautics and Space
Administration). Forerunner of dBase.
1980: TCP/IP Protocol officially adopted for military purposes in the United
States, later forming Milnet as a military complement to the more scientific
Arpanet.
1980: 86-DOS, 86 Disk Operating System, of 16 bits (Seattle Computer, based
on CP/M-86, another operating system of 16 bits).
1981: microprocessor Intel 80186 of 1 Megabyte of 16 bits.
1981: PC-DOS, Personal Computer Disk Operating System, of 16 bits (Microsoft
Corporation, based on 86-DOS), later continued only by International Business
Machines.
1981: MS-DOS, Micro Soft Disk Operating System, of 16 bits (Microsoft
Corporation, based on 86-DOS), initially identical to PC-DOS, but afterwards
modified and continued only by Microsoft Corporation.
1981: Xerox Star, graphic operating system by Xerox at Palo Alto Research
Centre.
1981: OS/2 graphic operating system (Microsoft Corporation and International
Business Machines, later continued only by the latter).
August 1981: Personal Computer, successful series of microcomputers of 16
bits (International Business Machines). They were initially based on Intel
8088, originally incorporating PC-DOS operating system, with the option of a
16-bit version of CP/M, and programmable in Basic.
1981: Osborne I, first portable microcomputer, built by Adam Osborne on a
prototype of Lee Felsenstein (Homebrew Computer Club). With a weight of 12
Kilogrammes, the Osborne I had to be plugged into an electric socket, for it
lacked an electrolytic accumulator. Like the IBM Personal Computer, the
Osborne I had two drives for removable floppy disks of 5.25 inches, used to
boot-strap the operating system and for storage (most microcomputers of those
years had no fixed -hard- disk). For two or three years, the Osborne I was a
great commercial success.
1981: microprocessor Hewlett-Packard Superchip of 64 Megabytes of 32 bits.
Composed of 450 000 elements with MOS transistors, it could multiply two
numbers of 32 bits in less than 2 microseconds.
1981: agreement between CSnet - National Science Foundation (David Farber)
and DARPA (Robert Kahn), in order to share the infra-structure of Arpanet.
1981: Bitnet by Ira Fuchs and Greydon Freeman (not limited to a community of
specialists, it could use graphics attached to its communications). Other
networks in the early 1980's were: XNS (Xerox), DECnet (DEC) and SNA (IBM),
in the mid 1980's appeared Psi, UUnet, ANS core, Netware, Netbios and others.
1982: microprocessor Intel 80286 of 16 Megabytes of 16 bits (in IBM Personal
Computer-AT).
1982: microprocessor Motorola 68010 of 64 Megabytes of 32 bits.
1982: Lotus 1-2-3, graphic commercial programme of 16 bits, by Mitchell Kapor.
1982: R1-XCON, first expert system for practical use, controlling computers
to suit individual customer requirements, by John Mc Dermott (Carnegie Mellon
University and Digital Equipment Corporation).
1st January 1983: Arpanet renamed Internet. The Transmission Control Protocol /
Internet Protocol substituted the Network Control Protocol simultaneously for
all hosts. The User Datagramme Protocol / Internet Protocol is also used.
Domain Names: each Internet Protocol address number corresponds to an
equivalent human-readable host name.
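The Domain Name idea above can be illustrated with a minimal, hypothetical
lookup table mapping memorable host names to address numbers, much as the
flat host-name lists kept at Stanford worked before the distributed system.
The names and addresses below are invented examples, not historical data:

```python
# Hypothetical flat table of host names and their numeric addresses.
HOST_TABLE = {
    "host-a.example": "10.0.0.1",
    "host-b.example": "10.0.0.2",
}

def resolve(name: str) -> str:
    """Return the address registered for a host name (flat-table lookup)."""
    return HOST_TABLE[name]
```

The distributed Domain Name System replaced exactly this kind of single
central table, which had become impossible to maintain as hosts multiplied.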
1983: Barry Leiner became Director of the Internet Project of DARPA. He and
David Clark dissolved the Internet Configuration Control Board and created
task forces instead, each task force focusing on a particular technical area.
The Internet Activities Board was then formed with the leaders of each task
force, led by Mister Clark. Later, Phill Gross became leader of the Internet
Engineering Task Force. In 1985, the I. E. T. F. was divided into work groups.
1984: it is estimated that the Internet counts about 10 000 host-servers.
1984: microprocessor Intel 80386 SX of 4 Gigabytes of 16 bits.
1984: microprocessor Intel 80386 DX of 4 Gigabytes of 32 bits (in Compaq 386,
IBM Personal Computer and other microcomputers).
1984: microprocessor Motorola 68020 of 4 Gigabytes of 32 bits at 16 Megahertz.
1984: British Janet network.
1984: Apple Macintosh, with graphic operating system.
1985: Windows 1, graphic interface run from the MS-DOS operating system
(Windows became its own operating system with Windows 95, in 1995).
About 1985: Josephson Junction super cold experimental integrated circuit,
taking one nanosecond per operation at temperatures of a few Kelvin.
1985: it is estimated that about 25 000 different programmes exist for
microcomputers.
1985: NSFnet (National Science Foundation), led by Dennis Jennings in 1985
and by Steve Wolff in 1986, established inter-operability with the Internet of
DARPA (managed by the Internet Activities Board), extended TCP/IP, distributed
costs for development and maintenance to other North American organisations,
and helped to form a Federal Networking Council as a coordinator with
international organisations (such as R. A. R. E. in Europe) through the
Intercontinental Research Committee. At this time, DARPA was no longer the
main financial supporter of Internet, that responsibility having been taken
by the National Science Foundation and by other official departments in North
America, in Great Britain, and in a few other European nations. Increasingly
since the mid 1980's, private companies started their activities in the
Internet.
1985: Robert Kahn and Barry Leiner leave DARPA. This institution gradually
lost control of the Internet, in favour of the Internet Activities Board. The
Internet Engineering Task Force combined work groups into technical areas,
forming an Internet Engineering Steering Group with the leaders of those
areas, recognised by the I. A. B. as predominant. The I. A. B. itself combined
its task forces into an Internet Research Task Force, led by Jon Postel.
1985: Workshop for All, organised by Dan Lynch and the Internet Activities
Board, for explaining the characteristics of TCP/IP to private companies.
About 50 inventors or researchers (most of them from DARPA) showed the
protocol and its problems to almost 250 qualified representatives from a
number of corporations. The workshop was a success and prepared other events.
1985: Xerox Note Cards, hyper text system based on Lisp programming language.
1986: Owl Guide, professional hyper text system for large scale applications,
inspired by Xerox Note Cards.
1987: Hyper Card, programme to create graphical hyper text documents, by Bill
Atkinson (Macintosh). Mister Atkinson was also known for "Mac Paint", bitmap
painting programme. Hyper Card featured bitmapped graphics, form fields,
scripting, and full text search. Hyper Card spawned imitators such as
"Asymetrix Toolbook", which used drawn graphics and ran on the
IBM Personal Computer.
1987: workshop on hyper text systems in Chapel Hill, North Carolina, helping
to form Siglink in 1989 (later renamed ACM Sig Web), an organisation whose
annual conferences have for many years been the centre of academic research
on hyper text.
1987: several proposals. First, of a Simple Network Management Protocol
(aiming at simplicity, inspired by a prior proposal called S. G. M. P.).
Second, of H. E. M. S. (more complex, developed by advanced researchers).
And third, of C. M. I. P. (developed by O. S. I.) for the remote uniform
control of Internet routers. In a series of meetings, the too complex
H. E. M. S. was eliminated, the C. M. I. P. of O. S. I. was considered a
long term solution and gradually also dropped, and the simple S. N. M. P.
was considered a short term solution and almost universally adopted for
remote uniform control in the Internet, although a few routers may still use
the C. M. I. P. of O. S. I.
1987: it is estimated that the Internet counts about 100 000 host-servers.
1987: microprocessor Motorola 68030, of 4 Gigabytes of 32 bits at 32
Megahertz.
1988: report on "Towards a National Research Network" by Leonard Kleinrock,
Robert Kahn and David Clark (National Research Council, with support of the
National Science Foundation). Senator Al Gore obtained funds from Congress
for a high speed network based on this report.
1988: series of conferences on "Privatising the Internet for commercial
purposes", organised by the National Science Foundation at Kennedy School of
Government, Harvard University.
September 1988: first Interop Trade Show, with over 5 000 technicians from
about 50 private companies involved with TCP/IP. The show demonstrated the
inter-operability of computers accessing the Internet, regardless of which of
the brands present had built them. The Interop Trade Show has grown into
seven events that assemble over 250 000 people yearly.
1989: proposal of a World Wide Web, a system of linked information able to
work with different kinds of computers, by Tim Berners-Lee and Robert
Cailliau (Conseil Européen pour la Recherche Nucléaire, Geneva, Switzerland).
Scientists of C. E. R. N. used TeX or Post Script for their documents at that
time, and a few used Standard Generalised Mark-up Language. Mister
Berners-Lee realised that something simpler was needed to cope with
everything from dumb terminals to high end graphical X Window Unics work
stations. His proposal was thus a simple Hyper Text Mark-up Language, with an
also simple network protocol that he named Hyper Text Transfer Protocol.
1989: microprocessor Intel 80486 DX (with integrated mathematic
co-processor), of 4 Gigabytes of 32 bits at 25 or 33 Megahertz. The 80486 SX
(without the integrated co-processor, supplemented by an optional 80487)
followed in 1991.
1989: microprocessor Motorola 68040 of 4 Gigabytes of 32 bits at 40 Megahertz.
1990: Arpanet project discontinued, after having successfully developed
itself into the Internet backbones of the National Science Foundation. The
Transmission Control Protocol/Internet Protocol stands as the predominant
protocol, with the User Datagram Protocol/Internet Protocol as the second
one.
1990: the Internet begins spreading over the World. In 1990 the Internet was
limited to most of North and South America (excepting a few nations), to
West Europe, Australia, New Zealand, a few Asian and a very few African
nations. The rest of the world in 1990 was not yet connected to the Internet,
but to Bitnet or to some other network, or simply unconnected. In 2000 most
of the world was already connected to the Internet, excepting only a handful
of Asian or African nations. In 2010 the entire planet could well be
considered covered, even Antarctica, vast deserts, minor islands or open
oceans, because Internet signals could be received from satellites with
suitable equipment.
1990: Archie search engine for File Transfer Protocol.
Late 1990 or early 1991: release of the HTTP-HTML 'World Wide Web' by Mister
Tim Berners-Lee (Conseil Européen pour la Recherche Nucléaire, Geneva,
Switzerland), and of the WWW-Talk posting list.
1991: release of Gopher Protocol (University of Minnesota). Gopher is mainly
used for transmission of text, although some Gopher clients present a graphic
interface. Gopher went into a slow continuous decline after the spread of the
HTTP-HTML 'World Wide Web' of Mister Berners-Lee. In terms of traffic (packet
exchange) the Web surpassed FTP, Gopher and most other Internet protocols
around 1995, but as of 2016 there are still about a hundred active Gopher
servers. Gopher presents a list of menus from which a document can be chosen.
The CSS Dixieland page on the Gopher Protocol can be visited by activating
the following internal link:
1991: Internet Society founded under the leadership of Vinton Cerf,
sponsored by the Corporation for National Research Initiatives (founded by
Mister Cerf and Robert Kahn).
1992: Hyper Text Mark-up Language, proposal by Tim Berners-Lee and Robert
Cailliau, loosely based on SGML. It was never official (there was never an
HTML 1.0); the CERN-Convex-Atrium specification of June 1993 later evolved
into "HTML 2.0":
World Wide Web Consortium
1992: WWW browser (later Nexus Web browser) by Tim Berners-Lee, implemented
in Objective-C, for NeXT work stations and the first version of HTML.
1992: Lib WWW line-mode Web browser for Unics, later for Windows and other
platforms and systems.
1992: Internet Activities Board renamed Internet Architecture Board, and
placed under the Internet Society. The I. E. S. G. and the I. E. T. F. became
the main bodies responsible for Internet standards, at an equal level with
the I. A. B.
1992: it is estimated that the Internet counts about 1 000 000 host-servers.
1993: Lynx text-only user agent for many network protocols, originally
working in Unics and in VMS operating systems, later in DOS, Windows,
Macintosh, BSD, Linux and other platforms and systems:
Lynx
1993: Cello graphic user agent for Windows 3.1. Other user agents released in
those years included Viola (1993), Midas (1993) and Win Web (1993).
1993: Mosaic graphic user agent by Marc Andreessen and Eric Bina (National
Centre for Supercomputing Applications, Urbana, Illinois), originally for
X Window in Unics, later for Windows 3.x, Macintosh and other platforms and
systems. Mosaic helped the growth of the World Wide Web, eclipsing other
systems based on Internet, like Wide Area Information System, Hytelnet,
Gopher or Usenet. Some user agents were later inspired by Mosaic, such as
Spyglass Mosaic (1994) or Netscape Navigator (1994).
1993: HTML+ and HTML+ Reference, proposals by Dave Raggett. Never official.
1993: Veronica search engine for Gopher Protocol. Jughead search engine for
Gopher Protocol. Wanderer-Ex search engine for World Wide Web.
1993: microprocessor Intel Pentium of 4 Gigabytes of 32 bits at 60 or 66
Megahertz.
1994: report on "Realizing the Information Future: the Internet and Beyond"
by Leonard Kleinrock, Robert Kahn and David Clark (National Research Council,
with support of the National Science Foundation), addressing some critical
topics such as intellectual property and internal regulation.
1994: first World Wide Web Conference, helping to create the World Wide Web
Consortium later that same year.
1994: work on HTML 2.0 begins at the I. E. T. F., strictly based on SGML. It
was approved as Request for Comments 1866 in November 1995.
1994: HTML 3.0, proposal by Dave Raggett and others, based on HTML+ and HTML+
Reference. It never became official, but it inspired the creation of tables
for HTML 3.2, 4.0 and 4.01, plus other improvements in official HTML
specifications.
1994: Web Crawler search engine, only for Hyper Text Transfer Protocol. Einet
Galaxy directory. Yahoo directory.
April 1995: the National Science Foundation stopped subsidies to the backbone
of NSFnet, thus effectively privatising the Internet. Many private networks
were then connected to the Internet. In 1986 there were 6 main nodes with a
connection speed of 56 Kilobits per second, by 1995 they had increased to 21
main nodes with a speed of 45 Megabits per second, used by over 50 000 local
area networks worldwide (29 000 of them located in North America). Over 200
million Dollars were invested into NSFnet in those nine years.
1995: Live Script of Netscape renamed Java Script, in cooperation with Sun
Microsystems.
1996: Microsoft Internet Explorer Web browser starts a brief period known as
"the Browsers War" (mainly between Netscape and Microsoft), that lasted until
HTML 3.2 of January 1997. The release of Netscape Navigator 4.0 and Microsoft
Internet Explorer 4.0, both made in 1997, effectively put an end to the
Browsers War.
Late 1994: creation of the World Wide Web Consortium, under the leadership of
the Web's inventor Tim Berners-Lee and of Al Vezza, initially hosted by the
Laboratory for Computer Science of the Massachusetts Institute of Technology.
October 1996: PNG, first recommendation approved by the W. W. W. Consortium.
Development had begun at Usenet, independently of the Consortium. PNG has
better compression and colour than GIF, being free of any patents. PNG was
quickly implemented in all Web browsers and in many other graphic tools
(alpha channels, however, were at first not handled correctly everywhere).
December 1996: CSS1, Cascading Style Sheets 1, approved by the W. W. W.
Consortium, then continued by it. Development of CSS had been begun by Håkon
Wium Lie on the WWW mailing lists in 1994, around the time of the creation of
the Consortium.
1996: Deep Blue beats Garry Kasparov in the first game won by a chess-playing
computer against a reigning World Champion under normal chess tournament
conditions.
January 1997: HTML 3.2 approved by the W. W. W. Consortium, strictly based on
SGML.
Early 1998: CSS test suite, shortly followed by a validator via Web (the
validator and test suite for HTML came at a later time).
1998: it is estimated that the Internet counts about 30 000 000 host-servers.
April 1998: HTML 4.0 approved by the W. W. W. Consortium.
May 1998: CSS2, Cascading Style Sheets 2, approved by the W. W. W.
Consortium, then continued by it.
December 1999: HTML 4.01 approved by the W. W. W. Consortium.
Late 1990's: the I. E. T. F. counts over 75 work groups, defining all kinds
of technical regulations for the Internet. Documents are usually first
prepared as drafts, then eventually published as Requests for Comments.
Thousands of representatives from private companies join the inventors and
researchers of I. E. T. F. in their three or four meetings every year.
Microprocessors
The incredible power of a microscopic brain
First chip processor in a computer, in 1964, of little over 10 elements per
square centimetre
Burroughs B 2500 and B 3500, in 1968, computers built with integrated circuits
Fairchild 4100 of 1969, 256 bits ROM
Intel 1103 of 1970, 1 Kilobit RAM
Intel 4004 of 1971, 4 Kilobytes of 4 bits at 740 Kilohertz
Intel 8008 of 1972, 16 Kilobytes of 8 bits
Motorola 6800 of 1974, 64 Kilobytes of 8 bits
Intel 8080 of 1974, 64 Kilobytes of 8 bits
Intel 8085 of 1976, 64 Kilobytes of 8 bits
MOS Technology 6502 of 1975, 64 Kilobytes of 8 bits
Zilog Z-80 of 1975, 64 Kilobytes of 8 bits
Texas Instruments TMS-9900 of 1976, 16 bits
Signetics 2650 of 1977
Fairchild F8 of 1977
Mostek 3870 of 1977
Motorola 6801, 6802 and 6809 of 1977
Intel 8086 (mathematic co-processor 8087) of 1978, 1 Megabyte of 16 bits
Intel 8088 of 1979, 1 Megabyte of 8 bits
Zilog Z-8000 of 1979, 16 Megabytes of 16 bits
Motorola 68000 of 1979, 16 Megabytes of 16 bits at 8 Megahertz
National 16032 of 1980, 16 Megabytes of 16 bits
Intel 80186 (mathematic co-processor 80187) of 1981-1982, 1 Megabyte of 16
bits
Hewlett-Packard Superchip of 1981, 64 Megabytes of 32 bits
Intel 80286 (mathematic co-processor 80287) of 1981-1982, 16 Megabytes of 16
bits
Motorola 68010 of 1982, 16 Megabytes of 32 bits
Intel 80386 DX of 1985, of 32 bits
Intel 80386 SX of 1988, of 32 bits with an external bus of 16 bits
Motorola 68020 of 1984, of 32 bits at 16 Megahertz
Motorola 68030 of 1987, of 32 bits at 32 Megahertz
Intel 80486 DX (with integrated mathematic co-processor) of 1989, of 32 bits
at 25 or 33 Megahertz
Intel 80486 SX (mathematic co-processor 80487) of 1991, of 32 bits at 25 or
33 Megahertz
Motorola 68040 of 1989, of 32 bits at 40 Megahertz
Intel Pentium of 1993, of 32 bits at 60 or 66 Megahertz
Intel Pentium Pro of 1995, of 32 bits at 150 to 200 Megahertz
Intel Pentium II of 1997
Intel Celeron of 1998
Intel Xeon of 1998
Intel Pentium III of 1999
Intel Pentium 4 of 2000
Intel Core of 2006
Intel Core II of 2006
AMD Athlon
AMD Turion
AMD Duron
AMD Sempron
AMD Opteron
Programming languages
The communication between man and machine
A programme is a set of instructions that tells the computer how to perform a
certain task. These instructions must be given EXACTLY, because the computer
would not know what to do in an ambiguous situation. It would stop, or else
enter an endless loop, whence it could only exit by quitting the task.
The first computers were programmed by hardware connections, arranging gears
or switches, or connecting wires, whenever instructions had to be changed.
Different connection boards were normally used to facilitate these changes.
Since the late 1940's almost every computer in the world understands directly
only one form of communication, known by the names of "binal code", "binary
code", "code in numbering base two", "machine code" or "low level code".
That code is composed of very long sequences of zeroes and ones that are read
by the computer in groups of four, six, eight, twelve, sixteen, twenty-four,
thirty-two, forty-eight, sixty-four or another number of ciphers, whatever
the number of bits might be in a byte of that system, looking something like:
11010110 10110011 11110100 10000110... ad nauseam (assuming in this example
bytes composed of eight bits, spaces inserted only as an illustration). Each
of those groups of zeroes and ones defines a character to be written, or else
an operation to be performed by the processor. This was the first form of
stored communication between man and computer. Understandably, few human
coders are so gifted in mathematical talent and memory as to write
instructions to the computer in this code, or to read directly the answers
given by the computer.
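As an illustration only (the byte values are invented, not taken from any
real processor), a few lines of Python can show how a sequence of bytes is
rendered as the groups of eight binary ciphers described above:

```python
# Hypothetical machine code: four invented byte values, not real instructions
# of any actual processor.
machine_code = bytes([0b11010110, 0b10110011, 0b11110100, 0b10000110])

# Render each byte as a group of eight binary ciphers, separated by spaces
# (the spaces are inserted only as an illustration, as noted in the text).
groups = " ".join(format(byte, "08b") for byte in machine_code)
print(groups)  # -> 11010110 10110011 11110100 10000110
```

The same grouping principle applies whatever the number of bits in a byte of
a given system: only the width passed to format() would change.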
Due to that human limitation, the communication between man and computer is
often made indirectly: since 1943 (Plankalkül, by Konrad Zuse) there are
forms of communication known as "symbolic languages", "assembly languages",
or "medium level languages" (similar to low level code, but much easier for
the human), and since 1949 (Short Code, by John Mauchly of Univac) there
exist other forms known as "procedure-oriented languages" or "high level
languages" (very different from low level code), both created with the
purpose of easier communication between computer and human. Any medium level
language
is helped by a programme known as assembler (which translates from medium to
low level). Any high level language is helped by a programme known as compiler
(which translates from high to low level). From another point of view these
auxiliary programmes may be classified as interpreters (which interpret line
by line a programme, perhaps unfinished, therefore making it meaningful to
the computer and a help to the programmer, who can see immediate results), or
as translators (which translate a finished programme, hopefully well written
by the programmer).
Those programming languages allow the human programmer to write instructions
to the computer in the form of short commands, known as mnemonic commands,
that a human can more easily remember. Mnemonic commands for medium level
languages are known as assembly commands. They can be written in numbering
bases other than base two (usually bases four, six, eight, ten, twelve or
sixteen, but any numbering base can be used), or they can instead be short
mnemonic words.
Most medium level (assembly language) commands execute only one instruction
in low level (machine code), although some assemblers allow the use of macro
instructions, where one medium level command executes some or many low level
instructions. This is the principle on which high level languages are based.
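The relation between those numbering bases can be sketched in Python (the
example value is arbitrary, chosen only for the illustration):

```python
# An arbitrary eight-bit value, shown in several of the numbering bases
# mentioned above. Python's format() handles bases two, eight and sixteen.
value = 0b10010000  # 144 in base ten

print(format(value, "b"))  # base two: 10010000
print(format(value, "o"))  # base eight: 220
print(value)               # base ten: 144
print(format(value, "x"))  # base sixteen: 90
```

Whatever the base in which the programmer writes it, the value stored in the
computer is the same pattern of bits; the bases are only notations for the
human.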
The commands of high level languages approach words of some natural human
language (often the English language). Some of those quasi natural languages
are intended to be immediately interpreted, others are meant to be translated
later, and others can be interpreted or translated (Basic, Forth or Lisp are
examples of the latter). There are certain advantages and disadvantages to
each of these approaches. Low level code is understood by the computer
directly, without need of any programme for interpretation or translation,
but it is extremely tedious and prone to errors for most humans. Low level
code or medium level languages are the best ways of optimising the resources
of a specific type of processor, but they require the coder or programmer to
know the internal structure of that type in detail, because he will need to
address memory locations directly. Hence, programmes written in low level
code or medium level languages lack portability to computers with other
processors, if those are incompatible with the processor for which the code
or programme had been written.
With high level languages the programmer does not need to know the type of
processor in which the programme will be executed, although for this very
reason, the computer resources will not be maximised either. On the other
hand, high level languages present the advantage of portability to computers
with different processors, although it is necessary to use an interpreter or
translator for each type of processor. A type of processor may be represented
by only one model, or by many compatible models. X86, for instance, is a type
of processor with many models made by different brands, such as Intel or AMD.
High level languages fall into three broad categories, although their
boundaries are not sharply defined:
Sequential, linear, single block: they are languages intended for writing a
sequential linear programme in a single block, maybe jumping instruction
lines forward or backward. Basic belonged to this category until 1976, when
Steve Garland modified Dartmouth Basic 6 to facilitate structured modular
programming in connected blocks, thus creating SBasic, a precompiler that
produced Dartmouth Basic 6 output and that formed the basis of ANSI Basic.
Structured, modular, connected blocks: they are languages intended for
writing a structured modular programme in connected blocks. Besides the above
mentioned SBasic dialect of 1976, the Pascal programming language belongs to
this category from its start in 1971.
Object oriented: they are languages intended for writing a programme oriented
to objects. Visual Basic of Microsoft Corporation belongs to this category.
Additionally, programmes can be more or less divided between those that
expect a rigid sequence of procedures performed by the human operator (as in
the DOS operating system), and programmes that allow flexible events
performed by the human, without following a rigid sequence (as in the Windows
operating system). The former are always much smaller than the latter, in
terms of memory needed.
Most simple programmes are written in sequential form, even in programming
languages that allow modular structure. Sequential programming was the rule
until 1971 or so, and continues to be the practice today for minor
programmes. Before that year most programming in any language was done with
little or no modular structure, just a single block, maybe jumping lines
forward or backward. Some Computing theorists like Edsger Dijkstra strongly
criticised this form of programming, in which the abuse of instructions like
"goto" or "gosub" (in the case of the Basic language) was a clear indication
of a programmer who had not paid much attention to the structure of his
programme, only to its function. This is not a problem for the computer,
which can easily jump to another line, but it makes it difficult for most
human programmers (excepting perhaps the original programmer) to comprehend
the totality of the programme well enough to make changes to it without also
introducing some error or unexpected behaviour in another part of a long and
complex programme.
In 1971 the group of Harlan Mills working at International Business Machines
managed to create an almost faultless programme structured in modules, for
The New York Times. This success boosted tremendously the idea of structured
modular programming. Some languages created at or after that time promoted
this approach to programming in various ways. In the case of Pascal (1971),
by forcing the programmer to declare all variables at the start of the
programme. Another
improvement gradually done to programming was the creation of libraries with
parts of programmes, which could be inserted into different programmes when
needed. Using these libraries, the programmer has no need to create everything
from scratch or to "re-invent the wheel", but just to combine different parts
into a general programme, adapting these parts if necessary. Needless to say,
the parts and the general programme must be written in the same language and
dialect, or they must be adapted for conforming to it.
There have been about 2 000 known programming languages in the History of
Computing (not counting dialects), about half of those languages for big
computers and the other half for medium or small computers, but the vast
majority of them have had a very restricted use. Less than 100 languages have
been widely used (in some of them, only a few dialects have been widely used).
Below are listed about 60 programming languages, those that are perhaps the
best known or those that possess a marked historical interest.
A-0 Coding Translator: the first compiler of routines (repetitive tasks, done
many times by the computer in the same or in different programmes). It was
created by Captain Grace Hopper (United States Navy and Univac) in 1951.
ASIC, All-purpose Symbolic Instruction Code: with a name derived from "BASIC",
but without the "B", the ASIC programming language was created by David Visti
in 1993, together with a compiler to machine code, a converter from BASICA or
GWBASIC to ASIC, and a few other tools available in the Internet as shareware.
ASIC recognises about 90 key words. It is a dialect of BASICA and of GWBASIC.
ASP, Active Server Pages: a server side scripting language, invented and
developed by Microsoft Corporation. ASP commands are embedded within HTML
documents having their name extended with an .asp type suffix (extension), in
order to provide dynamic content. ASP is often supported by Web host servers
using an NT host server. This language supports Visual Basic by default, and
it can be made ready to support other languages as well. Besides CGI, the two
languages that were designed from the start for the purpose of generating
dynamic content, ASP of Microsoft Corporation and Java of Sun Microsystems,
stand now as the main languages used for that purpose, in direct competition
against each other.
Ada, Universal Programming Language: named in honour of Lady Ada Augusta
Countess of Lovelace, who in 1843 suggested ideas about programming the
Analytical Engine of Charles Babbage. Ada was created in 1979 by the United
States Department of Defense, in a similar way to the creation of Cobol
twenty years earlier. In 1981 Ada became the official standard for military
applications in the United States. Ada is based on Pascal, combined with
other languages, and was considered by many the most advanced programming
language of its time. It was intended as a lingua franca of programming,
applied also to scientific or commercial purposes and to computing systems.
Ada Power
Algol, Algorithmic Language: intended as another lingua franca of programming,
it was created in 1958 by an international committee assembled in Zurich. Its
three main dialects were Algol 58, Algol 60 and Algol 68. This language was
used from the late 1950's to the early 1970's.
APL, A Programming Language: structured language of very high level created
by Kenneth Iverson (International Business Machines) in 1968, targeted to big
computers. It uses some special characters that require a special keyboard,
and much memory. It is so concise that it is difficult to be read and
understood by other programmers, but of high productivity for the original
programmer. Applied to scientific or commercial purposes and to modelling.
AIML, Artificial Intelligence Mark-up Language: an application of Extensible
Mark-up Language. It is used for programmes called "talking robots", that
simulate a conversation.
.ASM: file type (also called suffix, or extension) commonly used for
programmes written in any assembly language.
Awk: a programming language of the Unics operating system, for processing
text.
Assembly: any language of medium level, specific for a given model of
processor, which uses instructions written in a numbering base other than
base two (bases four, six, eight, ten, twelve or sixteen are commonly used,
but any numbering base can be used), or in short commands known as mnemonic
commands, which are better remembered by humans. Programmes in assembly may
be interpreted line by line or may be translated by an auxiliary programme
known as assembler. Writing a source programme in assembly maximises the
resources of a certain model of processor, although it also requires knowing
the internal structure of that model in detail, because the programmer needs
to address memory locations directly, and to instruct the Central Process
Unit for doing specific tasks one by one. There is also a loss of portability,
as an assembly programme will not normally work in other computer processors.
Programming in assembly differs little from coding in low level machine code
in numbering base two (long sequences of zeroes and ones grouped by the
number of bits contained in one byte). Assembly languages are cryptic for
most novice programmers, hence only old hands make regular use of them. The
compensation for that difficulty is a programme that is efficient and small,
processed faster by the computer than with any high level language.
Assembly is mostly used for basic software or for specific routines. Most
high level languages possess the means of writing part or all of the programme
in low level code or also in a medium level language. In Basic, for example,
some of the commands for working in low or in medium level are "peek", "poke"
and "call absolute".
Mister Patrik Ohman wrote these short explanations in 2004:
What is machine code ?
Assembly language or machine code ?
When to use assembly language:
Assembly language is very flexible and powerful; anything that the hardware
of the computer is capable of doing can be done in assembly.
Mister Ohman does not mention specific assembly languages, for it would take
his introduction into too vast a field. See the Terse programming language in
the alphabetic list below. It is an excellent choice for assembly programmers.
A variety of programmes exist, ready for programming in assembly. Most
assemblers work with x86 processors, some assemblers work with other
processors.
Commercial assemblers:
Freeware or shareware assemblers:
There are also complete Integrated Development Environments, allowing the
programmer to build a whole application in assembly only, or to combine
assembly with a high level language. Known IDEs for assembly are:
Alab, Asmedit (for ASM), Asmide (for ASM), Microasm (for MASM), Nasmide (for
NASM), Tasmide (for TASM).
Basic, commonly written as 'BASIC' and parsed as 'Beginners All-purpose
Symbolic Instruction Code', but that is a back acronym. The language was
originally named 'Basic' because it was a simple, basic programming language,
but since most names of programming languages were acronyms, round the mid
1970's some people began to invent back acronyms for Basic. The back acronym
mentioned before became commonly accepted. No acronym for Basic originally
existed, as can be verified by reading texts of the 1960's and early 1970's,
where the name 'Basic' appears ALWAYS, and the name 'BASIC' appears NEVER.
Always loyal to the origins, CSS Dixieland consistently uses the name 'Basic'
for the language, not 'BASIC'.
This language is here explained in more detail than the others, due to its
historical importance and to the fact that Basic was the first programming
language used by P. A. Stonemann, CSS Dixieland. A dialect of Basic was
included in the Read Only Memory of a Casio PB-700 (nicknamed "Soldier Boy"
by Mister Stonemann), a pocket computer made in 1983, of 8 bits and with a
Random Access Memory of only 4 Kilobytes. Yes, FOUR Kilobytes. Amazing, to be
able to do something with so few resources. Mostly mathematical tasks, using
Soldier Boy as a programmable calculator.
Information on Basic taken mainly from Wikipedia:
Wikipedia, the Web Encyclopaedia
Basic is a programming language created by John Kemeny and Thomas Kurtz
(Dartmouth College) during 1963 and early 1964. It was partly inspired by
Dart, an experimental language created on a Royal McBee LGP-30 computer in
1959 by a student of Dartmouth College. Basic appeared in response to the
need felt by certain students for a simple language, easier to learn than the
languages available at that time, mainly Fortran and Algol. Basic was
therefore a simplification of them, with most key words taken from Fortran II
and some taken from Algol 60. The first dialect of Basic, now called
Dartmouth Card Basic, was a non-interactive dialect based on a standard card
reader, intended for batch processing. Work on the Basic compiler and on a
time sharing system at Dartmouth College was done concurrently, therefore
this first dialect of early 1964 was executed in the batch processing system
before the time sharing system was ready.
The second dialect, now called Dartmouth Basic 1, was an interactive dialect
for the Time Sharing System, a number of dumb terminals that shared a common
mainframe computer available at Dartmouth. The idea of sharing computer time
had been proposed in July 1958, and became operational in 1962 at the
Massachusetts Institute of Technology, as an improvement to the method of
batch processing. Non-technical students were in this manner able to write
their own programmes. John Kemeny and John McGeachie declared in an interview
in 1974, that at 04 hours (Official Local Time) of 1st May 1964 they had
successfully executed the first interactive Basic programme from terminals of
the Dartmouth Time Sharing System. It is not clear what this programme was,
but it seems that it consisted of the single line:
PRINT 2 + 2
They also seemed to remember that a few hours after executing that single
line, they programmed an implementation of the Sieve of Eratosthenes. In
ancient Greece, Eratosthenes of Cyrene devised the following method for
finding the sequence of prime numbers (that is, numbers that are divisible
only by themselves or by the number one to give an integer):
-Write the sequence of natural numbers (positive integers) as long as desired:
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22,
23, 24, 25...
-Eliminate number 1. By modern convention, one is not counted as a prime
number, and it does not need to appear in the list because it is the start of
numbering itself.
-Start from the first number, the 2, and eliminate its multiples. The series
has become: 2, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25...
-Continue from the next number, the 3, and eliminate its multiples (for
example 9, 15 and 21). Now the series has become: 2, 3, 5, 7, 11, 13, 17, 19,
23, 25...
-Continue from the next number, the 5, and eliminate its multiples (for
instance 25). After that we have the series: 2, 3, 5, 7, 11, 13, 17, 19, 23...
-Continue from the next number, the 7, and eliminate its multiples. Then the
next, the 11. Then the next, the 13. Then the next, the 17. Every time
eliminate the multiples of each of these numbers. The numbers that after all
those eliminations still remain in the series, are all prime numbers: 2, 3, 5,
7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 53, 59, 61, 67, 71, 73...
Knowing prime numbers is useful for determining instantly the divisibility or
not of a given number, without need of performing the operation of division.
The search for a formula that directly produces the prime numbers has already
lasted for over 2 500 years, without practical result, and most
mathematicians believe that no such simple formula exists. The Sieve of
Eratosthenes remains the standard method for generating them. Needless to
say, computers can perform the algorithm of this method at high speed, and
give as a result an enormously long list of prime numbers.
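The elimination steps above can be sketched as a short programme. Python is
used here purely for illustration (the 1964 original would of course have
been a Basic programme):

```python
# A sketch of the Sieve of Eratosthenes as described above.
def sieve(limit):
    """Return all prime numbers up to and including limit."""
    # Start with every number marked as a candidate, then unmark 0 and 1.
    candidate = [True] * (limit + 1)
    candidate[0:2] = [False, False]
    for n in range(2, int(limit ** 0.5) + 1):
        if candidate[n]:
            # Eliminate every multiple of n, starting from n*n
            # (smaller multiples were already eliminated earlier).
            for multiple in range(n * n, limit + 1, n):
                candidate[multiple] = False
    return [n for n, is_prime in enumerate(candidate) if is_prime]

print(sieve(30))   # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

Only the multiples of numbers up to the square root of the limit need to be
eliminated, since any composite number below the limit has a factor in that
range.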
Basic was one of the very first programming languages intended to be used
interactively, beginning with Dartmouth Basic 1. Almost immediately Basic
became so successful that in October of its first year, 1964, it was
modified, producing its third dialect, now called Dartmouth Basic 2. Several
dialects were produced over the years, all implemented as compilers by
undergraduate teams working under the direction of the original designers.
Chronology of Basic:
-Early 1964: Dartmouth Card Basic becomes available to Dartmouth members.
Non-interactive dialect intended for batch processing based on a card reader.
-May 1964: Dartmouth Basic 1, interactive dialect for the Time Sharing System.
-June 1964: Dartmouth Basic 1 becomes available to Dartmouth members.
-October 1964: Dartmouth Basic 2 becomes available to Dartmouth members. It
added zero subscripts to arrays, and the semi-colon separator to the PRINT
command.
-1966: Dartmouth Basic 3 becomes available to Dartmouth members. It added 3
commands, among them INPUT.
-1969: Dartmouth Basic 4 becomes available to Dartmouth members. It added
text manipulation and string variables.
-1970: Dartmouth Basic 5 becomes available to Dartmouth members. It added
true file handling.
-1971: Dartmouth Basic 6 becomes available to Dartmouth members. It added
separately compilable procedures with parameters. Most later Basic dialects
derive from Dartmouth Basic 6.
-1976: SBasic (Steve Basic or Structured Basic). Steve Garland modified
Dartmouth Basic 6 to facilitate structured modular programming in connected
blocks, in an effort to avert the abuse of line jumping by GOTO or GOSUB
instructions that was a typical characteristic of sequential linear
programming in a single block. This structured approach had been advocated by
programming theorists such as Edsger Dijkstra, and it was strictly applied to
some new languages, such as Pascal (1971). SBasic was a precompiler that
produced Dartmouth Basic 6 output, and that formed the basis of ANSI Basic.
-1976: Modular Basic, by John Kemeny and Thomas Kurtz. Another effort to
facilitate structured modular programming in connected blocks, as opposed to
sequential linear programming in a single block.
-1978: ANSI Basic publicly recommended by the Basic Committee of the American
National Standards Institute.
-1979: Dartmouth Basic 7 becomes available to Dartmouth members. It was the
final original dialect of Basic at Dartmouth College, released by John Kemeny
and Thomas Kurtz as an ANSI Basic compiler, before they left the college to
concentrate on the further development of ANSI Basic in the form of True Basic.
-1985: True Basic publicly released by John Kemeny and Thomas Kurtz. Being
based on ANSI Basic, True Basic was an effort against chaotic dialects of
Basic.
Keywords of Dartmouth Time Sharing System and Dartmouth Basic 1 of May 1964:
Dartmouth Time Sharing System of 1964, 14 commands. Many non-technical human
operators believed that the commands were part of the Basic language,
but in fact they were part of the Dartmouth Time Sharing System itself and
were also used when preparing Algol or Fortran programmes via the Dartmouth
Time Sharing System terminals.
The Dartmouth Time Sharing System implemented an early integrated development
environment: an interactive command line interface. Any line beginning with a
number, typed in the command line by the human operator, was added to the
programme and replaced any previously stored line with the same number and
for the same programme. Therefore, to substitute a line, a new line with the
same number was written in the command line. To erase a line, an empty line
with the same number was written in the command line. This means that lines
consisting solely of a line number were not stored, but removed any
previously stored line with the same number and for the same programme.
Anything else was assumed to be a command of the Dartmouth Time Sharing
System and it was immediately executed. This method of editing was necessary
because of the use of teletypes as the terminal units for the Dartmouth Time
Sharing System. There was no need to log out: if human operators did not
respond in a short time, they were automatically logged out by the Dartmouth
Time Sharing System.
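The editing rule just described can be sketched as follows. This is an
illustrative model in Python, not DTSS code, and the handling of system
commands is reduced to a stub:

```python
# Model of the DTSS line-editing rule: a line beginning with a number is
# stored under that number, the same number replaces the old line, a bare
# number erases the line, and anything else counts as a system command.
programme = {}

def enter_line(text):
    head, _, rest = text.strip().partition(" ")
    if not head.isdigit():
        return "command"              # handed to the Time Sharing System
    number = int(head)
    if rest:
        programme[number] = rest      # store or replace the stored line
    else:
        programme.pop(number, None)   # bare line number: erase the line
    return "stored"

enter_line("10 PRINT 2 + 2")
enter_line("20 END")
enter_line("10 PRINT 3 + 3")   # replaces line 10
enter_line("20")               # erases line 20
print(sorted(programme.items()))   # [(10, 'PRINT 3 + 3')]
```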
Dartmouth Basic 1 of May 1964, 15 commands, 9 arithmetic or trigonometric
functions, 15 symbols for arithmetic, logic or grouping operations, and 286
variables. Dialect intended for sequential linear programming in single block.
Basic implemented floating-point numeric variables, each with a name of up to
2 characters whose first character was always a letter, therefore up to 286
variable names were available (a to z, a0 to a9, b0 to b9, ..., z0 to z9).
Array names were restricted to the range a to z, representing so 26 arrays.
Arrays did not need to be dimensioned, but in the absence of a DIM statement
they defaulted to 10 elements, subscripted from 1 to 10. Instruction lines
had to be numbered, intervals being allowed. They were usually numbered in
intervals of ten or another number, thus making it possible to insert new
lines without need of numbering all lines again. This was necessary due to
the line editor used in the Dartmouth Time Sharing System. As opposed to a
page (screen) editor, a line editor can only move the cursor horizontally,
but not vertically (much like the COPY CON and similar commands that were
later used in DOS and other operating systems of the 1970's and 1980's, or
like the EDLIN text editor). Dartmouth Basic 1 had no command for entering
data interactively: the INPUT command was only introduced in Dartmouth Basic
3 of 1966. Also, if READ found no more DATA to be read, it considered
the programme ended. The original manual presents some extensions, such as
the handling of matrices.
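The count of 286 variable names follows directly from the naming rule: 26
single letters plus 26 times 10 letter-digit pairs. A short Python sketch
enumerating them:

```python
# Enumerate the variable names allowed in Dartmouth Basic 1 as described
# above: a single letter, optionally followed by a single digit.
import string

names = list(string.ascii_lowercase)            # a to z: 26 names
names += [letter + digit
          for letter in string.ascii_lowercase
          for digit in string.digits]           # a0 to z9: 260 names

print(len(names))   # 286
```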
In 1975, Mister Paul Allen and Mister William Bill Gates, founders of
Micro-Soft Corporation (name later modified to Microsoft Corporation), began
their activities by adapting a Basic dialect to microcomputers. On a big
computer at Harvard University, they emulated the Intel 8080 microprocessor
that had been incorporated in the Altair 8800 microcomputer. Shortly
afterwards they signed a contract with the maker of the Altair,
Mister Edward Ted Roberts (Micro Instrumentation and Telemetry Systems) for
the inclusion of that Basic dialect in the Read Only Memory of Altair units.
Thus, the dialect is now known as Altair Basic. To include one or another
Basic dialect in the Read Only Memory of a microcomputer became a common
practice for many makers of microcomputers in the second half of the 1970's,
all of the 1980's, and the first half of the 1990's.
The first dialects of Basic were limited to 64 Kilobytes of programme size.
BASCOM 1.0 was released by IBM in March 1982, the first Basic translator
(compiler) for the IBM PC, written by Microsoft with code and method
developed by Bill Gates,
Greg Whitten and others. Microsoft had already written Basic interpreters for
Apple II computers and CP/M operating system, but BASCOM 1.0 was the most
powerful so far. Compared to Basic interpreters available then, BASCOM 1.0
offered many additional capabilities and an enormous increase in programme
execution speed. Programme statements could exceed 255 characters, a single
string could have up to 32767 characters, and assembly language subroutines
could be linked directly to a Basic programme.
Microsoft continued enhancing the translator, which IBM released in 1985 as
BASCOM 2.0 with many improvements. Among the most important were multi-line
DEF FN functions, dynamic arrays, network record locking, and an ISAM file
handler. With named subroutines, programmers were finally able to exceed the
size limitation of 64 Kilobytes by writing separate modules that could then
be linked together. Subroutine parameters also were an important step toward
structured programming in Basic. At the same time that IBM released BASCOM
2.0 in 1985, Microsoft offered an almost identical translator as QuickBasic
1.0, only without the ISAM file handler. Sold at 100 Dollars, QuickBasic 1.0
was cheaper than BASCOM 2.0.
With the strong acceptance of QuickBasic 1.0, Microsoft followed it with
QuickBasic 2.0 in early 1986. This important new release added an integrated
editing environment and EGA graphic capabilities. The editor was especially
welcome because it allowed programmes to be written and tested very rapidly.
The environment was further enhanced with the advent of Quick Libraries,
which allowed assembly language subroutines to be easily added to a Basic
programme. Quick Libraries also helped to launch the start of a new class of
Basic product: third-party add-on libraries. In early 1987 Microsoft released
a major enhancement to QuickBasic as 3.0. This included a limited form of
step and trace debugging and the ability to monitor a variable value
continuously during programme execution.
Also added was a support for the EGA's 43-line text mode and for several new
language features. Perhaps most impressive of these features were the control
flow statements DO LOOP and SELECT CASE, as an improved alternative to the IF
statement. Also added with 3.0 was optional support for Intel 8087 numeric
coprocessor. However, in order to support a coprocessor, Microsoft had to
abandon its own proprietary numeric format. Both the Microsoft and IEEE
methods for storing single and double precision numbers use respectively 32
bits and 64 bits, but the bits are organised differently. Though the IEEE
format that Intel 8087 requires is substantially slower than Microsoft's own,
it is the current standard. Therefore, a second version of the interpreter
was included solely to support IEEE mathematics.
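For illustration, the IEEE 754 single precision layout mentioned here (1 sign
bit, 8 exponent bits, 23 fraction bits in 32 bits; double precision uses 1,
11 and 52 bits in 64) can be inspected in Python:

```python
# Unpack the bit fields of an IEEE 754 single precision number.
import struct

def ieee_single_fields(x):
    """Return (sign, biased exponent, fraction) of x as a 32-bit float."""
    (bits,) = struct.unpack(">I", struct.pack(">f", x))
    sign     = bits >> 31            # 1 bit
    exponent = (bits >> 23) & 0xFF   # 8 bits, biased by 127
    fraction = bits & 0x7FFFFF       # 23 bits
    return sign, exponent, fraction

# 1.0 is stored with sign 0, biased exponent 127 and fraction 0.
print(ieee_single_fields(1.0))    # (0, 127, 0)
print(ieee_single_fields(-2.0))   # (1, 128, 0)
```

The Microsoft Binary Format placed the same kinds of fields in a different
order, which is why the two formats were incompatible bit for bit.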
By the time that QuickBasic 4.0 had been released in late 1987, maybe half a
million copies of QuickBasic were already in use world-wide. With QuickBasic
4.0 Microsoft had created the most sophisticated programming environment ever
seen in a mainstream language: the threaded p-code interpreter. This remarkable
technique allowed programmers to enjoy the best features of an interpreted
language with the execution speed of a translated one. In addition to an
Immediate mode whereby programme statements could be executed one by one,
QuickBasic 4.0 also supported programme break-points, monitoring the value of
multiple variables and expressions, and even stepping backwards through a
programme.
That improvement greatly enhanced the debugging capabilities of the language,
increasing programmer productivity enormously. Also new in QuickBasic 4.0 was
support for inter-language calling. Although this meant that a programme
written in Microsoft Basic could now call subroutines written in any of the
other Microsoft languages, it also meant that IEEE mathematics was no longer
an option, it became mandatory. When a QuickBasic 4.0 programme was run in an
IBM PC equipped with a mathematic co-processor, floating point mathematics
was performed very quickly indeed. However, it was very much slower in every
other computer. This remained a sore point for many Basic programmers, until
Microsoft at the end of 1987 released Basic 6.0, which included an alternate
mathematic library that was similar to Microsoft's original proprietary format.
Also added into QuickBasic 4.0 were huge arrays, long integer variables (of
32 bits), user defined TYPE variables, fixed-length strings, true functions,
and support for CodeView debugging. With the introduction of huge arrays,
Basic programmers could create arrays that exceeded 64 Kilobytes in size,
with only a few restrictions. TYPE variables let the programmer define a
composite data type composed of any mix of Basic intrinsic data forms, thus
adding structure to a programme's data as well as to its instructions. The
newly added FUNCTION procedures greatly improved on Basic's earlier DEF FN
style functions by allowing recursion, the passing of TYPE variables and
entire arrays as arguments, and the ability to modify an incoming parameter.
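As a rough analogue of what composite TYPE variables and recursive FUNCTION
procedures made possible, the following sketch uses invented names, and
Python merely stands in for Basic here:

```python
# Rough analogue of a QuickBasic 4.0 TYPE variable: a composite record
# mixing intrinsic data forms (the field names are invented for the example).
from dataclasses import dataclass

@dataclass
class Employee:        # TYPE Employee ... END TYPE, in QuickBasic terms
    name: str          # would be a fixed-length string field
    age: int           # would be a long integer field
    salary: float      # would be a single precision field

# Recursion, which FUNCTION procedures allowed and DEF FN functions did not:
def factorial(n):
    return 1 if n <= 1 else n * factorial(n - 1)

worker = Employee("Ada", 36, 1234.5)
print(worker.age, factorial(5))   # 36 120
```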
Although Basic 6.0 provided essentially the same environment and interpreter
as QuickBasic 4.0 did, it also included the ability to create programmes that
could be run in the OS/2 operating system. Other features of this dialect were
a utility programme to create custom run-time libraries, and a copy of the
Microsoft programmer's Editor. The custom run-time utility was particularly
valuable, since it allowed programmers to combine frequently used subroutines
with the BRUN.EXE language library, and share those routines among any number
of chained modules. QuickBasic 4.5 was introduced in 1988, although the only
major enhancement over the 4.0 was a new help system and slightly improved
pull-down menus.
Unfortunately, the new menus required much more memory than QuickBasic 4.0,
and the "improved" environment reduced the memory available for programmes
and data by approximately 40 Kilobytes. To this day many programmers continue
using QuickBasic 4.0, precisely because of its increased programme capacity.
QBasic appeared later, in 1991 with MS-DOS 5.0, as an interpreted dialect
that allowed 160 Kilobytes of programme size (while the QuickBasic translator
allowed 64 Kilobytes), and without the linking features of QuickBasic 4.5,
but otherwise identical. Basic became a mature language, no more an exclusive
domain of
beginners, but a language used by professionals as well. In answer to
programmer's demands for more string memory and smaller, more efficient
programmes, Microsoft released the QuickBasic Extended (QBX) Professional
Development System (PDS) 7.0, in late 1989.
This was an enormous project even for a company the size of Microsoft, and at
one point more than fifty programmers were working on the new interpreter and
QBX environment. QBX PDS 7.0 finally let Basic programmers exceed the usual 64
Kilobytes of string memory limit, but with some limitations. Other features
coming with QBX PDS 7.0 were an ISAM file handler, improved library granularity,
example toolbox packages for creating simple graphics and pull-down menus,
local error handling, arrays within TYPE variables and improved documentation.
The QBX interpreter uses expanded memory to store subprogrammes or functions,
thus much larger programmes could be developed without resorting to editing
and translating outside of the environment. In mid 1990 QBX PDS 7.1 was released
with the long-overdue ability to redimension an array without erasing its
content. Added to QBX PDS 7.1 was also support for passing fixed-length string
arrays to subprogrammes and functions, and an option to pass parameters by
value to Basic procedures.
In 1992 Microsoft released Visual Basic for DOS. A notable difference is that
VB/DOS supports far strings only, while QBX PDS lets the programmer specify
either near or
far strings. Because far strings are stored in a separate "far" area of DOS
memory, it takes slightly longer to access those strings. Therefore, a string
intensive VB/DOS programme will not be as fast as an equivalent programme
interpreted with QuickBasic or with QBX PDS near strings. VB/DOS also uses
pseudo event-driven forms. The Microsoft editor has not been widely accepted
by Basic programmers, mostly because it is big, slow and complicated to use.
Microsoft includes with Basic PDS a new editing environment called Programmer
Workbench, but it is also generally shunned by serious programmers, for the
same reasons.
From about 1975 to about 1995 Basic was ubiquitous in one dialect or another.
As said, it was present in the Read Only Memory of almost all microcomputers
and in many of the medium and big ones, from pocket computers made by Casio,
to giants made by IBM. Different dialects of Basic could be used by beginners
as well as by experienced programmers, because Basic allowed different levels
of programming complexity. For these reasons Basic became the most widely
known programming language in the world. It was even regularly taught in
North American schools, much as a natural human language. Most students had
some acquaintance with Basic, enough for being able to write some small
programmes for their own personal purposes, mostly recreational or as a help
in their other studies. It was estimated that in the early 1990's some
millions of people were reasonably fluent in one or another dialect of Basic.
Together with Forth and Lisp, Basic is one of the few languages that can be
interpreted or translated (each of these approaches has its advantages and
disadvantages). Basic has more than 250 different dialects (including minor
variations). Even before the ANSI Basic of 1978, there were already almost
100 dialects (most of them based on Dartmouth Basic 6 of 1971). Some of those
dialects are very weak, while others are very powerful. Tiny Basic is an
example of the weak ones, intended as it is for small computers with very
limited memory: it occupies only one Kilobyte in memory and uses only short
integer variables (it lacks long integer, single precision, double precision,
or character strings). Examples of strong dialects are QuickBasic, QBasic,
Basic PDS or Visual Basic, all by Microsoft Corporation. Some of the most
important Basic dialects are:
Original Basic:
1964 Dartmouth Card Basic, by John Kemeny and Thomas Kurtz
Natural developments from the Original Basic:
1976 SBasic, by Steve Garland
Basic dialects for small computers:
1975 Microsoft Altair Basic, by Paul Allen and William Bill Gates
Basic dialects for medium computers:
Data General Basic
Basic dialects for big computers:
IBM Basic
For those programmers who wish to continue working in the Basic language there
are, as of 2023, five main options. One is QBasic, QuickBasic 4.5 or QuickBasic
Extended 7.1 (QBX PDS), which are available on the Internet for free use under the
Microsoft Life Cycle Policy. They only work in DOS, but they can be executed in
Linux via an emulator or virtual machine like DOSbox or DOSemu. The executables
made by the QB 4.5 or QBX 7.1 translators are also for DOS only, but they can,
likewise, be executed via DOSbox or DOSemu. Unfortunately, the documentation is
in a proprietary hyper text format, difficult to use. The executable QH.EXE,
included in the package, can be used for copying pages of the documentation as
plain text, more efficiently than from the Integrated Development Environment.
A second option is Turbo Basic, later renamed Power Basic. It is good software,
but unfortunately, it only exists for Microsoft Windows. Besides, after the
death of the leader of the programming team, Turbo-Power Basic seems to have
stopped development. For years, only minor enhancements have been done to it.
A third option is Free Basic, which works in DOS, Linux, and some other systems.
This is a command-line translator, for which one or two Integrated Development
Environments have been made by third parties. However, Free Basic is of rather
difficult use, and in some situations it tends to freeze, in DOS and in Linux.
A fourth option is BW-Basic, also known as Bywater Basic. This is an interpreter
for DOS, it cannot create executables. It can be executed in Linux via DOSbox or
DOSemu (it is included in the DOSemu distribution). BW-Basic is limited in some
aspects, but it has well explained documentation, suitable for training or for
fast checking of simple Basic sources (although not of complex Basic sources).
The fifth and last option, QB64, is perhaps the best one nowadays. As of 2023
the current version of QB64 is of 32 bits; another version of 64 bits
is in preparation. QB64 is a command-line translator that includes Integrated
Development Environment, but it translates the Basic source to C++ language,
not directly to executable. QB64 automatically sends the C++ source to GNU GCC
(the GNU Compiler Collection), which converts it to assembly and sends it to
GNU AS (the GNU assembler), which converts it to a binary object and sends it
to GNU LD (the
GNU linker-loader), which finally creates the executable.
This conversion process (QB64 to GCC, from this to AS, from this to LD) takes
only a few seconds in a reasonably equipped computer, but it means that those
executables and their libraries must be installed in a known path. Likewise
installed must be these libraries:
Open Graphics Library, for including graphics or images in the executable. All
screens created by QB64 appear as windows in a graphic environment such as X-11
Window System. In general Open GL is necessary, even for text-only output.
ALSA, if wishing to include sounds in the executable created by QB64.
Zlib, for performing compression of data by the Gzip algorithm.
Because there is no QB64 interpreter, error finding has to be done by careful
perusing of the Basic source, or AFTER having created the executable. There are
various manners of doing this, one is by loading the executable into DDD (Data
Display Debugger), or directly into GDB (the GNU Debugger). The C++ source can
be called into the debugger for analysis (not the Basic source). The C++ source
of the most recent programme is called main.txt, located in the internal/temp
directory. DDD and GDB are complex, reading their documentation is mandatory.
There are three ports of QB64: for Linux, for Apple Macintosh, and for Microsoft
Windows NT series (neither for DOS, nor for Microsoft Windows DOS series). QB64
itself can be used from the command line of the Linux console, but executables
created by QB64 need X-11 or another graphic server. They can be executed via a
terminal emulator, from the Krunner of KDE or other graphic interfaces, or from
a data set manager such as Konqueror, PCManFM, Krusader, Dolphin, et cetera.
QB64 has almost total compatibility with Microsoft QuickBasic 4.5, except for a
few commands that make sense in 16-bit DOS, but which cannot work in a 32-bit
or 64-bit operating system, such as PEEK, POKE, CALL ABSOLUTE, and a few other
commands. QB64 adds MANY new commands that do not exist in QuickBasic 4.5, but
these commands may require the installation of the dependencies mentioned above.
ALSA in particular is needed for sound: without it even the BEEP command does
not work at all, because QB64 uses the sound card instead of the internal
computer speaker.
The documentation is excellent, though perhaps too concise. It is obviously
intended for the programmer experienced in Basic (especially in QuickBasic),
and maybe somewhat cryptic for the novice. It is a reference, not a tutorial,
but for the experienced programmer it is a mine of information. QB64 is at:
QB64
BCPL: a language that was the direct ancestor of the B and C languages.
C: structured modular language, similar to Pascal, created by Dennis Ritchie
(Bell Laboratories, a part of AT & T Telephone) in 1972-1974. The name comes
from its experimental predecessor, the 'B' language, which was in turn based
on the BCPL language. C is a medium-high level language, higher than any
assembly but without being hardware dependent, therefore more portable than
assembly. It is intended for the experienced programmer. The Unics operating
system, originally written in assembly language by Kenneth Thompson, was
re-written in the C language in 1973 by Dennis Ritchie and Kenneth Thompson.
C++: general purpose language created by Bjarne Stroustrup (Bell
Laboratories) in the early 1980's as an extension of the C language with
classes and other facilities for object oriented programming.
Clipper: a high level programming language.
Coder: a person who writes only in low level code (machine code, in numbering
base of two), as opposed to programmer: a person who writes in language of
medium or high level. Since the 1950's, most programming has been done in
medium level (assembly) or in high level. Low level has almost become limited
to personal experiments or to learning exercises.
Cold Fusion: a scripting language for interfacing data bases and advanced Web
development. Cold Fusion supports data bases such as Microsoft Access,
FoxPro, dBASE, and Paradox.
Cobol, Common Business Oriented Language: a translated language used for
commercial purposes. Its creation began in 1959, being released in August
1961 by a North American committee, Codasyl, under the inspiration of Captain
Grace Hopper (United States Navy and Univac, see "A-0 Coding Translator"),
and grouping the National Bureau of Standards, the United States Navy, the
United States Air Force, International Business Machines, Sperry-Univac,
Burroughs, Honeywell-Bull, R. C. A., Sylvania and Remington. The American
National Standards Institute introduced in 1974 a dialect called Cobol ANSI
74, and another dialect was introduced in 1980 with the name of Cobol 80. In
spite of these few dialects, Cobol has always been one of the most unified of
all programming languages, and can be easily learnt by those acquainted with
the daily routine in offices and archives. Cobol, however, occupies much
memory, and it is not the best language for complex mathematical operations
(Fortran is better for that). Cobol was created for batch processing, though
some interactive dialect became available in the 1980's.
CGI, Common Gateway Interface: an interface standard that provides a method
for executing a server side scripting programme from a Web site, in order to
generate a Web document with dynamic content. Scripts conforming to this
standard may be written in any programming language that could produce an
executable record, but are most often written in C, C++, Perl, PHP, Python,
or TCL. It must be said that CGI is not really a language, but an adaptation
for providing a standard to true programming languages. Besides CGI, two
languages that were from the start designed for the purpose of generating
dynamic content, ASP of Microsoft Corporation and Java of Sun Microsystems, stand
now as the main languages used for that purpose, in direct competition
against each other.
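A minimal sketch of the CGI convention described above, written in Python
with an invented page body: the server passes request information in
environment variables, and the script writes a header block, a blank line,
then the document itself:

```python
# Minimal CGI-style response: headers, blank line, then the HTML body.
import os

def cgi_page():
    # REQUEST_METHOD is one of the variables a CGI server would set.
    method = os.environ.get("REQUEST_METHOD", "GET")
    body = "<html><body><p>Method: {}</p></body></html>".format(method)
    return "Content-Type: text/html\r\n\r\n" + body

print(cgi_page())
```

The blank line separating headers from body is the essential part of the
convention: everything before it is metadata for the server and user agent.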
Coral: a programming language used from the 1960's to the 1980's.
Dart: experimental language created in 1959 by a student of Dartmouth College,
on a computer Royal McBee LGP-30. It inspired Basic in part, although the
fundaments of Basic were taken mainly from Fortran and from Algol.
DBMS: a programming language for data banks used from the 1960's to the
1980's.
DBS: a programming language for data banks used from the 1960's to the
1980's.
Delphi: a high level programming language.
DML, Dynamic Mark-up Language: a sub-set of Standard Generalised Mark-up
Language. DML is a specification, similar to HTML, developed by the World
Wide Web Consortium for Web documents. HTML will generate static pages if
used alone, while DML will generate dynamic pages. Many programmers prefer to
use HTML not alone, but in combination with dynamic scripts embedded into the
HTML code. These scripts will generate a dynamic page, server side. Languages
used for CGI, as well as the ASP, Java or i-HTML languages, are examples of
this approach.
Euphoria: an interpreted language created by Robert Craig in 1993. It claims
to be "easier than Basic, yet more powerful than C" (see also the Turing
language). More information about Euphoria at:
Euphoria
XML, Extensible Mark-up Language: a sub-set of Standard Generalised Mark-up
Language. XML is a specification, similar to HTML, developed by the World
Wide Web Consortium for Web documents. XML contains mark-up commands (also
called tags or symbols) to describe the content and formatting of a page, but
unlike HTML, the mark-up commands are unlimited and self-defining (therefore,
designers can create their own customised commands and command definitions).
Features of XML and HTML were later combined into a single language, no
longer based mainly on SGML, which the World Wide Web Consortium approved as
official specification with the name of HTML 5 in October 2014.
Forth: a structured modular language, created by Charles Moore, a programmer
working in astronomy, in 1970. It is a medium-high level language, combining
characteristics of
assembly (of medium level) with those of a high level language. Forth uses
Reverse Polish Notation, which makes it very different from other languages.
It occupies less than half of the memory of a typical programme in Basic,
being however tenfold faster than Basic. Together with Basic and Lisp, Forth
is one of the few languages that can be interpreted or translated (each
approach has its advantages and disadvantages). The first important
application of Forth was to control the radio telescope of the Kitt Peak
Observatory in Arizona, being later also used for graphics and for control of
processes.
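Reverse Polish Notation can be illustrated with a small stack evaluator. This
Python sketch only mimics the notation; it is not Forth:

```python
# Reverse Polish Notation: operands are pushed on a stack, and each operator
# pops its two operands and pushes the result, so no parentheses are needed.
def rpn(expression):
    stack = []
    for token in expression.split():
        if token in "+-*/":
            b = stack.pop()
            a = stack.pop()
            stack.append({"+": a + b, "-": a - b,
                          "*": a * b, "/": a / b}[token])
        else:
            stack.append(float(token))
    return stack.pop()

# The infix expression (3 + 4) * 5 is written "3 4 + 5 *" in this notation.
print(rpn("3 4 + 5 *"))   # 35.0
```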
Fortran, Formula Translator: one of the first scientific languages, created
in 1954-1956 by John Backus of International Business Machines and by others,
for the computer IBM 704. It is focused on the resolution of mathematical
problems of many kinds. Some of its dialects were Fortran I, Fortran II,
Fortran III, Fortran IV, Fortran IV-H and Fortran-77. Fortran is mainly a
translated language, though some interpreted dialects exist. It was the main
fundament of Basic. Fortran is not the best language for commercial purposes
(Cobol is better for that). Fortran was used for teaching before Pascal took
much of the academic field.
GASP: a simulation language. See the Simula language for a short description
on the concept of simulation languages.
GPSS: a simulation language. See the Simula language for a short description
on the concept of simulation languages.
HTML, Hyper Text Mark-up Language: publicly released in 1993, HTML is the
language used to create Hyper Text documents for being exhibited as Web
documents in the Internet. HTML documents are intended to be viewed using a
user agent to interpret the HTML commands or part of them, to give format to
text into a page (and optionally also to elements of sound or image, showing
audio or video records), to present hyper links to other pages inside the
document or to other documents, to execute Java scripts client side, or to
perform other operations. User agents may be visual (fully graphic, partly
graphic, or only textual), may be aural (text to sound), or may be tactile
(text to Braille). In the case of non-graphic user agents, they show (or they
convert to sound or to Braille) only text and hyper links, but not images or
sounds. HTML has a few dialects, of which the latest SGML version is HTML
4.01 of December 1999. HTML has been since 1990 a dialect of SGML, the
Standard Generalised Mark-up Language. However, the World Wide Web Consortium
approved in October 2014 the official specification of HTML version 5, which
is not based on SGML anymore. Instead, HTML 5 tries to combine features of
SGML with others of XML, the Extensible Mark-up Language. HTML 5 does not further
develop frames, and it consolidates separation of presentational mark-up by
means of CSS, Cascading Style Sheets, or of another style sheet language.
Not all style sheet languages support cascading. More information at:
Wikipedia, the Web Encyclopaedia
World Wide Web Consortium
IMS: a programming language for data banks used from the 1960's to the
1980's.
i-HTML, In Line HTML: a server side programming language for developing
dynamic Internet content.
ISML, Inter Shop Mark-up Language: a set of scripting tags to generate
dynamic Web pages. ISML tags are extensions of any tag based language that
conform to SGML standard.
Java: a programming language often intended for being executed in a network,
without fear of malicious code or of other damages to the client computer. By
making use of small Java programmes, called Java Applets, Web documents can
include calculators, animations, interactive games or other functions.
Besides CGI, two languages that were from the start designed for the purpose
of generating dynamic content, ASP of Microsoft Corporation and Java of Sun
Microsystems, stand now as the main languages used for that purpose, in
direct competition against each other.
Java Script: a programming language for use in Web documents, that allows
dynamic content executed client side. It was invented by Netscape with the
name of "Live Script", and renamed "Java Script" in 1995 in agreement with
Sun Microsystems, Netscape then developing the language to its current state.
In spite of the similarity in name, it is not closely related to Java.
Lisp, List Processing: a language created by John McCarthy (later of Stanford
University), presented at Dartmouth College in 1956 and since 1958 developed
at the Massachusetts Institute of Technology. Together with Basic and Forth,
Lisp is one of the few languages that can be interpreted or translated (each
approach has its advantages and disadvantages). With later improvements, Lisp
was until the 1990's the most used language for Artificial Intelligence,
being still used in the early XXI century. It tries to simulate natural
language, without pre-defined mathematical structures. The original Emacs
text editor for Unics systems includes a full Lisp system within it.
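The central idea of Lisp, that programmes and data share the single shape of
the parenthesised list, can be sketched in a few lines of modern Python. This
toy evaluator is only an illustration of the principle, not any historical
Lisp: it handles integers and four arithmetic forms.

```python
def tokenize(source):
    # Split "(+ 1 (* 2 3))" into a flat list of tokens.
    return source.replace("(", " ( ").replace(")", " ) ").split()

def atom(token):
    # Numbers become numbers; anything else is a symbol such as "+".
    try:
        return int(token)
    except ValueError:
        return token

def parse(tokens):
    # Build a nested list from the token stream (consumes it).
    token = tokens.pop(0)
    if token == "(":
        expression = []
        while tokens[0] != ")":
            expression.append(parse(tokens))
        tokens.pop(0)  # discard the closing ")"
        return expression
    return atom(token)

OPERATIONS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
              "*": lambda a, b: a * b, "/": lambda a, b: a / b}

def evaluate(expression):
    # A list is an application of its first element; atoms are literals.
    if isinstance(expression, list):
        operation = OPERATIONS[expression[0]]
        values = [evaluate(item) for item in expression[1:]]
        result = values[0]
        for value in values[1:]:
            result = operation(result, value)
        return result
    return expression

print(evaluate(parse(tokenize("(+ 1 (* 2 3))"))))  # 7
```

The point of the sketch is that the parser produces the same nested lists
that the evaluator consumes, which is exactly the property that made Lisp
suitable for programmes that manipulate other programmes.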
Logo: named in honour of the Logo Group of the Massachusetts Institute of
Technology. Created by Professor Seymour Papert (of that Institute) in 1967.
Logo is an interpreted language, used for training novice programmers. It is
a dialect of Lisp.
Modula-2: a language created by Niklaus Wirth in the 1970's, based on Pascal
(also a creation of Niklaus Wirth in 1971).
Moon Rock: a translated language created by Rowan Crowe in 1994-1995, similar
to Microsoft QuickBasic of 1985. No development seems to have been done to
Moon Rock for several years. Its distribution includes a compiler to produce
executable programmes. More information at:
Moon Rock
Mumps, Massachusetts General Hospital's Utility Multi Programming System: a
programming language, operating system and data bank created in 1966 in the
main hospital of Massachusetts. Applied to conversational purposes.
My-SQL, My Structured Query Language: an open source relational data base
management system that uses a sub-set of ANSI SQL. More information at:
My Structured Query Language
OMS: a programming language for data banks used from the 1960's to the 1980's.
Pascal: named in honour of Blaise Pascal (1623-1662), who in 1642-1647 built
some units of the "Pascaline", a machine of pinion wheels for adding, using
numbering base of ten. Pascal is a translated language structured by modules,
created by Professor Niklaus Wirth (Swiss Federal Institute of Technology in
Zurich, Switzerland) in 1971. Based mainly on Algol, Pascal began as a
teaching tool for forming new
programmers. It has a few dialects (albeit not so many as Basic), and it is
portable to different computers. Pascal became important during the 1980's
and early 1990's, largely substituting Fortran in the academic field and for
scientific or commercial purposes, even being pointed to as a probable successor
of Basic, but later it gradually fell back into the classrooms whence it came.
Perl: a server side scripting language commonly used to write programmes for
Common Gateway Interface. Perl programmes, called scripts, are text data sets
that are parsed (run through and executed) by a programme called interpreter,
located in the host server.
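The mechanism described in the Perl entry can be sketched independently of
the language used. By the Common Gateway Interface convention, a script
writes a header block, then a blank line, then the document body to its
standard output; the sketch below (in Python, purely for illustration, since
a Perl script on the host server would print the same stream) shows the shape
of that output.

```python
# A minimal sketch of the stream a CGI script must produce: headers,
# a blank line, then the document body. The body text here is an
# arbitrary example, not part of any real site.

def cgi_response(body):
    # CGI requires at least a Content-Type header before the body.
    header = "Content-Type: text/html"
    return header + "\r\n\r\n" + body

response = cgi_response("<html><body><p>Hello from CGI</p></body></html>")
print(response.splitlines()[0])  # Content-Type: text/html
```

The host server parses the header block itself and relays the body to the
visiting user agent, which is why a missing blank line is a classic CGI bug.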
PHP: a server side scripting language commonly used to write programmes for
Common Gateway Interface. PHP commands are embedded into the source code of
HTML documents. PHP commands are executed in the Web host server to generate
dynamic HTML pages. More information at:
PHP
Pilot: a programming language used from the 1960's to the 1980's.
PL1, Programming Language One: a language created by IBM in 1964 for big
computers. Based on Algol, PL1 was one of the first languages concerned about
modular structures. It is used for scientific or commercial purposes, and for
programmes used in Computer Aided Design.
Programmer: a person who writes in language of medium or high level, as
opposed to coder: a person who writes only in low level code (machine code,
in numbering base of two). Since the 1950's, most programming has been done
in medium level (assembly) or in high level. Low level has almost become
limited to personal experiments or to learning exercises.
Prolog, Programming in Logic: a declarative language (non-procedural) created
by Alain Colmerauer (University of Marseille, France). It is used for solving
logical relations, mainly applied to Artificial Intelligence and expert
systems, similar to Lisp. Prolog has some dialects, intended for small or for
big computers. A programme in Prolog typically has about a tenth of the lines
of an equivalent programme in Pascal.
Python: a server side scripting language commonly used to write programmes
for Common Gateway Interface. It is an interpreted, object oriented
programming language. Python is legally protected under copyright, but the
source code is freely available and open for modification and reuse.
RPG, Report Programme Generator: a programming language created by IBM in
1964, used mainly to write reports. Originally intended for batch processing,
it was the first non-symbolic language of high level, targeted to problems.
Most programmers who worked with this language did not appreciate it, in
spite of the improvements made in the dialect called RPG III.
SSI, Server-Side Includes: a server side scripting language. SSI scripting
commands are embedded within the code of a Web document and are parsed and
executed in the Web host server to generate dynamic HTML pages. Common uses
of SSI are to include records (like a header or footer record) that are used
in many pages of the document, or also to show the current date and time.
Short Order Code: the first scientific programming language, created by John
Mauchly of Univac in 1949.
Simscript: a simulation language. See the Simula language for a short
description on the concept of simulation languages.
Simula: a simulation language. Simulation languages are used to emulate a
process by modelling its behaviour in a natural or realistic situation,
trying to reproduce it by passing the modelled process many times in a
computer, introducing at each pass random values for certain parameters
(inside pre-defined limits), and collecting statistics on the observed
results. Computer simulation can be programmed with almost any language,
though those languages that are specific for simulation can make the
programming work easier or more efficient. On the other hand, simulation
languages tend to use huge translators to compile the source programme, and
for this reason they are normally used only with main frame computers or with
super-computers.
Examples of some natural or real processes that these languages try to model
are weather (Meteorology), interaction of living organisms (Ecology),
battles (or even entire wars), and their synergetic combinations. As a
particularly dramatic example of the latter, predictions on the consequences
of a nuclear war (the so called Nuclear Winter) have been refined in detail
since the first Conference on the Global Consequences of a Nuclear War
(Washington 1983). There is always, however, room for speculation on the
reliability or accuracy of any simulation made by computer, because even tiny
changes are cumulative, and after many passes those changes take their toll
and give a totally different result (the so called Butterfly Effect). In
spite of this fact, simulation is a very important field of research in
Mathematics and in Computing Science.
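The method described above can be sketched in a few lines. The modelled
"process" here is deliberately trivial and purely an assumption for
illustration: each pass draws a random point inside pre-defined limits (the
unit square), and the collected statistic estimates the area under a curve.

```python
import random

# A minimal sketch of the simulation method: run a modelled process
# many times, drawing random values at each pass, and collect
# statistics on the observed results.

def one_pass(generator):
    # One pass of the model: draw a random point in the unit square
    # and report whether it fell under the curve y = x*x.
    x = generator.random()
    y = generator.random()
    return y <= x * x

def simulate(passes, seed=1):
    # A fixed seed makes the collected statistics repeatable.
    generator = random.Random(seed)
    hits = sum(one_pass(generator) for _ in range(passes))
    return hits / passes

estimate = simulate(100_000)
print(f"estimated area: {estimate:.3f}")  # close to the exact value 1/3
```

The Butterfly Effect mentioned above is visible even here: changing the seed
changes every pass, and only the aggregate statistic remains stable.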
Slam: a simulation language. See the Simula language for a short description
on the concept of simulation languages.
Smalltalk: language and operating system used to simulate programming by
natural language. It was created in the late 1970's by the Learning Research
Group of Xerox Parc in Palo Alto, and released in 1981 as part of the Xerox
Star Information System.
Snobol: a programming language used from the 1960's to the 1980's.
Speed Coding: the second scientific programming language, created by Seldon
and John Backus of International Business Machines in 1953.
SGML, Standard Generalised Mark-up Language: the basic standard from which
most mark-up languages are derived.
TCL: a server side scripting language commonly used to write programmes for
Common Gateway Interface.
Terse: a programming language created by Jim Neil in the 1980's, operable
since 1996. Terse gives all of the control that is available in assembly
language, with the often easier use that is found in high level languages.
Terse is available at:
Terse
Turing: named in honour of Alan Mathison Turing, a pioneer of Computing.
Created in 1983 by R. Holt, J. Cordy and J. Hume (University of Toronto), it
claims to be "easier than Basic, yet more elegant than Pascal or PL1" (see
also the Euphoria language).
Visual Basic: an object oriented programming language developed by Microsoft
Corporation about 1991, for the last versions of MS-DOS and for Windows. It
is translated by compiler. ASP supports Visual Basic by default.
Language List by Bill Kinnersley. It has over 2500 computer languages, with many
references to the available source code. Implementations of programming languages
that time forgot, such as ALGOL-60, FOCAL, FOOGOL, INTERCAL, JCL, MIXAL, OISC,
PILOT, TRAC, Little Smalltalk or Orthogonal.
Language List by Bill Kinnersley
Historical On-line encyclopaedia of Programming Languages
Internet data sets and protocols
Transmitting information worldwide
The tenderfoot asked:
"I want to download the Internet. Do I need a bigger hard disk ?"
(joke taken from the HTML tutorial offered by the World Wide Web Consortium)
There are normally three kinds of data sets that can be transmitted by the
Internet:
Executable code, a data set in numbering base of two, intended to be executed
by a computer as a programme.
Extended Binary Coded Decimal Interchange Code (EBCDIC), a standard of
characters for big computers created by International Business Machines in
1964. It introduced bytes of 8 bits, which replaced the 6-bit bytes used
with BCD or BCI character sets.
American Standard Code for Information Interchange (ASCII), another standard
of characters, which can be read by humans or can also be executed as a
programme, through another programme that will act as a translator or as an
interpreter (for example, a user agent for Hyper Text Mark-up Language).
There is a 7-bit 'Standard' ASCII (characters from 0 to 127), which is
completely homogeneous worldwide, and there are two main variants of 8-bit
'Extended' ASCII (characters from 128 to 255): IBM ASCII (used for example by
DOS systems), and ANSI ASCII (used for example by Microsoft Windows systems).
Years later, Unicode addressed the problem of a truly worldwide standard of
characters. The Unicode standard is wholly incorporated into HTML for the
World Wide Web.
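The difference between the homogeneous 7-bit range and the divergent 8-bit
extensions can be demonstrated directly. The code pages used below (437 as
the usual concrete form of "IBM ASCII" on DOS, 1252 as the "ANSI" set of
Microsoft Windows) stand for the two variants mentioned above.

```python
# 7-bit ASCII: identical in every encoding.
assert "A".encode("ascii") == b"\x41"

# The same accented character in two 8-bit code pages: code page 437
# (DOS) and code page 1252 (Windows) assign it different numbers.
assert "é".encode("cp437") == b"\x82"
assert "é".encode("cp1252") == b"\xe9"

# UTF-8 keeps the 7-bit range unchanged and uses multi-byte sequences
# for everything else, so one encoding serves all languages.
assert "A".encode("utf-8") == b"\x41"
assert "é".encode("utf-8") == b"\xc3\xa9"

print("all encodings behaved as described")
```

This is exactly why a text written as "Extended ASCII" on one system could
appear corrupted on another, and why Unicode resolved the problem.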
The Internet, however, is not the only network of computers that exists or
has existed. This is a non-exhaustive list of other computer networks that
existed at some time or that still exist today:
America On Line
Some non-Internet protocols are:
After the explosive growth of the Internet in the late 1980's and in the
1990's, a number of those networks disappeared as separate entities. Most of
the surviving ones are now connected to the Internet, although a few may
still exist working as little more than intranets, serving a limited
geographic area. There are also a number of different protocols inside the
Internet itself. Most important is the Internet Protocol, working together
with the Transmission Control Protocol or else with the User Datagramme Protocol,
which are necessary for negotiation of transmissions between computers or
through them. The Internet is not a centralised network as most of the
networks listed above, the Internet is a re-routable network, and this is a
key point for not considering those networks a part of 'Internet' as the term
is commonly used, although strictly speaking they may also use an Internet
Protocol. Uniform Resource Locators for different protocols are explained in
Request For Comments 1738.
Chargen
CSO: The Computing Services Organisation name server interface, also known as
PH or QI, which provides names and telephone numbers of university people.
CSO can also be accessed by the Gopher Protocol, listed below.
Finger: command for querying information about the users of a remote computer.
Finger was also usable as a rough directory when the computers connected to
the Internet numbered only a few thousands (before the mid 1980's), but that
use became impractical when their number grew to millions. However, Finger
can still be seen in Unics Work Stations.
File Transfer Protocol: FTP is an Internet standard for transferring data
sets over the Internet. FTP programmes and utilities are used to upload or
download Web pages, graphics, or other data sets, from a hard disk or another
storage volume to a remote server that allows FTP access. Two commonly used
free FTP programmes are WS FTP and Cute FTP.
Gopher: protocol developed at the University of Minnesota in 1991 (a gopher
is the mascot of that university). Gopher is mainly used for transmission of
text, although some Gopher clients present graphic interface. Gopher went
into a slow continuous decline after the spread of the HTTP-HTML 'World Wide
Web' of Mister Berners-Lee. In terms of traffic (packet exchange) the Web
surpassed FTP, Gopher and most other Internet protocols about 1995, but as
of 2016 there are still about a hundred active Gopher servers. Gopher presents
a menu list from which a text document or another resource can be chosen.
The CSS Dixieland page on the Gopher Protocol can be visited by activating
the following internal link:
HTTP, HTTPS: Hyper Text Transfer Protocol and its secure transmission
Mailto: A transmission of electronic post
NNTP, News, SNews: News Groups Transfer Protocol and its secure transmission
Newspost, Newsreply, SNewspost, SNewsreply: Other protocols for News Groups
Telnet, Tn3270, Rlogin: Protocols for operation of a remote computer
WAIS: Wide Area Information System
Client-Host (or Client-Server): not a protocol, but a method for indirect
Internet connection. A client computer remotely connects to a host (server)
computer. The host is directly in the Internet, and it executes a programme
for communication with the client, which then executes another copy of that
programme.
Dumb Terminal: not a protocol, but a method for direct Internet connection,
the only method that does not require a modulator-demodulator for the
transmission of signals. The dumb terminal is not connected by telephone
lines, it is connected by wires to a near mainframe computer. The mainframe
computer is directly in the Internet. Here, the expression "mainframe" is
used in its original meaning of "Central Process Unit", as opposed to
peripheric unit. Dumb terminals are often found inside the same building or
in related, physically near buildings, as may be the case in a library or in
a university campus.
Emulation of Terminal: not a protocol, but a method for indirect Internet
connection. A computer or a dumb terminal is connected as a peripheric unit
of a remote mainframe computer. The mainframe computer is directly in the
Internet. Here, the expression "mainframe" is used in its original meaning of
"Central Process Unit", as opposed to peripheric unit. The computer connected
as a peripheric unit will be treated as a dumb terminal, whether being really
a dumb terminal or not.
Hyper links to Internet incunabula and Retrocomputing
The Way Back Machine at the Internet Archive
Documents published in different protocols of the Internet may disappear from
their host servers, or the host servers may themselves disappear. It happens
due to a number of causes, for example to the death of their authors and the
lack of another person wishing to take the responsibility of maintaining the
document, or to the demise of the organisation that kept the host server.
The Way Back Machine located at the Internet Archive is a group of really
huge computers with incredible capacity of storage. Their purpose is to make
a copy of documents considered valuable, and to store that copy for the
knowledge and enjoyment of future generations. Some old versions of this
document of CSS Dixieland are stored there, together with other Internet
incunabula.
The Way Back Machine at the Internet Archive
The Retrocomputing Museum
And for ending this page of CSS Dixieland entirely devoted to the History of
Computing, nothing better than a hyper link to the Retrocomputing Museum.
Some musea with collections of items related to computers exist in the World,
even in countries where little computer development has ever taken place. A
few musea are specifically dedicated to hardware and software. Such places
deserve respect and support, as repositories of valuable pieces of Technical
History. To the ignorant they seem like 'an attic full of dusty old things',
but to the expert they are cherished treasures of what was, once upon a time,
state-of-the-art in Computing Science. What exists now is descended from what
existed before. Readers who perceive the importance of knowing the past for
understanding the present will certainly enjoy the time travel at the museum.
The Retrocomputing Museum
Robot or human visitors to CSS Dixieland are recorded in raw access log. This
is a passive register purely for statistical purposes, no cookies are stored
in the client computer.
Go to top of this page
Go to page with index, history, exchange policy, contact CSS Dixieland:
Start
Hosted by Neocities:
https://konqueror.org/
The fascinating history of how thinking machines
have come to be the powerful brains that they are today
Walkyrie who takes our dead heroes to Walhalla in Asgard.
Wagner Frost Illustration
Microprocessors
Programming languages
Internet data sets and protocols
Hyper links to Internet incunabula and Retrocomputing
Machines operated by wooden tables or cylinders, cards or paper tapes, lined
with perforations or with raised dots, have been used from the XVIII century
in textile industries, musical instruments (like the mechanical piano), toys and
other applications.
Since 1971, the oldest collection of Internet books (over fifty thousand in 2016)
http://www.gutenberg.org/
All of the books from the Gutenberg collection, converted to mobile formats
http://manybooks.net/
-An incomplete transmission of packets is re-transmitted a number of times,
defined by a "time-out" limit.
-Black boxes (later called gateways and routers) connect local area networks,
but do not keep data on passing packets.
-There is no global control at the operational level.
-Gateway functions, like Internet Protocol heading, directed interfaces,
breaking of packets into smaller fragments to be assembled later, and others.
-Point to point checking, for recovery of fragments and for detection
of duplications.
-Global addressing system.
-Control of flow from host to host.
-Interface with a variety of operating systems.
-Efficiency in implementation and in performance among local area networks.
-The position of any octet in the chain is used to identify the octet.
-Control of flow is done using windows and acks.
-Destination can choose when recognition should be done, each ack returned is
cumulative for all packets received.
-The parametres for windows to be used between origin and destination remain
yet undefined.
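The cumulative acknowledgement rule in the list above can be sketched as
follows. This is an illustration of the principle only, not a real TCP
implementation: octets are identified by their position in the chain, and
the returned ack names the first position not yet received, implicitly
acknowledging everything before it.

```python
def cumulative_ack(received_segments):
    # received_segments: set of (start, length) pieces that arrived,
    # possibly out of order, octet positions counted from 0.
    covered = set()
    for start, length in received_segments:
        covered.update(range(start, start + length))
    # The ack is the first octet position not yet received.
    position = 0
    while position in covered:
        position += 1
    return position

# Segments covering octets 0-3 and 8-11 arrived; 4-7 was lost:
assert cumulative_ack({(0, 4), (8, 4)}) == 4   # still waiting for octet 4
# After the missing segment is retransmitted and arrives:
assert cumulative_ack({(0, 4), (4, 4), (8, 4)}) == 12
print("cumulative ack behaves as described")
```

Because each ack is cumulative, a lost ack costs nothing as long as a later
one arrives, which is part of what makes the scheme robust on a re-routable
network.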
First official specification of HTML, June 1993, now called HTML 2.0
http://www.w3.org/MarkUp/html-spec/html-spec_toc.html
Text-only user agent for many protocols, platforms and systems
https://lynx.browser.org/
Text-only user agent for many protocols, platforms and systems
https://lynx.invisible-island.net/
Intel Corporation produced integrated circuits of initially 1 Kilobit, such
as the Intel 1103, which substituted magnetic core memories. Other chip
producing companies were Fairchild, Texas Instruments (these two had begun
earlier than Intel), Motorola, MOS Technology, Zilog, Signetics, Mostek,
National, Hewlett-Packard, AMD, Cyrix and Nexgen, all of them North American.
There is below a non exhaustive list of some of their integrated circuits,
ordered by year, where it can be seen how they grew in efficiency and in
complexity. The order of characteristics that are listed for each entry is:
company - model - year, memory size in bits or in bytes. If in bytes, it is
also given the quantity of bits in each byte. The word "byte" is not assumed
to represent an "octet" of eight bits, but a binal term of any number of
binal digits. Occasionally, processing speed (frequency of the piezo-electric
crystal) is also indicated.
Used in common calculators, the Intel 4004 was composed of 2 250 elements
that if connected to other four integrated circuits gave a microcomputer of
power comparable to the big computers of the mid 1950's, capable of adding
two numbers of 4 bits in 11 microseconds. It was the first microprocessor and
the basis of many others, most of which integrated MOS transistors. Intel
4004 inaugurated a kind of computers called "of fourth generation", which
predominated from the 1970's to the early 2000's.
The 8086 was the original processor of the Intel x86 family, with the 8087
as its floating decimal point mathematic co-processor. The 8086 was Intel's
first 16-bit microprocessor.
After the development of the 8086, Intel created the lower-cost 8088, which
was similar to the 8086 but with an 8-bit data bus instead of a 16-bit bus.
The 80186 was the second Intel chip in the family, the 80187 was its floating
decimal point mathematic co-processor. Except for the addition of some new
instructions, the optimisation of some old ones, and an increase in clock
speed, this processor was almost identical to the 8086.
The 80286 was the third model in the family, the 80287 was its floating
decimal point mathematic co-processor. The 80286 introduced the Protected
Mode of operation, as opposed to the Real Mode that the earlier models used.
All x86 chips can be made to run in real mode or in protected mode.
The 80386 was the fourth model in the family, and the first microprocessor
made by Intel with a byte of 32 bits (a byte is not necessarily an octet).
The 80386 DX was the original 80386 chip, and the 80386 SX was an economic
model that used the same instruction set but that only had a 16-bit bus. As
of 2016, the 80386 EX is still used in a few embedded systems.
The 80486 was the fifth model in the family. It had an integrated floating
decimal point mathematic co-processor for the first time in x86 history.
Early 80486 DX chips that were found to have defective FPU's were physically
modified to disconnect the FPU portion of the chip, then sold as the 80486 SX
(80486 SX 15, 80486 SX 20 and 80486 SX 25). An 80487 mathematic co-processor
was available to 80486 SX users, which was essentially an 80486 DX with a
working FPU and an extra pin added. The arrival of the 80486 DX 50 processor
saw the introduction of fanless heat sinks being used to keep the processor
from overheating.
Intel called it Pentium because the company could not trademark a code number
80586. The original Pentium was faster than the 80486, with some other
enhancements. Later models also integrated the MMX instruction set.
Pentium Pro was a sixth-generation architecture microprocessor, initially
intended to replace the original Pentium in a full range of applications, but
later reduced to a more narrow rôle as server and high-end desktop chip.
The Pentium II was based on a modified version of the P6 core first used for
the Pentium Pro, but with an improved 16-bit performance and the addition of
the MMX SIMD instruction set that had already been introduced in Pentium MMX.
The Celeron chip is really a big number of different chip designs that depend
on price. The Celeron chips are the economy line of chips, often cheaper than
the Pentium chips, even if the Celeron model in question be based on a
Pentium architecture.
The Xeon processors are modern Intel processors made for servers, which have
a much bigger cache than the Pentium microprocessors. The cache of Xeon is
measured in Megabytes, in comparison to other chips whose cache is measured
in Kilobytes.
Initial versions of the Pentium III were very similar to the earlier Pentium
II, the most notable difference being the addition of SSE instructions.
The Pentium IV had a new seventh-generation "Net Burst" architecture. With up
to 3.8 GigaHertz of clock speed, it was about 2005 the fastest x86
microprocessor in the market. Pentium IV introduced the notion of Hyper
Threading, and its later Pentium D derivative that of Poly-Core chips. A
Pentium IV has about 42 000 000 elements.
The architecture of the Core processors was actually an even more advanced
version of the sixth-generation architecture, dating back to the 1995 Pentium
Pro. The limitations of the Net Burst architecture, especially in mobile
applications, were too serious to justify the creation of more Net Burst
processors. The Core processors were designed to operate more efficiently
with a lower clock speed. All Core branded processors had two processing
cores, the Core Solos had one core disabled, while the Core Duos used both
processors.
An upgraded, 64-bit version of the Core architecture. All desktop versions
are poly-core.
Athlon is the brand name applied to a series of different x86 processors
designed and built by AMD. The original Athlon Classic was the first of a
seventh-generation of x86 processors. For a significant time, it retained the
initial performance advantage that it had over Intel's competing processors.
Turion 64 is the brand name that AMD applies to its 64-bit low-power (mobile)
processors. Turion 64 processors (but not Turion 64 X2 processors) are
compatible with AMD's Socket 754 and are equipped with 512 or 1024 Kb of L2
cache, a 64-bit single channel on-die memory controller, and an 800 MegaHertz
Hyper Transport bus.
The AMD Duron was an x86-compatible computer processor built by AMD. It was
released as a low-cost alternative to AMD's own Athlon processor and to the
Pentium III and Celeron processor lines from rival Intel.
Sempron was about 2005 AMD's entry-level desktop CPU, replacing the Duron
processor and competing against Intel's Celeron D processor.
The AMD Opteron was the first eighth-generation x86 processor (K8 core), and
the first of AMD-64 processors (x86-64). It was intended to compete in the
server market, particularly in the same segment as the Intel Xeon processor.
Resources for Ada, Universal Programming Language
http://www.adapower.com/
Although programmers tend currently to use high level languages such as C
language, C++, or Pascal, the code that is closest to the hardware is called
machine code, or low level code. Not one second passes during a powered
working session without the computer executing machine code.
To word this simply, we may say that assembly language is a human-readable
text language, and machine code is a machine-readable binary code. When you
programme in assembly language you are programming at the machine code level.
To write instructions directly in machine code is tedious, that is why you
use assembly language instead, and an auxiliary programme known as assembler
to produce the final machine code.
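The relation between assembly text and machine code can be sketched with a
table-driven toy assembler. The four opcodes in the table are real one-byte
8086 instructions; everything else a real assembler handles (operands,
labels, addressing modes) is deliberately left out of this illustration.

```python
# A tiny sketch of what an assembler does: translate mnemonic text
# into the numeric machine code that the processor executes.

OPCODES = {
    "NOP": 0x90,   # no operation
    "RET": 0xC3,   # near return from procedure
    "HLT": 0xF4,   # halt the processor
    "CLC": 0xF8,   # clear the carry flag
}

def assemble(lines):
    # One mnemonic per line of source text; output is raw machine code.
    machine_code = bytearray()
    for line in lines:
        mnemonic = line.strip().upper()
        if mnemonic:  # skip blank lines
            machine_code.append(OPCODES[mnemonic])
    return bytes(machine_code)

code = assemble(["nop", "clc", "ret"])
print(code.hex())  # 90f8c3
```

Writing those three bytes by hand is exactly the tedium that the paragraph
above describes, and the lookup table is the core of what the auxiliary
assembler programme automates.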
I personally think that, except as a learning exercise, it is a waste of time
to write something in assembly that could be written acceptably fast in a
high level language. Assembly language fits the following situations:
First, low level control: when you need to change the flags, or also the
control registers of the processor, as when entering protected mode.
Second, speed: instructions written in machine code execute fast. They can
execute ten to a hundredfold the speed of Basic, and about twofold that of C
language or of Pascal. Therefore, critical sections of programmes written in
languages of higher level can be re-written in assembly instead, in order to
speed them up.
Third, small programme size: when you write for example a "TSR" programme
(Terminate and Stay Resident in Random Access Memory), then assembly is very
useful. Writing interrupt handlers is where assembly language really shines.
MASM (Microsoft Assembler), TASM (Borland Turbo Assembler), Paradigm (Dev
Tools Assembler)
A86 Assembler-D86 Debugger, A386 Assembler-D386 Debugger, ASM (Arrowsoft
Assembler), CHASM, Cross Fire, FASM (Flat Assembler, also produces plain
machine code), GAS (GNU Assembler), GASM Gage, GASM Owen, GEMA, GOASM, IASM,
JAS, Magic, NASM (Netwide Assembler, also produces plain machine code), NBASM,
Visual, WASM (Watcom Assembler), WASM (Wolfware Assembler).
Original Dartmouth Basic programming language
http://en.wikipedia.org/wiki/Dartmouth_BASIC
PRINT; prints at next space
INPUT introduces value or string of characters
MAT manipulates matrix (a powerful new feature)
RESTORE restores value stored in DATA or read by READ
ALGOL starts programming in Algol
BASIC starts programming in Basic
CATALOG displays the names of programmes in permanent storage
FORTRAN starts programming in Fortran
HELLO logs into Dartmouth Time Sharing System
LIST displays the current programme
NEW names and begins writing a programme
OLD retrieves from permanent storage a previously named programme
RENAME changes the name of the current programme without erasing it
RUN executes the current programme
SAVE saves into permanent storage the current programme
SCRATCH erases the current programme without clearing its name
STOP interrupts the currently running programme
UNSAVE clears from permanent storage the current programme
DATA lists the variables and values to be read by READ: 10 DATA 10,20,30
DEF defines single line function
DIM defines size of array, dimension of matrices and vectors
END finishes the entire programme
FOR-TO-STEP repeats loop a number of times and steps: 20 FOR I=1 TO 7 STEP 2
GOSUB goes to a repetitive sub-routine
GOTO goes inconditionally to statement in the specified line: 30 GOTO 500
IF-THEN conditional decision following expression: 40 IF B
NEXT ends loop, continuing programme from next line after FOR-TO: 60 NEXT I
PRINT shows in output device (printer or screen): 70 PRINT A,"VALUE OF ",B+C
READ reads the value of one or more variables stored in DATA: 80 READ B,C,D
REM remarks non-executable comments
RETURN returns from a repetitive sub-routine to the main routine (sequence)
STOP stops programme before textual end (equivalent to GOTO line having END)
ABS absolute number, without regard to positive or negative sign
ATN arctangent of an angle, in radians
COS cosine of an angle, in radians
EXP exponent of power
INT integer
LOG natural logarithm
RND number at random
SIN sine of an angle, in radians
SQR square root (rooted to index of two)
. decimal fraction
+ plus, addition
- minus, subtraction (binal operator, arithmetic)
* by, multiplication
/ among, division
^ raised to power of, exponentiation (3^4=81)
- negation (monal operator, logic)
= equal to
< > different from
< lesser than
> greater than
< = lesser than or equal to
> = greater than or equal to
( start grouping operations
) finish grouping operations
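The operators above map directly onto modern languages. The sketch below
checks each Basic expression (quoted in the comments) against its Python
equivalent, purely as an illustration of the correspondence.

```python
# Each assertion pairs a Dartmouth Basic expression with its modern
# Python equivalent.

assert 3 ** 4 == 81        # Basic: 3^4      (exponentiation)
assert 7 / 2 == 3.5        # Basic: 7/2      (division)
assert -(5) == -5          # Basic: -5       (monal negation)
assert (2 + 3) * 4 == 20   # Basic: (2+3)*4  (grouping)
assert 5 != 6              # Basic: 5 < > 6  (different from)
assert 5 <= 5              # Basic: 5 < = 5  (lesser than or equal to)
print("all Basic operator examples check out")
```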
1964-1979 Dartmouth Basic 1 to 7, by John Kemeny and Thomas Kurtz
1976 Modular Basic, by John Kemeny and Thomas Kurtz
1978 ANSI Basic, by the American National Standards Institute
1985 True Basic, by John Kemeny and Thomas Kurtz
1981-1991 Microsoft-IBM BASICA (64 Kilobytes) for PC-DOS operating system
1981-1991 Microsoft GW BASIC (64 Kilobytes) for MS-DOS operating system
1985-2001 Microsoft QuickBasic (64 Kilobytes, translated) for MS-DOS 5 or 6
1985-2001 Microsoft QBasic (160 Kilobytes, interpreted, based on QuickBasic
4.5) for MS-DOS 5 or 6
1989-2001 Microsoft QBX PDS (translated) for MS-DOS 5 or 6
1991 Microsoft Visual Basic (translated, object oriented) for MS-DOS 5 or 6,
or for Windows
M Basic
Applesoft Basic
Disk Basic
Prom Basic
Pet Basic
North Star Basic
Maxi Basic
SWTC Basic
TSC Basic
TDL Basic
Tiny Basic
Xitan Basic
Micropolis Basic
Polymorphic Basic
Basic E
C Basic
Cromemco Basic
Benton Harbor Basic
Imsai Basic
PT Basic
Helios Basic
Compucolor Basic
Software Dynamics Basic
HP Basic
Business Basic
Dec Basic
Wang Basic
Honeywell Basic
Tektronix Basic
GE Mark II Basic
RCA Spectra Basic
Univac Basic
CDC Basic
Xerox Sigma Basic
Translator from Basic to C++, almost compatible with QuickBasic 4.5
QB64 includes IDE, adds many new commands, and automatically sends C++ to GCC
https://qb64.com/
Forum where QB64 enthusiasts can exchange information
https://qb64.boards.net/
Interpreted programming language
http://www.rapideuphoria.com/
Hyper Text Mark-up Language
http://en.wikipedia.org/wiki/HTML
Governing body for HTTP, HTML, CSS, PNG and other official specifications
http://www.w3.org/
Official specification of HTML 4.01, December 1999
http://www.w3.org/TR/html401/
Translated programming language, with compiler to produce executables
http://www.rowan.sensation.net.au/moonrock.html
Open source relational data base management system based on SQL
http://www.mysql.com/
Server side scripting language for CGI, embedded into HTML
http://www.php.net/
Programming language combining assembly control with high level use
http://www.terse.com/
Over 2500 computer languages and their available source codes
https://gilles-hunault.leria-info.univ-angers.fr/hilapr/langlist/langlist.htm
Huge list with almost 9000 computer languages
http://hopl.info/
Applelink
Bitnet
Compuserve
CSnet
Delphi
Eunet
Fidonet (Bulletin Board System)
Genie
Janet
MCI Mail
NSFnet
Prodigy
RBTnet (Bulletin Board System)
Unix to Unix Copy Protocol (Bulletin Board System)
Xmodem
Ymodem
Zmodem
Collection of millions of old documents from the Internet
http://www.archive.org/
Vintage hardware and software for lovers and connoisseurs of computers
https://www.catb.org/~esr/retro/
Free Web hosting service inspired by the old GeoCities
https://www.neocities.org/