Talk:History of computing hardware/Archive 1

From Wikipedia, the free encyclopedia

2001 talk

I doubt very much now that Harvard Mark I was fully programmable. Later models could switch conditionally from one paper roll to another, but since I don't believe it could rewind the paper rolls, no actual loops are possible and it's not Turing complete. But I'm not sure. Anybody know details about the Mark I?

Oh, and I just read that the Manchester Mark I was actually the first functional von Neumann machine, even before EDVAC -- but of course based on EDVAC's ideas. --AxelBoldt


I believe you are correct about the IBM-Harvard Mark I. This machine, by the way, was not built at or by Harvard. It was built for Harvard (and the U.S. Navy) by IBM.

My understanding is that the first operational stored program computer was the "Manchester Baby Mark I", a test machine for the Williams-tube storage technology, not the Manchester Mark I itself. The EDSAC at Cambridge appears to have preceded the Manchester Mark I as the first "practical" stored program computer in operation.


Axel: An electromechanical computer necessarily uses some electronics. Thus "electro". But I understand what you mean about the electronics/electromechanical distinction.

Aiken directed the construction of the ASCC by IBM engineers at the IBM Endicott labs. Construction was completed in 1943. It was moved to Harvard, and operation began May 1944. [1]

As stated, the EDVAC was never completed--so all EDVAC-based computers were "before EDVAC". The "Baby" was the first EDVAC-based design that got a program running. --The Cunctator

I gotta say, the Wiki method really works--this entry has gotten amazingly better in a very short period of time. It's still a little too discursive (some of the specificity would be better in stand-alone entries), but it's highly informative and readable. --The Cunctator


Not to disagree, but there's still a whole lot missing. No mention of Whirlwind, SAGE, PLATO, to give just a few examples.

Is that a disagreement or not? SAGE is mentioned in the history of networking... --The Cunctator


It seems like the end of the article is the original timeline table visible at the top of this page, and reads very much like a timeline. Wouldn't more of an overview and synthesis be appropriate, considering we have the (very good IMHO) other timeline?


I agree, particularly the latter part of the article has too many dates, names and details obscuring the general flow of progress. --AxelBoldt

I don't want to be argumentative, but I thought the new article didn't tell much of anything before WWII or after 1970, let alone flow of progress. The flight control system of the F14, while interesting, was hardly a landmark computer.

Yes, there was a fair amount where I just went in and pasted missing stuff from the old page. However, I feel it is more important to have date-filled placeholders than nothing at all. Now that some base data is there, anyone can go in and rewrite/rearrange it. By all means, feel free to edit as you see appropriate. The power of Wiki :-) --Alan Millar


Names, dates, and details are good things; but need to be pushed down into more detailed articles on more specific topics. At the same time, an overview/summary/synthesis needs to be presented at this level. But my guess is it's easier to do this bottom-up rather than top-down. In other words, collect all the detailed information first, then refactor into appropriate levels of detail.

Also, should this article cover software as well as hardware? -HWR

Of course, hardware w/o software is scrap metal. The question is whether it's tangible enough to produce records. --Yooden


Anyone can refactor (a basic design feature of Wiki), but only if there is some information to refactor, so I think the bottom-up approach is necessary.

But all the information is already on the Computing timeline page, so why repeat it here? I think this article should have a bird's eye view on Computing history, just outlining the developments, and not listing anecdotes such as ads bought by certain companies at certain sports games. --AxelBoldt


As to Swiss clocks: the essence of computing is not the addition and subtraction of numbers, although it grew out of it and is a necessary part of it. The essence of computing is the execution of a sequence of instructions, and in that respect modern computers have as much in common with Swiss clocks as the abacus. And no, I'm not recommending removing the reference to the abacus :-) --Alan Millar

Swiss clocks neither process information nor can be programmed. They are just fancy mechanical devices, like all mechanical clocks. I don't see any relation to the history of computing except maybe that some early mechanical calculators used similar mechanisms as mechanical clocks (why Swiss?). Also, why are they mentioned in the paragraph about programmability? --AxelBoldt

What about music boxes? They're programmed to play tunes. -HWR

They have a single sequence, as do player pianos, and player pianos can even use a different paper roll to play a different tune. In that respect, the music box mechanically is a predecessor to the Jacquard loom. The Swiss clocks had multiple sequences of actions, where a main cog would activate other cogs to order different actions. The first GOSUB? :-) --Alan Millar

Actually, there are music boxes that play tunes from interchangeable discs. I don't know the chronology of this however.

BTW, is this article restricted to the history of DIGITAL computers? Analog computers don't generally execute sequences of instructions. -HWR


"IBM decided to enter the PC market ..., with the IBM XT" is not correct -- the XT was their second machine, with the hard drive.

That's correct--I'll change it. The first one was simply called the "IBM PC". Some mention of Compaq and the beginnings of the clone market in that era seems appropriate too. --LDC


I'm afraid this entry is getting too timeline-y...but I see that others are aware of that. Looks like we need to start thinking about some more subentries...anyone have any suggestions? --The Cunctator


Unfortunately the timeline here has many inaccuracies and omissions of historical importance: 1965: IBM System 360 (first OS); 1968 first mouse/window system demo; 1973: CPM first micro OS; 1969 Intel 4004; 1977 Commodore Pet & TRS 80; 1978 Atari 400/800; 1979 Motorola 68000 32 bit CPU (w. 16 bit data and 24 bit address bus); 1981 Commodore Vic20 & IBM PC & Xerox Star (w. GUI/Mouse/Ethernet...); 1982 Commodore 64 with 64k RAM $600 & Timex Sinclair 2K RAM $99; 1983 1 million Commodore Vic20s and 1 million Apple IIs sold; 1985 Commodore Amiga with multitasking/Color GUI/accelerated video/stereo sound/3.5" floppy $1200; 1988 7 million Commodore 64 and 128 computers sold.... --Jonathan--

Feel free to enter whatever you think is missing to Computing timeline, not to History of computing. --AxelBoldt


Ack! It's getting insanely more timeliney! I'm thinking of paring. Please, everyone, notice Computing timeline. History of computing isn't supposed to list every computer, but discuss the intellectual development of the engineering/science of computing. --The Cunctator


Moved from /Permission-subpage:

I have obscured the email addresses in the message below in an obvious way. --AxelBoldt

Received: from mail11.svr.pol.co.uk
	by mail.metrostate.edu; Tue, 21 Aug 2001 19:25:28 -0500
Received: from modem-88.bass.dialup.pol.co.uk ([217.134.8.88] helo=arthur.the-roost)
	by mail11.svr.pol.co.uk with esmtp (Exim 3.13 #0)
	id 15ZLpr-0001gy-00
	for [email protected]; Wed, 22 Aug 2001 01:25:32 +0100
Received: from benji.the-roost
	([10.0.0.5] helo=localhost ident=mail)
	by arthur.the-roost with esmtp (Exim 2.12 #1)
	id 15ZLpq-0003Te-00
	for [email protected]; Wed, 22 Aug 2001 01:25:30 +0100
Received: from stephen by localhost with local (Exim 3.12 #1)
	id 15ZLpp-0000vx-00
	for [email protected]; Wed, 22 Aug 2001 01:25:29 +0100
Date: Wed, 22 Aug 2001 01:25:29 +0100
From: Stephen White <[email protected]>
To: Axel Boldt <[email protected]>
Subject: Re: Computing history timeline for GNU encyclopedia
Message-ID: <[email protected]>
References: <[email protected]>
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Disposition: inline
User-Agent: Mutt/1.2.5i
In-Reply-To: <[email protected]>; from [email protected] on Mon, Aug 20, 2001 at 03:25:19PM -0500
Sender:  <[email protected]>

---- Original Message ----
> From Axel Boldt <[email protected]>
> Date: Monday, 20 Aug 2001, 21:25
>
> I noticed that you have the definite computing history timeline on
> your web site. Maybe you have heard about the GNU style
> encyclopedia at http://wikipedia.com ; we currently have only a weak
> entry about computing history (in fact some of it seems to be
> illegally copied from your site). Would you consider donating your
> timeline to the Wikipedia? You can enter and edit the article about
> computing history yourself, just go to
> http://wikipedia.com/wiki/History_of_computers and click on "edit this
> page right now".

Ok.  First I'll give you permission to use whatever you want from my
computing history site in the encyclopedia.  I'd appreciate it if the
link http://www.ox.compsoc.net/~swhite/history.html is retained for
people to get the most up-to-date version of my information, however
since the GPL doesn't allow for such provsios this will remain an
informal "Gentleman's agreement" and is not legally required for the
inclusion of material from my site in your encylopedia or derived works.

On the second front I'm rather busy moving house at the end of the week
and I've been planning a bit of an update to my computing history pages
for a while - so I'm not sure when Ill have time to look closely at your
history of computing entry and possibly update it.  However I'll leave
this email in my pending folder in the hope that I'll have time to do so
in the not-too-distant future.

Good luck with the project,

-- 
Stephen White                    Oxford University Computing Society
System Administrator                  http://ox.compsoc.net/~swhite/
PGP Key ID: 0xC79E5B6A                       <[email protected]>

Fantastic!!! --User:LMS

See also : History of computing

2002 talk

Is there a reason for all the bold entries in the article? They don't seem consistent. I'd like to remove them. Aldie 15:53 Nov 29, 2002 (UTC)

It appears that the original authors were trying to break up the long, long blocks of type by bolding some of the names. Crossheads are the way to do this. Change all the existing === h3 heads to the correct == h2 heads, debold all the names, then go back and add some === h3 heads that say things like TV Typewriter, etc. Ortolan88

2003 talk

Noyce and Kilby were independent inventors of the Integrated Circuit. Intel invented the Microprocessor, of course, but not Noyce. 169.207.117.23 21:48, 25 Nov 2003 (UTC)


This page and the timelines are incorrectly titled. They seem to be about history of technology used in computing rather than history of computing itself. Obviously, most computing until recently was done with pencil and paper, and that is not mentioned in these timelines. Would anyone object to moving this page to history of computing technology and starting a separate page that is about computing, not about machines used in computing? The statement that the "computing era" began only when computing machinery began is idiotic. Michael Hardy 21:48, 30 Nov 2003 (UTC)

Slide rules are not even mentioned on this page. Really, I'm beginning to think people trained in computer science should not be allowed in public places, in the interest of public safety. Michael Hardy 21:52, 30 Nov 2003 (UTC)

The article titled history of computing hardware is fairly long, but no one has attempted to write a history of computing itself on Wikipedia. Such an article would treat algorithms to be executed with pencil and paper, with or without the aid of tables, as well as computing with abaci, slide rules, or machines of any kind. Michael Hardy

I'd like that article (history of computing) to be renamed to "History of computing methods" for clarity. Tempshill 00:37, 4 Dec 2003 (UTC)

Why not move this article to History of computers, rather than the current and cumbersome History of computing hardware? Yes, before circa 1950, "computer" meant a person who did mathematical computation, and so one could argue that "History of computers" could refer to either computers (people) or the computers (machines)...but that would be a fairly trifling objection, I think. --Sewing 21:58, 17 Dec 2003 (UTC)

Hear, hear. A good suggestion for clarity and simplicity. --Wernher 22:05, 17 Dec 2003 (UTC)
Well, I want to change it but am reluctant to act, for 2 reasons: (1) There are a lot of pages that link to this one (which means manually changing them if I want to be a good Wikipedian); and (2) I may be wading into something I will later regret. I'll take a wait-and-see attitude for now... --Sewing 18:17, 18 Dec 2003 (UTC)

Title dispute

Originally listed at VfD

  • History of computers. It is currently a redirect to History of computing hardware. I couldn't move the 2nd article to the 1st, so I removed the redirect text in the 1st article, but I still couldn't do the move. "History of computing hardware" is a cumbersome attempt by a mathematician to distinguish the history of computers from the History of computing (the article's former title), which encompasses not only computers but pen and paper as well. His point is valid, but the new title he chose for the article is unnecessarily awkward. --Sewing 17:14, 21 Dec 2003 (UTC)
    • I am thinking whether History of computation is a better title than History of computing. btw There is a Timeline of computing, too. Optim 17:47, 21 Dec 2003 (UTC)
      • I agree History of computing is not ideal. But isn't History of computation also awkward? Anyhow, it goes back to Michael Hardy's argument that "computing" (and "computation") is not just about computers but about mathematical techniques that precede computers. I think History of computers is the best option: it is simple and unambiguous. --Sewing 18:08, 21 Dec 2003 (UTC)
      • History of computation still seems nice and more correct to me. Optim 19:01, 21 Dec 2003 (UTC)
      • I think the term computation is more often (academically) used for the theoretical side of things (algorithms, complexity, etc.), computers seems better for the practical side to me. --Imran 22:17, 21 Dec 2003 (UTC)
        • That's right. We can have a Computation article for the academic theoretical history and a Computers article for practical-business computing. how do u think? Optim 00:49, 22 Dec 2003 (UTC)
    • Keep, who wouldn't be interested in the history of computers? Lirath Q. Pynnor
    • Move to History of computers. Mathematics is as much a part of the history of computers as it is the history of their hardware. - Mark 06:58, 30 Dec 2003 (UTC)
    • I don't know why this was listed on VfD so long. It isn't really a VfD decision. It's more of a title dispute so I've listed it at Wikipedia:Current disputes over articles instead. Angela. 05:42, Jan 4, 2004 (UTC)
      • Well, it can't have been much of a dispute as no-one's discussed it for over two weeks, so I'm delisting it from the disputes page. Feel free to relist it if there really is a dispute. Angela. 01:41, Jan 22, 2004 (UTC)

Jack Kilby 1957

Even though I changed the date for the IC to 1958 to conform to the Nobel laureate article, I happen to know that Kilby thought of the IC during the mass vacation at TI (which would have been in late 1957). Kilby didn't have the vacation seniority, so he came to work at an empty Texas Instruments facility. The quietness of the work environment allowed Kilby to concentrate his thoughts and invent the IC. 169.207.115.129 01:41, 5 Jan 2004 (UTC) Thus the 1958 date must be the official publication date and not the actual date of conception.

Invention of the abacus

Some sources assert that the abacus was invented in China around 3000 BC; others that it was invented by the Romans or Babylonians around 1000-500 BC and traveled east to China. At present, this Wikipedia article says it was of Chinese invention. It would be nice to come up with an account of the current opinion that was as complete, accurate, and NPOV as possible.


Turing Completion is not a good test for a computer

The article states that Turing completeness is "as good a test as any" for whether a machine is a computer. I fundamentally disagree. It is too easy to build a machine that is theoretically Turing complete. The Z3 has been shown to be theoretically Turing complete, yes. But so what! The Z3 had no conditional branching, and the proof that it was Turing complete relies on mathematical tricks defined in the 1990s. It was never intended to be used as a general-purpose machine. Babbage's Analytical Engine was more flexible than the Z3. Furthermore, if the Z3 was Turing complete, I would lay money on a bet that the ABC was also Turing complete; it was functionally very similar. And what about the Colossi? The Mk II Colossi (of which 9, not 10, were built; the Mk I was later converted to a Mk II) at least had conditional branching. It too must have been "theoretically Turing complete".

It really isn't good enough to shy away from a hard definition by hiding behind the definition of Turing completeness. It has been shown that Conway's Game of Life is Turing complete. It is possible to build a universal Turing machine using only a carefully defined set of tiles and then applying Conway's rules. And what does this prove? It proves that Turing completeness is not a very difficult status to achieve.

Practical as opposed to theoretical Turing completeness is something very different. The first computer that could automatically exploit the fact that it was Turing complete, and could do this in a practical way and solve real problems: that was the first computer. The ENIAC does not count; it was a serial single-purpose machine. Sure, you could rebuild it like so many Lego bricks, but that is hardly a practical general-purpose computer. The Manchester Mk I was the first stored-program machine, but its purpose was to prove that the Williams-Kilburn tube worked effectively as a memory, not to solve real problems. It was a research machine. The EDSAC at Cambridge was the first real computer in the modern sense. It was the first machine that could automatically exploit the fact that it was Turing complete, and it could do this in a practical way, not merely as a party trick or under laboratory conditions. It was the first machine to implement the von Neumann architecture and solve real problems. (The Manchester Mk I and maybe the BINAC preceded the EDSAC, but they never solved a real problem.)

A computer is a tool; it must be practically capable, not just theoretically capable. All the machines before EDSAC were theoretically general purpose but practically special purpose. A computer is a general-purpose device. EDSAC was the first modern computer. (You may now rip me to pieces ;-) John R.Harris
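As an aside on how little machinery suffices for Turing completeness, the Game of Life rules mentioned in this thread fit in a few lines of code. This is a minimal sketch (the function name and cell representation are mine, not from any reference implementation); the point is that these trivially simple rules are nonetheless Turing complete when applied to carefully constructed patterns:

```python
from collections import Counter

def life_step(live):
    """Advance one generation; `live` is a set of (x, y) live cells."""
    # Count the live neighbours of every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 live neighbours; survival on 2 or 3.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A "blinker" (three cells in a row) oscillates with period 2.
blinker = {(0, 0), (1, 0), (2, 0)}
print(life_step(life_step(blinker)) == blinker)  # True
```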

A minor comment: contrary to the commonly held belief, the Colossus computer in fact did not have conditional branching. (Or, indeed, branching of any kind - or a program of any kind, for that matter!) So it definitely was not Turing-complete. See Talk:Colossus computer for more. Noel (talk) 05:05, 1 Mar 2005 (UTC)

The role of weather prediction in the development of computing

I am trying to work in Lewis Fry Richardson's use of differential equations for predicting weather. At the time he wrote his book in 1922, computing was not practical for predicting weather, and yet I believe Atanasoff was trying to solve some meteorological problems when he invented the ABC; thus there has been a meteorological application since the first electronic computer; to this day, the supercomputers are used for predicting weather. Ancheta Wis 23:08, 5 May 2004 (UTC)

See: Navier-Stokes equations for the basic equation of weather prediction Ancheta Wis 18:08, 22 May 2004 (UTC) and also Wikipedia:WikiProject Fluid dynamics. Richardson's approach is listed in Numerical ordinary differential equations. Ancheta Wis 10:12, 25 May 2004 (UTC)
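Richardson's core idea was to replace a differential equation by finite time steps that can be worked by hand (or, later, by machine). A toy sketch of that idea, using forward Euler steps on a trivial decay equation rather than any weather model (the function names are mine, not Richardson's notation):

```python
import math

def euler(f, y0, t0, t1, steps):
    """Integrate dy/dt = f(t, y) from t0 to t1 with forward Euler steps."""
    dt = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += dt * f(t, y)  # advance by one finite time step
        t += dt
    return y

# Toy equation dy/dt = -y with y(0) = 1; the exact answer at t = 1 is exp(-1).
approx = euler(lambda t, y: -y, 1.0, 0.0, 1.0, 1000)
print(abs(approx - math.exp(-1)) < 1e-3)  # True
```

The same stepping scheme, applied to the Navier-Stokes equations over a spatial grid, is essentially what Richardson proposed and what weather supercomputers still do.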

I am replying to a high school librarian's assessment of this article: upon repeated re-reading and editing of the statements in this article, I can state categorically that the edits are made in good faith. As a professional with decades spent on technology, I have learned and experienced items which not even a professional historian could possibly have learned. Since the field has expanded every decade since the 1880's, and since technologists have not had a venue for documenting their accomplishments until the advent of Wikipedia, their work has gone unsung until now. Ancheta Wis 16:58, 26 Aug 2004 (UTC)

Italics everywhere?

Why is it that seemingly every noun in the article is in italics? Did someone get confused about how to make Wiki links? Most of the italic portions would be (I think) most appropriately either deitalicised, or made into wiki links. (Italic emphasis gratuitously added to illustrate how tiresome it is to read something formatted like that.)

Unless there's some particular reason why it's like that, I'll try to change them around a bit at some point soon. PMcM 02:27, 3 Dec 2004 (UTC)

Michael Hardy puts the usage thus: When a noun is used in a sentence, then it is not italicized, unless the sentence is about that noun, in which case it is italicized. Here is a link to further use of italics. Ancheta Wis 02:47, 3 Dec 2004 (UTC) Thus when I refer to logarithms of numbers (which are about a transformation of the respective numbers), I italicize to emphasize the transformation of the operations of multiplication and division into the operations of addition and subtraction.
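The transformation described above can be shown in two lines: logarithms turn multiplication and division into addition and subtraction, which is why log tables and slide rules could "compute" products by adding lengths. A minimal illustration (the variable names are mine):

```python
import math

a, b = 123.0, 456.0
# Multiplication becomes addition in log space...
product_via_logs = math.exp(math.log(a) + math.log(b))
# ...and division becomes subtraction.
quotient_via_logs = math.exp(math.log(a) - math.log(b))
print(math.isclose(product_via_logs, a * b))  # True
print(math.isclose(quotient_via_logs, a / b))  # True
```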


Speedy response! Just finished playing around with it.

Who is Michael Hardy? I think that possibly going by the Wikipedia guidelines I feel it more appropriate to have a lot (about 75%) of what is/was in italics in that article as wiki links.

Certainly if it was written on paper it would be more appropriate to have the visual cue of italic text, used sparingly here and there where it might be confusing otherwise, but I personally don't feel it's necessary in the majority of places it was present in the article. If you're really incredibly attached to them, please feel free to put them back in, but I think the article would be less well off without the inclusion of the links I added. Thanks. PMcM 03:06, 3 Dec 2004 (UTC)

Unrelated: Any idea why this talk page has no contents section? Is it likely to be something I have set wrong, or is it the same for others? PMcM 03:10, 3 Dec 2004 (UTC)

It does have a contents section, you just have to look hard for it ;-) The reason is, the top of the page is filled with comments divided using horizontal rules. The TOC doesn't appear until after those. — Matt 10:39, 3 Dec 2004 (UTC)

Also, apologies for the somewhat patronising tone I used to initially raise the issue. PMcM 03:13, 3 Dec 2004 (UTC)

What to do with this tale...

I removed this:

During World War II, Curt Herzstark's plans for a mechanical pocket calculator (see Curta) literally saved his life. In 1938, while he was technical manager of his father's company Rechenmaschinenwerk AUSTRIA Herzstark & Co. he had already completed the design, but could not manufacture it due to the Nazi annexation of Austria. Instead, the company was ordered to make measuring devices for the German army. In 1943, perhaps influenced by the fact that his father was a liberal Jew, the Nazis arrested him for "helping Jews and subversive elements" and "indecent contacts with Aryan women" and sent him to the Buchenwald concentration camp. However, the reports of the army about the precision-production of the firm AUSTRIA and especially about the technical expertise of Herzstark led the Nazis to treat him as an "intelligence-slave". His stay at Buchenwald seriously threatened his health, but his condition improved when he was called to work in the Gustloff factory linked to the camp. There he was ordered to make a drawing of the construction of his calculator, so that the Nazis could ultimately give the machine to the Führer as a gift after the successful end of the war. The preferential treatment this allowed him ensured that he survived his stay at Buchenwald until the camp's liberation in 1945, by which time he had redrawn the complete construction from memory. See: Cliff Stoll, Scientific American 290, no. 1, pp. 92-99. (January 2004) Also see: [2].

While this is a fascinating tale, I'm not sure whether its significant enough in terms of the history of computing hardware to deserve a long paragraph in an overview article on the topic. Mechanical calculators were commonplace by the 1930's, even if they weren't miniaturized. --Robert Merkel 23:38, 5 Dec 2004 (UTC)

inserted the information into Curt Herzstark Ancheta Wis 07:13, 6 Dec 2004 (UTC)

I just noticed that the DNA computing section in History of Computing was removed. Once one concedes that the travelling salesman problem is a true computation problem, then one must also concede that a computation using DNA is a computing hardware feat. If that is so, then the recognition that DNA can form the basis for a Turing tape is part of the history (and future) of computation; thus the recognition that DNA forms a code is part of the intellectual heritage of computing and part of its future. That is why Adleman actually solved a travelling salesman problem using DNA. But if that is truly a CS item, then Gamow deserves to be mentioned as this was part of the work that occurred before 1960. Ancheta Wis 01:43, 14 Dec 2004 (UTC)
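For reference, Adleman's 1994 DNA experiment is usually described as solving an instance of the directed Hamiltonian path problem, a close relative of the travelling salesman problem mentioned above. A brute-force software sketch of the same question (the function name and example graph are mine, purely for illustration):

```python
from itertools import permutations

def has_hamiltonian_path(vertices, edges):
    """Brute force: does some ordering of the vertices visit each
    exactly once, following only the given directed edges?"""
    return any(
        all((p[i], p[i + 1]) in edges for i in range(len(p) - 1))
        for p in permutations(vertices)
    )

edges = {(0, 1), (1, 2), (2, 3)}
print(has_hamiltonian_path([0, 1, 2, 3], edges))  # True: 0 -> 1 -> 2 -> 3
```

The software version takes factorial time; Adleman's insight was that DNA strands explore the candidate paths massively in parallel.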

This is an overview, which means some editorial judgement needs to be made about what are the most essential points to be covered in the space available. There are any number of things this article omits or discusses only briefly. There is much that could be said about analog computers, for instance. DNA computing, while an interesting concept, has not seen wide practical adoption compared to the technologies descended from those covered on this page. Therefore, removal was IMO appropriate. --Robert Merkel 12:26, 14 Dec 2004 (UTC)

Colossus relays

I altered the following:

"The Colossus used only vacuum tubes and had no relays."

It seems Colossus did use relays, both for buffering output and as part of its counters: [3], [4] — Matt Crypto 09:30, 13 Dec 2004 (UTC)