How powerful was the Apollo 11 computer? (2009) (switched.com)
78 points by nantes on July 19, 2013 | 51 comments


Minor nitpick: the PC came out in 1981, but the XT came out in 1983. They both have 8088 chips running at 4.77 MHz. The main difference was that the XT could be had with a 10 MB hard drive, though my PC (not XT) had upgraded ROM BIOS chips from around 1984 (you had to change the actual chips, because they were not flashable), so it has a 20 MB Seagate hard drive, which still works. The XT also drops the cassette port.

I recently got my PC out of storage, and when I opened it up I was surprised that the floppy controller takes up one of the five 8-bit ISA slots, even though it carries only a few simple chips. (The XT has 8 slots.)

(Pic of the floppy controller. http://upload.wikimedia.org/wikipedia/commons/a/a6/IBM_PC_Or... )


Though multitasking and VMs on the AGC were a feat, that's not what amazes me. The IBM 360 demonstrated that both were possible at the time, and the AGC impressively did it on less hardware. What does amaze me is how long it took such things to come to the personal computer, and arguably how much DOS/early Windows held things back. Unix, with its multitasking, ran on PCs and Macs in the 80s, so we only waited 20 years after the 360 (and 15 after the AGC) for multitasking. But usable VMs didn't come (or at least weren't commonplace) until, what, mid-90s? 30 years after the 360, and 25 after the minimal AGC? What the heck took so long?


Few personal computers had protected memory until the late 1980s. Unix didn't run on PC or Mac until protected memory hardware was available.

Emulation was always popular, and it is a type of virtualization. Few people wanted to emulate the same platform they already had, and pure software emulators were pretty slow in the 1980s unless the target platform had the same CPU as the host.

My family had an Amiga, and we had both a bridge board and an A-Max. The bridge board was a full PC on an expansion card, so it's not really a VM. The A-Max was more like a VM: its hardware contained the Apple ROMs and a floppy disk interface, because the Amiga floppy drive could not read Mac disks. The software used the ROMs and Apple floppy drive to emulate a Mac.

Now that I think about it, I think the real reason was RAM, and to a lesser degree hard drive space. My first encounter with virtualization on Linux happened when most people were running between 8 and 64 MB of RAM, and 64 MB was very expensive. That's a pretty big constraint.


I still love this story -- even four years later! (and not just because my husband is the author!)


Well, I started coding on an 8-bit machine: 3.5 ~~GHz~~ MHz, up to a total of 64 kB of memory (static and dynamic together) - but the keyword here is "up to"; initially it was less than that. I had the whole "OS" pretty clearly mapped out in my head, by address no less. My programs sometimes contained jumps to physical addresses.

So I think I've some idea how the AGC operated.


MHz I hope :)


Darn. Yes, MHz. :) Thanks!


For those interested in this subject, I highly recommend "The Apollo Guidance Computer: Architecture and Operation":

http://www.amazon.com/The-Apollo-Guidance-Computer-Architect...

John Pultorak built a Block I AGC (think "version 1") and provides all of his work in the Public Domain here:

http://klabs.org/history/build_agc/


I haven't read that one, but enjoyed Digital Apollo: Human and Machine in Spaceflight http://web.mit.edu/digitalapollo/


I think looking back at computer clock rates and memory is a silly endeavor. I remember a few months ago people commenting about "why we put a 2Mpx camera on the Curiosity Rover". The simple fact is, mission critical hardware/software is a different ball-game. You need reliability and fault tolerance.

If I were going to the Moon today and had the choice of flight computers, hands down I would choose the Apollo 11 AGC over a system based on an iPhone's A6 processor.


Ironically, the Apollo 11 landing was nearly scrubbed because the computer got overloaded with tasks and couldn't process everything in time, so that lack of power really was a problem.


The computer got overloaded because of crew error.


Procedural error, really. The crew followed their instructions, the instructions were just wrong.

In any case, my point is that a somewhat more powerful computer would have tolerated this problem while the real one did not.


I was going by the article, which called it a crew mistake, but either way, the computer was powerful enough for the correct procedure.

To me, more powerful is likely to imply more complex, and that also offers more surface area for bugs and mistakes, so to speak, and at the very least requires more work to prove correct. So making it more capable than you actually need, just in case, may not always be the best choice?


tl;dr - The Apollo 11 computer ran at 0.001 GHz with 0.000002 GB of memory and 0.000032 GB of storage (read-only at that). The display amounted to a few dozen 1-bit pixels.

And it took them to the Moon.
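
A quick sanity check of those conversions, as a rough sketch. The underlying figures (~1 MHz clock, ~2 KB of erasable memory, ~32 KB of fixed rope memory) are my assumptions based on commonly cited Block II AGC numbers, not taken from the article:

    # Rough sanity check of the parent's unit conversions. The underlying
    # figures (~1 MHz clock, ~2 KB erasable memory, ~32 KB fixed rope memory)
    # are assumptions based on commonly cited Block II AGC numbers.

    clock_hz = 1_000_000        # ~1 MHz
    ram_bytes = 2 * 1024        # ~2 KB erasable memory (the "RAM")
    rom_bytes = 32 * 1024       # ~32 KB fixed rope memory (read-only)

    print(f"{clock_hz / 1e9} GHz")            # 0.001 GHz
    print(f"{ram_bytes / 1e9:.6f} GB RAM")    # 0.000002 GB
    print(f"{rom_bytes / 1e9:.6f} GB ROM")    # 0.000033 GB (parent rounds to 0.000032)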


I love reading this: http://www.hq.nasa.gov/alsj/a11/a11.landing.html

"This display does not place a particularly heavy load on the computer, but when added to the existing load, was sufficient to generate the 1202 alarm."

---

"102:38:26 Armstrong: (With the slightest touch of urgency) Program Alarm.

102:38:28 Duke: It's looking good to us. Over.

102:38:30 Armstrong: (To Houston) It's a 1202.

102:38:32 Aldrin: 1202. (Pause)"

http://www.dailykos.com/story/2012/08/25/1124164/-Neil-Armst...

---

"102:36:18 Armstrong: (To Houston) Our position checks down range show us to be a little long.

102:36:21 Duke: Roger. Copy. (Heavy Static)

[In a post-mission analysis, Apollo Descent and Ascent Trajectories, Floyd Bennett notes that, at PDI, Eagle was about 3 miles farther downrange than planned, due to [..]"

->

... after that, Armstrong selected a landing site visually, and landed the lunar module manually. :)

[Mission Control] Thirty seconds. (Just Mission Control telling Neil Armstrong how many seconds of fuel he has left for landing) :)


If you like this, there's Spacelog, which is a neat way to view transcripts for a number of missions:

http://apollo11.spacelog.org/

http://apollo13.spacelog.org/

Browsing through the Apollo 13 "spacelog", it's interesting how a lot of the lines in the Apollo 13 movie are very similar to the ones in real life.


Now we seem to be completely dependent on computers, but at the time, I'm sure they didn't rely on them as much. The people pretty much navigated themselves to the Moon.


Some time ago, I was trying to buy a few hundred dollars worth of furniture, only to be informed that the cashier could not process my order because her computer was down. Everything I was buying had a price tag and I was paying cash, so I suggested that she find a pencil and some paper, and just write everything down, so she could enter it when the computer came back online. The concept apparently fried a circuit or two in her brain, because there was 6% sales tax on everything, and how was she supposed to know how much that would be? Plus, how was she supposed to know how much change to give me?

Nowadays, we don't have price tags on anything anymore. Prices are fetched from a database using the UPC as a key, so if the computer's down, the buyer is out of luck. At a previous employer, we sold specialty goods to contractors, so we had strong relationships with our customers. I drafted a plan to print up pads of paper sale forms that looked more or less like our (80x24 greenscreen) order screens, so if we lost connection to the mainframe, we could still move product from the warehouse to the customers; we'd just give them the product and bill 'em later. It was a risky move, but not really any riskier than sending our customers across the street to our competitors. To my knowledge, they've never had to use the manual system, thankfully.

We may not be "completely dependent on computers" just yet, but we seem to be getting there.


I know you're making a point, but it just sounds like you were dealing with particularly incompetent cashiers. I've been to a number of brick and mortar stores where they had a system exactly as you described. They simply copied my credit card onto carbon paper and wrote in the calculated total so that they could take care of the billing when the system came back online.


I don't dispute the claim that there are very competent cashiers all over the place. I just seem to be particularly lucky in finding their counterparts whose greatest usefulness is in flattening the curve.

Also, isn't it a little risky letting someone copy your credit card like that? Even if you trust the cashier not to steal the number, how do you know whether they're destroying the information after they're done with it?


Those carbon copiers were the way they charged CCs for years before computers, right? So presumably there are a decent number of safeguards built in.

(Disclaimer: I'm 29, so don't really remember a time when those things were commonly used.)


Well, yes, that's the way they did it before computers, but no, the safeguards really weren't there. They used carbon paper to transfer the number from the card to the leaves of the receipt. So when the carbon gets discarded, it becomes trivial for someone downstream of the transaction to pull the carbon out of the trash and use the embedded number to do some internet or telephone-based shopping of his own.

It's funny--29 doesn't sound particularly young, but I'm really suppressing the urge to call you a whippersnapper.


Before widespread communications networks, credit card imprints were taken with machines that look like [1]. They are still sometimes used when networks are down or when there is no phone/Internet network (think temporary locations like fairs and carnivals). The security is that the credit card company will (hopefully) void invalid transactions...

[1] http://img.ehowcdn.com/article-new/ehow/images/a04/b9/g6/cre...


That's part of the service you get by paying for a credit card. You have the opportunity to review your expenses that month and contest any that you disagree with before you pay. On the other hand, if you made the purchase with a debit card, you would have a much harder time trying to reverse the transaction.


You don't have to wait for the register to go down to make a modern cashier have a mental segfault - just give them extra cash so you get less bills back.

For example, on a $6.95 order, hand over $12 and watch first the bewilderment as they key in the amount, and then the amazement when the change comes back as something other than a pile of singles.
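
For what it's worth, the arithmetic behind the trick is just minimizing the number of pieces of change coming back. A rough greedy sketch (the denominations and amounts here are only illustrative):

    # Toy sketch of why handing over $12 on a $6.95 order is kinder to the
    # till: a greedy count of the pieces of change coming back.

    DENOMS_CENTS = [2000, 1000, 500, 100, 25, 10, 5, 1]  # $20 bill down to 1 cent

    def change_pieces(paid_cents, owed_cents):
        """Greedily break the change into (denomination, count) pairs."""
        remaining = paid_cents - owed_cents
        pieces = []
        for d in DENOMS_CENTS:
            count, remaining = divmod(remaining, d)
            if count:
                pieces.append((d, count))
        return pieces

    for paid in (1000, 1200, 2000):        # pay $10, $12, or $20 on a $6.95 order
        pieces = change_pieces(paid, 695)
        total = sum(count for _, count in pieces)
        print(f"${paid / 100:.2f} tendered -> {pieces} ({total} pieces back)")

    # $12.00 comes back as one $5 bill and a nickel (2 pieces), whereas $10.00
    # comes back as three $1 bills and a nickel (4 pieces).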


"fewer" bills back, not "less".

pedantic? yes. but as a fellow member of the "give-em-too-much-to-get-fewer-back" tribe, I feel that you might appreciate this particular bit of pedantry.


Only for a new cashier, really. It doesn't take long to understand what the customer is doing - once you've had a few "I can't make change for you because I'm out", it all makes sense and cashiers start to like the customers who minimise the change requirements.

I remember talking to someone who had clearly never worked in retail, complaining about their god-given right as a consumer to have change made for them trumping an open-air market vendor's right to have change for the rest of their sales for the day.


This always makes me sad and I'm shocked on the few occasions (rarer every day) when change is correctly counted back to me: "5 cents make 7 dollars, and 5 more dollars is 12. Have a good day."


I can't see how people accept that system of pricing. Put what I pay on the tag. Put whatever you expect me to pay, and do the same in your ads. I'm capable of understanding that prices may differ from place to place due to tax differences. I hate being asked for an amount different to the tag - there is no way of telling if you're being shafted. I'm not from the US.


The problem with the US is that the sales tax can be set by three different levels of government: the state, the county, and the city. For instance, the state might have a sales tax of 5%, the county none, and one particular city in that county 1%. What price do you advertise for the following?

1. A countrywide advertising campaign?
2. A statewide advertising campaign?
3. A local newspaper that serves the county?

As odd as it seems, that's the way it is here in the States (but I can sympathize, a friend from Sweden was horribly annoyed by the differences between sticker price and final price because of the tax rate).
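
To make the stacking concrete, a rough sketch using the parent's hypothetical 5%/0%/1% rates (US sales tax rates are generally summed into one combined rate applied to the sticker price, and these numbers are made up for illustration):

    # Sketch of the parent's hypothetical stacking: state 5%, county 0%, city 1%.
    # Rates are additive (summed into one combined rate), and are made up here.

    def price_at_register(sticker, state=0.05, county=0.00, city=0.01):
        combined = state + county + city
        return round(sticker * (1 + combined), 2)

    print(price_at_register(100.00))            # 106.0 -> $106.00 in that one city
    print(price_at_register(100.00, city=0.0))  # 105.0 -> same item, one town over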


I'd defend the practice, but there's a certain attractiveness to just showing me what my financial liability is. You start with sales tax, then you exempt certain classes of consumer goods from sales tax, then you impose an extra eighth of a percent on purchases made within this or that economic district (usually to help finance a stadium or megamall), and pretty soon, your consumers don't have any idea what their tax liability is going to be when they check out.

Over time, one does develop a vague sense of how much their purchases will be taxed, but I would definitely support a measure that required merchants to charge no more than what's on the tag.


At a speaking engagement by James Lovell, I heard him describe the emergency procedure for navigating that they did with Apollo 13. There was a large round port window facing forward with a line engraved across the middle. Line that up with Earth's terminator line like cross hairs. Fire the main engine by hand. That's it. And it worked.


Actually, the lunar descents were all on autopilot until fairly close to the surface. And even when the astronauts took "manual" control, the LEM was still fly-by-wire, like a lot of modern aircraft: the guidance computer was reading the movements of the joysticks and translating that into thruster settings. (Pulling the joysticks "hard over" would bypass the computers, but that was considered an emergency measure only.)

The LEM software was supposed to be capable of a fully automatic landing, but none of the astronauts rode it all the way down.

David Mindell's book "Digital Apollo" goes into these systems, and their development, in much detail; the discussion of the LEM user interface is in chapter 8.


Well, the other real alternative was to determine their position, etc., then send that data back to NASA who would use their big computers to do the calculations, then send back the burn duration, etc.


Ahh, the cloud option.


I thought it was mostly rocket science? (calculating trajectories in advance, on supercomputers on the ground)


Super...what? Actually, I asked my celestial dynamics teacher how to determine a good trajectory to get "somewhere." The answer? "Clicking." You just keep trying different solutions until you find one, or something close enough, and then numerically optimise it.


It's pretty awesome to know that you can get something that powerful in the palm of your hand (the ATmega328) for less than $3 at single-unit prices.

After some minor conversion, the ATmega/Arduino has almost exactly the same storage space and RAM. Only, the Arduino clocks at 16 MHz. Still awesome regardless.
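
A rough sketch of how that "minor conversion" comes out, depending on whether you count AGC words against AVR bytes or compare raw bits (the AGC figures here are commonly cited Block II values, assumed rather than taken from the article):

    # Rough comparison of Block II AGC memory against an ATmega328 (Arduino Uno
    # class). AGC figures assumed: 2,048 erasable words, 36,864 fixed words,
    # 15 data bits per word.

    agc_ram_words, agc_rom_words = 2048, 36864
    atmega_sram_bytes, atmega_flash_bytes = 2 * 1024, 32 * 1024

    # Counting one AGC word loosely against one AVR byte, as the parent seems to:
    print(f"RAM: {agc_ram_words} AGC words vs {atmega_sram_bytes} bytes of SRAM")
    print(f"ROM: {agc_rom_words} AGC words vs {atmega_flash_bytes} bytes of flash")

    # Counting raw bits instead, the AGC's rope memory comes out a bit bigger:
    print(f"AGC ROM ~{agc_rom_words * 15 / 8 / 1024:.1f} KB vs 32 KB of flash")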


That was the Block II version, though. The previous version (at 24k storage and 1k memory) is probably closer to the ATtiny1634, which only has 16k of main storage but clocks at a veritable 20 MHz. The thing I find amazing is that there's a BGA version of that chip that is 2x2 mm. And they're like <$2 single unit. Sorry for whipping out my 1634 pet peeve.


The AGC is quite a feat, but it's already from the digital age. I find the TDCs used during WWII more fascinating. I wonder what the space age would have looked like with electromechanical computers :)

http://en.wikipedia.org/wiki/Torpedo_Data_Computer


How powerful was Shakespeare's word processor?


Far Travelers is an awesome book about satellites that you can read for free. Much more information than you'd care to know; it's amazing.

"Another significant improvement included the addition of extra "brain power" to allow the orbiters to perform more complex functions. Viking orbiters possessed two 4096-word, general-purpose computers that could operate in parallel or tandem modes. These replaced the small special purpose computers contained in Mariners 8 and 9. The capability for more rapid picture taking allowed for better site surveys and special regional studies. This capability was augmented by tape recorder systems that could store 2.112 megabits per second, with a capacity of 55 TV pictures-over half a billion bits of information."

http://history.nasa.gov/SP-480/ch12.htm

http://history.nasa.gov/SP-480/contents.htm


As someone who spends his time writing in high-level languages (though I have done lower-level stuff in the distant past), these kinds of embedded computers hold a real fascination for me.

On my long-term to-do list is to get back into electronics and learn Ada (mostly because it's a million miles away from PHP).


Learning Ada is worth your while, embedded or no. Hint: If you have a problem and you find yourself reaching for C++, consider Ada instead. It's probably powerful enough for the task, safer, and more readable (therefore maintainable).

Try implementing Unix utilities (cat, ls, echo) in it for a weekend of fun. :)


My one true programming language love is Pascal (mainly because when I was exposed to it I'd only ever used BASIC), and later Object Pascal.

Looking at Ada I see a lot of similarities, it looks like "home" to me.


followed the links to the guy who built an AGC from scratch - http://klabs.org/history/build_agc/ - detailed technical info plus photos - amazing!


The comparison with an early PC is not imaginative.

Think of the original Sinclair ZX81 with 1K of RAM and 8K of ROM. Swap out the Z80 for a 6502 to get approximately the same register count. Without a screen to worry about (it took up too much RAM and CPU time), you could do a lot, e.g. 1K chess or, as the adverts said, run a nuclear power station with it. You could also service I/O with hardware interrupts.


The details of the software and the in-flight patching on the Apollo 14 mission are quite amazing. Another point is that nowadays we have a tremendous amount of computing power, combined with comprehensive numerical analysis tools, which allows engineers to perform thorough simulations of the stages of space/air/sea travel and the behavior of the vehicles.


"(1/1000th of 1 MHz, much as 1 MHz is 1/1000 of 1 GHz)"

Half of 1 MHz is 500 kHz, not 500 Hz... it may sound slow, but the PIC microcontrollers we were using in the '90s were clocked at 20 MHz maximum and executed about 4 MIPS. For the 1960s, that computer really was rocket science!


Isn't all the code open source now? Seems I read that it was, not long ago.



