Ah, nostalgia.....
Are IBM sales folks still a source of self-deprecating jokes? As recounted by our IBM account manager:
How do you describe Lassie, Rin-Tin-Tin and the IBM PCjr?
Two movie stars and a dog.
The year 1984 was a watershed for the personal computing market, a year which saw Apple introduce the Macintosh and forever change the way people looked at desktops. On the 30th anniversary of that January release, there was no shortage of coverage looking at the historic launch of the Mac and its impact on the industry and …
Maybe it was the abject failure of the PCjr keyboard that later inspired IBM to create one of the best keyboards ever? I know people who have clung on for dear life to their clackety-clack IBM PC keyboard and still seek out its modern equivalent when buying systems, because of its tactile feedback (or its weight - these things had serious substance). In a modern office, however, the audible component is somewhat less appreciated, but there are quieter equivalents now.
Personally I was more partial to the keys that used a tiny magnet attached to the keys, closing a reed switch. The depth of such keyboards meant they had to be built in, but I rather liked the feel of them.
The Model M keyboard is wonderful, but also INCREDIBLY LOUD. The other people of your household, office, and possibly surrounding countryside, will not thank you for getting a Model M.
Someone suggested it was made that loud to simulate the sound of a typewriter, much as very early cars were made to look like horse-buggies, but it doesn't really sound like a typewriter. On the other hand, the sound is, pretty much literally, iconic. Everyone knows exactly how a Model M sounds: it's the sound of all keyboards in all movies ever.
"it's the sound of all keyboards in all movies ever."
Given Hollywood's almost supernatural ability to mangle reality whenever computers appear, it's remarkable that they don't foley-in the Model M sound when people type on iPads.
I can understand juicing up esoteric things, like nuclear control panels or crazy floating holographic user interfaces, that either don't exist or aren't seen too much. But I'm not sure I've ever even seen someone write an email in a movie without - in addition to the gatling-gun typing sounds - showing a user interface with Reader's Digest - Oldster Edition-sized fonts, huge CRT shadow mask pixels (even on laptops), and A SOUND EVERY FUCKING TIME ANYBODY CLICKS ANYTHING!!!
What the hell?! God DAMNIT, Hollywood; everyone in your audience knows damn well that Outlook Fucking Express does not go BLEEP, BWIP, BWATCH, ZIOPP every time someone sends an email!
I don't care if you tear up the collective anuses of crap nobody knows about, because it doesn't hurt suspension of disbelief that much, but when it comes to basic computer stuff, FOR THE LOVE OF GOD stop humping and pull your goddamned dick out of the truth! IT HURTS!
Wrote :- "I actually miss the tactile feedback from those monsters .." [IBM Model M keyboards]
So why did you get rid of it? I'm typing on one now and it will last for ever.
I read a review of the PCjr back then, and the reviewer said that the keyboard (I suppose the original Chiclet) was so flimsy that he picked it up and gave it a twist. Half the keys popped out onto his desk!
Reading that destroyed my previous supposition that IBM kit meant quality. I was disgusted. It goes to show that companies with a previous quality reputation should never fritter that reputation away by using their brand name to sell tat.
All it does is generate sync signals and addresses, indicate where a (text) cursor should go in the signal, and latch the current logical position if it receives a pulse from a light pen. Actually fetching the video byte and, by whatever means, turning it into colours is left to other circuitry.
It is indeed the same chip used by at least the [8-colour] BBC Micro, [27-colour] Amstrad CPC and [16/256-colour] EGA/VGA cards.
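For anyone who never met one of these chips, the division of labour described above can be sketched in a few lines. This is an illustrative toy, not a register-accurate model of a 6845-style CRT controller; all the names and parameters below are invented for the sketch. The point is simply that the chip is a bundle of counters emitting refresh addresses plus a cursor strobe, and nothing more.

```python
# Toy model of a 6845-style CRT controller: it only counts out memory
# refresh addresses (and cursor timing); fetching the byte at each
# address and turning it into colours is someone else's problem.

def crtc_scan(chars_per_line, char_rows, start_addr=0):
    """Yield the linear refresh address for every character cell,
    in the order the beam visits them."""
    for row in range(char_rows):
        for col in range(chars_per_line):
            yield start_addr + row * chars_per_line + col

def cursor_strobe(refresh_addr, cursor_reg):
    """The controller raises its cursor output whenever the refresh
    address matches the programmed cursor register."""
    return refresh_addr == cursor_reg

# Address stream for a toy 3x2 text screen:
frame = list(crtc_scan(3, 2))
```

Everything machine-specific (how many colours, what the byte at each address means) lives outside the chip, which is why the same part could drive an 8-colour BBC Micro and a 256-colour VGA card.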
... and that's the absolute most I can possibly contribute to the conversation. I enjoyed the article but, like most Brits, have no idea how large a baseball is.
It's the size of a rounders ball. Baseball is rounders played by men, with added statistics.
It occurs to me that the PC Jr. is an object lesson that was completely ignored by Microsoft in bringing out the original Surface tablet; a crippled version of anything is going to be bad publicity, and a good display won't make up for inferior everything else.
The original 8086 was 4.77MHz; the 8088 was a crippled 8086 with an 8 bit data bus. When, in the early 80s, I designed a low cost industrial CPU module for a series of test equipment we were building, I was able to use the NEC V30, a CMOS 8086-compatible part running at 8MHz. The entire module (CPU, logic, 64k RAM, firmware in eprom and I/O, plus a 12 bit A/D converter and a 2-line LCD display) used less power than a 4.77MHz 8088 on its own. Intel have got a lot better since, but back then they didn't seem to see low power consumption as desirable.
"As opposed to American Football, which is rugby for girls."
Sadly untrue. The armour contributes to collisions which are causing an awful lot of brain injuries. Padding is actually more dangerous than its absence.
Incidentally, re my original comment, it was a joke, even if a very old and overdone one. If you downvoted it for that reason, I tend to agree with you.
Given that American football manages to permanently cripple a rather large percentage of its players by the time they hit 35, I'm not sure I'd so rapidly dismiss the courage / lack-of-foresight necessary to play it, presence of helmets and pads notwithstanding.
IIRC there wasn't a single ASIC chip in the original APPLE ][.
The CPU had access to memory for half of each clock cycle, and the video circuitry accessed the memory during the other half. This allowed the video circuitry to do double duty, refreshing the DRAM.
Recall the 8 line video interleave which IIRC saved a whole 2 logic gates.
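The interleave being remembered here matches the well-known Apple II text-page layout: rows are stored in three interleaved groups of eight, which made the address counters cheap in hardware but left the map non-linear for software. A small sketch of the standard mapping (0-based rows and columns, standard $400 text page):

```python
def text_addr(row, col, base=0x400):
    """Apple II text-page address for character cell (row, col).
    The 24 rows are stored in three interleaved groups of 8, so
    consecutive rows sit 0x80 bytes apart, while row 8 lands only
    0x28 bytes after row 0."""
    return base + (row % 8) * 0x80 + (row // 8) * 0x28 + col

print(hex(text_addr(0, 0)))  # 0x400
print(hex(text_addr(1, 0)))  # 0x480
print(hex(text_addr(8, 0)))  # 0x428
```

The upshot: walking the screen top to bottom in software means hopping around memory, which is the price of those saved gates.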
I may be wrong about this, working from memory, but the cycle time of the 8088 is not one clock but 4. As I recall, the address comes out on clock 1, is latched on clock 2, and the data on the bus is read during clock 4. As a result, 350ns access time allows full speed operation. (In fact I think 450ns is adequate, but building industrial computers with 350ns EPROM added little cost and gave a useful safety margin for when ambient reached 70C.)
The 8088 in theory overlapped instruction and data, but because it was crippled with an 8 bit bus this never really worked out in practice as it took 8 clocks to read an instruction. R/R instructions might only need 3 clocks, but they still took 8 to read. Not smart.
Great days, when you could actually put ordinary oscilloscopes on computer circuitry and diagnose what was happening.
Oi, when I were a lad, we used to dream about oscilloscopes. Had to make do with a logic tester, built into the discarded felt pen. Syringe needle, bunch of wires, two crappy LEDs and one 7400.
Aargh. Shouldn't have mentioned 7400! Those bastards had zillions of 74LS chips to play with, and all they managed to do with them was to turn them into a PCJr!
/fakes a stroke and slumbers off the soapbox/
A logic tester? Luxury!
Ah, the delight of the Yorkshire men sketch.. :)
I switched to using the CMOS 4011 as soon as I could. You could do a lot more with a single battery that way :). Heck, I even used it in SMD form, a good thing I had a Weller soldering iron because they demanded a bit more quality from my then meagre soldering talents :)
"Had to make do with a logic tester, built into the discarded felt pen."
It was fun suggesting hooking up a logic analyzer to one of those once.
The looks I got before they realized I was joking was priceless!
"Aargh. Shouldn't have mentioned 7400!"
I think I still have some 74LS series chips in the basement somewhere.
The last place I saw PCJr units in operation was at Sears, where they were plugged up as kiosk machines to order from the catalog.
You're right - having been used to CPUs that did things in 1 clock cycle, I'd forgotten how slow the 8088 actually was, but it makes sense since it was a multiplexed data/address bus (at least part of it). Just checked the data-sheet and the address looks like it should be latched on the falling edge between T1 and T2, in T2 the data/address bus goes hi-impedance and the read, write and IO/Memory signals are sorted out, and data is transferred on T3 (plus any wait states).
It looks like memory has from the end of T1 until the end of T3 to sort itself out (about 400ns), so 150ns may well be overkill - mind you, since the design shared the system memory with the video controller, it may depend a bit on how access was interleaved between the two.
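The arithmetic behind that ~400ns figure is easy to check, assuming the stock 4.77 MHz PC clock and the four-clock T1..T4 bus cycle described above. This ignores address-latch delays and data setup/hold margins, which a real design would have to subtract:

```python
# Back-of-envelope check of the ~400 ns window: a standard 8088 bus
# cycle is four clocks (T1..T4), and memory has roughly from the end
# of T1 to the end of T3 -- two clock periods -- to present its data.

CLOCK_HZ = 4_772_727        # 4.77 MHz PC clock (NTSC colour burst * 4/3)
period_ns = 1e9 / CLOCK_HZ  # one clock period in nanoseconds
window_ns = 2 * period_ns   # end of T1 to end of T3

print(round(period_ns))  # ~210 ns per clock
print(round(window_ns))  # ~419 ns access window
```

So even 350ns parts leave a comfortable margin before wait states are needed, shared video access aside.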
Time was you could count on 3/4 of any logic board being covered in these beasties. Now consolidation and miniaturization have pushed most of this functionality into the CPU and the south bridge, leaving but a few lone survivors, and even these remnants are just minuscule surface-mount versions of the original, tiny shadows of the vast herds which used to roam motherboards.
Soon entire generations will grow up having never seen these majestic beasts, except possibly in archives such as this.
You've just brought back an ancient memory. In the early 70s when nobbut a nipper, I used two of this family of chips for a guitar effects unit I'd designed.
The first chip had some amplifiers on it. I used two of these in series to boost the input signal, so it was heavily clipped and effectively a square wave. This then went to a second chip with a number of divide-by-2 components (flip-flops?), and I used two of these components chained together. I took the input to the chip, the output from the first flip-flop and the output from the second flip-flop, each with its own potentiometer, and mixed them together.
Result? A big fuzz sound together with the octave below and the octave below that. An awesome sound (well it impressed me).
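The signal chain described above (hard clipping, then two divide-by-2 stages mixed back in) can be sketched digitally. This is a behavioural toy, not an audio-quality model: the flip-flops are idealised, toggling on rising edges, and the mix weights just stand in for the three pots:

```python
import math

def octave_fuzz(samples, mix=(1.0, 0.7, 0.5)):
    """Sketch of the effect: hard-clip the input to a square wave,
    chain two divide-by-2 flip-flops off it (one and two octaves
    down), and mix the three square waves with the given weights."""
    sq = [1 if s >= 0 else -1 for s in samples]  # clipped input
    ff1, ff2 = 1, 1
    out = []
    prev = sq[0]
    for s in sq:
        if prev < 0 <= s:      # rising edge of input toggles stage 1
            ff1 = -ff1
            if ff1 > 0:        # stage 1's rising edge toggles stage 2
                ff2 = -ff2
        prev = s
        out.append(mix[0] * s + mix[1] * ff1 + mix[2] * ff2)
    return out

# 440 Hz tone at 44.1 kHz, run through the sketch:
tone = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(200)]
fuzz = octave_fuzz(tone)
```

Each divide-by-2 halves the frequency of the square wave feeding it, which is exactly the octave-below and two-octaves-below described.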
Could the first chip perhaps be a TL074 quad op-amp rather than a 74-series logic chip?
I suspect he's mis-remembering a little...
One of the classic guitar distortion designs was to use unbuffered CMOS 4000-series - e.g. the 4009UBE, which gave you 6 stages to play with.
I can't imagine TTL sounding much cop...
Vic.
"Soon entire generations will grow up having never seen these majestic beasts, except possibly in archives such as this."
True, most won't believe slamming an original PC HD on the desk to free up a stuck platter.
Or more commonly, slamming the entire PC for that purpose.
Hey, it worked and was an excellent stress reliever.
Me too!
Then I could mention how I followed up that 'sticky HD' with an intermittent fault on a Mac SE keyboard that took three months to track down: 'twas the dust particles from the user's pipe smoke and high humidity wot done it, shorting out the switches of certain keys every now and then.
J.
... IBM tended to build good peripherals for their other ranges... the Model M keyboard especially.
Last IBM CRT monitor I used was an 8514 PS/2 job, back in the era when companies sold a 14" CRT but only 12.9" visible. Not IBM - that thing weighed about twice as much and showed IIRC a touch more than the stated diagonal. 1024x768 goodness until I swapped it for a higher-res 17" Vivitron CRT (with its pixel mask support wires just visible across the screen).
Nice pictures - it looks like it was cared for through its life, which must mean it's about the best of what's left.
This might have been a crippled IBM PC, but the engineering stacks up well against the marketplace rivals of the day. IIRC though the PC Junior/Peanut was a good deal more expensive than the BBC (even once you added in the cost of a monitor)/C64/Speccy/Amstrads, and a 16 bit processor (PCJr's USP) pumping through an 8 bit bus didn't impress the lowest common denominator computer buyer of the time.
The first of the proper 16 bit home computers were already in the pipeline (Amiga and ST) and when they turned up a year or so later the rest as they say is history.
Yep, and stories about the Amiga were already appearing in the literature.
I read a preliminary report in (I think) Byte Magazine and decided that I knew what my first personal computer purchase was going to be.
About a year later, I bought an Amiga 1000 with Half a Meg of RAM, preemptive multitasking, 4 channel stereo sound on the motherboard and amazing color graphics. A machine which I was able to add 8 Megs of additional RAM to a few months later, albeit at a cost equal to the price of the entire rest of the system.
And the entire OS would fit on a single 3.5 inch Floppy Disk with room to spare.
I came to Amiga from a background on C64 (didn't we all?) and low-end PCs, and was blown away by, in order of appearance, the graphics, the sound, and the OS. The OS was the most tinkering-friendly OS I've ever seen, and was of a type we'll never see again: all OSes today are designed from the ground up to be defensive, to protect data from outside enemies and from the user. AmigaOS wasn't like that; it was instead designed to be configurable and transparent. (For those who do not know what it was like, imagine a streamlined XFCE Linux without any multi-user or security features, where everything is wide open to tinkering, and you're in the ballpark.)