Custom ICs in small numbers to be cheap as (normal) chips

The US military says it is on track to revolutionise the world of chip manufacturing by making it possible to produce advanced sub-65-nanometer ICs in small numbers - at the same low unit costs delivered by today's billion-dollar, mass production chip factories. As most Reg readers will be aware, the standard method of putting …

COMMENTS

This topic is closed for new posts.
  1. Ken Hagan Gold badge

    A long term prediction

    Many years ago, software development was a painfully expensive business. Access to the machine was the major bottleneck and so anyone who actually had to program for a living learned how to "measure twice, cut once" with their untried code.

    Then the hardware got so cheap that everyone could have their own box and run their programs in a debugger. The economics turned on its head and the smart approach became "cut twice and throw away the one that didn't fit". Modern bug-ridden software is the result. Above a certain level of reliability, it simply isn't cost-effective to find all the bugs before you ship to the first paying customers and sometimes it is never cost-effective to fix them, because you can make more money by adding new features and selling to a wider customer base.

    The same will happen to hardware. It'll take a couple of decades, but it will happen.

    1. Martin 19
      Boffin

      No.

      What will happen is this: the technology will basically allow cheap ASICs to displace "microcontroller-and-software" based systems in applications where an ASIC would be better (eg for high performance). As such, new hardware bugs will replace software bugs; and for high-performance real-time kit, software bugs are particularly devious and a big problem.

      Overall reliability will probably go up.

      /ex hardware man

      1. Vic

        Maybe.

        > and for high performance real time kit software bugs are particularly devious and are a big problem.

        The old hardware/software duality thing applies; hardware bugs can also be very tricky to find - especially as the level of integration goes up.

        What this really does do - if it ever sees the light of day, that is - is to enable prototypes to be made in small quantities so that testing can be far more extensive before tape ship.

        > Overall reliability will probably go up.

        I really, really doubt that. But costs will probably come down.

        Vic.

        /ex hardware man and current software man.

    2. Graham Wilson
      Boffin

      @Ken Hagan - Absolutely correct but for a moment think of the ramifications.

      Absolutely correct, but for a moment think of the ramifications and difficulties of documenting such changes, let alone authenticating them, in specialist or secure applications.

      Unless some well-implemented and well-accepted protocols are put in place, I'll predict it will quickly become a nightmare, not dissimilar to (and probably worse than) the longstanding practice of 'random' software patching, where the words 'change control' and 'documented' are seemingly unknown.

      >:-)

    3. The Unexpected Bill
      Thumb Up

      I'm not sure...

      I've got my doubts that things will wind up that way. I see this as more of an enabling tool than I do a way to "cheap out" on the design of a device. I see it opening doors for amateur silicon designers, or even people who could work on a professional level if they had access to the design equipment that is required. To me, that's a good thing. Ever notice how people doing something as a hobby that they really enjoy tend to do a better job as compared to a lot of commercial offerings?

      That said, even if I do disagree, your post is well written and thoughtful.

      To me, what made the "measure once, cut twice" concept possible was not the advent of cheaper computer hardware. It was the availability of reasonably priced flashable memories and the pervasive arrival of the Internet that, I feel, led to the "we can fix this later" mentality.

      Whether or not a company stands behind its existing products with support is more a question of morals and integrity, and both are characteristics in short supply these days.

    4. Displacement Activity

      Re: "A long term prediction"

      Nice analogy, but not quite. We (in hardware) have already been going through the multi-cut/throw-away argument for the last 20 years. It's one of the major drivers of the entire (but small) EDA industry. Multi-e-beam shouldn't change this, but the dumber companies will still screw up.

      The background is that we already do short production runs using FPGAs. These are expensive (and slow, and insecure, in that most of them are relatively easy to pirate). There has been an ongoing and gradual process of cost-reduction over the years, which involves replacing FPGAs and custom logic with ASICs.

      Attempts to replace FPGAs included e-beam, over 10 years ago, and structured ASICs (I've done one of these myself). These weren't successful, and we're still in a situation where a short product run (say, <1000 units) is much more expensive (say, a factor of 4 - 10) than one that ships 10,000, primarily because of the cost of FPGAs. An ASIC conversion of an FPGA is also generally much faster (in most cases, at least 50%). It's all about (1) cost, (2) performance, and (3) copy-protection.
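
      To put rough numbers on that short-run premium, the arithmetic is a one-liner. A toy NRE-amortisation sketch in C (all figures invented for illustration, not real quotes):

        /* Toy NRE-amortisation model: FPGA vs ASIC per-unit cost.
           All figures are invented for illustration, not real quotes. */
        #include <stdio.h>

        int main(void)
        {
            const double fpga_unit = 120.0;    /* assumed FPGA piece price        */
            const double asic_unit = 12.0;     /* assumed ASIC piece price        */
            const double asic_nre  = 250000.0; /* assumed masks + conversion cost */
            const int volumes[] = { 500, 1000, 10000, 100000 };

            for (int i = 0; i < 4; i++) {
                double asic = asic_unit + asic_nre / volumes[i];
                printf("%6d units: FPGA $%.0f/unit, ASIC $%.2f/unit\n",
                       volumes[i], fpga_unit, asic);
            }
            return 0;   /* at 500 units the ASIC loses; at 10,000+ it wins */
        }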

      It's certainly true that when people first started using FPGAs they cut many times and threw away everything till they got the one working version, and a lot of people still do this. This is possible because most hardware bugs manifest themselves a lot more obviously than most software bugs. Anyone doing ASIC conversions, however, even "cheap" ones, puts a lot more effort into verification, because it's just too expensive to fail.

    5. TeeCee Gold badge
      Unhappy

      Measure twice, cut once.

      Actually I reckon it was speed rather than price that did for this, as it persisted long after the days of developer CPU time and storage volume allocations were gone.

      Back in the day when compiling code was a submit-to-batch-and-then-read-the-paper-while-it-does-it exercise, repeated failed compiles had a significant effect on development time (and one's continued employment). Most of us would invest quite a bit of time in going over what we'd written, looking for errors before letting the compiler loose on it. As a result we'd not only drive out most if not all of the compile errors*, but quite often spot a few other foibles and tidy those up while we were at it.

      As soon as compiling became a "blink and you missed it" process, it became common to allow the compiler to find the cockups, fix the fatal ones and chuck the result into testing. Couple that approach with modern project management/planning methodologies (which all appear to end up as some variant on "cut the testing to hit the deadline" to me) and the outcome is a given.

      I don't think that being able to turn out custom ASICs on the cheap is a worry. When the process that does so takes less than 5 minutes, we'll be screwed though...

      *There was incentive here. A first time compile on anything of significant size and/or function meant everyone else stood you a beer.

  2. Graham Wilson
    Black Helicopters

    I'm sure the technology is great but custom ASICs are both a security and privacy threat.

    I'm sure this technology from DARPA is great technically, but as application-specific ICs become easier to make they'll also become more proprietary, and thus their exact uses and purposes will become more hidden. Moreover, this hiddenness will be of great significance whether the chips are used by the military, government, private industry or individuals.

    As custom roll-your-own ASICs become easier and cheaper to make, so it will become easier to disguise or obfuscate the tasks they actually perform; ultimately this will be very significant for ordinary citizens, especially with respect to their privacy. Everything from espionage and surveillance to advertising will be easier, as making a chip for one seemingly obvious purpose and then piggybacking a disguised/nefarious function onto it will become much easier and less costly. Thus a single one-off requirement or application will become an easy no-brainer.

    Already a concern now, verifying or authenticating precisely what functions a chip performs is going to be a very big issue in the near future. For instance, whilst ASICs can now be (and are) made with multiple purposes, for example a BIOS chip with a back door for spying (a la the Clipper chip or similar), deploying them without detection, whilst possible, is still costly and difficult (especially if they have to pass inspection by knowledgeable people).

    However, with ASICs becoming cheap and easy to manufacture in very small quantities, it will become much easier to slip-stream* specific dual/multipurpose units into, say, devices on a production line. For instance, of 100,000 PCs (or for that matter any appliance or device) on a production line, 30 may have ASICs with an additional spying function. These would look the same, and test identically, to all the others, but upon specific commands, encrypted keys etc. they would enter this secondary mode of operation.
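
    To make the point concrete: the trigger logic such a dual-mode part would need is almost trivially small. A hypothetical sketch in C (the magic sequence and names are invented for illustration; a real part would do this in a few gates of logic):

      /* Hypothetical dual-mode trigger, invented for illustration only:
         the part behaves normally until a magic byte sequence arrives
         on its ordinary interface, then a hidden function wakes up. */
      #include <stddef.h>
      #include <stdint.h>
      #include <stdbool.h>

      static const uint8_t MAGIC[] = { 0xDE, 0xAD, 0xBE, 0xEF };
      static size_t matched = 0;
      static bool hidden_mode = false;

      /* Called for every byte the part sees in normal operation. */
      void on_byte(uint8_t b)
      {
          if (!hidden_mode) {
              if (b == MAGIC[matched]) {
                  if (++matched == sizeof MAGIC)
                      hidden_mode = true;   /* secondary mode engaged */
              } else {
                  matched = (b == MAGIC[0]) ? 1 : 0;
              }
          }
          /* ...normal processing continues either way, so the part
             looks and tests identically to an unmodified one. */
      }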

    As DARPA is the organization that's actively researching cheap low-production-run ASICs, it doesn't require Einstein to figure out the broad and nefarious ways in which these pernicious ICs will be deployed.

    The social/human consequences of this technological 'advance' (some may say Scientific Step Backwards) will be profound, and they'll probably arrive much sooner than later.

    ______

    * Adding a new or different component(s) to a production run, either secretly or openly, without stopping or changing the version of the process.

    1. The Unexpected Bill
      Welcome

      Too much paranoia?

      A security and privacy threat...really?

      How much do you know about the mass-produced chips that are in equipment today? I'm not trying to start up some kind of conspiracy theory here, and to think that I am is to miss the point. Rather, I'm saying that the technology to put almost anything on a silicon die (camera, microphone, temperature sensor, accelerometer, etc) is already there and the world hasn't ended even with the plethora of silicon designers that are out there today.

      Many chips have datasheets, but they're oftentimes incomplete or don't cover a feature in great detail. Some aren't available except under NDA, or not at all. And how do you know everything the chip does when it's a black box? What about undocumented features?

      I was looking at an LPCIO (LPC Super I/O chip) in a Dell computer a few months ago. It's a part from SMSC, so I figured it was probably an off-the-shelf design. I asked for the datasheet and was told I could not have it. In fact, the man who responded (the same guy who has responded every time I've posed a question about their products over many years) said that nobody in the company could even get the info. It was a custom part, and any data on it is "confidential".

      Why? It's an LPCIO and probably not that different from their stock designs. A lot of its function can be inferred by looking at it and seeing how it responds in software. Does that mean Dell is doing something wrong? I highly doubt it. (Never attribute to malice that which is adequately explained by stupidity. I'm sure some manager who had played all the variants of FreeCell thought that making info unavailable was somehow a good idea.)

      Even if someone does do something bad... I don't see this being a huge problem. Someone's bound to figure it out by being observant, smart, tricky or some combination of the three. The news will get out one way or another, at some point.

      In other words, I'm not that worried about it.

      1. Graham Wilson
        Boffin

        @The Unexpected Bill -- Nor am I

        Nor am I, but then I'm not doing anything much more subversive than annoying other El Reg users with long posts, thus not in the league where it matters. And of course that goes for most people.

        Nevertheless, the matter is of concern for increasing numbers of organizations (see my reply to Yobgod Ababua).

        This issue is primarily about the way technology is evolving; in some ways we're not fully in control of it, as some say 'technology has, at least in part, its own autonomy'. Fifty years ago, years before the PC, all electronic equipment came with circuits/blueprints; for equipment to be sold without them, or at the very least ready access to them, would have been unthinkable to everyone: manufacturers, the service industry and consumers alike. It was the accepted norm that the service industry would have access to circuits, and no one questioned otherwise. Today, this would be equivalent to Microsoft issuing source code with Windows; back then, manufacturers protected their equipment in other ways: radios, TV sets and even valve (tube) boxes had their patent numbers stamped all over them.

        No one really thought of licensing programs until Seymour Rubinstein had the bright idea of licensing the WordStar word processor, and then the idea only proved feasible because compilation obfuscated the program's source code--something not possible with electronic circuits, at least not until the integrated circuit came along and gave hardware manufacturers the equivalent of compiled code (in that an IC is hard to reverse-engineer).

        Remember the original IBM XT and AT PCs? These discrete-device PCs came with both a complete set of electronic schematics and a published source code listing for the BIOS--something unheard of today. Back then it was normal to have source code; now possession is tantamount to stealing it. In the meantime we've lost out to a new proprietary world, and we've killed off a whole service sector without so much as a whimper, just because technology has enabled us to obfuscate both hardware and software.

        How far technology is allowed to move from the people into the proprietary corporation is still up for debate. Irrespective, humans have lost out in this process--at least in part. None of this was easily foreseen, let alone actually debated by society, but the obfuscation that new technologies offered was lapped up with glee by manufacturers.

        It's these unforeseen outcomes which are said to give technology its partial autonomy.


    2. Martin 19
      Big Brother

      Agree, but

      I agree with everything you've said, but there is an upside. For a particular chip that *is* examined closely, it is easier to audit the functions of the hardware than it is to examine any firmware programmed into it.

      You can, if you really want to, X-ray a chip (or dissolve the casing and use a microscope, etc.), see all the circuitry and, given a little computation, derive its full functionality. Extracting the firmware from a device that doesn't want you to have it is very difficult.

      (All is AFAIK, if I'm wrong say so :) )

      PS: From what you say it seems you think that the US Gov. (for one) is planning to use 'trojan horse' ASICs as an espionage tool. Do you think that (in the US) this could be achieved with products from major manufacturers; and do you think this could be done with the OEM's knowledge, or would they use more covert means (eg. an agent working in production)?

    3. Anonymous Coward
      Anonymous Coward

      On the upside

      with technology like this easily accessible, it becomes feasible to run well-audited, completely open-source CPU designs as your main desktop. As always, who do you trust? Though I'll grant the dynamics will change, yet again. Something for open-sourcers to keep an eye on, and to jump in on at the earliest opportunity.

      1. Chemist

        See..

        http://opencores.org/

    4. Vic

      I don't think so

      > 30 may have ASICs with an additional spying function.

      If you're going to infiltrate a production process to introduce a small number of compromised devices, you've got three possible outcomes :-

      - Your device fails to function, and all boards with it on are scrapped.

      - Your device functions, and goes to some random person, with entirely unpredictable effects.

      - Your device functions, and goes to someone with data that you want.

      Note that only the third option here is useful to a would-be data filcher. Note also that it requires far more than just subverting the production process; it requires the whole ordering and build system to be under the control of the ne'er-do-wells.

      Thus such espionage is only really available to well-connected types. Someone with such capability really isn't going to have any difficulty whatsoever getting a wafer of "modified" devices made by the original fab.

      So I can't see security issues here - it's simply not worth doing.

      Vic.

  3. Swarthy
    Go

    Gutenberg, Mark V

    Next step up from a 3D printer?

  4. Yobgod Ababua
    Boffin

    Graham:

    While I appreciate your enthusiastic tinfoil-hattery as an amusing diversion, I really must take issue with your chain of reasoning and conclusions.

    My understanding is that this device is to the current wafer process what waterjet CNC systems and 3D printers have been to physical manufacturing. It makes small-batch runs affordable. This would be an incredible boon to small manufacturers and hobbyists, as it would enable them to do things on silicon on a smaller scale than is currently possible, which means more designs and experiments will be made and tested.

    It has nothing to do with "slipstreaming" malicious changes into a normal production run. For one thing, normal production runs will almost certainly continue to use the current wafer process, as the economy of scale catches up quickly there once the masks are made. For another, making any change to the layout of an ASIC requires at the very least that the entire chip be re-floorplanned, timing-checked, etc., which is neither trivial nor guaranteed to work. It's actually much easier for such entities to get malicious routines inserted under the current model, by working behind the scenes with the proprietary mega-chip designers in the first place.

    Besides, if this device fulfils its promise, I'd look forward to a growing community of open-source ASIC plans that you could have chipped out by the reputable company of your choice, after you've reviewed them for malicious government transistors. And when it's discovered that Motorola backdoored your cellphone for the government, it's at least possible that a company or group could get together and spin out a compatible yet safe version of the chip in question (something the high cost of masks currently prohibits).

    1. Graham Wilson
      Boffin

      @Yobgod Ababua - Wish I could fully agree with you.

      Wish I could fully agree with you.

      You're correct about the CNC process and 3D printers and the production processes etc. However, I wasn't specifically referring to them anyway. Sorry if I didn't make this fully clear: I was actually referring to substituting superficially identical chips, but with super-set instructions, into production lines for manufactured equipment. (How chips are actually made is somewhat irrelevant to the topic, thus a wide-ranging discussion covering everything from silicon compilers to tweaking/reprogramming microcode to designing chips with internal bond-outs would serve little purpose.)

      The fact remains that synergies from different and improving manufacturing processes are making ASICs easier and cheaper to produce, and hence making it easier to obfuscate what they actually do. It's an issue now, and will be increasingly so as time goes on.

      Let me skirt around specific sensitive stuff by giving some ancient and trite examples.

      - The Intel 8085 and Zilog Z-80 look essentially identical to many programs written for them (similarly the Intel 8088 and NEC V20). However, the Z-80's super-set instructions put it in a class apart for programs specifically written for it; the 8085 knows nothing of them, and an 8085 system will crash unless a substitute 8085 library is available. Even then, emulation cannot always substitute, as some functions are just too different. And unlike the known differences between an 8085 and a Z-80, specially designed ASICs can contain super-sets that are very difficult to find.

      - Today, the operation of many chips is disguised/obfuscated by labelling them incorrectly, omitting labels or deliberately removing them; such nefarious activities have been going on for many years. As chip design becomes cheaper, super-sets become easier to disguise and manufacture, and eventually hidden internals will replace all the labelling tricks. Just think back to the days of the false parity chips on memory--a chip designed with no function other than to fool the system (and user) into 'thinking' there were 9 memory chips when in fact there were only 8 (sketched in code at the end of this list). It was a nasty, unethical deception and remains so (but memory manufacturers actually got away with it).

      - Many electronic appliances, TVs etc. contain ASICs with factory setups that are unknown even to the service industry (you'll be given many common adjustments, but try to find out how to increase the line or field scan drive amplitude and you'll almost certainly be stymied). The fact is that these hiding techniques are already in service; the next step will be much tighter integration within the chip itself.

      - And that is happening already. For the most part this is to ward off one's competitors, but taken to the logical next step it'll soon be commonplace to have chips whose parameters aren't fully published and which also carry swathes of hidden super-set instructions. Take Linksys routers, for instance: the WRT310N etc. have special unpublished talk-home features that are inaccessible to normal users and whose exact purpose remains obscure. Why this issue hasn't been more controversial also remains unclear.

      - Nowadays, chips are incredibly easy to produce compared to, say, 20 or 30 years ago. Moreover, there is a plethora of suppliers, and often the source of the silicon cannot be guaranteed or identified with any certainty, let alone properly authenticated for function (at least not without very considerable effort).

      - I don't see hidden architectures in ASICs as part of some massive conspiracy; the situation has simply developed in a similar way to what happened with compiled code. If all Microsoft had to write Windows with was a BASIC interpreter, the whole issue of product authentication would either not exist or would be very different indeed. Just as compilation makes it very difficult to reverse-engineer code, easier access to ASIC design and development will provide much more opportunity to obfuscate chips' function.
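
      The false-parity trick above, sketched in C (illustrative only; the real thing was a handful of gates):

        /* A real ninth memory chip stores the parity bit written with
           the data; the fake part just recomputes it at read time, so
           the check can never fail -- and never catches an error either. */
        #include <stdint.h>

        static uint8_t parity_bit(uint8_t data)
        {
            data ^= data >> 4;            /* fold bits together */
            data ^= data >> 2;
            data ^= data >> 1;
            return data & 1;
        }

        /* Honest module: returns the bit that was stored at write time. */
        uint8_t real_parity_chip(uint8_t stored_bit)
        {
            return stored_bit;
        }

        /* Fake module: derives the bit from whatever the (possibly
           corrupted) data now says, so it always "matches". */
        uint8_t fake_parity_chip(uint8_t data_now)
        {
            return parity_bit(data_now);
        }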

      That said, I've had to investigate 'louseware' within ASICs, although I can't be more specific.

    2. Graham Wilson
      Coat

      @Yobgod Ababua - No comment.

      1. Intro/abstract: http://www.dsto.defence.gov.au/publications/scientific_record.php?record=9736

      2. Silicon Trojans: http://dspace.dsto.defence.gov.au/dspace/bitstream/1947/9736/1/DSTO-TR-2220%20PR.pdf

      3. Trojan Detection Using IC Fingerprinting from IBM: http://domino.research.ibm.com/library/cyberdig.nsf/1e4115aea78b6e7c85256b360066f0d4/f6e86bf32ce991d68525723c005c8be6?opendocument (download PDF from page).

      4. More on why it's an existing problem: http://www.mil-embedded.com/articles/id/?3748


  5. Al 4

    Modular chips

    This will make the creation of custom modular chips easier. Yes, some will create their chips from scratch, but a lot will be built from modular components. Want 2k of A/D sampling? 10MB of EEPROM? Lots of I/O pins for parallel data sampling with ESD protection? These kinds of customised chips will most likely be the first ones created. Instead of multiple chips to provide the functionality you need, you'll have just one: true single-chip products.
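
    As a sketch of what firmware might see on such a single-chip part (register names, layout and base address are entirely invented for illustration):

      /* Hypothetical register map for the kind of single-chip mash-up
         described above: ADC + EEPROM + protected parallel I/O. */
      #include <stdint.h>

      typedef struct {
          volatile uint32_t adc_ctrl;    /* A/D sampling control/status */
          volatile uint32_t adc_data;    /* latest conversion result    */
          volatile uint32_t eeprom_addr; /* on-die EEPROM address       */
          volatile uint32_t eeprom_data; /* on-die EEPROM data window   */
          volatile uint32_t gpio_dir;    /* parallel I/O direction      */
          volatile uint32_t gpio_in;     /* ESD-protected inputs        */
          volatile uint32_t gpio_out;    /* outputs                     */
      } custom_chip_regs;

      #define CHIP ((custom_chip_regs *)0x40000000u)   /* invented base */

      static inline uint16_t read_sample(void)
      {
          while (!(CHIP->adc_ctrl & 1u))   /* poll conversion-done bit */
              ;
          return (uint16_t)CHIP->adc_data;
      }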

  6. Anonymous Coward
    Thumb Up

    2 forks of application

    Forgetting about the security issues which may well arise, there are two main types of use. The first is short-run applications that need the performance, as has been mentioned. The other is as a prototyping tool, used by designers in the same way they now use 3D prototyping devices to fine-tune the shape and form of objects before committing to expensive production lines.

  7. Robert Heffernan
    Black Helicopters

    Industrial Espionage

    Such a device could also be used in an industrial espionage capacity. Say a chip designer hacks into a competitor's R&D system and is able to download a copy of the design for a super-duper up-and-coming chip. The evil designer can basically 'burn' a copy of the stolen silicon data to do physical testing of the unreleased ASIC design, testing that couldn't otherwise be done with an FPGA due to complexity, mixed-mode design, etc., all without spending a massive amount of money on physical masks.

    For example: AMD steals an upcoming Intel design (or Intel steals an AMD design). Due to patents, reverse engineering, etc., AMD (or Intel) can't actually copy the design; they'd get found out the moment the silicon hit Intel's (or AMD's) electron microscope. So they burn a physical copy of the chip to test it, and target the improvements to their own design to outperform the rival's.

  8. AlistairJ
    FAIL

    Commentard fail

    This story seems to have dredged up an awful lot of uninformed comment.

    My attentions will be focused elsewhere in future.

    1. Anonymous Coward
      Anonymous Coward

      Re: Commentard fail

      Spot on. And what is it about the Reg that attracts the delusional tin hat brigade? Some of these guys have presumably been wiping their sweaty brows with their mercury delay lines.

      1. Graham Wilson
        Flame

        @Anonymous Coward--Whatever your 'poison', you should consume it with humble pie.

        At least AlistairJ isn't anonymous.

        Yours must be another substance; I'm sure the effects of mercury don't include seeing the world through rose-coloured glasses. Quite the contrary: there'd be a terrible sense of reality about it (and it's certainly not recommended).

        Whatever your 'poison', you should consume it with humble pie.

        Herewith is a little (declassified) light reading about the serious security threats that silicon Trojans pose:

        1. Intro/abstract: http://www.dsto.defence.gov.au/publications/scientific_record.php?record=9736

        2. Silicon Trojans: http://dspace.dsto.defence.gov.au/dspace/bitstream/1947/9736/1/DSTO-TR-2220%20PR.pdf

        3. Trojan Detection Using IC Fingerprinting from IBM: http://domino.research.ibm.com/library/cyberdig.nsf/1e4115aea78b6e7c85256b360066f0d4/f6e86bf32ce991d68525723c005c8be6?opendocument (download PDF from page).

        4. More on why it's an existing problem: http://www.mil-embedded.com/articles/id/?3748
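
        The core idea in reference 3 boils down to correlating a suspect part's side-channel trace against a golden reference. A toy version in C (the traces and the threshold are invented for illustration; real fingerprinting uses many traces and proper statistics):

          /* Toy side-channel fingerprinting: a trojan's extra switching
             activity pulls the correlation with the golden trace down.
             Traces and threshold here are invented for illustration. */
          #include <math.h>
          #include <stdio.h>

          static double correlate(const double *x, const double *y, int n)
          {
              double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
              for (int i = 0; i < n; i++) {
                  sx += x[i];  sy += y[i];
                  sxx += x[i] * x[i];  syy += y[i] * y[i];
                  sxy += x[i] * y[i];
              }
              return (sxy - sx * sy / n) /
                     sqrt((sxx - sx * sx / n) * (syy - sy * sy / n));
          }

          int main(void)
          {
              double golden[]  = { 1.0, 2.1, 0.9, 3.0, 1.1 };
              double suspect[] = { 1.0, 2.0, 1.4, 3.1, 1.9 };  /* extra draw */
              double r = correlate(golden, suspect, 5);
              printf("fingerprint match r = %.3f %s\n", r,
                     r < 0.98 ? "(flag for inspection)" : "(looks clean)");
              return 0;
          }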

    2. Graham Wilson
      Unhappy

      @AlistairJ--Do you own a mirror?

      Do you own a mirror?

      Check note to yuh mate A.C.

  9. Anonymous Coward
    Thumb Up

    FPGAs as cheap as (micro) chips, forget ASICs

    but damn it, we want FPGAs as cheap as (micro) chips, like these:

    http://electronicdesign.com/article/digital/1-5-ghz-fpga-takes-clock-gating-to-the-max19952.aspx

    and these http://www.eetimes.com/electronics-news/4210263/Intel-to-fab-FPGAs-for-startup-Achronix?pageNumber=1

    Dylan McGrath, EE Times, 10/31/2010:

    "FPGA at 65-nm moving directly to 28-nm and 22-nm later, with up to 2.5 million look up tables (LUTs), twice as many as any other FPGA, Achronix said. Because of Intel's process technology, the devices will also offer a cost advantage of more than 40 percent, according to the company.

    Like the previous generation of Speedster, the devices will also offer a peak performance of 1.5 GHz, 300 percent faster than other FPGAs, and consume 50 percent lower power, Achronix said."

    Then we can have the so-called real 'reconfigurable computing' that's now back in vogue, instead of the fixed functions and inevitable errata that ASICs have today.

This topic is closed for new posts.
