AMD beta seeks CPU-GPU harmony

AMD has released a free update to its ATI Stream SDK that offers OpenCL support for CPUs, taking the power of that parallel-processing technology one step closer to true usability. And if you're worried that the company is stepping off the open-standards reservation by doing so, fear not: AMD has submitted the appropriate …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward

    Deja Vu?

    "OpenCL divides workloads among CPU and GPU cores, accelerating tasks by divvying up processes among the cores, offloading such parallel-data tasks as media, video, audio, and graphics processing that would otherwise be handled by the CPU onto the broad parallel-processing shoulders of a modern GPU."

    Isn't that called an Amiga? What did they do, hunt down and hire every ex-Commodore engineer they could find? All they need to do now is find an audio processor called Denise, a gpu called Agnus and an OS that can display applications with different resolutions on the same monitor at the same time, each running concurrently so that things like copying data to a hard disk while it's being formatted become possible.

  2. Louis Savain

    Very Interesting Article

    "All well and good, but our experience with OpenCL has shown it to be a bear to program with."

    **

    Exactly. This is essentially what it comes down to. Does OpenCL make parallel programming of heterogeneous processors easy? The answer is no, of course, and the reason is not hard to understand. Multicore CPUs and GPUs are two incompatible approaches to parallel computing: the former is based on concurrent threads and MIMD (multiple instruction, multiple data), while the latter uses a SIMD (single instruction, multiple data) configuration. They are fundamentally different, and no single interface will get around that fact. OpenCL is really two languages in one (see the sketch after this comment). Programmers will have to change their mode of thinking in order to take effective advantage of both technologies, and this is the primary reason that heterogeneous processors will be a pain to program. The other is multithreading, which, we all know, is a royal pain in the arse in its own right.

    ***

    Obviously what is needed is a new universal parallel software model, one that is supported by a single *homogeneous* processor architecture. Unfortunately for the major players, they have so much money and so many resources invested in last century's processor technologies that they are stuck in a rut of their own making. They are like the Titanic on a collision course with a monster iceberg. Unless the big players are willing and able to make an about-face in their thinking (can the Titanic turn on a dime?), I am afraid that the solution to the parallel programming crisis will have to come from elsewhere. A true maverick startup will eventually turn up and revolutionize the computer industry. And then there shall be weeping and gnashing of teeth among the old guard.

    **

    Google "How to Solve the Parallel Programming Crisis" if you're interested in an alternative approach to parallel computing.

  3. Anonymous Coward

    RE: Deja Vu?

    No, you'll need at least a "Fat Agnus" and 2MB of chip memory to make proper use of openCL! ;-)

    I am quite interested in how quickly and efficiently companies like Apple (yes I'm a mac user, sue me.) and Adobe implement OpenCL. I've been using Adobe's After Effects as a main tool for going on ten years now, and its OpenGL implementation has been nothing short of abysmal on every system I've used it on (Mac and PC, Quadro/Fire and "normal" graphics cards). I know Snow Leopard will have OpenCL built in, and I'm sure CS5 products from Adobe will too. But as with the few movie-encoding programs available at the moment that are built to take advantage of GPUs for encoding, the results are far from spectacular other than in "cherry picked" situations. And although refinements will inevitably be made to the software, I'm still waiting for OpenGL to be implemented in a way that allows me to be more productive, let alone OpenCL!

    Bollocks to new technology, bah humbug.

    I'm off to build my Amiga render farm.

    (I actually do have two working 4000's here at the moment! Anyone have any Zorro III Ethernet board lying about?)

  4. asdf

    AMD ATI epic fail

    This is all AMD has to show after dumping billions into buying ATI? That acquisition has to be one of the 10 worst in history. AMD was on top of the world before buying ATI and has had nothing but losing quarters and shrinking market share since. Actually, both companies were at the top of their game before, and both have since fallen behind the market leaders. The great synergy Hector (ex-AMD CEO) was talking about was so good it cost him his job. AMD is just like DEC and is dying slowly. No wait, DEC was about releasing cutting-edge technology and then way overcharging for it, but I digress. AMD, except for a few good years, was always a cheaper, me-too cloner of others' technology. I did enjoy their catering to enthusiasts and have fond memories of some of their chips, but alas, RIP AMD.

  5. Andy 70

    to AC at 00:19 GMT

    No such thing as a Zorro III Ethernet board; the nearest you'll get is a Deneb Zorro III USB2 card and some USB-to-Ethernet dongles. Works well enough.

  6. Anonymous Coward

    @ yes I'm a mac user, sue me.

    No point. You have no money left as you are a Mac user. :)

  7. amanfromMars 1 Silver badge

    The true maverick always travels with a deck of aces

    "A true maverick startup will eventually turn up and revolutionize the computer industry. And then there shall be weeping and gnashing of teeth among the old guard." .... By Louis Savain Posted Wednesday 5th August 2009 20:32 GMT

    How very true, Louis, and in an instant become the richest and most powerful Human Being on the Planet, although their Presence, for there are bound to be more than just the One, will in All Likelihood be Maintained and Represented and Guarded Virtually, for Obvious InterNetional Security Reasons. Hell, who wants to be a prisoner like Bill or Warren or Larry or George or Paul or whoever, hiding away like thieves because of the public's wealth they have gamed/milked..... for earning it is not how it works in IT, is it? IT is All Smoke and Mirrors and the Beta Management of Perception with the Realisation of a More Powerfully Controlled Viable Imagination with the Ubiquitous World Wide Web Internet GUI for the Browser Inputting of Output Source to Countless Communicating CPU OSSystems.

  8. Anonymous Coward

    Khronos Schmonus

    It would be nice if Khronos certification meant that some great minds in some top companies had pooled their collective wisdom to produce a reasoned and intelligent standard. At the *major* semiconductor company where I worked, however, we made the biggest looser in the team our Khronos representative as nobody else wanted the job. A brain dead tadpole would have been more credible than him in the role.

  9. Anonymous Coward

    All well and good, but it doesn't help UVD ASIC use

    "In canned statement, AMD SVP Rick Bergman said: "By supporting multi-core CPUs and GPUs with our OpenCL environment, AMD gives developers easy access to both processing resources, so they can efficiently write cross-platform applications for heterogeneous architectures with a single programming interface."

    "

    That's all well and good, Rick, but it doesn't help utilise the generic hardware-assisted UVD ASIC for AVC, VC-1 (and to a lesser degree HD MPEG-2) video playback and editing, as found in all current AMD/ATI graphics cards, does it? You're so FAR behind Nvidia's CUDA ASIC and their massive third-party developer code drops it's not funny... and hasn't been for well over a year now...
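
    As a minimal, hypothetical illustration of the "single programming interface" claim in the quote above (an editorial sketch, not a comment from the thread, and it says nothing about the UVD video hardware the commenter is asking about): the same host-side OpenCL calls select either a CPU or a GPU device, and only the device-type flag changes. This assumes the standard Khronos CL/cl.h header; error handling is abbreviated.

    ```c
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void)
    {
        cl_platform_id platform;
        cl_device_id   device;
        cl_int         err;

        clGetPlatformIDs(1, &platform, NULL);

        /* Ask for a GPU; fall back to a CPU device. The rest of the host
           code (contexts, queues, kernels) is identical either way. */
        err = clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);
        if (err != CL_SUCCESS)
            clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &device, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);

        char name[128];
        clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
        printf("OpenCL device in use: %s\n", name);

        clReleaseContext(ctx);
        return 0;
    }
    ```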

  10. Anonymous Coward

    @ac - 10:58

    The biggest "looser", eh?

    Well if you were one of the sharper minds, the biggest "looser" must have been a right dunce!

  11. Ken Hagan Gold badge

    Bergman does not have a point.

    "Until Intel releases its many-core Larrabee processor in the first half of next year, AMD is the only company that can toss a homegrown OpenCL net over both x86 CPUs and GPUs from its own stable."

    No serious deployment is going to happen within 6 months. Beyond that timescale, Larrabee will have arrived and developers will have the choice. My guess is that they'll prefer the programming model offered by a stripped down CPU over that of a beefed up GPU.

    Having said that, Intel to date have pretty much promised to screw Larrabee's chances by offering it only on graphics adapters rather than on the motherboard. It's all for Intel to lose and so far they look like pulling it off.

