Nvidia boss: cloud, ¡Si! Intel, ¡No!

Nvidia chief exec Jen-Hsun Huang sees the computer industry on the cusp of radical changes. And with his company now about 65 per cent devoted to parallel computing, you can easily guess which technology he believes will drive that transformation — and which company he believes will lose. "Whatever capability you think you …

COMMENTS

This topic is closed for new posts.
  1. David Hicks
    Pint

    Good luck with that

    Most programmers are still struggling with the idea of having a few threads around, never mind actually parallelising algorithms for efficient work on embarrassingly parallel architectures.

    Nvidia may well be able to provide a ton of compute power compared to Intel, but it's the question of how well that power can be utilised, and by how many, that will need to be answered before it can get anywhere close to knocking Chipzilla off its dominant perch.
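
    For what it's worth, the "easy" end of that spectrum looks something like the rough Go sketch below (a made-up example, nothing anyone actually ships): an embarrassingly parallel map, where every element can be crunched independently, so the work fans out across goroutines with no shared state to fight over. The trouble is that most real code doesn't decompose this neatly.

        // Rough sketch (made up): an embarrassingly parallel map in Go.
        // Each element is processed independently, so the work splits cleanly
        // across goroutines and each one writes to its own slot in the result.
        package main

        import (
            "fmt"
            "sync"
        )

        func main() {
            inputs := []int{1, 2, 3, 4, 5, 6, 7, 8}
            results := make([]int, len(inputs))

            var wg sync.WaitGroup
            for i, v := range inputs {
                wg.Add(1)
                go func(i, v int) {
                    defer wg.Done()
                    results[i] = v * v // stand-in for the real per-element work
                }(i, v)
            }
            wg.Wait() // block until every worker has finished

            fmt.Println(results) // [1 4 9 16 25 36 49 64]
        }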

    Pint icon chosen because it's Friday afternoon here and I feel a pint isn't all that far off...

    1. Robert Forsyth

      Re: Good luck with that

      I cannot find my crystal ball, but...

      One idea is you write your application in a sort of high level (3D iconic) script, and the parallel processing works out the best way to make it parallel.

      An event-driven web page (RIA) is almost parallel: some animated objects are running by themselves (providing a human-followable transition between states), while the user manipulates the space, while a calendar event fires, while your mobile device finds a better connection to the internet, while your location is used to find helpers or friends and alert you to their presence, ...

      I'm sure many web browser users read one tab while another is loading.

      The trick is simple processes, complex data space.
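
      To make that slightly less hand-wavy, here's a toy Go sketch (invented event names, not tied to any real RIA framework) of that shape: the "processes" are trivial goroutines firing events independently of each other, and the only complicated thing is the shared data space those events update.

          // Toy sketch (invented events): simple processes, complex data space.
          // Three trivial goroutines emit events on their own schedule; one loop
          // folds whatever arrives into the shared state.
          package main

          import (
              "fmt"
              "time"
          )

          func main() {
              events := make(chan string)

              go func() { // animation ticking along by itself
                  for i := 0; i < 3; i++ {
                      time.Sleep(50 * time.Millisecond)
                      events <- "frame drawn"
                  }
              }()
              go func() { // calendar firing an event
                  time.Sleep(80 * time.Millisecond)
                  events <- "calendar event fired"
              }()
              go func() { // device finding a better connection
                  time.Sleep(120 * time.Millisecond)
                  events <- "switched to a better connection"
              }()

              // All the complexity lives in the data the events update, not in the processes.
              state := map[string]int{}
              for i := 0; i < 5; i++ { // five events are sent in total above
                  e := <-events
                  state[e]++
                  fmt.Println("handled:", e)
              }
              fmt.Println(state)
          }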

  2. stu 4
    Boffin

    Time to brush the dust off those old textbooks

    Occam might have its day yet.....
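
    For anyone who skipped those textbooks: Occam was built on CSP, plain processes talking over channels, and Go's goroutines and channels are much the same idea in modern clothes. A rough sketch (in Go, not actual Occam, and entirely made up):

        // Rough Go stand-in for Occam's "two processes joined by a channel".
        // The unbuffered channel makes each send/receive a rendezvous, much like
        // Occam's ! (output) and ? (input) on a CHAN.
        package main

        import "fmt"

        func main() {
            c := make(chan int) // unbuffered: sender and receiver meet in the middle

            go func() {
                for i := 1; i <= 3; i++ {
                    c <- i * i // Occam-ish: c ! i*i
                }
                close(c) // no more values coming
            }()

            for v := range c { // Occam-ish: c ? v, repeated until the channel closes
                fmt.Println("received", v)
            }
        }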

  3. Richard Wharram

    Nerd alert

    CLOUD IS NO INTEL

  4. Adam Salisbury
    FAIL

    I wanna live in a Tegra house too!

    But what happens when you hit your monthly bandwidth cap, eh? It's been obvious for a while (since the x64 architecture, IMHO) that hardware horsepower will always be knee-capped either by the internet pipe, if we're talking cloud services, or by software limitations, if we're talking about 64-bit and parallel systems.

    All these cloud/HPC evangelists keep underestimating the desire of people to have some CPUs at home as well as in the datacentre. They seem to have put the cart before the horse by trying to find reasons to use something they've already built (rather than the other way around).

  5. Pascal Monett Silver badge

    Binoculars

    Connected to the cloud. That's an even worse idea than GSM phones. Not only will Big Brother know where you are, he'll also know what you're looking at.

    I'd be scared if I thought it would be possible. Fortunately, I don't. In ten years' time I'm sure wireless coverage will be better, but it won't allow you to have PC-anywhere 100Mbps connections.

    I would, however, be very interested in enhanced binoculars. I'd like to see what they come up with for enhancing them. Because, unless you have military applications, the most important thing in binoculars is the optics. And we all know that software zooming just loses precision on a portion of what you can see.

  6. Anonymous Coward
    Pint

    "It's no different than us walking up to a car and saying 'Why isn't that a hybrid?"

    Well, the answer to that is that Hybrids are more expensive, not as green as their pious drivers think (not by a mile), not as fuel-efficient as their drivers think unless they're just being used for really short trips and are mains-recharged (making them effectively a poorly designed range-extended _electric_ car), and are just used for eco-posing by tossers in exactly the same way that the Guards Red 911 was used for posing in the 80s.

    By the sounds of it, if programmers can start to take advantage of massively parallel computing then it'll be a hugely better idea than Hybrids. Games shouldn't be too hard to code once the basic ideas behind them are sorted out (you can break down the tasks into concurrent threads pretty easily), and I'd imagine that massive parallelism would be perfect for database applications: it's a lot of simple functions run very quickly. Bang, there are two major segments already on their way.

    You then have web servers: they'd be good on a parallel system for the same reason databases would be.

    Chips like the Parallax Propeller have even been getting hobbyists used to the idea.

    So anyway, it looks to me like there aren't really any huge drawbacks to running massively parallel architectures, except that they're a bitch to program for: you can't just flowchart a program, you need multiple flowcharts running simultaneously and still interacting. Crack the programming and it'll quickly take off; don't crack the programming and it'll fall by the wayside.
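
    Web servers are arguably the existence proof that some of this is already cracked: every request is independent, so each one just runs on its own thread of execution without the handlers ever needing to know about each other. A minimal Go sketch (hypothetical handler, arbitrary port), where the standard library already farms each incoming request out to its own goroutine:

        // Minimal sketch (hypothetical handler, arbitrary port): Go's net/http
        // runs every incoming request in its own goroutine, so the handler stays
        // a simple function and the parallelism comes for free.
        package main

        import (
            "fmt"
            "log"
            "net/http"
        )

        func handler(w http.ResponseWriter, r *http.Request) {
            // One simple function, run once per request, with many potentially
            // in flight at the same time and none of them aware of the others.
            fmt.Fprintf(w, "hello from %s\n", r.URL.Path)
        }

        func main() {
            http.HandleFunc("/", handler)
            log.Fatal(http.ListenAndServe(":8080", nil)) // port chosen arbitrarily
        }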

  7. ElReg!comments!Pierre
    Pint

    I think he means 100 years, not 10

    That is when every device we own will have its own direct fat satellite link, or some kind of new wireless "surface" transmission.

    And even so, most of it is drivel anyway. Time-sensitive tasks (such as, erm, vision, as in binoculars maybe) will always be better performed locally. Not to mention that you can have as many pixels as you want, and very good ones too, and petaflops to process the signal: with crappy lenses you're always going to get crappy images. Unless Nvidia can somehow redefine the laws of physics.

    My tuppence: in 10 years, things will be pretty much the same except that your pencil will have its own IP address and GPS and will insert random ads into what you write. I can also see a lot more domotics, linked to and operated by the likes of Google ("for your convenience"), that will adapt the ads displayed on the toaster to fit your mood based on the data collected (e.g. an ad for Cialis after an all-too-quiet night, maybe?).

  8. Anonymous Coward
    Headmaster

    Perdona...

    But what's the deal with using Spanish words for no apparent reason at all instead of "yes" and "no"? Is it supposed to sprinkle some coolness on otherwise square writers? As a Spaniard myself I am just curious...

This topic is closed for new posts.
