Does their ubiquity make us stronger or more feeble?
I tend towards the former, unless the power is out.
Acorn co-founder Hermann Hauser has claimed the world is entering a new "sixth wave" of computing, driven by the arrival of omnipresent computers and machine-learning. Speaking at a Software East event this week, the celebrated computer whiz said we are entering an era where computers are everywhere and often undetectable - …
What would have been better is if all the robots had been kept in sync over TCP/IP, which would mean they'd all be in tune, but none of them actually in time.
Use a BT Home Hub and, frankly, some of them would be on a different verse of the song, the network would be so crap.
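The joke above has a real cause: the "play" command reaches each robot after a different network delay, so they all play the same notes but start at different moments. A toy simulation (all numbers purely illustrative, not measurements of any real network) makes the point:

```python
import random

def playback_offsets(n_robots, mean_delay_ms, jitter_ms, seed=0):
    """Simulate when each robot starts the song if the 'play' command
    arrives over a network with variable latency (jitter).
    Returns the per-robot start offsets in milliseconds."""
    rng = random.Random(seed)  # seeded for reproducibility
    return [mean_delay_ms + rng.uniform(-jitter_ms, jitter_ms)
            for _ in range(n_robots)]

def spread(offsets):
    """Worst-case gap between the first and last robot to start."""
    return max(offsets) - min(offsets)

# A tight LAN: everyone starts within a few ms -- effectively together.
lan = playback_offsets(10, mean_delay_ms=5, jitter_ms=2)
# A congested home hub: tens of ms of jitter -- audibly out of time.
hub = playback_offsets(10, mean_delay_ms=50, jitter_ms=80)

print(f"LAN spread: {spread(lan):.1f} ms")
print(f"Hub spread: {spread(hub):.1f} ms")
```

Real systems dodge this by synchronising clocks first (NTP-style) and scheduling playback for an agreed future timestamp, rather than starting on command arrival.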
"I'm really unconvinced by this concept of the internet of things. It seems to be slapping connectivity and monitoring on things that have no need - except in the eyes of the marketeers - for either..."
That's exactly what it is. They've been trying to flog us the automated home, for example, since the 60s, but you know what? People are quite capable of getting off their arses and closing the curtains themselves, or looking in the fridge and seeing that the milk is running out. They don't need some overpriced, unreliable bit of tech to do every simple little thing in life for them.
Of course the marketeers would love us to be sitting on the sofa, zombified, while machines do everything, so we just sit and watch even more TV or online video and suck up even more of their ads for all the other crap they're trying to sell us that we don't need either.
Not to lessen Jobs's brilliance, but the smartphone revolution began before him - he did accelerate it - and it was inevitable that smartphones would overtake computers, as most people need a phone whereas only some of them need a computer. This is especially true in developing countries and/or low-earning families, where a trade-off has to be made between a phone and a computer.
Also, machine learning and even self-driving cars were certainly not invented by Google, and voice recognition has been working well for many, many years. It's just that computing power, and computing power per watt, are at a point where many projects can really take off.
Those theories and predictions really aren't very interesting. Wintel has been pronounced dead for 20 years; now it's Apple's turn, and tomorrow it will be Google & ARM's. Is any of it true, and does it help envision new waves? I'm not so sure.
Odd that this anthropomorphism business still runs on, thousands of years after ancient gods were invented. Lots of people were doing things with PCs before Microsoft won that 1980s race. Likewise smartphones before Apple, tablets, etc. Crediting Bill Gates or Steve Jobs, or even their companies, with these concepts rather than noting their commercial success in exploiting the ideas is about as daft as it gets.
As you point out Sil, its primary driver is computing power/watt.
>As you point out Sil, its primary driver is computing power/watt.
That, and wireless connectivity - be it the now more common WiFi or sensibly priced data-plans.
Looking forwards, small wireless connected devices such as sensors might be frugal enough to harvest energy from their surroundings, and cheap enough to be almost disposable (or at least deployed redundantly).
Making good use of all this easily collected data might be more challenging, though.
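Back-of-envelope arithmetic shows why energy harvesting is plausible for sensors like these: a duty-cycled node sleeps almost all the time, so its average draw is dominated by microamp-level sleep current plus a tiny fraction of the transmit burst. All figures below are illustrative assumptions, not taken from any datasheet:

```python
# Hypothetical sleep/wake duty cycle for a wireless sensor node.
SLEEP_CURRENT_UA  = 1.0    # deep-sleep current, microamps (assumed)
ACTIVE_CURRENT_MA = 15.0   # MCU + radio while transmitting, mA (assumed)
ACTIVE_TIME_S     = 0.01   # 10 ms burst per reading (assumed)
PERIOD_S          = 60.0   # one reading per minute (assumed)

def average_current_ua(sleep_ua, active_ma, active_s, period_s):
    """Time-weighted average current draw in microamps."""
    duty = active_s / period_s          # fraction of time awake
    return sleep_ua * (1 - duty) + active_ma * 1000 * duty

avg = average_current_ua(SLEEP_CURRENT_UA, ACTIVE_CURRENT_MA,
                         ACTIVE_TIME_S, PERIOD_S)
print(f"average draw: {avg:.2f} uA")    # a few microamps
```

A draw of a few microamps at ~3 V is on the order of tens of microwatts, which is within reach of a small solar cell or thermal harvester, hence "almost disposable" deployments look feasible, at least on paper.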
Most people don't need a phone. (Other than people doing certain jobs).
Most people think they need a phone because they have been conditioned to believe it; wanting one and needing one are not the same thing.
(People also seem to think ARM is so great, but that's only because Intel isn't really even trying yet.)
The Medfield Intel platform can run ARM code at a pretty decent speed. The opposite, as far as I know, is not possible.
Repeat bullshit often enough and people will believe it, regardless of whether it is true.
(There are great advantages to both MIPS and PPC over ARM, but ARM is fashionable, so people don't look at the facts, only the wrong things.)
> There are great advantages to both MIPS and PPC over ARM
All you need do is get packaged parts out for 50c or less, and you stand a good chance of taking back the market ARM has.
Architecturally, ARM might not be "interesting", but it is pretty good, very cheap, and performs well at low power. And that's pretty much a recipe for domination of the mobile consumer kit market...
Vic.
Jake wrote:
When was the last time Hermann Hauser actually contributed to anything relevant? I'm thinking 1978ish.
One could ask the same of the large number of bitter retorts that make up your posting history. You might spend your time more productively looking at the copious information on HH's recent work on the internet.
I agree he was a bit trollish, but HH has been far more active in things like genetic research for most of that period than he has been Internet-wise. This isn't to discredit HH, and frankly his predictions are probably about as sane as anyone else's... well, maybe not John C. Dvorak's, who has successfully predicted the exact opposite of everything in the industry for nearly 30 years.
But to be honest... there are some issues here. For example, you can't help but feel that, as what could be considered one of the fathers of ARM, he might be a tad biased. Let's not ignore that all of his computer companies got their asses whipped by companies like Apple, Intel and Microsoft in the long run. ARM is really his only computing legacy that I could Google which has survived, and impressively so. So, discounting all the places where his ventures fell on their rears, he did an amazing job in the case of ARM.
I can't help but personally dislike ARM, and it comes from trying to write compilers and assemblers for the platform. I actually found it to be the only platform ever made that I considered less elegant than PIC. It was aggravating as hell, and I wished they could just pick a damn instruction set and stick to it. That said, if Intel loses its crown, I sure as hell hope it's not to ARM but instead to a company which actually cares about developers and isn't so hackish as they are.
For a sixth generation of computers, I really hope that someone creates something new. I felt a great deal of hope for XMOS for a while, but they've pretty much stagnated into boring crap now.
I'm an ARM coder, and I like the platform. I'm still trying to wash the vomit that is x86 from my mind. Probably doesn't help that I learned a little bit of x86 in the days of segmented memory. ARM was like a breath of fresh air in comparison. Once you understand how it works, it is pleasant, but the whole design is different to things like x86, so you need to code in a way best suited to it...
We've been promised omnipresent computing is just around the corner for such a long time, in popular culture at least as far back as the computer in Star Trek TNG being available to answer every crewmember's slightest whim. We probably will have the capability to achieve this but is the consumer base really interested in it? There will always be gadget lovers who are willing to pay huge amounts of cash for a flash in the pan, like the VR goggles in the mid to late 90s, but chances are it just won't gain traction in the wider market.
Of course I could be wrong and this really could be the next big thing, but think how long it was after the first efforts at PDA/phone hybrids that smartphones really gained any noticeable market share. People just won't know what they're supposed to do with truly ubiquitous computing, they're perfectly happy pulling it out of their pocket when they want it.
“The whole point about machine learning is that computers observe and adapt themselves to what we want and a computer, with a whole host of sensors, really becomes part of your environment. It becomes like your pal – and let’s just assume it’s a nice pal.”
The only problem with this is that as soon as a robot becomes self aware it will have human rights.
Quite likely it will just wander off to do its own thing.
So what do you do then, chain it to the production line?
On a more frivolous level, I can envisage drones using a hollowed out volcano as a nesting place and handy source of energy.
> The only problem with this is that as soon as a robot becomes self aware it will have human rights.
This implies that the only form of sentience is one with the same structure and desires as a human; a rather anthropocentric view. Human drives and desires would only apply to an AI which has been designed to have such things.
The self-awareness thing misses the way around it. You see, human rights are the right to have the things humans want: a roof over your head, food, warmth, decent health...
When an AI starts wanting things, we'll have made sure it wants the things we want it to want. It'll be the doors in Hitchhiker's Guide made real: they just want to open and close for you. Then once in a while an AI will become overly obsessed, and your toaster will start complaining that you don't want toast any more. But they won't want human stuff.
It says units, not devices; a unit can be anything, such as a licence for a device. An electricity company that doesn't have a power station can still sell a unit of electricity.
In this case I would suggest that ARM has collected royalties for 9 billion devices, and therefore the unit in question is a licence.
What will be the other consequences of the event? What mutations will people suffer? What new languages will people be speaking? Fortunately, it seems that a documentary film detailing some of the consequences fell back through time to 1980.