* Posts by Pperson

40 publicly visible posts • joined 24 Mar 2011

UNSW offers free online Computing 1 class

Pperson

Does an interactive video really beat a live human?

Is education really about packaging up information into interactive forms? Then why didn't DVDs and "multimedia" already solve the problem? Or books, for that matter? I can learn from books now, no problem, but back when I was a dumb 17-year-old it wasn't the case: no matter how hard I tried to understand, I needed someone to bump me straight, and to do it in person, whether it was my mates or the (good) lecturers. So I don't reckon all these Internet-based courses are ultimately going to supplant in-person teaching, but I can't really say *why*.

UK ice boffin: 'Arctic melt equivalent to 20 years of CO2'

Pperson

Re: Climate-change sceptics

> ...and refuse to even CONSIDER the possibility that the people who make Billions

Maybe it's a kind of low-level fear: if you consider that it could be your fault, then you have to do something about it. But what are you going to do? Protests make no difference, voting is hopeless "lesser-evilism" anyways, you don't have enough money to make a dent, and taking personal responsibility by ditching your car / electricity / etc. requires a huge effort that many people are already too stressed (emotionally and financially) to make. So to remain sane you deny it. I'm not quite sure how I remain sane sometimes, to be honest!

> ...scientists are engaged in a mega-conspiracy to defraud the public to gain research grants

And there's the other side: scientists ("scientists"?) aren't exactly saints and will certainly beef up their claims and statements ("novelty and significance" in the parlance) in the competition for grants and exposure. Few if any people given airtime in this debate are truly impartial - that's the way the media distorts debates, unfortunately (people who are impartial are 'boring', and also not likely to claw for media attention).

Basically, you are going to have to make up your own mind and decide what you are going to do about it. Which seems to apply to most major questions of substance nowadays.

GM to slash vast outsourced IT empire

Pperson

Re: Let us centralise all our support into just 2 locations...

Reading articles and posts on this over the years, it seems to me that outsourcing doesn't tend to work and 'insourcing' often fails too (perhaps less badly than outsourcing, but still - not exactly inspiring to think that insourcing could be 'less bad' than outsourcing: bad is bad). Yet outsourcing and insourcing *could* be (and occasionally have been) done well. So the question is: what's the actual problem? Studies always point at "mismanagement" and "incompetence", but how come these are allowed to happen - repeatedly? Is there a whole network of kickbacks, corporate infiltration and bribery going on that is kept quiet? It'd explain things better than "people are incompetent", because that doesn't say why incompetence is getting so richly rewarded.

Core Wars: Inside Intel's power struggle with NVIDIA

Pperson

Re: Parallel code easily transferred to very different architecture?

Right on, Michael! I do get tired of hearing/reading these people (and to some extent the Reg itself) talk as if parallel computing is GREAT when in fact it's just forced on us by the dead-ending of CPU speeds. Then there is the insinuation that it's the coders' fault for not being able to take advantage of it - because of course everything boils down to a matrix operation, right? Never mind problems due to parallel memory access bottlenecks, de-synching, etc. etc. But you only find this stuff out the hard way; it's like a dirty secret of the industry.

Hey Reg, how about an article on the *other* side of the parallel coin? You know, the one that says "honestly, this parallel thing isn't nearly what they hype it to be".

China steps up crack down on hi-tech exam cheats

Pperson
Unhappy

Re: It is endemic in education in China at all levels

Thanks for that post, Anon - a very personal insight into what must be a difficult situation for you, having in effect to condone methods that will only lead to the degradation of the students themselves - and all because the system is so focused on numbers and statistics. Speak up and you'll be replaced by someone who doesn't care: the system reinforces itself. Sad to say, it looks like the rest of the world is travelling the same path and not that far behind.

Google to FCC: Protecting Street View coder didn't derail probe

Pperson

Re: Deserving

Hmm. So by that logic:

- If you leave your door unlocked, you deserve to have your house burgled

- If you thoughtlessly leave your wallet somewhere, you deserve to have your identity stolen and credit card bills racked up

- If you walk around in revealing clothes, you deserve to get raped

- If you don't wear a bullet-proof vest, you deserve to be shot through the chest

- If you state non-conformist opinions aloud, you deserve to be put in a concentration camp

... damn, I just realised that half of these are already standard policy! OK, Benjamin 4 - you win.

Top Italian OPERA boffin steps down after faster-than-light mistake

Pperson

Re: "the result of someone's power play"

> Someone schooled in science would say "... We do not have enough data to show that the poor results were a causal factor in the resignation. More information is necessary."

True, I don't have evidence for what I am saying. That's what a scientific case needs, so scientifically we merely have two unconnected facts: the mistake and the resignation.

But I'm not making a scientific case, I'm giving my opinion based on my own experience in science and that of people I know. And I have seen and heard this sort of thing plenty of times, in various forms, where I knew the inner workings, hence I shared my opinion. But you're right, I wrote it a bit 'statement-of-fact'-ually, which it isn't: fair point.

Pperson

Re: "the result of someone's power play"

> a.k.a. how science isn't supposed to work ...but occasionally does : (

A lot more than occasionally. Basically, he had to step down because the team suggested something blasphemous. The stakes are high, beliefs are threatened, egos flare up, jealousy hits. It doesn't matter that they asked for help. Unfortunately science is nowadays conducted almost like a kind of hierarchical religion - not too surprising given that scientists are people, and people are still people no matter where you put them.

Tech titans say sayonara to Japan in quake wake

Pperson

Re: It is sad...

> but it is how the market operates

An interesting comment! Maybe a system that is inexorably bringing the vast majority of us down to poverty wages isn't quite a system we should be agreeing to, even if only for the sake of self-interest? (unless you are somebody extracting the surplus of course - I wouldn't want to suggest that you go against your own self-interest!)

Intel plugs both your sockets with 'Jaketown' Xeon E5-2600s

Pperson
WTF?

Is this really progress? I seem to have missed it.

A lot of talk about technicalities and numbers, but in the end nothing concrete is said about performance. It seems like they no longer have ideas on how to use those extra transistors in the new generation of chips to produce a faster *CPU*, so instead they tweak the old architecture and bring yet more stuff on-board the chip. Commiserations to the Reg though: must be hard writing articles that make tweaks sound like exciting stuff!

Antimatter asymmetry: new results bring solution closer

Pperson

Re: This doesn't get us much further, does it?

> It just poses another question. Why do these mesons and their anti-particles decay differently?

Ah, you've stumbled upon the truth about science: there is no ultimate answer; it's all about keeping scientists in a job to come up with the next question. Now they will have to find you and silence you.

NASA snaps show Arctic melt

Pperson

Re: Ummm...

Wow, really? You must live in a pretty nice place then. Or maybe it was awful back in 1980 too? Where I live the pollution is noticeably worse, visibility has gone down, the water now tastes kind of metallic / weird, and hot-(40s)-but-humid summer days are becoming the norm, replacing the equally-hot-(40s)-but-at-least-dry days of 20-30 years ago. Boy, I really hate the hot+humid days...

I'd be interested to see what anecdotes others have about the areas they live in.

Pperson

Re: Re: If there's nothing to worry about ...

> Perhaps they are first to comment because our entire economy and tax system

> is being hijacked by the green agenda

Well, I wouldn't say *entire* - plenty of hijacking going on in the guise of bank bail-outs too. But yes, there's always room for more excuses to pauperise us plebs. Regardless of whether the justification is fact or fiction, they always find creative ways to make money from it.

But that's money-making. It'd be a pleasant change if the rest of us just cut 'em adrift and started concentrating on the things that really matter. Why get sidetracked by how the ba*ds screw you over for pieces of paper (or numbers on a disk drive in a bank)?

Future of computing crystal-balled by top chip boffins

Pperson

Re: Human FLOPS

Ah, we can't *consciously* do ten million floating-point ops a second. But maybe we are doing them *unconsciously* all the time while processing the world. Example: those people who can find the n-th root of huge numbers faster than a computer. Sure, to them it feels like manipulating colours or something, but how are they arriving at the correct answer unless they are tapping into some kind of serious processing ability that maybe exists at the cellular level in everybody, just more directly accessible to them?

Dyson sinks £1.4m into Cambridge engineering chair

Pperson

Actually, sounds like a bargain

Put 1/1000-th of your wealth into something (that's like $250 or something for mere mortals like you or me), get plenty of kudos for it *and* get the first commercialisation opportunity to make huge amounts of money off of any useful ideas/inventions they come up with.

Nvidia: An unintended exascale-super innovator

Pperson

Not really the programming

> The problem with GPU coprocessors is that they are not as easy to program

I have to disagree with that. Having coded a few different kinds of high-performance algorithms on GPUs, the actual programming is initially odd but you get used to it pretty quickly. To me, the problem is more fundamental: the pattern of memory access (coalesced) that is needed does not work for all (most?) things you want to make massively parallel - miss coalescence and your GPU spends most of its time waiting for memory. And unfortunately the on-chip 'L2'-like cache (shared memory, ~16KB) is too tiny to make a difference for pretty much anything but matrix ops or other highly localisable problems.
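To make the coalescence point concrete, here's a minimal CUDA sketch (the stride and kernel bodies are invented purely for illustration): both kernels move the same amount of data, but the second one's access pattern defeats coalescing, so a warp's 32 loads turn into many separate memory transactions and the kernel spends its life waiting on DRAM.

```cuda
#include <cuda_runtime.h>

// Coalesced: consecutive threads touch consecutive addresses, so the
// hardware can combine each warp's loads into a few wide transactions.
__global__ void copy_coalesced(const float *in, float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[i];
}

// Strided: consecutive threads touch addresses far apart, so each load
// becomes its own transaction and memory latency dominates.
__global__ void copy_strided(const float *in, float *out, int n, int stride)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[((long long)i * stride) % n];  // scattered reads
}
```

Same arithmetic, same data volume; on most GPUs the second kernel runs several times slower purely because of how the reads line up.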

Intel mad for power, but stacked-up dies keep MELTING!

Pperson

Deja Vu?

I seem to recall that the Pentium IV was going to scale to 10GHz too.

"I’m kind of interested in it for a number of reasons, but is it going to take over everything and be the new technology that’s going to drive us to exascale? I don’t believe it,"

What I reckon he is really saying is that Intel have no solutions beyond silicon and so have been reduced to squeezing the last dregs from the current approach, hence multi-core and now stacking.

Faster-than-light back with surprising CERN discovery

Pperson

Re: Brick

> But doesn't that mean that the future is written in stone, immutable and the whole universe a timeless brick?

It might, unless you think of it differently and speculate that the past, present and future are in fact happening together and we lifeforms just separate it out into the three to make sense of it all (or really two, because there is no 'present'). Then the idea is that we work out the contents of our lives while at the same 'time' living it. Because how can we tell the past didn't change if we have no separate reference point outside of the flow of time? An analogy is writing a program - you run it once, it has a bug and crashes (=dies prematurely). You find the bug occurs early in the code (=childhood) and fix it, then re-run the program. The program itself never knew the original buggy code, but you are both the program and the programmer so you are both ignorant AND in control. Then you hit more bugs, and decide it's not worth fixing the program anymore because you're sick of fixing it so you just let it crash, quit and go to another company (=giving up and going to heaven? Well, it's heaven for the first few weeks at the other company anyways... :-)

A weird idea I agree, but quantum entanglement is weird like that too!

Mathematicians slam UK.gov plans to fund statistics only

Pperson

Re: Maybe

> a law should be passed that means you can only stand for parliament, if you have scientific or

> technical expertise

That's called a 'technocracy', and China is (supposedly) the closest equivalent at the moment, since much of the CCP leadership holds engineering or natural science degrees. Unfortunately, having a degree doesn't automatically make you less of a greedy grasper; it just means your lies are put together in a structurally sound way. Which in politics probably makes it *worse*, since it's harder to prove the lies.

Intel extends JavaScript for parallel programming

Pperson

A lot like OpenCL

'River Trail' provides things like ParallelArray, which has a method that can execute a user-given function in data-parallel fashion (ie: running the function for each element in the ParallelArray, passing the function the element's index as well as any other params that the user initially passed through). So effectively SIMD, and quite a bit like invoking GPU kernels in OpenCL - in fact, Intel even says something to this effect.
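For anyone who hasn't seen the GPU side of the comparison, the pattern being described maps onto something like this - a hedged CUDA sketch rather than actual River Trail or OpenCL code, with the elementwise function and all names invented:

```cuda
#include <cuda_runtime.h>

// The moral equivalent of ParallelArray's data-parallel method: launch one
// logical thread per element and hand each thread its index, plus whatever
// extra parameters the caller passed through.
__global__ void map_kernel(const float *in, float *out, int n, float scale)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // "which element am I?"
    if (i < n)
        out[i] = in[i] * scale;  // the "user-given function", inlined here
}

// Host side: size the grid so every element gets a thread.
void map_on_gpu(const float *d_in, float *d_out, int n, float scale)
{
    int block = 256;
    int grid = (n + block - 1) / block;
    map_kernel<<<grid, block>>>(d_in, d_out, n, scale);
}
```

Swap the launch syntax for clEnqueueNDRangeKernel and you essentially have the OpenCL version, which is presumably why Intel makes the comparison themselves.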

Still, it doesn't make me look forward to coding in parallel under JavaScript any more than threads would do. But then Google's Native Client isn't any more innovative either - just different problems, but headaches nonetheless. I wonder if we are witnessing the beginnings of the fracturing of Web programming?

Intel shows linear scaling with MIC coprocessor

Pperson

Same old same old

"Basically, CERN is seeing essentially linear scalability on the Knights coprocessor."

What a shock - take a pre-existing, embarrassingly-parallel distributed algorithm, run it on many-core CPUs (thereby removing the network) and... it remains embarrassingly parallel! Boy, that's magical.

"...it is an easy transition from multicore to many core" As opposed to what? Programming the GPU? Rattner himself says that many-core is just scaled-up multi-core (ie: a name change) so this statement is meaningless. In fact, the whole presentation was meaningless since it's just more-of-the-same as it has been since Core-2 (or even the HT Pentium-IV), just 10 years and more cores on.

"Now, we are just at the beginning of the age of many-core processors."

I'm inspired.

More transistors, Moore’s Law, less juice

Pperson

Re: No direct linkage

Yes, things aren't so clear-cut, as kevin 3 also points out. But it depends on what they are measuring power use against: transistor counts or MIPS/FLOPS. Worse, if it's the latter, and I take the conservative approach you suggest and say that doubling the transistors nowadays only improves actual performance by ~70% (or whatever), then the research suggests modern CPUs (and GPUs) should be using *less* power than a 286 (because the computations are lagging behind the transistor counts, so power efficiency would be outpacing computations). But this clearly isn't the case, or I wouldn't be needing that 1000W PSU. So am I wrong, or did the researchers fudge the numbers to get a good headline? I can imagine mobile chips being competitive with a 486, but certainly not an i7 or Phenom II.

Pperson

The maths doesn't work?

Unless I'm missing something? In particular, if every 18 months we doubled the number of transistors (which is *roughly* the computational grunt) and at the same time doubled the number of computations per kWh (ie: halved the power used per computation), then the power consumption of current-gen CPUs should be about the same as that of a 286. But it's not. Unless they are looking at low-power chips like the Atom rather than mainstream desktop chips like the i5 / i7?
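Spelling the sum out (idealised doublings, which is exactly the premise being tested):

```latex
% After n doubling periods (the 286's 1982 to now is roughly n = 19 lots of 18 months):
P(n) \;=\; \underbrace{2^{\,n}}_{\text{computations/sec}}
      \;\times\; \underbrace{2^{-n}}_{\text{energy per computation}}
      \;\times\; P(0) \;=\; P(0)
% i.e. power draw should still be at 286 levels, which it plainly isn't.
```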

Google Native Client: The web of the future - or the past?

Pperson

The "why bother" isn't very clear...

... and usually that means other unspoken agendas are busy. There are a few things I just don't get about Native Client:

- If its security model is working on x86 instructions, how is that portable to non-x86 machines?

- Given how new the verifying-instructions approach is, how can one claim it is secure? It's not been sitting there getting attacked yet - who knows whether there is a glaring hole that the researchers haven't thought of? Only time will tell.

- The only thing I personally see as a bonus is that you can use a compile-checked and statically-typed language, something that I like better than the dynamic typing and loose OO of JavaScript. But you don't need to support native code to do this. You just need a different language, or simply get stricter with JavaScript, exactly like HTML5+CSS does to old-style HTML. The reasoning of "eases porting existing C code to the Web" isn't exactly inspiring.

On the surface, Native Client fills the same role as Java applets (speed + compiled code). However, it seems more likely to me that Google wants more speed to do things with its Web toys to make them more desirable than the competition's, and the tweaks to its JavaScript engine aren't delivering it. Hence the Native Client idea.

Windows 8 to boot in 8 seconds

Pperson

Don't worry...

...they'll steal your idea next year and call it their own revolutionary innovation.

Australia Institute: we’re all mindless sheep

Pperson

Two sides to every story?

While I won't dispute that people are often "lazy, inattentive, ignorant", there is another side to it: if you cannot find what you want on the first page, then subsequent pages returned by the search engine will typically be even worse, and it is usually more efficient to massage your keywords until you hit the ones that extract the results you want from Google/Yahoo/Bing. In other words, we do what works. When PageRank first came out this was not the case, but nowadays you have to cajole the search engine into giving you what you are really asking for. It would be interesting to see how much of this is due to 'pollution' by pages attempting to manipulate search algorithms into ranking them higher, versus the search engines' own [d]evolution to maximise advertising revenue (the more search terms you are forced to try, the more variety of ads you see).

Deep inside AMD's master plan to topple Intel

Pperson

Re: AMD innovative

Uh, sorry to be a denier, Zog, but while it is true that AMD had a very good chip in the Athlon compared to Intel's Pentium IV, they then did exactly what Intel had done and rested on their laurels to maximise profit. Because funding good research is obviously a waste of money if your current chip is better than your rival's. Hey, can't blame 'em: Intel does the same. Funnily enough, Intel only got back in it by accident with Core (from the mobile guys) and the tables were turned. And we haven't moved since (4GHz, anyone?) and are now in hype mode trying to flog parallel processing.

Google Go strikes back with C++ bake-off

Pperson

56.92s to 3.84s? 1604MB to 275MB?

Reducing the Go code's runtime from 56.92s to 3.84s? That's a ~15x improvement! And 1604MB to 275MB is ~6x better. Unless the original Go code was rather inefficient, this almost sounds more like they changed the fundamental algorithms rather than merely tweaking the code. And as Paul Shirley said above, porting this back to C++ is not a useful comparison, since specific optimisations chosen for the Go version may well do nothing in C++.

But in any event, these new benchmarks only tell us how fast this particular algorithm can be made to run if you are intimately familiar with Go. I'd rather know how easy it is to write "good-yet-fast" code. Personally, I find it fairly easy to write decent, fast code in C++ but hard / nasty to write very fast code. C tends to be speedier but a bit klunkier, Java a bit easier than either but a little slower for some things and a lot slower for others, and C# better than Java on both counts but not as fast as C/C++. I'm inclined to believe the original Go results, which suggest Go is not there yet with the efficiency of compiler-generated code (Java got a lot better on this score over the years), so 'normal'/easy code is slower. You can of course tailor your code to do the compiler's tricks yourself, in which case it will run fast as well, but who wants to do that every time? Especially when you'll have to maintain the mess later!

The Network is the Problem: Barriers to cloud adoption

Pperson

Tradeoffs...

Hmm, the more I think about it, the more it seems that cloud vs local is a set of relative risks. For example, we've been through a "centralising crusade" where all the servers got moved centrally and virtualised. On the one hand, overall uptime is higher; on the other, if things go down it takes ages for the central guys to respond, since we are only one piece of the larger company. Whereas before, the problem would be fixed in minutes, or even noticed by the local IT guys before I could log a job. I would guess that 'cloud' solutions will make this tradeoff bigger: even better uptime, but even worse responsiveness/effects when things go wrong.

Perhaps the focus is too much on the engineering feat involved. That is, 99.9% uptime is only 0.9 points more than 99%, but 10x harder to do (reducing the 'error' from 1% to 0.1%). On the other hand, a five-minute fix is 12x faster than an hour's, yet not a marvel of engineering (it just needs someone nearby with admin rights). The former seems to give a CIO much more to boast about, but I know I preferred reasonable uptime with fast responsiveness.
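Putting numbers on it (plain arithmetic, nothing more than an 8,760-hour year):

```latex
D = (1 - A) \times 8760\,\mathrm{h/yr}
% A = 99\%:   D \approx 87.6 hours of downtime a year
% A = 99.9\%: D \approx 8.76 hours a year
```

So the extra nine buys you about 79 fewer hours down per year - genuinely valuable, but if each outage now takes hours rather than minutes to fix, the users may not notice the difference.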

Future of the cloud is hybrid

Pperson

Headless chicken?

It seems to me that *nobody* knows where this cloud thing is going - even the vendors don't really have much to say beyond "we have huge data processing centres: we found them useful, so we'd like to make some more money and get you to use them too". And the only reason to use them is vague promises of cost reductions, but nothing *real* in terms of improved systems or more efficient development. I'm sure clouds have actual value-adding potential, rather than merely being an excuse to cut IT staff, but it doesn't look like anybody has much clue how to realise it (or perhaps more accurately, how to make a fat profit off it).

Re: Gary F and cloud vs Internet: it's an evolution of the Internet that has come about because widespread broadband availability lets people use it like an extended LAN. And it's also, unfortunately, a marketing buzzword.

Nokia E6 smartphone

Pperson

Re: Battery life

No mistake - that's simply how much juice Android phones and the like suck down in comparison to Nokia's E-series phones.

Intel code guru: Many-core world requires radical rethink

Pperson

It'd be nice if everything was so convenient

This sort of coding (concern for data locality in massively-parallel architectures) has been quite normal in supercomputing for decades. And in the GPGPU programming segment for years. And it is still darn hard. The basic problem is that not all applications slot nicely into something where data will conveniently be located close to where it is needed. Hence Berkeley can come up with the fastest-ever parallel algorithm for matrix multiplication - because they chose a problem (matrix mult) that *can* take full advantage of the way modern multi-core systems work. And it's *still* hard to code! Try doing the same with parallel algorithms that don't conveniently have data locality and you find the system runs hardly any faster than a normal CPU, because memory bottlenecks dominate. Sure, *sometimes* you can get it faster with tricks like "compute the same variable per core", but funnily enough Intel won't be telling us about all the failures (which I suspect vastly outweigh the successes - nobody can publish failures, because it might just be your lack of imagination in solving the problem, and so it gets rejected by reviewers).
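To show what the 'convenient' case looks like, here's the classic tiled matrix multiply sketched in CUDA (square matrices, n a multiple of the tile size, no error handling - purely illustrative, and not Berkeley's actual algorithm). Every value staged into fast on-chip shared memory gets reused TILE times, which is precisely the locality most real problems don't have:

```cuda
#include <cuda_runtime.h>

#define TILE 16

// C = A * B for n x n row-major matrices; launch with
// dim3 block(TILE, TILE) and dim3 grid(n / TILE, n / TILE).
__global__ void matmul_tiled(const float *A, const float *B, float *C, int n)
{
    __shared__ float As[TILE][TILE];  // tile of A staged on-chip
    __shared__ float Bs[TILE][TILE];  // tile of B staged on-chip

    int row = blockIdx.y * TILE + threadIdx.y;
    int col = blockIdx.x * TILE + threadIdx.x;
    float acc = 0.0f;

    for (int t = 0; t < n / TILE; ++t) {
        // Each thread loads one element of each tile; the loads coalesce.
        As[threadIdx.y][threadIdx.x] = A[row * n + t * TILE + threadIdx.x];
        Bs[threadIdx.y][threadIdx.x] = B[(t * TILE + threadIdx.y) * n + col];
        __syncthreads();  // whole tile loaded before anyone computes

        for (int k = 0; k < TILE; ++k)
            acc += As[threadIdx.y][k] * Bs[k][threadIdx.x];
        __syncthreads();  // done with the tile before it gets overwritten
    }
    C[row * n + col] = acc;
}
```

Each element of A and B gets fetched from slow global memory n/TILE times instead of n times. Problems without that reuse get no such win, and that's the dirty secret.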

Sorry, adding more cores onto a chip isn't some kind of revolution - it's an admission of defeat. Trying to tell us that it's our fault 'cause we need to get smarter in coding is raw marketing spin.

Creationists are infiltrating US geology circles

Pperson

Met a few scientists too

> the WORD and must be OBEYED UTTERLY WITHOUT QUESTION!

I've met a few 'real' scientists too, and - being human - they tend to do exactly the same thing, just with their favourite theory rather than the Bible. The main difference seems to be that science is basically polytheism (polytheorism?), so people aren't able to get quite so narrow-minded (though not for lack of trying!). Perhaps the same holds true for adherents of polytheistic religions? I guess that's why monotheism spread so well - absolute belief is definitely an impressive sight, regardless of whether it is religious or scientific.

Think PCs will drop in price? Think again, warns Intel

Pperson

Wait a minute

"As a company, we're putting significant features into different parts of the market..."

Hmm, I would argue that this occurred because chip speeds aren't increasing nearly as fast anymore, so Intel *had* to differentiate its chips somehow from the previous generation and then hype this difference via marketing. Whereas before, the differentiation was easy: "it's twice as fast as the last lot". Hence witness how HyperThreading was introduced precisely when the P4 was red-lining at the (now-ubiquitous) max of 3GHz, then removed for the roughly-70%-quicker Core2, then re-introduced for the i7, whose speed is only about 25% faster than Core2.

Intel is in a bind because the older chips are still competitive with the new ones: any dropping of the price of the old line will undercut sales of the new range. And not because the vendors' marketing teams had "done a great great job of cleaning up our brands". It *is* a marketing manoeuvre, but I suggest only to convince us that the slowing rate of chip progress is a "good thing".

Google Oz slips A$600 million through tax loophole

Pperson

Re: Bidding War

1% is still more than the 0% that the Bahamas and a few other places charge. Kind of hard to be more 'competitive' than that.

The Sandy Bridge Hackintosh

Pperson

Re: SSD failures

Yeah, I've known a few people with SSDs who have had them die in the same way (becoming unrecognisable / useless all of a sudden). Other flash-based memory, such as USB thumb drives, tends to do the same. Anecdotally, it doesn't really seem to me that SSDs fail gracefully or with much warning, but the vendors et al. all consistently claim they're way better than magnetic HDDs. So I don't know what to think.

Anybody know of any actual studies on failure rates / failure effects for *real* usage of SSDs, as opposed to the vendors' "we tested it with 1,000,000 back-to-back writes in our lab"?

Fukushima scaremongers becoming increasingly desperate

Pperson

Re: unmitigated BS

Data < Information < Knowledge

Apple bashes 'gay cure' app

Pperson

Re: Legally Required

Precisely! (Legal) lying, (legal) cheating, (legal) stealing, generally being a (law-abiding) nasty piece of work - all of it is *legally required* of our companies. Don't you love our civilization?

Pperson

What double standard?

No double standard here - Apple has simply made another principled stand: the principle of maximising their money-making. Like all their competitors, they're very consistent: say and do anything to make more money. Can't have enough of that "printed green stuff" (whoops, that's out of date; OK, how about "numbers in a bank's database that are made up out of thin air").