Big Data hype has peaked and adopters are about to enter Gartner’s dread trough of disillusionment, says one of the firm’s analysts, Svetlana Sicular. Hype about Big Data is certainly prevalent: here at Vulture South the term is often thrown around by vendors who in past years were content to describe their data-crunching …
I believe it
Only yesterday morning, the Today programme on Radio 4 ran a 'big data —the next big thing!' style piece. If it's finally made it onto the radar screens of the BBC's fuckwitted technology correspondents, it's not merely peaked, it's dead and buried.
Re: I believe it
What this essentially means is that the hype has exceeded the value, so it's now time for the real people to get the real work done.
You're now past the point where the pointy-haired management types say 'let's do big data because that IT research group tells us to do it...' You're at the point where they know the term Big Data and face it with some scepticism, so people have to prove that it works and aren't just choosing it to pad their resumes.
The Gartner Hype Cycle....
... has now reached the trough of disillusionment.
Seriously, it makes Magic Quadrants look sensible.
Gone? Oh no, on to the next piece of computer industry technology innovation 'Must Have' marketing hype .......
Re: Big Data
Please register here for the next Bullshit & Bollocks Quarterly Technobabble Report. Annual subscriptions cost only €10,000 - can you afford to miss out on the next big thing?
We don't need no f*g schema!
... that is all
Re: We don't need no f*g schema!
Actually that's not true. You do need a schema. If you knew anything about Hive, Pig and HBase, you'd know that schema design is important - and if you're using Hive, add partitioning of the data on top of that.
Searching for diamonds
If you don't know what a diamond even looks like to begin with, sifting through a bigger bucket of shit won't help you find one. And if you do know, you also know that's not where you should be looking anyway.
Re: Searching for diamonds
I've tried explaining to recruitment agents that Hadoop is a waste of time. I know. I've tried. You can write far more efficient solutions in C or Java.
Besides, if your data set is that big, then the problem you're trying to solve will be well defined, allowing you to spend a little time crafting the right tool to get the answer. This will always execute faster in the long run.
@Anon 16 Re: Searching for diamonds
Clearly you don't grok Hadoop.
If you did you'd understand that your M/R is Java code. Or it could be streaming C/C++ code.
There are plenty of use cases that prove you wrong.
Hadoop is a parallel framework. Pretty basic in concept.
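The model really is basic in concept: map emits key/value pairs, the framework shuffles them by key, reduce aggregates each group. Here's a minimal plain-Python sketch of that word-count pattern - not actual Hadoop code, just the shape of the computation Hadoop runs in parallel across a cluster:

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit (word, 1) pairs, as a Hadoop mapper would
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group values by key (the framework does this
    # between the map and reduce phases)
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data big hype", "big trough"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"])  # 3
```

In real Hadoop the mapper and reducer would be Java classes (or streaming C/C++ executables reading stdin), and the shuffle happens across machines - that distribution is the whole point.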
It's a wonder that any recruiter called on you.
The fail is for you.
Another reason for a loss of confidence in Big Data is that it does not deal in absolutes but instead produces what Sicular describes as "a proof of your hypothesis with a certain degree of confidence" rather than a concrete answer.
Obviously, since that's what every discovery process does. Even tautological ones (those that only involve the manipulation of formal abstractions, i.e. mathematics) are only "proofs" under axiomatic assumptions about such things as the proper functioning of the reasoning mind.
So, welcome to Bayesian reasoning, Gartner. Glad to see you could make it.
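"A proof of your hypothesis with a certain degree of confidence" is just Bayes' rule in action: evidence shifts a prior belief to a posterior, never to certainty. A toy sketch with made-up numbers (the 50/50 prior and the likelihoods below are purely illustrative):

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    # Posterior P(H|E) = P(E|H) P(H) / P(E), with P(E) expanded
    # over the hypothesis being true or false
    numerator = p_evidence_given_h * prior
    marginal = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / marginal

# Hypothetical: start at 50/50, observe evidence twice as likely
# if the hypothesis is true (0.8) than if it's false (0.4)
posterior = bayes_update(0.5, 0.8, 0.4)
print(round(posterior, 3))  # 0.667
```

The data moved you from 50% to about 67% confidence - a degree of belief, not a concrete answer, which is exactly what any analysis of noisy data delivers.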
 This is Descartes' "evil genius" argument: you can't prove that there isn't some "evil genius" with the capacity to force you to believe erroneously that some construction is logically valid. These days, neurobiologists are pretty close to constructing real tools to achieve that, between pharmacological agents and EM manipulation of CNS processes, coupled with the use of functional MRI to determine when and where to apply the tech. Experimenters have already shown they can erase (or render inaccessible) a subject's memory of a specific event, for example.