I really wish someone at M$ had a clue
As someone with a vested interest in M$ doing well (being one of the evil people actually writing production systems on M$ servers), I really do look at what they are doing right now and wonder if they have any game plan.
You've got an open-source product with good uptake... so let's make another version of it, but with no story about why you would use it rather than the popular one. Then tie it to a server version which costs a sh*t load. Oh, and quietly drop the product in a few years, when we're just starting to get something right with it but a new shiny thing has appeared and we need to make a new version of the shiny thing.
Even at the basic level I can't see what they are trying to do. Why not have this running on the cheapest version of Windows Server? I would guess that a lot of people would like a fully distributed framework for running certain types of software, one which treats the underlying h/w and OS as disposable items. Need more power? Throw in a new server. Server dies? Throw it out and stick in a new one. You don't need to care about availability, because the Hadoop element takes care of that, sitting on a cheap stack of easily replaceable physical bits. That's feasible if the h/w and OS are at the right disposable price, but as soon as one of the elements has to be at the Rolls-Royce price level the whole model falls apart.

Given that people already have Hadoop, I can't see this working when they tie it to Windows HPC, which has only limited visibility/availability in the development community anyway. Another waste of time...
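To be concrete about the "Hadoop takes care of availability" point: it's just HDFS block replication. A minimal sketch of the relevant hdfs-site.xml setting (the property name is standard Hadoop; the value shown is the usual default, picked here for illustration):

```xml
<!-- hdfs-site.xml — illustrative sketch, not a tuning recommendation -->
<configuration>
  <!-- Every block gets copied to 3 cheap nodes. Lose a box and the
       NameNode re-replicates its blocks from the surviving copies. -->
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>
```

Adding capacity is just pointing another DataNode at the cluster, and losing one is handled automatically, which is exactly why the boxes underneath can be disposable commodity kit rather than Rolls-Royce licensed servers.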