* Posts by Trevor_Pott

6991 publicly visible posts • joined 31 May 2010

Underfunded HCI startup Maxta hits the buffers as VC cash runs out

Trevor_Pott Gold badge

Good luck, Kiran.

Super Micro says audit found no trace of Chinese spy chips on its boards

Trevor_Pott Gold badge

Re: I so believe this

I think you're a bit out to lunch on this. Want to not be jailed by the Chinese? Don't go to China. Especially when the nation you live in is in the middle of a pissing match with China, and you're a high-powered executive. That's just life at that level, regardless of whether or not someone has asked you to compromise your gear.

As for the "money is the only thing that matters" bit...you had damned well better believe that "the security and integrity of [Supermicro's] products is [Supermicro's] highest priority". A single accusation - and one that most of the infosec world didn't buy at the outset, and which has since been considered completely debunked - absolutely tanked Supermicro's shares. Please do explain to me exactly how letting $nation stick shit on their boards is somehow making Supermicro money.

Or any manufacturer.

One does not have to do business with China. One does not have to manufacture things there. China may be a few points cheaper than Taiwan, India, Vietnam, or what-have-you, but a few points is absolutely not enough to entice a manufacturer like Supermicro if the tradeoff was "we will put spy chips on your stuff".

Let's consider this rationally for a moment:

1) Supermicro isn't - and hasn't been - the low-cost tin shifter for some time. You want some of the Chinese server manufacturers for that. Supermicro, Dell, HPE and even Lenovo can't compete with the likes of Inspur, and they're not really trying to, either. Supermicro's schtick is flexibility. They have a server room widget for anything you can possibly imagine. That means higher R&D costs, which means no playing in Inspur's sandbox.

2) The big buyers are the public cloud providers. Chinese cloud providers buy predominantly from Chinese companies. Non-Chinese cloud providers don't build a lot of data centers in China. There's a lot more money to be made putting data centers near China, but not actually inside the Great Firewall, with all its geopolitical restrictions. Why in Jibbers' name would Supermicro - or any non-Chinese tin shifter - knowingly let the Chinese government put spy chips on their board, when they know damned well that the instant it was discovered it would alienate the customers responsible for 50%+ of their revenue? There is zero logic in that.

3) Supermicro are terrible at politics. Absolutely terrible. They can't even deal with their own internal politics. They are not about to start playing geopolitical bullshit, because the people in charge of that company know that they would fail catastrophically.

Look, Supermicro has a lot of problems, but willingly getting mixed up in these sort of shenanigans? I just don't buy it. Their CEO is a giant nerd. Obsessive engineer type. He is emphatically not someone who is any good at politics, and he's perfectly aware of that. This extends to pretty much all of Supermicro's leadership.

Supermicro are heavy on nerds, heavy on suit-and-tie from-the-past marketing/PR/sales types, and are more or less what you'd expect from any large predominantly USian corporation that got where they got by building something useful, but never did the ruthless, cutthroat thing and brought in corporate fixers.

If China - or anyone else - had somehow demanded that Supermicro knowingly compromise their own gear, you would not have gotten a full-throated denial from them. You would have gotten a stone wall of "no comment", and lots of "talk to our lawyers".

This is because the people in charge of Supermicro aren't masterful political chess players who even attempt to play complicated geopolitical bullshit games. They'd piss themselves in terror and do whatever their lawyers told them to, and lawyers always tell their clients the same thing: shut the hell up.

So, to me at least, the fact that this particular pack of politically mediocre nerds and empty corporate suits decided that they were going to stand up and say "nyet" actually lends Supermicro's story some credibility.

Some.

They could always have been compromised without knowledge of the decision makers...but so far, the evidence doesn't seem to be bearing that out either. And that leaves me with all sorts of questions about this entire thing...but that's a rant for another day.

Ding ding! Round Two: Second annual review for transatlantic data flow deal Privacy Shield

Trevor_Pott Gold badge

Cambridge Analytica was a small player.

Give me enough money, and watch what I can do with only the three people that work at my company. Welcome to the Big Data era, sonny: it's truly amazing what you can do with a few scripts these days.

Chinese tech titans' share prices slump after THAT Super Micro story

Trevor_Pott Gold badge

Optional

Even if all of this is true - and I honestly have no way of knowing if it is - I am not sure Supermicro would be to blame for this. Or, if they are, at least not any more than Cisco is when US.gov tampers with their stuff.

I don't know...if this is true, and the Chinese have both the technology and the expertise to pull this off...why would this be limited to Supermicro in any way? Wouldn't they be doing this to all vendors that supply their high value targets? Who else might be doing it? I have so many questions.

The one thing I don't believe, however, is that any of the brass at Supermicro would be in on it. This isn't because I think it couldn't happen...but because I don't think the brass at Supermicro are all that plugged in to the factory floor. Supermicro is run by engineers. They like R&D. They like design and testing. They all seem to absolutely loathe logistics.

And there's evidence of that in how Supermicro comports itself. Supermicro is absolutely terrible at parts and component distribution in a number of geos. They have significant delays on system sourcing to partners in almost every geo, and I'm pretty sure their shipping department couldn't find their own ass with two hands, a roadmap and a sherpa. That's not the sort of thing you'd expect if top executives were concerning themselves with details like product assembly or distribution.

So all of that brings me back to: I'm unsure what to make of all of this. I just don't have enough data, and I've poked my contacts at Supermicro to see what I can uncover. From what I know of the people involved, I think it's entirely possible that Supermicro's production facilities could have been compromised and the executive layer be completely unaware of it. Whether they have or not, who can say? It will be interesting to see it play out.

Windows Admin Center gets an update, just in time for Server 2019

Trevor_Pott Gold badge

Microsoft discovers webmin.

Slingshot malware uses cunning plan to find a route to sysadmins

Trevor_Pott Gold badge

Re: Sort of points out that winning against a multi-faceted adversary will never win

"What you say is only correct because the defenders use standard processes that are predictable: "common practice". Once you depart from this predictability, an attack becomes much harder and potentially less effective."

Yeah, and rolling your own crypto is a great plan. Pfffffft.

Standard security processes and procedures actually do work, except against exceptional (read: statistically extremely unlikely) threats. The problem isn't standard security processes and procedures. The problems are lazy administrators who don't implement them, and companies that don't pay for them.

You wake me up when you're running a fully containerized and microsegmented environment with complete data path inspection, automated baselining, baseline deviation sensing, and automated incident response that includes at the very least auto-quarantining.

Unless and until you manage to get your security solutions to at least the above level, you have no place disparaging standard security procedures. If you understood today's IT security and were able to implement it, you'd understand the huge gap between today's best practices and the poor bastards cowering behind an edge router like it was 1993.

Netflix could pwn 2020s IT security – they need only reach out and take

Trevor_Pott Gold badge

Re: Does that word mean what you think it means?

Damn it, I'm not old!

I'm just alt-young.

What did we say about Tesla's self-driving tech? SpaceX Roadster skips Mars, steers to asteroids

Trevor_Pott Gold badge

Re: Asteroid belt?

Disrespect Mars and I'll go through you like a door.

Infinidat techie: Let me tell you a thing or two about ruler-format SSDs

Trevor_Pott Gold badge

Optional

Dinosaurs remain among the most populous animals on the planet. Perhaps you've heard of their modern incarnation. They're called birds.

And there are reportedly half a trillion of them out there.

So next time, pick a better analogy. One, perhaps, that doesn't demonstrate you failed to evolve your thinking to incorporate scientific knowledge that is now decades old.

No, Windows 10 hasn’t beaten Windows 7’s market share. Not for sure, anyway

Trevor_Pott Gold badge

Re: One should add to the graph...

I should point out that ad block penetration has surpassed 30% already.

No Windows 10, no Office 2019, says Microsoft

Trevor_Pott Gold badge

Crayon does seem the appropriate MID for Microsoft fans.

Trevor_Pott Gold badge

Maybe if Microsoft put in a "stop fucking spying on me, you lousy git" button, people would be more inclined to adopt it. At least with Windows 7 I can murder the bloody call home with a hosts file. In Windows 10, they hard coded the bastard in so that even this doesn't work.
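For what it's worth, the Windows 7-era hosts-file trick amounts to null-routing the call-home endpoints. A minimal sketch in Python — the host names below are illustrative placeholders, not a verified list of Microsoft telemetry domains:

```python
# Build the lines you'd append to C:\Windows\System32\drivers\etc\hosts
# to null-route call-home endpoints. Host names are made-up examples.

TELEMETRY_HOSTS = [
    "telemetry.example.com",
    "vortex.example.com",
]

def hosts_file_entries(hosts):
    """One '0.0.0.0 hostname' line per blocked host."""
    return "\n".join(f"0.0.0.0 {host}" for host in hosts)
```

The same approach fails on Windows 10 because, as noted above, some telemetry endpoints bypass hosts-file resolution entirely.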

Fuck Microsoft. Fuck them with Lennart Poettering.

Red Hat tries CoreOS on for size – and buys

Trevor_Pott Gold badge

Re: Great /sarcasm

Come in May. OpenStack Vancouver. It's a great reason to hit the continent, and also it's Vancouver, not Edmonton. That's always better.

Trevor_Pott Gold badge
Unhappy

Re: Great /sarcasm

It's Red Hat. They'll give the whole thing to Poettering, and he'll build it into systemd. It will end up riddled with bugs and with security as an afterthought's afterthought. When the first logoed vulnerability emerges, he'll blame end users. Nobody will be able to fix anything because all the logs will be binary, and corrupted by the attacks.

Red Hat will snicker and go "who else are you going to buy from?" Three more RHEL derivatives will emerge the next day. Oracle will pretend it didn't hear anything about any of this.

So, you know, a standard Tuesday in IT.

You can't ignore Spectre. Look, it's pressing its nose against your screen

Trevor_Pott Gold badge

Re: Does This Affect AMD Epyc CPUs

Why do you assume CPUs have to be "safe" for people to buy them? Do you honestly think that Spectre has slowed down CPU purchases? Do you think anyone but the handful of nerds that haunt these forums and some security nerds that read the Grugq a lot actually care about any of this?

Make no mistake, Intel is still the seller of server CPUs. They'll keep on being the seller of server CPUs through the lawsuits, and they'll emerge from this as the seller of server CPUs.

I'm as much of a fan of the underdog - and hence AMD - as anyone, but let's be realistic. Intel has an iron-fisted monopoly, and unless someone goes in there with the Almighty Axe Of Anti-Trust and cleans them out, they'll continue being a monopoly for at least the next decade.

You know it. I know it. Everyone except a few deluded die-hards knows it. So let's not pretend about this, shall we?

The people who buy things don't give a fnord about "security", and they never have. If they did, the Internet of Things wouldn't be such a gor'ram security dumpster fire. Nobody would ever buy Cisco or Supermicro again, and the list goes on and on and on.

But you know what else? It's not their asses that end up in front of the judge. It's us. The hoi polloi at the coal face. Nobody sends suits to jail. They make some poor bastard working in ops the lightning rod and ruin his life, and the lives of his family instead.

So yeah, Intel's dominance isn't going anywhere. Nobody's going to do a bloody thing about it. We're going to be responsible if/when it all goes horribly wrong, and we should know about this ahead of time so that we can take precautions and/or run the hell away in terror. (Depending on how you view risk.)

For suits, the only risk they care about is "will this cost me some of my bonuses?" For nerds, the risk we need to worry about is "will this land me in front of a judge?" I leave it as an exercise for the reader to work out how likely (or not) they feel this is to affect their chances of negative consequences.

But we should all have eyes open here and understand that this issue isn't going away, and that we, as the plebeians, have no choice but to deal with it.

Trevor_Pott Gold badge

Re: Does This Affect AMD Epyc CPUs

AMD and some ARM chips are affected by Spectre. But let me be 100% clear on this: AMD is completely irrelevant to this discussion.

AMD chips might power a smallish percentage of endpoints, but they power almost none of the existing fleet of deployed servers. Even if, for some reason, we all decided to buy AMD tomorrow, AMD couldn't deliver. At full ramp, AMD would struggle mightily to put out enough silicon to cover 10% of planetary server capacity during the next refresh cycle, and there is zero indication that demand exists for them to invest in that many wafers.

I'm sorry, but in the real world, AMD just isn't part of any discussion about server chips. There's only one player in that market, and they'll be the one everyone buys replacement chips from.

Microsoft whips out tool so you can measure Windows 10's data-slurping creepiness

Trevor_Pott Gold badge

Re: My data is mine, not yours.

"And Google's, obviously."

Fuck off with your whataboutism. Who are you, Sean Hannity?

Jibbers garlic-covered Crabst...

Serverless: Should we be scared? Maybe. Is it a silly name? Possibly

Trevor_Pott Gold badge

Re: "but we will always have at least the original black box's capabilities."

Yes, I'm sure. I'm sure because "serverless" already doesn't mean "Amazon". Serverless has a lot of ardent followers, and they're building open source solutions that do what Amazon's Lambda does.

Similarly, machine learning, AI, BI and other BDCA tools are seeing a lot of open source growth. In short: a commercial entity (like Amazon) might come up with the initial concept and get to milk it for a while, but all technology eventually ends up democratized.

The basic approach of serverless - write simple scripts (or have a UI/digital assistant write them for you), and have those scripts simply pass data from one black box to another - isn't going away. Humans don't typically uninvent things, especially in IT.

The increasing adoption of digital assistants also makes this seem like a permanent thing to me. The black boxes used by serverless types are really no different than the "skills" one can build for Alexa. Indeed, many of those skills are nothing but serverless scripts that call black boxes, and I've already seen Alexa used to create new serverless apps, which could be published as Alexa skills...

The whole thing has already reached critical mass and started a cascade. While a bunch of whingy nerds who can't disconnect "application development" from "enterprise apps" might not get the importance of an ever-increasing library of digital capabilities that can be used by any Tom, Dick or Harry, non-nerds seem to get the importance really quickly.

So we, IT nerds used to the way things were, might not see the utility, or ever get around to using serverless to do our jobs. Our kids, however, will use serverless-style tech to do all sorts of stuff. Using - and trusting - those black boxes will be as natural to them as smartphones are to my generation, or staring blankly at VCRs flashing 12:00 was to my parents' generation.

So sure, in the short term I expect some of these black boxes to disappear. That will lead to a backlash, and to standardization, to the development of "skill libraries" and all of the predictable evolution of responses to this problem. Corporate greed is eventually overcome in tech, even if it takes a decade or so for us to get our shit together.

It is very early days for this technology yet, but the basic approach is sound. And those who have used serverless in anger tend to become adherents pretty quickly. Even the disenfranchised and cynical nerds.

Trevor_Pott Gold badge

Re: I worry the author is bluring Capabilites and Serverless Environments

@Craigh yes, but you are doing enterprise work. Application development for commercial purposes. What you - and all the rest of the angry nerd mob of commentards - are missing is the part in my article where I very specifically said "It is the commoditisation of retail and consumer application development".

Serverless isn't going to replace traditional development for organizations looking to build their own middleware anytime soon. VMware isn't going to make their next vSphere UI using serverless. That's not where the revolution comes in. The benefit of serverless is not "helping highly trained nerds do what they already do better", despite the inability of the commentariat to conceptualize anything different.

Serverless is going to let people who are not nerds create applications that solve problems that nerds and commercial entities are not normally interested in.

For example: Let's say that I want to grow cannabis plants at home. (In a few months we can legally grow 4 plants per household here.) Cannabis is a particularly persnickety beast to grow. Much more so than the bell pepper plants that I have all over my house.

With serverless, I could take data from my house - images of my plants, or perhaps sensors similar to these - and feed that data into a black box up in the cloud. I could set the thing up so that if A happens, the plants get watered, if B happens, a fan turns on and if C happens, it sends me an alert.

Traditionally, if I wanted something like this I would have a few options:

1) Build something myself involving an Arduino (lots of work)

2) See if a vendor has already built a pre-canned solution, probably involving their own sensor ($$$)

3) Hire a human ($$$$$$$$$$$$$$$$$$$$$)

With serverless, however, I just need someone to write a "black box" that can accept some form of data I can provide (images, sensor data, etc) and spit out simple data about the plant in question. That black box does not, to my knowledge, exist today...but I am sure it will soon. (I could even train my own black box using machine learning, but that's another discussion.)
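To make the "glue" part concrete, here's a sketch of what that plant-monitoring application could look like. Everything here is invented for illustration: `classify_plant` stands in for the cloud black box that doesn't exist yet, and the action names are placeholders for whatever home-automation hooks you'd actually wire up.

```python
# Hypothetical serverless-style glue: sensor reading in, action out.
# classify_plant() is a stub for the cloud "black box"; in a real
# deployment it would be a hosted vision/analysis service.

def classify_plant(reading):
    """Stand-in for the black box: turn raw readings into a verdict."""
    if reading["soil_moisture"] < 0.2:
        return "thirsty"
    if reading["temperature"] > 30:
        return "too_hot"
    return "healthy"

# The "if A, water; if B, fan; if C, alert" rules from above.
ACTIONS = {
    "thirsty": "water_plants",
    "too_hot": "start_fan",
    "healthy": None,
}

def handle_reading(reading):
    """The entire 'application': route the verdict to an action."""
    return ACTIONS[classify_plant(reading)]
```

Note that all of the hard work lives inside the black box; the script the end user writes is just the routing table.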

Essentially, serverless is a scripting platform allowing access to an ever-increasing number of "skills" or "capabilities", in a marketplace-like format. A platform I don't envision being used to replace super-niche industry development, but one I envision being used by civilians to automate and enhance their daily lives.

Application development is already a thing that is out there enhancing businesses that can afford qualified nerds. Now it is going to start being something we can use in our day-to-day lives to collect, modify and act on the data around us.

Trevor_Pott Gold badge

Re: how much will it cost

The individual applications won't last long, it's true. But the black boxes themselves can. Oh, they'll change and evolve as they're maintained, but as a general rule, once invented, they won't be uninvented.

If someone creates a black box that does facial recognition, we aren't going to find ourselves 50 years from now without a black box that does facial recognition. We may find that someone has created better facial recognition systems, but we will always have at least the original black box's capabilities.

The digital skills being created accumulate until, one day, you'll be able to say "computer, watch the roses in the garden and water them if they start to show wilt", and it will happen.

That phrase spoken to the computer is a script. It told the computer to gather a dataset consisting of images of the roses in the garden. It told the computer to send those images through a black box that determines if there is wilt. If there is wilt, it will trigger the garden's watering system. That, right there, is serverless. The only difference is the interface used to create the serverless script.
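Stripped of the voice interface, that spoken phrase decomposes into roughly the following. This is a sketch, not anyone's real API: `detect_wilt` is a stub for the cloud black box, and the images are represented as plain dicts for illustration.

```python
# What "watch the roses and water them if they wilt" compiles down to:
# gather images, pass each through the black box, trigger on a match.

def detect_wilt(image):
    """Stub for the cloud black box that inspects a rose photo."""
    return image.get("wilt_score", 0.0) > 0.5

def tend_roses(images, start_watering):
    """Run the garden images through the black box; fire the
    watering system on the first sign of wilt."""
    if any(detect_wilt(img) for img in images):
        start_watering()
        return True
    return False
```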

The end user doesn't care if the black box that checks images of roses for wilt is the same today as it was yesterday. They only care that it works. All that matters to them is that a black box that performs that action exists, and that it doesn't stop existing.

Trevor_Pott Gold badge

Re: "Codeless" maybe?

If we did call it gluten-free, we'd probably get a whole bunch of new people interested in application development! :)

Trevor_Pott Gold badge

There's a whole industry full of supposed trained experts with loads of experience and education and all our information is ending up on haveibeenpwned anyways. You'll excuse me if I don't feel that people doing the application development equivalent of drawing lines between black boxes that are designed and maintained by large multinational public cloud providers could possibly do any worse.

Hell, maybe those public cloud providers will actually hire enough security people that the stuff they offer is secure by default. It would be a nice change from the shitpocalypses created by the "I know better" or furiously cheapskate crowds.

Trevor_Pott Gold badge

Re: Serverless is a stupid name

Ease of use is the difference. Yes, serverless is basically "scripting for noobs". But the "for noobs" part is really critical.

Yes, really smart, well paid, highly experienced nerds can write a bunch of gobbledygook scripts, or write applications in proper development languages that call libraries or other application components. That's hard. And it requires expertise. And if you are incorporating some black box (application, library or other component) into your solution you first have to find it, download it, install it, make sure it's in the right place to be called, etc.

Almost all of that goes away with serverless. You write what amounts to a script, but you have access to this (constantly growing) library of tools and features supplied by the cloud provider. Making use of those tools and features is generally much easier than a bash script, and you don't have to manage or maintain the underlying infrastructure in any way.

When you take something that requires significant expertise and make it something that nearly everyone can use, that's a pretty big deal.
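As a rough illustration of how little code that is, here is a complete Lambda-style function in Python. The `(event, context)` signature is AWS Lambda's actual convention for Python handlers; the business logic is obviously just a placeholder.

```python
# A complete serverless "application". Note what's absent: no server
# setup, no web framework, no process supervision -- the provider
# supplies all of that. You write only the logic.

def lambda_handler(event, context):
    # 'event' arrives as a plain dict from whatever triggered the
    # function; transform it and hand the result back.
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"hello, {name}"}
```

That is the whole deployable unit, which is the point: the barrier to entry is a few lines of glue, not an infrastructure stack.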

I'm sure that there were a bunch of nerds who knew how to build and maintain their own electrical generators who also said that national grids were a stupid idea. If they can run their own generation capacity, and wire up their homes/businesses exactly according to their own specifications, why can't everyone? Standardization means everyone isn't using the optimal plug/wire gauge/whatever for the job! There might be inefficiency! A national grid won't be revolutionary or change everything, because people can use electricity now!

I, for one, am glad nobody listened to them.

Trevor_Pott Gold badge

Re: I worry the author is bluring Capabilites and Serverless Environments

@Craigh You are correct: there is no reason for various cloud features (such as BDCA tools) to be tied to serverless. On the other hand, serverless doesn't offer you much of any real use except the ability to push data into and pull it out of various cloud features.

The point here is that serverless basically allows one to lash together some pretty complex applications about as easily as one could write a (reasonably simple) batch script.

Real developers will build the various black boxes. Trained IT practitioners will be able to use those black boxes either through serverless or other cloud solutions. But for the masses, serverless will be their gateway to application development.

Right now, today, in order to build serverless applications you need to actually type a few lines of code. But the code you need to write in order to pipe data from A to B to C and back out again is simple. It's the sort of thing that you could build into a drag-and-drop GUI and have 4 year olds use.

Nerds think about serverless from the viewpoint of "how does this help me do what I do now"? Non nerds don't do any of this now, and when shown how easy serverless is, say "wow, look at all the neat things I can do that I couldn't do before".

So serverless - in the form of AWS Lambda - is itself pretty useless. But it is this gateway to all the other things that Amazon's packed into AWS. It isn't the only way to use all of AWS's goodies, but it absolutely is the means by which individuals who aren't specialists, trained practitioners or experienced cloud architects will set about making tools for their own needs.

Trevor_Pott Gold badge

Re: @Trevor Pott

@Sir Runcible Spoon well that comment certainly created a combination of wry smile and crushing, crushing depression. A little bit too on the nose...

Developers, developers, developers: How 'serverless' crowd dropped ops like it's hot

Trevor_Pott Gold badge

Re: Cloud, REST, HTTP, PHP, trendy NoSQL DB de-jour, blah blah, whatever...

Maybe if I had been, you'd have learned a few things. :P

Trevor_Pott Gold badge

Re: Cloud, REST, HTTP, PHP, trendy NoSQL DB de-jour, blah blah, whatever...

"Minix used it and look how well that didn't do out in the wild"

Minix is easily the most widely deployed OS in the world. It's in every Intel IME, and a whole heap of other embedded systems.

Think before engaging keyboard, mate. There's a lot more to tech in this machine-to-machine world than end-user-facing OSes.

KVM? Us? Amazon erases new hypervisor from AWS EC2 FAQ

Trevor_Pott Gold badge

KVM: Now powering two of the big three public clouds. What's that VMware? KVM is "not ready for the enterprise"? I think you know better.

Marissa! Mayer! pulled! out! of! retirement! to! explain! Yahoo! hack! to! Senators!

Trevor_Pott Gold badge

Re: Root Cause: HAIRBALL Systems Design

Something like WhiteSource can help developers make sure all their libraries are up to date. That's kind of its job.

This could be our favorite gadget of 2017: A portable projector

Trevor_Pott Gold badge

"Camping with a projector to watch a movie of a campfire do you mean ?"

Where's the fun in that? Who doesn't like fire? I like fire. Hmmmm. Fire.

Trevor_Pott Gold badge

"Given the general lack of flat white walls in the general area when out camping in the woods/fields/wilderness, won't the average millennial fairly-well-off youngster be a bit pissed off lugging an 8ft projector screen around with them?"

I'd just bring a white blanket and hang the thing off the side of the car, but hey, that's me. "Flat" is a luxury. You're camping, eh?

Trevor_Pott Gold badge

"Isn't the point of camping to rough it a little?"

Depends on whom you ask. I think of camping more as "getting away from people and all the noise of a city". If I could, I'd live on an acreage surrounded by trees all the time. I'm too poor. So I go camping.

Camping with a projector to watch a movie while I poke the campfire? Sign me up.

Now, if we could just extinct mosquitoes...

Imagine the candles on its birthday cake: Astro-eggheads detect galaxy born in universe's first billion years

Trevor_Pott Gold badge

Re: Confused

They're less "technically challenging" than they are "a damned lot of tedious, boring, thankless work". I.e., a miserable pig.

You have to go through a lot of data before you even get to a candidate. Long after the fun work's been done, and the challenges of building the 'scope are long past, there's just crunching image after image. If you're really lucky an algorithm can help some. Even then, that's an awful lot of faint smudges and maths.

Trevor_Pott Gold badge

Re: Confused

G09 83808 isn't the galaxy spotted earliest in the universe's history; however, it was spotted very early on, and we've gotten a chance to get a slightly better look at it than we could with previous instruments. Any and all objects that we can spot which are from about 1B years after the universe formed or earlier will get press, and rightly so. They're a miserable pig to find, and they can tell us a great deal about the early universe.

That awkward moment when AWS charges you BEELLIONS for Lightsail

Trevor_Pott Gold badge
Coat

The title is optional. And possibly a giraffe.

This is a feature, not a bug. Oracle Microsoft Amazon don't have customers, they have hostages.

Mine's the one with the empty wallet -->

More expensive, takes longer than usual, not particularly brilliant. Yes, it's your robot surgeon

Trevor_Pott Gold badge

While the benefits of current robot-assisted surgery are somewhat questionable, this doesn't mean surgery robots are useless...or at least that they will remain so. Consider, for example, this prototype.

This prototype robot uses machine vision - amongst other modern techniques - to create a (mostly) autonomous surgery bot that is actually better than humans. Yes, it has some bugs, but it's an early prototype. I expect to see some rapid enhancement in this area.

Google, Twitter gleefully spew Texas shooter fake news into netizens' eyes

Trevor_Pott Gold badge

Optional

"Should Google fact check Presidential Tweets before showing them on its site?"

Yes. Fuck yes. Absolutely yes. Yes a dozen more times and yes.

So should everyone else.

Comet 67-P farted just as Rosetta probe flew through the gas plume

Trevor_Pott Gold badge

Re: "the Arrakis Sandworms were actually quite delicate"

No they weren't. Drop a sandworm in water, watch it shatter into sandtrout. In relatively short order they would sequester all your planet's water beneath the surface and convert the planet into a desert filled with sandworms.

Do not fuck with Shai Hulud. He will eat your entire planet.

Licensing rejig and standard price rises set for Windows Server 2016

Trevor_Pott Gold badge

Who could have seen this coming?

Except, you know, for most of us. Microsoft isn't trustworthy, and likely never will be.

F-35s grounded by spares shortage

Trevor_Pott Gold badge

Pork

Pork everywhere.

Evaluate this: A VM benchmark that uses 'wrong' price and config data

Trevor_Pott Gold badge

I have some questions. Where can I reach you?

Nadella says senior management pay now linked to improving gender diversity

Trevor_Pott Gold badge

Re: Female CEO

"people like them give women a bad name"

So a handful of female CEOs that screwed up "give women a bad name", but thousands of years and millions of male leaders of companies/governments/transcontinental empires etc. don't give men a bad name?

Look, I question the validity of affirmative action as much as the next guy, but dude, that's some completely unsupportable sexist bullshit right there. What the fuck.

Trevor_Pott Gold badge

Re: Now the search for the ultimate diversity employee starts.

"religion muslim or jewish or some recognized religion other than anything those based on Jesus"

Um...Jesus is an important prophet in Islam...

Trevor_Pott Gold badge

Re: Symptom of bullshit job

"Are most IT jobs actually bullshit jobs where the performance in the role actually has little impact on the org?"

I second the "yes" comment.

Drunk canoeing no longer driving offence in Canada

Trevor_Pott Gold badge

@Haefen

Speaking as a Canadian - and one who apparently has quite a bit more knowledge of what you're babbling about than you do - I would like to, in the kindest, politest possible way convey my feelings about your inane babble:

ODFO

Cheers.

Why is it that geeks' favourite enemies are... other geeks?

Trevor_Pott Gold badge

outofacannonintothesun.jpg

Trevor_Pott Gold badge

Re: SYSTEMD

*hiss*

New NIST draft embeds privacy into US govt security for the first time

Trevor_Pott Gold badge

Re: Can you please....

No, we can't. And the reason is that politics, privacy and security are tightly wound up into an inextricable morass.

Regardless of your politics, and what side you take in the multi-tentacled debate, politics affects the balance chosen between security and privacy. Or whether the needs/desires of individuals (as opposed to corporations and/or states) should be considered at all.

So put on your big boy pants and welcome to the real world. Politics is everywhere.

Azure Stack will need special sysadmins, says Microsoft

Trevor_Pott Gold badge

Thank you kindly for choosing to engage directly with the community. It is nice to see anyone from any vendor taking the time.

I do have one small question, however: does the rest of Microsoft know you're doing this? You're harming their ruthlessly customer-hostile image. (Well, not a lot, as the Windows team exudes so much animosity towards literally everyone that it's hard to overcome...but you are denting the evil overlord image a little...)

Core-blimey! Intel's Core i9 18-core monster – the numbers

Trevor_Pott Gold badge

Re: Gamers?

I am told by some of the hard core gamers in my sphere that if you want to do VR at 240Hz then having 8+ cores @ 2.8GHz or better is usually required. As I'm poor, and still working on a video card from 3 years ago and a Sandy Bridge-era CPU, I cannot confirm this.

Apparently VR is a thing that some people do. I don't understand. Why do you need VR to play Scorched Earth?