4 bits: one in 16
So you try 16 times?
Or what have I misunderstood?
Oracle insists it really is going to sell computers powered by Sparc M7 processors – the same chips it started talking about in 2014. On Monday, Big Red breathlessly unveiled hardware powered by the beefy microprocessor, and on Tuesday, its supremo Larry Ellison lauded the 64-bit CPU's security defenses. One of these defenses …
Thankfully, crashes these days are so exceedingly rare they would also immediately raise a red flag of nefarious stuff going on.
...oh, wait... (want an instant BSOD STOP error on your PC, in Anno Domini 2015? Upgrade to the latest Steam client! True story!)
If you're in a position to flip bits in someone else's pointer aren't you already in control of the application?
Generally not. The typical use-after-free attack, like most stack-smashing attacks, integer-overflow attacks, etc., must leverage the initial violation into a full exploit. Generally that's a process of some complexity - how complex depends on the vulnerability and the application in which it exists. Sometimes it's straightforward, as with many return-into-library exploits. Sometimes it isn't; Ormandy's #GP Trap exploit for Windows is a good example of a complicated one.
So it's quite plausible that you'd have a vulnerability that let you flip bits in a pointer but did not in itself give you much more than that.
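For the avoidance of doubt, the primitive being discussed looks something like this - a minimal use-after-free sketch (illustrative only; turning this into a real exploit is the complex part described above):

```cpp
#include <cstdio>
#include <cstdlib>
#include <cstring>

int main() {
    char *stale = static_cast<char *>(std::malloc(32));
    std::free(stale);                       // the bug: pointer kept after free
    char *fresh = static_cast<char *>(std::malloc(32)); // may reuse the freed slot
    std::memcpy(fresh, "attacker data", 14);
    // Reading through the stale pointer may now observe attacker-controlled
    // bytes: a primitive, not yet control of the application.
    std::puts(stale);   // undefined behaviour, shown for illustration only
    std::free(fresh);
    return 0;
}
```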
Oh my, those guys who design silicon chips for a living must be real dimwits to not think of just trying 16 times to get past this protection.
Fancy that, they can design a chip that has billions of logic gates and takes 20 world records but never think of such an easy get around that defeats a 5 year design process.
OTOH, perhaps your assumptions taken from an exec keynote intended for a non-technical audience may not be entirely correct.
"Oh my, those guys who design silicon chips for a living must be real dumwits to not think of just trying 16 times to get past this protection."
An interesting programming challenge, as every time your probe guesses wrong the chip issues an exception and halts your probe. I look forward to seeing what Joanna Rutkowska (Invisible Things Lab) has to say about the M7 security features...
"An interesting programming challenge, as everytime your probe guesses wrong the chip issues an exception and halts your probe."
Not really, your main executable would keep track of the address and colours it's tried and spawn a new process to do the actual trying. If the new process gets terminated, next colour.
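Something like this, assuming - purely for the sake of argument - that a wrong guess only kills the probing child process and that an attacker could steer the tag bits at all. A sketch with a made-up target address, not real ADI code:

```cpp
#include <sys/wait.h>
#include <unistd.h>
#include <cstdint>
#include <cstdio>

int main() {
    // Made-up target address; on real SPARC ADI the colour lives in the
    // pointer's top nibble. Nothing here is an actual ADI instruction.
    const uint64_t target = 0x0000123456789000ULL;
    for (uint64_t colour = 0; colour < 16; ++colour) {
        pid_t pid = fork();
        if (pid == 0) {
            // Child: stuff one of the 16 colour guesses into the tag bits
            // and try the access. A wrong guess traps and kills this child,
            // not the parent.
            volatile char *guess = reinterpret_cast<volatile char *>(
                (target & 0x0FFFFFFFFFFFFFFFULL) | (colour << 60));
            char c = *guess;    // faults unless the colour matches
            _exit(c ? 1 : 0);   // survived: the guess was right
        }
        int status = 0;
        waitpid(pid, &status, 0);
        if (WIFEXITED(status)) {
            std::printf("colour %u survived\n", static_cast<unsigned>(colour));
            break;              // parent notes the winning colour
        }
        // Child died on the trap: try the next colour.
    }
    return 0;
}
```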
your main executable would keep track of the address and colours
So you're assuming that you get to run your own executable on the server? Well that already limits the potential exploits and requires a valid login to the server and a way to upload/create an executable. This is not the vector used by most exploits such as Heartbleed.
We're talking here about preventing the malicious exploit of software bugs causing buffer overflows or double free memory errors. These are exploits of software bugs, not malicious executables.
Also you're assuming an executable can 'see' the pointer colour. It cannot, neither can it manipulate the pointer colour.
Remember that the memory is allocated (and coloured) on a malloc basis, so even contiguous memory for a single application where the memory has been malloc'd in chunks will become unreadable sequentially as different mallocs will have been allocated different colours. Both free memory and pointer metadata have dedicated colours and are automatic fails.
The protection here is not between applications (that has been tried before) but between different malloc calls.
This post has been deleted by its author
A 64-bit address allows 18.4 million terabytes to be addressed. That's not just "big" :-)
By the time we need more than that, code will be writing itself and human beings will be obsolete sacks of meat.
To give you an idea, getting 18.4 million terabytes using 32Gbit desktop DRAM sticks would require a surface area of around 16 square kilometres, by some back-of-the-envelope calculations I found on the web.
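For anyone who wants to redo the envelope sums themselves - a sketch assuming a 32Gbit stick is 4 GiB and a DIMM lying flat is roughly 133mm x 30mm (both of those are my assumptions, not from the original calculation):

```cpp
#include <cstdio>

int main() {
    const double total_bytes = 18446744073709551616.0;   // 2^64
    const double stick_bytes = 4.0 * 1024 * 1024 * 1024; // 32 Gbit = 4 GiB
    const double sticks      = total_bytes / stick_bytes;
    const double stick_m2    = 0.133 * 0.030;             // one DIMM lying flat
    std::printf("%.2e sticks, ~%.0f km^2\n", sticks, sticks * stick_m2 / 1e6);
    // Prints roughly 4.29e+09 sticks, ~17 km^2 - close enough to 16.
    return 0;
}
```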
Hollywood has already come up with a number of fiendishly clever compression algorithms to save space on future movie projects. For example, the recent "DH" algorithm compresses an entire movie, which is typically tens of GB, into an array of around five bytes using an initial key known as a "willis".
The as-yet-unpublished RFC describes the reference implementation using the following example.
Key ("willis") = Die Hard
Input = movie to be compressed
Output: (Key) + "on a " + {bus, plane, car, office, ...}
Initial benchmarks have proven extremely effective at compressing the input in a lossless manner that allows full data recovery. For example, audiences were able to predict the content of a sample of movies including Speed, Passenger 57 and Air Force One using only the inputs of the key name and the strings "bus" and "plane".
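For completeness, a sketch of the (entirely fictional, obviously) reference decoder, should anyone wish to reproduce the benchmarks:

```cpp
#include <cstdio>
#include <string>

// Decompression is trivial; compression is left to the studios.
std::string dh_decompress(const std::string &willis, const std::string &venue) {
    return willis + " on a " + venue;   // lossless, allegedly
}

int main() {
    std::puts(dh_decompress("Die Hard", "bus").c_str());   // aka "Speed"
    std::puts(dh_decompress("Die Hard", "plane").c_str()); // aka "Passenger 57"
    return 0;
}
```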
Given that there are estimates that each person will be producing 5Tb of data/year by 2020
People make all sorts of claims (sometimes in passive voice, to avoid attributing them). I suspect anyone promulgating such an "estimate" is a tad hard of thinking.
What is it even supposed to mean? What constitutes "producing data" in this context? How much of that data is said person going to want to store? Will this same amount be true of both some technophilic wannabe-transhumanist lifeblogger and a subsistence farmer in Bangladesh, or is "each person" the sort of handwaving rubbish thrown around by pundits who can't be troubled with the details?
And why 5Tb? Because it sounds better than 0.6TB?
Trainee me, circa 1979: "Shouldn't we be storing the century part of the date here?"
Chief programmer, struggling to be heard over the laughter: "We won't be running these Cobol programs at the end of the century, silly boy!"
Chief programmer was right, at least from a Bayesian reasoner's point of view. There was a high probability that any given piece of code wouldn't still be in use in 20 years. And for what was, the savings from using 2-digit years were very likely greater than the cost of remediation (once adjusted for inflation, etc). Factor in the widespread economic benefits of the capital spending and productivity boost that Y2K remediation is broadly credited with (among people who've actually studied it with some degree of rigor), and it was with high probability the better decision.
"Organisations can control root access to all devices. The big problem is an undetected cybernetic intrusion, lasting for years if properly executed."
Most hacks rely on social engineering at the top of the kill chain; that's usually how the cybernetic intrusion happens in the first place.
Also, I wouldn't overstate the ability of any corporation of significant size to control Root. The moment your IT department is big enough to include senior staff with non-techy backgrounds, you might as well publish the domain admin password in the local paper.
Large organisations are made up of thousands of people. Even the high tech ones have relatively untrained, uneducated people in some positions with systems access. And when you start looking at thousands of people, you're playing the odds.
No corporation can successfully secure its borders if it employs more than, say, one hundred humans.
"Hi DBA Duncan this developer Dave. We've written this app on the dev system that needs to go live this afternoon. We need system password on live to make it work."
If refused, this usually follows next from the project manager...
"Do you know who asked for this app to coded? Board member Brian and he's going to be mighty pissed and looking to make heads roll if anyone stands in the way of this app going live!"
The strength of security in hardware isn't so much that it can't be changed as that it can't be avoided by poor development practices and/or software quality control.
The danger - in my opinion - comes from the potential complacency it induces.
Does the danger outweigh the benefit? I don't think so. But this stuff should only be used in conjunction with defensive software techniques including, in my book, keeping use of unmanaged languages (C etc) to a minimum. Buffer overruns and pointer exploits are the gift which just keeps giving.
The most powerful security measures operate at the level of operating systems (sandboxing) and memory-safe programming languages.
And surely we *have to* improve security or users will go back to physical filing systems and microfiche. The experts in the Kremlin apparently already do this. They should know, as they rule the largest landmass. That is only possible with proper security, including the security of administrative information.
With respect, using the Kremlin as an example of good practice in any area is a bad choice. They're mostly a bunch of low-paid, corrupt, lifelong bureaucrats.
Russia, as a country, doesn't work particularly well. There's a sort of mystique that's sprung up around Putin, but the reality of the Russian government is very different.
"The strength of security in hardware isn't so much that it can't be changed as that it can't be avoided by poor development practices and/or software quality control."
Well that depends on whether they are on by default or have to be enabled by software...
I suggest you take a look at the Intel 286! It included some rather nice security features, which (if memory is correct) allowed you to do something similar to what Oracle are claiming for the M7. I expect these features have been carried forward into the x86 and x64 chip families, yet neither Windows nor Linux makes use of them.
Whilst there was a significant performance hit when processors ran at 10MHz, now that we have multi-core chips running at 2+GHz the overhead is significantly reduced.
I wonder if any of the VM hypervisors implement these security features...
Certainly was the case with S0NY's PlayStation 3. The dirty little secret is that every console built before Jan. 2011 is de facto "open", since all the keys up to that point are in the wild now. Consequently, anything post Jan. 2011 will in all likelihood never get hacked. Of course, pissing off a bunch of nerdy geeks by swiping their Linux install functions, plus leaving (or as good as leaving) traces of their private key in the firmware, was also a bit of luck for those who like the idea of having as full an access to their hardware as possible. But the fact remains that the immediate exploit, OFW 3.55, has since been closed, making such a hack less trivial to pull off than just installing some custom firmware over S0NY's latest and greatest. It will never fully close the door on such consoles, though, and they remain highly exploitable.
ASLR, DEP and this thingy won't fix the problem at the core: The C language. ALGOL was already in a much better state of security during the heyday of ICL, Burroughs and Elbrus mainframes.
What we need as an industry is to migrate everything to memory-safe languages: Swift, Rust, Sappeur, Vala, Java, C#.
The first three languages demonstrate that garbage collection is not necessary for memory safety.
If we do not fix the root cause, the cyber war chaos will only grow bigger and bigger.
We don't need to migrate to so-called memory-safe languages because that wouldn't change much. It would still be possible to introduce a SQL injection bug, skip certificate validation, use a poor random number generator seed in a security-sensitive context, mix up physical units, and put many other bugs (human stupidity knows no limits) in any of these languages. Removing certain language features is just another band-aid.
We just need competent programmers, writing in whatever language and API they happen to use at the moment. I have not written a single "free()" or "delete" in the past 5 years of programming in C++, because I know how to use smart pointers. The root cause is not the language, it is that some people who don't know the basics are deemed to be "competent programmers" and allowed to write security-sensitive code (which is most code, if you follow the principle of multiple layers of defense).
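For the doubters, a minimal sketch of what "no free()/delete" looks like in practice - standard library only, nothing exotic (the Session type and open_session() are made up for illustration):

```cpp
#include <memory>
#include <vector>

struct Session { /* ... */ };

std::unique_ptr<Session> open_session() {
    return std::make_unique<Session>(); // sole owner, no raw new/delete
}

int main() {
    auto s = open_session();            // released automatically at scope exit,
                                        // even on early return or exception
    std::vector<std::shared_ptr<Session>> pool;
    pool.push_back(std::make_shared<Session>()); // shared, reference-counted
    // No free()/delete anywhere, so double-free and most use-after-free
    // bugs simply have nowhere to live.
    return 0;
}
```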
" it is that some people who don't know the basics are deemed to be "competent programmers""
This. And a general lack of security awareness throughout the whole enterprise, tbh. Most staff should now be given a few hours of security basics automatically before they're even allowed to touch a PC - how to spot a phishing scam, the importance of passwords, why encryption is your friend, why USB sticks aren't. Too many businesses think that having one guy who met a CISSP once covers their security needs completely, and so everyone else can get on with their job without needing to know anything about it. Programmers are just the most obvious example of this, but the problem is endemic at all levels and all business areas, from the janitor up to the C-suite.
And a general lack of security awareness throughout the whole enterprise, tbh
Is it the lack of security awareness, or the low priority managers put on security? Security isn't a simple tick box in Visual Studio or Eclipse! Often a manager will want code that works "well enough" to ship out the door. Time is money, and security done right takes time.
We had a shedload of C-style exploits in the Windows, HPUX and Linux kernels. The type of infallible software engineer you call for DOES NOT EXIST.
And because we can contract cancer (SQL injections), we should not implement countermeasures against the plague? How rational of you!
"C / C++ are not safe for routine enterprise development."
Strongly disagree. Yes, for certain purposes neither C nor C++ is safe; that's where you would use Ada or something else. However, in the hands of competent programmers, using the correct toolset (including compiler warnings, with added static and dynamic analysis tools) and with sane design, there is nothing wrong with these languages in the enterprise.
On the other hand, leaning too much on a large monolithic sandboxed VM runtime (e.g. JVM or CLR) creates a monoculture which, as we know, brings its own security risks. Unless you really believe that these environments are bug-free, in which case I want to remind you that they are written in C++ and that they are very, very complex internally. More complex than many of the enterprise applications you would want to run on top.
...are due to typical C-style bugs (in C or in C++ code) like out-of-bounds access, use after free, double free and so on.
Now, if we can rid ourselves of 50% of all serious IT security issues, I do think it is worth changing the language used. But surely there is very serious inertia from C and C++ developers for reasons ranging from laziness to malice.
After all, tens of billions are made every year attacking and defending in cyberspace. Livelihoods depend on these exploits. And they don't give a damn that C-style coding endangers the multi-trillion dollar IT business.
All the sane, non-corrupt folks in IT and computer science SHOULD CARE, though. C is a rotten concept, let's call it out.
"in the hands of competent programmers".. There, that's the bad assumption right there.
Firstly, it's not just incompetent programmers that are prone to fits of forgetfulness and/or stupidity.
Secondly, many - perhaps even most - organisations do not exclusively employ only competent programmers. Clearly.
If they did, we'd never hear about buffer overrun exploits, would we? But we do hear about them. So something needs to change. If we can't change the people - and it appears that we can't seem to manage that - then maybe we need to get defensive with the technology instead?
The problem is that most of the languages you list either run on a VM written in C or compile down to C. You're then reliant on THAT implementation not being broken.
It's turtles all the way down.
I'm no security guru, but you need to start with your kernel being written in a memory-safe language and work up from there, e.g. Microsoft's Singularity. (It should be noted, however, that even this O/S still has some assembler & C code in it, which remains a potential source of errors.)
Of course, even this will only prevent some errors. If the programmer completely screws up the algorithm (e.g. Apple's Gotofail), no language is going to prevent that.
There is no reason why a Swift compiler cannot be written in Swift. We had kernels written in ALGOL and PASCAL in the past for highly successful systems like those of ICL and HP. Unisys still sells one line of ALGOL mainframes, but they are also under economic pressure by the quick-and-dirty-language-based kernels.
Totally agree. It's not that it can't be done, it's that there's little incentive for someone to rip out years of O/S development and start afresh with a brand new language. This on its own is not a trivial undertaking. But it gets worse!
Once you've got your shiny new secure O/S & language, you then have to re-write all your applications, libraries, etc. in the new language for the new O/S.
Re-writing the entire software stack is a mammoth undertaking and will take years (decades maybe?). Who's going to pay for this?
You erect a huge strawman saying "there will be other issues, so don't even bother to implement this measure".
"Don't wash your hands to stop bacteria from proliferating, there is carcinogenous sooth in the air which may kill you".
NO - what we need to do is to use all available measures to cut down the size of the cyber war domain. C was a regression relative to ALGOL/Pascal/Ada/Swift. In addition to the issues listed, C has a major deficiency in integer over/underflows. Compare that to Pascal-style languages, where you can clearly define the valid range of a number, and a transgression will lead to a defined program stop instead of a logic cancer that might open the possibility of covert subversion of your system by cybernetic attackers.
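For comparison, here's roughly what a Pascal subrange type looks like when approximated in C++ - a sketch only, using a hand-rolled checked type of my own invention rather than anything built into the language:

```cpp
#include <stdexcept>

// Every assignment is range-checked; a transgression is a defined program
// stop (an exception), not a silent wraparound.
template <long Lo, long Hi>
class Ranged {
public:
    Ranged(long v) : v_(check(v)) {}
    Ranged &operator=(long v) { v_ = check(v); return *this; }
    operator long() const { return v_; }
private:
    static long check(long v) {
        if (v < Lo || v > Hi)
            throw std::out_of_range("value outside declared range");
        return v;
    }
    long v_;
};

int main() {
    Ranged<1, 12> month = 7;  // like Pascal's "month: 1..12"
    month = 13;               // throws; the program stops in a defined way
    return 0;
}
```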
You erect a huge strawman saying "there will be other issues, so don't even bother to implement this measure".
I said all that? You jump to a huge conclusion based on what you assume I was thinking.
The inertia is not malice or laziness, but economic. As A non e mouse said, who will pay for it?
There are some Rust OS hobby projects which could be promising, but there's a long way to go.
Some of us have families and day jobs, but please, don't let me stop you from rolling up your sleeves.
"The inertia is not malice or laziness, but economic. As A non e mouse said, who will pay for it?"
Why can't IT tell accounting and legal it's either pay some now or pay a lot more later when the inevitable lawsuits and criminal charges start coming in?
Byte addressing is a nasty Intel hack.
Byte[1] addressing dates back at least to the mid-1960s, with the System/360. Intel wouldn't even exist until the end of that decade.
It's also ubiquitous for general-purpose CPUs, and for many embedded and other specialized ones as well.
True, many of the RISC CPUs have alignment requirements for objects larger than a byte, or at least take a penalty for misaligned loads and stores. But they're still byte-addressable.
[1] That is, octets, or 8-bit bytes. "Byte" isn't exclusively used to mean 8-bit objects, even as a term of art. People who actually know C, for example, will know that in C a "byte" is exactly the same size as a "char", and the number of bits in a char is implementation-defined (and may be found as CHAR_BIT in <limits.h>).
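If you want to check your own implementation, something like this will do it (C++ spelling, via <climits>):

```cpp
#include <climits>
#include <cstdio>

int main() {
    std::printf("bits per char here: %d\n", CHAR_BIT); // 8 on any mainstream host
    return 0;
}
```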
I was issued with a Tadpole lappie with a SPARC chip at work around a decade ago. Nice enough machines but they were lacking in grunt for most laptop stuff compared to commodity hardware at a fraction of the price.
In other words they were like SPARC generally, if you need it you need it, otherwise don't bother. I've lost track of what became of Tadpole with corporate shenanigans over the years but last time I checked the line was still available and being updated.
Would this have stopped Heartbleed though? OpenSSL used its own memory handling code, so as Theo de Raadt explains it bypasses many built-in memory protection features. http://article.gmane.org/gmane.os.openbsd.misc/211963
As the original quote goes, "OpenSSL has exploit mitigation countermeasures to make sure it's exploitable."
The Heartbleed exploit relies on reading sequentially beyond the payload. For you to be correct, the entire 64k (the typical exploit size) would have had to have been previously part of a single malloc call (so that it is all of the same 'colour'). I haven't looked at the original offending code in detail, but it would seem odd for software that has been specifically designed to be performant to go around grabbing 64k chunks of memory for no particularly good reason.
My guess would be that memory is grabbed for the payload on the very first heartbeat call and then re-used rather than freed and malloc'd every time.
Obviously I could be wrong, but so could you.
Anybody care to check? I'm not sure my C is good enough ...
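From memory, the vulnerable pattern boils down to something like this - heavily simplified, not the literal OpenSSL source (the function name and layout here are made up), so corrections welcome:

```cpp
#include <cstdint>
#include <cstdio>
#include <cstdlib>
#include <cstring>

// The attacker controls `payload`; nothing checks it against record_len,
// the number of bytes actually received. The eventual fix was, in essence:
//   if (3 + (std::size_t)payload > record_len) return; // silently drop
void handle_heartbeat(const unsigned char *record, std::size_t record_len) {
    uint16_t payload = (record[1] << 8) | record[2]; // claimed length
    const unsigned char *pl = record + 3;

    unsigned char *reply =
        static_cast<unsigned char *>(std::malloc(3 + payload));
    std::memcpy(reply + 3, pl, payload); // over-read of up to ~64 KB when
                                         // `payload` lies about the length
    std::free(reply);
    (void)record_len; // unused in the buggy version -- that's the bug
}

int main() {
    unsigned char rec[] = {1, 0x00, 0x04, 'd', 'a', 't', 'a'};
    handle_heartbeat(rec, sizeof rec); // honest request: claims 4, sends 4
    // A malicious request claims 0xFFFF while sending only a few bytes.
    return 0;
}
```

The relevant point for this thread: the over-read starts inside the single allocation holding the request and walks off the end of it, so on an ADI-style machine it would cross into memory of a different colour almost immediately.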
There must be something we all missed - apparently the M7 chip does protect against Heartbleed.
see here: ADI Demo
That page isn't working, at least for me, but I doubt it shows that SSM would prevent Heartbleed in every case. There are a huge number of possible permutations. And a huge number of targets for attackers to try, so that 1-in-16 would still have left an unacceptably high hit rate.
Also, it's much easier for something like this to catch something like Heartbleed if the attacker is aggressive and opens the DTLS heartbeat window as wide as possible (i.e., tries to grab all 64KB). A careful attacker might gradually widen it, hoping to get something useful before triggering a trap. That's easy to automate.
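The automation really is trivial - a sketch, with send_heartbeat() as a purely hypothetical stand-in for a real DTLS client:

```cpp
#include <cstdint>
#include <cstdio>
#include <string>

// Hypothetical stand-in for a real DTLS heartbeat client; here it just
// fabricates bytes so the sketch runs.
std::string send_heartbeat(uint16_t claimed_len) {
    return std::string(claimed_len, 'x');
}

int main() {
    std::string loot;
    for (uint32_t len = 64; len <= 65535; len *= 2) {
        loot += send_heartbeat(static_cast<uint16_t>(len));
        // A real probe would scan `loot` for keys or cookies and stop early,
        // rather than trip a trap (or an IDS) with one greedy 64 KB grab.
    }
    std::printf("collected %zu bytes\n", loot.size());
    return 0;
}
```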
Of course, the article says Larry claimed "SSM would have discovered Heartbleed". Discovering is not exactly the same thing as preventing.
"If it doesn't alert anyone, and SHE manages to (after enough attempts) hit matching pointer-block colors" in HER exploit, SHE'll be able to force her way inside the vulnerable application and face off any other defenses that may await HER, such as ASLR."
"It is even possible SHE can alter the color bits in a pointer to match the color of a block SHE wishes to access, and thus avoid any crashes and detection."
99.9% of hackers are men. Well, boys really. Aside from that, using "they" is the standard non-gender pronoun in English, so how about we give that a go instead of silly student union level language politics?
That's not political correctness. It's just not. You have a misogyny problem.
99.9% of hackers are not men. Most surveys put it at 85% at most.
So you'd expect to see at least 1 in 8 articles referring to "she". But then given that the human population is 50% women, why should the default be "he"?
And it's not going to get any better all the time there are grumpy, sexist trolls like you in the industry.
"That's not political correctness. It's just not. You have a misogyny problem."
Oh nicely argued, go straight for the accusation. What a cliche.
"Most surveys put it at 85% at most."
Cite.
"But then given that the human population is 50% women, why should the default be "he"?"
When most of the protagonists are one gender you use a gender appropriate pronoun. Perhaps you'd be happy seeing an article about nannies or pre school teachers refering to "he" all the time just to make a political point?
"And it's not going to get any better all the time there are grumpy, sexist trolls like you in the industry."
And sexism isn't going to go away by pretending the world is the way you want it to be. Grow up.
>99.9% of hackers are men. [no source cited]
>>"Most surveys put it at 85% at most." Cite.
Uh, okay. Basic fairness suggests that if you demand a source for a statistic, you do the same for the statistic you use. That's just good manners.
It is moot, though, because any statistic about the male/female make-up of a hidden group is shaky. As it is, how can we know anything about any hacker, sex, shoe-size, real name, whatever? For all we know, 50% of hackers are female, but 85% of the hackers that get caught and prosecuted are male. Unlikely, but, hey, not provable either.
However, we can say with confidence that *some* hackers are female.
They do it to be annoying. Everybody knows that 'he' in English means either 'he' or 'she', whilst 'she' means only 'she'.
It's different in Yank, but Yanks call black people 'African Americans' to pretend that they aren't persecuting them.
This sort of thing is called 'virtue signalling': the author (and it hardly ever is an authoress) is wanting to show how right-on and PC he is - to help him get some skirt. It's insulting to women, actually; they're not stupid, and can, of course, tell what the idiot is doing.
I used the pronoun 'she' in a Reg comment a few days ago, in reference to a hypothetical inventor in her shed. My logic was that some real inventors are women (no comment about percentages) so it would be no issue if some imaginary inventors were women. The vast majority of the time I use 'he' when writing about an imaginary individual in a context where their sex is irrelevant.
Since women are bright enough to recognise the context for 'he' meaning 'he/she', I then also credit men with the wit to read 'she' as 'he/she' if the context is appropriate.
I think visually, and maybe, after imagining a cluttered workbench in a shed, it wasn't really necessary to imagine the appearance of the shed's occupant. Doc Emmett Brown is great, but after all the coverage of Back to the Future day last week I didn't need to give him another mental cameo this week.
Look at xkcd cartoons. Sometimes the focus of his cartoon is a relationship between a man and a woman - the stick-figure with longer hair is the female, or sometimes a stick-figure is given a beard to denote maleness. The sexes of his figures are central to these cartoons.
Most of the time though, his cartoons are just about two physicists, or a doctor and a patient, or whoever. Sometimes he might make a doctor (stick-figure with white coat and clipboard) female (plus long hair) even though it doesn't affect the joke.
So, I guess I'm comparing pronouns with cartoon ponytails....
To cite the man himself:
"The role of gender in society is the most complicated thing I’ve ever spent a lot of time learning about, and I’ve spent a lot of time learning about quantum mechanics."
- http://blog.xkcd.com/2010/05/06/sex-and-gender/
I like to see "she" and "her" mixed in. It doesn't harm anyone. It reduces a low-grade feeling of exclusion for many of us which is a good thing. So why not? You seem to be starting from the assumption that it's some special effort to use "she" or "her" sometimes. I don't find it an effort, it's just something I do. And in any case, it's not your effort so why are you complaining?
Don't tell me it causes you some cognitive dissonance to have the hypothetical stand-in be female sometimes? Because if it does, that's not our problem.
Also, your "99.9%" figure is wrong.
"It reduces a low-grade feeling of exclusion for many of us which is a good thing. "
Oh get over yourself. Do you think any man anywhere has ever had a "low grade feeling of exclusion" from mainly female occupations? Do you think any of them would complain if a website like mumsnet kept using the term "she" instead of "he", or that nurses in charge are called "sister" and "matron" regardless of their sex? Fact: most people in IT are men. If you can't handle that then find another occupation.
>>Feel free to provide a list of well known female hackers.
Off the top of my head, you might want to look up "St. Jude" aka Jude Milhon, who was one of the original "hackers" and was unrepentant and highly vocal up until her death that hacking was a good thing because it improved software security. Susan Headley was one of the pioneers of the social engineering approach to hacking and part of the Cyberpunks group which, if you know your hacking history, was a significant part of early hacking movements. Joanna Rutkowska if you're willing to accept White Hats (which you should). Kristina Svechinskaya is another that achieved a fair bit of notoriety. Gigabyte is female.
Anyway, I'm sure I could find more if I looked but that's who I can think of right now. You really picked the wrong person to argue with about this as being both a feminist and a software engineer who's been around for a long time, you're actually going to get answers to your questions - which I'm sensing you weren't expecting. For what it's worth, when you wrote "hacker" I took that to mean in the original sense but it's plain from the rest that you're just using the modern meaning of someone who gets access to IT systems they shouldn't so that's what I gave you. But really it's a silly question as successful hackers are often anonymous. Maxim could be female - how would you know? Impact Team could be. Wank Worm could be. Okay, Wank Worm is probably male, I'll give you that.
Anyway, why you seized on hackers as your "proof" of male dominance I don't know. There are plenty of highly skilled female engineers who aren't hackers. I don't know why they don't count for you. But as you can see - yes there are female hackers and some pretty well-known ones as well. So I can assume you'll backtrack and admit that you're wrong now? Yes? :)
"you're actually going to get answers to your questions"
Makes a change on here.
"Anyway, why you seized on hackers as your "proof" of male dominance"
I really don't see why it's so hard for people like yourself to accept that there are substantially more men in IT than women. It's not up for debate, it's a fact. If I was referring to nurses I'd use "she" as a pronoun - for anyone in IT I'd use "he". If you think that's sexist or misogynist then frankly you need to see a psychiatrist.
"But as you can see - yes there are female hackers and some pretty well-known ones as well. So I can assume you'll backtrack and admit that you're wrong now? Yes? :)"
Knock yourself out: https://en.wikipedia.org/wiki/List_of_hackers
"They" has gained common usage as a non-gender specific pronoun these days. You see it in place of "he" quite often.
You see the utterance "innit" tacked onto just about everything certain people say. That doesn't make it right.
"They" might have come into common parlance, but the OP claimed it was the "standard non gender pronoun". Which it isn't.
Vic.
>>"You see the utterance "innit" tacked onto just about everything certain people say. That doesn't make it right."
Actually, I haven't heard anyone say "innit" for years, and even then it was used ironically. And what is inherently wrong with "innit" if someone does say it? It's just a mode of emphasis.
>>""They" might have come into common parlance, but the OP claimed it was the "standard non gender pronoun". Which it isn't."
And you wrote that "he" was the standard. But it isn't. Both "he" and "they" are in common usage. I use "they" when I'm referring to a real person whose gender I don't know. For hypotheticals I tend to mix up he or she and just keep it consistent within the context of the example.
"They" has gained common usage as a non-gender specific pronoun these days.
"These days" is wrong. The singular-they usage in English is older than Modern English.
Prescriptivists who claim it's "wrong" are merely demonstrating their ignorance of the language they claim to be defending. As is generally the case with prescriptivists, of course.
Aside from that, using "they" is the standard non-gender pronoun in English
It isn't. "He" is the non-gender-specific pronoun in English.
You're both wrong. The English language has no broadly-recognized standards-setting body, and prescriptive claims like these are complete rubbish.
Actually, since "they" is plural, not singular, using "they" as if it were a gender-neutral singular pronoun is both ungrammatical and an example of "silly student union-level language politics". So using "she", even though malicious hackers are usually male, is actually less bad.
Actually, since "they" is plural, not singular, using "they" as if it were a gender-neutral singular pronoun is both ungrammatical and an example of "silly student union-level language politics".
This is wrong in so many ways it's really quite impressive.
Since English pronouns do not have plural inflections, their number is solely a matter of usage, not grammar. So "they" is neither plural nor singular except in context. And "ungrammatical" (insofar as it means anything in English, which is not very far at all) is completely incorrect; if singular-they were any sort of error at all, it would be an error of usage.
Questions of the political aspects of language use have been taken up, at length, by people far more knowledgeable and insightful about language use than the inhabitants of student unions. The National Council of Teachers of English (NCTE) published its first set of guidelines for non-sexist usage in 1971, and has amended it ever since; and as a body I am quite confident they know a great deal more, and have spent a great deal more time considering, language use than the entire Reg commentariat - astounding as the thought might be to some of our august members.
And as for whether they are "silly", with or without your scare quotes: I know what their credentials are, Mr Savard. What are yours?
I seriously doubt that complete security is even possible. A typical modest PC has 10^10,000,000,000 possible states - that's a number with around 10 billion digits. And if you add a hard drive, the number of possible states is a number with trillions of digits. Then if you connect it to the internet . . .
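Back-of-the-envelope, for anyone checking the arithmetic - a sketch assuming 4 GB of RAM and a 1 TB drive (my assumptions for "typical modest"), using the fact that 2^n has n·log10(2) decimal digits:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double ram_bits  = 4e9 * 8;   // a modest 4 GB of RAM
    const double disk_bits = 1e12 * 8;  // plus a 1 TB drive
    // A machine holding n bits has 2^n states; 2^n has n*log10(2) digits.
    std::printf("RAM alone: ~%.1e digits\n", ram_bits * std::log10(2.0));
    std::printf("RAM + disk: ~%.1e digits\n",
                (ram_bits + disk_bits) * std::log10(2.0));
    // Roughly 9.6e9 (about 10 billion) and 2.4e12 digits respectively.
    return 0;
}
```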
" ...an imaginary individual in a context where their sex is irrelevant." (Dave 126)
Three cheers for avoiding the mealy-mouthed term "gender" when what's under discussion is the difference between male and female (namely sex). In my book, "gender" is a grammatical concept, not an anatomical one. As in German: das Mädchen (gender = neuter) refers to a little girl (sex = female).
<rant ends>
This post has been deleted by its author
Sounds like Larry's selling off the last of the Sun processor technology on his way out the door. Good luck to anyone who purchases one of these behemoths and is forever locked into Oracle's proprietary revenue generating machine. The license compliance teams from Oracle make the IRS look avuncular. And for the record, Larry will say anything to sell some more units.
Behemoth? A T7-2 - 64 cores, 512 threads, 1TB of memory - is 3U. Pretty small, really....
tbh, every vendor out there fits your description of 'Larry' - or do you think they're there to do us favours rather than make a profit?... I hope they keep it up, competition drives innovation... if it was just Intel making chips, we'd prob still be on 486s....