Um...
The file is telling T-1000 and T-800 to not visit /+LarryPage or /+SergeyBrin. Google's hoping that killer robots respect the file and leave them alone. It's a parody of robots.txt.
A chap named yueq over at Y Combinator has spotted evidence that Google has decided founders Larry Page and Sergey Brin need protection from Terminators. Proof comes in the form of Google's killer-robots.txt page, which reads as follows:

User-Agent: T-1000
User-Agent: T-800
Disallow: /+LarryPage
Disallow: /+SergeyBrin

In …
There was never a T-X in either of the two Terminator movies.
In the original timeline, the multiple films and spinoff series were so successful that they prepared humanity to easily fight off the machines. Because of this, Skynet sent back a number of Directinators to take over production of the later films and make them suck, and engineered the writers' strike to get the series cancelled.
I think Uwe Boll is part of a related plot.
> Notably they don't mind a visit from the T-X.
"Guys! Guys! There's a robot trying to kill me! Come over quick! It's *fantastic*!"
Skynet eventually went on to wipe out the entire human race with a single fluffy kitten with laser eyes (although it had to be careful not to look at the floor).
God. You lot are piss-poor excuses for sci-fi fans. It is abundantly clear that Eric Schmidt is the physical manifestation SkyNet has chosen for its current incarnation. Page and Brin know how the end will come. They are simply trying to delay the inevitable.
Epic fail on the part of the author!

Surely someone reporting for a largely tech-based site knows how to read a robots.txt file?
The User-Agent lines name the robot the rules that follow apply to, and the Disallow lines tell those robots which paths they must not visit. They really should have a "User-Agent: *" entry in there too, since this is a file aimed at killer robots generally.
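For anyone who wants to verify the semantics rather than take a commenter's word for it, Python's standard-library urllib.robotparser applies User-Agent and Disallow rules the way a well-behaved crawler (or, presumably, a law-abiding Terminator) would. A minimal sketch, feeding it the file contents as quoted above:

```python
from urllib.robotparser import RobotFileParser

# The killer-robots.txt contents as quoted in the article.
rules = """\
User-Agent: T-1000
User-Agent: T-800
Disallow: /+LarryPage
Disallow: /+SergeyBrin
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Both named Terminator models are barred from the founders' pages...
print(rp.can_fetch("T-1000", "/+LarryPage"))  # False
print(rp.can_fetch("T-800", "/+SergeyBrin"))  # False

# ...but a model not named in any User-Agent line is unrestricted,
# which is why a "User-Agent: *" entry would have been safer.
print(rp.can_fetch("T-X", "/+LarryPage"))     # True
```

This also confirms the earlier quip: with no wildcard entry, the T-X is perfectly free to visit.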