I used to know a finance director who had a favourite mantra: “Minimise fixed costs.” The concept's a simple one: by all means use permanent staff to deal with the aspects of your business that don't change much, but where your revenue streams go up and down, think of ways of allowing the cost of servicing those revenue streams …
You'd be surprised how inbred the telco industry is. I work for a very large global telco and we often get third-party tails from Cable and Wireless; you'd be surprised (or not) how often they are actually provided by BT (or COLT, etc.).
Also, our own internal sales teams don't seem to understand that when a customer says they want geographic diversity there is no point having two separate circuits going between four different sites if your on-net portion is on the same fibres! I have had to point out that they really should get a third-party supplier in for some of the corners of our more far-flung empire (the ex-USSR springs to mind), but they still don't get it!!
Re: Last mile.......
"Our own internal sales teams don't seem to understand that when a customer says they want geographic diversity there is no point having two separate circuits going between four different sites if your on-net portion is on the same fibres!"
Amen: It's amazing how hard it is to actually verify that circuits being sold as geographically diverse really are.
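For what it's worth, even a crude IP-layer check can catch the obvious cases. The sketch below (hop lists are made up for illustration) just compares two traceroute paths for shared hops - though of course it tells you nothing about two clean-looking paths that still ride the same duct or last-mile fibre, which is exactly the problem described above.

```python
# Sketch: flag shared hops between two circuits sold as "diverse".
# The hop lists are hypothetical; in practice you would parse real
# traceroute output (or, better, demand the carrier's fibre route maps).

def shared_hops(path_a, path_b):
    """Return hops appearing on both paths - any overlap means the
    circuits are not truly diverse, at least at the IP layer."""
    return sorted(set(path_a) & set(path_b))

# Hypothetical traceroute results for two supposedly diverse circuits:
circuit_a = ["10.0.0.1", "192.0.2.10", "192.0.2.50", "198.51.100.1"]
circuit_b = ["10.0.1.1", "192.0.2.99", "192.0.2.50", "203.0.113.1"]

overlap = shared_hops(circuit_a, circuit_b)
if overlap:
    print("Shared infrastructure at:", overlap)
```

A passing check here is necessary but nowhere near sufficient - the BT-reselling story below is invisible at this layer.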
IT has come a long way
Obviously, taking all IT and telecoms operations in-house is impossible. But IT has come a long way. Did you notice that, in order to increase reliability, car manufacturers have
. reduced the number of parts required to build a car
. standardized these parts across their range and subsequently
. reduced the number of suppliers?
The same is possible in your datacentre: Reduce the number of different operating systems, applications, middleware and databases, and I guarantee the IT operation will become more transparent, affordable and flexible.
Oh, and while we are at it, the number of datacentres can be reduced too. Instead of datacentres in the UK, Lithuania, Bangalore, Hong Kong, Manila and Chicago, we only need the UK and Hong Kong. And you don't need a layered approach of multiple suppliers and outsource service providers for all the bits and bobs of your operation; you can also train up staff and/or simplify processes. I've even seen the black swan: a company that saved money (!) by insourcing (!) and became more flexible as it liberated itself from a lock-in situation with a certain supplier.
Re: IT has come a long way
'The same is possible in your datacentre: Reduce the number of different operating systems, applications, middleware and databases, and I guarantee the IT operation will become more transparent, affordable and flexible.'
And can all be killed by the same bug....
Re: IT has come a long way
Instead of datacentres in the UK, Lithuania, Bangalore, Hong Kong, Manila and Chicago we only need UK and Hong Kong.
Sorry, fail. Given the various shenanigans of governments around the world, you need to separate your data centres by jurisdiction to be secure. (Loosely translated: you need to keep them out of US clutches.)
please please please
please be Capita next, please be Capita next, please be Capita next....
Whatever happened to the simple idea that s**t happens
Any supplier could go down the pan at any point and all companies should plan for this.
By all means hope for the best (and maybe even expect it) but don't plan for it.
I sort of thought that was what big league IT management was about.
The JCB Driver - yes, seen this before a couple of times - the most memorable being:
I was working in the US on a WAN back to the UK HQ. One day nothing worked, so we called up the UK to diagnose.
Digging in the street had taken out the main telco line. What about DR? Well, that ran next to the primary line, all the way from the server room, through the building, to the street, to the exchange. Right next to each other.
Oh, but we bought from different companies... who both happened to be reselling the same last mile lines from BT.
Nothing as fun as having to let customers know that email is now down globally, but it's all ok, they can use our new Yahoo account to contact us.
It can be the simple things ...
Some years ago I managed to wangle a place on a Business Continuity course, and very enlightening it was too. The guy doing the course wasn't a trainer by trade, he actually did "BC stuff" for a living - and as a result had some wonderful tales.
One he told us was how a number of businesses were locked out of their premises for a week after a lorry caught a phone line and pulled it down. How, one might ask, did that happen? Well, who hasn't heard "The Gasman Cometh" by Flanders and Swann? http://www.youtube.com/watch?v=zyeMFSzPgGc
Well after the lorry pulled down the overhead cable, BT decided it would be better underground - so started digging. When they hit the water main, the water people were called to deal with it. As the water people dug a hole to access the water main, they hit the gas main.
Everyone was evacuated, and by the time it was all sorted, they'd been out of their premises for a week!
And in response to the comments about resilient routing, he mentioned that as well. It's of particular interest to people responsible for emergency call centres - and apparently even they can struggle to find out what's really happening. In one example he cited, a new centre was built, with diverse routing to two separate exchanges - only it turned out that neither exchange was actually an exchange as we'd imagine it, both were in fact just satellites off one big exchange, and so a single point of failure.
Re: It can be the simple things ...
Then there was a certain Metropolitan Authority that moved its data centre out to cheaper premises in the inner suburbs. No problem: ultra-high-speed networking from more than one supplier, with separately routed, triple-redundant cabling. Then there was a fire in a super-secret WW2 tunnel under the city. ALL comms cabling was routed through it. Put it down to bad luck - no-one knew, no-one was allowed to know. Mind you, I never got a sensible answer from Networks as to why PCs in the new centre couldn't access servers in the new centre (it seems all routings were via the Town Hall).
The same people had earlier installed massive UPS backup; a few years later someone decided to test it - failure (it turned out the batteries were knackered: batteries are consumables, not capital items, but these babies would have used up several years' total budget). Never mind, they said, the genny can be started manually - only it couldn't: its battery was knackered too, and anyway there was no diesel, as Health and Safety wouldn't let it be stored on the roof.
Just simple things...
Expect the unexpected
I'm not going to mention names - I don't want them to come and get me. I worked for a large utility that had gone to the effort of having two centres, six miles apart, away from known aircraft routes, fault lines, rivers, volcanoes, ley lines and crop circles. They then built their billing machines across the two sites, with 100% redundancy of everything, right down to power from different grid supply points (we're talking 400,000V network tracing). As a utility, they billed millions a day at sod-all margin, so they needed the cash flow to pay suppliers quickly; we used to say that 10 days without billing meant even the banks would stop lending money.
I was a grad given a project to check the business continuity - what was expected to be another pointless exercise they ran every year to keep the useless grads busy for a month. They were in for a shock. A factory that was halfway between our sites, and used to process nothing in particular, had changed hands, so I wrote to the new owners asking what they did for business continuity and whether we could learn anything - it seemed like something to do in my month writing this report.
The start of paragraph two said it all "As a processor of chemicals who have a statutory 10 mile exclusion zone should there be a confirmed leak......"
Re: Expect the unexpected
"The start of paragraph two said it all "As a processor of chemicals who have a statutory 10 mile exclusion zone should there be a confirmed leak......""
Ouch. But fair play, could anyone have seen that one coming during planning?