
Yesterday's Downtime, Apologies & Explanation


Jay Kae


This story begins about 60 hours ago, when EIG went to do some routine maintenance on their datacenter. During this maintenance something happened that was not part of the plan; I will not go into detail on what that was, but let's just say it was not good.


 


Errors started occurring on websites that should never occur. All the services I run are hosted via EIG, and apart from my 'hobby' services that also includes a plethora of services for my work life. Let me just say that Orbx, REX and OZx were the least of my problems over the last 60 hours. In this case there was nothing I could have done to avoid the downtime: it was not just my server that was down, but the whole datacenter that had bitten the dust, and everything went south.


 


I can only apologise for the inconvenience this has caused.


 


No matter what contingency plan I could have had in place, this was a one-in-a-billion chance of happening, and ... it happened.


No matter what contingency plan I could have had in place, this was a one-in-a-billion chance of happening, and ... it happened.

 

Jeeeeeeeeeeeeeez! I wish we were all that lucky with the lottery :D

 

No worries Jay, thank you for your tireless efforts keeping us up and running 8)


Hello Jay Kae

 

Thank you for the explanation; from what I read, it was a "global" problem.
I think all of us here completely understand that things happen!

No need to apologize, really.

 

 

Thank you for your hard work keeping so many sites working like a charm.

 

BTW: I had never heard of RailGun before, but after reading about it we are looking into it and will probably implement it on one of our Telemedicine Solutions.

 

Cheers.


Reminds me of a major outage we experienced a few years ago in one of our (co-hosted) Tier 3 global data centers in the UK (not going to name them, but let's just say a MAJOR provider).


 


They were performing maintenance when the entire data center lost power. That's not supposed to happen; in fact, we were guaranteed it could never happen. The battery backup did not come on, and because their generator system monitored the battery backups for runtime, using battery drain as the signal to kick off the generators, it never saw a drain... so no generator power either. Darkness.
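
To make that failure mode concrete, here is a minimal sketch in Python (the names and structure are hypothetical illustrations of the trigger logic as described, not the provider's actual control system) of why starting the generators only on observed battery drain fails when the battery backup itself never engages:

```python
# Minimal sketch (hypothetical names) of the flawed trigger logic described
# above: the generators start only when the controller observes the battery
# bank draining, so a battery backup that never engages silently disables
# the generators as well.

from dataclasses import dataclass


@dataclass
class PowerState:
    mains_ok: bool          # is utility power present?
    battery_draining: bool  # is current being drawn from the battery bank?


def should_start_generators(state: PowerState) -> bool:
    # Flawed rule as described in the post: infer a mains outage from
    # battery drain instead of monitoring the mains supply directly.
    return state.battery_draining


def should_start_generators_fixed(state: PowerState) -> bool:
    # Safer rule: key the generators off the actual fault (loss of mains).
    return not state.mains_ok


# Mains fails AND the battery backup fails to engage: no drain is ever
# observed, so the flawed rule never starts the generators. Darkness.
outage = PowerState(mains_ok=False, battery_draining=False)
print(should_start_generators(outage))        # False -> data center goes dark
print(should_start_generators_fixed(outage))  # True  -> generators start
```

In other words, the generator start condition was keyed off a symptom (battery drain) rather than the actual fault (loss of mains), so a second failure upstream silently masked the outage.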


 


It turned out to be a part that cost no more than 5 pounds, no kidding. A one-in-a-billion chance... needless to say, they were supremely embarrassed.


 


Jay Kae - I remember what you did while one of our brethren's forums was under a DoS attack: you literally stayed up three days straight fighting it off... yeah, no need for you to apologize, mate. In fact, if you're ever looking for a new job, PM me; I manage global IT operations and will get you in our doors pronto!



Archived

This topic is now archived and is closed to further replies.
