Xbox Live down all day?!

Soldato · Joined 19 Feb 2004 · 15,102 posts · Darlington, UK
Just got in from work and I find out that the Live service is offline until 15:00 PST, so that would be 11pm tonight? Great! :(

Couldn't see this posted anywhere else here so just letting the rest of you guys know.
 
My Live has been dead since I changed my username last Tuesday; apparently I'm on a "priority list". I may as well not bother phoning again tonight, as no doubt they will be busy answering "why can't I connect" calls.

Pants
 
Xbox Live is set to go offline from 0200 to 1500 Tuesday Pacific Time (1000 to 2300 Tuesday UK time). Speculation is that this is pre-E3 servicing, ready for E3 content to be placed on the Marketplace.
 
Aye, went off at 10 this morning when I was playing FFXI. Luckily FFXI is nothing to do with Live and I can still play :-)
 
This has always confused me. Why spend a whole day with 0 service?

They should either take half their servers offline and patch them, then swap them over with the other half and patch those. Or (and this is the correct way) they should already have enough redundancy in the system that, in the event of a disaster, they have failover sites currently sitting idle. They should patch the failover sites, then take the live sites offline and fail over to the backups. Once everything is tested and working, they should patch the live sites and either leave them as the new failover sites or perform another failover, putting the original live sites back as live and returning the original backup sites to the backup role.

That way we would only have, say, a 30-minute outage per failover at most, whilst the DNS entries are updated.
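The patch-then-failover sequence described above can be sketched in a few lines. This is a toy model, not Microsoft's actual setup; the site names and the dict-based "DNS" are made up for illustration:

```python
# Sketch of the failover-patching sequence described above.
# Hypothetical site names; the "DNS" here is just a dict.

def patch(site):
    """Apply updates to a site while it carries no live traffic."""
    site["patched"] = True

def failover(dns, target):
    """Repoint the DNS entry at the target site (the short outage window)."""
    dns["live"] = target

# Start: 'a' is live, 'b' is the idle failover site.
site_a = {"name": "a", "patched": False}
site_b = {"name": "b", "patched": False}
dns = {"live": site_a}

patch(site_b)            # 1. patch the idle failover site first
failover(dns, site_b)    # 2. fail over; 'b' now serves live traffic
patch(site_a)            # 3. patch the old live site while it sits idle
failover(dns, site_a)    # 4. optional: fail back so 'a' is live again,
                         #    leaving 'b' as the failover site

print(dns["live"]["name"])  # → a
```

The only user-visible outages are the two `failover` calls, which in practice last roughly as long as the DNS change takes to propagate, rather than the whole patch window.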
 
Guys, this is in preparation for E3, so the maintenance is needed. This is the first time XBL/Xbox.com has had downtime since launch, iirc. Read or listen to Major Nelson's podcast; big things are about to happen.
 
Kronologic said:
This has always confused me. Why spend a whole day with 0 service?

They should either take half their servers offline and patch them, then swap them over with the other half and patch those. Or (and this is the correct way) they should already have enough redundancy in the system that, in the event of a disaster, they have failover sites currently sitting idle. They should patch the failover sites, then take the live sites offline and fail over to the backups. Once everything is tested and working, they should patch the live sites and either leave them as the new failover sites or perform another failover, putting the original live sites back as live and returning the original backup sites to the backup role.

That way we would only have, say, a 30-minute outage per failover at most, whilst the DNS entries are updated.
I think they will have thought through all the possibilities.

It may well be for a better-designed Marketplace menu too, not to mention background downloads.
 