IT Support Mishaps

Ahh, I get plenty of these!

Got a call once at 9AM...

Client: Hi, I've just logged on and there's a red cross showing on the shared folders. Can you take a look please?
Me: What happens when you double-click it?
Client: It's working now! That was quick, thank you so much!

:rolleyes:

:D
Yeah we get that at work. I just tell people to open the folders as usual and they'll get the files they need. Wish I could solve it, as it's quite an annoying "fake" problem to have. I believe it's only happened since we got our new admin server a few months ago. :confused:
 
Yeah, it's a weird issue. Even refreshing my computer doesn't get rid of the red cross. It's only when you actually open the shared drive that the red cross disappears.

Happens only sometimes and for some clients. Happens intermittently on my home server network too.
 
Yep, I've noticed that; refreshing does nothing, but going into the drive fixes it. I don't know enough about Windows Server (yet! ;)) to say why, but it is almost as if opening the shared drive forces a proper connection, whereas the red cross is symptomatic of a cursory check by Windows on the availability of the drive. :confused: Hopefully it won't be long before I start learning more about Windows Server for my MCITP and get some understanding of what is causing this and how to fix it.
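For anyone who wants a band-aid while the real cause gets tracked down, a rough sketch along these lines (Python just for illustration; the drive letter and the \\adminserver\shared path are placeholders, not anyone's actual setup) could run as a scheduled or logon task to poke each mapped drive so Windows re-establishes the session instead of sitting on the red cross:

# Rough workaround sketch, not a proper fix: touch each mapped drive so
# Windows re-establishes the SMB session rather than leaving the red cross
# until someone opens the folder. Drive letter and UNC path are placeholders.
import os
import subprocess

MAPPED_DRIVES = {"S:": r"\\adminserver\shared"}  # hypothetical mapping

for letter, unc in MAPPED_DRIVES.items():
    try:
        os.listdir(letter + "\\")  # a real directory listing forces a reconnect
    except OSError:
        # if the mapping has dropped completely, re-map it with net use
        subprocess.run(["net", "use", letter, unc, "/persistent:yes"], check=False)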
 
Now now. I have worked in places where the women know their stuff, even more than most men. So forgive me for not siding with you when the influx of ladies comes down on you like a ton of bricks. :p

I think we only got about 15, and none of them are IT minions :p
 
Colleague of mine authored a "backup script" that actually deleted all of the users' data on one of our servers at a remote site.

The reason for the script was to create an off-site backup because the people at the site frequently forget to put the backup tape in.

Fortunately they remembered that night and his script ran after the tape backup had finished!
 
Not me personally as I was away on a course, but someone managed to log in as root and rm -rf /home just before Christmas. Some guys lost 20 years' worth of scripts and tools.

Funnily enough /home wasn't backed up anywhere due to running out of storage, and the department was too stingy to get any more.

There are a million reasons why this shouldn't have happened and ways it could be fixed so it never happens again, but none of them have been implemented. Scary.
 
In a moment of stupidity I once heard some beeping from one of our UPSs and had a total blonde moment, thinking "We have power, so I'll reset the UPS and it'll be fine." Cue every server going off and about 50 calls from around the school asking if there were network issues! "Yes, one of the servers just had a slight hiccup, I've restarted it now so 5-10 mins and it should be OK!" (I don't think anyone else knows to this day!!)


:D
 
It's a little bit IT related, but I send Medical Records out to Solicitors on encrypted disks and the password obviously goes by other means.
Many a time we get the disks sent back because they are incapable of opening them (even though I send an SOP), and they have always written the password on the cover :D

My favourite:
Solicitor : Can you send me some more Xray disks please
Me : Yeah, sure, what happened?
Solicitor : I opened up the packet and the disk fell onto the floor and smashed into pieces

Cue me flinging a disk round the office like a frisbee and throwing it into walls.
 
I have to regularly restart a particular card on a Promina node that I look after (it's a Multiservice Access Platform which we run multiple services on). The card in question provides timing for the PABX system I also look after. For whatever reason I had a brain fart, and I restarted the card but forgot to re-activate it. This was at the end of my shift, before finishing for the day. Cue hundreds of faults being logged by users who were unable to dial any numbers or make any calls. I had to be called in a few hours later to find out what was going on.

1 command later, it was all good to go. The biggest pain in the arse was going through all these faults, as the system they are logged on is unbelievably sloooow.
 
Unfortunately the place where I work has experienced a few major fails in the last few years.

One of my colleagues decided in his infinite wisdom to somehow uninstall Windows Installer from a customer's PC, then wondered why he couldn't install our software or anything else. I don't think he realised his mistake until there were 8 of us crowded around his PC laughing and making sarcastic comments.

We liaise with some of our customers' own IT support (we act as second/third line support), and one particular customer's IT support are known to be lazy and somewhat reluctant to do anything remotely difficult. Well, one of my colleagues had a general dislike for one of their staff, and he sent an email to who he thought was our manager (they had the same first name) basically saying "It's about time they did some f'n work. Lazy so-and-so's." He actually sent it to the guy he didn't like. Seeing his face when he realised his mistake was priceless. He spent the next 5 mins trying to recall the email, then about 2 mins after he stopped we got an email from the guy at the customer's IT support saying "Please stop trying to recall this email, I've forwarded it on to my manager."
He got a severe talking-to and a final verbal warning for it. He's not lived it down since.
 
Couple of weeks ago I was moving a server in the rack. Put it back in, went to plug it back in to the UPS, and accidentally knocked the mains power cable to the UPS (which must have been loose). Usually no problem, however the UPS batteries are knackered and we've not got the budget to replace them, so all the servers in the rack started shutting down... whoops :(
 
Not support, but I accidentally ripped out the LUN mappings for a live server VM :( Good thing it wasn't important, as I had taken a P2V of it just the previous day :o
 
thread revival :p

Had a mare of a time decommissioning a 2003 Exchange server today for what just so happens to be one of our biggest clients. While testing the public folder replication, for the life of me I could not figure out why Autodiscover was not working. If you're not familiar with this: if Autodiscover is not configured you can't set your Out of Office or use the Scheduling Assistant within Outlook. Updating my hosts file to point autodiscover at the Exchange server worked, albeit with an SSL error, and it was fine via webmail.

3 hours of googling, reading up on TechNet and racking my brains later, I realised that the laptop I'm working from isn't part of the domain, so Outlook was never going to work! A quick check with a user confirmed it had worked all along!

3 hours spent sweating it out for no good reason!! :rolleyes:
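For what it's worth, a quick sanity check along these lines (just a sketch; the hostname is a placeholder, not the client's real domain) would have told me whether the Autodiscover name even resolves and whether the endpoint answers over HTTPS before I started blaming the server. A certificate mismatch like the one I saw will make it fall over loudly too, which is a clue in itself:

# Quick sanity check for an Autodiscover headache like the one above:
# does the name resolve, and does the URL answer over HTTPS at all?
# The hostname is a placeholder, not the client's real domain.
import socket
import ssl
import http.client

HOST = "autodiscover.example.com"  # hypothetical client domain

print(socket.gethostbyname(HOST))  # where does DNS actually point?

conn = http.client.HTTPSConnection(HOST, context=ssl.create_default_context(), timeout=10)
conn.request("GET", "/autodiscover/autodiscover.xml")
print(conn.getresponse().status)  # a 401 or 403 here usually means the service is at least alive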
 
Glad to see this revived lol.

I've had a couple more crackers myself lately.

Had one incident where a user's laptop was somehow mysteriously typing by itself. I had a quick look to see how this was occurring, and even took the laptop off the docking station. Of course, taking it off the docking station actually stopped the self-typing.

So when I put the laptop back onto the docking station, it started self-typing again. I was at a loss until I looked over to the right of the laptop and noticed the user's handbag sitting on top of a keyboard attached to said docking station. :rolleyes:
 
Couple of weeks ago I was moving a server in the rack. Put it back in, went to plug it back in to the UPS, and accidentally knocked the mains power cable to the UPS (which must have been loose). Usually no problem, however the UPS batteries are knackered and we've not got the budget to replace them, so all the servers in the rack started shutting down... whoops :(

I trust you took the opportunity to bypass the useless UPS at this point then rather than leaving a single point of failure in your infrastructure to fail again....

;)
 
Doesn't matter now. They started a new company, took all the work, fee earners, and computers, then left IT, HR, and Finance in a bust company. 10 years' service, no 'sorry', no redundancy, no notice.

So ya know, **** 'em! :p
 