I’ve picked up a new task at work. In short, it involves reviewing several log files covering a 24-hour period to make sure there are no abnormal results. Some of the errors are known and deemed acceptable, whereas others may indicate a failure. The frequency of the errors must also be monitored. The problem is that there are no guidelines as to what is normal and what is not. It’s all done based on experience, but the guy with all the experience is leaving…
My plan is to start grabbing the logs and recording them somewhere so I have a history. From there I will try to flag which errors are normal or abnormal, and work out the ‘normal’ number of occurrences of each error over a given period.
My problem is that I don’t know the best way to record the data, or how exactly to analyse it! Does anyone have any advice? I’m not looking for something super sophisticated and autonomous with all the bells and whistles, just something that will help me out a bit. It’s unlikely that I’ll be able to install software, so I’d prefer to achieve this in Excel or Access.
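To make it concrete, below is roughly the kind of tally I have in mind, written as a Python sketch just to illustrate the idea (I know I may end up doing it in Excel/VBA instead). The log format and error text in it are made up; my real logs will look different. It counts how many times each error appears per day and writes the result to a CSV that I could open in Excel to chart trends and spot anything unusual.

    # Sketch only: tally error occurrences per day from plain-text logs.
    # Assumed (made-up) line format: "2024-01-15 03:42:10 ERROR Disk quota exceeded"
    import csv
    import re
    from collections import Counter

    LINE_RE = re.compile(r"^(\d{4}-\d{2}-\d{2}) \S+ (ERROR|WARN) (.+)$")

    def tally(log_paths):
        """Count occurrences of each distinct error message per day."""
        counts = Counter()  # keyed by (date, error message)
        for path in log_paths:
            with open(path, encoding="utf-8", errors="replace") as f:
                for line in f:
                    m = LINE_RE.match(line)
                    if m:
                        date, _level, message = m.groups()
                        counts[(date, message.strip())] += 1
        return counts

    def write_csv(counts, out_path="error_history.csv"):
        """Write the tally to a CSV that Excel can open for trending."""
        with open(out_path, "w", newline="", encoding="utf-8") as f:
            writer = csv.writer(f)
            writer.writerow(["date", "error", "count"])
            for (date, message), n in sorted(counts.items()):
                writer.writerow([date, message, n])

    if __name__ == "__main__":
        write_csv(tally(["example.log"]))

Something like that would at least give me a history of date / error / count that I could filter and pivot, but I’m open to better ways of structuring it.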
Thanks for any advice.