We have around 10,000 users, and at various stages in our logon script we have entries like this:
If Open(1, "\\server\share\file.csv", 5) = 0
  $SpecLine = $Dept + "," + %clientname% + "," + @USERID + "," + @FullName + "," + @DATE + ", at ," + @TIME
  $WriteData = WriteLine(1, $SpecLine + Chr(13) + Chr(10))
  Close(1)
EndIf
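One thing that might soften the Monday-morning lock contention in the meantime is retrying the Open() a few times instead of silently skipping the write when the file is held by another client. A rough sketch, assuming the same file and handle as above ($LogFile, the retry count, and the 1-second sleep are my own placeholders, not anything from the original script):

```
; Hedged sketch: retry Open() a few times if the shared log is locked
; by another client, instead of losing the record on the first failure.
$LogFile = "\\server\share\file.csv"
$Tries = 0
$rc = 1
While $Tries < 5 And $rc <> 0
  $rc = Open(1, $LogFile, 5)
  If $rc <> 0
    Sleep 1                       ; wait a second before retrying
    $Tries = $Tries + 1
  EndIf
Loop
If $rc = 0
  $SpecLine = $Dept + "," + %clientname% + "," + @USERID + "," + @FullName + "," + @DATE + ", at ," + @TIME
  $WriteData = WriteLine(1, $SpecLine + Chr(13) + Chr(10))
  Close(1)
EndIf
```

This doesn't fix the underlying contention, it just narrows the window where a logon goes unrecorded.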
Obviously these files grow fairly quickly, and I have archived them off numerous times. I just have a little niggle at the back of my mind about how the clients execute this.
If, say, the file has grown to 200 MB in size, do the clients have to pull the whole thing over the network in order to write to it? If so, that could be causing some issues, especially over slow links. Any thoughts? I have proposed that our logging should be done into SQL because of the amount we capture. The other issue is that on a Monday morning the file will probably be locked constantly, and not everything will be captured even if logons happen over a couple of hours.
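For the SQL proposal, the logging line could go straight into a table via ADO from KiXtart, which sidesteps both the file size and the file locking. A minimal sketch, assuming a SQL Server reachable from the clients with integrated security; the server, database, table, and column names here are all made up for illustration:

```
; Hedged sketch of the proposed SQL logging, using ADO via COM.
; Connection string, table, and column names are assumptions.
$Conn = CreateObject("ADODB.Connection")
$Conn.Open("Provider=SQLOLEDB;Data Source=SQLSERVER;Initial Catalog=LogonLog;Integrated Security=SSPI")
$SQL = "INSERT INTO LogonEvents (Dept, ClientName, UserId, FullName, LogonDate, LogonTime) VALUES ("
$SQL = $SQL + "'" + $Dept + "','" + %clientname% + "','" + @USERID + "','"
$SQL = $SQL + @FullName + "','" + @DATE + "','" + @TIME + "')"
$Conn.Execute($SQL)
$Conn.Close
```

In practice you'd want a parameterised command (or at least quote-escaping) rather than string concatenation, since @FullName can contain apostrophes, but the shape is the same: each client inserts one row and the database handles the concurrency.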
Does that make sense and can anyone clarify?
I will do a little testing next week