Monitoring performance for non-techy people

Soldato · Joined 19 Mar 2012 · Posts 6,611
I'm not sure if this is the right place for this, so please move it if required.

I have just rolled out a software solution to the department I work for as a business analyst. The software is Visokio Omniscope, used to view and cut data sets.

We are stuck with this particular software as it is used by the 3rd party we have employed to enrich and classify spend data.

I have been using it for some time on my own desktop with no issues at all.

I am running an Intel Core 2 Duo E8400 @ 3GHz with 4GB of RAM on Windows 7 Enterprise 32-bit.

All the other users in the department are reporting unacceptable performance when querying the data set on their laptops. They all have 4GB of RAM and Windows 7 Enterprise 32-bit, though varying processors, mostly i5-3320M @ 2.6GHz or similar.

Is there anything I can ask them to run that will identify where the bottlenecks are? Bear in mind I work remotely, so I will not be physically there for two weeks, and most of them are not at all technical, so things like Resource Monitor will just scare them.

Thanks in advance.
 
Where are the data sets located? Are they local on each machine, or stored on a central share somewhere? If it is the latter, then I would suggest starting by looking at their connection to it. Also, you need to find out what 'unacceptable' is. Is it definitely abnormal?
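If the files do turn out to live on a share, one quick sanity check that needs no technical knowledge to interpret is to time a full read of the file from each location. A minimal sketch in Python (the file paths in the comment are illustrative, not anything from Omniscope):

```python
import time
from pathlib import Path

def read_throughput_mb_s(path):
    """Time one full sequential read of the file and return MB/s."""
    p = Path(path)
    size_mb = p.stat().st_size / (1024 * 1024)
    start = time.perf_counter()
    p.read_bytes()  # read the whole file once
    elapsed = time.perf_counter() - start
    return size_mb / elapsed if elapsed > 0 else float("inf")

# Example (hypothetical paths): compare the local copy against the share.
# A local disk should manage well over 50 MB/s; a flaky share may not.
# local = read_throughput_mb_s(r"C:\Data\spend_data.iok")
# share = read_throughput_mb_s(r"\\server\share\spend_data.iok")
```

If the share figure is an order of magnitude lower than the local one, the network is at least part of the story.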

edit: There is a Windows Software forum which will probably be better than General Hardware
 
Just get them to watch the Task Manager Performance tab while it's running slow, and see which resources are running out and what the CPU % is running at.
 
The data set was a 30MB file. I advised them all to move it from the shared drive I had used to distribute it onto their local hard drives, to avoid any connection issues and to be able to work on trains etc., as we do a lot of travelling / working from home and our work network is notoriously flaky, which is one of the reasons I went for this solution rather than some of the cloud-based ones.

The 3rd party then "optimised" the file down to 10MB, though they did this by removing most of the data, which I pointed out was not optimising...

Anyway, even with the 10MB file they are waiting 30 seconds or so for an update of the various graphs that my machine draws instantaneously, and in some cases it just hangs. I did witness this yesterday when I was in the office installing the viewer and talking them through it, using the 30MB file.

Obviously "acceptable" is subjective, but the solution I had designed as an interim measure, using Excel pivot tables and ODBC connections to Access, is significantly faster, so now they are pushing back on the new solution as it makes their lives more difficult.

I would use Task Manager and Resource Monitor if I was there, but these guys are all Procurement specialists and their eyes glaze over when I start talking about such things.

I may just have to wait until I'm back in the office again, but you never know, I may be able to get a head start on identifying how to improve the situation.
 

You could always get them to install UltraVNC. That way, when it's happening, you can take control and see for yourself.
 
It definitely sounds like a weird one! At this stage, if it were my organisation, I'd be pushing this back to the supplier to come up with a fix.
 
Ok, thanks for all the feedback so far.

I've got a bit further now.

CPU and RAM are definitely not maxing out when these slowdowns occur on the laptops.
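One caveat worth keeping in mind when reading that CPU figure: an overall percentage well below 100% can still hide a single-threaded bottleneck, because Task Manager averages across all logical cores. A quick sketch of the arithmetic (the core counts are just illustrative):

```python
def overall_cpu_pct(busy_cores, logical_cores):
    """Overall CPU % reported when `busy_cores` threads are pegged at 100%."""
    return 100.0 * busy_cores / logical_cores

# An i5-3320M has 2 cores / 4 threads, so a single maxed-out thread
# shows up as only 25% overall CPU in Task Manager.
print(overall_cpu_pct(1, 4))  # 25.0
```

So if the application is single-threaded, the laptops could sit at ~25% "CPU" while one core is completely saturated.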

The slowdown is also worse when there are a lot of columns in a data set; when you limit the data, and therefore the number of columns, the application works far better.
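For what it's worth, cost scaling with the *total* number of columns rather than the visible ones is exactly what you'd expect if the viewer isn't virtualising its drawing. A toy sketch of the difference, nothing to do with Omniscope's actual internals:

```python
def paint_naive(total_columns):
    """Lay out and 'paint' every column, including off-screen ones."""
    return sum(1 for _ in range(total_columns))  # one unit of work per column

def paint_virtualised(total_columns, first_visible, visible_count):
    """Only paint the columns that actually fit in the viewport."""
    last = min(first_visible + visible_count, total_columns)
    return sum(1 for _ in range(first_visible, last))

# Full data set: 4,000 columns, roughly 20 on screen at once.
# The naive painter does 4,000 units of work per redraw, the
# virtualised one only 20 - a 200x gap that would plausibly
# explain a 30-second refresh versus an instant one.
```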

Clutching at straws, I wondered if it was the graphics capability of the laptops, so I disabled the graphics card in my desktop, and I get the exact same issues as on the laptops.

The weird thing is, it's not the number of columns that are visible on the screen; you have to scroll to see what could potentially be 4,000 columns on the full data set.

Is it even possible that the application is asking the graphics card / onboard graphics to effectively draw columns that aren't on screen?

edit:

Re-enabled my card and still getting the issues.

Yep, pushing it back to the vendor now; this is just weird.
 