Spec a server

Associate · Joined: 18 Oct 2002 · Posts: 903 · Location: Heaven
Looking to upgrade a server. It's currently a 2.8GHz P4 with 1GB of PC3200 memory and 2 x 36GB Raptors, but recently, after moving to new software, it seems slow.

When users run the new software, a 130MB database is opened from the server (gigabit Ethernet). Up to 15 users can be using it simultaneously, and it can get slow transferring the information back and forth so all users see the same data.

Would upgrading the server to something newer make it better, or would more memory be enough?

All thoughts are welcome
thank you

Gas
 
You might try running some performance monitoring software to see where the major bottlenecks lie. If you're on Windows, run perfmon from the command line, and trace paging, CPU, memory, and disk queue length.
You might also consider checking network traffic, to rule out lag and contention. And finally, monitor the application software: are you serializing writes to log files from multiple threads, are you suffering from lock contention, and if you're .NET-based, are you hitting GC.Collect thresholds, and so on.
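As a concrete starting point, a counter list along these lines covers the areas mentioned above. On Windows 2000 these can be added to a Counter Log in perfmon itself; on XP/2003 and later the same file can be fed to `typeperf -cf counters.txt -si 15 -o perf.csv`. The counter names below are the standard English ones and may vary by locale:

```
\Processor(_Total)\% Processor Time
\Memory\Pages/sec
\Memory\Available MBytes
\PhysicalDisk(_Total)\Avg. Disk Queue Length
\PhysicalDisk(_Total)\Avg. Disk sec/Transfer
\Network Interface(*)\Bytes Total/sec
```

A sustained disk queue length well above the number of spindles, or Pages/sec spiking while Available MBytes is low, would point at disk or memory respectively rather than the CPU.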
Also consider monitoring database activity, such as table locks, index scans, full-table scans, and so on. Is the desktop software hoarding connections? Is it a proper DBMS, or file-based like Access?
If you're not Windows-based, I can't help you very much, 'cept to say that all modern operating systems provide a wealth of performance-capturing facilities.
Best of luck, btw.
 
JonRohan said:
So you're after a PC then, not a server. They are very different.

It's still a server if it serves files/databases, albeit a low-spec one. But still a server.

I specced up a cheap Opteron system a month or so back for around a grand. It had a Tyan board, a dual-core Opty, 2GB of memory and a couple of 74GB Raptors. I can work it out exactly if you need more details.
 
Street said:
It's still a server if it serves files/databases, albeit a low-spec one. But still a server.

I specced up a cheap Opteron system a month or so back for around a grand. It had a Tyan board, a dual-core Opty, 2GB of memory and a couple of 74GB Raptors. I can work it out exactly if you need more details.

Debatable. IMO a server is something with dual PSUs, dual NICs, hot-swap drives, etc.

mk17 has some interesting points. If you are running Windows, putting another 1GB of RAM in won't do any harm. It depends what else you have on the server and what other services and software are running.
 
A server is any device that provides a service. For a lot of small/medium businesses, servers with multiple redundancies are completely unnecessary and they won't see the benefit of the extra outlay.
 
Street said:
A server is any device that provides a service. For a lot of small/medium businesses, servers with multiple redundancies are completely unnecessary and they won't see the benefit of the extra outlay.

Depends on the service. 15 users on an SBS 2003 server running an office would be pretty important IMO and would require a server, not a hyped-up desktop PC.

The extra outlay isn't just so an IT consultant can spend money.

I'm going to cross the desert; shall I use a Mini?

I do agree that a server provides a service, but that doesn't necessarily mean it's on server hardware.
 
Thanks so far for all the replies. At the minute the 'server' runs Windows 2000 Server (16 months ago it was a 1200 Duron, 256MB of memory, a 10GB HDD and Windows 98 :p ) and all it does is serve the database, which is a 114MB Access database.

Over a gigabit LAN it takes around 7-15 seconds to open it up; is this about right?

Most of the end-user PCs are 3GHz with 512MB of memory.

Personally I don't see a problem, but then I don't use it ;)

What difference would a dual-core Opty offer?

Cheers


Gas
 
Windows 2000 should generally be happy enough with 1GB of RAM. Have you done things like defragmentation, virus checking, etc.?

100MB for an Access database is quite large. Maybe that's where the problems are coming from? Access is really a single-user database application; it was never designed for networked, heavy use.
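A quick back-of-envelope check (using the figures from this thread: a ~114MB file over a 1Gbit/s LAN) suggests the 7-15 second open times reported here aren't a raw bandwidth problem. This is only a rough sketch with deliberately optimistic assumptions:

```python
# Ideal wire time for pulling the whole MDB file once over the LAN.
# Assumptions: 114 MB file (as stated in the thread), 1 Gbit/s link,
# zero protocol overhead -- an optimistic lower bound.
size_mb = 114
link_mbit_per_s = 1000

ideal_seconds = size_mb * 8 / link_mbit_per_s
print(f"ideal transfer time: {ideal_seconds:.2f} s")  # ~0.91 s

# Observed open times of 7-15 s are roughly 8-16x slower, which points
# at Jet's page-by-page file access over SMB (plus .ldb lock handling)
# rather than the network link itself.
slowdown_low, slowdown_high = 7 / ideal_seconds, 15 / ideal_seconds
print(f"observed is {slowdown_low:.0f}-{slowdown_high:.0f}x the ideal")
```

In other words, a faster server or NIC is unlikely to fix an Access-shaped bottleneck.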

Jon
 
I've seen Access used as a distributed database, where the database is held on the users' PCs and the updates are synchronized later.

I think it helps if the database is designed with this in mind; the developer version of Office 2000 provides a tool to help with this. It worked, but it ain't pretty.

Maybe port it to SQL?

 
You might consider going to SQL Server 2005 Express and porting this Access database over, but this may require extra development on the frontend. Also run perfmon and watch what your CPU, memory and network are doing during working hours.

Also keep an eye on your disk accesses and, dare I say it, defrag (as long as it's not a domain controller). A fragmented disk can give bad performance.

I have seen Access databases of over 400MB, and I've heard that Access has a total size limit of 1GB? Have you tried compacting and repairing the MDB file?

Al
 