SQL Server Express 2008R2 for 20 or more users?

It depends; the question is quite ambiguous. Do you mean active connections to the server at the same time (via Management Studio), or an application or applications connecting to the server at the same time?

Assuming you're talking about an application, it could be that the memory limitations or limited features of SQL Server Express will hinder performance somewhat. I haven't really used Express in a while.

Good write-ups here:
MSDN Blogs limitations of SQL Server 2008 R2

Introduction to SQL Server Express 2008 R2

Or if wikipedia is your thing SQL Server Express Wikipedia

Memory and CPU may be a concern; it depends on your app and which features you are using. *EDIT* Also on what queries are running, how the database is deployed, whether it sits on the same server as the app, usage, I/O, etc.
 
Not sure if you are aware, but we are using software called "IRIS Accountancy Suite".
It has a SQL database running on our file/application server (Dell PowerEdge T320 - E5-2420, 32GB, 2x SAS RAID 0, dual gigabit). For that we use SQL Server Express 2008 R2.

The company has 18 people logging in to the IRIS application on their workstations - which means that many active connections to the SQL DB.

It is worth mentioning that the IRIS spec says it only recommends SQL Server Express for up to 6 users! :eek:
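If you want to double-check that connection count yourself, a quick sketch you can run in Management Studio against the Express instance (this uses the legacy sys.sysprocesses view, which is still present in 2008 R2; the SPID cut-off for system sessions is a rule of thumb):

```sql
-- Count user connections per database (system SPIDs are typically <= 50).
SELECT DB_NAME(dbid) AS database_name,
       COUNT(*)      AS active_connections
FROM sys.sysprocesses
WHERE spid > 50
GROUP BY dbid
ORDER BY active_connections DESC;
```

If the number for the IRIS database roughly matches your 18 workstations, the app is holding one connection per user rather than pooling.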
 
I've got a customer using IRIS, not sure how many users but it must be over 10 and, IIRC, that is SQL Express 2008 R2.

Is that a typo above or do you have RAID 0 on a server?
 
My bad, it is RAID 1 :)
It is not that bad for 18 users to be honest; it is just that there are always some who say "it is slow".
Sometimes when accessing certain data, IRIS goes into a busy state for 10-15 seconds, so some users get impatient and start clicking around to speed things up!
But the real problem will come in the near future, when we will be almost double the size we are now.
 
Some recommendations for you. You could try profiling the SQL database (if you have the tools) to see if it's SQL performance related - either memory or hard disk I/O causing the issue. I've never dealt with the software so I'm not sure.

The usual problem with slow databases boils down to how well the hardware copes with the configuration, or the software around it: bad queries, unoptimised queries, slow I/O, poor resource usage within the application. Connection pooling issues or inefficient memory use usually fall under that too.

How large is the database? Have you tried doing a sweep of the transaction logs to see if anything is going wrong? It may be worth trying a defragment of the tables or database Here
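You can also check fragmentation yourself with the built-in DMVs before reaching for a tool - a sketch, run while connected to the IRIS database (needs VIEW SERVER STATE permission; the thresholds are Microsoft's usual rule of thumb, not hard rules):

```sql
-- Fragmentation per index: ~5-30% is usually a REORGANIZE candidate,
-- over ~30% a REBUILD candidate. Tiny tables are ignored as noise.
SELECT OBJECT_NAME(ips.object_id)        AS table_name,
       i.name                            AS index_name,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id  = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 5
  AND ips.page_count > 100
ORDER BY ips.avg_fragmentation_in_percent DESC;
```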

There is a tool that can help you Here

Looking at the software, depending on the size of the DB on disk you could be hitting the 1GB memory limit on Express if each user is querying accountancy records. If all else seems fine but the profiler is showing temp tables as a problem, it could be the 1GB limitation, in which case you may need Web or Workgroup edition. SELECT and JOIN statements usually consume a lot of memory if they're pulling lots of records. Just 2p.
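To see whether you're actually bumping into that cap, a quick sketch (these performance counters exist in 2008 R2; on Express, "Total Server Memory" pinned near 1024 MB while "Target" wants more suggests the limit is the bottleneck):

```sql
-- Buffer pool usage vs. what the engine would like to use, in MB.
SELECT counter_name,
       cntr_value / 1024 AS value_mb
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Total Server Memory (KB)',
                       'Target Server Memory (KB)');

-- And the current database's size on disk for comparison.
EXEC sp_spaceused;
```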
 
Well, the database is 1.16GB today and growing every day.
It does use A LOT of SELECT and JOIN statements. Some of the views are very badly designed, but IRIS is not going to change that just because I say so :)

I will have a go at some of the things you have linked and suggested on the test server.
Thank you.
 
Just ran SQL Fragmentation Analyzer on the test server database (an exact copy of the main DB), and it shows 20 warnings and 103 critical tables - I guess that's bad, right? :(
 
It can be bad - not always; the analyzer goes by best practice, and it depends on the seriousness of the errors. Not all devs follow best practice when developing software. What errors is it showing?

That's a few more errors than you'd normally expect for a production, non-dev database; again, it depends how "critical" they are. Is it showing any database integrity or index errors?

The easiest solution is to upgrade the SQL Server edition to allow it to use more RAM, though that's not always the best solution - it's considered the easy, sloppy way to do it short term. I think it'll be harder to fix that many errors - again, depending what they are; at that point you'd be spending a lot of time optimising the performance of the DB.

It's really up to you, depending how valuable your time is: quick and easy, but it doesn't really fix the underlying issues, or slow and painful, fixing the underlying problems. Considering the devs of the software will probably not play ball, I'd recommend the easy method.
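For what it's worth, once you're on a paid edition the memory cap is just a configuration setting - a sketch (the 16000 MB value is only an illustration for a 32GB box shared with an app; pick whatever leaves enough RAM for the OS and IRIS itself):

```sql
-- Expose advanced options, then raise the buffer pool ceiling.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 16000;
RECONFIGURE;
```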
 