Is there a way to reduce this down?

Associate
Joined
19 Jul 2006
Posts
1,847
OK, I'm writing reports for a database I have:

Code:
DateTime startDate = DateTime.Parse(start);
DateTime endDate = DateTime.Parse(end);

// Headline counts for the reporting period.
ViewData["TotalDuringPeriod"] = db.AlcoholForms.Count(x => x.CreatedDate >= startDate && x.CreatedDate <= endDate);
ViewData["TotalUniqueClientsDuringPeriod"] = db.AlcoholForms.Where(x => x.CreatedDate >= startDate && x.CreatedDate <= endDate).Select(x => x.ClientId).Distinct().Count();

// Referral counts split by destination and new vs. re-engagement.
ViewData["TotalGateNew"] = db.AlcoholForms.Count(x => x.ReferredTo == "Gate" && x.Reengagement == false && (x.CreatedDate >= startDate && x.CreatedDate <= endDate));
ViewData["TotalGateReengagment"] = db.AlcoholForms.Count(x => x.ReferredTo == "Gate" && x.Reengagement && (x.CreatedDate >= startDate && x.CreatedDate <= endDate));
ViewData["TotalSwitchNew"] = db.AlcoholForms.Count(x => x.ReferredTo == "Switch" && x.Reengagement == false && (x.CreatedDate >= startDate && x.CreatedDate <= endDate));
ViewData["TotalSwitchReengagement"] = db.AlcoholForms.Count(x => x.ReferredTo == "Switch" && x.Reengagement && (x.CreatedDate >= startDate && x.CreatedDate <= endDate));

// Detail rows for the report body, ordered by where the client was referred to.
List<ReportAlcoholReferals> reportData = db.AlcoholForms
    .Where(x => x.Referral && (x.CreatedDate >= startDate && x.CreatedDate <= endDate))
    .Select(x => new ReportAlcoholReferals()
    {
        FirstName = x.Client.FirstName,
        LastName = x.Client.LastName,
        Date = x.CreatedDate,
        ClientId = x.ClientId,
        AlcoholForm = x
    })
    .OrderBy(x => x.AlcoholForm.ReferredTo)
    .ToList();
return View(reportData);

Everything works, but is there a way to make it more efficient, as I'm hitting the DB 7 times?

Cheers
 

AJK

Associate
Joined
8 Sep 2009
Posts
1,722
Location
UK
It's not that inefficient. The COUNT queries shouldn't take any time at all to run (if they do, look at the DB), and there's only one query in there that actually pulls rows back. .NET connection pooling means the same database connection gets re-used for all the queries, so there won't be any additional overhead creating/destroying connections.

That said... you could write a stored procedure to return all the count values in one go, which would reduce your queries from 7 to 2.
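
As a rough sketch of a pure-LINQ alternative to the stored procedure (context and property names assumed from the snippet above): pull just the columns the counts need for the date range in one query, then do the counting in memory. Whether that beats six COUNT queries depends on how many rows fall inside the range, so measure it rather than treat it as a drop-in fix.

Code:
var inRange = db.AlcoholForms
    .Where(x => x.CreatedDate >= startDate && x.CreatedDate <= endDate)
    .Select(x => new { x.ReferredTo, x.Reengagement, x.ClientId })  // only the columns the counts need
    .ToList();                                                      // single round trip

ViewData["TotalDuringPeriod"] = inRange.Count;
ViewData["TotalUniqueClientsDuringPeriod"] = inRange.Select(r => r.ClientId).Distinct().Count();
ViewData["TotalGateNew"] = inRange.Count(r => r.ReferredTo == "Gate" && !r.Reengagement);
ViewData["TotalGateReengagment"] = inRange.Count(r => r.ReferredTo == "Gate" && r.Reengagement);
ViewData["TotalSwitchNew"] = inRange.Count(r => r.ReferredTo == "Switch" && !r.Reengagement);
ViewData["TotalSwitchReengagement"] = inRange.Count(r => r.ReferredTo == "Switch" && r.Reengagement);

That plus the detail SELECT gets you down to two queries, same as the stored-procedure route, at the cost of shipping the in-range rows over to the app.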
 
Associate
Joined
1 Dec 2005
Posts
803
Well it should be easier than World Peace, but possibly more difficult than inhaling and exhaling.

Are you familiar with stored procedures and their purpose? Personally I'd use a view rather than a stored procedure. But depending on what else you might need to do with the count information it may well be 6 of one and half-dozen of the other. Bottom line is you're then making only one database trip for the count information rather than six.

Depending on exactly what your x variable is, I'd wager you could produce all of the information for your report using a view (or at worst a stored procedure). T-SQL is very powerful and for things like this usually quicker than round-tripping and calculating things in code, even if your DAL is building SQL queries itself (they're often not very well formed).
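
If you do go the view route, the C# side stays small. A hedged sketch, assuming the context is an Entity Framework 6 DbContext (so db.Database.SqlQuery<T> is available) and using a made-up view name and row shape; the T-SQL for the view itself isn't shown here, and the column types are guesses.

Code:
// Hypothetical shape of the rows the view returns.
public class AlcoholReferralReportRow
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public DateTime Date { get; set; }
    public int ClientId { get; set; }
    public string ReferredTo { get; set; }
}

// In the controller action: one query, shaped by the view on the DB side.
List<AlcoholReferralReportRow> rows = db.Database
    .SqlQuery<AlcoholReferralReportRow>(
        "SELECT * FROM dbo.vw_AlcoholReferralReport WHERE Date >= @p0 AND Date <= @p1",
        startDate, endDate)
    .ToList();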

Just my 2p :)
 
Caporegime
Joined
18 Oct 2002
Posts
29,491
Location
Back in East London
While you're at it, don't bother with any where clauses, do it all in your app :rolleyes:
You do realise those .Select() and .Where() etc. methods are extension methods that don't actually manipulate the data, right? They just build up a query; the data isn't pulled until you enumerate the collection (.ToList()/.ToArray()/.First()/.Single() or a for[each] loop).
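
A tiny illustration of that, reusing the names from the snippet above:

Code:
// Building the query doesn't touch the database...
var pending = db.AlcoholForms
    .Where(x => x.CreatedDate >= startDate && x.CreatedDate <= endDate)
    .Select(x => x.ClientId);   // still no SQL sent at this point

// ...SQL is only generated and executed when the query is enumerated.
var ids = pending.ToList();     // round trip happens here
int total = pending.Count();    // and a second, separate round trip here

Each enumeration is its own query, which is exactly why the original code hits the database seven times: six Count() calls plus the final ToList().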
 
Associate
OP
Joined
19 Jul 2006
Posts
1,847
Thanks for the advice, guys.

I've never done a stored procedure before; I may look into them in the future, but after talking this through with my mate, the program is never going to get to the size where this will be an issue.
 
Soldato
Joined
27 Mar 2003
Posts
2,710
Just as a side note, and something I have done recently with an app that aggregates daily stats: why not have a daily summary table that is incremented with the summary data?

If I'm understanding this correctly, the ViewData statements are probably being assigned to some headers/sections on your report, for "total x in this period" etc.

For example, in my application we had a simple summary table kept on a per-day basis, which consists of the following structure (well, the important bits for this example):

day, files processed, files imported successfully, files not imported, potential value

This would then create a new row for each day on the first file update (if one doesn't already exist), then increment it until the next day, etc.

So you end up with something like:
day, files processed, files imported successfully, files not imported, potential value
10 Feb 2014, 20, 10, 10, £600
11 Feb 2014, 30, 25, 5, £400
12 Feb 2014, 50, 50, 0, £900

Well hopefully you get the idea.

This data can then be easily aggregated and you don't have to go through potentially hundreds/thousands of rows to get your aggregated totals.

Obviously you'd change the summary table to match the data you need. Just from looking at it, you could put "TotalDuringPeriod", "TotalGateNew", "TotalGateReengagment", "TotalSwitchNew" and "TotalSwitchReengagement" into your summary table, and then this could be filtered using your date criteria to sum up the values you need.

The "TotalUniqueClientsDuringPeriod" would still need to be calculated due to the "distinct" clause, but this could save you time/processing power if you start having large volumes of data and selecting large date windows.

This summary table could be updated as part of whatever save process you have for getting this information into your system in the first place, or it could be done as a scheduled job via SQL Agent (if using MS SQL) to update this info at set periods of time. Again, this could all be done via a stored procedure, but if SQL is not your strong point then stick with what you know.
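
For what that could look like in the app's save path rather than in SQL, here's a hedged C# sketch. Everything in it is made up for illustration (the DailySummary entity, its property names, and the DbContext set); it just shows the "create the row on the first update of the day, then increment" idea and how the totals fall out afterwards.

Code:
// Hypothetical per-day summary entity matching the columns described above.
public class DailySummary
{
    public DateTime Day { get; set; }          // one row per calendar day
    public int FilesProcessed { get; set; }
    public int FilesImported { get; set; }
    public int FilesNotImported { get; set; }
    public decimal PotentialValue { get; set; }
}

// Called from whatever save process brings a file into the system
// (ignoring concurrency for brevity).
public void RecordFile(MyDbContext db, DateTime processedAt, bool imported, decimal potentialValue)
{
    var day = processedAt.Date;
    var summary = db.DailySummaries.SingleOrDefault(s => s.Day == day);
    if (summary == null)
    {
        // First file of the day: create the row.
        summary = new DailySummary { Day = day };
        db.DailySummaries.Add(summary);
    }

    summary.FilesProcessed++;
    if (imported) summary.FilesImported++; else summary.FilesNotImported++;
    summary.PotentialValue += potentialValue;

    db.SaveChanges();
}

// Reporting over a date window is then a sum over a handful of summary rows
// instead of a scan of the full forms table.
var window = db.DailySummaries.Where(s => s.Day >= startDate && s.Day <= endDate);
int processed = window.Sum(s => s.FilesProcessed);
decimal value = window.Sum(s => s.PotentialValue);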

Just my thoughts, looking at it from more of a DB world than an app world.

But if it isn't broken and performance isn't an issue, then just leave it be. (I'm probably as guilty as anyone of over-engineering/optimising code when it's not needed.)

As I said, just my thoughts from looking at this for 10 minutes or so.
 