I've been developing with C# for years now, but it's all been web based, so I have almost zero experience with WinForms setup. I want to make a pretty basic web crawler that can crawl over our website looking for any issues.
I've played around with the main crawling and parsing code in LINQPad, but I need to get it into an actual app now. It'll be a fairly basic XAML .NET 4 app.
Is there anywhere that gives good advice on which way to do threading/shared collections in WinForms? I've looked around and there seem to be loads of ways now. My thought was to have part of the code do the actual scraping, putting the responses into a shared container; then another process can parse them and put the errors and newly found links back into other shared containers. The part that does the actual scraping will need some form of slicing/threading as well to get decent performance out of it.
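For that kind of producer/consumer pipeline on .NET 4, one common approach is `BlockingCollection<T>` from `System.Collections.Concurrent`, with `Task`s for the workers. Here's a rough sketch under those assumptions — the seed URL, worker count, and class name are placeholders, and the parsing step is left as a stub:

```csharp
// Sketch: producer/consumer crawler pipeline using .NET 4 primitives.
// Assumes BlockingCollection<T> as the shared containers and TPL Tasks as workers.
using System;
using System.Collections.Concurrent;
using System.Net;
using System.Threading.Tasks;

class CrawlerSketch
{
    // Downloaded pages waiting to be parsed (bounded so fetchers
    // block instead of running far ahead of the parser).
    static readonly BlockingCollection<string> Pages =
        new BlockingCollection<string>(boundedCapacity: 100);

    // Newly discovered links waiting to be fetched.
    static readonly BlockingCollection<Uri> Links =
        new BlockingCollection<Uri>();

    static void Main()
    {
        Links.Add(new Uri("http://example.com/")); // placeholder seed URL

        // Several fetcher tasks share the Links queue; GetConsumingEnumerable
        // blocks when the queue is empty and exits after CompleteAdding().
        var fetchers = new Task[4];
        for (int i = 0; i < fetchers.Length; i++)
        {
            fetchers[i] = Task.Factory.StartNew(() =>
            {
                foreach (var uri in Links.GetConsumingEnumerable())
                {
                    using (var client = new WebClient())
                        Pages.Add(client.DownloadString(uri));
                }
            });
        }

        // One parser task consumes pages, records errors, and feeds
        // new links back to the fetchers via Links.Add(...).
        var parser = Task.Factory.StartNew(() =>
        {
            foreach (var html in Pages.GetConsumingEnumerable())
            {
                // parse html here; Links.Add(newUri) for each unseen URL
            }
        });

        // Shutdown: you'd need to track outstanding work (e.g. a visited set
        // plus a pending counter) and call Links.CompleteAdding() when the
        // crawl is exhausted, then Pages.CompleteAdding() after the fetchers
        // drain, so both loops terminate cleanly.
        Task.WaitAll(fetchers);
        Pages.CompleteAdding();
        parser.Wait();
    }
}
```

The nice part of this shape is that the "slicing" you mention falls out for free: you scale the scraping side just by changing the number of fetcher tasks, since `BlockingCollection<T>` handles all the locking. One caveat for the UI app: anything that touches controls still has to be marshalled back to the UI thread (e.g. via `Control.Invoke` in WinForms or `Dispatcher.Invoke` in WPF).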