Writing tests for legacy code

Soldato | Joined 1 Mar 2003 | Posts 5,508 | Location: Cotham, Bristol
I recently posted a question on the Programmers Stack Exchange about writing tests for legacy code prior to a refactor/redesign of a code base.

I've read some answers to questions along a similar line, such as "How do you keep your unit tests working when refactoring?". In my case the scenario is slightly different in that I've been given a project to review and bring in line with some standards we have; currently there are no tests at all for the project!

I've identified a number of things I think could have been done better, such as NOT mixing DAO-type code into the service layer.

Before refactoring, it seemed like a good idea to write tests for the existing code. The problem, as it appears to me, is that when I do refactor, those tests will break, because I'm changing where certain logic is done and the tests will have been written with the previous structure in mind (mocked dependencies etc.).

In my case, what would be the best way to proceed? I'm tempted to write the tests around the refactored code, but I'm aware there is a risk that I may refactor things incorrectly in a way that changes the desired behaviour.

Whether this is a refactor or a redesign, I'm happy for my understanding of those terms to be corrected; currently I'm working with the following definition of refactoring: "With refactoring, by definition, you don't change what your software does, you change how it does it." So I'm not changing what the software does, I'd be changing how/where it does it.

Equally, I can see the argument that if I'm changing method signatures, that could be considered a redesign.

Here's a brief example:


MyDocumentService.java (current)
Code:
public class MyDocumentService {
   ...
   public List<Document> findAllDocuments() {
      DataResultSet rs = documentDAO.findAllDocuments();
      List<Document> documents = new ArrayList<>();
      for (DataObject row : rs.getRows()) {
         // get row data, create a new Document and
         // add it to the documents list
      }

      return documents;
   }
}

MyDocumentService.java (refactored/redesigned, whatever you want to call it)
Code:
public class MyDocumentService {
   ...
   public List<Document> findAllDocuments() {
      //Code dealing with DataResultSet moved back up to DAO
      //DAO now returns a List<Document> instead of a DataResultSet
      return documentDAO.findAllDocuments();
   }
}
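For context, here's roughly what the DAO end of that change would look like (just a sketch; runQuery and mapRowToDocument are placeholders for however the DAO actually runs its query and maps rows today):

Code:
public class DocumentDAO {
   ...
   public List<Document> findAllDocuments() {
      // the DataResultSet handling that used to sit in the service
      // now stays inside the DAO, which returns domain objects instead
      DataResultSet rs = runQuery(...); // however the DAO actually executes its query
      List<Document> documents = new ArrayList<>();
      for (DataObject row : rs.getRows()) {
         documents.add(mapRowToDocument(row)); // placeholder mapping helper
      }
      return documents;
   }
}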

As Stack Exchange doesn't really allow a forum-style conversation, I thought I'd continue it here, highlighting a few answers/comments that stuck out for me, along with my thoughts.

A lot of the answers boil down to testing at the right level, i.e. integration testing instead of unit testing.

You're looking for tests that check for regressions, i.e. breaking some existing behaviour. I would start by identifying at what level that behaviour will remain the same, and that the interface driving that behaviour will remain the same, and start putting in tests at that point.

You now have some tests that will assert that whatever you do below this level, your behaviour remains the same.

You're quite right to question how the tests and code can remain in sync. If your interface to a component remains the same, then you can write a test around this and assert the same conditions for both implementations (as you create the new implementation). If it doesn't, then you have to accept that a test for a redundant component is a redundant test.
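For me that would mean pinning the behaviour down at the service's public interface, which isn't changing. Something like this (a rough sketch, assuming JUnit; documentDAOBackedByTestData(), the counts and getTitle() are all made up for illustration):

Code:
@Test
public void findAllDocuments_behaviourUnchangedByRefactor() {
   // DAO wired up to known test data, so the test only depends on the
   // service's public interface and survives moving logic around below it
   MyDocumentService service = new MyDocumentService(documentDAOBackedByTestData());

   List<Document> documents = service.findAllDocuments();

   // assert on observable behaviour, not on how the list gets built
   assertEquals(3, documents.size());
   assertEquals("First document", documents.get(0).getTitle());
}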

I think this point is something I may have to accept in my case, as the interface between the DAO and the service WILL change, i.e. the DAO method signature goes from

Code:
DataResultSet findAllDocuments()

to

Code:
List<Document> findAllDocuments()

I think this point is further highlighted by this answer:

Don't write strict unit tests where you mock all the dependencies. Some people will tell you these aren't real unit tests. Ignore them. These tests are useful, and that's what matters.

Let's look at your example:
Code:
public class MyDocumentService {
   ...
   public List<Document> findAllDocuments() {
      DataResultSet rs = documentDAO.findAllDocuments();
      List<Document> documents = new ArrayList<>();
      for (DataObject row : rs.getRows()) {
         // get row data, create a new Document and
         // add it to the documents list
      }

      return documents;
   }
}

Your test probably looks something like this:
Code:
DocumentDao documentDao = Mock.create(DocumentDao.class);
Mock.when(documentDao.findAllDocuments())
    .thenReturn(DataResultSet.create(...));
assertEquals(..., new MyDocumentService(documentDao).findAllDocuments());

Instead of mocking DocumentDao, mock its dependencies:

Code:
DocumentDao documentDao = new DocumentDao(db);
Mock.when(db...)
    .thenReturn(...);
assertEquals(..., new MyDocumentService(documentDao).findAllDocuments());

Now, you can move logic from MyDocumentService into DocumentDao without the tests breaking. The tests will show that the functionality is the same (so far as you've tested it).

This assumed to some degree that the interface to the DAO remained the same.
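For what it's worth, with Mockito that style of test would look roughly like this (a sketch; the Database type, the query call and dataResultSetWithTwoRows() are made up to stand in for whatever the DAO really depends on):

Code:
// mock the DAO's dependency (the thing that talks to the database),
// not the DAO itself, so logic is free to move between service and DAO
Database db = Mockito.mock(Database.class);
Mockito.when(db.query("SELECT * FROM documents"))
       .thenReturn(dataResultSetWithTwoRows());

DocumentDAO documentDAO = new DocumentDAO(db);
MyDocumentService service = new MyDocumentService(documentDAO);

// the assertion only cares about the end result
assertEquals(2, service.findAllDocuments().size());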

Then there's this answer:

I suggest - if you haven't already - reading both Working Effectively With Legacy Code and Refactoring: Improving the Design of Existing Code.

The problem, as it appears to me, is that when I do refactor, those tests will break, because I'm changing where certain logic is done and the tests will have been written with the previous structure in mind (mocked dependencies etc.)

I don't necessarily see this as a problem: Write the tests, change the structure of your code, and then adjust the test structure also. This will give you direct feedback whether your new structure is actually better than the old one, because if it is, the adjusted tests will be easier to write (and thus changing the tests should be relatively straightforward, lowering the risk of having a newly introduced bug pass the tests).

Also, as others have already written: Don't write too detailed tests (at least not at the beginning). Try to stay at a high level of abstraction (thus your tests will probably be better characterized as regression or even integration tests).

At the moment I'm tempted to adopt this approach, i.e. refactor/redesign the tests along with the code itself (seeing as the interface between the components is not going to remain the same).

And then there's a comment which goes even further:

Don't, write integration tests. The "refactoring" you are planning is above the level of unit testing. Only unit test the new classes (or the old ones once you know you're keeping them).

At first I read that as "Don't write integration tests..... Only unit test the new classes". Now I realise this was actually "Don't write unit tests, write integration tests, only write unit tests for new classes"

But again, because the interface between the layers I would be integration-testing is changing, I come back to one of the first points I highlighted:

If your interface to a component remains the same, then you can write a test around this and assert the same conditions for both implementations (as you create the new implementation). If it doesn't, then you have to accept that a test for a redundant component is a redundant test.

Thoughts?

NB: I gave the example of a service layer interacting with a DAO and the service layer dealing with the DataResultSet. In reality it's worse than that, as it's a JSF bean (i.e. View/Controller code) that has data access type code mixed in with it.
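To give a flavour of what I mean (names made up, but this is the shape of it):

Code:
// JSF managed bean backing the view, with the data access inlined in it -
// this is the sort of code that wants pushing down into a DAO
@ManagedBean
public class DocumentListBean {
   public List<Document> getDocuments() {
      DataResultSet rs = runQueryDirectly(...); // query built and executed in the view layer
      List<Document> documents = new ArrayList<>();
      for (DataObject row : rs.getRows()) {
         // map row data straight into objects the page binds to
      }
      return documents;
   }
}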
 
Soldato | Joined 23 Feb 2009 | Posts 4,978 | Location: South Wirral
I can't add anything better than what you already have from Stack Exchange, but is "throw it all away and start with today's requirements" an option? Legacy systems usually have all sorts of redundant cruft in them; sometimes it's cheaper to just start again. It also provides the opportunity to add in new requirements that nobody was brave enough to try and do on the legacy code.
 
Associate | Joined 14 May 2010 | Posts 1,136 | Location: Somerset
Personally, I wouldn't change the existing interface. You want to make sure the refactored code is going to work in place of what you've currently got. Changing the interface is going to result in you having to change all references to this method, which could be a pain (and risky). It also violates the SOLID Open/Closed principle - https://en.wikipedia.org/wiki/Open/closed_principle

Instead, take a look at using the Decorator pattern - https://en.wikipedia.org/wiki/Decorator_pattern

This will allow you to retain the original interface for compatibility reasons, but wrap it with a new interface for all future code. You can then also mark the original method as deprecated (@Deprecated here, since the code in the thread is Java; the C# equivalent would be an [Obsolete] attribute) to dissuade other developers from using the original one in the future.
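Roughly what that wrapping might look like (a sketch; the interface and wrapper names are illustrative):

Code:
// new interface for future code (and the tests) to depend on
public interface DocumentRepository {
   List<Document> findAllDocuments();
}

// wrapper that leaves the legacy DAO untouched and adapts its result type
public class LegacyDocumentDAOWrapper implements DocumentRepository {
   private final DocumentDAO legacyDao;

   public LegacyDocumentDAOWrapper(DocumentDAO legacyDao) {
      this.legacyDao = legacyDao;
   }

   @Override
   public List<Document> findAllDocuments() {
      DataResultSet rs = legacyDao.findAllDocuments(); // the existing DataResultSet-returning method
      List<Document> documents = new ArrayList<>();
      for (DataObject row : rs.getRows()) {
         // map row data to a Document and add it to the list
      }
      return documents;
   }
}

Tests can then be written against DocumentRepository, and the legacy DocumentDAO method can be marked @Deprecated once callers have moved over.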
 
Associate | Joined 15 Oct 2009 | Posts 579
Personally, I wouldn't change the existing interface. ... This will allow you to retain the original interface for compatibility reasons, but wrap it with a new interface for all future code.

This.

Wrapping up the legacy code and exposing a new interface would allow you to write tests for that new interface and then change the underlying legacy code later.
 