Thursday, July 17, 2014
Unit testing Knockout applications
In an ideal case, any View Model in a Knockout-based application should be completely unit-testable. The View Model of course interacts with other code, but in the majority of cases this is either UI code or server-side code, typically reached over a REST API. The UI interaction should be minimal; where possible, the binding capabilities of Knockout should be leveraged. The REST API is not available while unit testing and thus has to be mocked or hidden behind an abstraction layer. I went for the first option, and this post describes how to mock the AJAX calls while unit testing Knockout View Models. At the end I also provide information about the ChutzPah test runner and the way the tests can be run from within Visual Studio.
The typical view model that I am using looks like the following one.
function AssetViewModel() {
    var self = this;
    self.city = ko.observable();
    self.country = ko.observable();
    self.size = ko.observable();
    self.isBusy = ko.observable(false);
    self.message = ko.observable();
    self.edit = ko.observable(false);
    self.load = function () {
        $.ajax("/api/assets/" + self.city(), {
            type: "GET", contentType: "application/json",
            success: function (result) {
                self.updateData(result);
            }
        });
    };
    self.save = function () {
        var dto = self.toDto();
        self.isBusy(true);
        self.message("Saving...");
        $.ajax("/api/assets/", {
            data: dto,
            type: "POST", contentType: "application/json",
            success: function (result) {
                self.edit(false);
                self.isBusy(false);
                self.message(result.message);
                self.updateData(result.data);
            }
        });
    };
    self.updateData = function (updateData) {
        self.city(updateData.City);
        self.country(updateData.Country);
        self.size(updateData.Size);
    };
    self.toDto = function () {
        var model = {};
        model.City = self.city();
        model.Country = self.country();
        model.Size = self.size();
        return JSON.stringify(model);
    };
}
You might think that the toDto method is useless if one uses the Knockout Mapping plug-in; however, in many cases the view models get much more complex and can't be directly mapped to any kind of data transfer objects or domain objects. Other than that, nothing should be surprising here. The save method sends the DTO over the wire and then processes the response.
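To see what toDto produces without spinning up Knockout, here is a framework-free sketch. The observable function below is a hypothetical stand-in for ko.observable (a getter/setter function holding a value), not part of Knockout itself:

```javascript
// Stand-in for ko.observable: call with no arguments to read,
// with one argument to write.
function observable(initial) {
    var value = initial;
    return function (newValue) {
        if (arguments.length) { value = newValue; }
        return value;
    };
}

var city = observable("Prague");
var country = observable("Czech Republic");

// Same shape as the view model's toDto method: unwrap the
// observables into a plain object and serialize it.
function toDto() {
    var model = {};
    model.City = city();
    model.Country = country();
    return JSON.stringify(model);
}

console.log(toDto()); // {"City":"Prague","Country":"Czech Republic"}
```

The DTO is a plain snapshot of the observables' current values, which is exactly why it is easy to assert on in a unit test.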
The unit test
Nowadays one has a choice between multiple JavaScript testing frameworks, with QUnit, Jasmine and Mocha probably being the most common choices - I am staying with QUnit. Testing the updateData method with QUnit might look like this.
var vm;
function initTest() {
    vm = new AssetViewModel();
}
$(function () {
    QUnit.module("ViewModels/AssetViewModel", {
        setup: initTest
    });
    QUnit.test("updateData sets correctly the city", function () {
        var data = {
            City: "Prague",
            Country: "Czech Republic"
        };
        vm.updateData(data);
        equal(vm.city(), "Prague");
    });
});
The QUnit module function takes 2 parameters: a name and a configuration object. The configuration object can contain setup and tearDown methods; their usage and intent should be clear.
This test case is very simple for 2 reasons: it does not depend on any external resources and it executes synchronously.
QUnit has 3 assert methods which can be used in the tests:
- ok - takes a single argument which has to evaluate to true
- equal - compares two values
- deepEqual - recursively compares an object's properties
Asynchronous testing
Here is the test for the save method which calls the REST server interface.
function initTest() {
    vm = new AssetViewModel();
    $.mockjax({
        url: '/api/assets/Prague',
        type: 'GET',
        responseTime: 30,
        responseText: JSON.stringify({
            City: "Prague",
            Country: "Czech Republic",
            Size: 20
        })
    });
}
$(function () {
    QUnit.module("ViewModels/AssetViewModel", {
        setup: initTest
    });
    QUnit.asyncTest("testing the load method", function () {
        vm.city("Prague");
        vm.load();
        setTimeout(function () {
            start();
            equal(vm.size(), 20);
        }, 100);
    });
});
I am using the MockJax library to mock the results of the REST calls. The initTest method sets up the desired behavior of the REST service call; the asserts are executed after 100 ms of waiting time, comfortably after the mocked 30 ms response time. In this case the call is a GET and we define the response simply as JSON data. QUnit has a method for asynchronous tests called asyncTest.
Currently there is a small issue in MockJax regarding the way that incoming JSON values are handled. That might get fixed in future versions.
Mocking the server interface
Returning simple JSON data may be sufficient for some cases; for others, however, we would like to verify the integrity of the data sent to the server, just like when testing the save method.
var storedAssets = [];
function initTest() {
    vm = new AssetViewModel();
    $.mockjax({
        url: '/api/assets/',
        type: 'POST',
        responseTime: 30,
        response: function (data) {
            storedAssets.push(JSON.parse(data.data));
        }
    });
}
$(function () {
    QUnit.module("ViewModels/AssetViewModel", {
        setup: initTest
    });
    QUnit.asyncTest("save asset - check the update of the size", function () {
        vm.size(10);
        vm.save();
        setTimeout(function () {
            start();
            equal(storedAssets.length, 1);
            var storedAsset = storedAssets[0];
            equal(storedAsset.Size, vm.size());
        }, 100);
    });
});
In this case the save method passes the JSON data to the server side. The server is mocked by MockJax, which only adds the data to a dummy array that can then be used to verify the integrity of the data.
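The capture pattern itself can be illustrated without jQuery or MockJax. Here fakePost is a hypothetical stand-in for the mocked AJAX POST handler; the real test above does the same thing through $.mockjax:

```javascript
// The fake endpoint simply records whatever the client posts,
// so the test can inspect the captured payloads afterwards.
var storedAssets = [];

function fakePost(url, jsonBody) {
    // stands in for the mocked POST handler: parse and store the body
    storedAssets.push(JSON.parse(jsonBody));
}

// What the view model's save() would send over the wire:
fakePost("/api/assets/", JSON.stringify({ City: "Prague", Size: 10 }));

console.log(storedAssets.length);  // 1
console.log(storedAssets[0].Size); // 10
```

Because the captured value is re-parsed from the JSON string, the assertions exercise the full serialize/deserialize round trip, not just the in-memory object.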
Running Unit Tests in Visual Studio
There are several reasons for which I am using Visual Studio even for JavaScript projects:
- Usually the application has some backend written in .NET and I don't want to use 2 IDEs for one single application.
- I can easily debug JS applications from within VS. Of course Chrome's debugger is very useful as well - but if I can do everything from 1 IDE, why should I use another?
- ReSharper has really good static analysis of JavaScript and HTML files. That saves me a lot of time - typos, unknown references and other issues are caught before I run the application.
- I can run JavaScript unit tests right from the IDE.
To run the unit tests I am using the ChutzPah test runner. ChutzPah internally uses the PhantomJS headless browser and interprets the tests. While using this framework, one does not need the QUnit wrapper HTML page and the unit tests can be run as they are.
Note that ChutzPah already contains QUnit and you will obtain a TimeOutException if you try to add a reference to QUnit explicitly (http://chutzpah.codeplex.com/workitem/72).
Since your tests are just JavaScript files without the HTML wrapper page, ChutzPah needs to know which libraries your View Models reference in order to load them. This is handled using a configuration file, chutzpah.json, which has to be placed alongside the unit tests. The following is an example of the configuration file that I am using for my tests.
{
    "Framework": "qunit",
    "References": [
        { "Path": "../Scripts/jquery-2.1.0.js" },
        { "Path": "../Scripts/knockout-3.1.0.js" },
        { "Path": "../Scripts/jquery.mockjax.js" },
        { "Path": "../Scripts/tech", "Include": "*.js" },
        { "Path": "../ViewModels", "Include": "*.js" }
    ]
}
JSON DateTime serialization
This is more of a side note. Dates in JSON are serialized into ISO format. That is good; the problem is that if you deserialize an object which contains a date, the date comes out as a string. The reason, of course, is that since there is no type information, the deserializer does not know that a given property is a date - and keeps the value as a string. You can read more on date serialization in JSON here. Any time you are mocking a backend which handles dates you have to be aware of this fact. Remember the mock of the backend which inserts the object into a dummy array that I have used above:
function initTest() {
    $.mockjax({
        url: '/api/assets/',
        type: 'POST',
        responseTime: 30,
        response: function (data) {
            storedAssets.push(JSON.parse(data.data));
        }
    });
}
After the JSON.parse call the dates stay strings. If the View Model has a date property, you will have to convert it into a string before testing for equality.
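A quick illustration of the round-trip behavior:

```javascript
// JSON.stringify serializes a Date as an ISO 8601 string; JSON.parse
// has no type information, so the value stays a plain string.
var original = { city: "Prague", updated: new Date(Date.UTC(2014, 6, 17)) };
var parsed = JSON.parse(JSON.stringify(original));

console.log(typeof parsed.updated); // "string"
console.log(parsed.updated);        // "2014-07-17T00:00:00.000Z"

// In a test, compare against the ISO string form of the Date:
var matches = parsed.updated === original.updated.toISOString();
```

So an assertion like equal(storedAsset.Updated, vm.updated().toISOString()) works, while comparing against the Date object directly does not.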
Tuesday, October 2, 2012
Introduction to Fakes and migration from Moles
Code examples related to this post are available in this GitHub repository.
Fakes framework contains two constructs which can be used to isolate code:
- Stubs – should be used to implement interfaces and stub the behavior of public methods.
- Shims – allow mocking the behavior of ANY method, including static and private methods inside .NET assemblies.
When using Stubs, you provide a mocked implementation of any interface to the class or method which you are testing. This is done transparently before compilation and no changes are made to the code after the compilation. Shims, on the other hand, use a technique called IL weaving (injection of MSIL code at runtime). This way the code which should be executed is replaced at runtime by the code provided in the delegate.
The framework has caused some interesting discussions across the web. The Pluralsight blog has pointed out some negative points, Rich Czyzewski describes how noninvasive tests can be created while using Fakes, and finally David Adsit nicely summarizes the benefits and the possible usage of Fakes. From what has been said on the net, here is a simple summary of negative and positive points.
Pros
- Lightweight framework, since all the power of this framework is based only on generated code and delegates.
- Allows stubbing of any method (including private and static methods).
- No complicated syntax. Just set the expected behavior to the delegate and you are ready to go.
- Great tool when working with legacy code.
Cons
- The tests are based on generated code. That is to say, a phase of code generation is necessary to create the “stubs”.
- No mocking is built into the framework. There is no built-in way to test whether a method has been called. This can however be achieved by manually adding specific code inside the stubbed method.
Migrating from Moles to Fakes
First a small warning: a bug was apparently introduced by the Code Contracts team which causes a crash when building a solution which uses Fakes. You will need to install the latest version of Code Contracts. (If you do not know or use Code Contracts, you should not be impacted.)
If you have already used Moles before, you might be wondering how many code changes the migration will need. To give you a simple idea, I have migrated the code from my previous post about Moles in order to use Fakes. Two major steps have to be taken during the migration:
- Change the structure of the project and generate new stubs
- Rewrite the unit tests to use newly generated classes
The following images show the difference between the old and new project structure (also note that I was using VS 2010 with Moles and I am now using VS 2012 with Fakes).
The next step is the code changes.
Rewriting code using Shims
Here is a classical example of testing a method which depends on the DateTime.Now value. The first snippet is isolated using Moles and the second contains the same test using Fakes:
[TestMethod]
[HostType("Moles")]
public void GetMessage()
{
    MDateTime.NowGet = () =>
    {
        return new DateTime(1, 1, 1);
    };
    string result = Utils.GetMessage();
    Assert.AreEqual(result, "Happy New Year!");
}

[TestMethod]
public void GetMessage()
{
    using (ShimsContext.Create())
    {
        System.Fakes.ShimDateTime.NowGet = () =>
        {
            return new DateTime(1, 1, 1);
        };
        string result = Utils.GetMessage();
        Assert.AreEqual(result, "Happy New Year!");
    }
}
The main differences:
- Methods using Shims do not need the HostType annotation previously required by Moles.
- On the other hand, a ShimsContext has to be created and later disposed when the stubbing is no longer needed. The using directive provides a nice way to dispose of the context right after its usage and marks the code block in which the system has “stubbed” behavior.
- Only small changes are needed due to the different names of the generated classes.
Rewriting code which is using only Stubs
Here the situation is even easier. Besides the changes in the naming of the generated classes, no additional changes are needed to migrate the solution. The following snippet tests the “MakeTransfer” method, which takes two accounts as parameters.
The service class containing the method needs Operations and Accounts repositories to be specified in the constructor. The behavior of these repositories is stubbed. This might be typical business layer code of any CRUD application. First let’s see the example using Moles.
[TestMethod]
public void TestMakeTransfer()
{
    var operationsList = new List<Operation>();
    SIOperationRepository opRepository = new SIOperationRepository();
    opRepository.CreateOperationOperation = (x) =>
    {
        operationsList.Add(x);
    };
    SIAccountRepository acRepository = new SIAccountRepository();
    acRepository.UpdateAccountAccount = (x) =>
    {
        var acc1 = _accounts.SingleOrDefault(y => y.Id == x.Id);
        acc1.Operations = x.Operations;
    };
    AccountService service = new AccountService(acRepository, opRepository);
    service.MakeTransfer(_accounts[1], _accounts[0], 200);
    Assert.AreEqual(_accounts[1].Balance, 200);
    Assert.AreEqual(_accounts[0].Balance, 100);
    Assert.AreEqual(operationsList.Count, 2);
    Assert.AreEqual(_accounts[1].Operations.Count, 2);
    Assert.AreEqual(_accounts[0].Operations.Count, 3);
}
Note the way the repository methods are stubbed. Due to the fact that the stubs affect globally defined variables (the list of operations and the list of accounts), we can make assertions on these variables. This way we can achieve “mocking” and be sure that the CreateOperation method and the UpdateAccount method of the Operation and Account repositories have been executed. The operationsList variable in this example acts like a repository, and we can easily assert on it to see whether the values have been changed.
Let’s see the same example using Fakes:
[TestMethod]
public void TestMakeTransfer()
{
    var operationsList = new List<Operation>();
    StubIOperationRepository opRepository = new StubIOperationRepository();
    opRepository.CreateOperationOperation = (x) =>
    {
        operationsList.Add(x);
    };
    StubIAccountRepository acRepository = new StubIAccountRepository();
    acRepository.UpdateAccountAccount = (x) =>
    {
        var acc1 = _accounts.SingleOrDefault(y => y.Id == x.Id);
        acc1.Operations = x.Operations;
    };
    AccountService service = new AccountService(acRepository, opRepository);
    service.MakeTransfer(_accounts[1], _accounts[0], 200);
    //the asserts here....
}
You can see that the code is almost identical. The only difference is in the prefix given to the stubs (SIAccountRepository becomes StubIAccountRepository). I am almost wondering whether MS could not have just kept the old names; then we would just need to change the using directive…
Fakes & Pex
One of the advantages of Moles compared to other isolation frameworks was the fact that it was supported by Pex. When Pex explores the code, it enters deep into any isolation framework which is used. Since Moles is based purely on delegates, Pex is able to dive into the delegates and generate tests according to the content inside the delegates. When using another isolation framework, Pex will try to enter the isolation framework itself, and thus will not be able to generate valid tests.
So now, when Fakes are here as a replacement for Moles, the question is whether we will be able to use Pex with Fakes. Right now it is not possible. A Pex add-on for Visual Studio 11 does not (yet) exist and I have no idea whether it ever will.
I guess Pex & Moles were not that widely adopted by the community. On the other hand both were good tools and found their users. Personally I would be glad if MS continued the investment into Pex and automated unit testing, though I will not necessarily use it every day in my professional projects. On the other hand I would always consider it as an option when starting a new project.
Wednesday, May 2, 2012
Mocking the generic repository
A prerequisite for this post is knowledge of the repository pattern and its generic variant.
In the majority of my projects I am using the following generic repository class.
public interface IRepository
{
    T Load<T>(object id);
    T Get<T>(object id);
    IEnumerable<T> Find<T>(Expression<Func<T, bool>> matchingCriteria);
    IEnumerable<T> GetAll<T>();
    void Save<T>(T obj);
    void Update<T>(T obj);
    void Delete<T>(T obj);
    void Flush();
    int CountAll<T>();
    void Evict<T>(T obj);
    void Refresh<T>(T obj);
    void Clear();
    void SaveOrUpdate<T>(T obj);
}
Based on this technique, some people decide to implement concrete classes of this interface (e.g. CarRepository : IRepository). I am also using the generic variant (mostly with NHibernate). Now the question is: how to mock this generic repository? It can be a bit tricky. When you have one class for each repository which works with one concrete type, you can mock the repository quite easily. For example, a StudentRepository which handles entities of type Student might be backed by a list of students.
While when working with generic repository, it might be a bit harder. Here is how I have solved the problem:
public class MockedRepository : IRepository
{
    private readonly List<City> cities;
    private readonly List<Station> stations;
    private readonly List<InformationTip> tips;
    private readonly List<Country> countries;
    private readonly Dictionary<Type, object> dataDictionary;

    public MockedRepository()
    {
        cities = DeserializeList<City>("CityDto");
        stations = DeserializeList<Station>("StationDto");
        tips = DeserializeList<InformationTip>("InformationTipDto");
        countries = DeserializeList<Country>("CountryDto");
        dataDictionary = new Dictionary<Type, object>();
        dataDictionary.Add(typeof(City), cities);
        dataDictionary.Add(typeof(Station), stations);
        dataDictionary.Add(typeof(InformationTip), tips);
        dataDictionary.Add(typeof(Country), countries);
    }

    public T Get<T>(object id)
    {
        Type type = typeof(T);
        var data = dataDictionary[type];
        IEnumerable<T> list = (IEnumerable<T>)data;
        var idProperty = type.GetProperty("Id");
        return list.FirstOrDefault(x => (int)idProperty.GetValue(x, null) == (int)id);
    }

    public IEnumerable<T> Find<T>(Expression<Func<T, bool>> matchingCriteria)
    {
        Type type = typeof(T);
        var data = dataDictionary[type];
        IEnumerable<T> list = (IEnumerable<T>)data;
        var matchFunction = matchingCriteria.Compile();
        return list.Where(matchFunction);
    }

    public IEnumerable<T> GetAll<T>()
    {
        Type type = typeof(T);
        return (IEnumerable<T>)dataDictionary[type];
    }

    public void Save<T>(T obj)
    {
        Type type = typeof(T);
        List<T> data = (List<T>)dataDictionary[type];
        data.Add(obj);
    }
}
The main building block of this mocked repository is the dictionary which contains for each type in the repository the enumerable collection of objects. Each method in the mocked repository can use this dictionary to determine which is the collection addressed by the call (by using the generic type T.).
Type type = typeof(T);
var data = dataDictionary[type];
IEnumerable<T> list = (IEnumerable<T>)data;
What to do next depends on each method. I have shown here only the methods which I needed to mock, but the other ones should not be harder to mock. The most interesting is the Find method, which takes the matching criteria as a parameter. In order to pass this criteria to the Where method on the collection, the criteria (represented by an Expression) has to be compiled into a predicate Func<T, bool>.
The Get method also has some hidden complexity. In this implementation I assume that there is an Id property defined on the object of type T. I am using reflection to obtain the value of that property, and the whole thing happens inside a LINQ statement.
This repository might be useful, but it is definitely not the only way to isolate your database. So the question is: should this be the method to isolate my unit or integration tests? Let's take a look at other possible options:
- Use a mocking framework (there is quite a choice here). This essentially means that in each of your tests you define the behaviour of the repository class. This requires you to write a mock for each repository method that is called inside the service method, so it means more code to write. On the other hand you control the behaviour needed for the particular tested method. While using a mocking framework you also have the option to verify that methods have been called.
- Use the repository implementation and point it to an in-memory database (SQLite). That is a good option when:
- You are able to populate the database with the data.
- You are sure of your repository implementation.
- Use the generic repository mock presented here. That is not a bad option if you have some way to populate the collections which serve as the in-memory database. I have used deserialization from JSON. Another option could be to use a framework such as AutoPoco to generate the data. You can also create one repository which can be used for the whole test suite (or application presentation).
Summary
As said before, this might be a variant to consider. I am using it for proofs of concept and portable versions of database-based applications. On the other hand, for unit tests you might consider either a mocking framework or an in-memory database. There is no clear winner in this comparison.
Tuesday, April 3, 2012
Mock Java Calendar - JMockit vs Mockito
Calendar c = Calendar.getInstance();
int day = c.get(Calendar.DAY_OF_WEEK);
Now imagine this code hidden somewhere inside a business method whose behaviour depends on the current day. A typical example is a method which returns the schedule of a cinema for the current day.
public class ScheduleService {
    public Schedule getTodaySchedule() {
        Calendar c = Calendar.getInstance();
        int day = c.get(Calendar.DAY_OF_WEEK);
        // get it from DB or wherever you want
        return lookupAccordingToDay(day);
    }
}
In order to test this method you have to mock the Calendar. You will want to verify that for Monday the service returns the schedule for Monday. However, since the test will be run automatically every day, not only on Monday, you would obtain different schedules and the assert would fail. There are several solutions to this; I have 3 in mind.
Solution 1: create a separate service
The first solution has nothing to do with mocking. The way to go here is to isolate the Calendar into a separate service (let's call it CurrentDayService). Then you can manually create a mock for this service. You will also have to change the body of ScheduleService to use this CurrentDayService.
public interface ICurrentDayService {
    int getCurrentDay();
}

public class CurrentDayService implements ICurrentDayService {
    public int getCurrentDay() {
        Calendar c = Calendar.getInstance();
        return c.get(Calendar.DAY_OF_WEEK);
    }
}

public class CurrentDayServiceMock implements ICurrentDayService {
    private int dayToReturn;

    public CurrentDayServiceMock(int dayToReturn) {
        this.dayToReturn = dayToReturn;
    }

    public int getCurrentDay() {
        return dayToReturn;
    }
}

public class ScheduleService {
    // @Autowired or inject this service
    private ICurrentDayService dayService;

    public Schedule getTodaySchedule() {
        int day = dayService.getCurrentDay();
        // get it from DB or wherever you want
        return lookupAccordingToDay(day);
    }
}
Now in the unit test your schedule service can use the mock instead of the real implementation. If you are using Dependency Injection, you can define a different context for unit tests. If not, you will have to wire it manually.
Solution 2: use Mockito
Mockito allows you to mock the real Calendar class. That means you no longer need to wrap the Calendar in some CurrentDayService class just to be able to mock the behavior. However, you will still have to add a mechanism to pass the mocked Calendar to your service. That is not complicated. Have a look at the following definition of the ScheduleService and the unit test which comes with it.
public class ScheduleService {
    private Calendar calendar;

    public ScheduleService() {
        calendar = Calendar.getInstance();
    }

    public Schedule getTodaySchedule() {
        int day = calendar.get(Calendar.DAY_OF_WEEK);
        return lookupAccordingToDay(day);
    }

    public void setCalendar(Calendar c) {
        calendar = c;
    }
}
@Test
public void testGetTodaySchedule() {
    Calendar c = Mockito.mock(Calendar.class);
    Mockito.when(c.get(Calendar.DAY_OF_WEEK)).thenReturn(2);
    ScheduleService sService = new ScheduleService();
    // there has to be a way to set the current calendar
    sService.setCalendar(c);
    Schedule schedule = sService.getTodaySchedule();
    // assert your schedule values
}
To sum it up: if the setCalendar method is not called, the Calendar is instantiated in the constructor, so in production it will return the current day. In your unit test you can easily mock it to specify different behavior. The drawback: if someone accidentally calls the setCalendar method in production, you will get into trouble.
Solution 3: use JMockit, mock all the calendars in you JVM
JMockit is a strong framework which, like some other mocking frameworks, uses the Java Instrumentation API. The code that you want to execute in your mocks is injected as bytecode at runtime. This enables JMockit to, for instance, mock all the instances of the Calendar class in your JVM. Here is how you can achieve this:
@MockClass(realClass = Calendar.class)
public static class CalendarMock {
    private int hour;
    private int day;
    private int minute;

    public CalendarMock(int day, int hour, int minute) {
        this.hour = hour;
        this.day = day;
        this.minute = minute;
    }

    @Mock
    public int get(int id) {
        if (id == Calendar.HOUR_OF_DAY) {
            return hour;
        }
        if (id == Calendar.DAY_OF_WEEK) {
            return day;
        }
        if (id == Calendar.MINUTE) {
            return minute;
        }
        return -1;
    }
}
The previous code snippet is the infrastructure which I can use to mock the Calendar's get method. A utility class CalendarMock has to be created which specifies the methods to be mocked. The realClass attribute of the MockClass annotation specifies which class is mocked by the defined class. The unit test is now simplified: there is no need to specify the Calendar which should be used by the ScheduleService.
@Test
public void testGetTodaySchedule() {
    Mockit.setUpMocks(new CalendarMock(Calendar.MONDAY, 12, 20));
    ScheduleService sService = new ScheduleService();
    Schedule schedule = sService.getTodaySchedule();
    // assert your schedule values
}

@After
public void destroyMock() {
    Mockit.tearDownMocks();
}
At the end, you have to remember to switch off the mocking of the Calendar. If you don't, the Calendar will stay mocked in all the tests executed after this one; hence the call to the tearDownMocks() method.
Summary
With Mockito you can mock the real Calendar; however, you have to pass the instance of the mocked calendar to the class which actually uses it. With JMockit you are able to tell the JVM: "from now on all my mocks behave like this...". For me this simplifies the situation, since I am not forced to create a setter for a Calendar to be passed to my service class. But it would take much more time and effort to compare the two frameworks thoroughly. It might be that Mockito handles some situations better than JMockit.
Saturday, February 4, 2012
Choosing technologies for .NET project
Here is the structure of this blog, according to which the technologies are grouped.
- DataAccess - ORM, data generation
- Platform - Dependency Injection, Aspect Oriented Programming
- Integration - SOAP/REST, messaging, distributed objects...
- Testing - Unit testing and Mocking, Parametrized testing, Functional testing
- Presentation layer
- Security
- Logging
Typical application
Our application was a classical 3-tier application with database, business and presentation layers.
Data was stored in SQL Server 2008. The data access layer was implemented using the Repository pattern and an ORM. Dependency Injection and Aspect Oriented Programming were used to put the application pieces together. Services were exposed using WCF to two types of client applications: mobile and web.
So the technologies presented here are the ones mostly used in this scenario. However, as said before, I would like to update the post any time I come across another technology, and that might be while working on different architectures.
Data Access
The most important part of the data access layer is the framework used for Object Relational Mapping (ORM). There are currently two major ORM frameworks in .NET: NHibernate and Entity Framework. Both provide similar ORM functionality (code-only approach, lazy loading, use of POCOs as persistence classes). Entity Framework 4.0 has brought a lot of improvements to its previous version (named EF 1.0), which did not provide the above-mentioned functionality, and it is now comparable to NHibernate. Crucial for an ORM framework in the .NET environment is the integration of LINQ (Language Integrated Query). Entity Framework was the first to offer this functionality, but the implementation in NHibernate followed shortly after.
NHibernate still has several advantages, among these its better ability to process batch treatment and also the fact that, as an open source product, it can be customized. On the other hand, Entity Framework provides better tools integrated into Visual Studio.
One last thing which can justify the choice of NHibernate is the possibility of using FluentNHibernate.
FluentNHibernate
NHibernate uses its XML-based HBM format to define the mappings between entities and POCOs. While the separation of code and configuration in XML can be seen as a nice approach, it gets complicated once the XML configuration files grow larger and once we introduce changes into the POCOs. The XML is not checked at compilation, so potential errors can be detected at run-time only and are generally hard to localize.
FluentNHibernate allows us to define the mappings in strongly-typed C#, which practically eliminates these issues. If there is an error in the configuration, it will most likely be discovered during compilation. Currently FluentNHibernate provides almost full compatibility with HBM files, which means that what can be defined in HBM can also be defined fluently.
Data Generation
AutoPoco is a simple framework which allows the generation of POCOs (Plain Old CLR Objects) with meaningful values. When building an enterprise application we often need to generate initial data for the database. This can of course be done using SQL scripts or in the imperative language which we are using, but that consists of lots of repetitive code and for loops in order to create a sufficient amount of data. AutoPoco provides an easy way to generate the starting data. It also provides several built-in sources for common properties which are stored in databases, such as phone numbers, birth dates, names and credit card numbers.
Platform
There are two design patterns (or approaches) which are very often present among the several layers of enterprise applications: Dependency Injection and Aspect Oriented Programming.
Dependency Injection is used to assemble complex systems from existing blocks. There are several Dependency Injection containers available for the .NET framework: Spring.NET, Castle Windsor, StructureMap, AutoFac, Ninject, Unity (by Microsoft), LinFu.
Aspect Oriented Programming allows developers to separate cross-cutting concerns from the applications blocks. This is usually done by injecting code into object's existing methods.
There are several ways to implement AOP, two of them being most common: proxy-based AOP and IL-weaving-based AOP.
Proxy-based AOP is easily achieved by wrapping the targeted object in a proxy class. Then it is easy to intercept the calls to the target object in the proxy class and call the code which should be injected. It just happens that Dependency Injection containers use proxy classes, and therefore most of them also offer AOP (Spring.NET, Castle Windsor).
IL weaving is an expression for the injection of IL code into the assembly after compilation.
There are two frameworks which provide AOP through IL weaving: PostSharp and LinFu. PostSharp has a commercial licence; however, at the time of writing this post (July 2011) there is also a 45-day free trial. LinFu is an open-source project under the LGPL licence which covers both IoC and AOP.
I used to choose Spring.NET because of its maturity, the fact that it is well documented, works great with NHibernate, and allows both AOP and Dependency Injection. One of the disadvantages of Spring.NET is the XML configuration, which as always can become too large to maintain. Other frameworks use C# as the language to configure AOP or Dependency Injection (PostSharp makes use of attributes, and frameworks such as Ninject or StructureMap use strongly typed classes to configure the dependency injection container).
I have however decided to use Ninject on my last project, as it seems to have a bit of momentum right now, and I will post the pros and cons here later.
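As a minimal sketch of what the strongly typed configuration looks like in Ninject (the IMessageSender interface and EmailSender class are made up for illustration), the bindings live in code instead of an XML file:

```csharp
using Ninject;

public interface IMessageSender
{
    void Send(string message);
}

public class EmailSender : IMessageSender
{
    public void Send(string message) { /* send an e-mail */ }
}

public class Program
{
    public static void Main()
    {
        // The kernel holds the type bindings instead of an XML configuration.
        var kernel = new StandardKernel();
        kernel.Bind<IMessageSender>().To<EmailSender>();

        // Ninject resolves the concrete type and its dependencies.
        var sender = kernel.Get<IMessageSender>();
        sender.Send("Hello");
    }
}
```

The same binding in Spring.NET would be an `<object>` entry in an XML file, which is exactly the maintenance burden mentioned above.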
Code Verification (Code Contracts)
Design by contract is a software design approach which implies that developers define clear interfaces for each software component, specifying its exact behavior. The interfaces are defined by contracts and extend the possibilities of code verification and validation.
The term was first used by Bertrand Meyer, who made it part of his Eiffel programming language.
Code Contracts is a language agnostic framework which enables the Design-by-Contract approach by allowing the programmer to define three types of conditions for each method:
Pre-condition - states what form the arguments of the method must have.
Post-condition - states what form the outputs of the method will have.
Invariants - conditions which will always hold true during the execution of the method.
These conditions can be later verified by two types of checks:
Static checking - is done at compile time. At this point the compiler does not know what values will be passed as arguments to the methods, but from the execution tree it can determine which method calls might potentially be invoked with non-compliant parameters.
Runtime checking - the code contracts are compiled as conditions directly into the .NET byte-code. This spares the programmer from writing the checking conditions manually inside the method bodies.
Note that Code Contracts are not a language feature. They are composed of a class library and the checking tools, which are available as plug-ins for Visual Studio.
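A minimal sketch of the three condition types, using the System.Diagnostics.Contracts class library (the Account class here is a made-up example; the conditions only take effect when the contracts rewriter is enabled in the project):

```csharp
using System.Diagnostics.Contracts;

public class Account
{
    private decimal _balance;

    [ContractInvariantMethod]
    private void ObjectInvariant()
    {
        // Invariant: the balance may never become negative.
        Contract.Invariant(_balance >= 0);
    }

    public decimal Deposit(decimal amount)
    {
        // Pre-condition: the deposited amount must be positive.
        Contract.Requires(amount > 0);
        // Post-condition: the returned balance includes the deposit.
        Contract.Ensures(Contract.Result<decimal>() ==
                         Contract.OldValue(_balance) + amount);

        _balance += amount;
        return _balance;
    }
}
```

The static checker can then warn at compile time about call sites which might pass a non-positive amount, while runtime checking turns each condition into an ordinary check inside the compiled method.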
Integration
Distributed applications need a way of communication between their components. Remote Procedure Call (RPC) was the first technology used in distributed systems, back in the 70's. The choice here surely depends on the architecture of the application (client-server, publish-subscribe, ESB, and more...).
WCF
A flexible platform which provides an abstraction of the transport layer configuration (security, transport format, message patterns).
WCF options and choices:
Transport protocol: WCF can use HTTP, TCP or MSMQ.
Transport format: XML, JSON or binary.
One service can expose several endpoints (URIs). Each endpoint can be configured to use a different binding. Bindings can have different transport protocol and format options. The same service can thus be exposed using different protocols and formats. In our application we can take advantage of this and expose different endpoints for different clients.
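As a sketch of this multi-endpoint idea (the IAssetService contract and the addresses are made up for illustration), the same contract can be hosted once over HTTP and once over TCP:

```csharp
using System;
using System.ServiceModel;

[ServiceContract]
public interface IAssetService
{
    [OperationContract]
    string GetAsset(int id);
}

public class AssetService : IAssetService
{
    public string GetAsset(int id) { return "asset " + id; }
}

public class AssetHost
{
    public static void Main()
    {
        var host = new ServiceHost(typeof(AssetService));

        // Same contract exposed twice: XML over HTTP for web clients...
        host.AddServiceEndpoint(typeof(IAssetService),
            new BasicHttpBinding(), "http://localhost:8080/assets");

        // ...and binary over TCP for rich .NET clients.
        host.AddServiceEndpoint(typeof(IAssetService),
            new NetTcpBinding(), "net.tcp://localhost:8081/assets");

        host.Open();
        Console.ReadLine();
        host.Close();
    }
}
```

In practice the endpoints would more likely be declared in the configuration file, but the principle is the same: one service implementation, several bindings.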
Testing
Several types of tests can be used to confirm the correct behavior of the application: unit tests, integration tests, smoke tests, functional tests (or acceptance tests).
Unit Testing
Mocking frameworks
When it comes to isolating the unit tests there are several mocking frameworks available: NMock, EasyMock, Moq, JustMock (commercial), TypeMock (commercial), RhinoMocks, NSubstitute, FakeItEasy and Moles.
In our application we have decided on RhinoMocks and Moles. Moles is used in connection with Pex - a test generation framework which will be described later.
Most of the mocking frameworks provide more or less the same functionality, thus the decision is quite complicated. RhinoMocks has the following characteristics:
- Free and Open Source
- Easy to use
- Active community
- Compatible with Silverlight (existing port to Silverlight)
The current version is 3.6; version 4, which should break backwards compatibility, is in development, but unless I have missed something there are so far no releases.
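As a small sketch of the Rhino Mocks arrange-act-assert style (the IMailSender interface is a made-up example):

```csharp
using Rhino.Mocks;

public interface IMailSender
{
    bool Send(string to, string body);
}

public class RhinoMocksExample
{
    public static void Run()
    {
        // Generate a mock implementation of the interface.
        var sender = MockRepository.GenerateMock<IMailSender>();
        sender.Stub(s => s.Send("john@example.com", "Hi")).Return(true);

        // Exercise the code which depends on IMailSender...
        bool ok = sender.Send("john@example.com", "Hi");

        // ...and verify the interaction afterwards.
        sender.AssertWasCalled(s => s.Send("john@example.com", "Hi"));
    }
}
```

This verification step (AssertWasCalled) is exactly what Moles stubs lack, as discussed in the Pex & Moles post below.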
Pex & Moles - Parametrized Unit Testing
Pex & Moles are used in order to build unit tests for the back-end part. Pex is a tool which helps generate inputs for unit tests, while Moles enables the isolation of tested code. In order for Pex to generate the inputs, the test cases have to be parametrized.
Instead of writing concrete test cases, the test method is just a wrapper which takes the same arguments as the tested method, performs the necessary set-up and then passes the arguments to the tested method. Pex analyses the execution tree of the tested method, suggests the parameters which should be passed to it and builds concrete test cases.
The aim of Pex is to obtain maximal code coverage. In order to achieve that, it uses an algebraic solver (Microsoft's Z3) to determine the values of the variables used in the method which will lead to the execution of each branch. Then it varies the parameters to obtain these values.
Moles is a stubbing framework. It allows you to isolate the parts of the code which you want to test from the other layers. There are basically two reasons to use Moles:
Moles works great with Pex. Because Pex explores the execution tree of your code, it also tries to enter all the mocking frameworks which you might use. This can be problematic, since Pex will generate inputs which cause exceptions inside the mocking frameworks. By contrast, Moles generates simple stubs of classes containing delegates for each method, which are completely customizable and transparent.
Moles allows you to stub static classes, including those of the .NET framework which are usually problematic to mock (typically DateTime, File, etc.).
As the official web says: "Moles allows you to replace any .NET method by delegate". So before writing your unit test, you can ask Moles to generate the needed stubs for any assembly (yours or a third party's) and then use these "moles" in your tests.
Presentation Layer
The presentation layer is quite a large topic with several choices: ASP.NET, ASP.NET MVC + JavaScript, pure HTML5 + JavaScript with some JS frameworks (jQuery, KnockoutJS), Silverlight - and all of these technologies can be combined.
Silverlight
Here is a list of characteristics which can be seen as advantages:
- Intended for developing Rich Internet Applications.
- Supports separation of the view and the logic using the MVVM pattern.
- Possibility to use a declarative language (XAML) to design the user interface and an imperative language to define the application logic.
- Data visualization support using the open-source Silverlight Toolkit (charts, line series).
- Re-usability of code on .NET compliant platform.
- Possibility to access audio and video devices on client side.
And characteristics which can be seen as disadvantages:
- Plug-in based technology. Requires a plug-in to run inside the browser. The plug-in is not available for all possible combinations of platform and browser. This lowers the availability of the developed application and also brings higher hardware requirements.
- Standard web features such as navigation are missing.
- Limited testability. Silverlight cannot be tested with traditional functional testing frameworks such as Selenium. On the other hand, when the MVVM pattern is applied, the ViewModels can be tested as simple classes using traditional unit testing technologies.
HTML + JavaScript
- No plug-in needed, HTML 5 is supported on the majority of the current browsers.
- Naturally comes with web standard features: navigation, bookmarking.
- Developers have to handle the cross-browser compatibility issue.
- Compared to C#, JavaScript is a dynamic language, not compiled before execution. This may be seen as both an advantage and a disadvantage.
Knockout.js seems to me a great possibility to use the MVVM pattern with JavaScript; I will be checking it out and writing about it later.
Logging
Logging is an essential part of each application. The following frameworks are available in .NET:
- Log4Net - easily configurable framework
- Logging in the MS Enterprise Library
- NLog - version 2.0 released 7/2011, including a logging framework for Windows Phone 7 and Silverlight - seems very nice, but I have never tried it.
- The Objects Guy Logging Framework - lightweight logging framework
- .NET built-in tracing - an alternative approach using the System.Diagnostics namespace, which enables output of the standard Trace and Debug Write methods to an XML file.
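As a minimal log4net sketch (the configuration is reduced to the programmatic BasicConfigurator here; real projects would typically declare appenders in XML):

```csharp
using log4net;
using log4net.Config;

public class OrderProcessor
{
    // One static logger per class is the usual log4net convention.
    private static readonly ILog Log =
        LogManager.GetLogger(typeof(OrderProcessor));

    public static void Main()
    {
        // Route log events to the console with a default layout.
        BasicConfigurator.Configure();

        Log.Info("Processing started");
        try
        {
            // ... business logic ...
        }
        catch (System.Exception ex)
        {
            Log.Error("Processing failed", ex);
        }
    }
}
```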
Security
There is usually a need to handle user authentication in enterprise applications. When using ASP.NET I have found that the standard Forms Authentication usually satisfies my needs. To handle OpenID authentication, DotNetOpenAuth is an excellent choice.
Forms Authentication
The Forms Authentication scheme works by issuing a token to the user the first time that he authenticates. The user can be authenticated against a database or any other information source.
This token, in the form of a cookie, is added to the response which follows the authentication request. This way the cookie is added to the next request by the same client. Forms Authentication then takes care of revoking the cookie (after the demanded time) as well as of checking the cookie on each request.
Forms Authentication works automatically with browser based clients; when used from different clients, some additional work has to be done on the client in order to add the authentication cookie to each request.
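A sketch of a typical login action (the Membership.ValidateUser call stands in for whatever database or information-source check the application actually uses):

```csharp
using System.Web.Security;

public class LoginService
{
    public static bool Login(string userName, string password)
    {
        // Check the credentials against the Membership provider
        // (or any other information source).
        if (Membership.ValidateUser(userName, password))
        {
            // Issue the forms authentication cookie; "false" means the
            // cookie is not persisted across browser sessions.
            FormsAuthentication.SetAuthCookie(userName, false);
            return true;
        }
        return false;
    }
}
```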
DotNetOpenAuth
I have previously used this library for two tasks: integrating OpenID authentication and creating an OAuth provider.
Integration of OpenID works hand in hand with Forms Authentication. The DotNetOpenAuth library provides a means to authenticate the user against any OpenID provider. Once the user is authenticated, the authentication cookie can be generated using Forms Authentication.
Conclusion
When a new application is being developed, there are several decisions that have to be taken regarding the frameworks and technologies which might be used. This article does not give direct answers to these questions, but rather lists the possible frameworks which should be taken into account. New frameworks are being delivered by Microsoft and by the open source community, and it is hard to see which technologies will hold on and which will be forgotten. I hope this overview can help to make the right decision. Any suggestions are welcome.
Tuesday, April 26, 2011
Pex & Moles - Testing business layer
Code examples related to this post are available at this GitHub repository.
In this post I would like to cover two parts:
- Pex and Moles basics - just a quick overview, because this is covered by other blogs and by official documentation.
- Using Pex to test a business layer - I have been struggling to find a pattern for using Pex to generate unit tests for the business layers of my applications. The problem is that there are quite a lot of samples which explain the basic and advanced aspects of Pex, but there are not that many examples which would show you how to use Pex in real life (putting aside the ambiguous definition of what real life is :)).
Pex and Moles basics
Pex is a testing tool which helps you generate unit tests. Moles is a framework which enables you to isolate the parts being tested from the other application layers.
Pex basics
Pex is a tool which can help you generate inputs for your unit tests. To use Pex you have to write parametrized unit tests. Parametrized unit tests are simple tests which accept parameters, and Pex can help you generate these parameters. Let's take a look at a first example; here is a simple method which you would like to test:
public static string SomeDumbMethod(int i, int j)
{
if (i > j)
{
if (j == 12)
return "output1";
else
return "output2";
}
else
{
return "output3";
}
}
To test this method you should write at least three unit tests - in order to cover all the branches of the method, and thus all the possible outputs (that is not a generic rule). But instead of that we will write a unit test which accepts the possible inputs as parameters.
[PexClass(typeof(Utils))]
[TestClass]
public partial class UtilsTest
{
[PexMethod]
public string SomeDumbMethod(int i, int j)
{
string result = Utils.SomeDumbMethod(i, j);
return result;
}
}
I have decorated the method with the PexMethod attribute and the class with the PexClass attribute; this way Pex knows that this class is used to generate unit tests. So now, to ask Pex to generate the inputs, right-click on the body of the method and select Run Pex Explorations. Pex will generate 3 unit tests, which you can review in the Pex window.
How does Pex work
Pex uses static analysis of your code to determine which inputs will achieve the maximal coverage of the exposed method. Pex does not randomly pick values to use as inputs; instead, Pex uses an algebraic solver (the MS Research Z3 project) to determine what parameter values will satisfy the conditions leading to a not-yet explored branch of the code. The main strength of Pex is above all the ability to generate parameters which cover all the branches of the tested method.
Moles basics
Moles is a stubbing framework. It allows you to isolate the parts of the code which you want to test from the other layers. Several other stubbing or mocking frameworks (RhinoMocks, NMock) are out there, free or not, so the question is: what is the advantage of Moles? There are basically two reasons to use Moles:
- Moles works great with Pex. Because Pex explores the execution tree of your code, it also tries to enter all the mocking frameworks which you might use. This can be problematic, since Pex will generate inputs which cause exceptions inside the mocking frameworks. By contrast, Moles generates simple stubs of classes containing delegates for each method, which are completely customizable and transparent.
- Moles allows you to stub static classes, including those of the .NET framework which are usually problematic to mock (typically DateTime, File, etc.).
Instead of complicated descriptions, here is a simple method which checks the actual date and outputs a string based on it:
public static String GetMessage()
{
if (DateTime.Now.DayOfYear == 1)
{
return "Happy New Year!";
}
return "Just a normal day!";
}
Now to test this method we need to be able to set the output of the static DateTime.Now property. Moles will help us achieve this. You can see that in the following testing method I use MDateTime, which is a mole for the DateTime class; it allows me to set the NowGet delegate, which gets called when DateTime.Now is asked for. To be able to use MDateTime you have to add the moles assemblies by right-clicking the References in your project. After that you can write your method as follows:
[PexMethod]
public string GetMessage(bool newyear)
{
MDateTime.NowGet = () =>
{
if (newyear)
{
return new DateTime(1,1,1);
}
return new DateTime(2,2,2);
};
string result = Utils.GetMessage();
return result;
}
Note that here I am using Pex to play around a bit. I want to test both branches of my method. The only possibility which Pex has to influence the executed branch is by generating parameters, so I add a bool parameter to the test method, which I ask Pex to generate. Here is the result which I get. This was a particular case, but the approach should always be the same: when stubs are needed for a certain assembly, you can always generate them by right-clicking the reference and selecting Add moles assembly. Then you can use these stubs as any other classes in your test methods.
Use Pex to test business layer
So you are probably thinking that all that is nice, but it does not really serve in real projects? That is what I am sometimes thinking too, so here I would like to present an attempt to use Pex to test the business layer of a typical bank application. This application uses the Repository pattern. Simply put, service classes which provide the business methods (like MakeTransfer etc.) use repositories to access the database (or any other data source). In this example I introduce an AccountService class which depends on two repositories: AccountRepository and OperationRepository. Here are the definitions of the repositories:
public interface IOperationRepository
{
void CreateOperation(Operation o);
}
public interface IAccountRepository
{
void CreateAccount(Account account);
Account GetAccount(int id);
void UpdateAccount(Account account);
}
The actual implementations of these repositories are not important, since I want to test just the AccountService class which depends on these two repositories. To test just the AccountService class I will mock these repositories (but more about that later). Here is the AccountService class:
public class AccountService
{
private IAccountRepository _accountRepository;
private IOperationRepository _operationRepository;
public AccountService(IAccountRepository accountRepository, IOperationRepository operationRepository)
{
_accountRepository = accountRepository;
_operationRepository = operationRepository;
}
public void MakeTransfer(){ ... }
public IList<Operation> GetOperationsForAccount() {...}
public decimal ComputeInterest(Account account, double rate) { ... }
}
AccountService has three methods to test:
- MakeTransfer
- ComputeInterest
- GetOperationsForAccount
Let's start with MakeTransfer method.
public void MakeTransfer(Account creditAccount, Account debitAccount, decimal amount)
{
if (creditAccount == null)
{
throw new AccountServiceException("creditAccount null");
}
if (debitAccount == null)
{
throw new AccountServiceException("debitAccount null");
}
if (debitAccount.Balance < amount && debitAccount.AutorizeOverdraft == false)
{
throw new AccountServiceException("not enough money");
}
Operation creditOperation = new Operation() { Amount = amount, Direction = Direction.Credit};
Operation debitOperation = new Operation() { Amount = amount, Direction = Direction.Debit };
creditAccount.Operations.Add(creditOperation);
debitAccount.Operations.Add(debitOperation);
creditAccount.Balance += amount;
debitAccount.Balance -= amount;
_operationRepository.CreateOperation(creditOperation);
_operationRepository.CreateOperation(debitOperation);
_accountRepository.UpdateAccount(creditAccount);
_accountRepository.UpdateAccount(debitAccount);
}
This method calls the CreateOperation method of OperationRepository and the UpdateAccount method of AccountRepository. Neither of these two methods returns a value, so in your unit test you do not have to define the exact behavior of these methods; you can provide a simple stub generated by Moles to the constructor of the AccountService class. In the following example SIAccountRepository and SIOperationRepository are stubs generated by Moles.
[PexMethod, PexAllowedException("SimpleBank", "SimpleBank.AccountServiceException")]
public void MakeTransfer(Account creditAccount,Account debitAccount,decimal amount)
{
SIAccountRepository accountRepository = new SIAccountRepository();
SIOperationRepository operationRepository = new SIOperationRepository();
AccountService service = new AccountService(accountRepository, operationRepository);
service.MakeTransfer(creditAccount, debitAccount, amount);
}
Let's take a look at Pex's output after running the Pex test. That is not bad: Pex generated 6 unit tests for me, which I would normally have to write, and it also discovered an OverflowException which I did not cover in my code. What might be missing is the possibility to verify whether the Update/Create method of each of the repositories was called. In other words, we are limited by the fact that Moles can only generate stubs, which are not able to verify that a method was executed as mocks would. If we wish to check whether the methods were called, we have to implement this on our own.
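One simple way to implement this check ourselves is to let the stub's delegate record the call. A sketch, reusing the stubs from the test above (the UpdateAccountAccount delegate name follows the Moles convention of appending the parameter types, as with GetAccountInt32 below, so treat the exact name as an assumption):

```csharp
// Count how many times UpdateAccount is invoked during the transfer.
int updateCalls = 0;

SIAccountRepository accountRepository = new SIAccountRepository();
accountRepository.UpdateAccountAccount = account => { updateCalls++; };

SIOperationRepository operationRepository = new SIOperationRepository();
AccountService service = new AccountService(accountRepository, operationRepository);

service.MakeTransfer(creditAccount, debitAccount, 100);

// MakeTransfer updates both the credit and the debit account.
PexAssert.AreEqual(2, updateCalls);
```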
Now let's take a look at GetOperationsForAccount.
public List<Operation> GetOperationsForAccount(int accountID)
{
Account account = _accountRepository.GetAccount(accountID);
if (account == null)
{
return null;
}
if (account.Operations == null)
{
return null;
}
return account.Operations.ToList();
}
This method calls the GetAccount(int id) method of AccountRepository, then performs some null value checks and returns the result. So in order to test this method we will have to provide the behavior of the GetAccount method. In the following snippet of code I use the SIAccountRepository stub generated by Moles and I specify the value which should be returned after calling the GetAccount(int x) method.
[PexMethod]
public List<Operation> GetOperationsForAccount(int accountID)
{
List<Operation> operations1 = new List<Operation>();
operations1.Add(new Operation { Amount = 100, Direction = Domain.Direction.Credit });
operations1.Add(new Operation { Amount = 200, Direction = Domain.Direction.Debit });
List<Account> accounts = new List<Account>();
accounts.Add(new Account { Balance = 300, Operations = operations1, AutorizeOverdraft = true, Id = 1 });
accounts.Add(new Account { Balance = 0, Operations = null, AutorizeOverdraft = false, Id = 2 });
SIAccountRepository accountRepository = new SIAccountRepository();
accountRepository.GetAccountInt32 = (x) =>
{
return accounts.SingleOrDefault(a => a.Id == x);
};
SIOperationRepository operationRepository = new SIOperationRepository();
AccountService service = new AccountService(accountRepository, operationRepository);
List<Operation> result = service.GetOperationsForAccount(accountID);
return result;
}
At the beginning of the testing method I define a list with two accounts, one having several operations and the other with none. Then I set the delegate of the GetAccount method of the SIAccountRepository stub to search in the list by the account id. Now let's run Pex and see the result. Pex basically tried the two IDs of the accounts in the predefined list and also checked the null account. There is still a drawback, namely that I have to define my own list of accounts to stub the account repository; on the other hand I do it only once, and the way the stub of the GetAccount method is defined is quite straightforward: I only tell Pex to search in the list, and I do not have to specify exactly which ID will provide which account. The last method is ComputeInterest, which should compute the monthly interest on an annual basis (note that this is here just for demonstration).
public decimal ComputeInterest(Account account, double annualRate, int months)
{
if (account == null)
{
throw new AccountServiceException("Account is null");
}
double yearInterest = Math.Round((double)account.Balance * annualRate);
double monthInterest = yearInterest / 12;
return (decimal)(monthInterest * months);
}
This method takes the balance of the account, computes the annual interest and gives the value for one month (yes, it is a completely non-real-life method). Now let's take a look at the test for this method.
[PexMethod, PexAllowedException(typeof(AccountServiceException))]
public decimal ComputeInterest(Account account,double annualRate,int months)
{
PexAssume.Implies(account != null, () => account.Balance = 1000);
PexAssume.IsTrue(annualRate != 0);
PexAssume.IsTrue(months != 0);
SIAccountRepository accountRepository = new SIAccountRepository();
SIOperationRepository operationRepository = new SIOperationRepository();
AccountService service = new AccountService(accountRepository, operationRepository);
decimal result = service.ComputeInterest(account, annualRate, months);
return result;
}
Here we use PexAssume to shape the inputs of the unit tests. PexAssume is a static class which provides several methods to constrain the inputs. The most useful methods are IsTrue(cond), which shapes the inputs so that the condition will always be true, and Implies(cond, fact), which allows conditional clarification of the inputs. Pex always tries the simplest inputs, so right after trying a null account it will try an account with 0 balance. If we want Pex to provide an account with a different balance, then we have to use the PexAssume.Implies method. If we used just PexAssume.IsTrue(account.Balance == 1000), then we would get a null reference exception in the test for which Pex generates a null account. Now let's take a look at the result:
So here Pex generates only two cases - but that is exactly sufficient to cover all the code blocks. What is interesting is that we do not get the OverflowException case here, maybe because the multiplications result in double values and the later conversion to decimal does not throw an OverflowException.
Summary
Pex is a great tool when it comes to code coverage. It will exercise all the paths in your code to look for errors or exceptions. However, sometimes you will have to generate the data for your test by hand and provide it to Pex.
Moles is a great tool to provide stubs for static methods (and especially the static methods of the framework) which are normally hard to test. It also cooperates well with Pex, because it is completely transparent. For each of your abstract classes or interfaces, a stub is generated with delegates that you can redefine to fit your needs. If you tried to use another mock/stub framework, Pex would try to enter the scenes behind the framework, which might result in unexpected exceptions.
However, Moles lacks "mocking" functionality. You can substitute any method with a delegate, but there is no built-in function which would tell you whether the delegate was invoked. On the other hand, this functionality can be easily developed.
The provided description is my personal experience; I am still not sure if I should use Pex in my personal projects, and I am definitely not sure if I am using it the right way. From my point of view Pex is great for projects containing complex methods with several branches. Quite often the code that I have to write is rather straightforward, and because Pex generates the simplest values, it will often end up passing a single null value as a test parameter.
This post covers only a small fraction of Pex's capabilities and there is a lot more to learn. To start with, you can check PexFactories, which allow you to customize the generation of test inputs, the capabilities of PexAssert, or the cooperation of Pex and Code Contracts.
PS: If someone has another approach or some additional advice on how to use Pex, it would be great to share it; I have written this post partially because I would like to get some feedback on the subject.
Thursday, March 31, 2011
TFS and Unit Tests - QTAgent32 crash & "'data.coverage' because it is being used by another process"
When there is recursion in a unit test, QTAgent32 crashes (this is described here).
But if you commit to TFS and this commit launches the unit tests, then these will never finish (so if you do not check, you might just have builds running for a couple of days and new builds in the queue behind them).
However, when you cancel the build and run a new one, it will fail, saying:
'data.coverage' because it is being used by another process.
So there is some process which has locked the "data.coverage" file.
To solve this issue, log on to the TFS server and kill the VSPerfMon.exe process, which runs under NT AUTHORITY\NETWORK SERVICE (that should be the account which is running your builds & unit tests on TFS).