Another conference is coming. I will be speaking at ENASE (Evaluation of Novel Approaches to Software Engineering) in Angers, France. I’m coauthoring a paper about a new technique in software engineering that involves test-driven development and continuous testing. I will publish the paper on this blog after the conference. If you are interested in new approaches to software engineering, please take a look at http://www.enase.org/, and if you happen to be near Angers between the 4th and 6th of July – visit!
I’m happy to announce that I will be speaking about Continuous Integration in .NET at the .NET Developer Conference (DDC) in Nuremberg, Germany. The .NET Developer Conference is part of Developer Week (DWX), which combines three events: WDC (Web Developer Conference), MDC (Mobile Developer Conference) and DDC. It takes place from the 24th to the 27th of June 2013. My session is on Monday the 24th.
More information on DWX 2013, the program and the speakers is available at www.developer-week.de.
You are very welcome!
PS. I have a discount code for all my blog readers – if you want one please drop me a line!
Lately I had the following task to deal with. I had two assemblies, an older and a newer version of the same DLL, but no source code for either of them. I had to assess the differences between these two assemblies. “How to do it efficiently?” I asked myself. The easiest way, it seemed, was to open both in JustDecompile (or any other reverse engineering tool for .NET) and go through them. But while I was writing my “Continuous Integration in .NET” book, I was lucky enough to get a free license for the NDepend tool from Patrick Smacchia. NDepend is mainly a code analysis tool for .NET. It is able to check best practices using an SQL-like language called CQL (Code Query Language). In the latest version, v4, the queries can be written in LINQ! But I was interested in the assembly comparison functionality. Here is how I used it.
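To give a taste of the LINQ flavor, here is a minimal sketch of a CQLinq rule that flags overly long methods (the rule name and the 30-line threshold are arbitrary example values of mine, not from the original post; `JustMyCode.Methods` and `NbLinesOfCode` are standard NDepend vocabulary):

```
// <Name>Methods too big</Name>
warnif count > 0
from m in JustMyCode.Methods
where m.NbLinesOfCode > 30
select new { m, m.NbLinesOfCode }
```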
Start Visual NDepend and choose “Choose Assemblies or Analysis to Compare” from Compare menu.
Choose the DLLs you want to compare (as a demonstration I’ve used System.Web):
After running the analysis you will be presented with an HTML site containing the summary report. I found the Interactive UI Graph the most useful when comparing the functionality of the two DLLs. You can run it from this dialog:
The class browser contains the compared classes. The bold ones are those added in the newer version and the underlined ones are those that changed. What else does a lost developer need!
Here is the online version of my article that was published in a chapter of a book at Opole University of Technology (ISBN 978-83-63015-10-7).
Summary: The process of creating working software from source code and other components (like libraries, database files, etc.) is called a “software build”. Apart from linking and compiling, it can include other steps like automated testing, static code analysis, documentation generation, deployment and others. All these steps can be automated using a build description of some sort (e.g. a script). This article classifies automatic software build processes, beginning with the build script and reaching the various types of continuous integration.
Keywords: software build automation, continuous integration, commanded integration, scheduled software build, triggered software build
Kawalerowicz, M. (2012). Classification of Automatic Software Build Methods. Prace doktorantów / Články doktorandů (pp. 37–39). Opole, Poland: Opole University of Technology.
MTS, or Microsoft Technology Summit, is the biggest Microsoft technical conference in Poland. It is the place where a few thousand developers, IT specialists and business people meet to get the latest information from Microsoft and to talk about things that are relevant. This year I was invited to MTS to give a talk about continuous integration. Let’s meet at MTS 2012!
Let’s say you have the following task: your small ASP.NET (MVC) website needs to run on a computer that does not (necessarily) have IIS installed. How to make it possible?
You can do it using IIS Express. It is a small version of the IIS server that Microsoft ships for free at http://www.microsoft.com/download/en/details.aspx?id=1038. It is shipped with Visual Studio 2010 SP1 too (so there is a chance you already have it in C:\Program Files (x86)\IIS Express). They say it’s not xcopy-deployable (meaning you cannot run it without installation), but it kind of is. You simply have to install it on one machine, take out the files you need (or basically the whole folder), copy them to another machine and run it there.
Here is a quick recipe:
1. take your web application and copy it to the folder “web”,
2. take the whole inside of the folder C:\Program Files (x86)\IIS Express and copy it to the folder “iis”,
3. create a cmd file to run IIS Express with your app and place it on the same level as your “web” and “iis” folders, it should look like this:
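Based on the description that follows, a minimal sketch of such a cmd file could look like the lines below (`/path` and `/port` are the standard IIS Express command-line switches; the exact contents of the author’s original script are an assumption):

```
start iis\iisexpress.exe /path:%CD%\web /port:8181
start iexplore http://localhost:8181/
```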
It will start IIS Express (from the “iis” folder) and set the web application path to the “web” folder (%CD% is the current directory). IIS Express is started on port 8181 (in fact it does not matter which port it is started on, as long as the port is free). After that we start Internet Explorer and direct it to the website. On a slower machine it can take some time to start the server (and since we use the start command, the batch will not wait but will run the next command right away), so you may want to add a small delay (using, for example, ping -n 2 127.0.0.1 >nul for roughly a 1-second delay).
Such a package can be zipped and sent to someone to run a local copy of your application. Of course, it can be wrapped in an MSI package and automated to pick a free port and add an icon to the desktop or start menu.
The only prerequisite for IIS Express is .NET Framework 4, and it will run on Windows XP and up. It can run side by side with the “big” version of IIS.
By default IIS Express hosts only localhost websites. It is possible to configure it to serve over the wire (see http://stackoverflow.com/questions/5235826/using-iis-express-to-host-a-website-temporarily). It should be possible to host a standalone SQL Server version too, but that is a topic for another blog post!
using System.Threading;
using Microsoft.SPOT.Hardware;
using SecretLabs.NETMF.Hardware.Netduino;

OutputPort led = new OutputPort(Pins.ONBOARD_LED, false);

while (true)
{
    foreach (char t in "HELLO WORLD")
    {
        // Morse lookup packed as a binary tree in a string (Code Golf trick):
        // the character's index, halved repeatedly, walks the tree upwards
        for (int i =
            ",ETIANMSURWDKGOHVF,L,PJBXCYZQ,,54 ,3,,,2,,,,,,,16,,,,,,,7,,,8,90".IndexOf(t);
            i > 0; i /= 2)
        {
            led.Write(true);
            if ("-."[i-- % 2] == '.')
                Thread.Sleep(100);      // dot
            else
                Thread.Sleep(300);      // dash
            led.Write(false);
            Thread.Sleep(100);          // gap between symbols
        }
        if (t.Equals(' ')) Thread.Sleep(400);   // gap between words
        else Thread.Sleep(300);                 // gap between letters
    }
}
Yep, I’m blinking “hello” to the world with the Netduino onboard LED and Morse code (thanks to the Code Golf community for the Morse code translator).
Here is the Netduino in action:
Hardy Erlinger, the head of the .NET Developers Group München, has just confirmed my session in Munich, Germany. I will be speaking about Continuous Integration in .NET on January 17th. The meeting will take place, probably as usual, at the TESIS company, Baierbrunner Str. 15, 81379 München. Start: 18:00. Details will be available at the group website www.munichdot.net. Oh, and the talk will be in German!
I’ve recently got the sales report from Manning about my book “Continuous Integration in .NET”. I’m very happy to report that it sells quite well!
> Interesting you mention the Manning books on Dependency Injection and Continuous Integration… is there really a whole books-worth of stuff in each of those topics??
If you are in the .NET world then ‘Continuous Integration in .NET’ is really worth the time invested. It covers most of the commonly used tools (CC.NET, MSBuild & Team System, TeamCity). It goes over the integration of unit testing, code metrics, analysis tools, source control and the like (if I recall correctly there is a section on building installation packages and getting your DB-related changes under CI as well).

However, it lacks some obscure topics (e.g. I would really like to facilitate Hudson, Maven and Sonar, but I don’t recall even a word on these; also NAnt isn’t presented too well).

Even so, if you are just starting with CI I would give it a go. You could skip it if you already have some CI in house and just need to improve / extend what it offers. It’s always nice to have a look around, but I find hands-on experience much more important in this area.
I’ve recently dived deep into WCF and security. To be exact, I’ve tinkered a little with something that is called 2-Way-SSL. It’s quite a complex topic and I will try to summarize what I’ve learned.
1. Issuing certificates
System: Windows 7
To start, you will have to have a certificate. Essentially you have two options if you don’t have one yet:
a) getting one from a CA (Certification Authority),
b) issuing one for yourself.
While choosing which way to go, keep in mind what you need it for. The whole point of certificates is that they need to be trusted. For your development it’s sufficient that you trust yourself, but if you want to sign the SSL communication from your website or sign the emails you send, you will need something more. For some purposes a free certificate from an issuer like COMODO, CAcert or StartSSL is sufficient. They will only verify that you own the domain and/or email. If you need more thorough verification you will have to pay (from tens to thousands of dollars).
If you are a .NET developer sitting on Windows (sometimes mutually exclusive – cheers to Mono developers), you can use a tool called MakeCert. You will find it in the Visual Studio Command Prompt. To create a usable certificate you will have to act as a CA yourself. So as a first step you will have to issue a certificate for your CA. You can do it like this:
makecert -n "CN=MkCA" -r -sv MkCA.pvk MkCA.cer
Add a password for your private key (or choose to use none). This command will create a certificate (*.cer) and a file containing the private key (*.pvk). The certificate needs to be added to the “Trusted Root Certification Authorities” store. To do so, start the Microsoft Management Console (execute mmc.exe), go to File –> Add/Remove Snap-in… and choose Certificates. Press the Add > button, choose to manage the certificates for the Computer account on the Local computer, then press OK.
Navigate to Console root –> Certificates (Local computer) –> Trusted Root Certification Authorities and from the context menu choose All Tasks –> Import…
Find the *.cer file you created using MakeCert and add it to the certification store.
Congratulations, from now on you trust the certificates issued by the CA you’ve just created. Now you need a certificate combined with a private key to secure the communication on the server side. To do so you have to issue the following commands:
makecert -n "CN=localhost" -ic MkCA.cer -iv MkCA.pvk -sv MkServer.pvk MkServer.cer
pvk2pfx -pvk MkServer.pvk -spc MkServer.cer -pfx MkServer.pfx
The first one will create a private key file and a certificate for the “localhost” computer, issued by your MkCA certificate authority. The second one will create a *.pfx file that contains both the certificate and the private key (it can be protected by a symmetric key – a password – to assign one, use the -po switch of pvk2pfx). The *.pfx file is necessary to secure the communication. The server will send the certificate (containing, among other information, the server’s public key) to the client. The client will encrypt a random number using the server’s public key and send it back to the server. This number will become a symmetric key used by both parties to encrypt the communication.
2. Securing the IIS communication using SSL
Now that you have the *.pfx ready, you can secure an IIS web site. To do so, do the following:
1. Open IIS Manager (start InetMgr.exe or go to Control Panel –> Administrative Tools –> Internet Information Services (IIS) Manager)
2. Go to server node (root node with the name of your computer) and open “Server Certificates”
3. From Actions choose Import …
4. Pick the *.pfx file you’ve created and enter password (if you used one).
5. Create a new Web Site (or use the default one if you’d like to secure that) and from Actions choose Bindings…
6. Choose the binding type https and the newly imported SSL certificate.
Voilà, the website now supports secure communication.
Let’s talk a little about the client (browser) configuration. While in IIS Manager, with focus on the newly created web site, choose Browse *:443 (https) from the Actions pane (or navigate to the website by typing the URL in the browser address bar).
As you can see there is a problem with the certificate. It’s because the browser tried to verify the issuer of the server certificate and failed. It failed because the issuer is not trusted.
As you remember, you’ve added your certificate authority’s certificate to the computer store. But it’s not enough: the browser uses the Current User certificate store. So you will have to add the certificate authority once again, this time to the user store. Use mmc.exe again. Add the Certificates snap-in, but choose “My user account” this time. Then add the CA certificate as described in section 1.
Note: if the certificate keeps disappearing from the store (why? I don’t know – it did on my machine), please experiment with the storage. Choose Registry or Local Storage as shown below.
Restart the browser and the problem report will be gone. The communication is encrypted. The certificate is trusted.
3. Requiring client certificates

The term “2-Way-SSL” is sometimes used to describe the scenario where both the server and the client need to verify each other. It means that not only the server must present a certificate – the client needs to do so as well.
On IIS it can be achieved by setting the client certificate requirement in the “SSL Settings” of the web site or web application.
Set the Require SSL and choose to require client certificates.
From now on you will get “HTTP Error 403.7 – Forbidden” if you try to get the resource in the browser.
It’s because you don’t have the client certificate ready on the client side. Let’s fix it. We will need a *.pfx file once again. Let’s create one.
makecert -n "CN=marcin" -ic MkCA.cer -iv MkCA.pvk -sv MkClient.pvk MkClient.cer
pvk2pfx -pvk MkClient.pvk -spc MkClient.cer -pfx MkClient.pfx
With the *.pfx file ready, you should add it to Personal –> Certificates in the Local Computer store. Now the client will be able to present the client certificate and complete the 2-Way-SSL handshake.
4. WCF and 2-Way-SSL
It’s now time to glue the pieces together. Let’s configure a WCF service to use client certificates to communicate with the server.
I have used a simple service that takes a string as a parameter and returns it back to the client. I named the service EchoService and used wsHttpBinding to secure the communication on the transport level (it secures the whole communication, as opposed to securing only the messages if you use Message mode). The binding configuration looked like this:
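A minimal wsHttpBinding setup matching this description – transport security with certificate client credentials – could look roughly like this (the binding name TransportSecurity is a placeholder of mine, not from the original post):

```xml
<bindings>
  <wsHttpBinding>
    <binding name="TransportSecurity">
      <security mode="Transport">
        <transport clientCredentialType="Certificate" />
      </security>
    </binding>
  </wsHttpBinding>
</bindings>
```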
To keep things simple, I’ve turned the publication of metadata off. The services and behaviors sections looked like this:
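A sketch of such a service definition follows; only the EchoService name comes from the text – the namespace-qualified names, the behavior name and the TransportSecurity binding-configuration name are placeholders of mine:

```xml
<services>
  <service name="EchoService.EchoService" behaviorConfiguration="EchoServiceBehavior">
    <endpoint address=""
              binding="wsHttpBinding"
              bindingConfiguration="TransportSecurity"
              contract="EchoService.IEchoService" />
  </service>
</services>
<behaviors>
  <serviceBehaviors>
    <behavior name="EchoServiceBehavior">
      <!-- metadata publication turned off -->
      <serviceMetadata httpsGetEnabled="false" />
    </behavior>
  </serviceBehaviors>
</behaviors>
```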
On the client side of things, the configuration needs to be extended with a behaviors section like this one:
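A sketch of such a client behaviors section, pointing the client credentials at a certificate found by thumbprint in the CurrentUser store (the behavior name and the thumbprint value are placeholders of mine):

```xml
<behaviors>
  <endpointBehaviors>
    <behavior name="ClientCertificateBehavior">
      <clientCredentials>
        <!-- findValue below is a placeholder; paste your certificate's thumbprint -->
        <clientCertificate storeLocation="CurrentUser"
                           storeName="My"
                           x509FindType="FindByThumbprint"
                           findValue="0000000000000000000000000000000000000000" />
      </clientCredentials>
    </behavior>
  </endpointBehaviors>
</behaviors>
```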
It tells the client to use the client credentials taken from the CurrentUser certificate store and to search for the certificate using a given thumbprint (findValue). You can find the thumbprint in the certificate’s Details tab.
Note: if you get an “Invalid hexadecimal string format” error from System.IdentityModel, you will need to type the thumbprint into the config file by hand instead of copying and pasting it (the copied value can contain an invisible Unicode character). Oh, don’t ask why. Don’t forget to delete and re-type the quotation marks too.
The rest of the client configuration is easy: a wsHttpBinding like the one you used on the server, plus gathering everything together in the client configuration block.
You are done.
You can always configure the client in code, like this:
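A sketch of the equivalent imperative configuration (the EchoServiceClient proxy name, the Echo operation, the service URL and the thumbprint are placeholders of mine; the WCF and X509 API calls themselves are standard .NET Framework):

```csharp
using System;
using System.ServiceModel;
using System.Security.Cryptography.X509Certificates;

// Transport-level security with a certificate as client credential
var binding = new WSHttpBinding(SecurityMode.Transport);
binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Certificate;

// EchoServiceClient is the generated proxy class (name assumed)
var client = new EchoServiceClient(binding,
    new EndpointAddress("https://localhost/EchoService.svc"));

// Same lookup as in the config: CurrentUser store, find by thumbprint
client.ClientCredentials.ClientCertificate.SetCertificate(
    StoreLocation.CurrentUser,
    StoreName.My,
    X509FindType.FindByThumbprint,
    "0000000000000000000000000000000000000000"); // placeholder thumbprint

Console.WriteLine(client.Echo("Hello"));
```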
5. Great finale
Keep in mind that the whole process is a bit tricky. It’s easy to make a mistake, so be careful and check that:
1. the status of the certificate you use is “This certificate is OK.”,
If not, check whether the CA is among the trusted root certification authorities. Beware of disappearing certificates!
2. the “Issued To” field of the server certificate matches the name of the server you are using (localhost in my case),
3. if you are testing the SSL configuration using a browser other than IE, keep in mind that it may use its own certificate store and not the default Windows one (Firefox does, Chrome does not),
4. you can easily test the services with tools like soapUI. To set it up to use client certificates, go to Preferences and set it up like this:
I had to create a *.pfx file WITH password to make it work.
Then, if the server is configured for transport-level security, you can simply send an XML request to it to check that everything works fine.