Machine Learning 4 Continuous Defect Prediction

March 16, 2017 on 2:05 pm | In Article, Conference, Continuous Integration | No Comments

Defect prediction is a set of techniques used to identify likely buggy software changes (e.g. commits). Various measurements from previous changes are taken into consideration to predict whether a new change is likely to contain a bug or not. Commit messages or bug tracking system entries are usually examined to gather the measurements. Machine learning is often used to classify the changes as buggy or clean. We are now working on adding a continuous notion to defect prediction: on one side by building on top of the idea of continuous defect prediction in the IDE (Integrated Development Environment), and on the other side by improving the prediction using the unambiguous results of the continuous integration builds of the software project.
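
To give a rough feeling for the classification part (this is only an illustrative sketch, not the method from the paper; the features, weights and thresholds are all made up), a simple logistic scoring of a change could look like this in C#:

using System;

class ChangeClassifier
{
    // Measurements taken from a single change (commit); the feature set is hypothetical
    class Change
    {
        public double LinesAdded;
        public double FilesTouched;
        public double AuthorRecentBugRate; // share of the author's recent changes that turned out buggy
    }

    // In a real setup these weights would be learned from historical changes
    // labeled as buggy or clean, e.g. by the results of the CI builds
    static readonly double[] Weights = { 0.004, 0.08, 2.5 };
    const double Bias = -1.2;

    static double Sigmoid(double x) { return 1.0 / (1.0 + Math.Exp(-x)); }

    static bool IsLikelyBuggy(Change c)
    {
        double score = Bias
                       + Weights[0] * c.LinesAdded
                       + Weights[1] * c.FilesTouched
                       + Weights[2] * c.AuthorRecentBugRate;
        return Sigmoid(score) > 0.5; // probability of "buggy" above 50%
    }

    static void Main()
    {
        Change change = new Change { LinesAdded = 150, FilesTouched = 6, AuthorRecentBugRate = 0.3 };
        Console.WriteLine(IsLikelyBuggy(change) ? "likely buggy" : "likely clean");
    }
}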

The technique is described in a paper I co-authored, titled “Continuous Defect Prediction: The Idea and a Related Dataset”. I will present the paper at the 14th International Conference on Mining Software Repositories (MSR 2017) in Buenos Aires, Argentina, in May this year. MSR is co-located with the 39th International Conference on Software Engineering (ICSE), which I will also attend.

Scaling CI – switching from poll to push

October 21, 2014 on 10:13 pm | In Continuous Integration, DotNet, SVN, TFS, Windows | No Comments

Scaling CI has many flavors. For example:

When:

  • Code base / number of tests increases -> build time increases,
  • Teams grow,
  • No. of projects grows.

Then:

  • Create targeted builds (dev build, qa build),
  • Write fast unit tests,
  • Smaller teams with local integration servers,
  • Modularize the code base:
    • Scale hardware,
    • Add more build agents,
    • Parallelize.

and last but not least:

  • Ease the load on the source control system.

Let me show you how to make Subversion and (TFS) Git proactively inform Jenkins CI about changes in source control.

The most straightforward way to let the CI server know that something has changed in the repository is to configure polling. It means that the CI server periodically asks the source control system: “do you have changes for me?”. In Jenkins CI you configure it under “Build Triggers” and “Poll SCM”. Jenkins uses cron-style notation, like this:

[Screenshot: Jenkins “Build Triggers” section with “Poll SCM” and its cron-style schedule field]

Five asterisks, “* * * * *”, mean: poll every minute. Every minute is as close to continuous as you can get; more often is not possible. Most of the time this is not a problem, and once a minute is quite enough. But what if you have many repositories under CI? A single Jenkins CI request does not cost much, but with many repositories to check it can add up to a significant delay.
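
For the record (the schedule below is my own example, not from the screenshot): the five fields are minute, hour, day of month, month and day of week, so a less aggressive schedule that polls every five minutes would be:

*/5 * * * *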

There is a way to change it: switching from poll to push. How about letting the source control system inform the CI server: “I have something new for you”? The mechanism that makes this possible is called hooks (at least it’s hooks in Subversion and Git). Hooks are scripts that are executed in different situations: on the client before or after a commit (pre-commit, post-commit), before or after an update (pre-update, post-update) and so on, or on the server before or after a receive (pre-receive, post-receive). What is interesting for us is the post-commit hook in Subversion (look for the hooks subdirectory on the server) or post-receive in Git (look in .git\hooks). Because Git is distributed you have hooks in every repo, but the one that is interesting for us is of course the repo the CI server watches, and from its point of view it is the post-receive hook that needs to be executed. In those hooks you can do basically everything you want. We will get back to it soon.

On the Jenkins CI side you need to change the trigger to “Trigger build remotely”. This option is only visible if your installation of Jenkins is not secured with a login and password.

[Screenshot: the “Trigger build remotely” build trigger in Jenkins]

In this case you can always trigger the build by simply calling the URL:

http://[jenkins_server]/job/[job_name]/build

If your installation is secured, you have to check “Trigger build remotely” and you can set a security token for the build. Only with this token can the build be triggered.

[Screenshot: “Trigger build remotely” with the authentication token field filled in]

The URL that needs to be called in this case is:

http://[jenkins_server]/job/[job_name]/build?token=[token]

If your Jenkins is viewable without authentication, this is enough to trigger the build. But sometimes Jenkins CI will be secured in such a way that nothing is viewable without logging in. How to trigger a build in this case? Well, there is a plug-in for that. It is called “Build Authorization Token Root Plugin” and it is available at https://wiki.jenkins-ci.org/display/JENKINS/Build+Token+Root+Plugin. In this case the URL will be:

http://[jenkins_server]/buildByToken/build?job=[job_name]&token=[token]

We are ready on the Jenkins CI side. Let’s make things ready on the source control system side. Since we are Microsoft-minded at CODEFUSION (my company), we have Subversion on our own Windows Server and Git on visualstudio.com.

In Subversion, go to the server and look for the repositories. Go to the repository you want to trigger from and into its hooks subdirectory. Create a file called post-commit.cmd. Subversion will run this script every time something comes in. We want it to simply call a URL. Under Linux you would use the curl command. You can do that here as well, but you would have to download curl for Windows and place it somewhere on the server. There is a better way: you can use PowerShell to call the URL. So create a post-commit.ps1 file (the name does not actually matter, but let’s keep it “in Ordnung”). Inside, write the script:

# Accept a self-issued SSL certificate (only needed if Jenkins runs over HTTPS with one)
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
# Fill in the Jenkins server, job name and token
$url="https://[jenkins_server]/buildByToken/build?job=[job_name]&token=[token]"
# Call the URL to trigger the build
(New-Object System.Net.WebClient).DownloadString("$url");

The first command is only needed if you have Jenkins running over SSL with a self-issued certificate (like we have). In the second command, fill in the gaps to form the correct URL. The third command calls this URL. The nice thing about it: if you are on a modern Windows Server, you most likely have PowerShell already installed.

Now call the PowerShell script from the post-commit.cmd like this:

PowerShell -NoProfile -ExecutionPolicy Bypass -Command "& '%~dp0post-commit.ps1'"

The -NoProfile and -ExecutionPolicy switches make it possible to call the script from the command line regardless of the execution policy set on the machine. In the -Command switch pay attention to the syntax: %~dp0 expands to the directory the batch script itself resides in.

Now check something in and watch the build being triggered (if it’s not, check everything once again; it worked on my machine).

Now Git. We were using TFS Git from visualstudio.com. There is no access to hooks in TFS. But Microsoft was kind enough to make it possible in another way. Log into visualstudio.com, go to your project and look for “Service Hooks”.

[Screenshot: the “Service Hooks” section of a project on visualstudio.com]

It lets you integrate with various 3rd party services. One of them is Jenkins CI.

[Screenshot: the list of 3rd party services available for integration, including Jenkins]

I would like Microsoft to let me make a simple URL call among those “Services”. Please. But since that is not possible, let’s choose Jenkins.

[Screenshot: choosing Jenkins as the service to integrate with]

I decided to trigger the build after every code push. You can set the filters so that it is triggered only for certain repos or branches. Then choose to trigger a generic build and provide all the necessary information: the Jenkins URL, user name, API token (more on that later), build (the job name, provided automatically) and build token (as in the SVN case, provided by Jenkins when you configure “Trigger build remotely”). To get the API token on Jenkins CI go to “People”, search for the configured user and choose “Configure”.

[Screenshot: the Jenkins user configuration page showing the API token]

Look for the API token and use it on visualstudio.com.

Test it and check if the build was triggered. It should be. It worked on my machines.

I hope it was useful!

Vanilla build server and a little NuGet gem

October 6, 2014 on 7:37 pm | In ASP.NET MVC, Continuous Integration, DotNet, MSBuild | No Comments

Vanilla build server is a concept that says the build server should have as few dependencies as possible. It should be like vanilla ice cream without any raisins (I hate raisins in ice cream). Let me cite the classic (from Continuous Integration in .NET):

“It’s strongly suggested that you dedicate a separate machine to act as the CI server. Why? Because a correctly created CI process should have as few dependencies as possible. This means your machine should be as vanilla as possible. For a .NET setup, it’s best to have only the operating system, the .NET framework, and probably the source control client. Some CI servers also need IIS or SharePoint Services to extend their functionality. We recommend that you not install any additional applications on the build server unless they’re taking part in the build process.”

I was recently preparing a talk for a conference and setting up a brand new CI server on Windows Server 2012. My ASP.NET MVC project build, of course, ended with the following error:

error MSB4019: The imported project "C:\Program Files 
(x86)\MSBuild\Microsoft\VisualStudio\v11.0\
WebApplications\Microsoft.WebApplication.targets" 
was not found. Confirm that the path in the <Import> 
declaration is correct, and that the file exists on disk.

Well, of course. I have a vanilla machine without any MSBuild targets for ASP.NET MVC. I was going to solve it as usual: create a tools directory, copy the needed targets into the repository and configure the MSBuild paths to take the targets provided with the repository. It worked like a charm in the past and it would work now. But something (call it intuition) made me check on NuGet, and to my joy I found this little gem:

https://www.nuget.org/packages/MSBuild.Microsoft.VisualStudio.Web.targets/12.0.1

“MSBuild targets for Web and WebApplications that come with Visual Studio. Useful for build servers that do not have Visual Studio installed.” Exactly!

I quickly installed it and configured MSBuild on the build server to use it like this:

/p:VSToolsPath='..\packages\MSBuild.Microsoft.VisualStudio.Web.targets.12.0.1\tools\VSToolsPath'

It is a command-line parameter I added to the build arguments.
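
For illustration (the solution name and configuration are made up; the relative path must match where NuGet restored the package), the full call from a Windows command prompt could look like this:

msbuild MySolution.sln /p:Configuration=Release /p:VSToolsPath="..\packages\MSBuild.Microsoft.VisualStudio.Web.targets.12.0.1\tools\VSToolsPath"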

Et voilà!

.NET Developer Days 2014 Conference

September 17, 2014 on 10:13 am | In Continuous Integration, DotNet, Software Engineering | No Comments

I will be speaking at .NET Developer Days 2014 in Wrocław, Poland. The conference will be held between 14th and 16th October 2014 at the City Stadium in Wrocław. The topic is “Continuous integration and deployment in .NET with Jenkins CI and Octopus Deploy”. Here is the conference website: http://developerdays.pl/.

Waiting for the first .NET wrist watch

May 9, 2014 on 2:26 pm | In Continuous Integration, DotNet, Netduino | No Comments

Almost a year ago there was a Kickstarter campaign to fund the first .NET Micro Framework watch: the AGENT smartwatch. The nice thing about it is that you will be able to program it using C# and Visual Studio. While we are still waiting for the product, there is already an SDK with an emulator. It is from the same guys that gave us Netduino!

Think about it: you have a Continuous Integration server running your builds and you want to monitor it on the fly. Is there a better device for that than a wrist watch? So I thought, and decided to check it out.

Here is a quick project I hacked together to prove the concept. But before we begin, let me show you the result:

[Screenshot: the AGENT watch emulator showing the Jenkins build status]

Neat, isn’t it?

I’m using Jenkins as my Continuous Integration server. It has a set of APIs for developers to use. I decided to give the JSON API a try.

I typed:

http://jenkins_url/api/json?tree=jobs[name,lastBuild[building,result]]

which gave me a nice JSON result:

{
  "jobs": [
    {
      "name": "Demo4Dev1",
      "lastBuild": {
        "building": false,
        "result": "SUCCESS"
      }
    },
    {
      "name": "Demo4Dev2",
      "lastBuild": {
        "building": false,
        "result": "SUCCESS"
      }
    },
    {
      "name": "DemoTest1",
      "lastBuild": {
        "building": false,
        "result": "SUCCESS"
      }
    }
  ]
}

I went to the AGENT website and got the SDK. I fired up Visual Studio and went to New Project –> Visual C# –> Micro Framework –> AGENT Watch Application.

[Screenshot: the New Project dialog with the AGENT Watch Application template selected]

This gave me a Hello World application.

I added the System.Http and System.IO references and headed straight for getting the HTTP response and reading the response stream to the end. Like this:

// Call the Jenkins JSON API (JenkinsApiUrl holds the URL shown above)
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(JenkinsApiUrl);
WebResponse resp = req.GetResponse();
// Read the whole JSON response into a string
StreamReader sr = new StreamReader(resp.GetResponseStream());
string respStr = sr.ReadToEnd();

Now I needed something to parse the JSON text. Luckily for me, I wasn’t the only one: there is a nice NuGet package with a JSON parser for the Micro Framework. To get it, issue:

PM> Install-Package Json.NetMF

With it in place, I headed straight for deserialization:

Hashtable deserializedObject = Json.NETMF.JsonSerializer.DeserializeString(respStr) as Hashtable;

Then I hacked and slashed my way over the result to find out if everything was all right.

// Assume success
bool generalFailure = false;

// The top-level Hashtable has a single "jobs" entry holding an ArrayList of jobs
foreach (DictionaryEntry de in deserializedObject)
{
    foreach (Hashtable ht in de.Value as ArrayList)
    {
        // Each job Hashtable has a "name" and a "lastBuild" entry
        foreach (DictionaryEntry job in ht)
        {
            if (!job.Key.ToString().Equals("name"))
            {
                // "lastBuild" is a nested Hashtable with "building" and "result"
                Hashtable ht2 = job.Value as Hashtable;
                if (ht2 == null) continue;
                foreach (DictionaryEntry results in ht2)
                {
                    // Skip "building" and look at "result"; one failing job is enough
                    if (!results.Key.ToString().Equals("building"))
                    {
                        if (results.Value.ToString().Equals("FAILURE"))
                            generalFailure = true;
                    }
                }
            }
        }
    }
}

I have added two result images to the resources:

[Screenshot: the two result images (sun and storm) added to the project resources]

And headed to show the result:

// initialize display buffer
_display = new Bitmap(Bitmap.MaxWidth, Bitmap.MaxHeight);

// Show result
_display.Clear();
Font fontNinaB = Resources.GetFont(Resources.FontResources.NinaB);

_display.DrawText("Jenkins", fontNinaB, Color.White, 35, 10);
if (generalFailure)
{
    _display.DrawText("FAIL!", fontNinaB, Color.White, 35, _display.Height - 20);
    Bitmap image =
        new Bitmap(Resources.GetBytes(Resources.BinaryResources.storm), Bitmap.BitmapImageType.Bmp);
    _display.DrawImage(_display.Width / 2 - image.Width / 2,
        _display.Height / 2 - image.Height / 2,
        image, 0, 0, image.Width, image.Height);

}
else
{
    _display.DrawText("SUCCESS!", fontNinaB, Color.White, 35, _display.Height - 20);
    Bitmap image =
        new Bitmap(Resources.GetBytes(Resources.BinaryResources.sun), Bitmap.BitmapImageType.Bmp);
    _display.DrawImage(_display.Width / 2 - image.Width / 2,
        _display.Height / 2 - image.Height / 2,
        image, 0, 0, image.Width, image.Height);
}
_display.Flush();

I packed everything in a never-ending while loop with a small delay:

while (true)
{
  // ... code ...
  Thread.Sleep(10000);
}

Done!

That’s the screen with the failure notice.

[Screenshot: the emulator showing the failure notice with the storm icon]

I can’t wait to get the Agent Watch to make the final app!

Eventful week

November 22, 2013 on 11:01 am | In Article, Continuous Integration, DotNet, Software Engineering | No Comments

Last week was quite eventful. I talked about Continuous Integration in .NET, and about how we use it at my company CODEFUSION, at the IT Academic Day 2013 at the Opole University of Technology (OUTech). It was an event organized by the .NET Group from OUTech and Microsoft Poland. The auditorium was nearly full! Of course I showed my funny CI gadget, the "Great Integrator Helmet". It connects wirelessly to the CI server and gives feedback about a failing build by blinking and howling. As usual it was very well received by the audience ;)

And since we are on the topic of tinkering with electronics: I’ve described how to build such a thing using Tinkerforge and .NET and, together with Bernhard Kord, written an article about it. It went “live” this week in the 11/2013 issue of dotnetpro Magazin in Munich, Germany. If you read German (the article is written in German), take a look at http://www.dotnetpro.de/articles/onlinearticle4689.aspx for more details!

Developer Week – .NET Developers Conference 2013

September 16, 2013 on 1:22 pm | In Continuous Integration, DotNet, Software Engineering | No Comments

I was an invited speaker at this year’s Developer Week – .NET Developer Conference in Nuremberg, Germany. It was quite a nice event and I was speaking about “Continuous Integration in .NET”. You can find the post-event report in this document (German only!). If you look carefully you will surely find my picture there!

Developer Week / Dotnet Developer Conference 2013

June 25, 2013 on 7:20 pm | In Continuous Integration, DotNet, Software Engineering | No Comments

Many thanks to everyone who took part in my talk during the Developer Week / Dotnet Developer Conference 2013 in Nuremberg! That was an outstanding hour! I hope you learned something about “Continuous Integration in .NET”. Many thanks to the organizers for the invitation. Thanks to my fellow speakers. And thanks to all 1500 (?!) DWX 2013 attendees!

And as a bonus – CiInDotNet_Dwx

I will be speaking at .NET Developer Conference in Nuremberg, Germany

March 8, 2013 on 8:12 am | In Continuous Integration, DotNet | No Comments

I’m happy to announce that I will be speaking about Continuous Integration in .NET at the .NET Developer Conference (DDC) in Nuremberg, Germany. The .NET Developer Conference is part of Developer Week (DWX), which combines 3 events: WDC (Web Developer Conference), MDC (Mobile Developer Conference) and DDC. It takes place from 24th to 27th June 2013. My session is on Monday the 24th.

More information on DWX 2013, the program and the speakers is available at www.developer-week.de.

You are very welcome!

PS. I have a discount code for all my blog readers – if you want one please drop me a line!

I will be speaking at MTS 2012

September 7, 2012 on 9:47 am | In Continuous Integration, DotNet | No Comments

MTS, or Microsoft Technology Summit, is the biggest Microsoft technical conference in Poland. It is the place where a few thousand developers, IT specialists and business people meet to get the latest information from Microsoft and to talk about things that matter. This year I was invited to MTS to give a talk about continuous integration. Let’s meet at MTS 2012!
