
.NET, Presentations and sessions, RealDolmen, WebAPI »

[19 May 2015 | 0 Comments]

Last week, on May 12th, I delivered a session on REST best practices at Techorama, Belgium’s premier IT conference.  Here’s a short overview of what that session was all about:



REST used to be a buzzword, but these days, it’s a given: we’re all building RESTful APIs (or want to be building them). Yet there’s a lot to talk about, and a lot that can go wrong, when building a RESTful API. In this session, we’ll dive into best practices concerning URI design/routing, partial updates, filtering, sorting & paging, data shaping, versioning and more. We’ll learn about the standards that have been created to address some of these requirements, and how you can use them, with one purpose in mind: building a truly evolvable, cross-platform consumable RESTful API.


In short, you’ll learn about the good, the bad and the ugly, in a code-fuelled session.



At the session, I promised I’d put both the slide deck and the code from that session on this blog for those who were interested.  So: here you go :)

.NET, Featured, General, GitHub, Headline, WebAPI »

[6 May 2015 | 0 Comments]

As some of you may know, I’ve been working with Microsoft on integrating one of my OSS projects, Marvin.JsonPatch, into the next version of ASP .NET.  Last week at BUILD, a new RC build of Visual Studio 2015 was released.  This version includes the new ASP .NET, and is thus the first one that also includes support for JsonPatch out of the box. 


A few words on what JsonPatch actually is:

JSON Patch (https://tools.ietf.org/html/rfc6902) defines a JSON document structure for expressing a sequence of operations to apply to a JavaScript Object Notation (JSON) document; it is suitable for use with the HTTP PATCH method.


One of the things this can be used for is partial updates for RESTful APIs, or, to quote the IETF: "This format is also potentially useful in other cases in which it is necessary to make partial updates to a JSON document or to a data structure that has similar constraints (i.e., they can be serialized as an object or an array using the JSON grammar)."


In other words, if you’re building a RESTful API, you’ll probably want to use this.


I figured I’d create a sample project showing how to use this (a Wine Cellar management app).  The full example can be found on my GitHub page.  It contains a sample Web API, a sample MVC 6 client and a sample MVC 5 client.


Let’s start with the Web API side:


[HttpPatch]
public IActionResult Patch(int id,
    [FromBody]JsonPatchDocument<BottleOfWine> bottleOfWinePatchDocument)
{
    // find the bottle with the correct id
    var bottles = Datastore.WineStore.BottlesOfWine;
    var correctBottle = bottles.FirstOrDefault(b => b.Id == id);
    if (correctBottle == null)
    {
        return HttpNotFound();
    }

    // apply patch document
    bottleOfWinePatchDocument.ApplyTo(correctBottle);

    return new ObjectResult(correctBottle);
}


JsonPatchDocument is defined in the Microsoft.AspNet.JsonPatch namespace.  The action, attributed with “HttpPatch”, receives such a document as a parameter.  A JsonPatchDocument can be looked at as a change set that has to be executed on a resource.  Here’s an example of a patch document:


[
    { "op": "test", "path": "/a/b/c", "value": "foo" },
    { "op": "remove", "path": "/a/b/c" },
    { "op": "add", "path": "/a/b/c", "value": [ "foo", "bar" ] },
    { "op": "replace", "path": "/a/b/c", "value": 42 },
    { "op": "move", "from": "/a/b/c", "path": "/a/b/d" },
    { "op": "copy", "from": "/a/b/d", "path": "/a/b/e" }
]


What the integration of this into ASP .NET does is ensure you don’t have to create the document or apply it manually – it offers type-safe creation of a patch document (on the MVC side) and a way to apply it to your resource (on the Web API side).


In the Patch method, you then get the object you want to apply the patch document to, and call the ApplyTo method.  That’s it – your change set has now been applied to the object.  What you want to do after this depends on your needs, but often you’ll want to save the changes to your repository.


On to the client: I’ve added an MVC 6 client to the mix, in which we can change some of the details of the bottles of wine in our cellar:




As you can see, not all fields are editable – which makes this a fine example for a patch request.  In the controller’s Edit (HttpPost) method, the model used by the edit view gets passed in.  It’s from that model that we’ll want to create a patch document to send to the API. 


public async Task<ActionResult> Edit(int id, BottleOfWine model)
{
    // we're only showing 2 properties that can be edited => not a
    // full edit, so we'll want to use JsonPatch

    // create a JsonPatch document
    JsonPatchDocument<BottleOfWine> patchDoc =
        new JsonPatchDocument<BottleOfWine>();
    patchDoc.Replace(b => b.Grape, model.Grape);
    patchDoc.Replace(b => b.Year, model.Year);

    // serialize
    var serializedPatchDoc = JsonConvert.SerializeObject(patchDoc);

    // create the patch request
    var method = new HttpMethod("PATCH");
    var request = new HttpRequestMessage(method,
        "http://localhost:1735/api/bottlesofwine/" + id)
    {
        Content = new StringContent(serializedPatchDoc,
            System.Text.Encoding.Unicode, "application/json")
    };

    // send it, using an HttpClient instance
    HttpClient client = new HttpClient();
    var result = await client.SendAsync(request);
    if (result.IsSuccessStatusCode)
    {
        return RedirectToAction("Index");
    }

    return View(result.StatusCode);
}

As there are only two fields that can be edited, it’s those two fields we’ll want to create a patch document for.  In this case, we’re only replacing values, but you can also use this to add (e.g. to an array), move, copy, … fields – all the methods from the JsonPatch standard are supported. 


After that, it’s a matter of creating a new HttpRequestMessage & sending it.  This will result in the Patch method on our API getting executed, and the document being applied to the BottleOfWine resource.  And if you want to fully follow the standard, you can send it with media type “application/json-patch+json” as content type as well, as long as you add support for that media type to the input formatter on the API side.
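For illustration, here’s a minimal sketch of that variation – the only change compared to the Edit method above is the media type passed to StringContent.  The URL and patch body below are placeholder values, not taken from the sample app:

```csharp
using System.Net.Http;
using System.Text;

// Sketch: a PATCH request using the official JSON Patch media type.
// URL and patch body are placeholders for illustration only.
var serializedPatchDoc =
    @"[{ ""op"": ""replace"", ""path"": ""/grape"", ""value"": ""Merlot"" }]";

var request = new HttpRequestMessage(new HttpMethod("PATCH"),
    "http://localhost:1735/api/bottlesofwine/1")
{
    Content = new StringContent(serializedPatchDoc,
        Encoding.Unicode, "application/json-patch+json")
};
```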


But what if you’re using an older client version?  For example: your API is built using the new ASP .NET, but your client application is built with the previous MVC version?  Or you’ve got a Windows Phone client, or a Windows Store client?  Well, that’ll work as well – just add the Marvin.JsonPatch NuGet package to your solution to add JsonPatch support (details: https://github.com/KevinDockx/JsonPatch).  I’ve included an example MVC 5 client to show how to do this.  This also works the other way around: you can use a new ASP .NET MVC 6 client with an “old” Web API backend that implements JsonPatch support through Marvin.JsonPatch. 


All of this is of course still work in progress; more features will be added before the final release of ASP .NET.  But if you already need to use it now: this is how. 


Happy coding! :)

.NET, Featured, General, Presentations and sessions, Headline, WebAPI »

[6 Mar 2015 | 0 Comments]

I am *very* happy to announce my latest Pluralsight course is now live!  It's called Building and Securing a RESTful API for Multiple Clients in ASP .NET (that's about all the words I could fit in one title ;-)).  Keywords: REST.  Web API.  OAuth 2.0 & OpenID Connect.  ASP .NET MVC, Mobile (WinPhone).



The course combines what I've learned in the field in my day-to-day job as mobile solution architect: what works (and what doesn't) when you need to build an API that's to be consumed by a variety of clients (mobile, web, ...)?  What are the best practices for building RESTful APIs?  How do you secure something like that (for example: why should you do things differently when you're building a native mobile client versus a web client versus a JavaScript-based client)? 


And, even more important than the how: why do you have to take all these things into account?


Here’s a short description of what to expect:


We all seem to be building RESTful APIs these days, with ASP .NET Web API.  But REST is bigger than that - it's an architectural style.  If you're looking to learn what REST actually is and how to build a RESTful API with ASP .NET Web API, aimed at multiple client types (web/mobile), this is the right course for you. 


It’s filled with best practices concerning URI design, data shaping, paging, caching, versioning, and so on - it's very demo-driven, and we start from scratch. It contains an API and two different clients: an ASP .NET MVC client and a mobile client.


To top it off, you'll also learn all about securing both client apps and the API with OAuth 2.0 and OpenID Connect.  The focus is on what works for standardized API development for multiple (possibly cross-platform) clients.


If you think this is something for you, make sure you have a look – I hope you’ll enjoy it!

.NET, Featured, General, GitHub, Headline, WebAPI, Windows 8, Windows Phone, WinRT »

[3 Mar 2015 | 0 Comments]

If you’ve been reading this blog, you know one of my open source projects is Marvin.JsonPatch – a portable class lib implementation of the JsonPatch spec with both client & server components.  As it’s under active development, there are quite a lot of interim releases (you can get all of those from my MyGet feed, if you *really* need them ;-)).  I don’t post updates for each interim release - but this is a pretty big one, so I figured I’d announce it (if you’ve been using this, you’ll definitely want this update).


The most important feature is support for patching non-IConvertible types.  Previously, the component included support for updating values of types that implemented IConvertible, like double, string, … This release greatly expands on that: in essence, if it’s a serializable value, you can patch it.  This allows you to patch full object trees, e.g.: if you’ve got a Person object that has a HomeAddress property of type Address, you no longer have to patch each property of that Address separately – you can now pass through that person’s address in one go, assuming your Address class is serializable. 
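Serialized, such a full-object replace is just a single JSON Patch operation (a sketch – the property names below belong to the hypothetical Person/Address example above, and the exact casing depends on your serializer settings):

```
[
  {
    "op": "replace",
    "path": "/homeAddress",
    "value": { "street": "Main Street 1", "city": "Antwerp" }
  }
]
```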


Next to that, the component is now somewhat more forgiving when it comes to working with lists of data.  Previously, you’d have to ensure that when you replaced a List, you’d pass through a List.  Now, it’s sufficient when the type you pass through can be serialized to the type you want to replace.  E.g.: a collection can be serialized to a list, so you’re allowed to pass through a collection as value.


A lot of work has also gone into performance improvements: there’s quite a bit of reflection that’s now avoided (about 50% less, to be exact :)).


And lastly, even though it doesn’t *really* say anything: the number of unit tests has more than doubled since the last release – testing (& stability) was greatly improved (and unit tests are now implemented with XUnit).


In short, if you’ve been using Marvin.JsonPatch, you’ll want this update :-)  You can find it on NuGet.


Found an issue, got a comment, feel like contributing?  One address: https://github.com/KevinDockx/JsonPatch :-)

.NET, Featured, General, GitHub, Headline »

[16 Feb 2015 | 0 Comments]

I've recently been working on a few open source projects, namely JsonPatch support and HttpCache support for .NET as Portable Class Libraries; these are implementations of the JsonPatch spec & HttpCache spec respectively, and both are currently under active development.  I had a few reasons for starting these; current implementations either didn't exist or didn't fully satisfy the needs I had for them (your mileage may vary, of course :)).  As I currently use these in real-life projects, I figured they might be of use to other people as well, so: I wanted to make them available on NuGet.


I started out by manually creating the packages to upload to NuGet.  It's a bit of a cumbersome process at first, but it's not that bad, so I was kinda happy. 


After a while though, "kinda happy" evolved into "meh" :-)  Surely, there had to be a way to automate this?  I also hit a more important issue: I don't want to release every single commit as an update to my NuGet packages, yet I do like the way I can manage dependencies when they ARE available on NuGet.  In fact, before considering an official release, I like to test these packages in the real-life projects I currently use them on (in a dev env, of course ;-)). 


In short: I needed some sort of automation in creating these packages.  I also required some sort of in-between private NuGet feed, so I could easily update packages in the projects I want to test them on before publishing them to "public NuGet". 


That's where MyGet fits into this story.  (just to be clear: no, I'm not getting paid to write this ;-))


My current workflow:

  • Code in Visual Studio (where else? :))
  • When a feature is done => GitHub
  • Via a WebHook, MyGet auto-builds a new version of the package on each commit
  • This package is available on my MyGet feed.  I can use that to test.
  • When it's time for a new version, I simply push the package that's already on MyGet to NuGet


Hours saved?  Lots.  Warm fuzzy feeling thanks to the fact that this all just works?  Check! :)


Now, how do you do that?  It's actually pretty easy to set up.  Once you've got a MyGet feed, navigate to "Build Services".  From there, you can select "Add build source...", and select "from GitHub".




You'll be asked to select the GitHub repo you want to link.  Select it, and leave all the other options at default - MyGet will create a WebHook at GitHub for you.  And... that's it :-)  From now on, a new MyGet package (including the default semantic versioning – which is really nice) will be created each time you commit code changes to your GitHub repo.


Satisfied with the package, and want to push it to NuGet?  Navigate to your package details => package history.  There, choose the package you want to push to NuGet, and choose the Push option.




You’ll end up on this screen:




As you can see, in my workflow, that still includes a “pre-release”-tag.  If you click the “Edit” button, you’ll see the pre-release tag is filled out:




I currently use my MyGet feed as one that contains pre-release packages – thus, all my packages are pre-release, which explains why the pre-release tag is auto-filled out.  If you use your MyGet feed as one that also hosts non-pre-release packages, and you push one of those to NuGet, the pre-release tag will be empty.  So another strategy here (probably one that’s conceptually more correct ;-)) could be to ensure you’ve got a non-pre-release package at MyGet first, before pushing a version to NuGet.


Anyway, if you’re following my workflow and you want to make sure your package is not listed as a pre-release version on NuGet, click the “Edit” button and make sure you clear that tag.


By the way, MyGet automatically picks up your nuspec file from all the usual suspect locations (in my case, it's in the project dir itself) – that’s where the package details are coming from.
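For reference, a minimal nuspec file looks something like this (a sketch – the id, version and other metadata are placeholders, not the actual values for any of my packages):

```
<?xml version="1.0"?>
<package>
  <metadata>
    <id>My.Package.Id</id>
    <version>1.0.0-beta1</version>
    <authors>Author Name</authors>
    <description>A short description of the package.</description>
  </metadata>
</package>
```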


And that’s it – Push to publish your package to NuGet.  Hope this helps some of you who are looking to automate their workflow :)


If you want to help out on these projects, check GitHub

If you want to access the in-between preview builds, these are my MyGet feed details - simply add the feed in VS's NuGet Package Manager options dialog.  Make sure you select the "Include pre-release"-option in the Package Manager dialog, as that's what they are :)

And if you simply want to use these packages in your own projects, you can find 'm on NuGet.


Happy coding! :-)

Featured, General, Headline »

[30 Dec 2014 | 0 Comments]

As some of you may know, I regularly speak at conferences & create courses for Pluralsight.  These types of talks and courses tend to include a lot of demos I distribute afterwards - and I can't always just put 'm on GitHub.


So, I need to clean everything up: remove unnecessary files, remove bin/obj folders, remove source control bindings, etc.  Especially for courses this gets tedious very fast (the course I’m currently working on includes more than 100 different Visual Studio solutions), so I was looking into a way to automate cleaning up the exercise directories, removing source control bindings, and afterwards: checking if NuGet package restore works & if the build still works.


Maarten Balliauw pointed me in the right direction: Powershell.  So I started looking around, and found a great post by Daniel Thompson.


It didn't completely fit my requirements, so I adjusted this somewhat:

  • I want to delete all vssscc, user, vspscc etc files
  • I also want to delete the v12.suo files (user options) (hidden file, so I added -force)
  • Instead of just deleting *.pdb's, I want to get rid of the complete bin & obj-folders
  • I also want to get rid of the packages folder that contains the NuGet packages - as these get restored automatically on first build

Lastly, this specific set of demo files I'm currently working on contains mdf & ldf files (DB) - I don't want to distribute those for each and every starter/finished demo, but only once, so the file download size is as low as possible.  Therefore, I also remove those.


get-childitem . -include *.vssscc,*.user,*.vspscc,
*.v12.suo,*.mdf,*.ldf,debug,packages,bin,obj -recurse -force |
    foreach-object {
        remove-item $_.fullname -force -recurse
    }


Up next is removing all the source control bindings from solution & project files (these are coming from Daniel's post).  I tend to commit all the code I work on to (in this case) TFS Online, but it makes no sense to leave those bindings there when distributing the code: all the end user will get is an annoying "unauthorized to connect to TFS online" message.


# Remove the bindings from the sln files
get-childitem . -include *.sln -recurse |
    foreach-object {
        $file = $_;
        $inVCSection = $False;
        get-content $file |
            foreach-object {
                $line = $_.Trim();
                if ($inVCSection -eq $False -and $line.StartsWith('GlobalSection') -eq $True -and $line.Contains('VersionControl') -eq $True) {
                    $inVCSection = $True
                }
                if ($inVCSection -eq $False) {
                    add-content ($file.fullname + '.new') $_
                }
                if ($inVCSection -eq $True -and $line -eq 'EndGlobalSection') {
                    $inVCSection = $False
                }
            }
        mv ($file.fullname + '.new') $file.fullname -force
    }

# Remove the bindings from the csproj files
get-childitem . -include *.csproj -recurse |
    foreach-object {
        $file = $_;
        get-content $file |
            foreach-object {
                $line = $_.Trim();
                if ($line.StartsWith('<Scc') -eq $False) {
                    add-content ($file.fullname + '.new') $_
                }
            }
        mv ($file.fullname + '.new') $file.fullname -force
    }



That's it - clean solutions everywhere! :) 


But, well, I kinda want to be sure package restore & a subsequent build will still work.  So here's two more scripts I run after having cleaned up everything, just to make sure the demos still work as expected.


First, test if NuGet restore effectively restores everything that has to be restored:


$nuget = "C:\nuget\nuget.exe"
get-childitem . -include *.sln -recurse |
    foreach-object {
        $restorepackages = "$nuget restore ""$_"""
        invoke-expression $restorepackages
    }


Next, test if the build still works (/p:VisualStudioVersion=12.0 is required in my case b/c I use WebDeploy to Azure):


$msbuild = "C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe"
get-childitem . -include *.sln -recurse |
    foreach-object {
        $buildsln = "$msbuild ""$_"" /p:VisualStudioVersion=12.0"
        invoke-expression $buildsln
    }


These two scripts are what I consider a failsafe - afterwards, I run the first set of scripts again to clean everything up, and then I'm ready to distribute the demos.  As always, your mileage may differ & you might need a few adjustments before using these scripts (these are definitely aimed at my own requirements) - but they work for me, and they should be enough to get you started.


Hope this helps some of you out :) 

.NET, Featured, GitHub, Headline, WebAPI »

[17 Dec 2014 | 0 Comments]

I’ve pushed a new version of JsonPatch for .NET to NuGet.  If you’re already using it, a simple update from the NuGet dialog will do. 


This new version includes a bunch of bug fixes, but the most notable addition is support for deeply nested objects.  JsonPatch now allows you to patch properties that are in nested objects, or in-between nested objects / object trees. 


This enables scenarios such as:


// replace StringProperty in nested object
patchDoc.Replace<string>(o => o.NestedObject.StringProperty, "B");

// copy StringProperty value from root to nested object
patchDoc.Copy<string>(o => o.StringProperty, o => o.NestedObject.StringProperty);

// Move integer value from array in nested object (position 0) to IntegerProperty at root
patchDoc.Move<int>(o => o.NestedObject.IntegerList, 0, o => o.IntegerProperty);
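When serialized, a document built with those three calls translates into standard JSON Patch operations along these lines (a sketch – the exact paths depend on your property names and serializer settings):

```
[
  { "op": "replace", "path": "/nestedObject/stringProperty", "value": "B" },
  { "op": "copy", "from": "/stringProperty", "path": "/nestedObject/stringProperty" },
  { "op": "move", "from": "/nestedObject/integerList/0", "path": "/integerProperty" }
]
```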



You can find the new version at NuGet, and you’re always free to look into the source code, log issues or contribute at my GitHub repository.


Happy coding! :)