
.NET, Featured, General, Presentations and sessions, Headline, WebAPI »

[6 Mar 2015 | 0 Comments]

I am *very* happy to announce my latest Pluralsight course is now live!  It's called Building and Securing a RESTful API for Multiple Clients in ASP.NET (that's about all the words I could fit in one title ;-)).  Keywords: REST.  Web API.  OAuth 2.0 & OpenID Connect.  ASP.NET MVC, mobile (Windows Phone).

 


The course combines what I've learned in the field in my day-to-day job as a mobile solution architect: what works (and what doesn't) when you need to build an API that's consumed by a variety of clients (mobile, web, ...)?  What are the best practices for building RESTful APIs?  How do you secure something like that (for example: why should you do things differently when you're building a native mobile client versus a web client versus a JavaScript-based client)?

 

And, even more important than the how: why do you have to take all these things into account?

 

Here’s a short description of what to expect:

 

We all seem to be building RESTful APIs these days with ASP.NET Web API.  But REST is bigger than that - it's an architectural style.  If you're looking to learn what REST actually is and how to build a RESTful API with ASP.NET Web API, aimed at multiple client types (web/mobile), this is the right course for you.

 

It’s filled with best practices concerning URI design, data shaping, paging, caching, versioning, and so on - it's very demo-driven, and we start from scratch.  It contains an API and two different clients: an ASP.NET MVC client and a mobile client.

 

To top it off, you'll also learn all about securing both client apps and the API with OAuth 2.0 and OpenID Connect.  The focus is on what works for standardized API development for multiple (possibly cross-platform) clients.

 

If you think this is something for you, make sure you have a look – I hope you’ll enjoy it!

.NET, Featured, General, GitHub, Headline, WebAPI, Windows 8, Windows Phone, WinRT »

[3 Mar 2015 | 0 Comments]

If you’ve been reading this blog, you know one of my open source projects is Marvin.JsonPatch – a portable class library implementation of the JsonPatch spec with both client & server components.  As it’s under active development, there are quite a lot of interim releases (you can get all of those from my MyGet feed, if you *really* need them ;-)).  I don’t post updates for each interim release - but this is a pretty big one, so I figured I’d announce it (if you’ve been using this, you’ll definitely want this update).

 

The most important new feature is support for patching non-IConvertible types.  Previously, the component supported updating values of types that implement IConvertible, like double, string, … This release greatly expands on that: in essence, if it’s a serializable value, you can patch it.  This allows you to patch full object trees, e.g.: if you’ve got a Person object that has a HomeAddress property of type Address, you no longer have to patch each property of that Address separately – you can now pass in that person’s address in one go, assuming your Address class is serializable.
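For example, a quick sketch (Person and Address are just the hypothetical classes from the paragraph above, and I'm using the package's typed Replace method, as in the samples elsewhere on this blog):

// example classes - not part of the package
public class Address
{
    public string Street { get; set; }
    public string City { get; set; }
}

public class Person
{
    public string Name { get; set; }
    public Address HomeAddress { get; set; }
}

// replace the complete HomeAddress object tree in one operation,
// instead of patching Street, City, ... separately
var patchDoc = new JsonPatchDocument<Person>();
patchDoc.Replace<Address>(p => p.HomeAddress,
    new Address { Street = "Main Street 1", City = "Antwerp" });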

 

Next to that, the component is now somewhat more forgiving when it comes to working with lists of data.  Previously, you had to ensure that when you replaced a List, you passed in a List.  Now, it’s sufficient that the value you pass in can be serialized to the type you want to replace.  E.g.: a collection can be serialized to a list, so you’re allowed to pass in a collection as the value.
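On the wire that's easy to see: the value in the patch document is just a JSON array, so the collection type it originated from no longer matters.  A server-side sketch (MyDto is a stand-in class I made up, and I'm assuming Json.NET deserialization of the patch document plus the package's ApplyTo method):

// MyDto is a stand-in for your own DTO
public class MyDto
{
    public List<int> IntegerList { get; set; }
}

// the incoming patch document: "value" is simply a JSON array, regardless
// of whether the client built it from a List, a Collection, ...
var json = @"[ { ""op"": ""replace"", ""path"": ""/integerlist"", ""value"": [ 1, 2, 3 ] } ]";
var patchDoc = JsonConvert.DeserializeObject<JsonPatchDocument<MyDto>>(json);

var dto = new MyDto { IntegerList = new List<int> { 9, 9, 9 } };
patchDoc.ApplyTo(dto);   // IntegerList is now [ 1, 2, 3 ]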

 

A lot of work has also gone into performance improvements: there’s quite a bit of reflection that’s now avoided (about 50% less, to be exact :)).

 

And lastly, even though it doesn’t *really* say anything by itself: the number of unit tests has more than doubled since the last release – testing (& stability) was greatly improved (and the unit tests are now implemented with xUnit).

 

In short, if you’ve been using Marvin.JsonPatch, you’ll want this update :-)  You can find it on NuGet.

 

Found an issue, got a comment, feel like contributing?  One address: https://github.com/KevinDockx/JsonPatch :-)

.NET, Featured, General, GitHub, Headline »

[16 Feb 2015 | 0 Comments]

I've recently been working on a few open source projects, namely JsonPatch support and HttpCache support for .NET as Portable Class Libraries; these are implementations of the JsonPatch spec & HTTP caching spec respectively, and both are currently under active development.  I had a few reasons for starting these: current implementations either didn't exist or didn't fully satisfy my needs (your mileage may vary, of course :)).  As I currently use these in real-life projects, I figured they might be of use to other people as well, so I wanted to make them available on NuGet.

 

I started out by manually creating the packages to upload to NuGet.  It's a bit of a cumbersome process at first, but it's not that bad, so I was kinda happy. 

 

After a while though, "kinda happy" evolved into "meh" :-)  Surely, there had to be a way to automate this?  I also hit a more important issue: I don't want to release every single commit as an update to my NuGet packages, yet I do like the way I can manage dependencies when they ARE available on NuGet.  In fact, before considering an official release, I like to test these packages in the real-life projects I currently use them on (in a dev env, of course ;-)). 

 

In short: I needed some sort of automation in creating these packages.  I also required some sort of in-between private NuGet feed, so I could easily update packages in the projects I want to test them on before publishing them to "public NuGet". 

 

That's where MyGet fits into this story.  (just to be clear: no, I'm not getting paid to write this ;-))

 

My current workflow:

  • Code in Visual Studio (where else? :))
  • When a feature is done => push to GitHub
  • Via a WebHook, MyGet auto-builds a new version of the package on each commit
  • This package is available on my MyGet feed.  I can use that to test.
  • When it's time for a new version, I simply push the package that's already on MyGet to NuGet

 

Hours saved?  Lots.  Warm fuzzy feeling thanks to the fact that this all just works?  Check! :)

 

Now, how do you do that?  It's actually pretty easy to set up.  Once you've got a MyGet feed, navigate to "Build Services".  From there, you can select "Add build source...", and select "from GitHub".

 

[Screenshot: MyGet Build Services - adding a build source from GitHub]

 

You'll be asked to select the GitHub repo you want to link.  Select it, and leave all the other options at their defaults - MyGet will create a WebHook at GitHub for you.  And... that's it :-)  From now on, a new MyGet package (including the default semantic versioning – which is really nice) will be created each time you commit code changes to your GitHub repo.

 

Satisfied with the package, and want to push it to NuGet?  Navigate to your package details => package history.  There, choose the package you want to push to NuGet, and choose the Push option.

 

[Screenshot: package history with the Push option]

 

You’ll end up on this screen:

 

[Screenshot: the Push package screen]

 

As you can see, in my workflow that still includes a “pre-release” tag.  If you click the “Edit” button, you’ll see the pre-release tag is filled out:

 

[Screenshot: editing the package details - the pre-release tag is filled out]

 

I currently use my MyGet feed as one that contains pre-release packages – thus, all my packages are pre-release, which explains why the pre-release tag is auto-filled.  If you use your MyGet feed as one that also hosts non-pre-release packages, and you push one of those to NuGet, the pre-release tag will be empty.  So another strategy here (probably one that’s conceptually more correct ;-)) could be to ensure you’ve got a non-pre-release package at MyGet first, before pushing a version to NuGet.

 

Anyway, if you’re following my workflow and you want to make sure your package is not listed as a pre-release version on NuGet, click the “Edit” button and make sure you clear that tag.

 

By the way, MyGet automatically picks up your nuspec file from all the usual suspect locations (in my case, it's in the project dir itself) – that’s where the package details are coming from.

 

And that’s it – Push to publish your package to NuGet.  Hope this helps some of you who are looking to automate your workflow :)

 

If you want to help out on these projects, check GitHub.

If you want to access the in-between preview builds, these are my MyGet feed details - simply add the feed in VS's NuGet Package Manager options dialog.  Make sure you select the "Include pre-release" option in the Package Manager dialog, as that's what they are :)

And if you simply want to use these packages in your own projects, you can find 'm on NuGet.

 

Happy coding! :-)

Featured, General, Headline »

[30 Dec 2014 | 0 Comments]

As some of you may know, I regularly speak at conferences & create courses for Pluralsight.  These types of talks and courses tend to include a lot of demos I distribute afterwards - and I can't always just put 'm on GitHub.

 

So, I need to clean everything up: remove unnecessary files, remove bin/obj folders, remove source control bindings, etc.  Especially for courses this gets tedious very fast (the course I’m currently working on includes more than 100 different Visual Studio solutions), so I was looking into a way to automate cleaning up the exercise directories and removing source control bindings, and afterwards: checking if NuGet package restore still works & if the build still works.

 

Maarten Balliauw pointed me in the right direction: PowerShell.  So I started looking around, and found a great post by Daniel Thompson.

 

It didn't completely fit my requirements, so I adjusted this somewhat:

  • I want to delete all *.vssscc, *.user, *.vspscc, etc. files
  • I also want to delete the *.v12.suo files (user options) (a hidden file, so I added -force)
  • Instead of just deleting *.pdb's, I want to get rid of the complete bin & obj folders
  • I also want to get rid of the packages folder that contains the NuGet packages - as these get restored automatically on first build

Lastly, this specific set of demo files I'm currently working on contains .mdf & .ldf files (databases) - I don't want to distribute those for each and every starter/finished demo, but only once, so the download size stays as low as possible.  Therefore, I also remove those.

 

get-childitem . -include *.vssscc,*.user,*.vspscc,*.v12.suo,*.mdf,*.ldf,debug,packages,bin,obj -recurse -force |
    %{
        remove-item $_.fullname -force -recurse
    }

 

Up next is removing all the source control bindings from the solution & project files (these scripts come from Daniel's post).  I tend to commit all the code I work on to (in this case) TFS Online, but it makes no sense to leave those bindings in when distributing the code: all the end user will get is an annoying "unauthorized to connect to TFS Online" message.

 

# Remove the bindings from the sln files
get-childitem . -include *.sln -recurse |
    %{
        $file = $_;
        $inVCSection = $False;
        get-content $file |
        %{
            $line = $_.Trim();
            if ($inVCSection -eq $False -and $line.StartsWith('GlobalSection') -eq $True -and $line.Contains('VersionControl') -eq $True) {
                $inVCSection = $True
            }
            if ($inVCSection -eq $False) {
                add-content ($file.fullname + '.new') $_
            }
            if ($inVCSection -eq $True -and $line -eq 'EndGlobalSection') {
                $inVCSection = $False
            }
        }
        mv ($file.fullname + '.new') $file.fullname -force
    }

# Remove the bindings from the csproj files
get-childitem . -include *.csproj -recurse |
    %{
        $file = $_;
        get-content $file |
        %{
            $line = $_.Trim();
            if ($line.StartsWith('<Scc') -eq $False) {
                add-content ($file.fullname + '.new') $_
            }
        }
        mv ($file.fullname + '.new') $file.fullname -force
    }

 

That's it - clean solutions everywhere! :) 

 

But, well, I kinda want to be sure package restore & a subsequent build will still work.  So here are two more scripts I run after having cleaned up everything, just to make sure the demos still work as expected.

 

First, test if NuGet restore actually restores everything that has to be restored:

 

$nuget = "C:\nuget\nuget.exe"
get-childitem . -include *.sln -recurse |
    %{
        # use fullname, so solutions in subdirectories resolve correctly
        $restorepackages = "$nuget restore ""$($_.fullname)"""
        invoke-expression $restorepackages
    }

 

Next, test if the build still works (/p:VisualStudioVersion=12.0 is required in my case because I use WebDeploy to Azure):

 

$msbuild = "C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe"
get-childitem . -include *.sln -recurse |
    %{
        # again: fullname, so solutions in subdirectories build correctly
        $buildsln = "$msbuild ""$($_.fullname)"" /p:VisualStudioVersion=12.0"
        invoke-expression $buildsln
    }

 

These two scripts are what I consider a failsafe - afterwards, I run the first set of scripts again to clean everything up, and then I'm ready to distribute the demos.  As always, your mileage may vary & you might need a few adjustments before using these scripts (these are definitely aimed at my own requirements) - but they work for me, and they should be enough to get you started.

 

Hope this helps some of you out :) 

.NET, Featured, GitHub, Headline, WebAPI »

[17 Dec 2014 | 0 Comments]

I’ve pushed a new version of JsonPatch for .NET to NuGet.  If you’re already using it, a simple update from the NuGet dialog will do. 

 

This new version includes a bunch of bug fixes, but the most notable addition is support for deeply nested objects.  JsonPatch now allows you to patch properties on nested objects, or across nested objects / object trees.

 

This enables scenarios such as:

 

// replace StringProperty in nested object
patchDoc.Replace<string>(o => o.NestedObject.StringProperty, "B");

// copy StringProperty value from root to nested object
patchDoc.Copy<string>(o => o.StringProperty, o => o.NestedObject.StringProperty);

// Move integer value from array in nested object (position 0) to IntegerProperty at root
patchDoc.Move<int>(o => o.NestedObject.IntegerList, 0, o => o.IntegerProperty);

 

 

You can find the new version at NuGet, and you’re always free to look into the source code, log issues or contribute at my GitHub repository.

 

Happy coding! :)

.NET, Featured, General, Headline, GitHub, WebAPI »

[5 Nov 2014 | 0 Comments]

I’ve been working on my next Pluralsight course over the last few weeks.  At some point I needed support for JSON Patch (RFC 6902), to allow partial RESTful updates (HttpPatch) with Web API.

 

That’s not supported out of the box, so I kinda had to roll my own implementation.  I chose to open source it, and publish it on NuGet.

 

It’s still early days & there’s work to be done, but all the basics are there.  Any comments are much appreciated, and issues can of course be reported on GitHub.

 

Here’s a bit more detail:

 

“JSON Patch (https://tools.ietf.org/html/rfc6902) defines a JSON document structure for expressing a sequence of operations to apply to a JavaScript Object Notation (JSON) document; it is suitable for use with the HTTP PATCH method. The "application/json-patch+json" media type is used to identify such patch documents.

 

One of the things this can be used for is partial updates for REST-ful API's, or, to quote the IETF: "This format is also potentially useful in other cases in which it is necessary to make partial updates to a JSON document or to a data structure that has similar constraints (i.e., they can be serialized as an object or an array using the JSON grammar)."

 

That's what this package is all about. Web API supports the HttpPatch method, but there's currently no implementation of the JsonPatchDocument in .NET, making it hard to pass in a set of changes that have to be applied - especially if you're working cross-platform and standardization of your API is essential.”
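To make that a bit more concrete: here's a sketch of what a Web API controller action could look like with this package (PersonDto and _repository are placeholders I made up for the example):

[HttpPatch]
public IHttpActionResult Patch(int id, [FromBody] JsonPatchDocument<PersonDto> patchDoc)
{
    if (patchDoc == null)
    {
        return BadRequest();
    }

    // load the current representation (placeholder repository)
    var person = _repository.GetPerson(id);
    if (person == null)
    {
        return NotFound();
    }

    // apply the sequence of operations from the patch document
    patchDoc.ApplyTo(person);
    _repository.Save(person);

    return Ok(person);
}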

 

For more examples of how to use this, check my GitHub repository.

 

Hope you like it! :-)

.NET, Featured, General, Headline »

[3 Oct 2014 | 0 Comments]

Over the past years, a lot has been said and thought about XAML.  At a certain point, there were even questions about whether it would still be worth investing your time in.

 

Currently, its future looks very bright indeed.  I thought it’d be a good idea to create an overview of how it all started, what happened to it over the last years and what you can do with it today.  In short: the history & future of XAML (hint: we went from “meh” to “ok, looks good”, to “oh no, it’s dead” to “we can target almost every platform with it” :-)).

 

If you’re interested, check out my blog post on Pluralsight’s blog: The Future of XAML

 

Enjoy!