Thursday, 1 May 2014

Storing DateTimeOffset values in Azure Table Storage

I recently stumbled upon a weakness of Azure Table Storage: there is no native support for dates with timezone information (the .NET DateTimeOffset datatype).  If I’d read all the documentation thoroughly beforehand I’d have known this (see http://msdn.microsoft.com/library/azure/jj553018.aspx), but like most of us I didn’t read all the documentation…

And, to be fair, I did have some reason for thinking that Table Storage would support DateTimeOffset values.  Below is an example of a very basic ITableEntity class that can be read from/written to Table Storage using the Azure SDK methods in the
Microsoft.WindowsAzure.Storage.Table namespace:


public class ExampleEntity : ITableEntity
{
    public void ReadEntity(IDictionary<string, EntityProperty> properties, OperationContext operationContext)
    {
        ExampleDateWithTimeZone = properties.GetDateTimeOffset("ExampleDateWithTimeZone").GetValueOrDefault();
    }
 
    public IDictionary<string, EntityProperty> WriteEntity(OperationContext operationContext)
    {
        Dictionary<string, EntityProperty> properties = new Dictionary<string, EntityProperty>();
        properties.Add("ExampleDateWithTimeZone", EntityProperty.GeneratePropertyForDateTimeOffset(ExampleDateWithTimeZone));
        return properties;
    }
 
    public DateTimeOffset ExampleDateWithTimeZone { get; set; }
 
    public string ETag { get; set; }
 
    public string PartitionKey { get; set; }
 
    public string RowKey { get; set; }
 
    public DateTimeOffset Timestamp { get; set; }
} 

Look at the bodies of ReadEntity and WriteEntity: there are GetDateTimeOffset and GeneratePropertyForDateTimeOffset helper methods to read and write DateTimeOffset values, but no equivalent methods that read and write plain DateTime values.  I saw this and thought “Great!  It only supports dates with timezone information, which makes sense when the datacentre will normally be in a different timezone from the users”.

But unfortunately not…

When you do use those tempting helper methods for DateTimeOffsets, what actually happens is:


  1. When you write the value, Azure converts it to GMT, so

    24/04/2014 14:31 +02:00

    becomes

    24/04/2014 12:31 +00:00
  2. When you read it back you get the GMT value.

I didn’t notice this for a while because, until the clocks changed in spring, all my work on this project had taken place in GMT (one of the perils of developing in the U.K.: you’re generally pretty careful about date formats, but not so careful about timezones).  I did have some unit tests that used DateTimeOffset values with timezones, but they were passing because, according to DateTimeOffset.Equals():

24/04/2014 14:31 +02:00 == 24/04/2014 12:31 +00:00

(I can’t quite decide whether that’s a good thing or not)
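A quick sketch outside .NET makes the same point; this JavaScript check (illustrative only, my tests were in C#) confirms that the two literals name one and the same instant:

```javascript
// Both strings denote the same instant, just expressed in different offsets,
// so any instant-based comparison (like DateTimeOffset equality in .NET)
// treats them as equal; that is why the round trip through GMT slipped
// past my unit tests.
var local = new Date("2014-04-24T14:31:00+02:00");
var utc = new Date("2014-04-24T12:31:00Z");
var sameInstant = local.getTime() === utc.getTime(); // true
```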

So what did I do?

I still needed to store DateTimeOffset values in Azure Table Storage, so I converted them to strings.  Not that revolutionary, but it works quite nicely.  In case I ever needed to sort them as DateTimeOffsets I used the format string “yyyyMMddHHmmssfffffffzzz”, so

01 May 2014 23:11:19 +01:00

becomes:

201405012311190000000+01:00
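For illustration, here is a hypothetical JavaScript sketch of the same field layout (the real conversion in my code is done by .NET’s ToString and ParseExact, shown further down; formatSortable is an invented name, not part of any library):

```javascript
// Rearrange an ISO-8601 string such as "2014-05-01T23:11:19+01:00" into the
// sortable "yyyyMMddHHmmssfffffffzzz" layout: date and time digits first,
// fractional seconds padded to seven places, offset suffix kept at the end.
function formatSortable(iso) {
  var m = iso.match(
    /^(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2})(?:\.(\d+))?([+-]\d{2}:\d{2}|Z)$/
  );
  if (!m) {
    throw new Error("unrecognised date/time string: " + iso);
  }
  var fraction = (m[7] || "").padEnd(7, "0").slice(0, 7);
  var offset = m[8] === "Z" ? "+00:00" : m[8];
  return m[1] + m[2] + m[3] + m[4] + m[5] + m[6] + fraction + offset;
}

formatSortable("2014-05-01T23:11:19+01:00"); // "201405012311190000000+01:00"
```

One caveat worth remembering: a plain string sort of these values only matches chronological order while every value carries the same offset, because the digits are local time.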

It turned out to be a fairly easy change to introduce, because all my ReadEntity and WriteEntity implementations made heavy use of the same set of extension methods, which look roughly like this:


public static void AddPropertyIfNotNull(this Dictionary<string, EntityProperty> properties, string propertyName, int? propertyValue)
{
    if (propertyValue != null)
    {
        properties.Add(propertyName, EntityProperty.GeneratePropertyForInt(propertyValue));
    }
}
 
public static int? GetInt32(this IDictionary<string, EntityProperty> properties, string propertyName)
{
    return getValue(properties, propertyName, ep => ep.Int32Value, () => null);
} 

private static T getValue<T>(IDictionary<string, EntityProperty> properties, string propertyName, Func<EntityProperty, T> valueAccessor, Func<T> nullValue)
{
    if (properties.ContainsKey(propertyName))
    {
        return valueAccessor(properties[propertyName]);
    }
    else
    {
        return nullValue();
    }
} 

I have AddPropertyIfNotNull and GetXXX methods for every primitive type that I need, including DateTimeOffset.  So all I had to do was change the extension methods for DateTimeOffset to look like this:


private const string DATETIMEOFFSET_FORMAT = "yyyyMMddHHmmssfffffffzzz";

public static DateTimeOffset? GetDateTimeOffset(this IDictionary<string, EntityProperty> properties, string propertyName)
{
    string valueAsString = getValue(properties, propertyName, ep => ep.StringValue, () => null);
    if (valueAsString == null)
    {
        return null;
    }
    else
    {
        // InvariantCulture so the fixed format round-trips regardless of the thread culture
        return DateTimeOffset.ParseExact(valueAsString, DATETIMEOFFSET_FORMAT, CultureInfo.InvariantCulture);
    }
}
 
public static void AddPropertyIfNotNull(this Dictionary<string, EntityProperty> properties, string propertyName, DateTimeOffset? propertyValue)
{
    if (propertyValue != null)
    {
        properties.Add(propertyName, EntityProperty.GeneratePropertyForString(propertyValue.Value.ToString(DATETIMEOFFSET_FORMAT, CultureInfo.InvariantCulture)));
    }
}

And the rest of the application carried on working as normal.

And if Azure Table Storage starts supporting DateTimeOffset natively before I go live, all I have to do is switch it back…

Monday, 3 February 2014

Customising Backbone’s Sync Module

I’ve started using the Backbone MVC JavaScript framework recently, and have been pleasantly surprised by how easy it is to customise bits of the framework when I need some additional functionality.
Backbone communicates with backend web services using the Backbone.sync function; this function examines the model object being synchronised to determine whether the operation is a “create”, “read”, “update” or “delete”, and then uses jQuery.ajax to perform an HTTP POST, GET, PUT or DELETE on the backend web service.  The change I want to make is to use PROPFIND and PROPPATCH instead of GET and PUT.  I may go into the reasons for this change more in subsequent posts, but for now just trust me (please) that there is a reason why I want to do this.
One of the reasons that I chose Backbone was this section from the “Extending Backbone” section in the documentation:
Many JavaScript libraries are meant to be insular and self-enclosed, where you interact with them by calling their public API, but never peek inside at the guts. Backbone.js is not that kind of library.
Because it serves as a foundation for your application, you're meant to extend and enhance it in the ways you see fit.
So here goes!  The Backbone.sync function has the signature function (method, model, options), where method = “create” | “read” | “update” | “delete”.  Looking at the source, the first line of the function does this:

var type = methodMap[method];

and the definition of methodMap is:

var methodMap = {
    'create': 'POST',
    'update': 'PUT',
    'patch':  'PATCH',
    'delete': 'DELETE',
    'read':   'GET'
  };


So to use PROPFIND and PROPPATCH instead of GET and PUT, I should be able to simply create a different MyApp.methodMap hash (the original variable is private to the anonymous function that defines the Backbone namespace and functions), create a new MyApp.sync function which is an exact copy of Backbone.sync but references my new MyApp.methodMap hash, and then replace Backbone.sync with MyApp.sync.  Something like this:

MyApp.methodMap = {'create': 'POST', 'update': 'PROPPATCH', 'patch':'PATCH', 'delete': 'DELETE', 'read':'PROPFIND'};
MyApp.sync = function (method, model, options) {
     var type = MyApp.methodMap[method];

     // all the rest of Backbone.sync
};
Backbone.sync = MyApp.sync;


But it doesn’t work.  Monitoring the HTTP traffic using Fiddler confirms that the PROPFIND verb is being sent correctly when I call Model.fetch() but my model isn’t actually being populated with the data returned.  Looking further into the original Backbone.sync function, on about the 42nd line it decides whether to process the returned data based upon the HTTP verb that it’s sending:

// Don't process data on a non-GET request.
if (params.type !== 'GET' && !options.emulateJSON) {
  params.processData = false;
}


I just have to replace the ‘GET’ with a ‘PROPFIND’ and my model objects are populated correctly.
So to recap, all I had to do to make Backbone use PROPFIND and PROPPATCH instead of GET and PUT was to:
  1. Create a new MyApp.sync function and MyApp.methodMap hash that were exact copies of the originals.
  2. Modify the new MyApp.sync function to reference MyApp.methodMap.
  3. Modify MyApp.methodMap to return PROPFIND for read and PROPPATCH for update.
  4. Modify MyApp.sync to process returned data for PROPFIND instead of GET.
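The four steps can be condensed into a small self-contained sketch; buildAjaxParams below is an illustrative stand-in for the copied Backbone.sync body (which is too long to reproduce in full), showing just the two lines that change:

```javascript
var MyApp = {};

// Step 3: the replacement verb map, with read and update remapped.
MyApp.methodMap = {
  'create': 'POST',
  'update': 'PROPPATCH',
  'patch':  'PATCH',
  'delete': 'DELETE',
  'read':   'PROPFIND'
};

// Steps 1, 2 and 4 condensed: given a Backbone method name, build the
// jQuery.ajax params that the copied sync body would pass on.
MyApp.buildAjaxParams = function (method, url) {
  var type = MyApp.methodMap[method];  // step 2: reference MyApp.methodMap
  var params = { url: url, type: type, dataType: 'json' };
  if (type !== 'PROPFIND') {           // step 4: originally "!== 'GET'"
    params.processData = false;
  }
  return params;
};
```

With the full copy in place, the final wiring is still just Backbone.sync = MyApp.sync;.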
You may want to make quite different changes to Backbone.sync, but I hope you are inspired by reading this to have a go at it because it isn’t difficult.

Tuesday, 21 January 2014

Get some Backbone!

I’ve started looking at MVC frameworks for JavaScript recently.  For anyone else attempting this I would recommend the excellent TodoMVC site, which contains the same simple application (a TODO list) coded using many different MVC frameworks.  If you’ve got enough time on your hands, I can see it would be a great idea to look in detail through all the implementations and pick the one that has the best combination of good technical features, flexibility and ongoing development.
But I didn’t have that much time, so I had to pick one.
I chose Backbone because it has:
  • Many live sites.
  • Regular releases.
  • Fairly lightweight approach (it describes itself as “a library not a framework”): it can be used in a variety of ways rather than prescribing just one way that you should use it.  This is particularly important to me given that there is no consensus yet on how to “do” MVC in JavaScript, so I want a framework that is flexible enough if I change my mind halfway through my project.
  • Open architecture: you are positively encouraged to make changes to the library if you don’t like the way part of it works.  Obviously JavaScript’s dynamic nature makes it a lot easier to do this than I’m used to, being a .NET statically typed man by training.
That’s all for now, I will let you know about my success (or otherwise) with Backbone in future posts.