When Json Serializations Go Horribly Wrong

by Vladimir Kocjancic, December 28th, 2020

Too Long; Didn't Read

From DateTime serialization issues to the fact that serializing DateTime.MinValue causes a crash, because the UTC-converted value is less than the minimum allowed DateTime value.

Recently, I have been dealing with too many issues regarding .NET JSON serialization on one side and JavaScript (de)serialization on the other to keep much confidence in this technology: from DateTime serialization issues, to the mere fact that serializing DateTime.MinValue causes a fatal crash because the value, once converted to UTC, falls below the minimum allowed DateTime value. But the last one was a drop too many.

Before I go any further, I would like to state that everything said here definitely applies to .NET 4.8.x using the latest Newtonsoft.Json package. I have yet to test it on .NET 5, but I have an odd feeling the issue remains.

The problem

I lost a lot of time trying to circumvent this issue. We sign our responses on the server and send the hash along to the client, which then verifies that the hash is valid.

Naturally, in our production code we are doing much more complex stuff
than shown in this article, but we'll keep it simple for easier understanding.

Imagine your .NET API produces a simple JSON response:

{"Value":385.0}

The SHA-2 hash value of this is:

E2A5770B9E63DCC04B1A417E8E6DEE4E83619CA87D6A22A49CEEAC9925C6643.
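
For reference, here is a minimal sketch of how such a server-side hash can be computed. The post only says "SHA-2", so SHA-256 over the UTF-8 bytes of the serialized JSON is an assumption, and the class name is mine:

using System;
using System.Security.Cryptography;
using System.Text;

public static class ResponseSigner
{
    // Assumption: the signature is a SHA-256 hash of the UTF-8 bytes
    // of the response body, exactly as it is sent to the client.
    public static string ComputeHash(string json)
    {
        using (var sha = SHA256.Create())
        {
            byte[] hash = sha.ComputeHash(Encoding.UTF8.GetBytes(json));
            return BitConverter.ToString(hash).Replace("-", "");
        }
    }
}

ResponseSigner.ComputeHash("{\"Value\":385.0}") then travels to the client alongside the response.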

This data now gets sent to the JavaScript client. In order to check the signature, the client code must convert the JSON object back to a string and calculate its hash. You can do that by calling JSON.stringify() on the object and using crypto.js to calculate the SHA-2 hash. All fine and well, except the hash on the JS client is:

99F411EF3B0CB566199EFA6835C33DE0727690325B155B4FC5C5FA8A340AA714.

Not quite what we expected. But in order to know why this happens, you need to understand...

Decimal serialization in .NET

Decimal serialization in .NET is a funny ordeal. If you read the documentation (and you really, really should), you know that serialization keeps the number of decimal places your program assigned to a variable. The reason is: "we don't want to change your data". That I can get my head around.
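
The decimal type itself remembers the scale it was written with, which is what the serializer preserves. A quick illustration with no JSON involved (the variable names are mine):

decimal a = 385.00000M;
decimal b = 385M;

// A decimal keeps the scale of the literal it was assigned from.
System.Console.WriteLine(a);   // prints 385.00000
System.Console.WriteLine(b);   // prints 385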

However, what is more difficult to explain is that this statement does not apply at all times. For instance, imagine you have an object with a decimal value of 385.00000. The JSON representation of such an object will be something along the lines of:

{"Value":385.00000}
. Expected, and nothing special.

However, if you set a decimal value of 385M, the JSON representation is now {"Value":385.0} and not {"Value":385}, as one would expect. So much for the "we don't want to change your data" mantra. (The reason is that Newtonsoft's writer appends ".0" to a decimal with no decimal point, so the value still reads back as a floating-point number rather than an integer.)

And, if you think I did anything special, I present you with the code that generates said result:

using Newtonsoft.Json;

public class SampleObject
{
    public decimal Value { get; set; }
}

var obj = new SampleObject() { Value = 385M };

// Prints {"Value":385.0} even though the value was assigned without decimal places.
System.Console.WriteLine(JsonConvert.SerializeObject(obj));

But this only introduces the problem. To know the whole story, you need to know...

JSON serialization in JavaScript

JSON serialization in JavaScript is natively supported by the JSON.stringify(object) method. But using this method on the above object, {"Value":385.0}, returns a string representation of {"Value":385}, which is not exactly the expected behaviour.

The exact same conversion happens no matter how many decimal zeros your value has. So if your object is {"Value":385.00000}, calling JSON.stringify on that object will produce {"Value":385}.

If you check Google on this, you will get the typical arrogant programmer's answer (which is the problem with a LOT of JavaScript features): "385 is exactly the same value as 385.0". True. Except when you try to check a digital signature of the passed data. Then, 385 and 385.0 are as different as night and day.

And now, we get to the trickiest part of them all...

How do I circumvent this?

First, JavaScript gives you no weapons to attack this issue, except for some iffy string-replacement techniques.

Hence, your only option is to format the response "properly" in .NET code. Except this cannot be done in a straightforward way since, as we have seen, 385M serializes into 385.0.

Lucky for us, the Newtonsoft.Json library offers the ability to write custom JsonConverters. Finally, some good news. We "only" need to write our own converter. But how do we convince our converter to use only as many decimal places as needed? A lot of "googling" later, there seem to be about two passable solutions. One involves formatting the number with the "G29" format string. The other involves dividing the value by 1.00000000000000000M. Both produce similar results.
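
Both tricks can be seen in isolation, outside of any converter; a rough illustration (the variable name is mine):

decimal value = 385.0M;

// "G29" formats with up to 29 significant digits and drops trailing zeros.
System.Console.WriteLine(value.ToString("G29"));          // prints 385

// Dividing by 1 with a long fractional part pushes the result
// to the smallest scale that still represents it exactly.
System.Console.WriteLine(value / 1.00000000000000000M);   // prints 385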

I started typing and several minutes later, I ended up with a converter like this:

public class DecimalFormatConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return (objectType == typeof(decimal)
            || objectType == typeof(decimal?));
    }

    public override void WriteJson(JsonWriter writer, object value,
                                   JsonSerializer serializer)
    {
        writer.WriteValue(Convert.ToDecimal(value).ToString("G29"));
    }

    public override bool CanRead { get { return false; } }

    public override object ReadJson(JsonReader reader, Type objectType,
                                    object existingValue, JsonSerializer serializer)
    {
        throw new NotImplementedException();
    }
}

All fine and dandy, except this now serializes into {"Value":"385"}, which, again, produces an incorrect hash of:

EC53BDEEC861E050E56FDA51B48621D0452006247D1501D79CF63A4C749E513F.

In order to return the value as a number and not a string, one needs to get a little bit more creative:

public class DecimalFormatConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return (objectType == typeof(decimal)
            || objectType == typeof(decimal?));
    }

    public override void WriteJson(JsonWriter writer, object value,
                                   JsonSerializer serializer)
    {
        var valCasted = Convert.ToDecimal(value);
        if (Math.Round(valCasted, 10) == Math.Truncate(valCasted))
        {
            // No meaningful fractional part: write the value as an integer.
            writer.WriteValue((int)Math.Truncate(valCasted));
        }
        else
        {
            writer.WriteValue(valCasted);
        }
    }

    public override bool CanRead { get { return false; } }

    public override object ReadJson(JsonReader reader, Type objectType,
                                    object existingValue, JsonSerializer serializer)
    {
        throw new NotImplementedException();
    }
}

What this piece of code does is: if the decimal value, rounded to 10 decimal places, equals the truncated decimal value, it serializes the number as an integer. Otherwise, it outputs it as it is.

But why 10 decimals? Sometimes a floating-point error can make your value 385.0000000000004. That is supposed to be 385, but it is not. And remember what was said in the chapter about decimal serialization in .NET: serialization respects your data (well, almost) and keeps the number of decimal places. So the value gets serialized as is: 385.0000000000004. Rounding to 10 decimal places gets rid of that.
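
For completeness, here is roughly how the converter gets plugged in. The original post does not show the registration, so the settings object below is an assumption:

using Newtonsoft.Json;

var settings = new JsonSerializerSettings();
settings.Converters.Add(new DecimalFormatConverter());

var obj = new SampleObject() { Value = 385M };

// With the converter registered, integral decimals are written without ".0".
System.Console.WriteLine(JsonConvert.SerializeObject(obj, settings));   // {"Value":385}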

Doing all that, the serialized value in .NET is finally

{"Value":385}
and JavaScript serialization produces the exact same result. Hence, both hashes are equal and the response is considered valid.

Previously published at https://www.lotushints.com/