Top 5 Fastest WCF Serializers

Recently I have been using WCF again at work, and I wondered how slow (or fast) the latest DataContractSerializer is. I was pleasantly surprised to find it fairly similar in speed to the JSON serializer.

[Chart: serialization time, 100k items]

[Chart: memory usage, 100k items]

I ran speed and memory benchmarks on the following five serializers:

DataContractSerializer

This is the default WCF serializer and is used by specifying the DataContract and DataMember attributes on your classes such as the following:

[DataContract]
public class DtoDataContract : IDataContract
{
    [DataMember]
    public long Id { get; set; }

    [DataMember]
    public string Name { get; set; }

    [DataMember]
    public string Description { get; set; }

    // Other Members
}
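For reference, here is a minimal sketch of serializing a list of these DTOs with DataContractSerializer. It uses a trimmed-down version of the class so the snippet stands alone:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization;
using System.Text;

// Serialize a list of DTOs to XML with DataContractSerializer.
var serializer = new DataContractSerializer(typeof(List<DtoDataContract>));

using var stream = new MemoryStream();
serializer.WriteObject(stream, new List<DtoDataContract>
{
    new DtoDataContract { Id = 0, Name = "Bernier Group" }
});

var xml = Encoding.UTF8.GetString(stream.ToArray());
System.Console.WriteLine(xml);

// Trimmed-down version of the DTO class shown above.
[DataContract]
public class DtoDataContract
{
    [DataMember] public long Id { get; set; }
    [DataMember] public string Name { get; set; }
}
```
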

This generates XML with each class and field as a tag, such as the following:

<DtoDataContract i:type="ArrayOfDtoDataContract" xmlns="http://schemas.datacontract.org/2004/07/Benchmark" xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
    <DtoDataContract>
        <Balance>736.13</Balance>
        <Created>2017-02-22T09:04:29.0656576+00:00</Created>
        <Description>Open-architected responsive service-desk</Description>
        <Id>0</Id>
        <Interest>506.00</Interest>
        <Limit>230.27</Limit>
        <Name>Bernier Group</Name>
        <Transaction>260.22</Transaction>
    </DtoDataContract>
</DtoDataContract>

Size: 473 Bytes

NetDataContractSerializer

This is an alternative serializer that comes out of the box with WCF. It is very similar to DataContractSerializer, but it embeds .NET type, assembly, and object-reference information in each tag, which preserves full type fidelity at the cost of requiring the client to share the same CLR types.
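Using it is largely a drop-in swap. A minimal sketch follows, again with a trimmed-down DTO class; note that no type argument is needed, since the CLR type names are embedded in the XML itself (NetDataContractSerializer is available on the full .NET Framework only):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization;
using System.Text;

// NetDataContractSerializer needs no type argument: the CLR type
// names are written into the output. Full .NET Framework only.
var serializer = new NetDataContractSerializer();

using var stream = new MemoryStream();
serializer.WriteObject(stream, new List<DtoDataContract>
{
    new DtoDataContract { Id = 0, Name = "Bernier Group" }
});

var xml = Encoding.UTF8.GetString(stream.ToArray());
System.Console.WriteLine(xml);

// Trimmed-down version of the DTO class shown earlier.
[DataContract]
public class DtoDataContract
{
    [DataMember] public long Id { get; set; }
    [DataMember] public string Name { get; set; }
}
```
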

This serializer generates the following output:

<ArrayOfDtoDataContract z:Id="1" z:Type="System.Collections.Generic.List`1[[Benchmark.DtoDataContract, Benchmark, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null]]" z:Assembly="0" xmlns="http://schemas.datacontract.org/2004/07/Benchmark" xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns:z="http://schemas.microsoft.com/2003/10/Serialization/">
    <_items z:Id="2" z:Size="4">
        <DtoDataContract z:Id="3">
            <Balance>736.13</Balance>
            <Created>2017-02-22T09:04:29.0656576+00:00</Created>
            <Description z:Id="4">Open-architected responsive service-desk</Description>
            <Id>0</Id>
            <Interest>506.00</Interest>
            <Limit>230.27</Limit>
            <Name z:Id="5">Bernier Group</Name>
            <Transaction>260.22</Transaction>
        </DtoDataContract>
        <DtoDataContract i:nil="true" />
        <DtoDataContract i:nil="true" />
        <DtoDataContract i:nil="true" />
    </_items>
    <_size>1</_size>
    <_version>1</_version>
</ArrayOfDtoDataContract>

Size: 874 Bytes

Newtonsoft.JSON
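Json.NET requires no attributes at all: public properties are serialized by default. A minimal sketch, assuming the Newtonsoft.Json NuGet package and a trimmed-down DTO class, serializing and round-tripping the same data:

```csharp
using System.Collections.Generic;
using Newtonsoft.Json;

var dtos = new List<DtoDataContract>
{
    new DtoDataContract { Id = 0, Name = "Bernier Group" }
};

// Serialize to an indented JSON string, then deserialize it back.
var json = JsonConvert.SerializeObject(dtos, Formatting.Indented);
var roundTripped = JsonConvert.DeserializeObject<List<DtoDataContract>>(json);

// Trimmed-down version of the DTO class; no attributes needed.
public class DtoDataContract
{
    public long Id { get; set; }
    public string Name { get; set; }
}
```

Serializing the benchmark list produces the output below:
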

[  
   {  
      "Id":0,
      "Name":"Bernier Group",
      "Description":"Open-architected responsive service-desk",
      "Created":"2017-02-22T09:04:29.0656576+00:00",
      "Balance":736.13,
      "Transaction":260.22,
      "Interest":506.00,
      "Limit":230.27
   }
]

Size: 207 Bytes

ProtoBuf

I decided to compare Google's Protocol Buffers, using the .NET implementation by Marc Gravell, protobuf-net.

To use protobuf-net, you decorate the class with ProtoContract and each member with ProtoMember(x), where x is the unique field number that identifies the member in the wire format.

[ProtoContract]
public class DtoDataContract : IDataContract
{
    [ProtoMember(1)]
    public long Id { get; set; }

    [ProtoMember(2)]
    public string Name { get; set; }

    [ProtoMember(3)]
    public string Description { get; set; }

    // Other members
}
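Serializing is then a single call. A sketch, assuming the protobuf-net NuGet package and a trimmed-down version of the class above:

```csharp
using System.Collections.Generic;
using System.IO;
using ProtoBuf;

var dtos = new List<DtoDataContract>
{
    new DtoDataContract { Id = 0, Name = "Bernier Group" }
};

// Serialize to a binary stream, then rewind and deserialize it back.
using var stream = new MemoryStream();
Serializer.Serialize(stream, dtos);

stream.Position = 0;
var copy = Serializer.Deserialize<List<DtoDataContract>>(stream);

// Trimmed-down version of the DTO class shown above.
[ProtoContract]
public class DtoDataContract
{
    [ProtoMember(1)] public long Id { get; set; }
    [ProtoMember(2)] public string Name { get; set; }
}
```
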

The result is a raw binary stream such as:

f\u0012
Bernier Group\u001a(Open-architected responsive service-desk"\u000b\u0008�Ũ����4\u0010\u0005*\u0006\u0008��\u0004\u0018\u00042\u0006\u0008��\u0001\u0018\u0004:\u0006\u0008��\u0003\u0018\u0004B\u0006\u0008�\u0001\u0018\u0004

Size: 104 Bytes

BinaryWriter

This is a custom, hand-rolled serializer for my class using BinaryWriter and BinaryReader, such as the following:

Serializing

var writer = new BinaryWriter(stream);

// Write the item count first so the deserializer knows how many to read
writer.Write(_items.Count);

foreach (var item in _items)
{
    writer.Write(item.Id);
    writer.Write(item.Balance);
    writer.Write(item.Description);
    writer.Write(item.Interest);
    writer.Write(item.Transaction);
    writer.Write(item.Limit);
    writer.Write(item.Name);
    writer.Write(item.Created.ToBinary());
}

Deserializing

var reader = new BinaryReader(stream);

var count = reader.ReadInt32();
var list = new List<DtoDataContract>(count);

for (var i = 0; i < count; i++)
{
    var item = new DtoDataContract
    {
        Id = reader.ReadInt64(),
        Balance = reader.ReadDecimal(),
        Description = reader.ReadString(),
        Interest = reader.ReadDecimal(),
        Transaction = reader.ReadDecimal(),
        Limit = reader.ReadDecimal(),
        Name = reader.ReadString(),
        Created = DateTime.FromBinary(reader.ReadInt64())
    };

    list.Add(item);
}

Surprisingly, this produces a binary stream roughly 30% larger than protobuf-net's, but it is faster.

\u0001\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000�\u001f\u0001\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0002\u0000(Open-architected responsive service-desk��\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0002\u0000�e\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0002\u0000�Y\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0002\u0000

Size: 139 Bytes
