Monday, September 17, 2018

Parsing SAML 1.1 (WS-Federation) tokens without the WSFam module

Occasionally there's a scenario where a SAML token must be parsed without the WSFederationAuthenticationModule. Note that when the module can be used, parsing is straightforward.

For us, it was one of our old applications that still can't be upgraded to .NET 4.5 (because of reasons ;)), and we wanted to drop the old WIF runtime (the one that targets older .NET versions). For someone else it could be a different scenario, e.g. you have the SAML token as a string and just want the IPrincipal out of it.

The solution is to think of the token as if it were regular XMLDSig-signed XML - the assertion node is signed and the signature's reference points back to it:

<?xml version="1.0"?>
<t:RequestSecurityTokenResponse xmlns:t="http://schemas.xmlsoap.org/ws/2005/02/trust">
  <t:Lifetime>
    <wsu:Created xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">2018-09-18</wsu:Created>
    <wsu:Expires xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">2018-09-18</wsu:Expires>
  </t:Lifetime>
  <wsp:AppliesTo xmlns:wsp="http://schemas.xmlsoap.org/ws/2004/09/policy">
    <wsa:EndpointReference xmlns:wsa="http://www.w3.org/2005/08/addressing">
      <wsa:Address>...</wsa:Address>
    </wsa:EndpointReference>
  </wsp:AppliesTo>
  <t:RequestedSecurityToken>
    <saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion" MajorVersion="1" MinorVersion="1"
        AssertionID="_assertionID" Issuer="http://issuer" IssueInstant="2018-09-18">
      <saml:Conditions NotBefore="2018-09-18" NotOnOrAfter="2018-09-18" />
      <saml:AttributeStatement>
        <saml:Subject>...</saml:Subject>
        <saml:Attribute AttributeName="windowsaccountname"
            AttributeNamespace="http://schemas.microsoft.com/ws/2008/06/identity/claims">
          <saml:AttributeValue>...</saml:AttributeValue>
        </saml:Attribute>
      </saml:AttributeStatement>
      <ds:Signature xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
        <ds:SignedInfo>
          <ds:CanonicalizationMethod Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#"/>
          <ds:SignatureMethod Algorithm="http://www.w3.org/2000/09/xmldsig#rsa-sha1"/>
          <ds:Reference URI="#_assertionID">
            <ds:Transforms>
              <ds:Transform Algorithm="http://www.w3.org/2000/09/xmldsig#enveloped-signature"/>
              <ds:Transform Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#"/>
            </ds:Transforms>
            <ds:DigestMethod Algorithm="http://www.w3.org/2000/09/xmldsig#sha1"/>
            <ds:DigestValue>...</ds:DigestValue>
          </ds:Reference>
        </ds:SignedInfo>
        <ds:SignatureValue>...</ds:SignatureValue>
        <KeyInfo xmlns="http://www.w3.org/2000/09/xmldsig#">
          <X509Data>
            <X509Certificate>...</X509Certificate>
          </X509Data>
        </KeyInfo>
      </ds:Signature>
    </saml:Assertion>
  </t:RequestedSecurityToken>
</t:RequestSecurityTokenResponse>
What you should do is:
  1. validate the signature
  2. accept or reject the signature's certificate
  3. parse the token to retrieve the claims required to create the IPrincipal
The code is rather simple. What's interesting, however, is that the SignedXml class has to be subclassed to provide a signature validator that follows the AssertionID attribute (the default convention is that the signed node's id attribute is called just ID, and the default validator won't find a node whose id attribute is named differently):
    public class SamlSignedXml : SignedXml
    {
        public SamlSignedXml(XmlElement e) : base(e) { }

        public override XmlElement GetIdElement(XmlDocument document, string idValue)
        {
            XmlNamespaceManager mgr = new XmlNamespaceManager(document.NameTable);
            mgr.AddNamespace("trust", "http://schemas.xmlsoap.org/ws/2005/02/trust");
            mgr.AddNamespace("wsu", "http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd");
            mgr.AddNamespace("saml", "urn:oasis:names:tc:SAML:1.0:assertion");

            XmlElement assertionNode = (XmlElement)document.SelectSingleNode(
                "//trust:RequestSecurityTokenResponseCollection/trust:RequestSecurityTokenResponse/" +
                "trust:RequestedSecurityToken/saml:Assertion", mgr);

            if (assertionNode != null &&
                assertionNode.Attributes["AssertionID"] != null &&
                string.Equals(assertionNode.Attributes["AssertionID"].Value, idValue, StringComparison.InvariantCultureIgnoreCase))
                return assertionNode;

            return null;
        }
    }
Note that the XPath assumes the token has the RequestSecurityTokenResponseCollection in the root; make sure your tokens follow this convention (in the case of a single token, the collection node can be missing and the token's root could be just RequestSecurityTokenResponse - update the code accordingly).
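If you'd rather not care which of the two roots you get, the lookup can be made tolerant. This is a minimal sketch under assumed conventions: the namespace URIs are the standard WS-Trust 2005/02 and SAML 1.1 ones, and the descendant axis simply skips the optional collection wrapper; adjust if your STS uses different namespace versions.

```csharp
using System.Xml;

public static class AssertionLocator
{
    // Works both when the root is the RequestSecurityTokenResponseCollection
    // and when it is a bare RequestSecurityTokenResponse (single-token case).
    public static XmlElement FindAssertion(XmlDocument document)
    {
        var mgr = new XmlNamespaceManager(document.NameTable);
        mgr.AddNamespace("trust", "http://schemas.xmlsoap.org/ws/2005/02/trust");
        mgr.AddNamespace("saml", "urn:oasis:names:tc:SAML:1.0:assertion");

        // the descendant axis ("//") skips any wrapper above RequestedSecurityToken
        return (XmlElement)document.SelectSingleNode(
            "//trust:RequestedSecurityToken/saml:Assertion", mgr);
    }
}
```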

The validation code is then

// token is the string representation of the SAML1 token
// expectedCertThumb is the expected certificate's thumbprint
protected bool ValidateToken( string token, string expectedCertThumb, out string userName )
{
    userName = string.Empty;

    if (string.IsNullOrEmpty(token)) return false;

    var xd = new XmlDocument();
    xd.PreserveWhitespace = true; // mandatory, otherwise the signature check fails
    xd.LoadXml(token);

    XmlNamespaceManager mgr = new XmlNamespaceManager(xd.NameTable);
    mgr.AddNamespace("trust", "http://schemas.xmlsoap.org/ws/2005/02/trust");
    mgr.AddNamespace("wsu", "http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd");
    mgr.AddNamespace("saml", "urn:oasis:names:tc:SAML:1.0:assertion");

    // assertion
    XmlElement assertionNode = (XmlElement)xd.SelectSingleNode(
        "//trust:RequestSecurityTokenResponseCollection/trust:RequestSecurityTokenResponse/" +
        "trust:RequestedSecurityToken/saml:Assertion", mgr);
    if (assertionNode == null) return false;

    // signature
    XmlElement signatureNode = (XmlElement)xd.GetElementsByTagName("Signature", SignedXml.XmlDsigNamespaceUrl)[0];

    var signedXml = new SamlSignedXml( assertionNode );
    signedXml.LoadXml( signatureNode );

    X509Certificate2 certificate = null;
    foreach (KeyInfoClause clause in signedXml.KeyInfo)
        if (clause is KeyInfoX509Data)
            if (((KeyInfoX509Data)clause).Certificates.Count > 0)
                certificate = (X509Certificate2)((KeyInfoX509Data)clause).Certificates[0];

    // cert node missing
    if (certificate == null) return false;

    // check the signature
    var signatureValidationResult = signedXml.CheckSignature(certificate, true);

    if (signatureValidationResult == false) return false;

    // validate cert thumb
    if ( !string.IsNullOrEmpty( expectedCertThumb ) )
        if ( !string.Equals( expectedCertThumb, certificate.Thumbprint, StringComparison.OrdinalIgnoreCase ) )
            return false;

    // expires
    var expNode = xd.SelectSingleNode("//trust:RequestSecurityTokenResponseCollection/trust:RequestSecurityTokenResponse/trust:Lifetime/wsu:Expires", mgr );
    if (expNode == null) return false;

    DateTime expireDate;

    if (!DateTime.TryParse(expNode.InnerText, null, System.Globalization.DateTimeStyles.AdjustToUniversal, out expireDate))
        return false; // wrong date

    if (DateTime.UtcNow > expireDate) return false; // token too old

    // claims - retrieve the user name
    var claimNodes = xd.SelectNodes(
        "//trust:RequestSecurityTokenResponseCollection/trust:RequestSecurityTokenResponse/" +
        "trust:RequestedSecurityToken/saml:Assertion/saml:AttributeStatement/saml:Attribute", mgr );
    foreach ( XmlNode claimNode in claimNodes )
    {
        if ( claimNode.Attributes["AttributeName"] != null &&
             claimNode.Attributes["AttributeNamespace"] != null &&
             string.Equals( claimNode.Attributes["AttributeName"].Value, "name", StringComparison.InvariantCultureIgnoreCase ) &&
             string.Equals( claimNode.Attributes["AttributeNamespace"].Value, "http://schemas.xmlsoap.org/ws/2005/05/identity/claims", StringComparison.InvariantCultureIgnoreCase ) &&
             claimNode.ChildNodes.Count == 1 )
        {
            userName = claimNode.ChildNodes[0].InnerText;
            return true;
        }
    }

    return false;
}
A couple of comments here.

First, the XPath could possibly be shortened to account for the possibility of a missing collection node.

Then, the code assumes it's the name claim that contains the username, but it could be the windowsaccountname or yet another claim type.
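To close the loop on step 3 from the list above - once ValidateToken succeeds, a minimal IPrincipal can be built from the extracted user name. A sketch; the GenericPrincipal, the "Federation" authentication type, and the empty role list below are illustrative choices, not anything mandated by the token:

```csharp
using System.Security.Principal;

public static class PrincipalFactory
{
    // Wraps the user name extracted from the token in the simplest possible
    // IPrincipal. Passing a non-empty authentication type makes
    // Identity.IsAuthenticated return true.
    public static IPrincipal CreatePrincipal(string userName)
    {
        var identity = new GenericIdentity(userName, "Federation");
        return new GenericPrincipal(identity, new string[0]);
    }
}
```

From here the principal can be assigned to HttpContext.Current.User and Thread.CurrentPrincipal, just like the WSFederationAuthenticationModule would do.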

Friday, September 14, 2018

WCF and default serialization of requests and responses

A short story of something new we've learned about how exactly WCF serializes the data that is sent over the wire.


Before WCF, the default way to serialize objects to XML was the XmlSerializer. It works, and of course has its shortcomings when it comes to serializing complex types and collections.

When WCF was introduced, a couple of new serializers were brought into the Base Class Library, including the DataContractSerializer and the NetDataContractSerializer. New serializers mean new features, and comparison charts are available (e.g. this one by Sebastian Krysmanski).

If you, like us, lived in a simple world where WCF just uses the new set of serializers, read on.

Usually, where both the service and the client are .NET apps, web services can be designed by writing down C# interfaces and data models first. I'd call this the code first approach - you share code between the service and the client:

// common, shared between the service and the client
public class DataModel
{
    public string Whatever { get; set; }
}

[ServiceContract]
public interface IServiceContract
{
    [OperationContract]
    void DoWork( DataModel model );
}
Then, the server just implements the interface and exposes the service using a service host (IIS/self-host):
public class ServiceImpl : IServiceContract
and the client uses the ChannelFactory or the ClientBase to easily get a proxy based on the same interface.
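The two sides of that code first setup can be sketched end to end. This is a self-contained demo under assumed names - IEchoContract, EchoService, and the net.pipe address are all made up for the example, and a trivial Echo operation stands in for DoWork so the round trip has an observable result:

```csharp
using System;
using System.ServiceModel;

[ServiceContract]
public interface IEchoContract
{
    [OperationContract]
    string Echo(string text);
}

public class EchoService : IEchoContract
{
    public string Echo(string text) => "echo: " + text;
}

public static class SelfHostDemo
{
    public static string Run()
    {
        var baseAddress = new Uri("net.pipe://localhost/echo-demo");

        // server side: self-host the implementation
        using (var host = new ServiceHost(typeof(EchoService), baseAddress))
        {
            host.AddServiceEndpoint(typeof(IEchoContract), new NetNamedPipeBinding(), string.Empty);
            host.Open();

            // client side: the shared interface is all the ChannelFactory needs
            var factory = new ChannelFactory<IEchoContract>(
                new NetNamedPipeBinding(), new EndpointAddress(baseAddress));
            IEchoContract proxy = factory.CreateChannel();

            var result = proxy.Echo("hello");

            ((IClientChannel)proxy).Close();
            factory.Close();
            return result;
        }
    }
}
```

In a real solution the contract and model live in a shared assembly, the host runs in IIS or a Windows service, and the binding/address come from configuration rather than code.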

A case of a unit test

Working on a complex integration project involving interoperable calls between a .NET client and a Java web service, we were faced with an approach we hadn't often followed before. Instead of the usual code first approach, we were given a couple of *.WSDL/*.XSD files - a model first (contract first) approach. Given these, you use an automated tool like xsd.exe or the newer svcutil.exe to generate the code from the models:

svcutil.exe /syncOnly /n:*,Test *.wsdl *.xsd

This approach was used, the code was generated, and someone tried to write a unit test to make sure the request body is serialized correctly, so that it meets the XML structure expected on the Java side. The unit test first used the DataContractSerializer, as we believed this is what WCF uses under the hood. The test code was basically something like:

DoWorkRequest request = new DoWorkRequest();
request.model = new DataModel(); // fill the model here

var serializer = new DataContractSerializer( typeof( DoWorkRequest ) );
var ms = new MemoryStream();

serializer.WriteObject( ms, request );

var requestXML = Encoding.UTF8.GetString( ms.ToArray() );


As it turned out, the serializer's output was something like

<DoWorkRequest ....
while the server's expectation was
<DoWork ....
(note the Request suffix missing from the root's name)

The test was obviously failing. We started an investigation.

What is really going on under the hood

After a couple of trials and errors involving other serializers and their settings, we found something that we had never manually put into the code in the code-first approach. It was the MessageContractAttribute placed over the request class by the generator:

[MessageContract(WrapperName = "DoWork", IsWrapped = true)]
public class DoWorkRequest {
    ...
}
Things started to get interesting: it looks like there's yet another serialization layer, not mentioned that often, that respects this attribute. Googling around reveals that WCF indeed uses an additional layer on top of the different serializers to give even more control over how your data is serialized when a web service is called. This leads directly to the TypedMessageConverter class and the code snippets people have already posted (e.g. this one by Stanislav Dvoychenko).

A solution, finally

The solution was to rewrite the unit test to actually use the TypedMessageConverter:

var request                       = new DoWorkRequest(...);

var converter                     = TypedMessageConverter.Create( request.GetType(), "*", string.Empty, new XmlSerializerFormatAttribute() );
var message                       = converter.ToMessage( request, MessageVersion.Soap11WSAddressing10 );

var writerSettings                = new XmlWriterSettings();
writerSettings.OmitXmlDeclaration = true;

var stream                        = new MemoryStream();
var writer                        = XmlWriter.Create( stream, writerSettings );

message.WriteMessage( writer );
writer.Flush();

var requestXML = Encoding.UTF8.GetString( stream.ToArray() );
which gives the exact SOAP message that can be peeked at using an HTTP debugger (you can possibly unpack the SOAP envelope it's wrapped in, in your unit test code).
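That unpacking step can be a small helper. A sketch, assuming a SOAP 1.1 envelope (the helper name and the idea of comparing just the Body's first element are mine, not part of the original test):

```csharp
using System.Xml;

public static class SoapHelper
{
    // Pulls the first element out of the SOAP Body so the unit test can
    // compare just the serialized request, not the whole envelope.
    public static XmlElement UnpackSoapBody(string soapEnvelope)
    {
        var doc = new XmlDocument();
        doc.LoadXml(soapEnvelope);

        var mgr = new XmlNamespaceManager(doc.NameTable);
        // SOAP 1.1 envelope namespace; use http://www.w3.org/2003/05/soap-envelope for SOAP 1.2
        mgr.AddNamespace("s", "http://schemas.xmlsoap.org/soap/envelope/");

        var body = (XmlElement)doc.SelectSingleNode("/s:Envelope/s:Body", mgr);
        return body != null ? (XmlElement)body.FirstChild : null;
    }
}
```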

Monday, July 30, 2018

A Fairy Tale of an Old Music Box

One of the things I really like doing in my spare time is playing the piano we bought recently. I also decided to check whether there is some decent score-writing software out there, and I was really surprised to find not only that such software has gotten much, much better over the years, but also that there are even some free yet advanced apps like MuseScore.

Anyway, this looks like my chance to write down a few ideas I've had on my mind for all these years since I finished my music education. And since sharing is one of the nice features of the software, please enjoy one of my compositions, the first one I wrote down with MuseScore, A Fairy Tale of an Old Music Box.

A Fairy Tale of an Old Music Box

Friday, July 13, 2018

.NET 4.7.1 (and higher) no longer supports SHA1 in SignedXml

There are plenty of subtle changes between .NET 4.7.0 (and lower) and .NET 4.7.1; however, one of the changes hurt us badly. It looks like SignedXml no longer supports SHA1 as the hashing method.

What it causes is the following exception:

System.Security.Cryptography.CryptographicException : Invalid algorithm specified
     at System.Security.Cryptography.Utils.SignValue(SafeKeyHandle hKey, Int32 keyNumber, Int32 calgKey, Int32 calgHash, Byte[] hash, Int32 cbHash, ObjectHandleOnStack retSignature)
     at System.Security.Cryptography.Utils.SignValue(SafeKeyHandle hKey, Int32 keyNumber, Int32 calgKey, Int32 calgHash, Byte[] hash)
     at System.Security.Cryptography.RSACryptoServiceProvider.SignHash(Byte[] rgbHash, Int32 calgHash)
     at System.Security.Cryptography.Xml.SignedXml.ComputeSignature()

The resolution is to put an additional entry in the app's config file that switches the use of insecure hashes back on:

<configuration>
  <runtime>
    <AppContextSwitchOverrides value="Switch.System.Security.Cryptography.Xml.UseInsecureHashAlgorithms=true;Switch.System.Security.Cryptography.Pkcs.UseInsecureHashAlgorithms=true" />
  </runtime>
</configuration>

If these switches seem to be ignored (we observed this in web apps where the entry was put in the web.config rather than an app.config), simply replace it with code put in the global app class, in Application_Start:

protected void Application_Start(object sender, EventArgs e)
{
    AppContext.SetSwitch("Switch.System.Security.Cryptography.Xml.UseInsecureHashAlgorithms", true);
    AppContext.SetSwitch("Switch.System.Security.Cryptography.Pkcs.UseInsecureHashAlgorithms", true);
}

Monday, July 2, 2018

Integration with ePUAP - a day without a change is a day wasted

Before the 30-06/01-07 weekend I blogged about a change in the ePUAP test environment that forcibly reinstated SHA1 (SHA256 stopped being supported) for constructing WS-* signatures in requests to the integration services. I also wrote that I hoped the situation would be sorted out.

Well, the situation was "sorted out" in the sense that since noon today (02-07) the ePUAP test environment (…) hosts yet another version, which once again enforces SHA256 and does not support SHA1.

A rhetorical question: how much longer can this merry-go-round last, and how long will we wait for a version that supports both kinds of hash function?

Friday, June 29, 2018

Integration with ePUAP - switching SHA256 back to SHA1

Today, Friday, June 29th 2018, around 2 PM, the ePUAP test environment was updated again. This time to a version that does not support the SHA256 algorithm at all; instead, it goes back to the situation in which SHA1 is required in communication.

This change borders on a scandal and gives the impression that COI is not in control of the situation. It would be understandable if the new version restored support for SHA1 but correctly handled both SHA1 and SHA256. Instead, releasing a version that makes the server return security exceptions for working code that uses SHA256, where the remedy is to bring back SHA1 on the integrated application's side, reflects very badly on the culture of the code being produced and on the support for integrators.

Perhaps it is just an oversight and the situation will return to normal (= both kinds of hash function will be supported) shortly, but the confusion COI causes with this evidently not-fully-thought-out migration creates perturbations on the integrators' side.

Wednesday, June 13, 2018

Integration with ePUAP - switching SHA-1 to SHA-256

The ePUAP test environment has just been adapted to the changes enforced by the Act on trust services and electronic identification of September 2016, whose art. 137 says:
Until July 1st 2018, the SHA-1 hash function may be used to create advanced electronic signatures or advanced electronic seals, unless technical requirements resulting from the implementing acts issued under Regulation 910/2014 exclude the possibility of using this hash function.
Indeed, the change deployed in the test environment causes the status A security error was encountered when verifying the message to be returned when communicating with any service.
The error message is perhaps not very helpful in diagnosing the problem; nevertheless, it's worth noting that it really is about the need to replace SHA-1 with SHA-256, which is extremely important in the context of generating XMLDsig signatures in communication with the services. At least in .NET, the default signature provider of the SignedXml object picks SHA-1, because that's what the certificate's signature suggests, and overriding this default convention so that SHA-256 is used to compute the signature is not obvious at all. And how do we know it's SHA-256 specifically? Well, the documentation for integrators was also updated in the test environment, and in the excerpts from sample requests and responses this algorithm now appears everywhere sha1 used to be in the earlier versions.
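A minimal sketch of that non-obvious override, assuming .NET 4.6.2 or newer (where SignedXml recognizes the rsa-sha256 and sha256 URIs out of the box); with a certificate, pass cert.GetRSAPrivateKey() as the key - the legacy cert.PrivateKey path often hands back a CSP that cannot do SHA-256:

```csharp
using System.Security.Cryptography;
using System.Security.Cryptography.Xml;
using System.Xml;

public static class Sha256Signer
{
    public static XmlElement Sign(XmlDocument doc, RSA key)
    {
        var signedXml = new SignedXml(doc) { SigningKey = key };

        // force SHA-256 instead of the SHA-1 default
        signedXml.SignedInfo.SignatureMethod =
            "http://www.w3.org/2001/04/xmldsig-more#rsa-sha256";

        // an enveloped signature over the whole document
        var reference = new Reference("")
        {
            DigestMethod = "http://www.w3.org/2001/04/xmlenc#sha256"
        };
        reference.AddTransform(new XmlDsigEnvelopedSignatureTransform());
        signedXml.AddReference(reference);

        signedXml.ComputeSignature();
        return signedXml.GetXml();
    }
}
```

The returned element still has to be imported and appended into the signed document, as usual with SignedXml.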
It's quite possible that at the beginning of July we'll see a wave of "outages" of external systems integrated with ePUAP which, for various reasons, won't have implemented this change. In the context of yesterday's (2018-06-12) big outage of the e-services - perhaps that outage and the planned change are somehow related.