Cryptography – Using Base64, rather than Unicode or ASCII, to round-trip encrypted strings – Error – ‘Length of the data to decrypt is invalid’

I recently converted a particular project from .NET 1.1 to .NET 2.0.  The framework classes (where our TripleDES encryption library lives) were still on 1.1, and the unit tests still run under 1.1 and all pass.


Production then started throwing ‘Length of the data to decrypt is invalid’, and I was horribly confused, as I’d inherited some of the code and everything had seemed fine…


I thankfully found the following post by Shawn Farkas – http://blogs.msdn.com/shawnfa/archive/2005/11/10/491431.aspx – which explains why, even when you’re only encrypting ASCII characters, you shouldn’t use a 7- or 8-bit text encoding to convert the ciphertext to and from a string.  The key is that the encrypted byte sequence isn’t guaranteed to be valid Unicode or ASCII, so a text-encoding round trip can silently corrupt it – which is exactly what Base64 exists to avoid.
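The principle is language-agnostic, so here’s a small Python sketch of it (the ciphertext bytes are made up for illustration – real TripleDES output is just as unfriendly to text encodings):

```python
import base64

# Hypothetical ciphertext bytes (made up for illustration). Like real
# TripleDES output, they are not guaranteed to be valid text in any encoding.
ciphertext = bytes.fromhex("ff00aa55c3289f10")

# Base64 maps arbitrary bytes onto a safe ASCII alphabet and back losslessly.
encoded = base64.b64encode(ciphertext).decode("ascii")
assert base64.b64decode(encoded) == ciphertext  # exact round trip

# Pretending the raw bytes are UTF-8 text is lossy: invalid sequences get
# replaced (or rejected outright), so the original ciphertext is unrecoverable.
as_text = ciphertext.decode("utf-8", errors="replace")
assert as_text.encode("utf-8") != ciphertext  # bytes were mangled
```

The Base64 round trip succeeds for any input; the text round trip only survives by luck, and ciphertext is specifically designed to look like random bytes.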


Why do the unit tests still pass?  Because they run under 1.1, whose laxer validation lets the corrupted data slide through – the cryptography classes in the .NET Framework were revamped for v2.0 and validation was tightened up.  As Shawn says, it’s better to fail than to ‘successfully’ decrypt into an invalid string.
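As for why the tightened validation surfaces as a *length* error specifically: TripleDES is a 64-bit block cipher, so valid ciphertext is always a multiple of 8 bytes, and a lossy text round trip almost always changes the byte count. A rough Python illustration (byte values again made up):

```python
BLOCK_SIZE = 8  # TripleDES operates on 64-bit (8-byte) blocks

# Hypothetical one-block ciphertext containing a byte (0xFF) that can
# never appear in valid UTF-8 text.
ciphertext = bytes.fromhex("ff0a1b2c3d4e5f60")
assert len(ciphertext) % BLOCK_SIZE == 0  # a decryptor would accept this

# A lossy round trip through text: the invalid byte becomes U+FFFD, which
# re-encodes as three bytes, so the length is no longer a block multiple.
mangled = ciphertext.decode("utf-8", errors="replace").encode("utf-8")
assert len(mangled) % BLOCK_SIZE != 0  # 'Length of the data to decrypt is invalid'
```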


This leaves a bit of a tidy-up, of course, as I’ve now got to re-encrypt and re-stuff all the encrypted data into the database and patch the apps to ensure the correct encoding is used.  I’ve also got to find a way to support existing files and strings encrypted with the class, as some code is still happily using it in 1.1-land and some is clearly ‘not’ working in 2.0-land – fun!


I guess it’s a lesson that today’s code won’t necessarily work tomorrow – and you shouldn’t discount breakages from framework changes when you’re investigating issues.