jQuery Migrate saves the day with Telerik UI for ASP.NET MVC

The relentless pace at which libraries and dependent JavaScript resources now change is quite difficult to keep up with – particularly when you’re attempting to be frugal and use open source projects like Telerik UI for ASP.NET MVC – which basically got dropped in 2013, and replaced by Kendo UI.

The basic problem is that when you’re using libraries like these – ones that depend quite a bit on things like jQuery – and they subsequently get dropped, you can be left in a bit of an upgrade quandary.

It’s tempting to go to NuGet and just ‘update all’ the libraries you’re using, but all sorts of bad things can happen if you get a little trigger-happy with that function.  The Telerik components, for instance, expect jQuery 1.7.1 (a 2012 release).  I now want to use Bootstrap to create a responsive theme, and that wants jQuery 2.x.  I’d previously tried to update to a later jQuery version and basically the whole deal broke – almost all of the components were using deprecated jQuery features.  I was considering going through the code and patching where required, but I don’t have that kind of time.
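
As an aside, if you want to protect yourself against an over-eager ‘update all’ in future, NuGet supports constraining a package to a version range via the allowedVersions attribute in packages.config. A minimal sketch (the constraint shown is just illustrative):

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- Pin jQuery so 'Update All' can't move it past 1.7.1 -->
  <package id="jQuery" version="1.7.1" allowedVersions="[1.7.1]" />
</packages>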

Whilst looking for deprecated features in the jQuery API doco, I saw mention of the jQuery Migrate plugin – which basically fills in those gaps for you.  I then picked up the package from NuGet and added it to my bundle config (right under jQuery)…

bundles.Add(new ScriptBundle("~/bundles/jquery").Include(
    "~/Scripts/jquery-{version}.js",
    "~/Scripts/jquery-migrate-1.2.1.js",
    "~/Scripts/jquery.cookie.js",
    "~/Scripts/jquery.imagemapster.js",
    "~/Scripts/jquery.tooltipster.js"));

I stopped the site in IIS Express, restarted it, and everything just magically started working again!

Now – I just have to get the responsive stuff singing and dancing with Bootstrap 🙂

Working around null values in LINQ queries

I’m almost ashamed to admit I hadn’t commonly used the ‘null coalescing operator’ ?? in C# until recently, and was writing code like

var myVar = myNullableVar == null ? 0 : myNullableVar.Value;

or variations on a theme using HasValue etc. (still better than the long-hand if-else, mind you)

Clearly this is more readable as

var myVar = myNullableVar ?? 0;

Often I find that things break down when you introduce Entity Framework, as there are limitations on what it will understand (from the point of view of translating to the underlying data context).  Null values, though, are another place you can save a bit of repetitive code, as you’ll quite often have nullable dates or other nullable types…

var output = (
    from tab in context.MyTable
    where tab.EffectiveDate == effectiveDate
    select new
    {
        Code = tab.Code,
        /* Old
        Value = tab.Value == null ? 0 : tab.Value.Value
        */
        // New
        Value = tab.Value ?? 0
    }).ToList();

This is a pretty simple example, but in conjunction with the SqlFunctions class, you can keep things nice and neat with type conversions in your code.
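
For example, here’s a hedged sketch of the sort of thing I mean – SqlFunctions.StringConvert is the usual way to get a numeric-to-string conversion that LINQ to Entities can actually translate (ToString() can’t be); the table and column here are made up for illustration:

// EF 4/5: System.Data.Objects.SqlClient.SqlFunctions
// (EF 6 moved it to System.Data.Entity.SqlServer.SqlFunctions)
using System.Data.Objects.SqlClient;

var display = (
    from tab in context.MyTable
    select new
    {
        // StringConvert pads the result, hence the Trim()
        QuantityText = SqlFunctions.StringConvert((double?)tab.Quantity).Trim()
    }).ToList();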

It’s only when you look a little further into the language that you see C#’s got quite a nice set of operators now 🙂


Book Review – Rogue Code by Mark Russinovich

Rogue Code by Mark Russinovich

Having read and reviewed Mark’s two previous ‘Jeff Aitken’ novels, Zero Day and Trojan Horse, I was keen to read Rogue Code as soon as I heard about it.

First of all though – I want to know when the fabled movie/s are going to get made!!?

We rejoin our heroes (Jeff and Daryl – a girl, btw) some time after the previous adventure, at a point where their increased reputation and success have started to drive them physically apart as a couple, if not emotionally.  The two have thrown themselves deeper into their respective work, and we meet Frank Renkin – Jeff’s new associate, who has more than a little history at ‘the company’ himself.

Frank is probably the best new move by Russinovich, as his technical expertise, coupled with a more ‘active’ previous association with the CIA, brings a new and more physical dimension to the team – one that comes in very handy indeed.

The format of the book is once again based on a diary style over the course of a hectic 10 days.

This time the threat is on home soil, centred around the cut-throat, bleeding-edge world of high-frequency trading on the New York Stock Exchange.  Real-world examples are given of previous HFT glitches and exploits, such as the ‘Flash Crash’ and IPOs like Facebook’s.

The stakes are high as companies seek to gain advantage by proximity hosting of their trading code as close to the NYSE as possible.  Other, more nefarious types seek to exploit the system from the inside.

Again, the network of bad guys spreads internationally, and we learn about the complex world of modern money laundering, and how criminals have embraced uber-technical crime, due to the huge potential gains to be made.

There’s no shortage of thrills and spills, with more personal danger than ever before, while the technical detail across multiple fields is again second to none, as Russinovich’s background is brought to the fore whilst not swamping the non-techie with unfathomable buzzwords.

The book maintains a good pace throughout, and really holds the interest as the plot lines converge.  If any criticisms could be laid, they would be pretty minor – such as the occasional conspicuous re-explanation of things like Daryl’s language background, and the ‘newspaper’ articles, which felt like an interruption just when things were ‘getting good’.  That’s all pretty nit-picky though.

All this left just one question.  Where are the movies already!?

Triangle of Happiness – Rate your job satisfaction

Something that’s interested me for a long time is what makes people tick, as far as motivational factors in the workplace go. We all spend a long time in our respective workplaces, and how you measure your relative happiness there is an interesting, and individual, thing.

The triangle of happiness

I’ve grossly over-simplified the factors down to three measures that I’ve been banging on about for the last 10 years, ever since a conversation with an old friend and colleague. I’ve always known it as the ‘triangle of happiness’, and it looks a bit like the image to the left…

If you’re comfortable with 2 of the 3 factors, then you’re doing OK.  If you’re unhappy with 2 or more, then it’s potentially tipping the balance on your overall job satisfaction, and on your engagement and willingness to stay in your current position or company.

You may find that you’re more than 3-dimensional, so I took the ideas mentioned above, along with some inspiration from Scott Hanselman and his KeysLeft one-page app, and came up with a one-page web app for you good people to model your job satisfaction.

It was also a good opportunity for me to have a play with KnockoutJS, the Foundation framework, and a few other bits and pieces.  It’s still rough around the edges, but I’d appreciate any feedback, so I can improve it in the future.

Integrating ASP.NET MVC into an existing ASP.NET Webforms application

If, like me, you’re not always blessed with the opportunity to build every application from scratch, you may find yourself wanting to introduce the wholesome goodness of ASP.NET MVC into an existing ‘classic’ ASP.NET WebForms application. Most tutorials out there concentrate on nice green-field development.

What follows is largely a reference for me to remember how to do this.  It’s basically a matter of manually injecting what the project templates do for a new application.  I’m also not professing to have come up with all of these steps – I’m just bringing them together.

I’ll assume you’ve got all the necessary prerequisites (MVC 4.0) installed already, and if you have a web ‘site’ project, then I’d suggest you update it to a web application before doing all of this.

Getting the structure and configuration to look like MVC

There are a number of standard folders and bits of code you’ll find dotted around MVC applications – Models/Views/Controllers for instance 🙂

The following article goes through the first steps of getting those folders into your project (assuming you don’t have a naming conflict).

Mixing ASP.NET WebForms and ASP.NET MVC

Updating to MVC4

This is all good, but that article’s a bit old, and you’ll find the next one brings you (mostly) up to date with MVC 4.0

Adding MVC 4.0 to WebForms Project

If you want to use any of the newer features such as bundling, or if you’ve copied some views from an MVC4 project into your new MVC project, you’ll need the ASP.NET Web Optimization Framework (get it from NuGet).

You may also want to take an MVC4 project and convert the global.asax code to call off to the classes in the App_Start folder…

    public class MvcApplication : System.Web.HttpApplication
    {
        protected void Application_Start()
        {
            AreaRegistration.RegisterAllAreas();

            // Each of these is a static class living in the App_Start folder
            WebApiConfig.Register(GlobalConfiguration.Configuration);
            FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);
            RouteConfig.RegisterRoutes(RouteTable.Routes);
            BundleConfig.RegisterBundles(BundleTable.Bundles);
            AuthConfig.RegisterAuth();
        }
    }
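
If you don’t already have those App_Start classes, they’re just static holders for the registration code. For reference, here’s RouteConfig as it appears in a standard MVC 4 project template (the others follow the same pattern):

using System.Web.Mvc;
using System.Web.Routing;

public class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        // Ignore trace/webresource handlers, then set up the default MVC route
        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

        routes.MapRoute(
            name: "Default",
            url: "{controller}/{action}/{id}",
            defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
        );
    }
}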

Getting the Visual Studio MVC Template Goodness

This is great, but the icing on the cake is to make Visual Studio think this is an MVC project, so you get the nice right-click options, like Add → Controller when you’re in the Controllers folder.  It turns out you just need to fool Visual Studio by adding a project type GUID in your web project’s .csproj file…

To find it, I just did a quick diff between a new MVC 4 project’s project file and my ‘hybrid’ project’s file.

The following is what you’re looking for…


<SchemaVersion>2.0</SchemaVersion>
<ProjectGuid>{18BBC72C-702B-40CD-B347-D7FC66D276FA}</ProjectGuid>
<ProjectTypeGuids>{E3E379DF-F4C6-4180-9B81-6769533ABE47};{349c5851-65df-11da-9384-00065b846f21};{fae04ec0-301f-11d3-bf4b-00c04f79efbc}</ProjectTypeGuids>

If you just add the first GUID from that ‘ProjectTypeGuids’ element to the corresponding place in your project file and reload, the magic happens, and Visual Studio thinks it’s now an MVC project.  You’ll probably find you already had the other two GUIDs.

Book Review : Trojan Horse by Mark Russinovich

It’s been a while since I read Mark’s last book, Zero Day, and I finally got a copy of the sequel, Trojan Horse.

Like any follow-up action story, the main challenges for Russinovich here were to develop the main characters, introduce new characters, angles and plotlines, and also develop and increase the action.  I’m glad to report all of these were successfully undertaken, and Trojan Horse kept me as entertained as Zero Day.

In this story we see Jeff Aitken and his partner (in more ways than one), Daryl Haugen, get involved in another global cyber-crisis, with even more tangled international players than before, and even more audacious technical threats employed.  Again, they find themselves getting rather closer to the action than they planned, and their simple ‘computer consultant’ existence is transformed into that of cyber-agents once again.

The action centres around the Iranian nuclear program and all the players interested in its outcome.  International governments and secret agents galore keep things moving pretty quickly, and the book is again organised in a way that makes you piece together disparate events yourself to reveal the main plot.

Probably the most significant character development in the book is Daryl’s, as she gains new strength through some traumatic events.  Jeff continues to excel on the technical side, whilst showing more human fallibility – being rather more bumbling in other areas, like driving a manual car.

Talking of reality, it could be said that a few areas are somewhat Bond-esque in the licence they take with coincidence, good luck and timing.  I’m a big Bond fan, so I actually don’t mind if our hero escapes several times in a completely implausible fashion.  I guess it all adds to the action – and again, it would look great on screen.

I actually found it hard to put down from around the half-way mark, as it seemed things were slowly but significantly building towards the climax from there.

The action is solid, and the technical detail is, as always, second to none.  Jeff dabbles in other areas in this story, such as Android, so I do wonder quite how many languages he knows and how many areas of specialism he has!

As an IT guy too, I still find it hard to believe that Jeff can maintain his physique with the hours he works.  Darryl is obviously good for him 🙂

Looking forward to the next one.

Book Review : Zero Day by Mark Russinovich

I’ve long followed the output and talent of Mark Russinovich, as any self-respecting Windows tech should know and use his Winternals and Sysinternals utilities. I don’t know what I would have done in some situations in the past without Process Explorer, Process Monitor, and all the other PsTools. Needless to say, Mark knows what he’s talking about in the world of computers.

I was surprised and intrigued recently when, catching up on a Scott Hanselman podcast, I found Mark was the interviewee – on topics including his new novel.  Scott seemed genuinely surprised too that Mark had moved into this area, and was impressed with the way he’s managed to bridge the often ‘huge’ gulf between normal life and techiedom.

Zero Day is Mark’s first novel – an action thriller centred around high-technology crime, published in 2011 – and he’s followed it up with Trojan Horse, further developing the main characters from the first book.

Zero Day features Jeff Aitken as the lead character: a technical wizard specialising in computer security, but seemingly normal in many ways that non-technical people would think strange for a geek – he’s fit and good looking.  In the interview, Mark admits Jeff is based on him. 🙂

What’s interesting, first of all, is that the novel was even published, as the subject matter simply wouldn’t have had mass appeal even a few short years ago. The exploding popularity of the internet, and its crossover into normal people’s daily lives, has obviously made the story more accessible than ever before. Also, coordinated cyber attacks were something even Mark himself thought might happen for real before he finished the book, making the story less relevant. I guess he’s thankful in more ways than one that we’ve survived up to now.

The story starts with the seemingly isolated virus infection of a company’s network, in the midst of a rather racy opening scene. Mark doesn’t hold back on language and imagery to paint the picture. I don’t know why I was surprised by this, but I guess that helps the book to sell.  Maybe it was also an attempt to quickly set the tone that this is not just a book for techheads.

We then quickly learn of many other and varied incidents involving computers in all sorts of places and applications – some with fatal consequences. What follows is a well-constructed plot that explores not only the technical detail of viruses, exploits, triggers and rootkits, but also the human factors that motivate individuals away from doing ‘good’, and the ways they justify their actions to themselves.

Clearly the author has done his research on many topics. The technical stuff is a given, but the characters involved come from many different backgrounds, are explored well, and their back stories are plausible in most cases. Jeff’s own previous connection with the government, and the events leading up to 9/11, is interesting, albeit quite convenient for the story line.

My only real character comment is that the story seems to centre around a small group of very attractive computer people.  Jeff finds himself surprised when his new client is attractive, as his previous experience was that, of the few women in IT, very few were attractive. His emerging sidekick and love interest also just happens to be a whizzkid and a complete knockout. I guess they’d better get moving on the movie!

Things develop, and it seems that this thing is bigger than individual viruses and hackers. The characters start to connect in unexpected ways, often unknown to them, and it’s a good rollercoaster ride across the world, as the characters chase the ‘cure’.  But – would it make any difference anyway on Zero Day?

Whether enough loose ends were resolved at the end of the book I’m still deciding, but that may be another reason to read Trojan Horse.

Being someone who doesn’t read many novels, I enjoyed this a lot as the story and technical detail held my interest throughout. Also, like the information age itself, the book is split into bitesize chunks that move quickly between the various storylines. I found myself wondering which thread I was going to join() again next (pardon the developer pun).

I’m looking forward to getting into the next Jeff Aitken adventure, Trojan Horse, which is also now available.

Check out the rest of Mark’s work on his site http://www.trojanhorsethebook.com/

Returning flattened data from a SQL table containing pairs of rows

I like little challenges. The ones that don’t take all day to figure out, but are enough to capture your interest for a little while. Yesterday I had such a problem to solve.

I had some data in a table that was basically in ‘pairs’ of rows. It was actually different to the example below, but the example we’ll use is a ‘Message’ table, that contains requests and replies, that are linked through a particular identifier.

Our simple example looks like this (my actual table had more fields).

CREATE TABLE Message
(
    MessageID INT NOT NULL IDENTITY ,
    MessageType CHAR(1) NOT NULL,
    TransactionID INT NOT NULL,
    MessageBody VARCHAR(30),
    CreatedDate DATETIME DEFAULT GetDate()
)

We’ll add a bit of sample data (script generated from my insert generator stored proc)

SET IDENTITY_INSERT Message ON

INSERT Message(MessageID,MessageType,TransactionID,MessageBody,CreatedDate) VALUES('1','Q','1','Request Message 1',convert(datetime,'2012-08-30 13:55:07.213',121))
INSERT Message(MessageID,MessageType,TransactionID,MessageBody,CreatedDate) VALUES('2','R','1','Reply Message 1',convert(datetime,'2012-08-30 13:55:37.680',121))
INSERT Message(MessageID,MessageType,TransactionID,MessageBody,CreatedDate) VALUES('3','Q','2','Request Message 2',convert(datetime,'2012-08-30 13:55:51.183',121))
INSERT Message(MessageID,MessageType,TransactionID,MessageBody,CreatedDate) VALUES('4','R','2','Reply Message 2',convert(datetime,'2012-08-30 13:56:04.020',121))

SET IDENTITY_INSERT Message OFF


SELECT * FROM Message

MessageID   MessageType TransactionID MessageBody                    CreatedDate
----------- ----------- ------------- ------------------------------ -----------------------
1           Q           1             Request Message 1              2012-08-30 13:55:07.213
2           R           1             Reply Message 1                2012-08-30 13:55:37.680
3           Q           2             Request Message 2              2012-08-30 13:55:51.183
4           R           2             Reply Message 2                2012-08-30 13:56:04.020 

We can see that some of the fields are consistent from row to row (in pairs), and some of the fields are unique to each row. My challenge was to represent a pair of messages in one row.

On the face of it, this seems like it would be simple – just grouping by the TransactionID (the field that links the two rows). The problem is that you won’t be able to get the unique information from both rows without some assumptions (that may not be solid).

For example, this will happily give you the MessageIDs of both sides of the transaction (given the assumptions that the request comes before the reply, and that there are exactly two messages in a transaction)…

SELECT TransactionID, MIN(MessageID) AS RequestID, MAX(MessageID) AS ReplyID
FROM [Message]
GROUP BY TransactionID HAVING COUNT(*) = 2

TransactionID RequestID   ReplyID
------------- ----------- -----------
1             1           2
2             3           4 

But – it doesn’t give you the unique data related to each ID, as you’d need to correlate the MessageBody to the right MessageID – MIN(MessageBody) won’t necessarily relate to the ‘Request’.

So… we can think about how to correlate the data to get the result we want. There are a few options…

1. Use temporary tables, and get the result in two steps (reusing the query above)..


-- 1: Two-step process

SELECT TransactionID, MIN(MessageID) AS RequestID, MAX(MessageID) AS ReplyID
INTO #MessagePair
FROM [Message]
GROUP BY TransactionID HAVING COUNT(*) = 2

SELECT  REQ.MessageID AS RequestMessageID,
        REQ.TransactionId,
        REQ.MessageBody AS RequestBody,
        REQ.CreatedDate AS RequestDate,
        RPY.MessageID AS ReplyMessageID,
        RPY.MessageBody AS ReplyBody,
        RPY.CreatedDate AS ReplyDate
FROM #MessagePair MP
INNER JOIN [Message] REQ
   ON REQ.MessageID = MP.RequestID
INNER JOIN [Message] RPY
   ON RPY.MessageID = MP.ReplyID

RequestMessageID TransactionId RequestBody                    RequestDate             ReplyMessageID ReplyBody                      ReplyDate
---------------- ------------- ------------------------------ ----------------------- -------------- ------------------------------ -----------------------
1                1             Request Message 1              2012-08-30 13:55:07.213 2              Reply Message 1                2012-08-30 13:55:37.680
3                2             Request Message 2              2012-08-30 13:55:51.183 4              Reply Message 2                2012-08-30 13:56:04.020

2. Nasty correlated subquery and joins (not even going there)

3. A single query that makes use of the assumption that a request happens before a reply (meaning the request’s MessageID will be the lower value)

SELECT  REQ.MessageID AS RequestMessageID,
        REQ.TransactionId,
        REQ.MessageBody AS RequestBody,
        REQ.CreatedDate AS RequestDate,
        RPY.MessageID AS ReplyMessageID,
        RPY.MessageBody AS ReplyBody,
        RPY.CreatedDate AS ReplyDate
FROM [Message] REQ
INNER JOIN [Message] RPY
    ON REQ.TransactionID = RPY.TransactionID
AND REQ.MessageID < RPY.MessageID 

This produces the same result as above, and is what I ended up going with. I reckon there are probably a few more viable solutions, so I’d be interested to see anyone’s alternatives.
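
As a footnote – since I seem to end up back in LINQ these days anyway – the self-join in option 3 translates fairly naturally to LINQ to Entities. This is just a sketch, assuming an EF model with a Messages set mapped to the table above:

// Pair each request with its reply via the shared TransactionID,
// relying on the same assumption: the request's MessageID is the lower one
var pairs = (
    from req in context.Messages
    join rpy in context.Messages
        on req.TransactionID equals rpy.TransactionID
    where req.MessageID < rpy.MessageID
    select new
    {
        RequestMessageID = req.MessageID,
        req.TransactionID,
        RequestBody = req.MessageBody,
        RequestDate = req.CreatedDate,
        ReplyMessageID = rpy.MessageID,
        ReplyBody = rpy.MessageBody,
        ReplyDate = rpy.CreatedDate
    }).ToList();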

NDepend – Visual Studio Addin: takes you as far as you want to go

First of all, I’d like to point out that I was kindly given a license by the folks at NDepend (not very often that sort of thing happens, I can assure you!) and that I’m under absolutely no obligation to write anything about it.

In the beginning…

The funny thing is that it was probably over a year ago when I first installed the product, without any specific requirement or expectation. I had a little play with it (on Visual Studio 2008 as I recall), then the work I ‘had’ to do overtook my will to learn this new product, and it lay gathering dust on my hard drive.  This probably explains why I haven’t posted in all that time!

But then…

Recently, I picked up an existing project (on Visual Studio 2010), and wanted to have a good look inside to see what I was getting myself into. I dusted off NDepend and told myself I’d give it a good go this time…

First Impressions

The first thing I learned is that this is one significant addin, and you realistically need to ‘know you need it’ before you get it (see the ‘gathering dust’ comment above).  This also means you need to know what it can do for you – which is plenty!

If you’re reading this and thinking of trialling NDepend, then you either have problems to solve or you’re wanting to invest in ongoing improvement to your code. Both are very good reasons as it happens.

NDepend has few limitations in what it can do, as it has your entire codebase, Visual Studio extensibility and its own powerful rules engine at its disposal. It also employs its own CQL (Code Query Language) to let you find all sorts of patterns and complexity problems in your code.

The biggest problem is knowing where to start, or discovering that first task you want to achieve with it. It’s easy to get overwhelmed by the information it bombards you with when you spin it up.

To be fair, there are plenty of links trying to lead you to ‘what you’re looking at is…’

Reasons to try/buy

If you’re interested in the quality of your code, I believe there really is no equal – this is the tool you need. You may already be using FxCop in your build process to check for certain syntactic rules, and ReSharper for sorting out your code as you go, but NDepend can do all sorts of ‘different’ funky stuff (through CQL) that digs deep into your code to enforce things that would otherwise be difficult to check. It can obviously also do all the simple stuff, like showing you the dependencies between methods, classes and projects, redundant code, and so on.

Some highlights I quite like – made possible through CQL (there’s a quick example sketched after this list):

  • Enforcing of layering constraints – e.g. ‘this UI project cannot directly reference that DAL project’
  • Simple spot check stuff like queries on a ‘lines of code’ threshold – indicating complexity
  • Code not meeting an acceptable test coverage
  • For all the possibilities you’ll need to look here.
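
To give a flavour of the query side, here’s roughly what a ‘lines of code’ spot check looks like in the LINQ-based rule syntax (CQLinq) – I’m writing this from memory, so treat the exact property names as indicative rather than gospel:

// Flag any method whose raw size suggests it's doing too much
warnif count > 0
from m in Application.Methods
where m.NbLinesOfCode > 30
orderby m.NbLinesOfCode descending
select new { m, m.NbLinesOfCode }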

Things to be aware of

  • It’s a technical tool, and it’s easy to get a little overwhelmed with what it can do and where to start.
  • Time is needed to understand some of the concepts and power of the product.
  • You’ll need a beefy machine to avoid things slowing down with the addin loaded (I had to disable it for a while when I was using a solution with 60 projects as I was starting to experience memory issues).  If you don’t want to run it in Visual Studio, you can run it in the standalone ‘Visual NDepend’ application.
  • I’ll admit I haven’t spent a lot of time with the interactive reports, and I don’t find some of the graphical representations of the metrics that easy to use.
  • I think like most products, you get comfortable with what you see as valuable, and tend to only try other things when you have time.

Summary

Clearly NDepend’s a very impressive tool for any serious development team to be using. It will help you to learn about reducing complexity, dependencies and generally designing your code in an efficient way. It’s basically all about improving quality.

It’s also a big product that’s not for the faint-hearted. You basically get out what you put in, as far as the effort of understanding what it’s trying to achieve for you goes.

I think the key is finding the right balance between all the technical information it presents, the time you have available, and the business benefit you’ll get from code improvements.

As I said at the start. It can basically take you as far as ‘you’ want to go.

Worth taking a look at: http://www.ndepend.com/

LINQ Group by MAX Date Query

I’ve found some weird, wonderful and ridiculously complicated LINQ queries for getting the row with MAX(DATE) based on a key.  Most unnecessarily use lambda expressions, and some had several interim steps.  I knew there had to be a better way, and found an unassuming post at the bottom of a StackOverflow page.

Here’s my non-lambda’d, contrived example… assuming you’ve got an Entity Framework model (i.e. a context)

// Get Client Order with (max) order date
var maxClientOrder = (
    from clientOrder in context.ClientOrders
    where clientOrder.OrderDate ==
        (from clientOrder2 in context.ClientOrders
         where clientOrder2.ClientID == clientOrder.ClientID
         select clientOrder2.OrderDate).Max()
    select clientOrder).ToList();