In case anyone is ever wondering what browser a Samsung Blackjack (i600v) uses:
Mozilla/4.0 (compatible; MSIE 6.0; Windows CE; IEMobile 6.8) SAMSUNG-SGH-i600V/1.0
Yesterday I looked into building Background Motion using the Composite Web Block, the Enterprise Library, and how all of the different .NET 3.x technologies come together in a demonstration product named Dinner Now. Today focused on SQL Server 2005 performance, optimisation and scalability, followed by .NET language pragmatics.
Shu Scott presented on writing applications that make SQL Server 2005 fly, though I don't think that name reflected the presentation all that well; it would have been better titled 'Understanding the Cost Based Optimiser to Make SQL Server 2005 Fly'. Nonetheless, Shu raised a lot of great points in her presentation, and the ones I found most interesting are below:
- Avoid using functions on columns in the WHERE clause. SQL Server 2005 calculates statistics for a table per column, so as soon as you wrap a column in a function the statistics become unusable. The offshoot of this is that SQL Server 2005 can massively under or over estimate the selectivity of a table, which on a moderately complex query can dramatically change the query plan that SQL Server will choose.
- Avoid using local variables in WHERE statements. During the presentation I didn't get a clear understanding of why this is sub-optimal, however after some research online it turns out to be caused by the cost based optimiser having to use a reduced quality estimate for the selectivity of the table. You can avoid the problem by supplying the OPTION (RECOMPILE) hint, using a literal instead of a local variable if possible, or parameterising the query and accepting the value via an input parameter (see the sketch after this list).
- Use the INCLUDE keyword when creating non-clustered indexes. The INCLUDE keyword allows you to extend past the 900 byte limit on the index key and also allows you to include previously disallowed column types within the index (e.g. nvarchar(max)). This type of index is excellent for index coverage, as all of the identified columns are stored within the leaf nodes of the index, however only the key columns enforce the index type.
- Use the USE PLAN hint, which lets you supply an actual execution plan to the SQL statement, again without editing the SQL statement directly. Essentially, you extract the XML representation of the execution plan you would prefer to have execute and supply that to the SQL statement. If you have skewed data sets, this is a good option to guarantee consistent access speed for a particular query. Using skewed data sets as an example, it would otherwise be possible for SQL Server to cache a query plan which represents the atypical data and as such performs very poorly for the majority of the typical data; supplying the query plan to the SQL statement avoids that happening. It is worth noting, though, that if you do supply a query plan you need to revisit the stored plan periodically to make sure it still reflects the most efficient access path for your particular data set.
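To make the parameterisation advice in the list above concrete, here is a minimal C# sketch using plain ADO.NET; the connection string, table and column names are made up for illustration. Passing the value as a SqlParameter lets the optimiser build its estimates from column statistics rather than guessing around a T-SQL local variable.

```csharp
using System;
using System.Data.SqlClient;

class ParameterisedQueryExample
{
    static void Main()
    {
        // Hypothetical connection string and schema (dbo.Orders with a CustomerId column).
        const string connectionString = "Server=.;Database=Sales;Integrated Security=true";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT OrderId, OrderDate FROM dbo.Orders WHERE CustomerId = @customerId",
            connection))
        {
            // The value arrives as an input parameter instead of a local variable or literal
            // concatenated into the SQL text.
            command.Parameters.AddWithValue("@customerId", 42);

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetDateTime(1));
                }
            }
        }
    }
}
```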
This presentation was about scaling SQL Server 2005 out, such that you're able to continue adding more servers to the mix and distribute the load across them. I had previously read the majority of the information covered, however I learned about two features named the SQL Server Service Broker and Query Notifications.
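Query Notifications surface to application code through ADO.NET's SqlDependency class. Below is a rough sketch of that usage, assuming a dbo.Products table and a database with Service Broker enabled; the connection string and schema are illustrative only.

```csharp
using System;
using System.Data.SqlClient;

class QueryNotificationExample
{
    // Hypothetical connection string; the target database must have Service Broker enabled.
    const string ConnectionString = "Server=.;Database=Catalog;Integrated Security=true";

    static void Main()
    {
        // Starts the listener that receives notifications from SQL Server.
        SqlDependency.Start(ConnectionString);

        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand(
            "SELECT ProductId, Name FROM dbo.Products", connection))
        {
            // Notification queries need two-part table names and an explicit column list.
            var dependency = new SqlDependency(command);
            dependency.OnChange += (sender, e) =>
                Console.WriteLine("Result set changed: " + e.Info);

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read()) { /* cache the results locally */ }
            }
        }

        Console.WriteLine("Waiting for a change notification; press Enter to exit.");
        Console.ReadLine();
        SqlDependency.Stop(ConnectionString);
    }
}
```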
One detail that stood out: an UPDATE statement which might affect 100 rows locally is sent to the subscribers as 100 independent SQL statements, in case some or all of the subscribers have additional data that the publisher does not.
Joel Pobar presented on .NET programming language pragmatics and contrasted some of the recent developments in that space. At the start of the talk, he pointed out that there are generally three types of programming languages: static, dynamic and functional. The original version of the .NET Common Language Runtime was based around a static environment and has since been enhanced to support functional programming and, more recently, dynamic languages.
The dynamic programming languages are handled through a new component, the Dynamic Language Runtime, which sits on top of the existing CLR. The Dynamic Language Runtime has allowed people to build IronPython and IronRuby, which are implementations of those particular languages sitting on top of the .NET CLR.
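As a small taste of what that enables, here is a sketch of hosting IronPython from C# via the DLR hosting API (IronPython.Hosting and Microsoft.Scripting); the hosting surface has shifted between IronPython releases, so treat the calls below as indicative rather than definitive.

```csharp
using System;
using IronPython.Hosting;
using Microsoft.Scripting;
using Microsoft.Scripting.Hosting;

class DlrHostingExample
{
    static void Main()
    {
        // Spin up an IronPython engine on top of the DLR/CLR.
        ScriptEngine engine = Python.CreateEngine();
        ScriptScope scope = engine.CreateScope();

        // Run a Python statement and pull the resulting value back into .NET code.
        ScriptSource source = engine.CreateScriptSourceFromString(
            "total = sum(range(10))", SourceCodeKind.Statements);
        source.Execute(scope);

        int total = scope.GetVariable<int>("total");
        Console.WriteLine("Python computed: " + total);
    }
}
```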
Outside of the fact that it means you'll be able to run a Python script in the native Python runtime or inside the .NET DLR, which is just plain cool, the bigger picture here is that the .NET CLR is being enhanced, and soon we'll have a new super language (LISP advocates, back in your boxes!) which supports all of the current styles of programming at once.
The presentation was fantastic and it is exciting to hear Joel present as he is so passionate about the field. In fact, I would go as far as to say that his enthusiasm for his work is infectious; it is hard to walk away from one of his presentations without at least some of his excitement and enthusiasm rubbing off on you.
I've heard on the grapevine that Joel might be able to present at one of the next Gold Coast .NET User Group meetings; I can't wait if he does!
Yesterday, I ventured into the world of Microsoft CRM 4.0 and IIS7 which were both very educational. Day two at Tech Ed was going to leave the products behind and jump into the deep end of software development.
The first presentation I attended was by Jeremy Boyd, a Microsoft Regional Director for New Zealand. The presentation was about building a community site named Background Motion which is all about sharing rich media that can be used as wallpaper within Vista utilising Dreamscene.
If the talk were simply about building a web site using ASP.NET, it wouldn't be all that interesting, so Jeremy took everyone through how to utilise the Composite Web Block and develop against the Model View Presenter pattern, as opposed to the ever popular Model View Controller approach. I really enjoyed seeing the Model View Presenter pattern in use first hand and I thought that the structure and flow felt really good; structure and order are always a good thing, anything to help stop code sprawling over time.
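To illustrate the shape of Model View Presenter without any framework plumbing, here is a minimal C# sketch; all of the type names are made up, and the Composite Web Block provides far richer support than this, but the roles are the same: the view exposes only what the presenter needs, so it can be an ASP.NET page, a WinForms form or a stub in a unit test.

```csharp
using System;

// The view contract: the presenter never sees a concrete page or form.
public interface IMotionListView
{
    string SearchTerm { get; }
    void ShowResults(string[] titles);
}

// The presenter holds the presentation logic and drives the view.
public class MotionListPresenter
{
    private readonly IMotionListView view;

    public MotionListPresenter(IMotionListView view)
    {
        this.view = view;
    }

    public void Search()
    {
        // In a real application this would call into the model/service layer;
        // here we fake a result so the flow is visible end to end.
        string[] results = { view.SearchTerm + " sunrise", view.SearchTerm + " ocean" };
        view.ShowResults(results);
    }
}

// A console stand-in for the view, purely to exercise the presenter.
public class ConsoleView : IMotionListView
{
    public string SearchTerm { get { return "beach"; } }

    public void ShowResults(string[] titles)
    {
        foreach (var title in titles)
            Console.WriteLine(title);
    }
}

class Program
{
    static void Main()
    {
        var presenter = new MotionListPresenter(new ConsoleView());
        presenter.Search();
    }
}
```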
I have to give plenty of accolades to Jeremy; his presentation was without a doubt the smoothest that I have been involved with so far at Tech.Ed 2007. The flow of switching between the slides and into Visual Studio was always seamless; no fluffing around configuring references or having things unexpectedly fail to compile. Jeremy used a simple system to make sure this worked as expected: he had numerous copies of his solution waiting in the appropriate state for each step of the presentation. Not only were the technical aspects of the talk sorted out well in advance, his presentation style and pace throughout the talk were excellent.
The second talk I went into was about the Enterprise Library, formerly known as the Enterprise Application Blocks. Version three of the Enterprise Library comes with a bunch of bug fixes to existing blocks such as the Data Access Application Block, but the really interesting news was the addition of the Validation Application Block and the Policy Injection Application Block.
Touching on each of those points briefly, the Validation Application Block is a generic validation package that provides an array of out-of-the-box validation routines. Validation isn't anything new, so the important point to note about the Validation Application Block is that the same code will work identically across ASP.NET, Windows Forms and Windows Communication Foundation. You could use the validation block to provide ASP.NET level validation and then provide a different or additional set of validation routines on the business object itself. The validation can be set up through configuration, attributes or code. Through the use of the Validation Application Block, it is now convenient to write validation routines and rules only once, whereas they typically tend to be duplicated.
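From memory, the attribute-based flavour of the block looks roughly like the sketch below; the validator attribute names and facade call are recalled rather than checked against the Enterprise Library 3.x documentation, and the Customer type is invented, so treat this as indicative only.

```csharp
using System;
using Microsoft.Practices.EnterpriseLibrary.Validation;
using Microsoft.Practices.EnterpriseLibrary.Validation.Validators;

public class Customer
{
    // Declarative rules live with the business object and apply the same way
    // whether the object is used from ASP.NET, Windows Forms or WCF.
    [NotNullValidator]
    [StringLengthValidator(1, 50)]
    public string Name { get; set; }
}

class Program
{
    static void Main()
    {
        var customer = new Customer { Name = "" };

        // The same facade call can be made from any layer of the application.
        ValidationResults results = Validation.Validate(customer);

        if (!results.IsValid)
        {
            foreach (ValidationResult result in results)
                Console.WriteLine("{0}: {1}", result.Key, result.Message);
        }
    }
}
```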
The real funk started happening when the Policy Injection Application Block came out to play. Using the Policy Injection Application Block, it is possible to separate out common tasks which happen across the enterprise or domain and reuse them through injection. As an example, tasks like logging, authorisation and validation are common and typically should be reused throughout the code without copy/pasting the functionality. After configuring which policies to inject where, and in what order, a new business object is instanced. Instead of getting back an instance of that business object, you get back a proxy that for all intents and purposes looks and feels like the business object you asked for. When calling methods on this proxy, the Policy Injection engine is invoked and the request for the actual method must flow through pre- and post-execution paths on the policy injection engine before being accepted. Nifty stuff!
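The consuming code ends up looking something like the sketch below; the facade name and overload are recalled from the Enterprise Library 3.x bits rather than verified, the service types are invented, and the actual handler pipeline (logging, validation and so on) is assumed to be declared in configuration.

```csharp
using Microsoft.Practices.EnterpriseLibrary.PolicyInjection;

public interface IOrderService
{
    void PlaceOrder(int productId, int quantity);
}

public class OrderService : IOrderService
{
    public void PlaceOrder(int productId, int quantity)
    {
        // Only the business logic lives here; logging, authorisation and
        // validation policies wrap the call via the generated proxy.
    }
}

class Program
{
    static void Main()
    {
        // Instead of an OrderService you receive a proxy that routes each call
        // through the configured pre- and post-execution handlers.
        IOrderService service = PolicyInjection.Create<OrderService, IOrderService>();
        service.PlaceOrder(7, 2);
    }
}
```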
This talk was about how to integrate all of the different .NET 3.x features into a single application. It appears that the community can see the strengths in any one of the components, however it has been struggling to see all of them integrated seamlessly together in a single application.
Enter Dinner Now, a fictional online business which lets you order take away food from more than one restaurant at a time and have it all delivered to your home. The Dinner Now sample application uses a wide spread of technology including IIS7, the ASP.NET AJAX Extensions, LINQ, Windows Communication Foundation, Windows Workflow Foundation, Windows Presentation Foundation, Windows PowerShell and the .NET Compact Framework.
The idea behind this presentation is quite exciting, however I felt that it could have had a little more meat in it. Maybe the talk was geared at a slightly lower entry point, but I felt too much time was spent explaining what the different technologies accomplish and not enough time going through the technical aspects. That said, I still found the presentation entertaining and it is fantastic that Microsoft have now recognised the need for a sample scenario that is more complex than Northwind.
Today was my first ever experience with Microsoft Tech Ed and it was a great one, what a fantastic conference! Across the course of the day I attended a handful of presentations, including three covering Microsoft CRM 4.0.
The presenter noted that the difference between CRM 1.x and CRM 3.0 was a revolution, while the move to CRM 4.0 is more of an evolution. The majority of the functionality from Microsoft CRM 3.0 exists within the updated version, however with a lot of improvements along the way, and quite a few of the items covered during the presentation caught my attention.
I’m very pleased that I attended the Titan presentation, even if I’m not going to use it immediately. It has really opened my eyes as to what the product is capable of and I can already see fantastic applications of it within our business.
IIS, the Microsoft web server, has been undergoing heavy surgical procedures since version five. IIS5 was a horribly slow, hard to configure product that no one wanted to use, and the market share that Apache held reflected that. With the release of IIS6, many of the problems of IIS5 were resolved or at least reduced; however, what they had still felt as though it was largely IIS5 with some spit and polish. The release of IIS7 feels as though they have finally unshackled themselves from their forefathers and are starting fresh.
The big highlights in IIS7 which I love are:
There were a lot of other very cool things available in IIS7; unfortunately you need to be running Windows Vista or Windows Server 2008 to get access to them, which we're not just yet.
Daniel Crowley-Wilson gave a quick half hour presentation on using Windows Communication Foundation to deliver RESTful web services. For a long time, .NET developers have really only been able to deliver remote services through SOAP and WS-*, which work but aren't the nicest things to deal with. I was very excited to see what looked like a clean implementation of REST; in fact, I would have loved it if Daniel had had an hour or so to give a more comprehensive presentation, but what he delivered packed a good punch for 20 minutes!
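For reference, the WCF web programming model that ships with .NET 3.5 (System.ServiceModel.Web) lets you express a REST-style endpoint roughly as follows; the service contract and URI template here are invented purely for illustration.

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Web;

[ServiceContract]
public interface ICustomerService
{
    // Maps an HTTP GET on /customers/{id} onto this operation.
    [OperationContract]
    [WebGet(UriTemplate = "customers/{id}")]
    string GetCustomer(string id);
}

public class CustomerService : ICustomerService
{
    public string GetCustomer(string id)
    {
        return "Customer " + id;
    }
}

class Program
{
    static void Main()
    {
        // WebServiceHost wires up WebHttpBinding and the REST-style dispatch for us.
        var host = new WebServiceHost(typeof(CustomerService),
                                      new Uri("http://localhost:8080/"));
        host.Open();
        Console.WriteLine("Listening; try GET http://localhost:8080/customers/1");
        Console.ReadLine();
        host.Close();
    }
}
```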
Throughout a few different presentations, Workflow Foundation was demonstrated. As mentioned toward the top, Microsoft CRM 4.0 utilises Workflow Foundation for all of its workflow components and individual presentations demonstrated it directly. Developing against a framework like Workflow Foundation to perform complex flow related tasks just makes sense as it removes so much of the complexity. After talking to a few people and seeing it in action a handful of times in the last month, I can see clear advantages in upgrading certain components of our enterprise stack to use Windows Workflow Foundation.
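For anyone who hasn't seen Windows Workflow Foundation code before, the sketch below shows a tiny sequential workflow built and run with the WF 3.x APIs (System.Workflow.Activities and System.Workflow.Runtime); the workflow itself is a made-up example, and real workflows are usually authored in the designer rather than by hand like this.

```csharp
using System;
using System.Threading;
using System.Workflow.Activities;
using System.Workflow.Runtime;

// A tiny sequential workflow assembled in code rather than in the designer.
public class HelloWorkflow : SequentialWorkflowActivity
{
    public HelloWorkflow()
    {
        var step = new CodeActivity { Name = "SayHello" };
        step.ExecuteCode += delegate { Console.WriteLine("Workflow step executed"); };
        Activities.Add(step);
    }
}

class Program
{
    static void Main()
    {
        using (var runtime = new WorkflowRuntime())
        {
            var finished = new AutoResetEvent(false);
            runtime.WorkflowCompleted += delegate { finished.Set(); };
            runtime.WorkflowTerminated += delegate { finished.Set(); };

            // The runtime schedules the workflow asynchronously, so wait for it to finish.
            WorkflowInstance instance = runtime.CreateWorkflow(typeof(HelloWorkflow));
            instance.Start();
            finished.WaitOne();
        }
    }
}
```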
I can’t wait for Tech.Ed day two, I’m going to really enjoy attending another IIS7 presentation and I hope to find the time to get in a couple of the fast paced half hour Chalk Talks.
Recently Matt Mullenweg was interviewed by Adriaan Pienaar about all things WordPress. Adriaan asks a good question about Movable Type:
And the second part of the question is, considering Movable Type (probably your main competitor for WordPress.org) has been commercial for so long now, is the fact that WP is Open-Source a competitive advantage?
and Matt responds, whilst slipping in a quick left jab:
Movable Type has been available cheaply or for free for a long time, and they recently announced that they’re going to open source it, but I think people will still continue to choose WordPress because it’s not about price, it’s about quality.
You’ve just got to love it.