Thursday, 4 November 2010

Request errors with WCF Data Services

Having watched Scott Hanselman's excellent introduction to setting up your first OData service, I thought it would be really easy to get underway quickly creating an OData service for Microsoft's Data Market.

Basically, you create a database, you create an ADO.NET Entity Data Model, you create a WCF Data Service and tie it to the entity data model, you stick all that in an ASP.NET web site, and it works.

But for me it didn’t, and I’ve spent ages working out why.

This is what I got:

[Screenshot: the generic "Request Error" page, which gives no clue as to what actually went wrong]

The first thing you think of is to set a breakpoint, but the error, wherever it is, is deep inside Microsoft's code, in the constructor for the entity model. I'm also running this on a development system using Cassini, so there are no logs.

How then do you find the error?

After a lot of digging around I found this post and added the attribute below to the data service class:

using System.Data.Services;
using System.Data.Services.Common;

namespace Services.Web
{
    [System.ServiceModel.ServiceBehavior(IncludeExceptionDetailInFaults = true)]
    public class ChaosKit : DataService<ChaosKitEntities>
    {
        // This method is called only once to initialize service-wide policies.
        public static void InitializeService(DataServiceConfiguration config)
        {
            // TODO: set rules to indicate which entity sets and service operations are visible, updatable, etc.
            // Examples:
            // config.SetEntitySetAccessRule("MyEntityset", EntitySetRights.AllRead);
            // config.SetServiceOperationAccessRule("MyServiceOperation", ServiceOperationRights.All);
            config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
        }
    }
}
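
A related switch worth knowing about: DataServiceConfiguration also exposes a UseVerboseErrors property, which makes the service put full error details in the response body. A minimal addition to InitializeService (like IncludeExceptionDetailInFaults, this is a debugging aid only; switch it off in production):

    // Inside InitializeService, alongside the MaxProtocolVersion line:
    config.UseVerboseErrors = true;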

Now I get:

[Screenshot: a detailed exception message that finally names the offending data type]
So we discover that WCF Data Services doesn't like the SQL data type "Time". Finally!

I hope this helps you with the same kind of problem.

Monday, 5 July 2010

Failure to launch

Scientio spends a lot of time creating solutions for other companies. More often than not these are implemented as websites or portals into a demonstration environment.

We used to use a rack of servers for this purpose, but we're now down to one big one and Windows Azure.
We've been using Windows Azure since the CTP started, and we've been happy apart from one thing.
Sometimes applications just won't fire up.

You never know if it's you - normally this means you left out some vital DLL - or Microsoft.

I've seen instances fail to start that then started fine after being suspended and restarted - so presumably Microsoft's fault.

Most of the time it's your fault though...

The problem is that without some way of knowing what went wrong you are forced to experiment: change things, try again, and so on. Since each new upload and restart can take 10 minutes, this is a slow process.

Diagnostic logging doesn't help, because it only kicks in once the system starts.
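
To see why, look at where diagnostics gets wired up. In the Azure SDK templates it starts in the role's OnStart method, so a role that dies before reaching OnStart never writes anything. A sketch of the standard template code (assuming the default "DiagnosticsConnectionString" setting name):

    using Microsoft.WindowsAzure.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Diagnostics only starts here - a role that fails to launch
            // never reaches this line, so no logs are ever written.
            DiagnosticMonitor.Start("DiagnosticsConnectionString");
            return base.OnStart();
        }
    }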

So I was very happy to find out how to use IntelliTrace with VS2010, which seems to be the only way to find out what went wrong. Microsoft hasn't publicized this very well, so here is a quick guide.
Start by selecting the IntelliTrace option when you publish to Azure:

[Screenshot: the publish dialog with the IntelliTrace option selected]
Note that if you use RIA Services you have to switch off IntelliTrace for some DLLs; see http://blogs.msdn.com/b/kylemc/archive/2010/06/09/ria-azure-and-intellitrace.aspx

Now, if your Azure instance doesn't start, a log is created with the instance, which you can download by looking at the Compute instances in Server Explorer:

[Screenshot: Server Explorer showing the Windows Azure Compute instances]
Now wait a few minutes and a log listing the events that caused the 'failure to launch' will be displayed.
It makes life a lot easier.

Sunday, 2 May 2010

Why don't Microsoft units talk to each other?

I'm a great fan of Microsoft technology. I'm a sucker, too, for new stuff. Microsoft have had a purple patch of late with the releases of VS2010, MVC 2.0, .NET 4.0 and Silverlight 4.0.

I'm also a great fan of cloud computing. Scientio's site has been hosted on Microsoft Azure for over a year, and after a lot of aggro we've got the hang of it.

The one real problem has been the fact that the various elements - RIA Services, Visual Studio, etc. - have not kept up with each other during the beta, so we had endless problems of A not working with B, and C not working with D. After a lot of struggling you get everything to hang together, and then a new version of one of them comes along and it all starts again.

So now they're all released (Silverlight a week late), RIA Services is reasonably stable, and we want to run the new site we've built on Azure.

Unfortunately Azure won't run .NET 4.0 yet, and Microsoft won't promise when it will, beyond saying it'll be within 90 days of the RTM. So we're back where we started again...

Intelligent services and REST

AI is a broad subject. On the one hand there's the world of algorithms, measures of uncertainty and different models of learning; on the other there's the problem of accessibility.
Basically, how do you simplify AI tools so that they have the maximum usability?

We've had a suite of rule-based tools for most of the last decade with a simple-ish interface.
You can data mine by preparing data in XML and preparing a spec, also in XML; by firing the data and the spec at the data mining engine you create a rule set (you've guessed it, in XML) and a bunch of performance figures showing how well the mining run worked.

Similarly, you can test a rule set for logical holes (lacunae) using our lacuna project, by firing a rule set at it and getting back the report.

And finally, simplest of all, you can use a rule set to make an inference by presenting XML data, along with the rule set, to the inference engine and retrieving a new copy of the data with the results inserted at the appropriate points.

Up until now we've used SOAP web services to access these facilities, but it's finally dawned on us that a REST-style interface makes more sense. Scientio's website is implemented in ASP.NET MVC, and the new version in MVC 2, making lots of use of RIA Services.

RIA Services implements the OData spec in its newest version, which makes creating RESTful interfaces really easy.

So with MVC we can create a directory structure that follows REST practices. For instance,

online/XmlMiner/Edit/

will bring up the rule editor on that selected rule set, while:

RuleService/infer?&

will perform an inference on previously loaded data and rule sets.
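
Wiring up URLs like these in MVC is just a matter of route registration. Here's a minimal sketch for Global.asax - the controller names (XmlMiner, RuleService) and route shapes are illustrative, not our actual production code:

    using System.Web.Mvc;
    using System.Web.Routing;

    public static void RegisterRoutes(RouteCollection routes)
    {
        // online/XmlMiner/Edit/{id} -> the rule editor for the selected rule set
        routes.MapRoute(
            "RuleEditor",
            "online/XmlMiner/{action}/{id}",
            new { controller = "XmlMiner", action = "Edit", id = UrlParameter.Optional });

        // RuleService/infer -> run an inference over previously loaded data
        routes.MapRoute(
            "Inference",
            "RuleService/{action}",
            new { controller = "RuleService", action = "infer" });
    }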

We've extended this metaphor to all the services we provide in the upcoming revised Scientio site.
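
From the client's side, consuming one of these services then boils down to an HTTP POST of XML. A purely illustrative sketch - the endpoint URL and payload shape here are hypothetical, not our published API:

    using System;
    using System.Net;

    class InferenceClient
    {
        static void Main()
        {
            // Hypothetical payload; the real shape depends on the rule set in use.
            string inputXml = "<data><field name=\"temperature\">21</field></data>";

            using (var client = new WebClient())
            {
                client.Headers[HttpRequestHeader.ContentType] = "text/xml";
                // POST the XML and read back a copy with the results inserted.
                string resultXml = client.UploadString(
                    "http://example.com/RuleService/infer", inputXml);
                Console.WriteLine(resultXml);
            }
        }
    }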

Tuesday, 9 March 2010

Why do we need IaaS?

As I'm sure you are aware, there's a big change afoot in the way that companies work and business takes place. Increasingly, companies small and large use services supplied over the web to perform administrative and organisational tasks. They also frequently use software services as part of their websites - mashing up with Google Maps or with providers of data-based services like credit checkers, address disambiguation, etc. You can find loads of these at sites like http://www.strikeiron.com.

What they don't do, at least not very much, is use services to provide intelligent processing.
I think this is the next wave of services. The opportunity to sell the ability to make decisions, to analyze data, to fill in missing data, to ensure compliance with rules, governmental or corporate, is huge. Instead of selling information, we could be selling knowledge.

This idea isn't new. I made a proposal to the board of Reuters over 20 years ago that they should set up a marketplace on the Reuters data feed where third parties could sell competing trading methods, based on Reuters data. Unsurprisingly they didn't go for it!

I've been working ever since in Artificial and Computational Intelligence, developing tools that could be used to make this goal a reality. I want this blog to publicize both my work and that of others heading in the same direction.

When you think how much of human commerce is just about knowing how to do something, you then realize that the next stage is to capture that knowledge in a software system and sell it to anyone who wants it. This blog is about such a system and creating the marketplace to go with it.