Moving .. :-(

It's that sad time when a person moves from one blog address to another. I have decided to use the Dropbox-based blogging tool Scriptogram for further posts. I shall make an effort to move all the posts from my WordPress page over to the one on Scriptogram.

Adieu, dear readers. See you at http://scriptogr.am/gprasant


TempData for user notifications

The normal code flow for editing/creating an entity in MVC looks something like this:
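Roughly along these lines (the Cart entity and the data-access helpers below are illustrative stand-ins, not code from a real project):

    using System.Web.Mvc;

    public class Cart
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class CartController : Controller
    {
        [HttpGet]
        public ActionResult Edit(int id)
        {
            Cart cart = LoadCartFromDatabase(id);  // fetch the entity being edited
            return View(cart);
        }

        [HttpPost]
        public ActionResult Edit(Cart cart)
        {
            if (!ModelState.IsValid)
                return View(cart);                 // redisplay the form with validation errors

            SaveCartToDatabase(cart);              // persist the changes
            return RedirectToAction("Index");      // redirect back to the listing page
        }

        // Stand-ins for whatever data access you actually use.
        private Cart LoadCartFromDatabase(int id) { return new Cart { Id = id }; }
        private void SaveCartToDatabase(Cart cart) { /* ... */ }
    }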

Sometimes, when you want to display a notification back to the user, the conventional way to do so is to pass the message in a query string.
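For example, reusing the illustrative Edit action above, the message can be passed as a route value on the redirect:

    [HttpPost]
    public ActionResult Edit(Cart cart)
    {
        SaveCartToDatabase(cart);

        // "message" is not a route parameter of Index, so it becomes a
        // query-string parameter that the Index action or its view can display.
        return RedirectToAction("Index", new { message = "saved successfully" });
    }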

This sends out a URL like http://localhost:23000/Cart/Index?message=saved%20successfully

But this has a couple of disadvantages:

  • The URL is ugly
  • If you hit F5, the message appears again, as if the entity were saved a second time.

One way to get around the ugly URL problem is to use the ViewBag to set a Message and display it from the View, but that doesn't solve the second problem of the message being shown multiple times.

There is another dictionary meant for exactly this use case: the TempData dictionary.

The cool thing about the TempData dictionary is that it is implemented as a limited-access dictionary: you can read a key/value pair once, and after that the key loses its associated value. This solves both of our problems pretty well.
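A sketch of the same flow using TempData instead (again reusing the illustrative Cart action):

    [HttpPost]
    public ActionResult Edit(Cart cart)
    {
        SaveCartToDatabase(cart);

        // The entry is removed once it has been read, so the message shows up
        // once after the redirect and is gone if the user hits F5.
        TempData["Message"] = "saved successfully";
        return RedirectToAction("Index");
    }

And in Index.cshtml:

    @if (TempData["Message"] != null)
    {
        <p>@TempData["Message"]</p>
    }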


Implicit Required arguments

I just found out today a peculiar behaviour of model binding in MVC 3. At first I thought it was a defect, but I later learnt that it was by design. In a line: primitive (value-type) properties in your model are treated as if the [Required] attribute were applied to them.
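Consider a model along these lines (reconstructed for illustration from the description below; the exact property types in the original may differ):

    public class Student
    {
        public string Name { get; set; }   // reference type: no implicit [Required]
        public int Score { get; set; }     // value type: the binder treats it as required
        public char Grade { get; set; }    // value type: the binder treats it as required
    }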

Now use Razor to generate a textbox for each of the above fields.
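A sketch of such a view (assuming unobtrusive client validation is enabled; the trailing comment notes roughly what gets emitted):

    @model Student

    @using (Html.BeginForm())
    {
        @Html.TextBoxFor(m => m.Name)
        @Html.TextBoxFor(m => m.Score)
        @Html.TextBoxFor(m => m.Grade)
        <input type="submit" value="Save" />
    }

    @* The rendered inputs for Score and Grade carry data-val="true" and
       data-val-required="..." attributes; the input for Name does not. *@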

You can notice that the HTML code that is emitted applies the required validations for all the primitive-type properties on the Student class, but not for the Name property.

Notice how the Score and Grade properties are rendered with HTML attributes for required validation, but Name is not? The reason for this seemingly weird behaviour is that these properties have primitive types, which have no notion of nullability. When the user posts the form back without entering a value into these fields, the model binder would have to update these properties with a null value. Instead of failing that way, the framework assumes that you made those types non-nullable for a reason. It assumes that you really do know what you are doing.

How to get around this?

The simple way: for whatever properties you are sure need not be required during user input, mark them as nullable fields.
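Continuing with the illustrative Student model, that just means switching to nullable value types:

    public class Student
    {
        public string Name { get; set; }
        public int? Score { get; set; }    // nullable: no implicit required validation
        public char? Grade { get; set; }   // nullable: no implicit required validation
    }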


Setting up an R server

R is a statistical tool used by economists and non-programmers alike. It is a language that was specifically intended for data-crunching problems. This post is about how you can set R up as a web server, send HTTP requests to it and get responses back.

The R package that makes it possible to use R as an HTTP server is called Rook. Here's a simple script that sets up the server to return 42 as the response always, no matter what.

[Screenshot: R_Run_Script — the Rook server script]
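A sketch of such a script using Rook's Rhttpd class (the app name here is illustrative):

    library(Rook)

    # A Rook app is just a function that receives the request environment and
    # returns a list with an HTTP status, headers and a body.
    answer <- function(env) {
      list(
        status  = 200L,
        headers = list("Content-Type" = "text/plain"),
        body    = "42"   # always respond with 42, no matter what was requested
      )
    }

    server <- Rhttpd$new()                     # Rook's wrapper around R's built-in web server
    server$add(app = answer, name = "answer")  # mount the app
    server$start()                             # the console prints the address it listens on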

Load the script, select the entire script and hit CTRL + R. You should be able to see something like this on the R console.

[Screenshot: S_Started — the R console after the server has started]

Now that the server has started, you can hit the server from the browser like so:

[Screenshot: S_on_browser — the response shown in the browser]


What’s the ViewBag?

The ViewBag property in ASP.NET MVC makes a lot of things much easier. One of the obvious advantages is that since it is dynamic and operates as a key/value store, you can put pretty much anything into it from the controller to be rendered in the View. You are not forced to write models (or view models) just to render some obscure property that does not really deserve the status of existing in a higher-level data structure. While the ViewBag makes a lot of things very easy during development, it is worth putting in some effort to understand how the framework handles/implements it. The ControllerBase class defines the ViewBag as a dynamic item.

In normal usage, you set a property on the ViewBag and return a ViewResult from an action method. If you refer to the same property of the ViewBag in the .cshtml file, the value you put in from the controller gets rendered on the View. So it must be the View() method on the Controller that is responsible for passing the ViewData/ViewBag along. If you look at the implementation of the ViewBag, you can see that it is heavily dependent on the ViewData object: whenever you set a value on the ViewBag, a key/value pair is inserted into the ViewData dictionary, so you can get/set properties interchangeably between ViewData and ViewBag.
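A small usage sketch of that interchangeability (the controller, action and key names here are illustrative):

    using System.Web.Mvc;

    public class HomeController : Controller
    {
        public ActionResult Index()
        {
            // Setting a dynamic property on the ViewBag...
            ViewBag.Message = "Hello from the controller";

            // ...is the same as inserting a key/value pair into ViewData,
            // so both of these read the same underlying entry.
            var viaViewData = ViewData["Message"];     // "Hello from the controller"
            var viaViewBag  = (string)ViewBag.Message; // "Hello from the controller"

            return View();
        }
    }

In Index.cshtml, @ViewBag.Message and @ViewData["Message"] then render the same value.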

The ViewResultBase class also has a ViewBag property declared in the same way as in ControllerBase. Since this ViewBag property likewise depends on the ViewData object for getting/setting values, and since the View() method hands the controller's ViewData over to the ViewResult, whatever property you set on the ViewBag from the Controller can be accessed from the View.


Parallel Faux pas

Sometimes we tend to use language features without knowing whether the feature really offers a benefit. I recently saw one such example. The problem to be solved was to clone a huge array from one variable to another. Someone decided to get creative and used Parallel.For() to achieve the task. Along with a couple of friends, I decided to inspect what this method implied and whether it offered any performance improvement.

We tried out three approaches to solve the problem (sketched in the code after the list):

  1. Naïve array indexed copying
  2. Parallel.For()
  3. Using Object.Clone()
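A rough sketch of the comparison (the array size and timing harness here are illustrative, not the original benchmark):

    using System;
    using System.Diagnostics;
    using System.Threading.Tasks;

    class ArrayCopyComparison
    {
        static void Main()
        {
            var source = new int[10000000];
            var sw = new Stopwatch();

            // 1. Naive array indexed copying
            var naive = new int[source.Length];
            sw.Start();
            for (int i = 0; i < source.Length; i++)
                naive[i] = source[i];
            Console.WriteLine("for loop:       {0} ms", sw.ElapsedMilliseconds);

            // 2. Parallel.For() - each iteration copies a single element, so the
            //    scheduling overhead dwarfs the tiny amount of actual work
            var parallelCopy = new int[source.Length];
            sw.Restart();
            Parallel.For(0, source.Length, i => parallelCopy[i] = source[i]);
            Console.WriteLine("Parallel.For(): {0} ms", sw.ElapsedMilliseconds);

            // 3. Object.Clone() - arrays implement ICloneable; Clone() performs a
            //    fast shallow copy of the whole block
            sw.Restart();
            var cloned = (int[])source.Clone();
            Console.WriteLine("Clone():        {0} ms", sw.ElapsedMilliseconds);
        }
    }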

It turns out that Parallel.For() is actually the slowest method. Naïve array indexed copying comes in second, more than 4 times faster than Parallel.For(). Object.Clone() comes in first.

[Chart: Comparison_Array_Copy — timing comparison of the three approaches]


EF Code First Application deployed on AppHarbor

AppHarbor’s awesome. But there are some stumpers that you might encounter when you try to deploy your EF Code First app on AppHarbor. I’m going to assume that you have your Code First app ready and you’re waiting to push it to AppHarbor. If you use a database, you are going to have to install one of the SQL Server add-ons. I installed Yocto; it’s free and works reasonably well for my needs, but you could choose from a wide range of services provided by Sequelizer. To add a SQL Server add-on, go to your application page and select Addons.

[Screenshot: Addons]

Next, choose the SQL Server plan that you want.

[Screenshot: yocto]

After this step, you are allotted a database that runs on Sequelizer. AppHarbor gives you the credentials that you can use to connect to the database through SSMS.


Note that I have modified the connectionString alias to the name of the DbContext that I defined in code. I also happen to have a connectionString with the same name in the Web.config file. When deployed, AppHarbor automatically changes the connectionString and replaces the value with the actual connection string of the Sequelizer database. So even a connectionString like

 <add name="JeevanDBContext" connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=Jeevan123.Database.DBContext;Integrated Security=True" providerName="System.Data.SqlClient" />

will be replaced with the correct connection string, provided the name of your connectionString alias is exactly the same as that of your DbContext-derived class.
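For reference, a minimal sketch of the context class that goes with that alias (the entity type here is illustrative):

    using System.Data.Entity;

    public class BlogPost
    {
        public int Id { get; set; }
        public string Title { get; set; }
    }

    // EF Code First looks up a connection string whose name matches the context
    // class, so the "JeevanDBContext" alias above gets picked up automatically.
    public class JeevanDBContext : DbContext
    {
        public DbSet<BlogPost> Posts { get; set; }
    }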

For your database initialization strategy, you can’t use System.Data.Entity.DropCreateDatabaseAlways<TContext> on AppHarbor, because Sequelizer does not allow you to drop or create the database. I got around this with a small hack, which I do not recommend for large projects: I used System.Data.Entity.DropCreateDatabaseIfModelChanges<TContext>. Entity Framework recognizes that your model has changed by comparing a model hash computed from your domain classes, which is stored in the database in the model hash table. I copied the data from my local database into the remote database so that both databases were in sync and Entity Framework didn’t have to drop the database on the server.
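A sketch of where that initializer gets set, typically in Application_Start in Global.asax.cs (the class names mirror the illustrative context above):

    using System.Data.Entity;
    using System.Web;

    public class MvcApplication : HttpApplication
    {
        protected void Application_Start()
        {
            // DropCreateDatabaseAlways would try to drop the Sequelizer database on
            // every start and fail; this initializer only recreates the database
            // when the model hash changes.
            Database.SetInitializer(new DropCreateDatabaseIfModelChanges<JeevanDBContext>());
        }
    }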

Exporting Data

Now that the initial DB setup is done, we need to look at migrating the schema and data. You can generate the scripts from your local database using SSMS. Once the scripts are generated, run them against the remote database, replacing all instances of your local database name with the remote database name. You do have to connect to the remote database server through SSMS before you can do this, though.

