How to redirect URLs without www to with www – Permanent (301) Redirection in ASP.NET

First, why should we redirect URLs without “www” to with “www”, or vice versa?

Answer – If your site cares about search engine ranking, then this is a must. Search engines like Google and Bing prefer that you have only ONE domain associated with your website. In the search engine world, “www.mydomain.com” and “mydomain.com” are considered two different domains. If you don’t redirect calls for mydomain.com to “www.mydomain.com”, or vice versa, the crawler will treat them as two different yet totally duplicate sites, and your search engine ranking will take a negative hit.

Other than this important reason, you might also want to stick to one of these URLs to maintain your social media like counters; Facebook, for example, will also treat these as two different sites.

This post is not about understanding the benefits of permanent redirection to one preferred domain URL. For more information, you can read this article by Google.

Now let's get to fixing this issue in ASP.NET. The URL rewrite can be done in IIS or by using a <rewrite> rule in the ASP.NET web.config file.

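A rule along these lines does the job (a sketch: the rule name and the {R:0} back-reference are illustrative, and the block goes inside <system.webServer> in web.config):

```xml
<rewrite>
  <rules>
    <!-- Redirect mydomain.com to www.mydomain.com with a 301 -->
    <rule name="Redirect to www" stopProcessing="true">
      <!-- 1) Inbound rule: match every incoming request -->
      <match url=".*" />
      <conditions>
        <!-- 2) Condition: host is the bare domain, without the www prefix -->
        <add input="{HTTP_HOST}" pattern="^mydomain.com$" />
      </conditions>
      <!-- 3) Action: permanent (301) redirect to the www domain -->
      <action type="Redirect" url="http://www.mydomain.com/{R:0}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```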
Adding the above code to web.config will permanently redirect (status code 301) all requests for mydomain.com to “www.mydomain.com”.

1) Inbound Rule – The “.*” regex will match all requests coming to the web server for mydomain.com.

2) Condition – The regex pattern “^mydomain.com$” will match the domain name without the “www” prefix.

3) Action – The Redirect action replaces the current URL with a substitution string. In other words, the URL matched by the condition input is replaced with the URL specified in the action tag, and we force this as a Permanent redirection type (which goes out with status 301).

If you are not familiar with regular expressions, read this tutorial by Microsoft to learn more.

To understand the <rewrite> module and its various attributes in more depth, I suggest reading this excellent article from Microsoft.

I hope this helps you reach the solution you were looking for.

 

Happy Coding!

Savita

IF Azure Role Instances Stop Unexpectedly – Do not Panic!

Recently, our Azure-deployed production websites stopped and were giving the message “503 Service Unavailable”. I think we jumped on the error too quickly since production was down. We checked the error logs; nothing there. We logged on to the Azure Portal and found that our Azure worker and web role instances had stopped unexpectedly. If you experience this, do not panic; wait a few minutes. The most likely cause of this unexpected stoppage is a Microsoft Guest OS update. After the scheduled update, the instances will auto-start and the websites will be up and running again.

Approximately every month, Microsoft releases Guest OS updates for Azure PaaS VMs. As long as you have 2 or more instances per role, you will not experience this downtime. In fact, 2 instances of a role are required to meet the 99.95% uptime SLA. I found a great article about maximizing uptime for Azure websites here – http://blog.toddysm.com/2010/04/upgrade-domains-and-fault-domains-in-windows-azure.html

Learn more about Guest OS update history and schedules here – Windows Guest OS Releases

Enjoy!!

How to Update Stored Procedures in EF 6 Database First

In EF 3.5 DB First, there was no direct way to update SP result sets, and I was surprised to learn yesterday that EF 6 has the same issue. Please note, this does not mean the SP code itself is not updated; that works fine. It is just the result set you are returning from an SP, whether via a simple SELECT clause or a SELECT from a temp table.

Let's look at this in more detail.

In the DB First approach, every SP included in the .edmx file generates a complex type. The naming convention is {SPName}_Result, and you can view all result sets in the “Model Browser”, under Complex Types. Let's take a simple example for illustration.

e.g. we wish to call an SP named GetRecipeByRecipeId. We know it will return the result set GetRecipeByRecipeId_Result, which, say, gives me the columns {RecipeName, Description, CookTime, PrepTime}. Everything works fine.

Now, we update our SP to include two new columns, {TotalTime, Serves}. Update the .edmx file in VS and refresh the stored procedure. You will notice that the underlying result set GetRecipeByRecipeId_Result still has the four old columns and the 2 new columns have not been added. Do a quick search on the SP name “GetRecipeByRecipeId” and you will notice that instead of updating the existing SP, EF has created a new SP named “GetRecipeByRecipeId1”, with the suffix 1. This behavior repeats every time you try to update an already added SP.

The solution to this issue is a hack, but it works perfectly. Open the Model Browser window and follow these steps:

  1. Delete GetRecipeByRecipeId_Result and GetRecipeByRecipeId1_Result from Complex Types.
  2. Delete GetRecipeByRecipeId and GetRecipeByRecipeId1 from Function Imports.
  3. Delete GetRecipeByRecipeId and GetRecipeByRecipeId1 from DBModel.Store > Stored Procedures/Functions.
  4. Save the changes, run Update Model from Database, and include the SP GetRecipeByRecipeId.

This time you will notice that the result set has all six columns and everything works like a charm.
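As a quick sanity check, the regenerated function import can be called like this (a sketch using the example names above; the recipe id is made up):

```csharp
// Sketch: enumerate the regenerated result set; TotalTime and Serves now exist.
using (var db = new RecipeDBEntities())
{
    foreach (GetRecipeByRecipeId_Result r in db.GetRecipeByRecipeId(42))
    {
        Console.WriteLine("{0}: total {1}, serves {2}", r.RecipeName, r.TotalTime, r.Serves);
    }
}
```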

NEXT Post –  SPs with SELECT on Temp Tables are not included in result set in EF 6.

The good, the bad and the ugly of ASP.NET Identity

brockallen

Ok, here we go again… and if you don’t know what I’m talking about, then see this post.

With Visual Studio 2013 and .NET 4.5.1 we have a new framework called ASP.NET Identity. ASP.NET Identity is yet another identity management framework from Microsoft (recall that we also had two prior frameworks from Microsoft: Membership and SimpleMembership).  Let’s take a look at the good and the bad aspects of this new framework.

TLDR; Click here to get to the ugly conclusion.

Good: Storage customization

One of the major complaints with the previous identity management frameworks was that it was either too cumbersome (with Membership) or too subtle (with SimpleMembership) to customize the storage. With this release, they’ve actually achieved a separation of the storage of the identity information (e.g. username, password, etc.) from the code that implements the security (e.g., password hashing, password validation, etc.). The way they’ve done…

View original post 3,115 more words

Solution to ALTER TABLE statement conflicted with the FOREIGN KEY constraint

Have you experienced “ALTER TABLE statement conflicted with the FOREIGN KEY constraint” error while trying to add a Foreign Key Constraint to a pre-existing table?

STATEMENT:

ALTER TABLE [dbo].[Table_X]
ADD CONSTRAINT [FK_Table_X_Table_Y] FOREIGN KEY([YId])
REFERENCES [dbo].[Table_Y] ([YId])

ERROR:

“Msg 547, Level 16, State 0, Line 1

The ALTER TABLE statement conflicted with the FOREIGN KEY constraint “FK_Table_X_Table_Y”. The conflict occurred in database “XYDB”, table “dbo.Table_Y”, column ‘YId’.”

Actually, the message is pretty clear, but it can sometimes be a bit elusive. We are trying to add an FK constraint to an already existing table, dbo.Table_X. If this table has no records, this error will never occur. But if you have data in Table_X prior to applying this constraint, that data may conflict with the constraint, so you have to correct the data first before enforcing the constraint.

In the above case, [dbo].[Table_X] had the orphan value 100 in the YId column of all records prior to enforcing the constraint, whereas Table_Y does not have a record with YId = 100.

SOLUTION:

In this scenario, either

1) You can delete all the orphan records (YId = 100) from Table_X (only if data loss isn’t an issue)

2) Update all orphan records with a valid YId that exists in Table_Y (consider your business model before updating data)

3) Add a new record to Table_Y with YId = 100 (again, consider your business model before updating data)
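Whichever option you choose, a query along these lines (using the example table and column names from above) will surface the orphan rows to fix first:

```sql
-- Find rows in Table_X whose YId has no matching row in Table_Y (the orphans)
SELECT x.*
FROM [dbo].[Table_X] AS x
LEFT JOIN [dbo].[Table_Y] AS y ON y.YId = x.YId
WHERE y.YId IS NULL;
```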

No matter which of the above options you opt for, in the end we want to fix the records in Table_X that conflict with the constraint in question. Once you fix those, the statement will work like a charm.

Happy Coding!!

Redirect Old Image Url to New Image Url in MVC

Recently, we upgraded the project I am working on to MVC 5, and we also decided to move all images used in the application to a separate location. This reduces the load on the application, since image processing now happens separately. E.g. we resize and watermark our images on the fly using handlers, so it is cool to keep this work separate from the app workload.

Current Image Location (A) :  http://{DomainName}/Images/{Itemfolders}/{ImageName}

New Image Location (B) : http://{Assets.DomainName}/Images/{Itemfolders}/{ImageName}

First, how do we achieve this re-routing? Using the .NET routing engine. You can either create a custom route handler and redirect all requests for A to B, or you can simply create an action in a controller and decorate it with a Route attribute. We opted for the latter; here is the code snippet (extra code removed for brevity):

// Note: attribute route templates cannot begin with a forward slash
[Route("Images/{itemfolders}/{imageName}", Name = "RedirectImageUrl")]
public void RedirectImageUrl(string itemfolders, string imageName)
{
    Response.Redirect(string.Format("http://assets.DomainName/Images/{0}/{1}", itemfolders, imageName), true);
}

If you try accessing the old route now, it might throw the error “The resource you are looking for has been removed, had its name changed, or is temporarily unavailable.” If you request the same old route without the image file extension (in debug mode), it does hit the action method, but that is of no use to us; we obviously want it to work with the extension. There are two ways to send this request to the .NET engine:

a) Enabling RAMMFAR – Enabling “runAllManagedModulesForAllRequests” will enable all managed modules for all requests. That means static files such as images, PDFs and everything else will be processed by .NET even when they don’t need to be. This option does add overhead, but works like a charm. Enable it and you will see the above error disappear and the routing start working.
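For reference, RAMMFAR is a one-line change in web.config (this is the standard attribute name, no custom settings involved):

```xml
<system.webServer>
  <!-- Routes every request, including static files, through the managed pipeline -->
  <modules runAllManagedModulesForAllRequests="true" />
</system.webServer>
```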

b) Using TransferRequestHandler – This HTTP handler looks for specific path criteria. If a request matches, it is correctly sent to .NET for processing. Add the following to the site's main web.config under the system.webServer/handlers section and try requesting the old URL (A); .NET will beautifully redirect it to the new URL (B).

<add name="ApiURIs-ISAPI-Integrated-4.0"
     path="/Images/*"
     verb="GET,HEAD,POST,DEBUG,PUT,DELETE,PATCH,OPTIONS"
     type="System.Web.Handlers.TransferRequestHandler"
     preCondition="integratedMode,runtimeVersionv4.0" />

Have fun and Happy coding!!

New Features in VS 2013

I am excited about the launch of Visual Studio 2013, packed with lots of new features that developers can (must!) learn to make their dev jobs a bit easier and more rewarding (maybe!).

Let's look at the features:

1. Social Authentication Helpers: There are some neat templates in VS 2013 dedicated to social sign-in with Facebook, Google OAuth2 and OpenID sign-on. A badly awaited feature. Say goodbye to all the add-on packages and custom wrappers for social sign-in and have everything pre-packaged in VS 2013. Read more about it here.

2. .NET 4.5.1: Yes!! VS 2013 comes with a new version of the .NET SDK, 4.5.1. It has bug fixes on top of the .NET 4.5 release, plus new performance improvements and opt-in features such as ADO.NET idle connection resiliency and async-aware call stack debugging windows. Read more here.

3. Attribute routing in MVC 5 and Web API 2: Oh boy, will this be a life saver or what? Say goodbye to all the constraints you apply on custom routes to control the routing match. And the good news is, you can combine it with conventional custom routing to get the best of both worlds. Mike Wasson has written a good article on attribute routing in Web API 2; it will be your one-stop shop for learning attribute routing.

4. Browser Link: Refresh multiple browsers from the IDE to see the impact of your changes all at once. Learn more here.

5. Entity Framework 6: VS 2013 ships with the new and improved EF 6.

6. ASP.NET Identity: ASP.NET Identity (new membership system for ASP.NET) promises to start unifying things a little better. Learn more about ASP.NET Identity feature here.

7. Peek Definition Window: A cool way to view and change the code that is being called from your current in-focus code. It shows the same result as Go To Definition, but the result appears in a pop-up window that you can edit in place without losing the context of your current page. More about this here.
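Of the features above, attribute routing (item 3) is the easiest to show in code. A minimal Web API 2 controller using it might look like this (the controller and route names are illustrative):

```csharp
// Hypothetical Web API 2 controller demonstrating attribute routing.
[RoutePrefix("api/recipes")]
public class RecipesController : ApiController
{
    // GET api/recipes/5 – the {id:int} constraint replaces a custom route constraint
    [Route("{id:int}")]
    public IHttpActionResult GetRecipe(int id)
    {
        return Ok(new { Id = id });
    }
}
```

Attribute routes are enabled by calling config.MapHttpAttributeRoutes() in WebApiConfig.Register.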

Learning StreamInsight – Query SQL Database as IEnumarable Event Source

I started implementing StreamInsight (SI) for an enterprise project and realized there are not many up-to-date SI articles around, so I am sharing my experience one post at a time, in the hope it will benefit someone. If you are new to StreamInsight, please refer to the Microsoft article – http://msdn.microsoft.com/en-us/library/hh750618(v=sql.10).aspx – which explains all available versions of SI; the latest released version at the time of writing is StreamInsight 2.1.

Please note SI 2.1 has some breaking API changes if you were previously using SI 1.2. Read an overview of the new changes and backward compatibility in SI 2.1 here – http://msdn.microsoft.com/en-us/library/ee362329(v=sql.111).aspx

I am writing this article with the assumption that you know what SI is and can write basic SI queries using LINQPad. Also, it is highly recommended that you read the StreamInsight Server Concepts documentation before starting development to familiarize yourself with the concepts.

This simple end-to-end code demonstrates the use of an event source (SQL database in this case) and event sink that implement the IEnumerable interface to create a working StreamInsight application example.

The StreamInsight engine is a server that can be embedded (in-memory) or remote (e.g. the Azure Service).  We first use Server.Create to create a server instance and return a handle to that instance.

using (Server server = Server.Create("Instance1"))
{
    Application application = server.CreateApplication("MyFirstApp");

    // First, define the event source data for the query by issuing a LINQ to Entities
    // query over the Northwind database, or a database of your choice (RecipeDB here).
    using (RecipeDBEntities recipedb = new RecipeDBEntities())
    {
        // Query all recipes that have a known create date, modification date and UserId
        // for an active recipe. It works the same way on real-time data or past recorded events.
        var databaseQuery = from o in recipedb.Table_Recipes
                            where o.DateModified.HasValue && o.DateCreated.HasValue
                                  && o.UserId != null
                                  && o.IsActive.HasValue && o.IsActive.Value == true
                            orderby o.DateModified.Value
                            select o;

        // Next, transform the query results into a stream of interval events whose start
        // and end times are defined by the recipe creation and modification timestamps.
        // Keep track of the UserId.
        var streamSource = databaseQuery
            .ToIntervalStream(application,
                o => IntervalEvent.CreateInsert(
                    o.DateCreated.Value,
                    o.DateModified.Value,
                    new { o.UserId }),
                AdvanceTimeSettings.IncreasingStartTime);

        // Next, write the time-aware StreamInsight query appropriate for the incoming
        // stream: find time intervals during which more than 3 recipes are in
        // process/updated for a user.
        var streamQuery = from o in streamSource
                          group o by o.UserId into g
                          from window in g.SnapshotWindow(SnapshotWindowOutputPolicy.Clip)
                          select new { RecipeCount = window.Count(), UserId = g.Key } into agg
                          where agg.RecipeCount > 3
                          select agg;

        // Next, convert the temporal query results into an enumerable result of interval
        // events. This example filters out CTI events and projects the relevant portions
        // of the interval event.
        var results = from intervalEvent in streamQuery.ToIntervalEnumerable()
                      where intervalEvent.EventKind != EventKind.Cti
                      select new
                      {
                          intervalEvent.StartTime,
                          intervalEvent.EndTime,
                          intervalEvent.Payload.RecipeCount,
                          intervalEvent.Payload.UserId
                      };

        // Consume the results of the query. Enumerating the results triggers the
        // underlying SQL Server and StreamInsight queries.
        foreach (var activeInterval in results)
        {
            Console.WriteLine("Between {0} and {1}, {2} recipes were updated by user '{3}'.",
                activeInterval.StartTime,
                activeInterval.EndTime,
                activeInterval.RecipeCount,
                activeInterval.UserId);
        }
        Console.ReadLine();
    }
}

I hope to continue this SI series, with the end objective of creating a StreamInsight (codename Austin) Azure service with front-end integration via MVC and SignalR, to process CEs and notify MVC app users of the latest results using SignalR. Stay tuned and check back for future posts, or subscribe to posts by email.

WCF Custom tool error: Failed to generate code for the service reference

Yesterday, I was updating my WCF service reference in an MVC project in Visual Studio 2012 and came across this very old error/warning:

“Custom tool error: Failed to generate code for the service reference”

Everything was working fine until I decided to update my WCF service references. The update failed, so I eventually undid my changes and tried again, but it failed even then! I realized that the WCF service reference updates had started failing even without any changes to the service, so it was pretty clear that something else, not my service change, was breaking things.

Lacking further details about this error, I researched and tried many options. I should mention that I am reusing my core assemblies in my MVC project rather than referencing the assemblies exposed by the WCF proxy. To achieve this behavior, while referencing the WCF service one should check the box labeled “Reuse types in referenced assemblies” (checked by default).


A few blogs suggested unchecking the above option to fix the issue. That would have been a big change for me, since if I don't reuse assemblies I would have to make a lot of changes to start using the proxy assemblies. I decided to keep the option checked, but this time I used the second radio button, “Reuse types in specified referenced assemblies”, and checked only my CORE project assemblies in the list below.

“Reuse types in specified referenced assemblies” lets you choose the specific assemblies you want to reuse. Once I selected it, everything worked like a charm again. In short, excluding certain types from serialization allowed the service reference to be added successfully.


I am adding this post to help someone who is stuck like I was.

Keeping together WCF WebRole and Website WebRole for Dev

This is one of the most common issues developers face with the Azure dev environment, and no one seems to provide a concrete answer. In the non-Azure .NET world, we are used to creating WCF services, hosting them in the local dev server or local IIS, and then adding a service reference to the website (be it MVC or ASP.NET).

In the Azure world, it works along the same lines. Even though you may later decide to deploy your WCF web role as a separate role in the cloud, in the dev environment keep the WCF web role (the service host) and the website web role (which is also your WCF client, or consumer) under the same cloud service.

To achieve this,

1) In the ServiceDefinition.csdef file of the cloud service, make sure that the ServiceHost web role is added before the website web role, or define an entry point for the ServiceHost web role.

2) If you are using .NET Framework 3.5, make sure to set the following in the behavior section of the service model (in higher versions it is set by default):

<useRequestHeadersForMetadataAddress />

3) Update the client endpoint's .svc reference to the Azure development fabric ServiceHost address and hit F5.
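For reference, the element from step 2 sits inside the service behavior configuration, roughly like this (a minimal sketch using a default, unnamed behavior):

```xml
<system.serviceModel>
  <behaviors>
    <serviceBehaviors>
      <behavior>
        <!-- Build metadata addresses from the incoming request's Host header -->
        <useRequestHeadersForMetadataAddress />
      </behavior>
    </serviceBehaviors>
  </behaviors>
</system.serviceModel>
```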

The WCF web role and the website run on the Azure dev fabric like a charm.

Happy Cloud Computing…