SignalR and user identity (authentication and authorization)


There are too many authentication types (Basic, Windows, Cookie, OAuth) to cover all of them here. In this sample, I focus on using Cookie Authentication to secure a website, a Persistent Connection, and a Hub. Authentication is configured on OWIN: you add a few NuGet packages and some code in Startup.cs. I started with a web project based on the MVC template. By default, it creates web forms to register users and enter credentials, configures an anti-forgery token for HTTP requests, and creates an Entity Framework repository for user identity. There is no template for the self-host server, so I created it using the previous project as a sample but removed pieces such as the anti-forgery token, MVC, and Entity Framework.

The important thing to remember is that OWIN takes care of authentication, and the frameworks on top of OWIN (SignalR, MVC, Web API, etc.) simply consume the user identity it provides. So if you can't see the identity in SignalR, the problem is in your OWIN configuration.
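As a rough sketch of the kind of Startup.cs configuration this involves (the namespace and login path below are placeholders; the template generates similar code for you):

using Microsoft.AspNet.Identity;
using Microsoft.Owin;
using Microsoft.Owin.Security.Cookies;
using Owin;

namespace MyApp
{
    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            // The cookie middleware issues and validates the authentication cookie;
            // every framework running on top of OWIN then sees the same user identity.
            app.UseCookieAuthentication(new CookieAuthenticationOptions
            {
                AuthenticationType = DefaultAuthenticationTypes.ApplicationCookie,
                LoginPath = new PathString("/Account/Login")
            });
        }
    }
}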

A SignalR Persistent Connection gives you access to the user identity by overriding the AuthorizeRequest method. The sample code below allows only authenticated users to create a persistent connection. You could add more logic to allow only certain user roles by calling request.User.IsInRole(string role).

using Microsoft.AspNet.SignalR;

namespace Common.Connections
{
  public class AuthorizeEchoConnection : PersistentConnection
  {
    // Reject the connection unless the OWIN-provided identity is authenticated.
    protected override bool AuthorizeRequest(IRequest request)
    {
      return request.User != null && request.User.Identity.IsAuthenticated;
    }

    ...

  }
}

A SignalR Hub exposes the user identity through Context.User. If you want to restrict access to a Hub to authenticated users, add the [Authorize] attribute. Do you want to allow only certain user roles? Add [Authorize(Roles="myRole")]. Do you want to allow only specific users? Add [Authorize(Users="myUser")].

using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

namespace Common.Hubs
{
  [Authorize]
  public class AuthorizeEchoHub : Hub
  {
    // Context.User is the identity established by the OWIN authentication middleware.
    public override Task OnConnected()
    {
      return Clients.Caller.hubReceived("Welcome " + Context.User.Identity.Name + "!");
    }

    ...

  }
}
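For completeness, the connection and hub above still need to be mapped in Startup.cs after the authentication middleware is configured. A minimal sketch (the "/echo" URL and the Startup placement are arbitrary choices for this sample):

using Common.Connections;
using Owin;

namespace Common
{
    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            // Register authentication middleware first (see above), then map SignalR
            // so request.User / Context.User are populated when AuthorizeRequest runs.
            app.MapSignalR<AuthorizeEchoConnection>("/echo");

            // Hubs (including AuthorizeEchoHub) are mapped at the default /signalr URL.
            app.MapSignalR();
        }
    }
}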

The full sample code is here. It contains a web host server and a self-host server. You can then use any of the following clients to authenticate and establish a SignalR connection:

  • JavaScript client connecting cross-domain
  • C# console client
  • C# Windows Phone client
  • C# Windows Store app

For more information, read the SignalR documentation about security.


Introducing Web Development and Tooling TV


We are very proud to present the first episode on our brand new Channel9 show – Web Development and Tooling TV. In this episode we talk to Scott Hunter who’s a Principal Group Program Manager on the Web Team (aka the boss). Scott introduces us to this new TV show and talks about what the team has been up to lately, as well as what’s to come in the future.

The content will be focused primarily around ASP.NET and Visual Studio web tooling. It’s our plan to interview a lot of the developers, testers and program managers on the team to keep you up-to-date with all the stuff we’re working on. We’ll even show some of the new prototypes for features not even released yet.

So if you are a web developer, then this new show is for you. If there are any subjects you think we should do a video on, please let us know in the comments below.

Remember to check out the first episode and subscribe to the show.

ASP.NET features in New Project Templates in Visual Studio 2013


The New Project templates in VS 2013 showcase best practices for using ASP.NET. The project templates provide a good starting point that a developer can build upon, or serve as a guide to see how the features are used. As the ASP.NET framework gains more features, it is important for us to make those features more discoverable.

This post lists the ASP.NET features you get when you create a project in VS 2013, with a brief introduction to each. I hope it serves as a starting point for learning about the new features so you can then drill down into the details.

In VS 2013, the team revamped the File – New Project experience and provided a simpler flow for creating a project. For a detailed walkthrough of creating a new ASP.NET application in VS 2013 and the different options available, please visit http://www.asp.net/visual-studio/overview/2013/creating-web-projects-in-visual-studio

Over the last few years the ASP.NET team has been releasing its features as NuGet packages, which can be downloaded from the NuGet gallery. Frameworks such as ASP.NET MVC, Web API, SignalR, Entity Framework, Identity, OWIN, etc. are all available as NuGet packages. Here is an overview of NuGet: http://docs.nuget.org/docs/start-here/overview. I strongly recommend getting more familiar with NuGet and using it as part of your developer workflow.

With the release of Visual Studio 2013, we have taken a step towards unifying the experience of using ASP.NET technologies, so that you can easily mix and match the ones you want. For example, you can start a project using MVC and easily add Web Forms pages to the project later, or scaffold Web APIs in a Web Forms project. One ASP.NET is all about making it easier for you as a developer to do the things you love in ASP.NET. No matter what technology you choose, you can have confidence that you are building on the trusted underlying framework of One ASP.NET.

The following features are common to all ASP.NET projects. You can use them in ASP.NET Web Forms, MVC, Web API, SPA, etc. Please note that some features might not be available depending on the Authentication option you chose. For example, Microsoft.AspNet.Identity is not available if you choose No Authentication. This post only lists the features that are available in new projects.

Functionality in the Project Templates

When you create an ASP.NET project in Visual Studio 2013, you get the following functionality regardless of what you choose: ASP.NET MVC, Web Forms, Web API, and SPA all include the same items.

  • Navigation: Basic navigation between the Home, About and Contact pages.
  • Theming using Bootstrap
  • Authentication
    • Windows Authentication: The template shows how users can log in with Windows Authentication.
    • Individual Accounts: The template shows how users can register a local account, log in with it, or log in with social providers such as Facebook, Twitter, Microsoft Account, and Google; you can also add more providers.
    • Organizational Accounts: The template shows how to build an application where Active Directory users can log in. It also shows how to query Active Directory to get or update user information.

Project_Readme.html page

When you create your project, the Project_Readme page opens by default. Its goal is to point you to the new features in ASP.NET and help you with the next steps after creating a project: how to customize the theme of your site, where to check in your code, how to deploy, and where to get help (the different forums and channels available).


ASP.NET Identity

ASP.NET Identity is the new membership system for ASP.NET applications. ASP.NET Identity makes it easy to integrate user-specific profile data with application data. ASP.NET Identity also allows you to choose the persistence model for user profiles in your application. You can store the data in a SQL Server database or another data store, including NoSQL data stores such as Windows Azure Storage Tables. For more information, see Individual User Accounts in Creating ASP.NET Web Projects in Visual Studio 2013.

NuGet packages:

  • Microsoft.AspNet.Identity.Core.1.0.0
  • Microsoft.AspNet.Identity.EntityFramework.1.0.0
  • Microsoft.AspNet.Identity.Owin.1.0.0

For more information and further documentation, please visit asp.net/identity.
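As a hedged sketch of the basic usage pattern, here is how a user can be created with the UserManager and the Entity Framework user store (ApplicationUser and ApplicationDbContext are the names the VS 2013 template generates; adjust them if your project uses different names):

using Microsoft.AspNet.Identity;
using Microsoft.AspNet.Identity.EntityFramework;

public class IdentityDemo
{
    // Creates a user with a password using the template's default types.
    public IdentityResult RegisterDemoUser()
    {
        var manager = new UserManager<ApplicationUser>(
            new UserStore<ApplicationUser>(new ApplicationDbContext()));
        return manager.Create(new ApplicationUser { UserName = "alice" }, "Pass@word1");
    }
}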

ASP.NET Web Optimization

The ASP.NET Web optimization framework provides services to improve the performance of your ASP.NET Web applications.
Current services provided by the framework include:

  • bundling - combining multiple script or style resources into a single resource, thereby requiring browsers to make fewer HTTP requests
  • minification - making scripts or styles smaller using techniques such as variable name shortening, white space elimination, etc.

NuGet package: Microsoft.AspNet.Web.Optimization.1.1.1

This package brings in the following dependencies (in the form of NuGet packages) as well – WebGrease and Antlr

Here is a short video about this feature: http://www.asp.net/aspnet/overview/optimization, and documentation to get started: http://aspnetoptimization.codeplex.com/documentation
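As a rough sketch of what a bundle registration looks like (the bundle names and file paths below are illustrative; the template's App_Start\BundleConfig.cs contains the real ones):

using System.Web.Optimization;

public class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        // Scripts in a bundle are served from one virtual URL and minified in release builds.
        bundles.Add(new ScriptBundle("~/bundles/jquery").Include(
            "~/Scripts/jquery-{version}.js"));

        bundles.Add(new StyleBundle("~/Content/css").Include(
            "~/Content/bootstrap.css",
            "~/Content/site.css"));
    }
}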

ASP.NET Universal Providers

ASP.NET Universal Providers are used to store session data for your application. To use them, install the Microsoft.AspNet.Providers.Core package.

Entity Framework

Entity Framework (EF) is an object-relational mapper that enables .NET developers to work with relational data using domain-specific objects. It eliminates the need for most of the data-access code that developers usually need to write. Please visit http://www.asp.net/entity-framework to learn more.

NuGet package: EntityFramework

Authentication

ASP.NET projects support the following Authentication options. Please visit http://www.asp.net/visual-studio/overview/2013/creating-web-projects-in-visual-studio#auth to learn more about configuring Authentication.


  • No Authentication
  • Individual User Accounts (ASP.NET membership or social provider log in)
  • Organizational Accounts (Active Directory in an internet application)
  • Windows Authentication (Active Directory in an intranet application)

Microsoft OWIN Components

When you configure Individual Accounts for MVC, Web Forms, or Web API, or Organizational Accounts for Web API, the project templates use the following OWIN authentication components. The names of these components are also the names of their NuGet packages.

  • Microsoft.Owin.Security.ActiveDirectory: Enables authentication using on-premise or cloud-based directory services.
  • Microsoft.Owin.Security.Cookies: Enables authentication using cookies. This package was previously named Microsoft.Owin.Security.Forms.
  • Microsoft.Owin.Security.Facebook: Enables authentication using Facebook's OAuth-based service.
  • Microsoft.Owin.Security.Google: Enables authentication using Google's OpenID-based service.
  • Microsoft.Owin.Security.Jwt: Enables authentication using JWT tokens.
  • Microsoft.Owin.Security.MicrosoftAccount: Enables authentication using Microsoft accounts.
  • Microsoft.Owin.Security.OAuth: Provides an OAuth authorization server as well as middleware for authenticating bearer tokens.
  • Microsoft.Owin.Security.Twitter: Enables authentication using Twitter's OAuth-based service.

To learn more about Microsoft OWIN and Katana, please visit http://www.asp.net/aspnet/overview/owin-and-katana. A sketch of how the templates typically wire these components up appears below.
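This sketch approximates the generated App_Start\Startup.Auth.cs; the login path and the commented-out provider keys are placeholders to fill in:

using Microsoft.AspNet.Identity;
using Microsoft.Owin;
using Microsoft.Owin.Security.Cookies;
using Owin;

public partial class Startup
{
    public void ConfigureAuth(IAppBuilder app)
    {
        // Cookie used for local (Individual Accounts) sign-in.
        app.UseCookieAuthentication(new CookieAuthenticationOptions
        {
            AuthenticationType = DefaultAuthenticationTypes.ApplicationCookie,
            LoginPath = new PathString("/Account/Login")
        });

        // Temporary cookie that carries the identity returned by an external provider.
        app.UseExternalSignInCookie(DefaultAuthenticationTypes.ExternalCookie);

        // External providers stay commented out until you supply real keys.
        // app.UseFacebookAuthentication(appId: "", appSecret: "");
        // app.UseTwitterAuthentication(consumerKey: "", consumerSecret: "");
        // app.UseGoogleAuthentication();
    }
}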

jQuery

All project templates use the jQuery library.

NuGet package: jQuery

 

Bootstrap

All project templates are built with Bootstrap 3. Bootstrap is a responsive CSS framework used to build web sites. Bootstrap is bundled into your application through the Web Optimization feature; you can look in App_Start\BundleConfig.cs to see how the Bootstrap files are bundled.

NuGet package: bootstrap (http://www.nuget.org/packages/bootstrap)

Please visit http://www.asp.net/visual-studio/overview/2013/creating-web-projects-in-visual-studio#bootstrap to learn more about this integration.

RespondJs

The templates use Bootstrap 3, in which some responsive features do not work in IE 8 and 9 (see the support matrix from Bootstrap: http://getbootstrap.com/getting-started/#browsers). To ensure that ASP.NET applications remain responsive on these older versions of IE, we use respond.js; this library is bundled with Bootstrap in App_Start\BundleConfig.cs.

NuGet package: Respond.1.2.0.

Newtonsoft.Json

This is an open source JSON framework for .NET. Please visit http://json.codeplex.com/ for more information.

NuGet package: Newtonsoft.Json

Modernizr

This is an open source JavaScript library that can be used to detect HTML and CSS features in the browser.

NuGet package: Modernizr

 

The following features are specific to particular frameworks.

ASP.NET Web Forms Project Template

ASP.NET FriendlyURLs

ASP.NET FriendlyUrls makes it easy to generate friendly URLs (without extensions) and also adds view-switching functionality so you can easily switch between a mobile view and a desktop view.

NuGet package:  Microsoft.AspNet.FriendlyUrls

Please visit http://www.asp.net/aspnet/overview/friendly-urls for more information.
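A minimal sketch of enabling friendly URLs in App_Start\RouteConfig.cs (the settings shown are one common option, not the only one):

using System.Web.Routing;
using Microsoft.AspNet.FriendlyUrls;

public static class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        var settings = new FriendlyUrlSettings
        {
            // Permanently redirect extension URLs (e.g. /About.aspx) to their friendly form (/About).
            AutoRedirectMode = RedirectMode.Permanent
        };
        routes.EnableFriendlyUrls(settings);
    }
}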

ASP.NET ScriptManager packages for jQuery, Microsoft Ajax and WebForms JavaScript libraries

There were many improvements to the ScriptManager control to support Web Optimization. The improvements allow registration of bundles with ScriptManager, and the project template shows how you can register jQuery bundles with ScriptManager (a rough sketch follows the package list below). The following post has more information on the improvements.

http://blogs.msdn.com/b/webdev/archive/2012/09/21/asp-net-4-5-scriptmanager-improvements-in-webforms.aspx

NuGet packages:

  • AspNet.ScriptManager.jQuery.1.10.2
  • Microsoft.AspNet.ScriptManager.MSAjax.5.0.0
  • Microsoft.AspNet.ScriptManager.WebForms.5.0.0
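Under the covers, the AspNet.ScriptManager.* packages register named script mappings roughly like the following sketch (the paths are the usual ones for jQuery 1.10.2; treat this as an illustration rather than the packages' exact code). A page can then reference the script with <asp:ScriptReference Name="jquery" /> inside its ScriptManager.

using System.Web.UI;

public static class ScriptMappings
{
    public static void Register()
    {
        // Maps the name "jquery" to physical script paths; ScriptManager picks the
        // debug or release path automatically based on compilation settings.
        ScriptManager.ScriptResourceMapping.AddDefinition("jquery",
            new ScriptResourceDefinition
            {
                Path = "~/Scripts/jquery-1.10.2.min.js",
                DebugPath = "~/Scripts/jquery-1.10.2.js"
            });
    }
}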

ASP.NET MVC Project Template

ASP.NET MVC

ASP.NET MVC gives you a powerful, patterns-based way to build dynamic websites that enables a clean separation of concerns and gives you full control over markup for enjoyable, agile development.

NuGet packages (along with their dependencies):

  • Microsoft.AspNet.MVC
  • Microsoft.AspNet.WebPages
  • Microsoft.AspNet.Razor

ASP.NET MVC Unobtrusive Libraries

The following package adds support for unobtrusive client-side validation in ASP.NET MVC.

NuGet package: Microsoft.jQuery.Unobtrusive.Validation

 

ASP.NET Web API Project Template

ASP.NET Web API

ASP.NET Web API is a framework that makes it easy to build HTTP services that reach a broad range of clients, including browsers and mobile devices. ASP.NET Web API is an ideal platform for building RESTful applications on the .NET Framework. To learn more please visit http://www.asp.net/web-api
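As a hedged illustration of the kind of controller Web API is built around (the class below is hypothetical and not part of the template, which ships its own sample controller):

using System.Collections.Generic;
using System.Web.Http;

public class ProductsController : ApiController
{
    // Handled by convention for GET api/products
    public IEnumerable<string> Get()
    {
        return new[] { "Bread", "Milk" };
    }

    // Handled by convention for GET api/products/5
    public string Get(int id)
    {
        return "Product " + id;
    }
}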

NuGet packages (and dependencies):

  • Microsoft.AspNet.WebAPI
  • Microsoft.AspNet.WebAPI.WebHost
  • Microsoft.AspNet.WebAPI.Core
  • Microsoft.AspNet.WebAPI.Client

Help Page

The project template has the Help Page built in. The help page allows a service developer to view metadata about the service. For more information, please visit http://www.asp.net/web-api/overview/creating-web-apis/creating-api-help-pages

NuGet package: Microsoft.AspNet.WebAPI.HelpPage

Conclusion

This was an exhaustive list of the features included in the templates. I hope it is useful and at least gives you an idea of what is installed when you create a project in VS 2013. I also hope the getting-started links will help you learn more about these features.

Please let us know if this was useful or you would like to see something more.

I can be reached via twitter (@rustd)

Customizing profile information in ASP.NET Identity in VS 2013 templates


ASP.NET Identity is the new membership system for ASP.NET applications. To learn more about ASP.NET Identity please visit http://blogs.msdn.com/b/webdev/archive/2013/06/27/introducing-asp-net-identity-membership-system-for-asp-net-applications.aspx

One of the mainline features of ASP.NET Identity is making it easy to add profile information about the user. In the existing ASP.NET Membership system, User and Profile were separate tables, and profile information about the user was retrieved using the Profile provider. This made it difficult to customize the profile information and associate it with the User and application data. ASP.NET Identity makes it really easy to add profile data for your user.

[Update] Added information on how to store profile information in a different table.

[Update] Rick Anderson has a tutorial on asp.net site which also shows how you can add Social Logins to the template as well. Please refer to the tutorial for an entire walkthrough

http://www.asp.net/mvc/tutorials/mvc-5/create-an-aspnet-mvc-5-app-with-facebook-and-google-oauth2-and-openid-sign-on


This post shows how you can customize profile information about the user when you start with the project templates in Visual Studio 2013. Here are the steps to add profile information about the user.

    • In Visual Studio 2013, create a new ASP.NET MVC application.
      • You can create a Web Forms application and follow the same steps to add profile information.
    • Run the application and register a user
      • You will notice that the registration screen only asks for a user name and password. Let's say you want to add one more field, BirthDate, that users must specify when registering.
    • Enable Entity Framework Migrations
      • Migrations are recommended when you are modifying the database schema or changing the Code First model. ASP.NET Identity uses Entity Framework Code First as its underlying framework. For more information on EF migrations please visit http://msdn.microsoft.com/en-us/data/jj591621
      • Open the Package Manager Console via Tools – Library Package Manager – Package Manager Console
      • Type in “Enable-Migrations” and press enter
    • Add properties for the Profile information that you want to store.
      • In our example we will add BirthDate as a property
      • Go to Models\IdentityModels.cs and add the following property to ApplicationUser:

        public class ApplicationUser : IdentityUser
        {
            public DateTime BirthDate { get; set; }
        }
      • Add using System; at the top of the class.
    • Add a migration to modify the database
      • Since we have added a new property to the user, we need to update the database to reflect this change. This is where EF Migrations is really useful. You can use migrations to update the database schema as follows.
      • Go to the Package Manager Console and run the following commands:
      • Add-Migration "Birthdate"
        • This will add a migration file to your project
      • Update-Database
        • This command will run all the migration files and update the database schema to add a BirthDate column
    • Modify the application to add a BirthDate field
      • The steps here are specific to how you would add a view to MVC, but you can do similar view specific rendering in Web Forms as well
      • Add BirthDate to RegisterViewModel in Models\AccountViewModels.cs
        public class RegisterViewModel
        {
            [Required]
            [Display(Name = "User name")]
            public string UserName { get; set; }

            [Required]
            [StringLength(100, ErrorMessage = "The {0} must be at least {2} characters long.", MinimumLength = 6)]
            [DataType(DataType.Password)]
            [Display(Name = "Password")]
            public string Password { get; set; }

            [DataType(DataType.Password)]
            [Display(Name = "Confirm password")]
            [Compare("Password", ErrorMessage = "The password and confirmation password do not match.")]
            public string ConfirmPassword { get; set; }

            public DateTime BirthDate { get; set; }
        }

        • Add using System; at the top of the class.
      • Update the Register view in Views\Account to add a field to enter BirthDate.
        <div class="form-group">
            @Html.LabelFor(m => m.BirthDate, new { @class = "col-md-2 control-label" })
            <div class="col-md-10">
                @Html.TextBoxFor(m => m.BirthDate, new { @class = "form-control" })
            </div>
        </div>
      • Update the Register Action in AccountController to store the BirthDate as well for the User
var user = new ApplicationUser() { UserName = model.UserName, BirthDate=model.BirthDate };
  • Run the application and register a user again. You will see the BirthDate field as shown below

 

  • Display the profile information on the page
    • When the user logs in, you can display the profile information by doing the following:
    • Get the current logged-in UserId so you can look the user up in the ASP.NET Identity system
      • var currentUserId = User.Identity.GetUserId();
    • Instantiate the UserManager from ASP.NET Identity so you can look up the user in the system
      • var manager = new UserManager<ApplicationUser>(new UserStore<ApplicationUser>(new ApplicationDbContext()));
    • Get the User object
      • var currentUser = manager.FindById(User.Identity.GetUserId());
    • Get the profile information about the user
      • currentUser.BirthDate

Adding Profile information in a different table.

So far this post has shown a way to add profile properties to the user table itself. Since ASP.NET Identity uses Entity Framework, you can customize profile information however you would like. For example, if you want to store profile information in a different table rather than the same table, you can change your model as follows.

Add a new class to store Profile information

Code Snippet
    public class MyUser : IdentityUser
    {
        public virtual MyUserInfo MyUserInfo { get; set; }
    }

    public class MyUserInfo
    {
        public int Id { get; set; }
        public string FirstName { get; set; }
        public string LastName { get; set; }
    }

    public class MyDbContext : IdentityDbContext<MyUser>
    {
        public MyDbContext()
            : base("DefaultConnection")
        {
        }

        public System.Data.Entity.DbSet<MyUserInfo> MyUserInfo { get; set; }
    }

In this case, when you run the application, the FirstName and LastName are stored in a separate table called MyUserInfo.


Getting Profile information

  • When the User Logs in, you can display the profile information by doing the following
  • Get the current logged in UserId, so you can look the user up in ASP.NET Identity system
    • var currentUserId = User.Identity.GetUserId();
  • Instantiate the UserManager from ASP.NET Identity so you can look up the user in the system
    • var manager = new UserManager<MyUser>(new UserStore<MyUser>(new MyDbContext()));
  • Get the User object
    • var currentUser = manager.FindById(User.Identity.GetUserId());
  • Get the profile information about the user (a controller action putting these steps together is sketched below)
    • currentUser.MyUserInfo.FirstName
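Putting the steps above together, a controller action might look like the following sketch (ProfileController is a hypothetical name; MyUser and MyDbContext come from the model shown earlier):

using System.Web.Mvc;
using Microsoft.AspNet.Identity;
using Microsoft.AspNet.Identity.EntityFramework;

public class ProfileController : Controller
{
    [Authorize]
    public ActionResult Index()
    {
        var manager = new UserManager<MyUser>(new UserStore<MyUser>(new MyDbContext()));
        var currentUser = manager.FindById(User.Identity.GetUserId());

        // MyUserInfo is declared virtual, so Entity Framework lazy-loads it here.
        ViewBag.FirstName = currentUser.MyUserInfo.FirstName;
        return View();
    }
}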

Conclusion

This post showed how you can customize the profile information about the user. If you have any questions, please leave a comment or reach me on Twitter (@rustd).

Get more information from Social providers used in the VS 2013 project templates


When you create a new ASP.NET project in VS 2013 and choose Individual Accounts, the template shows how you can log in with social providers such as Microsoft Account, Facebook, Google and Twitter. When you log in with one of these providers, such as Facebook, you can request more information about the user, such as the user's picture, friends, etc. If the user allows your app to access this data, you can use it to provide a rich experience in your site.

In this post I am going to show how you can request more data (or scopes) when a user logs in via the Facebook provider. This post assumes that you have enabled Facebook login and are familiar with the basic walkthrough of Facebook login. You can visit http://www.asp.net/mvc/tutorials/mvc-5/create-an-aspnet-mvc-5-app-with-facebook-and-google-oauth2-and-openid-sign-on to see a basic walkthrough of how to enable Facebook login in the template.

[Update]: I have updated the following code to fix some issues that were reported.

Following are the steps to request more scopes from Facebook:

Code Snippet
  1.  List<string> scope = new List<string>() { "email", "user_about_me", "user_hometown", "friends_about_me", "friends_photos" };
  2.  var x = new FacebookAuthenticationOptions();
  3.  x.Scope.Add("email");
  4.  x.Scope.Add("friends_about_me");
  5.  x.Scope.Add("friends_photos");
  6.  x.AppId = "636919159681109";
  7.  x.AppSecret = "f3c16511fe95e854cf5885c10f83f26f";
  8.  x.Provider = new FacebookAuthenticationProvider()
  9.  {
  10.     OnAuthenticated = async context =>
  11.     {
  12.         // Get the access token from FB and store it in the database and
  13.         // use the Facebook C# SDK to get more information about the user
  14.         context.Identity.AddClaim(
  15.             new System.Security.Claims.Claim("FacebookAccessToken",
  16.                                              context.AccessToken));
  17.     }
  18. };
  19.  x.SignInAsAuthenticationType = DefaultAuthenticationTypes.ExternalCookie;
  20.  app.UseFacebookAuthentication(x);

 

    • Lines 10-17 hook into the OnAuthenticated event of the Facebook OWIN authentication middleware. This method is called each time a user authenticates with Facebook. When the user is authenticated and has granted this app access to the data, all the data is stored in the FacebookContext in the Facebook authentication middleware.
    • Line 14 also stores the FacebookAccessToken, which we get from Facebook and will use later to get the user's friends information.
    • Note: In this example I am storing all the data as claims, but you can also persist it in the database using ASP.NET Identity.

 

  • Let us add some code which uses this FacebookAccessToken and gets the list of friends and their pictures

 

    • You can add a link in Views\Shared\_LoginPartial.cshtml
      <li>
          @Html.ActionLink("FacebookInfo", "FacebookInfo", "Account")
      </li>
    • When a user logs in and clicks this link, we will execute code to get the list of Friends and their pictures

 

    • Get the claims from the UserIdentity and store them in the database. In this code we read the claims that we got from the OWIN middleware and store the FacebookAccessToken claim in the ASP.NET Identity membership database.
Code Snippet
  1. //
  2. // GET: /Account/LinkLoginCallback
  3. public async Task<ActionResult> LinkLoginCallback()
  4. {
  5.     var loginInfo = await AuthenticationManager.GetExternalLoginInfoAsync(XsrfKey, User.Identity.GetUserId());
  6.     if (loginInfo == null)
  7.     {
  8.         return RedirectToAction("Manage", new { Message = ManageMessageId.Error });
  9.     }
  10.     var result = await UserManager.AddLoginAsync(User.Identity.GetUserId(), loginInfo.Login);
  11.     if (result.Succeeded)
  12.     {
  13.         var currentUser = await UserManager.FindByIdAsync(User.Identity.GetUserId());
  14.         // Add the Facebook claim
  15.         await StoreFacebookAuthToken(currentUser);
  16.         return RedirectToAction("Manage");
  17.     }
  18.     return RedirectToAction("Manage", new { Message = ManageMessageId.Error });
  19. }
      • Lines 14-15 store the FacebookAccessToken in the ASP.NET Identity database.
      • StoreFacebookAuthToken gets the claims from the UserIdentity and persists the AccessToken in the database as a user claim. The LinkLoginCallback action is called when a logged-in user associates another login provider.

      • The ExternalLoginConfirmation action is called when you log in with the Facebook provider for the first time. In line 26, once the user is created, we add a new line to add the FacebookAccessToken as a claim for the user.

Code Snippet
  1. [HttpPost]
  2. [AllowAnonymous]
  3. [ValidateAntiForgeryToken]
  4. public async Task<ActionResult> ExternalLoginConfirmation(ExternalLoginConfirmationViewModel model, string returnUrl)
  5. {
  6.     if (User.Identity.IsAuthenticated)
  7.     {
  8.         return RedirectToAction("Manage");
  9.     }
  10.  
  11.     if (ModelState.IsValid)
  12.     {
  13.         // Get the information about the user from the external login provider
  14.         var info = await AuthenticationManager.GetExternalLoginInfoAsync();
  15.         if (info == null)
  16.         {
  17.             return View("ExternalLoginFailure");
  18.         }
  19.         var user = new ApplicationUser() { UserName = model.Email };
  20.         var result = await UserManager.CreateAsync(user);
  21.         if (result.Succeeded)
  22.         {
  23.             result = await UserManager.AddLoginAsync(user.Id, info.Login);
  24.             if (result.Succeeded)
  25.             {
  26.                 await StoreFacebookAuthToken(user);
  27.                 await SignInAsync(user, isPersistent: false);
  28.                 return RedirectToLocal(returnUrl);
  29.             }
  30.         }
  31.         AddErrors(result);
  32.     }
  33.  
  34.     ViewBag.ReturnUrl = returnUrl;
  35.     return View(model);
  36. }
      • The ExternalLoginCallback action is called when the user returns from signing in with an external login provider. In line 17 we add a new line to add the FacebookAccessToken as a claim for the user.
Code Snippet
  1. //
  2. // GET: /Account/ExternalLoginCallback
  3. [AllowAnonymous]
  4. public async Task<ActionResult> ExternalLoginCallback(string returnUrl)
  5. {
  6.     var loginInfo = await AuthenticationManager.GetExternalLoginInfoAsync();
  7.     if (loginInfo == null)
  8.     {
  9.         return RedirectToAction("Login");
  10.     }
  11.  
  12.     // Sign in the user with this external login provider if the user already has a login
  13.     var user = await UserManager.FindAsync(loginInfo.Login);
  14.     if (user != null)
  15.     {
  16.         // Save the FacebookToken in the database if it is not already there
  17.         await StoreFacebookAuthToken(user);
  18.         await SignInAsync(user, isPersistent: false);
  19.         return RedirectToLocal(returnUrl);
  20.     }
  21.     else
  22.     {
  23.         // If the user does not have an account, then prompt the user to create an account
  24.         ViewBag.ReturnUrl = returnUrl;
  25.         ViewBag.LoginProvider = loginInfo.Login.LoginProvider;
  26.         return View("ExternalLoginConfirmation", new ExternalLoginConfirmationViewModel { Email = loginInfo.Email });
  27.     }
  28. }
    • Store the FacebookAccessToken as a User Claim in the ASP.NET Identity database
Code Snippet
  private async Task StoreFacebookAuthToken(ApplicationUser user)
  {
      var claimsIdentity = await AuthenticationManager.GetExternalIdentityAsync(DefaultAuthenticationTypes.ExternalCookie);
      if (claimsIdentity != null)
      {
          // Retrieve the existing claims for the user and add the FacebookAccessToken claim
          var currentClaims = await UserManager.GetClaimsAsync(user.Id);
          var facebookAccessToken = claimsIdentity.FindAll("FacebookAccessToken").First();
          if (currentClaims.Count() <= 0)
          {
              await UserManager.AddClaimAsync(user.Id, facebookAccessToken);
          }
      }
  }

 

    • Add the following to AccountViewModel.cs for view model binding
      public class FacebookViewModel
      {
          [Required]
          [Display(Name = "Friend's name")]
          public string Name { get; set; }

          public string ImageURL { get; set; }
      }

 

    • Add the following code to AccountController to get the pictures of your friends
Code Snippet
  // GET: Account/FacebookInfo
  [Authorize]
  public async Task<ActionResult> FacebookInfo()
  {
      var claimsforUser = await UserManager.GetClaimsAsync(User.Identity.GetUserId());
      var access_token = claimsforUser.FirstOrDefault(x => x.Type == "FacebookAccessToken").Value;
      var fb = new FacebookClient(access_token);
      dynamic myInfo = fb.Get("/me/friends");
      var friendsList = new List<FacebookViewModel>();
      foreach (dynamic friend in myInfo.data)
      {
          friendsList.Add(new FacebookViewModel()
          {
              Name = friend.name,
              ImageURL = @"https://graph.facebook.com/" + friend.id + "/picture?type=large"
          });
      }

      return View(friendsList);
  }
        

 

    • Add the following View to display the list of friends and the pictures
      • Create a View called FacebookInfo.cshtml under Views\Account folder and add the following markup
      @model IList<WebApplication96.Models.FacebookViewModel>
      @if (Model.Count > 0)
      {
          <h3>List of friends</h3>
          <div class="row">
              @foreach (var friend in Model)
              {
                  <div class="col-md-3">
                      <a href="#" class="thumbnail">
                          <img src="@friend.ImageURL" alt="@friend.Name" />
                      </a>
                  </div>
              }
          </div>
      }
  • At this point you are all set to see your friends' information.
  • Run the project and log in using Facebook. You will be taken to the Facebook site; once you successfully log in and grant this app permission to access the data, you will be redirected back to the application.
  • When you click the FacebookInfo link you should see a page that looks like the following:


Conclusion

This was an easy way to extend the social providers and get more information about the logged-in user so you can provide a rich experience for your site's users. You can do the same with the other social providers. If you have any questions, please visit the asp.net/forums or reach me via Twitter (@rustd).

Attribute Routing in ASP.NET MVC 5


Routing is how ASP.NET MVC matches a URI to an action. MVC 5 supports a new type of routing, called attribute routing. As the name implies, attribute routing uses attributes to define routes. Attribute routing gives you more control over the URIs in your web application.

The earlier style of routing, called convention-based routing, is still fully supported. In fact, you can combine both techniques in the same project.

This post will cover the basic features and options of Attribute Routing, in ASP.NET MVC 5.

Why Attribute Routing?

Attribute routing is useful when you need fine-grained control over the URIs in your application. For example, a socially enhanced e-commerce website could have the following routes:

  • {productId:int}/{productTitle}
    Mapped to ProductsController.Show(int id)
  • {username}
    Mapped to ProfilesController.Show(string username)
  • {username}/catalogs/{catalogId:int}/{catalogTitle}
    Mapped to CatalogsController.Show(string username, int catalogId)

(Don’t mind the specific syntax right now, we will touch on this later.)   

In previous versions of ASP.NET MVC, these rules would be set in the RouteConfig.cs file and point to the actual controller actions, like this:

  • routes.MapRoute(
  •     name: "ProductPage",
  •     url: "{productId}/{productTitle}",
  •     defaults: new { controller = "Products", action = "Show" },
  •     constraints: new { productId = "\\d+" }
  • );

When route definitions are co-located with the actions, within the same source file rather than declared in an external configuration class, it is easier to reason about the mapping between URIs and actions. The previous route definition can be expressed with the following simple attribute:

  • [Route("{productId:int}/{productTitle}")]
  • public ActionResult Show(int productId) { ... }

Enabling Attribute Routing

To enable attribute routing, call MapMvcAttributeRoutes during configuration.

  • public class RouteConfig
  • {
  •     public static void RegisterRoutes(RouteCollection routes)
  •     {
  •         routes.IgnoreRoute("{resource}.axd/{*pathInfo}");
  •  
  •         routes.MapMvcAttributeRoutes();
  •     }
  • }

You can also combine attribute routing with convention-based routing.

  • public static void RegisterRoutes(RouteCollection routes)
  • {
  •     routes.IgnoreRoute("{resource}.axd/{*pathInfo}");
  •  
  •     routes.MapMvcAttributeRoutes();
  •  
  •     routes.MapRoute(
  •         name: "Default",
  •         url: "{controller}/{action}/{id}",
  •         defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
  •     );
  • }

Optional URI Parameters and Default Values

You can make a URI parameter optional by adding a question mark to the route parameter. You can also specify a default value by using the form parameter=value.

  • public class BooksController : Controller
  • {
  •     // eg: /books
  •     // eg: /books/1430210079
  •     [Route("books/{isbn?}")]
  •     public ActionResult View(string isbn)
  •     {
  •         if (!String.IsNullOrEmpty(isbn))
  •         {
  •             return View("OneBook", GetBook(isbn));
  •         }
  •         return View("AllBooks", GetBooks());
  •     }
  •  
  •     // eg: /books/lang
  •     // eg: /books/lang/en
  •     // eg: /books/lang/he
  •     [Route("books/lang/{lang=en}")]
  •     public ActionResult ViewByLanguage(string lang)
  •     {
  •         return View("OneBook", GetBooksByLanguage(lang));
  •     }
  • }

In this example, both /books and /books/1430210079 will route to the “View” action, the former will result with listing all books, and the latter will list the specific book. Both /books/lang and /books/lang/en will be treated the same.

Route Prefixes

Often, the routes in a controller all start with the same prefix. For example:

  • public class ReviewsController : Controller
  • {
  •     // eg: /reviews
  •     [Route("reviews")]
  •     public ActionResult Index() { ... }
  •     // eg: /reviews/5
  •     [Route("reviews/{reviewId}")]
  •     public ActionResult Show(int reviewId) { ... }
  •     // eg: /reviews/5/edit
  •     [Route("reviews/{reviewId}/edit")]
  •     public ActionResult Edit(int reviewId) { ... }
  • }

You can set a common prefix for an entire controller by using the [RoutePrefix] attribute:

  • [RoutePrefix("reviews")]
  • public class ReviewsController : Controller
  • {
  •     // eg.: /reviews
  •     [Route]
  •     public ActionResult Index() { ... }
  •     // eg.: /reviews/5
  •     [Route("{reviewId}")]
  •     public ActionResult Show(int reviewId) { ... }
  •     // eg.: /reviews/5/edit
  •     [Route("{reviewId}/edit")]
  •     public ActionResult Edit(int reviewId) { ... }
  • }

Use a tilde (~) on the method attribute to override the route prefix if needed:

  • [RoutePrefix("reviews")]
  • public class ReviewsController : Controller
  • {
  •     // eg.: /spotlight-review
  •     [Route("~/spotlight-review")]
  •     public ActionResult ShowSpotlight() { ... }
  •  
  •     ...
  • }

Default Route

You can also apply the [Route] attribute on the controller level, capturing the action as a parameter. That route would then be applied on all actions in the controller, unless a specific [Route] has been defined on a specific action, overriding the default set on the controller.

  • [RoutePrefix("promotions")]
  • [Route("{action=index}")]
  • public class ReviewsController : Controller
  • {
  •     // eg.: /promotions
  •     public ActionResult Index() { ... }
  •  
  •     // eg.: /promotions/archive
  •     public ActionResult Archive() { ... }
  •  
  •     // eg.: /promotions/new
  •     public ActionResult New() { ... }
  •  
  •     // eg.: /promotions/edit/5
  •     [Route("edit/{promoId:int}")]
  •     public ActionResult Edit(int promoId) { ... }
  • }

Route Constraints

Route constraints let you restrict how the parameters in the route template are matched. The general syntax is {parameter:constraint}. For example:

  • // eg: /users/5
  • [Route("users/{id:int}"]
  • public ActionResult GetUserById(int id) { ... }
  •  
  • // eg: users/ken
  • [Route("users/{name}"]
  • public ActionResult GetUserByName(string name) { ... }

Here, the first route will only be selected if the "id" segment of the URI is an integer. Otherwise, the second route will be chosen.

The following table lists the constraints that are supported.

 

  • alpha - Matches uppercase or lowercase Latin alphabet characters (a-z, A-Z). Example: {x:alpha}
  • bool - Matches a Boolean value. Example: {x:bool}
  • datetime - Matches a DateTime value. Example: {x:datetime}
  • decimal - Matches a decimal value. Example: {x:decimal}
  • double - Matches a 64-bit floating-point value. Example: {x:double}
  • float - Matches a 32-bit floating-point value. Example: {x:float}
  • guid - Matches a GUID value. Example: {x:guid}
  • int - Matches a 32-bit integer value. Example: {x:int}
  • length - Matches a string with the specified length or within a specified range of lengths. Examples: {x:length(6)}, {x:length(1,20)}
  • long - Matches a 64-bit integer value. Example: {x:long}
  • max - Matches an integer with a maximum value. Example: {x:max(10)}
  • maxlength - Matches a string with a maximum length. Example: {x:maxlength(10)}
  • min - Matches an integer with a minimum value. Example: {x:min(10)}
  • minlength - Matches a string with a minimum length. Example: {x:minlength(10)}
  • range - Matches an integer within a range of values. Example: {x:range(10,50)}
  • regex - Matches a regular expression. Example: {x:regex(^\d{3}-\d{3}-\d{4}$)}

Notice that some of the constraints, such as "min", take arguments in parentheses.

You can apply multiple constraints to a parameter, separated by a colon, for example:

  • // eg: /users/5
    // but not /users/10000000000 because it is larger than int.MaxValue,
    // and not /users/0 because of the min(1) constraint.
  • [Route("users/{id:int:min(1)}")]
  • public ActionResult GetUserById(int id) { ... }
Custom Route Constraints

You can create custom route constraints by implementing the IRouteConstraint interface. For example, the following constraint restricts a parameter to a set of valid values:

  • public class ValuesConstraint : IRouteConstraint
  • {
  •     private readonly string[] validOptions;
  •     public ValuesConstraint(string options)
  •     {
  •         validOptions = options.Split('|');
  •     }
  •  
  •     public bool Match(HttpContextBase httpContext, Route route, string parameterName, RouteValueDictionary values, RouteDirection routeDirection)
  •     {
  •         object value;
  •         if (values.TryGetValue(parameterName, out value) && value != null)
  •         {
  •             return validOptions.Contains(value.ToString(), StringComparer.OrdinalIgnoreCase);
  •         }
  •         return false;
  •     }
  • }

The following code shows how to register the constraint:

  • public class RouteConfig
  • {
  •     public static void RegisterRoutes(RouteCollection routes)
  •     {
  •         routes.IgnoreRoute("{resource}.axd/{*pathInfo}");
  •  
  •         var constraintsResolver = new DefaultInlineConstraintResolver();
  •  
  •         constraintsResolver.ConstraintMap.Add("values", typeof(ValuesConstraint));
  •  
  •         routes.MapMvcAttributeRoutes(constraintsResolver);
  •     }
  • }

Now you can apply the constraint in your routes:

  • public class TemperatureController : Controller
  • {
  •     // eg: temp/celsius and /temp/fahrenheit but not /temp/kelvin
  •     [Route("temp/{scale:values(celsius|fahrenheit)}")]
  •     public ActionResult Show(string scale)
  •     {
  •         return Content("scale is " + scale);
  •     }
  • }

Route Names

You can specify a name for a route in order to easily generate URIs for it. For example, for the following route:

  • [Route("menu", Name = "mainmenu")]
  • public ActionResult MainMenu() { ... }

you could generate a link using Url.RouteUrl:

  • <a href="@Url.RouteUrl("mainmenu")">Main menu</a>

Areas

You can define that a controller belongs to an area by using the [RouteArea] attribute. When doing so, you can safely remove the AreaRegistration class for that area.

  • [RouteArea("Admin")]
  • [RoutePrefix("menu")]
  • [Route("{action}")]
  • public class MenuController : Controller
  • {
  •     // eg: /admin/menu/login
  •     public ActionResult Login() { ... }
  •  
  •     // eg: /admin/menu/show-options
  •     [Route("show-options")]
  •     public ActionResult Options() { ... }
  •  
  •     // eg: /stats
  •     [Route("~/stats")]
  •     public ActionResult Stats() { ... }
  • }

With this controller, the following link-generation call will produce the string "/Admin/menu/show-options":

  • Url.Action("Options", "Menu", new { Area = "Admin" })

You can set up a custom prefix for the area that differs from the area name by using the AreaPrefix named parameter, for example:

  • [RouteArea("BackOffice", AreaPrefix = "back-office")]

If you are using both areas with route attributes and areas with convention-based routes (set by an AreaRegistration class), then you need to make sure that area registration happens after MVC attribute routes are configured, but before the default convention-based route is set. The reason is that route registration should be ordered from the most specific (attributes), through the more general (area registration), to the most generic (the default route), to avoid generic routes "hiding" more specific routes by matching incoming requests too early in the pipeline.

Example:

  • public static void RegisterRoutes(RouteCollection routes)
  • {
  •     routes.IgnoreRoute("{resource}.axd/{*pathInfo}");
  •  
  •     routes.MapMvcAttributeRoutes();
  •  
  •     AreaRegistration.RegisterAllAreas();
  •  
  •     routes.MapRoute(
  •         name: "Default",
  •         url: "{controller}/{action}/{id}",
  •         defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
  •     );
  • }

Announcing release of ASP.NET and Web Tools for Visual Studio 2013


I’m excited to show the new features of ASP.NET and Web Tools for Visual Studio 2013. Visit Somasegar’s blog for details on Visual Studio 2013 RTW, and download Visual Studio 2013 to try out the new features in this release. Read the Visual Studio blog for information about general editor enhancements, including the preview-time JavaScript editor improvements post and future posts on JavaScript editor improvements.

Check http://www.asp.net/ for release notes, documentation, and tutorials. This post covers features introduced after the RC release as well as features included in the earlier RC announcement post. We will publish upcoming posts that discuss these features in detail.

Visual Studio Web Tooling Enhancements

One ASP.NET

We made a simple UI for creating projects that offers support for multiple ASP.NET frameworks (Web Forms, MVC, and Web API). New features are available for Web Forms that used to be offered only for MVC, such as automatic test project creation and multiple authentication configurations.


Different authentication configurations can be chosen. The chosen authentication works the same in all ASP.NET frameworks and in different hosting environments, such as IIS, IIS Express, or OWIN based self hosted Web API and SignalR sites.

All of the ASP.NET project templates now use Bootstrap 3.0 to provide responsive design and theming capabilities. Responsive design means your web page automatically adapts to changes in browser window size by doing things like changing the way menu bars are displayed or the way columns are arranged. Theming means you can change the look and feel of a site easily, and you can choose from a wide selection of ready-made themes. You can do these things by writing CSS, but that requires a lot of complex code: Bootstrap makes it easy. For an example of responsive design in action and how to change a site’s theme, see Bootstrap in the Visual Studio 2013 web project templates.

The Single Page Application template has been updated with support for OAuth 2.0 and a Web API based account controller that can be used from JavaScript and also from native mobile applications.

For more information about the new process for creating web projects, see Creating ASP.NET Web Projects in Visual Studio 2013.

Browser Link – SignalR channel between browser and Visual Studio

A new feature, Browser Link, enables you to click a button in Visual Studio to trigger a page refresh in one or more browsers that are running from your web project. You don’t have to click Refresh in each browser or close and open the browser. In Visual Studio 2012 a typical development cycle is edit markup and code, press Control+F5 to run in the browser, close the browser, edit again, and repeat the process. In Visual Studio 2013 you cut out some of these steps. You can edit markup and code, press Control+F5 to run in the browser, edit again, click the Browser Link Refresh button, and see the results of your changes. You can connect multiple browsers to your development site, including mobile emulators, and click a button to refresh all the browsers all at the same time.


Browser Link works by using a SignalR channel between browsers and Visual Studio 2013.

We also released API to write Browser Link extensions. Mads Kristensen’s Web Essentials for Visual Studio 2013 (source) contains several useful extensions, such as:

  • Design Mode – Change HTML text in the browser, and the change is reflected in the HTML editor.
  • Inspect Mode – Inspect elements in the browser and the HTML editor automatically scrolls to the corresponding code.
  • Sync F12 Changes – Change styles in F12 browser tools, and the changes are automatically updated in the corresponding style sheet files.
  • Find Unused CSS – Easily find unused CSS in your project.
  • CSS Sync on Save – Save a CSS or LESS document, and the styles are automatically reloaded in the browser without doing a full page refresh.
  • Best Practices – Analyzes your web pages for violations to best practices and gives you an easy way to fix them.


New HTML editor

In VS2013, a new HTML editor is used for Razor files and HTML files. Web Forms .aspx, .master and user control .ascx files still use the legacy editor for various reasons. The new HTML editor provides a single unified HTML5-based schema. It has improvements such as automatic brace completion, jQuery UI and AngularJS attribute IntelliSense, attribute IntelliSense grouping, etc. We also make CSS class attribute IntelliSense work even when bundling and minification are used.


Van Kichline has more details in the HTML Editing Features in Visual Studio 2013 blog post.

Azure website tooling with Windows Azure SDK 2.2

With the installation of Windows Azure SDK 2.2, all your Windows Azure services are accessible under one Windows Azure node in Server Explorer. You can connect to Windows Azure just by entering your user name and password – you don’t have to download and import a subscription file. You can perform many management and configuration functions from Server Explorer, such as creating Web Sites, stopping and starting Web Sites, setting logging and tracing options, and more.


Scaffolding

ASP.NET Scaffolding is a code generation framework for ASP.NET Web applications. It makes it easy to add boilerplate code to your project that interacts with a data model. VS 2013 has completely rewritten ASP.NET Scaffolding. Among the new features are an option to generate asynchronous code, the ability to scaffold Web API controllers, and the ability to use scaffolding in any type of ASP.NET project (for example, you can scaffold MVC controllers and views in Web Forms projects). For more information, see ASP.NET Scaffolding Overview.


We removed Web Forms scaffolding from this release because it's not ready yet. We'll ship the new bits in a future release, possibly targeting Update 1.

ASP.NET Framework Enhancements

ASP.NET MVC 5

MVC projects are now standard Web Applications and do not use their own project GUID. An MVC 5 project is created if the MVC checkbox is checked in the One ASP.NET new project dialog. To get started, see Getting Started with ASP.NET MVC 5.

MVC 5 includes a variety of new features including:

  • Attribute routing
  • Authentication filters
  • Filter overrides

For details on ASP.NET MVC 5 features, see the release notes or refer to the ASP.NET MVC 5 documentation on asp.net.

ASP.NET Web API 2

ASP.NET Web API 2 includes a variety of new features including:

  • Attribute routing
  • OAuth 2.0
  • OData improvements
  • Reuse existing EDM
  • Request batching
  • Improved unit testability
  • CORS
  • Support for portable libraries
  • OWIN integration
  • IHttpActionResult
  • HttpRequestContext
  • Authentication filters
  • Filter overrides

For details on ASP.NET Web API 2 features, see the release notes or refer to the ASP.NET Web API 2 documentation here.

ASP.NET SignalR

SignalR 2.0.0 is included with VS2013. It includes support for MonoTouch (enables you to develop iPhone apps using C# and .NET) and MonoDroid (enables you to develop Android apps using C# and .NET), portable .NET client, and the self-hosting package Microsoft.AspNet.SignalR.SelfHost, and it is backwards compatible for servers. For more information, including notes about how to upgrade from version 1.x, see the SignalR documentation on asp.net.

Entity Framework

Entity Framework 6.0.0 is included with VS2013. For more information, see Entity Framework Version History .

Microsoft OWIN Components

The Microsoft OWIN components (also known as the Katana project) integrate support for the Open Web Interface for .NET (OWIN) deep into ASP.NET. OWIN defines a standard interface between .NET web servers and web applications. Any OWIN-based application or middleware can run on any OWIN-capable host, including ASP.NET on IIS.

You host OWIN-based middleware and frameworks in ASP.NET using the Microsoft.Owin.Host.SystemWeb NuGet package. The Microsoft.Owin.Host.SystemWeb package has been enhanced to enable an OWIN middleware developer to provide hints to the SystemWeb server if the middleware needs to be called during a specific ASP.NET pipeline stage (for example, during the authentication stage). The Microsoft OWIN Components also includes an HttpListener-based server, a self-host API and an OwinHost executable for running OWIN applications without having to create a custom host. With the OwinHost 2.0.0 NuGet package, VS2013 can now easily convert an IIS Express based WebAPI or SignalR project to a project that is run from OwinHost.exe.
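As a rough illustration of self-hosting with the Katana components (the port, namespace-free layout, and Startup class here are placeholders, not from the release; the Microsoft.Owin.SelfHost package pulls in the pieces used below):

using System;
using Microsoft.Owin.Hosting;
using Owin;

public class Program
{
    public static void Main()
    {
        // Self-host the OWIN pipeline on HttpListener; the same Startup class
        // could instead be launched by OwinHost.exe or hosted on IIS via SystemWeb.
        using (WebApp.Start<Startup>("http://localhost:9000/"))
        {
            Console.WriteLine("Listening on http://localhost:9000/");
            Console.ReadLine();
        }
    }
}

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // A trivial terminal middleware that answers every request.
        app.Run(context => context.Response.WriteAsync("Hello from Katana"));
    }
}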


ASP.NET authentication is now based on OWIN middleware that can be used on any OWIN-based host. Microsoft OWIN Components includes a rich set of middleware components for authentication including support for cookie-based authentication, logins using external identity providers (like Microsoft Accounts, Facebook, Google, Twitter), and logins using organizational accounts from your on-premises Active Directory or Windows Azure Active Directory. Also included is support for OAuth 2.0, JWT and CORS. For more information see An Overview of Project Katana.

ASP.NET Identity

ASP.NET Identity is the new membership system for building ASP.NET applications. ASP.NET Identity makes it easy to integrate user profile data with application data. With ASP.NET Identity you control the persistence model of your application. For example, you can store your user data in a SQL Server database or another data store, including NoSQL data stores such as Windows Azure Storage Tables. For the default SQL Server data store, ASP.NET Identity uses Entity Framework Code First for all data access. You can use Migrations to modify the membership tables and to deploy the database, and you can put your own tables and the membership tables in the same DbContext, making it easy to write code that joins membership and application tables.

For information about using ASP.NET Identity with Individual User Accounts authentication, see Creating ASP.NET Web Projects in Visual Studio 2013 - Individual User Accounts.

NuGet

NuGet 2.7 is included with VS2013. See NuGet 2.7 Release Notes for more details.

Summary

Today’s Visual Studio 2013 RTW release has a lot of useful features for developers using ASP.NET. Read the release notes to learn even more, and install it today!

Please use Connect to submit bugs, ASP.NET UserVoice to submit and vote for suggestions, and the ASP.NET Forums for Q&A. The send-a-smile tool in Visual Studio can help you send feedback as well.

image

Building a simple ToDo application with ASP.NET Identity and associating Users with ToDoes


Hello everyone. I was prompted to write this post by a comment I received on http://blogs.msdn.com/b/webdev/archive/2013/10/17/announcing-release-of-asp-net-and-web-tools-for-visual-studio-2013.aspx. I am pasting the comment verbatim from that post.

“I'm having a lot of difficulty integrating the IdentityDbContext with other DbContexts to create a comprehensive model and database for my applications. If the membership system always stood on its own, there wouldn't be an issue but the User ID typically gets used throughout other models and tables (think of a blog, for example, where some authenticated users are authorized to create blog posts whereas others are only authorized to leave comments - all users, though, need to have their User ID stored with blogs and/or comments). What are the recommended ways to combine these EF Code First models?”

While this requirement may seem fairly straight forward, it does highlight lots of key components of using any membership system in an application. I thank “Robert Gaut” for posting this comment which motivated me to write this post.

So this should set the context of this blog post :) This blog post shows how you can create a simple ToDo application and associate ToDoes with Users from ASP.NET Identity. In other words, this post shows how you can mix and match Entity Framework Code First Models for the application specific data with the Models of User from ASP.NET Identity.

Let us define the requirements of this application :-

  • Only authenticated users should be able to create, edit, delete and see their own ToDoes.
  • Users should not be able to view or edit ToDoes which were created by other users. D’Oh you would be thinking that this should be a given, but I figured I should mention that since this is what I hit while writing this sample.
  • Only Users who belong to Admin Role (I am going to call this User as Admin User in this post) should be authorized to see all the ToDoes in the application. This means that the Admin User can see the ToDoes for all the users. This requirement will highlight the authorization requirements mentioned in the comment.

Let us see how we can build an application using ASP.NET Identity which fulfills these requirements.

I am going to write this post as a tutorial and use the ASP.NET project templates in Visual Studio 2013, so hopefully it will be easy to follow along and try out; later on you can implement this flow in your own application.

    • Create a File – New ASP.NET Project and select ASP.NET MVC with Individual User Accounts.

    • This project creates an application where a user can login by registering an account with the website or use Social Login providers such as Facebook, Twitter etc. For the purpose of this example we will register a user with the website.
    • Initialize ASP.NET Identity to create Admin User and Admin Role and Add Admin User to Admin Role
      • You can initialize ASP.NET Identity when the application starts. Since ASP.NET Identity is Entity Framework based in this sample, you can create a database initializer which is configured to get called each time the app starts.
      • Set the Database Initializer in Global.asax

        Database.SetInitializer<MyDbContext>(new MyDbInitializer());

      • Initialize the database to create Admin Role and Admin User
        • In the Seed method below we first create a UserManager from the ASP.NET Identity system, which lets us perform operations on users such as Create, List, Edit and Verify. You can think of the UserManager as being analogous to the SqlMembershipProvider in ASP.NET 2.0.
        • We then create a RoleManager from the ASP.NET Identity system, which lets us operate on roles. You can think of the RoleManager as being analogous to the SqlRoleProvider in ASP.NET 2.0.
        • In this example, my user type is called MyUser. In the project template, it is called ApplicationUser.

 

        public class MyDbInitializer : DropCreateDatabaseAlways<MyDbContext>
        {
            protected override void Seed(MyDbContext context)
            {
                var UserManager = new UserManager<MyUser>(new UserStore<MyUser>(context));
                var RoleManager = new RoleManager<IdentityRole>(new RoleStore<IdentityRole>(context));

                string name = "Admin";
                string password = "123456";

                //Create Role Admin if it does not exist
                if (!RoleManager.RoleExists(name))
                {
                    var roleresult = RoleManager.Create(new IdentityRole(name));
                }

                //Create User=Admin with password=123456
                var user = new MyUser();
                user.UserName = name;
                var adminresult = UserManager.Create(user, password);

                //Add User Admin to Role Admin
                if (adminresult.Succeeded)
                {
                    var result = UserManager.AddToRole(user.Id, name);
                }
                base.Seed(context);
            }
        }

    • Create an Entity Framework Code First ToDo model in Models\AppModels.cs.
      • Since we are using Entity Framework Code First, EF will create the right keys between Users and ToDo table.
        public class MyUser : IdentityUser
        {
            public string HomeTown { get; set; }
            public virtual ICollection<ToDo> ToDoes { get; set; }
        }

        public class ToDo
        {
            public int Id { get; set; }
            public string Description { get; set; }
            public bool IsDone { get; set; }
            public virtual MyUser User { get; set; }
        }

    • Use Scaffolding to generate a ToDo MVC controller along with Views to create, update, delete and list all the ToDoes. Follow this tutorial on how to use Scaffolding in ASP.NET http://www.asp.net/visual-studio/overview/2013/aspnet-scaffolding-overview
      • Please ensure that you reuse the existing database context so you can store ASP.NET Identity data and ToDoes in the same database. This is a convenient way of managing application data and membership data. Your DbContext should look something like this:
        public class MyDbContext : IdentityDbContext<MyUser>
        {
            public MyDbContext()
                : base("DefaultConnection")
            {
            }

            protected override void OnModelCreating(DbModelBuilder modelBuilder)
            {
                base.OnModelCreating(modelBuilder);
            }

            // The application's ToDo table lives in the same context as the Identity tables.
            public System.Data.Entity.DbSet<AspnetIdentitySample.Models.ToDo> ToDoes { get; set; }
        }

    • Update the generated ToDo controller to associate a User with ToDoes. In this sample I am only going to show you Create and List, but you can follow a similar pattern for Edit and Delete (a sketch of a Delete action that follows the same pattern appears after the List action below).
      • Add an overload of the ToDo controller constructor which takes in a UserManager from ASP.NET Identity. The UserManager allows you to manage users in the ASP.NET Identity system.
        private MyDbContext db;
        private UserManager<MyUser> manager;

        public ToDoController()
        {
            db = new MyDbContext();
            manager = new UserManager<MyUser>(new UserStore<MyUser>(db));
        }

      • Update Create Action
        • When you create a ToDo, we look up the logged in User in the ASP.NET Identity and associate the User object with the ToDoes.
        public async Task<ActionResult> Create([Bind(Include = "Id,Description,IsDone")] ToDo todo)
        {
            var currentUser = await manager.FindByIdAsync(User.Identity.GetUserId());
            if (ModelState.IsValid)
            {
                todo.User = currentUser;
                db.ToDoes.Add(todo);
                await db.SaveChangesAsync();
                return RedirectToAction("Index");
            }

            return View(todo);
        }
              

      • Update List Action
        • We will only get the list of ToDoes created by the current logged in User
        public ActionResult Index()
        {
            var currentUser = manager.FindById(User.Identity.GetUserId());
            return View(db.ToDoes.ToList().Where(todo => todo.User.Id == currentUser.Id));
        }
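
        As mentioned above, Edit and Delete follow the same ownership pattern. Here is a minimal sketch (not the exact code from the sample project) of a Delete action that only removes a ToDo belonging to the current user; the action name and shape follow what the scaffolder generates:

        [HttpPost, ActionName("Delete")]
        public async Task<ActionResult> DeleteConfirmed(int id)
        {
            var currentUser = await manager.FindByIdAsync(User.Identity.GetUserId());
            ToDo todo = await db.ToDoes.FindAsync(id);

            // Only the owner of the ToDo is allowed to delete it.
            if (todo == null || todo.User.Id != currentUser.Id)
            {
                return HttpNotFound();
            }

            db.ToDoes.Remove(todo);
            await db.SaveChangesAsync();
            return RedirectToAction("Index");
        }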

      • Allow only Admins to view the ToDoes for all the users
          • We will add a new Action in the ToDo controller to list all the ToDoes, but we will only authorize Users in Role Admin to have access to these ToDoes. Again we are using the same [Authorize] attribute that we used before
        [Authorize(Roles = "Admin")]
        public async Task<ActionResult> All()
        {
            return View(await db.ToDoes.ToListAsync());
        }

      • Add support to view the User details from the ToDo table
          • Since we associate the ToDoes with a User object, the benefit we get is that when we get the list of ToDoes for all the users, we can easily access any profile-specific data for the user from the ToDo model itself. For example, when the Admin views all the ToDoes we can also show a profile property for the user if we added any. In this sample we added HomeTown, so we will display HomeTown next to the ToDoes.
          • The markup for the view looks as follows:
        @model IEnumerable<AspnetIdentitySample.Models.ToDo>

        @{
            ViewBag.Title = "Index";
        }

        <h2>List of ToDoes for all Users</h2>
        <p>
            Notice that we can see the User info (UserName) and profile info such as HomeTown for the user as well.
            This was possible because we associated the User object with a ToDo object and hence
            we can get this rich behavior.
        </p>

        <table class="table">
            <tr>
                <th>
                    @Html.DisplayNameFor(model => model.Description)
                </th>
                <th>
                    @Html.DisplayNameFor(model => model.IsDone)
                </th>
                <th>@Html.DisplayNameFor(model => model.User.UserName)</th>
                <th>@Html.DisplayNameFor(model => model.User.HomeTown)</th>
            </tr>

            @foreach (var item in Model)
            {
                <tr>
                    <td>
                        @Html.DisplayFor(modelItem => item.Description)
                    </td>
                    <td>
                        @Html.DisplayFor(modelItem => item.IsDone)
                    </td>
                    <td>
                        @Html.DisplayFor(modelItem => item.User.UserName)
                    </td>
                    <td>
                        @Html.DisplayFor(modelItem => item.User.HomeTown)
                    </td>
                </tr>
            }
        </table>

      • Update the Layout Page to add links for ToDoes
        <li>@Html.ActionLink("ToDo", "Index", "ToDo")</li>
        <li>@Html.ActionLink("ToDo for User In Role Admin", "All", "ToDo")</li>
      • Run the application
        • When you run the application you will see the links at the top of the page as follows
        • image

      • Create ToDo as a normal user (non admin)
          • Click ToDo and you will be redirected to Login page since you are not authenticated
          • You can register a new account and create a ToDo
            • image
          • Once you create the ToDo, then you can view the ToDoes for yourself. Note you cannot view ToDoes for all users
          • image
      • View the ToDoes for all the users (admin only access)
        • Click the link “ToDo for User in Role Admin”. You will be redirected back to the login page since you are not an Admin and hence are not authorized to view this page
        • Log out of this application and log in using the Admin user you created when you were initializing the ASP.NET Identity system (User = Admin, Password = 123456)
        • Once you login you can view the ToDoes for all users
        • image

    Conclusion

    I hope you will find this walkthrough useful. For a completed sample you can visit my project at https://github.com/rustd/AspnetIdentitySample. If you have any questions about ASP.NET Identity, please feel free to leave comments or ask them on the asp.net forums or Stack Overflow. I can also be reached on Twitter (@rustd).


    Adding Core References Support in an ASP.NET Empty Project



    Brief

    In Visual Studio 2013 we introduced the core reference framework to ASP.NET project creation. You can choose which core references you want in the project you are about to create.

    How

    Let’s take the example of adding the Web API core references to an Empty ASP.NET Project.

    • Open Visual Studio 2013 New Project Dialog
    • Choose “ASP.NET Web Application” under “Web” 
    • Select any template in the New ASP.NET Project dialog. Notice the three check boxes under “Add folders and core references for:”
    • The availability of Web Forms, MVC and Web API depends on which template you choose. For example, MVC is always selected for the MVC template. When you select Empty, all three are optional since it is an empty template.
    • To add the Web API core reference, select Web API, then click OK to create the project.
    • Examine the packages.config: the Web API NuGet packages have all been added.
    • Examine the Global.asax.cs: Web API is also registered.
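
    For reference, the Application_Start in Global.asax.cs of a project created this way typically looks roughly like the following (treat this as a sketch rather than the template’s literal output; the exact contents depend on which core references you selected, and WebApiConfig is the generated App_Start\WebApiConfig.cs class):

    using System;
    using System.Web;
    using System.Web.Http;

    public class Global : HttpApplication
    {
        protected void Application_Start(object sender, EventArgs e)
        {
            // Added because the Web API core reference was selected.
            GlobalConfiguration.Configure(WebApiConfig.Register);
        }
    }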

    Summary

    As you can see, adding the core references will not only add the required NuGet packages but also correctly configure the app for the selected core reference.

    Similarly, you can add MVC and/or Web Forms support in the same way. And the supported core references are not limited to the Empty template only; they’re supported on all six templates.

    This approach is a very helpful way to set up a customized ASP.NET project quickly.

    How to use XDT in NuGet - Examples and Facts


    Starting with NuGet 2.6, XML-Document-Transform (XDT) is supported for transforming XML files inside a project. The XDT syntax can be used in the .install.xdt and .uninstall.xdt file(s) under the package’s Content folder, which are applied at package installation and uninstallation time, respectively.
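
    As a sketch, the transform files can be shipped from a package’s content folder via the <files> section of a .nuspec (the source paths and package layout here are hypothetical):

    <files>
      <!-- Applied when the package is installed -->
      <file src="transforms\Web.config.install.xdt" target="content\Web.config.install.xdt" />
      <!-- Applied when the package is uninstalled -->
      <file src="transforms\Web.config.uninstall.xdt" target="content\Web.config.uninstall.xdt" />
    </files>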

    One of XDT’s greatest strengths is its simple but powerful syntax for manipulating the structure of an XML DOM. Rather than simply overlaying one fixed document structure onto another structure, XDT provides controls for matching elements in a variety of ways, from simple attribute name matching to full XPath support. Once a matching element or set of elements is found, XDT provides a rich set of functions for manipulating the elements, whether that means adding, updating, or removing attributes, placing a new element at a specific location, or replacing or removing the entire element and its children.

    In this blog post, we demonstrate examples of using NuGet’s XDT feature to manipulate the XML DOM, based on common transformation scenarios for Web.config files. We also cover important aspects of applying this feature, including its drawbacks, and we would like your feedback on how to further improve the feature and make it more useful.

    Part A: Common XDT Transformations in Web.config.(un)install.xdt file

    • Change the attribute values of an element in Web.config

      To change the attribute value such as connectionString of a Web.config file, either the xdt:Transform=”SetAttributes” or xdt:Transform=”Replace” attribute can be used in Web.config.(un)install.xdt file, in conjunction with the xdt:Locator attribute.

      The use of xdt:Transform=”SetAttributes” below will update the value of the connectionString attribute only, while leaving other attribute values in the <add> element untouched.

    <?xml version="1.0"?><configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform"><connectionStrings><add name="DefaultConnection" connectionString="value for the deployed Web.config file" xdt:Transform="SetAttributes" xdt:Locator="Match(name)"/></connectionStrings>
        ...</configuration >

    On the other hand, the use of xdt:Transform=”Replace” will replace the entire <add> element named “DefaultConnection”, with what’s specified in the Web.config.(un)install.xdt file.

    <?xml version="1.0"?><configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform"><connectionStrings><add name="DefaultConnection" connectionString="value for the deployed Web.config file"  
    		      providerName="System.Data.SqlClient" xdt:Transform="Replace" xdt:Locator="Match(name)"/></connectionStrings>
        ...</configuration >
    • Replace all elements under a section of Web.config

      To replace all elements under the <system.web> section, the xdt:Transform=”Replace” attribute can be placed at the root of the <system.web> section. For example, the following XDT transform will update the entire <system.web> section of the Web.config to just contain the two simple elements (<compilation> and <httpRuntime>) specified by the Web.config.(un)install.xdt file.

    <?xml version="1.0"?><configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform"><system.web xdt:Transform="Replace"><compilation debug="false" targetFramework="4.5" /><httpRuntime executionTimeout="00:05:00"/></system.web></configuration>
    • Insert a new element before the specific element in Web.config

      Suppose the starting Web.config looks like below (from ASP.NET Web Forms Application):

    <?xml version="1.0"?><configuration>
        ...<system.webServer><modules runAllManagedModulesForAllRequests="true" /></system.webServer><runtime>
          ...</configuration>

    To insert a new <validation> element in the <system.webServer> section but before the <modules> element, the xdt:Transform=”InsertBefore” attribute can be utilized in your Web.config.(un)install.xdt file:

    <?xml version="1.0"?><configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform"><system.webServer><validation validateIntegratedModeConfiguration="false" xdt:Transform="InsertBefore(/configuration/system.webServer/modules)" /></system.webServer></configuration>
    • Insert a section of elements if missing from the current Web.config

      There are occasions when the starting Web.config does not have the section into which elements need to be inserted. In this case, xdt:Transform=”InsertIfMissing” can be used. However, for a Web.config file that does have the target section available but contains different elements, the insertion won’t take place. The end result is that the target section of the config file stays unmodified.

    <?xml version="1.0"?><configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform"><configSections xdt:Transform="InsertIfMissing"><sectionGroup name="elmah"><section name="security" requirePermission="false" type="Elmah.SecuritySectionHandler, Elmah" /><section name="errorLog" requirePermission="false" type="Elmah.ErrorLogSectionHandler, Elmah" /><section name="errorMail" requirePermission="false" type="Elmah.ErrorMailSectionHandler, Elmah" /><section name="errorFilter" requirePermission="false" type="Elmah.ErrorFilterSectionHandler, Elmah" /></sectionGroup></configSections>
        ...</configuration>
    • Two step transformation that ensures updating an existing value, or adding a new one

      So far, the transformations we have talked about are all one-step transformations that adhere to the XDT syntax. In some cases, the package author may not know the exact content of the starting Web.config. Therefore, the transformation could fail, for reasons such as the targeted sections/elements not being found. One common case is calling xdt:Transform=”Insert” directly on a section or element that is not present in the current Web.config.

      As a workaround, one can apply a two-step transformation like the example below in the Web.config.(un)install.xdt file, i.e. by removing the matching element if it exists and then inserting an element with the same name back. The transformations are applied in sequence, so this works with a Web.config that has an <appSettings> section in place.

    <?xml version="1.0" encoding="utf-8"?><configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform"><appSettings><!-- 
    		Doing this in two steps (remove, then insert) ensures that we can update an existing value, or add a new one
    	--><add
    	  	key="page:Version"
    		xdt:Transform="Remove"
    		xdt:Locator="Match(key)"/><add
    		key="page:Version"
    		value="2.0.0"
    		xdt:Transform="Insert" /></appSettings></configuration>
    • Transformation that can store/restore old values via package install/uninstall

      A cool thing that NuGet’s XDT feature can do is store the original values upon package install and then restore them upon package uninstall. The example below shows this:

      During package install, by calling Web.config.install.xdt, the “SourceFile_Advances” key is replaced with a new value, while the old value is saved as a comment in the new <local> section.

    <?xml version="1.0"?><configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform"><appSettings xdt:Transform="InsertIfMissing"><add key="SourceFile_Advances" 
    			  keyvalue="C:\Code\SavvysoftValuations\new.csv" xdt:Locator="Match(key)" xdt:Transform="Replace" /></appSettings><local xdt:Transform="Insert"><!-- <add key="SourceFile_Advances" keyvalue="C:\Code\SavvysoftValuations\original.csv" /> --></local></configuration>

      During package uninstall, by calling Web.config.uninstall.xdt, the <appSettings> and <local> sections are removed and the original value for the “SourceFile_Advances” key is added back.

    <?xml version="1.0"?><configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform"><appSettings xdt:Transform="Remove"></appSettings><local xdt:Transform="Remove"></local><appSettings xdt:Transform="InsertIfMissing"><add key="SourceFile_Advances" keyvalue="C:\Code\SavvysoftValuations\original.csv" /></appSettings></configuration>

    Part B: Things that you should know about the NuGet XDT feature

    1. Unlike other NuGet package content, whose files/references are automatically removed during package uninstallation, NuGet performs the XML transformation specified in the .uninstall.xdt file. In that sense, XDT package uninstallation is not symmetrical with what was added by package installation.

    2. The XDT feature can be applied to any XML files, including the Web.Debug.config and Web.Release.config files. The transform .xdt files would need to be named Web.Debug.config.(un)install.xdt and Web.Release.config.(un)install.xdt accordingly.

    3. The XDT feature works well under nested Content folders, and it also recognizes target frameworks when the files are placed under the appropriate folder such as net45.

    4. Due to limitations of the current transform syntax, authoring a universal package that can handle various forms of Web.config files (with section/element/attribute variations) can be difficult at this point. In particular, the combined usage of xdt:Transform=”InsertIfMissing” and/or xdt:Transform=”Insert” does not work well. Taking the Web.config.install.xdt file below as an example, when the <runtime> section does not exist, the whole section will be inserted, and the <dependentAssembly> element will then be inserted one more time via the xdt:Transform=”Insert” statement.

    <?xml version="1.0"?><configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform"><runtime xdt:Transform="InsertIfMissing"> <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1"> <dependentAssembly xdt:Transform="Insert"> <assemblyIdentity name="X" publicKeyToken="032d34d3e998f237" culture="neutral" /> <bindingRedirect oldVersion="0.0.0.0-2.0.1.5" newVersion="2.0.1.5" /> </dependentAssembly> </assemblyBinding> </runtime></configuration>

    One potential solution to this issue would be to allow customized XDT transforms, so that more complicated syntax can be applied during package install/uninstall, such as merging sections and elements between the starting Web.config files and the specified transformations. This way, we move the complexity up to the XDT syntax layer instead of putting it on the transform file itself. A sample custom transformation engine has been authored by AppHarbor, with its open source tester on GitHub.

    For more topics about NuGet, please visit http://blog.nuget.org/

    Sending a CORS Request in IE


    From the time we added CORS support for ASP.NET WEB API, we have seen many questions on its usage, including questions about sending cross-origin requests (CORS) from IE. IE 10 and higher fully support using XMLHttpRequest to send cross-origin requests. CORS for XHR in IE10  is a great blog post on this approach.

    CORS is also supported in IE 8/9, however, in a different way. Instead of XMLHttpRequest, an XDomainRequest object is used to send cross-origin requests. Here’s another good read about it: http://blogs.msdn.com/b/ieinternals/archive/2010/05/13/xdomainrequest-restrictions-limitations-and-workarounds.aspx. If you’re using jQuery, here’s how to send CORS requests in IE 8/9: http://stackoverflow.com/questions/10232017/ie9-jquery-ajax-with-cors-returns-access-is-denied

    Lastly, sending CORS requests is not supported in IE 6/7.

    Web publishing updates for app offline and usechecksum


    In Visual Studio 2013 we have added a couple of small features for web publishing that I’d like to share with you: how to take your app offline during publishing, and how to change the default file compare option.

    App offline support

    In Visual Studio, when you publish your web application we do not force the remote app to be stopped or restarted. Depending on your publishing artifacts your site may end up being restarted (for example, if you change web.config), but Visual Studio never had a way to take your application offline during a publish operation.

    There are a lot of reasons why you may want to take your app offline during publishing. For example, your app has locked files which need to be updated, or you need to clear an in-memory cache for changes to take effect. We heard a good amount of feedback from ASP.NET users regarding the lack of this support on our UserVoice site. From that suggestion we worked with the Web Deploy team to introduce a new AppOffline rule which enables this.

    We added support for this in Visual Studio 2013 as well. We have not yet created any specific UI for this feature, but it’s very easy to enable. Since Visual Studio 2012, web publish profiles are stored as MSBuild files under Properties\PublishProfiles (My Project\PublishProfiles for VB). These files end with a .pubxml extension (not to be confused with .pubxml.user) and have the same name as the publish profile in Visual Studio.

    Each of these publish profiles contains the settings for that publish profile. You can customize these files to modify the publish process. To enable app offline support, find the .pubxml file corresponding to the publish profile you’d like to update, then add the following element inside the PropertyGroup element.

    <EnableMSDeployAppOffline>true</EnableMSDeployAppOffline>
    

    So the resulting publish profile will look something like the following.

    <?xml version="1.0" encoding="utf-8"?>
    <Project ToolsVersion="12.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <PropertyGroup>
        <EnableMSDeployAppOffline>true</EnableMSDeployAppOffline>
        <WebPublishMethod>MSDeploy</WebPublishMethod>
        <MSDeployServiceURL>(removed)</MSDeployServiceURL>
        <DeployIisAppPath>Default Web Site</DeployIisAppPath>
        <AllowUntrustedCertificate>True</AllowUntrustedCertificate>
        <SkipExtraFilesOnServer>True</SkipExtraFilesOnServer>
        <DeployAsIisApp>False</DeployAsIisApp>
        <MSDeployPublishMethod>WMSVC</MSDeployPublishMethod>
        <UserName>sayedha</UserName>
        <ExcludeApp_Data>False</ExcludeApp_Data>
        <_SavePWD>True</_SavePWD>
      </PropertyGroup>
    </Project>
    

    After that you can save and close the file. When you publish (either in Visual Studio or the command line) using that profile your app will be taken offline during publishing.

    This feature was implemented as a direct result of your feedback so please keep letting us know how we can improve.

    Use checksum support

    Web Deploy has two methods of determining which files will be synced when a publish operation is performed.

    1. Use file time stamps
    2. Use the CRC (Cyclic Redundancy Check) checksum

    By default Visual Studio uses the time stamps method. The reason for this is that there is a noticeable performance impact when using the CRC checksum.

    In team scenarios, or on build servers, it may make sense for you to use the CRC method instead. Enabling this is very similar to the app offline support. Find the .pubxml file which is associated with your web project and add the following element under the PropertyGroup element.

    <MSDeployUseChecksum>true</MSDeployUseChecksum>
    

    After that when you publish using that profile (from Visual Studio or the command line) the CRC checksum method will be used instead of time stamps.

    Other ways to apply these settings

    If you would like to enable this for multiple projects, or multiple profiles, it may get a bit cumbersome to modify every .pubxml file. These properties are standard MSBuild properties, so there are several different ways you can apply these settings. Below I’ve outlined a few different options, but if these don’t meet your needs there may be additional options.

    Set these properties on the command line/build server

    When you invoke msbuild.exe you can pass this property in as you would any other MSBuild property. Use the following syntax:

    /p:EnableMSDeployAppOffline=true /p:MSDeployUseChecksum=true
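
    For context, a complete command-line publish invocation with these properties might look something like the following sketch (the project file name and profile name are placeholders, and depending on your publish method you may also need to pass credentials):

    msbuild MyWebApp.csproj /p:DeployOnBuild=true /p:PublishProfile=MyProfile /p:EnableMSDeployAppOffline=true /p:MSDeployUseChecksum=true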
    

    Set these properties for every profile in a given project

    In this post I suggested that you place the properties directly inside of the .pubxml file. Instead, you can place these properties directly inside your .csproj/.vbproj file. Place the following PropertyGroup in your project file.

    <PropertyGroup>
      <EnableMSDeployAppOffline>true</EnableMSDeployAppOffline>
      <MSDeployUseChecksum>true</MSDeployUseChecksum>
    </PropertyGroup>
    
    Note: This element should be above the Import for Microsoft.WebApplication.targets.

    Set these properties for every project on a machine

    If you have a build server (or your own dev box) on which you’d like to apply these settings for every build, you can create an environment variable with the desired name/value.

    You can also use the CustomBeforeMicrosoftCommonTargets MSBuild property. I’ve blogged about how you can use this technique in the past.
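
    As a sketch of the CustomBeforeMicrosoftCommonTargets approach (the file path below is hypothetical), you could put the properties in a small .targets file:

    <!-- C:\msbuild\WebPublish.Custom.targets -->
    <Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <PropertyGroup>
        <EnableMSDeployAppOffline>true</EnableMSDeployAppOffline>
        <MSDeployUseChecksum>true</MSDeployUseChecksum>
      </PropertyGroup>
    </Project>

    and then point MSBuild at it, either by setting an environment variable named CustomBeforeMicrosoftCommonTargets to that path or by passing /p:CustomBeforeMicrosoftCommonTargets=C:\msbuild\WebPublish.Custom.targets on the command line.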

     

    Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

    New Tutorial Published on Migrating to ASP.NET Identity


    ASP.NET Identity is the new membership system for building ASP.NET applications. The new tutorial explains how to migrate an application that uses ASP.NET Membership to the new ASP.NET Identity system. The tutorial shows how to enable existing users to log in with the new Identity system by migrating the database tables that store user credentials, roles and profile information. The tutorial shows how to migrate from SQL Membership, but the concepts can also be used to migrate from Universal Providers or Simple Membership Providers.

     

    Once the migration is completed, you can use the new features available in Identity: you can let users log in through their Google, Facebook, Twitter or Microsoft accounts, use OWIN authentication middleware, integrate user profile and application data, and so on. Feedback is welcome and do let us know if you hit issues during migration!

     

    To learn more about ASP.NET Identity, visit http://www.asp.net/identity.

     

    Thanks to Tom Dykstra and Rick Anderson for reviewing the article.

    Tips When Making Changes in Entity Framework Code First Models after Scaffolding


    When you scaffold an existing Entity Framework model, using MVC5 scaffolding in Visual Studio 2013, you can easily run into the issue of “The model backing the <DbContextName> context has changed since the database was created” as shown below.

    image

    For example, in an MVC project, add the following model.

    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    Scaffold the Product model using “MVC 5 Controller with views, using Entity Framework” scaffolder in Visual Studio 2013. View the generated pages, Index/Edit/Details/Create, to verify things are working properly.

    Now, suppose we need to modify the Product model to add more fields, like Description and Category.

    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public string Description { get; set; }
        public string Category { get; set; }
    }

    Scaffold the Product model again and view a scaffolded page; you will see the error message mentioned above. The exception is caused by the model change, not by a scaffolding issue. However, because of the order in which users do things, it might sometimes appear to be related to scaffolding.

    There are a few ways to work around this error; each has some pros and cons, depending on what you need.

    1) Code First Migration with Entity Framework.

    Following the instructions at Code First Migration requires a number of steps.

    First, you need to bring up the “Package Manager Console” (Tools –> Library Package Manager –> Package Manager Console) and run the Enable-Migrations command in the console as the first step to enable migrations for your context.

    PM> Enable-Migrations -ContextTypeName WebApplication7.Models.WebApplication7Context
    Checking if the context targets an existing database...
    Detected database created with a database initializer. Scaffolded migration '201309271941079_InitialCreate' corresponding to existing database. To use an automatic migration instead, delete the Migrations folder and re-run Enable-Migrations specifying the -EnableAutomaticMigrations parameter.
    Code First Migrations enabled for project WebApplication7.

     

    From now on, each time you modify the model and re-run the scaffolding, you need to run the Add-Migration, and Update-Database commands. In our example above, after adding Description and Category, we run those commands as follows.

    PM> Add-Migration AddDescriptionCategory
    Scaffolding migration 'AddDescriptionCategory'.
    The Designer Code for this migration file includes a snapshot of your current Code First model. This snapshot is used to calculate the changes to your model when you scaffold the next migration. If you make additional changes to your model that you want to include in this migration, then you can re-scaffold it by running 'Add-Migration AddDescriptionCategory' again.
    PM> Update-Database
    Specify the '-Verbose' flag to view the SQL statements being applied to the target database.
    Applying explicit migrations: [201309280107120_AddDescriptionCategory].
    Applying explicit migration: 201309280107120_AddDescriptionCategory.
    Running Seed method.
    PM> 

    Now you can view the scaffolded pages and won’t see the error anymore.

    The good thing about this approach is that you modify only the table representing the model you’re changing, and leave the rest of the database intact. However, if you make a lot of changes to the model and want some quick verification of the scaffolded pages, going through those steps each time can add a considerable amount of time. Also, those steps will add a number of files to your project that can grow quickly.

    Migration is usually the best choice when you use it to deploy to production the first time and when you’re deploying updates to production later. For more information about migrations for deployment, see http://www.asp.net/web-forms/tutorials/deployment/visual-studio-web-deployment/introduction.

    For rapid development, options 2 and 3 below might be a better choice.

    2) Modify the Database name in the connectionString

    A quick workaround to the existing database issue, without going through all the database migration steps, is to modify the connection string in web.config. Each time you change the Code First model and scaffold it, you can give the connection string new “Initial Catalog” and “AttachDbFilename” values.

    image
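
    For illustration, the modified connection string might look something like the following (the database and file names here are made up):

    <connectionStrings>
      <add name="DefaultConnection"
           connectionString="Data Source=(LocalDb)\v11.0;Initial Catalog=ProductsDb-v2;Integrated Security=True;AttachDbFilename=|DataDirectory|\ProductsDb-v2.mdf"
           providerName="System.Data.SqlClient" />
    </connectionStrings>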

    This approach will create a new database each time you give a new database name in the connection string, and it leaves the old database unused. This might not be ideal if you already have a large existing database. However, it could be useful if you just want to verify something quickly once or twice.

     

    3) Database Initializer with Entity Framework

    With the second workaround, you still need to remember to modify the connection string each time the model is changed. If you want to make one setting change and then forget about it while you’re building your app, you can achieve this with a custom initializer class. EF will drop, recreate and re-seed the database each time the model changes, and you set up this action only once in your project.

    Following is an example of a custom initializer class.

    public class SchoolInitializer : DropCreateDatabaseIfModelChanges<SchoolContext>
    {
        protected override void Seed(SchoolContext context)
        {
            var students = new List<Student>
            {
                new Student { FirstMidName = "Carson",   LastName = "Alexander", EnrollmentDate = DateTime.Parse("2005-09-01") },
                new Student { FirstMidName = "Meredith", LastName = "Alonso",    EnrollmentDate = DateTime.Parse("2002-09-01") },
                new Student { FirstMidName = "Arturo",   LastName = "Anand",     EnrollmentDate = DateTime.Parse("2003-09-01") },
                new Student { FirstMidName = "Gytis",    LastName = "Barzdukas", EnrollmentDate = DateTime.Parse("2002-09-01") },
                new Student { FirstMidName = "Yan",      LastName = "Li",        EnrollmentDate = DateTime.Parse("2002-09-01") },
                new Student { FirstMidName = "Peggy",    LastName = "Justice",   EnrollmentDate = DateTime.Parse("2001-09-01") },
                new Student { FirstMidName = "Laura",    LastName = "Norman",    EnrollmentDate = DateTime.Parse("2003-09-01") },
                new Student { FirstMidName = "Nino",     LastName = "Olivetto",  EnrollmentDate = DateTime.Parse("2005-09-01") }
            };
            students.ForEach(s => context.Students.Add(s));
            context.SaveChanges();
        }
    }

    Then add the following code in Global.asax.cs

    public class MvcApplication : System.Web.HttpApplication
    {
        protected void Application_Start()
        {
            Database.SetInitializer<SchoolContext>(new SchoolInitializer());
        }
    }

    From now on, you can just focus on building the right model for your application, scaffold it, and test it as many times as you need, without having to do anything extra to avoid the error mentioned at the beginning of this blog post.

    This approach will drop and create a new database only when the model is changed, and it won’t leave behind a lot of unused databases the way option 2 does.

    Another way to tell Entity Framework to use your initializer class is to add an element to the entityFramework element in the application’s Web.config file, instead of modifying the Global.asax.cs file. For more information about how to do it, you can read Tom's fabulous blog at Creating an Entity Framework Data Model for an ASP.NET MVC Application. Setting up the initializer in the Web.Config file is sometimes preferable because you can turn it on or off or change it without changing code.
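
    For reference, the Web.config-based configuration goes inside the application’s existing entityFramework section and looks roughly like the following (the assembly-qualified type names below are assumptions based on the sample project names used in this post):

    <entityFramework>
      <contexts>
        <context type="WebApplication3.Models.SchoolContext, WebApplication3">
          <databaseInitializer type="WebApplication3.Models.SchoolInitializer, WebApplication3" />
        </context>
      </contexts>
    </entityFramework>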

    Tutorial series updated for Entity Framework 6 Code First with MVC 5


    We have updated our EF Code First / MVC tutorial series to use Visual Studio 2013, Entity Framework 6, and MVC 5. New Entity Framework 6 features covered in these tutorials include:

    • Connection resiliency
    • Command interception
    • Code-based configuration
    • Async
    • Stored procedures

    The series includes the following tutorials:

    The old EF 5 / MVC 4 series is still available at a new URL:

    The old series covers some topics that have not been included in the new series yet.  We plan to add these to the new series later:

    If there are other topics you’d like us to cover in these tutorials that we haven’t covered yet, please let us know.

    - Tom Dykstra

    - Rick Anderson


    Introducing batch support in Web API and Web API OData


    With the release of Microsoft ASP.NET Web API 2 OData, we have introduced support for batching requests. Batching is a Web API feature that allows a client to pack several API requests, send them to the Web API service in one HTTP request, and receive a single HTTP response containing the responses to all of its requests. This way, the client can optimize calls to the server and improve the scalability of its service.

    For a more in depth look at the batch support, you can take a look at the specification.

    Batch in Web API

    In order to start using batch in Web API, the only requirement is to register a route with a batch handler. Let’s start by creating the server. For this sample we will be using the brand new OWIN host for Web API. The code below creates the server and configures a web API route and a batch route.

    static void Main(string[] args)
    {
        string serviceUrl = "http://localhost:12345";
        using (WebApp.Start(serviceUrl, Configuration))
        {
            Console.WriteLine("Service listening at {0}", serviceUrl);
            Console.WriteLine("Press any key to stop the service and exit the application");
            Console.ReadKey();
        }
    }

    private static void Configuration(IAppBuilder builder)
    {
        HttpConfiguration configuration = new HttpConfiguration();
        HttpServer server = new HttpServer(configuration);
        configuration.Routes.MapHttpBatchRoute(
            routeName: "batch",
            routeTemplate: "api/batch",
            batchHandler: new DefaultHttpBatchHandler(server));
        configuration.Routes.MapHttpRoute("api", "api/{controller}/{id}", new { id = RouteParameter.Optional });
        builder.UseWebApi(server);
    }
    

    If we look in detail at the code in the Configuration method, we can see that the following method call is the only thing required to enable batching in your Web API service:

    configuration.Routes.MapHttpBatchRoute(
        routeName: "batch",
        routeTemplate: "api/batch",
        batchHandler: new DefaultHttpBatchHandler(server));
    

    The important things to notice are the following:

    • The DefaultHttpBatchHandler requires an instance of an HttpServer to work. We can create this instance ourselves in the self-host scenario or we can get it by accessing it from GlobalConfiguration.DefaultServer.
    • The DefaultHttpBatchHandler executes each request sequentially, but if there are no dependencies between your requests, you can set a property to make execution non-sequential, as shown in the sketch after this list.
    • The MapHttpBatchRoute route template can contain parameters. So, in case you create your own batch handler, you can pass in parameters to it.
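
    A minimal sketch of opting into non-sequential execution via the handler’s ExecutionOrder property:

    var batchHandler = new DefaultHttpBatchHandler(server)
    {
        // Process the parts of the batch concurrently instead of one after another.
        ExecutionOrder = BatchExecutionOrder.NonSequential
    };
    configuration.Routes.MapHttpBatchRoute(
        routeName: "batch",
        routeTemplate: "api/batch",
        batchHandler: batchHandler);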

    That’s all that is required to enable batch in your service. Let’s see how to send batch requests and read batch responses using HttpClient. In order to do that, we are going to need a model, a web API controller and a route to dispatch the incoming requests. We will be using Entity Framework as the backend and AutoFixture to generate some sample data for the service. Here is the code for all of it:

    Route:

    configuration.Routes.MapHttpRoute(
        name: "api",
        routeTemplate: "api/{controller}/{id}",
        defaults: new { id = RouteParameter.Optional });
    

    Model:

    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }
    

    Controller:

    public class WebCustomersController : ApiController
    {
        CustomersContext context = new CustomersContext();
        [Queryable(PageSize = 10, MaxExpansionDepth = 2)]
        public IHttpActionResult Get()
        {
            return Ok(context.Customers);
        }
    
        public async Task<IHttpActionResult> Post([FromBody] Customer entity)
        {
            if (entity == null)
            {
                return BadRequest(ModelState);
            }
            context.Customers.Add(entity);
            await context.SaveChangesAsync();
            return CreatedAtRoute("api", new { controller = "ApiCustomers" }, entity);
        }
    
        public async Task<IHttpActionResult> Put(int id, [FromBody] Customer entity)
        {
            if (entity == null)
            {
                return BadRequest(ModelState);
            }
            else if (id != entity.Id)
            {
                return BadRequest("The key from the url must match the key of the entity in the body");
            }
            var originalCustomer = await context.Customers.FindAsync(id);
            if (originalCustomer == null)
            {
                return NotFound();
            }
            else
            {
                context.Entry(originalCustomer).CurrentValues.SetValues(entity);
                await context.SaveChangesAsync();
            }
            return Content(HttpStatusCode.OK, entity);
        }
    
        public async Task<IHttpActionResult> Delete(int id)
        {
            Customer entity = await context.Customers.FindAsync(id);
            if (entity == null)
            {
                return NotFound();
            }
            else
            {
                context.Customers.Remove(entity);
                await context.SaveChangesAsync();
                return StatusCode(HttpStatusCode.NoContent);
            }
        }
    }

    Entity framework context:

    public class CustomersContext : DbContext
    {
        static CustomersContext()
        {
            Database.SetInitializer<CustomersContext>(new CustomersContextInitializer());
        }
    
        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            base.OnModelCreating(modelBuilder);
        }
    
        private class CustomersContextInitializer : DropCreateDatabaseAlways<CustomersContext>
        {
            protected override void Seed(CustomersContext context)
            {
                Fixture fixture = new Fixture();
                IEnumerable<Customer> customers = fixture.CreateMany<Customer>(20).ToList();
                context.Customers.AddRange(customers);
            }
        }
    
        public DbSet<Customer> Customers { get; set; }
    }
    

    Now that we have a running service with some data in it, and we have enabled batching support, we can move on to actually performing batch requests. The default Web API batch implementation is based on the multipart MIME content type. It accepts multipart requests containing multiple HTTP requests, processes all of them, and sends back a multipart response containing the individual responses. Let’s start by creating some requests:

    Fixture fixture = new Fixture();
    HttpClient client = new HttpClient();
    dynamic listOfCustomers = JToken.Parse(await client.GetStringAsync("http://localhost:12345/api/WebCustomers"));
    dynamic firstCustomer = listOfCustomers[0];
    firstCustomer.Name = "Peter";
    dynamic secondCustomer = listOfCustomers[1];
    JsonMediaTypeFormatter formatter = new JsonMediaTypeFormatter();
    
    //Create a request to query for customers
    HttpRequestMessage queryCustomers = new HttpRequestMessage(HttpMethod.Get, "http://localhost:13245/api/WebCustomers");
    //Create a message to add a customer
    HttpRequestMessage addCustomer = new HttpRequestMessage(HttpMethod.Post, "http://localhost:13245/api/WebCustomers");
    addCustomer.Content = new ObjectContent<Customer>(fixture.Create<Customer>(), formatter);
    //Create a message to update a customer
    HttpRequestMessage updateCustomer = new HttpRequestMessage(HttpMethod.Put, string.Format("http://localhost:13245/api/WebCustomers/{0}", firstCustomer.Id));
    updateCustomer.Content = new ObjectContent<dynamic>(firstCustomer, formatter);
    //Create a message to remove a customer.
    HttpRequestMessage removeCustomer = new HttpRequestMessage(HttpMethod.Delete, string.Format("http://localhost:13245/api/WebCustomers/{0}", secondCustomer.Id));

    The first block before the blank line only performs a query to get some customers that we can update and delete in a request later. At this point, we could just send four requests using HttpClient, and our service would just process the requests and send individual responses back.

    In order to batch those requests together into a single HTTP request, we need to encapsulate them into HttpMessageContent instances and add those instances to a MultipartContent instance. Here is the code to do that:

    //Create the different parts of the multipart content
    HttpMessageContent queryContent = new HttpMessageContent(queryCustomers);
    HttpMessageContent addCustomerContent = new HttpMessageContent(addCustomer);
    HttpMessageContent updateCustomerContent = new HttpMessageContent(updateCustomer);
    HttpMessageContent removeCustomerContent = new HttpMessageContent(removeCustomer);
    
    //Create the multipart/mixed message content
    MultipartContent content = new MultipartContent("mixed", "batch_" + Guid.NewGuid().ToString());
    content.Add(queryContent);
    content.Add(addCustomerContent);
    content.Add(updateCustomerContent);
    content.Add(removeCustomerContent);

    If you look at the code above, the multipart content needs to have a subtype of mixed, and a boundary, which is a unique identifier that marks the separation between the different parts of the content.

    Now that we have created the multipart/mixed content, the last thing that we need to do in order to have a valid batch request is to create the HTTP request and associate the content with it. We can do that as shown in the following fragment:

    //Create the request to the batch service
    HttpRequestMessage batchRequest = new HttpRequestMessage(HttpMethod.Post, "http://localhost:12345/api/batch");
    //Associate the content with the message
    batchRequest.Content = content;

    With this last part, we are ready to send a batch request to the server and get a response. To do that, we just use an HttpClient instance, call SendAsync passing the request message to it, and capture the associated response. If we do just that, the following request gets sent and the following response comes back:

    Request:

    POST http://localhost:12345/api/batch HTTP/1.1
    Content-Type: multipart/mixed; boundary="batch_357647d1-a6b5-4e6a-aa73-edfc88d8866e"
    Host: localhost:12345
    Content-Length: 857
    Expect: 100-continue
    
    --batch_357647d1-a6b5-4e6a-aa73-edfc88d8866e
    Content-Type: application/http; msgtype=request
    
    GET /api/WebCustomers HTTP/1.1
    Host: localhost:13245
    
    
    --batch_357647d1-a6b5-4e6a-aa73-edfc88d8866e
    Content-Type: application/http; msgtype=request
    
    POST /api/WebCustomers HTTP/1.1
    Host: localhost:13245
    Content-Type: application/json; charset=utf-8
    
    {"Id":129,"Name":"Name4752cbf0-e365-43c3-aa8d-1bbc8429dbf8"}
    --batch_357647d1-a6b5-4e6a-aa73-edfc88d8866e
    Content-Type: application/http; msgtype=request
    
    PUT /api/WebCustomers/1 HTTP/1.1
    Host: localhost:13245
    Content-Type: application/json; charset=utf-8
    
    {"Id":1,"Name":"Peter"}
    --batch_357647d1-a6b5-4e6a-aa73-edfc88d8866e
    Content-Type: application/http; msgtype=request
    
    DELETE /api/WebCustomers/2 HTTP/1.1
    Host: localhost:13245
    
    
    --batch_357647d1-a6b5-4e6a-aa73-edfc88d8866e--
    

    If we look at the request above, we can see, as we have said before, that all the messages are separated by a boundary, which in this case is “batch_357647d1-a6b5-4e6a-aa73-edfc88d8866e”. Also, we can see that every request message has a Content-Type of application/http. This header is introduced by the HttpMessageContent in which we wrapped each of our requests. If we wanted to, we could have added extra headers that we could read and use for handling the processing of the message in a custom batch handler.

    If we look at the response below we can clearly see that it follows the same pattern, with a multipart/mixed content type, a boundary to separate the different parts of the multipart and a collection of responses encapsulated in HttpContent parts.

    Response:

    HTTP/1.1 200 OK
    Content-Length: 1373
    Content-Type: multipart/mixed; boundary="61cfbe41-7ea6-4771-b1c5-b43564208ee5"
    Server: Microsoft-HTTPAPI/2.0
    Date: Fri, 25 Oct 2013 06:30:14 GMT
    
    --61cfbe41-7ea6-4771-b1c5-b43564208ee5
    Content-Type: application/http; msgtype=response
    
    HTTP/1.1 200 OK
    Content-Type: application/json; charset=utf-8
    
    [{"Id":1,"Name":"Namefc4b8794-943b-487a-9049-a8559232b9dd"},{"Id":2,"Name":"Name244bbada-3e83-43c8-82f7-5b2c4d72f2ed"},{"Id":3,"Name":"Nameec11d080-7f2d-47df-a483-7ff251cdda7a"},{"Id":4,"Name":"Name14ff5a3d-ad92-41f6-b4f6-9b94622f4968"},{"Id":5,"Name":"Name00f9e4cc-673e-4139-ba30-bfc273844678"},{"Id":6,"Name":"Name01f6660c-d1de-4c05-8567-8ae2759c4117"},{"Id":7,"Name":"Name60030a17-6316-427c-a744-b2fff6d9fe11"},{"Id":8,"Name":"Namefa61eb4c-9f9e-47a2-8dc5-15d8afe33f2d"},{"Id":9,"Name":"Name9b680c10-1727-43f5-83cf-c8eda3a63790"},{"Id":10,"Name":"Name9e66d797-d3a9-44ec-814d-aecde8040ced"}]
    --61cfbe41-7ea6-4771-b1c5-b43564208ee5
    Content-Type: application/http; msgtype=response
    
    HTTP/1.1 201 Created
    Location: http://localhost:13245/api/ApiCustomers
    Content-Type: application/json; charset=utf-8
    
    {"Id":21,"Name":"Name4752cbf0-e365-43c3-aa8d-1bbc8429dbf8"}
    --61cfbe41-7ea6-4771-b1c5-b43564208ee5
    Content-Type: application/http; msgtype=response
    
    HTTP/1.1 200 OK
    Content-Type: application/json; charset=utf-8
    
    {"Id":1,"Name":"Peter"}
    --61cfbe41-7ea6-4771-b1c5-b43564208ee5
    Content-Type: application/http; msgtype=response
    
    HTTP/1.1 204 No Content
    
    
    --61cfbe41-7ea6-4771-b1c5-b43564208ee5--

    Finally, now that we have sent the request and received a response, we need to read it and extract the individual responses from the content. To accomplish this, we can use the ReadAsMultipartAsync and ReadAsHttpResponseMessageAsync extension methods on the Content property of the response.

    HttpResponseMessage response = await client.SendAsync(batchRequest);
    //Reads the individual parts in the content and loads them in memory
    MultipartMemoryStreamProvider responseContents = await response.Content.ReadAsMultipartAsync();
    //Extracts each of the individual Http responses
    HttpResponseMessage queryResponse = await responseContents.Contents[0].ReadAsHttpResponseMessageAsync();
    HttpResponseMessage addResponse = await responseContents.Contents[1].ReadAsHttpResponseMessageAsync();
    HttpResponseMessage updateResponse = await responseContents.Contents[2].ReadAsHttpResponseMessageAsync();
    HttpResponseMessage removeResponse = await responseContents.Contents[3].ReadAsHttpResponseMessageAsync();
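
    From here, each extracted response can be inspected like any other HttpResponseMessage. A small follow-up sketch, reusing the names from the snippet above:

    // Check the status of each part and, for example, read the payload of the query.
    if (queryResponse.IsSuccessStatusCode)
    {
        string customersJson = await queryResponse.Content.ReadAsStringAsync();
        Console.WriteLine(customersJson);
    }
    Console.WriteLine("Add: {0}, Update: {1}, Remove: {2}",
        addResponse.StatusCode, updateResponse.StatusCode, removeResponse.StatusCode);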
    

    And with this, we have successfully sent and read a batch request to Web API. Let’s look at how to do the same thing with Web API OData.

    Batch in OData

    In order to use batch with OData services, the flow is very similar to the flow in Web API. We will be reusing the model and the backend from the sample above; the only things we’ll need are an OData model, a route that defines the OData endpoint, and a controller to handle the incoming requests, all of which are shown below:

    Route:

    configuration.Routes.MapODataRoute("odata", "odata", GetModel(), new DefaultODataBatchHandler(server));

    As you can see from the fragment above, MapODataRoute accepts an ODataBatchHandler as a parameter. In this case, we are using the DefaultODataBatchHandler, but we could also use the UnbufferedODataBatchHandler.
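
    If needed, the batch handler can also be tuned before it is passed to the route. The fragment below is illustrative rather than taken from the sample; it assumes the MessageQuotas property exposed by the OData batch handlers in this release:

    // Illustrative only: limit how large a batch the service will accept.
    DefaultODataBatchHandler batchHandler = new DefaultODataBatchHandler(server);
    batchHandler.MessageQuotas.MaxPartsPerBatch = 10;          // at most 10 parts per batch
    batchHandler.MessageQuotas.MaxOperationsPerChangeset = 10; // at most 10 operations per change set
    configuration.Routes.MapODataRoute("odata", "odata", GetModel(), batchHandler);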

    OData model:

    private static IEdmModel GetModel()
    {
        ODataModelBuilder builder = new ODataConventionModelBuilder();
        builder.ContainerName = "CustomersContext";
        EntitySetConfiguration<Customer> customers = builder.EntitySet<Customer>("Customers");
        return builder.GetEdmModel();
    }
    

    Controller:

    public class CustomersController : ODataController
    {
        CustomersContext context = new CustomersContext();
        [Queryable(PageSize = 10, MaxExpansionDepth = 2)]
        public IHttpActionResult Get()
        {
            return Ok(context.Customers);
        }
    
        public async Task<IHttpActionResult> Post([FromBody] Customer entity)
        {
            if (entity == null)
            {
                return BadRequest(ModelState);
            }
            context.Customers.Add(entity);
            await context.SaveChangesAsync();
            return Created(entity);
        }
    
        public async Task<IHttpActionResult> Put([FromODataUri] int key, [FromBody] Customer entity)
        {
            if (entity == null)
            {
                return BadRequest(ModelState);
            }
            else if (key != entity.Id)
            {
                return BadRequest("The key from the url must match the key of the entity in the body");
            }
            var originalCustomer = await context.Customers.FindAsync(key);
            if (originalCustomer == null)
            {
                return NotFound();
            }
            else
            {
                context.Entry(originalCustomer).CurrentValues.SetValues(entity);
                await context.SaveChangesAsync();
            }
            return Updated(entity);
        }
    
        [AcceptVerbs("PATCH", "MERGE")]
        public async Task<IHttpActionResult> Patch([FromODataUri] int key, Delta<Customer> patch)
        {
            object id;
            if (patch == null)
            {
                return BadRequest("The entity is malformed");
            }
            else if (patch.TryGetPropertyValue("Id", out id) && (int)id != key)
            {
                return BadRequest("The key from the url must match the key of the entity in the body");
            }
            Customer originalEntity = await context.Customers.FindAsync(key);
            if (originalEntity == null)
            {
                return NotFound();
            }
            else
            {
                patch.Patch(originalEntity);
                await context.SaveChangesAsync();
            }
            return Updated(originalEntity);
        }
    
    
        public async Task<IHttpActionResult> Delete([FromODataUri]int key)
        {
            Customer entity = await context.Customers.FindAsync(key);
            if (entity == null)
            {
                return NotFound();
            }
            else
            {
                context.Customers.Remove(entity);
                await context.SaveChangesAsync();
                return StatusCode(HttpStatusCode.NoContent);
            }
        }
    }

    We could use HttpClient to send batch requests to our OData service, but it’s much easier if we make use of the client generated through the Add Service Reference dialog in Visual Studio 2013.

    To do that, start the service by pressing Ctrl+F5, then right-click the References node of the client project, select Add Service Reference, and give it the metadata URL of our service, which in this case is http://localhost:12345/odata/$metadata.

    [Screenshot: Add Service Reference dialog pointing at the service metadata URL]

    Now that we have a generated client, we can just start using it to work with the OData endpoint and batch requests. The generated client supports two flows. It can batch a set of queries and return a batch response with the responses to the individual queries, or it can batch a change set of requests that perform modifications on the server.

    In order to batch and read a set of queries, we can do the following:

    CustomersContext context = new CustomersContext(new Uri("http://localhost:12345/odata/"));
    context.Format.UseJson();
    //Create the queries
    DataServiceRequest<Customer> firstTwoCustomers = new DataServiceRequest<Customer>(new Uri("http://localhost:12345/odata/Customers?$top=2&$orderby=Id"));
    DataServiceRequest<Customer> nextTwoCustomers = new DataServiceRequest<Customer>(new Uri("http://localhost:12345/odata/Customers?$skip=2&$top=2&$orderby=Id"));
    //Send the queries
    DataServiceResponse batchResponse = context.ExecuteBatch(firstTwoCustomers, nextTwoCustomers);
    foreach (QueryOperationResponse response in batchResponse)
    {
        foreach(Customer c in response.Cast<Customer>())
        {
            //Do something
        }
    }
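
    Each QueryOperationResponse also carries the HTTP status code of its part, and the DataServiceResponse exposes the status code of the batch itself, so the results can be checked along these lines (a sketch, reusing batchResponse from above):

    // Inspect the batch response and the status of each individual query.
    Console.WriteLine("Batch status: {0}", batchResponse.BatchStatusCode);
    foreach (QueryOperationResponse part in batchResponse)
    {
        Console.WriteLine("Part status: {0}", part.StatusCode);
    }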

    In order to batch a set of changes to the server, we can do the following:

    Fixture fixture = new Fixture();
    CustomersContext context = new CustomersContext(new Uri("http://localhost:12345/odata/"));
    context.Format.UseJson();
    IList<Customer> customers = context.Customers.ToList();
    Customer customerToAdd = fixture.Create<Customer>();
    Customer customerToUpdate = customers.Skip(1).First();
    Customer customerToDelete = customers.Skip(2).First();
    context.AddToCustomers(customerToAdd);
    customerToUpdate.Name = "Peter";
    context.UpdateObject(customerToUpdate);
    context.DeleteObject(customerToDelete);
    DataServiceResponse response = context.SaveChanges(SaveChangesOptions.Batch | SaveChangesOptions.ReplaceOnUpdate);
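
    The DataServiceResponse returned by SaveChanges can then be inspected to confirm that each operation in the change set succeeded; a minimal sketch:

    // Each request in the change set produces one ChangeOperationResponse.
    foreach (ChangeOperationResponse changeResponse in response)
    {
        Console.WriteLine("Operation completed with status code {0}", changeResponse.StatusCode);
    }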

    Finally, the wire format for OData batch requests is the same as the format for Web API batch requests, with a few minor differences. In OData, the requests in a batch are divided into two categories: query operations and change sets. Query operations, as the name implies, don’t perform any modification on the server, in contrast with change sets, which group a set of state-changing operations into a single unit.

    These concepts are reflected in the format of the request and of the response. Change sets are encoded as nested mime/multipart parts inside the outer mime/multipart content that is the batch request. The response follows the same structure, as we can see below:

    HTTP/1.1 202 Accepted
    Content-Length: 1088
    Content-Type: multipart/mixed; boundary=batchresponse_34d95fb9-d930-4443-8b8b-774b467ba1af
    Server: Microsoft-HTTPAPI/2.0
    DataServiceVersion: 3.0
    Date: Fri, 25 Oct 2013 08:01:17 GMT
    
    --batchresponse_34d95fb9-d930-4443-8b8b-774b467ba1af
    Content-Type: multipart/mixed; boundary=changesetresponse_7b32b21f-547b-4eb5-a1ca-cd7b28753fec
    
    --changesetresponse_7b32b21f-547b-4eb5-a1ca-cd7b28753fec
    Content-Type: application/http
    Content-Transfer-Encoding: binary
    
    HTTP/1.1 201 Created
    Location: http://localhost:12345/odata/Customers(21)
    Content-ID: 11
    Content-Type: application/json; odata=minimalmetadata; charset=utf-8
    DataServiceVersion: 3.0
    
    {
      "odata.metadata":"http://localhost:12345/odata/$metadata#Customers/@Element","Id":21,"Name":"Name7a88d78d-61e0-4951-8852-6b05be5e913b"
    }
    --changesetresponse_7b32b21f-547b-4eb5-a1ca-cd7b28753fec
    Content-Type: application/http
    Content-Transfer-Encoding: binary
    
    HTTP/1.1 204 No Content
    Content-ID: 12
    
    
    --changesetresponse_7b32b21f-547b-4eb5-a1ca-cd7b28753fec
    Content-Type: application/http
    Content-Transfer-Encoding: binary
    
    HTTP/1.1 204 No Content
    Content-ID: 13
    
    
    --changesetresponse_7b32b21f-547b-4eb5-a1ca-cd7b28753fec--
    --batchresponse_34d95fb9-d930-4443-8b8b-774b467ba1af--
    

    With this, we conclude our blog post on the support for batch in Web API and Web API OData. I hope you enjoyed it, and happy coding :).

    Remote Debugging a Windows Azure Web Site with Visual Studio 2013


    In the Azure SDK 2.2 we released remote debugging support for Windows Azure Cloud Services. You can read more about that release at Scott Guthrie’s blog post Windows Azure: Announcing release of Windows Azure SDK 2.2 (with lots of goodies). You can find more info on Windows Azure Web Sites diagnostics and debugging at our docs for Web Sites diagnostics and debugging as well.

    When we released the Azure SDK 2.2, the server-side support for remote debugging Windows Azure Web Sites was not yet in production, so the command was not shown in Visual Studio. We have now published the server-side support in Windows Azure Web Sites, and the feature is automatically enabled in Visual Studio.

    In this post you will find the download links required to try out the new features as well as more info about the support.

    How to get the new features?

    In order to remotely debug your site you will need to download and install the following.

    · Any version of Visual Studio 2013 which supports remote debugging

    · Azure SDK 2.2

    After installing the Azure SDK 2.2, you will see a new menu option, Attach Debugger, for your Azure Web Sites, as shown in the image below.

    [Screenshot: Attach Debugger menu option for a web site in Server Explorer]

    Now let’s see how you can use this new feature.

    Remote debugging walkthrough

    For a new site running in Windows Azure Web Sites, follow these steps to start a remote debugging session:

    1. Publish your site to Windows Azure Web Sites

    2. Invoke the Attach Debugger menu option in Server Explorer

    To have the best debugging experience you should publish your site using the Debug build configuration. You can configure this for your publish profile on the Settings tab of the Web Publish dialog. The drop down is shown in the following image.

    [Screenshot: build configuration drop-down on the Settings tab of the Web Publish dialog]

    After publishing your application you can use the Server Explorer in Visual Studio to access your web sites. If you haven’t already, you may need to sign in to Windows Azure in Visual Studio. You can do this using the Connect to Windows Azure button on the Server Explorer, shown in the image below.

    [Screenshot: Connect to Windows Azure button in Server Explorer]

    After signing in, you will see your Web Sites under the Windows Azure node in Server Explorer. Right-click the site that you would like to debug and select Attach Debugger. When this is invoked, the remote debugging agent is started on your web site, your site is restarted with the agent attached, your default browser opens to the URL of your site, and Visual Studio attaches the remote debugger. The first time you do this the delay will be about 20 seconds, but subsequent attaches will be much quicker. If you disable the remote debugging option in the portal, you’ll experience the ~20-second delay again.

    After that you can debug your remote site as you would your local project. You can step through code, set breakpoints, break on exceptions, evaluate expressions, and all the other goodness you are used to.

    Note: currently the support here is designed for single instance sites. If you attach to a web site running multiple instances, you will attach to a random instance. In the future we may look at providing a better experience here, but we do not have any specific plans yet.

    For more info on remote debugging Windows Azure Web Sites you can visit http://www.windowsazure.com/en-us/develop/net/tutorials/troubleshoot-web-sites-in-visual-studio/#remotedebug.

    Remote debugging with Visual Studio 2012

    You can also remotely debug your Windows Azure Web Site with Visual Studio 2012, but you’ll need to configure a few things manually for now. We are working to bring the same experience to Visual Studio 2012, but we are not there yet. Until then, you can use the steps below.

    1. In the Windows Azure Management Portal, go to the Configure tab for your web site, and then scroll down to the Site Diagnostics section
    2. Set Remote Debugging to On, and set Remote Debugging Visual Studio Version to 2012
    3. In the Visual Studio Debug menu, click Attach to Process
    4. In the Qualifier box, enter the URL for your web site, without the http:// prefix
    5. Select Show processes from all users
    6. When you're prompted for credentials, enter the user name and password that has permissions to publish the web site. To get these credentials, go to the Dashboard tab for your web site in the management portal and click Download the publish profile. Open the file in a text editor, and you'll find the user name and password after the first occurrences of userName= and userPWD=.
    7. When the processes appear in the Available Processes table, select w3wp.exe, and then click Attach.
    8. Open a browser to your site URL.
    • You might have to wait 20 seconds or so while Windows Azure sets up the server for debugging. This delay only happens the first time you run in debug mode on a web site; when you start debugging again within the next 48 hours, there won't be a delay.

     

    Please let us know what you think about this feature in the comments below.

    Sayed Ibrahim Hashimi | http://msbuildbook.com | @SayedIHashimi

    A High-Value, Undocumented LESS Editor Feature in Visual Studio


    Recently a very valuable, undocumented feature in the Visual Studio LESS editor came to my attention. This applies equally to Visual Studio 2012 Update 2 and later, and to Visual Studio 2013. The installation of Web Essentials is not required to benefit from this feature.

    The Visual Studio LESS editor respects @import statements, and can resolve namespace, mixin and variable definitions from imported files. However, a common pattern in larger LESS projects is to have one include file establishing the include order, and a large number of source files with no @imports at all. The Bootstrap project, for example, contains bootstrap.less, which looks like:

    // Core variables and mixins
    @import "variables.less";
    @import "mixins.less";

    // Reset
    @import "normalize.less";
    @import "print.less";

    When any of the project files, such as mixins.less, is opened, it will display a long list of validation errors. IntelliSense will not work for the items that come from imports unless one adds an @import statement to each file. Unfortunately, adding @import statements to each file ruins your ability to build the project or easily update it.

    A solution to this problem is to use a special “reference” comment, exactly like the one used for JavaScript IntelliSense references in *.js files. At the top of each Bootstrap LESS file, add:

    /// <reference path="bootstrap.less" />

    Once this is done, the editor references the specified LESS file(s) when preparing validation and IntelliSense without changing the semantics of the source code.

    Note: the LESS compiler used by Web Essentials does not respect this comment, so you will not be able to build such projects inside the IDE with Web Essentials; you will still need to use a command-line build. However, you will find that your editing experience is greatly enhanced.

    OData Scaffolding


    With the release of Visual Studio 2013 RTM, we added support for scaffolding OData controllers with Entity Framework. In this blog post we will cover the following topics:
    •    Scaffolding an OData controller with Entity Framework in a Web API 2 project.
    •    Extra configuration steps required to set up OData scaffolding in an MVC project.

    Scaffolding an OData controller with Entity Framework in a Web API 2 project

    Create a web project using the ASP.NET Web Application template and select Web API. Create the following model classes in the Models folder of the project:

    public class Customer
    {
        public int CustomerId { get; set; }
        public string CustomerName { get; set; }
        public ICollection<Order> Orders { get; set; }
    }

    public class Order
    {
        public int OrderId { get; set; }
        public string OrderName { get; set; }
        public Customer Customer { get; set; }
    }

    Build the project.
    Right-click on the Controllers folder and select “New Scaffolded Item”. As an alternative, you can also select “Controller”.

    [Screenshot: Add > New Scaffolded Item in the context menu]
    Choose “Web API 2 OData Controller with actions, using Entity Framework”

    [Screenshot: Add Scaffold dialog with “Web API 2 OData Controller with actions, using Entity Framework” selected]

    In the “Add Controller” dialog, name the controller “CustomerController”. Choose the Customer class as the model in the dropdown menu and click on “New Data Context”. If you check “Use async controller actions”, an OData controller with async methods is created. You can look at the blog topic for async controllers here.

    [Screenshot: Add Controller dialog]

    This creates an OData controller for Customer. To get this code running, we need to follow the instructions in the readme.txt that is generated when the controller is created.

    [Screenshot: generated readme.txt]

     

    [Screenshot: scaffolding instructions in CustomerController.cs]

    As mentioned in readme.txt, look at the instructions in CustomerController.cs. Copy the using statements (that don’t already exist) from the instructions in CustomerController.cs to the top of the WebApiConfig.cs file. Copy the rest of the statements from the instructions and place them inside the Register method of WebApiConfig.cs. After you are done, your WebApiConfig.cs should look like this:

    [Screenshot: WebApiConfig.cs after adding the OData configuration]
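
    For readers following along without the screenshot, the Register method typically ends up looking roughly like the sketch below; the exact namespaces and entity set names come from the generated instructions and may differ in your project:

    using System.Web.Http;
    using System.Web.Http.OData.Builder;
    // plus the namespace that contains your model classes

    public static class WebApiConfig
    {
        public static void Register(HttpConfiguration config)
        {
            // Added from the scaffolded instructions in CustomerController.cs
            ODataConventionModelBuilder builder = new ODataConventionModelBuilder();
            builder.EntitySet<Customer>("Customer");
            builder.EntitySet<Order>("Order");
            config.Routes.MapODataRoute("odata", "odata", builder.GetEdmModel());

            // Existing Web API configuration from the project template
            config.MapHttpAttributeRoutes();
            config.Routes.MapHttpRoute(
                name: "DefaultApi",
                routeTemplate: "api/{controller}/{id}",
                defaults: new { id = RouteParameter.Optional }
            );
        }
    }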
    If your model has one or more navigation properties, then besides the CRUD actions for the entity, a Get method is created for each navigation property too:

    [Screenshot: generated Get methods for the navigation properties]
    Build the project. Now you are all set to run your application and use the OData actions in the Customer controller.


    Do note that OData is case-sensitive, so when calling your actions, make sure you call them with the right capitalization. For instance, the URI for the GetCustomer actions is http://localhost:43910/odata/Customer. Each action has its relative URI as a comment above it for guidance.

    [Screenshot: generated controller actions with their relative URIs in comments]
    If you scaffold OrderController in a similar way, do note that the generated instructions in OrderController.cs will include adding the Order and Customer entity sets (again) to the Register method of WebApiConfig.cs. Since these were already added once, you don’t have to add them again.

    Extra configuration steps required to set up OData scaffolding in an MVC project

    If you are using an MVC project, then apart from adding the OData route to the WebApiConfig.cs file, you should also modify the Global.asax.cs file as explained in the readme.txt file.
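
    The readme spells out the exact change; roughly speaking, for a standard MVC 5 template it amounts to making sure Web API gets configured in Application_Start, along these lines (the class names below are the template defaults and may differ in your project):

    protected void Application_Start()
    {
        AreaRegistration.RegisterAllAreas();
        // Added so the Web API/OData routes registered in WebApiConfig are wired up.
        GlobalConfiguration.Configure(WebApiConfig.Register);
        FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);
        RouteConfig.RegisterRoutes(RouteTable.Routes);
        BundleConfig.RegisterBundles(BundleTable.Bundles);
    }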

    Hope you find this helpful. Thanks to all my team members for reviewing this blog.

    Related Posts

    http://www.asp.net/web-api/overview/odata-support-in-aspnet-web-api/creating-an-odata-endpoint

    Office Web Apps are using ASP.NET SignalR to power real-time co-authoring


    Office Web Apps are now using SignalR as the backend to synchronize changes when two or more people are editing files on SkyDrive or on SharePoint Online. SignalR enables real-time communication, allowing different browsers and native mobile apps to communicate with each other through the backend server. It is flexible, scalable and offers great performance, as demonstrated by its usage in the SkyDrive Office Web Apps.

    To verify its usage in Office Web Apps, let’s go to SkyDrive and create a Word document.

    [Screenshot: creating a Word document on SkyDrive]

    Log on from another computer and edit the same SkyDrive document in any browser. SignalR works cross-browser and will auto-select the appropriate real-time transport.

    [Screenshot: editing the same document in another browser]

    Press F12 in IE11 to bring up the Developer Tools, go to the Debugger tab, and search for “SignalR” in the scripts to verify its usage.

    [Screenshot: IE11 Developer Tools showing the SignalR scripts]

    Go to http://blogs.office.com/b/office365tech/archive/2013/11/06/collaboration-just-got-easier-real-time-co-authoring-now-available-in-microsoft-office-web-apps.aspx to learn more about Office Web Apps real-time co-authoring.

    Go to asp.net/SignalR to learn more about SignalR!
