
Scaffolding ADO.NET Entity Data Model Designer-based models


The ADO.NET Entity Data Model Designer (Entity Designer) is a GUI tool for editing Entity Framework models. You can use the Entity Designer to visually create and modify entities, associations, mappings, and inheritance relationships. You can also use the tool to validate the model.

When you build Entity Framework models, the Entity Designer in Visual Studio 2013 generates a DbContext-based container with the associated T4 template. These context files do not carry any metadata information for the target EDMX files. They are generated to always throw an UnintentionalCodeFirstException, as follows:

Code Snippet
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    throw new UnintentionalCodeFirstException();
}

Note:
For more information about UnintentionalCodeFirstException, see http://blog.oneunicorn.com/2012/02/26/dont-use-code-first-by-mistake/.

Instead, the metadata information that you described in Entity Designer is embedded into the target assemblies as resources when the Metadata Artifact Processing property of the EDMX file is set to Embed in Output Assembly, which is the default. If the property is set to Copy to Output Directory, it is copied to separate files. In both cases, Entity Designer adds an EntityConnection connection string to the Web.config file of the target Web project as follows:

Code Snippet
<configuration>
  <connectionStrings>
    <add name="Northwind_Entities"
         connectionString="metadata=res://*/Northwind.csdl|
                           res://*/Northwind.ssdl|
                           res://*/Northwind.msl;
                           provider=System.Data.SqlClient;
                           provider connection string=&quot;Data Source=.\sqlexpress;Initial Catalog=Northwind;Integrated Security=True;MultipleActiveResultSets=True&quot;"
         providerName="System.Data.EntityClient"/>
  </connectionStrings>
</configuration>

Entity Designer also generates a DbContext constructor that passes the connection string name, so that DbContext can use the connection string to retrieve the metadata information. For example:

Code Snippet
public class NorthwindContext : DbContext
{
    public NorthwindContext()
        : base("name=Northwind_Entities")
    {
    }
}

Note: For more information about how Entity Framework uses connection strings, see http://msdn.microsoft.com/en-US/data/jj592674.
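
As a hedged aside (this is not part of the generated code or the walkthrough), the DbContext constructor also accepts the full EntityClient connection string directly instead of a "name=..." reference. The metadata resource names and database settings below simply mirror the snippet above and would need to match your own EDMX:

Code Snippet
// Hedged sketch: passing the full entity connection string directly to DbContext
// instead of referring to a named connection string in Web.config.
public class NorthwindContext : DbContext
{
    public NorthwindContext()
        : base("metadata=res://*/Northwind.csdl|res://*/Northwind.ssdl|res://*/Northwind.msl;" +
               "provider=System.Data.SqlClient;" +
               "provider connection string=\"Data Source=.\\sqlexpress;Initial Catalog=Northwind;Integrated Security=True\"")
    {
    }
}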

To scaffold the models with the connection strings, you have to choose the generated container for the data context classes. In this walkthrough, you will scaffold the EDMX-generated models using the Model First approach. You should be able to use the same tooling for the Database First approach in a similar way.

1. Open an existing MVC 5 project in Visual Studio 2013 or create a new MVC 5 project. In this walkthrough, we will use the MVC project template with the default options.

2. Create a sample model with Entity Designer

a. In Solution Explorer, under the Models folder, add a new ADO.NET Entity Data Model item called BloggingModel.


b. In the following Entity Data Model Wizard, select Empty model, and then click Finish.


3. The Entity Designer is opened with a blank model. Now we can start adding entities, properties and associations to the model.

a. Right-click on the design surface and select Properties.

b. In the Properties window change the Entity Container Name to BloggingContext.
This is the name of the derived context that will be generated for you. The context represents a session with the database, allowing us to query and save data.

c. Right-click on the design surface and select Add New -> Entity.

d. Enter Blog as the entity name and BlogId as the key name and click OK.


e. Right-click on the new entity on the design surface and select Add New -> Scalar Property, enter Name as the name of the property.

f. Repeat this process to add a Url property.

g. Right-click on the Url property on the design surface and select Properties. In the Properties window, change the Nullable setting to True.
This allows us to save a Blog to the database without assigning it a Url.

h. Using the same steps used to create the Blog entity, add a Post entity with a PostId key property.

i. Add Title and Content scalar properties to the Post entity.

4. Now that we have a couple of entities, add an association (or relationship) between them.

a. Right-click on the design surface and select Add New -> Association.

b. Make one end of the relationship point to Blog with a multiplicity of One and the other end point to Post with a multiplicity of Many.
This means that a Blog has many Posts and a Post belongs to one Blog.

c. Ensure the Add foreign key properties to 'Post' Entity box is checked and click OK.


5. Given our model, Entity Framework can calculate a database schema that will allow us to store and retrieve data using the model. To do that, we should map the model to the physical database. For Model First Approach, we will generate a database from the model.

Note:
You do not need to perform this step in Database First Approach because your models are already mapped to the database. For more information about how to use Entity Designer for Database First approach, see http://msdn.microsoft.com/en-us/data/jj206878.aspx.

a. Right-click on the design surface and select Generate Database from Model.

b. Click New Connection, select Microsoft SQL Server for Data source, specify (localdb)\v11.0, and then enter ModelFirst.Blogging as the database name.


c. Select OK. When you are asked whether you want to create a new database, select Yes. Make sure that the Save entity connection settings in Web.Config check box is selected.

d. Select Next and the Entity Framework Designer will calculate a script to create the database schema.

e. Once the script is displayed, click Finish. The script will be added to your project and opened.

f. Right-click on the script and select Execute. When you are prompted to specify the database to connect to, specify (localdb)\v11.0.

6. Build the project. The Entity Designer generates two entity classes, Blog and Post, and the database context file as follows:


a. Open the Web.config file to verify the connection string that Entity Designer adds. The name of the generated connection string should have the same name that you set for the Entity Container Name property. In this walkthrough, it is BloggingContext.

Code Snippet
<add name="BloggingContext"
     connectionString="metadata=res://*/BloggingModel.csdl|res://*/BloggingModel.ssdl|res://*/BloggingModel.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=(localdb)\v11.0;initial catalog=ModelFirst.Blogging;integrated security=True;MultipleActiveResultSets=True;App=EntityFramework&quot;"
     providerName="System.Data.EntityClient" />

b. Open BloggingModel.Context.cs to verify the constructor generated to use the connection string.

Code Snippet
public partial class BloggingContext : DbContext
{
    public BloggingContext()
        : base("name=BloggingContext")
    {
    }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        throw new UnintentionalCodeFirstException();
    }

    public virtual DbSet<Blog> Blogs { get; set; }
    public virtual DbSet<Post> Posts { get; set; }
}

7. Build your project again to make sure everything works fine. You can now use the new ASP.NET Scaffolding as usual for the EDMX-generated models, Blog and Post. Just do not forget to select the newly generated BloggingContext for Data context class while scaffolding, as follows:


Note:
For more information on how to use the new ASP.NET Scaffolding in MVC 5, see http://www.asp.net/visual-studio/overview/2013/aspnet-scaffolding-overview.


Some notes on using Organizational Auth with ASP.NET

Announcing the Release of ASP.NET MVC 5.1, ASP.NET Web API 2.1 and ASP.NET Web Pages 3.1


The NuGet packages for ASP.NET MVC 5.1, ASP.NET Web API 2.1 and ASP.NET Web Pages 3.1 are now live on the NuGet gallery!

Download this release

You can install or update to the released NuGet packages for ASP.NET MVC 5.1, ASP.NET Web API 2.1 and ASP.NET Web Pages 3.1 using the NuGet Package Manager Console, like this:

  • Install-Package Microsoft.AspNet.Mvc -Version 5.1.0
  • Install-Package Microsoft.AspNet.WebApi -Version 5.1.0
  • Install-Package Microsoft.AspNet.WebPages -Version 3.1.0

Pre-requisites for this release

What’s in this release?

This release is packed with great new features summarized below:

ASP.NET MVC 5.1

ASP.NET Web API 2.1

ASP.NET Web Pages 3.1

You can find a complete listing of the features and fixes included in this release by referring to the corresponding release notes:

Documentation

Tutorials and other information about this release are available from the ASP.NET web site (http://www.asp.net).

Questions and feedback

You can submit questions and feedback related to this release on the ASP.NET forums (MVC, Web API, Web Pages). Please submit any issues you encounter and feature suggestions for future releases on our CodePlex site.

Thanks and enjoy!

Announcing preview of Dynamic Data provider and EntityDataSource control for Entity Framework 6


Today, we are pleased to announce an update to ASP.NET DynamicData and EntityDataSource control, so that they work with Entity Framework 6.

What’s in this preview

- DynamicData provider for Entity Framework 6

- EntityDataSource control for Entity Framework 6

How to install this preview

You can download this preview for ASP.NET DynamicData.EFProvider (http://www.nuget.org/packages/Microsoft.AspNet.DynamicData.EFProvider/) and EntityDataSource (http://www.nuget.org/packages/Microsoft.AspNet.EntityDataSource/) as preview NuGet packages from the NuGet gallery. You can install these pre-release packages through NuGet using the NuGet Package Manager Console, like this:

  • Install-Package Microsoft.AspNet.DynamicData.EFProvider -Version 6.0.0-alpha1 -Pre
  • Install-Package Microsoft.AspNet.EntityDataSource -Version 6.0.0-alpha1 -Pre

Getting started

Microsoft.AspNet.DynamicData.EFProvider

This package has a DynamicData EFProvider for Entity Framework 6. This provider can work with a model (either Code First or Model First) which was created using Entity Framework 6. This package also installs the Page Templates, Entity Templates, and Field Templates which are required for DynamicData. The templates have been updated to use the Microsoft.AspNet.EntityDataSource control, which we are also previewing today.

For more information on ASP.NET DynamicData, please see http://msdn.microsoft.com/en-us/library/cc488545.aspx.

Following are the steps for using this package in an ASP.NET DynamicData application:

    • Create a new ASP.NET Dynamic Data Entities Web Application
    • Add the Microsoft.AspNet.DynamicData.EFProvider NuGet package
    • This will do the following
      • Add a reference to the DynamicData EFProvider binary
      • Install the templates. If you are starting with a new project, then you can overwrite the templates. If you have an existing application, then you should be careful when overwriting them. The templates replace the EntityDataSource control which shipped in .NET with Microsoft.AspNet.EntityDataSource and also have a few bug fixes in the Many-Many field template so they work with Entity Framework 6.
    • Create your model using Entity Framework Code First or EF Designer.
    • Add the following code in RegisterRoutes in Global.asax.cs to register your EF Model:
Code Snippet
DefaultModel.RegisterContext(
    new Microsoft.AspNet.DynamicData.ModelProviders.EFDataModelProvider(() => new NorthwindEntities1()),
    new ContextConfiguration { ScaffoldAllTables = true });
  • Run the project
  • You should see all the tables listed on the Default page.

Microsoft.AspNet.EntityDataSource control

This is an update to the EntityDataSource control which shipped in the .NET Framework. The EntityDataSource control has been updated to work with Entity Framework 6.

To use this control, please do the following:

    • Create an ASP.NET application
    • Install the package Microsoft.AspNet.EntityDataSource
      • This package will
        • install the runtime binary for Microsoft.AspNet.EntityDataSource
        • Install the EntityFramework version 6 NuGet package
        • Add the following tag prefix in web.config
Code Snippet
<pages>
  <controls>
    <add tagPrefix="ef" assembly="Microsoft.AspNet.EntityDataSource" namespace="Microsoft.AspNet.EntityDataSource" />
  </controls>
</pages>
  • Create a new Web Form page
  • Use the control as follows and bind it to any data-bound control such as GridView or FormView (a fuller sample follows the snippet).

Code Snippet
<ef:EntityDataSource ID="GridDataSource" runat="server" EnableDelete="true" />
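
For context, here is a slightly fuller, hedged markup sketch showing the control bound to a GridView. The container name, entity set, and key column (NorthwindEntities, Products, ProductID) are hypothetical, and the attributes assume the new control mirrors the properties of the EntityDataSource control that shipped in the .NET Framework:

Code Snippet
<ef:EntityDataSource ID="GridDataSource" runat="server"
    ConnectionString="name=NorthwindEntities"
    DefaultContainerName="NorthwindEntities"
    EntitySetName="Products"
    EnableUpdate="true"
    EnableDelete="true" />

<asp:GridView ID="ProductsGrid" runat="server"
    DataSourceID="GridDataSource"
    DataKeyNames="ProductID"
    AutoGenerateColumns="true" />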

Known Issues

  • The templates in Microsoft.AspNet.DynamicData.EFProvider are for C# only.
  • The templates in Microsoft.AspNet.DynamicData.EFProvider are for Web Application projects only and will not work for Web Site projects.

Give feedback

If you find any issues with this preview, please file issues at the EntityFramework CodePlex site https://entityframework.codeplex.com

Thank you for trying out the preview and helping us make it better.

Announcing preview of Microsoft.AspNet.Identity 2.0.0-beta1


Today, we are releasing a preview of ASP.NET Identity. The main focus in this release was to add support for two-factor authentication and fix bugs. We released 2.0.0-alpha1 in December 2013, in which we added lots of features. Please read about the 2.0.0-alpha1 release here.

Download this release

You can download ASP.NET Identity from the NuGet gallery. You can install or update to these pre-release packages through NuGet using the NuGet Package Manager Console, like this:

  • Install-Package Microsoft.AspNet.Identity.EntityFramework -Version 2.0.0-beta1 -Pre
  • Install-Package Microsoft.AspNet.Identity.Core -Version 2.0.0-beta1 -Pre
  • Install-Package Microsoft.AspNet.Identity.Owin -Version 2.0.0-beta1 -Pre
  • Install-Package Microsoft.AspNet.Identity.Samples -Version 2.0.0-beta1 -Pre

Please remember to select the “Include Prerelease” option when searching for packages using the NuGet Package Manager or the Package Manager Console. For more information on how to install pre-release packages please read http://docs.nuget.org/docs/Reference/Versioning#Prerelease_Versions and http://docs.nuget.org/docs/release-notes/nuget-1.7#Show_prerelease_packages_in_the_Manage_NuGet_packages_dialog

What’s in this release?

Following is the list of features and major issues that were fixed in 2.0.0-Beta1.

Two-Factor Authentication

ASP.NET Identity now supports two-factor authentication. Two-factor authentication provides an extra layer of security for your user accounts in case your password gets compromised. Most websites protect their data by having a user create an account with a username and password. Passwords are not very secure, and sometimes users choose weak passwords, which can lead to user accounts being compromised.

To add an extra layer of security, it is important to add a second factor of authentication after a user enters a username and password. Since a password is something a user knows, two-factor authentication allows you to authenticate the user with something only the user possesses, such as a phone or email account. Two-factor authentication involves sending the user a code through something only the user has access to, such as an SMS to the user's phone or an email. The user enters the code when they receive it on their phone or in their email.

In ASP.NET Identity, the SMS and email two-factor providers are built in, so you can easily configure them to send a text message or email. You can also extend and write your own providers, such as QR code generators, and use authenticator apps on phones to validate them.
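
As an illustration, the following hedged sketch shows how the built-in providers might be wired up when the UserManager is created. It assumes the 2.0 API shape (RegisterTwoFactorProvider, PhoneNumberTokenProvider, EmailTokenProvider), and SmsService and EmailService are hypothetical IIdentityMessageService implementations that you would supply yourself:

Code Snippet
// Hedged sketch: registering the built-in SMS and email two-factor providers.
// SmsService and EmailService are hypothetical IIdentityMessageService implementations.
manager.RegisterTwoFactorProvider("PhoneCode",
    new PhoneNumberTokenProvider<ApplicationUser>
    {
        MessageFormat = "Your security code is: {0}"
    });
manager.RegisterTwoFactorProvider("EmailCode",
    new EmailTokenProvider<ApplicationUser>
    {
        Subject = "Security Code",
        BodyFormat = "Your security code is: {0}"
    });
manager.SmsService = new SmsService();
manager.EmailService = new EmailService();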

A user can also manage the two-factor authentication options and can enable or disable two-factor authentication for their account. This is demonstrated in the ASP.NET Identity Samples NuGet package. A user can also choose to remember the two-factor verification on a personal device, so that they are not asked to verify each time they log in from that device. This is a common pattern on most websites today.

To try out this feature, you can install ASP.NET Identity Samples NuGet package (in an Empty ASP.NET app) and follow the steps to configure and run the project.

Indexing on Username

In the ASP.NET Identity Entity Framework implementation, we have added a unique index on the username using the new IndexAttribute in EF 6.1.0-Beta1. We did this to ensure that usernames are always unique and that there is no race condition in which you could end up with duplicate usernames.
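
For reference, here is a minimal hedged sketch of how a unique index can be declared with the EF 6.1 IndexAttribute on a Code First entity. This is not the actual Identity source; the ApplicationUser shape shown here is illustrative:

Code Snippet
// Hedged sketch: declaring a unique index with the EF 6.1 IndexAttribute
// (System.ComponentModel.DataAnnotations.Schema, in EntityFramework.dll).
// Indexed string columns need a bounded length, hence MaxLength.
public class ApplicationUser
{
    public string Id { get; set; }

    [Index(IsUnique = true)]
    [MaxLength(256)]
    public string UserName { get; set; }
}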

Enhanced Password Validator

The password validator that shipped in ASP.NET Identity 1.0 was fairly basic and only validated the minimum length. There is a new password validator which gives you more control over the complexity of the password. Please note that even if you turn on all the settings in this password validator, we still encourage you to enable two-factor authentication for user accounts.

Code Snippet
manager.PasswordValidator = new PasswordValidator
{
    RequiredLength = 6,
    RequireNonLetterOrDigit = false,
    RequireDigit = false,
    RequireLowercase = false,
    RequireUppercase = false,
};

 

ASP.NET Identity Samples NuGet package

We are releasing a Samples NuGet package to make it easier to install and run samples for ASP.NET Identity and to follow best practices. This is a sample ASP.NET MVC application. Please modify the code to suit your application before you deploy it to production. The sample should be installed in an empty ASP.NET application.

Following are the features in this samples package:

  • Initialize ASP.NET Identity to create an Admin user and Admin role.
    • Since ASP.NET Identity is Entity Framework based in this sample, you can use the existing methods of initializing the database as you would have done in EF.
  • Configure user and password validation.
  • Register a user and login using username and password
  • Login using a social account such as Facebook, Twitter, Google, Microsoft account etc.
  • Basic User management
    • Do Create, Update, List and Delete Users. Assign a Role to a new user.
  • Basic Role management
    • Do Create, Update, List and Delete Roles.
  • Account Confirmation
  • Password Reset
  • Two-Factor authentication
  • Security Token providers
  • Configure the Db context and UserManager to use a single instance per request.
  • The AccountController has been split into Account and Manage controller. This was done to simplify the account management code.

Following is the list of features and major issues that were fixed in 2.0.0-Alpha1. For the full list of features, please read the blog post for that release.

  • Account Confirmation
  • Password Reset
  • Security Token Provider
  • Make the type of Primary Key be extensible for Users and Roles
  • Support IQueryable on Users and Roles
  • Support Delete operation through the UserManager
  • UserManagerFactory Middleware
  • DbContextFactory Middleware

 

Entity Framework 6.1.0-Beta1

ASP.NET Identity 2.0.0-beta1 depends upon Entity Framework 6.1.0-beta1, which was also released today. For more details, please read the announcement post.

List of bugs fixed

You can look at all the bugs that were fixed in this release by clicking this link.

Samples

  • We have a sample project which shows these new features at https://aspnet.codeplex.com. Please look for the Identity folder in the source. https://aspnet.codeplex.com/SourceControl/latest
  • For documentation on ASP.NET Identity please visit http://www.asp.net/identity. We are working on adding more documentation on this site.

Known Issues / Change List

Migrating from ASP.NET Identity 2.0.0-alpha1 to 2.0.0-beta1

Following are the changes you will have to make to your application if you are upgrading from 2.0.0-alpha1 to 2.0.0-Beta1 of Identity.

  • The GetUserManager() extension method moved to the Microsoft.AspNet.Identity.Owin namespace.

  • The GetConfirmationToken property on UserManager changed to GetEmailConfirmationToken.

  • The ConfirmUser() method changed to ConfirmEmail().

  • The IsConfirmed() method changed to IsEmailConfirmed().

  • UserManagerFactory() changed to CreatePerOwinContext<T>().

  • The static Create method on ApplicationUserManager now has an additional overload that takes an IOwinContext parameter.

  • The PasswordResetTokens and UserConfirmationTokens properties on UserManager are replaced with a single UserTokenProvider property on UserManager.

  • The DataProtectorTokenProvider class is now strongly typed with generics: DataProtectorTokenProvider<TUser>.

  • The IsConfirmed column in the AspNetUsers table has been renamed to EmailConfirmed.

    • If you are using Entity Framework Code First Migrations to migrate the database from 2.0.0-alpha1 to 2.0.0-Beta1, then you need to edit the migration script that was generated by EF Migrations.

      Following is what is generated:

            AddColumn("dbo.AspNetUsers", "EmailConfirmed", c => c.Boolean(nullable: false));
            DropColumn("dbo.AspNetUsers", "IsConfirmed");

      You need to change it to:

            RenameColumn("dbo.AspNetUsers", "IsConfirmed", "EmailConfirmed");

Known issues in Entity Framework while migrating from Entity Framework 6.1.0-alpha1 to 6.1.0-beta1

Entity Framework changed the way indexes are recognized, which causes index creation calls to be generated again for indexes that already exist. This is a known behavior and occurs only when migrating from EF 6.1.0-alpha1 to 6.1.0-beta1. For ASP.NET Identity, the generated migration script will have the following incorrect entries:

            CreateIndex("dbo.AspNetUserRoles", "UserId");

            CreateIndex("dbo.AspNetUserRoles", "RoleId");

            CreateIndex("dbo.AspNetUsers", "VehicleId");

            CreateIndex("dbo.AspNetUserClaims", "UserId");

            CreateIndex("dbo.AspNetUserLogins", "UserId");

Any custom index declarations on user-defined classes will also be present here. Running the script as is will cause it to fail, since the indexes are already present. The solution is to delete these 'CreateIndex' calls for existing indexes.

 

Migrating from ASP.NET Identity 1.0 to 2.0.0-beta1

If you are migrating from ASP.NET Identity 1.0 to 2.0.0-Beta1, then please refer to this article on how you can use Entity Framework Code First Migrations to migrate your database: http://blogs.msdn.com/b/webdev/archive/2013/12/20/updating-asp-net-applications-from-asp-net-identity-1-0-to-2-0-0-alpha1.aspx

That article is based on migrating to ASP.NET Identity 2.0.0-alpha1, but the same steps apply to ASP.NET Identity 2.0.0-beta1.

Give feedback and get support

  • If you find any bugs please open them at our Codeplex Site where we track all our bugs https://aspnetidentity.codeplex.com/
  • If you want to discuss these features, please discuss them on Stack Overflow and use the following tag “asp.net-identity”

Thank you for trying out the preview and for your feedback on ASP.NET Identity.

Per request lifetime management for UserManager class in ASP.NET Identity


Introduction

We recently released the 2.0.0-beta1 version of ASP.NET Identity; you can learn more about it by visiting this link. This release is an update to 2.0.0-alpha1 and adds the two-factor authentication feature along with a few bug fixes. To learn more about the alpha release, please visit this link.

In this article I am going to explain the significance of the UserManager class in an application and some of the best practices when using it. I am going to use an MVC application that is created using VS 2013 RTM. In the Visual Studio 2013 RTM templates, which had the 1.0.0 version of ASP.NET Identity, we demonstrated directly instantiating the UserManager class as needed in the application. This approach had a few issues, which are explained further in this article. These have been fixed in 2.0.0-beta1.

Understanding Managers and Stores

ASP.NET Identity consists of classes called managers and stores. Managers are high-level classes which an application developer uses to perform operations in the ASP.NET Identity system, such as creating a user. Stores are lower-level classes that specify how entities, such as users and roles, are persisted. Stores are closely coupled with the persistence mechanism, but managers are decoupled from stores which means you can replace the persistence mechanism without disrupting the entire application.

The current article outlines the steps to configure the UserManager in the application. We will start with an application with Identity 1.0.0. We will migrate the solution to 2.0.0-beta1, and change the way UserManager is instantiated and used in the application. Additionally, with this approach it is possible to configure the properties on the UserManager, such as password length and complexity.

Create Application

In Visual Studio 2013 RTM, create a new web application and choose MVC.


UserManager explained

Let us briefly look at how the UserManager class is used in the application. All user account management actions are defined in the AccountController class.

Code Snippet
public AccountController()
    : this(new UserManager<ApplicationUser>(new UserStore<ApplicationUser>(new ApplicationDbContext())))
{
}

public AccountController(UserManager<ApplicationUser> userManager)
{
    UserManager = userManager;
}

public UserManager<ApplicationUser> UserManager { get; private set; }

The controller has a UserManager property of type UserManager which is set in the constructor. The UserManager class takes in an instance of the UserStore class, which implements persistence-specific operations on the user. In our case, we have a UserStore class which implements these operations specifically for Entity Framework. To persist data to and from the database, the UserStore class takes in an instance of the DbContext class. The UserStore class is defined in the Microsoft.AspNet.Identity.EntityFramework assembly.

Note: The rationale behind implementing this pattern is that if the developer wishes to store user information in any other storage system (for example, Azure Table storage), all they have to do is replace the ‘UserStore’ class with an Azure Table storage implementation. There would be no additional changes needed in the AccountController, and the existing application would function seamlessly.

The problem

In the current approach, if there are two instances of the UserManager in the request that work on the same user, they end up working with two different instances of the user object. An example of this would be using the instance held in the controller property while also instantiating one locally in the method under execution. In this scenario, the changes made through one instance are not reflected in the other, so persisting these changes back to the database can lead to incorrect changes being made to the user object.

The same problem exists when you use the DbContext class in the application. The hypothetical sketch below illustrates the issue.
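
The following hedged sketch (a hypothetical controller action, not code from the template) shows two UserManager instances each tracking their own copy of the same user, so an update made through one can be silently undone when the other saves:

Code Snippet
// Hypothetical action showing two UserManager instances in the same request:
// the controller's UserManager property and a locally created instance.
public async Task<ActionResult> RenameUser(string userId)
{
    var localManager = new UserManager<ApplicationUser>(
        new UserStore<ApplicationUser>(new ApplicationDbContext()));

    var userA = await UserManager.FindByIdAsync(userId);   // copy tracked by instance 1
    var userB = await localManager.FindByIdAsync(userId);  // separate copy, instance 2

    userA.UserName = "newName";
    await UserManager.UpdateAsync(userA);

    // userB still holds the old UserName; saving it overwrites the change above.
    await localManager.UpdateAsync(userB);
    return RedirectToAction("Index");
}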

The solution

The solution to the above problem is to store a single instance of UserManager and DbContext per request and reuse them throughout the application. Since Identity hooks into the OWIN pipeline via cookie middleware, we can store the UserManager and DbContext in the OWIN context object and retrieve them as needed.

1. In the application created above, update the Identity packages to 2.0.0-beta1 from the NuGet feed. This can be done through the Manage NuGet Packages window. The steps to update NuGet packages are explained here.

2. Instead of directly working with the UserManager<T> class, we can define a custom class, ApplicationUserManager, that extends UserManager<T>. In the project, under the App_Start folder, create a new file IdentityConfig.cs and add a new class ApplicationUserManager. The ApplicationUserManager class should extend the UserManager class:

Code Snippet
public class ApplicationUserManager : UserManager<ApplicationUser>
{
}

3. The web application uses the new OWIN cookie middleware for the cookie-based authentication. During application start, the Configuration method in the Startup class is invoked, which configures all the middleware components registered in the application. In the MVC 5 template, the cookie middleware is configured through the ConfigAuth method defined in the Startup.Auth class.

Since we need to register the UserManager and DbContext classes with the OWIN context during app start, we will add methods to do that in the ConfigureAuth method. The ‘CreatePerOwinContext<T>’ method is defined in the Microsoft.AspNet.Identity.Owin namespace. This method registers a static callback which returns an instance of type <T>. The callback is invoked once per request and is used to obtain an instance that is used during the lifetime of the request.

4. To create a static callback method that returns an instance of DbContext, create a method in the ApplicationDbContext class as defined below:

Code Snippet
public static ApplicationDbContext Create()
{
    return new ApplicationDbContext();
}

5. Similarly, define a method in the ApplicationUserManager class that returns an instance of the ApplicationUserManager:

Code Snippet
public static ApplicationUserManager Create(IdentityFactoryOptions<ApplicationUserManager> options, IOwinContext context)
{
    var manager = new ApplicationUserManager(new UserStore<ApplicationUser>(context.Get<ApplicationDbContext>()));
    return manager;
}

In the constructor of the UserManager, we need to retrieve the instance of DbContext to configure the UserStore. We can get the instance from the OWIN context using the ‘Get<ApplicationDbContext>’ method, which in turn returns the single instance of the DbContext class created by the ApplicationDbContext.Create callback method.

6. Register these two callback methods in the ConfigureAuth method through the ‘CreatePerOwinContext’ method:

Code Snippet
public void ConfigureAuth(IAppBuilder app)
{
    app.CreatePerOwinContext<ApplicationDbContext>(ApplicationDbContext.Create);
    app.CreatePerOwinContext<ApplicationUserManager>(ApplicationUserManager.Create);
    // ... remaining middleware configuration ...
}

7. Next, we hook this up in the AccountController class. Change the constructor and the UserManager property on the class to the type ApplicationUserManager:

Code Snippet
public AccountController()
{
}

public AccountController(ApplicationUserManager userManager)
{
    UserManager = userManager;
}

private ApplicationUserManager _userManager;

public ApplicationUserManager UserManager
{
    get;
    private set;
}

8. In the getter of the UserManager property, we retrieve the UserManager instance from the OWIN context. For this, we use an extension method provided in the Microsoft.AspNet.Identity.Owin namespace:

Code Snippet
public ApplicationUserManager UserManager
{
    get
    {
        return _userManager ?? HttpContext.GetOwinContext().GetUserManager<ApplicationUserManager>();
    }
    private set
    {
        _userManager = value;
    }
}

9. Run the application and verify that a local user can be created in the application.

10. Also, if we need to work with the DbContext object directly in the application, we can get the instance of the class from the OWIN context, as mentioned earlier, using the ‘Get<T>’ method:

Code Snippet
var dbContext = context.Get<ApplicationDbContext>();

Configuring UserManager properties

Another advantage of following this approach is that we can configure the UserManager properties in a single place when instantiating it. For example, the UserManager by default has a password validator that validates that the supplied password is at least 6 characters long. We can change this to use the new password validator in 2.0.0-beta1, which checks for additional complexity in the supplied password during registration.

To do this, simply set the PasswordValidator property in the ‘Create’ method of the ApplicationUserManager with the new ‘PasswordValidator’ class:

Code Snippet
public ApplicationUserManager(UserStore<ApplicationUser> userStore)
    : base(userStore)
{
}

public static ApplicationUserManager Create(IdentityFactoryOptions<ApplicationUserManager> options, IOwinContext context)
{
    var manager = new ApplicationUserManager(new UserStore<ApplicationUser>(context.Get<ApplicationDbContext>()));
    manager.PasswordValidator = new PasswordValidator
    {
        RequiredLength = 10,
        RequireNonLetterOrDigit = true,
        RequireDigit = true,
        RequireLowercase = false,
        RequireUppercase = false,
    };
    return manager;
}

Here the PasswordValidator class is configured to verify that the supplied password contains a non-alphanumeric character and a numeric character and is at least 10 characters long. Similarly, other properties of the UserManager can be configured in this single place and will apply for the lifetime of the request.

Summary

This post shows how to get a per-request, single instance of the UserManager and DbContext classes from the OWIN context to be used throughout the application. This will form a base for additional blog posts outlining the new features in ASP.NET Identity 2.0.0-beta1.

I hope you have found this walkthrough useful. If you have any questions about ASP.NET Identity or find issues, please feel free to open bugs on the Identity CodePlex site, https://aspnetidentity.codeplex.com/, leave comments on this blog, or ask questions on StackOverflow (tag: aspnet-identity). I can also be reached on twitter (@suhasbjoshi).


Introducing ASP.NET Project “Helios”


In late 2013 we made available a prerelease NuGet package which allows running a managed web application directly on top of IIS without going through the normal ASP.NET (System.Web) request processing pipeline. This was a relatively quiet event without too much fanfare. At last month’s MVA Windows Azure Deep Dive, we spoke about this for the first time publicly to a global audience.

Today, I’d like to give a formal introduction to ASP.NET Project “Helios”. This post will talk about why we’re introducing this project, what we hope to accomplish with it, and how this might fit in to our ecosystem moving forward.

I assume that the reader has a basic understanding of OWIN and ASP.NET Project Katana. If you are not familiar with these, a brief overview can be found at http://www.asp.net/aspnet/overview/owin-and-katana/an-overview-of-project-katana.

The world today

Prescriptive behaviors

ASP.NET is an “everything and the kitchen sink” programming model. It’s very prescriptive, and developers are pushed to follow its recommended coding patterns. This makes it great for quickly designing and deploying LOB web applications, which was ASP.NET’s original intended audience. However, over the years frameworks like MVC, WebAPI, and SignalR have come online, and developers have begun writing more complex applications following newer standards. These applications don’t fit nicely into the LOB model which made ASP.NET famous. And while ASP.NET certainly allows writing this style of modern application, the simple truth is that the framework’s prescriptive patterns don’t always lend themselves to this style of application. Sometimes these behaviors are downright intrusive. Developers can end up spending a significant amount of time cajoling the framework into meeting their needs.

Below are some sample behaviors which made sense when they were first implemented, but nowadays they are a large source of complaints from our developer base:

  • You cannot mix Windows Authentication (NTLM, Kerberos) and Forms Authentication in the same application.
  • The built-in Forms Authentication module silently converts HTTP 401 Unauthorized responses into HTTP 302 Found (redirect) responses.
  • Request validation shuts down requests which contain ‘<’, ‘&’, or other “dangerous” characters, even if your application handles these requests correctly.
  • Using Task.Wait() can lead to deadlocks in ASP.NET due to the special SynchronizationContext we use. Channel 9 has a video where Brad, Damian, and I speak about this in more detail; a minimal sketch of the deadlock pattern follows this list.
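
To make that last point concrete, here is a minimal, hypothetical sketch of the deadlock pattern (the controller and method names are made up):

using System.Threading.Tasks;
using System.Web.Mvc;

// The request thread blocks in .Result (which blocks just like Task.Wait()) while
// the awaited continuation is queued back to the same AspNetSynchronizationContext,
// which that blocked thread owns, so neither side can make progress.
// Using async/await end-to-end avoids this.
public class ReportController : Controller {
    private static async Task<string> LoadReportAsync() {
        await Task.Delay(1000); // continuation wants to resume on the request context
        return "done";
    }

    public ActionResult Index() {
        var report = LoadReportAsync().Result; // blocks: deadlocks under ASP.NET
        return Content(report);
    }
}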

Why Helios?

When we look at our ecosystem, we’re pleased by the success of MVC, WebAPI, SignalR, and our other recent high-level frameworks. These are valuable tools, they have a low barrier to entry for most developers, and they’re deployed completely out-of-band. This allows us to innovate quickly. MVC and WebAPI have published new major releases annually; SignalR has approximately quarterly releases. It allows our customers to deploy immediately, even to shared hosters.

Yet because System.Web is part of the .NET Framework proper, the ASP.NET runtime itself cannot iterate as quickly as we would like it to. We are bound by the release schedules of the .NET Framework as a whole. If a developer asks us to add a feature to ASP.NET, he must wait for the entire framework to rev. And then he must wait for his hoster or IT administrator to update the .NET Framework version on the web server. And if there’s a bug he must again wait for us to provide a fix.

Our core runtime iterates on the scale of years. The state of web technologies is much more agile – much more nimble. A web technology can live its entire lifetime – conception to sunset – in the time that elapses between major releases of the .NET Framework. Our developer audience deserves a base on which they can build a new breed of modern web applications.

And it’s not just wanting more agile development. Recall the list of ASP.NET pain points from earlier: unwanted redirects, too-helpful security handholding resulting in requests being denied, and so on. We’ll never be able to make more than minor tweaks to these behaviors, as we can’t risk breaking customers who have deployed sites and are depending on the existing behaviors.

Finally, we’ll never be able to make the ASP.NET core runtime a “pay-for-play” model. We have experimented several times with moving Web Forms out of System.Web.dll and into its own out-of-band package. This would finally allow us to fix bugs that have been plaguing us for years. But Web Forms defined ASP.NET for years. The ASP.NET core pipeline and Web Forms processing are inextricably linked.

On IIS and self-host

A developer who cares deeply about these matters may opt to self-host his application. This is perfectly valid, and the Katana stack makes it easy to write host-agnostic applications. But consider all the features that IIS brings to the table.

IIS handles application lifetime management, inspecting request patterns and frequencies to determine how long applications should run. It can suspend (rather than terminate) processes that are idle to help balance available system resources. For improved responsiveness, IIS offers a built-in user-mode cache and can automatically compress response content if appropriate. Application health monitoring is available via built-in logging and failed request tracing modules. IIS supports request filtering and transient worker process identities (least privilege) as part of its 10+ years of security hardening. And the inetmgr utility makes all of this (and more!) available to server administrators.

In a self-hosted scenario, you’re ultimately responsible for the process or Windows service, so you could absolutely achieve feature parity. But this is quite the undertaking to reimplement, and such an endeavor would be error-prone. Besides, if IIS already exists, why not just leverage it for our new host so that we get all of these automatically?

And there’s the crux of Project “Helios”. We want to combine the best of both worlds: pair the granular control offered by self-hosted scenarios with the benefits offered by being hosted inside IIS. And we want to do this while jettisoning the behaviors that developers have told us they dislike about ASP.NET.

Goals and non-goals

As with all things, we need to define our goals before we can determine whether we have been successful in this endeavor. It is not our intent to make a new framework that is everything to all developers. In particular:

  • It is not our goal to have screaming high throughput for “Hello World” scenarios. While Helios does in fact perform significantly better than the full ASP.NET pipeline for such scenarios, these metrics aren’t terribly useful for real-world applications.
  • It is not our goal to provide 100% compatibility with existing applications. In particular, Helios projects do not support .aspx or .ashx endpoints or other ASP.NET-isms.
  • It is not our goal to compete with self-host for developer mindshare. Each OWIN host has its own benefits and drawbacks, and developers should choose the host that meets their needs. We’ll discuss choosing a host later in this post.

On the flip side:

  • It is our goal to enable higher density on web servers. For a machine running a single application, this might be measured by allowing a greater number of concurrent requests on the machine. For a shared hoster, this might be measured by allowing more active sites on a single machine.
  • It is our goal to provide behavior that mimics self-host more than it mimics web-host. We’re trying to eliminate as much magic as possible from the new host.
  • It is our goal to make the Helios framework fully out-of-band. The framework should be able to run without requiring installation as long as the target machine meets the minimum system requirements called out below. Developers should be able to acquire bug fixes / feature additions by acquiring updated packages through NuGet and bin-deploying to their servers / hosters.
  • It is our goal to reduce the friction of deploying a web application built on the Helios host. It should be just as easy to deploy a Helios-hosted application as it is any typical ASP.NET application.

Getting started

Minimum system requirements

You’ll need the following on your development box in order to write a Helios-based application:

  • Windows 8 or Windows Server 2012
  • .NET Framework 4.5.1 (download)
  • Visual Studio 2012 (Visual Studio 2013 preferred but not required)

And your web server / web hoster needs the following in order to run a Helios-based application:

  • Windows Server 2012
  • .NET Framework 4.5.1
  • Full trust

Some hosters (such as Windows Azure Web Sites) already meet these minimum requirements, and developers can begin creating and deploying Helios-based applications to these hosters immediately. Contact your hoster if you’re unsure about their capabilities or system configuration.

We’re aware that requiring Windows 8 or Windows Server 2012 can be a bit of a pain. We’re working on relaxing this requirement in a future prerelease.

Finally, since this application will be hosted in IIS, you’ll need to make sure the application pool is configured as below:

  • Managed runtime version: v4.0
  • Pipeline mode: Integrated
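
If you script your server setup, one hedged way to apply these settings from an elevated command prompt is with appcmd; the application pool name below is hypothetical:

%windir%\system32\inetsrv\appcmd.exe set apppool "MyHeliosAppPool" /managedRuntimeVersion:v4.0 /managedPipelineMode:Integrated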

Creating a new OWIN application on the Helios host

I’m using Visual Studio 2013 for this walkthrough, but the same general steps should work on Visual Studio 2012.

Important: Make sure NuGet Package Manager is fully up-to-date before you begin! The latest NuGet Package Manager for Visual Studio 2013 can be found at http://visualstudiogallery.msdn.microsoft.com/4ec1526c-4a8c-4a84-b702-b21a8f5293ca.

  1. In Visual Studio, select File -> New -> Project.
  2. Select Templates -> Visual C# -> Web, then ASP.NET Web Application. Ensure the target framework is set to .NET Framework 4.5 or later. Then hit OK.

  3. Select the Empty template, then hit OK.

  4. Install the Microsoft.Owin.Host.IIS NuGet package. If you use the NuGet Package Manager UI (in the Solution Explorer pane, right-click the project name, then select Manage NuGet Packages), remember to change the release type dropdown to Include Prerelease.

    You can also find installation instructions at https://www.nuget.org/packages/Microsoft.Owin.Host.IIS/. At the time of this writing, the latest prerelease version is 0.1.5-pre.

  5. Add an OWIN startup class. Add a C# file named Startup.cs to your project with these contents:
    using System;
    using Owin;

    public class Startup {
        public void Configuration(IAppBuilder app) {
            // New code:
            app.Run(async context => {
                context.Response.ContentType = "text/plain";
                await context.Response.WriteAsync("Hello, world.");
            });
        }
    }
  6. Run the project via CTRL-F5. You should see Hello, world. written to the response.

    Note: If you instead see an HTTP 403 Forbidden error page, verify that you have .NET 4.5.1 installed on your machine. The download is located here.

That’s it! You’ve now created an OWIN application running atop the Helios host for IIS. You can easily verify that System.Web.dll isn’t involved anywhere in request processing by changing the code to print out all assemblies loaded into the current AppDomain along with a full stack trace.

public class Startup {
    public void Configuration(IAppBuilder app) {
        // New code:
        app.Run(async context => {
            context.Response.ContentType = "text/plain";

            await context.Response.WriteAsync("Assemblies in AppDomain:" + Environment.NewLine);
            foreach (var asm in AppDomain.CurrentDomain.GetAssemblies()) {
                await context.Response.WriteAsync(asm.FullName + Environment.NewLine);
            }
            await context.Response.WriteAsync(Environment.NewLine);

            await context.Response.WriteAsync("Stack trace:" + Environment.NewLine);
            await context.Response.WriteAsync(Environment.StackTrace);
        });
    }
}

The output for 0.1.5-pre will read:

Assemblies in AppDomain:
mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
Microsoft.AspNet.Loader.IIS, Version=0.1.5.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
System.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
System.Configuration, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a
System.Xml, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
Microsoft.Owin, Version=2.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
Microsoft.Owin.Host.IIS, Version=0.1.5.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
Microsoft.Owin.Host.IIS.Security, Version=0.1.5.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
Microsoft.Owin.Hosting, Version=2.1.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
Owin, Version=1.0.0.0, Culture=neutral, PublicKeyToken=f0ebd12fd5e55cc5
WebApplication32, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null
Microsoft.CSharp, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a
System.Dynamic, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a
Anonymously Hosted DynamicMethods Assembly, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null
System.Security, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a

Stack trace:
at System.Environment.GetStackTrace(Exception e, Boolean needFileInfo)
at System.Environment.get_StackTrace()
at Startup.<<Configuration>b__0>d__2.MoveNext() in c:\Users\levib\Documents\Visual Studio 2013\Projects\WebApplication32\WebApplication32\Startup.cs:line 17
at System.Runtime.CompilerServices.AsyncMethodBuilderCore.Start[TStateMachine](TStateMachine& stateMachine)
at System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[TStateMachine](TStateMachine& stateMachine)
at Startup.<Configuration>b__0(IOwinContext context)
at Microsoft.Owin.Extensions.UseHandlerMiddleware.Invoke(IOwinContext context)
at Microsoft.Owin.Infrastructure.OwinMiddlewareTransition.Invoke(IDictionary`2 environment)
at Microsoft.Owin.Host.IIS.HeliosCallContext.Invoke(Func`2 app)
... boring intrinsic stuff removed ...
at Microsoft.AspNet.Loader.IIS.Infrastructure.PipelineExecutor.MapRequestHandler(MapRequestHandlerData* pMapRequestHandlerData)

You can see that System.Web.dll and related assemblies (like System.Web.Extensions.dll) aren’t in the stack trace at all. In fact, they’re not even in the current AppDomain! They exert no influence over request processing, so you’re not paying their per-request memory overhead and aren’t battling their unwanted behaviors.

Note: If you attach a debugger and look at the loaded modules list (Debug -> Windows -> Modules), you’ll see that System.Web.dll is still loaded in the process. The reason for this is that it’s still ultimately involved with process management: application startup and shutdown, IIS idle process page-out, and a handful of other features. But even though System.Web.dll is present in the process, this only incurs a small fixed per-process overhead. There is no per-request overhead. (In fact, no System.Web.dll code paths are involved whatsoever in request processing.)

Adding Helios to an existing OWIN-based application

Have an existing OWIN-based application? No problem! Consider the SignalR-StockTicker sample on GitHub. This sample uses the System.Web host for OWIN.

  1. Open the SignalR-StockTicker project in Visual Studio 2013.
  2. Install the Microsoft.Owin.Host.IIS NuGet package to the project.
  3. [optional] Remove the Microsoft.Owin.Host.SystemWeb NuGet package from the project. This step is not strictly required, as the Helios framework will suppress loading this particular package at runtime anyway.
  4. In the Solution Explorer pane, open the SignalR.StockTicker folder, right-click StockTicker.html, and hit View in Browser.
  5. Hit the Open Market button on the page.


That’s it! The only thing that’s needed to convert a web-hosted OWIN application into a Helios-hosted OWIN application is to pull the necessary NuGet package. As long as the application truly is OWIN-based (doesn’t rely on HttpRuntime, HttpContext, etc.), no other changes should be necessary.

There’s one more nifty thing I want to point out: notice that SignalR is continuing to utilize WebSockets under the covers, even after migrating to the Helios host.

Using the Helios runtime without OWIN

Finally, the option exists to run an application directly on top of Helios rather than go through the Helios host for OWIN. This provides the most granular control over application startup, shutdown, and request processing. More on this can be found in the supplemental post.

Caveats

Remember our stated non-goals: we do not intend 100% compatibility with existing applications. In particular, since Helios completely bypasses the typical ASP.NET pipeline, anything that has a hard dependency on the ASP.NET pipeline will not work correctly in a Helios application. Requests to Web Forms (.aspx) or Razor / Web Pages (.cshtml) endpoints will not be recognized. ASP.NET MVC endpoints will be ignored. Accessing HttpRuntime, HostingEnvironment, or other ASP.NET-hosting-intrinsic types could result in undefined behavior.

ASP.NET Web API endpoints will be reachable as long as Web API has been configured via the OWIN extensibility layer rather than as a typical ASP.NET IHttpModule. To use Web API atop OWIN:

  1. In your project, reference the Microsoft.AspNet.WebApi.Owin NuGet package.
  2. Use the IAppBuilder.UseWebApi extension method to register Web API as an OWIN middleware. A sample Startup.cs that defines a default Web API route is provided below:
    using System;
    using System.Web.Http;
    using Owin;

    public class Startup {
        // This code configures Web API. The Startup class is specified as a type
        // parameter in the WebApp.Start method.
        public void Configuration(IAppBuilder appBuilder) {
            // Configure Web API for self-host.
            HttpConfiguration config = new HttpConfiguration();
            config.Routes.MapHttpRoute(
                name: "DefaultApi",
                routeTemplate: "api/{controller}/{id}",
                defaults: new { id = RouteParameter.Optional }
            );

            appBuilder.UseWebApi(config);
        }
    }

Further thoughts

Performance and resource consumption

The Helios architecture follows the modern trend of “pay-for-play” design. As much as possible, the framework tries to avoid magic behavior, opting instead to give the developer nearly complete control over the application. This is the same model followed by the Katana project. If your application doesn’t use a particular capability or feature, the unused functionality shouldn’t affect the application’s behavior or its performance characteristics.

The Helios runtime achieves around 2x – 3x the throughput for “Hello world” scenarios when compared to System.Web, but this honestly isn’t a terribly interesting measurement. These requests are best handled by caches rather than by the web server itself. Developer code tends to dominate in real-world applications, so throughput remains steady regardless of the underlying framework. What is interesting is considering other resources like available memory, where the framework does in fact tend to show up in profiles. In our tests the Helios runtime consumes around 96% less amortized per-request memory than does the System.Web runtime. See the supplemental post for more information on these measurements and their real-world impact.

Support

This is a very early prerelease. One might even call it pre-alpha. There is no official support provided for this. We do not guarantee that we’ll ever ship a release version. The full EULA can be found here.

The EULA does allow for deployment to a hosting provider. However, deploying such an early prerelease to a service that you care about is ill-advised.

Conclusion

We’re excited about what this could mean for the future of our platform, especially as more frameworks and components break their strict dependency on System.Web.dll. This new design promises to allow us to ship new functionality fully out-of-band and to avoid surprising developers with unwanted behaviors.

I also want to stress that this is strictly an option. The target audience for this package is a minority of our overall developer audience. The team has no plans to force our general developer audience on to this system.

Finally, there is a supplemental post available with further information available for more advanced developers.  That post discusses performance and resource utilization in more detail. It also discusses using the Helios APIs directly without going through OWIN.

As always, feedback is greatly appreciated! Feel free to sound off in the comments below. And thanks for reading!

Thanks to Eilon, Glenn, Damian, and Scott, who all graciously offered their time to help review this post.

Supplemental to ASP.NET Project “Helios”


This is a supplemental document to my earlier Introducing ASP.NET Project “Helios” post.  It contains extra information that might be of interest to the advanced developer but which didn’t make it into the main post.  I encourage reading the original post before continuing.

On performance and resource consumption

When most web developers discuss performance, they’re thinking in terms of requests per second (RPS) throughput. As a rule of thumb, the lower you go in the stack, the more raw throughput you’re able to achieve. An application that opens a raw socket to listen for and process web requests will always outperform a higher-level framework like ASP.NET when it comes to “Hello World” scenarios. This is invariant. But let’s be honest: no real web application is a simple “Hello World” application. The goal of serving static content to visitors as quickly as possible is better served by a web cache than by the web server proper.

Real web applications perform non-trivial request processing. They hit databases and file systems. Perhaps they make calls to backend services. Attach a profiler to such an application, and you’ll see that the cost of the application logic and everything it calls dwarfs the ASP.NET runtime overhead. In my nearly seven years on the ASP.NET team, I have never once heard a customer complain that the ASP.NET runtime was simply too slow (in terms of throughput) for his needs when compared with other frameworks or hosts.

Throughput is just one aspect of performance measurements. A web server has a finite number of resources available to it: CPU, memory, hard drive space and I/O speed, and so on. Each request consumes some chunk of these shared resources, and the server must be mindful of how resources are allocated to each request. Administrators measure the resources required for each request and use this to make a determination of how many concurrent requests the server can handle while still meeting availability and reliability goals.

And once you calculate the number of concurrent requests per machine, how does one go about scaling the application up? The traditional way to do this in ASP.NET applications is to simply throw more hardware at the problem: build out a web farm. If the backend database is the bottleneck, cluster it out. Perhaps add a backend cache such as Redis. The particular course of action taken depends on the application.

When you think of improving performance as solving a resource allocation problem rather than as boosting throughput, you might be surprised where this train of logic leads. Let’s take a moment to consider one resource for now – memory.

Comparing memory usage of System.Web and Helios

To compare memory usage of a System.Web-based application versus a Helios-based application, we need a reference application. Any application based on OWIN is an ideal candidate for such a comparison. This allows us to leave the application code the same, so the only thing really changing between the runs is the underlying runtime.

Consider the following Web API controller whose Get() method simply holds the connection open while releasing the request thread. If we make several thousand requests to this application, this mimics an application processing many concurrent requests with long-running asynchronous operations.

public class MyApiController : ApiController {
    public Task Get() {
        // return a Task which never completes
        return new TaskCompletionSource<object>().Task;
    }
}

Note: This is a very simple example. In practice you would use a realtime framework like SignalR to achieve this goal, but this simple example is still useful for determining the minimum amount of memory required to maintain a single persistent connection. In the case of SignalR running atop the System.Web OWIN host, the WebSocket transport generally consumes less memory than the other available transports, so actual per-request memory usage in that scenario may be lower than what is reported here.

In this test, I created a simple OWIN-based (via Microsoft.Owin.Host.SystemWeb) WebAPI application with the above controller. No other middleware was added, and I did not change ASP.NET configuration (other than increase the maximum allowed concurrent connection count from its default value). The web application was deployed to a 64-bit application pool in IIS 8.5 (Windows Server 2012 R2). I then hit this endpoint with 50,000 connections and monitored the # Bytes in all heaps performance counter for the w3wp.exe process. (I also forced garbage collections throughout the test to reclaim unreachable memory.)

The performance counter showed 1,480,856,008 allocated bytes in all heaps for the worker process. Divided by 50,000 requests, this gives an amortized overhead of 28.9 KiB per request. We can’t treat this number as absolutely golden when performing capacity planning exercises. For instance, it doesn’t account for any unmanaged per-request memory usage. But it can tell us a few things, such as that on a machine with 8 GB of RAM and running a 64-bit worker process, the amount of physical memory will become a bottleneck at the 300k concurrent request level or earlier.
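
If you want to reproduce this style of measurement, the sketch below shows one way to sample the same counters from code. It is an illustration rather than part of the original test harness: the "w3wp" instance name and the 50,000-request figure are assumptions you would replace with the actual worker-process instance and request count from your own run.

using System;
using System.Diagnostics;

class HeapCounterSample
{
    static void Main()
    {
        // Placeholder instance name; pick the actual worker-process instance listed
        // under the ".NET CLR Memory" category on your server.
        const string instance = "w3wp";
        const int concurrentRequests = 50000; // matches the test described above

        using (var heapBytes = new PerformanceCounter(".NET CLR Memory", "# Bytes in all Heaps", instance))
        using (var timeInGc = new PerformanceCounter(".NET CLR Memory", "% Time in GC", instance))
        {
            timeInGc.NextValue(); // the first sample primes the counter

            float bytes = heapBytes.NextValue();
            Console.WriteLine("Amortized managed overhead: {0:N2} KiB / request",
                bytes / concurrentRequests / 1024);
            Console.WriteLine("% Time in GC: {0:N2}", timeInGc.NextValue());
        }
    }
}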

The ASP.NET code paths are optimized so that they’re just background noise in throughput measurements, but they definitely show up when other resources like memory are considered. Developers pay for these features – even if those features are never used by the application. This is one of the tradeoffs of the “everything and the kitchen sink” mantra followed by the ASP.NET runtime.

We then reran the test with the Helios OWIN package installed. Installing the Microsoft.Owin.Host.IIS NuGet package is the only change we made to this project. In this new run, the performance counter showed 53,295,232 allocated bytes, which divided by 50,000 concurrent requests gives an amortized overhead of 1.04 KiB per request. Given that the System.Web overhead for this same test is 28.9 KiB per request, the Helios architecture provides a 96.4% reduction in per-request managed memory overhead compared with the full ASP.NET pipeline.

Memory overhead (amortized per request)

System.Web-based: 28.9 KiB / request
Helios-based: 1.04 KiB / request
Difference: -96.4%

Let’s put this another way. In absolute numbers, the Helios architecture allowed our sample application to achieve 50,000 concurrent requests with approximately 1 GB less overhead compared with the standard ASP.NET pipeline. And since the sample application was designed to be a minimum baseline, one can reasonably expect this same absolute number to apply to any non-trivial application as well.

And no. We’re not just pulling a sleight-of-hand and making unmanaged memory allocations in place of managed allocations. We’re not that sneaky. :)

Saving memory has beneficial ripple effects. Fewer page faults puts less pressure on the page file. Because there are fewer per-request managed allocations, there is also less pressure on the CLR garbage collector. Collections occur less frequently, and when they do occur they tend to complete much more quickly. In one of our internal “Hello World” performance runs (warning: unrealistic workload!), the full ASP.NET pipeline spent around 2.0% of its time performing garbage collection (see % Time in GC performance counter). That same application when Helios-hosted averaged 0.06% time in GC.

Using the Helios runtime without OWIN

The Helios runtime (Microsoft.AspNet.Loader.IIS.dll) is a standalone assembly and doesn’t have any direct integration with the OWIN pipeline. An application is free to use the APIs exposed by the Helios runtime directly rather than use the OWIN extensibility points provided by the Microsoft.Owin.Host.IIS package.

If you’d like to use the Helios APIs directly, follow these steps to get started:

  1. In Visual Studio 2013, select File -> New -> Project.
     
  2. Verify that the target framework is .NET Framework 4.5 or later. Select ASP.NET Web Application.
     
  3. In the New ASP.NET Project dialog, select the Empty template, then hit OK.
     
  4. Install the Microsoft.AspNet.Loader.IIS NuGet package into the project. Do not install the Microsoft.Owin.Host.IIS package, otherwise the OWIN compatibility layer will initialize and you may see weird runtime behaviors due to multiple HttpApplicationBase instances being available.
     
  5. Add a class which subclasses the Microsoft.AspNet.Loader.IIS.HttpApplicationBase type. At minimum, your derived type must override the ProcessRequestAsync method.
     
  6. Add an assembly-level Microsoft.AspNet.Loader.IIS.HttpApplicationAttribute which points to the type of your HttpApplicationBase-derived type.

    A sample MyHeliosApplication.cs file which combines steps (5) and (6) is provided below:

    using System;
    using System.Globalization;
    using System.Text;
    using System.Threading.Tasks;
    using System.Web;
    using Microsoft.AspNet.Loader.IIS;

    [assembly: HttpApplication(typeof(MyHeliosApplication))]
    public class MyHeliosApplication : HttpApplicationBase {
        public override async Task ProcessRequestAsync(IHttpContext context) {
            context.Response.StatusCode = 200;
            context.Response.StatusDescription = "OK";
            context.Response.Headers["Content-Type"] = new[] { "text/plain" };
            await context.Response.WriteLineAsync("The current time is {0}.", DateTimeOffset.Now);
            var asms = AppDomain.CurrentDomain.GetAssemblies();
            await context.Response.WriteLineAsync("There are {0} assemblies in the current AppDomain:", asms.Length);
            foreach (var asm in asms) {
                await context.Response.WriteLineAsync(asm.GetName().ToString());
            }
        }
    }

    internal static class ResponseExtensions {
        public static Task WriteLineAsync(this IHttpResponse response, string format, params object[] args) {
            byte[] bytes = Encoding.UTF8.GetBytes(String.Format(CultureInfo.CurrentCulture, format, args) + Environment.NewLine);
            return response.WriteEntityBodyAsync(bytes, 0, bytes.Length);
        }
    }


Run the project via CTRL-F5. You should see the current time and list of loaded assemblies written to the response, as in the sample output below.

The current time is 2/11/2014 11:08:18 AM -08:00.
There are 5 assemblies in the current AppDomain:
mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
Microsoft.AspNet.Loader.IIS, Version=0.1.5.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
System.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
WebApplication34, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null

There is no official documentation as yet, but the Microsoft.AspNet.Loader.IIS NuGet package includes some limited IntelliSense for these APIs. They roughly correspond to a slimmed-down version of the APIs on System.Web.HttpContext.

Feedback

If you have any feedback, please leave a comment on the original blog post.  Thanks for reading!


Adding two-factor authentication to an application using ASP.NET Identity

Introduction

We recently released the 2.0.0-beta1 version of ASP.NET Identity. Learn more by visiting this link. This is an update to 2.0.0-alpha1 and adds two-factor authentication along with a few bug fixes. To learn more about the Alpha release, visit this link.

As mentioned in the release article, with the 2.0.0-beta1 version of Identity we have added support for enabling two-factor authentication in an application. Two-factor authentication is a two-step process for authenticating a user: for example, using local account credentials plus a secure PIN sent as a text message, or using an OAuth login plus a QR code reader. More information on multi-factor authentication can be found here.

At a basic level, the two-factor authentication flow is explained below:

clip_image002[5]

Step 1 is when the user enters his/her local username and password on the Login page. If the local credentials are valid, then an email is sent to the registered email address of the user with the security PIN. An alternative is to send the PIN in a text message. The user then enters the PIN on the web page. The PIN is validated and the user logged in to the application. To do this we need to initially confirm the email or the phone number that the user provides during the registration process.

The current article explains how to add two-factor authentication to an application where users sign in with their local account credentials and a secure PIN sent in email. We will start with an application created using Visual Studio 2013, update the Identity packages to 2.0.0-beta1, add code to confirm the user's email address, register and enable the email token provider for the two-step authentication, and verify the functionality.

All the code in the article is available in the newly released Microsoft.AspNet.Identity.Samples package which has some additional features too. The article here is a walkthrough explaining the steps as we add the feature in the application.

Create application and configure UserManager

The application created using Visual Studio 2013 has Identity version 1.0.0. This article describes updating the version to 2.0.0-beta1 and configuring the UserManager class in the application in the recommended manner. This article shows you how you can get a single instance of UserManager per request.

Please follow that article as the first step in adding the new features.

Setting up test email client

For the two factor authentication shown in this post, the security PIN is emailed to the user. The UserManager has properties which are used for the messaging operations. The two properties ‘EmailService’ and ‘SmsService’ can be set to the implementation of ‘IIdentityMessageService’ interface to send an email or a text message. The ‘IIdentityMessageService’ has a single ‘SendAsync’ method which is implemented according to the messaging action. For our sample, we will create a class to send an email using a test SMTP server.

For the sake of this article we will use the Windows Live SMTP server as a test server which is available free of cost if you have a Windows Live account.

Create a class called EmailService in the project. Paste the code below

  1. public class EmailService : IIdentityMessageService
  2.     {
  3.         public Task SendAsync(IdentityMessage message)
  4.         {
  5.             MailMessage email = new MailMessage("XXX@hotmail.com", message.Destination);
  6.  
  7.             email.Subject = message.Subject;
  8.  
  9.             email.Body = message.Body;
  10.  
  11.             email.IsBodyHtml = true;
  12.  
  13.             var mailClient = new SmtpClient("smtp.live.com", 587) { Credentials = new NetworkCredential("XXX@hotmail.com", "password"), EnableSsl = true };
  14.  
  15.             return mailClient.SendMailAsync(email);
  16.         }
  17.     }

The IdentityMessage class is the wrapper class for the message to be sent to the user. It has the following properties

· Destination: Where the message is to be sent. In our case, the user’s email address

· Subject: message subject

· Body: the actual message

The ‘SendAsync’ method takes an IdentityMessage parameter which has all the properties set by the UserManager. We construct the email message as a System.Net.Mail.MailMessage object and set the respective properties. We send the mail as HTML so that users can click the links in the email. The SMTP host for the Windows Live server is ‘smtp.live.com’ and the port is ‘587’. The Credentials property of the SmtpClient class should hold the Windows Live account credentials of the application developer testing this feature.

This is a sample mail client and can be extended as needed. Users can substitute the Windows Live SMTP server with, for example, SendGrid to send email to the users.

Confirm user email

To send the two-factor login PIN to the user via email, we need to confirm the email address the user provides during registration. The UserManager class exposes methods that let you obtain tokens and confirm user email addresses using these security tokens.

1. We need to first register a token provider which generates and validates tokens to be emailed in the confirmation link to the user. Using this secure token the user can confirm their email address. This token provider property has to be set on the UserManager and hence can be done in the ApplicationUserManager.Create method.

The IdentityFactoryOptions parameter has a property DataProtectionProvider which is an implementation of the OWIN Data Protection API (DPAPI) feature. This is set in the ‘CreatePerOwinContext<T>’ method in the AppBuilderExtensions class. The code can be viewed by reflecting the Microsoft.AspNet.Identity.Owin dll. We can use this implementation for the UserTokenProvider in UserManager.

  1. public static ApplicationUserManager Create(IdentityFactoryOptions<ApplicationUserManager> options, IOwinContext context)
  2.         {
  3.             var manager = new ApplicationUserManager(new UserStore<ApplicationUser>(context.Get<ApplicationDbContext>()));
  4.  
  5.             // Configure validation logic for passwords
  6.             manager.PasswordValidator = new PasswordValidator
  7.             {
  8.                 RequiredLength = 6,
  9.                 RequireNonLetterOrDigit = false,
  10.                 RequireDigit = false,
  11.                 RequireLowercase = false,
  12.                 RequireUppercase = false,
  13.             };
  14.  
  15.             var dataProtectionProvider = options.DataProtectionProvider;
  16.  
  17.             if (dataProtectionProvider != null)
  18.             {
  19.                 manager.UserTokenProvider = new DataProtectorTokenProvider<ApplicationUser>(dataProtectionProvider.Create("ASP.NET Identity"));
  20.             }
  21.  
  22.             return manager;
  23.         }

This registers a single token provider that is used to generate tokens for user email confirmation and resetting password.
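
The same DataProtectorTokenProvider also backs the password-reset flow. The snippet below is a hedged sketch of how that might look inside a controller action, where user is the ApplicationUser in question; the method names follow the Identity 2.0 RTM API (GeneratePasswordResetTokenAsync / ResetPasswordAsync) and may differ slightly in the beta builds, and the new password value is just a placeholder.

// Hedged sketch: the token provider registered above also produces password-reset tokens.
// Method names follow Identity 2.0 RTM and may differ in the beta builds.
string resetToken = await UserManager.GeneratePasswordResetTokenAsync(user.Id);

// ... email resetToken to the user; later, when the user submits the reset form:
IdentityResult resetResult = await UserManager.ResetPasswordAsync(user.Id, resetToken, "NewP@ssw0rd!");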

2. Next assign the ‘EmailService’ property on the UserManager to the ‘EmailService’ class created in the previous section inside the ‘Create’ method

  1. manager.EmailService = new EmailService();

The next steps outline the process of modifying the VS 2013 RTM templates to enable the user email confirmation feature.

3. In the current project, users register themselves with username and password. We shall change the registration flow to take an additional email field along with the username. Edit the RegisterViewModel to add another property for Email.

  1. public class RegisterViewModel
  2.     {
  3.         [Required]
  4.         [Display(Name = "User name")]
  5.         public string UserName { get; set; }
  6.  
  7.         [Required]
  8.         [Display(Name = "Email address")]
  9.         public string Email { get; set; }
  10.  
  11.         ......
  12.     }

Note: In this example we are setting the username and email separately. You can also choose to set the email as the username. Refer to the Microsoft.Aspnet.Identity.Samples Nuget package for implementing this.

4. In the Register.cshtml page, add a new ‘div’ tag for the email similar to the one for username.

  1. <div class="form-group">
  2.         @Html.LabelFor(m => m.Email, new { @class = "col-md-2 control-label" })
  3.         <div class="col-md-10">
  4.             @Html.TextBoxFor(m => m.Email, new { @class = "form-control" })
  5.         </div>
  6.     </div>

5. In the Register post method of the AccountController, set the email from the view model class when creating a user.

  1. var user = new ApplicationUser() { UserName = model.UserName, Email = model.Email };

6. In the current registration logic, the user is signed in once the user is registered. We will change the flow to get the email confirmation token and email it to the user. The users need to check their email for the confirmation link and confirm their account. In the Register post method replace the following code:

  1. if (result.Succeeded)
  2.                 {
  3.                     await SignInAsync(user, isPersistent: false);
  4.  
  5.                     return RedirectToAction("Index", "Home");
  6.  
  7.                 }

with the following code for sending the confirmation link to user via email.

  1. if (result.Succeeded)
  2.                 {
  3.                     string code = await UserManager.GetEmailConfirmationTokenAsync(user.Id);
  4.                     var callbackUrl = Url.Action("ConfirmEmail", "Account", new { userId = user.Id, code = code }, protocol: Request.Url.Scheme);
  5.                     await UserManager.SendEmailAsync(user.Id, "Confirm your account", "Please confirm your account by clicking this link: <a href=\"" + callbackUrl + "\">link</a>");
  6.                     return View("ShowEmail");
  7.                 }

We get the email confirmation token, which is then used to create a URL pointing to the ‘ConfirmEmail’ action in the application. The UserManager.SendEmailAsync method calls the ‘SendAsync’ method of the ‘EmailService’ to send the email. The user is then redirected to a ‘ShowEmail’ page with a message asking them to check their email inbox.

7. Add a view ShowEmail.cshtml that displays a message to user to check their email for a link to confirm their email address

  1. <h3>
  2.     Please check your email for a link to confirm your email address
  3. </h3>

8. In step 6, we saw that the generated token is included in a URL that the user clicks to confirm the email. The link points to an action method ‘ConfirmEmail’ in the AccountController. We need a method to bind the parameters from the URL, verify that the token is a valid one for the user, and then confirm the email. Add a new method called ConfirmEmail in the AccountController as below:

  1. [HttpGet]
  2.         [AllowAnonymous]
  3.         public async Task<ActionResult> ConfirmEmail(string userId, string code)
  4.         {
  5.             if (userId == null || code == null)
  6.             {
  7.                 return View("Error");
  8.             }
  9.  
  10.             IdentityResult result = await UserManager.ConfirmEmailAsync(userId, code);
  11.  
  12.             if (result.Succeeded)
  13.             {
  14.                 return View("ConfirmEmail");
  15.             }
  16.             else
  17.             {
  18.                 AddErrors(result);
  19.                 return View();
  20.             }
  21.         }

The above code binds the ‘userId’ and ‘code’ parameters from the url and calls the method on the UserManager to confirm the email address.

9. Add the corresponding view ConfirmEmail.cshtml to display the messages

  1. <h2>@ViewBag.Title.</h2>
  2.  
  3. <div>
  4.  
  5.     <p>
  6.         Thank you for confirming your email. Please @Html.ActionLink("click here to log in", "Login", "Account", routeValues: null, htmlAttributes: new { id = "loginLink" })
  7.     </p>
  8. </div>

Setting up Two-Factor Authentication in the application

As of now we have set up the email client and added code to confirm the user email. Now we need to include additional action methods and views to complete the two-factor authentication.

In the existing web application, when a user tries to login using the Login page, the local account credentials or the OAuth credentials are validated and the user is logged in. We need to alter the flow to send the PIN via email and wait for the user to enter it.

1. The two-factor authentication providers have to be registered with the UserManager. This can be done in the ApplicationUserManager.Create method. The Identity framework ships with two built-in two-factor authentication providers, namely EmailTokenProvider and SmsTokenProvider, which generate the security PIN and validate it once it is received from the user. For our project we’ll use the EmailTokenProvider<T> provider.

2. We need to register this provider to the list of two-factor authentication providers in UserManager. This can be done in the ‘Create’ method as below

  1. public static ApplicationUserManager Create(IdentityFactoryOptions<ApplicationUserManager> options, IOwinContext context)
  2.         {
  3.             var manager = new ApplicationUserManager(new UserStore<ApplicationUser>(context.Get<ApplicationDbContext>()));
  4.             manager.PasswordValidator = new PasswordValidator
  5.             {
  6.                 RequiredLength = 6,
  7.                 RequireNonLetterOrDigit = false,
  8.                 RequireDigit = false,
  9.                 RequireLowercase = false,
  10.                 RequireUppercase = false,
  11.             };
  12.             manager.RegisterTwoFactorProvider("EmailCode", new EmailTokenProvider<ApplicationUser>()
  13.             {
  14.                 Subject = "SecurityCode",
  15.                 BodyFormat = "Your security code is {0}"
  16.             });
  17.  
  18.             manager.EmailService = new EmailService();
  19.  
  20.             var dataProtectionProvider = options.DataProtectionProvider;
  21.             if (dataProtectionProvider != null)
  22.             {
  23.                 manager.UserTokenProvider = new DataProtectorTokenProvider<ApplicationUser>(dataProtectionProvider.Create("ASP.NET Identity"));
  24.             }
  25.             return manager;
  26.         }

The ‘Subject’ and ‘BodyFormat’ properties format the email message as needed so you can edit them as required.

3. In the Login post action, instead of logging in the local user when the local credentials are valid, we need to generate the secure PIN and send it in the email. The ‘GenerateTwoFactorTokenAsync’ method on the UserManager generates the code and calls the SendAsync method on the EmailService property to send the email with the secure PIN.

  1. var user = await UserManager.FindAsync(model.UserName, model.Password);
  2.                 if (user != null)
  3.                 {
  4.                     await UserManager.GenerateTwoFactorTokenAsync(user.Id, "EmailCode");
  5.                     ...

4. After sending the email, the application needs to set a cookie in the browser. This is an essential authentication step to prevent users who were not authenticated with valid local credentials from directly accessing the page. The userId is set as a claim so that it can be used to query the user object during PIN verification. Create a new private method to set the cookie

  1. private void SetTwoFactorAuthCookie(string userId)
  2.         {
  3.             ClaimsIdentity identity = new ClaimsIdentity(DefaultAuthenticationTypes.TwoFactorCookie);
  4.             identity.AddClaim(new Claim(ClaimTypes.NameIdentifier, userId));
  5.             AuthenticationManager.SignIn(identity);
  6.         }

Call this method in the Login method after the call to send the email. Then redirect to a new action to wait for the user to enter the PIN:

  1. var user = await UserManager.FindAsync(model.UserName, model.Password);
  2.                 if (user != null)
  3.                 {
  4.                     await UserManager.GenerateTwoFactorTokenAsync(user.Id, "EmailCode");
  5.                     SetTwoFactorAuthCookie(user.Id);
  6.                     return RedirectToAction("VerifyCode");
  7.                 }

5. ASP.NET Identity uses OWIN middleware for cookie-based authentication. We need to configure the OWIN cookie middleware to store a two-factor authentication cookie in the request. The cookie middleware in the application is configured during application start via the ConfigureAuth method in Startup.Auth. We can add our code here to configure the cookie middleware to use the two-factor cookie for temporary authentication. Copy the code below into the Startup.Auth class:

  1. app.UseTwoFactorSignInCookie(DefaultAuthenticationTypes.TwoFactorCookie, TimeSpan.FromMinutes(5));

6. The code in steps 3 and 4 changed the flow when a user logs in using their local credentials. If a user signs in using OAuth credentials like Google or Facebook that are linked with a local account, we need to enable two-factor authentication for them too. To do this for the existing application, we need to send the PIN to the user’s email once they are authenticated by the external provider.

In the AccountController, the ‘ExternalLoginCallback’ method is called after the user is authenticated by the OAuth provider. If there is an existing user with these credentials, then the user is signed in. To implement OAuth we need to replace the sign in action with one that sends the secure PIN and waits for the user to enter the PIN. To do this, replace the call to:

  1. await SignInAsync(user, false);
  2.                 return RedirectToLocal("Home");

with the one defined in step 4.

  1. var user = await UserManager.FindAsync(loginInfo.Login);
  2.             if (user != null)
  3.             {
  4.                 await UserManager.GenerateTwoFactorTokenAsync(user.Id, "EmailCode");
  5.                 SetTwoFactorAuthCookie(user.Id);
  6.                 return RedirectToAction("VerifyCode");
  7.             }

This implements two-factor authentication when user logs in with their OAuth credentials too.

7. Add a method to retrieve the UserId from the two factor authentication cookie set on successful login by local user.

  1. private async Task<string> GetTwoFactorUserIdAsync()
  2.         {
  3.             var result = await AuthenticationManager.AuthenticateAsync(DefaultAuthenticationTypes.TwoFactorCookie);
  4.  
  5.             if (result != null && result.Identity != null && !String.IsNullOrEmpty(result.Identity.GetUserId()))
  6.             {
  7.  
  8.                 return result.Identity.GetUserId();
  9.             }
  10.             return null;
  11.         }

Create a new action method ‘VerifyCode’ to wait for the PIN from the user. In the get action for this method, verify the userId from the cookie. This way we can validate that the user logged in using valid credentials and was then redirected to the current page. If the cookie is not found, redirect to the Login page for the user to login.

  1. [AllowAnonymous]
  2.         public async Task<ActionResult> VerifyCode()
  3.         {
  4.             if (String.IsNullOrEmpty(await GetTwoFactorUserIdAsync()))
  5.             {
  6.                 return RedirectToAction("Login");
  7.             }
  8.             return View();
  9.         }

8. Add a new cshtml to get the PIN from the user. For convenience, no view model is added to bind the PIN to it. We directly read it from the form element.

  1. @using (Html.BeginForm("VerifyCode", "Account", FormMethod.Post, new { @class = "form-horizontal", role = "form" }))
  2. {
  3.  
  4.     <h3>Enter the PIN sent to the email</h3>
  5.  
  6.     <div class="form-group">
  7.  
  8.         @Html.Label("pinLabel", new { @class = "col-md-2 control-label" })
  9.  
  10.         <div class="col-md-10">
  11.  
  12.             @Html.TextBox("pin", "", new { @class = "form-control" })
  13.  
  14.         </div>
  15.  
  16.     </div>
  17.  
  18.     <div class="form-group">
  19.  
  20.         <div class="col-md-offset-2 col-md-10">
  21.  
  22.             <input type="submit" class="btn btn-default" value="Submit" />
  23.  
  24.         </div>
  25.  
  26.     </div>
  27.  
  28. }

9. We need to validate the PIN entered by the user. This is the final method in the flow to validate the PIN submitted by the user. Add a new method ‘VerifyCode’ with HttpPost attribute which is called when the above form is posted. Here we validate if the userId is present in the cookie, which shows that this is a valid request. Then we call Verify on the UserManager to see if this is a valid PIN for this user. If it is, then we sign in to the application; otherwise we report back the error.

  1. [HttpPost]
  2.         [AllowAnonymous]
  3.         public async Task<ActionResult> VerifyCode(string pin)
  4.         {
  5.             string userId = await GetTwoFactorUserIdAsync();
  6.  
  7.             if (userId == null || String.IsNullOrEmpty(pin))
  8.             {
  9.  
  10.                 return View("Error");
  11.  
  12.             }
  13.             var user = await UserManager.FindByIdAsync(userId);
  14.  
  15.             if (await UserManager.VerifyTwoFactorTokenAsync(user.Id, "EmailCode", pin))
  16.             {
  17.  
  18.                 await SignInAsync(user, false);
  19.  
  20.                 return RedirectToLocal("Home");
  21.  
  22.             }
  23.             else
  24.             {
  25.                 ModelState.AddModelError("", "Invalid code");
  26.             }
  27.             return View();
  28.         }

A note on the two factor PIN generated

Identity uses an implementation of the RFC 6238 Time-based One-Time Password algorithm (TOTP) for generation of the PIN used for two-factor authentication. The PIN generated is 6 digits in length and is valid for a period of 180 seconds (3 minutes).

Running the application

Run the application.

1. On the Register page, enter the username and email.

clip_image004

2. Clicking on Register redirects to a page where it says to check the email for a confirmation link.

clip_image006

3. Verify the email for the link. The link might be URL encoded. Use any URL decoder (available online) to decode the URL. Paste it in the browser.

clip_image007

4. Now click the link to go to the log in page. Enter the login credentials. Clicking Login redirects to the page where the application expects the PIN sent in the email.

clip_image008

5. Enter the PIN and hit Submit. The user is logged into the application.

Code

With 2.0.0-beta1 release we are shipping a sample application as a Nuget package (Microsoft.Aspnet.Identity.Samples) which should be installed in an empty ASP.NET web application. This application has the code which implements all features of the 2.0.0-beta1 release and can be used as a template when updating your applications or to try out the new features.

This sample NuGet package also shows how to use SMS for two-factor authentication and other 2.0 features which are not covered in this article. The sample application created using this article is available at link.

Summary

This post shows how you can add user email confirmation and two-factor authentication via email using ASP.NET Identity to an application that was using ASP.NET Identity 1.0. In 2.0 we shipped more features such as Password Reset and two-factor authentication using SMS. All these features can be implemented by following the same flow or by looking at the Microsoft.AspNet.Identity.Samples package.

I hope you have found this walkthrough useful. If you have any questions around ASP.NET Identity or find issues, please feel free to open bugs on the Identity Codeplex site, https://aspnetidentity.codeplex.com/ or ask questions on asp.net/forums or StackOverflow (tag: aspnet-identity). I can also be reached on twitter (@suhasbjoshi).

Thanks to Tom Dykstra and Pranav Rastogi for reviewing the article


Using Claims in your Web App is Easier with the new OWIN Security Components

Hello everybody! My name is Vittorio Bertocci: I am a program manager in the Windows Azure Active Directory team, where I work on developer experience.

In the last few months the ASP.NET and Active Directory teams have been busy collaborating on a new OWIN-based programming model for securing modern ASP.NET applications. Today I have the privilege to announce the first developer preview of the OWIN components that will allow you to secure your ASP.NET applications with Windows Azure AD, ADFS and any other identity provider supporting WS-Federation.

Here is an image that gives some measure of the improvements we were able to achieve. On the left side, you can see a typical web.config file of an ASP.NET app configured to use claims based identity with current technology. On the right side, you can see the equivalent initialization logic when using the new OWIN components. Are you curious about how we got there? Read on!

image

Claims Based Identity and the .NET Framework

Claims based identity made its debut in the developer’s toolbox back in 2009, with the first release of Windows Identity Foundation (WIF). At that time the only people working with claims based identity were individuals with both development and administration background, often leaning on the latter, with deep understanding of the underlying security protocols. The introduction of classes that took care of the low level details and Visual Studio tools to facilitate app configuration helped more and more developers to take advantage of claims’ ability to cross boundaries between platforms and between on-premises and cloud. With the release of .NET 4.5 in 2012 all the WIF classes migrated in the .NET Framework, with System.Security.Claims moving into mscorlib itself. Starting with the release of Visual Studio 2013, support for claims based identity is available directly in the project templates out of the box. And all the while, we’ve been updating the out of the box functionalities by releasing NuGet libraries implementing the latest industry advancements (such as support for new lightweight token formats).

Fast forward to today, claims based identity has gone completely mainstream. I have lost count of all the services and server products that take advantage of .NET’s rich support for claims and identity protocols.

image

Despite all of those advancements, however, the way in which web application developers interact with claims based identity has remained the same for all this time. The original design is web.config-heavy, as the administrators-developers of the time expected; it is based on HttpModules, tying it to apps based on the ASP.NET/IIS hosting; it exposes a lot of protocol-level information, empowering the protocol expert to control every aspect but requiring tools to generate the complex configuration files it needs; and so on.

It was time for us to go back and rethink how to make available the rich claims based identity capabilities of the .NET Framework in a modern, more effective way.

Enter Microsoft.Owin.Security.WsFederation

If you are following this space, you are already familiar with the Microsoft OWIN Components (Howard wrote a great introductory article here).
We already leveraged that framework in Visual Studio 2013 to deliver the next generation authentication components for Web API projects (see an introduction here). We got really good feedback on the approach we took there, which entails asking you via super simple initialization logic the absolute minimal amount of information required for securing your Web API. We decided to apply the same approach to browser-based sign on.

The .NET Framework already provides all of the raw functionality for processing claims based identity flows: token formats, cryptography, representation of the caller in form of ClaimsPrincipal… all those were already present and proven by half a decade of enterprise-grade use, hence we had no desire to replace those. We focused on the top layer of the stack, the one you need to touch directly when configuring your app to be secured using a claims based identity protocol, and re-implemented it using OWIN components.

image

Our JWT token handler library was already designed to work without web.config or HttpModules. We wanted the same for other assets, like the SAML token handlers, hence we created a very thin layer on top of those (the Microsoft.IdentityModel.Protocol.Extensions library).

Next, we extended Microsoft.Owin.Security to include base classes to be used for implementing standard web sign in protocols such as WS-Federation and OpenID Connect.

Finally, we created Microsoft.Owin.Security.WsFederation, a new component containing middleware for handling the WS-Federation protocol. We started with WS-Federation because that’s the most commonly supported protocol in our ecosystem today, allowing you to connect to both Windows Azure AD and ADFS from version 2.0 on. OpenId Connect support will come soon after.

You can get the packages above using the NuGet Package Manager Console with the following:

  • Install-Package Microsoft.Owin.Security.WsFederation -Version 3.0.0-alpha1 -Pre

Configuring your application to be secured via WS-Federation does not require you to use any tools, you just need to reference the right NuGet package and add some code similar to the following:

public void ConfigureAuth(IAppBuilder app)
{
    app.UseCookieAuthentication(new CookieAuthenticationOptions
        {
            AuthenticationType = 
               WsFederationAuthenticationDefaults.AuthenticationType
        });

    app.UseWsFederationAuthentication(new WsFederationAuthenticationOptions
        {
            MetadataAddress = "https://login.windows.net/azurefridays.onmicrosoft.com/federationmetadata/2007-06/federationmetadata.xml",
            Wtrealm = "http://myapps/WsFed_AAD1",
        });
}

Most of it is absolutely boilerplate code that will look the same in all of your projects. The only parts that will change are:

  • MetadataAddress. This value represents the Windows Azure AD tenant (or ADFS instance) you want to use for authenticating your users. If you are writing line of business apps for your own company, this line will likely never change.
  • Wtrealm. This is the identifier of your application, as assigned when you configured your app in your authenticating authority (in the Windows Azure portal for Windows Azure AD, in the management console for ADFS).

… and that’s all you need!

Besides being significantly easier to program against, the new approach has many more advantages:

  • It is now possible to combine multiple authentication types in a single application. For example, you could have some routes serve web UX and some others handle Web API calls – and have the former secured via WS-Federation while the latter are secured via OAuth2 bearer tokens. OWIN makes it very straightforward (see the sketch after this list)!
  • It’s now very easy to use claims-based identity in self hosted scenarios
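
To make the first point concrete, here is a minimal sketch of a Startup class that wires up both kinds of middleware side by side. It is illustrative only: the metadata address and realm are the placeholder values from the earlier example, and the bearer middleware is shown with default options rather than a full token-validation setup.

using Microsoft.Owin.Security.Cookies;
using Microsoft.Owin.Security.OAuth;
using Microsoft.Owin.Security.WsFederation;
using Owin;

public partial class Startup
{
    public void ConfigureAuth(IAppBuilder app)
    {
        // Browser-facing routes: cookie sign-in driven by WS-Federation.
        app.UseCookieAuthentication(new CookieAuthenticationOptions
        {
            AuthenticationType = WsFederationAuthenticationDefaults.AuthenticationType
        });

        app.UseWsFederationAuthentication(new WsFederationAuthenticationOptions
        {
            MetadataAddress = "https://login.windows.net/azurefridays.onmicrosoft.com/federationmetadata/2007-06/federationmetadata.xml",
            Wtrealm = "http://myapps/WsFed_AAD1"
        });

        // Web API routes: OAuth2 bearer tokens (default options shown for brevity).
        app.UseOAuthBearerAuthentication(new OAuthBearerAuthenticationOptions());
    }
}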

Same Expressive Power for Advanced Uses

Making the easy cases easy was an area where the HttpModule-based approach wasn’t great: we believe the OWIN approach improved that.

However we did not want to lose the flexibility and expressive power that made the big success of claims based identity possible: hence we built on the experience of half a decade of observing developers use the current extensibility model, and ensured that equivalent affordances are available in the new components. Where possible, we consolidated by eliminating parts that weren’t used and improved on scenarios that needed better support.

I will be posting various deep dives on www.cloudidentity.com in the next few days, but just to give you a taste: the WsFederationAuthenticationOptions class referenced above can either be initialized with a minimal amount of info (as shown above) or it can be used to specify finer-grained settings. You can control most aspects of the protocol and inject your own custom logic (via delegates) at key stages of the validation pipeline. If you are familiar with the protocol and/or the current extensibility model, you might see some furniture rearranged but mostly you should feel right at home.
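
To give a taste of that extensibility, here is a hedged sketch of supplying a callback on the options. The Notifications property and the SecurityTokenValidated delegate shown here follow the shape of later Katana previews and may differ slightly in this alpha; the claim being added is purely illustrative.

app.UseWsFederationAuthentication(new WsFederationAuthenticationOptions
{
    MetadataAddress = "https://login.windows.net/azurefridays.onmicrosoft.com/federationmetadata/2007-06/federationmetadata.xml",
    Wtrealm = "http://myapps/WsFed_AAD1",

    // Hook into the validation pipeline; member names may vary in this alpha.
    Notifications = new WsFederationAuthenticationNotifications
    {
        SecurityTokenValidated = notification =>
        {
            // Example only: stamp the incoming identity with an application-defined claim.
            notification.AuthenticationTicket.Identity.AddClaim(
                new System.Security.Claims.Claim("urn:sample/appRole", "reader"));
            return System.Threading.Tasks.Task.FromResult(0);
        }
    }
});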

image

An Example: Securing an MVC App with Windows Azure AD

See this blog post, where I provide step-by-step instructions on securing an MVC app with Windows Azure AD.

Next Steps

We are very, very excited to make this preview available to you, and we can’t wait to hear what you think about it! At this phase your feedback is absolutely crucial. If you want to join the conversation, there are plenty of ways for you to do so:

As mentioned above, I’ll be posting various deep dives on www.cloudidentity.com – if there are specific scenarios you want to see covered, please make sure you use the channels above to let us know and we’ll do our best to prioritize accordingly.

Thanks and enjoy!

Announcing new Web Features in Visual Studio 2013 Update 2 CTP2

Today, the Visual Studio team announced the release of CTP2 of Visual Studio 2013 Update 2. Our team added a few useful features and did some bug fixing in this update to improve the web development experience. We will have future blog posts that talk about some of the features in detail. The release notes contain more details.

New SASS project item and editor

We added LESS in VS2013 RTM, and we now have a SASS project item and editor. SASS editor features are comparable to the LESS editor’s and include colorization, variable and mixin IntelliSense, comment/uncomment, quick info, formatting, syntax validation, outlining, go to definition, a color picker, Tools Options settings, etc.

image

image

New JSON project item and editor

We have added a JSON project item and editor to Visual Studio.  Current JSON editor features include colorization, syntax validation, brace completion, outlining, tools option setting and more.

image

Create remote Azure resources option when creating a new Web project

We added a Windows Azure “Create remote resources” checkbox on the new web application dialog. By choosing it, you will be able to integrate the experience of creating a new web application, setting up the Windows Azure publishing site for testing, and creating a publishing profile in a few simple steps.

image

image

We also support remote debugging for Windows Azure Web Sites (WAWS) and remote viewing of Azure website content files in Server Explorer.

ASP.NET Scaffolding

If your model is using enums, then the MVC scaffolder will generate a dropdown for the enum property. This uses the enum helpers in MVC, as shown in the sketch below.
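
For example, given a hypothetical model like the one below (the type and property names are made up for illustration), the scaffolded Create and Edit views render the Status property as a dropdown using the MVC 5.1 Html.EnumDropDownListFor helper.

// Hypothetical model; the scaffolder generates a dropdown for the OrderStatus property.
public enum OrderStatus
{
    Pending,
    Shipped,
    Delivered
}

public class Order
{
    public int Id { get; set; }
    public OrderStatus Status { get; set; }
    // In the scaffolded view this becomes roughly:
    // @Html.EnumDropDownListFor(model => model.Status, htmlAttributes: new { @class = "form-control" })
}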

Updated the EditorFor templates in MVC Scaffolding so they use the Bootstrap classes.

The MVC and Web API scaffolders will add the 5.1 packages for MVC and Web API.

One ASP.NET Template changes

We updated ASP.NET templates to support Account Confirmation and Password Reset.

We updated ASP.NET Web API template to support authentication using On Premises Organizational Accounts.

ASP.NET SPA template now shows a template where the authentication is based on MVC and server side views. The template has a WebAPI controller which can only be accessed by authenticated users.

LESS editor improvements

We added features including nested media queries, named parameter support, support for selector interpolation, support for semicolons as parameter separators, goto definition for @import, goto definition of variables and mixins.

Knockout IntelliSense upgrade

We added a non-standard Knockout syntax for Visual Studio IntelliSense, the “ko-vs-editor viewModel:” syntax. It can be used to bind to multiple view models on a page using comments in the form:

Code Snippet
  1. <!-- ko-vs-editor viewModel: <any javascript expression that evaluates to an object> -->
  2.  
  3. <!-- /ko-vs-editor -->

image

We also added support for nested ViewModel IntelliSense, so you may drill into deeply nested objects on the ViewModel.

<div data-bind="text: foo.bar.baz.etc" />

The IntelliSense displayed is the full IntelliSense of the JavaScript object.

image

New URL Picker in HTML, Razor, CSS, LESS and SASS pages

VS 2013 shipped with no URL picker outside of Web Forms pages. The new URL picker for the HTML, Razor, CSS, LESS and SASS editors is a dialog-free, fluent typing picker that understands ‘..’ and filters file lists appropriately for images and links.

image

image

image

Browser Link New Features

Browser Link now supports HTTPS connections and will list them in the dashboard with other connections, as long as the certificate is trusted by the browser.

ASP.NET Web Forms

The Web Forms templates now show how to do Account Confirmation and Password Reset for ASP.NET Identity.

Entity Framework Data Source and Dynamic Data Provider for Entity Framework 6. For more details please see http://blogs.msdn.com/b/webdev/archive/2014/01/30/announcing-preview-of-dynamic-data-provider-and-entitydatasource-control-for-entity-framework-6.aspx

ASP.NET MVC 5.1.1, ASP.NET Web API 2.1.1 and ASP.NET Web Pages 3.1.1 are included

We announced ASP.NET MVC 5.1, ASP.NET Web API 2.1 and ASP.NET Web Pages 3.1 in January.  We integrated that release with some minor bug fixes into VS 2013 Update 2 CTP2.

ASP.NET Identity

We integrated Microsoft.AspNet.Identity 2.0-alpha1 into the new project templates.  You can upgrade it to Microsoft.AspNet.Identity 2.0-beta1 to use two factor authentication and more features.

Entity Framework

We integrated Entity Framework 6.1.0-alpha1 into the new project template.  You can upgrade it to Entity Framework 6.1.0-beta1 to use the newest beta1 features.

Microsoft OWIN Components

We integrated the stable version of the Microsoft OWIN Components (2.0.2) into the new project templates. You can upgrade to 2.1.0. Please look at the release notes for the latest stable version (2.1.0) for more detailed information; it includes support for Google OAuth2 authentication (sketched below) and a static file server.
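
As a rough illustration, after upgrading the OWIN packages to 2.1.0 the Google OAuth2 middleware can be enabled in Startup.Auth along these lines; the client id and secret below are placeholders that you obtain from the Google developer console.

using Microsoft.Owin.Security.Google;

// In Startup.Auth, alongside the other authentication middleware registrations.
app.UseGoogleAuthentication(new GoogleOAuth2AuthenticationOptions
{
    ClientId = "your-client-id.apps.googleusercontent.com",   // placeholder
    ClientSecret = "your-client-secret"                        // placeholder
});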

NuGet

NuGet 2.8 RTM is included in this release.  You can always get the latest NuGet extension for Visual Studio through the menu “Tools->Extensions and Updates…”.

ASP.NET SignalR

We included the 2.0.2 NuGet package for SignalR. Please look at the release notes for more detailed information: https://github.com/SignalR/SignalR/releases/tag/2.0.2

Known Problems

Web Essentials 2013 is not compatible with Update 2 CTP2. If you install Update 2 CTP2, then after opening Visual Studio you’ll get an error message saying the “EditorExtensionsPackage” couldn’t be loaded. We hope to have a new version of Web Essentials out soon to support this release.

When creating an App for SharePoint with an ASP.NET MVC web application, users will receive the following message:

o Error: this template attempted to load component assembly 'Microsoft.VisualStudio.Web.Project, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'. For more information on this problem and how to enable this template, please see documentation on Customizing Project Templates

o The workaround to create a new provider-hosted or autohosted app for SharePoint using an MVC web application is to follow the steps:

  • Create a MVC web application project first.
  • After the web project is created, right-click the project node to launch the context menu.
  • In the context menu, select “Convert”, then choose “Convert to App for SharePoint Project…”.

Summary

We hope you can evaluate these new features and let us know about any bugs and suggestions.  For VS features, please use Connect to submit bugs, ASP.NET UserVoice to submit and vote for suggestions, and the ASP.NET Forums for Q&A.  For MVC/WebAPI/WebPages issues, please submit any issues you encounter and feature suggestions for future releases on our CodePlex site. Thank you!

New JSON editor features in Visual Studio 2013 Update 2 CTP2

The Community Technology Preview (CTP2) release of Visual Studio 2013 Update 2 contains a new JSON editor. In this release, the JSON editor includes a number of features such as colorization and JSON syntax validation. More features will be added in subsequent releases.

New JSON item template

In a web application, you can use the new JSON item template under the Markup category to add a new file with the .json extension.

clip_image001

Colorization

When opening a file in the JSON editor, the content will be tokenized and have distinct colors. Invalid tokens will have the default black color as shown below.

clip_image002

The color for numbers can be set using the setting at Tools–> Options–> Environment–> Fonts and Colors–> Number.

clip_image003

JSON syntax validation

Visual Studio 2013 Update 2 will validate JSON syntax, put squiggles under syntax errors that have been detected, and list them in the Error List window. Double-clicking an error in the Error List will bring the file to the foreground if it’s not currently active and put the cursor at the error.

clip_image004

Automatic brace completion and type-through

When you type an opening bracket or quote, the matching closing character is automatically inserted after the cursor. If you type the automatically inserted character, you will “type through” that character. This would help keep your fingers in home position and maintain the typing flow.

clip_image005

Outlining

Outlining is available for the JSON content so you can choose to either collapse or expand any part of the content for easy viewing.

clip_image006

Tools Options Settings

In addition to the general Text Editor settings, we offer Validation settings under the Advanced tab that you can change to suit your needs.

clip_image007

JSON content sniffing

When you double-click a file with an unknown extension in Solution Explorer, the JSON editor will scan through the content of the file and determine whether it’s indeed a JSON file. If so, the JSON editor will open the file and you will see all of the features mentioned above.

However, if the JSON editor detects more than 5 errors in the first 4 kilobytes of the content, it will quit and the file will be treated the same way as any other unknown file extension: it will be opened in whatever editor is set as the default.

There will be additional features in the JSON editor in subsequent releases and we will blog more about them when they become available.

Announcing the release of Dynamic Data provider and EntityDataSource control for Entity Framework 6

Today, we are pleased to announce the RTM of the ASP.NET Dynamic Data provider and the EntityDataSource control for Entity Framework 6.

What’s in this release

- Dynamic Data provider for Entity Framework 6

- EntityDataSource control for Entity Framework 6

How to install

You can download this release for ASP.NET DynamicData.EFProvider (http://www.nuget.org/packages/Microsoft.AspNet.DynamicData.EFProvider/) and EntityDataSource (http://www.nuget.org/packages/Microsoft.AspNet.EntityDataSource/) from the NuGet gallery.

  • Install-Package Microsoft.AspNet.DynamicData.EFProvider -Version 6.0.0
  • Install-Package Microsoft.AspNet.EntityDataSource -Version 6.0.0

Getting started

Microsoft.AspNet.DynamicData.EFProvider

This package has a Dynamic Data EFProvider for Entity Framework 6. This provider can work with a model (either Code First or Model First) which was created using Entity Framework 6. This package also installs the Page Templates, Entity Templates and Field Templates which are required for Dynamic Data. The templates have been updated to use the Microsoft.AspNet.EntityDataSource control, which we are also releasing today.

For more information on ASP.NET DynamicData please see http://msdn.microsoft.com/en-us/library/cc488545.aspx

Following are the steps for using this package in an ASP.NET Dynamic Data application:

    • Create a new ASP.NET Dynamic Data Entities Web Application
    • Add the Microsoft.AspNet.DynamicData.EFProvider NuGet package
    • This will do the following
      • Add a reference to the DynamicData EFProvider binary
      • Install the templates. If you are starting with a new project, then you can safely overwrite the templates. If you have an existing application, then you should be careful when overwriting your changes. These templates replace the EntityDataSource control which shipped in the .NET Framework with Microsoft.AspNet.EntityDataSource, and they also replace the Page Templates, Field Templates and Entity Templates.
    • Create your model using Entity Framework Code First or EF Designer.
    • Add the following code in RegisterRoutes in Global.asax.cs to register your DbContext:
    • Code Snippet
      1. DefaultModel.RegisterContext(
      2. new Microsoft.AspNet.DynamicData.ModelProviders.EFDataModelProvider(() => new YourDbContext()),
      3. new ContextConfiguration { ScaffoldAllTables = true });

  • Run the project
  • You should see all the tables listed on the Default page.

Microsoft.AspNet.EntityDataSource control

This is an update to the EntityDataSource control which shipped in the .NET Framework. The EntityDataSource control has been updated to work with Entity Framework 6.

To use this control, please do the following

    • Create an ASP.NET application
    • Install the package Microsoft.AspNet.EntityDataSource
      • This package will
        • Install the runtime binary for Microsoft.AspNet.EntityDataSource
        • Install the EntityFramework version 6 NuGet package
        • Add the following tag prefix in web.config
Code Snippet
  1. <pages>
  2.   <controls>
  3.     <add tagPrefix="ef" assembly="Microsoft.AspNet.EntityDataSource" namespace="Microsoft.AspNet.EntityDataSource" />
  4.   </controls>
  5. </pages>
  • Create a new Web Form page
  • Use the control as follows and bind it to any Databound control such as GridView, FormView etc.
  • Code Snippet
    1. <asp:GridView ID="GridView1" runat="server" DataSourceID="GridDataSource"></asp:GridView>
    2. <ef:EntityDataSource ID="GridDataSource" runat="server" EnableDelete="true" />

Give feedback

If you find any issues with this release, please file them at the Entity Framework CodePlex site: https://entityframework.codeplex.com

Thank you for trying out this release.

Tutorial Series for Entity Framework 6 MVC 5 Published

We have published the final 3 tutorials in the series on the Entity Framework 6 Code First / MVC 5. At the same time, we updated the first 9 tutorials, adding new material and responding to customer suggestions.

New Entity Framework 6 features covered in this series include:

  • Connection resiliency
  • Command interception
  • Code-based configuration
  • Async
  • Stored procedures

The 3 newly published tutorials are these:

The 9 that were published earlier and have now been updated are these:

The old EF 5 / MVC 4 series is still available at this URL:

- Tom Dykstra

- Rick Anderson

ASP.NET MVC 5 Lifecycle Document Published


Fresh out of the oven is a PDF document that charts the lifecycle of every ASP.NET MVC 5 application. Many of you have requested this document over the years and we're glad to finally put it in your hands now.

You will find the PDF document very similar in approach to the ASP.NET application lifecycle topic. It's a graphical representation of the ASP.NET MVC 5 lifecycle, but it also includes comprehensive details and explanations of the MVC pipeline that can help you make the right decisions about the what, when, and how of your ASP.NET MVC code development. Here's a preview:

When you view the PDF document, you can also jump to useful links by clicking one of the clickable elements. There is much more useful content out there, and we can only include one link for each piece of detail, but it is one way for us to recognize the priceless contributions the ASP.NET community has made to this cause over the years.

Please check it out, and send us your feedback! We (I) will definitely follow up and improve this document to keep it valuable to you.

-Cephas Lin


Getting started with ASP.NET Web API 2.2 for OData v4.0


A few weeks ago we started publishing nightly builds for our initial support in ASP.NET Web API for the OData v4.0 protocol. Our OData v4.0 support is based on the OData Library for OData v4.0 that was released in the past few months. The OData v4.0 protocol introduces a lot of changes and new features that allow more flexibility in the way services are modeled, as well as improvements over many features from past versions of the protocol.

In addition to this, the OData protocol has recently been ratified as an OASIS standard, which will help bolster the adoption of the protocol by many companies and services all over the internet. If you want to know more about OData, you can check the official site at www.odata.org, where you can find the complete specification of the protocol and its features, the different formats supported, and information about existing OData clients you can use in your apps. If you want to take a sneak peek at the new features and changes in the v4.0 version, you can do it here.

During the past few months, the Web API team has been working on the initial support for the v4.0 version. Many of the existing changes in the current nightly build deal with protocol and format changes from the v3.0 to the v4.0 version, but we have managed to add some interesting features to our current OData support. This list of features includes:

1. OData attribute routing: This feature allows you to define the routes in your controllers and actions using attributes.

2. Support for functions: This feature allows you to define functions in your OData model and bind them to actions in your controller that implement them.

3. Model aliasing: This feature allows you to change the names of the types and properties in your OData model to be different from the ones in your CLR types.

4. Support for limiting allowed queries: This feature allows the service to define limitations on the properties of the model that can be filtered, sorted, expanded or navigated across.

5. Support for ETags: This feature allows you to generate an @odata.etag annotation based on some properties of the entity, which can be used in IfMatch and IfNoneMatch headers in subsequent requests.

6. Support for Enums: We’ve improved our support for Enums and now we support them as OData enumerations.

7. Support for $format: We’ve also added support for $format, so clients are able to specify the desired format of the response in the URL.

Important changes in this version

The OData v4.0 protocol includes a lot of new features and many changes to existing ones that improve the protocol and the modeling capabilities for service implementers, but at the same time, those changes make it difficult to support multiple versions of the protocol in a single implementation.

For that reason, we have decided to create a new assembly to support the v4.0 version of the protocol while maintaining the current assembly for those people who want to implement services based on previous versions.

One of the important goals with this new implementation has been to support the side by side scenario where customers can have v3 and v4 services running on the same application. To that effect, we had to make some changes in the current naming of some classes and methods to allow for a reasonable user experience. Here are the most important changes:

1. The package ID for the v4.0 is Microsoft.AspNet.OData.

2. The assembly name and the root namespace are now System.Web.OData instead of System.Web.Http.OData.

3. All the extension methods have been moved to System.Web.OData.Extensions.

4. We have removed all the extension methods that used to exist for HttpRequestMessage like GetODataPath or GetEdmModel and we have added a single extension method, ODataProperties that returns an object containing the common OData properties that were accessible by the old extension methods, like the IEdmModel of the service or the ODataPath of the request.

5. MapODataRoute has been changed to MapODataServiceRoute.

6. QueryableAttribute has been changed to EnableQueryAttribute.

For the sake of consistency between versions, we have made the same set of changes in the Microsoft.AspNet.WebApi.OData package to achieve a similar development experience; only the namespace remains System.Web.Http.OData in this version. The current methods and class names can still be used with System.Web.Http.OData (OData v3.0), but we have marked them as obsolete, and they are not available in the new assembly.

Enough talking, let’s write an OData v4.0 service!

We’ll start our new OData v4.0 service by creating a simple web application that we’ll call ODataV4Service. We’ll choose the Web API template, which installs the default Web API packages required for our application.

Once the basic application has been created, the first thing we need to do is update the existing Web API packages to use the nightly versions hosted on MyGet. In order to do that, right-click on “References” in the project we have just created in Solution Explorer, click on “Manage NuGet Packages”, and expand the Updates section on the left.

image

Check that there is a source for WebStack Nightly; if not, add it by clicking the Settings button in the bottom-left corner of the window and adding the source in the window that appears, as shown in the following figure.

clip_image004

As you can see from the image, the URL for the nightly ASP.NET packages is http://www.myget.org/f/aspnetwebstacknightly/ and you can see all the different published packages on https://www.myget.org/gallery/aspnetwebstacknightly.

Now that we have set up our nightly package source, we can update the Web API packages. In order to do that, we need to select the Include Prerelease option in the dropdown menu at the top of the window. Then we just need to click Update All.

Before leaving the NuGet Package Manager we need to install the Web API 2.2 for OData v4.0 package. To do that, we expand the Online tab, select the WebStack Nightly source and the Include Prerelease option, and then search for Microsoft.AspNet.OData.

clip_image006

After installing this package, we can exit the Nuget Package Manager and try running our application by pressing F5. The default page should appear in the browser.

At this point we have our application running on the latest 5.2 assemblies and we are ready to create our OData service. The first step is to create a model; for that, we create a couple of C# classes representing entities, as follows:

public class Player
{
    public virtual int Id { get; set; }
    public virtual int TeamId { get; set; }
    public virtual string Name { get; set; }
}

public class Team
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
    public virtual double Rate { get; set; }
    public virtual int Version { get; set; }
    public virtual ICollection<Player> Players { get; set; }
    public Category Category { get; set; }
}

We are going to need some data, so we’ll use Entity Framework. To do that, we install the Entity Framework package from NuGet in the same way we did with the OData package, except this time we pick the nuget.org package source and a stable version of the package. Then we create a context and include an initializer to seed the database with some data, as shown here:

 

public class LeagueContext : DbContext
{
    public DbSet<Team> Teams { get; set; }
    public DbSet<Player> Players { get; set; }

    static LeagueContext()
    {
        Database.SetInitializer<LeagueContext>(new LeagueContextInitializer());
    }

    private class LeagueContextInitializer : DropCreateDatabaseAlways<LeagueContext>
    {
        protected override void Seed(LeagueContext context)
        {
            context.Teams.AddRange(Enumerable.Range(1, 30).Select(i => new Team
            {
                Id = i,
                Name = "Team " + i,
                Rate = i * Math.PI / 10,
                Players = Enumerable.Range(1, 11).Select(j => new Player
                {
                    Id = 11 * (i - 1) + j,
                    TeamId = i,
                    Name = string.Format("Team {0} Player {1}", i, j)
                }).ToList()
            }));
        }
    }
}

The next step is creating our OData model. We are going to create it in the WebApiConfig.cs file as the next figure shows:

 

public static IEdmModel GetModel()
{
    ODataModelBuilder builder = new ODataConventionModelBuilder();

    builder.EntitySet<Team>("Teams");
    builder.EntitySet<Player>("Players");

    return builder.GetEdmModel();
}

OData attribute routing

Now that we have created our model, we need to define the route for the OData service. We are going to use OData Attribute Routing to define the routes in our service. In order to do that, we need to open the WebApiConfig.cs file under our App_Start folder and add the System.Web.OData.Extensions and System.Web.OData.Routing namespaces to the list of our usings. Then, we need to modify our Register method to add the following lines:

ODataRoute route = config.Routes.MapODataServiceRoute("odata", "odata", GetModel());
route.MapODataRouteAttributes(config);
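For context, here is a minimal sketch of how the Register method in WebApiConfig.cs might look after this change. The GetModel method is the one shown earlier; the namespaces are the ones mentioned above plus the model builder namespace, so treat the exact usings as assumptions about this nightly build.

using System.Web.Http;
using System.Web.OData.Builder;
using System.Web.OData.Extensions;
using System.Web.OData.Routing;
using Microsoft.OData.Edm;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Map the OData endpoint at /odata and enable OData attribute routing for it.
        ODataRoute route = config.Routes.MapODataServiceRoute("odata", "odata", GetModel());
        route.MapODataRouteAttributes(config);
    }

    public static IEdmModel GetModel()
    {
        // Same convention-based model as shown earlier in this post.
        ODataModelBuilder builder = new ODataConventionModelBuilder();
        builder.EntitySet<Team>("Teams");
        builder.EntitySet<Player>("Players");
        return builder.GetEdmModel();
    }
}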

At this point we have successfully configured our OData service, but we haven’t yet defined any controller to handle the incoming requests. Ideally we would use scaffolding for this, but we are still working on getting the OData v4.0 scaffolders ready for preview (the existing scaffolders only support OData v3.0 services). So we have to create our controllers by hand, but we’ll see that with attribute routing it’s not difficult at all.

In previous versions of our Web API OData support, we had very tight restrictions on the names of controllers, actions, and even the parameter names of our actions. With attribute routing, all those restrictions go away. We can define a controller or an action using whatever name we want, as the following fragment of code shows:

 

[ODataRoutePrefix("Teams")]
public class TeamsEntitySetController : ODataController
{
    private readonly LeagueContext _league = new LeagueContext();

    [EnableQuery]
    [ODataRoute]
    public IHttpActionResult GetFeed()
    {
        return Ok(_league.Teams);
    }

    [ODataRoute("({id})")]
    [EnableQuery]
    public IHttpActionResult GetEntity(int id)
    {
        return Ok(SingleResult.Create<Team>(_league.Teams.Where(t => t.Id == id)));
    }
}

As we can see in the code above, we can use ODataRoutePrefixAttribute to specify a prefix for all the routes in the actions on the controller, and we can use ODataRouteAttribute to specify further segments that will get combined with the ones in the prefix. That way, the GetFeed action represents the route /Teams and the GetEntity action represents routes like /Teams(1), /Teams(2), etc.

Support for Functions

Now that we have a basic service up and running, we are going to introduce some business logic. For that, we are going to define a function that will give us the teams whose rating is around a certain threshold with a given tolerance.

Obviously, we could achieve the same result with a query, but in that case, the clients of our service are the ones responsible for defining the query and might make mistakes. However, if we give them a function, they only need to care about sending the right parameters.

In order to define a function that represents the business logic that we have specified, we can modify our GetModel function as follows:

 

public static IEdmModel GetModel()
{
    ODataModelBuilder builder = new ODataConventionModelBuilder();
    EntitySetConfiguration<Team> teams = builder.EntitySet<Team>("Teams");
    builder.EntitySet<Player>("Players");

    FunctionConfiguration withScore = teams
        .EntityType
        .Collection
        .Function("WithScore");
    withScore.Parameter<double>("threshold");
    withScore.Parameter<double>("tolerance");
    withScore.ReturnsCollectionFromEntitySet<Team>("Teams");

    return builder.GetEdmModel();
}

Functions can be defined at the service level (unbound), at the collection level (bound to the collection) or at the entity level (bound to the entity). In this case, we have defined a function bound to the collection, but similar methods exist on the ODataModelBuilder class (to define service-level functions) and on the EntityConfiguration class (to define entity-level functions).

Now, the last step is to define an action that implements the function; to do that, we are going to take advantage of attribute routing. The action below shows the implementation:

 

[ODataRoute("Default.WithScore(threshold={threshold},tolerance={tolerance})")]
[EnableQuery]
public IHttpActionResult GetTeamsWithScore(double threshold, double tolerance)
{
    return Ok(_league.Teams.Where(t =>
        (t.Rate < (threshold + tolerance)) &&
        (t.Rate > (threshold - tolerance))));
}

As you can see, the way we call the function is by using its fully qualified name after the entity set on which we want to call it. We use attribute routing to define the parameters of the function and bind them to the parameters of the action in a very elegant way. In this case a sample call to the function would use the following URL: /odata/Teams/Default.WithScore(threshold=3, tolerance=2)

Important note: If you try this in IIS, you’ll probably get a 404 response. This is because IIS doesn’t like the dot in the last segment of the URL (IIS thinks it’s a file). One possible way to fix this is to add a piece of configuration to your web.config to ensure IIS runs the routing module on all requests.

<system.webServer>
  <modules runAllManagedModulesForAllRequests="true"></modules>
</system.webServer>

Model aliasing

So far we’ve seen attribute routing and functions; now we are going to show another very interesting feature: model aliasing. Many times we want to expose some data from our domain, but we want to change things like the names of the domain entities or the names of some properties. In order to do that, we can use model aliasing.

There are two ways to configure model aliasing in our model: we can do it directly through the model builder, by setting the Name property of the types and of their properties, or we can annotate our types with the DataContract and DataMember attributes. For example, we can change our model using data contracts in the following way:

[DataContract(Name = "Member")]
public class Player
{
    [DataMember]
    public virtual int Id { get; set; }

    [DataMember(Name = "Team")]
    public virtual int TeamId { get; set; }

    [DataMember]
    public virtual string Name { get; set; }
}

Support for limiting the set of allowed queries

As we said above, query limitations allow a service to limit the types of queries that users can issue to our service by imposing limitations on the properties of the types of the model. A service can decide to limit the ability to sort, filter, expand or navigate any property of any type on the model.

In order to do that, there are two options: we can use attributes like Unsortable, NonFilterable, NotExpandable or NotNavigable on the properties of the types in our model, or we can configure this explicitly in the model builder. In this case, we’ll do it through attributes.

 

public class Team
{
    public virtual int Id { get; set; }

    [Unsortable]
    public virtual string Name { get; set; }

    [NonFilterable]
    public virtual double Rate { get; set; }

    [NotExpandable]
    [NotNavigable]
    public virtual ICollection<Player> Players { get; set; }
}

The meaning of Unsortable, NonFilterable and NotExpandable is self-explanatory; NotNavigable is a shortcut for specifying that a property is both Unsortable and NonFilterable. When a client issues a query that involves a limited property, the server answers with a 400 status code and indicates the limited property that caused the request to fail.

Support for ETags

The next feature we are going to see is ETags. This feature allows a service to define what fields of an entity are part of the concurrency check for the entity. Those fields will be used to generate an @odata.etag annotation that will be sent to the clients when returning the entity, either as part of a feed or just the single entity.

The client can use this ETag value in the If-Match and If-None-Match headers to implement optimistic concurrency updates and efficient resource caching. In order to mark a field as part of the concurrency check, we can use the ConcurrencyCheck attribute or the Timestamp attribute.

It’s important to note that we should use one or the other, but not both at the same time. The difference is that ConcurrencyCheck is applicable to multiple fields of the entity, while Timestamp is meant to be applied to a single field.

The individual properties can also be marked as part of the concurrency check explicitly using the model builder. In this case, we’ll do it through attributes. For example, we have modified the Team entity to add a Version property and mark it as part of the ETag for the entity. The result is shown in the next figure:

public class Team
{
    public virtual int Id { get; set; }

    [Unsortable]
    public virtual string Name { get; set; }

    [NonFilterable]
    public virtual double Rate { get; set; }

    [ConcurrencyCheck]
    public int Version { get; set; }

    [NotExpandable]
    [NotNavigable]
    public virtual ICollection<Player> Players { get; set; }
}

Now, we will serialize the ETag of the entity when we retrieve it through a GET request, but we still need to take advantage of the ETag on the actions of our service. In order to do that, we are going to add a Put action and we’ll bind ODataQueryOptions<Team> in order to use the ETag.

[ODataRoute("({id})")]
public IHttpActionResult Put(int id, Team team, ODataQueryOptions<Team> options)
{
    if (!ModelState.IsValid)
    {
        return BadRequest(ModelState);
    }

    if (id != team.Id)
    {
        return BadRequest("The key on the team must match the key on the url");
    }

    if (options.IfMatch != null &&
        !(options.IfMatch.ApplyTo(_league.Teams.Where(t => t.Id == id)) as IQueryable<Team>).Any())
    {
        return StatusCode(HttpStatusCode.PreconditionFailed);
    }
    else
    {
        _league.Entry(team).State = EntityState.Modified;
        _league.SaveChanges();
        return Updated(team);
    }
}

As we can see, we can take advantage of the ETag by binding ODataQueryOptions as a parameter and using the IfMatch or IfNoneMatch properties in that object in order to apply the ETag value to a given query.

In the above example, we check whether an ETag was sent in the If-Match header and, if so, whether it matches the current value of the Team identified by the URL; if it does not match, we return a Precondition Failed status.

Support for Enums

We already had support for Enums in Web API OData v3.0 by serializing them as strings, but the new version of the protocol has added full support for them, so we have upgraded our Enum support accordingly. In order to use Enums you just need to define a property with an Enum type: we’ll represent it as an Enum in the $metadata, and clients will be able to use Enum query operators in $filter clauses. There are also specific overloads on the model builder in case we want to configure the enumeration explicitly. Defining an OData Enum property in your type is as simple as this:

public enum Category
{
    Amateur,
    Professional
}

public class Team
{
    public virtual int Id { get; set; }

    [Unsortable]
    public virtual string Name { get; set; }

    [NonFilterable]
    public virtual double Rate { get; set; }

    [ConcurrencyCheck]
    public virtual int Version { get; set; }

    [NotExpandable]
    [NotNavigable]
    public virtual ICollection<Player> Players { get; set; }

    public Category Category { get; set; }
}

Support for $format

This feature allows a client to specify the format they want in the query string of the URL, bypassing any value set by the Accept header. For example, the user can issue queries like this one to get all the metadata in the response, instead of just the minimal amount (which is the default):

http://localhost:12345/odata/Teams?$format=application/json;odata.metadata=full

The above query uses a MIME media type and includes parameters in order to ask for a specific JSON version.

http://localhost:12345/odata/Teams?$format=json

The above query uses an alias to refer to a specific MIME media type, application/json, which in the case of OData is equivalent to application/json;odata.metadata=minimal.
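As a quick illustration (this is not part of the original walkthrough), a client can exercise $format with nothing more than HttpClient; the port number below is whatever IIS Express assigned to the project.

using System;
using System.Net.Http;

class FormatExample
{
    static void Main()
    {
        using (var client = new HttpClient())
        {
            // Ask for the full metadata level via $format instead of the Accept header.
            string feed = client.GetStringAsync(
                "http://localhost:12345/odata/Teams?$format=application/json;odata.metadata=full").Result;
            Console.WriteLine(feed);
        }
    }
}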

Using the .NET OData client to query the v4.0 service

The OData client for .NET was released this week; the following blog post contains instructions on how to use it to generate a client that can be used to query Web API for OData v4.0 services.

Note: If you plan to use $batch, it won’t work properly with the client that gets generated by default. This is because we are still using the beta version of ODataLib (we plan to update to the RTM version in the near future), while the client uses the RTM version of ODataLib. To work around this issue, you can do the following:

Open the NuGet Package Manager Console and downgrade the OData Client package to beta1:

1. Uninstall-package Microsoft.OData.Client -RemoveDependencies -project <ProjectName>

2. Install-package Microsoft.OData.Client -version 6.0.0-beta1 -pre -project <ProjectName>

Perform the following changes on the T4 template mentioned on the blog:

1. Replace Microsoft.OData.Client.Key with Microsoft.OData.Service.Common.DataServiceKeyAttribute

2. Replace Microsoft.OData.Client.ODataProtocolVersion with Microsoft.OData.Service.Common.DataServiceProtocolVersion

Samples

Along with the nightly build, we have published samples of the new features, as well as new samples showing off common scenarios that target OData v4, on the ASP.NET CodePlex site.

Conclusion

We have started publishing nightly builds for our OData v4.0 support, which is built on top of the ODataLib support that has already shipped. Our nightly builds can be found at http://www.myget.org/f/aspnetwebstacknightly. The package ID to find them is Microsoft.AspNet.OData (remember to select Include Prerelease in the NuGet Package Manager).

We’ve also seen a brief introduction to all the new features in this version, like OData attribute routing, functions, model aliasing, query limitations, ETags, enums and $format. You can find samples for all the new features in https://aspnet.codeplex.com/SourceControl/latest#Samples/WebApi/OData/v4/.

Enjoy!

Announcing RTM of ASP.NET Identity 2.0.0


Today, we are releasing the final version of ASP.NET Identity 2.0. The main focus in this release was to add security and account management features as well as address feedback from the community.

Download this release

You can download ASP.NET Identity from the NuGet gallery. You can install or update to these packages through NuGet using the NuGet Package Manager Console, like this:

What’s in this release?

Following is the list of features and major issues that were fixed in 2.0.0.

Two-Factor Authentication

ASP.NET Identity now supports two-factor authentication. Two-factor authentication provides an extra layer of security for your user accounts in case your password gets compromised. Most websites protect their data by having a user create an account on the website with a username and password. Passwords are not very secure, and sometimes users choose weak passwords, which can lead to user accounts being compromised.

SMS is the preferred way of sending codes, but you can also use email in case the user does not have access to their phone. You can also extend the system and write your own providers, such as QR code generators, and use authenticator apps on phones to validate the codes.

There is also protection against brute force attacks on the two-factor codes. If a user enters incorrect codes a specified number of times, the user account is locked out for a specified amount of time. These values are configurable.
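As a rough sketch of the wiring, the providers and message services are registered on the user manager (typically in ApplicationUserManager.Create in the sample project); the ApplicationUser, SmsService and EmailService names come from the samples, so treat them as assumptions.

// Register the two-factor providers; the format strings are placeholders.
manager.RegisterTwoFactorProvider("PhoneCode",
    new PhoneNumberTokenProvider<ApplicationUser>
    {
        MessageFormat = "Your security code is: {0}"
    });
manager.RegisterTwoFactorProvider("EmailCode",
    new EmailTokenProvider<ApplicationUser>
    {
        Subject = "Security Code",
        BodyFormat = "Your security code is: {0}"
    });

// SmsService and EmailService are your own IIdentityMessageService implementations.
manager.SmsService = new SmsService();
manager.EmailService = new EmailService();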

To try out this feature, you can install the ASP.NET Identity Samples NuGet package (in an empty ASP.NET app) and follow the steps to configure and run the project.

Account Lockout

Account Lockout provides a way to lock out a user if the user enters their password or two-factor codes incorrectly. The number of invalid attempts and the timespan for which users are locked out can be configured. A developer can optionally turn off Account Lockout for certain user accounts should they need to.
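A minimal sketch of the configuration, again on the user manager from the samples (the property names below are the ones exposed by the 2.0.0 UserManager):

// Enable lockout for newly created users and define the lockout policy.
manager.UserLockoutEnabledByDefault = true;
manager.DefaultAccountLockoutTimeSpan = TimeSpan.FromMinutes(5);
manager.MaxFailedAccessAttemptsBeforeLockout = 5;

// Lockout can also be turned off for an individual account if needed:
// await manager.SetLockoutEnabledAsync(userId, false);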

Account Confirmation

The ASP.NET Identity system now supports Account Confirmation by confirming the email of the user. This is a fairly common scenario in most websites today: when you register for a new account on the website, you are required to confirm your email before you can do anything on the site. Email confirmation is useful because it prevents bogus accounts from being created. It is extremely useful if you are using email as a method of communicating with the users of your website, such as forum, banking, ecommerce, and social web sites.

Note: To send emails you can configure an SMTP server, or use one of the popular email services such as SendGrid (http://sendgrid.com/windowsazure.html), which integrates nicely with Windows Azure and requires no configuration by the application developer.

In the sample project below, you need to hook up the email service for sending emails. You will not be able to reset your password until you confirm your account.
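A rough sketch of the flow inside the template's AccountController is shown below; the action names and the email wiring follow the sample project, so treat them as assumptions.

// In Register, after UserManager.CreateAsync succeeds:
string code = await UserManager.GenerateEmailConfirmationTokenAsync(user.Id);
var callbackUrl = Url.Action("ConfirmEmail", "Account",
    new { userId = user.Id, code = code }, protocol: Request.Url.Scheme);
await UserManager.SendEmailAsync(user.Id, "Confirm your account",
    "Please confirm your account by clicking this link: " + callbackUrl);

// In the ConfirmEmail action, consume the token:
IdentityResult result = await UserManager.ConfirmEmailAsync(userId, code);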

Password Reset

Password Reset is a feature that lets users reset their password if they have forgotten it.
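The two halves of the flow look roughly like this, using the UserManager APIs from this release (the surrounding controller code is assumed):

// ForgotPassword: generate a reset token and mail it to the user.
string code = await UserManager.GeneratePasswordResetTokenAsync(user.Id);
await UserManager.SendEmailAsync(user.Id, "Reset Password",
    "Please reset your password by using this code: " + code);

// ResetPassword: consume the token together with the new password.
IdentityResult result = await UserManager.ResetPasswordAsync(user.Id, code, newPassword);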

Security Stamp (Sign out everywhere)

ASP.NET Identity supports regenerating the security stamp for the user when the user changes their password or any other security-related information, such as removing an associated login (Facebook, Google, Microsoft account, etc.). This is needed to ensure that any tokens (cookies) generated with the old password are invalidated. In the sample project, if you change the user's password, then a new token is generated for the user and any previous tokens are invalidated.

This feature provides an extra layer of security for your application: when you change your password, you will be logged out everywhere you have logged into this application. You can also extend this to sign out from all the places you have logged in from. This sample shows how to do it.

You can configure this in Startup.Auth.cs by registering a CookieAuthenticationProvider as follows.

Code Snippet
  1. app.UseCookieAuthentication(new CookieAuthenticationOptions {
  2.     AuthenticationType = DefaultAuthenticationTypes.ApplicationCookie,
  3.     LoginPath = new PathString("/Account/Login"),
  4.     Provider = new CookieAuthenticationProvider {
  5.         // Enables the application to validate the security stamp when the user logs in.
  6.         // This is a security feature which is used when you change a password or add an external login to your account.
  7.         OnValidateIdentity = SecurityStampValidator.OnValidateIdentity<ApplicationUserManager, ApplicationUser>(
  8.             validateInterval: TimeSpan.FromMinutes(30),
  9.             regenerateIdentity: (manager, user) => user.GenerateUserIdentityAsync(manager))
  10.     }
  11. });
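Password changes update the security stamp for you; if you add your own security-sensitive operation, a sketch of forcing older cookies to fail validation looks like this (UpdateSecurityStampAsync is part of the 2.0.0 UserManager):

// Regenerate the security stamp so cookies issued before this point fail validation
// the next time SecurityStampValidator runs (at most validateInterval later).
await UserManager.UpdateSecurityStampAsync(userId);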

Make the type of Primary Key be extensible for Users and Roles

In 1.0, the type of the PK for Users and Roles was string. This means that when the ASP.NET Identity system was persisted in SQL Server using Entity Framework, we were using nvarchar. There were lots of discussions around this default implementation on Stack Overflow, and based on the incoming feedback we have provided an extensibility hook where you can specify what the PK of your Users and Roles tables should be. This extensibility hook is particularly useful if you are migrating your application and the application was storing UserIds as GUIDs or ints.

Since you are changing the type of the PK for Users and Roles, you need to plug in the corresponding classes for Claims and Logins, which take in the correct PK. Following is a snippet of code which shows how you can change the PK to be an int.

For a full working sample please see https://aspnet.codeplex.com/SourceControl/latest#Samples/Identity/ChangePK/readme.txt

 

Code Snippet
  1. public class ApplicationUser : IdentityUser<int, CustomUserLogin, CustomUserRole, CustomUserClaim>
  2. {
  3. }
  4.  
  5. public class CustomRole : IdentityRole<int, CustomUserRole>
  6. {
  7.     public CustomRole() { }
  8.     public CustomRole(string name) { Name = name; }
  9. }
  10.  
  11. public class CustomUserRole : IdentityUserRole<int> { }
  12. public class CustomUserClaim : IdentityUserClaim<int> { }
  13. public class CustomUserLogin : IdentityUserLogin<int> { }
  14.  
  15. public class ApplicationDbContext : IdentityDbContext<ApplicationUser, CustomRole, int, CustomUserLogin, CustomUserRole, CustomUserClaim>
  16. {
  17. }

Support IQueryable on Users and Roles

We have added support for IQueryable on UsersStore and RolesStore so you can easily get the list of Users and Roles.

For example, the following code uses IQueryable to get the list of users from the UserManager. You can do the same to get the list of roles from the RoleManager, as shown in the sketch after the snippet.

 

Code Snippet
  1. // GET: /Users/
  2. public async Task<ActionResult> Index()
  3. {
  4.     return View(await UserManager.Users.ToListAsync());
  5. }
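The equivalent sketch for roles, assuming a RoleManager property is exposed in the same way as UserManager in the sample:

// GET: /Roles/
public async Task<ActionResult> Index()
{
    return View(await RoleManager.Roles.ToListAsync());
}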

Delete User account

In 1.0, if you had to delete a user, you could not do it through the UserManager. We have fixed this issue in this release, so you can do the following to delete a user:

Code Snippet
  1. var result = await UserManager.DeleteAsync(user);

IdentityFactory Middleware/ CreatePerOwinContext

UserManager

You can use the factory implementation to get an instance of UserManager from the OWIN context. This pattern is similar to what we use for getting the AuthenticationManager from the OWIN context for sign-in and sign-out. This is the recommended way of getting an instance of UserManager per request for the application.

The following snippet of code shows how you can configure this middleware in StartupAuth.cs. This is in the sample project listed below.

 

Code Snippet
  1.  
  2. app.CreatePerOwinContext<ApplicationUserManager>(ApplicationUserManager.Create);

The following snippet of code shows how you can get an instance of UserManager:

Code Snippet
  1. HttpContext.GetOwinContext().GetUserManager<ApplicationUserManager>();

DbContext

ASP.NET Identity uses Entity Framework for persisting the Identity system in SQL Server. To do this, the Identity system has a reference to the ApplicationDbContext. The DbContextFactory middleware returns an instance of the ApplicationDbContext per request, which you can use in your application.

The following code shows how you can configure it in StartupAuth.cs. The code for this middleware is in the sample project.

Code Snippet
  1. app.CreatePerOwinContext(ApplicationDbContext.Create);

Indexing on Username

In the ASP.NET Identity Entity Framework implementation, we have added a unique index on the username using the new IndexAttribute in EF 6.1.0. We did this to ensure that usernames are always unique and to eliminate the race condition in which you could end up with duplicate usernames.

Enhanced Password Validator

The password validator that shipped in ASP.NET Identity 1.0 was fairly basic and validated only the minimum length. The new password validator gives you more control over the complexity of the password. Please note that even if you turn on all the settings in this password validator, we still encourage you to enable two-factor authentication for user accounts.

Code Snippet
  1. // Configure validation logic for passwords
  2. manager.PasswordValidator = new PasswordValidator
  3. {
  4.     RequiredLength = 6,
  5.     RequireNonLetterOrDigit = true,
  6.     RequireDigit = true,
  7.     RequireLowercase = true,
  8.     RequireUppercase = true,
  9. };

You can also add Password policies as per your own requirements. The following sample shows you how you can extend Identity for this scenario. https://aspnet.codeplex.com/SourceControl/latest#Samples/Identity/Identity-PasswordPolicy/Identity-PasswordPolicy/Readme.txt

ASP.NET Identity Samples NuGet package

We are releasing a Samples NuGet package to make it easier to install samples for ASP.NET Identity. This is a sample ASP.NET MVC application. Please modify the code to suit your application before you deploy this in production. The sample should be installed in an Empty ASP.NET application.

Following are the features in this samples package

    • Initialize ASP.NET Identity to create an Admin user and Admin role.
      • Since ASP.NET Identity is Entity Framework based in this sample, you can use the existing methods of initializing the database as you would have done in EF.
    • Configure user and password validation.
    • Register a user and login using username and password
    • Login using a social account such as Facebook, Twitter, Google, Microsoft account etc.
    • Basic User management
      • Do Create, Update, List and Delete Users. Assign a Role to a new user.
    • Basic Role management
      • Do Create, Update, List and Delete Roles.
    • Account Confirmation by confirming email.
    • Password Reset
    • Two-Factor authentication
    • Account Lockout
    • Security Stamp (Sign out everywhere)
    • Configure the Db context, UserManager and RoleManager  using IdentityFactory Middleware/ PerOwinContext.
    • The AccountController has been split into Account and Manage controller. This was done to simplify the account management code.

The sample is still in preview since we are still improving it and fixing issues, but it is in a state where you can easily see how to add ASP.NET Identity features to an application.

Entity Framework 6.1.0

ASP.NET Identity 2.0.0 depends upon Entity Framework 6.1.0 which was also released earlier in the week. For more details please read this announcement post.

List of bugs fixed

You can look at all the bugs that were fixed in this release by clicking this link.

Samples/ Documentation

Known Issues/ Change list

Migrating from ASP.NET Identity 1.0 to 2.0.0

If you are migrating from ASP.NET Identity 1.0 to 2.0.0, then please refer to this article on how you can use Entity Framework Code First migrations to migrate your database http://blogs.msdn.com/b/webdev/archive/2013/12/20/updating-asp-net-applications-from-asp-net-identity-1-0-to-2-0-0-alpha1.aspx

This article is based on migrating to ASP.NET Identity 2.0.0-alpha1, but the same steps apply to ASP.NET Identity 2.0.0.

Following are some changes to be aware of while migrating

    • The migration adds the missing columns to the AspNetUsers table. One of the columns is ‘LockoutEnabled’, which is set to false by default. This means that Account Lockout will not be enabled for existing user accounts. To enable Account Lockout for existing users, set the column to true by specifying ‘defaultValue: true’ in the migration code.
    • In Identity 2.0 we changed the IdentityDbContext to handle generic user types differently. You will not see the discriminator column, because the IdentityDbContext now works with ‘ApplicationUser’ instead of the generic ‘IdentityUser’. Apps that have more than one type deriving from IdentityUser need to change their DbContext to call out all the derived classes explicitly. For example:
Code Snippet
  1. public class ApplicationDbContext : IdentityDbContext<IdentityUser>
  2. {
  3.     public ApplicationDbContext()
  4.         : base("DefaultConnection", false)
  5.     {
  6.     }
  7.  
  8.     protected override void OnModelCreating(System.Data.Entity.DbModelBuilder modelBuilder)
  9.     {
  10.         base.OnModelCreating(modelBuilder);
  11.         modelBuilder.Entity<ApplicationUser>();
  12.         modelBuilder.Entity<FooUser>();
  13.     }
  14. }

Migrating from ASP.NET Identity 2.0.0-Beta1 to 2.0.0

Following are the changes you will have to make to your application if you are upgrading from 2.0.0-Beta1 to 2.0.0 of Identity.

    • We have added the Account Lockout feature, which is new in 2.0.0 RTM.
    • The GenerateTwoFactorAuthAsync method only generates the two-factor code. Users need to explicitly call ‘NotifyTwoFactorTokenAsync’ to send the code.
    • While migrating data, the EF migrations may add ‘CreateIndex’ for existing indices.

Give feedback and get support

    • If you find any bugs please open them at our Codeplex Site where we track all our bugs https://aspnetidentity.codeplex.com/
    • If you want to discuss these features or have questions, please discuss them on Stack Overflow and use the following tag “asp.net-identity”

Thank you for trying out the previews and for your feedback on ASP.NET Identity. Please continue to let us know what you think.

Announcing 0.2.0-alpha2 preview of Windows Azure WebJobs SDK


We are releasing an update to the Windows Azure WebJobs SDK, which was introduced by Scott Hanselman here.

Download this release

You can download the WebJobs SDK into a console application project from the NuGet gallery. You can install or update to these packages using the NuGet Package Manager Console, like this:

What is WebJobs SDK?

The WebJobs feature of Windows Azure Web Sites provides an easy way for you to run programs such as services or background tasks in a web site. You can upload an executable file such as an .exe, .cmd, or .bat file to your web site and run it. You can run these as triggered or continuous WebJobs. Without the WebJobs SDK, connecting and running background tasks requires a lot of complex programming. The SDK provides a framework that lets you write a minimum amount of code to get common tasks done.

The WebJobs SDK has a binding and trigger system which works with Windows Azure Storage Blobs, Queues and Tables. The binding system makes it easy to write code that reads or writes Windows Azure Storage objects. The trigger system calls a function in your code whenever any new data is received in a queue or blob.

The WebJobs SDK includes the following components:

    • NuGet packages. The NuGet packages enable your code to use the WebJobs SDK binding and trigger system with Windows Azure Storage tables, blobs and queues.
    • Dashboard. Part of the WebJobs SDK is already installed in Windows Azure Web Sites and provides rich monitoring and diagnostics for the programs that you write by using the NuGet packages. You don't have to write code to use these monitoring and diagnostics features.

Scenarios for WebJobs SDK

Here are some typical scenarios you can handle more easily with the Windows Azure WebJobs SDK:

    • Image processing or other CPU-intensive work. A common feature of web sites is the ability to upload images or videos. Often you want to manipulate the content after it's uploaded
    • Other long-running tasks that you want to run in a background thread, such as sending emails. Until now you couldn't do this in ASP.NET because IIS would recycle your app if your app was idle for some time. Now with AlwaysOn in Windows Azure Web Sites you can keep the web site from being recycled when the app is idle. AlwaysOn ensures that the site does not go to sleep, which means you can run long-running tasks or services using WebJobs and the WebJobs SDK.
    • Queue processing. A common way for a web frontend to communicate with a backend service is to use queues. When the web site needs to get work done, it pushes a message onto a queue. A backend service pulls messages from the queue and does the work. This is a common producer – consumer pattern.
    • RSS aggregation. If you have a site that maintains a list of RSS feeds, you could pull in all of the articles from the feeds in a background process.
    • File maintenance, such as aggregating or cleaning up log files.  You might have log files being created by several sites and you want to do analysis on them. Or you might want to schedule a task to run weekly to clean up old log files.

Goals of the SDK

    • Provide a way to make it easier to use Windows Azure Storage when doing any background processing work.
      • The SDK makes it easier to consume Azure Storage within your application. You do not have to deal with writing code to read/ write from storage.
    • Provide a rich diagnostics and monitoring experience without requiring the user to write any diagnostics and logging code.

Features of the SDK

Azure Storage

The SDK works with Azure Blobs, Queues and Tables.

Triggers

Functions get executed when a new input is detected on a Queue or a Blob. For example, in the following code the ProcessQueue function will be triggered when a new message arrives on a queue called “longqueue”. For more details on triggers please see this post.

Code Snippet
  1. public static void ProcessQueue([QueueInput("longqueue")] string output)
  2. {
  3.     Console.WriteLine(output);
  4. }

 

Bindings

The SDK provides model binding between C# primitive types and Azure storage objects like Blobs, Tables, and Queues. This makes it easy for a developer to read from and write to Blobs, Tables and Queues, because they do not have to write the plumbing code for reading from and writing to Azure Storage.

  • Convenience. You can pick the type that’s most useful for you to consume and the WebJobs SDK will take care of the glue code. If you’re doing string operations on a blob, you can bind directly to TextReader/TextWriter, rather than worry about how to convert to a TextWriter.
  • Flushing and Closing: The WebJobs SDK will automatically flush and close outstanding outputs.
  • Unit testability. The SDK makes it possible to unit test your code since you can mock primitive types like TextWriter rather than ICloudBlob.
  • Diagnostics.  Model binding works with the dashboard to give you real time diagnostics on your parameter usage.

The following Bindings are currently supported: Stream, TextReader/Writer, and String. You can add support for binding to your custom types and other types from the Storage SDK as well.

For more details on how bindings work against Azure Storage, please read Blobs, Queues and Tables. A sketch of a blob-bound function is shown below.
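As an illustration, here is a rough sketch of a blob-bound function, assuming the BlobInput/BlobOutput attributes available in this preview of the SDK; the container names are placeholders.

// Triggered when a new blob appears in the "input" container; the SDK creates,
// flushes and closes the output blob when the function returns.
// Requires using System.IO and the Microsoft.WindowsAzure.Jobs namespace.
public static void CopyBlob(
    [BlobInput("input/{name}")] Stream input,
    [BlobOutput("output/{name}")] Stream output)
{
    input.CopyTo(output);
}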

Hosting

A JobHost is an execution container which knows about all the functions in your program. A JobHost object (which lives in Microsoft.WindowsAzure.Jobs.Host) reads the bindings, listens on the triggers, and invokes the functions. In the following example, you create an instance of JobHost and call RunAndBlock(), which causes the JobHost to listen for triggers on any functions that you define in this host.

Code Snippet
  1. static void Main(string[] args)
  2. {
  3.     JobHost host = new JobHost();
  4.     host.RunAndBlock();
  5. }

Dashboard for monitoring WebJobs.

As WebJobs (written in any language and of any type) execute, you can monitor them in real time. You can see their state (Running, Stopped, Successfully completed), last run time and the logs of a particular execution. The following screenshot shows you a view of all WebJobs running in your Website.

image

When you write a WebJob using the SDK, you get a diagnostics and monitoring experience for the functions in your program. For example, let’s say you have an image-processing WebJob called “ImageResizeAndWaterMark” that has the following flow.

When a user uploads an image to a Blob container called “images1-input”, the SDK will trigger the WaterMark function. WaterMark will process the image and write it to the “images2-input” container, which will trigger the Resize function. The Resize function will resize the image and write it to the “images2-output” Blob container. The following code shows the WebJob described above. For a full working sample, please see the sample here.

image

When you run the WebJob in Azure, you can view the WebJobs Dashboard by clicking the logs link of the “ImageResizeAndWaterMark” WebJob in the WEBJOBS tab of the Windows Azure Web Sites portal.

image

Since the Dashboard is a site extension, you can access it by going to the URL https://mysite.scm.azurewebsites.net/azurejobs. You will need your deployment credentials to access the site extension. For more information on accessing site extensions, see the documentation on the Kudu project: https://github.com/projectkudu/kudu/wiki/Accessing-the-kudu-service

Function execution details

When you are monitoring a particular execution of this “ImageResizeAndWaterMark” WebJob, you can view invocation details about the functions in the program such as:

    • What are the parameters of this function?
    • How much time did it take for the function to execute?
    • How much time did it take to read from the Blob, and how many bytes were read/written?

image

Invoke & Replay

In the above example, if the WaterMark function fails for some reason, you can upload a new image and replay the WaterMark function, which will trigger the execution chain and call the Resize function as well. This is useful for diagnosing and debugging an issue when you have a complicated graph of chained functions. You can also invoke a function from the dashboard.

Causality of functions

In the above example, we know that when the WaterMark function writes to a Blob, it will trigger the Resize function. The dashboard will show this causality between functions. If you have chained lots of functions which will get triggered as new inputs are detected then it can be useful to see this causality graph.

Search Blobs

You can click on Search for a Blob and get information on what happened to that Blob. For example, in the case of the ImageResizeAndWaterMark, the Blob was written because the WaterMark function got executed. For more details on Search Blobs see this post.

Samples

Samples for WebJobs SDK can be found at https://aspnet.codeplex.com/SourceControl/latest#Samples/AzureWebJobs/ReadMe.txt

    • You can find samples on how to use triggers and bindings for Blobs, Tables and Queues.
    • There is a sample called PhluffySuffy which is an Image processing Website where a customer can upload pictures which will trigger a function to process those pictures from Blob storage.

Documentation

Deploying WebJobs with SDK

If you don't want to use the WebJobs portal page to upload your scripts, you can use FTP, git, or Web Deploy. For more information, see How to deploy Windows Azure WebJobs and Git deploying a .NET console app to Azure using WebJobs

If you want to deploy your WebJobs along with your Websites, check out the following Visual Studio extension.

Known Issues from 0.1.0-alpha1 to 0.2.0-alpha2

Dashboard will only work for WebJobs deployed with 0.2.0-alpha2

If you had a WebJob deployed with 0.1.0-alpha1 of the SDK and you access the dashboard to see the logs for the WebJob, you will see a warning about “Host not running”. This happens because, as part of this release, a newer version of the dashboard gets deployed to all Azure Web Sites. The new dashboard has some protocol changes which are not compatible with 0.1.0-alpha1. To work around this error, please update your WebJob to use the 0.2.0-alpha2 NuGet package and redeploy your WebJob.

 

image

Give feedback and get help

The WebJobs feature of Windows Azure Web Sites and the Windows Azure WebJobs SDK are in preview and not formally supported. Feedback will be considered in changes we make to future versions.

If you have questions that are not directly related to the tutorial, you can post them to the Windows Azure forum, the ASP.NET forum, or StackOverflow.com. Use #AzureWebJobs for Twitter and the tag Azure-WebJobsSDK for StackOverflow.

OWIN security components in ASP.NET: OpenID Connect!


It’s been about a month since we released the first preview of the new claims-based identity programming model in ASP.NET. Yesterday we published a refresh of the preview with lots of improvements in WS-Federation support, and a brand-new feature: OpenID Connect!

Thanks for the Feedback

The new programming model was very well received, which makes us very happy; however, you were not shy about letting us know which features you wanted us to change and add. I would like to take this opportunity to thank Dominick and Poul for their deep involvement and great feedback!
Among the main points we heard:

  • Ensure that the new components are compatible with the Azure Active Directory OAuth bearer middleware
  • Maintain consistency with well-established conventions in the framework (e.g. support for SignInAsAuthenticationType)
  • Keep the layering which separates pure protocol concepts from the middleware proper

We managed to address all of the above, and more. Besides the inevitable bug fixing, we rearranged the validation pipeline to ensure that every stage receives the info it needs in the notifications; we improved error handling & sign out support; and we verified some notable composite scenarios. Then, we added support for an entirely new protocol.

OpenID Connect in ASP.NET and Azure AD!

By now you have certainly heard of OpenID Connect, the recently ratified open standard that layers authentication on top of OAuth2 and the JWT token format. For a quick intro see this and this.

Azure Active Directory has supported OpenID Connect for quite some time – every time you sign in to the Microsoft Azure portal, that’s what you’re using – but we didn’t have support for it in our web programming stack. Well, now we do.

image

If you search the NuGet.org feed for “OpenIDConnect” and include prerelease packages, you’ll find a couple of new packages which contain base classes constituting the OpenID Connect primitives and the middleware proper implementing the protocol.

We’ll give more details (far more details) next week; however, if you want to get a taste of how it works, start by following the WS-Federation & OWIN tutorial here from start to finish.

Once you are done, switching from WS-Federation to OpenID Connect is super easy!

  • Install the package Microsoft.Owin.Security.OpenIdConnect (remember, it’s in the prerelease feed)
  • Go to Startup.Auth.cs, and substitute the ConfigureAuth implementation with the following:
public void ConfigureAuth(IAppBuilder app)
{
    app.SetDefaultSignInAsAuthenticationType(CookieAuthenticationDefaults.AuthenticationType);

    app.UseCookieAuthentication(new CookieAuthenticationOptions { });

    app.UseOpenIdConnectAuthentication(
        new OpenIdConnectAuthenticationOptions
        {
            Client_Id = "d71c88d1-f3d3-47e9-8313-06bc9af9a991",
            Authority = "https://login.windows.net/azurefridays.onmicrosoft.com/"
        });
}

Yes, we did shrink this even further.
The most interesting part is the initialization code for the OpenIdConnectAuthenticationOptions.

  • Client_Id is the unique identifier assigned to your application by Azure AD at creation time. You’ve been using this value for all OAuth2 flows where the app acted as a client: in OpenID Connect you use it in roughly the same way in which you used the realm in WS-Federation.
  • Authority here represents your Azure AD tenant. AAD exposes metadata describing the signing key and issuer values to validate, in analogy to what happens for WS-Federation and SAML: the Authority property is just a convenient way for you to refer to that document without knowing its exact address (which, by the way, is of the form https://login.windows.net/azurefridays.onmicrosoft.com/.well-known/openid-configuration and can be used directly in the MetadataAddress property if you prefer to set it explicitly; see the sketch below).
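For example, pointing directly at the metadata document instead of using the Authority shorthand would look roughly like this (same Client_Id as in the snippet above):

app.UseOpenIdConnectAuthentication(
    new OpenIdConnectAuthenticationOptions
    {
        Client_Id = "d71c88d1-f3d3-47e9-8313-06bc9af9a991",
        // Explicit metadata document instead of the Authority shorthand.
        MetadataAddress = "https://login.windows.net/azurefridays.onmicrosoft.com/.well-known/openid-configuration"
    });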

That’s all you need for adding OpenId Connect sign on to your app!

Of course there’s far more that you could do. For example, with OpenId Connect it is very easy to sign in and at the same time obtain an access token for your app to access other APIs (such as the Graph, Office 365, your own, etc) and the new object model makes it very natural. We’ll talk about this and many other scenarios at length next week!

Next

As the boss of my boss of my boss puts it, “Widely-available secure interoperable digital identity is the key to enabling easy-to-use, high-value cloud-based services for the devices and applications that people use. OpenID Connect fills the need for a simple yet flexible and secure identity protocol and also lets people leverage their existing OAuth 2.0 investments. Microsoft is proud to be a key contributor to the development of OpenID Connect, and of doing our part to make it simple to deploy and use digital identity across a wide range of use cases”.

We were very keen to add OpenID Connect support to our web programming stack, and we are doubly excited to do so in the new OWIN security components in ASP.NET. You have been great in giving us feedback during the first preview; we hope you’ll find the time to try the new bits and let us know what you think:

In addition, if next week you happen to be in San Francisco and want to chat about this, come find us, either on the //BUILD conference floor or at this meetup: we’ll be happy to give more details.

Thanks and enjoy!

Announcing new web features in Visual Studio 2013 Update 2 RC


Today, the Visual Studio team announced the release of the RC version of Visual Studio 2013 Update 2. Our team added a few useful features and fixed some bugs in this update to improve the web development experience. This blog post covers all of the features introduced in CTP2 plus a few updates. We will have future blog posts that talk about some of the features in detail. The release notes contain more details.

What’s new since CTP2?

We added the following new features in the RC release, which are described in detail in the corresponding sections.

New Sass project item and editor

We added LESS in VS 2013 RTM, and we now have a Sass project item and editor. The Sass editor features are comparable to those of the LESS editor, and include colorization, variable and mixin IntelliSense, comment/uncomment, quick info, formatting, syntax validation, outlining, go-to-definition, a color picker, a tools options setting, etc.

clip_image001

clip_image002

New JSON project item and editor

We have added a JSON project item and editor to Visual Studio.  Current JSON editor features include colorization, syntax validation, brace completion, outlining, tools option setting and more.

clip_image003

IntelliSense now supports JSON Schema v3 and v4. There is a schema combo box to choose existing schemas, edit the local schema path, or simply drag and drop a project JSON file onto it to get the relative path.

image

image

Create remote Azure resources option when creating a new Web project

We added an Azure “Create remote resources” checkbox to the new web application dialog. By choosing it, you can create a new web application, set up the Windows Azure site for testing, and create a publishing profile in a few simple steps.

image

clip_image009

We added a new template, Azure Mobile Service, with support for it in the publish dialog as well. We also added VM support in the One ASP.NET dialog and the publish dialog.

We also added PowerShell script support during project creation.

We also support remote debugging for Azure Web Sites and remote viewing of Azure web site content files in Server Explorer.

A new dialog to trust IIS express SSL certificate

To eliminate the security warning when browsing and debugging HTTPS on localhost, we added a dialog that allows Internet Explorer and Chrome to trust the self-signed IIS Express SSL certificate.

For example, a web project can be set to use SSL. Press F4 to bring up the Properties window, change SSL Enabled to True, and copy the SSL URL.

clip_image010

On the Web tab of the project properties page, set the project to use the HTTPS-based URL (the SSL URL will be https://localhost:44300/ unless you've previously created SSL web sites).

clip_image011

Press CTRL+F5 to run the application. Follow the instructions to trust the self-signed certificate that IIS Express has generated.

clip_image012

Read the Security Warning dialog and then click Yes if you want to install the certificate representing localhost.

clip_image013

The site will be shown in IE or Chrome without the certificate warning in the browser.

clip_image014

Firefox uses its own certificate store, so it will display a warning.

ASP.NET Scaffolding

  • The MVC Scaffolder will generate dropdowns for Enums. This uses the Enum helpers in MVC.
  • We updated the EditorFor templates in MVC Scaffolding so they use the Bootstrap classes.
  • MVC and Web API Scaffolders will add 5.1 packages for MVC and Web API.

Here are some screenshots of scaffolding a model that uses an Enum.

Model code:

clip_image015
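Since the model code screenshot is not reproduced here, the following is a rough sketch of what such a model might look like; the names are illustrative, inferred from the Views/WeekdayModels path mentioned below.

public enum Weekday
{
    Monday,
    Tuesday,
    Wednesday,
    Thursday,
    Friday,
    Saturday,
    Sunday
}

public class WeekdayModel
{
    public int Id { get; set; }

    // A nullable enum: the scaffolded dropdown will include an empty entry for it.
    public Weekday? Day { get; set; }
}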

Compile, and then click Add > New Scaffolded Item…

clip_image016

Choose MVC 5 Controller with views, using Entity Framework:

image

Add Controller using the model:

image

Check the generated code; for example, Views/WeekdayModels/Edit.cshtml contains @Html.EnumDropDownListFor:

image

Run the page to see the generated enum combo box. Notice that if the value can be null, an empty string can be chosen in the combo box. For example, the Create page shows the following:

clip_image022

One ASP.NET Template changes

We updated ASP.NET templates to support Account Confirmation and Password Reset.

We updated the ASP.NET Web API template to support authentication using On Premises Organizational Accounts.

The ASP.NET SPA template now contains authentication that is based on MVC and server-side views. The template has a Web API controller that can only be accessed by authenticated users, as sketched below.
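As a hedged illustration (not the exact template code), restricting a Web API controller to authenticated users is done with the Authorize attribute:

using System.Web.Http;

[Authorize]
public class ValuesController : ApiController
{
    // Anonymous requests receive a 401; only authenticated users reach this action.
    public IHttpActionResult Get()
    {
        return Ok(new[] { "value1", "value2" });
    }
}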

LESS editor improvements

We added features including nested media queries, named parameter support, selector interpolation, semicolons as parameter separators, go to definition for @import, and go to definition for variables and mixins.

New URL Picker in HTML, Razor, CSS, LESS and Sass documents

VS 2013 shipped without a URL picker outside of Web Forms pages.  The new URL picker for the HTML, Razor, CSS, LESS, and Sass editors is a dialog-free, fluent typing picker that understands ‘..’ and filters file lists appropriately for images and links.

clip_image025

clip_image026

clip_image027

Browser Link New Features

Browser Link added updates for:

  • HTTPS connections, which are now listed in the Dashboard with other connections as long as the certificate is trusted by the browser

  • Static HTML source mapping

  • SPA support for mapping data

  • Auto-updating of mapping data

ASP.NET Web Forms

The Web Forms templates now show how to do Account Confirmation and Password Reset for ASP.NET Identity.

The EntityDataSource control and the Dynamic Data provider were updated for Entity Framework 6. For more details, please see Announcing the release of Dynamic Data provider and EntityDataSource control for Entity Framework 6.

ASP.NET MVC 5.1.2, ASP.NET Web API 2.1.2 and ASP.NET Web Pages 3.1.2 are included

We announced ASP.NET MVC 5.1, ASP.NET Web API 2.1, and ASP.NET Web Pages 3.1 in January.  We integrated that release, along with some minor 5.1.1 bug fixes, into VS 2013 Update 2 RC. 5.1.2 contains the same binaries plus localization for IntelliSense.

ASP.NET Identity

We integrated Microsoft.AspNet.Identity 2.0.0 RTM into the new project templates. It includes two-factor authentication, account lockout, account confirmation, password reset, security stamps, user account deletion, extensibility of the primary key for users and roles, and more.
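As a sketch of how a few of these features are turned on (assuming the ApplicationUser, ApplicationUserManager, and ApplicationDbContext types that the templates generate; the values are illustrative):

// Namespaces assumed: System, Microsoft.AspNet.Identity,
// Microsoft.AspNet.Identity.EntityFramework, Microsoft.AspNet.Identity.Owin, Microsoft.Owin.
public static ApplicationUserManager Create(
    IdentityFactoryOptions<ApplicationUserManager> options, IOwinContext context)
{
    var manager = new ApplicationUserManager(
        new UserStore<ApplicationUser>(context.Get<ApplicationDbContext>()));

    // Account lockout: lock a user out for five minutes after five failed attempts.
    manager.UserLockoutEnabledByDefault = true;
    manager.DefaultAccountLockoutTimeSpan = TimeSpan.FromMinutes(5);
    manager.MaxFailedAccessAttemptsBeforeLockout = 5;

    // Two-factor authentication with a code sent by SMS.
    manager.RegisterTwoFactorProvider("Phone Code",
        new PhoneNumberTokenProvider<ApplicationUser>
        {
            MessageFormat = "Your security code is {0}"
        });

    return manager;
}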

Entity Framework

We integrated Entity Framework 6.1.0 RTM into the new project templates.

Microsoft OWIN Components

We integrated the latest stable version (2.1.0) of the OWIN components into the new project templates.  OWIN 2.1.0 supports Google OAuth2 authentication and a static file server; see the sketch below.
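As a brief sketch of those two capabilities (assuming the Microsoft.Owin.Security.Google and Microsoft.Owin.StaticFiles packages; the client ID and secret are placeholders):

using Microsoft.Owin.Security.Google;
using Owin;

public partial class Startup
{
    public void Configure(IAppBuilder app)
    {
        // Google sign-in via OAuth2 (rather than the older OpenID-based flow).
        app.UseGoogleAuthentication(new GoogleOAuth2AuthenticationOptions
        {
            ClientId = "your-google-client-id",
            ClientSecret = "your-google-client-secret"
        });

        // Serve static files from the application's root folder.
        app.UseStaticFiles();
    }
}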

NuGet

NuGet 2.8 RTM is included in this release.  You can always get the latest NuGet extension for Visual Studio through the menu “Tools->Extensions and Updates…”.

ASP.NET SignalR

We included the 2.0.2 NuGet package for SignalR.  Please look at the release notes for more detailed information: https://github.com/SignalR/SignalR/releases/tag/2.0.2

Known Problems

Knockout IntelliSense is disabled in this release. We hope to have it restored for the RTM release.

Summary

We hope you can evaluate these new features and let us know about any bugs and suggestions.  For VS features, please use Connect to submit bugs, ASP.NET UserVoice to submit and vote for suggestions, and the ASP.NET Forums for Q&A.  You can also visit the following open source site to leave suggestions and open issues directly:
