
Plugging custom OAuth/OpenID providers


In the previous post, I wrote about how you can use the existing providers for Google, Facebook, etc. and retrieve extra metadata about the authenticated users. Now let’s assume you want to change the way a provider requests information. Some examples of this could be:

  • You want to request more data about the user
  • You want to apply different scope levels when requesting the data

This post covers how you can write your own provider and plug it into your ASP.NET web application.

Write your own provider

Each provider derives from OpenIdClient. The following example shows a custom implementation of the Google provider that requests additional information about the user, such as first name and last name.

Please note: This works around a bug in the existing Google provider, which does not return the extra data about the user such as Country/FirstName/LastName. The affected package is DotNetOpenAuth.AspNet version 4.0.3.12153. We have logged a bug for this and will fix it in the next update of the package.

 

namespace MyApplication
{
    using System.Collections.Generic;
    using DotNetOpenAuth.OpenId.Extensions.AttributeExchange;
    using DotNetOpenAuth.OpenId.RelyingParty;
 
    /// <summary>
    /// Represents Google OpenID client.
    /// </summary>
    public class GoogleCustomClient : OpenIdClient
    {
        #region Constructors and Destructors
 
        public GoogleCustomClient()
            : base("google", WellKnownProviders.Google) { }
 
        #endregion
 
        #region Methods
 
        /// <summary>
        /// Gets the extra data obtained from the response message when authentication is successful.
        /// </summary>
        /// <param name="response">
        /// The response message. 
        /// </param>
        /// <returns>A dictionary of profile data; or null if no data is available.</returns>
        protected override Dictionary<string, string> GetExtraData(IAuthenticationResponse response)
        {
            FetchResponse fetchResponse = response.GetExtension<FetchResponse>();
            if (fetchResponse != null)
            {
                var extraData = new Dictionary<string, string>();
                extraData.Add("email", fetchResponse.GetAttributeValue(WellKnownAttributes.Contact.Email));
                extraData.Add("country", fetchResponse.GetAttributeValue(WellKnownAttributes.Contact.HomeAddress.Country));
                extraData.Add("firstName", fetchResponse.GetAttributeValue(WellKnownAttributes.Name.First));
                extraData.Add("lastName", fetchResponse.GetAttributeValue(WellKnownAttributes.Name.Last));
 
                return extraData;
            }
 
            return null;
        }
 
        /// <summary>
        /// Called just before the authentication request is sent to service provider.
        /// </summary>
        /// <param name="request">
        /// The request. 
        /// </param>
        protected override void OnBeforeSendingAuthenticationRequest(IAuthenticationRequest request)
        {
            // Attribute Exchange extensions
            var fetchRequest = new FetchRequest();
            fetchRequest.Attributes.AddRequired(WellKnownAttributes.Contact.Email);
            fetchRequest.Attributes.AddRequired(WellKnownAttributes.Contact.HomeAddress.Country);
            fetchRequest.Attributes.AddRequired(WellKnownAttributes.Name.First);
            fetchRequest.Attributes.AddRequired(WellKnownAttributes.Name.Last);
 
            request.AddExtension(fetchRequest);
        }
 
        #endregion
    }
}

 

Source Code for existing providers

The source code for existing providers is public and can be accessed at https://github.com/AArnott/dotnetopenid/tree/master/src/DotNetOpenAuth.AspNet/Clients

Register your provider with your application

WebForms

  • In App_Start/AuthConfig.cs register the custom provider as follows
OpenAuth.AuthenticationClients.Add("Custom Google", () => new MyApplication.GoogleCustomClient());
//OpenAuth.AuthenticationClients.AddGoogle();
   

MVC

  • In App_Start/AuthConfig.cs register the custom provider as follows

 OAuthWebSecurity.RegisterClient(new MyApplication.GoogleCustomClient(),"Google",null);
           // OAuthWebSecurity.RegisterGoogleClient();

WebPages

  • In _AppStart.cshtml register the custom provider as follows

 

 OAuthWebSecurity.RegisterClient(new MyApplication.GoogleCustomClient(),"Google",null);
           // OAuthWebSecurity.RegisterGoogleClient();

This post has been cross posted to http://blogs.msdn.com/b/pranav_rastogi/archive/2012/08/23/plugging-custom-oauth-openid-providers.aspx

Please reach out to me on Twitter (@rustd) with any questions.


NuGet Feed Performance Update


As you might know, NuGet has been having some performance (and timeout) related issues recently. Earlier this week, we completed a deployment that helped, but it didn't address everything.

Many users are still seeing slow responses or even timeouts when trying to use the ‘Manage NuGet Packages’ dialog in Visual Studio.

Ongoing Investigation

The deployment earlier this week greatly improved the packages page on the gallery, but it didn't address the Visual Studio dialog performance as much as we had hoped. Since that deployment, we’ve been focusing on the queries behind the Visual Studio dialog.

We have found that the SQL queries that get executed from the ‘Manage NuGet Packages’ dialog are not great. While the execution plans look okay, the memory grants for the queries are huge, much bigger than they need to be. Because of the huge memory grants, the queries are stacking up behind each other waiting for memory to be granted. This is leading to poor performance and also timeouts.

We are working to get a different approach for these queries in place. We will let you know when a fix is ready. If you’re curious about what the queries look like, you can see the current and tentative rewritten queries here. The rewritten queries are using about 1/100th of the memory of the current queries.

To stay in touch with us, follow @nuget, or the #nuget tag on twitter for live updates. You can also check in on JabbR's #nuget room.

Source: http://blog.nuget.org/20120824/nuget-feed-performance-update.html

Customizing the login UI when using OAuth/OpenID


In the last post I showed how you can plug in your own OAuth/OpenID provider. This post shows how you can pass in extra data about the provider, such as a display name or image, and use this information when building the login UI.

 

The login screen in the ASP.NET templates looks like this.

Let’s see how we can customize this UI to look like the following.

Web Forms

  • Pass in extra data in App_Start\AuthConfig.cs when registering the provider as follows
Dictionary<string, object> FacebooksocialData = new Dictionary<string, object>();
            FacebooksocialData.Add("Icon", "~/Images/facebook.png");
            OpenAuth.AuthenticationClients.AddFacebook(
                appId: "your Facebook app id",
                appSecret: "your Facebook app secret",
                extraData: FacebooksocialData                
                );
  • Access this data in the view (in the WebForms Internet template, this is Account\OpenAuthProviders.ascx)
<img src="<%# Item.ExtraData["Icon"] %>" alt="Alternate Text" />

 

MVC

  • Pass in extra data in App_Start\AuthConfig.cs when registering the provider as follows
Dictionary<string, object> FacebooksocialData = new Dictionary<string, object>();
            FacebooksocialData.Add("Icon", "~/Images/facebook.png");
            OAuthWebSecurity.RegisterFacebookClient(
                appId: "someid",
                appSecret: "somesecret",
                displayName: "Facebook",
                extraData: FacebooksocialData);
    • Access this data in the view (in the MVC Internet template, this is Views\Account\_ExternalLoginsListPartial.cshtml)

     

     @foreach (AuthenticationClientData p in Model)
            {
                <img src="@p.ExtraData["Icon"]" alt="Icon for @p.DisplayName" />
               }

     

    WebPages

    • In _AppStart.cshtml

     

    Dictionary<string, object> FacebooksocialData = new Dictionary<string, object>();
                FacebooksocialData.Add("Icon", "~/Images/facebook.png");
                OAuthWebSecurity.RegisterFacebookClient(
                    appId: "empty",
                    appSecret: "empty",
                    displayName: "Facebook",
                    extraData: FacebooksocialData);

       

      • Access this data in the view (in the WebPages template, this is Account\_ExternalLoginsListPartial.cshtml)
       @foreach (AuthenticationClientData p in Model)
              {
                  <img src="@p.ExtraData["Icon"]" alt="Icon for @p.DisplayName" />
                 }

      Cross posted to http://blogs.msdn.com/b/pranav_rastogi/archive/2012/08/24/customizing-the-login-ui-when-using-oauth-openid.aspx

      Pranav Rastogi | @rustd

      List of ASP.NET Web API and HttpClient Samples


      Here is a list of the Web API and HttpClient samples you can find in our samples repository on aspnet.codeplex.com. They illustrate various features of Web API and HttpClient targeting either Visual Studio 2010 using .NET 4 or Visual Studio 2012 using .NET 4.5 with async/await language support.

      For details on how to get up and running with the samples, please see the blog post ASP.NET Web API Samples on Codeplex. You can also check out additional information about ASP.NET Web API as well as find the open source runtime on Codeplex.

      If there are samples you would like to see, or you find issues, please register an issue and let us know what you think!

      HttpClient Samples

      Bing Translate Sample | VS 2012 source

      This sample illustrates how to use the Bing Translator API from HttpClient. The API requires an OAuth token, which we obtain by sending a request to the Azure token server each time we send a request to the translator service. The result from that request is fed into the request sent to the translation service itself. Before you can run this sample you must obtain an application key from Azure Marketplace and fill in the information in the AccessTokenMessageHandler sample class.

      Google Maps Sample | detailed description | VS 2012 source

      This sample uses HttpClient to download a map of Redmond, WA from the Google Maps API, saves it as a local file, and opens the default image viewer.
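      As a rough sketch of the idea (not the sample's actual code; the static-map URL and file name below are illustrative placeholders), the core of such a client might look like this:

      using System.IO;
      using System.Net.Http;
      using System.Threading.Tasks;

      class MapDownloader
      {
          // Downloads the map image, saves it locally, and opens the default image viewer.
          public static async Task DownloadAsync()
          {
              using (var client = new HttpClient())
              {
                  byte[] map = await client.GetByteArrayAsync(
                      "http://maps.googleapis.com/maps/api/staticmap?center=Redmond,WA&zoom=12&size=400x400&sensor=false");
                  File.WriteAllBytes("redmond.png", map);
                  System.Diagnostics.Process.Start("redmond.png");
              }
          }
      }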

      Twitter Client Sample | detailed description | VS 2012 source

      This sample illustrates how to write a simple twitter client using HttpClient. The sample uses an HttpMessageHandler to insert the appropriate OAuth authentication information into the outgoing HttpRequestMessage. The result from twitter is read using JSON.NET as a JToken. Before you can run this sample you must obtain an application key from twitter, and fill in the information in the OAuthMessageHandler sample class.

      World Bank Sample | detailed description | VS 2010 source | VS 2012 source

      This sample shows how to retrieve data from the World Bank data site, using JSON.NET to parse the result as a JToken.
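      A minimal sketch of that approach (the URL and query string here are illustrative, not the sample's exact values):

      using System.Net.Http;
      using System.Threading.Tasks;
      using Newtonsoft.Json.Linq;

      class WorldBankClient
      {
          // Fetches a JSON document and parses it into a JToken tree with JSON.NET.
          public static async Task<JToken> GetCountriesAsync()
          {
              using (var client = new HttpClient())
              {
                  string json = await client.GetStringAsync("http://api.worldbank.org/countries?format=json");
                  return JToken.Parse(json);
              }
          }
      }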

      Web API Samples

      Batching Sample | detailed description | VS 2012 source

      This sample shows how to implement HTTP batching within ASP.NET. The batching consists of putting multiple HTTP requests within a single MIME multipart entity body which is then sent to the server as an HTTP POST. The requests are then processed individually and the responses are put into another MIME multipart entity body which is returned to the client.

      Content Controller Sample | detailed description | VS 2010 source | VS 2012 source

      This ASP.NET Web API sample illustrates how to read and write request and response entities asynchronously using streams. The sample controller has two actions: a PUT action which reads the request entity body asynchronously and stores it in a local file and a GET action which returns the content of the local file.

      Custom Assembly Resolver Sample | VS 2012 source

      This sample illustrates how to modify ASP.NET Web API to support discovery of controllers in a dynamically loaded controller library assembly. The sample implements a custom IAssembliesResolver that takes the default implementation providing the default list of assemblies and then adds the ControllerLibrary assembly as well.
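      A minimal sketch of that idea (not the sample's code; the assembly path is a placeholder):

      using System.Collections.Generic;
      using System.Reflection;
      using System.Web.Http.Dispatcher;

      // Extends the default resolver so controllers in a dynamically loaded assembly are discovered too.
      public class CustomAssembliesResolver : DefaultAssembliesResolver
      {
          public override ICollection<Assembly> GetAssemblies()
          {
              ICollection<Assembly> assemblies = base.GetAssemblies();
              // Placeholder path; the sample adds its ControllerLibrary assembly here.
              assemblies.Add(Assembly.LoadFrom(@"plugins\ControllerLibrary.dll"));
              return assemblies;
          }
      }

      // Registration, e.g. in Global.asax Application_Start:
      // GlobalConfiguration.Configuration.Services.Replace(
      //     typeof(IAssembliesResolver), new CustomAssembliesResolver());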

      Custom Media Type Formatter Sample | detailed description | VS 2010 source

      This sample illustrates how to create a custom media type formatter using the BufferedMediaTypeFormatter base class, which is intended for formatters that primarily use synchronous read and write operations. In addition to showing the media type formatter, the sample shows how to hook it up by registering it as part of the HttpConfiguration for your application. Note that it is also possible to use the MediaTypeFormatter base class directly for formatters that primarily use asynchronous read and write operations.
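      As a hedged, minimal sketch (not the sample's formatter), a trivial CSV-style formatter built on BufferedMediaTypeFormatter might look like this:

      using System;
      using System.IO;
      using System.Net.Http;
      using System.Net.Http.Formatting;
      using System.Net.Http.Headers;

      // Writes a string array as a single comma-separated line of text/csv.
      public class SimpleCsvFormatter : BufferedMediaTypeFormatter
      {
          public SimpleCsvFormatter()
          {
              SupportedMediaTypes.Add(new MediaTypeHeaderValue("text/csv"));
          }

          public override bool CanReadType(Type type) { return false; }

          public override bool CanWriteType(Type type) { return type == typeof(string[]); }

          public override void WriteToStream(Type type, object value, Stream writeStream, HttpContent content)
          {
              using (var writer = new StreamWriter(writeStream))
              {
                  writer.WriteLine(string.Join(",", (string[])value));
              }
          }
      }

      // Hook it up as part of the HttpConfiguration, e.g. in WebApiConfig.Register:
      //     config.Formatters.Add(new SimpleCsvFormatter());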

      Custom Parameter Binding Sample | detailed description | VS 2010 source

      This ASP.NET Web API sample illustrates how to customize the parameter binding process which is the process that determines how information from a request is bound to action parameters. In this sample, the Home controller has four actions:

      1. BindPrincipal shows how to bind an IPrincipal parameter from a custom generic principal, not from an HTTP GET message;
      2. BindCustomComplexTypeFromUriOrBody shows how to bind a Complex Type parameter which could come from either the message body or request Uri of an HTTP POST message;
      3. BindCustomComplexTypeFromUriWithRenamedProperty shows how to bind a Complex Type parameter with a renamed property which comes from request Uri of an HTTP POST message;
      4. PostMultipleParametersFromBody shows how to bind multiple parameters from body for a POST message;

      File Upload Sample | detailed description | VS 2012 source

      This sample illustrates how to upload files to an ApiController from HttpClient, using MIME multipart file upload as defined by HTML. It also shows how to set up progress notifications with HttpClient using ProgressNotificationHandler. The FileUploadController reads the contents of an HTML file upload asynchronously and writes one or more body parts to a local file. It then responds with a result containing information about the uploaded file (or files).
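      A condensed sketch of such a controller (names and the target folder are illustrative, and details differ from the sample):

      using System.Linq;
      using System.Net;
      using System.Net.Http;
      using System.Threading.Tasks;
      using System.Web;
      using System.Web.Http;

      public class FileUploadController : ApiController
      {
          public async Task<HttpResponseMessage> Post()
          {
              // Only MIME multipart requests (HTML file uploads) are accepted.
              if (!Request.Content.IsMimeMultipartContent())
              {
                  throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);
              }

              // Stream the body parts to App_Data as they are read.
              string root = HttpContext.Current.Server.MapPath("~/App_Data");
              var provider = new MultipartFormDataStreamProvider(root);
              await Request.Content.ReadAsMultipartAsync(provider);

              // Respond with the server-side file names of the uploaded parts.
              var fileNames = provider.FileData.Select(f => f.LocalFileName).ToList();
              return Request.CreateResponse(HttpStatusCode.OK, fileNames);
          }
      }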

      Http Message Handler Pipeline Sample | detailed description | VS 2010 source

      This sample illustrates how to wire up HttpMessageHandlers on both client and server side as part of either HttpClient or ASP.NET Web API. In the sample, the same handler is used on both client and server side. While it is rare that the exact same handler can run in both places, the object model is the same on client and server side.
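      For illustration, a minimal handler of this shape (not the sample's handler) and the two registrations might look like this:

      using System.Net.Http;
      using System.Threading;
      using System.Threading.Tasks;

      // The same DelegatingHandler type can run in the HttpClient pipeline and in the Web API pipeline.
      public class LoggingHandler : DelegatingHandler
      {
          protected override async Task<HttpResponseMessage> SendAsync(
              HttpRequestMessage request, CancellationToken cancellationToken)
          {
              System.Diagnostics.Trace.WriteLine("Request: " + request.RequestUri);
              HttpResponseMessage response = await base.SendAsync(request, cancellationToken);
              System.Diagnostics.Trace.WriteLine("Response: " + (int)response.StatusCode);
              return response;
          }
      }

      // Server side (e.g. in WebApiConfig.Register):
      //     config.MessageHandlers.Add(new LoggingHandler());
      // Client side:
      //     HttpClient client = HttpClientFactory.Create(new LoggingHandler());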

      JSON Upload Sample | VS 2012 source

      This sample illustrates how to upload and download JSON to and from an ApiController. The sample uses a minimal ApiController and accesses it using HttpClient.

      Mashup Sample | detailed description | VS 2012 source

      This sample shows how to asynchronously access multiple remote sites from within an ApiController action. Each time the action is hit, the requests are performed asynchronously so that no threads are blocked.

      Memory Tracing Sample | detailed description | VS 2010 source

      This sample illustrates how to wire up HttpMessageHandlers on both client and server side as part of either HttpClient or ASP.NET Web API. In the sample, the same handler is used on both client and server side. While it is rare that the exact same handler can run in both places, the object model is the same on client and server side.

      MongoDB Sample | detailed description | VS 2012 source

      This sample illustrates how to use MongoDB as the persistent store for an ApiController using a repository pattern.

      Response Body Processor Sample | VS 2012 source

      This sample illustrates how to copy a response entity (i.e. an HTTP response body) to a local file before it is transmitted to the client, and how to perform additional processing on that file asynchronously. It does so by hooking in an HttpMessageHandler that wraps the response entity with one that writes itself both to the output as normal and to a local file.

      Upload XDocument Sample | detailed description | VS 2012 source

      This sample illustrates uploading an XDocument to an ApiController using PushStreamContent and HttpClient.

      Validation Sample | VS 2010 source

      This sample illustrates how you can use validation attributes on your models in ASP.NET Web API to validate the contents of the HTTP request. It demonstrates how to mark properties as required, how to use both framework-defined and custom validation attributes to annotate your model, and how to return error responses for invalid model states.
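      Roughly, the shape of the sample looks like this (the model and controller names here are illustrative):

      using System.ComponentModel.DataAnnotations;
      using System.Net;
      using System.Net.Http;
      using System.Web.Http;

      public class Customer
      {
          [Required]
          public string Name { get; set; }

          [Range(0, 150)]
          public int Age { get; set; }
      }

      public class CustomersController : ApiController
      {
          public HttpResponseMessage Post(Customer customer)
          {
              // Validation attributes are evaluated during model binding; check the result here.
              if (!ModelState.IsValid)
              {
                  return Request.CreateErrorResponse(HttpStatusCode.BadRequest, ModelState);
              }
              return Request.CreateResponse(HttpStatusCode.Created, customer);
          }
      }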

      Web Form Sample | detailed description | VS 2010 source

      This sample shows an ApiController added to a Web Forms project by using the Add New Item menu and selecting Web API Controller Class. In addition to the controller, we also add a default route to the global.asax.cs file.

      Web API Extensions Preview Samples

      OData Queryable Sample | detailed description | VS 2010 source

      This sample shows how to introduce OData queries in ASP.NET Web API using either the [Queryable] attribute or the ODataQueryOptions action parameter, which allows the action to manually inspect the query before it is executed.

      The CustomerController shows using [Queryable] attribute and the OrderController shows how to use the ODataQueryOptions parameter. The ResponseController is similar to the CustomerController but instead of the GET action returning IEnumerable<Customer> it returns an HttpResponseMessage. This allows us to add extra header fields, manipulate the status code, etc. while still using query functionality. The sample illustrates queries using $orderby, $skip, $top, any(), all(), and $filter.
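      In outline, the two approaches look roughly like this (the entity types are illustrative, and the preview bits may use slightly different namespaces than the later OData package):

      using System.Collections.Generic;
      using System.Linq;
      using System.Web.Http;
      using System.Web.Http.OData.Query; // namespace as in the later OData package; may differ in the preview

      public class Customer { public int Id { get; set; } public string Name { get; set; } }
      public class Order { public int Id { get; set; } public decimal Total { get; set; } }

      public class CustomersController : ApiController
      {
          private static readonly List<Customer> _customers = new List<Customer>();

          // [Queryable] lets the framework apply $filter, $orderby, $skip and $top to the returned IQueryable.
          [Queryable]
          public IQueryable<Customer> Get()
          {
              return _customers.AsQueryable();
          }
      }

      public class OrdersController : ApiController
      {
          private static readonly List<Order> _orders = new List<Order>();

          // ODataQueryOptions hands the action the parsed query so it can inspect it before applying it.
          public IQueryable<Order> Get(ODataQueryOptions<Order> options)
          {
              return (IQueryable<Order>)options.ApplyTo(_orders.AsQueryable());
          }
      }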

      OData Service Sample | detailed description | VS 2010 source

      This sample illustrates how to create an OData service consisting of three entities and three ApiControllers. The controllers provide various levels of functionality in terms of the OData functionality they expose:

      The SupplierController exposes a subset of functionality including Query, Get by Key and Create, by handling these requests:

      • GET /Suppliers
      • GET /Suppliers(key)
      • GET /Suppliers?$filter=..&$orderby=..&$top=..&$skip=..
      • POST /Suppliers

      The ProductsController exposes GET, PUT, POST, DELETE, and PATCH by implementing an action for each of these operations directly.

      The ProductFamilesController leverages the EntitySetController base class which exposes a useful pattern for implementing a rich OData service.

      In addition, the OData service exposes a $metadata document, which allows the data to be consumed by WCF Data Services clients and other clients that accept the $metadata format.


      Have fun!

      Henrik

      Getting Symbols and Source with ASP.NET Nightly NuGet Packages


      You can now get full symbols and source along with the nightly NuGet packages making it possible to debug the latest MVC, Web API, and Web Pages bits by tracing directly through the source. This is enabled by SymbolSource, which hosts the symbols and source for the nightly NuGet packages, and MyGet which hosts the nightly NuGet feed. Great services!

      If you want to use the nightly NuGet packages then please see Using Nightly ASP.NET Web Stack NuGet Packages for getting started. Please remember that the nightly NuGet packages are “raw” and come with no guarantees.

      Configuring Visual Studio

      The instructions apply to both Visual Studio 2010 and 2012 and work in both full and express editions.

      First open the Debug | Options and Settings menu, go to the General tab, and do the following:

      1. Uncheck Enable Just My Code
      2. Check Enable source server support
      3. Uncheck Require source files to exactly match the original version

      It should look something like this:

      DebugOptions

       

      Now go to the Symbols tab and add http://srv.symbolsource.org/pdb/MyGet to the list of symbols locations. Also, make sure you have a short path for the symbols cache as the file names otherwise can get too long resulting in the symbols not getting loaded properly. A suggestion is to use C:\SymbolCache. It should look something like this:

      DebugSymbols

      For more details, please check out these instructions from symbolsource.org.

      Trying it Out

      Let’s try this out on the Validation Sample project, which is one of the ASP.NET Web API samples. After updating the NuGet packages to use the nightly feed, start the debugger. First we set a breakpoint in the sample in the PostValidCustomer method, which is hit as expected:

      ValidationSampleDebug1

      Now we hit F11 to step into the next statement and it takes us to the JsonMediaTypeFormatter class in ASP.NET Web API:

      ValidationSampleDebug2

      If you have issues or questions then please follow up on the Symbols for nightly builds available discussion thread.

      Have fun!

      Henrik

      Integrate OpenAuth/OpenID with your existing ASP.NET application using Universal Providers


       Over the past couple of weeks I have come across lots of questions and discussions along these lines: OAuth/OpenID is a cool feature in the ASP.NET templates in Visual Studio 2012, but how do I easily integrate it into my application outside of the templates? And how do I extend the Universal Providers to integrate OAuth/OpenID and use other functionality such as roles? I am going to cover these two areas in this post using Web Forms, but you can integrate the same with MVC applications as well.

       Following are the steps to integrate OAuth/OpenID into your existing application:

        • I started with an empty .NET 4.5 web application (nothing in the project except a web.config)
        • Use NuGet to install the following packages
          • DotNetOpenAuth.AspNet
            • This package is the core package for OAuth/OpenID protocol communication
          • Microsoft.AspNet.Providers.Core
            • This package brings in Universal Providers
          • Microsoft.AspNet.Providers.LocalDb
            • This package sets the connection string for the Universal Providers
          • Microsoft.AspNet.Membership.OpenAuth
            • This package provides the extension to integrate OAuth/OpenID with Universal Providers
        • Change web.config to use forms authentication
        <authentication mode="Forms">
             <forms loginUrl="Default.aspx"></forms>
        </authentication>
          • In App_Start, register the list of OAuth/OpenID providers you want to use. By convention, any application-start registration is done in a folder called App_Start.
          // See http://go.microsoft.com/fwlink/?LinkId=252803 for details on setting up this ASP.NET
                      // application to support logging in via external services.
           
                      //OpenAuth.AuthenticationClients.AddTwitter(
                      // consumerKey: "your Twitter consumer key",
                      // consumerSecret: "your Twitter consumer secret");
           
                      //OpenAuth.AuthenticationClients.AddFacebook(
                      // appId: "your Facebook app id",
                      // appSecret: "your Facebook app secret");
           
                      //OpenAuth.AuthenticationClients.AddMicrosoft(
                      // clientId: "your Microsoft account client id",
                      // clientSecret: "your Microsoft account client secret");
           
                      OpenAuth.AuthenticationClients.AddGoogle();

           

            • Create a page to display the list of providers to use for logging in (this page reads the list configured in App_Start). In my sample I created Default.aspx.
              • Markup
            <asp:ListView runat="server" ID="providerDetails" ItemType="Microsoft.AspNet.Membership.OpenAuth.ProviderDetails"
                         SelectMethod="GetProviderNames" ViewStateMode="Disabled">
                         <ItemTemplate>
                             <button type="submit" name="provider" value="<%#: Item.ProviderName %>"
                                 title="Log in using your <%#: Item.ProviderDisplayName %> account.">
                                 <%#: Item.ProviderDisplayName %>
                             </button>
                         </ItemTemplate>
                         <EmptyDataTemplate>
                             <p>There are no external authentication services configured. </p>
                         </EmptyDataTemplate>
                     </asp:ListView>
                  • Code
                public IEnumerable<ProviderDetails> GetProviderNames()
                     {
                         return OpenAuth.AuthenticationClients.GetAll();
                     }

                At this stage the UI will look as follows

                providers

                 

                  • Call RequestAuthentication to redirect to the OpenID/OAuth provider. This code makes an outbound call to the provider, where the user can enter their login details; the provider then calls back to the app’s return URL.
                  public string ReturnUrl { get; set; }
                   
                          protected void Page_Load(object sender, EventArgs e)
                          {
                              if (IsPostBack)
                              {
                                  var provider = Request.Form["provider"];
                                  if (provider == null)
                                  {
                                      return;
                                  }
                   
                                  var redirectUrl = "~/ExternalLoginLandingPage.aspx";
                                  if (!String.IsNullOrEmpty(ReturnUrl))
                                  {
                                      var resolvedReturnUrl = ResolveUrl(ReturnUrl);
                                      redirectUrl += "?ReturnUrl=" + HttpUtility.UrlEncode(resolvedReturnUrl);
                                  }
                   
                                  OpenAuth.RequestAuthentication(provider, redirectUrl);
                              }
                          }

                  At this stage the UI will look as follows

                  googlelogin

                    • Now when the provider calls back to the app, we have to check whether the user was authenticated without any errors and, if so, log the user in. In my sample I configured the return URL to be ExternalLoginLandingPage.aspx, so create a page called ExternalLoginLandingPage in the root of your app. This page serves the following functions (for brevity, only the relevant methods/markup are shown here; the entire sample is posted on my GitHub repository https://github.com/rustd/SocialLoginASPNET):
                  1. Display the authenticated user name from the provider and verify whether authentication at the provider succeeded (e.g. did you enter the correct username/password). ProcessProviderResult() in Page_Load does this processing; a condensed sketch of the code-behind follows the markup below.
                  2. Let the user set a local user name if desired, create the membership user, associate it with the OAuth/OpenID account, and save this to the database.
                          //Markup; see the code-behind methods below
                          <ol>
                                         <li class="email">
                                             <asp:Label ID="Label1" runat="server" AssociatedControlID="userName">User name</asp:Label>
                                             <asp:TextBox runat="server" ID="userName" />
                                             <asp:RequiredFieldValidator ID="RequiredFieldValidator1" runat="server" ControlToValidate="userName"
                                                 Display="Dynamic" ErrorMessage="User name is required" ValidationGroup="NewUser" />                    
                                             <asp:ModelErrorMessage ID="ModelErrorMessage2" runat="server" ModelStateKey="UserName" CssClass="field-validation-error" />                    
                                         </li>
                                     </ol>
                                     <asp:Button ID="Button1" runat="server" Text="Log in" ValidationGroup="NewUser" OnClick="logIn_Click" />
                                     <asp:Button ID="Button2" runat="server" Text="Cancel" CausesValidation="false" OnClick="cancel_Click" />

                          At this stage the UI will look as follows

                          loggedin

                          Database structure

                          Once the membership user is saved to the database, the database will have the following tables

                          listoftables

                          Most of these tables will look familiar, since they are used by the Universal Providers for membership, roles, and profile. The two new tables were created by Microsoft.AspNet.Membership.OpenAuth to integrate OAuth/OpenID information with the membership system.

                          UsersOpenAuthAccounts: This holds information about which providers a user can log in with. For example, if your app is configured to use Facebook and Google, the user can log in via either of them, and that association is stored here.

                          UsersOpenAuthData: This table ties the OAuth/OpenID login to the membership system.

                          The following image shows how the OAuth/OpenID login information is wired to the membership system.

                          usersdata

                          The membership user name is the user name in the Users table. At this stage, since the Users table is populated, you can create roles and add or remove these users from roles, achieving OAuth/OpenID integration with roles as well; a small illustration follows.
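                          For example (illustrative only; "someuser" is a hypothetical membership user name), the standard Roles API from System.Web.Security now works against these users like any other membership users:

                          if (!Roles.RoleExists("Admin"))
                          {
                              Roles.CreateRole("Admin");
                          }
                          Roles.AddUserToRole("someuser", "Admin");
                          bool isAdmin = Roles.IsUserInRole("someuser", "Admin");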

                          This entire sample is posted on my github repository(https://github.com/rustd/SocialLoginASPNET)

                          Feel free to download it and give it a try

                           

                          What the default templates demonstrate beyond this

                          To view the default templates in case you do not have VS 2012, you can browse them in the following GitHub repo: https://github.com/rustd/ASPNETTemplates

                          • How to protect against XSRF attacks
                          • Associate a local username/password with OAuth/OpenID account
                          • Register with more than one OpenID/OAuth provider

                          I hope this helps you integrate OAuth/OpenID easily into your application when you are not starting from the templates.

                          ASP.NET Data Access Guidance Published


                          On 9/11 we published a new* set of pages on MSDN offering guidance for getting started with data access in ASP.NET. If you're already using ASP.NET, parts of these pages will just repeat what you already know or take for granted, but a few parts contain recommendations that are new for ASP.NET 4.5. These pages clearly identify which technologies are still available and supported but are no longer part of Microsoft's strategic direction, such that you should think twice before choosing them if you're creating something new. One page provides help setting up SQL Server connection strings.

                          Here are links to the new pages, with a sampling of what you can find in them:

                          • ASP.NET Data Access Content Map
                            • Links to Internet resources for data access in ASP.NET, organized by subject.
                          • Choosing Data Access Options for ASP.NET Web Applications
                            • Choosing a DBMS (choose SQL Server).
                            • Choosing a SQL Server edition for development (choose LocalDB for VS 2012, SQL Server Express for VS 2010)
                            • Choosing an ORM (choose Entity Framework).
                            • Choosing an Entity Framework development workflow (Code First vs. Database First or Model First).
                            • LINQ vs. SQL (use LINQ wherever possible).
                            • Web Forms choices
                              • When to use ListView vs. Repeater vs. GridView controls.
                              • When to use DetailsView vs. FormView control.
                              • When to use TemplateField vs. BoundField vs. DynamicField controls.
                              • When to use the DynamicControl control.
                              • When to use data source controls vs. Web Forms model binding vs. manual data binding.
                              • When to use Eval and Bind vs. Item and BindItem data-binding expressions.
                              • Recommendation to use the new-in-4.5 data-binding expression delimiter (<%#:) for HTML encoding.
                              • When to use Dynamic Data project templates.
                            • When to use Web API vs. WCF
                          • SQL Server Connection Strings for ASP.NET Web Applications
                            • Sample connection strings for LocalDB, SQL Server Express, SQL Server, SQL Database, and SQL Server Compact.
                            • How to convert a connection string from LocalDB to SQL Server Express and vice versa.
                            • How to configure Data Source, AttachDbFileName, Initial Catalog, Integrated Security, MultipleActiveResultSets, and User Instance settings.
                          • Using SQL Server Compact 4.0 for ASP.NET Web Applications
                            • SQL Server Compact's status compared to other SQL Server editions.
                            • Sections on installing, deploying, tools, and migrating to SQL Server.
                          • ASP.NET Data Access FAQ
                            • When to use Microsoft Access in a web application (don't).
                            • Is SQL Server Express OK to use in production (it is).
                            • Can you combine the membership database with your application database (yes).
                            • How to round-trip web projects with databases between Visual Studio 2010 and 2012 (install SQL Server Express 2008 in Visual Studio 2012).

                          *Some of these ASP.NET 4.5 pages existed in ASP.NET 4 or earlier versions, but the content is all new. The others are new pages for ASP.NET 4.5.

                          Feedback is welcome, and this blog is the best place to post comments.

                          -- Tom Dykstra

                          How to use the EntityDataSource control with the DbContext API


                          The EntityDataSource control is designed to work with the ObjectContext API because that's all that existed when the control was created. The DbContext API was introduced later, in Entity Framework 4.1 alongside the Code First development workflow.  Since Code First and DbContext were introduced together you might be accustomed to associating the two, but now in Entity Framework 5.0 the DbContext API is the recommended and default API for all development workflows – Database First and Model First as well as Code First.  So if you want to use the EntityDataSource control, sooner or later you'll want to know how to use it with the DbContext API.

                          Pranav Rastogi has an excellent post on this blog explaining how to do that in a Dynamic Data project (Using Dynamic Data with Entity Framework DbContext). For a regular ASP.NET Web Forms application, the workaround is different. What you have to do is handle the ContextCreating event of the data source and provide the underlying ObjectContext instance that you get from IObjectContextAdapter. For example, here is the .aspx markup for the EntityDataSource control:

                          <asp:EntityDataSource ID="SchoolContextEntityDataSource" runat="server" 
                              OnContextCreating="SchoolContextEntityDataSource_ContextCreating"
                              EntitySetName="Departments">
                          </asp:EntityDataSource>
                          

                          And here is the code-behind that provides the ObjectContext:

                          protected void SchoolContextEntityDataSource_ContextCreating(object sender, EntityDataSourceContextCreatingEventArgs e)
                          {
                              var db = new SchoolContext();
                              e.Context = (db as IObjectContextAdapter).ObjectContext;
                          }

                          IObjectContextAdapter is in the System.Data.Entity.Infrastructure namespace:

                          using System.Data.Entity.Infrastructure;

                          Thanks to Diego Vega for showing me how to do this.

                          -Tom Dykstra


                          An ASP.NET Open Source How-To Decoder Ring


                          As you probably know, ASP.NET MVC, Web API, and Web Pages are available as open source on aspnetwebstack.codeplex.com. If you want to go beyond using the official RTM versions and either use the latest nightly drops, compile it yourself, or create pull requests then here’s a decoder ring for how to get started:

                          1) Using the Nightly NuGet Packages

                          For when you want to try out the latest nightly NuGet packages from our preview feed without having to compile the source.

                          2) Getting Symbols and Source for Nightly Builds

                          For when you run into trouble using the nightly NuGet packages or want to see what is going on. Makes full source and symbols available directly in the Visual Studio Debugger.

                          3) Compiling the source

                          For when you want to compile the source and run unit tests yourself.

                          4) Contributing to the project

                          For when you want to make source contributions back to the project. It explains how to create a fork and make a pull request with your proposed changes.

                          In addition, you can try out the samples which live on aspnet.codeplex.com and of course let us know of any problems by logging issues.

                          Have fun!

                          Henrik

                          Configuring your ASP.NET application for Microsoft OAuth account


                          This post is part of a series about how to enable and use the OpenID/OAuth support that was added to the ASP.NET templates in Visual Studio 2012. In this post I am going to detail the instructions for configuring your application to use a Microsoft account for authentication, focusing on a development environment (using Visual Studio and IIS Express).

                          • Create a new ASP.NET MVC/WebForms or WebPages application  
                            • Build and run the website.

                           

                          • Use a test domain
                            • You need to use a domain other than localhost since Microsoft account cannot redirect back to localhost
                            • Some folks have graciously reserved localtest.me for local testing of domains so you do not have to mess with hosts files. Follow the link for more information
                            • You can use Foo.localtest.me as a test domain. (Make sure Foo is unique. I would recommend prefixing it with something unique such as Foo<MyName>.localtest.me ). For the purpose of this article we will use Foo.localtest.me
                            • Pinging Foo.localtest.me should resolve back to the local machine

                          • Configuring the Microsoft live portal
                            • Open https://manage.dev.live.com/AddApplication.aspx and log in with your Microsoft (Live) credentials
                            • Enter an application name and click I accept (read the terms of use and privacy statement if they concern you!)
                            • In the Live Connect Developer Center, for the app you just created, click the Application Settings page link -> API Settings
                            • In the Redirect Domain field, enter the domain created above, e.g. http://Foo.localtest.me
                            • Hit Save, and the changes should be saved

                          • Configuring the keys in your application
                            • For MVC/WebForms applications
                              • Edit the AuthConfig file in the App_Start folder and uncomment the code for the Microsoft OAuth provider. Copy the client ID and client secret into the uncommented section for the Microsoft login (a sketch follows this list)
                            • For WebPages applications
                              • Edit the _AppStart.cshtml file and uncomment the code for the Microsoft OAuth provider. Copy the client ID and client secret into the uncommented section for the Microsoft login
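                          For reference, the uncommented registration looks roughly like this; the values are placeholders for the client ID and secret obtained from the Live Connect portal.

                          // WebForms (App_Start/AuthConfig.cs):
                          OpenAuth.AuthenticationClients.AddMicrosoft(
                              clientId: "your Microsoft account client id",
                              clientSecret: "your Microsoft account client secret");

                          // MVC and WebPages templates use OAuthWebSecurity instead:
                          // OAuthWebSecurity.RegisterMicrosoftClient(
                          //     clientId: "your Microsoft account client id",
                          //     clientSecret: "your Microsoft account client secret");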

                           

                          • Map the test domain to your application
                            • Now we need to map the domain name to the application created in the first step
                            • For IIS Express
                              • Open applicationHost.config in %Documents%\IISExpress\config
                              • Locate the binding for the web application in the file. It will be defined in the <site> tag. For example
                                • <site name="WebApplication5" id="6">
                                  <application path="/" applicationPool="Clr4IntegratedAppPool">
                                  <virtualDirectory path="/" physicalPath="pathtoapplication\WebApplication5" />
                                  </application>
                                  <bindings>
                                  <binding protocol="http" bindingInformation="*:46178:localhost" />
                                  </bindings>
                                  </site>
                              • Add a new binding for port 80 and the domain name to this web application under the <bindings>
                              • <binding protocol="http" bindingInformation="*:80:Foo.localtest.me " />
                              • Restart IIS Express and relaunch the website. Test the setting by opening Foo.localtest.me in the browser; it should open the web application you created.
                            • IIS (7.0) upwards
                              • Host your application in IIS
                              • Open up IIS Manager locate your web application under the Sites list and select "Bindings" from the action menu on the right. Add a binding for your chosen hostname (e.g. Foo.localtest.me ).
                          • Run the site to see it in action
                            • Make sure you are running VS as an admin for this to work
                            • Now launch the application either hosted in IIS Express or IIS
                            • Browse to the test domain Foo.localtest.me
                            • Navigate to the login page and log in using your Microsoft account credentials. Login should succeed

                          Hope you have fun while integrating Microsoft Account into your applications

                          Cross posted to http://blogs.msdn.com/b/pranav_rastogi/archive/2012/09/19/configuring-your-asp-net-application-for-microsoft-oauth-account.aspx

                          ASP.NET 4.5 ScriptManager Improvements in WebForms


                          The ScriptManager control has undergone some key, targeted changes in ASP.NET 4.5 that make it easier to register, manage, and combine scripts using the ASP.NET Web Optimization feature. This post highlights the changes that have happened to this control.

                          Easy Integration with jQuery and jQuery UI

                          The default templates for WebForms ship with the packages “AspNet.ScriptManager.jQuery” and “AspNet.ScriptManager.jQuery.UI.Combined”. These packages make it easier to bring in the jQuery and jQuery UI libraries and register them with the ScriptManager. Here is how it works.

                          These packages add the following script mappings for jQuery and jQuery UI in the PreApplicationStart method of the application:

                          jQuery

                          string str = "1.7.1";
                              ScriptManager.ScriptResourceMapping.AddDefinition("jquery", new ScriptResourceDefinition
                              {
                                  Path = "~/Scripts/jquery-" + str + ".min.js", 
                                  DebugPath = "~/Scripts/jquery-" + str + ".js", 
                                  CdnPath = "http://ajax.aspnetcdn.com/ajax/jQuery/jquery-" + str + ".min.js", 
                                  CdnDebugPath = "http://ajax.aspnetcdn.com/ajax/jQuery/jquery-" + str + ".js", 
                                  CdnSupportsSecureConnection = true, 
                                  LoadSuccessExpression = "window.jQuery"
                              });

                          jQuery UI

                          string str = "1.8.20";
                              ScriptManager.ScriptResourceMapping.AddDefinition("jquery.ui.combined", new ScriptResourceDefinition
                              {
                                  Path = "~/Scripts/jquery-ui-" + str + ".min.js", 
                                  DebugPath = "~/Scripts/jquery-ui-" + str + ".js", 
                                  CdnPath = "http://ajax.aspnetcdn.com/ajax/jquery.ui/" + str + "/jquery-ui.min.js", 
                                  CdnDebugPath = "http://ajax.aspnetcdn.com/ajax/jquery.ui/" + str + "/jquery-ui.js", 
                                  CdnSupportsSecureConnection = true
                              });

                           

                          These script mappings are registered with ScriptManager as follows:

                          <asp:ScriptManager runat="server">
                                  <Scripts>
                                      <asp:ScriptReference Name="jquery" />
                                      <asp:ScriptReference Name="jquery.ui.combined" />            
                                  </Scripts>
                              </asp:ScriptManager>

                           

                          Now you can enjoy all the benefits of having ScriptManager such as

                          Debug/Release Support

                          If you are debugging your application (debug="true" in web.config), ScriptManager will serve the scripts from the debug path (non-minified scripts), such as “~/Scripts/jquery-1.7.1.js”.

                          CDN Support

                          If you set EnableCdn="true" on the ScriptManager control, all the scripts will be served from the CDN path, such as “http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.7.1.min.js”.
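                          For example, assuming the default registrations shown earlier, the markup would be:

                          <asp:ScriptManager runat="server" EnableCdn="true">
                                  <Scripts>
                                      <asp:ScriptReference Name="jquery" />
                                      <asp:ScriptReference Name="jquery.ui.combined" />
                                  </Scripts>
                              </asp:ScriptManager>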

                          Override Script Mappings

                          Let’s assume you want to override the script mappings to change the CDN path these scripts are served from. You can do so by changing the script mapping before the page is accessed; Global.asax is typically a good place for this.

                          string str = "1.7.1";
                              ScriptManager.ScriptResourceMapping.AddDefinition("jquery", new ScriptResourceDefinition
                              {
                                  Path = "~/Scripts/jquery-" + str + ".min.js", 
                                  DebugPath = "~/Scripts/jquery-" + str + ".js", 
                                  CdnPath = "http://http://code.jquery.com/jquery-"+ str + ".min.js", 
                                  CdnDebugPath = "http://code.jquery.com/jquery-"+ str + ".js", 
                                  CdnSupportsSecureConnection = true
                              });

                           

                          Updating the jQuery/jQuery UI libraries

                          Let’s say a new version of jQuery or jQuery UI comes along. Traditionally you would have to download the packages for these libraries and then update the script references everywhere on your pages. With the integration of the AspNet.ScriptManager.jQuery packages this scenario becomes easy: when you update jQuery/jQuery UI you get corresponding versions of AspNet.ScriptManager.jQuery/AspNet.ScriptManager.jQuery.UI.Combined, which update the script mappings to the version of jQuery that was downloaded, and you do not have to change anything in your application.

                          LoadSuccessExpression

                          This is a new property added to the script mapping, and it addresses the following use case. Imagine you are serving your scripts from a CDN path; if there is an outage in the CDN, your site will be affected because you will not be able to serve any scripts. This property takes an expression that evaluates whether the script was loaded correctly, and if it fails, the script is rendered from the local application path.

                          Here is how the script tag looks for jQuery (assuming you are serving scripts from the CDN):

                          <script src="http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.8.1.js" type="text/javascript"></script>
                          <script type="text/javascript">
                          //<![CDATA[
                          (window.jQuery)||document.write('<script type="text/javascript" src="Scripts/jquery-1.8.1.js"><\/script>');//]]>
                          </script>

                           

                          Remapping Framework scripts

                          One neat improvement in ASP.NET 4.5 is the decoupling of the Microsoft Ajax script files (MicrosoftAjaxCore etc.) and the WebForms scripts (GridView.js etc.). We did the work in this release so you can serve these scripts from your application’s Scripts folder rather than load them from System.Web. This makes the scripts easily redistributable and updatable. The following sections explain how you can use these scripts and combine them with the Web Optimization feature.

                          Serving Framework Scripts from inside the application

                          We created the following two packages, “Microsoft.AspNet.ScriptManager.MSAjax” and “Microsoft.AspNet.ScriptManager.WebForms”, which do the following.

                          They install the scripts into your application so that it looks as follows:

                          fxscripts

                          These packages also create the following script mappings so that the scripts can be registered with ScriptManager as follows.

                          MicrosoftAjax scripts

                          We register all the scripts and bundle them together as “MsAjaxBundle”:

                          ScriptManager.ScriptResourceMapping.AddDefinition("MsAjaxBundle", new ScriptResourceDefinition
                              {
                                  Path = "~/bundles/MsAjaxJs", 
                                  CdnPath = "http://ajax.aspnetcdn.com/ajax/4.5/6/MsAjaxBundle.js", 
                                  LoadSuccessExpression = "window.Sys", 
                                  CdnSupportsSecureConnection = true
                              });
                              PreApplicationStartCode.AddMsAjaxMapping("MicrosoftAjax.js", "window.Sys && Sys._Application && Sys.Observer");
                              PreApplicationStartCode.AddMsAjaxMapping("MicrosoftAjaxCore.js", "window.Type && Sys.Observer");
                              PreApplicationStartCode.AddMsAjaxMapping("MicrosoftAjaxGlobalization.js", "window.Sys && Sys.CultureInfo");
                              PreApplicationStartCode.AddMsAjaxMapping("MicrosoftAjaxSerialization.js", "window.Sys && Sys.Serialization");
                              PreApplicationStartCode.AddMsAjaxMapping("MicrosoftAjaxComponentModel.js", "window.Sys && Sys.CommandEventArgs");
                              PreApplicationStartCode.AddMsAjaxMapping("MicrosoftAjaxNetwork.js", "window.Sys && Sys.Net && Sys.Net.WebRequestExecutor");
                              PreApplicationStartCode.AddMsAjaxMapping("MicrosoftAjaxHistory.js", "window.Sys && Sys.HistoryEventArgs");
                              PreApplicationStartCode.AddMsAjaxMapping("MicrosoftAjaxWebServices.js", "window.Sys && Sys.Net && Sys.Net.WebServiceProxy");
                              PreApplicationStartCode.AddMsAjaxMapping("MicrosoftAjaxTimer.js", "window.Sys && Sys.UI && Sys.UI._Timer");
                              PreApplicationStartCode.AddMsAjaxMapping("MicrosoftAjaxWebForms.js", "window.Sys && Sys.WebForms");
                              PreApplicationStartCode.AddMsAjaxMapping("MicrosoftAjaxApplicationServices.js", "window.Sys && Sys.Services");

                          Here is the Bundle definition for these scripts

                          bundles.Add(new ScriptBundle("~/bundles/MsAjaxJs").Include(
                                          "~/Scripts/WebForms/MsAjax/MicrosoftAjax.js",
                                          "~/Scripts/WebForms/MsAjax/MicrosoftAjaxApplicationServices.js",
                                          "~/Scripts/WebForms/MsAjax/MicrosoftAjaxTimer.js",
                                          "~/Scripts/WebForms/MsAjax/MicrosoftAjaxWebForms.js"));

                           

                          Here is the ScriptManager registration for these scripts:

                          <asp:ScriptManager runat="server" >
                                  <Scripts>
                                      <%--Framework Scripts--%>
                                      <asp:ScriptReference Name="MsAjaxBundle" />
                                      <%--Site Scripts--%>
                                  </Scripts>
                              </asp:ScriptManager>

                           

                          WebForms scripts

                          We register all the scripts and bundle them together as “WebFormsBundle”:

                          ScriptManager.ScriptResourceMapping.AddDefinition("WebFormsBundle", new ScriptResourceDefinition
                              {
                                  Path = "~/bundles/WebFormsJs", 
                                  CdnPath = "http://ajax.aspnetcdn.com/ajax/4.5/6/WebFormsBundle.js", 
                                  LoadSuccessExpression = "window.WebForm_PostBackOptions", 
                                  CdnSupportsSecureConnection = true
                              });

                           

                          Bundle Definition

                          bundles.Add(new ScriptBundle("~/bundles/WebFormsJs").Include(
                                            "~/Scripts/WebForms/WebForms.js",
                                            "~/Scripts/WebForms/WebUIValidation.js",
                                            "~/Scripts/WebForms/MenuStandards.js",
                                            "~/Scripts/WebForms/Focus.js",
                                            "~/Scripts/WebForms/GridView.js",
                                            "~/Scripts/WebForms/DetailsView.js",
                                            "~/Scripts/WebForms/TreeView.js",
                                            "~/Scripts/WebForms/WebParts.js"));

                           

                          ScriptManager registration

The reason we have the Assembly and the Path attributes here is that ScriptManager special-cases these scripts when it tries to load them, so we had to make some special arrangements in the ScriptManager code to make this work. This is a process called deduping: when ScriptManager tries to load these scripts, it can load them either from System.Web or from the Path attribute, and in this case it dedupes the script references and serves the scripts from the Path attribute. This is part of the “special arrangement” we made in ScriptManager. We were very cautious when doing this, since this code path is critical for mainline ScriptManager scenarios that we did not want to regress.

                          <asp:ScriptManager runat="server" >
                                  <Scripts>
                                      <%--Framework Scripts--%>
                                      <asp:ScriptReference Name="WebForms.js" Assembly="System.Web" Path="~/Scripts/WebForms/WebForms.js" />
                                      <asp:ScriptReference Name="WebUIValidation.js" Assembly="System.Web" Path="~/Scripts/WebForms/WebUIValidation.js" />
                                      <asp:ScriptReference Name="MenuStandards.js" Assembly="System.Web" Path="~/Scripts/WebForms/MenuStandards.js" />
                                      <asp:ScriptReference Name="GridView.js" Assembly="System.Web" Path="~/Scripts/WebForms/GridView.js" />
                                      <asp:ScriptReference Name="DetailsView.js" Assembly="System.Web" Path="~/Scripts/WebForms/DetailsView.js" />
                                      <asp:ScriptReference Name="TreeView.js" Assembly="System.Web" Path="~/Scripts/WebForms/TreeView.js" />
                                      <asp:ScriptReference Name="WebParts.js" Assembly="System.Web" Path="~/Scripts/WebForms/WebParts.js" />
                                      <asp:ScriptReference Name="Focus.js" Assembly="System.Web" Path="~/Scripts/WebForms/Focus.js" />
                                      <asp:ScriptReference Name="WebFormsBundle" />
                                      <%--Site Scripts--%>
                           
                                  </Scripts>
                              </asp:ScriptManager>

                           

                          When you run this page you will get script references as follows

                          <script src="/bundles/MsAjaxJs?v=J4joXQqg80Lks57qbGfUAfRLic3bXKGafmR6wE4CFtc1" type="text/javascript"></script>
                          <script src="/bundles/WebFormsJs?v=q9E9g87bUDaS624mcBuZsBaM8xn2E5zd-f4FCdIk2cA1" type="text/javascript"></script>
                              <script type="text/javascript">

                           

So these were the improvements that happened in the ScriptManager control for registering scripts. In case you want to refresh your memory, ScriptManager was greatly enhanced in ASP.NET 4.0, and Dave Reed wrote a fantastic post about it which I strongly recommend reading.
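If you want to apply the same mapping pattern to your own scripts, a minimal sketch might look like the following (the script name, paths, CDN URL, and success expression are hypothetical placeholders, not part of the templates):

// Map a logical name to a local path with a CDN fallback; "SiteScripts",
// the paths, and the success expression below are made-up examples.
ScriptManager.ScriptResourceMapping.AddDefinition("SiteScripts", new ScriptResourceDefinition
{
    Path = "~/Scripts/site.min.js",
    DebugPath = "~/Scripts/site.js",
    CdnPath = "http://cdn.example.com/site.min.js",
    LoadSuccessExpression = "window.MySite",
    CdnSupportsSecureConnection = true
});

You would then reference it from the ScriptManager with <asp:ScriptReference Name="SiteScripts" />, just like the framework scripts above.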

                          This is cross posted on http://blogs.msdn.com/b/pranav_rastogi/archive/2012/09/21/asp-net-4-5-scriptmanager-improvements-in-webforms.aspx

                          WCF Duplex Bi-directional Streaming with WebSocket Transport


Download the DuplexBiDirectionalStreaming VS2012 project: DuplexBiDirectionalStreaming.zip

                          Introduction

With the WebSocket transport, it is possible to use streamed transfer on the duplex callback channel, something that was previously a technical limitation of WCF. New to .NET 4.5 is the NetHttpBinding (and NetHttpsBinding), a new standard binding that leverages the WebSocket transport when it makes sense to do so. You can configure it to always use the WebSocket transport, but by default it will only use it for duplex channel shapes. Why? What this transport gives you is the ability for the server to push data to the client without an incoming request. If your contract is already request-reply, there isn't much efficiency to be gained from the WebSocket transport. But with duplex, there isn't necessarily a 1-1 correspondence between requests and replies. Of course, the binding gives you the ability to change this setting.
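For example, if you do want the binding to use WebSocket transport even for request-reply contracts, you can adjust its WebSocketSettings. A minimal sketch (binding configuration only; hosting and endpoints omitted):

// Requires System.ServiceModel and System.ServiceModel.Channels (for WebSocketTransportUsage).
// The default is WebSocketTransportUsage.WhenDuplex; Always forces WebSocket for every channel shape.
NetHttpBinding binding = new NetHttpBinding();
binding.WebSocketSettings.TransportUsage = WebSocketTransportUsage.Always;
binding.TransferMode = TransferMode.Streamed;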

                          Requirements

WebSocket transport is only possible on the Windows 8 operating system. I'm using Windows 8 for this demo with Visual Studio 2012. IIS needs to be installed, with full support for HTTP Activation and WebSocket transport. To run the service in IIS from Visual Studio, you need to run Visual Studio as an administrator.

                          Walk-Through

First, I created a new Console project in Visual Studio. This will be the client. I'll leave it alone for now, and add a WCF service:

[Screenshot: adding a WCF service to the solution]

                          Next, I’ll define the service contract. I want to be able to upload bytes for as long as I want. Also, I want to be able to download bytes at the same time, and have a way to tell the service to stop sending me bytes. I’ll also put something in there to grab logs from the service implementation. I created this in a new class library so it can be shared between the client and service.

                          Code Snippet
[ServiceContract(CallbackContract = typeof(IClientCallback))]
public interface IStreamService
{
    [OperationContract(IsOneWay = true)]
    void UploadStream(Stream stream);

    [OperationContract(IsOneWay = true)]
    void StartDownloadingStream();

    [OperationContract(IsOneWay = true)]
    void StopDownloadingStream();

    [OperationContract(IsOneWay = true)]
    void DownloadLog();
}

                          Remember, this is duplex, so all the service operations return void. Anything that needs to go to the client goes through the callback channel, defined by this:

                          Code Snippet
public interface IClientCallback
{
    [OperationContract(IsOneWay = true)]
    void ReceiveStream(Stream stream);

    [OperationContract(IsOneWay = true)]
    void ReceiveLog(List<string> log);
}

                          Now, to implement the service. To do this, I’ll need some kind of custom stream implementation. For demonstration purposes, I’ll use a stream implementation that sends random bytes and can be turned on or off manually or configured to send for a given duration. In order to not blow MaxReceivedMessageSize buffers, I’ll make a configurable throttle on it. The code for this FlowControlledStream can be found in the attached project.
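The actual FlowControlledStream is in the attached project; the following is only a rough sketch of the idea, assuming the ReadThrottle, StreamDuration, and StopStreaming members mentioned above (buffering details and thread safety are omitted):

using System;
using System.IO;
using System.Threading;

// Rough sketch only; the real FlowControlledStream ships in the attached project.
public class FlowControlledStream : Stream
{
    private readonly Random random = new Random();
    private readonly DateTime startTime = DateTime.Now;

    public TimeSpan ReadThrottle { get; set; }    // delay between reads
    public TimeSpan StreamDuration { get; set; }  // optional time limit
    public bool StopStreaming { get; set; }       // manual stop switch

    public override int Read(byte[] buffer, int offset, int count)
    {
        bool expired = StreamDuration != TimeSpan.Zero &&
                       DateTime.Now - startTime > StreamDuration;
        if (StopStreaming || expired)
        {
            return 0; // signal end of stream
        }

        if (ReadThrottle != TimeSpan.Zero)
        {
            Thread.Sleep(ReadThrottle); // throttle so buffers stay small
        }

        // Fill only the requested slice with random bytes.
        byte[] chunk = new byte[count];
        random.NextBytes(chunk);
        Buffer.BlockCopy(chunk, 0, buffer, offset, count);
        return count;
    }

    public override bool CanRead { get { return true; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return false; } }
    public override long Length { get { throw new NotSupportedException(); } }
    public override long Position
    {
        get { throw new NotSupportedException(); }
        set { throw new NotSupportedException(); }
    }
    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
    public override void SetLength(long value) { throw new NotSupportedException(); }
    public override void Write(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
}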

See the attached project for the full service implementation, but the new thing here is sending a stream on a callback, which was previously not possible with WCF. Also shown is the implementation of StopDownloadingStream.

                          Code Snippet
public void StartDownloadingStream()
{
    log.Add(string.Format("[{0}] StartDownloadingStream Invoked.", DateTime.Now.Minute + ":" + DateTime.Now.Second + "." + DateTime.Now.Millisecond));
    IClientCallback clientCallbackChannel = OperationContext.Current.GetCallbackChannel<IClientCallback>();
    ThreadPool.QueueUserWorkItem(new WaitCallback(PushStream), clientCallbackChannel);
}

private void PushStream(object state)
{
    IClientCallback pushCallbackChannel = state as IClientCallback;
    localStream = new FlowControlledStream();
    localStream.ReadThrottle = TimeSpan.FromMilliseconds(500);

    pushCallbackChannel.ReceiveStream(localStream);
}

public void StopDownloadingStream()
{
    log.Add(string.Format("[{0}] StopDownloadingStream Invoked.", DateTime.Now.Minute + ":" + DateTime.Now.Second + "." + DateTime.Now.Millisecond));
    localStream.StopStreaming = true;
}

                          With that finished, I need to modify the Web.config that was generated for me. I want to use WebSocket Transport, so I’ll use the standard NetHttpBinding. By default, netHttpBinding will use WebSocket transport if the contract is duplex, so I shouldn’t have to explicitly add that. But I will increase the MaxReceivedMessageSize to a ridiculous value. So, I replaced the generated System.ServiceModel section with the following:

                          Code Snippet
<system.serviceModel>
  <behaviors>
    <serviceBehaviors>
      <behavior name="MyServiceBehavior">
        <serviceMetadata httpGetEnabled="true" httpsGetEnabled="true"/>
        <serviceDebug includeExceptionDetailInFaults="true"/>
      </behavior>
    </serviceBehaviors>
  </behaviors>
  <serviceHostingEnvironment aspNetCompatibilityEnabled="true" multipleSiteBindingsEnabled="true" />
  <bindings>
    <netHttpBinding>
      <binding name="MyBinding" maxReceivedMessageSize="67108864" transferMode="Streamed">
      </binding>
    </netHttpBinding>
  </bindings>
  <services>
    <service behaviorConfiguration="MyServiceBehavior" name="DuplexService.StreamingService">
      <endpoint address="" contract="CommonArtifacts.IStreamService" binding="netHttpBinding" bindingConfiguration="MyBinding" />
    </service>
  </services>
</system.serviceModel>

                          So, after that, I open the service’s Properties page, go to the Web section, and make sure “Use Local IIS Web server” is selected, and I uncheck “Use IIS Express”. When I save that, and allow VS to create the virtual directory, I can browse to the .svc file and get the standard WCF Service help page. Time to work on the Client code.

                          The goal for the client is to prove that we are doing true bi-directional streaming. To do this my client will execute the following:

                          1. Invoke a service operation to start downloading a stream.

                          2. Signal that bytes are being received.

                          3. Upload a stream for a few seconds.

                          4. Signal to the service to stop sending bytes to the receiver.

                          5. Wait for the end of the stream.

                          If the service can’t process the incoming stream while sending bytes out, then it will deadlock. The program is in the attached zip file, but here’s a snipped version:

                          Code Snippet
ClientReceiver receiver = new ClientReceiver();

string address = "http://localhost/DuplexService/DuplexService.svc";
NetHttpBinding binding = new NetHttpBinding();
binding.MaxReceivedMessageSize = 64 * 1024 * 1024;
binding.TransferMode = TransferMode.Streamed;
DuplexChannelFactory<IStreamService> factory = new DuplexChannelFactory<IStreamService>(new InstanceContext(receiver), binding, address);
IStreamService client = factory.CreateChannel();
                          Code Snippet
client.StartDownloadingStream();

Console.WriteLine("[{0}] Waiting for the receiver to start reading bytes.", DateTime.Now.Minute + ":" + DateTime.Now.Second + "." + DateTime.Now.Millisecond);
receiver.ReceiveStreamInvoked.WaitOne();

Console.WriteLine("[{0}] Client Invoking UploadStream, while the receiver is receiving bytes.", DateTime.Now.Minute + ":" + DateTime.Now.Second + "." + DateTime.Now.Millisecond);
FlowControlledStream uploadStream = new FlowControlledStream();
uploadStream.ReadThrottle = TimeSpan.FromMilliseconds(500);
uploadStream.StreamDuration = TimeSpan.FromSeconds(5);
client.UploadStream(uploadStream);

Console.WriteLine("[{0}] Client Invoking StopDownloadingStream.", DateTime.Now.Minute + ":" + DateTime.Now.Second + "." + DateTime.Now.Millisecond);
client.StopDownloadingStream();

Console.WriteLine("[{0}] Waiting on ReceiveStreamCompleted from the receiver.", DateTime.Now.Minute + ":" + DateTime.Now.Second + "." + DateTime.Now.Millisecond);
receiver.ReceiveStreamCompleted.WaitOne();
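The ClientReceiver class is included in the attached project. For reference, here is a rough sketch of what such a receiver might look like, assuming ReceiveStreamInvoked and ReceiveStreamCompleted are ManualResetEvent fields (the real implementation may differ):

using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;

// Rough sketch only; the real ClientReceiver ships in the attached project.
public class ClientReceiver : IClientCallback
{
    public ManualResetEvent ReceiveStreamInvoked = new ManualResetEvent(false);
    public ManualResetEvent ReceiveStreamCompleted = new ManualResetEvent(false);

    public void ReceiveStream(Stream stream)
    {
        Console.WriteLine("[{0}] ReceiveStream invoked.", DateTime.Now.Minute + ":" + DateTime.Now.Second + "." + DateTime.Now.Millisecond);
        ReceiveStreamInvoked.Set();

        // Drain the incoming stream, counting bytes as they arrive.
        byte[] buffer = new byte[4096];
        int totalBytesRead = 0;
        int bytesRead;
        while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
        {
            totalBytesRead += bytesRead;
        }

        Console.WriteLine("[{0}] ReceiveStream read {1} bytes.", DateTime.Now.Minute + ":" + DateTime.Now.Second + "." + DateTime.Now.Millisecond, totalBytesRead);
        ReceiveStreamCompleted.Set();
    }

    public void ReceiveLog(List<string> log)
    {
        foreach (string entry in log)
        {
            Console.WriteLine(entry);
        }
    }
}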

                          Running it produces the following log:

                          [45:21.308] Client Invoking StartDownloadingStream.

                          [45:21.844] Waiting for the receiver to start reading bytes.

                          [45:22.861] ReceiveStream invoked.

                          [45:22.862] Client Invoking UploadStream, while the receiver is receiving bytes.

                          [45:27.873] Client Invoking StopDownloadingStream.

                          [45:27.873] Waiting on ReceiveStreamCompleted from the receiver.

                          [45:28.371] ReceiveStream read 725248 bytes.

                          [45:28.371] Getting the log from the server.

                          The following are the logs from the server:

                          [45:21.854] StartDownloadingStream Invoked.

                          [45:23.866] UploadStream Invoked.

                          [45:27.873] UploadStream Read 528640 bytes.

                          [45:27.874] StopDownloadingStream Invoked.

                          [45:28.373] DownloadLog Invoked.

Comparing the server log and the client log shows that the server read bytes from the client for about 5 seconds while sending bytes to the client at the same time. I made sure to set the ReadThrottle to the same value on both client and service; since the download ran longer than the 5-second upload, the client should read more bytes than the service reads, which it does.

                          Workaround for HTML closing tag problem


                          In Visual Studio 2012, HTML tags will fail to be automatically closed when using the following keyboard layouts:

                          • Canadian Multilingual
                          • Croatian
                          • Czech
                          • Hungarian
                          • Latvian
                          • Polish
                          • Romanian
                          • Serbian
                          • Slovak
                          • Slovenian

                          On these keyboards, the “>” character is typed by pressing AltGr+. or RightAlt+.

Workaround

                          1. Open the Tools/Customize dialog
                          2. Click the “Keyboard…” button
                          3. In the “Show commands containing:” text field, enter: view.autoclose
                            1. The list view will display “View.AutoCloseTagOverride”
                            2. “Shortcuts for selected command” will display: Ctrl+Alt+. (HTML Editor Source View)
                          4. Press the “Remove” button
5. Press the “OK” button
                          6. In the Customize dialog, press the “Close” button


                          Typing an HTML tag will now result in the closing tag being automatically inserted, as in VS 2010.

                          More Details on the “Closing Tag Problem”


                          We are testing a fix for the “closing tag problem” bug, but it will still take some time to deliver the fix to customers. The work-around posted earlier will alleviate the problem now and should not need to be undone once the fix is shipped.

This bug has actually been in the product for several releases, but was previously hidden by another bug. When you begin typing an HTML tag, an IntelliSense completion list is displayed. The earlier bug was that this list blocked keystrokes bound to commands from firing. An obscure command, “AutoCloseTagOverride”, was bound to the key combination Ctrl+Alt+Period in the HTML editor. It turns out that this command is also fired by the key combination AltGr+Period.

                          So, in Visual Studio 2012, this command began firing when users typed the closing “>” of an HTML tag on the ten specific keyboards listed in the previous post, which cancelled the insertion of the closing tag whenever an opening tag was typed.

                          Clearly this was a test hole. Automated tests failed to simulate the problem and manual testing missed this area of risk. Test coverage has been corrected and this won’t occur again.

                          Spell Checker extension for Visual Studio 2012 HTML, ASP.NET, CSS and other files


I have updated the spell checker extension for Visual Studio 2012. You can download it from the Visual Studio Gallery.

                          Spell checker supports text verification in:

                          • HTML and ASP.NET element content and attributes
• HTML style comments: <!-- HTML -->
                          • ASP.NET server side comments: <%-- ASP.NET --%>
                          • JScript, C# and C++ comments: // C++ style comments
                          • CSS and C style comments: /* C style comments */
                          • VB and VBScript style comments: 'This is VB comment

Spell checking is supported in style and script blocks as well as in JS, CS, VB, CSS, CPP and H files. The spell checker is able to detect the lang attribute specified on HTML elements, extract the ISO language and use it to select the appropriate dictionary for the Office spell checking engine.

                          Requirements

Microsoft Visual Studio 2012, any edition except Express.
Microsoft Word 2003, 2007 or 2010. The extension was not tested with the Office 15 preview.

                          How to use Spell Checker

After installing the extension you should see a Spell Checker menu item in the Tools menu. The spell checker is not 'as you type', so you have to click the menu item each time you want to re-check the document.

                           

                           

Spell checker messages also show up in the Error List as informational messages. They are not entered as errors or warnings, so they don't break builds. Right-clicking on the error marker allows you to pick the correct word from a list of suggestions.

                           

                           

The spell checker is able to detect the lang attribute specified on elements, extract the ISO language and use it to select the appropriate dictionary for the Office spell checking engine. You can define the language of comments in non-HTML files by adding the following to the first comment in the file:

                          // <spellcheck-language='fr'

In order to spell check pages in multiple languages you may need to install the appropriate Office Language Pack. If you have never used a particular language dictionary in Word, you have to use it at least once before it becomes available to the Spell Checker extension. Many dictionaries are installed on demand, and if a particular language was never activated in Word, the dictionary may be missing. Open Word, type something in the desired language and run the Word spell checker at least once to make sure it works and the dictionary is installed.

                          Office language options can be found in Word File | Options menu:

                           

                          Customization

You can customize the spell checker behavior by editing the rules.xml file located in the extension install folder (typically under C:\Users\<user_name>\AppData\Local\Microsoft\VisualStudio\11.0\Extensions). You can exclude certain elements and add more rules for attribute checking if you want the spell checker to verify spelling in custom control attributes. All element and attribute names must be in lowercase. You don't have to close the document or Visual Studio after editing the file; it is reloaded every time spell checking is performed.

<?xml version="1.0" encoding="utf-8"?>
<rules>
  <!-- Exclude content of script and style elements from spell check -->
  <exclude name="script" />
  <exclude name="style" />

  <!-- Check 'value' attribute on all elements without a namespace -->
  <element name="*">
    <attribute name="value" />
  </element>

  <!-- Rules in ASP namespace -->
  <namespace name="asp">

    <!-- Check all attributes ending in 'text' as well as the tooltip attribute in all ASP.NET elements -->
    <element name="*">
      <attribute name="*text" />
      <attribute name="tooltip" />
    </element>

    <!-- Special rule for asp:Calendar -->
    <element name="calendar">
      <attribute name="caption" />
    </element>

    <!-- Add more rules for ASP.NET elements here if needed -->
  </namespace>

  <!-- Add rules for custom controls here if needed -->

</rules>

                          Thanks

                          Mikhail Arkhipov


                          BlogEngine.NET and Windows Azure Web Sites


                          The Windows Azure Web Sites team has been hard at work looking at various applications and working with vendors and community contributors to add some great applications to the web sites gallery. If you’re a blogger and you’d like to get started for free with a simple, yet extensible blogging tool, you might want to check this out. Starting this week you can install BlogEngine.NET into a free instance of a Windows Azure Web Site. I’ll walk you through the process in this post.

                          The Gallery

                          Windows Azure Web Sites makes it easy to get started with a new web site, and you can’t beat the price and the ability to upgrade your horsepower whenever you need (or to downgrade whenever you don’t need it). The web application gallery in the Windows Azure portal adds the sugary sweet ability to get up and running with a completely-built and pre-configured application in seconds. In most cases you won’t even need to write a line of code. Think of all those hours of your life you’ve spent configuring a CMS or blogging tool to get it working on your server – the gallery provides point-and-click creation and deployment of many of these types of tools.

                          Below, for instance, you can see how easy it is for me to start building a brand new blog using BlogEngine.NET in Windows Azure.

[Screenshot: selecting BlogEngine.NET in the web application gallery]

                          Installing BlogEngine.NET

Once you select BlogEngine.NET and click the next arrow button, the gallery installer will collect some pretty basic information from you and provision the site. The next step in the installation process asks for your site's name, which will serve as the prefix in [prefix].azurewebsites.net.

[Screenshot: entering the site name during installation]

BlogEngine.NET's default storage mechanism doesn't require a database (though you can, of course, customize your installation later to use SQL if you'd like), so that's basically the last step in the process. Two steps, and a brand new BlogEngine.NET blog will be created and deployed into Windows Azure Web Sites. Below you'll see how the portal reflects the status of the deployment.

[Screenshot: deployment status in the Windows Azure portal]

In a few moments, the site will be completely deployed, and you can open it by clicking on the browse button at the bottom of the portal once you've got your freshly-deployed blog selected.

[Screenshot: the browse button in the portal]

                          You’ll be impressed at how easy the process is when you’re running BlogEngine.NET on Windows Azure Web Sites. In seconds, your new blog will be ready, deployed, and willing to record your every contemplation.

[Screenshot: the freshly-deployed BlogEngine.NET blog]

                          Using BlogEngine.NET

BlogEngine.NET is great, the administrative interface is a breeze, and the whole experience is painless when you install it this way. I've not used the product in a number of years, and though I don't recall the installation and configuration process as a difficult one, it sure as heck wasn't this easy the last time I installed it on a server.

                          The first thing you’ll want to do is to log in as admin (password by default is also admin), according to the BlogEngine.NET installation documentation, and change your password. The administration dashboard is quite simple, and blogging using the product is a breeze.

[Screenshot: the BlogEngine.NET administration dashboard]

As simple as BlogEngine.NET is on the surface, it has a great community and a huge list of extensions and resources available. The extensions page in the BlogEngine.NET administration console should give you an idea of what's available for your use if you're into the idea of customizing the functionality.

[Screenshot: the extensions page in the BlogEngine.NET administration console]

                          So in seconds – it literally took me less than 1 minute on the first try – you can have a brand new blog running in Windows Azure Web Sites, for free or for however much you need whenever you need it, that’s simple to use and extensible with a great plug-in model.

                          Start Simple. Go Big When You Need To

If you haven't tried Windows Azure yet, you can get started right now, for free, and stay free for a year if all you really need is a BlogEngine.NET site. In 5 minutes you'll have your own site up and running, without writing a single line of code or performing any complex configuration tasks. When you pair the ease of setup and deployment that Windows Azure Web Sites and the application gallery provide with the blogging simplicity and elegance that BlogEngine.NET provides, you can't go wrong.

                          Cryptographic Improvements in ASP.NET 4.5, pt. 1


                          I am Levi Broderick, a developer on the ASP.NET team at Microsoft. In this series, I want to introduce some of the improvements we have made to the cryptographic core in ASP.NET 4.5. Most of these improvements were introduced during beta and spent several months baking. When you create a new project using the 4.5 templates baked into Visual Studio 2012, those projects will take advantage of these improvements automatically. The intent of this series is both to explain why the ASP.NET team made these investments and to educate developers as to how they can take maximum advantage of this system.

                          This series will be divided into three posts:

                          1. Background regarding the use of cryptography in ASP.NET 4 (today's post).
                          2. Changes that were introduced in ASP.NET 4.5.
                          3. Usage notes and miscellaneous Q&A.

                          Throughout the series I'll refer to a sample solution. This Visual Studio 2012 solution contains projects that demonstrate many of the core concepts mentioned here. It can be downloaded from http://sdrv.ms/T4aMyg.

                          Background

                          Ever since ASP.NET's inception over a decade ago, the product has consumed cryptography in some form. We have a variety of use cases: ViewState, ScriptResource.axd and WebResource.axd URLs, FormsAuthentication tickets, membership passwords, and more. And for a while, we just assumed that the types in the System.Security.Cryptography namespace solved all our problems automatically, ignorant of the fact that the callers have the ultimate responsibility to call the APIs correctly. This led to MS10-070, whereby attackers exploited the fact that ASP.NET misused these cryptographic primitives and were able to read sensitive files from the web application directory.

                          We quickly released a patch for that issue, but at the same time we realized that we needed to perform a more thorough investigation of cryptographic uses inside ASP.NET. Through a joint effort between members of the ASP.NET security team, the .NET Framework security team, and Microsoft's crypto board, we identified several areas for improvement and set to work drafting changes.

                          A brief digression: auto-generated machine keys

                          Whenever any discussion of cryptography in ASP.NET comes up, the topic of conversation eventually comes around to the <machineKey> element. And the confusion is understandable since the term is overloaded. There are four attributes in particular which are most immediately interesting.

• decryption – An algorithm which performs encryption and decryption using a symmetric key.
• decryptionKey – A hex string specifying the key used by instances of the decryption algorithm.
• validation – An algorithm which generates a message authentication code over some payload.
• validationKey – A hex string specifying the key used by instances of the validation algorithm.

                          The format of decryptionKey and validationKey is as follows:
                          key-format = (hex-string | ("AutoGenerate" [",IsolateApps"] [",IsolateByAppId"]))

                          Normally these keys are expected to be represented by hex strings, but developers can also specify that ASP.NET use auto-generated keys instead of explicitly-specified keys. If an auto-generated key is used, the runtime will automatically populate the registry key HKCU\Software\Microsoft\ASP.NET\4.0.30319.0\AutoGenKeyV4 with a random number generated by a cryptographically-secure RNG. The registry key holds enough random bits for both an encryption key and a validation key to exist side-by-side without overlapping, and the value is itself protected using DPAPI.

                          There is an important consequence of the above: the auto-generated machine key is unique per user (where the user is usually the Windows identity of the worker process) on a given machine. If two web applications are deployed on a machine, and if those applications' process identities are equivalent, then the auto-generated keys will be the same for both applications.

                          Throughout this series, the term "machine key" refers to the tuple of decryptionKey and validationKey. When I refer to the machine key being transformed, envision that the transform is being applied to the decryption and validation keys independently. It is also possible to use an auto-generated decryptionKey with an explicit validationKey and vice versa, but this configuration is discouraged.

                          ASP.NET provides two optional modifiers that can further alter the auto-generated machine key before it is consumed by the application. (The particular transformation mechanism is described later in this post.) The currently supported modifiers are:

                          • IsolateApps – The runtime uses the value of HttpRuntime.AppDomainAppVirtualPath to transform the auto-generated key. If multiple applications are hosted on the same port in IIS, the virtual path is sufficient to differentiate them.
• IsolateByAppId – The runtime uses the value of HttpRuntime.AppDomainAppId to transform the auto-generated key. If two distinct applications share a virtual path (perhaps because those applications are running on different ports), this flag can be used to further distinguish them from one another. The IsolateByAppId flag is understood only by ASP.NET 4.5, but it can be used regardless of the compatibilityMode setting (which will be introduced in tomorrow’s post).

                          If no explicit hex key is specified in an application's configuration file, then the runtime assumes a default value of AutoGenerate,IsolateApps. Thus by default the user's auto-generated machine key is transformed with the application's virtual path, and this transformed value is used as the cryptographic key material.
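For reference, a <machineKey> element that spells out roughly the same defaults explicitly might look like the following sketch (treat the attribute values as illustrative rather than a recommendation):

<!-- Roughly equivalent to the default behavior described above:
     auto-generated keys, isolated per application virtual path. -->
<machineKey decryption="AES"
            decryptionKey="AutoGenerate,IsolateApps"
            validation="HMACSHA256"
            validationKey="AutoGenerate,IsolateApps" />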

                          The world in ASP.NET 4

                          Transforming the auto-generated machine key

                          One notable change to ASP.NET 4's cryptographic pipeline is that we added support for the SHA-2 family of algorithms. This was possible partly due to the fact that Windows XP / Server 2003 were the minimum system requirements for .NET 4, and the latest service pack for both OSes at the time brought native support for SHA-2. We also added a configuration option for specifying the particular algorithms used, so any developer is able to swap in their own SymmetricAlgorithm or KeyedHashAlgorithm-derived types.

                          When the runtime is asked to use auto-generated machine keys in ASP.NET 4, it selects AES with a 192-bit key (this is a holdover from when ASP.NET used Triple DES, which takes a 192-bit key) and HMACSHA256 with a 256-bit key. Consider only the encryption key for now. The auto-generated machine key as retrieved from the registry will provide the full 192 bits of entropy. Assume that those bytes are:

                          ee 1c df 76 16 ed 18 37 70 05 30 a8 17 d0 e6 69 97 65 21 de 00 3b 92 70

                          Remember: the auto-generated key as stored in the registry contains both the encryption and the validation keys. The encryption and validation keys are extracted individually from this registry entry, and the transformation is applied to each independently.

                          Furthermore, recall that if IsolateApps is specified, this key is further transformed before being consumed by the application. The particular manner in which this occurs is that the runtime hashes the application's virtual path into a 32-bit integer, and these 32 bits replace the first 32 bits of the value that we got from the registry. Thus if the application’s virtual path is "/myapp", and if that string hashes to 0x179AB900, then IsolateApps will transform the key read from the registry into:

                          17 9a b9 00 16 ed 18 37 70 05 30 a8 17 d0 e6 69 97 65 21 de 00 3b 92 70

                          The immediate consequence of this is that the 192-bit key used for encryption contains only 160 bits of entropy. (If IsolateByAppId is also specified, then the next 32 bits will be likewise replaced by the hash of the application's AppDomainAppID, and the total entropy is reduced to 128 bits.) It is important to note that neither the application's virtual path nor its application ID is secret knowledge, and in fact the former is trivial to guess since it is often in the URL itself. So if an application is deployed to the virtual path "/" and is using the default behavior of AutoGenerate,IsolateApps, it can be assumed that the first 32 bits of encryption key material are 4e ba 98 b2, which is the hash of the string "/".

                          Transformation of the validation key works in a similar fashion. The default auto-generated key is a 256-bit key for use with HMACSHA256. IsolateApps reduces the entropy to 224 bits, and IsolateApps and IsolateByAppId together will reduce the entropy to 192 bits.

                          Use of key material

                          The particular design of <machineKey> requires that there be only a single set of encryption and validation keys at any given time. There are two implications to this design. The first is that this presents a hardship to organizations which require that cryptographic key material be refreshed on a regular basis. The standard way of doing this is to update the keys in Web.config, then redeploy the affected applications. However, all existing encrypted or MACed data will then be rendered invalid since the framework will no longer be able to interpret existing ciphertext payloads.

                          The second implication has a greater impact on application security but is more subtle to most observers. Things should come more into focus with a bit of exposition.

                          The API FormsAuthentication.Encrypt takes as a parameter a FormsAuthenticationTicket instance, and it returns a string corresponding to the protected version of that ticket. More specifically, the API serializes the FormsAuthenticationTicket instance into a binary form (the plaintext), and this binary form is run through encryption and MACing processes to produce the ciphertext. Typical usage is as follows:

                          var ticket = new FormsAuthenticationTicket("username", false, 30); 
                          string encrypted = FormsAuthentication.Encrypt(ticket);

                          (A similar code path is invoked by FormsAuthentication.SetAuthCookie and other related methods.)

                          In earlier versions of ASP.NET, the ticket serialization routine automatically prepended 64 bits of randomness before outputting fields like the username, creation and expiration dates, etc. Assuming a good RNG, there is a 1 in 256 chance of the first byte being any particular value, such as 0x54. This would normally seem harmless, but...

                          The ScriptResourceHandler (ScriptResource.axd) type provides several services for AJAX-enabled ASP.NET applications. The API is called via ScriptResource.axd?d=xyz, where xyz is ciphertext. ScriptResourceHandler will extract the plaintext and perform some action depending on the value of the first plaintext byte. If this first byte is 0x54, the plaintext payload is dumped to the response. (This behavior is intended to support AJAX navigation on browsers which do not include native support for the feature.)

                          Putting these two details together, one reasons that there is a 1 in 256 chance that an encrypted FormsAuthentication ticket given to a client can be echoed back to ScriptResource.axd for decryption. This highlights the second implication mentioned above: since there is a single set of cryptographic keys for the application, all components necessarily share the same set of keys. All cryptographic consumers within an application need to be aware of each other and differentiate their payloads; otherwise, attackers can start playing the individual consumers off one another. A weakness in a single component can quickly turn into a weakness in an entirely different part of the system.

                          The project FormsAuthScriptResource in the sample solution demonstrates the above problem with forms authentication and ScriptResource.axd. The application mimics logging out and back in several times in succession until ScriptResource.axd accepts the provided forms authentication ticket as valid. Keep in mind that since we fixed this particular bug as part of MS11-100, I have modified this particular project's Web.config such that the application exhibits the old (pre-fix) behavior. This was done strictly for educational purposes, and production applications should never disable any of our security fixes.

                          (It should be noted that the root cause of CVE-2011-3416 was a forms authentication ticket serialization flaw that was privately disclosed to us by a third party. The "payload collision" flaw mentioned above was an internal find by our security team. Since fixing CVE-2011-3416 required us to change the forms authentication ticket payload format anyway, we just piggybacked the payload collision fix on top of it, and the whole package went out as part of the MS11-100 release.)

                          The MachineKey public APIs

                          Finally, I want to discuss the MachineKey.Encode and Decode APIs. These APIs were added in ASP.NET 4 due to high customer demand for some form of programmatic access to the crypto pipeline. A common use case is that the application needs to round-trip a piece of data via an untrusted client and doesn't want the client to decipher or tamper with the data. The easiest way to write such code in 4.0 is:

                          string ciphertext = ...; // provided by client 
                          byte[] decrypted = MachineKey.Decode(ciphertext, MachineKeyProtection.All);

                          As described above, since the same cryptographic keys are used throughout the ASP.NET pipeline, it turns out that the Decode method can also be used to decrypt payloads like forms authentication tickets. Consumers of the Decode method can try to defend against clients passing these payloads through, but it requires developers to be cognizant of the fact that the Decode method can even be abused in this manner.

                          Even more dangerous is the way in which the Encode method is often called:

                          byte[] plaintext = ...; // provided by client 
                          string ciphertext = MachineKey.Encode(plaintext, MachineKeyProtection.All);

                          Now consider what happens if a client provides this payload:

                          01 02 83 c7 3d c6 96 53 cf 08 fe 00 80 30 05 6d f3 d1 08 00 05 61 00 64 00 6d 00 69 00 6e 00 00 01 2f 00 ff

                          This payload happens to correspond to the current serialized forms authentication ticket format (post-MS11-100). The ticket has a username of "admin" and an expiration date of January 1, 2015. If this payload is passed to MachineKey.Encode, and if the resulting ciphertext is then returned to the client, then the client has successfully managed to forge a forms authentication ticket.

                          It should be apparent that these APIs are a double-edged sword. By providing access to the cryptographic pipeline, we are providing developers with a great deal of power, but we are also trusting developers to use the APIs correctly. And therein lies the problem: correct usage requires intimate knowledge of ASP.NET internal payload formats (not just existing formats, but also any format we might add in the future!), and this is simply an onerous and unrealistic expectation. It's a pit of failure, delicately lined with crocodiles and pointy spikes.

                          In tomorrow's post, I'll discuss pipeline changes in ASP.NET 4.5, including new configuration switches and APIs that lend themselves to the pit of success.

                          Cryptographic Improvements in ASP.NET 4.5, pt. 2


                          Thanks for joining us for day two of our series on cryptography in ASP.NET 4.5! In yesterday's post, I discussed how ASP.NET uses cryptography in general, where key material is pulled from and how it is stored, and various problems that the APIs have introduced over the years. In today's post, I'll discuss how we're mitigating those issues using 4.5's opt-in model. The series outline is copied below for quick reference.

                          1. Background regarding the use of cryptography in ASP.NET 4.
                          2. Changes that were introduced in ASP.NET 4.5 (today's post).
                          3. Usage notes and miscellaneous Q&A.

                          Throughout the series I'll refer to a sample solution. This Visual Studio 2012 solution contains projects that demonstrate many of the core concepts mentioned here. It can be downloaded from http://sdrv.ms/T4aMyg.

                          The world in ASP.NET 4.5

                          Please keep in mind that everything discussed in this section is opt-in. New ASP.NET applications created using the 4.5 project templates will get the behavior described here. Existing ASP.NET 4 applications will retain their existing behavior, even when running on a machine with 4.5 installed. See the compatibility section later in this post for more information.

                          Transforming the auto-generated machine key, redux

                          When ASP.NET is configured to use 4.5 machine key compatibility mode, the manner in which keys are generated and transformed changes drastically. I'll discuss the symmetric encryption key transformation in detail, but the same applies to the validation key transformation.

                          For starters, using an auto-generated key causes us to read a full 256 bits of key material from the value stored in HKCU, compared with 192 bits in ASP.NET 4. (Recall from the earlier discussion that the default symmetric encryption algorithm is AES.) Assume that those bytes are:

                          82 7d b1 00 6c 22 4a 21 eb 81 33 1a 1f 85 19 b0 68 1f fb e1 bb 08 be f0 48 4e 27 a9 fe e3 c8 6f

                          This key is not used directly as a cryptographic key in the product. Instead, it is used as the key derivation key (KDK) into a key derivation function (KDF). We internally use the NIST SP800-108 [PDF link] counter-mode KDF with HMACSHA512 serving as the pseudo-random function (PRF), and this KDF has been baked into almost all of the 4.5 core crypto code paths. The label and context inputs are populated with the application virtual path or application ID if IsolateApps or IsolateByAppId is specified. The KDF is then run to generate what we refer to as an application master key. In the above example, the application master key might come out as:

                          db d5 30 f4 9c 5e d7 dc 5e a9 40 a3 dd 0b 06 1a 9d 68 f1 a2 60 34 59 e5 3a a9 83 b0 20 b0 aa 93

                          Note that the algorithm has preserved all 256 bits of entropy, even though IsolateApps or IsolateByAppId may have been involved in the conversion from auto-generated key to application master key. Because we're using a proper KDF, one can envision future support for additional transformation flags, and we could support limitless such flags without further reducing the strength of the auto-generated key. Alternatively, if an explicit key has been specified in Web.config, we'll just use that as the application master key rather than running it through the KDF to generate a master key.
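To make the construction concrete, here is a minimal sketch of a counter-mode KDF in the spirit of NIST SP800-108 with HMACSHA512 as the PRF. This illustrates the general technique only, not ASP.NET's internal implementation; the exact encodings of the counter, label, and context are assumptions.

using System;
using System.IO;
using System.Security.Cryptography;
using System.Text;

static class CounterModeKdfSketch
{
    // Derives 'numBytesRequested' bytes from the key derivation key (KDK)
    // using HMACSHA512 in counter mode, roughly following SP800-108.
    public static byte[] DeriveKey(byte[] kdk, string label, string context, int numBytesRequested)
    {
        byte[] labelBytes = Encoding.UTF8.GetBytes(label);
        byte[] contextBytes = Encoding.UTF8.GetBytes(context);
        byte[] output = new byte[numBytesRequested];
        int bytesWritten = 0;

        using (var hmac = new HMACSHA512(kdk))
        {
            for (uint counter = 1; bytesWritten < numBytesRequested; counter++)
            {
                // PRF input: [counter]_4 || label || 0x00 || context || [requested length in bits]_4
                using (var ms = new MemoryStream())
                using (var bw = new BinaryWriter(ms))
                {
                    WriteUInt32BigEndian(bw, counter);
                    bw.Write(labelBytes);
                    bw.Write((byte)0x00);
                    bw.Write(contextBytes);
                    WriteUInt32BigEndian(bw, (uint)numBytesRequested * 8);
                    bw.Flush();

                    byte[] block = hmac.ComputeHash(ms.ToArray());
                    int bytesToCopy = Math.Min(block.Length, numBytesRequested - bytesWritten);
                    Buffer.BlockCopy(block, 0, output, bytesWritten, bytesToCopy);
                    bytesWritten += bytesToCopy;
                }
            }
        }

        return output;
    }

    private static void WriteUInt32BigEndian(BinaryWriter bw, uint value)
    {
        bw.Write((byte)(value >> 24));
        bw.Write((byte)(value >> 16));
        bw.Write((byte)(value >> 8));
        bw.Write((byte)value);
    }
}

The same kind of construction shows up again later in this post, when purpose strings are used to derive per-consumer keys from the application master key.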

                          Terminology note: the application master key is actually a set of keys: one used for a symmetric encryption algorithm, the other used for a message authentication algorithm. When I mention transforming or consuming them, imagine the transform being done to each in parallel.

                          Of primitives and derived keys

                          In ASP.NET 4's crypto code paths, symmetric encryption and HMAC calculation were kept separate. The APIs were disjointed, and callers had to verify that they were calling into the appropriate primitives in the correct order. (That we left this up to the individual callers rather than putting it in a centralized code path was one of the factors that led to MS10-070.) In ASP.NET 4.5's code paths, these two operations are now inextricable. The caller does not specify if he would like the data to be encrypted or MACed; the code paths automatically handle the correct invocation of both.

                          One immediate implication of the above is that payloads run through the new system can no longer be only MACed or only encrypted; they must be both or neither. Thus even though the default behavior of ViewState is MAC-only, when run through the 4.5 code paths it will always end up being both encrypted and MACed. If ViewState MACing is disabled by setting EnableViewStateMac to false, then ViewState will be afforded no protections.

                          Never set EnableViewStateMac to false in production. Not even for a single page. No exceptions! The EnableViewStateMac switch will be removed in a future version.

                          We also modified all of our internal call sites to pass one additional piece of information to the core crypto routines: their purpose. These purpose objects are basically strings that describe the caller. Some example purposes used internally by ASP.NET are equivalents for "ScriptResource.axd", "FormsAuth ticket", and "ViewState for ~/default.aspx". Importantly, if a cryptographic payload was generated with a particular purpose string, that same string must be provided when trying to verify and decrypt the payload, otherwise the operation will fail.

                          This is implemented internally by using the same NIST SP800-108 KDF mentioned earlier. Recall that each application has an application master key. That master key is itself used as a KDK, and label and context parameters are populated from the provided purpose string. The KDF is then used to generate a new derived key, and this derived key is used to carry out the requested cryptographic operation. Thus even a trivial change to a call site's purpose string will result in the generation of wildly different key material, the end result being that a purpose of "ViewState for ~/about.aspx" cannot be used to decipher a payload protected with "ViewState for ~/default.aspx" or any other non-matching purpose.

                          We expect that this feature alone will provide considerable defense in depth. By isolating the cryptographic consumers from one another we hope to contain any future bugs or flaws that are found in the product. For example, if a bug is found in ScriptResource.axd or ViewState, these changes drastically reduce the risk that the bug can be used by an adversary to attack the FormsAuthentication component.

                          Introducing DataProtector

                          In ASP.NET 4, we provided the ability to replace the symmetric encryption and message authentication algorithms used by the cryptographic pipeline. This is still supported in 4.5, and we will use the specified algorithms when encrypting and MACing data. To specify your own algorithms, change the <machineKey> element like so:

                          <machineKey decryption="alg:typename" validation="alg:typename" />

                          To specify a custom symmetric encryption algorithm, replace typename in the decryption attribute above with the assembly-qualified name of a type subclassing SymmetricAlgorithm. Similarly, to specify a custom message authentication algorithm, replace typename in the validation attribute above with the assembly-qualified name of a type subclassing KeyedHashAlgorithm. We will create instances of these types and set their Key properties as appropriate. (The Key properties are populated with the output of the KDFs.)
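As a hedged illustration, such a configuration might look like the following (the type and assembly names are hypothetical placeholders):

<!-- MyCompany.Crypto types are hypothetical; substitute your own
     SymmetricAlgorithm and KeyedHashAlgorithm subclasses. -->
<machineKey decryption="alg:MyCompany.Crypto.CustomSymmetricAlgorithm, MyCompany.Crypto"
            validation="alg:MyCompany.Crypto.CustomKeyedHashAlgorithm, MyCompany.Crypto" />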

                          In addition to replacing the individual algorithms, ASP.NET now also allows wholesale replacement of the entire crypto pipeline. This functionality is provided by the new .NET 4.5 DataProtector type. Subclassed types implement the abstract ProviderProtect and ProviderUnprotect methods; these methods are responsible for black-box protection of arbitrary data. To use a DataProtector-derived type instead of ASP.NET’s built-in KDF / encrypt-then-MAC logic, specify the type in Web.config:

                          <machineKey applicationName="applicationName" dataProtectorType="typename" />

                          The meaning of these attributes is as follows:

• applicationName – A string which uniquely identifies the application within the domain of all other web applications running on the same host. This value doesn't necessarily have to be secret; using the web application name itself is often appropriate.
• dataProtectorType – The assembly-qualified name of a type subclassing DataProtector. If this attribute is specified, then applicationName must also be specified.

                          The sample solution includes a project DpapiProtectorDemo which demonstrates use of a DataProtector type for protection. That project uses the built-in DpapiDataProtector type, which uses the DPAPI functionality provided by Windows to encrypt and tamper-proof data using keys specific to the current local Windows user account. The DataProtector constructor parameters are populated from a combination of the application name specified in <machineKey> and the purpose string passed to the cryptographic APIs. One benefit to using DPAPI in this manner is that the OS itself is responsible for key management, relieving ASP.NET developers of this need. But DPAPI keys are local to the current user on the current machine, making this technique unsuitable for web farm deployments. There are some ways around this; tomorrow's post contains a section on advanced usage and speaks further to this point.
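For example, opting into the built-in DPAPI-based protector is just a configuration change. A hedged sketch follows; the applicationName value is arbitrary, and the assembly-qualified type name is written from memory, so verify it against your installed framework version:

<!-- applicationName is an arbitrary identifier; the type name should be verified locally. -->
<machineKey applicationName="MySampleApp"
            dataProtectorType="System.Security.Cryptography.DpapiDataProtector, System.Security, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />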

                          Since DataProtector behavior may be tied to a particular Windows user account, the ASP.NET runtime will impersonate the application identity before instantiating and calling into a DataProtector. This impersonation is reverted immediately after the call, restoring any impersonation that existed before the call.

                          There is a caveat to replacing the stack: WebResource.axd and ScriptResource.axd URLs will not go through any configured DataProtector. The payloads have unique caching requirements that require any given plaintext to always result in the same ciphertext, and since DataProtector implementations are expected to include some type of randomness in their protected output this makes DataProtector ill-suited for protecting these particular payloads. WebResource.axd and ScriptResource.axd will honor custom SymmetricAlgorithm or KeyedHashAlgorithm configurations, however, and they will continue to go through the same KDF transformations that the application master keys are otherwise subject to.

                          MachineKey API changes

                          New for ASP.NET 4.5 are additional APIs on MachineKey: Protect and Unprotect. These APIs are similar to the original Encode and Decode methods, but they take advantage of the features mentioned so far in this section:

                          • The caller no longer needs to know whether something needs to be encrypted or MACed. The payload is simply "protected".
                          • The caller can supply one or more purpose strings to isolate this specific consumer from others. The MSDN documentation for MachineKey.Protect provides good guidance for choosing purpose strings.
                          • If a DataProtector is configured, calls to MachineKey.Protect / Unprotect will be routed through the configured DataProtector.

                          When unprotecting data, the same purpose string that was used to protect it must be provided. Consider the following:

                          byte[] plaintext = ...; 
                          byte[] ciphertext = MachineKey.Protect(plaintext, "foo component");
                          byte[] deciphered = MachineKey.Unprotect(ciphertext, "bar component");

                          The last line above will throw a CryptographicException since a different purpose string was used for protection and unprotection, signaling that the decryption is taking place in the wrong context. The end result is that callers gain automatic payload differentiation. The Protect / Unprotect APIs automatically prevent the caller from passing in a purpose string that is used by the ASP.NET runtime itself (e.g., for FormsAuthentication).
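For comparison, a minimal sketch of a correct round trip looks like this (the purpose string "foo component" is just an illustrative label; choose one that describes your call site):

byte[] plaintext = System.Text.Encoding.UTF8.GetBytes("some secret value");
byte[] ciphertext = MachineKey.Protect(plaintext, "foo component");
byte[] deciphered = MachineKey.Unprotect(ciphertext, "foo component"); // same purpose string, so this succeeds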

                          The Encode and Decode APIs have been deprecated, and we intend for these new APIs to be their long-term replacements. We hope that you find these APIs useful, that you feel confident in using them, and that they lend themselves well to the pit of success.

                          Algorithmic implementations

                          Finally, since .NET 4.5 requires Windows Vista / Server 2008 or higher, we are able to make some assumptions that simplify the code and lead to improved performance. One result is that we are able to standardize on CNG almost everywhere. This is beneficial since this layer is where the Windows team intends to add most of their low-level extensibility hooks and focus most of their performance work. For example, many CNG routines contain hand-rolled assembly instructions that take full advantage of the particular instruction sets available on the target processor.

                          Compatibility

                          Opting in or out of the 4.5 code paths

As you might imagine, such drastic changes to the crypto pipeline come at the expense of compatibility. And since .NET 4.5 is an in-place update to .NET 4, we cannot enable these new behaviors by default; otherwise we would run the unacceptable risk of breaking existing applications.

                          To opt in to the new ASP.NET 4.5 behaviors, all that need be done is to set the following in Web.config:

                          <machineKey compatibilityMode="Framework45" />

                          Alternatively, you can set the following switch, which is what the ASP.NET 4.5 project templates do:

                          <httpRuntime targetFramework="4.5" />

                          The above switch is responsible for a slew of runtime behavioral changes, but that is a blog post for another day. The important bit here is that setting the target framework to 4.5 in the <httpRuntime> element automatically implies a default setting of Framework45 for the <machineKey> compatibility mode unless the machine key compatibility mode has been explicitly specified.

                          ASP.NET has historically supported sharing forms authentication tickets between different versions of the framework. This allows tickets to be generated by an application running ASP.NET 2.0 and validated by an application running ASP.NET 4, for example. If you are writing an application targeting ASP.NET 4.5 (you have set <httpRuntime targetFramework="4.5" />) and you need to share tickets with applications running earlier versions of ASP.NET, you must set the following in the 4.5 project's Web.config:

                          <machineKey compatibilityMode="Framework20SP1" />

The value Framework20SP1 is the default machine key compatibility mode for all ASP.NET versions; it has the effect of using the legacy crypto code paths, even if .NET 4.5 is installed on the machine. An existing ASP.NET 4 application that happens to be running on a machine with 4.5 installed will not get the new behaviors automatically, since neither <httpRuntime targetFramework="4.5" /> nor <machineKey compatibilityMode="Framework45" /> would be present in that application's Web.config. If, however, you have created a new application targeting 4.5 (and as such it has those config settings) and need to maintain forms authentication ticket compatibility with existing applications, you can set Framework20SP1 to remain interoperable with earlier versions of ASP.NET.
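As a sketch, such a 4.5 project's Web.config might therefore contain something like the following (the bracketed key values are placeholders; every application sharing tickets must be configured with the same explicit keys):

<system.web>
  <httpRuntime targetFramework="4.5" />
  <machineKey compatibilityMode="Framework20SP1"
              validationKey="[same explicit hex key as the other applications]"
              decryptionKey="[same explicit hex key as the other applications]"
              validation="SHA1"
              decryption="AES" />
</system.web>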

                          Behavioral miscellany

                          The following consumers and APIs will always go through the legacy code paths, regardless of whether the application is otherwise operating in 4.5 machine key compatibility mode:

                          • Membership, if configured to use reversible encryption when storing passwords in the database (please don't do this!). This was done to preserve backward compatibility.
                          • MachineKey.Encode / Decode, since developers may depend on their ability to decode arbitrary payloads that have been generated by other ASP.NET runtime components. It is also conceivable that the output of the Encode method might be persisted, and we did not want to render these payloads unreadable.

                          The following consumers and APIs will always go through the new code paths, regardless of whether the application is otherwise operating in 4.5 machine key compatibility mode:

                          • MachineKey.Protect / Unprotect, as these are new APIs and are not bound by the same compatibility requirements as the legacy APIs.

                          We also apply some restrictions on the <machineKey> configuration when the compatibility mode is set to 4.5:

• The <machineKey> validation value must be SHA1 (which we interpret as HMACSHA1), HMACSHA256, HMACSHA384, HMACSHA512, or a reference to a KeyedHashAlgorithm-derived type. The values AES, 3DES, and MD5, which were allowed in ASP.NET 4, are no longer permitted.
                          • The decryptionKey and validationKey values – if specified – must be well-formed hex strings. The 4.5 routines are stricter about this than the downlevel routines were.

                          Tomorrow's post will provide usage notes and go over some advanced scenarios for these new APIs. I will demonstrate how these APIs can be used with new cryptographic routines in Windows 8 to simplify key management for site administrators. I'll also go over some miscellaneous Q&A regarding this design and some reasons why this particular design was chosen over alternatives.

                          Cryptographic Improvements in ASP.NET 4.5, pt. 3


                          Thanks for joining us for the final day of our series on cryptography in ASP.NET 4.5! Up to now, the series has discussed how ASP.NET uses cryptography in general, including how the pipelines are implemented in both ASP.NET 4 and ASP.NET 4.5. We introduced APIs to give developers fuller control over the cryptographic pipeline and to drive consumers toward a wider pit of success. In today's post, I'll discuss advanced usage scenarios and answer some common questions that we anticipate developers might have. The series outline is copied below for quick reference.

                          1. Background regarding the use of cryptography in ASP.NET 4.
                          2. Changes that were introduced in ASP.NET 4.5.
                          3. Usage notes and miscellaneous Q&A (this post).

                          Throughout the series I'll refer to a sample solution. This Visual Studio 2012 solution contains projects that demonstrate many of the core concepts mentioned here. It can be downloaded from http://sdrv.ms/T4aMyg.

                          Usage notes

We have tried to create a system that leads to success by default. Still, it is good to be aware of the right way to use these systems, and to that end we want to arm you with the knowledge to be successful. You may find the following tips helpful when developing your applications.

                          • Use the 4.5 project templates to get the new runtime behaviors automatically. If you have an existing application which you are considering migrating to 4.5, consider whether opting in to the new behaviors is right for your application.
                          • If possible, move away from the MachineKey.Encode / Decode APIs and call the Protect / Unprotect APIs instead.
                          • If you do call the Protect / Unprotect APIs, we strongly recommend that you provide a purpose string. Remember: MSDN provides good guidance on choosing an appropriate purpose string for your call site.
• Auto-generated keys are stored in the HKCU registry, hence they are tied to the particular user account the web server is running under. If you're running multiple applications on the same box, application isolation matters: without it, the applications can read each other's keys or otherwise affect one another. KB2698981 offers guidance on how to isolate applications on the same system from one another.
• In 4.5 compatibility mode, the auto-generated machine key contains 256 bits of entropy for the symmetric encryption algorithm and 256 bits of entropy for the message authentication algorithm. The KDF can generate an appropriate amount of key material for the algorithm in use, but the KDF cannot increase the amount of entropy. If your application uses an algorithm that takes a longer key, e.g. HMACSHA512, consider using an explicit key instead of an auto-generated key (see the configuration sketch after this list).
                          • In 4.5 compatibility mode, the PRF used in the KDF is HMACSHA512, which has an output size of 512 bits. Configuring the system to use a symmetric encryption or message authentication algorithm which takes keys greater than 512 bits in length will not significantly increase the security of the system. See FIPS PUB 198 [PDF link], section 3 for more information.
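Putting the last two notes together, a sketch of an explicit-key configuration for HMACSHA512 might look like this (the bracketed values are placeholders; generate your own cryptographically random key material, e.g. 64 bytes / 128 hex characters for the HMACSHA512 validation key and 32 bytes / 64 hex characters for an AES decryption key):

<machineKey compatibilityMode="Framework45"
            validation="HMACSHA512"
            validationKey="[128 hex characters of randomly generated key material]"
            decryption="AES"
            decryptionKey="[64 hex characters of randomly generated key material]" />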

                          Advanced usage

                          There is one other project in the solution – AdvancedDataProtectorDemos – which demonstrates using DPAPI:NG for data protection. One advantage of DPAPI:NG is that key management can be pushed to Active Directory, which eliminates the need for the application to maintain this information. (There is a caveat – recall the ScriptResource.axd / WebResource.axd exclusions mentioned yesterday – but these payloads are generally less sensitive than ViewState, forms authentication, and the like.) DPAPI:NG is available on Windows 8 / Server 2012 machines, and AD key management requires that the application identity be a domain identity and that the AD servers be running Windows Server 2012.

                          Other questions

                          Why not use PBKDF2 for key derivation?

                          The .NET Framework has two built-in KDF types. One is PasswordDeriveBytes (PBKDF1), which is insufficient for our needs since PBKDF1 can only derive keys of at most 160 bits in length. The other built-in type is Rfc2898DeriveBytes (PBKDF2). Unfortunately this particular implementation of PBKDF2 is hardcoded to use HMACSHA1. While this is perfectly sufficient for low-entropy input sources such as passwords, it's unacceptable for the high-entropy input sources that we would be providing to it and could actually end up weakening the cryptographic key material. See RFC 2898, appendix B.1.1 for further discussion.

                          At this point we resigned ourselves to the fact that we'd have to implement the KDF in our own layer rather than calling any existing APIs. (Windows provides BCryptDeriveKeyPBKDF2, but that function doesn’t exist on all platforms that ASP.NET 4.5 targets.) We ended up choosing NIST SP800-108 [PDF link] due to the fact that it is designed to derive a new key from an existing high-entropy key rather than from a low-entropy password, and this behavior more closely aligns with our intended usage.
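For the curious, the counter-mode construction from SP800-108 with an HMAC PRF is straightforward to sketch. The code below is an illustrative implementation of that construction only; it is not ASP.NET's internal code, and the way the runtime encodes its Label and Context inputs (for example, from the purpose strings) is not described here.

using System;
using System.IO;
using System.Security.Cryptography;
using System.Text;

namespace KdfSketch
{
    // Illustrative implementation of the NIST SP800-108 KDF in counter mode,
    // using HMACSHA512 as the PRF.
    public static class Sp800_108
    {
        public static byte[] DeriveKey(byte[] masterKey, string label, string context, int numBytesRequested)
        {
            byte[] labelBytes = Encoding.UTF8.GetBytes(label);
            byte[] contextBytes = Encoding.UTF8.GetBytes(context);

            using (var prf = new HMACSHA512(masterKey))
            using (var output = new MemoryStream())
            {
                uint counter = 1;
                while (output.Length < numBytesRequested)
                {
                    // PRF input per SP800-108 (counter mode): [i]_2 || Label || 0x00 || Context || [L]_2
                    using (var input = new MemoryStream())
                    {
                        WriteUInt32BigEndian(input, counter++);
                        input.Write(labelBytes, 0, labelBytes.Length);
                        input.WriteByte(0x00);
                        input.Write(contextBytes, 0, contextBytes.Length);
                        WriteUInt32BigEndian(input, (uint)numBytesRequested * 8); // requested output length, in bits

                        byte[] block = prf.ComputeHash(input.ToArray());
                        output.Write(block, 0, block.Length);
                    }
                }

                // Return the leftmost numBytesRequested bytes of the concatenated PRF output.
                byte[] derivedKey = new byte[numBytesRequested];
                Buffer.BlockCopy(output.ToArray(), 0, derivedKey, 0, numBytesRequested);
                return derivedKey;
            }
        }

        private static void WriteUInt32BigEndian(Stream stream, uint value)
        {
            stream.WriteByte((byte)(value >> 24));
            stream.WriteByte((byte)(value >> 16));
            stream.WriteByte((byte)(value >> 8));
            stream.WriteByte((byte)value);
        }
    }
}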

                          Why not use an authenticated encryption mode of operation?

Authenticated encryption modes of operation (CCM, GCM, etc.) are generally geared toward streaming protocols, and the particular constraints that these modes of operation put on inputs like nonces reflect this. Consider a high-traffic cloud-hosted web site using AES-GCM (see NIST SP800-38D [PDF link]). A purely random nonce would be unacceptable since a high-traffic web site can serve 2^32 requests in mere days, and a (machine ID, invocation ID) tuple-based nonce would likewise not suffice since machines in the cloud may be transient, precluding the ability to uniquely identify machines.

                          However, this problem is not unsolvable. One possible solution is to have a small cluster of machines responsible for cryptographic services and to have the web servers delegate to this, allowing the tuple-based nonce to work. Another solution is to have each server responsible for rolling its own keys at application startup (thus the nonce can be a simple counter since no other machine uses that specific key) and on a regular basis thereafter, and provide each server in the cluster the ability to infer which key was used for a particular payload. We prototyped both of these solutions using custom DataProtector-derived classes, so we are confident that developers have the extensibility hooks necessary to go this route if desired. But when compared to the standard encrypt-then-MAC implementation, authenticated encryption modes of operation proved to be too ill-suited for us to provide an in-box implementation.

                          What about membership?

                          There has been a recent resurgence of discussion about the appropriate way to store passwords on a server, precipitated by breaches of several high-profile sites. Many of these discussions have focused on ASP.NET's built-in routines, like those in SqlMembershipProvider.

We support configuring custom hash algorithms specifically for membership, and there exist demonstrations of plugging PBKDF2 into this pipeline. We could have codified this in SqlMembershipProvider for 4.5, but as it turns out that wouldn't have been terribly useful: the majority of applications using SqlMembershipProvider going forward are likely those that require compatibility with earlier versions of ASP.NET, so they are bound to the algorithms supported by those versions. The Microsoft.AspNet.Providers types, on the other hand, ship out-of-band and so are free to improve on their own schedule rather than being tied to .NET Framework releases. The 4.5 project templates all use Microsoft.AspNet.Providers instead of SqlMembershipProvider.

As always, the best protection is for users to follow good password practices, such as choosing strong passwords and not reusing passwords across sites. If a user chooses a weak password, the server's mitigations act only as a hurdle: they might slow an attacker down, but a sufficiently determined attacker will eventually crack it.

                          Will this work be backported?

                          This work is unlikely to be backported to ASP.NET 4 or earlier. For starters, much of the code we have takes advantage of the fact that we have the full range of .NET 4.5 types at our disposal, and we're also making assumptions about the underlying OS's base capabilities. These assumptions aren't necessarily valid when running on Windows XP / Server 2003. Furthermore, while we believe the new crypto stack is better than the legacy crypto stack, that doesn't mean that the legacy stack is bad. We just thought it would be beneficial to implement an improvement.

                          Does this obsolete protected configuration?

                          Not at all! Protected configuration is still a valuable tool if you want to store cryptographic keys in the application's Web.config but don't want the keys themselves to appear as plaintext within that file. MSDN has a walkthrough on enabling this feature.
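As a quick sketch of how that looks in practice, a section such as <machineKey> can be encrypted in place with the aspnet_regiis tool (the virtual path "/MyApplication" below is a placeholder, and the command is run from the framework installation directory):

aspnet_regiis -pe "system.web/machineKey" -app "/MyApplication" -prov "RsaProtectedConfigurationProvider"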

                          Final words

                          I hope I have successfully conveyed the utility of the cryptographic changes. These posts have demonstrated how the new code paths promote good practices, improve usability, and offer even greater extensibility than before. And we already have consumers of the new APIs: MVC's anti-XSRF helpers and the ASP.NET OAuth package call into the new APIs when running on 4.5. WIF also calls into the new routines in certain cases.

                          Finally, by being transparent with these changes, we hope to offer some insight as to how the team operates when security issues are reported to us. Our goal is not to develop in a vacuum, nor is it just to patch and move on. We want to take the time to learn from our mistakes, revisit our assumptions, and continually improve both ourselves and our product. It allows us – and hopefully you – to be confident in the quality of the framework.

                          Thanks to Angela, Barry, Suha, Nazim, Bob, Tolga, and others for all your feedback! I really appreciate you guys slogging through this. Also, thanks to Brock and Mike for getting some early coverage of the new MachineKey APIs in their blogs. We love our developer community! :)

                          ASP.NET Web Forms Application (2012) Templates on Visual Studio 2010


The release of Visual Studio 2012 included updated templates for ASP.NET Web Forms, MVC and Web Pages. The templates showcase the use of modern standards in HTML5, CSS and JavaScript, and they also showcase social login via Twitter, Facebook, etc. Scott Hanselman has a short video demonstrating social login. While all of this goodness was available to developers with access to VS2012, Web Forms developers using VS2010 SP1 had no way to install these templates.

                          To solve this problem, I created a Visual Studio Extension which will install the templates that we shipped with VS2012.

Note: This extension is by no means supported by Microsoft. It is something I created in my own spare time to make it easier for Web Forms developers using VS2010 SP1 to use the updated templates that we shipped with VS2012.

                           

                          Download and Use the Extension

• If you are using VS2010 SP1, you can follow the steps below; otherwise, you can download the extension (ASPNETWebForms) directly from the VS Gallery
• Open VS2010 SP1
• Click Tools – Extension Manager
• Go to the Online Gallery and search for "aspnet webforms"
• Alternatively, you can search for the extension by its name, "ASPNETWebForms"
• Once you have searched for the extension, it will appear in the search results

                           

• Download and install the extension. The templates will show up in the File – New Project dialog under the category Visual C# – Web
• To use the installed templates, create a project via File – New Project – Visual C# – Web – ASPNETWebFormsApplication


                          • Once the project is created, follow the instructions in the readme.txt file to Build & Run the project

                          http://visualstudiogallery.msdn.microsoft.com/ec79e369-51bd-4212-83d0-71349d038461 has more details about the Extension and also what SKUs are supported.

Please do give feedback for the extension on the VS Gallery regarding any features or bugs.
