ASP.NET Blog

Creating a Custom Scaffolder for Visual Studio


With the release of Visual Studio 2013 last October, we introduced the concept of Scaffolding to Web Application projects.  Scaffolding is the framework on which code generation for MVC and WebAPI is built.  For more information on Scaffolding or the MVC Scaffolders check the following blog post: http://www.asp.net/visual-studio/overview/2013/aspnet-scaffolding-overview.

However, the true potential of the scaffolding framework comes from the new extensibility surface released in Update 2.  With this new functionality, any VSIX can code against the Scaffolding API surface and have its scaffolders added to the Add New Scaffold dialog.  This blog post walks through the creation of a custom scaffolder.

To get started make sure you have the following installed on your machine:

Creating a New Scaffolder Project Using SideWaffle

  1. Create a new project.
  2. Navigate to the C#->Extensibility->SideWaffle node.
  3. Select the “Basic Scaffolder” template.
  4. Enter the desired name for your scaffolder.
  5. Create the project.

[Screenshot: the New Project dialog with the Basic Scaffolder template selected]

Now that your custom scaffolder solution has been created, you will notice that it contains two projects, one named BasicScaffolder1 and the other BasicScaffolder1Extension (with BasicScaffolder1 replaced by the name you gave your solution).  The former is a class library project that will contain most of the code for your custom scaffolder; the latter is the VSIX project that builds the VSIX you will be able to share with others (or upload to the VS Extension Gallery).  The template has also added references to the Microsoft.AspNet.Scaffolding.12.0 and Microsoft.AspNet.Scaffolding.EntityFramework.12.0 DLLs necessary to interface with the scaffolding framework.  Finally, the template has configured your project and solution settings to launch the experimental instance of Visual Studio on F5 so you can test your scaffolder.

Fixing the Metadata

The first step is filling out the metadata of your custom scaffolder.  Two files need to be updated: the source.extension.vsixmanifest in the VSIX project (the one ending in Extension), and the CustomCodeGeneratorFactory class in the class library project.

Within the source.extension.vsixmanifest (which will look very familiar to anyone who has built a VSIX extension before), modify any values you want (these are the values that will show up in the VS Extension Gallery), but leave the “Tags” value unchanged.  Keeping it set to “Scaffolding” allows the VSIX to show up under the Tools->Scaffolders node in the VS Extension Gallery.

In the CodeGeneratorFactory class, there is a static CodeGeneratorInformation field that contains the data displayed in the Add Scaffold dialog.

private static CodeGeneratorInformation _info = new CodeGeneratorInformation(
    // Text for the selection in the main Add Scaffold selection area
    displayName: "Custom Scaffolder",
    // On the right sidebar, this is the description
    description: "This is a custom scaffolder.",
    // On the right sidebar, this is the author
    author: "Microsoft",
    // On the right sidebar, this is the version
    version: new Version(1, 1, 0, 0),
    // On the right sidebar, this is the id value
    id: typeof(CustomCodeGenerator).Name,
    // In the main selection area, this is the icon
    icon: ToImageSource(Resources._TemplateIconSample),
    // These are the right-click gestures this scaffolder will display for
    gestures: new[] { "Controller", "View", "Area" },
    // These are the categories the scaffolder will display in
    categories: new[] { Categories.Common, Categories.MvcController, Categories.Other });

And this is the Dialog where that info will be rendered:

[Screenshot: the Add Scaffold dialog rendering this metadata]

Customizing your Scaffolding UI

With your metadata set, it is time to actually implement your scaffolder.  The class that implements your scaffolder is the CodeGenerator class.  This class implements the ICodeGenerator interface, which specifies two methods, ShowUIAndValidate() and GenerateCode().  These are the two methods that the scaffolding core will use to run your scaffolder.

The ShowUIAndValidate() method is run after your scaffolder is selected from the Add Scaffold Dialog.  This code should launch any UI you wish to show for the end user to provide input into your scaffolder, and then validate both the state of the project and the inputted values before returning.  No changes to the project should be made within this method call.

In the SideWaffle project we have added a basic UI that allows the user to select a .cs class from their current project.  More complex examples can be seen in the Web API and MVC scaffolders shipped with Visual Studio.  Note that in the template, the UI selection is persisted after the dialog is accepted so that the value can be used in the code generation part of the scaffolder.  This is not done by default; if you want the state persisted, you must make sure the values are preserved yourself.

Final Note: There is no requirement to show UI, and you can skip straight to the GenerateCode() method by just returning true.  However, this would not allow any user input into your scaffolder.
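Putting the guidance above together, a minimal sketch of the two interface methods might look like the following.  This is only a sketch: SelectClassDialog, SelectedClassName, and _selectedClassName are illustrative names and not part of the scaffolding API, and it assumes the CodeGenerator base class provided by the template.

```csharp
public class CustomCodeGenerator : CodeGenerator, ICodeGenerator
{
    // Persist the user's selection here so GenerateCode() can use it later.
    private string _selectedClassName;

    public override bool ShowUIAndValidate()
    {
        // SelectClassDialog is a hypothetical dialog shipped in your VSIX.
        var dialog = new SelectClassDialog(Context);
        if (dialog.ShowModal() != true)
        {
            // User cancelled; scaffolding stops, and no project changes were made.
            return false;
        }

        _selectedClassName = dialog.SelectedClassName;

        // Validate the input and project state here before returning.
        return !string.IsNullOrEmpty(_selectedClassName);
    }

    public override void GenerateCode()
    {
        // All project modifications happen here, using _selectedClassName.
    }
}
```

To skip UI entirely, ShowUIAndValidate() can simply return true.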

Writing Your Scaffolding Code Generator

Now that the scaffolder has the user input necessary to do your scaffolding (and you have persisted that data), it is time to modify the project.  There are several types of actions built into the scaffolding framework to help you build your scaffolder; these are in the ICodeGeneratorActionsService interface of the Microsoft.AspNet.Scaffolding.12.0 DLL.  You can of course create your own actions, but the benefit of using the method calls exposed by the scaffolding framework is that they can be rolled back in case any step of the scaffolding fails.  Rolling back returns the project to the state it was in before the scaffolder was invoked.

The subsections that follow cover the most common action services that scaffolder authors will use.  Full documentation of all of these methods can be found in the Object Browser when expanding the scaffolding DLL.

Adding a static file

The most basic action a scaffolder can perform on a project is adding a static file.  The action service the scaffolding framework exposes for this is AddFile(), which takes a DTE Project (which can be obtained from Context.ActiveProject), the relative path of the file in the project, the absolute path to the file on disk, and a Boolean indicating whether the file should be overwritten if it already exists.

Creating a Folder

Pairing with the addition of a static file, the scaffolder can also create a new folder in the project.  Using the AddFolder() method in the action service, you can create a new folder in the targeted project by passing in the Project DTE object and a relative path to where the folder should be added.
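As a sketch, the two actions can be combined inside GenerateCode().  The folder name and file paths below are illustrative, and the final parameter name mirrors the skipIfExists parameter used by AddFileFromTemplate later in this post; check the Object Browser for the exact signatures.

```csharp
public override void GenerateCode()
{
    // Create a "Helpers" folder at the root of the target project.
    this.AddFolder(Context.ActiveProject, "Helpers");

    // Copy a static file shipped with the VSIX into that folder.
    this.AddFile(Context.ActiveProject,
                 @"Helpers\ScaffoldHelper.cs",                // relative path in the project
                 @"C:\MyScaffolder\Files\ScaffoldHelper.cs",  // absolute path on disk
                 skipIfExists: false);                        // overwrite if it already exists
}
```

Because both calls go through the action service, a failure in either one lets the framework roll the project back cleanly.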

Using T4 Templates to add a file or update a file

This action is more interesting than the previous two, as it generates a new file for the targeted project based on a T4 template.  On the surface, this method is similar to the AddFile() method described above, but there are some slight differences.

First, the path to the template file can either be absolute, or relative to the “Templates/YourScaffolderCodeGenerator” folder.  Remember also that the .cs.t4 or .vb.t4 extension is not needed in the path; it will be resolved at execution time to match the language of the project being scaffolded.  If you only wish to support certain languages, go to the IsSupported() method of the CodeGeneratorFactory class and make sure to return false for all unsupported languages.
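For example, to scaffold only C# projects, the IsSupported() override can inspect the active project's code model.  This is a sketch; it assumes the factory method receives the code generation context described elsewhere in this post.

```csharp
public override bool IsSupported(CodeGenerationContext codeGenerationContext)
{
    // EnvDTE's CodeModel.Language is a GUID string identifying the project language.
    var project = codeGenerationContext.ActiveProject;
    return project.CodeModel != null
        && project.CodeModel.Language == EnvDTE.CodeModelLanguageConstants.vsCMLanguageCSharp;
}
```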

Additionally, the method takes in a dictionary of key/value pairs to use as the parameters of the T4 template.

The example below shows how this method is invoked, the T4 template being invoked, and what the generated class is:

public override void GenerateCode()
{
    var parameters = new Dictionary<string, object>()
    {
        { "ClassName", "DemoClass" },
        { "NameSpace", "DemoNamespace" }
    };

    this.AddFileFromTemplate(Context.ActiveProject,
                             "DemoFile",
                             "CustomTextTemplate",
                             parameters,
                             skipIfExists: false);
}

Here is my CustomTextTemplate.cs.t4:

<#@ template language="C#" #>
<#@ assembly name="System.Core" #>
<#@ assembly name="EnvDTE" #>
<#@ import namespace="System.Linq" #>
<#@ import namespace="System.Text" #>
<#@ import namespace="System.Collections.Generic" #>
<#@ parameter name="ClassName" type="System.String" #>
<#@ parameter name="NameSpace" type="System.String" #>

namespace <#= NameSpace #>
{
    public class <#= ClassName #>
    {
        public <#= ClassName #>()
        {
            // This is where you will instantiate this class
        }
    }
}

And the generated code from my Scaffolder is:

[Screenshot: the generated DemoClass.cs file]

Of course that is a very basic T4 template example, but it illustrates how the process works.  In creating your own scaffolders, feel free to make the T4 templates as complex as your scaffolding requires.

Adding NuGet Packages

Another common action that a custom scaffolder may wish to perform is installing a NuGet package (either one included with the VSIX, or one from a remote repository like the NuGet Gallery).  There are three ways to accomplish this.

If you need a NuGet package installed before any of your code generation is done, add the following override in your CustomCodeGenerator class:

public override IEnumerable<NuGetPackage> Dependencies
{
    get
    {
        List<NuGetPackage> packages = new List<NuGetPackage>();
        packages.Add(new NuGetPackage("jquery",
                                      "1.6.4",
                                      new NuGetSourceRepository()));
        return packages;
    }
}

If you just want a NuGet package installed to be able to run your generated code in the user’s project, modify your CreateInstance() method in your CustomCodeGeneratorFactory class:

public override ICodeGenerator CreateInstance(CodeGenerationContext context)
{
    context.Packages.Add(new NuGetPackage("jquery",
                                          "1.6.4",
                                          new NuGetSourceRepository()));
    return new CustomCodeGenerator(context, Information);
}

Finally, you can always install packages directly.  This is useful when you do not want to simply fail if the install fails, but instead follow a different code path in that scenario.  The code below installs the jquery 1.6.4 package into the project running the scaffolder.  It has been wrapped in a try/catch block because we do not want the scaffolder to fail if the required dependency is already installed.  Again, this code goes in the GenerateCode() method of the CodeGenerator class.

NuGetPackage demoPackage = new NuGetPackage("jquery",
                                            "1.6.4",
                                            new NuGetSourceRepository());
try
{
    var nugetService = (INuGetService)Context.ServiceProvider.GetService(typeof(INuGetService));
    nugetService.InstallPackage(Context.ActiveProject, demoPackage);
}
catch (Exception e)
{
    System.Console.Out.Write(e);
}

Next Steps


Now that you have the basics of creating a scaffolder down, here are some additional resources for what to do next:

Additionally you can look to create more complex scaffolders using the following services:

  • ICategoryRegistrationService – to add new Categories in the Add Scaffold Dialog
  • IServiceRegistrar – to add new ActionServices that you can invoke during scaffolding
  • IRollbackService – to make the services registered above be able to use the Scaffolding rollback feature
  • The Scaffolding.EntityFramework dll – to help with the processing of EF models (this is used by the MVC and WebAPI Entity Framework Scaffolders to create the controllers and for MVC the views)

EF Code First Migrations Deployment to an Azure Cloud Service


To deploy a Code First database to an Azure Web Site, you can use the Execute Code First Migrations check box in the Publish Web wizard:

[Screenshot: the Publish Web wizard with the Execute Code First Migrations check box]

When you select that check box, Visual Studio configures the destination web site so that Entity Framework automatically deploys the database or updates it by running the MigrateDatabaseToLatestVersion initializer on application start, as explained in this VS web deployment tutorial.

But if you’re deploying to an Azure cloud service you don’t get to use the Publish Web wizard. What then?

In that case, you have two options:  

  • Write code that executes migrations when the application starts.
  • Write Web.config transforms to configure the MigrateDatabaseToLatestVersion initializer to run.

Write App_Start code to run Migrations

You can run migrations manually from the Application_Start method of Global.asax with the following code:

var configuration = new MvcWebRole.Migrations.Configuration();
var migrator = new DbMigrator(configuration);
migrator.Update();

In this snippet, Configuration is the migrations Configuration class from your Migrations folder. You can optionally specify a connection string to be used for the migration; for an example, see Running & Scripting Migrations from Code on Rowan Miller’s blog.

However, you probably don’t want to call migrations every time the app runs, so a better solution is to use a Web.config file setting or a role environment setting to control it. For example, add the following appSettings key to the Web.config file:

<appSettings>
  <add key="MigrateDatabaseToLatestVersion" value="true" />
</appSettings>

And only run migrations if the setting is true:

if (bool.Parse(ConfigurationManager.AppSettings["MigrateDatabaseToLatestVersion"]))
{
    var configuration = new MvcWebRole.Migrations.Configuration();
    var migrator = new DbMigrator(configuration);
    migrator.Update();
}

With this method you can enable or disable the update-to-latest functionality locally by editing the Web.config file, and in a deployed site by using a Web.config transform. For example, you could set the setting to false for development and use the following code in Web.Release.config to enable it in production:

<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <appSettings>
    <add key="MigrateDatabaseToLatestVersion" value="true" xdt:Locator="Match(key)" xdt:Transform="SetAttributes" />
  </appSettings>
</configuration>

The xdt:Locator attribute specifies that the add element with the same key value is the one to be updated in the deployed Web.config file, and the xdt:Transform attribute specifies that value="true" from the transform file will be used in the destination web site, even if value="false" is in the Web.config file.

(For an Azure cloud service an alternative is to use a setting in the .cscfg file and use RoleEnvironment.GetConfigurationSettingValue to retrieve the setting value.)
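A sketch of that alternative is shown below.  It assumes the Azure SDK's RoleEnvironment class and a cloud service configuration setting with the same name as the appSettings key used above, and falls back to Web.config when not running as a role.

```csharp
using System.Configuration;
using Microsoft.WindowsAzure.ServiceRuntime;

// In Application_Start: read the flag from the .cscfg when running as a
// cloud service; otherwise fall back to Web.config.
string flag = RoleEnvironment.IsAvailable
    ? RoleEnvironment.GetConfigurationSettingValue("MigrateDatabaseToLatestVersion")
    : ConfigurationManager.AppSettings["MigrateDatabaseToLatestVersion"];

if (bool.Parse(flag))
{
    var configuration = new MvcWebRole.Migrations.Configuration();
    var migrator = new System.Data.Entity.Migrations.DbMigrator(configuration);
    migrator.Update();
}
```

A .cscfg setting has the advantage that it can be changed in the Azure portal without redeploying the package.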

Write Web.config transforms to configure MigrateDatabaseToLatestVersion

Another option is to set up Web.config transforms to make the same changes to the Web.config file that Visual Studio does when you click Execute Code First Migrations. For example, the following sample Web.Release.config file will configure the MigrateDatabaseToLatestVersion initializer to run, and the initializer will use a connection string other than the application connection string to update the database schema.

<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="DefaultConnection"
         connectionString="Server=tcp:server.database.windows.net,1433;Database=database;User ID=dbuser@server;Password=password;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;"
         providerName="System.Data.SqlClient" xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
    <add name="DefaultConnection_DatabasePublish"
         connectionString="Server=tcp:server.database.windows.net,1433;Database=database;User ID=dbuser@server;Password=password;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;"
         providerName="System.Data.SqlClient" xdt:Transform="Insert" />
  </connectionStrings>
  <entityFramework>
    <contexts xdt:Transform="Insert">
      <context type="MvcWebRole.Models.ApplicationDbContext, MvcWebRole">
        <databaseInitializer type="System.Data.Entity.MigrateDatabaseToLatestVersion`2[[MvcWebRole.Models.ApplicationDbContext, MvcWebRole], [MvcWebRole.Migrations.Configuration, MvcWebRole]], EntityFramework, PublicKeyToken=b77a5c561934e089">
          <parameters>
            <parameter value="DefaultConnection_DatabasePublish" />
          </parameters>
        </databaseInitializer>
      </context>
    </contexts>
  </entityFramework>
</configuration>

This sample transform code updates the existing DefaultConnection connection string and adds a new one named DefaultConnection_DatabasePublish for the initializer to use. The code also adds to the <entityFramework> element a <contexts> element that specifies the DbContext class and specifies that the MigrateDatabaseToLatestVersion initializer should be used with that class. Note that the new connection string and the <contexts> element have xdt:Transform="Insert" attributes.

The code as shown works with a web project as created with the standard Visual Studio 2013 template for new web projects. If your project is different you can find the code you need as follows:

  1. Create a publish profile for publishing the web project to an Azure web site (not a cloud service).
  2. Select the Execute Code First Migrations check box and publish.
  3. In Server Explorer, expand the node for your web site under the Azure – Web Sites node, expand Files, and double-click the application Web.config file.
  4. You now see what Visual Studio did to enable migrations deployment, and you can copy the connection string element and the <contexts> element, paste them into a transform file, and add the xdt:Transform attributes.

If your application connection string has credentials for a user that has permissions to modify the database schema, you can simplify this code by not adding a new connection string, and by specifying the application connection string in the <parameter> element under entityFramework/context/context/databaseInitializer.

Choosing the right method

For most scenarios adding code to Application_Start is the best choice:

  • When you’re debugging, you can see and step through the code – there’s no magic happening behind the scenes due to Web.config settings.
  • The transform code for appSettings is much simpler than the transform code to configure an initializer. In addition, in Azure it can be even easier to change the setting because you can do it by changing the value of an environment variable.
  • If you ever change your context name, your code that runs migrations doesn’t change. Web.config transform code that configures an initializer would have to change.

An advantage of the Web.config transforms shown above is that they enable you to specify a connection string for the initializer to use for schema-modification work, different from the connection string used by the application itself. You could then implement the security best practice of using credentials that have only the minimum permissions required: CRUD permissions for the application, and database owner for migrations. The blog mentioned earlier shows how to do this for the code method, but of course if you do that you have more work to do in the Web.config and transform files as well as in code, so you lose some of the relative simplicity of the code method. See the discussion about multiple connection strings in ASP.NET Web Deployment using Visual Studio: Deploying to Test.

Summary

When you select the Execute Code First Migrations check box in the Publish Web wizard, Visual Studio automatically configures the destination Web.config file for migrations deployment. This post has shown some alternatives for accomplishing migrations deployment when Visual Studio doesn’t display the wizard or the check box.

Thanks to Rowan Miller, Sayed Hashimi, and Brady Gaster for providing some of the code and content for this post and reviewing it.

Intellisense for JSON Schema in the JSON Editor


In the previous post, we introduced our new JSON editor in the CTP 2 release of Visual Studio 2013 Update 2. In the RC version of Visual Studio 2013 Update 2, we added IntelliSense support for JSON Schema v3 and v4. This will make working with complex JSON structures based on a schema much easier.

 

Specify the schema within a JSON file using the “$schema” property

When you add a new JSON file to your project, the schema dropdown box will show <No Schema Selected>.

[Screenshot: the schema dropdown showing <No Schema Selected>]

You can specify the schema to be used for this JSON file by defining the “$schema” property as shown below

[Screenshot: a JSON file specifying the “$schema” property]
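For instance, a JSON file that binds itself to a schema might start like this.  The schema URL is just one example from the common-schema catalog; any absolute URL or relative path works here.

```json
{
  "$schema": "http://json.schemastore.org/bowerrc",
  "directory": "bower_components"
}
```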

You need to close and re-open the file; the JSON editor will then read the $schema property, add an entry to the schema dropdown list, and select it.

[Screenshot: the schema dropdown with the $schema entry selected]

Note that the JSON editor will load and associate the schema defined in the $schema property only once for a new JSON file. If you change the dropdown selection to “<No Schema Selected>” and specify a new schema in the $schema property, the editor won’t pick up and apply the new schema.

 

Specify the schema in the dropdown textbox

You can also specify the schema for a JSON file using the schema textbox instead of the $schema property, which can be faster.  However, if you pass the JSON file to another user, or use it in a different project, a schema specified in the schema textbox will not stick.  You have to use the $schema property for the JSON editor to pick it up outside of your current project.

You can select a schema from the dropdown list which includes some common schemas to start with.

[Screenshot: the schema dropdown list of common schemas]

You can also type directly into the schema textbox the path to the schema. It can be either a full path or a relative path from the current JSON file, or from the project root. For example, for a project with the following structure:

[Screenshot: an example project structure in Solution Explorer]

All of the following schema paths are equivalent for json1.json:

"/Json/Json2/Schema2.json"

"../Json2/Schema2.json"

"c:\users\<username>\documents\visual studio 2013\Projects\WebApplication1\WebApplication1\Json\Json2\Schema2.json"

Note that using the absolute path won’t work when you publish or use the JSON file on a different machine, so it’s better to use a relative path instead.

Alternatively, you can drag and drop the schema node from Solution Explorer in Visual Studio onto the schema textbox, which is easy and fast.  However, this method only works for a schema within Solution Explorer.

 

Reload a schema

A local schema will only be loaded into memory. If a schema is downloaded from the internet, it is cached locally to improve performance. The schema cache lives at C:\Users\<username>\AppData\Local\Microsoft\VisualStudio\12.0\Schemas\JSON.

We don’t check for and reload an updated schema automatically while the schema is in use. We do reload a schema in certain circumstances, depending on whether it’s a local schema or an online one. A local schema is reloaded into memory when the user starts a new session of Visual Studio. For an online schema, when starting a new session of Visual Studio, we check the schema cache. If the schema is found there, we will use it. If it's not present, we will download the schema again and store it in the cache.

You can also force the JSON editor to reload a schema by appending the “?reload” option to the schema path, for example “/Json/Schema.json?reload”. For a local schema, the current version is reloaded into memory. For an online schema, it is re-downloaded into the cache directory and loaded into memory. Using this option avoids having to start a new Visual Studio session for the JSON editor to pick up an updated version of the schema.

 

Use Hash in a schema path

We also provide the capability to reference a particular structure in a schema using the hash sign (#) in the schema path. For example, suppose you have defined the “there” property in the “Schema2.json” schema as shown in the screenshot below.

[Screenshot: the “there” property definition in Schema2.json]

If you only want to use the “there” object in the schema, you can specify the schema path as “/Json/Json2/Schema2.json#/properties/there”, and you will get the intellisense listing only properties defined in the “there” object.
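A Schema2.json along these lines would make the "#/properties/there" fragment resolve to the inner object; the property names below are illustrative, not taken from the screenshots.

```json
{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "type": "object",
  "properties": {
    "there": {
      "type": "object",
      "properties": {
        "street": { "type": "string" },
        "city": { "type": "string" }
      }
    }
  }
}
```

With the fragment appended, IntelliSense offers only "street" and "city" rather than the schema's top-level properties.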

[Screenshot: IntelliSense listing only the properties of the “there” object]

Helpful Error messages when loading a schema

To help troubleshoot issues when working with a JSON schema, we display errors in the Output window if the JSON editor fails to load the associated schema.

In the following example, the schema is missing a comma separating name/value pairs.

[Screenshot: the Output window error for the missing comma]

In the following example, the schema references the JSON draft 3 schema, but uses the JSON draft 4 schema definition for the “required” property value. Note: If you are following along and change a schema to test the error output, you will need to reload the schema.

[Screenshot: the Output window error for the draft 3/draft 4 mismatch]

 

We would love to hear any feedback or suggestions about the JSON editor as we continue to improve and add more features in subsequent releases.

Why Katana should be on your radar

  • Agility, flexibility and composition: Unlike traditional ASP.NET, Katana decouples components, which are independently updatable via NuGet. Rather than being restricted to the functionality built into your server or framework, you can now compose multiple middleware components and frameworks to get only what you need; for example, self-hosting Web API and SignalR in the same application, guarded by the same security middleware. Components can release independently, which means features and functionality are likely to change more quickly. More frequent change is not universally positive; it can also introduce more problems.
  • Portable: Your application can be portable across servers: IIS, HttpListener, or even something like NOwin (which uses raw sockets in lieu of HttpListener). Some of these may be available on Mono.
  • Footprint: Footprint is not directly a property of Katana, but rather of the fact that Katana can be hosted without going through the normal ASP.NET request processing pipeline (which brings in System.Web and all its dependencies).

Additional Resources

· Introducing ASP.NET Project “Helios” by @LeviBroderick

· Getting Started with the Katana Project MSDN magazine article by @howard_dierking

· Getting Started with OWIN and Katana by Mike Wasson

· An Overview of Project Katana  by @howard_dierking

· Video: The Katana Project - OWIN for ASP.NET by @howard_dierking

· The future is now – OWIN and multi-hosting ASP.NET web applications by @filip_woj

ASP.NET 4.5.2 and EnableViewStateMac


A few months ago, we posted that we were making changes to the way EnableViewStateMac behaves in ASP.NET. I’ll forego the typical blog post ceremony and cut right to the chase:

Starting with ASP.NET 4.5.2, the runtime enforces EnableViewStateMac=true.

If an application sets <%@ Page EnableViewStateMac="false" %> as a directive or <pages enableViewStateMac="false" /> as a config setting, the runtime ignores it and pretends that the developer had written "true" instead.

Currently the .NET Framework 4.5.2 is an optional download on Microsoft Download Center. However, these changes will eventually be pushed out over Windows Update / Microsoft Update. Servers which have opted in to WU / MU servicing will automatically get this new behavior at that time. (At present we have no timeline for when this will be pushed out over WU / MU.)

We’re aware that this change could affect a substantial number of web applications. It is never our intention to break web applications in an in-place update, but we felt it necessary to address this issue head-on due to the prevalence of misinformation regarding this switch and the number of customers who are running with it set to an insecure setting. To see if you need to take any action with regard to your existing ASP.NET applications, please read on.

What if I have EnableViewStateMac set to true everywhere?

Great! No action is required on your part. Nothing will change; your application will continue to work as it always has.

What if I never set EnableViewStateMac to anything, true or false?

No action is required on your part. Nothing will change. The default value for this switch has always been true, so your application will continue to work as it always has.

When is it safe to set EnableViewStateMac=false?

It is never safe to set EnableViewStateMac=false.

Doesn’t setting EnableViewStateMac=true harm performance?

No. The EnableViewStateMac switch has no measurable impact on performance.

What’s the worst that can happen if I set EnableViewStateMac=false?

As called out in the original advisory, an attacker may be able to leverage this setting to upload and execute arbitrary code on the web server. This would grant her complete control over the web application subject to the permissions of the web worker process.

Can I set EnableViewStateMac=false if I also set EnableViewState=false?

No. It doesn’t matter if the application uses EnableViewState=false or if the application requires authentication. It is categorically insecure to set EnableViewStateMac=false, even on a single page.

What about cross-page posts?

In 4.5.2, the ASP.NET framework special cases cross-page posts to minimize the chance of them causing errors at postback time. However, setting <form action="some-different-page.aspx" /> has never been the recommendation for cross-page posts in WebForms. Consider using PostBackUrl instead to make this scenario work. See http://msdn.microsoft.com/en-us/library/ms178140(v=vs.100).aspx for more information.
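As a sketch of the recommended approach (the page and control names are illustrative), let the button perform the cross-page post via its PostBackUrl property instead of overriding the form's action attribute:

```aspx
<%-- Posts the current page's form data to TargetPage.aspx
     via a WebForms cross-page postback. --%>
<asp:Button ID="SubmitButton" runat="server"
            Text="Submit"
            PostBackUrl="~/TargetPage.aspx" />
```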

What if my servers are in a web farm and I don’t want to sync <machineKey> elements?

You are required to create a <machineKey> element and synchronize it across machines in your farm. See KB 2915218, Appendix A for full instructions on how to generate a <machineKey> element. That appendix contains a block of code that you can copy and paste into a PowerShell window. This will generate a <machineKey> element locally that you can put into your Web.config.
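The element you generate and synchronize has the following general shape. The key values here are truncated placeholders, not usable keys; generate real ones with the KB 2915218 script.

```xml
<system.web>
  <machineKey validationKey="0123456789ABCDEF...128 hex chars..."
              decryptionKey="FEDCBA9876543210...64 hex chars..."
              validation="HMACSHA256"
              decryption="AES" />
</system.web>
```

The same element, with identical key values, must appear in the Web.config of every server in the farm.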

Note: these keys are sensitive. Anybody who has access to them could end up with full administrative control over your web application. For this reason we suggest that you never use an online “click here to generate a <machineKey> element” utility. Only ever use <machineKey> elements that you generated yourself on your own machine. And use caution not to leak these keys in public places like online forums.

What if I encounter MAC validation errors when I set EnableViewStateMac=true?

We updated the “validation of view state MAC failed” error message to contain a link to KB 2915218, which lists common reasons that the MAC verification step might fail. Check that article to see if it calls out an item specific to your scenario.

Are there any other behavioral changes as a result of this?

Yes, there is a minor behavioral difference as a result of the EnableViewStateMac change. If a __VIEWSTATE form field is written out to the response, it will now be accompanied by a new form field <input type="hidden" name="__VIEWSTATEGENERATOR" ... />. This new form field is used by the ASP.NET runtime to help determine whether a postback is going to the same page or is cross-page. It’s similar in concept to the __PREVIOUSPAGE form field that is written when a control’s PostBackUrl property is set.

What about ASP.NET 2.x / 3.x?

At this time we’re not pushing out an “enforce EnableViewStateMac=true” change over Windows Update / Microsoft Update for these framework versions. However, based on feedback we receive from the 4.5.2 changes, we may eventually push this change out for earlier versions as well.

If you have an existing ASP.NET 2.x / 3.x application and would like to opt-in to EnableViewStateMac=true enforcement in the runtime, see KB 2905247.

Further resources

If you have any questions about this change, please let us know. Thanks, and happy coding!

Microsoft .NET Framework 4.5.2: announcement | installer | new features | known issues | breaking changes

Announcing new Web Features in Visual Studio 2013 Update 2 RTM


Today, the Visual Studio team announced the release of the RTM version of Visual Studio 2013 Update 2. Our team added a few useful features and fixed some bugs in this update to improve the web development experience. This post covers all of the features introduced in the RC plus a few updates; we will have future posts that discuss some of the new features in detail. The release notes contain more details.

We added the following new features in the RTM release since the RC release; they are described in detail in the corresponding sections.

clip_image001[9]

The following is a list of new web features for Visual Studio 2013 Update 2 RTM.

New Sass project item and editor

We added LESS in VS2013 RTM, and we now have a Sass project item and editor.  Sass editor features are comparable to the LESS editor's and include the following: colorization, variable and mixin IntelliSense, comment/uncomment, quick info, formatting, syntax validation, outlining, goto definition, color picker, tools option setting, and more.

clip_image002[5]

clip_image003

New JSON project item and editor

We have added a JSON project item and editor to Visual Studio.  Current JSON editor features include colorization, syntax validation, brace completion, outlining, tools option setting, format, and more.

IntelliSense now supports JSON Schema v3 and v4. There is a schema combo box to choose existing schemas or edit the local schema path, or you can simply drag and drop a project JSON file into the editor to get the relative path.

clip_image004

clip_image006

Create remote Azure resources option when creating a new Web project

We added an Azure “Create remote resources” checkbox on the new web application dialog.  By choosing it, you get an integrated experience for creating a new web project, creating a new Azure web site or VM, and creating a publish profile.

clip_image008

clip_image009

We also support remote debugging and remote viewing and editing of Azure web site content files.

A new dialog to trust the IIS Express SSL certificate

To eliminate the security warning when browsing and debugging HTTPS on localhost, we added a dialog that allows Internet Explorer and Chrome to trust the self-signed IIS Express SSL certificate.

For example, a web project can be set to use SSL. Press F4 to bring up the Properties window, change SSL Enabled to true, and copy the SSL URL.

clip_image010

Set the web project property page web tab to use the HTTPS URL (The SSL URL will be https://localhost:44300/ unless you've previously created SSL Web Sites.)

clip_image011

Press CTRL+F5 to run the application. Follow the instructions to trust the self-signed certificate that IIS Express has generated.

clip_image012

Read the Security Warning dialog and then click Yes if you want to install the certificate representing localhost.

clip_image013

The site will be shown in IE or Chrome without the certificate warning in the browser.

clip_image014

Firefox uses its own certificate store, so it will display a warning.

ASP.NET Scaffolding

The MVC Scaffolder will generate dropdowns for Enums. This uses the Enum helpers in MVC.

We updated the EditorFor templates in MVC Scaffolding so they use the Bootstrap classes.

The MVC and Web API scaffolders will add the 5.1 packages for MVC and Web API.

Here are some screen shots when scaffolding models with Enum.

Model code:

clip_image015

Compile, and then click Add New Scaffolded Item…

clip_image016

Choose MVC5 Controller with views, using Entity Framework:

clip_image018

Add Controller using the model:

clip_image019

Check the generated code; for example, Views/WeekdayModels/Edit.cshtml contains @Html.EnumDropDownListFor.

clip_image021

Run the page to see the generated enum combo box. Notice that if a value can be null, an empty string can be chosen in the combo box. For example, the Create page shows the following:

clip_image022

One ASP.NET Template changes

We updated ASP.NET templates to support Account Confirmation and Password Reset.

We updated the ASP.NET Web API template to support authentication using On Premises Organizational Accounts.

The ASP.NET SPA template now contains authentication that is based on MVC and server side views. The template has a WebAPI controller which can only be accessed by authenticated users.

LESS editor improvements

We added features including nested media queries, named parameter support, support for selector interpolation, support for semicolons as parameter separators, goto definition for @import, and goto definition of variables and mixins.

Knockout IntelliSense upgrade

We added a non-standard Knockout syntax for Visual Studio IntelliSense, “ko-vs-editor viewModel:”.  It can be used to bind to multiple view models on a page using comments in the form:

Code Snippet

<!-- ko-vs-editor viewModel: <any javascript expression that evaluates to an object> -->

<!-- /ko-vs-editor -->

clip_image023
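As a sketch with hypothetical view model names, two regions of one page can each get IntelliSense for their own view model; the comments affect only the editor, not runtime binding:

```html
<!-- ko-vs-editor viewModel: window.cartViewModel -->
<span data-bind="text: itemCount"></span>
<!-- /ko-vs-editor -->

<!-- ko-vs-editor viewModel: window.profileViewModel -->
<span data-bind="text: userName"></span>
<!-- /ko-vs-editor -->
```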

We also added support for nested ViewModel IntelliSense, so you may drill into deeply nested objects on the ViewModel. 

<div data-bind="text: foo.bar.baz.etc" />

The IntelliSense displayed is the full IntelliSense of the JavaScript object.

clip_image024

New URL Picker in HTML, Razor, CSS, LESS and Sass documents

VS 2013 shipped with no URL picker outside of Web Forms pages.  The new URL picker for the HTML, Razor, CSS, LESS and Sass editors is a dialog-free, fluent typing picker that understands ‘..’ and filters file lists appropriately for images and links.

clip_image025

clip_image026

clip_image027

Browser Link New Features

Browser Link added updates for:

· HTTPS connections (shown in the Dashboard with other connections, as long as the certificate is trusted by the browser).

· Static HTML source mapping

· SPA support for mapping data

· Auto-update mapping data

ASP.NET Web Forms

The Web Forms templates now show how to do Account Confirmation and Password Reset for ASP.NET Identity.

The EntityDataSource control and the Dynamic Data Provider were updated for Entity Framework 6. For details please see Announcing the release of Dynamic Data provider and EntityDataSource control for Entity Framework 6.

ASP.NET MVC 5.1.2, ASP.NET Web API 2.1.2 and ASP.NET Web Pages 3.1.2 are included

We announced ASP.NET MVC 5.1, ASP.NET Web API 2.1 and ASP.NET Web Pages 3.1 in January.  We integrated that release with some minor 5.1.1 bug fixes into VS 2013 Update 2 RC and RTM. 5.1.2 contains the same binaries plus localization for IntelliSense usage.

ASP.NET Identity

We integrated Microsoft.AspNet.Identity 2.0.0 RTM into the new project templates, which includes two-factor authentication, account lockout, account confirmation, password reset, security stamp, user account delete, extensibility of primary key for users and roles, and more.

Entity Framework

We integrated Entity Framework 6.1.0 RTM into the new-project templates. 

Microsoft OWIN Components

We integrated the latest stable version (2.1.0) of OWIN components into the new-project templates.  OWIN 2.1.0 supports Google OAuth2 authentication and static file server.

NuGet

NuGet 2.8 RTM is included in this release.  You can always get the latest NuGet extension for Visual Studio through the menu “Tools->Extensions and Updates…”.

ASP.NET SignalR

We included the 2.0.2 NuGet package for SignalR.  Please look at the release notes for more detailed information: https://github.com/SignalR/SignalR/releases/tag/2.0.2

Summary

We hope you can evaluate these new features and let us know about any bugs and suggestions.  For VS features, please use Connect to submit bugs, ASP.NET UserVoice to submit and vote for suggestions, and the ASP.NET Forums for Q&A.  You can also visit the following open source sites to leave suggestions and open issues directly:

Enabling the .NET Compiler Platform (“Roslyn”) in ASP.NET applications


The .NET languages team recently announced the availability and open sourcing of a public preview of “Roslyn”, the new .NET Compiler Platform. This is the long awaited .NET “compiler as a service” that represents the future of languages and compilation for .NET. You can download a preview including new compilers and Visual Studio tooling that enable you to explore upcoming features in the languages and Visual Studio editor, but how do you use these new features in an ASP.NET application?

Compilation in ASP.NET applications

First, let’s take a moment to revisit compilation in the context of ASP.NET applications. There are two types of compilation that typically take place in an ASP.NET application: project compilation and runtime compilation.

Project compilation

This type of compilation is used by ASP.NET applications built using the project system approach, i.e. they have a .csproj/.vbproj file. Project compilation typically takes place in Visual Studio via the project system and msbuild. It compiles any C# or VB files in your project and produces an assembly in the project’s \bin folder. This assembly is then deployed to your web server along with your application and is loaded by the ASP.NET runtime when your application starts. Note this assembly could also be built from the command line using msbuild directly, e.g. when on a continuous integration or build server.

Assets such as page, user control, and handler code-behind classes, controllers and embedded resources are built using project compilation. Typically the source code for these assets is not deployed to the server. Other asset types, including .aspx, .ascx, .ashx, .cshtml, and .vbhtml, are deployed to the server and built by the ASP.NET runtime itself using runtime compilation.

Runtime compilation

ASP.NET includes a full runtime compilation pipeline that compiles many of your application’s assets on the web server while the application is running (hence “runtime compilation”). Even if you’re using the project compilation model detailed above, part of your application will be compiled using ASP.NET’s runtime compilation feature. If you’re using the “Web Site” model for your application (you chose File –> New Web Site… in Visual Studio and your application doesn’t have a .csproj/.vbproj file), your application is built exclusively using ASP.NET’s compilation system. In this model, your application will contain shared code files in the App_Code folder.

Page assets in your application such as .aspx, .ascx and .cshtml/.vbhtml files, are compiled in the following fashion:

  • The file is parsed and turned into CodeDOM using the configured build provider
  • The CodeDOM graph that represents the file is used to generate a string of C# or VB code using the configured CodeDOM provider (the code generation step)
  • The C# or VB code is then compiled into an assembly using the configured CodeDOM provider (the code compilation step)
  • The assembly is loaded into the application and optionally cached to disk in the Temporary ASP.NET Files folder
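The code generation and compilation steps above are driven through the configured CodeDOM provider. The following is a minimal sketch of the equivalent API calls; it is illustrative only, since ASP.NET performs these internally against the real page graph:

```csharp
using System.CodeDom;
using System.CodeDom.Compiler;
using System.IO;
using Microsoft.CSharp;

CodeDomProvider provider = new CSharpCodeProvider();

// Code generation step: turn a CodeDOM graph into C# source text.
// (A real graph comes from the build provider; this one is trivial.)
var compileUnit = new CodeCompileUnit();
using (var writer = new StringWriter())
{
    provider.GenerateCodeFromCompileUnit(compileUnit, writer, new CodeGeneratorOptions());
}

// Code compilation step: compile generated source into an assembly.
CompilerResults results = provider.CompileAssemblyFromSource(
    new CompilerParameters { GenerateInMemory = true },
    "public class GeneratedPage { }");
// results.CompiledAssembly is what ASP.NET then loads (and may cache to disk).
```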

You can read more about ASP.NET’s compilation system on MSDN.

Runtime pre-compilation

To confuse matters slightly, you can elect to pre-compile the portions of your application that use the ASP.NET runtime compilation system before deployment, so that the compilation doesn’t take place at runtime. This is achieved using the ASP.NET Compilation Tool, aspnet_compiler.exe. This will produce assemblies that can be deployed along with your application in the \bin folder and will remove the cost associated with compiling those assets when the application starts up. The tool is simply a wrapper around the runtime compilation feature and doesn’t use msbuild in any way. It will only compile the portions of your application that would’ve been compiled at runtime by ASP.NET, so if you’re using the project compilation model from above, you’ll still need to compile your application’s project using Visual Studio or msbuild.
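For example (paths are illustrative), pre-compiling an application from the command line looks like this, where -p points at the physical source directory, -v is the application's virtual path, and the last argument is the output directory:

```cmd
aspnet_compiler.exe -p C:\src\MyWebApp -v / C:\build\MyWebApp.Precompiled
```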

You can read more about ASP.NET pre-compilation on MSDN.

Why Roslyn compilation in ASP.NET?

Enabling the new Roslyn compilers in your ASP.NET application will result in two main benefits:

  1. Support for new language features
  2. Potentially improved application startup/pre-compilation time

The first item is pretty self-explanatory. The second should be particularly helpful for customers with very large, complex ASP.NET applications. In our testing of a suitably large and complex application (>600 assemblies in \bin, >500 user controls & pages), the runtime compilation cost at startup/pre-compilation dropped from ~15 minutes to ~70 seconds after enabling the new CodeDOM providers detailed below.

Enabling Roslyn compilation in ASP.NET

To properly enable Roslyn compilation in ASP.NET you need to do two things:

  1. Install the .NET Compiler Platform (“Roslyn”) End User Preview on your developer machine (where you use Visual Studio or build your application’s project file)
  2. Install the new CodeDOM Providers for .NET Compiler Platform (“Roslyn”) NuGet package into your ASP.NET application. In the Package Manager Console in Visual Studio type: install-package Microsoft.CodeDom.Providers.DotNetCompilerPlatform

The first item will cover the project compilation portion of your ASP.NET application. The second item will cover the runtime compilation portion.

Announcing the first preview of new CodeDOM Providers for .NET Compiler Platform (“Roslyn”)

You can now get the first preview release of new CodeDOM providers that use Roslyn from NuGet in the Microsoft.CodeDom.Providers.DotNetCompilerPlatform package. Install the package into your application using NuGet in Visual Studio. These are a drop-in replacement for the in-box providers and installing the package is all you should have to do to enable them (the package will modify your web.config file for you). That said, if you want the best possible startup time, you can elect to Ngen and GAC the Roslyn assemblies so that your application doesn’t have to pay the cost of JIT’ing and loading them at runtime (which can be measurable given the size of the Roslyn assembly).
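After installation, the package registers itself in Web.config under <system.codedom>; the entry looks roughly like the following (attributes abbreviated here, and the package writes the exact values for you):

```xml
<system.codedom>
  <compilers>
    <compiler language="c#;cs;csharp" extension=".cs"
      type="Microsoft.CodeDom.Providers.DotNetCompilerPlatform.CSharpCodeProvider, Microsoft.CodeDom.Providers.DotNetCompilerPlatform" />
  </compilers>
</system.codedom>
```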

The new providers work equally well at runtime and when using the ASP.NET Pre-compilation tool, aspnet_compiler.exe. The tool will read your application’s web.config file and load the configured CodeDOM provider so you’ll get benefits regardless of whether you pre-compile your application or let ASP.NET do it on the web server.

The new providers only replace the compilation step of ASP.NET’s runtime compilation feature. Your .aspx, .cshtml, etc. files are still parsed and the C#/VB code is still generated using the same mechanisms as before, but the final compilation now takes place using Roslyn.

The providers are currently version 0.1.0-pre, so it should be obvious these are still very much in the pre-alpha stage. The internal implementation may change before we settle on a 1.0.0-beta but your application shouldn’t know any different. We’ll continue to release previews as the Roslyn team produce updates to the corresponding Roslyn NuGet packages. In the future, we’ll update the ASP.NET project templates in Visual Studio to use the final version of these providers by default.

We’re keen to have customers use them and provide feedback so we can make them the best they can be. We’ve tested the providers with a number of common ASP.NET application frameworks including DotNetNuke, Orchard, and Umbraco.

Known issues in the preview
  • In this first preview, only C# is supported. VB support will be added in a future release
  • Small applications may actually see an increase in application startup time. In our testing this was only a few seconds and in moderately sized applications this is more than overshadowed by the reduction in compilation of the application itself at startup
  • Very large pages containing long spans of HTML with no server blocks (e.g. <%= %>) can cause a stack overflow when debug compilation is enabled, which will crash the application. Note that in our testing it took a page roughly 4x as large as one that would’ve produced a similar issue with the old provider, but in this case it crashes the entire application, whereas the old provider just failed the offending page
  • If you retarget your app to 4.5.1 in Visual Studio after installing the NuGet package, the web.config changes are lost and will need to be reapplied manually

Feedback

If you find any issues or want to ask a question regarding what’s discussed in this article, please leave a comment below, or use the Contact Owners form for the NuGet package.

We hope you enjoy using Roslyn in your ASP.NET applications!

Announcing ASP.NET Session State Provider for Redis Preview Release


Today, we are pleased to announce a preview release of the ASP.NET Session State Provider for Redis. You can use it with your own Redis server or the newly announced Microsoft Azure Redis Cache (Preview).

What is Redis

Redis is an open source, BSD licensed, advanced key-value store. It is often referred to as a data structure server since keys can contain strings, hashes, lists, sets and sorted sets. It’s getting popular in the web development community as a session state store because of its simplicity, rich data structure support and outstanding performance.

Installing

You can download this preview release of the ASP.NET session state provider for Redis (https://www.nuget.org/packages/Microsoft.Web.RedisSessionStateProvider) from the NuGet gallery by running this command:

Install-Package Microsoft.Web.RedisSessionStateProvider -Pre

Get started

Get Redis

Before using the session state provider for Redis, you need to (of course) get a Redis server. There are a lot of ways to get one, depending on your OS and whether your app is in the cloud or on-prem. Today, I’m going to show you two:

In the cloud – Microsoft Azure Redis Cache (Preview)

The newly announced Microsoft Azure Redis Cache (Preview) provides your Redis host in Azure. It’s super easy to use. Just follow this get started article to get one in minutes. After it’s done, get hold of the host name and access key since we need to use them later.

On-prem – Redis on Windows

The Redis project doesn’t directly support Windows. However, Microsoft Open Technologies, Inc. develops and maintains a high quality Windows port. You can install it from NuGet, Chocolatey or download it directly from the project github repository. After the installation, you can find the following two .exe files that we are interested in:

  • redis-server.exe: Run this .exe file and you will have a Redis server running on your machine.
  • redis-cli.exe: We will use this as a command line tool later to check the data in your Redis server. Note: we’ll use it to check the data in Azure Redis Cache as well.

Create a simple project

To get started,

  • Create an ASP.NET Web Application in Visual Studio.
  • Add the RedisSessionStateProvider NuGet package, which will do the following:
    • Add references to the ASP.NET session state provider for Redis assembly and its dependencies.
    • Add the following configuration to system.web section in Web.config file.
Code Snippet
  <sessionState mode="Custom" customProvider="MySessionStateStore">
    <providers>
      <!--
        <add name="MySessionStateStore"
          host = "127.0.0.1" [String]
          port = "" [number]
          accessKey = "" [String]
          ssl = "false" [true|false]
          throwOnError = "true" [true|false]
          retryTimeoutInMilliseconds = "0" [number]
        />
      -->
      <add name="MySessionStateStore" type="Microsoft.Web.Redis.RedisSessionStateProvider" host="127.0.0.1" accessKey="" ssl="false" />
    </providers>
  </sessionState>
  • If you are using Azure Redis Cache, modify host and accessKey with the values you get when you create the Redis Cache. If you are using a local Redis server, leave them as they are.
  • Add line 9 shown below to the Login method in AccountController.cs to add the login time to the session.
Code Snippet
  1. public async Task<ActionResult> Login(LoginViewModel model, string returnUrl)
  2. {
  3.     if (ModelState.IsValid)
  4.     {
  5.         var user = await UserManager.FindAsync(model.Email, model.Password);
  6.         if (user != null)
  7.         {
  8.             await SignInAsync(user, model.RememberMe);
  9.             Session["loginTime"] = DateTime.Now.ToString(); // Add this line
  10.             return RedirectToLocal(returnUrl);
  11.         }
  12.         else
  13.         {
  14.             ModelState.AddModelError("", "Invalid username or password.");
  15.         }
  16.     }
  17.  
  18.     // If we got this far, something failed, redisplay form
  19.     return View(model);
  20. }
  • Run the project.
  • Sign up an account and log in with it.
  • Run redis-cli.exe from the command line to connect to the Redis server and check out the session data by running the following commands; you’ll see the session data is successfully stored in Redis. If you are running the Redis server locally, you don’t need the -h (host) or -a (password) parameter. (You see hex data because we convert the value into bytes before putting it into Redis.) For more information about redis-cli, please refer to http://redis.io/commands.

image
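For reference, the redis-cli session in the screenshot runs commands along these lines (the host name is illustrative, and session key names are derived from your session ID):

```
redis-cli.exe -h yourcache.redis.cache.windows.net -a <your access key>
> keys *
> hgetall "<one of the returned session keys>"
```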

Protect your data and Azure Redis Cache access key when using redis-cli.exe

When you use redis-cli.exe to connect to your Azure Redis Cache, the data and, more importantly, the access key are sent via TCP in clear text, since redis-cli.exe doesn’t support SSL by default. To protect them, follow the instructions below to set up an SSL proxy on your machine for redis-cli.exe.

  • Download and install stunnel.
  • Open the stunnel GUI from your Start screen.
  • Click Menu –> Configuration –> Edit Configuration. This will open the stunnel configuration file.
  • Go to the section named “Example SSL client mode services” and add the following configuration.

Code Snippet
  1. [redis-cli]
  2. client = yes
  3. accept = 127.0.0.1:6380
  4. connect = azrocks.redis.cache.windows.net:6380
  • Click Menu –> Configuration –> Reload Configuration. This will let stunnel load the new configuration.
  • Connect to your Azure Redis Cache using the following command

redis-cli.exe -p 6380 -a <your access key>

Now the TCP traffic from redis-cli.exe goes through a local SSL proxy provided by stunnel.

Configuration details

Now let’s look a little bit deeper into the settings you can use on this session state provider and how they will change its behaviors.

How to connect to Redis server

These are the settings that configure connections to Redis server:

  • host: The IP address or host name of your Redis server. By default it’s localhost.
  • port: The port of your Redis server. By default it’s 6379 for non-SSL and 6380 for SSL (if you are using Azure Redis Cache).
  • accessKey: The password of your Redis server when Redis authorization is enabled. By default it’s empty, which means the session state provider won’t use any password when connecting to the Redis server. If your Redis server is in a publicly accessible network, like Azure Redis Cache, be sure to enable Redis authorization to improve security.
  • ssl: Whether to connect to the Redis server via SSL or not. By default it’s false because Redis doesn’t support SSL out of the box. If you are using Azure Redis Cache, which supports SSL out of the box, be sure to set this to true to improve security.

How session state provider should behave

throwOnError

When we talk to developers about the currently available ASP.NET session state providers, one of the top complaints is that if an error occurs during a session operation, the session state provider throws an exception, which blows up the entire application.

We want to address this in a way that won’t surprise existing ASP.NET session state provider users and, at the same time, provides the ability to opt in to more advanced behaviors. To do this, we introduced the following setting in this session state provider:

  • throwOnError: Whether or not to throw an exception when some error occurs. The default is true.

As you can see, the default behavior is still to throw an exception when an error occurs. This is consistent with the other ASP.NET session state providers we provide, so there won’t be any surprises and your existing code will just work.

If you set throwOnError to false, then instead of throwing an exception when an error occurs, the provider fails silently. If you need to check whether there was an error and, if so, what the exception was, you can use this static property:

Code Snippet
  1. Microsoft.Web.Redis.RedisSessionStateProvider.LastException

This is really cool because

  • If you have existing applications using an existing session state provider, you don’t need to change any code, and you stop the entire application from blowing up for free.
  • You can explicitly check for errors if there were any.
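A minimal sketch of that pattern (the cart object and the logging call are placeholders):

```csharp
// With throwOnError="false", a failed session operation does not throw.
Session["cart"] = cart; // may fail silently

// Check afterwards whether an error occurred and what it was.
var ex = Microsoft.Web.Redis.RedisSessionStateProvider.LastException;
if (ex != null)
{
    Logger.Warn("Redis session operation failed", ex); // hypothetical logger
}
```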

retryTimeoutInMilliseconds

We also want to provide retry logic to simplify cases where a session operation should be retried on failure, for example because of a network glitch. At the same time, we heard from developers that they want the ability to control the retry timeout, or to opt out of retry entirely because they know retrying won’t solve the issue in their case. To do this, we introduced the following setting in this session state provider:

  • retryTimeoutInMilliseconds: How long to retry when an operation fails. The default is 0, meaning no retry.

If you set retryTimeoutInMilliseconds to a number, say 5000, then when a session operation fails, the provider retries for up to 5000 milliseconds before treating it as an error. So if you would like the session state provider to apply this retry logic for you, simply configure the timeout. The first retry happens after 20 milliseconds, which is good enough in most cases when a network glitch occurs. After that, it retries every second until it times out. Right after the timeout, it retries one more time to make sure that it won’t cut the timeout short by (at most) one second.

If you don’t think you need retry (for example, when you are running the Redis server on the same machine as your application), or if you want to handle the retry logic yourself, you can just leave it at the default value of 0.
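Putting the two behavior settings together, a provider entry that fails silently and retries for five seconds would look like this:

```xml
<add name="MySessionStateStore"
     type="Microsoft.Web.Redis.RedisSessionStateProvider"
     host="127.0.0.1"
     accessKey=""
     ssl="false"
     throwOnError="false"
     retryTimeoutInMilliseconds="5000" />
```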

Conclusion

We are really excited about this release. Give it a try and let us know what you think!


Improvements to ASP.NET Web Forms


The Visual Studio team released Visual Studio 2013 Update 2 RTM. As part of this update, we added new features for ASP.NET Web Forms. This post highlights some of the improvements in ASP.NET Web Forms that are part of this release, as well as some changes we are working on.

ASP.NET Identity 2.0 Support

ASP.NET Identity is the new modern membership system for building ASP.NET applications. Identity makes it easier to add user profiles and social logins such as Facebook, Twitter, Microsoft Account, and Google; to change the persistence layer from SQL Server to Azure Table Storage; and it adds features such as Account Confirmation, Two-Factor Authentication, Account Lockout, Single Sign-out from everywhere, Password Reset, and many more security-related features.

In Visual Studio 2013 Update 2, the Web Forms templates have been updated to use Account Confirmation, Password Reset, and other features from ASP.NET Identity 2.0.

Entity DataSource control for Entity Framework 6

Entity Framework 6 is the latest version of EF. EF 6 introduces many new features, such as connection resiliency, which allows EF to automatically retry any commands that fail due to broken connections. To use EF 6 in your Web Forms application, we have created a new EntityDataSource control. The new control ships as a NuGet package (Microsoft.AspNet.EntityDataSource). For more details on this release, see this post.

Dynamic Data provider for Entity Framework 6

Dynamic Data is a runtime scaffolding engine which allows you to create rich data-driven applications where you can customize the UI and business logic of your application. To add support for Entity Framework 6, we have released a new Dynamic Data provider for EF 6 so that you can take advantage of the new EF 6 features. This provider ships as a NuGet package (Microsoft.AspNet.DynamicData.EFProvider). For more details on this release, see this post.

Universal Providers for Entity Framework 6

Universal Providers were used in the ASP.NET Web Forms templates that shipped in Visual Studio 2012 and were based on Entity Framework 5. We updated the providers to work with EF 6 so that you can take advantage of the new features in EF 6. You can update to the new providers from the NuGet gallery (Microsoft.AspNet.Providers.Core).

Schedule small background worker tasks in .NET 4.5.2

The .NET team released an update to the .NET Framework, .NET 4.5.2.  It is a highly compatible, in-place update to the .NET Framework 4, 4.5, and 4.5.1. One of the features to highlight in this release is the addition of a new HostingEnvironment.QueueBackgroundWorkItem method that lets you schedule small background work items. ASP.NET tracks these items and prevents IIS from abruptly terminating the worker process until all background work items have completed. This enables ASP.NET applications to reliably schedule async work items.

For more details on this release and download instructions see this post.
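The paragraph above can be sketched with a minimal example (the delay stands in for real background work):

```csharp
using System.Threading;
using System.Threading.Tasks;
using System.Web.Hosting;

// ASP.NET tracks this item and holds off worker-process shutdown
// until it completes (or a grace period elapses).
HostingEnvironment.QueueBackgroundWorkItem(async (CancellationToken token) =>
{
    await Task.Delay(1000, token); // placeholder for the real async work
});
```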

Web Forms Scaffolding

In Visual Studio 2013, we released a new scaffolding framework that allows you to scaffold a model and add create, read, update, and delete views. We are bringing this support to Web Forms, where you can scaffold a model and generate Web Forms pages for creating, editing, and deleting it.

For example, when you use scaffolding to scaffold a model that has a Product and a list of Categories, you will get an Insert page that looks like the following.

scaffoldingimage

You can get the latest preview here. We are working to add VB support in the scaffolders as well.

Roslyn Support

You can now get the first preview release of new CodeDOM providers that use Roslyn from NuGet in the Microsoft.CodeDom.Providers.DotNetCompilerPlatform package. Install the package into your application using NuGet in Visual Studio.

Enabling the new Roslyn compilers in your ASP.NET application will result in two main benefits:

  1. Support for new language features
  2. Potentially improved application startup/pre-compilation time

The first item is pretty self-explanatory. The second should be particularly helpful for customers with very large, complex ASP.NET applications. In our testing of a suitably large and complex application (>600 assemblies in \bin, >500 user controls & pages), the runtime compilation cost at startup/pre-compilation dropped from ~15 minutes to ~70 seconds after enabling the new CodeDOM providers detailed below.

These are a drop-in replacement for the in-box providers and installing the package is all you should have to do to enable them (the package will modify your web.config file for you). The new providers work equally well at runtime and when using the ASP.NET Pre-compilation tool, aspnet_compiler.exe.

Full details are available at http://blogs.msdn.com/b/webdev/archive/2014/05/12/enabling-the-net-compiler-platform-roslyn-in-asp-net-applications.aspx

One ASP.NET Support

The Web Forms project templates integrate seamlessly with the new One ASP.NET experience. You can add MVC and Web API support to your Web Forms project, and you can configure authentication using the One ASP.NET project creation wizard. For more information, see Creating ASP.NET Web Projects in Visual Studio 2013.

Bootstrap support

The Web Forms templates use Bootstrap to provide a sleek and responsive look and feel that you can easily customize. For more information, see Bootstrap in the Visual Studio 2013 web project templates.

Following is how the Web Forms templates look in the browser and on the phone.

webformstemplate

If you have any questions or suggestions, please go to http://forums.asp.net/18.aspx/1?Web+Forms or http://aspnet.uservoice.com/forums/41202-asp-net-web-forms

ASP.NET vNext: the future of .NET on the Server


At TechEd we announced our plans and vision for ASP.NET vNext. ASP.NET vNext is being designed from the bottom up to be a lean and composable .NET stack for building web and cloud based applications. You can find an overview of ASP.NET vNext and walkthroughs of the current experience at http://www.asp.net/vnext.

  • MVC, Web API, and Web Pages will be merged into one framework, called MVC 6. MVC 6 has no dependency on System.Web.
  • ASP.NET vNext includes new cloud-optimized versions of MVC 6, SignalR 3, and Entity Framework 7.
  • ASP.NET vNext will support true side-by-side deployment for all dependencies, including .NET for cloud. Nothing will be in the GAC.
  • ASP.NET vNext is host agnostic. You can host your app in IIS, or self-host in a custom process.
  • Dependency injection is built into the framework.
  • Web Forms, MVC 5, Web API 2, Web Pages 3, SignalR 2, EF 6 will be fully supported on ASP.NET vNext
  • .NET vNext (Cloud Optimized) will be a subset of the .NET vNext Framework, optimized for cloud and server workloads.
  • MVC 6, SignalR 3, EF 7 will have some breaking changes:
    • New project system
    • New configuration system
    • MVC / Web API / Web Pages merge, using a common set of abstractions for HTTP, routing, action selection, filters, model binding, and so on
    • No System.Web, new lightweight HttpContext

To learn more about the ASP.NET vNext announcements, see the TechEd sessions:

ASP.NET vNext is an open source project released under the Apache License, Version 2.0, by Microsoft Open Technologies, Inc. You can follow its progress and find instructions on how to contribute at https://github.com/aspnet.

We’d love to hear your feedback. Please provide it on GitHub, in comments on this blog, or in the ASP.NET vNext forum. Thanks for being with us in this exciting time.

New Getting-Started Resources for Azure Cloud Services and ASP.NET


This week we published a new getting-started tutorial with sample project for Azure Cloud Services on the azure.microsoft.com site. This is a multi-tier application that uses Azure SQL Database and Azure Storage queues and blobs.  It’s designed to be as simple as possible for a multi-tier application with web and worker roles, so you can learn the basics of building, testing, and deploying in minutes rather than having to invest hours working with a large and complex project.

Another multi-tier sample application and tutorial that uses Azure Cloud Services and Azure Storage queues and blobs has been available since late 2012: the Azure Email Service Cloud Service tutorial series. That is still the best choice for an introduction to more real-world coding scenarios, or for learning how to use Azure Storage tables. It was written a year and a half ago for Visual Studio 2012 and SDK 2.0, but notes have been added to explain how to make it work with Visual Studio 2013 and the latest SDK. We’re working on updating it.

The tutorials focus on providing hands-on experience rather than a conceptual introduction. For an up-to-date introduction to Azure Cloud Services that starts with the basics and does go into more depth (and with future installments promised), see the great new article on the new justazure.com site: Microsoft Azure Cloud Services Part 1: Introduction.

Release Candidates for ASP.NET MVC 5.2, Web API 2.2 and Web Pages 3.2


The release candidate NuGet packages for ASP.NET MVC 5.2, ASP.NET Web API 2.2 and ASP.NET Web Pages 3.2 are now live on the NuGet gallery!

Download this release

You can install or update the release candidate NuGet packages for ASP.NET MVC 5.2, ASP.NET Web API 2.2 and ASP.NET Web Pages 3.2 using the NuGet Package Manager Console, like this:

  • Install-Package Microsoft.AspNet.Mvc -Version 5.2.0-rc -Pre
  • Install-Package Microsoft.AspNet.WebApi -Version 5.2.0-rc -Pre
  • Install-Package Microsoft.AspNet.WebPages -Version 3.2.0-rc -Pre

Prerequisites for this release

What’s in this release?

This release primarily includes great new features for Web API OData v4, as summarized below, along with bug fixes and minor features that bring a lot more goodness to MVC, Web API, and Web Pages:

ASP.NET MVC 5.2 Release Candidate

ASP.NET Web API 2.2 Release Candidate

ASP.NET Web Pages 3.2 Release Candidate

You can find a complete listing of the features and fixes included in this release by referring to the corresponding release notes:

Additional Documentation

Tutorials and other information about this release are available from the ASP.NET web site (http://www.asp.net).

Questions and feedback

You can submit questions and feedback related to this release on the ASP.NET forums (MVC, Web API, Web Pages). Please submit any issues you encounter and feature suggestions for future releases on our CodePlex site.

Thanks and enjoy!

 
 

Announcing web features in Visual Studio “14” CTP


Today, the Visual Studio team announced the release of Visual Studio “14” CTP version 14.0.21730.1 DP.  Our team added features to support ASP.NET vNext development.  We will have future blog posts discussing some of the features in detail.  Note that Visual Studio side-by-side support is not available in this early build; do not install this CTP on a machine with any other version of Visual Studio installed.

ASP.NET vNext Templates

ASP.NET vNext projects can be created using C# templates with “ASP.NET vNext” in the name.

clip_image002[1]

To generate an MVC 6 project, select the “ASP.NET vNext Web Application” template and click OK.

clip_image004[1]

ASP.NET vNext Project

The generated project file has a new extension, “.kproj”. It looks like a normal “.csproj” file but doesn’t contain any reference elements, and it has a few other differences, such as the target file and project type. The project system gets most project information from the file and folder structure and from the project.json file.

Checking the generated solution folder, we can see it has a packages folder containing all the packages needed for the project. It also contains a “&lt;solutionName&gt;.sln.ide” folder, which is used by the Roslyn compiler engine to store temporary files. This folder should normally be excluded from source control.

On the project’s property page you can change the active target framework between “.NET Framework 4.5” and “.NET Core Framework 4.5”. “.NET Core Framework 4.5” is the new cloud-optimized runtime.

clip_image006[1]

Using the project’s reference tree, you can clearly view the package dependencies under the active target framework.

clip_image007

For library projects, the build will create dlls targeting both “.NET Framework 4.5” and “.NET Core Framework 4.5”, generating corresponding NuGet packages in the output bin folder.

For web and console projects, the build will not generate any packages or dlls.  When you deploy the project to the file system, you will see that the source code will be copied as well.  The projects are compiled and run dynamically. 

ASP.NET vNext Project.Json IntelliSense Support

To ease project.json editing, we added IntelliSense support.

clip_image008

For the dependencies object, it provides IntelliSense options by searching the NuGet server to list all the available packages and their versions.

clip_image009[1]

clip_image010[1]

clip_image011[1]

No build needed for changes to appear in browser

Thanks to the Roslyn compiler, if you change “.cs” files or the project.json file and want to see the change in the browser, you don’t need to build the project any more. Just refresh the browser.

All files are included in the project

All the files and folders (except the bin and obj folders) under the project folder are automatically included as project files. Any file system change under the project folder is automatically picked up in Solution Explorer, and IntelliSense is refreshed automatically.

Automatic Package Restore when modifying project.json dependencies

If you change dependencies by modifying the project.json file, the packages are automatically restored and IntelliSense is adjusted automatically. You can check the “Package Manager Log” output window to see the actions.

clip_image002[3]

Publish supports Azure website and file system

You can publish your MVC 6 project to an Azure website or the file system with a similar publishing story as before.

The following screen shot shows file system publishing after you click Publish in the context menu for the project:

1. Choose “File System” or choose an existing file system publishing profile.

clip_image013

2. Set up a local folder:

clip_image015

3. Choose Publish. After it finishes, go to the published folder and run the web.cmd file.

clip_image017

4. Examine the project.json file’s “web” command to get the URL, which in our case is http://localhost:5000. Type this URL in your browser to see the running site.
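The “web” command that step 4 refers to lives in the project.json commands section; in this preview it looked roughly like the following sketch (the exact hosting arguments are illustrative and varied between previews):

```json
{
  "commands": {
    "web": "Microsoft.AspNet.Hosting --server Microsoft.AspNet.Server.WebListener --server.urls http://localhost:5000"
  }
}
```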

Neither IIS Express nor IIS is involved when you run from the command line. This means that you can publish your website to a USB drive and run it by double-clicking the web.cmd file!

Summary

You can find more tooling details in the ASP.NET article Getting Started with ASP.NET vNext and Visual Studio "14".

With this new tooling you can now open the existing MVC Music Store and Bug Tracker sample projects on GitHub in Visual Studio and get a full development experience. You can find instructions on how to open, build, run, and publish these sample ASP.NET vNext projects in the documentation for the corresponding GitHub repos.

We’d love to hear your feedback. For VS tooling issues, please submit bugs through Connect; send suggestions on UserVoice and quick thoughts via Send-a-Smile in the Visual Studio IDE. For ASP.NET vNext, please provide feedback on GitHub or in the ASP.NET vNext forum. If you ask a question on Stack Overflow, use the asp.net-vnext tag. Thanks for being with us in this exciting time.

ASP.NET vNext in Visual Studio “14” CTP


A few weeks ago at TechEd North America we announced ASP.NET vNext. Today we are releasing the first preview of Visual Studio “14” CTP, which will have support for creating and debugging ASP.NET vNext applications. This post will cover what is ASP.NET vNext and give you a walkthrough of what you can do in this preview.

ASP.NET vNext has been designed to provide you with a lean and composable .NET stack for building modern cloud-based apps. ASP.NET vNext will build on .NET vNext.

.NET vNext is the next major release of the .NET Framework. .NET vNext will have a cloud-optimized mode with a smaller footprint compared to the full .NET Framework. For more details see this post.

ASP.NET vNext can best be described by highlighting the following scenarios:

  • Cloud-Optimized
    • ASP.NET vNext can target .NET vNext (with cloud optimized mode). This means that services such as session state and caching can be replaced based on whether the app is running in the cloud or in a traditional hosting environment, while providing a consistent API for developers.
  • Side by side support
    • When targeting the Cloud optimized mode in .NET vNext, ASP.NET vNext will let you deploy your own version of .NET Framework. Since now the .NET vNext Framework can be deployed with the app, each app can run different versions of .NET vNext side-by-side and upgrade separately, all on the same machine.
  • Enhanced developer experience
    • In ASP.NET vNext, you can now edit your code files and refresh the browser to see the changes without explicitly building your app. You can also edit your application outside of Visual Studio.
    • ASP.NET vNext projects have project.json file where all the project dependencies are stored. This makes it easier to open vNext projects outside of Visual Studio so that you can edit them using any editor such as Notepad etc. You can even edit ASP.NET vNext projects in the cloud.
  • A single programming model for building Web sites and services
    • MVC and Web API have been merged into a single programming model. For example, there are now unified controller, routing, and model binding concepts between them. You can now have a single controller that returns both MVC views and formatted Web API responses on the same HTTP verb.
  • Modular Stack
    • ASP.NET vNext will ship as NuGet packages. NuGet packages will also be the unit of reference in your application. NuGet package and library references will be treated the same, so it will be easier to manage the references in your project.
    • This makes it possible for an application developer to choose what functionality they want to bring into their application. In previous versions of ASP.NET, features such as HttpContext, Session, Caching, and Membership were baked into the framework. As an app developer, if you do not need these features, you can now choose not to bring them into your app.
  • Dependency Injection
    • Dependency Injection is built into vNext and is consistent across the stack. All of the vNext components such as MVC, Web API, SignalR, EF and Identity will use the same DI. This will allow us to provide the right set of services based on the environment that you are running in.
  • Configuration
    • There is a new configuration system which can read values from environment variables. This configuration provides a unified API system for accessing configuration values. You can use the same APIs to access configuration values locally or in Azure.
  • Open Source
    • The entire source code is already released as open source via the .NET Foundation. You can see the source at https://github.com/aspnet and follow progress on vNext in real time. You can also send pull requests and contribute to the source code.
  • Cross-platform support
    • We're developing vNext with cross-platform in mind, including an active collaboration with Xamarin to ensure that cloud-optimized .NET applications can run on Mac or Linux on top of the Mono runtime.
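To make the single-programming-model point above concrete, here is a hedged sketch of one MVC 6 controller serving both an HTML view and a content-negotiated API response (type and namespace names reflect the vNext previews and changed between releases):

```csharp
using System.Collections.Generic;
using Microsoft.AspNet.Mvc;

public class ProductsController : Controller
{
    // Returns an HTML view, MVC-style.
    public IActionResult Index()
    {
        return View(GetProducts());
    }

    // Returns formatted data (JSON, XML, ...), Web API-style,
    // from the same controller type.
    public IEnumerable<string> List()
    {
        return GetProducts();
    }

    private static List<string> GetProducts()
    {
        return new List<string> { "Keyboard", "Mouse" };
    }
}
```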

The following chart shows the feature set that will be available in .NET vNext and .NET vNext (Cloud Optimized)

vnextfeaturechart

In this post I’ll outline how you can get the tools and walk you through some of the features which we’ve enabled in this preview.

How to download

Getting started is easy. Just download Visual Studio “14” CTP. After that you’ll have everything you need to follow along with the remainder of this post. One important thing to note in this early preview is that you should not install it on your main machine. We recommend installing the product in a VM, a VHD, or on a computer that is not used in a production environment, because there are known side-by-side compatibility issues with other versions of Visual Studio. In later previews we will ensure that we have a good side-by-side story, but in this early preview we are not quite there yet.

Walkthrough

In Visual Studio you can find the vNext templates under the Visual C#\Web node as shown below.

new-proj-dialog-01

In the screenshot above, the first template, ASP.NET Web Application, will open up the existing One ASP.NET dialog. The remaining templates are for ASP.NET vNext. Later we will integrate the vNext templates into the One ASP.NET dialog box. There are three flavors of templates here. These templates can run on .NET vNext and cloud optimized .NET vNext.

  • Web templates – for creating web applications
  • Class library – for creating cloud-optimized class library projects
  • Console application – for creating console apps

We have two web templates out of the box; an empty template and a starter template. Let’s take a look at the starter template, ASP.NET vNext Web Application. When you create a vNext Web Application you’ll probably notice that the project files/layout is very familiar. Controller classes go in the Controllers folder, Views are in the Views folder, etc. In addition to some familiar files, we have introduced some new/modified files. I’ll highlight those here.

Startup.cs

This class must contain a Configure method that takes an IBuilder parameter, and you configure the HTTP pipeline inside this Configure method.

startup.cs

In Startup.cs you’ll see the Configure method as you have previously. You can add and configure various services that you need in your application. In the snippet above we can see the following customizations.

  • Add Browser Link
  • Configuration system is configured to pull values from config.json and environment variables
  • Add Entity Framework to persist data in SQL server
  • Add ASP.NET Identity
  • Add ASP.NET MVC

Now let’s take a look at the project.json file.

project.json

Project.json is used by the ASP.NET vNext runtime. This file is where you’ll find items like dependencies (for example, NuGet packages or framework references for your project), a list of available build configurations, and a list of commands which can be invoked from the command line.

project.json

In the image above you can see that the starter template depends on several NuGet packages. Besides references being in a new file, there is another significant change here. In vNext apps the list of dependent NuGet packages in project.json includes only the NuGet packages which are directly referenced by the project. In ASP.NET projects created in previous releases of Visual Studio, packages.config lists the NuGet packages which are directly referenced by your project as well as any other NuGet packages which those packages depend on. This new view simplifies things. If you uninstall a NuGet package, its dependencies will automatically be removed as well, assuming that no other NuGet package in project.json depends on them.

In the project.json file you can see there are two other sections: commands and configurations. Each entry in the commands section corresponds to a command line that may be executed. The configurations element defines the configurations which the project should be built against. Here net45 represents the full .NET desktop CLR and k10 represents the cloud-optimized CLR.
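Putting those sections together, a project.json from this preview looked roughly like the sketch below; the package names and alpha version strings are illustrative, not an exact copy of the template:

```json
{
  "dependencies": {
    "Microsoft.AspNet.Mvc": "0.1-alpha-*",
    "Microsoft.Framework.ConfigurationModel.Json": "0.1-alpha-*"
  },
  "commands": {
    "web": "Microsoft.AspNet.Hosting --server Microsoft.AspNet.Server.WebListener --server.urls http://localhost:5000"
  },
  "configurations": {
    "net45": {},
    "k10": {}
  }
}
```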

myproject.kproj

When you create a project in Visual Studio, a corresponding .kproj file will be placed in the root of the project. It is an MSBuild file like most other project files that Visual Studio creates. Visual Studio uses this file to persist settings which it needs to load and run your application. The ASP.NET vNext runtime does not use this file; it’s only used by Visual Studio. Later this file will also be useful for CI scenarios, like project files are today.

In this preview release you’ll find that all files in your project are listed in the .kproj file. In later versions the files will not be listed in the .kproj. More info on that later in this post. Now that we’ve discussed the artifacts in the project let’s move on to debugging the application, and then add some new content to this project.

config.json

In previous versions of ASP.NET, the application and web server configuration were both found in web.config. You can now specify your application-specific configuration settings in a config file of your choice: you can use INI, XML, or JSON, or plug in any format that you want. The template uses the Microsoft.Framework.ConfigurationModel.Json package to read configuration values from a JSON file called config.json. Below you’ll find the default contents of config.json from the vNext Web Application template.

config.json
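As a sketch of what that file contained (the connection string and key names here are illustrative, not copied verbatim from the template):

```json
{
  "Data": {
    "DefaultConnection": {
      "ConnectionString": "Server=(localdb)\\mssqllocaldb;Database=WebAppDb;Trusted_Connection=True;"
    }
  }
}
```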

Debugging an ASP.NET vNext application

Debugging an ASP.NET vNext application is the same as any standard Visual Studio project. Once you start debugging your application you can set breakpoints, use the immediate window, and use all the other features that you are familiar with. Debugging works for both the full .NET desktop CLR as well as the cloud-optimized CLR.

Adding a new Controller and View

In this new preview we have not enabled the Add View/Add Controller menu options yet. That will be coming later. For now we have created a basic set of item templates that can be used. To get to these item templates you can use the Add New Item menu option.

Let’s add a new controller and view to the starter web template we created. To do that, right click on Controllers and select Add New Item. Once you do that you’ll see the list of available templates.

add-controller

Name the controller HelloWorldController and click the Add button. After that you’ll get a new controller class with an Index method. In the Index method let’s pass a message to the view that we will create. Just like in previous versions, one way to do this is to use the ViewBag property. Let’s add a Message property and set the value to “Hello World”. Your controller class should look like the following.

hello-controller
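In code, the controller described above looks roughly like this sketch (the namespace is from the vNext previews and may differ):

```csharp
using Microsoft.AspNet.Mvc;

public class HelloWorldController : Controller
{
    public IActionResult Index()
    {
        // Pass a message to the view via the ViewBag property.
        ViewBag.Message = "Hello World";
        return View();
    }
}
```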

Now that we’ve created our controller we need to create the corresponding view. The Visual Studio templates have routes configured in the same way as in previous releases. This means we will need to create a new view at /Views/HelloWorld/Index.cshtml. You can use the MVC 5 View Page (Razor) item template to get you started.

After creating the view you can update it to display the message that the controller passes to it. The view item template has the layout and title commented out by default. Since we started with a web application template, which has the layout page in the default location, we can just un-comment them to enable them. Below are the current contents of the view.

hello-view
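The un-commented view might look like this sketch (the layout path assumes the template's default location):

```cshtml
@{
    Layout = "~/Views/Shared/_Layout.cshtml";
    ViewBag.Title = "Index";
}

<h2>@ViewBag.Message</h2>
```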

Now that you have created the controller and view, you can visit the page by browsing to /helloworld/index or /helloworld/. Let’s move on to see another way you can add files to your project.

Adding files to a project

In the previous section we used the Add New Item dialog to add new files to your project. When working with a vNext project, all files/folders in the project folder are automatically included in your project. If you have existing files that you need to add to a vNext project you can simply copy the files into your project folder. Visual Studio will notice the new files and include them. In this preview we have not implemented the ability to exclude files from the project. You can exclude source files from the build process by editing project.json. Now that you’ve seen how to add files to your project let’s move on to see how to add a dependency in your project.

Adding a new dependency

In a vNext app dependencies are declared in the project.json file. In this preview, to add a new dependency you can edit the project.json file. Even though we have not yet enabled the Visual Studio dialogs to simplify adding dependencies, we have implemented another really cool feature that I think you’ll love: dynamic IntelliSense in the project.json file for dependencies. In the screenshot below you can see this in action. The IntelliSense here is showing ClassLibrary1, which is a vNext class library project in my solution.

project.json-intellisense-01

When adding a reference to a project in the solution the format in project.json for that is “ProjectName”: “”.
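Concretely, referencing the ClassLibrary1 project shown in the IntelliSense screenshot would look like the following fragment, with the empty version string letting the reference resolve to the project in the solution:

```json
{
  "dependencies": {
    "ClassLibrary1": ""
  }
}
```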

In addition to IntelliSense for projects, you’ll also find IntelliSense for NuGet packages. The NuGet package IntelliSense covers package names as well as version, including remote packages. Local packages (those installed with Visual Studio) have the folder icon, and remote packages have the NuGet icon.

project.json-intellisense-02

When a change to the project.json file is saved, a package restore is started in the background by Visual Studio automatically. The References node in the Solution Explorer is also updated to show this new reference. Below I’ve included a screenshot of that.

references-node

Now that we have covered a few basics let’s see how to opt-in to the cloud-optimized CLR.

Opting into .NET vNext Cloud-Optimized

By default all the vNext projects are configured to use the full .NET vNext Framework. To switch the project to target the cloud-optimized .NET vNext Framework we’ll modify the project properties. In the solution explorer right-click on your web application and select the Properties menu option. A screenshot is below.

properties-dialog

In this dialog we can switch the Active Target Framework property to .NET Core Framework 4.5 to start using the cloud-optimized .NET. There are a few other properties in this dialog, but for now it’s a little bare bones. Now let’s see how you can deploy your app. In this preview we have enabled publishing vNext apps to Microsoft Azure Web Sites. Let’s take a look at that now.

Publish to Microsoft Azure Web Sites

Publishing to Azure Web Sites is really easy in Visual Studio. For the vNext projects we have a trimmed down version of the Publish Web dialog. In this preview we are supporting publishing to the file system (local folder, network folder, etc.) and publishing to Azure Web Sites. In later versions you’ll see all the existing publish methods enabled for vNext apps. Ok let’s see how you can publish your project to Azure Web Sites.

To publish your vNext app right-click on the project in the Solution Explorer and select the Publish menu option as you would a standard ASP.NET project. After that you’ll see the modified Publish Web dialog.

web-publish-01

To publish to Azure Web Sites click on the top button. After signing in you can select an existing site, or create a new one in the following dialog.

web-publish-02-create-az-website

For this project let’s create a new Azure Web Site. To do that pick the New button and then configure the name on the Create Site dialog. After creating your site the publish settings are automatically downloaded by Visual Studio. The publish dialog will look something like the following.

web-publish-03-publish-settings

At this point you can click on the Publish button to start the publish process. When the publish process starts, the Web Publish Activity window is opened, which will show you the status of the publish operation. Once the publish operation has completed, a browser will be opened with your site’s URL.

In vNext projects we’ve also enabled the selective publishing feature. To use selective publishing, right-click on one or more files in the Solution Explorer and select Publish. You can see publish, preview, and replace menu options in the following screenshot.

web-publish-04-selective-publish

That covers the publish support that we have enabled for this release. Let’s move on to discuss command line scenarios.

Command Line support

As you saw above you can use Visual Studio to create, edit and debug your ASP.NET vNext application. You can also use your favorite editor of choice to edit the files and run the application from the command line.

The following image shows an ASP.NET vNext project opened in Sublime. I have changed the About action in the Home controller.

nonVSEditor

I can run the application by using a command called “k web” from the command line. “web” is a command which is defined in the project.json file.

kweb

Once I run the command I can launch the browser and navigate to http://localhost:5000 as specified in my “web” command and see this application running in the browser.

commandline

FAQ

What is Cloud Optimized .NET vNext?

.NET vNext will have a cloud-optimized mode that enables you to deploy your apps with a copy of the .NET Framework libraries they need. Since the runtime and framework libraries are deployed on a per-app basis, each app can run different versions of .NET vNext side-by-side and upgrade separately, all on the same machine. These libraries have been slimmed down significantly to reduce the footprint of the framework, and will be distributed via NuGet.

What’s new for ASP.NET Web Forms in ASP.NET vNext?

There were a bunch of new improvements to ASP.NET Web Forms in VS 2013 Update 2, including:

Note: You will be able to use your existing Web Forms apps and run them on ASP.NET vNext running on .NET vNext. You will not be able to run them on cloud-optimized .NET vNext.

How do I migrate my apps to ASP.NET vNext?

You can run your existing applications that were built on ASP.NET on .NET vNext. If you want to port your application to run on cloud-optimized .NET vNext, you will have to see which libraries are supported there, because cloud-optimized .NET vNext has a reduced set of assemblies. You can use the ApiPort tool to analyze your app. The tool provides you with two main pieces of data: the platforms that you can easily or reasonably target with your code, and the dependencies that are preventing you from targeting additional platforms.

Please read the following post for more information on how to target multiple platforms.

We will provide a migration tool that will help you migrate your existing applications to ASP.NET vNext, for both .NET vNext and cloud-optimized .NET vNext.

What does compatibility look like?

  • Web Forms, MVC 5, Web API 2, Web Pages 3, SignalR 2, EF 6, Identity 2 will be fully supported on .NET vNext
  • MVC 6, SignalR 3, EF 7, and Identity 3 will have some breaking changes.
  • MVC, Web API, and Web Pages have been merged into a single framework, MVC 6. For example, there are now unified controller and routing concepts between all three.
  • These include some of the following changes which will require modifications to your app.
    • New project system
    • New configuration system
    • No System.Web, new lightweight HttpContext (not System.Net.Http)
    • We will have a migration tool which will help you migrate your application to use ASP.NET vNext on .NET vNext and cloud optimized .NET vNext. This will cover scenarios such as migrating from MVC 5 to 6 and more.

Can I use existing libraries in the Cloud Optimized .NET vNext?

If the libraries that you were using were already PCL, then you should be able to use them.

What level of support do we provide for Cross-platform?

We are actively collaborating with the Mono team so that we can add Mono to our test matrix so we can provide a better experience. We will fix bugs that prevent us from running on Mono.

Is cloud optimized .NET vNext only for the cloud? Can I use it in my environment?

Since you are running cloud-optimized .NET vNext, your app contains the framework and the libraries it needs to run. This means you can self-host your app, host it in IIS, or host it in your own environment.

More Info

This section will help you find more information about ASP.NET vNext

Learn more

The ASP.NET site has been updated with all new vNext content and walkthroughs.

Relevant blog posts for ASP.NET vNext

The following blogs have content related to ASP.NET vNext and are good to read and follow.

Videos

Source Code

ASP.NET vNext is an open source project released under the Apache License, Version 2.0. You can follow its progress and find instructions on how to contribute at https://github.com/aspnet.

Give Feedback

- If you find bugs file issues at GitHub

- Add suggestions for new features on User voice http://aspnet.uservoice.com/forums/252111-asp-net-vnext

- Discuss on the ASP.NET vNext Forums http://forums.asp.net/1255.aspx/1?ASP+NET+vNext

Running ASP.NET vNext on mono, on both Mac and Linux

Lastly ASP.NET vNext runs on Mono, on both Mac and Linux. We are collaborating with the Mono team to make sure that our ASP.NET vNext stack just works on Mono.


Following are some articles explaining how to run ASP.NET vNext on Mac and Linux.

- Graeme Christie’s article on running ASP.NET vNext on OSX and Linux

- https://github.com/akoeplinger/vagrant-mono-aspnetvnext

Known Issues

Visual Studio side by side support is not available on this early build. Do not install this CTP on a machine with any other version of Visual Studio installed.

Summary

Today’s announcement is just a step towards many more awesome features that will be part of ASP.NET vNext. Please do provide your feedback and thank you for trying out the preview.

QueueBackgroundWorkItem to reliably schedule and run long background processes in ASP.NET


Stack Overflow is loaded with questions on how to reliably run a resource-intensive process on a background thread. See so0, so1, so2, so3, so4, so5, so6, so7, so8, so9, so10. Examples of long-running tasks include sending email, image processing, and generating a PDF file. When Phil Haack was a program manager on the ASP.NET MVC team, he wrote the definitive blog on the inherent unreliability of running background tasks on ASP.NET. While Phil’s blog is a good read, there are now three supported approaches to launching a long-running process on ASP.NET:

  1. Cloud worker role is an environment in which you can run code. It’s basically a computer, really. You run whatever code you want (EXE, BAT, PS1, NodeJS, .NET, etc.)  An Azure worker role provides the most industrial strength and scalable solution to this problem.  For an excellent tutorial with this approach, see Tom Dykstra’s Get Started with Azure Cloud Services and ASP.NET.
  2. Web Jobs (including the Web Jobs SDK) are a way in Azure to run scheduled tasks or tasks that trigger on demand (given various types of triggers). Apps specifically written with the Azure Jobs SDK can be used to run code in any environment, including a local computer, Azure Web Site, Azure Worker Role, Azure VM, etc.  Although you can run them anywhere, they run most efficiently within Azure. For more information see Azure WebJobs - Recommended Resources.
  3. QueueBackgroundWorkItem (QBWI). This was specifically added to enable ASP.NET apps to reliably run short lived background tasks. (see QBWI limitations at the end of this blog).  As of today, you can’t use QBWI on an Azure Web Site or cloud web role because QBWI requires .Net 4.5.2. We hope to have Azure Web/Cloud running .Net 4.5.2 soon.

QueueBackgroundWorkItem overview

QBWI schedules a task which can run in the background, independent of any request. This differs from a normal ThreadPool work item in that ASP.NET can keep track of how many work items registered through this API are currently running, and the ASP.NET runtime will try to delay AppDomain shutdown until these work items have finished executing.

QueueBackgroundWorkItem API

[SecurityPermission(SecurityAction.LinkDemand, Unrestricted =true)]
public static void QueueBackgroundWorkItem(Action<CancellationToken> workItem);

Takes a void-returning callback; the work item will be considered finished when the callback returns.

[SecurityPermission(SecurityAction.LinkDemand, Unrestricted = true)] 
public static void QueueBackgroundWorkItem(Func<CancellationToken, Task> workItem);

Takes a Task returning callback; the work item will be considered finished when the returned Task transitions to a terminal state.

Send email with attachment using QBWI

To use QBWI (QueueBackgroundWorkItem) in Visual Studio, you’ll need to install .Net 4.5.2, then install the .Net 4.5.2 Developer Pack. For my sample I created an MVC app and used SendGrid to send an email with a large jpg attachment. To use QBWI, right-click the project in Solution Explorer and select Properties. Select the Application tab on the left, then select .NET Framework 4.5.2 in the Target Framework dropdown. If you don’t see 4.5.2, you didn’t install the .Net 4.5.2 Developer Pack or you don’t have .Net 4.5.2 installed.


The following code sends email with an image file attached:

[HttpPost]
[ValidateAntiForgeryToken]
public ActionResult SendEmail([Bind(Include = "Name,Email")] User user)
{
    if (ModelState.IsValid)
    {
        HostingEnvironment.QueueBackgroundWorkItem(ct => SendMailAsync(user.Email));
        return RedirectToAction("Index", "Home");
    }
    return View(user);
}

private async Task SendMailAsync(string email)
{
    var myMessage = new SendGridMessage();
    myMessage.From = new MailAddress("Rick@Contoso.com");
    myMessage.AddTo(email);
    myMessage.Subject = "Using QueueBackgroundWorkItem";

    // Add the HTML and Text bodies
    myMessage.Html = "<p>Check out my new blog at " +
        "<a href=\"http://blogs.msdn.com/b/webdev/\">" +
        "http://blogs.msdn.com/b/webdev/</a></p>";
    myMessage.Text = "Check out my new blog at http://blogs.msdn.com/b/webdev/";

    using (var attachmentFS = new FileStream(GH.FilePath, FileMode.Open))
    {
        myMessage.AddAttachment(attachmentFS, "My Cool File.jpg");
    }

    var credentials = new NetworkCredential(
        ConfigurationManager.AppSettings["mailAccount"],
        ConfigurationManager.AppSettings["mailPassword"]);

    // Create a Web transport for sending email.
    var transportWeb = new Web(credentials);
    await transportWeb.DeliverAsync(myMessage);
}

I set the account and password on the Configure tab in the Azure portal to keep my credentials secure.


Using the cancellation token

You can drop the following code in a new MVC app to test the cancellation token:

public class HomeController : Controller
{
    static int logCount = 0;

    public ActionResult Index()
    {
        return View();
    }

    public ActionResult About()
    {
        ViewBag.Message = "Your application description page.";
        HostingEnvironment.QueueBackgroundWorkItem(ct => workItemAction1(ct, "About"));
        return View();
    }

    public ActionResult Contact()
    {
        ViewBag.Message = "Your contact page.";
        HostingEnvironment.QueueBackgroundWorkItem(ct => workItemAction1Async(ct, "Contact"));
        return View();
    }

    private void workItemAction1(CancellationToken ct, string msg)
    {
        logCount++;
        int currentLogCount = logCount;
        ct = addLog(ct, currentLogCount, msg);
    }

    private async Task<CancellationToken> workItemAction1Async(CancellationToken ct, string msg)
    {
        logCount++;
        int currentLogCount = logCount;
        await addLogAsync(ct, currentLogCount, msg);
        return ct;
    }

    private CancellationToken addLog(CancellationToken ct, int currentLogCount, string msg)
    {
        Trace.TraceInformation(msg);
        for (int i = 0; i < 5; i++)
        {
            if (ct.IsCancellationRequested)
            {
                Trace.TraceWarning(string.Format("{0} - signaled cancellation",
                    DateTime.Now.ToLongTimeString()));
                break;
            }
            Trace.TraceInformation(string.Format("{0} - logcount:{1}",
                DateTime.Now.ToLongTimeString(), currentLogCount));
            Thread.Sleep(6000);
        }
        return ct;
    }

    private async Task<CancellationToken> addLogAsync(CancellationToken ct, int currentLogCount, string msg)
    {
        try
        {
            for (int i = 0; i < 5; i++)
            {
                if (ct.IsCancellationRequested)
                {
                    Trace.TraceWarning(string.Format("{0} - signaled cancellation : msg {1}",
                        DateTime.Now.ToLongTimeString(), msg));
                    break;
                }
                Trace.TraceInformation(string.Format("{0} - msg:{1} - logcount:{2}",
                    DateTime.Now.Second.ToString(), msg, currentLogCount));

                // "Simulate" an operation that took a long time, but was able to run without
                // blocking the calling thread (i.e., it's doing I/O operations which are async).
                // We use Task.Delay rather than Thread.Sleep, because Task.Delay returns
                // the thread immediately back to the thread-pool, whereas Thread.Sleep blocks it.
                // Task.Delay is essentially the asynchronous version of Thread.Sleep:
                await Task.Delay(2000, ct);
            }
        }
        catch (TaskCanceledException tce)
        {
            Trace.TraceError("Caught TaskCanceledException - signaled cancellation " + tce.Message);
        }
        return ct;
    }
}

Hit F5 to debug the app, then click on the About or Contact link. Right click on the IIS Express icon in the task notification area and select Exit.


The Visual Studio output window shows the task is canceled.


QueueBackgroundWorkItem limitations

  • The QBWI API cannot be called outside of an ASP.NET managed AppDomain.
  • The AppDomain shutdown can only be delayed 30 seconds. If you have too many items queued to be completed in 30 seconds, the ASP.NET runtime will unload the AppDomain without waiting for the work items to finish.
  • The caller's ExecutionContext is not flowed to the work item.
  • Scheduled work items are not guaranteed to ever execute; once the app pool starts to shut down, QueueBackgroundWorkItem calls will not be honored.
  • The provided CancellationToken will be signaled when the application is shutting down. The work item should make every effort to honor this token.  If a work item does not honor this token and continues executing, the ASP.NET runtime will unload the AppDomain without waiting for the work item to finish.
  • We don’t guarantee that background work items will ever get invoked or will run to completion.  For instance, if we believe a background work item is misbehaving, we’ll kill it.  And if the w3wp.exe process crashes, all background work items are obviously dead.  If you need reliability, you should use Azure’s built-in scheduling functions.

Special thanks to @LeviBroderick, who not only wrote the QBWI code but also helped me with this post.

Follow me (@RickAndMSFT) on Twitter, where I have a no-spam guarantee of quality tweets.


ASP.NET Identity 2.1.0-alpha1


Today, we are releasing a preview of ASP.NET Identity 2.1.0-alpha1. The main focus in this release was to fix bugs and to add SignInManager, which makes it easier to use security features during login.

Download this release

You can download ASP.NET Identity from the NuGet gallery. You can install or update to these packages through NuGet using the NuGet Package Manager Console, like this:
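The command itself was not captured in this post; a typical Package Manager Console command for installing a prerelease Identity package looks like the following (the package ID is an assumption based on the release, and -Pre is required for prerelease packages):

```shell
Install-Package Microsoft.AspNet.Identity.EntityFramework -Pre
```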

What’s in this release?

Following is the list of features and major issues that were fixed in 2.1.0-alpha1.

SignInManager

SignInManager makes it easier to add two-factor authentication, account lockout, and other security features when you log in. Previously, your application code had to keep track of how many times the user had incorrectly attempted a login and update the count to determine whether the account should be locked out. The same logic is used when entering the verification code during two-factor authentication: if you enter the code incorrectly, your account will be locked out for some time. All of these account lockout values are configurable.

In the samples package for 2.0 the SignInManager lived in the application code; we have now moved it into the framework.
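As a rough sketch of what this looks like in application code: the PasswordSignInAsync signature and SignInStatus values below follow the 2.1-era project templates, while the view-model and helper names (LoginViewModel, RedirectToLocal) are illustrative.

```csharp
// Sketch of a Login action using SignInManager; assumes an
// ApplicationSignInManager is exposed as a SignInManager property
// on the controller, as in the 2.1 templates.
public async Task<ActionResult> Login(LoginViewModel model, string returnUrl)
{
    if (!ModelState.IsValid)
    {
        return View(model);
    }

    // shouldLockout: true enables account lockout after repeated failures,
    // so the controller no longer has to track failed attempts itself.
    var result = await SignInManager.PasswordSignInAsync(
        model.Email, model.Password, model.RememberMe, shouldLockout: true);

    switch (result)
    {
        case SignInStatus.Success:
            return RedirectToLocal(returnUrl);
        case SignInStatus.LockedOut:
            return View("Lockout");
        case SignInStatus.RequiresVerification:
            // Two-factor authentication kicks in here.
            return RedirectToAction("SendCode",
                new { ReturnUrl = returnUrl, RememberMe = model.RememberMe });
        case SignInStatus.Failure:
        default:
            ModelState.AddModelError("", "Invalid login attempt.");
            return View(model);
    }
}
```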

The following tutorials cover account confirmation and two-factor authentication (including account lockout)

http://www.asp.net/identity/overview/features-api/account-confirmation-and-password-recovery-with-aspnet-identity

http://www.asp.net/identity/overview/features-api/two-factor-authentication-using-sms-and-email-with-aspnet-identity

The following screenshot shows the login code of an application that uses two-factor authentication and account lockout.

List of bugs fixed

You can look at all the bugs that were fixed in this release by clicking here.

Samples/ Documentation

Migrating from ASP.NET Identity 2.0.0

This release is compatible with 2.0.0, and there are no schema updates, so you should be able to update to this version without breaking anything.

Give feedback and get support

    • If you find any bugs please open them at our Codeplex Site where we track all our bugs https://aspnetidentity.codeplex.com/
    • If you want to discuss these features or have questions, please discuss them on Stack Overflow and use the following tag “asp.net-identity”

Thank you for trying out the preview. Please let us know your feedback on ASP.NET Identity.

Updating the MVC Facebook API


Over the past several months Facebook made changes to their application development APIs that were incompatible with the MVC Facebook support.

We have been working on updates while the Facebook API kept evolving, and on 4/30/2014 Facebook announced a two-year stability guarantee. This was a fantastic announcement because it ensures similar stability for ASP.NET MVC developers building Facebook applications. We've fixed the Facebook package and renamed it to Microsoft.AspNet.Facebook. You can try out the updated package on the ASP.NET Webstack nightly NuGet package feed: https://www.myget.org/F/aspnetwebstacknightly. Instructions on how to use the feed are here.

We encourage all of the Facebook application developers to go and try out our updated Microsoft.AspNet.Facebook API with new Facebook applications.

Here are the issues that we’ve fixed.

If you’re new to the world of Facebook application development on MVC you can check out some of our other tutorials here and here. Keep in mind these tutorials will be gradually updated so they may not be entirely accurate.

The important new stuff

The original Microsoft.AspNet.Mvc.Facebook package and its corresponding APIs had no concept of optional or default permissions. This created friction with some of the updated prompt dialogs that Facebook released. To address this, we’ve made Facebook’s authorize filter more flexible by providing permission prompt hooks to control the login dialog flow.

FacebookAuthorizeFilter now exposes OnPermissionPrompt and OnDeniedPermissionPrompt hooks. So what do these hooks do? Let’s go over each in detail.

OnPermissionPrompt

Every time a prompt is about to be shown the OnPermissionPrompt is invoked and passed a PermissionContext. The hook enables you to modify the login flow by setting the context's Result property just like you’d do in an authorization filter.

To utilize the OnPermissionPrompt you can create a FacebookAuthorizeFilter like so:

public class CustomFacebookAuthorizeFilter : FacebookAuthorizeFilter
{
    public CustomFacebookAuthorizeFilter(FacebookConfiguration config)
        : base(config)
    { }

    protected override void OnPermissionPrompt(PermissionContext context)
    {
        // This sets context.Result to ShowPrompt(context) (default behavior)
        base.OnPermissionPrompt(context);
    }
}

And then you can apply the filter the same way you’d apply a FacebookAuthorizeFilter in the old world:

GlobalFilters.Filters.Add(new CustomFacebookAuthorizeFilter(yourFacebookConfiguration));

The default behavior of the OnPermissionPrompt hook is to set the PermissionContext's result to a ShowPromptActionResult by calling into the ShowPrompt method. The source code of the OnPermissionPrompt method looks like this:

protected virtual void OnPermissionPrompt(PermissionContext context)
{
    context.Result = ShowPrompt(context);
}

The ShowPrompt method returns an action result that shows the permission prompt to the user. We can do more though; if we really wanted to, we could redirect the user to a different action every time a prompt was about to be shown by overriding the default signature as shown:

protected override void OnPermissionPrompt(PermissionContext context)
{
    context.Result = new RedirectToRouteResult(new RouteValueDictionary{
            { "controller", "home" },
            { "action", "foo" }
        });
}

This would redirect us to the Home controller’s action Foo instead of prompting the user for permissions.

Lastly, we could modify the OnPermissionPrompt method to ignore every prompt by setting the PermissionContext's Result to null:

protected override void OnPermissionPrompt(PermissionContext context)
{
    context.Result = null;
}

This would make it so we never prompt a user for permissions. This isn’t ideal but it is possible.

OnDeniedPermissionPrompt

The OnDeniedPermissionPrompt is invoked when we detect that a user has revoked, declined, or skipped permissions. This occurs instead of the OnPermissionPrompt hook when there are denied permissions. Just like the OnPermissionPrompt we can utilize this hook by creating a FacebookAuthorizeFilter like so:

public class CustomFacebookAuthorizeFilter : FacebookAuthorizeFilter
{
    public CustomFacebookAuthorizeFilter(FacebookConfiguration config)
        : base(config)
    { }

    protected override void OnDeniedPermissionPrompt(PermissionContext context)
    {
        // Does nothing (default behavior is to leave context.Result null)
        base.OnDeniedPermissionPrompt(context);
    }
}

And then, just like with OnPermissionPrompt above, you can apply the filter the same way you’d apply a FacebookAuthorizeFilter in the old world:

GlobalFilters.Filters.Add(new CustomFacebookAuthorizeFilter(yourFacebookConfiguration));

The default behavior of the OnDeniedPermissionPrompt hook is to leave the passed in PermissionContext’s Result member null to ignore the “denied” permission prompt. The source code of the OnDeniedPermissionPrompt method looks like this:

protected virtual void OnDeniedPermissionPrompt(PermissionContext context)
{
}

Here we’re doing nothing, so the PermissionContext's Result member is null; this indicates that we do not want to show the permission prompt to the user. If we were to do the same thing as OnPermissionPrompt and set the PermissionContext's Result to a ShowPromptActionResult, we’d create an infinite loop in the login dialogs, because every time the method was invoked there would still be denied permissions.

Like the OnPermissionPrompt hook you can modify the login flow with the result that you return. For example, let’s say we wanted to redirect to a “skip” handling page if we detect a user has skipped permissions:

protected override void OnDeniedPermissionPrompt(PermissionContext context)
{
    if (context.SkippedPermissions.Any())
    {
        context.Result = new RedirectResult("http://www.contoso.com/SomeUrlToHandleSkips");
    }
}

PermissionContext

In the previous sections I did not discuss the PermissionContext object that is passed into both the OnPermissionPrompt and OnDeniedPermissionPrompt hooks. This object exposes a significant amount of information that enables developers to modify the login flow. Let’s examine what the PermissionContext has to offer:

  • IEnumerable<string> DeclinedPermissions
    • Permissions that were previously requested for but not granted for the lifetime of the application. This can happen by a user revoking, skipping or choosing not to allow permissions in the Facebook login dialog.
  • FacebookContext FacebookContext
    • Provides access to Facebook-specific information.
  • AuthorizationContext FilterContext
    • Provides access to filter information.
  • IEnumerable<string> MissingPermissions
    • The entire list of missing permissions for the current page, including DeclinedPermissions and SkippedPermissions.
  • HashSet<string> RequiredPermissions
    • The entire list of requested permissions for the current page. Includes permissions that were already prompted for.
  • IEnumerable<string> SkippedPermissions
    • Permissions that were previously requested for but skipped. This can happen from a user hitting the "skip" button when requesting permissions.
  • ActionResult Result
    • The ActionResult that should be used to control the login flow. If the value is null, we will continue on to the action that is intended to be invoked. Non-null values short-circuit the action.

With all of the information the PermissionContext has to offer you should be able to fully control the login flow.

Hook flow chart

For reference, I’ve created a flow chart to show how Facebook’s login dialog and our hooks interact.


Conclusion

We’d love to see people adopt the new Microsoft.AspNet.Facebook package early (via our nightly MyGet feed) to ensure we haven’t missed anything. If you have any questions feel free to reach out below.

Instructions on how to start using our nightly MyGet feed to get the Microsoft.AspNet.Facebook package can be found here.

To find or file issues do so here.

Dependency Injection in ASP.NET vNext


Dependency Injection (DI) is a software design pattern, a particular case of the Inversion of Control pattern, in which one or more dependencies are injected into dependent objects. The pattern is used to create program designs that are loosely coupled and testable.

This article assumes that you are already familiar with DI. If not, you can read this article for a brief introduction.

DI in ASP.NET vNext

In ASP.NET vNext, dependency injection is a first class citizen. While in the previous versions of the framework, DI was partially supported, in ASP.NET vNext it is available throughout the entire stack. A minimalistic DI container is provided out of the box but we are leaving the door open to BYOC (Bring Your Own Container). The default container is useful in the cases when you don’t need any advanced injection capabilities (see the known limitations section at the end of the post).

BYOC is possible because of an abstraction over the actual DI container implementation. The abstraction is the IServiceProvider interface, and it represents the minimal set of container behaviors that framework components can assume is present. All the framework components (MVC, Routing, SignalR, Entity Framework, etc.) rely only on the capabilities of IServiceProvider, but your own application code can use any feature that your chosen DI container has. When you BYOC, you can replace the default implementation of IServiceProvider with a wrapper around your own container. Once that happens, all dependency resolution calls will be routed to your own container. In the case where you want to use your own container strictly for your own custom types, we support fallback to our default container.
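For reference, the abstraction in question is simply the standard BCL System.IServiceProvider interface:

```csharp
namespace System
{
    // The entire contract the framework components rely on:
    // resolve a service by type, or return null if it isn't registered.
    public interface IServiceProvider
    {
        object GetService(Type serviceType);
    }
}
```

Because the contract is this small, wrapping a third-party container behind it is straightforward.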

Because all framework components use the same container to register services, we can now flow dependencies across the stack. This opens the door to new scenarios that were not possible before, like injecting a SignalR broadcaster into an MVC controller action. As we walk up the stack, there are different layers of dependency resolvers. All dependencies are added to a single container and everyone can see everybody else’s services. The single container also addresses the cross-cutting concern customization story. It is trivial in the new stack to change cross-cutting concerns (e.g. logging) via a single entry point.

The out of the box container supports the following lifestyles:

  • Instance: A specific instance is given all the time. You are responsible for its initial creation.
  • Transient: A new instance is created every time.
  • Singleton: A single instance is created and it acts like a singleton.
  • Scoped: A single instance is created inside the current scope. It is equivalent to a singleton within the current scope.
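Registration for each lifestyle looks roughly like this. The Add* method names follow the alpha-era ServiceCollection API; the service and implementation types (IClock, SystemClock, ICache, and so on) are purely illustrative:

```csharp
// Sketch: one registration per lifestyle on the out-of-the-box container.
var services = new ServiceCollection();

// Instance: you construct the object; the container always hands back this one.
services.AddInstance<IClock>(new SystemClock());

// Transient: a new HelloMessageGenerator is created on every resolution.
services.AddTransient<IMessageGenerator, HelloMessageGenerator>();

// Singleton: one instance is created and shared for the container's lifetime.
services.AddSingleton<ICache, MemoryCache>();

// Scoped: one instance per scope, which in a web app means one per request.
services.AddScoped<IUnitOfWork, UnitOfWork>();
```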

Per Request Scope

A popular feature for DI in web applications is to create objects that have a single instance per web request. This means that the object acts as a singleton inside that request, but two distinct requests will have different instances of the object.

In ASP.NET vNext, the Per Request Scope is achieved using a middleware and a scoped lifestyle. The middleware, when invoked, will create a new scoped container which will replace the container for the current request. All the subsequent middleware in the pipeline will then utilize the scoped container. After the request flows through the pipeline and the container middleware is signaled to complete, the scope is destroyed and all the objects created inside it are disposed.

 

The source code for the ContainerMiddleware is available on GitHub.

In the rare case in which you need to create your own container scope, you can use the IServiceScopeFactory to do this. When the default implementation of the IServiceProvider is created, the IServiceScopeFactory is one of the services that are registered by default.
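A sketch of creating your own scope might look like the following. The IServiceScopeFactory/IServiceScope member names follow the alpha DI abstractions, and IUnitOfWork is an illustrative scoped service:

```csharp
// Resolve the scope factory that the default IServiceProvider registers.
var scopeFactory = (IServiceScopeFactory)app.ApplicationServices
    .GetService(typeof(IServiceScopeFactory));

using (IServiceScope scope = scopeFactory.CreateScope())
{
    // Scoped services resolved from scope.ServiceProvider are unique to this scope.
    var unitOfWork = (IUnitOfWork)scope.ServiceProvider.GetService(typeof(IUnitOfWork));
    // ... use unitOfWork ...
} // Disposing the scope disposes the disposable objects created inside it.
```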

New vs Old

For the purpose of showing the differences between DI in the old and new stack, we are going to use an MVC controller that writes a string provided through an injected dependency:

Code Snippet
public interface IMessageGenerator
{
    string GenerateMessage();
}

public class HelloMessageGenerator : IMessageGenerator
{
    public string GenerateMessage()
    {
        return "Hello DI!";
    }
}

public class MessageController : Controller
{
    private readonly IMessageGenerator messageGenerator;

    public MessageController(IMessageGenerator generator)
    {
        if (generator == null)
        {
            throw new ArgumentNullException("generator", "The generator dependency is mandatory");
        }

        this.messageGenerator = generator;
    }

    public string GetMessage()
    {
        return this.messageGenerator.GenerateMessage();
    }
}

None of the code above changes between the old and the new stack. The only difference is where and how the dependencies are registered and resolved.

In the old MVC stack, controller dependencies are resolved through a custom controller factory. For the purpose of this demo, we are going to implement the Poor Man’s DI and manually compose the dependencies:

Code Snippet
public class DIControllerFactory : DefaultControllerFactory
{
    public override IController CreateController(RequestContext requestContext, string controllerName)
    {
        // If a message controller is requested...
        if (controllerName == "Message")
        {
            // ... then create a new controller and set up the dependency
            return new MessageController(new HelloMessageGenerator());
        }

        // Otherwise, fall back to the default implementation
        return base.CreateController(requestContext, controllerName);
    }
}

public class MvcApplication : HttpApplication
{
    protected void Application_Start()
    {
        ...

        // Register the controller factory
        ControllerBuilder.Current.SetControllerFactory(new DIControllerFactory());

        ...
    }
}

The controller factory will inject a concrete implementation (HelloMessageGenerator) of the interface (IMessageGenerator) in the Message controller.

The same code, ported to ASP.NET vNext, doesn’t need a Poor Man’s DI implementation. As mentioned before, in the new stack a DI container is available out of the box. Thus, all we need to do is tell it the mapping between the interface and the concrete implementation.

Code Snippet
public class Startup
{
    public void Configure(IBuilder app)
    {
        ...
        app.UseServices(services =>
        {
            ...
            // Set up the dependencies
            services.AddTransient<IMessageGenerator, HelloMessageGenerator>();
            ...
        });
        ...
    }
}

The default implementation of the controller factory in ASP.NET vNext uses the IServiceProvider to resolve the dependencies. If you BYOC, none of the code above will change. You would only have to tell the application to use a different implementation of IServiceProvider.

Replacing the default DI container

The code below shows how the previous sample can be rewritten to use Autofac instead of the out of the box container.

The code uses compiler directives to use Autofac code only for the full .NET 4.5 Framework because Autofac is not available for the .NET Core Framework 4.5. In Visual Studio “14” CTP you can change the target framework of a project by right clicking on it and going to Properties -> Active Target Framework.

Code Snippet
using System;
using Microsoft.AspNet.Builder;
using Microsoft.AspNet.Routing;
using Microsoft.Framework.DependencyInjection;
using Microsoft.Framework.DependencyInjection.Fallback;
using Microsoft.AspNet.Mvc;
using Microsoft.AspNet.RequestContainer;
using DISample.Models;

#if NET45
using Autofac;
using Microsoft.Framework.DependencyInjection.Autofac;
using Microsoft.Framework.OptionsModel;
#endif

namespace DISample
{
    public class Startup
    {
        public void Configure(IBuilder app)
        {
            // Add the MVC services
            ServiceCollection services = new ServiceCollection();
            services.AddMvc();
            services.Add(OptionsServices.GetDefaultServices());

// The NET45 symbol is defined when the project targets .NET Framework 4.5
#if NET45
            // Create the Autofac container
            ContainerBuilder builder = new ContainerBuilder();

            // Register the message generator through Autofac
            builder.RegisterType<HelloMessageGenerator>().As<IMessageGenerator>();

            // Populate the container and use the default application services as a fallback
            AutofacRegistration.Populate(
                builder,
                services,
                fallbackServiceProvider: app.ApplicationServices);

            IContainer container = builder.Build();

            // Replace the default container
            app.ApplicationServices = container.Resolve<IServiceProvider>();
#else
            // Here we are running on .NET Core Framework 4.5, so we cannot use Autofac
            services.AddTransient<IMessageGenerator, HelloMessageGenerator>();
            app.ApplicationServices = services.BuildServiceProvider(app.ApplicationServices);
#endif
            // MVC requires the container middleware
            app.UseMiddleware(typeof(ContainerMiddleware));

            app.UseMvc(routes =>
            {
                routes.MapRoute(
                    name: "default",
                    template: "{controller}/{action}/{id?}",
                    defaults: new { controller = "Message", action = "GetMessage" });
            });
        }
    }
}

In order to compile the code above, you must add the dependency injection packages to project.json. Since Autofac is not available for .NET Core Framework 4.5, the Autofac dependency is only defined in the net45 section:

Code Snippet
{
    "dependencies": {
        "Helios": "0.1-alpha-build-*",
        "Microsoft.AspNet.Mvc": "0.1-alpha-build-*",
        "Microsoft.AspNet.Identity.Entity": "0.1-alpha-build-*",
        "Microsoft.AspNet.Identity.Security": "0.1-alpha-build-*",
        "Microsoft.AspNet.Security.Cookies": "0.1-alpha-build-*",
        "Microsoft.AspNet.Server.WebListener": "0.1-alpha-build-*",
        "Microsoft.AspNet.StaticFiles": "0.1-alpha-build-*",
        "Microsoft.Data.Entity": "0.1-alpha-build-*",
        "Microsoft.Data.Entity.SqlServer": "0.1-alpha-build-*",
        "Microsoft.Framework.ConfigurationModel.Json": "0.1-alpha-build-*",
        "Microsoft.VisualStudio.Web.BrowserLink.Loader": "14.0-alpha",
        "Microsoft.Framework.DependencyInjection": "0.1-alpha-build-*"
    },
    "commands": {
        "web": "Microsoft.AspNet.Hosting --server Microsoft.AspNet.Server.WebListener --server.urls http://localhost:5000"
    },
    "configurations": {
        "net45": {
            "dependencies": {
                "System.Data": "",
                "System.ComponentModel.DataAnnotations": "",
                "Microsoft.Framework.DependencyInjection.Autofac": "0.1-alpha-build-*",
                "Autofac": "3.3.0"
            }
        },
        "k10": {
        }
    }
}

Best Practices for DI in ASP.NET vNext

When using DI, we recommend the following practices:

  1. Register all dependencies in the application startup method, before doing anything else.
  2. Avoid consuming the IServiceProvider interface directly. Instead, add explicit dependencies as constructor parameters and let the callers resolve them. Use abstract factories when the number of dependencies becomes hard to manage.
  3. Code against contracts rather than actual implementations.
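As a sketch of practices 2 and 3, the message controller behind the default route above can take the IMessageGenerator contract as a constructor parameter and let the container supply the registered HelloMessageGenerator (the GenerateMessage method name is an assumption for illustration):

```csharp
// Sketch: depends on the IMessageGenerator contract registered at startup,
// not on the concrete HelloMessageGenerator implementation.
public class MessageController : Controller
{
    private readonly IMessageGenerator _messageGenerator;

    // Constructor injection: the container resolves IMessageGenerator
    // when it activates the controller.
    public MessageController(IMessageGenerator messageGenerator)
    {
        _messageGenerator = messageGenerator;
    }

    // Matches the default route's action ("GetMessage").
    public string GetMessage()
    {
        return _messageGenerator.GenerateMessage();
    }
}
```

Because the controller never names HelloMessageGenerator, swapping implementations only requires changing the registration in the startup method.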

Known Limitations

The out of the box container for ASP.NET vNext Alpha has a few known limitations:

  • It only supports constructor injection
  • It can only resolve types with one and only one public constructor
  • It doesn’t support advanced features (like per thread scope or auto discovery)
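For example, given the one-public-constructor rule, the default container could activate the first type below but not the second (both types are hypothetical, used only to illustrate the limitation):

```csharp
// Resolvable: exactly one public constructor, with injectable parameters.
public class ReportService
{
    public ReportService(IMessageGenerator generator) { /* ... */ }
}

// Not resolvable by the default container: two public constructors make
// it ambiguous which one the container should call.
public class AmbiguousService
{
    public AmbiguousService() { }
    public AmbiguousService(IMessageGenerator generator) { }
}
```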

Summary

ASP.NET vNext is DI-friendly and it uses dependency injection throughout the stack. While there is a default minimalistic DI container out of the box, you can Bring Your Own Container and use advanced DI capabilities. The source code for Dependency Injection is available on GitHub.

We’d love to hear your feedback. Please provide it on GitHub, in comments on this blog, or on the ASP.NET vNext forum. If you ask a question on Stack Overflow, use the asp.net-vnext tag.

ASP.NET vNext is an open source project released under Apache License Version 2.0 by Microsoft Open Technologies, Inc. You can follow its progress and find instructions on how to contribute on https://github.com/aspnet.

Announcing the release of Dynamic Data provider and EntityDataSource control for Entity Framework 6


Today, we are pleased to announce the RTM release of the ASP.NET Dynamic Data provider and the EntityDataSource control for Entity Framework 6.

What’s in this release

- Dynamic Data provider for Entity Framework 6

- EntityDataSource control for Entity Framework 6

How to install

You can download this release for ASP.NET DynamicData.EFProvider (http://www.nuget.org/packages/Microsoft.AspNet.DynamicData.EFProvider/) and EntityDataSource (http://www.nuget.org/packages/Microsoft.AspNet.EntityDataSource/) from the NuGet gallery.

  • Install-Package Microsoft.AspNet.DynamicData.EFProvider -Version 6.0.0
  • Install-Package Microsoft.AspNet.EntityDataSource -Version 6.0.0

Getting started

Microsoft.AspNet.DynamicData.EFProvider

This package contains a Dynamic Data provider for Entity Framework 6. The provider works with a model (either Code First or Model First) created using Entity Framework 6. The package also installs the page templates, entity templates, and field templates required by Dynamic Data. These templates have been updated to use the Microsoft.AspNet.EntityDataSource control, which we are also releasing today.

For more information on ASP.NET DynamicData please see http://msdn.microsoft.com/en-us/library/cc488545.aspx

Following are the steps for using this package in an ASP.NET Dynamic Data application:

  • Create a new ASP.NET Dynamic Data Entities Web Application
  • Add the Microsoft.AspNet.DynamicData.EFProvider NuGet package
  • This will do the following:
    • Add a reference to the Dynamic Data EFProvider binary
    • Install the templates. If you are starting with a new project, you can safely overwrite the default templates; if you have an existing application, be careful not to overwrite your own changes. These templates replace the EntityDataSource control that shipped in the .NET Framework with Microsoft.AspNet.EntityDataSource, and update the page templates, field templates, and entity templates.
  • Create your model using Entity Framework Code First or EF Designer.
  • Add the following code in RegisterRoutes in Global.asax.cs to register your DbContext:
Code Snippet
DefaultModel.RegisterContext(
    new Microsoft.AspNet.DynamicData.ModelProviders.EFDataModelProvider(() => new YourDbContext()),
    new ContextConfiguration { ScaffoldAllTables = true });
 
  • Run the project
  • You should see all the tables listed on the default page.

Microsoft.AspNet.EntityDataSource control

This is an update to the EntityDataSource control which shipped in the .NET Framework. The EntityDataSource control has been updated to work with Entity Framework 6.

To use this control, do the following:

  • Create an ASP.NET application
  • Install the Microsoft.AspNet.EntityDataSource package
    • This package will
      • Install the runtime binary for Microsoft.AspNet.EntityDataSource
      • Install the EntityFramework version 6 NuGet package
      • Add the following tag prefix in web.config
Code Snippet
<pages>
  <controls>
    <add tagPrefix="ef" assembly="Microsoft.AspNet.EntityDataSource" namespace="Microsoft.AspNet.EntityDataSource" />
  </controls>
</pages>
  • Create a new Web Form page
  • Use the control as follows, binding it to any data-bound control such as GridView, FormView, etc.
Code Snippet
<asp:GridView ID="GridView1" runat="server" DataSourceID="GridDataSource"></asp:GridView>
<ef:EntityDataSource ID="GridDataSource" runat="server" EnableDelete="true" />

Give feedback

If you find any issues with this release, please file them at the Entity Framework CodePlex site: https://entityframework.codeplex.com

Thank you for trying out this release.

ASP.NET vNext Routing Overview


Introduction

The ASP.NET Routing system is primarily responsible for two operations:

  1. It maps incoming HTTP requests to a route handler given a collection of routes.
  2. It generates URLs (links) from these routes. 

A typical route definition is a string that contains a URL template to match, such as:

"blog/{year}-{month}-{day}/{slug}"

With the URL template, the route definition can also contain default values and constraints for parts of the URL. This helps define exactly which URLs the route can match, for example:

routes.MapRoute(
    name: "Product", 
    template: "Product/{productId}", 
    defaults: new { controller = "Product", action = "Details" }, 
    constraints: new { productId = @"\d+" });

The first argument, "Product", is the name of the route. The second argument, "Product/{productId}", is the URL template for this route: a matching URL must be of the form "Product/n", where n is the productId. The third argument contains the default route values; in this example, the default controller is ProductController and the default action is Details.

When a route matches a request, route data is generated and passed to the route handler. The route data contains a set of key value pairs typically representing the matched segments and default data. In an MVC application the matched segments are used to select a controller and action and execute it.

The ASP.NET Routing system can use the same templates to generate URLs as well. 
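For example, given the "Product" route defined above, a link can be generated from the route name and route values instead of hard-coding the URL (a sketch using MVC's UrlHelper; the exact helper surface in vNext may differ):

```csharp
// Inside a controller or view: the route values fill in the template's
// parameters, so "Product/{productId}" with productId = 42
// produces "/Product/42".
string url = Url.RouteUrl("Product", new { productId = 42 });
```

Generating URLs from route definitions keeps links in sync with the route table: if the template changes, generated links change with it.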

History

Originally, ASP.NET Routing was written as an integrated feature of ASP.NET MVC (in the pre-v1.0-beta days of ASP.NET MVC). Later it was extracted into its own assembly (System.Web.Routing) and made part of the .NET Framework. This more general routing system supports not only MVC, but also Web Forms with page routes.

When ASP.NET Web API was introduced, there was a need for it to be hostable outside of the ASP.NET pipeline in IIS. In that scenario Web API cannot use ASP.NET routing. As a result, Web API was built against a routing façade, which can delegate to its own version of routing. When Web API is hosted on ASP.NET in IIS with the WebHost NuGet package, the Web API routing façade delegates to ASP.NET’s built-in routing system.

What’s new in vNext

In ASP.NET vNext the routing systems have been unified to a single implementation that is used when hosting in IIS, self-hosting, and for in-memory local requests for unit testing. Routing in ASP.NET vNext resides in the Microsoft.AspNet.Routing NuGet package. The source repository is located here: https://github.com/aspnet/Routing. Similarly MVC and Web API have been unified into a new MVC framework.

There are several major improvements to routing in ASP.NET vNext.

Rerouting the request

To understand this new feature, it’s important to understand that routing and action selection are two separate systems.

  • The routing system’s responsibility is to find a matching route, create route data, and dispatch the request to a handler.
  • Action selection is an implementation detail of the MVC’s handler. It uses route data and other information from the incoming request to select the action to execute. In an MVC application, MvcRouteHandler is the handler the request gets dispatched to.

In MVC 5 and Web API 2 there is no way for the handler to return control to the routing system if no action could be matched. In this scenario MVC and Web API will typically return an “HTTP 404 - Not Found” response. That meant that a developer had to be careful about the order of the routes, and constraints had to be crafted carefully to make sure incoming URLs reached the desired action.

In ASP.NET vNext there are two improvements to this behavior:

  1. The handler can return control to the routing system, indicating that it cannot handle the request. In that case the routing system will try to match the next route in the route collection. This allows for a more flexible ordering of routes and reduces the need to add constraints to each route.
  2. Before dispatching the request to the handler, the routing system can use dependency-injected constraints, so it can in effect look at which actions are available downstream at the time of matching.

Routing as middleware

Since the release of ASP.NET MVC and Web API, the ASP.NET framework has evolved into a family of pluggable components rather than a single monolithic framework.

In ASP.NET vNext, Routing is built as a middleware component that can be added to the request pipeline. It can be composed along with any other middleware components, such as the static file handler, error page, or the SignalR server.

Of course just like in MVC 5, a developer can hook up handlers other than MVC’s MvcRouteHandler.

The request flow through the routing pipeline

This section brings together everything that you’ve read so far:

  1. When a request is processed by ASP.NET vNext, the routing middleware will try to match the request with routes in the route collection.
  2. If one of the routes matches the request, it looks for the handler for the route.
  3. The RouteAsync method of the handler is called.
  4. The RoutingContext has a flag called IsHandled. If that is set to true, it means the request was successfully handled by the handler. If it is set to false, it means the route wasn’t able to handle the request, and that the next route should be tried.
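Putting steps 3 and 4 together, a custom handler that declines requests might look roughly like the following sketch (RouteAsync and RoutingContext.IsHandled come from the description above; the IRouter interface name and the WriteAsync call are assumptions for illustration):

```csharp
// Sketch of a handler that only handles the "GetMessage" action and
// otherwise returns control to the routing middleware, which then
// tries the next route in the collection.
public class MessageRouteHandler : IRouter
{
    public async Task RouteAsync(RoutingContext context)
    {
        var action = context.RouteData.Values["action"] as string;
        if (action != "GetMessage")
        {
            // Leave IsHandled false: this route did not handle the
            // request, so the next route should be tried.
            context.IsHandled = false;
            return;
        }

        await context.HttpContext.Response.WriteAsync("Hello from routing!");
        context.IsHandled = true;
    }
}
```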

Inline constraints

Route constraints let developers restrict how the parameters in the route template are matched. For example, a regex constraint requires that a regular expression match the value in order for the route to match. In MVC 5, the constraint syntax looks like this:

routes.MapRoute("Product", "Product/{productId}", 
    defaults: new { controller = "Product", action = "Details" }, 
    constraints: new { productId = @"\d+" });

In this code sample the productId parameter has a regex constraint that matches an integer of one or more digits (0-9). This style of defining routes and constraints is called “convention-based routing”. In MVC 5 and Web API 2 we introduced a new style of constraints and defaults for attribute routing scenarios, called inline constraints. In ASP.NET vNext, we are introducing a similar syntax for defining route constraints for convention-based routing. It simplifies specifying the constraints with an inline syntax in the form of: 

"{parameter:constraint}"

A developer can now apply a constraint in a more succinct way as part of the URL template. The full list of constraints that can be applied is identical to the Web API 2 attribute routing constraints; here are a few examples.

| Constraint | Description | Example template |
| --- | --- | --- |
| alpha | Matches uppercase or lowercase Latin alphabet characters (a-z, A-Z) | "Product/{ProductName:alpha}" |
| int | Matches a signed 32-bit integer value | "Product/{ProductId:int}" |
| long | Matches a signed 64-bit integer value | "Product/{ProductId:long}" |
| minlength | Matches a string with a minimum length | "Product/{ProductName:minlength(10)}" |
| regex | Matches a regular expression | "Product/{productId:regex(^\\d{4}$)}" |

Optional URI Parameters and Default Values

A URI parameter can be made optional by appending a question mark to it. The following example specifies that the productId parameter must satisfy the built-in “long” constraint and is optional, as denoted by the “?” token.

routes.MapRoute("Product", "Product/{productId:long?}", new { controller = "Product", action = "Details" });

You can also specify a default value inside the route template. The following example shows a productId that is optional and defaults to 1 when the segment is omitted.

routes.MapRoute("Product", "Product/{productId:long=1}", new { controller = "Product", action = "Details" });

Looking forward

For the beta release of ASP.NET vNext the following features are planned. They should show up in the nightly builds as they are completed.

  1. Inline constraints for convention-based routing, as described above.
  2. Attribute-based routing with inline constraints. Web API 2 already has this feature, and it will soon be available in ASP.NET vNext.

Many improvements have been made to routing in ASP.NET vNext, and more are planned. Your feedback will be very valuable. Please give us your thoughts on these features, and feel free to let us know what you’d like to see that isn’t listed here – we’ll definitely consider it! You can provide feedback in comments on this blog or on GitHub (https://github.com/aspnet/Routing). If you ask a question on Stack Overflow, use the asp.net-vnext tag.
