
Full Server logout with IdentityServer4 and OpenID Connect Implicit Flow


This article shows how to implement a full logout from IdentityServer4 when using the OpenID Connect Implicit Flow. By design, when an access token is used to request protected data from a resource server, the token remains usable for as long as it is valid (AccessTokenLifetime), even after the client has logged out from the server, because the token represents a consent. This is the normal use case.

Sometimes, it is required that once a user logs out from IdentityServer4, no client with the same user can continue to use the protected data without logging in again. Reference tokens can be used to implement this. With reference tokens, you have full control over the lifecycle.

Code: https://github.com/damienbod/AspNet5IdentityServerAngularImplicitFlow

Other posts in this series:

To use reference tokens in IdentityServer4, the client can be defined with the AccessTokenType property set to AccessTokenType.Reference. When the user and the client successfully log in, a reference token as well as an id_token is returned to the client, instead of a self-contained JWT access token and an id_token (response_type: id_token token).

public static IEnumerable<Client> GetClients()
{
	// client credentials client
	return new List<Client>
	{
		new Client
		{
			ClientName = "angular2client",
			ClientId = "angular2client",
			AccessTokenType = AccessTokenType.Reference,
			AllowedGrantTypes = GrantTypes.Implicit,
			AllowAccessTokensViaBrowser = true,
			RedirectUris = new List<string>
			{
				"https://localhost:44311"

			},
			PostLogoutRedirectUris = new List<string>
			{
				"https://localhost:44311/Unauthorized"
			},
			AllowedCorsOrigins = new List<string>
			{
				"https://localhost:44311",
				"http://localhost:44311"
			},
			AllowedScopes = new List<string>
			{
				"openid",
				"dataEventRecords",
				"securedFiles"
			}
		}
	};
}
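
These clients are then registered with IdentityServer in ConfigureServices. A minimal sketch following the quickstart setup used in this series (cert is the signing certificate loaded in Startup, and GetScopes is assumed to be defined alongside GetClients):

public void ConfigureServices(IServiceCollection services)
{
	// register the in-memory clients and scopes with IdentityServer
	services.AddDeveloperIdentityServer()
		.SetSigningCredential(cert)
		.AddInMemoryScopes(Config.GetScopes())
		.AddInMemoryClients(Config.GetClients())
		.AddAspNetIdentity<ApplicationUser>();
}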

In IdentityServer4, when a user decides to logout, the IPersistedGrantService can be used to remove the reference tokens for this user and client. The RemoveAllGrantsAsync method of the IPersistedGrantService uses the identity subject and the client id to delete all of the corresponding grants. The GetSubjectId method is an IdentityServer4 extension method for the identity, which can be obtained from HttpContext.User. The client id must match the client from the configuration.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Security.Claims;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Identity;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Rendering;
using Microsoft.Extensions.Logging;
using IdentityServerWithAspNetIdentity.Models;
using IdentityServerWithAspNetIdentity.Models.AccountViewModels;
using IdentityServerWithAspNetIdentity.Services;
using IdentityServer4.Services;
using IdentityServer4.Quickstart.UI.Models;
using Microsoft.AspNetCore.Http.Authentication;
using IdentityServer4.Extensions;

namespace IdentityServerWithAspNetIdentity.Controllers
{
    [Authorize]
    public class AccountController : Controller
    {
        private readonly UserManager<ApplicationUser> _userManager;
        private readonly SignInManager<ApplicationUser> _signInManager;
        private readonly IEmailSender _emailSender;
        private readonly ISmsSender _smsSender;
        private readonly ILogger _logger;
        private readonly IIdentityServerInteractionService _interaction;
        private readonly IPersistedGrantService _persistedGrantService;

        public AccountController(
            IIdentityServerInteractionService interaction,
            IPersistedGrantService persistedGrantService,
            UserManager<ApplicationUser> userManager,
            SignInManager<ApplicationUser> signInManager,
            IEmailSender emailSender,
            ISmsSender smsSender,
            ILoggerFactory loggerFactory)
        {
            _interaction = interaction;
            _persistedGrantService = persistedGrantService;
            _userManager = userManager;
            _signInManager = signInManager;
            _emailSender = emailSender;
            _smsSender = smsSender;
            _logger = loggerFactory.CreateLogger<AccountController>();
        }

        /// <summary>
        /// Handle logout page postback
        /// </summary>
        [HttpPost]
        [ValidateAntiForgeryToken]
        public async Task<IActionResult> Logout(LogoutViewModel model)
        {
            var subjectId = HttpContext.User.Identity.GetSubjectId();

            // delete authentication cookie
            await HttpContext.Authentication.SignOutAsync();


            // set this so UI rendering sees an anonymous user
            HttpContext.User = new ClaimsPrincipal(new ClaimsIdentity());

            // get context information (client name, post logout redirect URI and iframe for federated signout)
            var logout = await _interaction.GetLogoutContextAsync(model.LogoutId);

            var vm = new LoggedOutViewModel
            {
                PostLogoutRedirectUri = logout?.PostLogoutRedirectUri,
                ClientName = logout?.ClientId,
                SignOutIframeUrl = logout?.SignOutIFrameUrl
            };


            await _persistedGrantService.RemoveAllGrantsAsync(subjectId, "angular2client");

            return View("LoggedOut", vm);
        }
    }
}

The IdentityServer4.AccessTokenValidation NuGet package is used on the resource server to validate the reference token sent from the client. The IdentityServerAuthenticationOptions options are configured as required.

"IdentityServer4.AccessTokenValidation": "1.0.1-rc1"

This package is configured in the Startup class in the Configure method.

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
	loggerFactory.AddConsole();
	loggerFactory.AddDebug();

	app.UseExceptionHandler("/Home/Error");
	app.UseCors("corsGlobalPolicy");
	app.UseStaticFiles();

	JwtSecurityTokenHandler.DefaultInboundClaimTypeMap.Clear();

	IdentityServerAuthenticationOptions identityServerValidationOptions = new IdentityServerAuthenticationOptions
	{
		Authority = "https://localhost:44318/",
		ScopeName = "dataEventRecords",
		ScopeSecret = "dataEventRecordsSecret",
		AutomaticAuthenticate = true,
		SupportedTokens = SupportedTokens.Both,
		// required if you want to return a 403 and not a 401 for forbidden responses
		AutomaticChallenge = true,
	};

	app.UseIdentityServerAuthentication(identityServerValidationOptions);

	app.UseMvc(routes =>
	{
		routes.MapRoute(
			name: "default",
			template: "{controller=Home}/{action=Index}/{id?}");
	});
}

The SPA client can then be used to log in to and log out from the server. If two or more clients are logged in with the same user, once the user logs out from the server, none of them will have access to the protected data: all existing reference tokens for this user and client can no longer be used to access it.

By using reference tokens, you have full control over the access lifecycle to the protected data. Caution should be taken when using long-lived access tokens.

Another strategy would be to use short-lived access tokens and make the client refresh them regularly. This reduces the time an access token remains usable after a logout, but the token can still be used to access the private data until it expires.
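
For example, a shorter lifetime can be set directly on the client configuration. A minimal sketch based on the client definition above (the 300 second value is illustrative; the default is 60 minutes):

new Client
{
	ClientName = "angular2client",
	ClientId = "angular2client",
	AccessTokenType = AccessTokenType.Reference,
	// illustrative value: the access token expires 5 minutes after it is issued
	AccessTokenLifetime = 300,
	AllowedGrantTypes = GrantTypes.Implicit,
	AllowAccessTokensViaBrowser = true
	// RedirectUris, PostLogoutRedirectUris, AllowedCorsOrigins, AllowedScopes as above
};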

Links

http://openid.net/specs/openid-connect-core-1_0.html

http://openid.net/specs/openid-connect-implicit-1_0.html

https://github.com/IdentityServer/IdentityServer4/issues/313#issuecomment-247589782

https://github.com/IdentityServer/IdentityServer4

https://leastprivilege.com

https://github.com/IdentityServer/IdentityServer4/issues/313

https://github.com/IdentityServer/IdentityServer4/issues/310



Setting the NLog database connection string in the ASP.NET Core appsettings.json


This article shows how the NLog connection string for the DatabaseTarget can be configured in the appsettings.json in an ASP.NET Core project and not the XML nlog.config file. All the NLog target properties can be configured in code if required and not just in the NLog XML configuration file.

Code: https://github.com/damienbod/AspNetCoreNlog

NLog posts in this series:

  1. ASP.NET Core logging with NLog and Microsoft SQL Server
  2. ASP.NET Core logging with NLog and Elasticsearch
  3. Setting the NLog database connection string in the ASP.NET Core appsettings.json

The XML nlog.config file is the same as in the previous post, with no database connection string configured.

<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      autoReload="true"
      internalLogLevel="Warn"
      internalLogFile="C:\git\damienbod\AspNetCoreNlog\Logs\internal-nlog.txt">


  <targets>

    <target name="database" xsi:type="Database" >

<!-- THIS is not required, read from the appsettings.json

<connectionString>
        Data Source=N275\MSSQLSERVER2014;Initial Catalog=Nlogs;Integrated Security=True;
</connectionString>
-->

<!--
  Remarks:
    The appsetting layouts require the NLog.Extended assembly.
    The aspnet-* layouts require the NLog.Web assembly.
    The Application value is determined by an AppName appSetting in Web.config.
    The "NLogDb" connection string determines the database that NLog write to.
    The create dbo.Log script in the comment below must be manually executed.

  Script for creating the dbo.Log table.

  SET ANSI_NULLS ON
  SET QUOTED_IDENTIFIER ON
  CREATE TABLE [dbo].[Log] (
      [Id] [int] IDENTITY(1,1) NOT NULL,
      [Application] [nvarchar](50) NOT NULL,
      [Logged] [datetime] NOT NULL,
      [Level] [nvarchar](50) NOT NULL,
      [Message] [nvarchar](max) NOT NULL,
      [Logger] [nvarchar](250) NULL,
      [Callsite] [nvarchar](max) NULL,
      [Exception] [nvarchar](max) NULL,
    CONSTRAINT [PK_dbo.Log] PRIMARY KEY CLUSTERED ([Id] ASC)
      WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
  ) ON [PRIMARY]
-->

          <commandText>
              insert into dbo.Log (
              Application, Logged, Level, Message,
              Logger, CallSite, Exception
              ) values (
              @Application, @Logged, @Level, @Message,
              @Logger, @Callsite, @Exception
              );
          </commandText>

          <parameter name="@application" layout="AspNetCoreNlog" />
          <parameter name="@logged" layout="${date}" />
          <parameter name="@level" layout="${level}" />
          <parameter name="@message" layout="${message}" />

          <parameter name="@logger" layout="${logger}" />
          <parameter name="@callSite" layout="${callsite:filename=true}" />
          <parameter name="@exception" layout="${exception:tostring}" />
      </target>

  </targets>

  <rules>
    <logger name="*" minlevel="Trace" writeTo="database" />

  </rules>
</nlog>

The NLog DatabaseTarget connection string is configured in the appsettings.json as described in the ASP.NET Core configuration docs.

{
    "Logging": {
        "IncludeScopes": false,
        "LogLevel": {
            "Default": "Debug",
            "System": "Information",
            "Microsoft": "Information"
        }
    },
    "ElasticsearchUrl": "http://localhost:9200",
    "ConnectionStrings": {
        "NLogDb": "Data Source=N275\\MSSQLSERVER2014;Initial Catalog=Nlogs;Integrated Security=True;"
    }
}

The configuration is then read in the Startup constructor.

public Startup(IHostingEnvironment env)
{
	var builder = new ConfigurationBuilder()
		.SetBasePath(env.ContentRootPath)
		.AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
		.AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
		.AddEnvironmentVariables();
	Configuration = builder.Build();
}

The NLog DatabaseTarget is then configured in the Configure method to use the connection string from the app settings, and all DatabaseTarget instances for NLog are set to use it. All target properties can be configured in this way if required.

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
	loggerFactory.AddNLog();

	foreach (DatabaseTarget target in LogManager.Configuration.AllTargets.Where(t => t is DatabaseTarget))
	{
		target.ConnectionString = Configuration.GetConnectionString("NLogDb");
	}

	LogManager.ReconfigExistingLoggers();

	app.UseMvc();
}
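
Any class can then log through the standard ILogger abstraction and NLog writes the entries to the configured database target. A minimal sketch to verify the setup (the controller is illustrative, not part of the original project):

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

public class HomeController : Controller
{
    private readonly ILogger<HomeController> _logger;

    public HomeController(ILogger<HomeController> logger)
    {
        _logger = logger;
    }

    [HttpGet("/")]
    public IActionResult Index()
    {
        // this entry ends up in the dbo.Log table via the NLog database target
        _logger.LogInformation("Home page requested");
        return Ok("logged");
    }
}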

Links

https://github.com/NLog/NLog.Extensions.Logging

https://github.com/NLog

https://github.com/NLog/NLog/blob/38aef000f916bd5ffd8b80a5576afa2423192e84/examples/targets/Configuration%20API/Database/MSSQL/Example.cs

https://docs.asp.net/en/latest/fundamentals/logging.html

https://msdn.microsoft.com/en-us/magazine/mt694089.aspx

https://github.com/nlog/NLog/wiki/Database-target

https://docs.asp.net/en/latest/fundamentals/configuration.html


IdentityServer4, Web API and Angular2 in a single ASP.NET Core project


This article shows how IdentityServer4 with Identity, a data Web API, and an Angular 2 SPA could be set up inside a single ASP.NET Core project. The application uses the OpenID Connect Implicit Flow with reference tokens to access the API. The Angular 2 application is built using Webpack.

Code: https://github.com/damienbod/AspNet5IdentityServerAngularImplicitFlow

Other posts in this series:

Step 1: Create app and add IdentityServer4

Use the Quickstart6 AspNetIdentity from IdentityServer4 to set up the application. Then edit the project.json file to add your packages as required. I added the Microsoft.AspNetCore.Authentication.JwtBearer package and also the IdentityServer4.AccessTokenValidation package. The buildOptions have to be extended to ignore the node_modules folder.

{
  "userSecretsId": "aspnet-IdentityServerWithAspNetIdentity-1e7bf5d8-6c32-4dd3-b77d-2d7d2e0f5099",

    "dependencies": {
        "IdentityServer4": "1.0.0-rc1-update2",
        "IdentityServer4.AspNetIdentity": "1.0.0-rc1-update2",

        "Microsoft.NETCore.App": {
            "version": "1.0.1",
            "type": "platform"
        },
        "Microsoft.AspNetCore.Authentication.Cookies": "1.0.0",
        "Microsoft.AspNetCore.Diagnostics": "1.0.0",
        "Microsoft.AspNetCore.Diagnostics.EntityFrameworkCore": "1.0.0",
        "Microsoft.AspNetCore.Identity.EntityFrameworkCore": "1.0.0",
        "Microsoft.AspNetCore.Mvc": "1.0.0",
        "Microsoft.AspNetCore.Razor.Tools": {
            "version": "1.0.0-preview2-final",
            "type": "build"
        },
        "Microsoft.AspNetCore.Server.IISIntegration": "1.0.0",
        "Microsoft.AspNetCore.Server.Kestrel": "1.0.0",
        "Microsoft.AspNetCore.StaticFiles": "1.0.0",
        "Microsoft.EntityFrameworkCore.Sqlite": "1.0.0",
        "Microsoft.EntityFrameworkCore.Sqlite.Design": {
            "version": "1.0.0",
            "type": "build"
        },
        "Microsoft.EntityFrameworkCore.Tools": {
            "version": "1.0.0-preview2-final",
            "type": "build"
        },
        "Microsoft.Extensions.Configuration.EnvironmentVariables": "1.0.0",
        "Microsoft.Extensions.Configuration.Json": "1.0.0",
        "Microsoft.Extensions.Configuration.UserSecrets": "1.0.0",
        "Microsoft.Extensions.Logging": "1.0.0",
        "Microsoft.Extensions.Logging.Console": "1.0.0",
        "Microsoft.Extensions.Logging.Debug": "1.0.0",
        "Microsoft.Extensions.Options.ConfigurationExtensions": "1.0.0",
        "Microsoft.VisualStudio.Web.BrowserLink.Loader": "14.0.0",
        "Microsoft.VisualStudio.Web.CodeGeneration.Tools": {
            "version": "1.0.0-preview2-final",
            "type": "build"
        },
        "Microsoft.VisualStudio.Web.CodeGenerators.Mvc": {
            "version": "1.0.0-preview2-final",
            "type": "build"
        },
        "Microsoft.AspNetCore.Authentication.JwtBearer": "1.0.0",
        "IdentityServer4.AccessTokenValidation": "1.0.1-rc1"
    },

  "tools": {
    "BundlerMinifier.Core": "2.0.238",
    "Microsoft.AspNetCore.Razor.Tools": "1.0.0-preview2-final",
    "Microsoft.AspNetCore.Server.IISIntegration.Tools": "1.0.0-preview2-final",
    "Microsoft.EntityFrameworkCore.Tools": "1.0.0-preview2-final",
    "Microsoft.Extensions.SecretManager.Tools": "1.0.0-preview2-final",
    "Microsoft.VisualStudio.Web.CodeGeneration.Tools": {
      "version": "1.0.0-preview2-final",
      "imports": [
        "portable-net45+win8"
      ]
    }
  },

  "frameworks": {
    "netcoreapp1.0": {
      "imports": [
        "dotnet5.6",
        "portable-net45+win8"
      ]
    }
  },

  "buildOptions": {
    "emitEntryPoint": true,
    "preserveCompilationContext": true,
    "compile": {
        "exclude": [ "node_modules" ]
    }
  },

  "runtimeOptions": {
    "configProperties": {
      "System.GC.Server": true
    }
  },

  "publishOptions": {
    "include": [
      "wwwroot",
      "Views",
      "Areas/**/Views",
      "appsettings.json",
      "web.config"
    ]
  },

  "scripts": {
    "prepublish": [ "bower install", "dotnet bundle" ],
    "postpublish": [ "dotnet publish-iis --publish-folder %publish:OutputPath% --framework %publish:FullTargetFramework%" ]
  }
}

The IProfileService interface is implemented to add your user claims to the tokens. The IdentityWithAdditionalClaimsProfileService class implements the IProfileService interface in this example and is added to the services in the Startup class.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Security.Claims;
using System.Threading.Tasks;
using IdentityModel;
using IdentityServer4.Extensions;
using IdentityServer4.Models;
using IdentityServer4.Services;
using IdentityServerWithAspNetIdentity.Models;
using Microsoft.AspNetCore.Identity;

namespace ResourceWithIdentityServerWithClient
{
    public class IdentityWithAdditionalClaimsProfileService : IProfileService
    {
        private readonly IUserClaimsPrincipalFactory<ApplicationUser> _claimsFactory;
        private readonly UserManager<ApplicationUser> _userManager;

        public IdentityWithAdditionalClaimsProfileService(UserManager<ApplicationUser> userManager,  IUserClaimsPrincipalFactory<ApplicationUser> claimsFactory)
        {
            _userManager = userManager;
            _claimsFactory = claimsFactory;
        }

        public async Task GetProfileDataAsync(ProfileDataRequestContext context)
        {
            var sub = context.Subject.GetSubjectId();

            var user = await _userManager.FindByIdAsync(sub);
            var principal = await _claimsFactory.CreateAsync(user);

            var claims = principal.Claims.ToList();
            if (!context.AllClaimsRequested)
            {
                claims = claims.Where(claim => context.RequestedClaimTypes.Contains(claim.Type)).ToList();
            }

            claims.Add(new Claim(JwtClaimTypes.GivenName, user.UserName));
            //new Claim(JwtClaimTypes.Role, "admin"),
            //new Claim(JwtClaimTypes.Role, "dataEventRecords.admin"),
            //new Claim(JwtClaimTypes.Role, "dataEventRecords.user"),
            //new Claim(JwtClaimTypes.Role, "dataEventRecords"),
            //new Claim(JwtClaimTypes.Role, "securedFiles.user"),
            //new Claim(JwtClaimTypes.Role, "securedFiles.admin"),
            //new Claim(JwtClaimTypes.Role, "securedFiles")

            if (user.IsAdmin)
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "admin"));
            }
            else
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "user"));
            }

            if (user.DataEventRecordsRole == "dataEventRecords.admin")
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "dataEventRecords.admin"));
                claims.Add(new Claim(JwtClaimTypes.Role, "dataEventRecords.user"));
                claims.Add(new Claim(JwtClaimTypes.Role, "dataEventRecords"));
            }
            else
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "dataEventRecords.user"));
                claims.Add(new Claim(JwtClaimTypes.Role, "dataEventRecords"));
            }

            if (user.SecuredFilesRole == "securedFiles.admin")
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "securedFiles.admin"));
                claims.Add(new Claim(JwtClaimTypes.Role, "securedFiles.user"));
                claims.Add(new Claim(JwtClaimTypes.Role, "securedFiles"));
            }
            else
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "securedFiles.user"));
                claims.Add(new Claim(JwtClaimTypes.Role, "securedFiles"));
            }

            claims.Add(new System.Security.Claims.Claim(StandardScopes.Email.Name, user.Email));


            context.IssuedClaims = claims;
        }

        public async Task IsActiveAsync(IsActiveContext context)
        {
            var sub = context.Subject.GetSubjectId();
            var user = await _userManager.FindByIdAsync(sub);
            context.IsActive = user != null;
        }
    }
}

Step 2: Add the Web API for the resource data

The MVC Controller DataEventRecordsController is used for CRUD API requests. This is just a dummy implementation. I would implement all resource server logic in a separate project. The Authorize attribute is used with and without policies. The policies are configured in the Startup class.

using ResourceWithIdentityServerWithClient.Model;

using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using System.Collections.Generic;
using System;

namespace ResourceWithIdentityServerWithClient.Controllers
{
    [Authorize]
    [Route("api/[controller]")]
    public class DataEventRecordsController : Controller
    {
        [Authorize("dataEventRecordsUser")]
        [HttpGet]
        public IActionResult Get()
        {
            return Ok(new List<DataEventRecord> { new DataEventRecord { Id =1, Description= "Fake", Name="myname", Timestamp= DateTime.UtcNow } });
        }

        [Authorize("dataEventRecordsAdmin")]
        [HttpGet("{id}")]
        public IActionResult Get(long id)
        {
            return Ok(new DataEventRecord { Id = 1, Description = "Fake", Name = "myname", Timestamp = DateTime.UtcNow });
        }

        [Authorize("dataEventRecordsAdmin")]
        [HttpPost]
        public void Post([FromBody]DataEventRecord value)
        {

        }

        [Authorize("dataEventRecordsAdmin")]
        [HttpPut("{id}")]
        public void Put(long id, [FromBody]DataEventRecord value)
        {

        }

        [Authorize("dataEventRecordsAdmin")]
        [HttpDelete("{id}")]
        public void Delete(long id)
        {

        }
    }
}

Step 3: Add the Angular 2 client

The Angular 2 client part of the application is set up as described in the ASP.NET Core, Angular2 with Webpack and Visual Studio article. Webpack is then used to build the client application.

Any SPA client which supports the OpenID Connect Implicit Flow can be used. IdentityServer4 (IdentityModel) also has good examples using the OIDC JavaScript client.

Step 4: Configure application host URL

The URL host is the same for both the client and the server. This is configured in the Config class as a static property HOST_URL and used throughout the server side of the application.

public class Config
{
    public static string HOST_URL = "https://localhost:44363";

    // ... client and scope configuration
}

The client application reads the configuration from the app.constants.ts provider.

import { Injectable } from '@angular/core';

@Injectable()
export class Configuration {
    public Server: string = "https://localhost:44363";
}

IIS Express is configured to run with HTTPS and matches these configurations. If a different port is used, you need to change these two code configurations. In a production environment, the data should be configurable per deployment.
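
One way to do this would be to read the host URL from the appsettings.json instead of the constant; a minimal sketch, assuming a hypothetical HostUrl key and a GetClients(hostUrl) overload, neither of which is in the original project:

// assumed appsettings.json entry, not part of the original project:
// "HostUrl": "https://localhost:44363"

public void ConfigureServices(IServiceCollection services)
{
    var hostUrl = Configuration["HostUrl"] ?? "https://localhost:44363";

    // a GetClients(hostUrl) overload could then replace the static Config.HOST_URL usage
    services.AddDeveloperIdentityServer()
        .AddInMemoryClients(Config.GetClients(hostUrl));
}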

Step 5: Deactivate the consent view

The consent view is deactivated because the client is the only client to use this data resource and always requires the same consent. To improve the user experience, the consent view is removed from the flow. This is done by setting the RequireConsent property to false in the client configuration.

public static IEnumerable<Client> GetClients()
{
	// client credentials client
	return new List<Client>
	{
		new Client
		{
			ClientName = "singleapp",
			ClientId = "singleapp",
			RequireConsent = false,
			AccessTokenType = AccessTokenType.Reference,
			//AccessTokenLifetime = 600, // 10 minutes, default 60 minutes
			AllowedGrantTypes = GrantTypes.Implicit,
			AllowAccessTokensViaBrowser = true,
			RedirectUris = new List<string>
			{
				HOST_URL

			},
			PostLogoutRedirectUris = new List<string>
			{
				HOST_URL + "/Unauthorized"
			},
			AllowedCorsOrigins = new List<string>
			{
				HOST_URL
			},
			AllowedScopes = new List<string>
			{
				"openid",
				"dataEventRecords"
			}
		}
	};
}

Step 6: Deactivate logout screens

When the Angular 2 client requests a logout, the client is logged out, reference tokens are invalidated for this application and user, and the user is redirected back to the Angular 2 application without the server account logout views. This improves the user experience.

The two existing Logout action methods are removed from the AccountController and the following is implemented instead. The controller requires the IPersistedGrantService to remove the reference tokens.

/// <summary>
/// special logout to skip logout screens
/// </summary>
/// <param name="logoutId"></param>
/// <returns></returns>
[HttpGet]
public async Task<IActionResult> Logout(string logoutId)
{
	var user = HttpContext.User.Identity.Name;
	var subjectId = HttpContext.User.Identity.GetSubjectId();

	// delete authentication cookie
	await HttpContext.Authentication.SignOutAsync();


	// set this so UI rendering sees an anonymous user
	HttpContext.User = new ClaimsPrincipal(new ClaimsIdentity());

	// get context information (client name, post logout redirect URI and iframe for federated signout)
	var logout = await _interaction.GetLogoutContextAsync(logoutId);

	var vm = new LoggedOutViewModel
	{
		PostLogoutRedirectUri = logout?.PostLogoutRedirectUri,
		ClientName = logout?.ClientId,
		SignOutIframeUrl = logout?.SignOutIFrameUrl
	};


	await _persistedGrantService.RemoveAllGrantsAsync(subjectId, "singleapp");

	return Redirect(Config.HOST_URL + "/Unauthorized");
}

Step 7: Configure Startup to use all three application parts

The Startup class configures all three application parts to run together. The Angular 2 application requires that its client routes are routed on the client and not the server. Middleware is added so that the server does not handle the client routes.

The API service needs to validate the reference token for each request. Policies are added for this, and the UseIdentityServerAuthentication extension method is used to check the reference tokens on each request.

IdentityServer4 is set up to use Identity with a SQLite database.

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Identity.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using IdentityServerWithAspNetIdentity.Data;
using IdentityServerWithAspNetIdentity.Models;
using IdentityServerWithAspNetIdentity.Services;
using QuickstartIdentityServer;
using IdentityServer4.Services;
using System.Security.Cryptography.X509Certificates;
using System.IO;
using System.Linq;
using Microsoft.AspNetCore.Http;
using System.Collections.Generic;
using System.IdentityModel.Tokens.Jwt;
using System;
using Microsoft.AspNetCore.Authorization;
using IdentityServer4.AccessTokenValidation;

namespace ResourceWithIdentityServerWithClient
{
    public class Startup
    {
        private readonly IHostingEnvironment _environment;

        public Startup(IHostingEnvironment env)
        {
            var builder = new ConfigurationBuilder()
                .SetBasePath(env.ContentRootPath)
                .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
                .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true);

            if (env.IsDevelopment())
            {
                builder.AddUserSecrets();
            }

            _environment = env;

            builder.AddEnvironmentVariables();
            Configuration = builder.Build();
        }

        public IConfigurationRoot Configuration { get; }

        public void ConfigureServices(IServiceCollection services)
        {
            var cert = new X509Certificate2(Path.Combine(_environment.ContentRootPath, "damienbodserver.pfx"), "");

            services.AddDbContext<ApplicationDbContext>(options =>
                options.UseSqlite(Configuration.GetConnectionString("DefaultConnection")));

            services.AddIdentity<ApplicationUser, IdentityRole>()
            .AddEntityFrameworkStores<ApplicationDbContext>()
            .AddDefaultTokenProviders();

            var guestPolicy = new AuthorizationPolicyBuilder()
            .RequireAuthenticatedUser()
            .RequireClaim("scope", "dataEventRecords")
            .Build();

            services.AddAuthorization(options =>
            {
                options.AddPolicy("dataEventRecordsAdmin", policyAdmin =>
                {
                    policyAdmin.RequireClaim("role", "dataEventRecords.admin");
                });
                options.AddPolicy("dataEventRecordsUser", policyUser =>
                {
                    policyUser.RequireClaim("role", "dataEventRecords.user");
                });

            });

            services.AddMvc();

            services.AddTransient<IProfileService, IdentityWithAdditionalClaimsProfileService>();

            services.AddTransient<IEmailSender, AuthMessageSender>();
            services.AddTransient<ISmsSender, AuthMessageSender>();

            services.AddDeveloperIdentityServer()
                .SetSigningCredential(cert)
                .AddInMemoryScopes(Config.GetScopes())
                .AddInMemoryClients(Config.GetClients())
                .AddAspNetIdentity<ApplicationUser>()
                .AddProfileService<IdentityWithAdditionalClaimsProfileService>();
        }

        public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
        {
            loggerFactory.AddConsole(Configuration.GetSection("Logging"));
            loggerFactory.AddDebug();

            var angularRoutes = new[] {
                "/Unauthorized",
                "/Forbidden",
                "/home",
                "/dataeventrecords/",
                "/dataeventrecords/create",
                "/dataeventrecords/edit/",
                "/dataeventrecords/list",
                };

            app.Use(async (context, next) =>
            {
                if (context.Request.Path.HasValue && null != angularRoutes.FirstOrDefault(
                    (ar) => context.Request.Path.Value.StartsWith(ar, StringComparison.OrdinalIgnoreCase)))
                {
                    context.Request.Path = new PathString("/");
                }

                await next();
            });

            app.UseDefaultFiles();

            if (env.IsDevelopment())
            {
                app.UseDeveloperExceptionPage();
                app.UseDatabaseErrorPage();
                app.UseBrowserLink();
            }
            else
            {
                app.UseExceptionHandler("/Home/Error");
            }

            app.UseIdentity();
            app.UseIdentityServer();

            app.UseStaticFiles();

            JwtSecurityTokenHandler.DefaultInboundClaimTypeMap.Clear();

            IdentityServerAuthenticationOptions identityServerValidationOptions = new IdentityServerAuthenticationOptions
            {
                Authority = Config.HOST_URL + "/",
                ScopeName = "dataEventRecords",
                ScopeSecret = "dataEventRecordsSecret",
                AutomaticAuthenticate = true,
                SupportedTokens = SupportedTokens.Both,
                // TokenRetriever = _tokenRetriever,
                // required if you want to return a 403 and not a 401 for forbidden responses
                AutomaticChallenge = true,
            };

            app.UseIdentityServerAuthentication(identityServerValidationOptions);

            app.UseMvcWithDefaultRoute();
        }
    }
}

The application can then be run and tested. To test, right click the project and debug.

Links

https://github.com/IdentityServer/IdentityServer4

http://docs.identityserver.io/en/dev/

https://github.com/IdentityServer/IdentityServer4.Samples

https://docs.asp.net/en/latest/security/authentication/identity.html

https://github.com/IdentityServer/IdentityServer4/issues/349

ASP.NET Core, Angular2 with Webpack and Visual Studio


Using SASS with Webpack, Angular2 and Visual Studio


This post shows how to use SASS with Webpack and Angular 2 in Visual Studio. I had various problems trying to get this to work from Visual Studio using a Webpack build. The following is a solution which works, but not the only one.

Code: https://github.com/damienbod/Angular2WebpackVisualStudio

Install node-sass globally, so that the package is available everywhere. The latest installed node-sass will then be available on the path.

npm install node-sass -g

Add the SASS packages as required in the project's npm package.json file.

 "devDependencies": {
        "node-sass": "3.10.1",
        "sass-loader": "^3.1.2",
        "style-loader": "^0.13.0",
        ...
}

The SASS configuration can then be added to the Webpack config file(s). The SASS scss files are built as part of the Webpack build.

{
  test: /\.scss$/,
  loaders: ["style", "css", "sass"]
},

Now a SASS file can be created and appended to any Angular 2 component.

body {
    padding-top: 50px;
}

.starter-template {
    padding: 40px 15px;
    text-align: center;
}

.navigationLinkButton:hover {
    cursor: pointer;
}

a {
    color: #03A9F4;
}

The scss file or files can be used in the Angular 2 component TypeScript file via the @Component decorator. The styles property expects an array of strings, so the result of each scss require call needs to be converted to a string, otherwise it will not work. Thanks to Jackie Gleason for this solution.

import { Component, OnInit } from '@angular/core';
import { Router } from '@angular/router';

@Component({
    selector: 'my-app',
    template: require('./app.component.html'),
    styles: [String(require('./app.component.scss')), String(require('../style/app.scss'))]
})

export class AppComponent {

    constructor(private router: Router) {
    }
}

If the Webpack build fails somewhere with an error message like "binding.node. Try reinstalling `node-sass`", try the following fixes:

1: Reinstall node-sass

npm install node-sass -g

2: Edit the Visual Studio project and settings

You can make Visual Studio use the Node.js from your path, and not the version installed with Visual Studio, by changing Tools > Options > Projects and Solutions > External Web Tools and moving the path option to the top. This solution is from Mads Kristensen; m1les gave this solution on Stack Overflow.

You can then view the CSS styles created from SASS in the Visual Studio Webpack build.

Links

http://stackoverflow.com/questions/31301582/task-runner-explorer-cant-load-tasks/31444245

https://github.com/webpack/style-loader/issues/123

http://sass-lang.com/

https://www.bensmithett.com/smarter-css-builds-with-webpack/

https://github.com/jtangelder/sass-loader

http://eng.localytics.com/faster-sass-builds-with-webpack/


Angular2 autocomplete with ASP.NET Core and Elasticsearch


This article shows how autocomplete could be implemented in Angular 2 using ASP.NET Core MVC as a data service. The API uses Elasticsearch for the search queries. ng2-completer is used to implement the Angular 2 autocomplete functionality.

Code: https://github.com/damienbod/Angular2AutoCompleteAspNetCoreElasticsearch

To use autocomplete in the Angular 2 application, the ng2-completer package needs to be added to the dependencies in the npm package.json file.

"ng2-completer": "^0.2.2"

This project uses Webpack to build the Angular 2 application and all vendor packages are added to the vendor.ts which can then be used throughout the application. The ng2-completer package is added to the vendor.ts file which is then built using Webpack.

import '@angular/platform-browser-dynamic';
import '@angular/platform-browser';
import '@angular/core';
import '@angular/http';
import '@angular/router';

import 'ng2-completer';

import 'bootstrap/dist/js/bootstrap';

import './css/bootstrap.css';
import './css/bootstrap-theme.css';

PersonCity is used as the data model for the autocomplete. The server side of the application uses the PersonCity model to store and search for data.

export class PersonCity {
    public id: number;
    public name: string;
    public info: string;
    public familyName: string;
}

The ng2-completer autocomplete is used within the PersonCityAutocompleteSearchComponent. This component returns a PersonCity object to the consuming component. When a new search request is finished, the @Output bindModelPersonCityChange is updated. The @Output is chained to the onPersonCitySelected event from ng2-completer.

A custom CompleterService, PersonCityDataService, is used to request the data from the server.

import { Component, Inject, EventEmitter, Input, Output, OnInit, AfterViewInit, ElementRef } from '@angular/core';
import { Http, Response } from "@angular/http";

import { Subscription } from 'rxjs/Subscription';
import { Observable } from 'rxjs/Observable';
import { Router } from  '@angular/router';

import { Configuration } from '../app.constants';
import { PersonCityDataService } from './personCityDataService';
import { PersonCity } from './personCity';

import { CompleterService, CompleterItem } from 'ng2-completer';

@Component({
    selector: 'autocompletesearch',
  template: `
<ng2-completer [dataService]="dataService" (selected)="onPersonCitySelected($event)" [minSearchLength]="0" [disableInput]="disableAutocomplete"></ng2-completer>

`,
  styles: [String(require('./personCityAutocompleteSearch.component.scss'))]
})

export class PersonCityAutocompleteSearchComponent implements OnInit    {

    constructor(private completerService: CompleterService, private http: Http, private _configuration: Configuration) {
        // a custom CompleterData implementation is used instead of completerService.local(...)
        this.dataService = new PersonCityDataService(http, _configuration);
    }

    @Output() bindModelPersonCityChange = new EventEmitter<PersonCity>();
    @Input() bindModelPersonCity: PersonCity;
    @Input() disableAutocomplete: boolean = false;

    private searchStr: string;
    private dataService: PersonCityDataService;

    ngOnInit() {
        console.log("ngOnInit AutocompleteSearch");
    }

    public onPersonCitySelected(selected: CompleterItem) {
        console.log(selected);
        this.bindModelPersonCityChange.emit(selected.originalObject);
    }
}

The PersonCityDataService extends Subject<CompleterItem[]> and implements the CompleterData interface as described in the ng2-completer documentation. When PersonCity items are returned from the service, the results are mapped to CompleterItem items as required. This could also be done on the server, and then the default remote service could be used. By using the custom service, it can easily be extended to add security headers for the data service as required.

import { Http, Response } from "@angular/http";
import { Subject } from "rxjs/Subject";

import { CompleterData, CompleterItem } from 'ng2-completer';
import { Configuration } from '../app.constants';

export class PersonCityDataService extends Subject<CompleterItem[]> implements CompleterData {
    constructor(private http: Http, private _configuration: Configuration) {
        super();

        this.actionUrl = _configuration.Server + 'api/personcity/search/';
    }

    private actionUrl: string;

    public search(term: string): void {
        this.http.get(this.actionUrl + term)
            .map((res: Response) => {
                // Convert the result to CompleterItem[]
                let data = res.json();
                let matches: CompleterItem[] = data.map((personcity: any) => {
                    return {
                        title: personcity.name,
                        description: personcity.familyName + ", " + personcity.info,
                        originalObject: personcity
                    }
                });
                this.next(matches);
            })
            .subscribe();
    }

    public cancel() {
        // Handle cancel
    }
}

The PersonCityAutocompleteSearchComponent also implements its specific styles using the PersonCityAutocompleteSearchComponent scss file. The ng2-completer components come with CSS classes which can be extended or overwritten.


.completer-input {
    width: 500px;
    display: block;
    height: 34px;
    padding: 6px 12px;
    font-size: 14px;
    line-height: 1.42857143;
    color: #555;
    background-color: #fff;
    background-image: none;
    border: 1px solid #ccc;
    border-radius: 4px;
  -webkit-box-shadow: inset 0 1px 1px rgba(0, 0, 0, .075);
          box-shadow: inset 0 1px 1px rgba(0, 0, 0, .075);
  -webkit-transition: border-color ease-in-out .15s, -webkit-box-shadow ease-in-out .15s;
       -o-transition: border-color ease-in-out .15s, box-shadow ease-in-out .15s;
          transition: border-color ease-in-out .15s, box-shadow ease-in-out .15s;
}

.completer-dropdown {
    width: 480px !important;
}

ASP.NET Core MVC API

The PersonCityController MVC Controller implements the service which is used by the Angular 2 application. This service implements the Search action method which uses the IPersonCitySearchProvider to search for the data. Helper methods to create and add some documents to Elasticsearch are also implemented so that the search service can be tested.

using Microsoft.AspNetCore.Mvc;

namespace Angular2AutoCompleteAspNetCoreElasticsearch.Controllers
{
    [Route("api/[controller]")]
    public class PersonCityController : Controller
    {
        private readonly IPersonCitySearchProvider _personCitySearchProvider;

        public PersonCityController(IPersonCitySearchProvider personCitySearchProvider)
        {
            _personCitySearchProvider = personCitySearchProvider;
        }

        [HttpGet("search/{searchtext}")]
        public IActionResult Search(string searchtext)
        {
            return Ok(_personCitySearchProvider.QueryString(searchtext));
        }

        [HttpGet("createindex")]
        public IActionResult CreateIndex()
        {
            _personCitySearchProvider.CreateIndex();
            return Created("http://localhost:5000/api/PersonCity/createindex/", "index created");
        }

        [HttpGet("createtestdata")]
        public IActionResult CreateTestData()
        {
            _personCitySearchProvider.CreateTestData();
            return Created("http://localhost:5000/api/PersonCity/createtestdata/", "test data created");
        }

        [HttpGet("indexexists")]
        public IActionResult GetElasticsearchStatus()
        {
            return Ok(_personCitySearchProvider.GetStatus());
        }
    }
}

The ElasticsearchCRUD NuGet package is used to access Elasticsearch. The PersonCitySearchProvider implements this logic. NEST could also be used; only the PersonCitySearchProvider implementation would need to be changed to support this.

"ElasticsearchCRUD":  "2.3.3.1"

The PersonCitySearchProvider class implements the IPersonCitySearchProvider interface which is used in the MVC controller. The IPersonCitySearchProvider needs to be added to the services in the Startup class. The search uses a QueryStringQuery search with wildcards. Any other query or aggregation could be used here, depending on the search requirements.

using System.Collections.Generic;
using System.Linq;
using ElasticsearchCRUD;
using ElasticsearchCRUD.ContextAddDeleteUpdate.IndexModel.SettingsModel;
using ElasticsearchCRUD.Model.SearchModel;
using ElasticsearchCRUD.Model.SearchModel.Queries;
using ElasticsearchCRUD.Tracing;

namespace Angular2AutoCompleteAspNetCoreElasticsearch
{
    public class PersonCitySearchProvider : IPersonCitySearchProvider
    {
        private readonly IElasticsearchMappingResolver _elasticsearchMappingResolver = new ElasticsearchMappingResolver();
        private const string ConnectionString = "http://localhost:9200";
        private readonly ElasticsearchContext _context;

        public PersonCitySearchProvider()
        {
            _context = new ElasticsearchContext(ConnectionString, new ElasticsearchSerializerConfiguration(_elasticsearchMappingResolver))
            {
                TraceProvider = new ConsoleTraceProvider()
            };
        }

        public IEnumerable<PersonCity> QueryString(string term)
        {
            var results = _context.Search<PersonCity>(BuildQueryStringSearch(term));

            return results.PayloadResult.Hits.HitsResult.Select(t => t.Source);
        }

        /// <summary>
        /// TODO protect against injection!
        /// </summary>
        /// <param name="term"></param>
        /// <returns></returns>
        private Search BuildQueryStringSearch(string term)
        {
            var names = "";
            if (term != null)
            {
                names = term.Replace("+", " OR *");
            }

            var search = new Search
            {
                Query = new Query(new QueryStringQuery(names + "*"))
            };

            return search;
        }

        public bool GetStatus()
        {
            return _context.IndexExists<PersonCity>();
        }

        public void CreateIndex()
        {
            _context.IndexCreate<PersonCity>(new IndexDefinition());
        }

        public void CreateTestData()
        {
            var jm = new PersonCity { Id = 1, FamilyName = "Moore", Info = "Muenich", Name = "John" };
            _context.AddUpdateDocument(jm, jm.Id);
            var jj = new PersonCity { Id = 2, FamilyName = "Jones", Info = "Münich", Name = "Johny" };
            _context.AddUpdateDocument(jj, jj.Id);
            var pm = new PersonCity { Id = 3, FamilyName = "Murphy", Info = "Munich", Name = "Paul" };
            _context.AddUpdateDocument(pm, pm.Id);
            var sm = new PersonCity { Id = 4, FamilyName = "McGurk", Info = "munich", Name = "Séan" };
            _context.AddUpdateDocument(sm, sm.Id);
            var sob = new PersonCity { Id = 5, FamilyName = "O'Brien", Info = "Not a much use, bit of a problem", Name = "Sean" };
            _context.AddUpdateDocument(sob, sob.Id);
            var tmc = new PersonCity { Id = 6, FamilyName = "McCauley", Info = "Couldn't a ask for anyone better", Name = "Tadhg" };
            _context.AddUpdateDocument(tmc, tmc.Id);
            var id7 = new PersonCity { Id = 7, FamilyName = "Martini", Info = "Köniz", Name = "Christian" };
            _context.AddUpdateDocument(id7, id7.Id);
            var id8 = new PersonCity { Id = 8, FamilyName = "Lee", Info = "Basel Stadt", Name = "Phil" };
            _context.AddUpdateDocument(id8, id8.Id);
            var id9 = new PersonCity { Id = 9, FamilyName = "Wil", Info = "Basel Stadt", Name = "Nicole" };
            _context.AddUpdateDocument(id9, id9.Id);
            var id10 = new PersonCity { Id = 10, FamilyName = "Mario", Info = "Basel in some small town", Name = "Tim" };
            _context.AddUpdateDocument(id10, id10.Id);
            var id11 = new PersonCity { Id = 11, FamilyName = "Martin", Info = "Biel", Name = "Scott" };
            _context.AddUpdateDocument(id11, id11.Id);
            var id12 = new PersonCity { Id = 12, FamilyName = "Newman", Info = "Lyss", Name = "Tim" };
            _context.AddUpdateDocument(id12, id12.Id);
            var id13 = new PersonCity { Id = 13, FamilyName = "Lamb", Info = "Thun", Name = "Carla" };
            _context.AddUpdateDocument(id13, id13.Id);
            var id14 = new PersonCity { Id = 14, FamilyName = "Goldi", Info = "Zug", Name = "Ida" };
            _context.AddUpdateDocument(id14, id14.Id);
            _context.SaveChanges();
        }
    }
}

When the application is started, the autocomplete is deactivated as no index exists.

Once the index exists, data can be added to the Elasticsearch index.

And the autocomplete can be used.

Links:

https://github.com/oferh/ng2-completer

https://github.com/damienbod/Angular2WebpackVisualStudio

https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-query-string-query.html

https://www.elastic.co/products/elasticsearch

https://www.nuget.org/packages/ElasticsearchCRUD/

https://github.com/damienbod/ElasticsearchCRUD


Angular2 search with ASP.NET Core and Elasticsearch


This article shows how a website search could be implemented using Angular 2, ASP.NET Core and Elasticsearch. Most users expect autocomplete and a flexible search like the well-known search websites provide. When the user enters a character in the search input field, an autocomplete using a shingle token filter with a terms aggregation is used to suggest possible search terms. When a term is selected, a match query request is sent which uses an edge ngram indexed field to search for hits or matches. Server side paging is then implemented to iterate through the results.

Code: https://github.com/damienbod/Angular2AutoCompleteAspNetCoreElasticsearch

ASP.NET Core server side search

The Elasticsearch index and queries were built using ideas from two excellent blogs, bilyachat and qbox.io. ElasticsearchCRUD is used as the .NET Core client for Elasticsearch. To set up the index, a mapping needs to be defined, as well as the index itself with the required analysis settings: filters, analyzers and tokenizers. See the Elasticsearch documentation for detailed information.

In this example, two custom analyzers are defined, one for the autocomplete and one for the search. The autocomplete analyzer uses a custom shingle token filter called autocompletefilter, a stopwords token filter, a lowercase token filter and a stemmer token filter. The edge_ngram_search analyzer uses an edge ngram token filter and a lowercase filter.

private IndexDefinition CreateNewIndexDefinition()
{
	return new IndexDefinition
	{
		IndexSettings =
		{
			Analysis = new Analysis
			{
				Filters =
				{
					CustomFilters = new List<AnalysisFilterBase>
					{
						new StemmerTokenFilter("stemmer"),
						new ShingleTokenFilter("autocompletefilter")
						{
							MaxShingleSize = 5,
							MinShingleSize = 2
						},
						new StopTokenFilter("stopwords"),
						new EdgeNGramTokenFilter("edge_ngram_filter")
						{
							MaxGram = 20,
							MinGram = 2
						}
					}
				},
				Analyzer =
				{
					Analyzers = new List<AnalyzerBase>
					{
						new CustomAnalyzer("edge_ngram_search")
						{
							Tokenizer = DefaultTokenizers.Standard,
							Filter = new List<string> {DefaultTokenFilters.Lowercase, "edge_ngram_filter"},
							CharFilter = new List<string> {DefaultCharFilters.HtmlStrip}
						},
						new CustomAnalyzer("autocomplete")
						{
							Tokenizer = DefaultTokenizers.Standard,
							Filter = new List<string> {DefaultTokenFilters.Lowercase, "autocompletefilter", "stopwords", "stemmer"},
							CharFilter = new List<string> {DefaultCharFilters.HtmlStrip}
						},
						new CustomAnalyzer("default")
						{
							Tokenizer = DefaultTokenizers.Standard,
							Filter = new List<string> {DefaultTokenFilters.Lowercase, "stopwords", "stemmer"},
							CharFilter = new List<string> {DefaultCharFilters.HtmlStrip}
						}


					}
				}
			}
		},
	};
}

The PersonCity class is used to add and search for documents in Elasticsearch. The default index and type for this class using ElasticsearchCRUD are personcitys and personcity.

public class PersonCity
{
	public long Id { get; set; }
	public string Name { get; set; }
	public string FamilyName { get; set; }
	public string Info { get; set; }
	public string CityCountry { get; set; }
	public string Metadata { get; set; }
	public string Web { get; set; }
	public string Github { get; set; }
	public string Twitter { get; set; }
	public string Mvp { get; set; }
}

A PersonCityMapping class is defined so that the required mapping from the PersonCityMappingDto class can be applied to the personcitys index and the personcity type. This class overrides ElasticsearchMapping to define the index and type.

using System;
using ElasticsearchCRUD;

namespace SearchComponent
{
    public class PersonCityMapping : ElasticsearchMapping
    {
        public override string GetIndexForType(Type type)
        {
            return "personcitys";
        }

        public override string GetDocumentType(Type type)
        {
            return "personcity";
        }
    }
}

The PersonCityMapping class is then registered for the PersonCityMappingDto type, so that this DTO maps to the same default index and type as the PersonCity class.

public PersonCitySearchProvider()
{
	_elasticsearchMappingResolver.AddElasticSearchMappingForEntityType(typeof(PersonCityMappingDto), new PersonCityMapping());
	_context = new ElasticsearchContext(ConnectionString, new ElasticsearchSerializerConfiguration(_elasticsearchMappingResolver))
	{
		TraceProvider = new ConsoleTraceProvider()
	};
}

A specific mapping DTO class is used to define the mapping in Elasticsearch. This class is required if a non-default mapping is needed in Elasticsearch. The class uses the ElasticsearchString attribute to define a copy mapping: when a new document is added, each annotated field in Elasticsearch is copied to the autocomplete and the searchfield fields. The searchfield and autocomplete fields use the two analyzers which were defined in the index. This class is only used to define the type mapping in Elasticsearch.

using ElasticsearchCRUD.ContextAddDeleteUpdate.CoreTypeAttributes;

namespace SearchComponent
{
    public class PersonCityMappingDto
    {
        public long Id { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string Name { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string FamilyName { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string Info { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string CityCountry { get; set; }

        [ElasticsearchString(CopyToList = new[] { "autocomplete", "searchfield" })]
        public string Metadata { get; set; }

        public string Web { get; set; }

        public string Github { get; set; }

        public string Twitter { get; set; }

        public string Mvp { get; set; }

        [ElasticsearchString(Analyzer = "edge_ngram_search", SearchAnalyzer = "standard", TermVector = TermVector.yes)]
        public string searchfield { get; set; }

        [ElasticsearchString(Analyzer = "autocomplete")]
        public string autocomplete { get; set; }
    }
}

The IndexCreate method creates a new index and mapping in Elasticsearch.

public void CreateIndex()
{
	_context.IndexCreate<PersonCityMappingDto>(CreateNewIndexDefinition());
}

The Elasticsearch settings can be viewed using the HTTP GET:

http://localhost:9200/_settings

{
	"personcitys": {
		"settings": {
			"index": {
				"creation_date": "1477642409728",
				"analysis": {
					"filter": {
						"stemmer": {
							"type": "stemmer"
						},
						"autocompletefilter": {
							"max_shingle_size": "5",
							"min_shingle_size": "2",
							"type": "shingle"
						},
						"stopwords": {
							"type": "stop"
						},
						"edge_ngram_filter": {
							"type": "edgeNGram",
							"min_gram": "2",
							"max_gram": "20"
						}
					},
					"analyzer": {
						"edge_ngram_search": {
							"filter": ["lowercase",
							"edge_ngram_filter"],
							"char_filter": ["html_strip"],
							"type": "custom",
							"tokenizer": "standard"
						},
						"autocomplete": {
							"filter": ["lowercase",
							"autocompletefilter",
							"stopwords",
							"stemmer"],
							"char_filter": ["html_strip"],
							"type": "custom",
							"tokenizer": "standard"
						},
						"default": {
							"filter": ["lowercase",
							"stopwords",
							"stemmer"],
							"char_filter": ["html_strip"],
							"type": "custom",
							"tokenizer": "standard"
						}
					}
				},
				"number_of_shards": "5",
				"number_of_replicas": "1",
				"uuid": "TxS9hdy7SmGPr4FSSNaPiQ",
				"version": {
					"created": "2040199"
				}
			}
		}
	}
}

The Elasticsearch mapping can be viewed using the HTTP GET:

http://localhost:9200/_mapping

{
	"personcitys": {
		"mappings": {
			"personcity": {
				"properties": {
					"autocomplete": {
						"type": "string",
						"analyzer": "autocomplete"
					},
					"citycountry": {
						"type": "string",
						"copy_to": ["autocomplete",
						"searchfield"]
					},
					"familyname": {
						"type": "string",
						"copy_to": ["autocomplete",
						"searchfield"]
					},
					"github": {
						"type": "string"
					},
					"id": {
						"type": "long"
					},
					"info": {
						"type": "string",
						"copy_to": ["autocomplete",
						"searchfield"]
					},
					"metadata": {
						"type": "string",
						"copy_to": ["autocomplete",
						"searchfield"]
					},
					"mvp": {
						"type": "string"
					},
					"name": {
						"type": "string",
						"copy_to": ["autocomplete",
						"searchfield"]
					},
					"searchfield": {
						"type": "string",
						"term_vector": "yes",
						"analyzer": "edge_ngram_search",
						"search_analyzer": "standard"
					},
					"twitter": {
						"type": "string"
					},
					"web": {
						"type": "string"
					}
				}
			}
		}
	}
}

Now documents can be added using the PersonCity class which has no Elasticsearch definitions.
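
A minimal sketch of how this could look, using the ElasticsearchContext from the PersonCitySearchProvider above and assuming ElasticsearchCRUD's AddUpdateDocument and SaveChanges methods (the AddDocument method name is illustrative):

public void AddDocument(PersonCity personCity)
{
	// AddUpdateDocument queues the document for the personcitys/personcity index,
	// SaveChanges then sends the request to Elasticsearch
	_context.AddUpdateDocument(personCity, personCity.Id);
	_context.SaveChanges();
}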

Autocomplete search

A terms aggregation search is used for the autocomplete request. The terms aggregation uses the autocomplete field which only exists in Elasticsearch. A list of strings is returned to the user from this request.

public IEnumerable<string> AutocompleteSearch(string term)
{
	var search = new Search
	{
		Size = 0,
		Aggs = new List<IAggs>
		{
			new TermsBucketAggregation("autocomplete", "autocomplete")
			{
				Order= new OrderAgg("_count", OrderEnum.desc),
				Include = new IncludeExpression(term + ".*")
			}
		}
	};

	var items = _context.Search<PersonCity>(search);
	var aggResult = items.PayloadResult.Aggregations.GetComplexValue<TermsBucketAggregationsResult>("autocomplete");
	IEnumerable<string> results = aggResult.Buckets.Select(t =>  t.Key.ToString());
	return results;
}

The request is sent to Elasticsearch as follows:

POST http://localhost:9200/personcitys/personcity/_search HTTP/1.1
Content-Type: application/json
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
Content-Length: 124
Host: localhost:9200

{
	"size": 0,
	"aggs": {
		"autocomplete": {
			"terms": {
				"field": "autocomplete",
				"order": {
					"_count": "desc"
				},
				"include": {
					"pattern": "as.*"
				}
			}
		}
	}
}

Search Query

When an autocomplete string is selected, a search request is sent to Elasticsearch using a Match Query on the searchfield field, which returns 10 hits starting from document 0. When a paging request is sent, the from value is a multiple of 10, depending on the page.

public PersonCitySearchResult Search(string term, int from)
{
	var personCitySearchResult = new PersonCitySearchResult();
	var search = new Search
	{
		Size = 10,
		From = from,
		Query = new Query(new MatchQuery("searchfield", term))
	};

	var results = _context.Search<PersonCity>(search);

	personCitySearchResult.PersonCities = results.PayloadResult.Hits.HitsResult.Select(t => t.Source);
	personCitySearchResult.Hits = results.PayloadResult.Hits.Total;
	personCitySearchResult.Took = results.PayloadResult.Took;
	return personCitySearchResult;
}

The search query is sent as follows:

POST http://localhost:9200/personcitys/personcity/_search HTTP/1.1
Content-Type: application/json
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
Content-Length: 74
Host: localhost:9200

{
	"from": 0,
	"size": 10,
	"query": {
		"match": {
			"searchfield": {
				"query": "asp.net"
			}
		}
	}
}

Angular 2 client side search

The Angular 2 client uses an autocomplete input control and then uses an ngFor to display all the search results. Bootstrap paging is used if more than 10 results are found for the search term.

<div class="panel-group">

    <personcitysearch
      *ngIf="IndexExists"
      (onTermSelectedEvent)="onTermSelectedEvent($event)"
      [disableAutocomplete]="!IndexExists">
    </personcitysearch>

    <em *ngIf="PersonCitySearchData.took > 0" style="font-size:smaller; color:lightgray;">
      <span>Hits: {{PersonCitySearchData.hits}}</span>
    </em><br />
    <br />

    <div  *ngFor="let personCity of PersonCitySearchData.personCities">
        <b><span>{{personCity.name}} {{personCity.familyName}} </span></b>
        <a *ngIf="personCity.twitter"  href="{{personCity.twitter}}">
          <img src="assets/socialTwitter.png" />
        </a>
        <a *ngIf="personCity.github" href="{{personCity.github}}">
          <img src="assets/github.png" />
        </a>
        <a *ngIf="personCity.mvp" href="{{personCity.mvp}}">
          <img src="assets/mvp.png" width="24" />
        </a><br />

        <em style="font-size:large"><a href="{{personCity.web}}">{{personCity.web}}</a></em><br />
        <em><span>{{personCity.metadata}}</span></em><br />
        <span>{{personCity.info}}</span><br />
        <br />
        <br />

    </div>

    <ul class="pagination" *ngIf="ShowPaging">
        <li><a (click)="PreviousPage()" >&laquo;</a></li>

        <li><a *ngFor="let page of Pages" (click)="LoadDataForPage(page)">{{page}}</a></li>

        <li><a (click)="NextPage()">&raquo;</a></li>
    </ul>
</div>

The personcitysearch Angular 2 component implements the autocomplete functionality using the ng2-completer component. When a character is entered into the input, an HTTP request is sent to the server, which in turn sends a request to the Elasticsearch server.

import { Component, Inject, EventEmitter, Input, Output, OnInit, AfterViewInit, ElementRef } from '@angular/core';
import { Http, Response } from "@angular/http";

import { Subscription } from 'rxjs/Subscription';
import { Observable } from 'rxjs/Observable';
import { Router } from  '@angular/router';

import { Configuration } from '../app.constants';
import { PersoncityautocompleteDataService } from './personcityautocompleteService';
import { PersonCity } from '../model/personCity';

import { CompleterService, CompleterItem } from 'ng2-completer';

import './personcityautocomplete.component.scss';

@Component({
    selector: 'personcityautocomplete',
  template: `
<ng2-completer [dataService]="dataService" (selected)="onPersonCitySelected($event)" [minSearchLength]="0" [disableInput]="disableAutocomplete"></ng2-completer>

`
})

export class PersoncityautocompleteComponent implements OnInit    {

    constructor(private completerService: CompleterService, private http: Http, private _configuration: Configuration) {

        this.dataService = new PersoncityautocompleteDataService(http, _configuration); ////completerService.local("name, info, familyName", 'name');
    }

    @Output() bindModelPersonCityChange = new EventEmitter<PersonCity>();
    @Input() bindModelPersonCity: PersonCity;
    @Input() disableAutocomplete: boolean = false;

    private searchStr: string;
    private dataService: PersoncityautocompleteDataService;

    ngOnInit() {
        console.log("ngOnInit PersoncityautocompleteComponent");
    }

    public onPersonCitySelected(selected: CompleterItem) {
        console.log(selected);
        this.bindModelPersonCityChange.emit(selected.originalObject);
    }
}

And the data service for the CompleterService which is used by the ng2-completer component:

import { Http, Response } from "@angular/http";
import { Subject } from "rxjs/Subject";

import { CompleterData, CompleterItem } from 'ng2-completer';
import { Configuration } from '../app.constants';

export class PersoncityautocompleteDataService extends Subject<CompleterItem[]> implements CompleterData {
    constructor(private http: Http, private _configuration: Configuration) {
        super();

        this.actionUrl = _configuration.Server + 'api/personcity/querystringsearch/';
    }

    private actionUrl: string;

    public search(term: string): void {
        this.http.get(this.actionUrl + term)
            .map((res: Response) => {
                // Convert the result to CompleterItem[]
                let data = res.json();
                let matches: CompleterItem[] = data.map((personcity: any) => {
                    return {
                        title: personcity.name,
                        description: personcity.familyName + ", " + personcity.cityCountry,
                        originalObject: personcity
                    }
                });
                this.next(matches);
            })
            .subscribe();
    }

    public cancel() {
        // Handle cancel
    }
}
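
On the server side, the api/personcity/querystringsearch/{term} route used above maps to an ASP.NET Core MVC action. The controller is not shown in this post; the following is a rough sketch, where IPersonCitySearchProvider and its QueryString method are assumed names, not taken from the source:

using Microsoft.AspNetCore.Mvc;

namespace SearchComponent
{
    [Route("api/[controller]")]
    public class PersonCityController : Controller
    {
        private readonly IPersonCitySearchProvider _personCitySearchProvider;

        public PersonCityController(IPersonCitySearchProvider personCitySearchProvider)
        {
            _personCitySearchProvider = personCitySearchProvider;
        }

        // GET api/personcity/querystringsearch/{term}
        [HttpGet("querystringsearch/{term}")]
        public IActionResult QueryStringSearch(string term)
        {
            // Returns the PersonCity documents matching the term
            return Ok(_personCitySearchProvider.QueryString(term));
        }
    }
}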

The HomeSearchComponent implements the paging for the search results and also displays the data. The SearchDataService implements the API calls to the ASP.NET Core MVC API service. The paging CSS uses Bootstrap to display the data.

import { Observable } from 'rxjs/Observable';
import { Component, OnInit } from '@angular/core';
import { Http } from '@angular/http';

import { SearchDataService } from '../services/searchDataService';
import { PersonCity } from '../model/personCity';
import { PersonCitySearchResult } from '../model/personCitySearchResult';
import { PersoncitysearchComponent } from '../personcitysearch/personcitysearch.component';

@Component({
    selector: 'homesearchcomponent',
    templateUrl: 'homesearch.component.html',
    providers: [SearchDataService]
})

export class HomeSearchComponent implements OnInit {

    public message: string;
    public PersonCitySearchData: PersonCitySearchResult;
    public SelectedTerm: string;
    public IndexExists: boolean = false;

    constructor(private _dataService: SearchDataService, private _personcitysearchComponent: PersoncitysearchComponent) {
        this.message = "Hello from HomeSearchComponent constructor";
        this.SelectedTerm = "none";
        this.PersonCitySearchData = new PersonCitySearchResult();
    }

    public onTermSelectedEvent(term: string) {
        this.SelectedTerm = term;
        this.findDataForSearchTerm(term, 0)
    }

    private findDataForSearchTerm(term: string, from: number) {
        console.log("findDataForSearchTerm:" + term);
        this._dataService.FindAllForTerm(term, from)
            .subscribe((data) => {
                console.log(data)
                this.PersonCitySearchData = data;
                this.configurePagingDisplay(this.PersonCitySearchData.hits);
            },
            error => console.log(error),
            () => {
                console.log('PersonCitySearch:findDataForSearchTerm completed');
            }
            );
    }

    ngOnInit() {
        this._dataService
            .IndexExists()
            .subscribe(data => this.IndexExists = data,
            error => console.log(error),
            () => console.log('Get IndexExists complete'));
    }

    public ShowPaging: boolean = false;
    public CurrentPage: number = 0;
    public TotalHits: number = 0;
    public PagesCount: number = 0;
    public Pages: number[] = [];

    public LoadDataForPage(page: number) {
        var from = page * 10;
        this.findDataForSearchTerm(this.SelectedTerm, from)
        this.CurrentPage = page;
    }

    public NextPage() {
        var page = this.CurrentPage;
        console.log("TotalHits" + this.TotalHits + "NextPage: " + ((this.CurrentPage + 1) * 10) + "CurrentPage" + this.CurrentPage );

        if (this.TotalHits > ((this.CurrentPage + 1) * 10)) {
            page = this.CurrentPage + 1;
        }

        this.LoadDataForPage(page);
    }

    public PreviousPage() {
        var page = this.CurrentPage;

        if (this.CurrentPage > 0) {
            page = this.CurrentPage - 1;
        }

        this.LoadDataForPage(page);
    }

    private configurePagingDisplay(hits: number) {
        this.PagesCount = Math.floor(hits / 10);

        this.Pages = [];
        for (let i = 0; i <= this.PagesCount; i++) {
            this.Pages.push((i));
        }

        this.TotalHits = hits;

        if (this.PagesCount <= 1) {
            this.ShowPaging = false;
        } else {
            this.ShowPaging = true;
        }
    }
}

Now when characters are entered into the search input, matching records are searched for and returned together with the number of hits for the term.

[Image: searchaspnetcoreangular2_01]

The paging can also be used to do server-side paging.

[Image: searchaspnetcoreangular2_02]

The search functions like the web searches we have come to expect. If different results or searches are required, the server-side index creation and query types can be changed as needed. For example, the autocomplete suggestions could be replaced with a fuzzy search or a query string search.

Links:

https://github.com/oferh/ng2-completer

https://github.com/damienbod/Angular2WebpackVisualStudio

https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-query-string-query.html

https://www.elastic.co/products/elasticsearch

https://www.nuget.org/packages/ElasticsearchCRUD/

https://github.com/damienbod/ElasticsearchCRUD

http://www.bilyachat.com/2015/07/search-like-google-with-elasticsearch.html

http://stackoverflow.com/questions/29753971/elasticsearch-completion-suggest-search-with-multiple-word-inputs

http://rea.tech/implementing-autosuggest-in-elasticsearch/

https://qbox.io/blog/an-introduction-to-ngrams-in-elasticsearch

https://www.elastic.co/guide/en/elasticsearch/guide/current/_ngrams_for_partial_matching.html


Extending Identity in IdentityServer4 to manage users in ASP.NET Core


This article shows how Identity can be extended and used together with IdentityServer4 to implement application specific requirements. The application allows users to register, and a registered user can access the application for 7 days. After this, the user cannot log in. Any admin can activate or deactivate a user using a custom user management API. Extra properties are added to the Identity user model to support this. Identity is persisted using EFCore and SQLite. The SPA application is implemented using Angular 2, Webpack 2 and TypeScript 2.

Code: github

Other posts in this series:

Updating Identity

Updating Identity is pretty easy. The Microsoft.AspNetCore.Identity.EntityFrameworkCore package, which is included in the project as a NuGet package, provides the IdentityUser class, which the ApplicationUser extends. You can add any extra required properties to this class.

using System;
using Microsoft.AspNetCore.Identity.EntityFrameworkCore;

namespace IdentityServerWithAspNetIdentity.Models
{
    public class ApplicationUser : IdentityUser
    {
        public bool IsAdmin { get; set; }
        public string DataEventRecordsRole { get; set; }
        public string SecuredFilesRole { get; set; }
        public DateTime AccountExpires { get; set; }
    }
}

Identity needs to be added to the application. This is done in the startup class in the ConfigureServices method using the AddIdentity extension. SQLite is used to persist the data. The ApplicationDbContext which uses SQLite is then used as the store for Identity.

services.AddDbContext<ApplicationDbContext>(options =>
	options.UseSqlite(Configuration.GetConnectionString("DefaultConnection")));

services.AddIdentity<ApplicationUser, IdentityRole>()
	.AddEntityFrameworkStores<ApplicationDbContext>()
	.AddDefaultTokenProviders();

The connection string for the SQLite database is read from the appsettings. The configuration is read using the ConfigurationBuilder in the Startup constructor.

"ConnectionStrings": {
        "DefaultConnection": "Data Source=C:\\git\\damienbod\\AspNet5IdentityServerAngularImplicitFlow\\src\\ResourceWithIdentityServerWithClient\\usersdatabase.sqlite"
    },
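
The Startup constructor which builds this configuration is the standard ASP.NET Core 1.x template code; a minimal sketch:

public Startup(IHostingEnvironment env)
{
    var builder = new ConfigurationBuilder()
        .SetBasePath(env.ContentRootPath)
        .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
        .AddEnvironmentVariables();

    Configuration = builder.Build();
}

public IConfigurationRoot Configuration { get; }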

The Identity store is then created using the EFCore migrations.

dotnet ef migrations add testMigration

dotnet ef database update

The new properties in the Identity user are used in three ways: when creating a new user, when creating a token for a user, and when validating the token on a resource using policies.

Using Identity when creating a new user

The Identity ApplicationUser is created in the Register method in the AccountController. The new extended properties which were added to the ApplicationUser can be used as required. In this example, a new user has access for 7 days. If users should be able to set the custom properties themselves, the RegisterViewModel model and the corresponding view need to be extended.

[HttpPost]
[AllowAnonymous]
[ValidateAntiForgeryToken]
public async Task<IActionResult> Register(RegisterViewModel model, string returnUrl = null)
{
	ViewData["ReturnUrl"] = returnUrl;
	if (ModelState.IsValid)
	{
		var dataEventsRole = "dataEventRecords.user";
		var securedFilesRole = "securedFiles.user";
		if (model.IsAdmin)
		{
			dataEventsRole = "dataEventRecords.admin";
			securedFilesRole = "securedFiles.admin";
		}

		var user = new ApplicationUser {
			UserName = model.Email,
			Email = model.Email,
			IsAdmin = model.IsAdmin,
			DataEventRecordsRole = dataEventsRole,
			SecuredFilesRole = securedFilesRole,
			AccountExpires = DateTime.UtcNow.AddDays(7.0)
		};

		var result = await _userManager.CreateAsync(user, model.Password);
		if (result.Succeeded)
		{
			await _signInManager.SignInAsync(user, isPersistent: false);
			_logger.LogInformation(3, "User created a new account with password.");
			return RedirectToLocal(returnUrl);
		}
		AddErrors(result);
	}

	return View(model);
}

Using Identity when creating a token in IdentityServer4

The Identity properties need to be added to the claims so that the client SPA, or whatever client it is, can use them. In IdentityServer4, the IProfileService interface is used for this. Each custom ApplicationUser property is added as a claim as required.

using System;
using System.Linq;
using System.Security.Claims;
using System.Threading.Tasks;
using IdentityModel;
using IdentityServer4.Extensions;
using IdentityServer4.Models;
using IdentityServer4.Services;
using IdentityServerWithAspNetIdentity.Models;
using Microsoft.AspNetCore.Identity;

namespace ResourceWithIdentityServerWithClient
{
    public class IdentityWithAdditionalClaimsProfileService : IProfileService
    {
        private readonly IUserClaimsPrincipalFactory<ApplicationUser> _claimsFactory;
        private readonly UserManager<ApplicationUser> _userManager;

        public IdentityWithAdditionalClaimsProfileService(UserManager<ApplicationUser> userManager,  IUserClaimsPrincipalFactory<ApplicationUser> claimsFactory)
        {
            _userManager = userManager;
            _claimsFactory = claimsFactory;
        }

        public async Task GetProfileDataAsync(ProfileDataRequestContext context)
        {
            var sub = context.Subject.GetSubjectId();

            var user = await _userManager.FindByIdAsync(sub);
            var principal = await _claimsFactory.CreateAsync(user);

            var claims = principal.Claims.ToList();
            if (!context.AllClaimsRequested)
            {
                claims = claims.Where(claim => context.RequestedClaimTypes.Contains(claim.Type)).ToList();
            }

            claims.Add(new Claim(JwtClaimTypes.GivenName, user.UserName));

            if (user.IsAdmin)
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "admin"));
            }
            else
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "user"));
            }

            if (user.DataEventRecordsRole == "dataEventRecords.admin")
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "dataEventRecords.admin"));
                claims.Add(new Claim(JwtClaimTypes.Role, "dataEventRecords.user"));
                claims.Add(new Claim(JwtClaimTypes.Role, "dataEventRecords"));
            }
            else
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "dataEventRecords.user"));
                claims.Add(new Claim(JwtClaimTypes.Role, "dataEventRecords"));
            }

            if (user.SecuredFilesRole == "securedFiles.admin")
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "securedFiles.admin"));
                claims.Add(new Claim(JwtClaimTypes.Role, "securedFiles.user"));
                claims.Add(new Claim(JwtClaimTypes.Role, "securedFiles"));
            }
            else
            {
                claims.Add(new Claim(JwtClaimTypes.Role, "securedFiles.user"));
                claims.Add(new Claim(JwtClaimTypes.Role, "securedFiles"));
            }

            claims.Add(new System.Security.Claims.Claim(StandardScopes.Email.Name, user.Email));


            context.IssuedClaims = claims;
        }

        public async Task IsActiveAsync(IsActiveContext context)
        {
            var sub = context.Subject.GetSubjectId();
            var user = await _userManager.FindByIdAsync(sub);
            context.IsActive = user != null && user.AccountExpires > DateTime.UtcNow;
        }
    }
}
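
The profile service also has to be registered with IdentityServer4. A minimal sketch of this registration in ConfigureServices, assuming the IdentityServer4.AspNetIdentity package is used; the other builder calls (signing credential, clients, scopes) are omitted here:

services.AddIdentityServer()
    .AddAspNetIdentity<ApplicationUser>()
    .AddProfileService<IdentityWithAdditionalClaimsProfileService>();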

Using the Identity properties when validating a token

The IsAdmin property is used to define whether a logged-on user has the admin role. This was added to the token as the admin claim in the IProfileService. This can now be used by defining a policy and validating the policy in a controller. The policies are added in the Startup class in the ConfigureServices method.

services.AddAuthorization(options =>
{
	options.AddPolicy("dataEventRecordsAdmin", policyAdmin =>
	{
		policyAdmin.RequireClaim("role", "dataEventRecords.admin");
	});
	options.AddPolicy("admin", policyAdmin =>
	{
		policyAdmin.RequireClaim("role", "admin");
	});
	options.AddPolicy("dataEventRecordsUser", policyUser =>
	{
		policyUser.RequireClaim("role", "dataEventRecords.user");
	});
});

The policy can then be used for example in a MVC Controller using the Authorize attribute. The admin policy is used in the UserManagementController.

[Authorize("admin")]
[Produces("application/json")]
[Route("api/UserManagement")]
public class UserManagementController : Controller
{

Now that users can be admins and accounts expire after 7 days, the application requires a UI to manage this. This UI is implemented in the Angular 2 SPA. The UI requires a user management API to get all the users and also to update them. The Identity EFCore ApplicationDbContext is used directly in the controller to simplify things, but usually this would be separated from the controller, and if you have a lot of users, some kind of search logic with a filtered result list would need to be supported. I like to have no logic in the MVC controller.

using System;
using System.Collections.Generic;
using System.Linq;
using IdentityServerWithAspNetIdentity.Data;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using ResourceWithIdentityServerWithClient.Model;

namespace ResourceWithIdentityServerWithClient.Controllers
{
    [Authorize("admin")]
    [Produces("application/json")]
    [Route("api/UserManagement")]
    public class UserManagementController : Controller
    {
        private readonly ApplicationDbContext _context;

        public UserManagementController(ApplicationDbContext context)
        {
            _context = context;
        }

        [HttpGet]
        public IActionResult Get()
        {
            var users = _context.Users.ToList();
            var result = new List<UserDto>();

            foreach(var applicationUser in users)
            {
                var user = new UserDto
                {
                    Id = applicationUser.Id,
                    Name = applicationUser.Email,
                    IsAdmin = applicationUser.IsAdmin,
                    IsActive = applicationUser.AccountExpires > DateTime.UtcNow
                };

                result.Add(user);
            }

            return Ok(result);
        }

        [HttpPut("{id}")]
        public void Put(string id, [FromBody]UserDto userDto)
        {
            var user = _context.Users.First(t => t.Id == id);

            user.IsAdmin = userDto.IsAdmin;
            if(userDto.IsActive)
            {
                if(user.AccountExpires < DateTime.UtcNow)
                {
                    user.AccountExpires = DateTime.UtcNow.AddDays(7.0);
                }
            }
            else
            {
                // deactivate user
                user.AccountExpires = new DateTime();
            }

            _context.Users.Update(user);
            _context.SaveChanges();
        }
    }
}

Angular 2 User Management Component

The Angular 2 SPA is built using Webpack 2 with TypeScript. See https://github.com/damienbod/Angular2WebpackVisualStudio on how to set up an Angular 2, Webpack 2 app with ASP.NET Core.

The Angular 2 app requires a service to access the ASP.NET Core MVC API. This is implemented in the UserManagementService, which then needs to be added to the app.module.

import { Injectable } from '@angular/core';
import { Http, Response, Headers, RequestOptions } from '@angular/http';
import 'rxjs/add/operator/map';
import { Observable } from 'rxjs/Observable';
import { Configuration } from '../app.constants';
import { SecurityService } from '../services/SecurityService';
import { User } from './models/User';

@Injectable()
export class UserManagementService {

    private actionUrl: string;
    private headers: Headers;

    constructor(private _http: Http, private _configuration: Configuration, private _securityService: SecurityService) {
        this.actionUrl = `${_configuration.Server}/api/UserManagement/`;
    }

    private setHeaders() {

        console.log("setHeaders started");

        this.headers = new Headers();
        this.headers.append('Content-Type', 'application/json');
        this.headers.append('Accept', 'application/json');

        var token = this._securityService.GetToken();
        if (token !== "") {
            let tokenValue = 'Bearer ' + token;
            console.log("tokenValue:" + tokenValue);
            this.headers.append('Authorization', tokenValue);
        }
    }

    public GetAll = (): Observable<User[]> => {
        this.setHeaders();
        let options = new RequestOptions({ headers: this.headers, body: '' });

        return this._http.get(this.actionUrl, options).map(res => res.json());
    }

    public Update = (id: string, itemToUpdate: User): Observable<Response> => {
        this.setHeaders();
        return this._http
            .put(this.actionUrl + id, JSON.stringify(itemToUpdate), { headers: this.headers });
    }
}

The UserManagementComponent uses the service to display all the users and provides a way of updating each user.

import { Component, OnInit } from '@angular/core';
import { SecurityService } from '../services/SecurityService';
import { Observable } from 'rxjs/Observable';
import { Router } from '@angular/router';

import { UserManagementService } from '../user-management/UserManagementService';
import { User } from './models/User';

@Component({
    selector: 'user-management',
    templateUrl: 'user-management.component.html'
})

export class UserManagementComponent implements OnInit {

    public message: string;
    public Users: User[];

    constructor(
        private _userManagementService: UserManagementService,
        public securityService: SecurityService,
        private _router: Router) {
        this.message = "user-management";
    }

    ngOnInit() {
        this.getData();
    }

    private getData() {
        console.log('User Management:getData starting...');
        this._userManagementService
            .GetAll()
            .subscribe(data => this.Users = data,
            error => this.securityService.HandleError(error),
            () => console.log('User Management Get all completed'));
    }

    public Update(user: User) {
        this._userManagementService.Update(user.id, user)
            .subscribe((() => console.log("subscribed")),
            error => this.securityService.HandleError(error),
            () => console.log("update request sent!"));
    }

}

The UserManagementComponent template uses the Users data to display and update the users.

<div class="col-md-12" *ngIf="securityService.IsAuthorized">
    <div class="panel panel-default">
        <div class="panel-heading">
            <h3 class="panel-title">{{message}}</h3>
        </div>
        <div class="panel-body"  *ngIf="Users">
            <table class="table">
                <thead>
                    <tr>
                        <th>Name</th>
                        <th>IsAdmin</th>
                        <th>IsActive</th>
                        <th></th>
                    </tr>
                </thead>
                <tbody>
                    <tr style="height:20px;" *ngFor="let user of Users">
                        <td>{{user.name}}</td>
                        <td>
                            <input type="checkbox" [(ngModel)]="user.isAdmin" class="form-control" style="box-shadow:none" />
                        </td>
                        <td>
                            <input type="checkbox" [(ngModel)]="user.isActive" class="form-control" style="box-shadow:none" />
                        </td>
                        <td>
                            <button (click)="Update(user)" class="form-control">Update</button>
                        </td>
                    </tr>
                </tbody>
            </table>

        </div>
    </div>
</div>

The user-management component and the service need to be added to the module.

import { NgModule } from '@angular/core';
import { FormsModule } from '@angular/forms';
import { BrowserModule } from '@angular/platform-browser';

import { AppComponent } from './app.component';
import { Configuration } from './app.constants';
import { routing } from './app.routes';
import { HttpModule, JsonpModule } from '@angular/http';

import { SecurityService } from './services/SecurityService';
import { DataEventRecordsService } from './dataeventrecords/DataEventRecordsService';
import { DataEventRecord } from './dataeventrecords/models/DataEventRecord';

import { ForbiddenComponent } from './forbidden/forbidden.component';
import { HomeComponent } from './home/home.component';
import { UnauthorizedComponent } from './unauthorized/unauthorized.component';

import { DataEventRecordsListComponent } from './dataeventrecords/dataeventrecords-list.component';
import { DataEventRecordsCreateComponent } from './dataeventrecords/dataeventrecords-create.component';
import { DataEventRecordsEditComponent } from './dataeventrecords/dataeventrecords-edit.component';

import { UserManagementComponent } from './user-management/user-management.component';


import { HasAdminRoleAuthenticationGuard } from './guards/hasAdminRoleAuthenticationGuard';
import { HasAdminRoleCanLoadGuard } from './guards/hasAdminRoleCanLoadGuard';
import { UserManagementService } from './user-management/UserManagementService';

@NgModule({
    imports: [
        BrowserModule,
        FormsModule,
        routing,
        HttpModule,
        JsonpModule
    ],
    declarations: [
        AppComponent,
        ForbiddenComponent,
        HomeComponent,
        UnauthorizedComponent,
        DataEventRecordsListComponent,
        DataEventRecordsCreateComponent,
        DataEventRecordsEditComponent,
        UserManagementComponent
    ],
    providers: [
        SecurityService,
        DataEventRecordsService,
        UserManagementService,
        Configuration,
        HasAdminRoleAuthenticationGuard,
        HasAdminRoleCanLoadGuard
    ],
    bootstrap:    [AppComponent],
})

export class AppModule {}

Now the Identity users can be managed from the Angular 2 UI.

[Image: extendingidentity_01]

Links

https://github.com/IdentityServer/IdentityServer4

http://docs.identityserver.io/en/dev/

https://github.com/IdentityServer/IdentityServer4.Samples

https://docs.asp.net/en/latest/security/authentication/identity.html

https://github.com/IdentityServer/IdentityServer4/issues/349

ASP.NET Core, Angular2 with Webpack and Visual Studio


Contributing to OSS projects on gitHub using fork and upstreams


This article is a simple guideline on how you could contribute to GitHub OSS projects using fork and upstream. This is not the only way to do it. Git Extensions is used for this demo, but any git client can be used. In this example, aspnet/AspLabs from Microsoft is used as the target repository.

So you have something to contribute, cool, that’s the hard part.

Before you can make your contribution, you need to create a fork of the repository where you want to make your contribution. Open the project on GitHub, and click the Fork button in the top right corner.

[Image: githuboss_01]

Now clone your forked repository

[Image: githuboss_02]

In Git Extensions, click clone repository and select a folder somewhere on your computer.

[Image: githuboss_03]

Now you have a local master branch and also a remote master branch of your forked repository. The next step is to configure the remote upstream branch. This is required to synchronize with the parent repository, as you might not be the only person contributing to it. Click the Repository menu in Git Extensions and add a new remote repository with the URL of the parent repository.

[Image: githuboss_04]

Now you can pull from the upstream repository. You pull from the upstream/master branch to your local master branch. Because of this, you should NEVER work on your master branch. You can also configure git to rebase the local master onto the upstream master if preferred.
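
For reference, the same setup could be done with the plain git command line, roughly like this (using the example repository):

git remote add upstream https://github.com/aspnet/AspLabs.git
git fetch upstream
git checkout master
git rebase upstream/master
git push origin master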

[Image: githuboss_05]

Once you have pulled from the upstream, you can push to your remote master, i.e. the forked master. Just to mention it again, NEVER WORK ON YOUR LOCAL FORKED MASTER, and you will save yourself hassle.

Now you’re ready to work. Create a new branch. A good recommendation is to use the following pattern for naming:

<GitHub username>/<reason-for-the-branch-in-lowercase>

Here’s an example:

damienbod/add-urls-check

Using your GitHub username makes it easier for the person reviewing the pull request.

When your work is finished on the branch, you are ready to create a pull request. Go to the parent repository and click on the ‘New pull request’ button:

[Image: githuboss_06]

Choose your working branch and select the target branch on the parent repository, usually the master.

NOTE: if your branch was created from an older master commit than the actual master on the parent, you need to pull from the upstream and rebase your branch onto the latest commit. This is easy, as you do not work on the local master.

If you are contributing to an aspnet repository, you will need to sign an electronic agreement before you can contribute.

If you are working together with a maintainer of the repository, or your pull request is the result of an issue, you could add a comment with the GitHub name of the person that will review and merge, so that he or she will be notified that you are ready. They will receive a notification on GitHub.

Now just wait and fix the issues as required. Once the pull request is merged, you need to pull from the upstream on your local forked repository and rebase if necessary to continue with your next pull request.

And who knows, you might even get a coin from Microsoft.

Thanks to Andrew Stanton-Nurse for his tips.

I am also grateful for tips from anyone on how to improve this guideline.

Links:

https://gitextensions.github.io/

gist from CristinaSolana



EF Core diagnosis and features with MS SQL Server


This article shows how Entity Framework Core messages can be logged and compared using the SQL Profiler, and demonstrates some, but not all, of the cool new 1.1 features. All information can be found in the links at the bottom, especially the excellent docs for EF Core.

Code: https://github.com/damienbod/EFCoreFeaturesAndDiag

project.json with EF Core packages and tools

When using EF Core, you need to add the correct packages and tools to the project file. EF Core 1.1 has a lot of changes compared to 1.0. You should not mix the different versions of EF Core; use either the LTS or the current version, but not both. Microsoft.EntityFrameworkCore.Tools.DotNet is a new package which came with 1.1.

{
    "dependencies": {
        "Microsoft.AspNetCore.Diagnostics.EntityFrameworkCore": "1.1.0",
        "Microsoft.EntityFrameworkCore": "1.1.0",
        "Microsoft.EntityFrameworkCore.Relational": "1.1.0",
        "Microsoft.EntityFrameworkCore.SqlServer": "1.1.0",
        "Microsoft.EntityFrameworkCore.SqlServer.Design": {
            "version": "1.1.0",
            "type": "build"
        },
        "Microsoft.EntityFrameworkCore.Tools": "1.1.0-preview4-final",
        ...
    },

    "tools": {
        "Microsoft.EntityFrameworkCore.Tools.DotNet": "1.1.0-preview4",
        ...
    },

    ...

}

Startup ConfigureServices

The database and EF Core are configured, as is usual for an ASP.NET Core application, in the Startup ConfigureServices method. The following code sets up EF Core to use a MS SQL Server database and uses the new retry-on-failure method from version 1.1.

public void ConfigureServices(IServiceCollection services)
{
	var sqlConnectionString = Configuration.GetConnectionString("DataAccessMsSqlServerProvider");

	services.AddDbContext<DomainModelMsSqlServerContext>(
		options => options.UseSqlServer(
			sqlConnectionString,
			sqlServerOptions => sqlServerOptions.EnableRetryOnFailure()
		)
	);

	services.AddScoped<IDataAccessProvider, DataAccessMsSqlServerProvider>();

	services.AddMvc().AddJsonOptions(options =>
	{
		options.SerializerSettings.ReferenceLoopHandling = ReferenceLoopHandling.Ignore;
	});
}

Data Model

With version 1.1, Entity Framework Core can now map backing fields to the database, and not just properties. This opens up a whole new world of possibilities for how entities can be designed or used. The _description field from the following DataEventRecord entity will be used for the database.

public class DataEventRecord
{
	private string _description;

	[Key]
	public long DataEventRecordId { get; set; }

	public string Name { get; set; }

	public string MadDescription {
		get { return _description; }
		set { _description = value;  }
	}

	public DateTime Timestamp { get; set; }

	[ForeignKey("SourceInfoId")]
	public SourceInfo SourceInfo { get; set; }

	public long SourceInfoId { get; set; }
}

DbContext with Field and Column mapping

In the OnModelCreating method of the DbContext, the _description field is mapped to the _description column for the MadDescription property. The context also configures shadow properties to add updated timestamps.

using System;
using System.Linq;
using Microsoft.EntityFrameworkCore;

namespace EFCoreFeaturesAndDiag.Model
{
    // >dotnet ef migration add testMigration
    public class DomainModelMsSqlServerContext : DbContext
    {
        public DomainModelMsSqlServerContext(DbContextOptions<DomainModelMsSqlServerContext> options) :base(options)
        { }

        public DbSet<DataEventRecord> DataEventRecords { get; set; }

        public DbSet<SourceInfo> SourceInfos { get; set; }

        protected override void OnModelCreating(ModelBuilder builder)
        {
            builder.Entity<DataEventRecord>().HasKey(m => m.DataEventRecordId);
            builder.Entity<SourceInfo>().HasKey(m => m.SourceInfoId);

            // shadow properties
            builder.Entity<DataEventRecord>().Property<DateTime>("UpdatedTimestamp");
            builder.Entity<SourceInfo>().Property<DateTime>("UpdatedTimestamp");

            builder.Entity<DataEventRecord>()
                .Property(b => b.MadDescription)
                .HasField("_description")
                .HasColumnName("_description");

            base.OnModelCreating(builder);
        }

        public override int SaveChanges()
        {
            ChangeTracker.DetectChanges();

            updateUpdatedProperty<SourceInfo>();
            updateUpdatedProperty<DataEventRecord>();

            return base.SaveChanges();
        }

        private void updateUpdatedProperty<T>() where T : class
        {
            var modifiedSourceInfo =
                ChangeTracker.Entries<T>()
                    .Where(e => e.State == EntityState.Added || e.State == EntityState.Modified);

            foreach (var entry in modifiedSourceInfo)
            {
                entry.Property("UpdatedTimestamp").CurrentValue = DateTime.UtcNow;
            }
        }
    }
}

Logging and diagnosis

Logging for EF Core is configured for this application as described here in the Entity Framework Core docs.

The EfCoreFilteredLoggerProvider returns a real logger only for the two EF Core categories defined in the _categories array, and a NullLogger for everything else:

using Microsoft.Extensions.Logging;
using System;
using System.Linq;
using Microsoft.EntityFrameworkCore.Storage.Internal;

namespace EFCoreFeaturesAndDiag.Logging
{
    public class EfCoreFilteredLoggerProvider : ILoggerProvider
    {
        private static string[] _categories =
        {
            typeof(Microsoft.EntityFrameworkCore.Storage.Internal.RelationalCommandBuilderFactory).FullName,
            typeof(Microsoft.EntityFrameworkCore.Storage.Internal.SqlServerConnection).FullName
        };

        public ILogger CreateLogger(string categoryName)
        {
            if (_categories.Contains(categoryName))
            {
                return new MyLogger();
            }

            return new NullLogger();
        }

        public void Dispose()
        { }

        private class MyLogger : ILogger
        {
            public bool IsEnabled(LogLevel logLevel)
            {
                return true;
            }

            public void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception exception, Func<TState, Exception, string> formatter)
            {
                Console.WriteLine(formatter(state, exception));
            }

            public IDisposable BeginScope<TState>(TState state)
            {
                return null;
            }
        }

        private class NullLogger : ILogger
        {
            public bool IsEnabled(LogLevel logLevel)
            {
                return false;
            }

            public void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception exception, Func<TState, Exception, string> formatter)
            { }

            public IDisposable BeginScope<TState>(TState state)
            {
                return null;
            }
        }
    }
}
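
The filtered provider can be added to the ILoggerFactory like any other provider. This wiring is not shown in the sample; a sketch of how it could be registered in the Startup Configure method:

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
    // Only the two EF Core categories defined in the provider are logged.
    loggerFactory.AddProvider(new EfCoreFilteredLoggerProvider());

    app.UseMvc();
}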

The EfCoreLoggerProvider class logs everything, both to the console and to a file:

using Microsoft.Extensions.Logging;
using System;
using System.IO;

namespace EFCoreFeaturesAndDiag.Logging
{
    public class EfCoreLoggerProvider : ILoggerProvider
    {
        public ILogger CreateLogger(string categoryName)
        {
            return new MyLogger();
        }

        public void Dispose()
        { }

        private class MyLogger : ILogger
        {
            public bool IsEnabled(LogLevel logLevel)
            {
                return true;
            }

            public void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception exception, Func<TState, Exception, string> formatter)
            {
                File.AppendAllText(@"C:\temp\log.txt", formatter(state, exception));
                Console.WriteLine(formatter(state, exception));
            }

            public IDisposable BeginScope<TState>(TState state)
            {
                return null;
            }
        }
    }
}

The EfCoreLoggerProvider logger is then added in the DataAccessMsSqlServerProvider constructor.

public class DataAccessMsSqlServerProvider : IDataAccessProvider
{
	private readonly DomainModelMsSqlServerContext _context;
	private readonly ILogger _logger;

	public DataAccessMsSqlServerProvider(DomainModelMsSqlServerContext context, ILoggerFactory loggerFactory)
	{
		_context = context;
		loggerFactory.AddProvider(new EfCoreLoggerProvider());
		_logger = loggerFactory.CreateLogger("DataAccessMsSqlServerProvider");

	}

The EF Core logs

Here are the logs produced for an insert command:

Executing action method AspNet5MultipleProject.Controllers.DataEventRecordsController.AddTest (EFCoreFeaturesAndDiag) with arguments ((null)) - ModelState is Valid
Opening connection to database 'EfcoreTest' on server 'N275\MSSQLSERVER2014'.
Beginning transaction with isolation level 'Unspecified'.
Executed DbCommand (60ms) [Parameters=[@p0='?' (Size = 4000), @p1='?' (Size = 4000), @p2='?', @p3='?'], CommandType='Text', CommandTimeout='30']
SET NOCOUNT ON;
INSERT INTO [SourceInfos] ([Description], [Name], [Timestamp], [UpdatedTimestamp])
VALUES (@p0, @p1, @p2, @p3);
SELECT [SourceInfoId]
FROM [SourceInfos]
WHERE @@ROWCOUNT = 1 AND [SourceInfoId] = scope_identity();
Executed DbCommand (1ms) [Parameters=[@p4='?' (Size = 4000), @p5='?' (Size = 4000), @p6='?', @p7='?', @p8='?'], CommandType='Text', CommandTimeout='30']
SET NOCOUNT ON;
INSERT INTO [DataEventRecords] ([_description], [Name], [SourceInfoId], [Timestamp], [UpdatedTimestamp])
VALUES (@p4, @p5, @p6, @p7, @p8);
SELECT [DataEventRecordId]
FROM [DataEventRecords]
WHERE @@ROWCOUNT = 1 AND [DataEventRecordId] = scope_identity();
Committing transaction.
Closing connection to database 'EfcoreTest' on server 'N275\MSSQLSERVER2014'.
Executed action method AspNet5MultipleProject.Controllers.DataEventRecordsController.AddTest (EFCoreFeaturesAndDiag), returned result Microsoft.AspNetCore.Mvc.OkObjectResult.
No information found on request to perform content negotiation.
Selected output formatter 'Microsoft.AspNetCore.Mvc.Formatters.StringOutputFormatter' and content type 'text/plain; charset=utf-8' to write the response.
Executing ObjectResult, writing value Microsoft.AspNetCore.Mvc.ControllerContext.
Executed action AspNet5MultipleProject.Controllers.DataEventRecordsController.AddTest (EFCoreFeaturesAndDiag) in 1181.5083ms
Connection id "0HL13KIK3Q7QG" completed keep alive response.
Request finished in 1259.1888ms 200 text/plain; charset=utf-8
Request starting HTTP/1.1 GET http://localhost:46799/favicon.ico
The request path /favicon.ico does not match an existing file
Request did not match any routes.
Connection id "0HL13KIK3Q7QG" completed keep alive response.
Request finished in 36.2143ms 404

MS SQL Profiler

And here’s the corresponding insert request in the SQL Profiler from MS SQL Server.

[Image: sqlprofiler_efcore_01]

EF Core Find

One feature which was also implemented in EF Core 1.1 is the Find method, which is very convenient to use.

public DataEventRecord GetDataEventRecord(long dataEventRecordId)
{
	return _context.DataEventRecords.Find(dataEventRecordId);
}

EF Core is looking really good and with the promise of further features and providers, this is becoming really exciting.

Links:

https://docs.microsoft.com/en-us/ef/

https://docs.microsoft.com/en-us/ef/core/miscellaneous/logging

https://blogs.msdn.microsoft.com/dotnet/2016/11/16/announcing-entity-framework-core-1-1/

https://msdn.microsoft.com/en-us/library/ms181091.aspx

https://github.com/aspnet/EntityFramework/releases/tag/rel%2F1.1.0

Experiments with Entity Framework Core and ASP.NET Core 1.0 MVC

ASP.NET Core 1.0 with PostgreSQL and Entity Framework Core

ASP.NET Core 1.0 MVC File Upload with MS SQL SERVER FileTable

Setting the NLog database connection string in the ASP.NET Core appsettings.json


Implementing a Client White-list using ASP.NET Core Middleware


This article shows how a client white-list could be implemented using ASP.NET Core middleware which checks the remote IP address of the request. If the client IP is on the white-list, no restrictions exist.

Code: https://github.com/damienbod/ClientIpAspNetCoreIIS

The middleware uses an admin white-list parameter from the constructor to compare with the remote IP address from the HttpContext Connection property. (Accessing the remote IP address works differently than in previous versions of .NET.) In the example, all GET requests are allowed. For any other request method, the remote IP is checked against the white-list. If it is not in the list, a 403 is returned.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;

namespace ClientIpAspNetCore
{
    public class AdminWhiteListMiddleware
    {
        private readonly RequestDelegate _next;
        private readonly ILogger<AdminWhiteListMiddleware> _logger;
        private readonly string _adminWhiteList;

        public AdminWhiteListMiddleware(RequestDelegate next, ILogger<AdminWhiteListMiddleware> logger, string adminWhiteList)
        {
            _adminWhiteList = adminWhiteList;
            _next = next;
            _logger = logger;
        }

        public async Task Invoke(HttpContext context)
        {
            if (context.Request.Method != "GET")
            {
                var remoteIp = context.Connection.RemoteIpAddress;
                _logger.LogInformation($"Request from Remote IP address: {remoteIp}");

                string[] ip = _adminWhiteList.Split(';');
                if (!ip.Any(option => option == remoteIp.ToString()))
                {
                    _logger.LogInformation($"Forbidden Request from Remote IP address: {remoteIp}");
                    context.Response.StatusCode = (int)HttpStatusCode.Forbidden;
                    return;
                }
            }

            await _next.Invoke(context);

        }
    }
}

The white-list is configured in the appsettings.json. This is a ';'-separated list which is split in the middleware class.

{
    "AdminWhiteList":  "127.0.0.1;192.168.1.5",
    "Logging": {
        "IncludeScopes": false,
        "LogLevel": {
            "Default": "Debug",
            "System": "Information",
            "Microsoft": "Information"
        }
    }
}

In the startup class, the AdminWhiteListMiddleware type is added using the appsettings configuration.

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
	...

	app.UseStaticFiles();

	app.UseMiddleware<AdminWhiteListMiddleware>(Configuration["AdminWhiteList"]);
	app.UseMvc();
}

If a request other than a GET is sent from an IP that is not in the white-list, the 403 response is returned to the client and logged.

2016-12-18 16:45:42.8891|0|ClientIpAspNetCore.AdminWhiteListMiddleware|INFO|  Request from Remote IP address: 192.168.1.4
2016-12-18 16:45:42.9031|0|ClientIpAspNetCore.AdminWhiteListMiddleware|INFO|  Forbidden Request from Remote IP address: 192.168.1.4

An ActionFilter could also be used to implement this, for example if more specific logic is required.

using System.Threading.Tasks;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc.Authorization;
using Microsoft.AspNetCore.Mvc.Filters;
using Microsoft.Extensions.Logging;

namespace ClientIpAspNetCore.Filters
{
    public class ClientIdCheckFilter : ActionFilterAttribute
    {
        private readonly ILogger _logger;

        public ClientIdCheckFilter(ILoggerFactory loggerFactory)
        {
            _logger = loggerFactory.CreateLogger("ClassConsoleLogActionOneFilter");
        }

        public override void OnActionExecuting(ActionExecutingContext context)
        {
            _logger.LogInformation($"Remote IpAddress: {context.HttpContext.Connection.RemoteIpAddress}");

            // TODO implement some business logic for this...

            base.OnActionExecuting(context);
        }
    }
}

The ActionFilter can be added to the services.

public void ConfigureServices(IServiceCollection services)
{
	services.AddScoped<ClientIdCheckFilter>();

	services.AddMvc();
}

The filter can then be used on any specific controller as required.

[ServiceFilter(typeof(ClientIdCheckFilter))]
[Route("api/[controller]")]
public class ValuesController : Controller

Note: I have not tested this with all the different possible hops or forwarded headers; it has only been tested with IIS and Kestrel.
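
For proxy or load-balancer scenarios, the forwarded headers middleware from the Microsoft.AspNetCore.HttpOverrides package could be used to set RemoteIpAddress from the X-Forwarded-For header before the white-list middleware runs. A sketch, untested in this sample:

// In Startup.Configure, before the white-list middleware.
// Requires: using Microsoft.AspNetCore.Builder; using Microsoft.AspNetCore.HttpOverrides;
app.UseForwardedHeaders(new ForwardedHeadersOptions
{
    ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto
});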

Links:

https://docs.microsoft.com/en-us/aspnet/core/fundamentals/middleware

http://odetocode.com/blogs/scott/archive/2016/11/22/asp-net-core-and-the-enterprise-part-3-middleware.aspx


Creating an ASP.NET Core 1.1 VS2017 Docker application


This blog shows how to set up a basic ASP.NET Core 1.1 application using Visual Studio 2017 and Docker.

Code: https://github.com/damienbod/AspNetCoreVS2017Docker

This article from Swaminathan Vetri demonstrates how to set up everything for an ASP.NET Core 1.0 application.

Now the application needs to be updated to ASP.NET Core 1.1. Open the csproj file and update the PackageReference elements and also the TargetFramework to the 1.1 packages and target. At present, there is no tooling help when updating the packages, like the project.json file had, so you need to know exactly which packages to use when updating directly.

<Project ToolsVersion="15.0" Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp1.1</TargetFramework>
    <PreserveCompilationContext>true</PreserveCompilationContext>
  </PropertyGroup>
  <ItemGroup>
    <Folder Include="wwwroot\" />
  </ItemGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.NETCore.App" Version="1.1.0" />
    <PackageReference Include="Microsoft.AspNetCore.Mvc" Version="1.1.0" />
    <PackageReference Include="Microsoft.AspNetCore.Routing" Version="1.1.0" />
    <PackageReference Include="Microsoft.AspNetCore.Server.IISIntegration" Version="1.1.0" />
    <PackageReference Include="Microsoft.AspNetCore.Server.Kestrel" Version="1.1.0" />
    <PackageReference Include="Microsoft.Extensions.Configuration.EnvironmentVariables" Version="1.1.0" />
    <PackageReference Include="Microsoft.Extensions.Configuration.FileExtensions" Version="1.1.0" />
    <PackageReference Include="Microsoft.Extensions.Configuration.Json" Version="1.1.0" />
    <PackageReference Include="Microsoft.Extensions.Logging" Version="1.1.0" />
    <PackageReference Include="Microsoft.Extensions.Logging.Console" Version="1.1.0" />
    <PackageReference Include="Microsoft.Extensions.Logging.Debug" Version="1.1.0" />
    <PackageReference Include="Microsoft.Extensions.Options.ConfigurationExtensions" Version="1.1.0" />
  </ItemGroup>
  <ItemGroup>
    <Content Update=".dockerignore">
      <DependentUpon>Dockerfile</DependentUpon>
    </Content>
    <Content Update="docker-compose.override.yml">
      <DependentUpon>docker-compose.yml</DependentUpon>
    </Content>
    <Content Update="docker-compose.vs.debug.yml">
      <DependentUpon>docker-compose.yml</DependentUpon>
    </Content>
    <Content Update="docker-compose.vs.release.yml">
      <DependentUpon>docker-compose.yml</DependentUpon>
    </Content>
  </ItemGroup>
</Project>

Then update the docker-compose.ci.build.yml file. You need to select the required image which can be found on Docker Hub. The microsoft/aspnetcore-build:1.1.0-msbuild image is used here.

version: '2'

services:
  ci-build:
    image: microsoft/aspnetcore-build:1.1.0-msbuild
    volumes:
      - .:/src
    working_dir: /src
    command: /bin/bash -c "dotnet restore && dotnet publish -c Release -o ./bin/Release/PublishOutput"

Now update the Dockerfile to target the 1.1.0 version.

FROM microsoft/aspnetcore:1.1.0
ARG source
WORKDIR /app
EXPOSE 80
COPY ${source:-bin/Release/PublishOutput} .
ENTRYPOINT ["dotnet", "AspNetCoreVS2017Docker.dll"]

Start the application using Docker. You can debug the application which is running inside Docker, all in Visual Studio 2017. Very neat.

[Image: vs2017_docker_basic_1]

Notes:

When setting up Docker, you need to share the C drive in Docker.

Links:

Debugging Asp.Net core apps running in Docker Containers using VS 2017

https://www.sesispla.net/blog/language/en/2016/05/running-asp-net-core-1-0-rc2-in-docker/

https://hub.docker.com/r/microsoft/aspnetcore-build/tags/

https://blogs.msdn.microsoft.com/webdev/2016/11/16/new-docker-tools-for-visual-studio/

https://blogs.msdn.microsoft.com/stevelasker/2016/06/14/configuring-docker-for-windows-volumes/


Building production ready Angular apps with Visual Studio and ASP.NET Core


This article shows how Angular SPA apps can be built using Visual Studio and ASP.NET Core in a way that is ready for production. Lots of articles, blogs and templates exist for ASP.NET Core and Angular, but very few support Angular production builds.

Although Angular 2 is not so old, many different seeds and build templates already exist, so care should be taken when choosing the infrastructure for the Angular application. Any Angular template or seed which does not support AoT or treeshaking should NOT be used, and neither should any third-party Angular component which does not support AoT.

This example uses webpack 2 to build and bundle the Angular application. In the package.json, npm scripts are used to configure the different builds and can be used inside Visual Studio using the npm task runner.

Code:

VS2015: https://github.com/damienbod/Angular2WebpackVisualStudio

VS2017: https://github.com/damienbod/Angular2WebpackVisualStudio/tree/VisualStudio2017

Short introduction to AoT and treeshaking

AoT

AoT stands for Ahead of Time compilation. As per definition from the Angular docs:

“With AOT, the browser downloads a pre-compiled version of the application. The browser loads executable code so it can render the application immediately, without waiting to compile the app first.”

With AoT, you have smaller packages sizes, fewer asynchronous requests and better security. All is explained very well in the Angular Docs:

https://angular.io/docs/ts/latest/cookbook/aot-compiler.html

AoT uses platformBrowser to bootstrap, and not platformBrowserDynamic, which is used for JIT (Just in Time) compilation.

// Entry point for AoT compilation.
export * from './polyfills';

import { platformBrowser } from '@angular/platform-browser';
import { enableProdMode } from '@angular/core';
import { AppModuleNgFactory } from '../aot/angular2App/app/app.module.ngfactory';

enableProdMode();

platformBrowser().bootstrapModuleFactory(AppModuleNgFactory);

treeshaking

Treeshaking removes the unused portions of the libraries from the application, reducing the size of the application.

https://angular.io/docs/ts/latest/cookbook/aot-compiler.html

npm task runner

npm scripts can be used easily inside Visual Studio by using the npm task runner. Once installed, this needs to be configured correctly.

VS2015: Go to Tools -> Options -> Projects and Solutions -> External Web Tools and select all the checkboxes. More information can be found here.

In VS2017, this is slightly different:

Go to Tools -> Options -> Projects and Solutions -> Web Package Management -> External Web Tools and select all checkboxes:

vs_angular_build_01

npm scripts

ngc

ngc is the Angular compiler, which is used to do the AoT build using the tsconfig-aot.json configuration.

"ngc": "ngc -p ./tsconfig-aot.json",

The tsconfig-aot.json file builds to the aot folder.

{
  "compilerOptions": {
    "target": "es5",
    "module": "es2015",
    "moduleResolution": "node",
    "sourceMap": false,
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "removeComments": true,
    "noImplicitAny": true,
    "suppressImplicitAnyIndexErrors": true,
    "skipLibCheck": true,
    "lib": [
      "es2015",
      "dom"
    ]
  },
  "files": [
    "angular2App/app/app.module.ts",
    "angular2App/main-aot.ts"
  ],
  "angularCompilerOptions": {
    "genDir": "aot",
    "skipMetadataEmit": true
  },
  "compileOnSave": false,
  "buildOnSave": false
}

build-production

The build-production npm script is used for the production build and can be used for the publish or the CI build as required. The npm script runs the ngc script and then the webpack-production build.

"build-production": "npm run ngc && npm run webpack-production",

webpack-production npm script:

"webpack-production": "set NODE_ENV=production&& webpack",

watch-webpack-dev

The watch build monitors the source files and builds if any file changes.

"watch-webpack-dev": "set NODE_ENV=development&& webpack --watch --color",

start (webpack-dev-server)

The start script runs the webpack-dev-server client application and also the ASP.NET Core server application.

"start": "concurrently \"webpack-dev-server --inline --progress --port 8080\" \"dotnet run\" ",

Any of these npm scripts can be run from the npm task runner.

vs_angular_build_02

Deployment

When deploying the application to IIS, the build-production script needs to be run, then the dotnet publish command, and then the contents can be copied to the IIS server. The publish-for-iis npm script can be used to publish. The command can be started from a build server without problems.

"publish-for-iis": "npm run build-production && dotnet publish -c Release"


https://docs.microsoft.com/en-us/dotnet/articles/core/tools/dotnet-publish

When deploying to IIS, you need to install the DotNetCore.1.1.0-WindowsHosting.exe on the IIS server. Docs for setting up the IIS server:

https://docs.microsoft.com/en-us/aspnet/core/publishing/iis

Why not webpack task runner?

The Webpack Task Runner cannot be used for Angular Webpack applications because it does not support the required commands for Angular Webpack builds, either dev or production. The webpack -d build causes map errors in IE, and the ngc compiler cannot be used, so no production builds can be started from the Webpack Task Runner. For Angular Webpack projects, do not use the Webpack Task Runner; use the npm task runner.

Full package.json

{
  "version": "1.0.0",
  "description": "",
  "main": "wwwroot/index.html",
  "author": "",
  "license": "ISC",
  "scripts": {
    "ngc": "ngc -p ./tsconfig-aot.json",
    "start": "concurrently \"webpack-dev-server --inline --progress --port 8080\" \"dotnet run\" ",
    "webpack-dev": "set NODE_ENV=development&& webpack",
    "webpack-production": "set NODE_ENV=production&& webpack",
    "build-dev": "npm run webpack-dev",
    "build-production": "npm run ngc && npm run webpack-production",
    "watch-webpack-dev": "set NODE_ENV=development&& webpack --watch --color",
    "watch-webpack-production": "npm run build-production --watch --color"
  },
  "dependencies": {
    "@angular/common": "~2.4.1",
    "@angular/compiler": "~2.4.1",
    "@angular/core": "~2.4.1",
    "@angular/forms": "~2.4.1",
    "@angular/http": "~2.4.1",
    "@angular/platform-browser": "~2.4.1",
    "@angular/platform-browser-dynamic": "~2.4.1",
    "@angular/router": "~3.4.1",
    "@angular/upgrade": "~2.4.1",
    "angular-in-memory-web-api": "~0.2.3",
    "core-js": "^2.4.1",
    "reflect-metadata": "^0.1.8",
    "rxjs": "5.0.1",
    "zone.js": "^0.7.4",
    "@angular/compiler-cli": "2.4.1",
    "@angular/platform-server": "~2.4.1",
    "bootstrap": "^3.3.7",
    "ie-shim": "^0.1.0"
  },
  "devDependencies": {
    "@types/node": "^6.0.52",
    "angular2-template-loader": "^0.5.0",
    "awesome-typescript-loader": "^2.2.4",
    "clean-webpack-plugin": "^0.1.9",
    "concurrently": "^3.1.0",
    "copy-webpack-plugin": "^2.1.3",
    "css-loader": "^0.23.0",
    "file-loader": "^0.8.4",
    "html-webpack-plugin": "^2.8.1",
    "jquery": "^2.2.0",
    "json-loader": "^0.5.3",
    "node-sass": "^3.10.1",
    "raw-loader": "^0.5.1",
    "rimraf": "^2.5.2",
    "sass-loader": "^4.0.2",
    "source-map-loader": "^0.1.5",
    "style-loader": "^0.13.0",
    "ts-helpers": "^1.1.1",
    "typescript": "2.0.3",
    "url-loader": "^0.5.6",
    "webpack": "^2.2.0-rc.3",
    "webpack-dev-server": "^1.16.2"
  },
  "-vs-binding": { "ProjectOpened": [ "watch-webpack-dev" ] }
}

Full webpack.prod.js

var path = require('path');

var webpack = require('webpack');

var HtmlWebpackPlugin = require('html-webpack-plugin');
var CopyWebpackPlugin = require('copy-webpack-plugin');
var CleanWebpackPlugin = require('clean-webpack-plugin');
var helpers = require('./webpack.helpers');

console.log("@@@@@@@@@ USING PRODUCTION @@@@@@@@@@@@@@@");

module.exports = {

    entry: {
        'vendor': './angular2App/vendor.ts',
        'app': './angular2App/main-aot.ts' // AoT compilation
    },

    output: {
        path: "./wwwroot/",
        filename: 'dist/[name].[hash].bundle.js',
        publicPath: "/"
    },

    resolve: {
        extensions: ['.ts', '.js', '.json', '.css', '.scss', '.html']
    },

    devServer: {
        historyApiFallback: true,
        stats: 'minimal',
        outputPath: path.join(__dirname, 'wwwroot/')
    },

    module: {
        rules: [
            {
                test: /\.ts$/,
                loaders: [
                    'awesome-typescript-loader'
                ]
            },
            {
                test: /\.(png|jpg|gif|woff|woff2|ttf|svg|eot)$/,
                loader: "file-loader?name=assets/[name]-[hash:6].[ext]",
            },
            {
                test: /favicon.ico$/,
                loader: "file-loader?name=/[name].[ext]",
            },
            {
                test: /\.css$/,
                loader: "style-loader!css-loader"
            },
            {
                test: /\.scss$/,
                exclude: /node_modules/,
                loaders: ["style-loader", "css-loader", "sass-loader"]
            },
            {
                test: /\.html$/,
                loader: 'raw-loader'
            }
        ],
        exprContextCritical: false
    },

    plugins: [
        new CleanWebpackPlugin(
            [
                './wwwroot/dist',
                './wwwroot/assets'
            ]
        ),
        new webpack.NoErrorsPlugin(),
        new webpack.optimize.UglifyJsPlugin({
            compress: {
                warnings: false
            },
            output: {
                comments: false
            },
            sourceMap: false
        }),
        new webpack.optimize.CommonsChunkPlugin(
            {
                name: ['vendor']
            }),

        new HtmlWebpackPlugin({
            filename: 'index.html',
            inject: 'body',
            chunksSortMode: helpers.packageSort(['vendor', 'app']),
            template: 'angular2App/index.html'
        }),

        new CopyWebpackPlugin([
            { from: './angular2App/images/*.*', to: "assets/", flatten: true }
        ])
    ]
};


Links:

https://damienbod.com/2016/06/12/asp-net-core-angular2-with-webpack-and-visual-studio/

https://github.com/preboot/angular2-webpack

https://webpack.github.io/docs/

https://github.com/jtangelder/sass-loader

https://github.com/petehunt/webpack-howto/blob/master/README.md

https://blogs.msdn.microsoft.com/webdev/2015/03/19/customize-external-web-tools-in-visual-studio-2015/

https://marketplace.visualstudio.com/items?itemName=MadsKristensen.NPMTaskRunner

http://sass-lang.com/

http://blog.thoughtram.io/angular/2016/06/08/component-relative-paths-in-angular-2.html

https://angular.io/docs/ts/latest/guide/webpack.html

http://blog.mgechev.com/2016/06/26/tree-shaking-angular2-production-build-rollup-javascript/

https://angular.io/docs/ts/latest/tutorial/toh-pt5.html

http://angularjs.blogspot.ch/2016/06/improvements-coming-for-routing-in.html?platform=hootsuite

https://angular.io/docs/ts/latest/cookbook/aot-compiler.html

https://docs.microsoft.com/en-us/aspnet/core/publishing/iis

https://weblog.west-wind.com/posts/2016/Jun/06/Publishing-and-Running-ASPNET-Core-Applications-with-IIS


Angular 2 Lazy Loading with Webpack 2


This article shows how Angular 2 lazy loading can be supported using Webpack 2 for both JIT and AOT builds. The Webpack loader angular-router-loader from Brandon Roberts is used to implement this.

A big thanks to Roberto Simonetti for his help in this.

Code: Visual Studio 2015 project | Visual Studio 2017 project

Blogs in this series:

First create an Angular 2 module

In this example, the about module will be lazy loaded when the user clicks on the about tab. The about.module.ts is the entry point for this feature. The module has its own component and routing.
The app will now be set up to lazy load the AboutModule.

import { NgModule } from '@angular/core';
import { CommonModule } from '@angular/common';

import { AboutRoutes } from './about.routes';
import { AboutComponent } from './components/about.component';

@NgModule({
    imports: [
        CommonModule,
        AboutRoutes
    ],

    declarations: [
        AboutComponent
    ],

})

export class AboutModule { }

Add the angular-router-loader Webpack loader to the package.json file

To add lazy loading to the app, the angular-router-loader npm package needs to be added to the devDependencies in the package.json file.

"devDependencies": {
    "@types/node": "7.0.0",
    "angular2-template-loader": "^0.6.0",
    "angular-router-loader": "^0.5.0",

Configure the Angular 2 routing

The lazy loading routing can be added to the app.routes.ts file. The loadChildren property defines the path and the class name of the module which can be lazy loaded. It is also possible to pre-load lazy-loaded modules if required.

import { Routes, RouterModule } from '@angular/router';

export const routes: Routes = [
    { path: '', redirectTo: 'home', pathMatch: 'full' },
    {
        path: 'about', loadChildren: './modules/about/about.module#AboutModule',
    }
];

export const AppRoutes = RouterModule.forRoot(routes);

Update the tsconfig-aot.json and tsconfig.json files

Now the tsconfig.json for development JIT builds and the tsconfig-aot.json for AOT production builds need to be configured to load the AboutModule.

AOT production build

The files property contains all the module entry points as well as the app entry file. The angularCompilerOptions property defines the folder into which the AoT output will be built. This must match the configuration in the Webpack production config file.

{
  "compilerOptions": {
    "target": "es5",
    "module": "es2015",
    "moduleResolution": "node",
    "sourceMap": false,
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "removeComments": true,
    "noImplicitAny": true,
    "suppressImplicitAnyIndexErrors": true,
    "skipLibCheck": true,
    "lib": [
      "es2015",
      "dom"
    ]
  },
  "files": [
    "angular2App/app/app.module.ts",
    "angular2App/app/modules/about/about.module.ts",
    "angular2App/main-aot.ts"
  ],
  "angularCompilerOptions": {
    "genDir": "aot",
    "skipMetadataEmit": true
  },
  "compileOnSave": false,
  "buildOnSave": false
}

JIT development build

The modules and entry points are also defined for the JIT build.

{
  "compilerOptions": {
    "target": "es5",
    "module": "es2015",
    "moduleResolution": "node",
    "sourceMap": true,
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "removeComments": true,
    "noImplicitAny": true,
    "skipLibCheck": true,
    "lib": [
      "es2015",
      "dom"
    ],
    "types": [
      "node"
    ]
  },
  "files": [
    "angular2App/app/app.module.ts",
    "angular2App/app/modules/about/about.module.ts",
    "angular2App/main.ts"
  ],
  "awesomeTypescriptLoaderOptions": {
    "useWebpackText": true
  },
  "compileOnSave": false,
  "buildOnSave": false
}

Configure Webpack to chunk and use the router lazy loading

Now the webpack configuration needs to be updated for the lazy loading.

AOT production build

The webpack.prod.js file requires that the chunkFilename property is set in the output, so that webpack chunks the lazy load modules.

output: {
        path: './wwwroot/',
        filename: 'dist/[name].[hash].bundle.js',
        chunkFilename: 'dist/[id].[hash].chunk.js',
        publicPath: '/'
},

The angular-router-loader is added to the loaders. The genDir folder defined here must match the definition in tsconfig-aot.json.

 module: {
  rules: [
    {
        test: /\.ts$/,
        loaders: [
            'awesome-typescript-loader',
            'angular-router-loader?aot=true&genDir=aot/'
        ]
    },

JIT development build

The webpack.dev.js file requires that the chunkFilename property is set in the output, so that webpack chunks the lazy load modules.

output: {
        path: './wwwroot/',
        filename: 'dist/[name].bundle.js',
        chunkFilename: 'dist/[id].chunk.js',
        publicPath: '/'
},

The angular-router-loader is added to the loaders.

 module: {
  rules: [
    {
        test: /\.ts$/,
        loaders: [
            'awesome-typescript-loader',
            'angular-router-loader',
            'angular2-template-loader',
            'source-map-loader',
            'tslint-loader'
        ]
    },

Build and run

Now the application can be built using the npm build scripts and the dotnet command tool.

Open a command line in the root of the src files. Install the npm packages:

npm install

Now run the production build. The build-production script does an ngc build, and then a webpack production build.

npm run build-production

You can see that Webpack creates an extra chunked file for the About Module.

lazyloadingwebpack_01

Then start the application. The server is implemented using ASP.NET Core 1.1.

dotnet run

When the application is started, the AboutModule is not loaded.

lazyloadingwebpack_02

When the about tab is clicked, the chunked AboutModule is loaded.

lazyloadingwebpack_03

Absolutely fantastic. You could also pre-load the modules if required. See this blog.

Links:

https://github.com/brandonroberts/angular-router-loader

https://www.npmjs.com/package/angular-router-loader

https://github.com/robisim74/angular2localization/tree/gh-pages

https://vsavkin.com/angular-router-preloading-modules-ba3c75e424cb

https://webpack.github.io/docs/


Creating an ASP.NET Core Docker application and deploying to Azure


This blog is a simple step-by-step walkthrough, which creates an ASP.NET Core Docker image using Visual Studio 2017, deploys it to Docker Hub and then deploys the image to Azure.

Thanks to Malte Lantin for his fantastic posts on MSDN. See the links at the end of this post.

Code: https://github.com/damienbod/AspNetCoreDockerAzureDemo

Step 1: Create a Docker project in Visual Studio 2017 using ASP.NET Core

In the example, an ASP.NET Core Visual Studio 2017 project using msbuild is used as the demo application. Then the Docker support is added to the project using Visual Studio 2017.

Right click the project, Add/Docker Project Support
firstazuredocker_01

Update the Docker files to ASP.NET Core and the correct Docker version as required. More information can be found here:

http://www.jeffreyfritz.com/2017/01/docker-compose-api-too-old-for-windows/

https://damienbod.com/2016/12/24/creating-an-asp-net-core-1-1-vs2017-docker-application/

Now the application will be built in a layer on top of the microsoft/aspnetcore image.

Dockerfile:

FROM microsoft/aspnetcore:1.0.3
ARG source
WORKDIR /app
EXPOSE 80
COPY ${source:-bin/Release/PublishOutput} .
ENTRYPOINT ["dotnet", "AngularClient.dll"]

docker-compose.yml

version: '2'

services:
  angularclient:
    image: angularclient
    build:
      context: .
      dockerfile: Dockerfile

Once the project is built and ready, it can be deployed to docker hub. Do a release build of the projects.

Step 2: Build a docker image and deploy to docker hub

Before you can deploy the docker image to Docker Hub, you need to have, or create, a Docker Hub account.

Then open up the console and create a docker tag for your application. Replace damienbod with your docker hub user name. The docker image angularclient, created from Visual Studio 2017, will be tagged to damienbod/aspnetcorethingsclient.

docker tag angularclient damienbod/aspnetcorethingsclient

Now login to docker hub in the command line:

docker login

Once logged in, the image can be pushed to docker hub. Again replace damienbod with your docker hub name.

docker push damienbod/aspnetcorethingsclient

Once deployed, you can view this on docker hub.

https://hub.docker.com/u/damienbod/

firstazuredocker_02

For more information on docker images and containers:

https://docs.docker.com/engine/getstarted/step_four/

Step 3: Deploy to Azure

Log in to https://portal.azure.com/, click the New button and search for Web App On Linux. We want to deploy a docker container to this, using our docker image.

Select the + New and search for Web App On Linux.
firstazuredocker_03

Then select it. Do not click Create until the docker container has been configured.

firstazuredocker_04

Now configure the docker container. Add the newly created image from Docker Hub to the text field.

firstazuredocker_05

Click Create and the application will be deployed on Azure. Now the application can be used.

firstazuredocker_07

And the application runs as required.

http://thingsclient.azurewebsites.net/home

firstazuredocker_06

Notes:

The Visual Studio 2017 docker tooling is still rough and has problems when using newer versions of docker, or the msbuild 1.1 versions etc., but it is still in RC and will improve before the release. Next steps are to use CI to automatically complete all these steps, add security, and use docker-compose for multiple container deployments.

Links

https://blogs.msdn.microsoft.com/malte_lantin/2017/01/12/create-you-first-asp-net-core-app-and-host-it-in-a-linux-docker-container-on-microsoft-azure-part-13/

https://blogs.msdn.microsoft.com/malte_lantin/2017/01/13/create-you-first-asp-net-core-app-and-host-it-in-a-linux-docker-container-on-microsoft-azure-part-23/

https://blogs.msdn.microsoft.com/malte_lantin/2017/01/13/create-you-first-asp-net-core-app-and-host-it-in-a-linux-docker-container-on-microsoft-azure-part-33/

https://hub.docker.com/

Orchestrating multi service asp.net core application using docker-compose

Debugging Asp.Net core apps running in Docker Containers using VS 2017

https://docs.docker.com/engine/getstarted/step_four/

https://stefanprodan.com/2016/aspnetcore-cd-pipeline-docker-hub/


Docker compose with ASP.NET Core, EF Core and the PostgreSQL image


This article shows how an ASP.NET Core application with a PostgreSQL database can be set up together using docker as the deployment containers for both the web and database parts of the application. docker-compose is used to connect the 2 containers and the application is built using Visual Studio 2017.

Code: https://github.com/damienbod/AspNetCorePostgreSQLDocker

Setting up the PostgreSQL docker container from the command line

The PostgreSQL docker image can be started or set up from the command line simply by defining the required environment parameters and the port which can be used to connect with PostgreSQL. A named volume called pgdata is also defined in the following command. The container is called postgres-server.

$ docker run -d -e POSTGRES_USER=postgres -e POSTGRES_PASSWORD=damienbod
 --name postgres-server -p 5432:5432 -v pgdata:/var/lib/postgresql/data
 --restart=always postgres

You can check all your local volumes with the following docker command:

$ docker volume ls

The docker containers can be viewed by running docker ps -a:

$ docker ps -a

Then you can check the docker container for the postgres-server by using the logs command and the id of the container. Only the first few characters of the container id are required for docker to find the container.

$ docker logs <docker_id>

If you would like to view the docker container configuration and its properties, the inspect command can be used:

$ docker inspect <docker_id>

When developing docker applications, you will regularly need to clean up images, containers and volumes. Here are some quick commands which are used regularly.

If you need to find the dangling volumes:

$ docker volume ls -qf dangling=true

A volume can be removed using the volume id:

$ docker volume rm <volume id>

Clean up a container and its volume (dangerous, as this also removes the data, which you might want to keep):

$ docker rm -fv <docker id>

Configure the database using pgAdmin

Open pgAdmin to configure a new user in PostgreSQL, which will be used for the application.

EF7_PostgreSQL_01

Right click your user and click properties to set the password

EF7_PostgreSQL_02

Now a PostgreSQL database using docker is ready to be used. This is not the only way to do this; a better way would be to use a Dockerfile and docker-compose.

Creating the PostgreSQL docker image using a Dockerfile

Usually you do not want to create the application by hand. You can do everything described above using a Dockerfile and docker-compose. The PostgreSQL docker image for this project is created using a Dockerfile and docker-compose. The Dockerfile uses the latest official postgres docker image and adds the required database scripts to the docker-entrypoint-initdb.d folder inside the container. When PostgreSQL initializes, it executes these scripts.

FROM postgres:latest
EXPOSE 5432
COPY dbscripts/10-init.sql /docker-entrypoint-initdb.d/10-init.sql
COPY dbscripts/20-damienbod.sql /docker-entrypoint-initdb.d/20-database.sql

The docker-compose file defines the image, ports and a named volume for this image. The POSTGRES_PASSWORD environment variable is required.

version: '2'

services:
  damienbodpostgres:
     image: damienbodpostgres
     restart: always
     build:
       context: .
       dockerfile: Dockerfile
     ports:
       - 5432:5432
     environment:
         POSTGRES_PASSWORD: damienbod
     volumes:
       - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:

Now switch to the directory where the docker-compose file is and build.

$ docker-compose build

If you want to deploy, you could create a new docker tag for the postgres container. Use your Docker Hub name if you have one.

$ docker ps -a
$ docker tag damienbodpostgres damienbod/postgres-server

You can check your images and should see it in your list.

$ docker images

Creating the ASP.NET Core application

An ASP.NET Core application was created in VS2017. The EF Core and the PostgreSQL nuget packages were added as required. The Docker support was also added using the Visual Studio tooling.

<Project ToolsVersion="15.0" Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp1.1</TargetFramework>
    <PreserveCompilationContext>true</PreserveCompilationContext>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.NETCore.App" Version="1.1.0" />
    <PackageReference Include="Microsoft.AspNetCore.Mvc" Version="1.1.0" />
    <PackageReference Include="Microsoft.AspNetCore.Routing" Version="1.1.0" />
    <PackageReference Include="Microsoft.AspNetCore.Diagnostics.EntityFrameworkCore" Version="1.1.0" />
    <PackageReference Include="Microsoft.AspNetCore.Server.IISIntegration" Version="1.1.0" />
    <PackageReference Include="Microsoft.AspNetCore.Server.Kestrel" Version="1.1.0" />
    <PackageReference Include="Microsoft.EntityFrameworkCore" Version="1.1.0" />
    <PackageReference Include="Microsoft.EntityFrameworkCore.Relational" Version="1.1.0" />
    <PackageReference Include="Microsoft.EntityFrameworkCore.Tools" Version="1.1.0-preview4-final" />
    <PackageReference Include="Microsoft.Extensions.Configuration.EnvironmentVariables" Version="1.1.0" />
    <PackageReference Include="Microsoft.Extensions.Configuration.FileExtensions" Version="1.1.0" />
    <PackageReference Include="Microsoft.Extensions.Configuration.Json" Version="1.1.0" />
    <PackageReference Include="Microsoft.Extensions.Logging" Version="1.1.0" />
    <PackageReference Include="Microsoft.Extensions.Logging.Console" Version="1.1.0" />
    <PackageReference Include="Microsoft.Extensions.Logging.Debug" Version="1.1.0" />
    <PackageReference Include="Microsoft.Extensions.Options.ConfigurationExtensions" Version="1.1.0" />
    <PackageReference Include="Microsoft.AspNetCore.StaticFiles" Version="1.1.0" />
    <PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL" Version="1.1.0" />
    <PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL.Design" Version="1.1.0" />
  </ItemGroup>
  <ItemGroup>
    <DotNetCliToolReference Include="Microsoft.EntityFrameworkCore.Tools.DotNet" Version="1.1.0-preview4-final" />
    <DotNetCliToolReference Include="Microsoft.VisualStudio.Web.CodeGeneration.Tools" Version="1.0.0-msbuild1-final" />
  </ItemGroup>
  <ItemGroup>
    <Content Update=".dockerignore">
      <DependentUpon>Dockerfile</DependentUpon>
    </Content>
    <Content Update="docker-compose.override.yml">
      <DependentUpon>docker-compose.yml</DependentUpon>
    </Content>
    <Content Update="docker-compose.vs.debug.yml">
      <DependentUpon>docker-compose.yml</DependentUpon>
    </Content>
    <Content Update="docker-compose.vs.release.yml">
      <DependentUpon>docker-compose.yml</DependentUpon>
    </Content>
  </ItemGroup>
</Project>

The EF Core context is set up to access the 2 tables defined in PostgreSQL.

using System;
using System.Linq;
using Microsoft.EntityFrameworkCore;

namespace AspNetCorePostgreSQLDocker
{
    // >dotnet ef migration add testMigration in AspNet5MultipleProject
    public class DomainModelPostgreSqlContext : DbContext
    {
        public DomainModelPostgreSqlContext(DbContextOptions<DomainModelPostgreSqlContext> options) :base(options)
        {
        }

        public DbSet<DataEventRecord> DataEventRecords { get; set; }

        public DbSet<SourceInfo> SourceInfos { get; set; }

        protected override void OnModelCreating(ModelBuilder builder)
        {
            builder.Entity<DataEventRecord>().HasKey(m => m.DataEventRecordId);
            builder.Entity<SourceInfo>().HasKey(m => m.SourceInfoId);

            // shadow properties
            builder.Entity<DataEventRecord>().Property<DateTime>("UpdatedTimestamp");
            builder.Entity<SourceInfo>().Property<DateTime>("UpdatedTimestamp");

            base.OnModelCreating(builder);
        }

        public override int SaveChanges()
        {
            ChangeTracker.DetectChanges();

            updateUpdatedProperty<SourceInfo>();
            updateUpdatedProperty<DataEventRecord>();

            return base.SaveChanges();
        }

        private void updateUpdatedProperty<T>() where T : class
        {
            var modifiedSourceInfo =
                ChangeTracker.Entries<T>()
                    .Where(e => e.State == EntityState.Added || e.State == EntityState.Modified);

            foreach (var entry in modifiedSourceInfo)
            {
                entry.Property("UpdatedTimestamp").CurrentValue = DateTime.UtcNow;
            }
        }
    }
}

The database used was created by the Dockerfile scripts executed when the docker container initializes. This could also be done with EF Core migrations.

$ dotnet ef migrations add postgres-scripts

$ dotnet ef database update

The connection string used in the application must use the network name defined for the database in the docker-compose file. When debugging locally using IIS without docker, you would have to supply a way of switching the connection string hosts. The host postgresserver is defined in this demo, and is used in the connection string.

 "DataAccessPostgreSqlProvider": "User ID=damienbod;Password=damienbod;Host=postgresserver;Port=5432;Database=damienbod;Pooling=true;"

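One way to handle this switch is to read the value from configuration when registering the EF Core context, so only the appsettings entry changes between the docker-compose host name and localhost. A minimal sketch (not part of the sample repository, assuming the usual Startup.Configuration property):

public void ConfigureServices(IServiceCollection services)
{
    // Reads the connection string shown above from appsettings.json;
    // swap the Host value per environment instead of changing code.
    var connectionString = Configuration["DataAccessPostgreSqlProvider"];

    services.AddDbContext<DomainModelPostgreSqlContext>(options =>
        options.UseNpgsql(connectionString));

    services.AddMvc();
}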
Now the application can be built. You need to check that it can be published to the release bin folder, which is used by the docker-compose.

Setup the docker-compose

The docker-compose file for the application defines the web tier, the database server and the network settings for docker. The postgresserver service is built using the damienbodpostgres image. It exposes the standard PostgreSQL port as defined before. The aspnetcorepostgresqldocker web application runs on port 5001 and depends on postgresserver. This is the ASP.NET Core application in Visual Studio 2017.

version: '2'

services:
  postgresserver:
     image: damienbodpostgres
     restart: always
     ports:
       - 5432:5432
     environment:
         POSTGRES_PASSWORD: damienbod
     volumes:
       - pgdata:/var/lib/postgresql/data
     networks:
       - mynetwork

  aspnetcorepostgresqldocker:
     image: aspnetcorepostgresqldocker
     ports:
       - 5001:80
     build:
       context: ./src/AspNetCorePostgreSQLDocker
       dockerfile: Dockerfile
     links:
       - postgresserver
     depends_on:
       - "postgresserver"
     networks:
       - mynetwork

volumes:
  pgdata:

networks:
  mynetwork:
     driver: bridge

Now the application can be started, deployed or tested. The following command will start the application in detached mode.

$ docker-compose up -d

Once the application is started, you can test it using:

http://localhost:5001/index.html

01_postgresqldocker

You can add some data using Postman
02_postgresqldocker

POST http://localhost:5001/api/dataeventrecords
{
  "DataEventRecordId":3,
  "Name":"Funny data",
  "Description":"yes",
  "Timestamp":"2015-12-27T08:31:35Z",
   "SourceInfo":
  {
    "SourceInfoId":0,
    "Name":"Beauty",
    "Description":"second Source",
    "Timestamp":"2015-12-23T08:31:35+01:00",
    "DataEventRecords":[]
  },
 "SourceInfoId":0
}

And the data can be viewed using

http://localhost:5001/api/dataeventrecords

03_postgresqldocker

Or you can view the data using pgAdmin

04_postgresqldocker

Links

https://hub.docker.com/_/postgres/

https://www.andreagrandi.it/2015/02/21/how-to-create-a-docker-image-for-postgresql-and-persist-data/

https://docs.docker.com/engine/examples/postgresql_service/

http://stackoverflow.com/questions/25540711/docker-postgres-pgadmin-local-connection

http://www.postgresql.org

http://www.pgadmin.org/

https://github.com/npgsql/npgsql

https://docs.docker.com/engine/tutorials/dockervolumes/



Hot Module Replacement with Angular and Webpack


This article shows how HMR, or Hot Module Replacement can be used together with Angular and Webpack.

Code: Visual Studio 2015 project | Visual Studio 2017 project

Blogs in this series:

package.json npm file

The webpack-dev-server from Kees Kluskens is added to the devDependencies in the npm package.json file. The webpack-dev-server package implements and supports the HMR feature.

"devDependencies": {
  ...
  "webpack": "^2.2.1",
  "webpack-dev-server": "2.2.1"
},

In the scripts section of the package.json, the start command is configured to start the dotnet server and also the webpack-dev-server with the --hot and the --inline parameters.

See the webpack-dev-server documentation for more information about the possible parameters.

The dotnet server is only required because this demo application uses a Web API service implemented in ASP.NET Core.

"start": "concurrently \"webpack-dev-server --hot --inline --port 8080\" \"dotnet run\" "

webpack dev configuration

The devServer section is added to the module.exports in the webpack.dev.js. This configures the webpack-dev-server as required. The webpack-dev-server configuration can be set here as well as via the command line options, so you as a developer can decide which suits you better.

devServer: {
	historyApiFallback: true,
	contentBase: path.join(__dirname, '/wwwroot/'),
	watchOptions: {
		aggregateTimeout: 300,
		poll: 1000
	}
},

The output in the module.exports also needs to be configured correctly for the webpack-dev-server to work. If the './' path is used in the path option of the output section, the webpack-dev-server will not start.

output: {
	path: __dirname +  '/wwwroot/',
	filename: 'dist/[name].bundle.js',
	chunkFilename: 'dist/[id].chunk.js',
	publicPath: '/'
},

Running the application

Build the application using the webpack dev build. This can be done in the command line. Before building, you need to install all the npm packages using npm install.

$ npm run build-dev

The npm script build-dev is defined in the package.json file and uses the webpack-dev script which does a development build.

"build-dev": "npm run webpack-dev",
"webpack-dev": "set NODE_ENV=development && webpack",

Now the server can be started using the start script.

$ npm start

hmr_angular_01

The application is now running on localhost with port 8080 as defined.

http://localhost:8080/home

If, for example, the color is changed in the app.scss, the bundles will be reloaded in the browser without refreshing.
hmr_angular2_03

Links

https://webpack.js.org/concepts/hot-module-replacement/

https://webpack.js.org/configuration/dev-server/#devserver

https://github.com/webpack/webpack-dev-server

https://www.sitepoint.com/beginners-guide-to-webpack-2-and-module-bundling/



Implementing an Audit Trail using ASP.NET Core and Elasticsearch with NEST


This article shows how an audit trail can be implemented in ASP.NET Core which saves the audit documents to Elasticsearch using NEST.

Code: https://github.com/damienbod/AspNetCoreElasticsearchNestAuditTrail

Should I just use a logger?

That depends. If you just need to save requests, responses and application events, then a logger would be a better solution for this use case. I would use NLog, as it provides everything you need, or could need, when working with ASP.NET Core.

If you only need to save business events/data of the application in the audit trail, then this solution could fit.

Using the Audit Trail

The audit trail is implemented so that it can be used easily. In the Startup class of the ASP.NET Core application, it is added in the ConfigureServices method. The class library provides an extension method, AddAuditTrail, which can be configured as required. It takes 2 parameters: a bool which defines whether a new index is created per day or per month to save the audit trail documents, and an int which defines how many of the previous indices are included in the alias used to select the audit trail items. If this is 0, all indices are included in the search.

Because the audit trail documents are grouped into different indices per day or per month, the amount of documents can be controlled in each index. Usually the application user requires only the last n days or the last 2 months of the audit trails, and so the search does not need to go through all the audit trail documents created since the application began. This makes it possible to optimize the data as required, or even remove or archive old unused audit trail indices.

public void ConfigureServices(IServiceCollection services)
{
	var indexPerMonth = false;
	var amountOfPreviousIndicesUsedInAlias = 3;
	services.AddAuditTrail<CustomAuditTrailLog>(options =>
		options.UseSettings(indexPerMonth, amountOfPreviousIndicesUsedInAlias)
	);

	services.AddMvc();
}

The AddAuditTrail extension method requires a model definition which will be used to save or retrieve the documents in Elasticsearch. The model must implement the IAuditTrailLog interface. This interface just forces you to implement the Timestamp property, which is required for the audit logs.
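Based on that description, the interface is presumably no bigger than this (a sketch, as the interface itself is not shown in the post):

using System;

namespace AuditTrail.Model
{
    public interface IAuditTrailLog
    {
        // The only member the audit trail requires from the model.
        DateTime Timestamp { get; set; }
    }
}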

The model can then be designed and defined as required. NEST attributes can be used for each of the properties in the model. Use the Keyword attribute if a text field should not be analyzed. If you must use enums, then save the string value and NOT the integer value to the persistence layer. If integer values are saved for the enums, they cannot be used without knowing what each integer value represents, making the data dependent on the code.
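For illustration, a hypothetical enum property (not part of the demo model below) could be persisted like this:

// Hypothetical enum, only to illustrate saving the name instead of the number.
public enum AuditState { Created, Updated, Deleted }

// Save the string value so the index stays readable without the application code.
[Keyword]
public string State { get; set; } = AuditState.Created.ToString();

The model used in the demo follows.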

using AuditTrail.Model;
using Nest;
using System;

namespace AspNetCoreElasticsearchNestAuditTrail
{
    public class CustomAuditTrailLog : IAuditTrailLog
    {
        public CustomAuditTrailLog()
        {
            Timestamp = DateTime.UtcNow;
        }

        public DateTime Timestamp { get; set; }

        [Keyword]
        public string Action { get; set; }

        public string Log { get; set; }

        public string Origin { get; set; }

        public string User { get; set; }

        public string Extra { get; set; }
    }
}

The audit trail can then be used anywhere in the application. The IAuditTrailProvider can be added in the constructor of the class and an audit document can be created using the AddLog method.

private readonly IAuditTrailProvider<CustomAuditTrailLog> _auditTrailProvider;

public HomeController(IAuditTrailProvider<CustomAuditTrailLog> auditTrailProvider)
{
	_auditTrailProvider = auditTrailProvider;
}

public IActionResult Index()
{
	var auditTrailLog = new CustomAuditTrailLog()
	{
		User = User.ToString(),
		Origin = "HomeController:Index",
		Action = "Home GET",
		Log = "home page called doing something important enough to be added to the audit log.",
		Extra = "yep"
	};

	_auditTrailProvider.AddLog(auditTrailLog);
	return View();
}

The audit trail documents can be viewed using QueryAuditLogs, which supports paging and uses a simple query search that accepts wildcards. The AuditTrailSearch method returns an MVC view with the audit trail items in the model.

public IActionResult AuditTrailSearch(string searchString, int skip, int amount)
{

	var auditTrailViewModel = new AuditTrailViewModel
	{
		Filter = searchString,
		Skip = skip,
		Size = amount
	};

	if (skip > 0 || amount > 0)
	{
		var paging = new AuditTrailPaging
		{
			Size = amount,
			Skip = skip
		};

		auditTrailViewModel.AuditTrailLogs = _auditTrailProvider.QueryAuditLogs(searchString, paging).ToList();

		return View(auditTrailViewModel);
	}

	auditTrailViewModel.AuditTrailLogs = _auditTrailProvider.QueryAuditLogs(searchString).ToList();
	return View(auditTrailViewModel);
}

How is the Audit Trail implemented?

The AuditTrailExtensions class implements the extension methods used to initialize the audit trail implementation. This class accepts the options and registers the interfaces and classes with the IoC container used by ASP.NET Core.

Generics are used so that any model class can be used to save the audit trail data, as this always changes with each project or application. The type T must implement the interface IAuditTrailLog.

using System;
using Microsoft.Extensions.DependencyInjection.Extensions;
using Microsoft.Extensions.Localization;
using AuditTrail;
using AuditTrail.Model;

namespace Microsoft.Extensions.DependencyInjection
{
    public static class AuditTrailExtensions
    {
        public static IServiceCollection AddAuditTrail<T>(this IServiceCollection services) where T : class, IAuditTrailLog
        {
            if (services == null)
            {
                throw new ArgumentNullException(nameof(services));
            }

            return AddAuditTrail<T>(services, setupAction: null);
        }

        public static IServiceCollection AddAuditTrail<T>(
            this IServiceCollection services,
            Action<AuditTrailOptions> setupAction) where T : class, IAuditTrailLog
        {
            if (services == null)
            {
                throw new ArgumentNullException(nameof(services));
            }

            services.TryAdd(new ServiceDescriptor(
                typeof(IAuditTrailProvider<T>),
                typeof(AuditTrailProvider<T>),
                ServiceLifetime.Transient));

            if (setupAction != null)
            {
                services.Configure(setupAction);
            }
            return services;
        }
    }
}

When a new audit trail log is added, it uses the index defined in the _indexName field.

public void AddLog(T auditTrailLog)
{
	var index = new IndexName()
	{
		Name = _indexName
	};

	var indexRequest = new IndexRequest<T>(auditTrailLog, index);

	var response = _elasticClient.Index(indexRequest);
	if (!response.IsValid)
	{
		throw new ElasticsearchClientException("Add auditlog disaster!");
	}
}

The _indexName field is defined using the date pattern, either days or months depending on your options.

private const string _alias = "auditlog";
private string _indexName = $"{_alias}-{DateTime.UtcNow.ToString("yyyy-MM-dd")}";

index definition per month:

if(_options.Value.IndexPerMonth)
{
	_indexName = $"{_alias}-{DateTime.UtcNow.ToString("yyyy-MM")}";
}

When querying the audit trail logs, a simple query string search is used to find and select the audit trail documents required for the view, so that wildcards can be used. The method accepts a query filter and paging options. If you search without any filter, all documents defined in the alias (the included indices) are returned. Because the simple query syntax is used, the filter can contain operators like AND and OR.

public IEnumerable<T> QueryAuditLogs(string filter = "*", AuditTrailPaging auditTrailPaging = null)
{
	var from = 0;
	var size = 10;
	EnsureAlias();
	if(auditTrailPaging != null)
	{
		from = auditTrailPaging.Skip;
		size = auditTrailPaging.Size;
		if(size > 1000)
		{
			// max limit 1000 items
			size = 1000;
		}
	}
	var searchRequest = new SearchRequest<T>(Indices.Parse(_alias))
	{
		Size = size,
		From = from,
		Query = new QueryContainer(
			new SimpleQueryStringQuery
			{
				Query = filter
			}
		),
		Sort = new List<ISort>
			{
				new SortField { Field = TimestampField, Order = SortOrder.Descending }
			}
	};

	var searchResponse = _elasticClient.Search<T>(searchRequest);

	return searchResponse.Documents;
}
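As a usage sketch (the filter values here are made up), wildcards and the simple query operators can be combined in the filter string:

// Fetch the newest 20 audit logs matching a term starting with "home"
// AND the exact phrase "Home GET" (simple query string syntax).
var paging = new AuditTrailPaging { Skip = 0, Size = 20 };
var logs = _auditTrailProvider.QueryAuditLogs("home* AND \"Home GET\"", paging);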

The alias is also updated in the search query, if required. Depending on your configuration, the alias uses all the audit trail indices or just the last n days or n months. This check uses a static field. If the alias needs to be updated, the new alias is created, which also deletes the old one.

private void EnsureAlias()
{
	if (_options.Value.IndexPerMonth)
	{
		if (aliasUpdated.Date < DateTime.UtcNow.AddMonths(-1).Date)
		{
			aliasUpdated = DateTime.UtcNow;
			CreateAlias();
		}
	}
	else
	{
		if (aliasUpdated.Date < DateTime.UtcNow.AddDays(-1).Date)
		{
			aliasUpdated = DateTime.UtcNow;
			CreateAlias();
		}
	}
}

Here’s how the alias is created for all indices of the audit trail.

private void CreateAliasForAllIndices()
{
	var response = _elasticClient.AliasExists(new AliasExistsRequest(new Names(new List<string> { _alias })));
	if (!response.IsValid)
	{
		throw response.OriginalException;
	}

	if (response.Exists)
	{
		_elasticClient.DeleteAlias(new DeleteAliasRequest(Indices.Parse($"{_alias}-*"), _alias));
	}

	var responseCreateIndex = _elasticClient.PutAlias(new PutAliasRequest(Indices.Parse($"{_alias}-*"), _alias));
	if (!responseCreateIndex.IsValid)
	{
		throw response.OriginalException;
	}
}

The full AuditTrailProvider class which implements the audit trail.

using AuditTrail.Model;
using Elasticsearch.Net;
using Microsoft.Extensions.Options;
using Nest;
using Newtonsoft.Json.Converters;
using System;
using System.Collections.Generic;
using System.Linq;

namespace AuditTrail
{
    public class AuditTrailProvider<T> : IAuditTrailProvider<T> where T : class
    {
        private const string _alias = "auditlog";
        private string _indexName = $"{_alias}-{DateTime.UtcNow.ToString("yyyy-MM-dd")}";
        private static Field TimestampField = new Field("timestamp");
        private readonly IOptions<AuditTrailOptions> _options;

        private ElasticClient _elasticClient { get; }

        public AuditTrailProvider(
           IOptions<AuditTrailOptions> auditTrailOptions)
        {
            _options = auditTrailOptions ?? throw new ArgumentNullException(nameof(auditTrailOptions));

            if(_options.Value.IndexPerMonth)
            {
                _indexName = $"{_alias}-{DateTime.UtcNow.ToString("yyyy-MM")}";
            }

            var pool = new StaticConnectionPool(new List<Uri> { new Uri("http://localhost:9200") });
            var connectionSettings = new ConnectionSettings(
                pool,
                new HttpConnection(),
                new SerializerFactory((jsonSettings, nestSettings) => jsonSettings.Converters.Add(new StringEnumConverter())))
              .DisableDirectStreaming();

            _elasticClient = new ElasticClient(connectionSettings);
        }

        public void AddLog(T auditTrailLog)
        {
            var index = new IndexName()
            {
                Name = _indexName
            };

            var indexRequest = new IndexRequest<T>(auditTrailLog, index);

            var response = _elasticClient.Index(indexRequest);
            if (!response.IsValid)
            {
                throw new ElasticsearchClientException("Add auditlog disaster!");
            }
        }

        public long Count(string filter = "*")
        {
            EnsureAlias();
            var searchRequest = new SearchRequest<T>(Indices.Parse(_alias))
            {
                Size = 0,
                Query = new QueryContainer(
                    new SimpleQueryStringQuery
                    {
                        Query = filter
                    }
                ),
                Sort = new List<ISort>
                    {
                        new SortField { Field = TimestampField, Order = SortOrder.Descending }
                    }
            };

            var searchResponse = _elasticClient.Search<AuditTrailLog>(searchRequest);

            return searchResponse.Total;
        }

        public IEnumerable<T> QueryAuditLogs(string filter = "*", AuditTrailPaging auditTrailPaging = null)
        {
            var from = 0;
            var size = 10;
            EnsureAlias();
            if(auditTrailPaging != null)
            {
                from = auditTrailPaging.Skip;
                size = auditTrailPaging.Size;
                if(size > 1000)
                {
                    // max limit 1000 items
                    size = 1000;
                }
            }
            var searchRequest = new SearchRequest<T>(Indices.Parse(_alias))
            {
                Size = size,
                From = from,
                Query = new QueryContainer(
                    new SimpleQueryStringQuery
                    {
                        Query = filter
                    }
                ),
                Sort = new List<ISort>
                    {
                        new SortField { Field = TimestampField, Order = SortOrder.Descending }
                    }
            };

            var searchResponse = _elasticClient.Search<T>(searchRequest);

            return searchResponse.Documents;
        }

        private void CreateAliasForAllIndices()
        {
            var response = _elasticClient.AliasExists(new AliasExistsRequest(new Names(new List<string> { _alias })));
            if (!response.IsValid)
            {
                throw response.OriginalException;
            }

            if (response.Exists)
            {
                _elasticClient.DeleteAlias(new DeleteAliasRequest(Indices.Parse($"{_alias}-*"), _alias));
            }

            var responseCreateIndex = _elasticClient.PutAlias(new PutAliasRequest(Indices.Parse($"{_alias}-*"), _alias));
            if (!responseCreateIndex.IsValid)
            {
                throw response.OriginalException;
            }
        }

        private void CreateAlias()
        {
            if (_options.Value.AmountOfPreviousIndicesUsedInAlias > 0)
            {
                CreateAliasForLastNIndices(_options.Value.AmountOfPreviousIndicesUsedInAlias);
            }
            else
            {
                CreateAliasForAllIndices();
            }
        }

        private void CreateAliasForLastNIndices(int amount)
        {
            var responseCatIndices = _elasticClient.CatIndices(new CatIndicesRequest(Indices.Parse($"{_alias}-*")));
            var records = responseCatIndices.Records.ToList();
            List<string> indicesToAddToAlias = new List<string>();
            for(int i = amount;i>0;i--)
            {
                if (_options.Value.IndexPerMonth)
                {
                    var indexName = $"{_alias}-{DateTime.UtcNow.AddMonths(-i + 1).ToString("yyyy-MM")}";
                    if(records.Exists(t => t.Index == indexName))
                    {
                        indicesToAddToAlias.Add(indexName);
                    }
                }
                else
                {
                    var indexName = $"{_alias}-{DateTime.UtcNow.AddDays(-i + 1).ToString("yyyy-MM-dd")}";
                    if (records.Exists(t => t.Index == indexName))
                    {
                        indicesToAddToAlias.Add(indexName);
                    }
                }
            }

            var response = _elasticClient.AliasExists(new AliasExistsRequest(new Names(new List<string> { _alias })));
            if (!response.IsValid)
            {
                throw response.OriginalException;
            }

            if (response.Exists)
            {
                _elasticClient.DeleteAlias(new DeleteAliasRequest(Indices.Parse($"{_alias}-*"), _alias));
            }

            Indices multipleIndicesFromStringArray = indicesToAddToAlias.ToArray();
            var responseCreateIndex = _elasticClient.PutAlias(new PutAliasRequest(multipleIndicesFromStringArray, _alias));
            if (!responseCreateIndex.IsValid)
            {
                throw responseCreateIndex.OriginalException;
            }
        }

        private static DateTime aliasUpdated = DateTime.UtcNow.AddYears(-50);

        private void EnsureAlias()
        {
            if (_options.Value.IndexPerMonth)
            {
                if (aliasUpdated.Date < DateTime.UtcNow.AddMonths(-1).Date)
                {
                    aliasUpdated = DateTime.UtcNow;
                    CreateAlias();
                }
            }
            else
            {
                if (aliasUpdated.Date < DateTime.UtcNow.AddDays(-1).Date)
                {
                    aliasUpdated = DateTime.UtcNow;
                    CreateAlias();
                }
            }
        }
    }
}

Testing the audit log

The created audit trails can be checked using the following HTTP GET requests:

Counts all the audit trail entries in the alias.
http://localhost:9200/auditlog/_count

Shows all the audit trail indices. You can count all the documents from the indices used in the alias and it must match the count from the alias.
http://localhost:9200/_cat/indices/auditlog*

You can also start the application and the AuditTrail logs can be displayed in the Audit Trail logs MVC view.

01_audittrailview

This view is just a quick test; for a proper implementation, you would have to localize the timestamp display and add proper paging in the view.

Notes, improvements

If lots of audit trail documents are written at once, a bulk insert could be used to add the documents in batches, as most loggers do. You should also define a strategy for how old audit trail indices should be cleaned up or archived. The creation of the alias could also be optimized, depending on your audit trail data and how you clean up old audit trail indices.
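A bulk variant could look something like this (a sketch using NEST's IndexMany, not part of the sample code):

public void AddLogs(IEnumerable<T> auditTrailLogs)
{
    // One bulk request instead of one index request per document.
    var response = _elasticClient.IndexMany(auditTrailLogs, _indexName);
    if (response.Errors)
    {
        throw new ElasticsearchClientException("Add auditlog bulk disaster!");
    }
}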

Links:

https://www.elastic.co/guide/en/elasticsearch/reference/5.2/indices-aliases.html

https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-simple-query-string-query.html

https://docs.microsoft.com/en-us/aspnet/core/

https://www.elastic.co/products/elasticsearch

https://github.com/elastic/elasticsearch-net

https://www.nuget.org/packages/NLog.Web.AspNetCore/


.NET Core logging to MySQL using NLog


This article shows how to log to MySQL in a .NET Core application using NLog.

Code: https://github.com/damienbod/AspNetCoreNlog

NLog posts in this series:

  1. ASP.NET Core logging with NLog and Microsoft SQL Server
  2. ASP.NET Core logging with NLog and Elasticsearch
  3. Settings the NLog database connection string in the ASP.NET Core appsettings.json
  4. ASP.NET Core, logging to MySQL using NLog

Set up the MySQL database

MySQL Workbench can be used to add the schema ‘nlog’ which will be used for logging to the MySQL database. The user ‘damienbod’ is also required, and it must match the user defined in the connection string. If you configure the MySQL database differently, then you need to change the connection string in the nlog.config file.

nlogmysql_02

You also need to create a log table. The following script can be used. If you decide to use NLog.Web in an ASP.NET Core application and add some extra properties or fields to the logs, then this script needs to be extended, along with the database target in the nlog.config.

CREATE TABLE `log` (
  `Id` int(10) unsigned NOT NULL AUTO_INCREMENT,
  `Application` varchar(50) DEFAULT NULL,
  `Logged` datetime DEFAULT NULL,
  `Level` varchar(50) DEFAULT NULL,
  `Message` varchar(512) DEFAULT NULL,
  `Logger` varchar(250) DEFAULT NULL,
  `Callsite` varchar(512) DEFAULT NULL,
  `Exception` varchar(512) DEFAULT NULL,
  PRIMARY KEY (`Id`)
) ENGINE=InnoDB AUTO_INCREMENT=2 DEFAULT CHARSET=utf8;

Add NLog and the MySQL provider to the project.

The MySql.Data pre-release NuGet package can be used to log to MySQL. Add this to your project.

nlogmysql

The NLog.Web.AspNetCore package also needs to be added or just NLog if you do not require any web extensions.
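The wiring into the ASP.NET Core logging pipeline then looks roughly like this (a sketch, assuming the NLog.Extensions.Logging and NLog.Web.AspNetCore packages of this release):

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
    // Add NLog as a provider to the ASP.NET Core logging pipeline.
    loggerFactory.AddNLog();

    // Register the ASP.NET Core specific layout renderers.
    app.AddNLogWeb();

    // Load the nlog.config containing the MySQL database target.
    env.ConfigureNLog("nlog.config");

    app.UseMvc();
}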

nlog.config

The database target needs to be configured to log to MySQL. The database provider is set to use the MySql.Data package which was downloaded using NuGet. If you're using a different MySQL provider, this needs to be changed. The connection string is also set here, and matches what was configured previously in the MySQL database using Workbench. If you read the connection string from the app settings, an NLog variable can be used here, as shown after the target configuration.

  <target name="database" xsi:type="Database"
              dbProvider="MySql.Data.MySqlClient.MySqlConnection, MySql.Data"
              connectionString="server=localhost;Database=nlog;user id=damienbod;password=1234"
             >

          <commandText>
              insert into nlog.log (
              Application, Logged, Level, Message,
              Logger, CallSite, Exception
              ) values (
              @Application, @Logged, @Level, @Message,
              @Logger, @Callsite, @Exception
              );
          </commandText>

          <parameter name="@application" layout="AspNetCoreNlog" />
          <parameter name="@logged" layout="${date}" />
          <parameter name="@level" layout="${level}" />
          <parameter name="@message" layout="${message}" />

          <parameter name="@logger" layout="${logger}" />
          <parameter name="@callSite" layout="${callsite:filename=true}" />
          <parameter name="@exception" layout="${exception:tostring}" />
      </target>
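
If the connection string should be read from the app settings instead, a minimal sketch could set an NLog variable at startup, which the database target then references as connectionString="${var:connectionString}". The "NLogDb" configuration key is hypothetical here:

// In Startup.Configure: expose the configured connection string to NLog.
// "NLogDb" is a hypothetical entry in the ConnectionStrings section of appsettings.json.
LogManager.Configuration.Variables["connectionString"] =
    Configuration.GetConnectionString("NLogDb");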

NLog can then be used in the application.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using NLog;
using NLog.Targets;

namespace ConsoleNLog
{
    public class Program
    {
        public static void Main(string[] args)
        {

            LogManager.Configuration.Variables["configDir"] = "C:\\git\\damienbod\\AspNetCoreNlog\\Logs";

            var logger = LogManager.GetLogger("console");
            logger.Warn("console logging is great");

            Console.WriteLine("log sent");
            Console.ReadKey();
        }
    }
}
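
For an ASP.NET Core application using NLog.Web.AspNetCore, the wiring happens in the Startup class instead; a minimal sketch, assuming the same configDir variable as in the console sample above:

using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.Logging;
using NLog;
using NLog.Extensions.Logging;
using NLog.Web;

public class Startup
{
    public void Configure(IApplicationBuilder app, ILoggerFactory loggerFactory)
    {
        // Forward ASP.NET Core ILogger calls to NLog.
        loggerFactory.AddNLog();

        // Register NLog.Web so the aspnet-* layout renderers work.
        app.AddNLogWeb();

        // Set the NLog variable used by the targets in nlog.config.
        LogManager.Configuration.Variables["configDir"] = "C:\\git\\damienbod\\AspNetCoreNlog\\Logs";
    }
}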

Full nlog.config file:
https://github.com/damienbod/AspNetCoreNlog/blob/master/src/ConsoleNLogMySQL/nlog.config

Links

https://github.com/nlog/NLog/wiki/Database-target

https://github.com/NLog/NLog.Extensions.Logging

https://github.com/NLog

https://github.com/NLog/NLog/blob/38aef000f916bd5ffd8b80a5576afa2423192e84/examples/targets/Configuration%20API/Database/MSSQL/Example.cs

https://docs.asp.net/en/latest/fundamentals/logging.html

https://msdn.microsoft.com/en-us/magazine/mt694089.aspx

https://docs.asp.net/en/latest/fundamentals/configuration.html


Testing an ASP.NET Core MVC Protobuf API using HTTPClient and xUnit


The article shows how to test an ASP.NET Core MVC API using xUnit and an HttpClient, with Protobuf as the content formatter.

Code: https://github.com/damienbod/AspNetMvc6ProtobufFormatters

Posts in this series:

The test project tests the ASP.NET Core API produced here. xUnit is used as the test framework. The xUnit dependencies, as well as the Microsoft.AspNetCore.TestHost package, can be added to the test project using NuGet in Visual Studio 2017. Microsoft provides good documentation about integration testing ASP.NET Core.

When the NuGet packages have been added, you can view them in the csproj file, or install and update them directly in this file. A reference to the project containing the API is also added to the test project.

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netcoreapp1.1</TargetFramework>
    <AssemblyName>AspNetCoreProtobuf.IntegrationTests</AssemblyName>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="15.0.0" />
    <PackageReference Include="xunit.runner.console" Version="2.2.0" />
    <PackageReference Include="xunit.runner.visualstudio" Version="2.2.0" />
    <PackageReference Include="xunit" Version="2.2.0" />
    <PackageReference Include="Microsoft.AspNetCore.TestHost" Version="1.1.1" />
    <PackageReference Include="protobuf-net" Version="2.1.0" />
    <PackageReference Include="xunit.runners" Version="2.0.0" />
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\AspNetCoreProtobuf\AspNetCoreProtobuf.csproj" />
  </ItemGroup>

  <ItemGroup>
    <Service Include="{82a7f48d-3b50-4b1e-b82e-3ada8210c358}" />
  </ItemGroup>

</Project>

The TestServer is used to test the ASP.NET Core API. This is set up once for all the API tests.

private readonly TestServer _server;
private readonly HttpClient _client;

public ProtobufApiTests()
{
	// TestServer hosts the API in memory, so no Kestrel server is required here.
	_server = new TestServer(
		new WebHostBuilder()
		.UseStartup<Startup>());
	_client = _server.CreateClient();
}
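
The tests serialize and deserialize a ProtobufModelDto. A minimal sketch of this DTO, assuming protobuf-net contract attributes (the actual class is defined in the referenced API project):

using ProtoBuf;

[ProtoContract]
public class ProtobufModelDto
{
    [ProtoMember(1)]
    public int Id { get; set; }

    [ProtoMember(2)]
    public string Name { get; set; }

    [ProtoMember(3)]
    public string StringValue { get; set; }
}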

HTTP GET request test

The GetProtobufDataAndCheckProtobufContentTypeMediaType test sends an HTTP GET request to the test server and requests the content as application/x-protobuf. The response is deserialized using protobuf-net, and both the content type header and the expected result are checked.

[Fact]
public async Task GetProtobufDataAndCheckProtobufContentTypeMediaType()
{
	// Act
	_client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/x-protobuf"));
	var response = await _client.GetAsync("/api/values/1");
	response.EnsureSuccessStatusCode();

	var result = ProtoBuf.Serializer.Deserialize<ProtobufModelDto>(await response.Content.ReadAsStreamAsync());

	// Assert
	Assert.Equal("application/x-protobuf", response.Content.Headers.ContentType.MediaType );
	Assert.Equal("My first MVC 6 Protobuf service", result.StringValue);
}

HTTP POST request test

The PostProtobufData test method sends an HTTP POST request to the test server with protobuf-serialized content. The status code of the response is validated.

[Fact]
public async Task PostProtobufData()
{
	_client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/x-protobuf"));

	var stream = new MemoryStream();
	ProtoBuf.Serializer.Serialize<ProtobufModelDto>(stream, new ProtobufModelDto
	{
		Id = 2,
		Name = "lovely data",
		StringValue = "amazing this ah"
	});

	// Reset the stream position, otherwise the StreamContent is sent empty.
	stream.Position = 0;

	HttpContent data = new StreamContent(stream);
	data.Headers.ContentType = new MediaTypeHeaderValue("application/x-protobuf");

	// HTTP POST with a Protobuf request body
	var responseForPost = await _client.PostAsync("api/Values", data);

	Assert.True(responseForPost.IsSuccessStatusCode);
}

The tests can be executed or debugged in Visual Studio using the Test Explorer.

The tests can also be run with dotnet test on the command line.

C:\git\damienbod\AspNetCoreProtobufFormatters\src\AspNetCoreProtobuf.IntegrationTests>dotnet test
Build started, please wait...
Build completed.

Test run for C:\git\damienbod\AspNetCoreProtobufFormatters\src\AspNetCoreProtobuf.IntegrationTests\bin\Debug\netcoreapp1.1\AspNetCoreProtobuf.IntegrationTests.dll(.NETCoreApp,Version=v1.1)
Microsoft (R) Test Execution Command Line Tool Version 15.0.0.0
Copyright (c) Microsoft Corporation. All rights reserved.

Starting test execution, please wait...
[xUnit.net 00:00:00.5821132]   Discovering: AspNetCoreProtobuf.IntegrationTests
[xUnit.net 00:00:00.6841246]   Discovered:  AspNetCoreProtobuf.IntegrationTests
[xUnit.net 00:00:00.7273897]   Starting:    AspNetCoreProtobuf.IntegrationTests
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[1]
      Request starting HTTP/1.1 GET http://
info: Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker[1]
      Executing action method AspNetCoreProtobuf.Controllers.ValuesController.Post (AspNetCoreProtobuf) with arguments (AspNetCoreProtobuf.Model.ProtobufModelDto) - ModelState is Valid
info: Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker[2]
      Executed action AspNetCoreProtobuf.Controllers.ValuesController.Post (AspNetCoreProtobuf) in 137.2264ms
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[2]
      Request finished in 346.8796ms 200
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[1]
      Request starting HTTP/1.1 GET http://
info: Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker[1]
      Executing action method AspNetCoreProtobuf.Controllers.ValuesController.Get (AspNetCoreProtobuf) with arguments (1) - ModelState is Valid
info: Microsoft.AspNetCore.Mvc.Internal.ObjectResultExecutor[1]
      Executing ObjectResult, writing value Microsoft.AspNetCore.Mvc.ControllerContext.
info: Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker[2]
      Executed action AspNetCoreProtobuf.Controllers.ValuesController.Get (AspNetCoreProtobuf) in 39.0599ms
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[2]
      Request finished in 43.2983ms 200 application/x-protobuf
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[1]
      Request starting HTTP/1.1 GET http://
info: Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker[1]
      Executing action method AspNetCoreProtobuf.Controllers.ValuesController.Get (AspNetCoreProtobuf) with arguments (1) - ModelState is Valid
info: Microsoft.AspNetCore.Mvc.Internal.ObjectResultExecutor[1]
      Executing ObjectResult, writing value Microsoft.AspNetCore.Mvc.ControllerContext.
info: Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker[2]
      Executed action AspNetCoreProtobuf.Controllers.ValuesController.Get (AspNetCoreProtobuf) in 1.4974ms
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[2]
      Request finished in 3.6715ms 200 application/x-protobuf
[xUnit.net 00:00:01.5669956]   Finished:    AspNetCoreProtobuf.IntegrationTests

Total tests: 3. Passed: 3. Failed: 0. Skipped: 0.
Test Run Successful.
Test execution time: 2.7499 Seconds

AppVeyor CI

The project can then be connected to any build server. AppVeyor is an easy one to set up and works well with GitHub projects. Create an account and select the GitHub repository to build. Add an appveyor.yml file to the root of your project and configure it as required. Docs can be found here:
https://www.appveyor.com/docs/build-configuration/

image: Visual Studio 2017 RC
init:
  - git config --global core.autocrlf true
install:
  - ECHO %APPVEYOR_BUILD_WORKER_IMAGE%
  - dotnet --version
  - dotnet restore
build_script:
- dotnet build
before_build:
- appveyor-retry dotnet restore -v Minimal
test_script:
- cd src/AspNetCoreProtobuf.IntegrationTests
- dotnet test

The AppVeyor badges can then be used in your project md file.

|                           | Build                                                                                                                                                             |
| ------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| .NET Core                 | [![Build status](https://ci.appveyor.com/api/projects/status/ihtrq4u81rtsty9k?svg=true)](https://ci.appveyor.com/project/damienbod/aspnetmvc6protobufformatters)  |

This would then be displayed in GitHub as follows:

Links

https://developers.google.com/protocol-buffers/docs/csharptutorial

http://www.stackoverflow.com/questions/7774155/deserialize-long-string-with-protobuf-for-c-sharp-doesnt-work-properly-for-me

https://xunit.github.io/

https://www.appveyor.com/docs/build-configuration/

https://www.nuget.org/packages/protobuf-net/

https://github.com/mgravell/protobuf-net

http://teelahti.fi/using-google-proto3-with-aspnet-mvc/

https://github.com/damienpontifex/ProtobufFormatter/tree/master/src/ProtobufFormatter

http://www.strathweb.com/2014/11/formatters-asp-net-mvc-6/

http://blogs.msdn.com/b/webdev/archive/2014/11/24/content-negotiation-in-mvc-5-or-how-can-i-just-write-json.aspx

https://github.com/WebApiContrib/WebApiContrib.Formatting.ProtoBuf

https://damienbod.wordpress.com/2014/01/11/using-protobuf-net-media-formatter-with-web-api-2/

https://docs.microsoft.com/en-us/aspnet/core/testing/integration-testing


ASP.NET Core Error Management with elmah.io


This article shows how to use elmah.io error management with an ASP.NET Core application. The error and log data are added to elmah.io using the different elmah.io NuGet packages, directly from ASP.NET Core and also using an NLog elmah.io target.

Code: https://github.com/damienbod/AspNetCoreElmah

elmah.io is an error management system which can help you monitor, find and fix application problems fast. While structured logging is supported, the main focus of elmah.io is handling errors.

Getting started with Elmah.Io

Before you can start logging to elmah.io, you need to create an account and set up a log. Refer to the documentation here.

Logging exceptions, errors with Elmah.Io.AspNetCore and Elmah.Io.Extensions.Logging

You can add logs and exceptions to elmah.io directly from an ASP.NET Core application using the Elmah.Io.AspNetCore and the Elmah.Io.Extensions.Logging NuGet packages. These packages can be added to the project using the NuGet package manager.

Or you can just add the packages directly in the csproj file.

<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>netcoreapp1.1</TargetFramework>
  </PropertyGroup>

  <PropertyGroup>
    <UserSecretsId>AspNetCoreElmah-c23d2237a4-eb8832a1-452ac4</UserSecretsId>
  </PropertyGroup>

  <ItemGroup>
    <Content Include="wwwroot\index.html" />
  </ItemGroup>
  <ItemGroup>
    <PackageReference Include="Elmah.Io.AspNetCore" Version="3.2.39-pre" />
    <PackageReference Include="Elmah.Io.Extensions.Logging" Version="3.1.22-pre" />
    <PackageReference Include="Microsoft.ApplicationInsights.AspNetCore" Version="2.0.0" />
    <PackageReference Include="Microsoft.AspNetCore" Version="1.1.1" />
    <PackageReference Include="Microsoft.AspNetCore.Mvc" Version="1.1.2" />
    <PackageReference Include="Microsoft.AspNetCore.StaticFiles" Version="1.1.1" />
    <PackageReference Include="Microsoft.Extensions.Logging.Debug" Version="1.1.1" />
    <PackageReference Include="Microsoft.Extensions.Configuration.UserSecrets" Version="1.1.1" />
  </ItemGroup>
  <ItemGroup>
    <DotNetCliToolReference Include="Microsoft.VisualStudio.Web.CodeGeneration.Tools" Version="1.0.0" />
  </ItemGroup>

</Project>

The Elmah.Io.AspNetCore package is used to catch unhandled exceptions in the application. This is configured in the Startup class. The OnMessage method is used to set specific properties in the messages which are sent to elmah.io. Setting the Hostname and the Application properties is very useful when evaluating the logs in elmah.io.

app.UseElmahIo(
	_elmahAppKey,
	new Guid(_elmahLogId),
	new ElmahIoSettings()
	{
		OnMessage = msg =>
		{
			msg.Version = "1.0.0";
			msg.Hostname = "dev";
			msg.Application = "AspNetCoreElmah";
		}
	});

The Elmah.Io.Extensions.Logging package is used to log messages using the built-in ILoggerFactory. You should only send warning, error and critical messages rather than logging everything to elmah.io, although it is possible to do so. Again the OnMessage method can be used to set the Hostname and the Application name for each log.

loggerFactory.AddElmahIo(
	_elmahAppKey,
	new Guid(_elmahLogId),
	new FilterLoggerSettings
	{
		{"ValuesController", LogLevel.Information}
	},
	new ElmahIoProviderOptions
	{
		OnMessage = msg =>
		{
			msg.Version = "1.0.0";
			msg.Hostname = "dev";
			msg.Application = "AspNetCoreElmah";
		}
	});

Using User Secrets for the elmah.io API-KEY and LogID

ASP.NET Core user secrets can be used to store the elmah.io API-KEY and the LogID, as you don’t want to commit these to your source control. The AddUserSecrets method can be used to add these to the configuration.

private string _elmahAppKey;
private string _elmahLogId;

public Startup(IHostingEnvironment env)
{
	var builder = new ConfigurationBuilder()
		.SetBasePath(env.ContentRootPath)
		.AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
		.AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
		.AddEnvironmentVariables();

	if (env.IsDevelopment())
	{
		builder.AddUserSecrets("AspNetCoreElmah-c23d2237a4-eb8832a1-452ac4");
	}

	Configuration = builder.Build();
}
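
The secret values themselves can be set from the command line in the project directory, for example with dotnet user-secrets set ElmahAppKey <your-api-key> and dotnet user-secrets set ElmahLogId <your-log-id>, assuming the Microsoft.Extensions.SecretManager.Tools CLI tool is referenced in the project.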

The user secret properties can then be used in the ConfigureServices method.

 public void ConfigureServices(IServiceCollection services)
{
	_elmahAppKey = Configuration["ElmahAppKey"];
	_elmahLogId = Configuration["ElmahLogId"];
	// Add framework services.
	services.AddMvc();
}

A dummy exception is thrown in this example, which then sends the data to elmah.io.

[HttpGet("{id}")]
public string Get(int id)
{
	throw new System.Exception("something terrible bad here!");
	return "value";
}

Logging exceptions, errors to elmah.io using NLog

NLog with the Elmah.Io.NLog target can also be used in ASP.NET Core to send messages to elmah.io. The packages can be added using the NuGet package manager.

Or you can just add it to the csproj file.

<PackageReference Include="Elmah.Io.NLog" Version="3.1.28-pre" />
<PackageReference Include="NLog.Web.AspNetCore" Version="4.3.1" />

NLog for ASP.NET Core applications can be configured in the Startup class. You need to set the target properties with the elmah.io API-KEY and also the LogId. You could also do this in the nlog.config file.

loggerFactory.AddNLog();
app.AddNLogWeb();

LogManager.Configuration.Variables["configDir"] = "C:\\git\\damienbod\\AspNetCoreElmah\\Logs";

foreach (ElmahIoTarget target in LogManager.Configuration.AllTargets.Where(t => t is ElmahIoTarget))
{
	target.ApiKey = _elmahAppKey;
	target.LogId = _elmahLogId;
}

LogManager.ReconfigExistingLoggers();

The IHttpContextAccessor and the HttpContextAccessor also need to be registered with the default IoC container in ASP.NET Core to get the extra information from the web requests.

public void ConfigureServices(IServiceCollection services)
{
	services.AddSingleton<IHttpContextAccessor, HttpContextAccessor>();

	_elmahAppKey = Configuration["ElmahAppKey"];
	_elmahLogId = Configuration["ElmahLogId"];

	// Add framework services.
	services.AddMvc();
}

The nlog.config file can then be configured for the target with the elmah.io type. The application property is also set, which is useful in elmah.io.

<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      autoReload="true"
      internalLogLevel="Warn"
      internalLogFile="C:\git\damienbod\AspNetCoreElmah\Logs\internal-nlog.txt">

  <extensions>
    <add assembly="NLog.Web.AspNetCore"/>
    <add assembly="Elmah.Io.NLog"/>
  </extensions>


  <targets>
    <target name="elmahio" type="elmah.io" apiKey="API_KEY" logId="LOG_ID" application="AspNetCoreElmahUI"/>

    <target xsi:type="File" name="allfile" fileName="${var:configDir}\nlog-all.log"
                layout="${longdate}|${event-properties:item=EventId.Id}|${logger}|${uppercase:${level}}|TraceId=${aspnet-traceidentifier}| url: ${aspnet-request-url} | action: ${aspnet-mvc-action} |${message} ${exception}" />

    <target xsi:type="File" name="ownFile-web" fileName="${var:configDir}\nlog-own.log"
             layout="${longdate}|${event-properties:item=EventId.Id}|${logger}|${uppercase:${level}}|TraceId=${aspnet-traceidentifier}| url: ${aspnet-request-url} | action: ${aspnet-mvc-action} | ${message} ${exception}" />

    <target xsi:type="Null" name="blackhole" />

  </targets>

  <rules>
    <logger name="*" minlevel="Warn" writeTo="elmahio" />
    <!--All logs, including from Microsoft-->
    <logger name="*" minlevel="Trace" writeTo="allfile" />

    <!--Skip Microsoft logs and so log only own logs-->
    <logger name="Microsoft.*" minlevel="Trace" writeTo="blackhole" final="true" />
    <logger name="*" minlevel="Trace" writeTo="ownFile-web" />
  </rules>
</nlog>


The About method in the AspNetCoreElmahUI application calls the AspNetCoreElmah API method which throws the dummy exception, so exceptions are sent from both applications.

public async Task<IActionResult> About()
{
	_logger.LogInformation("HomeController About called");

	// This call hits the API method which throws the dummy exception.
	var client = new HttpClient();
	var response = await client.GetAsync("http://localhost:37209/api/values/1");
	response.EnsureSuccessStatusCode();
	var responseString = System.Text.Encoding.UTF8.GetString(
		await response.Content.ReadAsByteArrayAsync()
	);
	ViewData["Message"] = "Your application description page.";

	return View();
}

Now both applications can be started, and the errors can be viewed in the elmah.io dashboard.

When you open the dashboard in elmah.io and access your logs, you can view the exceptions.

Here’s the log sent from the AspNetCoreElmah application.

Here’s the log sent from the AspNetCoreElmahUI application using NLog with Elmah.Io.


Links

https://elmah.io/

https://github.com/elmahio/elmah.io.nlog

http://nlog-project.org/

