Some time ago, I wrote this article about creating a Minimal API CRUD using Entity Framework. That project worked fine, but something was missing: Authentication and Authorization. When dealing with an API that can access sensitive data, it’s crucial to restrict access to ensure that only authorized users can access it.

We will build on that project and add authentication and authorization to its endpoints to secure them.

Authentication and Authorization

First, let’s differentiate these two concepts:

Authentication is the process of verifying the identity of a user. Its primary goal is to answer the question, "Who are you?". When the user presents some kind of credentials, such as a username and password, or more advanced mechanisms like biometric data (fingerprint, facial recognition, etc.), security tokens, or smart cards, the service compares them with the stored values to validate the user’s identity and grant access to the system.

Once a user is authenticated, the system must verify whether they are allowed to use the resources, and at what level. In other words, the primary goal of authorization is to answer the question, "What are you allowed to do?"
Authorization is based on the user’s role, permissions, and privileges within the system. Permissions are defined in access control policies and can be specific to individual users or user groups. For example, some users may have read-only access, while others may have read and write access, and some may have administrative privileges.

We will add both authentication and authorization to our project to safeguard the data.

Minimal APIs support several authentication strategies. In this article, we will focus on implementing token-based authentication using JWT (JSON Web Tokens). JWTs are a popular choice due to their simplicity, compactness, and ease of use.

Enabling authentication in the project

We will work with the project at https://github.com/bsonnino/CustomerService. Clone the project to your local disk, run it, and open a browser at https://localhost:7191/swagger/index.html (the port number may change). You’ll be greeted by the Swagger test page, which lets you test the APIs and see that they’re not restricted.

To enable authentication in the project, add the package Microsoft.AspNetCore.Authentication.JwtBearer using the following command:

dotnet add package Microsoft.AspNetCore.Authentication.JwtBearer

and then register the authentication and authorization middlewares with

using Microsoft.EntityFrameworkCore;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();
builder.Services.AddDbContext<CustomerDbContext>();
builder.Services.AddAuthentication().AddJwtBearer();
builder.Services.AddAuthorization();

var app = builder.Build();

app.UseSwagger();
app.UseSwaggerUI();
app.UseAuthentication();
app.UseAuthorization();

Running the program at this stage won’t yield any visible changes, since we haven’t yet secured our endpoints. There are two ways to do that:

  • Use the [Authorize] attribute on the handler method for the endpoint
  • Add a call to RequireAuthorization to the endpoint

The first method will be used for the customers endpoint, and the second method will be applied to the customers/{id} endpoint:

app.MapGet("/customers", [Authorize] async (CustomerDbContext context) =>
{
    logger.LogInformation("Getting customers...");
    var customers = await context.Customer.ToListAsync();
    logger.LogInformation("Retrieved {Count} customers", customers.Count);
    return Results.Ok(customers);
});

app.MapGet("/customers/{id}", async (string id, CustomerDbContext context) =>
{
    var customer = await context.Customer.FindAsync(id);
    if (customer == null)
    {
        return Results.NotFound();
    }

    return Results.Ok(customer);
}).RequireAuthorization();

Now, attempting to use any of the endpoints without proper authentication will result in a 401 response (unauthorized):

To authenticate, the call must include a token issued by a central authority, such as an identity server, which we don’t currently have. During development, we can use the dotnet user-jwts tool, invoked with the following command:

dotnet user-jwts create

Executing this command generates a token and stores it in the user secrets. It also modifies the appsettings.Development.json file to enable the token issuer (ensure this file is present in the project folder to avoid errors):

{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "AllowedHosts": "*",
  "Authentication": {
    "Schemes": {
      "Bearer": {
        "ValidAudiences": [
          "http://localhost:30888",
          "https://localhost:44321",
          "https://localhost:7191",
          "http://localhost:5057"
        ],
        "ValidIssuer": "dotnet-user-jwts"
      }
    }
  }
}

By visiting https://jwt.io, you can input the token and view it in decoded form:
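If you prefer not to paste the token into a website, the dotnet user-jwts tool can also decode tokens locally. The commands below are a sketch based on the tool's list and print commands; the token id placeholder must be replaced with a real id from the list output:

```shell
# List the tokens created for this project and note the Id of the one you want
dotnet user-jwts list

# Print that token in decoded form (header, payload, and the compact token itself)
dotnet user-jwts print <token-id> --show-all
```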

At this point, you can run the project and use the following curl command to query the service:

curl -X GET https://localhost:7191/customers -H "accept: */*" -H "Authorization: Bearer <token>"

While effective, this approach can be cumbersome for API testing. Using Swagger for testing would be more convenient. However, upon opening the Swagger page, you’ll notice there’s no provision for entering the token. This can be easily resolved by adjusting the configuration options for Swagger in the Program.cs file:

builder.Services.AddSwaggerGen(options => {
    options.AddSecurityDefinition("Bearer", new OpenApiSecurityScheme() {
        Name = "Authorization",
        Type = SecuritySchemeType.ApiKey,
        Scheme = "Bearer",
        BearerFormat = "JWT",
        In = ParameterLocation.Header,
        Description = "JWT Authorization header using the Bearer scheme. \r\n\r\n Enter 'Bearer' [space] and then your token in the text input below.\r\n\r\nExample: \"Bearer 1safsfsdfdfd\"",
    });
    options.AddSecurityRequirement(new OpenApiSecurityRequirement {
        {
            new OpenApiSecurityScheme {
                Reference = new OpenApiReference {
                    Type = ReferenceType.SecurityScheme,
                    Id = "Bearer"
                }
            },
            new string[] {}
        }
    });
});

AddSecurityDefinition instructs Swagger to incorporate the authorization feature. This will add an Authorize button at the top and configure the authorization. This configuration specifies the use of the Bearer scheme in the header, using JWT format.

AddSecurityRequirement defines the security prerequisites for your API endpoints. In this case, we are telling Swagger to add the Bearer scheme to each call. When you run the program again, you can see the Authorize button. Clicking on it, you can add your token:

Subsequently, the token will be automatically added to the headers, preventing any further 401 errors:

Enabling authorization

Up to this point, we’ve implemented authentication (verifying the user’s claimed identity), but we haven’t addressed authorization. For instance, while any authenticated user can access the customer list or view an individual customer, only Admin users should be able to modify data. To achieve this, we can modify the RequireAuthorization method by adding specific parameters:

app.MapPost("/customers", async (Customer customer, CustomerDbContext context) =>
{
    context.Customer.Add(customer);
    await context.SaveChangesAsync();
    return Results.Created($"/customers/{customer.Id}", customer);
}).RequireAuthorization(new AuthorizeAttribute() { Roles = "Admin" });

However, if you run the program now—despite having the previous token—the attempt to add a new customer will fail, as the token lacks the Admin role:

To address this, we need to create a new token with the Admin role:
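The dotnet user-jwts tool has a --role option for this. For example (the role name Admin matches the one required by the endpoint above):

```shell
# Create a new development token that carries the Admin role
dotnet user-jwts create --role Admin
```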

Using this token, we can now create a new customer:

More complex authorization requirements

Sometimes, more elaborate policies are necessary, demanding more than just a role. For example, we might stipulate that only a token with the can_delete_user claim is authorized to delete a customer. For that, we have to add a new policy in the AddAuthorization call:

builder.Services.AddAuthorization(options => {
    options.AddPolicy("DeleteUser", policy => policy.RequireClaim("can_delete_user", "true"));
});

And include this requirement within the MapDelete call:

app.MapDelete("/customers/{id}", async (string id, CustomerDbContext context) =>
{
    var currentCustomer = await context.Customer.FindAsync(id);

    if (currentCustomer == null)
    {
        return Results.NotFound();
    }

    context.Customer.Remove(currentCustomer);
    await context.SaveChangesAsync();
    return Results.NoContent();
}).RequireAuthorization(new AuthorizeAttribute() { Policy = "DeleteUser" });

As a result, we’re unable to delete the user even with the Admin role:

This issue can be resolved by obtaining a new token:
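Again, the dotnet user-jwts tool can do this: it accepts claims in name=value format through its --claim option. A sketch, including the role so the token still satisfies the other endpoints' checks:

```shell
# Create a token with the Admin role and the claim required by the DeleteUser policy
dotnet user-jwts create --role Admin --claim "can_delete_user=true"
```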

Armed with this new token, we can now proceed to delete the user:

By combining policy requirements with role requirements, intricate scenarios for various operations can be established and effortlessly applied to our endpoints.
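For instance, a single endpoint can demand both a role and a policy at once. This is a sketch (not from the original project) that reuses the DeleteUser policy defined above:

```csharp
// The caller must be authenticated, have the Admin role,
// AND satisfy the "DeleteUser" policy (the can_delete_user claim).
app.MapDelete("/customers/{id}", async (string id, CustomerDbContext context) =>
{
    // ... handler body as before ...
    return Results.NoContent();
}).RequireAuthorization(new AuthorizeAttribute { Roles = "Admin", Policy = "DeleteUser" });
```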

Conclusion

In summary, integrating authentication and authorization into our minimal APIs is a relatively straightforward process, and the effectiveness can be tested using the generated tokens. In a production environment, these development tokens would be replaced by tokens issued by an authentication server, which identifies users and generates tokens for API usage.

All the code for this article is at https://github.com/bsonnino/CustomerAuth

In the last article, I wrote about logging in C# and how to use log formatters to suit your needs. Once you start adding logs to your applications, you notice there is a lot of info that you must add to every message: the event id, the level, scopes, besides the message itself (to which you can even add parameters, if you also want to do some structured logging).

When the project starts to grow, some issues with logging appear: you cannot enforce standards, and you may miss some messages when processing the logs (Did you add the correct event id to the message? Should it be a trace or an info message?).

Fortunately, Microsoft introduced Compile-time logging source code generation. With this, you can customize your log messages and ease the task of adding logs.

The generated code uses the LoggerMessage.Define functionality to generate, at compile time, code that is much faster than runtime approaches.

To use the source code generator, all you have to do is to have a partial class with a partial method, decorated with the LoggerMessage attribute. With that, the implementation for the partial method will be generated by the compiler.

In this article, we will take the project created in this article and add some code generated logging to it. This project can be downloaded at https://github.com/bsonnino/CustomerService.

Once we download and run it, we can open Swagger to test it with https://localhost:7191/swagger/index.html (the port number can change). In the console window, you will see something like this:

Now, we will start to customize our log messages. For that, we will create a new static partial class named LoggerExtensions. In this class, we will add a partial method named ShowRetrievedCustomerCount and decorate it with the LoggerMessage attribute:

using System.Runtime.CompilerServices;

public static partial class LoggerExtensions
{
    [LoggerMessage(EventId = 100, Level = LogLevel.Information,
        Message = "{methodName}({lineNumber}) - Retrieved {Count} customers")]
    public static partial void ShowRetrievedCustomerCount(this ILogger logger,
        int count, [CallerMemberName] string methodName = "", [CallerLineNumber] int lineNumber = 0);
}

And that’s all we need for now. Some notes here:

  • I’m passing the ILogger instance as a parameter to the function. That’s needed for the source code generation to work. It doesn’t need to be the first parameter, but it must be present.
  • I used the this keyword for the ILogger parameter. This is not mandatory, but I wanted to make this method an extension method, so it can be called with logger.ShowRetrievedCustomerCount(10).
  • The method must return void.
  • I’m passing the method name and line number, obtained with the CallerMemberName and CallerLineNumber attributes.
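To get a feel for what the compiler produces, here is a rough hand-written equivalent using LoggerMessage.Define (a sketch only; the real generated code is more elaborate, with level checks and exception handling):

```csharp
// Hypothetical equivalent of the generated implementation:
// a strongly-typed delegate built once and cached, then invoked on every call.
private static readonly Action<ILogger, string, int, int, Exception?> _showRetrievedCustomerCount =
    LoggerMessage.Define<string, int, int>(
        LogLevel.Information,
        new EventId(100, nameof(ShowRetrievedCustomerCount)),
        "{methodName}({lineNumber}) - Retrieved {Count} customers");

public static partial void ShowRetrievedCustomerCount(this ILogger logger, int count,
    string methodName, int lineNumber)
    => _showRetrievedCustomerCount(logger, methodName, lineNumber, count, null);
```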

With that, we can use our new function for the logging:

app.MapGet("/customers", async (CustomerDbContext context) =>
{
    var customers = await context.Customer.ToListAsync();
    logger.ShowRetrievedCustomerCount(customers.Count);
    return Results.Ok(customers);
});

When we run the code and test it, we get:

As you can see, the message is output to the console, with level Information and event id 100. It is a structured message (the customer count is a variable of the message) and it has the source method name and line.

We can change the format of the messages and remove the Entity Framework messages by changing the appsettings.json file to:

{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning",
      "Microsoft.EntityFrameworkCore": "Warning"
    },
    "Console": {
      "FormatterName": "simple",
      "FormatterOptions": {
        "SingleLine": true,
        "IncludeScopes": false,
        "TimestampFormat": "yyyy-MM-dd HH:mm:ss zzz ",
        "ColorBehavior": "enabled",
        "JsonWriterOptions": {
          "Indented": true
        }
      }
    }
  },
  "AllowedHosts": "*"
}

We set the formatting to single line and add the timestamp to get something like this:

NOTE – When I changed the appsettings file, I saw that the messages that came from the app were properly formatted, while the ones I had introduced were not. That’s because, in the original code, I had created a separate logger that doesn’t get its configuration from appsettings.json (the default logger created by the ASP.NET app does that automatically). The simplest and cleanest solution was to use the default logger, by replacing this code:

var logger = LoggerFactory.Create(config =>
    {
        config.AddConsole();
    }).CreateLogger("CustomerApi");

with this one:

var logger = app.Logger;

If we want more info for the log, we can set the formatter name to json and IncludeScopes to true:

As you can see, in the State object, you have the count, method name, line number and even the original format of the messages. In the Scopes object you have data about the message, connection id and even the request path for the request.

We can continue adding messages for logging:

using System.Runtime.CompilerServices;

public static partial class LoggerExtensions
{
    [LoggerMessage(EventId = 100, Level = LogLevel.Information,
	 Message = "{methodName}({lineNumber}) - Retrieved {Count} customers")]
    public static partial void ShowRetrievedCustomerCount(this ILogger logger, int count, 
	 [CallerMemberName] string methodName = "", [CallerLineNumber] int lineNumber = 0);

    [LoggerMessage(EventId = 404, Level = LogLevel.Information,
	 Message = "{methodName}({lineNumber}) - Customer {customerId} not found.")]
    public static partial void CustomerNotFound(this ILogger logger, string customerId, 
	 [CallerMemberName] string methodName = "", [CallerLineNumber] int lineNumber = 0);
    
    [LoggerMessage(EventId = 200, Level = LogLevel.Trace,
	 Message = "{methodName}({lineNumber}) - Retrieved customer {customerId}")]
    public static partial void RetrievedCustomer(this ILogger logger, string customerId, 
	 [CallerMemberName] string methodName = "", [CallerLineNumber] int lineNumber = 0);

    [LoggerMessage(EventId = 300, Level = LogLevel.Information,
	 Message = "{methodName}({lineNumber}) - Creating customer {customerId}")]
    public static partial void CreatingCustomer(this ILogger logger, string customerId, 
	 [CallerMemberName] string methodName = "", [CallerLineNumber] int lineNumber = 0);

    [LoggerMessage(EventId = 302, Level = LogLevel.Trace,
	 Message = "{methodName}({lineNumber}) - Created customer {customerId}")]
    public static partial void CreatedCustomer(this ILogger logger, string customerId, 
	 [CallerMemberName] string methodName = "", [CallerLineNumber] int lineNumber = 0);

    [LoggerMessage(EventId = 400, Level = LogLevel.Information,
	 Message = "{methodName}({lineNumber}) - Updating customer {customerId}")]
    public static partial void UpdatingCustomer(this ILogger logger, string customerId, 
	 [CallerMemberName] string methodName = "", [CallerLineNumber] int lineNumber = 0);

    [LoggerMessage(EventId = 402, Level = LogLevel.Trace,
	 Message = "{methodName}({lineNumber}) - Updated customer {customerId}")]
    public static partial void UpdatedCustomer(this ILogger logger, string customerId, 
	 [CallerMemberName] string methodName = "", [CallerLineNumber] int lineNumber = 0);

    [LoggerMessage(EventId = 500, Level = LogLevel.Information,
	 Message = "{methodName}({lineNumber}) - Deleting customer {customerId}")]
    public static partial void DeletingCustomer(this ILogger logger, string customerId, 
	 [CallerMemberName] string methodName = "", [CallerLineNumber] int lineNumber = 0);

    [LoggerMessage(EventId = 502, Level = LogLevel.Trace,
	 Message = "{methodName}({lineNumber}) - Deleted customer {customerId}")]
    public static partial void DeletedCustomer(this ILogger logger, string customerId, 
	 [CallerMemberName] string methodName = "", [CallerLineNumber] int lineNumber = 0);
}

As you can see, some of the messages have the Information level and others have the Trace level. We can now add the log messages to the code:

using Microsoft.EntityFrameworkCore;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();
builder.Services.AddDbContext<CustomerDbContext>();

var app = builder.Build();

app.UseSwagger();
app.UseSwaggerUI();

var logger = app.Logger;

using (var scope = app.Services.CreateScope())
{
    var dbContext = scope.ServiceProvider.GetRequiredService<CustomerDbContext>();
    await dbContext.Database.EnsureCreatedAsync();
}

app.MapGet("/customers", async (CustomerDbContext context) =>
{
    var customers = await context.Customer.ToListAsync();
    logger.ShowRetrievedCustomerCount(customers.Count);
    return Results.Ok(customers);
});

app.MapGet("/customers/{id}", async (string id, CustomerDbContext context) =>
{
    var customer = await context.Customer.FindAsync(id);
    if (customer == null)
    {
        logger.CustomerNotFound(id);
        return Results.NotFound();
    }
    logger.RetrievedCustomer(id);
    return Results.Ok(customer);
});

app.MapPost("/customers", async (Customer customer, CustomerDbContext context) =>
{
    logger.CreatingCustomer(customer.Id);
    context.Customer.Add(customer);
    await context.SaveChangesAsync();
    logger.CreatedCustomer(customer.Id);
    return Results.Created($"/customers/{customer.Id}", customer);
});

app.MapPut("/customers/{id}", async (string id, Customer customer, CustomerDbContext context) =>
{
    logger.UpdatingCustomer(id);
    var currentCustomer = await context.Customer.FindAsync(id);
    if (currentCustomer == null)
    {
        logger.CustomerNotFound(id);
        return Results.NotFound();
    }

    context.Entry(currentCustomer).CurrentValues.SetValues(customer);
    await context.SaveChangesAsync();
    logger.UpdatedCustomer(id);
    return Results.NoContent();
});

app.MapDelete("/customers/{id}", async (string id, CustomerDbContext context) =>
{
    logger.DeletingCustomer(id);
    var currentCustomer = await context.Customer.FindAsync(id);
    if (currentCustomer == null)
    {
        logger.CustomerNotFound(id);
        return Results.NotFound();
    }

    context.Customer.Remove(currentCustomer);
    await context.SaveChangesAsync();
    logger.DeletedCustomer(id);
    return Results.NoContent();
});

app.Run();

After changing the appsettings file to show the trace messages and give a more succinct report:

{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning",
      "Microsoft.EntityFrameworkCore": "Warning",
      "CustomerService": "Trace"
    },
    "Console": {
      "FormatterName": "simple",
      "FormatterOptions": {
        "SingleLine": true,
        "IncludeScopes": false,
        "ColorBehavior": "enabled",
        "JsonWriterOptions": {
          "Indented": true
        }
      }
    }
  },
  "AllowedHosts": "*"
}

We get this:

As you can see, we have standard messages, with the correct event ids and levels, and we have the method names and line numbers, without having to bother with message formatting or with setting values for the event ids and levels.

The full source code for this article is at https://github.com/bsonnino/EnhancedLogging

Introduction

Logging is an essential aspect of software development that enables us to track and understand the behavior of our applications. In fact, there are many logging frameworks to help with this task. In this post, I’ve shown how to use Serilog to generate structured logging.

In .NET 5.0, Microsoft introduced a new feature for logging: log formatters. Prior to that, the logging system only supported a single log format, plain text: you could only create logs using plain text and, if you wanted something different, you had to format the data yourself. The introduction of log formatters made it possible to format log messages in different ways, such as JSON and SystemD (it also allowed developers to create their own custom formatters).

In this blog post, we will explore log formatters: their purpose, their implementation, and examples of how to use and create formatters that suit your needs.

What are Log Formatters?

Log formatters are components responsible for formatting log messages in a specific way. They take raw log data, such as timestamps, log levels, and message details, and transform them into a human-readable format. Log formatters greatly help in the analysis and comprehension of log entries, making troubleshooting and debugging easier.

Predefined log formatters

The .NET team has implemented three predefined log formatters:

  • Simple – With this formatter, you can add the time and log level to each log message, and also use ANSI coloring and message indentation.
  • JSON – This formatter generates the log in JSON format.
  • SystemD – This formatter allows you to use the Syslog format, available in containers; it does not color the messages and always logs them in a single line.
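Each of these has a corresponding registration method on the logging builder (a sketch; pick the one you need when configuring logging in code):

```csharp
using Microsoft.Extensions.Logging;

using var loggerFactory = LoggerFactory.Create(builder =>
{
    builder.AddSimpleConsole();      // "simple": plain text, optional colors and timestamps
    // builder.AddJsonConsole();     // "json": each log entry written as a JSON object
    // builder.AddSystemdConsole();  // "systemd": Syslog-style output, single line, no colors
});
```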

To show the usage of the formatters, we’ll create a console app that will use these new features to log data.

Create a new console app with these commands

dotnet new console -o PredefFormatters
cd PredefFormatters
code .

Add the packages Microsoft.Extensions.Logging and Microsoft.Extensions.Logging.Console with

dotnet add package Microsoft.Extensions.Logging
dotnet add package Microsoft.Extensions.Logging.Console

In VS Code, add this code in Program.cs:

using Microsoft.Extensions.Logging;

var loggerFactory = LoggerFactory.Create(builder =>
{
    builder.AddSimpleConsole();
});

var logger = loggerFactory.CreateLogger<Program>();

logger.LogTrace("This is a trace message");
logger.LogDebug("This is a debug message");
logger.LogInformation("This is an information message");
logger.LogWarning("This is a warning message");
logger.LogError("This is an error message");
logger.LogCritical("This is a critical message");

When you run it, you will get something like this:

At this point, you may be puzzled, asking yourself what happened to the other messages. The answer is that the logger caches the messages, and the program finishes before it can show all of them, so you only see one message. You can see a detailed explanation here.

The solution in this case is to add a delay at the end, to wait for all the messages to be shown:

using Microsoft.Extensions.Logging;

var loggerFactory = LoggerFactory.Create(builder =>
{
    builder.AddSimpleConsole();
});

var logger = loggerFactory.CreateLogger<Program>();

logger.LogTrace("This is a trace message");
logger.LogDebug("This is a debug message");
logger.LogInformation("This is an information message");
logger.LogWarning("This is a warning message");
logger.LogError("This is an error message");
logger.LogCritical("This is a critical message");
Task.Delay(1000).Wait();

At the end, I’m adding a delay of 1 second to flush all messages. Usually you won’t need this hack, but it’s good to know.

EDIT – After publishing the article, Thomas (see comment below) pointed me to a cleaner way to flush the log queue: just dispose the logger factory. This is the cleaner approach and should be used instead of the delay. In this case, the code will look like this:

using Microsoft.Extensions.Logging;

// With this notation, we don't need the braces, the logger factory will be disposed at the end
// Feel free to use the braces if you feel that it's more explicit
using var loggerFactory = LoggerFactory.Create(builder =>
{
    builder.AddSimpleConsole();
});

var logger = loggerFactory.CreateLogger<Program>();

logger.LogTrace("This is a trace message");
logger.LogDebug("This is a debug message");
logger.LogInformation("This is an information message");
logger.LogWarning("This is a warning message");
logger.LogError("This is an error message");
logger.LogCritical("This is a critical message");

If you notice the output of the program:

You can see that there are still two messages missing: the Trace and Debug messages. That’s because the default level for logging is Information and, without any configuration, you won’t be able to see these messages. One way to configure logging is by changing the builder configuration to set the new level:

using Microsoft.Extensions.Logging;

var loggerFactory = LoggerFactory.Create(builder =>
{
    builder
        .AddSimpleConsole()
        .SetMinimumLevel(LogLevel.Trace);
});

var logger = loggerFactory.CreateLogger<Program>();

logger.LogTrace("This is a trace message");
logger.LogDebug("This is a debug message");
logger.LogInformation("This is an information message");
logger.LogWarning("This is a warning message");
logger.LogError("This is an error message");
logger.LogCritical("This is a critical message");
Task.Delay(1000).Wait();

If you run the program again, you will get the missing messages. Although this is a good way to configure logging, you have to recompile the code every time you want to change something. In this case, there is a better way to do it, using a configuration file.

For that we must follow the steps shown at this article and add the packages Microsoft.Extensions.Configuration and Microsoft.Extensions.Configuration.Json:

dotnet add package Microsoft.Extensions.Configuration
dotnet add package Microsoft.Extensions.Configuration.Json

And add this code to add the configuration:

using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;

var configuration = new ConfigurationBuilder()
    .SetBasePath(Directory.GetCurrentDirectory())
    .AddJsonFile($"appsettings.json", false, true)
    .Build();
                                
var loggerFactory = LoggerFactory.Create(builder =>
{
    builder
        .AddConfiguration(configuration.GetSection("Logging"))
        .AddSimpleConsole();
});

var logger = loggerFactory.CreateLogger<Program>();

logger.LogTrace("This is a trace message");
logger.LogDebug("This is a debug message");
logger.LogInformation("This is an information message");
logger.LogWarning("This is a warning message");
logger.LogError("This is an error message");
logger.LogCritical("This is a critical message");
Task.Delay(1000).Wait();

Once you do this and add an appsettings.json file to the folder with this content:

{
    "Logging": {
        "LogLevel": {
            "Default": "Trace"
        }
    }
}

You will be able to see all messages. As you can see from the output, every message is shown in two lines: the first one shows the category (Program, which was used in the CreateLogger call), and the other has the message. If we want everything in a single line, we can add this configuration:

{
    "Logging": {
        "LogLevel": {
            "Default": "Trace"
        },
        "Console": {
            "FormatterOptions": {
                "SingleLine": true
            }
        }
    }
}

This will cause the output to be in a single line:

We can even change the log formatting in the configuration, but for that, we need to change the line .AddSimpleConsole(); to .AddConsole(); and change the Console section in appsettings to

"Console": {
    "FormatterName": "simple",
    "FormatterOptions": {
        "SingleLine": true
    }
}

Now, we can change the FormatterName setting to json or systemd and have different formatting for the log:

As you can see, we can change the log formats by changing the configuration file. We can add more configurations for the logs, by adding new options:

"Console": {
    "FormatterName": "simple",
    "FormatterOptions": {
        "SingleLine": true, 
        "IncludeScopes": true,
        "TimestampFormat": "yyyy-MM-dd HH:mm:ss zzz",
        "ColorBehavior": "disabled"
    }
}

"Console": {
    "FormatterName": "json",
    "FormatterOptions": {
        "SingleLine": true, 
        "IncludeScopes": true,
        "TimestampFormat": "yyyy-MM-dd HH:mm:ss zzz",
        "ColorBehavior": "disabled",
        "JsonWriterOptions": {
            "Indented": true
        }
    }
}

You may be asking what the number [0] beside the category means. This is the event id, a struct that has the properties Id and Name, used to identify the types of events. When you do something like this:

using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;

EventId AppLogEvent = new EventId(100, "AppLog");
EventId DbLogEvent = 200;

var configuration = new ConfigurationBuilder()
    .SetBasePath(Directory.GetCurrentDirectory())
    .AddJsonFile($"appsettings.json", false, true)
    .Build();
                                
var loggerFactory = LoggerFactory.Create(builder =>
{
    builder
        .AddConfiguration(configuration.GetSection("Logging"))
        .AddConsole();
});

var logger = loggerFactory.CreateLogger<Program>();

logger.LogTrace(AppLogEvent,"This is a trace message");
logger.LogDebug(DbLogEvent,"This is a debug message");
logger.LogInformation(300,"This is an information message");
logger.LogWarning("This is a warning message");
logger.LogError("This is an error message");
logger.LogCritical("This is a critical message");
Task.Delay(1000).Wait();

As you can see, you can create EventIds explicitly, use the implicit conversion from an int, or pass an int directly as the first parameter of the log call; the log will then show the id numbers instead of [0]:

Scopes

You can also group the messages by using Scopes. A scope is a class that implements IDisposable and is created by the BeginScope method. It remains active until it’s disposed. Scopes can be nested.

For example, we could create scopes in our program with this code:

using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;

var configuration = new ConfigurationBuilder()
    .SetBasePath(Directory.GetCurrentDirectory())
    .AddJsonFile($"appsettings.json", false, true)
    .Build();
                                
var loggerFactory = LoggerFactory.Create(builder =>
{
    builder
        .AddConfiguration(configuration.GetSection("Logging"))
        .AddConsole();
});

var logger = loggerFactory.CreateLogger<Program>();

using(var scope1 = logger.BeginScope("Scope 1"))
{
    logger.LogInformation("This is an information message in scope 1");
    logger.LogWarning("This is a warning message in scope 1");
    logger.LogError("This is an error message in scope 1");
    logger.LogCritical("This is a critical message in scope 1");
}
using(var scope2 = logger.BeginScope("Scope 2"))
{
    logger.LogInformation("This is an information message in scope 2");
    logger.LogWarning("This is a warning message in scope 2");
    logger.LogError("This is an error message in scope 2");
    logger.LogCritical("This is a critical message in scope 2");
    using(var scope3 = logger.BeginScope("Scope 3"))
    {
        logger.LogInformation("This is an information message in scope 3");
        logger.LogWarning("This is a warning message in scope 3");
        logger.LogError("This is an error message in scope 3");
        logger.LogCritical("This is a critical message in scope 3");
    }
}
logger.LogTrace("This is a trace message");
logger.LogDebug("This is a debug message");
logger.LogInformation("This is an information message");
logger.LogWarning("This is a warning message");
logger.LogError("This is an error message");
logger.LogCritical("This is a critical message");
Task.Delay(1000).Wait();
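Note that the built-in console formatters only print scopes when IncludeScopes is enabled in the formatter options. For the simple formatter, for example, appsettings.json would need something like:

```json
{
    "Logging": {
        "Console": {
            "FormatterName": "simple",
            "FormatterOptions": {
                "IncludeScopes": true
            }
        }
    }
}
```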

Creating a custom formatter

In addition to using the available formatters, you can create your own formatter that formats the data according to your specific needs. For that, you must create a new class that inherits from ConsoleFormatter and use it to generate the data. Create a new file and name it CsvLogFormatter.cs. Add this code to create the formatter:

using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Logging.Console;

public class CsvLogFormatter : ConsoleFormatter
{
    public CsvLogFormatter() : base("CsvFormatter")
    {
    }

    public override void Write<TState>(in LogEntry<TState> logEntry, IExternalScopeProvider? scopeProvider, TextWriter textWriter)
    {
        string? message =
            logEntry.Formatter?.Invoke(
                logEntry.State, logEntry.Exception);

        if (message is null)
        {
            return;
        }
        var scopeStr = "";
        if (scopeProvider != null)
        {
            var scopes = new List<string>();
            scopeProvider.ForEachScope((scope, state) => state.Add(scope?.ToString() ?? ""), scopes);
            scopeStr = string.Join("|", scopes);
        }
        var logMessage = $"\"{logEntry.LogLevel}\",\"{logEntry.Category}[{logEntry.EventId}]\"," +
            $"\"{scopeStr}\",\"{message}\"";
        textWriter.WriteLine(logMessage);
    }
}

The formatter inherits from ConsoleFormatter and overrides the Write method, which receives the log entry, the scope provider (which provides the scopes for the message), and the TextWriter used for the log output. The method formats the received message, obtains the scopes and joins them into a single string separated by "|", and then builds the complete log message that is written by the text writer.

This class works fine and outputs each log message as a line of comma-separated values, where every field is enclosed in quotes. To use it, we must set it up in Program.cs by calling AddConsoleFormatter on the logging builder, as shown in the code snippet below:

var loggerFactory = LoggerFactory.Create(builder =>
{
    builder
        .AddConfiguration(configuration.GetSection("Logging"))
        .AddConsole()
        .AddConsoleFormatter<CsvLogFormatter, ConsoleFormatterOptions>();
});

As you can see, ConsoleFormatterOptions is the second type parameter in the generic call; an instance of it is automatically provided by the logging framework. However, the formatter code doesn't use these options yet. Before enhancing it to do so, we will create an options class specific to this formatter, adding one new setting: the list separator character. This goes in the CsvFormatterOptions class.

public sealed class CsvFormatterOptions : ConsoleFormatterOptions
{
    public string? ListSeparator { get; set; }
}

This class inherits from the ConsoleFormatterOptions class and introduces a new property called ListSeparator. To use it, we must enhance our formatter like this:

public class CsvLogFormatter : ConsoleFormatter, IDisposable
{
    CsvFormatterOptions? _options;
    private readonly IDisposable? _optionsReloadToken;

    public CsvLogFormatter(IOptionsMonitor<CsvFormatterOptions> options) : base("CsvFormatter")
    {
        _options = options.CurrentValue;
        _optionsReloadToken = options.OnChange(ReloadLoggerOptions);
    }

    private void ReloadLoggerOptions(CsvFormatterOptions currentValue)
    {
        _options = currentValue;
    }

    public override void Write<TState>(in LogEntry<TState> logEntry, IExternalScopeProvider? scopeProvider, TextWriter textWriter)
    {
        string? message =
            logEntry.Formatter?.Invoke(
                logEntry.State, logEntry.Exception);

        if (message is null)
        {
            return;
        }
        var scopeStr = "";
        if (_options?.IncludeScopes == true && scopeProvider != null)
        {
            var scopes = new List<string>();
            scopeProvider.ForEachScope((scope, state) => state.Add(scope?.ToString() ?? ""), scopes);
            scopeStr = string.Join("|", scopes);
        }
        var listSeparator = _options?.ListSeparator ?? ",";
        if (_options?.TimestampFormat != null)
        {
            var timestampFormat = _options.TimestampFormat;
            var timestamp = "\"" + DateTime.Now.ToLocalTime().ToString(timestampFormat, 
                CultureInfo.InvariantCulture) + "\"";
            textWriter.Write(timestamp);
            textWriter.Write(listSeparator);
        }
        var logMessage = $"\"{logEntry.LogLevel}\"{listSeparator}"+
            $"\"{logEntry.Category}[{logEntry.EventId}]\"{listSeparator}" +
            $"\"{scopeStr}\"{listSeparator}\"{message}\"";
        textWriter.WriteLine(logMessage);
    }

    public void Dispose() => _optionsReloadToken?.Dispose();
}

Now the class also implements the IDisposable interface. This is necessary because the constructor of CsvLogFormatter receives an IOptionsMonitor<CsvFormatterOptions> and registers an event handler to detect changes in the options; implementing IDisposable ensures this handler is properly disposed. The IOptionsMonitor instance also exposes the current options value, which we use to initialize the options.

Since our options inherit from ConsoleFormatterOptions, we have access to all properties in the parent class, allowing us to utilize them for enhancing the output:

  • If IncludeScopes is true, we will gather the scopes and include them in the log entry
  • If TimestampFormat is defined, we will add the current timestamp to the output
  • We can use the ListSeparator option to change the separator between the log values

All these options can still be configured in appsettings.json. For example, if we change the file to:

{
    "Logging": {
        "LogLevel": {
            "Default": "Trace"
        },
        "Console": {
            "FormatterName": "CsvFormatter",
            "FormatterOptions": {
                "SingleLine": true,
                "IncludeScopes": true,
                "ColorBehavior": "enabled",
                "TimestampFormat": "yyyy-MM-dd HH:mm:ss.fff zzz",
                "JsonWriterOptions": {
                    "Indented": true
                },
                "ListSeparator": ";"
            }
        }
    }
}

And run the program, we will get something like this:

Conclusion

As you can see, log formatters offer flexibility, and you can tailor them to suit your specific requirements. You can change the output format for the data and even transform or filter it. That can greatly improve your logging experience.

All the source code for this project is at https://github.com/bsonnino/LogFormatters

C# 9 introduced a new feature that allows you to inspect user code as it is being compiled and generate new C# source files that are added to the compilation. This enables you to write code that runs during compilation and produces additional source code based on the analysis of your program. In this blog post, I will explain what C# source code generators are, why they are useful, and how to use them in your projects.

What are C# source code generators?

A C# source code generator is a piece of code that you can write to hook into the compilation pipeline and generate additional source files that will be compiled. The process looks like this:


(From https://devblogs.microsoft.com/dotnet/introducing-c-source-generators/)

When the compilation runs, the source code is passed to our code generator, which analyzes it and generates new code that is added to the current code and compiled with it, generating an executable that has both the user code and the generated one.

This allows you to make your code more performant and robust. One scenario is removing the need for reflection: since the code is generated at compile time, you don't need reflection to work with the class data at runtime. Another use is removing boilerplate code: you can add a generator that produces all the extra code needed for a specific situation. For example, in my MVVM Toolkit 8 article I showed how to remove all the boilerplate code for the INotifyPropertyChanged interface implementations. In fact, this code:

[ObservableProperty]
private Customer _selectedCustomer;

would generate the SelectedCustomer property with a getter and a setter that raises the PropertyChanged event.
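For reference, the generated property is conceptually equivalent to this (a simplified sketch, not the exact code the toolkit emits):

```csharp
public Customer SelectedCustomer
{
    get => _selectedCustomer;
    // SetProperty updates the field and raises PropertyChanged when the value changes
    set => SetProperty(ref _selectedCustomer, value);
}
```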

Creating a C# source code generator

We will now create a simple Code Generator. The source code generator is a .NET Standard 2.0 dll that will be added to the main project. To create the dll, you should use these commands:

dotnet new classlib -o HelloGenerator
cd HelloGenerator
code .

Add the packages Microsoft.CodeAnalysis.Analyzers and Microsoft.CodeAnalysis.CSharp:

dotnet add package Microsoft.CodeAnalysis.Analyzers
dotnet add package Microsoft.CodeAnalysis.CSharp

The csproj file must be changed to set the target framework and configure the project to be used as a generator:

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <LangVersion>latest</LangVersion>
    <EnforceExtendedAnalyzerRules>true</EnforceExtendedAnalyzerRules>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.CodeAnalysis.Analyzers" Version="3.3.4">
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
      <PrivateAssets>all</PrivateAssets>
    </PackageReference>
    <PackageReference Include="Microsoft.CodeAnalysis.CSharp" Version="4.6.0" />
  </ItemGroup>

</Project>

In VS Code, create a new file and name it HelloGenerator.cs. The structure of the file is this:

using System.Text;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.Text;

[Generator]
public class HelloGenerator : ISourceGenerator
{
    public void Initialize(GeneratorInitializationContext context)
    {
        // Initialization code 
    }

    public void Execute(GeneratorExecutionContext context)
    {
        // Code that will be executed at compile time
    }
}

We created a HelloGenerator class that implements the ISourceGenerator interface and added the [Generator] attribute to it. This interface has two methods:

  • Initialize, to initialize the generator. This method is called once, when your generator is loaded. You can use this method to register callbacks for syntax or semantic changes in the user code.
  • Execute, where the source code will be generated. The GeneratorExecutionContext parameter will provide information about the code being compiled.

Our sample generator looks like this:

using System.Text;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.Text;

namespace HelloWorldGenerator;

[Generator]
public class HelloGenerator : ISourceGenerator
{
    public void Initialize(GeneratorInitializationContext context)
    {
        // No initialization required for this one
    }

    public void Execute(GeneratorExecutionContext context)
    {
        // Create the source to inject
        var sourceBuilder = new StringBuilder(@"
        using System;
namespace HelloWorldGenerated
{
    public static class HelloWorld
    {
        public static void SayHello()
        {
            Console.WriteLine(""Hello from generated code!"");
        }
    }
}");
        // Add the source file to the compilation
        context.AddSource("HelloWorld", SourceText.From(sourceBuilder.ToString(), Encoding.UTF8));
    }
}

We don’t need any initialization, and the execution will create a single file with a static class HelloWorld that has a SayHello method.

We can run dotnet build to build the dll. Once that's done, we can create the code that uses it. Create a new console project named HelloTest and reference the HelloGenerator project:

cd ..
dotnet new console -o HelloTest
cd HelloTest
dotnet add reference ..\HelloGenerator\HelloGenerator.csproj
code .

In the HelloTest project, we must change the reference to the generator project, marking it as an analyzer:

<Project Sdk="Microsoft.NET.Sdk">

  <ItemGroup>
    <ProjectReference Include="..\HelloGenerator\HelloGenerator.csproj" 
                      OutputItemType="Analyzer"
                      ReferenceOutputAssembly="false"/>
  </ItemGroup>

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net7.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

</Project>

The program for this project is:

HelloWorldGenerated.HelloWorld.SayHello();

When you run it, you get something like this:
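The console shows the message written by the generated method:

```
Hello from generated code!
```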

As you can see, the source generator created a new class called HelloWorld in a namespace called HelloWorldGenerated, and added a static method called SayHello that prints a message to the console. The user code then referenced this class and called its method.

This is a very simple example, but it demonstrates the basic steps of creating and using a source generator. In the next section, we will see how to write a more complex source generator that implements a specific scenario.

JSON serializer generator

Let’s say we have a class like this:

public class Person
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public int Age { get; set; }
}

And we want to implement a ToJson method that takes an instance of the Person class and generates a JSON string with the data. We could create an extension method, like this one:

using System.Text;

public static class JsonExtensions
{
    public static string ToJson<T>(this T element)
    {
        var properties = typeof(T).GetProperties();
        var builder = new StringBuilder();
        builder.Append("{");
        foreach (var property in properties)
        {
            var name = property.Name;
            var value = property.GetValue(element);
            builder.Append($"\"{name}\":\"{value}\",");
        }
        builder.Remove(builder.Length - 1, 1); // Remove trailing comma
        builder.Append("}");
        return builder.ToString();
    }
}

This approach works fine, but it requires reflection, which has some performance overhead.
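For example, using the extension method above (property order follows the declaration order in practice, though reflection doesn't guarantee it):

```csharp
var person = new Person { FirstName = "John", LastName = "Doe", Age = 20 };
Console.WriteLine(person.ToJson());
// {"FirstName":"John","LastName":"Doe","Age":"20"}
```

Note that this naive serializer quotes every value, even the numeric Age property.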

Another approach is to use a source generator: we can write code that analyzes the Person class at compile time and generates a new class implementing a ToJson method written specifically for it, without resorting to reflection. This approach is more performant because the serialization code is generated at compile time, so no reflection is needed at runtime.

In the end, this generator will produce a new file with the following extension methods:

public static class PersonExtensions
{
     public static string ToJson(this Person person)
    {
        // Generate code to serialize person to JSON
    }

    public static Person FromJson(string json)
    {
        // Generate code to deserialize person from JSON
    }
}

To write this generator, we follow the same steps as in the previous project: create a new classlib project, add the packages Microsoft.CodeAnalysis.Analyzers and Microsoft.CodeAnalysis.CSharp, and change the project file to be used as a generator:

dotnet new classlib -o JsonGenerator
cd JsonGenerator
code .
dotnet add package Microsoft.CodeAnalysis.Analyzers
dotnet add package Microsoft.CodeAnalysis.CSharp

Then change the csproj file:

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <LangVersion>latest</LangVersion>
    <EnforceExtendedAnalyzerRules>true</EnforceExtendedAnalyzerRules>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.CodeAnalysis.Analyzers" Version="3.3.4">
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
      <PrivateAssets>all</PrivateAssets>
    </PackageReference>
    <PackageReference Include="Microsoft.CodeAnalysis.CSharp" Version="4.6.0" />
  </ItemGroup>

</Project>

Create a new file JsonGenerator.cs and create a new class that implements the ISourceGenerator interface and add the [Generator] attribute to it.

using System.Text;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.Text;

[Generator]
public class JsonGenerator : ISourceGenerator
{
    public void Initialize(GeneratorInitializationContext context)
    {
        // Initialization code 
    }

    public void Execute(GeneratorExecutionContext context)
    {
        // Code that will be executed at compile time
    }
}

Implement the Initialize method to register a callback for syntax receiver creation. The callback creates an instance of a custom syntax receiver class that collects all the classes marked with the [Serializable] attribute in the user code. This class implements the ISyntaxReceiver interface and its OnVisitSyntaxNode method, which is called for every syntax node in the user code and checks whether the node is a class declaration with the [Serializable] attribute. If it is, the class is added to the CandidateClasses list, which will be used later in the Execute method.

using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp.Syntax;

public class JsonGenerator : ISourceGenerator
{
    public void Initialize(GeneratorInitializationContext context)
    {
        // Register a callback for syntax receiver creation
        context.RegisterForSyntaxNotifications(() => new SerializationSyntaxReceiver());
    }
    
    ...
}

// A custom syntax receiver that collects all the classes with the [Serializable] attribute 
public class SerializationSyntaxReceiver : ISyntaxReceiver 
{ 
    public List<ClassDeclarationSyntax> CandidateClasses { get; } = new List<ClassDeclarationSyntax>();

    public void OnVisitSyntaxNode(SyntaxNode syntaxNode)
    {
        // Check if the syntax node is a class declaration with the [Serializable] attribute
        if (syntaxNode is ClassDeclarationSyntax classDeclaration &&
            classDeclaration.AttributeLists.Count > 0 &&
            classDeclaration.AttributeLists.Any(
                al => al.Attributes.Any(a => a.Name.ToString() == "Serializable")))
        {
            // Add it to the candidate list
            CandidateClasses.Add(classDeclaration);
        }
    }
}

Implement the Execute method to generate the extension methods for each collected class using the syntax and semantic models provided by the context parameter.

public void Execute(GeneratorExecutionContext context)
{
    // Get the compilation object that represents all user code being compiled
    var compilation = context.Compilation;

    // Get the syntax receiver that was created by our callback
    var receiver = context.SyntaxReceiver as SerializationSyntaxReceiver;
    if (receiver == null)
    {
        return;
    }

    // Loop through all the class declarations collected by the receiver
    foreach (var classDeclaration in receiver.CandidateClasses)
    {
        // Get the semantic model for the syntax tree that contains the class
        var model = compilation.GetSemanticModel(classDeclaration.SyntaxTree);

        // Get the symbol for the class declaration
        var classSymbol = model.GetDeclaredSymbol(classDeclaration) as INamedTypeSymbol;

        if (classSymbol != null)
        {
            // Generate the extension methods for the class
            var source = GenerateSource(classSymbol);

            // Add a new source file to the compilation with a unique hint name
            context.AddSource($"{classSymbol.Name}.ToJson.cs", SourceText.From(source, Encoding.UTF8));
        }
    }
}

This code gets the Compilation object and, for each class collected in the Initialize method, retrieves the semantic model for the syntax tree that contains it and obtains the class symbol. It then calls the GenerateSource method and adds a new source file with the generated code:

    private string GenerateSource(INamedTypeSymbol classSymbol)
    {
        // Get the name of the class
        var className = classSymbol.Name;

        // Get the properties of the class
        var properties = classSymbol.GetMembers().OfType<IPropertySymbol>();
        if (!properties.Any())
        {
            return "";
        }

        // Generate code for the ToJson method using StringBuilder
        var builder = new StringBuilder();
        builder.AppendLine($@"
using System.Text.Json;

public static class {className}Extensions
{{
    public static string ToJson(this {className} {className.ToLower()})
    {{");

        // Append code to create a new JSON object as a string using string interpolation and escaping
        builder.Append(@"        return $""{{");
        foreach (var property in properties)
        {
            builder.Append($@"\""{property.Name}\"":\""{{{className.ToLower()}.{property.Name}}}\"",");
        }
        builder.Remove(builder.Length - 1, 1); // Remove trailing comma
        builder.Append(@"}}"";");
        builder.Append(@$"
    }}
        
    public static {className} FromJson(string json)
    {{");

        // Append code to parse the JSON string as a JSON object using JsonDocument.Parse
        builder.AppendLine($@"
        using var document = JsonDocument.Parse(json);
        var root = document.RootElement;");

        // Append code to create a new instance of the class using its default constructor
        builder.AppendLine($@"
        var {className.ToLower()} = new {className}();");

        // Append code to assign each property value from the JSON object using JsonElement.GetProperty and TryGet methods
        foreach (var property in properties)
        {
            builder.AppendLine($@"
        if (root.TryGetProperty(""{property.Name}"", out var {property.Name}Element))
        {{
            {className.ToLower()}.{property.Name} = {GetConversionCode(property.Type, $"{property.Name}Element")};
        }}");
        }

        // Append code to return the created object as a result
        builder.AppendLine($@"
        return {className.ToLower()};
    }}
}}");

        // Return the generated source code as a string
        return builder.ToString();
    }

    private string GetConversionCode(ITypeSymbol type, string value)
    {
        // Generate code to convert a JsonElement value to a given type using switch expression and JsonElement.Get methods
        return type switch
        {
            INamedTypeSymbol namedType when namedType.SpecialType == SpecialType.System_String => $"{value}.GetString()",
            INamedTypeSymbol namedType when namedType.SpecialType == SpecialType.System_Int32 => $"{value}.GetInt32()",
            INamedTypeSymbol namedType when namedType.SpecialType == SpecialType.System_Double => $"{value}.GetDouble()",
            INamedTypeSymbol namedType when namedType.SpecialType == SpecialType.System_Boolean => $"{value}.GetBoolean()",
            INamedTypeSymbol namedType when namedType.SpecialType == SpecialType.System_DateTime => $"{value}.GetDateTime()",
            _ => throw new NotSupportedException($"Unsupported type: {type}")
        };
    }

This code gets the properties of the classes that have the [Serializable] attribute and generates a new static class with two methods, ToJson and FromJson, that serialize and deserialize the class. For the Person class, it will generate this code:

using System.Text.Json;

public static class PersonExtensions
{
    public static string ToJson(this Person person)
    {
        return $"{{\"FirstName\":\"{person.FirstName}\",\"LastName\":\"{person.LastName}\",\"Age\":\"{person.Age}\"}}";
    }
        
    public static Person FromJson(string json)
    {
        using var document = JsonDocument.Parse(json) ;
        var root = document.RootElement;

        var person = new Person();

        if (root.TryGetProperty("FirstName", out var FirstNameElement))
        {
            person.FirstName = FirstNameElement.GetString();
        }

        if (root.TryGetProperty("LastName", out var LastNameElement))
        {
            person.LastName = LastNameElement.GetString();
        }

        if (root.TryGetProperty("Age", out var AgeElement))
        {
            person.Age = AgeElement.GetInt32();
        }

        return person;
    }
}

As you can see, the generated code is specific to the Person class and doesn't make use of reflection. To test this code, we can create a new project, JsonTest, and add the generator project as a reference:

cd ..
dotnet new console -o JsonTest
cd JsonTest
dotnet add reference ..\JsonGenerator\JsonGenerator.csproj
code .

And then change the csproj file to indicate that the referenced project is an analyzer:

<Project Sdk="Microsoft.NET.Sdk">

  <ItemGroup>
    <ProjectReference Include="..\JsonGenerator\JsonGenerator.csproj" 
                      OutputItemType="Analyzer"
                      ReferenceOutputAssembly="false"/>
  </ItemGroup>

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net7.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

</Project>

The code for the tester program is:

var person = new Person
{
    FirstName = "John",
    LastName = "Doe",
    Age = 20
};
Console.WriteLine(person.ToJson());

var person1 = PersonExtensions.FromJson("{ \"FirstName\": \"Mary\", \"LastName\": \"Jane\", \"Age\": 30}");
Console.WriteLine(person1.ToJson());

[Serializable]
public class Person
{
    public string? FirstName { get; set; }
    public string? LastName { get; set; }
    public int Age { get; set; }
}

Once you run the code, you will get something like this:
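The output shows the serialized person and the round-tripped instance created by FromJson:

```
{"FirstName":"John","LastName":"Doe","Age":"20"}
{"FirstName":"Mary","LastName":"Jane","Age":"30"}
```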

As you can see, the generator created the class, which was compiled together with our code, and we can use it the same way as if we had written it ourselves. This saves us from writing boilerplate code and avoids the use of reflection.

The full source code for the article is at https://github.com/bsonnino/SourceGenerators

Introduction

Some time ago, I wrote this article about FeatherHTTP. Things have evolved since then and, in .NET 6, Microsoft introduced a new approach called "Minimal APIs", which simplifies the process of building RESTful APIs. This article will guide you through the process of creating RESTful APIs using Minimal APIs in ASP.NET, highlighting their simplicity and efficiency.

What are Minimal APIs?

Minimal APIs in ASP.NET are a lightweight approach to building APIs with minimal ceremony. They allow you to define routes and endpoints using a concise syntax, reducing the amount of boilerplate code traditionally required. This approach makes it easier to build APIs quickly and with a smaller code footprint.

Setting up the Project

To get started, make sure you have .NET 6 or a higher version installed. Create a new ASP.NET project using the web template. Open a terminal window and execute the following command:

dotnet new web -o MinimalApi

This will create a new ASP.NET project named MinimalApi using the Minimal API template.

If you open the code, you will see that the generated project is very simple:

It contains the project file, the main file and two files with the settings. If you take a look at Program.cs, you will see something like this:

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

app.MapGet("/", () => "Hello World!");

app.Run();

When you run it, you will get a very simple service that answers with "Hello World!":

Defining Routes and Endpoints

In Minimal APIs, you define routes and endpoints using lambda expressions and the app.Map* methods provided by the WebApplication class. We will modify this project to create the weather forecast service, like we did in the previous article. For that, change the code in Program.cs to:

using System.Text.Json;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

var Summaries = new[]
{
   "Freezing", "Bracing", "Chilly", "Cool", "Mild", "Warm", "Balmy", "Hot", "Sweltering", "Scorching"
};

var rnd = new System.Random();
app.MapGet("/", () => Enumerable.Range(1, 5)
        .Select(index => new WeatherForecast(DateTime.Now.AddDays(index), 
            rnd.Next(-20, 55), 
            Summaries[rnd.Next(Summaries.Length)])));

app.Run();

public record WeatherForecast(DateTime Date, int TemperatureC, string Summary)
{
    public int TemperatureF => 32 + (int)(TemperatureC / 0.5556);
}

When you run it and call the server, you will get a json array of temperatures:

Now, we can add Swagger to the API. To do this, we have to add the package Swashbuckle.AspNetCore to the project:

dotnet add package Swashbuckle.AspNetCore

Open the Program.cs file and modify the program as follows:

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();

var app = builder.Build();

var Summaries = new[]
{
   "Freezing", "Bracing", "Chilly", "Cool", "Mild", "Warm", "Balmy", "Hot", "Sweltering", "Scorching"
};

app.UseSwagger();

var rnd = new System.Random();
app.MapGet("/", () => Enumerable.Range(1, 5)
        .Select(index => new WeatherForecast(DateTime.Now.AddDays(index), 
            rnd.Next(-20, 55), 
            Summaries[rnd.Next(Summaries.Length)])));

app.UseSwaggerUI();

app.Run();

When you run the program and go to https://localhost:7035/swagger (the port may be different), you will get the Swagger page:

In this example, we use the MapGet method to define a route for the GET HTTP method at the path "/". When a request is made to this route, the lambda expression is executed, which returns 5 generated weather forecasts.

This is a simple example, but we will see that Minimal APIs can also handle more complicated requests. We'll create a server that uses EF Core with a SQLite database to manage a table of customers. It will be able to create, update or delete customers using different endpoints.

Implementing the CRUD Service with EF Core

To start our project, create a new web project and add the Swashbuckle.AspNetCore, Microsoft.EntityFrameworkCore.Sqlite and Microsoft.EntityFrameworkCore.Tools packages:

dotnet new web -o CustomerService
cd CustomerService
dotnet add package Swashbuckle.AspNetCore
dotnet add package Microsoft.EntityFrameworkCore.Sqlite
dotnet add package Microsoft.EntityFrameworkCore.Tools

Now we will set up the infrastructure for our service by adding a new file named Customer.cs with this code:

using System.Text.Json;
using Microsoft.EntityFrameworkCore;

public record Customer(string Id, string Name, string Address, string City, string Country, string Phone);

public class CustomerDbContext : DbContext
{
    public CustomerDbContext(DbContextOptions options) : base(options)
    {
    }

    public DbSet<Customer> Customer { get; set; }

    override protected void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
        => optionsBuilder.UseSqlite("Data Source=customer.db");
}

The Customer class represents a customer entity. It has six properties that correspond to the customer’s id, name, address, city, country, and phone number. The CustomerDbContext class is a database context class that provides access to the Customer entities in the database. It inherits from the DbContext class provided by the Entity Framework Core library. We are overriding the OnConfiguring method to use the SQLite database named customer.db.

The DbSet property in the CustomerDbContext class represents a collection of Customer entities in the database. It allows developers to query, insert, update, and delete Customer entities using LINQ queries or Entity Framework Core APIs.

With that in place, we can add the first endpoint, which retrieves all customers. We will also add Swagger to the project, to allow us to test the service:

using Microsoft.EntityFrameworkCore;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();
builder.Services.AddDbContext<CustomerDbContext>();

var app = builder.Build();

app.UseSwagger();
app.UseSwaggerUI();

// Ensure database is created during application startup
using (var scope = app.Services.CreateScope())
{
    var dbContext = scope.ServiceProvider.GetRequiredService<CustomerDbContext>();
    await dbContext.Database.EnsureCreatedAsync();
}

app.MapGet("/customers", async (CustomerDbContext context) =>
{
    return await context.Customer.ToListAsync();
});

app.MapGet("/customers/{id}", async (string id, CustomerDbContext context) =>
{
    var customer = await context.Customer.FindAsync(id);
    if (customer == null)
    {
        return Results.NotFound();
    }

    return Results.Ok(customer);
});

app.Run();

We add the Swagger services as before, and then register the DbContext so EF Core can be injected into the endpoints.

After that, we get the DbContext from the service provider and call EnsureCreatedAsync, so the database and its tables are created at startup, before any request is served.

Then, we define two endpoints, /customers and /customers/{id}, to retrieve all customers and a single customer by id.

To retrieve a customer by id, we use the Find method, which looks up the record by its primary key.

When you run the program and go to the swagger endpoint, you will see something like this:

Seeding data

If you try the customers endpoint, you will see that it doesn’t return anything, as expected: we didn’t add any customers to the database. We need to seed it with some data, and we will use a json file for the initial seed. Create a json file, name it customer.json (the same name the seeding code reads later), and add the following content to it:

[
    {
      "Id": "BLAUS",
      "Name": "Blauer See Delikatessen",
      "Address": "Forsterstr. 57",
      "City": "Mannheim",
      "Country": "Germany",
      "Phone": "0621-08460"
    },
    {
      "Id": "BOLID",
      "Name": "Bólido Comidas preparadas",
      "Address": "C/ Araquil, 67",
      "City": "Madrid",
      "Country": "Spain",
      "Phone": "(91) 555 22 82"
    },
    {
      "Id": "CHOPS",
      "Name": "Chop-suey Chinese",
      "Address": "Hauptstr. 31",
      "City": "Bern",
      "Country": "Switzerland",
      "Phone": "0452-076545"
    },
    {
      "Id": "FOLKO",
      "Name": "Folk och fä HB",
      "Address": "Åkergatan 24",
      "City": "Bräcke",
      "Country": "Sweden",
      "Phone": "0695-34 67 21"
    },
    {
      "Id": "FRANR",
      "Name": "France restauration",
      "Address": "54, rue Royale",
      "City": "Nantes",
      "Country": "France",
      "Phone": "40.32.21.21"
    },
    {
      "Id": "FURIB",
      "Name": "Furia Bacalhau e Frutos do Mar",
      "Address": "Jardim das rosas n. 32",
      "City": "Lisboa",
      "Country": "Portugal",
      "Phone": "(1) 354-2534"
    },
    {
      "Id": "GOURL",
      "Name": "Gourmet Lanchonetes",
      "Address": "Av. Brasil, 442",
      "City": "Campinas",
      "Country": "Brazil",
      "Phone": "(11) 555-9482"
    },
    {
      "Id": "HANAR",
      "Name": "Hanari Carnes",
      "Address": "Rua do Paço, 67",
      "City": "Rio de Janeiro",
      "Country": "Brazil",
      "Phone": "(21) 555-0091"
    },
    {
      "Id": "ISLAT",
      "Name": "Island Trading",
      "Address": "Garden House Crowther Way",
      "City": "Cowes",
      "Country": "UK",
      "Phone": "(198) 555-8888"
    },
    {
      "Id": "KOENE",
      "Name": "Königlich Essen",
      "Address": "Maubelstr. 90",
      "City": "Brandenburg",
      "Country": "Germany",
      "Phone": "0555-09876"
    },
    {
      "Id": "LAUGB",
      "Name": "Laughing Bacchus Wine Cellars",
      "Address": "2319 Elm St.",
      "City": "Vancouver",
      "Country": "Canada",
      "Phone": "(604) 555-3392"
    },
    {
      "Id": "LILAS",
      "Name": "LILA-Supermercado",
      "Address": "Carrera 52 con Ave. Bolívar #65-98 Llano Largo",
      "City": "Barquisimeto",
      "Country": "Venezuela",
      "Phone": "(9) 331-6954"
    },
    {
      "Id": "LINOD",
      "Name": "LINO-Delicateses",
      "Address": "Ave. 5 de Mayo Porlamar",
      "City": "I. de Margarita",
      "Country": "Venezuela",
      "Phone": "(8) 34-56-12"
    },
    {
      "Id": "MAGAA",
      "Name": "Magazzini Alimentari Riuniti",
      "Address": "Via Ludovico il Moro 22",
      "City": "Bergamo",
      "Country": "Italy",
      "Phone": "035-640230"
    },
    {
      "Id": "MEREP",
      "Name": "Mère Paillarde",
      "Address": "43 rue St. Laurent",
      "City": "Montréal",
      "Country": "Canada",
      "Phone": "(514) 555-8054"
    },
    {
      "Id": "OTTIK",
      "Name": "Ottilies Käseladen",
      "Address": "Mehrheimerstr. 369",
      "City": "Köln",
      "Country": "Germany",
      "Phone": "0221-0644327"
    },
    {
      "Id": "PERIC",
      "Name": "Pericles Comidas clásicas",
      "Address": "Calle Dr. Jorge Cash 321",
      "City": "México D.F.",
      "Country": "Mexico",
      "Phone": "(5) 552-3745"
    },
    {
      "Id": "QUEDE",
      "Name": "Que Delícia",
      "Address": "Rua da Panificadora, 12",
      "City": "Rio de Janeiro",
      "Country": "Brazil",
      "Phone": "(21) 555-4252"
    },
    {
      "Id": "RANCH",
      "Name": "Rancho grande",
      "Address": "Av. del Libertador 900",
      "City": "Buenos Aires",
      "Country": "Argentina",
      "Phone": "(1) 123-5555"
    },
    {
      "Id": "RICAR",
      "Name": "Ricardo Adocicados",
      "Address": "Av. Copacabana, 267",
      "City": "Rio de Janeiro",
      "Country": "Brazil",
      "Phone": "(21) 555-3412"
    },
    {
      "Id": "SIMOB",
      "Name": "Simons bistro",
      "Address": "Vinbæltet 34",
      "City": "Kobenhavn",
      "Country": "Denmark",
      "Phone": "31 12 34 56"
    }
]

To seed the data, we will override another DbContext method, OnModelCreating and add the following:

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    var customersJson = File.ReadAllText("customer.json");
    var customers = JsonSerializer.Deserialize<List<Customer>>(customersJson);
    if (customers is null) return;
    var builder = modelBuilder.Entity<Customer>();
    builder.HasKey(c => c.Id);
    builder.Property(x => x.Id).HasColumnType("TEXT COLLATE NOCASE");
    builder.HasData(customers);
}

The program reads the customer.json file and creates a list of customers. Then we use the modelBuilder object to configure the Customer entity: the HasKey method specifies that the Id property is the primary key, and the HasData method seeds the entity with the data from the customers list. We also set the Id column to be case-insensitive (COLLATE NOCASE), so searches work regardless of casing.
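The deserialization step can be tried in isolation. The sketch below uses an inline JSON snippet standing in for the real customer.json file, but parses it the same way:

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

public record Customer(string Id, string Name, string Address,
                       string City, string Country, string Phone);

public static class SeedDemo
{
    public static void Main()
    {
        // Inline stand-in for the contents of customer.json.
        var json = """
        [
          { "Id": "BLAUS", "Name": "Blauer See Delikatessen",
            "Address": "Forsterstr. 57", "City": "Mannheim",
            "Country": "Germany", "Phone": "0621-08460" }
        ]
        """;

        // Positional records deserialize through their primary constructor.
        var customers = JsonSerializer.Deserialize<List<Customer>>(json);
        Console.WriteLine(customers![0].Id);      // BLAUS
        Console.WriteLine(customers[0].Country);  // Germany
    }
}
```

Because the JSON property names match the record parameters exactly, no JsonSerializerOptions are needed here.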

Now we have some data to show, and both methods should work:

The next step is to create the methods for the other operations:

app.MapPost("/customers", async (Customer customer, CustomerDbContext context) =>
{
    context.Customer.Add(customer);
    await context.SaveChangesAsync();
    return Results.Created($"/customers/{customer.Id}", customer);
});

app.MapPut("/customers/{id}", async (string id, Customer customer, CustomerDbContext context) =>
{
    var currentCustomer = await context.Customer.FindAsync(id);
    if (currentCustomer == null)
    {
        return Results.NotFound();
    }

    context.Entry(currentCustomer).CurrentValues.SetValues(customer);
    await context.SaveChangesAsync();
    return Results.NoContent();
});

app.MapDelete("/customers/{id}", async (string id, CustomerDbContext context) =>
{
    var currentCustomer = await context.Customer.FindAsync(id);

    if (currentCustomer == null)
    {
        return Results.NotFound();
    }

    context.Customer.Remove(currentCustomer);
    await context.SaveChangesAsync();
    return Results.NoContent();
});

These three methods will allow us to create, update and delete customers from the database. If you run the code, you will see that the operations appear in the swagger page and all operations work fine:

Adding Logging

We have a functional service, but we can improve it in many ways, such as adding logging:

using Microsoft.Extensions.Logging;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();
builder.Services.AddDbContext<CustomerDbContext>();

var app = builder.Build();

var logger = LoggerFactory.Create(config =>
    {
        config.AddConsole();
    }).CreateLogger("CustomerApi");

... 

app.MapGet("/customers", async (CustomerDbContext context) =>
{
    logger.LogInformation("Getting customers...");

    var customers = await context.Customer.ToListAsync();

    logger.LogInformation("Retrieved {Count} customers", customers.Count);

    return Results.Ok(customers);
});

To do this, you must add the Microsoft.Extensions.Logging namespace to the program (the package itself ships with ASP.NET Core, so no extra install is needed). Then, you create the logger variable, which you can use throughout the application. If you run the service, you will see this in the console window:

Other improvements

You can add other improvements, like Authorization, Caching or even Controllers to the application. For a quick reference guide on this kind of application, you can take a look at https://learn.microsoft.com/en-us/aspnet/core/fundamentals/minimal-apis?view=aspnetcore-7.0.

Conclusion

As you can see, Minimal APIs in ASP.NET provide a lightweight and simplified approach to building RESTful APIs with minimal ceremony. This article introduced the concept of Minimal APIs and demonstrated how to create RESTful APIs using Minimal APIs in ASP.NET Core.

The article started by explaining the benefits of Minimal APIs, highlighting their concise syntax and reduced boilerplate code, which allows for faster API development and a smaller code footprint. It then provided a step-by-step guide on setting up a new ASP.NET project using the Minimal API template and creating a basic API endpoint that returns weather forecast data. Then, we expanded on the capabilities of Minimal APIs by integrating Swagger, a popular tool for API documentation and testing, providing a user-friendly interface for exploring and testing the API endpoints.

Additionally, the article demonstrated how to implement a more advanced scenario using Minimal APIs and Entity Framework Core. It guided readers through the process of setting up a customer service API that performs CRUD operations on a SQLite database using EF Core, using and seeding the DbContext and defining endpoints for performing the operations on the database.

Overall, Minimal APIs in ASP.NET offer a streamlined approach to building RESTful APIs, allowing developers to quickly create APIs with less boilerplate code. With their simplicity, efficiency, and extensibility, Minimal APIs in ASP.NET Core provide a compelling option for developers looking to build modern and scalable APIs.

The code for this article is located at https://github.com/bsonnino/CustomerService

Some time ago I wrote a blog post on Creating OpenXml files with Delphi. It showed how to open and create Word files using Delphi by using only a Zip and an Xml component.

There, I showed the structure of an OpenXml file: a zip file with a folder structure and a bunch of xml files:

In this blog post, we will explore the process of opening, inspecting, and creating OpenXml files using C#. We’ll also discuss the benefits of using the OpenXml SDK for high-level file manipulation.

Inspecting an OpenXml file

To open this kind of file in C#, we can use the Packaging API. This API allows us to access and analyze the file’s parts without directly dealing with the complexities of zip archives and xml files. As the example shows, we can get the parts of the file with this code:

using System.IO.Packaging;

using var package = Package.Open(fileName, FileMode.Open, FileAccess.Read);
package.GetParts().Select(p => p.ContentType).OrderBy(p => p).ToList().ForEach(p => Console.WriteLine(p));

You have to add the package System.IO.Packaging to the project. Once you do that, you will be able to retrieve and display the parts of the OpenXml file. The output provides insights into the file’s structure: the main part being "application/vnd.openxmlformats-officedocument.wordprocessingml.document.main+xml" indicates we are analyzing a Word file.

To retrieve the relationships within the file, you can use the code snippet below:

using System.IO.Packaging;

using var package = Package.Open(fileName, FileMode.Open, FileAccess.Read);
package.GetRelationships().ToList()
    .ForEach(r => Console.WriteLine(
        $"{r.Id} - {r.SourceUri} - {r.TargetUri} - {r.TargetMode} - {r.RelationshipType}")); 

Additionally, you can obtain the main and properties files using the code below:

using System.IO.Packaging;

using var package = Package.Open(fileName, FileMode.Open, FileAccess.Read);
package.GetParts().Where(p => p.ContentType.Contains("main+xml")).ToList()
    .ForEach(p => Console.WriteLine(p.Uri));
package.GetParts().Where(p => p.ContentType.Contains("core-properties")).ToList()
    .ForEach(p => Console.WriteLine(p.Uri));
package.GetParts().Where(p => p.ContentType.Contains("extended-properties")).ToList()
    .ForEach(p => Console.WriteLine(p.Uri));

Once we have done that, we can open the file and get its contents, such as core properties and extended properties.

using System.IO.Packaging;
using System.Xml;

using var package = Package.Open(fileName, FileMode.Open, FileAccess.Read);
package.GetParts().Where(p => p.ContentType.Contains("core-properties")).ToList()
    .ForEach(p => WriteXmlToConsole(package.GetPart(p.Uri).GetStream()));
package.GetParts().Where(p => p.ContentType.Contains("extended-properties")).ToList()
    .ForEach(p => WriteXmlToConsole(package.GetPart(p.Uri).GetStream()));


// Write formatted XML to console
void WriteXmlToConsole(Stream stream)
{
    var doc = new XmlDocument();
    doc.Load(stream);
    var settings = new XmlWriterSettings
    {
        Indent = true,
        IndentChars = "  ",
        NewLineChars = "\r\n",
        NewLineHandling = NewLineHandling.Replace
    };
    using var writer = XmlWriter.Create(Console.OpenStandardOutput(), settings);
    doc.Save(writer);
}

Creating an OpenXml file

Instead of using the low-level Packaging API, we can leverage the OpenXml SDK, which provides a higher-level API for OpenXml file manipulation. You can find the source code and documentation at https://github.com/OfficeDev/Open-XML-SDK.

With the OpenXML SDK, you don’t need to manipulate packages, relationships, or properties directly. Instead, you have new classes that allow you to work with Office files directly. For example, you have WordprocessingDocument, SpreadsheetDocument, and PresentationDocument classes for working with documents, spreadsheets, or presentations. The following code snippet creates a Word file with a text sentence:

using System.Xml;
using DocumentFormat.OpenXml;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Wordprocessing;

void CreateDoc(string filepath, string message)
{
    using (WordprocessingDocument doc = WordprocessingDocument.Create(filepath, WordprocessingDocumentType.Document))
    {
        MainDocumentPart mainPart = doc.AddMainDocumentPart();

        mainPart.Document = new Document();
        Body body = mainPart.Document.AppendChild(new Body());
        Paragraph para = body.AppendChild(new Paragraph());
        Run run = para.AppendChild(new Run());
        run.AppendChild(new Text(message));
        para.AppendChild(new Run());
    }
}

You need to add the DocumentFormat.OpenXml package to the project.

If you call it with CreateDoc("hello.docx", "Hello World!"), you will get:

Adding more information to the file

While the previous example created a simple file, you can extend the code to include more complex information. As an illustration, let’s create a file that lists all the fonts installed on the system:

void CreateDocWithAllFonts(string filepath)
{
    using (WordprocessingDocument doc = WordprocessingDocument.Create(filepath, WordprocessingDocumentType.Document))
    {
        MainDocumentPart mainPart = doc.AddMainDocumentPart();

        mainPart.Document = new Document();
        Body body = mainPart.Document.AppendChild(new Body());

        // Get all fonts available in the system
        var fonts = System.Drawing.FontFamily.Families.Select(f => f.Name).ToList();

        foreach (var font in fonts)
        {
            Paragraph para = body.AppendChild(new Paragraph());
            Run run = para.AppendChild(new Run());
            run.AppendChild(new Text(font));
            run.RunProperties = new RunProperties(new RunFonts() { Ascii = font });
            para.AppendChild(new Run());
        }
    }
}

To execute this code, you’ll need to add the System.Drawing.Common package to your project. The resulting file will contain a list of all the installed fonts.

Conclusions

As you can see, manipulating OpenXml files in C# is made easier with the OpenXml SDK. By using the provided classes, you can work directly with the file’s data without worrying about its internal structure. However, if you require more granular control, you can still utilize the low-level Packaging API to work with the raw data. This flexibility enables effortless creation and modification of OpenXml files while maintaining the ability to handle specific details when needed.

All the source code for this project is at https://github.com/bsonnino/OpenXmlCSharp

One advantage of the new .NET Core (now .NET) apps is that they are cross platform. Once you create an app, the same code will run on Windows, Mac or Linux with no change. But there are some exceptions to this: WPF or WinForms apps are only for Windows. You cannot create a WPF app and run it on Linux, for example.

If you want to create a cross-platform app, you will have to use another technology, like Blazor or MAUI. Blazor UI is based on Razor components and is not compatible with XAML, so it may be very difficult to convert. MAUI, on the other hand, uses XAML and can be used to port your app to Mac, iOS, or Android. But MAUI apps don’t run on Linux, so if you want a Linux app, this is not the way. You can use Uno Platform (see my article here), which runs on the Web (as WebAssembly), Linux, Mac, or Windows, or you also have the option of using Avalonia UI.

In this article we will show how to convert an existing WPF app to Avalonia UI. We will use the project described in my MVVM Community toolkit 8.0 article. This project has some interesting features to explore:

  • It uses a .NET Standard library for the repository
  • It has a Datagrid to display the data
  • It uses the MVVM pattern and the MVVM Community toolkit NuGet package

Avalonia UI is an open source cross platform framework for .NET to develop cross platform apps using XAML. To use it, you need to install the Avalonia project templates with:

dotnet new install Avalonia.Templates

Once you do it, you can create a new basic project with:

dotnet new avalonia.app -o BasicApp
cd BasicApp
dotnet run

When you run it, you will see a basic app:

The generated code has these differences:

  • The extension for the view files is axaml instead of xaml (and the code behind extension is axaml.cs)
  • The default namespace for the view files is https://github.com/avaloniaui instead of http://schemas.microsoft.com/winfx/2006/xaml/presentation
  • The project includes the Avalonia, Avalonia.Desktop, Avalonia.Diagnostics and XamlNameReferenceGenerator NuGet packages
  • The project targets net6.0 (or net7.0), instead of net6.0-windows
  • There is no need to include the UseWPF clause in the project file
  • The Program.cs file and Main method are explicit
  • There is some initialization code in App.axaml.cs

Apart from that, designing the UI and the C# code aren’t much different from the standard WPF app.

For this app, we will use Visual Studio 2022 and the Avalonia Extension. This extension will provide all templates and a designer for the views. If you don’t want to use Visual Studio, you can use VS Code, but you won’t have the visual designer. In Visual Studio, go to Extensions/Manage Extensions and install the Avalonia for Visual Studio 2022 extension:

Let’s start converting our app. We have two approaches here: convert the app in place, changing the files as needed, or create a new basic Avalonia project and add the features incrementally. I prefer the second approach: all the basic infrastructure is already set up, and we can verify that things still run as we add each feature. An in-place conversion is all-or-nothing, and if the result doesn’t run at the end, we may have no clue about what we missed.

The first step is to clone our app from https://github.com/bsonnino/MVVMToolkit8.git. Then, we will create our app in Visual Studio:

That will create a new basic app. If you open MainWindow.axaml, you will see the code and the visual designer:

Let’s start converting our app. The first step is to add the two NuGet packages, CommunityToolkit.Mvvm and Microsoft.Extensions.DependencyInjection.

Then, copy the CustomerLib folder with all files to the folder of the Avalonia solution. We will use this project as is, as it’s a .NET Standard project and it can be used by Avalonia unchanged. In the solution explorer, add an existing project and select the CustomerLib.csproj file. That will add the lib to our solution. In the main project, add a project reference and add the CustomerLib project:

Then, copy the ViewModel folder to the project folder; it will appear in the solution explorer. Open MainViewModel.cs and you will see an error in CollectionViewSource:

That’s because the CollectionViewSource class doesn’t exist in Avalonia and we need to replace it with this code:

private readonly ICustomerRepository _customerRepository;
private Func<Customer, bool> _filter = c => true;
public IEnumerable<Customer> Customers => _customerRepository.Customers.Where(_filter);

[RelayCommand]
private void Search(string textToSearch)
{
    if (!string.IsNullOrWhiteSpace(textToSearch))
        _filter = c => c.Country.ToLower().Contains(textToSearch.ToLower());
    else
        _filter = c => true;
    OnPropertyChanged(nameof(Customers));
}
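The predicate-based filter can be exercised outside the view model. This standalone sketch (with a hypothetical, trimmed-down Customer type) mimics what the SearchCommand handler does:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Trimmed-down stand-in for the real Customer entity.
public record Customer(string Country);

public static class FilterDemo
{
    public static void Main()
    {
        var customers = new List<Customer>
        {
            new("Brazil"), new("Germany"), new("Brazil")
        };

        // Default filter lets every customer through.
        Func<Customer, bool> filter = c => true;
        Console.WriteLine(customers.Count(filter)); // 3

        // "Searching" swaps in a country filter, as the view model does.
        var textToSearch = "bra";
        filter = c => c.Country.ToLower().Contains(textToSearch.ToLower());
        Console.WriteLine(customers.Count(filter)); // 2
    }
}
```

Raising OnPropertyChanged(nameof(Customers)) after swapping the predicate is what makes the bound DataGrid re-evaluate the filtered sequence.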

Instead of using the WPF CollectionViewSource class, we create our own filter and apply it before exposing the data. As a check, we can copy the Test project to the solution folder, add it to the current solution, and run the tests. For that, we must make the following changes:

  • Change the Target Framework in the csproj file to net6.0
  • Change the reference for the main project to MvvmAvalonia

Once we do that, we can compile the project, but we get errors wherever the tests rely on CollectionViewSource. To fix them, we must change the tests to:

[TestMethod]
public void SearchCommand_WithText_ShouldSetFilter()
{
    var customers = new List<Customer>
    {
        new Customer { Country = "a"},
        new Customer { Country = "text"},
        new Customer { Country = "b"},
        new Customer { Country = "texta"},
        new Customer { Country = "a"},
        new Customer { Country = "b"},
    };
    var repository = A.Fake<ICustomerRepository>();
    A.CallTo(() => repository.Customers).Returns(customers);
    var vm = new MainViewModel(repository);
    vm.SearchCommand.Execute("text");
    vm.Customers.Count().Should().Be(2);
}

[TestMethod]
public void SearchCommand_WithoutText_ShouldSetFilter()
{
    var customers = new List<Customer>
    {
        new Customer { Country = "a"},
        new Customer { Country = "text"},
        new Customer { Country = "b"},
        new Customer { Country = "texta"},
        new Customer { Country = "a"},
        new Customer { Country = "b"},
    };
    var repository = A.Fake<ICustomerRepository>();
    A.CallTo(() => repository.Customers).Returns(customers);
    var vm = new MainViewModel(repository);
    vm.SearchCommand.Execute("");
    vm.Customers.Count().Should().Be(6);
}

Now, when we run the tests, they all pass and we can continue. We will start adding the UI to the main window:

<Grid>
	<Grid.RowDefinitions>
		<RowDefinition Height="40" />
		<RowDefinition Height="*" />
		<RowDefinition Height="2*" />
		<RowDefinition Height="50" />
	</Grid.RowDefinitions>
	<StackPanel Orientation="Horizontal">
		<TextBlock Text="Country" VerticalAlignment="Center" Margin="5"/>
		<TextBox x:Name="searchText" VerticalAlignment="Center" Margin="5,3" Width="250" Height="25" VerticalContentAlignment="Center"/>
		<Button x:Name="PesqBtn" Content="Find" Width="75" Height="25" Margin="10,5" VerticalAlignment="Center"
                Command="{Binding SearchCommand}" CommandParameter="{Binding ElementName=searchText,Path=Text}"/>
	</StackPanel>
	<DataGrid AutoGenerateColumns="False" x:Name="master" CanUserAddRows="False" CanUserDeleteRows="True" Grid.Row="1"
              ItemsSource="{Binding Customers}" SelectedItem="{Binding SelectedCustomer, Mode=TwoWay}">
		<DataGrid.Columns>
			<DataGridTextColumn x:Name="customerIDColumn" Binding="{Binding Path=CustomerId}" Header="Customer ID" Width="60" />
			<DataGridTextColumn x:Name="companyNameColumn" Binding="{Binding Path=CompanyName}" Header="Company Name" Width="160" />
			<DataGridTextColumn x:Name="contactNameColumn" Binding="{Binding Path=ContactName}" Header="Contact Name" Width="160" />
			<DataGridTextColumn x:Name="contactTitleColumn" Binding="{Binding Path=ContactTitle}" Header="Contact Title" Width="60" />
			<DataGridTextColumn x:Name="addressColumn" Binding="{Binding Path=Address}" Header="Address" Width="130" />
			<DataGridTextColumn x:Name="cityColumn" Binding="{Binding Path=City}" Header="City" Width="60" />
			<DataGridTextColumn x:Name="regionColumn" Binding="{Binding Path=Region}" Header="Region" Width="40" />
			<DataGridTextColumn x:Name="postalCodeColumn" Binding="{Binding Path=PostalCode}" Header="Postal Code" Width="50" />
			<DataGridTextColumn x:Name="countryColumn" Binding="{Binding Path=Country}" Header="Country" Width="80" />
			<DataGridTextColumn x:Name="faxColumn" Binding="{Binding Path=Fax}" Header="Fax" Width="100" />
			<DataGridTextColumn x:Name="phoneColumn" Binding="{Binding Path=Phone}" Header="Phone" Width="100" />
		</DataGrid.Columns>
	</DataGrid>
	<customerApp:Detail Grid.Row="2" DataContext="{Binding SelectedCustomer}" Margin="5" x:Name="detail"/>
	<StackPanel Orientation="Horizontal" HorizontalAlignment="Right" Margin="5" Grid.Row="3">
		<Button Width="75" Height="25" Margin="5" Content="Add" Command="{Binding AddCommand}" />
		<Button Width="75" Height="25" Margin="5" Content="Remove" Command="{Binding RemoveCommand}" />
		<Button Width="75" Height="25" Margin="5" Content="Save" Command="{Binding SaveCommand}" />
	</StackPanel>
</Grid>

There is an error with the DataGrid. That’s because we need to add the package Avalonia.Controls.DataGrid. Once we add it, we can see some other errors:

  • The ItemsSource property has been changed to Items
  • The columns don’t support the x:Name attribute, so those attributes should be removed
  • The CanUserAddRows and CanUserDeleteRows properties do not exist and should be removed
  • We should add the themes for the DataGrid in App.axaml:
<Application.Styles>
    <StyleInclude Source="avares://Avalonia.Themes.Default/DefaultTheme.xaml"/>
    <StyleInclude Source="avares://Avalonia.Themes.Default/Accents/BaseLight.xaml"/>
    <StyleInclude Source="avares://Avalonia.Controls.DataGrid/Themes/Default.xaml"/>
</Application.Styles>

We can also see that this code is missing the Detail control. Add to the project a new item of type UserControl (Avalonia) and add the content from the original project:

<Grid>
    <Grid Name="grid1" >
        <Grid.ColumnDefinitions>
            <ColumnDefinition Width="Auto" />
            <ColumnDefinition Width="*" />
        </Grid.ColumnDefinitions>
        <Grid.RowDefinitions>
            <RowDefinition Height="Auto" />
            <RowDefinition Height="Auto" />
            <RowDefinition Height="Auto" />
            <RowDefinition Height="Auto" />
            <RowDefinition Height="Auto" />
            <RowDefinition Height="Auto" />
            <RowDefinition Height="Auto" />
            <RowDefinition Height="Auto" />
            <RowDefinition Height="Auto" />
            <RowDefinition Height="Auto" />
            <RowDefinition Height="Auto" />
        </Grid.RowDefinitions>
        <Label Content="Customer Id:" Grid.Column="0" Grid.Row="0"  Margin="3" VerticalAlignment="Center" />
        <TextBox Grid.Column="1" Grid.Row="0"   Margin="3" Name="customerIdTextBox" Text="{Binding Path=CustomerId, Mode=TwoWay, ValidatesOnExceptions=true, NotifyOnValidationError=true}" VerticalAlignment="Center"  />
        <Label Content="Company Name:" Grid.Column="0" Grid.Row="1"  Margin="3" VerticalAlignment="Center" />
        <TextBox Grid.Column="1" Grid.Row="1"   Margin="3" Name="companyNameTextBox" Text="{Binding Path=CompanyName, Mode=TwoWay, ValidatesOnExceptions=true, NotifyOnValidationError=true}" VerticalAlignment="Center"  />
        <Label Content="Contact Name:" Grid.Column="0" Grid.Row="2"  Margin="3" VerticalAlignment="Center" />
        <TextBox Grid.Column="1" Grid.Row="2"   Margin="3" Name="contactNameTextBox" Text="{Binding Path=ContactName, Mode=TwoWay, ValidatesOnExceptions=true, NotifyOnValidationError=true}" VerticalAlignment="Center"  />
        <Label Content="Contact Title:" Grid.Column="0" Grid.Row="3"  Margin="3" VerticalAlignment="Center" />
        <TextBox Grid.Column="1" Grid.Row="3"   Margin="3" Name="contactTitleTextBox" Text="{Binding Path=ContactTitle, Mode=TwoWay, ValidatesOnExceptions=true, NotifyOnValidationError=true}" VerticalAlignment="Center"  />
        <Label Content="Address:" Grid.Column="0" Grid.Row="4" HorizontalAlignment="Left" Margin="3" VerticalAlignment="Center" />
        <TextBox Grid.Column="1" Grid.Row="4" Margin="3" Name="addressTextBox" Text="{Binding Path=Address, Mode=TwoWay, ValidatesOnExceptions=true, NotifyOnValidationError=true}" VerticalAlignment="Center" />
        <Label Content="City:" Grid.Column="0" Grid.Row="5"  Margin="3" VerticalAlignment="Center" />
        <TextBox Grid.Column="1" Grid.Row="5"   Margin="3" Name="cityTextBox" Text="{Binding Path=City, Mode=TwoWay, ValidatesOnExceptions=true, NotifyOnValidationError=true}" VerticalAlignment="Center"  />
        <Label Content="Postal Code:" Grid.Column="0" Grid.Row="6"  Margin="3" VerticalAlignment="Center" />
        <TextBox Grid.Column="1" Grid.Row="6"   Margin="3" Name="postalCodeTextBox" Text="{Binding Path=PostalCode, Mode=TwoWay, ValidatesOnExceptions=true, NotifyOnValidationError=true}" VerticalAlignment="Center"  />
        <Label Content="Region:" Grid.Column="0" Grid.Row="7"  Margin="3" VerticalAlignment="Center" />
        <TextBox Grid.Column="1" Grid.Row="7"   Margin="3" Name="regionTextBox" Text="{Binding Path=Region, Mode=TwoWay, ValidatesOnExceptions=true, NotifyOnValidationError=true}" VerticalAlignment="Center"  />
        <Label Content="Country:" Grid.Column="0" Grid.Row="8"  Margin="3" VerticalAlignment="Center" />
        <TextBox Grid.Column="1" Grid.Row="8"   Margin="3" Name="countryTextBox" Text="{Binding Path=Country, Mode=TwoWay, ValidatesOnExceptions=true, NotifyOnValidationError=true}" VerticalAlignment="Center"  />
        <Label Content="Phone:" Grid.Column="0" Grid.Row="9"  Margin="3" VerticalAlignment="Center" />
        <TextBox Grid.Column="1" Grid.Row="9"   Margin="3" Name="phoneTextBox" Text="{Binding Path=Phone, Mode=TwoWay, ValidatesOnExceptions=true, NotifyOnValidationError=true}" VerticalAlignment="Center"  />
        <Label Content="Fax:" Grid.Column="0" Grid.Row="10"  Margin="3" VerticalAlignment="Center" />
        <TextBox Grid.Column="1" Grid.Row="10"   Margin="3" Name="faxTextBox" Text="{Binding Path=Fax, Mode=TwoWay, ValidatesOnExceptions=true, NotifyOnValidationError=true}" VerticalAlignment="Center"  />
    </Grid>
</Grid>

We must remove the , ValidatesOnExceptions=true, NotifyOnValidationError=true part from the bindings, as it's not available in Avalonia. Then, we should add the correct namespace declaration in the main XAML file:

xmlns:customerApp="using:MvvmAvalonia"

Once we do that and run the app, we can see the UI (but not the data):

For the data, we must add the configuration for the services, in App.axaml.cs:

public partial class App : Application
{
    public override void Initialize()
    {
        AvaloniaXamlLoader.Load(this);
        Services = ConfigureServices();
    }

    public override void OnFrameworkInitializationCompleted()
    {
        if (ApplicationLifetime is IClassicDesktopStyleApplicationLifetime desktop)
        {
            desktop.MainWindow = new MainWindow();
        }

        base.OnFrameworkInitializationCompleted();
    }

    public new static App Current => (App)Application.Current;

    public IServiceProvider Services { get; private set; }

    private static IServiceProvider ConfigureServices()
    {
        var services = new ServiceCollection();

        services.AddSingleton<ICustomerRepository, CustomerRepository>();
        services.AddSingleton<MainViewModel>();

        return services.BuildServiceProvider();
    }

    public MainViewModel MainVM => Services.GetService<MainViewModel>();
}

Then, we must set the DataContext on MainWindow.axaml.cs:

public MainWindow()
{
    InitializeComponent();
    DataContext = App.Current.MainVM;
}

Now, when we run the app, everything works fine:

We’ve ported our WPF project to Avalonia, and now it’s ready to run on Linux. We’ll use WSLg (Windows Subsystem for Linux GUI) to run the app. Just open a Linux tab in the terminal, cd to the project directory (Windows drives are mounted under /mnt, like /mnt/c) and run the app with dotnet run. You should see something like this:

As you can see, porting a WPF app to Avalonia requires some changes, but most of the code is completely portable. If you want to ease the process, you can move the non-UI code to .NET Standard libraries and use them as-is. We’ve used the DataGrid and the MVVM Community Toolkit with no problems.

All the source code for the project is at https://github.com/bsonnino/MvvmAvalonia

Some time ago, I wrote this article for MSDN Magazine, about Aspect-Oriented Programming and how it can solve cross-cutting concerns in your application, like:

  • Authentication
  • Logging
  • Data audit
  • Data validation
  • Data caching
  • Performance measuring

The article shows how to use the Decorator pattern and the RealProxy class to create a Dynamic Proxy to solve these issues in a simple manner. If you want to have a full introduction on the subject, I suggest that you take a look at the article.
Time has passed, things have changed a lot, and we are now close to the release of .NET 7. The RealProxy class doesn’t exist anymore, as it’s based on .NET Remoting, which was not ported to .NET Core. Fortunately, we still have the System.Reflection.DispatchProxy class, which can solve the problem.

With this class, we can still write proxies that decorate our classes and allow us to implement AOP in our programs. In this article, we will use the DispatchProxy class to create a dynamic proxy that lets us filter which methods are executed and run other actions before and after each method execution.

In the command prompt, create a new console app with:

dotnet new console -o DynamicProxy
cd DynamicProxy
code .

In Program.cs, we will define a Customer record (put it at the end of the code):

record Customer(string Id, string Name, string Address);

Then, add a new Repository.cs file and add an IRepository interface in it:

public interface IRepository<T>
{
    void Add(T entity);
    void Delete(T entity);
    IEnumerable<T> GetAll();
}

The next step is to create the generic class Repository that implements this interface:

public class Repository<T> : IRepository<T> 
{
    private readonly List<T> _entities = new List<T>();
    public void Add(T entity)
    {
        _entities.Add(entity);
        Console.WriteLine("Adding {0}", entity);
    }

    public void Delete(T entity)
    {
        _entities.Remove(entity);
        Console.WriteLine("Deleting {0}", entity);
    }

    public IEnumerable<T> GetAll()
    {
        Console.WriteLine("Getting entities");
        foreach (var entity in _entities)
        {
            Console.WriteLine($"  {entity}");
        }
        return _entities;
    }
}

As you can see, our repository is a simple class that stores the entities in a list and can delete and retrieve them.

With this class created, we can add the code to use it in Program.cs:

Console.WriteLine("***\r\n Begin program\r\n");
var customerRepository = new Repository<Customer>();
var customer = new Customer("1", "John Doe", "1 Main Street");
customerRepository.Add(customer);
customerRepository.GetAll();
customerRepository.Delete(customer);
customerRepository.GetAll();
Console.WriteLine("\r\nEnd program\r\n***");

If you run this program, you will see something like this:

Now, let’s say we want to add logging to this class, with a log entry every time a method is entered and another when it exits. We could do that manually, but it would be cumbersome to add logging before and after every method.

Using the DispatchProxy class, we can implement a proxy that will add logging to any class that implements the IRepository interface. Create a new file RepositoryLoggerProxy.cs and add this code:

using System.Reflection;

class RepositoryLogger<T> : DispatchProxy where T : class
{
    T? _decorated;

    public T? Create(T decorated)
    {
        var proxy = Create<T, RepositoryLogger<T>>() as RepositoryLogger<T>;
        if (proxy != null)
        {
            proxy._decorated = decorated;
        }
        return proxy as T;
    }


    protected override object? Invoke(MethodInfo? methodInfo, object?[]? args)
    {
        if (methodInfo == null)
        {
            return null;
        }

        Log($"Entering {methodInfo.Name}");
        try
        {
            var result = methodInfo.Invoke(_decorated, args);
            Log($"Exiting {methodInfo.Name}");
            return result;
        }
        catch
        {
            Log($"Error {methodInfo.Name}");
            throw;
        }
    }

    private static void Log(string msg)
    {
        Console.ForegroundColor = msg.StartsWith("Entering") ? ConsoleColor.Blue :
            msg.StartsWith("Exiting") ? ConsoleColor.Green : ConsoleColor.Red;
        Console.WriteLine(msg);
        Console.ResetColor();
    }
}

The RepositoryLogger class inherits from DispatchProxy and has a Create method that creates a proxy implementing the interface of the decorated class. When we call the methods of the proxy, they are intercepted by the overridden Invoke method, where we can add the logging before and after executing the real method.

To use this new class, we can use something like:

Console.WriteLine("***\r\n Begin program\r\n");
var customerRepository = new Repository<Customer>();
var customerRepositoryLogger = new RepositoryLogger<IRepository<Customer>>().Create(customerRepository);
if (customerRepositoryLogger == null)
{
    return;
}
var customer = new Customer("1", "John Doe", "1 Main Street");
customerRepositoryLogger.Add(customer);
customerRepositoryLogger.GetAll();
customerRepositoryLogger.Delete(customer);
customerRepositoryLogger.GetAll();
Console.WriteLine("\r\nEnd program\r\n***");

Now, running the code, we get:

We have logging on entering and exiting the methods without having to change the class. Removing the logging is as simple as changing one line of code.

With this knowledge, we can extend our proxy class to perform any action we want. Adding actions before, after, and on error is just a matter of passing them when creating the proxy. We can create a DynamicProxy class with this code:

using System.Reflection;

class DynamicProxy<T> : DispatchProxy where T : class
{
    T? _decorated;
    private Action<MethodInfo>? _beforeExecute;
    private Action<MethodInfo>? _afterExecute;
    private Action<MethodInfo>? _onError;
    private Predicate<MethodInfo> _shouldExecute = _ => true;

    public T? Create(T decorated, Action<MethodInfo>? beforeExecute, 
        Action<MethodInfo>? afterExecute, Action<MethodInfo>? onError, 
        Predicate<MethodInfo>? shouldExecute)
    {
        var proxy = Create<T, DynamicProxy<T>>() as DynamicProxy<T>;
        if (proxy == null)
        {
            return null;
        }
        proxy._decorated = decorated;
        proxy._beforeExecute = beforeExecute;
        proxy._afterExecute = afterExecute;
        proxy._onError = onError;
        proxy._shouldExecute = shouldExecute ?? (s => true);
        return proxy as T;
    }

    protected override object? Invoke(MethodInfo? methodInfo, object?[]? args)
    {
        if (methodInfo == null)
        {
            return null;
        }
        if (!_shouldExecute(methodInfo))
        {
            return null;
        }
        _beforeExecute?.Invoke(methodInfo);
        try
        {
            var result = methodInfo.Invoke(_decorated, args);
            _afterExecute?.Invoke(methodInfo);
            return result;
        }
        catch
        {
            _onError?.Invoke(methodInfo);
            throw;
        }
    }
}

In the Create method, we pass the actions we want to execute before, after, and on error for each method. We can also pass a predicate to filter out the methods we don’t want to execute. To use this new class, we can do something like this:

Console.WriteLine("***\r\n Begin program\r\n");
var customerRepository = new Repository<Customer>();
var customerRepositoryLogger = new DynamicProxy<IRepository<Customer>>().Create(customerRepository,
    s => Log($"Entering {s.Name}"),
    s => Log($"Exiting {s.Name}"),
    s => Log($"Error {s.Name}"),
    s => s.Name != "GetAll");
if (customerRepositoryLogger == null)
{
    return;
}
var customer = new Customer("1", "John Doe", "1 Main Street");
customerRepositoryLogger.Add(customer);
customerRepositoryLogger.GetAll();
customerRepositoryLogger.Delete(customer);
customerRepositoryLogger.GetAll();
Console.WriteLine("\r\nEnd program\r\n***");

static void Log(string msg)
{
    Console.ForegroundColor = msg.StartsWith("Entering") ? ConsoleColor.Blue :
        msg.StartsWith("Exiting") ? ConsoleColor.Green : ConsoleColor.Red;
    Console.WriteLine(msg);
    Console.ResetColor();
}

Executing this code will show something like:

Note that, with this code, the GetAll method isn’t executed, as it was filtered out by the predicate.

As you can see, this is a very powerful class, as it can implement many different aspects for any interface (the DispatchProxy class only works with interfaces). For example, if I want to create my own mocking framework, where no method of the class is actually executed, I can change the code of the Invoke method to

protected override object? Invoke(MethodInfo? methodInfo, object?[]? args)
{
    if (methodInfo == null)
    {
        return null;
    }
    _beforeExecute?.Invoke(methodInfo);
    try
    {
        object? result = null;
        if (_shouldExecute(methodInfo))
        {
            result = methodInfo.Invoke(_decorated, args);
        }
        _afterExecute?.Invoke(methodInfo);
        return result;
    }
    catch
    {
        _onError?.Invoke(methodInfo);
        throw;
    }
}

And create the proxy with something like this:

var customerRepositoryLogger = new DynamicProxy<IRepository<Customer>>().Create(customerRepository,
    s => Log($"Entering {s.Name}"),
    s => Log($"Exiting {s.Name}"),
    s => Log($"Error {s.Name}"),
    s => false);

In this case, the real methods won’t be called, only the before and after actions:

As you can see, the DispatchProxy class allows the creation of powerful classes that add aspects to your existing classes, without having to change them. With the DynamicProxy class you just have to add the actions to execute and the filter for the functions to be executed.

All the source code for this article is at https://github.com/bsonnino/DynamicProxy

When you are developing a new project and need to store settings for it, the first thing that comes to mind is to use the Appsettings.json file. With this file, you can store all settings in a single file and restore them easily.

For example, let’s create a console project that has three settings: Id, Name and Version. In a command line prompt, type:

dotnet new console -o SecretStorage
cd SecretStorage
code .

This will open VS Code in the folder for the new project. Add a new file named appsettings.json and add this code to it:

{
    "AppData": {
        "Id": "IdOfApp",
        "Name": "NameOfApp",
        "Version": "1.0.0.0"
    }
}

We will add an AppData class to the project:

public class AppData
{
    public string Id { get; init; }
    public string Name { get; init; }
    public string Version { get; init; }

    public override string ToString()
    {
        return $"Id: {Id}, Name: {Name}, Version: {Version}";
    }
}

To read the settings into the class, we could use the JsonSerializer class from System.Text.Json:

using System.Text.Json;

var settingsText = File.ReadAllText("appsettings.json");
var settings = JsonSerializer.Deserialize<Settings>(settingsText);
Console.WriteLine(settings?.AppData);

You need to define a class Settings to read the data:

public class Settings
{
    public AppData AppData { get; set; }
}

That’s fine, but let’s say you have different settings for development and production: you would need a copy of appsettings.json with the modified values and some code like this one:

using System.Diagnostics;
using System.Text.Json;

string settingsText; 
if (Debugger.IsAttached)
{
    settingsText = File.ReadAllText("appsettings.development.json");
}
else
{
    settingsText = File.ReadAllText("appsettings.json");
}
var settings = JsonSerializer.Deserialize<Settings>(settingsText);
Console.WriteLine(settings?.AppData);

Things start to become complicated. If only there were a way to simplify this… In fact, there is. .NET provides the ConfigurationBuilder class. With it, you can read and merge several files to get the configuration. The following code merges appsettings.json and appsettings.development.json into a single configuration. In production, all you have to do is remove appsettings.development.json from the package and only the production file will be used.

To use the ConfigurationBuilder class you must add the NuGet packages Microsoft.Extensions.Configuration and Microsoft.Extensions.Configuration.Binder with

dotnet add package Microsoft.Extensions.Configuration
dotnet add package Microsoft.Extensions.Configuration.Binder

You will also have to add the Microsoft.Extensions.Configuration.Json package to use the AddJsonFile extension method.

One other thing you will have to do is tell MSBuild to copy the settings files to the output directory. This is done by adding this clause to the csproj file:

<ItemGroup>
  <Content Include="appsettings*.json">
    <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  </Content>  
</ItemGroup>

Once you do that, this code will read the config files, merge them and print the settings:

IConfigurationRoot config = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json", false)
    .AddJsonFile("appsettings.development.json", true)
    .Build();
var appdata = config.GetSection(nameof(AppData)).Get<AppData>();
Console.WriteLine(appdata);

The second parameter of the AddJsonFile method indicates that appsettings.development.json is optional: if it’s not there, it won’t be read. Another advantage is that you don’t need to duplicate all settings: you just add the overridden settings in the development file and the other ones remain available.
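To see the override behavior in isolation, here is a minimal, self-contained sketch that uses in-memory providers in place of the two JSON files (the keys and values are made up for illustration). Later providers win, key by key, exactly like appsettings.development.json over appsettings.json:

```csharp
using Microsoft.Extensions.Configuration;

// Hypothetical values, just to illustrate the merge.
var baseSettings = new Dictionary<string, string?>
{
    ["AppData:Name"] = "NameOfApp",
    ["AppData:Version"] = "1.0.0.0"
};
var devOverrides = new Dictionary<string, string?>
{
    // only the overridden key is needed here
    ["AppData:Name"] = "NameOfApp (dev)"
};

IConfigurationRoot config = new ConfigurationBuilder()
    .AddInMemoryCollection(baseSettings)
    .AddInMemoryCollection(devOverrides)
    .Build();

Console.WriteLine(config["AppData:Name"]);     // overridden by the second provider
Console.WriteLine(config["AppData:Version"]);  // inherited from the first provider
```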

Now, one more problem: let’s say we are using an API that requires a client Id and a client Secret. These values are very sensitive and they cannot be distributed. If you are using a public code repository, like GitHub, you cannot add something like this to appsettings.json and push your changes:

{
    "AppData": {
        "Id": "IdOfApp",
        "Name": "NameOfApp",
        "Version": "1.0.0.0"
    },
    "ApiData": {
        "ClientId": "ClientIdOfApp",
        "ClientSecret": "ClientSecretOfApp"
    }
}

That would be a real disaster: your API keys would be exposed and you could end up with a massive bill at the end of the month. You could add these keys to appsettings.development.json and add it to the ignored files, so it wouldn’t be uploaded, but there is no guarantee of that: somebody could still upload the file and things would get messy again.

The solution, in this case, is to use the Secret Manager tool. This tool lets you store secrets during development, in a way that they cannot be shared with other users. It doesn’t encrypt any data and must only be used for development purposes. If you want to store secrets in a safe, encrypted way, you should use something like Azure Key Vault.

To use it, you should initialize the storage with

dotnet user-secrets init

This will initialize the storage, generating a GUID for it and adding it to the csproj file:

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net6.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <UserSecretsId>fc572277-3ded-4467-9c46-534a075f905b</UserSecretsId>
  </PropertyGroup>

Then, you need to add the package Microsoft.Extensions.Configuration.UserSecrets:

dotnet add package Microsoft.Extensions.Configuration.UserSecrets

We can now start using the user secrets by adding the new configuration source:

IConfigurationRoot config = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json", false)
    .AddJsonFile("appsettings.development.json", true)
    .AddUserSecrets<Settings>()
    .Build();

Then, we can add the secret data:

dotnet user-secrets set "ApiData:ClientId" "ClientIdOfApp"
dotnet user-secrets set "ApiData:ClientSecret" "ClientSecretOfApp"

As you can see, the data is flattened in order to be added to the user secrets. You can take a look at it by opening an Explorer window and going to %APPDATA%\Microsoft\UserSecrets\{guid}\secrets.json:

{
  "ApiData:ClientId": "ClientIdOfApp",
  "ApiData:ClientSecret": "ClientSecretOfApp"
}

As you can see, nothing here is actually secret: it’s just a way to store the data with no possibility of sharing it in an open repository.
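To see how those flattened names come back together, here is a minimal sketch using an in-memory provider in place of the secrets store, and a hypothetical ApiData class mirroring the JSON above (it needs the Configuration and Binder packages already added):

```csharp
using Microsoft.Extensions.Configuration;

// The flat "Section:Key" names are rebuilt into a hierarchy by the
// configuration system, so they bind to nested classes as usual.
var secrets = new Dictionary<string, string?>
{
    ["ApiData:ClientId"] = "ClientIdOfApp",
    ["ApiData:ClientSecret"] = "ClientSecretOfApp"
};

var config = new ConfigurationBuilder()
    .AddInMemoryCollection(secrets)
    .Build();

var apiData = config.GetSection("ApiData").Get<ApiData>();
Console.WriteLine(apiData?.ClientId);  // prints ClientIdOfApp

public class ApiData
{
    public string? ClientId { get; set; }
    public string? ClientSecret { get; set; }
}
```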

You can get the values stored with

dotnet user-secrets list

To remove a key from the store, pass its full flattened name:

dotnet user-secrets remove "ApiData:ClientId"

And to clear all data, you can use

dotnet user-secrets clear

If you have array data to store, you will have to flatten it in the same way, using the index of each element as part of the name. For example, if you have something like

public class Settings
{
    public AppData AppData { get; set; }
    public ApiData ApiData { get; set; }
    public string[] AllowedHosts { get; set; }
}

You can store the AllowedHosts data with

dotnet user-secrets set "AllowedHosts:0" "microsoft.com"
dotnet user-secrets set "AllowedHosts:1" "google.com"
dotnet user-secrets set "AllowedHosts:2" "amazon.com"
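As a sketch of how indexed keys bind back to an array (again with an in-memory provider standing in for the secrets store):

```csharp
using Microsoft.Extensions.Configuration;

// Indexed keys like "AllowedHosts:0" are treated as array elements
// when binding to a string[].
var flat = new Dictionary<string, string?>
{
    ["AllowedHosts:0"] = "microsoft.com",
    ["AllowedHosts:1"] = "google.com",
    ["AllowedHosts:2"] = "amazon.com"
};

var config = new ConfigurationBuilder()
    .AddInMemoryCollection(flat)
    .Build();

var allowedHosts = config.GetSection("AllowedHosts").Get<string[]>();
Console.WriteLine(string.Join(", ", allowedHosts!));  // microsoft.com, google.com, amazon.com
```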

And you can read the settings with some code like this:

IConfigurationRoot config = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json", false)
    .AddJsonFile("appsettings.development.json", true)
    .AddUserSecrets<Settings>()
    .Build();
var settings = config.Get<Settings>();
foreach (var item in settings.AllowedHosts)
{
    Console.WriteLine(item);
}
Console.WriteLine(settings.AppData);
Console.WriteLine(settings.ApiData);

As you can see, if you need to keep your development data from being uploaded to a public repository, you can use the user secrets the same way you would use a JSON file. This greatly simplifies the storage of configuration files and allows every developer to have their own settings.

The full source code for this project is at https://github.com/bsonnino/SecretStorage

Introduction

I am a long time user of Gmail, and I usually don’t delete any email, I just archive the emails after reading and processing them, to keep my inbox clean.

Last week, I got a notice from Gmail that I was reaching the 15GB free limit and, in order to continue receiving emails, I should either buy some extra storage or clean my mail archive.

I know that I store a lot of garbage there, so I decided to clean the archive: the first step was to delete some old newsletters, and some junk email, but this didn’t even scratch the size of my mailbox (maybe it removed about 200Mb of data).

Then I started to use Gmail’s size filters: if you enter “larger:10M” in the query box, Gmail will show only messages of 10MB or more. This is a great improvement, but there are two gotchas: the messages aren’t sorted by size and you can’t see the size of each message.

That way, you won’t be able to clean your mailbox effectively – it would be a very difficult task to search among your 1500 messages for good candidates to delete. So I decided to bite the bullet and create a C# program to scan my mailbox, list the largest messages and delete some of them. I decided to create a Universal Windows Platform app, so I could use it on both my desktop and Windows Phone with no changes to the app code.

Registering the app with Google

The first step to create the app is to register it with Google, so you can get an app id to use in your app. Go to https://console.developers.google.com/flows/enableapi?apiid=gmail and create a new project. Once you have registered, you must get the credentials, to use in the app.

Figure 1 – Adding credentials

This will create new credentials for your app, which you must download and add to your project. When you download the credentials, you get a file named client_id.json , which you will include in your project. The next step is to create the project.

Creating the project

You must go to Visual Studio and create a new UWP app (blank app).

Figure 2 – Creating a new UWP app

A dialog appears asking for the target version of the app; you can just click OK. Then, you must add the NuGet package Google.Apis.Gmail.v1. This can be done in two ways: in the Solution Explorer, right-click the “References” node, select “Manage NuGet Packages”, search for Gmail and add the Gmail package.

The second way is to open the Package Manager Console window and run the command:

Install-Package Google.Apis.Gmail.v1

Once you have installed the package, you must add the JSON file with the credentials to your project. Right-click the project, select Add/Existing Item and add the client_id.json file. Then, in the Properties window, set Build Action to Content and Copy to Output Directory to Copy always.

Getting User authorization

The first thing you must do in the program is to get the user authorization to access the email. This is done using OAuth2, with this code:

public async Task<UserCredential> GetCredential()
{
    var scopes = new[] { GmailService.Scope.GmailModify };
    var uri = new Uri("ms-appx:///client_id.json");
    _credential = await GoogleWebAuthorizationBroker.AuthorizeAsync(
        uri, scopes, "user", CancellationToken.None);
    return _credential;
}

We call the AuthorizeAsync method of GoogleWebAuthorizationBroker, passing the URI for client_id.json and the scope we want (modify emails). You can call this method in the constructor of MainPage:

public MainPage()
{
    this.InitializeComponent();
    GetCredential();
}

When you run the program, it will open a web page to get the user’s authorization. This procedure doesn’t store any password in the application, making it safe for users: they give their credentials to the Google broker, and the broker sends the app just an authorization token to access the email.

Figure 3 – Web page for authorization

Figure 4 – Authorization consent

If you take a look at Figure 4, you will see that we are not asking for permission to delete mail. We won’t need that permission, because we are just going to move the messages to trash. Then, the user will be able to review the messages and delete them permanently.
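Moving a message to trash uses the Users.Messages.Trash request of the Gmail API. As a sketch only (it assumes the authorized service created later in the article and a message obtained from a list request, so it isn’t runnable on its own):

```csharp
// Sketch: "service" is an authorized GmailService, "message" came from a
// list request; this needs the GmailModify scope we are requesting.
var trashRequest = service.Users.Messages.Trash("me", message.Id);
trashRequest.Execute();  // moves the message to Gmail's trash, not a permanent delete
```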

Getting emails

Once you have the authorization, you can get the emails from the server. In MainPage.xaml, add the UI for the application:

<Grid Background="{ThemeResource ApplicationPageBackgroundThemeBrush}">
    <Grid.RowDefinitions>
        <RowDefinition Height="Auto"/>
        <RowDefinition Height="*"/>
        <RowDefinition Height="Auto"/>
    </Grid.RowDefinitions>
    <Button Content ="Get Messages" Click="GetMessagesClick" 
            HorizontalAlignment="Right" Margin="5" Width="120"/>
    <ListView Grid.Row="1" x:Name="MessagesList" />
    <TextBlock Grid.Row="2" x:Name="CountText" Margin="5"/>
</Grid>

We have a button to get the messages and add them to the ListView. At the bottom, a TextBlock displays the message count. The code for retrieving the messages is:

private void GetMessagesClick(object sender, RoutedEventArgs e)
{
    var service = new GmailService(new BaseClientService.Initializer()
    {
        HttpClientInitializer = _credential,
        ApplicationName = AppName,
    });
    UsersResource.MessagesResource.ListRequest request =
        service.Users.Messages.List("me");
    request.Q = "larger:5M";
    request.MaxResults = 1000;
    IList<Message> messages = request.Execute().Messages;
    MessagesList.ItemsSource = messages;
    CountText.Text = $"{messages.Count} messages";
}

We create a request to get the messages larger than 5MB, returning a maximum of 1000 results. If you have more than 1000 emails larger than 5MB, there’s no guarantee you will get the largest ones, but you can change the query to suit your needs. Then, we query the server and fill the ListView. If you run the app and click the button, you will see something like Figure 5:

Figure 5 – Mail results displayed in the app window

We see only the item type because we didn’t set up an item template. That can be done in MainPage.xaml:

<ListView Grid.Row="1" x:Name="MessagesList" >
    <ListView.ItemTemplate>
        <DataTemplate>
            <StackPanel Margin="5">
                <TextBlock Text="{Binding Id}"/>
                <TextBlock Text="{Binding Snippet}" FontWeight="Bold"/>
                <TextBlock Text="{Binding SizeEstimate}"/>
            </StackPanel>
        </DataTemplate>
    </ListView.ItemTemplate>
</ListView>

Running the app, you will see that the list only shows the Ids of the messages: this first call doesn’t return the full message. To get the message contents, we must make a second call for each message:

private void GetMessagesClick(object sender, RoutedEventArgs e)
{
    var service = new GmailService(new BaseClientService.Initializer()
    {
        HttpClientInitializer = _credential,
        ApplicationName = AppName,
    });

    UsersResource.MessagesResource.ListRequest request =
        service.Users.Messages.List("me");
    request.Q = "larger:5M";
    request.MaxResults = 1000;
    IList<Message> messages = request.Execute().Messages;
    var sizeEstimate = 0L;
    for (int index = 0; index < messages.Count; index++)
    {
        var message = messages[index];
        var getRequest = service.Users.Messages.Get("me", message.Id);
        getRequest.Format =
            UsersResource.MessagesResource.GetRequest.FormatEnum.Metadata;
        getRequest.MetadataHeaders = new Repeatable<string>(
            new[] { "Subject", "Date", "From" });
        messages[index] = getRequest.Execute();
        sizeEstimate += messages[index].SizeEstimate ?? 0;
    }
    MessagesList.ItemsSource = messages.OrderByDescending(m => m.SizeEstimate);
    CountText.Text = $"{messages.Count} messages. Estimated size: {sizeEstimate:n0}";
}

When we get the messages, we limit the data retrieved. The default behavior for the Get request is to retrieve the full message, but that would be overkill: we only get the Subject, Date and From headers for each message. If you run the app, you will get the snippet and the size, but you will also see that the app hangs while retrieving the messages. That’s not good. We must get the messages in the background:

private async void GetMessagesClick(object sender, RoutedEventArgs e)
{
    var service = new GmailService(new BaseClientService.Initializer()
    {
        HttpClientInitializer = _credential,
        ApplicationName = AppName,
    });
    var sizeEstimate = 0L;
    IList<Message> messages = null;
    
    await Task.Run(() =>
    {
        UsersResource.MessagesResource.ListRequest request =
        service.Users.Messages.List("me");
        request.Q = "larger:5M";
        request.MaxResults = 1000;
        messages = request.Execute().Messages;
        
        for (int index = 0; index < messages.Count; index++)
        {
            var message = messages[index];
            var getRequest = service.Users.Messages.Get("me", message.Id);
            getRequest.Format =
                UsersResource.MessagesResource.GetRequest.FormatEnum.Metadata;
            getRequest.MetadataHeaders = new Repeatable<string>(
                new[] { "Subject", "Date", "From" });
            messages[index] = getRequest.Execute();
            sizeEstimate += messages[index].SizeEstimate ?? 0;
        }
    });
    MessagesList.ItemsSource = messages.OrderByDescending(m => m.SizeEstimate);
    CountText.Text = $"{messages.Count} messages. Estimated size: {sizeEstimate:n0}";
}

Now the code doesn’t block the UI, but there’s no indication of what’s happening. Let’s add a progress bar to the UI:

<TextBlock Grid.Row="2" x:Name="CountText" Margin="5"/>
<Border x:Name="BusyBorder" Grid.Row="0" Grid.RowSpan="3" 
        Background="#40000000" Visibility="Collapsed">
    <StackPanel VerticalAlignment="Center" HorizontalAlignment="Center">
        <TextBlock Text="downloading messages" x:Name="OperationText"/>
        <ProgressBar x:Name="ProgressBar" Margin="0,5"/>
        <TextBlock x:Name="DownloadText"  HorizontalAlignment="Center"/>
    </StackPanel>
</Border>

To update the progress bar while downloading, we use this code:

private async void GetMessagesClick(object sender, RoutedEventArgs e)
{
    var service = new GmailService(new BaseClientService.Initializer()
    {
        HttpClientInitializer = _credential,
        ApplicationName = AppName,
    });
    var sizeEstimate = 0L;
    IList<Message> messages = null;
    
    BusyBorder.Visibility = Visibility.Visible;
    await Task.Run(async () =>
    {
        UsersResource.MessagesResource.ListRequest request =
        service.Users.Messages.List("me");
        request.Q = "larger:5M";
        request.MaxResults = 1000;
        messages = request.Execute().Messages;
        await Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
            ProgressBar.Maximum = messages.Count);

        for (int index = 0; index < messages.Count; index++)
        {
            var message = messages[index];
            var getRequest = service.Users.Messages.Get("me", message.Id);
            getRequest.Format =
                UsersResource.MessagesResource.GetRequest.FormatEnum.Metadata;
            getRequest.MetadataHeaders = new Repeatable<string>(
                new[] { "Subject", "Date", "From" });
            messages[index] = getRequest.Execute();
            sizeEstimate += messages[index].SizeEstimate ?? 0;
            
            var index1 = index+1;
            await Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
            {
                ProgressBar.Value = index1;
                DownloadText.Text = $"{index1} of {messages.Count}";
            });
        }
    });
    BusyBorder.Visibility = Visibility.Collapsed;
    MessagesList.ItemsSource = messages.OrderByDescending(m => m.SizeEstimate);
    CountText.Text = $"{messages.Count} messages. Estimated size: {sizeEstimate:n0}";
}

We set the visibility of the busy border to visible before downloading the messages, and update the progress bar as each message arrives. Since the download runs on a background thread, we can't update the progress bar and the text directly; we must use the Dispatcher to update the controls on the UI thread. Now, when we run the code, the busy border is shown and the progress bar is updated with the download count. At the end, we get the messages, with the snippets and sizes.
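As an alternative to calling the Dispatcher from the background thread, a sketch like the following uses IProgress&lt;T&gt;: a Progress&lt;T&gt; created on the UI thread captures the UI SynchronizationContext, so the callback runs on the UI thread without explicit dispatching (the control names mirror the ones used above; the download logic is elided):

```csharp
// Sketch: progress reporting with IProgress<T> instead of Dispatcher.RunAsync.
// Progress<T> captures the SynchronizationContext of the thread where it is
// created (here, the UI thread), so Report marshals the callback automatically.
private async void GetMessagesClick(object sender, RoutedEventArgs e)
{
    var progress = new Progress<int>(done =>
    {
        // This callback runs on the UI thread
        ProgressBar.Value = done;
        DownloadText.Text = $"{done} of {ProgressBar.Maximum}";
    });
    await Task.Run(() => DownloadMessages(progress));
}

private void DownloadMessages(IProgress<int> progress)
{
    for (int index = 0; index < messages.Count; index++)
    {
        // ... fetch one message here, as in the loop above ...
        progress.Report(index + 1);
    }
}
```

This removes the Dispatcher calls from the worker code, at the cost of keeping the progress callback in the UI class.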

Figure 6 – Message results

You can see some problems in this display:

  • The layout is plain and should be improved
  • It doesn’t show the subject, the date, or the sender
  • The snippet is HTML-encoded and should be decoded
  • The size should be formatted

We can fix these issues by creating a new class:

public class EmailMessage
{
    public string Id { get; set; }
    public bool IsSelected { get; set; }
    public string Snippet { get; set; }
    public string SizeEstimate { get; set; }
    public string From { get; set; }
    public string Date { get; set; }
    public string Subject { get; set; }
}

And use it, instead of the Message class:

private async void GetMessagesClick(object sender, RoutedEventArgs e)
{
    var service = new GmailService(new BaseClientService.Initializer()
    {
        HttpClientInitializer = _credential,
        ApplicationName = AppName,
    });
    var sizeEstimate = 0L;
    IList<Message> messages = null;
    var emailMessages = new List<EmailMessage>();
    OperationText.Text = "downloading messages";
    BusyBorder.Visibility = Visibility.Visible;
    await Task.Run(async () =>
    {
        UsersResource.MessagesResource.ListRequest request =
        service.Users.Messages.List("me");
        request.Q = "larger:5M";
        request.MaxResults = 1000;
        messages = request.Execute().Messages;
        await Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
            ProgressBar.Maximum = messages.Count);

        for (int index = 0; index < messages.Count; index++)
        {
            var message = messages[index];
            var getRequest = service.Users.Messages.Get("me", message.Id);
            getRequest.Format =
                UsersResource.MessagesResource.GetRequest.FormatEnum.Metadata;
            getRequest.MetadataHeaders = new Repeatable<string>(
                new[] { "Subject", "Date", "From" });
            messages[index] = getRequest.Execute();
            sizeEstimate += messages[index].SizeEstimate ?? 0;
            emailMessages.Add(new EmailMessage()
            {
                Id = messages[index].Id,
                Snippet = WebUtility.HtmlDecode(messages[index].Snippet),
                SizeEstimate = $"{messages[index].SizeEstimate:n0}",
                From = messages[index].Payload.Headers.FirstOrDefault(h => 
                    h.Name == "From")?.Value,
                Subject = messages[index].Payload.Headers.FirstOrDefault(h => 
                    h.Name == "Subject")?.Value,
                Date = messages[index].Payload.Headers.FirstOrDefault(h => 
                    h.Name == "Date")?.Value,
            });
            var index1 = index+1;
            await Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
            {
                ProgressBar.Value = index1;
                DownloadText.Text = $"{index1} of {messages.Count}";
            });
        }
    });
    BusyBorder.Visibility = Visibility.Collapsed;
    // SizeEstimate is a formatted string here, so parse it to sort numerically
    MessagesList.ItemsSource = new ObservableCollection<EmailMessage>(
        emailMessages.OrderByDescending(m => long.Parse(m.SizeEstimate,
            System.Globalization.NumberStyles.Number)));
    CountText.Text = $"{messages.Count} messages. Estimated size: {sizeEstimate:n0}";
}

With this new code, we can change the item template to show the new data:

<ListView.ItemTemplate>
    <DataTemplate>
        <Grid>
            <Grid.ColumnDefinitions>
                <ColumnDefinition Width="40"/>
                <ColumnDefinition Width="*"/>
            </Grid.ColumnDefinitions>
            <CheckBox HorizontalAlignment="Left" VerticalAlignment="Center" 
                      IsChecked="{Binding IsSelected, Mode=TwoWay}" Margin="5"/>
            <StackPanel Grid.Column="1" Margin="5">
                <StackPanel Orientation="Horizontal">
                    <TextBlock Text="From:" Margin="0,0,5,0"/>
                    <TextBlock Text="{Binding From}"/>
                </StackPanel>
                <StackPanel Orientation="Horizontal">
                    <TextBlock Text="Date:" Margin="0,0,5,0"/>
                    <TextBlock Text="{Binding Date}"/>
                </StackPanel>
                <StackPanel Orientation="Horizontal">
                    <TextBlock Text="Size: " Margin="0,0,5,0"/>
                    <TextBlock Text="{Binding SizeEstimate}"/>
                </StackPanel>
                <TextBlock Text="{Binding Subject}"/>
                <TextBlock Text="{Binding Snippet}" FontWeight="Bold"/>
            </StackPanel>
            <Rectangle Grid.Column="0" Grid.ColumnSpan="2" 
                       HorizontalAlignment="Stretch" VerticalAlignment="Bottom" 
                       Height="1" Fill="Black"/>
        </Grid>
    </DataTemplate>
</ListView.ItemTemplate>

With this code, we get a result like this:

Figure 7 – Message results with more data

We could still improve the performance of the app by making multiple requests for messages at the same time, but I leave this for you.
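The parallel download hinted at above could be sketched like this, issuing the metadata requests in batches with ExecuteAsync and Task.WhenAll. This is only a sketch: it omits progress reporting, the batch size is an arbitrary choice, and the Gmail API's per-user rate limits may require throttling:

```csharp
// Sketch: fetch message metadata in parallel batches.
// Assumes 'service' and 'messages' are the variables from the method above.
const int batchSize = 10;
var detailed = new List<Message>();
for (int start = 0; start < messages.Count; start += batchSize)
{
    // Build one Get request per message in the batch and run them concurrently
    var tasks = messages.Skip(start).Take(batchSize).Select(m =>
    {
        var getRequest = service.Users.Messages.Get("me", m.Id);
        getRequest.Format =
            UsersResource.MessagesResource.GetRequest.FormatEnum.Metadata;
        getRequest.MetadataHeaders = new Repeatable<string>(
            new[] { "Subject", "Date", "From" });
        return getRequest.ExecuteAsync();
    }).ToList();
    detailed.AddRange(await Task.WhenAll(tasks));
}
```

Batching keeps the number of in-flight requests bounded, which matters because the Gmail API enforces per-user quotas.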

Deleting emails from the server

Now, the only task that remains is to delete the emails from the server. For that, we must add another button:

<Button Content ="Get Messages" Click="GetMessagesClick" 
        HorizontalAlignment="Right" Margin="5" Width="120"/>
<Button Grid.Row="0" Content ="Delete Messages" Click="DeleteMessagesClick" 
        HorizontalAlignment="Right" Margin="5,5,140,5" Width="120"/>

The code for deleting messages is this:

private async void DeleteMessagesClick(object sender, RoutedEventArgs e)
{
    var messages = (ObservableCollection<EmailMessage>) MessagesList.ItemsSource;
    var messagesToDelete = messages.Where(m => m.IsSelected).ToList();
    if (!messagesToDelete.Any())
    {
        await (new MessageDialog("There are no selected messages to delete")).ShowAsync();
        return;
    }
    var service = new GmailService(new BaseClientService.Initializer()
    {
        HttpClientInitializer = _credential,
        ApplicationName = AppName,
    });
    OperationText.Text = "deleting messages";
    ProgressBar.Maximum = messagesToDelete.Count;
    DownloadText.Text = "";
    BusyBorder.Visibility = Visibility.Visible;
    // SizeEstimate is formatted with thousands separators, so Convert.ToInt64 would fail
    var sizeEstimate = messages.Sum(m =>
        long.Parse(m.SizeEstimate, System.Globalization.NumberStyles.Number));
    await Task.Run(async () =>
    {
        for (int index = 0; index < messagesToDelete.Count; index++)
        {
            var message = messagesToDelete[index];
            var trashRequest = service.Users.Messages.Trash("me", message.Id);
            trashRequest.Execute();
            var index1 = index+1;
            await Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
            {
                ProgressBar.Value = index1;
                DownloadText.Text = $"{index1} of {messagesToDelete.Count}";
                messages.Remove(message);
                sizeEstimate -= long.Parse(message.SizeEstimate,
                    System.Globalization.NumberStyles.Number);
                CountText.Text = $"{messages.Count} messages. Estimated size: {sizeEstimate:n0}";
            });
        }
    });
    BusyBorder.Visibility = Visibility.Collapsed;
}

If you run the app, you can select the messages you want and delete them. They will be moved to the Trash folder, so you can double-check them before deleting them permanently.
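If you prefer to bypass the trash and remove a message permanently, the Gmail API also offers a Delete request. As far as I know it requires the full mail scope (https://mail.google.com/) rather than gmail.modify, and the message cannot be recovered afterwards, so use it with care:

```csharp
// Permanently deletes the message, skipping the Trash folder.
// Requires the full https://mail.google.com/ OAuth scope.
var deleteRequest = service.Users.Messages.Delete("me", message.Id);
deleteRequest.Execute();
```

Trash is the safer default for an app like this one, since the user gets a grace period to recover mistakes.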

Conclusion

This article has shown how to access the Gmail messages using the Gmail API and delete the largest messages, all in your Windows 10 app. Oh, and did I mention that the same app works also in Windows Phone, or in other Windows 10 devices?

All the source code for this article is available on GitHub, at http://github.com/bsonnino/LargeEmailsGmail

This article was first published at https://learn.microsoft.com/en-us/archive/blogs/mvpawardprogram/accessing-and-deleting-large-e-mails-in-gmail-with-c