log4net logger for Microsoft Azure Log Analytics

Logging to Azure Log Analytics using log4net

Recently I wrote an article on how to Send data to Azure Log Analytics from C# code, which demonstrates pushing data from your code to Microsoft Azure Log Analytics and running simple queries against the data stored there.

Azure Log Analytics is extremely helpful for storing application logs because it transforms the raw data into queryable columns. You can easily query your logs stored in Log Analytics and export them to Excel format if you need to share the data or just want to do some offline data analysis.

Adding the code from the previously mentioned article to an existing application can be a headache because you probably have your logging implemented already. However, it is not that difficult to wrap posting data to Azure Log Analytics in log4net. If you are already using log4net in your application, or you follow the Dependency Injection pattern, it will not be a big hassle to plug in a custom log4net logger.

Log4net Ala

log4net itself is pretty extensible, and it is easy to implement a custom logger just by following its appender pattern.
So let's start with the coding. Before we get to the actual implementation of the logger, we need a model which we will serialize and post to Log Analytics. We can reuse the model from the article mentioned above.

using System;
using System.Collections.Generic;
using System.Runtime.Serialization;
using log4net.Core;
using log4net.Util;

namespace Log4net.AzureLogAnalytics
{
    [Serializable]
    [DataContract]
    public class LogAnalyticsLoggingEvent
    {
        public LogAnalyticsLoggingEvent(LoggingEvent otherEvent, Func<string> formatMessage)
            : this(formatMessage, otherEvent.Domain, otherEvent.ExceptionObject, otherEvent.Identity, otherEvent.Level, otherEvent.LoggerName, otherEvent.Properties, otherEvent.RenderedMessage, otherEvent.ThreadName, otherEvent.TimeStamp, otherEvent.UserName)
        {
        }

        public LogAnalyticsLoggingEvent(Func<string> formatMessage, string domain, Exception exceptionObject, string identity, Level level, string loggerName, PropertiesDictionary properties, string message, string threadName, DateTime timestamp, string userName)
        {
            Domain = domain;
            ExceptionObject = exceptionObject;
            Identity = identity;
            LoggerName = loggerName;
            Properties = new Dictionary<string, object>();
            Message = formatMessage != null ? formatMessage() : message;
            ThreadName = threadName;
            Timestamp = timestamp;
            UserName = userName;
            LevelValue = level.Value;
            LevelName = level.Name;

            string[] propertyKeys = properties.GetKeys();
            foreach (string propertyKey in propertyKeys)
            {
                Properties[propertyKey] = properties[propertyKey];
            }
        }

        [DataMember]
        public string Domain { get; }

        [DataMember]
        public Exception ExceptionObject { get; }

        [DataMember]
        public string Identity { get; }

        [DataMember]
        public int LevelValue { get; }

        [DataMember]
        public string LevelName { get; }

        [DataMember]
        public string LoggerName { get; }

        [DataMember]
        public IDictionary<string, object> Properties { get; set; }

        [DataMember]
        public string Message { get; }

        [DataMember]
        public string ThreadName { get; }

        [DataMember]
        public DateTime Timestamp { get; }

        [DataMember]
        public string UserName { get; }
    }
}
    

You can modify the model to suit your needs and add any extra parameters you want to save to Log Analytics. The class structure is reflected in the columns of the Log Analytics records that get stored.
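When a record is ingested, Log Analytics derives the column names from the JSON property names and appends a type suffix (_s for string, _d for numeric, _t for datetime, _b for boolean). As a rough sketch of the payload shape, using the same Newtonsoft.Json serializer the article uses elsewhere (the trimmed-down anonymous model here is purely for illustration, not the actual LogAnalyticsLoggingEvent class):

```csharp
using System;
using Newtonsoft.Json;

class ColumnMappingSketch
{
    static void Main()
    {
        // Trimmed-down stand-in for LogAnalyticsLoggingEvent, for illustration only
        var sample = new
        {
            Message = "Hello from log",   // ingested as column Message_s
            LevelValue = 40000,           // ingested as column LevelValue_d
            Timestamp = DateTime.UtcNow   // ingested as column Timestamp_t
        };

        // JSON like this is what ultimately gets posted to the Data Collector API
        string json = JsonConvert.SerializeObject(sample);
        Console.WriteLine(json);
    }
}
```

So a query against the resulting custom log type would filter on Message_s rather than Message.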

log4net Azure Log Analytics logger implementation

Now to the actual logger. The issue with the approach described in Send data to Azure Log Analytics from C# code is that posting the data is a network operation. A network operation is an IO operation, which is inherently slow, so calling it directly from the logger implementation would eventually slow down your application code. Instead, you can use a buffer in the form of a System.Collections.Concurrent.ConcurrentQueue holding LogAnalyticsLoggingEvent instances, and then asynchronously dequeue items and send them to Azure Log Analytics.

Since you will not keep a dequeue worker reading the ConcurrentQueue constantly, you need to invoke it repeatedly at an interval.
For that purpose I used the Quartz.NET library, which is installed as a NuGet package.
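The buffering pattern itself does not depend on Quartz.NET; if you would rather avoid the extra dependency, a plain System.Threading.Timer draining the same ConcurrentQueue achieves an equivalent periodic dequeue. A minimal self-contained sketch (the Drain method here is a hypothetical stand-in for serializing the event and calling the Post method shown below):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

class PeriodicDrainSketch
{
    static readonly ConcurrentQueue<string> Queue = new ConcurrentQueue<string>();

    // Stand-in for serializing each event and calling Post(); returns how many items were sent
    static int Drain()
    {
        int sent = 0;
        while (Queue.TryDequeue(out string item))
        {
            // JsonConvert.SerializeObject(item) + the HTTP post would happen here
            sent++;
        }
        return sent;
    }

    static void Main()
    {
        Queue.Enqueue("event-1");
        Queue.Enqueue("event-2");

        // Fire the drain every 500 ms, mirroring the bufferTimeout the Quartz trigger uses
        using (var timer = new Timer(_ => Drain(), null, dueTime: 0, period: 500))
        {
            Thread.Sleep(250); // give the timer a chance to fire at least once
        }

        Console.WriteLine(Queue.IsEmpty);
    }
}
```

The trade-off is that Quartz gives you misfire handling and a richer scheduling model for free, which is why the article sticks with it.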

using Newtonsoft.Json;
using Quartz;
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Net;
using System.Security.Cryptography;
using System.Text;

namespace Log4net.AzureLogAnalytics
{
    internal class BufferReadScheduleJob : IJob
    {

        public void Execute(IJobExecutionContext context)
        {
            String workspaceId = context.JobDetail.JobDataMap.GetString("WorkspaceId");
            String sharedKey = context.JobDetail.JobDataMap.GetString("SharedKey");
            String logType = context.JobDetail.JobDataMap.GetString("LogType");

            // The queue instance is shared with the appender through the job data map
            ConcurrentQueue<LogAnalyticsLoggingEvent> loggingEventQueue = context.JobDetail.JobDataMap.Get("LoggingEvenQueue") as ConcurrentQueue<LogAnalyticsLoggingEvent>;

            if (loggingEventQueue != null && loggingEventQueue.TryDequeue(out LogAnalyticsLoggingEvent loggingEvent))
            {
                String jsonMessage = JsonConvert.SerializeObject(loggingEvent);
                Post(jsonMessage, workspaceId, sharedKey, logType);
            }
        }



        private void Post(String json, String workspaceId, String sharedKey, String logType, string apiVersion = "2016-04-01")
        {
            String requestUriString = $"https://{workspaceId}.ods.opinsights.azure.com/api/logs?api-version={apiVersion}";
            DateTime dateTime = DateTime.UtcNow;
            String dateString = dateTime.ToString("r");

            byte[] content = Encoding.UTF8.GetBytes(json);

            // The signature must be computed over the UTF-8 byte length of the payload,
            // not the string length, otherwise requests containing multi-byte characters
            // fail authorization
            String signature;
            string message = $"POST\n{content.Length}\napplication/json\nx-ms-date:{dateString}\n/api/logs";
            byte[] bytes = Encoding.UTF8.GetBytes(message);
            using (HMACSHA256 encryptor = new HMACSHA256(Convert.FromBase64String(sharedKey)))
            {
                signature = $"SharedKey {workspaceId}:{Convert.ToBase64String(encryptor.ComputeHash(bytes))}";
            }

            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(requestUriString);
            request.ContentType = "application/json";
            request.Method = "POST";
            request.Headers["Log-Type"] = logType;
            request.Headers["x-ms-date"] = dateString;
            request.Headers["Authorization"] = signature;

            using (Stream requestStream = request.GetRequestStream())
            {
                requestStream.Write(content, 0, content.Length);
            }

            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            {
                if (response.StatusCode != HttpStatusCode.OK && response.StatusCode != HttpStatusCode.Accepted)
                {
                    Stream responseStream = response.GetResponseStream();
                    if (responseStream != null)
                    {
                        using (StreamReader streamReader = new StreamReader(responseStream))
                        {
                            throw new Exception(streamReader.ReadToEnd());
                        }
                    }
                }
            }
        }


    }
}

    

So now the whole logic for posting data to Azure Log Analytics is shifted to the Quartz job class, and the appender implementation only needs to write to the ConcurrentQueue instance that it shares with the Quartz job. This way a logger call will not block the application code with an IO operation.

using log4net.Appender;
using log4net.Core;
using Quartz;
using Quartz.Impl;
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.IO;

namespace Log4net.AzureLogAnalytics
{
    public class LogAnalyticsAppender : AppenderSkeleton
    {
        #region Fields
        private readonly ConcurrentQueue<LogAnalyticsLoggingEvent> azureLoggingEventQueue;
        private readonly IScheduler scheduler;
        private String workspaceId;
        private String sharedKey;
        private String logType;
        private int bufferTimeout = 500;

        #endregion

        #region Properties
        // log4net sets these properties from the XML configuration via reflection,
        // so they need to be public
        public string WorkspaceId
        {
            get
            {
                return this.workspaceId;
            }
            set
            {
                this.workspaceId = value;
                InitializeLogger();
            }
        }

        public string SharedKey
        {
            get
            {
                return this.sharedKey;
            }
            set
            {
                this.sharedKey = value;
                InitializeLogger();
            }
        }

        public string LogType
        {
            get
            {
                return this.logType;
            }
            set
            {
                this.logType = value;
                InitializeLogger();
            }
        }
        #endregion

        #region Constructors
        public LogAnalyticsAppender()
        {
            this.azureLoggingEventQueue = new ConcurrentQueue<LogAnalyticsLoggingEvent>();
            this.scheduler = StdSchedulerFactory.GetDefaultScheduler();
        }
        #endregion

        #region Methods
        private void InitializeLogger()
        {
            // Schedule the dequeue job only once all three mandatory settings are configured
            if (!String.IsNullOrWhiteSpace(workspaceId) && !String.IsNullOrWhiteSpace(sharedKey) && !String.IsNullOrWhiteSpace(logType))
            {
                scheduler.Start();
                IDictionary<String, Object> map = new Dictionary<String, Object>()
                {
                    { "LoggingEvenQueue", this.azureLoggingEventQueue },
                    { "SharedKey", this.SharedKey },
                    { "WorkspaceId", this.WorkspaceId },
                    { "LogType", this.LogType }
                };

                IJobDetail job = JobBuilder.Create<BufferReadScheduleJob>()
                                           .UsingJobData(new JobDataMap(map))
                                           .Build();

                ITrigger trigger = TriggerBuilder.Create()
                    .WithSimpleSchedule(s =>
                        s.WithInterval(TimeSpan.FromMilliseconds(bufferTimeout))
                         .RepeatForever())
                    .Build();

                scheduler.ScheduleJob(job, trigger);
            }
        }
        #endregion

        #region Abstract implementation methods
        protected override void Append(LoggingEvent loggingEvent)
        {
            var serializableEvent = new LogAnalyticsLoggingEvent(loggingEvent,
                () =>
                {
                    if (Layout != null)
                    {
                        using (StringWriter writer = new StringWriter())
                        {
                            Layout.Format(writer, loggingEvent);
                            return writer.ToString();
                        }
                    }
                    return loggingEvent.RenderedMessage;
                });

            azureLoggingEventQueue.Enqueue(serializableEvent);
        }

        protected override void OnClose()
        {
            if (!scheduler.IsShutdown)
            {
                scheduler.Shutdown();
            }

            base.OnClose();
        }
        #endregion
    }
}

    

How to use log4net Log Analytics logger

So now we have the logger and we need to plug it into our application. The first thing we need to do is configure it along with all the Azure Log Analytics parameters. Because I already explained how to get the parameters, I will not repeat it here; you can go to the Send data to Azure Log Analytics from C# code article and check how to get the Log Analytics workspace parameters from the Azure Portal.

I stored the configuration in a separate configuration file called log4net.config

<?xml version="1.0" encoding="utf-8" ?>
<log4net debug="true">
  <appender name="AzureAppender" type="Log4net.AzureLogAnalytics.LogAnalyticsAppender, Log4net.AzureLogAnalytics">
    <WorkspaceId value="xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx" />
    <SharedKey value="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" />
    <LogType value="ApplicationLog" />
  </appender>
  <root>
    <level value="ALL" />
    <appender-ref ref="AzureAppender" />
  </root>
</log4net>
    

Now that the logger is configured, we can start using it. For demo purposes, I wrote a small console application which consumes the Azure log4net logger for Log Analytics.

using log4net;
using System;

[assembly: log4net.Config.XmlConfigurator(ConfigFile = @"log4net.config", Watch = true)]
namespace Log4net.AzureLogAnalytics.TestConsole
{
    class Program
    {
        private static ILog log = LogManager.GetLogger("TestLogger");

        static void Main(string[] args)
        {
            log.Info("Hello from log");
            Console.ReadKey();

        }
    }
}

    

After executing the test console app you should see the record in your Log Analytics workspace.

ApplicationLog_CL
| order by TimeGenerated desc 

    
Note

The data and the custom log type may not appear right away, as Azure does not index ingested data at runtime, so expect your data to show up in query results after about 1 to 5 minutes.

Log Analytics Record

The complete code is available as a Git repository on GitHub at https://github.com/dejanstojanovic/log4net-Azure-Log-Analytics, so you can fetch the latest code with any fixes and improvements applied.

Disclaimer

Purpose of the code contained in snippets or available for download in this article is solely for learning and demo purposes. Author will not be held responsible for any failure or damages caused due to any other usage.


About the author

DEJAN STOJANOVIC

Dejan is a passionate Software Architect/Developer. He is highly experienced in the .NET programming platform, including ASP.NET MVC and WebApi. He likes working with new technologies and on exciting, challenging projects.
