
Logging API Requests and Responses to GrayLog over Azure API Management using Azure Functions

As you know, one of the most basic features that nearly every backend application must have is logging of requests and responses. Especially in environments where multiple APIs are in play, it moves from a “nice-to-have” to a “must-have”.

After all, there may come a time when we need to show that an incoming request, sent by a client to our API, contained invalid parameters.

In short, it is a feature we should add to our APIs to get better “traceability”.

Story

Within a short period of time, we needed to log all API requests and responses. Yes, there are many ways to do this. For example, we could log requests and responses by developing a middleware in each API. Of course, each option has different tradeoffs.

Let’s say we have written a middleware. Well, who will integrate it into all the APIs? Will it hurt performance? And of course, one of the important constraints is time.

In this article, I will try to show how I added a request and response logging feature to our existing system as a “quick-win” solution.

Well!

For a clearer understanding of the solution, I want to give a piece of brief information about our system. The project I’m working on is designed with a completely cloud-native, microservice approach. We orchestrate all our applications on Azure Kubernetes Service and manage all our APIs with Azure API Management.

Azure API Management is one of the key parts of our distributed system. In the context of this article, I won’t go into the details of the benefits of using an API gateway in a microservice architecture.

However, I would like to mention one point where the API gateway gave me a quick-win solution.

API Gateway allows us to create a “single entry point” and uniform experience.

Hence, besides routing, it also allows us to perform manipulations on requests and responses.

In this case, why would I perform the request and response logging operation in each API separately? Right?

Azure API Management has a logging policy like other API Gateways. Using these policies, it is possible to perform a platform-independent and centralized logging operation.

So, How?

At this point, we will follow a few steps:

  1. Creation of Azure Event Hub
  2. Creation of Azure API Management Logger
  3. Configuring Azure API Management Log Policy
  4. Creation of an Azure Function that is triggered by Azure Event Hub and sends logs to GrayLog
  5. Deployment of the Azure Function

1. Creation of Azure Event Hub

We have an advanced policy named “Log to EventHub” in Azure API Management. Using this policy, we can send logs to Event Hub with minimal performance impact (though we should still keep an eye on it) and then process or store them.

If you are not currently using an Event Hub, you need an Event Hub namespace and instance.

Azure Event Hub is an event ingestion service that is capable of receiving and processing millions of events per second.

You can follow the steps here to create it easily.

After completing the Event Hub creation, we need a shared access policy in order to use the Event Hub from Azure API Management. To get this policy, let’s click on the Event Hub instance we have just created and open its details.

Then click on the “Shared access policies” tab on the left side as above.

Now let’s click the “Add” button to create a policy. I have created a policy named “AzureAPIManagement“. After the creation operation, we will be able to access the connection string of the Event Hub instance as shown above.

2. Creation of Azure API Management Logger

We have completed the Event Hub steps. Now we need a logger inside the Azure API Management. We will create a logger through the Azure API Management REST API.

Before starting the logger creation, we need to enable the REST API service on Azure API Management. To enable this service, check the “Enable API Management REST API” checkbox in the Azure API Management instance by entering the “Management API” tab as follows.

After enabling the REST API, let’s get an access token by clicking the “Generate” button in the “Access token” section below.

Now we can create a logger. In order to create a logger, we need to send an HTTP PUT request to the following URL.

https://my-api-management.management.azure-api.net/loggers/my-logger?api-version=2018-01-01

NOTE: Don’t forget to replace the {my-api-management} and {my-logger} parameters with your own service names.

There are a few parameters that we should add to the header.

  • Content-Type : application/json
  • Authorization : “API Management REST API access token”

and the payload should be:
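The payload snippet isn’t reproduced here; as a sketch based on the API Management REST API for this api-version, it looks roughly like the following (“my-log-hub” and the connection string are placeholders — use your own Event Hub name and the connection string from the “AzureAPIManagement” shared access policy):

```json
{
  "loggerType": "AzureEventHub",
  "description": "Event Hub logger for request/response logs",
  "credentials": {
    "name": "my-log-hub",
    "connectionString": "Endpoint=sb://<your-namespace>.servicebus.windows.net/;SharedAccessKeyName=AzureAPIManagement;SharedAccessKey=<your-key>"
  }
}
```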

The Event Hub logger is now ready to use. Here, let’s take note of the logger name, which is “my-logger“. We will use it later when defining the logging policy.

3. Configuring Azure API Management Log Policy

To define the logging policy, let’s open the related Azure API Management instance and click the “APIs” tab in the left menu. After that, let’s go to the “All APIs” section as follows.

What we want to do here is to add an Event Hub logger that will apply to all our APIs. To be able to do that we will add a “Log to EventHub” policy to the “inbound” and “outbound” processing sections.

To add the policy, let’s click the icon next to the “Policies” title on the right and go to the code view screen.

Now add the “Log to EventHub” policy, which is placed under the “Advanced Policies” section, inside the “inbound” and “outbound” sections and edit it as follows.

The point to consider here is the “logger-id” parameter: we set it to the logger name we specified when creating the Event Hub logger. In addition, we correlate requests and responses with the “message-id” variable and send them to the Event Hub.
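The policy itself isn’t reproduced here; a minimal sketch of the idea looks like the following (the “message-id” variable and the message shape are illustrative — whatever shape you emit must match what your consumer parses):

```xml
<inbound>
    <!-- Correlation id shared by the request and its response -->
    <set-variable name="message-id" value="@(Guid.NewGuid())" />
    <log-to-eventhub logger-id="my-logger">@{
        return new JObject(
            new JProperty("EventType", "Request"),
            new JProperty("MessageId", context.Variables["message-id"]),
            new JProperty("Method", context.Request.Method),
            new JProperty("Url", context.Request.Url.ToString())
        ).ToString();
    }</log-to-eventhub>
</inbound>
<outbound>
    <log-to-eventhub logger-id="my-logger">@{
        return new JObject(
            new JProperty("EventType", "Response"),
            new JProperty("MessageId", context.Variables["message-id"]),
            new JProperty("StatusCode", context.Response.StatusCode)
        ).ToString();
    }</log-to-eventhub>
</outbound>
```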

By the way, it’s nice to be able to write C#-based policies in Azure API Management, right?

4. Creation of an Azure Function that is triggered by Azure Event Hub and sends logs to GrayLog

Now the request and response logs are being kept on the Event Hub, and there are a few ways to consume them. At this point, I chose Azure Functions, a serverless compute service, to avoid any extra effort.

First, let’s create an Azure Function by following the steps in this documentation.

When you reach the “Create an HTTP triggered function” section in the documentation, select the “All” option by clicking “Change template filter”, then choose the “EventHubTrigger” template as shown above. After that, give the function a name (I named it RequestLoggingFunction) and follow the rest of the steps to complete the function creation process.

Now we have an Azure Function template.

There are two operations that we will do here.

  1. We will consume the logs on the Event Hub.
  2. Then we will send logs into GrayLog (if you are using a different log provider, you can integrate it).

Since we are already using GrayLog, I decided to send the request and response message logs there. If you wish, you can take a look at other solutions such as Moesif.

If you want to get a bit of information about the installation of GrayLog, you can check this article.

First, let’s add the GELF target package for NLog to the project via NuGet.

Then, let’s create a new folder called “Models” and define the following three classes.

NOTE: You can change the log message structure as you wish.

We will parse the log data consumed from the Event Hub into “HttpRequestMessage” and “HttpResponseMessage” types through the “HttpMessage” class.

https://github.com/dgilling/ApimEventProcessor/blob/master/src/ApimEventProcessor/HttpMessage.cs
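The model classes aren’t shown here; following the linked HttpMessage.cs, a rough sketch of the correlation model could look like this (property names are illustrative):

```csharp
using System;
using System.Net.Http;

// Represents one event consumed from the Event Hub.
// IsRequest tells us which of the two message properties is populated.
public class HttpMessage
{
    public Guid MessageId { get; set; }
    public bool IsRequest { get; set; }
    public HttpRequestMessage HttpRequestMessage { get; set; }
    public HttpResponseMessage HttpResponseMessage { get; set; }
}
```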

Let’s edit the class that comes with the function template as follows.

NOTE: Don’t forget to change the “my-log-hub” and “EVENTHUB_CONNECTION” values that I passed as parameters to the “EventHubTrigger” attribute.

So let’s take a look at what we did in the function.

We configured NLog’s logger to send logs to GrayLog. In the “Run” method, we perform the logging operation by parsing each event according to its event type.
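The function body isn’t reproduced here; a minimal sketch of the idea, assuming the policy tags each event with an event type as in the earlier policy step, could look like this:

```csharp
using Microsoft.Azure.WebJobs;
using NLog;

public static class RequestLoggingFunction
{
    // NLog logger configured via nlog.config (GELF target pointing at GrayLog)
    private static readonly Logger Logger = LogManager.GetCurrentClassLogger();

    [FunctionName("RequestLoggingFunction")]
    public static void Run(
        [EventHubTrigger("my-log-hub", Connection = "EVENTHUB_CONNECTION")] string eventHubMessage)
    {
        // Decide whether this event is a request or a response, then
        // forward it to GrayLog through the NLog GELF target. In the real
        // function the payload is first parsed into an HttpMessage.
        if (eventHubMessage.Contains("\"EventType\":\"Request\""))
        {
            Logger.Info("Request | {0}", eventHubMessage);
        }
        else
        {
            Logger.Info("Response | {0}", eventHubMessage);
        }
    }
}
```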

Now we need to create an “nlog.config” file under the root directory of the function as follows.

NOTE: Don’t forget to also edit the GrayLog “endpoint“.
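The config file isn’t reproduced here; assuming a GELF target package such as NLog.Targets.Gelf, a minimal nlog.config could look like the following (the endpoint and facility are placeholders for your own GrayLog input):

```xml
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <extensions>
    <add assembly="NLog.Targets.Gelf" />
  </extensions>
  <targets>
    <!-- Replace the endpoint with your GrayLog input address -->
    <target name="graylog" type="Gelf"
            endpoint="udp://my-graylog-host:12201"
            facility="request-logging-function" />
  </targets>
  <rules>
    <logger name="*" minlevel="Info" writeTo="graylog" />
  </rules>
</nlog>
```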

5. Deployment of the Azure Function

Now we are ready to deploy. Well, in order to perform deployment operation quickly, first let’s follow these steps.

NOTE: When creating a function on Azure, I recommend that you select the pricing tier carefully. I chose the “App Service” plan instead of the “Consumption” plan to keep my function, which will log all request and response messages, under control.

After completing the deployment process, we need to add the Event Hub‘s connection string information to the function’s app settings section via the Azure Portal.

First, let’s enter the “Function Apps” tab on the portal and click on the function we have created. Then click the “Application settings” option by entering the “Platform features” section.

Since I have defined the “Connection” key in the function as “EVENTHUB_CONNECTION“, I have added the same name in the app settings.
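For local development, the same setting would typically live in the function project’s local.settings.json; a hypothetical example with placeholder values:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage-connection-string>",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "EVENTHUB_CONNECTION": "Endpoint=sb://<your-namespace>.servicebus.windows.net/;SharedAccessKeyName=AzureAPIManagement;SharedAccessKey=<your-key>"
  }
}
```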

So finally we are ready. If we haven’t made any mistakes, the function we created will start to run successfully. To verify that it does, you can check the “Monitor” tab on the screen above.

An example request log will appear in GrayLog as follows.

Conclusion

Although it may seem like a simple operation, logging request and response messages is important in many situations. I have tried to show how I did it inside our existing system, without adding much overhead, using the services that Azure provides.

Happy clouding!
