Azure Log Analytics (ALA) is a powerful way to log within Azure. Log Analytics has a REST API endpoint that allows Boomi to create custom logs within a Log Analytics Workspace.
Create an Azure Log Analytics Workspace.
Go to Agents management and copy the Workspace Id and Primary Key. These values will be used within Boomi.
Figure 1. Agent management within Azure Log Analytics.
Once you have copied the Workspace Id and Primary Key, you can begin setting up the connector within Boomi. The remaining steps follow Figure 2, which shows an example subprocess. The script that creates the Authorization header takes some time to execute and must run once per document being sent. It is therefore recommended (but not required) to send the log payloads to an Atom Queue (or another queue) and post them to ALA from the subscriber, rather than directly within your main process.
Figure 2. ALA Sub-process Layout
First, create an HTTP Client connection and populate it with the information below.
URL: https://<CustomerId>.ods.opinsights.azure.com
Authentication Type: None
Replace CustomerId with your Workspace Id.
Figure 3. HTTP Client Connection.
Content Type: application/json
HTTP Method: POST
Request Header Name/Value
Authorization: Check "Is replacement variable?"
x-ms-date: Check "Is replacement variable?"
Log-Type: Check "Is replacement variable?"
accept: */*
Resource Path: api/logs?api-version=2016-04-01
Figure 4. HTTP Client Operation.
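For reference, the connection and operation settings above combine into a request along these lines. The Log-Type value `BoomiLog` is a hypothetical example; the Authorization and x-ms-date values are placeholders filled in at runtime by the script further below.

```
POST /api/logs?api-version=2016-04-01 HTTP/1.1
Host: <WorkspaceId>.ods.opinsights.azure.com
Content-Type: application/json
Authorization: SharedKey <WorkspaceId>:<Base64EncodedSignature>
x-ms-date: Mon, 24 Jan 2022 21:23:47 GMT
Log-Type: BoomiLog
accept: */*
```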
DPP_WORKSPACE_ID and DPP_PRIMARY_KEY are both used within the script below and must be set. Log-Type will be used to set a header within the HTTP connector. The Log-Type will be the name of the custom logs table within ALA.
Figure 5. Set Property Shape Configuration.
The script below is written in Groovy 2.4 and is based on Microsoft's documentation on creating an Authorization header. It creates the Authorization and x-ms-date headers. Please review Microsoft's documentation for more detail on the Authorization header.
// Groovy 2.4
// Doc https://docs.microsoft.com/en-us/rest/api/loganalytics/create-request
// Doc https://docs.microsoft.com/en-us/azure/azure-monitor/logs/data-collector-api
import java.util.Properties;
import java.io.InputStream;
import com.boomi.execution.ExecutionUtil;
import javax.crypto.Mac
import javax.crypto.spec.SecretKeySpec
import java.nio.charset.StandardCharsets
import java.security.InvalidKeyException
import java.security.NoSuchAlgorithmException
import java.text.SimpleDateFormat

String workspaceId = ExecutionUtil.getDynamicProcessProperty("DPP_WORKSPACE_ID");
String sharedKey = ExecutionUtil.getDynamicProcessProperty("DPP_PRIMARY_KEY");

for (int i = 0; i < dataContext.getDataCount(); i++) {
    InputStream is = dataContext.getStream(i);
    Properties props = dataContext.getProperties(i);

    // Content-Length must be the byte length of the payload, not the character count
    String contentLength = String.valueOf(is.bytes.length);
    String dateString = getServerTime();
    String httpMethod = "POST";
    String contentType = "application/json";
    String xmsDate = "x-ms-date:" + dateString;
    String resource = "/api/logs";

    // String to sign, per Microsoft's Data Collector API documentation
    String stringToHash = String.join("\n", httpMethod, contentLength, contentType, xmsDate, resource);
    String hashedString = getHMAC256(stringToHash, sharedKey);
    String signature = "SharedKey " + workspaceId + ":" + hashedString;

    props.setProperty("document.dynamic.userdefined.Authorization", signature);
    props.setProperty("document.dynamic.userdefined.x-ms-date", dateString);

    // Rewind the stream before handing the document back to the process
    is.reset();
    dataContext.storeStream(is, props);
}

// Returns the current time in RFC 1123 format (GMT), as required by the x-ms-date header
private static String getServerTime() {
    Calendar calendar = Calendar.getInstance();
    SimpleDateFormat dateFormat = new SimpleDateFormat("EEE, dd MMM yyyy HH:mm:ss z", Locale.US);
    dateFormat.setTimeZone(TimeZone.getTimeZone("GMT"));
    return dateFormat.format(calendar.getTime());
}

// Signs the input with HMAC-SHA256 and returns the Base64-encoded result
private static String getHMAC256(String input, String key) throws InvalidKeyException, NoSuchAlgorithmException {
    Mac sha256HMAC = Mac.getInstance("HmacSHA256");
    // The workspace primary key is Base64-encoded; decode it before use
    SecretKeySpec secretKey = new SecretKeySpec(Base64.getDecoder().decode(key), "HmacSHA256");
    sha256HMAC.init(secretKey);
    return Base64.getEncoder().encodeToString(sha256HMAC.doFinal(input.getBytes(StandardCharsets.UTF_8)));
}
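To sanity-check the signing logic outside of Boomi, the same string-to-sign and HMAC steps can be sketched in plain Java. This is a standalone illustration, not part of the Boomi process; the class name, workspace Id, and key below are placeholders.

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class SignatureCheck {
    // Builds the string to sign, mirroring the Groovy script above
    static String buildStringToHash(int contentLength, String dateString) {
        return String.join("\n", "POST", String.valueOf(contentLength),
                "application/json", "x-ms-date:" + dateString, "/api/logs");
    }

    // HMAC-SHA256 over the input, keyed with the Base64-decoded primary key
    static String hmac256(String input, String base64Key) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(Base64.getDecoder().decode(base64Key), "HmacSHA256"));
        return Base64.getEncoder().encodeToString(mac.doFinal(input.getBytes(StandardCharsets.UTF_8)));
    }

    public static void main(String[] args) throws Exception {
        String workspaceId = "00000000-0000-0000-0000-000000000000"; // placeholder
        // placeholder key; a real workspace primary key is already Base64-encoded
        String primaryKey = Base64.getEncoder().encodeToString("dummy-key".getBytes(StandardCharsets.UTF_8));
        String date = "Mon, 24 Jan 2022 21:23:47 GMT";
        String stringToHash = buildStringToHash(17, date);
        String signature = "SharedKey " + workspaceId + ":" + hmac256(stringToHash, primaryKey);
        System.out.println(stringToHash);
        System.out.println(signature);
    }
}
```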
After you have set up your process, you will need to create a JSON payload to send to ALA. Each document sent to ALA becomes a new row. The JSON structure is straightforward: each element corresponds to a column, and the column name matches the element name. If an element contains nested objects or arrays, the column within ALA will have a similar nested structure.
Example Payload
{
  "processName" : "Azure Log Analytics Demo",
  "environmentName" : "dev",
  "executionId" : "<executionId>",
  "atomId" : "<atomId>",
  "timestamp" : "2022-01-24T21:23:47.545Z",
  "messageType" : "ProcessLog",
  "details" : {
    "log" : [
      {
        "logLevel" : "info",
        "logMessage" : "Start"
      }
    ]
  }
}
The first time you send data to ALA, it takes some time (about 15-20 minutes) to process the data and set up the new custom table. Subsequent executions populate the table within seconds.
Figure 6. Example of Querying Logs
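As a reference point for querying: Log Analytics appends _CL to the Log-Type value to form the custom table name, and appends a type suffix (such as _s for strings) to each column. Assuming a hypothetical Log-Type of BoomiLog, a query over the example payload above might look like this:

```kusto
BoomiLog_CL
| where messageType_s == "ProcessLog"
| project TimeGenerated, processName_s, environmentName_s, executionId_s
| order by TimeGenerated desc
```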
The article was originally posted at Boomi Community.