Friday, January 15, 2021

Forwarding Red Hat OpenShift Logs to vRealize Log Insight Cloud using the Log Forwarding API

 Overview

Red Hat OpenShift is an open-source container application platform based on the Kubernetes container orchestrator for enterprise application development and deployment. 

Starting with OpenShift 4.3, a new approach called the Log Forwarding API was made available to simplify integration with external log aggregation solutions such as vRealize Log Insight Cloud (vRLIC).

What is the Log Forwarding API?

The Log Forwarding API enables you to configure custom pipelines to send container and node logs to specific endpoints within or outside of your cluster. You can send logs by type to the internal OpenShift Container Platform Elasticsearch instance and to remote destinations not managed by OpenShift Container Platform cluster logging, such as an existing logging service, an external Elasticsearch cluster, external log aggregation solutions, or a Security Information and Event Management (SIEM) system.
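To make the pipeline model concrete, here is a minimal sketch of a ClusterLogForwarder resource in the shape the API takes from OpenShift 4.6 GA onwards (the output name and address are placeholders, not values from this procedure):

```yaml
apiVersion: logging.openshift.io/v1
kind: ClusterLogForwarder
metadata:
  name: instance
  namespace: openshift-logging
spec:
  outputs:
    - name: remote-fluentd
      type: fluentdForward
      url: 'tcp://fluentd.example.com:24224'   # placeholder address
  pipelines:
    - name: application-logs
      inputRefs:
        - application          # forward application logs by type
      outputRefs:
        - remote-fluentd
        - default              # also keep sending to the internal Elasticsearch
```

Each pipeline picks a log type (application, infrastructure, audit) and routes it to one or more named outputs, which is exactly what we will use to reach vRLI Cloud.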


What is being deployed as part of the integration? 

We will deploy a Fluentd forwarder within the OpenShift cluster, which will forward the logs to vRealize Log Insight Cloud.

Once the logs are flowing, you can create a dashboard to visualize your OpenShift environment; I have created a sample dashboard as an example.

Pre-requisites


  • OpenShift environment 4.3 or above
  • Cluster Logging Operator and Elasticsearch Operator installed in the openshift-logging namespace
  • Cluster logging instance created using the Custom Resource Definition

The procedure is documented here:
https://docs.openshift.com/container-platform/4.3/logging/cluster-logging-deploying.html#cluster-logging-deploy-clo_cluster-logging-deploying

Procedure

Following are the high-level steps:

  1. Generate a vRealize Log Insight Cloud API key
  2. Create a namespace for vRLI Cloud
  3. Clone the GitHub repository which has the required files (fluent.conf & fluentd-deployment.yaml)
  4. Create a config map from the fluent.conf file for the Fluentd deployment
  5. Create the Fluentd deployment using fluentd-deployment.yaml
  6. Get the Cluster IP of the Fluentd forwarder service
  7. Create a Cluster Log Forwarding instance and configure it to forward logs to vRLI Cloud
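The cluster-side portion of these steps boils down to a short oc session; the commands below are the same ones detailed step by step later in the post, gathered here for orientation:

```shell
# Step 2: namespace for the forwarder
oc create ns vrlicloud-logging

# Step 3: clone the repo, then add the API key to fluent.conf
git clone https://github.com/munishpalmakhija/loginsight-cloud-openshift.git
cd loginsight-cloud-openshift

# Step 4: config map from fluent.conf
oc create configmap vrlicloud-fluent-config --from-file=fluent.conf -n vrlicloud-logging

# Step 5: Fluentd forwarder deployment and service
oc apply -f fluentd-deployment.yaml

# Step 6: note the ClusterIP of the fluentd service
oc get service -n vrlicloud-logging
```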


Step 1


Generate a vRealize Log Insight Cloud API key from the vRLI Cloud console.

Step 2 

oc create ns vrlicloud-logging


Step 3

git clone https://github.com/munishpalmakhija/loginsight-cloud-openshift.git

Update fluent.conf with the API key generated in Step 1.
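For orientation only — the fluent.conf in the repository is the authoritative version. A forwarder config of this kind typically pairs a fluentd forward source with the fluent-plugin-vmware-log-intelligence output; everything below (plugin options, endpoint URL, placeholder key) is an assumption based on that plugin's documented usage, not the repo's actual file:

```
# Assumed sketch - see the repository's fluent.conf for the real configuration
<source>
  @type forward                 # receive logs forwarded by cluster logging
  port 24224
</source>

<match **>
  @type vmware_log_intelligence # fluent-plugin-vmware-log-intelligence output
  endpoint_url https://data.mgmt.cloud.vmware.com/le-mans/v1/streams/ingestion-pipeline-stream
  verify_ssl true
  <headers>
    Content-Type application/json
    Authorization Bearer <YOUR-API-KEY>   # API key from Step 1
  </headers>
</match>
```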


Step 4


oc create configmap vrlicloud-fluent-config --from-file=fluent.conf -n vrlicloud-logging




Step 5


Deploy Fluentd using the following YAML file:

oc apply -f fluentd-deployment.yaml

It creates the following resources

  • ServiceAccount - fluentd-vrlicloud-logging
  • ClusterRole - fluentd-vrlicloud-clusterrole
  • ClusterRoleBinding - fluentd-vrlicloud-clusterrole
  • Deployment - fluentd-vrlicloud-logging
  • Service - fluentd-vrlicloud-logging





Step 6

Get the Cluster IP of the Fluentd service:

oc get service -n vrlicloud-logging
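If you want just the IP (the service name matches the resource list from Step 5), a jsonpath query is a small convenience:

```shell
oc get service fluentd-vrlicloud-logging -n vrlicloud-logging \
  -o jsonpath='{.spec.clusterIP}'
```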



Step 7

Create Cluster Log Forwarding instance using CRD YAML

OpenShift environments (version <4.6) with the Tech Preview (TP) of the Log Forwarding API require the ClusterLogging instance to be annotated as follows:

oc annotate clusterlogging -n openshift-logging instance clusterlogging.openshift.io/logforwardingtechpreview=enabled

Create a YAML file as below 

The endpoint is the cluster IP of your Fluentd forwarder from Step 6. The YAML has been tested on OpenShift 4.3. The Log Forwarding API was promoted from Technology Preview to Generally Available in OpenShift Container Platform 4.6, which requires you to make a change to your ClusterLogging custom resource (CR) and to replace your LogForwarding custom resource (CR) with a ClusterLogForwarder CR. Please refer to the OpenShift documentation.
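As a rough sketch of such a file on the 4.3 Tech Preview API (field names per the 4.3 documentation; the cluster IP is the placeholder to fill in, and 24224 assumes fluentd's default forward port):

```yaml
apiVersion: logging.openshift.io/v1alpha1
kind: LogForwarding
metadata:
  name: instance
  namespace: openshift-logging
spec:
  disableDefaultForwarding: true
  outputs:
    - name: fluentd-vrlicloud
      type: forward
      endpoint: '<CLUSTER-IP>:24224'   # cluster IP from Step 6
  pipelines:
    - name: app-logs
      inputSource: logs.app            # application container logs
      outputRefs:
        - fluentd-vrlicloud
    - name: infra-logs
      inputSource: logs.infra          # node and infrastructure logs
      outputRefs:
        - fluentd-vrlicloud
```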

Create CRD using the YAML file either from the UI or command line

If everything is successful, you can search for logs using the following filter:

environment contains openshift

You can also create a dashboard to view your OpenShift environment.


This procedure has been co-validated by Dean Lewis. Please do visit his blog for the same integration on OpenShift version 4.6.
