Wednesday, January 27, 2021

vRA Cloud Template to deploy AWS Lambda to forward S3 Logs to vRealize Log Insight Cloud

Recently, vRealize Log Insight Cloud announced multi-cloud support, which allows you to forward logs from AWS and Azure. For more details, you can refer to the announcement blog post.


In this blog, I will demonstrate how to use a vRA Blueprint, a.k.a. Cloud Template, to deploy an AWS Lambda function that forwards S3 bucket logs/events to vRealize Log Insight Cloud.


Once provisioned, it creates an AWS Lambda function that forwards logs to vRealize Log Insight Cloud whenever a new S3 object is created in the bucket.

Pre-requisites 


Following are the pre-requisites:
  • Access to VMware vRealize Log Insight Cloud
  • API key for VMware vRealize Log Insight Cloud
  • AWS account added as a vRA Cloud Account
    • A PowerUser role should be enough for creating the required resources.
  • AWS IAM role which has access to the Lambda service
    • It must already exist and is provided as a deployment input.
  • Source S3 bucket
    • You will need to upload the Lambda.zip from GitHub to this bucket.
  • Target S3 bucket(s) whose logs and events you want to forward
    • They must already exist and are provided as deployment inputs.

Download vRA Cloud Templates aka Blueprints


Clone the repo that contains the vRA Cloud Templates, a.k.a. Blueprints:

git clone https://github.com/munishpalmakhija/cas.git

Navigate to the "vRLICloud-AWSLambda Blueprint" folder and import the following two blueprints:
  • Deploy-vRLICloud-AWS-Lambda-S3BucketLogs
  • Deploy-vRLICloud-AWS-Lambda-S3Events

Provision Blueprints


Deployment Inputs 


You will need to provide the following inputs for both blueprints 

Each deployment creates the following three resources:

Cloud.Service.AWS.Lambda.Function
Cloud.Service.AWS.Lambda.Permission
Cloud.Service.AWS.S3.Bucket.Notification


Once both deployments are successful, you will see two AWS Lambda functions created.

Try uploading a new file containing logs to the S3 bucket(s) and validate that the logs appear in vRealize Log Insight Cloud.

You should be able to view the logs by using the following filter:

log_type starts with aws

Thursday, January 21, 2021

Perform Day2 Operations on vRA Deployments using PowervRACloud

With the latest version 1.4 of PowervRACloud, I have added cmdlets to manage vRA Deployments.

In this blog, I will demonstrate how to perform Day 2 operations on vRA Deployments using PowervRACloud.

You can visit here for more details on PowervRACloud, or reach out to me on my Twitter or @PowervRACloud.


YouTube Demo

You can view the recorded YouTube demo or go through the blog for the detailed procedure.

Pre-requisites


Following are the pre-requisites:

  • PowervRACloud version 1.4
  • PowerShell 6/7
  • Access to a vRA Cloud or vRA 8.x environment

Environment Details


The cmdlets have been tested with the following environment:
  • PowerShell 7.1 installed on CentOS 7
  • vRA Cloud December release
  • vRA 8.2

Procedure 


Step 1 - First, connect to your environment using one of the Connect cmdlets:

Connect-vRA-Cloud 
Connect-vRA-Server


Step 2 - Execute any of the following cmdlets to perform Day 2 operations on vRA Deployment(s):



  • Get-vRA-SingleDeployment - Retrieves a single vRA Deployment in a particular Org. You can save the output in a variable for use with other cmdlets.
  • Get-vRA-DeploymentResources - Retrieves the resources associated with a vRA Deployment in a particular Org. You can save the output in a variable for use with other cmdlets.
  • PowerOff-vRA-Deployment - Powers off a vRA Deployment in a particular Org.
  • PowerON-vRA-Deployment - Powers on a vRA Deployment in a particular Org.
  • Create-vRA-DeploymentResourceSnapshot - Creates a snapshot of a vRA Deployment resource (vSphere virtual machine) in a particular Org.
  • Delete-vRA-DeploymentResourceSnapshot - Deletes a snapshot of a vRA Deployment resource (vSphere virtual machine) in a particular Org.
  • Revert-vRA-DeploymentResourceSnapshot - Reverts to a snapshot of a vRA Deployment resource (vSphere virtual machine) in a particular Org.
  • PowerOff-vRA-DeploymentResource - Powers off a vRA Deployment resource (vSphere virtual machine) in a particular Org.
  • PowerON-vRA-DeploymentResource - Powers on a vRA Deployment resource (vSphere virtual machine) in a particular Org.
  • Reset-vRA-DeploymentResource - Resets a vRA Deployment resource (vSphere virtual machine) in a particular Org.
  • Reboot-vRA-DeploymentResource - Reboots a vRA Deployment resource (vSphere virtual machine) in a particular Org.
  • Suspend-vRA-DeploymentResource - Suspends a vRA Deployment resource (vSphere virtual machine) in a particular Org.
  • Change-vRA-DeploymentLease - Modifies the lease of a vRA Deployment in a particular Org.
  • Change-vRA-DeploymentOwner - Modifies the owner of a vRA Deployment in a particular Org.

Friday, January 15, 2021

Forwarding Red Hat Openshift Logs to vRealize Log Insight Cloud using Log Forwarding API

Overview

Red Hat OpenShift is an open-source container application platform based on the Kubernetes container orchestrator for enterprise application development and deployment. 

Starting with OpenShift 4.3, a new approach called the Log Forwarding API was made available to simplify integration with external log aggregation solutions such as vRealize Log Insight Cloud (vRLIC).

What is the Log Forwarding API?

The Log Forwarding API enables you to configure custom pipelines to send container and node logs to specific endpoints within or outside of your cluster. You can send logs by type to the internal OpenShift Container Platform Elasticsearch instance and to remote destinations not managed by OpenShift Container Platform cluster logging, such as an existing logging service, an external Elasticsearch cluster, external log aggregation solutions, or a Security Information and Event Management (SIEM) system.


What is being deployed as part of the integration? 

We will deploy a fluentd forwarder within the OpenShift cluster, which will forward the logs to vRealize Log Insight Cloud.

Once the logs are flowing, you can create a dashboard to visualize your OpenShift environment; below is a sample dashboard I created.

Pre-requisites


  • OpenShift environment 4.3 and above
  • Cluster Logging Operator & Elasticsearch Operator installed for the openshift-logging namespace
  • Cluster logging instance created using the Custom Resource Definition

The procedure is documented here:
https://docs.openshift.com/container-platform/4.3/logging/cluster-logging-deploying.html#cluster-logging-deploy-clo_cluster-logging-deploying

Procedure

Following are the high-level steps that will be executed:

  1. Generate a vRealize Log Insight Cloud API key
  2. Create a namespace for vRLI Cloud
  3. Clone the GitHub repository, which has the required files (fluent.conf & fluentd-deployment.yaml)
  4. Create a ConfigMap from the fluent.conf file for the fluentd deployment
  5. Create the fluentd deployment using fluentd-deployment.yaml
  6. Get the cluster IP of the fluentd forwarder service
  7. Create a cluster log forwarding instance and configure it to forward logs to vRLI Cloud


Step 1


Generate vRealize Log Insight Cloud API Key from here

Step 2 

oc create ns vrlicloud-logging


Step 3

git clone https://github.com/munishpalmakhija/loginsight-cloud-openshift.git

Update fluent.conf with the API key generated in Step 1.
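The edit itself is a one-line substitution. Here is a minimal sketch, assuming a hypothetical YOUR_API_KEY placeholder (open the cloned fluent.conf to confirm the actual field name and token in the repo):

```shell
# Hypothetical stand-in for the repo's fluent.conf; in practice, skip this
# line and edit the file you cloned in Step 3 instead.
printf 'bearer_token YOUR_API_KEY\n' > fluent.conf

# Substitute the placeholder with the API key generated in Step 1.
API_KEY='my-generated-api-key'
sed -i "s/YOUR_API_KEY/${API_KEY}/" fluent.conf
```

After editing, grep-ing the file should show your key in place of the placeholder.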


Step 4


oc create configmap vrlicloud-fluent-config --from-file=fluent.conf -n vrlicloud-logging

Step 5


Deploy fluentd using the following YAML file:

oc apply -f fluentd-deployment.yaml

It creates the following resources

  • ServiceAccount - fluentd-vrlicloud-logging
  • ClusterRole - fluentd-vrlicloud-clusterrole
  • ClusterRoleBinding - fluentd-vrlicloud-clusterrole
  • Deployment - fluentd-vrlicloud-logging
  • Service - fluentd-vrlicloud-logging


Step 6

Get the cluster IP of the fluentd service:

oc get service -n vrlicloud-logging

Note the CLUSTER-IP of the fluentd-vrlicloud-logging service; you can also retrieve it directly with oc get service fluentd-vrlicloud-logging -n vrlicloud-logging -o jsonpath='{.spec.clusterIP}'.



Step 7

Create the cluster log forwarding instance using the CRD YAML.

OpenShift environments (version <4.6) with the Tech Preview (TP) of the Log Forwarding API require the ClusterLogging instance to be annotated as follows:

oc annotate clusterlogging -n openshift-logging instance clusterlogging.openshift.io/logforwardingtechpreview=enabled

Create a YAML file as below 

The endpoint is the cluster IP of your fluentd forwarder from Step 6. The YAML has been tested on OpenShift 4.3. The Log Forwarding API was promoted from Technology Preview to Generally Available in OpenShift Container Platform 4.6, which requires you to make a change to your ClusterLogging custom resource (CR) and to replace your LogForwarding custom resource (CR) with a ClusterLogForwarder CR. Please refer to the OpenShift documentation.
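A minimal sketch of such a Tech Preview LogForwarding instance, written out via a heredoc. It assumes the v1alpha1 TP API from OpenShift 4.3; the endpoint 172.30.10.20:24224 is a placeholder — substitute the cluster IP you recorded in Step 6 (24224 is the default fluentd forward port):

```shell
# Write a minimal LogForwarding CR (Tech Preview, OpenShift 4.3 API).
# The endpoint below is a placeholder -- substitute the cluster IP of the
# fluentd-vrlicloud-logging service from Step 6.
cat > log-forwarding.yaml <<'EOF'
apiVersion: logging.openshift.io/v1alpha1
kind: LogForwarding
metadata:
  name: instance
  namespace: openshift-logging
spec:
  disableDefaultForwarding: true
  outputs:
    - name: vrlicloud
      type: forward
      endpoint: 172.30.10.20:24224
  pipelines:
    - name: app-logs
      inputSource: logs.app
      outputRefs:
        - vrlicloud
EOF
```

Apply it with oc apply -f log-forwarding.yaml; on 4.6 and later, the equivalent resource is a ClusterLogForwarder in the logging.openshift.io/v1 API.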

Create CRD using the YAML file either from the UI or command line

If everything is successful, you can search for logs using the following filter:

environment contains openshift

You can also create a dashboard to view your OpenShift environment.


This procedure has been co-validated by Dean Lewis. Please do visit his blog for the same procedure with OpenShift version 4.6.