Wednesday, July 11, 2018

Using Fluentd to send logs from any cloud to VMware Log Intelligence

This blog post demonstrates the step-by-step process of sending logs from any cloud to VMware Log Intelligence using Fluentd. This blog has been co-authored by Yang Liang and Chris McClanahan.

What is Fluentd 

Fluentd is an open source data collector that lets you unify data collection and consumption, for better use and understanding of data.
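As a concrete sketch, the snippet below writes a minimal Fluentd configuration that tails a local log file and forwards each event over syslog. The collector address, port, file paths, and the choice of the remote_syslog output plugin are all illustrative assumptions, not the exact setup from this post; substitute the values from your own environment.

```shell
# Write a minimal Fluentd config (illustrative only): tail a local log file
# and forward events to a log collector over syslog.
# Host, port, and paths are placeholders -- substitute your own.
cat > fluent.conf <<'EOF'
<source>
  @type tail
  path /var/log/app/app.log
  pos_file /var/log/fluentd/app.log.pos
  tag app.logs
  <parse>
    @type none
  </parse>
</source>

<match app.**>
  # Requires the fluent-plugin-remote_syslog gem
  @type remote_syslog
  host li-collector.example.com
  port 514
  protocol tcp
</match>
EOF

echo "wrote $(wc -l < fluent.conf) lines to fluent.conf"
```

With Fluentd installed, `fluentd -c fluent.conf --dry-run` validates the file before you start the agent.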

Monday, June 25, 2018

Log Intelligence Features and UI Walk Through

This post describes the features of Log Intelligence and walks through its UI.

Home Page

When you log in, this is the first page you will land on. You will see 3 sections:

  • Search Bar
  • Event Observations
  • Recent Alerts

Search Bar

Assuming you already have data ingested, you can start your smart searching. It allows you to search for keywords and queries. One of the key features it provides is auto-complete suggestions while you are typing. A couple of example searches could be:

  • logs where text contains error
  • hostname containing VC

You can try different combinations to match your requirements.

Event Observations

If this is the first time you have logged in and there are no Data Collectors, then it will show "Add Collector". You can refer to my previous blog, How to Deploy Data Collector, which shares the process.

If you already have a Data Collector and data is being ingested, then it will show interesting events Log Intelligence has detected in the last hour. Observations can include spikes or dips in vSphere errors, warnings, or all events, or hosts sending an unusual number of events compared to other hosts in the last hour.

You can also view the Observations for the last day to analyze and identify anything obvious that could have caused them.

Recent Alerts

This section shows any alerts triggered in the last hour for the Alerts which have been configured and enabled. You can also view alerts triggered over the last day in your environment.

If you have any alert triggered, you can click on it from this section and it will show you the exact details of the triggered alert.

Explore Logs

I would like to call this page the heart of Log Intelligence. On this page you can do the following things, which I have divided into 2 groups based on my personal preferences:


  • Search the log stream, event types, and Alerts based on different filters and various time ranges
  • Analyze the logs, which is the most important purpose of a customer choosing Log Intelligence
  • Visualize the logs in the form of charts. Optionally, you can show Alerts that occurred during that time frame on the chart itself.
    • The default chart type is Area; however, you can also view Column, Line, Pie, and Bubble charts based on your log search


  • Save log search as Queries
  • Create Alert Definitions based on your search/queries
  • Open existing queries and modify the same
  • Add Queries to Dashboards
  • Export Chart Data (CSV)
  • Export Log Events (RAW, JSON)

This page has so much to offer that I can't cover it all as part of an introduction. I will cover it in a dedicated blog.


Dashboards

This page displays dashboards created by you. By default, there is no OOTB dashboard. You can create dashboards using OOTB queries or create custom queries. The Dashboard page also allows you to view a chart's query and remove the chart.


Alerts

This section shows everything related to Alerts, as the name suggests. It has 2 subpages:

Recent Alerts

As the name suggests, it shows recent alerts which have been triggered in your environment for the Alerts which have been configured and enabled. You can also visualize the number of Alerts over the last hour, day, and week, and search for specific Alerts.

If you have any alert triggered, you can click on the 3 dots, which will give you 3 options to view:

  • Details of the Alert - It will show the details of the specific alert, such as the time range and logs
  • Definition of Alert - It will open the Alert Definition, where you can view the criteria of the Alert
  • Query of Alert - If you choose this, it will take you to Explore Logs and open the specific query with all the relevant filters

Alert Definitions

This shows a list of all the Alert Definitions for your Org, including:

  • Out of the box (OOTB) - By default, you get all the content for the VMware SDDC (vSphere (ESXi & VC), vSAN, and NSX)
  • Custom Alert Definitions - All the Custom alerts you have created in your environment


The actual name of this section is Manage. As the name suggests, it has a list of pages which are used to manage your Org and environment. It includes:

  • Email Configuration - You can configure your company's email server, which can be used to notify you whenever an alert is triggered. By default, Log Intelligence uses a hosted server
  • Webhook Configuration - You can configure details to notify or send details to other services using webhook
  • Data Collectors - It lists the status of all current Data Collectors, and you can add new Data Collectors from this page as well. You can refer to my previous blog, How to Deploy Data Collector, which shares the process
  • API Keys - Details to be shared in a dedicated post. The idea is to create suspense

Log Intelligence Introduction and Getting Started


VMware Log Intelligence is one of the many SaaS offerings that are part of VMware Cloud Services.

It offers unified visibility across private clouds and AWS, including VMware Cloud on AWS, to provide deep operational insights and faster root cause analysis. It adds structure to unstructured log data, provides rich dashboards and delivers innovative indexing and machine learning based intelligent grouping for faster troubleshooting.

You can refer to the Technical FAQ for more details: Click Here

Getting Started 

As you would have understood from the diagram, the first step is to deploy a Data Collector and configure systems to start forwarding logs to it.

What is a Data Collector

I like to call it the gateway device between the private cloud and VMware Cloud Services, including Log Intelligence. It is a virtual machine to which systems can be configured to forward logs via syslog or CFAPI.

There are 2 types of Data Collectors for Log Intelligence:
  • OVA for VMware SDDCs including VMware Cloud on AWS
  • AMI for AWS Workloads
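For the syslog path, an ESXi host can be pointed at the collector with esxcli. The snippet below only assembles the two commands rather than running them; the collector address is a placeholder and tcp/514 is an assumed port. You would run the printed commands on each ESXi host, for example over SSH.

```shell
# Placeholder collector address and port -- replace with your Data Collector's
# FQDN/IP and the port it listens on for syslog.
LOGHOST="tcp://li-collector.example.com:514"

# Commands to run on each ESXi host to forward its syslog to the collector:
SET_CMD="esxcli system syslog config set --loghost=${LOGHOST}"
RELOAD_CMD="esxcli system syslog reload"

printf '%s\n%s\n' "$SET_CMD" "$RELOAD_CMD"
```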


Before you can download the OVA or deploy the AMI, you will need to request access to Log Intelligence (Request Access).

Deploy OVA

  • Log in to the VMware Console using your credentials
  • Navigate to the Data Collectors page and click ADD NEW

  • On the Setup a Data Collector Virtual Appliance page, click "DOWNLOAD OVA", or copy the link if you plan to deploy directly in VC instead of downloading

  • Navigate to your Web Client, right-click where you want to deploy, and select Deploy OVF Template
  • Fill in the details required by the wizard, including the One Time Key (OTK) listed in Setup a Data Collector Virtual Appliance
  • Once deployed, you will need to power on the VM
  • Once powered on, please wait a couple of minutes while it does its magic. You can then navigate back to the Data Collectors page and validate that it shows up; a green tick mark means it is ready

Deploy AMI


To deploy the Data Collector on AWS in a US region, log in to your AWS account and from EC2 select "Launch Instance". Change to the Community AMIs tab on the left-hand side and search for the instance ID for your region:
  • US-East Ohio : ami-66ad9c03
  • US-East N.Virginia : ami-28a20655
  • US-West N.California : ami-e63a2b86
  • US-West Oregon : ami-4f442f37
  • Select an instance size of t2.medium or larger (not the default t2.micro), then click the "Next: Configure Instance Details" button. Configure the instance as applicable to your environment, then expand Advanced Details
  • The User data option will default to text. Select the "Input is already base64 encoded" checkbox. In the dialog box, paste the One Time Key (OTK) listed in Setup a Data Collector Virtual Appliance

  • Finish the wizard to deploy and power on your instance
  • Please wait a couple of minutes while it does its magic. You can then navigate back to the Data Collectors page and validate that it shows up; a green tick mark means it is ready
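The "Input is already base64 encoded" checkbox in the steps above tells EC2 not to re-encode what you paste, so the pasted value must be valid base64. If you ever need to encode a raw value yourself, a shell one-liner does it; the key below is a made-up example, not a real OTK.

```shell
# Hypothetical One Time Key -- use the OTK from your
# "Setup a Data Collector Virtual Appliance" page instead.
OTK="example-one-time-key"

# base64-encode it; printf avoids a trailing newline sneaking into the encoding
ENCODED=$(printf '%s' "$OTK" | base64)
echo "$ENCODED"   # paste this value into the User data field
```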

Please ensure the Security Group is configured and all the relevant ports are allowed for inbound and outbound traffic.

Sunday, March 11, 2018

vRealize Lifecycle Manager Installation and Configuration

It’s been a while since I wrote my last blog. It feels so nice to be back in the routine of writing blogs and sharing knowledge. I have always believed the more you share, the more you learn.

In this blog post I will be sharing details about a new product which was launched recently: vRealize Lifecycle Manager (vRLCM).

What is vRealize Lifecycle Manager (vRLCM)

vRLCM automates Day 0 to Day 2 operations of the entire vRealize Suite, enabling a simplified operational experience for customers. vRealize Lifecycle Manager automates install, configuration, upgrade, patch, configuration management, drift remediation, and health monitoring from within a single pane of glass, thereby freeing IT managers and cloud admins to focus on business-critical initiatives while improving time to value (TTV), reliability, and consistency.

Read the Solution Brief 



The following section takes you through the initial installation of the product:

Download the OVA from here
Deploy the OVA in your on-prem environment and power it on
Once the appliance is up and running, you can access it via a browser using its IP or FQDN

At the login page you can use the default credentials: admin@localhost / vmware
It will prompt you to choose a new password


On the initial login you will be prompted to take a short tour of the product that guides you through getting started




The following section takes you through the initial configuration:

Navigate to the Settings page --> Common Configuration
It will prompt you for the Root, Admin, and SSH user passwords. By default, SSH is enabled
The next step is to generate a certificate by entering the requested information


Now you will need to do the OVA Configuration. It gives you 3 options (Local, NFS, My VMware):
  • Local - the admin needs to copy the required OVAs manually to the vRSLCM directory (this is what I have done)
  • NFS - the admin needs to download and copy the OVAs to a shared NFS directory
  • My VMware - enter your My VMware credentials and the UI will guide you through downloading them

For fresh installs you will need the 5 OVAs listed in the image below

If you prefer to use your My VMware account, you can download them directly in the UI

The next step is to create a Data Center and add a vCenter
Navigate to Manage Datacenter, click the 3 dots, and Add Datacenter

Click Manage vCenter and add the vCenter

To verify vCenter data collection, click the Request icon in the left navigation panel

Now we will create a new Environment
Navigate to Home --> Create New Environment
Choose the installation type. (As it’s a fresh install for us, we will choose “Using Installation Wizard”)

You can choose an individual standalone product, or choose a Solution which follows the VVD 4.1 design

Accept the EULA

Enter the vRealize Suite license details, or you can use your My VMware credentials if they are updated


Select the vCenter to be used to deploy the appliance and specify details such as Cluster, Network, Datastore & Disk Format

Note – It doesn’t allow you to choose a Resource Pool

Complete the Network Details

Select the certificate to be used


We need to enter details for the products we will be installing. For this blog I am only going to install vROps 6.6.1

Confirm the details on the Summary Screen

Note – You can download the configuration file, which is in JSON format, and leverage it for future reference

You can log in to vCenter and ensure the vROps appliance is deployed and the initial cluster configuration has completed successfully

In the next blog I will try to upgrade vROps from 6.6 to 6.6.1