Wednesday, September 5, 2018

Kubernetes Log forwarding to VMware Log Intelligence using Fluentd

In this blog I will walk you through how to use fluentd to forward logs from pods (containers) deployed on Kubernetes

Pre-requisites


  • A running Kubernetes cluster. My setup runs Kubernetes 1.11.1 on CentOS VMs on vSphere. Click Here for detailed instructions 
  • Admin access to the cluster, as we will be deploying fluentd in the kube-system namespace
  • Applications write to the "stdout" and "stderr" streams 
  • An understanding of VMware Log Intelligence. Click Here to read through it if you haven't done so yet 


Getting Started


Before getting started, make sure you understand or have a basic idea about the following concepts from Kubernetes (Click Here):

Node            


A node is a worker machine in Kubernetes, previously known as a minion. A node may be a VM or physical machine, depending on the cluster. Each node has the services necessary to run pods and is managed by the master components…

Pod


A pod is a group of one or more containers (such as Docker containers), the shared storage for those containers, and options about how to run the containers. Pods are always co-located and co-scheduled, and run in a shared context…

DaemonSet


A DaemonSet ensures that all (or some) nodes run a copy of a pod. As nodes are added to the cluster, pods are added to them. As nodes are removed from the cluster, those pods are garbage collected. Deleting a DaemonSet will clean up the pods it created. 
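To make the concept concrete, here is a minimal DaemonSet manifest sketch (the names and image are illustrative, not from the fluentd project; it simply runs one copy of a pod on every node):

```yaml
# Minimal DaemonSet sketch: schedules one logging pod per node.
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: example-logger          # illustrative name
  namespace: kube-system
spec:
  selector:
    matchLabels:
      app: example-logger
  template:
    metadata:
      labels:
        app: example-logger
    spec:
      containers:
      - name: logger
        image: fluent/fluentd:v1.2   # any per-node agent image
```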

We will be using the fluentd Kubernetes DaemonSet (Click Here), which reads the log files for the containers and sends them to VMware Log Intelligence


Options 


There are 2 ways we can send Kubernetes logs to VMware Log Intelligence.

Option 1 - Using the syslog fluentd DaemonSet via a Data Collector over the syslog protocol


This is the simplest and quickest way. You will need a Data Collector deployed. Click Here to see how to deploy a Data Collector

Steps 


Copy the yaml file for syslog (Click Here) and save it locally
Modify the yaml file to add the Data Collector IP, which accepts connections on the syslog port (514)
Create the fluentd DaemonSet by executing the following command 

kubectl create -f fluentd-syslog.yaml
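When modifying the yaml file, the fields you edit are typically the syslog host and port environment variables in the container spec; a sketch (the exact variable names may differ in your version of the yaml):

```yaml
# Sketch of the container env section pointing fluentd at the Data Collector
env:
  - name: SYSLOG_HOST
    value: "192.168.1.100"   # replace with your Data Collector IP
  - name: SYSLOG_PORT
    value: "514"
```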


Once deployed and configured successfully, you will see logs from the fluentd pods in VMware Log Intelligence

Option 2 - Using a custom DaemonSet sending directly to VMware Log Intelligence via the HTTPS protocol


You will need to modify the DaemonSet and install the out-http-ext fluentd plugin, which can forward logs directly to VMware Log Intelligence using API keys

Steps


The first step is to create your own Docker image 

I will be using the Debian image as it is recommended for production environments: https://github.com/fluent/fluentd-kubernetes-daemonset/tree/master/docker-image/v1.2/debian-syslog
From the documentation: "The following repository exposes images based on Alpine Linux and Debian. For production environments, we strongly suggest using the Debian images."

Clone the GitHub repo. Install git (if you don't have it already) and execute the following command

git clone https://github.com/fluent/fluentd-kubernetes-daemonset.git

Navigate to "fluentd-kubernetes-daemonset/docker-image/v1.2/debian-syslog/conf"

Modify the fluent.conf file as follows 


Before

After

You will notice that we have added two environment variables. This is similar to what we have for the standalone fluentd setup. Click Here to view the standalone fluentd installation
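As a rough sketch of what the modified output section looks like (the endpoint URL and the environment variable names below are illustrative placeholders, not the exact values; refer to the linked standalone post for the real configuration):

```conf
# Sketch only: forward everything to Log Intelligence via the out-http-ext
# plugin. Endpoint and API key come from two environment variables
# (names here are illustrative).
<match **>
  @type http_ext
  endpoint_url "#{ENV['LOG_INTELLIGENCE_ENDPOINT']}"
  http_method post
  serializer json
  use_ssl true
  headers {"Content-Type": "application/json", "Authorization": "Bearer #{ENV['LOG_INTELLIGENCE_API_KEY']}"}
</match>
```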

Next, we will need to modify the Dockerfile ("fluentd-kubernetes-daemonset/docker-image/v1.2/debian-syslog/Dockerfile") to install the out-http-ext plugin by adding the following line

&& gem install fluent-plugin-out-http-ext -v 0.1.10 \
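In context, that line sits inside the image's RUN instruction, roughly like the sketch below (the surrounding package and gem list varies by image version, so treat this as an illustration, not the exact Dockerfile):

```dockerfile
# Sketch: install the out-http-ext gem alongside what the image already installs
RUN buildDeps="sudo make gcc g++ libc-dev ruby-dev" \
 && apt-get update \
 && apt-get install -y --no-install-recommends $buildDeps \
 && gem install fluentd \
 && gem install fluent-plugin-out-http-ext -v 0.1.10 \
 && apt-get purge -y --auto-remove $buildDeps
```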


Before 

After

Next, build the image:

docker build -t docker.io/mydockerhubusername/reponame:v1 ./

It should take a couple of minutes 

You can either push the image to a registry or use the local image

You can modify the existing syslog yaml file using the following yaml file. Please ensure you generate an API key from the VMware Log Intelligence UI
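The key changes from the syslog yaml are the custom image and the two environment variables; a sketch of the relevant parts (the image name matches the build command above, and the variable names are illustrative):

```yaml
# Sketch of the relevant container section of the DaemonSet spec for Option 2
containers:
- name: fluentd
  image: docker.io/mydockerhubusername/reponame:v1   # the image built above
  env:
    - name: LOG_INTELLIGENCE_ENDPOINT    # illustrative name
      value: "https://<ingestion-endpoint>"
    - name: LOG_INTELLIGENCE_API_KEY     # illustrative name
      value: "<your API key from the Log Intelligence UI>"
```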

Execute the following command. In my case, the file is named mm-fluentd.yaml

kubectl create -f mm-fluentd.yaml

Once deployed and configured successfully, you will see logs from the fluentd pods in VMware Log Intelligence

Happy Logging !!!



Sunday, August 26, 2018

How to create your own Fluentd container to send logs to VMware Log Intelligence


In this post I will walk through how to build your own fluentd container that can be used to send logs to VMware Log Intelligence

Earlier blogs listed the procedure for installing fluentd on Linux and Windows.

Linux – Click Here
Windows – Click Here

Use Cases


This container helps with the following use cases

  • Centralized syslog receiver on the standard syslog port (514), forwarding to VMware Log Intelligence
  • Centralized app logs receiver on HTTP port 9880, forwarding to VMware Log Intelligence
  • Centralized server receiving events from other fluentd agents over TCP port 24224, forwarding to VMware Log Intelligence
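The three use cases above map to three fluentd `<source>` blocks; a sketch consistent with the ports listed (note the container listens on 5140 internally for syslog, which is mapped from host port 514 in the docker run command later in this post):

```conf
# Sketch of the input side of fluent.conf for the three use cases
<source>
  @type syslog        # centralized syslog receiver
  port 5140
  tag syslog
</source>
<source>
  @type http          # app logs over HTTP
  port 9880
</source>
<source>
  @type forward       # events from other fluentd agents
  port 24224
</source>
```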

VMware Log Intelligence is a cloud-based service, which means you will need to send logs over the WAN. Users may not want to send logs directly from every machine, so having a centralized server that forwards to the cloud is a good idea 

Steps


We will be using https://github.com/fluent/fluentd-docker-image and modifying it to install the out-http-ext plugin, which is needed to send logs to VMware Log Intelligence

Execute the following command to clone the fluentd-docker-image repo

git clone https://github.com/fluent/fluentd-docker-image.git

You will notice it creates a folder called "fluentd-docker-image"

I will be using fluentd version 1.2, which is the latest, with the Debian image, as it is the recommended one

Navigate to the folder with cd fluentd-docker-image/v1.2/debian 

You will see Dockerfile 

The only change we will make is to install the out-http-ext plugin in the docker image

Add the following line in the Dockerfile

&& gem install fluent-plugin-out-http-ext -v 0.1.10 \

Next, we will run docker build to build the container image

docker build -t docker.io/mmakhija/fluentdcontainer:v0 ./

A couple of points to note

  • The reason I am choosing this name is so that I can push the container to the public Docker Hub. You don't need to do this unless you want to share it with others. If you have your own private repo you can use that
    • mmakhija – Docker Hub username
    • fluentdcontainer – repo name on Docker Hub
  • It will take a couple of minutes to build the image. The output will show some red text; don't be scared of it like I was when I did it for the first time


You can view the image by executing the following command

docker images




Next, you will need to copy the fluent.conf listed below onto the docker host. Please remember the path where you save it, because you will need to specify it in the docker run command.

I have saved the file at "/fluent/fluent.conf", as specified below

https://github.com/munishpalmakhija/fluentd/blob/master/fluent.conf


Execute the following command to run the fluentd docker container.

docker run -d -p 24224:24224 -p 24224:24224/udp -p 514:5140 -p 514:5140/udp -p 9880:9880 -v /fluent:/fluentd/etc/  mmakhija/fluentdcontainer:v0

Please ensure the image name matches the one you used when you ran the docker build command, and that the path to the fluent.conf file is correct

If everything goes well, you will see the fluentd container running as below. You can then configure an ESXi host to forward logs to the IP of the docker host where this container is running, on port 514, and you should see logs flowing

Please feel free to leave comments or suggestions, or let me know if something is not working 


Saturday, August 25, 2018

Windows vCenter Log forwarding to VMware Log Intelligence using Fluentd

I co-authored a blog where we showed how to install fluentd and send logs to VMware Log Intelligence (Click Here). However, we did that for Linux, which covers most of the scenarios. In this blog I will walk through the fluentd installation on Windows, where I have vCenter installed as an application

Steps


In the following section I will walk through how to install fluentd on Windows Server 2012 R2, which has vCenter 6.5 installed

Install td-agent


You can download the ".msi" file from here, and install the software
Configure and Run td-agent


After you've installed the .msi package, you'll see a program called Td-agent Command Prompt. Please double-click this icon in the Windows menu and execute the following command

fluentd -c etc\td-agent\td-agent.conf
Please launch another Td-agent Command Prompt and type the command below

echo {"message":"hello"} | fluent-cat debug.event

It's working properly if the td-agent process outputs the message
Register and Run td-agent as a Windows Service


Please execute the Td-agent Command Prompt again, this time with administrative privileges, and type the two commands below.

fluentd --reg-winsvc i
fluentd --reg-winsvc-fluentdopt '-c C:/opt/td-agent/etc/td-agent/td-agent.conf -o C:/opt/td-agent/td-agent.log'

You will now be able to see "Fluentd Windows Service" as one of the installed services

Right-click and start the service. By default, the Startup Type is Manual; you can change it to Automatic if needed

You should see a log file being created at "C:/opt/td-agent/td-agent.log"
Install plugins


We will additionally need to install the HTTP output plugin to send logs directly to VMware Log Intelligence.

Please execute the following command in the Td-agent Command Prompt

fluent-gem install fluent-plugin-out-http-ext
Configure td-agent.conf with Windows vCenter Details


Default location for vCenter vpxd logs is "C:\ProgramData\VMware\vCenterServer\logs\vmware-vpx"

We will need a pos (position) file, which fluentd uses to keep track of how far it has read into each log file. I have created the file in the same location as the vpxd logs to keep it simple. Please ensure it has appropriate permissions

Modify the default td-agent.conf, which is located at "C:\opt\td-agent\etc\td-agent.conf", by replacing it with the following configuration.




https://github.com/munishpalmakhija/fluentd/blob/master/td-agent-windows.conf
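As a rough sketch of what the linked configuration does (the paths come from this post; the endpoint and header values are illustrative placeholders, refer to the linked file for the real ones):

```conf
# Tail the vCenter vpxd logs and forward them via the out-http-ext plugin
<source>
  @type tail
  path C:/ProgramData/VMware/vCenterServer/logs/vmware-vpx/vpxd*.log
  pos_file C:/ProgramData/VMware/vCenterServer/logs/vmware-vpx/vpxd.pos
  tag vpxd
  format none
</source>
<match vpxd>
  @type http_ext      # from fluent-plugin-out-http-ext
  endpoint_url https://<ingestion-endpoint>    # illustrative placeholder
  http_method post
  serializer json
  use_ssl true
  headers {"Content-Type": "application/json", "Authorization": "Bearer <your API key>"}
</match>
```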

Please note you will need to generate your own API key for your org, as mentioned in my blog here

Restart the "Fluentd Windows Service" for the new config to be used

If everything goes well, you should be able to view the logs by filtering on hostname contains windowsvcenter
Wednesday, July 11, 2018

Using Fluentd to send logs from any cloud to VMware Log Intelligence


This blog post demonstrates the step-by-step process of sending logs from any cloud to VMware Log Intelligence using Fluentd. This blog has been co-authored by Yang Liang & Chris McClanahan

What is Fluentd 


Fluentd is an open source data collector, which lets you unify data collection and consumption for better use and understanding of data


https://cloud.vmware.com/community/2018/07/10/using-fluentd-send-logs-cloud-vmware-log-intelligence/

Monday, June 25, 2018

Log Intelligence Features and UI Walk Through


This post describes the features of Log Intelligence and walks through the UI

Home Page


When you log in, this is the first page where you will land. You will see 3 sections

  • Search Bar
  • Event Observations
  • Recent Alerts

Search Bar


Assuming you have data ingested already, you can start your smart searching. It allows you to search for keywords and queries. One key feature is that it offers auto-complete suggestions while you are typing. A couple of example searches could be

  • logs where text contains error
  • hostname containing VC


You can try different combinations to match your requirement

Event Observations


If this is the first time you have logged in and there are no Data Collectors, it will show "Add Collector". You can refer to my previous blog which shares the process: How to Deploy Data Collector.

If you already have a Data Collector and data is being ingested, it will show interesting events Log Intelligence has detected in the last hour. Observations can include spikes or dips in vSphere errors, warnings, or all events, or hosts sending an unusual number of events compared to other hosts in the last hour.

You can also view the Observations for the last day to analyze and identify anything obvious that could have caused them.

Recent Alerts


This section shows any alerts triggered in the last hour for the alerts that have been configured and enabled. You can also view alerts triggered over the last day in your environment

If you have an alert triggered, you can click it from this section and it will show you the exact details of the triggered alert



Explore Logs


I like to call this page the heart of Log Intelligence. On this page you can do the following things, which I have divided into 2 types based on my personal preference

Primary

  • Search the log stream, event types, and alerts based on different filters and various time ranges
  • Analyze the logs, which is the most important purpose for a customer choosing Log Intelligence
  • Visualize the logs in the form of charts. Optionally, you can show alerts that occurred during that time frame on the chart itself. 
    • The default chart type is Area, but you can also view Column, Line, Pie, and Bubble charts based on your log search

Secondary 

  • Save log search as Queries
  • Create Alert Definitions based on your search/queries
  • Open existing queries and modify the same
  • Add Queries to Dashboards
  • Export Chart Data (CSV)
  • Export Log Events (RAW, JSON)


This page has so many things to offer that I can't cover them all here as part of an introduction. I will cover them in a dedicated blog.




Dashboards

This page displays dashboards created by you. By default, there is no OOTB dashboard. You can create dashboards using OOTB queries or custom queries. It also allows you to view the query and remove a chart from the Dashboards page 



Alerts


This section shows everything related to alerts, as the name suggests. It has 2 subpages

Recent Alerts

 As the name suggests, it shows recent alerts triggered in your environment for the alerts that have been configured and enabled. You can also visualize the number of alerts over the last hour, day, and week, and search for specific alerts

If you have an alert triggered, you can click on the 3 dots, which will give you 3 options

  • Details of the Alert - It will show the details of the specific alert, like time range and logs
  • Definition of Alert – It will open the Alert Definition, where you can view the criteria of the alert
  • Query of Alert - If you choose this, it will take you to Explore Logs and open the specific query with all the relevant filters

Alert Definitions

 This shows a list of all the Alert Definitions for your Org including

  • Out of the box (OOTB) - By default, you get all the content for VMware SDDC ( vSphere (ESXi & VC) , VSAN and NSX)
  • Custom Alert Definitions - All the Custom alerts you have created in your environment

Administration

 The actual name of this section is Manage. As the name suggests, it has a list of pages that are used to manage your org and environment. It includes

  • Email Configuration - You can configure your company's email server, which can be used to send notifications whenever an alert is triggered. By default, Log Intelligence uses a hosted server
  • Webhook Configuration - You can configure details to notify or send details to other services using a webhook
  • Data Collectors - It lists the status of all current Data Collectors, and you can add new data collectors from this page as well. You can refer to my previous blog which shares the process: How to Deploy Data Collector
  • API Keys - Details to be shared in a dedicated post. The idea is to create suspense



Log Intelligence Introduction and Getting Started

Introduction

VMware Log Intelligence is one of many SaaS offerings that are part of VMware Cloud Services

It offers unified visibility across private clouds and AWS, including VMware Cloud on AWS, to provide deep operational insights and faster root cause analysis. It adds structure to unstructured log data, provides rich dashboards and delivers innovative indexing and machine learning based intelligent grouping for faster troubleshooting.




You can refer to the Technical FAQ for more details. Click Here

Getting Started 


As you would have understood from the diagram, the first step is to deploy a Data Collector and configure systems to start forwarding logs to it.

What is Data Collector


I like to call it a gateway device between the private cloud and VMware Cloud Services, including Log Intelligence. It is a virtual machine to which systems can be configured to forward logs via syslog or CFAPI

There are 2 types of Data Collectors for Log Intelligence,
  • OVA for VMware SDDCs including VMware Cloud on AWS
  • AMI for AWS Workloads


Pre-requisites


Before you can download the OVA or deploy the AMI, you will need to request access to Log Intelligence (Request Access)

Deploy OVA

  • Log in to VMware Console using your Credentials
  • Navigate to Data Collectors Page and ADD NEW

  • On the Setup a Data Collector Virtual Appliance page, click "DOWNLOAD OVA" or copy the link if you plan to deploy directly in VC instead of downloading




  • Navigate to your Web Client, right-click where you want to deploy, and select Deploy OVF Template
  • Fill in the details needed in the wizard, including the One Time Key (OTK) listed in Setup a Data Collector Virtual Appliance
  • Once deployed, you will need to power on the VM
  • Once powered on, please wait a couple of minutes while it does its magic. You can navigate back to the Data Collectors page and validate; once it shows up with a green tick mark, it is Ready



Deploy AMI


Please ensure the Security Group is configured and all the relevant ports are allowed for inbound and outbound traffic 


  • Log into your AWS account and from EC2 select "Launch Instance". Change to the Community AMIs tab on the left-hand side and search for the instance in your region:

  • US-East Ohio : ami-66ad9c03
  • US-East N.Virginia : ami-28a20655
  • US-West N.California : ami-e63a2b86
  • US-West Oregon : ami-4f442f37
  • Select an instance size of t2.medium or larger (not the default t2.micro), then click the "Next: Configure Instance Details" button. Configure the instance as applicable to your environment, then expand Advanced Details
  • The User data option will default to text. Select the "Input is already base64 encoded" checkbox. In the dialog box, paste the One Time Key (OTK) listed in Setup a Data Collector Virtual Appliance

  • Finish the wizard to deploy and power on your instance
  • Please wait a couple of minutes while it does its magic. You can navigate back to the Data Collectors page and validate; once it shows up with a green tick mark, it is Ready