Pages

Saturday 9 March 2019

DevSecOps Pune Meetup 4



9th March 2019

This is our 4th DevSecOps meetup. More swag sponsors have been added to the list. New sponsors Elastic, Sonatype and Snyk join the previous sponsors, Polyverse and Cloudneeti. The meetup head count was, as usual, exact. We have been hitting the correct and intended audience.

Qualys Pune was the venue for the meetup. This is the 2nd time we organized the meetup at Qualys. We started at 10.30 am. We had a huge list of topics this time and I was pretty sure not all of them would get discussed, since the most voted topics were very interesting. The following topics got discussed:

  • Machine Learning for Security: This topic got discussed for a good 40 minutes, although none of us was an ML engineer and we only understood theoretical ML concepts. The topic was chosen more for discussion, as the person who suggested it was simply curious to see whether this could be done. The topic unfolded as people discussed how ML works and how it is being used in the industry. Different tools like TensorFlow, Pandas, etc. got discussed. Once we had a clear picture of ML, we moved on to the security breaches we had experienced in our Ops lives in the industry. Based on that experience, we framed the logic for how an ML algorithm could be written by Data Science engineers by studying security auth logs and application logs. We also discussed how the ELK stack could be used to prove a security attack on a system. We agreed that as Ops people we can, at best, provide inputs and prove security issues in systems; an ML engineer would be the best person to suggest which ML algorithms could mitigate those issues.
  • Ansible for DevSecOps CI/CD pipeline: Ansible was suggested multiple times in past meetups as well, but always went unnoticed and never got discussed. In this session, we started with Ansible and went on for a good 1 hour. We understood how Ansible is best used in different organizations. We also discussed how badly written Ansible code can mess things up and end up no better than an old-style shell script. We discussed the idempotency feature of Ansible, and how Ansible is being used by some firms for provisioning, configuration management and deployment altogether. We agreed that many had used Puppet for multiple jobs and found moving to a different tool difficult later; hence using different tools for specific jobs could be useful. We discussed a typical CI/CD workflow with Terraform for provisioning, Ansible for CM, and Fabric/Capistrano or language-specific deployment tools for easy deployments and rollbacks. We also did a small whiteboard presentation on how CI/CD can cover all 3 purposes, and discussed DR strategies and cloud managed services like auto-scaling.


  • Git for securing code: The suggester was a Developer and knew exactly how Git was not being used in the best manner to ensure security. He highlighted some best practices like code reviews, Git hooks used in Jenkins, linting analysis using pre-commit hooks, and static code analysis with SonarQube before merge. We had already discussed this in detail in the CI/CD pipeline topic above as well.
  • Burpsuite: Burpsuite was a tool that many knew of but never got a chance to use as such. The suggester of the topic had extensive experience using it and spoke about the multiple possibilities and use cases that could be achieved with Burpsuite. He also highlighted the precautions that need to be taken before using this tool for testing, giving examples from one of his projects. Burpsuite was indeed very interesting for many members and we decided to have a demo session on it.
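As a toy illustration of the auth-log study mentioned in the ML discussion: long before any ML enters the picture, failed SSH logins can be counted per source IP straight from sshd logs. The log lines below are made up for the sketch; real lines live in /var/log/auth.log (or /var/log/secure).

```shell
#!/bin/bash
# Made-up sshd lines standing in for /var/log/auth.log
cat > sample_auth.log <<'EOF'
Mar  9 10:00:01 host sshd[101]: Failed password for root from 203.0.113.9 port 52100 ssh2
Mar  9 10:00:03 host sshd[102]: Failed password for admin from 203.0.113.9 port 52101 ssh2
Mar  9 10:00:05 host sshd[103]: Failed password for root from 198.51.100.7 port 41000 ssh2
EOF

# Count failed logins per source IP; in sshd's message format the
# IP is the 4th field from the end ("... from <IP> port <N> ssh2")
grep "Failed password" sample_auth.log | awk '{print $(NF-3)}' | sort | uniq -c | sort -rn
```

Output like this, fed into a dashboard or a model, is exactly the kind of input an Ops person can hand over to an ML engineer.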
Takeaways from this session for speakers to prepare were:
  •  Session required on ML in security
  • Demo on CI/CD in DevSecOps
  • Demo on Burpsuite
Many topics that did not get discussed were:
  • Nexus scan in Jenkins pipeline
  • SonarQube for secure code analysis
  • Pentesting with Python
  • IOT Security
  • Metasploit for pentesting
  • Cloud security
  • Prowler
  • Automobile software release cycle and the missing DevOps chain
  • Maven
  • Regulations
In the end, Shirish wanted to highlight the recent "All Intel Chips Open To New 'Spoiler' Non-Spectre Attack" news.

Some clicks :)














Wednesday 20 February 2019

Source build Envoy proxy on Ubuntu 18.04

sudo apt-get update
sudo apt-get install openjdk-8-jdk build-essential autoconf libtool cmake ninja-build
echo "deb [arch=amd64] http://storage.googleapis.com/bazel-apt stable jdk1.8" | sudo tee /etc/apt/sources.list.d/bazel.list
curl https://bazel.build/bazel-release.pub.gpg | sudo apt-key add -
sudo apt-get update && sudo apt-get install bazel

wget https://dl.google.com/go/go1.11.5.linux-amd64.tar.gz
tar -xvf go1.11.5.linux-amd64.tar.gz
sudo chown -R root:root ./go
sudo mv go /usr/local
echo 'export GOPATH=$HOME/go' >> ~/.profile
echo 'export PATH=$PATH:/usr/local/go/bin:$GOPATH/bin' >> ~/.profile

git clone https://github.com/envoyproxy/envoy.git
cd envoy/
bazel build --package_path %workspace%:/home/<user>/envoy/ //source/exe:envoy-static

RELAX, IT'S GOING TO TAKE A LONG TIME




To generate the example configurations run the following from the root of the repo:

mkdir -p generated/configs
bazel build //configs:example_configs


RELAX, IT'S GOING TO TAKE A LONG TIME

I wrote an Ansible playbook for this too. I will publish it on Github later.
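Until then, here is a rough sketch of what the apt steps above could look like as Ansible tasks. This is my own sketch, not the playbook mentioned above; the host group name is a placeholder.

```yaml
# Hypothetical sketch of the build-dependency tasks — not the published playbook.
- name: Prepare an Ubuntu 18.04 host to build Envoy
  hosts: builders
  become: true
  tasks:
    - name: Install toolchain packages
      apt:
        name: [openjdk-8-jdk, build-essential, autoconf, libtool, cmake, ninja-build]
        update_cache: yes
    - name: Add Bazel apt signing key
      apt_key:
        url: https://bazel.build/bazel-release.pub.gpg
    - name: Add Bazel apt repository
      apt_repository:
        repo: "deb [arch=amd64] http://storage.googleapis.com/bazel-apt stable jdk1.8"
    - name: Install Bazel
      apt:
        name: bazel
        update_cache: yes
```

The bazel build itself is better run by hand (or via a long async task), since it takes long enough to hit Ansible's default timeouts.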

Easily SCP/Rsync through a bastion host or through multiple hops

Often we work in environments where we need to copy files or directories from a local system to a server that can be accessed only through a bastion host. In such cases, we typically transfer from the local machine to the bastion, and from the bastion to the intended server. This is time consuming, repetitive and unreliable too. There are many ways to automate this. I found a way to get it done through SSH tunneling. Here's how it works:

There are 3 machines involved here:
  1. localhost
  2. Bastion host
  3. Intended server

1. Create an SSH tunnel from localhost to the intended server through the bastion. The tunnel will be created from port 1234 on localhost. You may choose any other port.
ssh -L 1234:<intended_server>:22 <user>@<bastion-host> cat -
2. In a new tab initiate the file/directory transfer using the tunnel port
scp -P 1234 <file_to_transfer> <user_of_intended_server>@127.0.0.1:~/

As I did this, I realized SCP was very slow in getting the transfer done due to its linear and sequential file transfer behavior. Hence, I used Rsync, which made it pretty fast due to its delta-based transfer algorithm:

rsync -avz -e "ssh -p 1234" <file_to_transfer> <user_of_intended_server>@127.0.0.1:~/
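As an aside, newer OpenSSH (7.3+) can set up the hop for you with ProxyJump, so the manual tunnel step can be skipped entirely. A sketch for ~/.ssh/config, with the same placeholder host and user names as above:

```
# ~/.ssh/config — host alias "target" and the bracketed values are placeholders
Host target
    HostName <intended_server>
    User <user_of_intended_server>
    ProxyJump <user>@<bastion-host>
```

With this in place, scp <file_to_transfer> target:~/ or rsync -avz <file_to_transfer> target:~/ goes through the bastion transparently, and ProxyJump entries can even be chained for multiple hops.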

Saturday 9 February 2019

DevSecOps Pune Meetup 3



9th Feb 2019

This is our 3rd DevSecOps meetup. I am glad to see the head count getting even better. Another goodies sponsor was added to the list: Cloudneeti Software sponsored T-shirts and mugs for the best presenter.

For this meetup, we got an exact count again with 2-3 last minute exits. Cloudneeti Software was the venue for this meetup. We started sharp at 10.30. Surprisingly, the topics discussed this time were pretty advanced and much sought after. The following topics got discussed:
  • Kali Linux for security: The topic was a super hit. We spoke for a good 20 minutes on Kali Linux and only stopped because the next topics were equally interesting. We spoke about the old distro Backtrack and how Kali got introduced. Others added their inputs on how Kali is used by different SecOps people in the industry. Overall we all agreed that Kali was too vast a topic to just be discussed; with so many tools within, it rather needed a presentation + demo.
  • OWASP Top 10 web app security risks: As we discussed Kali, we also spoke about the OWASP Top 10 in the same discussion, and this stretched on for another 20 mins. Not all of us were aware of all the OWASP Top 10 attacks, hence we Googled the list just for making notes. However, with time limitations we could only discuss the most famous, SQL Injection, and a little about XSS. We also concluded that this needed a bigger session.
  • CI/CD in DevOps Pipeline: The DevSecOps CI/CD pipeline was discussed briefly in the 1st meetup, but we did not deep dive into it then. In this meetup we actually went into the depths of Jenkins, Git, SonarQube, static code analysis, container image security, the role SecOps has to play, vulnerable libraries being used by devs and how to resolve this, how GitHub now has inbuilt vulnerability analysis, and many more details. We also drew the following CI/CD architecture diagram for the DevSecOps pipeline.
  • Ansible secure key rotation: This was more of a question to the forum on how this could be done, since Chef and Puppet use clients with their own keys for security, while Ansible uses SSH. We agreed that the keys must get rotated, and many companies do follow this. The perfect answer suggested was that Ansible's authorized_key module helps you rotate keys.
  • Security Patching at scale in cloud: This was again a question. The answer was that many companies do this using cloud native tools to create golden images, plus other tools to get into the image and verify that it is CIS compliant. This applies not just to images but also to the cloud environment and resources in general. Once again, Prowler was discussed as a way to evaluate the cloud environment.
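The key-rotation answer above can be sketched as a single Ansible task. The host group, user name and key file here are hypothetical; the interesting part is the exclusive flag, which is what actually completes a rotation by dropping the old keys:

```yaml
# Sketch of key rotation with the authorized_key module — names are placeholders
- name: Rotate SSH keys for the deploy user
  hosts: all
  tasks:
    - name: Install the new public key and remove all other keys
      authorized_key:
        user: deploy
        state: present
        exclusive: yes   # wipes every key not in 'key', finishing the rotation
        key: "{{ lookup('file', 'new_id_rsa.pub') }}"
```

Run with care: with exclusive, whatever key the play is using to connect must be the new one, or you lock yourself out on the next run.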
Takeaways from this session for speakers to prepare were:

  •  Kali Linux hands on demo
  • OWASP Top 10 risks
  • CI/CD in DevSecOps
A few topics that did not get discussed were:
  • Metasploit for Pentesting
  • GDPR automation
  • Deploying Software securely
  • Cloud security trends 2018-19
An important topic that Budhram put forth was what minimum qualifications and expertise companies look for in a fresher to consider them an eligible candidate for DevOps Engineering. This was a good debate cum discussion that we all spoke about in the end.

This turned out to be a long event in spite of the small number of attendees. Ashish, Budhram, Shrikant and Dhiru got goodies for keeping the discussion happening and actively participating in all topics. Some others got stickers.

Some clicks :)







Copying ssh keys easily

I use VMs/Vagrant a lot in my day-to-day work for sysadmin/devops automation. One of the problems I always face is authorizing my servers for the 1st time with my master host. If I am using 10 VMs, do I need to authorize them 10 times? I wrote a small script to automate this process:

  1. Create a "list" file and add all IPs and hostnames for the VMs in it.
  2. Create a "password" file to write your SSH password in. You may choose to hardcode the password in the bash script itself; however, a separate file gives me the flexibility to commit the script to my source code if needed, by putting the password file in .gitignore.
  3. Next, create a shell script that will read the IP addresses and hostnames from the "list" file and the password/s from the password file (I generally keep the same password for all VMs for simplicity).
  4. Remember, 2 commands are useful here: ssh-copy-id and ssh-keyscan. Here's how you use them:
 
    ssh-keyscan -H <IP> >> ~/.ssh/known_hosts
    sshpass -f <password> ssh-copy-id -i ~/.ssh/id_rsa.pub <USER>@<IP>
The ssh-keyscan command gathers the public SSH host key of the specified VM host. After collecting the public host key, it adds it to your localhost. You can verify this by checking the contents of "~/.ssh/known_hosts".

The ssh-copy-id command copies the public key of your default identity (otherwise use -i identity_file for other identities) to the remote host. You can verify this by checking the contents of ~/.ssh/authorized_keys on the VM host.

The final script looks like this with a loop:

#!/bin/bash
user="vagrant"
while read -r ip; do
    ssh-keyscan -H "$ip" >> ~/.ssh/known_hosts
    sshpass -f password.txt ssh-copy-id -i ~/.ssh/id_rsa.pub "$user@$ip"
done < ./list


This is what my "list" file looks like:



consul-server1
consul-server2
bootstrap-server1
client1
client2
client3
client4
client5
192.168.3.111
192.168.3.112
192.168.3.121
192.168.3.151
192.168.3.152
192.168.3.153
192.168.3.154
192.168.3.155

I purposely add IPs as well as hostnames, as I keep using them interchangeably. I also came to know about the Ansible authorized_key module, which does the ssh-copy-id task:
- name: Set authorized key for user ubuntu copying it from current user
  authorized_key:
    user: ubuntu
    state: present
    key: "{{ lookup('file', lookup('env','HOME') + '/.ssh/id_rsa.pub') }}"
However, you will still need the ssh-keyscan here. This script comes in handy for ops who keep destroying their local environment and using a new one. This is available on Github: https://github.com/iamrawtion/ansible-autossh
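That said, the ssh-keyscan step can also be folded into Ansible with the known_hosts module. This is my own sketch, not part of the linked playbook; the group name "vms" is a placeholder for wherever your inventory lists the hosts:

```yaml
# Sketch: replace the manual ssh-keyscan with Ansible's known_hosts module
- name: Add each VM's host key to known_hosts
  known_hosts:
    name: "{{ item }}"
    key: "{{ lookup('pipe', 'ssh-keyscan -H ' + item) }}"
    state: present
  loop: "{{ groups['vms'] }}"
  delegate_to: localhost
```

It still shells out to ssh-keyscan via the pipe lookup, but keeps the whole flow in one playbook.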

Sunday 20 January 2019

AWS Tagger




Image credits : jdhancock

Tagging in AWS is often not considered useful by many users. Tagging of resources in the cloud and the DC not only helps us identify resources, but can also do multiple other wonders that one might never have thought about. We don't tag resources in the cloud for many reasons, laziness being the topmost.

Let's see why tagging is important:
  1. Identification and Isolation: Tagging allows identification of resources as to what purpose a specific resource may have been created for. It also allows you to separate resources from each other. e.g. separating different environments.
  2. Automation: When you tag resources with certain values, you can ensure that your automation scripts address only certain intended resources and not all of them, e.g. execute security patches on certain systems that need to be compliant.
  3. Costing: Based on tags, you can identify which resources are costly and make business decisions based on the results received.
  4. Define ownership: You can also understand based on proper tags as to who are the stakeholders for a certain resource or group of resources.
  5. Versioning: Sometimes, when you need certain resources to be preserved based on their state, you may also version them based on tagging. Although AWS provides a versioning mechanism for a few services, it may not be applicable to all of them.
In many organizations, the importance of tagging is understood a lot later, by which time it is almost too late to start and tagging all the resources becomes an almost entirely manual process. Or you may need to write complex programs to identify systems and tag them as per your requirement. Thankfully, AWS Tagger comes to the rescue if you have a requirement to tag your AWS resources. You may also bulk tag them to avoid a lot of manual work. So how do we do this?

It's a 3 step process to bulk tag resources:
  1. Collection: This is a simple process. All you need to do is collect all the resources in a file; thereafter you may process this data. AWS Tagger heavily depends on the resource IDs of all the resources you create. The resource IDs are further used to implement all the tags. To get the resource IDs for all the resources, simply log in to your AWS account and navigate to https://resources.console.aws.amazon.com/r/tags . On this page, you are given a field to enter the region for the resources you want, and to choose the types of resources. Choose "All resource types" here and click on the "Find Resources" button. Click on the "Download" button to download the CSV data generated.
  2. Identification and filtering: I recommend this step particularly to filter the data so that AWS Tagger can act on individual resources. Here you may use your Excel skills to separate the data based on resource types.
  3. Tagging: Once the resources are separated, you may start executing AWS Tagger scripts as per the documentation provided on their Github page.
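Step 2 can also be scripted instead of done in Excel. A small sketch that splits the downloaded CSV into one ID file per resource type, ready to feed into AWS Tagger; the column names and sample rows below are assumptions, so check them against the header of your actual download:

```shell
#!/bin/bash
# Sample rows standing in for the Tag Editor CSV download
# (assumed columns: Identifier,Type,Region — verify against your file)
cat > resources.csv <<'EOF'
Identifier,Type,Region
i-0abc123,AWS::EC2::Instance,ap-south-1
vol-0def456,AWS::EC2::Volume,ap-south-1
i-0ghi789,AWS::EC2::Instance,ap-south-1
EOF

# One .ids file per resource type, e.g. AWS__EC2__Instance.ids
awk -F, 'NR>1 { f=$2; gsub(/::/, "__", f); print $1 >> (f ".ids") }' resources.csv

cat AWS__EC2__Instance.ids
```

Each resulting file is a clean list of resource IDs for one type, which is exactly the shape the per-resource AWS Tagger scripts want.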

Friday 18 January 2019

DevSecOps Pune Meetup 2

For the 2nd DevSecOps meetup, we already had our first swag sponsor. For DevOps Pune, I received swag from Docker and Ansible. Hashicorp was also planning to send some.

With DevSecOps our 1st swag sponsor was Polyverse :) I couldn't resist posting these.


For this meetup, I changed the RSVP format to get an exact count. I was expecting to start soon at a bigger location, and we couldn't risk wasting resources. Lean Coffee needs logistics to be taken care of, and hence needs to be addressed really well. Everyone who RSVP'd on the meetup page was asked to Call/SMS/WhatsApp the organizers to book a slot, so the Meetup page RSVP itself meant nothing.

Qualys Pune was our venue, logistics and refreshment sponsor this time. When I 1st told them about the meetup, this is how they arranged the seating :) :






I told them about the format and later we changed the seating to best suit the format. 

We had 12 RSVPs and 10 attendees this time, with just 1 last-minute informed cancellation. That was a perfect number. We also made sure that the refreshments we got were packed and long-lasting ones, so we wouldn't waste food.

With 10 attendees we started pretty much on time, and this time we had a huge list to discuss. The participants were from mixed domains, unlike the last meetup: this time we had QAs, Support Engineers, DevOps, Consultants and Developers altogether.

The following topics got discussed:
  • Understanding PKI - Public Key Infra (How SSL Works?)
  • CIS Benchmarking
  • SOAR (Security, Orchestration, Automation and Response)
  • AWS Compliance
  • Securing serverless in Azure (Function as a Service)
  • Debian Linux and Contributing to it
  • Microservices with an example
SSL and PKI got discussed a lot, since Muneeb Shaikh really explained the concepts behind PKI formation that we were unaware of, and how public and private keys work. A 5 minute discussion prolonged into a good long 30 minutes with inputs from everyone.

SOAR was a new process and concept that Rahul spoke about, and it is worth reading up on for everyone. CIS compliance was a hit this time too; we understood that it was a topic worth a presentation, hence it was added to the DevOps Pune meetups agenda. Later we spoke about AWS compliance and how Prowler could be used to ensure compliance in AWS. We also discussed the importance of good naming conventions and tagging in AWS. There was chaos when we spoke about both serverless and microservices, resulting in a debate topic that we discussed at the end of the meetup. The final closure was on Debian systems and how we could contribute to them.

Takeaways from this session for speakers to prepare were:
  •  PKI and SSL encryption
  • Microservices
A few topics that did not get discussed were:
  • Types of Security and importance of each
  • GRC - Governance, Risk and Compliance
  • Security Testing with Selenium
  • Achieving CI/CD with Ansible
This turned out to be a long event in spite of the small number of attendees. Muneeb got a Polyverse T-shirt for keeping the PKI discussion happening and also actively participating in all other topics. Some others got stickers.

Some clicks :)





DevSecOps Pune Meetup 1

After some good success, a huge gap and some mixed learning experiences with organizing DevOps Pune meetups, I decided to start the DevSecOps Pune meetup. This was mainly because I was exploring possibilities in the Information Security world. The idea started after seeing the DevSecOps Seattle Meetup and the learning experience I had there simply by reading their updates. I saw regular posts on Facebook about this meetup group from my long time mentor Archis Gore. I was still confused whether to start a meetup in Pune or just stick to the Seattle meetup and attend it virtually. You don't always have to be an organizer to learn.

However, Archis told me that the Seattle meetup could not be attended virtually, as the format could not support virtual attendance. This was a Lean Coffee format, something different for me. On learning more about the format, it really sounded like a plan to start a similar meetup in Pune. Archis was here in Pune in October 2018, when I met him to understand the organizer's roles in this format. And then we were in for a great start. I got Rahul Khengare with me this time as a Co-organizer and started the meetup group. Cloudneeti helped us sponsor the meetup group.

I knew the first meetup would be a small one with limited attendees. I expected fewer than 10 attendees to show up, and the RSVP count always goes wrong; I remember wasting a lot of food and other resources in the past due to incorrect RSVP counts. I chose a location in central Pune so that it would be easy for everyone to commute. Thanks to Bobby Jadhav for sponsoring the venue, i.e. HauteBook's office.

We had 5 attendees in this meetup, as expected. The count was not important; what was more important was whether good topics came up. With Lean Coffee we expect every participant to come up with good topics to vote for and speak about. The topics that got discussed were:

  • Docker image security and its challenges (Highest voted)
  • DevSecOps CI/CD pipeline with Kubernetes
  • Cloud Security
  • Metasploit - Kali Linux
We discussed a lot about Docker security, and Snyk and Twistlock as Docker security tools were also explored further. We spoke about how CIS compliance helps cloud security, and about how engineers disable SELinux first on any system, which is a bad practice. Although we were just 5, the discussion went on for a huge 2 hours and it was indeed a wonderful learning experience.

We also decided later that the takeaways from this lean-coffee format would end up becoming speaker-attendee format topics, so we can deep dive further into important topics. This was a great takeaway, since with DevOps meetups we only spoke about what the speaker was best at, which may or may not be a community learning requirement. Takeaways from this session for speakers to prepare were:
  • CIS Compliance
  • SELinux / AppArmor
These topics were added as topics on the DevOps Pune meetup, and the hunt for speakers started there. A few topics that did not get discussed due to lack of time were:
  • iptables
  • Ansible, Terraform and CI/CD for pod deployments on AWS
  • AWS security alternatives
  • Security compliance
  • Securing Nginx
Overall it was a wonderful learning experience. Cheers to all the attendees.

Some clicks :)