
Rajesh Kumar Reddy Kunam

Glen Allen

Summary

Location: Richmond, Virginia

  • 18 years of professional IT experience in DevOps, AWS, Big Data, Python, Datadog, and ELK
  • Worked in the Banking, Healthcare, and Financial domains
  • Strong experience with the AWS services EC2, EMR, S3, Lambda, IAM, RDS, SNS, EBS, and AWS CloudFormation
  • Solid hands-on experience with team-managed pipelines and One Pipeline CI/CD
  • Extensive experience with the Hadoop ecosystem
  • Heavy scripting experience in Python and UNIX shell scripting
  • Experience migrating applications to the AWS Cloud
  • Good experience with the Datadog monitoring and alerting tools
  • Good experience with ELK; developed Kibana reports

Overview

10 years of professional experience

Work History

AWS Cloud Engineer

Capital One
08.2019 - Current
  • Project: Enrichment Cloud Environment (ECE). The purpose of this project is to create an Ab Initio ecosystem for the enterprise by rebuilding the platform from the ground up in the cloud using modern tools and technologies
  • It is a transition away from our current On-Prem environment to the Cloud
  • As part of this project, application teams are making use of the latest cloud technologies
  • Environment: EC2, EMR, S3, Lambda, IAM, RDS, SNS, EBS, AWS CloudFormation, GitHub, Jenkins, and Ab Initio
  • Responsibilities:
  • Responsible for designing, implementing, and supporting AWS resources and solutions
  • Automated EC2 instance rehydration using Lambda and a team-managed pipeline
  • Created a process to back up EBS volumes using Lambda functions (see the sketch after this list)
  • Automated Lambda deployments using One Pipeline
  • Building Docker images and automating their deployment to ECE servers
  • Developed installer scripts in Python and UNIX shell for various products hosted on application servers
  • Responsible for rehydrating AWS servers in the DEV, QA, and PROD environments every 30 days
  • Converted the instance OS from RHEL 7 to Amazon Linux 2
  • Hands-on experience creating and updating IAM roles and security groups
  • Configured AWS Route 53 to route traffic between regions as part of the failover process
  • Configuring AWS CloudWatch alarms to monitor EC2 instances for memory usage, EBS volume usage, and snapshot count
  • Using Lambda functions to implement enterprise standards and follow best practices
  • Developing scripts for build, deployment, maintenance, and related tasks using Jenkins
  • Writing Jenkinsfiles for the entire pipeline process
  • Writing CloudFormation templates (CFTs) to automate infrastructure provisioning
  • Responsible for implementing Continuous Integration (CI) and Continuous Delivery (CD) processes, using Lambda functions and scripts to automate routine jobs and speed up deployments
  • Working on CloudFormation templates for building two-tier and three-tier applications
  • Collecting system logs and CloudTrail data using Elasticsearch and creating dashboards in Kibana
  • Installing, configuring, and managing the Datadog agent on instances and building dashboards for monitoring and reporting with real-time metrics
  • Creating S3 buckets, managing bucket policies, and using S3 and Glacier for storage and backup on AWS; providing role-based access and maintaining bucket policies that grant users the least privileges needed
  • Involved in technical feasibility discussions, design review meetings, peer reviews, and code reviews
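
A minimal sketch of the EBS backup automation referenced above, assuming a boto3-based Lambda handler; the tag filter, function wiring, and return shape are illustrative rather than the exact production setup.

    # Illustrative EBS snapshot backup Lambda (boto3); the tag key and
    # handler wiring are assumptions, not the production configuration.
    import boto3

    ec2 = boto3.client("ec2")

    def lambda_handler(event, context):
        # Find volumes tagged for backup (tag name is illustrative).
        volumes = ec2.describe_volumes(
            Filters=[{"Name": "tag:Backup", "Values": ["true"]}]
        )["Volumes"]

        snapshot_ids = []
        for vol in volumes:
            # Snapshot each matching volume and record the snapshot ID.
            snap = ec2.create_snapshot(
                VolumeId=vol["VolumeId"],
                Description="Automated backup of " + vol["VolumeId"],
            )
            snapshot_ids.append(snap["SnapshotId"])

        return {"snapshots_created": snapshot_ids}

In practice a function like this would be triggered on a schedule (for example by an EventBridge rule) and deployed through the team-managed pipeline described above.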

Hadoop Engineer

Capital One
06.2013 - 07.2019
  • Project Description: Implemented real-time capabilities for HBase (NoSQL database) and worked on various developments around the Spend feed (Enhanced Transactions): deploying the Bundle pipeline to support direct debit transactions, bulk loading transaction data from the Bundle pipeline into an HBase table, retrieving records based on merchant lookup data in real time, denormalizing the source data and loading it into HBase tables, and retrieving data with minimal response time
  • Merchant lookup data: the data RTM provides when a customer makes a credit card transaction
  • Environment: Pig, Java, HBase, HDFS, shell scripting, Windows 7, Linux, and a 20-node Hadoop cluster
  • Responsibilities:
  • Developed table creation scripts to create HBase tables
  • Denormalized the source data using Pig scripts and created load-ready source files
  • Ran the bulk-upload utility to load HFiles into HBase tables and performed sanity checks on the HBase region servers (a sketch follows this list)
  • Automated the end-to-end use case for batch loads into HBase tables
  • Sent daily task status updates to the onsite team
  • Communicated daily with the onsite team and clients about tasks
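
A minimal sketch of the bulk-load and sanity-check steps described above, assuming the standard LoadIncrementalHFiles utility is driven from Python; the paths, table name, and scan interval are illustrative, not the actual project values.

    # Illustrative driver for the HBase bulk-load step; paths and the
    # table name are assumptions.
    import subprocess

    HFILE_DIR = "/data/hfiles/transactions"   # HFiles produced by the upstream Pig job (illustrative)
    TABLE = "transactions"                    # target HBase table (illustrative)

    def bulk_load():
        # LoadIncrementalHFiles moves the prepared HFiles into the table's regions.
        subprocess.run(
            ["hbase", "org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles",
             HFILE_DIR, TABLE],
            check=True,
        )

    def sanity_check():
        # Cheap post-load check: row count via commands piped into the HBase shell.
        result = subprocess.run(
            ["hbase", "shell"],
            input="count '%s', INTERVAL => 100000\n" % TABLE,
            capture_output=True, text=True, check=True,
        )
        print(result.stdout)

    if __name__ == "__main__":
        bulk_load()
        sanity_check()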

Education

Master of Computer Applications

University of Madras

Skills

  • Technical Skills:
  • Cloud : AWS S3, EC2, EMR, EBS, Lambda, SQS, SNS, and RDS
  • Scripting languages : Python, UNIX Shell Scripting
  • Container Services : Docker
  • Cloud Integration : AWS
  • Monitoring Tools : AWS CloudWatch, Datadog, and ELK
  • Build Tools : Jenkins
  • Operating Systems : RHEL 7, Amazon Linux 2
  • Artifacts Repo : Artifactory
  • Source Control : Git

Timeline

AWS Cloud Engineer

Capital One
08.2019 - Current

Hadoop Engineer

Capital One
06.2013 - 07.2019

Master of Computer Applications

University of Madras