Sri Hari Bellamkonda
Sr. Cloud DevOps Engineer
EMAIL AVAILABLE
PHONE NUMBER AVAILABLE
LinkedIn: https://LINKEDIN LINK AVAILABLE

Professional Summary:
With more than 10 years of IT experience across public and private cloud environments, including Azure, AWS, GCP, OpenStack, and Kubernetes, I am a certified administrator with expertise in CI/CD pipelines, source code management, build/release processes, and virtualization technologies. I have strong troubleshooting skills, especially for performance issues, and a solid background in Unix, Linux, and Windows platforms. My experience covers Agile and Waterfall methodologies across the software development lifecycle, along with advanced knowledge of Kubernetes, OpenShift, Docker, Terraform, Azure DevOps, scripting languages, and monitoring tools.

Technical Skills:
Cloud Technologies: Azure, AWS, GCP, OpenStack
Containerization: Docker, Docker Swarm, Kubernetes
CI/CD Tools: Jenkins/Hudson, Azure DevOps
Configuration Management Tools: Ansible, Ansible Tower, Chef, Puppet
Version Control Tools: Git, GitLab, Bitbucket, SVN (Subversion), TFS
Scripting/Programming Languages: Python, Shell (PowerShell/Bash), Ruby, YAML, JSON, Perl, Groovy, JavaScript, C, PHP, Java/J2EE, .NET, Spring Core, Spring MVC, REST web services
Monitoring Tools: Splunk, Nagios, ELK, AppDynamics, CloudWatch, Prometheus, Grafana
Virtualization Technologies: VMware, Windows Hyper-V, VirtualBox, Vagrant
Operating Systems: Ubuntu, CentOS, RedHat Linux, Windows
Databases: MySQL, MS Access, NoSQL (MongoDB, DynamoDB)
Web Servers: Apache HTTP 3.x, Apache Tomcat, Nginx
Build Tools: Maven, Ant, Gradle
Bug Tracking Tools: JIRA, ServiceNow, Remedy
Repository Management: JFrog Artifactory, Nexus

Education:
Bachelor's in Computer Science, Sathyabama University, Chennai, India
Saint Peter's University, NJ, USA

Professional Experience:

Client: MasterCard, St Louis, MO    Jul 2022 to Present
Role: Sr. Azure Cloud DevOps Engineer
Responsibilities:
- Created and administered Azure Active Directory (AD) tenants, overseeing user and group management, and configured application integration with Azure AD.
- Integrated on-premises Windows AD with Azure AD, set up multi-factor authentication (MFA), and implemented federated single sign-on (SSO).
- Deployed Azure DevOps for Continuous Integration (CI), Continuous Delivery (CD), and monitoring, addressing build, deployment, and release issues for maintenance, enhancements, bug fixes, and emergency patches.
- Developed Azure DevOps CI/CD pipelines by integrating Maven, JFrog, Gradle, and SonarQube, and implemented multistage release pipelines for dev, UAT, and production environments.
- Built CI/CD pipelines in Azure DevOps for .NET and Python applications, integrating source code from GitHub and VSTS, and established deployment areas for testing, pre-production, and production in Kubernetes clusters.
- Created and managed custom Docker images, including tagging and pushing to Docker Hub, and used a Docker container registry for image storage.
- Extensive experience with Kafka architecture, including setup, configuration, and integration within Azure environments.
- Enhanced system monitoring and logging by implementing the ELK Stack, Grafana, and Prometheus, integrated with GitLab for improved visibility and troubleshooting.
- Provisioned ARM templates in Azure Pipelines for custom build processes in Azure Kubernetes Service (AKS), and integrated Azure DevOps pipelines with Azure Boards and Microsoft Teams for notifications.
- Established and maintained Continuous Integration (CI) processes using Azure DevOps (VSTS) across multiple environments, facilitating an automated, agile development process and enabling safe code deployments to Azure Kubernetes Service (AKS) using YAML scripting.
- Implemented the Hybrid Runbook Worker feature of Azure Automation accounts to run runbooks within applications and automated daily operations using Azure Automation accounts.
- Leveraged Azure Container Instances (ACI) for data processing, storing processed data in Azure Blob Storage.
- Automated the deployment and scaling of Kafka clusters using Infrastructure-as-Code (IaC) principles.
- Managed MongoDB database instances, including installation, configuration, and ongoing maintenance.
- Designed an Azure App Service plan in a specific region, utilizing allocated compute resources for application deployment within the Azure App Service Environment.
- Automated infrastructure deployments using Infrastructure as Code (IaC) tools such as Terraform and ARM templates.
- Utilized Shared Image Gallery for image storage and built Azure pipelines in Azure DevOps to implement various services.
- Employed Azure DevOps YAML pipelines, GitLab CI pipelines, and Jenkins declarative pipelines (a minimal example is sketched at the end of this engagement).
- Deployed applications using CI/CD tools such as Azure DevOps and GitHub Actions, integrating automated tests into continuous integration and running nightly scripts with GitHub Actions.
- Participated in data ingestion processes for Azure services such as Azure Data Lake, Azure Storage, Azure SQL, Azure Data Factory (ADF), and Azure Data Warehouse, and processed data in Azure Databricks.
- Created Power BI and Lens reports and dashboards using data from Kusto and Azure SQL databases, sourced from Cosmos DB and Redis DB, and utilized Scope scripting.
- Integrated existing APIs into Azure API Management, managing attributes such as security, usage plans, throttling, analytics, monitoring, and alerts.
- Strong knowledge of Azure Synapse Analytics, combining data warehousing, data integration, ETL pipelines, and analytics tools for big-data processing, visualization, and dashboards.
- Implemented Kubernetes (K8s) features, including Horizontal Pod Autoscaling (HPA), Kubernetes secrets, readiness and liveness probes, DaemonSets, ConfigMaps, and service accounts.
- Utilized the Helm package manager to develop custom Helm charts and deploy them to Kubernetes clusters.
- Employed Argo CD as a Kubernetes controller for continuous monitoring, ensuring that live applications match the desired target state.
- Collaborated with development teams to troubleshoot and optimize Kafka performance within Azure environments.
- Implemented Istio as a service mesh to manage and monitor network traffic between services across multiple Kubernetes clusters.
- Monitored Kubernetes clusters using Prometheus, configuring additional components of the Prometheus stack within Kubernetes.
- Used Flux to achieve GitOps in Kubernetes clusters, ensuring alignment between cluster configurations and Git, and automating deployments.
- Implemented Open Service Mesh (OSM) on Azure Kubernetes Service (AKS) to manage traffic, security, and observability for workloads.
- Designed and implemented Service-Oriented Architecture (SOA) with ingress and egress controls in Azure Kubernetes Service (AKS).
- Used Terraform to create application stacks and CI/CD workflows with Bitbucket, Git, Jenkins, and API gateways, and developed Lambda functions to export logs to Datadog for dashboards and alerts.
- Automated Sentinel policy workspace pipelines using Terraform Enterprise code.
- Implemented Azure DevOps solutions for development teams, managing source control, work item tracking, and build and release management.
- Leveraged Dynatrace for auto-discovery, continuous dependency mapping, and monitoring of Azure cloud services, including Azure App Service, database performance, and Azure Kubernetes Service (AKS).
- Developed and executed PowerShell, Python, and Shell scripts to automate deployment processes.
Environment: Azure, Azure DevOps, Azure Kubernetes Service, Databricks, Terraform, Ansible, Kafka, Shell, JIRA, Docker, Kubernetes, virtualization, Nginx, Nexus, GitLab, GitHub/Bitbucket
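
For illustration, the following is a minimal sketch of a multi-stage Azure DevOps YAML pipeline of the kind referenced above: it builds and pushes a container image, then deploys Kubernetes manifests to an AKS-backed environment. The service connection, repository, environment name, and manifest paths are placeholders, not values from the actual project.

```yaml
# Illustrative multi-stage Azure DevOps pipeline: build and push a container image,
# then deploy Kubernetes manifests to an AKS-backed environment.
# Registry, service connection, environment and manifest paths are placeholders.
trigger:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest

variables:
  imageRepository: sample-api               # hypothetical image repository
  containerRegistry: acr-service-connection # hypothetical Docker registry service connection
  tag: $(Build.BuildId)

stages:
  - stage: Build
    jobs:
      - job: BuildAndPush
        steps:
          - task: Docker@2
            displayName: Build and push image
            inputs:
              command: buildAndPush
              repository: $(imageRepository)
              dockerfile: Dockerfile
              containerRegistry: $(containerRegistry)
              tags: |
                $(tag)

  - stage: DeployDev
    dependsOn: Build
    jobs:
      - deployment: DeployToAks
        environment: dev        # assumes an Azure DevOps environment with an AKS resource attached
        strategy:
          runOnce:
            deploy:
              steps:
                - task: KubernetesManifest@1
                  displayName: Deploy manifests to AKS
                  inputs:
                    action: deploy
                    manifests: |
                      k8s/deployment.yaml
                      k8s/service.yaml
```

In practice the deploy stage would be repeated for UAT and production, with approvals and checks configured on the corresponding Azure DevOps environments.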

Client: CVS Health Care, Woonsocket, RI    Apr 2021 to Jun 2022
Role: Azure DevOps Engineer
Responsibilities:
- Developed dashboards in Azure DevOps to monitor CI/CD pipelines, track work items, and manage bug logs; identified and logged defects in Azure DevOps and worked with developers to prioritize issues.
- Built Azure infrastructure using ARM templates and Azure pipelines for automated build, test, and release processes.
- Managed Azure Container Registry (ACR) for storing and deploying private Docker images.
- Provisioned and automated infrastructure across cloud service models (IaaS, PaaS, SaaS) for deployment on cloud platforms.
- Designed Network Security Groups (NSGs) to manage inbound and outbound traffic for NICs, subnets, and VMs; created Azure Key Vault and variable groups for secure storage of certificates and secrets.
- Migrated SQL Server databases to Azure SQL Database using the SQL Azure Migration Wizard and automated log uploads to Azure Blob Storage via a Python API.
- Utilized HashiCorp Vault to securely store secrets, manage certificates, and provide application-level encryption.
- Implemented Kafka monitoring and alerting using Azure Monitor and the ELK Stack.
- Deployed and managed containerized applications in Kubernetes clusters, orchestrated through GitLab CI/CD (a minimal pipeline sketch follows this engagement).
- Managed Azure Databricks clusters, optimizing them for big-data processing and scalability.
- Built and maintained Virtual Machine Scale Set (VMSS) clusters to support various workloads, enhancing build flexibility through Azure Pipelines (Classic, Release, and YAML).
- Developed Azure Automation assets, including graphical and PowerShell runbooks, to manage PaaS and IaaS environments in Microsoft Azure.
- Automated cloud services deployment, including Jenkins and Buildkite on Docker, using Terraform, and configured infrastructure using Terraform scripts.
- Specialized in deploying microservices and developing applications using Azure services such as Azure DevOps, Kubernetes services (AKS/EKS), MySQL, Grafana, and RBAC, with a focus on cluster monitoring.
- Managed test plans, suites, and cases using Azure DevOps; installed and used the Test & Feedback extension for authoring, running, and analyzing manual tests.
- Implemented production environments using Terraform and Terragrunt, with backend state management in S3.
- Deployed Java Spring Boot applications on Kubernetes clusters hosted on Azure Container Instances (ACI) and optimized performance using AppDynamics.
- Worked with Red Hat OpenShift for Docker and Kubernetes, managing containerized applications through nodes, ConfigMaps, and services, and deploying containers as pods.
- Leveraged Kubernetes operators and Spinnaker pipelines for automated deployment and management of applications and databases across public and private clouds.
- Deployed and configured Kafka clusters on Azure Kubernetes Service (AKS).
- Virtualized servers using Docker for test and development environments, automating configuration, scaling, and management through Docker containers.
- Actively contributed to infrastructure as code, utilizing Terraform for execution plans, resource graphs, and change automation, and developed plugins to extend Terraform's functionality.
- Created Terraform Enterprise (TFE) Sentinel policies to manage infrastructure components, including networking, storage, and account baselines.
- Designed and implemented CI/CD pipelines using CodePipeline, CodeBuild, and CodeDeploy in Jenkins.
- Automated Jenkins CLI tasks through scripting.
- Monitored logs using Azure Monitor, Azure Log Analytics, Nagios, and Splunk, tailored for tenant-specific needs.
- Developed Python scripts to automate log rotation for web servers and wrote Shell and Python scripts to automate SSH key management and password generation.
Environment: Azure, Azure DevOps, Azure Kubernetes Service, Azure Databricks, Terraform, Kafka, GitLab, ELK (Elasticsearch, Logstash & Kibana), Ansible, Shell, JIRA, Docker, Jenkins, virtualization, Nginx, Nexus, GitHub/Bitbucket
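
As a hedged sketch of the GitLab CI/CD flow described above: one job builds and pushes the container image using GitLab's predefined registry variables, and a second job rolls the new image out to a Kubernetes deployment. The deployment name, namespace, and KUBE_CONTEXT value are assumptions for illustration, not values from the actual project.

```yaml
# Illustrative .gitlab-ci.yml: build and push an image using GitLab's predefined
# registry variables, then roll it out to a Kubernetes deployment.
# The deployment name, namespace and KUBE_CONTEXT value are assumptions.
stages:
  - build
  - deploy

variables:
  DOCKER_TLS_CERTDIR: "/certs"

build-image:
  stage: build
  image: docker:24
  services:
    - docker:24-dind
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"

deploy-to-k8s:
  stage: deploy
  image:
    name: bitnami/kubectl:latest
    entrypoint: [""]
  script:
    - kubectl config use-context "$KUBE_CONTEXT"   # context supplied by a GitLab Kubernetes agent
    - kubectl set image deployment/sample-api app="$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" -n sample
    - kubectl rollout status deployment/sample-api -n sample
  environment:
    name: production
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
```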

Client: State of MI - DTMB (Michigan Department of Technology, Management & Budget), Dimondale, MI    Feb 2020 to Mar 2021
Role: GCP DevOps Engineer
Responsibilities:
- Worked on Google Cloud Platform (GCP) services such as VPCs, Google Maps, Compute Engine, Cloud Load Balancing, and Cloud Deployment Manager.
- Set up GCP firewall rules to allow or deny traffic to and from VM instances based on specified configurations, and used Cloud CDN (content delivery network) to serve content from GCP cache locations, significantly improving user experience and latency.
- Deployed Docker containers to Kubernetes clusters on GCP using Google Kubernetes Engine (GKE); managed container lifecycles, scaling, and load balancing through Kubernetes deployments, services, and ingress controllers.
- Set up the Azure DevOps platform by creating multiple projects and teams in the organization and onboarded users at access levels such as Stakeholder, Basic, and MSDN based on their roles; involved in the application discovery phase for Java, Python, and Node.js apps.
- Used Azure DevOps (ADO) libraries, pipelines, Boards, build and release pipelines, Azure Artifacts, and related services.
- Migrated existing on-premises databases to both Azure and Google Cloud environments for a better reporting experience.
- Used Chef for configuration management of hosted instances within GCP, and configured networking for the Virtual Private Cloud (VPC).
- Built Terraform scripts and deployed them with Cloud Deployment Manager to spin up resources such as Compute Engine instances in public and private subnets, cloud virtual networks, and autoscalers on Google Cloud Platform (GCP).
- Built pipelines for monolithic and microservices-based applications as well as infrastructure-as-code pipelines, and integrated security into pipelines following DevSecOps design.
- Built pipelines for Azure Data Factory and the Azure Databricks environment.
- Worked on the GKE topology diagram, covering master and worker nodes, RBAC, Helm, kubectl, and ingress controllers.
- Provided highly available and fault-tolerant applications using orchestration technologies such as Kubernetes and Apache Mesos on Google Cloud Platform.
- Designed Terraform configurations and deployed them with Cloud Deployment Manager to spin up resources such as cloud virtual networks and Compute Engine instances in public and private subnets, along with autoscalers, on Google Cloud Platform.
- Monitored the Kubernetes cluster using Prometheus and configured additional components of the Prometheus stack inside Kubernetes, covering Kubernetes cluster components.
- Implemented Flux to enable GitOps in the Kubernetes cluster, ensuring the cluster configuration matches the one in Git and automating deployments.
- Set up and built GCP infrastructure using the shared-services VPC model, Compute Engine, Cloud Storage, Cloud SQL, and IAM with the Terraform foundation modules.
- Set up VPCs in Google Cloud and linked them to on-premises VPNs using IPsec; responsible for setting up GCP Cloud Identity and Access Management (IAM) to manage users, roles, and privileges.
- Configured Cloud Firewall rules to allow inbound traffic to the GCP GKE cluster.
- Deployed Spring Boot microservices on a GCP/GKE Kubernetes cluster configured with master and worker nodes (a minimal deployment manifest is sketched after this engagement); responsible for creating data pipelines to ingest data into BigQuery using Cloud Pub/Sub and Cloud Dataflow.
- Migrated legacy applications to the GCP platform and deployed artifacts on GCP using Packer.
- Managed GCP services such as Compute Engine, Google Maps, App Engine, Cloud Storage, VPC, Load Balancing, BigQuery, firewalls, and Stackdriver.
- Managed Docker orchestration for transferring data from the store database to a Redis cache server.
- Worked on Terraform for provisioning environments on the GCP platform and managed infrastructure on Google Cloud Platform using various GCP services.
- Deployed and monitored scalable infrastructure on Google Cloud and handled configuration management using Docker and Google Kubernetes Engine (GKE).
- Scheduled automated backups of Compute Engine virtual machine (VM) instances in GCP and utilized Packer to deploy artifacts on the GCP platform.
- Automated IAM secrets policy management for HashiCorp Vault by integrating it with Jenkins, and deployed PostgreSQL databases and load balancers for the AquaSec container security tool inside GCP using the Cloud SDK and Python.
- Developed and executed PowerShell, Python, and Shell scripts to automate deployments.
Environment: GCP, Terraform, Ansible, Kafka, Shell, JIRA, Docker, Kubernetes, virtualization, Nginx, Nexus, GitLab, GitHub/Bitbucket, Dataflow, Pub/Sub, Dataproc, Dataprep, Cloud Storage
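
As a sketch of the GKE deployments mentioned above, the manifest below defines a Spring Boot microservice with readiness and liveness probes plus a ClusterIP service. The service name, image path, and Actuator health endpoints are illustrative assumptions.

```yaml
# Sketch of a Spring Boot microservice on GKE: a Deployment with readiness and
# liveness probes plus a ClusterIP Service. Names, image path and health
# endpoints are illustrative placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orders-service                  # hypothetical service name
  labels:
    app: orders-service
spec:
  replicas: 3
  selector:
    matchLabels:
      app: orders-service
  template:
    metadata:
      labels:
        app: orders-service
    spec:
      containers:
        - name: orders-service
          image: us-docker.pkg.dev/example-project/apps/orders-service:1.0.0  # placeholder image path
          ports:
            - containerPort: 8080
          readinessProbe:
            httpGet:
              path: /actuator/health/readiness
              port: 8080
          livenessProbe:
            httpGet:
              path: /actuator/health/liveness
              port: 8080
          resources:
            requests:
              cpu: 250m
              memory: 512Mi
---
apiVersion: v1
kind: Service
metadata:
  name: orders-service
spec:
  type: ClusterIP
  selector:
    app: orders-service
  ports:
    - port: 80
      targetPort: 8080
```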

Client: NTTDATA (LPL Financial), India    Dec 2017 to Oct 2019
Role: Azure DevOps Engineer
Responsibilities:
- Created Azure resource groups and implemented role-based access control for users using Azure Management APIs.
- Integrated Single Sign-On (SSO) authentication and authorization via Azure Active Directory.
- Managed Azure services and subscriptions through the Azure portal, including handling Azure resources with Azure Resource Manager.
- Integrated Azure AD with Windows-based Active Directory and applications; managed identity access across Azure subscriptions, Azure AD, Azure AD Application Proxy, Azure AD Connect, and Azure AD Pass-through Authentication.
- Built pipelines in Azure Data Factory to migrate data from on-premises environments to Azure SQL Data Warehouse and from Amazon S3 buckets to Azure Blob Storage.
- Developed a migration strategy to transition workloads from on-premises Active Directory to Windows Azure and created new cloud-ready application solutions.
- Configured and set up the Azure Cosmos DB SQL API for data storage related to the Connectivity Check and Workshop User Web UI tools; also configured Azure Load Balancers, Private Link Service, and Private Endpoints.
- Hands-on experience with scripting languages such as Shell, Ruby, Groovy, Python, Perl, and XML, along with database management using MySQL, MongoDB, DynamoDB, and ElastiCache.
- Automated code builds and deployments using Jenkins, Ant, Maven, Nexus, shell scripts, and Java/J2EE.
- Utilized JFrog Artifactory to build projects and manage artifacts in both Artifactory SaaS and Artifactory Edge.
- Conducted a data center migration from physical servers to virtual servers (P2V) using VMware Converter.
- Extensively worked with Ansible playbooks, inventory files, and the Vault feature for server configuration, software deployment, data encryption, and orchestrating continuous deployments and zero-downtime rolling updates (a minimal playbook sketch follows this engagement); managed Ansible Tower for its dashboard and role-based access control features.
- Developed various Terraform modules for infrastructure management and authored a module published to the Terraform Registry for deploying products in client environments.
- Installed and maintained Kubernetes on on-premises servers, configuring master and worker nodes to establish and manage clusters.
- Worked with Docker and Kubernetes across multiple cloud providers, assisting developers in building and containerizing applications for CI/CD and deploying them on both public and private clouds.
- Utilized CI/CD tools such as Jenkins, Git/GitHub, Jira, and the Docker registry/daemon for configuration management and automation with Ansible and Chef.
- Leveraged GitLab within the CI/CD pipeline, using Git for version control, issue tracking, pipeline management, and wiki documentation to integrate and automate the code checkout process.
Environment: Azure, Azure DevOps, Azure Kubernetes Service, Databricks, Terraform, Ansible, Kafka, Shell, JIRA, Docker, Kubernetes, virtualization, Nginx, Nexus, GitLab, GitHub/Bitbucket
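
The playbook below is a minimal sketch of the Ansible rolling-update pattern described above: hosts are updated in small batches via serial, and secrets are pulled from an Ansible Vault-encrypted vars file. The inventory group, package, template, and service names are hypothetical.

```yaml
# Sketch of a zero-downtime rolling update: hosts are updated two at a time and
# secrets come from an Ansible Vault-encrypted vars file. The inventory group,
# package, template and service names are hypothetical.
- name: Rolling deployment of the web tier
  hosts: webservers
  become: true
  serial: 2                        # update two hosts per batch
  vars_files:
    - vault/secrets.yml            # encrypted with ansible-vault
  tasks:
    - name: Install or update the application package
      ansible.builtin.yum:
        name: myapp                # hypothetical package
        state: latest

    - name: Render application configuration
      ansible.builtin.template:
        src: templates/app.conf.j2
        dest: /etc/myapp/app.conf
        mode: "0640"
      notify: Restart application

  handlers:
    - name: Restart application
      ansible.builtin.service:
        name: myapp
        state: restarted
```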

Client: NTTDATA (T-Mobile), India    July 2016 to Nov 2017
Role: DevOps Linux Administrator
Responsibilities:
- Worked extensively in AWS, utilizing services such as EC2, ELB, S3, CloudFormation templates, VPC, Route 53, RDS, and CloudWatch to optimize infrastructure (a minimal CloudFormation sketch follows this engagement).
- Managed AWS services including IAM for user accounts, RDS, Route 53, VPC, RDB, DynamoDB, SES, SQS, and SNS within the cloud environment.
- Migrated VMware VMs to AWS using the AWS CLI by uploading data to S3 and using the EC2 import-image feature, and established a disaster recovery repository for VMs in AWS using Elastic Beanstalk.
- Automated infrastructure deployment end to end by managing AWS services such as Elastic Beanstalk, RDS Aurora, Lambda, S3, and Neptune with Terraform to support Docker-based applications.
- Created AWS components to establish a private cloud infrastructure and facilitated the migration of on-premises database and application servers to AWS.
- Modernized applications by writing Dockerfiles, building Docker images, and scanning them for vulnerabilities using tools such as Twistlock and Amazon Inspector.
- Developed REST APIs for microservices with AWS API Gateway (APIG) and created Docker images using multi-stage builds.
- Performed administrative and management tasks in Linux environments using scripts written in Bash, Python, and C shell, and automated them with cron jobs.
- Installed and configured web servers, including Apache HTTP Server, on Ubuntu, Red Hat Linux, Fedora, CentOS, and Amazon Linux instances.
- Installed and configured database servers such as SQL Server, Oracle, MongoDB, and DynamoDB across various Linux distributions, including Oracle Enterprise Linux, Red Hat Enterprise Linux, and Ubuntu.
- Hands-on experience with version control tools such as Git, Bitbucket, and SVN, managing branching, merging, tagging, and version control across Windows and Linux platforms.
- Strong expertise in installation, upgrades, patches, migrations, security, configuration, packaging, troubleshooting, backups, disaster recovery, performance monitoring, and optimization across Unix, Linux, and Windows systems.
Environment: AWS (EC2, EMR, Lambda, S3, ELB, Elastic File System, RDS, VPC, Route 53, Security Groups, CloudWatch, CloudTrail, IAM roles, SNS, SQS, VPN), Docker, Kubernetes, Jenkins, Git, Ant, Maven, Chef, Puppet, Ansible, SVN, TFS, Python, Terraform, Shell, Perl, Ruby, YAML, Apache, Tomcat, WebLogic, Linux, Ubuntu
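
As an illustration of the CloudFormation-based provisioning noted above, the template below declares a security group and a single EC2 instance. The AMI, key pair, subnet, and VPC are left as parameters because those values are account-specific, and all resource names are placeholders.

```yaml
# Minimal CloudFormation sketch: a security group plus one EC2 instance.
# Network and AMI details are parameters because they are account-specific;
# all names are placeholders.
AWSTemplateFormatVersion: "2010-09-09"
Description: Example web server instance with a restricted security group

Parameters:
  VpcId:
    Type: AWS::EC2::VPC::Id
  SubnetId:
    Type: AWS::EC2::Subnet::Id
  AmiId:
    Type: AWS::EC2::Image::Id
  KeyName:
    Type: AWS::EC2::KeyPair::KeyName

Resources:
  WebSecurityGroup:
    Type: AWS::EC2::SecurityGroup
    Properties:
      GroupDescription: Allow inbound HTTP
      VpcId: !Ref VpcId
      SecurityGroupIngress:
        - IpProtocol: tcp
          FromPort: 80
          ToPort: 80
          CidrIp: 0.0.0.0/0

  WebServerInstance:
    Type: AWS::EC2::Instance
    Properties:
      InstanceType: t3.micro
      ImageId: !Ref AmiId
      KeyName: !Ref KeyName
      SubnetId: !Ref SubnetId
      SecurityGroupIds:
        - !GetAtt WebSecurityGroup.GroupId
      Tags:
        - Key: Name
          Value: example-web-server

Outputs:
  InstanceId:
    Value: !Ref WebServerInstance
```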

Client: Birlasoft (DTDC & Conduent), India    Aug 2014 to June 2016
Role: DevOps Engineer
Responsibilities:
- Utilized various AWS services, including VPC, EC2, S3, DynamoDB, IAM, EBS, Route 53, Security Groups, Auto Scaling Groups (ASG), and RDS, to establish the infrastructure for AWS application deployment, accomplished using CloudFormation and Terraform templates.
- Developed and executed an AWS CloudFormation template in JSON/YAML to deploy Tomcat and Apache web servers, and collaborated with offshore and onshore teams to ensure successful deployment to production; additionally, created a CRON job using the AWS REST API to recover and store critical production data in DynamoDB.
- Created AWS Lambda functions triggered by CloudWatch Events, integrating them with Amazon DynamoDB tables, S3 buckets, and Amazon API Gateway to handle HTTP requests (a scheduled-Lambda template sketch follows this engagement).
- Built and deployed Java/J2EE applications to a web application server in an Agile continuous integration environment and automated the whole process.
- Deployed Java applications to the Apache Tomcat application server.
- Utilized Ansible to automate infrastructure management by crafting playbooks and inventories in YAML, and employed Ansible Vault to encrypt sensitive data.
- Engaged in the development of build deployments, build scripts, and automated solutions using Python, Perl, and scripting languages such as Bash, Ruby, and Shell.
- Worked on MongoDB database concepts such as locking, transactions, indexes, sharding, replication, and schema design.
- Leveraged JIRA for creating, updating, and tracking project status through stories, and utilized JIRA across projects following Scrum methodology, including story creation and point assignment.
Environment: AWS (EC2, EMR, Lambda, S3, ELB, Elastic File System, RDS, VPC, Route 53, Security Groups, CloudWatch, CloudTrail, IAM roles, SNS, SQS, VPN), Docker, Kubernetes, Jenkins, Git, Ant, Maven, Chef, Puppet, Ansible, SVN, TFS, Python, Terraform, Shell, Perl, Ruby, YAML, Apache, Tomcat, WebLogic, Linux, Ubuntu
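
A sketch of the scheduled Lambda pattern from this engagement, expressed as a CloudFormation template: an EventBridge (CloudWatch Events) cron rule invokes a small Lambda function that writes a record to an existing DynamoDB table. The table name, schedule, handler body, and IAM role are placeholders rather than the original implementation.

```yaml
# Sketch of the scheduled job described above: an EventBridge (CloudWatch Events)
# cron rule invokes a Lambda function that writes a record to an existing
# DynamoDB table. Table name, schedule, handler body and IAM role are placeholders.
AWSTemplateFormatVersion: "2010-09-09"
Description: Nightly job that stores production data snapshots in DynamoDB

Parameters:
  TableName:
    Type: String
    Default: prod-data-snapshots     # hypothetical table with partition key "id"
  LambdaRoleArn:
    Type: String                     # existing role with dynamodb:PutItem on the table

Resources:
  SnapshotFunction:
    Type: AWS::Lambda::Function
    Properties:
      Runtime: python3.12
      Handler: index.handler
      Role: !Ref LambdaRoleArn
      Environment:
        Variables:
          TABLE_NAME: !Ref TableName
      Code:
        ZipFile: |
          import os, datetime, boto3
          table = boto3.resource("dynamodb").Table(os.environ["TABLE_NAME"])
          def handler(event, context):
              # Placeholder payload; a real job would collect production data here.
              table.put_item(Item={"id": datetime.datetime.utcnow().isoformat()})

  NightlySchedule:
    Type: AWS::Events::Rule
    Properties:
      ScheduleExpression: cron(0 2 * * ? *)   # 02:00 UTC daily
      Targets:
        - Arn: !GetAtt SnapshotFunction.Arn
          Id: SnapshotFunctionTarget

  AllowEventsInvoke:
    Type: AWS::Lambda::Permission
    Properties:
      FunctionName: !Ref SnapshotFunction
      Action: lambda:InvokeFunction
      Principal: events.amazonaws.com
      SourceArn: !GetAtt NightlySchedule.Arn
```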