Candidate's Name
Cloud Native Developer | US-TX-Lubbock
PHONE NUMBER AVAILABLE | EMAIL AVAILABLE | LinkedIn

PROFESSIONAL SUMMARY:
- Around 5 years of experience integrating Cloud and DevOps technologies: Cloud (AWS, GCP), DevOps (Docker, Docker Swarm, Jenkins, Puppet, Ansible, Git, Maven, Splunk, JUnit, uDeploy, Terraform, AWS CodePipeline, AWS CodeBuild, AWS CodeDeploy), databases (MySQL, PostgreSQL, Oracle), and Linux (SUSE, Red Hat, Oracle) and Windows environments.
- Hands-on experience in analysis, design, and development.
- Strong knowledge and experience designing, developing, and deploying solutions on all major cloud service providers (AWS, Azure, GCP); AWS certified.
- Developed serverless APIs in AWS and GCP using services such as AWS Lambda, API Gateway, Cloud Functions, and Cloud Run.
- Managed Amazon Web Services such as EC2, S3, ELB, Auto Scaling, SNS, SQS, AMI, IAM, DynamoDB, Elasticsearch, and Virtual Private Cloud (VPC) through the AWS Console and API integration.
- Deployed applications in AWS as EC2 instances and created snapshots of data to be stored in Amazon S3.
- Defined AWS security groups that acted as virtual firewalls controlling the traffic allowed to reach one or more EC2 instances.
- Used JavaScript and TypeScript to develop dynamic, interactive user interfaces, improving the learning experience and navigation.
- Implemented frontend components with React.js and Angular to create modern, intuitive user interfaces for seamless interaction with learning resources.
- Ensured optimal user experience across devices and screen sizes with responsive design in HTML5 and CSS3.
- Integrated the Redux architecture for efficient state management, ensuring scalability and maintainability of learning-management interfaces.
- Used Jest and Enzyme for thorough unit testing of frontend components, ensuring reliable and robust UI functionality.
- Implemented CI/CD pipelines with Jenkins and AWS CodePipeline for automated deployment of frontend applications, speeding time-to-market and facilitating seamless updates.
- Good knowledge of Amazon Web Services such as Elastic Compute Cloud (EC2), Simple Storage Service (S3), Amazon CloudWatch, and SNS; experienced deploying highly scalable Java applications using various architectures.
- Experience creating Lambda and Lambda@Edge functions in AWS and integrating them with API Gateway.
- Experience creating CloudFormation templates to deploy AWS services.
- Hands-on experience with scaling technologies and solutions.
- Experience with AWS infrastructure automation and configuration management.
- Proficient with AWS services including S3, EC2, EBS, RDS, ELB, and Route 53.
- Worked on AWS build automation with Jenkins and Ansible scripting for Docker and non-Docker microservices.
- Implemented Splunk and Dynatrace for comprehensive monitoring and analysis of AWS infrastructure and application workloads, leveraging their dashboards, alerts, and AI-driven capabilities.
- Configured customized monitoring solutions to track performance metrics, dependencies, and security events across cloud environments, enabling proactive identification and resolution of issues.
- Integrated Splunk and Dynatrace with CI/CD pipelines and cloud-native services for continuous monitoring and optimization of cloud-based applications and infrastructure.
- Experience writing JSON and YAML templates to deploy AWS services with CloudFormation.
- Experience as a DevOps engineer overseeing deployments in AWS (S3, ELB, EC2, EBS, Route 53), continuous integration and continuous deployment, and ITIL release-management and change-management activities.
- Hands-on experience with Azure cloud services: ARM templates, Virtual Network, NSGs, Application Gateway, VM Scale Sets, DNS, Application Insights, Private Link, load balancers, storage accounts, Bastion, Log Analytics workspaces, Azure Alerts, AKS, Service Fabric clusters, containers, Recovery Services vaults, databases, and compute services.
- Automated deployments for 200+ cloud servers using Python and Bash.
- Excellent knowledge of AWS SAM deployments using a Git repository and Jenkins.
- Managed 50+ AWS and Jenkins accounts to control access to resources more effectively and increase security.
- Able to communicate effectively with all levels of management and the technical community.
- Worked closely with project management, development, automation testing, and production teams to align project deliverables and dates and to report issues and concerns.
- Orchestrated Kubernetes deployments on Amazon EKS using Jenkins CI/CD, emphasizing Terraform for scalable infrastructure provisioning.
- Used Terraform to configure AWS Batch for daily execution of Spring Batch applications, reducing processing time by 30%.
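As an illustration of the serverless API pattern referenced in the summary, the following is a minimal sketch of a Python Lambda handler behind API Gateway. The event fields follow the standard API Gateway proxy integration; the routing and response shapes are hypothetical, not taken from any project here.

    import json

    def lambda_handler(event, context):
        # API Gateway (proxy integration) passes the HTTP method and body in the event.
        method = event.get("httpMethod", "GET")
        if method == "POST":
            payload = json.loads(event.get("body") or "{}")
            # ... persist the payload (e.g., to DynamoDB) ...
            return {"statusCode": 201, "body": json.dumps({"received": payload})}
        return {"statusCode": 200, "body": json.dumps({"status": "ok"})}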
TECHNICAL SKILLS:
Cloud and DevOps Technologies: AWS services (Lambda, API Gateway, SES, CodePipeline, CodeBuild, AWS SAM, EC2, S3, Route 53, SQS, IAM, CloudWatch, CloudFormation, EKS, ECR, Batch), Docker, Ansible, Terraform, Git, Jenkins, Splunk Enterprise, SiteScope, Dynatrace, Datadog
Containerization: Docker, Kubernetes
Web Technologies: HTML, CSS, Bootstrap, React.js, Angular
Programming Languages: Python, Java, TypeScript, JavaScript
Databases: MySQL, PostgreSQL, Oracle SQL
Frameworks: Flask, Spring Boot, Spring Batch
Operating Systems: Windows, Linux, UNIX
Scripting: Python, Perl, Bash, Ruby, YAML, JavaScript, JSON

PROFESSIONAL EXPERIENCE:
- Cloud Native Developer, Sharpwhiz, Houston (August 2023 to Present)
- Cloud Native Developer, Deloitte, Hyderabad (October 2021 to July 2022)
- Application Development Analyst, Accenture, Bangalore (June 2021 to October 2021)
- Programmer Analyst, Cognizant, Chennai (June 2020 to June 2021)
- Programmer Analyst - Trainee, Cognizant, Chennai (May 2019 to June 2020)
- Intern, Cognizant, Chennai (January 2019 to April 2019)

CERTIFICATIONS:
- AWS Certified DevOps Engineer - Professional
- AWS Certified Developer - Associate
- Google Cloud Certified - Associate Cloud Engineer
- Aviatrix Certified Engineer - Multi-Cloud Network Associate

ACHIEVEMENTS:
- Bronze Award for stellar performance in cloud deliverables, Cognizant
- Spot Award for excellence in work contributions, Deloitte

EDUCATION:
- Master's Degree in Computer and Information Sciences, Texas Tech University
- Bachelor's Degree in Computer Science and Engineering, Sathyabama University

PROJECT DETAILS:

Project #1
Project Title: nextGen STX
Client Name: Sharpwhiz
Duration: August 2023 to Present
Role: Cloud Native Developer
Project Description:
Created API services on Amazon EKS using Jenkins CI/CD pipelines and Terraform, focusing on infrastructure as code for efficient provisioning. Developed a POST API for transaction input and a GET API for status queries. Implemented SonarQube for code analysis to maintain high quality standards. Designed and automated Spring Batch applications using AWS Batch, Terraform, and Jenkins to track transaction statuses and update databases. Collaborated across teams, embraced DevOps practices, and provided feedback in code reviews. Resolved user-reported issues, prioritizing iterative development and optimization for better performance. Documented technical specifications and deployment procedures for transparency. Deployed both the API and batch applications consistently using Jenkins and Terraform in a DevOps-centric approach.
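The POST and GET transaction endpoints described above could look roughly like the following Flask sketch (Flask appears in the skills list, though the production APIs may well have been Spring Boot services; the route names, the in-memory store, and the status values are hypothetical stand-ins):

    import uuid
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    transactions = {}  # hypothetical in-memory store; the real service used a database

    @app.route("/transactions", methods=["POST"])
    def add_transaction():
        # Accept a transaction for orchestration and return its tracking ID.
        txn_id = str(uuid.uuid4())
        transactions[txn_id] = {"payload": request.get_json(), "status": "PENDING"}
        return jsonify({"id": txn_id}), 201

    @app.route("/transactions/<txn_id>", methods=["GET"])
    def get_status(txn_id):
        # Report the current status of a previously added transaction.
        txn = transactions.get(txn_id)
        if txn is None:
            return jsonify({"error": "not found"}), 404
        return jsonify({"id": txn_id, "status": txn["status"]})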
Responsibilities:
- Deployed API services on Kubernetes within Amazon Elastic Kubernetes Service (EKS) using Jenkins CI/CD pipelines, emphasizing infrastructure as code with Terraform for efficient, reproducible provisioning.
- Implemented a POST API that lets end users add transactions to the orchestration system.
- Created a GET API that lets end users query the status of previously added transactions.
- Used SonarQube for static code analysis to maintain code quality, identify bugs, and keep development standards high.
- Designed and developed two Spring Batch applications to automate transaction-status tracking and database updates.
- Configured AWS Batch to execute the Spring Batch applications on a daily schedule, using Terraform for infrastructure orchestration and Jenkins CI/CD for streamlined automation (a job-submission sketch follows this project's Environment line).
- Integrated the first Spring Batch application to track transaction statuses and update the internal database, deploying it with Jenkins and Terraform.
- Integrated the second Spring Batch application to update a central database through an API call and synchronize it with the internal database, again deploying with Jenkins and Terraform.
- Collaborated with cross-functional teams to ensure seamless integration and delivery of features, emphasizing DevOps practices.
- Participated in code reviews, providing feedback to improve code quality, maintainability, and adherence to DevOps principles.
- Resolved issues and bugs reported by end users, prioritizing continuous improvement of the user experience through iterative development.
- Used Splunk for comprehensive monitoring of the Kubernetes clusters on Amazon EKS, enabling real-time visibility into containerized applications and infrastructure performance.
- Integrated Splunk with Kubernetes logging and database monitoring tools to collect and analyze logs and metrics, enabling centralized log management and correlation of events across microservices and databases.
- Implemented Dynatrace for end-to-end monitoring of the deployed API services and Spring Batch applications, providing insight into application performance, dependencies, and user experience.
- Configured Dynatrace dashboards and AI-powered monitoring to automatically detect and prioritize performance issues, enabling proactive optimization and troubleshooting aligned with DevOps practices.
- Conducted performance testing and optimization to improve API response times and reduce latency.
- Documented technical specifications, API endpoints, and deployment procedures, fostering transparency and collaboration between the development and operations teams.
- Deployed both the API and batch applications using Jenkins and Terraform, ensuring a consistent, automated deployment process for all components.
Environment: AWS, Python, Boto3, S3, SES, AWS Lambda, AWS EC2, AWS SSM Parameter Store, Docker, AWS CloudWatch Events, Amazon Elastic Kubernetes Service (EKS), Jenkins CI/CD pipelines, Dynatrace, Terraform, SonarQube, AWS Batch, Amazon S3, Amazon RDS (MySQL), Splunk, AWS CodeBuild, AWS CodeCommit, AWS CodePipeline, AWS Secrets Manager, Oracle SQL.
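At the API level, one scheduled run of a containerized batch job on AWS Batch reduces to a job submission against a job queue and a job definition. A minimal Boto3 sketch; the queue, definition, and job names are hypothetical, and in the project the scheduling itself was provisioned through Terraform:

    import boto3

    batch = boto3.client("batch")

    # Submit one run of the (hypothetical) Spring Batch container job.
    response = batch.submit_job(
        jobName="transaction-status-daily",
        jobQueue="spring-batch-queue",
        jobDefinition="transaction-status-tracker:1",
    )
    print("Submitted job:", response["jobId"])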
Project #2
Project Title: Commentary
Client Name: Deloitte
Duration: Oct 2021 to Aug 2022
Role: Cloud Native Developer
Project Description:
Commentary is a web-based application that lets product owners migrate XML products residing on on-premises servers to Amazon S3 and convert them to the expected format by processing the XML files. The application lets product owners migrate the XML product files and approve the converted files after reviewing them. Once approved, the converted files become available for end-user consumption.
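The migration workflow described here was driven by a Step Functions state machine triggered from API Gateway (see the responsibilities below). A minimal Boto3 sketch of starting one migration execution; the state machine ARN and the input fields are hypothetical:

    import json
    import boto3

    sfn = boto3.client("stepfunctions")

    # Start one migration/conversion run for a single XML product (hypothetical input shape).
    response = sfn.start_execution(
        stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:xml-migration",
        input=json.dumps({"product_id": "example-product",
                          "source_path": "/onprem/products/example.xml"}),
    )
    print("Execution started:", response["executionArn"])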
Responsibilities:
- Developed Python REST APIs behind API Gateway to handle requests from the web application to migrate the on-premises XML files to Amazon S3 and convert them to the output format.
- Designed and developed an AWS Step Functions state machine comprising multiple Lambda functions to handle the migration, processing, and conversion of the XML files; the state machine is integrated with API Gateway and is triggered whenever the gateway receives a new request.
- Used the Boto3 Python SDK to interact with the AWS services involved in completing the migration and conversion process.
- Developed code to notify product owners when a product conversion succeeds or fails, using Amazon SES and Boto3.
- Integrated SQLAlchemy ORM modules as Lambda layers so the Lambda functions could continually update the RDS PostgreSQL database with migration changes across the state-machine workflow.
- Created SQL files and scripts to automate database migration with the Flyway migration tool, and developed SQLAlchemy modules to perform CRUD operations from the APIs.
- Designed a mechanism for removing unconsumed messages from a message queue: a second queue stores the unconsumed messages, preventing message pile-up, keeping the main queue clear, and increasing the fault tolerance of the client-facing APIs (a minimal sketch follows this list).
- Created Python modules and added them as Lambda layers to the Lambda functions that convert the XML files to other formats.
- Implemented Dynatrace for end-to-end monitoring of the Step Functions state machine and associated Lambda functions, providing insight into performance, dependencies, and execution metrics.
- Configured Dynatrace dashboards and AI-powered monitoring to automatically detect and prioritize performance issues within the Step Functions workflow, Lambda functions, and API Gateway interactions.
- Used Dynatrace's distributed tracing to follow requests across the entire migration and conversion process, identifying bottlenecks and optimizing performance.
- Integrated Dynatrace with CI/CD pipelines in AWS CodePipeline to monitor the deployment of APIs and serverless resources, so performance is monitored continuously throughout the development lifecycle.
- Integrated Databricks for advanced data processing and analytics, optimizing the migration and conversion of the XML files stored on Amazon S3.
- Used Unity Catalog to manage metadata associated with the XML files, improving the organization and accessibility of data during migration tasks.
- Created CI/CD pipelines in AWS CodePipeline to automate API deployment with the AWS SAM framework.
- Defined AWS SAM and OpenAPI YAML templates for creating API Gateway APIs, Lambda functions, and other serverless resources.
- Designed an OpenAPI definition template for API resources, methods, and other API configuration.
- Created the BuildSpec file for the AWS CodeBuild step in CodePipeline to deploy the SAM template, which deploys all serverless services based on test results and approval.
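The second-queue design above is essentially SQS's dead-letter-queue pattern. A minimal Boto3 sketch of attaching a dead-letter queue through a redrive policy; the queue names and the retry threshold are hypothetical:

    import json
    import boto3

    sqs = boto3.client("sqs")

    main_q = sqs.create_queue(QueueName="xml-migration-main")
    dlq = sqs.create_queue(QueueName="xml-migration-unconsumed")

    # Look up the second queue's ARN so the main queue can reference it.
    dlq_arn = sqs.get_queue_attributes(
        QueueUrl=dlq["QueueUrl"], AttributeNames=["QueueArn"]
    )["Attributes"]["QueueArn"]

    # After 5 failed receives, SQS moves the message to the second queue,
    # keeping the main queue clear.
    sqs.set_queue_attributes(
        QueueUrl=main_q["QueueUrl"],
        Attributes={
            "RedrivePolicy": json.dumps(
                {"deadLetterTargetArn": dlq_arn, "maxReceiveCount": "5"}
            )
        },
    )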
Environment: AWS, Python, Step Functions, Boto3, S3, Flask, Flyway migration tool, SQLAlchemy, SQS, SES, AWS Secrets Manager, AWS SAM, API Gateway, DynamoDB, TypeScript, JavaScript, Dynatrace, Angular, React.js, AWS Lambda, AWS EC2, CloudFormation, AWS CodePipeline, AWS CodeBuild, AWS SSM Parameter Store, Docker, CodeCommit, PostgreSQL, AWS CloudWatch.

Project #3
Project Title: Adapt
Client Name: Accenture
Duration: June 2021 to Oct 2021
Role: Application Development Analyst
Project Description:
Provider-PIER Intake is a web application for managing all the ETL pipelines running across the major cloud providers (AWS, Azure, and GCP). Project owners and ETL pipeline developers can create, update, view, and delete ETLs across all the major cloud service providers in one place. The application helps everyone working with the ETLs to collaborate and manage complex ETL tasks effortlessly.
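The multi-cloud handlers in this project were organized around a factory pattern (described in the responsibilities below), so each cloud's ETL service sits behind a common interface. A minimal sketch with hypothetical class and method names; the real handlers wrapped Boto3, the Azure SDK, and the GCP SDKs:

    from abc import ABC, abstractmethod

    class EtlHandler(ABC):
        # Common interface each cloud-specific handler implements.
        @abstractmethod
        def create_pipeline(self, config: dict) -> str: ...

    class GlueHandler(EtlHandler):        # AWS Glue via Boto3 in the real code
        def create_pipeline(self, config: dict) -> str:
            return f"glue:{config['name']}"

    class DataFactoryHandler(EtlHandler): # Azure Data Factory via the Azure SDK
        def create_pipeline(self, config: dict) -> str:
            return f"adf:{config['name']}"

    class DataFusionHandler(EtlHandler):  # GCP Data Fusion via the GCP SDKs
        def create_pipeline(self, config: dict) -> str:
            return f"fusion:{config['name']}"

    def etl_handler_factory(provider: str) -> EtlHandler:
        # The factory keeps provider selection out of the API handlers.
        handlers = {"aws": GlueHandler, "azure": DataFactoryHandler, "gcp": DataFusionHandler}
        return handlers[provider]()

    # Usage: etl_handler_factory("aws").create_pipeline({"name": "daily-load"})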
Responsibilities:
- Developed Python REST APIs behind API Gateway to handle requests for managing the ETL pipelines across different cloud service providers.
- Designed and developed Python handlers for the major providers' ETL services (AWS Glue, Azure Data Factory, GCP Data Fusion, etc.) using cloud SDKs such as Boto3, the GCP SDKs, and the Azure SDK.
- Implemented design-pattern logic in the Python handlers to extract data from various sources and load it into destinations according to the configuration defined for each ETL in the application, using cloud-specific SDKs.
- Created CI/CD pipelines in AWS CodePipeline to automate API deployment with the AWS SAM framework.
- Defined AWS SAM and OpenAPI YAML templates for creating API Gateway APIs, Lambda functions, and other serverless resources.
- Designed an OpenAPI definition template for API resources, methods, and other API configuration.
- Created the BuildSpec file for the AWS CodeBuild step in CodePipeline to deploy the SAM template, which deploys all serverless services based on test results and approval.
- Incorporated Databricks and Unity Catalog into the API Gateway Python REST APIs, enhancing ETL pipeline capabilities across diverse cloud service providers for efficient data processing.
- Integrated Databricks and Unity Catalog functionality into the CI/CD pipelines, automating deployment and ensuring seamless updates of ETL pipelines and frontend applications.
- Implemented CI/CD pipelines with Jenkins and AWS CodePipeline for automated deployment of frontend applications, accelerating time-to-market and ensuring seamless updates.
- Integrated all secrets used in the APIs with AWS Secrets Manager, which rotates database secrets automatically, and added secret-retrieval Boto3 code to the Lambda handlers.
- Employed Dynatrace for comprehensive monitoring of ETL pipelines spanning multiple cloud service providers, offering deep insight into performance metrics, dependencies, and execution patterns.
- Configured customized Dynatrace dashboards and used its AI-driven monitoring to proactively identify and prioritize performance issues within the ETL workflows, cloud SDK interactions, and API Gateway endpoints.
- Used Dynatrace's distributed tracing to follow requests across the entire ETL process, identifying performance bottlenecks and enabling optimization across diverse cloud environments.
- Designed and developed factory patterns in the Python code to interact with the different cloud service providers through their cloud-specific SDKs without complicating the code (see the factory sketch above).
- Implemented a notification system that alerts owners on ETL pipeline state changes, using the Boto3 SDK and the Amazon SES email service.
- Built an Amazon CloudWatch Events based solution to check for AWS Glue errors and report them to the operations team.
- Used SQLAlchemy ORM models and other SQLAlchemy features for database operations.
Environment: AWS, Python, GCP, Azure, Boto3, S3, Flask, Flask-RESTful, SQLAlchemy, SQS, SES, AWS Secrets Manager, TypeScript, JavaScript, Angular, React.js, AWS SAM, Dynatrace, API Gateway, DynamoDB, AWS Lambda, AWS EC2, CloudFormation, Google Cloud Storage, Azure Cloud Data Storage, AWS CodePipeline, AWS CodeBuild, AWS SSM Parameter Store, Docker, CodeCommit, PostgreSQL, AWS CloudWatch Events.

Project #4
Project Title: Academy
Client Name: Cognizant
Duration: June 2020 to June 2021
Role: Programmer Analyst
Project Description:
Developed a comprehensive serverless application infrastructure using AWS and GCP services. Defined APIs with AWS API Gateway and used Lambda functions and DynamoDB to process and store data. Implemented logging, security measures, and containerization for scalability. Managed infrastructure with Puppet and CloudFormation templates while ensuring continuous integration and deployment with Jenkins and AWS CodePipeline. Developed dynamic user interfaces with React.js and Angular, ensuring responsiveness and reliability through Jest and Enzyme testing. Integrated with Elasticsearch for real-time data analysis and visualization.
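The Lambda-plus-DynamoDB storage path in this project reduces to straightforward Boto3 table operations. A minimal sketch; the table name and item shape are hypothetical:

    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("academy-courses")  # hypothetical table name

    # Store one record from an API request.
    table.put_item(Item={"course_id": "c-101", "title": "Intro to Cloud", "status": "ACTIVE"})

    # Fetch it back by primary key.
    item = table.get_item(Key={"course_id": "c-101"}).get("Item")
    print(item)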
Responsibilities:
- Defined serverless APIs with AWS API Gateway to handle requests from the application.
- Used AWS Lambda functions to process API Gateway requests.
- Defined the API Gateway API with an OpenAPI specification.
- Used Amazon DynamoDB as the non-relational database for application data.
- Used CloudWatch logs and log filters to track runtime issues in the Lambda functions.
- Secured application secrets with AWS Secrets Manager and retrieved them with the Python AWS SDK (a minimal sketch follows this project's Environment line).
- Used PyMySQL to perform CRUD operations from the Lambda functions.
- Added custom code as Lambda layers to the Lambda functions.
- Developed GCP Cloud Functions in Python to process incoming requests from GCP API Gateway.
- Created Dockerfiles and configured the GCP Cloud Run service to spin up containers on incoming traffic.
- Deployed and monitored scalable infrastructure on Amazon Web Services, with configuration management via Puppet.
- Created Lambda and Lambda@Edge functions in AWS and integrated them with API Gateway.
- Created CloudFormation templates to deploy AWS services.
- Worked on SAM deployments using a Git repository and Jenkins.
- Performed periodic archiving and storage of source code for disaster recovery.
- Used JavaScript and TypeScript to develop dynamic, interactive user interfaces, enhancing the learning experience and ensuring smooth navigation.
- Used Splunk for comprehensive monitoring of the AWS infrastructure, including the serverless APIs on API Gateway, the Lambda functions, and the DynamoDB databases.
- Configured Splunk dashboards and alerts to track runtime issues, monitor performance metrics, and analyze CloudWatch logs for the Lambda functions and API Gateway endpoints.
- Integrated Splunk with AWS Secrets Manager to securely manage application secrets, ensuring data security and compliance with best practices.
- Used Splunk to analyze and visualize real-time data from the Elasticsearch REST APIs, enabling dynamic analysis, search, and visualization for better insight and decision-making.
- Implemented frontend components with React.js and Angular, providing a modern, intuitive user interface for seamless interaction with learning resources.
- Used HTML5 and CSS3 for responsive design and an optimal user experience across devices and screen sizes.
- Integrated the Redux architecture for efficient state management, ensuring scalability and maintainability of the learning-management interfaces.
- Used Jest and Enzyme for unit testing of frontend components, ensuring reliable and robust UI functionality.
- Implemented CI/CD pipelines with Jenkins and AWS CodePipeline for automated deployment of frontend applications, accelerating time-to-market and ensuring seamless updates.
- Worked with the Elasticsearch REST APIs to analyze, search, and visualize real-time data.
Environment: AWS, Python, Step Functions, Boto3, S3, Flask, Flyway migration tool, SQLAlchemy, SQS, SES, AWS Secrets Manager, AWS SAM, API Gateway, DynamoDB, Splunk, AWS Lambda, Java, Agile, TypeScript, JavaScript, React.js, Angular, Docker, MySQL, Perl scripts, shell scripts, Python, Windows.
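Secret retrieval plus a PyMySQL connection, as referenced in the responsibilities above, can be sketched as follows. The secret name and its JSON fields are hypothetical; in a real Lambda the connection would typically be created outside the handler so it is reused across invocations:

    import json
    import boto3
    import pymysql

    def get_db_connection():
        # Fetch credentials from Secrets Manager rather than hard-coding them.
        secrets = boto3.client("secretsmanager")
        secret = json.loads(
            secrets.get_secret_value(SecretId="academy/mysql")["SecretString"]
        )
        return pymysql.connect(
            host=secret["host"],
            user=secret["username"],
            password=secret["password"],
            database=secret["dbname"],
        )

    def lambda_handler(event, context):
        conn = get_db_connection()
        with conn.cursor() as cur:
            cur.execute("SELECT COUNT(*) FROM courses")  # hypothetical table
            (count,) = cur.fetchone()
        return {"statusCode": 200, "body": json.dumps({"courses": count})}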
Project #5
Project Title: AWS COE
Client: Cognizant
Position: Programmer Analyst - Trainee
Duration: May 2019 to June 2020
Project Description:
Developed and implemented cloud-based solutions, including AWS Config reporting of non-compliant resources and AWS Lambda scripts for remediation. Implemented an automated vulnerability detection solution using Amazon Inspector and developed a patching solution for weekend server maintenance without human intervention. Used GuardDuty and Lambda for threat-IP detection and built an Azure Python function app for activity-data processing. Managed Jenkins pipelines for automated builds, maintained applications, and streamlined deployment through continuous integration. Additionally, collaborated closely with development teams to establish robust CI/CD pipelines and automated the monitoring tools' Docker containers for efficient deployment.
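Reporting non-compliant resources from AWS Config, as in the first responsibility below, comes down to a few Boto3 calls. A minimal sketch; the rule name is hypothetical, and the real solution forwarded the findings to the operations team:

    import boto3

    config = boto3.client("config")

    # List resources that currently fail a (hypothetical) Config rule.
    details = config.get_compliance_details_by_config_rule(
        ConfigRuleName="required-tags",
        ComplianceTypes=["NON_COMPLIANT"],
    )

    for result in details["EvaluationResults"]:
        qualifier = result["EvaluationResultIdentifier"]["EvaluationResultQualifier"]
        print(qualifier["ResourceType"], qualifier["ResourceId"])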
Responsibilities:
- Created a solution for reporting non-compliant AWS resources to the operations teams using AWS Config.
- Developed AWS Lambda Python scripts to remediate non-compliant resources hosted in AWS on an approval basis.
- Built an automated vulnerability detection and reporting solution on AWS using Amazon Inspector.
- Developed an automated patching solution that patches production servers on weekends without human intervention, using AWS SSM maintenance windows; this removed the need for the team to work weekends on patching.
- Built a threat-IP detection and blocking solution in AWS using GuardDuty and AWS Lambda.
- Developed scripts to alert on errors in AWS Lambda serverless workloads.
- Built an Azure Python function app to query and process end users' activity data from DynamoDB and store it in an Azure MySQL database for analytics.
- Configured and administered Jenkins pipelines for automated builds; responsible for installing Jenkins master and slave nodes.
- Used Splunk for comprehensive monitoring of AWS resource activity, including compliance status reported by AWS Config, Amazon Inspector vulnerability scans, and GuardDuty threat-detection alerts.
- Configured Splunk dashboards and alerts to track and report non-compliant AWS resources identified by AWS Config, enabling proactive identification and remediation of compliance issues.
- Integrated Splunk with AWS Lambda to monitor and alert on errors in serverless workloads, ensuring timely detection and resolution of issues affecting Lambda functions.
- Used Splunk to analyze and visualize data from AWS SSM maintenance windows and Amazon Inspector scans, providing insight into automated patching activities and vulnerability detection across AWS resources.
- Maintained existing applications and designed and delivered new ones.
- Streamlined the deployment process by setting up continuous integration with Jenkins.
- Achieved zero-downtime deployments and maintained CI jobs.
- Worked on automating application build processes.
- Worked closely with the development team to create the CI/CD pipeline.
- Administered and created Hudson jobs, including automatic generation, reporting, and alerting on build failures, build-status indicators, and information radiators.
- Created multibranch pipeline jobs for builds and deployments, and installed several Jenkins plugins to support the tools the projects required.
- Automated the CI/CD pipeline for the monitoring tools' Docker containers and wrote scripts to test them.
Environment: AWS, Python, Step Functions, Boto3, S3, Flask, Flyway migration tool, SQLAlchemy, Splunk, SQS, SES, AWS Secrets Manager, AWS SAM, API Gateway, DynamoDB, AWS Lambda, Subversion, Perforce, Python, Java.

Project #6
Project Title: Provider-PIER Intake
Client Name: Cognizant
Duration: Jan 2019 to April 2019
Role: Intern
Project Description:
Created API services on Amazon EKS using Jenkins CI/CD pipelines and Terraform, focusing on infrastructure as code for efficient provisioning. Developed a POST API for transaction input and a GET API for status queries. Implemented SonarQube for code analysis to maintain high quality standards. Designed and automated Spring Batch applications using AWS Batch, Terraform, and Jenkins to track transaction statuses and update databases. Collaborated across teams, embraced DevOps practices, and provided feedback in code reviews. Resolved user-reported issues, prioritizing iterative development and optimization for better performance. Documented technical specifications and deployment procedures for transparency. Deployed both the API and batch applications consistently using Jenkins and Terraform in a DevOps-centric approach.
Responsibilities:
- Streamlined the orchestration of Kubernetes deployments on Amazon EKS using Jenkins CI/CD, emphasizing Terraform for scalable infrastructure provisioning.
- Crafted Terraform scripts to define and manage the infrastructure as code, ensuring seamless scalability and efficient resource allocation.
- Implemented Terraform configurations for AWS Batch to run the Spring Batch applications daily, achieving a 30% reduction in processing time.
- Fine-tuned resource allocation and optimized task scheduling within AWS Batch using Terraform.
- Ensured seamless integration of the Spring Batch applications with internal databases using Terraform, improving real-time data accuracy and reducing discrepancies by 25%.
- Configured database connections and access permissions through Terraform for consistency and reliability.
- Demonstrated cross-functional collaboration skills, fostering streamlined teamwork on Terraform-based infrastructure deployment.
- Engaged with stakeholders from various departments, incorporating feedback into the Terraform infrastructure designs for cohesion and efficiency.
- Conducted thorough performance testing to optimize API response times, achieving a 20% improvement in responsiveness.
- Configured infrastructure components through Terraform, reducing latency for a smoother user experience.
Environment: AWS, Python, Boto3, S3, SES, AWS Lambda, AWS EC2, AWS SSM
