Candidate's Name
Data Architecture Java Spring
Orlando, FL Street Address
Email: EMAIL AVAILABLE
Phone: PHONE NUMBER AVAILABLE

Summary:
- DevOps and CI/CD pipelines: Proficient in DevOps practices and CI/CD pipelines, including the MLOps lifecycle for AI applications, GitHub Actions, Docker containerization and Compose, Helm charts, Kubernetes, and Istio microservice management.
- Cloud platforms: Expertise in Azure, Docker, AKS, Kubernetes, microservices, the Domino AI platform, AWS CDK, PaaS, automation, configuration, CI/CD, Terraform, ARM templates, and Java Spring. Experienced with Azure PostgreSQL, Azure Insights, Git flow, the az and kubectl CLIs, Splunk, Dynatrace, JSON, and YAML scripting.
- Data architecture: Skilled in data architecture for Starburst, Databricks, Snowflake, Immuta, and Oracle databases, with a focus on security and governance and Microsoft Purview.
- DevOps tools: Expertise in Azure DevOps, JIRA, GitHub Copilot, Akamai management, and tools such as Jenkins, TeamCity, SCM, Maven, GitHub/GASP, Artifactory, the Nexus repository, SonarQube, Checkmarx, Skaffold, Redis, Grafana, Datadog, Opsgenie, PagerDuty, Zabbix, Nagios, NMIS, MRTG, and IBM AS/400 systems.
- Cloud development and security: Proficient with cloud development and security tools, including Terraform, Packer, CloudFormation, Ansible, Keycloak (IAM), AWS ECR, GitHub, Bitbucket, HashiCorp Vault, CyberArk security, and data governance.
- Cloud infrastructure: Expertise in IaaS, PaaS, and the AWS, GCP, and Azure cloud platforms, with 20 years of Linux/Unix experience and cloud OpenShift (ROSA/ARO). Skilled in Git and DVC (data version control).
- Amazon Web Services (AWS): Expertise in Amazon VPC, EC2, Directory Service, Lambda, Auto Scaling, ALB/ELB, RDS, Route 53, S3 and EBS storage, AMIs, IAM, cloud security, and VPN endpoints.
- Google Cloud Platform (GCP): Strong skills in Google Kubernetes Engine, VMs, and Compute Engine.
- Programming languages: Strong in Unix shell and Windows PowerShell scripting, Python, RPG/AS400, JupyterHub, Ruby, IBM JCL, JavaScript, PHP, SOAP, and XML. Extensive experience with Oracle SQL, Oracle RAC Cluster (11gR2), and Red Hat Cluster Suite.
- Languages: English, Japanese, Chinese
- Security clearances: Public Trust (US Department of Commerce); EOD/Suitability (Department of Homeland Security)

Professional Experience:

The Walt Disney Company
Sr. DevSecOps Engineer (Dec 2023 - May 2024)
Accomplished:
- Managed and optimized Disney's Akamai content delivery networks (CDNs), ensuring high availability and performance for global content distribution. Implemented best practices for CDN configuration and web service integration, improving content delivery speed and reliability across platforms.
- Led the onboarding of applications and databases into HashiCorp Vault, ensuring secure and efficient secrets management (a minimal sketch of this kind of onboarding step follows this entry).
- Used CI/CD pipelines and Terraform for infrastructure as code (IaC), streamlining deployments and improving the scalability and reliability of Kubernetes environments.
- Monitored and tracked Common Vulnerabilities and Exposures (CVE) reports, proactively identifying security vulnerabilities and coordinating timely patching to minimize risk and maintain compliance with industry standards.
- Used Grafana to monitor application performance across Disney's worldwide production systems, ensuring optimal operation and quick resolution of issues.
- Provided expert solutions for complex issues during on-call rotations, maintaining system stability and minimizing downtime.
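A minimal sketch of the kind of Vault onboarding step described in the Disney entry above, using the hvac Python client. The Vault address, token handling, mount path, and secret names here are illustrative assumptions, not values from the actual environment:

```python
# Sketch: onboard an application into Vault by enabling a dedicated KV v2
# mount and seeding a secret. All names and addresses are placeholders.
import hvac

client = hvac.Client(url="https://vault.example.com:8200", token="s.example-token")

# Enable a KV v2 secrets engine for the application if it is not mounted yet
# (response shape follows Vault's sys/mounts API).
mounts = client.sys.list_mounted_secrets_engines()["data"]
if "app-secrets/" not in mounts:
    client.sys.enable_secrets_engine(
        backend_type="kv",
        path="app-secrets",
        options={"version": "2"},
    )

# Seed a database credential under the new mount.
client.secrets.kv.v2.create_or_update_secret(
    mount_point="app-secrets",
    path="orders-db",
    secret={"username": "orders_svc", "password": "change-me"},
)

# Read it back the way an onboarded application would.
resp = client.secrets.kv.v2.read_secret_version(mount_point="app-secrets", path="orders-db")
print(resp["data"]["data"]["username"])
```

In practice the token would come from an auth method such as AppRole or Kubernetes auth rather than a literal value, and per-application policies would restrict each consumer to its own paths.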
Discover Financial Services (Aug 2022 - Dec 2023)
Secrets Storage Team: Sr. DevSecOps
Accomplished:
- Designed and implemented a comprehensive secrets management solution using HashiCorp Vault, ensuring secure storage and access control for sensitive data. Configured TLS and data encryption within Vault for robust protection of stored secrets. Architected and deployed Vault auto-onboarding processes, including disaster recovery (DR) server setup, failover mechanisms, and Vault policy/path management.
- Integrated CyberArk Identity Security and MS Active Directory to manage and secure privileged accounts and credentials, strengthening the organization's overall security posture.
- Implemented AWS Secrets Manager to securely store and rotate sensitive information such as database credentials and API keys, ensuring secrets were accessed only by authorized applications and services. Integrated Secrets Manager with applications and services, using the AWS SDKs and environment variables to dynamically fetch and inject secrets at runtime (see the sketch after this entry).
- Used GitHub Copilot to automate CI/CD pipeline configuration and scripting, reducing manual errors and saving time. Developed and maintained complex GitHub Actions workflows with Copilot, improving deployment efficiency.
- Integrated security and compliance checks into the Packer build process, ensuring golden OS images adhered to industry standards and best practices.
- Implemented Starburst analytics solutions, providing faster and more flexible data analysis and improving the organization's ability to derive actionable insights from complex datasets.
- Developed and maintained infrastructure-as-code templates with AWS CDK, enabling automated, repeatable provisioning of AWS resources. Collaborated with DevOps teams to integrate CDK scripts into CI/CD pipelines using AWS CodePipeline and Jenkins for continuous delivery and deployment of infrastructure changes.
- Set up and maintained on-premises Kubernetes clusters on Linux VMs to support high-availability applications, with CI/CD pipelines automating application and infrastructure updates. Secured the clusters using Kubernetes Network Policies, Secrets, and RBAC to protect communications and manage permissions.
- Designed and implemented ETL processes to link and consolidate data from multiple sources, using SQL, Python, and ETL frameworks to automate extraction, transformation, and loading into data warehouses and analytics platforms.
- Implemented and managed the Elastic Stack (ELK) to collect, analyze, and visualize log data from various sources, and set up centralized logging to improve monitoring and troubleshooting.
- Managed the Akamai web services network and implemented DevOps builds with AWS CI/CD tools (CodeCommit, CodePipeline, CodeBuild) for automated code deployment to AWS EKS in production.
- Deployed and managed Docker containers and stateful applications, including Kafka, Apache Druid, and PostgreSQL, within the EKS Kubernetes cluster, ensuring high performance and reliability.
- Managed Tupperware key-value stores, schedulers (regional and sharded), allocator task control, and stateful services, along with OpenShift (ROSA) Container Platform security and container deployment.
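The runtime secret-injection pattern mentioned in the AWS Secrets Manager bullet above might look roughly like this with boto3; the secret name, JSON shape, and environment-variable override are illustrative assumptions:

```python
# Sketch: fetch a database credential from AWS Secrets Manager at startup,
# honoring an env-var override for local development. Names are placeholders.
import json
import os

import boto3


def get_db_credentials(secret_id: str = "prod/orders/db") -> dict:
    """Return a {'username': ..., 'password': ...} mapping."""
    override = os.environ.get("DB_CREDENTIALS_JSON")
    if override:
        return json.loads(override)

    client = boto3.client("secretsmanager")
    resp = client.get_secret_value(SecretId=secret_id)
    return json.loads(resp["SecretString"])


creds = get_db_credentials()
print(f"connecting as {creds['username']}")  # never log the password itself
```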
Zebra Technologies (Mar 2021 - Aug 2022)
SDO Team: Sr. DevOps Engineer
Accomplished:
- Developed a comprehensive DevOps pipeline using Terraform in Azure Cloud to build and deploy Zebra WFC applications on Zebra devices. Created a unified data access layer with Starburst, streamlining querying and analysis across databases and data lakes and improving data availability and usability.
- Automated deployment of Zebra DNA software on AKS Kubernetes for efficient, consistent software updates. Configured Kubernetes resources (pods, deployments, services) to optimize resource utilization and performance, using the Horizontal Pod Autoscaler and node autoscaling to keep operations efficient. Leveraged Kafka for real-time data streaming and Apache Druid for fast analytics and real-time ingestion, integrated with PostgreSQL for persistent storage.
- Leveraged GitHub Copilot to accelerate development with AI-powered code suggestions and completions. Used AWS CDK to design and deploy scalable, reliable, cost-effective infrastructure, including VPCs, subnets, security groups, and EC2 instances (a sketch of this pattern follows this entry). Configured AWS Secrets Manager to manage database credentials, enabling secure database connections without hardcoding credentials in application code.
- Implemented security best practices and compliance requirements through AWS CDK, including IAM roles and policies, AWS Config rules, and CloudTrail auditing.
- Implemented GCP Cloud Functions with Firebase to create the Zebra Device Tracker app (DTRK) for end-user customers, and set up rules permitting access to Linux file access control lists (FACLs) on each VM.
- Created and managed multi-cloud golden OS images using Packer, ensuring seamless, standardized deployments across AWS, Azure, and GCP.
- Managed the Zebra WFC profile manager and PTT Docker containers, overseeing cluster management and troubleshooting and supporting HashiCorp Vault development.
- Deployed Databricks on AWS using Terraform, integrated with KMS security management tools, created ETL data pipelines, and tested ML model training with Python.
- Created GCP Dataproc clusters with Airflow pipelines to monitor and manage Zebra data and applications.
- Conducted PTT application data analysis using IBM Cloud Pak on IBM Cloud, and provided training and guidance to development teams on secure coding practices, addressing the OWASP Top 10 and CWE Top 25 vulnerabilities.
- Upgraded and patched StreamSets Data Collector and managed data pipelines on the StreamSets server.
- Developed Python scripts to automate the build and deployment of application pipelines in the production environment.
- Exported data pipelines to Azure for OpenAI model training, prediction, and experimentation, using Datadog to monitor applications and services on AWS EKS and Azure. Deployed application containers on the Azure OpenShift (ARO) service.
- Managed and configured Akamai web microservices, and created GitLab pipelines from scratch, including GitLab Runner, templates, modules, integrations, and platform management in both development and production environments.
- Managed API keys for third-party services with AWS Secrets Manager, preventing exposure of keys in source code and configuration files.
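A sketch of the VPC/subnet/security-group/EC2 pattern described in the AWS CDK bullet above, using CDK v2 in Python. The construct names, CIDR mask, and instance size are assumptions, not values from the actual stacks:

```python
# Sketch: a CDK v2 stack with a two-AZ VPC (public + private subnets),
# a security group, and one EC2 instance. All identifiers are placeholders.
from aws_cdk import App, Stack, aws_ec2 as ec2
from constructs import Construct


class NetworkStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Two-AZ VPC with one public and one private (egress-only) subnet tier.
        vpc = ec2.Vpc(
            self,
            "AppVpc",
            max_azs=2,
            subnet_configuration=[
                ec2.SubnetConfiguration(
                    name="public", subnet_type=ec2.SubnetType.PUBLIC, cidr_mask=24
                ),
                ec2.SubnetConfiguration(
                    name="private",
                    subnet_type=ec2.SubnetType.PRIVATE_WITH_EGRESS,
                    cidr_mask=24,
                ),
            ],
        )

        # Security group with no inbound rules by default (outbound allowed).
        sg = ec2.SecurityGroup(self, "AppSg", vpc=vpc, allow_all_outbound=True)

        # Small instance placed in the private subnets.
        ec2.Instance(
            self,
            "AppHost",
            vpc=vpc,
            vpc_subnets=ec2.SubnetSelection(
                subnet_type=ec2.SubnetType.PRIVATE_WITH_EGRESS
            ),
            instance_type=ec2.InstanceType("t3.micro"),
            machine_image=ec2.MachineImage.latest_amazon_linux2(),
            security_group=sg,
        )


app = App()
NetworkStack(app, "network-stack")
app.synth()
```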
Johnson & Johnson, New Jersey (Sep 2018 - Feb 2021)
R&D, AI & ML Department: Technical Lead
Accomplished:
- Built StreamSets (SDK/STF) CI/CD pipelines with Jenkins and GitLab for blue-green deployment, promoting pipelines from QA to production environments. Used HashiCorp Vault for secure secret storage and dynamic secret creation, providing encryption for CI/CD application processes.
- Constructed StreamSets (v3.16.x) Control Hub and Data Collector (v3.17.1) cluster infrastructure, deploying Data Collector containers on Kubernetes in Azure. Supported development teams in creating and running data pipelines, and integrated HBase and Hive with StreamSets pipelines using Dataproc on GCP.
- Managed Domino Data Science (v4.1), Kubeflow, MLflow, Airflow, and Argo deployments on AWS EKS. Built Apigee Edge across multiple clouds and deployed Apigee Microgateway on Kubernetes.
- Created machine learning workflows, from data modeling to deployment, on the Domino and Kubeflow platforms. Developed Python scripts to automate the build and deployment of application pipelines in production environments. Implemented the GitHub GASP algorithm for graph/image processing.
- Developed ML pipelines with Kubeflow and Airflow, integrating Apigee, Jupyter Notebook, Katib for hyperparameter tuning, TensorFlow for model training, and KFServing with JupyterHub (a minimal Airflow sketch follows this entry). Installed and managed Hadoop, Sqoop, and Spark clusters, configuring data governance policies with PKI.
- Created Ansible roles for automated installation of MapR and Elasticsearch/Kibana/Kafka on AWS. Established CI/CD with Ansible Tower and Bitbucket, and deployed MS Dynamics 365 for customer production support.
- Configured Keycloak for single sign-on in AWS.
- Used Packer to build AMIs with applications preinstalled and Terraform to provision and update microservice resources on Kubernetes clusters. Automated the creation of golden OS images with Packer for AWS, Azure, and VMware environments, cutting deployment times by 60%.
- Managed Jenkins/GitHub/Maven/Selenium/Nexus CI/CD pipelines on AWS. Integrated Salesforce, Artifactory, Bitbucket, and Ansible Tower with JIRA for the DevOps group.
- Integrated AWS CloudWatch and Nagios with Opsgenie for alert monitoring. Created Terraform modules to provision AWS VPCs with private and public subnets, EC2, and RDS from scratch.
- Developed Kubernetes solutions using Terraform and Helm charts on Google Cloud Platform (GCP), including snapshot backup and restoration. Created YAML manifests for persistent volumes and persistent volume claims, deploying backup data on Kubernetes clusters with StatefulSet configurations.
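A minimal Airflow shape for the kind of train-then-deploy ML pipeline described in the Kubeflow/Airflow bullet above. The DAG id, schedule, and task bodies are illustrative stubs, not the production pipelines:

```python
# Sketch: a two-task Airflow DAG that trains a model and then deploys it.
# Task bodies are stubs; ids and the schedule are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def train_model(**context):
    # Placeholder: pull features, fit the model, persist the artifact.
    print("training model...")


def deploy_model(**context):
    # Placeholder: push the trained artifact to the serving platform.
    print("deploying model...")


with DAG(
    dag_id="ml_train_and_deploy",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    train = PythonOperator(task_id="train", python_callable=train_model)
    deploy = PythonOperator(task_id="deploy", python_callable=deploy_model)
    train >> deploy  # deploy only runs after a successful training task
```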
General Dynamics Information Technology, Arlington, VA (Nov 2017 - Aug 2018)
DevOps Engineer III
- Supported the OBIM project for the Department of Homeland Security, maintaining high standards of security and compliance.
- Managed IaaS and PaaS in the cloud, building AMI-based applications with Packer and provisioning resources with CloudFormation on AWS.
- Created and maintained build/test pipelines with Checkmarx for release and deployment in TeamCity. Implemented Docker containerization and Compose for local CI/CD environments. Established robust security policies for data governance.
- Used Polarion for project management and tracking, keeping work aligned with project goals and deadlines.

Bank of America, Pennington, NJ (Jan 2017 - Nov 2017)
Network/System Engineer IV
Accomplished:
- Implemented Unix shell scripts for serial and parallel concurrent reboot-check jobs on Apache Cassandra clusters. Used Terraform to deploy VPCs with private and public subnets, instances, and RDS from scratch, and deployed application Docker containers on Kubernetes clusters.
- Employed Ansible with Maven-Artifactory modules to deploy WAR files from Artifactory repositories to staging servers, streamlining the deployment process.
- Managed and executed application security assessments on Veracode's platform, ensuring software compliance with security standards and best practices.
- Implemented and managed data storage and analytics solutions in Cassandra, enhancing data handling capabilities.
- Developed Unix shell scripts that ran as agents on each node to collect and report Cassandra cluster information for efficient monitoring and management (an illustrative sketch follows this entry).
- Created Ansible playbooks to automate data transfer and network configuration tasks, improving operational efficiency.
- Built automation scripts with Ansible to deploy collection agents across Cassandra nodes, facilitating seamless cluster management.
- Installed and configured DataStax OpsCenter and Introscope for comprehensive monitoring of DataStax Enterprise environments.
- Led a project team of five IT engineers in India, overseeing patch management on Red Hat Linux systems and Docker containerization efforts.
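The per-node reporting agents in the Bank of America entry above were Unix shell scripts; a Python analogue of the same idea, parsing `nodetool status` and flagging nodes that are not Up/Normal, might look like this. The parsing assumes nodetool's usual tabular output:

```python
# Sketch: report Cassandra nodes whose status is not "UN" (Up/Normal),
# as a stand-in for the shell-script agents described above.
import subprocess


def check_cluster_health() -> list[str]:
    out = subprocess.run(
        ["nodetool", "status"], capture_output=True, text=True, check=True
    ).stdout
    problems = []
    for line in out.splitlines():
        fields = line.split()
        # Data rows start with a two-letter state code such as UN, DN, or UJ.
        if fields and len(fields[0]) == 2 and fields[0][0] in "UD":
            state, address = fields[0], fields[1]
            if state != "UN":
                problems.append(f"{address} is {state}")
    return problems


if __name__ == "__main__":
    for problem in check_cluster_health():
        print(problem)
```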
BCforward (projects at YipTV and Trac Inc), New Brunswick, NJ (Mar 2016 - Jan 2017)
Sr. Cloud Engineer/Architect
Responsibilities:
- Configured an Oracle Private Cloud Appliance (PCA) with ZFS storage and created Linux VMs in the PCA environment.
- Built and maintained a LAMP environment for the web development team, providing ongoing support.
- Installed Oracle SOA middleware and WebLogic on Oracle Enterprise Linux (OEL) 6.5, creating a two-node SOA cluster to support the SOA development team and users. Developed Python scripts to automate cluster building in PCA and wrote backup scripts for Oracle SOA (an illustrative sketch follows this entry).
- Integrated Selenium with Jenkins and deployed Ansible for test environment automation. Established the AWS cloud network infrastructure for YipTV.
- Designed and created AWS VPCs, Auto Scaling groups, IAM roles, Lambda functions, CloudFormation templates, AWS Directory Service, RDS, S3, EBS, DynamoDB, EMR, AMIs, NAT gateways, and instances for YipTV's Amazon cloud network.
- Used Alfresco on AWS to manage documents, web content, records, and images, creating collaborative content development environments in the cloud.
- Installed and configured open-source Puppet master/agent, compiling manifest files into catalogs for standardization.
- Built web servers behind Elastic Load Balancers with Auto Scaling in an AWS VPC, managing instance volume snapshots and troubleshooting issues.
- Deployed open-source Nagios Core monitoring tools and set up OpenVPN/OpenLDAP servers on EC2 instances.
- Managed Linux KVM installations and VMs on RHEL 7.2. Migrated application servers and databases from on-premises data centers to Amazon EC2, ensuring seamless transitions and minimal downtime.
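An illustrative sketch of a timestamped backup script of the sort written for Oracle SOA in the BCforward entry above; the domain path, backup location, and retention count are assumptions:

```python
# Sketch: archive the SOA domain directory with a timestamp and keep only
# the newest N copies. Paths and retention are placeholders.
import tarfile
import time
from pathlib import Path

DOMAIN_DIR = Path("/u01/oracle/middleware/user_projects/domains/soa_domain")
BACKUP_DIR = Path("/backup/soa")
KEEP = 7  # retain the seven most recent archives


def backup_domain() -> Path:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive = BACKUP_DIR / f"soa_domain-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(DOMAIN_DIR, arcname=DOMAIN_DIR.name)
    return archive


def prune_old_backups() -> None:
    archives = sorted(BACKUP_DIR.glob("soa_domain-*.tar.gz"))
    for old in archives[:-KEEP]:
        old.unlink()


if __name__ == "__main__":
    print(f"wrote {backup_domain()}")
    prune_old_backups()
```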
iConnect, Piscataway, NJ (Jun 2014 - Feb 2016)
Senior DevOps Engineer
Responsibilities:
- Implemented DevOps practices for CI/CD, automating the build, test, debug, and release processes for Java code with a Jenkins server.
- Configured, developed, tested, and supported an Oracle ZFS Storage Appliance (model ZS3-2).
- Installed and tested Oracle WebLogic Server on Oracle Enterprise Linux (OEL) 6.5, validating its functionality with the NPAC application.
- Installed and configured Oracle Identity Manager (OIM) and Oracle Access Manager (OAM) for company user identity management.
- Installed and managed open-source Puppet master/agent, compiling manifest files into catalogs for standard deployment.
- Created Oracle Solaris 11.2 Zones/Domains/LDoms and managed them with Oracle R12c.
- Configured an OpenLDAP server and clients, implementing PAM authentication on Red Hat Linux 6.5/7.1.
- Managed JBoss 7 AS and A-MQ along with Jenkins, supporting the FCC NPAC project with software deployment. Installed and configured Oracle VM 3.2.7 Server and Manager on Red Hat 7.1.
- Set up a Nagios monitoring server and a JBoss Operations Network (JON v3.2) server, using both to manage and monitor enterprise middleware, applications, and services. Integrated JON SNMP alert traps into the Nagios console, monitoring server groups and analyzing alerts.
- Installed and configured the eTools and CMIS Gateway NPAC application simulators, providing support as needed.
- Managed SVN source control and JIRA for the development group, ensuring smooth version control and issue tracking.
- Installed, configured, and managed Symantec NetBackup 7.x servers for data protection and recovery.

Federal Farm Credit Funding Corp., Jersey City, NJ (Mar 2012 - Apr 2014)
Sr. DevOps System Engineer
Accomplishments:
- Installed and configured financial application software, ensuring smooth operation and maintenance of the Ethernet network.
- Automated CI/CD processes using Bitbucket and Jenkins in a DevOps environment, enabling efficient code development and release cycles.
- Migrated NetBackup from version 6.5.6 to 7.5, transitioning from Sun Solaris to Red Hat Linux on VMware ESXi.
- Managed the migration and merging of CVS repositories from Sun Unix onto Red Hat Linux, ensuring compatibility and consistency across platforms.
- Implemented open-source packages to integrate Linux system authentication with MS Windows Active Directory, strengthening security and authentication.
- Set up and managed an open-source Nagios Core network monitoring server for continuous monitoring and alerting on network issues.
- Installed and configured a McAfee ePolicy Orchestrator server and clients plus Informatica Enterprise Server, enhancing security and data management capabilities.
- Upgraded Cisco ASA 5500 IOS and set up F5 firewall/load balancing for the internal network, ensuring network security and optimal performance.
- Integrated SFTP servers into the data warehouse project on the Rackspace public cloud, facilitating secure file transfers.
- Installed and configured Hadoop on Amazon EC2 with 40 nodes, including a head node and worker nodes, automating application software deployment with pdsh and Hortonworks Management Center (HMC); a fan-out sketch follows this entry.
- Supported all Linux servers and client machines running financial software such as BTA, Hyperion, and eSpeed. Managed Veritas Volume Replicator and conducted F5 website failover and disaster recovery tests. Supported Citrix Enterprise, VMware Infrastructure, and MS Windows environments.
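The pdsh-style fan-out used to push software onto the 40-node Hadoop cluster in the entry above could be approximated in Python as below; the host naming scheme and install command are assumptions:

```python
# Sketch: run one install command on every cluster node concurrently over
# ssh, pdsh-style. Hostnames and the command are placeholders.
import subprocess
from concurrent.futures import ThreadPoolExecutor

HOSTS = [f"hadoop-{n:02d}.example.com" for n in range(1, 41)]
COMMAND = "sudo yum -y install example-app"


def run_on(host: str) -> tuple[str, int]:
    proc = subprocess.run(
        ["ssh", "-o", "BatchMode=yes", host, COMMAND],
        capture_output=True,
        text=True,
    )
    return host, proc.returncode


with ThreadPoolExecutor(max_workers=10) as pool:
    for host, rc in pool.map(run_on, HOSTS):
        status = "ok" if rc == 0 else f"failed (rc={rc})"
        print(f"{host}: {status}")
```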
Dow Jones, Monmouth Junction, NJ (Apr 2010 - Jan 2012)
Infrastructure/Network Architect, IT - Global Business & Technology Services Department
Responsibilities:
- Upgraded the CatOS on Cisco Catalyst 6000/6500 series switches, configuring them as L3 switches. Designed a network with OSPF, BGP, and HSRP protocols for the Dow Jones Factiva web server system, ensuring optimal performance and reliability.
- Implemented a Network Management System (NMS) to support and automate tasks on Dow Jones' internal network, improving network monitoring and management.
- Migrated from Cisco routers to Citrix NetScaler, installing and configuring Vservers, service binding, and monitor services on a two-arm Citrix NetScaler for load balancing and firewall functions.
- Implemented centralized management of multiple Secure Access products with Secure Access Central Manager, streamlining management and monitoring.
- Provided maintenance, support, and troubleshooting for Juniper SA4500/SA6500 SSL VPNs, managing firewall policies and performing setup and modifications as needed.

Ricoh, West Caldwell, NJ (Aug 1998 - Mar 2010)
Principal Unix Engineer
Accomplishments:
- Deployed and managed UNIX variants including Linux, Sun Solaris, SuSE, IBM AIX (4.3.3/5.3), and HP-UX (11i v1/v2) in a data center environment. Designed and set up Unix/Linux/Windows security systems and installed backup and recovery solutions across the network. Automated CI/CD projects with shell and Python scripts.
- Developed a UNIX printer driver for Ricoh's MFPs worldwide, providing printing solutions for the Unix platform; the driver is available for download on Ricoh's website.
- Built Unix DNS, NIS, NFS, Sendmail, and LDAP servers, and installed Oracle RAC clusters on UNIX and Linux servers.
- Hosted Ricoh's website on Linux/Apache and Windows IIS servers, supporting Tomcat 5.5 and JBoss on the web servers. Managed WebSphere on Red Hat Linux servers and maintained the company's web server infrastructure.
- Set up Amazon Elastic Compute Cloud (EC2) instances alongside Linux on VMware servers. Proficient in creating, accessing, running, and modifying AMI images and familiar with the yum tool. Configured Elastic MapReduce, S3 storage, and Hadoop. Supported the development of the cloud computing strategy, business model, and roadmap from a technical perspective. Developed Ricoh device plug-ins for cloud computing on Salesforce and web applications running on AWS.
- Planned and implemented Ricoh's disaster recovery system, including cross-network backup and recovery for multiple platforms using Veritas NetBackup.
- Migrated from Solaris, AIX, and HP-UX to Red Hat Linux running on VMware ESX 3.5 on HP blade servers.

NTT Data USA Branch, Jersey City, NJ (May 1997 - Jul 1998)
UNIX and Network Specialist
Responsibilities:
- Designed and installed WAN/LAN infrastructure for a Japanese bank (DKB), implementing EIGRP, ISDN, and T3/T1 circuits with Cisco 7500/4500 routers for high-performance network connectivity.
- Installed the IBM MQ Series for message exchange between multiple platforms in the computer data center, facilitating efficient communication and data exchange.
- Researched new application software and performed benchmarking to evaluate performance and suitability for bank systems.
- Installed, configured, and administered Netscape Enterprise web servers and websites, ensuring reliable, secure web hosting services.

Education:
Electrical Engineering Department, State University of New York at Stony Brook, NY, 1991-1993
Electrical Engineering Department, Tokyo University, Japan, 1987-1989
B.S., Electrical Engineering Department, Shanghai University, P.R. China, 1978-1982

Certifications: CCNA, MCP, IBM AS/400 Administration
