
Quality Assurance Test Automation Resume...

Candidate Information
Title: Quality Assurance Test Automation
Target Location: US-NJ-Princeton

Candidate's Name
E-mail: EMAIL AVAILABLE  Mobile: PHONE NUMBER AVAILABLE  LinkedIn: Profile

Professional Summary:
Solutions-oriented professional with progressive experience in Software Quality Assurance, Test Management, Test Automation, API, ETL, Performance and Web UI testing on large-scale Digital, Cloud and Data transformation programs within a fast-paced Agile/Scrum delivery model.
Extensive domain knowledge in Banking and Insurance: CRMS tools, Sales, Deals, AML-KYC, Salesforce, MS Dynamics, Global Payments, Wire, Fedwire, ACH, SWIFT, FIX Protocol, Client Reference Data, Client Onboarding, General Insurance, P&C Insurance, Guidewire PolicyCenter, Rating and Pega Claims.

Key Skills, Testing Tools & Techniques:
- Test Automation: Java, Selenium, Cucumber BDD, JUnit, TestNG, UFT, TOSCA
- API: Java, MQ, Kafka, Microservice Contract Testing, TIBCO, Postman, Parasoft SOAtest
- Performance Testing: JMeter, LoadRunner, RPT, SiteScope, Dynatrace, Splunk
- ETL/ELT: DB2, Oracle, SQL Server, PostgreSQL, Teradata, SQL, HQL, Hive, Azure SQL DB, Snowflake DB, AWS Redshift, GCP BigQuery, DWH and Data Lake
- BI: MicroStrategy, Cognos, Power BI and Tableau reports
- Other tools: ALM, JIRA, ServiceNow ITSM, IRM, qTest, Rally, Bitbucket, Jenkins, Ansible

Experience Summary:

QA / Test Management Experience:
- 10+ years of experience as QA Lead in end-to-end testing, test process, test automation, performance, ETL and API testing on enterprise-level strategic programs.
- Consulting services: leading workshops, assessing testing maturity and identifying opportunities to build a strategy for implementing full-scale testing programs.
- Experience sizing testing efforts and managing projects to a budget and timeline.
- Hands-on experience as a Test Lead working with developers, product specialists, business analysts and project managers in Agile and waterfall methodologies.
- Worked with Scrum teams to resolve blocker issues.
- Tracked and maintained JIRA Epic/Release/Story/Task/Sub-task tickets.
- Managed business partners and business users in UAT and production cutover testing.
- Defined and implemented test strategy, test plans and the automation framework to ensure a high standard of testing and quality.
- Executed and evaluated software quality assurance across the SDLC by conducting system, regression, integration and end-to-end testing; isolated, replicated and reported defects and verified defect fixes; conducted defect reviews and published status reports.

Automation & Functional Testing Experience:
- 7+ years of experience in automation using Java, Selenium and the Cucumber BDD framework.
- Implemented Java-based automation on the in-house hybrid RAFT framework for Web UI, mainframe applications, API and ETL testing, utilizing BDD, Data-Driven Testing (DDT), Page Object Model and Page Factory design patterns (see the sketch after this section).
- Developed automation test scripts in Java, Selenium WebDriver, Cucumber Gherkin, TestNG, JUnit and SQL.
- Executed automation test scripts and logged bugs in JIRA for resolution.
- Built customized HTML reports at step level and test-summary level.
- Experienced in CI/CD integration with GitHub/Bitbucket and Jenkins.
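For illustration, a minimal sketch of the kind of Selenium WebDriver page object and Cucumber step definition used in a Java BDD framework like the one described above. The login page, its locators, the URL and the test data are hypothetical placeholders, not details of the in-house RAFT framework.

    // Hypothetical Gherkin step this backs:
    //   Given the user logs in as "qa_user"
    import io.cucumber.java.en.Given;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.WebElement;
    import org.openqa.selenium.chrome.ChromeDriver;
    import org.openqa.selenium.support.FindBy;
    import org.openqa.selenium.support.PageFactory;

    // Page Object: encapsulates locators and actions for an illustrative login page.
    class LoginPage {
        @FindBy(id = "username") private WebElement userField;
        @FindBy(id = "password") private WebElement passwordField;
        @FindBy(css = "button[type='submit']") private WebElement signInButton;

        LoginPage(WebDriver driver) {
            PageFactory.initElements(driver, this);   // Page Factory pattern
        }

        void loginAs(String user, String password) {
            userField.sendKeys(user);
            passwordField.sendKeys(password);
            signInButton.click();
        }
    }

    // Cucumber step definition wiring the Gherkin step to the page object.
    public class LoginSteps {
        private final WebDriver driver = new ChromeDriver();

        @Given("the user logs in as {string}")
        public void theUserLogsInAs(String user) {
            driver.get("https://example.internal/login");    // placeholder URL
            new LoginPage(driver).loginAs(user, "secret");    // real data would come from a DDT source
        }
    }
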
API / Middleware MQ Testing Experience:
- 7+ years of experience in middleware integration testing using tools such as Postman, Parasoft SOAtest, Kafka, IBM MQ, Mule Anypoint, AppWatch and a Java-based automation framework.
- Extensive experience with SWIFT messages MT102, MT103, EXEC983, FLM, XML, pain.001 and pacs.008 file formats, FTP, Wire and ACH payments, Fedwire, FIX Protocol, FIXML, TriG, RDF, web services, RESTful services, microservice contract testing, JSON, XML and other formats.
- Prepared test cases, test messages and files based on feature requirement specifications.
- Implemented automation framework development in Java with Cucumber BDD and Data-Driven Testing (DDT) methodologies for API, MQ and RESTful services testing (see the sketch after this section).
- Executed API test scripts and logged bugs in JIRA for resolution.
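For illustration, a minimal sketch of the kind of REST API check that is typically scripted in Postman or folded into a Java automation suite: call an endpoint, then assert the status code and a field in the JSON payload. The endpoint URL, record ID and expected field value are hypothetical.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import static org.junit.jupiter.api.Assertions.assertTrue;

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import org.junit.jupiter.api.Test;

    public class PartyApiTest {

        private final HttpClient client = HttpClient.newHttpClient();

        @Test
        void getPartyReturnsActiveStatus() throws Exception {
            // Hypothetical endpoint; a real suite would read the host and IDs from test config.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://example.internal/api/parties/12345"))
                    .header("Accept", "application/json")
                    .GET()
                    .build();

            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());

            // Status-code and payload assertions; a JSON library such as Jackson
            // would normally replace the naive substring check below.
            assertEquals(200, response.statusCode());
            assertTrue(response.body().contains("\"status\":\"ACTIVE\""));
        }
    }
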
ETL / Big Data Testing Experience:
- Around 7 years of experience in ETL and ELT processes and big data Hadoop testing (HDFS, Hive, Sqoop, HBase), the RDBMSs Oracle 11g, Netezza, Teradata, DB2 and SQL Server, and BI reporting with MicroStrategy, Cognos, Power BI and Tableau.
- Performed ETL and ELT data validation testing on data lake and data warehouse applications, on premises and in the cloud.
- Validated data migration integrity between legacy databases and cloud platforms: AWS, Microsoft Azure SQL Database and Hadoop big data.
- Exposure to the cloud data platforms GCP BigQuery and Azure Synapse.
- Utilized the RAFT automation framework built on Java with Cucumber TDD/BDD techniques for ETL and data validation (see the sketch after this section).
- Proficient in writing complex SQL queries to validate ETL/ELT test scenarios.
- Validated data integrity, quality, uniqueness, structure and field mapping.
- Validated aggregation, FACT and DIM table business logic within DWH tables.
- Executed ETL test cases and logged defects for data discrepancies.
- Executed Informatica and AutoSys jobs for data refreshes in LLEs.
- Validated MicroStrategy, Cognos and Tableau report data against the databases.
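For illustration, a minimal sketch of a Cucumber step definition for ETL validation of the kind described above: a Gherkin step that compares row counts between a staging table and its warehouse target over JDBC. The connection URL, credentials and table names are hypothetical; a real framework would load them from configuration rather than hard-coding them.

    // Hypothetical Gherkin step this backs:
    //   Then the row count of "FACT_POLICY" matches the row count of "STG_POLICY"
    import static org.junit.jupiter.api.Assertions.assertEquals;

    import io.cucumber.java.en.Then;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class EtlValidationSteps {

        // Hypothetical warehouse connection; real credentials would come from secure config.
        private Connection warehouse() throws Exception {
            return DriverManager.getConnection(
                    "jdbc:db2://dwh-host:50000/DWHDB", "qa_user", "secret");
        }

        private long rowCount(Connection conn, String table) throws Exception {
            try (Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
                rs.next();
                return rs.getLong(1);
            }
        }

        @Then("the row count of {string} matches the row count of {string}")
        public void rowCountsMatch(String targetTable, String sourceTable) throws Exception {
            try (Connection conn = warehouse()) {
                assertEquals(rowCount(conn, sourceTable), rowCount(conn, targetTable),
                        "Row count mismatch between " + sourceTable + " and " + targetTable);
            }
        }
    }
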
Performance Testing Experience:
- 5+ years of experience in performance testing using JMeter, LoadRunner and RPT.
- Designed, developed and executed system performance tests: load, stress, scalability, spike, volume and endurance/soak testing (a plain-Java sketch of a simple load scenario follows this section).
- Created performance test scripts in JMeter for Web UI, API and DB calls.
- Experienced in identifying memory leaks, connection issues and throughput bottlenecks in web applications, infrastructure and the cloud.
- Monitored with APM tools and published performance metrics such as resource data, CPU, network and database utilization, and garbage collection.
- Strong knowledge of the monitoring tools CA Wily Introscope, SiteScope, Dynatrace and Splunk.
- Certified professional in HP LoadRunner and IBM Rational Performance Tester.
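For illustration, a minimal plain-Java sketch of the kind of concurrent load scenario that would normally be scripted in JMeter or LoadRunner: a fixed number of virtual users repeatedly hitting one endpoint while response times are collected and summarized. This is not the JMeter API; the endpoint, user count and iteration count are hypothetical.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.List;
    import java.util.concurrent.ConcurrentLinkedQueue;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class SimpleLoadProbe {

        public static void main(String[] args) throws Exception {
            int virtualUsers = 25;                    // hypothetical load profile
            int iterationsPerUser = 40;
            URI target = URI.create("https://example.internal/api/health");

            HttpClient client = HttpClient.newHttpClient();
            ExecutorService pool = Executors.newFixedThreadPool(virtualUsers);
            ConcurrentLinkedQueue<Long> latenciesMs = new ConcurrentLinkedQueue<>();

            Runnable user = () -> {
                HttpRequest request = HttpRequest.newBuilder(target).GET().build();
                for (int i = 0; i < iterationsPerUser; i++) {
                    long start = System.nanoTime();
                    try {
                        client.send(request, HttpResponse.BodyHandlers.discarding());
                    } catch (Exception e) {
                        // A real test would count errors toward the failure rate.
                    }
                    latenciesMs.add((System.nanoTime() - start) / 1_000_000);
                }
            };

            for (int u = 0; u < virtualUsers; u++) {
                pool.submit(user);
            }
            pool.shutdown();
            pool.awaitTermination(10, TimeUnit.MINUTES);

            // Summarize average and 95th-percentile latency.
            List<Long> sorted = latenciesMs.stream().sorted().toList();
            long p95 = sorted.get((int) (sorted.size() * 0.95));
            double avg = sorted.stream().mapToLong(Long::longValue).average().orElse(0);
            System.out.printf("samples=%d avg=%.1f ms p95=%d ms%n", sorted.size(), avg, p95);
        }
    }
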
Career Profile (Role | Organization | Location | Duration):
- QA Automation Lead (APIs, Web UI and Database) | Bank of America (client), Mitchell Martin Inc. | New York City, USA | Oct 2021 - Till Date
- QA Lead / E2E Test Lead, Automation/Performance/ETL (APIs, ETL and Web UI) | Bank of America (client) / AIG (client), Tata Consultancy Services | New York City, USA / Chennai, India | Dec 2012 - Oct 2021
- Performance Test Consultant (Web UI, APIs and DB) | Bank of America (client), Tata Consultancy Services | Chennai, India | Dec 2011 - Nov 2012
- Test Automation Engineer (Functional Web UI & APIs) | Bank of America (client), Tata Consultancy Services | Chennai, India | Feb 2011 - Dec 2011
- Test Lead & UAT Test Coordinator (AML-KYC) | Bank of America (client), Tata Consultancy Services | Chicago, USA | Apr 2009 - Feb 2011
- Test Engineer (ETL & BI) | Bank of America (client), Tata Consultancy Services | Chennai, India | Oct 2006 - Mar 2009

Educational Qualifications (Degree and Date | Institute | Major and Specialization):
- Master of Philosophy in Computer Science, 2006 | Bharathiyar University, Coimbatore, TN, India | Computer Science
- Master of Science in Computer Science, 2004 | Vysya College, Periyar University, Salem, TN, India | Computer Science

Details of Assignments

Project Title #1: GMAR - Client Reference Data, New York City
Client: Bank of America (via Mitchell Martin Inc.)
Role: QA Automation Lead
Duration: Oct 2021 - Till Date
Description:
Client Reference Data: the Client Reference Data platform consists of a suite of applications, such as the Client REST APIs, Refservice APIs, Content Management APIs, SPARQL, the Cesium Web UI, Client Onboarding workflows, AML-KYC profile, AML-KYC questionnaire, AML-KYC restrictions and closures, and the GMAR database applications LATR and Account AML reports.
1. APIs - the Client REST API and Refservice APIs are built on RDF/TriG formats to publish and validate parties into Cesium and to allow other upstream or downstream applications to access and publish party, AML-KYC and accounting data within the bank.
2. Web UI - the Cesium UI is used for publishing and tracking newly onboarded clients, onboarding workflows and their AML-KYC reference data.
3. Database - the GMAR DB application consists of AML regulatory reports, LATR, Account AML and reconciliation reports. LATR feeds data daily from GM data and validates AML regulatory compliance; Account AML feeds data from Cesium and validates party AML compliance.
The automation scope is to adopt and customize the in-house Java-based automation framework RAFT and Litmus Test, develop automation scripts for APIs, Web UI workflows and database validation, and perform functional and regression testing.
Responsibilities:
- Understood and implemented the bank's in-house Java-based automation platform RAFT and the execution tool Litmus Test.
- Collaborated with the RAFT support team and adhered to the changes introduced.
- Participated in daily scrum stand-up calls, sprint planning and retrospectives.
- Worked with the scrum team to resolve blocker issues.
- Tracked and maintained JIRA Epic/Release/Story/Task/Sub-task tickets.
- Defined and implemented test strategy and planning and enhanced the framework to ensure a high standard of testing and quality for the in-scope applications.
- Prepared test cases and RDF/TriG test data files based on feature requirement specifications for the Client REST API and Refservice applications.
- Prepared stubbed test data for microservice contract testing and executed it in Pact.
- Created Cucumber feature files using BDD Gherkin.
- Developed test scripts in Java/Selenium WebDriver for client onboarding UI workflows.
- Prepared performance test scripts in JMeter for API and DB calls.
- Scheduled and executed load, stress and endurance tests and shared the performance metrics with the development and project teams.
- Prepared complex SQL for ETL data validation and BI Tableau reports.
- Created ETL test automation scripts in Java with the Cucumber BDD test methodology.
- Executed and checked scripts to ensure bug-free script deliverables.
- Checked automation code in and out of Bitbucket.
- Executed automation tests and reported defects in JIRA.
- Participated in daily Agile scrum meetings and updated JIRA tickets.
- Scheduled Litmus tests for daily health check, sanity and regression testing.
- Performed release coordination in ROTA by developing Ansible playbook scripts.
Tools: Java, Selenium WebDriver, Cucumber BDD, Postman, JIRA, RDF/TriG, JMeter, Bitbucket, Ansible, DB2, SPARQL, Tableau

Project Title #2: Portugal SAF-T Invoice Regulatory and Stamp Duty Monthly Reporting
Client: American International Group (AIG)
Role: QA Automation Lead / End-to-End Test Lead
Duration: Sep 2019 - Sep 2021
Description:
Portugal SAF-T Invoice Regulatory: Portugal was the first country to follow the OECD recommendation, adopting the Standard Audit File for Tax (SAF-T) in 2019. The Portugal SAF-T file has followed a phased development process with increasing requirements for companies and Guidewire PolicyCenter implementations. The Portuguese Tax Authority (PTA) has stipulated that a certified invoicing system/software be used to produce invoices and relevant tax documents, with the capability to extract a SAF-T invoicing file reported to the PTA on a monthly basis. Such system certification is performed by the PTA in a controlled process.
Portugal SDMR: this project requires a new process flow for generating the stamp duty monthly report in Portugal. To date, SD amounts collected are reported to the PTA in total; from Jan 2021 the stamp duty is to be reported per Tax ID, per SD code/type of transaction, per region. The following Iberia systems are in scope for this project:
1) Portugal GOALD premium data extract.
2) Portugal DM Web premium data extract.
3) New ETL job(s) for extraction, data loading, consolidation and XML generation.
Responsibilities:
- Test process governance for all the QA programs to adhere to the Testing Factory standards in Agile methodology.
- Facilitated daily scrum stand-up calls, sprint planning and retrospectives.
- Experience supporting Guidewire PolicyCenter, including the latest GWPC version, for US P&C insurance and Pega Claims Center.
- Worked with the scrum team to resolve blocker issues.
- Tracked and maintained JIRA Epic/Release/Story/Task/Sub-task tickets.
- Reported scrum burndown charts, productivity and backlogs.
- Defined and implemented test strategy, planning and framework for testing.
- Analyzed feature requirement specifications.
- Developed Cucumber BDD feature files and created Java step definitions.
- Automated ETL tests using Java and the Cucumber BDD test methodology.
- Prepared complex SQL for ETL data validation and BI Tableau.
- Prepared performance test scripts in JMeter for API and DB calls.
- Scheduled and executed load, stress and endurance tests and shared the performance metrics with the development and project teams.
- Monitored the tests using the APM tool Dynatrace.
- Worked with ServiceNow ITSM and resolved ITSM tickets.
- Maintained Integrated Risk Management compliance with ServiceNow IRM.
- Prepared and published performance test reports and key metrics.
- Maintained a metrics dashboard at the end of each phase and at project completion.
Tools: Java, Selenium WebDriver, Cucumber BDD, JIRA, HP ALM/QC, Sybase 15/16, ETL DataStage, GitHub, MPP, VB.NET, Power BI, Tableau, AWS, Azure SQL Database

Project Title #3: Treasury Authorized Data Provisioning (TRADs), Hadoop Migration
Client: Bank of America
Role: QA Automation Lead / End-to-End Test Lead
Duration: Sep 2018 - Aug 2019
Description:
Treasury Authorized Data Provisioning (TRADs):
Phase 1: data migration from all the SORs of TRADs provisioning to staging while maintaining data quality, which is required for reverse data lineage of the RDA assessment.
Phase 2: migration of data provisioning to big data using Hadoop, Hive, Sqoop and Spark by creating a Unified Credit Model (data model) to provide structured data to downstream consumers.
Responsibilities:
- Test process governance for the QA programs to adhere to the Testing Factory standards in Agile methodology.
- Ensured changes to the templates were published to the entire team.
- Participated in test plan/test specification/test script review meetings.
- Developed Selenium/Java scripts using the RAFT framework.
- Developed automation test scripts in Java and Cucumber for data validation.
- Prepared complex SQL and HQL for data validation.
- Validated data integrity, data quality, data uniqueness, structure and fields.
- Checked data cleanliness, scrubbing and masking.
- Validated data between HDFS and RDBMS by running Sqoop jobs (see the sketch after this project).
- Validated SQL aggregation, FACT and DIM tables in the DWH.
- Represented QA in daily scrum calls with the Scrum Master, product and Dev/QA teams to discuss issues and action plans.
- Prepared and updated the test completion report at the end of the test phase.
Tools: Java, Selenium WebDriver, Cucumber BDD, JIRA, HP ALM/QC, Big Data Hadoop, Spark SQL, Kafka, HDFS, Bitbucket, Azure SQL Database, Azure Data Studio, GCP, Snowflake DB
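For illustration, a minimal sketch of the HDFS-versus-RDBMS check described above, assuming the source database and HiveServer2 are both reachable over JDBC (the Hive JDBC driver on the classpath): compare the row count of a Sqoop-imported Hive table against the source table. Host names, schemas and table names are hypothetical.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class SqoopImportCountCheck {

        // Runs SELECT COUNT(*) against the given table and returns the result.
        private static long count(Connection conn, String table) throws Exception {
            try (Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
                rs.next();
                return rs.getLong(1);
            }
        }

        public static void main(String[] args) throws Exception {
            // Hypothetical source RDBMS (Oracle) and HiveServer2 connections;
            // real hosts and credentials would come from test configuration.
            try (Connection oracle = DriverManager.getConnection(
                         "jdbc:oracle:thin:@//src-host:1521/TRADS", "qa_user", "secret");
                 Connection hive = DriverManager.getConnection(
                         "jdbc:hive2://hive-host:10000/trads_db", "qa_user", "secret")) {

                long sourceRows = count(oracle, "CREDIT_EXPOSURE");
                long hiveRows = count(hive, "credit_exposure");   // Sqoop-imported Hive table

                System.out.printf("source=%d hive=%d match=%b%n",
                        sourceRows, hiveRows, sourceRows == hiveRows);
            }
        }
    }
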
Project Title #4: Middleware Technologies (SWIFT Mandates, ACH Rhino, Dynamic Currency Conversion, OFAC Scanning, GFD, GTMS and Other Apps, Payments Platforms)
Client: GBAM - Bank of America, USA
Role: End-to-End Test Lead / QA Lead, Middleware Data Services
Duration: Nov 2012 - Aug 2019
Location: New York City, NY, USA
Description:
The SWIFT network updates and/or implements new message types in response to regulatory and AML requirements in every year's November release. SWIFT is the Society for Worldwide Interbank Financial Telecommunication; more than 10,000 banking organizations, securities institutions and corporate customers in 212 countries use SWIFT formats daily to exchange millions of standardized financial messages. To support these changes, Bank of America must make changes globally every November on the wire payment systems to accommodate any new messages or updates to existing messages.
Responsibilities:
- Defined and implemented test strategy, planning and framework to ensure a high standard of testing and quality.
- Analyzed and reviewed the Request Processor mapping documents, the Initiative Integrated Matrix (IIM), XSDs and copybooks.
- Extensive experience handling and preparing test payment file formats: SWIFT MT102, MT103, EXEC983, FLM, ISO pain.001, pacs.008, XML, XSD, FTP, Wire, ACH payments, Fedwire, FTR Informatica Data eXchange, RDF and TriG files.
- Validated and tracked services using the bank-internal RPST/RPIST tools.
- Validated and verified XML tags and message payload fields.
- Prepared stubbed test data for microservice contract testing and executed it in Pact.
- Created simulators for microservices for contract version and data validation.
- Prepared Kafka messages and validated them on the producer and consumer (pub/sub) sides (see the sketch after this project).
- Developed automation test scripts in the BDD framework with Cucumber for service validation.
- Created Cucumber feature files and set up data ports and YAML files.
- Validated results in the downstream mainframe applications.
- Performed checkpoint reviews of SOAtest scripts in Parasoft SOAtest for MQ and HTTPS services regression testing.
Tools: IBM DataPower, IIB, Mule ESB, DB2, Mainframe, Oracle, Java, Cucumber, Parasoft SOAtest, Quality Center, XML Spy, SSH Tectia, Notepad++, AppWatch, PathWAI, Azure SQL Database, Azure Data Studio
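For illustration, a minimal sketch of the kind of producer/consumer round-trip check used when validating Kafka messaging, assuming the standard Apache Kafka Java client: publish a test payment message to a topic and consume it back to verify it arrived intact. The broker address, topic name and payload are hypothetical, and a real check would poll in a loop until a timeout rather than relying on a single poll.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class KafkaRoundTripCheck {

        public static void main(String[] args) throws Exception {
            String broker = "kafka-host:9092";          // hypothetical broker
            String topic = "payments.test";             // hypothetical topic
            String payload = "{\"msgType\":\"MT103\",\"amount\":\"100.00\"}";

            // Produce one test message and wait for the broker acknowledgement.
            Properties prodProps = new Properties();
            prodProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, broker);
            prodProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            prodProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(prodProps)) {
                producer.send(new ProducerRecord<>(topic, "test-key", payload)).get();
            }

            // Consume it back and verify the payload survived the round trip.
            Properties consProps = new Properties();
            consProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, broker);
            consProps.put(ConsumerConfig.GROUP_ID_CONFIG, "qa-roundtrip-check");
            consProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
            consProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            consProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consProps)) {
                consumer.subscribe(List.of(topic));
                boolean found = false;
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(10))) {
                    found |= payload.equals(record.value());
                }
                System.out.println("message received intact: " + found);
            }
        }
    }
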
Project Title #5: CED Infrastructure & RAC Implementation
Client: GWBT - Bank of America, Chicago, USA
Role: Performance Test Lead
Duration: Apr 2009 - Nov 2012
Description:
The RAC project addresses the current gap in the bank's Client Data Management (CED) environment of not supporting a 24x7 global operation. CED is the system of record for GCIB legal-entity information supporting client regulatory and reporting requirements for BAC. With this RAC implementation, CED will have a stated SLA of 24x7 target uptime, and the necessary supporting processes will satisfy a 24x7 operational model, or at minimum a 24x6 operational model for helpdesk functions, to support the global user base introduced by the Merrill Lynch portfolio. The scope of this initiative is to convert the current database instances from single instances to Real Application Cluster (RAC) database instances.
Responsibilities:
- Defined the performance test strategy, planning and scheduling of the required level of load testing.
- Project estimation and fund allocation for the team members.
- Managed team schedules, milestones and deliverables using internal tools for all technical issues related to scripting and configuration.
- Designed and developed LoadRunner scripts based on the most performance-centric workflows.
- Tracked daily progress, reported issues and proposed resolutions.
- Created executive summary reports for project stakeholders.
- Participated in performance tuning analysis with the engineering team.
- Performance monitoring using CA Wily Introscope, Dynatrace, HP SiteScope and Splunk.
- Reviewed Oracle AWR reports and discussed performance tuning with DBAs.
Tools: LoadRunner, Rational Performance Tester, HP Quality Center, Oracle 10g, SSH Tectia Client

Project Title #6: Rational Rollout - Bank of America, USA
Role: Performance Engineer
Duration: Jan 2008 - Apr 2009
Description:
The Rational rollout effort consisted of conducting research and a proof of concept on Rational 2007 tools such as RFT, RPT, RST and Rational ClearQuest. IBM signed off on the proof of concept conducted. All automation, performance and SOA assets were converted from the HP Mercury tool set to the Rational tool set.
Responsibilities:
- Built out the Rational test environment.
- Proof of concept and sign-off on the Rational tool set.
- Designed and analyzed automation and performance use cases.
- Reviewed the automation and performance use cases with the project stakeholders.
- Developed automation and performance scripts using the Rational tool set.
- Reviewed the automation and performance scripts with the project stakeholders.
- Executed the automation and performance scripts and shared the test results with the project stakeholders.
Tools: HP LoadRunner, Rational Performance Tester, Rational Service Tester, Rational ClearQuest, HP Quality Center

Project Title #7: 1. UB Profitability Consolidation; 2. LaSalle Bank Conversion - Bank of America, USA
Role: ETL Test Engineer
Duration: Oct 2006 - Jan 2008
Description:
With Bank of America's acquisition of LaSalle Bank, there was a need to migrate relevant LaSalle clients and prospects to the existing CED database. Conversion of all required data elements provides the foundation for moving from the LaSalle tool suite to the CRMS CIBR tool suite and supports other CRMS efforts and applications. LaSalle client data is extracted from an MS Access database as a tilde-delimited text file and loaded into a staging table. Only success and exception records are moved to CED, and every result action must have an entry in the result table. Client MI data, Market Manager data, Sales Logic data, guarantor data and Eagle Leasing data were migrated during this project.
UB Profitability Consolidation's primary purpose is to provide a single source of profitability data to the Commercial and GCIB profitability tools in Universal Bank. To implement these changes, the work effort was broken into various work packets. Each work packet is split up based on the financial data and account details; PDB and PRF tables are implemented to consolidate the profitability data, which is then moved into ACCUMULATOR tables.
Responsibilities:
- Analyzed and understood the design documents: HLD, LLD and mapping documents.
- Prepared test cases using Oracle SQL queries.
- Executed the SQL test cases to validate data integrity and data consistency.
- Executed the SQL test cases to validate the transformation logic.
- Executed the SQL test cases to verify that records migrated accurately.
- Logged defects in Quality Center and published status reports.
Tools: Oracle 10g, Teradata, PL/SQL Developer, HP Quality Center

Personal Details:
Sex: Male
Nationality: Indian
Marital Status: Married
Current Location: New Jersey, USA
Contact Number: PHONE NUMBER AVAILABLE
Visa: H1B, I-140 approved, Priority Date Apr 2018
