Candidate's Name
PHONE NUMBER AVAILABLE | EMAIL AVAILABLE | St. Louis, MO

Professional Summary:
- 14+ years of experience in software testing across ETL/DWH/BI, API, automation, functional/manual QA, and mainframe testing.
- 6+ years of experience in ETL/DWH/BI testing.
- Proficient in the complete Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
- Good knowledge of BI testing (DB & reports) and data obfuscation & migration testing, with a basic understanding of UNIX commands.
- Good knowledge of testing conversion and migration projects.
- Worked on requirements involving ETL processes for a data warehouse fed from legacy systems using MS-SQL SSIS packages.
- Performed data validation testing by writing SQL queries.
- Hands-on experience in testing near real-time, fault-tolerant data pipelines and APIs.
- Exposure to the Big Data ecosystem.
- Involved in DB testing using on-premises MS-SQL SSIS packages.
- Extensive experience in extraction, transformation, and loading of data directly from heterogeneous source systems such as flat files, SQL Server, and Oracle.
- Experience in testing the migration of legacy ETL applications to Microsoft Azure using Azure Data Factory and Azure Databricks.
- Knowledge of Azure Functions for performing data validation.
- Well-versed in various types of testing, including unit, functional, integration, and system testing, and more.
- Excellent analytical and problem-solving skills, with a strong focus on learning new technologies.
- Effective communication and interpersonal skills.
- Experience in project management activities: preparing estimations, billing, invoicing, QA metrics, POCs, review checklists, pre-delivery audits, hiring and building expert pools, case study documents, and user manuals.
- Knowledge of CI/CD using Jenkins.
- Knowledge of JIRA for creating and documenting test plans and test cases and registering expected results.
- Involved in regression, UAT, and PIV (post-implementation testing) support phases.

Technical Summary:
Programming Languages: Python, Java (OOP concepts), JCL, and COBOL
ETL Tools: Talend, SAP BODS, Teradata, SSIS, Azure Data Factory, Azure Databricks, DB Solo, Beyond Compare
Reporting Tools: SAP BusinessObjects, OBIEE, Power BI, and Tableau
Cloud Technologies: IAM, EC2, VPC, S3, Lambda, Azure Data Lake, Databricks & Spark
Defect Management Tools: JIRA, HP Quality Center, HP ALM, Bugzilla, and Octane
Testing Methodologies: Smoke, Functional, Integration, Regression, GUI, Load/Performance, System, and User Acceptance Testing
Databases/Database Tools: Oracle, MySQL, PostgreSQL, Toad, SQL Developer, and DB2
SDLC Methodologies: Waterfall, V-Model, Agile (Scrum and Kanban)
Operating Systems: Windows 98/XP/Vista/10 and Unix/Linux
Others: SPUFI, File-AID, Control-M, SoapUI, Postman, RESTful, SVN, GitHub, JMeter, Selenium, Cucumber, SeeTest, DbVisualizer, Eclipse, Jenkins

Professional Experience:

SHELL, St. Louis, Missouri    Jul 2022 - Till Date
Test Lead
- Worked on file-to-table, table-to-table, and table-to-file operations.
- Ran pipelines to load data from SFTP servers to target tables.
- Effectively involved in high-level design and application interface design review and approval, followed by testing of the ETL transformations that facilitate daily, weekly, and monthly data loads.
- Developed complex SQL queries for querying data against different databases during the data verification process.
- Handled SIT/UAT defects efficiently with low turnaround time.
- Worked with different teams (application teams, BST/UAT teams) to resolve UAT defect issues.
- Worked on production tickets and provided go-live support.
- Prepared the test plan and testing strategies for the data warehousing application.
- Maintained good relationships with colleagues, team leads, managers, developers, and DBAs.
- Analyzed the E2E flow (business scenarios, SRS, and HLD requirements).
- Took responsibility for mentoring the team on automation tools.
- Involved in metadata, data completeness, data transformation, and data quality testing, with extensive SQL querying on Staging, Data
  warehouse and Data Marts.

Environment: Agile methodology, JIRA, MS Azure, ADF, ADB, Azure Functions, MS Office, Windows 10, microservices, Postman, JSON, Oracle DB, SQL Server DB, GitHub

Project: Emerald (MTSP)
The Retail Transaction Settlement Platform (RTSP) is the platform that currently manages the reconciliation and settlement processes for card and digital payments (all payments that are not cash or cash equivalents). RTSP is based on a core vendor (HPS) application called PowerCARD v3.5 and is run and managed by a third-party vendor, DXC. Shell has a contract with DXC to provide the service as a turnkey solution, including design, development, certification, and operation of the service; DXC owns the platform solution and intellectual property. The platform is provided by DXC and is based on the HPS PowerCARD application, integrated with DXC services that provide a secure, PCI access-controlled environment, file batch workflows, reporting, and digital documentation including invoices (uploaded by Shell in GSAP). The solution is business critical, with prime support, disaster recovery, and 24x7 SLAs; compliance includes the PCI Data Security Standard, data privacy, SOX (Sarbanes-Oxley), and SOC 2. RTSP currently provides settlement reimbursement functionality for retailers and acquirers in 23 countries including the US (Russia is out of scope). The application and data are hosted in the vendor's on-premises data center in the United States.

NORDEA, Capgemini, Bengaluru    Aug 2017 - Jul 2022
Sr.
QA Engineer
- Worked with the Scrum Master on sprint planning and on estimating user stories and story points for each sprint.
- Worked extensively on data validation of structured and semi-structured data and automated the validation process.
- Introduced deliberate failures in record formats to check that ETL batches fail with the correct error, or log the error, as expected per the design.
- Prepared test data and uploaded it to external Azure Blob Storage.
- Ran the Azure Data Factory pipeline to completion without failures.
- Validated the transformation and production of staging data, and verified the data against mapping rules and logic.
- Verified the flow of data end to end and verified the data loaded into target tables.
- Verified the aggregation and reconciliation of the loaded data.
- Created negative data to fail the pipeline and communicated with the dev team to provide a fix.
- Reviewed data models, data mappings, and architectural documentation to create and execute effective SIT test plans and SIT test cases.
- Verified the reports produced from the target database against the input feed to check accuracy and correctness.
- Executed SQL queries to perform backend database testing and to verify/modify user data.
- Worked on multiple-source data extraction and data transition from the existing production system to the test environment.
- Participated in documenting defects, resolving defects, and coordinating with the UAT team to obtain sign-off for the application/project.

Environment: Agile methodology, MS Azure, MS Office, Windows 10, microservices, Postman, JSON, Oracle DB, SQL Server DB, GitHub, Selenium, Eclipse

Project: SWISH
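The source-to-target reconciliation checks described in this role (row counts and aggregates verified after a load) can be sketched as follows. This is purely an illustration, not the actual project code: the table and column names are hypothetical, and in-memory SQLite stands in for the real source and target databases.

```python
# Illustrative sketch of ETL reconciliation testing: compare row counts and an
# amount checksum between a source system and the loaded target table.
# Table/column names are hypothetical; sqlite3 stands in for the real DBs.
import sqlite3

def reconcile(src_conn, tgt_conn, table, amount_col):
    """Return a comparison of row counts and a SUM-based checksum."""
    src_count = src_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt_count = tgt_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    src_sum = src_conn.execute(
        f"SELECT COALESCE(SUM({amount_col}), 0) FROM {table}").fetchone()[0]
    tgt_sum = tgt_conn.execute(
        f"SELECT COALESCE(SUM({amount_col}), 0) FROM {table}").fetchone()[0]
    return {
        "row_count_match": src_count == tgt_count,
        "checksum_match": src_sum == tgt_sum,
        "src_count": src_count,
        "tgt_count": tgt_count,
    }

# Simulate a source extract and a target load where one row was dropped,
# the kind of defect these checks are meant to surface.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE settlements (id INTEGER, amount REAL)")
src.executemany("INSERT INTO settlements VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])
tgt.executemany("INSERT INTO settlements VALUES (?, ?)",
                [(1, 10.0), (2, 20.0)])  # row 3 missing

result = reconcile(src, tgt, "settlements", "amount")
print(result["row_count_match"], result["checksum_match"])  # False False
```

In practice such a check would run after each pipeline execution, with a deliberately corrupted input (the "negative data" mentioned above) expected to make it fail.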
Associated Bank Corp, Capgemini, Green Bay, Wisconsin    May 2017 - Jul 2017
Sr. QA Engineer
- Tested teller operations on a daily basis using the teller machine.
- Tested banking apps on real iOS and Android devices.
- Tested whether customers can easily log in to their accounts via ID/password and, if they forget their credentials, whether the restoration process is hassle-free.
- Verified whether users can complete transactions securely via card details or bank account transfer.
- Tested whether users can check their available balance across multiple accounts.
- Verified security features, such as whether the banking application blocks the account after three wrong login attempts.
- Tried creating a new account with wrong credentials to confirm the banking application rejects it.
- Checked whether data updates are reflected in the database.
- Tested whether users receive notifications from the bank about transactions, such as alerts, debits, and credits.

Environment: Agile methodology, Mainframe, MS Office, teller machine, Java, Selenium

Project: AllPro
AllPro is a set of banker applications developed by ARGO.
Associated Banc-Corp has incorporated AllPro in its branches to manage all branch-office and back-office operations. The main applications include TellerPRO, SalesPRO, CarePRO, LendingPRO+, and ServicePRO.

Associated Bank Corp, Capgemini, Bengaluru    Aug 2015 - Apr 2017
Sr. QA Engineer
- Analyzed the type and nature of the data and the business requirements.
- Worked through the Informatica mappings to understand the business logic and test it.
- Involved in running Informatica workflows to load the data to the target.
- Wrote complex queries to validate the data while loading from the source to the target database.
- Analyzed business requirements and provided accurate estimates for test conditions, planning, and execution efforts.
- Verified the data in the target database after the ETL process.
- Responsible for identifying test data and obtaining it from different source systems.
- Responsible for estimations, test planning, design, project execution, defect tracking, and status reporting.
- Implemented data reconciliation to verify that the data loaded accurately.
- Prepared a traceability matrix to ensure test case coverage.
- Involved in the creation of the test plan/approach.
- Worked in HP ALM to create and document test plans and test cases and to register expected results.

Environment: Agile methodology, Mainframe, Informatica, MS Office, Windows 10, microservices, Postman, JSON, Oracle DB, SQL Server DB, GitHub, Selenium, Eclipse

Project: MLSI
MLSI (Mortgage Loan Servicing Interface). The Mortgage Cadence re-implementation requires the adaptation of a number of data extracts that currently pull data for various purposes. It is understood and documented here that all extracts currently in use will, at a minimum, be adapted to provide data from ELC that is equivalent to what is extracted from Mortgage Cadence today. Adaptation of the extracts needs to take into account differences between legacy Mortgage Cadence and ELC so that these differences are transparent to the users of the extracted data.
The business requirements around specific data extracts relate to changes to correct current issues with the extracts; changes or additions that would enhance the work process to a degree that may be supported by cost/benefit analysis, if warranted; and changes or additions needed to support new kinds of products that have been originated on legacy Mortgage Cadence in the past.

Associated Bank Corp, Capgemini, Bengaluru    Nov 2013 - Jul 2015
Sr. QA Engineer
- Analyzed the type and nature of the data and the business requirements.
- Worked through the Informatica mappings to understand the business logic and test it.
- Involved in running Informatica workflows to load the data to the target.
- Wrote complex queries to validate the data while loading from the source to the target database.
- Analyzed business requirements and provided accurate estimates for test conditions, planning, and execution efforts.
- Verified the data in the target database after the ETL process.
- Responsible for identifying test data and obtaining it from different source systems.
- Responsible for estimations, test planning, design, project execution, defect tracking, and status reporting.
- Implemented data reconciliation to verify that the data loaded accurately.
- Prepared a traceability matrix to ensure test case coverage.
- Involved in the creation of the test plan/approach.
- Worked in HP ALM to create and document test plans and test cases and to register expected results.

Environment: Agile methodology, Mainframe, MS Office, teller machine, Java, Selenium

Project: Corillian
Corillian is a new generation of online banking solutions provided by Fiserv. The solution is a seamless experience that deeply integrates online banking, payments, and personal financial management (PFM) tools into the same interface. Associated Banc-Corp has incorporated this solution into its banking business to provide great features to its customers.

AT&T, IBM, Bengaluru    Nov 2011 - Oct 2013
Sr.
QA Engineer
- Created the CBIFT integration test plan and conducted the test plan walkthrough.
- Involved in the creation of test cases and provided appropriate test case walkthroughs to the client.
- Prepared test data by creating telephones and adding plans before the set-up cycle.
- Verified that deliverables from all impacted systems were received and that the corresponding data was loaded into the CBIFT environment.
- Collected stops and holds, including reason and time frame, and communicated them to the CBIFT execution team.
- Ensured project-specific modifications were made to the CBIFT platform.
- Executed the mainframe job flow from the Control-M scheduler.
- Supported test case validation and certification as defined by the project team.
- Provided the cycle schedule and testing milestones to the project team.

Environment: IBM Mainframes, MS Office, Windows 10, DB2, IMS, JCL, COBOL, VSAM, Control-M

Project: CBIFT
CBIFT provides an end-to-end platform for testing the processes used to deliver services to external customers in an integrated, production-like business environment. This is where the billing software code for the entire configuration is first assembled as a release and is tested end to end. Both system and business functions are tested in the same environment by project and release. CBIFT also performs end-to-end regression testing in its environments simultaneously with enhancement and maintenance testing.

Higher Education:
Bachelor of Engineering in Industrial Engineering & Management from Sri Siddhartha Institute of Technology, Tumkur, Karnataka, India (affiliated to VTU, Belgaum).
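As an illustration of the negative-testing approach used in the banking engagements above (for example, verifying that the application blocks an account after three wrong login attempts), the check can be sketched as a small automated test. The `Account` class here is a hypothetical stand-in for the system under test, not code from any of the projects.

```python
# Hypothetical stand-in for the banking application's login logic, used only
# to illustrate the three-failed-attempts lockout test described above.
class Account:
    MAX_ATTEMPTS = 3  # assumed policy from the test description

    def __init__(self, password):
        self._password = password
        self.failed_attempts = 0
        self.locked = False

    def login(self, password):
        if self.locked:
            return "locked"
        if password == self._password:
            self.failed_attempts = 0
            return "ok"
        self.failed_attempts += 1
        if self.failed_attempts >= self.MAX_ATTEMPTS:
            self.locked = True
        return "denied"

# Test-case walkthrough: three wrong passwords must lock the account,
# after which even the correct password is rejected.
acct = Account("s3cret")
results = [acct.login("bad") for _ in range(3)] + [acct.login("s3cret")]
print(results)  # ['denied', 'denied', 'denied', 'locked']
```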