
ETL Developer Resume - Wylie, TX
Candidate's Name
ETL Developer
Phone: PHONE NUMBER AVAILABLE | Email: EMAIL AVAILABLE
US Citizen
PROFESSIONAL SUMMARY
With over 6 years of experience in the IT industry, I possess a wide range of skills across all
aspects of data warehouse applications, with a primary focus on analysis, design, development,
implementation, and troubleshooting. In particular, I have honed my skills in Informatica
PowerCenter and Informatica Cloud (IICS), an ETL tool, where I have designed and developed ETL
processes and created mappings, sessions, and workflows. I ensure data quality and integrity and
resolve data integration issues as they arise in an Agile environment. In my previous roles, I
successfully delivered two data warehouse projects. Additionally, I have optimized and fine-tuned
SQL queries for enhanced performance while maintaining data quality and consistency throughout the
entire ETL process. I bring a strong ETL background in data warehousing and expertise in working
with databases such as Oracle, SQL Server, Snowflake, and AWS S3. My proficiency in writing and
interpreting complex SQL statements enables me to handle complex data transformations effectively.
Furthermore, I have gained valuable experience in SQL optimization and performance tuning; I am
adept at identifying performance bottlenecks and employing techniques to optimize Informatica
loads for better efficiency and overall performance. I am well-versed in Unix commands and
understand shell scripts and JavaScript.
  Good experience in Informatica PowerCenter 10.4, 9.6.1 along with a strong ETL background.
  Good experience in developing ETL applications and performing statistical analysis of data on Oracle
11g/12c and SQL Server 2017 databases; also 3 years of solid experience in PL/SQL.
  Strong experience in implementing CDC using Informatica Power Center.
  Extensively worked with Informatica PowerCenter transformations such as Expression, Joiner, Sorter,
    Filter, Router, Lookup, Stored Procedure, XML Parser, and other transformations as required.
  Good understanding of data warehouse concepts such as dimension tables, fact tables, slowly changing
    dimensions, data marts, and dimensional modeling schemas.
  Experience in data modeling (dimensional modeling and E-R modeling) and OLTP/OLAP data
    analysis; good understanding of snowflake and star schemas.
  Strong experience in developing sessions/tasks, worklets, and workflows using the Workflow Manager
    tools: Task Developer and Workflow & Worklet Designer.
  Performed tuning of ETL mappings and sessions to improve performance for large volume projects.
  Experience in debugging mappings and identifying bugs in existing mappings by analyzing the data flow
and evaluating transformations.
  Experience in tuning and scaling SQL for better performance by running explain plans and using
    different approaches such as hints and bulk loads.
  Experience in Performance tuning of ETL process using pushdown optimization.
  Good experience in automating ETL processes, including error handling and auditing.
  Skilled in creating & maintaining ETL Specification Documents, Use Cases, Source to Target mapping,
  Requirement Traceability Matrix, providing Effort estimates and deployment artifacts.
  Assisted other ETL developers in solving complex scenarios and coordinated with source system owners.
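
 As a concrete illustration of the SQL tuning approach described above, a minimal Oracle-style sketch (all table, column, and hint choices here are hypothetical, for illustration only) might look like:

```sql
-- Record the optimizer's plan for a slow query, then display it.
EXPLAIN PLAN FOR
SELECT c.customer_id, SUM(o.amount)
FROM   orders o
JOIN   customers c ON c.customer_id = o.customer_id
WHERE  o.order_date >= DATE '2021-01-01'
GROUP  BY c.customer_id;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- If the plan shows a poor join order or method, a hint is one way to steer it:
SELECT /*+ LEADING(o) USE_HASH(c) */ c.customer_id, SUM(o.amount)
FROM   orders o
JOIN   customers c ON c.customer_id = o.customer_id
WHERE  o.order_date >= DATE '2021-01-01'
GROUP  BY c.customer_id;
```

 DBMS_XPLAN.DISPLAY shows the plan recorded by EXPLAIN PLAN, and hints such as LEADING and USE_HASH are typically a last resort after statistics and indexing have been addressed.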
 EDUCATION

       Bachelor of Electronics and Communications from The Institution of Engineers (India) (IEI)


  TECHNICAL SKILLS


     ETL: Informatica PowerCenter 10.5/10.2/9.6.1, Informatica Cloud (IICS)
     RDBMS: Oracle 18c/12c, SQL Server 2019/2017, MySQL, PL/SQL, Snowflake, AWS S3 buckets
     Tools: SQL Navigator, JIRA, Rally, AutoSys, Control-M, Jenkins, GitHub, Notepad++, Toad


 WORK EXPERIENCE

 HUMANA, IRVING, TX DURATION: JAN 2022 TO CURRENT

  ROLE: ETL INFORMATICA DEVELOPER HUM EDW
Humana operates as a health and well-being company in the United States. The company offers medical and
supplemental benefit plans to individuals. It also has a contract with the Centers for Medicare and Medicaid
Services to administer the Limited Income Newly Eligible Transition prescription drug plan program, and it
contracts with various states to provide Medicaid, dual-eligible, and long-term support services benefits.
Responsibilities:
      Worked with BA on requirement gathering, understanding current business flow and scenarios.
      Created Technical Design Document from BDR and analyzed system requirements to identify system
    impacts.
      Performed data analysis and data profiling and created a data dictionary, using 100s of SQL queries to
    perform these activities.
      Prepared source to target mapping and conducted meetings with the business to understand data
    transformation logic.
      Designed, built, and maintained mappings, sessions, and workflows for the target load process.
      Designed mappings for Slowly Changing Dimensions (Type 1, Type 2 and more), used Lookup (connected
    and unconnected), Update strategy and filter transformations for loading source data.
      Extensively used transformations like router, aggregator, lookup, source qualifier, joiner, expression, and
    sequence generator transformations in extracting data in compliance with the business logic developed.
      Wrote SQL overrides in source qualifier to filter data according to business requirements.
      Analyzed the existing mapping logic to determine the reusability of the code.
      Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.
      Performed extensive performance tuning by determining bottlenecks at various points such as targets,
    sources, mappings, sessions, or the system, which led to better session performance.
      Created Unit test plans and did unit testing using different scenarios separately for every process.
    Involved in System test, Regression test and supported the UAT for the client.
      Performed ETL and database code migrations across environments using deployment groups.
      Populated the target tables using mappings that implement the business rules.
      Created different parameter files and started sessions using these parameter files using PMCMD
    command to change session parameters, mapping parameters, and variables at runtime.
      Involved in end-to-end integration and performance testing and verified through SQL query.
      Worked on slow-running SQL from the production environment; rewrote the SQL queries and fixed the issues.
      Tuned the mappings by removing the Source/Target bottlenecks to improve the throughput of the data
     loads.
      Worked on SnowSQL and Snowpipe; created a Snowpipe for continuous data loads from AWS S3.
      Created external and internal stages and transformed data during the load.
      Built ETL pipelines into and out of the data warehouse using Snowflake SnowSQL, writing SQL queries.
      Bulk loaded data from the external stage (AWS S3) and internal stage into the Snowflake cloud using the
    COPY command.
      Used COPY, PUT, GET, and LIST commands for validating the internal files.
      Used import and export between the internal stage (Snowflake) and the external stage (AWS S3).
      Pulled data into Power BI from various sources such as Oracle, Excel, SQL Server, Snowflake, and AWS S3
    cloud.
      Loaded data and files using ETL (IICS) from Oracle to Snowflake and AWS S3.
      Created dashboards and visualizations for KPIs and reports in Power BI.
      Created batch scripts for automated database build deployment.
      Worked on the ETL code migration process from DEV to QA and to UAT using a CI/CD pipeline with GitHub.
      Worked with the offshore team on a daily basis to ensure commitments were met.
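
 The Snowflake staging and Snowpipe work above can be sketched as follows (bucket path, credentials, and stage/table/pipe names are hypothetical placeholders):

```sql
-- External stage pointing at an S3 location (credentials elided).
CREATE STAGE claims_s3_stage
  URL = 's3://my-bucket/claims/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- One-off bulk load from the external stage into a raw table.
COPY INTO claims_raw FROM @claims_s3_stage;

-- Continuous ingestion: the pipe re-runs the COPY as new files land in S3.
CREATE PIPE claims_pipe AUTO_INGEST = TRUE AS
  COPY INTO claims_raw FROM @claims_s3_stage;
```

 With AUTO_INGEST, S3 event notifications trigger the pipe, so new files are loaded without a manually scheduled COPY.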

 Environment: Informatica PowerCenter 10.5, IICS, Oracle 18c, SQL, Snowflake, AWS S3, PL/SQL, Unix,
 AutoSys, Putty, WinSCP, Confluence, GitHub, Power BI


 CLIENT: METLIFE, INC., LONGWOOD, FL DURATION: MAR 2019 TO DEC 2021
 ROLE: INFORMATICA DEVELOPER MEDW RELEASE 1.7
MetLife, Inc. engages in the insurance, annuities, employee benefits, and asset management businesses
worldwide. The company offers life, dental, group short-and long-term disability, individual disability, accidental
death and dismemberment, vision, and accident and health coverages. As part of this project, data is
collected from all servicing organizations' application systems and sources such as Oracle databases, flat
files, and mainframe files. Data is migrated from these various sources into an ingested staging table and
then pushed to the target system; data is cleansed and transformed based on the business rules through the
Informatica ETL tool.
 Responsibilities:
      Worked with the business team to gather requirements for projects and created strategies to manage the
         requirements.
      Documented Data Mappings/ Transformations as per the business requirement.
      Worked on project documentation which included the Technical and ETL Specification documents.
      Created mappings using Designer and extracted data from various sources, transformed data according to
    the requirement.
      Involved in extracting the data from the Flat Files and Relational databases into staging area.
      Migrated mappings, sessions, and workflows from the Development environment to Test and then to UAT.
      Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source
     filter usage in Source qualifiers, and data flow management into multiple targets using Router.
      Created Sessions and extracted data from various sources, transformed data according to the
    requirement and loading it into data warehouse.
      Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner,
    Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
      Imported various heterogeneous files using Informatica Power Center 9.x Source Analyzer.
      Developed several reusable transformations and mapplets that were used in other mappings.
      Prepared Technical Design documents and Test cases.
      Extensively worked on Unit testing for the Informatica code using SQL Queries.
      Implemented various Performance Tuning techniques.
      Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Joiner,
    Update Strategy, Lookup, Sorter, Expression, Router, Filter, Aggregator and Sequence Generator
    transformations.
      Wrote numerous SQL queries to view and validate the data loaded into the warehouse.
      Developed transformation logic as per the requirement, created mappings and loaded data into respective
    targets.
      Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
      Used PMCMD command to start, stop and ping server from UNIX and created UNIX Shell scripts to
    automate the process.
      Performed performance tuning at the mapping and session levels.
      Developed parameter files for passing values to the mappings for each type of client.
      Scheduled batch and sessions within Informatica using Informatica scheduler.
      Worked with UNIX shell scripts for job execution and automation.
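
The parameter files mentioned above follow Informatica's plain-text format; a hypothetical example (the folder, workflow, and parameter names here are invented for illustration) could look like:

```
[DW_Clients.WF:wf_load_clients]
$$ClientCode=ACME
$$LoadDate=2021-06-30
$DBConnection_Source=Conn_Oracle_Dev
```

A workflow can then be started against this file with pmcmd, e.g. `pmcmd startworkflow -sv <integration_service> -d <domain> -f DW_Clients -paramfile /path/clients_acme.par wf_load_clients`, so mapping and session parameters are resolved at runtime rather than hard-coded.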
Environment: Informatica PowerCenter 10.2, SQL Server 2017, Oracle 12c, SQL, Unix, Putty, WinSCP,
Confluence, GitHub, Control M, JIRA


 TD BANK, MT LAUREL TOWNSHIP, NJ DURATION: JAN 2018 TO MAR 2019
 ROLE: INFORMATICA DEVELOPER TD EDW
TD Bank is an American national bank and the United States subsidiary of the multinational TD Bank Group.
This project integrates data from multiple applications into an enterprise data warehouse to build an
analytics solution for better reporting.
Responsibilities:
      Coordinated with various business users, stakeholders, and SMEs to get functional expertise, review the
    design and business test scenarios, participate in UAT, and validate data from multiple sources.
      Analyzed source data and gathered business requirements from the business users and the source data team.
      Analyzed business and system requirements to identify system impacts.
      Created the Detail Technical Design Documents which have the ETL technical specifications for the given
        functionality.
      Prepared source to target mapping and conducted meetings with the business to understand data
        transformation logic.
      Experienced in using Informatica for data profiling and data cleansing, applying rules and
        developing mappings to move data from source to target systems.
      Worked on complex mappings which involved slowly changing dimensions.
      Optimized the mappings and implemented complex business rules by creating reusable
        transformations and mapplets.
      Debugged and performance-tuned targets, sources, mappings, and sessions.
      Analyzed the existing mapping logic to determine the reusability of the code.
      Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.
      Performed extensive performance tuning by determining bottlenecks at various points such as targets,
    sources, mappings, sessions, or the system, which led to better session performance.
      Created Unit test plans and did unit testing using different scenarios separately for every process.
      Involved in System test, Regression test and supported the UAT for the client.
      Performed ETL and database code migrations across environments using deployment groups.
      Populated the target tables using mappings that implement the business rules.
      Involved in end-to-end system testing, performance testing and data validations.
      Managed performance and tuning of SQL queries and fixed the slow running queries in production.
      Involved in scheduling using Informatica pre-/post-session operations.
      Improved the performance of ETL mappings and workflows through tuning.
      Created different parameter files and started sessions using these parameter files using PMCMD command
    to change session parameters, mapping parameters, and variables at runtime.
      Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations,
    and fixing bugs.
      Involved in testing the database using complex SQL and handled the performance issues effectively.
      Conducted code walkthroughs with team members.
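
The slowly changing dimension mappings mentioned above typically follow a Type 2 expire-and-insert pattern; a simplified SQL sketch of that logic (table and column names are hypothetical) is:

```sql
-- Step 1: expire current dimension rows whose tracked attributes changed in staging.
UPDATE dim_customer d
SET    d.eff_end_date = SYSDATE, d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    (s.address <> d.address OR s.segment <> d.segment));

-- Step 2: insert a new current version for changed rows and brand-new customers.
INSERT INTO dim_customer (customer_id, address, segment,
                          eff_start_date, eff_end_date, current_flag)
SELECT s.customer_id, s.address, s.segment, SYSDATE, NULL, 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y'
                   AND    d.address = s.address AND d.segment = s.segment);
```

In Informatica this is usually built with Lookup and Update Strategy transformations rather than raw SQL, but the effect on the target table is the same: full history is preserved, with one current row per business key.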
Environment: Informatica PowerCenter 9.6, Oracle 12c, MySQL, UNIX, Control M, JIRA, CICD
