Candidate Information
Title: Data Warehouse Warehousing
Target Location: US-TX-Plano
Candidate's Name
Phone: PHONE NUMBER AVAILABLE
Email: EMAIL AVAILABLE

SUMMARY:

ETL Informatica expert with 6+ years of IT experience in the analysis, design, development, implementation, and troubleshooting of data warehouse applications.
Experienced in Informatica PowerCenter 10.2, 9.6.1, and 9.1, with a strong ETL background for data warehousing using Informatica.
Experienced in developing ETL applications and statistical analysis of data on Oracle 11g/12c, Teradata 16, PostgreSQL, and SQL Server 2016 databases.
Experience integrating data to and from on-premises and cloud-based databases using Informatica Intelligent Cloud Services (IICS).
Worked on several CDW (cloud data warehousing) projects.
Developed code for frameworks such as NAS to GCS, GCS to GCS, GCS to ACQ, and ACQ to Integration Layer; used Python scripts for job execution.
Experience monitoring task flows and workflows in Informatica IICS.
Strong experience in dimensional modeling using star and snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using Erwin and ER/Studio.
Good understanding of the data warehouse project development life cycle; expertise in documenting all phases of DWH projects.
Able to write and interpret complex SQL statements; strong in SQL optimization and performance tuning.
Excellent knowledge in identifying performance bottlenecks and tuning Informatica loads for better performance and efficiency.
Moderate experience implementing CDC using Informatica PowerCenter.
Extensively worked with Informatica PowerCenter transformations such as Expression, Joiner, Sorter, Filter, and Router, among others, as required.
Good grasp of data warehouse concepts such as dimension tables, fact tables, slowly changing dimensions, data marts, and dimensional modeling schemas.
Experience in data modeling: dimensional modeling, E-R modeling, and OLTP and OLAP in data analysis; very familiar with SCD Type 1 and SCD Type 2 in snowflake and star schemas.
Strong experience in extraction, transformation, and loading (ETL) of data from various sources into data marts and data warehouses using Informatica PowerCenter components.
Strong experience developing sessions/tasks, worklets, and workflows using the Workflow Manager tools: Task Developer and Workflow & Worklet Designer.
Experience in debugging and performance tuning of targets, sources, mappings, and sessions in Informatica.
Experience in tuning and scaling SQL for better performance by running explain plans and using approaches such as hints and bulk loads.
Experience in performance tuning of ETL processes using pushdown optimization and other techniques; reduced execution time for huge data volumes on a company merger project.
Experience writing UNIX shell scripts and automating the ETL process for error handling and auditing purposes.
Experienced with Control-M for developing, scheduling, and triggering jobs.
Expertise in the Agile software development process.
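As an illustration of the explain-plan and hint-based SQL tuning mentioned above, here is a minimal sketch assuming an Oracle environment; all table and column names are hypothetical:

```sql
-- Inspect the optimizer plan for a candidate query before tuning (Oracle).
EXPLAIN PLAN FOR
SELECT c.customer_id, SUM(f.txn_amount) AS total_amount
FROM   fact_transactions f
JOIN   dim_customer c ON c.customer_key = f.customer_key
GROUP  BY c.customer_id;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- Bulk-load pattern: a direct-path insert (APPEND hint) is typically much
-- faster than conventional row-by-row inserts for large staging volumes.
INSERT /*+ APPEND */ INTO stg_transactions
SELECT * FROM ext_transactions_landing;
COMMIT;
```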
EDUCATION:

Master's in Computer Science, Osmania University, India (2004)

TECHNICAL SKILLS:

ETL: Informatica PowerCenter 10.x/9.x/8.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Workflow Designer, and Informatica Services), IICS (Informatica Cloud)
Scheduling Tools: Control-M, Informatica Scheduler
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005, DB2 v8.1, Teradata, Snowflake
Languages: SQL, PL/SQL, UNIX shell scripts
OS: Windows, UNIX
Other Tools: Notepad++, Toad, SQL Navigator, Teradata SQL Assistant, JIRA

PROFESSIONAL EXPERIENCE:

Client: DTCC, Coppell, TX
Role: ETL Informatica Developer
Duration: Sep 2023 to Present

About Client: DTCC safeguards the financial markets and helps them run efficiently, in times of prosperity and crisis. It is uniquely positioned at the center of global trading activity, processing over 100 million financial transactions every day, pioneering industry-wide post-trade solutions, and maintaining multiple data and operating centers worldwide.

Responsibilities:
Utilize data integration tools and work with large volumes of structured and unstructured data to provide extract, transform, and load (ETL) design, development, testing, and implementation using Teradata, Vertica, and SQL, applying in-depth knowledge of software development, business intelligence, and data warehousing.
Analyze Informatica (extract, load, and transformation) mapping documents and functional specifications.
Design and execute test scripts and test plans for unit testing, system integration, regression, and user acceptance testing.
Work with the project and development teams on system enhancements and drive remediation of software defects and of configuration and performance issues, stabilizing production processing and data quality.
Participate in application SCRUM teams on implementation of system changes to support large project initiatives and deployments.
Provide on-call support for the legacy data warehouse, covering Informatica, Teradata, UNIX, and Hadoop, among other technologies.
Migrate code from legacy environments to support business needs.
Worked with the Informatica PowerCenter tools: Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets & Transformations), Repository Manager, Workflow Manager, and Workflow Monitor.
Used PowerCenter Designer to create mappings with Filter, Router, Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator transformations to pipeline data to the data warehouse and data marts.
Wrote shell scripts to run SQL jobs in the background and integrate data from flat files.
Provided effective support in delivering process and product change improvement solutions.
Used exception handling extensively for ease of debugging and for displaying error messages in the application.
Scheduled workflows in Control-M, developing, scheduling, and triggering jobs; also raised FDRs for creating workspace on the pre-prod server through the respective Control-M jobs.

Environment: Informatica PowerCenter 10.2, GCS, IICS, Oracle 12c, PostgreSQL, UNIX shell scripts, UNIX, Erwin 9.3, Control-M, Teradata 16
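One common Oracle pattern for the exception handling and error reporting described above is DML error logging, which captures rejected rows instead of failing the whole load. This is a minimal sketch; the table names are hypothetical:

```sql
-- One-time setup: create the error-logging table (ERR$_FACT_TRADES by default).
BEGIN
  DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'FACT_TRADES');
END;
/

-- Load staged rows; rows that violate constraints land in ERR$_FACT_TRADES
-- together with the Oracle error message, and the rest of the load continues.
INSERT INTO fact_trades (trade_id, account_key, trade_date, amount)
SELECT trade_id, account_key, trade_date, amount
FROM   stg_trades
LOG ERRORS INTO err$_fact_trades ('DAILY_LOAD') REJECT LIMIT UNLIMITED;
```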
Client: Charles Schwab, Westlake, TX
Role: ETL Informatica Developer
Duration: Mar 2022 to Aug 2023

About Client: Charles Schwab has embarked on a strategic initiative to build an enterprise-wide data warehouse that will enable fact-based decision-making. The data warehouse will serve as a firm foundation for subsequent business intelligence and data analytics. The subject area covered in this phase is Banking & Sales Analytics. The process loads banking operational details and sales details, which are used by the decision support system for analysis.

Responsibilities:
Involved in all phases of the SDLC: requirement gathering, design, development, unit testing, UAT, production roll-out, and production support.
Interacted with business representatives to understand the requirements and determine the best approach for timely delivery of information.
Performed source-system data analysis per the business requirements; consolidated data residing in heterogeneous data sources into the target enterprise data warehouse database.
Developed mappings, sessions, workflows, and shell scripts to extract, validate, and transform data according to the business rules.
Developed mappings using Source, Target, Lookup, Expression, and Filter transformations in IICS.
Worked in IICS with bundles, integration templates, mappings, task flows, saved queries, input and output macro fields, mapping configuration tasks, and cascading filters.
Performed data synchronization tasks for incremental loads, data replication tasks, mappings and task flows, and customized tasks, assignments, and decision tasks.
Extensively worked with GCS (Google Cloud Storage).
Used flat files as sources and loaded them into GCS.
Created the steps for moving data from NAS to the data warehouse: NAS to GCS, GCS to GCS, GCS to ACQ, and ACQ to the integration server.
Migrated code (task flows, parameters, BigQuery DDLs) from the Dev environment to the QA, Preprod, and Prod environments.
Integrated the task flows into Control-M, configured the required jobs, and scheduled them for execution.
Worked closely with the business and IT teams to troubleshoot and remediate system issues and ensure solution plans were implemented.
Designed and executed test scripts and test plans for unit testing, system integration, regression, and user acceptance testing.
Coordinated with the offshore team on a daily basis and provided updates.
Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations, and fixing defects.
Tuned the mappings based on criteria, creating partitions in case of performance issues.
Performed data validation after successful end-to-end tests and implemented appropriate error handling in ETL processes.
Resolved tickets raised by the QA team based on priority levels.
Developed parameter files for passing values to the mappings as required.
Scheduled batches and sessions within Informatica using the Informatica scheduler and customized pre-written shell scripts for job scheduling.

Environment: Informatica PowerCenter 10.2, GCS, IICS, Oracle 12c, UNIX shell scripts, UNIX, Erwin 9.3, Control-M
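A minimal sketch of the kind of BigQuery DDL involved in promoting GCS-staged files into an acquisition-layer table, as referenced in the migration work above; the dataset, table, column, and bucket names are hypothetical:

```sql
-- Acquisition-layer table in BigQuery, partitioned by load timestamp.
CREATE TABLE IF NOT EXISTS acq_layer.sales_daily (
  sale_id      STRING,
  branch_code  STRING,
  sale_amount  NUMERIC,
  sale_date    DATE,
  load_ts      TIMESTAMP
)
PARTITION BY DATE(load_ts);

-- External table over CSV files already staged in GCS, queried for
-- validation before rows are promoted into the acquisition layer.
CREATE OR REPLACE EXTERNAL TABLE acq_layer.sales_daily_ext (
  sale_id      STRING,
  branch_code  STRING,
  sale_amount  NUMERIC,
  sale_date    DATE
)
OPTIONS (
  format = 'CSV',
  uris = ['gs://example-bucket/acq/sales/*.csv'],
  skip_leading_rows = 1
);
```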
Client: Humana Inc., Dallas, TX
Role: ETL Informatica Developer
Duration: Mar 2020 to Mar 2022

About Client: Humana Inc. operates as a health and well-being company in the United States. The company offers medical and supplemental benefit plans to individuals. It also has a contract with the Centers for Medicare and Medicaid Services to administer the Limited Income Newly Eligible Transition prescription drug plan program, and it contracts with various states to provide Medicaid, dual-eligible, and long-term support services benefits.

Responsibilities:
Involved in all phases of the SDLC, from requirement gathering, design, development, and testing to production, user training, and production support.
Actively interacted with business users to record user requirements and perform business analysis.
Outlined the complete process flow and documented the data conversion, integration, and load mechanisms to verify specifications for the project.
Parsed high-level design specs into simple ETL coding and mapping standards.
Worked with the PowerCenter Designer tools to develop mappings that extract and load data from flat files and a SQL Server database.
Maintained warehouse metadata, naming standards, and warehouse standards for future application development.
Created the design and technical specifications for the project's ETL process.
Used Informatica as the ETL tool to create source/target definitions, mappings, and sessions to extract, transform, and load data into staging tables from various sources.
Responsible for mapping and transforming existing feeds into the new data structures and standards using Router, connected and unconnected Lookup, Expression, Aggregator, Update Strategy, and Stored Procedure transformations.
Worked on various complex mappings; designed slowly changing dimensions of Type 1 and Type 2.
Performed tuning of the process at the mapping, session, source, and target levels.
Created workflows containing command, email, session, decision, and a wide variety of other tasks.
Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations, and fixing defects.
Tuned the mappings based on criteria, creating partitions in case of performance issues.
Performed data validation after successful end-to-end tests and implemented appropriate error handling in ETL processes.
Resolved tickets raised by the QA team based on priority levels.
Developed parameter files for passing values to the mappings as required.
Scheduled batches and sessions within Informatica using the Informatica scheduler and customized pre-written shell scripts for job scheduling.

Environment: Informatica PowerCenter 10.2, Oracle 12c, PostgreSQL, UNIX shell scripts, UNIX, Erwin 9.3, Control-M, Teradata 16
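A minimal SQL sketch of the Type 2 slowly changing dimension pattern referenced above (close out the current row, then insert the new version), assuming Oracle-style SQL; the dimension, staging table, and column names are hypothetical, and NULL-safe comparisons are omitted for brevity:

```sql
-- Step 1: expire the current version when a tracked attribute has changed.
UPDATE dim_member d
SET    d.current_flag = 'N',
       d.effective_end_dt = CURRENT_DATE - 1
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_member s
               WHERE  s.member_id = d.member_id
               AND   (s.plan_code <> d.plan_code OR s.address <> d.address));

-- Step 2: insert a new current version for changed or brand-new members.
INSERT INTO dim_member (member_id, plan_code, address,
                        effective_start_dt, effective_end_dt, current_flag)
SELECT s.member_id, s.plan_code, s.address,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_member s
LEFT JOIN dim_member d
       ON d.member_id = s.member_id AND d.current_flag = 'Y'
WHERE  d.member_id IS NULL
   OR  s.plan_code <> d.plan_code
   OR  s.address   <> d.address;
```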
Client: Liberty Holdings Limited, Plano, TX
Role: Informatica Developer
Duration: Mar 2017 to Feb 2020

About Client: Liberty Holdings Limited, a financial services company, engages in the insurance business. The company offers a wide range of insurance products and services, including personal automobile, homeowners, workers' compensation, commercial multiple peril, and commercial automobile coverage. The objective of this project was to integrate customer data from multiple source systems into a data mart and support stakeholders in making better decisions by bringing their data together.

Responsibilities:
Involved in all phases of the SDLC, from requirement gathering, design, development, testing, and UAT to deployment in the production environment.
Actively interacted with business users to record user requirements.
Involved in analysis, profiling, and cleansing of source data and in understanding the business process.
Translated requirements into business rules and made recommendations for innovative IT solutions.
Outlined the complete process flow and documented the data conversion, integration, and load mechanisms to verify specifications for this data migration project.
Documented data mapping and ETL specifications for development from source to target.
Implemented change data capture (CDC) while integrating the enterprise data sources.
Parsed high-level design specs into simple ETL coding and mapping standards.
Worked with the PowerCenter Designer tools to develop mappings and mapplets that extract and load data from flat files and an Oracle database.
Maintained naming standards and warehouse standards for future application development.
Created the design and technical specifications for the project's ETL process.
Used Informatica as the ETL tool to create source/target definitions, mappings, and sessions to extract, transform, and load data into staging tables from various sources.
Responsible for mapping and transforming existing feeds into the new data structures and standards using Router, connected and unconnected Lookup, Expression, Aggregator, Update Strategy, and Stored Procedure transformations.
Performed tuning of the process at the mapping, session, source, and target levels.
Built exception-handling mappings for data quality, data cleansing, and data validation.
Created workflows containing command, email, session, decision, and a wide variety of other tasks.
Tuned the mappings based on criteria, creating partitions in case of performance issues.
Tested end to end to verify failures in the mappings using scripts.
Performed data validation after successful end-to-end tests and implemented appropriate error handling in ETL processes.
Resolved tickets raised by the QA team based on priority levels.
Facilitated development of testing procedures, test cases, and user acceptance testing (UAT).
Developed parameter files for passing values to the mappings for each type of client.
Scheduled batches and sessions within Informatica using the Informatica scheduler and wrote shell scripts for job scheduling.

Environment: Informatica PowerCenter 9.6.1, Oracle 11g, MySQL, UNIX, Control-M, JIRA
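A minimal sketch of a timestamp-based incremental extract, one common way to realize the CDC integration mentioned above, shown here in plain SQL rather than as the PowerCenter implementation; the etl_control table and all other names are hypothetical:

```sql
-- Pull only rows changed since the last successful load, using a high-water
-- mark stored in a control table keyed by interface name.
SELECT c.customer_id,
       c.policy_no,
       c.status,
       c.last_update_ts
FROM   src_customer c
WHERE  c.last_update_ts > (SELECT last_extract_ts
                           FROM   etl_control
                           WHERE  interface_name = 'CUSTOMER_DAILY');

-- After the load succeeds, advance the high-water mark for the next run.
UPDATE etl_control
SET    last_extract_ts = (SELECT MAX(last_update_ts) FROM src_customer)
WHERE  interface_name = 'CUSTOMER_DAILY';
```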
