Senior ETL IICS / Informatica Developer
Candidate's Name
Phone: PHONE NUMBER AVAILABLE | LinkedIn: LINKEDIN LINK AVAILABLE | Email: EMAIL AVAILABLE | Location: Frisco, TX, USA

PROFESSIONAL SUMMARY:
- Offering 13+ years of extensive IT expertise as a Senior ETL Lead Developer, with proven abilities in gathering, analyzing, and documenting business requirements, analyzing existing systems, implementing and managing various modules end to end to incorporate all business rules, and supporting production activities of various applications through ETL/ELT tools.
- Progressive experience with Data Warehouse / Data Mart and Business Intelligence solutions, covering Business Requirement Analysis, Planning, Design, Development, Testing, Deployment to Production, and Production Support of Data Warehouse, Data Integration, and Data Migration projects, using data warehouse technologies and ETL tools such as Informatica Power Center, Informatica Intelligent Cloud Services (IICS), SQL, IBM DB2, Oracle, PL/SQL, UNIX shell scripting, ADF, AWS S3, Snowflake, etc.
- Good interpersonal skills and a strong ability to learn and grasp new technologies quickly. Able to work in environments with fluctuating priorities and deadlines, with a dedication to solving complex problems.

EXPERTISE AND SPECIALIZATION:
- Strong experience with Data Integration, Data Warehousing, and Data Migration techniques, using ETL tools such as Informatica Intelligent Cloud Services (IICS) and Informatica PowerCenter PHONE NUMBER AVAILABLE, etc.
- Expertise in integrating and migrating data between on-premise systems/databases and cloud-based systems/databases using IICS.
- Strong working experience in all phases of the ETL development process, including extraction, transformation, and loading of data from various source systems into DWH/DM systems using IICS (Data Integration) and Informatica Power Center.
- Experience implementing complex business rules by creating reusable transformations, mapplets, and complex mappings.
- Good knowledge of and working experience with IICS components such as Data Integration, Administrator, Monitor, Deployments, Schedules, Operational Insights, etc.
- Completed an AI Boot Camp program on Insurance Underwriting and an Azure POC on Customer-360.
- Certified as a SAFe 5 Agilist, with development experience in Agile and Kanban methodologies; implemented tasks/defects tracked in JIRA and Azure Boards.
- Worked in every phase of the BI data warehouse life cycle, including analysis, design, implementation, testing, deployment, documentation, production support, training, and maintenance.
- In-depth knowledge of the Data Warehousing life cycle and dimensional data modeling (star/snowflake schema), including all types of dimensions and facts in a data warehouse/data mart, using Extraction, Transformation, and Loading (ETL) with Informatica Power Center and IICS.
- Expertise in developing mappings, mapplets, sessions, worklets, workflows, and various kinds of tasks using IICS and Informatica Power Center.
- Proficient in setting up and following proper ETL naming standards and best practices throughout the ETL process.
- Expert-level experience in developing, testing, deploying, and maintaining ETL workflows for data sources/targets such as Mainframe, Web Services (WSDL), XML/XSD, OLTP, flat files, IBM DB2, SQL Server, Oracle, Azure SQL Server, Azure SQL Data Warehouse, AWS, Snowflake, etc.
- Extensively worked on Slowly Changing Dimensions (SCD Type 1, SCD Type 2) and Change Data Capture (CDC) mechanisms using Informatica.
- Strong understanding of data warehouse concepts and ETL, with data modeling experience in normalization, business process analysis, re-engineering, dimensional data modeling, and physical and logical data modeling.
- Experience identifying and resolving ETL production issues and their root causes; provided off-hour and weekend production support on a rotational, on-call basis.
- Performance tuning and optimization of workflows and mappings in Informatica using best practices.
- Good experience in data cleansing, data monitoring, data governance, data profiling, and data analysis.
- Very good experience debugging code and optimizing transaction data flow; experienced with performance-optimization techniques in complex code/mappings by identifying source-level, target-level, and mapping-level bottlenecks, and with implementing error-handling techniques in Informatica Power Center and IICS.
- Experience with project scoping, estimation, and production support; worked as a team lead and onsite/offshore coordinator.
- Extensive experience developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server and PL/SQL.
- Experience scheduling Informatica jobs using Control-M, Informatica Scheduler, Autosys, and Tidal Scheduler.
- Strong knowledge of cloud data technologies, with a focus on IICS, Snowflake, and Azure Data Warehouse. Proficient in working with cloud storage solutions such as Amazon S3 and Azure Blob Storage for efficient data storage and retrieval.
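For readers unfamiliar with the Slowly Changing Dimension Type 2 pattern referenced above, the core idea can be sketched in a few lines of Python. This is a minimal, hypothetical illustration (table layout, key names, and the tracked attribute are all assumptions, not this candidate's actual code):

```python
from datetime import date

# Minimal SCD Type 2 sketch: each dimension row carries effective dates
# and a current-row flag. When a tracked attribute changes, the old row
# is expired and a new version is inserted. All names are illustrative.
def apply_scd2(dimension, incoming, today=date(2024, 1, 1)):
    by_key = {r["cust_id"]: r for r in dimension if r["is_current"]}
    for rec in incoming:
        cur = by_key.get(rec["cust_id"])
        if cur is None:
            # brand-new key: insert the first version
            dimension.append({**rec, "eff_from": today,
                              "eff_to": None, "is_current": True})
        elif cur["city"] != rec["city"]:
            # tracked attribute changed: expire old row, add new version
            cur["eff_to"] = today
            cur["is_current"] = False
            dimension.append({**rec, "eff_from": today,
                              "eff_to": None, "is_current": True})
    return dimension

dim = [{"cust_id": 1, "city": "Dallas", "eff_from": date(2020, 1, 1),
        "eff_to": None, "is_current": True}]
apply_scd2(dim, [{"cust_id": 1, "city": "Frisco"}])
# key 1 now has an expired Dallas row and a current Frisco row
```

In Informatica this same logic is typically expressed with a Lookup on the dimension's natural key plus an Update Strategy transformation, rather than hand-written code.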
- Excellent interpersonal skills; a committed, result-oriented team player, hard-working with a zeal to learn new technologies and undertake challenging tasks. Mentored the development team.
- Good experience in change management, incident management, and problem management methodologies; participated in the CAB for production deployments, defining the process per ITIL methodologies.
- Exceptional analytical and problem-solving skills; a good team player with excellent interpersonal and communication skills.
- Solid experience with the Agile (Scrum) methodology: participated in daily scrum meetings and was actively involved in sprint planning and product backlog creation.

TECHNICAL SKILLS
ETL Tools: Informatica Cloud (IICS), IDMC, Informatica Power Center PHONE NUMBER AVAILABLEx/7.x, Informatica IDQ, Azure Databricks, Azure Data Factory, SSIS, AWS Redshift, Azure Synapse
Domain Experience: Healthcare (IP, OP, MH, etc.) & Insurance (Customer, Sales, Non-Life, Claims, NGFM, RS, ILFS, EB, etc.)
RDBMS (Databases): SQL, IBM DB2, MySQL, PL/SQL & Snowflake
Cloud Technologies: IICS, Azure, AWS, Snowflake
Reporting Tools: Power BI, Tableau, Excel Reports
Operating Systems: Windows 98/2000/2003/XP, Unix, and Linux
Methodologies: Agile, Kanban, Waterfall
Languages: SQL, Python (beginner)
Tools & Utilities: Azure Boards, Control-M, JIRA, ServiceNow, SQL Developer, PuTTY, Azure DevOps, Citrix, HP Quality Center, BMC Remedy
EDUCATION
- Bachelor of Technology (Electrical and Electronics Engineering) from JNTUCEA with 78.66%

CERTIFICATIONS
- Certified/Completed IDMC (Informatica Data Management Cloud) Foundation and IDMC Data Engineering Foundation certifications from Informatica
- Industrialized AI Data Scientist, certified through DXC
- Industrialized AI Master, certified through DXC
- Certified as a SAFe 5 Agilist

AWARDS AND ACHIEVEMENTS
- Appreciation certificate awarded for quality output in the DC project within a year of performance in the OneAmerica account
- COA award in 2020 in the OneAmerica account
- DXC Quarterly Champs Award (FY21, FY22)
- EWA Recognition in the OneAmerica project for owning responsibility
- Certificate of Merit, FY18, in the Zurich account
- Received the LDO Award for best project team member of the month in 2012 in the NHS account

PROFESSIONAL/PROJECT EXPERIENCE:

Client: OneAmerica Insurance, IN, USA                                          June 2019 - April 2024
Role: ETL Architect & Senior ETL Developer
Organization: DXC
Project description: OneAmerica Financial Partners, Inc. is a U.S. financial services mutual holding organization with its corporate office at the OneAmerica Tower in Indianapolis, Indiana. Founded in 1877, it is the oldest life insurance company in the USA. American United Life (AUL) Insurance Company, the State Life Insurance Company, OneAmerica Securities Inc., McCready and Keene Inc., Pioneer Mutual Life (PML) Insurance Company, and AUL Reinsurance Management Services LLC are subsidiary companies of OneAmerica.
OneAmerica is among the largest global providers of insurance annuities and employee benefit programs, with millions of customers worldwide. OneAmerica's individual life insurance products and services comprise term life insurance and several types of permanent life insurance, including whole life, universal life, and final expense whole life insurance. OA provides disability products for individuals as well as employee and association groups who receive them through their employer. OA home insurance solutions include homeowner's insurance, condo insurance, renter's insurance, insurance for landlords, and mobile home insurance.
The Fineos Back Office project implementation mainly focuses on bringing data from FINEOS back to OneAmerica to update OneAmerica's financial systems and to ensure correct claims processing through cross-checking. It mainly involves the creation of Disbursement, Writeback, General Ledger, and W/1099 Tax Extract program files using the Fineos extracts. It is one of the most complex Finance projects in the OneAmerica account.
The Data Modernization project is an enhancement project supporting around 32 source systems. It has a cloud-based ELT process, with various streams such as Digital Acceleration, AIM Focus, and ML Platform support. It receives source files in different formats from various admin systems.
The target database is MarkLogic (ML), a NoSQL database server.

Roles and Responsibilities:
- Worked as a Senior ETL developer for the Fineos Back Office project in Disability Claims at OneAmerica.
- Responsibilities included working with Business Analysts and IT Architects on business and technical requirements gathering, analysis, designing and developing ETL processes, scheduling ETL jobs in Control-M, and project coordination.
- Primarily responsible for the ETL process, from analysis, design, development, and testing to execution and monitoring of jobs in Informatica Intelligent Cloud Services (IICS) and Informatica Power Center (IPC).
- Worked on migration of Informatica Power Center code to IICS using the PowerCenter task.
- Experienced in creating various connectors, configuring Secure Agents, and developing various tasks in IICS Data Integration.
- Created numerous mappings, mapping tasks, and task flows based on requirements in CDI in IICS.
- Designed, developed, and tested processes for loading initial and cyclical data into a data warehouse using IICS Data Integration.
- Created multiple task flows for loading and copying data from various source systems using synchronization tasks, mapping tasks, replication tasks, and mass ingestion tasks in IICS.
- Extensively used parameters, partitioning techniques, pushdown optimization, and expression macros in IICS.
- Proficient with Expression, Joiner, Lookup, Filter, Aggregator, Sorter, Router, Sequence Generator, Update Strategy, Union, XML, Normalizer, and SQL transformations in both IICS and IPC.
- Expertise in developing Informatica mappings to extract data from various sources and transform it per business needs, implementing SCD types (Type 1, Type 2), Change Data Capture (CDC), and incremental loading techniques, and loading into data warehouses / data marts using IICS and IPC.
- Created ETL workflows, worklets, sessions, and mappings using Informatica Power Center. Created and processed Disbursement, Writeback, General Ledger, and W2/1099 Tax files for downstream systems using Fineos source extract files per business needs.
- Expertise in developing fully fledged generic mappings and reusable mapplets in Informatica that can be used by multiple processes.
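A common way to realize the incremental-loading technique mentioned in the bullets above is a last-modified watermark: each run pulls only rows changed since the previous successful run and then advances the watermark. A hypothetical sketch in Python (not the project's actual implementation; column and variable names are assumptions):

```python
from datetime import datetime

# Watermark-based incremental extract: select only rows modified after
# the last successful run's watermark, then advance the watermark to the
# newest timestamp seen. All field names are illustrative.
def incremental_extract(source_rows, last_watermark):
    changed = [r for r in source_rows if r["modified_at"] > last_watermark]
    new_watermark = max((r["modified_at"] for r in changed),
                        default=last_watermark)
    return changed, new_watermark

rows = [
    {"id": 1, "modified_at": datetime(2024, 1, 1)},
    {"id": 2, "modified_at": datetime(2024, 1, 5)},
]
delta, wm = incremental_extract(rows, datetime(2024, 1, 2))
# delta contains only id 2; wm advances to 2024-01-05
```

In IICS the same effect is usually achieved with an input parameter or system variable holding the last run time in the mapping task's filter condition.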
- Worked on raw data extraction from RDBMS (SQL, IBM DB2, cloud tables, etc.) and flat files, transforming and loading to the staging area, and then loading data to topics in Cloud Integration Hub (CIH), RDBMS, and flat files using IICS and IPC.
- Proposed/recommended design alternatives to conform to business requirements; interacted directly with business owners/end users for business understanding.
- Good experience in dimensional modeling: identifying dimension and fact tables to store business information and facts/metrics accordingly. Worked with the data modeling team to build conceptual and physical data models.
- Good working experience and knowledge of data monitoring using workflow and session logs in Monitor; analyzed data issues and provided the required updates to business owners so the code could be updated as needed.
- Worked as the single point of contact for all ETL development issues and as coordinator from offshore.
- Worked on documents such as the Technical Design Document (TDD), mapping documents, changelog documents, and unit test documents. Responsible for ETL code development and peer reviews.
- Maintained good working relationships with managers and team members; interacted with clients to understand business requirements and the criticality of development items or issues.
- Adopted an Agile methodology and participated in sprint planning, daily standups, product backlog, and sprint retrospective meetings.
- As a Subject Matter Expert on the team, acted as the single point of contact for all escalations and resolved issues across all integrations.
- Proficient in root cause analysis of reported production defects, fixing them and participating in production MIM calls.
- Proactively identified existing bugs in the system and brought them to the client's attention for planning of appropriate action.
- Participated in knowledge transfer sessions and mentored team members.
- Actively participated in cross-skill trainings given internally within the team.
- Developed metrics that provide data for process management and indicators for future improvement opportunities.

Other Contributions:
- Contributed to a value-add project, Do-CodeS, an automated code validation utility for Informatica PowerCenter. Involved in developing the SQL queries used to create a health check report for all Informatica objects (workflows, worklets, mappings, mapplets, transformations, tasks, etc.) that violate the standards. The health check report also compares expected and existing values.
- Received the COA award in 2020 in the OneAmerica account for a consistent positive attitude, customer commitment, and contributions to the ETL Health Check Engine.
- Received many EWA recognitions for owning project responsibilities, creating clarity, and demonstrating solutions properly.
- Awarded the DXC Quarterly Champs Awards.

Environment: Operating Systems: Windows XP; ETL: Informatica Intelligent Cloud Services (IICS), Informatica Power Center 10.5; Database: SQL Server; Tools: JIRA, Azure Data Studio, SSMS, PowerShell, SQL Developer, Microsoft Power BI, Erwin, Control-M.

Client: FORD, UK                                                               Jan 2019 - June 2019
Role: Senior ETL Lead Developer / Data Migration Engineer
Organization: DXC

Project description: The Ford UK project creates a centralized CRM system to improve business processes and customer experience. B2B CRM brings together data from several Ford systems and external sources, providing a consolidated and consistent view of customer and sales data to Ford's sales and marketing personnel and Ford dealers. It provides technical operations with a single, scalable, centralized system that improves Ford's sales and marketing, integrating all 9 Ford-UK source systems that hold customer and sales information. It involves real-time and near-real-time integration of various data sources for various regions of Europe.

Roles and Responsibilities:
- Created Extract, Transformation, and Load (ETL) real-time processes that manage all data transformation and transfer seamlessly between CRM and various source systems.
- Created Business Process Execution Language (BPEL) processes to receive data (XML BODs) sent from source systems to CRM through their web services; hosted our BPEL process on their SOAP web service.
- Utilized Informatica Process Developer for creating real-time processes.
- Created real-time ETL processes using Informatica Cloud Real Time (ICRT) and Informatica Intelligent Cloud Services (IICS) applications to implement business logic.
- Migrated huge amounts of data from various source systems for all European regions into CRM; scheduled the ETL jobs to run daily.
- Defined BI standards, guidelines, and best practices for business groups and technical teams.
- Designed and developed software solutions for integrations and provided analytics to the organization, using hands-on experience with Microsoft Azure cloud storage and IICS.
- Designed, developed, managed, and updated logical and physical data models for the BI data marts.
- Provided architectural expertise, thought leadership, direction, and assistance across the entire BI organization.
- Coordinated system/integration/UAT testing with other teams involved in the project and reviewed the test strategy.

Environment: Operating Systems: Windows XP; ETL: Informatica Cloud Real Time (ICRT), Informatica Intelligent Cloud Services (IICS); Database: SQL Server, Azure SQL Database; Tools: JIRA, Azure Data Studio, SSMS, PowerShell, SQL Developer, Microsoft Power BI, Control-M.

Client: Zurich Financial Services (ZURICH Insurance Ltd, Germany)               May 2013 - Jan 2019
Role: Senior ETL Developer
Organization: DXC

Project description: The Zurich-EGI-BI team offers BI solutions to all underlying portfolios of General Insurance. We built the central data warehouse (ZDW), an integral part of Zurich's BI application. The ZDW is a central database that stores all incoming data from PSA and other external sources. This data periodically feeds various data marts, which the business then uses to generate reports in MSTR. The different data marts are: DNL (Data Mart Non-Life), DMV (Sales Data Mart), DDA (Data Mart Direct Insurance), and DMK (Customer Cube). This project includes development of new mappings and enhancement of existing ones.

Roles and Responsibilities:
- Worked as an ETL & DWH developer responsible for data processing through ETL, i.e., extraction, transformation, and loading, with RDBMS and flat files using Informatica Power Center.
- Worked at the onsite location (Bonn, Germany) to interact with the client and gain good knowledge of the overall business.
- Involved in understanding and preparing technical design and estimation documents.
- Interacted with the end-user community to understand business requirements and identify data sources.
- Evaluated the existing informational sources and methods to identify problem areas and make recommendations for improvement. This required a detailed understanding of the data sources and researching possible solutions.
- Gathered business requirements from stakeholders and performed technical gap analysis for the integration projects.
- Created ETL workflows, sessions, and mappings using Informatica Power Center; involved in the ETL process from design and development through various testing phases.
- Worked on design documents such as the Design Instruction, Data Catalogue, Deployment Order, Test Plan, and Data Flow Plans, and with the Test Case Management Tool.
- Responsible for ETL code development and peer review; performed unit testing and application testing on ETL code.
- Responsible for system integration testing, UAT testing, and migrations.
- Performed functionality validation to ensure the build was done without defects.
- Interacted with clients to understand their business requirements and the criticality of the build or issues; participated in knowledge transfer sessions and mentored team members.
- Worked as a Senior ETL Lead in an onsite/offshore model, leading a team.
- Skilled in utilizing Informatica to streamline complex data integration processes and ensure data quality, consistency, and reliability across various systems and sources.
- Experienced in designing and developing efficient workflows, mappings, and transformations in Informatica to meet business requirements.
- Proficient in leveraging Informatica's capabilities to optimize data integration and processing.
- Expertise in SQL Server administration, performance tuning, and query optimization for efficient data storage, retrieval, and manipulation.
- Used the Control-M job scheduler to automate daily runs of the integration cycle in both production and UAT environments.
- Performed gatekeeper activities for incidents in tools such as Remedy and ServiceNow (SNOW), including responding to, communicating, assigning, resolving, and closing incidents.
- Worked across the software development life cycle (SDLC), including requirements gathering, designing and implementing business-specific functionality, development, testing, and production support.
- Followed the Agile Scrum methodology in analyzing, defining, and documenting the application to support functional and business requirements.

Other Contributions:
- Actively participated in the Zurich Innovation Program (ZIP-2018) and was awarded 2nd prize for the idea "Weather Sensors in Crop Insurance," based on data analytics.
- Awarded DXC KUDO Awards for continuous high performance in project deliveries.

Environment: Operating Systems: Windows XP; ETL: Informatica Power Center; Database: IBM DB2, SQL Server; Tools: JIRA, SSMS, PowerShell, SQL Developer, Microsoft Power BI.

Client: NHS (National Health Services, Department of Health, UK)               Dec 2010 - Apr 2013
Role: ETL Developer
Organization: DXC

Project description: The National Health Service owns the leading independent application named Lorenzo, developed for providing health services in the UK. The primary deliverable of the program is Lorenzo, an enterprise healthcare platform. The solution replaces existing products used by different healthcare trusts. Lorenzo is a healthcare product that captures a patient's journey in the hospital.
The NHS Data Migration project is part of the DXC Alliance program for the NHS's NPFIT (National Program for IT). As part of the project, we migrate data from different source systems to Lorenzo. This is normally accomplished in two steps:
a) Extraction and transformation of data from the sources to a flat file.
b) Use of the flat file to load the target system, Lorenzo, using an in-house solution.
Our solution generates flat files per the target systems' specifications while mimicking UI validations.
Snowdonia is the code name for a generic data migration solution to migrate data from different versions of a legacy PAS system named iPM to DXC's enterprise solution, Lorenzo Regional Care. This solution enables DXC to quickly migrate existing data into the Lorenzo Regional Care database after performing all the validations incorporated in the Lorenzo Regional Care application. The migration is performed in three phases:
a) Extraction of data: extracting data from different source tables to the necessary staging tables. The data is then extracted to a staging database, verifying all integrity constraints and all record-specific business rules.
b) Transformation of data: data in the staging database is further transformed to suit the target system, performing additional business-rule checks, including across-record and across-dataset validations and final corrections of the data before it is written to specifically formatted flat files.
c) Loading of data: these flat files are picked up and, after a second phase of integrity, business-rule, and other target-system-specific validations, the records are loaded into the target system with all the proper integrity relations set up.

Roles and Responsibilities:
- Designed a generic data migration solution for extraction and transformation along with the Solution Architect and an Informatica Professional Services consultant.
- Requirement gathering and requirement analysis.
- Used various transformations such as Source Qualifier, Filter, Aggregator, Expression, Lookup, Sequence Generator, and Update Strategy to create mappings.
- Used the ETL process to extract, transform, and load data into the staging area; loading into the data warehouse was done using Teradata client utilities such as FastLoad and MultiLoad.
- Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement. This required a detailed understanding of the data sources and researching possible solutions.
- Performed unit testing and integration testing scenarios.
- Performed code reviews.
- Developed ETL mappings using Informatica Power Center 9.1.0 Designer (Source Analyzer, Mapping Designer, Mapplet Designer, Transformation Developer), Repository Manager, Workflow Manager, and Workflow Monitor.
- Extensively worked on data extraction, transformation, and loading with RDBMS and flat files.
- Expert in data integration with Informatica Power Center and SQL Server Integration Services.
- Responsible for developing design documents based on the requirement specifications; coding, testing, modifying, debugging, documenting, and implementing Informatica mappings.
- Analyzed functional requirements and mapping documents; assisted in problem solving and troubleshooting. Exposure to the healthcare domain and experience understanding its business rules.
- Worked on extraction of data using DTS (Data Transformation Services); provided a proper solution to extract fields from the source using DTS.
- Responsible for defining the defect management process and for tracking and validating defects.
- Performed root cause analysis of reported production defects and fixed them.
- Proactively identified existing bugs in the system and brought them to the client's attention for planning of appropriate action.
- Developed macros in Microsoft Excel for building the Intermediate Flat File (IFF).

Other Contributions:
- Awarded Star Employee (LDO Award) for the quarter of August 2012 in the NHS project for analyzing and fixing issues on time with good quality and zero outstanding issues.
- Completed the Basic Health Care Certification (DXC internal).

Environment: Operating Systems: Windows XP; ETL: Informatica Power Center; Database: SQL Server; Tools: SQL Developer, Microsoft Power BI, Control-M.
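The extract-transform-flat-file flow described in the NHS migration above can be sketched generically. This is a hypothetical Python illustration using a pipe-delimited file (the field names, the space-stripping rule, and the delimiter are assumptions for the sketch, not the project's actual specification):

```python
import csv
import io

# Sketch of the migration flow: extract rows, apply a business-rule
# transform, and write a delimited flat file for the load phase.
# All field names and rules are illustrative.
def transform(row):
    # example business rule: normalise an identifier by stripping spaces
    row["nhs_number"] = row["nhs_number"].replace(" ", "")
    return row

def to_flat_file(rows, fieldnames):
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames,
                            delimiter="|", lineterminator="\n")
    writer.writeheader()
    for row in rows:
        writer.writerow(transform(dict(row)))
    return buf.getvalue()

out = to_flat_file([{"nhs_number": "943 476 5919", "surname": "SMITH"}],
                   ["nhs_number", "surname"])
# out == "nhs_number|surname\n9434765919|SMITH\n"
```

In the actual project this shape of output would be produced by Informatica mappings writing to flat-file targets, with the validations enforced by transformation logic rather than hand-written code.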
