Candidate Information
Title: Power Center Data Integration
Target Location: US-NC-Greensboro
SUSHMALATHA
PHONE NUMBER AVAILABLE | EMAIL AVAILABLE | LINKEDIN LINK AVAILABLE

PROFESSIONAL SUMMARY:
- 8+ years of IT experience in data warehousing with tools including Informatica PowerCenter, IICS (CDI, CAI), Snowflake, Oracle, and shell scripting.
- Strong working experience in all phases of development, including extraction, transformation, and loading (ETL) of data from various sources into data warehouses and data marts using IICS Informatica Cloud (CDI, CAI) and PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor).
- Performed advanced data pipeline development with Informatica Cloud (IICS), including data integration, source-to-target data modelling, and Extract-Transform-Load development, consuming Oracle and RESTful API application data connections and targeting Snowflake data connections.
- Created numerous mappings, mapping tasks, and taskflows based on requirements in CDI (Cloud Data Integration).
- 3+ years of experience in IICS Informatica Cloud ETL development using CDI and CAI.
- Performed one-time data loads as well as Change Data Capture (CDC) for bulk data movement; a sketch of this upsert pattern follows the skills summary below.
- Performed PowerCenter ETL development using Repository Manager, Designer, Server Manager, Workflow Manager, and Workflow Monitor.
- Converted Informatica PowerCenter ETL code to IICS using the PC-to-Cloud conversion service; involved in enhancement and maintenance activities for the data warehouse.
- Performed Informatica pushdown optimization.
- Created Swagger files for a CDI connection that pulls data from ServiceNow.
- Worked with API web services using application integration for Geotab through a REST endpoint connector.
- Worked with the production support team to immediately resolve failures in production data processing; collaborated with business users to resolve data inconsistency and reporting issues.
- Understand business rules thoroughly from high-level design specifications and implement the corresponding data transformation methodologies.
- Created complex mappings using Connected and Unconnected Lookup, Aggregator, Update Strategy, Stored Procedure, and Router transformations to populate target tables efficiently.
- Experienced in testing data with the mapping debugger; identified bugs in existing mappings by analyzing data flow and evaluating transformations.
- Experienced in resolving ongoing maintenance issues and bug fixes; monitored Informatica sessions and performance-tuned mappings and sessions.
- Experienced with the Snowflake platform and ecosystem, including Snowflake SQL.
- Proficient in creating reusable mapplets and transformations using the various Informatica transformations.
- Able to promote code from development environments to production.
- Familiar with GitHub and equivalent version control systems.

EDUCATION:
M.Tech (Master's in Engineering and Technology), Jawaharlal Nehru Technological University, Hyderabad, India
B.Tech (Bachelor of Engineering and Technology), Visvesvaraya Technological University, Karnataka, India

AREAS OF EXPERTISE:
ETL Tools: Informatica PowerCenter, Informatica Cloud (CDI/CAI)
Data Modelling: Physical, logical, and relational modelling
Dimensional Modelling: Star schema, snowflake schema, facts, dimensions, entities, attributes
Databases: Oracle 12c/11g/10g/9i, SQL Server, Snowflake
Languages: SQL, PL/SQL, C, C++, UNIX shell script
Tools: SQL*Plus, TOAD, SSMS, PL/SQL Developer, SQL*Loader
Operating Systems: Windows NT/2000/2003/XP/Vista/7, UNIX, Linux
Scheduling Tools: DAC, BMC Control-M, Autosys
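For illustration, a minimal Snowflake SQL sketch of the CDC-style upsert pattern referenced in the summary above. CUSTOMER_STG, CUSTOMER_DIM, and their columns are hypothetical stand-ins, not actual project objects:

    -- Type 1 style upsert: apply changed and new rows from staging to the target.
    -- Hypothetical tables: CUSTOMER_STG (captured changes), CUSTOMER_DIM (target).
    MERGE INTO customer_dim tgt
    USING customer_stg src
       ON tgt.customer_id = src.customer_id
    WHEN MATCHED AND (tgt.name <> src.name OR tgt.city <> src.city) THEN
        UPDATE SET name       = src.name,
                   city       = src.city,
                   updated_at = CURRENT_TIMESTAMP()
    WHEN NOT MATCHED THEN
        INSERT (customer_id, name, city, updated_at)
        VALUES (src.customer_id, src.name, src.city, CURRENT_TIMESTAMP());

The change test shown is simplified; nullable columns would need a null-safe comparison such as Snowflake's IS DISTINCT FROM.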
PROFESSIONAL EXPERIENCE:

WEX, ME                                                    June 2022 to Present
Candidate's Name
Project: GEOTAB
Technology used: Informatica Intelligent Cloud Services (IICS) CAI/CDI, Oracle, Snowflake, Control-M
Responsibilities:
- Understood the business rules and sourced the data from multiple source systems.
- Designed and built IICS mappings to extract data from SFDC, Oracle, and SQL Server and load it into Oracle; designed ETL flows to generate flat-file extracts per business need.
- Converted Informatica PowerCenter ETL code to IICS using the PC-to-Cloud conversion service; involved in enhancement and maintenance activities for the data warehouse.
- Refreshed mappings for any changes or additions to source attributes.
- Developed audit tables for all the cloud mappings.
- Developed Informatica Cloud mappings, data synchronization (DS) tasks, data replication tasks, and task creation.
- Generated automated stats on the staging loads, comparing present-day row counts to previous-day counts (see the audit-count sketch after this list).
- Built data pipelines by creating cloud ETL mappings in Informatica Intelligent Cloud Services and published them for use in API calls.
- Introduced GitHub as a version-control plugin for IICS.
- Involved in performance optimization of IICS jobs and designed efficient queries against the Snowflake cloud database.
- Automated/scheduled the cloud jobs to run daily, with email notifications for failures and timeout errors.
- Created file mass-ingestion tasks to move data files between FTP, SFTP, and local folders, and database mass-ingestion tasks to load data from an on-premise database to the Salesforce cloud.
- Monitored and improved the performance of sources, targets, and mappings, and documented them.
- Created a file-parser connection for a large number of files and a process to set up the dependency between cloud and source. Created and developed mappings to load data from staging tables and files to EDW data-mart tables and CIH topics, and from topics to files, based on the source-to-staging mapping design document.
- Implemented Type 1, Type 2, CDC, and incremental load strategies.
- Used exception mappings to generate dynamic parameter files after the staging loads. Used shared folders to create reusable components (sources, targets, mapplets, email tasks).
- Developed and automated PowerCenter/IDQ jobs through Control-M.
- Used SQL queries to validate the data after loading.
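For illustration, a minimal sketch of the day-over-day count check mentioned in the list above, assuming a hypothetical LOAD_AUDIT table (TABLE_NAME, LOAD_DATE, ROW_CNT); actual audit table structures varied by project:

    -- Flag staging tables whose row count moved more than 10% versus yesterday.
    SELECT cur.table_name,
           cur.row_cnt                 AS today_count,
           prev.row_cnt                AS yesterday_count,
           cur.row_cnt - prev.row_cnt  AS delta
    FROM   load_audit cur
    JOIN   load_audit prev
           ON  prev.table_name = cur.table_name
           AND prev.load_date  = cur.load_date - 1
    WHERE  cur.load_date = CURRENT_DATE
    AND    ABS(cur.row_cnt - prev.row_cnt) > 0.10 * prev.row_cnt;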
Vertex, TX                                                 March 2021 to May 2022
Candidate's Name
Project: VITAL
Technology used: Informatica PowerCenter 10.2.0, IICS, Oracle, EBS, Control-M
Responsibilities:
- Involved in data analysis and handled ad-hoc requests by interacting with business analysts, clients, and customers, resolving issues as part of production support.
- Reviewed project/task statuses and issues with the offshore team and ensured on-time completion of the project.
- Developed UNIX shell scripts to automate and streamline existing manual procedures used at the client location.
- Developed SQL scripts to fetch data as per the requirements.
- Used CAI to connect directly to the Vertex application and process real-time data, and for API/web services through the REST endpoint connector.
- Converted specifications to programs and data mappings in an ETL IICS environment.
- Used CDI to develop cloud mappings that extract data for different regions.
- Extensively used pre-SQL and post-SQL scripts for loading data into the targets according to the requirements.
- Created batch scripts to call Informatica workflows repeatedly; created different jobs using batch scripting to call the workflows via Command tasks.
- Created reusable transformations, mapplets, and worklets using Transformation Developer, Mapplet Designer, and Worklet Designer.
- Executed sessions and sequential and concurrent batches for proper execution of mappings.
- Attended status meetings with project managers, escalated issues when necessary, and attended issue-resolution meetings.
- Used parameters and variables for PowerCenter mappings, sessions, and workflows.
- Worked on the error-handling strategy: if any errors occurred, the mapping was to fail.
- Worked extensively with sources and targets such as flat files and SQL Server tables.
- Built efficient ETL components to meet the technical requirements.
- Debugged sessions using the session logs.
- Fixed invalid mappings and troubleshot technical problems in the database.
- Participated in deployment by adding objects to a deployment group and moving the code from test to production.
- Used mapplets, parameters, and variables to apply object-orientation techniques and facilitate code reuse.
- Monitored workflows in the production environment and debugged/resolved issues.
- Wrote complex SQL using joins, subqueries, and correlated subqueries; expertise in SQL queries for cross-verification of data (see the sketch after this list).
- Generated SQL queries to check the consistency of data in tables and to update the tables per business requirements.
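For illustration, a minimal cross-verification sketch of the kind described above, using an anti-join and a correlated subquery; ORDERS_SRC and ORDERS_TGT are hypothetical names, not actual project tables:

    -- Source rows that never arrived in the target (anti-join).
    SELECT s.order_id, s.order_date, s.amount
    FROM   orders_src s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   orders_tgt t
                       WHERE  t.order_id = s.order_id);

    -- Correlated subquery: rows whose amounts disagree between the two tables.
    SELECT s.order_id
    FROM   orders_src s
    WHERE  s.amount <> (SELECT t.amount
                        FROM   orders_tgt t
                        WHERE  t.order_id = s.order_id);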
L'Oréal, NY                                                Jan 2020 to Jan 2021
Candidate's Name
Project: B2C
Technology used: Informatica PowerCenter 10.2.0, SQL Server 2012, Control-M
Responsibilities:
- Gathered business requirements and created technical specifications, along with internal and external design documents.
- Created technical specification documents and worked with business analysts on analysis of data from multiple source systems as part of the ETL mapping exercise.
- Extensively used pre-SQL and post-SQL scripts for loading data into the targets according to the requirements.
- Created batch scripts to call Informatica workflows repeatedly; created different jobs using batch scripting to call the workflows via Command tasks.
- Created reusable transformations, mapplets, and worklets using Transformation Developer, Mapplet Designer, and Worklet Designer.
- Executed sessions and sequential and concurrent batches for proper execution of mappings.
- Used parameters and variables for PowerCenter mappings, sessions, and workflows.
- Worked on the error-handling strategy: if any errors occurred, the mapping was to fail.
- Worked extensively with sources and targets such as flat files and SQL Server tables.
- Built efficient ETL components to meet the technical requirements.
- Fixed invalid mappings and troubleshot technical problems in the database.
- Participated in deployment by adding objects to a deployment group and moving the code from test to production.
- Optimized SQL queries using the explain plan (illustrated after this list).
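For illustration, the explain-plan workflow in Oracle syntax ("explain plan" is Oracle terminology; SQL Server 2012 exposes the equivalent through estimated and actual execution plans). Table names are hypothetical:

    -- Capture the optimizer's plan for a candidate query.
    EXPLAIN PLAN FOR
    SELECT c.customer_id, SUM(o.amount) AS total_amount
    FROM   orders o
    JOIN   customers c ON c.customer_id = o.customer_id
    WHERE  o.order_date >= DATE '2020-01-01'
    GROUP BY c.customer_id;

    -- Display the captured plan; look for full scans and costly joins.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);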
Johnson & Johnson, India                                   Aug 2015 to Feb 2019
Candidate's Name
Project: Pharma Data Portal
Technology used: Informatica, Oracle, SQL
Responsibilities:
- Extracted, transformed, and loaded data from different sources into an Oracle database using Informatica as the ETL tool.
- Involved in resolving complications in production-level issues.
- Involved in closing SIT and UAT defects.
- Worked extensively in shell scripting and clickstream analysis.
- Created reusable transformations, mapplets, and worklets using Transformation Developer, Mapplet Designer, and Worklet Designer.
- Executed sessions and sequential and concurrent batches for proper execution of mappings.
- Involved in tuning Informatica objects to improve the performance of the Informatica loads.
- Involved in the design, development, and testing of PL/SQL stored procedures and packages for the ETL processes (a skeleton appears after this list).
- Extracted data from various sources such as Oracle, Excel spreadsheets, XML, Sybase, and flat files.
- Created UNIX shell scripts for ETL jobs, session-log cleanup, and dynamic parameters, and maintained shell scripts for data conversion.
- Coordinated with the offshore team on related work, including unit test case preparation and documentation covering implementation, warranty, and support.
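For illustration, a skeletal PL/SQL procedure of the kind described above: it moves validated staging rows into a target table and records the row count in an audit log. All object names are hypothetical:

    -- Hypothetical ETL post-load procedure (PRODUCT_STG -> PRODUCT_DIM).
    CREATE OR REPLACE PROCEDURE load_product_dim AS
        v_rows PLS_INTEGER;
    BEGIN
        INSERT INTO product_dim (product_id, product_name, load_date)
        SELECT product_id, product_name, SYSDATE
        FROM   product_stg
        WHERE  status = 'VALID';

        v_rows := SQL%ROWCOUNT;

        INSERT INTO etl_log (proc_name, row_cnt, run_date)
        VALUES ('LOAD_PRODUCT_DIM', v_rows, SYSDATE);

        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            RAISE;  -- surface the failure to the calling session or scheduler
    END load_product_dim;
    /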
