Candidate Information
Name: Available
Title: Data Management Integration
Target Location: US-NJ-Bridgewater
Email: Available
Phone: Available

Candidate's Name
EMAIL AVAILABLE | PHONE NUMBER AVAILABLE

Professional Summary

Seasoned Data Integration Specialist with nearly 10 years of experience designing, developing, supporting, and maintaining data integration solutions using Informatica services and associated technologies. Adept at leveraging Informatica PowerCenter (9.x/10.x), Intelligent Data Management Cloud (IDMC), and various data management tools and platforms to create efficient ETL processes and manage complex data flows.

Key Skills and Expertise

- Data Integration and ETL Development: Demonstrated expertise in designing and developing Extract, Transform, Load (ETL) processes from diverse data sources to multiple target systems, including Oracle, Salesforce.com (SFDC), SQL Server, SAP, AWS S3, Snowflake, flat files, REST APIs, JSON, and XML files.
- Informatica Services Proficiency: Advanced proficiency in Informatica Intelligent Data Management Cloud (IDMC) services such as Data Synchronization Services (DSS), Data Replication Services (DRS), Mappings, Mapping Tasks, Task Flows, Service Connectors, and Business Process management. Skilled in creating Swagger files and configuring Mass Ingestion.
- Data Warehouse and SCD Design: Expertise in designing and implementing Slowly Changing Dimensions (SCD) types 1, 2, and 3. Solid understanding of data warehousing concepts and practices.
- Optimization and Standardization: Enhanced over 15 complex data flows using pushdown optimization techniques and established standardized processes for managing data integration systems to ensure consistency and efficiency.
- Documentation and Artifacts: Proficient in authoring key project artifacts, including Technical Design Documents (TDD), Source to Target Mapping documents (STTM), Migration Documents, Test Case Scenarios, User Acceptance Testing (UAT) plans, and Operational Support Documents compliant with GDPR.
- Visualization and Architecture: Skilled in creating detailed architecture diagrams using tools such as Microsoft Visio and Lucidchart.
- Domain Expertise: Extensive experience in Life Sciences, Pharma, Research & Development, and Clinical Data domains, adhering to Good Automated Manufacturing Practice (GxP) and Computer System Validation (CSV) processes.
- Salesforce Integration and Master Data Management: Integrated Salesforce CRM objects with Veeva Vault applications and managed Master Data Management (MDM) objects within data systems.
- Migration and Programming: Successfully migrated Informatica PowerCenter 10.x code to IDMC, including thorough execution planning, testing, and workflow redesign. Developed dynamic parameterization for incremental data loading using Python.
- Cloud and BI Tools: Experience with Python, AWS EC2, AWS S3, Salesforce Administration, Power BI, and Tableau. Managed data loads in Snowflake using dynamic mappings and IICS with Data Cloud connectors, and queried data from AWS S3 into Snowflake.
- Proof of Concept and Agile Methodologies: Presented proofs of concept for new business requirements and integration services. Experienced in both Waterfall and Agile SDLC models, using tools like Jira and ServiceNow for project management and collaboration.
- Version Control and Collaboration: Proficient in version control systems like Git for effective code collaboration, versioning, and continuous integration.
- Problem Solving and Communication: Excellent problem-solving and analytical skills with the ability to handle high-pressure, fast-paced work environments. Strong communicator, effectively collaborating with teams to deliver high-quality applications and meet project deadlines.
- Continuous Learning: Proactive in staying current with the latest Informatica releases and industry trends, continuously enhancing technical skills to adapt to evolving data industry requirements.
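Illustrative sketch (not from any of the engagements below): the SCD bullet above refers to Type 1/2/3 dimension handling. The minimal, self-contained Python example that follows shows the core of the Type 2 logic (close out the current row, insert a new version); the business key, tracked attribute, and column names are assumptions made up for the example.

    from datetime import date
    from typing import Dict, List

    HIGH_DATE = date(9999, 12, 31)  # open-ended effective_end for current rows

    def apply_scd2(dimension: List[dict], incoming: Dict[str, dict], load_date: date) -> List[dict]:
        """Apply Slowly Changing Dimension Type 2 logic.

        dimension: existing rows keyed by business key 'customer_id', with a tracked
                   attribute 'region' plus effective_start/effective_end/is_current.
        incoming:  latest source snapshot keyed by 'customer_id'.
        Returns the updated dimension table as a new list of rows.
        """
        updated = []
        current_by_key = {}
        for row in dimension:
            updated.append(dict(row))
            if row["is_current"]:
                current_by_key[row["customer_id"]] = updated[-1]

        for key, src in incoming.items():
            current = current_by_key.get(key)
            if current and current["region"] == src["region"]:
                continue  # no change: keep the current version as-is
            if current:
                # Type 2: close out the existing version instead of overwriting it
                current["effective_end"] = load_date
                current["is_current"] = False
            updated.append({
                "customer_id": key,
                "region": src["region"],
                "effective_start": load_date,
                "effective_end": HIGH_DATE,
                "is_current": True,
            })
        return updated

    if __name__ == "__main__":
        dim = [{"customer_id": 1, "region": "EU", "effective_start": date(2023, 1, 1),
                "effective_end": HIGH_DATE, "is_current": True}]
        snapshot = {1: {"region": "US"}, 2: {"region": "APAC"}}
        for row in apply_scd2(dim, snapshot, date(2024, 6, 1)):
            print(row)

In the mappings themselves this logic would more typically be expressed as a lookup plus router/update strategy in Informatica, or as a MERGE pushed down to the target warehouse, rather than in standalone Python.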
Technical Skills

ETL Tools: Informatica PowerCenter 9.x/10.x, IDMC services (Data Integration, Application Integration, Mass Ingestion), Dell Boomi
Databases: Oracle 11g, SAP BW on HANA, Hadoop (Hive), Snowflake
Scripting Languages: PL/SQL, Shell Scripting (Unix/Linux), Python
Job Scheduling Tools: Autosys, Crontab, Control-M
DevOps Tools: Jenkins CI/CD pipeline, GitHub
Cloud Platforms: AWS, Azure

Education

June 2010 - May 2014: Bachelor of Technology in Electronics and Communication Engineering (ECE), Rajiv Gandhi University of Knowledge Technologies, Basara, Nirmal, Telangana State, India

Certifications & Trainings

- IDMC Data Integration & Application Integration partner training and certificate
- Workato Automation Pro I and Automation Pro II
- Snowflake Essentials and Data Sharing certifications from Snowflake Academy
- Dell Boomi Professional Developer, Associate Dell Boomi Developer, Dell Boomi B2B/EDI Fundamentals, Dell Boomi B2B/EDI Management
- Salesforce Lightning Reports, Dashboards & List Views (Udemy)
- AWS and Azure cloud foundation courses (Udemy)
- AWS Solutions Architect certification (Udemy course in progress)
- Tableau Data Analyst certification (Udemy course in progress)

Professional Experience

Client: Alkermes
Role: Senior Technical Lead
Duration: Nov 2021 - Jun 2024
Tools Used: IICS CDI, CAI, Mass Ingestion, Snowflake, AWS, ServiceNow, Salesforce, Anaplan

Responsibilities:
- Authored all project artifacts, including the Integration Design Document, Source to Target Mapping Document, User Acceptance Test Scripts, Migration Document, and Operational Support Document.
- Designed ETL processes according to business requirements, ensuring efficient data flow from source to destination.
- Developed ETL pipelines using IDMC Data Integration to pull data from SAP, AWS S3 files, MS SQL Server, and REST API sources into the Snowflake data lake.
- Oversaw scheduled jobs in the production environment and provided support for any failed jobs.
- Collaborated with clients to gather business requirements, estimate timelines and effort, and perform risk analysis on existing data flows.
- Prepared all project documents and managed data integration activities in accordance with GDP regulations.
- Stayed up to date with new IDMC releases and explored new features and services.
- Implemented dynamic parameterization using Python for incremental data loads in the Manufacturing data domain, from OSI-PI systems to Snowflake.
- Built a data pipeline from AWS S3 bucket flat files to Snowflake.
- Ideated a data pipeline from Azure Blob Storage to Snowflake.
- Configured the Secure Agent on virtual machines to provide a running runtime environment for jobs deployed to Informatica Cloud.
- Migrated Informatica PowerCenter code to Informatica Cloud through a thorough process of risk analysis, test planning, job execution, and rebuilding of mappings, tasks, and workflows.
- Developed user-defined functions, established source/target system connections, and made Informatica jobs run successfully by redefining the data flow architecture.
- Worked extensively with source and target systems across the Quality, Manufacturing, Clinical, Research & Development, Medical Affairs, Operations, and Finance data domains.
- Handled the Operations data domain sourced from SAP HANA to Snowflake, the Finance data domain sourced from AWS S3 flat files to Anaplan, the Manufacturing data domain sourced from OSI-PI SQL Server to Snowflake, the Quality data domain sourced from Salesforce TrackWise Digital to Snowflake, and the Commercial data domain from various data sources to Snowflake.
- Integrated Quality data files into a SharePoint directory using the Data Integration SharePoint Online connection.
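Illustrative sketch of the "dynamic parameterization using Python" item above, assuming the last successful watermark is kept in a small JSON control file and handed to the mapping task through an Informatica-style parameter file. The paths, parameter names, and [Global] section header are placeholders; the exact parameter-file syntax depends on how the task is configured.

    import json
    from datetime import datetime, timezone
    from pathlib import Path

    # Hypothetical locations; the real paths and parameter names would come
    # from the project's configuration, not from this resume.
    CONTROL_FILE = Path("control/last_watermark.json")
    PARAM_FILE = Path("params/incremental_load.param")

    def build_param_file() -> None:
        """Read the last successful watermark and emit a parameter file that an
        Informatica mapping task can pick up for the next incremental window."""
        if CONTROL_FILE.exists():
            last_run = json.loads(CONTROL_FILE.read_text())["last_watermark"]
        else:
            last_run = "1900-01-01 00:00:00"  # first run: full load

        window_end = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")

        # $$ in-out parameters under a [Global] section follow the usual
        # Informatica parameter-file conventions; adjust to the task setup.
        PARAM_FILE.parent.mkdir(parents=True, exist_ok=True)
        PARAM_FILE.write_text(
            "[Global]\n"
            f"$$INCR_START={last_run}\n"
            f"$$INCR_END={window_end}\n"
        )

    def commit_watermark() -> None:
        """Called only after the mapping task completes successfully, so a
        failed run re-processes the same window on the next attempt."""
        window_end = PARAM_FILE.read_text().split("$$INCR_END=")[1].strip()
        CONTROL_FILE.parent.mkdir(parents=True, exist_ok=True)
        CONTROL_FILE.write_text(json.dumps({"last_watermark": window_end}))

    if __name__ == "__main__":
        build_param_file()
        print(PARAM_FILE.read_text())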
Client: Harvard Business Publisher
Role: Senior Data Consultant
Duration: Sept 2019 - April 2021
Tools Used: IICS CDI REST API, Oracle 11g, Unix shell script, Postman

Responsibilities:
- Developed integrations that source CSV files and load them into the Higher Education website using six API calls.
- Applied file validation rules to reject invalid data files and send notification email alerts with the invalid file attached along with the error reasons.
- Used the Postman tool extensively to inspect REST API responses by passing the request body in JSON format with authorization.
- Developed Swagger files, RestV2 connections, Business Services, and Web Service transformations to invoke REST APIs from CDI.
- Utilized the Hierarchy Builder transformation in mappings to build the request payload sent into the Web Service transformation.
- Executed custom source queries in mappings to move source records from the staging table to the REST APIs.
- Wrote shell scripts for custom email alerts and job dependency checks.

Client: Athenahealth
Role: Senior Data Consultant
Duration: May 2019 - Aug 2019
Tools Used: IICS CDI REST API, CAI, Oracle 11g, Unix shell script

Responsibilities:
- Developed processes for business REST APIs to integrate Salesforce CRM with ServiceNow, and applied fault-handling scenarios in all internal sub-processes.
- Configured email alerts to be self-invoking by default in the event of any process failure, and developed exceptions using the "Throw" type in Application Integration.
- Used Query objects to join multiple objects in SFDC for the required output fields.
- Created and modified Swagger files, based on incident-number responses, to pass input parameters such as authentication, input fields, and the subdomain URL to the host URL.
- Based on the Swagger files, created RestV2 connections and Business Services used for the Web Service transformation in Informatica Cloud Data Integration.
- Documented all project artifacts following GxP guidelines and regulations.
- Coordinated a team of four across development, testing, and support activities.
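Illustrative sketch only: the Application Integration processes above call REST APIs with a JSON body, an authorization header, and fault handling. The Python equivalent below (using the requests library) shows the same pattern; the ServiceNow-style endpoint, token, and field names are placeholders, not details from the engagement.

    import requests

    # Hypothetical endpoint and token: stand-ins for the ServiceNow-style
    # REST API that the CAI processes above call.
    API_URL = "https://example.service-now.com/api/now/table/incident"
    API_TOKEN = "replace-with-real-token"

    def create_incident(short_description: str, caller_id: str) -> dict:
        """POST a JSON payload with an Authorization header and basic fault
        handling, roughly what the CAI process does with its Throw steps."""
        headers = {
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
            "Accept": "application/json",
        }
        payload = {"short_description": short_description, "caller_id": caller_id}

        try:
            response = requests.post(API_URL, json=payload, headers=headers, timeout=30)
            response.raise_for_status()  # treat HTTP 4xx/5xx as faults
        except requests.RequestException as exc:
            # In the CAI process this branch raises a Throw step and triggers
            # an email alert; here we simply re-raise with context.
            raise RuntimeError(f"Incident creation failed: {exc}") from exc

        return response.json()

    if __name__ == "__main__":
        print(create_incident("Integration test incident", "data.integration"))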
Client: PayPal
Role: Senior Data Consultant
Duration: April 2018 - April 2019
Tools Used: Informatica PowerCenter 10.x, SAP HANA, Unix shell scripting, DevOps migration tools, JIRA

Responsibilities:
- Analyzed source data for all payment-related jobs to fix errors and ensure data availability before the business cut-over time.
- Coordinated with the Command Center team to re-run, hold, or kill jobs depending on source data arrival, long-running jobs, outages, or maintenance.
- Performed statistical validation in the PSA and DSO layers of SAP BW on HANA.
- Remediated data from archived tables in Hadoop to the DSL layer of HANA.
- Generated reports and statistics at the end of each data load.
- Reviewed job status in Control-M to confirm downstream readiness.
- Produced standard operating documents detailing challenges, solutions, and enhancements.
- Updated the Confluence page with the details required by the client, along with lessons learned.
- Deployed code to pre-production and production environments using the Olympus tool and tracked the deployments in Jira.

Client: Merck Co.
Role: Data Integration Developer
Duration: Feb 2015 - April 2018
Tools Used: Informatica PowerCenter 9.x/10.x, Unix shell scripting, SFDC, Oracle 11g database

Responsibilities:
- Coordinated with the onsite team to create functional specifications from business requirements gathered from the client, following GxP and GDP compliance.
- Developed Informatica mappings using a variety of complex transformations for the extraction, transformation, and loading of data from multiple sources into the data warehouse.
- Defined ETL processes to load data from systems such as Oracle, flat files, SFDC, SAP, and XML.
- Extensively used mapping parameters and variables to provide flexibility.
- Fine-tuned the performance of Informatica components for daily incremental loading.
- Scheduled all Informatica jobs in Autosys/Crontab using shell scripts.
- Created a notification system in Unix shell scripting that provided the client with details of each interface run.
- Processed CRM data into multiple Salesforce objects using Informatica mappings and, manually, the Data Loader, and verified that the data was reflected in the Veeva CRM application.
- Created Users, Territories, Accounts, Contacts, and Products in the Veeva application by processing flat files provided by sales representatives for each country configured in the CRM application.
- Implemented a generalized method of scripting and mapping for around 30 countries.
- Reproduced interface behaviors to identify the root causes of problems and developed workarounds and solutions.
- Conducted knowledge-transfer sessions for end users and created comprehensive documentation outlining the design, development, implementation, daily loads, and process flow of the mappings.
- Logged user (client) support cases through Salesforce tickets to analyze issues and provide solutions.
- Worked on Hadoop connectivity from Informatica and imported HDFS tables as targets for populating data.
- Created shell scripts for pushing files into HDFS from Linux servers.
- Configured the Kerberos network authentication protocol for secure file transfer from the Linux server to the Hadoop system.
- Developed proofs of concept for a broad range of scenarios in IDMC with Data Replication Services (DRS) and Data Synchronization Services (DSS).
- Elicited and analyzed business requirements, and created and delivered proofs of concept to the client well ahead of deadlines.
- Worked extensively on the Web Service transformation in Informatica 9.6.
- Developed mappings in response to client requirements.
- Extracted data from a source (WSDL) and loaded it into a target (WSDL) system, which in turn was connected to SAP systems.
- Developed a proof of concept on Informatica Cloud to load data from Salesforce into Oracle tables and from WSDL to Oracle.
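Illustrative sketch only: the HDFS bullets above describe shell scripts that obtain a Kerberos ticket and push files from a Linux server into HDFS. The Python rendering below wraps the same two commands (kinit from a keytab, then hdfs dfs -put); the principal, keytab, and paths are placeholders.

    import subprocess
    from pathlib import Path

    # Placeholder principal, keytab, and paths; the originals are site-specific.
    KEYTAB = "/etc/security/keytabs/etl_user.keytab"
    PRINCIPAL = "etl_user@EXAMPLE.COM"
    LOCAL_DIR = Path("/data/outbound")
    HDFS_TARGET = "/landing/informatica/"

    def run(cmd: list[str]) -> None:
        """Run a command and fail loudly, mirroring `set -e` in the shell scripts."""
        subprocess.run(cmd, check=True)

    def push_files_to_hdfs() -> None:
        # Obtain a Kerberos ticket from the keytab, then copy each file to HDFS.
        run(["kinit", "-kt", KEYTAB, PRINCIPAL])
        for path in sorted(LOCAL_DIR.glob("*.csv")):
            run(["hdfs", "dfs", "-put", "-f", str(path), HDFS_TARGET])
            print(f"pushed {path.name} to {HDFS_TARGET}")

    if __name__ == "__main__":
        push_files_to_hdfs()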
