Candidate's Name
EMAIL AVAILABLE
PH - PHONE NUMBER AVAILABLE

Executive Summary
- Data warehousing and ETL solutions developer with over 13 years of design and development experience.
- Excellent knowledge of building DW and OLTP systems and applying ETL methodologies in VLDB (multi-terabyte) environments in the pharmaceutical, media, healthcare, finance, and insurance industries.
- As a designer and developer, delivered numerous high-performance ETL solutions built on integrated, scalable, and extensible platforms; quickly understands business requirements and provides end-to-end solutions for them.
- Dedicated professional with strong technical, analytical, problem-solving, oral, and written skills; a committed team player, regularly involved in performance tuning of Informatica mappings and sessions.
- Experienced in loading data into data warehouses and data marts using Informatica, Oracle, SQL Server, and SQL*Loader.
- Extensively used Erwin to design logical/physical data models and for forward/reverse engineering; modeled data warehouse data marts using snowflake/star schemas.
- Experienced in client/server technology with Oracle, SQL Server, and PL/SQL for back-end development of packages, stored procedures, functions, and triggers.
- Strong analytical, programming, written, and verbal communication skills, with the ability to interact with individuals at all levels.
- Excellent team player; can work on both the development and maintenance phases of a project.
- Developed processes for IDQ and for transforming, extracting, integrating, and loading data with Informatica PowerCenter.
- Worked with the team using Informatica PowerExchange to connect to various source systems for pulling data.
- Self-motivated and adaptive to new and challenging technological environments.
- Worked with the team on Informatica ILM to understand the data life cycle and improve standards and performance.
- Worked in Agile environments and rendered 24-hour production support.
- Good understanding of the Informatica MDM tool to identify data redundancy and improve performance.
- Worked extensively on Informatica Cloud to load data from multiple source systems into the Salesforce cloud.

Key Qualifications
- Determines customers' business requirements and establishes a strategy and road map for DW/ETL solutions.
- Provides ETL implementations from inception through final delivery to meet clients' deliverables and deadlines.
- Executed client assessments, designs, specifications, data mappings, and test plans/scripts.
- Performance tuning and optimization techniques.
- Extensive work with the Informatica tools Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
- Hands-on with databases such as Oracle 11g/10g/9i, MS SQL Server 2005/2008, DB2, and Teradata.
- Use of database tools such as SQL*Plus, Toad, SQL Navigator, and SSIS.
- Experience with operating systems such as UNIX and Windows XP/NT/2000/98/95.
- Hands-on with languages such as C, SQL, PL/SQL, and XML.
- Created mappings using transformations such as Source Qualifier, SQL, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, and Stored Procedure.

Client Engagement Details

United Health Care, Horsham, PA - Duration: Mar 2019 - Till Date
ETL Developer
UnitedHealthcare is dedicated to helping people live healthier lives and making the health system work better for everyone.
They serve millions of people from their earliest years through their working lives and into retirement.
- Extensively worked on improving the performance of various sessions in workflows by adding parallel partitioning both at the database level and in Informatica.
- Loaded data into various platforms from other platforms such as UCPS, ISDW, FOX, COMPAS, and SRA, held in an Oracle database and VSAM files, performing full and delta loads for each feed.
- Created a new COMPAS subset solution for member data loads across 13 non-prod environments.
- Extensively used the COMPAS subset solution to incrementally subset data loads across environments.
- Called stored procedures from mappings to make entries in process-log tables for inserts (IP) and updates (SU) on every run, and to create Run IDs and Feed IDs for each feed.
- Used Informatica PowerExchange to load data from various VSAM files using data maps, performing both bulk and CDC loads.
- Created deployment groups and deployed changes to lower environments; worked with the production team to promote changes into production using XSDs.
- Conducted several demos, within the team and outside it, to educate others about the capabilities built for their usage.
- Worked closely with many requestors, helping them obtain data in multiple environments for their regression-test needs.
- Loaded data from multiple XML files, including application-related data, into an Oracle DB.
Environment: Informatica Power Center 10.1, 10.2 and 10.5, Informatica Power Exchange, Toad 12g, Autosys Version 11, GIT Version 1.9.5, Unix Script.

JP Morgan Chase, Jersey City, NJ - Duration: Dec 2018 - Feb 2019
ETL Developer
J.P. Morgan Chase & Co. is an American multinational investment bank and financial services company headquartered in New York City.
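The process-log pattern described in the engagements here (a stored procedure called from each mapping to record insert (IP) and update (SU) counts and to generate Run IDs and Feed IDs per run) might be sketched in PL/SQL as below. This is an illustration only: every table, column, and procedure name is hypothetical, not the actual client schema.

```sql
-- Illustrative sketch; names are invented, not the client's schema.
CREATE TABLE process_log (
  run_id     NUMBER      NOT NULL,
  feed_id    NUMBER      NOT NULL,
  load_type  VARCHAR2(2) NOT NULL,   -- 'IP' = insert run, 'SU' = update run
  row_count  NUMBER      NOT NULL,
  run_date   DATE        DEFAULT SYSDATE
);

CREATE SEQUENCE process_run_seq;

CREATE OR REPLACE PROCEDURE log_feed_run (
  p_feed_id   IN  NUMBER,
  p_load_type IN  VARCHAR2,          -- 'IP' or 'SU'
  p_row_count IN  NUMBER,
  p_run_id    OUT NUMBER
) AS
BEGIN
  -- Generate a new Run ID for this execution of the feed.
  SELECT process_run_seq.NEXTVAL INTO p_run_id FROM dual;
  INSERT INTO process_log (run_id, feed_id, load_type, row_count)
  VALUES (p_run_id, p_feed_id, p_load_type, p_row_count);
END log_feed_run;
/
```

In an Informatica mapping a procedure like this is typically wired in through a Stored Procedure transformation or pre-/post-session SQL, with the returned Run ID reused to tag the rows loaded in that run.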
JPMorgan Chase is the largest bank in the United States and is ranked by S&P Global as the sixth-largest bank in the world by total assets as of 2018, at $2.534 trillion.
- Extensively worked on improving the performance of various sessions in workflows by adding parallel partitioning both at the database level and in Informatica.
- Loaded data into various platforms from other platforms such as RSAM, RC, and IDROLES in an Oracle database, performing full and delta loads for each feed.
- Called stored procedures from mappings to make entries in process-log tables for inserts (IP) and updates (SU) on every run, and to create Run IDs and Feed IDs for each feed.
- Introduced variance checks as part of validations across all databases before the initial data load, to verify that correct data flows from source to stage and into the base tables.
- Executed jobs in the UAT environment using scheduler tools such as Autosys, and validated data counts and sample data to confirm the correct data was populated.
- Worked closely with Oracle DBAs and created roles to grant various users access to privileged objects containing sensitive data.
- Pushed DBA scripts and Informatica XML files to GIT repositories for version control before deploying code to production environments.
- Created multiple ITSM requests to migrate code into UAT and PROD from lower environments.
- Worked with the team to set up Informatica PowerExchange to integrate data from multiple source systems.
- Reduced development cost through automated metadata capture that supports rapid impact assessment and development as changes arise.
Environment: Informatica Power Center 9.6 and 10.2, Informatica Power Exchange, Toad 12g, Autosys Version 11, GIT Version 1.9.5, Unix Script.

Tiffany & Co., Inc. - Duration: Jun 2017 - Dec 2018
Informatica Lead Developer
Tiffany & Co.'s wide variety of rare colored gemstones makes these distinctive sectors a rich and fascinating field.
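A variance check of the kind mentioned above, validating that the same data arrives from source to stage before the initial load, can be as simple as a per-feed row-count comparison. The table names below are purely illustrative:

```sql
-- Illustrative: flag feeds whose source and stage row counts disagree.
SELECT src.feed_id,
       src.cnt AS source_rows,
       stg.cnt AS stage_rows,
       ABS(src.cnt - stg.cnt) AS variance
  FROM (SELECT feed_id, COUNT(*) AS cnt FROM src_feed_data GROUP BY feed_id) src
  JOIN (SELECT feed_id, COUNT(*) AS cnt FROM stg_feed_data GROUP BY feed_id) stg
    ON stg.feed_id = src.feed_id
 WHERE src.cnt <> stg.cnt;
```

Switching the inner join to a full outer join would additionally catch feeds that are missing entirely from one side.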
Some colored gemstones were discovered for the first time in recent decades, while others form a cultural tradition in communities where they have been mined for generations.
- Created mappings to load data from sources such as DB2, flat files, XML, and Oracle databases into schemas such as MIPDC, MIPS, JDEE1, COUPA, and TRDM for the RF China project.
- Worked on Tiffany's prestigious project to load data into the POS server from various source systems, including accommodating prices from many countries.
- Created active reports for users when enhancements and ad hoc requests came in.
- Worked closely with the business to gather and thoroughly understand requirements and to create a design data flow.
- Prepared all SDLC documents, including IDM, IFS, IQOQ, unit-testing documents, and IDD, and loaded them into Documentum and SharePoint.
- Performed various smoke tests with the business in the Test and UAT environments.
- Provided data on inventory balances, reconciliation, purchase orders, invoices, and units on replenishment for retail stores.
- Involved in maintenance issues, analyzing data discrepancies, enhancements, and resolving technical issues.
- Worked with the team to set up Informatica PowerExchange to integrate data from multiple source systems.
- Reduced development cost through automated metadata capture that supports rapid impact assessment and development as changes arise.
Environment: Informatica Power Center 10.1, Informatica Cloud, SQL Server 8.0, Oracle 11g, Toad 11g, Veeva Vault.

Merck & Co., Inc.
Duration: Mar 2016 - May 2017
Informatica Lead Developer
Merck is determined to build on its fundamental research strengths to deliver innovation to patients and physicians worldwide, and to invest in opportunities to address unmet needs through internal R&D efforts and the best external science.
- Used Informatica Cloud to load data from sources such as Oracle, SQL Server, and flat files, covering trial, study, site, country, RCAM, and milestone data, into Veeva Vault on Salesforce.
- Performed various API calls to load data both inside and outside the Merck network.
- Created active reports for users when enhancements and ad hoc requests came in.
- Worked closely with the business to gather and thoroughly understand requirements and to create a design data flow.
- Coordinated regularly with the offshore team, assisted them with development activities, and reviewed code to confirm it met all requirements.
- Prepared all SDLC documents, including IDM, IFS, IQOQ, unit-testing documents, and IDD, and loaded them into Documentum and SharePoint.
- Created, updated, and deleted binders and EDLs for studies, sites, and countries in the Veeva Vault cloud.
- Created meetings and attendee records in CVENT and with the Amex team for events, helping planners coordinate and make arrangements.
- Involved in maintenance issues, analyzing data discrepancies, enhancements, escalations, and resolving technical issues.
Environment: Informatica Power Center 10.1, Informatica Cloud, SQL Server 8.0, Oracle 11g, Toad 11g, Veeva Vault.

Lincoln Financial Group - Duration: Sep 2014 - Mar 2016
ETL Developer
Lincoln Financial Group provides advice and solutions that help empower Americans to take charge of their financial lives with confidence and optimism.
Its offerings include retirement, insurance, and wealth-protection expertise to help customers address their lifestyle, savings, and income goals, as well as guard against long-term care expenses.
- Created mappings to load data from sources such as SQL Server, the Salesforce cloud, flat files, XML, and Oracle databases into schemas such as EMDBW, EMDBW_DM_CRM_RS, and EMDBW_DM_RPT.
- Created active reports for users when enhancements and ad hoc requests came in.
- Worked on the Salesforce One project, which involved onboarding ABS users in addition to RPS users.
- Created a security-level model to send entitlements for all users based on their roles and groups, passing access levels through their hierarchy.
- Created new stored procedures, tweaked existing ones, and improved their performance.
- Created new mappings to load data from the cloud into the 1511 server, and from 1511 into the 1404 server via delta loads.
- Worked on-call during the warranty period, resolving defects and providing production support whenever defects were created.
- Provided campaign opportunity, activity, and event data through reports to help the business run campaigns effectively.
- Involved in maintenance issues, analyzing data discrepancies, enhancements, escalations, and resolving technical issues.
Environment: Informatica Power Center 9.5, SQL Server 8.0, Oracle 11g, Toad 11g, Salesforce One.

JP Morgan Chase, Dublin, OH - Duration: Aug 2013 - July 2014
ETL Developer
JP Morgan Chase is a multinational company with offices worldwide.
In the USA, operations are performed from New Jersey and Dublin, Ohio, with branches in Mumbai, India, and Singapore. Data arrived as flat files and was loaded into the SAGE and SONAR platforms, into the Oracle database schemas RDTDBO, SONARDBO, and SAGEDBO respectively.
- Actively interacted with teams in Singapore, India, and New Jersey to walk through the data model and ETL spec before starting any mapping.
- Loaded data into the SAGE platform as SCD Type 2 loads into the Oracle database, performing full and delta loads for each feed.
- Called a stored procedure from all mappings to make entries in process-log tables for inserts (IP) and updates (SU) on every run, and to create Run IDs and Feed IDs for each feed.
- Loaded target tables and recorded discrepant data in the exception log and exception-attribute log by calling a stored procedure in post-SQL.
- Created Unix scripts to execute workflows for testing and loading data, avoiding the need to create metadata for running workflows manually.
- Created deployment groups, both static and dynamic, for each mapping, and deployed mappings from DEV to SIT, then UAT, then PROD.
- Performed smoke tests after deploying mappings from one environment to another to confirm they behaved correctly after deployment.
- Fixed defects and reassigned them as Ready_to_retest in ALM for all defects raised by the testing team.
- Provided round-the-clock support for mappings in the production environment and improved the performance of existing production mappings.
Environment: Informatica Power Center 9.1, Visio, Oracle 11g, Toad 11g, Erwin 7.2, ALM Version 11, Unix Script.

Merial Pharmaceuticals, Atlanta, GA - Duration: Feb 2013 - July 2013
ETL Developer
Merial Pharmaceuticals is an international company with its head office in France.
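An SCD Type 2 delta load of the kind described above is commonly implemented as an expire-then-insert pair of statements: first close out current rows whose attributes changed, then insert new versions (and brand-new keys) with an open-ended effective date. The dimension and staging names below are invented for illustration, not the actual SAGE schema:

```sql
-- Illustrative SCD Type 2 delta load; names are hypothetical.
-- Step 1: expire the current version of any row whose tracked attributes changed.
UPDATE dim_account d
   SET d.eff_end_dt  = SYSDATE,
       d.current_flg = 'N'
 WHERE d.current_flg = 'Y'
   AND EXISTS (SELECT 1 FROM stg_account s
                WHERE s.account_id = d.account_id
                  AND (s.account_name <> d.account_name OR s.status <> d.status));

-- Step 2: insert a fresh current version for changed keys and brand-new keys.
INSERT INTO dim_account (account_key, account_id, account_name, status,
                         eff_start_dt, eff_end_dt, current_flg)
SELECT dim_account_seq.NEXTVAL, s.account_id, s.account_name, s.status,
       SYSDATE, DATE '9999-12-31', 'Y'
  FROM stg_account s
 WHERE NOT EXISTS (SELECT 1 FROM dim_account d
                    WHERE d.account_id = s.account_id
                      AND d.current_flg = 'Y');
```

Because step 1 flags changed rows as no longer current, the NOT EXISTS test in step 2 picks up both changed keys and entirely new ones in a single pass.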
In the USA, operations are performed from Atlanta, Georgia; the company was recently bought by Sanofi, whose U.S. head office is in New Jersey. Data is sent daily through IDOCs and Great Plains, and all of it is loaded into the CRM database.
- Worked closely with the design team, sharing ideas for building a Mexico, Argentina, and Uruguay data warehouse and suggesting schema names.
- Helped develop the star schema, enforcing referential integrity among facts and dimensions.
- Assisted in assigning the attributes required for 22 facts and dimensions, and granted privileges to various users for testing and data validation.
- Data is loaded into the GDW database's NGDWSRC schema as SAP IDOCs, then transformed and loaded into the NGDWSTG schema with the integration of CRM data from Great Plains.
- Data loaded into NGDWSTG is then transposed into the final NGDWDM schema using incremental loads.
- Developed new mappings, loaded 22 fact and dimension tables, and improved mapping performance through tuning.
- Implemented SCD Type 1 and SCD Type 2 logic at various stages of loading data into the final tables.
- Deployed mappings across environments from DEV to QA and finally to PROD using deployment groups and export/import.
- Dynamically created parameters in the target directory and used them at various session levels for incremental loads.
- Wrote new functions and stored procedures and called them from various mappings.
Environment: Informatica Power Center 9.1, Visio, Oracle 11g, Toad 11g, SQL Server 2008.

Parexel (Pfizer Group), Boston, MA - Duration: Jun 2012 - Feb 2013
ETL Developer
Parexel International is a research company based in the U.K., with offices in many places in the U.S.A. and worldwide.
Research work done at Pfizer is sent to the RDM and USDMC servers, from where the source files are FTPed into CAL components. The data loaded into the CAL components is then segregated into RDM (Research Data Management) and CDM based on the study ID.
- Created new mappings to load the fact tables, maintaining referential integrity with the dimension tables based on the study IDs.
- Loaded the history tables in parallel with the fact tables as SCD Type 2, with new and existing timestamps.
- Optimized existing mappings before promotion to production by checking sorted inputs when joining tables from multiple databases at the Joiner transformation, and by checking the column sequence used to compute the checksum for CDC.
- Loaded data into the EDW schemas from the CAL components RDM and CDM, timestamping the records.
- After successful loading, the EDW schemas are loaded into the ODS schemas via incremental loads.
- Refactored the SQL queries for loading tables in the R3 and R4 environments based on business requirements.
- Dynamically created parameters in the target directory and used them in workflows for various sessions when doing incremental loads.
- Deployed mappings using deployment groups or export/import from Dev to Test for further testing, and finally to Prod, in the Informatica Repository Manager.
- Analyzed the SQL queries sent and created mappings based on them, improving mapping performance by balancing the load between the database and Informatica.
- Created Web Intelligence reports, individually and with the team, using Business Objects.
- Analyzed the universe, its entities, their relationships, and their join conditions.
Environment: Informatica Power Center 9.5, Visio, Oracle 11g, Toad 11g, Erwin 7.2, Business Objects 3.1.

YRC Freight, Kansas City, MO - Duration: Aug 2011 - May 2012
ETL Lead Developer
YRC Freight is a freight company that delivers shipments worldwide. The company is based in various places in America and Canada, from which shipments are sent to locations worldwide; the American locations include Reddaway, Holland, and New Penn. Source files arrive every day from these locations and are loaded into the data warehouse using Informatica. The source files, built from cost incurred and revenue earned, are acquired using a VB script on the Windows server and placed in the source folder as .dat files, with a copy placed in the archive folder as backup. Events wait for the files to be received, after which they are processed and loaded into relational databases.
- Migrated workflows from Teradata prod to Oracle prod using Informatica Power Center 9.0.1.
- All workflows were tested in the QA Oracle environment and then migrated into Oracle prod.
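The checksum comparison used for CDC in the Parexel work above depends on a fixed column order: concatenating the same columns in a different sequence yields a different hash for identical rows. A rough sketch, with invented table names and Oracle's ORA_HASH standing in for whatever checksum function the real jobs used:

```sql
-- Illustrative checksum-based CDC: pick up rows that are new, or whose hashed
-- attribute string differs between stage and warehouse. The column order in
-- the concatenation must be identical on both sides of the comparison.
SELECT s.study_id, s.site_id, s.subject_cnt
  FROM stg_study s
  LEFT JOIN edw_study e
    ON e.study_id = s.study_id
 WHERE e.study_id IS NULL                                -- new row
    OR ORA_HASH(s.site_id || '|' || s.subject_cnt)
       <> ORA_HASH(e.site_id || '|' || e.subject_cnt);   -- changed row
```

The delimiter between columns matters too: without it, values such as ('ab', 'c') and ('a', 'bc') concatenate to the same string and hash identically.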
- They were thoroughly optimized for speed and improved performance.
- All workflows were assigned to an integration server after successful migration into the various environments.
- Monitored daily loads and followed up with users about delayed or missing source files so the daily loads, and the reports triggered from them, could finish.
- Kicked off various workflows using scheduled tasks and command tasks, and deleted source files immediately after loading so the next day's files could arrive and load without duplicating data.
- Processed monthly loads using the TCG spin-off process, which pulls records for the current month and subsequent months to load costs.
- Kept in frequent touch with the database team to analyze stored procedures used for direct inserts or updates to particular backup tables.
- Triggered reports on completion of the daily loads.
- Designed and developed Informatica data maps to pull data from source systems, implementing filter criteria useful when pulling the data.
- Worked with the team on Informatica ILM to reduce storage, control data growth, and improve database and application performance as part of effective application information life-cycle management.
- Enhanced ETL routines to maintain current and history data in major base tables using CDC (change data capture) with Informatica Type 2 logic.
- Created Web Intelligence reports, individually and with the team, using Business Objects.
- Analyzed the universe, its entities, their relationships, and their join conditions.
Environment: Informatica Power Center 9.0.1, Visio, Oracle 10g, Teradata, Toad, Erwin 7.2, Business Objects 4.0.

Shire Pharmaceutical, Chesterbrook, PA - Duration: Feb 2011 - Aug 2011
ETL Lead Developer
Shire Pharmaceutical, a Jersey-registered, Irish-headquartered company with branches all over the world, is one of the leading pharmaceutical companies; its brand products include Vyvanse and Adderall. Its sales reps have assigned territories in which they work and aggregate revenue. The EDW team's main purpose was to capture the hours each sales rep put in and the sales they generated for their territory, helping the company assign target sales and make the efforts needed to improve sales in that territory.
- Involved in low-level design documentation with the team, and analyzed the high-level data flow against the requirements provided.
- Worked as a team on workflow integration and represented the team in coordinating with the next level of management as necessary.
- Performed incremental loads from staging to the hub using parameters at both the session and workflow levels.
- Responsible for unit testing; migrated Informatica mappings and sessions to environments such as testing and production, and resolved post-production issues for existing mappings according to user needs.
- Created headers and footers for flat files, and checked data quality (IDQ) against the tolerance levels set by company standards to accept or reject data into the relational tables.
- Worked on account management, product management, and territory management for the sales reps, making data changes for newly recruited reps and territory updates for existing ones.
- Worked on slowly changing dimensions (Types 2 and 3), and assigned flags for logical deletes in the mappings.
Environment: Informatica Power Center 9.0.1, Power Exchange, Force.com, Apex Explorer, SOQL, Netezza, SQL/PL SQL, Oracle 10g, Visio 2010, Erwin, UC4, SSIS.

Education and Certifications
Degree: Masters in IT