Candidate Information
Title: Data Integration Warehouse
Target Location: US-NJ-Voorhees
Candidate's Name
Technical Architect / Technical Lead

Summary
Technical Lead/Architect professional with more than 16 years of experience in Data Warehouse and Data Integration technologies, serving clients in the telecommunications, banking, insurance, and manufacturing industries. Excellent industry experience in designing, implementing, supporting, and performance-tuning data integration/warehousing solutions. Continuously aims to reduce the total cost of solutions and improve availability and redundancy.

Core Competencies
- Design and development of data integration solutions
- Design and development of data models and data marts
- Design and implementation of data flows from multiple source systems
- Data warehousing and data migration
- Requirement gathering, analysis, and optimal technical solutions
- Development of technical and design specifications
- Development and testing review
- Leadership and communication
- Performance tuning
- Process improvement and process modeling
- Executing an onsite/offshore team structure
- Managing and expanding cross-functional teams
- Data analytics

Tools and Technologies
Informatica Intelligent Cloud Services (IICS), IDMC, Informatica PowerCenter 7.1/8.6/9.1/10.2/10.5, Informatica Data Quality 10.1.1, SAP BODS, Business Objects XI 3.1, Apache Airflow, Databricks, Snowflake, Oracle, PL/SQL, Teradata, Teradata BTEQ, SQL Server, DB2, SAP, EC2, HDFS, UNIX, Spark, JIRA, Agile, ServiceNow, HP ALM, Docker, Kubernetes, Git

Certifications: AWS Certified Solutions Architect - Associate

Professional Summary
- More than 16 years of experience in Data Integration/Data Warehouse technologies, with a strong background in designing and developing data loading strategies, extracting data from multiple sources, and implementing transform and load.
- Designed and developed multiple data flows from heterogeneous sources such as data lakes, HDFS, flat files, and Oracle to data warehouses/data marts, leveraging Informatica, Teradata BTEQ, Python/shell scripting, Spark, and IICS.
- Analyze, optimize, and redesign long-running ETL jobs, queries, and scripts; design database views and models for optimal performance.
- Extensive project work with Informatica PowerCenter, Informatica Data Quality, SAP BODS, and Talend, and with reporting tools such as Business Objects and OBIEE.
- Extensive experience with databases such as Teradata and Oracle and procedural languages such as PL/SQL and BTEQ.
- Solid knowledge of performance tuning, query optimization, and industry best practices.
- Hands-on experience guiding teams to successful project delivery; maintain high standards of ETL solution development through coaching and mentoring; in-depth knowledge of managing large teams by ensuring best practices.
- Good knowledge of scheduling tools such as Airflow, Cybermation, and Control-M.
- Define technical requirements and document plans for project lifecycle deployment, including scheduling of project deliverables, budgets, and timelines.
- Direct all IT deployments and job scheduling, with oversight of vendor and consultant management.
- Coordinate internal IT projects, ensuring implementations are ready by client deadlines.
- Proven ability to resolve complex problems, quickly diagnosing and identifying issues and determining the proper resolutions.
- Excellent stakeholder management skills, with experience across business, data, application, and technology domains.
- Entry-level expertise in Informatica MDM.
- Experience in complete project execution under Agile methodology, estimating user stories (requirements) and mentoring and providing technical support to the team.

Educational Qualifications
Bachelor of Technology in Computer Science and Engineering, National Institute of Technology Silchar (formerly known as REC)

Projects Worked on at Virtusa

Comcast: EBI Data Operations - Sales and Compensation (August 2021 - Till Date)
Role: ETL Architect/Technical Lead
Technology: Informatica Intelligent Cloud Services (IICS), Informatica PowerCenter 10.5.2, Teradata, BTEQ, PySpark, Unix scripting, Python, JIRA, ServiceNow, Airflow, Databricks, HDFS, AWS, MinIO, Data Lake, Snowflake, Docker, Kubernetes
Description: The EBI Sales and Compensation team manages sales and commission data for sales reps across different sales channels. Data from multiple channels, hosted on platforms such as the data lake, AWS, HDFS, Oracle, and flat files, is loaded into a Teradata-hosted data warehouse. Data extracts are sent from the warehouse to different business vendors, and Tableau reporting is performed by multiple business divisions.
Responsibilities:
- Designed and developed multiple data flows from heterogeneous sources such as data lakes, HDFS, flat files, and Oracle to the Teradata data warehouse, leveraging Teradata BTEQ, Python/shell scripting, Spark, and IICS.
- Analyze, optimize, and redesign long-running BTEQ jobs and scripts.
- Design database views and models for Tableau reporting for optimal performance.
- Work with the data stewardship and data architect teams to gather requirements, estimate, and design optimal solutions.
- Schedule and develop data pipelines in Apache Airflow.
- Lead sprint planning and daily scrum calls.

M&T Bank: Enterprise Party Hub - Master Data Management (Sep 2020 - June 2021)
Role: ETL Architect/Technical Lead
Technology: Informatica PowerCenter 10.1.1, Informatica Data Quality 10.1.1, Informatica MDM, Unix scripting, Oracle, PL/SQL, JIRA, ServiceNow, Atomic Scheduler
Description: The M&T Enterprise Party Hub (EPH) is a master data management system that sources golden party records derived from different source systems. EPH receives party information from around 30 source systems, including real-time feeds, and determines the golden party based on the trust hierarchy. EPH is the source of truth for multiple applications, and new sources are being added to provide more value to customers.
Responsibilities:
- Perform the ETL architect role to add new source systems to the EPH hub.
- Analyze backlog tickets and allocate them on the Kanban board for implementation.
- Create PL/SQL programs for ETL jobs.
- Optimize existing SQL scripts for better performance.
- Analyze source system data and incorporate it into the EPH hub.
- Implement new business requirements; design and implement the solutions.
- Support the MDM technical team in providing data to the EPH staging layer.
- Support SIT and UAT with test data and production deployment.
- Provide technical solutions to the production support team for daily batch failures.
- Technical reviews and development support for the offshore team.

BlueCross BlueShield of Western New York: HN - EIS Data Integration and Delivery - Insurance (Apr 2018 - Aug 2020)
Role: Associate Architect/Technical Lead
Technology: Informatica PowerCenter 10.2, Unix scripting, Oracle, PL/SQL, Snowflake, JIRA, ServiceNow, FACET, CAVE Market Basket System
Description: HN - EIS Data Integration and Delivery is a multi-year program in which the ETL team enhances and maintains the existing ETL interfaces that load data from multiple source systems, such as FACET, into data warehouses and upstream ETL interfaces, along with vendor files. The ETL team is also responsible for new projects, developing ETL solutions to provide data for new vendors.
Responsibilities:
- Analyze the business requirements from the DA and design the ETL solutions.
- Develop ETL interfaces to support business requirements, including vendor data.
- Create stored procedures and functions in PL/SQL for ETL mappings.
- Maintain and enhance existing ETL mappings.
- Technical reviews and development support for the offshore team.
- Complete backlog tickets and allocate them to the offshore team.

Experience Prior to Virtusa - Wipro Limited

Marsh & McLennan Companies: CANSYS OF12 Integration - Insurance (Jun 2017 - Feb 2018)
Role: Technical Lead
Technology: Informatica PowerCenter 9.1, Oracle, Business Objects 4.2, PL/SQL, Unix scripting
Description: The Marsh Canada financial system is being migrated to Oracle Financials 12 (OF12), which will be the source for multiple interfaces and financial reporting. A data mart is created to stage data from OF12 and serves as the source for multiple reports and interfaces. The ETL team is responsible for creating the Informatica interface that loads data from the OF12 application into the data mart, along with Business Objects reports for displaying daily transactions and monthly revenue.
Responsibilities:
- Analyze the business requirements and design the ETL solution.
- Develop the data mart, including DDL and DML scripts.
- Create Informatica ETL jobs to load the data mart from OF12.
- Collect requirements for Business Objects reporting.
- Responsible for unit, QA, and UAT testing of the CANSYS ETL and BO applications.
- Responsible for production changes and post-go-live support.

Corning Incorporated: HCM Replacement - Manufacturing (June 2016 - May 2017)
Role: Technical Lead
Technology: Informatica PowerCenter 9.1, Oracle, SQL Server, PL/SQL, Unix scripting, PeopleSoft, SuccessFactors
Description: Project HR2.0 is a global implementation project that replaces Corning's human resource system, PeopleSoft, with a new HR management system, SuccessFactors. The first phase of the project replaces the Human Resources module; a second phase is planned to replace the Payroll module and the remaining business functional areas. PeopleSoft supplies data for 74 Human Resources-related interfaces using Informatica PowerCenter as the ETL tool. The ETL interfaces team is responsible for accommodating this source system change across all 74 interfaces and ensuring the business can operate as usual after the change.
Responsibilities:
- Work closely with the solution architect, data modeler, and business analyst to understand business requirements, providing expert knowledge and solutions on data warehousing and ensuring delivery of business needs in a timely, cost-effective manner.
- Source-to-target mappings, ETL design/development, solution tuning and optimization, and test strategy definition and implementation.
- Review deliverables and identify gaps and areas for improvement.
- Provide and review detailed technical specifications and ETL documentation required in each phase of the SDLC, complying with internal standards and guidelines.
- Identify opportunities to optimize the ETL environment; implement monitoring, quality, and validation processes to ensure data accuracy and integrity.
- Provide post-implementation support, responding to problems quickly with both short- and long-term resolutions.

Sealed Air: BI/BW Support - Manufacturing (March 2015 - May 2016)
Role: Technical Lead
Technology: Talend, Business Objects 6.5 and XI 3.1, SAP, Linux, Oracle
Description: Sealed Air BI/BW support is a multi-year program in which the Wipro BI team supports various Talend ETL interfaces and Business Objects reporting. Talend connects multiple production data sources and supplies data to different interfaces, in both real time and batch, ensuring business users receive accurate data for their processes. Business Objects is used as the reporting tool, and the support team handles all technical issues and user access management. Issues are addressed through service tickets, and projects requiring fresh development are also part of this program.
Responsibilities:
- Working as a technical lead in this program.
- Resolve end users' technical issues related to data.
- Resolve Business Objects report issues and user access control.
- Develop new interfaces for projects arising from new business requirements.
- Resolve issues proactively and ensure quality of service.
- Lead the technical team for all development and issues.

Schneider Electric: Installed Base Management - Manufacturing (May 2014 - Feb 2015)
Role: Technical Lead
Technology: Talend, Jaspersoft, MySQL, Linux, AWS
Description: Complexities in the supply chain make tracking assets between countries and distribution centers challenging. Low data quality in heterogeneous data sources and significant partner-led sales and distribution restrict Schneider's ability to trace assets, and diversity in cultures, operations, processes, and systems across countries limits a standardized approach to asset traceability. This application standardizes asset tracking and generates a pipeline of opportunities in all countries for offers such as modernization, service plans, and preventive maintenance, based on technical cleansing. The Installed Base data mart helps identify potential leads and the corresponding business opportunities; to achieve this, it requires information from the different source systems described in the sections below.
Responsibilities:
- Worked as a data analyst during the initial stage of the project.
- Identified the source systems and the scope of data for the solution.
- Identified the relationships between source systems and subject areas.
- Designed the data mart for the solution.
- Performed database tuning for better performance.
- Designed the logical and physical models of the data mart.
- Produced the ETL requirement specification document and ETL design.
- Developed ETL mappings to load data into the data mart.
- Led the technical team for all development and issues.

Outotec: Outotec MDM Near-Term Solution - Manufacturing (Oct 2013 - Apr 2014)
Role: Technical Lead
Technology: SAP BODS, MS SQL Server 2008, Unix, Oracle
Description: To bring the required level of optimization to the running of its plants, Outotec plans to leverage information technology to build an Operations & Maintenance Platform (O&M Platform). The O&M Platform is envisaged to operate through an information hub that collects readings from plant equipment instrumentation and stores them for analysis by engineers and maintenance experts. The hub also records failure events that trigger notifications for action. In alignment with the prescribed master data strategy, it was decided that the architecture for the near-term solution would be based on the capabilities of ETL and data quality tools.
Responsibilities:
- Understand the business requirements from the onsite client.
- Propose technical solutions based on the business requirements.
- Delivered a proof of concept based on the business requirement.
- Presented the proof of concept to the client; the solution was well received.
- Design the technical solution for the business requirements.
- Manage a small team of three people.

Electrolux: Compass SAP Global Rollout - Manufacturing (Sep 2011 - Sep 2013)
Role: Technical Lead
Technology: Informatica Data Quality, Informatica PowerCenter, LSMW, SAP
Description: Compass SAP Global Rollout is a multi-country data migration project that helps the customer implement a global solution for business in multiple countries. A global standard metadata model is used for all conversion extracts during the Compass rollouts. The metadata model is based on the SAP data model and contains the elements where the project expects input from legacy systems and/or the local business. The metadata model is divided into a number of conversion objects, and each conversion object can be split across several extract files.
Responsibilities:
- Onsite role coordinating between the onsite and offshore teams.
- Collect requirements from the business and process CPE.
- Ensure deliverables are on time and meet client requirements.
- Help the offshore team with technical solutions.
- Set up the data migration environment for each mock load and the production load.
- Data quality lead for the project.

Johnson and Johnson: Crossroads Migration Track - Manufacturing (Dec 2009 - Sep 2011)
Role: Senior ETL Developer
Technology: Informatica Data Quality, Informatica PowerCenter, LSMW, SAP
Description: The Crossroads Migration Track program helped Johnson and Johnson create a global ERP system that collects data from the different companies acquired by JnJ. Data is extracted from legacy systems from multiple sources and loaded into SAP systems; almost 25 objects were developed for migrating legacy data into SAP.
Responsibilities:
- Wrote the Technical Design Specification (TDS) for all three objects.
- Developed code.
- Wrote unit test cases and performed unit testing for all three objects.
- Performed TUT, TAT, cutover rehearsal, integration test cycles, iterative loads, and system testing for all three objects.
- Provided coding and design guidelines on the project.

Mahindra Satyam

Johnson and Johnson: Crossroads Migration Track - Manufacturing (Feb 2008 - Sep 2009)
Role: ETL Developer
Technology: Informatica Data Quality, Informatica PowerCenter, DB2, LSMW, SAP
Description: The Crossroads Migration Track program helped Johnson and Johnson create a global ERP system that collects data from the different companies acquired by JnJ. Data is extracted from legacy systems from multiple sources and loaded into SAP systems; almost 25 objects were developed for migrating legacy data into SAP.
Responsibilities:
- Developed mappings, workflows, and sessions.
- Prepared mapping design documents.
- Worked on test cases and test results.

Contact Information
Name: Candidate's Name
Primary Phone: PHONE NUMBER AVAILABLE
Primary Email: EMAIL AVAILABLE
Current Address: 1138 Bibbs Rd, Voorhees, New Jersey 08043
Visa Status: H1-B (I-140 approved)
