Candidate's Name
Email: EMAIL AVAILABLE
Contact: PHONE NUMBER AVAILABLE

PROFESSIONAL SUMMARY:
- 12 years of work experience in the development and implementation of data warehousing solutions.
- 10+ years of experience in SAP BODS development and in data migration from SAP ECC to SAP ECC and SAP S/4HANA using SAP BODS.
- Retail domain experience on SAP S/4HANA (Customer Master and Vendor Master).
- Experience uploading data into SAP S/4HANA Public Cloud using LTMC.
- Worked on the ETL tool SAP BusinessObjects Data Services 4.x/3.x and on all Data Services transforms (Data Integrator, Platform and Data Quality transforms).
- Experience with SAP IS (Information Steward).
- Good experience with IDocs, LSMW and LTMC.
- Experience in designing, developing and scheduling BODS ETL jobs sourcing from both SAP and non-SAP (flat files, CSV, Excel and databases) source systems (a simple sketch of this pattern follows the responsibilities below).
- Used built-in BODS functions, custom functions, conditional workflows and the BODS scripting language, and used the debugger to run BODS jobs in debug mode.
- Exposure to the BODS Management Console.
- Involved in production support activities and worked in a production support role; prepared support project documentation.
- Exposure to the Remedy tool.
- Hands-on experience across the SDLC, including requirements gathering, planning, design, prototyping, development, testing, implementation, documentation and user training.
- Involved in source code repository activities.

Key Achievements
- Awarded multiple performance awards, including Team Spot, DW Client Manager Appreciation and Technical Appreciation awards, for work delivered for several clients within Capgemini.
- Received the Customer Excellence Award.

Key Responsibilities as SAP Data Services Application Developer
- Analysis, design and implementation of software features using the SAP BusinessObjects Data Services 4.x ETL tool.
- Interface with the Business Analysis, Quality Assurance and Project Management teams to resolve issues related to requirements, scope, testing and schedule.
- Analyze and fix defects raised by the Quality Assurance team.
- Performed scaling and sizing while procuring SAP Data Services, and carried out end-to-end installation of SAP Data Services 4.x with Information Platform Services (IPS) on a standalone machine.
- Skills: SAP Data Services 4.x ETL tool, SAP BusinessObjects XI and 4.0.
- Experience in the Manufacturing, Retail and Insurance domains.

Key Responsibilities as a Technical Lead
- Work with the onsite counterpart, database architect and client to prepare the data model and the ETL mapping document.
- Hold internal review meetings with the team to walk through requirements and the ETL mapping document, and with the architect to resolve modelling issues where required.
- Lead the team in implementing new requirements by providing the ETL mapping document and helping resolve technical issues.
- Manage deployment and support the offshore team, ensuring that periodic tasks are completed by each team member.
- Assign work, prepare the schedule plan and coordinate with staff to ensure efficiency and productivity.
- Create business processes, best practices, standards, templates and operating procedures to optimize IT project development.
- Maintain the risk register and update it based on issues discussed on the weekly call.
- Provide status updates to the manager by keeping the schedule plan current.
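The summary above mentions BODS jobs that source flat files and CSVs into staging tables. Purely as an illustration of that extract-transform-load pattern outside of BODS, here is a minimal, hypothetical Python sketch; the file name, column names and staging table are invented for the example and do not come from any of the projects below.

```python
import csv
import sqlite3

# Hypothetical staging load: read a delimited source file, apply a trivial
# cleansing step, and bulk-insert the rows into a staging table.
# File name, columns and table are illustrative only.
def load_csv_to_staging(csv_path: str, db_path: str) -> int:
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_customer "
        "(customer_id TEXT, name TEXT, city TEXT)"
    )
    with open(csv_path, newline="", encoding="utf-8") as fh:
        reader = csv.DictReader(fh)
        rows = [
            # Trim whitespace and upper-case the city, standing in for the
            # Query-transform logic a BODS data flow would normally hold.
            (r["customer_id"].strip(), r["name"].strip(), r["city"].strip().upper())
            for r in reader
        ]
    conn.executemany("INSERT INTO stg_customer VALUES (?, ?, ?)", rows)
    conn.commit()
    conn.close()
    return len(rows)

if __name__ == "__main__":
    print(load_csv_to_staging("customers.csv", "staging.db"), "rows staged")
```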
CAREER PROFILE (in chronological order, starting from the most recent)

Key Competencies & Skills
Operating System: Unix and Windows
Designer Tools: BODS Designer
ETL Tools: SAP BODS
Database Tools: Teradata, Oracle, SQL Server, SAP HANA, SAP IS (Information Steward)

DATES | ORGANIZATION NAME | DESIGNATION
Aug 2022 to Till Date | POPULAR BANK | BODS Developer and Team Lead
Mar 2019 to 2022 | INFOSYS | BODS Lead and Solution Architect
Apr 2018 to Feb 2019 | CONAGRA FOODS | BODS Developer and Team Lead
Jun 2017 to March 2018 | ULTA BEAUTY | BODS Developer and Team Lead
Oct 2016 to May 2017 | KELLOGG | BODS Developer and Onsite Coordinator
Oct 2011 to Oct 2016 | CAPGEMINI | ETL Developer and Team Lead

ASSIGNMENT DETAILS
The details of the various assignments I have handled are listed here (in reverse chronological order).

CLIENT NAME: POPULAR BANK
PROJECT TITLE: OUTREACH
PERIOD: Jan 2022 to Till Date
TEAM SIZE: 8
ROLE: Team Lead / BODS Developer
DESIGNATION: Senior BODS Consultant
PROJECT SYNOPSIS: To track and identify customers that are due for an update, a system-generated activity is created in Customer Insight. Customers that are due for an update are identified by their risk level and the last time their information was updated. For individuals, the fields to be updated or verified are the customer profile address, occupation and employer; for organizations, they are the customer profile address and account-level NAICS codes. To identify those customers and create the activities, a few fields and calculations must be added to MDM, and those calculations must be performed daily to keep the information accurate. MDM obtains the risk score from DCI to determine the customer's risk level, obtains the customer's last update date, and calculates the next update due date; from the next update due date and the frequency it calculates the trigger date required to create the activity (sketched below). A set of activities is created for customers that are due for an update and uploaded to CI. The project scope is to design the ETL for this process.
RESPONSIBILITIES:
- Building the BODS jobs, workflows and data flows as per the mapping specification.
- BODS extracts the data from MDM, BIC and DCI.
DATABASE: SQL, Oracle
TOOLS: SAP BODS 4.2
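A minimal sketch of the due-date logic described in the OUTREACH synopsis above: given a customer's last update date and a review frequency derived from the risk level, compute the next update due date and the trigger date for creating the activity. The frequency table and lead time below are assumptions made for illustration, not the bank's actual rules.

```python
from datetime import date, timedelta

# Assumed review frequencies per risk level (illustrative values only).
REVIEW_FREQUENCY_DAYS = {"HIGH": 365, "MEDIUM": 730, "LOW": 1095}
# Assumed lead time between the trigger date and the due date.
TRIGGER_LEAD_DAYS = 30

def next_update_due(last_update: date, risk_level: str) -> date:
    # Next update due date = last update date + frequency for the risk level.
    return last_update + timedelta(days=REVIEW_FREQUENCY_DAYS[risk_level])

def trigger_date(due: date) -> date:
    # Trigger date = due date minus the lead time needed to create the activity.
    return due - timedelta(days=TRIGGER_LEAD_DAYS)

if __name__ == "__main__":
    due = next_update_due(date(2021, 5, 1), "HIGH")
    print("Next update due:", due)                 # 2022-05-01
    print("Activity trigger:", trigger_date(due))  # 2022-04-01
```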
CLIENT NAME: SOUTHERN CALIFORNIA EDISON (SCE)
PROJECT TITLE: SCMT
PERIOD: Jan 2022 to Aug 2022
TEAM SIZE: 8
ROLE: Team Lead / BODS Developer
DESIGNATION: Senior BODS Consultant
PROJECT SYNOPSIS: The scope and cost estimate management tool replacement is an effort to first replace the scope and cost reporting platform, which stores multiple versions of project estimates within ten years of historical data, and to automate data integration between InEight and the transactional systems C55 and MDI.
RESPONSIBILITIES:
- Building the BODS jobs, workflows and data flows as per the mapping specification.
- BODS extracts the data from InEight using web services.
- Loaded C55/MDI data into the InEight system.
DATABASE: S/4HANA
TOOLS: SAP BODS 4.2, SAP BW, S/4HANA, InEight

CLIENT NAME: KRAFT HEINZ - CHICAGO, USA
PROJECT TITLE: Galaxy_MDG
PERIOD: Apr 2019 to Dec 2021
TEAM SIZE: 15
ROLE: Team Lead / BODS Developer
DESIGNATION: Senior BODS Consultant
PROJECT SYNOPSIS: The purpose of this project is to implement the SAP Master Data Governance solution for the Vendor and Customer domains. The project is a global template starting with the current implementation of the Kraft Heinz business processes. SAP MDG 9.1 on S/4HANA is being implemented. It is a hub architecture in which MDG is the single source of truth: SAP MDG is the central authoring system and the source system for current and future downstream systems. The BODS tool is used for extraction, transformation and loading of the initial master data from Catalyst to MDG.
RESPONSIBILITIES:
- Building the BODS jobs, workflows and data flows as per the mapping specification.
- BODS extracts active vendor/customer master data from all vendor/customer master tables defined in the same session.
- Generate an XML file for key mapping containing the MDG vendor/customer number and the Catalyst vendor/customer number (sketched below).
- Generate the preload report and provide it to the functional team and the Kraft Heinz team to validate and approve.
DATABASE: SQL Server
TOOLS: SAP BODS 4.2, SAP ECC, SQL Server, SAP HANA
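A small sketch of the key-mapping file generation mentioned in the Galaxy_MDG responsibilities above: pairing a Catalyst vendor number with its corresponding MDG vendor number and writing the pairs to an XML file. The element names and sample numbers are invented for illustration; the real file layout would follow the MDG key-mapping specification.

```python
import xml.etree.ElementTree as ET

# Illustrative (Catalyst number, MDG number) pairs; in the project these would
# come from the BODS extraction of the vendor master tables.
key_pairs = [("100045", "VN0000123"), ("100078", "VN0000124")]

root = ET.Element("KeyMapping")
for catalyst_no, mdg_no in key_pairs:
    entry = ET.SubElement(root, "Vendor")
    ET.SubElement(entry, "CatalystNumber").text = catalyst_no
    ET.SubElement(entry, "MDGNumber").text = mdg_no

# Write the mapping file that the validation teams would review.
ET.ElementTree(root).write("vendor_key_mapping.xml", encoding="utf-8", xml_declaration=True)
```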
CLIENT NAME: CONAGRA FOODS - CHICAGO, USA
PROJECT TITLE: OTM_TA
PERIOD: Apr 2018 to March 2019
TEAM SIZE: 15
ROLE: Team Lead / BODS Developer
DESIGNATION: Senior BODS Consultant
PROJECT SYNOPSIS: Sixteen reports are developed in this project. They are based on the existing reports at ConAgra, with sources coming from OTM; the source tables are replicated into a new data warehouse location and modelled to meet the business requirements. This data foundation layer is consumed by the reporting layer for visualization dashboards as well as detailed analytical/operational reports. BODS is used as the ETL tool, and data is extracted from SQL Server and Oracle into an Oracle database. This keystone data warehouse is used as the base for all reporting requirements.
RESPONSIBILITIES:
- Building the BODS jobs, workflows and data flows as per the mapping specification.
- Used push-down and the bulk loader in BODS to move data from CSV files to the staging area and from staging to the EDW tables.
- Used various transforms such as Query, SQL, Case, Map Operation, Table Comparison, Pivot and Merge.
- Used BODS scripts and global variables.
- Prepared the unit test documents and captured the results in the UTP.
- Involved in BODS job and database code tuning.
- Created batch jobs using SAP BODS to transfer data from SAP ECC, SAP BW, SQL Server and flat files to SAP HANA.
- Data migration from different data sources to different target databases.
DATABASE: SQL Server, Oracle
TOOLS: SAP BODS 4.2, SAP ECC, SAP BW, SQL Server, Oracle

CLIENT NAME: ULTA BEAUTY - CHICAGO, USA
PROJECT TITLE: GUEST 360
PERIOD: Jun 2017 to March 2018
TEAM SIZE: 15
ROLE: Team Lead / BODS Developer
DESIGNATION: Senior BODS Consultant
PROJECT SYNOPSIS: The Guest 360 data model consists of a single ORS (G360_ORS) that stores all guest records. The data model being implemented is the Informatica Customer 360 accelerator data model, with modifications made to meet the requirements of the Guest 360 project. BODS is used as the ETL tool, and data is extracted from SQL Server and Oracle into an Oracle database. This keystone data warehouse is used as the base for all reporting requirements.
RESPONSIBILITIES:
- Implemented a load-balancing technique at the BODS and SAP ECC level.
- Building the BODS jobs, workflows and data flows as per the mapping specification.
- Used push-down and the bulk loader in BODS to move data from CSV files to the staging area and from staging to the EDW tables.
- Used various transforms such as Query, SQL, Case, Map Operation, Table Comparison, Pivot and Merge.
- Used BODS scripts and global variables.
- Prepared the unit test documents and captured the results in the UTP.
- Involved in BODS job and database code tuning.
- Created batch jobs using SAP BODS to transfer data from SAP ECC, SAP BW, SQL Server and flat files to SAP HANA.
- Data migration from different data sources to different target databases.
DATABASE: SQL Server, Oracle
TOOLS: SAP BODS 4.2, SAP ECC, SAP BW, SQL Server, SAP HANA, Oracle

CLIENT NAME: KELLOGG - CHICAGO, USA
PROJECT TITLE: ISM - INVENTORY
PERIOD: Oct 2016 to May 2017
TEAM SIZE: 15
ROLE: Team Lead / BODS Developer
DESIGNATION: Senior BODS Consultant
PROJECT SYNOPSIS: The objective of the ISM project's INVENTORY workstream is to create a multi-regional, unique data repository for inventory data measures that business analysts can use to calculate key business process indicators, generate insights that drive optimal supply chain decisions, unlock this data for all supply chain planning needs, and provide the required inputs to downstream supply chain processes. BODS is used as the ETL tool, and data is extracted from SAP ECC and BI sources into HANA and SQL databases. This keystone data warehouse is used as the base for all reporting requirements.
RESPONSIBILITIES:
- Implemented a load-balancing technique at the BODS and SAP ECC level.
- Involved in production support activities.
- Building the BODS jobs, workflows and data flows as per the mapping specification.
- Used push-down and the bulk loader in BODS to move data from CSV files to the staging area and from staging to the EDW tables.
- Used various transforms such as Query, SQL, Case, Map Operation, Table Comparison, Pivot and Merge (the Table Comparison pattern is sketched below).
- Used BODS scripts and global variables.
- Prepared the unit test documents and captured the results in the UTP.
- Involved in BODS job and database code tuning.
- Created batch jobs using SAP BODS to transfer data from SAP ECC, SAP BW, SQL Server and flat files to SAP HANA.
- Data migration from different data sources to different target databases.
DATABASE: SQL Server
TOOLS: SAP BODS 4.2, SAP ECC, SAP BW, SQL Server, SAP HANA
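The responsibilities above repeatedly mention the Table Comparison transform, which classifies incoming rows as inserts or updates against the target before loading. A rough Python equivalent of that delta-detection idea, with hypothetical key and column names:

```python
# Hypothetical delta detection in the spirit of the BODS Table Comparison
# transform: classify source rows as INSERT or UPDATE against the target,
# keyed on a primary-key column.
def compare_tables(source_rows, target_rows, key="customer_id"):
    target_by_key = {row[key]: row for row in target_rows}
    inserts, updates = [], []
    for row in source_rows:
        existing = target_by_key.get(row[key])
        if existing is None:
            inserts.append(row)   # new key -> insert
        elif existing != row:
            updates.append(row)   # key exists but values changed -> update
    return inserts, updates

source = [{"customer_id": "C1", "city": "CHICAGO"}, {"customer_id": "C2", "city": "DALLAS"}]
target = [{"customer_id": "C1", "city": "EVANSTON"}]
ins, upd = compare_tables(source, target)
print(len(ins), "inserts,", len(upd), "updates")  # 1 inserts, 1 updates
```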
EMPLOYER: CAPGEMINI
CLIENT NAME: UNILEVER
PROJECT TITLE: CONNECT
PERIOD: Oct 2011 to Oct 2016
TEAM SIZE: 40
ROLE: Team Lead / L3 Support Engineer
DESIGNATION: Senior Consultant
PROJECT SYNOPSIS: The client currently has independent data marts for its locations spread across the world. This project creates an integrated data warehouse on top of the existing data marts. BODS is used as the ETL tool, and data is extracted from the SAP BI source into a Teradata database. This data warehouse is used as the base for all reporting requirements. The project also captures all retail activity for the Westcon group, managing different message types such as sales orders, advance ship notices and invoices. Different reports are generated for end users, such as product-wise statistical information, order-wise reports and client-wise statistics reports.
RESPONSIBILITIES:
- Implemented a load-balancing technique at the BODS and SAP ECC level.
- Involved in production support activities.
- Building the BODS jobs, workflows and data flows as per the mapping specification.
- Implemented Open Database Connectivity (ODBC) to allow MS Excel to access Oracle-compliant data sources (sketched below).
- Used push-down and the bulk loader in BODS to move data from CSV files to the staging area and from staging to the EDW tables.
- Used various transforms such as Query, SQL, Case, Map Operation, Table Comparison, Pivot and Merge.
- Used BODS scripts and global variables.
- Prepared the unit test documents and captured the results in the UTP.
- Worked as an L3 support engineer.
- Involved in BODS job and database code tuning.
- Hands-on experience with the third-party scheduling tool Tivoli to manage jobs according to their dependencies.
- Set rights and privileges on repositories; created and maintained different user groups as per business needs.
- Created batch jobs using SAP BODS to transfer data from SAP ECC, SAP BW, Oracle, DB2, SQL Server, XML and flat files to Teradata.
- Data migration from different data sources to different target databases.
- Support and maintenance of existing jobs.
DATABASE: Teradata, Oracle, DB2, SQL Server
TOOLS: SAP BODS 4.x, Teradata, SAP ECC, SAP BW, Oracle, Remedy 8.x, SQL Server
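As a rough illustration of the ODBC connectivity mentioned in the CONNECT responsibilities above, the following sketch opens an ODBC data source and runs a query through the pyodbc package. The DSN name, credentials, table and query are placeholders; the real setup would use the DSN configured for the Oracle source.

```python
import pyodbc  # third-party ODBC binding; assumes an ODBC DSN is already configured

# Placeholder DSN, credentials and query for illustration only.
conn = pyodbc.connect("DSN=ORACLE_DW;UID=report_user;PWD=change_me")
cursor = conn.cursor()
cursor.execute("SELECT product_id, net_sales FROM sales_summary FETCH FIRST 10 ROWS ONLY")
for product_id, net_sales in cursor.fetchall():
    print(product_id, net_sales)
conn.close()
```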