SAP MDG BODS Resume - Charlotte, NC

Candidate Information
Name: Hari
Title: SAP MDG BODS
Target Location: US-NC-Charlotte
Hari
PHONE NUMBER AVAILABLE
EMAIL AVAILABLE

Professional Summary
14 years of experience in IT across SAP MDG on S/4HANA 2022, SAP BODS, Information Steward, BW on HANA, Data Services and Tableau, plus 2 years in Microsoft .NET and software testing
SAP MDG data modeling, UI and process modeling, DRF, BRF+, CBA, validations, workflows
SAP MDG Editions, Mass Change, File Upload, extending data models, Change Requests
Expertise in SAP MDG 9.2/9.1 data, process and UI modeling and data replication in the MDG-M/S/C/F domains on S/4HANA 1809/1909/2022
SAP MDG data domains: Material Master / Business Partner (Customer, Supplier) / Finance
Involved in the analysis, design, development, testing, implementation and support phases of projects for SAP MDG, SAP BW, BO, HANA, BODS and Microsoft .NET
Extensively worked on BW CompositeProviders, ADSOs, InfoCubes, DSO objects, InfoObjects, InfoSources, DataSources, transformations, DTPs, MultiProviders, transfer rules and update rules
Good knowledge of generic extraction and LO extraction
Extensively worked on Web Analyzer and BEx query development for report creation
Experience with the CO-PA extraction process and the reports built on it
Proficient in handling errors that occur during Profitability Analysis (CO-PA) data loads
Familiar with loading data to the CO-PA data targets COPAo01B and COPA_T02A
Involved in Business Objects administration: creating users and groups, scheduling and publishing reports, and planning security, authentication and single sign-on for reports developed in Business Objects 4.2/4.0
Used Open Hub Services to push data out of SAP BI using BODS
Worked with the Match, Address Cleanse and Data Cleanse transforms for US data cleansing in BODS Data Quality
Used Data Integrator and Platform transforms extensively in BODS
Worked extensively with the Sales & Distribution team to design data migration from legacy systems to SAP using BODS
Exporting and importing metadata in native HANA; expertise in native HANA modeling concepts and HANA Live (RDS) views
Good experience with PL/SQL (DDL, DML, DCL, stored procedures) using native HANA as the database
Good experience with the data provisioning techniques (SDI, SLT and SAP BODS), modeling constructs (Attribute, Analytic and Calculation views) and reporting solutions (the BOBJ suite of tools) with native HANA information models as a source
Experience with the Business Objects modules Designer, Web Intelligence, Central Management Console and InfoView, Crystal Reports, Lumira and Design Studio
Experience with the data visualization tools Lumira and Tableau
Good knowledge of universe design and of developing complex ad hoc reports, business views, lists of values and reporting functionality such as master/detail, slice and dice, drill-down, filters, ranking, sections, graphs and breaks
SME in dashboards, best practices in dashboard design, KPI formulation & web reporting, and developing information spaces and exploration sets in BOBJ
Exposure to Oil & Gas, Pharmaceutical, Retail, Telecommunications, Production and Sales data warehouses
S/4HANA Migration Cockpit (LTMC) - Migration Object Modeler (LTMOM)
Legacy master data (Business Partner/Material/Financials) & transactional data migration to S/4HANA 1909
Good knowledge of Eclipse-based BW/4HANA & S/4HANA (1809/1909)
Technical Skill Set
ERP Packages : SAP BW 7.0, 7.3, 7.4, ECC 6.0
Master Data Management : SAP MDG 9.2, 9.1, 9.0, DRF, data modeling (DM), process modeling (PM), UI modeling, BRF+, DQ search
Business Intelligence Tools : SAP Business Objects (WebI, Crystal Reports 2013), AO, OBIEE
Data Visualization : BO Lumira 1.31/2.0, Design Studio 1.6, Tableau 10/10.3
ETL Tools : BODS 4.1/4.2, BODI, Talend 7.1.1/7.2.1, SAP RDM on S/4HANA
Data Replication Techniques : SLT, SDI, SDA
Databases : HANA 1.0/2.0, Oracle 10g/11g, SQL Server 2008/2012/2016
Microsoft Technologies : .NET 2005, C#
Programming Languages : ABAP/4 for BW

Trainings
SAP MDG on S/4HANA 2022/2021/1909/1809 (Material, BP, Customer, Supplier, Financial) master data
Big Data Insight / Talend Open Studio: Data Integrator, job design, joins, filters, Oracle/flat-file loads
SAP BPC 10.0
Lumira Discovery 2.0

Assignment Summary

Client : HALLMARK CARDS / KANSAS CITY, MO
Environment : SAP MDG on S/4HANA 2022, SAP BODS 4.3 (14.3), SQL Server 2016, Teradata, mainframe systems
Duration : Nov 2022 to Present
Key Roles and Responsibilities:
Extending data models (Material/Business Partner) in SAP MDG on S/4HANA
Streamlining approval process design with SAP MDG workflows
Replicating & troubleshooting data to SAP Data Services, Ariba SLP & third-party systems with DRF IDocs/SOA in SAP MDG on S/4HANA
SAP MDG-C integration & replication to SAP C4C (Cloud for Customer)
Setting up MDG DRF filters to third-party systems
Governing UIBBs at the workflow step level in MDG on S/4HANA 2022
Understand BAdIs for parallel workflows / dynamic selection of agents in SAP MDG
Inbound: migration from mainframe files to the CSG merchandise planning tool using BODS 4.3
Move file location data from MicroStrategy to the CSG merchandise planning tool
Run stored procedures from SAP DS/BODS (an illustrative sketch follows this section)
IDoc/BAPI loads and ABAP dataflows using SAP BODS
Perform data conversions and SAP DS code migration
Outbound: load data from the CSG merchandise planning tool to the RDW (Teradata) for MicroStrategy reporting
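Illustrative sketch for the stored-procedure bullet above: in a BODS job this is typically a one-line script step (for example via its sql() function); the Python/pyodbc equivalent below shows the same idea outside BODS. The connection details and the procedure name usp_refresh_merch_plan are hypothetical placeholders, not the actual project objects.

```python
# Hypothetical sketch: invoke a SQL Server stored procedure from an ETL step.
# Connection string and procedure name are placeholders.
import pyodbc

def run_refresh(load_date: str) -> None:
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=sql-host;DATABASE=MERCH;Trusted_Connection=yes;"
    )
    cur = conn.cursor()
    # The {CALL ...} ODBC escape is the portable way to invoke a procedure.
    cur.execute("{CALL dbo.usp_refresh_merch_plan (?)}", load_date)
    conn.commit()
    conn.close()

run_refresh("2023-01-31")
```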
Client : ENERGIZER / SAINT LOUIS
Environment : SAP MDG on S/4HANA 2021, SAP BODS 4.2, SQL Server 2018, S/4 1909, SAP MDG 9.2, Fiori
Duration : Feb 2022 to August 2022
Key Roles and Responsibilities:
Understand the SAP Data Services, MDG hub & co-deployment landscape
Design MDG BRF+ workflows based on user agent approval
SAP MDG-S integration & replication to SAP Ariba SLP
Define parallel workflows for multi-distribution-center approval using BAdIs in SAP MDG on S/4HANA
Authorization and governance settings in SAP MDG for business users
Data migration from legacy to SAP S/4 2021 using BODS 4.2
Migrate data using IDoc interfaces / BAPIs / function modules
Master data: BP/Customer/Supplier, Material
DRF: Material Master replication to ERP using ALE IDocs
Perform data conversions
SAP MM data migration including plant / storage location / tax classification / sales organization

Client : MOOG INC / NY
Environment : SAP MDG on S/4HANA 1909, SAP BODS 4.2, HANA 2.0, S/4 1909, Fiori
Duration : July 2021 to Feb 2022
Key Roles and Responsibilities:
Change Request setup for the Finance 0G domain using Editions in MDG 1909
Set up BRF+ validations for MM data
Perform Data Services, mass change and file upload in SAP MDG 1909
Activate BC Sets in MDG for Change Requests
SAP MDG-S data replication to SRM
Data migration from legacy to S/4HANA 1909 using BODS 4.2
Migrate data using IDoc interfaces
Master data: BP/Customer/Supplier, routings, document info records, BOMs, material, equipment and purchase info record migration
Perform data conversions
Migrate data using the Fiori app (flat files / staging tables)
Knowledge of SAP Syniti ADM field mapping, data source connections and executions
SAP MDG/MDQ: configure MDQ (remediation) on product data with BRF+ validation rules

Client : TENNECO / MI
Environment : SAP BODS 4.2, BW 7.4, SAP ECC/CRM/SCM, Azure, HANA 2.0, S/4HANA 1809
Duration : Dec 2020 to May 2021
Key Roles and Responsibilities:
Implementation/AMS
Data migration from SAP ECC & flat files to Data Services and SAP HANA with BODS 4.2
Data migration from SAP ECC & flat files to Microsoft Azure Data Lake
Defect fixes / data reconciliation
Data Services production system improvement measures
Perform ad hoc loads & archive source files once the data load to HANA 2.0 completes
S/4HANA Migration Cockpit (LTMC) - Migration Object Modeler (LTMOM) for Material, Cost Center etc.
CPS Redwood workload automations / scheduling parameters / preconditions / mappings, similar to UC4

Client : ECOLAB / MN
Environment : MDG 9.1 on S/4HANA, SAP BODS 4.2, BW 7.4, SAP ECC/CRM/SCM, Big Data, Azure, Cloudera
Duration : July 2019 to July 2020
Key Roles and Responsibilities:
Implementation/AMS
Involved in SAP MDG functional (data, process, UI) modeling, Data Services and the Data Replication Framework
Flex/reuse modes (hub & co-deployment) layout architecture
Used the hub deployment method for the MDG implementation: data resides in the MDG system, and approved data is replicated to ECC for Customer/Vendor and Material; Finance data is replicated using SOA and ALE services
Material/Business Partner/Customer/Supplier/Financials master data replication to S/4HANA
Create/change/block/unblock/mass change for Material/Business Partner/Customer/Supplier
Understanding Type 1 / Type 3 / Type 4 entities in data modeling
Configuring business objects / business activities / logical actions for Change Requests
BRF+ and static workflow steps in decision tables for Change Requests
Troubleshoot FPM page errors in UI modeling
Perform the initial load using SAP BODS in hub-reuse storage mode from ECC
Perform Data Quality search & duplicate-check validation and set up HANA search for all domains
Design and monitor the cutover checklist and go-live plans
Understand BAdIs / interfaces / methods / access classes for Material/Business Partner
Troubleshoot SAP MDG Change Requests & workflows during AMS activity
Designed and developed data ingestion (replication/ETL/ELT), conceptual/logical/physical data models, data marts, analytics and visualization, and performance
Ingest Data Services and SAP ECC SCM full & delta loads to the cloud (MS Azure Blob container)
Ingest SAP CRM/ECC BW extractor data & flat files to the cloud (MS Azure Blob)
Master data attributes, texts & transactional data
Build DSOs/Open Hub and ingest Nalco system data to Azure Blob
Collect and import BW transports to production
Build batch jobs and perform delta initialization in SAP Data Services
Automate batch jobs with a load-type variable for full, delta and ad hoc loads (see the sketch after this section)
Expose custom extractors to the ODP API for consumption in Data Services
Import extractors in query & CDC mode
Perform archive operations and MD/TD data reconciliation with the source
Define primary keys, delta type & metadata for extractors (ECC, SCM, CRM)
Assist the Hadoop team with schema creation, schema enhancement, FL, DL and data validation
Automate the SAP DS jobs in UC4
Data migration via LTMC - LTMOM to S/4HANA using files & staging tables with BODS/Data Services
Troubleshooting the Data Services BW extractor delta monitor
Recovering files from the archive to Azure Blob
Analyze data in Azure Databricks / Azure Data Lake / Azure Data Factory
Recovering failed deltas from BW extractors (SCM) in ODQMON
Data Services production system improvement measures
Handling data reconciliation issues between Azure Data Lake and ECC/CRM/BW
Set up a datastore connection to Attunity and extract full loads to Azure using BODS
Knowledge of Python scripts
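The load-type variable mentioned above usually reduces to one conditional that widens or narrows the extraction window. A minimal Python sketch of that dispatch follows; the table and column names (src_material, changed_on) and the load-type values are hypothetical, standing in for a BODS global variable such as $G_LOAD_TYPE.

```python
# Hypothetical sketch: build the extraction filter for FULL / DELTA / ADHOC
# runs, the way a load-type variable steers a batch job's source query.
from datetime import date, timedelta
from typing import Optional

def build_filter(load_type: str, last_run: date,
                 adhoc_from: Optional[date] = None) -> str:
    """Return the WHERE-clause fragment for the requested load type."""
    if load_type == "FULL":
        return "1 = 1"  # no restriction: reload the whole table
    if load_type == "DELTA":
        return f"changed_on >= '{last_run.isoformat()}'"  # since last run
    if load_type == "ADHOC":
        if adhoc_from is None:
            raise ValueError("ADHOC load needs an explicit start date")
        return f"changed_on >= '{adhoc_from.isoformat()}'"
    raise ValueError(f"unknown load type: {load_type!r}")

# Example: yesterday's delta
print("SELECT * FROM src_material WHERE "
      + build_filter("DELTA", date.today() - timedelta(days=1)))
```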
Client : DuPont / DE
Environment : SAP MDG 9.1, SAP BODS 4.2, Tableau, SQL Server 2012, SAP HANA 2.0, BI 4.2, BW
Duration : December 2018 to May 2019
Key Roles and Responsibilities:
Implementation/enhancements
Involved in configuring SAP MDG Material, Supplier, Customer and Finance master data
Extend Material/Customer/Supplier with new entities in SAP MDG 9.1
Involved in Data Services; troubleshooting DRF data replication and workflow issues
Define Change Request actions and understand scenarios for parallel workflows
Data replication and monitoring of Change Requests in SAP MDG
Troubleshoot BODS connectivity issues with source systems, SQL Server and HANA
Develop a code migration methodology for SAP BODS across the Dev, Test and Prod landscape
Migrating Specialty Products (SPECCO) and CORTEVA (AGCO) master and transactional data
Master data includes Customer Master / GL Account / Profit Center / Material Master / Trade / Plan / Company Code / Cost Element / Cost Center etc.
Transactional data includes open orders / cost center / daily sales / GL
Sources include SAP & non-SAP (flat files/JSON/Excel)
Worked on the Data Quality transforms Address Cleanse / Match / Data Cleanse
Read inbound SharePoint files and batch numbers and load them automatically to the extraction layer
3-layer architecture: extract data from the source and load it to the output layer
Further load data from the staging layer to the data inbound / data hub
Target system data is posted to HANA 2.0
Perform ad hoc loads & archive source files once the data load to HANA 2.0 completes
Analyze the reports built on the target system with Tableau
Wrote PL/SQL queries to extract source data to Data Services staging
Managed error handling and success results to trace the data migration
Worked on incremental loading for all dimensions after the initial load from stage to the data mart (star schema); see the sketch after this section
Data migration to SAP HANA Enterprise and building information models (calculation views), SQL scripting
Involved in unit and integration testing of reports, then exported them to the schedule manager for scheduling and refreshing
Performed data conversions/functions before migrating data
Worked on the Data Integrator transforms History Preserving, Table Comparison and XML_Pipeline
Setting up datastore connections to read source data from multiple source systems in BODS
Extensively used the BODS Management Console to schedule and execute jobs, manage repositories and perform metadata reporting
Extracted larger volumes of data (SDI) using the ABAP dataflow transform into the HANA DB in BODS
Created BODS jobs, workflows, dataflows and scripts using various transforms (Integrator, Quality and Platform) to load data from multiple sources into the desired target
Retrieved data using SQL SELECT statements as part of performance tuning in Data Services 4.2
Worked on scripts reading files from a folder using a WHILE loop in Data Services
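The stage-to-star-schema incremental load mentioned above is, at its core, a watermark pattern: remember the high-water timestamp of the last run and upsert only newer staging rows. A self-contained sketch using sqlite3 follows; the real targets were SQL Server and HANA, and all table names here are hypothetical.

```python
# Hypothetical sketch of watermark-driven incremental loading (stage -> dim).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_customer (cust_id TEXT, name TEXT, changed_on TEXT);
CREATE TABLE dim_customer (cust_id TEXT PRIMARY KEY, name TEXT, changed_on TEXT);
CREATE TABLE etl_watermark (table_name TEXT PRIMARY KEY, last_ts TEXT);
INSERT INTO etl_watermark VALUES ('dim_customer', '2019-01-01');
INSERT INTO stg_customer VALUES
  ('C1', 'Acme', '2019-02-01'),   -- newer than the watermark: loaded
  ('C2', 'Bolt', '2018-12-01');   -- older: skipped
""")

last_ts = conn.execute(
    "SELECT last_ts FROM etl_watermark WHERE table_name = 'dim_customer'"
).fetchone()[0]

# Upsert only the delta: rows changed since the last successful run.
conn.execute("""
INSERT INTO dim_customer (cust_id, name, changed_on)
SELECT cust_id, name, changed_on FROM stg_customer WHERE changed_on > ?
ON CONFLICT(cust_id) DO UPDATE SET
  name = excluded.name, changed_on = excluded.changed_on
""", (last_ts,))

# Advance the watermark so the next delta starts where this one ended.
conn.execute("""
UPDATE etl_watermark
SET last_ts = (SELECT MAX(changed_on) FROM dim_customer)
WHERE table_name = 'dim_customer'
""")
conn.commit()
print(conn.execute("SELECT * FROM dim_customer").fetchall())  # only C1 loaded
```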
Client : Canon / NY
Environment : BODS 4.2, BI 4.2 (WebI), Lumira, Oracle 11g, XML, web services, SAP HANA 2.0
Duration : April 2018 to November 2018
Key Roles and Responsibilities:
Handling priority/problem tickets, incidents and housekeeping activities
Troubleshooting Data Services/BODS connectivity issues with source systems and the SOAP web service
Develop a code migration methodology for SAP BODS across the Dev, Test and Prod landscape
Migrating internal, partner, admin, principal, marketing, promotion, service, pricing and super users
Master data, user core attributes, roles and sell-to/bill-to information
Sources include the CNA UAM Oracle schema and LZ (flat file)
The target system is a SOAP web service; interim data is posted to an Oracle DB
Validate against the HR database for internal or external user classification
Generating a unique identity for each user using a function (see the sketch after this section)
Convert user-level codes for internal and partner users based on requirements
Worked on SQL transforms, DB links and stored procedures
Wrote PL/SQL queries to extract source data to Data Services staging
Consolidate data from source systems and build the XML structure
Managed Data Services error handling and success results to trace every user migration
Worked on incremental loading for all dimensions after the initial load from stage to the data mart (star schema)
Data migration to SAP HANA Enterprise and building information models (calculation views), SQL scripting
Sending and receiving IDocs from SAP through BODS integration systems
Involved in unit and integration testing of reports, then exported them to the schedule manager for scheduling and refreshing
Wrote and performed data conversions/functions in PL/SQL on Oracle 11g before data migration
Worked on BO universes (IDT), WebI and Crystal Reports per business requirements
Information Steward for data profiling, validation rules, scorecards and metadata management
Worked on the Data Integrator transforms History Preserving, Table Comparison and XML_Pipeline
Setting up datastore connections to read source data from multiple source systems in BODS
Worked on scripts reading files from a folder using a WHILE loop in Data Services
Extensively used the BODS Management Console to schedule and execute jobs, manage repositories and perform metadata reporting
Created BODS jobs, workflows, dataflows and scripts using various transforms (Integrator, Quality and Platform) to load data from multiple sources into the desired target
Extracted larger volumes of data (SDI) using the ABAP dataflow transform into the HANA DB in BODS
Retrieved data using SQL SELECT statements as part of performance tuning in Data Services 4.2
Extensively worked on data validation and reconciliation across BW, ECC, BO, HANA and Data Services
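The unique-identity bullet above depends on a function that is deterministic, so reruns of the migration assign the same ID to the same user. The sketch below shows one common way to get that property with a name-based UUID; the namespace, source-system and attribute choices are hypothetical, since the actual project function is not reproduced here.

```python
# Hypothetical sketch: derive a stable, unique identity per migrated user.
import uuid

NAMESPACE = uuid.uuid5(uuid.NAMESPACE_DNS, "uam.example.com")  # placeholder

def user_identity(source_system: str, login: str) -> str:
    """Same inputs always produce the same ID, keeping reruns idempotent."""
    return str(uuid.uuid5(NAMESPACE, f"{source_system}:{login.lower()}"))

# Case differences in the login do not create duplicate identities.
assert user_identity("CNA_UAM", "JDoe") == user_identity("CNA_UAM", "jdoe")
print(user_identity("CNA_UAM", "jdoe"))
```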
Client : AMGEN (Life Sciences) / NJ & Bangalore
Environment : SAP HANA 1.0, BODS 4.2, BI 4.2, Tableau, BW 7.4, IP
Duration : July 2015 to Feb 2018
Key Roles and Responsibilities:
Handling priority/problem tickets, incidents and housekeeping activities
Monitoring and scheduling process chains
Handling RFCs through the ServiceNow tool
Created native HANA information models: Attribute, Analytic and Calculation views
Worked on explain-plan tables, visual plans and SQL tracing as part of performance tuning in native HANA
Configured and replicated data from SAP / non-SAP sources to SAP HANA using SAP Landscape Transformation (SLT); extracted external system data using SDI/SDA
Built graphical calculation views using table functions as data sources for complex calculations in native HANA
Worked on performance optimization of SQL queries/models using Explain Plan, Visualization Plan and the expensive-statements trace
Good experience working with HANA-optimized Calculation Engine functions (CE functions for data source access & relational operators)
Built calculation views on virtual tables with the Smart Data Access (SDA) mechanism in HANA
Converted Attribute and Analytic views to Calculation views to reduce data transfer between engines in HANA
Created SQLScript-based calculation views
Analyzed and fixed real-time replication (SLT) delta failures in HANA
Followed best practices and performance tuning in designing and building HANA information models
Transported HANA model objects using HANA ALCM from source to target systems
Functional verification activities for the Development, Quality and Integration systems
Enhancing reports in Lumira and WebI; worked on a Lumira 2.0 proof of concept with HANA as the source
Worked on linking datasets and building and publishing Lumira stories to the BI platform
Created reports with formatting options such as charts, grouping, sorting, alerters, ranking, breaks, sections and parameterized prompts
Involved in unit and integration testing of reports, then exported them to the schedule manager for scheduling and refreshing
Modified existing BO universes and WebI reports per business requirements
Created Crystal Reports when needed and exported them to the enterprise repository to make them accessible
Supported Business Objects WebI reports and universe-handling issues through incidents
Information Steward for data profiling, validation rules, scorecards and metadata management
Extensively used the BODS Management Console to schedule and execute jobs, manage repositories and perform metadata reporting
Worked on scripts reading files from a folder using a WHILE loop in Data Services (see the sketch after this section)
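The folder-reading WHILE loop above is a standard Data Services scripting pattern: loop while files exist, load each one, then move it to an archive folder so it is not picked up twice. A Python sketch of the same control flow follows; the paths and the *.csv mask are hypothetical.

```python
# Hypothetical sketch of the BODS "while file_exists(...)" folder-drain loop.
import shutil
from pathlib import Path

INBOX = Path("/data/inbound")     # placeholder paths
ARCHIVE = Path("/data/archive")

def process(path: Path) -> None:
    # Stand-in for the dataflow that actually loads the file's rows.
    print(f"loading {path.name}")

def drain_inbox() -> int:
    """Handle every file present, re-checking in case new files arrived."""
    handled = 0
    files = sorted(INBOX.glob("*.csv"))
    while files:
        for f in files:
            process(f)
            shutil.move(str(f), str(ARCHIVE / f.name))  # archive after load
            handled += 1
        files = sorted(INBOX.glob("*.csv"))
    return handled

print(drain_inbox())
```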
Client : STATOIL (Oil & Gas) / Capgemini / Bangalore
Environment : SAP BODS, BO 4.0, Tableau, HANA SPS 11, IP, BPC 10.0, BW 7.0
Duration : Jan 2014 to June 2015
Key Roles and Responsibilities:
Involved in the Statoil BO/BW transition from IBM to Capgemini
Handling priority/problem tickets, incidents and housekeeping activities
Monitoring and scheduling process chains
Handling RFCs through the ServiceNow tool
Classic and SQLScript analytic privileges; assigning users to restrict data access in native HANA
SAP process areas involved: SD, CRM, HR, Treasury & Payments, Accounting & Controlling, Plant O&M
Business Objects reports on HANA modeling constructs (Analytic, Attribute & Calculation views)
Created data visualization dashboards using embedded Excel sheets integrated with BEx queries
Worked on complex reports requiring running totals, sub-totals, sub-reports, cross-tabs and conditional display of groups, fields and messages
Created, managed & formatted standard reports, master/detail reports and cross-tab reports using Report Manager
Developed Crystal Reports and performed unit testing
Migration of BO reports, folders and universes from one repository to another
Supported Business Objects WebI reports and universe-handling issues through incidents
Developed and executed unit-test and integration-test plans for BO reports
Data load experience to SAP BW and SuccessFactors as targets using Data Services
Extensively worked on Data Services (code migration, error handling, recovery mechanisms, IDocs)
Making use of the SQL transform to read data from the source system (Oracle) in BODS
Setting up projects, jobs, workflows and dataflows and grouping them uniquely while moving across environments in BODS
Extensive use of Data Integrator and Platform transforms such as Key_Generation, Table_Comparison, Case, Merge, Validation, SQL and Query (see the sketch after this section)
Created profile and central repositories to manage data consistency in BODS
Migrated data for SAP FICO and HCM module objects such as vendors, customers, materials, purchase orders, cost centers and profit centers, and transactional data such as AP, AR and GL, to a single ECC 6.0 system in BODS
Performed full pushdown, source- & target-based performance tuning and bulk loading in BODS
Worked on the Data Services Management Console to create users and grant privileges to access repositories
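Of the transforms listed above, Table_Comparison and Key_Generation form a common pair: compare incoming rows to the target on a natural key, classify each as insert/update/unchanged, and hand new rows a surrogate key. The sketch below mimics that behavior on toy in-memory data; it illustrates the pattern, not the transforms' internals.

```python
# Hypothetical sketch of the Table_Comparison + Key_Generation pattern.
target = {"M-100": {"sk": 1, "desc": "Bolt M100"}}   # natural key -> row
next_sk = 2                                          # next surrogate key

incoming = [
    {"matnr": "M-100", "desc": "Bolt M100 zinc"},    # changed   -> update
    {"matnr": "M-200", "desc": "Washer M200"},       # new       -> insert
    {"matnr": "M-100", "desc": "Bolt M100 zinc"},    # identical -> discard
]

for row in incoming:
    key = row["matnr"]
    if key not in target:                     # comparison says: INSERT
        target[key] = {"sk": next_sk, "desc": row["desc"]}
        next_sk += 1                          # key generation for the new row
    elif target[key]["desc"] != row["desc"]:  # comparison says: UPDATE
        target[key]["desc"] = row["desc"]
    # identical rows fall through, as the transform would discard them

print(target)
```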
Client : Philadelphia Energy Solutions (Energy, Oil & Gas) / Capgemini / Bangalore
Environment : SAP BODS, BI 7.3, SQL Server
Duration : June 2013 to December 2013
Key Roles and Responsibilities:
Involved in designing the technical specification document for PES_BW
Involved in the build phase of the project
Developed BI dataflows using flat-file extraction
Responsible for creating test scripts and documentation
Involved in data loading & process chain monitoring
Involved in unit testing, test case reviews and documentation
Created Crystal Reports when needed and exported them to the enterprise repository to make them accessible
Design and develop BODS batch jobs for Material Master, purchase orders, Asset Master and batch determination
Use of try-catch and while-loop transforms to split records into multiple files in BODS
Use BODS conditional clauses such as WHERE, GROUP BY and ORDER BY to restrict records
Generating tab-delimited output files per the load program
Publishing data quality reports for quick status meetings and decision making
Assisting business users in modifying and correcting data based on the data quality report
Objects worked on as part of the data reconciliation activity in BODS: PO vendor master, asset PO, WBS master, asset location, cost center

Client : Target (Retail) / Capgemini / Bangalore
Environment : SAP BI 7.3, BO XI R3, SQL Server
Duration : December 2012 to June 2013
Key Roles and Responsibilities:
Involved in requirements gathering and designing the technical specification document for Project Systems
Involved in the build and UAT phases of the Project Systems module (master data and transactional data)
Responsible for creating test scripts and documentation
Involved in planning and coordination of tasks
Involved in report testing, test case reviews and documentation
Involved in unit testing, test case reviews and documentation
Migration of existing SSIS reports to Crystal Reports from scratch
Migrated the reports using LCM and the Import Wizard
Developed and executed unit-test and integration-test plans for BO reports
Creation and maintenance of BO universes and reports
Worked on the CMC module, creating user groups and folders and managing security in BO

Client : Centrica (Gas & Energy) / Capgemini / Bangalore
Environment : SAP BODS, BI 7.3, SAP CRM, SAP IS-U, Oracle
Duration : June 2012 to December 2012
Key Roles and Responsibilities:
Involved in designing the technical specification document for the British Gas data warehouse
Involved in the build phase of BGDW (master data and transactional data)
Developed BI dataflows using the LSA methodology
Involved in enhancing CRM & IS-U data sources
Responsible for creating test scripts and documentation
Involved in data loading & process chain monitoring
Test case reviews and documentation
Involved in unit testing, test case reviews and documentation
Understanding the client's business setup and structuring the object repository in Business Objects Data Services
Developed and implemented solutions with data warehouse, ETL, data analysis and BI reporting technologies
BODS transforms used most: Case, Merge, Sequence Generator, Query, Map Operation, Table Comparison, SQL and Validation
Created dataflows to load data from flat files and CSV files in various formats into the data warehouse
Involved in creating batch jobs for data cleansing and address cleansing
History preserving: capturing slowly changing dimension data at every interval, script writing, and preparing the data from an initial-load perspective in BODS (see the sketch after this section)
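History preserving as described above is the SCD Type 2 pattern: a changed attribute closes the current version of the row and opens a new one with fresh validity dates, so history survives. A minimal sketch follows; the column layout and dates are illustrative, not the project's actual dimension design.

```python
# Hypothetical sketch of SCD Type 2 (what the History_Preserving transform does).
from datetime import date

HIGH_DATE = date(9999, 12, 31)   # open-ended "valid to"
dim = [{"cust": "C1", "segment": "Retail",
        "from": date(2012, 1, 1), "to": HIGH_DATE, "current": True}]

def apply_change(cust: str, new_segment: str, valid_from: date) -> None:
    """Close the current version and append a new one when an attribute changes."""
    for row in dim:
        if row["cust"] == cust and row["current"] and row["segment"] != new_segment:
            row["to"] = valid_from       # close out the old version
            row["current"] = False
            dim.append({"cust": cust, "segment": new_segment,
                        "from": valid_from, "to": HIGH_DATE, "current": True})
            return

apply_change("C1", "Industrial", date(2012, 9, 1))
print(dim)   # two versions of C1; only the latest is flagged current
```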
Client : Fossil (Lifestyle Brands) / Capgemini / Bangalore
Environment : SAP BI 7.3, Oracle
Duration : December 2011 to May 2012
Key Roles and Responsibilities:
Involved in requirements gathering and technical pre-upgrade and post-upgrade activities
Responsible for technical pre-upgrade and post-upgrade activities
Responsible for creating test scripts and documentation
Involved in planning and coordination of tasks
Coordination in India with respect to testing
Involved in unit testing, test case reviews and documentation

Client : Vodafone (Telecom) / IBM / Bangalore
Environment : SAP BI 7.0, BO, ECC 6.0, Oracle
Duration : Jan 2010 to Nov 2011
Key Roles and Responsibilities:
Interacting with business analysts for requirements gathering and clarifications; provided weekly status updates on project progress
Created cubes, DSOs, MultiProviders, InfoSets, transformations, InfoPackages and DTPs
Involved in implementing logic at the start-routine level
Created process chains for automating the data load process
Created Open Hub destinations for extracting data from InfoCubes into flat files/tables
Involved in developing interaction analysis queries using restricted key figures, calculated key figures, filters, free characteristics and variables
Worked on report-to-report interfaces (RRI) to drill down to detailed reports
Involved in transporting objects such as InfoObjects, queries, DSO objects, routines and programs into the BI test system after validating and testing them in development
Prepared project documents such as the transport checklist, go-live checklist and support document, and provided support training to the support teams
Created unit test cases and performed unit testing
Extensively involved in system testing and user acceptance testing
Created Crystal Reports, including graphical, formula-based and well-formatted reports, according to user requirements
Enhancements to BO universes and reports
Interacted with users to analyze diverse BO reports and schedule them
Involved in creating WebI reports, Crystal Reports and dashboards according to user requirements

Client : Vodafone (Telecom) / IBM / Bangalore
Role : Application Developer
Environment : C#, Windows application, SQL Server 2005
Duration : March 2007 to Dec 2009
Description: The Unified Front End (UFE) was built as a single interface for Vodafone stores and customer care services. The legacy system that existed prior to UFE used Siebel and SAP applications to support call centre and store activities. Even when optimized, these applications were not process oriented, did not allow optimal service times for repetitive, simple operations, implied substantial training periods, and required extensive software licensing.
The main objectives behind the creation of UFE were:
A single interface that can be customized according to the user profile
An interface independent of backend systems (smart client)
Ease of use (reduced training for users and short handling times)
Key Roles and Responsibilities:
Creation of the High-Level Design (HLD) document from the feasibility document
Involved in preparing the detailed design (LLD) document
Preparing the unit test plan and unit test result documents
Decision making on the UI design of new applications; suggesting changes to the customer
Coding the functionality from the detailed design (LLD) documents
Database-related activities and implementation
Preparing release notes using open-source wiki software
Writing scripts for database-related changes
Developing Windows services and web services

Client : Royal Bank of Canada (Banking, Canada) / iGate / Bangalore
Environment : C#, ASP.NET, SQL Server 2000, JavaScript
Duration : July 2006 to Jan 2007
Key Roles and Responsibilities: designing UI, coding, unit testing
Safekeeping is a banking project to generate fee files for a specific range of Canadian regions: Ontario, International and Quebec
Developed on the OPS Tools framework; the modules are Excluded Accounts, Fee File Generation, Tax Master, Confirmation Page and Fee File Generation Report
The OPS Tools framework is designed with XML and web services using YDO0_DEV (SQL Server) as the database
Workflow of the project: the UI consists of user controls
The Excluded Accounts screen displays previously excluded accounts; these accounts can also be re-included for fee file generation
On the Account Filtration / Fee File Generation pages, accounts are checked or unchecked with the available checkboxes
The fee file is displayed on the confirmation screen, and a PDF report can be generated
Tax Master updates tax details (GST, QST, HST) for the Ontario, International and Quebec regions

Education
Bachelor's Degree (B.Tech), CSIT (India)
