Candidate's Name
Data Architect, Data Modeler, Data Migration, DW BI Developer
Canonsburg, PA 15317
Tel: PHONE NUMBER AVAILABLE
E-mail: EMAIL AVAILABLE
SUMMARY
- 18+ years of IT experience across multiple roles: Cloud Data Architect, Software Architect, Data Modeler, Data Migration & Integration, Data Warehouse/BI ETL Lead/Developer, and Analyst.
- Enterprise architecture experience across the Information, Integration, Security, and Application domains.
- Expertise in cloud-based database solution design, modernization, and microservices (Domain Driven Design) models for Specialty Pharmacy (ScriptMed Cloud) and Property & Casualty products.
- Implemented an enterprise data warehouse and data marts using IBM BDW and Fiserv InformEnt.
- Expertise in data pipeline framework design using Informatica Intelligent Cloud Services (IICS).
- Expertise in a variety of data modeling styles: Domain Driven Design (DDD), CDM/LDM/PDM, canonical data modeling, reference hierarchies, metadata models, and reference data models.
- Exposure to Data Mesh, Data Fabric, and hybrid architecture frameworks supporting micro-batch and real-time data processing to deliver Data as a Product to data consumers.
- Enabled and implemented a data democratization process to make data available to a wider range of authorized users across the organization's data domains.
- Experienced in database migration (Ora2pg) and ETL migration from IICS to GCP with Python.
- Exposure to GCP (Google Cloud) services: GKE, Cloud Storage, Cloud SQL, Bigtable, BigQuery, Dataflow, Pub/Sub, Airflow, PostgreSQL.
- Expertise in modern application and database architecture and design.
- Familiar with Azure data services, Databricks, RESTful APIs, and Snowflake.
- Experience with Agile delivery methodologies; strong debugging and problem-solving skills.
- Experienced in OCI (Oracle Cloud Infrastructure) Autonomous Database.
- Strong experience in multidimensional (OLAP) data modeling: star schemas, snowflake schemas, normalized and de-normalized models, and handling of slowly changing dimensions/attributes (a T-SQL sketch follows this section).
- Expertise in entity-relationship modeling; conceptual, logical, and physical data modeling; hierarchies, relationships, metadata, data lineage, and reference data.
- Created documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Involved in various data modeling tasks including forward engineering, reverse engineering, Complete Compare, creating DDL scripts, and creating subject areas.
- Built an enterprise data warehouse using the Ralph Kimball dimensional technique; worked with the Erwin, ER/Studio, and Oracle SQL Developer Data Modeler tools.
- Extensively applied performance-optimization best practices to keep databases healthy and performant, including table partitioning, buffer sizing, threading, and indexing strategy.
- Exposure to project management activities: resource optimization, task estimation, and costing for fixed-price and Time & Material work; experienced in SDLC and Agile-Scrum, including short-term goals, iterative development, and daily stand-ups.
- Managed on-site and external partnerships to obtain necessary data, establish optimal experimental conditions, and, where appropriate, participate in analytical work.
- Provided Business Intelligence expertise and consultation to internal stakeholders as needed.
- Supported and ensured data availability for organizations in the Insurance (P&C, Life), Healthcare/Life Science/Pharmacy, Telecom, and BFS domains; sound knowledge of the Insurance (P&C, Life), Healthcare, Life Science, and Telecom domains.
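To illustrate the slowly-changing-dimension handling noted above, here is a minimal T-SQL sketch of a Type 2 load. It is illustrative only: the table and column names (dim_customer, stg_customer, etc.) are hypothetical and not taken from any engagement on this resume.

  -- Hypothetical Type 2 SCD load: expire changed rows, then insert new versions.

  -- Step 1: close out current rows whose tracked attributes changed in staging.
  UPDATE d
  SET    d.effective_to = CAST(GETDATE() AS DATE),
         d.is_current   = 0
  FROM   dbo.dim_customer AS d
  JOIN   dbo.stg_customer AS s
         ON s.customer_id = d.customer_id
  WHERE  d.is_current = 1
    AND (s.customer_name <> d.customer_name OR s.segment <> d.segment);

  -- Step 2: insert a new current version for changed or brand-new customers.
  INSERT INTO dbo.dim_customer
         (customer_id, customer_name, segment, effective_from, effective_to, is_current)
  SELECT s.customer_id, s.customer_name, s.segment,
         CAST(GETDATE() AS DATE), '9999-12-31', 1
  FROM   dbo.stg_customer AS s
  LEFT JOIN dbo.dim_customer AS d
         ON d.customer_id = s.customer_id AND d.is_current = 1
  WHERE  d.customer_id IS NULL;  -- no current row: new key, or expired in step 1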
TECHNICAL SKILLS
Data Modeling: 6+ years of dimensional data modeling experience (ERwin & Oracle SDDM). Sound knowledge of dimensional modeling, OLTP models, canonical integration models, EAV, associative modeling, Ralph Kimball and Bill Inmon methodologies, star schemas, snowflake schemas, data marts, facts & dimensions, logical & physical data modeling, MDM data modeling, unstructured data modeling, metadata, and reference data process models; ETL and reporting framework modeling; database management in SQL Server, Oracle, DB2.
Data Management: 5+ years on the Microsoft BI platform (SSIS, SSRS, SSAS); 4+ years with SAP BODS (Business Objects Data Services); 3+ years with Informatica PowerCenter 8.6 through 9.5 (Designer, Workflow Manager, Workflow Monitor, Repository Manager); IICS.
Business Intelligence: 11 years using SAP Business Objects (7+ years), COGNOS (3 years), MicroStrategy 9.2 (3 years), SSRS.
Analytics: 5 years in multidimensional data analytics, including SSAS with XMLA queries (5 years), SAP Lumira (2 years), Tableau Desktop/Server 9.3 (3+ years), QlikView/QlikSense (3 years), Power BI (1 year), SiSense (1+ years).
Databases: 12+ years using Oracle 11g (6+ years), DB2 (5+ years), MS SQL Server 2016 or lower (8+ years); trained in NoSQL (MongoDB, MarkLogic) and Oracle Spatial.
Programming Languages: SQL (8+ years), T-SQL (5+ years), PL/SQL (4+ years), C# (1+ years), JSON (1 year), VB (3+ years), Unix shell scripting (4+ years), Python (1 year), R (1 year), TCL (2 years).
Data Tools: Informatica IDQ, Informatica MDM, Information Steward, DQS, MapInfo.
Version Control Tools: Microsoft VSS, GitHub, Tortoise SVN.
Scheduling Tools: Control-M, AutoSys.
Project Management Tools: JIRA, Microsoft Project, HPSM, Agile Scrum.
ERP: Oracle E-Business Suite R12 (PA).
DevOps: CI/CD with Liquibase, Azure Repos, GitHub, MS Visual Studio Project.
Cloud Frameworks: GCP application, data storage, and data warehouse services; OCI (Oracle Cloud Infrastructure) Autonomous Database service; AWS S3 storage; Azure App Services.

TRAINING & CERTIFICATION
- Project Management Professional (PMP) training (PMBOK 5th Edition) successfully completed
- Certified Scrum Master (CSM), Scrum Alliance (Agile Scrum projects)
- Oracle Certified Associate (OCA), Oracle Corporation (database administration)
- Certificate course: Big Data Internship Program Foundation, by Big Data Trunk & Udemy
- Certificate course: Data Science A-Z: Real-Life Data Science Exercises, by Big Data Trunk & Udemy
- Certificate of Completion: Data Science 101, by Big Data University
- Certificate of Completion: DevOps training & assessment, by CSC India Skill Platform
- Certificate of Completion: SQL*LIMS v4.0.16 & 5.0.1 training, by Merck Inc

EDUCATION
- Master of Computer Applications (MCA), 2004, Kumaun University, Nainital, India
- Master of Science in Mathematics (M.Sc.), 2000, CCS University, Meerut, India
- Bachelor of Science (B.Sc.), 1997, CCS University, Meerut, India

EXPERIENCE
Inovalon Inc, Canonsburg, PA, USA (11/21 - present)
Compunnel Inc, USA (contractor at Inovalon Inc) (09/19 - 11/21)
Software Architect - Commercial Cloud, Data Engineering
Responsibilities:
- Inovalon is a leading provider of cloud-based platforms empowering data-driven healthcare.
- It provides 80% of the nation's clinical and quality outcomes measurement analytics.
- Worked as a full-stack Software Architect for Data Engineering and liaison with product development.
- Reverse-engineered a monolithic application and converted it to a microservice-based architecture.
- Experienced in application modernization from on-prem to cloud architecture, data migration, data conversion, and design of data mart and data pipeline solutions for business reporting.
- Performed overall structural design and development of software systems and applications to address business needs (cloud and/or in-house).
- Presented the "big picture" architectural approach for software design and implementation.
- Conducted research, gathered information, interpreted data, identified requirements, and created solutions.
- Designed data architecture that ensures compliance with regulatory standards (e.g., HIPAA, FCRA, GDPR, CCPA) and supports a technology-based information security program.
- Created and managed a data dictionary for all data sources/datasets, aligning with the IT organization and the business owner of the data to identify the single source of truth for each data silo.
- Led and performed data analysis, including data mapping, data modeling, and data validation.
- Expertise in designing application database models and dimensional data models for reporting.
- Designed data management solutions around Informatica (IICS) using source connections such as Oracle, SQL Server, and Web APIs, plus Informatica PowerExchange for CDC.
- Exposure to ETL migration/conversion from IICS to GCP Dataflow with a Python framework.
- Migrated application and data mart databases from Oracle to PostgreSQL (a type-mapping sketch follows this section).
- Expertise in data migration from on-prem to a commercial-cloud-tenant Autonomous Database.
- Set up a CI/CD process to promote changes using Azure Repos and Liquibase.
- Applied a consultative approach to functional and operational-procedure planning/design when implementing or integrating systems and product enhancements.
- Collaborated with other technologists on cross-domain solutions.
- Identified approaches to improve the accuracy and effectiveness of analytics models; defined the approach clearly and concisely to support model optimization efforts.
- Monitored ongoing model performance and worked with stakeholders to resolve identified issues.
- Ensured teams followed best practices for coding standards, code reviews, and testing.
- Collaborated with product/business owners to define and establish data quality rules and standards, identifying gaps and ensuring compliance across the enterprise.
- Expertise in developing Domain Driven Design models for a microservices architecture.
Environment: Microservice APIs, Toad Data Modeler, Oracle 19c, SQL Server 2016, JSON, PostgreSQL, Informatica Intelligent Cloud Services (IICS), GCP, OCI, Tableau, Kafka, Python, Dataflow, CI/CD (Liquibase, Azure Repos), MS Visual Studio Project, DataDog
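As context for the Oracle-to-PostgreSQL migration noted above, here is a minimal sketch of the kind of DDL translation a tool such as Ora2pg automates. The table and columns are hypothetical, not drawn from the actual application schema.

  -- Hypothetical Oracle source DDL:
  --   CREATE TABLE claim (
  --     claim_id    NUMBER(12)    PRIMARY KEY,
  --     member_nm   VARCHAR2(100) NOT NULL,
  --     paid_amt    NUMBER(10,2),
  --     created_dt  DATE          DEFAULT SYSDATE
  --   );

  -- Equivalent PostgreSQL DDL after the usual type mapping:
  CREATE TABLE claim (
    claim_id    BIGINT        PRIMARY KEY,    -- NUMBER(12)   -> BIGINT
    member_nm   VARCHAR(100)  NOT NULL,       -- VARCHAR2(n)  -> VARCHAR(n)
    paid_amt    NUMERIC(10,2),                -- NUMBER(p,s)  -> NUMERIC(p,s)
    created_dt  TIMESTAMP     DEFAULT now()   -- DATE/SYSDATE -> TIMESTAMP/now()
  );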
Client: Erie Insurance Group, Erie, PA, USA & Tower Hill Insurance, FL
Role: Data Architect / Data Modeler (07/18 - 08/19)
- Worked as a Data Architect on the Commercial Insurance ESB product at Erie Insurance Group (EIG), PA.
- Played a key role in the development and execution of the company's data acquisition strategy for the ESB product and participated in negotiations and solution structuring.
- Defined product definitions and the logical data model for the ESB product.
- Designed canonical data models for the integration layer to exchange messages between applications.
- Led governance efforts related to data definitions, business requirements, and taxonomies.
- Delivered an enterprise data hub and data mart as part of the solution.
- Developed enterprise data management functions, such as data quality management and standards, to identify gaps and ensure compliance across the enterprise.
- Developed the Conceptual Data Model (CDM), Logical Data Models (LDM), and JSON Schema.
- Coordinated and performed the data analysis required to identify required data elements and create source-to-target mappings (including business logic) for data transformations.
- Developed a Domain Driven Design model supporting a microservices architecture.
- Designed the LDM and canonical data model to support the integration API for the messaging interface.
- Designed and developed data models using Erwin Data Modeler; enforced naming standards, created metadata, and applied best practices for optimal model performance.
- Worked with application teams on target-model mapping design, ensuring alignment with integration standards and guidelines.
- Provided Business Intelligence expertise and consultation to internal stakeholders as needed.
Technologies used: MarkLogic 9 (NoSQL DB), OSPAS, XML/XSD, Erwin 9.7, XQuery, Oracle 12c, SQL Server 2014

Client: First Tennessee Bank, Memphis, USA
Role: Data Modeler / Enterprise Data Management (09/17 - 06/18)
- Worked as a Data Modeler / Data Engineer at First Tennessee Bank (FTB).
- Created and maintained logical and physical data models in support of enterprise data models, OLTP, operational data structures, and analytical systems.
- Worked extensively with reverse engineering, forward engineering, and Complete Compare; created and maintained the logical and physical layers of data models (a delta-DDL sketch follows this section).
- Generated data model design reports from the logical and physical layers to define business and technical information mappings and constraints.
- Led the technical design of the tables and views consumed by users of the data warehouse; created and maintained processes for gathering high-quality metadata.
- Maintained and updated business attribute definitions and the technical design of staging areas for data warehouse integration and application projects.
- Bridged the knowledge gap between business and application developers from an end-to-end solution point of view.
- Gathered and defined business requirements and developed solution-design best-practice guidance.
- Prepared impact and risk assessments for proposed changes and recommended business solutions.
Environment: Erwin 9.7, IBM DB2 10.5, Oracle 11g, SQL Server 2016 or lower
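To illustrate the Complete Compare / forward-engineering workflow above, here is a minimal sketch of the kind of delta DDL a modeling tool emits after comparing the model to the live database. The object names are hypothetical.

  -- Hypothetical delta script: the model gained a column and an index
  -- that the live database does not yet have.
  ALTER TABLE account ADD close_dt DATE;    -- new attribute from the model

  CREATE INDEX ix_account_branch            -- new index from the model
      ON account (branch_id);

  -- An attribute removed from the model would surface as the reverse:
  -- ALTER TABLE account DROP COLUMN legacy_flag;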
Client: SGL, USA
Role: BI Developer / Data Analyst (06/17 - 09/17)
- Worked as a Sr. BI Developer and data modeler on a retail-chain project.
- Defined the scope of the project and documented business requirements, constraints, assumptions, business impacts, project risks, and scope exclusions.
- Developed reports; identified problems and fixed them through major and minor enhancements.
- Prepared an ETL design strategy document covering database structure, change data capture, error handling, and restart/refresh strategies.
- Used MS SSIS to extract, transform, and load data into the DWH target system.
Environment: SQL Server 2016, SSIS, Tableau Desktop 9.3, JIRA

Company: Computer Science Corporation
Client: KenyaRe & TOARe
Role: BI-ETL Developer / Data Modeler (06/16 - 05/17)
- Worked in multiple roles (Sr. DWH-BI Developer, Data Modeler) on SICS (P&C, Life, Ceded insurance product) business analytics projects used by 150+ reinsurance customers globally.
- Defined, created, and managed business specifications, technical specifications, and other project documentation for application development; performed and documented gap analysis.
- Created and maintained logical data models (LDM) and physical data models (PDM) in support of enterprise data models (star and snowflake schemas), OLTP, operational data structures, and analytical systems (a star schema sketch follows this section).
- Oversaw the development of custom analyses and identified opportunities for automation and scalability.
- Presented analytical insights to internal partners and, when needed, to channel and dealer partners.
- Worked extensively on dimensional modeling (OLAP data marts) and managed logical and physical data models for the data warehouse.
- Used Erwin to forward engineer models into scripts and reverse engineer scripts back into the tool, designing logical and physical data models to suit business needs.
- Worked in a fast-paced environment following SDLC best practices and delivering to the agile timeline.
- Profiled source systems to validate data and define data standards, sizes, and datatypes in the data warehouse or mart.
- Created source-to-target ETL mapping documents for ETL developers.
- Used advanced techniques in data modeling, integration, visualization, and database design and implementation; expertise in Business Objects installation, upgrade, migration, and implementation for insurance customers.
- Used calculated fields extensively for trend and extended-price calculations across cost types; used set functions and created advanced grouping functions.
- Designed tables and graphs that were visually easy to understand while preserving the accuracy of the core information.
- Built an MS SSIS (ETL) solution to extract, transform, and load data into the DWH target system; wrote complex SQL queries, functions, and stored procedures.
- Created prompts, conditions, and detailed and summary filters to improve report performance; created various reports.
- Created SiSense dashboards using stacked bars, bar graphs, scatter plots, and geographical maps.
- Modeled the metadata layer of the data warehouse using SiSense ElastiCubes; worked on motion charts, bubble charts, and drill-down analysis.
- Monitored and improved ETL, reporting, and database performance; defined data types, optimal normalization, referential integrity, triggers, partitioning design, indexing methods, and data security procedures.
- Worked in Agile-Scrum with short-term goals, iterative development, and daily stand-ups.
Environment: IBM DB2, SQL Server 2016, SSIS, SSAS, Power Pivot, T-SQL, Oracle 11g, SAP BO XI 3.1/4.2, SiSense 6.2/6.5 Desktop, JIRA
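To make the star-schema modeling above concrete, here is a minimal sketch of a fact table joined to two dimensions by surrogate keys. All names are hypothetical, not drawn from the SICS product.

  -- Hypothetical star schema: one fact, two dimensions.
  CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,       -- e.g. 20170630
    calendar_dt  DATE     NOT NULL,
    year_no      SMALLINT NOT NULL,
    month_no     SMALLINT NOT NULL
  );

  CREATE TABLE dim_policy (
    policy_key   INTEGER PRIMARY KEY,       -- surrogate key
    policy_no    VARCHAR(20) NOT NULL,      -- natural/business key
    line_of_bus  VARCHAR(30) NOT NULL
  );

  CREATE TABLE fact_premium (
    date_key     INTEGER NOT NULL REFERENCES dim_date (date_key),
    policy_key   INTEGER NOT NULL REFERENCES dim_policy (policy_key),
    written_amt  NUMERIC(12,2) NOT NULL,
    earned_amt   NUMERIC(12,2) NOT NULL,
    PRIMARY KEY (date_key, policy_key)
  );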
Company: Computer Science Corporation, Blythewood, SC, USA
Client: P&C clients - SwissRe, Farm Bureau, Florida Peninsula & OMAG, USA
Role: BI-ETL Developer / Architect / Data Modeler (07/13 - 06/16)
- Worked in various roles (Sr. BI Developer, Data Modeler) on POINT IN/J (P&C insurance product) business analytics projects used by 170+ US insurance customers.
- Defined, created, and managed business specifications, technical specifications, and other project documentation for application development; performed and documented gap analysis.
- Created logical and physical database designs and ran reports; used advanced techniques in data modeling, access, integration, visualization, and database design and implementation.
- Worked extensively on dimensional modeling (OLAP data marts with star and snowflake schemas) and managed logical and physical data models for the data warehouse.
- Designed data marts for account-level and account-status databases that help various teams analyze user experience, user statistics, etc.
- Incorporated industry-standard modeling practices to support a detailed, subject-oriented data mart.
- Created documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Performed data modeling tasks including forward engineering, reverse engineering, Complete Compare, creating DDL scripts, and creating subject areas; maintained the data model and synchronized it with database changes.
- Used Erwin extensively to design logical and physical data models to suit business needs; designed conceptual, logical, and physical data models for OLTP, data warehouse, and data mart applications.
- Generated DDL scripts during forward engineering for the physical build of the database.
- Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, and geographical maps; worked on motion charts, bubble charts, and drill-down analysis in Tableau Desktop; created and modified data sources.
- Worked in Agile-Scrum with short-term goals, iterative development, and daily stand-ups; interacted with clients and business partners on project issues and queries; followed and enforced industry and organizational standards and best practices.
Environment: IBM DB2, SQL Server 2016, MS SSIS, SSAS, T-SQL, .NET, JSON, CA Erwin, SAP BO XI 3.1/4.2, SAP Dashboard, Tableau Desktop 9.3, Tableau Server 9.3, JIRA
Company: Computer Science Corporation, Bloomington, IL, USA
Client: State Farm Insurance, Bloomington, IL, USA
Role: Technical Architect - BI / Data Modeler (04/12 - 06/13)
- Worked as Project Lead and Technical Architect on State Farm's Asset Management project.
- Defined, created, and managed business specifications, technical specifications, and other project documentation for application development; performed and documented gap analysis.
- Created logical and physical database designs; worked on dimensional modeling (OLAP data marts with star and snowflake schemas) and managed logical and physical data models for the data warehouse.
- Designed data marts for account-level and account-status databases that help various teams analyze user experience, user statistics, etc.
- Incorporated industry-standard modeling practices to support a detailed, subject-oriented data mart.
- Created documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Performed data modeling tasks including forward engineering, reverse engineering, Complete Compare, creating DDL scripts, and creating subject areas; maintained the data model and synchronized it with database changes.
- Used Oracle SQL Developer Data Modeler (SDDM) to design logical and physical data models; designed conceptual, logical, and physical data models for OLTP, data warehouse, and data mart applications.
- Generated DDL scripts during forward engineering for the physical build of the database.
- Developed a team of data analytics and visualization engineers and converted data into actionable insights using descriptive and predictive modeling techniques.
- Worked on SAP Business Objects and COGNOS report design and development.
- Worked on an Informatica (ETL) solution to extract data from various sources, transform it, and load it into the target systems (data warehouse and data mart).
- Wrote complex SQL queries, functions, and stored procedures; defined data types, optimal normalization, referential integrity, triggers, partitioning design, indexing methods, and data security procedures (a partitioning sketch follows this section).
- Worked on T-SQL package solutions to extract, transform, and load data into the DWH/DM system.
- Designed and developed new BO universes and modified and enhanced existing universes with UDT; resolved loops, traps, and cardinality issues in universes.
- Designed and created reports; changed and enhanced existing BO and COGNOS reports; worked with IBM COGNOS Report Studio and COGNOS Framework Manager.
- Performed SAP BO administration: access, server properties, SIA, scheduling, publishing documents, and managing the BO server, Tomcat server, and BO services; strong understanding of CMS audit DB tables.
- Monitored and improved the performance of ETL and operational reporting environments.
- Managed a team of resources at the client location in Bloomington, USA; interacted with clients and business partners on project issues and queries.
Environment: SQL Server 2016, T-SQL, .NET, Oracle PL/SQL, Informatica 9, IDQ, BO XI 3.1 SP5, Dashboard, QlikView, COGNOS 10.2
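To illustrate the table partitioning and indexing strategy referenced above, here is a minimal T-SQL sketch of monthly range partitioning with an aligned index. The object names and boundary dates are hypothetical.

  -- Hypothetical monthly range partitioning for a large fact table (T-SQL).
  CREATE PARTITION FUNCTION pf_month (DATE)
      AS RANGE RIGHT FOR VALUES ('2013-01-01', '2013-02-01', '2013-03-01');

  CREATE PARTITION SCHEME ps_month
      AS PARTITION pf_month ALL TO ([PRIMARY]);

  CREATE TABLE fact_asset_balance (
      asset_id    INT           NOT NULL,
      balance_dt  DATE          NOT NULL,
      balance_amt DECIMAL(14,2) NOT NULL
  ) ON ps_month (balance_dt);

  -- Aligned clustered index: queries filtered by date touch only the
  -- partitions they need (partition elimination).
  CREATE CLUSTERED INDEX cix_fact_asset_balance
      ON fact_asset_balance (balance_dt, asset_id)
      ON ps_month (balance_dt);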
Company: Computer Science Corporation, Tokyo, Japan
Client: SAFIC - SAISON Auto & Fire Insurance Company, Japan
Role: Data Modeler / DWH-BI Architect (07/10 - 03/12)
- Played multiple roles (Team Leader - Technology, Data Modeler, BI Architect, Technical Architect) on SAFIC POLISY/J (P&C insurance) business analytics projects.
- Defined, created, and managed business specifications, technical specifications, and other project documentation for application development; performed and documented gap analysis.
- Created logical and physical database designs; worked on dimensional modeling (OLAP data marts with star and snowflake schemas) and managed logical and physical data models for the data warehouse.
- Used Erwin to forward engineer models into scripts and reverse engineer scripts back into the tool, designing logical and physical data models to suit business needs.
- Worked in a fast-paced environment following SDLC best practices and delivering to the agile timeline.
- Profiled source systems to validate data, identify data issues, and define data standards, sizes, and datatypes in the data warehouse or mart; served as a subject matter expert for the various consumers of the data mart and data warehouse structures.
- Created source-to-target ETL mapping documents for ETL developers.
- Created documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Performed data modeling tasks including forward engineering, reverse engineering, Complete Compare, creating DDL scripts, and creating subject areas.
- Designed, implemented, and supported end-to-end development of data warehouses, data marts, and ETL that provide structured and timely access to large datasets.
- Developed a team of data analytics and visualization engineers and converted data into actionable insights using descriptive and predictive modeling techniques.
- Worked on SAP Business Objects and SAP BODI installation in the SAFIC environments.
- Used the SAP BODI (ETL) tool to extract data from the POLISY/J (Japan) system, transform it, and load it into the target systems (data warehouse and data mart).
- Wrote complex SQL queries, functions, and stored procedures.
- Designed and developed new BO universes, with modifications and enhancements; designed and created new BO reports and troubleshot report issues.
- Performed SAP BO administration: server properties, SIA, scheduling, publishing documents, and managing the BO server, Tomcat server, and BO services; strong understanding of CMS audit DB tables.
- Interacted with clients and business partners on project issues and queries.
Environment: SQL Server 2008 R2, Oracle 10g, SQL, SAP BODI, SAP BO XI 3.1, Dashboard, CA ERwin, Tomcat 5/7, QlikView

Company: Computer Science Corporation, Noida, India
Client: TDC - Tele-Denmark Communications, Denmark
Role: BI-ETL Developer (10/08 - 06/10)
- Played a Sr. BI Developer role on the TDC (Tele Denmark Communications) project.
- Worked on ARTEMIS reports, major and minor bug fixes, performance improvements, and production support.
- Acquired data from Oracle E-Business Suite R12 (PA module) using ETL job scripts.
- Wrote complex SQL queries, functions, and stored procedures (a PL/SQL sketch follows this section).
- Worked on the INCA (GIS) application, Oracle Spatial DB, and MapInfo to generate geographical maps.
- Exposure to project registration and tracking; developed and maintained HLD and DLD documents.
Environment: TCL, INCA (GIS application), Oracle 10g, PL/SQL scripts, Oracle Spatial DB, Oracle EBS R12 (PA module), ARTEMIS, MapInfo
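To illustrate the stored-procedure work above, here is a minimal PL/SQL sketch of a function over project-accounting data. The table, columns, and business rule are hypothetical placeholders.

  -- Hypothetical PL/SQL function: total booked hours for a project in a period.
  CREATE OR REPLACE FUNCTION project_hours (
      p_project_id IN NUMBER,
      p_from_dt    IN DATE,
      p_to_dt      IN DATE
  ) RETURN NUMBER
  IS
      v_hours NUMBER := 0;
  BEGIN
      SELECT NVL(SUM(t.hours_qty), 0)
        INTO v_hours
        FROM timecard t                 -- illustrative table name
       WHERE t.project_id = p_project_id
         AND t.work_dt BETWEEN p_from_dt AND p_to_dt;

      RETURN v_hours;
  END project_hours;
  /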
Company: Computer Science Corporation, Chennai, India
Client: Thomson Healthcare (Truven Healthcare)
Role: Senior Software Engineer (PHONE NUMBER AVAILABLE)
- Worked in an ETL-BI Developer role on Thomson Reuters (Truven Healthcare) business analytics projects.
- Created schema objects (attributes, facts, and hierarchies) and built application objects (filters, prompts, metrics, custom groups, consolidations, drill maps, templates).
- Designed and set up new users, roles, privileges, and data and application security.
- Worked extensively with data modeling concepts: Type 2 dimensions, factless facts, conformed dimensions and facts, etc.
- Experienced in the quarterly release development process and live production support.
- Used Informatica 8.6 as the ETL tool to build the data warehouse and data mart.
- Used MapInfo to deploy additional ZIP codes into the system so they appear in MSTR reports.
- Handled report issues and enhancements, the ETL operational environment, and database optimization.
- Wrote complex SQL queries, functions, and stored procedures.
- Interacted with clients and business partners on project issues and queries.
Environment: SQL Server 2008 R2, Oracle 9i, UNIX scripting, Oracle SQL Developer Data Modeler, Informatica PowerCenter 8.6, IDQ, ICS, MicroStrategy 9, Onyx, MapInfo

Company: Cognizant Technology Solutions, Pune, India
Client: Merck Inc, NJ, USA
Role: ETL & Business Intelligence Developer (05/06 - 11/07)
- Performed a Programmer Analyst / BI Developer role at Cognizant Technology Solutions in Pune.
- Worked with a pharma client using the SQL*LIMS system for the NG-LIMS and Stability-LIMS products; captured sample life-cycle phase data to design the DSS system.
- Supported 24 LIMS sites across the globe, capturing NG-LIMS and Stability-LIMS sample data and loading it into the data warehouse and DSS system via data stages set up on a UNIX server.
- Worked on UNIX shell scripting, crontab job scheduling, MSTR and COGNOS administration, Framework Manager, and user access.
- Created schema objects (attributes, facts, hierarchies) and built application objects (filters, prompts, metrics, custom groups, consolidations, drill maps, templates).
- Expertise in MSTR and COGNOS report development and production support.
- Monitored processes running on the Stability DSS UNIX server and maintained their continuity.
- Troubleshot COGNOS and MSTR report issues; modified and enhanced existing MSTR reports.
Environment: UNIX Shell Script,
