Candidate Information
Title: Data Warehouse Business Intelligence
Target Location: US-NC-Morrisville

Candidate's Name
Email: EMAIL AVAILABLE  Ph: PHONE NUMBER AVAILABLE

Experience Summary
Over 12 years of experience in Data Warehouse systems and application development, implementing the complete end-to-end Software Development Life Cycle (SDLC) process.
Good experience executing end-to-end data warehouse project implementations, working in every phase of the DW/BI lifecycle.
Expertise in data modeling; created logical/physical data models for many data marts and data warehouses.
Performed system analysis and captured requirements related to architecture, ETL, data quality, MDM, dashboards and reports; captured enhancements from various entities and provided impact analysis.
Good experience in data profiling, data mapping, data cleansing, data integration, data analysis, data quality, data architecture, data governance, metadata management and master data management.
Designed key system architectures with complete system integration involving various modules.
Created a first-of-its-kind data model for the Intelligence domain.
Self-starter with excellent analytical and communication skills; works independently and is a good team player.
Good experience debugging SQL code; developed PL/SQL scripts, stored procedures and triggers to extract and load data. Experience in UNIX shell scripting to automate and schedule jobs.
Developed multiple ETL processes for loading data into fact, dimension and other tables.
Designed and implemented pilot projects using various BI tools covering the latest features and product trends; experience in technologies such as big data, cloud computing and in-memory applications.
Good experience in other BI and ETL tools, and worked with many high-performance databases.
Worked on many OLTP/OLAP implementations with fine-tuned SQL/MDX queries.
Expertise in Cognos and Business Objects (BO); worked extensively on dashboards and scorecards.
System optimization, performance monitoring and tuning experience in Business Intelligence systems.
Developed various semantic layers such as Framework Manager models, Universes and RPDs, and fine-tuned the generated queries. Developed complex reports and dashboards with strong data visualization capabilities.
Involved in continuous performance tuning and improvements for existing OLAP systems. Adept at developing reports and dashboards and implementing Business Intelligence solutions.
Successfully installed and upgraded lower versions to higher versions and migrated application data.
Supported and maintained many application systems, ensuring all deliveries were met on time.
Coordinated test plans and test cases covering feature/functionality/usage scenarios. Created strategy and roadmap with execution plans, thereby monitoring project progress.
Prepared proposals in response to RFPs, providing effort estimation and project planning.
Provided training for technical and business users on operations and usability.
Implemented solutions for ingesting data from various sources and processing data at rest using big data technologies such as Hadoop, MapReduce frameworks, HBase, Hive and Python.
Involved in loading and transforming large data sets and analyzing them by running Hive queries.
Integrated the NoSQL database HBase with MapReduce to move bulk data into HBase.
Loaded and transformed large sets of structured, semi-structured and unstructured data using Hadoop/big data concepts.
Ingested data into Hadoop (HDFS)/Hive from different data sources.
Defined and managed the architecture and lifecycle of Hadoop and Spark projects.
Designed a real-time stream processing application using Spark, Kafka, Scala and Hive to perform streaming ETL (a minimal sketch of this pattern follows this summary).
Enhanced a traditional data warehouse based on a star schema, updated data models, and performed data analytics and reporting using Tableau.
Extracted data from MySQL and AWS into HDFS using Sqoop.
Experience in BI reporting with AtScale OLAP for big data; AWS cloud and on-premise environments with infrastructure provisioning/configuration.
Implemented various data warehouse projects in Waterfall and Agile methodologies.
Worked as Sr. Business Intelligence & ETL Developer, Technical Lead, Solution Architect, Business System Analyst and Data Analyst.
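The streaming-ETL item above (Spark, Kafka and Hive) describes a common pattern; the code below is a minimal, generic PySpark sketch of it, not code from any of the engagements listed. The broker address, topic name, event schema and output paths are assumptions made for the example.

# Minimal Spark Structured Streaming sketch: Kafka -> parse JSON -> append to a Hive-style table.
# All names (brokers, topic, schema, paths) are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = (SparkSession.builder
         .appName("streaming-etl-sketch")
         .enableHiveSupport()          # lets the output location be registered in the Hive metastore
         .getOrCreate())

# Assumed event schema for the illustration.
event_schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")   # placeholder broker
       .option("subscribe", "orders")                        # placeholder topic
       .load())

# Kafka delivers bytes; cast the value to a string and parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), event_schema).alias("e"))
          .select("e.*"))

# Append micro-batches as Parquet files under a warehouse path.
query = (events.writeStream
         .format("parquet")
         .option("path", "/warehouse/streaming/orders")      # placeholder path
         .option("checkpointLocation", "/checkpoints/orders")
         .trigger(processingTime="1 minute")
         .start())

query.awaitTermination()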
Employment Summary
Currently working as a Sr. BI Consultant for a US-based company.
Earlier worked full-time on multiple technologies/clients with Wipro for around 9 years.
Earlier worked full-time with 3i Infotech, a consulting and product development company, for around 1 year.
Earlier worked full-time with Accenture & SSG consulting services for around 1.5 years.

Education
Bachelor of Engineering (B.E.) in Computer Science, 2001, Mangalore University

Technical Experience
Core: Business Intelligence (BI), Data Analytics & Data Warehouse Implementation
Verticals & Domains: Advanced Intelligence & Crime Analytics, Banking, Education, Energy Markets, Equities, Finance, Insurance, Retail, Telecom, Healthcare
AWS / Big Data Ecosystems: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, Flume, Apache Kafka, Spark, TDCH, Pentaho Kettle, Impala, YARN, HUE, Oozie, Zookeeper and Talend; NoSQL databases: HBase, Cassandra and MongoDB; EMR, Glue, Redshift, S3, Lambdas
Cloud: AWS, Snowflake
Reporting Skills & Other Modules: IBM Cognos 8.4/10.2.1/11.1 (Report, Administration & Framework Manager); SAP Business Objects (BO) XI 4.2 (IDT, UDT Designer, Web Intelligence & Administration); SAP BusinessObjects Xcelsius / Dashboards 4.1; SAP Crystal Reports XI/2008/2013; Tableau Desktop 9.2/2020, Tableau Prep Builder 2020; QlikView 11.2; Alteryx; Oracle Business Intelligence Enterprise Edition (OBIEE) 10g/11g/12c; SAP HANA Modeler 1.0 SPS7; SAP Lumira 1.8 & Design Studio 1.3; Oracle Financial Services (OFSAA); MS SQL Server Reporting & Analysis Services (SSRS & SSAS); MS Power BI; SAS BI Suites, SAS Enterprise Guide 4.3, SAS Activity Based Management (ABM)
ETL Tools: Informatica v9/10.2, SQL Server Integration Services (SSIS) 2012, AWS Glue, SAP BO Data Services / Data Integrator (BODS/BODI) v4.2, IBM DataStage and QualityStage v8/11.5, InfoSphere MDM 11.5
Databases: SAP HANA, Oracle Exadata 11g/12c, Oracle 10g/11i/12c, IBM DB2 v7/8/9/10, ISAS DB2 10, IIAS DB2 Warehouse, Netezza 6/7, NCR Teradata 12, SQL Server 2008/12, Greenplum
Modeling Tools: CA Erwin 9.1, IBM InfoSphere Data Architect v8/9.1.3, Visio Data Modeler v8, SAP PowerDesigner 16.6
Programming: Java v6, C++, VB Macros, PL/SQL & Unix shell scripting
Scheduling Tools: Autosys, Control-M, SQL Server scheduler
Operating Systems: HP-UX, Linux, Windows 2003/2008 & Sun Solaris 9/10

Professional Experience

Client: Fannie Mae  May 2023 - Current
Role: Sr. BI Developer
Worked on the implementation of the MISMO 3.6 module.
Extensively involved in data analysis, profiling, impact and lineage analysis, and business analysis.
Worked on the development of reports for the Multi-Family line of business.
Assisted in the design of reports/dashboards demonstrating strong data visualization capabilities.
Designed and developed reconciliation reports based on on-prem and AWS cloud databases.
Created new and modified existing BI reports to facilitate reconciliation for compliance audit needs.
Technology: Tableau, SAP BO, ER Studio, AtScale, Aurora, DynamoDB, AWS Redshift, AWS EMR, Python & Spark

Client: Kaiser Permanente  Jan 2022 - May 2023
Role: Sr. BI Developer
Created new reports and modified existing BI reports.
Created operational reporting based on Epic Tapestry versions.
Handled finance reporting requirements for Cigna reporting.
Coordinated with the offshore team on validating business requirements and handled communication with the client; DDL scripting/reviews, performance tuning support and ETL delivery.
Served as a subject matter expert and performed tasks that contribute to the organization's mission and vision.
Worked on rationalizing the reports/universes for seven regions and implementing security.
Technology: SAP BO 4.3, Tableau 2021, MS Power BI 13, Informatica, Erwin, IBM ELM, Oracle Exadata, Oracle, SQL Server, Reporting Workbench (RWB), Hive, PySpark

Client: Presbyterian Healthcare Services  Mar 2019 - Dec 2021
Role: Sr. BI Developer
Created reports/dashboards and modified existing legacy reporting solutions.
Implemented IBM UDMH (Unified Data Model for Healthcare) with multi-tenancy, including clinical data modules such as HL7 ORU/ADT, HL7 FHIR and CDA.
Conformed to existing standards and conventions, providing leadership guidance on enterprise data strategies, revising the data dictionary, governance practices and standards, and partnering with security architects.
Defined end-to-end solutions and recommendations to implement best practices involving MPP databases on cloud big data technologies.
Extensively involved in defining, implementing and executing data analysis concepts such as profiling, lineage, impact analysis, mapping and business analytics (see the profiling sketch after this section).
Coordinated with the offshore team on validating business requirements and handled communication with the client; DDL scripting/reviews, performance tuning support and ETL delivery.
Planned out key activities, worked with the technical architects and provided input into the technical design.
Served as a subject matter expert and performed tasks that contribute to the organization's mission and vision.
Developed new logical and physical data models from scratch based on the requirements.
Technology: SAP BO, Tableau, Cognos, Jasper, Informatica, IBM Data Architect, Data Vault, IBM Sailfish, Netezza 7, IIAS DB2 Warehouse, Redshift Spectrum, Oracle, Hive, PySpark, AWS, Snowflake, StreamSets, IBM DataStage & QualityStage, IBM MDM 11.5, SAS, SSIS, PL/SQL, InterSystems Ensemble
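The profiling work noted in the engagement above is frequently done in PySpark; the sketch below shows one generic way to gather per-column null and distinct counts. It is an illustration only, and the source table name is a hypothetical placeholder, not an actual client object.

# Generic column-profiling sketch in PySpark: row count, null count and distinct count per column.
# Simple but not optimized: it launches one job per metric per column.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("profiling-sketch").enableHiveSupport().getOrCreate()

df = spark.table("staging.encounters")       # placeholder table name
total_rows = df.count()

profile = []
for column in df.columns:
    nulls = df.filter(F.col(column).isNull()).count()
    distinct = df.select(column).distinct().count()
    profile.append((column, total_rows, nulls, distinct))

profile_df = spark.createDataFrame(
    profile,
    "column_name: string, row_count: long, null_count: long, distinct_count: long")
profile_df.show(truncate=False)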
Client: Anthem BCBS - Solution Architecture & Data Modelling  May 2017 - Feb 2019
Role: Sr. BI ETL Developer
Worked on complex ETL mappings satisfying State and Federal mandate requirements, with tuning at the source and target levels using indexes, hints and partitioning in DB2, Oracle and Informatica.
Created and fine-tuned many QlikView/BO reports and dashboards per user requirements.
Designed and developed various PL/SQL stored procedures to perform various calculations.
Converted the PL/SQL procedures to Informatica mappings while creating procedures at the database level for optimum performance of the mappings.
Investigated and fixed bugs in the production environment and provided on-call support.
Identified Hadoop/Hive zones and created ERDs for the Raw, Source Standardized and Application zones.
Implemented a columnar database (HBase) after thorough research and approvals from the business.
Responsible for querying Hive and Impala databases on the Hue platform for analyzing data and creating tables.
Performed unit testing and maintained test logs and test cases for all the mappings.
Maintained warehouse metadata, naming standards and warehouse standards for application development.
Parsed high-level design specifications into simple ETL coding along with mapping standards.
Defined schemas and improved query performance by implementing partitioning on tables in the metastore (see the partitioning sketch after this section).
Migrated on-prem Hadoop, Teradata, etc., covering major subject areas such as Claims, Member, Provider and Clinical, to cloud technologies such as AWS VPC to create an analytical platform.
For Rx, a separate entity was created with PLZ, an Rx-centric model linked to the zone and consumption layer in the cloud.
Supported the use of Sqoop/TDCH for importing/exporting data between Teradata and Hadoop.
Technology: Sqoop, Kafka, Hive, HBase, PySpark, Scala, Greenplum, MongoDB, Informatica IICS, Talend, Teradata, MS SQL, Oracle, Apache Cassandra, Impala, Cloudera, AWS, AWS EMR, AWS Glue, Redshift, Flume, Apache Hadoop, Informatica Data Quality, Informatica Metadata Manager, IBM DS, IBM MDM, MapReduce, Zookeeper, MySQL, DynamoDB, MS Power BI, SAS, OBIEE, SAP BO, Tableau, PL/SQL and Python
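The metastore-partitioning item above refers to a standard Hive/Spark technique; the following is a minimal, generic illustration of it. The database, table, column and zone names are assumptions for the sketch, not the actual warehouse objects.

# Sketch: create a partitioned table in the metastore and load one day's data,
# so queries that filter on claim_date prune partitions instead of scanning everything.
# Database, table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("partitioning-sketch").enableHiveSupport().getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS app_zone.claims (
        claim_id   STRING,
        member_id  STRING,
        paid_amt   DOUBLE
    )
    PARTITIONED BY (claim_date DATE)
    STORED AS PARQUET
""")

# Dynamic partition insert from a (placeholder) standardized-zone table;
# the SET may be required when writing through the Hive catalog.
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
spark.sql("""
    INSERT OVERWRITE TABLE app_zone.claims PARTITION (claim_date)
    SELECT claim_id, member_id, paid_amt, claim_date
    FROM std_zone.claims_raw
    WHERE claim_date = DATE '2018-06-01'
""")

# A query filtered on the partition column only reads the matching partition.
spark.sql("SELECT COUNT(*) FROM app_zone.claims WHERE claim_date = DATE '2018-06-01'").show()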
Client: American Apparel  May 2016 - Apr 2017
Role: Sr. BI ETL Developer
Created Cognos reports using Report Studio.
Developed multiple models and packages using Framework Manager.
Performed data analysis, data mapping and data quality work, and designed logical/physical data models.
Developed various ETL processes as required for both US and international business.
Created and fine-tuned many reports and dashboards per user requirements.
Provided support, ensuring that all data flows completed and reports/dashboards were delivered to users on time, and resolved any related support issues.
Interacted with various users on BI requirements and provided solutions using Agile (Scrum) methodologies.
Technology: SAP BO, MS Power BI, SSRS/SSAS, SSIS 2012, SQL 2012, AWS, Informatica, Erwin, Oracle, Snowflake & VB.Net

Client: Coca Cola - Business Intelligence on HANA  Jul 2014 - Apr 2016
Role: Senior BI ETL Developer / Data Analyst
Led business requirements and created executive dashboards.
Moderated discussions with senior stakeholders and finalized KPIs for dashboards and reports.
Implemented complex dashboards demonstrating strong data visualization capabilities.
Performed data analysis, metadata management and data integration from multiple source entities.
Designed and developed reports and universes using IDT, based on HANA models.
Worked on creating dashboards with supported features for mobile (iPad) devices.
Created complex dashboards per end-user requirements.
Created HANA models including attribute, analytic and calculation views.
Developed a framework using shell scripts and the Amazon SDK for moving data from on-prem systems to Snowflake on the Amazon cloud (a sketch of this pattern follows this section).
Worked closely with the infrastructure team to ensure horizontal scaling of EC2 instances based on load and memory.
Technology: Tableau 9.2, HANA, AWS, SAP BO R4.1, Informatica, Lumira, BODS 4.1, Erwin and SUSE Linux
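The on-prem-to-Snowflake framework mentioned above used shell scripts and the Amazon SDK; as a rough illustration of the same stage-and-copy pattern, the sketch below uses Python with boto3 and the Snowflake connector instead. Bucket, stage, table and credential names are placeholders, not details of the actual framework.

# Illustrative on-prem -> S3 -> Snowflake sketch. It assumes an external stage
# (LANDING_STAGE) already exists in Snowflake pointing at the landing bucket.
import boto3
import snowflake.connector

LOCAL_FILE = "/data/exports/sales_20160101.csv"   # placeholder extract from an on-prem system
BUCKET = "example-landing-bucket"                 # placeholder S3 bucket
KEY = "sales/sales_20160101.csv"

# 1. Push the extract to S3.
s3 = boto3.client("s3")
s3.upload_file(LOCAL_FILE, BUCKET, KEY)

# 2. Load it into Snowflake through the external stage.
conn = snowflake.connector.connect(
    account="example_account",      # placeholder credentials
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    cur.execute("""
        COPY INTO STAGING.SALES
        FROM @LANDING_STAGE/sales/sales_20160101.csv
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
finally:
    conn.close()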
Client: Canara SDP  Sep 2013 - Jul 2014
Role: Senior BI ETL Developer / Data Modeler
Understood the requirements and business logic to implement them in solutions.
Created logical/physical data models per the requirements, considering all functional inputs.
Created ETL mappings, besides testing and validating the deliverables of subordinates.
Designed and developed user-specific complex reports and universes per user requirements.
Created reporting requirements with implementation of row/role-based security.
Ensured that sprint-wise deliverables in the Agile methodology were properly rolled out to the production environment without slippage, with support and warranty SLAs met.
Technology: MS Power BI, SAP BO R4.0, SAS, OBIEE 11g, OFSAA, Informatica 9.1, Java, Oracle 11g, PL/SQL & Erwin

Client: AP - Apollo  Mar 2012 - Sep 2013
Role: Data Modeler / Senior BI ETL Developer
Created the necessary Cognos models and packages using Framework Manager and published them.
Worked on Analysis Studio reports for Transformer cubes.
Captured requirements/enhancements from end users and performed impact/gap analysis.
Provided a robust architecture by integrating all the solution components used in the project.
Designed a complex architecture integrating solutions such as geo-location tracking, voice monitoring, face recognition and video analytics, speech recognition and ethical hacking.
Provided the roadmap and execution plan for all deliverables and monitored progress against the schedule.
Provided status reports to senior executive management.
Created a first-of-its-kind data model for the Crime Intelligence domain.
Led the delivery of ETL with data quality routines and complex reporting requirements.
Created ETL mappings with DQ routines; used Information Analyzer to profile various source systems.
Technology: Cognos 10, DataStage, QualityStage, IBM MDM, Identity Insight, ISAS DB2 and InfoSphere Data Architect

Client: United Bank - Risk & EDW Management Solution  Jan 2010 - Mar 2012
Role: Senior BI ETL Developer / Data Architect
Captured requirements from all entity business users and provided gap/impact analysis.
Assisted the entity finance users in identifying reconciliation logic and validating the group balance sheet, P&L with adjustment functionality, and computation of cost allocation and profitability.
Tested the reports and deliverables and ensured that UAT was executed properly for each entity.
Assigned responsibilities as a Technical Lead, involving the following tasks:
Analyzed business requirements and assessed their feasibility.
Provided the team with all the technical and functional assistance in terms of design and development.
Tested MIS and analytical reports and deliverables and ensured that UAT was completed.
Implemented the data model design document, BI architectures, ETL mappings and OLAP reporting.
Reconciled data across the different source systems, thereby validating data loads/rejections.
Technology: OBIEE 11g, OFSAA, SAS, Informatica 8.1, PL/SQL (BTEQ), MS BI, SSIS, SSRS/SSAS, Teradata 12 and UNIX

Client: EMC - Business Intelligence Management Systems (BIMS)  Jul 2007 - Dec 2009
Role: Technical Lead / Solution Architect
Designed the key system architecture per requirements, including logical/physical data models.
Implemented best practices in design, development and documentation standards.
Completed system integration involving many applications, ETL and other reporting services.
Provided training to the technical staff and business users.
Generated reports for various departments such as Pricing, Settlement and the Market Assessment Unit.
Reviewed and communicated with business users, getting sign-off at each phase of the project.
Completed the SIT and UAT phases and provided adequate support for warranty and project closure.
Technology: SAP BO, Crystal Reports, Informatica 7.1, PL/SQL, Oracle and Erwin

Client: KPN iBasis - PINGO Buytool  Jul 2006 - Jul 2007
Role: BI ETL Developer / System Analyst
Analyzed business requirements, reviewed and communicated with business users and obtained sign-off.
Performed system analysis and identified key improvement areas with the IT team through gap analysis.
Created system delivery specification documents, technical specification documents and test plans.
Designed and implemented the necessary data marts and architecture for the reporting services.
Implemented the necessary PL/SQL scripts to extract and load data into the data mart.
Created cross-linked reports per the requirements by providing hyperlinks between reports.
Designed complex reports that satisfy the business needs with improved flexibility and functionality.
Technology: Cognos 8, MS SSIS/RS, SAP BO SDK, PL/SQL, SQL Server, Netezza, UNIX & Java

Client: Goldman Sachs - CADM BO  Mar 2005 - Jul 2006
Role: Senior Software Developer
Led the team, understanding the business needs and the workflow.
Provided production support, ensuring that batch processing completed and status reports were delivered to users on time, and resolved all issues related to the support.
Interacted with users across the globe, understood requirements and provided solutions.
Handled ad-hoc user requirements and maintained security compliance for data and reports.
Delivered reports such as P&L and momentum reports for hedge funds, broker/dealers, institutional, corporate and notable individual clients.
Developed ETL processes for loading data into target fact and dimension tables (a sketch of this pattern follows this section).
Generated reports per business requirements, ensuring the reports were delivered to users on time and resolving all issues related to the support.
Technology: SAP BO, Informatica 7.1, DB2, PL/SQL, UNIX, Autosys and Perl script
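The fact and dimension loading described above is the classic star-schema pattern; the sketch below illustrates a generic surrogate-key lookup during a fact load. It is written in PySpark purely for illustration (the role itself used Informatica, DB2 and PL/SQL), and all table and column names are hypothetical.

# Generic star-schema fact load sketch: look up surrogate keys from a dimension,
# then append the keyed rows to an existing fact table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("fact-load-sketch").enableHiveSupport().getOrCreate()

# Placeholder source with client_id, trade_date, quantity, notional_amt columns.
staged_trades = spark.table("staging.trades")
# Placeholder dimension with client_key (surrogate) and client_id (natural key).
dim_client = spark.table("dw.dim_client")

fact_rows = (staged_trades
             .join(dim_client, on="client_id", how="left")
             .withColumn("client_key", F.coalesce(F.col("client_key"), F.lit(-1)))  # -1 = unknown member
             .select("client_key", "trade_date", "quantity", "notional_amt"))

# Append into the (placeholder) fact table, which is assumed to already exist.
fact_rows.write.mode("append").insertInto("dw.fact_trades")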
