Business Intelligence Data Warehouse
Target Location: US-MI-Novi
Candidate's Name
EMAIL AVAILABLE | PHONE NUMBER AVAILABLE | Michigan

Summary
Over 20 years of experience in Information Technology and client/server applications, with full Data Warehousing (DWH) and Business Intelligence (BI) life-cycle experience across various industries. A seasoned practitioner with several years of consulting in industry-leading Business Intelligence tools and applications.
Professional experience in enterprise data warehousing (EDW) and BI analytics dashboards/reporting.
Professional experience in AWS cloud technologies and cloud computing: configuration and implementation of AWS services such as DMS, S3, EC2, Lambda, RDS, Glue, Redshift, CloudWatch, and VPC using the CLI, Terraform, and Python. Completed AWS professional trainings and many hands-on AWS workshops.
Professional experience in Informatica 10.5, EDW, IBM MDM, IBM WebSphere, SAP BusinessObjects (Webi client), BI analytics application development, Oracle, SQL, OBIEE/OBIA, BI application solutions, and the design, testing, and implementation of data warehouses/data lakes.
Professional experience in SAP BO dashboard/report development and administration, and in BI/DWH/ETL project implementations.
Excellent experience in data operations support (on-call) for data warehouse ETL, data lake, and TWS batch jobs in Windows and Linux environments.

Professional Experience

Sr. Business Intelligence Engineer
Volkswagen Group of America, Michigan
March 2015 - April 2024
As part of the EDM team, developed and managed end-to-end BI client/server applications using DWH processes. Delivered many Business Intelligence (BI) development projects, including data operations tasks and responsibilities to resolve daily production issues.
Designed and implemented cloud-based solutions using AWS services; collaborated with business and development teams to evaluate and identify optimal cloud solutions.
Developed and maintained cloud solutions following best-practice documentation.
Created, configured, and implemented AWS cloud services such as S3, EC2, Lambda, RDS, VPC, Glue, Redshift, EBS, and CloudWatch.
Deployed AWS infrastructure and resources through automation and pull requests using the AWS SDK, CLI, Python, and Terraform infrastructure as code.
Designed and implemented cloud migration projects to AWS; developed AWS roadmap design documentation for major projects.
Interacted with business clients, provided cloud support, and made recommendations based on client needs.
Developed design and architecture flow documents to build and manage highly secure, scalable, and available architectures.
Exceptional experience in BI tools and applications, including development and administration, DWH ETL (Informatica 10.5.4) development, and implementation of BI dashboards/reports and UI customization using SAP BusinessObjects and other BI dashboard/reporting tools.
Excellent experience with BI project standards, detailed designs, and the end-to-end development process; completed many complex DWH projects and developed ETL processes and code in Informatica PowerCenter and BDE 10.2/10.5 after Informatica big data training.
Gathered business and project requirements and created documentation such as BSRs, ETL design documents, and project proofs of concept (POC).
Performed many BI application developments in BO/Oracle BI dashboard reports with end-to-end quality assurance testing; managed and developed the IBM MDM batch process and the TIBCO MDM (Master Data Management) end-to-end process.
Developed IBM TWS batch jobs and job streams based on batch job execution flow.
Excellent experience in BI data operations, handling critical ETL DWH and BI application production issues and job failures daily.
Worked with senior management on new approaches to increase performance and minimize errors and application downtime.
Developed and maintained management and key performance indicator (KPI) BI reports; identified data issues and communicated with the business and end customers to resolve them.
Excellent production troubleshooting and analytical skills.
Environment: Business Intelligence (BI), Data Warehouse (DWH), AWS, Big Data, Informatica 10.5.4, Python, Oracle 19c, AWS cloud computing, Terraform deployment, AWS services (S3, DMS, EC2, Lambda, VPC, Redshift, RDS), PowerCenter ETL mapping/workflow development, SAP BusinessObjects, SQL Server, IBM MDM (Master Data Management), QlikView 11, TIBCO MDM, SAS, IBM TWS, IBM MDM batch jobs, Informatica 10.5.x workflows, Toad for Oracle, Linux.

Sr. Data Warehouse Consultant
Sogeti USA, Michigan
December 2012 - February 2015
Clients: Blue Cross Blue Shield of Michigan / P&G
Led Enterprise Data Warehouse projects, both inbound and outbound, from the design through the implementation phases.
Provided development estimates, from design through post-implementation, to the Delivery Lead and Project Managers.
Worked effectively on multiple projects and on ad-hoc needs of the business users.
Worked with business stakeholders to collect and gather report requirements; gathered business specifications and prepared technical specifications by working closely with business users.
Led the development team and coordinated with Data Architects, Data Modelers, Database Administrators, and the Testing team to ensure development work was completed on time.
Managed project scope, determined daily priorities, and ensured efficient, on-time delivery of project tasks and milestones by following proper escalation paths.
Effectively communicated project risks, issues, and weekly progress reports to the Delivery Lead and other business stakeholders.
Developed Functional Design Documents (FDD) and Technical Design Documents (TDD); communicated and reviewed the design, functional, and technical specifications with the ETL and report development teams; designed complex mappings/code.
Developed stored PL/SQL procedures, packages, and functions; implemented PL/SQL code in ETL mappings and transformations.
Performed data analysis by executing SQL queries using Cognos Analysis and Query Studio, TOAD/SQL Developer, and Teradata SQL Assistant.
Designed and developed high-level Informatica ETL flow diagrams; developed complex ETL code and performed unit testing; designed source-to-target mappings using Informatica PowerCenter 9.5.
Developed ETL code and set up UI configuration tables within the custom mappings/workflows.
Completed end-to-end EDW/ETL testing in the QA1 and QA2 environments.
Developed Informatica performance tuning, naming standards, and project guideline documents; developed a multiple-file-merge UNIX .ksh script within the ETL process.
Designed and developed an automated ETL process for monthly/weekly data feeds for the BCBSM clients.
Performed SIT and UAT testing; developed test cases for EDW/ETL testing.
Developed IBM Tivoli job streams and flow process documents; developed and tuned complex SQL queries in the source qualifier.
Created Lookup and Joiner transformations; designed and developed multiple-target mappings/code with the Router transformation.
Project Technologies/Products: Oracle 11g, Informatica 9.5 (ETL), INFA PowerCenter 9.5, Informatica Data Profiling, Oracle SQL Developer, SQL*Plus, UI configuration / IBM Tivoli, MS Visio 2010, Unix, WinSCP, PuTTY (FTP), Windows 8

Business Intelligence Application Lead
General Electric (GE Capital Americas), Michigan
February 2010 - December 2011
One of the lead contributors to the Business Intelligence Center of Excellence (BI COE) at GE Capital Americas (GECA). Helped the BI COE team roll out OBIEE 11g (Analytics) for the first time across all of GE and laid a foundation for future success. Strong contributor in all phases and areas of expertise, including the SDLC for BI, OBIEE 11g project development architecture, OBIEE 11g best practices, Informatica ETL best practices, QA, production support, and product evaluations; regularly collaborated with and presented to customers, executives, and senior leaders.
As BI COE and BI application team lead, delivered interactive dashboard analytics reporting solutions using OBIEE 11g and IBM Cognos to end users, including design, analysis, gathering business requirements from business users, and building proofs of concept (POC) to get design and requirements approved.
Performed and led UAT BI testing for all releases and project development phases in the staging (QA) environment.
Built the Informatica ETL mapping design process, Informatica ETL standards, and metadata documentation.
Used Informatica 8.6.1 Mapping Designer to create and update mappings with the transformations required to move data into the data warehouse, and Workflow Manager to create sessions and batches to run with the logic embedded in the mappings.
Created full-load and incremental-load workflows.
Performed the OBIEE 11g and IBM Cognos dashboard development process from installation through building each report and dashboard page: OBIEE 11g dashboard and report customizations, complex reports with BI Publisher integration, configuration, RPD development, and performance tuning across OBIEE 11g and BI Server components.
Performed RPD development in the Physical, Logical (BMM), and Presentation layers, including setting up a Multi-User Development (MUD) environment and OBIEE 11g usage tracking; implemented OBIEE 11g end-to-end performance-tuning best practices.
Excellent knowledge of OBIEE 11g administration tasks: handling the BI Server and related configuration XML files; user- and data-level security on dashboards and reports; web catalog migration; RPD merge and development; troubleshooting the web catalog, user management, BI Server, and opmnctl service activities; and handling critical BI Server issues in Oracle 11g Enterprise Manager Fusion Middleware (EM) and the Oracle WebLogic Server 11g Admin Console.
Strong expertise in geospatial visualization integration (OBIEE 11g MapViewer), BI Composer, and SSO integrations.
Environment: OBIEE 11g (11.1.1.5, 11.1.1.6), IBM Cognos, Oracle 11g DB, Informatica 8.6, Teradata DB, Oracle WebLogic Server 11g, Oracle 11g Enterprise Manager Fusion Middleware, SAP BusinessObjects, Oracle SQL Developer, SQL*Plus, Oracle JDeveloper 11g, SAP BusinessObjects (BO) 3.0, jQuery, Unix, Windows 7, OneNote, SnagIt 10.

Sr. BI Developer
Johns Hopkins University - Applied Physics Lab (JHU APL), MD
February 2010 - November 2010
Project: EBSS
The project established a next-generation business intelligence/data warehousing (BI/DW) platform for the Applied Physics Lab (APL) business systems.
The platform delivers ad-hoc query and reporting capabilities in OBIEE that extend the OBIEE 10g, OBIA, and Oracle E-Business Suite Financials, Procurement, and Projects modules.
Responsible for end-to-end design of the process to extract, transform, and load (ETL) the data in preparation for the data warehouse, and for establishing project design documentation and project standards.
Developed OBIEE 10g/Siebel Analytics dashboards and the Admin Tool repository (.rpd); configured custom dashboards; produced metadata definition and ETL design documents; built DFFs and documented the out-of-the-box mapping customization method.
Designed source-to-target mappings using Informatica PowerCenter ETL 8.6.1; developed ETL code using Informatica 8.6.1 and performed unit testing.
Responsible for out-of-the-box SDE/SIL ETL mapping customization; customized prebuilt ETL mappings, adding new columns to the fact and dimension tables.
Created new DFFs in the data warehouse/facts to capture additional and conversion data.
Contributed to the design and development effort for the new JHU APL Business Analytics Warehouse.
Created Informatica PowerCenter 8.6.1 custom reusable transformations (Expressions, Lookups, etc.) for faster development.
Performed OBIEE 10g report and RPD development; created new tables and custom flex-field X_ columns in the Oracle data warehouse using Oracle SQL Developer.
Created new custom workflows and sessions as needed to load the out-of-the-box SDE/SIL data.
Managed SCD (Slowly Changing Dimension) mappings where Lookups were touched and custom DFFs needed to be part of the SCD.
Created new custom Lookup transformations where DFF (flex-field) based lookups were needed.
Environment: Informatica 8.6.1, Oracle E-Business Suite (Financials, Procurement, and Projects modules), Oracle Business Analytics Warehouse (OBAW), OBIEE 10g, Oracle SQL Developer, MS Access, flat files, Unix, Visio, Windows XP/2003

ETL Developer
Systems Personnel, Buffalo, NY
September 2008 - June 2009
Client: Blue Cross Blue Shield, HealthNow
Project: Business Intelligence (BI)
Role & Responsibilities:
Worked with business analysts and users to clarify requirements and translate them into technical specifications.
Created new mappings in Informatica PowerCenter 8.6 and updated existing mappings per changes in business logic; prepared unit test plans, system integration tests, and UAT to check data quality.
Performed ETL data and mapping validation in the QA environment.
Created materialized views for summary data to improve query performance.
Played a major role in understanding the business requirements and in designing, loading, and extracting data into the data warehouse from the Facets system: Pharmacy Claims, Provider, Medical Claims, Members, and BHI/CRMS for McKesson and OPA files.
Used Informatica (ETL) to load data from source to ODS, applying the PowerCenter 8.6 tools Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer to define source and target definitions and to code the data flow from source systems to the data warehouse; created reusable transformations and mapplets based on business rules to ease development.
Wrote SQL and PL/SQL code in source qualifiers and mappings; migrated tested code from Dev to QA and from QA to the Production repository; developed and tested data security and dashboard security in OBIEE.
Used Informatica Workflow Manager and Workflow Monitor to schedule sessions and monitor session status, and UNIX job scripts for scheduling jobs in AutoSys.
Developed complex mappings in Informatica PowerCenter 8.6 to load data from sources such as SQL Server, DB2, and flat files into the data warehouse, using transformations including Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter, and Source Qualifier.
Environment: Informatica PowerCenter 8.6.1, EDW, PowerExchange, Siebel Analytics, Cognos 8, Oracle 10g, IBM DB2, SQL Server, MS Access, flat files, WinSQL, Unix, TOAD, Erwin 4.1, Visio, Windows XP/2003

Data Warehouse Consultant
JAWOOD, Michigan
Client: Blue Cross Blue Shield of Michigan (BCBSM)
Project: Business Intelligence Enterprise Data Warehouse (EDW)
Responsibilities:
Reviewed the business and functional requirement documents (BRD) and identified test scenarios.
Performed unit, user acceptance, and end-to-end testing of different subject areas (Medical Claims, Master Medical Claims, Pharmacy Claims, Member and Customer, and Provider).
Developed test plans and test cases based on high-level and detailed designs; extensively performed manual testing, requirements verification testing, and functional testing.
Participated in test case walkthroughs and inspection meetings; worked with the development and data warehouse teams to solve problems encountered in test execution.
Created and executed SQL queries in the SIT, QA1, QA2, and Production environments to validate dollar-amount counts.

Database Developer
Felix Health Technologies, Michigan
January 2003 - March 2007
Client: Horizon BCBS, NJ
Role & Responsibilities:
A multi-year project to build a data warehouse to identify the most profitable or potentially profitable customers for future interaction and to perform claim analysis to gauge claims-processing efficiency. Risk modeling produced risk measures that could then be used to calculate the right premium amount.
This included technical architecture design, data extraction, metadata management, and the integration, mapping, and loading of target tables.
Environment: Siebel Analytics, Oracle 8/9i, Informatica PowerCenter 7.1.4, flat files, MS SQL Server, SQL*Loader, Visio, Windows 2000.

Education
Master's in Computer Science
February 1993 - June 1996
Cambridge College of Computer Science, Lahore, Pakistan

AWS Trainings:
AWS Certified Cloud Practitioner (AWS Workshop)
AWS Certified Solutions Architect - Associate (AWS Workshop)
AWS Technical Essentials (AWS Workshop)

Certifications:
Oracle 8/8i Certified Database Administrator (DBA), Oracle Corp., USA
Certification in Oracle Developer

Key Skills
Business Intelligence
Oracle DB
Informatica
Data Warehouse (DWH)
SQL
PL/SQL
AWS
Cloud Computing
AWS Services: EC2, VPC, S3, Lambda, CloudWatch, Glue, Redshift, DMS (Migration)
QuickSight
SAP BusinessObjects
Linux
Python
MDM
ETL
SFTP/FTP
Oracle OBIEE
TWS
Oracle SQL Developer
Toad for Oracle
