
Data Analyst Modeling Resume Raleigh, NC

Candidate Information
Title Data Analyst Modeling
Target Location US-NC-Raleigh
Candidate's Name
Senior Data Analyst
Phone: PHONE NUMBER AVAILABLE
Email: EMAIL AVAILABLE

Professional Summary:
- Over 9 years of IT experience in the analysis, design, development, testing, and implementation of ETL and Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, and client/server applications.
- Experience in gathering and creating business requirements, translating business requirements into technical requirements, and writing functional and technical specifications.
- Performed various types of testing, such as functional, system, regression, and user acceptance testing.
- Experienced in training, documentation, and implementation in a business environment.
- Experience in data analysis, data validation, data modeling, data cleansing, data verification, and identifying data mismatches.
- Highly proficient in Agile, Iterative, Waterfall, and Scaled Agile Framework (SAFe) software development life cycles.
- Experience with data governance policies, business glossaries, data dictionaries, metadata and master data, data lineage, and data quality rules.
- Experienced in documenting data flow diagrams for existing and future reports, providing valuable input for report design and optimization efforts.
- Capable of developing comprehensive requirements specifications and design specifications to guide data modeling and implementation efforts.
- Proven track record of mentoring and educating team members on best practices and industry standards in data modeling and analysis.
- Expertise in Kimball dimensional data modeling, with experience in tools such as Oracle SQL Developer and ER/Studio Data Architect for Oracle.
- Expertise in test case design, test tool usage, test execution, and defect management.
- Worked on data scrubbing/cleansing, data quality, data mapping, data profiling, and data validation in ETL.
- Extensively used ETL methodology for data extraction, transformation, and loading in a corporate-wide ETL solution using DataStage.
- Performed data extraction, transformation, and loading (ETL) between systems using SQL tools such as SSIS.
- Proven development experience using the full suite of Business Objects reporting tools: Web Intelligence, Crystal Reports, Information Design Tool, and Universe Design Tool.
- Strong delivery skills, including requirements gathering, gap analysis, effort estimation, functional design, UAT support, master data administration, training, and risk analysis.
- Experience in the design of Data Warehouses/Data Marts, Star Schema, Snowflake Schema, ODS, ETL processes, data requirement analysis, data modeling and ER diagrams, and the development, testing, documentation, and implementation of business applications.
- Good understanding of the Oracle Data Dictionary, Oracle Workflow, data flow diagrams, ER diagrams, data warehousing concepts, RDBMS, and normalization techniques.
- Familiar with data warehousing tools; used ETL tools such as Informatica 8.x/7.x to extract and load data.
- Extensively used Data Definition Language (DDL), Data Manipulation Language (DML), and Data Control Language (DCL) in SQL to process large data requests.
- Proficient at developing ad hoc queries and reports using Teradata SQL and UNIX to fulfill data requests from business and financial analysts.
- Proficient at creating reports, Pivot Tables, VLOOKUP, HLOOKUP, INDEX, MATCH, and graphs in MS Excel; proficient with MS PowerPoint, MS Access, and MS Word.
- Ability to collaborate with peers in both business and technical areas to deliver optimal business process solutions in line with corporate priorities.
- Proficient in Oracle E-Business Suite (EBS), with a focus on versions 12.2 or higher.
- Expert in Tableau Desktop 10.x/9.x/8.x, Tableau Reader, and Tableau Server; experienced in the analysis, modeling, design, and development of Tableau reports and dashboards for analytics.
- Expertise in creating dashboards and stories and developing different chart styles, including bar charts, line charts, tree maps, Gantt charts, circle views, scatter plots, bullet graphs, histograms, heat maps, geo maps, and text graphs in the Tableau Desktop environment.
- Conducted thorough analysis of SAP transactions and ABAP code to translate them into SQL, ensuring accurate mapping and integration.
- Led the creation and optimization of ETL packages using SSIS, improving data processing efficiency.
- Ability to quickly adapt to and learn new software applications.
- Proficient in Oracle database technologies and tools.
- Strong experience in data modeling, with expertise in creating Star and Snowflake schemas, fact and dimension tables, and physical and logical data models using Erwin and Embarcadero.

Technical Skills:
Data Warehousing: Informatica 9.1/8.6/7.1.2 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), SSIS, DataStage 8.x
Languages: RDBMS, T-SQL, SAP, PL/SQL, C/C++, Python, ABAP, Hadoop, Java, Visual C++, Visual Basic 6.0, HTML, XML
Cloud: AWS (EC2, S3, RDS, Redshift, EMR), Google Cloud (BigQuery, Kubernetes)
Databases: MySQL, PostgreSQL, Teradata, MS SQL Server, Informatica PowerCenter, MongoDB, Hive, Presto, AWS RDS, AWS Redshift, AWS Redis, BigQuery
Tools: Tableau, Hadoop, Hive, Apache Airflow, Apache Spark, Flask, Apache Kafka, Jupyter Notebook, Excel, Jira, Git, Docker, Kubernetes, TOAD, BTEQ, Teradata SQL Assistant
Operating Systems: Windows, UNIX
Data Modeling: Erwin 4.0, Power Designer, Microsoft Visio 2003, ER Studio, star-schema modeling, snowflake-schema modeling, fact and dimension tables, Pivot Tables, Sybase PowerDesigner
Testing Tools: WinRunner, LoadRunner, TestDirector, Mercury Quality Center, Rational ClearQuest
Packages: NumPy, Pandas, Scikit-learn, TensorFlow, Matplotlib, Seaborn, Plotly, NLTK

Professional Experience:

Ally Bank - Sandy, Utah | April 2022 to Present
Role: Senior Data Analyst
Responsibilities:
- Worked with business partners and team members to gather and analyze requirements, translating them into database designs supporting transactional system data integration, reports, spreadsheets, and dashboards.
- Used agile software development with the Scrum methodology.
- Involved in planning, defining, and designing a database using Erwin based on business requirements, and provided documentation.
- Familiarity with Agile PLM (Product Lifecycle Management) Release 9 and above.
- Used Pandas, NumPy, Seaborn, SciPy, Matplotlib, Scikit-learn, and NLTK in Python for developing various machine learning solutions, applying algorithms such as linear regression, multivariate regression, naive Bayes, random forests, K-means, and KNN for data analysis.
- Created performance metrics using quantitative analysis and benchmark scorecards to evaluate team performance using MS Excel, SQL, Access, and PowerPoint, helping executives gain actionable insights.
- Worked extensively with advanced analysis using actions, calculations, parameters, background images, maps, trend lines, statistics, and log axes; used groups, hierarchies, and sets to create detailed summary reports and KPI dashboards.
- Designed ETL packages for different data sources (SQL Server, flat files, XML, etc.) and loaded the data into target data sources by performing different data transformations using SSIS.
- Involved in designing and creating schema objects such as database tables, indexes, views, materialized views, and synonyms.
- Developed both relational and dimensional data models to ensure efficient storage, retrieval, and analysis of data.
- Collaborated with architects to develop physical data models that align with enterprise architecture standards and requirements.
- Designed facts and dimensions within the enterprise data warehouse, following the Kimball dimensional data modeling framework.
- Assisted in the creation of a Data Governance Risk Matrix that assigns a risk rating to every external data exchange in a consistent manner.
- Integrated data from various modules within the ERP system to create a comprehensive and unified dataset.
- Analyzed Google Analytics e-commerce impact and resolved conversion hurdles.
- Created lists of ICD-9 to ICD-10 CM/PCS codes by usage percentile and provided them to stakeholders to determine whether the codes covered the best or all possibilities.
- Extensively used SQL procedures in SAS to create SAS datasets and generate reports from them when Teradata was offline for maintenance.
- Designed, created, and implemented WebI reports, web services (BIWS), and Design Studio dashboards using best practices.
- Utilized Oracle Analytics tools to create dashboards, reports, and visualizations that provide actionable insights.
- Studied client reports in Excel BI and converted them into OBIEE reports.
- Automated routine pixel-perfect reporting tasks using Python and SQL, reducing the time needed to generate reports by 40% and increasing the team's productivity.
- Cleansed and validated data to maintain accuracy and consistency.
- Ensured a seamless flow of information between different departments and modules.
- Used Business Objects XIR3 extensively and was involved in the full SDLC (software development life cycle), ensuring its rules were followed when designing new components.
- Experience in data extraction from, and loading into, heterogeneous data sources such as Oracle, MS SQL Server, flat files, and XML, loading into SAP ECC, SAP HANA, and data warehouses.
- Contributed to the design and implementation of data warehouse solutions on the Teradata platform, supporting business analytics and decision-making processes.
- Performed data analysis and provided actionable insights to stakeholders, leveraging Teradata's data warehousing and analytics capabilities.
- Created SSIS packages for the import and export of data between the MS SQL Server 2014 database and other sources such as MS Excel and flat files.
- Created logical and physical data models and reviewed them with the business and data architecture teams.
- Led the end-to-end implementation of Tableau dashboards, using Informatica PowerCenter ETL processes to extract, transform, and load data from diverse sources.
- Conducted training sessions for end users on Oracle Analytics capabilities and best practices.
- Worked on a web metrics platform using Google Analytics Premium.
- Transformed the logical data model into a physical data model, ensuring primary key and foreign key relationships, consistent definitions of data attributes, and primary index considerations.
- Created SQL scripts to find data quality issues and to identify keys, data anomalies, and data validation issues.

CVS Health - Woonsocket, Rhode Island | June 2020 to Mar 2022
Role: Sr. Data Analyst
Responsibilities:
- Analyzed the specifications provided by the client and developed the specification documents for reports as per the client's requirements.
- Designed, developed, and debugged stored procedures and configuration files, and implemented best-practice packages to maintain optimal performance.
- Generated regular reports and dashboards for stakeholders using ERP data.
- Created dashboard reports using Power BI, Cognos, and Tableau.
- Involved in migration projects moving data from data warehouses on Oracle/DB2 to Teradata.
- Conducted regular audits of workforce data to ensure data accuracy, consistency, and compliance with company policies and industry regulations.
- Troubleshot and resolved issues related to data migration in the Oracle EBS environment.
- Collaborated with cross-functional teams to understand business requirements and translate them into effective data migration strategies within Oracle EBS.
- Planned and executed data migration projects within the Oracle ecosystem.
- Proficient in visualizing and designing enterprise data management frameworks, including defining processes for data planning, acquisition, maintenance, and usage.
- Created automated solutions using Databricks, Spark, Python, Snowflake, and HTML.
- Worked on a web metrics platform using Google Analytics.
- Responsible for managing data, deduplicating it, and storing it in a specific data warehouse using the Talend Platform for Data Management and MDM.
- Designed logical and physical data models and performed reverse engineering and complete compares for Oracle and SQL Server objects using ER/Studio.
- Reverse engineered the existing data marts and identified the data elements (in the source systems), dimensions, facts, and measures required for reports.
- Collaborated with HR teams to streamline workforce planning processes, incorporating data-driven approaches to recruitment, onboarding, and talent management.
- Conducted scenario analysis to assess the impact of various supply chain disruptions and developed contingency plans to mitigate risks.
- Defined facts and dimensions and designed the data marts using Ralph Kimball's dimensional data mart modeling methodology with ER/Studio.
- Involved in data profiling and performed data analysis based on the requirements, which helped catch many sourcing issues up front.
- Developed data mapping, transformation, and cleansing rules for data management involving OLTP and ODS.
- Conducted scenario analysis and sensitivity testing to assess the impact of various workforce-related factors on business operations, aiding strategic planning.
- Analyzed existing data structures and developed strategies for efficient data migration.
- Created and maintained key supply chain performance metrics, providing regular reports and insights to support strategic decision-making.
- Customized ERP reports and dashboards based on user requirements.
- Integrated Power BI with other Microsoft tools and platforms, such as Azure services, SharePoint, and Teams, for seamless data sharing and collaboration.
- Developed the test plan, test cases, and test scenarios used in testing, based on business requirements, technical specifications, and/or product knowledge.
- Analyzed Business Objects reports/extracts and submitted them to the risk manager after proper validation.
- Designed and implemented data extraction, transformation, and loading (ETL) processes.
- Implemented the Copy activity and custom Azure Data Factory pipeline activities.

Nationwide - Columbus, Ohio | Feb 2018 to May 2020
Role: Data Analyst
Responsibilities:
- Created design documents using Erwin Data Modeler (ER diagrams) and MS Visio (flowcharts).
- Created mappings to load data from source to staging and from staging to reporting tables, applying business requirements using Informatica PowerCenter.
- Assisted and worked with ETL developers and data modelers to make sure the model being developed met the business requirements.
- Demonstrated expertise in supply chain planning, including demand forecasting, inventory management, and order fulfillment.
- Developed and executed test plans to validate the success of data migration activities.
- Created data visualizations and reports based on MongoDB data using tools such as Tableau and Power BI.
- Experience using SSIS tools such as the Import and Export Wizard, Package Installation, and SSIS Designer.
- Performed analysis, coding, testing, implementation, and troubleshooting on production reports and tables, and produced ad hoc reports using SAS.
- Extensively used the MultiLoad, FastLoad, TPump, and FastExport Teradata utilities in both ETL and UNIX scripts for high-volume flat files.
- Implemented access controls and monitored user permissions within the ERP system.
- Fine-tuned SQL queries and PL/SQL blocks for maximum efficiency and fast response using Oracle hints and explain plans.
- Used Teradata as both a source and a target for several mappings, and worked with Teradata loaders within the Workflow Manager to configure FastLoad and MultiLoad sessions.
- Designed ETL mapping specifications for the ETL team, covering source-to-target mappings and the corresponding business rules.
- Designed, created, and maintained universes for Business Objects and Web Intelligence users using Business Objects and MS Excel.
- Created schemas and packages with SQL scripts in SAP HANA and loaded flat-file data into SAP HANA Studio.
- Proposed strategies to implement HIPAA 4010 in the new MMIS system and eventually move to HIPAA 5010.
- Facilitated HIPAA integration and applied ANSI ASC X12 standards for 837 (P, I, D), 270/271, 276/277, 278, and 835 transactions.
- Created ETL test data and tested all ETL mapping rules against the functionality of the Informatica mappings and Ab Initio graphs.
- Ensured data security and compliance with relevant regulations.
- Built the model on the Azure platform, using Python and Spark for model development and Dash by Plotly for visualizations.
- Extensive experience in text analytics, generating data visualizations using R and Python, and creating dashboards using tools such as Tableau.
- Worked on predictive analytics use cases using Python.
- Conducted testing to confirm that HL7 messages from each application conformed to the appropriate HL7 interface specification.
- Involved in designing and developing data models and data marts that support the Business Intelligence data warehouse.
- Met with relevant stakeholders to obtain EDI contacts for all vendors, communicate testing schedules, and ensure project completion.
- Implemented data exploration to analyze patterns and select features using Python SciPy.
- Led the development of Business Requirements Documentation (BRD) for analysis related to the ICD-10 mandate.
- Extensive experience in data migration projects within the Oracle EBS environment.
- Utilized SSRS 2008 through Visual Studio 2008 and interfaced to the Oracle SAP database through a local Oracle connection.
- Tailored the product for each client, providing customizations in the data warehouse, ETL mappings, OBIEE Answers, dashboards, and BI Publisher.

Serco - Hyderabad, India | July 2014 to Oct 2017
Role: Data Analyst
Responsibilities:
- Responsible for creating and managing the Business Objects repository and coordinating/performing universe development.
- Designed, developed, validated, and integrated reporting and information solutions with Design Studio.
- Performed forward and reverse engineering, applying DDLs to the database while restructuring the existing data model using Erwin.
- Documented data migration processes, procedures, and outcomes.
- Proficient in using tools such as Python, R, and SQL to analyze large datasets related to supply chain operations.
- Designed ETL specification documents to load the data into the target using various transformations according to the business requirements.
- Set up automated systems using Excel VBA to pull data smoothly into the Tableau platform to create reports and dashboards.
- Collected data from internal sources using MySQL and performed data cleaning and aggregation for reporting and further analysis.
- Manipulated data and calculated key metrics for reporting using MySQL queries (window functions, subqueries) and MS Excel.
- Developed interactive dashboards using tools such as Tableau to visualize key performance indicators (KPIs) and facilitate data-driven decision-making.
- Visualized data from MySQL and MS Excel using Tableau dashboards to monitor email performance.
- Connected to the MySQL database using the R package RMySQL and retrieved data for statistical analysis and visualization.
- Performed exploratory data analysis (EDA) to find insights, such as efficiency differences among devices, using ggplot2 in R.
- Actively involved in T-SQL programming, implementing stored procedures, functions, cursors, and views for different tasks.
- Handled performance tuning of SSIS packages and error output.
- Reviewed and analyzed data mapping documents to determine ETL program design.
- Developed SQL joins, SQL queries, tuned SQL, views, test tables, and scripts in a development environment.
