Candidate Information
Title: Business Intelligence Data Analyst
Target Location: US-NJ-Jersey City
Candidate's Name
EMAIL AVAILABLE | PHONE NUMBER AVAILABLE | LINKEDIN LINK AVAILABLE

PROFESSIONAL SUMMARY:
- 8+ years of experience in data analysis and data modeling, data development, and the implementation and maintenance of databases and software applications.
- Good understanding of the Software Development Life Cycle (SDLC), including planning, analysis, design, development, testing, and implementation.
- Working experience with advanced Microsoft Excel functions, ETL (Extract, Transform, and Load) of data into a data mart, and Business Intelligence (BI) tools such as Microsoft Power BI and Tableau (data visualization and analytics).
- Hands-on experience writing SQL queries to extract, transform, and load (ETL) data from large data warehouse systems, and extracting, transforming, and loading data from spreadsheets, database tables, and other sources using SQL Server Integration Services (SSIS) packages and Informatica.
- Good knowledge of and working experience with AWS tools such as Amazon S3 and Amazon Redshift.
- Extensive experience using ER modeling tools such as Erwin, ER/Studio, and PowerDesigner.
- Experience conducting Joint Application Development (JAD) sessions with SMEs, stakeholders, and other project team members for requirements gathering and analysis.
- Professional experience in data modeling and data analysis design of OLTP and OLAP systems.
- Excellent proficiency in Agile/Scrum and Waterfall methodologies.
- Good experience working with reporting tool environments such as SQL Server Reporting Services (SSRS) and Business Objects.
- Experience in data warehousing applications using ETL tools and programming languages such as Python and SQL/PL-SQL, Oracle and SQL Server databases, Informatica, and SSIS.
- Experience providing custom solutions such as eligibility criteria, match, and basic contribution calculations for major clients using Informatica, with reports in Tableau/Power BI.
- Extensively used Python libraries including PySpark, Pytest, PyMongo, cx_Oracle, PyExcel, Boto3, Psycopg, embedPy, NumPy, and Beautiful Soup.
- Experienced with PyTorch and TensorFlow; able to use PyTorch's nn module to build CNNs on training data sets (a minimal sketch follows this summary).
- Excellent experience writing and executing unit, system, integration, and UAT scripts in data warehouse projects.
- Closely collaborated with sales and marketing teams on A/B testing to improve and track sales from different channels, using tools such as Google Analytics, Python, Tableau, and BigQuery.
- Experience developing conceptual and logical models and physical database designs for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems.
- Excellent experience writing SQL queries to validate data movement between different layers in a data warehouse environment.
- Good understanding of the Ralph Kimball (dimensional) and Bill Inmon (relational) modeling methodologies.
- Experience with Teradata utilities such as FastExport and MultiLoad (MLOAD) for handling various tasks.
- Good working experience with Data Vault, used to maintain historical data in the enterprise data warehouse.
- Excellent knowledge of preparing required project documentation and tracking and regularly reporting project status to all stakeholders.
- Expert in building reports using SQL Server Reporting Services (SSRS), Crystal Reports, Power BI, and Business Objects.
- Experienced in MDM (Master Data Management): removing duplicates, standardizing data, and eliminating incorrect data.
- Excellent understanding of and working experience with industry-standard methodologies such as the System Development Life Cycle (SDLC) under the Rational Unified Process (RUP), Agile, and Waterfall.
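A minimal sketch of the kind of torch.nn-based CNN referenced above; the layer sizes, the 28x28 grayscale input, and the 10-class output are illustrative assumptions, not details of any specific project:

    # Minimal CNN built with PyTorch's nn module. Layer sizes, input shape,
    # and the 10-class output are illustrative assumptions.
    import torch
    import torch.nn as nn

    class SimpleCNN(nn.Module):
        def __init__(self, num_classes=10):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x28x28 -> 16x28x28
                nn.ReLU(),
                nn.MaxPool2d(2),                              # -> 16x14x14
                nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 32x14x14
                nn.ReLU(),
                nn.MaxPool2d(2),                              # -> 32x7x7
            )
            self.classifier = nn.Linear(32 * 7 * 7, num_classes)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    model = SimpleCNN()
    logits = model(torch.randn(8, 1, 28, 28))  # batch of 8 grayscale images
    print(logits.shape)                        # torch.Size([8, 10])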
TECHNICAL SKILLS:
Languages: Python 3.x, R, SQL, T-SQL
Databases: MS SQL Server, MySQL, SQLite, PostgreSQL, MS Access, Quickbase
Methodologies: Agile, Scrum, and Waterfall
Libraries: Scikit-Learn, Keras, TensorFlow, NumPy, Pandas, Matplotlib, Seaborn
Statistical Methods: Hypothesis Testing, ANOVA, Time Series, Confidence Intervals, Bayes' Law, Principal Component Analysis (PCA), Dimensionality Reduction, Cross-Validation
Business Intelligence and Predictive Models: Regression Analysis, Bayesian Methods, Decision Trees, Random Forests, Support Vector Machines, Neural Networks, K-Means Clustering, KNN, Ensemble Methods, Natural Language Processing
Reporting Tools: Tableau 10.x/9.x/8.x (Desktop, Server, and Online), Microsoft Power BI, Smartsheet, Google Sheets, Google Data Studio
Data Visualization: Tableau, Microsoft Power BI, QlikView, Qlik Sense, Quickbase, Matplotlib, Seaborn
Machine Learning: Regression, Clustering, SVM, Decision Trees, Classification, Recommendation Systems, etc.
Big Data Framework: Amazon EC2, S3, and EMR
ETL/Data Warehouse Tools: Web Intelligence, Talend, Informatica, Tableau
Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and Dimension Tables, Pivot Tables

PROFESSIONAL EXPERIENCE:

Client: ABS Eagle, NJ    May 2023 - Present
Role: Data Analyst
Responsibilities:
- Created complex mappings for delivering incremental comparative reports and extracts containing changes, based on comparing the EPIC data brought into MDM against the PVD (Provider Verification Data).
- Processed data from tools such as Snowflake, writing complex queries in SQL or SAS using complex joins, subqueries, table creation, and aggregation, and applying DDL, DQL, and DML concepts.
- Developed complex mappings for creating match-merge reports for ACO providers.
- Tuned the performance of outbound ETL; performed change data capture (CDC) based on the MDM last-update date, which is indexed and maintained by MDM (a sketch of this pattern follows this section).
- Used the parameterization template mapping/session for each base object and made it the first task in the workflow.
- Applied predictive analytics and healthcare fraud detection and prevention methods using R, SAS Enterprise Miner, and machine learning.
- Worked on the production support team to resolve production issues caused by data, UI code, and backend code.
- Tuned Informatica session performance to optimize the EMPI data load-to-landing process; created pre-landing tables in Oracle and bulk-loaded them, then used the SQL transformation in Informatica Designer to populate the actual landing tables from the pre-landing tables.
- Created mappings for loading data files received from different sources such as ECHO, HMS, and CLARITY; worked with different kinds of input files, including CSV, XML, and tab-delimited files.
- Created views used by the MicroStrategy team to display provider data in the portal.
- Worked with business analysts and the DBA on requirements gathering, business analysis, and the design of the data warehouse.
- Used PL/SQL stored procedures as part of mappings to improve performance.
- Configured the match rule set property by enabling search-by-rules in MDM according to business rules.
- Created a mapplet for address validation and used it in various mappings.
- Worked on Informatica IDQ with the data quality management team.
- Performed data profiling on a regular basis using custom rules.
- Designed and developed data quality procedures using IDQ.
- Worked with the Generator, Matcher, Associator, and Consolidator for analysis in IDQ.
- Developed matching mapplets and address validations.
- Created events and tasks in workflows using Workflow Manager.
- Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects, and hierarchies.
- Developed shell scripts for running and scheduling batch jobs.
Environment: Informatica MDM, Workflow Manager/Monitor, Power Exchange, Informatica IDQ, Oracle, SQL Server, Tableau, Erwin, MS Office.
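A minimal sketch of the last-update-date CDC pattern mentioned above, shown with SQLite so it runs anywhere; the table and column names are hypothetical placeholders, not the actual MDM schema:

    # Sketch of last-update-date CDC: pull only the rows changed since the
    # previous successful extract. The c_provider table and its columns are
    # hypothetical placeholders, not the real MDM schema.
    import sqlite3

    def extract_changes(conn, checkpoint):
        """Return rows whose last_update_date is newer than the checkpoint."""
        sql = """
            SELECT provider_id, provider_name, last_update_date
            FROM c_provider
            WHERE last_update_date > ?  -- an index on this column keeps the scan cheap
            ORDER BY last_update_date
        """
        return conn.execute(sql, (checkpoint,)).fetchall()

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE c_provider (provider_id, provider_name, last_update_date)")
    conn.executemany("INSERT INTO c_provider VALUES (?, ?, ?)",
                     [(1, "Provider A", "2024-01-01"), (2, "Provider B", "2024-02-01")])
    print(extract_changes(conn, "2024-01-15"))  # only the row updated after the checkpoint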
Client: NYC Administration for Children's Services    May 2022 - May 2023
Role: Sr. Data Modeler/Analyst
Responsibilities:
- Served as a Sr. Data Modeler/Analyst generating data models using Erwin and deploying them to the enterprise data warehouse.
- Participated in design discussions and ensured functional specifications were delivered in all phases of the SDLC in an Agile environment.
- Developed data mapping, transformation, and cleansing rules for data management involving OLTP and OLAP.
- Wrote DDL and DML statements for creating and altering tables and converting characters into numeric values.
- Developed rule sets for data cleansing and actively participated in data cleansing and anomaly resolution for the legacy application.
- Imported training weight files using Python (NumPy) and TensorFlow's assign, and created a function to output detection boxes with Python (a short sketch of the weight-loading step follows this section).
- Researched and developed hosting solutions using MS Azure for the service solution.
- Involved in data analysis and reducing data discrepancies between the source and target schemas.
- Developed and deployed quality T-SQL code, stored procedures, views, functions, triggers, and jobs.
- Created data masking mappings to mask sensitive data between the production and test environments.
- Worked on data mining and data validation to ensure the accuracy of the data between the warehouse and source systems.
- Worked on the Master Data Management (MDM) hub and interacted with multiple stakeholders.
- Worked with business users during requirements gathering and prepared conceptual, logical, and physical data models.
- Worked extensively with Tableau for data visualization, producing interactive graphs.
- Created various physical data models based on discussions with DBAs and ETL developers.
- Created PL/SQL packages and database triggers, developed user procedures, and prepared user manuals for the new programs.
- Reverse-engineered the existing data marts and identified the data elements (in the source systems), dimensions, facts, and measures required for reports.
- Documented data dictionaries and business requirements for key workflows and process points.
- Performed data accuracy, data analysis, and data quality checks before and after loading the data.
- Used SAS procedures such as MEANS and FREQ, along with other statistical calculations, for data validation.
- Developed the Kimball-style data warehouse model with multiple data marts and conformed dimensions for the proposed central model of the project.
- Involved in writing queries and stored procedures using MySQL and SQL Server.
- Designed the data marts with dimensional data modeling using star and snowflake schemas.
- Maintained and implemented data models for the enterprise data warehouse using Erwin.
- Involved in the development and implementation of SSIS, SSRS, and SSAS application solutions for various business units across the organization.
- Worked with project management, business teams, and departments to assess and refine requirements for designing BI solutions using MS Azure.
- Involved in migration projects moving data from data warehouses on Oracle to Teradata.
- Performed data manipulation using MS Excel pivot tables and produced various charts for mock reports.
- Created data flow and process documents and ad-hoc reports to derive requirements for existing system enhancements.
Environment: Erwin 9.7, Teradata 15, Oracle 12c, ETL, SQL, PL/SQL, SAS, MDM, MySQL 8.0.13, T-SQL, SSIS, SSRS, OLAP, OLTP, SSAS, Azure, Tableau.
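A minimal sketch of loading pretrained weights from a NumPy file into TensorFlow variables with assign, as mentioned above, written in TF 2.x style; the shapes and names are illustrative assumptions, and the in-memory dict stands in for an actual weight file:

    # Sketch: copy weights saved with NumPy into TensorFlow variables via
    # assign. The dict stands in for np.load("weights.npz"); the shapes and
    # names are illustrative assumptions.
    import numpy as np
    import tensorflow as tf

    weights = {"kernel": np.random.randn(4, 2).astype("float32"),
               "bias": np.zeros(2, dtype="float32")}  # stand-in for np.load(...)

    kernel = tf.Variable(tf.zeros([4, 2]))
    bias = tf.Variable(tf.zeros([2]))

    kernel.assign(weights["kernel"])  # overwrite the variable's contents in place
    bias.assign(weights["bias"])
    print(kernel.numpy().shape)       # (4, 2)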
Client: MASHAEXIM, Mumbai, India    Jan 2019 - May 2022
Role: Data Analyst
Responsibilities:
- Designed and developed analytics, machine learning models, and visualizations that drove performance and provided insights, from prototyping to production deployment, for product recommendation and allocation planning.
- Performed independent analysis on complex data sets using JMP, MS Excel, and Tableau.
- Worked on upgrading Tableau dashboards to version 10.3 to take advantage of new features and other improvements and enhancements.
- Worked extensively in user and production support: resolved production issues, created stories for larger changes, and supported users and the business, including granting application access and extracting application reports and statistics.
- Extracted data from various data sources, prepared interactive dashboards using various kinds of charts and graphs, filtered data with logical filters, and published the results.
- Created highly reliable dashboards, reports, and presentations for a wide range of customers.
- Involved in exploring, designing, building, and deploying high-level dashboards on critical metrics for the Board of Directors and decision makers, using Tableau.
- Created database objects such as tables, temp tables, views, stored procedures, functions, indexes, triggers, and PIVOT queries in SQL Server 2008 and 2012.
- Collected data, fitted it to linear, polynomial, and Gaussian models, and calculated the associated parameters (a sketch follows this section).
- Applied advanced statistical principles and calculated various parameters.
- Used data sources from SQL Server, Oracle, SSAS cubes, and files to generate reports in Tableau.
- Hands-on experience designing the data conversion strategy, developing data mappings, and designing Extraction, Transformation, and Load (ETL) routines for migrating data from different sources.
- Supported clients as a Subject Matter Expert (SME) through troubleshooting and preparing ad-hoc reports and presentations.
- Established and documented Standard Operating Procedures (SOPs) for each stage of product development.
- Extracted data from databases (Oracle and SQL Server) using Informatica to load into a single Teradata data warehouse repository.
- Tested the database to check field-size validation, check constraints, and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
- Used the critical path method to complete projects effectively within the project timeline and budget.
- Worked with cross-functional teams to identify risks, plan risk responses and mitigation, and develop risk resolutions.
- Coordinated with external teams and consultants for successful knowledge transfer through technology transfer.
Environment: SQL Server 2008, Python 3.x, Tableau 9.x/8.x, MS Excel, Oracle, Informatica, ETL, Teradata, SSAS, Agile Scrum.
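A minimal sketch of the linear/polynomial/Gaussian fitting described above, using NumPy and SciPy; the data and starting parameters are synthetic, fabricated purely for illustration:

    # Fit the same noisy series with a line, a cubic polynomial, and a
    # Gaussian, then report the fitted parameters. All data is synthetic.
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(0)
    x = np.linspace(-3, 3, 50)
    y = np.exp(-x**2 / 2) + rng.normal(0, 0.05, x.size)  # noisy bell curve

    slope, intercept = np.polyfit(x, y, 1)  # linear fit
    cubic_coeffs = np.polyfit(x, y, 3)      # polynomial (cubic) fit

    def gaussian(x, a, mu, sigma):
        return a * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

    (a, mu, sigma), _ = curve_fit(gaussian, x, y, p0=[1.0, 0.0, 1.0])
    print(f"linear slope={slope:.3f}; gaussian a={a:.3f}, mu={mu:.3f}, sigma={sigma:.3f}")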
Client: Wipro, India    Dec 2016 - Jun 2018
Role: IT Business Intelligence Intern / Data Analyst
Responsibilities:
- Evaluated new applications and identified system requirements.
- Visualized KPI metrics such as resource utilization, net profit margin, gross profit margin, and burn rate using Tableau.
- Worked on time series analysis using Pandas to identify patterns in how asset variables change, which in turn helped project completion by 70% (a sketch of the approach follows this section).
- Recommended solutions to increase revenue, reduce expenses, and maximize operational efficiency, quality, and compliance.
- Identified business requirements and analytical needs from potential data sources.
- Performed SQL validation to verify the integrity and record counts of data extracts in the database tables.
- Worked with ETL developers on testing and data mapping and stayed aware of the data models used to translate and migrate data.
- Created Requirements Traceability Matrices (RTMs) using Rational RequisitePro to ensure complete requirements coverage with reference to low-level design documents and test cases.
- Assisted the Project Manager in developing both high-level and detailed application architecture to meet user requests and business needs; also assisted in setting project expectations and evaluating the impact of changes on project plans, conducted project-related presentations, and performed risk assessment, management, and mitigation.
- Collaborated with different teams to analyze, investigate, and diagnose the root cause of problems and publish root cause analysis (RCA) reports.
- Used advanced SQL queries and analytic functions for date calculations, cumulative distribution, and NTILE calculations.
- Used advanced Excel formulas and functions such as pivot tables, LOOKUP, IF with AND, and INDEX/MATCH for data cleaning.
Environment: SQL, ETL, data mapping, Tableau, NTILE, RCA, RTMs, pivot tables, KPI metrics.
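A minimal sketch of the kind of Pandas time-series pattern analysis mentioned above; the asset series is synthetic and the 7-day window and 2-unit threshold are illustrative assumptions:

    # Smooth a daily asset series with a rolling mean and compare each point
    # to the trend to flag unusual movements. The data is synthetic and the
    # 7-day window is an illustrative choice.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)
    idx = pd.date_range("2018-01-01", periods=90, freq="D")
    asset = pd.Series(100 + np.cumsum(rng.normal(0, 1, len(idx))), index=idx)

    trend = asset.rolling(window=7).mean()    # 7-day moving average
    deviation = asset - trend                 # departure from the trend
    flagged = deviation[deviation.abs() > 2]  # days drifting > 2 units off trend
    print(flagged.head())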
