Candidate Information
Title: Business Intelligence Power BI
Target Location: US-MO-Kansas City

Candidate's Name
EMAIL AVAILABLE | PHONE NUMBER AVAILABLE | LinkedIn

Professional Summary
- 10+ years of experience in software development across front-end and back-end stacks, including debugging and maintenance of enterprise web applications.
- Able to deliver standalone projects independently and work well within a team.
- Worked in Waterfall, Scrum, and test-driven development environments.
- Experience with Agile, RUP, Waterfall, Spiral, AIM, and evolutionary methodologies.
- Experience developing applications using Java Spring Boot, Maven, Eclipse, MVC architecture, Angular 7, Redux state management, PCF, Concourse, CI/CD, JavaScript, HTML, and Bootstrap.
- Proficient in developing systems built on SQL Server 2005/2008 using tables, triggers, views, and stored procedures in SQL, and in maintaining the database, including requirement analysis, design, data conversion, loading, and implementation.
- Experienced with version control systems such as Git, GitHub, CVS, and SVN to keep code versions and configurations organized.
- Experience with the Tableau reporting/data visualization tool and with integration between Alteryx and Tableau; processed data in Alteryx to prepare it for reporting.
- Proficient in site reliability and business intelligence tools such as Grafana, Graphite, Power BI, and the Microsoft BI stack.
- Technically adept in SQL, Snowflake, Python, PySpark, Pandas, Databricks, and the AWS cloud (S3, Redshift, and QuickSight).
- Well-versed in data governance principles, processes, and frameworks for metadata and data quality.
- Hands-on experience with ETL, Hadoop, business intelligence, and data governance tools such as Tableau, IBM IGC, Informatica Enterprise Data Catalog, and Collibra.
- Broad knowledge of database management systems and associated tools, including Oracle, Cassandra, Erwin Data Modeler, and Oracle BI Publisher.
- Highly skilled in n-tier architecture data solutions, resource scaling (on-premises and Azure infrastructure), service health efficiency, and system telemetry.
- Senior data analyst and ETL developer skilled in multiple databases, platforms, SQL, and ETL tools and methods; experience covers full solution development and the SDLC.
- Extensive implementation, customization, and administration of PowerApps, Power Automate, and Power BI, with a thorough understanding of the Office 365 Admin Center.
- ETL, database, warehouse, and full SQL solution development, both locally and in the cloud.
- Enthusiastic, with the ability to quickly understand and apply new technologies.
- Experience transferring data from AWS S3 to AWS Redshift using Informatica.
- Experienced in Informatica ILM (Information Lifecycle Management) and its tools.
- Good knowledge of SQL queries and of creating database objects such as stored procedures, triggers, packages, and functions using SQL and PL/SQL to implement business logic.
- Expert in tabular modeling, multidimensional cubes, DAX, and MDX (SSAS).
- Extensively worked on Spark using Scala on a cluster installed on top of Hadoop, building advanced analytical applications.
- Experience in Tableau Desktop and Power BI for data visualization, reporting, and analysis: crosstabs, scatter plots, geographic maps, pie charts, bar charts, page trails, and density charts.
- Prepared dashboards using calculations and parameters in Tableau.
- Improved the performance of existing algorithms in Hadoop using Spark Context, Spark SQL, DataFrames, pair RDDs, and YARN (see the sketch after the skills list below).
- Developed a Power BI dashboard to understand the types and profiles of DTE customers.

TECHNICAL SKILLS
Data Modeling Tools: Erwin r9.7/9.6, ER/Studio V17
Big Data & Hadoop Ecosystem: MapReduce, Spark 3.3, PySpark, HBase 2.3.4, Hive 2.3, Flume 1.9, Sqoop 1.4.6, Kafka 2.6, Oozie 4.3, Hue, Cloudera Manager, Neo4j, Hadoop 3.3, Apache NiFi 1.6
NoSQL Databases: MongoDB, Azure SQL DB, Cassandra 3.11.10
Databases: Microsoft SQL Server 2017, Teradata 15.0, Oracle 12c, MS Access
Cloud Platforms: GCP, Google BigQuery, AWS (EC2, S3, Redshift), MS Azure
BI Tools: Tableau 10, SSRS, Crystal Reports, Power BI
Programming Languages: SQL, PL/SQL, UNIX shell scripting, R, Scala
Operating Systems: Microsoft Windows Vista/7/8/10, UNIX, Linux
Methodologies: Agile, RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Waterfall
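
A minimal PySpark sketch of the kind of Spark SQL / DataFrame tuning mentioned in the summary (broadcasting a small dimension table and caching a reused result). The paths, table names, and columns are hypothetical, and cluster-specific configuration is omitted:

    # Broadcast join + caching sketch; paths and columns are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("spark-tuning-sketch").getOrCreate()

    orders = spark.read.parquet("hdfs:///data/orders")        # large fact table
    customers = spark.read.parquet("hdfs:///data/customers")  # small dimension

    # Broadcasting the small dimension avoids shuffling the large fact table.
    enriched = orders.join(F.broadcast(customers), "customer_id")

    # Cache a result that several downstream aggregations reuse.
    enriched.cache()

    daily = (enriched.groupBy("order_date")
             .agg(F.sum("amount").alias("total_amount"),
                  F.countDistinct("customer_id").alias("unique_customers")))
    daily.show()
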
Professional Experience:

CVS, Dallas, TX
Sr. Data Analyst    June 2023 - Present

Responsibilities:
- Drove projects as a Data Analyst using Spark, SQL, and the Azure cloud environment.
- Participated in requirement-gathering sessions to understand expectations and worked with system analysts to understand the format and patterns of the upstream source data.
- Performed data migration from an RDBMS to a NoSQL database, providing a complete picture of data deployed across the various data systems.
- Prepared Python and shell scripts to automate administration tasks.
- Worked with a combination of unstructured and structured data from multiple sources and automated the cleaning using Python scripts.
- Used Python to preprocess data and surface insights.
- Designed and implemented end-to-end data solutions (storage, integration, processing, and visualization) in Azure.
- Developed and executed ETL applications, stored procedures, and triggers.
- Supported development troubleshooting of data defects in transactional and data warehouse systems.
- Developed stored procedures and ETL processes on Oracle as needed to extract, transform, and load data from multiple source systems into the data warehouse.
- Expedited root cause analysis for the Data/BI Governance team by verifying metadata, producing detailed reports, and prioritizing data quality issues across lines of work.
- Managed the strategy and vision of Enterprise Data Management in support of the Data/BI Governance program.
- Assisted the Data/BI Governance Manager with the formation of a new data documentation process that improved data usage and quality for the entire organization.
- Worked with data warehouse developers to evaluate the impact on the current implementation and to redesign all ETL logic.
- Designed ETL jobs to extract data from scanners and from source files of different downstream applications.
- Performed ad hoc data cleansing and standardization using Alteryx.
- Processed data in Alteryx to create TDE extracts for reporting, and used Alteryx data blending when merging different sources for Tableau.
- Experience with the latest BI tools: Tableau, QlikView, and Power BI.
- Successfully used Agile/Scrum for requirements gathering and facilitated user story workshops; documented user stories and facilitated story point discussions to analyze the level of effort on project specifications.
- Educated team members and key stakeholders on Scrum methodology and the Agile framework.
- Used Tableau features to create drill-downs, filters, and interactivity based on user requirements; created action filters, parameters, and calculated sets for preparing dashboards and worksheets.
- Developed Tableau workbooks from multiple data sources using data blending.
- Created complex mappings using various transformations and developed strategies for the extraction, transformation, and loading (ETL) mechanism.
- Analyzed, designed, and built modern data solutions using Azure PaaS services to support data visualization.
- Worked with PowerApps and Power Automate to create apps.
- Worked with Informatica MDM, Oracle MDM, and Semarchy MDM.
- Managed expansion of the financial data warehouse with data sourced from Oracle EBS, Salesforce, and Semarchy to provide a self-service analytics solution accessed with Power BI.
- Managed migration of IBM TM1 Planning and Analytics from the on-premises Café/Perspectives tools to a cloud solution using Planning Analytics for Excel (PAx).
- Performed QA on BI applications developed in Tableau for clients.
- Set up data profiles and scorecards using the Informatica Developer tool to monitor data quality as part of the Data/BI Governance initiative.
- Acted as a mediator between the technical and business units, communicating the message and needs from a Data/BI governance perspective.
- Designed a pseudo-automatic integration process to integrate the data governance portal with the MDM hub for enforcing certain Data/BI Governance standards.
- Used Tableau to visually present data according to client needs.
- Good experience with the data visualization tools Tableau and QlikView.
- Expertise in developing applications using PowerApps (canvas and model-driven apps), Common Data Service (CDS), SQL, Forms, SharePoint Online, Dynamics 365 CRM, Azure, C#, ASP.NET, and web services.
- Utilized Python to retrieve data from various sources, including APIs, databases, and web scraping.
- Extensive experience using Tableau (Desktop/Server), Teradata, Business Objects Enterprise, Business Objects Data Integrator, and SQL Server.
- Built dashboards in Confidential QuickSight to provide business intelligence on active partners, open accounts, fraud accounts, and marketing-eligible accounts.
- Worked extensively on development and maintenance of Alteryx workflows.
- Troubleshot data issues and provided support within internal systems for Python scripts, Alteryx workflows, and Tableau dashboards.
- Developed a new company reporting system, using Oracle PL/SQL to build all related packages, procedures, and functions, and the IBM Cognos BI tool (versions 8 to 10.1.1).
- Experience with cloud versioning technologies such as GitHub.
- Used Git, GitHub, and Amazon EC2 with deployment via Heroku; used extracted data for analysis and performed various mathematical operations with the Python libraries NumPy and SciPy.
- Used DAX functions to create measures and calculated fields for logical visualizations, and used Power Query to transform the data.
- Performed data comparison between SDP (Streaming Data Platform) real-time data, AWS S3 data, and Snowflake data using Databricks, Spark SQL, and Python (see the sketch after this section).
- Owned service health concerns such as OLTP system tuning, data freshness SLAs, and data analysis telemetry; soon after taking charge, delivered a significant improvement in data processing SLA metrics.
- Used Data Analysis Expressions (DAX) to create custom calculations in PowerPivot for Microsoft Excel workbooks and Analysis Services tabular model projects.
- Performed custom ETL on multivariate data in diverse formats from a variety of commercial and government healthcare organizations.
- Implemented tracking in Mixpanel to capture specific user interactions.
- Experience in data cleansing with MDM and in data profiling.
- Worked with the ETL team to develop Informatica mappings for data extraction and for loading data from source systems to the MDM Hub landing tables.
- Understood the customer/vendor/item (MDM) data model in line with the client architecture.
- Worked on Power BI reports using multiple visualization types, including line charts, doughnut charts, tables, matrices, KPIs, scatter plots, and box plots.

Environment: Hadoop 3.3, Spark 3.3, Scala 3.0, Azure, Azure Data Factory (ADF), Azure SQL DB, Azure Synapse, AWS, Telemetry, Mixpanel, Snowflake, Databricks, Tableau, SQL Server, ETL, Agile, Scrum, BI governance, JSON, GitHub, Power BI, DAX, Power Apps, Semarchy, Alteryx, Microsoft BI Stack, QuickSight, Erwin Data Modeler, data analytics, Python 3.9, PL/SQL
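
A minimal Databricks-style PySpark sketch of the S3-versus-Snowflake comparison described above. The bucket, connection options, table, and key column are all hypothetical, and the Snowflake Spark connector is assumed to be installed on the cluster:

    # Compare an S3 dataset against a Snowflake table by key; names are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("s3-vs-snowflake-compare").getOrCreate()

    s3_df = spark.read.parquet("s3://example-bucket/events/")

    sf_df = (spark.read.format("snowflake")
             .option("sfURL", "example.snowflakecomputing.com")
             .option("sfDatabase", "ANALYTICS")
             .option("sfSchema", "PUBLIC")
             .option("sfWarehouse", "COMPUTE_WH")
             .option("sfUser", "etl_user")
             .option("sfPassword", "***")
             .option("dbtable", "EVENTS")
             .load())

    s3_df.createOrReplaceTempView("s3_events")
    sf_df.createOrReplaceTempView("sf_events")

    # Keys present in S3 but missing from Snowflake: a simple reconciliation check.
    missing = spark.sql("""
        SELECT event_id FROM s3_events
        EXCEPT
        SELECT event_id FROM sf_events
    """)
    print("rows in S3 but not in Snowflake:", missing.count())
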
Safeway, Pleasanton, CA
Sr. Data Analyst    April 2021 - May 2023

Roles and Responsibilities:
- Hands-on experience using Azure Data Factory (ADF) to perform data ingestion into Azure Data Lake Storage (ADLS).
- Met with business and user groups to understand the business process, then gathered requirements and carried out analysis, design, development, and implementation according to client requirements.
- Assisted the database modelers in preparing the logical and physical data models, ascertained that requirements were met, and worked on loading the tables in the data warehouse.
- Designed and developed Azure Data Factory (ADF) pipelines extensively for ingesting data from relational and non-relational source systems to meet business functional requirements.
- Designed and developed event-driven architectures using blob triggers and Data Factory.
- Created pipelines, data flows, and complex data transformations and manipulations using ADF and PySpark with Databricks.
- Experience in all phases of the data warehouse life cycle (analysis, design, development, and testing) as part of a data quality design framework; developed extraction and loading using Informatica.
- Designed, developed, implemented, and executed marketing campaigns for US card customers using Unica Affinium Campaign, Snowflake, AWS S3, PySpark, and Databricks.
- Worked daily with lead data warehouse developers to evaluate the impact on the current implementation and to redesign all ETL logic.
- Worked extensively with the ERwin Model Mart for version control.
- Delivered data views for the data wrangling and classification process, automated for overall quality of care within the practice; prepared reports for the CMS system and shared them via GitHub.
- Worked closely with four project teams and uploaded files onto GitHub.
- Gathered data from multiple sources, cleaned it, analyzed it, and generated reports using SAS Studio, SAS Enterprise Guide, BigML, Alteryx, and PROC SQL.
- Worked in an Agile environment; facilitated daily stand-up meetings and developed the burn-down chart, project backlog, and release backlog.
- Used Jira for defect tracking, test management, and the Agile SDLC process: created and managed user stories, Gantt charts, sprint burn-down charts, project management controls, project plans, and timeline schedules; facilitated RAD sessions; and reviewed software defects.
- Worked with the team to design and develop Alteryx workflows and Tableau dashboards.
- Created automated solutions using Databricks, Spark, Python, Snowflake, and HTML.
- Performed data audits and error handling for Alteryx and Tableau workflows.
- Responsible for maintaining Alteryx workflow connections with upstream sources and the visuals of downstream KPIs.
- Created Talend jobs to load data into various MySQL and PostgreSQL tables.
- Worked on Power BI dashboards and ad hoc DAX queries for Power BI.
- Implemented various Tableau functions, extracts, parameters, conditional formatting, and color coding in Tableau Desktop dashboards and reports.
- Developed calculated columns and measures using complex DAX functions and Power Pivot.
- Consolidated data from multiple sources into a central data warehouse environment using Talend as the ETL tool.
- Built the model on the Azure platform using Python and Spark for model development, with Dash by Plotly for visualizations.
- Analyzed formatted data with machine learning algorithms using Python scikit-learn (see the sketch after this section).
- Generated various graphical capacity planning reports using Python packages such as NumPy and matplotlib.
- Implemented data exploration to analyze patterns and select features using Python SciPy.
- Core expertise in developing reports and dashboards, data visualization, visual analytics, data mining, business analytics, data analysis, data quality management, gap analysis, and master data management (MDM) using Tableau, Excel, MS SQL, and the Microsoft BI stack (SSIS, SSAS, SSRS).
- Provided BI analysis using various BI tools (such as Alteryx and Tableau) to review key metrics.
- Strong experience using the Microsoft BI stack (SSIS, SSRS, SSAS, SQL, and Power BI); worked with SQL Server 2012/2014/2016.
- Created batch groups in Informatica MDM Hub to run the staging, match-and-merge, and load jobs per their dependencies.
- Performed data analysis using Tableau to reflect customers' core applications and knowledge base.
- Responsible for Tableau Server implementation, including maintenance of sites, users, groups, and projects.
- Worked on data shaping, transformation, and modeling in Power BI using DAX and Power Query Editor.
- Created reports and dashboards in Power BI using Power Query, DAX functions, and visuals (charts, graphs, KPIs, maps, gauges, Power KPI, matrix, etc.).
- Served as ETL cube developer/lead for offshore resources (including Web/BizTalk).
- Defined standards and best practices for ETL and reporting.
- Expertise in the Erwin Data Modeler tool for ER, relational, and dimensional data modeling; prepared conceptual, logical, and physical ER data models covering all entities using Erwin Data Modeler 9.64.
- Extensive experience working on Oracle 9i/10g, writing complex SQL queries and PL/SQL procedures, functions, packages, database triggers, exception handlers, ref cursors, and database objects.
- Designed and developed the ETL (extract, transform, load) strategy to populate the data warehouse from various source system feeds using tools such as Informatica, PowerExchange, MDM web services, PL/SQL, and Unix shell scripts.
- Created and presented data-driven reports and visualizations to communicate insights to stakeholders.

Environment: Hadoop 2.x, Spark v2.0.2, Hive, Sqoop, Kafka, Spark Streaming, ETL, Scala, Control-M, Python (Pandas, NumPy), PySpark, Git (version control), GitHub, Semarchy, SQL Server, Data Factory, AWS (EC2, S3, EMR, RDS, Lambda, Kinesis, Redshift, CloudFormation), Snowflake, Databricks, Tableau, MySQL, Alteryx, BI governance, Agile, Scrum, Erwin Data Modeler, Microsoft BI Stack, Power Apps, DAX, MS SQL, MongoDB, Telemetry, Mixpanel, data analytics
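
A minimal scikit-learn sketch of the kind of analysis mentioned above: training and evaluating a simple classifier. The CSV file, feature columns, and target column are hypothetical:

    # Train/evaluate a simple classifier; file and column names are hypothetical.
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import classification_report

    df = pd.read_csv("customer_features.csv")
    X = df[["tenure_months", "monthly_spend", "support_tickets"]]
    y = df["churned"]

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)

    model = RandomForestClassifier(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))
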
State of CA, SFO, CA
Sr. Data Analyst    December 2019 - March 2021

Roles and Responsibilities:
- Designed the project flow, including components, services, and the store (state, reducers, selectors, and effects).
- Worked with the UI/UX team on responsiveness of the project across multiple platforms.
- Drafted the application in a layered, hierarchical manner, working through child/parent component relationships with event emitters and cross-component calls.
- Wrote SQL and PL/SQL queries on data staging tables and data warehouse tables to validate the data results.
- The incumbent data warehouse was having trouble keeping pace, and its slow response time was preventing users from getting necessary information in time to be useful; after assessing the significant performance gains available, we decided to migrate to Teradata.
- Identified and documented data sources and transformation rules required to populate and maintain data warehouse content.
- Worked on ETL migration from DataStage to Talend Open Studio 5.5 and handled various issues during code migration.
- Worked on user management for Power BI, sharing dashboard access in Power BI online.
- Developed and maintained data pipelines on the Azure analytics platform using Azure Databricks, PySpark, Python, and the Pandas and NumPy libraries (see the sketch after this section).
- Used the GitHub source code management tool for code reuse and high portability, extending the scope of automation testing.
- Responsible for indexing the tables in the data warehouse.
- Actively participated in business intelligence standardization, creating database layers with user-friendly views in Teradata for use in developing various Tableau reports and dashboards.
- Analyzed requirements for various reports, dashboards, and scorecards, and created proofs of concept/prototypes for them using Tableau Desktop.
- Utilized advanced Tableau features to link data from different connections on one dashboard and to filter data in multiple views at once.
- GitHub repositories, database management, and oral/written communication.
- Realistic understanding of data modeling concepts (dimensional and relational), such as star-schema and snowflake-schema modeling.
- Extensively used analytical features in Tableau, such as statistical functions and calculations, trend lines, and forecasting, in many Tableau reports.
- Created reports and dashboards in Power BI using Power Query, DAX functions, and visuals (charts, graphs, KPIs, maps, gauges, Power KPI, matrix, etc.).
- Designed and built an Azure Data Factory framework to ingest Office 365 and Dynamics 365 data into the data lake and modeled it as required for the data science project.
- Automated the ETL jobs and exposed API calls to POST data to the Semarchy layer, along with GET API calls.
- Used Snowflake functions to perform semi-structured data parsing entirely with SQL statements.
- Bound data within the model using Angular Material libraries, including Material components.
- Worked with multiple data formats (JSON/XML) for calling REST APIs and stored data in the front-end store.
- Performed data analysis using SQL, PL/SQL, Python, Spark, Databricks, Teradata SQL Assistant, SQL Server Management Studio, and SAS.
- Utilized Alteryx/Tableau integration capabilities to further optimize data ETL, transformations, reporting, and dashboards.
- Manipulated data with SQL in Oracle and MySQL, and performed ETL with Alteryx.
- Cleaned data in Python and Alteryx.

Environment: Hadoop 2.x, Spark v2.0.2, Hive, Sqoop, Kafka, Spark Streaming, ETL, Scala, Control-M, Python (Pandas, NumPy), PySpark, GitHub, Git (version control), BI governance, MySQL, Alteryx, MS SQL, MongoDB, AWS (EC2, S3, EMR, RDS, Lambda, Kinesis, Redshift, CloudFormation), DAX, Data Factory, data analytics, Snowflake, Databricks, Tableau
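
A minimal sketch of the kind of Databricks pipeline step described above: PySpark for the heavy lifting, with a small aggregate collected to pandas/NumPy for last-mile shaping. The storage path and columns are hypothetical:

    # PySpark-to-pandas pipeline step; path and column names are hypothetical.
    import numpy as np
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("adls-pipeline-sketch").getOrCreate()

    raw = spark.read.json("abfss://raw@exampleaccount.dfs.core.windows.net/claims/")

    cleaned = (raw
               .dropDuplicates(["claim_id"])
               .withColumn("claim_date", F.to_date("claim_date"))
               .filter(F.col("amount") > 0))

    # A small monthly aggregate is collected to pandas for final reporting work.
    monthly = (cleaned
               .groupBy(F.date_format("claim_date", "yyyy-MM").alias("month"))
               .agg(F.sum("amount").alias("total")))
    pdf = monthly.toPandas()
    pdf["log_total"] = np.log1p(pdf["total"])
    print(pdf.sort_values("month").head())
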
Charter Communications, St. Louis, MO
Data Analyst    March 2017 - November 2019

Roles and Responsibilities:
- Installed and developed with Apache Hadoop big data components, including HDFS, MapReduce, YARN, Hive, HBase, Sqoop, Pig, Ambari, and NiFi.
- Migrated from JMS Solace to Apache Kafka and used ZooKeeper to manage synchronization, serialization, and coordination throughout the cluster (see the streaming sketch after this section).
- Created action filters, parameters, and calculated sets for preparing dashboards and worksheets in Tableau.
- Developed Tableau workbooks from numerous data sources using data blending.
- Implemented Tableau mobile dashboards via the Tableau mobile application.
- Extensively designed and developed Azure Data Factory (ADF) pipelines for ingesting data from relational and non-relational source systems to fulfill business functional needs.
- Extracted, transformed, and loaded data from source systems to Azure data storage services using a mix of Azure services.
- Improved execution by employing distributed caching for small datasets, partitioning and bucketing in Hive, and map-side joins.
- Maintained data engineering solutions by monitoring, automating, and refining them on a regular basis.
- Created a linked service to upload data from an SFTP location to Azure Data Lake.
- Created and maintained Control-M job definitions, specifying the required inputs, outputs, and dependencies for each job to ensure accurate and consistent data processing.
- Created several Databricks Spark jobs with PySpark to perform numerous table-to-table operations.
- Experienced with Agile and Waterfall methodologies in a fast-paced environment.
- Integrated Control-M with various data processing and ETL tools, platforms, and systems, such as Hadoop, Spark, and relational databases, ensuring seamless data flow.

Environment: Azure Data Factory (ADF v2), Azure Databricks (PySpark), Azure Data Lake, Spark (Python/Scala), Hive, Apache NiFi 1.8.0, Jenkins, Kafka, Spark Streaming, Docker containers, PostgreSQL, RabbitMQ, Celery, Flask, ELK Stack, AWS, MS Azure, GitHub, Azure SQL Database
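
A minimal Spark Structured Streaming sketch of the Kafka consumption pattern described above. It assumes the spark-sql-kafka connector package is available on the cluster; the broker address and topic name are hypothetical:

    # Consume a Kafka topic and print running counts; broker/topic are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("kafka-streaming-sketch").getOrCreate()

    events = (spark.readStream.format("kafka")
              .option("kafka.bootstrap.servers", "broker1:9092")
              .option("subscribe", "customer-events")
              .load())

    # Kafka delivers key/value as binary; cast the value to string before use.
    counts = (events
              .select(F.col("value").cast("string").alias("payload"))
              .groupBy("payload").count())

    query = (counts.writeStream
             .outputMode("complete")
             .format("console")
             .start())
    query.awaitTermination()
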
Citi Bank, Hyderabad, India
Data Modeler    August 2013 - February 2017

Responsibilities:
- As a Data Modeler, worked in a story-driven Agile development methodology and actively participated in daily scrum meetings.
- Assisted in overseeing compliance with the enterprise data standards, data governance, and data quality.
- Used the ER/Studio tool to develop a conceptual model based on business requirements analysis.
- Involved in designing conceptual, logical, and physical models using ER/Studio and built data marts using hybrid Inmon and Kimball DW methodologies.
- Developed requirements and performed data collection, cleansing, transformation, and loading to populate facts and dimensions for the data warehouse.
- Used a forward engineering approach for designing and creating databases for the OLAP model.
- Developed star- and snowflake-schema-based dimensional models to build the data warehouse.
- Presented data scenarios via ER/Studio logical models and Excel mockups to better visualize the data.
- Designed SSIS packages to extract, transfer, and load (ETL) existing data into SQL Server from different environments for the SSAS cubes.
- Worked with the reporting analyst and the reporting development team to understand reporting requirements.
- Created 3NF business-area data models with a de-normalized physical implementation of the data.
- Scheduled cube processing from staging database tables using SQL Server Agent in SSAS.
- Used Normalizer, Lookup, Router, XML Source, Stored Procedure, and other transformations to meet the requirements.
- Created stored procedures using T-SQL and tuned the databases and back-end processes.
- Generated test data and tested the database against the functional deliverables in the project documentation and specifications.
- Built dynamic business solutions and dashboards with Excel Power Pivot, DAX, and Power Query.
- Maintained and performed data mining, cleansing, and validation using Excel VLOOKUP and other Excel functions (see the sketch after this section).
- Conducted analysis documentation to identify areas of gaps/risks using Excel charts, pivots, and VLOOKUP.
- Worked with data scientists developing Python code for machine learning algorithms and other data science applications.

Environment: ER/Studio, SQL Server, Excel, Python
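
A minimal pandas sketch of a VLOOKUP-style validation like the one described above: left-joining a reference table and flagging unmatched rows. File names and columns are hypothetical:

    # VLOOKUP-style validation in pandas; file and column names are hypothetical.
    import pandas as pd

    orders = pd.read_csv("orders.csv")      # rows to validate
    products = pd.read_csv("products.csv")  # reference ("lookup") table

    merged = orders.merge(products[["product_id", "product_name"]],
                          on="product_id", how="left", indicator=True)

    # Rows with no match in the reference table, analogous to VLOOKUP #N/A.
    unmatched = merged[merged["_merge"] == "left_only"]
    print(f"{len(unmatched)} order rows reference unknown products")
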
