
Data Analyst Resume, Detroit, MI

Candidate Information
Title: Data Analyst
Target Location: US-MI-Detroit
Candidate's Name
Data Analyst
PHONE NUMBER AVAILABLE | EMAIL AVAILABLE

SUMMARY
- Around 6 years of IT experience in business analysis, system analysis, data analysis, software design, development, implementation, and maintenance of Business Intelligence solutions across the Agriculture, Airline Services, Banking, and Retail domains.
- Skilled in data analytics, business analysis, R programming, Python, ETL design, front-end development, and analytical reporting.
- Highly proficient in Agile, Iterative, Waterfall, and Scaled Agile Framework (SAFe) software development life cycles.
- Leveraged GCP tools such as BigQuery and Data Studio to conduct large-scale data analysis and create visual reports, enhancing data accessibility for stakeholders.
- Expert in designing and developing analytical reports and dashboards using guided analytics, interactive dashboard design, and visual best practices to convey the story inside the data, integrating data from multiple sources to help users identify critical KPIs and facilitate strategic planning in the organization.
- Strong knowledge of SQL on SQL Server and Oracle databases.
- Worked in data Extraction, Transformation, and Loading (ETL) using tools such as SQL Server Integration Services (SSIS) and Pipeline Pilot.
- Experienced as an AWS cloud engineer: created and managed EC2 instances and S3 buckets, configured RDS instances, monitored performance, spun up instances regularly, and helped developers gain access to cloud instances.
- Managed data ingestion pipelines into Snowflake using AWS Glue and ADF for seamless integration of structured and semi-structured data.
- Proficient in the analysis, design, and implementation of data warehousing/BI solutions using Tableau, Spotfire, Alteryx, Pipeline Pilot, Informatica, MicroStrategy, and database technologies.
- Excellent knowledge of designing and developing dashboards in TIBCO Spotfire by extracting data from multiple sources (flat files, Excel, Access, SQL Server, Oracle).
- Designed and implemented ETL pipelines on AWS that ingest data from various sources into S3, ensuring seamless access for downstream analytics.
- Integrated custom HTML elements into Tableau dashboards to enhance user experience with seamless, interactive web-based elements.
- Leveraged web technologies and frameworks such as Bootstrap and AngularJS to develop and optimize responsive web elements integrated with data visualizations.
- Good understanding of Tableau Desktop architecture for designing and developing dashboards.
- Extensive Tableau experience in enterprise environments, including Tableau administration: technical support, troubleshooting, report design, and monitoring of system usage.
- Outstanding data analysis skills, including data mapping from source to target database schemas, data cleansing and processing, writing data extract scripts, programming data conversions, and researching complex data problems.
- Hands-on experience with dimensional data modeling, star schema and snowflake schema modeling, fact and dimension tables, and physical and logical data modeling.
- Worked with relational databases (RDBMS) such as MySQL and PostgreSQL, ensuring efficient data retrieval and manipulation for analytical projects.
- Strong knowledge of database management: writing complex SQL queries and stored procedures, database tuning, query optimization, and resolving key performance issues.
- Implemented data streaming solutions using Apache Kafka, enabling real-time data ingestion and processing for timely analytics insights.
- Highly motivated team player with excellent interpersonal and customer-relations skills, proven communication, organizational, analytical, and presentation skills, and leadership qualities.
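The summary above repeatedly references extract-transform-load work. As a minimal, hypothetical sketch of that pattern (using Python's built-in sqlite3 as a stand-in staging store, not the SSIS/Glue/Redshift pipelines the resume describes, and with invented sample data):

```python
import sqlite3

# Extract: hypothetical source rows (in a real pipeline these would come
# from flat files, S3 objects, or upstream databases)
source_rows = [
    {"order_id": 1, "region": "MI", "amount": "120.50"},
    {"order_id": 2, "region": "GA", "amount": "75.00"},
    {"order_id": 3, "region": "MI", "amount": "bad"},  # dirty record
]

def transform(rows):
    """Transform: cast amounts to float, dropping rows that fail validation."""
    clean = []
    for row in rows:
        try:
            clean.append((row["order_id"], row["region"], float(row["amount"])))
        except ValueError:
            continue  # data-quality rule: skip unparseable amounts
    return clean

# Load: write the cleaned rows into a staging table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_orders (order_id INT, region TEXT, amount REAL)")
conn.executemany("INSERT INTO staging_orders VALUES (?, ?, ?)", transform(source_rows))

# Downstream analytics query, e.g. totals by region
totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM staging_orders GROUP BY region"
).fetchall())
print(totals)  # e.g. {'GA': 75.0, 'MI': 120.5}
```

The validation step mirrors the data-quality enforcement mentioned later in the resume: bad records are rejected before load rather than propagated to reporting.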
SKILLS
Analysis Tools: Tableau, TIBCO Spotfire, Power BI, MS SSRS, MS SSAS, QlikView, MS Excel (VBA, VLOOKUP, pivot tables)
Data Integration (ETL) Tools: Alteryx, Pipeline Pilot, Informatica, MS SQL Server Integration Services (SSIS)
Project Management: SDLC (Waterfall and Agile methodologies), project life cycle, MS Project 2013, JIRA
Programming: SQL, PL/SQL, HTML, CSS, JavaScript, Python, R, C#, Java, Spark, PySpark
Cloud Computing: Amazon Web Services (AWS: Lambda, S3, EC2, VPC, EMR, Glue, Amazon Redshift); MS Azure (Blob Storage, Data Factory, Data Lake, Databricks, Azure SQL DB, Azure Synapse); Google Cloud Platform (BigQuery, Bigtable, Dataproc, IAM)
Databases: MySQL, SQL Server, Google BigQuery, Amazon RDS, PostgreSQL, Netezza, Snowflake
Data Warehousing and Big Data Technologies: Snowflake, Hadoop, Apache Spark, Databricks, Kafka
Application Software: MS Office, Eclipse, MATLAB, Visual Studio, MS Access, MS Visio, Toad for Oracle
Manual Testing: requirement analysis, test scenario and test case design, test data management, test execution, database testing, regression testing, defect tracking

PROFESSIONAL EXPERIENCE
Elevance Health, Atlanta, Georgia (FEB 2023 - Present)
Data Analyst
Responsibilities:
- Designed, developed, tested, and maintained Tableau reports based on user requirements.
- Prepared reports, extracts, and integrations of existing data as needed to support analysis, visualizations, and statistical models.
- Applied CI/CD practices to automate ETL deployment using GitLab/Jenkins, ensuring rapid and reliable code releases; improved operational efficiency by streamlining pipeline testing, version control, and code deployment.
- Analyzed Snowflake usage patterns to optimize costs, implementing best practices for resource allocation and storage management.
- Developed interactive dashboards and reports using Qlik Sense and QlikView, improving data visibility and decision-making for cross-functional teams.
- Implemented the Tableau JavaScript API to create dynamic, responsive visualizations that adapt to user inputs, providing more engaging and interactive dashboards.
- Maintained and optimized data pipelines using PySpark and Spark, improving data processing efficiency by 30% through performance tuning and resource management.
- Designed, developed, and maintained scalable ETL workflows using PySpark, AWS Glue, and Databricks to manage large datasets.
- Created reusable data layers that facilitated self-service analytics, empowering end users to generate insights without IT intervention.
- Developed complex data scripts using advanced functions such as set analysis, ApplyMap, and what-if scenarios, enhancing analytical capabilities.
- Designed and implemented complex application architectures to enhance data flow and integrity, resulting in a 30% increase in data retrieval efficiency.
- Automated report creation using various APIs, reducing manual reporting time by 40% and ensuring timely delivery of insights.
- Automated ETL jobs using AWS Lambda and Step Functions for seamless scheduling and monitoring.
- Designed, developed, and provided technical solutions through Tableau reports based on user requirements.
- Worked with end users and customers to understand business requirements, recommended technical solutions, and documented functional requirements.
- Led the deployment and configuration of connector server clients for SSO and MFA across diverse Active Directory domains, improving user authentication processes.
- Managed data ingestion processes into HDFS, optimizing data storage and retrieval for enhanced analytics performance.
- Extracted, transformed, and loaded data from multiple sources into AWS S3, Redshift, and on-premises databases, ensuring smooth integration across platforms.
- Developed data processing applications using Apache Flink to analyze streaming data, improving response times for business-critical decisions.
- Leveraged NPrinting for efficient report distribution and management, ensuring stakeholders received up-to-date information promptly.
- Utilized AngularJS, HTML, and CSS to create custom mashups and extensions, enhancing the user interface and experience of data visualization tools.
- Analyzed large datasets to identify suspicious transaction patterns and trends, leading to the identification of potential AML risks and enhanced compliance measures.
- Developed and maintained ETL pipelines using Netezza tools, facilitating the seamless integration of data from multiple sources for reporting and analytics.
- Extensively worked on designing and creating Tableau dashboards; automated dashboards to help stakeholders implement strategic planning in the organization.
- Worked with the leadership management team to visualize key customer metrics, driving better solutions in the consumer POD and supply chain.
- Utilized Tableau functions such as LOD expressions, WINDOW_SUM, WINDOW_AVG, conditional functions (IF, nested IF, CASE, etc.), RANK, and table calculations (percent difference, percent of total, moving average, YTD, YoY) for various measures and calculated fields.
- Designed and maintained scalable data lakes on S3, ensuring data partitioning and compliance with governance policies.
- Managed large datasets and performed analytics by integrating Redshift as a data warehouse and S3 as a storage layer.
- Built and scheduled ETL pipelines to extract, transform, and load data from various sources into AWS Redshift and data lakes.
- Developed efficient ETL/ELT processes using AWS Glue, Python, and SQL to ensure smooth data flow between systems.
- Worked on view orientation, sizing, layout, color, fonts, and tooltips for building interactive dashboards and reports.
- Extensively worked on Tableau dashboard optimization for better performance and monitoring.
- Dealt with huge volumes of data in Tableau; extracted, transformed, and loaded data from a variety of sources into staging datasets in the Redshift environment and ultimately into Tableau for reporting.
- Scheduled data refreshes on Tableau Server in weekly and monthly increments per business requirements and ensured views and dashboards displayed data changes accurately.
- Performed administration tasks such as setting permissions, managing ownership, and granting user access, including adding users to specific groups.

ESM Square Technologies, India (JAN 2020 - DEC 2021)
ETL Developer
Responsibilities:
- Coordinated with clients on project feasibility studies in Spotfire and with DB teams on database design per project requirements.
- Applied QlikView for data visualization and reporting, ensuring effective data representation and insight generation.
- Designed, developed, tested, and maintained Spotfire reports based on user requirements.
- Developed a Python-based RESTful web service API to track revenue and perform revenue analysis.
- Led the development of a prescriptive data model to optimize operations and increase packaging efficiency.
- Analyzed requirements, developed and debugged applications using BI tools; interacted with users and business analysts to assess requirements and performed impact analysis.
- Strong experience in creating sophisticated visualizations, calculated columns, and custom expressions.
- Extensively used Spotfire and Tableau to create data visualizations enabling prescriptive operations and increasing data visibility.
- Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed the data in Azure Databricks.
- Primarily involved in data migration using SQL, SQL Azure, Azure Storage, Azure Data Factory, SSIS, and PowerShell.
- Architected and implemented medium- to large-scale BI solutions on Azure using Azure Data Platform services (Azure Data Lake, Data Factory, Data Lake Analytics, Stream Analytics, Azure SQL DW, HDInsight/Databricks, NoSQL DB).
- Migrated on-premises data (Oracle, SQL Server, DB2, MongoDB) to Azure Data Lake Store (ADLS) using Azure Data Factory (ADF V1/V2).
- Created Information Links against many underlying databases, such as SQL Server and Oracle, for populating data into visualizations.
- Built scalable pipelines, model deployments, and reusable protocols using Pipeline Pilot to support daily operations.
- Migrated several reports from Spotfire to Tableau, Power BI, and Superset.
- Prepared reports, extracts, and integrations of existing data as needed to support analysis, visualizations, and statistical models.
- Extensively worked on designing and creating dashboards; automated dashboards to help end users identify KPIs and facilitate strategic planning in the organization.
- Created ad-hoc SQL queries for customer reports and executive management reports.
- Extensively worked on Tableau dashboard optimization for better performance tuning.
- Extensively worked with functionality such as context filters, global filters, drill-down analysis, parameters, background images, maps, and trend lines, and created customized views and conditional formatting for end users.
- Performed data blending and data preparation using Pipeline Pilot for Tableau consumption and published data sources to Tableau Server.
- Created scripts in Python and R for calling APIs through Insomnia and Postman.
- Enforced data quality standards by implementing data validation and monitoring mechanisms across all stages of the ETL pipeline.
- Collaborated with data governance teams to ensure compliance with data integrity and security policies.
- Provided production support for various tools and created tickets in Jira.
- Extracted, transformed, and loaded data from a variety of sources into staging datasets and ultimately into the visualization tool.

MetLife, Hyderabad, India (AUG 2018 - NOV 2020)
Data Analyst / Tableau Developer
Responsibilities:
- Performed extensive gap analysis in the project, as there were numerous as-is and to-be conditions.
- Developed a logical data model using Erwin and created physical data models using forward engineering.
- Translated business and data requirements into logical data models in support of enterprise data models, OLAP, operational data structures, and analytical systems.
- Involved in the collection and cleaning of large amounts of unstructured data using Excel and Python.
- Designed and developed various analytical reports from multiple data sources by blending data on a single worksheet in Tableau Desktop.
- Developed various reports using Tableau features such as parameters, filters, sets, groups, and actions to present users with various scenarios of predictive analysis.
- Created and updated custom SQL scripts to serve as data sources for various reports and handled performance issues effectively.
- Worked extensively on creating dashboards using Tableau Desktop, Tableau Server, and Tableau Reader across various Tableau versions; also involved in Tableau Server administration, including installations, upgrades, user and user-group creation, and setting up security features.
- Involved in publishing various kinds of live, interactive data visualizations, dashboards, reports, and workbooks from Tableau Desktop to Tableau Server.
- Identified factors affecting report performance and coordinated the creation of indexes and statistics to improve query turnaround time.
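Several bullets above cite Tableau table calculations such as Percent of Total and moving averages (e.g. WINDOW_AVG). As a hypothetical illustration of the same arithmetic outside Tableau, using invented sample data rather than anything from the candidate's actual workbooks:

```python
# Hypothetical monthly sales figures (sample data, not from the resume)
sales = [100.0, 200.0, 300.0, 400.0]

# Percent of Total: each value as a share of the column total,
# analogous to Tableau's "Percent of Total" quick table calculation
total = sum(sales)
percent_of_total = [round(v / total * 100, 2) for v in sales]

def moving_average(values, window=2):
    """Trailing moving average, roughly like WINDOW_AVG(-1, 0) in Tableau."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

print(percent_of_total)       # [10.0, 20.0, 30.0, 40.0]
print(moving_average(sales))  # [100.0, 150.0, 250.0, 350.0]
```

Note how the first point of the moving average falls back to a shorter window, matching how Tableau's window calculations behave at the edge of a partition.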
