
Data Analyst Azure Resume Leander, TX

Candidate Information
Title Data Analyst Azure
Target Location US-TX-Leander

Candidate's Name
DATA ANALYST
PHONE NUMBER AVAILABLE | EMAIL AVAILABLE | Ohio

PROFESSIONAL SUMMARY
- 3+ years of experience as a Data Analyst, with an understanding of data modeling, evaluating data sources, data warehouse/data mart design, and client/server applications.
- Good knowledge of the Software Development Life Cycle (SDLC) and of Agile and Waterfall methodologies.
- Able to bridge the gap between data and actionable insights, with proficiency in querying and managing a wide range of databases, including MySQL, PostgreSQL, and DynamoDB.
- Working knowledge of Python and R libraries such as NumPy, Pandas, Matplotlib, Scikit-Learn, and ggplot2.
- Proficient in Azure data services such as Azure SQL Database, Azure Data Lake Storage, Azure Data Factory, and Azure Synapse Analytics for data ingestion, storage, processing, and analysis.
- Proficient in designing visualizations in Tableau and Power BI, and in publishing and presenting dashboards and Storylines on web and desktop platforms.
- Knowledge of relational database management systems (RDBMS) such as MySQL and SQL Server.
- Experienced in all phases of data mining: data collection, data cleaning, model development, validation, and visualization.

EXPERIENCE

MetLife, USA - Data Analyst, Feb 2023 - Present
- Applied Agile methodology to iteratively develop predictive models, enhancing product performance and ensuring alignment with stakeholder requirements.
- Conducted comprehensive data analysis in SQL, extracting actionable insights from datasets averaging 1 TB and presenting findings to key stakeholders for informed decision-making.
- Built complex stored procedures and automated them with SQL Server Agent, helping optimize query runtime.
- Prepare 5+ dashboards weekly in Tableau using calculated fields, parameters, calculations, groups, sets, and hierarchies.
- Use MongoDB and PostgreSQL databases to store and retrieve structured and unstructured data for analysis, ensuring data integrity and accessibility across analytical tasks.
- Used Azure SQL Database and Azure Data Factory to develop scalable data solutions, optimizing data processing and analysis workflows for greater efficiency and reliability.
- Collaborated with software developers to design and build a scalable platform for real-time analysis, saving 100+ hours of effort.

Sage SoftTech, India - Data Analyst, Jan 2019 - June 2021
- Conducted churn analysis to measure customer attrition rates, tracking critical KPIs and improving retention.
- Deployed machine learning algorithms in Python using NumPy, Pandas, and Scikit-learn, improving the accuracy and effectiveness of predictive analytics initiatives.
- Implemented interactive Power BI dashboards to visualize market trends and performance metrics, improving data comprehension among stakeholders and enabling quicker decision-making.
- Used packages such as ggplot2 in RStudio for data visualization, generating scatter plots and high-low graphs to identify relationships and strong correlations between variables.
- Created a General Ledger Reconciliation report in Power BI, building an Azure SQL Data Warehouse with fact and dimension tables for various financial elements and optimizing analysis by examining KPIs.
- Leveraged Azure Databricks for scalable data processing and collaborative analytics, accelerating model training and experimentation and improving time-to-insight for critical business decisions.
- Performed data quality assessments and implemented data cleansing processes, improving data accuracy and ensuring reliable insights for decision-making.

TECHNICAL SKILLS
Programming Languages: Python (NumPy, Pandas, Scikit-Learn, Keras, TensorFlow), R (dplyr, Tidyverse), SQL, JavaScript, Scala
Databases: Snowflake, PostgreSQL, MongoDB, Databricks, SQL Server, MySQL, NoSQL, HDFS, Oracle, DB2
Data Visualization: Looker, Google Data Studio, Tableau, Excel, Power BI, ggplot2, Plotly, R Shiny
ML/DL Algorithms: KNN, Logistic Regression, Random Forest, Neural Networks, K-Means, CNN
Cloud/Frameworks: AWS, GitHub, BigQuery, RStudio, Django, MEAN Stack, Apache Spark, Hadoop, Redshift, Apache Airflow, Talend, Microsoft Azure
Hard Skills: Data Modeling, Data Warehousing, Data Visualization, Statistical Data Analysis, Google Analytics

EDUCATION
MS in Computer Science, University of Dayton, Dayton, Ohio
Bachelor's in Computer Science, KL University, India
