
Data Analyst Resume Okemos, MI
Candidate's Name Shanigaram
PHONE NUMBER AVAILABLE | EMAIL AVAILABLE | LinkedIn

Summary
Experienced Data Analyst with 5 years of experience in data collection, analysis, statistical modeling, and data visualization across diverse sectors. Recognized for extracting actionable insights from complex datasets to inform strategic decisions. Proficient in applying advanced analytical techniques and tools to uncover trends, patterns, and correlations. Skilled at communicating findings to stakeholders and auditors through clear, compelling data visualizations, ensuring data integrity and compliance with regulatory standards. Well-versed in documentation, A/B testing, and UAT testing.

Skills & Abilities
- Programming Languages: Python, PySpark, PyTorch, R Studio, and related libraries
- Data Engineering: SQL Server, Snowflake, Databricks, Hive, and advanced Excel/macros
- Machine Learning: TensorFlow, Scikit-learn, Pandas, NumPy, MLOps
- Cloud Technologies: AWS (S3, Glue, EMR, and Athena), Databricks, Snowflake
- Data Science Pipeline: cleansing, visualization, modeling, interpretation
- Business Intelligence: Tableau, Power BI, Git

Experience
PYTHON DEVELOPER | AMERIPRISE FINANCIAL SERVICES, MN | APR 2024 - PRESENT
- Utilized Data Analysis Expressions (DAX) to build complex calculated columns and measures, optimizing data models in Power BI.
- Developed interactive dashboards in Power BI, enabling business stakeholders to visualize KPIs and track performance.
- Used MS Dynamics 365 for financial reporting and customer data management; developed complex DAX formulas for advanced Power BI reporting, integrating data from Dynamics 365.
- Integrated Azure Data Lake and Azure SQL Database for secure, scalable data storage.
- Automated data cleansing and reporting using advanced Excel formulas, pivot tables, and macros.
- Used SQL to extract, transform, and load (ETL) data from relational databases, improving query performance by 30%.
- Applied Python scripts for data cleaning, manipulation, and analysis, streamlining workflows.
- Connected Power BI to multiple data sources, including SQL Server and Snowflake, to automate the extraction and transformation of data for risk analysis and reporting.
- Analyzed sales and customer service processes within MS Dynamics 365 to identify areas for improvement.
- Leveraged Power Apps, Power Automate, and Power BI in conjunction with MS Dynamics 365 for enhanced analytics and automation.
- Built and deployed machine learning models with Scikit-learn for customer segmentation.
- Leveraged Pandas for efficient data wrangling, handling large datasets with ease.
- Wrote and optimized SAS code to automate the processing of financial data, reducing manual effort and increasing efficiency in reporting tasks.
- Built machine learning models in PyTorch to predict business outcomes and improve forecasting accuracy.
- Developed deep learning models in TensorFlow, enhancing predictive analytics capabilities.
- Designed and executed A/B tests to evaluate different financial reporting layouts and features, analyzing the results to determine the most effective solutions for stakeholders.
- Implemented collaborative data processing workflows on Databricks, accelerating big data operations.
- Maintained up-to-date documentation of UAT processes, test cases, and findings to support future testing and compliance needs.
- Automated data ingestion and transformation processes, ensuring real-time data availability.
- Achieved a 25% increase in operational efficiency through automation and optimized data workflows.

MACHINE LEARNING ENGINEER | HOME DEPOT, GA | JULY 2023 - APR 2024
- Created dashboards in Tableau to track sales performance, reducing reporting time by 40%.
- Led the development and implementation of advanced data analysis methodologies, leveraging Python and PySpark for efficient data manipulation and transformation.
- Designed and executed comprehensive data visualization strategies in Tableau, translating complex data insights into visually compelling presentations for stakeholders.
- Applied advanced Excel functions for data manipulation and ad hoc reporting, enhancing accuracy.
- Implemented robust data governance practices to ensure compliance with GDPR, maintaining data security and integrity throughout the data lifecycle.
- Monitored data quality metrics to maintain high standards for model inputs.
- Developed data cleansing and validation procedures specific to ML training datasets.
- Conducted regular audits of data usage in ML systems, ensuring compliance with data protection regulations such as GDPR and CCPA.
- Utilized PyTorch for model training and evaluation, optimizing performance and scalability.
- Applied advanced statistical analysis techniques and algorithms for predictive modeling and hypothesis testing via A/B tests, delivering actionable insights to support business objectives.
- Implemented advanced MLOps practices, including automated retraining and model monitoring with tools such as Kubeflow and TFX.
- Used SQL to extract, transform, and load (ETL) data.
- Spearheaded a project using Amazon S3 for data storage, extracting and processing data with the AWS Python SDK and the AWS EMR console, ensuring seamless integration with other AWS services.
- Orchestrated the integration of S3 and SQL Server data with Tableau for visualization using AWS Glue and Athena connectors, streamlining the data pipeline and enhancing automation capabilities.
- Delivered a 20% increase in customer retention through data-driven insights and predictive analytics.

DATA ANALYST | MOLINA HEALTHCARE, LAFAYETTE, LA | JULY 2022 - MAY 2023
- Conducted exploratory data analysis (EDA) to identify trends, outliers, and correlations, informing strategic business decisions.
- Designed and developed management dashboards in Power BI for a 300 billion loan portfolio and a 10 million customer base, leading to an 80% reduction in reporting team time and quicker decision-making.
- Collaborated with IT teams to set up and maintain Oracle Cloud ERP integration adapters.
- Implemented data source adapter-based frameworks for querying data from Oracle ERP Cloud.
- Developed complex DAX formulas for advanced reporting in Power BI.
- Managed the integration of diverse data sources, including transactional databases, IoT devices, and third-party APIs, into the data pipelines.
- Used Python, SQL queries, and PySpark for large-scale data processing, reducing job execution time by 40%.
- Migrated legacy data to cloud-based databases, improving data accessibility by 35%.
- Built automated ETL/ELT pipelines to extract and load data into reporting systems.
- Established MLOps practices, including CI/CD pipelines, model versioning, and automated deployment, using tools such as Jenkins and MLflow.
- Created automated reporting systems in R to deliver timely insights to stakeholders, reducing manual effort and improving efficiency.
- Improved operational efficiency by 15% through optimized data migration and modeling strategies.

JR DATA ANALYST | BFIL | JUNE 2019 - JULY 2022
- Designed and implemented data pipelines for ETL processes using cloud technologies.
- Utilized Excel, Oracle SQL 19c, Python, Tableau, and SAS to perform exploratory data analysis.
- Applied statistical techniques to analyze data and draw conclusions from databases, ensuring data integrity and performance.
- Wrote complex SQL queries to retrieve data from multiple databases, improving reporting efficiency by 30%.
- Utilized Git to manage codebase, configurations, and scripts, ensuring traceability and collaboration among team members.
- Implemented real-time data processing and streaming solutions using Apache Kafka and Spark Streaming.
- Automated repetitive tasks with Python, reducing manual effort by 50%.
- Developed statistical models in SAS to assess credit risk, helping the finance team make informed lending decisions based on historical data.
- Created visually appealing dashboards and presentations in Tableau to communicate results effectively.
- Conducted basic statistical analysis, including descriptive statistics, A/B testing, and UAT testing.
- Used Excel and Python to calculate measures of central tendency, dispersion, and probability distributions.
- Increased data processing efficiency by 20%, leading to faster decision-making and improved business outcomes.

Education
UNIVERSITY OF LOUISIANA, LAFAYETTE, LA | MASTER OF SCIENCE IN INFORMATICS | MAY 2023
JNTU HYDERABAD, HYDERABAD, INDIA | BACHELOR OF TECHNOLOGY IN MECHANICAL ENGINEERING | SEPTEMBER 2020

Certifications
Microsoft Certified: Power BI Data Analyst Associate
AWS Certified Data Engineer - Associate
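Work Sample (Illustrative)
The A/B testing and hypothesis-testing experience described above can be illustrated with a minimal Python sketch: a two-proportion z-test comparing conversion rates between an A and a B variant. All figures and the function itself are hypothetical examples, not taken from any project listed in this resume.

```python
# Hypothetical sketch: two-proportion z-test for an A/B experiment.
# Uses only the standard library; the sample counts below are made up.
from math import sqrt, erf

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Return (z statistic, two-sided p-value) for H0: p_a == p_b."""
    p_a = success_a / n_a
    p_b = success_b / n_b
    # Pooled proportion under the null hypothesis.
    p = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via the error function).
    phi = lambda x: 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - phi(abs(z)))
    return z, p_value

# Example: variant B converts 156/2400 vs. variant A's 120/2400.
z, p = two_proportion_z_test(success_a=120, n_a=2400, success_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up counts the test rejects the null at the 5% level, the kind of result used to pick the more effective reporting layout or feature.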
