
Data Engineer Resume - Mount Pleasant, MI

Candidate Information
Title: Data Engineer
Target Location: US-MI-Mount Pleasant
Candidate's Name
Data Engineer
PHONE NUMBER AVAILABLE | EMAIL AVAILABLE
Candidate's Name saiganesh

Professional Summary
As a Data Engineer with over four years of experience, I leverage cloud environments such as AWS and
Azure, alongside ETL processes, to derive actionable insights. My expertise spans data modeling, BI solution
development, and operational capacity planning. Proficient in Python, SQL, stored procedures, and a range of
big data tools, I have a proven track record of reducing database load and improving efficiency. My skills cover
data analysis, machine learning, and database engineering, with a strong focus on data storage and warehousing.
Adept at using the Microsoft Office Suite and Agile methodologies for documentation and presentations, I am
known for strong communication and creative problem-solving. I apply technical skill and strategic thinking to
drive business growth and innovation, maximizing data utility for any organization.

Professional Experience
GSA, Central Michigan University (Part-Time)                                 Mount Pleasant, Michigan 07/2023 - 05/2024
Central Michigan University (CMU) in Mount Pleasant is a leading public research institution known for its strong academic programs
and commitment to student success.
   Expertly managed inventory and configured systems for both Windows and macOS environments, ensuring seamless integration
   and optimal functionality. This work supported efficient operations and enhanced overall system performance.
   Proficiently managed Windows and macOS devices using Jamf for effective maintenance and troubleshooting.
   Efficiently managed and resolved technical support tickets, reducing response times and improving user satisfaction.
   Promoted effective team collaboration and communication, contributing to a positive and productive work environment.
   Organized and led workshops and events designed to enhance the student experience and foster engagement. These initiatives
   significantly supported academic success and enriched the campus community.
   Conducted extensive research and performed detailed data analysis, contributing to high-impact academic publications. Provided
   valuable insights that supported institutional goals and informed strategic decision-making.
Data Engineer, Intel                                                                     Hyderabad, India 01/2022 - 12/2022
At Intel, a prestigious client of Accenture, I had the opportunity to contribute to cutting-edge data processes and initiatives.
Intel, a global leader in semiconductor technology, constantly drives innovation in the computing industry. During my time there,
I supported Intel's commitment to advancing data science, data analytics, and decision-making through technologies such as
Snowflake Data Warehousing, Power BI for business intelligence, and Azure DevOps for seamless workflow automation. My role
involved optimizing data storage efficiency, reducing database load, and implementing robust CI/CD pipelines. Collaborating
with Intel's talented teams on their data-driven journey toward excellence was a privilege.
   Managed end-to-end data engineering processes using AWS Data Pipeline, Redshift, Azure Data Factory, and Azure
   Databricks, leveraging Snowflake Data Warehousing as a central repository. Transitioned to GCP for enhanced process
   optimization.
   Designed and implemented robust data models, complex SQL queries, efficient SQL stored procedures, and seamless ETL
   pipelines using PySpark, Spark SQL, Azure Data Factory, and Snowflake Data Warehousing for comprehensive data
   processing solutions.
   Developed BI solutions with PowerBI, integrating and visualizing data from multiple cloud platforms, including Snowflake Data
   Warehousing.
   Led complex data migration projects, including seamless on-premise to cloud transitions, implementing advanced Data
   Warehousing techniques for efficient and scalable data storage. Ensured seamless integration and optimized performance for
   comprehensive data solutions.
   Reduced database load by 26% and improved performance by 45% through data structure refinements and optimized PySpark
   ETL pipelines.
   Implemented CI/CD pipelines throughout the SDLC using Azure DevOps and delivered scalable big data solutions on Azure
   Databricks.
   Facilitated successful project management and stakeholder engagement through effective communication and collaboration,
   utilizing Snowflake Data Warehousing for collaborative workflows and data-driven decision-making.
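The warehouse-side transformations described above can be sketched in miniature. The snippet below uses Python's built-in SQLite as a lightweight stand-in for Snowflake/Redshift; the table and column names (raw_orders, daily_totals) are invented purely for illustration:

```python
import sqlite3

# Stand-in warehouse: SQLite in memory, in place of Snowflake/Redshift.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_date TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("2022-01-01", 10.0), ("2022-01-01", 15.5), ("2022-01-02", 7.25)],
)

# Transform step: aggregate raw rows into a summary table, the way an ETL
# pipeline might materialize a reporting view in the warehouse.
conn.execute(
    """
    CREATE TABLE daily_totals AS
    SELECT order_date, SUM(amount) AS total, COUNT(*) AS n_orders
    FROM raw_orders
    GROUP BY order_date
    """
)
rows = conn.execute(
    "SELECT * FROM daily_totals ORDER BY order_date"
).fetchall()
print(rows)  # [('2022-01-01', 25.5, 2), ('2022-01-02', 7.25, 1)]
```

In a real pipeline the same SELECT-and-materialize pattern would run inside Snowflake or via Spark SQL rather than SQLite.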
Data Engineer, Morris Garages                                                              Hyderabad, India 01/2019 - 12/2021
Morris Garages, a renowned automobile brand, was a valued client of Raam Group, a leading technology solutions provider in the
automobile industry. There I contributed to cutting-edge data solutions and product analytics on Google Cloud Platform (GCP),
optimizing data pipelines and implementing ETL processes using PySpark, SQL, and A/B testing, while actively participating in
infrastructure automation and data governance across cloud platforms.
   I've handled numerous data migration projects to Google Cloud Platform (GCP), making effective use of tools such as BigQuery,
   Dataflow, and Pub/Sub. These experiences have honed my skills in seamlessly integrating and managing data across platforms.
   Proficient in developing and optimizing data pipelines on GCP with Apache Spark. Skilled in SQL for effective data manipulation,
   including stored procedures. This expertise ensures streamlined data workflows and improved processing efficiency.
   I have a strong skill set in designing scalable data solutions on GCP, utilizing technologies such as Cloud Storage and Bigtable.
   My experience with AWS tools like Glue, Data Pipeline, and Redshift, along with a deep understanding of GCP, allowed me to
   develop robust cloud migration strategies, ensuring seamless integration for organizations.
   Skilled in ETL processes using PySpark and SQL for data transformation and advanced analysis procedures.
   Proficient in Terraform and CloudFormation for automating cloud resource provisioning and management.
   I'm skilled in Tableau and Power BI, creating insightful dashboards and reports for stakeholders and decision-makers.
   Proficient in data governance and metadata management using Google Data Catalog and AWS Glue Data Catalog.
   Actively involved in performance tuning and optimization of data pipelines on GCP and AWS, leveraging Cloud Monitoring tools.
   Utilized AWS CloudWatch for real-time monitoring and troubleshooting, ensuring efficient and reliable data flow.
   Experienced in designing and implementing DevOps CI/CD pipelines for streamlined development and deployment processes,
   integrating automated testing and continuous monitoring for high-quality and efficient code releases.
   Proven track record of collaborating with cross-functional teams to deliver efficient data engineering solutions. Emphasized the
   use of Spark and SQL-based technologies for optimized performance and scalability.
   Successfully led a project to automate data quality checks, resulting in a 30% reduction in data errors. This initiative significantly
   improved data reliability, enhancing business intelligence activities and decision-making processes.
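The automated data-quality checks mentioned in the last bullet can be sketched as a simple rule engine; the rule names, fields, and sample records below are invented for illustration:

```python
# Hypothetical sketch of automated data-quality rules: each rule maps a
# name to the list of failing row indexes, so failures can be reported
# per rule before data reaches BI dashboards.
def run_quality_checks(records):
    failures = {"missing_id": [], "negative_amount": [], "bad_date": []}
    for i, row in enumerate(records):
        if not row.get("id"):
            failures["missing_id"].append(i)
        if row.get("amount", 0) < 0:
            failures["negative_amount"].append(i)
        date = row.get("date", "")
        # Expect ISO dates like "2021-06-01".
        if len(date) != 10 or date[4] != "-" or date[7] != "-":
            failures["bad_date"].append(i)
    return failures

sample = [
    {"id": "a1", "amount": 12.0, "date": "2021-06-01"},
    {"id": "",   "amount": -3.0, "date": "2021/06/01"},
]
report = run_quality_checks(sample)
print(report)  # {'missing_id': [1], 'negative_amount': [1], 'bad_date': [1]}
```

A production version would run such rules inside the pipeline (e.g. as a Dataflow or Spark stage) and alert on failure counts rather than returning them inline.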
Software Engineer Intern, Tata Motors                                                             Pune, India 06/2016 - 07/2017
Tata Motors is a prominent automotive manufacturer known for its innovative vehicles and sustainable practices. It is recognized
globally for its commitment to quality, safety, and technological advancements in the automotive industry.
   Assisted in developing data processing workflows on Google Cloud Platform (GCP), utilizing Apache Spark and Apache Kafka for
   real-time data processing.
   Supported the implementation of technical solutions to optimize data storage and retrieval processes on Amazon Web Services
   (AWS), using services like S3 and RDS, reducing manual workload by up to 30 percent.
   Conducted data analysis and research using Python and R to identify automation and process improvement opportunities.
   Collaborated with team members to test and debug automation solutions, ensuring functionality and reliability.
   Documented automation processes and procedures using Markdown and Wiki, facilitating knowledge sharing and future
   troubleshooting.
   Presented findings and recommendations to the data engineering team, contributing to enhancing data-driven decision-making
   processes.

Academic Projects
Autonomous Cloud Management System Central Michigan University
Led development of an autonomous cloud management system on AWS using FastAPI, Terraform, and Angular,
integrating machine learning for real-time cost optimization and security. Validated efficiency gains and cost
savings through an industry collaboration, demonstrating enterprise-grade cloud platform design.
OTT Recommendation System Central Michigan University
Crafted a movie recommendation engine for OTT platforms, leveraging content-based filtering and cosine
similarity algorithms across a library of 10K titles. Used Python with Scikit-learn, Pandas, NumPy, and
Matplotlib to deliver personalized viewing experiences.
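The content-based filtering approach can be illustrated in miniature. The catalog, titles, and genre vectors below are invented; a real system would build feature matrices with Scikit-learn over the full title library:

```python
import math

# Toy catalog: each title maps to a binary genre vector
# (e.g. [action, thriller, drama, romance]). All values are invented.
catalog = {
    "Movie A": [1, 1, 0, 0],
    "Movie B": [1, 1, 1, 0],
    "Movie C": [0, 0, 1, 1],
}

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def recommend(title, k=2):
    """Rank other titles by similarity to the given title's features."""
    query = catalog[title]
    scores = [(other, cosine(query, vec))
              for other, vec in catalog.items() if other != title]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:k]

top = recommend("Movie A")
print(top[0][0])  # Movie B shares the most genres with Movie A
```

With Scikit-learn the same idea is a `cosine_similarity` call over a sparse feature matrix, which is what makes it tractable at 10K+ titles.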
Big Data Enhanced Disaster Management System Central Michigan University
Developed machine learning models using Random Forest and Support Vector Machine (SVM) algorithms on
statistical data for accurate weather prediction and early disaster alerts. Created an interactive web interface
using HTML, CSS, and JavaScript, integrated with weather APIs to display real-time climate data and storm warnings.
Education
Master's in Computer Science, Central Michigan University                       Mount Pleasant, MI 2023-2024
Relevant Courses: Software and Hardware, Data Visualization, Business Intelligence, Digital Marketing, Time Series
Analysis & Forecasting.

Skills
  Data Visualization: Microsoft Power BI, Microsoft Excel, Tableau, SQL

  Big Data Technologies: Hadoop, Hive, PySpark, Scala, MySQL, MongoDB, Kafka, Pig, Maven, Snowflake, Stored
  Procedures

  DevOps: Terraform, Git, Jenkins, Maven, Redshift, Kubernetes, Airflow

  IDEs: IntelliJ IDEA, Eclipse, PyCharm

  Cloud: AWS (EC2, S3, DynamoDB, EMR, Lambda, CloudWatch), Azure (Data Lake, Data Factory, Databricks, Blob
  Storage, Microsoft Fabric, Virtual Machines, Azure Synapse), GCP (Dataproc, Cloud Storage, BigQuery, Bigtable,
  Pub/Sub)

  Programming & Scripting: Python, KornShell, Shell Scripting, Bash, Scala, Postgres

  Soft Skills: Presentation, Planning, Organized, Creative Problem-Solving, Teamwork, Active Listening, Adaptability,
  Analytical Thinking


  Achievements

  Smart India Hackathon
  I emerged victorious in a national-level Smart India Hackathon focused on enhancing student employability and
  addressing challenges in employment opportunities. This intense 48-hour competition showcased my proficiency in
  utilizing Java, Firebase, and Google Cloud Platform (GCP) technologies.
  Navigation with Beacons
  Our project received an honorable mention in the hackathon held at Central Michigan University for campus navigation
  using beacon technology. This recognition underscores the effectiveness and creativity of our solution in addressing
  real-world challenges, further highlighting our team's dedication and ingenuity in technological innovation.
  Cyber Security Hackathon
  I emerged victorious in a state-level cybersecurity hackathon organized by JHUB by presenting a comprehensive
  solution that incorporated robust security rules for application development and authentication. Our winning proposal
  demonstrated a thorough understanding of cybersecurity principles and showcased innovative strategies to ensure the
  integrity and protection of digital assets.
