AWS Data Engineer Resume - Madison, WI
Divya Sai
Sr. Data Engineer
Email: EMAIL AVAILABLE | Phone: PHONE NUMBER AVAILABLE | LinkedIn: LINKEDIN LINK AVAILABLE

Professional Summary
Accomplished Senior Data Engineer with over 9 years of experience designing, implementing, and optimizing data solutions across big data and cloud platforms. Proficient in leveraging technologies such as Spark, Hadoop, Kafka, and cloud services (AWS, Azure, GCP) to improve data processing efficiency, drive business intelligence, and support predictive analytics. Skilled in data engineering best practices, automation, and orchestration, with a proven track record in data migration, real-time processing, and advanced data analytics.

Core Competencies
- Big Data Ecosystems: Spark, Hadoop, Kafka, Cloudera, HDFS, GCP Dataflow
- Cloud Platforms: AWS (Glue, Redshift), Azure (Data Factory, Databricks), GCP (BigQuery, DataProc)
- Programming Languages: Python, Scala, C#, SQL, Shell Scripting
- DevOps & Orchestration: Docker, Kubernetes, Jenkins, Ansible
- Databases: Snowflake, Redshift, DynamoDB, Cosmos DB, Teradata
- Data Analytics & Visualization: Power BI, QlikView, Tableau, SAS, SPSS
- Security & Monitoring: Log Analytics, Azure AD, App Insights, Key Vault

Technical Skills
Big Data Technologies: Spark, Hive, Pig, Spark SQL, Kafka, Cloudera, GCP DataProc, Hadoop
Cloud Platforms: AWS (EC2, Glue, S3), GCP (BigQuery, Dataflow), Azure (Data Factory, Data Lake, Databricks)
Programming: Python, Scala, C#, SQL, Shell Scripting
DevOps & CI/CD: Docker, Kubernetes, Jenkins, Ansible
Data Warehousing: Snowflake, Redshift, BigQuery, Teradata
Data Visualization: Power BI, Tableau, QlikView, SAS, SPSS
Security & Monitoring: Log Analytics, Azure AD, App Insights, Key Vault
Web & Integration: WebSphere, Azure Service Bus, Cloud Composer, Cloud Pub/Sub

Professional Experience

AWS Data Engineer
Molina Healthcare, Bothell, WA | August 2021 - Present
- Developed real-time data solutions using Spark Streaming and Kafka, reducing latency by 40%.
- Managed AWS infrastructure (EC2, S3, Glue, Redshift) to optimize data storage and processing.
- Led data migration projects to Snowflake, cutting costs by 20%.
- Automated data pipelines with Sqoop, Flume, and shell scripting, reducing manual effort by 50%.
- Created dashboards in QlikView for actionable insights, driving strategic initiatives.
Environment: Spark, Hadoop, Kafka, Cloudera, AWS Glue, Redshift, Snowflake, Docker, Kubernetes, QlikView

GCP Data Engineer
Experian, Costa Mesa, CA | April 2019 - July 2021
- Built and maintained data pipelines using PySpark and GCP Dataflow, increasing decision-making speed by 30%.
- Engineered data integration strategies for migration to Snowflake, boosting processing speed by 50%.
- Automated workflows with Cloud Composer and Pub/Sub, enhancing data synchronization.
- Utilized Databricks on GCP for ML model development, improving predictive analytics.
Environment: GCP (BigQuery, Dataflow, Databricks), PySpark, Snowflake, SAS, SQL

Azure Data Engineer
Pacific Life, Newport Beach, CA | April 2017 - March 2019
- Designed data ingestion pipelines with Azure Data Factory and Data Lake, improving data retrieval times by 30%.
- Developed real-time streaming solutions using Azure Event Hub and Service Bus for actuarial computations.
- Automated deployment processes with Jenkins and Ansible, reducing manual effort by 50%.
- Integrated security measures with Azure AD and Key Vault, ensuring compliance.
Environment: Azure Data Factory, Data Lake, Databricks, Event Hub, Cosmos DB, Jenkins, Ansible

Data Analyst
PalTech, Hyderabad, India | September 2014 - December 2016
- Conducted data analysis using SQL and Python, increasing marketing campaign effectiveness by 20%.
- Developed predictive models in R and SAS, improving forecasting accuracy by 30%.
- Automated data processing tasks, saving 200+ person-hours annually.
Environment: SQL, Python, Tableau, R, SAS, Power BI

Education & Certifications
- Bachelor of Engineering (B.E.) in Computer Science
- Certifications: AWS Certified Data Analytics, Google Cloud Professional Data Engineer, Azure Data Engineer Associate
