Azure Data Engineer Resume Plano, TX
SARACHANDRIKA PANDURU

CONTACT
Plano, TX Street Address
PHONE NUMBER AVAILABLE
EMAIL AVAILABLE

WEBSITES, PORTFOLIOS, PROFILES
https://LINKEDIN LINK AVAILABLE

SKILLS
Python, PySpark, Spark, C, C++, Java
Power BI, Tableau, Qlik Sense
Oracle 12c/11g, MS Access, Microsoft SQL Server 2014/2012, Teradata 15/14, PostgreSQL, MySQL
PyCharm, Sublime Text, Visual Studio Code, Jupyter Notebook
Azure, AWS, Google Analytics
Hadoop, HDFS, Hive, Spark, Pig, HBase, Sqoop, Flume
Amazon EC2, Amazon S3, Amazon SimpleDB, Amazon ECS, AWS Lambda, Amazon RDS, Elastic Load Balancing, Elasticsearch, Amazon SQS, AWS Identity and Access Management, Amazon CloudWatch, Amazon EBS, AWS CloudFormation

CERTIFICATIONS
Microsoft Certified: Azure Data Engineer Associate (DP-203)
Microsoft Certified: Azure Data Scientist Associate (DP-100)

PROFESSIONAL SUMMARY
- 10+ years of experience in Azure Databricks, Azure Data Factory, Azure Synapse, ML model design and deployment, business intelligence, and data visualization.
- Designed and developed data pipelines, data warehouses, and data marts to integrate new data sets from different sources into a data platform.
- Implemented enterprise-level Azure solutions such as Azure Databricks, Azure ML, AKS, Azure Data Factory, Logic Apps, Azure Storage Accounts, and Azure SQL DB.
- Designed, developed, and deployed ETL solutions using Microsoft Synapse and Data Factory to extract, transform, and load data from various sources into the data warehouse.
- Implemented Azure Logic Apps and Azure Functions to automate business processes and improve efficiency.
- Designed and developed a scalable data warehouse using Azure Blob Storage, Data Lake, and Synapse to store and manage large volumes of data.
- Optimized pipeline implementation and maintenance through Databricks workspace configuration and cluster and notebook tuning.
- Worked on MLOps and ML CI/CD, and integrated BI solutions such as MS Power BI and Qlik Sense.
- Experience extracting, transforming, and loading (ETL) data from multiple sources into target databases using Azure Databricks, Azure Data Factory, Azure Synapse, SQL Server, Azure Blob Storage, PostgreSQL, and Oracle (illustrated in the sketch after this summary).
- Worked in Azure Machine Learning Studio to build and deploy models using various techniques.
- Expertise in database querying, data manipulation, and population using SQL in Oracle, SQL Server, PostgreSQL, and MySQL.
- Experience implementing triggers, indexes, views, and stored procedures; defined and managed SLAs for data products and processes running in production.
- Optimized query performance and populated test data.
- Experience working in Agile (Scrum, sprints) and Waterfall methodologies.
- Ability to collaborate with testers, business analysts, developers, project managers, and other team members in testing complex projects to improve overall software product quality.
- Knowledge of building data visualizations and reporting using Power BI and Qlik Sense.
- Expert in data analysis and reporting, with experience designing and developing reports and dashboards.
- Self-starter and team player with excellent communication, problem-solving, and interpersonal skills and a strong aptitude for learning.
- Worked collaboratively with product managers, data scientists, and other team members in an Agile/Scrum environment to fulfill modeling needs.
- Worked closely with product owners to design, implement, and support analytics solutions that provide insights for better decision-making.
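As a purely illustrative companion to the ETL experience summarized above, the sketch below shows one common Azure Databricks pattern: reading raw files from Azure Blob Storage with PySpark, applying a light transformation, and loading the result into an Azure SQL Database table over JDBC. The storage account, container, server, table, and credential names (examplestorage, sales, example-server, dbo.stg_orders, etl_user) are hypothetical placeholders, not details taken from this resume.

# Minimal PySpark ETL sketch: Blob Storage -> transform -> Azure SQL Database.
# All account, path, table, and credential values below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("blob_to_sql_etl").getOrCreate()

# Assumes storage access is already configured for the session, e.g. via
# spark.conf.set("fs.azure.account.key.<account>.blob.core.windows.net", "<key>")
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("wasbs://sales@examplestorage.blob.core.windows.net/daily/"))

# Light transformation: normalize column names, keep completed orders, stamp load date
orders = (raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
             .filter(F.col("order_status") == "COMPLETED")
             .withColumn("load_date", F.current_date()))

# Load into an Azure SQL Database staging table via the SQL Server JDBC driver
(orders.write
    .format("jdbc")
    .option("url", "jdbc:sqlserver://example-server.database.windows.net:1433;database=dw")
    .option("dbtable", "dbo.stg_orders")
    .option("user", "etl_user")
    .option("password", "<secret>")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .mode("append")
    .save())

In practice the same movement is often orchestrated by an Azure Data Factory pipeline that triggers the Databricks notebook on a schedule; this snippet only sketches the notebook side.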
WORK HISTORY

August 2022 - Current
Sr. Data Engineer, Lennox International
- Developed and maintained data pipelines into the data warehouse and data lake, including design and development.
- Performed ETL in Azure Databricks.
- Developed and implemented strategies to translate business requirements into feasible and acceptable data warehouse and data lake solutions.
- Created the architecture and built new data models to improve the models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making.
- Designed, built, and launched data pipelines to move data to the data lake and data warehouse.
- Built and maintained a framework for auditing, error logging, and master data management for data pipelines.
- Built data expertise and owned data quality for the data pipelines.
- Implemented processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes.
- Performed data analysis and assisted in the resolution of data issues.
- Identified and resolved defects of complex scope using proper engineering tools and techniques.
- Provided support, maintained existing products, and added new features.
- Used Spark SQL, PySpark, Python Pandas, and NumPy for data cleaning and data manipulation (see the sketch below).
Environment: Azure cloud platform, including Azure Data Factory (ADF), Azure Data Lake Analytics, Azure SQL Database, Azure Databricks, Azure SQL Data Warehouse, Azure Data Lake Storage (ADLS), Blob Storage, and Spark applications, along with SQL, Python, Azure Logic Apps, and PowerShell for data movement and retrieval.
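A minimal sketch of the kind of Spark SQL / PySpark / pandas / NumPy cleaning and manipulation step mentioned in the Lennox role above. The staging table, column names, and cleaning rules (stg.shipments, shipment_id, weight_kg, curated.shipments) are hypothetical and only meant to illustrate the pattern, not to reproduce the actual work.

# Illustrative cleaning step combining Spark SQL, PySpark, NumPy, and a pandas summary.
# All table and column names are hypothetical placeholders.
import numpy as np
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cleaning_example").getOrCreate()

# Spark SQL: pull a hypothetical staging table, dropping rows without a key
shipments = spark.sql("""
    SELECT shipment_id, customer_id, weight_kg, ship_date
    FROM stg.shipments
    WHERE shipment_id IS NOT NULL
""")

# PySpark: standardize types, deduplicate, and fill missing weights
cleaned = (shipments
           .withColumn("ship_date", F.to_date("ship_date", "yyyy-MM-dd"))
           .dropDuplicates(["shipment_id"])
           .fillna({"weight_kg": 0.0}))

# Small aggregate pulled to the driver as a pandas DataFrame for a quick sanity check
summary = (cleaned.groupBy("customer_id")
                  .agg(F.sum("weight_kg").alias("total_kg"))
                  .toPandas())
summary["log_total_kg"] = np.log1p(summary["total_kg"])
print(summary.describe())

# Persist the cleaned data for downstream consumers
cleaned.write.mode("overwrite").saveAsTable("curated.shipments")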
August 2020 - July 2022
Data Engineer, Worldwide Express
- Created the architecture and built new data models to improve the models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making.
- Designed, built, and launched data pipelines to move data to the data lake and data warehouse.
- Built and maintained a framework for auditing, error logging, and master data management for data pipelines.
- Built data expertise and owned data quality for the data pipelines.
- Implemented processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes.
- Performed data analysis and assisted in the resolution of data issues.
- Identified and resolved defects of complex scope using proper engineering tools and techniques.
- Provided support, maintained existing products, and added new features.
- Implemented monitoring solutions in Docker and Jenkins.
- Implemented pipelines with CI/CD and worked with GitHub for version control.
Environment: Azure Data Factory (ADF), Azure Data Lake Analytics, Azure SQL Database, Azure Databricks, Azure SQL Data Warehouse, Azure Data Lake Storage (ADLS), Blob Storage, and Spark applications, along with SQL, Python, Azure Logic Apps, and PowerShell for data movement and retrieval.

November 2017 - July 2020
Data Engineer, Break Parts Inc
- Designed and implemented data storage solutions using Azure services such as Azure SQL Database, Azure Cosmos DB, and Azure Data Lake Storage.
- Developed and maintained data pipelines using Azure Data Factory and Azure Databricks.
- Created and managed data processing jobs using Azure HDInsight and Azure Stream Analytics.
- Performed data modeling and schema design for efficient data storage and retrieval.
- Optimized data processing and storage for performance and cost efficiency.
- Implemented security and compliance measures for data storage and processing.
- Collaborated with data scientists and analysts to provide data insights and support data-driven decision making.
- Stayed up to date with new Azure services and technologies and evaluated their potential for improving data storage and processing solutions.
Environment: Azure cloud platform, including Azure Data Factory (ADF), Azure Data Lake Analytics, Azure SQL Database, Azure Databricks, Azure SQL Data Warehouse, Azure Data Lake Storage (ADLS), Blob Storage, and Spark applications, along with SQL, Python, Azure Logic Apps, and PowerShell for data movement and retrieval.

June 2015 - April 2017
Data Analyst, Alliance Global Services
- Developed, implemented, and maintained leading-edge analytics systems, turning complicated problems into simple frameworks.
- Identified trends and opportunities for growth through analysis of complex datasets.
- Evaluated organizational methods and provided source-to-target mappings and information-model specification documents for datasets.
- Created best-practice reports based on data mining, analysis, and visualization.
- Evaluated internal systems for efficiency, problems, and inaccuracies, and developed and maintained protocols for handling, processing, and cleaning data.
- Worked directly with managers and users to gather requirements, provide status updates, and build relationships.
- Published Power BI reports and dashboards to the Power BI server and scheduled dataset refreshes for live data.
- Developed SQL queries using stored procedures, common table expressions (CTEs), and temporary tables to support Power BI reports (see the sketch below).
- Analyzed KPIs based on end-user requirements and needs to develop dashboards in Power BI.
Environment: SQL Server 2008, SSIS, Oracle, Business Objects XI, Rational Rose, DataStage, MS Visio, SQL, Crystal Reports 9, Jira, Confluence, Python.
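A brief, hypothetical illustration of the CTE-style query work described in the Alliance Global Services role: a monthly-revenue rollup of the sort that might feed a Power BI report, wrapped in a small Python helper. The server, database, table, and column names (example-server, reporting, dbo.orders, order_total) are placeholders; in practice a query like this would more likely live in a view or stored procedure consumed directly by Power BI.

# Illustrative CTE query behind a reporting dataset, run from Python via pyodbc.
# All connection details and schema names below are hypothetical.
import pyodbc

MONTHLY_REVENUE_SQL = """
WITH monthly_orders AS (
    SELECT
        customer_id,
        DATEADD(MONTH, DATEDIFF(MONTH, 0, order_date), 0) AS order_month,
        SUM(order_total) AS revenue
    FROM dbo.orders
    WHERE order_status = 'COMPLETED'
    GROUP BY customer_id, DATEADD(MONTH, DATEDIFF(MONTH, 0, order_date), 0)
)
SELECT
    order_month,
    COUNT(DISTINCT customer_id) AS active_customers,
    SUM(revenue) AS total_revenue
FROM monthly_orders
GROUP BY order_month
ORDER BY order_month;
"""

def fetch_monthly_revenue(conn_str: str):
    """Run the CTE query and return the result rows."""
    conn = pyodbc.connect(conn_str)
    try:
        cursor = conn.cursor()
        cursor.execute(MONTHLY_REVENUE_SQL)
        return cursor.fetchall()
    finally:
        conn.close()

if __name__ == "__main__":
    rows = fetch_monthly_revenue(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=example-server.database.windows.net;DATABASE=reporting;"
        "UID=report_user;PWD=<secret>"
    )
    for order_month, active_customers, total_revenue in rows:
        print(order_month, active_customers, total_revenue)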
March 2011 - May 2013
Junior Data Analyst, Symphony Infospace Pvt Ltd, Bangalore
- Strengthened internal data governance practices by helping to develop and implement policies for data storage, access, and retention.
- Streamlined reporting processes by automating routine tasks using Python and Excel macros.
- Reduced manual processing time with automation scripts, allowing more focus on strategic initiatives within the department.
- Increased efficiency in data extraction by using SQL queries to retrieve relevant information from databases.
- Participated in project planning meetings, ensuring that data requirements were clearly defined and understood by all team members involved.
- Collaborated with cross-functional teams to develop data-driven insights and recommendations for business improvements.
- Assisted in the development of interactive Tableau dashboards, providing actionable insights to stakeholders.
- Enhanced data accuracy by meticulously cleaning and organizing datasets for analysis.

EDUCATION
May 2015
M.Tech (Computer Science), JNTU, Anantapur
May 2006
MCA, SVU, Tirupathi
