

Candidate Information
Title: SQL Server Azure Data
Target Location: US-MD-Chevy Chase
Candidate's Name
EMAIL AVAILABLE | PHONE NUMBER AVAILABLE | Chevy Chase, Maryland | Street Address
Experienced Data Engineer with a bachelor's degree and a decade of proven expertise in data engineering and business intelligence development. Proficient in a wide range of tools and technologies, including Python, SQL, SSIS, Azure Data Factory, Databricks, Azure Synapse, and both Azure and AWS cloud resources. Adept at building robust data lakes and data warehouses, and skilled in developing insightful reports using Power BI, Tableau, and SSRS. Works closely with business users to gather requirements and translate them into effective data solutions. Experienced with big data tools and committed to optimizing data infrastructure for improved efficiency and data-driven decision-making.

TECHNOLOGY STACK

DATABASES: Azure Synapse, Azure SQL DB, Kusto SQL, Azure Cosmos DB, SQL Server, Oracle, MySQL
PROGRAMMING LANGUAGES: SQL, Kusto SQL, PowerShell, Python, Spark SQL, U-SQL, DAX, R, C#
DATA ENGINEERING AND BI: Azure Data Factory (ADF), Azure Databricks, SQL Server Data Tools (SSDT), SQL Server Integration Services (SSIS), Power BI, SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), AWS Redshift, AWS S3, AWS Glue
CLOUD AND OTHER SOFTWARE STACK: Azure Data Lake, Blob Storage, Azure Active Directory, Azure Logic Apps, Function Apps, Azure Storage Explorer, Microsoft Visio, Postman, Erwin Data Modeler, VS

WORK EXPERIENCE

DATA ENGINEER, ERNST & YOUNG, NY, NY (01/2023 - Current)

- Engaged in data analysis, design, development, implementation, and testing of data warehousing projects, with a primary focus on data conversion, extraction, transformation, and loading (ETL) processes.
- Collaborated closely with business users to gather business requirements and worked with the development team to create an integrated data repository by converting data from legacy systems to a new operational system and data warehouse.
- This included designing intricate software systems, preparing solution architecture documentation, and managing the entire software development lifecycle.
- Utilized Erwin to craft both logical and physical data models for databases and data warehouse schemas.
- Processed streaming data and constructed event-triggered pipelines using Azure Data Factory and Azure Databricks.
- Developed complex SSIS packages for ETL operations using SQL Server Integration Services, handling data transfer and transformation from various sources such as flat files, Excel, CSV, Oracle, and MS Access.
- Demonstrated proficiency and hands-on experience with SQL and Python in a data-centric environment.
- Implemented the Medallion Architecture, encompassing the transition from the bronze (raw) storage layer to the gold layer in Azure Data Lake Storage (ADLS).
- Established Delta Live Tables (DLT) to oversee streaming-data ETL and all data dependencies across the pipeline using SQL and Python.
- Developed a streaming table and a materialized view within Delta Live Tables (DLT) to perform ETL and store the various layer tables in Azure Data Lake Storage (ADLS).
- Mounted Azure Data Lake Storage (ADLS) to the Databricks File System (DBFS) for DLT to access.
- Parsed fixed-length data (.DAT files) received from the upstream team and uploaded it to Azure Data Lake Storage (ADLS) using Autoloader and Delta Live Tables (DLT) to trigger the pipeline upon new file arrivals.
- Proficient in leveraging Amazon Web Services (AWS) for cloud computing solutions, including but not limited to EC2, S3, Lambda, RDS, and IAM. Experienced in designing, deploying, and managing scalable and resilient applications in AWS cloud environments. Demonstrated ability to optimize cost, performance, and security within AWS infrastructures.
- Constructed multiple Azure Data Lake Storage solutions, integrating batch and streaming datasets from various on-premises and PaaS/SaaS sources.
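The fixed-length .DAT parsing described above can be sketched in plain Python. This is a minimal illustration only; the field names and column widths below are hypothetical, not taken from the resume, and a production load would run inside Databricks with Autoloader rather than in local Python.

```python
# Hypothetical fixed-width layout: (field name, start offset, width).
LAYOUT = [("customer_id", 0, 8), ("name", 8, 20), ("amount", 28, 10)]

def parse_record(line: str) -> dict:
    """Slice one fixed-length line into stripped fields per LAYOUT."""
    row = {field: line[start:start + width].strip()
           for field, start, width in LAYOUT}
    row["amount"] = float(row["amount"])  # numeric field gets typed
    return row

def parse_dat(lines):
    """Parse an iterable of fixed-length lines, skipping blank lines."""
    return [parse_record(ln) for ln in lines if ln.strip()]
```

Each record is a fixed 38-character line here; real layouts would come from the upstream team's file specification.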
- Employed cloud services such as Synapse Analytics, Data Factory, Azure Function Apps, Logic Apps, Azure Kusto, Salesforce, JDE, Azure SQL Database, and MySQL.
- Developed Azure Data Factory (ADF) ETL pipelines for data movement between Blob storage and other sources, employing different authentication methods such as account keys, shared access signatures, service principals, and managed identities.
- Contributed to the migration of existing on-premises SSIS systems/applications to the Azure cloud, strategizing and delivering roadmaps and milestones.
- Executed ETL and data transfer solutions using Azure Data Factory (ADF) and designed database solutions in Azure Synapse and Azure SQL. Created and executed SSIS packages using the ADF Azure-SSIS IR.
- Migrated Microsoft SQL Server databases to Azure SQL Database, configuring the SQL Azure firewall for security and overseeing database monitoring and restoration.
- Managed Azure Data Lakes (ADLS) and Data Lake Analytics, integrating them with other Azure services.
- Developed notebooks in Azure Databricks using Python and Spark SQL. Constructed Azure Data Factory pipelines to orchestrate data flows across Azure Data Lake zones.
- Configured Azure Monitor services, including Log Analytics dashboards, for evaluating performance metrics and querying logs for future use.
- Implemented ETL and data transfer solutions for data migration using Azure Data Factory. Created multiple data lakes and replicated existing application logic and functionality in the Azure Data Lake, Data Factory, and SQL Data Warehouse environment.
- Designed and optimized various SQL database objects, including tables, views, stored procedures, user-defined functions, indexes, and triggers.
- Created Power BI reports and dashboards featuring scorecards, metrics, Power View/Map, pivot tables, and other visualizations for comprehensive data analysis using Power BI Desktop.
DATA ENGINEER, WALGREENS, DEERFIELD, IL (05/2017 - 12/2022)

- Collaborated with business subject matter experts to gather requirements, comprehending both business and functional needs. Engaged in requirement gathering, prepared ETL specifications, and generated design documents.
- Optimized SQL queries by implementing techniques such as re-indexing, updating statistics, recompiling stored procedures, and performing other maintenance tasks.
- Scripted SQL queries incorporating advanced concepts such as joins, CROSS APPLY, aggregate queries, window functions, and MERGE operations. Managed SQL Server performance through the creation and maintenance of clustered and non-clustered indexes. Implemented error handling using the TRY-CATCH method and employed common table expressions (CTEs).
- Developed Spark SQL and Python notebooks within Azure Databricks (ADB) for batch data validation and transformation, including data cleaning, deduplication, and null-value handling.
- Designed and implemented data pipelines in Azure Data Factory (ADF) to handle diverse data sources, facilitating integration, transformation, and curation for delivering customer journey orchestration at scale.
- Utilized Azure Data Factory (ADF) for ingesting, transforming, and processing both online and offline transactional data sources into Azure Blob Storage, serving as input for downstream teams for ML predictions and business analysis.
- Captured updates to customer and supplier data from Operational Data Storage (ODS) to Azure Blob Storage, performing incremental loading with Type 2 Slowly Changing Dimensions (SCD).
- Designed and implemented automated data ingestion pipelines using Snowpipe, enabling near real-time data updates, and integrated Snowflake with other data tools and platforms to create comprehensive data pipelines.
- Conducted performance tuning and optimization in Snowflake, ensuring efficient query execution and reducing data latency.
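The Type 2 Slowly Changing Dimension loading mentioned above can be sketched as a simplified in-memory merge. This is an illustration under assumed column names (key, attrs, valid_from, valid_to, current), not the actual pipeline, which ran in ADF/Databricks against Blob Storage.

```python
def scd2_merge(dimension, updates, today):
    """Apply a Type 2 SCD merge: expire changed rows, append new versions.

    dimension: list of dicts with keys key, attrs, valid_from, valid_to, current
    updates:   dict mapping business key -> latest attribute dict
    today:     effective date for expirations and new versions
    """
    out, seen = [], set()
    for row in dimension:
        if row["current"] and row["key"] in updates \
                and row["attrs"] != updates[row["key"]]:
            # Close out the old version, then emit a new current version.
            out.append(dict(row, valid_to=today, current=False))
            out.append({"key": row["key"], "attrs": updates[row["key"]],
                        "valid_from": today, "valid_to": None, "current": True})
        else:
            out.append(row)
        seen.add(row["key"])
    # Keys never seen before become brand-new current rows.
    for key, attrs in updates.items():
        if key not in seen:
            out.append({"key": key, "attrs": attrs,
                        "valid_from": today, "valid_to": None, "current": True})
    return out
```

The same expire-and-insert pattern is what a SQL MERGE against the dimension table would express; history is preserved because changed rows are closed rather than overwritten.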
- Maintained documentation for Snowflake data models, data pipelines, and system architecture to facilitate collaboration and knowledge sharing.
- Employed dimensional data modeling techniques following the Kimball methodology to integrate new data sources into the data warehouse.
- Created data visualizations, including heat maps, story timelines, and trend reports, using Tableau for capacity analysis and data pattern identification.
- Developed interactive Tableau dashboards with calculated fields, filters, groups, and parameters to enhance data visualization.
- Collaborated in the creation of calculated members, named sets, and advanced KPIs for SSAS cubes.
- Developed SQL queries, stored procedures, and views for storing client and advisor data using SQL Server, actively leading data mapping activities.
- Configured various Azure cloud services, including Azure Storage, Azure Active Directory, Azure Service Bus, Azure VPN Point-to-Site, Virtual Networks, Azure Functions, and Azure custom security. Managed Azure AD tenants and configured application integration with Azure AD.
- Set up multiple subscriptions in Azure and collaborated with the network team to establish connections between on-premises networks and multiple VNets in Azure through VNet peering.
- Created cloud modules for interacting with Azure services, enabling infrastructure creation and orchestration on Azure as well as automation of cloud-native applications using Azure microservices such as Azure Functions and Kubernetes.
- Designed and implemented highly available and scalable architectures using AWS services such as EC2, S3, Lambda, RDS, and IAM. Collaborated with cross-functional teams to define requirements, architect solutions, and deploy applications in AWS environments. Implemented automation using AWS CloudFormation and other DevOps tools to streamline deployment processes and increase operational efficiency.
- Monitored and optimized performance, cost, and security of AWS resources, resulting in significant improvements in scalability, reliability, and cost-effectiveness.
- Leveraged Azure DevOps services for continuous integration, continuous delivery, deployment, and monitoring tasks, including addressing build and deployment issues for various release types.
- Extracted, transformed, and loaded data from source systems into Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics). Data was ingested into Azure services such as Azure Data Lake, Azure Storage, and Azure DW, and processed using Azure Databricks.
- Contributed to the migration of SQL Server databases to SQL Azure Database using the SQL Azure Migration Wizard, utilizing a Python API for uploading agent logs into Azure Blob Storage.
- Developed complex data models in Power BI to enable in-depth analysis and insights.
- Created calculated columns, measures, and DAX (Data Analysis Expressions) calculations to enhance data visualization and reporting capabilities.
- Utilized Power Query for data transformation, cleaning, and shaping within Power BI.
- Designed drill-through, drill-down, and cross-filtering functionalities to allow users to explore data at various levels of detail.

BI DEVELOPER, GEICO, NY, NY (05/2015 - 05/2017)

- Designed logical and physical data models using the Erwin data modeling tool. Involved in the creation of dimensions using star and snowflake schemas.
- Used SSIS to unite data from existing systems and performed transformations on MS SQL.
- Created complex SQL queries using views, indexes, triggers, roles, stored procedures, and user-defined functions.
- Performed Extract, Transform, Load (ETL) development using SQL Server and SQL Server Integration Services (SSIS).
- Involved in data model design and enhancements of data marts based on the ETL project requirements.
- Wrote many T-SQL stored procedures to perform specific tasks per the requirements.
- Extensively used T-SQL in constructing user-defined functions, views, indexes, user profiles, relational database models, data dictionaries, and data integrity constraints.
- Used Query Analyzer, Index Wizard, Database Engine Tuning Advisor, and SQL Profiler for performance tuning.
- Created SSIS packages to extract data from OLTP to OLAP systems and scheduled jobs to call the packages and stored procedures.
- Created logging for ETL load Confidential at the package level and task level to log the number of records processed by each package and each task using SSIS.
- Actively involved in creating SSIS packages to transform data from the OLTP database to a data warehouse.
- Created reports using Microsoft SQL Server Reporting Services (SSRS), with proficiency in using Report Designer as well as Report Builder.
- Administered the interface to organize reports and data sources using SSRS. Developed drill-down reports, drill-through reports, and matrix reports using SSRS.
- Wrote parameterized queries for generating tabular reports, using expressions and functions, sorting the data, and adding subtotals for the reports using SSRS.
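Parameterized tabular queries with per-group subtotals, as described in the GEICO role, can be illustrated with Python's built-in sqlite3 standing in for SQL Server; the `claims` table, its columns, and the sample rows are invented for the example.

```python
import sqlite3

# In-memory database standing in for a SQL Server reporting source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (region TEXT, amount REAL)")
conn.executemany("INSERT INTO claims VALUES (?, ?)",
                 [("East", 100.0), ("East", 50.0), ("West", 75.0)])

def region_subtotals(conn, min_amount):
    """Parameterized aggregate query: subtotal of claim amounts per region.

    The ? placeholder keeps the filter value out of the SQL text, the
    same pattern a parameterized SSRS dataset query uses.
    """
    cur = conn.execute(
        "SELECT region, SUM(amount) FROM claims "
        "WHERE amount >= ? GROUP BY region ORDER BY region",
        (min_amount,))
    return cur.fetchall()
```

In SSRS the equivalent would be a dataset query with an `@MinAmount` report parameter feeding the WHERE clause.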
