
Azure Data Factory Resume Virginia beach...

Candidate Information
Title Azure Data Factory
Target Location US-VA-Virginia Beach
Candidate's Name
SQL / BI DEVELOPER
PHONE NUMBER AVAILABLE | EMAIL AVAILABLE
Hickory Creek, TX 75065

PROFESSIONAL SUMMARY
- Five years of experience across diverse business and technical environments, focused on the design, development, implementation, and support of on-premises and cloud data warehouse applications with big-data analytics capabilities.
- Proficient in cloud services, especially Microsoft Azure service types (IaaS, PaaS, SaaS, FaaS, and DaaS), deployment modes (public, private, and hybrid), and serverless services.
- Well versed in Azure terminology: data centers, availability zones, regions and geographies, resources, resource groups, subscriptions, and management groups.
- Experienced with the serverless Azure Data Factory Author and Monitor features, building cost-efficient pipelines to ingest data from multiple sources.
- Built on-premises and cloud data warehouses from conceptualization through production using Microsoft SQL Server and Microsoft Azure.
- Worked on Azure Data Factory cost optimization using activity runs, DIU units, pipeline activity execution hours, and external pipeline execution hours.
- Well versed in microservices architecture using cron jobs deployed in Docker with Kubernetes container orchestration.
- Well versed in the RESTful APIs of Azure Data Factory, Azure Key Vault, and Power BI endpoints.
- Implemented orchestration using Apache Airflow hosted by Azure Data Factory to prioritize and align pipelines dynamically.
- Designed and developed SSIS packages and Azure Data Factory pipelines to orchestrate ETL and ELT, propagating data from staging to reporting.
- Highly experienced in dimensional modeling using star and snowflake schemas; handled slowly changing dimensions with techniques such as row hashes (MD5 and SHA), the Slowly Changing Dimension wizard, and lookups, especially for incremental data loads.
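The row-hash change detection mentioned above can be sketched in a few lines of Python; the column names and sample rows here are hypothetical, and a production load would compare hashes stored alongside the warehouse rows:

```python
import hashlib

def row_hash(row: dict, tracked_columns: list) -> str:
    """Concatenate the tracked attribute values with a delimiter and hash them.
    A fixed delimiter guards against collisions like ("ab", "c") vs ("a", "bc")."""
    payload = "|".join(str(row.get(col, "")) for col in tracked_columns)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

# Detect a changed row during an incremental load by comparing hashes.
tracked = ["name", "city", "status"]  # hypothetical tracked attributes
warehouse_row = {"customer_id": 42, "name": "Acme", "city": "Dallas", "status": "active"}
incoming_row  = {"customer_id": 42, "name": "Acme", "city": "Austin", "status": "active"}

changed = row_hash(incoming_row, tracked) != row_hash(warehouse_row, tracked)
```

Hashing only the tracked columns keeps stable attributes (keys, audit columns) from triggering spurious updates; SHA-256 can be swapped in where MD5 collision resistance is a concern.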
- Implemented ETL workflows to handle late-arriving dimensions and late-arriving facts using surrogate key pipelining.
- Experienced in designing, creating, and processing cubes with SSAS; created and configured data sources, data source views, cubes, and calculated measures for OLAP cubes.
- Experienced in ad-hoc, parameterized, and custom reporting using SSRS for daily ad-hoc reports.
- Proven ability to architect automated environments for data assets and resources, leveraging tools such as Azure Cloud Shell, PowerShell, and Python.
- Collaborated with DevOps engineers to prepare automated CI/CD pipelines using Azure ARM templates per deployment strategies.

TECHNICAL SKILLS
Operating Systems: Windows Server, Linux
Databases: MS SQL Server 2014/2016/2017/2019, Oracle 11g/10g/9i
ETL: SSIS, Data Import/Export Wizard
OLAP/BI Tools: SSRS, SSAS, Power BI
Languages: Java, HTML, XML, SQL, PL/SQL
Web/App Servers: IBM WebSphere 4.x, Sun iPlanet Server 6.0, IIS, Tomcat
Tools: TOAD, Visio, Eclipse, MS Excel
Cloud: Azure SQL Database, Azure Synapse, Azure Data Factory
Virtualization: Azure K8s, Docker images

PROFESSIONAL EXPERIENCE

Goldman Sachs, Washington, DC 20001
SQL/BI Developer
February 2023 to Present
- Model and architect the data warehouse design; direct and/or execute the technical characteristics of the overall data warehouse and ETL strategy.
- Work closely with IT and business groups to understand reporting requirements, analyze the logical model, and develop subject-matter expertise in a short time.
- Map source-system data elements to the target system; develop, test, and support extraction, transformation, and load processes.
- Worked on Azure Data Factory pipelines to land on-premises data in Azure Data Lake Storage blobs as .csv files.
- Populated the Azure Synapse data warehouse from the .csv files and developed subject-area data marts.
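The surrogate key pipelining for late-arriving dimensions described in the summary can be sketched in Python; the in-memory tables, column names, and "inferred member" placeholder are illustrative assumptions, standing in for warehouse lookups:

```python
from itertools import count

# Hypothetical in-memory dimension; real code would query the warehouse.
dim_table = [{"customer_sk": 1, "customer_bk": "C100", "name": "Acme", "inferred": False}]
dim_lookup = {r["customer_bk"]: r["customer_sk"] for r in dim_table}
sk_counter = count(start=2)  # next available surrogate key

def load_fact_row(fact: dict) -> dict:
    """Resolve the business key to a surrogate key; when the dimension row
    has not arrived yet, insert an inferred (placeholder) member so the fact
    can load now and be enriched once the dimension catches up."""
    bk = fact["customer_bk"]
    if bk not in dim_lookup:
        sk = next(sk_counter)
        dim_table.append({"customer_sk": sk, "customer_bk": bk,
                          "name": "UNKNOWN", "inferred": True})
        dim_lookup[bk] = sk
    return {**fact, "customer_sk": dim_lookup[bk]}

facts = [{"order_id": 9,  "customer_bk": "C100", "amount": 50.0},
         {"order_id": 10, "customer_bk": "C999", "amount": 75.0}]  # C999 arrives late
loaded = [load_fact_row(f) for f in facts]
```

The `inferred` flag lets a later dimension load overwrite the placeholder in place, so the fact's surrogate key never has to change.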
- Implemented integration runtime deployments with Python scripts to bring on-premises data into Azure SQL Database via Azure Data Factory pipelines; calculated the DIUs and DTUs required for the nightly data transfer.
- Implemented Azure Monitor and Log Analytics to understand and resolve production performance issues and provide solutions to development teams; developed Grafana dashboards to visualize monitoring data and wrote Kusto Query Language (KQL) queries.
- Handled Type 1 and Type 2 slowly changing dimensions, fact tables, star-schema design, and other data warehouse concepts.
- Designed and documented the error-handling strategy for the ETL load process; prepared the complete ETL specification document for all ETL flows.
- Developed MDX queries for cubes in Business Intelligence Development Studio (BIDS), deploying them to the server through an SSIS package for cube processing.
- Generated Power BI reports for internal and external customers for business performance monitoring and decision making.
- Generated periodic reports based on statistical analysis of data across various time frames and divisions using SQL Server Reporting Services (SSRS).
- Designed and developed dashboards and reports in Microsoft Excel and SSRS on daily, week-to-date, month-to-date, quarter-to-date, and year-to-date bases to track loans issued.
- Created and maintained numerous Power BI reports, including financial and operational reports.
Environment: SQL Server 2019, Azure SQL, Synapse, Data Factory V2, Windows Server 2016, C#, T-SQL, U-SQL, KQL, SSRS, SSIS, SSAS, Azure Repos (Git)

KBR Heritage Federal Credit Union, Houston, TX 77002
SQL Developer
September 2021 to January 2023
- Assisted in the development and delivery of technical and process-oriented solutions and recommendations in support of the production environment.
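The DIU sizing mentioned in the Goldman Sachs role amounts to trading copy-activity parallelism against run duration, since Data Factory bills per DIU-hour. A minimal sketch, assuming a placeholder per-DIU-hour rate rather than a published Azure price:

```python
def copy_activity_cost(diu: int, duration_hours: float,
                       rate_per_diu_hour: float = 0.25) -> float:
    """Estimate a copy activity's cost as DIUs x hours x rate.
    The default rate is a hypothetical placeholder for illustration."""
    return round(diu * duration_hours * rate_per_diu_hour, 2)

# Compare two sizing options for the same nightly transfer.
small = copy_activity_cost(diu=4, duration_hours=2.0)   # fewer DIUs, longer run
large = copy_activity_cost(diu=16, duration_hours=0.6)  # more DIUs, shorter run
```

Because copy throughput rarely scales linearly with DIUs, quadrupling them here shortens the run but still costs more, which is the comparison a nightly-load sizing exercise makes.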
- Analyzed business and user needs, documented current system requirements, translated them into proper system specifications, and wrote and executed test cases.
- Worked on applications in an agile environment using a test-driven development approach.
- Developed complex business logic with functions, stored procedures, and triggers to support application integration across metadata objects.
- Worked at a high level across most phases of systems analysis, considering the business implications of technology for the current and future business environment.
- Developed Power BI reports and dashboards from multiple data sources using data blending.
- Worked on existing SSIS packages and implemented new enhancements per business requests; modified the legacy application's SSIS packages to match various integration sources.
- Troubleshot high-level issues to identify and resolve problems caused by the database, source data, ETL, or the business intelligence software (for example, NULLs causing joins to behave incorrectly, ETL running out of sequence, or MicroStrategy objects not set up properly).
- Scheduled jobs using a third-party application (Confluence Workstation) and SQL Server Agent to extract data from Crystal Reports and place the files in the outbound folder.
- Managed all database maintenance plan activities, including backups, indexing, and integrity and consistency checks.
- Designed Power BI data visualizations using crosstabs, scatter plots, and pie, bar, and density charts.
Environment: MS SQL Server 2017, SSIS, Power BI, MS Access, MS Excel, Windows Server 2016

ECL Finance Limited, Ludhiana, Punjab, India
Jr. SQL Developer
June 2019 to August 2021
- Developed databases and schema objects, including tables, indexes, and constraints; connected various applications to the database and wrote functions, stored procedures, and triggers.
- Involved in requirement gathering, scheduling status review meetings, documentation, and code review.
- Created data mapping and sourcing documents from various sources to the database schema.
- Involved in designing and implementing the relational database (RDBMS) model per business needs.
- Developed Transact-SQL including complex queries, views, functions, stored procedures, cursors, and triggers on SQL Server.
- Used tools such as SQL Profiler, the Index Tuning Wizard, and Windows Performance Monitor to monitor and tune MS SQL Server performance.
- Responsible for developing, supporting, and maintaining ETL (extract, transform, and load) processes using SQL Server Integration Services.
- Extracted data from IMS/DB2, MS Access, and SQL Server 2005/2008 for migration, performing various SSIS transformations per business rules.
- Configured connection manager files so SSIS packages execute dynamically on the QA and production servers.
- Implemented Type 1 and Type 2 SCD mappings to update slowly changing dimension tables.
- Experienced in developing and maintaining cubes, pulling data from the data warehouse, and processing SSAS cubes.
- Configured the report server and Report Manager, scheduling and granting permissions to different levels of users in SQL Server Reporting Services (SSRS).
- Performed system testing for all modules and assisted the business with user acceptance testing.
- Generated on-demand and scheduled reports for business analysis and management decisions using SQL Server Reporting Services.
- Coordinated with the offshore team for on-time product delivery and query resolution.
Environment: MS SQL Server 2016, SQL Server Reporting Services (SSRS), MS Access, MS Excel, SSIS, MS Visio, Erwin, BCP, T-SQL
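The Type 2 SCD mapping mentioned in this role follows a standard expire-and-append pattern; a minimal in-memory sketch, with a hypothetical `city` as the single tracked attribute:

```python
from datetime import date

def apply_scd2(dim_rows: list, incoming: list, load_date: date) -> list:
    """Type 2 update: expire the current version of any changed row and
    append a new current version; unchanged rows are left untouched."""
    current = {r["customer_bk"]: r for r in dim_rows if r["is_current"]}
    for row in incoming:
        old = current.get(row["customer_bk"])
        if old is not None and old["city"] == row["city"]:
            continue  # no change detected on the tracked attribute
        if old is not None:
            old["is_current"] = False  # close out the prior version
            old["end_date"] = load_date
        dim_rows.append({"customer_bk": row["customer_bk"], "city": row["city"],
                         "start_date": load_date, "end_date": None, "is_current": True})
    return dim_rows

dim = [{"customer_bk": "C1", "city": "Dallas",
        "start_date": date(2020, 1, 1), "end_date": None, "is_current": True}]
apply_scd2(dim, [{"customer_bk": "C1", "city": "Austin"}], date(2024, 6, 1))
```

In T-SQL the same pattern is typically a `MERGE` (or update-then-insert pair) keyed on the business key and `is_current` flag; the row-hash technique from the summary slots in as the change test.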
