Lila M Banjade
SQL / ETL Developer
Phone: PHONE NUMBER AVAILABLE
Address: Lexington, KY, Street Address
Email: EMAIL AVAILABLE

PROFESSIONAL SUMMARY
- 5+ years in software development and testing on the Microsoft Business Intelligence stack, including SSIS, SSRS, SSAS, on-prem Microsoft SQL Server, and Oracle, across varied project scenarios.
- Nearly 3 years developing on Microsoft Azure and cloud-native services, particularly building data pipelines with Azure Data Factory and Azure SQL Database, and building cloud data warehouses and data lakehouses with Azure Synapse and Azure Data Lake.
- Thorough understanding of all phases of the SDLC, including requirement gathering, analysis and design, implementation, and delivery; versatile across platforms with an innovative approach to designing solutions.
- Well versed in cloud methodologies and terminology (IaaS, PaaS, SaaS, FaaS, and DaaS), cost-optimization techniques, service types, serverless services, and API programming using C#.
- Well versed in microservices architecture using Docker and Kubernetes container orchestration.
- Implemented data orchestration using Apache Airflow hosted by Azure Data Factory to prioritize and align pipelines dynamically.
- Strong experience with on-prem ETL processes and tools; technical expertise in extraction, transformation, and loading of data from multiple feeds and sources (Excel, CSV, flat files, SQL Server, Oracle, JSON, web services, XML, SharePoint lists, and third-party APIs such as WinSCP SFTP and Newtonsoft.Json) into a data warehouse using SQL Server Integration Services (SSIS).
- Conversant with data distribution strategies such as Round Robin, Hash, and Replicated; implemented Round Robin data partitioning in a data warehouse built on the Azure SQL PaaS service.
- Well experienced with serverless Azure Data Factory Author and Monitor features for building cost-efficient pipelines that port data from multiple sources.
- Implemented Azure Data Factory integration runtimes to execute SSIS packages in on-prem data-collection scenarios; parameterized linked services and reused existing datasets across most activities in a pipeline.
- Good knowledge of EMR (Elastic MapReduce) for big data operations in AWS; working knowledge of Amazon Web Services using EC2 for compute and S3 for storage.
- Experience in dimensional data modeling and cube development, partitioning, aggregation, and optimization using SSAS.
- Created and formatted crosstab, conditional, drill-down, Top N, summary, form, OLAP, sub-reports, ad-hoc, parameterized, interactive, and custom reports.
- Experience providing SSRS server-level and item-level security and accessibility for reports in a multi-user environment.
- Expert in writing T-SQL and working with DTS, SSIS, SSRS, SSAS, data cleansing, data scrubbing, and data migration; created indexed views, complex stored procedures, effective functions, and appropriate triggers to support efficient data manipulation and data consistency.
- Excellent knowledge of DDL (CREATE, ALTER, DROP) and DML (INSERT, UPDATE, DELETE) in T-SQL, with experience using SQL Profiler for query tuning and optimization.
- Experienced in designing, creating, and processing cubes using SSAS.
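For context, the Round Robin distribution noted above is typically declared at table-creation time; a minimal sketch, assuming an Azure Synapse dedicated SQL pool (the table and column names are hypothetical, not from any actual project):

```sql
-- Hypothetical staging table distributed Round Robin: rows are spread
-- evenly across all distributions with no hash key, which suits staging
-- tables that lack an obvious join column.
CREATE TABLE dbo.StageSales
(
    SaleId     INT           NOT NULL,
    CustomerId INT           NOT NULL,
    Amount     DECIMAL(18,2) NOT NULL,
    SaleDate   DATE          NOT NULL
)
WITH
(
    DISTRIBUTION = ROUND_ROBIN,
    CLUSTERED COLUMNSTORE INDEX
);
```

Round Robin loads fastest but may force data movement at join time; hash distribution on a frequently joined key is the usual alternative for large fact tables.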
- Created and configured data sources, data source views, cubes, and calculated measures for OLAP cubes.
- Team player with the ability to motivate associates to deliver desired performance levels; strong communication, analytical, and problem-solving skills.

TECHNICAL SKILLS
Operating Systems: Windows Server, XP/7/10, AIX, Linux
Cloud: Microsoft Azure (proficient), Amazon Web Services (novice)
Automation Tools: Control-M, ESP
Languages: C#.NET, C, C++, T-SQL, PL/SQL
Databases: SQL Server 2022/2019/2017/2016/2014, Oracle 19c
ETL Tools: SQL Server Integration Services (SSIS), Azure Data Factory v2
Reporting Tools: SQL Server Reporting Services (SSRS), Power BI, Tableau 2022 (novice)
Analysis Tool: SQL Server Analysis Services (SSAS)
SOA: API programming using C# and Windows Communication Foundation (WCF) services, microservices, Postman
Containers: Kubernetes services, Kubernetes cluster construction, Docker containers, Docker commands, docker run, Docker images, Docker Compose, Docker Engine, Docker networking
Scripting Tools: PowerShell, UNIX shell scripting, YAML
Others: MS Office Suite, TFS, Tortoise SVN (Subversion), GitHub, MS Teams, BMC Remedy, JIRA, XMLSpy, Rally

EDUCATION
Master's Degree

CERTIFICATION
SAS Certified Specialist: Base Programming Using SAS 9.4

PROFESSIONAL EXPERIENCE

T-SQL Programmer
Riverstone Bank, Colorado Springs, CO
June 2022 to Present
- Worked on the on-prem version of the data warehouse initially, then moved to the cloud version to solve the challenges faced on on-prem servers.
- Implemented Data Factory pipelines to load data from on-prem SQL Server to Azure Data Lake Storage in comma-separated-value format, then loaded this data into Azure SQL.
- Implemented a Data Vault schema storing data as Hubs, Satellites, and Links to support scalability and ease the addition of new tables and attributes.
- Reduced the development footprint by introducing Docker containers, which expose only a small slice of the OS and enabled different teams to work on the same project.
- Created SSRS data sources and troubleshot them as needed against a WCF Web API, addressing cross-system compatibility and cybersecurity issues.
- Implemented JWT token preparation for authentication and authorization of the WCF Web API, with client registration using OAuth 2.0.
- Used SQL Server Integration Services packages (developing, deploying, scheduling, troubleshooting, and monitoring) for daily data transfers and ETL across different servers.
- Designed ETL packages handling different data sources (SQL Server, flat files, XML, etc.) and loaded the data into staging and data warehouse databases using various SSIS transformations.
- Used various SSIS transformations such as Lookup, Fuzzy Grouping, and Row Count.
- Proficient in performance tuning and query optimization; identified coding issues and data exceptions and cleaned data as required.
- Backed up databases and refreshed data across environments; performed large data transfers to and from SQL Server databases using SSIS and BULK INSERT.
- Developed and maintained ETL packages to extract data from various sources; responsible for debugging and upgrading several ETL structures per requirements.
- Actively involved in the development phase, including testing for multiple project modules.
- Actively participated in all daily Agile ceremonies.
Environment: MS SQL Server 2019, Azure Data Factory, Azure K8s, Azure SQL Database, Docker images, Windows Server 2016, Postman, WCF API, MS Office.

SQL/SAS Programmer / Business Intelligence
HSBC Bank, New York, NY
July 2020 to May 2022
- Analyzed production implementations, validated source data, implemented business logic, and used the appropriate transformations while developing ETL packages and loading data into the target.
- Performed gap analysis and root-cause analysis.
- Involved in the requirement-analysis, design, and development phases of an Agile model system; worked closely with business analysts and architects to gather requirements; delivered presentations to clients and conducted group meetings with team members.
- Designed, developed, tested, and automated extraction, transformation, and loading (ETL) processes using SQL Server Integration Services.
- Performed data modeling and mapping, identifying source data fields, target entities, lookup-table IDs, and translation rules.
- Responsible for developing and managing enterprise data warehouse processes and policies, following strategic direction on best practices for holistic architecture.
- Used Performance Monitor and SQL Profiler to optimize queries and enhance database server performance.
- Optimized stored procedures and long-running scripts using execution plans, temp tables, code rewriting, and indexing strategies to increase speed and reduce run time.
- Designed data warehouse ETL packages to handle change tracking using the Slowly Changing Dimension transformation for Type 1 and Type 2 dimension attributes.
- Designed, developed, and implemented SSIS packages and projects to efficiently process high volumes of data from multiple sources and meet nightly processing-window requirements.
- Developed a configurable, metadata-driven ETL solution to track execution parameters, logging, and decision making during job execution, easing debugging in case of job failure.
- Designed and developed ETL packages using SQL Server Integration Services (SSIS) to load data from SQL Server and XML files into a SQL Server database through custom C# script tasks.
- Used various SSIS components such as Conditional Split, Derived Column, Lookup, Foreach Loop containers, Sequence containers, and Script tasks for data scrubbing and data-validation checks during staging.
- Utilized SSIS project configuration to ease deployment across dev, test, stage, and prod servers by configuring environment variables to assign project parameters.
- Used ETL tools to maintain data quality and improve metadata capabilities.
- Designed, developed, and deployed OLAP cubes and SSIS packages to process OLAP cubes through source views listing dimension and fact tables; utilized XMLA files for cube deployment.
- Developed MDX queries for cubes in Business Intelligence Development Studio (BIDS) and deployed them to the server through SSIS packages for cube processing.
Environment: MS SQL Server 2019/2017, SSIS, SSAS, SSRS, C#, Windows Server, MS Visual Studio.

Jr. T-SQL Programmer / Data Analyst
Trinity Health, Livonia, MI
Dec 2018 to Jun 2020
- Involved in requirements gathering, analysis, design, development, change management, and deployment across the complete Software Development Life Cycle (SDLC).
- Reviewed, analyzed, and implemented changes where needed to enhance and improve existing systems.
- Converted Access databases to SQL Server: designed tables, imported data, and developed constraints and indexes to ensure data integrity.
- Developed stored procedures, user-defined functions, SQL triggers, and views.
- Resolved issues in existing code across various modules and added new functionality per requirements.
- Optimized query performance by removing unnecessary columns, eliminating redundant and inconsistent data, normalizing tables, establishing joins, and creating clustered/non-clustered indexes where necessary.
- Resolved deadlock issues with databases and servers in real time.
- Involved in performance tuning and query optimization using SQL Profiler.
- Integrated data from homogeneous and heterogeneous sources and legacy systems (SQL, DB2, Excel) into a SQL Server database.
- Worked on workflow optimization to reduce query execution time for better performance; developed Rules Based Notifications (RDN) scripts to reduce the execution time of existing SQL queries used in Java code.
- Designed and implemented data-access stored procedures and triggers to automate tasks; performed job scheduling and alerting.
- Built new SQL Server subsystem environments or relocated existing ones upon request.
- Worked with technical support teams to identify SQL Server software-related problems and provided timely resolutions.
- Worked closely with Operations and Automation teams to define operational procedures involving the SQL Server subsystems.
- Collaborated with other team members and stakeholders.
Environment: MS SQL Server 2016, SQL, SSIS, T-SQL, Excel, Visual Studio, shell programming
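For illustration, the Type 2 slowly changing dimension handling referenced in the HSBC role can be sketched in plain T-SQL (SSIS wraps the same logic in its Slowly Changing Dimension transformation); every table and column name here is hypothetical:

```sql
-- Type 2 SCD sketch: expire the current dimension row when a tracked
-- attribute changes, then insert a fresh "current" row.
-- dbo.DimCustomer and dbo.StgCustomer are illustrative names only.

-- Step 1: close out rows whose tracked attributes changed in staging.
UPDATE d
SET    d.IsCurrent = 0,
       d.EndDate   = GETDATE()
FROM   dbo.DimCustomer AS d
JOIN   dbo.StgCustomer AS s
       ON s.CustomerId = d.CustomerId
WHERE  d.IsCurrent = 1
  AND  (s.City <> d.City OR s.Segment <> d.Segment);

-- Step 2: insert a new current row for changed and brand-new customers.
INSERT INTO dbo.DimCustomer
       (CustomerId, City, Segment, StartDate, EndDate, IsCurrent)
SELECT s.CustomerId, s.City, s.Segment, GETDATE(), NULL, 1
FROM   dbo.StgCustomer AS s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dbo.DimCustomer AS d
                   WHERE  d.CustomerId = s.CustomerId
                     AND  d.IsCurrent  = 1);
```

Type 1 attributes, by contrast, would simply be overwritten in place with an UPDATE, keeping no history.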