
SQL Server Business Intelligence Resume

Candidate Information
Title Sql Server Business Intelligence
Target Location US-GA-Decatur
RACHANA LUITEL
Decatur, GA (Street Address)
PHONE NUMBER AVAILABLE | EMAIL AVAILABLE

SQL/ETL Developer

Profile Summary
- More than 5 years of experience developing Microsoft SQL Server Business Intelligence and multi-cloud applications, with a strong command of data warehouse and OLTP concepts.
- Fluent in multi-cloud databases and data warehouses (Microsoft Azure and Amazon Web Services); built pipelines and orchestrated pipeline execution and deployment.
- Designed and built data pipelines integrating data from varied sources (databases, APIs, and streaming services) while ensuring data quality, reliability, and scalability.
- Optimized data infrastructure for performance and scalability using serverless Platform-as-a-Service offerings: Azure Data Factory and AWS Data Pipeline.
- Experienced with NoSQL databases; worked with Azure Cosmos DB and AWS DynamoDB to handle schemaless key-value data, unstructured JSON, and similar formats.
- Proficient in relational database concepts and architecture; implemented databases, data warehouses, and data marts on SQL Server 2022, 2019, 2017, and earlier versions.
- Worked with cloud relational databases (Azure SQL and AWS Aurora) to process on-premises data, files, and APIs; used Azure Blob and AWS S3 storage for files during data processing.
- Implemented web service API endpoints to expose security-sensitive on-premises database objects to the cloud, supporting on-premises and cloud data integration per business reporting requirements.
- Well versed in Python, PowerShell, and Bash scripting to automate activities: cleansed source data files (especially CSV), developed ARM template deployments with Python scripts, and managed on-premises roles and rights with PowerShell.
- Proficient with cloud management groups, subscriptions, resource groups, policies, and blueprint design as part of cloud security, governance, regulatory compliance, workload optimization, and automation.
- Proficient in Microsoft Business Intelligence: SQL Server Integration, Analysis, and Reporting Services, with report development in the Power BI ecosystem.
- Experienced in designing SSIS packages, SSAS cubes, and SSRS reports to build end-to-end business intelligence solutions for organizational reporting needs.
- Highly experienced in dimensional modeling using star and snowflake schemas; handled slowly changing dimensions with techniques such as row hashing (MD5 and SHA), the Slowly Changing Dimension wizard, and Lookup transformations, especially for incremental data loads.
- Defined hardware requirements, cloud VM and serverless scalability, MFA and conditional-access RBAC, and data-protection compliance; monitored cloud environments with Log Analytics and CloudWatch.
- Well versed in CI/CD processes and tools: Azure Key Vault, AWS KMS, Azure Repos, AWS CodeCommit, and Git.

Technical Skills
Operating Systems: Windows Server 2022/2016/2012, MS Windows 10, UNIX
Relational DBMS: MS SQL Server 2022/2019/2017/2016/2014, MS Access, Oracle 9i/11g
Cloud Tools: Azure Data Factory v2, Azure SQL Databases, AWS Data Pipeline, AWS Aurora, Azure Log Analytics, AWS CloudWatch, Cosmos DB, Blob Storage, S3, Azure Portal, Azure IAM, Azure CI/CD platform, Azure MS SQL, Apache Superset
SQL Server Tools: SSMS, Profiler, Query Analyzer, Index Tuning Wizard, SQL Trace tools
Other Tools: MS Office Suite (Microsoft Word, PowerPoint, MS Visio), OLAP & OLTP, Adobe tools, Microsoft .NET, Team Foundation Server, Visual Studio, GitLab, GitHub
Windows Tools: Event Viewer, Task Manager
Programming: T-SQL, C#, Python, PowerShell, Bash scripting, ASP.NET / Web Services
Applications: JIRA Ticketing System, Visual Studio, MS Office, Pointing Poker
Job Schedulers: SQL Server Agent, AutoSys, Control-M, Triggers in ADF

Education
Master's Degree

Professional Experience

SQL/ETL Developer
OTR Capital Solutions, Roswell, GA 30076 (Oct 2023 - Present)
- Worked on the on-premises version of the data warehouse initially, then moved to a cloud version to address the limitations of on-premises servers, leveraging serverless Azure services backed by Apache Spark and Apache Superset.
- Implemented Data Factory pipelines to load data from on-premises SQL Server to Azure Data Lake Storage in comma-separated-value format, then loaded this data into Azure MS SQL.
- Implemented Azure Data Factory integration runtimes to execute SSIS packages for on-premises data scenarios; parameterized linked services and reused existing datasets across most pipeline activities.
- Implemented a Data Vault schema storing data as Hubs, Satellites, and Links to support scalability and ease of adding new tables and attributes.
- Designed physical and logical data models and maintained the data-model repository.
- Analyzed and performed data modeling and mapping, identifying source data fields, target entities, lookup-table IDs, and translation rules.
- Implemented round-robin data partitioning in the data warehouse built on the Azure MS SQL PaaS service.
- Responsible for developing and managing enterprise data warehouse processes and policies, following strategic direction on best practices for holistic architecture.
- Used Performance Monitor and SQL Profiler to optimize queries and improve database server performance.
- Optimized stored procedures and long-running scripts using execution plans, temp tables, code rewrites, and indexing strategies to reduce run time.
- Designed data warehouse ETL packages that track data changes with the Slowly Changing Dimension transformation for Type 1 and Type 2 dimension attributes.
- Designed, developed, and implemented SSIS packages and projects to efficiently process high volumes of data from multiple sources within nightly processing windows.
- Designed, developed, and deployed OLAP cubes and the SSIS packages that process them from source views of dimension and fact tables; used XMLA files for cube deployment.
- Developed MDX queries for cubes in Business Intelligence Development Studio (BIDS) and deployed them via SSIS packages for cube processing.
- Generated corporate reports for the management committee using SQL Server Reporting Services (SSRS) and Excel.
- Deployed reports to Report Manager and troubleshot any errors occurring during execution.
Environment: MS SQL Server 2022/2019, Azure Data Factory, Azure MS SQL, SQL Server Integration Services, SQL Server Analysis Services, SQL Server Reporting Services, Microsoft Visual Studio 2022, Windows Server 2022, MDX, Erwin, SQL Profiler, SQL Server Management Studio, MS Excel, T-SQL, MS Access, TFS

SQL/ETL Developer
CBI Bank & Trust, Muscatine, IA 52761 (Jun 2020 - Sep 2023)
- Involved in the requirement analysis, design, and development phases of an agile-model system.
- Collaborated with cross-functional teams to develop and maintain data governance policies and data security protocols, ensuring compliance with industry regulations and protecting sensitive data.
- Developed AWS Data Pipeline pipelines to orchestrate the entire on-premises-to-cloud data movement, along with SageMaker model refreshes at specified intervals.
- Handled real-time data ingestion through an event messaging system, batching records and loading them into Aurora databases as a staging layer.
- Developed and implemented data pipelines for various source files loaded into Aurora tables, with Python scripts used for cleansing.
- Proficient in SSIS Control Flow items (For Loop, Execute Package/SQL tasks, Script Task, Send Mail Task) and Data Flow items (Conditional Split, Data Conversion, Fuzzy Lookup, Fuzzy Grouping, Pivot).
- Migrated a legacy MS SQL and Oracle-based data warehouse to a cloud data lake; created multiple Apache Airflow DAGs to orchestrate various processes.
- Involved in implementing the on-premises data warehouse initially, as well as the cloud data warehouse for the product's next-generation version.
- Developed data pipelines calling Redshift stored procedures to copy data from Elastic Block Store data lake storage into the raw schema.
- Implemented data security and access controls using Cognito, Identity and Access Management (IAM), and Cloud Security Command Center.
- Monitored and troubleshot data pipelines and storage solutions using CloudTrail and CloudWatch logs.
- Developed error handling and reporting with the EventBridge service for real-time exception monitoring and alerting.
Environment: MS SQL Server 2019, AWS Data Pipeline, Aurora, EventBridge, Redshift, Apache Airflow, BIDS Data Tools, Windows Server 2019, Python

Junior SQL Developer
Talon Health Tech (My Medical Shopper), Portsmouth, NH 03801 (Dec 2018 - May 2020)
- Worked on change requests against existing database objects, including tables, views, user-defined functions, stored procedures, constraints, and triggers, to support ongoing business change requests.
- Extensively used Dynamic Management Views (DMVs) and Dynamic Management Functions (DMFs) to identify which queries execute during job runs.
- Worked on existing SSIS packages and the Import/Export Wizard to transfer data from databases (Oracle and text-format data) to SQL Server.
- Extensively used the SSIS Data Profiling Task to profile source systems (tables and views) before building ETL solutions, which was instrumental in cleaning data for consistency and designing table structures.
- Performed data cleaning and manipulation on sample data in Microsoft Excel: concatenation, LEN, find & replace, filter & sort, conditional formatting, INDEX MATCH, remove duplicates, and logic functions.
- Extracted data from Oracle and migrated it into SQL Server using SSIS; developed package labels for test and production environments and automated SSIS package execution.
- Used SSIS tasks such as Lookup, Derived Column, Merge Join, Fuzzy Lookup, For Loop, Foreach Loop, Conditional Split, Union All, and Script Component for data scrubbing, including validation checks during staging before loading data into the data warehouse.
- Used SSIS Control Flow and Data Flow tasks such as the Pivot transformation, Execute SQL Task, and Data Flow Task to import data into the data warehouse.
- Implemented SSIS logging and error configurations for package error handling.
- Migrated package configurations for the entire database from XML configurations to SQL Server configurations to support package execution across environments.
- Created interactive dashboards using parameters, actions, and various data visualizations with Excel Power Pivot.
- Created dashboards with charts and graphs that convey meaningful data and insights using Power BI.
- Deployed reports to Report Manager and troubleshot any errors occurring during execution.
- Managed subscriptions and rendered reports in different formats (PDF, Excel, etc.) to run automatically on daily, weekly, and monthly schedules.
Environment: MS SQL Server 2017/2016, SQL Server Integration Services, SQL Server Analysis Services, SQL Server Reporting Services, Microsoft Visual Studio 2019, Windows Server 2016, MDX, Erwin, SQL Profiler, SQL Server Management Studio, MS Excel, T-SQL, MS Access, TFS, Power BI
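The experience above repeatedly mentions handling slowly changing dimensions with row hashes (MD5/SHA) during incremental loads. A minimal Python sketch of that change-detection idea follows; the function names and in-memory structures are illustrative only, not the candidate's actual code:

```python
import hashlib

def row_hash(row, columns):
    """Concatenate the tracked attribute values and return an MD5 digest.

    A fixed separator avoids collisions such as ("ab", "c") vs ("a", "bc").
    """
    payload = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def detect_changes(source_rows, target_hashes, key, tracked):
    """Classify incoming rows as inserts or updates by comparing row hashes.

    target_hashes maps business key -> previously stored row hash.
    Unchanged rows (matching hash) are skipped, which is what makes
    the load incremental.
    """
    inserts, updates = [], []
    for row in source_rows:
        h = row_hash(row, tracked)
        existing = target_hashes.get(row[key])
        if existing is None:
            inserts.append(row)
        elif existing != h:
            updates.append(row)  # Type 2: expire old version, insert new
    return inserts, updates
```

In a real warehouse load the target hashes would come from the dimension table itself, and the update branch would either overwrite (Type 1) or version the row (Type 2).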
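The resume also mentions Python scripts that cleanse source CSV files before loading them into staging tables. A small illustrative sketch of that kind of pre-load cleansing step; the function and its rules are hypothetical examples, not the scripts described above:

```python
import csv
import io

def cleanse_csv(raw_text, required):
    """Trim whitespace, drop fully blank rows, and skip rows missing required fields."""
    reader = csv.DictReader(io.StringIO(raw_text))
    clean = []
    for row in reader:
        # Normalize: strip surrounding whitespace, treat None as empty
        row = {k: (v or "").strip() for k, v in row.items()}
        if not any(row.values()):
            continue  # drop fully blank rows
        if any(not row[c] for c in required):
            continue  # skip rows missing a required field
        clean.append(row)
    return clean
```

A step like this typically runs before the bulk copy into the staging layer, so that the database only ever sees rows that satisfy the load contract.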
