
Sql Server Power Bi Resume Austin, TX
Name: Candidate's Name
Email: EMAIL AVAILABLE
Phone: PHONE NUMBER AVAILABLE

Sr. SQL, ETL, SSRS, Power BI Developer

About Me
I have over 11 years of experience in data warehousing. I currently work as an MSBI Developer, improving products and services for our customers by applying advanced analytics, standing up big-data analytical tools, creating and maintaining models, and onboarding compelling new data sets. In my current role I lead a team that derives data insights and enhances customer experience by leveraging Microsoft BI capabilities, analyzing data from some of the biggest enterprises.

My Strengths:
- Competencies: business insights and reporting, data analytics, data processing, data visualization, business intelligence tools (MSBI), SSIS, SSRS, data strategy, database architecture, warehousing and data management, Tableau.
- Expertise in SQL Server development, the MSBI stack (T-SQL, SSIS, SSAS, SSRS), Azure, and Power BI for building, deploying, and managing applications and services through a global network of Microsoft-managed data centers. Specialized in the design, implementation, integration, maintenance, and testing of web-based, enterprise, client/server, and distributed applications using Python, Django, Flask, and PyTorch.
- Worked on data warehousing, data engineering, feature engineering, big data, ETL/ELT, and business intelligence.
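As an illustration of the kind of ETL/ELT work described above, a minimal extract-transform-load pass can be sketched in Python with pandas. This is a sketch only; the file name, column names, and target table are hypothetical, not taken from any specific project mentioned in this resume:

```python
import sqlite3

import pandas as pd


def run_etl(csv_path: str, conn: sqlite3.Connection) -> int:
    """Extract rows from a flat file, apply a business rule, load a target table.

    Returns the number of rows loaded. Column/table names are illustrative.
    """
    df = pd.read_csv(csv_path)                # extract: flat-file source
    df = df[df["amount"] > 0].copy()          # transform: drop invalid rows
    df["amount_usd"] = df["amount"].round(2)  # transform: derived column
    # load: append into the (hypothetical) fact table, creating it if absent
    df.to_sql("fact_sales", conn, if_exists="append", index=False)
    return len(df)
```

In production this role would typically be played by an SSIS data flow (flat-file source, Derived Column, Conditional Split, OLE DB destination); the pandas version just makes the extract/transform/load stages explicit.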
- As a data analyst, specialized in Azure frameworks, the Hadoop ecosystem, Excel, Snowflake, relational databases, and tools such as Tableau, Power BI, Python, and DevOps frameworks (Azure DevOps pipelines).
- Passionate about working at the intersection of cloud computing and data engineering; experienced in building distributed data solutions, data analytical applications, and ETL and streaming pipelines leveraging big-data ecosystem components, the Databricks platform, Azure cloud services, and the Hadoop ecosystem.
- Expert in data analysis, data profiling, data cleansing and quality, data migration, data integration, data ingestion, and data transformation.
- Industry experience includes retail, networking, manufacturing, logistics, energy, banking, financial services, insurance, health care, e-commerce, and utilities.

PROFESSIONAL SUMMARY:
- Microsoft BI Developer with 11+ years of experience in MS Power BI, SSRS (SQL Server Reporting Services), SSIS, SSMS, Visual Studio, Tableau, the MSBI suite, Teradata, DB2, and Oracle.
- Expert in designing enterprise reports using SQL Server Reporting Services (SSRS): drill-down reports, parameterized reports, linked reports, sub-reports, matrix dynamics and filters, and charts.
- Expert in data warehouse development from inception through implementation and ongoing support; strong understanding of BI application design and development principles.
- Highly experienced in the whole cycle of DTS/SQL Server Integration Services (SSIS) packages (developing, deploying, scheduling, troubleshooting, and monitoring) for data transfers and ETL across different servers.
- Data analyst with a solid understanding of data modeling and evaluating data sources; strong understanding of data warehouse/data mart design, ETL, BI, OLAP, and client/server applications.
- Experience in data analysis, data migration, data cleansing, transformation, integration, data import, and data export using multiple ETL tools such as Informatica PowerCenter; experience in testing and writing SQL statements (stored procedures, functions, triggers, and packages); proficient in Power BI, Snowflake, Teradata, SQL Server, Oracle, and Python.
- Experience in providing logging and error handling using event handlers and custom logging for SSIS packages.
- Experience in performance tuning of SSIS packages using row transformations and blocking/non-blocking transformations.
- Scheduled and monitored ETL processes using DTExec utilities and batch files.
- Expertise in generating reports using SQL Server Reporting Services, Crystal Reports, MS Excel spreadsheets, and Power Pivot.
- Worked on ad-hoc reports and data-driven subscription reports using Report Builder in SSRS.
- Experience in data design and development on Microsoft SQL Server 2016/2014/2012/2008 R2/2008, T-SQL, performance tuning, troubleshooting, SSIS/DTS package configuration, SSRS, SSAS, and data warehousing.
- Hands-on experience installing, configuring, managing, monitoring, troubleshooting, upgrading, and migrating SQL Server.
- Created SSIS packages using SSIS Designer to export heterogeneous data from OLE DB sources (Oracle) and Excel spreadsheets to SQL Server.
- Experience migrating data from Excel, flat files, and Oracle to MS SQL Server using BCP and the DTS utility, with automation via SSIS; good scripting skills in Python and PowerShell.
- Good understanding of creating and managing linked servers across SQL Server databases.
- Experience with SQL Server Analysis Services MDX queries and with developing and monitoring SSIS packages.
- Proficient in using NumPy and Pandas to manipulate and transform data, including filtering, sorting, and merging datasets for analysis; generated data reports and insights with NumPy and Pandas, delivering actionable information to stakeholders for informed decision-making.
- Worked on data marts, OLAP, dimensional data modeling, and star-schema and snowflake-schema modeling for fact and dimension tables using Analysis Services.
- Exposure to big-data platforms deployed in the Azure cloud; implemented Copy activities and custom Azure Data Factory pipeline activities for on-cloud ETL processing; worked on Azure Data Lake, Azure Data Factory, SQL Data Warehouse, Azure Blob storage, and Azure Storage Explorer.
- Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins, and hash indexes in Teradata.
- Worked on SQL Server DTS and SSIS (Integration Services) package design, construction, and deployment; SQL queries involving multiple tables with inner and outer joins; and stored procedures.
- Extensive experience in database design using SQL and PL/SQL programming with stored procedures, functions, packages, and triggers.
- Extracted, transformed, and loaded (ETL) source data into target tables to build data marts.
- Conducted gap analysis; created use cases, workflows, screenshots, and PowerPoint presentations for various data applications.
- Worked on complex SQL queries and PL/SQL procedures and converted them to ETL tasks.
- Understanding of the Microsoft development suite (SSIS and SSAS) and familiarity with standard data extract, cleansing, integration, and aggregation techniques and best practices.
- Experienced with both live and import data connections for creating reports.
- Experience in extraction, transformation, and loading (ETL) of data from various sources into data warehouses, as well as data processing such as collecting, aggregating, and moving data from various sources using Apache Flume, Kafka, Power BI, and Microsoft SSIS.
- Experience building data pipelines using Azure Data Factory and Azure Databricks and loading data into Azure Data Lake, Azure SQL Database, and Azure SQL Data Warehouse.

TECHNICAL SKILLS:
Databases: MS SQL Server 2008/2012/14/16, Oracle, PL/SQL, MS Access
Programming Languages: T-SQL, PL/SQL, HTML, XML, Unix
Tools/Utilities: Enterprise Manager, SQL Server Management Studio (SSMS), Query Analyzer, Business Intelligence Development Studio, DTS, Analysis Manager, SQL Profiler, SSIS, SSRS, Visual Studio, SQL*Plus, Erwin, BI
Database Design Tools & Data Modeling: MS Visio, MS Access, MS SQL Server, star-schema/snowflake-schema modeling, physical and logical data modeling, Agile Scrum methodologies
Business Intelligence Tools: Business Intelligence Development Studio (BIDS), SQL Server Reporting Services (SSRS), SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), DTS packages
BI Tools: Tableau Desktop 10.x/9.x/8.3/7, Tableau Server 8.2, OBIEE 11g/10.1.3.4, Power BI 2.73
ETL Tools: Informatica 8.6, DataStage 7.5, Alteryx
Operating Systems: Windows Server, Linux, Mac, Unix
Application Servers: Tornado, WebLogic, Tomcat
Web Servers: WebLogic, WebSphere, Apache Tomcat, JBoss
Management Tools: SVN, Git, Jira, Maven
Cloud Infrastructure: Azure cloud - Databricks, Azure Data Factory (ADF), Azure Data Lake, Azure Pipelines, Azure Functions, Blob storage
Team Skills: coordinating, facilitating, tracking, summarizing, and reporting

PROFESSIONAL EXPERIENCE

Client: Cigna, Lafayette, LA; Sep 2020 - Present
Role: MS SQL BI Developer (SSIS/SSRS)
Project Description: Cigna Healthcare offers health insurance plans such as medical and dental to individuals and employers, international health insurance, and Medicare.
This implementation also covers Clinical Case Management and Utilization Management software using a fully integrated database of member and provider information, coordinating all interactions through collaborative workflow management and business-rules automation, and supporting real-time communication and coordination of care among all members of the care team, inside and outside the four walls of traditional healthcare settings. The project also involved creating a corporate data warehouse and migrating data from the OLTP systems into it. SSIS was used as the ETL tool for extracting data from various sources running on Oracle, DB2, and MS SQL Server databases, and reports were generated in Tableau and SSRS covering weekly, monthly, quarterly, and annual historic information.

Key Contributions:
- Designed and developed packages for data warehousing and data migration projects using Integration Services (SSIS) across different data sources.
- Extracted data from flat files, transformed it (implementing the required business logic), and loaded it into the target data warehouse using SSIS (SQL Server).
- Created SSIS packages using various control-flow and data-flow tasks, including Execute SQL Task, Foreach Loop Container, Derived Column, and Data Conversion.
- Created packages to transfer data between OLTP and OLAP databases using Lookup, Derived Column, Conditional Split, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task, etc., to load underlying data into tables from text, Excel, CSV, and XML files.
- Worked with the Tableau Server admin to schedule extract refreshes and upgrade desktop and server versions; actively participated in functional testing of the dashboards.
- Worked on SQL queries: created and modified queries to obtain the required subsets of data, connected them to Tableau, and created dashboards.
- Designed and developed ETL jobs to extract data from different sources and load it into a data mart in Snowflake; managed Snowflake clusters, including launching a cluster by specifying the nodes, and performed data analysis queries.
- Used various ETL tools such as SSIS/DTS to move data from sources such as DB2, XML, Excel, tables, and views to other databases or files, with proper mapping and loading into the data warehouse.
- Worked on dimensional modeling of the data from scratch: extracted raw data from heterogeneous sources, performed ETL for incremental data loads using SSIS, and built dimension and fact tables using a star schema.
- Created entities in MDM and updated the data programmatically; created ADF pipelines to read and write data from MDM.
- Used SQL Server Integration Services (SSIS) to populate data from various data sources; developed web-based front-end screens using MS FrontPage, HTML, and JavaScript; designed the database and stored procedures to speed up certain daily jobs.
- Developed dashboards in Power BI for analyzing MS product key activations, key blocks, and key distribution shared across different partners.
- Created complex Power BI dashboards and developed Databricks Python notebooks to join, filter, pre-aggregate, and process files stored in Azure Data Lake Storage based on business logic.
- Worked on the Python OpenStack API and used Python scripts to update content in the database and manipulate files.
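The join/filter/pre-aggregate notebook pattern described above can be sketched in plain pandas (in Databricks the same shape would typically use PySpark DataFrames). The table and column names here are illustrative assumptions, not the actual project schema:

```python
import pandas as pd


def summarize_activations(activations: pd.DataFrame,
                          partners: pd.DataFrame) -> pd.DataFrame:
    """Join fact rows to a partner dimension, filter, and pre-aggregate.

    Assumed (hypothetical) schema:
      activations: key_id, partner_id, status
      partners:    partner_id, partner_name
    """
    joined = activations.merge(partners, on="partner_id", how="inner")  # join
    active = joined[joined["status"] == "activated"]                    # filter
    # pre-aggregate: activation counts per partner
    return (active.groupby("partner_name", as_index=False)
                  .agg(activations=("key_id", "count")))
```

The point of pre-aggregating in the pipeline is that downstream dashboards read a small summary table rather than scanning raw fact files on every refresh.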
- Monitored KPIs, isolated causes of unexpected variances and trends, built expert-level data visualizations, and presented recommendations to improve business blueprints.

Responsibilities:
- Created complex T-SQL: stored procedures, triggers, cursors, tables, views, and other SQL joins and statements for applications.
- Developed T-SQL (DDL, DML) statements and data-integrity constraints; performed performance tuning and query optimization.
- Created reports using SQL Server Reporting Services (SSRS) for customized and ad-hoc queries.
- Designed dynamic SSIS packages to transfer data across different platforms, validate data during transfer, and archive data files for the DBMS.
- Used advanced Tableau features to link data from different connections together on one dashboard and to filter data in multiple views at once.
- Identified new metrics and enhanced existing KPIs.
- Built and published customized interactive reports and dashboards, with report scheduling, using Tableau Server.
- Designed and implemented complex data-integration workflows using SnapLogic, resulting in increased efficiency and improved data quality for clients.
- Led a team of SnapLogic developers in creating custom solutions for clients in various industries, including healthcare, finance, and retail.
- Integrated SnapLogic with third-party systems such as Salesforce, Workday, and SAP to enable seamless data exchange and improve business processes.
- Developed and maintained SnapLogic pipelines for batch and real-time data processing, ensuring timely delivery of critical business data to stakeholders.
- Conducted training sessions for new SnapLogic developers, giving them a solid understanding of the platform's capabilities and development best practices.
- Collaborated with cross-functional teams to identify business requirements and design SnapLogic solutions to meet those needs.
- Conducted performance tuning and optimization of SnapLogic workflows to improve data-processing speed and reduce latency.
- Created actions, data blending, hierarchies, parameters, filters (local and global), and calculated sets for preparing dashboards and worksheets in Tableau.
- Administered users, user groups, and scheduled instances for reports in Tableau.
- Scheduled the frequency and time of data refreshes when sources were published or extracted to the server.
- Implemented complex business logic through T-SQL stored procedures, functions, views, and advanced query concepts.
- Created ETL and SSRS reports to automate ad-hoc reporting.
- Developed complex calculated measures using the Data Analysis Expressions (DAX) language.
- Developed, monitored, and troubleshot errors occurring in batch jobs.
- Created SSIS packages that pick up files from a shared folder, transform the data as required, and generate output files used for third-party integration.
- Created test scripts for users to test system changes; created test cases, scenarios, customized integrations, and performance tests.
- Wrote technical design documents.
- Analyzed critical issues and defects and provided solutions accordingly.
- Performed end-to-end requirements gathering, design, development, testing, and maintenance of the AX application.

Environment: MS SQL Server 2014/2012, Microsoft Visual Studio 2012, XHTML, XML, MS SQL Query Analyzer, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), ASP.NET, C#, T-SQL, Azure

Client: T-Mobile, Dallas, TX; Apr 2017 - Oct 2020
Role: BI Developer - SQL, SSRS / Power BI Developer
Project Description: This project provides finance/revenue reports, bookings, maintenance bookings, maintenance revenue, and sales business dashboards. The primary goal of the project is the collection, integration, and analysis of data from different sources, yielding greater insight and more effective support for decision making. My role was to explore and transform data into reports and visualizations for business users.
This data is then used for making decisions on equipment management. The project involved developing dashboards in Power BI and Tableau.

My Key Contributions:
- Generated various dashboards in Tableau Server using different data sources such as Teradata, Oracle, and Microsoft SQL Server.
- Involved in the analysis, design, development, and enhancement of the data warehousing application.
- Applied natural language processing to classify comments based on text content in the project onboarding questions in the LCS portal, using Python (NLTK).
- Involved in generating new KPIs for scorecards.
- Collected data and worked with data-manipulation teams to obtain the required data.
- Created various reports, such as drill-through, drill-down, and ad-hoc reports, according to user requirements using SQL Server Reporting Services (SSRS).
- Designed and developed dashboard reports in Power BI using DAX.
- Created pipelines for loading data from a wide variety of sources, such as HDFS files, on-premises SQL tables, and Azure Storage Tables, into Azure Data Lake Store.
- Created jobs in Geneva Analytics data studio for loading data from Kusto into Azure SQL Data Warehouse.
- Built Excel and Power BI reports by connecting to various SQL Server databases and improved the efficiency of these large reports.
- Created interactive dashboards and applied actions (filter, highlight, and URL) to them.
- Created calculated fields, mappings, and hierarchies.
- Created list reports, cross-tab reports, charts, sectional reports, and drill-through reports using Report Studio.

Responsibilities:
- Designed SSIS packages to extract, transform, and load data into the database and archive data files from different legacy systems, using SSMS in a SQL Server 2005/2008/2012 environment; deployed the data and scheduled jobs to perform these tasks periodically.
- Generated tabular, matrix, drill-down, drill-through, summary, sub-, detailed, parameterized, and linked reports using Tableau.
- Performance-tuned stored procedures and SQL queries using SQL Profiler and the Index Tuning Wizard.
- Migrated data from text, CSV, and Excel files to SQL Server.
- Extensively used various SSIS objects such as control-flow components, data-flow components, connection managers, logging, and configuration files.
- Designed, developed, debugged, and tested SSIS packages, stored procedures, and configuration files, and implemented best-practice packages to maintain optimal performance.
- Built efficient SSIS packages for processing fact and dimension tables with complex transformations; implemented error thresholds using precedence constraints and variables in SSIS packages, with business rules as reference.
- Good knowledge of data marts, data warehousing, operational data stores (ODS), OLAP, and data modeling, including dimensional data modeling, star-schema and snowflake-schema modeling, and fact and dimension tables, using MS Analysis Services.
- Automated SSIS packages using SQL Server Agent jobs; created an ETL summary report containing summary information for all loading activities performed each day and month.
- Performed unit testing and internal QA for the developed application.
- Created and maintained SSRS reports, design documents, and other Scrum status trackers on SharePoint for the business users and the steering team.
- Designed and developed various SSIS (ETL) packages to extract and transform data, and was involved in scheduling SSIS packages.
- Created packages using various control-flow tasks such as the Data Flow Task and Execute SQL Task.
- Created SSIS packages/projects to move data from source to destination programmatically using Visual Studio.
- Deployed a new MDM hub for portals in conjunction with the user interface on the IDD application.
- Configured match rule set properties by enabling search-by-rules in MDM according to business rules.
- Developed tablix, matrix, drill-down, and tabular reports and charts using SSRS.
- Created PL/SQL scripts to extract data from the operational database into simple flat text files using the UTL_FILE package.
- Involved in implementing the land process of loading the customer data set into Informatica MDM from various source systems.
- Participated in the development and implementation of the MDM decommissioning project using Informatica PowerCenter, which reduced the cost and time of implementation and development.
- Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools such as Toad, PL/SQL Developer, and SQL*Plus.
- Improved SSRS report performance by tuning stored procedures and T-SQL queries using various join conditions; gained experience converting Actuate reports into SSRS; deployed and troubleshot SSIS packages.
- Developed SQL queries to validate the overall data migration; prepared data-reconciliation reports for the conversion team/business.

Environment: Azure, Python, MS SQL Server 2014/2012, Microsoft Visual Studio 2012, XHTML, XML, MS SQL Query Analyzer, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), ASP.NET, C#, T-SQL, Snowflake

Client: Express Scripts, Franklin Lakes, NJ; Jan 2016 - Mar 2017
Role: ETL Developer / Power BI SQL Developer
Project Description: Digital transformation services and solutions. The primary aim of the project was to load data from different sources to build reports for the customer loan-approval division: analyzing debts over periods of delay, by customer, region, etc., to form a complete picture of the loan portfolio, including reserves, collateral value, the amounts of outstanding debts, and the amount of coverage. I was involved in data analysis, data profiling, data integration, migration, data governance, metadata management, master data management, and configuration management.
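The data-profiling step mentioned above (inspecting each source column before integration and migration) can be sketched minimally in pandas. This is a generic illustration, not the project's actual profiling tooling:

```python
import pandas as pd


def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Per-column profile of a source dataset: dtype, null count, distinct count.

    A first-pass quality check before data integration/migration.
    """
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),  # inferred storage type per column
        "nulls": df.isna().sum(),        # completeness check
        "distinct": df.nunique(),        # cardinality (candidate keys, codes)
    })
```

A profile like this quickly surfaces columns that need cleansing rules (unexpected nulls, low-cardinality codes, mis-typed fields) before the ETL mappings are written.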
The project also involved developing dashboards in Tableau environments for the Metrics and Compliance department.

My Key Contributions:
- Developed dashboard visualizations, cross tables, bar charts, waterfalls, tree maps, and complex reports involving custom controls and custom expressions, using bars, lines, pies, maps, scatter plots, Gantt charts, bubbles, histograms, bullets, heat maps, and highlight tables.
- Developed various solution-driven views and dashboards with different chart types, including pie charts, bar charts, tree maps, circle views, line charts, area charts, and scatter plots, in Power BI.
- Created a dashboard for transportation-system financial information, including revenue and expenditure by route.
- Applied DAX to various reports:
  - Used different DAX expressions to create measures and calculated columns.
  - Using DAX and the branch method, created Top-N levels in visuals (top 3, top 5, or any number) to present the data.
  - Created row-level security using the branch method with the PATH and PATHCONTAINS functions in a calculated column, and managed the roles (using DAX).
  - Applied DAX to various report types, including time intelligence and cost-benefit analysis.
  - Created a running total for the monthly billing.
  - Created different measures and calculated columns with DAX.
- Created call-summary reports in Power BI using Data Analysis Expressions, working deeply with the M language to fix bugs and issues in the Power Query editor.
- Created packages to transfer data between OLTP and OLAP databases using Lookup, Derived Column, Conditional Split, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task, etc., to load underlying data into tables from text, Excel, CSV, and XML files.
- Developed Power BI reports using query logic based on business rules, with visuals selected together with stakeholders.
- Created Power Pivot models and reports, published reports to SharePoint, and deployed models to a SQL Server SSAS instance.
- Merged and transformed data as required, for ease of analysis in Excel, using Power Query in Power BI.
- Performed the ETL process by pulling large volumes of data into the staging database using SSIS and BCP from various data sources such as Access and Excel.

Responsibilities:
- Created various tabular, cross-tab, matrix, and ad-hoc reports using SSRS.
- Designed and implemented complex SSIS packages to migrate data from multiple data sources for analysis.
- Designed and coded standard tabular reports, including drill-down and drill-through functionality and graphical presentations such as charts and dashboard-type metrics.
- Worked with the data steward team on designing, documenting, and configuring Informatica Data Director to support management of MDM data.
- Generated drill-down and drill-through reports with drop-down menu options, sorted data, and defined subtotals in SSRS.
- Scheduled jobs to execute the stored SSIS packages developed to update the database.
- Created error and performance reports on SSIS packages, jobs, stored procedures, and triggers.
- Created alerts, notifications, and emails for system errors, insufficient resources, fatal database errors, and hardware errors; used the BCP utility and T-SQL commands to bulk-copy data from text files to SQL Server and vice versa.
- Created stored procedures, T-SQL, triggers, views, and functions for the database application.
- Defined the report layout and identified datasets for report generation.
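The running-total and Top-N DAX measures described above have straightforward analogues in pandas, which makes the intent easy to verify against sample data. This sketch assumes hypothetical `month`/`amount` columns; it illustrates the calculations, not the actual Power BI model:

```python
import pandas as pd


def monthly_running_total(billing: pd.DataFrame) -> pd.DataFrame:
    """Running total of billing by month (the cumulative-sum pattern that a
    DAX time-intelligence measure expresses over a date table)."""
    out = billing.sort_values("month").copy()
    out["running_total"] = out["amount"].cumsum()
    return out


def top_n(df: pd.DataFrame, n: int) -> pd.DataFrame:
    """Top-N rows by amount, as a Top-N visual level would surface them."""
    return df.nlargest(n, "amount")
```

In the actual report the equivalent logic lives in DAX measures evaluated per filter context; the pandas version is useful as a fixture for sanity-checking the numbers a visual should show.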
- Built drill-down and parameterized reports using SSRS.
- Extensively used Report Builder and Report Manager in SSRS.

Environment: MS SQL Server 2008 R2, SQL Server Integration Services (SSIS), SSAS, SSRS, OLTP, Business Intelligence Development Studio, Visual Studio, SQL Server Profiler, SQL Server Management Studio, Excel, TFS

Client: Wolters Kluwer, Indianapolis, IN; June 2013 - Dec 2015
Role: Sr. SQL/MSBI Azure Developer
Project Description: E-Trade is a leading financial services firm. Having executed the first-ever electronic trade by an individual investor, it has long been at the forefront of the digital revolution, focused on delivering complete, easy-to-use solutions for traders, investors, and stock-plan participants, and it aims to enhance the financial independence of traders and investors through a powerful digital offering and professional guidance. Using the solution's configurable workflow and rules engine, lenders can control how leads are distributed and assigned.

Key Contributions:
- Created an Azure Data Factory pipeline for copying data from Azure Blob storage to SQL Server.
- Implemented ad-hoc analysis solutions using Azure Data Lake Analytics/Store and HDInsight/Databricks.
- Worked with Microsoft on-premises data platforms, specifically SQL Server, SSIS, SSRS, and SSAS.
- Created reusable ADF pipelines to call REST APIs and consume Kafka events.
- Worked on SQL Server Integration Services (SSIS) to integrate and analyze data from multiple homogeneous and heterogeneous information sources (CSV, Excel, Oracle DB, and SQL).
- Gathered requirements and wrote functional and technical specifications for end-user and end-client applications; performed re-engineering and capacity planning.
- Involved in database schema design and development.
- Used T-SQL to create stored procedures, indexes, and functions.
- Generated server-side T-SQL scripts for data manipulation and validation, and created various snapshots and materialized views for remote instances.
- Created tables, indexes, and constraints; wrote T-SQL statements for data retrieval; and performed performance tuning of T-SQL queries and stored procedures.
- Used SSIS to create ETL packages to validate, extract, transform, and load data into trouble-ticket and data-mart databases.

Responsibilities:
- Designed the high-level ETL architecture for overall data transfer from the source server to the Enterprise Services warehouse.
- Worked as a developer creating complex stored procedures, SSIS packages, triggers, cursors, tables, views, and other SQL joins and statements for applications.
- Generated reports using global variables, expressions, and functions.
- Configured Reporting Services in SQL Server 2008.
- Created report models so business users could create their own reports.
- Created complex stored procedures, functions, indexes, tables, views, and other T-SQL code and SQL joins for applications.
- Used complex expressions to calculate, group, filter, parameterize, and format report contents according to requirements.
- Monitored the nightly ETL process from various, highly different source systems, including SQL-based databases and Excel files; also ensured that nightly backup jobs and other ETL jobs didn't interfere with each other.
- Imported data from flat files, transformed it, and placed it in the target DB2 and Teradata databases.
- Developed SQL scripts for data dumps into SQL Server, DB2, and Teradata databases.
- Developed a data warehouse model in Snowflake for over 100 datasets using WhereScape.
- Heavily involved in testing Snowflake to understand the best possible way to use the cloud resources.
- Worked on Azure/SQL on-premises replication and geo-replication.
- Deployed Azure SQL databases and other SQL settings using PowerShell.
