Candidate Information
Title: Data Analyst Sql Server
Target Location: US-FL-Jacksonville
Candidate's Name
Data Analyst
Phone: PHONE NUMBER AVAILABLE
Email: EMAIL AVAILABLE

CAREER OBJECTIVE
Seeking to leverage 5+ years of experience in data visualization, predictive modeling, and process optimization to contribute to a dynamic organization's growth and success. Passionate about translating complex data into actionable business solutions and committed to continuous learning and professional development.

PROFILE SUMMARY
5+ years of experience in software design, database design, development, integration, implementation, and maintenance of Business Intelligence and related database platforms.
As a Data Analyst, responsible for data modeling, enterprise data management, data presentation, visualization, optimization, risk assessment, predictive analysis, trend analysis, advanced data analysis, business research, and quantitative analysis.
Experience in SQL Server 2017/2016/2014/2012/2008 R2 with an emphasis on design, development, automation, tuning, optimization, performance, and security. Monitored and tuned MS SQL Server databases with tools such as Index Tuning Wizard, SQL Profiler, and Windows Performance Monitor for optimal performance.
Experience in analyzing business user requirements, analyzing data, and designing software solutions in Tableau Desktop based on those requirements. Experience converting legacy reports to Tableau dashboards connected to SQL Server.
Built and published interactive reports and dashboards using Tableau Desktop and Tableau Server.
Expertise in Microsoft Integration Services (SSIS): Control Flow tasks, Data Flow tasks, transformations, and database administration tasks. Proficient in SSIS package creation, checkpoint implementation, dynamic package configuration, logging, custom script writing, package deployment using Manifest, implementing transactions, and package encryption.
Expertise in creating reports in the Power BI preview portal utilizing SSAS Tabular via the Analysis Services connector.
Experience publishing Power BI Desktop reports created in Report view to the Power BI service.
Experience creating Power BI dashboards (Power View, Power Query, Power Pivot, Power Map).
Extensive experience with SQL Server 2016/2012/2008 Business Intelligence tools: SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), and SQL Server Analysis Services (SSAS).
Experience working with Tableau (Desktop, Server, Reader), creating various reports and dashboards using different functionalities. Experience in Tableau Desktop and Power BI for data visualization, reporting, and analysis, including cross tabs, scatter plots, geographic maps, pie charts, bar charts, page trails, and density charts.
Extensive experience creating complex SSIS packages, creating jobs and alerts, scheduling SSIS packages with SQL Agent, and migrating data from different legacy systems to SQL Server using SSIS.
Proficient in designing ETL/DTS packages for integrating data from heterogeneous sources (Excel, CSV, Oracle, flat files, and text-format data). Experienced in migrating data from Excel, flat files, and Oracle to MS SQL Server using SQL Server Integration Services (SSIS); expert in creating, configuring, and fine-tuning ETL workflows in SSIS.
Created stored procedures for generating reports using SQL Server Reporting Services (SSRS) and Crystal Reports.
Well versed in writing parameterized queries for generating tabular reports, formatting report layouts, and building subreports using global variables, expressions, functions, data sorting, data source definitions, and subtotals in SSRS.
Built Python modules to deliver specific formats of data.
Developed drill-through and drill-down reports using Crystal Reports and SQL Server Reporting Services (SSRS).
Defined, documented, and standardized key performance indicators (KPIs) to ensure consistent tracking and reporting across business units.
Designed and developed OLAP cubes with star schemas and dimensions using SSAS.
Queried and manipulated multidimensional cube data through MDX scripting.
Expert in tabular modeling, multidimensional cubes, DAX, and MDX (SSAS).

EDUCATION
Master's degree from the University of New Haven, USA

TECHNICAL SKILLS
RDBMS/Databases: MS SQL Server 2017/2016/2014/2012/2008 R2, Oracle 13c/12c/11g/10g/9i/8i, TOAD 7.5, MS Access, SQL Server Management Studio, SQL Server Data Tools, Business Intelligence Development Studio (BIDS), Visual Studio, SQL Profiler, Performance Monitor, DTS.
Operating Systems: MS-DOS, Windows Server 2012/2008/2003/XP/NT 4.0/98, UNIX, Linux.
BI Tools: Power BI Desktop, Einstein Analytics, SAP Business Objects, Tableau Desktop (2018.1.x/10.x/9.x/8.x), Tableau Server (10.x/9.x/8.x), Tableau Reader, Tableau Online, Workday, Crystal Reports, MS Excel, SSRS, and Excel Power View.
SQL Server Tools: SQL Server Management Studio, BIDS, SQL Server Profiler, FTP, TOAD.
Programming Languages: Python, R, SQL, T-SQL, PL/SQL, UNIX/Linux shell scripting, Java, JSP, JavaScript, VBScript, Visual Basic 6, C/C++, C#, PHP, .NET, VB.NET, ASP.NET, HTML, CSS, XML.
Methodologies and Tools: Software Development Life Cycle (Agile, Waterfall), logical/physical/dimensional data modeling, MS Project, SQL Profiler, TOAD, TFS, JIRA.
Cloud Services: Microsoft Azure, Amazon Web Services (AWS), and Oracle Enterprise Manager Cloud.

WORK EXPERIENCE
Client: Webull, St. Petersburg, Florida, USA  Jul 2023 - Present
Role: Sr. Data Analyst
Description: Webull is a digital investment platform that offers commission-free and low-cost trading of financial products. Developed and implemented secure data pipelines, automated testing scripts, and analytical solutions while performing complex data analysis and profiling across various systems.
Responsibilities:
Performed complex data analysis in support of ad-hoc and standing customer requests.
Designed and developed automation test scripts using Python.
Used DSE Sqoop for importing data from an RDBMS to Cassandra.
Worked within the Systems Development Life Cycle (SDLC) and the Software as a Service (SaaS) delivery model.
Designed and implemented secure data pipelines into a Snowflake data warehouse from on-premises and cloud data sources.
Implemented a data lake in Azure Blob Storage, Azure Data Lake, Azure Analytics, and Databricks, with data loads to Azure SQL Data Warehouse using PolyBase and Azure Data Factory.
Designed and implemented effective analytics solutions and models with Snowflake.
Queried and analyzed data from Cassandra for quick searching, sorting, and grouping.
Involved in data profiling, data analysis, data mapping, and the design of data architecture artifacts.
Extensively created data pipelines in the cloud using Azure Data Factory.
Worked with Azure Data Factory (ADF) as a SaaS solution to compose and orchestrate Azure data services.
Wrote Python scripts to parse XML documents and load the data into the database (see the sketch after this role).
Involved in the complete SSIS life cycle: creating SSIS packages, then building, deploying, and executing the packages in all environments.
Implemented custom Azure Data Factory pipeline activities and SCOPE scripts.
Utilized Alteryx to streamline ETL processes, significantly improving data processing speed and efficiency across multiple projects.
Created reports for the marketing analytics team using Looker.
Worked on creating ad-hoc reports using SQL Server.
Actively involved in SQL and Azure SQL DW code development using T-SQL.
Interacted with stakeholders to clarify their questions about the reports in Power BI.
Experienced in creating and publishing reports for stakeholders using Power BI.
Analyzed escalated incidents within the Azure SQL database and worked on enhancing data quality in the database.
Used Data Analysis Expressions (DAX) to create measures and calculated fields to analyze data and implement time intelligence.
Utilized Power BI to create analytical dashboards that give business users quick insight into the data.
Environment: Erwin 9.8, SQL, SaaS, Cassandra, Azure, XML, SSIS, Python, Oracle 12c, PL/SQL, T-SQL, SSRS, MDM, Power BI.
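A minimal sketch of the XML-to-database load step referenced in this role. The XML layout (an "order" element with id/symbol/qty/price fields), the staging table name, and the file paths are assumed for illustration, and sqlite3 stands in for the actual target database so the example is self-contained.

```python
# Parse an XML feed and bulk-load it into a relational staging table.
# Element names, table name, and paths are hypothetical; a production version
# would point an ODBC connection at SQL Server / Azure SQL instead of sqlite3.
import sqlite3
import xml.etree.ElementTree as ET

def load_orders(xml_path: str, db_path: str = "staging.db") -> int:
    root = ET.parse(xml_path).getroot()
    rows = [
        (
            order.get("id"),
            order.findtext("symbol"),
            int(order.findtext("qty", default="0")),
            float(order.findtext("price", default="0")),
        )
        for order in root.iter("order")
    ]

    with sqlite3.connect(db_path) as conn:
        conn.execute(
            """CREATE TABLE IF NOT EXISTS stg_orders (
                   order_id TEXT PRIMARY KEY,
                   symbol   TEXT,
                   qty      INTEGER,
                   price    REAL
               )"""
        )
        # Upsert so re-running the script on the same file is idempotent.
        conn.executemany(
            "INSERT OR REPLACE INTO stg_orders VALUES (?, ?, ?, ?)", rows
        )
    return len(rows)

if __name__ == "__main__":
    print(load_orders("orders.xml"), "rows loaded")
```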
Client: Seminole Gaming, Davie, Florida, USA  Mar 2022 - Jun 2023
Role: Sr. Data Analyst
Description: Seminole Gaming is a renowned entertainment, gaming, and hospitality destination. Managed MDM integration, data extraction, and reporting processes while creating visualizations and ensuring seamless data flow across multiple business domains.
Responsibilities:
Developed an MDM integration plan and hub architecture for customers, products, and vendors; designed an MDM solution for three domains.
Wrote SQL queries against Snowflake.
Developed reports for users in different departments of the organization using SQL Server Reporting Services (SSRS).
Developed and supported Oracle SQL, PL/SQL, and T-SQL queries.
Tested the ETL process both before and after the data validation step.
Translated business concepts into XML vocabularies by designing XML Schemas with UML.
Participated in daily scrum meetings with the team to discuss challenges and project progress.
Involved in requirement gathering from business users and IT teams to understand and capture business processes and business logic.
Extracted data for reporting from multiple data sources, including SQL Server, SQL Server Analysis Services, Azure SQL Database, Azure SQL Data Warehouse, and Salesforce.
Extracted data from the data warehouse by developing complex SQL statements using stored procedures and common table expressions (CTEs) to support report building (see the sketch after this role).
Implemented Tableau visualizations and views including scatter plots, box plots, heat maps, tree maps, donut charts, highlight tables, word clouds, and reference lines.
Generated complex reports in Tableau using functionality such as context filters, hierarchies, parameters, LOD expressions, and time intelligence.
Built interactive, storytelling dashboards in Tableau workbooks using dashboard actions.
Published Tableau workbooks on Tableau Server and scheduled report refreshes.
Environment: Power BI Desktop, Power BI Service, Project Online, SQL Server 2016/2017, SSRS, SSIS, SSAS, DAX, Tableau, Microsoft Azure, Python, SharePoint, Visual Studio, Scrum/Agile, BI Development Studio (BIDS), MDX, XML, SQL Profiler, TFS, MS Office, Windows 10/7, Data Analyzer.
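A hedged sketch of the CTE-based extraction mentioned in this role. The table and column names (fact_sales, dim_customer), the connection string, and the grouping logic are assumptions for illustration; pyodbc is used only as one common way to run T-SQL from Python.

```python
# Pull report-ready data with a common table expression (CTE).
# Schema names and connection details are hypothetical placeholders.
import pyodbc

REPORT_QUERY = """
;WITH monthly_sales AS (
    SELECT
        c.region,
        DATEFROMPARTS(YEAR(f.sale_date), MONTH(f.sale_date), 1) AS sale_month,
        SUM(f.amount) AS total_amount
    FROM dbo.fact_sales AS f
    JOIN dbo.dim_customer AS c ON c.customer_id = f.customer_id
    GROUP BY c.region, DATEFROMPARTS(YEAR(f.sale_date), MONTH(f.sale_date), 1)
)
SELECT region, sale_month, total_amount
FROM monthly_sales
WHERE sale_month >= ?
ORDER BY region, sale_month;
"""

def fetch_monthly_sales(conn_str: str, start_month: str):
    """Return (region, month, amount) rows that feed a Tableau/SSRS report."""
    with pyodbc.connect(conn_str) as conn:
        cursor = conn.cursor()
        cursor.execute(REPORT_QUERY, start_month)
        return cursor.fetchall()

if __name__ == "__main__":
    rows = fetch_monthly_sales(
        "Driver={ODBC Driver 17 for SQL Server};Server=...;Database=...;Trusted_Connection=yes",
        "2023-01-01",
    )
    for region, month, amount in rows:
        print(region, month, amount)
```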
Client: Fourrts, Chennai, India  Aug 2020 - Nov 2021
Role: Data Analyst
Description: Fourrts is a leading pharmaceutical company dedicated to providing high-quality healthcare products that improve the lives of people around the world. Contributed to creating and optimizing interactive reports, dashboards, and financial analyses, while ensuring accurate data management and reporting across various business operations.
Responsibilities:
Created PowerPivot models and reports, published the reports to SharePoint, and deployed models to a SQL Server SSAS instance.
Assisted in creating SQL database maintenance logs and presenting any issues to the database architects.
Worked on Power BI reports using multiple visualization types, including line charts, doughnut charts, tables, matrices, KPIs, scatter plots, and box plots.
Made Power BI reports more interactive by using storytelling features such as bookmarks, selection panes, and drill-through filters.
Created and administered workspaces for each project on the Power BI service and published reports from Power BI Desktop to the Power BI service workspaces.
Utilized Power BI to create analytical dashboards that give business users quick insight into the data.
Installed the data gateway and configured scheduled data refreshes for Power BI reports.
Created SSIS packages for ETL processes using control flow and data flow components.
Maintained and tuned the performance of slow-running SSIS packages.
Responsible for accurately reporting on the financial side of the company's operations by rendering reports on a daily, weekly, and monthly basis.
Evaluated returns and risks for various types of investments, including stocks, fixed-income assets, hedge funds, real estate, and commodities, and evaluated risk tolerance for customers.
Created ad-hoc reports such as YTD, YoY, MTD, and MoM analyses for the income statement and cash flow at the request of business units (see the sketch after this role).
Implemented report integration among Power BI, Tableau, SSRS, and Python reports.
Environment: Power BI, SQL Server 2012/2016, Azure, Google BigQuery, Tableau, Excel, MS SQL Server 2016/2014/2012, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), SQL Server Management Studio (SSMS), and Oracle 11g.
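A small sketch of the period-over-period measures mentioned in this role (YTD, MoM, YoY) computed over a monthly income-statement series. The column names and sample figures are invented for illustration; pandas is assumed to be available.

```python
# Add YTD, MoM%, and YoY% columns to a monthly revenue series.
# Column names and sample data are hypothetical.
import pandas as pd

def add_period_measures(df: pd.DataFrame) -> pd.DataFrame:
    """Expects columns ['month', 'revenue']; 'month' is the first day of each month."""
    df = df.sort_values("month").copy()
    df["year"] = df["month"].dt.year
    df["ytd_revenue"] = df.groupby("year")["revenue"].cumsum()   # year-to-date
    df["mom_pct"] = df["revenue"].pct_change() * 100             # month-over-month %
    df["yoy_pct"] = df["revenue"].pct_change(periods=12) * 100   # year-over-year %
    return df.drop(columns="year")

if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "month": pd.date_range("2020-01-01", periods=24, freq="MS"),
            "revenue": [100 + i * 5 for i in range(24)],
        }
    )
    print(add_period_measures(sample).tail())
```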
Client: Barclays, Chennai, India  Jun 2019 - Jul 2020
Role: Programmer Analyst
Description: Barclays plc is a British multinational universal bank. Involved in deploying and monitoring scalable cloud infrastructure, coordinating teams, and ensuring effective data management and validation across projects.
Responsibilities:
Deployed and monitored scalable infrastructure in the Amazon Web Services (AWS) cloud environment.
Managed the development team by collecting weekly status updates.
Developed an issue log for the projects and worked with management to resolve issues.
Coordinated with the testing team and the ETL development team for successful dry runs across all projects.
Collaborated with and reported to the Platform Manager, who reported directly to the CIO.
Identified the entities and the relationships between them to develop a conceptual model using Erwin.
Used Agile methods and daily scrums to discuss project-related information.
Effectively designed, developed, and enhanced cloud-based applications using AWS.
Performed match/merge and ran match rules to check the effectiveness of the MDM process on data.
Extracted, transformed, and loaded data sources to generate CSV data files with Python programs and SQL queries.
Designed and developed logical and physical data models and metadata to support the requirements.
Involved in extensive data validation using SQL queries and back-end testing.
Used AWS Glue to crawl the data lake in S3 and populate the Data Catalog (see the sketch after this role).
Wrote SQL queries, PL/SQL procedures/packages, triggers, and cursors to extract and process data from various source tables in the database.
Developed separate test cases for the ETL process (inbound and outbound) and for reporting.
Developed complex T-SQL code such as stored procedures, functions, triggers, indexes, and views for the business application.
Used Excel sheets, flat files, and CSV files to generate Tableau ad-hoc reports.
Assisted with requirement gathering and testing of the mobile application.
Environment: Erwin 9.8, AWS, Agile, MDM, Python, SQL, PL/SQL, ETL, T-SQL, Tableau.
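A hedged sketch of the Glue crawler step referenced in this role: registering a crawler over an S3 path so new files are catalogued in the Glue Data Catalog. The crawler name, IAM role ARN, S3 path, region, and catalog database are placeholders, not values from the original resume.

```python
# Create (if missing) and start an AWS Glue crawler over an S3 data-lake path.
# All identifiers below are hypothetical placeholders.
import boto3

glue = boto3.client("glue", region_name="us-east-1")

CRAWLER_NAME = "datalake-raw-crawler"                              # hypothetical
IAM_ROLE_ARN = "arn:aws:iam::123456789012:role/GlueCrawlerRole"    # hypothetical
S3_PATH = "s3://example-datalake/raw/"                             # hypothetical
CATALOG_DB = "datalake_raw"                                        # hypothetical

def ensure_crawler() -> None:
    """Create the crawler if it does not exist, then start a run."""
    try:
        glue.get_crawler(Name=CRAWLER_NAME)
    except glue.exceptions.EntityNotFoundException:
        glue.create_crawler(
            Name=CRAWLER_NAME,
            Role=IAM_ROLE_ARN,
            DatabaseName=CATALOG_DB,
            Targets={"S3Targets": [{"Path": S3_PATH}]},
        )
    glue.start_crawler(Name=CRAWLER_NAME)

if __name__ == "__main__":
    ensure_crawler()
```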
