Candidate's Name
Data Analyst
Email: EMAIL AVAILABLE
Mobile: PHONE NUMBER AVAILABLE

SUMMARY
- Around 5 years of experience in software design, database design, development, integration, implementation, and maintenance of business intelligence and related database platforms.
- As a data analyst, responsible for data modeling, enterprise data management, data presentation, visualization, optimization, risk assessment, predictive analysis, advanced data analysis, business research, and quantitative analysis.
- Experience with SQL Server 2017/2016, with an emphasis on design, development, automation, tuning, optimization, performance, and security.
- Experience analyzing business user requirements, analyzing data, and designing Tableau Desktop solutions based on those requirements.
- Experience converting legacy reports to Tableau dashboards connected to SQL Server.
- Built and published interactive reports and dashboards using Tableau Desktop and Tableau Server.
- Expertise in Microsoft SQL Server Integration Services (SSIS), including control flow tasks, data flow tasks, transformations, and database administration tasks.
- Proficient in SSIS features such as package creation, checkpoint implementation, dynamic package configuration, logging, custom script writing, package deployment using a manifest, transactions, and package encryption.
- Explored data in various ways and across multiple visualizations using Power BI.
- Implemented T-SQL functions for column-level encryption in a customer database, enhancing data security and ensuring compliance with industry regulations.
- Expertise in creating reports in the Power BI preview portal using the SSAS Tabular model via the Analysis Services connector.
- Experience publishing Power BI Desktop reports created in Report View to the Power BI service.
- Monitored and tuned MS SQL Server databases with tools such as Index Tuning Wizard, SQL Profiler, and Windows Performance Monitor for optimal performance.
- Extensive experience with SQL Server 2016/2012/2008 business intelligence tools: SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), and SQL Server Analysis Services (SSAS).
- Developed Python scripts for data cleaning and manipulation, integrating them with ETL workflows to automate repetitive tasks.
- Automated report generation and data extraction tasks using Python, improving efficiency and reducing manual errors.

TECHNICAL SKILLS
RDBMS/Databases: MS SQL Server 2017/2016/2014/2012/2008 R2, MS Access, SQL Server Management Studio, SQL Server Data Tools, Business Intelligence Development Studio (BIDS), Visual Studio, SQL Profiler, Performance Monitor, DTS
BI Tools: Power BI Desktop, Tableau Desktop (2018.1.x/10.x/9.x/8.x), Tableau Server (10.x/9.x/8.x), Tableau Reader, Tableau Online, MS Excel, SSRS, Excel Power View
SQL Server Tools: SQL Server Management Studio, BIDS, SQL Server Profiler, FTP, TOAD
Programming Languages: SQL, T-SQL, Python
Methodologies: Software Development Life Cycle (Agile, Waterfall), logical/physical/dimensional data modeling, MS Project, SQL Profiler, TOAD, TFS
Cloud Services: Azure

EDUCATION
Master's in Computer Science, University at Buffalo, Aug 2022 to Jan 2024

PROFESSIONAL EXPERIENCE

Acuity Brands, Atlanta, GA
Data Analyst, Feb 2023 to Present
Roles and Responsibilities:
- Participated in daily scrum meetings with the team to discuss project challenges and development progress.
- Gathered requirements from business users and IT teams to understand and document business processes and business logic.
- Extracted data for reporting from multiple data sources, including SQL Server, SQL Server Analysis Services, Azure SQL Database, Azure SQL Data Warehouse, and Salesforce.
- Extracted data from the data warehouse by developing complex SQL statements with stored procedures and common table expressions (CTEs) to support report building.
- Identified and optimized a T-SQL query in an inventory management system, reducing execution time from 15 seconds to 3 seconds through query plan analysis and index adjustments.
- Developed T-SQL scripts to parse and extract information from JSON responses in an API integration project, facilitating seamless data exchange between internal systems and a third-party service (see the T-SQL sketch after this section).
- Implemented Tableau visualizations and views including scatter plots, box plots, heat maps, tree maps, donut charts, highlight tables, word clouds, and reference lines.
- Generated complex reports in Tableau using context filters, hierarchies, parameters, LOD expressions, time intelligence, and related functionality.
- Built interactive, storytelling dashboards in Tableau workbooks using dashboard actions.
- Published Tableau workbooks to Tableau Server and scheduled report refreshes.
- Used Data Analysis Expressions (DAX) to create measures and calculated fields for analysis and time intelligence.
- Made extensive use of DAX functions in reports and tabular models.
- Used Python for data automation tasks, improving ETL efficiency.
- Built Power BI reports using multiple visualization types, including line charts, doughnut charts, tables, matrices, KPIs, scatter plots, and box plots.
- Made Power BI reports more interactive using storytelling features such as bookmarks, selection panes, and drill-through filters.
- Created and administered workspaces for each project in the Power BI service and published reports from Power BI Desktop to those workspaces.
- Installed the data gateway and configured scheduled data refreshes for Power BI reports.
- Created SSIS packages for ETL processes using control flow and data flow components.
- Maintained and performance-tuned slow-running SSIS packages.
- Evaluated returns and risks for various investment types, including stocks, fixed-income assets, hedge funds, real estate, and commodities, and assessed customers' risk tolerance.
- Implemented report integration among Power BI, Tableau, SSRS, and Python reports.
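A minimal T-SQL sketch of the JSON-parsing approach referenced above; the table and column names (dbo.ApiResponses, ResponseBody, and the order fields) are hypothetical placeholders rather than objects from the actual project.

    -- Hypothetical example: flatten a JSON API response into rows with OPENJSON (SQL Server 2016+).
    SELECT
        r.ResponseId,
        j.OrderNumber,
        j.OrderDate,
        j.TotalAmount
    FROM dbo.ApiResponses AS r
    CROSS APPLY OPENJSON(r.ResponseBody, '$.orders')
        WITH (
            OrderNumber  varchar(20)    '$.orderNumber',
            OrderDate    datetime2      '$.orderDate',
            TotalAmount  decimal(18,2)  '$.totalAmount'
        ) AS j
    WHERE r.LoadDate >= DATEADD(DAY, -1, SYSUTCDATETIME());

Rows shaped this way can then feed the stored-procedure and CTE-based reporting extracts described above.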
HSBC, India
SQL/BI Developer, June 2021 to July 2022
Roles and Responsibilities:
- Performed SSIS package development, unit testing, and deployment in the Dev and QA environments.
- Interacted with the technical team and business users to analyze business requirements and translate them into technical requirements for ETL.
- Used advanced T-SQL features to design and tune T-SQL that interfaces with the Azure cloud and other applications as efficiently as possible.
- Created packages, jobs, and alerts using SQL Mail, along with database backup, recovery, and disaster recovery procedures; planned complete backups of various databases for disaster recovery scenarios in production (a backup sketch follows this section).
- Adhered to the organization's established policies and procedures and assisted in enhancing deployment procedures.
- Adhered to all ETL standards and architecture.
- Designed ETL packages for different data sources (SQL Server, flat files) and loaded the data into target data stores using various transformations in SQL Server Integration Services (SSIS).
- Evaluated database structures to better understand the processing of claims data.
- Strong development skills with Azure Data Lake, Azure Data Factory, SQL Data Warehouse, and Azure Storage Explorer.
- Created shared dimension tables, measures, hierarchies, levels, cubes, and aggregations on MS OLAP/OLTP/Analysis Server (SSAS) in the Tabular model.
- Implemented Copy activities and custom Azure Data Factory pipeline activities.
- Primarily involved in data migration using SQL, SQL Azure, Azure Storage, Azure Data Factory, SSIS, and PowerShell.
- Created and reviewed SSAS tabular cubes for consumption as a data source for reports, dashboards, and pivot tables.
- Expert in creating star schema cubes using SSAS.
- Good experience with MDX queries and DAX.
- Developed Power BI reports using Power Query from SharePoint and other data sources.
- Embedded Power BI reports on the SharePoint portal page and managed access to reports and data for individual users using roles.
- Developed Python scripts to automate data extraction and processing tasks, integrating with Azure services for seamless data workflows.
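A minimal T-SQL sketch of the backup-and-alert pattern referenced above; the database name, backup path, and operator are hypothetical placeholders, and Database Mail with a SQL Server Agent operator is assumed to be configured.

    -- Hypothetical example: full database backup with failure notification.
    DECLARE @file nvarchar(260) =
        N'\\backupshare\sql\ClaimsDB_FULL_'
        + CONVERT(nvarchar(8), GETDATE(), 112) + N'.bak';

    BEGIN TRY
        BACKUP DATABASE ClaimsDB
        TO DISK = @file
        WITH COMPRESSION, CHECKSUM, INIT;
    END TRY
    BEGIN CATCH
        -- Alert the on-call operator if the backup fails.
        EXEC msdb.dbo.sp_notify_operator
             @name = N'DBA_OnCall',
             @subject = N'ClaimsDB full backup failed',
             @body = N'Check the SQL Server error log and job history for details.';
    END CATCH;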
Bapendram Systems, India
SQL/BI Developer, Aug 2019 to May 2021
Roles and Responsibilities:
- Involved in the design of data migration projects and in end-to-end project development covering all phases of the Software Development Life Cycle, including requirement analysis, design, build/construction, unit/assembly testing, and deployment.
- Followed an agile approach and communicated with functional and technical teams on the requirements for migrating data from legacy heirloom systems to Hybris.
- Created a staging database, moved all in-scope business data into it, and migrated the data to Hybris using SSIS.
- Integrated SSIS with FTP servers to automate the retrieval of daily sales data files from external vendors, streamlining the ETL process and ensuring timely availability of data for reporting.
- Optimized ETL performance by implementing parallel processing in SSIS for large datasets, improving overall data loading speed and scalability.
- Developed a T-SQL script within an SSIS package to generate surrogate keys for dimension tables, ensuring consistency and integrity in the data warehouse (see the dimension-load sketch after this section).
- Built and maintained SSIS packages to import and export data from various data sources based on the designed data models.
- Created intricate T-SQL queries to generate custom financial reports by joining multiple tables and applying complex business logic, supporting better decision-making for the finance team.
- Involved in functional testing and cut-over planning to review the technical status of launches.
- Performed end-to-end testing across phases including system integration testing (SIT), soft launch, preproduction launch, go-live, and phased migrations.
- Used DAX (Data Analysis Expressions) functions to create calculations and measures in SSAS Tabular models.
- Integrated data quality checks into SSIS to identify and flag anomalies and missing values during the ETL process, ensuring high data quality standards in the data warehouse.
- Developed a comprehensive error-logging system using T-SQL and stored procedures, reducing troubleshooting time by 50% through detailed error messages and automatic notification alerts (see the error-logging sketch after this section).
- Coded SSIS processes to import data into the data warehouse from Excel spreadsheets, flat files, and OLE DB sources.
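A minimal T-SQL sketch of the surrogate-key approach referenced in the dimension-table bullet above; dbo.DimCustomer, staging.Customer, and the column names are hypothetical, and the surrogate key is assumed to be an IDENTITY column on the dimension table.

    -- Hypothetical example: insert new business keys so the IDENTITY column assigns surrogate keys.
    INSERT INTO dbo.DimCustomer (CustomerBusinessKey, CustomerName, EffectiveDate)
    SELECT s.CustomerBusinessKey,
           s.CustomerName,
           CAST(GETDATE() AS date)
    FROM staging.Customer AS s
    WHERE NOT EXISTS (
        SELECT 1
        FROM dbo.DimCustomer AS d
        WHERE d.CustomerBusinessKey = s.CustomerBusinessKey
    );

In an SSIS package, a statement like this would typically run from an Execute SQL Task after the staging load completes.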
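A minimal T-SQL sketch of the error-logging pattern referenced above; the table and procedure names (dbo.EtlErrorLog, dbo.usp_LoadSales, dbo.usp_MergeSalesFromStaging) are hypothetical placeholders.

    -- Hypothetical example: central error log populated from CATCH blocks in ETL procedures.
    CREATE TABLE dbo.EtlErrorLog (
        ErrorLogId    int IDENTITY(1,1) PRIMARY KEY,
        ProcedureName sysname        NULL,
        ErrorNumber   int            NULL,
        ErrorMessage  nvarchar(4000) NULL,
        LoggedAt      datetime2      NOT NULL DEFAULT SYSUTCDATETIME()
    );
    GO
    CREATE PROCEDURE dbo.usp_LoadSales
    AS
    BEGIN
        BEGIN TRY
            EXEC dbo.usp_MergeSalesFromStaging;  -- hypothetical load step
        END TRY
        BEGIN CATCH
            INSERT INTO dbo.EtlErrorLog (ProcedureName, ErrorNumber, ErrorMessage)
            VALUES (ERROR_PROCEDURE(), ERROR_NUMBER(), ERROR_MESSAGE());
            THROW;  -- re-raise so the calling SSIS task still fails visibly
        END CATCH;
    END;

Automatic notification alerts, as mentioned above, could be layered on top with SQL Server Agent alerts or Database Mail.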