Senior Data Analyst
Name: Varsha ch
Phone: PHONE NUMBER AVAILABLE
Email: EMAIL AVAILABLE
LinkedIn:

Professional Summary
- 4+ years of experience as a Data Analyst with a solid understanding of data acquisition, with education and experience in data integration, data mining, predictive modeling, statistical analytics, data visualization over large sets of structured and unstructured data, and client relationship management in support of organizational objectives.
- Working experience with advanced Microsoft Excel functions, ETL (Extract, Transform, and Load) of data into a data mart, and Business Intelligence (BI) tools such as Microsoft Power BI and Tableau (data visualization and analytics), including semantic layers.
- Managed databases, Azure Data Platform services (Azure Data Lake Storage (ADLS), Data Factory (ADF), Data Lake Analytics, Stream Analytics, Azure SQL DW, HDInsight/Databricks, NoSQL DB), SQL Server, Oracle, and data warehouses; built multiple data lakes.
- Hands-on experience writing SQL queries to extract, transform, and load (ETL) data from large data warehouse systems, and to ETL data from spreadsheets, database tables, and other sources using SQL Server Integration Services (SSIS) packages and Informatica.
- Proficient in statistical methods such as regression models, neural networks, cluster analysis, decision trees, principal component analysis, and dimensionality reduction.
- Real-time operational reporting and analytics using SAP DART Live.
- Skilled in creating measures and calculated columns and performing time series analysis using DAX in Power BI.
- Analyzed SQL scripts and redesigned them using PySpark SQL for faster performance.
- Excellent Tableau developer with expertise in building interactive reports and dashboards with customized parameters and user filters; experience with various versions of Tableau and other BI technologies such as Alteryx, Teradata, and Power BI.
- Developed and published reports and dashboards using Power BI and wrote effective DAX formulas and expressions.
- Extensive experience with diverse and home-grown methodologies such as the Agile system development life cycle, implementing the Rational Unified Process (RUP) across all phases of the SDLC.
- Efficient in MS Project/MS Excel for planning, status reporting, and writing test scenarios.
- Built strong relationships with clients and demonstrated commitment to delivery.
- Strong communication, business understanding, critical thinking, and analytical skills.
- Strong understanding of data science research methodologies, statistical concepts, data mining techniques, queues, and multivariate data visualizations.
- Experience using Spark SQL to load tables into HDFS and run select queries on top of them.
- Proficient in programming languages such as Python, R, and SQL to manipulate and analyze data in GCP.
- Proficient in creating interactive reports and dashboards using GCP's Data Studio and Looker for data visualization and sharing insights with stakeholders.
- Developed Spark code using Scala and Spark SQL/Streaming for faster data processing.
- Worked on reading multiple data formats on HDFS using Scala.
- Experience using Python to automate data cleaning and data ingestion with the Pandas framework (NumPy, SciPy, Pandas, Matplotlib, Scikit-learn); an illustrative sketch follows this list.
- Good knowledge of R packages, libraries, and lexicons for sentiment analysis of text data.
- Experience defining the key facts and dimensions necessary to support reporting requirements.
- Professional experience in data modeling and data analysis for OLTP and OLAP systems.
- Experienced in designing star schemas (identification of facts, measures, and dimensions) and snowflake schemas for data warehouse and ODS architectures using tools such as Erwin Data Modeler, Power Designer, Embarcadero ER/Studio, and Microsoft Visio.
- Experienced in using Spark and Scala APIs to compare the performance of Spark with Hive and SQL, and Spark SQL to manipulate DataFrames in Scala.
- Expert in building reports using SQL Server Reporting Services (SSRS), Crystal Reports, Power BI, and Business Objects.
- Experienced in MDM (Master Data Management): removing duplicates, standardizing data, and eliminating incorrect data.
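As referenced in the summary above, a minimal sketch of a Pandas-based cleaning and ingestion routine of the kind described; the file path and the column names (customer_id, signup_date) are hypothetical placeholders, not details from the resume:

    import pandas as pd

    def ingest_and_clean(path: str) -> pd.DataFrame:
        """Load a raw CSV extract and apply basic cleaning rules."""
        df = pd.read_csv(path)
        # Normalize header names for downstream SQL/BI consumption
        df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
        # Drop exact duplicates and rows missing the business key
        df = df.drop_duplicates().dropna(subset=["customer_id"])
        # Coerce dates; malformed values become NaT instead of raising
        df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")
        return df

A routine like this would typically sit in front of the ETL steps listed in the roles below, preparing raw extracts before they are loaded into a data mart.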
Technical Skills
Languages: Python, R, SQL, T-SQL
Databases: MSSQL, MySQL, SQLite, PostgreSQL, MS Access, QuickBase
Methodologies: Agile, Scrum, and Waterfall
Libraries: Scikit-Learn, Keras, TensorFlow, NumPy, Pandas, Matplotlib, Seaborn
Business Intelligence and Predictive Models: Regression analysis, Bayesian methods, Decision Trees, Random Forests, Support Vector Machines, Neural Networks, K-Means Clustering, KNN and Ensemble Methods, Natural Language Processing
Reporting Tools: Tableau 10.x/9.x/8.x (Desktop, Server, and Online), Microsoft Power BI, Smartsheet, Google Sheets, and Google Data Studio
Data Visualization: Tableau, DAX, Microsoft Power BI, QlikView, Qlik Sense, Quickbase, Matplotlib, Seaborn
Cloud Management: AWS (Amazon Web Services), MS Azure

Professional Experience

Equifax, Atlanta, GA  Jul 2023 - Present
Data Analyst
Responsibilities:
- Performed data analysis and data profiling using complex SQL on various source systems, including Oracle 10g/11g and Teradata.
- Created multiple SSIS packages to import data from the legacy (mainframe) system, Oracle, and MongoDB into the target SQL Server database for report consumption and other uses.
- Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements mapped to test scripts, ensuring that any change control in requirements leads to test case updates.
- Involved in the design of the new data mart for the Finance Department, working with the Data Architect/Designer using the Erwin data modeling tool.
- Expert skills in business intelligence tools and technology; able to design, build, and support robust reporting solutions and Teradata semantic layers.
- Implemented several DAX functions for various fact calculations for efficient data visualization in Power BI.
- Designed complex, data-intensive reports in Power BI utilizing graph features such as gauges and funnels for better business analysis.
- Extensive use of DAX functions for the tabular models.
- Monitored key performance indicators (KPIs) and trends within ERP systems to identify areas for improvement or optimization.
- Designed and developed an end-to-end data warehousing and OLAP solution using SSIS, SSAS, SSRS, and SQL Server.
- Experienced with SAP's strategic solution for real-time operational reporting, SAP DART Live.
- Gained business insights immediately after installing SAP DART Live for SAP Business Suite by using the rich content of the Virtual Data Models delivered with it.
- Created two semantic models and developed Spark scripts using Scala shell commands per the requirements.
- Created data models for AWS Redshift and Hive from dimensional data models.
- Worked on data modeling and advanced SQL with columnar databases on AWS.
- Performed data analysis on the analytical data in AWS S3, AWS Redshift, Snowflake, and Teradata using SQL, Python, Spark, and Databricks.
- Used the ETL tool Informatica to populate the database and transform data from the old Oracle database to the new one.
- Developed Tableau data visualizations using Pareto charts, combo charts, heat maps, box-and-whisker plots, scatter plots, geographic maps, crosstabs, and histograms.
- Published and shared dashboards and views on the Tableau server.
- Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP, ODS, and OLAP.
- Performed logical and physical data modeling (including reverse engineering) using the Erwin data modeling tool; used Scala to write code for all Spark use cases.
- Communicated with multiple stakeholders, including senior leaders, regarding the architecture; discussed problems, recommended solutions, and conducted sessions on Azure DevOps.
- Extracted and imported data from MongoDB and configured the Mongo BI connectors and the ODBC driver to enable smooth communication between MongoDB, SSRS, and Power BI.
- Used Spark SQL to process huge amounts of structured data (an illustrative sketch follows this role).
- Implemented a Spark Scala application using higher-order functions for both batch and interactive analysis requirements.
- Made use of SAP DART Studio, SAP BusinessObjects BI standard reporting interfaces, and the capabilities of HTML5 front ends.
- Created reports from OLAP sources, sub-reports, bar charts, and matrix reports using SSRS.
- Provided SQL tollgate and data quality checks to ETL to test the data inserted by the Java rules engine into a staging table per requirements, along with logic to automate the process.
- Developed detailed ER diagrams and data flow diagrams using modeling tools, following the SDLC structure.
- Provided PL/SQL queries to developers as source queries to identify the data, and provided logic for assignments.
Environment: Tableau (Desktop/Server), Power BI, DAX, MS SQL Server 2008, Excel, SQL, PL/SQL, SAP DART, NoSQL, SQL BI Suite (SSIS, SSRS), AWS, Python, MS Office, Visual Studio 2010, Erwin 7.x, Hive, SQL Developer, Informatica, CloverETL, ETL architecture and design, data warehouse, OLTP, OLAP, ODS
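The Spark SQL bullet above can be illustrated with a minimal PySpark sketch. The resume's Spark work was written in Scala, so this Python equivalent is only indicative, and the S3 path, view name, and columns are hypothetical:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("structured-data-analysis").getOrCreate()

    # Read a dimensional extract from S3 and register it for SQL access
    orders = spark.read.parquet("s3a://example-bucket/warehouse/fact_orders/")
    orders.createOrReplaceTempView("fact_orders")

    # Spark SQL over the registered view: monthly revenue per region
    monthly = spark.sql("""
        SELECT region,
               date_trunc('month', order_date) AS month,
               SUM(amount)                     AS revenue
        FROM fact_orders
        GROUP BY region, date_trunc('month', order_date)
    """)
    monthly.orderBy("region", "month").show()

Registering a DataFrame as a temp view, as here, is what lets existing warehouse-style SQL be reused on Spark with little change.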
Client: Dish Network, Englewood, CO  Nov 2022 - Jul 2023
Data Analyst
Responsibilities:
- Created complex mappings for delivering the incremental comparative reports and extracts containing changes after comparing the EPIC data brought into MDM against the PVD (Provider Verification Data).
- Developed complex mappings for creating match-merge reports for the ACO provider.
- Tuned performance of the outbound ETL; performed CDC based on the MDM last-update date, which is indexed and maintained by MDM (an illustrative sketch of this incremental pattern follows this role). Used the parameterization template mapping/session for each base object and made it the first task in the workflow.
- Tuned Informatica session performance to optimize the EMPI data load-to-landing process; created pre-landing tables in Oracle and bulk-loaded them, then used the SQL transformation in Informatica Designer to populate the data from pre-landing tables into the actual landing tables.
- Designed an effective lead management system on Ellie Mae software known as Encompass; customized Encompass services for the delivery of loan files, from underwriting through imaged-file delivery to investors for purchase.
- Generated ad-hoc reports in Excel Power Pivot and shared them via Power BI with decision makers for strategic planning.
- Implemented several DAX functions for various calculations for efficient data visualization in Power BI.
- Automated quarterly reports and a budget forecast model for People Operations using SQL and Tableau Server.
- Designed and developed database models for the operational data store, data warehouse, and federated databases to support clients.
- Created Power BI reports using joins across multiple tables from multiple databases with complex SQL queries.
- Imported and exported data using Sqoop to load data between Teradata and HDFS on a regular basis.
- Created mappings for loading data files received from different sources such as ECHO, HMS, and CLARITY; worked with different kinds of input files, including CSV, XML, and tab-delimited files.
- Created views that were used by the MicroStrategy team to display provider data in the portal.
- Worked with business analysts and the DBA on requirements gathering, business analysis, and design of the data warehouse.
- Used PL/SQL stored procedures as part of mappings to improve performance.
- Configured match rule set properties by enabling search-by rules in MDM according to business rules.
- Created a mapplet for address validation and used it in various mappings.
- Worked on Informatica IDQ with the data quality management team.
- Performed data profiling on a regular basis using custom rules.
- Designed and developed data quality procedures using IDQ.
- Worked with Generator, Matcher, Associator, and Consolidator for analysis in IDQ.
- Developed matching mapplets and address validations.
- Created events and tasks in the workflows using Workflow Manager.
- Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, and parsing of objects and hierarchies.
- Developed shell scripts for running batch jobs and scheduling them.
Environment: Informatica MDM, Workflow Manager/Monitor, Power Exchange, Informatica IDQ, Oracle, SQL Server, Tableau, Erwin, MS Office
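The CDC bullet above, keyed on MDM's indexed last-update date, follows a common incremental-extract pattern. Below is a minimal, self-contained Python sketch of that pattern (the actual work used Informatica; the provider table, columns, and watermark value here are hypothetical):

    import sqlite3

    def extract_changed_rows(conn, since):
        """Select only rows touched after the previous successful load
        (CDC keyed on an indexed last-update timestamp, as MDM maintains)."""
        cur = conn.execute(
            "SELECT provider_id, name, last_update_date "
            "FROM provider WHERE last_update_date > ? "
            "ORDER BY last_update_date",
            (since,),
        )
        return cur.fetchall()

    # Self-contained demo against an in-memory database
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE provider (provider_id INT, name TEXT, last_update_date TEXT)")
    conn.executemany(
        "INSERT INTO provider VALUES (?, ?, ?)",
        [(1, "Alpha", "2023-01-10"), (2, "Beta", "2023-03-15"), (3, "Gamma", "2023-06-30")],
    )
    watermark = "2023-02-01"  # timestamp recorded by the previous run
    for row in extract_changed_rows(conn, watermark):
        print(row)  # only Beta and Gamma, the rows changed since the watermark

Because the watermark column is indexed, each incremental run touches only the changed rows rather than rescanning the full table.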
DXC Technologies, India  Aug 2019 - Apr 2022
Data Analyst
Responsibilities:
- Used the Waterfall life cycle methodology for iterative development and rapid delivery of the product.
- Generated various graphical reports using Python packages such as NumPy, SciPy, and Matplotlib (an illustrative sketch follows this role).
- Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
- Published Power BI Desktop reports to the Power BI Service and deployed dashboards for specific user groups, working with Power BI Server administrators in the organization.
- Performed data mining on claims data using complex SQL queries and discovered claims patterns.
- Used the Excel Application Object in Access to send final output to Excel, or, depending on the complexity of the report, wrote VBA code in Excel to import data directly and perform the massaging and formatting.
- Worked on loading data from flat files, CSV files, and Excel sheets into SQL Server tables.
- Migrated the existing standard scheduled reports to Power BI.
- Reviewed code using GitHub pull requests, improved code quality, and conducted meetings among peers.
- Involved in Salesforce application setup activities and customized applications to match the functional needs of the organization.
- Involved in all phases of data acquisition, data collection, data cleaning, model development, model validation, and visualization to deliver data science solutions.
- Designed and developed Power BI graphical and visualization solutions from business requirement documents and plans for creating interactive dashboards.
- Formulated DAX expressions to create date-to-year columns and implemented partitions in tabular models for Power BI.
- Worked closely with marketing and product teams to build customer behavior analytics, tracking customer engagement, churn prediction, and lifetime-value models.
- Performed data management projects and fulfilled ad-hoc requests according to user specifications utilizing data management tools such as Python and SQL.
- Worked on various Salesforce standard objects such as Case Management, Accounts, Contacts, Content, Reports, and Workspaces.
- Developed templates for AWS EMR infrastructure as code using Terraform to build staging and production environments.
- Experienced with Microsoft Excel, Microsoft Word, and Microsoft PowerPoint, plus VBA (i.e., macros) for MS Office products and for automated dashboard reporting, as well as Pivot Tables and Power Pivot (Data Model).
- Validated, cleaned, quality-checked, and formatted data during the data conversion to PeopleSoft using Excel (VLOOKUP, HLOOKUP, CHOOSE, MATCH, INDEX, array formulas, EXACT, UPPER, LOWER, MID).
- Created charts and followed the Software Development Life Cycle (SDLC) methodology to complete projects on time and within budget.
- Imported customer data into Python using the Pandas library and performed various data analyses, finding patterns in the data that informed key decisions for the company.
- Implemented exception-handling mappings using Data Quality, and performed data validation using Informatica Analyst.
- Generated various graphical capacity-planning reports using Python packages such as NumPy and Matplotlib.
Environment: SDLC, Python, Pandas, Data Migration, Data Cleansing, ETL, PowerCenter, Excel, Data Mapping, Data Quality, Data Validations, Power BI, MySQL, Data Warehouse, Power Query, MS Visual Studio, NumPy, Matplotlib, SQL Queries, MS Access
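The graphical capacity-planning reports mentioned above could be sketched as follows. This is an illustrative example with synthetic, randomly generated numbers, not data from any of the engagements listed:

    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic monthly CPU utilization for the sketch
    months = np.arange(1, 13)
    cpu_util = 55 + 2.5 * months + np.random.default_rng(42).normal(0, 3, 12)

    # Fit a linear trend to show when the 90% capacity threshold is approached
    slope, intercept = np.polyfit(months, cpu_util, 1)
    trend = slope * months + intercept

    plt.plot(months, cpu_util, "o-", label="observed CPU %")
    plt.plot(months, trend, "--", label=f"trend ({slope:.1f} pts/month)")
    plt.axhline(90, color="red", linestyle=":", label="capacity threshold")
    plt.xlabel("month")
    plt.ylabel("utilization (%)")
    plt.title("Capacity planning: CPU utilization trend")
    plt.legend()
    plt.savefig("capacity_report.png")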