Sr Data Analyst Resume Jersey City, NJ
                                                            Candidate's Name

               (Street Address )-483-0529 | EMAIL AVAILABLE | https://LINKEDIN LINK AVAILABLE



PROFESSIONAL SUMMARY:
  6+ years of professional experience as a Business Data Analyst with a strong understanding of large-scale data and
   analytics solutions across multiple databases, including data collection, cleaning, manipulation, visualization, and
   interpretation.
  Expertise in data analysis, including data profiling, mapping, and validation, to ensure data integrity and accuracy.
  Experience in data analysis and data visualization using Tableau/Microsoft Power BI, model building with machine learning
   algorithms for prediction and forecasting, statistical analysis using tools like SAS and R, and data mining using Python, SQL,
   Hadoop, Spark, Hive, etc.
  Experience in Data Extraction/Transformation/Loading (ETL), Data Conversion, and Data Migration using Microsoft SQL Server
   Integration Services (SSIS) and Informatica.
  Experienced in conducting user acceptance testing (UAT) and supporting end-users in adopting new systems and processes.
  Extensive experience working with JSON, XML, T-SQL, and Python Schema Designing.
  Developed use cases and user stories that capture functional and non-functional requirements.
  Experience performing gap analysis, root cause analysis, risk analysis, and impact analysis to define boundaries, identify
   issues, minimize costs, and optimize solutions.
  Employed various JAD techniques, such as brainstorming, prototyping, and group discussions, to foster collaboration and promote
   a shared understanding of business goals and data needs.
  Experienced in conducting sprint planning sessions and backlog grooming activities using Jira.
  Proficient in creating documentation deliverables such as business requirements documents (BRDs) and functional
   requirements documents (FRDs).
  Strong knowledge of insurance regulations and compliance requirements specific to the P&C Insurance domain, ensuring adherence
   to legal and industry standards.
  Skilled in utilizing SDLC methodologies such as Agile (Scrum) and Waterfall to manage project tasks, facilitate iterative
   development, and ensure timely delivery.
  Built data pipelines that enable faster, better data-informed decision-making within the business.
  Proficient in utilizing Microsoft Visio to create and modify diagrams, flowcharts, organizational charts, and other visual
   representations.
  Worked with Project Managers to prioritize data analysis tasks based on business needs and project timelines, ensuring that critical
   analyses were completed on time.
  Expert in writing SQL queries and optimizing the queries in Oracle, Teradata, and SQL Servers.
  Experience in designing SQL queries using joins, sub-queries, functions, indexes, views, materialized views, set operators,
   group by, and OLAP functions.
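As an illustration of this kind of query work, a join combined with a subquery and GROUP BY can be sketched against a stand-in SQLite database (the schema and data here are hypothetical, not from any engagement described in this resume):

```python
import sqlite3

# Hypothetical in-memory schema; SQLite stands in for Oracle/Teradata/SQL Server.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'East'), (2, 'West'), (3, 'East');
    INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0),
                              (3, 2, 200.0), (4, 3, 25.0);
""")

# Join + GROUP BY: total order amount per region, with a subquery
# restricting results to regions that have more than one customer.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    WHERE c.region IN (
        SELECT region FROM customers GROUP BY region HAVING COUNT(*) > 1
    )
    GROUP BY c.region
""").fetchall()
print(rows)  # [('East', 175.0)]
```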
  Experience in working with databases like MongoDB, MySQL, and Cassandra.
  Extensive experience using EHR, EMR, HL7, and Medicaid/Medicare claims data in the healthcare industry to extract, clean,
   and analyze data from primary and secondary sources.
  Created databases, schemas, and tables using a snowflake schema and migrated multi-state-level data from SQL Server to Snowflake.
  Maintained knowledge in designing star schema and snowflake schema for data modeling and created user stories using the JIRA
   tool.
  Solid experience in creating cloud-based applications using Amazon Web Services (AWS), Amazon RDS, and Microsoft Azure.
  Worked closely with the project manager to identify data-related dependencies and constraints, ensuring seamless integration of
   data analysis tasks within the Waterfall framework.
  Worked on complex ad-hoc SQL queries using Joins, DML, DDL, pivots, views, constraints, and functions using Oracle, MS SQL
   Server, and MySQL.
  Experienced in working with JIRA software, including creating and managing projects, workflows, custom fields, dashboards, and
   reports.
  Extensive experience with Informatica PowerCenter and Power Exchange for designing, developing, and maintaining ETL workflows
   for data integration and migration between various data sources and targets.
  Advanced working knowledge of MS Project, MS Word, MS Excel, MS Visio, and MS PowerPoint to create test plans and timeframes
   for deliverables.
  Prepared and distributed JAD session agendas, meeting materials, and follow-up documentation to ensure all stakeholders were
   well-informed and engaged throughout the project lifecycle.
SKILLS
 Languages and Databases: SQL, Python (Pandas, NumPy, Scikit-Learn), R (ggplot2, Lubridate), MySQL, SQL Server, Hadoop
 Analytics Tools: Jupyter Notebook, SAS (Base, EG, Miner, JMP), R Studio, Tableau, Power BI, MS Excel (VLOOKUP, Macros, VBA, Pivot
 tables), MS Office, MS Project, MS Visio, SharePoint, Google Analytics, Salesforce, QlikView, Amazon Web Services (AWS)
 Statistical Skills: Data Cleaning, Exploratory Data Analysis, Data Visualization, Hypothesis Testing, Linear and Logistic Regression,
 Classification, Clustering, PCA, Decision Trees, Random Forests, Neural Networks, Dimension Reduction, Text Analytics, A/B Testing,
 Search Engine Optimization, Time Series, ETL Data Warehousing, Machine Learning Algorithms, NLP, Financial Forecasting, Market Basket
 Analysis
 Expert in database management, SQL, Linux scripting, and cloud services, with strong analytical skills, project coordination, and
 security focus.
PROFESSIONAL EXPERIENCE

Walmart, Arkansas | Sr. Data Analyst                                                                               Aug 2023 - Present
  Analyzed data and extracted actionable insights to support business decision-making.
  Designed and implemented effective Analytics solutions and models with Snowflake.
  Collaborated with business stakeholders to understand their data analysis requirements and developed custom dashboards and
   visualizations using Databricks notebooks and Power BI.
  Utilized advanced data mining techniques, such as clustering, classification, and association rule mining, to discover patterns,
   trends, and insights from large datasets, facilitating informed business decision-making.
  Conducted ad-hoc data analysis requests and provided insights to address specific business challenges.
  Supported Managed Care Contract Management/CMS quality and reporting.
  Provided recommendations and insights to drive business decisions and strategies, keeping up with industry trends.
  Developed compelling visualizations and effectively communicated data insights to stakeholders.
  Identified Key Performance Indicators (KPIs) and developed reports and dashboards for monitoring and tracking performance.
  Worked on moving data between AWS S3 and Snowflake in both directions.
  Developed and executed complex SQL queries on DB2 databases to retrieve, analyze, and manipulate large volumes of business
   data, providing valuable insights for decision-making.
  Used QlikView, a data discovery and visualization tool, to explore data and create interactive dashboards and reports.
  Utilized Jira reporting and metrics capabilities to generate insightful reports, enabling data-driven decision-making and project
   performance evaluation.
  Queried datasets from Snowflake and Athena for data analysis and visualization reporting.
  Created and maintained comprehensive documentation (BRD) for data analysis methodologies, processes, and findings to facilitate
   knowledge sharing.
  Followed Agile methodology, using daily Scrum meetings to discuss project-related information.
  Involved in the complete SDLC of big data projects, from requirement analysis to production.
  Conducted User Acceptance Testing (UAT) to validate data-driven solutions and applications.
  Integrated data from multiple sources into Excel, performing data consolidation and validation.
  Utilized advanced Excel functions and pivot tables for data analysis and reporting.
  Used SUMIFS and LOOKUP features in MS Excel for tracking model construction.
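The SUMIFS/LOOKUP pattern used in that tracking model can be mirrored in plain Python; the rows below are hypothetical stand-ins for the Excel data:

```python
from collections import defaultdict

# Hypothetical construction-tracking rows (the real model lived in Excel).
rows = [
    {"phase": "Foundation", "status": "Done",    "cost": 1200.0},
    {"phase": "Foundation", "status": "Pending", "cost": 300.0},
    {"phase": "Framing",    "status": "Done",    "cost": 800.0},
    {"phase": "Framing",    "status": "Done",    "cost": 450.0},
]

# SUMIFS equivalent: sum cost where status == "Done", grouped by phase.
done_cost = defaultdict(float)
for r in rows:
    if r["status"] == "Done":
        done_cost[r["phase"]] += r["cost"]

# LOOKUP equivalent: fetch the aggregate for one key.
print(done_cost["Framing"])  # 1250.0
```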
  Developed VBA, Forms, Reports, and Queries for MS Access databases.
  Ensured adherence to data governance rules and standards for consistent business element names.
  Integrated Databricks with cloud providers (Google Cloud, Azure) for efficient data storage and retrieval.
  Worked with data lakes (Microsoft Azure), leveraging its storage and processing capabilities to handle large-scale datasets for
   advanced analytics and reporting.
  Assisted in the development and implementation of technology solutions, such as CRM systems or enterprise resource planning
   (ERP) systems, to optimize business processes.
  Performed data profiling activities to understand the structure, quality, and content of source data. Identify any data inconsistencies
   or anomalies that may impact the mapping process.
  Experience in managing data model repositories and creating comprehensive documentation in metadata portals using industry-
   leading tools such as Erwin.
  Utilized SQL and Python programming languages to extract, manipulate, and analyze large and complex datasets from diverse
   sources, including PostgreSQL databases, APIs, and structured/unstructured data.
  Worked with data ingestions from multiple sources into the Azure Data Lake and worked on data load using Azure Data Factory
   through an external table approach.
  Wrote and maintained T-SQL and PL/SQL scripts to extract, transform, and load data from various databases to Databricks.
  Designed and deployed scalable, highly available, and fault-tolerant systems on Azure.
  Utilized GitLab, Docker, and Kubernetes for CI/CD on microservices and deployed them to the Azure Cloud.
    Developed purging scripts and routines to manage data on Azure SQL Server and Azure Blob storage.
    Utilized Talend ETL tool to create XML and JSON scripts for data extraction and storage in Oracle DB.
    Worked with various data integration and transformation technologies (Spark, Hive, Pig) to implement data processing workflows.
    Linked data lineage to data quality and business glossary work within the overall data governance program.

Environment: Azure, Azure Data Lake, MS Excel, MS Access, Azure Data Factory, Power BI, Agile, Snowflake, Talend ETL, XML, JSON,
SQL, CI/CD, Azure Cloud, T-SQL, PL/SQL, PostgreSQL, Erwin, AWS S3, QlikView, Databricks.

Capgemini, India | Sr. Business Data Analyst                                                                   Aug 2021 - Aug 2022
Client: Bank of America
  Defined the customer's critical-to-quality (CTQ) issues and requirements for the current Defect Management process.
  Used the Collibra tool to manage and store rationalized metadata and created automated process workflows for data validation
    by Chief Data Stewards.
  Proficient in utilizing Quantum Metric to analyze user behavior, identify digital experience issues, and optimize website and
    application performance.
  Responsible for ETL design (identifying the source systems, designing source-to-target relationships, data cleansing, data quality,
    creating source specifications, ETL design documents, and ETL development (following Velocity best practices).
  Ensure data security and compliance by implementing Azure's security features, such as data encryption, role-based access control
    (RBAC), and auditing, to safeguard sensitive data during analysis.
  Experienced in leveraging Quantum Metric's platform to gain insights into user journeys, interactions, and pain points across digital
    touchpoints.
  leveraging Dynatrace's APM capabilities to monitor application performance, detect anomalies, and troubleshoot issues in real time.
  Created business requirements and high-level design documentation and validated multiple web services used by the client, both
    REST and SOAP.
  Using Quantum Metric's session replay feature to visualize user sessions, understand user behavior, and diagnose usability issues.
  Gained end-to-end visibility into application and infrastructure performance with Dynatrace's full-stack observability platform,
    spanning frontend, backend, and infrastructure layers.
  Experienced in utilizing Adobe Analytics to analyze and interpret data, track key performance indicators (KPIs), and generate
    actionable insights for digital marketing strategies.
  Implemented Power BI Power Query to extract data from external sources and transform it for reporting; designed and
    developed complex KPIs.
  Worked independently with Veeva Vault pharmaceutical clients and Veeva configuration specialists to design and deploy customized
    document management system portals for marketing, legal, regulatory, medical, compliance, clinical, research, and safety teams.
  Integrated Adobe Analytics with other Adobe Marketing Cloud solutions, such as Adobe Experience Manager (AEM) and Adobe
    Campaign, for seamless data sharing and campaign management.
  Proficiently utilized Informatica to design, develop, and execute data transformation and ETL (Extract, Transform, Load) processes.
  Utilized Azure Databricks to analyze and process large-scale datasets, leveraging distributed computing capabilities to perform
    complex data transformations, machine learning, and advanced analytics tasks.
  Managed and created JIRA epics for confidential projects and translated functional design requirements into JIRA user stories.
  Experienced in end-to-end data analysis, from data cleaning, manipulation, mapping, and mining to database (Oracle) testing,
    developing controls using R and Python, and reporting using LaTeX.
  Acquired appropriate business rules in coordination with users to understand the business functionality.
  Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data
    within the Oracle database.
  Demonstrated proficiency in utilizing Azure Synapse Analytics (formerly SQL Data Warehouse) to create scalable data warehousing
    solutions.
  Worked as a Databricks engineer with a focus on data warehousing and ETL development in Azure Databricks, including FHIR data,
    with Database and ADLS (Python).
  Used MS Visio to develop end-to-end entity relationship diagrams.
  Utilized Python to conduct statistical analysis and hypothesis testing, applying libraries like SciPy and statsmodels to validate
    assumptions, draw conclusions, and make data-driven recommendations.
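A two-sample test of the kind described can be sketched with the standard library alone (SciPy's `scipy.stats.ttest_ind` wraps the same computation); the samples here are hypothetical:

```python
import math
import statistics

# Two hypothetical samples, e.g. a metric measured under two conditions.
a = [12.1, 11.8, 12.4, 12.0, 11.9]
b = [11.2, 11.5, 11.1, 11.4, 11.3]

mean_a, mean_b = statistics.fmean(a), statistics.fmean(b)
var_a, var_b = statistics.variance(a), statistics.variance(b)  # sample variance

# Welch's t-statistic: difference in means over the combined standard error.
se = math.sqrt(var_a / len(a) + var_b / len(b))
t = (mean_a - mean_b) / se
print(round(t, 2))
```

A large |t| here would suggest the two sample means differ by more than sampling noise; in practice the p-value would come from the t-distribution (or directly from `scipy.stats.ttest_ind`).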
  Wrote Stored Procedures, Packages, and PL/SQL scripts to incorporate new validations and enhance existing provisioning logic.
  Performed UAT and obtained sign-off from the application owner before going live in AWS.
  Implemented data collection and transformation in the AWS cloud computing platform using S3, Athena, Glue, Redshift,
    PostgreSQL, and Quick Sight.

Sonatro Automation | Operations Analyst, Hyderabad, India                                                        Aug 2020 - Jul 2021
   Identified and resolved operational issues, cutting project delays by 14%.
     Conducted in-depth analysis of product sales data, identifying inefficiencies in inventory management; recommendations
     implemented resulted in reducing excess stock by 20%, freeing up $50k in working capital.
     Developed and highlighted profiles for value opportunities; conducted competitor research to enhance products and maximize
     sales.
     Implemented data-driven strategies to optimize resource allocation, decreasing operational costs by 12%.
     Introduced new workflow automation tools, cutting manual labor hours by 15% and increasing overall productivity.

Cognizant, India | Asst Operations Analyst, Hyderabad, India                                                      Mar 2019 - Aug 2020
  Alexa Chatbot| Leveraging NLP expertise, over 10,000 text datasets were meticulously curated, cleaned, and annotated for model
   training. A strong focus was placed on quality assurance for text-to-speech data and data annotation, ensuring precision throughout
   the process. Notable achievements include a 50% increase in performance in Alexa chatbot competitions due to enhanced
   accuracy. A pivotal role was played in optimizing data processing workflows and upholding rigorous privacy and ethical standards,
   contributing significantly to the success of NLP models.
  Created visually compelling dashboards in Tableau, consolidating key performance indicators for service level agreements (SLA)
   and ensuring alignment with management objectives; the tool is now utilized by 15+ stakeholders for real-time tracking.
  Performed initial analysis to root out and identify erroneous/misaligned data.
  Worked on "Data Verification" to review and confirm the results of the data analysis and pinpoint data cleaning needs; worked on
   "Data Cleaning" to correct erroneous/misaligned data and on "Data Quality" to test and confirm the quality of the data.
  Worked with management to prioritize business and information needs.
  Provided a standard process for analyzing data across multiple systems, identified erroneous/misaligned data and recommended
   resolutions, and interpreted data and analyzed results using statistical techniques.
  Extensively used SQL queries to analyze data and validate data correctness and completeness.
  Worked with cloud data warehouses (Azure SQL Data Warehouse, Snowflake), Tableau, and Informatica Cloud Data Integration
   solutions to improve performance, productivity, and connectivity to cloud and on-premises sources.
  Performed data analysis and SQL queries for testing and troubleshooting the data warehouse.
  Ensure data security and compliance by implementing AWS's security features, such as encryption, identity and access
   management (IAM), and auditing, to safeguard sensitive data during analysis.
  Led BA activities including managing multiple development projects, development groups, and/or application support functions for
   major business segments for Case Management and PBM Cerner models, as well as Athena and eCW; Managed Care Contract
   Management/CMS build and load.
  Performed detailed data analysis (DDA), data quality analysis (DQA), and data profiling of source data.
  Defined virtual warehouse sizing in Snowflake for different types of workloads.
  Handled EDI setup, maintenance, and functionality for payers and payees, as NTSP owns Care N' Care (Medicaid insurance).
  Utilized AWS Lambda to build serverless data processing pipelines, performing transformations and calculations on demand, and
   orchestrating data analysis tasks within the AWS ecosystem.
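A minimal sketch of such a Lambda handler, with a hypothetical event shape, invoked locally here rather than through an AWS trigger:

```python
import json

# Sketch of an AWS Lambda handler for on-demand record transformation.
# The event shape and field names are hypothetical; in AWS this would be
# wired to a trigger (e.g. S3 or Kinesis) rather than called directly.
def handler(event, context):
    records = event.get("records", [])
    transformed = [
        {"id": r["id"], "amount_usd": round(r["amount_cents"] / 100, 2)}
        for r in records
    ]
    return {"statusCode": 200, "body": json.dumps(transformed)}

# Local invocation for testing (the context argument is unused here).
result = handler({"records": [{"id": 1, "amount_cents": 2599}]}, None)
print(result["body"])  # [{"id": 1, "amount_usd": 25.99}]
```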
  Designed and developed insights reports on AWS QuickSight as part of client deliverables.
  Involved in testing the XML files and checking whether data is parsed and loaded to staging tables.
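XML validation of this kind can be sketched with the standard library; the feed snippet and field names below are hypothetical, not from the actual staging feeds:

```python
import xml.etree.ElementTree as ET

# Hypothetical feed snippet standing in for the real upstream files.
xml_doc = """
<claims>
  <claim id="C1"><amount>150.00</amount></claim>
  <claim id="C2"><amount>not-a-number</amount></claim>
</claims>
"""

# Parse and validate each record before it would be loaded to staging:
# keep rows whose amount parses as a number, flag the rest.
valid, rejected = [], []
for claim in ET.fromstring(xml_doc).findall("claim"):
    cid = claim.get("id")
    try:
        amount = float(claim.findtext("amount"))
        valid.append((cid, amount))
    except (TypeError, ValueError):
        rejected.append(cid)

print(valid)     # [('C1', 150.0)]
print(rejected)  # ['C2']
```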
  Designed ER diagrams, logical model (relationship, cardinality, attributes, and candidate keys), and physical database (capacity
   planning, object creation, and aggregation strategies) as per business requirements.
  Filtered and cleaned data by reviewing reports, printouts, internet searches, and performance data; identified and recommended
   new ways to save money by streamlining business processes.
  Designed and developed customized Tableau dashboards tailored to specific business needs, presenting key performance
   indicators (KPIs) and metrics in a visually appealing and user-friendly manner.
  Experience in Salesforce customization, security access, workflow approvals, data validation, data utilities, analytics, sales,
   marketing, customer service, and support administration.
  Created and updated users, reports, and dashboards to track pipeline/stages for management visibility, while integrating Apex
   (applications) to Salesforce accounts such as Conga Merge and Outlook.
  Created pivot tables and ran VLOOKUPs in Excel as a part of data validation.
  Implemented EHR interfaces and delivered training on Athena, Cerner, and Allscripts for 18 full-life-cycle go-lives.
  Defining roles and responsibilities related to data governance and ensuring clear accountability for the stewardship of the company's
   principal information assets.
  Responsible for loading, extracting, and validating client data.
  Analyzed data using relational database products such as MS Access, Teradata, and SQL Server.

 Apconic Software | Tech Sales Analyst, Hyderabad, India                                                     Apr 2018 - Mar 2019
   Led a professional team in contact mining and product research, managing relationships with key clients (Adani Group and Aditya
    Birla).
    Analyzed real-time technical product requirements and led marketing strategies and brand advertising; coordinated with
    production and development teams, managed a 2-member team, and represented the firm in expos, gaining 35% market
    recognition in the south region.
    Developed marketing reports, led campaigns in expos, and connected with companies for brand advertising and ad development.
    Designed and developed plans to generate revenue using market research.
    Demonstrated expertise in using Python to handle complex data manipulation tasks, such as merging datasets, reshaping data
    structures, and handling missing values, ensuring data integrity and accuracy.
    Responsible for analyzing and designing data models, mapping, and migration.
    Continuously worked with upper-level management on project planning, scheduling, and budgeting using JIRA.
    Designed and implemented data integration workflows in Informatica, mapping source data to target structures and ensuring
    accurate data migration, consolidation, and synchronization.

Tech Mahindra | Technical Network Assistant, Hyderabad, India                                               Aug 2017 - Feb 2018

    Provided network support, improving customer satisfaction by 10%.
    Planned technical upgrades, improving delivery timelines to provide 24/7 services using Machine Learning.
    Measured performance of the "As-Is" process and collected metrics such as defect resolution rate and cycle time.
    Involved in Performance Measurement to develop measurable indicators that can be systematically tracked to assess progress
    made in achieving predetermined goals.
    Worked on building data warehouse structures and creating fact, dimension, and aggregate tables through dimensional modeling
    with star and snowflake schemas.

EDUCATION
Anderson University, Greenville, SC                                                                       Aug 2022 - Aug 2024
Master of Science in Business Analytics                                                                          GPA: 3.4/4.0
Jawaharlal Nehru Technological University, Hyderabad, India                                              Aug 2013 - June 2017
Bachelor's in Computer Science                                                                                   GPA: 3.2/4.0
