Candidate's Name
EMAIL AVAILABLE
Senior Data Analyst

Summary of Experience:
- Over 10 years in the IT industry, with extensive experience in technology and business analysis, requirement gathering, specification preparation, design, development, implementation, testing, and production support for many clients across multiple industries.
- End-to-end experience designing and deploying data visualizations using Tableau.
- Configured an on-premises Power BI gateway for different sources matching analysis criteria, with further modeling for Power Pivot and Power View.
- Experience in dimensional data modeling, Star/Snowflake schemas, and FACT and dimension tables.
- Experienced data modeler with strong conceptual, logical, and physical data modeling skills; data profiling skills; maintaining data quality; creating data mapping documents; and writing functional specifications and queries.
- Performed the role of Data Analyst on implementation teams, transforming business requirements into Software Requirement Specifications (SRS) using the MS Office suite, JIRA, MS Project, and various other tools.
- Experience conducting and facilitating agile ceremonies (daily scrum, sprint planning, sprint review, and retrospective), with a strong understanding of scrum artifacts: product backlog, sprint backlog, increment, and sprint burndown chart.
- Excellent working experience with Agile methodology and Software Development Life Cycle (SDLC) processes.
- Experience developing process flow diagrams, use case diagrams, activity diagrams, and database schemas using UML tools such as Visio and Oracle SQL Developer.
- Strong experience in business and data analysis, data profiling, data migration, data integration, and metadata management services.
- Well versed in the concepts of forward engineering and reverse engineering of existing databases into physical models using the Erwin tool.
- Hands-on experience creating tables, views, and stored procedures in Snowflake.
- Excellent experience with analytical and data visualization tools and techniques (Tableau and Power BI).
- Performed data analysis and data profiling using complex SQL on various source systems, including Oracle, MS SQL Server, MySQL, Snowflake, and Teradata.
- Excellent understanding and knowledge of NoSQL databases such as MongoDB, HBase, and Cassandra.
- Strong experience in data analysis, data migration, data cleansing, transformation, integration, import, and export using ETL tools such as Informatica PowerCenter and open-source frameworks such as Spark and Hadoop.
- Experience in analytics, data visualization, and modeling to find solutions for a variety of business and technical problems.
- Experience creating business requirement documents, functional and non-functional specifications, data flow diagrams, business process diagrams, and user stories.
- Experienced in cloud services such as AWS EC2, EMR, RDS, and S3 to support big data tools, solve data storage issues, and work on deployment solutions.
- Good understanding of relational database design and data warehouse/OLAP concepts and methodologies.
- Good working experience with the Azure cloud and its services (Azure Blob Storage, MS SQL Server, Azure HDInsight, and Azure Databricks).
- Experience conducting Joint Application Development (JAD) sessions with business users, subject matter experts (SMEs), and development and QA teams.
- Experience installing and configuring Git/SVN.
- Skilled in statistical analysis: linear, multiple, and logistic regression, hypothesis testing, predictive analytics, what-if analysis, parametric sensitivity, classification trees, decision trees, etc.
- Extensive ETL testing experience using Informatica PowerCenter/PowerMart (Designer, Workflow Manager, Workflow Monitor, and Server Manager).
- Expert problem solver with the ability to think creatively and strategically about alternative solutions and opportunities.

Technical Skills:
Data Warehousing: Informatica (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), SSIS, DataStage
Reporting Tools: Business Objects XI R3, Cognos 8 Suite
Data Modeling: Star-schema modeling, Snowflake-schema modeling, FACT and dimension tables, pivot tables, Erwin
Testing Tools: WinRunner, LoadRunner, Test Director, Mercury Quality Center, Rational ClearQuest
RDBMS: Oracle, MS SQL Server, UDB DB2, Teradata, MS Access
Cloud Technologies: AWS, Azure
Programming: SQL, PL/SQL, UNIX shell scripting, VBScript, Python
Environment: Windows (95, 98, 2000, NT, XP, 7, 8, 10), UNIX
Other Tools: TOAD, MS Office suite (Word, Excel, Project, and Outlook), BTEQ, Teradata SQL Assistant

Project Experience:
Client: BNY Mellon, NYC, NY Nov 2022 to present
Senior Data Analyst
- Used Agile methodology throughout the project, with daily scrum meetings and bi-weekly sprint planning and backlog meetings.
- Extensively involved in almost all phases of the project life cycle (SDLC), from requirements gathering through testing, implementation, and reporting.
- Tested dashboards to ensure data matched the business requirements and to catch changes in the underlying data.
- Performed schema changes in the database on request; met with clientele and various subgroups for requirement gathering to enable self-extracted reports, and created critical field reporting tools for Enterprise Data Management customer segments.
- Utilized Power BI to create analytical dashboards depicting critical trends, with slicers and dicers enabling end users to apply filters.
- Performed unit and integrated testing scenarios that were used to support UAT.
- Developed data visualizations using cross tabs, heat maps, box-and-whisker charts, pie charts, and bar charts.
- Developed donut charts and implemented complex chart features, such as bar charts in tooltips.
- Worked with the program manager, business analysts, and developers to transform data into actionable intelligence through graphics and analytics.
- Extracted data from database sources such as Oracle and SQL Server using Informatica to load into a single data warehouse repository.
- Prepared SQL and PL/SQL queries to validate the data in both source and target databases.
- Created tables, views, and stored procedures in Snowflake; created data sharing between two Snowflake accounts.
- Created an Azure SQL database and performed monitoring and restores of Azure SQL databases; migrated Microsoft SQL Server to Azure SQL Database.
- Created Azure Data Factory, managed Data Factory policies, and utilized Blob Storage for storage and backup on Azure.
- Created a Power BI report to identify process bottlenecks as Lean Champion.
- Designed and modeled datasets with Power BI Desktop based on the measures and dimensions requested by the customer and on dashboard needs.
- Extensively used Apache Kafka, Apache Spark, HDFS, and Apache Impala to build near-real-time data pipelines that ingest, transform, store, and analyze clickstream data to provide a better personalized user experience.
- Worked on importing data into HBase using the HBase shell and the HBase client API.
- Upgraded and patched Oracle binaries in the development and production environments.
- Created test cases and completed unit, integration, and system tests for the data warehouse.
- Prepared an ETL technical document maintaining the naming standards.
- Created deployment groups in one environment for workflows, worklets, sessions, mappings, and source and target definitions, and imported them into other environments.
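The source-to-target validation queries mentioned above can be illustrated with a minimal sketch. This uses SQLite in place of the Oracle/SQL Server and Snowflake systems named in the resume, and the table and column names are hypothetical:

```python
import sqlite3

# Minimal sketch of source-to-target reconciliation; the real work used
# Oracle/SQL Server sources. Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (3, 7.25)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (3, 7.25)])

# Row-count reconciliation: source and target should match.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# Value-level check: keys present in source but missing from target.
missing = cur.execute(
    "SELECT order_id FROM src_orders "
    "EXCEPT SELECT order_id FROM tgt_orders").fetchall()

print(src_count == tgt_count and not missing)  # True when loads reconcile
conn.close()
```

The same row-count and anti-join pattern applies unchanged in PL/SQL or Snowflake SQL; only the connection layer differs.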
- Used version control and source code management tools (Git).

Environment: Snowflake, MS Azure, Power BI, MS Excel, Splunk, Hadoop, Kafka, Spark, HDFS, Impala, Oracle 12c, NoSQL (HBase), Git

Client: Hilton, Nashville, TN Apr 2021 to Oct 2022
Senior Data Analyst
- Gathered business requirements and translated them into clear and concise specifications and queries.
- Performed data analysis using SQL.
- Consistently attended meetings with the client's subject matter experts to acquire the functional business requirements needed to build SQL queries used in dashboards to satisfy the business's needs.
- Worked in the Tableau environment to create weekly, monthly, and daily dashboards and reports using Tableau Desktop and publish them to the server.
- Involved in requirement gathering, database design, and implementation of a star-schema/snowflake-schema dimensional data warehouse using Erwin.
- Performed and utilized the necessary PL/SQL queries to analyze and validate the data.
- Wrote Databricks code and ADF pipelines, fully parameterized for efficient code management.
- Designed and implemented database solutions in Azure SQL Data Warehouse and Azure SQL.
- Created action filters, parameters, and calculated sets for preparing dashboards and worksheets in Tableau.
- Created Power BI reports on data in ADLS, the tabular model, and SQL Server.
- Installed and configured Power BI gateways to keep the dashboards and reports refreshed automatically.
- Created rich Power BI visualizations and DAX queries, and worked with Power Query and the M language for data integration and transformation per the business requirements.
- Designed Power BI data visualizations utilizing cross tabs, maps, scatter plots, pie, bar, and density charts.
- Developed a data warehouse model in Snowflake.
- Developed, tested, and deployed dashboards for ServiceNow tickets by connecting to the service DB (SQL Server).
- Identified and documented limitations in data quality affecting internal and external data analysts.
- Created complex SQL queries and scripts to extract and aggregate data to validate its accuracy.
- Used Python and SAS to extract, transform, and load source data from transaction systems, and generated reports, insights, and key conclusions.
- Built SSIS packages (.dtsx) for ETL processes, extracting data from flat files, Excel files, and legacy systems and loading it into SQL Server.
- Used a forward engineering approach for designing and creating databases for the OLAP model.
- Used Teradata utilities such as FastExport and MultiLoad (MLOAD) for handling various tasks.
- Used the Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.
- Configured a real-time streaming pipeline from DB2 to HDFS using Apache Kafka.
- Worked on MongoDB database concepts such as locking, transactions, indexes, sharding, replication, and schema design.
- Wrote Hadoop jobs for analyzing data using HiveQL (queries), Pig Latin (data flow language), and custom MapReduce programs in Java.
- Served as source code administrator: managed and configured SVN, resolved source code management issues, and managed branching, merging, and the code freeze process.
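The flat-file-to-staging loads described for this engagement (SSIS packages reading flat files into SQL Server, and Python-based extract-transform-load) follow a common pattern, sketched here with the Python standard library. The file content and staging schema are hypothetical, and SQLite stands in for SQL Server:

```python
import csv
import io
import sqlite3

# Illustrative only: mimics loading a flat file into a staging table.
# In-memory CSV stands in for a real flat file; schema is hypothetical.
flat_file = io.StringIO("guest_id,nights\n101,3\n102,1\n103,5\n")

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_stays (guest_id INTEGER, nights INTEGER)")

# Extract and transform: parse each record and cast fields to the
# staging types before loading.
reader = csv.DictReader(flat_file)
rows = [(int(r["guest_id"]), int(r["nights"])) for r in reader]
cur.executemany("INSERT INTO stg_stays VALUES (?, ?)", rows)

# Simple aggregate to validate the load.
total = cur.execute("SELECT SUM(nights) FROM stg_stays").fetchone()[0]
print(total)  # 9
conn.close()
```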
Environment: Snowflake, Azure Data Factory, Power BI, Kafka, Hadoop, YARN, MS Excel, PowerPoint, SQL Server, SVN, Agile, MongoDB, JIRA, ETL

Client: CareFirst, Reston, VA July 2019 to Mar 2021
Data Analyst
- Identified business-critical measures by working closely with the SMEs.
- Responsible for creating test cases to ensure data originating from the source lands in the target properly and in the right format.
- Tested several stored procedures and wrote complex SQL using CASE, HAVING, CONNECT BY, etc.
- Involved in Teradata SQL development, unit testing, and performance tuning, ensuring testing issues were resolved on the basis of defect reports.
- Constructed AWS data pipelines using VPC, EC2, S3, Auto Scaling Groups (ASG), EBS, Snowflake, IAM, CloudFormation, Route 53, CloudWatch, CloudFront, and CloudTrail.
- Used AWS Glue to crawl the data lake in S3 to populate the Data Catalog.
- Analyzed business requirements, system requirements, and data mapping requirement specifications, and documented functional and supplementary requirements in Quality Center.
- Set up the environments to be used for testing and the range of functionality to be tested per the technical specifications.
- Built reports in Tableau for the management team to track representatives' performance and current business situation, used as a single source of truth.
- Responsible for creating calculated fields, geocoding, and hierarchies using Tableau Desktop.
- Supported Tableau reports in production by performing daily health checks on multiple dashboards and extracts to ensure the dashboards were running properly and the extracts were refreshed.
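The data profiling and source-format checks described throughout this resume (null counts, distinct counts, value ranges) can be sketched in plain Python. The sample records and column names below are hypothetical:

```python
# Minimal sketch of column profiling (null count, distinct count,
# min/max), the kind of check otherwise run as SQL against source
# systems. Sample records are hypothetical.
records = [
    {"member_id": "M1", "plan": "PPO", "premium": 320.0},
    {"member_id": "M2", "plan": "HMO", "premium": 280.0},
    {"member_id": "M3", "plan": None,  "premium": 305.0},
]

def profile(rows, column):
    """Return null count, distinct non-null count, and min/max for a column."""
    values = [r[column] for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "min": min(non_null) if non_null else None,
        "max": max(non_null) if non_null else None,
    }

print(profile(records, "plan"))     # 1 null, 2 distinct values
print(profile(records, "premium"))  # 0 nulls, range 280.0 to 320.0
```

Profiles like these feed the data quality limitations that get documented for downstream analysts.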
- As a Tableau administrator, created report schedules on Tableau Server, created groups, onboarded new users, and assigned/updated user roles.
- Installed Tableau in multi-node and enterprise deployments.
- Implemented business use cases in Hadoop/Hive and visualized them in Tableau.
- Effectively designed, developed, and enhanced cloud-based applications using AWS.
- Developed mappings using Source Qualifier, Expression, Filter, Lookup, Update Strategy, Sorter, Joiner, Normalizer, and Router transformations.
- Integrated DataStage metadata into Informatica metadata and created ETL mappings and workflows.
- Involved in writing, testing, and implementing triggers, stored procedures, and functions at the database level using PL/SQL.
- Migrated repository objects, services, and scripts from the development to the production environment.
- Created UNIX scripts and environment files to run batch jobs.
- Developed scripts using both DataFrames/SQL and RDDs in PySpark (Spark with Python) for data aggregation.
- Unit tested the code to check that the target loaded properly.
- Involved in writing parsers using Python; handled database installation, creation, modification, and tuning issues in Oracle.
- Involved in HBase setup and storing data into HBase for further analysis.
- Tested the ETL process both before and after the data validation process; tested the messages published by the ETL tool and the data loaded into various databases.

Environment: Snowflake, Tableau Desktop, Tableau Server, Tableau client, Tableau Services Manager, Python, AWS, PL/SQL, metadata, HBase, PySpark, Oracle, UNIX

Client: Reinsurance Group of America, Minneapolis, MN Nov 2017 to Jun 2019
Data Analyst
- Analyzed the business and wrote the Business Rules Document.
- Gathered requirements and created use cases, use case diagrams, and activity diagrams using MS Visio.
- Performed unit testing, system testing, and system integration testing.
- Tested complex ETL mappings and sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables into target tables.
- Designed and developed Informatica PowerCenter mappings to extract, transform, and load the data into Oracle target tables.
- Used the Debugger in Informatica PowerCenter Designer to check for errors in mappings.
- Responsible for various data mapping activities from source systems to Teradata.
- Created UNIX scripts for file transfer and file manipulation.
- Worked closely with the QA team and developers to clarify and understand functionality, resolve issues, and provide the feedback needed to nail down bugs.
- Performed user acceptance testing (UAT), unit testing, and documentation.
- Performed gap analysis to check the compatibility of the existing system infrastructure with the new business requirements.

Environment: DB2, Tableau, MS Office, MS Visio, Informatica, PowerPoint, MS Project, UML, SQL, MS Excel

Client: Infonaya Software, India Sep 2014 to July 2017
ETL Developer
- Obtained data about customers from different systems and aggregated it within the data warehouse using Informatica.
- Developed and scheduled a variety of reports (cross-tab, parameterized, drill-through, and subreports) with SSRS.
- Developed ETL jobs per the requirements to update the staging database (Postgres) from various data sources and REST APIs.
- Worked on data profiling and developed various data quality rules using Informatica Data Quality.
- Implemented exception handling mappings using Data Quality, and data validation using Informatica Analyst.
- Implemented Change Data Capture using Informatica Power Exchange.
- Wrote ETL transformation rules to assist the SQL developer.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
- Developed and tested stored procedures, functions, and packages in PL/SQL for data ETL.
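The Change Data Capture work above was done with Informatica Power Exchange; the core idea it relies on can be sketched as a snapshot diff. This pure-Python version is only an illustration of the pattern, with hypothetical keys and rows:

```python
# Hedged sketch of change data capture (CDC) as a snapshot diff keyed
# by primary key. The real project used Informatica Power Exchange;
# the data below is hypothetical.
def capture_changes(before, after):
    """Split two snapshots into insert, update, and delete sets."""
    inserts = {k: v for k, v in after.items() if k not in before}
    deletes = {k: v for k, v in before.items() if k not in after}
    updates = {k: v for k, v in after.items()
               if k in before and before[k] != v}
    return inserts, updates, deletes

before = {1: ("alice", "NY"), 2: ("bob", "TX")}
after  = {1: ("alice", "CA"), 3: ("carol", "WA")}

ins, upd, dels = capture_changes(before, after)
print(ins)   # {3: ('carol', 'WA')}
print(upd)   # {1: ('alice', 'CA')}
print(dels)  # {2: ('bob', 'TX')}
```

Log-based CDC tools avoid full snapshot comparison by reading the database redo/transaction log, but the resulting insert/update/delete change sets have this same shape.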
- Created complex Cognos reports using calculated data items and multiple lists in a single report.
- Prepared functional and technical documentation of the reports created, for future reference.
- Worked with data modelers to prepare logical and physical data models and to add/delete necessary fields using Erwin.

Environment: ETL, Cognos, Informatica, PL/SQL, SQL, MS Office, MS Excel, Windows