Candidate Information
Title: Data Analyst Warehouse
Target Location: US-VA-Norfolk

Sr. Data Analyst/Data Engineer
Name: Neeharika
Email: EMAIL AVAILABLE
Phone: PHONE NUMBER AVAILABLE
LinkedIn: https://LINKEDIN LINK AVAILABLE

PROFESSIONAL SUMMARY:
- More than 11 years of IT industry experience in data analysis, business intelligence, and data warehousing, including development, implementation, and testing of database and data warehouse applications.
- Over 2 years of data collection and transmission with tools such as CRM, EHR, EMR, HIE, HL7, etc.
- Efficient development skills in Teradata SQL objects: creating tables, stored procedures, functions, views, indexing, and performance tuning.
- Experience working with 1010data to bring enhancements to various sales analytics tools.
- Experience with stored procedures, triggers, indexes, and table partitions; experienced in loading data from flat files, XML files, Oracle, DB2, and SQL Server into data warehouses/data marts using Informatica.
- Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, and FastExport), Teradata parallel support, and UNIX shell scripting.
- Expertise in implementing complex business rules by creating robust mappings, mapplets, sessions, and workflows using Informatica PowerCenter.
- Extensive experience in data warehousing and in the ETL design, development, implementation, and testing of data warehouse/data mart systems.
- Worked closely with data analysts to develop and deploy data pipelines in cloud environments such as AWS and GCP, ensuring reliable and scalable data processing and storage.
- Extensive experience with Azure cloud technologies such as Azure Data Lake Storage, Azure Data Factory, Azure SQL, Azure SQL Data Warehouse, Azure Synapse Analytics, Azure Analysis Services, Azure HDInsight, and Databricks.
- Ability to operationalize and apply CI/CD (Azure DevOps) to applications on Databricks.
- Strong Python/PySpark knowledge and strong knowledge of coding principles (data structures).
- Strong experience using Python for data ingestion rather than analytics.
- Preference for using DataFrame APIs over Pandas/SQL (a brief sketch follows the Technical Skills table below).
- Experience working in IDQ (Informatica Developer tools) to perform data cleansing, data matching, data conversion, exception handling, data profiling, reporting, and monitoring.
- Experience using Teradata SQL Assistant, Teradata Administrator, and data load/export utilities such as BTEQ, FastLoad, MultiLoad, and FastExport, with exposure to UNIX/Windows environments and running Teradata batch processes.
- Good knowledge of ICD-9 codes and CPT codes for both mental and medical health.
- Worked as an MS SQL developer in business intelligence development.
- Created stored procedures using PL/SQL and tuned databases and backend processes.
- Experience working with BI tools such as Qlik Sense, Power BI, and Tableau.
- Strong technical skills in Excel, with knowledge of SQL, Alteryx, and Tableau.

Education Details:
Bachelor's in Computer Science, Jawaharlal Nehru Technological University, India, 2013

Technical Skills:
ETL Tools: Informatica PowerCenter 9.5/9.7.1, PowerMart 6.2/5.1, Informatica PowerConnect, Informatica Developer, BODS, IBM DataStage
Data Modeling: Erwin, MS Visio, ER Model, Snowflake modeling, ER/Studio Data Architect
Reporting Tools: Business Objects 6.5/6.1, Crystal Reports, Tableau
Testing Tools: HP ALM, Test Director, SoapUI, Postman
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2005/2008, MySQL, SFDC, MS Access, MS Excel, flat files, Teradata, DB2
Defect Tracking Tools: Microsoft Team Foundation Server, JIRA, Quality Center
Languages: VB, Java, SQL, PL/SQL, UNIX shell scripting, XML
Web Services: API, SOAP
Operating Systems: Windows 7/Vista/Server 2003/XP/2000/9x/NT/DOS, UNIX, Linux
BI Tools: INFOR ION, Tableau, Power BI, 1010data
Agile Tools: Jira, Version 1, Rally
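As an illustration of the ingestion style referenced in the summary (DataFrame APIs rather than Pandas), below is a minimal PySpark sketch; the file path, column names, and target table are hypothetical and not taken from any of the projects listed.

    # Minimal PySpark ingestion sketch using the DataFrame API (not Pandas).
    # The path, columns, and target table are placeholders for illustration only.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("flat_file_ingestion").getOrCreate()

    # Read a delimited flat file with a header row and inferred types.
    raw = (spark.read
           .option("header", "true")
           .option("inferSchema", "true")
           .csv("/landing/members/members_20230101.csv"))   # hypothetical path

    # Light cleansing: trim key strings, standardize a date column, drop duplicates.
    clean = (raw
             .withColumn("member_id", F.trim(F.col("member_id")))
             .withColumn("enroll_date", F.to_date("enroll_date", "yyyy-MM-dd"))
             .dropDuplicates())

    # Land the cleansed data as a partitioned staging table for downstream marts.
    (clean.write
          .mode("append")
          .partitionBy("enroll_date")
          .saveAsTable("stg.member_enrollment"))            # hypothetical table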
WORK EXPERIENCE:

Client: Southwest Airlines, Dallas, TX (Remote)    April 2021 - Present
Role: Sr. Data Analyst/Data Engineer

Responsibilities:
- Responsible for the full development cycle of a data warehouse, including requirements gathering, design, implementation, and maintenance.
- EMR implementation for Athena, GE Centricity, Cerner, and Allscripts Sunrise projects.
- Developed T-SQL stored procedures and views using SSMS, SSIS, and MySQL.
- Created database objects using T-SQL, such as tables, queries, stored procedures, table triggers, indexes and constraints, views, and user-defined functions.
- Worked with data architects to design and develop architecture for a data services ecosystem spanning relational, NoSQL, and big data technologies; worked on AWS Redshift and RDS for implementing models and data.
- Designed logical and physical data models, reverse engineering, and complete compare for Oracle and SQL Server objects using ER/Studio.
- Developed custom integration points to connect Guidewire applications with other internal and external systems.
- Created, modified, enhanced, and maintained Framework Manager models within Cognos BI.
- Conversion of multiple ambulatory clinical systems under one EHR instance.
- Extensive experience in data warehouse design, data modeling, and data mapping to ensure seamless integration and efficient data processes.
- Developed conceptual, logical, and dimensional data models for enterprise data warehouse projects, ensuring alignment with business needs and scalability.
- Strong technical skills in Unix, Airflow, PySpark, and AWS, enabling efficient data processing and pipeline management.
- Worked on a RESTful application developed with Perl, Dancer2 using Moo (Perl object orientation), and Bash, storing data in MongoDB.
- Designed and implemented data models in SQL-based databases, including Snowflake, PostgreSQL, and Oracle.
- Participated in the development of data warehouses and data lakes/Delta Lake.
- Developed analytics using Python, Spark, PySpark, and other big data tools.
- Transformed stakeholder requirements into functional applications by employing the software development life cycle (SDLC) within Agile methodologies to deliver effective solutions.
- Extensive experience in data analysis and business intelligence, driving informed decision-making for business growth.
- Ensured quality processes were carried out through all phases of the SDLC.
- Built, published, and scheduled customized interactive reports and dashboards using Tableau Server.
- Collaborated with cross-functional teams to build and deploy real-time analytics solutions using Qlik Sense, improving operational efficiency and decision-making.
- Executed Python programs on AWS EC2 instances for data loading, exporting, and batch processing.
- Worked with the CATE platform engineering teams to define standard patterns for MS SQL Server/Sybase/DB2 database builds.
- Created and documented comprehensive data mapping between source and target systems, supporting complex integrations and ensuring data consistency.
- Demonstrated knowledge and experience in handling data processes specific to the insurance industry.
- Partnered with the business during User Acceptance Testing, assisting business users in defining UAT test cases and plans.
- Actively participated in user group meetings to review use cases and test results prior to UAT start.
- Utilized tools such as UML, Visio, and ClearCase to design and document use cases, workflow diagrams, and data flows, facilitating clear communication and understanding of project requirements.
- Experience with MS SQL Server (queries, DTS, stored procedures, data structures, and analysis).
- Good experience writing and productionizing complex data transformations in Spark SQL and PySpark on Azure Databricks.
- Applied SQL Server, PL/SQL, and Denodo for developing and executing complex SQL queries, performing data extraction, and conducting thorough data analysis and testing.
- Explored data in a variety of ways and across multiple visualizations using Power BI.
- Implemented data integration solutions using Informatica and RightData, streamlining data processing and improving overall system efficiency.
- Collaborated with data scientists to develop and deploy machine learning models at scale within the Snowflake data warehouse.
- Ingested and processed real-time/streaming data using AWS Kinesis and Apache Kafka (see the streaming sketch after this section).
- Designed data warehousing solutions using Azure Synapse Analytics for storing and analyzing transformed data.
- Provided technical expertise in health information exchange, electronic medical record/electronic health record integration architecture, and national and international data standards, including HL7 2.x/3.x, XML, CCD, SNOMED, LOINC, ICD-9, ICD-10, CDA, and FHIR.

Environment: Postman, DB2, Teradata 16.20.0.8, ETL, DataStage, T-SQL, AWS Glue, Qlik Sense, AWS Redshift, SSIS, SSRS, SQL, Python, PySpark, IBM InfoSphere, Guidewire, EHR, PL/SQL, MongoDB, PostgreSQL, Oracle 11.0.5.1
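As a hedged illustration of the real-time ingestion work mentioned above (Kafka into PySpark), the sketch below shows the general Structured Streaming pattern; the broker, topic, schema, and sink paths are assumptions, not project artifacts.

    # Sketch of Kafka -> Spark Structured Streaming ingestion; broker, topic,
    # schema, and paths are placeholders. Requires the spark-sql-kafka package.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    spark = SparkSession.builder.appName("flight_event_stream").getOrCreate()

    event_schema = StructType([
        StructField("flight_id", StringType()),
        StructField("status", StringType()),
        StructField("delay_minutes", DoubleType()),
    ])

    # Subscribe to a hypothetical topic and parse the JSON payload of each message.
    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
              .option("subscribe", "flight-events")               # placeholder topic
              .load()
              .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
              .select("e.*"))

    # Append the parsed stream to a Delta table, with checkpointing for recovery.
    (events.writeStream
           .format("delta")
           .option("checkpointLocation", "/chk/flight_events")    # placeholder path
           .outputMode("append")
           .start("/delta/flight_events"))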
Client: T-Mobile, Bellevue, WA (Onsite)    July 2019 - March 2021
Role: Sr. Data Analyst

Responsibilities:
- Tested applications under Agile scrum and sprint development processes.
- Managed various testing activities throughout the Software Testing Lifecycle (STLC) for multiple projects.
- Responsible for going through user stories and creating test cases based on the mockups provided in the user stories.
- Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
- Created complex T-SQL queries for selects, inserts, updates, stored procedures, and views.
- Developed Hibernate mapping configuration files to provide the relation between Java objects and database tables.
- Coordinated and developed detailed work plans, schedules, infrastructure component estimates, resource plans, and status reports.
- Participated in data governance meetings and led the action points from those meetings for reporting analytics.
- Acted as a thought leader to set up and guide the process for delivering data integrity within the Risk Reporting function, ensuring data and reporting accuracy and avoiding issues being identified by regulators.
- Expert in visualization techniques, dashboard design, and decision support tools, with experience using data visualization tools such as Tableau and Power BI.
- Used Erwin Model Mart for effective model management, sharing, dividing, and reusing model information and designs to improve productivity.
- Created the dimensional logical model with approximately 10 facts and 30 dimensions with 500 attributes using Erwin.
- Managed encryption keys and passwords through AWS Key Vault.
- Developed interactive dashboards using Tableau for the Supply Chain reporting and analytics team to monitor operational KPIs.
- Created and consumed web services using JSON, XML, and REST.
- Designed and developed an automation framework using Selenium, Java, and WebDriver.
- Responsible for writing test cases and test scenarios for functionality, system, and GUI testing.
- Utilized Power BI to create various analytical dashboards that help business users get quick insight into the data.
- Experience with AWS native services.
- Configured Maven for dependency management and Log4j for logging.
- Created an SSIS package for daily email subscriptions to alert on Tableau subscription failures using the PostgreSQL database.
- Created queries using DAX functions in Power BI Desktop.
- Strong knowledge of Microsoft Power BI, SSAS (Tabular), and DAX.
- Maintained data mapping documents, bus matrix, and other data design artifacts that define technical data specifications and transformation rules; performed gap analysis to analyze the difference between system capabilities and business requirements.
- Conducted and participated in formal requirement meetings and live meetings/JAD sessions for various requirements gathering sessions with stakeholders and vendors.
- Created mappings using the HTTP transformation to call REST APIs (third-party web services), retrieve the data in XML format, parse it using the XML Parser transformation, and load it into a relational SQL Server database through subsequent transformations (a plain-Python analogue is sketched after this section).
- Used Team Foundation Server (TFS) for bug tracking and published test results in TFS to share with team members.
- Involved in creating dashboards and reports as needed using Tableau Desktop and Tableau Server.
- Integrated the automation framework with the CI tool Jenkins for building and scheduling the scripts.
- Designed logical and physical data models, reverse engineering, and complete compare for Oracle and SQL Server objects using ER/Studio.
- Developed and maintained ETL (data extraction, transformation, and loading) mappings using Informatica Designer to extract data from multiple source systems, including Oracle 10g, SQL Server, and flat files, into the staging area, the EDW, and then the data marts.

Environment: Postman, DB2, DataStage, SQL Server, REST API, ER/Studio, JAD, Erwin, AWS Key Vault, gap analysis, Agile, Informatica PowerCenter 9.7.1, QuerySurge, TFS, HP ALM, ETL, SQL, Java, JIRA, Git
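The Informatica flow described above (HTTP transformation calling a REST API, XML Parser transformation, load to SQL Server) can be approximated in plain Python for ad-hoc validation; in the sketch below the URL, XML element names, and load step are assumptions, not the actual interface.

    # Hedged Python analogue of the REST-to-XML-to-relational flow, useful for
    # ad-hoc checks; the URL and element names are placeholders only.
    import requests
    import xml.etree.ElementTree as ET

    resp = requests.get("https://api.example.com/v1/accounts", timeout=30)  # placeholder URL
    resp.raise_for_status()

    # Parse the XML payload and flatten the records of interest into tuples.
    root = ET.fromstring(resp.text)
    rows = [
        (acct.findtext("id"), acct.findtext("status"), acct.findtext("balance"))
        for acct in root.iter("account")                     # hypothetical element name
    ]

    # The rows would then be bulk-inserted into SQL Server (e.g., pyodbc executemany);
    # connection details are intentionally omitted here.
    print(f"parsed {len(rows)} account records")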
Client: UHG, Williamsburg, VA    October 2017 - June 2019
Role: Data Analyst

Responsibilities:
- Developed and maintained the Requirements Traceability Matrix (RTM) and worked with the team to resolve requirement gaps.
- Worked as an MS SQL developer in business intelligence development.
- Manually created and executed various test cases.
- Applied knowledge of defect life cycle management, including defect creation, modification, tracking, and reporting using Jira.
- Updated test plans and test cases periodically to manage changes in requirements.
- Experience with MS SQL Server and Oracle.
- Managed end-to-end complex data migration, conversion, and data modeling using Alteryx and SQL, and created visualizations using Tableau to develop high-quality dashboards.
- Working knowledge of MDX and DAX queries.
- Created intuitive and interactive dashboards, reports, and visualizations using data visualization tools such as Tableau, Power BI, and Looker, integrated with Snowflake.
- Scheduled and executed Oozie and Airflow jobs using Jenkins with Snowflake, in collaboration with data analysts, ensuring timely and accurate execution of data processing workflows (see the Airflow sketch after this section).
- Collaborated with data analysts to architect and build multiple data pipelines for end-to-end ETL and ELT processes, ensuring efficient data ingestion and transformation in Google Cloud Platform (GCP).
- Designed visualization dashboards for data analytics using Power BI.
- Worked with disparate data sources such as NoSQL and SQL databases, APIs, and streaming.
- Coordinated data governance processes for policies and compliance.
- Cleaned and transformed data in HDFS using MapReduce (YARN) programs for ingestion into Hive schemas.
- Demonstrated automation using Selenium WebDriver and Java.
- Implemented the Page Object Model using Selenium WebDriver in Java.
- Worked within development teams with a BDD approach to develop tests using Cucumber, Gherkin, JUnit, Java, and Selenium WebDriver.
- Wrote test code in Java using JUnit annotations.
- Applied extensive knowledge of Agile methodology and participated in sprint planning, sprint retrospectives, product backlog refinement, and daily scrum meetings.
- Discussed test cases in test case review meetings with the BA, PM, QA lead, and other team leaders.
- Performed COPQ and DPPM analysis in Power BI and helped global quality leaders make business-driven decisions.
- Contributed to team Agile planning activities and backlog prioritization and management.

Environment: Agile, JIRA, Java, TestNG, SQL, automation framework, Selenium, QA, Power BI
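As a hedged sketch of the scheduled Airflow workflows mentioned above, the DAG below shows the general pattern; the DAG id, schedule, and task callables are hypothetical, not the actual jobs.

    # Minimal Airflow 2.x DAG sketch for a scheduled ETL step like those mentioned above.
    # The dag_id, schedule, and callables are illustrative placeholders.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_to_staging():
        # placeholder: pull from the source system into a staging area
        pass

    def load_to_warehouse():
        # placeholder: merge staged data into warehouse tables (e.g., Snowflake)
        pass

    with DAG(
        dag_id="daily_claims_refresh",          # hypothetical name
        start_date=datetime(2018, 1, 1),
        schedule_interval="0 2 * * *",          # run daily at 02:00
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_to_staging", python_callable=extract_to_staging)
        load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
        extract >> load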
Client: IVY Comptech Pvt Ltd, India    September 2015 - February 2017
Role: ETL Developer

Responsibilities:
- Worked on SAP BODS to create jobs, workflows, scripts, and data stores to load data into data quality tables.
- Involved in data quality analysis.
- Designed and developed ETL interfaces using the BODS ETL tool and implemented a BODS-based ETL solution.
- Developed and implemented the coding of BODS mappings for different stages of ETL.
- Moved staging tables to target tables, and developed ETL flows and a framework to support ETL development.
- Worked closely with SMEs in writing the business scenarios for a project for an incoming source called ARIBA.
- Worked during various project life cycle phases, such as requirements and analysis, and testing.
- Created or altered data integration blueprints and installed, configured, and tuned ETL infrastructure to support the data warehousing solution.
- Prepared data quality rules and business-level data checks, and generated reports based on the EDW data at different stages of the project (a small sketch of such checks follows this section).
- ETL architecture design experience within data warehousing environments, applying appropriate methodologies and ETL standards.
- Designed, coded, and unit tested ETL packages, triggers, stored procedures, views, and SQL transactions.
- Worked on Teradata to prepare SQL, validating the data in EDW tables and semantic views.
- Involved in SDL layer implementation as a BODS developer and in solving production-level issues.
- Developed and validated unit test cases and test scripts for inbound and outbound ETL processes.
- Understood and complied with development standards and the SDLC to ensure consistency across the team.
- Developed ETL flows and a framework to support ETL development and set a standard for the team.
- Worked on optimizing data refresh rates in Qlik Sense, reducing latency and improving the performance of real-time reporting for faster business insights.
- Analyzed errors that caused ETL load failures by reference to the log files and reported them to the corresponding teams for rectification.
- Worked on DB2 as the primary database and also with Teradata.
- Experience setting up the semantic layer and building dimensional data cubes, with good knowledge of snowflake and star schemas.

Environment: SAP BODS, Informatica, Teradata, DB2, SQL, Qlik Sense, Windows, Linux, Agile, Git, JIRA
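To illustrate the data quality rules and business-level data checks mentioned above, here is a minimal sketch; the DSN, table names, and the specific rules are assumptions rather than the actual EDW checks.

    # Hedged sketch of simple post-load data quality checks (row-count reconciliation
    # and a business-rule null check). DSN, tables, and rules are placeholders.
    import pyodbc

    checks = [
        ("row_count_match",
         "SELECT (SELECT COUNT(*) FROM stg.orders) - (SELECT COUNT(*) FROM edw.orders)"),
        ("null_customer_ids",
         "SELECT COUNT(*) FROM edw.orders WHERE customer_id IS NULL"),
    ]

    conn = pyodbc.connect("DSN=EDW")          # placeholder connection
    cursor = conn.cursor()

    failures = []
    for name, sql in checks:
        value = cursor.execute(sql).fetchone()[0]
        # Each check is written so that a non-zero result means a rule violation.
        if value != 0:
            failures.append((name, value))

    for name, value in failures:
        print(f"DQ check failed: {name} (offending value: {value})")
    conn.close()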
Client: Syena Infosoft Pvt Ltd, India    May 2013 - August 2015
Role: ETL Developer

Responsibilities:
- Worked extensively with the business to capture requirements, clearly document user specifications, and ensure they were designed and tested thoroughly.
- Analyzed the changes made to different EDI ANSI X12 transactions (834, 837 I and P, 278, 270, and 271) under HIPAA 5010.
- Developed detailed specifications of X12 and XML data format standards.
- Working knowledge of the EDI X12 4010A1 HIPAA guidelines and addendums; transaction sets included 834 eligibility inbound and 835 remittance advice outbound (see the parsing sketch after this section).
- Responsible for analyzing use case specifications, workflow diagrams, sequence diagrams, and class diagrams to create test scenarios for system and integration testing.
- Documented software bugs and wrote test reports using Team Foundation Server (TFS).
- Integrated Qlik Sense with real-time data streaming technologies (e.g., Kafka, MQTT) to ensure continuous data flow into dashboards, improving real-time analytics capabilities.
- Customized real-time data feeds using Qlik Sense to support specific use cases, such as inventory management, sales tracking, and production line monitoring, improving operational efficiency.
- Performed back-end testing by executing SQL queries on Windows and UNIX platforms.
- Responsible for providing test environments for business users to perform User Acceptance Testing (UAT).
- Experience in ETL processes using ETL tools, creating and maintaining repositories, source systems, and target databases, and developing strategies for ETL mechanisms using tools such as Informatica and Teradata.
- Created new directories, parameter files, and new sets of shell scripts in Unix for processing Informatica jobs, as well as pre- and post-processing of database tables.
- Created mappings using the HTTP transformation to call REST APIs (third-party web services), retrieve the data in XML format, parse it using the XML Parser transformation, and load it into a relational SQL Server database through subsequent transformations.
- Initially used mapping parameters with different parameter files to feed different sets of input parameters to the HTTP transformations.
- Provided estimates and retesting processes in response to XML/RESTful and MQ change requests.
- Tested programs using test tools and mapped the results to the SQL database; also supported and maintained messaging infrastructure tools used to monitor JMS queues.
- Tracked defects during ST/UAT and was responsible for closing defects and obtaining user signoff.
- Supported the HP ALM implementation and coached testers to adopt HP ALM for managing test activities.
- Provided weekly test status reports on test planning and test execution to the team lead.

Environment: Informatica PowerCenter 9.6.1, Test Director 7.6, BW, BO, MQ Series, Microsoft Visual SourceSafe, ETL, Qlik Sense, TFS, HP ALM, SQL Server, API, Oracle 8i, DB2 8.1
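As a small illustration of the X12 transaction work described above, the sketch below splits an abbreviated, made-up 834 string into segments and elements; real files declare their delimiters in the ISA segment, so the '~' and '*' used here are assumptions.

    # Illustrative-only sketch of splitting an EDI X12 payload (e.g., an 834) into
    # segments and elements; the sample string and delimiters are placeholders.
    sample_834 = ("GS*BE*SENDER*RECEIVER*20140101*1200*1*X*005010X220A1~"
                  "ST*834*0001~INS*Y*18*030*XN*A*E**FT~SE*3*0001~")

    segments = [s for s in sample_834.split("~") if s]   # '~' as segment terminator
    parsed = [seg.split("*") for seg in segments]        # '*' as element separator

    # The ST segment identifies the transaction set: ST01 = '834' (benefit enrollment).
    for seg in parsed:
        if seg[0] == "ST":
            print("transaction set:", seg[1], "control number:", seg[2])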
