Yashwantej
PHONE NUMBER AVAILABLE | EMAIL AVAILABLE | https://LINKEDIN LINK AVAILABLE
Data Modeler/Analyst

PROFESSIONAL SUMMARY
- Over 7 years of experience as a Senior Data Modeler/Analyst with proven skills in requirement gathering, gap analysis, change management, user acceptance testing, data analysis, business analysis, ETL testing, and BI testing in the healthcare sector.
- Experience developing Business Intelligence assets using tools such as Informatica PowerCenter, Informatica Data Quality, OBIEE, Tableau, and Oracle.
- Proficient in Business Intelligence, Oracle, database management, Informatica, and SQL Developer.
- Experience with data modeling tools such as Erwin, Power Designer, and ER Studio.
- Experience designing star and snowflake schemas for data warehouse and ODS architectures.
- Designed data marts using dimensional data modeling with star and snowflake schemas.
- Designed dimensional data models following Kimball methodology principles to support business reporting and analytics requirements.
- Highly proficient in data analysis, mapping source and target systems for data migration efforts, and resolving data migration issues.
- Experience in data scrubbing/cleansing, data quality, data mapping, data profiling, and data validation in ETL.
- Experience with medical claims data (claims, member, provider, clinical, etc.).
- Strong understanding of and experience in user acceptance, smoke, regression, performance, and functional testing.
- Involved in IT business analytics, data analytics, and data profiling work with business teams to improve data accuracy and results.
- Expertise in software development life cycles, including Waterfall and Agile methodologies.
- Experience with the Hadoop stack and big data analytics tools, including Hive, Spark, HDFS, Sqoop, Kafka, and Airflow.
- Extensive experience with analytical and data visualization tools and techniques, including Tableau and Power BI.
- Experience in data management using Collibra for enterprise data governance.
- Able to recommend and implement claims policies and procedures.
- Sound knowledge of health insurance products, healthcare claims, and policy administration operations.
- Experienced in modeling OLTP and OLAP systems using tools such as Erwin r9.6/9.1/r8/r7.1/7.2, Sybase Power Designer 12.1, and E-R Studio.
- Integrated data from diverse source systems into the data warehouse using Kimball's data integration techniques.
- Experience designing and developing SSIS packages for loading data from Oracle, text files, Excel, and flat files into SQL Server databases.
- Experience in data analysis, data profiling, data validation, data cleansing, data verification, data mining, data mapping, and identifying data mismatches.
- Skilled in writing HiveQL and Spark SQL queries for managing and analyzing large datasets (an illustrative query follows this summary).
- Experience creating BI reports using Cognos and Tableau, including dashboards, summary reports, master-detail reports, drill-down reports, and scorecards.
- Developed Tableau dashboards using stacked bars, bar graphs, scatter plots, geographical maps, and Gantt charts.
- Excellent knowledge of Health Insurance Portability and Accountability Act (HIPAA) standards and compliance issues.
- Collaborated with cross-functional teams to implement fraud detection algorithms and methodologies, improving real-time detection and prevention systems.
- Analyzed large datasets to detect potential fraud patterns and anomalies, leveraging statistical analysis and SQL to enhance fraud detection processes.
- Applied cybersecurity principles and fraud detection algorithms to design automated workflows, improving the integrity of data in fraud analysis.
- Skilled in normalization/denormalization procedures for effective and optimal performance in OLTP and OLAP environments.
- Proficient in systems analysis, ER/dimensional data modeling, database design, and implementing RDBMS-specific features.
- Extensive experience gathering business/functional user requirements, creating use cases, developing activity, class, and sequence diagrams, and writing Business Requirements Documents (BRDs).
- Involved in Facets system implementation; electronic claims and benefits configuration setup testing; inbound/outbound interfaces and extensions; and load and extraction programs involving HIPAA and proprietary format files and report development.
- Worked with providers and Medicare/Medicaid entities to validate EDI transaction sets and internet portals.
- Defined AS-IS and TO-BE processes and developed requirements through gap analysis.
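Illustrative example of the HiveQL/Spark SQL work noted above. This is a minimal sketch, not production code; the claims table and its columns are assumed names used only for illustration.

    -- Monthly claim volume and paid amount per provider, ranked within each month.
    -- Table and column names are assumed for illustration.
    SELECT provider_id,
           date_format(claim_date, 'yyyy-MM') AS claim_month,
           COUNT(*)                           AS claim_count,
           SUM(paid_amount)                   AS total_paid,
           RANK() OVER (PARTITION BY date_format(claim_date, 'yyyy-MM')
                        ORDER BY SUM(paid_amount) DESC) AS paid_rank
    FROM claims
    WHERE claim_status = 'PAID'
    GROUP BY provider_id, date_format(claim_date, 'yyyy-MM');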
TECHNICAL SKILLS
Languages: SQL, T-SQL, PL/SQL, Unix Shell Script, Python, R, JavaScript, HTML5, CSS3, C++.
Databases: Oracle 9i/10g/11g, SQL Server 2005/2008, Teradata, Sybase 11.5, DB2.
ETL Tools: Informatica PowerCenter 9.5/9.1x/8.6x/7.x (Designer, Workflow Manager, Workflow Monitor).
Operating Systems: Windows 2000/XP/Vista, Windows Server 2003, Linux, UNIX, DOS.
Tools: SQL*Loader, SQL*Plus, TOAD, SQL Developer, Erwin, Jira, Trello.
Others: Impact Assessment, Risk Management, Solution Development, MS Project.
Platforms: Tableau, Power BI, Jupyter Notebook, Apache Hadoop, Spark, AWS S3, AWS Lambda, Excel, Power Query.
Certification: Google Data Analytics Professional Certificate (Link), April 2024 - August 2024.
Micro-Certifications: Data Cleaning, Data Mining, Advanced Excel, Predictive Analytics, Data-Driven Decision Making, Data Visualization, Infrastructure Design (including schema design and dimensional data modeling), Data Security, Analytical Thinking, Competitive Advantages, Data Analytics, Data Science, and Machine Learning.
PROFESSIONAL EXPERIENCE

Client: Biogen Inc, RTP, NC
Sr. Data Modeler, August 2022 - Present
Responsibilities:
- Played a key role in the System Development Life Cycle (SDLC) process, including design, gap analysis, business requirements, systems requirements, test criteria, and implementation for projects automating correspondence for insurance policy owners.
- Utilized Agile methodology to implement project life cycles for report design and development.
- Developed logical and physical data models that capture current-state and future-state data elements and data flows using ER Studio.
- Developed a conceptual model using Erwin based on requirements analysis.
- Used Erwin for reverse engineering, connecting to existing databases and the ODS to create graphical entity-relationship representations and elicit more information.
- Involved in data mapping activities for the data warehouse.
- Produced PL/SQL statements and stored procedures in DB2 for extracting as well as writing data.
- Ensured production data was replicated into the data warehouse without data anomalies from the processing databases.
- Worked with Azure Blob and Data Lake storage and loaded data into Azure Synapse Analytics (SQL DW).
- Worked as OLTP data architect and data modeler to develop logical and physical entity-relationship data models for the Patient Services system, defining entities and attributes and normalizing them to third normal form using ER/Studio.
- Worked on data cleaning, data visualization, and data mining using SQL and Python.
- Designed graph data models for serverless architectures using AWS Lambda and Amazon DynamoDB.
- Designed and implemented data models for AWS data lakes using services such as Amazon S3 and AWS Glue DataBrew.
- Developed the data model with PARTY, ADDRESS, PROVIDER, DRUG, CLAIM, and MEMBERSHIP entities.
- Created and maintained documentation for AWS data models, including data dictionaries and metadata.
- Participated in Data Governance working group sessions to create data governance policies.
- Linked data lineage to data quality and business glossary work within the overall data governance program.
- Extracted, transformed, and loaded data from multiple sources into target databases using Azure Databricks, Azure SQL, PostgreSQL, SQL Server, and Oracle.
- Loaded tables from the data warehouse to Azure Data Lake using Azure Data Factory integration runtime.
- Developed star and snowflake schema-based dimensional models for the data warehouse.
- Created the physical data model from the logical data model using the Compare and Merge utility in ER/Studio and worked with the naming standards utility.
- Reverse engineered DB2 databases and forward engineered them to Teradata using ER Studio.
- Provided source-to-target mappings to the ETL team to perform initial, full, and incremental loads into the target data mart.
- Migrated data and data models from the SQL Server environment to the Oracle 10g environment.
- Worked closely with ETL SSIS developers to explain complex data transformation logic.
- Worked extensively on naming standards incorporating enterprise data modeling.
- Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
- Collected business requirements to define data mapping rules for data transfer from source to target systems.
- Utilized AWS services to store and analyze large datasets, reducing query times and enabling efficient data-driven decision-making.
- Worked with specialists to migrate data into the data warehouse and was involved in data modeling.
- Created logical and physical data models using Erwin for relational (OLTP) and dimensional (OLAP) databases with star schemas for fact and dimension tables.
- Gathered requirements and modeled the data warehouse and transactional databases, conducting gap analysis to identify and validate requirements.
- Built and customized interactive reports and dashboards using Tableau.
- Managed Tableau administration tasks, including configuration, adding users, managing licenses, scheduling tasks, and embedding views.
- Created SQL logic for aggregate views in Snowflake (an illustrative view definition follows this list).
- Documented data sources and transformation rules for populating and maintaining data warehouse content.
- Designed use cases and process flow models using Visio and Rational Rose.
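Illustrative sketch of the Snowflake aggregate-view SQL noted above; the fact and dimension tables are assumed names, not actual client objects.

    -- Aggregate view over an assumed star schema (FACT_CLAIM / DIM_MEMBER).
    CREATE OR REPLACE VIEW vw_claims_by_state_month AS
    SELECT d.state,
           DATE_TRUNC('month', f.claim_date) AS claim_month,
           COUNT(f.claim_id)                 AS claim_count,
           SUM(f.paid_amount)                AS total_paid
    FROM fact_claim f
    JOIN dim_member d ON f.member_key = d.member_key
    GROUP BY d.state, DATE_TRUNC('month', f.claim_date);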
Environment: Jira Atlassian, Collibra, Power BI, Snowflake, SQL Server (SSIS, SSAS), MS Excel, GitHub, T-SQL, AWS (EC2, S3), Erwin, Kafka, OLTP, OLAP, Visio, UNIX, Shell Scripting, Windows.

Client: Northern Trust, Chicago, IL
Data Modeler/Business Analyst, May 2020 - July 2022
Responsibilities:
- Participated in requirement-gathering sessions to understand expectations and worked with system analysts to understand the format and patterns of upstream source data.
- Developed logical models per the requirements and validated their alignment with the enterprise logical data model (ELDM).
- Architected and modeled Enterprise Data Hub (EDH) cloud-based solutions using AWS Redshift for the analytical platform and AWS Oracle RDS for reference and master data.
- Involved in the design and development of the data warehouse environment; served as liaison to business users and technical teams, gathering requirement specification documents and identifying data sources, targets, and report generation needs.
- Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.
- Used ER modeling techniques such as parent-child, associative, subtype/supertype, union, and key-value pair concepts to develop the data model.
- Identified and analyzed facts from the source systems and business requirements for the data warehouse (Kimball approach).
- Generated DDL scripts for both Redshift and RDS Oracle out of the ADS ER modeler (an illustrative DDL sketch follows this list).
- Provided day-to-day data administration and security-related support to the ETL team for AWS Redshift and AWS Oracle RDS.
- Translated business requirements into detailed system specifications and developed use cases and business process flow diagrams using Unified Modeling Language (UML).
- Checked all modeling standards, including naming standards, entity relationships in the model, and comments and history in the model.
- Implemented naming standards when creating indexes, primary keys, and relationships.
- Created reference values, lookup tables, history tables, and staging tables per requests from development teams and worked with them on developing the mappings.
- Worked extensively on legacy models reverse engineered from databases to implement object naming standards in the logical model and completed definitions by working with the respective application teams.
- Generated ad-hoc SQL queries using joins, database connections, and transformation rules to fetch data from legacy DB2 and SQL Server database systems.
- Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP, ODS, and OLAP.
- Supported data quality managers in developing tools for data governance and master data management.
- Assisted other teams in integrating multiple data models when the respective systems were migrated from multiple databases into a single Oracle schema.
- Prepared ETL technical mapping documents along with test cases for each mapping to support future development and maintain the SDLC.
- Performed in-depth quantitative data analysis and complex ad-hoc SQL queries for business analysts in both SSMS and Teradata SQL Assistant.
- Assisted the DBA in creating tables, indexes, views, and snapshots and maintaining database security.
- Developed data migration and cleansing rules for the integration architecture (OLTP, ODS, DW).
- Defined integrity rules and defaults and was involved in data profiling.
- Helped testing teams develop test cases for user acceptance and production testing, and independently performed exploratory testing.
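Illustrative sketch of the Redshift DDL generation noted above; the table, distribution style, and sort key are assumptions chosen for a small reference table, not actual project output.

    -- Assumed reference table; DISTSTYLE ALL replicates small tables to every node.
    CREATE TABLE ref_account_type (
        account_type_cd  VARCHAR(10)  NOT NULL,
        account_type_nm  VARCHAR(100) NOT NULL,
        effective_dt     DATE         NOT NULL,
        PRIMARY KEY (account_type_cd)
    )
    DISTSTYLE ALL
    SORTKEY (account_type_cd);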
Environment: SharePoint, MS Excel, Agile, MS Word, MS Access, HP Quality Center, Python, UML, ETL, Informatica PowerCenter 7.x, Oracle 11g, MDM, Erwin, OBIEE 11g, MS Office suite, MS Visio, Oracle SQL Developer.

Client: Baker Hughes, Houston, TX
Data Analyst/SQL Developer Intern, April 2019 - April 2020
Responsibilities:
- Gained hands-on experience troubleshooting production issues and performance tuning jobs.
- Participated in development efforts to analyze client-server applications using Visual Basic as the front-end tool and an MS SQL Server database.
- Designed packages for various data sources, including SQL Server, flat files, and XML.
- Converted PL/SQL stored procedures, triggers, functions, and scripts to T-SQL (an illustrative conversion sketch follows this section).
- Designed a custom reference data management system, allowing business data stewards to manage slowly changing master data sets in registered Oracle 19c tables used by other OLTP systems.
- Defined standards and procedures for data modeling, database design, reference data management, and ETL requirements and design.
- Used ER Studio to create logical and physical data models for an enterprise-wide OLAP system with star and snowflake schemas.
- Wrote ETL development specifications and ran proofs of concept to validate data transformation, migration, and integration.
- Developed star and snowflake schema-based dimensional models to grow the data warehouse.
- Specialized in transforming data into user-friendly visualizations using Tableau to give business users a complete view of their business.
- Designed, implemented, and maintained OLAP servers and processes to replicate production data to the server.
- Performed data extraction from OLTP systems to OLAP environments for decision-making using SSIS.
- Extracted large volumes of data from different sources and loaded them into target databases, performing various transformations using SQL Server Integration Services (SSIS).
- Created reports by extracting data from cubes.
- Installed and configured the SSRS reporting server and deployed reports.
- Deployed SSRS reports across multiple environments, including test, production, and supporting environments.
- Created ad-hoc reports for upper-level management using stored procedures and MS SQL Server 2005 Reporting Services (SSRS) based on business requirements.
- Developed VBA scripts and macros to automate report generation.
Environment: SQL Server, Teradata, SAS, SQL, PL/SQL, T-SQL, OLAP, OLTP, SSIS, XML, SSRS, Oracle, Excel, Jira, VBA, Shell Scripting.
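Illustrative sketch of the PL/SQL-to-T-SQL conversion work noted above; the procedure and table are hypothetical.

    -- Hypothetical T-SQL equivalent of a simple Oracle PL/SQL update procedure.
    CREATE PROCEDURE dbo.usp_update_order_status
        @order_id   INT,
        @new_status VARCHAR(20)
    AS
    BEGIN
        SET NOCOUNT ON;
        UPDATE dbo.orders
        SET    order_status = @new_status,
               updated_at   = GETDATE()   -- Oracle SYSDATE becomes GETDATE()
        WHERE  order_id = @order_id;
    END;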
NFS IT Solutions, Hyderabad, India
Data Analyst/Data Modeler, July 2015 - July 2017
Responsibilities:
- Interacted with users and business analysts to gather requirements.
- Studied the existing data model and documented design considerations affecting system performance.
- Initiated and conducted JAD sessions, inviting various teams to finalize the required data fields and their formats.
- Developed logical and physical data models using Erwin.
- Created the logical data model from the conceptual model and converted it into the physical database design.
- Developed shell scripts to process and manipulate large datasets, perform data validation, and extract relevant information.
- Followed industry best practices for shell scripting, including modularization, code readability, and version control.
- Created FTP connections and database connections for sources and targets.
- Maintained security and data integrity of the database.
- Developed several forms and reports using Crystal Reports.
- Provided maintenance support for customized reports created in Crystal Reports/ASP.
- Mapped the data between sources and targets.
- Generated reports from data models.
- Reviewed data models together with the functional and technical teams.
- Assisted ETL staff, developers, and users in understanding the data model.
- Maintained an alteration log for each data model created.
- Created PL/SQL code from data models and interacted with DBAs to build development, testing, and production databases.
- Implemented database objects (such as indexes and partitions in the Oracle database) for performance.
- Interacted with the DBA to discuss database design and modeling, index creation, and SQL tuning issues.
Environment: Oracle 8i, SQL Navigator, PL/SQL, DB2, SQL Server 2008, MS SQL Server Analysis Manager, Sybase, Rational RequisitePro, Windows NT, Crystal Reports.

EDUCATION
Master of Science in Data Science, University of Memphis, Memphis, TN - March 2019
Bachelor of Computer Science and Engineering, Jawaharlal Nehru Technological University, India - June 2015