Bhargavi J (ETL Developer)
EMAIL AVAILABLE | PHONE NUMBER AVAILABLE | Herndon, Virginia

Professional Summary:
- Over 8 years in IT specializing in Data Warehousing and ETL processes with Oracle Data Integrator (ODI) and Informatica PowerCenter, leveraging Oracle databases, Windows, RHEL, and AWS Cloud Services for on-premises-to-cloud migration.
- Developed ETL processes for data extraction, transformation, and loading using ODI (11g/12c) and Informatica PowerCenter.
- Proficient in ETL tools: ODI (Designer, Topology Manager, Operator, Security Navigator) and Informatica (Designer, Workflow Monitor, Repository Manager).
- Designed and implemented ETL solutions: developed Mappings, Packages, Models, and Datastores in ODI, and Mappings, Sessions, Workflows, and Transformations in Informatica.
- Advanced knowledge of ODI Knowledge Modules: RKM, LKM, CKM, IKM, JKM.
- Experienced with Informatica transformations: Aggregator, Expression, Filter, Joiner, Lookup, Pivot, Unpivot, Router, Sorter.
- Created and mapped physical and logical architecture for various technologies across all environments using different contexts.
- Skilled in administration: managed repositories and developed procedures, user functions, and variables in ODI; set up repositories, managed workflows, and optimized performance in Informatica.
- In-depth understanding of data warehousing concepts: star and snowflake schemas, normalization, dimensions, and fact tables.
- Expertise in extracting and transforming data from heterogeneous sources (Oracle, JSON, XML, SQL Server, Excel) to target RDBMS databases (MySQL, Oracle, and SQL Server).
- Proficient in AWS Cloud Services, with expertise in cloud migration and effective data integration and transformation.
- Competent in implementing Slowly Changing Dimensions (SCD) Type 1 & 2 and Change Data Capture (CDC); a Type 2 sketch follows this summary.
- Expert at troubleshooting through root cause analysis and fixing issues within the Service Level Agreement (SLA).
- Experienced with CI/CD automated deployments using GitHub and Jenkins.
- Skilled in Agile methodologies and the software development lifecycle.
- Strong command of performance tuning, Python, SQL queries, and PL/SQL, ensuring robust and efficient data management solutions.
- Skilled in preparing functional and technical documentation based on business requirements, maintaining and modifying pre-existing applications, enterprise testing, and efficient error handling.
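For illustration, a minimal Oracle SQL sketch of the SCD Type 2 pattern referenced above; the stg_customer and dim_customer tables and the dim_customer_seq sequence are hypothetical names, not taken from any client schema.

    -- Step 1: expire current dimension rows whose tracked attributes changed.
    UPDATE dim_customer d
       SET d.effective_to = SYSDATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.customer_name <> d.customer_name
                           OR s.customer_city <> d.customer_city));

    -- Step 2: insert a fresh current row for changed and brand-new customers.
    INSERT INTO dim_customer
          (customer_sk, customer_id, customer_name, customer_city,
           effective_from, effective_to, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.customer_city,
           SYSDATE, DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');

An SCD Type 1 load differs only in that Step 1 becomes an in-place UPDATE of the changed attributes, with no history rows preserved.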
Technical Skills:
ETL Tools             : Oracle Data Integrator 12c/11g, Informatica PowerCenter
Databases             : Oracle 12c/19c, SQL Server, MySQL, Teradata
Programming Languages : SQL, Python, PL/SQL, Unix Shell Scripting
Cloud Platforms       : AWS (Redshift, S3, RDS), AWS Glue
Database Tools        : SQL Developer, Toad
Reporting Tools       : Power BI, Tableau
Operating Systems     : Windows 10, Unix
Other Tools           : GitHub, Jenkins, PuTTY, WinSCP, JIRA

Educational Background:
George Mason University, VA, April 2021
Master of Science in Computer Science
Mahatma Gandhi Institute of Technology, Hyderabad, Telangana, 2015
Bachelor of Technology in Computer Science and Engineering

Work Experience Duration:
Employer                       Duration
Panera Bread, St. Louis, MO    1 year and 9 months
Teradata, San Diego, CA        2 years and 9 months
Universal Software, India      1 year and 4 months
AARK Technologies, India       3 years and 2 months

Experience:

Project : AWS-Integrated ETL Solutions and Data Management
Client : Panera Bread, St. Louis, MO
Role : ETL Developer (ODI)
Environment : ODI 12c (12.2.1.4), SQL Developer, Unix, Oracle 12c/19c, Windows 10, PuTTY, WinSCP, GitHub, Jenkins, AWS, RHEL 8.6, Power BI, JIRA, ITSM
Duration : Jan 2023 - Present
Summary : Developed and optimized ETL workflows integrating AWS services for efficient data handling and storage. Configured ODI Topology, created complex mappings, and implemented ELT workflows using AWS Glue, Lambda, and S3. Ensured data integrity, performed performance tuning, and developed materialized views for reporting. Provided technical support, collaborated with business analysts, and supported application migration to AWS Cloud.
Roles and Responsibilities:
- Configured connection parameters in ODI Topology Manager for heterogeneous source and target systems, leveraging AWS services for secure and efficient data handling.
- Created models for multiple technologies to reverse-engineer datastores and utilized AWS Glue for data cataloging and transformation.
- Spearheaded the development of several complex mappings implementing transformation logic based on business requirements, integrating AWS Lambda functions for additional data processing.
- Developed ELT workflows within ODI packages using utilities such as ODIFileMove, ODIFileWait, ODIMasterExport, and ODIWorkExport, while employing AWS S3 for scalable data storage.
- Crafted PL/SQL functions, triggers, procedures, and packages per client requirements, integrating with AWS RDS for database management.
- Ensured data integrity by enabling flow control using CKM, monitored mappings and execution logs in the ODI Console, and promptly resolved errors within the SLA.
- Configured ODI J2EE agents, scheduled scenarios, and executed load plans to ensure precise data loads during batch operations at designated times.
- Customized Knowledge Modules for extracting and loading data from source to target databases.
- Developed SCD2 mappings for table historization, implemented CDC for processing journalized data, and handled dimension and fact tables.
- Implemented performance tuning on long-running queries in SQL Developer by creating necessary indexes, removing unwanted joins reported in explain plans, gathering statistics, and applying parallel hints when required.
- Created materialized views with the columns needed for reporting, and scheduled ODI procedures to refresh these views for data warehousing and analytics (a sketch follows this list).
- Offered technical support to peers during review meetings and collaborated with business analysts to meet project objectives and align with business requirements.
- Executed deployments to higher environments and supported test teams, creating ETL unit test cases and debugging ETL mappings and PL/SQL programs.
- Created workflows using packages and objects such as procedures, mappings, variables, user functions, and sequences.
- Applied performance tuning techniques wherever required to improve batch-processing throughput.
- Supported migration of applications from on-premises to AWS Cloud infrastructure, including customizing Knowledge Modules to manage deletes in AUDIT tables.
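A minimal sketch of the materialized-view-plus-refresh pattern described above, assuming hypothetical fact_sales, dim_date, and dim_product tables (illustrative names, not the actual client schema):

    -- Pre-aggregated reporting view, built once and refreshed on demand after each load.
    CREATE MATERIALIZED VIEW mv_daily_sales
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
    SELECT d.calendar_date,
           p.product_name,
           SUM(f.sale_amount) AS total_sales
      FROM fact_sales f
      JOIN dim_date d    ON d.date_sk    = f.date_sk
      JOIN dim_product p ON p.product_sk = f.product_sk
     GROUP BY d.calendar_date, p.product_name;

    -- Refresh step that an ODI procedure (or any scheduler) can call after the nightly batch.
    BEGIN
      DBMS_MVIEW.REFRESH(list => 'MV_DAILY_SALES', method => 'C');  -- 'C' = complete refresh
    END;
    /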
Project : Comprehensive Data Integration and Management
Client : Teradata, San Diego, CA
Role : Associate IT Consultant
Environment : Informatica PowerCenter, SQL Developer, PuTTY, Citrix, WinSCP, Power BI, JIRA, Python, Java, Exadata
Duration : April 2020 - Dec 2022
Summary : Designed Informatica-based ETL solutions, managing repositories, data servers, and models for diverse data sources. Engineered complex mappings and transformation logic, utilizing Informatica components and integrating Power BI for enhanced data analysis. Implemented Slowly Changing Dimensions (SCD) and Change Data Capture (CDC) strategies for incremental data loads, and optimized SQL queries to improve job performance. Leveraged Python for scripting and automation, applied Agile methodologies for project management, and used Java for integration tasks. Collaborated with cross-functional teams, facilitating effective communication and ensuring project success.

Roles and Responsibilities:
- Worked extensively in Informatica Designer, Workflow Monitor, and Repository Manager, overseeing the creation of Master and Work repositories.
- Established data servers by configuring physical and logical schemas across various technologies, including File, Oracle, SQL Server, and Excel files, within the Informatica environment; used Exadata for high-performance data processing and storage.
- Generated new models for heterogeneous data sources and conducted reverse engineering on both data sources and targets, leveraging Power BI for scalable data analysis.
- Engineered numerous mappings with transformation logic using Informatica's component palette, including the Aggregator, Expression, Filter, Joiner, Lookup, Pivot, Unpivot, Router, and Sorter transformations, for real-time data processing.
- Developed Python scripts for automating ETL tasks, data validation, and custom data processing solutions.
- Played a pivotal role in understanding both source and target systems while extensively applying reusable Informatica components for data storage solutions.
- Implemented SCD and CDC strategies, leveraging CDC functionality for incremental data loads and large-scale data management.
- Created complex mappings and workflows for full and incremental data loads, ensuring seamless execution of scheduled tasks.
- Applied Java to integrate ETL solutions with external systems and enhance data processing capabilities.
- Managed and collaborated with cross-functional teams using Agile methodologies to deliver data solutions, facilitating communication and coordination among team members to ensure project success.
- Led team meetings and workshops to align on project goals and strategies, fostering a collaborative environment that encouraged idea-sharing and problem-solving.
- Improved performance of ETL jobs processing large data volumes by fine-tuning SQL queries and optimizing workflows, as sketched below; applied PL/SQL for advanced data manipulation and complex query development.
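An illustrative Oracle tuning pass of the kind described in the last bullet, using hypothetical orders and customers tables; the remedies shown (supporting index, fresh statistics, parallel hint) mirror the explain-plan fixes listed under the Panera project.

    -- 1. Capture the execution plan for the slow statement.
    EXPLAIN PLAN FOR
    SELECT o.order_id, o.order_total
      FROM orders o
     WHERE o.customer_id IN (SELECT c.customer_id
                               FROM customers c
                              WHERE c.region = 'WEST');
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- 2. Typical remedies the plan points to: a supporting index and fresh optimizer stats.
    CREATE INDEX ix_customers_region ON customers (region, customer_id);
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'CUSTOMERS');
    END;
    /

    -- 3. Rewritten as a join (customer_id is assumed to be the customers primary key),
    --    with a parallel hint for the large batch scan.
    SELECT /*+ PARALLEL(o 4) */ o.order_id, o.order_total
      FROM orders o
      JOIN customers c ON c.customer_id = o.customer_id
     WHERE c.region = 'WEST';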
Project : Efficient Data Integration and Transformation
Client : Universal Software, Ahmedabad, India
Role : Associate IT Consultant
Environment : ODI 11g (11.1.1.5), SQL Developer, Oracle 12c, Windows 10, PuTTY, WinSCP
Duration : Aug 2018 - Nov 2019
Summary : Executed ETL processes using ODI, enhancing data accuracy and accessibility. Configured Topology and Operator, establishing Master and Work repositories for efficient data management. Analyzed source and target systems, collaborating with business analysts to refine requirements. Established data servers and schemas for diverse technologies, facilitating seamless data integration. Engineered complex mappings with various transformation logics and created scheduled workflows using scenarios and packages. Collaborated with cross-functional teams, contributing to project planning and review meetings. Provided mentorship to junior team members, promoting knowledge sharing and enhancing team capabilities.

Roles and Responsibilities:
- Executed data extraction, loading, and transformation to the warehouse using ODI, enhancing data accuracy and accessibility.
- Configured Topology and Operator and established Master and Work repositories tailored to client environments, ensuring efficient data management and processing.
- Analyzed both source and target systems for transformation, collaborating with business analysts to gather and refine requirements.
- Established data servers and defined physical and logical schemas for multiple technologies, such as File, Oracle, SQL Server, and Excel files, within Topology, facilitating seamless data integration.
- Engineered various mappings with complex transformation logic, including Filters, Joins, Lookups, Aggregates, Expressions, Pivots, and Unpivots.
- Created scheduled workflows using scenarios and packages containing objects such as procedures, interfaces, variables, and user functions, ensuring timely and accurate data processing.
- Led collaborative efforts with cross-functional teams to deliver high-quality data solutions, fostering open communication and continuous improvement.
- Actively participated in project planning and review meetings, contributing to strategy development and issue resolution.
- Provided mentorship and guidance to junior team members, enhancing team capabilities and promoting knowledge sharing.

Project : Data Integration and Performance Enhancement
Client : AARK Technologies, India
Role : Associate IT Consultant
Environment : ODI 11g (11.1.1.5), SQL Server 2008, Windows 7, WinSCP
Duration : June 2015 - July 2018
Summary : Utilized ODI 11g for ETL processes across various source systems, including databases and flat files. Collaborated with business analysts to align ETL solutions with business objectives. Managed large-scale data applications, enhancing ETL performance through indexing and query tuning. Configured and scheduled ODI scenarios and load plans for reliable job execution. Deployed code across environments and maintained technical documentation. Demonstrated expertise in design, testing, production, and support for successful project execution.

Roles & Responsibilities:
- Utilized ODI 11g for extracting, transforming, and loading data across various source systems and targets, including databases, flat files, and external sources (see the flat-file sketch after this list).
- Worked closely with business analysts to gather requirements and ensure ETL solutions met business needs and objectives.
- Managed large-scale, data-volume-intensive applications and enhanced ETL performance through optimization techniques such as indexing and query tuning.
- Configured and scheduled ODI scenarios and load plans to ensure timely and reliable execution of ETL jobs.
- Deployed code from lower to higher environments.
- Prepared technical documentation based on requirements gathered from business users; maintained and modified pre-existing applications.
- Demonstrated proven abilities in design, testing, production, and support for the successful execution of various projects.
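To illustrate the flat-file sourcing noted in the first bullet above, a hypothetical Oracle external table over a CSV feed; the directory object, file name, and columns are invented for the sketch.

    -- Assumes a directory object exists, e.g.: CREATE DIRECTORY etl_in_dir AS '/data/inbound';
    CREATE TABLE ext_orders (
      order_id    NUMBER,
      order_total NUMBER,
      order_date  DATE
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY etl_in_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        (order_id,
         order_total,
         order_date CHAR(10) DATE_FORMAT DATE MASK "YYYY-MM-DD")
      )
      LOCATION ('orders.csv')
    );

    -- The feed can then be staged with plain SQL, e.g.:
    -- INSERT INTO stg_orders SELECT * FROM ext_orders;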