Abhilash
Email: EMAIL AVAILABLE
Phone No.: PHONE NUMBER AVAILABLE
Visa: H1B (Since Street Address) (I-140 Approved)
Current Location: Michigan
In-Person Office: Yes

SUMMARY
- 10 years of experience in the IT industry with a strong background in SDLC and data integration tools, including Informatica PowerCenter and Informatica Cloud (IICS).
- Expertise and involvement in data modeling for Data Warehouse/Data Mart development, data analysis for Online Transaction Processing (OLTP) and Data Warehousing (OLAP)/Business Intelligence (BI), star-schema modeling, snowflake-schema modeling, and the design of fact and dimension tables for applications.
- As an IICS (Informatica Intelligent Cloud Services) Informatica Cloud developer, gained experience with Snowflake, AWS, Azure, Oracle, Teradata, Tableau, ServiceNow, etc.
- Experience in creating the methodologies and technologies that depict the flow of data within and between application systems and business functions/operations; developed data flow diagrams.
- Served as a lead for ETL developers by distributing modules of work per sprint release.
- Created multiple task flows for loading data from different sources to Salesforce using the Salesforce connector with Data Synchronization tasks and Mapping tasks, using APIs where required.
- Experienced ETL developer with a strong background in data integration, transformation, and migration using Informatica and Talend.
- Configured and tested API connections in IICS for data extraction and loading, ensuring secure handling of API credentials and tokens.
- Extensive experience with cloud data warehouses such as Snowflake, AWS Redshift, and Azure SQL Data Warehouse.
- Hands-on experience with reporting tools like OBIEE and Business Objects.
- Skilled in integrating and managing data across cloud platforms including AWS, Azure, and Oracle Cloud.
- Strong understanding of relational databases including Oracle, Teradata, and other database systems, and familiarity with data visualization tools like Tableau for creating insightful reports and dashboards.
- Configured and managed REST v2 connections in IICS to ensure proper authentication and authorization and to set up secure handling of APIs.
- Placed developed code into GitHub and performed branching using Git; checked files in and out, made commits, performed push/pull, and merged branches.
- Good knowledge of Hadoop ecosystem components like HDFS and Hive.
- Successfully designed and implemented ETL pipelines using IICS for seamless data integration between Snowflake, AWS S3, and Azure Blob Storage.
- Led migration projects from on-premises databases (such as Oracle and Teradata) to cloud platforms (AWS and Azure) using Informatica Cloud.
- Experience in developing mappings, task flows, tasks, data synchronization tasks, saved queries, sessions, and workflows using Informatica PowerCenter and Informatica Intelligent Cloud Services.
- Performed data manipulations using various Informatica transformations like Filter, Expression, Lookup, Aggregator, Update Strategy, Joiner, Router, Sorter, and Union.
- Proficient in leveraging Informatica Cloud to connect various cloud and on-premises data sources, including Snowflake, AWS, Azure, Oracle, Teradata, Tableau, and ServiceNow.
- Experience in data warehousing, data engineering, Test Data Management, Informatica ETL/TDM, and performance tuning.
- Extensive working experience with large-scale ETL process flows, relational databases, dimensional modeling, BI dashboards, and reporting frameworks to build an efficient data integration strategy (data mapping, standardization, and transformation) across multiple systems.
- Performed the requirement gathering, analysis, design, development, testing, implementation, support, and maintenance phases of data integration projects.
- Worked closely with Oracle Cloud administrators, other developers, and business stakeholders; provided support for troubleshooting and resolving integration issues.
- Experience in data profiling methods, data mining, and defining data quality rules.
- Hands-on experience writing UNIX scripts to process files to SFTP locations.
- Working knowledge of processing flat files, XML files, JSON files, mainframe files, etc.
- Experience in handling Oracle Cloud and DB2 databases.

Education:
Bachelor of Technology (Computer Science), JNTUH - 2012
Master's in Computer Science, Cumberlands University, KY, USA - 2016
Certifications:
- Informatica Cloud Certification
- AWS / Azure / Snowflake certifications

Technical Skills:
ETL Tools: IICS (Informatica Intelligent Cloud Services), Informatica PowerCenter 9.x/10.x/12, SAP BODS, Talend Studio, PowerExchange, Data Exchange, TOAD, WinSCP, PuTTY
Databases: SQL Server 12/13, Oracle 10g, DB2, Teradata, Snowflake
Hadoop Ecosystem: HDFS, Hive, Kafka, PySpark
Exposure Skills: Pig, Mainframes
BI Tools: Tableau, OBIEE
Languages: SQL, Shell scripting, Python, PL/SQL
Scheduling Tools: Autosys, Crontab, Informatica scheduler
Methodologies: Agile, Waterfall, JIRA
Data Modeling Tools: Erwin R2, SQL DBM
Operating Systems: Windows, UNIX

PROFESSIONAL EXPERIENCE

Client: Blue Cross Blue Shield of Michigan, Detroit, MI
Senior ETL Informatica IICS Developer    Oct 2018 - Till Date

Project Description:
Care Advance (CA) is a new care management system that BCN and BCBSM Healthcare implemented. Staff will use it to perform case management, disease management, wellness management, and utilization management to ensure members are properly using their health benefits and maintaining a good state of health.

- Involved in all phases of the SDLC from requirement gathering onward and participated in meetings with business users along with business analysts.
- Worked on converting functional specs from BAs into technical specifications and created high-level documentation for projects.
- Designed and implemented data loading processes to populate OLAP data warehouse tables, ensuring efficient data loading techniques to handle large volumes of data.
- Documented Oracle Cloud API endpoints, data models, and transformation logic.
- Configured and tested API connections in IICS for data extraction and loading and ensured secure handling of APIs.
- Involved in Hadoop development using Hive, Sqoop, and HDFS.
- Extracted data from Snowflake to push the data into an Azure warehouse instance to support reporting requirements.
- Used SQL and TOAD to validate the data going into the data warehouse.
- Configured and managed connections to Oracle Cloud services in IICS and ensured secure handling of credentials and access permissions for Oracle Cloud integrations.
- Configured and managed REST v2 connections in IICS to ensure proper authentication and authorization and to set up secure handling of APIs; created Swagger files for API calls.
- Extensively used performance tuning techniques while loading data into Azure Synapse using IICS.
- Developed mappings to extract data from REST APIs using REST v2 connectors in IICS and handled various data formats such as JSON, XML, and others provided by the APIs.
- Implemented SCD Type 1/Type 2 mappings based on client requirements.
- Integrated and managed data across cloud platforms including AWS and Oracle.
- Tuned the performance of mappings by following Informatica Cloud best practices and applied several methods to get the best performance by decreasing the run time of workflows.
- Performed pushdown optimization (PDO) to efficiently handle huge volumes of financial data.
- Integrated and automated data workloads to the Snowflake warehouse.
- Ensured ETL/ELT jobs succeeded and loaded data successfully into the Snowflake DB.
- Worked on regular database capacity planning related to database growth and system utilization, trend analysis, and predicting future database resource requirements as per warehouse growth in data volumes.
- Created Informatica task flows, linear task flows, replication tasks, dynamic mappings, mappings, and data synchronization tasks using IICS.
- Created/built UNIX shell scripts to pull data from vendors and drop it into the Informatica environment using an FTP process.
- Prepared unit test case documents, impact analysis, and system test case documentation.
- Prepared run books for the production support team and provided active support on a need basis.
- Deep understanding of agile methodologies and their challenges; coordinated with the Scrum master and development/QA teams to ensure resource availability and allocations.
- Worked in an onshore-offshore model by following up on the mappings performed.

Environment: IICS (Informatica Intelligent Cloud Services), Informatica PowerCenter, AWS, Azure, TOAD, Hadoop, Python, SQL Server 2012/2014, Oracle, JIRA, UNIX shell scripting, ClearQuest, PuTTY, Snowflake, WinSCP, Windows 10 Enterprise, MS Office tools, flat files.

Client: Blue Cross Blue Shield of Vermont, Berlin, Vermont
Senior ETL Informatica IICS Developer    Sep 2017 - Aug 2018

Project Description:
Blue Cross Blue Shield of Vermont and United Physicians PO are working together to create a home-based primary care program for higher-risk, chronically ill Medicare members who are attributed to United Physicians PO. United Physicians PO will identify chronically ill Medicare members and provide in-home health care once the member agrees to participate in the program. BCBSM will reimburse UP per engaged member per month. Earlier and more coordinated care within the home environment will improve member experience and quality of care, and ultimately lower costs through decreased ER use, urgent care visits, etc.

- Worked with business users and architects to understand the requirements for design and development.
- Worked on designing the table structure based on the collector data and credit card data provided by the users.
- Designed and normalized the table structure to conform to third normal form for collection files, master files, and the relational database.
- Designed the data model and created several facts and dimensions based on the reporting needs of the projects.
- Developed UNIX and Python scripts to preprocess the data.
- Placed developed code into GitHub and performed branching using Git; checked files in and out, made commits, performed push/pull, and merged branches.
- Implemented data cleansing, normalization, aggregation, and enrichment transformations.
- Applied business rules and logic to prepare data for OLAP analysis.
- Mapped and loaded transformed data to the target systems (e.g., databases, cloud storage, or another API).
- Designed the ETL design document as per the business rules.
- Used Talend to extract data from different source systems and files to load data into the target database for downstream processes.
- Developed data integration platform components/processes using the Informatica Cloud platform, Azure SQL Data Warehouse, Azure Data Lake Store, and Azure Blob Storage technologies.
- Automated ETL processes using IICS workflows and schedules to ensure regular data updates.
- Designed several complex ETL processes to load incoming files from mainframes, Informix, and OLTP systems into pre-staging and staging areas.
- Worked on Informatica Cloud in AWS for a POC to build source-to-target mappings.
- Built out best practices regarding data staging, data cleansing, and data transformation routines within the Informatica solution.
- Used Informatica Cloud (IICS) to load Salesforce objects into AWS S3 buckets.
- Involved in designing and creating Hive tables to upload data in Hadoop, and in processes like merging, sorting, creating, and joining tables.
- Designed and implemented data transformations to cleanse, normalize, and enrich data extracted from Oracle Cloud.
- Tuned the mappings by following Informatica best practices and applied several methods to decrease the run time of ETL jobs.
- Experience in creating documentation like source-to-target mappings and high-level and low-level design documents (i.e., functional and technical design documents).
- Participated in Informatica hotfix testing and system testing.
- Created base tables and stage tables based on the data model and the number of source systems.

Environment: Informatica PowerCenter 9.6.1/10.2, Informatica scheduler, Data Warehouse (OLAP), Talend, Erwin, Teradata 13.0, Oracle, IBM Mainframe, UNIX scripts, Python scripts, Git, TOAD, Jira, Azure, AWS, Agile, Waterfall, Hadoop, MS Office tools, flat files, T-SQL, Microsoft Office SharePoint Server, MS Access 2010.

Client: Motorola, Chicago, IL
ETL Informatica Developer    May 2016 - Aug 2017

Project Description:
The project aimed to enhance data quality, streamline data processing, and generate comprehensive sales reports for strategic business analysis.

- Gathered requirements and provided analysis of source data as data came in from different source systems.
- Designed the complex ETL process to load incoming files into stage and then into the ODS, stage, and EDW as per business requirements.
- Designed the technical specifications based on the functional specifications.
- Served as point of contact between the development team and the users to coordinate the requirements and help the team better understand them.
- Loaded data into the Snowflake database in the cloud from various sources.
- Designed and developed mappings, transformation logic, and processes in Informatica to implement business rules and standardize source data from multiple systems into the data warehouse.
- Maintained source and target mappings, transformation logic, and processes to reflect the changing business environment over time.
- Developed mappings to extract data from various APIs and handled API pagination, rate limiting, and error responses.
- Performed performance tuning by implementing pushdown optimization (PDO) and session-level partitioning.
- Hands-on experience using AWS services such as S3, CloudFront, RDS, CloudWatch, and CloudFormation, focusing on high availability.
- Performed code reviews with the team to ensure all standards were being followed.
- Participated in testing areas like unit testing, system testing, and pre-prod testing.
- Worked extensively on Teradata as part of the process to develop several scripts to handle different scenarios.
- Created several stored procedures to update tables and insert into audit tables as part of the process.
- Developed UNIX shell scripts to preprocess the data prior to loading into the ODS and staging area.

Environment: Informatica PowerCenter 9.6.1, IBM Mainframe, Oracle, Jira, SQL, AWS, Teradata, Snowflake, UNIX shell scripting, Tivoli, Data Warehouse and Data Mart.

Client: Wells Fargo, Hyderabad, India
ETL Informatica Developer    Jan 2013 - Nov 2014

Project Description:
The project focused on streamlining data workflows, improving data quality, and providing financial and operational reports.

- Extensively used Informatica PowerCenter client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Informatica Repository Manager, and Informatica Workflow Manager.
- Collected all required analysis documents and data mapping documents based on the requirements and provided analysis of source data as data came in from different source systems.
- Designed the ETL design document as per the business rules.
- Followed an Agile development methodology for the process, based on iterative development.
- Extensively involved in data extraction, transformation, and loading (ETL) from source to stage and stage to EDW systems.
- Created mappings/sessions/workflows, mapplets, mapping parameters, and connection variables.
- Used Perl scripts for file processing.
- Created collect stats for the target Teradata tables.
- Created loader connections like SQL Loader, MultiLoad, and TPump to improve target performance.
- Used pushdown optimization to improve session performance.
- Implemented SCD Type 1/Type 2 mappings based on client requirements.
- Prepared unit test documents, technical design documents, migration documents, and Tidal documents.

Environment: Informatica PowerCenter 8.6.1, Informatica PowerAnalyzer, Agile, Jira, Waterfall, TWS, WinSCP, PuTTY, Remote Desktop, SQL, Windows XP, Oracle 10g/9i, Teradata, UNIX shell scripting.