Informatica ETL Developer Resume Ashburn...
Candidate Information
Title: Informatica ETL Developer
Target Location: US-VA-Ashburn
Candidate's Name KANNURU
E-Mail: EMAIL AVAILABLE
Mobile: PHONE NUMBER AVAILABLE

Executive Summary

- Six years of experience in ETL/Informatica, with four-plus years in data warehousing and data integration across all phases of the SDLC (Software Development Life Cycle), including analysis, design, development, testing, implementation and maintenance, with timely delivery against aggressive deadlines.
- Proficient with Informatica PowerCenter Designer tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer) and Workflow Manager tools (Task Developer, Worklet Designer and Workflow Designer).
- Data modeling using star schema and snowflake models.
- Created UNIX shell scripts to run Informatica workflows and control the ETL flow.
- Skilled in developing Informatica mappings and mapplets and tuning them for optimum performance.
- Experience working in an Agile development environment.
- Hands-on experience with tools such as Toad and SQL Developer.
- Well acquainted with session partitioning and with performance tuning of sources, targets, mappings and sessions to overcome bottlenecks in mappings.
- Proficient in creating stored procedures, tables, cursors, views, indexes and SQL joins, and in performance-based query writing.
- Used advanced PL/SQL concepts to implement complex ETL logic.
- Strong knowledge of slowly changing dimensions.
- Migrated objects through all phases (DEV, QA, PRD) of the project and trained developers to maintain the system in production.

Educational Qualifications

Bachelor of Engineering, Viswanatha Institute of Technology and Management (VITAM), JNTUK (2014).

Work Experience Summary

Organization: Cognizant Technology Solutions (Boston Medical Center HealthNet Plan (WellSense), Blue Shield of California, MassMutual)
Area: Healthcare, Insurance
Tenure: Feb 2022 to present
Last designation held: ETL/Informatica Developer

Organization: Highmark Inc
Area: Healthcare
Tenure: June 2019 to Sep 2020
Last designation held: ETL/Informatica Developer

Organization: Independent Health
Area: Healthcare
Tenure: June 2017 to Mar 2019
Last designation held: ETL/Informatica Developer

Technical & Functional Expertise

Operating Systems: Windows 2011/10
Programming: Primary skill: Informatica PowerCenter, SQL; Secondary skills: UNIX, Tidal, TWS, Autosys, ActiveBatch
Scripting languages: SQL, UNIX shell
Tools & Utilities: Informatica PowerCenter 10.5/10.4/10.2
Domain knowledge: Healthcare, Insurance
Databases: Oracle, Netezza, SQL Server, DB2

Highlights of Professional Experience

Cognizant Technology Solutions (Feb 2022 to present)

Client: Boston Medical Center HealthNet Plan (WellSense), Healthcare

Project: VSP SCO
VSP is the external claims vendor for vision claims, and VSP SCO is the new Senior Care Options implementation for the VSP vision vendor. There is one source for VSP SCO claims data, received twice a month, and for VSP SCO provider data, received once a month. It contains both historical and incremental data, which need to be transformed into database form.

Responsibilities:
- Interacted with business analysts and data architects to finalize the requirements and completed the technical design document for coding.
- Managed the design, development and implementation of VSP SCO through Informatica mappings, sessions and workflows.
- Worked with the three layers of the new data warehouse: STAGE_PERM, ODS and STAGE_WHS. STAGE_PERM is a direct load of the data from file, and the ODS data is loaded to STAGE_WHS.
- Jobs are triggered using the ActiveBatch scheduler tool.
- Provided code reviews and KT sessions for any operational changes before promoting the code to higher environments.
- Worked on defects raised by the QA team and the data architects team and provided code fixes.
- Successfully deployed the code to production.
- MoveIt is used for the file transfer.

Technologies: Informatica Power Center 10.5.1/10.4.1/10.2.0 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Microsoft SQL Server Management Studio 18, ActiveBatch scheduler, MoveIt.

Client: Blue Shield of California (BSC), Healthcare

Project: PV Doula (Doula Services as a Medi-Cal Benefit)
The Department of Health Care Services (DHCS) has added doula services to the list of preventive services covered under the Medi-Cal program. Doula services include personal support to women and families throughout a woman's pregnancy, childbirth, and postpartum experience: emotional and physical support provided during pregnancy, labor, birth, and the postpartum period. Pursuant to federal regulations, doula services must be recommended by a physician or other licensed practitioner. Doula services will be available in fee-for-service Medi-Cal and through Medi-Cal managed care plans; beneficiaries in a Medi-Cal managed care plan will receive doula services from their plan.

Responsibilities:
- Interacted with business analysts to finalize the requirements and completed the technical design document for coding.
- Managed the design, development and implementation of the PV Doula project using the Netezza ELT JAWS framework.
- Coordinated with the offshore team, developed the code, executed the manifest and loaded the data into the Netezza tables.
- Worked with the three layers RAWSTG, BOR (Book of Records) and BIS (Business Information System); RAWSTG is a direct load of the data from file.
- Worked with the different types of BORs: claim book of records, member book of records, provider book of records, product book of records, etc.
- RAWSTG data is loaded to BOR, BOR data is loaded to BIS, and Tableau reports are generated from the BIS layer.
- Jobs are triggered using the Tidal scheduler tool.
- Provided code reviews and KT for any operational changes before promoting the code to higher environments.
- Successfully deployed the code to production for five releases.
- Coordinated with the Tidal team to create and schedule the jobs for Doula.
- Coordinated with the MQMFT team for the file transfer (FTP).

Technologies: Informatica Power Center 10.4.1/10.2.0 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Netezza DBA, Unix, Oracle, Tidal scheduler, MQMFT.

Project: Nuna Pay
Nuna is a healthcare technology company headquartered in San Francisco, California. Its mission is to help make high-quality healthcare affordable and accessible for everyone by building data solutions for healthcare payers and providers to measure and improve their cost and quality outcomes.

Responsibilities:
- The Informatica workflow is triggered from the Tidal scheduler.
- Created a parameter-update workflow that keeps track of incremental dates in ABC tables for the extract.
- Once the parameter workflow completes, the data-extract workflow is executed using the ABC parameters to generate the extract file.
- Appropriate rules are applied while extracting the data from the BOR, and the data is loaded to work tables in the Netezza database.
- The data is read from the work tables to generate the extract file in pipe-delimited .txt format.
- Based on the generated trigger file, an MQMFT job moves the extract to the extract location to submit it to Nuna.

Technologies: Informatica Power Center 10.4.1/10.2.0 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Netezza DBA, Unix, Oracle, Tidal scheduler, MQMFT.

Highmark Inc (June 2019 to Sep 2020)

Client: Highmark Inc, Healthcare

Project Details:
Highmark Inc. is a national, diversified health care partner serving members through its businesses in health insurance, dental insurance, and reinsurance.

Brief: ODH (Operational Data Hub) is a repository for member, provider, claims and pharmacy information. The primary objective of the project was to build a data warehouse system integrating insured member/agent policy and claims information to satisfy the company's need for fast financial report generation. The immediate objective was to develop a data mart to handle health insurance members, their policy needs and claims. Data extracted from fragmented data sources across the company was transformed and loaded into target data marts.

Responsibilities:
- Interacted with business analysts to finalize the requirements and completed the technical design document for coding.
- Managed the design, development and implementation of the ODH decommissioning project using Informatica Power Center, reducing development and implementation cost.
- Successfully managed and implemented various ETL projects using Informatica Power Center that loaded data from DB2, flat files and other data sources to Oracle and other targets used for reporting.
- Designed POC mappings and helped the ETL team with data acquisition techniques and loading strategies during the development phase.
- Documented the expected level of effort and resource requirements to configure the ODH processes required to support the client onboarding and maintenance business processes.
- Defined and designed the data acquisition, transformation and data cleansing approach for the ODH implementation.
- Worked on ten ODH releases, implementing the business requirements for claims, member and provider data in the Oracle data warehouse.
- Designed several parallel jobs using Sequential File, Dataset, Join, Merge, Lookup, Change Apply, Change Capture, Remove Duplicates, Filter, Copy, Column Generator, Peek, Modify, Compare, Oracle Enterprise, Surrogate Key, Aggregator, Transformer and Row Generator stages.
- Worked with the SCD stage to implement slowly changing dimensions.
- Extracted data from flat files and Oracle databases and applied business logic to load it into the central Oracle database.
- Developed mappings, reusable objects, transformations and mapplets using Mapping Designer, Transformation Developer and Mapplet Designer.
- Created reusable transformations and mapplets and used them in mappings.
- Used Informatica Power Center for extraction, transformation and loading (ETL) of data into the data warehouse.
- Worked with data modelers to prepare logical and physical data models, adding and deleting the necessary fields using Erwin.
- Implemented the Tidal scheduler for scheduling Informatica workflows.
- Interacted with vendors and set up SFTP connections to the vendors' FTP sites for transferring extracted files.
- Set up SFTP connections to various vendors for receiving and sending encrypted data as part of MFT integration team support.
- Populated slowly changing dimensions to maintain current and historical information in dimension tables.
- Used the Informatica PowerCenter Workflow Manager to create sessions and batches to run with the logic embedded in the mappings.
- Created folders, users, repositories and deployment groups using Repository Manager.
- Created shared containers to increase object code reusability and to increase the throughput of the system.
- Used environment variables, stage variables and routines to develop parameter-driven jobs, and debugged them.
- Performed extraction, transformation and loading of data using different types of stages and by performing derivations over the links connecting those stages.
- Designed job templates and created new jobs from existing templates.
- Enhanced job performance by using partitioning methods and by analyzing resource utilization with Job Monitor.
- Created test plans for unit testing of designed jobs.

Technologies: Informatica Power Center 10.2/10.1 (Repository Manager, Designer, Oracle Warehouse Builder 11gR2, Oracle Data Integrator 10.1/11.1, Workflow Monitor, Workflow Manager), Informatica Power Exchange for Oracle CDC, Informatica Developer IDQ, Control-M, Autosys, DB2, Oracle 12c, Siebel, TOAD, SQL*Plus, Windows, UNIX, Shell Scripting.

Independent Health (June 2017 to March 2019)

Client: Independent Health, Healthcare

Project Details:
Independent Health provides health plans that continually aim to give the community innovative health-related products and services, enabling affordable access to quality health care and unmatched relationships with physicians and providers. We ensured the successful implementation of projects in support of corporate initiatives, performed detailed analysis, developed detailed designs and programs, and tested applications.

Responsibilities:
- Created high-level design documents, technical design documents and mapping documents based on business requirements.
- Configured, designed and developed ETL for the Enterprise Data Infrastructure (EDI) data mart, including sourcing and vending the security data.
- Optimized complex ETL processes using performance-tuning methods.
- Created programs to implement Change Data Capture (CDC) using Informatica to perform insert, update and delete operations.
- Monitored and executed production jobs whenever required.
- Performed unit, system and integration testing to validate source and target data, and performed internal quality auditing of the ETL code before delivery.
- Created Informatica tasks and workflows to execute ETL processes.
- Wrote complex SQL queries per business requirements to fetch data from relational tables.
- Optimized complex SQL queries to improve performance.
- Created Slowly Changing Dimension (SCD) Type 1 and Type 2, Slowly Growing Target and Simple Pass-Through Informatica mappings to meet business requirements.
- Performed unit testing, integration testing and user acceptance testing for ETL/SQL/UNIX script code.
- Developed UNIX shell scripts to run the workflows, SFTP source/target files to and from various systems, and automate Informatica processes.
- Executed Autosys commands/boxes to run the Informatica workflows.

Technologies: Informatica Power Center 9.6 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Informatica Power Exchange for Oracle CDC, Informatica Developer IDQ, Autosys, DB2, Oracle 12c, Netezza, TOAD, SQL*Plus, Tibco, Windows, UNIX, Shell Scripting.
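Several of the roles above mention UNIX shell wrapper scripts that start Informatica workflows under a scheduler (Autosys, Tidal, ActiveBatch). A minimal sketch of such a wrapper using Informatica's pmcmd command-line client is shown below; the service, domain, user, folder and workflow names are illustrative placeholders, not details taken from this resume:

```shell
#!/bin/sh
# Hypothetical wrapper for starting an Informatica workflow with pmcmd.
# All names below are assumed placeholders, not actual configuration.
INFA_SERVICE="INT_SVC_DEV"       # Integration Service name (assumed)
INFA_DOMAIN="Domain_DEV"         # Informatica domain name (assumed)
INFA_USER="etl_batch"            # repository user (assumed)
INFA_FOLDER="CLAIMS"             # repository folder (assumed)
WORKFLOW="wf_claims_stage_load"  # workflow to run (assumed)

# -pv names an environment variable holding the password, so the secret
# never appears on the command line; -wait blocks until the workflow
# run completes, so the script's exit code reflects the run status.
CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN \
-u $INFA_USER -pv INFA_PASSWD -f $INFA_FOLDER -wait $WORKFLOW"

# Guard the real call so the sketch stays runnable on hosts without
# the Informatica client installed.
if command -v pmcmd >/dev/null 2>&1; then
    $CMD || { echo "Workflow $WORKFLOW failed" >&2; exit 1; }
else
    echo "pmcmd not found; would run: $CMD"
fi
```

In the scheduler, the job simply invokes this script; a non-zero exit code surfaces a failed workflow run to Autosys, Tidal or ActiveBatch for alerting and restart.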
