Candidate Information
Title: Informatica/IICS Developer
Target Location: US-MD-Columbia
PROFESSIONAL SUMMARY

- 9+ years of experience in the development, implementation, documentation, and support of ETL processes for large-scale data warehouses using Informatica PHONE NUMBER AVAILABLE
- 2+ years of experience in Informatica Cloud
- Used different Informatica tools, including Informatica PowerCenter (Designer, Repository Manager, Workflow Manager, Workflow Monitor, and the Admin homepage for Repository & Integration services) and Informatica PowerExchange
- Good knowledge of Informatica Intelligent Cloud Services concepts
- Sound knowledge of OLAP concepts, OLTP data modeling, and data warehouse architecture and methodologies
- Extensively worked on data extraction, transformation, and loading from various sources such as Oracle, DB2, and MySQL, and non-relational sources such as XML, flat files, and mainframe datasets
- Strong knowledge of the entity-relationship concept, fact and dimension tables, Slowly Changing Dimensions (SCD), and dimensional modeling (star schema and snowflake schema)
- In-depth understanding of database design concepts, logical data models and physical database designs, data integrity, and normalized/denormalized data models
- Well versed in developing complex SQL queries, unions, joins, views, PL/SQL (stored procedures, functions, triggers, and packages), and indexing
- Hands-on experience with Microsoft Visio for drawing flow charts for mappings
- Knowledge of the Teradata utilities FastLoad, MultiLoad, and BTEQ
- Well versed with transformations: Source Qualifier, XML Parser, Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Stored Procedure, Normalizer, SQL, and Rank
- Experience using SQL*Loader scripts to load data from external files
- Involved in migrating maps from IDQ to PowerCenter
- Very good proficiency in Informatica Data Quality 9.5.1/9.6.1
- Hands-on experience in performance tuning of sources, targets, transformations, and sessions
- Experience resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and tuning the performance of mappings and sessions on 24/7 production support
- Experience performing data analysis on various kinds of data sources for DW applications such as EDW
- Extensive experience writing UNIX shell scripts and automating ETL processes with UNIX shell scripting
- Sound knowledge of XML, XML Schema (XSD), Java, and Salesforce
- Good knowledge of and experience in Waterfall, Agile, and Scrum methodologies
- Good at creating mapping documentation, data flow diagrams, and data modeling
- Sound knowledge of Informatica Cloud and Amazon Web Services (AWS): EC2, S3, Redshift, RDS, Elasticsearch, DynamoDB, NoSQL, SQS, and SNS
- Knowledge of Microsoft Azure cloud services
- Involved in writing test cases and unit testing in lower environments
- Used different schedulers: Autosys, Control-M, and Informatica scheduling
- Experience creating new jobs with job dependencies and alerts per application team and business requirements using Control-M and Autosys
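As a concrete illustration of the Slowly Changing Dimension Type II pattern mentioned above, a minimal sketch in Python follows. This is an illustration only, not code from any project on this resume; the column names, keys, and rows are hypothetical.

```python
from datetime import date

# Minimal sketch of SCD Type II logic: when a tracked attribute changes,
# close out the current dimension row and insert a new current row.
# The 'key', 'attrs', 'start', 'end', 'current' fields are hypothetical.

def apply_scd2(dim_rows, key, new_attrs, today):
    """Apply one incoming record to an SCD Type II dimension (list of dicts)."""
    for row in dim_rows:
        if row["key"] == key and row["current"]:
            if row["attrs"] == new_attrs:
                return dim_rows  # no change: keep the current row as-is
            row["end"] = today   # expire the old version
            row["current"] = False
    # Insert the new current version of the row
    dim_rows.append({"key": key, "attrs": new_attrs,
                     "start": today, "end": None, "current": True})
    return dim_rows
```

In a real Informatica mapping this expire-and-insert step would typically be implemented with Lookup and Update Strategy transformations rather than hand-written code.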
- Maintained batch jobs using Control-M

TECHNICAL SKILLS

ETL/Informatica: Informatica 10.x/9.x, Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor, and Repository Manager), Informatica Cloud, Informatica Data Quality (IDQ), Slowly Changing Dimensions, Deployment Groups, Informatica PowerExchange
Data Warehousing: EDW, ODS, data mart, Ralph Kimball's model (bottom-up approach), Inmon's model (top-down approach), Snowflake
Data Modeling: Dimensional data modeling, star schema modeling, snowflake modeling, fact and dimension tables, physical and logical data modeling, Erwin Data Modeler
Programming: SQL, stored procedures, functions, triggers, views, shell scripting, Java, J2EE, JavaScript, HTML, XML, XSD, XSLT, web services, WSDL, SOAP, PostgreSQL
Software Methodologies: Waterfall, Agile, and Scrum
Tools: Toad 12.8, DbVisualizer, SQL Navigator, SQL Developer, SQL*Loader, SSIS, SQL*Plus, Microsoft Visio, WinSCP, Beyond Compare, HP Quality Center, ServiceNow, JIRA, Git, PuTTY, XMLSpy, Kafka, Autosys, Control-M, IBM CA7 Scheduler, Power BI, Hadoop, Rational Rose, SVN/PVCS/CVS, Remedy, SharePoint, SoapUI, Salesforce
Databases: Oracle 11g/10g/9i, T-SQL, SQL Server, IBM DB2, MySQL, Teradata, Microsoft Azure, Snowflake
Operating Systems/Scripting: Windows, Linux, UNIX, Python
Cloud Technologies: Informatica Cloud, Amazon Web Services (AWS)

CERTIFICATIONS & EDUCATIONAL QUALIFICATION

- AWS Certified Solutions Architect - Associate (June 29, 2020)
- Informatica PowerCenter Developer Certified Professional (Feb 17, 2019)
- Master of Computer Applications, Jawaharlal Nehru Technological University, Hyderabad
- Bachelor of Computer Applications, Kakatiya University, Warangal

WORK EXPERIENCE

Informatica/IICS Cloud Developer, Infosys (Feb 2021 - Present)
Client: CareFirst, Owings Mills, MD
Projects: Altruista-4000.0 Care & Utilization Management

Responsibilities:
- Created different assets such as mappings, mapping tasks, and taskflows
- Generated different types of extracts for the business per their requests
- Ran jobs on the IICS Data Integration platform and reviewed the run statistics
- Verified the logs for any issues
- Performed data analysis on different sources, targets, database artifacts, mappings, and workflows
- Involved in analyzing business requirements and designing target tables; designed and developed mappings and workflows to meet the business requirements and load the targets
- Extracted raw data from the Snowflake cloud data warehouse and flat files to staging tables using Informatica Cloud
- Developed the audit activity for all the cloud mappings
- Worked on different Informatica tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor
- Implemented data movement from on-premises to the cloud in Azure
- Developed cursors, stored procedures, triggers, and packages in the Oracle database for reporting purposes
- Involved in performance tuning at the source, target, mapping, session, and system levels
- Involved in tuning complex long-running SQL queries to achieve good performance
- Created functional specification documents, migration documents, knowledge transfer documents, and test case documents

Environment: Informatica Cloud, Informatica 10.2 PowerCenter (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Snowflake data warehouse, Azure data storage, Microsoft SQL Server, WinSCP, PuTTY, UNIX

ETL/Informatica Developer, Broadmoor Solutions Inc (Dec 2018 - Dec 2020)
Client: Sanofi Pasteur, Swiftwater, PA
Projects: OC Analytics / Contract Bucket Breakdown / RMGP / Vax Value / MCE Firewall

Sanofi Pasteur is the vaccines division of the multi-national pharmaceutical company Sanofi Aventis. The business objective of this project was to modify the existing analytics systems and processes and develop new reports providing the necessary insight on large organized customers. This gives a holistic view of the large organized customers and supports sales and marketing needs for better strategic planning. With the information these new analytics solutions provide, the business can focus its time and resources on the large organized customers with higher business potential and adjust its strategies if need be. The solution is used by a variety of users across the commercial operations entity.

Responsibilities:
- Worked on small and mid-size projects ranging from 1 to 6 months, with weekly status calls
- Extracted raw data from SAP and flat files to staging tables using Informatica Cloud
- Developed the audit activity for all the cloud mappings
- Worked on Informatica PowerCenter tools: Source Analyzer, Data Warehousing Designer, Mapping Designer, Mapplet Designer, Workflow Designer, Task Developer, and Transformation Designer
- Extensively used transformations such as Source Qualifier, Sorter, Filter, Aggregator, Joiner, Expression, Lookup, Router, Sequence Generator, and Update Strategy
- Used the Debugger to validate transformations by creating breakpoints to analyze and monitor data flow, which helped identify and fix critical bugs and issues
- Created pre/post-session UNIX scripts to perform operations such as gunzip, remove, more, concat, and touch, and to handle trigger files
- Helped remediate issues with migrating developed objects across environments such as validation, pre-prod, and prod
- Scheduled and monitored jobs through Control-M and the CA7 scheduler
- Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse
- Worked on different tasks in workflows: sessions, event raise, event wait, decision, e-mail, command, assignment, timer, and workflow scheduling

Environment: Informatica Cloud, Informatica 10.2/9.6 PowerCenter (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica versioning, Oracle 11g, TOAD, WinSCP, PuTTY, Control-M, CA7 scheduler, Windows, UNIX

Informatica Developer, JSMN International Inc (Dec 2017 - Nov 2018)
Client: Fannie Mae, Herndon, VA
Project: Multifamily CDS

Creating a multifamily MBS begins with a mortgage loan requested by a financial institution or other lender on behalf of a borrower to finance or refinance the purchase of a multifamily property. Multifamily CDS is a centralized data storage system for multifamily-related activities. We get data from different sources in order to build an ODS between the source and target.
We call this the CDS. The CDS is developed using industry-standard software, systems, and data requirements, and will be adaptable for use by other market participants in the future. The EDI built will host all the business events/activities and vend the data in MISMO format.
Responsibilities:
- Followed the Agile development process with 4-week sprint cycles and used JIRA to track tasks
- Involved in translating business requirements into technical requirements and creating detailed design deliverables
- Worked on Informatica PowerCenter tools: Source Analyzer, Data Warehousing Designer, Mapping Designer, Mapplet Designer, and Transformation Designer
- Worked with various transformations, including Router, Update Strategy, Expression, Lookup, Sequence Generator, Aggregator, Sorter, and Filter
- Built reusable transformations and mapplets wherever redundancy appeared
- Extensively used the Debugger in PowerCenter to investigate, diagnose, isolate, and resolve issues
- Involved in dimensional modeling (star schema) of the data warehouse and used Erwin to design the business process, dimensions, and measured facts
- Wrote UNIX shell scripts and pmcmd commands for FTP of files from remote servers and for backing up repositories and folders
- Developed Autosys JIL scripts (box, command, and file watcher) to schedule the Informatica jobs and added dependencies per project requirements
- Participated in code reviews and helped tune mappings to increase performance
- Tuned Informatica session performance by increasing block size, data cache size, sequence buffer length, and the target-based commit interval, and tuned mappings by dropping and recreating indexes
- Created mapping documents to outline data flow from sources to targets
- Performed unit testing and maintained test logs and test cases for all mappings
- Validated/modified XML files using XMLSpy
- Was on-call 24x7 on a rotation basis to help troubleshoot and fix production issues to meet SLAs

Environment: Informatica 9.6/9.1, Informatica Data Quality, Oracle 11g/10g, Toad for Oracle 12.8, DB2, WinSCP, PuTTY, SharePoint, HP Quality Center, Autosys, PVCS, XML, XML Schema (XSD), XMLSpy, Erwin, Windows, UNIX

Informatica Developer, JSMN International Inc (Jun 2013 - Sep 2017)
Client: Rail Inc., Cary, NC
Project: Car Data Management System

The Car Data Management System's Operational Data Store (ODS) environment is an architectural storage environment primarily designed to decouple data from dependent source applications. An ODS can provide an agile environment that integrates and stores transactional data from multiple sources to address key Rail Inc requirements for transactional processing, data analysis, and reporting. Integrating the data often requires cleansing, resolving redundancy, and checking against business rules to ensure integrity. In contrast to a data warehouse, an ODS is designed to contain low-level or atomic data with limited history that is captured and shared in "real time". Finally, an ODS is also intended to address enterprise operational data requirements.
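The cleansing and business-rule checks described above can be sketched in Python as follows. This is a minimal illustration only, not code from the project; the record fields and rule names are hypothetical.

```python
# Minimal sketch of record cleansing and business-rule validation
# before loading into an ODS staging area, with invalid rows routed
# to a reject set. Field names and rules are hypothetical examples.

def cleanse(record):
    """Trim whitespace in string fields and normalize empty strings to None."""
    return {k: (v.strip() or None) if isinstance(v, str) else v
            for k, v in record.items()}

# Each rule is (name, predicate); a record violates a rule when the
# predicate returns False.
RULES = [
    ("car_id required", lambda r: r["car_id"] is not None),
    ("weight positive", lambda r: r["weight_tons"] is None or r["weight_tons"] > 0),
]

def validate(record):
    """Return the names of the rules the record violates."""
    return [name for name, check in RULES if not check(record)]

def split_batch(batch):
    """Cleanse a batch; route valid rows to load, invalid rows to reject."""
    load, reject = [], []
    for raw in batch:
        rec = cleanse(raw)
        errors = validate(rec)
        (reject if errors else load).append((rec, errors))
    return load, reject
```

In the actual ODS loads this kind of routing would be done inside the Informatica mappings (e.g. with Expression, Router, and Update Strategy transformations) rather than in standalone scripts.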
Responsibilities:
- Responsible for business analysis and requirements collection
- Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor
- Involved in building the ODS architecture and source-to-target mappings to load data into the data warehouse
- Extensively used transformations such as Filter, Aggregator, Joiner, Expression, Lookup, Router, Sequence Generator, and Update Strategy
- Involved in designing mapplets according to business needs for reusability
- Created post-session UNIX scripts to perform operations such as gunzip, remove, more, and touch
- Strong knowledge of data warehouse concepts
- Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse
- Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager
- Developed mappings to load staging tables and then dimensions and facts
- Developed SCD Type II mappings for master table transformations
- Worked on different tasks in workflows: sessions, event raise, event wait, decision, e-mail, command, worklets, assignment, timer, and workflow scheduling
- Used the Debugger to validate transformations by creating breakpoints to analyze and monitor data flow
- Validated and loaded rejected records into the relevant tables and sent them to users for resubmission
- Involved in performance tuning at the source, target, mapping, session, and system levels
- Identified data quality issues and their root causes
- Was on-call 24x7 on a rotation basis to help troubleshoot and fix production issues to meet SLAs
- Experience with the job scheduling tool Autosys

Environment: Informatica 9.6/9.1, Oracle 11g/10g, DB2, DbVisualizer, WinSCP, PuTTY, HP Quality Center, Autosys, PVCS, XML, XML Schema (XSD), XMLSpy, Erwin, Windows, UNIX

-----------------------
Candidate's Name
Phone: PHONE NUMBER AVAILABLE
Email: EMAIL AVAILABLE
