Ms. Candidate's Name
EMAIL AVAILABLE
Mobile: PHONE NUMBER AVAILABLE
Informatica/Talend Developer
7+ years of IT experience in Analysis, Design, Development, Implementation, Testing and
Support of Data Warehousing and Data Integration Solutions using Informatica PowerCenter.
Sun Certified Java Programmer.
5+ years of experience using Informatica PowerCenter (7.1.3/8.6.1/9.0.1).
1+ years of experience in the reporting tool Cognos (8.4).
1+ year of experience in Talend Open Studio and Talend Integration Suite.
Knowledge in Full Life Cycle development of Data Warehousing.
Extensively developed ETL programs supporting data extraction, transformation and
loading using Informatica PowerCenter.
Experience with dimensional modeling using star schema and snowflake models.
Understand business rules completely based on high-level document specifications and
implement the data transformation methodologies.
Created UNIX shell scripts to run Informatica workflows and control the ETL flow.
Developed OLAP applications using Cognos 8 BI (Framework Manager, Cognos Connection,
Report Studio, Query Studio, and Analysis Studio) and extracted data from the enterprise data
warehouse to support analytics and reporting for corporate business units.
Strong grounding in relational database design concepts.
Extensively worked with Informatica performance tuning involving source level, target level and
map level bottlenecks.
Excellent experience working with Talend ETL, using features such as context variables,
database components like tMSSQLInput and tOracleOutput, tMap, file components such as
tFileCopy, tFileCompare and tFileExist, and ELT components.
Strong business understanding of verticals like Banking, Brokerage, Insurance, Mutual funds and
Pharmaceuticals.
Independently perform complex troubleshooting, root-cause analysis and solution development.
Ability to meet deadlines and handle multiple tasks, decisive with strong leadership qualities,
flexible in work schedules and possess good communication skills.
Team player, Motivated, able to grasp things quickly with analytical and problem solving skills.
Strong technical, oral, and written communication skills.
PROFESSIONAL EXPERIENCE:
Client: PAREXEL International, Boston, MA
Role: ETL Consultant Duration: Feb 2015 to date
PAREXEL International Corporation is a leading global biopharmaceutical services organization, providing
a broad range of expertise-based contract research, consulting, medical communications, and technology
solutions and services to the worldwide pharmaceutical, biotechnology and medical device industries.
The SIMS data feed project loads data from the staging area into the data warehouse to feed
dimension and fact tables for several subject areas: Subjects, Medical Imaging Data, Queries, and
Contacts.
Responsibilities:
Worked with Business Analysts and stakeholders to understand the business requirements.
Designed blueprint and approach documents for high-level development and design of the
ETL applications.
Developed complex ETL mappings to load dimension and fact tables.
Worked with type 1 and type 2 dimensional mappings to load data from source to target.
Worked on agile methodology and JIRA based stories.
Designed blueprint and approach documents for each story and obtained approval from the
business analysts before designing the ETL mappings.
Developed mappings, sessions and workflows and ran the jobs.
Used transformations such as Source Qualifier, Normalizer, Expression, Filter, Router,
Update Strategy, Sorter, Lookup, Aggregator, Joiner and Sequence Generator in the
mappings.
Used reusable transformations from the shared folder to implement the same logic across
different mappings.
Performed unit testing, prepared unit test documents and uploaded them to JIRA for review.
Worked with the testing team to resolve issues found while testing the ETL applications.
Conducted meetings with the testing team to explain the business requirements and how to
test the applications from business and technical perspectives.
Involved in moving the ETL jobs from development to test environment.
Environment: Informatica 9.5.1, Oracle 11g, Unix
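Several bullets above describe kicking off Informatica workflows from UNIX shell scripts. A minimal sketch of such a wrapper around the pmcmd utility follows; the integration service, folder and workflow names are hypothetical, not taken from any engagement:

```shell
#!/bin/sh
# Minimal sketch of a workflow wrapper script; all names below
# (integration service, folder, workflow) are hypothetical.

run_workflow() {
    # $1 = repository folder, $2 = workflow name
    args="startworkflow -sv INT_SVC_DEV -f $1 -wait $2"
    if command -v pmcmd >/dev/null 2>&1; then
        # -wait blocks until the workflow finishes, so the script's
        # exit code reflects workflow success or failure.
        pmcmd $args
    else
        # Without an Informatica client installed, just report the command.
        echo "pmcmd $args"
    fi
}

run_workflow SIMS_DWH wf_load_dim_fact
```

Using -wait lets a scheduler treat the whole script as one job whose exit status mirrors the workflow outcome.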
Client: JPMorgan Chase & Co, Tampa, FL
Role: ETL Consultant Duration: May 2014 to Jan 2015
Description: JPMorgan Chase & Co is a leading financial institution in the United States. Control Room
Risk Assessment Data Analytics & Reporting (RADAR) is a project for risk analysis and reporting that
protects the security and confidentiality of information, providing an organized, effective and compliant
approach to responding to any potential breach of JPMorgan Chase information.
Data Privacy was one of the contexts onboarded into the Control Room RADAR portal application. RADAR
receives manual .xlsx files from the GLASS source system; using Informatica, the files are extracted,
transformed and loaded into the data warehouse. The data is then available for dashboard and reporting
requirements in the RADAR application portal for end users.
Responsibilities:
Worked with Business Analysts to understand the business requirements.
Designed functional specification documentation, following previous generic framework designs,
for all onboarding contexts in the RADAR application portal.
Studied the metadata-driven framework design and process workflows.
Involved in design and development of complex ETL mappings.
Created context specific views on top of fact tables depending on Reporting Requirements.
Validated files uploaded through the portal and loaded them into staging, dimension and fact tables.
Coordinated with offshore team to develop the context specific applications.
Verified unit test case scenarios, performance issues and defect fixes in the SIT and UAT
environments.
Coordinated with the support team to deploy the code in QA, UAT and Prod and fixed issues
during the deployment phase.
Environment: Informatica 9.5.1, Oracle 11g, Unix, Autosys, Cognos, QlikView
Client: CenturyLink, Gardner, KS
Role: Informatica Developer Duration: August 2012 to May 2014
Description:
CenturyLink, Inc. is a multinational communications company headquartered in Monroe, Louisiana. It
provides communications and data services to residential, business, governmental and wholesale
customers, and began doing business as CenturyLink after acquiring Embarq and Qwest. The Network
Inventory Consolidation Program is the conversion project for the Qwest acquisition: the Informatica ETL
tool pulls data from the CIRAS and MetaSolv source systems, transforms it, and loads it into
TGT_FILE_TABLES, which have the structure of the TIF files generated and loaded into the TIRKS target
systems.
Responsibilities:
Designed, developed and documented the ETL (Extract, Transform & Load) strategy to
populate the data warehouse from the various source systems.
Worked on Informatica 9.0.1 client tools like Source Analyzer, Warehouse Designer, Mapping
Designer, Workflow Manager and Workflow Monitor.
Involved in design and development of complex ETL mappings.
Implemented partitioning and bulk loads for loading large volume of data.
Worked on dimensional modeling through its three phases: conceptual, logical and physical
modeling.
Based on the requirements, used various transformations like Source Qualifier, Normalizer,
Expression, Filter, Router, Update strategy, Sorter, Lookup, Aggregator, Joiner and Sequence Generator in
the mapping.
Developed Mapplets, Worklets and Reusable Transformations for reusability.
Identified performance bottlenecks and Involved in performance tuning of sources, targets,
mappings, transformations and sessions to optimize session performance.
Identified bugs in existing mappings/workflows by analyzing the data flow and evaluating
transformations.
Monitored jobs in work flow monitor to resolve critical issues.
Experienced in release management and deployment of code.
Tuned performance using session partitions, dynamic cache memory, and index caches.
Developed Informatica SCD Type I and Type II mappings. Extensively used almost all Informatica
transformations, including complex Lookups, Stored Procedures, Update Strategy, Mapplets
and others.
Implemented update strategies, incremental loads, change data capture and incremental aggregation.
Extensively worked on various Look up Caches like Static, Dynamic, Persistent, and Shared
Caches.
Developed workflow tasks like Email, Event wait, Event Raise, Timer, Command and Decision.
Created Stored Procedures in PL/SQL.
Used the PMCMD command to start, stop and ping the server from UNIX, and created UNIX shell
and Perl scripts to automate the process.
Created UNIX shell scripts and called them as pre-session and post-session commands.
Developed Documentation for all the routines (Mappings, Sessions and Workflows).
Involved in the design and development of business requirements in liaison with business users
and technical teams, gathering requirement specification documents and identifying data sources
and targets.
Analyzed application requirements, provided recommended designs and studied the current
system to understand the existing data structures.
Participated actively in user meetings and collected requirements from users.
Used Informatica PowerCenter for extraction, transformation and loading (ETL) of source data from
heterogeneous database sources like Oracle, SQL Server and flat files.
Designed and developed a number of complex mappings using various transformations like
Source Qualifier, Aggregator, Router, Joiner, Union, Expression, Lookup (Connected &
unconnected), Filter, Update Strategy, Stored Procedure, Sequence Generator and used
reusable transformations as well as mapplets.
Worked with Workflow Manager to create various tasks like Worklets, Sessions,
Batches, Event Wait, Email notifications and Decision, and to schedule jobs.
Extensively used the Slowly Changing Dimensions-Type II in various data mappings to load
dimension tables in Data warehouse.
Administered the repository by creating folders and logins for the group members and assigning
necessary privileges using Informatica Repository Manager.
Was responsible for version control and upgrade activities.
Involved in the creation of partitions in Mapping to improve the performance of Informatica
sessions.
Involved in extensive performance tuning by determining bottlenecks using Debugger at various
points like targets, sources, mappings, sessions or system. This led to better session
performance.
Wrote UNIX shell scripts and used the pmcmd command-line utility to automate batches and sessions
via the ESP Workload manager.
Involved in production scheduling to set up jobs in order and provided 24x7 production support.
Involved in troubleshooting and resuming failed jobs.
Environment: Informatica PowerCenter 9.0.1/8.6.1, PuTTY, MS SQL Server
2005/2008, Oracle 10g/9i, SQL, PL/SQL, Shell Scripts, Windows XP, Unix.
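Pre-session shell scripts of the kind mentioned above often boil down to checking that a source file has arrived and is non-empty before the session starts. A hedged sketch follows; the pipe-delimited layout with a single header row is an assumption, not a detail from the projects above:

```shell
#!/bin/sh
# Illustrative pre-session check; the file layout (one header line,
# pipe-delimited) and path are assumptions for the example.

check_source() {
    # $1 = full path to the expected source file
    if [ ! -s "$1" ]; then
        echo "MISSING"
        return 1
    fi
    # Report the number of data rows, excluding the header line.
    echo $(( $(wc -l < "$1") - 1 ))
}

# Usage sketch with a throwaway file:
printf 'id|name\n1|a\n2|b\n' > /tmp/src_feed.dat
check_source /tmp/src_feed.dat   # prints 2
```

Returning a non-zero status on a missing file lets the Informatica session fail fast instead of loading an empty target.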
Client: Humana Inc, Louisville, KY Duration: October 2011 to July 2012
Role: ETL Informatica Developer
Description:
Humana Inc., headquartered in Louisville, Kentucky, is a leading health care company that offers a wide
range of insurance products and health wellness services. Humana provides Medicare Advantage plans
and prescription drug coverage to more than 3.5 million members throughout the US.
The main objective of this Shared Data Repository project is to capture new Vitality program
customer data, policies, group policies, and HumanaOne and non-HumanaOne Medicare plans.
Data comes from various sources such as SQL Server and Mainframe, and is loaded into the EDW
at different frequencies as per the requirements. The entire ETL process consists of source
systems, a staging area, the data warehouse and data marts.
Responsibilities:
Developed ETL programs using Informatica to implement the business
requirements.
Communicated with business customers to discuss the issues and requirements.
Created shell scripts to fine tune the ETL flow of the Informatica workflows.
Used Informatica file watch events to poll the FTP sites for the external
mainframe files.
Provided production support to resolve ongoing issues and
troubleshoot problems.
Performed performance tuning at the functional level and map level, and used
relational SQL wherever possible to minimize data transfer over the network.
Effectively used Informatica parameter files for defining mapping variables,
workflow variables, FTP connections and relational connections.
Involved in enhancements and maintenance activities of the data warehouse including tuning,
modifying of stored procedures for code enhancements.
Effectively worked in Informatica version based environment and used deployment groups to
migrate the objects.
Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating
transformations.
Effectively worked on Onsite and Offshore work model.
Used pre- and post-session assignment variables to pass variable values from one
session to another.
Designed workflows with many sessions using Decision, Assignment, Event Wait and Event
Raise tasks, and used the Informatica scheduler to schedule jobs.
Reviewed and analyzed functional requirements and mapping documents; performed problem
solving and troubleshooting.
Performed unit testing at various levels of the ETL and actively involved in team code reviews.
Identified problems in existing production data and developed one time scripts to correct them.
Fixed invalid mappings and troubleshot technical problems in the database.
Environment: Informatica 8.6.1, SQL Server 2008 R2, Linux
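The file-watch behaviour described above (polling for external mainframe files) can be approximated in plain shell. The path, retry count and interval below are illustrative only:

```shell
#!/bin/sh
# Sketch of a file-watch loop similar in spirit to the Informatica
# file-wait events mentioned above; all parameters here are made up.

wait_for_file() {
    # $1 = path to watch, $2 = max attempts, $3 = seconds between attempts
    attempts=0
    while [ "$attempts" -lt "$2" ]; do
        if [ -f "$1" ]; then
            echo "FOUND"
            return 0
        fi
        attempts=$((attempts + 1))
        sleep "$3"
    done
    echo "TIMEOUT"
    return 1
}

touch /tmp/ext_mainframe.dat
wait_for_file /tmp/ext_mainframe.dat 3 1   # prints FOUND
```

A caller can branch on the exit status to either start the downstream workflow or raise an alert after the timeout.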
Client: Allstate, Chicago, IL Duration: August 2010 to Sep 2011
Role: ETL Talend Developer
Description:
Allstate is one of the fastest growing auto, property and life insurance companies. It serves its customers
by offering a range of innovative products to individual and group customers at more than 600 locations
through its company-owned offices.
The primary objective of this project is to capture customer, policy, claims, agent, product and financial
data from multiple OLTP systems and flat files. Data was extracted, transformed and loaded into the data
warehouse using Talend, and various reports were generated on a daily, weekly, monthly and yearly
basis. These reports give details of the various Allstate insurance products that are sold, and are used
for identifying agents for rewards, awards and performance, and for risk analysis reports for business
development managers.
Responsibilities:
Worked with Data Mapping Team to understand the source to target mapping rules.
Analyzed the requirements and framed the business logic for the ETL process using Talend.
Involved in the ETL design and its documentation.
Developed jobs in Talend Enterprise Edition from stage to source, intermediate, conversion and
target.
Worked on Talend ETL to load data from various sources into the Oracle DB. Used tMap, tReplicate,
tFilterRow, tSort, tWaitForFile and various other features in Talend.
Worked on Talend ETL and used features such as context variables, database components like
tMSSQLInput and tOracleOutput, file components, ELT components, etc.
Followed the organization defined Naming conventions for naming the Flat file structure, Talend
Jobs and daily batches for executing the Talend Jobs.
Worked on context variables and defined contexts for database connections and file paths,
making it easy to migrate to different environments in a project.
Implemented error handling in Talend to validate data integrity and data completeness for the
data from the flat files.
Tuned sources, targets and jobs to improve performance.
Used Talend components such as tMap, tFileExist, tFileCompare, tELTAggregate, tOracleInput,
tOracleOutput, etc.
Participated in weekly end-user meetings to discuss data quality, performance issues, ways to
improve data accuracy, new requirements, etc.
Involved in migrating objects from DEV to QA and testing them and then promoting to Production.
Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load)
processes using Talend Integration Suite.
Involved in automating the FTP process in Talend and FTPing the files on Unix.
Created Talend development standards: a document describing general guidelines for Talend
developers, the naming conventions to be used in transformations, and the development and
production environment structures.
Extracted data from Oracle as one of the source databases.
Optimized the performance of the mappings through various tests on sources, targets and
transformations.
Environment: Talend 5.1.1, Oracle 11g, UNIX
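The context variables described above are typically switched between environments at run time: exported Talend jobs ship with a generated launcher script that accepts --context and --context_param overrides. The job path, context name and parameter below are hypothetical:

```shell
#!/bin/sh
# Sketch of how an exported Talend job might be launched from a daily
# batch script; the launcher path and context parameter are made-up
# examples, not from any real project.

build_talend_cmd() {
    # $1 = path to the generated launcher, $2 = context name (e.g. Dev, Prod)
    echo "$1 --context=$2 --context_param db_host=stage-db"
}

build_talend_cmd /opt/jobs/j_load_claims/j_load_claims_run.sh Prod
```

Building the command in one place keeps the daily batch scripts consistent across jobs and environments.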
Client: Adobe Inc, San Jose, CA Duration: April 2010 to August 2010
Role: ETL Developer
Description:
This position required implementing a data warehouse for forecasting, marketing and sales
performance reports. The data is obtained from relational tables and flat files. I was
involved in cleansing and transforming the data in the staging area and then loading it into
Oracle data marts. These data marts and the data warehouse form an integrated data mine
that provides the feed for extensive reporting.
Responsibilities:
Used Informatica PowerCenter for extraction, transformation and loading (ETL) of data from
heterogeneous source systems into the target database.
Created mappings using Designer and extracted data from various sources, transformed data
according to the requirement.
Involved in extracting the data from the Flat Files and Relational databases into staging area.
Migrated mappings, sessions and workflows from Development to Test and then to the UAT environment.
Developed Informatica Mappings and Reusable Transformations to facilitate timely Loading of
Data of a star schema.
Developed the Informatica mappings using Aggregators, SQL overrides in Lookups, source
filters in Source Qualifiers, and data flow management into multiple targets using Routers.
Created sessions, extracted data from various sources, transformed the data according to the
requirements and loaded it into the data warehouse.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy,
Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
Imported various heterogeneous files using Informatica Power Center 8.x Source Analyzer.
Developed several reusable transformations and mapplets that were used in other mappings.
Prepared Technical Design documents and Test cases.
Involved in unit testing and the resolution of various bottlenecks encountered.
Implemented various Performance Tuning techniques.
Environment:
Informatica 8.1.1 Power Center, Oracle 9i, Windows NT.
Client: PRUDENTIAL Financial, Inc., Newark, NJ Duration: Oct 2009 to March 2010
Role: ETL Informatica Developer
Description:
Prudential Financial companies serve individual and institutional customers worldwide and include The
Prudential Insurance Company of America, one of the largest life insurance companies in the U.S. These
companies offer a variety of products and services, including mutual funds, annuities, real estate
brokerage franchises, relocation services, and more. Involved in the development and implementation of
goals, policies, priorities, and procedures relating to financial management, budget, and accounting.
Analyzed monthly actual results versus plan and forecast.
Responsibilities:
Involved in design, development and maintenance of database for Data warehouse project.
Involved in Business Users Meetings to understand their requirements.
Designed, developed and supported the Extraction, Transformation and Load (ETL) process for data
migration with Informatica 7.x.
Developed various mappings using Mapping Designer and worked with Aggregator, Lookup,
Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence
Generator transformations.
Created complex mappings which involved Slowly Changing Dimensions, implementation of
Business Logic and capturing the deleted records in the source systems.
Worked extensively with the connected lookup Transformations using dynamic cache.
Worked with complex mappings having an average of 15 transformations.
Created and scheduled sessions and jobs to run on demand, on time, or only once.
Monitored Workflows and Sessions using Workflow Monitor.
Performed Unit testing, Integration testing and System testing of Informatica mappings.
Coded PL/SQL scripts.
Wrote UNIX and Perl scripts for business needs.
Coded UNIX scripts to capture data from different relational systems into flat files to use as
source files for the ETL process.
Created Universes and generated reports using the star schema.
Environment: Informatica PowerCenter 7.1.3, Oracle, UNIX
Client: Oakwood Healthcare System, Dearborn, MI Duration: Nov 2008 to Oct 2009
Role: Cognos Developer
Description:
The Oakwood Healthcare System serves 35 different communities in southeastern Michigan with over 40
primary and secondary care locations. Responsibilities include working with the clinical analytics team on
the measurement of provider performance, quality improvement initiatives, and various ad-hoc requests.
The reports are created, distributed and published using various Cognos BI tools like ReportNet,
Impromptu, Power Play, IWR, and UpFront to the end-users. The application had OLAP features like Drill
Down analysis, Multidimensional analysis, Prompts, Exception Highlighting and User Privileges.
Responsibilities:
Developed models in Framework Manager.
Published packages and managed the distribution / setup of the environment.
Used Query Studio for creating Ad-hoc Reports.
Created complex and multi-page reports using Report Studio.
Performed migration from Impromptu to ReportNet.
Used Schedule Management in Cognos Connection.
Performed Bursting Reports and Multilingual Reports using Report Studio.
Developed Layout, Pages, Object Containers and Packages using Report Studio.
Created reports using ReportNet with multiple Charts and Reports.
Responsible for assigning user Sign-Ons for the new users.
Provided guidance to report creators for enhancement opportunities.
Created Multidimensional Cubes using PowerPlay and published on the UpFront Portal using
PowerPlay Enterprise Server.
Developed PowerPlay Cubes, used multiple queries, calculated measures, customized cube content
and optimized cube creation time.
Fine-tuned the Cubes and checked the database space issue and cube growth periodically.
Responsible for the creation of new User Groups and User Classes using Access Manager.
Environment: Cognos BI (Framework Manager, Cognos Connection, Report Studio, Query Studio),
SQL Server 2005.
TECHNICAL SKILLS
Operating Systems: Windows, Linux, HP-UX
Software / Applications: Windows XP, Windows 2000, MS Word, MS Excel, MS Access,
Outlook, PowerPoint
Database: SQL Server 2008/2005/2000, Oracle 10g/9i/8i
ETL: Informatica PowerCenter 7.1.3/8.6.1/9.0.1,
Informatica Power Exchange 8.6.1, Talend 5.1.1
Modeling: Framework Manager, PowerPlay Transformer
OLAP/BI Tools: Cognos 8 Series
Languages: Java, HTML, XML, SQL, PL/SQL.
Web/App Servers: IBM WebSphere 4.x, Sun iPlanet Server 6.0, IIS, Tomcat
Tools: TOAD, Visio, Eclipse
Education:
Bachelor of Science (Osmania University)
Master Diploma in Computer Applications (Osmania University)
Certifications:
SCJP (Sun Certified Java Programmer)