Candidate's Name
ETL Developer Lead
Bloomington, IL | PHONE NUMBER AVAILABLE | EMAIL AVAILABLE
Work Permit: EAD | https://LINKEDIN LINK AVAILABLE


PROFESSIONAL SUMMARY
- 13 years of IT experience in analysis, design, development, data modeling, and implementation of client/server, data warehouse/data mart, and business intelligence solutions in the insurance, manufacturing, and healthcare domains
- Over 11 years of ETL and data integration experience developing ETL mappings and scripts using Informatica PowerCenter 10.x/9.x/8.x/7.x/6.x
- 4+ years of experience working in Informatica Cloud (IICS) and 2 years in IDQ
- Dimensional data modeling experience using the Sybase PowerDesigner modeling tool
- Strong experience with business requirement analysis, source system analysis, and data analysis
- Involved in all phases of the data warehouse project life cycle; designed and developed ETL architecture to load data from/to sources and targets such as DB2, Oracle, flat files, XML files, Teradata, Sybase, MS SQL Server, SAP (IDocs), AWS S3, and Azure DB
- Hands-on experience tuning Informatica mappings, identifying and resolving performance bottlenecks at the source, target, mapping, and session levels
- Hands-on experience scheduling jobs through Autosys and Tivoli
- Hands-on writing UNIX shell, Autosys JIL, and PERL scripts
- Hands-on experience maintaining code versions using Visual SourceSafe
- Hands-on work in Informatica Master Data Management (MDM)
- Proficient in writing complex Oracle SQL and views for database applications
- Hands-on reading from and writing to AWS S3 and Snowflake; hands-on creating and accessing IAM users (a brief sketch of this pattern follows this summary)
- Completed a POC in AWS Glue/S3/Athena
- Completed a POC in Microsoft Azure Data Factory
- Good knowledge of AWS cloud services: S3, EC2, VPC, Elastic Beanstalk, CloudTrail, Lambda, API, RDS, Glue, Athena, Snowflake, Redshift
- Good knowledge of the reporting tools Business Objects XI, Crystal Reports, and Power BI
- Good verbal and written communication skills; proven to be highly effective in interfacing across business and technical groups
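Illustrative only: a minimal sketch of the S3-to-Snowflake loading pattern mentioned above, assuming the boto3 and snowflake-connector-python packages; the bucket, external stage, table, and connection parameters are hypothetical placeholders, not details from any client project.

    import boto3
    import snowflake.connector

    # List the day's export files on S3 (bucket and prefix are placeholders).
    s3 = boto3.client("s3")
    listing = s3.list_objects_v2(Bucket="example-etl-bucket", Prefix="exports/policies/")
    files = [obj["Key"] for obj in listing.get("Contents", [])]
    print(f"{len(files)} files staged on S3")

    # Load the staged files into a Snowflake table through an external stage.
    conn = snowflake.connector.connect(
        account="example_account",
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="EDW",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        cur.execute(
            "COPY INTO STAGING.POLICIES "
            "FROM @EXT_S3_STAGE/policies/ "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
        print(cur.fetchall())  # per-file load status returned by COPY INTO
    finally:
        conn.close()

In practice the S3 access behind the external stage is usually granted through an IAM role or storage integration rather than embedded keys, which is where the IAM user/role setup noted above comes in.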
EDUCATION & CERTIFICATION
- Master of Computer Applications, 2004, University of Madras, Tamil Nadu, India
- Bachelor's degree in Physics, 2001, University of Madras, India
- Informatica PowerCenter Certified Designer
- Preparing for the AWS Data Engineer Associate certification
TECHNICAL SKILLS

OS: UNIX, AIX 5.2, Windows 7/XP/2000/NT
ETL/Reporting: DTS in SQL Server/SSIS, Informatica PowerCenter 10.x/9.6/8.x/7.x, Business Objects XI, Tableau, MicroStrategy, Microsoft Power BI, Informatica IDQ/MDM
Database: Oracle, DB2, Teradata/Netezza, MS SQL Server 2000/2005, AWS Redshift, Snowflake
Data Modeling: Sybase PowerDesigner 15.2
SQL Editor: PL/SQL Developer, TOAD for DB2
Scripts: UNIX Shell, PERL, Autosys JIL, JavaScript, Python
Scheduler: Autosys, Jenkins
Config & Mgmt: Visual SourceSafe, Quality Center
Web Applications: ASP, VB.NET & ASP.NET, PHP, HTML
Cloud: Informatica Cloud (IICS), AWS, Snowflake

WORK SUMMARY

Cognizant Technology Solutions (Client: Mercury Insurance), USA (Remote)
ETL Tech Lead
Oct 2021 to 6 March 2024
Emp ID: 671678

Roles and Responsibilities
- Automated the existing manual Escheatment process using Informatica IICS and a Snowflake database
- Report data analysis: generated monthly/quarterly reports by writing SQL scripts (Power BI); see the sketch after this role
- Incident and defect analysis using SQL queries
- Supported the Financial Hub running on Informatica PowerCenter

Environment: Informatica PowerCenter 10.x, Informatica IICS, Snowflake, AWS Redshift/S3, MS SQL Server, DB2, Tivoli Scheduler, UNIX, Jenkins, Jira, Microsoft Power BI
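A minimal sketch of the kind of monthly reporting extract described above, assuming a Snowflake connection via snowflake-connector-python; the EDW.CLAIMS.INCIDENTS table, columns, and credentials are hypothetical stand-ins, not the client's actual schema.

    import csv
    import snowflake.connector

    # Rolling twelve-month incident counts by line of business.
    MONTHLY_SQL = """
        SELECT DATE_TRUNC('MONTH', reported_date) AS report_month,
               line_of_business,
               COUNT(*)                           AS incident_count
        FROM   EDW.CLAIMS.INCIDENTS
        WHERE  reported_date >= DATEADD(MONTH, -12, CURRENT_DATE)
        GROUP  BY 1, 2
        ORDER  BY 1, 2
    """

    conn = snowflake.connector.connect(
        account="example_account", user="report_user", password="***",
        warehouse="REPORT_WH", database="EDW", schema="CLAIMS",
    )
    try:
        cur = conn.cursor()
        cur.execute(MONTHLY_SQL)
        # Write a flat extract that Power BI (or any reporting tool) can refresh from.
        with open("monthly_incidents.csv", "w", newline="") as fh:
            writer = csv.writer(fh)
            writer.writerow([col[0] for col in cur.description])
            writer.writerows(cur)
    finally:
        conn.close()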
Infosys (Northwestern Mutual, Wisconsin, USA)
ETL Tech Consultant
Jan 2020 to Dec 2020

Roles & Responsibilities
- Supported the General Ledger application
- Data analysis, data modeling, development, and data cleansing for the financial application
- Migrated Informatica PowerCenter mappings to Informatica IICS; debugged and modified mappings in IICS
- Supported areas of Property & Casualty insurance

Environment: Informatica PowerCenter 10.x, Informatica Intelligent Cloud Services (IICS), IDQ, Oracle, UNIX, Autosys, PeopleSoft, Sybase PowerDesigner, Jira

CVS Pharma, Buffalo Grove, IL
ETL Informatica Developer
Mar 2018 to July 2019

Roles & Responsibilities
- Data analyst role, understanding the Medicare/Medicaid processes and flows of the CVS Retail Data Warehouse
- Ran reports using SQL scripts
- Collaborated with the core technical team and outside vendors on implementation
- Involved in a POC developing and testing new functionality
- Involved in the migration of Informatica workflows from version 9.6 to version 10.2
- Performed detailed data analysis to identify the root cause of production issues and provided permanent solutions in the schedule
- Involved in creating, testing, and deploying ETL mappings in Informatica PowerCenter
- Coordinated with business users, SMEs, and process owners to complete the activities of the SDLC phases

Environment: Informatica PowerCenter 10.x/9.6, Oracle 11g, UNIX, Flat Files, Control-M

Zurich North America, Schaumburg, IL (contractor from Cognizant Technology Solutions, USA)
Data Analyst/ETL Developer
Dec 2017 to Mar 2018

Responsibilities
- Involved in data analysis and code change analysis as part of change requests
- Implemented ETL code, shell script, SQL, and Autosys JIL script changes per the change requests
- Coordinated with the offshore team on change requests and unit tested the ETL code delivered by offshore
- Gave daily status updates to the business on code changes and testing

Environment: Informatica PowerCenter 9.6.1, AQT for Netezza, AQT for DB2, UNIX, Autosys

Deere & Company, Moline, IL (IBM India Pvt Ltd)
Data Analyst/ETL Developer
Jan 2011 to Jan 2014

Responsibilities
- Involved in translating business requirements into technical specifications
- Involved in the full project life cycle, from analysis to production implementation and support
- Created pre- and post-session stored procedures to drop and recreate the indexes and keys of source and target tables
- Worked as a data modeler, using Sybase PowerDesigner to design the tables needed for the dashboard project; developed the high-level design to map the data from the sources into the data mart tables
- Developed the source-to-target mapping document for ETL development in Informatica PowerCenter
- Performed system testing and entered test cases in Quality Center to ensure data is loaded and mapped correctly to the data warehouse/data mart models according to the source-to-target mapping document
- Assisted business analysts and users in identifying data gaps and data variances
- Worked with source system SMEs to identify the source data to be brought into the data mart
- Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functional requirements
- Created and executed unit test plans based on system and validation requirements; troubleshot, optimized, and tuned ETL processes
- Applied the Teradata utilities MultiLoad and FastLoad and worked on BTEQ scripts (see the sketch after this role)
- Used Informatica Designer to design transformations, reusable transformations, mappings, and mapplets
- Updated the data dictionary with system data and mapping information
- Responsible for creating interactive dashboard reports per the business requirements
- Documented and maintained the process and quality of the system

Environment: Sybase PowerDesigner 15.2, Informatica PowerCenter 9.1, DB2 9, Toad for DB2 4.5, Teradata 14.0, Flat Files, Quality Center, UNIX, and ESP
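Illustrative only: a small sketch of driving a BTEQ step from Python, assuming the Teradata bteq utility is installed and on the PATH; the logon string, schemas, and table names are hypothetical, and the actual loads in this role were built with MultiLoad/FastLoad and BTEQ scripts rather than this wrapper.

    import subprocess

    # BTEQ commands assembled as text; .LOGON credentials and tables are placeholders.
    BTEQ_SCRIPT = """
    .LOGON tdprod/etl_user,***;
    DELETE FROM STG.DAILY_ORDERS;
    INSERT INTO STG.DAILY_ORDERS
    SELECT order_id, dealer_id, order_dt, order_amt
    FROM   SRC.ORDERS
    WHERE  order_dt = CURRENT_DATE - 1;
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    """

    # bteq reads its commands from stdin; a non-zero return code flags a failed step.
    result = subprocess.run(["bteq"], input=BTEQ_SCRIPT, text=True, capture_output=True)
    print(result.stdout)
    if result.returncode != 0:
        raise RuntimeError(f"BTEQ exited with return code {result.returncode}")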
Pfizer Data Services - S4 Operate, USA (IBM India Pvt Ltd)
Senior ETL Developer (India location)
Aug 2008 to Dec 2010

Responsibilities
- Involved in translating business requirements into technical specifications; performed column mapping and identified the business rules for some of the requirements
- Involved in preparing the Detail Design Document (DDD)
- Developed ETL jobs to extract information from disparate systems such as SAP, DB2, flat files, Oracle, MS SQL, and XML
- Involved in creating stage tables and mappings and setting the trust level (HDD) in Siperian for the Customer Master system
- Performed a source system study to load the Hierarchy table; created multiple mappings to load data, including mainframe data available in a DB2 schema
- Worked extensively on the Margin Controlling system, rated as business-critical, and successfully loaded the multiple tables required for reports
- Extensively used Informatica to load data from DB2 and flat files into the staging and loading areas
- Performed various transformations, aggregations, and ranking routines on the data to be stored in the Data Provisioning Mart
- Used Informatica Designer to design transformations, reusable transformations, mappings, and mapplets
- Developed and modified the existing Hierarchy table to increase the level of granularity for BO reporting requirements
- Used Informatica Workflow Manager to create sessions and workflows for the corresponding mappings
- Conducted unit testing and generated test case reports to ensure the expected business standards
- Documented the mappings and workflows and periodically updated the status of assigned work with the system owners
- Identified performance bottlenecks at the source, target, mapping, and session levels and tuned them, as well as Oracle views, to increase session performance
- Created mappings in Designer to implement all the different types of slowly changing dimensions (see the sketch after this role)
- Involved in code reviews and bug fixes of mappings in production to increase efficiency
- Documented and maintained the process and quality of the system
- Worked with mapping parameters and variables to load data from different sources into the corresponding partitions of the database table

Environment: Informatica 8.1.1/8.6.1, Oracle, SQL Server, shell scripts, Siperian (MDM), Autosys
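The slowly changing dimension logic in this role was implemented as Informatica mappings; as an illustration only, the sketch below expresses the Type 2 idea (expire the old version, insert a new current row) as SQL run from Python with the python-oracledb driver, against hypothetical DIM_CUSTOMER and STG_CUSTOMER tables.

    import oracledb

    # Connection details and the dimension/staging tables are placeholders.
    conn = oracledb.connect(user="etl_user", password="***", dsn="dbhost/ORCLPDB1")
    cur = conn.cursor()

    # Step 1: expire the current version of any customer whose tracked attributes changed.
    cur.execute("""
        UPDATE dim_customer d
        SET    d.current_flag = 'N',
               d.effective_end = TRUNC(SYSDATE) - 1
        WHERE  d.current_flag = 'Y'
        AND    EXISTS (SELECT 1
                       FROM   stg_customer s
                       WHERE  s.customer_id = d.customer_id
                       AND   (s.address <> d.address OR s.segment <> d.segment))
    """)

    # Step 2: insert a fresh current row for changed and brand-new customers.
    cur.execute("""
        INSERT INTO dim_customer
               (customer_id, address, segment, effective_start, effective_end, current_flag)
        SELECT  s.customer_id, s.address, s.segment,
                TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
        FROM    stg_customer s
        WHERE   NOT EXISTS (SELECT 1
                            FROM   dim_customer d
                            WHERE  d.customer_id = s.customer_id
                            AND    d.current_flag = 'Y')
    """)

    conn.commit()
    conn.close()

A Type 1 change would simply overwrite attributes in place, while Type 2, as above, preserves history by versioning rows with effective dates and a current-row flag.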
AXA PPP Healthcare, UK
ETL Informatica Developer (India location)
Aug 2006 to Jun 2008

Responsibilities
- Involved in the ETL design and development of Informatica mappings
- Involved in requirement gathering, analysis, and design for loading the VSAM file into SAP through PowerConnect for SAP
- Prepared unit test plans and system test plans
- Created PERL scripts for unzipping and archiving source files and performing looping operations
- Performed unit testing and delivered the IDoc status report for each file posted to SAP; conducted Informatica PowerConnect (SAP ALE IDoc methodology) training for the team
- Tuned Informatica mappings and sessions for optimum performance
- Executed prototypes for ETL and data quality, demonstrating their capabilities while training the team on the new platform
- Project management activities: adherence to the quality process, responsibility for on-time delivery, and ensuring reusability and continuous learning
- Experience in unit testing and integration testing of different data warehouse processes
- Used Quality Center to document test cases; test cases were executed in Quality Center for each new release

Environment: Informatica PowerCenter 7.1.3, Toad for Oracle, Quality Center, PERL scripts
CarbonOn Tech, India
Software Developer, Web-Based Applications
Feb 2005 to Aug 2006

CarbonOn Tech is a web development and game art/game development outsourcing company based in India.

Responsibilities
- Studied existing functionality
- Involved in database design; created web-based reports using ASP and Windows-based reports using VB.NET
- Developed DTS mappings to extract CSV files into the SQL Server database
- Developed stored procedures
- Performed code migration from Dev to Test and unit tested the modules
- Created a job package using DTS that sends emails periodically via the SQL Server job scheduler
- Imported Excel sheets into the SQL Server database using DTS mappings
- Wrote complex SQL using joins, subqueries, and correlated subqueries (see the sketch below)

Environment: ASP 2.0, SQL Server 2000, IIS, JavaScript, DTS (Microsoft SQL Server ETL tool), ASP.NET 1.1, C#.NET (code-behind), Microsoft Application Blocks (Data Access), Microsoft Visual Studio .NET 2003, VB.NET Windows Forms, Crystal Reports
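For illustration only: a minimal correlated-subquery example of the kind mentioned above, run from Python with the pyodbc package against hypothetical Customers and Orders tables; the connection string and schema are placeholders, not the original application's database.

    import pyodbc

    # Connection string and the Customers/Orders tables are placeholders.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=dbserver;DATABASE=SalesDB;Trusted_Connection=yes;"
    )
    cur = conn.cursor()

    # The inner queries are correlated: they reference c.customer_id from the outer row,
    # so they are evaluated per customer.
    cur.execute("""
        SELECT c.customer_id,
               c.customer_name,
               (SELECT MAX(o.order_date)
                FROM   Orders o
                WHERE  o.customer_id = c.customer_id) AS last_order_date
        FROM   Customers c
        WHERE  EXISTS (SELECT 1
                       FROM   Orders o
                       WHERE  o.customer_id = c.customer_id
                       AND    o.order_amount > 1000)
    """)
    for row in cur.fetchall():
        print(row.customer_id, row.customer_name, row.last_order_date)

    conn.close()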
