Candidate's Name
PHONE NUMBER AVAILABLE
Street Address
EMAIL AVAILABLE
Career Objective
To apply my talent and contribute significantly to the field of technology through focused hard work, innovation, and research, in a role that utilizes my skills and offers opportunities for personal and organizational growth.
Professional summary
Technical Lead with over 14 years of experience in data warehousing, data modeling, ETL development, database development, Unix development, data analysis, and production support, currently employed at TATA Consultancy Services Limited as a contractor to Fifth Third Bank. Expert in SQL, T-SQL, PL/SQL, IBM InfoSphere DataStage 11.7, Erwin, Snowflake, and Unix scripting, with a strong track record of leading complex data migration and database development projects. Known for meticulous attention to detail and the ability to manage multiple high-stakes projects simultaneously, aiming to leverage technical skills to drive further innovation and efficiency.
Employment history
Fifth Third Bank | Technical Lead Developer
Jan 2023 - Present
Lead development projects, enhancing system efficiency by 20%.
Oversee meticulous code reviews to ensure software reliability and performance.
Lead teams in dynamic project environments, driving technological advancements.
Foster team collaboration to tackle complex system upgrades, increasing productivity by 20%.
Streamlined database operations, boosting data retrieval speeds by 15%.
Mentor junior developers, improving team skill levels and project output.
Fifth Third Bank | Production Support and Developer Lead
Feb 2020 - Dec 2022
Fifth Third Bank | Developer Lead
Jan 2019 - Jan 2020
SLK America INC | Data Migration Lead Developer | America
Jun 2017 - Dec 2018
SLK Software Services | ETL/SQL Developer and Data Modeler | Bangalore
May 2010 - Jun 2017
Education
Visvesvaraya Technological University, Belgaum, Karnataka, India | Bachelor of Engineering in Computer Science (B.E.)
Courses
AWS | Solutions Architect - Associate
Snowflake | SNOWPRO
Python
Skills
Requirement Understanding and Analysis
Data Modeling (Conceptual, Logical, Physical)
Data Dictionary
ETL Design
Data Mapping
HLD / LLD / TSD
ETL Development
Database Development
SQL / T-SQL / PL/SQL
Complex Queries
QA / UAT
Production Support
Deployment
Erwin
IBM InfoSphere DataStage 11.7 / 11.5 / 9.1
Snowflake
SSIS
Informatica PowerCenter 9.6.0
Unix Scripts
Additional information
Training (Attended)
Oracle Business Intelligence Enterprise Edition (OBIEE)
IBM Cognos reporting tool
PROFESSIONAL EXPERIENCE
Working as Technical Lead, Developer Lead, Production Support Lead, Data Warehouse Technical Lead, and Data Migration Lead at TATA Consultancy Services Limited, America, from Jan-02-2019 till date.
Working as Technical Lead and Developer Lead at Fifth Third Bank from Jan-01-2023 till date.
Worked as Production Support and Developer Lead at Fifth Third Bank from Feb-01-2020 to Dec-31-2022.
Worked as Developer Lead at Fifth Third Bank from Jan-02-2019 to Jan-31-2020.
Worked as Data Migration Lead Developer at SLK America INC, America, from Jun-24-2017 to Dec-31-2018.
Worked as ETL/SQL Developer and Data Modeler at SLK Software Services, Bangalore, from May-10-2010 to Jun-23-2017.
Summary
Creating data models (conceptual/logical/physical) once requirement analysis is completed, generating the data dictionary, and getting it reviewed by the architect before sharing with the team
Creating the high-level design document once the complete workflow is available for the respective subject area, then working on the low-level design document and reviewing both with architects prior to ETL development
Leading the team through the design phase with complete guidance, such as explaining every piece of the data model and the complete workflow (HLD/LLD) diagrams of the ETL processes
Driving agile stand-up meetings to provide complete details to project managers on behalf of the team, such as updates from data analysts, ETL developers, and the QA team
Leading the ETL development team and ensuring the team follows best practices and guidelines so that jobs load large volumes of data efficiently; maintaining code quality by reviewing the ETL code for every process once development is completed
Ensuring the team prepares the Technical Specification Document, unit test case document, and deployment guide for moving jobs from lower to higher environments; reviewing and providing feedback on these documents and ensuring they are all uploaded to the SharePoint path
Proactively prioritizing multiple tasks for the team, helping the project manager stay aligned with the timeline, and bringing issues to the project manager's notice
Working with multiple teams: the primary focus is the core team, where work is critical to completing the project on time, while another team works on the merger process (additions to the existing architecture); mentoring both teams and providing guidance and direction to bring everyone onto the same page
Consolidating the complete project status into weekly and monthly status reports for client management and presenting them along with the project manager
Reviewing test case execution results, ensuring the required details are available to pass on to the line-of-business or other teams, presenting the results to the line-of-business team, and helping them enhance their testing
Reviewing the deployment guide and working with the scheduler team to ensure jobs are scheduled as per the requirement
Reviewing the standard operating procedure document (run book) for production support and helping the production team maintain the jobs going forward
Extensively worked on DataStage job design, development, scheduling, execution, testing, and defect fixing
Involved in creating the logical and physical design of the data warehouse (fact and dimension tables) using Star Schema and Snowflake Schema approaches
Good experience with agile methodology and with communicating estimations for development efforts to project managers to meet project milestones and deliverables
Proficient in stored procedures, functions, views, and indexes, and in T-SQL performance tuning and query optimization, such as interpreting metrics from the execution plan, building indexes where required, and rewriting queries
Involved in deployment activities such as migrating database DDL scripts and ETL jobs from lower to higher environments
Positive attitude and strong work ethic with good analytical skills
Flexible and adaptable to new technologies and environments; highly result-oriented and driven for success
Production support, including incident management and resolving tickets based on SLAs
Working on new proposals with the business relationship manager
Creating reports using Oracle BI Publisher; maintaining Oracle BI, WebLogic Server, and Oracle Enterprise Manager middleware; worked on a POC project
DR plan creation and execution; participating in disaster recovery testing
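The execution-plan-driven tuning described above can be sketched in miniature. The snippet below uses SQLite via Python purely for illustration (an assumption for demonstration; the actual work described here was in T-SQL and DB2, and table and index names are hypothetical): it shows how reading the plan reveals a full table scan, and how adding an index changes the plan to an index search.

```python
import sqlite3

# Illustrative only: SQLite's EXPLAIN QUERY PLAN plays the role of the
# T-SQL execution plan referenced above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, cust_no TEXT, balance REAL)")
conn.executemany("INSERT INTO account (cust_no, balance) VALUES (?, ?)",
                 [(f"C{i:05d}", i * 10.0) for i in range(1000)])

def plan(sql):
    """Return the plan-detail strings for a statement."""
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT balance FROM account WHERE cust_no = 'C00042'"
before = plan(query)   # no index on cust_no yet: plan reports a table scan
conn.execute("CREATE INDEX ix_account_cust ON account (cust_no)")
after = plan(query)    # same query now uses the index
print(before, after)
```

The same workflow applies in T-SQL: inspect the actual execution plan, and when a frequent predicate causes a scan, either add a covering index or rewrite the query.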
Professional profile
PROJECT DETAILS: -
Project #1 : Wealth Asset Management
Client : Fifth Third Bank, Cincinnati, Ohio, USA
Duration : Jan-2022 till date
Role : ETL Technical Lead and Developer Lead
Environment : Erwin, IBM InfoSphere DataStage 11.7/11.5, DB2 (queries via Advanced Query Tool), Snowflake
Summary: Wealth Asset Management (WAM)
Investment Advisor, or Wealth Asset Management, has several source systems sending various types of data into the data mart, such as customer, account, revenue and commission, and advisor license information. The data mart is a critical process for the line of business in its day-to-day activities; there are three different kinds of work: development of new processes, production support, and enhancement of existing features.
The Personal Deposit Account system has batches that handle payment processing: many files are dropped into a given folder location in batches, and the ETL picks up every file based on its SLA time and processes it. The ETL runs round the clock.
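The batch pick-up described above can be sketched as a simple polling step. This is a hypothetical illustration, not the bank's implementation: folder names and the process_file hook are invented, and the real ETL would add SLA-window checks, logging, and error handling.

```python
import os
import shutil
import tempfile

def pick_up_batches(landing_dir, work_dir, process_file):
    """Claim each dropped file (oldest-first by name) and hand it to the ETL step."""
    processed = []
    for name in sorted(os.listdir(landing_dir)):
        src = os.path.join(landing_dir, name)
        dst = os.path.join(work_dir, name)
        shutil.move(src, dst)        # move so no other poller picks it up
        process_file(dst)            # illustrative hook for the actual load
        processed.append(name)
    return processed

# Usage sketch with throwaway directories and empty stand-in files
landing = tempfile.mkdtemp()
work = tempfile.mkdtemp()
for n in ("batch_001.dat", "batch_002.dat"):
    open(os.path.join(landing, n), "w").close()
processed = pick_up_batches(landing, work, process_file=lambda p: None)
print(processed)
```

A real round-the-clock process would run this loop on a schedule and alert when a file misses its SLA window.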
Responsibilities:
Understanding the requirements with the help of business analysts
Discussing all sources and targets with BAs to gain complete clarity on the requirements
Designing the required database once the mapping document is clear in terms of the requirements
Working on ETL development, with unit testing performed after development is completed
Performing UAT along with BAs or LOBs
Working on change requests along with documentation
Creating change requests using the ServiceNow tool and engaging the release management team for change review and approval
Supporting during deployment and post-deployment testing
PROJECT DETAILS: -
Project #2 : Consumer ETL enhancements and operations
Client : Fifth Third Bank, Cincinnati, Ohio, USA
Duration : Mar-2020 to Dec-2022
Role : ETL Technical Lead and Production Support Lead
Environment : Erwin, IBM InfoSphere DataStage 11.7/11.5, DB2 (queries via Advanced Query Tool)
Summary: Consumer Business Intelligence (CBI), Wealth Asset Management (WAM) & Personal Deposit Account (PDA)
Fifth Third Bank has many applications in the consumer space. The main objective is to maximize consumer experience and profitability. The Consumer ETL project houses many ETL projects and enhancements: Wealth Asset Management (Investment Advisor/Brokerage), consumer/commercial referral applications, balance spend products, OneView consumer services, commercial services, and Personal Deposit Account.
Investment Advisor, or Wealth Asset Management, has several source systems sending various types of data into the data mart, such as customer, account, revenue and commission, and advisor license information. The data mart is a critical process for the line of business in its day-to-day activities; there are three different kinds of work: development of new processes, production support, and enhancement of existing features.
The Personal Deposit Account system has batches that handle payment processing: many files are dropped into a given folder location in batches, and the ETL picks up every file based on its SLA time and processes it. The ETL runs round the clock.
Responsibilities:
Responsible for understanding the business and functional specification documents, understanding every source (flat files, complex flat files, CSV, Excel, XML, and database structures), and performing data profiling to meet the requirements
Understanding every field in the mapping document and in the existing system with the help of subject matter experts and through detailed sessions with business analysts, data architects, and data analysts
Creating data models (conceptual/logical/physical) once requirement analysis is completed, generating the data dictionary, and getting it reviewed by the architect before sharing with the team
Dimensional modeling with snowflake and star schemas
Working with SMEs, business analysts, or the LOB to prepare the data mapping document; in other words, playing a data analyst role
Working on development of the complex jobs or requirements once the data mapping document has been approved
Leading the team through the design phase with complete guidance, such as explaining every piece of the data model and the complete workflow (HLD/LLD) diagrams of the ETL
Leading the ETL development team and ensuring the team follows best practices and guidelines so that jobs load large volumes of data efficiently; maintaining code quality by reviewing the ETL code for every process once development is completed
Assigning feasible development work to the team, providing the required support, and handling development of the complex requirements
Receiving around 155 GB of data daily for the data warehouse; building reusable loading strategies since the data volume is huge
Reviewing test case execution results, ensuring the required details are available to pass on to the line-of-business or other teams, presenting the results to the line-of-business team, and helping them enhance their testing
Reviewing the deployment guide and working with the scheduler team to ensure jobs are scheduled as per the requirement
Reviewing the standard operating procedure document (run book) for production support and helping the production team maintain the jobs going forward
Production support for 11,000 jobs; recovering job aborts and closing tickets with the TWS team
Checking for jobs finishing with warnings and analyzing them for fixes
Monitoring critical jobs with an abort history and recent production issues
Running weekly/monthly/quarterly jobs when they are due and updating stakeholders
Analyzing jobs with a frequent abort history and recommending permanent fixes
Preparing System Maintenance Technical Document run books that help the team understand functionality and take action on aborted jobs
Documenting scheduled outages and working as per plan on outage day
Analyzing jobs for client requests on different subject areas as per the requirements
CRQ creation, execution, and support; planning and reviewing with leads
Validating jobs post-implementation; executing jobs as per the plan
Coordinating with the different teams involved
Value Additions and Improvements
Supporting patch upgrades to servers
DR plan creation and execution; participating in disaster recovery testing
PROJECT DETAILS: -
Project #3 : Anti-Money Laundering (AML)
Client : Fifth Third Bank, Cincinnati, Ohio, USA
Duration : Jan-2019 to Feb-2020
Role : ETL Technical Lead and Data Migration Lead
Environment : Erwin, IBM InfoSphere DataStage 11.5, DB2 (queries via Advanced Query Tool)
Summary: Anti-Money Laundering
The Bank Secrecy Act (BSA) requires financial institutions to monitor customer activity to detect certain known or suspected violations of federal law or suspicious transactions related to money laundering or violations of the BSA. The current (legacy) AML system reaches end of life in Dec-2018; this project replaces it based on the applicable bank compliance and AML regulations. AML provides an automated solution with a cutting-edge analytical environment while giving investigators the capabilities necessary to swiftly investigate and disposition cases, along with filing Suspicious Activity Reports (SARs) with the Financial Crimes Enforcement Network (FinCEN).
Identify and implement a platform to monitor transactions and associated alerts across all relevant channels in order to identify potentially suspicious activity related to money laundering
Allow Financial Crimes Compliance to work AML cases in an integrated case management solution
The current (legacy) transaction monitoring system requires a review of each alerted potentially suspicious activity event via a multi-level triage/investigative process. Because that process is resource intensive, consists of various manual processes, lacks adequate controls, and is insufficient for mitigating AML risk, there is a need to replace the existing processes with an automated transaction monitoring system that effectively supports a risk-based approach, focusing analytical and investigative resources on the areas of the bank requiring the most scrutiny in terms of financial crimes.
Responsibilities:
Responsible for understanding the business and functional specification documents, understanding every source (flat files, complex flat files, CSV, Excel, XML, and database structures), and performing data profiling to meet the requirements
Understanding every field in the mapping document and in the existing system with the help of subject matter experts and through detailed sessions with business analysts, data architects, and data analysts
Working with SMEs, business analysts, or the LOB to prepare the data mapping document; in other words, playing a data analyst role
The data analyst role provided good exposure in this project, since I got the opportunity to work with different teams such as the data warehouse and eCIF teams
Creating the high-level design document once the complete workflow is available for the respective subject area, then working on the low-level design document and reviewing both with architects prior to ETL development
Leading the team through the design phase with complete guidance, such as explaining every piece of the data model and the complete workflow (HLD/LLD) diagrams of the ETL processes
Driving agile stand-up meetings to provide complete details to project managers on behalf of the team, such as updates from data analysts, ETL developers, and the QA team
Leading the ETL development team and ensuring the team follows best practices and guidelines so that jobs load large volumes of data efficiently; maintaining code quality by reviewing the ETL code for every process once development is completed
Ensuring the team prepares the Technical Specification Document, unit test case document, and deployment guide for moving jobs from lower to higher environments; reviewing and providing feedback on these documents and ensuring they are all uploaded to the SharePoint path
Proactively prioritizing multiple tasks for the team, helping the project manager stay aligned with the timeline, and bringing issues to the project manager's notice
Working with multiple teams: the primary focus is the core team, where work is critical to completing the project on time, while another team works on the merger process (additions to the existing architecture); mentoring both teams and providing guidance and direction to bring everyone onto the same page
Consolidating the complete project status into weekly and monthly status reports for client management and presenting them along with the project manager
Reviewing the standard operating procedure document (run book) for production support and helping the production team maintain the jobs going forward
Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into DB2 v11.1.4.4 from different sources such as databases and files
Used various stages such as Complex Flat File, Change Data Capture, Sequential File, Join, Aggregator, Transformer, Filter, Lookup, Sort, Remove Duplicates, Funnel, Column Generator, ODBC, File Set, and Data Set for designing jobs in DataStage
The source system feed files fed into the AML system include Amtrust, NFS, Time Deposit, GPR, TSYS, AFS, ACBS, UDS, MICR, SWIFT, MTS, ALS, Loan Serv, IDW, and Wall Street. Historical data migration from the legacy system required about 30 million accounts to be pulled from the data warehouse and 2.9 billion transactions to be migrated from the legacy to the new system, with millions of transactions coming in daily from these source systems.
Used DataStage Director for running, monitoring, and scheduling jobs
Worked with DataStage Designer for creating jobs, parameter sets, and data connection parameters, and for exporting and importing jobs
Worked with DataStage Administrator for creating environment variables
Involved in creating and executing test cases for migrating 3.7 billion records (about 8 TB including staging data): validating or cleansing the data and moving the cleansed data into the data mart
Worked on handling and capturing exception/error records in parallel jobs and handling exceptions in sequence jobs
Involved in preparing the deployment or implementation guide for the migration and the daily batch load process
Involved in migrating jobs from Development to UAT/QA to Production servers
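The exception-record capture mentioned above is done in DataStage with reject links on a stage; the underlying idea of splitting clean rows from reject rows can be sketched in plain Python. This is a hedged illustration under assumed field names, not the project's actual job design:

```python
def split_rejects(rows, required=("account_id", "amount")):
    """Split rows into clean and reject lists, tagging rejects with a reason.

    Mirrors a DataStage reject link: rows missing a required field go down
    the reject path with a reason attached for later analysis.
    """
    clean, rejects = [], []
    for row in rows:
        missing = [f for f in required if not row.get(f)]
        if missing:
            rejects.append({**row, "_reject_reason": "missing " + ",".join(missing)})
        else:
            clean.append(row)
    return clean, rejects

rows = [
    {"account_id": "A1", "amount": "100.00"},
    {"account_id": "", "amount": "50.00"},   # bad record: empty key field
]
clean, rejects = split_rejects(rows)
print(len(clean), len(rejects), rejects[0]["_reject_reason"])
```

Routing rejects to a side output rather than aborting the job is what keeps a multi-billion-row load running while still accounting for every bad record.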
PROJECT DETAILS: -
Project #4 : 1) Wealth Management System 2) ACBS 3) BPM 4) Master Card
Client : FHN (First Tennessee Bank), Memphis, TN, USA
Duration : Jul-2014 to Dec-31-2018
Role : Data Modeler, ETL Developer, and SQL/T-SQL Developer
Environment : SQL/T-SQL, Erwin, IBM InfoSphere DataStage 8.5, SSIS 2008, SSRS 2008, SQL Server 2008, SQL Server 2012
Summary: Wealth Management System
The Wealth Management System is the application of record for:
Sale of wealth management products
Revenue earned on the sale of wealth management products (investment, insurance, trust accounts, etc.) by the frontline bank sales staff
Commission paid on revenue earned
Incentive paid on referrals created
Bonus paid
In order to provide a solution that meets the business needs of the different user groups, streamlines existing processes, and supports a wide range of needs going forward, there is a need to rewrite this application.
The replacement application, while retaining the existing functionality and interfaces, aims to streamline the menu options and reports and to provide scalability for adding new products, incentive structures, and roles.
Responsibilities:
Responsible for obtaining and understanding the business or functional specifications and preparing technical specs
Understanding every field in the existing system via reverse engineering with the Erwin tool, then building the business story with the help of the existing application as well as LOB sessions
Identified issues in the existing application during code analysis, explained the business scenarios, and provided solutions to streamline the process
Created the logical/physical data model using the Erwin tool and prepared the data dictionary in the Erwin data model
Walked through the data model with all audiences: BAs, data architects, application architects, developers, etc.
Understanding the HLD with the help of the Business Analyst/Project Owner and then preparing the LLD
Understanding the source and target systems, then preparing the mapping document
Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the MS SQL Server 2008 R2 database
Used various stages such as Join, Aggregator, Transformer, Filter, Lookup, Sort, Remove Duplicates, Funnel, Column Generator, Sequential File, ODBC, File Set, and Data Set for designing jobs in DataStage
Used DataStage Director for running, monitoring, and scheduling jobs
Worked with DataStage Designer for parameter sets and exporting/importing jobs
Worked with DataStage Administrator for creating environment variables
Involved in the creation and execution of test cases
Worked on handling exceptions in sequence jobs and capturing exception records in parallel jobs
Involved in preparing the deployment or implementation guide for the migration
Migrated jobs from lower to higher environments along with the database scripts
Involved in performance tuning of DataStage jobs and in writing SQL queries and stored procedures
Coordinated and worked on multiple projects with multiple teams during the merger and acquisition period
Project #5 : Marketable Securities or MarginNet
Client : FHN (First Tennessee Bank), US
Duration : Oct-2012 to Jun-2014
Role : ETL Developer, Data Modeling, and Development of SPs
Environment : SQL/T-SQL, Informatica PowerCenter 9.0.1, SSRS 2008, SQL Server 2008, Erwin
Summary: MarginNet
At FHN Bank, statements of marketable securities are received from different sources, and the RM has to manually compile the values and update the respective spreadsheets based on monthly statements.
The application aims at tracking and monitoring the performance of loans against collateral considered securities. The tool automates the process of listing accounts managed by RMs, pulls information from files created by data pools from different source systems, runs a match process, and executes mathematical calculations using fields from both files. The tool displays the information in an integrated format on a dashboard.
Responsibilities:
Responsible for obtaining and understanding the business or functional specifications and preparing technical specs
Understanding every field in the existing system via reverse engineering with the Erwin tool, then building the business story with the help of the existing application as well as LOB sessions
Identified issues in the existing application during code analysis, explained the business scenarios, and provided solutions to streamline the process
Created the logical/physical data model using the Erwin tool and prepared the data dictionary in the Erwin data model
Walked through the data model with all audiences: BAs, data architects, application architects, developers, etc.
Understanding the HLD with the help of the Business Analyst/Project Owner and then preparing the LLD
Understanding the source and target systems, then preparing the mapping document
Supported data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica PowerCenter and the Developer tool
Extensively worked on Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, and the Mapplet and Mapping Designers
Used the Informatica Mapping Designer to develop processes for extracting data from various sources (flat files, fixed-width files, COBOL files, relational databases, etc.), then cleansing, transforming, integrating, and loading it into Oracle and DB2 v11.1.4.4 databases
Used various transformations: Source Definition, Source Qualifier, Lookup, Joiner, Aggregator, Expression, Filter, Sorter, Update Strategy, Union, Router, Sequence Generator, and Target Definition
Project #6 : Systematic Collateral Management System (STOC) and SmartLine
Client : M&T Bank, US
Duration : Apr-2011 to Sep-2012
Role : BI Developer, Data Modeling, and Development of SPs
Environment : SQL/T-SQL, SSIS 2008, SSRS 2008, SQL Server 2008, Erwin
Summary: STOC
In securities lending and borrowing, collateral comprises assets given as a guarantee by a borrower to secure a securities loan, subject to seizure in the event of default. Collateral management refers to the handling of all tasks related to monitoring the collateral posted by a borrower.
SmartLine
SmartLine is similar to an overdraft: a loan arrangement under which a bank extends credit up to a maximum amount (the overdraft limit) against which a current (checking) account customer can write checks or make withdrawals. SmartLine allows the individual to continue withdrawing money even if the account has no funds in it; essentially, the bank allows people to borrow a set amount of money. If an overdraft is secured by an asset or property, the lender has the right to foreclose on the collateral if the account holder does not pay, but SmartLine is an unsecured line of credit product.
SmartLine has some rules, so only qualified customers can use this product:
Offers a small amount ($500 maximum); the customer cannot access an amount higher than their credit limit
Credit card holders are not eligible for the SmartLine product
After using SmartLine, if the customer pays within 10 days, the SmartLine account remains active; otherwise the account is frozen
Project #7 : Banking Built for Business (BBFB)
Client : M&T Bank, US
Duration : Dec-2010 to Apr-2011
Role : BI Developer
Environment : SQL/T-SQL, SSIS 2008, SSRS 2008, SQL Server 2008
Summary:
BBFB is a package of services designed to help companies save money and time. It is a business bundle that includes incremental benefits for business owners who enroll; they can receive even greater value by consolidating their business and personal relationships with M&T, and as preferred business/personal customers they automatically qualify for exclusive benefits on both sides of their banking relationship.
The BBFB project comprises various technologies: design and implementation of the database, application design, real-time data movement using an ESB, and non-real-time data movement using ETL.
Project #8 : Sonic Inventory Management System
Client : Crowe Horwath, US
Duration : Mar-2010 to Nov-2010
Role : SQL/T-SQL Developer & BI Developer
Environment : SSMS 2008 R2, SSIS 2008 R2, SSRS 2008 R2, SQL Server 2008 R2
Summary:
Sonic Automotive's dealerships provide comprehensive services, including sales of both new and used cars and light trucks, sales of replacement parts, vehicle maintenance, warranty, paint, and collision repair services, and arrangement of extended warranty contracts, financing, and insurance for the company's customers. The key objective is to design the web application. The SIMS project comprises various technologies for the design and implementation of the database and the application.
Responsibilities:
Understanding the business requirements; interacting with end users to understand the requirements and the purpose of the changes to the data model
Created the logical and physical models, applied changes to the data model, and prepared the data dictionary for the same
Designed and developed SSIS packages, stored procedures, configuration files, and functions, implementing best practices to maintain optimal performance
Used Control Flow tasks such as Execute SQL, File System, Send Mail, Data Flow, Execute Process, and Execute Package
Constructed SSIS packages to pull data from heterogeneous sources
Incorporated notification, auditing, logging, and package configuration as standard procedure
Performed data profiling from PS to ES, such as concatenating dates, removing blank spaces, and handling NULLs
Understood the configuration control check-in/check-out mechanism
Performed peer reviews and participated in code walkthroughs with the client
Performed unit tests (including test case development, review, and execution)
Applied relevant software engineering processes (reviews, testing, defect management, configuration management, release management)
Deployed the packages in the test and cert environments and supported testing
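The cleansing steps named in the responsibilities above (concatenating date parts, trimming blank spaces, handling NULLs) can be sketched outside SSIS as a small row-level function. This is an assumed illustration: the field names (dealer, vin, yr/mo/dy) and the UNKNOWN default are invented, not from the actual packages.

```python
def cleanse(record):
    """Trim strings, replace empty/NULL values, and build one date field."""
    out = {}
    for key, value in record.items():
        value = value.strip() if isinstance(value, str) else value
        out[key] = value if value not in ("", None) else "UNKNOWN"
    # Concatenate separate year/month/day parts into a single date string
    if all(k in out for k in ("yr", "mo", "dy")):
        out["load_date"] = f"{out.pop('yr')}-{out.pop('mo')}-{out.pop('dy')}"
    return out

row = {"dealer": "  Sonic  ", "vin": None, "yr": "2010", "mo": "03", "dy": "15"}
cleaned = cleanse(row)
print(cleaned)
```

In SSIS the same logic would typically live in a Derived Column transformation (TRIM, ISNULL replacement, string concatenation) inside the Data Flow.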