Database Developer Data Warehouse Resume...
Candidate Information
Title Database Developer Data Warehouse
Target Location US-NJ-Iselin
Yu Man
Street Address
Email: EMAIL AVAILABLE
Cell Phone: PHONE NUMBER AVAILABLE

Objective:
Experienced Database Developer with over 15 years of expertise in database development and administration, and extensive experience in the financial industry. Seeking to leverage comprehensive skills in SQL programming and database development to contribute effectively to the SQL Programmer position at Rutgers.

Summary:
- Over 15 years of experience in database and data warehouse design and development, using a wide range of databases including MS SQL Server, Sybase IQ/ASE, Oracle, DB2 UDB, MySQL, and Hadoop Hive.
- Proficient in T-SQL, PL/SQL, SQL, shell scripting, Python, Perl, PHP, Java, JavaScript, Visio, Microsoft Visual Studio, and VBA.
- Extensive experience with BI tools such as SSAS, SSRS, Crystal Reports, Cognos, and Tableau; proficient in major ETL tools such as Composite Software, DataStage, Informatica, and SSIS.
- Strong communication skills demonstrated through effective collaboration with cross-functional teams and stakeholders.
- Skilled in performance tuning and query optimization, with a proven track record of improving critical process performance from hours to seconds at top financial firms.
- Experienced with cloud platforms such as AWS and Azure.

Experience:

Database Developer, Bank of America, October 2022 – March 2024
- Performed database design and data modeling for the GPVirtual application, developed the application using T-SQL, and extended its services to new departments such as Merrill Lynch.
- Set up connections between client servers and data servers using NDM, SFTP, and SCP.
- Created ETL jobs using DataStage and SSIS to extract and load data into SQL Server 2019 databases.
- Developed OLAP data cubes using SSAS and generated reports using SSRS.
- Migrated and redeveloped reports from the retired Ariba client system as SQL-based applications.
- Converted Java code in Ariba reports to stored procedures and views in the SQL Server 2019 database.
- Optimized SQL Server 2019 database objects and queries to improve the performance of long-running reports.
- Participated in new enhancements of the GPVirtual project, including requirement analysis, Scrum meetings, and daily development work.

Database Development Team Lead, Citigroup, February 2019 – October 2022
- Led a team responsible for maintaining a large data warehouse on Sybase IQ/ASE and MS SQL Server 2019 servers across multiple locations.
- Designed and created database objects, SQL scripts, stored procedures, and views for new projects using T-SQL.
- Developed OLAP data cubes using SSAS and generated reports using SSRS.
- Managed database servers, Linux/Windows servers, and application servers, including data backup and recovery and configuring and updating server profiles and credential configuration files.
- Created ETL jobs using SSIS to extract data from upstream sources and load it into the data warehouse.
- Supported reporting systems and downstream customer applications, investigating and resolving production issues.
- Participated in disaster recovery testing and ensured high availability of database systems.
- Wrote Perl and Python scripts to execute tasks at the Linux server level.
- Provided support to front-end applications.

Database Developer, The Vanguard Group Inc., August 2014 – November 2018
- Replicated the IDM system to Composite, using MS SQL Server 2012 databases to offload jobs from the IDM LDAP servers.
- Developed back-end databases on the SQL Server 2012 server to support various projects, including database modeling, creating database objects, and writing stored procedures and SQL queries in T-SQL.
- Developed Python and shell scripts, and DataStage jobs, for BQC projects to retrieve and process data.
- Worked with the AWS team to set up and configure Vanguard's AWS account; migrated servers and applications to AWS.
- Created OLAP databases using SSAS, migrated data into AWS databases using SSIS, and generated reports using SSRS and Tableau.
- Provided support to front-end applications that used RESTful APIs to access databases.

Database Developer, Northern Trust Bank, September 2013 – August 2014
- Used T-SQL to create new database objects and wrote Composite code for new development.
- Cooperated with other teams to design a new data center and create Composite services to support new applications.

Big Data and Database Developer, American Express, NY, September 2012 – August 2013
The major project was analyzing discrepancies between two data warehouses, GRMS and TRIUMPH: two different mainframe data warehouses at AMEX that store the same customer and transaction information, but whose data was not in sync. Because the data volume was huge, it could not be compared on the mainframe systems without degrading performance. We built a Hive database system, transferred data from GRMS and TRIUMPH to Hadoop, and used Hadoop to analyze the data and determine mismatch rates and their causes.
Responsibilities included:
- Wrote Perl and shell scripts to transfer mainframe data files to the SAS server.
- Created tasks in Informatica to implement ETL jobs.
- Wrote SAS scripts to process the mainframe files and apply business logic to the data files. GRMS and TRIUMPH store and present data using different business logic; for example, TRIUMPH stores all historical transactions, both valid and invalid, and uses very complicated business logic to identify the valid ones, while GRMS stores only valid transactions.
- Converted the data files to Control-A delimited format for loading into Hadoop.
- Created Hadoop databases and Hadoop tables matching the mainframe files.
- Wrote shell scripts to transfer data files from SAS to Hadoop.
- Created ETL tasks to transfer data between DB2 and Hadoop servers.
- Loaded the mainframe data files into Hadoop tables.
- Wrote Hadoop scripts to compare GRMS and TRIUMPH data, compute the matching rate for each file, and extract samples of mismatching data.
- Wrote matching reports for business analysts and worked with them to identify mismatch causes.
- Wrote scripts to analyze mismatch cases in Hadoop in detail.

Database Developer, JPMorgan Chase & Co., NYC, December 2008 – July 2012
Worked in the broker-dealer unit of JPMorgan Chase. The bank uses Composite Software as an Enterprise Information Integration (EII) layer to integrate heterogeneous data sources (DB2, Sybase, Oracle) in support of front-end applications. As an administrator and developer of this EII system, I supported the DEV, QA, and PROD Composite servers, wrote complex views and procedures for front-end applications, and performed performance tuning.
Responsibilities:
- Upgraded the Composite servers from 3.7.1 to 5.1; fixed bugs and incompatibility issues.
- Optimized queries and rewrote views and procedures in Composite, Sybase, and DB2 databases; systematically investigated the portals with the worst performance and rewrote their views and procedures with better algorithms, improving their performance significantly.
- Wrote views, packaged SQL, procedures, and triggers on Composite servers and in Sybase and DB2 UDB databases for new projects.
- Administered Composite servers: checked query plans and monitored query execution, memory, connection pools, and other server health issues.
- Regularly checked Composite server event logs and server logs to fix issues as they occurred.
- Wrote Perl scripts to analyze transaction log files, identify long-running queries, and find the root causes of performance issues.
- Wrote a Java pub/test program to test Composite procedures and views.
- Wrote Composite procedures and triggers to transfer data among database servers.
- Created AutoSys jobs on AutoSys servers.
- Designed and created workflows in Informatica to migrate data among DB2, Sybase, and Oracle.
- Designed the database architecture and created schemas for broker-dealer portals using PowerDesigner.
- Fixed and resolved several long-standing bugs.
- Provided production support for Composite servers.

Sybase Developer, Lehman Brothers, January 2008 – November 2008
Worked in the Global Sales Technology department on several projects:
- CRM: a web application providing real-time data to clients (I worked on the equity and fixed income portals).
- ATC: a web application providing internal users with information to help them seize business opportunities.
- ACS: a web application providing information on salespersons' activities and their efficiency.
Major duties:
- Migrated Crystal Reports from CE 10 to Crystal Reports/Business Objects XI.
- Created and published Crystal Reports and configured them on the Crystal Reports server.
- Created Composite objects including federated views, packaged SQL, procedures, and triggers.
- Created database objects including tables, views, indexes, stored procedures, and triggers.
- Analyzed SQL code and tuned stored procedures to improve application performance.
- Wrote complex triggers to synchronize data across different databases and tables.
- Wrote complex stored procedures for daily data processing and data uploads across projects and applications.
- Wrote complex UNIX shell and Perl scripts to migrate data between databases on different servers.
- Created AutoSys jobs on AutoSys servers.
- Designed and created workflows in Informatica for large-scale data migration.
- Used BCP and LOAD utilities to migrate data between Sybase and DB2 servers.
- Major contribution: found and solved a configuration issue in Lehman's enterprise systems that doubled performance for all applications on the enterprise platform.

Database Developer, The Bear Stearns Companies, Inc., NYC, June 2007 – December 2007
Worked in the Prime Brokerage unit of the firm on several projects.
Administrator responsibilities:
- Managed Sybase and DB2 database servers and the EII (Enterprise Information Integration) system. To improve performance and simplify the connection between its WebLogic servers and its distributed back-end database servers, the company installed an EII system. As an administrator of Sybase and the EII system, I took part in installation, configuration, data migration, performance tuning, troubleshooting, code analysis, and debugging, and provided general production support for a dozen Sybase servers, DB2 servers, and the EII system.
Developer responsibilities:
- Migrated business logic from Sybase, DB2, and other database servers to the EII system; rewrote T-SQL and DB2 SQL as EII code and moved business logic from the database servers into the EII system.
- Wrote complex stored procedures and views for the Bear Prime Broker project, the CEA (Customer Electronic Access) project, and the Bear Online Trading project.
Major contributions:
- Doubled the performance of the Bear Prime Broker All Accounts context.
- Improved the performance of the Bear Prime Broker Group context by 40%.
- Improved the performance of some critical CEA procedures by 30%.
- Fixed several long-standing bugs that had been known for a long time but that no one had been able to fix.

Education:
Master of Computer Science, New Jersey Institute of Technology
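The GRMS/TRIUMPH reconciliation described under American Express can be sketched in HiveQL. This is a minimal illustration only: the table and column names (grms_txn, triumph_txn, txn_id, txn_amount) are assumed for the example and are not the actual AMEX schema.

```sql
-- Hypothetical staging tables loaded from Control-A (\001) delimited
-- mainframe extracts; real GRMS/TRIUMPH layouts have many more columns.
CREATE TABLE IF NOT EXISTS grms_txn    (txn_id STRING, txn_amount DECIMAL(18,2))
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001';
CREATE TABLE IF NOT EXISTS triumph_txn (txn_id STRING, txn_amount DECIMAL(18,2))
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001';

-- Matching rate: share of transactions whose amounts agree in both
-- warehouses, over all ids seen in either one (FULL OUTER JOIN keeps
-- rows present on only one side, which count as mismatches).
SELECT
  SUM(CASE WHEN g.txn_amount = t.txn_amount THEN 1 ELSE 0 END) / COUNT(*)
    AS match_rate
FROM grms_txn g
FULL OUTER JOIN triumph_txn t
  ON g.txn_id = t.txn_id;

-- Sample of mismatching rows for root-cause review with business analysts.
SELECT g.txn_id, g.txn_amount AS grms_amount, t.txn_amount AS triumph_amount
FROM grms_txn g
FULL OUTER JOIN triumph_txn t
  ON g.txn_id = t.txn_id
WHERE g.txn_id IS NULL
   OR t.txn_id IS NULL
   OR g.txn_amount <> t.txn_amount
LIMIT 100;
```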
