Candidate's Name
New Egypt, NJ 08515 | PHONE NUMBER AVAILABLE | EMAIL AVAILABLE | WWW: Bold Profile

Professional Summary
Certified, senior-level professional with 20+ years of industry experience in hands-on technical and leadership roles, focused on driving adoption of cloud technologies. Hands-on experience with Snowflake, the AWS Redshift data warehouse, and AWS cloud services (EC2, S3, SQS, SNS, CloudFormation, Data Pipeline, Streams, and cloud servers). Experience with data center and DB2-to-Aurora-PostgreSQL migrations, managing onsite/offshore DBA teams, and training support teams. Broad exposure and hands-on experience with a wide range of RDBMS and BI tools. Deep data domain knowledge and an excellent understanding of the financial services and health industries. Creative problem solver, quick learner, and proactive team leader with excellent prioritization skills and the aptitude to balance multiple priorities. Strong presentation and communication skills. Passionate about learning new technologies.

Skills
- Multiple RDBMS, Snowflake & Redshift
- SQL and NoSQL architecture & administration
- System mapping
- Database updates
- Reverse engineering
- Data lakes
- Data migration
- Performance tuning

Work History

Data Consultant, 11/2022 to 06/2023
State Farm Insurance, Bloomington, IL
Environment: Snowflake, DB2, PostgreSQL, MSSQL, MongoDB, Redshift
- Experienced in Redshift DWH performance tuning for columnar databases.
- Proficient with Redshift architecture, storage, and query processing.
- Created Redshift clusters to improve query throughput, support high concurrency, and reduce cost.
- Created data shares and granted usage to business users.
- Set up disaster recovery in a different region.

Sr. Database Administrator / Modeler, 01/2020 to 09/2022
CMS, VA; Rockville, MD
- Experienced with the Snowflake and Redshift data warehouses and NoSQL MongoDB.
- Proficient with Snowflake architecture, storage, query processing, and cloud services.
- Created and configured an XL virtual warehouse (16 servers).
- Created clusters to improve query throughput under high concurrency.
- Configured auto-suspend and auto-resume for cost savings (see the Snowflake sketch after this entry).
- Hands-on experience with Time Travel for recovery from accidental data removal.
- Created a pipe with an S3 stage and event notifications to trigger loads; automated file loads from the S3 bucket.
- Managed roles (Account Admin, Sys Admin, Security Admin, User Admin) and privileges.
- Knowledge of Snowflake and AWS storage pricing for cost savings and effective usage.
- Hands-on experience with bulk loading and continuous loading.
- Hands-on experience with scaling up and scaling out for workloads, and with WLM.
- Created append-only streams for CDC on online DML changes.
- Created data pipelines using Python (Pandas); good Python programming skills.
- Created data samples using row and block sampling methods.
- Created an EC2 instance and migrated a UDB 11.5 database to PostgreSQL 14.3 on AWS RDS.
- Escalation support for production data issues and performance tuning.
- Hands-on experience with PostgreSQL; created an EC2 instance with General Purpose SSD for best performance.
- Supported PostgreSQL, DB2 UDB, and MSSQL databases.
- Implemented EDB Postgres Advanced Server for Oracle compatibility.
- Performance expert: tuning, VACUUM, VACUUM FULL, bloat management, and parameters.
- Hands-on experience with NoSQL MongoDB (big data); good understanding of Hadoop technologies.
- Expert in the Erwin data modeling tool; created logical/physical models using reverse engineering.
- Developed PL/SQL packages to load/purge large volumes of data.
- Hands-on experience as an AWS cloud architect.
- Configured S3 storage options, including versioning and encryption.
- Configured VPN and security groups with a NAT gateway.
- Implemented range partitioning using the pg_partman extension (see the pg_partman sketch after this entry).
- Experienced with CloudFormation, CloudTrail, IAM, Route 53, S3, and RBAC.
- Hands-on experience with CodePipeline, API Gateway, Lambda, and DynamoDB.
- Developed scripts and processes for data integration and maintenance.
- Set up and controlled user access levels across databases to protect important data.
- Created and updated database designs and data models.
- Proven ability to learn quickly and adapt to new situations.
- Worked well in a team setting, providing support and guidance.
- Worked effectively in fast-paced environments.
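Illustrative Snowflake SQL sketch of the warehouse, stream, and pipe setup described in this entry; all object names (reporting_wh, raw.claims, s3_int, the bucket URL) are hypothetical placeholders, not details from the engagement.

    -- XL warehouse with auto-suspend/auto-resume for cost control
    -- (multi-cluster settings assume Snowflake Enterprise edition or above)
    CREATE WAREHOUSE IF NOT EXISTS reporting_wh
      WAREHOUSE_SIZE = 'XLARGE'
      MIN_CLUSTER_COUNT = 1
      MAX_CLUSTER_COUNT = 4
      AUTO_SUSPEND = 300          -- suspend after 300 seconds of inactivity
      AUTO_RESUME = TRUE;

    -- Append-only stream to capture DML changes (CDC)
    CREATE STREAM IF NOT EXISTS raw.claims_stream
      ON TABLE raw.claims
      APPEND_ONLY = TRUE;

    -- External stage over an S3 bucket, plus a pipe that loads new files
    -- automatically when S3 event notifications arrive
    CREATE STAGE IF NOT EXISTS raw.claims_stage
      URL = 's3://example-bucket/claims/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = CSV);

    CREATE PIPE IF NOT EXISTS raw.claims_pipe
      AUTO_INGEST = TRUE
      AS COPY INTO raw.claims FROM @raw.claims_stage;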
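A minimal pg_partman sketch for the range partitioning noted in this entry, assuming PostgreSQL native partitioning and the pg_partman 4.x create_parent signature (arguments differ in 5.x); the schema, table, and column names are hypothetical.

    -- Install pg_partman into its own schema
    CREATE SCHEMA IF NOT EXISTS partman;
    CREATE EXTENSION IF NOT EXISTS pg_partman SCHEMA partman;

    -- Natively range-partitioned parent table (hypothetical example table)
    CREATE TABLE public.measurements (
        id      bigint      NOT NULL,
        logdate timestamptz NOT NULL,
        reading numeric
    ) PARTITION BY RANGE (logdate);

    -- Let pg_partman create and maintain monthly child partitions
    SELECT partman.create_parent(
        p_parent_table => 'public.measurements',
        p_control      => 'logdate',
        p_type         => 'native',
        p_interval     => 'monthly',
        p_premake      => 4          -- keep 4 future partitions pre-created
    );

    -- Periodic maintenance, usually scheduled via cron or pg_cron
    SELECT partman.run_maintenance();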
Sr. Cloud Data Consultant / Modeler, 08/2019 to 01/2020
OPM, VA
- Created an EC2 instance and installed UDB 11.1 with FP4.
- Escalation support for production issues and performance tuning of business queries.
- Installed fix packs to avoid vulnerabilities and security risks.
- Performance tuning of complex DWH, OLTP, and OLAP queries.
- Successfully implemented CPU shares in Workload Manager.
- Hands-on experience with the performance tools Dynatrace, DBI, DSM, and Zenoss.
- Expert in UDB DPF; built multi-host databases.
- Hands-on experience and strong architectural knowledge in Splunk and ELK.
- Automated routine tasks with shell scripts.

Data Consultant, 01/2019 to 01/2020
CMS, VA
Environment: Snowflake XL (16 servers); AWS EC2 m5 instance, 192 GB; AWS RA3 nodes; Snowflake, DB2, PostgreSQL, MSSQL, MongoDB, Redshift
Responsibilities:
- Experienced with the Snowflake and Redshift data warehouses and NoSQL MongoDB.
- Proficient with Snowflake architecture, storage, query processing, and cloud services.
- Created and configured an XL virtual warehouse (16 servers).
- Created clusters to improve query throughput under high concurrency.
- Configured auto-suspend and auto-resume for cost savings.

Cloud Database Developer, 11/2018 to 08/2019
Huntington Bank
Environment: Five physical node DPF with 20 logical partitions
Responsibilities:
- Implemented WLM for the bank's DWH database on UDB 11.4.
- Escalation support for production issues and performance tuning.
- Installed fix packs to avoid vulnerabilities and security risks.
- Implemented the new RCAC feature for HR databases, restricting rows and masking columns (see the RCAC sketch below).
- Performance tuning of complex DWH, OLTP, and OLAP queries.
- Successfully implemented CPU shares in WLM.
- Hands-on experience with the performance tools Dynatrace, DBI, DSM, and Zenoss.
- Modeled and implemented database schemas in DB2 UDB.

Replication Architect, 10/2015 to 09/2018
Morgan Stanley
- Hands-on experience with the pureScale cluster solution; installed and upgraded from 10.5 to 11.1 for a 4-CF setup.
- Level 3 support for UDB DB2.
- Implemented new UDB 11 features (RCAC and RADIX) for critical DPF, ESE, and OLTP applications.
- Worked closely with application DBAs and engineering teams on enhancements.
- Addressed performance issues and tuning using the tools db2batch and db2caem.
- Expert in UDB DB2 DPF built on NAS storage.

Performance Engineer, MSSQL DBs, 04/2015 to 10/2015
Bank of America
Environment: HP DL380 Gen 8, 16-core 2.9 GHz, Windows 2008, MSSQL 2012
Responsibilities:
- Tuning exercises on applications using the latest DMVs and DMFs.
- Worked with table range partitions for roll-in/roll-out (see the partitioning sketch below).
- Created columnstore indexes for query performance.
- Implemented new MSSQL 2014 & 2016 features: Always Encrypted and Stretch Database.
- Set up HADR with multiple read-only standbys.
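Illustrative T-SQL sketch of the range-partition roll-in/roll-out and columnstore work described in the Performance Engineer entry, assuming SQL Server 2014 or later for clustered columnstore; the partition function, scheme, and table names are hypothetical.

    -- Monthly range partitioning (boundary values are placeholders)
    CREATE PARTITION FUNCTION pf_monthly (date)
        AS RANGE RIGHT FOR VALUES ('2015-01-01', '2015-02-01', '2015-03-01');

    CREATE PARTITION SCHEME ps_monthly
        AS PARTITION pf_monthly ALL TO ([PRIMARY]);

    -- Partitioned fact table with a clustered columnstore index for query performance
    -- (the index inherits the table's partition scheme when ON is omitted)
    CREATE TABLE dbo.fact_txn (
        txn_date date  NOT NULL,
        amount   money NOT NULL
    ) ON ps_monthly (txn_date);
    CREATE CLUSTERED COLUMNSTORE INDEX cci_fact_txn ON dbo.fact_txn;

    -- Roll-out: switch the oldest partition into an aligned, empty archive table
    CREATE TABLE dbo.fact_txn_archive (
        txn_date date  NOT NULL,
        amount   money NOT NULL
    ) ON ps_monthly (txn_date);
    CREATE CLUSTERED COLUMNSTORE INDEX cci_fact_txn_archive ON dbo.fact_txn_archive;

    ALTER TABLE dbo.fact_txn
        SWITCH PARTITION 1 TO dbo.fact_txn_archive PARTITION 1;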
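Illustrative DB2 SQL sketch of row and column access control (RCAC) as referenced in the Cloud Database Developer entry; the HR table, role, and mask names are hypothetical, and the mask assumes an 11-character SSN column such as 'XXX-XX-1234'.

    -- Row permission: only users holding the HR_ADMIN role see rows
    CREATE PERMISSION hr.emp_row_access ON hr.employee
        FOR ROWS WHERE VERIFY_ROLE_FOR_USER(SESSION_USER, 'HR_ADMIN') = 1
        ENFORCED FOR ALL ACCESS
        ENABLE;

    -- Column mask: HR admins see the full SSN, everyone else only the last four digits
    CREATE MASK hr.emp_ssn_mask ON hr.employee
        FOR COLUMN ssn RETURN
        CASE WHEN VERIFY_ROLE_FOR_USER(SESSION_USER, 'HR_ADMIN') = 1
             THEN ssn
             ELSE 'XXX-XX-' || SUBSTR(ssn, 8, 4)
        END
        ENABLE;

    -- Activate RCAC on the table so the permission and mask take effect
    ALTER TABLE hr.employee
        ACTIVATE ROW ACCESS CONTROL
        ACTIVATE COLUMN ACCESS CONTROL;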
DB Admin / Developer, 12/2012 to 03/2015
Global Support Team, Barclays
Environment: HP DL380 Gen 8, 16-core 2.9 GHz, Red Hat Enterprise Linux Server release 6.4 (Santiago), UDB 10.5 FP4 ESE DPF multi-node cluster server; MS SQL 2008 R2, 2012 & 2014
Role: Level 3 support for the Data Warehouse team and other DB2 UDB applications
Responsibilities:
- Expert in DPF partitioned database architectural design, setup, and installation.
- Developed stored procedures and automated scripts to roll data in and out of table-partitioned databases.
- Found and resolved an architectural issue by separating the TMP, LOGS, and DATA file systems into dedicated volume groups for best OLTP performance.
- Supported production issues on a 24/7 basis.
- Very good with volume group and file system setup and SRDF configuration; hands-on experience with SRDF failover and fallback.
- Worked with UDB 10.5 FP5 new features: HADR multiple standby and new commands such as db2prereqcheck.
- Explored the new multi-temperature storage features for hot, warm, and cold data using STOGROUP and adaptive compression.
- Supported apply lag, capture queue depth, and ASN schema table issues.
- Cataloged databases and nodes (db2cfexp & db2cfimp).
- Used the tools DB2DART, DB2SUPPORT, and DB2PD.
- Experienced in DB2 recovery (crash recovery, roll-forward recovery, version recovery) and movement of data using the LOAD, IMPORT, and EXPORT utilities.
- Expert in analysis of RUNSTATS, table REORG, and REORGCHK reports for SQL performance.
- Strong in partitioning concepts (hash, range, and group by).
- Strong with DBArtisan, Erwin, Quest Central / Spotlight, and BMC Change Manager & Patrol.
- Expert in session log and workflow monitoring to improve ETL load performance.
- Knowledgeable in star and snowflake schema concepts and cube creation; strong in SQL analysis of hash, nested loop, and merge joins.
- Set up federated databases to move data from one environment to others.
- Perl and shell scripting.

Application DBA, 06/2012 to 12/2012
Chase
Environment: IBM P570 POWER6 servers & IBM 3150 BCU DW
Responsibilities:
- Set up the bank application with HADR for high availability with minimal time and effort.

Architect, 11/2010 to 06/2012
Bank of America, Data Warehouse ETL team
Environment: Cognos reporting front end and DB2 UDB back end; five physical nodes; 23 TB database
Responsibilities:
- Built on a UDB 9.5 FP9 DPF database.
- Supported production issues on a 24/7 basis.

Education
Bachelor's: Electronics and Communication, 12/2003
MorningSide Tabulations And Consulting - New York, NY

Accomplishments
CERTIFICATIONS:
- IBM Certified Specialist (DB2 UDB)
- IBM Certified Solutions Expert (DB2 UDB Database Administration for Linux, UNIX, Windows and OS/2)