Candidate Information
Title: Data Engineer Software Development
Target Location: US-DE-Wilmington

Candidate's Name
Phone: PHONE NUMBER AVAILABLE
Email: EMAIL AVAILABLE

Work Experience Summary:

17+ years of experience in various phases of the software development life cycle, including design, development, and testing of business applications using technologies such as Snowflake, Oracle 8i/9i/10g/11g, Hadoop, Azure, DataStage 8.5/9.1, Informatica, ER/Studio Data Architect, shell and Python scripting, and various scheduling tools.
Snowflake Data Engineer with extensive experience migrating on-premises data warehouses to the cloud. Experience with Snowflake on Azure: SnowSQL, zero-copy cloning, AzCopy, JavaScript stored procedures, etc.
Familiar with AWS data tools such as Glue, Redshift, etc.
Experienced in development for NoSQL databases such as HBase.
Extensive experience in PL/SQL: writing stored procedures, packages, triggers, and other database objects; Import/Export; SQL*Loader; and standard packages such as DBMS_SCHEDULER, DBMS_PROFILER, DBMS_OUTPUT, DBMS_SQL, DBMS_UTILITY, and UTL_FILE.
Experienced with complex stored procedures, triggers, views, and other database objects, bulk loads, queuing, export/import, XML-based data loads, partition exchange, indexes, and granting privileges.
Hadoop frameworks utilizing Hive, HBase, Sqoop, Hue, etc. with Avro and text file formats.
Understanding of Teradata BTEQ scripting and the TPT utility.
Experience in SDLC (Waterfall) and Agile methodologies such as Kanban and Scrum, in various capacities: Developer, Scrum Master, and BSA. Experience conducting Scrum ceremonies and related reporting.
Experience interacting with users, analyzing client business processes, documenting business requirements, performing design analysis, and developing design specifications.
High-volume ETL data loads using IBM DataStage/Informatica/Ab Initio.
Scheduling jobs using Autosys, TWS, Control-M, Tidal, and Cron.
Experience in performance tuning and optimization of SQL statements.
Possesses excellent analytical, programming, and communication skills.
Working knowledge of PQL (Celonis) and KSQL (Kafka).
Passionate about exploring new technologies and finding innovative solutions to complex business problems, with a commitment to continuous learning and professional development.

Certifications:

Machine Learning A-Z: Hands-On Python in Data Science - Udemy, Oct 2020
Deep Learning and Computer Vision A-Z: OpenCV, SSD & GANs - Udemy, currently pursuing
SnowPro Core Certification (COF-C02) - currently pursuing

Technical Skills:

Operating Systems : Windows NT/95/98/2000/XP/8/10, UNIX, Linux
Cloud Environments : Azure, AWS
RDBMS/DB : Snowflake, Oracle 8i/9i/10g/11g, Hive, HBase, MS Access, SQL Server
Programming : SnowSQL, PL/SQL, SQL, T-SQL, C, C++, HiveQL, Python for Data Science
Data Modeling Tools : ER/Studio Data Architect 7.0, ERwin Data Modeler
ETL Tools : Informatica 9.5, IBM DataStage 8.5/9.1, Ab Initio
Scheduling Tools : Autosys, TWS, Control-M, Tidal, Cron
Reporting Tools : BOXI, Crystal Reports, Apache Solr
Application Dev Tools : Toad, SQL Developer, SQL*Plus, SQL*Loader, Hue, Eclipse, Rational ClearCase, SVN, WinCVS, Git

Education:

University of Hertfordshire, UK - Master of Science, 2010
Dr. B.A.M. University, India - Bachelor of Engineering, 2002

Work Experience:

Sr Snowflake Developer/Data Analyst, Staples May 2020 - Present

Involved in migrating multiple large legacy Teradata warehouses to Snowflake.
Created Snowflake framework stored procedures to copy data from Prod to QA on a weekly basis.
Converted 300+ BTEQ scripts to equivalent SnowSQL scripts.
Migrated historical data for 800+ tables using a Python framework and shell scripts.
Re-created 1,000+ Tidal jobs as part of the migration.
Worked with the extended team on plans for parallel runs and for switching off legacy jobs.
Converted Teradata stored procedures to Snowflake stored procedures in JavaScript/SQL.
Worked with multiple vendors and business teams across the company to validate the migration of 60+ outgoing feeds from Teradata warehouses to Snowflake.
Extensive experience with various Snowflake features such as zero-copy cloning, COPY INTO, virtual warehouse sizing, time travel, creating dashboards, etc.
Transformed JSON into flattened data in Snowflake.
Moved results of HiveQL queries from Hadoop to Snowflake using Sqoop.
Created a POC Power BI report to display images from a Snowflake external stage.
Rewrote existing data load frameworks and Unix scripts to work with Snowflake instead of Teradata.
Imported source definitions for Snowflake tables in Informatica using the Snowflake connector.
Analyzed and converted simple legacy Informatica workflows to Snowflake-only processes, eliminating the use of an additional ETL tool.
Held daily working sessions with business/analyst teams across the organization to help them migrate reports, macros, etc. to Snowflake.
Data analysis using SQL and Python.
Data visualization using Snowflake dashboards, Excel charts, and Power BI.
Analysis of coupon usage and sales data via email and digital channels.
Comparison and root-cause analysis of differences between Teradata and Snowflake reports using custom code.
Generated dropship vendor report cards for violations such as late shipping/delivery and additional shipping charges, using data from UPS, USPS, and FedEx.
Experience presenting analytical findings to audiences at all levels.

Environments: Snowflake, Teradata, Azure, Tidal, Informatica, JavaScript, Jira, shell scripting, Python scripting, Azure Storage Explorer, Sqoop, Power BI, MS Access

Sr PL/SQL Developer/Cross-functional Agile Member, Barclaycard US Aug 2013 - Mar 2020

Helped drive projects in a Tech Lead/Senior Developer capacity for the client. Served as the go-to SME for the applications owned by the team, with an understanding of both the business and technology aspects of those applications. Also filled in for other roles on the Agile team as needed.
Enhanced enterprise communication engines covering email, letter, SMS, and iPush using Oracle PL/SQL, IBM DataStage, Hadoop Sqoop, Hive, Unix shell scripts, and Control-M.
Migrated data from Verint Click-to-Chat, Aprimo Campaign, and email/letter engine databases to the Barclays Data Lake using Hadoop frameworks (Hive, HBase, Sqoop, Hue, etc.) in Avro and text file formats.
Utilized DataStage components to generate XML reports for publishing on the customer website.
Integrated real-time Java web services and ETL batch systems using DataStage to award points earned through social media activity.
Created a process to gather customer data from all lines of business and consolidate it in a central Hive-on-HBase table. Exported data from these tables to an Oracle DB using an internal Sqoop export framework. Built a process to load flat files into Hadoop using internal file ingest/scheduler frameworks.
Assisted in creating data models for KYC, social media communities, and other applications following data governance guidelines.
Performance-tuned DataStage jobs and sequencers to handle large volumes of bank data.
Provided production on-call support to help resolve high-priority issues in a timely manner.
Created end-of-sprint wrap-up summaries, managed the story backlog, and conducted Scrum ceremonies.
Conducted brown-bag technology sessions on various topics.

Environments: Oracle 10g/11g, PL/SQL, Toad, Hive, Impala, HBase, Sqoop, IBM DataStage, Control-M, Tivoli Workload Scheduler, MicroStrategy, Unix, shell script, Git, Snowflake, S3, WinCVS, SVN, Eclipse, Cucumber, Rally, Jira

PL/SQL Developer, Fannie Mae, VA Aug 2011 - Apr 2013

The U.S. Securities and Exchange Commission (SEC) adopted Dodd-Frank Rule 15Ga-1, which mandated that GSEs such as Fannie Mae disclose fulfilled and unfulfilled repurchase requests concerning assets in asset-backed securities. The Repurchase Disclosure System was a new application built from scratch in a limited time for the report filed quarterly with the SEC.
Also helped the client develop the Servicing Capacity P&L Analytical System (SCAPLS), built for internal reporting to help the Finance Organization and National Servicing Organization track the performance of the rapidly growing loan population transferred to high-touch servicers against benchmarks, and to support additional related analytics.
Wrote PL/SQL and SQL procedures, functions, and packages for data quality checks, logging, and staging-to-core table moves.
Developed triggers and exception-handling procedures to send alert notifications for business rule violations.
Developed processes to load over 65 million records from various systems for the Dodd-Frank report.

Environments: Oracle 11g, PL/SQL, Autosys, SQL, TOAD, ClearCase, Unix, shell script, Visio, ER/Studio, Ab Initio, BOXI, Microsoft Project

PL/SQL Developer, Gamestec Leisure Limited, UK Oct 2007 - Feb 2011

Designed and developed complex procedures to handle errors and exceptions at both the application and database levels using PL/SQL and shell scripts.
Analyzed and discussed the business requirements with the business owners and business analysts.

Environments: Oracle 9i/10g, PL/SQL, HTML, SQL, TOAD, VSS, Unix, shell script

Software Engineer, Nexa Systems, India Jan 2005 - Sep 2005

Involved in database development by creating Oracle PL/SQL functions, procedures, triggers, packages, records, and collections.

Environments: Oracle 9i, PL/SQL, SQL, TOAD, VSS

Jr. Software Engineer, Shah Electronics, India Jul 2002 - Dec 2004

Gathered application requirements and provided design recommendations.
Assisted in the design of tables, databases, forms, and reports.

Environments: Oracle 8, PL/SQL, SQL, SQL*Plus
