Candidate's Name
Candidate's Name @gmai
PHONE NUMBER AVAILABLE

Data Migration Application Architect / Senior Data Engineer and Certified SAFe 6.0 Agilist
Status: US Citizen
Clearance: Secret (Active)
Position Applied: Data Migration Application Architect / Data Engineer

Software Certifications:
Certified SAFe Agilist 6.0
Completed 35 PDUs toward PMP certification in December 2023
Certified Salesforce CRM Administrator
Certified Teradata Professional
Certified Teradata SQL Specialist
Certified ColdFusion Advanced Developer
Experienced Big Data integration architect
Trained in NoSQL DBA administration
Training toward CompTIA Security+ certification and PMP

EDUCATION:
BS, Computer Science & Engineering, National Institute of Technology, Tiruchirappalli, Tamil Nadu, India 620015

SUMMARY
Provide leadership and planning to develop and implement a disciplined data security management strategy focused on managing risk, optimizing cost-effectiveness, measuring performance, and ensuring alignment with the enterprise data architecture.
Lead the data team, providing the data architecture vision, data migration, stakeholder coordination, and performance-test subject matter expertise to the program.
Experience architecting, designing, developing, and implementing Salesforce CRM application database architecture, data warehousing, business intelligence, and application software systems.
Professional experience with Redshift, Oracle, PostgreSQL, Teradata, Netezza, Hive, MarkLogic, Syncsort DMX, Informatica, and Talend, managing large databases and data warehouses. Expertise in building large-scale, real-time BI / data warehousing applications with strong performance-tuning skills. Professionally certified on Teradata.
Experience managing multiple projects in diverse environments, including project-based and matrix organizations.
Project lead experience in Agile/Scrum and Kanban environments to achieve management goals.
Data subject matter expert for the Salesforce CRM application and interface migration systems.
Lead a team of Salesforce CRM architects, developers, and production support staff.
Lead the data architecture team for application development.
Expert database engineer: NoSQL and relational data modeling, object-relational mapping (ORM), and physical design/tuning.
Specialized in Oracle, Hadoop, MarkLogic, and MongoDB.
Expert enterprise data modeler with a deep understanding of developing transactional enterprise data models that strictly meet normalization rules, as well as enterprise data warehouses using the Kimball and Inmon methodologies.
Experience identifying client business analytical needs and translating them into data warehousing and business intelligence solutions.
Experience defining ETL and system integration architectures that combine data from disparate source systems and disparate data formats.
Experience with business analysis, modeling, ERDs, system design, custom development methods, application implementation methods, and CASE tools.
Proficient with Oracle 8i/9i/10g/11g, UDB DB2 8.x, Sybase 11.x, PL/SQL, SQL*Loader, Erwin, shell scripting, and Teradata V2R5.
Experience as an architect building scalable, cloud-based products and web applications.
Experience designing and building RESTful services and service-oriented architectures.
Excellent communication skills, with the ability to interact with end users, customers, and team members.

WORK PROFILE

Chenega Decision Science LLC, May 2018 - Present
Client: ARHS - Accessions Information Environment (AIE) Program
Position: Lead Data Migration Application Architect / Senior Data Engineer

Project 1:
Develop logical and physical data models in traditional data stores (e.g., DB2, Oracle) and in NoSQL.
Baseline DDLs; develop, test, and implement scripts, programs, and related processes to collect, manage, and publish DEV and PROD metadata from data models.
Develop DDL implementations and prepare back-out plans for multiple environments.
Promote data models into production metadata environments and support DDL PROD implementation.
Support post-production validation of DDL scripts.
Oversee the growth of multiple databases and server instances (horizontal vs. vertical scaling) and advise management on cloud utilization.
Analysis, design, data mapping, and database design/architecture from the existing legacy data to the final database for the MCRISS II project, moving from the AWS cloud to the Azure cloud platform.
Design, development, and implementation of the ETL process to extract and load data from the intermediate database into the new final database design at the individual entity level (a sketch follows this list).
Continuous (CI/CD) development and integration of the database mapping schema model from the intermediate data schema to the final data schema.
Design, development, and leadership of database analysis, architecture, development, and implementation for each sprint.
Lead the data ETL migration and modernization projects.
Develop and implement master database scripts and the DevOps process for promoting databases to higher environments.
Design and architect tables, views, and SQL scripts for new functionality in the final data schema.
Design and development of QlikView load scripts for distribution across multiple data sources.
Designed and maintained QlikView applications.
Designed and utilized integrated workflows using MuleSoft ESB.
Involved in recommending and implementing data encryption for PII and PHI data fields.
Involved in the development of System Acceptance Test (SAT), User Acceptance Test (UAT), and Test and Evaluation (T&E) processes.
Articulate technical team building, sprint planning, and execution plans.
Involved in the AIE PMO project process with an external solution provider for data migration from legacy systems into the new AIE Salesforce application.
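A minimal sketch of the intermediate-to-final entity-level ETL mapping pattern described above, in generic SQL. The schema, table, column, and batch names (intermediate.applicant_stg, final.applicant, current_batch_id) are hypothetical illustrations, not the actual MCRISS II data model.

-- Hypothetical entity-level load from the intermediate schema to the final schema.
-- Table and column names are illustrative only; the real project schema is not shown here.
INSERT INTO final.applicant (applicant_id, first_name, last_name, enlistment_date, source_system)
SELECT
    stg.legacy_applicant_key,                 -- legacy key carried forward
    TRIM(stg.first_nm),                       -- basic cleansing during the mapping
    TRIM(stg.last_nm),
    CAST(stg.enlist_dt AS DATE),              -- type alignment between schemas
    'LEGACY_SOURCE'                           -- lineage tag used for post-load validation
FROM intermediate.applicant_stg AS stg
WHERE stg.load_batch_id = :current_batch_id   -- load one sprint/CI batch at a time
  AND NOT EXISTS (                            -- idempotent re-run protection
      SELECT 1
      FROM final.applicant AS tgt
      WHERE tgt.applicant_id = stg.legacy_applicant_key
  );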
Project 2:
Design, coding, and implementation of new database application development in the MCRISS2-APP-DEV, MCRISS2-APP-TST, and MCRISS2-APP-PRD (development, test, and production) environments, providing database backend support for each sprint cycle.
Experience with data ingestion in the Azure Data Factory environment using built-in API connectors for SaaS applications: Salesforce, Redshift, Databricks, Oracle, NoSQL document databases, Postgres, and MS SQL databases.
Designed, developed, tested, and implemented data ingestion using Azure SQL Managed Instances (MI) in Azure SQL Server.
Analysis, design, development, and implementation of data ingestion and utilization of existing data using core SQL in the Databricks database.
Designed, developed, tested, and implemented SSIS ETL scripts to load data into the development and staging databases.
Migration of data into data files and the enterprise data warehouse database using Azure data services.
Design, development, and implementation of data connectors to inject JSON documents into the Cosmos DB database from the enterprise database system using core SQL (collections) and Azure Table Storage (see the sketch after the environment list below).

Environment: Salesforce Cloud, MuleSoft, Microsoft SQL Server 2017, AWS GovCloud, Microsoft Azure Data Factory, enterprise SharePoint, Microsoft SQL Server Integration Services (SSIS), Microsoft Management Studio, Toad, Visual Studio 2017, QlikView, Microsoft Import and Export tool, Microsoft management tools, AWS and Azure cloud platforms, Toad Data Modeler, Redgate.
Languages and tools: T-SQL, SQL stored procedures, SQL ETL script packages, SSIS packages, jobs, Toad ER architect, Redgate Tool Belt, ALM tools (Confluence, Jira), MongoDB, Cosmos DB, Redshift.
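To illustrate the Cosmos DB core SQL (collections) access referenced in Project 2 above, a minimal query sketch over JSON documents. The container name, document shape, and property values (applicants, profile, status, sourceSystem) are assumptions for illustration, not the actual AIE schema.

-- Hypothetical Cosmos DB SQL API query over JSON documents in an "applicants" container.
-- All names and literal values are illustrative assumptions only.
SELECT a.id,
       a.profile.lastName,
       a.profile.firstName,
       a.status
FROM applicants a
WHERE a.status = "PENDING_MIGRATION"
  AND a.sourceSystem = "ENTERPRISE_DB"
ORDER BY a._ts DESC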
Gaddis Group Inc, Nov 2015 - Mar 2018
Client: Small Business Administration
Position: Principal Engineer / Lead Data Architect (data migration project)

Design, architecture, and implementation of ETL using the Syncsort DMX-h tool to migrate data from mainframe load files and Netezza tables to the Hadoop environment.
Developed multiple ETL tasks and jobs using the Syncsort DMX-h tool and copied/migrated the data into the Hadoop Hive database using Impala SQL.
Designed/architected, developed, tested, and implemented the Hive database environment in parallel with the Netezza EDW database.
Developed, tested, and implemented Impala SQL for the ETL loads into the Hive target tables; experience developing and testing data-load Impala queries using the Hue Impala editor (a sketch follows the environment list below).
Architected and developed different types of ETL tasks for EBCDIC, comma-separated, and text files, and implemented the production release.
Supported the production implementation and production ETL data triage validation.
Worked closely with data warehouse (EDW), functional, and data quality experts to seamlessly integrate the sensitive data migration into the Hadoop environment.
Installation and configuration of Sqoop, Flume, and HBase.
Managed and reviewed Hadoop log files as part of administration for troubleshooting purposes; communicated and escalated issues appropriately.
Worked with Apache Spark, which provides a fast, general engine for large-scale data processing, integrated with the functional programming language Scala.
Loaded data into Spark RDDs and performed in-memory computation to generate the output responses.

Environment: Oracle 11g, Microsoft Azure, enterprise SharePoint, ColdFusion 10/11, SVN, Git, JIRA, CFML, HTML, JavaScript, Linux, Hadoop CDH cluster, edge nodes, Hive databases, Impala SQL, Apache Spark, Scala, DMX-h, Redshift and Netezza, PostgreSQL, AWS EC2, DMS, PowerCenter, Hue, and HSQL.
Languages: SQL, PL/SQL, shell scripts, CSS, HTML, C, XQuery, CFML.
Tools: IPython, MS Excel 2013, Tableau 8/9/10, Qlik.
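A minimal sketch of the Impala SQL load pattern described above: land the mainframe/Netezza extract in a staging table, then load a partitioned Hive/Impala target. The database, table, and column names (edw_stg.loan_extract, edw.loan) are hypothetical, not the SBA schema.

-- Hypothetical Impala SQL. Names, types, and formats are illustrative assumptions only.
CREATE TABLE IF NOT EXISTS edw_stg.loan_extract (
    loan_id        BIGINT,
    borrower_name  STRING,
    approval_dt    STRING,        -- raw text as received from the EBCDIC/CSV extract
    amount         DECIMAL(15,2)
)
STORED AS PARQUET;

CREATE TABLE IF NOT EXISTS edw.loan (
    loan_id        BIGINT,
    borrower_name  STRING,
    approval_ts    TIMESTAMP,
    amount         DECIMAL(15,2)
)
PARTITIONED BY (load_year INT)
STORED AS PARQUET;

-- Dynamic-partition insert: the partition column is the last expression in the SELECT.
INSERT INTO edw.loan PARTITION (load_year)
SELECT
    loan_id,
    TRIM(borrower_name),
    TO_TIMESTAMP(approval_dt, 'yyyy-MM-dd'),                  -- type conversion during the load
    amount,
    YEAR(TO_TIMESTAMP(approval_dt, 'yyyy-MM-dd')) AS load_year
FROM edw_stg.loan_extract;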
Accenture Federal Services (AFS), Jun 2013 - Nov 2015
Project 2: SASP-SPM (Seamless Access Service Performance)

Analyzed requirements to provide solutions for new Service Performance & Management features.
Planned, implemented, and executed client release deliverables based on the deliverable release schedules and SOW.
Designed and developed source-to-target mappings, with multiple partitions in a multi-node process, for source database and file names.
Involved in the design and development of new features, including advanced compression, auditing, and error processing, to increase the performance of the SASP-SPM application.
Worked closely with QA/ETL teams on unit, system, and functional testing of the data warehouse; performed gap analysis on the existing data warehouse system.
Developed SQL and PL/SQL procedures, functions, packages, and indexes for new ad hoc reports, data warehouse application reports, and new database design applications.
Responsible for application solution analysis, design, development, integration, and enhancement, in addition to resolving complex support issues.
Designed and implemented a Cassandra NoSQL database and an associated RESTful web service that persists high-volume user profile data for vertical teams.
Migrated high-volume OLTP transactions from Oracle to Cassandra to reduce the Oracle licensing footprint; created an architecture stack blueprint for data access with NoSQL, used as the technical basis for new Cassandra projects.
Lead role in NoSQL column family design, client access software, and Cassandra tuning during the migration from Oracle-based data stores.

Technical Environment: Oracle 11g, Toad, Benthic, SharePoint, Linux 6.3, Apache Cassandra SDE, Netezza, Redshift, and hybrid cloud (AWS and Azure).

Wireless Matrix Inc, Herndon, VA 20171, Jan 2011 - Jun 2013
Position: Team Leader - ETL / NoSQL Solution Architect

CalAmp Corp., formerly Wireless Matrix Inc., is mainly involved in the SaaS product FleetOutlook 360, a GPS fleet-tracking system for large, medium, and small businesses such as Verizon, Comcast, DISH, Sears, and small business service companies in the USA and Canada. We provide software reporting solutions to clients through our application product, which includes dashboards, tracking, alerts, TechConnect, Tech Direct, and a data warehouse reports module.

Responsibilities:
Develop logical/physical models for both transactional and data warehouse systems.
Designed and developed mapping spreadsheets documenting source-to-target mappings, data types, logical/physical names, and source database and file names.
Developed, tested, and implemented ETL monitoring scripts and scheduled jobs.
Implemented an AWS Redshift data warehousing BI solution for the cloud telematics application (a sketch follows the environment list below).
Architected the database design, model, and ETL components.
Interacted with Talend support for troubleshooting and performance enhancement exercises.
Leveraged the Talend Big Data tool on AWS for data integration from discrete sources ranging from RDBMSs (Oracle/MySQL/PostgreSQL) to MongoDB (NoSQL) and Salesforce.
Migrated data storage into the Amazon Glacier cloud storage environment.

Technical Environment: Oracle 10g/11g, SQL*Plus, SQL*Loader, MongoDB 2.2, SQL Developer, Sun Solaris Unix, Autosys, Ruby on Rails, HTML, JavaScript, TOAD, JIRA, PuTTY, WinSCP, Word, Excel, Visio, PowerPoint, Erwin 5.0/7.0, SVN, Linux, Unix and Windows, AWS Glacier, EBS, EC2.
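A minimal sketch of the kind of Redshift table design and report query used in a telematics BI warehouse like the one described above. The table, columns, distribution key, and sort key are illustrative assumptions, not the FleetOutlook schema.

-- Hypothetical Redshift fact table for telematics events; names and keys are illustrative only.
-- DISTKEY co-locates a vehicle's events on one slice; SORTKEY speeds time-range report queries.
CREATE TABLE IF NOT EXISTS dw.fact_vehicle_event (
    event_id      BIGINT IDENTITY(1,1),
    vehicle_id    INTEGER NOT NULL,
    event_ts      TIMESTAMP NOT NULL,
    latitude      DECIMAL(9,6),
    longitude     DECIMAL(9,6),
    speed_mph     DECIMAL(5,1),
    event_type    VARCHAR(32)
)
DISTSTYLE KEY
DISTKEY (vehicle_id)
SORTKEY (event_ts);

-- Typical warehouse report query: per-vehicle daily event rollup for the last seven days.
SELECT vehicle_id,
       DATE_TRUNC('day', event_ts) AS event_day,
       COUNT(*)                    AS event_count,
       MAX(speed_mph)              AS max_speed_mph
FROM dw.fact_vehicle_event
WHERE event_ts >= DATEADD(day, -7, GETDATE())
GROUP BY vehicle_id, DATE_TRUNC('day', event_ts)
ORDER BY vehicle_id, event_day;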
Employer: Creative Information Technology Inc
Client: Department of Labor
Position: Lead Application Engineer, Dec 2008 - Dec 2010
Project: SPARQ (SCSEP)

SPARQ provides the United States Department of Labor's Senior Community Service Employment Program (SCSEP) and its grantees an automated system for managing participant data.

Responsibilities:
Developed and implemented several Oracle packages, procedures, and views to meet ongoing business requirements.
Attended project JAD sessions and interacted with the client for requirements gathering and analysis.
Gave project team members direction, mentoring, and peer review on a daily basis toward successful completion of releases.
Developed and maintained logical and physical models for staging and data mart (star and snowflake schemas) using Erwin.
Created/designed new tables, views, queries, and PL/SQL functions, packages, and procedures for new enhancements to the application.
Involved in the design of the ETL framework, including auditing and error processing.
Developed, tested, and implemented new functions, procedures, and packages for new management reports and QPR reports.
Developed and implemented SQL queries for the front-end application.
Developed SQL scripts to generate database extract files for analysts and grantees on a monthly basis.
Modified existing procedures and packages that performed poorly due to Cartesian joins, and optimized queries to improve query/report performance when reports were generated in the web application.
Developed new admin reports, such as the user log report and error generation report, which involved creating new packages and procedures.
SQL query tuning and testing of procedures and packages in the development, staging, and production environments by loading test data.
Developed SQL*Loader scripts for loading seed and transaction data in dev/test/mirror environments, and developed UNIX scripts for weekly data loads into Oracle databases/tables.
Analyzed tables and the underlying indexes on all tables.

Technical Environment: Oracle 10g/11g, SQL*Plus, SQL*Loader, PL/SQL, Erwin, Autosys, SQL Developer, DataStage 7.5, ColdFusion 8.0, jQuery, JavaScript, HTML, XML, SharePoint.

Employer: Global IT Inc, Feb 2003 - Dec 2008
Position: Sr. Data Consultant, Verizon, Silver Spring, MD

The MDSW system (Market Data Systems Warehouse) was originally developed out of a need to have as much strategic information online as possible to help managers obtain information at various levels, from a single customer up through a corporation. Since MDSW data is extracted from many legacy systems with differing requirements and definitions of customer and product, care should be used in interpreting this data; its use should be limited to strategic purposes only and not for billing or to impact a customer's service record, bill, or profile. There are approximately 200 tables of current and historical data in MDSW, with 20 terabytes of company data available for immediate query (roughly equivalent to 850 fully utilized CD-ROM disks).

Responsibilities:
Develop logical/physical models for both transactional and data warehouse systems.
Responsible for designing, developing, and maintaining the technical architecture for the data warehouse.
Designed and developed mapping spreadsheets documenting source-to-target mappings, data types, logical/physical names, and source database and file names.
Worked extensively on various kinds of queries, such as subqueries, correlated subqueries, dynamic SQL, and union queries, and concentrated heavily on query tuning.
Worked extensively on hierarchical queries.
Worked on data migration from mainframe DB2 to Oracle 9i, including daily, weekly, and monthly loads from mainframe data sets into the Oracle database.
Worked on very large database environments (terabytes), tuned queries for performance, and created and maintained parallel queries for very large datasets.
Involved in the analysis, design, development, and implementation of fact and dimension tables using a star schema.
Data loading using SQL*Loader, and loading of mainframe flat files into Teradata using the FastLoad and MultiLoad utilities.
Imported/exported several databases across environments.
Developed JCL SyncSort, JCL FTP, and JCL code for formatting mainframe flat files.
Developed and maintained many UNIX shell batch scripts for production, such as scheduled data loading, database creation, tuning, and backup and recovery.

Technical Environment: ColdFusion 8, CFML, HTML, JavaScript, CSS, Oracle 8i/9i, SQL*Plus, SQL*Loader, PL/SQL, MS Access, Teradata V2R6, JCL, Excel spreadsheets, Erwin, Unix Korn shell scripts, Lotus Notes 6.0, TOAD, Lotus Domino 6.0, DB2, TSO, MVS, Red Brick, VSS, LoadRunner 7.0.

Employer: Pentasoft Software Lab, Chennai, India, Jan 1996 - Oct 1999
Title: Software Engineer

Developed SQL scripts for creating dynamic query tables using the latest Oracle 8i features, working across the full SDLC (a sketch follows the environment list below).
Report triggers were used extensively for creating dynamic tables and dynamic queries.
Developed PL/SQL packages for various data loading processes.
Developed SQL*Loader scripts for loading data from different schemas.

Technical Environment: Oracle 8i/9i, SQL*Plus, SQL*Loader, PL/SQL, DB2, JCL, TSO, MVS, MS Access, Unix Korn shell scripts, ColdFusion 4.0.
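As a minimal illustration of the dynamic query-table creation mentioned in the Pentasoft work above, a hedged PL/SQL sketch using native dynamic SQL (EXECUTE IMMEDIATE), a feature introduced in Oracle 8i. The table, index, and column names are hypothetical.

-- Hypothetical PL/SQL block: build a run-specific query table with native dynamic SQL.
-- Table and column names are illustrative only.
DECLARE
    v_table_name VARCHAR2(30);
BEGIN
    -- Derive a run-specific table name, e.g. RPT_SALES_19990131.
    v_table_name := 'RPT_SALES_' || TO_CHAR(SYSDATE, 'YYYYMMDD');

    -- Create the dynamic query table as a snapshot of a base table.
    EXECUTE IMMEDIATE
        'CREATE TABLE ' || v_table_name || ' AS ' ||
        'SELECT region_cd, SUM(sale_amt) AS total_sales ' ||
        'FROM sales_detail GROUP BY region_cd';

    -- Index the new table so downstream report queries stay fast.
    EXECUTE IMMEDIATE
        'CREATE INDEX ' || v_table_name || '_IX ON ' || v_table_name || ' (region_cd)';
END;
/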
