Candidate's Name
PHONE NUMBER AVAILABLE
EMAIL AVAILABLE

Synopsis:
Highly motivated, solutions-driven professional with over 8 years of Data Warehousing experience in ETL design and development and Data Management. Involved in the complete Software Development Life Cycle (SDLC) of various projects, including requirements gathering, system design, data modeling, ETL design and development, production enhancements, support and maintenance. Creative Data Management Analyst dedicated to providing innovative and effective data solutions and assessments; proficient at analyzing data for business and market trends, validating documents and records, and performing extraction, transformation and loading. Able to remain highly focused and self-assured in fast-paced, high-pressure environments. Excellent interpersonal and communication skills.

- Extensive ETL tool experience using IBM InfoSphere/WebSphere DataStage and Ascential DataStage.
- Worked on DataStage tools including DataStage Designer, DataStage Director and DataStage Administrator.
- Strong understanding of Data Warehousing principles using fact tables, dimension tables and star/snowflake schema modeling.
- Worked extensively with dimensional modeling, data migration, data cleansing and ETL processes for data warehouses.
- Developed parallel jobs using processing stages such as Transformer, Aggregator, Lookup, Join, Sort, Copy, Merge, Funnel, CDC, Change Apply, Filter and FTP Enterprise.
- Used Enterprise Edition/Parallel stages such as Datasets, Change Data Capture and Row Generator in accomplishing the ETL coding.
- Familiar with highly scalable parallel processing infrastructure using parallel jobs and multiple-node configuration files.
- Able to pull data from multiple sources including XML, web services, delimited flat files and other data sources.
- Used the WebSphere MQ stage to read messages from queues, and worked with IBM WebSphere MQ Explorer to test queue depth, the dead letter queue and the number of messages.
- Worked with both an InfoSphere Information Services Director input stage and a service output stage.
- Worked with XML files for reading claims payload files from web services.
- Used the Web Services Transformer and Web Services Client stages to make web service calls.
- Imported WSDL file definitions using the Import Web Service File Definitions utility in DataStage Designer.
- Captured rejected data from web service calls and shared it with customers, who used it to debug data issues.
- Received XML files from the upstream PIM system, parsed them, generated flat files and XML outputs, and sent them downstream.
- Used the WebSphere MQ stage to read and write messages to queues.
- Experienced in scheduling sequence and parallel jobs using DataStage Director, UNIX scripts and scheduling tools.
- Experienced in troubleshooting jobs and addressing production issues such as data issues, environment issues, performance tuning and enhancements.
- Knowledge of Erwin as a leading data modeling tool for logical (LDM) and physical (PDM) data models.
- Extensive experience in design and development of Decision Support Systems (DSS).
- Assisted in development efforts for data marts and reporting.
- Experienced in all aspects of analytics/data warehousing solutions (database issues, data modeling, data mapping, ETL development, metadata management, data migration and reporting solutions).
- Technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.
- Extensive experience in unit, functional, system, integration, regression, user acceptance (UAT) and performance testing.
- Worked with various databases including Oracle 11g/10g/9i/8i, Netezza, DB2, SQL Server and Teradata.

Work Experience

Technical Specialist
USAA - San Antonio, TX
July 2017 to Present

Responsibilities:
- Provided production support with Technology Operations teams to ensure operations and maintenance of the Marketing Data platform.
- Completed various data management tasks to ensure record accuracy and workflow efficiency, including loading, transformation and extraction.
- Monitored the status and security of data systems; established direct access files and other configurations.
- Coordinated with application development to ensure maximum efficiency and functionality.
- Worked with business analysts and DBAs on requirements gathering, analysis and testing to implement fixes for identified data and customer issues, metrics and project coordination.
- Experienced in PX file stages including Complex Flat File stage, Datasets, Lookup File stage and Sequential File stage.
- Monitored and evaluated daily data and customer processes to meet SLAs for the Customer Master Data Management environment.
- Created UNIX shell scripts for database connectivity and executing queries in parallel job execution.
- Involved in development of job sequencing using the Sequencer.
- Worked on the Robot scheduler tool for scheduling batch jobs.
- Experienced with customer data processes such as standardization, merge/match and data profiling.
- Used Designer and Director to schedule and monitor jobs and collect performance statistics.
- Extensively worked with database objects including tables, views, indexes, schemas, PL/SQL packages, stored procedures, functions and triggers.
- Created local and shared containers to facilitate ease and reuse of jobs.
- Worked with SCDs to populate Type I and Type II slowly changing dimension tables from several operational source files.
- Executed pre- and post-session commands on source and target databases using shell scripting.
- Worked with developers to troubleshoot and resolve issues in job logic as well as performance.
- Documented ETL test plans, test cases, test scripts and validations based on design specifications for unit, system and functional testing.
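The Type II slowly changing dimension handling mentioned above (overwriting in place for Type I, versioning rows for Type II) can be illustrated outside DataStage; a minimal Python sketch of the Type II pattern, with hypothetical record fields, is:

```python
def apply_scd2(dimension, updates, today):
    """Apply SCD Type II updates to an in-memory dimension table.

    dimension: list of dicts with keys 'key', 'value', 'start', 'end', 'current'
    updates:   dict mapping business key -> new attribute value
    today:     effective date string for the change
    """
    changed = []
    for row in dimension:
        # Only the current version of a changed key gets expired.
        if row["current"] and row["key"] in updates and row["value"] != updates[row["key"]]:
            row["end"] = today        # close out the old version
            row["current"] = False
            changed.append(row["key"])
    for key in changed:
        # Append a new current version carrying the changed attribute.
        dimension.append({"key": key, "value": updates[key],
                          "start": today, "end": None, "current": True})
    return dimension
```

In a real warehouse the same effect is achieved with an update to expire the old row plus an insert of the new version; this sketch only shows the versioning logic itself.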
- Prepared test data for testing, error handling and analysis.

Environment: IBM WebSphere DataStage 11.5/8.5, IBM AIX 5.2, Oracle 11g, XML files, Autosys, Netezza database, MS SQL Server database, sequential flat files.

Sr Programmer Analyst
Universal Studios - Orlando, FL
June 2015 to May 2017

Responsibilities:
- Worked with business analysts and DBAs on requirements gathering, analysis, testing, metrics and project coordination.
- Experienced in PX file stages including Complex Flat File stage, Datasets stage, Lookup File stage and Sequential File stage.
- Involved in creating and maintaining Sequencer and batch jobs.
- Created the ETL job flow design.
- Created UNIX shell scripts for database connectivity and executing queries in parallel job execution.
- Created various standard/reusable jobs in DataStage using active and passive stages such as Sort, Lookup, Filter, Join, Transformer, Aggregator, Change Data Capture, Sequential File and Datasets.
- Involved in development of job sequencing using the Sequencer.
- Used the Remove Duplicates stage to remove duplicates in the data.
- Used Designer and Director to schedule and monitor jobs and collect performance statistics.
- Extensively worked with database objects including tables, views, indexes, schemas, PL/SQL packages, stored procedures, functions and triggers.
- Created local and shared containers to facilitate ease and reuse of jobs.

Environment: IBM WebSphere DataStage 9.1, IBM AIX 5.2, Oracle 11g, XML files, Autosys, MS SQL Server database, sequential flat files, Oracle GoldenGate.

ETL DataStage Developer
Pyramid Systems - Fairfax, VA
February 2013 to June 2015

Pyramid Systems is a software solutions firm dedicated to developing, re-engineering and maintaining mission-critical IT systems for federal and commercial clients.

Responsibilities:
- Involved in understanding business processes and coordinated with business analysts to obtain specific user requirements.
- Helped prepare the source-to-target mapping document.
- Extensively used DataStage tools such as InfoSphere DataStage Designer and InfoSphere DataStage Director for developing jobs and viewing log files for execution errors.
- Involved in design and development of DataStage batch jobs for loading data into Huntington's Customer Information System (CIS).
- Experienced in developing parallel jobs using development/debug stages (Peek, Head & Tail, Row Generator, Column Generator, Sample) and processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort & Merge, Funnel, FTP, Remove Duplicates).
- Used DataStage as an ETL tool to extract data from source systems and load it into SQL Server, DB2 and Oracle databases.
- Developed job sequencers with proper job dependencies, job control stages and triggers.
- Used DataStage Director to schedule and run jobs, test and debug components, and monitor performance statistics.
- Controlled job execution using the Sequencer; used the Notification activity to send email alerts.
- Imported table/file definitions into the DataStage repository.
- Participated in DataStage design and code reviews.
- Worked in an Agile/Scrum environment.
- Created UNIX shell scripts for database connectivity and executing queries in parallel job execution.
- Worked on programs for scheduling data loading and transformations using DataStage from DB2 to Oracle using SQL*Loader and PL/SQL.
- Successfully implemented pipeline and partitioning parallelism techniques and ensured load balancing of data.
- Worked on the Zena scheduler tool for scheduling DataStage batch jobs.
- Involved in performance tuning of the ETL process and performed data warehouse testing.
- Documented ETL test plans, test cases, test scripts and validations based on design specifications for unit, system and functional testing; prepared test data for testing, error handling and analysis.
- Prepared documentation including requirement specifications.
- Participated in weekly status meetings.

Environment: IBM InfoSphere DataStage 8.7 (Parallel & Server), Oracle 10g, DB2, SQL Server 2008, PL/SQL, flat files, XML files, Zena Scheduler.

Sr ETL DataStage Developer
L.A. Care - Lynwood, CA
June 2011 to January 2013

Responsibilities:
- Analyzed, designed, developed, implemented and maintained parallel jobs using IBM InfoSphere DataStage.
- Involved in design of the dimensional data model: star schema and snowflake schema.
- Generated DB scripts from the data modeling tool and created physical tables in the database.
- Worked with SCDs to populate Type I and Type II slowly changing dimension tables from several operational source files.
- Created routines (Before/After, Transform functions) used across the project.
- Experienced in PX file stages including Complex Flat File stage, Dataset stage, Lookup File stage and Sequential File stage.
- Implemented shared containers for multiple jobs and local containers within a job as per requirements.
- Adept knowledge and experience in mapping source to target data using IBM DataStage 8.x.
- Implemented multi-node declaration using configuration files (APT_CONFIG_FILE) for performance enhancement.
- Used DataStage stages including Hashed File, Sequential File, Transformer, Aggregator, Sort, Datasets, Join, Lookup, Change Capture, Funnel, FTP, Peek and Row Generator in accomplishing the ETL coding.
- Debugged, tested and fixed the transformation logic applied in the parallel jobs.
- Extensively used DataStage Director for monitoring job logs to resolve issues.
- Experienced in using SQL*Loader and the import utility in TOAD to populate tables in the data warehouse.
- Involved in performance tuning and optimization of DataStage mappings using pipeline and partition parallelism to manage very large volumes of data.
- Deployed different partitioning methods such as Hash by column, Round Robin, Entire, Modulus and Range for bulk data loading and performance.
- Repartitioned the job flow by determining the best available resource consumption in DataStage PX.
- Created universes and reports in BusinessObjects Designer.
- Created, implemented, modified and maintained simple to complex business reports using the BusinessObjects reporting module.

Environment: IBM InfoSphere DataStage 8.5, IBM HCPDM, Oracle 11g, flat files, Autosys, UNIX, Erwin, TOAD, MS SQL Server database, Mainframe COBOL, XML files, MS Access database.

Sr. DataStage Developer / Data Modeler
Medical Graphics Corporation - Saint Paul, MN
January 2010 to May 2011

MGC Diagnostics is a global medical technology company dedicated to cardiorespiratory health solutions. This singular focus guides the company's strategy and defines its commitment to customers, employees and shareholders.

Responsibilities:
- Extensively used DataStage for extracting, transforming and loading data from sources including Oracle, DB2 and flat files.
- Collaborated with the EDW team on high-level design documents for the extract, transform, validate and load process: data dictionaries, metadata descriptions, file layouts and flow diagrams.
- Collaborated with the EDW team on low-level design documents for mapping files from source to target and implementing business logic.
- Generated surrogate keys for the dimension and fact tables for indexing and faster data access in the data warehouse.
- Tuned transformations and jobs for performance enhancement.
- Extracted data from flat files, transformed it according to requirements and loaded it into target tables using stages such as Sequential File, Lookup, Aggregator, Transformer, Join, Remove Duplicates, Change Data Capture, Sort, Column Generator, Funnel and Oracle Enterprise.
- Created batches (DS job controls) and sequences to control sets of jobs.
- Extensively used DataStage Change Data Capture for DB2 and Oracle files and employed the Change Capture stage in parallel jobs.
- Executed pre- and post-session commands on source and target databases using shell scripting.
- Collaborated in design testing using HP Quality Center.
- Extensively worked on job sequences to control execution of the job flow using various activities and triggers (conditional and unconditional) such as Job Activity, Wait For File, Email Notification, Sequencer, Exception Handler and Execute Command.
- Collaborated in extraction of OLAP data from SSAS using SSIS.
- Extensively used the SAP R/3 and SAP BW packs.
- Collaborated with BI and BO teams to determine how reports are affected by changes to the corporate data model.
- Collaborated with BO teams in designing dashboards and scorecards for analysis and tracking of key business metrics and goals.
- Utilized parallelism through different partition methods to optimize performance in a large database environment.
- Developed DS jobs to populate data into staging and the data mart.
- Executed jobs through the sequencer for better performance and easier maintenance.
- Performed unit testing of developed jobs to ensure they met requirements.
- Developed UNIX shell scripts to automate file manipulation and data loading procedures.
- Scheduled jobs using AutoSys, Tivoli and crontab.
- Collaborated in developing Java custom objects to derive data using a Java API.
- Responsible for daily verification that all scripts, downloads and file copies executed as planned, troubleshooting any failed steps and providing both immediate and long-term problem resolution.
- Provided technical assistance and support to IT analysts and the business community.

Environment: IBM InfoSphere DataStage and QualityStage 8.5 (Administrator, Designer, Director), IBM Information Analyzer 8.0.1a, Microsoft SQL Server 2005/2008, IBM DB2 9.1, AIX 6.0, Oracle 11g, Toad 9.5, Java, MS Access, SAP BW, SAP MDM, AS/400, shell scripts, PuTTY, WinSCP, Erwin 4.0, HP Quality Center, Tivoli, crontab, AutoSys.

ETL Developer
Prudential Financial - Newark, NJ
November 2008 to December 2009

Prudential has branches worldwide; to keep track of the huge amounts of data generated, a data warehouse was developed that aided all levels of management in obtaining a clear perspective on business trends. Budgeting for company needs and forecasting business decisions were based on reports produced from this data warehouse. The system was developed for analyzing and reporting time-variant data; the database was maintained with user account details and change requests on user accounts.

Responsibilities:
- Provided technical support to the team as the ETL developer; addressed best practices and productivity-enhancing issues.
- Worked on designing and developing QualityStage jobs.
- Loaded data into load, staging and lookup tables; the staging area was implemented using flat files.
- Created jobs in DataStage to import data from heterogeneous data sources such as Oracle 9i, text files and SQL Server.
- Generated surrogate IDs for the dimensions in the fact table for indexed and faster data access in server jobs.
- Extensively worked on job sequences to control execution of the job flow using various activities and triggers (conditional and unconditional) such as Job Activity, Wait For File, Email Notification, Sequencer, Exception Handler and Execute Command.
- Diced and sliced the input data for business feedback; tested the system.
- Designed data masking techniques to mask sensitive information when working with offshore teams.
- Assisted the mapping team in transforming business requirements into ETL-specific mapping rules.
- Enhanced various complex jobs for performance tuning.
- Responsible for version control and promoting code to higher environments.
- Worked on Teradata optimization and performance tuning.
- Performed unit testing, system integration testing and user acceptance testing.
- Involved in ongoing production support and process improvements.
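The data masking mentioned above, hiding sensitive values from offshore teams while keeping records joinable, commonly replaces each sensitive value with a deterministic token. A minimal Python sketch of that idea (field names are hypothetical, not from any specific project):

```python
import hashlib

def mask_record(record, sensitive=("ssn", "email")):
    """Return a copy of the record with sensitive fields replaced by
    deterministic, irreversible tokens (same input always yields the
    same token, so masked datasets can still be joined on the field)."""
    masked = dict(record)
    for field in sensitive:
        if masked.get(field) is not None:
            digest = hashlib.sha256(str(masked[field]).encode("utf-8")).hexdigest()
            masked[field] = digest[:12]  # short token in place of the real value
    return masked
```

Hash-based tokens preserve equality joins but not ordering or formats; production masking tools also offer format-preserving and reversible schemes, which this sketch does not cover.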
- Ran DataStage jobs through third-party schedulers.

Environment: Ascential DataStage 7.5.2 (Designer, Manager, Director, Administrator), Oracle 9i, TOAD, SQL/PL-SQL, Teradata, Erwin 4.0, UNIX (AIX).

Education

Bachelor's in Computer Science
Mzuzu University (Mzuni)

Skills

DATABASE, DATASTAGE, ETL, EXTRACT, TRANSFORM, AND LOAD, ORACLE, SSIS

Additional Information

Technical Skills

ETL Tools: IBM InfoSphere DataStage 11.5 (QualityStage), IBM InfoSphere DataStage 11.3, IBM InfoSphere DataStage 9.1 (Parallel & Server), IBM WebSphere DataStage 8.7 (Designer, Director, Administrator), Ascential DataStage 8.5 (Designer, Director, Administrator, Manager)
Databases: Oracle 11g/10g/9i/8i, IBM DB2/UDB, Teradata, Netezza, SQL Server, Oracle GoldenGate
Unica Support: Interact v10.1
Data Warehousing: Star & snowflake schema modeling, facts and dimensions, physical and logical data modeling, Erwin, Cognos
Operating Systems: Windows 7/NT/XP, UNIX, Linux, Solaris, MS-DOS, MS Access
Languages/Scripting: C, C++, Visual Basic, PL/SQL, UNIX shell scripts
Testing/Defect Tracking: HP Quality Center, Test Director