Candidate Information
Title: Data Warehouse Modeling
Target Location: US-IL-Aurora

Candidate's Name
Phone: PHONE NUMBER AVAILABLE
Email: EMAIL AVAILABLE
LinkedIn URL: https://LINKEDIN LINK AVAILABLE

Around 8 years of IT experience in all aspects of analysis, design, testing, development, implementation and support of ETL (Talend, Matillion), data warehousing systems and data marts across various domains.

Summary:
Expertise in Data Warehouse/Data Mart, ODS, OLTP and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation and production support.
Around 7 years of experience with Talend Open Studio and the Talend Enterprise platform.
Experienced in working with Talend for data integration.
Experience in using cloud components and connectors to make API calls for accessing data in Amazon S3 cloud storage from Talend Studio and Matillion ETL.
Experience working with Matillion ETL for Redshift and Snowflake.
Experienced in migrating data to Snowflake using Talend ELT jobs.
Expertise in creating mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregate, tSortRow, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalMap, tDie, etc.
Expertise in data modeling techniques such as dimensional/star schema and snowflake modeling and Slowly Changing Dimensions (SCD Type 2); a minimal SCD Type 2 sketch follows this summary.
Experience in using Amazon cloud components and connectors such as tRedshiftInput, tRedshiftRow, tS3List, tS3Get and tS3Put for accessing data in Amazon S3 and Redshift.
Created context groups and variables to run jobs in the Dev, Stage and Production environments.
Experienced with Talend Big Data, Hadoop and Hive, using Talend Big Data components such as tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tHBaseInput and tHBaseOutput.
Experience in the development and design of ETL (Extract, Transform and Load) methodology for supporting data transformations and processing in a corporate-wide ETL solution using Informatica PowerCenter.
Created mappings using Lookup, Aggregator, Joiner, Expression, Filter, Router, Update Strategy and Normalizer transformations.
Developed reusable transformations and mapplets.
Strong experience with shell scripting and a good understanding of business intelligence and data warehouse approaches.
Experience working with AWS services such as S3 and Athena.
Experienced in Waterfall and Agile methodologies.
Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, system integration and user acceptance testing.

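The resume itself contains no code, so the following is only a minimal sketch of the SCD Type 2 pattern named above. The table and column names (dim_customer, stg_customer, address, segment) are hypothetical, and conn stands for any PEP 249 database connection (Snowflake, Redshift, PostgreSQL, etc.).

```python
# Minimal SCD Type 2 sketch (illustrative only; hypothetical table/column names).

EXPIRE_CHANGED_ROWS = """
UPDATE dim_customer
   SET effective_end_date = CURRENT_DATE,
       is_current         = FALSE
 WHERE dim_customer.is_current = TRUE
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = dim_customer.customer_id
                  AND (s.address <> dim_customer.address
                       OR s.segment <> dim_customer.segment))
"""

INSERT_NEW_VERSIONS = """
INSERT INTO dim_customer (customer_id, address, segment,
                          effective_start_date, effective_end_date, is_current)
SELECT s.customer_id, s.address, s.segment, CURRENT_DATE, NULL, TRUE
  FROM stg_customer s
  LEFT JOIN dim_customer d
         ON d.customer_id = s.customer_id
        AND d.is_current  = TRUE
 WHERE d.customer_id IS NULL   -- new keys, plus keys whose current row was just expired
"""

def load_scd2(conn) -> None:
    """Expire changed dimension rows, then insert the new current versions."""
    cur = conn.cursor()
    cur.execute(EXPIRE_CHANGED_ROWS)
    cur.execute(INSERT_NEW_VERSIONS)
    conn.commit()   # commit both steps together (assumes autocommit is off)
    cur.close()
```

Expiring changed rows before inserting keeps exactly one current row per business key, which is the property downstream fact loads rely on.
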
Professional Experience

Talend ETL Developer
The Hartford, Hartford, CT (Nov 2019 - Present)
Roles and Responsibilities:
Worked extensively with customers to gather project requirements and specifications and convert them into executable steps.
Designed and developed ETL jobs from various sources such as Oracle, SQL Server and Amazon S3 and loaded the data into a target Redshift database.
Involved closely in project design and meetings to leverage the environment and architecture capabilities.
Experienced in writing Oracle and Snowflake stored procedures to transform data into the required specifications.
Designed and implemented strategies to build new data extraction, transformation, loading and quality frameworks to replace legacy systems, using cloud-native technologies such as the Matillion ETL/ELT tool.
Configured Matillion instances and designed and developed Matillion ETL jobs.
Used Matillion components such as SQL Script, Database Query, Query Result to Grid/Scalar, Grid/Fixed Iterator, Data Transfer, Nested Data Load, S3 Load, S3 Unload, Python, Shell, etc.
Loaded and unloaded data in Redshift using Matillion ELT jobs.
Experienced in parsing nested JSON data structures to extract the required data and place it in usable Hive and Snowflake tables.
Created complex mappings in Talend using components such as tHashInput/tHashOutput, tDenormalize, tMap and tUniqRow.
Created Talend mappings to populate the data into dimension and fact tables.
Developed complex Talend ETL jobs to migrate data from flat files to databases.
Successfully loaded data from various source systems such as Oracle, DB2, flat files and XML files into staging tables and then into the target database.
Prepared an ETL mapping document for every mapping and a data migration document for smooth transfer of the project from development to testing and then to production.
Experienced in writing complex SnowSQL queries in tSnowflakeRow.
Extensively used the Talend Snowflake components tSnowflakeInput, tSnowflakeOutput and tSnowflakeRow.
Experienced in creating external stages in Snowflake to move data between Snowflake and AWS S3 (a sketch of this pattern follows this section).
Worked on migrating JSON data into Snowflake from S3 staging and creating custom views for consumer use.
Used the temporary-table features and parallel processing power of Redshift to perform ELT on huge data sets.
Created schedules to run the Matillion pipelines.
Integrated AWS Secrets Manager with Matillion to securely pull passwords.
Experienced in formatting normalized data as JSON and creating JSON-formatted Hive tables.
Experience in working with CI/CD pipelines for quick, repeated deployments of code.
Used TMC (Talend Management Console) for Talend job scheduling and monitoring.
Experienced in troubleshooting production issues, quickly identifying solutions and deploying fixes so that customer work continues uninterrupted.

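As a rough illustration of the Snowflake pattern described in this section (an external stage over S3, JSON landed into a VARIANT column, and a consumer view on top), the sketch below uses the snowflake-connector-python package. The bucket, storage integration (s3_int), stage, table, view and JSON field names are all hypothetical, and credentials are assumed to come from environment variables.

```python
# Illustrative sketch only: object names and the S3 location are made up.
import os
import snowflake.connector

STATEMENTS = [
    # External stage over an S3 prefix (the storage integration is assumed
    # to have been created separately by an account administrator).
    """CREATE STAGE IF NOT EXISTS raw_json_stage
         URL = 's3://example-etl-bucket/claims/json/'
         STORAGE_INTEGRATION = s3_int
         FILE_FORMAT = (TYPE = 'JSON')""",
    # Land each JSON document into a single VARIANT column.
    "CREATE TABLE IF NOT EXISTS raw_claims (doc VARIANT)",
    "COPY INTO raw_claims FROM @raw_json_stage",
    # Consumer-facing view that exposes typed columns over the raw JSON.
    """CREATE OR REPLACE VIEW claims_v AS
       SELECT doc:claim_id::STRING     AS claim_id,
              doc:status::STRING       AS status,
              doc:amount::NUMBER(12,2) AS claim_amount
         FROM raw_claims""",
]

conn = snowflake.connector.connect(
    account=os.environ["SF_ACCOUNT"],
    user=os.environ["SF_USER"],
    password=os.environ["SF_PASSWORD"],
    warehouse="ETL_WH",
    database="EDW",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
    cur.close()
finally:
    conn.close()
```

Keeping the raw VARIANT table separate from the typed view lets the JSON structure evolve without reloading the data.
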
Talend Developer
McDonald's Corporation, Chicago, IL (December 2017 - November 2019)
Roles and Responsibilities:
Worked closely with business analysts to review the business specifications of the project and to gather the ETL requirements.
Worked closely with data architects in designing tables and was involved in modifying technical specifications.
Involved in the extraction, transformation and loading of data from multiple source systems to AWS S3.
Involved in developing the process that copies data from AWS S3 to Redshift using Talend.
Involved in writing custom COPY command queries, making extensive use of context variables, and implemented them in the tRedshiftRow component.
Extensively used the tSystem component to push large sets of data to S3.
Developed Talend code for S3 tagging as part of moving data from source systems to S3 (a boto3 sketch of the tagging and Athena steps follows this section).
Utilized Talend components such as tS3Put, tS3Get, tS3List, tRedshiftRow, tRedshiftUnload, tRedshiftBulkExec, tFlowToIterate and tSetGlobalVar.
Experienced in loading Parquet files to S3 using Big Data batch jobs.
Extensively used AWS Athena to query Parquet files in S3.
Involved in integrating IAM roles into Talend components.
Experienced in creating standard jobs.
Involved in the development of Big Data batch jobs.
Extensively used the Talend components tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tOracleInput, tOracleOutput, tFileList, tS3Put, tS3Get, tReplicate, tSortRow, tDenormalize, tNormalize and tRedshiftRow.
Utilized Big Data components such as tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tHiveOutput, tHiveRow and tHiveConnection.
Experienced in executing jobs in parallel using the tParallelize component.
Used the debugger and breakpoints to view transformation output and debug mappings.
Loaded and transformed data into HDFS from large sets of structured Oracle/SQL Server data using Talend Big Data Studio.
Worked on global variables and context variables, and extensively used tContextLoad in most of the jobs.
Experienced in creating reusable jobs for error handling.
Experienced in tuning ETL jobs for better performance.
Extensively worked with TAC (Talend Administration Center) to schedule jobs using execution plans.
Environment: Talend Data Integration 6.1/5.5.1, Talend Administration Center, Hive, HDFS, AWS S3, AWS Athena.

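The S3-tagging and Athena bullets in this section could look roughly like the boto3 sketch below. The bucket, object keys, Athena database and table names are invented, and AWS credentials are assumed to already be configured in the environment.

```python
# Illustrative sketch only: bucket, keys, database and table names are made up.
import time
import boto3

s3 = boto3.client("s3")
athena = boto3.client("athena")

# Upload a Parquet file to S3 and tag the object, roughly what an S3-tagging
# step in an ETL job would do.
with open("daily.parquet", "rb") as f:
    s3.put_object(
        Bucket="example-etl-bucket",
        Key="sales/daily/2019-06-01.parquet",
        Body=f,
        Tagging="source=oracle&env=dev",   # object tags applied at upload time
    )

# Query the Parquet-backed table through Athena and wait for completion.
run = athena.start_query_execution(
    QueryString="SELECT store_id, SUM(amount) AS total FROM sales_daily GROUP BY store_id",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-etl-bucket/athena-results/"},
)
query_id = run["QueryExecutionId"]

while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(f"Fetched {len(rows) - 1} result rows")   # the first row is the header
```
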
Talend Developer
Tractor Supply, Nashville, TN (January 2017 - November 2017)
Roles and Responsibilities:
Participated in all phases of the development life cycle, with extensive involvement in definition and design meetings and functional and technical walkthroughs.
Created Talend jobs to copy files from one server to another, utilizing the Talend FTP components.
Created and managed source-to-target mapping documents for all fact and dimension tables.
Used ETL methodologies and best practices to create Talend ETL jobs, and followed and enhanced programming and naming standards.
Created and deployed physical objects, including custom tables, custom views, stored procedures and indexes, to SQL Server for the staging and data mart environments.
Designed and implemented ETL to load data from heterogeneous sources into SQL Server and Oracle target databases, covering fact tables and Slowly Changing Dimensions (SCD Type 1 and SCD Type 2).
Utilized Big Data components such as tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHBaseInput, tHBaseOutput, tSqoopImport and tSqoopExport.
Extensively used the tMap component for lookup and joiner functions, along with tJava, tOracle, tXML, delimited-file, tLogRow and tLogback components; created and worked with over 100 components across my jobs.
Used the most common Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput and many more).
Created many complex ETL jobs for data exchange to and from database servers and various other systems, including RDBMS, XML, CSV and flat-file structures.
Created implicit, local and global context variables in jobs, and worked on the Talend Administration Center (TAC) to schedule jobs and add users.
Worked on various Talend components such as tMap, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList and tDie.
Developed stored procedures to automate the testing process, easing QA efforts and reducing the test timelines for data comparison on tables.
Automated the SFTP process by exchanging SSH keys between UNIX servers (a sketch of this follows this section).
Worked extensively on the Talend Administration Center and scheduled jobs in Job Conductor.
Involved in production and deployment activities, including creating the deployment guide for migrating code to production, and prepared production run books.
Environment: Talend Data Integration 6.4, Talend Administration Center, Oracle 11g, Hive, HDFS, SQL Navigator, Toad, PuTTY, WinSCP.

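For the key-based SFTP automation mentioned in this section, a minimal paramiko sketch might look like the following. The host name, user, key path and file paths are hypothetical, and the public key is assumed to already be installed on the remote server.

```python
# Illustrative sketch only: host, user, key and paths are hypothetical.
import paramiko

def push_file_over_sftp(local_path: str, remote_path: str) -> None:
    """Copy one file to a remote UNIX server over SFTP using SSH-key auth."""
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(
        hostname="etl-target.example.com",
        username="etluser",
        key_filename="/home/etluser/.ssh/id_rsa",   # no password: key-based auth
    )
    try:
        sftp = client.open_sftp()
        sftp.put(local_path, remote_path)
        sftp.close()
    finally:
        client.close()

if __name__ == "__main__":
    push_file_over_sftp("/data/out/daily_extract.csv",
                        "/incoming/daily_extract.csv")
```

Because authentication relies on the exchanged keys, no password appears in the job, which is what makes the transfer safe to schedule unattended.
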
Java/SQL Developer
Notus Technologies, Hyderabad, India (May 2015 - December 2015)
Roles and Responsibilities:
Responsible for designing and developing mappings, mapplets, sessions and workflows to load data from source to target databases using Informatica PowerCenter, and tuned mappings to improve performance.
Created database objects such as views, indexes, user-defined functions, triggers and stored procedures.
Involved in the ETL process from development through testing and production environments.
Extracted data from various sources such as flat files and Oracle and loaded it into target systems using Informatica 7.x.
Developed mappings using various transformations, including Update Strategy, Lookup, Stored Procedure, Router, Joiner, Sequence Generator and Expression.
Developed PL/SQL triggers and master tables for automatic creation of primary keys.
Used Informatica PowerCenter Workflow Manager to create sessions and batches that run with the logic embedded in the mappings.
Used Log4j to write to separate application and error log files.
Added features to send email and download e-mail attachments using the JavaMail API.
Tuned mappings and SQL queries for better performance and efficiency.
Created and ran shell scripts in a UNIX environment.
Created and ran workflows using Workflow Manager in Informatica, and maintained source definitions, transformation rules and target definitions using Informatica Repository Manager.
Created tables and partitions in an Oracle database.

EDUCATION
Master's, University of Michigan-Dearborn, 2017
Bachelor's in Computer Science, JNTU, 2015
