Candidate Information
Title: Business Intelligence Information Technology
Target Location: US-FL-Orlando

Candidate's Name
Email: EMAIL AVAILABLE
Phone: PHONE NUMBER AVAILABLE
LinkedIn: LINKEDIN LINK AVAILABLE

Professional Summary:
- 8+ years of experience in information technology as an Informatica Developer, with a strong background in ETL and data warehousing.
- Experienced with Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS); also experienced with Oracle BI Applications (OBIEE 11g).
- Experience across the software development life cycle (SDLC): business requirement analysis, design, programming, database design, data warehousing, and business intelligence concepts, including Star Schema and Snowflake Schema methodologies.
- Strong experience in the analysis, design, development, testing, and implementation of business intelligence solutions using data warehouse/data mart design, ETL, OLAP, client/server, and mainframe applications.
- Experience using IICS for data integration and data migration from multiple source systems.
- Experience with IICS concepts relating to data integration, Monitor, Administrator, deployments, permissions, and schedules.
- Expertise in the design and implementation of Slowly Changing Dimensions (SCD) Type 1, Type 2, and Type 3; a sketch of a Type 2 load appears after the skills list below.
- Experience integrating data sources including Oracle, Teradata, SQL Server, and DB2 databases, flat files, web services, APIs, and XML and JSON files.
- Extensive experience developing complex mappings with transformations such as Router, Filter, Sorter, Connected and Unconnected Lookup, Normalizer, Expression, Aggregator, Joiner, Union, Update Strategy, Stored Procedure, and Sequence Generator.
- Experience with the Informatica tools Mapping Designer, Workflow Manager, Workflow Monitor, Repository Manager, and Metadata Manager.
- Strong knowledge of OLAP systems, the Inmon methodology and models, and dimensional modeling using Star and Snowflake schemas; designed complex mappings to load into Snowflake.
- Experience with dimensional data modeling concepts: star join schema modeling, snowflake modeling, fact and dimension tables, and physical and logical data modeling.
- Experienced in loading data, troubleshooting and debugging mappings, and performance tuning of Informatica sources, targets, mappings, and sessions; fine-tuned transformations for better session performance.
- Strong experience in extraction, transformation, and loading (ETL) of data from various sources into data warehouses and data marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange, and PowerConnect on Oracle, DB2, and SQL Server databases.
- Automated data quality monitoring with custom Python, UNIX, SQL, and BTEQ scripts, reducing manual intervention and allowing the team to focus on critical tasks.
- Good experience designing BI solutions, including OBIEE RPD design and development and building Answers and Dashboards in OBIEE 11g.
- Designed and implemented data pipelines using Data Flows and Pipelines in Azure Data Factory; scheduled and triggered pipelines for automated data movement and processing; managed and monitored ADF pipelines for optimal performance and troubleshot issues.
- Strong experience with large-scale data warehouse implementations using IICS/Informatica PowerCenter 10.x/9.x/8.x/6.x, Oracle, DB2, and SQL Server on UNIX and Windows platforms.
- Excellent interpersonal and communication skills; experienced in working with senior-level managers, businesspeople, and developers across multiple disciplines.

Technical Skills:
ETL Tools: Informatica Intelligent Cloud Services, Informatica 10.x/9.6/9.1/8.6/8.5.1 (PowerCenter/PowerMart), IDQ, OBIEE 11g, Apache NiFi
Databases: Oracle 11g/10g/9i/8i, Teradata, Snowflake, MS SQL Server 16/14/12/08, DB2, Netezza, PostgreSQL
Languages: SQL (Advanced), PL/SQL (Advanced), Unix Shell Script (Advanced), Python (Intermediate), XML (Basic), JSON (Basic), Visual Basic (Basic), JavaScript (Basic), Groovy Script (Basic)
Tools: Toad, SQL*Loader, Cognos 7.0/6.0
Operating Systems: Windows, UNIX, MS-DOS, Mac
Scheduling Tools: Autosys r11.3/10.x, Control-M 9/8/6.1.x, UC4
Additional Technical Skills: Git, Docker, Jenkins, Azure Data Factory
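As a concrete illustration of the SCD Type 2 pattern claimed above, here is a minimal, self-contained sketch. It uses Python with sqlite3 purely so it can run anywhere; all table and column names (customer_dim, customer_stg, etc.) are hypothetical, and the expire-and-insert logic shown is the generic Type 2 technique, not any client's actual mapping.

```python
# scd_type2_sketch.py -- minimal Slowly Changing Dimension Type 2 load.
# sqlite3 keeps the sketch self-contained; in practice the same
# expire-and-insert logic runs against Oracle/Teradata via an ETL tool.
import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customer_dim (
    customer_key INTEGER PRIMARY KEY AUTOINCREMENT,
    customer_id  INTEGER,   -- natural key from the source
    city         TEXT,      -- tracked attribute
    eff_date     TEXT,      -- date the row becomes valid
    end_date     TEXT,      -- date the row expires (9999-12-31 = open)
    current_flag INTEGER    -- 1 = current version
);
CREATE TABLE customer_stg (customer_id INTEGER, city TEXT);
""")

# Seed one existing dimension row and one changed staging row.
cur.execute("INSERT INTO customer_dim (customer_id, city, eff_date, end_date, current_flag) "
            "VALUES (101, 'Orlando', '2020-01-01', '9999-12-31', 1)")
cur.execute("INSERT INTO customer_stg VALUES (101, 'Tampa')")

today = date.today().isoformat()

# Step 1: expire the current version of any row whose tracked attribute changed.
cur.execute("""
UPDATE customer_dim
SET end_date = ?, current_flag = 0
WHERE current_flag = 1
  AND customer_id IN (SELECT s.customer_id FROM customer_stg s
                      JOIN customer_dim d ON d.customer_id = s.customer_id
                      WHERE d.current_flag = 1 AND d.city <> s.city)
""", (today,))

# Step 2: insert a new current version for changed (or brand-new) keys.
cur.execute("""
INSERT INTO customer_dim (customer_id, city, eff_date, end_date, current_flag)
SELECT s.customer_id, s.city, ?, '9999-12-31', 1
FROM customer_stg s
LEFT JOIN customer_dim d ON d.customer_id = s.customer_id AND d.current_flag = 1
WHERE d.customer_id IS NULL
""", (today,))
conn.commit()

for row in cur.execute("SELECT * FROM customer_dim ORDER BY customer_key"):
    print(row)  # old Orlando row expired; new Tampa row is current
```

A Type 1 variant would simply overwrite the tracked attribute in place, and Type 3 would store the prior value in a dedicated column instead of a new row.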
Professional Experience:

Client: Bank of America, Dallas, TX (Remote). Duration: Sep 2022 - Present
Role: Data Engineer / Informatica Application Developer
Responsibilities:
- Involved in all phases of the SDLC: requirement gathering, design, development, testing, production release, and user training.
- Responsible for developing, supporting, and maintaining ETL (Extract, Transform, Load) processes using Informatica PowerCenter 10.x and Informatica Intelligent Cloud Services (IICS).
- Analyzed business rules and sourced data from heterogeneous source systems using IICS/Informatica PowerCenter.
- Implemented Type 1, Type 2, CDC, and incremental load strategies in IICS.
- Designed and developed mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, Joiner, and Rank transformations.
- Designed and developed Informatica ETL mappings to extract master and transactional data from heterogeneous data feeds and load it into the warehouse.
- Managed the metadata associated with the ETL processes used to populate the data warehouse.
- Designed reusable components such as transformations, mapplets, lookups, and reusable source and target objects (shared folder) for the ETL process.
- Created mapplets, reusable transformations, and reusable Session/Email/Command tasks to increase code reusability and ease maintenance.
- Developed Informatica mappings and reusable transformations to facilitate timely loading of star-schema data.
- Used Informatica parameter files to define mapping variables, workflow variables, FTP connections, and relational connections.
- Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
- Involved in dimensional modeling (star schema) of the data warehouse; used Erwin to design the business process, dimensions, and measured facts.
- Created UNIX shell scripts to automate Informatica sessions; a sketch of this pattern follows this section.
- Deployed dynamic ETL solutions by automating routine data management tasks with Python scripts.
- Extracted data from Oracle and SQL Server and used Oracle for data warehousing.
- Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, and functions.
- Developed and automated PowerCenter/IICS jobs through Autosys.
- Created the data model for the BI solution and provided the physical and logical data models.
- Developed RPD metadata from scratch based on the proposed data warehouse design.
- Demonstrated BI features to clients and participated in requirement gathering and analysis.
- Designed the RPD layers (Physical, BMM, and Presentation) per user needs during the RPD design phase, sourcing from EBS and PeopleSoft.
- Created reports and dashboards that navigate to other BI content using Action Links and Actions.
- Interacted with business users to understand their reporting requirements and addressed them accordingly.
- Followed Agile testing methodology; participated in daily Scrum meetings and tested each sprint's deliverables.
- Participated in weekly status meetings; conducted internal and external reviews and formal walkthroughs among various teams, and documented the proceedings.
Environment: IICS/Informatica PowerCenter 10.x, Oracle, OBIEE 11g, MySQL, flat files, SQL Assistant, Autosys, PL/SQL, Unix and Python scripting, Jenkins, GitHub, shell scripting, Agile, Windows.
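The shell/Autosys session-automation bullets above typically wrap Informatica's pmcmd command-line tool. Below is a hedged Python sketch of such a wrapper; pmcmd's startworkflow command and its -sv/-d/-u/-p/-f/-wait options are standard pmcmd usage, but the service, domain, folder, and workflow names here are placeholder assumptions, not client settings.

```python
# run_workflow.py -- illustrative wrapper that starts an Informatica
# workflow via pmcmd and exits non-zero on failure, which schedulers
# like Autosys treat as a job failure. All connection details below
# are hypothetical placeholders.
import subprocess
import sys

PMCMD = "pmcmd"                  # assumes pmcmd is on PATH
INTEGRATION_SERVICE = "IS_DEV"   # hypothetical Integration Service
DOMAIN = "Domain_Dev"            # hypothetical domain
FOLDER = "SALES_DW"              # hypothetical repository folder

def start_workflow(workflow: str, user: str, password: str) -> int:
    """Start a workflow and block until it finishes (-wait)."""
    cmd = [
        PMCMD, "startworkflow",
        "-sv", INTEGRATION_SERVICE,
        "-d", DOMAIN,
        "-u", user,
        "-p", password,
        "-f", FOLDER,
        "-wait",                 # wait for workflow completion
        workflow,
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
    return result.returncode

if __name__ == "__main__":
    if len(sys.argv) != 2:
        sys.exit("usage: run_workflow.py <workflow_name>")
    # e.g. run_workflow.py wf_LOAD_CUSTOMER_DIM
    sys.exit(start_workflow(sys.argv[1], "etl_user", "etl_password"))
```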
Client: Infosys Limited, Phoenix, AZ (Remote). Duration: Apr 2021 - Sep 2022
Role: Data Engineer / DevOps Engineer
Responsibilities:
- Involved in all phases of the SDLC: requirement gathering, design, development, testing, production releases, and production-environment support.
- Worked with end users and business analysts to understand requirements and developed strategies as a DevOps engineer.
- Served primarily in production support, handling production issues on a bi-weekly rotation.
- Involved in dimensional modeling (star schema) of the data warehouse; used Erwin to design the business process, dimensions, and measured facts.
- Automated data transformation using Python scripting, reducing weekly data processing time by 25 hours; a sketch of this kind of script follows this section.
- Used Azure Data Factory and Jenkins pipelines for data transformation.
- Utilized ADF activities such as Copy, data transformation, and control-flow activities for data movement and manipulation.
- Integrated data from diverse sources (databases, APIs, etc.) into Azure Data Lake Storage (ADLS) and other target data stores.
- Participated in a cloud data migration to Azure that improved data storage scalability, supporting 50% more concurrent users.
- Created BTEQ scripts and UNIX shell scripts for data transformation.
- Extracted data from flat files, Oracle, and SQL Server, and used Teradata for data warehousing.
- Involved in developing the data models for the BI solution.
- Prepared the repository design document per the data model.
- Implemented an end-to-end security model in the BI RPD in which EBS responsibility is the driving factor.
- Designed and developed the repository according to the requirements in the functional design document.
- Converted a snowflake model to a star model in the RPD using the MLTS process; created level-based metrics and hierarchies.
- Applied CI/CD methodologies to ETL scripts, reducing code deployment times.
- Designed and implemented a CI/CD pipeline for ETL scripts with Azure DevOps, improving deployment frequency by 40% and reducing manual effort by 60 hours per month.
- Followed Agile testing methodology; participated in daily Scrum meetings and in the development, testing, and production support of each sprint's deliverables.
- Participated in weekly status meetings; conducted internal and external reviews and formal walkthroughs among various teams, and documented the proceedings.
Environment: Teradata, Snowflake, Hadoop, OBIEE, MySQL, flat files, SQL Assistant, Control-M, PL/SQL, CI/CD methodologies, Unix and Python scripting, Jenkins, GitHub, shell scripting, Agile, Windows.
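A minimal sketch of the kind of Python transformation/data-quality script the bullets above describe: read a delimited extract, apply simple rules, and split clean rows from rejects. The file names, columns, and rules are illustrative assumptions, not the client's actual code.

```python
# dq_transform.py -- illustrative data-cleanup step: read a delimited
# extract, enforce basic quality rules, and write a load-ready file plus
# a reject file. All file and column names are hypothetical placeholders.
import csv

REQUIRED = ["customer_id", "order_date", "amount"]

def transform(in_path: str, out_path: str, reject_path: str) -> None:
    with open(in_path, newline="") as src, \
         open(out_path, "w", newline="") as good, \
         open(reject_path, "w", newline="") as bad:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(good, fieldnames=reader.fieldnames)
        rejects = csv.DictWriter(bad, fieldnames=reader.fieldnames)
        writer.writeheader()
        rejects.writeheader()
        for row in reader:
            # Rule 1: required fields must be populated.
            if any(not (row.get(col) or "").strip() for col in REQUIRED):
                rejects.writerow(row)
                continue
            # Rule 2: amount must parse as a number; standardize to 2 dp.
            try:
                row["amount"] = f"{float(row['amount']):.2f}"
            except ValueError:
                rejects.writerow(row)
                continue
            # Simple standardization: trim whitespace everywhere.
            writer.writerow({k: (v or "").strip() for k, v in row.items()})

if __name__ == "__main__":
    transform("orders_extract.csv", "orders_clean.csv", "orders_rejects.csv")
```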
Client: Huntington National Bank, Cincinnati, OH (Remote). Duration: Mar 2020 - Apr 2021
Role: Informatica/ETL Developer
Responsibilities:
- Worked with end users and business analysts to understand requirements and developed strategies for ETL processes.
- Developed ETL procedure strategies and worked with business and data validation groups, providing assistance and guidance for system analysis, data integrity analysis, and data validation activities.
- Developed mappings, reusable objects, transformations, and mapplets using Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter.
- Designed and developed ETL mappings to extract data from flat files, MS Excel, and Oracle and load it into the target database.
- Developed Informatica mappings for complex business requirements using transformations such as Normalizer, SQL Transformation, Expression, Aggregator, Joiner, Lookup, Sorter, Filter, and Router.
- Developed several complex mappings using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer.
- Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and IDQ's reporting and monitoring capabilities.
- Developed mapplets and reusable transformations for reusability and reduced effort.
- Performed data analysis on source and target systems; strong understanding of data warehousing concepts, staging tables, dimensions, facts, and star and snowflake schemas.
- Migrated large volumes of data from OLTP to OLAP using ETL packages.
- Used ETL (PowerCenter/PowerConnect) to load data from source systems such as flat files and Excel files into staging tables and then into the target database.
- Developed complex mappings using multiple sources and targets across different databases and flat files, loading them into Teradata.
- Developed complex SQL queries for interfaces that extract data at regular intervals to meet business requirements; used Teradata extensively.
- Performance-tuned Informatica ETL mappings using caches, SQL-query overrides, and parameter files.
- Used transformations such as Unconnected/Connected Lookup, Aggregator, Expression, Joiner, Sequence Generator, and Router.
- Responsible for developing Informatica mappings and tuning them for better performance.
- Worked with UNIX scripts to automate ETL jobs under the Autosys scheduler; involved in migrating/converting ETL processes from development to production.
- Created Expression, Lookup, Joiner, Rank, Update Strategy, and Source Qualifier transformations in Informatica Designer.
- Created mapplets and used them in different mappings.
- Wrote PL/SQL procedures and functions; involved in the change data capture (CDC) ETL process. A CDC sketch follows this section.
Environment: Informatica PowerCenter, IDQ, Oracle, MySQL, flat files, SQL Assistant, Autosys, PL/SQL, Erwin, Unix shell scripting, Unix, Agile, Windows.
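The CDC bullet above is commonly implemented as a timestamp-based incremental extract driven by a control table. The sketch below shows that generic pattern, again using sqlite3 so it runs standalone; table and column names are hypothetical, and real CDC may instead rely on database logs, triggers, or PowerExchange.

```python
# cdc_extract.py -- illustrative timestamp-based change-data-capture pull,
# the common incremental-load pattern behind bullets like the one above.
# Table, column, and value names are hypothetical placeholders.
import sqlite3
from datetime import datetime

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE orders (order_id INTEGER, amount REAL, updated_at TEXT);
CREATE TABLE etl_control (table_name TEXT PRIMARY KEY, last_run TEXT);
""")
cur.execute("INSERT INTO etl_control VALUES ('orders', '2021-01-01 00:00:00')")
cur.executemany("INSERT INTO orders VALUES (?,?,?)", [
    (1, 10.0, '2020-12-15 09:00:00'),   # before the watermark: skipped
    (2, 25.5, '2021-02-03 14:30:00'),   # after the watermark: captured
])

# 1. Read the high-water mark saved by the previous run.
(last_run,) = cur.execute(
    "SELECT last_run FROM etl_control WHERE table_name = 'orders'").fetchone()

# 2. Pull only rows changed since that watermark.
changed = cur.execute(
    "SELECT * FROM orders WHERE updated_at > ?", (last_run,)).fetchall()
print(changed)   # [(2, 25.5, '2021-02-03 14:30:00')]

# 3. Advance the watermark so the next run starts where this one ended.
now = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
cur.execute("UPDATE etl_control SET last_run = ? WHERE table_name = 'orders'",
            (now,))
conn.commit()
```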
Client: Budco Financial, Detroit, MI. Duration: Nov 2018 - Feb 2020
Role: Informatica/ETL Developer
Responsibilities:
- Analyzed business requirements and data mapping specifications; developed test plans, test cases, and expected results, and prioritized tests for the applications across various modules.
- Involved in the design, development, and implementation of ETL processes in PowerCenter.
- Responsible for managing, scheduling, and monitoring workflow sessions.
- Helped build the ETL architecture and source-to-target mappings to load data into the data warehouse.
- Involved in dimensional modeling (star schema) of the data warehouse; used Erwin to design the business process, dimensions, and measured facts.
- Developed mappings using transformations such as Aggregator, Expression, Filter, Lookup, Joiner, Sequence Generator, Stored Procedure, Update Strategy, and Rank.
- Used Workflow Manager extensively to create and execute tasks, worklets, and workflows.
- Responsible for tuning ETL procedures and star schemas to optimize load and query performance.
- Developed and maintained PL/SQL packages, procedures, database triggers, views, sequences, indexes, functions, ER diagrams, and reporting.
- Developed complex mappings using multiple sources and targets across different databases and flat files, loading them into Teradata.
- Imported IDQ address-standardization mappings into Informatica Designer as mapplets.
- Utilized Informatica IDQ for initial data profiling and for matching and removing duplicate data; a profiling sketch follows this section.
- Developed complex SQL queries for interfaces that extract data at regular intervals to meet business requirements; used Teradata extensively.
- Performance-tuned Informatica ETL mappings using caches, SQL-query overrides, and parameter files.
- Used transformations such as Unconnected/Connected Lookup, Aggregator, Expression, Joiner, Sequence Generator, and Router.
- Responsible for developing Informatica mappings and tuning them for better performance.
- Worked with UNIX scripts to automate ETL jobs under the Autosys scheduler; involved in migrating/converting ETL processes from development to production.
- Created Expression, Lookup, Joiner, Rank, Update Strategy, and Source Qualifier transformations in Informatica Designer.
- Created mapplets and used them in different mappings.
- Wrote PL/SQL procedures and functions; involved in the change data capture (CDC) ETL process.
Environment: Informatica PowerCenter, IDQ, Teradata, Oracle, MySQL, flat files, SQL Assistant, Autosys, PL/SQL, Erwin, Unix shell scripting, Unix, Agile, Windows.
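IDQ performs matching with its own transformations; as a rough, hedged stand-in for the duplicate-matching step described above, the following sketch groups records on a normalized key and reports duplicate clusters. The fields and matching rule are assumptions for illustration, not IDQ's algorithm.

```python
# dedupe_profile.py -- tiny stand-in for a profiling/duplicate-matching
# step: group records on a normalized key and report duplicate clusters.
# The record fields and normalization rule are illustrative only.
from collections import defaultdict

records = [
    {"id": 1, "name": "John  Smith", "zip": "32801"},
    {"id": 2, "name": "john smith",  "zip": "32801"},   # duplicate of 1
    {"id": 3, "name": "Jane Doe",    "zip": "32803"},
]

def match_key(rec: dict) -> tuple:
    # Normalize: lower-case, collapse whitespace; match on name + zip.
    name = " ".join(rec["name"].lower().split())
    return (name, rec["zip"])

groups = defaultdict(list)
for rec in records:
    groups[match_key(rec)].append(rec["id"])

for key, ids in groups.items():
    if len(ids) > 1:
        print(f"duplicate group {key}: keep {ids[0]}, drop {ids[1:]}")
```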
Client: Office Depot, Boca Raton, FL. Duration: Jan 2018 - Oct 2018
Role: ETL Developer
Responsibilities:
- Met with business stakeholders and other technical team members to gather and analyze application requirements.
- Involved in developing logical and physical data models that capture the current state; developed and tested all Informatica data mappings, sessions, and workflows, involving several tasks.
- Designed ETL processes in Informatica to load data from Oracle and flat files into the target Oracle database.
- Designed and developed data validation and load processes (using PL/SQL and SQL) and pre-session, post-session, and batch execution routines using UNIX shell scripts.
- Implemented slowly changing dimensions.
- Designed and developed new mappings in Mapping Designer and sessions and workflows in Workflow Manager, and modified existing ones, to extract data from SQL Server, Excel, and flat files and load it into the Oracle database per business requirements.
- Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
- Used transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router, and Aggregator to create robust mappings in the PowerCenter Designer.
- Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database.
- Created UNIX shell scripts to automate Informatica sessions.
- Wrote and implemented generic UNIX scripts for various purposes: running workflows, archiving files, executing SQL commands and procedures, and moving inbound/outbound files.
- Used complex SQL queries to unit-test the system and the existing project, and to validate data in the data warehouse.
- Used SQL and PL/SQL tuning techniques to improve performance.
- Monitored the previous day's mapping runs daily and fixed any issues.
- Worked with the DBA and reporting team to generate materialized views of the data warehouse.
- Worked on call for production support.
Environment: Informatica PowerCenter, ETL, XML, Oracle, PL/SQL, SQL, UNIX, Windows.

Company: Seismic Technologies, India. Duration: Nov 2015 - Dec 2017
Role: ETL Developer
Responsibilities:
- Created Business Objects reports according to BRD specifications.
- Analyzed and enhanced existing BO reports per new requirements.
- Responsible for developing and supporting a data warehouse using Informatica as the primary ETL tool and Business Objects as the reporting tool.
- Designed and developed mappings using Source Qualifier, Aggregator, Joiner, Lookup, Sequence Generator, Stored Procedure, Expression, Filter, Java, and Rank transformations, and validated the data.
- Implemented and populated Slowly Changing Dimensions (SCD) to maintain current and historical information in dimension tables.
- Used Informatica PowerCenter Workflow Manager to create sessions and batches to run with the logic embedded in the mappings.
- Developed PL/SQL and UNIX shell scripts for scheduling sessions in Informatica.
- Developed PL/SQL programs, including stored procedures, packages, functions, and database triggers.
- Documented technical specifications, business requirements, and functional specifications for the development of Informatica extraction, transformation, and loading (ETL) mappings that load data into various tables.
- Created procedures to drop and recreate indexes in the target data warehouse before and after sessions.
- Created a shell script to pass database connections and parameter entries for sources and targets.
- Demonstrated complete follow-through on technical problems.
Environment: Informatica PowerCenter, Oracle, flat files, Windows 7, SQL*Plus, PL/SQL, UNIX, Windows.

Educational Details:
University: Jawaharlal Nehru Technological University
Stream: Computer Science and Engineering
Batch: 2010-2014
Location: Hyderabad, Telangana

References: Will be provided upon request.
