Candidate Information
Title Informatica PowerCenter, IICS/IDMC, Teradata, MS SQL Server, Cl
Target Location US-MO-Chesterfield
Candidate's Name
(https://LINKEDIN LINK AVAILABLE)
Mobile#: PHONE NUMBER AVAILABLE | Email: EMAIL AVAILABLE

PROFESSIONAL SUMMARY
- Senior Data Warehousing professional with hands-on expertise over Street Address years in the design, modeling, development, testing, and deployment of large, complex data warehouse applications and analytical reporting, using Agile Scrum and Waterfall methodologies.
- Diverse IT expertise spanning data management, business intelligence, project management, custom development, and systems integration, with a strong background in all phases of the software development life cycle and its best practices.
- Highly dedicated and results-oriented, with strong analytical and communication skills.
- Experienced in data extraction, cleansing, transformation, and loading from heterogeneous sources into staging and data warehouse environments, in cloud and big data settings, using various Informatica versions (including Intelligent Data Management Cloud), Azure Data Factory, and Teradata and Oracle database utilities.
- Worked on Informatica 8.x/9.x/10.x, PowerCenter, Intelligent Data Management Cloud, Informatica BDM, Informatica IDQ, SSIS, and DataStage, along with Hadoop, Hive, Apache Spark, and Apache Kafka on the Cloudera platform.
- Prior experience in mid-size to large businesses, demonstrating a good understanding of data requirements and work-effort estimation while managing multiple priorities simultaneously.
- Experienced with Oracle 8i/10g/11g, Teradata 12/15.10, and MS SQL Server 2012/14/16/19.
- Experienced in designing, developing, and building data marts in various on-prem databases and in cloud data warehousing platforms (Snowflake, Redshift, and SQL Server on AWS RDS) to enable data analytics teams to make data-driven business decisions.
- Experienced in a cloud migration project moving an approximately 250 TB enterprise data warehouse to the AWS platform, leveraging S3, the Redshift database, and AWS SCT.
- Proven skills in analyzing highly complex business requirements and in designing and writing technical specifications for ETL processes.
- Experienced in implementing complex mappings/mapplets, reusable transformations, and session partitioning in Informatica, and in writing daily batch jobs using Unix shell scripts.
- Extensive experience with load utilities such as FASTLOAD, TPUMP, MLOAD, FastExport, TPT (Teradata Parallel Transporter), PDO (Pushdown Optimization), and BTEQ scripts.
- Expertise in query optimization and performance tuning of stored procedures and functions.
- Prior experience leading and supporting pre- and post-implementation activities.
- Familiarity with business visualization tools such as MicroStrategy, Business Objects, and Tableau.

EDUCATION AND CERTIFICATIONS
- AWS Certified Cloud Practitioner
- Certified SAFe 4 Practitioner
- Master of Science in Electronics, Bharathidasan University, Tiruchirapalli, India
- Bachelor of Science in Electronics, Bharathidasan University, Tiruchirapalli, India

SKILL SUMMARY
ETL Tools: Informatica PowerCenter 10.2, Intelligent Data Management Cloud (IDMC/IICS), SSIS, DataStage, Azure Data Factory, Informatica BDM, Informatica IDQ, AWS Glue
Databases: Teradata, Oracle, MS SQL Server, PostgreSQL, DB2
Programming Languages & Utilities: SQL, PL/SQL, shell scripting, BTEQ, PDO, FastLoad, FastExport, MLOAD, TPUMP, TPT, self-learned Python programming
Cloud Service Providers: AWS services, MS Azure, Snowflake, VantageCloud on AWS
Big Data: Apache Hadoop, Apache Kafka, Apache Spark, Hive
Tools: Teradata SQL Assistant, Teradata Studio, SSMS, Oracle SQL Developer, TOAD, Apache Ambari, DBeaver, Jenkins, ServiceNow, PuTTY, WinSCP
SCM/Scheduling Tools: WLM, Control-M, Tidal, One Automation, SVN, GitHub, Bitbucket, ActiveBatch
Operating Systems: Windows, Linux

PROFESSIONAL EXPERIENCE

Charter Communications, St. Louis, Missouri | Aug 2021 - Present
Business Intelligence Developer (ETL)
Project: Customer Serviceability Journey - The main objective of this project is to create MicroStrategy reports that let business users study trends and improve the serviceability process, delivering a seamless, enhanced customer experience by reducing the overall time taken for service installation.
Responsibilities:
- Design, develop, and support new and existing business intelligence solutions for users.
- Transform data into meaningful and accurate information the business can consume.
- Design and build components to extract, cleanse, transform, integrate, and load data into target systems using Informatica IDMC and Teradata utilities.
- Worked on a project to mirror 200+ BI fact and dimension tables from Teradata to Snowflake through S3 buckets, enabling workload-isolated compute usage for various business analytics teams.
- Participate in active discussions with team members and users to understand and document business requirements.
- Design, develop, implement, and support new and existing data integration jobs using Teradata stored procedures.
- Generate new extracts for reporting in partnership with the Data Governance team.
- Create ETL design documents and review them with the CTO team to secure required approvals.
- Understand and implement best-practice solutions per development standards.
- Produce ad hoc reports to thoroughly answer business questions.
- Collaborate with the Architecture team to develop and implement best practices for ETL development.
- Create test plans and scenarios, and conduct testing to ensure quality and requirements traceability to the final deliverable.
- Understand data elements in the warehouse at a very detailed grain to support user needs for research, ad hoc querying, and large information projects.
- Involved in pre-implementation and post-production activities, including scheduling and monitoring production jobs as part of warranty support.
- Work extensively on resolving post-production tickets raised by users during the warranty period.
Tools & Technologies: Teradata stored procedures, Snowflake, Teradata FASTLOAD, MLOAD, TPUMP, Teradata Parallel Transporter, MicroStrategy, Informatica 10.2, IDMC, IDQ, SSIS, WinSCP, PuTTY, SQL Assistant, SSMS, WinSQL, Linux, Bitbucket, Subversion

Wellmark Inc., Des Moines, Iowa | Sep 2020 - Aug 2021
Senior Data Engineer (ETL)
Project: Provider Extracts to External Vendors - The main objective of this project is to create six provider profiles in JSON format by sourcing data from an AWS Redshift database and placing them in S3 for downstream consumption, in support of the PDT mandate with BCBS associations.
Responsibilities:
- Design and build components to extract, cleanse, transform, integrate, and load data into target systems using Intelligent Data Management Cloud.
- Actively participated in a cloud migration project moving an approximately 250 TB enterprise data warehouse to the AWS platform, leveraging S3 and a serverless Redshift database.
- Actively participate in requirements meetings with business stakeholders and third-party vendors to understand data-specific requirements and create data flows and provider profiles.
- Responsible for developing ETL solutions aligned to the CTO-recommended technology stack, while providing estimates, developing a high-level work plan, and securing required approvals.
- Ensure solutions meet the technical specifications and design requirements.
- Actively lead and participate in data discovery sessions with the business during PI planning sessions.
- Analyze and enhance existing process flows following the introduction of the AWS Redshift database, with hands-on shell-script development adding automation, notifications, and multiple checkpoints.
- Work extensively on resolving post-production tickets raised by users during the warranty period.
Tools & Technologies: Teradata, SQL Server, AWS Redshift, AWS RDS, AWS S3, Snowflake, Informatica 10.2, IICS/IDMC, SSIS, Teradata FASTLOAD, MLOAD, TPUMP, Teradata Parallel Transporter, WinSCP, PuTTY, SQL Assistant, JIRA, WinSQL, PL/SQL Developer, Linux, XML, JSON, flat files, shell scripting

Centene Corp., St. Louis, Missouri | Mar 2018 - Dec 2020
Lead/Senior ETL Developer
Project: EDW 2.0 and HealthNet/Fidelis Integration - The main objective of this project is to convert existing weekly feeds to daily feeds from one of Centene's acquired HealthNet (Centene West) source systems into EDW, enabling better and more accurate reporting with minimal lag.
Responsibilities:
- Designed, developed, and delivered Enterprise Data Warehouse 2.0 for the client; facilitated the data architecture for the data layer and the ETL strategy to migrate and/or extract data from multiple subject areas into EDW 2.0, creating datasets for Financial Lags Analytics.
- Designed and developed Informatica ETL programs to migrate the HealthNet data warehouse to EDW as part of the Centene-HealthNet integration effort.
- Worked with the HealthNet team to resolve data load errors and discrepancies.
- Developed job schedules to keep load times consistent and make EDW available to the business within SLAs.
- Partnered with Finance to migrate the existing Health Benefit Ratio (HBR), Lags, and Revenue Reconciliation System onto EDW 2.0.
- Presented Teradata temporal concepts, implementation, and use cases at various levels.
- Created staging tables to load data into EDW for ETL tasks using Informatica.
- Modeled a generic landing layer for data acquisition from various claims source systems.
- Participated in weekly architecture meetings to collaborate on and discuss ongoing initiatives.
Tools & Technologies: Teradata 15.10, Solution Model Building Blocks, SAP PowerDesigner, Informatica BDM, PL/SQL Developer, AWS Redshift, AWS RDS, AWS S3, Snowflake, Informatica 10.2, IICS/IDMC, SSIS, TOAD, ERwin, Linux, Cloudera, Apache Hadoop, Hive, WinSCP, Ambari, PuTTY, SQL Assistant

Active Health Management, Atlanta, Georgia (onsite) | Mar 2017 - Mar 2018
Senior ETL Developer
Project: Financial Data Ingest - The primary objective is to ingest financial claim data (medical and prescription) from various vendors into the ODS.

HealthNet (now acquired by Centene Corp), Woodland Hills, California (Chennai, offshore) | Apr 2013 - Feb 2017
Senior Informatica Developer
Projects:
- HealthNet AO/Membership Repository (MR) Migration: re-platform the current MR data from the IBM AS/400 platform to a new MR data store (Feb 2016 - Feb 2017).
- Prime: BIX data extraction and loading into EDW; implemented a Revenue Enhancement initiative with Human Arc (May 2014 - Feb 2016).
- Omni Data Store: implemented a large data consolidation initiative, OMNI, with emphasis on metadata, using BIX (Business Information eXchange) and storing it in the Operational Data Warehouse, while implementing Appeals & Grievances dashboard reporting (Apr 2013 - May 2014).
Responsibilities:
- Led the team on the migration initiative while working closely with other scrum teams.
- Modeled the access layer for the Claims domain and presented the data model design to the CTO team.
- Participated in architectural review meetings to finalize the migration and testing strategy.
- Defined criteria to identify various patterns and marked each object against a pattern.
- Converted complex procedures involving recursive queries and presented a session to the team on the conversion approach.
- Key recommendations included changing the PI, creating indexes, rewriting mapping rules, parallel execution, resolving data type issues, and statistics collection.
- Implemented dashboard reporting to provide high-level summary and detailed reports for business users based on transactions in the Pega Appeals & Grievances call center system; this involved supporting the design of the Universe database to present data for the reporting platform and resolving orphan records during conversion.
- Loaded data in XML format into newly created tables in the Operational Data Warehouse (ODW) after applying business transformation logic using Informatica.
- Recommendations included changing KPIs, creating indexes, rewriting logic, parallel execution, resolving data type issues, multi-value compression, and statistics collection.
Tools & Technologies: Informatica 9.6.1, Oracle 11g, WinSCP, PuTTY, PL/SQL Developer, Unix, ESI Scheduler, ALM, Sybase PowerDesigner, Business Objects XI, Teradata 14.10 (cloud), SQL Server 2008, Unix shell scripting, ERwin, JIRA, Talend, Health Care Logical Data Model (HCLDM)

British Telecommunications, UK (Chennai, offshore) | May13 - Apr13
Programmer Analyst
Project: Master Customer ID (MCID) - Capture customer details, such as demographic information, across all source systems and send the data to the Initiate server after applying the required transformations. The Initiate server identifies the MCID for all members and returns it to the ETL team; the data from the Initiate server then undergoes a final ETL process before loading to the target.
Responsibilities:
- Analyzed existing data flows and processes to document and validate them with business stakeholders.
- Performed data analysis to identify and validate transformations and diagnose the root causes of data conflicts and anomalies.
- Involved in technical solutions and data-conflict resolution, including the design, development, and implementation of new and modified data flows.
- Designed and developed data payloads to build historical data.
- Developed data preprocessing, statistics, and data validation using stored procedures.
- Implemented project change requests and carried out regression testing.
Tools & Technologies: Teradata, Informatica 9.6.1, SQL, TOAD, ERwin, Unix, BusinessObjects XI, WLM

R. R. Donnelley India Pvt Ltd., Chennai, India | Sep 2005 - Oct 2010
Process Associate
Responsibilities:
- Supported multiple process and workflow changes to improve organizational efficiency and reporting metrics.
Tools & Technologies: in-house tools and Microsoft PowerPoint
