Candidate Information
Title Software Engineer Supply Chain
Target Location US-OR-Beaverton
Professional Summary:
- 13 years of IT experience in data warehousing as a Senior ETL Developer, including 6 years of on-site work experience.
- Worked on Informatica PowerCenter, Informatica TDM, Informatica BDM, Oracle PL/SQL, UNIX shell scripting, and Python.
- Worked across multiple domains: Manufacturing, Supply Chain, Health Care, and Telecommunications.
- Experience across the complete project development life cycle (requirement analysis, design, development, testing, deployment, and maintenance) under both Waterfall and Agile methodologies.
- Experience with data Extraction, Transformation, and Loading (ETL) from data sources such as DB2, Oracle, MS SQL Server, Teradata, Salesforce, Azure Cloud, and flat files.
- Experience implementing complex business rules by creating reusable transformations and robust mappings/mapplets using various transformations.
- Good knowledge of PowerCenter performance tuning techniques and SQL query optimization and tuning.
- Experience with Informatica TDM, providing de-identified test data to non-production environments for testing data needs.
- Worked on Informatica Big Data Management (BDM) with the Cloudera Hadoop platform.
- Good communication, interpersonal, and analytical skills; able to work under strict deadlines and diverse conditions.
- Able to work independently as well as technically lead a team of 2-3 resources.

Technical Skills:
Data Warehousing (ETL Tools): Informatica PowerCenter, Informatica TDM and BDM
Databases: Oracle (SQL & PL/SQL)
Programming Languages: UNIX Shell Scripting and Python (Basics)
Cloud Technologies: Azure Fundamentals

Employment History:
Name of the Company | Designation | From | To | Duration
Lumen Technologies | Senior Software Engineer | Feb 2022 | Till Date | 1 Year 10 Months
Cognizant Technology Solutions | Associate - Projects | Mar 2014 | Jan 2022 | 7 Years 10 Months
iGATE Global Solutions Limited | Senior Software Engineer | Oct 2010 | Sep 2013 | 3 Years
Datacore Technologies Pvt. Ltd. | Software Engineer | Apr 2010 | Sep 2010 | 6 Months

Project Profile:

#7 Project Name: Hawkeyes Service Assurance & Service Delivery
Client: Lumen Technologies
Location: Oregon, United States of America
Technologies Used: Informatica 10.1.0, Oracle, UNIX & Azure Basics
Duration: February 2022 - Till date

Project Description:
Lumen Technologies Data Enablement provides data solutions for special initiatives such as Gainsight; integration and consolidation efforts into centralized data stores for Sales Orders, Trouble Ticketing, and Revenue; new product development initiatives; and executive data needs for Sales, Service Delivery, Service Assurance, and Operations, to meet our business stakeholders' data needs. We deliver data warehouse and big data/data lake implementations and ensure quick turnarounds as new requirements continue to come in. The team works in a truly agile, PI-planning approach, collaborating and coordinating continuously with stakeholders and coworkers to deliver software implementations.

Roles & Responsibilities:
- Worked with architects, system engineers, and product owners to understand and detail business and system requirements and logical/physical data models.
- Worked with scrum team members and other teams' system analysts to create, update, and enhance source-to-target mapping documents.
- Worked on quality assurance to generate test plans and test cases.
- Analyzed programming requests to assist integration with current applications; developed ETL code and test scenarios, and assisted with UAT builds and data loading for User Acceptance Testing.
- Estimated levels of effort (LOEs) for work requests: analysis, design, development, and testing.
- Participated in and consulted on integrated application and regression testing tasks, including project implementation, researching and evaluating problems, and recommending and implementing decisions.
- Conducted training to provide technical expertise to end users.

#6 Project Name: Kaiser - Regulatory Finance
Client: Kaiser Permanente
Location: Oregon, United States of America
Technologies Used: Informatica 10.1.0 with BDM, TDM, Oracle, UNIX & Python
Duration: May 2015 - January 2022

Project Description:
Section 1343 of the Affordable Care Act (ACA) created the Risk Adjustment program to better spread the financial risk borne by health insurance issuers, to stabilize premiums, and to give issuers the ability to offer a variety of plans to meet the needs of a diverse population. Under the risk adjustment program, payments are transferred from issuers with relatively lower-risk populations to issuers with relatively higher-risk populations. Non-grandfathered individual and small group market plans, whether or not they are part of the exchange, submit risk adjustment data (Claims & Enrollments) that is used to determine individual-level risk scores, plan average actuarial risk, and the associated payments and charges.

Roles & Responsibilities:
Sr. ETL Informatica Developer & Lead:
- Analyzed the requirements and prepared high-level and low-level design documents with the proposed solution to handle the business requirements.
- Developed code per the approved design documents and conducted internal peer reviews; prepared UTC documents and captured unit test results.
- Migrated the code to the higher environments (SIT & UAT).
- Provided support for System Integration and User Acceptance testing.
- Handed over to the Production Support team after successful completion of SIT & UAT, before deploying to production.
- Prepared deployment documents to migrate the code to the production environment with the necessary business approvals.
- Provided post-production support, including production deployment validations and post-production issue fixes.
- Participated in daily/weekly status calls to report on project activities.

TDM Informatica Developer:
- Prepared de-identified data for all ACA applications for the V2016 and V2019 SMP rules per CMS regulations.
- Data masking: data creation, masking of sensitive fields, and loading the masked data to non-production environments.
- Created masking policies, applied them to the data sources, and generated workflows into Informatica PowerCenter using the Informatica TDM web portal tool.
- Collected intake requests for data preparation from different applications (3R, Edge Server, CSR, KPMETA, RSAC, RACS and RADA), prepared data, performed validations, and ran post-load updates.
- Coordinated and collaborated with various application teams and stakeholders.
- Worked with Python developers to build scripts to automate the TDM processes and validate the data between the actual source systems and the TDM databases.

POC Using Informatica BDM with the Cloudera Hadoop Platform:
The DSSHCR - 3Rs project was implemented using Informatica PowerCenter, UNIX, and Oracle for the Enrollment, Pharmacy, and Utilization tracks. A new project under Kaiser Permanente will leverage the Cloudera Hadoop platform, with the Extract, Transform and Load portion implemented using the Informatica Big Data Management tool. The purpose of the POC was to detail the business use case to be implemented using the Informatica BDM tool with the Cloudera Hadoop platform.
The use case focused on the processing capabilities of the Big Data Hadoop platform by pushing the Informatica BDM transformations down into the Hadoop cluster.

Tools Used: Informatica BDM with the Cloudera Hadoop Platform, Oracle SQL Developer, and UNIX.

Roles & Responsibilities:
- Developed the ETL mappings and workflows using the BDM tool to replicate the existing systems' scenarios for both the Enrollment and Utilization tracks.
- Created and executed the ETL jobs using the Blaze, Spark, and Hive engines and compared the performance statistics across all three engines.
- Created ETL objects exercising all the available transformations to verify the capabilities and limitations of each of the three engines.

#5 Project Name: DSSHCR - 3Rs Purging
Client: Kaiser Permanente
Location: Oregon, United States of America
Duration: April 2014 - April 2015
Technologies Used: Informatica, UNIX and Oracle

Project Description:
As part of Kaiser Permanente's data management policies, archiving and purging of data is recommended for every application depending on business needs. The Claims Management System (CMS) mandates that data submitted by all vendors be retained for a certain period (5 years) for audit purposes. Data archiving/purging is the process of identifying and moving inactive data out of the current production systems and into long-term archival storage systems.
Moving inactive data out of production systems optimizes the performance of the resources needed there, while specialized archival systems store the information more cost-effectively and provide for retrieval whenever needed.

Roles & Responsibilities:
- Analyzed the retention periods required on the SST / HST / VAD tables for purging/archival by setting up meetings with the respective teams.
- Involved in the creation of PL/SQL stored procedures to purge the data for individual schemas.
- Involved in the creation of ETL jobs to automate the purge process.
- Provided support for the SIT and UAT testing phases.
- Deployed the code to the SIT, UAT & Production environments.

#4 Project Name: OMAR Enhancement and OMAR Ponytail
Client: Motorola Solutions
Location: Chicago, United States of America
Technologies Used: Informatica, UNIX, Oracle
Duration: January 2013 - September 2013

Project Description:
The OMAR Enhancement and OMAR Ponytail projects are extensions of the Order Management project, covering requirements in the NA and EMEA data warehouses.

Roles & Responsibilities:
- Performed detailed analysis of the requirements for the EMEA and NA regions and prepared design and mapping specification documents for the different user requirements.
- Developed the code and performed unit testing.
- Provided support for System Integration and User Acceptance testing.
- Deployed the implemented changes to production.
- Also worked on production support activities to schedule and monitor the EMEA and NA data loads.

#3 Project Name: Motorola EIM Solution Architect
Client: Motorola Solutions
Location: Chicago, United States of America
Technologies Used: Informatica, UNIX, Oracle & Teradata
Duration: October 2012 - December 2012

Project Description:
Problematic data issues at the DWH level were raised as individual tickets assigned to the Level 3 support team. The assignee's key responsibility was to analyze each assigned ticket in detail through a series of discussions with the user, the support team, and others, and to resolve the issue by finding its root cause.
This project gave us direct one-to-one interaction with clients on a periodic basis through calls and email communication.

Roles & Responsibilities:
- Performed detailed analysis of the issue reported by the user in order to find the root cause.
- Held periodic calls with the user to understand the reported issue and to give the status of the ticket.
- Prepared a design document describing the necessary solution for the reported bug.
- Implemented the necessary fix and performed unit testing.
- Deployed the change to the production instance upon the necessary business approvals.

#2 Project Name: EIM - Order Management
Client: Motorola Solutions
Location: Chicago, United States of America
Technologies Used: Oracle, Informatica and Unix
Duration: August 2011 - September 2012

Project Description:
The customer, Motorola, is one of the leading manufacturers of mobile handsets and other electronic equipment. Motorola has an Order Management ERP application which maintains the details of all orders placed through to final delivery. The key objective of the Order Management data warehouse project was to extract data from various data sources (ERP and COF) and load it into a unified database for reporting and other purposes.

Roles & Responsibilities:
- Performed detailed analysis of the requirements for the EMEA and NA regions and prepared design and mapping specification documents for the different user requirements.
- Developed the code and performed unit testing.
- Provided support for System Integration and User Acceptance testing.
- Deployed the implemented changes to production.
- Also worked on production support activities to schedule and monitor the EMEA and NA data loads.

#1 Project Name: HNM Spend/Savings by BU/Technology
Client: Motorola Solutions
Location: Chicago, United States of America
Technologies Used: Informatica, UNIX and Oracle
Duration: October 2010 - July 2011

Project Description:
The business does ad-hoc reporting and data analysis using OBIEE Answers.
OBIEE Answers is built on the SMS system, which gets orders and shipments data from the GEMS NA (Liberty) data warehouse. There are three different source systems (SMS, XACTLY, SFDC) from which the data needs to be brought into our data warehouse (EDW).

Roles & Responsibilities:
- Involved in requirement gathering and requirement analysis for the assigned module.
- Involved in preparing the technical design document for the SMS module.
- Developed the code for the SFDC, XACTLY and SMS modules.
- Created unit test case documents and captured the unit test results.
- Involved in peer reviews.

Education:
Title of the Degree with Branch | University | Year of Passing
Master of Computer Applications | Sri Venkateswara University | 2009
Bachelor of Science in Computer Applications | Sri Venkateswara University | 2006
XII | Board of Intermediate Education | 2000
X | Board of Secondary Education | 1998
