Name: Satya
Phone: PHONE NUMBER AVAILABLE
Email: EMAIL AVAILABLE

SUMMARY:
- 15+ years of IT experience, with 8+ years in Informatica PowerCenter across all phases of analysis, design, development, implementation, and support of data warehousing applications
- Around 1.5 years of experience in IICS
- Experience in data warehousing using ETL tools such as Informatica PowerCenter PHONE NUMBER AVAILABLE/8.6/7.1 with databases including DB2, Oracle, MS SQL Server, Teradata, and MongoDB
- Expertise in Oracle 11g/10g/9i PL/SQL functions, procedures, packages, triggers, and materialized views
- Excellent knowledge of creating and maintaining database objects such as tables, indexes, views, synonyms, stored procedures, and packages; expert in writing complex SQL queries
- Extensive knowledge of the SDLC, with involvement in all phases from requirement gathering through production deployment and warranty support
- Extensive experience using ETL processes to design and build very large-scale OLAP data stores with Teradata and Informatica; ETL implementation using SQL Server Integration Services (SSIS) and Reporting Services (SSRS)
- Extensively worked on extraction, transformation, and loading (ETL) using Informatica PowerCenter 10.2, Informatica Data Quality (IDQ) 9.x, Informatica Intelligent Cloud Services (IICS), and Informatica Big Data Edition (BDE) 9.x
- Performed data profiling using Informatica Data Quality; designed, developed, tested, and implemented ETL processes using Informatica Cloud
- Experienced in database design, data analysis, development, administration, SQL performance tuning, data warehousing ETL processes, and data conversions
- Expertise in administering and maintaining Dev, Staging, UAT, Prod, and standby databases for DSS and data warehousing environments
- Extensive experience with Informatica PowerCenter hosted on UNIX, RHEL, J2EE, and Windows platforms
- Extensively worked on the Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer, and Mapplet Designer
- Extensively worked on PowerCenter transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator, XML Source Qualifier, HTML, and Web Services transformations
- Experience with WSDL, XML, SOAP messages, and web services, including invoking REST web services through the HTTP transformation
- Experience with Continuous Integration (CI) and Continuous Deployment (CD) methodologies using Jenkins
- Expertise in Informatica performance tuning at the source, target, mapping, transformation, and session levels to improve session performance
- Good understanding of relational database management systems such as Oracle, DB2, and SQL Server; extensively worked on data integration using Informatica for extraction, transformation, and loading from various database source systems
- Expertise in creating databases, users, tables, triggers, macros, views, functions, packages, joins, and hash indexes in Oracle and Teradata
- Experience using Informatica utilities such as pushdown optimization and session partitioning to improve mapping performance
- Highly proficient in creating SQL Server reports and graphs, handling sub-reports, and writing queries for roll-up and drill-down operations in SSRS
- Worked on OBIEE reports and dashboards
- Working experience with Informatica Enterprise Data Catalog (EDC)
- Excellent experience developing pipelines in Informatica Intelligent Cloud Services (IICS) and developing Application Integration workflows and real-time APIs (ICRT) using AWS, AWS Glue Studio, AWS Secrets Manager, Informatica Cloud, etc.
- Created connections to SAP HANA from the SAP BODS repository for data loading
- Knowledge of machine learning and exposure to its concepts; worked with Java Spring Tool Suite
- Experience developing BI applications using SQL Server, the BI stack, and Power BI
- Experience with high-volume datasets from sources such as Oracle, flat files, SQL Server, and XML
- Experienced in handling Slowly Changing Dimensions (SCDs) using Informatica
- Expertise in troubleshooting production issues: root cause analysis to identify the problem, impact analysis to determine dependencies, and providing the resolution
- Extensive knowledge of performance tuning techniques on sources, targets, mappings, and workflows using partitioning/parallelization and by eliminating cache-intensive transformations
- Experienced in creating and scheduling SQL Server Agent jobs to run ETL processes and perform data massaging
- Worked with scheduling tools such as Autosys, Tidal, and Espresso
- Well versed in UNIX shell scripting
- Experience in Waterfall and Agile environments
- Good experience with incident management and change management processes
- Good at planning, organizing, and prioritizing multiple tasks; interested in learning new technologies and willing to work in changing environments
- Strong decision-making and interpersonal skills with result-oriented dedication to goals

TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter PHONE NUMBER AVAILABLE6/.1/7.1, Power Exchange 10.1/9.5.1/8.6.1, Informatica Cloud, Informatica Intelligent Cloud Services (IICS), Snowflake
Databases: Oracle 9i/10g/11g/12c, DB2, SQL Server 2005/2008/2012, Netezza, Teradata (FastLoad, MultiLoad, FastExport), NoSQL, PL/SQL, Oracle Exadata, MySQL, SAS
Data Modeling Tools: ERwin, Toad Data Modeler, MS Visio, Kafka/Cassandra, OEM
Scripting Languages: UNIX shell scripting, Windows batch scripting, C++, JavaScript, Python
Operating Systems: Windows, Linux, UNIX, RHEL, J2EE
Schedulers: Control-M, Autosys, Tidal, Espresso, Tivoli, Airflow
Reporting: Power BI, Tableau

PROFESSIONAL EXPERIENCE:
Client: CSG, Florida  Aug 2023 - Present
Database ETL and IICS Developer
Responsibilities:
- Worked closely with Business Analysts, Architects, the Reporting team, Product Owners, and business users on requirement gathering, UAT testing, and production issues
- Implemented complex business logic and processes through SQL stored procedures, functions, views, and advanced query concepts
- Developed mappings, sessions, and tasks using Informatica PowerCenter and IICS for a Home Depot project
- Participated in daily stand-up calls, sprint status calls, retrospectives, and check-in meetings with business users
- Performed reverse engineering for various CRs to correct/update millions of production records alongside new code deployments
- Troubleshot DEV/QA environments for failed jobs and ensured scheduled Appworx jobs ran so users could complete QA/UAT testing
- Worked on various production issues such as ETL job failures, data discrepancies, and business justifications
- Extensively worked with the Appworx scheduler to schedule ETL jobs with their dependencies and to fix failures
- Created unit test cases against the business requirements and performed unit testing of mappings, sessions, workflows, tasks, and taskflows
- Developed a dynamic mapping in IICS and created mapping tasks to load multiple tables from that single dynamic mapping
- Extensively worked on Informatica versioning and deployment groups, and worked with the migration team on code deployments to QA and Prod
- Involved in data analysis for source and target systems; good understanding of data warehousing concepts, star schema, and snowflake schema
- Designed cloud real-time replication tasks in IICS (Informatica Intelligent Cloud Services) with Salesforce integration, and designed ETL processes using Informatica PowerCenter
- Excellent T-SQL development skills for complex multi-table queries; developed and maintained stored procedures, packages, triggers, and user-defined functions
- Worked on batch processing, scheduling, and shell scripting
- Created PL/SQL functions and procedures for the ETL process and for data validations between source and target tables
- Worked on the PowerCenter-to-IICS migration and validated IICS objects, updating code to use the functionality available in IICS
- Experience in business intelligence, data modeling, and data warehousing
- Implemented several DAX functions for fact calculations to support efficient data visualization in Power BI
- Worked closely with the QA team during test cycles on issue resolution and bug fixes
- Created mappings in Mapping Designer to load data from sources such as MySQL, Oracle, flat files, MS SQL Server, and XML; extensively used ETL to load data from flat files, XML, Oracle ODI, MS SQL, etc., and used the Debugger to test mappings and fix bugs
- Experience working with cloud technologies such as Amazon S3, AWS Glue Studio, and AWS Secrets Manager
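The source-to-target data validation work described in this role can be sketched as a small script that compares row counts and a key-ordered checksum between a source and target table. This is an illustrative sketch only: it uses SQLite in place of the actual Oracle/Teradata systems, and the `orders` table and its columns are hypothetical.

```python
# Sketch of source-to-target validation: compare row counts and a
# key-ordered checksum. SQLite stands in for the real source/target
# databases; the "orders" table is a hypothetical example.
import hashlib
import sqlite3

def row_count(conn, table):
    # A count mismatch is the first sign of a load problem.
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def table_checksum(conn, table, key_col):
    # Checksum over all rows, ordered by the key column for determinism.
    rows = conn.execute(f"SELECT * FROM {table} ORDER BY {key_col}").fetchall()
    digest = hashlib.md5()
    for row in rows:
        digest.update(repr(row).encode("utf-8"))
    return digest.hexdigest()

def validate(source, target, table, key_col):
    # Returns (counts_match, checksums_match) for a loaded table.
    counts_ok = row_count(source, table) == row_count(target, table)
    sums_ok = (table_checksum(source, table, key_col)
               == table_checksum(target, table, key_col))
    return counts_ok, sums_ok

if __name__ == "__main__":
    src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for conn in (src, tgt):
        conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
        conn.executemany("INSERT INTO orders VALUES (?, ?)",
                         [(1, 10.0), (2, 25.5)])
    print(validate(src, tgt, "orders", "id"))  # (True, True)
```

In practice the same pair of checks would be issued as SQL against the real source and target connections after each load.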
- Involved in performance tuning at various levels, including target, source, and mapping
- Knowledge of working in environments such as Azure, Databricks, Snowflake, and AWS
- Experience working on Tableau reports and dashboards
- Experience uploading data into AWS S3 buckets using the Informatica Amazon S3 plug-in
- Monitored and troubleshot Snowflake databases on AWS and cluster-related issues
- Experience using IICS task types such as Synchronization, Replication, Mass Ingestion, Mapping, Data Integration, PowerCenter, and Masking tasks to implement business needs in the cloud enterprise data warehouse on AWS Redshift

Environment: Informatica PowerCenter 10.2, IICS (Informatica Intelligent Cloud Services), T-SQL, Oracle 11g & Teradata, SQL Developer, SQL Server 13.0, SSRS, MySQL, ESP, UNIX, Remedy, AWS, Azure, SAS, Tableau

Client: Cleveland Clinic Foundation, Pennsylvania  Jun 2022 - Jun 2023
ETL and Database Developer
Responsibilities:
- Worked with doctors, Business Analysts, Architects, the data team, etc., to gather requirements during development phases and to resolve data issues
- Implemented complex business logic and processes through SQL stored procedures, functions, views, and advanced query concepts
- Created a new search engine with new servers and database structures from scratch
- Analyzed and created table dependencies for pulling data from many sources such as EPIC, CoPath, and SunQuest
- Created multiple staging, fact, and dimension tables using Informatica mappings in PowerCenter and IICS, plus Teradata views and stored procedures
- Created multiple tasks and taskflows in IICS and scheduled them using Tivoli; worked on failed IICS jobs and got them running again
- Experience in MDM to resolve business issues caused by inadequate data
- Worked extensively on extracting hundreds of millions of comment records, cleaning them, and integrating them into meaningful, accurate, and consistent data
- Processed various forms of Epic comments (plain text, rich text, tabular data, etc.) and ensured they met the requirements for case reports
- Created many views to process, integrate, and concatenate case comments based on requisition key, group number, line number, etc.
- Generated and uploaded case slides and images to the client, meeting monthly deadlines
- Developed jobs to create data manifests for uploaded images and verify that all images carried their metadata
- Explored and investigated new databases across servers in a large data mart to retrieve data required by the client for various diseases
- Reverse-engineered front-end data and dashboards to identify the underlying database tables/fields and associate them with lookup tables, producing meaningful data for the client
- Worked closely with many internal teams to investigate data and obtain needed access
- Troubleshot DEV/QA environments for failed jobs and ensured scheduled jobs ran so users could complete QA/UAT testing
- Worked on various production issues such as ETL job failures, data discrepancies, and business justifications
- Extensively worked with the Airflow and Tivoli schedulers to schedule ETL jobs with their dependencies and to fix failures
- Created unit test cases against the business requirements and performed unit testing of all created jobs
- Extensively worked on DataStage versioning and deployment groups, and worked with the migration team on code deployments to QA and Prod
- Involved in data analysis for source and target systems; good understanding of data warehousing concepts, star schema, and snowflake schema
- Worked on Data Vault modeling for long-term historical storage of data
- Excellent T-SQL development skills for complex multi-table queries; developed and maintained stored procedures, packages, triggers, and user-defined functions
- Created PL/SQL functions and procedures for the ETL process and for data validations between source and target tables
- Updated BODS code per client requirements, performed testing, and moved the code across environments
- Worked in Enterprise Data Catalog (EDC) to filter search results, trace data lineage, etc.
- Created the technical documents for the BODS interface and uploaded them to SharePoint
- Experience with SSIS tools such as the Import and Export Wizard and package installation
- Masked sensitive data per HIPAA using the Data Privacy Management (DPM) and Data Masking applications
- Experience importing data between sources such as Teradata and SQL Server using the SSIS/DTS utility
- Experience in business intelligence, data modeling, and data warehousing
- Worked closely with the QA team during test cycles on issue resolution and bug fixes
- Created mappings in Mapping Designer to load data from sources such as Oracle, flat files, MS SQL, and XML
- Experience working with the Java Spring Tool Suite
- Extensively used ETL to load data from flat files, XML, Oracle, MS SQL, etc., and used the Debugger to test mappings and fix bugs
- Experience working with cloud technologies such as Amazon S3, AWS Glue Studio, and AWS Secrets Manager
- Involved in performance tuning at various levels
- Knowledge of working in environments such as Azure, Databricks, Snowflake, and AWS

Environment: Informatica PowerCenter, MDM, T-SQL, Oracle 11g & Teradata, SQL Developer, SQL Server 13.0, SSRS, MySQL, ESP, UNIX, Remedy, AWS, Azure, Tivoli, Airflow

Client: Citrix Systems Inc, Florida  Jan 2020 - Apr 2022
PowerCenter ETL Developer
Responsibilities:
- Worked closely with Business Analysts, Architects, and business users; coordinated with them to understand business and functional needs and captured those needs in an ETL design document
- Implemented complex business logic and processes through SQL stored procedures, functions, views, and advanced query concepts
- Developed mappings and sessions using Informatica PowerCenter for data loading
- Extensive experience with T-SQL in constructing triggers and tables and implementing stored procedures, functions, views, user profiles, data dictionaries, and data integrity
- Developed and implemented error handling strategies for the Informatica ETL process
- Extensively worked on Informatica versioning and deployment groups
- Involved in data analysis for source and target systems; good understanding of data warehousing concepts, star schema, and snowflake schema
- Designed cloud real-time replication tasks and ETL processes using Informatica PowerCenter
- Involved in supporting and maintaining Oracle ODI jobs and MySQL imports, exports, and SQL*Loader jobs
- Excellent T-SQL and BTEQ development skills for complex multi-table queries; developed and maintained stored procedures, triggers, and user-defined functions
- Worked on batch processing, scheduling, and shell scripting
- Performed SQL, PL/SQL, and application tuning using tools such as Explain Plan, SQL*Trace, TKPROF, and Auto-Trace
- Created PL/SQL functions and procedures for the ETL process and for data validations between source and target tables
- Good hands-on experience with MySQL and SAS
- Experience in business intelligence, data modeling, and data warehousing
- Implemented several DAX functions for fact calculations to support efficient data visualization in Power BI
- Developed Sqoop import jobs for data migration from legacy platforms to big data platforms such as Kafka/Cassandra/Talend
- Involved in writing complex SQL scripts, driven by Excel sheets, for loading data into the maintenance tables
- Worked closely with the QA team during test cycles on issue resolution and bug fixes
- Created mappings in Mapping Designer to load data from sources such as MySQL, Oracle, flat files, MS SQL Server, and XML
- Provided UAT support for business users when needed
- Created reports using SSRS from sources such as SQL Server R2 and Analysis Services cubes
- Involved in documenting the ETL process, including the various mappings, their execution order, and their dependencies
- Experience with webMethods Business Activity Monitoring (BAM) components
- Extensively used ETL to load data from flat files, XML, Oracle, MS SQL, etc., and used the Debugger to test mappings and fix bugs
- Deployed SSRS reports across multiple environments, including Test and Production, and supported Reporting Services Report Manager
- Involved in performance tuning of various mappings and workflows
- Prepared unit test cases against the business requirements and performed unit testing of mappings
- Involved in writing batch scripts
- Performed data validation by writing SQL and BTEQ queries
- Acquainted with the Pega tool

Environment: Informatica PowerCenter 10.2, T-SQL, Oracle 11g & Teradata, SQL Developer, SQL Server 13.0, SSRS, MySQL, ESP, UNIX, Remedy, AWS, Azure, SAS

CAI, Pennsylvania  Aug 2016 to Dec 2019
PowerCenter ETL Developer
Responsibilities:
- Involved in understanding the business requirements, discussing them with Business Analysts, analyzing them, and preparing business rules per HIPAA standards
- Involved in designing the entire data warehousing life cycle
- Analyzed/profiled the source data, performed gap analysis, and implemented rules to cleanse the data
- Responsible for functional and technical design of the data warehouse, introducing new fact and dimension tables to the existing model
- Modeled the data warehousing data marts using star schema
- Worked on Informatica tools such as Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer
- Worked as a developer on the project's BI Reports module on MS SQL Server 2005 (using SSRS, T-SQL, scripts, stored procedures, and views) and MongoDB
- Experience working in SAS and on OBIEE reports and dashboards
- Optimized and performance-tuned ETLs, SQL, and BI reports
- Worked on tools such as Tableau and MicroStrategy
- Experience in the health, insurance, e-commerce, and telecom domains; knowledge of their functionality, with work on processing claims, payments, customers, etc.
- Involved in running loads to the data warehouse and data marts across different environments
- Responsible for definition, development, and testing of processes/programs to extract data from the client's operational databases, transform and cleanse it, and load it into data marts
- Used the Update Strategy transformation to effectively migrate data from source to target
- Created mappings and mapplets using transformations such as Lookup, Aggregator, Expression, Stored Procedure, Sequence Generator, Router, Filter, Normalizer, XML, and Update Strategy
- Involved in requirement analysis and ETL design and development for extracting data from source systems such as Oracle, SQL Server, and flat files and loading it into Netezza
- Extensively used PL/SQL programming in back-end and front-end functions, procedures, and packages to implement business rules
- Created, configured, and scheduled sessions and batches for different mappings using Workflow Manager and UNIX scripts
- Worked on job monitoring and scheduling in Espresso, Control-M, etc.
- Analyzed and debugged production issues with quick turnaround
- Interacted with end users to identify key dimensions and abstract quantitative metrics for deciding business solutions
- Designed, developed, and monitored enterprise business intelligence solutions in Production; monitored and analyzed SQL Server logs and application logs
- Contributed to the design and development of software using Java/Spring Batch/microservice apps and JavaScript
- Experience with Power Exchange for loading/retrieving data from mainframe systems
- Extensively worked on performance tuning and query optimization of programs, ETL procedures, and processes
- Expertise in functionality such as FTP, SFTP, and MFTP
- Interacted daily with the offshore team on tickets/issues and follow-ups
- Organized report data using filters, sorting, and ranking, and highlighted data with alerts
- Using Informatica PowerCenter Designer, analyzed source data and extracted and transformed it from various source systems (Oracle, Teradata, DB2, SQL Server, and flat files), incorporating business rules through the objects and functions the tool supports
- Expertise in shell, Perl, and Python scripting

Environment: Informatica PowerCenter 8.6.1, Informatica Power Connect/Power Exchange, Data Analyzer 8.1, TOAD, MS SQL, Teradata, Oracle, PL/SQL, Oracle 10g/9i, SQL Server, NoSQL, MongoDB, SSRS, T-SQL, UNIX, JavaScript, ServiceNOW, GitHub, DataStage, Microsoft Azure

BBB, NJ  Oct 2015 to Jul 2016
ETL Developer
Responsibilities:
- Experience making changes to existing mappings per user requirements and business logic
- Involved in requirements definition and analysis in support of data warehousing efforts
- Worked with high-volume datasets from various sources: Oracle, text files, relational tables, and XML targets
- Built interfaces and automated them with the Informatica ETL tool
- Extracted raw data from Microsoft Dynamics CRM to staging tables using Informatica Intelligent Cloud Services (IICS)
- Knowledge of Informatica DEI and big data
- Extensively used transformations such as Stored Procedure (for index drop/rebuild), Router (for creating different pipelines), Lookup (connected and unconnected), Update Strategy (to define insert/update/delete/reject), Aggregator, and Sequence Generator (for generating surrogate keys)
- Called stored procedures to perform database operations in post-session and pre-session commands
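The insert/update/delete/reject routing that the Update Strategy transformation performs can be sketched outside Informatica as a plain function. This is an illustrative analogue of the transformation's DD_INSERT/DD_UPDATE/DD_DELETE/DD_REJECT flags, not the actual mapping; the row layout and key names are hypothetical.

```python
# Sketch of update-strategy routing: classify each incoming row as an
# insert, update, delete, or reject by comparing it to the target's keys.
# Row layout ("id", "deleted") is a hypothetical example.
def route_rows(incoming, target_keys):
    routed = {"insert": [], "update": [], "delete": [], "reject": []}
    for row in incoming:
        key = row.get("id")
        if key is None:
            routed["reject"].append(row)   # no business key: reject
        elif row.get("deleted"):
            routed["delete"].append(row)   # source flagged the row for deletion
        elif key in target_keys:
            routed["update"].append(row)   # key already exists in the target
        else:
            routed["insert"].append(row)   # brand-new key
    return routed

rows = [{"id": 1}, {"id": 2, "deleted": True}, {"id": 3}, {"name": "x"}]
print(route_rows(rows, target_keys={1, 2}))
```

Each bucket corresponds to one pipeline a Router transformation would feed after the strategy flag is set.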
- Extensively used Informatica to load historical data from various tables for different departments and data marts
- Developed Slowly Changing Dimensions (SCD Types 1 and 2) using the MD5 hash function
- Designed and enabled Informatica workflows as web services, using Web Service as the source and target, and exposed them for real-time processing by third-party clients, including Java clients
- Used the Web Services Consumer transformation to access third-party web services, and the HTTP transformation to access REST web service GET and POST methods to download and upload attachments to different applications
- Implemented real-time New Relic integration from IICS logs and Snowflake load logs
- Created and monitored sessions and workflows using the Informatica PowerMart server
- Responsible for tuning ETL procedures and star schemas to optimize load and query performance
- Used the version control tool ChangeMan to deploy scripts into the QA and UAT environments
- Working experience with AWS cloud technologies, including Redshift and JSON files; extracted data from legacy systems and loaded it into AWS Redshift
- Tuned performance to increase throughput at both the mapping and session levels
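The MD5-based SCD Type 2 pattern mentioned above can be sketched in Python: hash the tracked attributes, and on a hash change expire the current dimension row and insert a new version. The dimension layout, column names, and audit fields here are hypothetical illustrations, not the actual Informatica implementation.

```python
# Sketch of SCD Type 2 change detection via an MD5 hash of tracked
# attributes; dimension layout and audit columns are hypothetical.
import hashlib

def attr_hash(record, tracked_cols):
    # Hashing the concatenated tracked attributes lets us detect change
    # with one comparison instead of column-by-column checks.
    raw = "|".join(str(record[c]) for c in tracked_cols)
    return hashlib.md5(raw.encode("utf-8")).hexdigest()

def apply_scd2(dimension, incoming, key, tracked_cols, today):
    current = {r[key]: r for r in dimension if r["is_current"]}
    for rec in incoming:
        new_hash = attr_hash(rec, tracked_cols)
        existing = current.get(rec[key])
        if existing is None:
            # New business key: insert a fresh current row.
            dimension.append({**rec, "md5": new_hash, "eff_date": today,
                              "end_date": None, "is_current": True})
        elif existing["md5"] != new_hash:
            # Attributes changed: expire the old row, add a new version.
            existing["end_date"] = today
            existing["is_current"] = False
            dimension.append({**rec, "md5": new_hash, "eff_date": today,
                              "end_date": None, "is_current": True})
        # Identical hash: no action (history is kept only on change).
    return dimension

dim = []
apply_scd2(dim, [{"cust_id": 1, "city": "Tampa"}], "cust_id", ["city"],
           today="2024-01-01")
apply_scd2(dim, [{"cust_id": 1, "city": "Miami"}], "cust_id", ["city"],
           today="2024-02-01")
print(len(dim))  # 2: the expired Tampa version plus the current Miami version
```

An SCD Type 1 variant would simply overwrite the existing row in place when the hash differs instead of versioning it.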
- Created parameters and variables to increase the usability of mappings
- Created multiple partitions in sessions to load data concurrently
- Resolved issues in Integration and Production
- Converted specifications to programs and data mappings in an Informatica Intelligent Cloud Services (IICS) support environment
- Involved in requirements definition and analysis in support of data warehousing efforts
- Worked on preparing summarization and aggregation tables for the fact data
- Expertise in ETL reporting based on flat-file data sourced from different platforms
- Developed ETL jobs with APIs to extract information from the enterprise data warehouse
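The fact summarization and aggregation work described above follows a simple roll-up pattern: group fact rows at a coarser grain and sum a measure. A minimal sketch, with a hypothetical fact layout, is:

```python
# Sketch of building a summary (aggregate) table from fact rows: roll the
# facts up to the grain of group_cols, summing a measure. The fact layout
# (region/product/sales) is a hypothetical example.
from itertools import groupby
from operator import itemgetter

def summarize(facts, group_cols, measure):
    keyfunc = itemgetter(*group_cols)
    rows = sorted(facts, key=keyfunc)  # groupby requires sorted input
    summary = []
    for key, grp in groupby(rows, key=keyfunc):
        key = key if isinstance(key, tuple) else (key,)
        total = sum(r[measure] for r in grp)
        summary.append(dict(zip(group_cols, key), **{measure: total}))
    return summary

facts = [
    {"region": "East", "product": "A", "sales": 100.0},
    {"region": "East", "product": "A", "sales": 50.0},
    {"region": "West", "product": "B", "sales": 75.0},
]
print(summarize(facts, ["region"], "sales"))
```

In the warehouse itself the equivalent would be a `GROUP BY` insert into the summary table, typically run after each fact load.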
Environment: Informatica PowerCenter 7.1, Informatica IICS, Informatica DEI, Oracle 9i, SQL, PL/SQL, SQL*Loader, Windows, ERwin, T-SQL, Windows batch scripting, ServiceNOW, DataStage, AWS

PNC Bank, Bangalore (TCS)  Oct 2013 to Jul 2015
ETL Developer
- Involved in design, planning, and implementation at the highest level of performance and functionality; analyzed the application domains
- Participated in knowledge transfers from dependent teams to understand the business activities and application programs, and documented the understanding for internal team reference
- Interacted with functional/end users to gather core reporting system requirements, understand the features users expected from the ETL and reporting system, and successfully implement the business logic
- Studied the detailed requirements of the system's end users and their expectations of the applications
- Involved in data analysis for source and target systems; good understanding of data warehousing concepts, staging tables, dimensions, facts, star schema, and snowflake schema
- Performed business process re-engineering to optimize IT resource utilization
- Integrated various data sources: Oracle 10g, SQL Server 2008/2012, fixed-width and delimited flat files, DB2, COBOL files, and XML files
- Transformed data from sources such as Excel and text files into a reporting database to build an analytical reporting system
- Experience working in the PowerCenter Admin Console
- Initiated data modeling sessions to design and extend data mart models supporting the applications' reporting needs
- Involved in data extraction from Oracle and flat files using SQL*Loader
- Designed and developed mappings using Informatica; extensively used different lookup types, including in-core and flat-file lookups, and used the Debugger to test mappings and fix bugs
- Involved in fixing invalid mappings, performance tuning, and testing of stored procedures, functions, batches, and target data
- Involved in data extraction from Oracle, flat files, and XML files using SQL*Loader, Freehand SQL, and JSON
- Used Toad to increase user productivity and application code quality while providing an interactive community to support the user experience
- Developed and tested all Informatica mappings, processes, and workflows, involving several tasks
- Imported metadata from sources such as relational databases, XML sources, and Impromptu catalogs into Framework Manager
- Conducted and participated in process improvement discussions, recommending possible outcomes, with a focus on production application stability and enhancements

Environment: Informatica PowerCenter 7.1, Oracle 9i, SQL, PL/SQL, SQL*Loader, Windows, ERwin, T-SQL, Windows batch scripting

Allianz Life Insurance, Bangalore (TCS)  Dec 2009 to Aug 2013
Rational and Testing Admin
- Documented requirement summaries, test plans, and UAT documents, and got them approved by the Change Management approval team for every release
- Analyzed and gathered business requirements with the client for new projects and enhancements
- Fixed all severity tickets in Production within SLA
- Performed requirement analysis, design, and coding; reported status and prepared test completion reports
- Knowledge of the insurance domain; worked with data related to settlements, claims, mortgages, etc.
- Created and applied various BSM patches to deploy code from one version to another
- Troubleshot and fixed various issues during patch applications, production deployments, etc.
- Performed user administration (adding, modifying, and terminating users; granting project permissions) for the Rational RequisitePro, ClearQuest, and MQC tools
- Performed project, server, and service maintenance for the Rational and MQC tools and for Remedy
- Created new back-end databases and configured the Rational tools per user requirements
- Conducted and participated in process improvement discussions, recommending possible outcomes, with a focus on production application stability and enhancements

Environment: Unix, Rational ClearQuest, Rational RequisitePro, Mercury Quality Center, QTP

Nortel Telecommunications, Mumbai (TCS)  Mar 2005 to Aug 2009
C++ Developer
- Analyzed and gathered business requirements with the client for new projects and enhancements
- Handled monthly and weekly change requests, including new development and changes to existing C++ code and shell scripts
- Developed and performance-tuned various shell scripts and C++ and Perl code
- Created and deployed BSM patches across different environments and versions
- Provided on-call and in-person support for various issues during and after production deployments
- Documented requirement summaries, test plans, and UAT documents, and got them approved by the Change Management approval team for every release
- Fixed all severity tickets in Production within SLA
- Coordinated with various teams (CAB, Business, etc.) to explain changes and give live demos of new projects and enhancements
- Performed requirement analysis, design, and coding
- Developed test cases for functional testing; performed system, integration, and regression testing
- Reported and tracked defects; reported status and prepared test completion reports
- Knowledge of the telecom domain; worked with data related to availability, invoices, networks, etc.
- Created and applied various BSM patches to deploy code from one version to another
- Troubleshot and fixed various issues during patch applications, production deployments, etc.
- Conducted and participated in process improvement discussions, recommending possible outcomes, with a focus on production application stability and enhancements

Environment: C++, shell scripts, Unix, Perl