
Java Full Stack Developer Resume Royersf...

Candidate Information
Name Available: Register for Free
Title Candidate's Name
Target Location US-PA-Royersford
Phone Available with paid plan

Candidate's Name
Name: Anil Kumar
Ph no: PHONE NUMBER AVAILABLE

PROFESSIONAL SUMMARY:
Over 13 years of professional experience in the IT industry, with a focus on developing, implementing, and maintaining a variety of applications using Java, J2EE technologies, and object-oriented methodologies. Skilled in enterprise technologies, frameworks, and design patterns.
Certified Scrum Master with expertise in delivering projects through Agile and Test-Driven Development (TDD).
Experienced with J2SE technologies such as the core API, Threads, the Executor framework, CompletableFuture, Futures, Collections, and Exception Handling, and with J2EE technologies such as Servlets, Listeners, JSP, and the Java Security API.
Strong development skills in Java, J2EE, JDBC, JSP, Servlets, EJB, JNDI, RMI, HTML, XML, XSL, JavaScript, Rational Rose, DB2, Oracle, and SQL Server.
Experience writing back ends in Node.js with frameworks like Express and the MongoDB database.
Expertise in implementing core concepts of Java and J2EE technologies: JSP, Servlets, JSF, JSTL, EJB transaction implementation, Spring, Hibernate, Java Beans, and JDBC.
Built stream applications using the Kafka APIs and the Kafka Streams API; wrote Producer and Consumer clients to publish data to and consume data from topics.
Ample experience with bundled packages and familiarity with tools like NPM and Bower as task runners. Used Karma, Jasmine, and Protractor for UI testing of Backbone JS and React JS.
Implemented a retry mechanism before sending messages to an error topic, and implemented multithreaded consumption for slow consumers.
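The retry-before-error-topic pattern mentioned above can be sketched generically. This is a minimal illustration, not the actual implementation: the `Retry` class, the attempt count, and the error-topic callback are hypothetical names, and a plain `Consumer` stands in for a Kafka producer.

```java
import java.util.function.Consumer;

class Retry {
    /**
     * Attempts delivery up to maxAttempts times; if every attempt fails,
     * the payload is handed to the error-topic callback (dead-lettered).
     * Returns true on successful delivery, false if dead-lettered.
     */
    static <T> boolean send(T payload, int maxAttempts,
                            Consumer<T> delivery, Consumer<T> errorTopic) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                delivery.accept(payload); // e.g. a Kafka producer send
                return true;
            } catch (RuntimeException e) {
                if (attempt == maxAttempts) {
                    errorTopic.accept(payload); // route to the error topic
                }
            }
        }
        return false;
    }
}
```

In a real Kafka deployment, the delivery callback would wrap the producer's send to the main topic and the error callback a send to the dead-letter topic; the control flow stays the same.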
Implemented exactly-once semantics using Kafka.
Expertise in JDBC connection pooling, persistence, caching, EJB servers, HTTP, and HTTP tunneling.
Good experience and knowledge of development methodologies such as Test-Driven Development (TDD), Extreme Programming (XP), Scrum, and Agile.
Proficient in front-end application development using Angular 2.0/4.0, React JS, and Ember JS for dynamic user interfaces following the MVC architectural pattern.
Used Kotlin for implementing new modules in the application; knowledge of the Kotlin Android Extensions framework.
Experience working with containerization technologies like Docker, Kubernetes, and OpenShift.
Experience designing and architecting with UML-based diagrams using tools like PlantUML and Lucidchart.
Worked on customized front-end application development using jQuery, React JS, and Handlebars JS; implemented React JS with the Redux library and the Flux pattern.
Extremely strong in Spring Boot, the Spring Framework, Hibernate, Angular 8, React JS, TypeScript, and the JUnit frameworks.
Proficient in using Enterprise Integration Patterns (EIP) with Apache Camel, such as multicast, dynamic router, content-based router, splitter, and recipient list.
Expert in Agile methodologies such as Scrum, Test-Driven Development (TDD), incremental and iterative development, and pair programming across the software development life cycle.
Extensively worked with Collections, Generics, Enumerations, and Annotations.
Knowledge of Spring Cloud for developing Spring Boot-based microservices that interact through REST.
Expertise in Object-Oriented Analysis and Design (OOAD) and OOP using the Unified Modeling Language (UML), design patterns, and MVC frameworks.
In-depth knowledge of Apache Subversion (SVN), Git, Bitbucket, and Jenkins continuous integration server installation, configuration, design, and administration, and of integrating these tools with other systems.
Experienced with Amazon Web Services such as EC2, S3, CloudWatch, DynamoDB, SQS, Lambda, and SNS.
Familiarity with big data processing frameworks like Apache Spark and Apache Flink.
Experience in Angular/React JS development using Test-Driven Development (TDD) with unit testing frameworks such as Jasmine, Protractor, Karma, and Selenium.
Familiar with data architecture, including data ingestion pipeline design, Hadoop information architecture, data modeling, data mining, machine learning, and advanced data processing. Experience optimizing ETL workflows.
Proficient in Service-Oriented Architecture (SOA); experienced in the development and use of web services.
Proficient in caching technologies like Hazelcast and Redis and in integrating them into applications.
Implemented a Test-Driven Development (TDD) environment using JUnit, Mockito, and Sonar for unit testing.
Worked on various J2EE applications on application servers such as WebLogic 10.3, WebSphere, JBoss Fuse 6.1, and Tomcat.
Experience writing JUnit tests using Mockito and PowerMockito, and behavior-based tests using Spock and Cucumber.
Experience in front-end UI technologies such as HTML5, CSS3, jQuery, JSON, AJAX, Node JS, AngularJS, BackboneJS, Bootstrap, tag libraries, and JSTL.
Expertise in creating Single Page Applications (SPA) and reusable components in Angular 6/Angular 10.
Experience implementing e-commerce/distributed applications using HTML, HTML5, CSS, JavaScript, Java, J2EE, Servlets, JSP, Java Beans, JDBC, EJB, XML, XPath, JAXB, JAXP, SQL, jQuery, Unix, Linux, and Windows.
Designed and implemented XML schemas, Java APIs, business logic, and XML/JavaScript user interfaces.
Extensive experience developing web and enterprise applications with development tools like Eclipse, IntelliJ, and WebLogic.
Extensive experience developing unit testing frameworks using JUnit and test-driven methodology.
Experience building projects with the Maven and Ant build systems.
Proficient in core Java concepts such as multithreading, Collections, and exception handling.
Experience with version control tools like SVN, GitHub, and Bitbucket.

TECHNICAL SKILLS:
Programming Languages: Java, J2EE, JDBC, Shell Scripting, Python, JavaScript, TypeScript, C, C++, jQuery, HTML5, DHTML, REST services, ag-Grid
Frameworks/Libraries: Apache Camel, Spring, Spring Boot, Angular 4+, React JS, Apache Spark, Flask, Django, Bootstrap, Dozer, YARN, Express, Jest unit testing framework
Java Enterprise APIs: Servlets, JSP, JUnit, EJB, JNDI, JSON, JMS, JDBC, JavaMail, RMI, web services
Messaging Technologies: Apache Kafka, IBM MQ, RabbitMQ, ActiveMQ, IBM WebSphere, and JMS
System Design: Docker, Kubernetes, OpenShift, MVC, Spring, Spring Boot, Hibernate, CSS3, microservices, Node.JS, reactive and event-driven systems
Cloud Technologies: AWS (S3, ECS), GCP
Databases & Programming: MySQL, SQL, MongoDB, NoSQL, Oracle, SQL Server, IBM DB2, stored procedures, PostgreSQL, AWS DynamoDB, AWS Aurora, Cassandra, Ansible
Software Engineering: UML, design patterns, object-oriented methodologies, Service-Oriented Architecture, Test-Driven Development, Scrum and Agile methodologies
XML Technologies: XML, DOM, SOAP, WSDL
Application Servers: Apache Tomcat, GlassFish, Jenkins, JBoss, WebLogic, IBM WebSphere, Apache Karaf, Eclipse, Maven, Yeoman, Grunt/Gulp
IDEs & Tools: Eclipse, IntelliJ, VS Code, WinSCP, PuTTY, Jenkins, Ant, Maven, Log4j, Splunk, DataDog, Grafana, WebSphere Studio Application Developer, Nexus

EDUCATIONAL QUALIFICATIONS:
Master's Degree, Computer Science (Embedded Systems), JNTU Hyderabad University (Main Campus), Telangana, India, full time, 2 years, graduated Nov 2012.
Bachelor's Degree, Electronics and Communication Engineering, JNTU Hyderabad University, Telangana, India, full time, 4 years, graduated Apr 2009.

PROFESSIONAL EXPERIENCE:

Client: Honeywell, Singapore, (PA)    March 2023 - Till Date
Role: Senior Java Full Stack Engineer
Responsibilities:
Worked on breaking up the S3 monolith by separating the storage system's encryption process into a microservice handling more than 98% of the traffic coming to S3.
Designed the requirements and implementation strategy using PlantUML and Gliffy and presented it for review with senior and principal engineers.
Designed and implemented microservice APIs for mobile/web front-end and back-end edge points. Good knowledge of Microsoft Azure Cloud.
Designed multiple interfaces and adapters to integrate various APIs such as GET, PUT, COPY, LIST, and DELETE with the new encryption module.
Developed robust RESTful web services in TypeScript, Node.JS, and Angular in an agile environment, with continuous integration using GitHub Actions.
Developed publisher and consumer services to pull messages from Kafka topics.
Used Visual Studio Code as the IDE for development, with PostgreSQL as the database.
Wrote algorithms to serialize and deserialize the output of the encryption module refactored out of S3.
Wrote checksum logic to increase the durability guarantees of operations for all APIs, using thread-safe logic without impacting operation performance.
Extensively used the Spring JDBC template and the Sequelize ORM for performing database transactions.
Refactored various implementations of encryption using the factory design pattern and used them to process object bytes coming into S3.
Developed and maintained Java-based applications using Spring Boot, delivering efficient and scalable solutions.
Designed and implemented RESTful web services, ensuring high-performance communication between system components.
Leveraged cloud technologies, such as AWS and Microsoft Azure, to build and deploy cloud-native applications.
Collaborated with front-end developers, utilizing JavaScript, to create interactive and user-friendly web interfaces.
Utilized ag-Grid for efficient data rendering and manipulation in web applications.
Implemented unit testing using the Jest framework to ensure code quality and reliability.
Led web application packaging and deployment efforts using Yeoman, Grunt, and Gulp, optimizing build processes.
Managed build tools and dependencies with Ant/Maven, Nexus, Git, and Jenkins, ensuring smooth development workflows.
Applied expertise in handling large volumes of real-time data with big data technologies like KDB/Q.
Worked in an Agile environment, actively participating in Scrum ceremonies and contributing to agile methodologies.
Contributed to the development of trading systems and brokerage technology, enhancing system performance and reliability.
Demonstrated strong analytical, communication, and organizational skills while managing multiple tasks simultaneously.
Leveraged event-driven and service-oriented processing concepts to build robust and scalable applications.
Played a key role in the adoption and implementation of cloud technologies, earning industry certifications for AWS and Azure.
Experience in RDBMSs such as Oracle and SQL Server, writing stored procedures, triggers, and cursors, and optimizing queries using SQL.
Utilized the decorator design pattern to wrap encryption module responses from the encryption microservice.
Wrote database interaction code and used the JDBC API to connect to MySQL.
Wrote a blob processing module to process object blobs, including a new iterator calculating range and part for the GET API.
Involved in writing JSP custom tags and JSF components. Used the JSTL core tag library and the Logic, Nested, Bean, and HTML tag libraries to create standard dynamic web pages.
Integrated the two refactored modules above, enabling them to take 2% of customer traffic. Used the Command and Strategy patterns to implement that integration.
Used the Log4j logger to log errors and info across the application with proper exception handling.
Wrote new unit test cases using Mockito and PowerMockito to improve class coverage to 97%.
Wrote new behavior-based end-to-end test cases in Spock and Cucumber to improve testing coverage from 69% to 93%.
Successfully utilized Agile methodology to deliver features, resolving dependencies and using CI/CD pipelines.
Set up infrastructure to simulate customer traffic and workflows to performance-test the new implementation.
Automated many frequent log dives by writing shell scripts based on log patterns.
Worked collaboratively with other developers in a team-based environment, utilizing version control tools to manage changes to the code base.
Environment: Java 1.8, TypeScript, Node.js, JDK, Log4j, J2EE, JDBC, Lombok, Spock, Cucumber, Mockito, PowerMockito, functional programming, design patterns, Ant build, IntelliJ, Git, SOA, JMS, SOAP, XML, Eclipse, RESTful web services, WebSphere, microservice architecture, Cassandra, Ansible, awk and shell scripts.

Client: DBS Bank, Singapore    Feb 2021 - Feb 2023
Role: Candidate's Name
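The decorator approach described for wrapping encryption responses can be sketched as follows. This is a hedged illustration only: the interface name, the toy XOR cipher, and the base64-encoding decorator are hypothetical stand-ins for the actual encryption module.

```java
import java.util.Base64;

// Decorator sketch: add behavior (base64 encoding of the ciphertext)
// around an existing encryption implementation without modifying it.
interface EncryptionService {
    byte[] encrypt(byte[] plaintext);
}

// Toy stand-in cipher; a real service would call the encryption microservice.
class XorEncryptionService implements EncryptionService {
    public byte[] encrypt(byte[] plaintext) {
        byte[] out = new byte[plaintext.length];
        for (int i = 0; i < plaintext.length; i++) {
            out[i] = (byte) (plaintext[i] ^ 0x5A);
        }
        return out;
    }
}

// The decorator implements the same interface and delegates to the wrapped
// service, adding its own responsibility on the way out.
class Base64EncodingDecorator implements EncryptionService {
    private final EncryptionService inner;
    Base64EncodingDecorator(EncryptionService inner) { this.inner = inner; }
    public byte[] encrypt(byte[] plaintext) {
        return Base64.getEncoder().encode(inner.encrypt(plaintext));
    }
}
```

Because both classes share one interface, decorators can be stacked (checksumming, logging, metrics) without the callers changing.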
Responsibilities:
Worked on a Spring Boot application using Java 8 that aggregates data and metrics for 16 types of Echo devices.
Developed REST APIs with Spring-based transactions that use an Oracle database to fetch device info and process those requests on EMR clusters.
Implemented the MVVM architecture using the Redux architecture with React JS.
Developed a data ingestion application to bring data from source systems into HBase using Spark Streaming and Kafka.
Used multithreading concepts and the Executor framework to manage thread pools running 200-230 Amazon Athena queries to collect performance percentiles.
Created JUnit test cases using Mockito to provide 98% test coverage; used Sonar to identify bugs and checkstyle issues.
Configured a Windows Failover Cluster by creating a quorum for file sharing in Azure Cloud.
Performed real-time event processing of data from multiple servers in the organization using Apache Storm integrated with Apache Kafka.
Involved in the design and implementation of JSP, Servlets, and web development.
Involved in parsing internal XML-format documents to retrieve information and pass it to Struts Action classes for further processing.
Collaborated with cross-functional teams to develop and maintain Java-based applications.
Designed and implemented RESTful APIs for seamless communication between microservices.
Assisted in the migration of on-premises applications to the cloud, leveraging AWS services.
Supported front-end development with JavaScript expertise and integrated third-party libraries.
Conducted unit tests using Jest and participated in code reviews to ensure code quality.
Worked with build tools like Maven and Jenkins for automated deployment processes.
Contributed to the analysis and processing of real-time data streams in financial applications.
Participated in Agile development practices and Scrum ceremonies.
Gained exposure to trading systems and brokerage technology.
Developed various screens for the front end using React JS and used various predefined components from NPM and Redux.
Developed single-page applications using the React Redux architecture, ES6, webpack, and Grunt.
Worked closely on the application using React JS and the Node.js tooling NPM and gulp to generate the desired views, and Flux to route the URLs properly.
Expertise in Microsoft Azure Cloud Services (PaaS & IaaS), Application Insights, Document DB, Internet of Things (IoT), Azure Monitoring, Key Vault, Visual Studio Online (VSO), and SQL Azure.
Worked on some legacy web services built on Apache CXF running on Apache Tomcat.
Used JAXB to process JSON-based responses from RESTful web services external to the application to collect driver metrics.
Used the Spring JDBC template and Hibernate for performing database transactions.
Worked on building a front end using React JS, HTML, and CSS that helps parse device logs and generates insights for memory and CPU logs uploaded through the web portal.
Used AWS Lambda, AWS CloudWatch, and AWS SQS to create an email notification system that sends an email with log analysis details whenever a certain CloudWatch event alarms.
Used Node.JS to develop a comments repository that automatically triggers test runs, interacting with MongoDB.
Used the Ant tool to build the application and WebSphere Application Server (WAS) 6.0 to deploy it.
Maintained schedules for data warehouse storage. Read and interpreted UNIX logs.
Experience with performance tuning for Oracle RDBMS using Explain Plan and hints.
Experience dealing directly with equity products and developing in Java or Scala.
Created AWS IAM roles and policies to grant permissions for AWS resources.
Used AWS S3 to store compressed log files and generated insight PDFs for future reference. Generated pre-signed URLs to access them.
Created SQL queries, indexes, triggers, and sequences for AWS Athena to fetch results.
Created Apache Spark jobs that run on a cluster of Linux machines to stream performance logs into AWS Athena.
Used CI/CD pipelines to build and release features and to update libraries to minimize security risks.
Maintained and optimized the performance of web applications, identifying and addressing bottlenecks as necessary.
Environment: Java, Spring Boot, J2EE (JEE), Apache Spark, AWS Athena, AWS Lambda, web services, AWS SQS, React, JavaScript, Git, REST, JAXB, AWS S3, JMS, VS Code, AWS IAM, WebSphere, Eclipse, AWS CloudWatch, Sonar, Mockito, PowerMockito, Node.JS, Linux, Cassandra, Ansible.

Client: DXC Technology Singapore PTE Ltd, Singapore    Jan 2019 - Jan 2021
Role: Candidate's Name
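The Executor-framework fan-out described above, running a few hundred queries over a managed thread pool, can be sketched generically. This is an assumption-laden illustration: the class name and pool size are made up, and a trivial computation stands in for the actual Athena query call.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Fan out many independent "queries" over a fixed-size thread pool and
// collect the results in submission order.
class QueryFanOut {
    static List<Integer> runAll(List<Integer> inputs, int poolSize) {
        ExecutorService pool = Executors.newFixedThreadPool(poolSize);
        try {
            List<Future<Integer>> futures = new ArrayList<>();
            for (int in : inputs) {
                futures.add(pool.submit(() -> in * in)); // stand-in for a query
            }
            List<Integer> results = new ArrayList<>();
            for (Future<Integer> f : futures) {
                try {
                    results.add(f.get()); // blocks until each task completes
                } catch (InterruptedException | ExecutionException e) {
                    throw new RuntimeException(e); // surface task failure
                }
            }
            return results;
        } finally {
            pool.shutdown(); // no new tasks; running tasks finish
        }
    }
}
```

A fixed pool bounds concurrency against the backing service; collecting the `Future`s first and joining afterwards lets all submissions proceed in parallel.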
Responsibilities:
Developed a web application using Angular, HTML, JavaScript, and CSS to automate the filling of Excel sheets for Micron engineers, saving 183 man-hours per week. Designed and deployed the whole application end to end.
Experience designing and populating dimensional models (star and snowflake schemas) for a data warehouse, data marts, and reporting data sources.
Created a data access layer and used MySQL persistence to store changes to Excel sheets on click of the save button.
Used Node.JS to process comments left on Excel sheets, with MongoDB storage. Also used the Express framework to expose REST APIs for interacting with Excel sheets through comments on cells.
Wrote SQL queries, indexes, and stored procedures to maintain CRUD operations on the databases.
Worked on a Continuous Integration (CI)/Continuous Delivery (CD) pipeline for Azure Cloud Services using Chef.
Developed the front end in Angular and used the Handsontable library to perform Excel-sheet-related functions.
Developed REST APIs using Scala and the Play framework to retrieve processed data from a Cassandra database.
Thin-client rendering: HTML, tag libs, JavaScript, XML, JSP, and Servlets.
Built the Java Spring Boot back end to support database interaction using ORM principles and exposed various REST APIs for the front end to interact with (CRUD operations).
Developed another web application to allow Micron employees to submit tickets to the team. This application uses JIRA APIs to log the description and other details of each ticket.
Created a single sign-on (SSO) capability using ADFS integration and JWT tokens with OAuth 2.0 to allow all Micron employees to submit tickets to the team.
Used a paginated table to display the status, assignee, and completion ETA of submitted tickets, improving response time by 25%.
Deployed the application on OpenShift pods using Docker images of the application. Explored Kubernetes to manage those containers.
As a big data developer, implemented solutions for ingesting data from various sources and processing the data at rest.
Used Log4j for logging user actions and exceptions to find causes of system malfunction and keep user action logs.
Involved in supporting multi-tier Java and J2EE-based applications; responsible for writing business logic using core Java and SQL queries for the backend RDBMS.
Implemented solutions using Hadoop, Apache Spark, Spark Streaming, Spark SQL, HBase, and Scala.
Used IntelliJ and VS Code as the development environment.
Experience using Microsoft REST APIs for Azure Cloud and Office 365.
Wrote JUnit test cases to test the functionality of each method in the DAO layer.
Used Git for version control.
Environment: Java, HTML, JavaScript, CSS, ADFS, OpenShift, Docker, Kubernetes, MySQL, SQL, Spring, Spring Boot, Log4j, Maven, Git, Angular, Handsontable, Eclipse, Node.JS, Linux, Cassandra, Ansible.

Client: Starhub, Singapore    Aug 2015 - Dec 2018
Role: Software Developer
Responsibilities:
Developed a data ingestion application to load real-time data coming from locomotives into an Oracle 10g database using the Apache Camel framework with Spring Boot integration.
Developed a web application reporting locomotive status using JSF, Hibernate, and J2EE technologies to access an Oracle RDBMS 13g RAC database in a multi-developer, configuration-controlled environment.
Development of a complex web application (Confidential) utilizing a wide range of open-source and Oracle technologies: Oracle RDBMS, Spring MVC, J2EE/JEE, Servlets, JSP, JSF, JavaScript, Hibernate, WebLogic, WebCenter, and ADF.
Developed three microservices using Spring Boot to categorize three types of request processing based on data source and request structure. All microservices exposed REST endpoints.
The middleware interaction used JMS/IBM WebSphere MQ for transferring messages between different components via the JMS/Mail API framework.
The data rate was 4 million messages per day, serving 50-52 requests per second, using IBM MQ and an event-driven architecture.
Communicated with external applications via JMS messages using IBM WebSphere MQ.
Utilized JBoss IDEs for application server environments that included JBoss AS 5.0 and later and JBoss EAP.
Used Service-Oriented Architecture (SOA) principles and OSGi to expose services for decoding, transforming, and storing data according to data source, type, and use case.
Used the Scala collection framework to store and process complex consumer information.
Used Enterprise Integration Patterns such as multicast, content-based router, dynamic router, recipient list, and splitter to transform messages.
Extensive experience as production support personnel on multiple data warehouse projects, working extensively with an offshore team.
Project setup using MyEclipse and IntelliJ, and servers like Tomcat and JBoss.
Deployed all OSGi web services using the Fuse platform 6.1 (Apache Camel, Blueprint, IBM MQ, Karaf/OSGi container).
Experience utilizing and implementing the Confluent Schema Registry with Kafka.
Implemented Spring Boot microservices to process messages into the Kafka cluster setup.
Used caching mechanisms like Hazelcast to cache intermediate state, store results, and reduce reads on the database. Explored the differences between Hazelcast and Redis before designing the application.
Developed a Spring Boot application to collect a stream of real-time panel images from locomotives and display them on a web portal with a refresh rate of 3 seconds. The system supported around 2,000 locomotives at a time.
Used Eclipse as the IDE for development, with Apache Tomcat as the server.
Used XML to create the DB schema-mapping file for Hibernate.
Developed JSPs, including AJAX calls to different APIs that process messages using XML.
Reviewed code and documented designs to drive ETL processes using Red Hat technologies for real-time data ingestion.
Used Log4j for logging, JUnit 4 for unit tests, and JDK 8 for development.
Used JDBC to retrieve data from the Oracle database.
Wrote multiple data ingestion queries and used indexes and stored procedures to facilitate the data ingestion process as configuration.
Published assortments and products to an Apache Kafka topic using custom serializers. Exposed the endpoint for Swagger and developed APIs for documenting RESTful web services.
Maintained SQL queries in configuration files to perform database transactions during the data ingestion process.
Utilized big data technologies such as Hadoop, MapReduce frameworks, MongoDB, Hive, Oozie, Flume, Sqoop, and Talend.
Used Hawtio dashboards to monitor logs and the health of Karaf containers.
Used Sonar and Jenkins CI/CD to drive agility and quality in the development process.
Used Splunk dashboards to write queries and detect anomalies from logs, driving operational excellence.
Designed and implemented business logic for tiered applications, incorporating JSF (1.2 - 3.0), EJB (3.0 - 3.2), and JSP (2.0 - 2.3).
Environment: Candidate's Name Studio.

Client: BSNL, India    Jan 2013 - July 2015
Role: Full Stack Candidate's Name
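The cache-aside idea described for Hazelcast above, consult the cache first and only hit the database on a miss, can be illustrated with a plain `ConcurrentHashMap` standing in for a distributed map. The class name, loader function, and miss counter are hypothetical; a Hazelcast `IMap` would play the same role across a cluster.

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Function;

// Cache-aside sketch: computeIfAbsent runs the loader only on a miss,
// so repeated reads of the same key reach the backing store once.
class ReadThroughCache<K, V> {
    private final ConcurrentMap<K, V> map = new ConcurrentHashMap<>();
    final AtomicInteger loads = new AtomicInteger(); // counts real backing-store reads

    V get(K key, Function<K, V> loader) {
        return map.computeIfAbsent(key, k -> {
            loads.incrementAndGet(); // cache miss: fetch from the database
            return loader.apply(k);
        });
    }
}
```

`computeIfAbsent` also gives per-key atomicity, so two concurrent readers of the same missing key trigger only one load.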
Responsibilities:
Wrote complex SQL queries, stored procedures, and functions in PL/SQL for manipulating data.
Worked with Java 8 lambda expressions, functional interfaces, the Stream API, the Time API, and improvements to Collections, concurrency, and IO.
Used Subversion for configuration management and Jira for task management and bug tracking.
Used SoapUI to test sending and receiving XML data, and worked with JMS queues for sending messages in point-to-point communication.
Used multithreading to write the collector, parser, and distributor processes, which received real-time data from the Zacks API in JSON format; multithreading improved performance considerably, and the concurrent collections package made it thread-safe.
Used a MessageBodyWriter for converting Java types to streams.
Used Maven for compiling and building the code.
Used JavaScript, HTML, and JSP pages to develop the front-end UI and wrote application-level code to perform client-side validation.
Used HTML5 wireframes with CSS provided by the design team; JavaScript is used to make them dynamic.
Used AJAX and JavaScript for client-side validations.
Provided 24x7 support to the application in the pilot and production phases. Support included being on conference calls, identifying and fixing bugs, and investigating reasons for specific application behavior.
Performed unit testing using the JUnit framework and tested DAOs and business services.
Migrated from Angular 1.0 to Angular 2.0 to use upgraded features such as Angular components and the Angular router, as per the strategy requirement.
Developed Servlets for server-side transactions and made use of AJAX for server-side processing without refreshing the JSP page.
Experience generating reports and dashboards in Dynatrace and Splunk.
Experience implementing MongoDB CRUD (Create, Read, Update, Delete) operations using the Mongoose library in Node.js along with AngularJS.
Extensive professional experience developing and deploying enterprise applications on web/application servers such as JBoss EAP 5.1, Tomcat 5.x/4.x, IBM WebSphere 6.x/7.x, and WebLogic under Windows and UNIX.
Extensively used Jenkins as the continuous integration tool to deploy Spring Boot microservices to Pivotal Cloud Foundry (PCF) using a buildpack.
Implemented AngularJS controllers to maintain each view's data. Implemented Angular service calls using an Angular factory with dependency injection to prevent the scope conflicts commonly found in JavaScript.
Implemented a lightweight WADL (Web Application Description Language) for better understanding of REST-based web services and their configuration.
Implemented multithreaded synchronization processes, with JMS queues for consuming asynchronous requests.
Involved in bug fixing during system testing, joint system testing, and user acceptance testing.
Deployed the applications and bound third-party services like AppDynamics on Pivotal Cloud Foundry (PCF).
Developed the application using Spring JPA and Angular 2.0 on the presentation layer; the business layer is built using Spring and the persistence layer uses Hibernate.
Developed and implemented RESTful web APIs, exposing endpoints using HTTP methods like GET, PUT, POST, and DELETE.
Designed new classes and functionality using various jQuery components for a CRM application for customer service.
Deployed the application on Pivotal Cloud Foundry (PCF), which reduces development overhead by providing a ready-to-use platform.
Created the web user interface (UI) using HTML5, DHTML, table-less XHTML, CSS3, and JavaScript, following W3C web standards and ensuring browser compatibility.
Configured Bamboo to handle application deployment on the cloud (PCF) and to integrate with GitHub version control.
Used the Java Security API to add security and authentication to the application.
Environment: AngularJS, HTML5, CSS3, AJAX, Bootstrap, JSON, XML, ActiveMQ, JMS, Hibernate, DB2, SOAP (Axis2), RESTful services, JAX-RS, SOA, Eclipse Java EE IDE Neon.3, Git, Log4j, Maven, TestNG, WADL, PCF.

Client: CDAC, India    May 2010 - Dec 2012
Role: Candidate's Name
Responsibilities:
Developed a Java application to test the integration of the Informatica tool with Hadoop clusters running Spark jobs, covering over 120 scenarios.
Prepared the Azure cloud infrastructure with Azure Resource Manager templates.
Used Apache Kafka to collect events and process them to start and stop nodes and issue other environment-testing commands.
Wrote Servlets and deployed them on IBM WebSphere Application Server.
Published assortments and products to an Apache Kafka topic using custom serializers. Exposed the endpoint for Swagger and developed APIs for documenting RESTful web services. Experience implementing APIs in a Java multithreaded environment.
Set up Cloudera agents on Linux-based clusters using shell scripts.
Environment: Java JDK 7, Apache Kafka, YARN APIs, Cloudera management console.
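The multithreaded collector/parser hand-off described in the BSNL role can be sketched with a blocking queue decoupling the two stages. Everything here is illustrative: the class name, the poison-pill sentinel, and the trivial "parse" step stand in for the real JSON pipeline.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Collector/parser pipeline sketch: the main thread collects raw records
// and a worker thread parses them, connected by a thread-safe queue.
class CollectorParserPipeline {
    static final String POISON = "__EOF__"; // sentinel marking end of stream

    static List<String> run(List<String> rawRecords) {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        List<String> parsed = new ArrayList<>();

        Thread parser = new Thread(() -> {
            try {
                String rec;
                while (!(rec = queue.take()).equals(POISON)) {
                    parsed.add(rec.trim().toUpperCase()); // stand-in "parse"
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        parser.start();

        try {
            for (String rec : rawRecords) queue.put(rec); // collector side
            queue.put(POISON);                            // signal end of stream
            parser.join(); // join gives visibility of the parser's writes
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return parsed;
    }
}
```

The blocking queue provides the back-pressure and thread-safety the resume attributes to the concurrent collections package; real code would typically run several parser threads off the same queue.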
