Candidate Information
Title: Software Development Core Java
Target Location: US-TX-Austin
Candidate's Name
ACCOMPLISHMENTS:
- Around 11 years of professional IT experience across the full Software Development Life Cycle (SDLC) using Java/J2EE technologies: requirements analysis, design, development, testing, and deployment of software applications, and maintenance of client/server applications.
- Extensive programming and development experience with Core Java and J2EE technologies, including JSP, Servlets, Struts (MVC2 architecture), Hibernate, Spring (IoC, HibernateTemplate), Spring Boot, Spring Security, Spring Cloud Foundry, XML, JDBC, LDAP, Log4j, AngularJS, JavaScript, HTML5, CSS3, and the BankFusion tool.
- Project implementation skills using Core Java, J2EE technologies (JDBC, JNDI, EJB, RMI, Servlets, JSPs, JMS), and XML.
- Experienced in designing interactive web applications using AJAX, DWR, JSON, and DOJO.
- Experienced in implementing application logic using the Spring MVC architecture.
- Experience with Teradata load scripts such as BTEQ, MLOAD, and FASTLOAD, and with Teradata SQL optimization.
- Skilled in optimizing websites for mobile phones using CSS media queries.
- Expertise in developing applications with the Spring Framework, using Spring modules such as IoC/dependency injection, AOP, and MVC, and configuring the application context for the Spring bean factory.
- Implemented Service-Oriented Architecture (SOA) using web services (SOAP, REST, WSDL, UDDI).
- Extensive experience implementing J2EE design patterns such as Session Facade, Singleton, MVC, Business Delegate, Data Access Object, Service Locator, and Transfer Object.
- Experience using NoSQL databases such as MongoDB, Cassandra, and Redis.
- Experience with design tools such as Rational Rose and MS Visio for Object-Oriented Analysis (OOA) and Object-Oriented Design (OOD) using UML.
- Familiar with various SDLC methodologies such as RUP and Agile.
- Experience in unit testing, integration testing, performance testing, writing UTDs and UTRs, and using the JUnit testing tool.
- Wrote Maven build scripts and configured and used frameworks such as Log4j and JUnit.
- Experience working extensively in both UNIX-based and Windows environments.
- Skilled in documentation and user presentation.
- Highly motivated team player with the ability to work independently and adapt quickly to new and emerging technologies.

Education:
Master's in Computer Science, University of Central Missouri

SKILLS & ABILITIES:
Java/J2EE Technologies: Java 17, Spring JDBC, Groovy, Bash scripting, DOS commands, Eclipse, Apache Tomcat, EJB 3.1, Spring MVC, JSP, Tiles, Hibernate 3.0, JNDI, JMS, MQ Series, JDBC, RMI, JXL, JavaMail, JavaBeans, SOAP UI, WSDL, RESTful, Maven, Gradle, Kafka, RabbitMQ, Node.js, Elasticsearch, SonarQube, SonarLint, Docker, Prometheus, Python 3.10, Spring Boot 2.0+, Spring 5.0+, Grafana
RDBMS: PostgreSQL, H2, DB2, MS SQL Server, Oracle, MS Access, MySQL, Cassandra
SCM Tools: Git, Subversion, CVS, Bitbucket, CodeBuild, CodeCommit, CodePipeline
AWS Services: AWS Lambda, API Gateway, Step Functions, SQS, AWS Glue, SNS, SES, CloudWatch, CFN templates, Amazon Connect, Greengrass, IoT; subject matter expert in Lambda and IoT services
IaC: Terraform, CloudFormation, Ansible

PROFESSIONAL EXPERIENCE:

Lead / Sr. Cloud Engineer, Cigna Health  Dec 23 - Present
In my role as a Lead/Senior Cloud Engineer at Cigna Health, I have been instrumental in driving process innovation and cloud migration initiatives that significantly enhance engineering team productivity and operational efficiency. My responsibilities encompass:
- Process Innovation: Identifying and executing process enhancements to boost the efficiency of the engineering team and streamline operations.
- Cloud Migration and Infrastructure/Application Management: Successfully migrated workflows to the cloud, leveraging AWS Glue, PySpark DataFrames, and Terraform.
  Demonstrated proficiency in managing infrastructure within Kubernetes-based environments, contributing to improved workflow efficiency.
- Integration with AWS Services: Integrated Lex bots with other AWS services such as Lambda, DynamoDB, and S3 to handle backend logic, store session data, and manage static assets.
- AWS System Design Expertise: Leveraged extensive experience in designing and implementing AWS-based systems, strictly following the AWS Well-Architected Framework to create secure, high-performing, resilient, and efficient infrastructures. Key AWS services used include AWS Lambda, Amazon ECS, and AWS Step Functions for serverless computing, container orchestration, and asynchronous API invocation, ensuring scalability and reliability in cloud settings.
- Technology Evaluation: Conducted thorough evaluations of current and emerging technologies to assess their cost-effectiveness and compatibility with existing systems.
Key Technologies Used: AWS serverless services, Python, Java 17 (asynchronous and reactive programming), Draw.io for architecture design, AWS Glue, OpenShift (Red Hat)

Amazon Web Services  Nov 19 - Nov 23
Cloud Engineer II, DMS SME for Lambda and IoT
This is an AWS customer-facing role that involves extensively adopting AWS cloud technologies and collaborating with cross-functional teams to get products developed and delivered on time, ensuring customer deliverables stay on schedule.
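A Lambda-focused role like this typically leans on execution-environment reuse: expensive resources such as database connections are initialized once, outside the handler, so warm invocations reuse them instead of reconnecting. A minimal sketch of the pattern, where `connect_to_db` is a hypothetical stand-in for real connection setup:

```python
# Hypothetical stand-in for expensive connection setup; a counter
# tracks how many times it actually runs.
setup_calls = 0

def connect_to_db():
    global setup_calls
    setup_calls += 1
    return {"connected": True}

# Initialized once per execution environment (at cold start), outside
# the handler, so warm invocations reuse the same connection.
connection = connect_to_db()

def handler(event, context):
    # Reuses the module-level connection instead of reconnecting.
    return {"statusCode": 200, "connected": connection["connected"]}

# Simulate several warm invocations within the same environment:
# the connection is established only once.
responses = [handler({}, None) for _ in range(3)]
```

The same idea applies to any client object (SDK clients, HTTP sessions) that is costly to construct on every invocation.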
Customer problem statements are addressed.
Responsibilities:
- Managed and maintained the company's AWS infrastructure, including EC2 instances, VPCs, Elastic Load Balancing (ELB), Security Groups (SGs), and CloudFormation templates.
- Designed and managed event source mappings on Lambda functions (SQS, Kafka/MSK, IoT, Kinesis, etc.) for both synchronous and asynchronous invocation types.
- Conducted extensive testing and optimization of Lex bots to enhance accuracy, reduce latency, and improve overall user satisfaction.
- Designed and implemented solutions for automating repetitive tasks using tools such as CloudFormation, deploying them without granting developers elevated access to accounts and environments.
- Designed and developed conversational interfaces using Amazon Lex, integrating natural language understanding (NLU) and automatic speech recognition (ASR) capabilities.
- Provided technical expertise in cloud architecture design principles and best practices to ensure optimal performance and reliability of systems.
- Implemented Lambda best practices, such as declaring the database connection outside the handler so that a new connection is not created every time a Lambda execution environment is spun up.
- Architected, designed, and deployed ETL pipelines using AWS Glue for integrating multiple data sources, resulting in a 30% reduction in processing times.
- Implemented data partitioning and schema evolution strategies to optimize query performance in downstream analytics tools.
- Integrated AWS Glue with other AWS services such as Lambda, S3, and RDS for seamless data flow across the data lake architecture.
- Educated and guided enterprise and business-level customers on using Lambda and IoT more efficiently to lower their operational bill.
- Trained North American engineers on Lambda/IoT best practices; troubleshot event source mapping issues to decrease iterator age on functions, not by increasing throughput limits on the source but by increasing parallelization on the function.
- Upgraded hundreds of function runtimes at once via CloudFormation templates.
- Implemented API mTLS for security using a certificate and private key.
- Leveraged the IoT credentials provider with X.509 certificates so that access and secret keys need not be stored on the device; this drastically improves security, because if a key pair is exposed the entire fleet is not compromised.
- Implemented a custom authorizer to avoid depending on external identity provider services such as Cognito; suitable for small-traffic and device-management applications only, because the Lambda authorizer is subject to a limit of about 3K concurrent executions per minute.
- Deep knowledge of the pub/sub MQTT service, which can update the status of a device deployed in a remote location with limited connectivity: OTA updates and persistent sessions with QoS 1.
Technologies used: Serverless (Lambda, API Gateway, Step Functions, SNS, SQS, SES, CloudWatch, IoT and Greengrass), boto3, AWS CLI v2, Amazon Linux 2, Ubuntu and Debian, Python 3.8+, CFN, CloudTrail, AWS Glue, CloudFormation

T-Mobile, Bothell, Washington  July 19 - Nov 19
Sr. Java Developer, Distributed Systems
DEEP.io (Digital Enterprise Event Processing) is a messaging framework that enables microservices to communicate data through events. It is offered as software as a service. It leverages RabbitMQ messaging infrastructure for event delivery; RabbitMQ is open-source software that implements AMQP (the Advanced Message Queuing Protocol). Kafka is a distributed streaming platform, leveraged for event audit and monitoring. A message queue uses an asynchronous protocol for communication: the publisher (producer/sender) and the consumer (receiver) of a message need not interact with the message queue in real time. Messages placed on the queue are stored until the consumer retrieves them.
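The store-until-retrieved behavior just described can be sketched with Python's standard-library queue as a stand-in for a broker such as RabbitMQ (a minimal illustration of the decoupling, not the DEEP.io implementation):

```python
from queue import Queue

# Stand-in for a message broker: publisher and consumer never interact
# directly, and messages persist on the queue until retrieved.
broker = Queue()

def publish(event: dict) -> None:
    # The publisher returns immediately; delivery is asynchronous.
    broker.put(event)

def consume() -> dict:
    # The consumer retrieves messages whenever it is ready.
    return broker.get()

publish({"type": "order.created", "id": 1})
publish({"type": "order.shipped", "id": 1})

# Messages come off the queue in the order they were published.
first = consume()
second = consume()
```

A real broker adds what an in-process queue cannot: durable storage across failures, acknowledgements, and redelivery when a consumer crashes mid-message.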
Typically, messaging systems provide checks and balances to ensure that messages are not lost in case of system failures, along with retry mechanisms and the ability to monitor event and consumer activity.
Responsibilities:
- Utilized Kafka and Cassandra as the main data stores to create data pipelines that gather analytics from devices for various business needs.
- Created an application to connect Prometheus with application instances instead of scraping metrics from actuators, enabling monitoring of the application at the individual-instance level over the RSocket protocol.
- Acted as DevOps backup and supported production deployments.
- Increased test coverage by 20% by adding new test cases and refactoring.
- Created REST APIs using Spring Boot, Spring MVC, Spring Data, and Spring Security.
- Utilized Docker for containerized deployment.
- Deployed microservices applications to Pivotal Cloud Foundry and services to Kubernetes.
Environment: Java 8, Spring Integration, PL/SQL, Oracle 12c, SQL Developer, RabbitMQ, Spring Boot, Spring MVC, Akka, Kafka, Cassandra, Linux, Oracle, Splunk, GitHub, Jenkins, RESTful web services, JSON, JUnit and Mockito, Prometheus, Grafana

Discover Financial Services, Houston  Feb 18 - July 19
Java / Spark Developer
EPP: Enterprise Payments Platform
The Enterprise Payments Platform initiative began under the name Common Payments Platform (CPP), which consisted of collective development teams across the Discover Network, Diners Club International, and PULSE networks. The administration of development was separated between PULSE and the Discover Global Network (DGN).
Product offerings such as ProtectBuy and Digital Payments continue to expand on the Discover and Diners Club International networks.
Responsibilities:
- Developed enterprise-level reporting applications in a monolithic pattern with Spring Integration.
- Used Spring to create Spark SQL at run time, using the Spring IoC container.
- Used Kafka for log aggregation: gathering physical log files off servers and placing them in a central location such as HDFS for processing.
- Maintained common libraries for all dependencies used within the EPP.
- Acted as module owner for a specific repository.
- Hands-on experience with Spark SQL queries and DataFrames: importing data from data sources, performing transformations and read/write operations, and saving results to an output directory in HDFS.
- Collected data using Spark Streaming from various (Linux) sources in near real time, performing the necessary transformations and aggregations to build the data model and persisting the data in HDFS.
- Used open-source technologies throughout to reduce organizational spending on licenses.
- Matched report formatting with IBM sources.
- Used the Java API to perform various aggregation and transformation logic.
- Used the WebHDFS REST API to make HTTP GET, PUT, POST, and DELETE requests from the web server to perform analytics on the data lake.
- Used Git for source code management and to resolve merge conflicts.
- Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs.
- Responsible for writing REST-based web services for mapping institution names to processor and intercept IDs.
- Wrote AWK scripts to mask test data in lower environments, protecting details of actual customers.
- Used Spark for interactive queries, processing of streaming data, and integration with a popular NoSQL database for large data volumes.
- Wrote shell scripts to invoke PL/SQL for incentive calculation at the institution level.
- Wrote dispute-related logic covering card adjustment, card adjustment settlement, terminal adjustment, terminal adjustment settlement, institution summary, and processor summary.
- Extensive experience with the Concourse user interface to manage data-loading tasks in various environments.
- Extensive experience demoing a sprint's worth of work at the end of every sprint (System Demo) and at the end of every PI.
Environment: Java 8, Spark RDD, Spring Integration, PL/SQL, Oracle 12c, SQL Developer, Spring, Kerberos, Unix, shell, web services (REST), Gradle, Hadoop, Hive, HDFS, HPC, WebHDFS, WebHCat, Spark, Spark SQL

General Electric (GE) Oil & Gas, New Orleans, Louisiana  Dec 16 - Feb 18
Role: Predix Back-End Developer
The Plant Operation Advisor (POA) application, built for oil and gas plants, helps process engineers by automatically notifying them of plant vulnerabilities and serves as a one-stop portal for surveillance and monitoring of the plants' underlying machinery. POA supports operations, reduces on-site risk, and identifies unplanned downtime for an asset by providing early warnings and an advisory system for facility health vulnerabilities and operating-envelope excursions, complete health reports covering early warnings, excursions, and system trends, and a My Watchlist for each classification level (L1, L2, L3, L4) and Owner of Anomaly (low-severity threat detection), plus analytics and cases modules.
Responsibilities:
- Participated in requirement analysis of the applications along with business owners.
- Responsible for writing microservices using Spring Boot and Spring Cloud Foundry.
- Responsible for API platform design on a highly available and scalable microservices architecture.
- Developed Spring Boot microservices in the Predix cloud environment.
- Developed secured, bearer-token-based service-consumed and service-provided Spring RESTful (JAX-RS) web services using JSON.
- Deployed and managed applications in Cloud Foundry and created PostgreSQL database instances.
- Used AWS S3 (Simple Storage Service) data storage for storing images of cases and machinery data, accessed via the AWS SDK for Java.
- Implemented security using Spring Security OAuth2.
- Good experience with cloud configuration, deployment, and application management.
- Analyzed log files using the Cloud Foundry console.
- Responsible for implementing data migration as a batch job.
- Responsible for time-series data ingestion and accessing asset model data at different levels of the platform.
- Responsible for creating custom, general-use modules and components that extend the elements and modules of core AngularJS and Polymer JS.
- Wrote JPA queries for the PostgreSQL and Apache Cassandra databases (time-series and asset databases).
- Used JPA to access data from the database.
- Wrote unit test cases and tested using JUnit and Mockito.
- Implemented the logging mechanism using Log4j.
- Used Git for source control and version management.
- Followed Agile methodology with Rally in the development of the project; involved in requirement gathering, analysis, and design.
- Developed several use case diagrams, class diagrams, and sequence diagrams using the StarUML tool.
- Responsible for finalizing the requirements and getting them implemented and delivered on schedule.
- Also involved in testing and deployment of the application during the integration and QA testing phases.
- Played a key role in estimating timelines for newly provided requirements in Scrum sprint meetings.
- Responsible for converting requirements into technical design documents.
- Responsible for development and execution in the demo system for every release.
Environment: Java 8, Spring Boot, Spring Security, Spring Cloud Foundry, web services (REST), AWS, JPA, PostgreSQL, Cassandra, AngularJS, Postman, JSON, Git, StarUML, Rally

T-Mobile, Bothell, Washington  Sep 15 - Dec 16
Software Developer
FTR Project:
- Participated in all phases of the SDLC; involved in Agile methodology.
- Used RAD to develop web components, including JSPs, a controller tier of action classes, and a business tier of EJBs.
- Developed the web application using the JSF framework.
- Used the JSF framework to develop user interfaces with JSF UI components, validators, events, and listeners.
- Used jQuery and the JSF validation framework for front-end validations.
- Used ClearCase version control to maintain project versions.
- Extensively used Log4j to log regular debug and exception statements.
- Worked closely with the test team to identify bugs in the application.
- Performed security scans and developed solutions where necessary after reviewing the results.
- Assisted with creation of test data for the Quality Assurance team.
REOS Project:
- Involved in various SDLC phases: requirement gathering, design, analysis, and code development.
- Developed the business layer using Spring, Hibernate, and DAOs.
- Worked in Agile methodology and participated in project discussions.
- Used Spring's JDBC and DAO layers to abstract database-related (CRUD) code away from the business logic.
- Implemented Spring beans using IoC, AOP, and transaction management features to handle transactions and business logic.
- Designed and developed RESTful services using Spring MVC for AJAX calls from the UI.
- Customized DataTables and Highcharts into AngularJS directives.
- Implemented Angular controllers to maintain each view's data.
- Used Log4j to capture logs, including runtime exceptions, and to debug the application.
- Developed queries using PL/SQL to retrieve data from the database.
- Involved in bug fixing, enhancements, and support.
- Used WebSphere Application Server as part of the production implementation.
Environment: JDK 1.6/1.7, J2EE, JSP, web services (SOAP), REST, JSF 2.0, MyFaces, OmniFaces, Spring, Hibernate 3.0, WebSphere Application Server 7.0, JavaScript, AngularJS, HTML, CSS, ANT, JUnit, jQuery, XML, Log4j, SOA, and PL/SQL

Aditya Birla Group, India  May 12 - Feb 14
Java Developer
- Gathered requirements from business analysts, analyzed them, and converted the requirements into technical designs.
- Played an active role in gathering system requirements from business analysts.
- Developed the application using Struts MVC for the web layer.
- Developed UI-layer logic of the application using JSP, JavaScript, HTML/DHTML, and CSS.
- Involved in developing complex Hibernate mapping files, mapping different kinds of associations between tables.
- Developed queries using PL/SQL to retrieve data from the database.
- Developed test plans and cases and executed them in test and stage environments.
- Developed the GUI and business logic using JSPs and Servlets.
- Involved in requirements gathering and converting requirements into specifications.
- Designed JSP pages using different tag libraries.
- Involved in bug fixing, enhancements, and support.
- Created stored procedures and triggers for the application.
- Developed unit test cases using JUnit for testing functionality and performed integration testing of the application.
- Implemented client-side validations using JavaScript functions.
- Supported UAT and production environments, resolving issues with other deployment and testing groups.
- Extensively involved in production support and defect fixing.
Environment: Java, Servlets 2.1, JSP 1.0, JDBC, XML, Hibernate, Oracle, HTML, JavaScript, GlassFish, NetBeans
