Candidate Information
Title: Software Engineer Solution Architect
Target Location: US-VA-Henrico
Candidate's Name
CONTACT ME
Phone: PHONE NUMBER AVAILABLE
Email: EMAIL AVAILABLE
Location: Richmond, VA
LinkedIn: LINKEDIN LINK AVAILABLE

SKILLS
Technical Leadership, AWS, Java/Python, Agile Methodology, Cloud Computing, DevOps, Big Data, Aurora MySQL

CERTIFICATIONS
- AWS Certified Solutions Architect - Associate and AWS Certified Developer - Associate
- Oracle Certified Professional (OCWCD and OCJP)

EDUCATION
Master of Computer Applications, Kumaon Engineering College, Dwarahat, India, 2005 - 2008
Bachelor of Science, Garhwal University, Pauri Garhwal, India, 2002 - 2005

PROFESSIONAL PROFILE
13+ years of experience across the complete Software Development Life Cycle (SDLC) and Agile methodology, including analysis, design, development, deployment, testing, debugging, implementation, performance tuning, and maintenance/support of large, complex enterprise applications. Proficient in Python and Java, with extensive knowledge of AWS services such as EC2, EMR, S3, RDS, SQS, SNS, Route 53, and ALB. Migrated several legacy services to serverless architecture. Passionate about delivering innovative solutions that meet the needs of customers and stakeholders while adhering to DevOps best practices.

WORK EXPERIENCE

LEAD SOFTWARE ENGINEER
Capital One Services LLC, Richmond, VA (Dec 2017 - present)
Project: Card Offer Management
Description: Management of credit card offers, including assigning and updating offer details for customers in the system.
Role/Responsibilities:
- Lead a team of software engineers developing and implementing various Capital One card applications.
- Contribute to Agile development processes, including daily stand-ups, sprint planning, and retrospective meetings.
- Create and manage AWS infrastructure through DevOps using EC2, ECS, ALB, Docker, EMR, Jenkins, etc.
- Develop and deploy serverless applications using AWS Lambda, API Gateway, DynamoDB, AWS Step Functions, and Amazon EventBridge (see the sketch after this section).
- Migrated several Java-based legacy applications to serverless architecture.
- Migrated from DB2 to an Aurora MySQL database on AWS.
- Developed scalable RESTful and batch applications using Java, the Spring framework, Go, Python, and Spark.
- Conducted thorough testing, debugging, and code review to ensure software reliability and performance.
- Developed and optimized complex SQL queries to support business operations and data analysis.
- Partner with product owners and other stakeholders to define and create the team's roadmap.
- Attend customer calls and work issues by business priority to ensure timely resolution of incidents and problems.
- Certified API coach at Capital One, responsible for reviewing APIs across the company.
- Identified and implemented measures to reduce AWS cost.
Environment: Java, REST services, Git, AWS services, Python, Go, Aurora MySQL, Lambda, EMR, EC2, ECS, S3, etc.
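Illustrative sketch (not part of the original resume): a minimal AWS Lambda handler in Java of the kind described above. The handler name, the "offers" table, and the attribute names are hypothetical; it assumes the aws-lambda-java-core library and the AWS SDK for Java v2 (DynamoDB) are on the classpath.

    import com.amazonaws.services.lambda.runtime.Context;
    import com.amazonaws.services.lambda.runtime.RequestHandler;
    import software.amazon.awssdk.services.dynamodb.DynamoDbClient;
    import software.amazon.awssdk.services.dynamodb.model.AttributeValue;
    import software.amazon.awssdk.services.dynamodb.model.PutItemRequest;
    import java.util.Map;

    // Hypothetical handler: persists one offer assignment received via API Gateway.
    public class AssignOfferHandler implements RequestHandler<Map<String, String>, String> {
        private final DynamoDbClient dynamo = DynamoDbClient.create();

        @Override
        public String handleRequest(Map<String, String> event, Context context) {
            // Table and attribute names are illustrative only.
            dynamo.putItem(PutItemRequest.builder()
                    .tableName("offers")
                    .item(Map.of(
                            "customerId", AttributeValue.builder().s(event.get("customerId")).build(),
                            "offerId", AttributeValue.builder().s(event.get("offerId")).build()))
                    .build());
            return "assigned";
        }
    }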
TECHNOLOGY LEAD
Infosys Ltd, Richmond, VA (Aug 2016 - Dec 2017)

Project: Batch Data Store (Feb 2017 - Dec 2017)
Description: Processing Anthem member data via Java/Hadoop.
Role/Responsibilities:
- Worked as a Java/Hadoop technology lead.
- Used Sqoop to import data from Oracle into HDFS.
- Developed Hive queries to generate extracts for the customer.
- Built Oozie workflows to automate the Sqoop jobs.
- Wrote shell scripts to automate the Hive and Sqoop queries.
- Developed a data ingestion framework in Spark with Java, storing the JSON documents in the NoSQL database MongoDB (see the sketch after this project).
- Developed data synchronization from text files to MongoDB using Spark.
- Handled FTP and Control-M setup in the production environment.
- Shared responsibility for Hadoop administration.
- Day-to-day offshore team lead activities, including production support.
Environment: Java, J2EE, REST, HDFS, Hive, Sqoop, Oozie, MongoDB, Spark, Control-M, Eclipse, Linux, Maven, Git/Bitbucket.
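Illustrative sketch (not part of the original resume): a Spark-with-Java job of the kind described above, reading JSON from HDFS and appending the documents to MongoDB. The connection URI, paths, database, and collection names are hypothetical; it assumes the MongoDB Spark Connector 10.x is on the classpath.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SaveMode;
    import org.apache.spark.sql.SparkSession;

    public class MemberIngest {
        public static void main(String[] args) {
            // Connection URI, database, and collection are placeholders.
            SparkSession spark = SparkSession.builder()
                    .appName("member-ingest")
                    .config("spark.mongodb.write.connection.uri", "mongodb://localhost:27017")
                    .getOrCreate();

            // Read raw JSON documents from HDFS into a DataFrame.
            Dataset<Row> members = spark.read().json("hdfs:///data/members/*.json");

            // Append the documents to MongoDB via the Spark connector.
            members.write()
                    .format("mongodb")
                    .option("database", "memberdb")
                    .option("collection", "members")
                    .mode(SaveMode.Append)
                    .save();

            spark.stop();
        }
    }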
Project: MetLife (Aug 2016 - Jan 2017), Cary, NC
Description: Application support and enhancement of www.metlife.com and related Java/J2EE applications.
Role/Responsibilities:
- Worked as a Java/J2EE technology lead.
- Applied IT Service Management (ITIL) principles to provide application technical support for the MetLife.com website and related Java/J2EE applications.
- Core upgrades: supported creating and implementing new patches for the development/test stages of each release.
- Handled incidents, issues, service requests, and problems related to MetLife applications.
- Worked change requests by modifying the relevant Java/J2EE/UNIX code, scripts, and associated business logic.
- Worked on content management systems.
- Day-to-day offshore team lead activities.
Environment: Java, J2EE, web services (SOAP/REST), XML, Linux, Oracle, MySQL, UNIX, ServiceNow, Tridion CMS, Crownpeak CMS.

CONSULTANT
WiregraphTech Services, Remote (Mar 2014 - Aug 2016)

Project: Bug-DB Analysis (Mar 2014 - Aug 2016)
Description: Research data preparation for analyzing resolved and unresolved bugs in open-source tools and software. Since it is impractical to collect details for every bug in existence from the outside, the initial target was bugs hosted in Bugzilla instances.
Role/Responsibilities:
- Designed and developed the system using Java and Hadoop.
- Wrote Java code to extract bug details from different Bugzilla instances via XML-RPC.
- Developed REST web services using the JAX-RS specification.
- Used Sqoop to import data from an RDBMS into HDFS.
- Developed MapReduce programs to analyze the data received from Bugzilla over XML-RPC.
- Built Oozie workflows to automate tasks in HDFS.
- Created Hive tables, loaded them with data, and wrote Hive queries.
- Created HBase tables to hold processed and analytic data derived from the Bugzilla XML-RPC feeds.
- Managed and reviewed Hadoop log files and configuration files.
- Shared responsibility for Hadoop administration.
Environment: Java, J2EE, web services (SOAP/REST), XML-RPC, Servlet, JSP, JavaScript, Tomcat, XML, HDFS, HBase, Hive, Pig, MapReduce, Oozie, Sqoop, Eclipse, Linux, Ant, Maven, SVN.

TECHNICAL ASSOCIATE
Tech Mahindra Ltd, Noida, India (Sep 2008 - Jan 2013)

Project: FaME-WMGG-Development (Jul 2010 - Jan 2013)
Description: The Managed Routine Interface provides British Telecom (BT) with a complete solution for different operations on existing equipment of BT customers.
Role/Responsibilities:
- Involved in analysis, design, and development of the Managed Routine Interface; produced specifications including use cases, class/sequence diagrams, and activity diagrams.
- Developed the application using the classic Model-View-Controller (MVC) architecture.
- Developed session beans that encapsulate the workflow logic.
- Deployed the application on WebLogic Server.
- Used the JAXB API to bind XML schemas to Java classes (see the sketch at the end of this resume).
- Developed a logging module using Log4j to create log files for debugging and tracing the application.
- Fixed defects and provided application support by resolving issues.
- Automated deployment, builds, configuration management, code coverage, and JUnit runs using Hudson; monitored builds daily via Hudson and used Ant extensively as the build tool.
Environment: Java, J2EE, Oracle ADF, EJB, MDB, JPA, UNIX, Oracle, Ant scripts, Hudson, WebLogic 10.3.2, JDeveloper.

Project: FaME-WMGG-ASG (Nov 2009 - Jun 2010)
Description: Provides British Telecom with a complete solution for operations such as registering a variety of customer complaints and handling them online through thousands of field engineers working on laptops.
Role/Responsibilities:
- Worked on the ED, WMDC, MHW, and WMWS modules.
- Provided technical support to onshore and offshore Taskforce Extended Deliveries (ED) users and the support team (ASG).
- Core upgrades: created and implemented new patches for the development/test stages of each release.
- Handled issues, service requests, and problems related to Taskforce Extended Deliveries.
- Ensured timely project delivery and that SLAs were met.
- Attended customer calls and worked issues by priority.
- Provided assistance on the CST, CIT, and IVVT environments.
- Configured, deployed, and monitored the application on WebLogic Server.
- Raised cases, service requests, and change requests through the Clarify tool for production support.
Environment: Java, J2EE, Oracle 9i, WebLogic 8.1, UNIX, IBM MQ, Control-M.

Project: Tikona BOSS Managed Services (Sep 2008 - Nov 2009)
Description: Provides an end-to-end (E2E) solution for broadband services for Tikona Digital Networks (TDN) through a Business Support System and Operational Support System (BOSS).
Role/Responsibilities:
- Worked on the OSS (OSM, UIM, ASAP) and AIA modules.
- Monitored server status, WebLogic queues, logs, and deployments.
- Completed manual tasks in pending orders and checked order process history in OSM.
- Checked order status in AIA and informed other streams why orders failed.
Environment: MetaSolv Oracle products (OSM, UIM, ASAP, and AIA), UNIX, Oracle 9i.
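Illustrative sketch (not part of the original resume): JAXB binding of an XML schema element to a Java class, of the kind used in the FaME-WMGG-Development project above. The Routine element and its fields are hypothetical; it uses the javax.xml.bind API that ships with Java 8.

    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.Unmarshaller;
    import javax.xml.bind.annotation.XmlRootElement;
    import java.io.File;

    // Hypothetical element from the interface schema.
    @XmlRootElement(name = "routine")
    class Routine {
        public String equipmentId;
        public String action;
    }

    public class RoutineParser {
        public static void main(String[] args) throws Exception {
            // Build a JAXB context for the bound class and unmarshal one document.
            JAXBContext ctx = JAXBContext.newInstance(Routine.class);
            Unmarshaller um = ctx.createUnmarshaller();
            Routine r = (Routine) um.unmarshal(new File("routine.xml"));
            System.out.println(r.equipmentId + " -> " + r.action);
        }
    }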
