
Python Developer Web Applications Resume...

Candidate Information
Title Python Developer Web Applications
Target Location US-IL-Tennessee
Name: SABITHA BEKKAM
Email: EMAIL AVAILABLE
Phone: PHONE NUMBER AVAILABLE

Summary:
- Over 10 years of software life cycle experience in system analysis, design, development, implementation, maintenance, and production support of data warehouse applications; AWS certified cloud engineer with SQL, Python, and Azure Data Factory.
- Domain knowledge of healthcare, insurance, and banking payments.
- Experienced in automating, configuring, and deploying instances on AWS, Azure, and data-center environments; familiar with EC2, CloudWatch, CloudFormation, and managing security groups on AWS.
- Hands-on experience developing web applications implementing Model-View-Controller (MVC) architecture using the Django and Flask web frameworks.
- Developed web applications, RESTful web services, and APIs using Python, Flask, and Django.
- Experience developing web-based applications using Python 2/3, Django 1.4/1.3, AngularJS, Java, CSS, Bootstrap, HTML, DHTML, JavaScript, jQuery, and AJAX.
- Experience with Python ORM libraries, including the Django ORM and SQLAlchemy.
- Proficient in PostgreSQL, SQLite, and MySQL databases with Python.
- Hands-on experience with SVN and Git.
- Automated a Git pipeline using GitHub events and AWS Lambda.
- Extensive use of Linux, SSH, and Flask/Django.
- Worked on an AJAX framework to transform DataSets and DataTables into HTTP-serializable JSON strings.
- Experienced in creating web applications with Python Flask.
- Created RESTful APIs using Python and Postman for other developers.
- Experience configuring EC2 instances for app deployment in production.
- Created back-end microservices for mobile and web applications.
- Set up databases in AWS using RDS and configured instance backups to an S3 bucket.

CERTIFICATIONS:
- AWS Certified Developer - Associate

Technical Skills:
- ETL Tools: Informatica 10.1.1/10.1/9.6/9.1/8.6.1/8.1 (Source Analyzer, Mapping Designer, Workflow Monitor, Workflow Manager, Data Cleansing, Data Quality, Repository, Metadata), StreamSets, Data Mart, OLAP, OLTP, SQL Server SSIS
- Databases: NoSQL, Java, PL/SQL, T-SQL, Oracle 11g/10g/9i/8i, IBM DB2, MS SQL Server (2008/2005/2000/7.0/6.5/6.0), MS Access, DTS, Snowflake, Hive, Netezza, PostgreSQL 8.4.x to 9.5.x, C, C++, C#, Visual Basic 6, Oracle Applications R12.0.6
- Other Tools: Toad, SQL Developer, Crystal Reports, SQL Assistant, Alteryx, SQL Server 2012 Reporting Services (SSRS), Hadoop YARN, Spark, Spark Streaming
- Programming Languages: Python, PySpark, pandas, UNIX shell scripting
- Job Scheduling: shell scripting, Autosys, Tidal, Control-M, Lambda auto-scheduling
- Environments: MS Windows 2012/2008/2005, UNIX
- AWS: AWS Certified Associate Developer; EC2, VPC, ELB, S3, EBS, RDS, Route 53, CloudFormation, Auto Scaling, Lambda, AWS CLI, Jenkins, Chef, Terraform, Nginx, Tomcat, JBoss
- DBA Tools: SQL, Erwin, TOAD, PL/SQL, T-SQL, VMware

Professional Experience

Role: Sr. AWS Python Developer
Client: Fannie Mae, Washington, DC / Virginia (September 2022 - Present)
Responsibilities:
- Developed, tested, and deployed a business feature set in Node.js with an Express and MongoDB backend, incorporating APIs.
- Developed applications using Amazon Web Services (AWS) such as EC2, CloudSearch, and Elastic Load Balancer (ELB); deployed and monitored scalable infrastructure on AWS with configuration management using Puppet.
- Rewrote one of the key pages, which allows users to manage their content; the task involved investigation of the AngularJS UI-Grid and refactoring of several backend methods.
- Built AngularJS modules, controllers, pop-up modals, and file uploaders.
- Worked on server-side applications with Django using Python.
- Interacted with third-party APIs and built RESTful APIs using Node.js.
- Developed a GUI using Python and Django to dynamically display test block documentation and other features of Python code in a web browser.
- Configured Spark Streaming to receive real-time data from Kafka and store the stream data to HDFS using Scala.
- Worked on MongoDB database concepts such as locking, transactions, indexes, sharding, replication, and schema design.
- Managed large-scale data migrations to S3 and RDS, ensuring data integrity and minimal downtime.
- Developed enhancements to the MongoDB architecture to improve performance and scalability.
- Involved in installing Hadoop, MapReduce, HDFS, and AWS; developed multiple MapReduce jobs in Hive for data cleaning and pre-processing.
- Involved in writing SQL queries implementing functions, triggers, cursors, object types, sequences, indexes, etc.
- Created data tables in MySQL and used Jinja to access the data and display it in the front end.
- Worked on automation of data pulls from SQL Server to the Hadoop ecosystem via Sqoop.
- Contributed to the design and creation of RESTful APIs using Python/Django/Django REST Framework.
- Worked extensively with ServiceNow for incident management, ensuring timely resolution of tickets and compliance with SLAs.
- Supported healthcare applications, focusing on AWS Glue, PySpark, and Terraform for infrastructure management and deployment.
- Imported millions of structured records from relational databases using Sqoop for processing with Spark, storing the data in HDFS in CSV format.
- Strong knowledge of all phases of the SDLC and strong working knowledge of software testing (functional, regression, and load testing).
- Installed and maintained the Hadoop/Spark cluster from scratch in a plain Linux environment, defining the code outputs as PMML.
- Enabled complex loan origination and underwriting strategies to be flexibly defined, including the implementation and deployment of credit risk scoring and risk rating models through model import (PMML).
- Successfully implemented Apache Spark and Spark Streaming applications for large-scale data.
- Wrote API documentation for onboarding developers on the Autodesk microservices platform.
- Integrated the app with Amazon Alexa, Google Home, and smart home devices.
- Used event-driven, serverless AWS Lambda compute to scale up secure web applications and distributed systems.

Role: Sr. AWS SQL Developer / Data Engineer (February 2021 - September 2022)
Client: Next Pathway, New York, NY
Responsibilities:
- Expert knowledge of the Teradata platform and associated tools; expert design/coding skills and unit testing methodologies and techniques.
- Configured a multi-node (Amazon EC2 Spot Instance) Hadoop cluster to transfer data between Amazon S3 and HDFS and to direct input and output to the Hadoop MapReduce framework.
- Deployed the project into Amazon Web Services (AWS) using Amazon Elastic Beanstalk.
- Worked on AWS infrastructure with automation and configuration management tools such as Chef and Puppet.
- Developed and maintained ETL processes using AWS Glue, significantly reducing processing times and improving data pipeline efficiency.
- Developed UI using CSS, HTML, JavaScript, AngularJS, jQuery, and JSON.
- Used SAX for XML parsing, and JSON and AJAX to send requests to a secured web service.
- Created and maintained AWS EC2 servers running the Dittach application's full suite of microservice Docker containers and configured StrongLoop Node.js API services.
- Implemented user interface guidelines and standards throughout development and maintenance of the website using HTML, CSS, JavaScript, and jQuery.
- Developed a web API using Node.js, hosted on multiple load-balanced API instances.
- Converted the raw JavaScript application to AngularJS, Node, and MongoDB.
- Designed a DynamoDB pipeline for routing and storing email bounce-handling records.
- Developed views and templates with Python, using Django's view controller and template language to create a user-friendly website interface.
- Worked on MongoDB database concepts such as locking, transactions, indexes, sharding, replication, and schema design.
- Developed enhancements to the MongoDB architecture to improve performance and scalability.
- Worked on modifying a machine learning algorithm named Santillan to build a price recommendation system with feature selection.
- Involved in installing Hadoop, MapReduce, HDFS, and AWS; developed multiple MapReduce jobs in Hive for data cleaning and pre-processing.
- Worked on automation of data pulls from SQL Server to the Hadoop ecosystem via Sqoop.
- Contributed to the design and creation of RESTful APIs using Python/Django/Django REST Framework.
- Wrote SOAP and RESTful web services and an in-house ESB for a web claims application (Spring MVC, AngularJS, Active, SOAP UI, mocking, JSON, Build Forge).
- Designed and documented RESTful APIs for collection and retrieval of high-volume IoT telemetry data.
- Successfully migrated from PostgreSQL to DynamoDB with complete data integrity.
- Involved in environment and code installation as well as the SVN implementation.
- Developed, tested, and debugged software tools used by clients and internal customers.
- Coded test programs and evaluated existing engineering processes.
- Designed and configured databases and back-end applications and programs.
- Collaborated with cross-functional teams to integrate PySpark applications with other data processing tools and systems, enhancing the overall data architecture.
- Expertise in working with MySQL databases, Apache web server, and Tomcat application servers.
- Wrote CloudFormation templates and deployed AWS resources with them.
- Worked on Google Cloud components, Google Container Builder, and GCP client libraries.
- Architected the infrastructure on the Google platform using GCP services and automated GCP infrastructure using GCP Cloud Deployment Manager.

Environment: Python, Linux, Windows XP, HTML, CSS, JavaScript, jQuery, SVN, AJAX, AWS (EC2, VPC, ELB, S3, EBS, RDS, Route 53, CloudFormation, Auto Scaling, Lambda), StreamSets, Git, SQL, Jira, ASP.NET Core.

Role: Sr. AWS Python Developer
Client: Cigna HealthSpring, Nashville, TN (July 2018 - January 2021)
Responsibilities:
- Worked with business analysts on requirement gathering and business analysis, translating business requirements into technical specifications to build the enterprise data warehouse; designed and developed the end-to-end ETL process from various source systems to the staging area, and from staging to the data marts.
- Extensively used the Python/Django framework for developing application back ends.
- Responsible for analyzing various cross-functional, multi-platform application systems, enforcing Python best practices, and providing guidance on long-term architectural design decisions.
- Worked with JSON-based REST web services and Amazon Web Services (AWS).
- Worked on the AngularJS framework to develop interactive websites based on client needs.
- Successfully migrated the website's main database from MySQL to PostgreSQL.
- Helped the big data analytics team implement Python scripts for Sqoop, Spark, and Hadoop batch data streaming.
- Involved in developing a RESTful service using the Python Flask framework.
- Actively involved in the initial software development life cycle (SDLC) phases of requirement gathering, and suggested system configuration specifications during client interaction.
- Designed and created the database tables and wrote SQL queries to access PostgreSQL.
- Analyzed and designed workflows based on business logic.
- Designed a user-friendly interface using the Bootstrap framework.
- Wrote Python code within the Hadoop framework to solve natural language processing problems.

Environment: Informatica PowerCenter 10.1.1, Hadoop, Hive, Oracle 11g, Azure, IDQ, UNIX, PL/SQL, SQL*Plus, TOAD, Teradata 14.0, MS Excel, ActiveBatch V12 Console, Cognos, BigQuery, SQL Server Management Studio 2016, AWS (EC2, VPC, ELB, S3, EBS, RDS, Route 53, CloudFormation, Auto Scaling, Lambda), Git, SQL, Jira.

Role: Sr. AWS Python Developer
Client: USAA, San Antonio, TX (September 2017 - June 2018)
Responsibilities:
- Interacted actively with business analysts and data modelers on mapping documents and the design process for various sources and targets.
- Planned, designed, and implemented application database code objects, such as stored procedures and views; built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
- Designed and implemented scalable, secure, and highly available SQL databases on AWS RDS, leveraging PostgreSQL, MySQL, and SQL Server.
- Implemented automated backup, restore, and disaster recovery solutions for SQL databases on AWS, ensuring data integrity and minimal downtime.
- Utilized AWS Database Migration Service (DMS) for seamless migration of on-premises databases to AWS, reducing migration time and ensuring data consistency.

Environment: Informatica PowerCenter 9.6, Oracle 11g, Azure, Autosys, IDQ, BigQuery, Cognos, AWS, UNIX, PL/SQL, Teradata V13.0, SQL*Plus, TOAD, Teradata SQL Assistant, MS Excel.

Role: Sr. AWS Python Developer (January 2016 - July 2017)
Client: FedEx, Memphis, TN
Responsibilities:
- Communicated effectively with data architects, designers, application developers, and senior management to collaborate on projects involving multiple teams in a highly time-sensitive environment.
- Effectively involved in allocating and reviewing various development activities and tasks with the onshore counterpart; assisted in defining the database requirements, analyzed existing models and reports for opportunities to improve their efficiency, and troubleshot various performance issues.
- High proficiency with SQL, Transact-SQL, stored procedures, and relational database management systems (RDBMS).
- Created RESTful APIs and microservices using Python and AWS Lambda.
- Developed ETL processes using AWS Glue and Python for data ingestion and transformation.
- Implemented data processing workflows using AWS Step Functions and Python.
- Developed highly optimized stored procedures, functions, and database views to implement the business logic; created clustered and non-clustered indexes; involved in performance monitoring, tuning, and capacity planning.
- Designed and developed complex SQL queries, stored procedures, functions, and triggers to support compliance by implementing encryption, user roles, permissions, and auditing in SQL databases.

Environment: Informatica PowerCenter, Teradata SQL Assistant 12.0, Teradata V12.0R2, AWS, Oracle 10g/9i, MS SQL Server 2005/2012, Business Objects, Autosys, Toad 7.6, SQL, PL/SQL, UNIX shell scripting.

Role: AWS Python Developer
Client: GE Healthcare, Hyderabad (May 2014 - July 2015)
Responsibilities:
- Involved in creating the Technical Specification Document (TSD) for the project.
- Used Informatica to load historical data from various tables for different departments.
- Involved in developing the data mart and populating the data marts using Informatica.
- Created and maintained metadata and ETL documentation supporting business rules and detailed source-to-target data mappings.
- Wrote and debugged SQL scripts.
- Created and maintained RESTful APIs and microservices using Flask, Django, and FastAPI.
- Involved in data extraction, transformation, and loading (ETL) between homogeneous and heterogeneous systems using SQL tools (SSIS, Bulk Insert).

Environment: Oracle RDBMS 9i, Informatica, Java, SQL*Plus Reports, SQL*Loader, XML, Toad.

Education:
- Master's in Engineering Management, 2017, Christian Brothers University, TN
- Bachelor's in Computer Science and Engineering, 2015, JNTU, Hyderabad
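The RESTful API work with Python Flask described in several of the roles above can be illustrated with a minimal sketch. The resource name, routes, and payloads below are invented for illustration and are not taken from any of the listed projects; an in-memory dict stands in for the PostgreSQL/MySQL back ends the resume mentions.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store standing in for a real database backend.
_claims = {}

@app.route("/claims/<claim_id>", methods=["PUT"])
def put_claim(claim_id):
    # Create or replace a claim record from the JSON request body.
    _claims[claim_id] = request.get_json(silent=True) or {}
    return jsonify({"id": claim_id, "stored": True}), 201

@app.route("/claims/<claim_id>", methods=["GET"])
def get_claim(claim_id):
    # Return the stored record, or a JSON error with a 404 status.
    if claim_id not in _claims:
        return jsonify({"error": "not found"}), 404
    return jsonify(_claims[claim_id])
```

Flask's built-in test client can exercise these routes without starting a server, which is how a sketch like this would typically be unit-tested.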
