Candidate's Name
Phone (M): PHONE NUMBER AVAILABLE
Email: EMAIL AVAILABLE

Experience Summary:
Over 10 years of experience in the analysis, design, development, management and implementation of various stand-alone and client-server enterprise applications using Python.
Strong experience developing software in Python using libraries such as Beautiful Soup, NumPy, SciPy, matplotlib, python-twitter, Pandas data frames, NetworkX and urllib2, with MySQLdb for database connectivity.
Sound experience working with different Python editors such as PyCharm, PyScripter, PyStudio, Sublime Text, Emacs and Spyder.
Implemented automation using Selenium WebDriver and Python.
Experienced with the full software development life cycle: architecting scalable platforms, object-oriented programming, database design and agile methodologies.
Strong experience automating web application testing using Selenium WebDriver with a testing framework (a minimal sketch follows the Technical Skills list below).
Expert in analyzing test results and preparing test/defect summary reports for senior management.
Experience with Python frameworks such as Django, Zope and Pyramid.
Strong exposure to testing applications on different browsers such as IE, Firefox and Chrome.
Expertise in locating web elements using XPath and CSS selectors.
Experienced with WAMP (Windows, Apache, MySQL and Python) and LAMP (Linux, Apache, MySQL and Python) architectures.
Good experience setting up REST APIs using Django.
Firm knowledge of the phases of software testing and the Software Development Life Cycle (SDLC), including Agile and Waterfall methodologies.
Worked with Python testing frameworks such as zope.testing, pytest, nose and Robot Framework.
Good experience in Linux Bash scripting and following PEP guidelines in Python.
Good knowledge of front-end technologies such as HTML5, CSS3, Bootstrap, JavaScript, Node.js, AngularJS, XML, Ajax and jQuery.
Good knowledge of object-oriented analysis, design and development of applications using Core Java, Servlets, JSP, JDBC, RMI, multithreading, Eclipse and Tomcat.
Familiar with JSON-based REST web services and Amazon Web Services.
Proficient in writing SQL queries, stored procedures, functions, packages, tables, views and triggers using relational databases such as Oracle, DB2, MySQL and MS SQL Server.
Skilled in debugging and troubleshooting issues in complex applications.
Expertise in object-oriented design and coding, with good knowledge of design patterns and UML.
Followed Agile (Scrum) methodologies and participated in daily stand-up meetings.
Experienced in unit testing and testing using Selenium.
Good experience with Selenium IDE and creating scripts in Selenium RC using Python.
Experienced with version control tools such as Dimensions, Git, CVS and SVN, and build tools such as Hudson, ClearCase and Ant.
Good experience in shell scripting, SQL Server, Unix, Linux and OpenStack.
Familiar with development tools such as Bugzilla, Jira, Confluence and Axosoft.

Technical Skills:
Frameworks: Django, Flask
Web Technologies: HTML, CSS, JavaScript, jQuery, AJAX, XML
Programming Languages: Python, C/C++, Perl, SQL and PL/SQL
Versioning Tools: SVN, CVS, Git, GitHub
Databases: Oracle (9i, 10g, 11g), MySQL, Postgres
Protocols: TCP/IP, HTTP/HTTPS, SOAP, SMTP
Deployment Tools: Amazon EC2, Jenkins, GitLab CI/CD
Testing Tools: Selenium, Bugzilla and JIRA
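A minimal sketch of the Selenium WebDriver test automation referenced above, assuming pytest as the test framework; the URL, locators and page under test are hypothetical placeholders, not an actual project:

# Hypothetical Selenium WebDriver + pytest sketch; URL and locators are
# illustrative placeholders.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By


@pytest.fixture
def driver():
    # Start a browser session and guarantee cleanup after each test.
    drv = webdriver.Chrome()
    yield drv
    drv.quit()


def test_login_page_title(driver):
    # Navigate to the page under test (hypothetical URL).
    driver.get("https://example.com/login")
    # Locate elements via CSS selector and XPath, as mentioned above.
    username = driver.find_element(By.CSS_SELECTOR, "input#username")
    username.send_keys("demo_user")
    submit = driver.find_element(By.XPATH, "//button[@type='submit']")
    submit.click()
    assert "Dashboard" in driver.title

Running pytest discovers and executes the test; the fixture ensures the browser is closed even when an assertion fails.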
Professional Experience:

Northern Trust, Chicago, IL    Feb 2021 - Present
Data Engineer / Sr. Python Developer

Responsibilities:
Played a key role in the migration of a large commercial loan portfolio (worth 10M USD) from IBS to ACBS, resulting in improved data accuracy and streamlined reporting processes.
Implemented CRUD (Create, Read, Update, Delete) functionality using the SQLAlchemy ORM for Postgres and MySQL, providing seamless data manipulation within Python applications (see the sketch at the end of this section).
Developed an ETL pipeline for a banking application, ingesting from multiple sources using Python and AWS services such as EC2, ECR, S3, CloudWatch and RDS, which contributed to client growth of ~20%.
Created custom automation for data cleansing, validation and transformation across millions of database records, ensuring data accuracy and consistency across multiple systems and platforms, and scheduled cron jobs to run it automatically.
Integrated with external systems and services using RESTful APIs, Python scripting, and message queuing technologies such as RabbitMQ and Apache Kafka.
Deployed PySpark and Spark SQL code and ran it in AWS Glue and on AWS EMR clusters, orchestrated with AWS Step Functions.
Designed and deployed automated ETL workflows using AWS Lambda, cleansing data in S3 buckets with AWS Glue and processing it with Amazon Redshift.
Designed and implemented DynamoDB tables for efficient storage and retrieval of customer and product data.
Integrated API Gateway to manage external API access and ensure secure communication.
Experienced with the AWS cloud platform and its features, including EC2, S3, RDS and CloudWatch.
Loaded CloudWatch logs to S3 and into Kinesis Streams for data processing.
Created Terraform scripts for EC2 instances, Elastic Load Balancers and S3 buckets.
Connected to a REST API to download owner details and organization information using Python, and stored the data in a Postgres database using Pandas.
Designed and implemented a comprehensive data warehouse model in Snowflake.
Developed robust ETL pipelines using Snowflake's native features, ensuring seamless extraction, transformation and loading of data from various sources.
Wrote SQL queries and stored procedures within the Snowflake environment, ensuring efficient data extraction and analysis.
Experienced working with Google Cloud Platform (GCP) services such as Compute Engine, Kubernetes Engine, Cloud Storage, BigQuery, Dataflow and Pub/Sub.
Worked on building, deploying and troubleshooting extraction of very large datasets using Azure Data Factory, ensuring efficient ETL processes.
Designed and implemented end-to-end data solutions leveraging Azure services, including ADF, Azure Data Lake and Azure SQL Data Warehouse.
Utilized Scala and Spark for real-time and batch data processing, ensuring the scalability and reliability of data processing workflows.
Developed Spark applications using Spark SQL in Databricks for ETL from multiple file formats, analyzing and transforming the data.
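A minimal sketch of the kind of SQLAlchemy ORM CRUD layer described above; the table, columns and connection URL are hypothetical placeholders, not the production schema:

# Illustrative SQLAlchemy ORM CRUD sketch (SQLAlchemy 1.4+); the model and
# connection string are hypothetical.
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()


class Customer(Base):
    __tablename__ = "customers"
    id = Column(Integer, primary_key=True)
    name = Column(String(100), nullable=False)
    email = Column(String(255), unique=True)


# Swap in a MySQL URL (mysql+pymysql://...) for the MySQL-backed services.
engine = create_engine("postgresql+psycopg2://user:pass@localhost/bank")
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

with Session() as session:
    # Create
    session.add(Customer(name="Jane Doe", email="jane@example.com"))
    session.commit()
    # Read
    jane = session.query(Customer).filter_by(name="Jane Doe").first()
    # Update
    jane.email = "jane.doe@example.com"
    session.commit()
    # Delete
    session.delete(jane)
    session.commit()

Because the ORM abstracts the dialect, the same model code can target Postgres or MySQL by changing only the engine URL.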
State Farm, Bloomington, IL    Dec 2018 - Feb 2021
Sr. Python AWS Engineer

Responsibilities:
Participated in all phases of the Software Development Life Cycle (SDLC), including requirement gathering, design, analysis and code development.
Designed and developed data pipelines and automated them as required; generated Python Django forms to record data from online users.
Modified available Python and R libraries based on business requirements.
Worked with a Postgres database to retrieve role and owner lists from a third-party vendor.
Used Python RESTful services to query the Qualys database and upload the relevant data to an internal database.
Built Python and R images compatible with the machine learning platform.
Set up build and deployment automation for Aqua on premises.
Worked with the ITSVP ticketing system to create tickets for vulnerabilities and notify users and management.
Developed a high-performance real-time application API using FastAPI (see the sketch at the end of this section).
Built a secure and scalable e-commerce API using Flask as the web framework.
Refactored and optimized an existing Flask API, improving code readability and performance.
Created a Django application providing an admin page, with Postgres to query multiple databases.
Put modern data platforms into use, including platform-as-a-service variants.
Used a REST API to download data from Qualys and filter out active attacks and publicly exploited CVE IDs.
Utilized PyTest for comprehensive testing, ensuring reliability and performance under various usage scenarios.
Implemented type annotations with Pydantic, enhancing code readability and maintainability.
Orchestrated CI/CD pipelines using GitLab, ensuring smooth integration between development and operations teams.
Integrated SQLAlchemy for database management, ensuring efficient data storage and retrieval; proficient with SQLAlchemy against relational databases such as MySQL, PostgreSQL and SQLite, and integrated it with web frameworks such as Flask and Django for scalable, maintainable applications and APIs.
Integrated JFrog Pipelines into the CI/CD workflow, improving artifact management and deployment processes.
Deployed serverless microservices on AWS Fargate, using a Python Lambda to process the Aqua data.
Designed and implemented a real-time recommendation engine using Python and Amazon Neptune.
Developed data pipelines with Airflow and EMR (HDFS) to orchestrate data ingestion, cleansing and transformation from diverse sources into Neptune and DynamoDB.
Conducted static code analysis using tools such as Coverity and Coverage to identify and address code vulnerabilities, ensuring high-quality software.
Connected to the Solma API to download owner details and organization information using Python, and stored the data in a Postgres database using Pandas.
Worked on developing new APIs.
Worked on building, deploying and troubleshooting extraction of very large datasets using Azure Data Factory, ensuring efficient ETL processes.
Designed and implemented end-to-end data solutions leveraging Azure services, including ADF, Azure Data Lake and Azure SQL Data Warehouse.
Utilized Scala and Spark for real-time and batch data processing, ensuring the scalability and reliability of data processing workflows.
Developed Spark applications using Spark SQL in Databricks for ETL from multiple file formats, analyzing and transforming the data.
Migrated workloads from on premises to Azure.
Provided customers access to the Aqua tool using Azure.
Developed and maintained automation scripts and tools for deployment and configuration management using Terraform.
Validated and ensured the accuracy of data migration from SQL Server to Snowflake.
Used Infrastructure-as-Code tools such as Terraform and Deployment Manager to automate the provisioning and configuration of GCP resources.
Used configuration management tools such as Ansible and Puppet to manage and automate the configuration of GCP instances.
Orchestrated and executed Terraform configurations for multi-cloud environments, optimizing flexibility and resource utilization.
Integrated Terraform configurations with version control systems such as Git, promoting collaboration and enabling effective version tracking.
Used Continuous Integration/Continuous Deployment (CI/CD) tools such as Jenkins, GitLab CI/CD and Cloud Build to automate the deployment of code changes to GCP environments.
Used monitoring and logging tools such as Stackdriver and Prometheus/Grafana to monitor the health and performance of GCP resources and applications.
Used security and compliance tools such as Identity and Access Management (IAM), Security Command Center and Compliance Manager to keep GCP environments secure and compliant with relevant regulations and standards.
Used collaboration and communication tools such as Slack, Jira and Confluence to work with other team members and stakeholders on GCP projects.
Designed and implemented scalable, reliable, highly available and secure cloud architectures on Google Cloud Platform (GCP) for multiple projects, ensuring optimal performance and cost-effectiveness, using services such as Compute Engine, Kubernetes Engine (GKE), App Engine, Cloud Storage, BigQuery and Cloud SQL.
Implemented CI/CD pipelines using Google Cloud Build and Jenkins to improve release frequency and reliability, streamlining software delivery with automated testing and deployment workflows.
Installed and configured a private Docker registry, authored Dockerfiles to run applications in containerized environments, and used Kubernetes to deploy, scale, load-balance and manage Docker containers across multiple namespaces.
Configured multi-container Docker applications using the Docker Compose tool, which is driven by a YAML configuration file.
Used Kubernetes to manage containerized applications using its nodes and deployed application containers as Pods.
Involved in setting up Kubernetes clusters for running microservices and pushed microservices into production on Kubernetes-backed infrastructure.
Deployed Aqua on premises and in AWS using Terraform.
Optimized AWS Redshift for efficient storage and management of large-scale data, utilized it for petabyte-scale banking data, and designed and implemented a real-time analytics dashboard on top of it.
Queried the SQL database to remove duplicates and retrieve required information from the Qualys, Aqua, Solma and Itada tables.
Built the CI/CD process from scratch and automated the whole pipeline through to ticketing using ServiceNow.
Developed a SQL Server database containing tables, stored procedures, functions, views, triggers and indexes, and connected it to the existing CMS system.
Environment: Python 3.6, Docker, Docker Swarm, Kubernetes, Pandas, Hadoop, Git, Logstash, AWS, Kafka, Hive, Terraform
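A minimal sketch of a FastAPI endpoint with Pydantic type annotations, as referenced in this section; the resource name and fields are hypothetical placeholders, not the actual service:

# Hypothetical FastAPI + Pydantic sketch; routes and fields are illustrative.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()


class Vulnerability(BaseModel):
    id: int
    cve_id: str
    severity: str


# In-memory store standing in for a real database layer.
vulnerabilities: dict[int, Vulnerability] = {}


@app.post("/vulnerabilities")
def create_vulnerability(vuln: Vulnerability) -> Vulnerability:
    # Pydantic has already validated the request body against the model.
    vulnerabilities[vuln.id] = vuln
    return vuln


@app.get("/vulnerabilities/{vuln_id}")
def read_vulnerability(vuln_id: int) -> Vulnerability:
    if vuln_id not in vulnerabilities:
        raise HTTPException(status_code=404, detail="Not found")
    return vulnerabilities[vuln_id]

Assuming the file is named main.py, it can be served with an ASGI server such as Uvicorn (uvicorn main:app), typically behind Gunicorn in production.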
Q2ebanking, Austin, TX    Oct 2017 - Dec 2018
Python Developer

Responsibilities:
Provided digital banking solutions by integrating banking cores and back-end systems with the Q2 platform in an n-tier architecture, using Python and web technologies.
Developed online banking forms through which end users access the various services provided in online banking.
Automated backups through shell scripts and scheduled cron jobs for daily backups and weekly archival to S3 buckets on AWS.
Developed product extensions providing custom additional functionality for the banking product using Python.
Designed and implemented PySpark data pipelines to ingest, transform and cleanse financial data from diverse sources such as account transactions, market feeds and customer records (see the sketch at the end of this section).
Developed PySpark ETL scripts to seamlessly migrate data and ensure data integrity.
Designed product extensions that leverage the Q2 Wedge Framework.
Worked with SOAP- and XML-based APIs to communicate with third parties through the platform.
Communicated with the database, writing stored procedures for necessary database operations.
Supported internal and external testing and resolved bugs.
Deployed code to customer environments.
Worked on web forms and back-end APIs with a Python integration framework for banks and credit unions, developed on a component-based design architecture.
Integrated the TransferNow and PopMoney services provided by Fiserv via a single sign-on (SSO) authentication service using the Python requests module.
Worked on integrating third-party FIS calls to view user payment activity, eStatements and documents.
Developed and maintained web applications and APIs using FastAPI, Flask and Django, ensuring adherence to project requirements and timelines.
Implemented database operations and ORM using SQLAlchemy, optimizing query performance and data management.
Conducted thorough testing of applications with PyTest, addressing issues promptly to maintain code quality.
Enforced coding standards and style guidelines using Flake8 and Black, facilitating collaboration and code review.
Integrated type annotations with Pydantic, improving code maintainability and facilitating documentation.
Leveraged asynchronous programming for REST API calls, enhancing application responsiveness and scalability.
Managed application deployment and process management using Uvicorn/Gunicorn, ensuring smooth operation in production environments.
Developed RESTful web services for a Symitar RepGen extension.
Conducted performance tuning and optimization of PySpark jobs, enhancing data processing speed and resource utilization.
Implemented and maintained data pipelines in PyCharm, ensuring code quality and adherence to best practices.
Developed interactive dashboards and reports using Athena to empower decision-makers with real-time data insights.
Managed four bank portfolios simultaneously, providing implementation, support and maintenance services.
Used Visual Studio 2010/2012 and Team Foundation Server (TFS) for source control and project management.
Integrated stop-payment notifications for end users using techniques such as SFTP and a secure message notification script.
Worked on the Funds Transfer API and developed widgets such as External Transfers, Future-Dated Transfers, Member-to-Member Transfers, Scheduled Transfers and Recurring Transfers.
Worked on the end-user auto-enrollment online banking module, which creates a user profile for online banking.
Worked on the end-user real-time address change banking module, which allows users to change their address when they move.
Collaborated with internal teams to convert end-user feedback into meaningful, improved solutions.
Developed application extensions such as Fraud Alerts, Travel Notifications, Secure Messages, Secure Access Code, Account Preferences and User Profile Update.
Good understanding of technical banking terminology such as ABA numbers, internal and external account numbers, PFM accounts and external accounts.
Developed REST and SOAP test suites for API testing using the SoapUI and Postman tools.
Worked in Kanban and Scrum processes within an agile methodology.
Participated in all phases of the Software Development Life Cycle (SDLC), including requirement gathering, design, analysis and code development.
Worked with a team of developers on Python applications for risk management.
Responsible for building an automated ingestion platform that crawls websites and manipulates the data.
Resolved ongoing problems and accurately documented project progress.
Environment: Python 2.7, MSSQL, jQuery, Linux, Ajax, JavaScript, Apache, Jinja, HTML5 and CSS
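A minimal sketch of a PySpark ingest/cleanse/transform pipeline of the kind described above; the input path, column names and output location are hypothetical placeholders:

# Illustrative PySpark ETL sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("txn_cleanse").getOrCreate()

# Ingest raw transaction records.
raw = spark.read.csv("s3://example-bucket/raw/transactions.csv",
                     header=True, inferSchema=True)

# Cleanse: drop exact duplicates and rows missing the key fields.
cleaned = (raw.dropDuplicates()
              .dropna(subset=["account_id", "amount"]))

# Transform: normalize the timestamp and flag large transactions.
transformed = (cleaned
               .withColumn("txn_date", F.to_date("txn_timestamp"))
               .withColumn("is_large", F.col("amount") > 10000))

# Load: write partitioned Parquet for downstream analytics.
transformed.write.mode("overwrite").partitionBy("txn_date") \
    .parquet("s3://example-bucket/curated/transactions/")

Partitioning the output by date keeps downstream queries and incremental reprocessing cheap.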
Southern Graphics Systems, LLC (Kwikee), Peoria, IL    Oct 2016 - Oct 2017
Python Developer

Responsibilities:
Developed RESTful APIs integrating the web application with Django, HTML with Jinja templating logic, and Python implementations, exchanging data in JSON and XML formats.
Wrote Python scripts to parse XML documents and load the data into the database.
Refactored the existing project to make it more RESTful and thread-safe.
Created a Python/Django web application using Python scripting for data processing, with Oracle/SQLite3 as the backend database.
Used regular expressions extensively for pattern matching.
Worked on Excel automation in Python using the xlsxwriter and openpyxl modules.
Developed views and templates with Python and the Django view controller and template language to create a user-friendly website interface, and used the Django database APIs to access database objects.
Experience writing subqueries, stored procedures, triggers, cursors and functions on MySQL and PostgreSQL databases.
Worked with Linux shell commands and command-line utilities.
Used the Python library Beautiful Soup 4 for web scraping to extract data for building graphs (see the sketch at the end of this section).
Worked in the MySQL database on queries and wrote stored procedures for normalization.
Worked on internal applications such as automation of the data transformation engine and Procter & Gamble Full Ingestion, and on client applications such as the Hy-Vee vendor portal (hyvee.kwikeevendor.com) by Kwikee.
Managed large datasets using Pandas data frames and Cassandra.
Implemented business logic using Python/Django, working with millions of database records daily, finding common errors and bad data patterns and fixing them.
Used standard Python modules such as xlrd, csv, itertools, pickle, json, subprocess, collections, lxml, xmltodict, cx_Oracle and the ElementTree XML API for development.
Experienced in developing parallel computing applications using multithreading and multiprocessing.
Developed Python web services for processing JSON and interfacing with the data layer.
Worked with data specialists to develop data mappings that support the Kwikee data format.
Built SQL queries for performing CRUD operations: create, read, update and delete.
Managed, designed and developed a dashboard control panel showing feed status information for the Kwikee internal team, using Python, the Django framework, HTML, CSS, JavaScript, Bootstrap, jQuery and REST API calls.
Used Vagrant to install and configure virtual machines and hosted a personal VM pointing to the development environment.
Used Liquid Planner and Microsoft Teams for project management and collaboration.
Responsible for debugging and troubleshooting the web application.
Knowledge of the AWS cloud platform and its features, including EC2, S3, Lambda, EBS, CloudWatch, SNS, Auto Scaling, IAM user management and Route 53.
Engaged in the design, development, deployment, testing and implementation of the application.
Maintained the development environment using bug-tracking tools such as Jira and Confluence and version control systems such as Git, GitLab and SVN.
Experienced in Agile methodologies and the Scrum process.
Experienced in writing technical documentation to accurately represent application design for user support.
Environment: Python, Jinja, Django, Selenium, Amazon AWS, Beautiful Soup, Pandas, jQuery, Bootstrap, MySQL, Linux, Ajax, JavaScript, Apache, Cassandra, Oracle SQL Developer, HTML5 and CSS
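A minimal Beautiful Soup 4 scraping sketch of the kind described above, assuming the requests library; the URL and table structure are hypothetical placeholders:

# Illustrative web-scraping sketch; URL and CSS structure are hypothetical.
import requests
from bs4 import BeautifulSoup

response = requests.get("https://example.com/products", timeout=30)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

# Extract (name, price) pairs from a hypothetical product table.
rows = []
for tr in soup.select("table.products tr")[1:]:  # skip the header row
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if len(cells) >= 2:
        rows.append((cells[0], float(cells[1].lstrip("$"))))

# The resulting list can feed a plotting library such as matplotlib.
print(rows)

The parsed tuples are exactly the shape a graphing step needs, which is why scraping and plotting were paired in the original workflow.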
New Generation Computers, INDIA    Aug 2011 - Dec 2014
Sr. Python Developer

Responsibilities:
Translated customer requirements into design specifications and ensured the requirements were realized in the software solution.
Worked on an application based on a service-oriented architecture that used Python 2.7, Django 1.5, JSF 2.1, Spring 3.2.5, Ajax, HTML and CSS for the front end.
Generated Python Django forms to record data from online financial users.
Maintained Selenium scripts between releases and reproduced failed automated test cases manually.
Used the Pandas API to organize the data in time-series and tabular formats for easy timestamp-based data manipulation and retrieval.
Used regular expressions extensively for pattern matching.
Worked with a team of developers on Python applications for risk management.
Responsible for inserting, deleting and updating all financial data using MySQL.
Skilled in using Python collections for manipulating and looping through user-defined objects.
Involved in unit testing and integration testing.
Implemented user interface guidelines and standards throughout the development and maintenance of the website using HTML, CSS, JavaScript and jQuery.
Implemented validation, error handling and caching.
Used a test-driven development (TDD) approach for the services required by the application.
Developed applications in a Linux environment and familiar with its commands.
Designed and implemented a scalable Selenium test framework using Python and PyTest, reducing test execution time by 30%.
Refactored the existing project to make it more RESTful and thread-safe.
Added unit tests and improved existing ones.
Responsible for debugging and troubleshooting the web application.
Implemented the Model-View-Controller architecture in web applications using the Django framework.
Worked extensively on Python scripting and development; used CSS to style web pages and XHTML markup.
Environment: Python 2.7, Django 1.5, Selenium, HTML5, CSS, VMware, Oracle DB, Pandas, Spring, MS SQL Server 2013, JasperReports, JavaScript, Ajax, Eclipse, Linux, Shell Scripting, RESTful, MVC3

Education:
Bachelor of Technology in Electronics and Communications Engineering, JNTUA, INDIA.
Master's in Computer Systems and Information Security, AUM, 2016.