Data Engineer Google Cloud Resume Frisco...
Candidate Information
Title: Data Engineer Google Cloud
Target Location: US-TX-Frisco
Candidate's Name
Phone: PHONE NUMBER AVAILABLE | Email: EMAIL AVAILABLE

Professional Summary:
- Senior Lead Data Engineer with 11+ years of experience in Information Technology, delivering projects in fast-paced environments.
- Comprehensive background in designing, developing, testing, and supporting production-grade data solutions on Google Cloud Platform (GCP).
- Proficient in SQL and in leveraging GCP services to optimize data workflows, drive actionable insights, and ensure data integrity.
- Skilled in managing complex data pipelines and providing production support, ensuring the reliability and availability of critical data systems on GCP.
- Proven ability to collaborate with cross-functional teams and deliver robust, scalable data solutions that empower data-driven decision-making in a GCP environment.
- Extract, transform, and load (ETL) data using GCP-native tools such as BigQuery, Compute Engine instances, and Cloud Storage.
- Expertise in GCP, GCS (Google Cloud Storage), BigQuery, Dataflow, gsutil, and Hadoop.
- Expertise in Google Bigtable, Cloud Composer/Airflow, Pub/Sub, Compute Engine, and IAM.
- Represent the team in senior leadership reviews.
- Expertise in the Insurance domain and RDBMS (Oracle SQL).
- Led and managed a team of 5, organizing workloads, prioritizing tasks, and ensuring timely project delivery while tracking performance using key performance indicators (KPIs).
- Familiar with the SDLC and Agile.
- Resolved team conflicts by facilitating open dialogue and implementing conflict-resolution strategies; improved team collaboration and productivity by 30% through proactive intervention.
- Monitor team performance and report on metrics.
- Expertise in ServiceNow, GitHub, Jira, Confluence, Global Service Desk (GSD), and Control-M.
- Incident management and root cause analysis (RCA) within the SLA.
- Set clear team goals and KPIs.
- Expertise in the Insurance and Banking domains.
- Proficient in defect tracking and classifying bugs by severity.
- Involved in daily, weekly, and monthly review meetings.

Technical Skill Set:
Cloud (GCP services): Cloud Storage, BigQuery, Cloud Composer, Cloud SQL, Dataflow, Dataproc, Pub/Sub, Cloud Functions, Compute Instances
Hadoop Ecosystem: HDFS
Programming: Python 3
Databases: BigQuery, Oracle (SQL)
Operating Systems: Linux, Windows
Developer Tools: Oracle SQL Developer, PyCharm, Visual Studio
Bug Reporting Tools: Bug Zero, Bugzilla
Scheduling Tools: Airflow (Cloud Composer), Cloud Scheduler, Control-M
Other Utilities: PuTTY (UNIX), Confluence, GitHub, Global Service Desk, ServiceNow, JIRA, Citrix, SharePoint, WinSCP

Professional Certificates:
- Google Cloud Certified Professional Data Engineer (2023)
- Google Cloud Certified Associate Cloud Engineer (2022)

Professional Experience:

Capgemini - Senior Data Engineer, Jun 2019 - May 2023
Client: HSBC Bank
Project Name: Group Liquidity Reporting System (GLRS)
Environment: Google Cloud Platform (GCP), Linux, Python, Hadoop, Control-M, SQL
The purpose of this project is to analyze data and develop the required reports in GCP BigQuery by transferring data from on-premises systems to the cloud.
Roles & Responsibilities:
- Worked on GCP with Google data products such as Compute Engine, Cloud Storage, BigQuery, Stackdriver Monitoring, and Cloud Deployment Manager.
- Developed and maintained data pipelines on GCP, including BigQuery and report generation.
- Identified and resolved data quality issues.
- Successfully delivered projects on time and within budget.
- Accountable for successful builds, deployments, and Change Requests (CRs).
- Tuned SQL queries to increase performance and optimize cost.
- Tested and reviewed designs for scripts before deployment to the production environment.
- Developed Airflow workflows to extract and load data from Cloud Storage into BigQuery.
- Prepared RCA documents and maintained them in an internal database for future reference.
- Strong experience working in strictly time-bound environments and resolving incidents within agreed SLAs.
- Encrypted files using Python before loading them into Google Cloud Storage.
- Coordinated support activities, patch deployment, and configuration management with external vendors and numerous internal teams.
- Used Hadoop to store data on-premises.
- Reduced manual dependencies by implementing automated processes.
- Extracted, transformed, and loaded (ETL) data using GCP-native tools, including BigQuery.
- Performed audit tasks and provided the required evidence to the auditor.
- Performed monthly activities such as housekeeping of GCP BigQuery tables, Linux servers, and VM refreshes.
- As a mentor, guided teammates in resolving technical issues and improving functional knowledge.
- Built the development environment using bug-tracking tools such as ServiceNow, Jira, and Confluence, and version control such as Git and GitHub.

Nucsoft Ltd. - Senior Software Engineer, Jun 2017 - Jun 2019
Client: SBI Life Insurance
Project Name: Channel Management System (CMS)
The purpose of this project is to analyze data and develop the required reports in GCP BigQuery by transferring data to the cloud.
Roles & Responsibilities:
- Used Airflow (Cloud Composer) to build and schedule data pipelines for ingress and egress data.
- Performed peer reviews as a code owner.
- Developed ad hoc scripts to run models and compare results between legacy and new platforms for audit purposes.
- Deployed data-processing jobs to batch and online platforms with the corresponding GCP service, Dataflow.
- Created and modified database objects (tables, views) as per requirements.
- Collaborated with cross-functional teams, including data engineers and analysts, to understand business requirements and translate them into technical specifications.

Nucsoft Ltd. - Software Engineer, Dec 2013 - Jun 2017
Client: SBI Life Insurance
Project Name: Policy Management System (PMS)
The purpose of this project is to analyze data and develop the required reports using Oracle SQL.
Roles & Responsibilities:
- Analyzed and developed reports as requested (CCP).
- Developed RO & HO contest queries using SQL and PL/SQL.
- Created new indexes to improve the performance of SQL statements.
- Created database objects such as tables, views, triggers, procedures, and packages.
- Developed queries that extract valuable insights from large datasets using multiple joins.
- Used unit tests to ensure that individual units of code work as expected before they are integrated.
- Worked with other engineers and stakeholders to understand and implement data requirements.
- Provided clarity on complex technical topics through documentation.
- Proven ability to work in fast-paced, deadline-driven environments and resolve incidents within agreed service level agreements (SLAs).
- Kept the team informed of progress and blockers during daily stand-up meetings.

GVR Infra Projects Ltd. - System Admin, Nov 2010 - Mar 2013
Roles & Responsibilities:
- Installed operating systems and related software.
- Monitored system performance.
- Quickly arranged hardware repair in the event of hardware failure.
- Installed hardware and peripheral devices (printers, scanners, etc.).
- Troubleshot hardware and software.

Academic Credentials:
Jawaharlal Nehru Technological University, India - Bachelor of Technology, Aug 2005 - Jul 2009
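The ETL pattern described in the experience above (extract from Cloud Storage, transform, load into BigQuery) can be sketched in plain Python. This is a minimal illustration, not the candidate's actual pipeline: a local CSV string stands in for a GCS object, an in-memory list stands in for a BigQuery table, and all names (`extract`, `transform`, `load`, the `policy` fields) are hypothetical.

```python
import csv
import io

# Stand-in for an extract step: in production this would read an object
# from Google Cloud Storage rather than a hard-coded string.
RAW_CSV = """policy_id,premium,currency
P001,1200.50,USD
P002,980.00,USD
P003,,USD
"""

def extract(raw: str) -> list[dict]:
    """Parse the raw CSV export into row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Drop rows with missing premiums and cast numeric fields."""
    return [
        {"policy_id": r["policy_id"], "premium": float(r["premium"])}
        for r in rows
        if r["premium"]  # a simple data-quality rule: skip incomplete records
    ]

def load(rows: list[dict], table: list) -> int:
    """Append rows to the target 'table' (stand-in for a BigQuery load job)."""
    table.extend(rows)
    return len(rows)

table: list[dict] = []
loaded = load(transform(extract(RAW_CSV)), table)
```

In a real Airflow (Cloud Composer) deployment each of these steps would typically be its own task, so a failed load can be retried without re-running the extract.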
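The index-tuning work mentioned under the PMS project (creating indexes to speed up Oracle SQL statements) can be illustrated with SQLite as a stand-in, since the effect is the same: a filtered lookup goes from a full table scan to an index search. The `policy` table, its columns, and the index name here are hypothetical, not from the resume.

```python
import sqlite3

# SQLite stands in for Oracle here; table and column names are illustrative.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE policy (policy_no TEXT, holder_id INTEGER, premium REAL)")
con.executemany(
    "INSERT INTO policy VALUES (?, ?, ?)",
    [(f"P{i:05d}", i % 1000, 100.0 + i) for i in range(10_000)],
)

# Without an index, a lookup by holder_id must scan the whole table.
plan_before = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM policy WHERE holder_id = 42"
).fetchall()

# Creating an index turns the scan into an index search.
con.execute("CREATE INDEX idx_policy_holder ON policy (holder_id)")
plan_after = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM policy WHERE holder_id = 42"
).fetchall()

scan_text = plan_before[0][3]    # plan detail, e.g. a SCAN of policy
search_text = plan_after[0][3]   # plan detail, now using idx_policy_holder
```

The same before/after check (via `EXPLAIN PLAN` in Oracle) is how an index's benefit is usually verified before it is created in production.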
