Praneeta Simhadri
Driving e-commerce success through advanced data insights & analytics
Data Engineering with PySpark, Databricks, Vertica | Visualization using Tableau, Power BI | SQL & Python
Seattle, WA | EMAIL AVAILABLE | PHONE NUMBER AVAILABLE

Summary:
Highly analytical Data Engineering professional with 10+ years in Business Intelligence and data visualization, specializing in retail and e-commerce. Expert in building data products and efficient data pipelines within Customer Data Platforms. Proficient in data analysis and visualization using SQL and PySpark.

Education:
Tableau Specialization, University of Washington, WA, 2018
B.S. in Mechanical Engineering, GITAM, Andhra University, India, 2006

Skills:
Analysis: SQL, PySpark, Python
Databases: Vertica, Databricks, Snowflake, Azure SQL Server, Azure Synapse, PostgreSQL
Orchestration Tools: Airflow, Azure Data Factory
Cloud: Azure
BI Tools & Visualization: Tableau (primary), Power BI, Kibana, Power Apps
Collaboration & Project Management Tools: GitLab, GitHub, JIRA, Confluence, Miro, Draw.io

Strengths & Specialties:
In-depth knowledge of the e-commerce and retail industries, with a focus on fraud detection, risk management, and chargeback analysis.
Extensive experience collaborating with business teams to understand analytical and data needs and critical metrics.
Exposure to client interactions, user requirement analysis, and user support.
Worked on data projects that integrate marketing analytics solutions, such as unifying customer profiles to build personalized experiences and generating Customer 360 datasets.
Experience building and maintaining scalable data pipelines and ETL processes using PySpark and SQL in a Databricks environment.
Proficient in Apache Spark and Spark SQL for processing and querying large datasets efficiently.
Performed exploratory data analysis using SQL on the Databricks and Snowflake platforms, uncovering critical insights and trends to drive data-driven decision making.
Leveraged the Medallion architecture by developing and optimizing bronze, silver, and gold tables, ensuring high data quality and supporting diverse business use cases with robust data solutions (see the sketch after this section).
Handled streaming tasks using Databricks DLT to process near-real-time data from Azure Event Hubs, enabling seamless data ingestion and processing for the Lakehouse.
Developed and maintained dashboards that provide real-time insight into metrics and allow stakeholders to self-serve.
Extensive experience sourcing, processing, managing, and distributing business-driven visual reports for key stakeholders in e-commerce and trade-in domains.
Utilized SQL queries to scrutinize transaction data and identify anomalies and deviations from established fraud rules, strengthening the fraud detection system and ensuring the integrity of financial transactions.
Extensive knowledge of creating development pages in Confluence and working with project management tools such as JIRA.
End-to-end experience designing, building, and deploying interactive dashboards over large datasets using Tableau and Power BI.
Designed and developed various dashboards and reports utilizing Tableau visualizations such as dual axis charts, bar graphs, scatter plots, pie charts, heat maps, bubble charts, tree maps, and geographic visualizations, making use of actions and global filters according to end-user requirements.
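The bronze/silver/gold layering referenced above can be illustrated with a minimal PySpark sketch. This is an assumption-laden example, not code from any listed project; the table and column names (lakehouse.orders_bronze, order_id, order_ts, order_total) are hypothetical placeholders.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw ingested records, kept as-is for traceability (hypothetical table name)
bronze = spark.read.table("lakehouse.orders_bronze")

# Silver: cleaned, deduplicated, conformed records
silver = (
    bronze
    .filter(F.col("order_id").isNotNull())
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
)
silver.write.format("delta").mode("overwrite").saveAsTable("lakehouse.orders_silver")

# Gold: business-level daily aggregate ready for BI tools such as Tableau
gold = (
    silver
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.count("*").alias("orders"), F.sum("order_total").alias("revenue"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("lakehouse.orders_daily_gold")
```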
Professional Experience:

Senior Data Engineer, Fraud & Risk Operations, Digitech - Clients (Samsung), 2022 - 2024
Integrated data from multiple sources within the data warehouse to provide comprehensive analytical solutions.
Developed complex queries and data transformations with Spark SQL to support fraud detection and chargeback analysis.
Created credit authorization datasets comparing suppression data from third-party vendors such as Signifyd and analyzed revenue impacts.
Developed reports in Tableau to analyze success rates at different levels of traffic.
Performed detailed data analysis using SQL to identify fraud trends, patterns, and anomalies within trade-in data (see the sketch after this section).
Actively involved in reconciling data gaps and deficiencies uncovered during exploratory data analysis of identified fraudulent data.
Designed SQL views in Vertica and Databricks to support Tableau extracts and dashboards.
Developed comprehensive Tableau reports presenting insights on trade-in operations, providing a visual representation of key performance indicators and trends, SLA rates, chargeback losses, and fraud analytics.
Worked with cross-functional teams to customize Tableau reports, ensuring they meet departmental requirements and contribute to data-driven decision-making processes.
Collaborated with Engineering and DevOps to integrate new data sources, enhancing trade-in and fraud data.
Addressed and rectified data gaps in the fraud and trade-in areas.
Established and maintained data quality standards, ensuring the accuracy and reliability of data used in risk, fraud, chargeback, and trade-in operations.

Senior Business Intelligence Developer, Regal IT - Clients (Publicis, Sephora), 2018 - 2022
Designed and implemented data models to support business intelligence and analytics.
Managed data engineering pipelines for the Customer Data Platform for retail e-commerce clients using Azure Databricks.
Integrated data from different sources to create unified datasets within the bronze layer of the Data Lakehouse using SQL and PySpark.
Created analytical datasets within Databricks supporting data science models.
Worked with multiple customer transactional and behavioral datasets to develop Customer Journey tables and Customer 360 views, which helped data scientists build feature stores and models.
Worked with large-scale datasets to perform data cleaning, transformation, and analysis.
Imported and manipulated data from Databricks and Parquet data stores using SQL queries against Databricks Delta tables.

BI Developer, Digitech India - Clients (Invesco, India), 2012 - 2017
Translated business requirements into technical solutions by creating reports, Power BI dashboards, and scorecards.
Developed and managed data pipelines using Azure Databricks and Azure Data Factory to load flat-file extracts received from third-party suppliers, transforming the data and loading it into the Data Lake.
Created aggregated datasets using SQL and published data extracts for Tableau and OBIEE.
Created Tableau dashboards with derived fields to explain data trends to non-technical audiences.
Designed various visualizations such as heat maps, story timelines, and trend reports in Tableau for data analysis.
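As a rough illustration of the kind of SQL-based fraud-anomaly check described in the experience above, the sketch below flags transactions whose amount deviates sharply from a customer's recent average, using Spark SQL on Databricks. The table, columns, and the 3x threshold (transactions, customer_id, amount, txn_date) are assumed for the example and are not from any actual schema or fraud rule set.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Flag transactions more than 3x a customer's trailing 30-day average amount.
# All names and the threshold are hypothetical placeholders.
flagged = spark.sql("""
    WITH recent AS (
        SELECT customer_id,
               AVG(amount) AS avg_amount
        FROM transactions
        WHERE txn_date >= date_sub(current_date(), 30)
        GROUP BY customer_id
    )
    SELECT t.txn_id, t.customer_id, t.amount, r.avg_amount
    FROM transactions t
    JOIN recent r ON t.customer_id = r.customer_id
    WHERE t.amount > 3 * r.avg_amount
""")

flagged.show()
```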