
Data Scientist Resume Birmingham, AL

Candidate Information
Title: Data Scientist
Target Location: US-AL-Birmingham
Candidate's Name
PHONE NUMBER AVAILABLE | EMAIL AVAILABLE | LinkedIn: Candidate's Name | Street Address
Data Scientist

PROFESSIONAL SUMMARY:
- Over 5 years of experience managing large datasets of structured and unstructured data for complex data analysis and predictive modeling.
- Developed logical data architectures aligned with enterprise-wide standards and objectives.
- Applied predictive modeling and machine learning algorithms to deliver actionable insights for strategic business projects.
- Specialized in extracting value-added datasets using tools such as Azure, Python, R, SAS, and SQL to target specific customer segments.
- Collaborated with the lead Data Architect to design data warehouse solutions adhering to FSLDM standards, using Snowflake schema and 3NF formats.
- Designed and deployed interactive visualizations and dashboards in Tableau, enhancing data presentation and storytelling on web platforms.
- Proficient in SQL and PL/SQL, developing procedures, triggers, and packages to streamline data processes.
- Employed foundational and advanced machine learning models, including regression, random forests, neural networks, and deep learning algorithms.
- Leveraged visualization tools such as ggplot2, D3.js, and Tableau to create insightful dashboards and reports.
- Skilled in a variety of programming and scripting languages, including R, Python, Java, and C++.
- Conducted cluster analysis and principal component analysis (PCA), and developed recommender systems and association rules to support decision-making (a brief sketch follows this summary).
- Integrated and synthesized data from diverse sources, including Oracle, MS SQL Server, Teradata, and flat files, into comprehensive data solutions.
- Applied statistical and machine learning techniques such as ANOVA, clustering, and regression for robust model building and time series analysis.
- Expert in predictive modeling, data mining methods, and advanced statistical techniques, including factor analysis and hypothesis testing.
- Advanced knowledge of data modeling, mining, and extraction techniques, significantly improving data utilization and interpretation.
- Developed and managed business intelligence and data warehousing solutions, focusing on dimensional data modeling, ETL, and OLAP cubes.
- Enhanced data-driven decision-making through rigorous statistical analysis, including sampling, variance analysis, and hypothesis testing.
- Designed physical data architectures for new system implementations, optimizing performance and scalability.
- Spearheaded the integration of business intelligence tools with existing data systems to enhance analytical frameworks and reporting capabilities.
- Profound understanding of relational database design, OLAP concepts, and data warehouse methodologies.
- Implemented advanced data analysis techniques such as outlier detection and correlation analysis to improve data integrity and accuracy.
- Served as an Integration Architect and Data Scientist specializing in analytics, BPM, SOA, ETL, Big Data, and cloud technologies, leading multiple projects to successful completion.
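For illustration, a minimal sketch of the cluster analysis and PCA workflow named in the summary, assuming scikit-learn; the input file and column selection are hypothetical placeholders, not the candidate's actual pipeline:

    # Minimal sketch: PCA followed by k-means cluster analysis.
    # "customer_features.csv" and its columns are hypothetical.
    import pandas as pd
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    df = pd.read_csv("customer_features.csv")
    X = StandardScaler().fit_transform(df.select_dtypes("number"))

    pca = PCA(n_components=0.90)          # keep components explaining 90% of variance
    X_reduced = pca.fit_transform(X)

    kmeans = KMeans(n_clusters=5, n_init=10, random_state=42)
    df["segment"] = kmeans.fit_predict(X_reduced)
    print(df["segment"].value_counts())   # size of each customer segment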
TECHNICAL SKILLS:
- Database Design Tools: ER/Studio, SQL Power Architect, Tableau Prep Builder
- Databases: PostgreSQL, MySQL, MongoDB, Neo4j, Amazon Redshift
- Languages: Python, R, Scala, JavaScript, SQL, Java
- Data Processing: Apache Spark, Apache Flink, Apache Kafka
- Machine Learning: TensorFlow, PyTorch, scikit-learn, Keras
- Data Visualization: Tableau, Power BI, Matplotlib, Seaborn, D3.js
- Big Data Platforms: Hadoop, Apache Hive, Cloudera, Google BigQuery
- Cloud Services: AWS (S3, EC2, Redshift, Lambda), Azure (HDInsight, Databricks), GCP
- Development Tools: Jupyter Notebook, RStudio, Git, Docker
- Operating Systems: Linux, macOS, Windows

PROFESSIONAL EXPERIENCE:

Client: StateFarm, Boerne, TX | Mar 2023 to present
Role: Data Scientist
Roles & Responsibilities:
- Conducted predictive modeling using machine learning techniques such as regression and classification to forecast outcomes effectively (see the sketch after this role).
- Designed and developed advanced data analytics programs in R and Python, preparing, transforming, and harmonizing datasets for modeling.
- Identified opportunities for process enhancements and implemented them using technologies such as Oracle, Informatica, and SAP Business Objects.
- Engineered the prototype of the Data Mart, documenting anticipated outcomes for end-user applications and reporting.
- Used UML for business process modeling to ensure alignment with strategic goals.
- Developed and maintained a comprehensive data dictionary, creating metadata reports for both technical and business stakeholders.
- Imported data from diverse sources, transformed it using Apache Hive and MapReduce, and managed storage within HDFS.
- Engaged with Business Analysts, SMEs, and Data Architects to gather and understand business requirements for various projects.
- Created and maintained SQL tables with strict referential integrity, developing complex queries using SQL, SQL*Plus, and PL/SQL.
- Executed data analysis tasks, including identifying datasets, source data attributes, metadata, and data definitions.
- Performed database performance tuning, including optimizing SQL statements and indexes, and monitored server performance.
- Authored simple to complex SQL queries and scripts to generate standard and ad hoc reports for senior management.
- Collaborated on source-to-target data mapping documents, including assessments of data quality.
- Developed PL/SQL packages, database triggers, and user procedures, along with user manuals for newly created programs.
- Participated in business meetings to capture and document business needs and requirements.
- Prepared ETL architecture and design documents detailing the extraction, transformation, and loading of Duck Creek data into a dimensional model using SSIS.
- Provided technical and requirements guidance to team members on ETL and SSIS design.
- Designed ETL frameworks and was responsible for their development and maintenance.
- Developed logical and physical data models using tools such as MS Visio.
- Engaged in stakeholder meetings to understand business requirements and translate them into actionable data strategies.
- Participated in architectural solution meetings, providing guidance on dimensional data modeling design.
- Coordinated with technical teams to clarify data requirements, ensuring accurate and timely delivery of data solutions.

Environment: R, Python, Oracle, Informatica, SAP Business Objects, UML, Apache Hive, MapReduce, HDFS, SQL, SQL*Plus, PL/SQL, SSIS, MS Visio.
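A minimal sketch of the regression/classification forecasting referenced in this role, assuming scikit-learn; the input file, feature set, and binary target are hypothetical placeholders:

    # Minimal sketch: train/test split plus a classification forecast.
    # "claims.csv" and the "claim_filed" target are hypothetical.
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import classification_report

    df = pd.read_csv("claims.csv")
    X = df.drop(columns=["claim_filed"])   # numeric features assumed
    y = df["claim_filed"]                  # binary outcome to forecast

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=42)

    model = RandomForestClassifier(n_estimators=300, random_state=42)
    model.fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))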
Client: Tansoncorp, Bloomington, MN | Nov 2020 to Jul 2022
Role: Data Scientist
Roles & Responsibilities:
- Conducted advanced statistical analysis and modeling using Python, TensorFlow, and PyTorch to forecast trends and inform business strategies.
- Documented entire process flows covering program development, testing, integration, coding, and implementation, ensuring transparency and reproducibility.
- Scored predictive models according to regulatory requirements, using the Population Stability Index (PSI) to ensure accuracy and compliance (a worked PSI sketch follows this role).
- Developed predictive scorecards for life insurance, term deposits, cross-selling of car loans, and recurring deposits, enhancing product targeting.
- Mentored junior data scientists and analysts, providing guidance on complex data projects and fostering professional development.
- Created propensity models for retail liability products using R, driving proactive marketing campaigns based on customer behavior predictions.
- Performed data transformation and cleansing, and engineered new variables to refine analytical models and insights.
- Defined functional requirements for each data integration project, documenting source-to-target mappings to guide development.
- Identified and documented critical customer and account attributes for Master Data Management (MDM) implementation from various sources.
- Presented and gained approval for logical data models at Data Model Governance Committee (DMGC) meetings.
- Extracted and tabulated data from multiple sources using advanced SQL and SAS, ensuring accurate data availability for analysis.
- Collaborated with stakeholders to identify the most appropriate systems of record, profiling essential data for sales and service enhancements.
- Extracted data from the Hadoop Distributed File System (HDFS) and prepared datasets for exploratory analysis using data munging techniques.
- Defined key identifiers for each data mapping and interface, ensuring data integrity and consistency across platforms.
- Validated machine learning classifiers using Receiver Operating Characteristic (ROC) curves and lift charts, optimizing model performance.
- Led data workshops with subject matter experts (SMEs) and stakeholders to improve understanding of data requirements and catalogues.
- Integrated and tested machine learning algorithms using Jupyter Notebooks and Databricks, streamlining model development and deployment.
- Utilized Apache Spark for large-scale data processing and analytics, improving data handling efficiency and analysis speed.
- Advocated for and implemented data governance practices, ensuring data quality and compliance with industry standards across all projects.

Environment: Python, TensorFlow, PyTorch, R, SQL, SAS, Hadoop Distributed File System (HDFS), Jupyter Notebooks, Databricks, Apache Spark.
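Since PSI is central to the model monitoring described in this role, here is a minimal, self-contained sketch of the standard PSI computation; the bin count and the simulated score arrays are illustrative only:

    # Minimal sketch: Population Stability Index between a baseline
    # (expected) score distribution and a current (actual) one.
    # PSI = sum((actual% - expected%) * ln(actual% / expected%)) over bins.
    import numpy as np

    def psi(expected, actual, bins=10):
        # Bin edges come from the baseline (development-time) distribution.
        edges = np.percentile(expected, np.linspace(0, 100, bins + 1))
        # Clip both samples into the baseline range so every score lands in a bin.
        expected = np.clip(expected, edges[0], edges[-1])
        actual = np.clip(actual, edges[0], edges[-1])
        e_pct = np.histogram(expected, edges)[0] / len(expected)
        a_pct = np.histogram(actual, edges)[0] / len(actual)
        # Guard against log(0) / division by zero in empty bins.
        e_pct = np.clip(e_pct, 1e-6, None)
        a_pct = np.clip(a_pct, 1e-6, None)
        return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

    rng = np.random.default_rng(0)
    baseline = rng.normal(600, 50, 10_000)   # illustrative model scores
    current = rng.normal(615, 55, 10_000)
    print(f"PSI = {psi(baseline, current):.4f}")  # > 0.25 commonly flags drift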
Client: Avon Technologies Pvt Ltd, India | Oct 2018 to Oct 2020
Role: Data Analyst
Roles & Responsibilities:
- Developed Tableau visualizations and dashboards using Tableau Desktop.
- Applied Business Objects best practices during development, with a strong focus on reusability and performance.
- Developed and executed load scripts using the Teradata client utilities FASTLOAD, MULTILOAD, and BTEQ.
- Generated periodic reports based on statistical analysis of the data using SQL Server Reporting Services.
- Used graphical entity-relationship diagramming to create new database designs via an easy-to-use graphical interface.
- Maintained metadata (data definitions of table structures) and version control for the data model.
- Designed different types of star schemas for detailed data marts and plan data marts in the OLAP environment.
- Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements mapped to test scripts, ensuring that any change control in requirements leads to a test case update (see the sketch after this role).
- Responsible for developing and testing conversion programs that import data from text files into an Oracle database using Perl, shell scripts, and SQL*Loader.
- Utilized Erwin's forward/reverse engineering tools and the target database schema conversion process.
- Developed SQL scripts for creating tables, sequences, triggers, views, and materialized views.
- Coordinated with business users, stakeholders, and SMEs for functional expertise, design and business test scenario reviews, UAT participation, and validation of financial data.
- Worked on creating an enterprise-wide model (EDM) for products and services in a Teradata environment based on data from the PDM.
- Conceived, designed, developed, and implemented this model from scratch.

Environment: Tableau Desktop, Teradata client utilities, SQL Server Reporting Services, ERwin, SQL*Loader, Perl.
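To make the mapping tests above concrete, a minimal sketch of a source-to-target reconciliation check; SQLite stands in for the actual Oracle/Teradata databases, and the table and column names are hypothetical placeholders:

    # Minimal sketch: source-to-target mapping test as a row-count and
    # aggregate reconciliation. SQLite stands in for Oracle/Teradata;
    # "stg_orders"/"dim_orders" and their columns are hypothetical.
    import sqlite3

    conn = sqlite3.connect("warehouse.db")   # hypothetical database file

    checks = {
        "row_count": ("SELECT COUNT(*) FROM stg_orders",
                      "SELECT COUNT(*) FROM dim_orders"),
        "amount_sum": ("SELECT ROUND(SUM(amount), 2) FROM stg_orders",
                       "SELECT ROUND(SUM(order_amount), 2) FROM dim_orders"),
    }

    for name, (src_sql, tgt_sql) in checks.items():
        src = conn.execute(src_sql).fetchone()[0]
        tgt = conn.execute(tgt_sql).fetchone()[0]
        status = "PASS" if src == tgt else "FAIL"
        print(f"{name}: source={src} target={tgt} -> {status}")

    conn.close()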
