Candidate's Name
PHONE NUMBER AVAILABLE
EMAIL AVAILABLE
Data Analyst
PROFESSIONAL SUMMARY:
Over 5 years of expertise in data analysis and manipulation using a variety of SQL technologies and data tools.
Utilized Ataccama for comprehensive data quality management, ensuring high standards of data integrity.
Integrated Ataccama ONE with other data management tools and platforms (e.g., databases, data warehouses,
BI tools) to ensure seamless data flow and consistency.
Utilized Databricks to connect to cloud data sources, including Azure Blob Storage, for seamless data ingestion.
Developed and executed ETL pipelines in Databricks to clean, transform, and aggregate large datasets, including
transactional records and clickstream data (a PySpark sketch follows this summary).
Proficient in using Python libraries such as Pandas and NumPy to clean, transform, and analyze large datasets.
Experienced in handling missing values and outliers and performing data aggregation (see the Pandas sketch following this summary).
Demonstrated expertise in data manipulation and complex SQL querying with MySQL, ANSI SQL, and MySQL
Workbench.
Skilled in utilizing Microsoft Excel for advanced formulas, enabling efficient data analysis and reporting tasks.
Proficient in data visualization and creating interactive dashboards with Tableau and Power BI, enhancing decision-
making processes.
Experienced in leveraging Talend Open Studio for robust data integration and ETL processes, ensuring data
accuracy.
Competent in version control using GIT, ensuring collaborative and efficient development workflows.
Utilized Docker for setting up reproducible development environments, streamlining project setup and
maintenance.
Expert in analyzing data and building models in Jupyter Notebooks, optimizing data-driven strategies.
Advanced proficiency in SQL, Informatica Data Quality, and Excel for comprehensive data quality management.
Skilled in R for statistical analysis, applying quantitative techniques to solve complex data challenges.
Utilized data profiling tools such as Trifacta and OpenRefine for precise data cleaning and preparation.
Expert in utilizing Collibra for data governance, ensuring compliance with data standards and policies.
Proficient in managing and analyzing data using SQL Server, enhancing database performance and reliability.
Loaded and ingested data from various sources into Snowflake, including structured and semi-structured data.
Designed and managed scalable data models using Snowflake's schema and table structures.
Applied Agile methodologies in project management to improve team performance and project delivery.
Proficient in AWS Cloud services including S3 and Redshift, optimizing data storage and analytics operations.
Leveraged Amazon SageMaker for deploying machine learning models, enhancing predictive analytics.
Experienced in utilizing Apache Spark for large-scale data processing, improving data handling efficiencies.
Integrated Amazon Rekognition into data projects, enriching data insights with advanced image and text analysis.
Developed and maintained data automation workflows using tools like AWS, Alteryx, and Apache Spark,
enhancing operational efficiencies.
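A minimal PySpark sketch of the kind of Databricks ETL pipeline described above, assuming a Databricks notebook where spark is predefined; the storage path, container, and column names are hypothetical:

    # Minimal ETL sketch (hypothetical path and columns); assumes a Databricks
    # notebook where `spark` is already defined.
    from pyspark.sql import functions as F

    # Ingest raw clickstream events from Azure Blob Storage
    raw = spark.read.json("wasbs://events@examplestore.blob.core.windows.net/clickstream/")

    # Clean: drop records missing a user id, deduplicate on event id
    clean = raw.filter(F.col("user_id").isNotNull()).dropDuplicates(["event_id"])

    # Aggregate: daily event counts per user
    daily = (clean
             .withColumn("event_date", F.to_date("event_timestamp"))
             .groupBy("user_id", "event_date")
             .count())

    # Load the aggregate back out as a managed Delta table
    daily.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_events")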
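A similarly minimal Pandas/NumPy sketch of the missing-value, outlier, and aggregation handling mentioned above; the file and column names are hypothetical:

    # Pandas/NumPy cleaning sketch (hypothetical file and columns).
    import numpy as np
    import pandas as pd

    df = pd.read_csv("transactions.csv")

    # Missing values: fill numeric gaps with the median, drop rows missing a key
    df["amount"] = df["amount"].fillna(df["amount"].median())
    df = df.dropna(subset=["customer_id"])

    # Outliers: keep rows within 3 standard deviations of the mean
    z = (df["amount"] - df["amount"].mean()) / df["amount"].std()
    df = df[np.abs(z) <= 3]

    # Aggregation: total and average amount per customer
    summary = df.groupby("customer_id")["amount"].agg(["sum", "mean"])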
TECHNICAL SKILLS:
SQL and Database Management: MySQL, ANSI SQL, MySQL Workbench, SQL Server
Data Analysis and Reporting: Microsoft Excel, Tableau, R, Apache Spark (ALS), AWS Redshift, Databricks
Programming Languages: Python, JavaScript, SAS
Data Integration and ETL: Talend Open Studio, Alteryx, Informatica Data Quality, Snowflake
Data Visualization: Tableau, Jupyter Notebooks, Power BI
Machine Learning and AI: Amazon SageMaker, Amazon Rekognition
Cloud Services and Storage: AWS S3, AWS Cloud
Version Control and Collaboration: GIT, Docker, SharePoint, Confluence
Data Governance and Quality: Collibra, Ataccama, Trifacta, OpenRefine
Agile and Project Management: Agile
PROFESSIONAL EXPERIENCE:
Client: FIS Global, Jacksonville, FL Jan 2023 to Present
Role: Data Analyst
Roles & Responsibilities:
Analyzed financial data using advanced SQL queries and Excel, improving accuracy in financial reporting and
decision-making.
Developed and maintained a collaborative project documentation system using SharePoint and Confluence,
which enhanced team productivity.
Used Ataccama to design and manage ETL (Extract, Transform, Load) processes to consolidate financial
data from various sources, including transactional systems, accounting systems, and market data feeds.
Specialized in solving complex problems through detailed data analysis and statistical methods.
Designed and implemented interactive data visualizations and dashboards in Tableau, significantly
improving stakeholders' insights into financial trends.
Applied machine learning techniques using Amazon SageMaker to predict financial outcomes, thereby
increasing predictive accuracy.
Managed and optimized large datasets in AWS S3, enhancing data storage and accessibility for extensive
financial analysis.
Configured and maintained data processing workflows using Apache Spark ALS, enhancing the performance
and accuracy of predictive models (an ALS sketch follows this list).
Developed quantitative models for forecasting stock prices, calculating risk metrics, and optimizing
investment portfolios.
Used libraries such as StatsModels and SciPy for statistical analysis and hypothesis testing (see the second
sketch after this list).
Automated routine data preparation and cleansing tasks using Alteryx, significantly improving data quality
and usability for analysis.
Conducted comprehensive data analysis and reporting using the R language, supporting advanced
statistical analysis and modeling for financial forecasting.
Documented all data processes and automation workflows comprehensively, ensuring transparency and
repeatability in financial analyses.
Enhanced financial data manipulation capabilities using complex formulas and data functions in Microsoft
Excel.
Utilized AWS Cloud for scalable data storage solutions, facilitating efficient data management and retrieval.
Leveraged Confluence for team collaboration and knowledge sharing, improving project communication
and documentation.
Improved financial data visualization and interactivity using advanced features in Tableau, facilitating easier
data interpretation.
Streamlined financial data workflows using AWS Redshift for efficient data warehousing solutions,
enhancing data accessibility.
Optimized financial forecasting models using machine learning capabilities in Amazon SageMaker,
increasing model reliability and accuracy.
Enhanced data profiling and cleansing processes using Alteryx, ensuring high standards of data cleanliness
and accuracy.
Employed AWS S3 for robust data backups and disaster recovery solutions, ensuring data security and
availability.
Integrated machine learning algorithms with financial datasets using Amazon SageMaker, enhancing
predictive analytics capabilities.
Documented and standardized financial data handling processes using SharePoint, ensuring compliance with
regulatory standards and internal policies.
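A minimal sketch of a Spark ALS workflow like the one referenced above; the data location, column names, and hyperparameters are hypothetical, not the production configuration:

    # Spark ALS sketch (hypothetical path, columns, and hyperparameters).
    from pyspark.sql import SparkSession
    from pyspark.ml.recommendation import ALS
    from pyspark.ml.evaluation import RegressionEvaluator

    spark = SparkSession.builder.appName("als-sketch").getOrCreate()
    ratings = spark.read.parquet("s3://example-bucket/ratings/")

    train, test = ratings.randomSplit([0.8, 0.2], seed=42)
    als = ALS(userCol="user_id", itemCol="item_id", ratingCol="rating",
              rank=10, regParam=0.1, coldStartStrategy="drop")
    model = als.fit(train)

    # Score held-out data and evaluate with RMSE
    rmse = RegressionEvaluator(metricName="rmse", labelCol="rating",
                               predictionCol="prediction").evaluate(model.transform(test))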
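And a short StatsModels/SciPy illustration of the hypothesis-testing and regression work mentioned above; the return series here are simulated stand-ins:

    # StatsModels/SciPy sketch with simulated return series.
    import numpy as np
    import statsmodels.api as sm
    from scipy import stats

    rng = np.random.default_rng(0)
    market_returns = rng.normal(0.0005, 0.01, 250)
    portfolio_returns = 1.2 * market_returns + rng.normal(0, 0.005, 250)

    # Hypothesis test: is the mean daily portfolio return different from zero?
    t_stat, p_value = stats.ttest_1samp(portfolio_returns, 0.0)

    # CAPM-style OLS: regress portfolio returns on market returns to estimate beta
    X = sm.add_constant(market_returns)
    ols = sm.OLS(portfolio_returns, X).fit()
    print(ols.params, t_stat, p_value)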
Environment: SQL, SharePoint, Confluence, Tableau, Amazon SageMaker, AWS S3, Apache Spark, AWS Redshift, Alteryx,
Amazon Rekognition, R, Microsoft Excel.
Client: Wellington Management, Boston, MA Dec 2020 to Jul 2022
Role: Data Quality Analyst
Roles & Responsibilities:
Utilized SQL for extensive data analysis, enhancing insights into investment trends and market conditions.
Employed Informatica Data Quality tools to ensure the accuracy and usability of investment data.
Conducted statistical analyses using Excel and R, providing robust support for investment decisions.
Implemented data profiling with Trifacta, improving the quality and consistency of financial datasets.
Collaborated effectively with data stewards and IT teams to address and resolve data-related issues.
Developed and maintained a data validation framework using SQL and Excel, ensuring data integrity (a sketch follows this list).
Governed data processes using Collibra, maintaining compliance with regulatory and internal data standards.
Managed SQL Server databases, optimizing data storage and retrieval for investment analyses.
Applied Agile methodologies to manage projects, enhancing team agility and project responsiveness.
Utilized Ataccama for data quality control, ensuring high-quality data for investment analysis.
Employed SQL Server and Informatica tools to manage metadata and support data lineage initiatives.
Worked with Trifacta and OpenRefine for data cleaning and standardization, enhancing data quality.
Used R for advanced statistical modeling, supporting quantitative investment strategies and decisions.
Partnered with IT teams using Agile practices to streamline project delivery and enhance data workflows.
Implemented data governance frameworks with Collibra to ensure data quality and compliance.
Coordinated with data stewards to enhance data accuracy and governance using SQL and Informatica.
Facilitated root cause analysis for data issues, using SQL and collaboration with IT teams.
Documented and maintained data quality initiatives in Collibra, enhancing transparency and accountability.
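A minimal sketch of the kind of SQL-based validation framework described above, run against SQL Server via pyodbc; the connection string, tables, and checks are hypothetical:

    # Data-validation sketch (hypothetical connection, tables, and checks).
    import pyodbc

    conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                          "SERVER=example-server;DATABASE=investments;"
                          "Trusted_Connection=yes")
    cursor = conn.cursor()

    # Each query counts violations; a healthy dataset returns 0 for every check
    checks = {
        "null_security_ids": "SELECT COUNT(*) FROM holdings WHERE security_id IS NULL",
        "negative_quantities": "SELECT COUNT(*) FROM holdings WHERE quantity < 0",
        "duplicate_trade_ids": "SELECT COUNT(*) - COUNT(DISTINCT trade_id) FROM trades",
    }
    for name, sql in checks.items():
        violations = cursor.execute(sql).fetchone()[0]
        status = "OK" if violations == 0 else f"{violations} violations"
        print(f"{name}: {status}")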
Environment: SQL, Informatica Data Quality, Excel, R, Trifacta, Collibra, SQL Server, Ataccama, OpenRefine.
Client: Metalyst Forgings Limited, India Jan 2019 to Nov 2020
Role: Data Analyst
Roles & Responsibilities:
Conducted complex data querying and management using MySQL and ANSI SQL, supporting critical data
operations (a query sketch follows this list).
Designed and managed databases using MySQL Workbench, improving data structure and accessibility.
Performed advanced data manipulation and analysis using Microsoft Excel, supporting key business
decisions.
Created dynamic data visualizations and reports using Tableau, enhancing understanding of data patterns.
Managed data integration projects effectively using Talend Open Studio, ensuring data consistency and
reliability.
Utilized GIT for version control, enhancing data security and collaboration among project teams.
Employed Docker to ensure consistent development environments across various projects, reducing setup
times.
Analyzed and interpreted complex datasets using Jupyter Notebooks, driving insights and analytics initiatives.
Utilized Microsoft Excel for financial modeling and forecasting, providing detailed analysis and reporting.
Applied GIT for effective version control in collaborative development projects, maintaining code integrity.
Leveraged Docker for containerization, ensuring seamless environment management for development
teams.
Used Jupyter Notebooks for data visualization and predictive modeling, enhancing data-driven decisions.
Developed ETL processes with Talend Open Studio, improving data integration and workflow efficiency.
Employed Tableau for strategic data reporting, supporting business intelligence and decision-making
processes.
Managed version control with GIT in development projects, ensuring efficient team collaboration and data
consistency.
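A small sketch of the kind of complex MySQL querying described above, run through mysql-connector-python; the credentials, tables, and columns are hypothetical:

    # MySQL querying sketch (hypothetical credentials, tables, and columns).
    import mysql.connector

    conn = mysql.connector.connect(host="localhost", user="analyst",
                                   password="secret", database="operations")
    query = """
        SELECT c.region,
               DATE_FORMAT(o.order_date, '%Y-%m') AS month,
               SUM(o.total) AS revenue
        FROM orders o
        JOIN customers c ON c.customer_id = o.customer_id
        GROUP BY c.region, month
        ORDER BY c.region, month
    """
    cur = conn.cursor()
    cur.execute(query)  # no params passed, so the % in DATE_FORMAT is sent as-is
    for region, month, revenue in cur.fetchall():
        print(region, month, revenue)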
Environment: MySQL, ANSI SQL, MySQL Workbench, Microsoft Excel, Tableau, Talend Open Studio, GIT, Docker,
Jupyter Notebooks.
Education:
Master's in Management Information Systems
University of Memphis, TN