
Data Analyst Quality Resume - Meriden, CT

Candidate Information
Title: Data Analyst Quality
Target Location: US-CT-Meriden
Candidate's Name
PHONE NUMBER AVAILABLE | EMAIL AVAILABLE
Data Analyst

PROFESSIONAL SUMMARY:
- Experienced Data Analyst with over 6 years of experience using SQL, Excel, and Tableau for data analysis and reporting across multiple industries.
- Skilled in designing and deploying large-scale data infrastructures using ETL processes, Oracle Data Integrator, and Informatica to enhance operational efficiency.
- Connected and transformed data from various sources using Power Query and DAX.
- Proficient in advanced data analytics, utilizing tools such as Apache Spark and Amazon SageMaker for predictive modeling and machine learning deployment.
- Facilitated sharing and collaboration via the Power BI Service.
- Enabled data-driven decision-making by delivering timely, accurate business intelligence solutions that directly contributed to organizational objectives.
- Expert in data cleansing and transformation, employing technologies such as SSIS and Apache NiFi to ensure data accuracy and usability.
- Utilized advanced analytics features such as Power BI's AI capabilities and complex DAX calculations to provide deeper insights and predictive analytics.
- Strong background in the financial and healthcare sectors, focusing on data integrity using SQL Server and Informatica Data Quality.
- Adept at implementing data lineage and metadata management frameworks with tools such as Collibra and Informatica Enterprise Data Catalog.
- Capable of translating complex datasets into actionable insights using visualization tools such as Tableau and Microsoft Excel.
- Experienced with AWS cloud services, including AWS S3, AWS Redshift, and AWS Cloud, for data processing and storage solutions.
- Skilled in root cause analysis for data discrepancies, using R for statistical analysis and data quality tools such as Ataccama.
- Effective collaborator in Agile project environments, using Confluence and SharePoint for project management and team collaboration.
- Committed to continuous learning and the application of new data technologies and methodologies.
- Highly skilled in maintaining data documentation standards, ensuring adherence to compliance and data security protocols.
- Proficient in managing data transfers through FTP/SFTP and integrating systems via RESTful APIs for seamless data exchange.
- Expert in data quality tools such as Informatica Data Quality and OpenRefine for improving data accuracy and performance.
- Experienced with data integration and transformation tools such as SSIS, Oracle Data Integrator, and Apache NiFi.
- Experienced in managing Salesforce data, making business information simpler and more effective to work with.
- Proficient in statistical analysis and predictive modeling using R, enhancing data-driven decision-making across business operations.
- Skilled in automating data processes and enhancing analytical workflows using Alteryx to improve efficiency and accuracy.
- Proficient in managing and querying large datasets using Apache Spark ALS and AWS Redshift for robust data analytics.
- Skilled in setting up data validation frameworks that facilitate collaboration between data stewards and IT teams, using MDM and SQL Server.
- Strong analytical skills, capable of managing complex data projects from planning through deployment using Agile methodologies.
- Adept at developing and maintaining professional relationships with stakeholders, leveraging data analytics for strategic decisions.
- Committed to leveraging data to drive improvements, utilizing technologies such as Amazon Rekognition and SharePoint for data enhancement and collaboration.
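Illustrative sketch of the data validation checks referenced above, written in Python with pandas; the table, columns, and rules are invented for demonstration, not drawn from any client system:

    import pandas as pd

    # Hypothetical account extract; column names and values are illustrative only.
    df = pd.DataFrame({
        "account_id": [101, 102, 103, None],
        "balance": [2500.0, -75.5, 0.0, 130.25],
        "opened": ["2021-01-05", "2021-03-14", "not-a-date", "2022-07-30"],
    })

    # Rule 1: the key column must be present and unique.
    missing_keys = int(df["account_id"].isna().sum())
    duplicate_keys = int(df["account_id"].duplicated().sum())

    # Rule 2: dates must parse; coercion turns bad values into NaT for counting.
    bad_dates = int(pd.to_datetime(df["opened"], errors="coerce").isna().sum())

    print(f"missing keys: {missing_keys}, duplicates: {duplicate_keys}, bad dates: {bad_dates}")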
TECHNICAL SKILLS:
Data Analysis: SQL, Excel, Tableau, R
Data Integration: ETL, SSIS, Apache NiFi, Oracle Data Integrator, Informatica, Apache Kafka
Data Quality: Informatica Data Quality, Trifacta, OpenRefine, Ataccama
Data Visualization: Power BI, Tableau, SharePoint, Confluence
Machine Learning: Amazon SageMaker, Amazon Rekognition
Data Management: Collibra, Informatica Enterprise Data Catalog, MDM, SQL Server
Cloud Services: AWS S3, AWS Cloud, AWS Redshift
Programming/Scripting: R, Python
Data Profiling: Trifacta, Informatica Data Quality, Ataccama
Data Governance: Collibra, Informatica Enterprise Data Catalog, Alation, Oracle EDM
Statistical Analysis: R
Project Management: Agile methodologies, Confluence, SharePoint
Process Automation: Alteryx
Big Data Technologies: Apache Spark, Apache Kafka
Data Storage/Transfer: FTP/SFTP, RESTful APIs
Data Security/Compliance: Informatica Data Quality, Ataccama, SQL Server, Linux
Documentation: Microsoft Excel, Confluence

PROFESSIONAL EXPERIENCE:

Client: Total Bank Solutions, Hackensack, NJ    Dec 2022 to Present
Role: Data Analyst
Roles & Responsibilities:
- Conducted detailed financial data analyses to identify trends, leveraging SQL, Excel, and Tableau for enhanced data visualization.
- Developed and maintained advanced analytics dashboards in Tableau, integrating SharePoint and Confluence for improved project collaboration.
- Deployed machine learning models using Amazon SageMaker to accurately predict financial behaviors and outcomes.
- Managed extensive data integration projects utilizing AWS S3 and AWS Cloud services to secure and scale financial data storage.
- Implemented data process automation with Alteryx, improving efficiency and accuracy in financial data handling.
- Utilized Apache Spark ALS for robust predictive analytics, enhancing the bank's investment decision-making processes.
- Integrated Amazon Rekognition into advanced visual analytics projects, enhancing the bank's capability to interpret complex visual data sets.
- Maintained rigorous documentation standards to support data integrity, compliance, and process transparency across financial projects.
- Employed R for complex statistical analysis, supporting the bank's advanced analytics and data modeling efforts.
- Imported data from sources such as SQL Server, Excel, and SharePoint into Power BI, transformed and cleaned it with Power Query, and used DAX for complex financial calculations and reporting.
- Collaborated closely with IT and data teams to implement and manage robust, scalable data solutions tailored to financial services.
- Enhanced reporting mechanisms by integrating SharePoint and Confluence, facilitating better access to and collaboration on financial data.
- Optimized data storage and querying capabilities using AWS Redshift, ensuring efficient data retrieval and management for financial services.
- Analyzed client financial portfolios using advanced analytics and machine learning, delivered through Amazon SageMaker and Apache Spark.
- Automated routine financial reports and data cleansing processes using Alteryx, significantly reducing manual data handling errors.
- Orchestrated secure data transfers and integrations across platforms using FTP/SFTP and RESTful APIs, enhancing system interoperability.
- Developed customized financial models and forecasts using R, aiding the bank's strategic planning and risk assessment.
- Leveraged AWS Cloud technologies to streamline data workflows and storage solutions, enhancing data availability and disaster recovery processes.
- Conducted in-depth data validation and quality assurance using SQL and Excel, ensuring the accuracy of financial reports and dashboards.
- Facilitated the implementation of a comprehensive data governance framework using Collibra, improving data lineage and metadata management.
- Applied Amazon Rekognition to classify and analyze financial documents, improving the speed and accuracy of data extraction and categorization.
- Streamlined data analytics processes and reporting using SharePoint and Confluence, ensuring timely access to critical business insights.
- Utilized AWS S3 for efficient data archiving and retrieval, optimizing storage management and supporting compliance with financial regulations.
Environment: Power BI, SQL, Excel, Tableau, Amazon SageMaker, AWS S3, AWS Cloud, Alteryx, Apache Spark ALS, Amazon Rekognition, R, SharePoint, Confluence, AWS Redshift, FTP/SFTP, RESTful APIs, Collibra.
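Illustrative sketch of the Spark ALS approach used for predictive analytics in this role; the client/product interaction data below is invented, and a production job would read from S3 or Redshift rather than an inline list:

    from pyspark.sql import SparkSession
    from pyspark.ml.recommendation import ALS

    spark = SparkSession.builder.appName("als-sketch").getOrCreate()

    # Invented (client, product, score) interactions for demonstration.
    ratings = spark.createDataFrame(
        [(1, 10, 4.0), (1, 11, 2.0), (2, 10, 5.0), (2, 12, 1.0)],
        ["client_id", "product_id", "score"],
    )

    # Alternating least squares factorization; hyperparameters are placeholders.
    als = ALS(userCol="client_id", itemCol="product_id", ratingCol="score",
              rank=8, maxIter=5, coldStartStrategy="drop")
    model = als.fit(ratings)

    # Top-2 product suggestions per client.
    model.recommendForAllUsers(2).show(truncate=False)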
Client: HCSC, Chicago, IL    Jun 2021 to Jul 2022
Role: Data Quality Analyst
Roles & Responsibilities:
- Enhanced the accuracy of healthcare data reporting by employing Informatica Data Quality, leading to improved patient care outcomes.
- Implemented data profiling and cleansing techniques using Trifacta and OpenRefine, ensuring high standards of data quality and usability.
- Managed data validation frameworks effectively, collaborating with data stewards using SQL Server and Collibra to ensure data integrity.
- Developed and maintained metadata management solutions with Collibra and Informatica Enterprise Data Catalog, enhancing data governance.
- Identified and addressed root causes of data discrepancies, working closely with IT teams on effective resolutions.
- Managed and analyzed healthcare data using SQL Server, supporting extensive data analysis activities for healthcare improvements.
- Conducted statistical analyses using R and Python to decipher complex healthcare data patterns, aiding the development of targeted health programs.
- Created custom reports and dashboards using Python, enhancing the financial reporting process.
- Utilized Agile methodologies to manage projects efficiently, ensuring the timely delivery of healthcare data solutions.
- Conducted data profiling and analyzed large datasets to identify and resolve data quality issues, enhancing data integrity by 30% and improving accuracy by 15%.
- Enhanced data quality control using Ataccama, standardizing data cleaning processes across multiple data sources.
- Streamlined data workflows and integration using Apache NiFi, improving data movement and processing within healthcare systems.
- Developed a robust data lineage system with Informatica Enterprise Data Catalog to trace data origins for compliance and audit processes.
- Conducted data validation using MDM and SQL Server, ensuring the accuracy and reliability of clinical data reports.
- Integrated healthcare data with R for advanced statistical analysis, providing deeper insights into patient care and hospital management.
- Facilitated effective data governance meetings and workshops using Agile practices, involving data stewards and IT teams for better decision-making.
- Implemented root cause analysis of data quality issues using SQL and Informatica Data Quality, significantly reducing data errors.
- Coordinated with healthcare professionals to align data analytics initiatives with operational goals, using Collibra for data management.
- Oversaw the documentation of all data processes and quality measures, ensuring adherence to healthcare regulations and standards.
- Promoted continuous improvement of data quality tools and practices, integrating new technologies to keep pace with healthcare industry demands.
Environment: Informatica Data Quality, Trifacta, OpenRefine, SQL Server, Collibra, Informatica Enterprise Data Catalog, Ataccama, Apache NiFi, R, Python, MDM.
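Illustrative sketch of the profiling and rule checks described in this role, using Python and pandas; the claim records are invented for demonstration (production profiling ran against SQL Server with Informatica Data Quality):

    import pandas as pd

    # Invented sample of claim records for demonstration.
    claims = pd.DataFrame({
        "claim_id": ["C1", "C2", "C2", "C4"],
        "member_id": ["M10", None, "M12", "M13"],
        "amount": [120.0, 89.5, 89.5, -5.0],
    })

    # Per-column profile: null counts, distinct counts, and inferred types.
    profile = pd.DataFrame({
        "nulls": claims.isna().sum(),
        "distinct": claims.nunique(),
        "dtype": claims.dtypes.astype(str),
    })
    print(profile)

    # Flag rows that break simple business rules for steward review.
    violations = claims[claims["claim_id"].duplicated(keep=False) | (claims["amount"] < 0)]
    print(violations)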
Client: Kimberly Clark, Irving, TX    Jan 2020 to May 2021
Role: Data Analyst
Roles & Responsibilities:
- Managed and optimized data integrations across various systems using ETL tools, SSIS, and Apache NiFi, enhancing system efficiency.
- Developed and implemented data cleansing and transformation processes using Oracle Data Integrator and Informatica, ensuring high-quality data outputs.
- Utilized Apache Kafka for efficient data streaming and processing, facilitating real-time data analytics and decision-making.
- Performed data analysis and reporting tasks using Microsoft Excel, providing critical insights into business operations and customer trends.
- Managed XML and JSON formats for seamless data interchange and integration, supporting diverse application needs.
- Ensured secure data management and transfer through FTP/SFTP, maintaining data integrity and confidentiality.
- Created detailed documentation for all data processes and integrations, facilitating clear understanding and compliance.
- Conducted comprehensive data analysis using SQL and Excel, extracting valuable insights to inform business strategy and operations.
- Implemented data transformations and integrations using Informatica, improving data flow and accessibility across systems.
- Coordinated with IT teams to resolve data discrepancies and streamline data handling processes using Apache NiFi.
- Enhanced data accessibility and integration using RESTful APIs, connecting disparate systems and enhancing data utility.
- Supported continuous data operations and maintenance using Git for version control, ensuring operational continuity and data consistency.
- Developed and implemented validation processes to verify the accuracy and completeness of data entries.
- Optimized data workflows and processing using Oracle Data Integrator, reducing processing times and improving data quality.
- Facilitated cross-departmental data sharing and analysis, enhancing collaborative efforts and driving business growth.
- Maintained rigorous compliance with data security standards using Linux-based systems, protecting sensitive data and ensuring regulatory compliance.
Environment: ETL tools, SSIS, Power BI, Apache NiFi, Oracle Data Integrator, Informatica, Apache Kafka, Microsoft Excel, XML, JSON, FTP/SFTP, SQL, RESTful APIs, Git, Linux.
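Illustrative sketch of the REST-based data exchange described in this role; the endpoint URL and field names are placeholders, not an actual Kimberly Clark service:

    import json
    import requests

    # Placeholder endpoint; real integrations used internal service URLs.
    URL = "https://api.example.com/v1/orders"

    resp = requests.get(URL, params={"since": "2021-01-01"}, timeout=30)
    resp.raise_for_status()
    orders = resp.json()

    # Flatten the JSON payload into records for a downstream load.
    rows = [{"order_id": o.get("id"), "sku": o.get("sku"), "qty": o.get("quantity")}
            for o in orders]
    with open("orders_extract.json", "w") as f:
        json.dump(rows, f, indent=2)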
Client: Impetus Technologies, Hyderabad, India    Jun 2018 to Dec 2019
Role: SQL Developer
Roles & Responsibilities:
- Developed complex SQL queries and scripts, optimizing data manipulation and reporting capabilities.
- Managed PostgreSQL and MySQL databases, enhancing database performance and scalability.
- Designed and maintained Talend jobs for data integration and automation, improving workflow efficiency.
- Integrated Python scripts to enhance ETL processes, automating tasks and improving data quality.
- Created dynamic data visualizations and business dashboards using Power BI, providing key insights to stakeholders.
- Utilized Git for version control, ensuring code integrity and supporting team collaboration.
- Automated data workflows using Talend, reducing manual effort and streamlining data processes.
- Designed and implemented data backup and recovery strategies, ensuring data availability and integrity.
- Optimized database performance through careful tuning and index optimization.
- Developed custom data models and schemas to support business analytics and reporting needs.
- Collaborated with business analysts to translate business requirements into technical specifications.
- Managed data migrations and upgrades, ensuring minimal downtime and data consistency.
- Trained junior developers in SQL, Python, and Talend, enhancing team skills and project capabilities.
- Implemented data quality checks within ETL processes, ensuring the accuracy and reliability of reports.
- Conducted performance benchmarking and optimization to improve system responsiveness and efficiency.
Environment: SQL, PostgreSQL, MySQL, Talend, Python, Power BI, Git.

Education:
Master of Business Administration in Data Analytics, University of New Haven, West Haven, CT.
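Finally, an illustrative sketch of the SQL development and index tuning described in the Impetus Technologies role above; sqlite3 stands in for the project's PostgreSQL/MySQL databases, and the table and data are invented:

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
    con.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                    [("acme", 120.0), ("acme", 80.0), ("globex", 45.5)])

    # Aggregate reporting query of the kind the role describes.
    for row in con.execute(
        "SELECT customer, COUNT(*) AS n, SUM(total) AS revenue "
        "FROM orders GROUP BY customer ORDER BY revenue DESC"
    ):
        print(row)

    # Index tuning: an index on the filter column lets lookups avoid a full scan.
    con.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
    print(con.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'acme'").fetchall())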
