Candidate Information
Title: Data Analysis Modeling
Target Location: US-IL-Lombard
Candidate's Name
Email: EMAIL AVAILABLE
Mobile: PHONE NUMBER AVAILABLE
United States

Summary of Experience:
- IT professional with over 10 years of experience in technology and business analysis, requirements gathering, specification preparation, design, development, implementation, testing, and production support for many clients across multiple industries.
- Excellent working experience in Agile methodology and Software Development Life Cycle (SDLC) processes.
- Experience in dimensional data modeling, star/snowflake schemas, and FACT and dimension tables.
- Experienced data modeler with strong conceptual, logical, and physical data modeling skills and data profiling skills; experienced in maintaining data quality, creating data mapping documents, and writing functional specifications and queries.
- Experience in conducting and facilitating agile ceremonies (daily scrum, sprint planning, sprint review, and retrospective) and strong understanding of scrum artifacts: product backlog, sprint backlog and increment, and sprint burndown chart.
- Traced requirements throughout the SDLC process using the Requirements Traceability Matrix (RTM), documenting business processes and translating them to functional specifications.
- Hands-on experience in creating tables, views, and stored procedures in Snowflake (see the sketch after this summary).
- Excellent experience working with analytical and data visualization tools and techniques (Tableau and Power BI).
- Performed data analysis and data profiling using complex SQL on various source systems, including Oracle, MS SQL Server, MySQL, Snowflake, and Teradata.
- Excellent understanding and knowledge of NoSQL databases like MongoDB, HBase, and Cassandra.
- Strong experience in business and data analysis, data profiling, data migration, data integration, and metadata management services.
- Established best practices for the enterprise Tableau environment and the application intake and development processes.
- Experience in gathering and writing detailed business requirements and translating them into technical specifications and designs.
- Extensive knowledge of data profiling using the Informatica Developer tool.
- End-to-end experience in designing and deploying data visualizations using Tableau analytics.
- Configured an on-premises Power BI gateway for different source-matching analysis criteria and further modeling for Power Pivot and Power View.
- Strong experience in data analysis, data migration, data cleansing, transformation, integration, import, and export using ETL tools such as Informatica PowerCenter and open-source frameworks such as Spark and Hadoop.
- Well-versed in forward engineering and reverse engineering of existing databases to physical models using the Erwin tool.
- Experience in developing process flow diagrams, use case diagrams, and activity diagrams, and in database schema generation, using UML tools like Visio and Oracle SQL Developer.
- Performed the role of data analyst on implementation teams, transforming business requirements into Software Requirement Specifications (SRS) using the MS Office suite, JIRA, MS Project, and other tools.
- Experience in analytics, data visualization, and modeling to find solutions for a variety of business and technical problems.
- Experience in creating business requirement documents, functional and non-functional specifications, data flow diagrams, business process diagrams, and user stories.
- Experienced in cloud services such as AWS EC2, EMR, RDS, and S3 to assist with big data tools, solve data storage issues, and work on deployment solutions.
- Good understanding of relational database design and data warehouse/OLAP concepts and methodologies.
- Extensive ETL testing experience using Informatica PowerCenter/PowerMart (Designer, Workflow Manager, Workflow Monitor, and Server Manager).
- Experience in conducting Joint Application Development (JAD) sessions with business users, subject matter experts (SMEs), and development and QA teams.
- Experience in installing and configuring Git/SVN.
- Good working experience with the Azure cloud and its services (Azure Blob Storage, MS SQL Server, Azure HDInsight, and Azure Databricks).
- Skilled in statistical analysis, including linear, multiple, and logistic regression, hypothesis testing, predictive analytics, what-if analysis, parametric sensitivity, classification trees, and decision trees.
- Strong problem-solving skills with the ability to think creatively and strategically about alternative solutions and opportunities.
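To make the Snowflake bullet above concrete, here is a minimal sketch using the snowflake-connector-python package. The connection parameters and all object names (CLAIMS_STG, RECENT_CLAIMS_V, PURGE_OLD_CLAIMS) are hypothetical placeholders, not details taken from this resume:

    # Minimal sketch: create a table, a view, and a stored procedure in Snowflake.
    # Connection parameters and object names are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",      # placeholder
        user="my_user",            # placeholder
        password="...",            # supply via a secrets manager in practice
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    cur = conn.cursor()

    # Table to stage raw claim records
    cur.execute("""
        CREATE TABLE IF NOT EXISTS CLAIMS_STG (
            CLAIM_ID   NUMBER,
            MEMBER_ID  NUMBER,
            AMOUNT     NUMBER(12, 2),
            LOADED_AT  TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
        )
    """)

    # View exposing only recent rows to analysts
    cur.execute("""
        CREATE OR REPLACE VIEW RECENT_CLAIMS_V AS
        SELECT CLAIM_ID, MEMBER_ID, AMOUNT
        FROM CLAIMS_STG
        WHERE LOADED_AT >= DATEADD(day, -30, CURRENT_TIMESTAMP())
    """)

    # Simple SQL stored procedure that purges old staged rows
    cur.execute("""
        CREATE OR REPLACE PROCEDURE PURGE_OLD_CLAIMS()
        RETURNS NUMBER
        LANGUAGE SQL
        AS
        $$
        BEGIN
            DELETE FROM CLAIMS_STG
            WHERE LOADED_AT < DATEADD(day, -365, CURRENT_TIMESTAMP());
            RETURN SQLROWCOUNT;
        END;
        $$
    """)
    cur.close()
    conn.close()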
Technical Skills:
Data Warehousing: Informatica (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), SSIS, DataStage
BI Tools: Power BI Desktop, Tableau Desktop, Tableau Server, Crystal Reports, MS Excel, SSRS, Excel Power View
Reporting Tools: Business Objects XI R3, Cognos 8 Suite
Data Modeling: star-schema modeling, snowflake-schema modeling, FACT and dimension tables, pivot tables, Erwin
Testing Tools: WinRunner, LoadRunner, TestDirector, Mercury Quality Center, Rational ClearQuest
RDBMS: Oracle, MS SQL Server, UDB DB2, Teradata, MS Access
Cloud Technologies: AWS, Azure
Programming: SQL, PL/SQL, UNIX shell scripting, VBScript, Python
Environment: Windows (95, 98, 2000, NT, XP, 7, 8, 10), UNIX
Other Tools: TOAD, MS Office suite (Word, Excel, Project, and Outlook), BTEQ, Teradata SQL Assistant

Education: BBA (Bachelor of Business Administration), Osmania University, 2013

Project Experience:

Client: LAHD, Los Angeles, CA - May 2022 to Present
Senior Data Analyst
- Involved in reviewing business requirements and analyzing data sources from Excel/IBM DB2 Server for the design, development, testing, and production rollover of reporting and analysis projects within Tableau Desktop.
- Extensively used Apache Kafka, Apache Spark, HDFS, and Apache Impala to build near-real-time data pipelines that ingest, transform, store, and analyze clickstream data to provide a better personalized user experience.
- Extracted and loaded CSV and JSON file data from AWS S3 into the Snowflake cloud data warehouse (a loading sketch follows this section).
- Collaborated with diverse lines of business to understand each application's cross-functional impact and devise a mitigation plan.
- Executed SQL queries to extract data from the database and check it for consistency.
- Participated in daily stand-up meetings to help the Scrum team prioritize work and resolve technical issues.
- Participated in creating tables, views, and stored procedures in Snowflake.
- Utilized the Agile tool JIRA for backlog management and created reports for sprint meetings using burndown charts and task boards.
- Constructed AWS data pipelines using VPC, EC2, S3, Auto Scaling Groups (ASG), EBS, Snowflake, IAM, CloudFormation, Route 53, CloudWatch, CloudFront, and CloudTrail.
- Performed unit and integration testing scenarios used to prepare for UAT.
- Built reports in Tableau for the management team to track the performance of representatives and their current business situation, used as a single source of truth.
- Developed data visualizations using crosstabs, heat maps, box-and-whisker charts, pie charts, and bar charts.
- Worked on the backend database development that served as the source data for Tableau.
- Created custom hierarchies in Tableau to meet business requirements.
- Developed donut charts and implemented complex chart features, such as bar charts in tooltips.
- Responsible for creating calculated fields, geocoding, and hierarchies using Tableau Desktop.
- Worked with the program manager, business analysts, and developers to transform data into actionable intelligence through graphics and analytics.
- Set up data sharing between two Snowflake accounts.
- Developed a data warehouse model in Snowflake.
- Extracted data from various database sources, such as Oracle and SQL Server, using Informatica to load the data into a single data warehouse repository.
- Administered Joint Application Development (JAD) sessions and interviewed end users to elicit business requirements; categorized and documented these requirements as user stories and epics.
- Prepared SQL and PL/SQL queries to validate the data in both source and target databases.
- Collaborated with the Scrum Master in conducting Scrum ceremonies, including sprint planning, sprint review, and sprint retrospective sessions.
- Tested dashboards to ensure the data matched the business requirements and to detect changes in the underlying data.
- Performed schema changes in the database on request, met with clients and various subgroups to gather requirements for self-extracted reports, and created critical field reporting tools for Enterprise Data Management customer segments.
- Supported Tableau reports in production by performing daily health checks on multiple dashboards and extracts to ensure the dashboards run properly and the extracts are refreshed.
- As a Tableau administrator, created report schedules on Tableau Server, created groups, onboarded new users, and assigned and updated user roles; installed Tableau in multi-node and enterprise deployments.
- Implemented a business use case in Hadoop/Hive and visualized it in Tableau.
Environment: Snowflake, Tableau Desktop, Tableau Server, Tableau Client, Tableau Services Manager, IBM DB2, ServiceNow, Jira, DbVisualizer, MS Excel, Splunk, AWS, Hadoop, Kafka, Spark, HDFS, Impala, Oracle 12c
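A hedged sketch of the S3-to-Snowflake load described in this section. The bucket path, stage, file formats, and table names are hypothetical, and in practice a storage integration would normally replace inline AWS keys:

    # Sketch: load CSV and JSON files from S3 into Snowflake with COPY INTO.
    # Bucket, stage, and table names are hypothetical.
    import snowflake.connector

    conn = snowflake.connector.connect(account="my_account", user="my_user",
                                       password="...", warehouse="LOAD_WH",
                                       database="ANALYTICS", schema="RAW")
    cur = conn.cursor()

    # External stage over the S3 landing bucket (a storage integration is
    # preferable to inline keys; shown this way only to keep the sketch short)
    cur.execute("""
        CREATE STAGE IF NOT EXISTS S3_LANDING
        URL = 's3://my-landing-bucket/exports/'
        CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
    """)

    # CSV files -> structured staging table
    cur.execute("""
        COPY INTO RAW.EVENTS_CSV
        FROM @S3_LANDING/csv/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"')
        ON_ERROR = 'ABORT_STATEMENT'
    """)

    # JSON files -> a single VARIANT column, flattened later with SQL
    cur.execute("""
        COPY INTO RAW.EVENTS_JSON (DOC)
        FROM @S3_LANDING/json/
        FILE_FORMAT = (TYPE = JSON)
    """)
    cur.close()
    conn.close()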
Client: Integrityhpg, Washington, DC - Feb 2019 to May 2022
Data Analyst
- Identified and documented limitations in data quality that jeopardize the work of internal and external data analysts.
- Utilized Power BI to create analytical dashboards that depict critical trends, with slicers and dicers that let end users apply filters.
- Worked on data migration from east-region to west-region Snowflake accounts.
- Created complex SQL queries and scripts to extract and aggregate data and validate its accuracy.
- Used Python and SAS to extract, transform, and load source data from transaction systems, and generated reports, insights, and key conclusions.
- Regularly met with the client's subject matter experts to acquire functional business requirements and built the SQL queries used in dashboards to satisfy the business's needs.
- Involved in requirements gathering and in the database design and implementation of star-schema and snowflake-schema dimensional data warehouses using Erwin.
- Built dashboards that provide insights into Snowflake usage statistics.
- Performed and utilized the necessary PL/SQL queries to analyze and validate the data.
- Wrote Databricks code and fully parameterized ADF pipelines for efficient code management.
- Gathered business requirements and translated them into clear and concise specifications and queries.
- Responsible for creating Power BI reports and deploying them to the Power BI service.
- Created dashboards and shared them with user groups using Power BI apps.
- Performed data analysis using SQL.
- Created Power BI reports on data in ADLS, the tabular model, and SQL Server.
- Developed, tested, and deployed dashboards for ServiceNow tickets by connecting to the service DB (SQL Server).
- Responsible for building reports in Power BI from scratch.
- Used a forward-engineering approach to design and create databases for the OLAP model.
- Used Teradata utilities such as FastExport and MultiLoad (MLOAD) for various tasks.
- Used the Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.
- Configured an on-premises Power BI gateway for different source-matching analysis criteria and further modeling for Power Pivot and Power View.
- Configured a real-time streaming pipeline from DB2 to HDFS using Apache Kafka (a streaming sketch follows this section).
- Created Azure Data Factory pipelines, managed policies for Data Factory, and utilized Blob storage for storage and backup on Azure.
- Wrote Hadoop jobs for analyzing data using HiveQL queries, Pig Latin (a data flow language), and custom MapReduce programs in Java.
- Created live and extract dashboards on top of the Snowflake DW.
Environment: Snowflake, Power BI, Kafka, Hadoop, YARN, Azure, Databricks, Data Factory, MS Excel, PowerPoint, SQL Server, Agile, JIRA, ETL, VizQL
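One way to picture the DB2-to-HDFS pipeline mentioned above: a change-data-capture process publishes DB2 changes to a Kafka topic, and a Spark Structured Streaming job lands them on HDFS. A minimal sketch; the topic name, record schema, broker addresses, and paths are all hypothetical:

    # Sketch: consume CDC-style JSON records from Kafka and land them on HDFS
    # as Parquet. Topic name, schema, brokers, and paths are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StructType, StructField, StringType, LongType

    spark = SparkSession.builder.appName("db2-cdc-to-hdfs").getOrCreate()

    # Assumed shape of the change records published by the CDC process
    schema = StructType([
        StructField("table",   StringType()),
        StructField("op",      StringType()),   # insert/update/delete
        StructField("ts",      LongType()),
        StructField("payload", StringType()),
    ])

    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
           .option("subscribe", "db2.changes")
           .option("startingOffsets", "latest")
           .load())

    # Kafka values arrive as bytes; cast to string and parse the JSON
    parsed = (raw.selectExpr("CAST(value AS STRING) AS json")
                 .select(from_json(col("json"), schema).alias("r"))
                 .select("r.*"))

    query = (parsed.writeStream
             .format("parquet")
             .option("path", "hdfs:///data/db2_changes")
             .option("checkpointLocation", "hdfs:///checkpoints/db2_changes")
             .trigger(processingTime="1 minute")
             .start())
    query.awaitTermination()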
Client: HUMANA, Louisville, KY - Nov 2017 to Feb 2019
Data Analyst
- Converted Excel reports to Tableau dashboards with rich visualization and good flexibility; created action filters, parameters, and calculated sets for preparing dashboards and worksheets, such as yearly and monthly reports, in Tableau; created geographic map reports in Tableau that explained the company's financial keys according to the members.
- Used AWS Glue to crawl the data lake in S3 and populate the Data Catalog.
- Tested the ETL process both before and after the data validation process.
- Tested the messages published by the ETL tool and the data loaded into various databases.
- Responsible for creating test cases to make sure the data originating from the source reaches the target properly and in the right format.
- Tested several stored procedures and wrote complex SQL using CASE, HAVING, CONNECT BY, etc.
- Used MySQL and Tableau as data analysis tools to analyze and understand trading and investment strategy data.
- Integrated R into the Tableau dashboard to forecast cash flows.
- Involved in Teradata SQL development, unit testing, and performance tuning, ensuring testing issues were resolved based on defect reports.
- Analyzed business requirements, system requirements, and data mapping requirement specifications, and was responsible for documenting functional and supplementary requirements in Quality Center.
- Identified business-critical measures by working closely with the SMEs.
- Conducted JAD sessions and user interviews to identify and comprehend business requirements; documented requirements in the form of user stories and epics.
- Involved in writing parsers using Python.
- Designed, developed, and enhanced cloud-based applications using AWS.
- Developed mappings using Source Qualifier, Expression, Filter, Lookup, Update Strategy, Sorter, Joiner, Normalizer, and Router transformations.
- Integrated DataStage metadata into Informatica metadata and created ETL mappings and workflows.
- Involved in writing, testing, and implementing triggers, stored procedures, and functions at the database level using PL/SQL.
- Migrated repository objects, services, and scripts from the development to the production environment.
- Created UNIX scripts and environment files to run batch jobs.
- Worked with DBAs on performance tuning and on obtaining privileges on tables in different environments.
- Tested complex ETL mappings and sessions, based on business user requirements and business rules, that load data from source flat files and RDBMS tables into target tables.
- Evaluated and enhanced current data models to reflect business requirements.
- Used PowerExchange to create and maintain data maps for each file and to connect PowerCenter to the mainframe.
- Developed scripts using both DataFrames/SQL and RDDs in PySpark (Spark with Python) 1.x/2.x for data aggregation (a sketch follows this section).
- Unit tested the code to check that the target loaded properly.
Environment: Python, AWS, PL/SQL, Metadata, Cloudera, Java, PySpark, Scala, UNIX, Tableau
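The PySpark aggregation bullet above can be made concrete with a small sketch showing the same total-per-key aggregation done both ways, via the DataFrame API and via the RDD API. The file path and column names are hypothetical:

    # Sketch: the same aggregation via the DataFrame API and via the RDD API.
    # File path and column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("claims-aggregation").getOrCreate()

    df = spark.read.csv("hdfs:///data/claims.csv", header=True, inferSchema=True)

    # DataFrame/SQL style: total claim amount per member
    totals_df = (df.groupBy("member_id")
                   .agg(F.sum("amount").alias("total_amount")))
    totals_df.show()

    # Equivalent RDD style: map to (key, value) pairs, then reduceByKey
    totals_rdd = (df.rdd
                    .map(lambda row: (row["member_id"], row["amount"]))
                    .reduceByKey(lambda a, b: a + b))
    print(totals_rdd.take(5))

    spark.stop()

The DataFrame version benefits from Catalyst query optimization, which is why the RDD form is generally reserved for logic the DataFrame API cannot express.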
Client: Verisk, Jersey City, NJ - Sep 2015 to Oct 2017
Data Analyst
- Implemented exception-handling mappings using Data Quality and implemented data validation using Informatica Analyst.
- Implemented Change Data Capture using Informatica PowerExchange.
- Wrote ETL transformation rules to assist the SQL developer.
- Implemented and populated slowly changing dimensions to maintain current and historical information in dimension tables (a sketch follows this section).
- Wrote unit tests for the developed scripts to pass quality checks before deployment.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
- Developed and tested stored procedures, functions, and packages in PL/SQL for data ETL.
- Created complex Cognos reports using calculated data items and multiple lists in a single report.
- Prepared functional and technical documentation of the reports created, for future reference.
- Worked with data modelers to prepare logical and physical data models, adding and deleting necessary fields using Erwin.
- Obtained data about customers from different systems and aggregated it within the data warehouse using Informatica.
- Developed and scheduled a variety of reports, such as cross-tab, parameterized, drill-through, and sub-reports, with SSRS.
- Developed ETL jobs, per the requirements, to update the staging database (Postgres) from various data sources and REST APIs.
- Worked on data profiling and on developing various data quality rules using Informatica Data Quality.
Environment: ETL, Cognos, Informatica, PL/SQL, SQL, MS Office, MS Excel, Windows
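The slowly-changing-dimension bullet above describes what is usually a Type 2 pattern: expire the current dimension row, then insert a new version. The resume names Informatica for this work; the sketch below shows the equivalent logic in plain SQL run from Python, with hypothetical table and column names (CUSTOMER_DIM, CUSTOMER_STG, and so on):

    # Sketch of Type 2 slowly changing dimension maintenance in plain SQL,
    # equivalent in spirit to an Informatica SCD mapping. Table and column
    # names are hypothetical.
    import sqlite3  # stand-in for any DB-API connection (Oracle, Postgres, ...)

    conn = sqlite3.connect("warehouse.db")
    cur = conn.cursor()

    # 1. Expire current dimension rows whose attributes changed in staging
    cur.execute("""
        UPDATE CUSTOMER_DIM
        SET END_DATE = DATE('now'), IS_CURRENT = 0
        WHERE IS_CURRENT = 1
          AND CUSTOMER_ID IN (
              SELECT s.CUSTOMER_ID
              FROM CUSTOMER_STG s
              JOIN CUSTOMER_DIM d
                ON d.CUSTOMER_ID = s.CUSTOMER_ID AND d.IS_CURRENT = 1
              WHERE s.ADDRESS <> d.ADDRESS OR s.SEGMENT <> d.SEGMENT
          )
    """)

    # 2. Insert a new current version for new and changed customers
    #    (changed customers no longer have a current row after step 1)
    cur.execute("""
        INSERT INTO CUSTOMER_DIM
            (CUSTOMER_ID, ADDRESS, SEGMENT, START_DATE, END_DATE, IS_CURRENT)
        SELECT s.CUSTOMER_ID, s.ADDRESS, s.SEGMENT, DATE('now'), NULL, 1
        FROM CUSTOMER_STG s
        LEFT JOIN CUSTOMER_DIM d
          ON d.CUSTOMER_ID = s.CUSTOMER_ID AND d.IS_CURRENT = 1
        WHERE d.CUSTOMER_ID IS NULL
    """)
    conn.commit()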
Client: Biocon, Hyderabad, India - July 2013 to May 2015
Business Analyst
- Actively monitored and contributed to all stages of the Software Development Life Cycle (SDLC) based on the Waterfall methodology.
- Developed comprehensive visual representations of technical processes through UML diagrams, including use case and activity diagrams, using MS Visio.
- Determined project deadlines and tasks with MS Project to guarantee prompt deployment of the project.
- Maintained a clear understanding of project goals among stakeholders by conducting walkthroughs and meetings involving leads from the BA, development, and technical support teams.
- Participated in the project kick-off meeting, documenting and analyzing the business requirements using IBM Rational RequisitePro.
- Worked closely with QA during User Acceptance Testing (UAT) and functional testing.
- Applied strong knowledge of SQL to extract data for validating test results.
Environment: Waterfall, IBM Rational RequisitePro, MS Visio, MS Project, SQL Server
