Agile Scrum Master Resume Dallas, TX

Scrum Master / Agile Coach
Candidate's Name
Phone: PHONE NUMBER AVAILABLE
Email: EMAIL AVAILABLE
Address: Street Address
Objective
A seasoned Scrum Master / Kanban Flow Master / Agile Coach with a PSM-1 certification, bringing over 8 years of rich experience across diverse sectors such as Health Care and Banking. Well-versed in guiding teams through the complexities of transitioning to Agile methodologies from traditional project management frameworks. My career is marked by a value-driven approach, underscored by a deep commitment to leveraging Agile practices to overcome organizational challenges and enhance project delivery.
- Actively participated in the adoption of Agile and Lean software development methodologies (Scrum and Kanban practices) within project teams.
- Visualized all work items on the Kanban board, making it clear to the team which tasks are in progress, which are completed, and which are yet to start; facilitated daily scrums.
- Set and enforced WIP limits to prevent overloading the team and to ensure a smooth flow of work through the system.
- Monitored and managed the flow of work to identify bottlenecks and areas for improvement; drove continuous improvement using metrics and feedback, analyzing lead times, cycle times, and other performance indicators to identify areas of enhancement (see the SQL sketch after this list).
- Implemented policies for how work is prioritized, started, and completed.
- Coached and trained new team members on Kanban principles and practices to ensure everyone is aligned and understands the rules and Kanban practices.
- Assisted in upholding Kanban/Scrum rules, addressing impediments, and promoting team self-management, while contributing to project success in dynamic environments.
- Supported Agile Release Trains (ARTs) by aiding in the implementation of Scrum practices, focusing on team-level contributions and interactions.
- Facilitated teams in story sizing, task definition, and the application of Test-Driven Development (TDD) techniques, emphasizing the delivery of high-value requirements.
- Gained proficiency in transitioning to the Scaled Agile Framework (SAFe) from traditional software development methodologies.
- Worked closely with Scrum Masters, Product Owners, and team members to reinforce core Agile principles and practices.
- Trained teams in the Scrum framework, guiding them towards self-organization and effective use of Agile concepts such as Backlog Refinement and Sprint Planning.
- Developed skills in key Agile practices, including iteration planning, conducting retrospectives, and facilitating effective communication within teams.
- Monitored and communicated relevant metrics such as Velocity and Burn-up/Burn-down charts to track team progress.
- Aided in gathering and defining SMART (Specific, Measurable, Attainable, Realistic, and Timely) requirements, transforming them into functional specifications.
- Collaborated seamlessly across functions, refining communication and cultivating relationship-building skills within the Scrum team.
- Demonstrated strong oral and written communication skills, alongside analytical and presentation capabilities.
- Operated as a motivated self-starter, capable of working independently and managing multiple tasks efficiently under tight deadlines.
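For illustration, a minimal SQL sketch of the kind of cycle-time and throughput analysis referenced in the list above. The WORK_ITEMS table and its columns are assumptions standing in for an export from the team's tracking tool, not an actual project schema.

    -- Hypothetical example: weekly throughput, lead time, and cycle time from a
    -- WORK_ITEMS export (one row per ticket; CREATED_DT = entered backlog,
    -- STARTED_DT = work began, DONE_DT = completed).
    SELECT TRUNC(done_dt, 'IW')                 AS week_completed,
           COUNT(*)                             AS throughput,
           ROUND(AVG(done_dt - created_dt), 1)  AS avg_lead_time_days,
           ROUND(AVG(done_dt - started_dt), 1)  AS avg_cycle_time_days
    FROM   work_items
    WHERE  done_dt IS NOT NULL
    GROUP  BY TRUNC(done_dt, 'IW')
    ORDER  BY week_completed;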

Skills
Agile Project Management:
- Agile Scrum: Advanced (4+ years)
- Kanban: Proficient (3+ years)
- Waterfall Methodology: Experienced (4+ years)
- Scaled Agile Framework (SAFe): Proficient (1+ years)
Project Management & Collaboration Tools:
- JIRA: Advanced (4+ years)
- Confluence: Advanced (4+ years)
- GitHub: Experienced (5+ years)
- MS Project: Proficient (1+ years)
- SharePoint: Proficient (5+ years)
- HP ALM: Experienced (4+ years)
Database Technologies & Tools:
- SQL: Advanced (8+ years)
- Oracle: Advanced (8+ years)
- TOAD: Proficient (6+ years)
- SQL Developer: Experienced (5+ years)
Microsoft Office Suite:
- MS Word: Advanced (8+ years)
- MS PowerPoint: Advanced (8+ years)
- MS Excel: Advanced (8+ years)
- MS Visio: Proficient (5+ years)

Work Experience
Junior Scrum Master
Wells Fargo, Charlotte, NC - April 2022 - Dec 2023
SimCorp Dimension is WF Securities Operations and Accounting's (SO&A) system of record for Investment Portfolio (IP) security trading portfolios and Collateral Allocation for Derivatives Collateral Management and Public Funds. Worked on decommissioning projects - Wallstreet Application and Blackrock System, Portfolio Layering - FASB changes, SCD to Bloomberg - Investment Portfolio Efficiency and Transformation Initiative, Remediation Source Code request (Audit), and Wells Fargo Trading and Funding Applications.
- Facilitated daily scrum meetings, ensuring efficient task updates and addressing any blockers in project execution for the GSMOS project.
- Actively collaborated with Product Managers, Technical Teams, and other stakeholders to streamline project deliverables within defined SLAs, focusing on Agile methodologies.
- Directly interacted with stakeholders and product owners, gathering and managing feedback to continually enhance customer satisfaction and meet business requirements.
- Played a key role in understanding and documenting user expectations, developing functional and technical requirements, and clarifying any doubts with stakeholders.
- Conducted knowledge transfer sessions for the team on functional specifications, helping team members understand and apply Agile principles in their work.
- Served as the primary point of contact for business requirements, effectively representing end-user perspectives to the development team.
- Led the functional testing phase of projects, ensuring product quality and readiness prior to user acceptance testing.
- Promoted the adoption of Agile best practices within the team, focusing on continuous improvement and functional product quality.
- Practiced Agile methodologies, including test-driven development and model storming, and assisted in setting priorities in line with iteration goals.
- Encouraged a collaborative team environment, sharing best practices and contributing to the team's overall knowledge and skillset.
- Organized and facilitated requirement-gathering workshops, such as Joint Application Development (JAD) sessions and root cause analysis meetings.
- Conducted training sessions on requirements management and contributed to the development of best practices for Agile product delivery.
- Partnered with development leads to foster better collaboration and identify opportunities for process improvement.
- Documented business needs and processes comprehensively, ensuring the support systems were robust and effective.
- Actively involved in resolving issues during the testing process, facilitating smooth progress and adherence to project timelines.

Application Support Analyst / Business Analyst Intern
Wells Fargo, Charlotte, NC - April 2019 - April 2021
SimCorp Dimension is WF Securities Operations and Accounting's (SO&A) system of record for Investment Portfolio (IP) security trading portfolios and Collateral Allocation for Derivatives Collateral Management and Public Funds.
- Documented and presented to business stakeholders the key goals/needs and created the BRD, FSD, and Data Flow/Mapping/Transformation rules for the GSMOS project, which ensures compliance with sanctions laws by automating sanctions screening within the SCD system and allows messages to be sent directly to SWIFT Alliance Access (SAA), replacing the current BlackRock Aladdin system.
- CECL Project - Involved in discussions with the Technical Team, providing clarifications on the requirements.
- Prepared and reviewed various SDLC documents for design, data flow diagrams, source-to-target mappings, impact analysis, unit testing, and the implementation plan document.
- Conducted interviews with multiple stakeholders to capture and create Business Requirements Documents (BRDs).
- Translated business requirements into technical user stories for development teams.
- Defined scope and documented BRDs for new enhancements based on requirement elicitation, operational constraints, information system architecture, and project risks.
- Collaborated with stakeholders to identify requirements and recommend solutions to address business needs, complying with operations and IT controls frameworks.
- Created a standardized process document for the department to ensure successful integration of existing processes.
- Actively participated in the management of multiple projects at a time, ranging in complexity and size. Built relationships with co-workers, partners, and clients to enhance professional reputation.
- Raised tickets through HP ALM for domain-related issues and assigned them to various teams in order to resolve them.

Associate Support Analyst
Bank of America, Charlotte, NC - July 2018 - March 2019
LEO - GBAM's single system of record for capturing AML regulatory data for clients using GBAM products and services. It maintains pertinent information for AML/KYC onboarding to ensure compliance with BAC AML policy and program and with appropriate local AML requirements based on jurisdiction and booking entity; facilitates account opening/relationship establishment by processing current client status within specific regions, LOBs, and special product levels; streamlines the onboarding process across all regions and LOBs; and shares client regulatory data and documentation.
- Performed regulatory reporting functions; worked with Data Analysts on data extraction, data analysis, and data mining in business-partner-facing roles.
- Performed development testing and system validation through automated testing strategies, using both Agile and Waterfall methodologies.
- Gathered requirements to translate business needs into technical requirements and ensure all system functions conform to the business needs and specifications.
- Understood and documented current business processes, completed system design, and provided development support; proactively identified gaps and risks and developed mitigation plans to resolve issues.
- Aerial - Performed impact analysis and created/modified Tableau reports and SQL to complete audit requests from geographically distributed business partners, generating time-sensitive audit reports to ensure compliance (see the sketch after this list).
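The audit-style SQL behind reports like those described above can be illustrated with a minimal sketch. The CLIENT_ONBOARDING table, its columns, and the overdue-review rule are hypothetical stand-ins, not the bank's actual AML/KYC data model.

    -- Hypothetical example: count clients whose KYC review is past due, by region
    -- and line of business, as a time-sensitive compliance extract feeding a Tableau report.
    SELECT region,
           lob,
           COUNT(*) AS overdue_reviews
    FROM   client_onboarding
    WHERE  kyc_status = 'PENDING_REVIEW'
    AND    next_review_dt < TRUNC(SYSDATE)
    GROUP  BY region, lob
    ORDER  BY overdue_reviews DESC;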

Associate Support Analyst
Wells Fargo, Charlotte, NC - Feb 2016 - August 2017
SimCorp Dimension is an innovative, integrated, automated, and modular investment management software. ICA (Integrated Compensation Application) is a three-phase project that intends to integrate the existing independent compensation applications - STARS (sales tracking and referral system), WMGCOMP (dashboard reporting to managers and producers), iComp (calculates and processes compensation for Wealth producers), and iBonus (evaluates bonus eligibility) - onto one platform to address the data integrity and consistency issues that lead to audit risks and business impact during compensation reconciliation.
- Supported the ICA/SCD application post upgrade: enhancements to existing processes, defect fixes, and post-production fixes for defects such as incorrect IFS opportunity sales details; optimized code for performance improvements in the account loader process.
- WFBI-SimCorp to ITFT Interface Project - Created the BRD, FSD, Data Flow/Mapping/Transformation rules, and a Visio diagram of the MQ message flow from SCD to ITFT, and walked them through with the Business Team for sign-off.
- Autosys application support - Built custom solutions by splitting the Batch Job Group (BJG) in SCD into multiple BJGs to avoid reruns on failure, assisted the Business Team in creating test data positions/transactions in SCD, and created/ran ad hoc jobs in the Prod Fix environment to verify/validate the exports before sharing them through Documentum.
- Investigated the nightly ETL batch process for data integrity issues affecting business intelligence reporting; corrected job errors, data access issues, and platform data type conversion conflicts.
- Prepared and reviewed various SDLC documents for design, data flow diagrams, source-to-target mapping, impact analysis, unit testing, and the implementation plan document.
- As part of application support, worked on fixing Autosys issues ranging from job dependencies with the source system and downstream applications, to enhancing the sales process package with new business requirements, to enhancing the loader process with bulk collections for faster updates to the opportunity and sales data (see the sketch after this list).
- Wrote PL/SQL processes to transform ICA data into metrics data and load it into aggregated tables.
- Created and modified UNIX shell scripts according to the changing needs of the project and client requirements.
- Modified and created PL/SQL stored procedures, functions, and packages for moving data from the staging area to the data mart.
- Proficient in using PL/SQL for developing complex stored procedures, triggers, tables, views, user functions, relational database models and data integrity, SQL joins, and query writing.
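A minimal sketch of the bulk-collection loader pattern referenced above. The STG_SALES and ICA_SALES tables and their columns are assumptions used only to illustrate BULK COLLECT with FORALL; they are not the actual ICA schema.

    -- Illustrative loader fragment: read staged rows in batches and apply them with
    -- FORALL, reducing context switches between the SQL and PL/SQL engines.
    DECLARE
      CURSOR c_stg IS
        SELECT opportunity_id, sales_amt
        FROM   stg_sales;                                  -- hypothetical staging table
      TYPE t_id_tab  IS TABLE OF stg_sales.opportunity_id%TYPE;
      TYPE t_amt_tab IS TABLE OF stg_sales.sales_amt%TYPE;
      l_ids  t_id_tab;
      l_amts t_amt_tab;
    BEGIN
      OPEN c_stg;
      LOOP
        FETCH c_stg BULK COLLECT INTO l_ids, l_amts LIMIT 1000;  -- 1000 rows per batch
        EXIT WHEN l_ids.COUNT = 0;
        FORALL i IN 1 .. l_ids.COUNT
          UPDATE ica_sales                                 -- hypothetical target table
          SET    sales_amt      = l_amts(i)
          WHERE  opportunity_id = l_ids(i);
        COMMIT;
      END LOOP;
      CLOSE c_stg;
    END;
    /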

SQL Support Analyst
Maritz, Saint Louis, MO - Nov 2015 - Jan 2016
Maritz Incorporated, which ranked 157th on the most recent Forbes list of "The 500 Biggest Private Companies," offers marketing research, travel, and performance improvement services delivered through 20 operating companies located in the U.S., Canada, Mexico, the United Kingdom, France, Spain, Germany, and Italy. MaritzCX works with customer experience management professionals and market researchers to provide a customer experience software and research solution that helps organizations understand and respond to customer experiences in real time.
- Actively involved in gathering and analyzing the requirements related to the project.
- Initiated and executed conversion packages to convert data from the EPF data source, converting existing data and producing the technical specifications document for extracting data from the various businesses and spreading the incoming data.
- Worked closely with end users to collect requirements and test processes; analyzed data and designed databases for importing data.
- Modified and created PL/SQL and SQL stored procedures, T-SQL, PL/SQL functions, and packages for moving data from the staging area to the data mart.
- Modified existing indexes to change/add columns on the tables for faster data retrieval and enhanced database performance.
- Involved in data loading using PL/SQL and SQL*Loader, calling UNIX scripts to download and manipulate files.
- Created different fact and dimension tables to increase the performance of the complex SQL queries used in Tableau reporting.

SQL Dev Analyst
Family Dollar, Charlotte, NC - Jan 2015 - Sept 2015
Family Dollar is an American variety store chain and the second-largest retailer of its type, with 8,000-plus stores in the United States. IT Pricing and Promotions performs various types of promotional pricing such as BOGO (buy one get one), loss leader pricing, close-out pricing, trade pricing, demand-based pricing, everyday low pricing, going-rate pricing, markup cost pricing, penetration pricing, prestige pricing, value-in-use pricing, tiered pricing, target return, and variant pricing. The promotions are applied between certain dates to stores based on zones and other factors. These details are uploaded via flat files into all stores using various nightly batch cycles, and on demand due to conflicts arising during store business hours.
- Worked on SQL*Loader to load data from flat SAT files obtained from various facilities every day.
- Wrote SQL processes to transform OD data into metrics data and load it into aggregated tables.
- Created and modified UNIX shell scripts according to the changing needs of the project and client requirements.
- Wrote UNIX shell scripts to create daily, weekly, and monthly reports that are shared with the business team.
- Involved in continuous enhancements and the fixing of production problems.
- Performed detailed impact analysis on existing failures in the batch that downloads data to the stores.
- Modified and created PL/SQL and SQL stored procedures, PL/SQL functions, and packages for moving data from the staging area to the data mart.
- Modified existing indexes to change/add columns on the tables for faster data retrieval and enhanced database performance.
- Involved in data loading using PL/SQL and SQL*Loader, calling UNIX scripts to download and manipulate files.
- Proficient in using T-SQL for developing complex stored procedures, triggers, tables, views, user functions, relational database models and data integrity, SQL joins, and query writing.
- Worked with T-SQL, DDL, and DML scripts and established relationships between tables using primary keys and foreign keys.
- Created indexes to optimize the performance of T-SQL queries.
- Created tables/views in Netezza to support the reporting requirements and optimized the tables by adding the right distribution keys to avoid data skew.
- Extensively used hints to direct the optimizer to choose an optimum query execution plan.
- Created PL/SQL scripts to extract data from the operational database into simple flat text files using the UTL_FILE package (see the sketch after this list).
- Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools like TOAD, PL/SQL Developer, and SQL*Plus.
- Utilized various utilities such as Explain Plan, TKPROF, and DBMS_STATS in tuning the SQL interfaces.
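A minimal sketch of a UTL_FILE-style extract like the one described above. The EXTRACT_DIR directory object, the file name, and the PRICE_EXTRACT source are hypothetical placeholders, not actual Family Dollar objects.

    -- Illustrative flat-file extract: write pipe-delimited rows to a server
    -- directory using UTL_FILE. EXTRACT_DIR must exist as an Oracle directory
    -- object with write access granted; the query and file name are assumed.
    DECLARE
      l_file UTL_FILE.FILE_TYPE;
    BEGIN
      l_file := UTL_FILE.FOPEN('EXTRACT_DIR', 'daily_prices.txt', 'w');
      FOR r IN (SELECT store_id, sku, promo_price
                FROM   price_extract)                -- hypothetical source view
      LOOP
        UTL_FILE.PUT_LINE(l_file, r.store_id || '|' || r.sku || '|' || r.promo_price);
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    EXCEPTION
      WHEN OTHERS THEN
        IF UTL_FILE.IS_OPEN(l_file) THEN
          UTL_FILE.FCLOSE(l_file);
        END IF;
        RAISE;
    END;
    /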

SQL Support Analyst
A.C. Nielsen, Chicago, IL - Jan 2013 - Dec 2014
A.C. Nielsen is the world's leading company conducting surveys for retail products across the world and reporting the products rated best by customers. In the RDL system, all product data arrives as an input flat file; the RDL system loads the data into the regional database in two modes: the initial load and the incremental load. During the incremental load, the system captures the changed data (CDC - Changed Data Capture) and uploads only the changed data into the regional database. If any error occurs during the RDL process, the information is stored in the respective log tables. If all the data is uploaded successfully, the system extracts a file in the same format initially received for the RDL process and places it in the outgoing area, from where the MDL (Massive Data Load) process is initiated.
- Worked on SQL*Loader to load data from flat files obtained from various facilities every day.
- Created and modified several UNIX shell scripts according to the changing needs of the project and client requirements.
- Wrote UNIX shell scripts to process the incoming files on a daily basis and load them into the base tables.
- Proficient in using T-SQL for developing complex stored procedures, triggers, tables, views, user functions, relational database models and data integrity, SQL joins, and query writing.
- Worked with T-SQL, DDL, and DML scripts and established relationships between tables using primary keys and foreign keys.
- Worked with the Architect to design the various database tables/reports to capture the CDC and FULL external files received on a daily and weekly basis.
- Created indexes to optimize the performance of T-SQL queries.
- Involved in continuous enhancements and the fixing of production problems.
- Created PL/SQL stored procedures, functions, and packages for moving data from the staging area to the data mart.
- Created scripts to create new tables, views, and queries for new enhancements in the application using TOAD.
- Created indexes on the tables for faster data retrieval to enhance database performance.
- Involved in data loading using PL/SQL and SQL*Loader, calling UNIX scripts to download and manipulate files.
- Performed SQL, PL/SQL, and application tuning using tools such as EXPLAIN PLAN (see the sketch after this list).
- Extensively used hints to direct the optimizer to choose an optimum query execution plan.
- Used bulk collections for better performance and easier data retrieval by reducing context switching between the SQL and PL/SQL engines.
- Created PL/SQL scripts to extract data from the operational database into simple flat text files using the UTL_FILE package.
- Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools like TOAD, PL/SQL Developer, and SQL*Plus.
- Extensively used advanced PL/SQL features such as records, tables, and object types.
- Utilized various utilities such as Explain Plan, TKPROF, and DBMS_STATS in tuning the SQL interfaces.
- Strategized the design, code, and system/unit test plan reviews to enhance project quality.
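For illustration, the EXPLAIN PLAN and optimizer-hint workflow mentioned above, shown on a hypothetical RDL log-table query. The table, columns, and index name are assumptions, not the actual RDL schema.

    -- Generate and display the optimizer's plan for a candidate query; the hint
    -- requests a specific (assumed) index so the plan can be compared against
    -- the unhinted version.
    EXPLAIN PLAN FOR
      SELECT /*+ INDEX(e rdl_error_log_dt_ix) */     -- hypothetical index name
             load_id, error_code, error_msg
      FROM   rdl_error_log e                         -- hypothetical log table
      WHERE  load_dt >= TRUNC(SYSDATE) - 7;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);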

Education & Visa Status
Bachelor of Science - GAUHATI UNIVERSITY, ASSAM, INDIA
VISA STATUS - GC HOLDER
