A.3 Guide to Decision Support Tools 

Strategic Objective:

Use Decision Support Tools to Achieve Organization Excellence

Quality Objective:

Train Managers to become OE21 Certified Management Analysts

Approved: DD-MMM-YY

Approved by: CEO, President, or Senior Organization Leader

Responsibility:

All Focus Teams (LFT, CFT, OFT, WFT)

VALUE ADDED - OE21 Decision Support Tools (Surveys, Analysis, Plans, Trends)

 

  1. Improves the process of data collection, analysis, problem definition, and solution implementation.

  2. Improves synthesizing, analyzing, and interpreting quantitative and qualitative data, turning data into useful information, and acting operationally and strategically.

  3. Improves impact and benefits analysis, requirements definition, and solution creation.

  4. Improves analysis of performance factors listed in Baldrige Excellence Commentary 4.1.

  5. Trains organization managers to use advanced decision support tools and techniques.

  6. Ensures that problem definition and solution creation processes are logical and effective.

  7. Ensures that all OE21 Focus Teams have a common and shareable approach. 

Applicability of A.3 Decision Support Tools

This A.3 Common Standard applies to the majority of the OE21 Surveys and their associated Decision Support Tools spreadsheet models, excluding the following:

  • All other OE21 Common Standards (see Main)

  • LFT B.1 surveys and sheets

  • LFT Organizational Profile Part 1 and Part 2

  • LFT 2.2 Product, Service Value Contribution (sheet)

  • LFT Leadership Excellence Strategy Plan (sheet)

  • LFT Leadership Excellence Results (sheet)

  • CFT Customer Excellence Strategy Plan (sheet)

  • CFT Customer Excellence Results (sheet)

  • OFT 6.1a Process Designer (sheet)

  • OFT 6.1b Project Manager (sheet)

  • OFT 6.2a Process Improvement Workbook

  • OFT 6.2a SM Team Charter (survey)

  • OFT 6.2a SM Data Log (survey)

  • OFT 6.2a SM Story (survey)

  • OFT PDF files (reference documents)

  • OFT Operations Excellence Strategy Plan (sheet)

  • OFT Operations Excellence Results (sheet)

  • WFT Workforce Excellence Strategy Plan (sheet)

  • WFT Workforce Excellence Results (sheet)

A.3 Implementation Instructions

Prerequisite Steps: Conduct Surveys and Export Results – Each Focus Team implements its unique set of OE21 Standards and decision support tools, including setting up and conducting surveys.

 

All focus teams use the common OE21 A.2 Survey Process instructions to execute the surveys and to apply the Survey Methods application properly. Once a survey is finished, the OE21 Facilitator (FAC) or another Focus Team member exports the Survey Methods results into a flat Excel file suitable for insertion into the SurveyData tab of the selected OE21 Decision Support tool. The OE21 Main page presents surveys and their associated decision support tools.  

 

The following steps assume that the survey data exists in the SurveyData tab of the OE21 Decision Support tool that is paired with the survey. 

PART 1 - STATISTICS

Step 1 - Open the OE21 Decision Support tool and verify that the SMdata tab is populated from the survey used to collect data from responders. If not, repeat the steps in the Survey Process.

Step 2 - Open the Instructions tab and observe the Target Population (yellow cell at upper left). This value should be the number of responders that you sent surveys to, and this usually differs from the number of responses (# of Responses) that you received back in the survey. 

  • Refer to Figure A.3-1 STATISTICS TABLE (below)

 

Step 3 - Estimate and Input the Target Population for this OE21 process. The target population is the number of people or units. Units might be organizations, associations, work units, stakeholders, customers, employees, suppliers, community business, health care or educational institutions, or other units. Note that the target population is usually less than the maximum possible. Input the Target Population into cell C2 on the Instructions tab. Note that this input calculates the sample size needed based upon a Sample Size Calculator formula in that cell. The following statistics are used in the sample size calculations:

  • Margin of Error - The OE21 Sample Size Calculation uses only the most common Margin of Error (also called Confidence Interval) of plus/minus three percent (+/-3%). This is the amount of error you choose to tolerate - an indicator of accuracy.

 

  • Confidence Level - The OE21 Sample Size Calculation uses only the most commonly used Confidence Level of 95%. This is the amount of uncertainty that you choose to tolerate. Note this 95% is also expressed as 1.96 in cell C9, for use in sample size calculation. 

 

  • Proportion of Interest - The OE21 Sample Size Calculation uses only the most commonly used value of 50%. This is an estimate of the proportion of people or units falling into the group within your Target Population in which you are interested. Using 50% is like insurance (i.e., worst case) and increases the accuracy of the sample size calculation.
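The guide does not reproduce the formula in cell C2, but a standard calculation consistent with the three statistics above (a ±3% margin of error, 95% confidence expressed as 1.96, and a 50% proportion of interest) is Cochran's sample size formula with a finite population correction. The sketch below is an assumption about how such a calculator works, not the verified contents of cell C2, and the function name is illustrative:

```python
import math

def sample_size_needed(target_population: int,
                       margin_of_error: float = 0.03,   # +/-3%, the OE21 default
                       z: float = 1.96,                  # 95% confidence level
                       proportion: float = 0.50) -> int: # worst-case proportion
    """Cochran's sample size formula with finite population correction.

    NOTE: this is a sketch of a standard textbook formula, not the
    verified contents of cell C2 on the OE21 Instructions tab.
    """
    # Unadjusted sample size for an effectively infinite population
    n0 = (z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    # Finite population correction for small target populations
    n = n0 / (1 + (n0 - 1) / target_population)
    return math.ceil(n)

print(sample_size_needed(195))
```

For a Target Population of 195, this formula yields a Sample Size Needed in the mid-160s; the exact value shown in cell C3 may differ slightly depending on rounding and on how the correction is applied in the workbook.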

 

Step 4 - Observe the Sample Size Needed value in cell C3. This value is the minimum number of people or units that should be surveyed for the Target Population you input. Ideally, you would survey this number of people or units or more; however, not all will respond. 

 

Step 5 - Observe the # (number) of Responses in cell C4. The formula in this cell counts the number of people or units that actually responded to your survey. These responses will come into the OE21 survey system (www.surveymethods.com) for this OE21 Standard (Title tab). The number of responses would ideally be close to the Sample Size Needed in cell C3; however, it may differ by a significant percentage.

 

Step 6 - Observe the Responses divided by Need (Responses / Need) in cell C5. This calculates the responses received as a percentage of the sample size needed. Ideally, the percentage is over 75%, which indicates that you probably have sufficient responses to analyze. Percentages lower than 50% indicate that you may need to survey additional people or units before making assumptions about the responses and associated analysis.
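The Step 6 guideline can be expressed as a small decision helper. This is a sketch: the function name is invented, and the label for the 50-75% band is an assumption, since the guide only defines the over-75% and under-50% cases explicitly.

```python
def responses_vs_need(num_responses: int, sample_size_needed: int) -> str:
    """Apply the Step 6 guideline to the Responses / Need ratio.

    The 'borderline' label for the 50-75% band is an assumption;
    the guide only defines the >=75% and <50% cases explicitly.
    """
    pct = 100.0 * num_responses / sample_size_needed
    if pct >= 75:
        return f"{pct:.1f}% - probably sufficient responses to analyze"
    if pct < 50:
        return f"{pct:.1f}% - survey additional people or units first"
    return f"{pct:.1f}% - borderline: proceed, but re-contact non-responders"
```

For example, 125 responses against a needed sample of 155 falls in the "sufficient" range, while 70 responses would call for more surveying.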

 

Step 7 - Senior Leader Summary - The Analysis tab summarizes the above statistics as values for Target Population, Sample Size Needed, Number (#) of Responses, and Responses divided by Need in percent.

 

  • Refer to Figure A.3-2 STATISTICS TABLE (below)

The Responses/Need % value helps provide an answer to the question: "Do we have enough responses for observations and to help plan out solutions?"

 

Our guidelines are as follows: if less than 50%, go back and gather more responses before analysis and planning. If 50% or above, we may proceed with analysis and planning; however, we should also reach out to those who did not respond and ask them once again. 


Figure A.3-1 STATISTICS TABLE

(Instructions page)


Figure A.3-2 STATISTICS TABLE

(Analysis page)

Part 2 - ANALYSIS 

The OE21 analysis tools are aimed at answering the following questions:

  • Q1 - Have we defined the purpose and primary goal for our analysis efforts?

  • Q2 - Do we have sufficient data and information to understand the problem?

  • Q3 - Do we have sufficient data and information to decide on the solution?

Learning Analysis by Example - The majority of the OE21 Analysis tools use data collected by sending surveys to target population groups (people or organizations). The data collected are input to OE21 decision support tools. We have selected one of these tools to help train managers in how to use the OE21 Analysis tabs that these tools contain. Our selection is named WFT 5.1b Voice of Workforce. We suggest that you open a copy of this tool in a second window and use it as a reference as you study these instructions. The original version of this Excel file contains example data that has been input into the tab named SMdata. 

Analysis Purpose and Primary Goal

 

The purpose of the Analysis in our Voice of Workforce example is to understand the workforce's (employees, contractors, volunteers) personal views of the organization's attributes, including satisfaction, communications, health and safety, benefits, workplace comfort, work procedures and policies, and recognition and rewards provided by the organization.

 

The primary goal of the Analysis is to measure each attribute and then take steps to improve these attributes. As you will see, the improvement steps become project tasks that are input in a tab named SimplePlan. Each project task has a performing organization or person, estimated planned start and finish dates, and estimated labor hours or non-labor dollars to complete the task. The summary-level measures that the Analysis process creates are also used to track the Trends of the measure (in percent, over a time period) until the measure reaches its final target goal (usually 100%). 

Q1 - Have we defined the purpose and primary goal for our analysis efforts? To answer this question, observe the Analysis tab of the WFT 5.1b Voice of Workforce sheet. Here are suggested observations:

  • Target Population - 195 people from the workforce will receive the survey that collects attribute data.

  • Sample Size Needed - 155 is the calculated sample size needed for the population of 195.

  • # of Responses actually received (those who responded to the survey) is 125 (about 81% of 155).

We would say that 81% received is sufficient for achieving our purpose. On the other hand, if we had 50% or fewer responses, we would suggest going back and asking for more responses. The bottom line is that 125 of the 155 responses needed is pretty good and is probably sufficient to accomplish our purpose of understanding the workforce's (employees, contractors, volunteers) personal views of the organization's attributes.

An important part of our primary goal is to measure each attribute. Here are a few key observations:

  • At the top of the Analysis tab, we see Date Refreshed, the Measure (58.3%), and the Target of 100%. This measure is computed as the average rating of the 125 responders. On the 1-5 rating scale, the responders' average rating is 2.92, which is 58.3% of the highest rating of "5" (100%).
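The Measure computation described above (the average of all 1-5 ratings, expressed as a percent of the maximum rating of 5) can be sketched as follows; the function name is illustrative, not part of the workbook:

```python
def measure_percent(ratings: list[float], max_rating: int = 5) -> float:
    """Average the 1-5 ratings and express the result as a percent
    of the highest possible rating, as the Analysis tab does."""
    avg = sum(ratings) / len(ratings)
    return round(100.0 * avg / max_rating, 1)

# e.g. an average rating of 3.0 on a 1-5 scale corresponds to 60.0%
print(measure_percent([3, 3, 3]))
```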


Figure A.3-3 Date Measure Target Example from Analysis tab

Just below the Measure and Target values are the Areas Impacted (%), shown in Figure A.3-4. The Areas Impacted (Benefits/Policies, Customers/Stakeholders, Employees/Contractors, Volunteers, etc.) are selected by responders during the survey, using a pulldown list. The associated bar chart presents a graphical view of areas impacted by percent. Here are suggested observations:

  • The list of numbers on the far left is the number of responder suggestions (748) from the survey.

  • The Areas Impacted list with percentages is calculated on the Analysis tab. The highest area impacted is the category Materials/Supplies/Facilities (count of 372; 49.9%). These areas impacted were selected by survey responders using a survey pull-down list. Benefits/Policies (count 245; 32.9%) was also a significant area impacted.

  • The bar chart presents the graphical view of these data. 
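The Areas Impacted percentages are each category's count divided by the total number of responder suggestions. A minimal sketch, assuming the survey export provides one impact-category string per suggestion (the function name is illustrative):

```python
from collections import Counter

def areas_impacted_percent(impact_categories: list[str]) -> dict[str, float]:
    """Tally responder-selected impact categories and convert each
    count to a percent of all suggestions, as on the Analysis tab."""
    counts = Counter(impact_categories)
    total = sum(counts.values())
    return {area: round(100.0 * n / total, 1) for area, n in counts.items()}

sample = ["Benefits/Policies"] * 2 + ["Materials/Supplies/Facilities"] * 3
print(areas_impacted_percent(sample))
```

The percentages always sum to (approximately) 100%, which is why the two largest categories in the example above account for over 80% of suggestions.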

The Average Ratings (1-5 Scale) in Figure A.3-4 is calculated as the average of all 1-5 ratings across all questions. This value is shown as a decimal (2.92 of 5) and a percent (58.3%). There are a few minor considerations when observing these averaged ratings:

  • Statistics experts will explain that a 1-5 rating scale is an "ordinal scale," which means (as an example) that the difference between a 3 and a 4 is not really known. We know 4 is higher than 3, but we cannot quantify how much better the 4 is. 

  • The OE21 average ratings are used only to estimate the direction, over time, of the rating. Is it getting better (60%, then 70%, then 75%) or worse (60%, then 50%, etc.)? 


Figure A.3-4 Area Impacted (attributes) with statistics

Below the Areas Impacted section of the Analysis tab is the Excel Table, containing columns and rows for the key elements of the analysis. Above the table are survey questions that responders encountered during the survey. The Table has the Excel gray arrow icons which indicate that the data can be sorted.

 

NOTICE: To activate sort, the Analysis tab must be unprotected, using the password given to the OE21 facilitator during the OE21 installation and setup phase. 

Figure A.3-5 below is the initial sort by Responses. This is the sequence in which the survey was completed (e.g., who was first and who was last to finish the survey). This sort also gives you a way to see who did NOT take the survey: compare the responders' organization work units/roles to the list of all organization work units and responders. Then you can find out why some might not have participated in the survey and ask them to take it. 


Figure A.3-5 Questions, Ratings, Suggestions, and Areas Impacted

SORTING EXERCISE 1: Figure A.3-6 below shows Sort by Organization Work Unit or Role. By clicking the gray arrow icon on the table heading for Organization Work Unit or Role, this sort appears. 

  • IMPORTANT: To avoid viewing blank cells, click the Sort icon, then scroll to the bottom and unselect the [  ]Blanks. Now only cells with data will appear in either A-Z or Z-A sorts. 

 

Value of the Sort by Organization or Role:

Figure A.3-6 shows the contributions of the responders in each work unit or role (job position). Here you must consider the number of people in work units or roles as well as the number of responses and the quality of the responses. A few analysis observations related to the engagement of people might be as follows:

  • Are work units or roles actively participating in this survey (number of responders vs. target)?

  • Are there engagement issues (e.g., people not doing their best; all ratings same, etc.)?

  • Are there many mismatches between suggestions and impact categories?

 

IF we don't find many "disengagement" indicators, then the results are more useful.

  • Are there key differences between ratings, suggestions, and the impact of different work units or roles? If so, then are there good reasons for these key differences? Or not?

 

  • An example might be that 80% of work units indicated they were very dissatisfied with the dental plan. Are the other work units aware of the dental plan? Should the organization initiate a dental plan awareness plan?

 


Figure A.3-6 Sort by Organization Work Unit or Role

SORTING EXERCISE 2: Figure A.3-7 below shows Sort by Suggestions. By clicking the gray arrow icon on the table heading for Suggestions, this sort appears. 

 

Value of the Sort by Suggestions:

We can see the suggestions responders made about what it would take to earn their highest rating of "5." The suggestions are useful for pinpointing problems and collecting ideas for possible solutions. The more people who agree on specific suggestions, the more likely these are valuable inputs.

  • One caution: do not dismiss "lone wolf" suggestions with only 1 or 2 responders. Sometimes one or two people have great suggestions or ideas. 

Another use of sort by suggestions is to see how well the responders are engaged. Are they making their best efforts to submit good suggestions, or just entering a random thought to move ahead rapidly through the survey? To see how rapidly they completed the survey, look at the SMdata tab and find the actual start date and time, along with the actual finish date and time. From this observation, you will get an idea of how long it takes most people to do the survey. 

 

IF we don't find too many of the above "disengagement" indicators, then the results are more useful.


Figure A.3-7 Sort by Suggestions

SORTING EXERCISE 3: Figure A.3-8 below shows Sort by Ratings. By clicking the gray arrow icon on the table heading for Ratings, this sort appears. 

Value of the Sort by Ratings:

We can see the ratings that responders submitted from lowest (1) to highest (5). If we want to get a quick idea of what work unit or roles are the least or most satisfied this is the sort to use. Another observation is which of the areas impacted have the lowest or highest ratings?

Another observation is about engagement. Some responders may notice that the survey asks them to input suggestions for any rating below a "5" and, to avoid that extra thinking, simply input "5" ratings and move quickly through the survey. 

About Engagement: Thought leaders and publishers of major worldwide surveys on employee and manager engagement indicate that well over 50% of employees and managers are disengaged! Disengagement is a big problem, and if an organization has a high level of it, something must be done to correct this costly problem. The costs of terminations, replacement, and retraining are very large. The cost of low product/service quality due to disengaged workers is equally large. 

The OE21 WFT 5.2a Manager Engagement decision support tool is recommended in order to collect survey data aimed at measuring engagement using a number of attributes. 

The OE21 OFT 6.1d Innovator decision support tool is highly recommended for drilling down to find problem causes, understand why these problems exist, and find good solutions to eliminate high disengagement. 


Figure A.3-8 Sort by Ratings

SUMMARY OF ANALYSIS OBSERVATIONS

A summary of the results of these analysis exercises is as follows:

 

  • We decided on the Target Population value and input it into the Instructions tab (yellow cell). The target population is the number of people or units we would like to survey.

 

  • We observed the calculated Sample Size Needed and the number of responses received. We observed the calculation of responses/needs and made sure it was sufficient using a guideline of 75% or more. If our responses are lower, then we know we should go back to non-responders and encourage them to respond.

 

  • We learned that our Primary Goal is to measure the ratings for each attribute in the survey and then try to create improvements. Responders provide both ratings and suggestions for improvements, along with the categories that each suggestion belongs to. The responders used the category pull-down list to link their suggestions to appropriate areas impacted. 

 

  • From all measured ratings, we observed the calculated top-level Measure in percent (58.3%) and compared this to the Target, which was set to 100%. We also noted the Date the survey was completed.

 

  • From all measured ratings, we observed the calculated percent (%) for each of the Areas Impacted. In our example, we learned that Benefits/Policies (32.9%) and Materials/Supplies/Facilities (49.9%) were the top areas.

 

  • We sorted the Analysis table by Organization or Role and used the results to view the contributions of each work unit/role. From these observations we gained a feeling about the level of engagement of the responders, e.g., not enough responders or responses; doing their best (or not), mismatches between suggestions and impact categories, and other useful indicators.

 

  • We sorted the Analysis table by Ratings to learn which responders tended to rate the highest and the lowest, as well as how many were similar. 

 

  • We sorted the Analysis table by Suggestions to help pinpoint the problems and to consider the suggestions that responders submitted.


The next step is to input the Date, Measures and Targets from the Analysis tab into the Trends tab.

(End of Part 2)

PART 3 - ANALYSIS TRENDS

Figure A.3-9  is the Trends Chart that resides in the Voice of Workforce Trends tab. The dates, measures, and targets are input from those displayed near the top of the Analysis tab. This trend shows that the latest measure (58.3%) is moving steadily toward its target value of 100%. The Analysis and Action Plan narrative is based upon the analysis of survey data, as presented in the Analysis tab, and as created by the Workforce Focus Team (WFT) and the OE21 Facilitator. 


Figure A.3-9 TRENDS CHART - Voice of Workforce

INSTRUCTIONS FOR INPUT OF DATA INTO TRENDS CHART FIELDS

1. Input the Chart Information (below chart)

 

  • Responsibility (name of Focus Team)

 

  • Measure Title and Year (e.g., Voice of Workforce Analysis Year 2020)

 

  • Metric or Measure Source (e.g., Data Log or another source)

 

  • Responsibility Name (for analysis and action plan)

 

  • Responsibility Name (for collection and validity of data)

 

  • Date Refreshed: Input from the most recent survey and analysis conducted.  

TREND CHART REVIEW GUIDELINES

1. The TARGETS should be set by the head of each focus team (LFT, CFT, OFT, WFT) and approved by the Leadership Focus Team (LFT).

 

2. The MEASURES should be trending toward their Target values, in order to improve. 

 

3. IF a Measure is trending away from its Target, then the focus team should take timely and effective action to drive the measure toward the target until the target is reached.

 

4. IF Measure has arrived at, or has passed its Target value, then the Target value should be changed to set a new improvement goal. Once a measure has reached its maximum possible target value then no future action is necessary. 

5. If Benchmarks (Alpha and Bravo lines) are available from competitors, comparative organizations, or other benchmark sources then the title of the benchmark should replace Alpha and Bravo titles.

 

6. Alpha and Bravo lines can also be replaced by Upper and Lower Limits in reference to the Measure. These values might be from calculations such as those used in statistical process control. 

7. Focus Teams (LFT, CFT, OFT, WFT) should review all Trend Charts at least every month, or when the data are refreshed.

Figure A.3-10 is the Trends Chart Data Table that resides alongside the Trend Chart in the Trends tab. This Voice of Workforce example shows that three measures were input in the current year. The Instructions will explain the table inputs and use. 


Figure A.3-10 TRENDS CHART DATA TABLE 

INSTRUCTIONS FOR INPUT OF DATA INTO TRENDS CHART DATA TABLE

 

1. Collect and input the MEASURES - The measures are collected (refreshed) each time the associated survey or other data source is updated or refreshed. In Figure A.3-10, the data (measures) were collected in January, August, and November of the current year. The values are from the Analysis tab rating summary. 

2. Set and input the TARGETS - Target values should be input into each month, even if the Measures are input less frequently. This presents the trend chart target value as a dark continuous line that makes it easy to compare to the Measures. Note that the target should always be set above or below the measure, according to these guidelines:

  • If a measure should improve from a low value toward a higher value target (e.g. 25% to 100%) then the target is always set higher (above) the measure. As the measure approaches the target, it is a good practice to "stretch" the target even higher.

  • If a measure should improve from a high value toward a lower value (e.g. 20% defects to 1% defects) then the target is always set lower (below) the measure. As the measure approaches the target, it is a good practice to "stretch" the target even lower.

 

3. Input a value into the Hi (1) Low (0) cell. A "1" indicates that the measure should move higher toward the target (up is good). A "0" indicates the measure should move lower toward the target (down is good). This input triggers the color-coded value (5) to the right of the Hi/Lo input. The code is green (satisfactory), yellow (marginal), and red (unsatisfactory). 

4. IF appropriate, input Alpha and Bravo Benchmarks. These values should represent the competitive or comparative performance of other organizations (if you can capture these). If collected and input, the Alpha and Bravo titles are edited to the actual organization titles. The benchmark values may be input as collected - or - may be replicated each month to draw green (Alpha) or red (Bravo) continuous lines like the target.
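The Hi (1) / Lo (0) input and the green/yellow/red coding described in item 3 can be sketched as below. The direction logic follows the text; the width of the "marginal" yellow band is an assumption, since the workbook's actual thresholds are not documented in this guide.

```python
def trend_status(measure: float, target: float, up_is_good: int,
                 yellow_band: float = 10.0) -> str:
    """Color-code a measure against its target.

    up_is_good: 1 = measure should rise toward the target,
                0 = measure should fall toward the target.
    yellow_band: the percentage-point gap treated as 'marginal' - an
    assumption; the workbook's actual thresholds are not documented here.
    """
    gap = (target - measure) if up_is_good == 1 else (measure - target)
    if gap <= 0:
        return "green"    # satisfactory: at or past the target
    if gap <= yellow_band:
        return "yellow"   # marginal: close to the target
    return "red"          # unsatisfactory: far from the target
```

For example, a 58.3% measure against a 100% target (up is good) codes red, while a defect rate already below its target (down is good) codes green.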

OE21 Trend Chart Limitations and Alternatives

 

  • The OE21 Trend Charts are designed to work for only the current year. Calendar dates run from January to December. No data are saved unless you back up the Excel file with the Trends tab.  

  • The OE21 Trend Chart data inputs for measures, targets, alpha and bravo benchmarks and Hi/Lo values are directly inputted into the data table (not externally collected).

  • The OE21 Decision Support Tools containing SM (survey tabs) and Analysis tabs have one and only one Trend Chart.

  • OE21 Excellence Results Excel workbooks contain twenty (20) Trend Charts each. These multi-trend workbooks are used by each OE21 Focus Team (LFT, CFT, OFT, WFT). In cases where a focus team needs more than 20 trend charts to track many key performance indicators (KPIs), that team can use multiple workbooks. Example: The OFT might need two workbooks to track up to 40 trends.

  • Outside trend-charting tools (e.g., Microsoft Power BI, Domo, and others) are commonly used by organizations for tracking their KPIs and for advanced analysis of trends, including correlations between trends. These Business Intelligence (BI) systems can pull in OE21 data tables from Analysis tabs and use these data to create trend charts and more advanced analyses. 

(End of Part 3)

PART 4 - POST-ANALYSIS PROJECT PLAN (SimplePlan tab)

Figure A.3-11 is an example of a post-analysis project plan (SimplePlan tab). This project plan is used to plan and implement the organization's improvements, based upon suggestions from the Analysis tab and other ideas from the responsible focus team (the Workforce Focus Team in this case). 

 

The overall purpose of this project is to help achieve workforce excellence, using inputs from the Voice of Workforce survey, analysis, and trends. As Figure A.3-11 shows, this is a small, simple project, with a duration of about 2.5 months and a budget of $18,192. 

 

All information presented in Figure A.3-11 is directly input, except for the last column (K) which is the calculated Budget at Completion. The key features of this SimplePlan are:

  • Allows input of up to 100 tasks, and those responsible for task completion

  • Allows input of Planned Start Date and Expected Finish Dates for each task

  • Allows input of labor hour or non-labor dollar estimates for each task

  • Allows input of tasks status dates and percent complete (%)

  • Allows input of actual costs (optional)

  • Allows input of summary task information (project ID, project manager/control, overall start/finish dates)

  • Calculates Budget at Completion which is the final cost of the total project.

 


Figure A.3-11 POST-ANALYSIS PROJECT PLAN (SimplePlan tab) 

PART 4 - POST-ANALYSIS PROJECT PLAN INSTRUCTIONS (SimplePlan tab)

These instructions refer to Figure A.3-11 and apply to all OE21 SimplePlan tabs. 


Step 1 - Open to the SimplePlan tab 

 

Step 2 - Go to Cell C2 and input Project Title.  Example: WFT P1-2020

 

Step 3 - Assign project manager and input name in Cell C4; if needed input name for Project Control person in Cell C5.    

Step 4 - Input a brief Project Description (paragraph) in Cell B17    

 

Step 5 - Input Task Descriptions in Cells B19 to B118 (max of 100 tasks)   

 

Step 6 - Go to Task 1 Cell C19 and input Work Unit, Supplier, or Person responsible for the Task     
   

  • Guideline 1 - Work Units must be confirmed by HR and the CEO. Work Units are usually used by Accounting to group similar workers and compute the average of their hourly wages. These averages are often used in budgeting and in cost collection systems. 

Step 7 -  Skip Status Date and % Completion fields for now and input Task 1 Start Date and Finish Dates in Cells F19 and G19.    

 

Step 8 - The people responsible for Task 1 provide an estimate of Labor Hours or Non-Labor Dollars 

  

  • Guideline 2 - It is very important that the people who will do the work are the ones that provide the hours and dollars estimates for the project. These may be changed if errors or under/overestimates are suspected, but the performers should always be involved in the estimates. This keeps them accountable and helps gain their "buy-in" to get the project work done.        

 

  • Guideline 3: Labor Hours are estimated hours to complete each Task. Ask Accounting for the average rate-per-hour that the Work Unit or Person is paid; input this amount in Cell I19        

   

  • Note: The SimplePlan model multiplies estimated hours times rate/hour to calculate Budget-At-Completion (BAC) in Cell K19. The BAC is the total expected cost for the Task.        

 

Step 9 - IF non-labor costs (supplier, material, or other non-labor costs) are needed, ask the responsible organization for the estimate. The source of non-labor costs might be a supplier or a list of materials ordered by Purchasing.    

 

Step 10 -  Input a "0" in Column H for the task and then input the estimated dollars in Column "I"    

 

  • Note: The SimplePlan model uses the "0" in a formula that places the non-labor dollars from Column I into the BAC column K.        
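The Budget-at-Completion arithmetic described in Steps 8 through 10 (Column H hours times the Column I rate, or the Column I dollars taken directly when Column H holds "0") can be sketched as; the function names are illustrative:

```python
def task_bac(hours: float, rate_or_dollars: float) -> float:
    """Column K Budget-at-Completion for one task: labor tasks are
    hours * rate; non-labor tasks use 0 hours and dollars in Column I."""
    return rate_or_dollars if hours == 0 else hours * rate_or_dollars

def project_bac(tasks: list[tuple[float, float]]) -> float:
    """Sum the task BACs into the project Budget at Completion (cell C9)."""
    return sum(task_bac(h, r) for h, r in tasks)

# e.g. 40 hours at $50/hr plus a $2,000 non-labor purchase
print(project_bac([(40, 50.0), (0, 2000.0)]))
```

Summing the per-task BACs in this way reproduces the project-level Budget at Completion that Step 12 asks you to observe in cell C9.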

 

Step 11 - Repeat Steps 6 to 10 for all remaining Tasks in the SimplePlan.    

 

Step 12 - Go to Cell C9 and observe the total cost, labeled as Budget at Completion (BAC).    

 

Step 13 - IF the BAC is unsatisfactory, then work with those responsible for the tasks to make revisions to estimates for labor hours or dollars.

       

  • Guideline 4 - It is very important that the people who will do the work are the ones that provide the hours and dollars estimates for the project. Disagreements must be settled between the Project Manager and those that do the work. This is a key step in good project management. If an agreement cannot be reached, the Project Manager may have to request additional budget authority from the senior leaders and CFO.        

 

Step 14 - Input the earliest Start Date (Column F) in Cell C6.    

 

Step 15 - Input the latest Finish Date (Column G; last task row) in Cell C7.    

 

Step 16 - Schedule a meeting with all those responsible for project management, project control, and the performance of tasks. At the project review meeting, the Project Manager presents all tasks and estimates and seeks agreements with the task performers. Adjustments to tasks, schedules, and cost estimates are usually finalized at this meeting, or in a separate "breakout" meeting with a smaller group of task performers.    

 

Step 17 - When the project plan is finished, confirm agreements with all task performers, then submit the final SimplePlan to the appropriate higher authority for approval or revisions.

   

  • Guideline 5 - Agreements are usually confirmed by signatures on some type of Work Authorization Document (WAD). This document should become a standard document used throughout the organization. The WAD should include a reference to the SimplePlan statement of work and BAC estimates, and a place for signatures of all those responsible for task performance. The Project Manager also signs the WAD. This becomes an "internal contract" or a memorandum of understanding (MOU).    

 

Step 18 - Once the Work Authorization Document is signed, the project is approved and project work should start as planned. 

 

(End of Part 4)