Moursund, D.G. (2002). Obtaining resources for technology in education: A how-to guide for writing proposals, forming partnerships, and raising funds. Copyright (c) David Moursund, 2002.
Chapter 7: Evaluation of Formal Proposals
It helps a proposal writer to know a lot about the overall processes and rubrics used to evaluate a proposal.
Section Headings for Chapter 7
U. S. Department of Education Rubric
U.S. National Science Foundation Rubric
Dwight D. Eisenhower State-Level Projects Rubric
You can expect that a funding proposal you submit will be evaluated by well qualified proposal reviewers. It helps a proposal writer to know a lot about the overall processes used to evaluate a proposal. Many Resource Providers have well developed rubrics that they use in evaluating proposals. A proposal should be designed and written so that it communicates effectively with representatives of the Resource Provider. Moreover, it should be designed and written so that it is aligned with the evaluation rubric.
Funding Agency Processing of Proposals
In competitive grant situations, it is common for the funding agency to develop a set of evaluation criteria. Several different reviewers read and evaluate each proposal. If a large number of proposals needs to be processed, there will be many proposal reviewers. Each will read and evaluate a set of proposals. The funding agency will probably use procedures that roughly conform to the following steps.
- The funding agency's clerical staff receives and date-stamps the proposals, gives each one an identification number, and sends a postcard to the proposal submitter indicating that the proposal has been received. In some cases, proposal submitters are expected to include a self-addressed and stamped postcard, to slightly reduce the work and cost involved in processing proposals. If proposals are submitted electronically, the "receipt" is apt to be sent by email.
- The clerical staff eliminates proposals that do not satisfy basic requirements. Proposals must be received on or before a specified date, have required signatures, and have certain parts of the accompanying forms filled out.
- The clerical staff gathers some demographic information about the proposals. For example, the Program Officer might want to know the number of proposals that were submitted, where the proposals came from, the dollar amounts of the proposals, and so on.
- The clerical staff and/or Program Officers sort the qualifying proposals that have not already been eliminated into bundles that will be given to the proposal readers. The sorting may depend on the proposal readers' areas of expertise. It is relatively common for each proposal reader to handle approximately 5 to 10 or more proposals, although this number varies considerably with different funding organizations. Care is taken to avoid conflict of interest by the reviewers. Thus, for example, a reviewer is not allowed to review proposals submitted from his/her organization.
- Several reviewers read each proposal. Each reviewer fills out a review sheet (a rubric) and provides numerical ratings on several scales, an overall numerical rating, and comments. Many funding agencies strongly encourage their reviewers to provide detailed written comments.
- If the proposal reviewers have been gathered at one location, they may discuss their results on each proposal and revise their ratings before turning them in to the Program Officer.
- If the proposal reviewers have not been brought together at one location, they mail or electronically transmit their reviews.
- The clerical staff tabulates the reviewers' results and gives them to the Program Officer.
- The Program Officer conducts an initial analysis of the proposal reviewers' evaluations. This might involve the following three steps:
- Reading every proposal, at least at a superficial level. (The Program Officer has probably read the proposals while the reviewers have been doing their reviewing work.)
- Eliminating from further consideration those proposals the reviewers have uniformly ranked low, unless there is something especially appealing about the proposal that catches the Program Officer's eye.
- Asking for an additional review for proposals receiving moderately good ratings but having considerable discrepancies in their ratings. This review might be done by an outside reviewer or another Program Officer.
- The Program Officer does a second pass through the remaining proposals and decides which ones will be funded. The process varies considerably but may include one of the following:
- A rank-ordering of the proposals based strictly on their overall scores. Starting at the top and working down the list, proposals are funded until funds are depleted.
- A rank-ordering of all proposals that fall into a "good enough to fund" classification. Starting at the top of the list, each proposal is examined very carefully and a funding decision is made on each. Moving down the list, "accept" and "reject" decisions are made until all available resources have been committed. Considerable discretion is exercised by the Program Officer.
- The clerical staff and/or Program Officer communicate with all proposal submitters about the outcomes. Two steps might be taken at this point:
- In certain funding agencies it is standard practice to provide feedback to the proposal submitters. For example, the feedback might include copies of the written comments and numerical ratings given by the reviewers.
- For each proposal to be funded, detailed contracts may be negotiated with the proposal submitter, especially regarding budget items. Often this requires significant revision of the proposal.
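The rank-ordering step described above amounts to a simple greedy selection: sort proposals by overall score and make funding decisions down the list until available funds are committed. The following sketch illustrates the idea; the proposal titles, scores, and dollar amounts are invented for illustration, and real Program Officers exercise discretion rather than applying a strict rule.

```python
# Hypothetical sketch of "fund down the ranked list until funds run out."
# All scores and requested amounts below are invented for illustration.

def select_awards(proposals, budget):
    """Fund proposals in descending score order, skipping any that no
    longer fit within the remaining budget."""
    funded = []
    remaining = budget
    for title, score, amount in sorted(proposals, key=lambda p: p[1], reverse=True):
        if amount <= remaining:
            funded.append(title)
            remaining -= amount
    return funded, remaining

proposals = [
    ("District A staff development", 92, 40_000),
    ("District B curriculum project", 88, 35_000),
    ("District C pilot study", 75, 30_000),
    ("District D planning grant", 60, 10_000),
]
funded, left = select_awards(proposals, budget=80_000)
# The two highest-scoring proposals are funded; the rest exceed what remains.
```

Note that this sketch continues down the list past a proposal that is too expensive, which matches the "accept and reject decisions are made until all available resources have been committed" variant; a strict version would stop at the first proposal that cannot be funded.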
This overall review process is labor intensive, depends on individual judgments, and is not particularly precise. Indeed, it is quite common for three different reviewers to assign a proposal three different ratings on a five- or six-point scale. There is no guarantee that the best proposals are always funded.
There are ways to greatly improve the quality of the review process. For proposals requesting large amounts of money, it is common for funding agencies to carry out the following steps:
- A panel of highly qualified proposal reviewers is assembled in one location. Typically they have been provided copies of the proposals in advance (perhaps electronically), so they have had a chance to familiarize themselves with the proposals. Indeed, they may have been assigned various proposals to read prior to the face-to-face meeting.
- The panel is given instructions about the nature and purpose of the funding agency, the RFP, and the proposals being reviewed.
- The panel is given instructions on the evaluation criteria, the meaning of the different rankings, and the type of comments that will be helpful to the Program Officer and proposal submitters.
- A practice session is held in which reviewers read one or two proposals, discuss their evaluations, and gain skill in evaluating the proposals. This gives proposal reviewers insight into different ways of examining a proposal.
- Each proposal is then read by a relatively large panel--perhaps five to seven reviewers.
- After all proposals have been evaluated, panel members discuss each one. The Program Officer (who has read every proposal) usually sits in on the discussion. Each panel member provides a revised (and/or unchanged, as the case may be) rating for each proposal based on the discussion. The Program Officer uses these final ratings to select the winning proposals.
- When there is a large number of proposals, and hence a number of panels reading proposals, a norming technique may be used between panels. This addresses the problems that arise when one panel rates proposals at a much higher or lower level than another panel.
- The Program Officer carefully negotiates the final contracts to the "winning" organizations. Often this requires a considerable period of "give and take" negotiation so that the resulting project better fits the needs of the Resource Provider.
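The between-panel norming mentioned above can be done in several ways; the chapter does not specify which technique is used, so the following is only one plausible approach: convert each panel's raw scores to standard (z) scores so that a lenient panel and a strict panel end up on a comparable scale. The panel scores below are invented for illustration.

```python
# Hypothetical illustration of norming ratings across panels.
# A lenient panel and a strict panel rate different proposals; converting
# each panel's raw scores to z-scores (deviations from the panel's own
# mean, in units of its standard deviation) makes them comparable.
from statistics import mean, pstdev

def normed(scores):
    m, s = mean(scores), pstdev(scores)
    return [(x - m) / s for x in scores]

panel_a = [4.8, 4.5, 4.2]   # lenient panel's raw ratings
panel_b = [3.6, 3.3, 3.0]   # strict panel's raw ratings
# After norming, the top proposal from each panel receives the same
# relative score, even though the raw ratings differ by more than a point.
```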
Being a Proposal Reviewer
One way to learn about writing proposals is to review them. This can be done as an exercise in a proposal-writing class, or you might work to get yourself invited to serve on a proposal-review panel. If you feel that you have good qualifications to review proposals for a particular funding agency, contact one of its Program Officers and ask about being added to its pool of reviewers.
Here is an example of a request for reviewers from a funding agency:
Accessed 5/3/02: http://www.ed.gov/offices/OPE/HEP/pmit/readers.html. [Note 4/20/05: the material quoted below is no longer available at that Web address.]
Programs in the Office of Postsecondary Education continually seek experts in the field of higher education to serve as readers (review panelists) for grant competitions. Review panelists travel to Washington, DC usually for a week-long panel review and are provided with round-trip travel, lodging, meal allowances, and a daily stipend. The review process involves orientation, reading of the applications, and discussions with other panelists. Panelists' scores are based on U.S. Department of Education criteria.
If you would like to submit your name to be included among the pool of review panelists, please send HEP the following information and indicate which HEP programs meet your qualifications.
Your Name; Social Security Number; Home Address; Home Telephone; Place of Employment; Work Address; Work Telephone; E-mail address; Fax number; Résumé.
You may submit this information by e-mail to OPE_HEPReader@ed.gov or write to: Field Readers, U.S. Department of Education, Higher Education Programs, 1990 K Street, NW, Washington, DC 20006-8523; Phone (202) 502-7800; Fax (202) 502-7665 or (202) 502-7872.
Résumés will be reviewed for appropriate qualifications and eligibility (barring no apparent or actual conflicts of interest). All biographical documents sent to HEP will be maintained confidentially and entered into the Field Reader Registry for use in drawing names of prospective readers. Those persons who are not selected to evaluate applications in a given program competition will be kept in our Registry for future competitions, unless we are requested by the persons submitting the documentation to delete their names.
Qualified and eligible persons will be notified to determine their availability for one or more program competitions during a fiscal year. From the persons responding to the request for availability, prospective panelists will be selected by program officers for invitation to one or more application review sessions.
The Department of Education pays non-Federal review panelists a daily honorarium, funds permitting. For review panelists from outside the Washington, DC metro area, the Department provides pre-paid hotel accommodations and a $46 maximum daily meal allowance (per diem). It also pays in advance for travel to and from Washington at coach or economy air fare, and reimburses panelists for transportation costs to and from the home and Washington airports. Review panelists receive 75 percent of the per diem for the day of departure and the last day of travel. Panelists are responsible for any other expenses incurred.
Review panelists from the Washington metro area will receive payment for local travel costs (including parking) and the honorarium.
The mileage reimbursement rate for use of privately owned vehicles (POV) is $.365 per mile for local review panelists and for review panelists outside Washington, DC.
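The per diem rules quoted above imply a simple calculation. As a worked example (the seven-day trip is hypothetical; the $46 allowance and 75% travel-day rate come from the quoted notice):

```python
# Sketch of the meal-allowance arithmetic described in the quoted notice.
# The first and last days of travel are reimbursed at 75% of the full
# per diem; all other days at 100%.
FULL_PER_DIEM = 46.00     # maximum daily meal allowance, per the notice
TRAVEL_DAY_RATE = 0.75

def meal_allowance(total_days):
    """Total meal allowance: two travel days at 75%, the rest at 100%."""
    full_days = total_days - 2
    return full_days * FULL_PER_DIEM + 2 * TRAVEL_DAY_RATE * FULL_PER_DIEM

# A hypothetical week-long (7-day) panel review:
# 5 full days * $46.00 + 2 travel days * $34.50 = $299.00
```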
Funding Agencies like to have well qualified reviewers who may also be interested in writing proposals in future rounds of a competition. Thus, Funding Agencies often select reviewers who have the knowledge and skills to write proposals, but who have not submitted a proposal to the specific program for which they are being asked to review.
Reviewing a large number of proposals in a short period of time is hard work. Each individual reviewer's ratings and comments are quite important. Thus, you will want to do a conscientious, high-quality job.
At the same time, there are usually intense time pressures. You may be reviewing a large number of proposals one after another. By the time you reach the end of the 12th proposal, you may have trouble remembering whether a particular idea was covered in the first part of the current proposal, in the last part of the previous proposal, or in a proposal you looked at two hours ago. You have become mind-numb! (Teachers who grade term papers and/or long essays written by students have already had lots of experience in dealing with this mind-numbness situation.)
Keep these facts in mind as you write proposals. Do whatever you can to make it easier for a reviewer to accurately and adequately review your proposal under adverse working conditions. Work to design a proposal that will communicate effectively with a tired reviewer who will spend a half hour or so reading and analyzing your proposal, when your proposal may be 15-30 pages or more in length.
Evaluation Criteria (Rubrics)
Rubrics used for evaluating proposals vary considerably among different Resource Providers. This section contains three examples of such rubrics. All are rather general, not specifically focusing on Information and Communications Technology in Education.
U.S. Department of Education Rubric
The U.S. Department of Education was established on May 4, 1980 by Congress in the Department of Education Organization Act (Public Law 96-88 of October 1979). The Department's mission is to:
- Strengthen the Federal commitment to assuring access to equal educational opportunity for every individual;
- Supplement and complement the efforts of states, the local school systems and other instrumentalities of the states, the private sector, public and private nonprofit educational research institutions, community-based organizations, parents, and students to improve the quality of education;
- Encourage the increased involvement of the public, parents, and students in Federal education programs;
- Promote improvements in the quality and usefulness of education through Federally supported research, evaluation, and sharing of information;
- Improve the coordination of Federal education programs;
- Improve the management of Federal education activities; and
- Increase the accountability of Federal education programs to the President, the Congress, and the public.
An RFP from the Department of Education includes a precise statement of the evaluation criteria and the number of points assigned to each category in these criteria. Usually there are 100 points possible. However, sometimes the Department of Education will evaluate a proposal using a 110- or 115-point scale. The extra 10 or 15 points are assigned to some criteria a particular program wants to emphasize. For example, a particular RFP may indicate that up to 10 points will be awarded to proposals that address the needs of minority students in the 50 largest school districts in the country. Or, an RFP might indicate that an extra five points will be awarded to proposals that come from organizations that have a handicapped person who will be working on the project. Needless to say, unless your proposal is strong in these "extra points" areas, it has little chance of being funded.
The following Department of Education evaluation criteria identify the general components of a proposal and the weight applied to each component, based on a total of 100 points.
- Statement and importance of the problem to be solved or task to be accomplished. Contains a brief literature review if appropriate. (10 points)
- Technical soundness of methodology. Describe the methodology. Present solid, research-based evidence that the methodology being used in the project will solve the problem. (40 points)
- Plan of operation. Provide detailed information on the implementation steps that will be taken to solve the problem or accomplish the task. These must be consistent with the methodology and the research literature supporting the methodology. (10 points)
- Evaluation plan. Present a plan for the formative and summative evaluation of the project and reporting this information to the Resource Provider. (5 points)
- Quality of key personnel. (10 points)
- Adequacy of resources. Present an analysis showing that the resources being requested, when combined with those being provided by the Resource Seeker, will be adequate to solve the problem. (5 points)
- Impact. What impact will successful completion of the project have on the educational system, both now and in the future? (5 points)
- Organizational capability. Present evidence that your organization has the capabilities to successfully carry out the project. (10 points)
- Budget and budget notes. (5 points)
One of the most interesting aspects of this set of evaluation criteria is the great emphasis on methodology. The goal of the project is to solve a problem or accomplish a task. A great deal is probably already known about how to achieve that objective. The proposal must show that you are very familiar with this background knowledge. You need to show that you are building on the expertise and work of other people.
U.S. National Science Foundation Rubric
The National Science Foundation (NSF) is an independent agency of the Federal government that was established in 1950 by an Act of Congress. The agency's mission is to promote the progress of science and engineering.
The NSF puts much of its funds into pure and applied research. However, it puts a substantial amount of money into education in the areas of science, technology, engineering, and mathematics (STEM). Here the emphasis tends to be on staff and curriculum development designed to improve the quality of education that students receive.
The following material is quoted from an NSF Press Statement (2/4/02) that discusses the proposed NSF budget for Fiscal Year 2003. The material captures the "flavor" of how the NSF invests its funds. This flavor is reflected both in its RFPs and in its proposal evaluation processes. The priorities and new programs listed give a good indication of major frontiers in Science, Engineering, Mathematics, and Technology research and implementation.
The National Science Foundation's FY2003 Budget: Sustaining U.S. Leadership Across the Frontiers of Scientific Knowledge
Statement by Dr. Rita Colwell Director, National Science Foundation
It is with a sense of pride and purpose that we present the National Science Foundation's budget for the next fiscal year. It is not just a balance sheet. It is a blueprint for our nation's future.
Every year, for more than half a century, the Foundation's investments at the frontiers of discovery have enriched Americans' health, security, environment, economy, and general well-being.
And every year, the Foundation's optimal use of limited public funds has relied on two conditions: Ensuring that our research and education investments are aimed - and continuously re-aimed - at the leading edge of understanding; and certifying that every dollar goes to competitive, merit-reviewed, and time-limited awards with clear criteria for success.
When these two conditions are met, our nation gets the most intellectual and economic leverage from its research and education investments.
The National Science Foundation is requesting $5.036 billion for FY2003, $240 million or five percent more than the previous fiscal year. For the United States to stay on the leading edge of discovery and innovation, we cannot do less.
Maintaining the pace of discovery and producing the finest scientists and engineers for the twenty-first century are our principal goals. Investments proposed in the FY2003 budget are key to developing our nation's talent and increasing the productivity of our workforce. The budget includes a second installment of $200 million for the President's five year Math and Science Partnership program to link local schools with colleges and universities to improve preK-12 math and science education, train teachers, and create innovative ways to reach out to underserved students and schools.
In order to attract more of the nation's most promising students into graduate level science and engineering, we are requesting an investment of approximately $37 million to increase annual stipends for graduate fellows to $25,000. Another investment of $185 million is directed toward NSF's Learning for the 21st Century Workforce priority area. A key centerpiece includes $20 million to fund three to four new multi-disciplinary, multi-institutional Science of Learning Centers to enhance our understanding of how we learn, how we remember, and how we can best use new information technology to promote learning. As we comprehend the dynamics of human learning, we will be better able to explore how educational institutions at all levels foster or inhibit learning and how to develop more effective strategies to prepare our workforce.
Our request also includes $221 million for nanotechnology research and $286 million for information technology research. Neither area can achieve its full potential without complementary progress in the other.
The emerging field of nanoscale science and engineering -- the ability to manipulate and control matter at atomic and molecular levels - promises revolutionary breakthroughs in areas such as materials and manufacturing, medicine and healthcare, environment and energy, biotechnology and agriculture, computation and information technology, and, of course, national security.
New paradigms will use advances in quantum computation and nanoelectronics to devise radically faster computers that begin to solve problems previously dismissed as "uncomputable," such as full-scale simulations of our biosphere. Viewing cells as computational devices will help enable the design of next generation computers that feature the self-organization, self-repair, and adaptive characteristics seen in biological systems.
These and other challenges will require new mathematical tools, techniques, and insights. We propose to invest $60 million as part of a new priority area in mathematical and statistical sciences research that will ultimately advance interdisciplinary science and engineering. Only by mining and comparing enormous data sets can we find the patterns, trends, and insights needed to improve the safety and reliability of critical systems such as our telecommunications network, our electric power grid, and our air traffic control system. And only by modeling the enormous complexity of the living world can we fully understand it.
We are also requesting $10 million to seed a new priority area in the social, behavioral, and economic sciences to explore the complex interactions between new technology and society to better anticipate and prepare for their consequences.
The budget request includes $79 million for research on biocomplexity in the environment, building upon past investments in the study of the remarkable and dynamic web of interrelationships that arise when living things at all levels interact with their environment. Research in two new areas this year -- microbial genome sequencing and ecology of infectious diseases -- will help develop strategies to assess and manage the risks of infectious diseases, invasive species, and biological weapons.
The following material is quoted from the Program Announcement and Guidelines for NSF Undergraduate Education: Science, Technology, Engineering, and Mathematics. Both the proposal writers and the proposal reviewers make use of this information.
A. Performance Competence. Typical questions raised in the review process are:
- Is the proposal supported by the involvement of capable faculty (and where appropriate, student assistants), adequate facilities and resources, and institutional and departmental commitment?
- Does the proposal show an awareness of current pedagogical issues, the extent of the problems, what others have done, and relevant literature?
B. Intrinsic Merit. Typical questions raised in the review process are:
- Does the project address a major challenge facing U.S. undergraduate education?
- Are the goals and objectives, and the plans and procedures for achieving them, innovative, well developed, worthwhile, and realistic?
- Are the plans for assessing progress and evaluating the results of the project adequate?
C. Utility or Relevance of the Project. Typical questions raised in the review process are:
- Does the proposal design take into consideration the background, preparation, and experience of the target audience?
- Is the proposed course or curriculum integrated into the institution's academic program?
D. Effect on the Infrastructure of Science, Technology, Engineering, and Mathematics. Typical questions raised in the review process are:
- Are plans for dissemination and communication of the results appropriate and adequate?
- Does the proposal effectively address one or more of the following objectives:
- To ensure the highest quality of education for those students pursuing careers in science, engineering, and mathematics?
- To increase the participation of qualified women, minorities, and persons with disabilities?
- To prepare precollege teachers of science and mathematics?
- To provide a foundation for scientific and technological literacy?
- To develop multi- and interdisciplinary courses and curricula?
Reviewers assign each proposal an overall rating on the following five-point scale:
- Excellent: Probably will fall in the top 10% of proposals in this subfield; highest priority for support. This category should only be used for truly outstanding proposals.
- Very Good: Probably will fall among the top third of proposals in this subfield; should be supported.
- Good: Probably will fall in the middle third of proposals in this subfield; worthy of support.
- Fair: Probably will fall among the lowest third of proposals in this subfield.
- Poor: Proposal has serious deficiencies; should not be supported.
Here is some additional information quoted from an NSF document:
Grant Proposal Guide, NSF 03-2, Effective October 1, 2002
In the January 2002 Grant Proposal Guide (GPG), NSF published revised proposal preparation guidelines relating to completion of the Project Summary and Project Description. PIs were instructed that they must address both merit review criteria in the preparation of proposals submitted to NSF. The GPG specifies that proposers must clearly address, in separate statements within the one-page limitation, both of the NSF merit review criteria in the Project Summary. The GPG also reiterates that broader impacts resulting from the proposed project must be addressed in the Project Description and described as an integral part of the narrative. Examples illustrating activities likely to demonstrate broader impacts are available electronically on the NSF website at: (accessed 4/20/05): http://www.nsf.gov/pubs/2003/nsf032/bicexamples.pdf
This particular NSF approach to proposal evaluation is somewhat holistic in nature. It leaves the Program Officers with a reasonable amount of leeway in the final decision-making process. A Program Officer can make decisions among proposals that have somewhat similar rankings. The NSF indicates that it funds about 30% of the total number of proposals it receives each year. In the evaluation scheme given above, this means that a proposal that is ranked Excellent or Very Good is apt to be funded. Proposals rated Good or lower are not apt to be funded.
In some NSF programs, proposal reviewers are asked to assign points on a 100-point rating scheme, much like the Department of Education's evaluation system.
Dwight D. Eisenhower State-Level Projects Rubric
The Dwight D. Eisenhower program is a federal program designed to improve inservice and preservice teacher training. Funds are awarded to states in a block grant manner. An individual state then decides how the funds are to be distributed. The criteria (the program emphasis) may change from year to year. In 1994 in Oregon, some of the program's funds were awarded by competitive grants through higher education institutions.
The 1994 RFP contained both the guidelines for proposal writers and a detailed listing of the evaluation criteria to be used by proposal evaluators. This is a clear and open way to conduct a competitive grant-making process. Thus, the proposal writers were able to ensure that their proposals adequately addressed each of the evaluation criteria. There were 100 points possible. Here are the specific evaluation criteria.
- Is the problem the project addresses clearly outlined and are the needs significant? Does the proposal include data to exemplify need, including data from local education agencies who will be participating in the project? If a project is proposing to continue efforts from previous years, is evaluation data provided to document why the need still exists? (15 points)
- Has the proposed project been planned cooperatively between the institution of higher education, local school district(s), and other appropriate organizations (industry or nonprofit organizations)? Where programs are serving inservice or retraining needs of local education agencies, have the programs been tailor-made to meet the needs of the local education agencies? When addressing preservice needs, have the programs involved school district staff both in the planning and implementation stage? (15 points)
- Are the goals, objectives, and anticipated outcomes of the project clear, logical, and relevant to the identified needs? Are the project activities clearly related to the objectives? Do they show evidence that conditions will lead to the anticipated outcomes and can be accomplished within the stated time frame? Are the activities designed in line with the best practices to achieve long-term improvement in mathematics and science education? (15 points)
- Is the evaluation plan related to both the assessment of anticipated outcomes and the overall effectiveness of the program? (10 points)
- Has the project taken into account the need for greater access to and participation for students from historically underrepresented and underserved groups and gifted and talented students? (15 points)
- Does the project provide opportunities for the equitable participation of public and private school teachers? (5 points)
- Are the qualifications and responsibilities of the staff appropriate for the proposed project? (15 points)
- Has the project been designed for the widest possible dissemination? (5 points)
- Does the project budget reflect significant commitment by school districts participating in the project (e.g., contribution of staff, administrative time, use of facilities, use of equipment, teacher release time, mentor teachers, and substitute teacher contributions)? (5 points)
This presentation of the evaluation criteria and their relative weights communicates the program guidelines and gives a good indication of the Resource Provider's goals.
Some General Recommendations
If an RFP contains detailed evaluation criteria (as illustrated in the U.S. Department of Education example and the Eisenhower example), you should write your proposal so that it corresponds very closely to these criteria. Indeed, your proposal's table of contents should have main headings that correspond exactly to the evaluation criteria, because the proposal reviewers evaluate proposals according to the criteria given in the RFP. They will probably use evaluation forms on which they indicate the points awarded on each criterion. You want to make it very easy for the reviewer to see that you have adequately addressed all criteria. Moreover, you want the order of the sections in your proposal to follow the order on the evaluation form.
Almost every Resource Provider is interested in the quality of the Resource Seeker's staff and the reliability of its organization. Thus, a proposal should contain information that makes it easy for proposal evaluators to judge the qualifications and suitability of the proposed project staff and learn about the reliability and stability of the organization submitting the proposal.
Proposal reviewers usually have a high level of technical competence in the content area of the proposals they are reading. In fact, they may be leaders in the field. They may have written proposals that have been funded. (Usually an effort is made to avoid conflicts of interest between proposal submitters and proposal reviewers. Thus, someone who has submitted a proposal in a competition will not be asked to evaluate proposals in that competition.)
The high level of technical competence of the evaluators means that technical incompetence or a low level of technical competence on the part of the proposal writers probably will be detected. Or, taking a positive point of view, it means that technical competence on the part of the proposal writers is apt to be appreciated and rewarded with high marks.
Most Resource Providers want the Resource Seeker to provide some of the resources. In recent years, discussion of "matching funds" or "in-kind" contributions has become an increasingly important component of a proposal. The Resource Seeker's contributions may include facilities, volunteers, and other nonmonetary resources. Another approach is for the Resource Seeker to request a lower level of indirect or overhead support than might be expected. Often a private foundation will not provide funds for indirect expenses. Some State and Federal funding agencies severely restrict the level of indirect expenses that will be funded.
Most Resource Providers are interested both in the short- and long-term effects of the funds they provide. The Resource Provider wants to "make a difference" in the world. Thus, a proposal gains an advantage if its results will be widespread and long lasting. A plan for wide-scale dissemination of the results can strengthen a proposal.
In brief summary, put yourself into the shoes of the Resource Provider. How does the Resource Provider's organization want to invest its resources? Does a particular proposal represent a good investment with a high potential for making a significant contribution toward accomplishing the provider's goals? What can you do to strengthen your proposal in the eyes of the reviewers and the Resource Provider?
- Analyze the three proposal evaluation guidelines given in this chapter. What are some features that are common to at least two of these guidelines?
- What do you personally feel are the most important aspects of a proposal? Analyze your answer from the point of view of the evaluation guidelines provided in this chapter. Of the three proposal evaluation guidelines given, which one gives the most weight to the evaluation criteria you feel are most important?
- Obtain a copy of a proposal that has been submitted to a funding agency. (Perhaps this will be a proposal written by one of your colleagues.) Work with a small group of people as you each individually read and evaluate the proposal. Discuss the results, paying particular attention to the different strengths and weaknesses that each reader finds in the proposal.