Natural Sciences and Engineering Research Council of Canada

FAQ: Discovery Grants Competition

How do the results of the 2012 Discovery Grants competition compare to previous years?

The number of applicants to the Discovery Grants (DG) Program seems to be stabilizing. NSERC received 3,477 DG applications this year, five fewer than in the 2011 competition. The success rate in the DG Program increased in all categories over 2011, and remains above NSERC’s guidelines. Specifically, the success rate for established researchers applying to renew a grant increased from 74 percent to 78 percent. The success rate for established researchers not holding a grant also increased—from 33 percent to 36 percent. Early career researchers went from a success rate of 54 percent to 62 percent—well above NSERC’s target of at least 50 percent. The total number of awards increased from 2,002 to 2,161. More details are available in the statistics package.
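As a rough arithmetic check, the overall figures quoted above can be combined directly (a sketch only; the per-category success rates in the statistics package are computed within each applicant category, not from these totals):

```python
# Overall 2012 figures quoted above; illustrative arithmetic only.
applications = 3477  # DG applications received in 2012
awards = 2161        # total number of awards

# Overall success rate across all categories combined.
overall_rate = 100 * awards / applications
print(f"Overall success rate: {overall_rate:.1f}%")  # about 62.2%
```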


Why does NSERC separate the rating process from the funding recommendation?

This practice removes an applicant’s previous granting history as a factor in determining funding, allowing reviewers to judge an application on its current merits. It is a key step in making the system more objective and better at rewarding researchers who make outstanding contributions, and it is a best practice adopted by most funding agencies around the world.


Who decides grant levels for successful applicants?

After applications have been evaluated against the three Discovery Grants selection criteria, they are placed into quality bins based on their ratings. Following this process, the Executive Committee of each Evaluation Group (EG) independently recommends the appropriate level of funding to assign to each bin, in consultation with NSERC staff. Each Executive Committee is composed of the Group Chair and Section Chairs of the EG, who have themselves participated in the evaluation process. The Executive Committees have the important and challenging task of recommending an appropriate balance between the number of funded applicants and average grant sizes, while remaining within the available budget. The funding decision ultimately rests with NSERC, but decisions are made with the full engagement of the Executive Committees.


What other factors influenced success rates and grant sizes in the 2012 competition?

Funding is allocated on the basis of merit, so both the number of applicants in the competition and their records of accomplishment affect the success rate. For early career researchers, NSERC strives to achieve a minimum success rate of 50 percent, given the importance of giving these applicants an opportunity to launch their research programs.


Why do the average grant levels vary from one Evaluation Group to another?

Grant levels are intended to reflect the relative costs of research among disciplines. Current levels are based largely on historical values and were influenced by past reallocation exercises. NSERC has asked the Council of Canadian Academies (CCA) to conduct an assessment of performance indicators for basic research, and intends to adopt a new methodology once the assessment report provides indicators, metrics and suggestions on how to do so. The CCA report is expected to be released in mid-2012. NSERC will also incorporate feedback from the research community and other sources when identifying the best mechanisms and indicators to use in future decisions about budget allocations.


Has NSERC made changes to the process since implementing it in 2009?

NSERC has closely monitored the results of implementing these new processes. Small adjustments have already been made in response to the first two competitions and NSERC will continue to refine the system.


Why does NSERC place so much emphasis on the selection criteria related to training highly qualified personnel (HQP)?

Training HQP has always been an important element in NSERC’s assessment of the merit of an application. Indeed, universities and professors consistently highlight the students they train as their main contribution to research and innovation.


What is the distribution of applications by “quality bin”?

As a result of peer review, applications are placed in 16 “quality bins” based on their merit against the three selection criteria—Excellence of Researcher (EoR), Merit of Proposal (MoP), and Contribution to the Training of Highly Qualified Personnel (HQP). The figure below shows the distribution of applications for Early Career Researchers (ECR),* Established Researchers Renewing their grant (ER-R), and Established Researchers Not Holding a Grant at the time of application (ER-NHG) between 2009 and 2011.

Budget permitting, NSERC aims to support Established Researchers* down to Bin J (which corresponds to ratings of Strong on all three criteria, or equivalent) and ECRs down to Bin K or Bin L.

Figure 1. Distribution of applications by Quality Bin for each applicant category

* Early Career Researchers are applicants who are within two years of the start date of their first eligible position at the university and who have no prior academic or non-academic independent research experience. All other applicants are considered Established Researchers.


How does the peer review process treat applications from early career researchers?

NSERC is committed to supporting early career researchers (ECRs)* who have the training and expertise to make valuable research contributions, and considers it important to allow ECRs to demonstrate their potential for quality contributions to research and training.

While applications from ECRs are evaluated against the same three selection criteria, NSERC recognizes that these applicants may not have had the opportunity to make the same level of contributions to research or training as established researchers, and allows for that by having a different quality cut-off for ECRs. NSERC aims to support at least 50 percent of early career applicants, subject to the assurance of high quality. In fact, NSERC has consistently exceeded this target. NSERC currently devotes $7.5 million per year that it received through Budget 2011 to supplement the Discovery Grants of ECRs. Through consultations, this group was identified as the one that would most benefit from additional resources, kick-starting their research by allowing them to assemble strong teams early and establish high-quality programs.

* Early Career Researchers are applicants who are within two years of the start date of their first eligible position at the university and who have no prior academic or non-academic independent research experience. All other applicants are considered Established Researchers.


Do small universities face a challenge in relation to the HQP criterion?

NSERC analyzed the rating patterns of the applications placed in Bin K in the 2011 competition; Bin K is usually the first bin not funded for Established Researchers and generally corresponds to two ratings of Strong and one of Moderate. Other combinations, such as a Very Strong, a Strong and an Insufficient, or two Moderates and a Very Strong, make up the rest of the cases.

Figure 2. Percentage of occurrences of various combinations of applications from large, medium and small universities

MSS: Moderate for Excellence of Researcher (EoR), Strong for Merit of Proposal (MoP), Strong for Contribution to Training of HQP
SMS: Strong for EoR, Moderate for MoP, Strong for Contribution to Training of HQP
SSM: Strong for EoR, Strong for MoP, Moderate for Contribution to Training of HQP

For all university sizes, the most frequent rating combination was SMS (i.e., Strong for EoR, Moderate for MoP, Strong for Contribution to Training of HQP). Half of the applications landed in Bin K due to a lower rating on the MoP criterion.
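Tallying rating combinations of this kind is mechanical. A minimal sketch, using hypothetical rating data (the function names and the sample applications are illustrative, not part of NSERC’s actual tooling or results):

```python
from collections import Counter

# Map each rating to the single-letter code used in the figure legend.
RATING_CODES = {"Very Strong": "V", "Strong": "S", "Moderate": "M", "Insufficient": "I"}

def combination(eor, mop, hqp):
    """Collapse the three criterion ratings (EoR, MoP, HQP) into a code like 'SMS'."""
    return "".join(RATING_CODES[r] for r in (eor, mop, hqp))

def tally(applications):
    """Count how often each rating combination occurs."""
    return Counter(combination(*app) for app in applications)

# Hypothetical data, not actual competition results:
apps = [
    ("Strong", "Moderate", "Strong"),   # SMS
    ("Strong", "Moderate", "Strong"),   # SMS
    ("Moderate", "Strong", "Strong"),   # MSS
    ("Strong", "Strong", "Moderate"),   # SSM
]
counts = tally(apps)
print(counts.most_common(1))  # [('SMS', 2)]
```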

