Methodology

The following describes the methodology used to develop the Canadian Global Health Equity in Biomedical Research Report Card. The development of the methodology was overseen by a student committee and was based on metrics that meet defined criteria, together with the advice of colleagues working in global health.

All elements of the evaluation, including selection of the universities, selection of metrics, data collection, scoring, and grading, were conducted by the UAEM Report Card Student Team between June 2016 and July 2017.

The full methodology is available for download: Canadian UAEM 2016-2017 Global Equity in Biomedical Research Methodology.

SELECTION OF UNIVERSITIES

The Canadian iteration specifically evaluates the member institutions of the U15. The U15 is a non-governmental organization that represents the interests of its member universities to the Canadian government on matters of research and development. Collectively, the members of the U15 represent 47 percent of all university students in Canada, 71 percent of all full-time doctoral students in the country, 87 percent of all contracted private-sector research in Canada, and 80 percent of all patents and startups in Canada. These figures can be found at http://u15.ca/our-impact.

SELECTION OF EVALUATION METRICS

To provide a comprehensive overview of university commitment to global health research, the Canadian Report Card measures 29 performance indicators in 4 general categories.

The specific metrics were selected on the basis of the following criteria:

  • Significance as indicators of global health research
  • Availability of standardized data sources for all evaluated institutions
  • Consistent data and comparability across evaluated institutions
  • Ability of evaluated institutions to concretely improve performance on these metrics
  • Diverse range of indicators measuring policy and implementation

INNOVATION

I-Q1: What percentage of the University’s total funding received from CIHR, NSERC, and the Gates Foundation is dedicated to global health research, training and collaborations?

I-Q2: What percentage of the university’s total biomedical research funding received from CIHR, NSERC, and the Gates Foundation is devoted to projects focused on neglected diseases (NDs), neglected aspects of HIV/AIDS, tuberculosis (TB), and malaria, and/or antimicrobial resistance (AMR)?

I-Q3: What percentage of the university’s total medical PubMed publications are focused on global health?

I-Q4: What percentage of the university’s total medical PubMed publications are focused on neglected diseases, neglected aspects of HIV, TB, malaria, antimicrobial resistance, and/or access to medicines in low- and middle-income countries?

I-Q5: Does the university have a research center or institute dedicated specifically to neglected diseases and/or neglected aspects of HIV/AIDS, TB, malaria, or AMR?

I-Q6: How many Grand Challenges Canada grants has the university been awarded between FY2013 and the present?

I-Q7: In the wake of the current Zika epidemic, how has your institution responded to the lack of innovation that currently exists for prevention, diagnosis and/or treatment of this disease?

I-Q8: Is any of the university’s medical research being done in collaboration with, funded by, or driven by alternative models for research and development (e.g., drug discovery and data-sharing platforms, prizes, philanthropy for drug discovery, drug patent pools, public-private partnerships, etc.)?

I-Q9: Is the university currently engaged in or supporting research on Canadian drug pricing mechanisms to ensure equitable access to affordable medicines, or has research in this area been carried out in the past 2 years?

I-Q10: Is the university currently in one or more partnerships with a pharmaceutical corporation either via a specific research project, lab, center, initiative, or other model?

ACCESS

A-Q1:

  • Part A: Has the university officially and publicly committed to licensing its medical discoveries in ways that promote access and affordability for resource-limited populations?
  • Part B: Does the website of the university’s technology transfer office (TTO) make an effort to disclose, explain and promote access licensing commitments and practices?

A-Q2:

  • Part A: Has the university adopted or implemented a policy statement regarding open access publications?
  • Part B: Does the university provide support for open access publishing?
  • Part C: What percentage of the university’s total medical sciences publication output is published in open access publications?

A-Q3:

  • Part A: In the past year, what percentage of the university’s total research licenses were non-exclusive?
  • Part B: In the past year, what percentage of the university’s health technology licenses were non-exclusive?

A-Q4: In the past year, for what percentage of all health technologies did the university seek patents in low- and middle-income countries where they may restrict access?

  • Part A: upper-middle-income countries (including Brazil, Russia, India, China, and South Africa), as defined by the World Bank
  • Part B: low- and lower-middle-income countries, as defined by the World Bank

A-Q5: Access Provisions in Exclusive Licenses

  • Part A: In the past year, what percentage of the university’s exclusive licenses of health technologies included provisions to promote access to those technologies in low- and middle-income countries?
  • Part B: What percentage of those access provisions included the biggest low- and middle-income economies (Brazil, Russia, India, China or South Africa) in their scope?
  • Part C: In the past year, what percentage of the university’s exclusive licenses of health technologies included provisions to promote access to those technologies in high-income countries?

A-Q6: Has the university shared its best practices for promoting access to medicines through licensing?

A-Q7: Has the university publicly acknowledged the existence/effectiveness of alternative models of research and development as being important to ensuring access to medical innovation?

A-Q8: Between January 2006 and December 2014, what percentage of completed clinical trials registered on ClinicalTrials.gov and conducted by the university had their data shared as summary results on ClinicalTrials.gov or PubMed?

EMPOWERMENT

E-Q1: Does the university offer its students access to global health engagement and/or education?

  • Part A: As indicated by the existence of a university center/institute, department, and/or non-degree program in global health.
  • Part B: As indicated by the existence of a university graduate degree, major/concentration, focus/specialization, certificate, or undergraduate degree in global health.

E-Q2: Does the university offer graduate courses that address the policy and legal context of biomedical R&D, and more specifically the impact of intellectual property policies, on research priorities and global access to medical innovations?

E-Q3: Does the university offer graduate courses that address the prevalence of and/or lack of research on neglected diseases, including neglected aspects of HIV, TB, and/or malaria?

E-Q4: Has the university hosted a major conference, symposium or campus-wide event in the last 12 months on:

  • Part A: the policy and legal context of biomedical R&D, specifically the impact of intellectual property rights on research priorities and global access to medical innovations?
  • Part B: neglected diseases, including neglected aspects of HIV, TB, and/or malaria, and the health needs of low- and middle-income countries?
  • Part C: drug pricing in Canada and/or in other high-income countries?

E-Q5: Does the university offer any of its students accessible opportunities to study, work, or complete research abroad in global health?

E-Q6: Is the university formally involved in a global health partnership with one or more universities based in low- and middle-income countries?

E-Q7: Does the university offer any of its students an opportunity to learn more about alternative models for research and development through courses, workshops, or other opportunities?

TRANSPARENCY

T-Q1: How responsive was the university’s Technology Transfer Office (TTO) to emails from UAEM regarding the Innovation and Access section surveys?

T-Q2:

  • Part A: For questions relying on public data (CATEGORY 1) in the Access section, was sufficient information available online?
  • Part B: For questions relying on public data (CATEGORY 1) in the Innovation section, was sufficient information available online?
  • Part C: For questions relying on public data (CATEGORY 1) in the Empowerment section, was sufficient information available online?

T-Q3: How much discrepancy exists between university responses in the submitted forms and the data collected internally by UAEM from publicly available sources for Category 1 and 2 questions?

T-Q4: Does the university have clear conflict-of-interest guidelines delineated for partnerships with industry that involve commercial interests?

While we acknowledge there will be variation across universities selected for evaluation (e.g. in levels of research funding, student body size), we also recognize that these institutions are public universities. This homogeneity among Canadian universities will allow for more direct comparisons than would be possible with a mix of public and private institutions. Regardless, UAEM has selected evaluation criteria intended to minimize the impact of any variations that may arise.

Importantly, all metrics that analyze continuous variables account for variation in school size and funding by normalizing the absolute number to the overall level of combined CIHR, NSERC, and Gates Foundation funding. For example, when evaluating a university’s investment in research on neglected diseases (NDs), antimicrobial resistance (AMR), and neglected aspects of HIV/TB/malaria, our metric is calculated by dividing a given institution’s overall medical research funding devoted to ND and related research projects (from the more than 100 funding sources included in the G-Finder report) by its total CIHR + NSERC + Gates funding to generate an “ND Innovation Index”. This adjusts for confounding by institutional size and allows for a meaningful comparison of performance across institutions.
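As a rough illustration only, the Python sketch below shows how such a normalized index is computed under the approach described above. The function name and all funding figures are hypothetical assumptions for the sake of the example; they are not drawn from the Report Card dataset or from any actual university.

```python
# Illustrative sketch only: hypothetical figures, not actual university data.
# The index normalizes ND-related research funding by the combined total of
# CIHR + NSERC + Gates Foundation funding, as described above.

def nd_innovation_index(nd_funding: float, cihr: float, nserc: float, gates: float) -> float:
    """Return ND-related funding as a fraction of combined CIHR + NSERC + Gates funding."""
    total = cihr + nserc + gates
    return nd_funding / total

# Example with made-up numbers (millions of dollars): $4M of ND-related research
# funding at an institution receiving $80M (CIHR) + $60M (NSERC) + $10M (Gates).
index = nd_innovation_index(nd_funding=4.0, cihr=80.0, nserc=60.0, gates=10.0)
print(f"ND Innovation Index: {index:.3f}")  # about 0.027, i.e. roughly 2.7% of total funding
```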

For categorical metrics, we have developed pre-defined sets of discrete categories by which all universities can be uniformly evaluated, and for which performance is likely to be independent of variation in university size, funding, capacity or resources.

DATA SOURCES AND COLLECTION

A critical aspect of the Report Card methodology is the collection and analysis of data using two broad categories of data extraction:

  1. Data obtained by accessing publicly available sources, such as university websites, online grant databases, and search engines; these data are collected by UAEM members, staff, and interns.
  2. Data obtained by self-report of university officials in response to survey instruments designed and provided by UAEM.

We attempt to maintain rigor and minimize bias by systematically collecting and analyzing data according to detailed, predetermined standard operating procedures (SOPs). For CATEGORY 1 (PUBLIC DATA), we address data quality and consistency as follows:

  • We prospectively developed SOPs and standardized data entry forms, including uniform search terms to which all investigators are required to adhere.
  • We performed quality control tests to ensure that investigators were obtaining the same results from the collection procedures.
  • Where possible, multiple individual investigators independently and concurrently perform the same data collection and search processes to ensure consistency of data.

For CATEGORY 2 (SELF-REPORTED DATA), we address data quality and consistency, including concerns about questionnaire non-response, as follows:

  • Compared to the first iteration of the Report Card, we reduced the number of questions asked of administrators in cases where answers could be easily verified via public sources by our team of investigators.
  • We provide the same questionnaires to all institutions.
  • We have developed a standardized process for identifying and verifying contacts to receive questionnaires at each institution.
  • We identified between 5 and 10 specific administrators in leadership positions at each university whom we felt were most likely to recognize the value of the surveys and to encourage a response from within their teams. Contact details were first sought on each university’s public website; if not found there, they were obtained via the internal site or through students at those institutions, and phone calls were made when contact details could not be ascertained by these means. The list includes, but is not limited to, directors of technology licensing offices, deans of individual schools (law, public health, medicine), and vice presidents for research.
  • We use standardized communication strategies to deliver the survey instruments to all institutions and conduct consistent follow-up via e-mail; institutions are given at least one month to respond to all survey instruments, and each administrator is contacted a minimum of three times to encourage a response.
  • Where possible, we have asked questions in a manner such that the variable under question is categorical, rather than continuous; this is in an effort to maximize the likelihood of response from institutions.
  • We apply standardized scoring of responses across all institutions.
  • We measure and report response rates both for the entire questionnaire and for individual questions.
  • If more than one person per institution replies and there is a discrepancy in the responses, we first aim to verify the correct answer via verified public sources. If this is not possible, we use the answer that favors the university.

We also review the grading from past years for included universities to make sure that we capture all previously recorded data.

SCORING AND GRADING

As in previous iterations and given the purpose of the Report Card, greater weight is allocated to the Innovation and Access sections, with each accounting for 35% of the total grade. The Empowerment section is worth 20% of the total grade, reflecting the greater challenges in evaluating these specific metrics and the lack of a measurable correlation between them and increased access to medicines or attention to neglected diseases in low- and middle-income countries. Finally, the newly added Transparency section is worth 10% of the total grade, since we believe that open and collaborative biomedical research is essential to ensuring access and innovation for all.

For each question, the institution is assigned a raw score from 0 to 5, based on the data that is gathered. Each question is also associated with a weighting multiplier from 0.25 to 2.5, based on the relative importance of each question as determined by UAEM’s report card team. The weighted score for a given question is the product of the raw score and the weighting multiplier. To minimize bias due to non-response to CATEGORY 2 (self-reported) questions, we have designed the Report Card such that each section is a mix of CATEGORY 1 (public data) and CATEGORY 2 (self-reported) questions.
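As a rough illustration of how raw scores, weighting multipliers, and section weights combine, the Python sketch below uses entirely hypothetical raw scores and multipliers; the function names and figures are illustrative assumptions rather than UAEM’s actual grading data or tooling.

```python
# Illustrative sketch only: hypothetical raw scores and multipliers, not UAEM's
# actual grading data. It shows how a weighted question score is the product of
# the 0-5 raw score and the 0.25-2.5 multiplier, and how section scores combine
# under the 35/35/20/10 section weighting described above.

SECTION_WEIGHTS = {"Innovation": 0.35, "Access": 0.35, "Empowerment": 0.20, "Transparency": 0.10}

def section_score(questions):
    """Return a section's score as a fraction of its maximum possible weighted score."""
    earned = sum(raw * mult for raw, mult in questions)
    maximum = sum(5 * mult for _, mult in questions)
    return earned / maximum

def overall_grade(sections):
    """Combine per-section fractions into an overall percentage using the section weights."""
    return 100 * sum(SECTION_WEIGHTS[name] * score for name, score in sections.items())

# Hypothetical example: (raw score, weighting multiplier) pairs for each section.
sections = {
    "Innovation":   section_score([(4, 2.5), (3, 1.0), (5, 0.5)]),
    "Access":       section_score([(2, 2.0), (4, 1.5)]),
    "Empowerment":  section_score([(5, 1.0), (3, 0.25)]),
    "Transparency": section_score([(4, 1.0)]),
}
print(f"Overall grade: {overall_grade(sections):.1f}%")
```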