This page provides responses to the most frequently asked questions about the Global Equity in Biomedical Research project. If you do not find an answer to your question here, you are welcome to use the form below to contact us.

      1. Who produced this Canadian Report Card?

      2. Where does the data used in the Report Card come from?

      3. What are the main publicly available data sources used in the Canadian Report Card?

      4. Why does the Canadian Report Card use PubMed as a publication database?

      5. How did the Report Card collect data from universities for metrics that relied partially or wholly on self-reported information?

      6. How does the Report Card fairly evaluate universities with varying sizes and research budgets?

      7. How has the Report Card defined certain terms?


Who produced this Canadian Report Card?

This project was conceived, developed, and produced by Universities Allied for Essential Medicines (UAEM), an international nonprofit organization of graduate and undergraduate students in medicine, research, law, and related fields. Students from a wide range of Canadian institutions, including many of those evaluated by the Report Card, contributed to the project. Research and analysis for the Canadian iteration were conducted from mid-2016 to 2017.

The UAEM 2017 Report Card Team:

Varoon Mathur

Varoon attended Queen’s University in Canada, where he studied Neurobiology and Biochemistry, and then pursued his Master’s in Computer Science at the University of British Columbia. He has been involved with UAEM since founding the Queen’s chapter in 2012 and joined the Coordinating Committee in February 2013. Varoon is primarily interested in the intersection of medical innovation and access licensing strategies in the context of global health, and is part of the Report Card team for UAEM North America. He was the lead on our Canadian Report Card, released in Fall 2017.


Catherine Whicher

Currently pursuing a master’s degree in Global Health Policy at the London School of Hygiene and Tropical Medicine, Catherine has long had a passion for health equity, and for access to medicines in particular. While studying for a Bachelor’s degree in Chemistry at the University of British Columbia, she was a member of Friends of Médecins Sans Frontières, leading the chapter in her final year. There she collaborated with UAEM on a campaign to protect generic drug producers from international trade deals with unprecedentedly restrictive intellectual property provisions. She hopes that the 2017 Canadian Report Card will highlight the many opportunities Canadian schools have to lead in global health efforts and improve the well-being of millions worldwide.


Michael Lee

Michael Lee is in his fourth year at Western University, pursuing a bachelor’s degree in neuroscience. Two years ago, he began leading the UAEM access campaign at Western, advocating for the adoption of a global access licensing policy. His work focuses primarily on alleviating the access-to-medicines crisis by influencing drug patent and licensing policies.


Yifei Wang

Yifei Wang is in his fourth year of Life Sciences at Queen’s University, specializing in drug development and toxicology. A year ago, he joined the Queen’s UAEM chapter as an academic co-director to educate students about the global health impact of academic institutions. Since then, he has organized campus workshops on topics such as universal pharmacare, the global antibiotic crisis, and conflicts of interest in partnerships between big pharma and medical schools.



Where does the data used in the Report Card come from?

As detailed in the methodology, this evaluation is based on a combination of metrics derived from (a) publicly available information and (b) self-reported data from evaluated institutions.

To promote fair evaluation and methodological rigor, we used standardized, authoritative, publicly accessible data sources for as many metrics as possible. Self-reported data were sought only for metrics for which public information was limited or inconsistent.

What are the main publicly available data sources used in the Canadian Report Card?

The most significant sources of publicly available data used in this evaluation are:

        • Canadian Institutes of Health Research – Funding Decisions Data
        • Natural Sciences and Engineering Research Council of Canada – Awards Database
        • Bill & Melinda Gates Foundation – Grants Database
        • Grand Challenges Canada – Innovations and Results
        • AllTrials/EBM DataLab – TrialsTracker
        • ResIn – Research Investments in Global Health Database
        • G-FINDER (Global Funding of Innovation for Neglected Diseases) Public Search Tool
        • University websites, technology transfer office websites
        • PubMed and PubMed Central


Why does the Canadian Report Card use PubMed as a publication database?

We chose PubMed because it is a comprehensive resource commonly used by researchers in the health field, because it is easy to use, and because its features allow us to download large amounts of data efficiently. Using more than one publication database would have meant duplication of work and greater room for error. Comparing publication sets between PubMed and PubMed Central allowed us to readily analyse how much of a university’s health-related research output is freely available online.
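To illustrate the comparison described above, the share of a university's PubMed-indexed output that is freely available can be sketched as a simple ratio. The function name and the counts below are hypothetical, for illustration only; the actual analysis was run against downloaded PubMed and PubMed Central publication sets.

```python
def open_access_share(pubmed_count: int, pmc_count: int) -> float:
    """Fraction of a university's PubMed-indexed health research output
    that also appears in PubMed Central (i.e., is freely available).
    Counts are hypothetical illustrations, not Report Card data."""
    if pubmed_count == 0:
        return 0.0
    return pmc_count / pubmed_count

# Hypothetical example: 1,200 PubMed records, of which 540 are in PMC
share = open_access_share(1200, 540)
print(f"{share:.0%}")  # 45%
```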

How did the Report Card collect data from universities for metrics that relied partially or wholly on self-reported information?

For the Access, Empowerment, and Innovation sections, an online questionnaire was developed with SurveyGizmo and emailed to the technology transfer office (TTO) officials, administrators, faculty, and staff best suited to provide the data. TTOs were contacted at least three times by email and twice by telephone over a 12-week period beginning March 1, 2017.

How does the Report Card fairly evaluate universities with varying sizes and research budgets?

While we acknowledge that there will be variation across the universities selected for evaluation (e.g., in levels of research funding and student body size), we also note that all of these institutions are public universities. This relative homogeneity among Canadian universities allows for more direct comparisons than would be possible with a mix of public and private institutions. In addition, UAEM selected evaluation criteria intended to minimize the impact of any remaining variation.

Importantly, all metrics that analyze continuous variables account for variation in school size and funding by normalizing the absolute number to the combined level of CIHR, NSERC, and Gates Foundation funding. For example, when evaluating a university’s investment in neglected disease (ND) research, antimicrobial resistance (AMR), and neglected aspects of HIV/TB/malaria, our metric divides a given institution’s medical research funding devoted to ND and related research projects (from the >100 funding sources included in the G-FINDER report) by its total CIHR + NSERC + Gates funding to generate an “ND Innovation Index”. This adjusts for confounding by institutional size and allows a meaningful comparison of performance across institutions.
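The normalization above can be sketched as follows. All dollar figures in the example are hypothetical, chosen only to show how the index is computed; they are not Report Card data.

```python
def nd_innovation_index(nd_funding: float, cihr: float,
                        nserc: float, gates: float) -> float:
    """Normalize an institution's neglected-disease (ND) research funding
    by its combined CIHR + NSERC + Gates Foundation funding.
    All inputs are hypothetical dollar amounts, for illustration only."""
    total_funding = cihr + nserc + gates
    return nd_funding / total_funding

# Hypothetical institution: $2M in ND funding against $100M total
# ($60M CIHR + $30M NSERC + $10M Gates)
index = nd_innovation_index(2_000_000, 60_000_000, 30_000_000, 10_000_000)
print(round(index, 3))  # 0.02
```

Because the index is a ratio, a small school with modest absolute ND funding can score as well as, or better than, a large school, which is the point of the size adjustment.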

For categorical metrics, we have developed pre-defined sets of discrete categories by which all universities can be uniformly evaluated, and for which performance is likely to be independent of variation in university size, funding, capacity or resources.

How has the Report Card defined certain terms?

For the purposes of this iteration of the UAEM Report Card, “neglected diseases” (NDs) are defined as diseases that disproportionately affect low- and middle-income countries. Our list of NDs and research areas was based on the criteria set by the G-FINDER 2014 survey on global neglected disease innovation funding and the World Health Organization’s list of recognized neglected tropical diseases. The scope of the included research areas was further narrowed with restrictions on subject matter and application. Notably, this definition of ND includes Ebola, Zika, antimicrobial resistance, AIDS, HIV, tuberculosis, malaria, diarrheal diseases, meningitis, and pneumonia; however, for several of these diseases we applied substantial restrictions so as to include only those aspects or subsets that are truly neglected. For example, we did not include all research on HIV, only research pertaining to pediatric HIV, HIV diagnostics, microbicides, and vaccines.

With regard to “Alternative Research and Development (R&D)”, our working definition for this iteration of the UAEM Report Card is derived specifically from the inclusion criteria developed and used in the UAEM Re:Route Report. Alternative biomedical research initiatives must apply de-linkage plus one or more of the following innovative mechanisms:

  • a pull mechanism
  • a push mechanism
  • pooled funding and/or an IP pooling mechanism
  • broad collaboration
  • open approaches to R&D (open source, open data sharing, open innovation)