Wednesday, June 2, 2010

US agencies roll out a new system for assessing research impact

Yesterday saw the announcement of “STAR METRICS: New Way to Measure the Impact of Federally Funded Research”, which aims to monitor the impact of federal investments in science on employment, knowledge generation, and health outcomes.
“The initiative—Science and Technology for America’s Reinvestment: Measuring the Effect of Research on Innovation, Competitiveness and Science, or STAR METRICS—is a multi-agency venture led by the National Institutes of Health, the National Science Foundation (NSF), and the White House Office of Science and Technology Policy (OSTP).”
Rolling out the program has an estimated cost of US$ 1 million in the first year. This is one of the first differences from national initiatives here, which usually try to implement evaluation systems without allocating dedicated funds.
The announcement includes a brief outline of the system:
“Information will be gathered from the universities in a highly automated way, with minimal or no burden for the scientists and the university administration. ...
There are two-phases to the program. The first phase will use university administrative records to calculate the employment impact of federal science spending through the American Recovery and Reinvestment Act and agencies' existing budgets. The second phase will measure the impact of science investment in four key areas:
Economic growth will be measured through indicators such as patents and business start-ups. Workforce outcomes will be measured by student mobility into the workforce and employment markers. Scientific knowledge will be measured through publications and citations. Social outcomes will be measured by long-term health and environmental impact of funding.”
More information is already available (here).
I took a bird's-eye look. See below the proposed contents of the PIs' annual reports. Soon they will be filling out the Lattes CV….
“Draft: FOR DISCUSSION Reports to Principal Investigators: BioSketches and Annual/Final Reports
Must create automated reporting that responds to standardized federal biosketches (Research Business Models) 
Must create flexibility to add personal discussions 
Must be subsettable to a specific award, a specific timeframe, or full set of achievements
External Awards (source: administrative records) 
  1. From agency(ies) 
  2. Year and length of awards 
  3. Award Types (individual PI, group PI, center, 1 year, new PI, etc)
Publications to date (source: webscraping; requirement: PI confirmation of newly identified publications/communications)
Training and Professional Development
a. Graduate student names, time contributed (source: HR records; requirement: informed consent)
b. Undergraduate student names, time contributed (source: HR records; requirement: informed consent and PI identification if non-paid student)
  1. Presentations (poster/talks) (source: webscraping; requirement: PI confirmation) 
  2. Citations (source: webscraping, ISI, Scopus, PubMed etc.)
Other products
Patents (source: PTO; requirement: PI confirmation)
Patent Applications (source: PTO; requirement: PI confirmation)
Websites created (source: webscraping)
Federal Review boards/committees
Collaborators – this should be a component under Dissemination and/or Training and Professional Development. 
Other outcomes (engagement of community critical to develop these)
a. Health
b. Equity
c. Safety
d. Security
e. Infrastructure
f. Environment
Additional Possible Features
Visualization of collaboration network(s)
Visualization of citation network
Map of citations
Possible approach
Step 1: Using preexisting data and frame
  1. Create frame of individuals receiving science funding. This frame should
     a. Initially use only publicly available data on PIs and coPIs
     b. Go back as far as possible (ten year window target: 2000 – 2010)
     c. Subsequently be updated on a monthly basis from administrative data from universities and funding agencies
  2. Provide initial match to outcomes using existing data on patents, patent applications and citations, which allows for location and discipline specific outcomes. This would build on existing work prototyped by both science agencies and academic researchers.
  3. Create prototype reports for principal investigators, universities and science agencies (see below). All should use the following approaches
     a. Use the existing frame
     b. Be structured to be updated on a monthly basis
     c. Provide visually interesting summaries and updates
     d. Provide useful and relevant information as determined by users through feedback on reports (format, content, and timing) and reported and future metrics.
  4. Post prototypes on data.gov (initially on restricted access site)
Step 2: Expanding the data sources
  1. Update match to outcomes on flow basis using webscraping technologies and administrative data
Step 3: Expanding the user base and engaging the community
  1. Create openly accessible data enclave for researcher community
  2. Announce the existence of the enclave and create prize for best report in each of the four categories (innocentive type competition)
  3. Update data.gov with best reports as rated by the user community”
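The three-step pipeline in the draft can be sketched in miniature. Everything below (PI names, award and outcome records, matching by exact name) is invented for illustration; a real system would need record linkage and name disambiguation across administrative sources, not exact-string matching.

```python
# Hypothetical sketch of the STAR METRICS "possible approach":
# build a frame of funded PIs, match outcomes to the frame,
# and produce a subsettable per-PI report.
from dataclasses import dataclass

@dataclass
class Award:
    pi: str
    agency: str
    year: int

@dataclass
class Outcome:
    pi: str
    kind: str    # e.g. "patent", "publication"
    year: int
    title: str

def build_frame(awards):
    """Step 1: frame of individuals receiving science funding (2000-2010 window)."""
    return {a.pi for a in awards if 2000 <= a.year <= 2010}

def match_outcomes(frame, outcomes):
    """Initial match of outcome records to the PI frame (here: by exact name)."""
    matched = {}
    for o in outcomes:
        if o.pi in frame:
            matched.setdefault(o.pi, []).append(o)
    return matched

def report(matched, pi, kind=None, start=None, end=None):
    """Prototype PI report, subsettable by outcome type and timeframe."""
    rows = matched.get(pi, [])
    if kind is not None:
        rows = [o for o in rows if o.kind == kind]
    if start is not None:
        rows = [o for o in rows if o.year >= start]
    if end is not None:
        rows = [o for o in rows if o.year <= end]
    return rows

# Toy data: B. Jones' award predates the target window and is excluded.
awards = [Award("A. Silva", "NSF", 2004), Award("B. Jones", "NIH", 1998)]
outcomes = [
    Outcome("A. Silva", "patent", 2006, "Sensor design"),
    Outcome("A. Silva", "publication", 2008, "Sensor paper"),
    Outcome("B. Jones", "patent", 2001, "Old patent"),
]

frame = build_frame(awards)
matched = match_outcomes(frame, outcomes)
print([o.title for o in report(matched, "A. Silva", kind="patent")])
# → ['Sensor design']
```

The subsetting arguments mirror the draft's requirement that reports be restrictable to a specific award type or timeframe; monthly updates would simply re-run `build_frame` and `match_outcomes` over refreshed administrative data.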
