2019 Metascience Symposium

09 Sep 2023 - 09 Nov 2025
    • Notes from a report to my then-employer, PICI
    • Takeaway

    • I attended a couple of days at Metascience 2019. This was a free conference organized by the Fetzer Franklin Fund, an outfit very similar to the better-known Templeton Foundation – that is, they fund edgy work in various aspects of science, a mix of very interesting and very flaky activities. This one leaned towards the former, at least. They plan to put videos of all the talks online in the near future.
    • Metascience turns out to mean just what you would expect: applying the methods of science to study the processes of science itself. Some examples:
      • literature and citation analysis
      • critiques of various statistical malpractices like p-hacking, and attempts to reform them (a toy illustration of the p-hacking problem follows this list)
      • agent-based modelling of scientific processes
      • proposals for improving replicability (like prediction markets for replication outcomes)
      • proposals for improved publishing practices (like pre-registration of experiments)
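    • To make the p-hacking item concrete (my own toy illustration, not an example from the conference): if you measure many outcomes and report only whichever one crosses p < 0.05, the false-positive rate quietly climbs far above the nominal 5%. A minimal simulation, assuming two groups with no real difference and 20 measured outcomes per experiment:

```python
import numpy as np
from scipy import stats

# Toy p-hacking demo: two groups with NO real difference, but 20 outcomes are
# measured per experiment and a "finding" is reported if any one of them
# reaches p < 0.05. Selective reporting inflates the nominal 5% error rate.
rng = np.random.default_rng(0)
n_experiments, n_outcomes, n_per_group = 2000, 20, 30

false_positives = 0
for _ in range(n_experiments):
    a = rng.normal(size=(n_outcomes, n_per_group))   # group A, 20 unrelated outcomes
    b = rng.normal(size=(n_outcomes, n_per_group))   # group B, same null distribution
    pvals = stats.ttest_ind(a, b, axis=1).pvalue
    if (pvals < 0.05).any():                          # report the best-looking outcome
        false_positives += 1

print(f"'significant' result in {false_positives / n_experiments:.0%} of null experiments")
# Expect roughly 1 - 0.95**20, about 64%, versus the advertised 5%.
```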
    • Oddly enough, from my perspective there was very little attention to things like data sharing, open standards, automated experimental workflows, or any of the other technologies that might actually improve the day-to-day practice of science. Maybe that՚s science engineering. So a lot of this came off to me as moaning about how bad science was, rather than attempts to improve it. The talks on statistics and publishing practices were interesting but would probably be more valuable to other PICI people.
    • One complaint that I could sympathize with is that scholarship these days is very reliant on Google Scholar. Just like in quotidian search, scholars tend to focus on the first few items returned and ignore the rest. This constitutes a distortion in the collective process of scientific thought. And since the ranking algorithm is completely opaque, there is no way to even characterize it. However, addressing this problem by building an alternative open and deprivatized search engine did not come up.
    • Some highlights:
    • Statistics

    • The best talks were by two well-known statisticians, Andrew Gelman and Steve Goodman. Goodman railed against the use of p-values and argued for a more Bayesian approach that would allow a rigorous representation of epistemic uncertainty. Gelman՚s talk ranged over a bunch of areas and I can՚t really recreate it here.
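    • A minimal sketch of the kind of contrast Goodman was drawing (my own toy example with made-up numbers, not one from the talk): the p-value says how surprising the data would be if nothing were going on, while the Bayesian posterior directly expresses uncertainty about the quantity of interest.

```python
import numpy as np
from scipy import stats

# Hypothetical result: 14 successes in 20 trials. Is the success rate above 0.5?
successes, trials = 14, 20

# Frequentist: one-sided exact p-value, the probability of data at least this
# extreme if the true rate were exactly 0.5.
p_value = stats.binom.sf(successes - 1, trials, 0.5)

# Bayesian: with a flat Beta(1, 1) prior, the posterior over the rate is
# Beta(1 + successes, 1 + failures), so we can state a probability about the rate itself.
posterior = stats.beta(1 + successes, 1 + trials - successes)
prob_above_half = 1 - posterior.cdf(0.5)

print(f"p-value under the null (rate = 0.5):        {p_value:.3f}")
print(f"posterior probability that the rate > 0.5:  {prob_above_half:.3f}")
```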
    • Goodman also mentioned PCORI, an organization I knew of but hadn՚t realized is the 2nd-largest funder of medical research in the US! They have a published set of methodology standards that Goodman recommends (and was involved in drafting).
    • Literature Analysis

    • Evans is a scientific advisor to Meta and does a ton of work in literature analysis. He tried to cram way too much into this talk, but the gist seemed to be that the social networks of science produce epistemic convergence, which leads to a decrease in diversity and certainty. He advocates policies that “maintain productive disconnection between disciplines”.
    • Replicability and reproducibility

    • The level of actual replication and replicability in science is laughably low, for many reasons, including motivational ones.
    • Techniques for reproducibility: data sharing, dockerization, etc. Stuff we already know.
    • Agent-based modelling of the scientific process

    • Carl Bergstrom (U Wash) made a model of competition among grant writers to demonstrate the inherent wastefulness of the process (and how it might be tweaked to be better).
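    • I can՚t reproduce Bergstrom՚s actual model from my notes, but a standard contest-theory calculation makes the wastefulness point (a stand-in sketch, not his model): if N labs compete for one grant, the odds of winning scale with proposal effort, and each lab picks the effort that maximizes its expected payoff, then as the field gets crowded nearly the full value of the grant is burned in proposal-writing.

```python
# Symmetric Tullock contest: N labs compete for a grant worth V, each lab's
# chance of winning is proportional to its effort, and all labs best-respond.
# Equilibrium effort per lab is V*(N-1)/N^2, so total effort is V*(N-1)/N.

def total_proposal_effort(n_labs: int, grant_value: float = 1.0) -> float:
    """Total equilibrium effort the whole field sinks into proposals."""
    return grant_value * (n_labs - 1) / n_labs

for n in (2, 5, 20, 100):
    print(f"{n:>3} competing labs: {total_proposal_effort(n):.2f} of every grant dollar "
          "is consumed by proposal-writing effort")
```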
    • Cailin O'Connor (UC Irvine) made a model to illustrate how scientific polarization (that is, separate groups with differing beliefs) might arise from various social network effects. The keyword here is “network epistemology”. O՚Connor has a book, The Misinformation Age, which treats some of these problems.
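    • Roughly how these network-epistemology models work (a toy sketch in the spirit of the talk, not O'Connor՚s actual model): agents hold a credence that some method works, believers run noisy trials and share the results, but everyone discounts evidence coming from agents whose beliefs are far from their own. That discounting alone can split the community into stable camps.

```python
import random

# Toy network-epistemology simulation (a simplified sketch, not O'Connor's model):
# agents hold a credence that a new method works, believers run noisy trials,
# everyone updates on the shared results, and evidence from agents with distant
# beliefs is discounted. The discounting can freeze the community into camps.

N_AGENTS, N_ROUNDS, TRUE_RATE = 20, 200, 0.6
random.seed(1)
credence = [random.random() for _ in range(N_AGENTS)]   # P(the method works)

def update(prior, success, weight):
    """Bayesian update on one trial outcome, shrunk by a trust weight in [0, 1]."""
    likelihood = 0.6 if success else 0.4                 # assumed informativeness of one trial
    posterior = prior * likelihood / (prior * likelihood + (1 - prior) * (1 - likelihood))
    return prior + weight * (posterior - prior)

for _ in range(N_ROUNDS):
    # Agents who currently believe in the method each run one noisy trial...
    results = [(i, random.random() < TRUE_RATE) for i in range(N_AGENTS) if credence[i] > 0.5]
    # ...and everyone updates on every shared result, weighted by belief proximity.
    for i in range(N_AGENTS):
        for j, success in results:
            trust = max(0.0, 1.0 - 2.0 * abs(credence[i] - credence[j]))
            credence[i] = update(credence[i], success, trust)

believers = sum(c > 0.5 for c in credence)
print(f"{believers} agents end up believing the method works; {N_AGENTS - believers} do not")
```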
    • Open Science

    • This meeting had a substantial overlap with people and topics in the Open Science movement. Some speakers mentioned the Transparency and Openness Promotion guidelines, which “include eight modular standards, …Journals select which of the eight transparency standards they wish to implement and select a level of implementation for each. These features provide flexibility for adoption depending on disciplinary variation, but simultaneously establish community standards.”
    • Standards: Data Citation | Data, Materials, and Code Transparency | Design and Analysis | Preregistration | Replication
    • Tools

    • This was kind of cool: statcheck – an R package that searches texts (PDFs) for reported statistical tests, recomputes the p-values from the reported test statistics, and flags any inconsistencies. Web version: http://statcheck.io/
    • OSF has a preprint archive search (OSF Preprints).