Talks by Paul Ginsparg

Attention is all you get

Paul Ginsparg, Cornell University
For the past decade, there has been a new major architectural fad in deep learning every year or two. The fad of the past two years has been the transformer model, an implementation of the attention mechanism that has superseded RNNs in most sequence-learning applications. I'll give an overview of the model, with some discussion of non-physics applications, and suggest some possibilities for physics.
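The core operation of the transformer mentioned above is scaled dot-product attention: each query is compared against all keys, the similarities are normalized with a softmax, and the values are averaged with the resulting weights. A minimal NumPy sketch (the function name and toy shapes are illustrative, not from the talk):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value arrays."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# toy example: 3 queries attending over 4 key/value pairs of dimension 8
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each row of `w` is a probability distribution over the keys, so the output rows are convex combinations of the value vectors; this is what lets the model attend to arbitrary positions in a sequence without recurrence.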

At a Physics/InfoSci Intersection

Paul Ginsparg, Cornell University

Over twenty-five years into the internet era, over twenty years into the World Wide Web era, fifteen years into the Google era, and a few years past the Facebook/Twitter era, we've yet to converge on a new long-term methodology for scholarly research communication. I will provide a sociological overview of our current metastable state, and then a technical discussion of the practical implications of literature and usage data considered as computable objects, using arXiv as exemplar.

Next-Generation Implications of Open Access

Paul Ginsparg, Cornell University
True open access to scientific publications not only gives readers the possibility of reading articles without paying a subscription, but also makes the material available for automated ingestion and harvesting by third parties. Once articles and associated data become universally treatable as computable objects, openly available to third-party aggregators and value-added services, what new services can we expect, and how will they change the way that researchers interact with their scholarly communications infrastructure?
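The "automated ingestion" described above typically means harvesting machine-readable metadata feeds. As a sketch, the stdlib snippet below parses an Atom-style record of the general kind arXiv's public metadata interfaces return into plain Python dictionaries; the sample entry, identifier, and function name are made-up illustrations, not real records:

```python
import xml.etree.ElementTree as ET

# Illustrative Atom feed; the entry below is a fabricated example record.
ATOM = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <entry>
    <id>http://example.org/abs/0000.00000</id>
    <title>An Example Preprint</title>
    <author><name>A. Author</name></author>
    <author><name>B. Author</name></author>
  </entry>
</feed>"""

NS = {"atom": "http://www.w3.org/2005/Atom"}

def parse_entries(atom_xml):
    """Extract id, title, and author names from each entry in an Atom feed."""
    root = ET.fromstring(atom_xml)
    records = []
    for entry in root.findall("atom:entry", NS):
        records.append({
            "id": entry.findtext("atom:id", namespaces=NS),
            "title": entry.findtext("atom:title", namespaces=NS),
            "authors": [a.findtext("atom:name", namespaces=NS)
                        for a in entry.findall("atom:author", NS)],
        })
    return records

records = parse_entries(ATOM)
```

Once metadata is reduced to structured records like these, third-party services can aggregate, index, and compute over the literature at scale, which is the premise of the question the abstract poses.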