The Allen Institute for Artificial Intelligence in Seattle last week unveiled the Semantic Scholar system, which offers extreme compression of lengthy scientific and technical articles to reduce the time spent studying them.
It is an AI-powered research tool that can be useful in a scientific environment. Thanks to its summarization function, it can scan a huge body of scientific literature, condensing each processed paper to literally one sentence.
Semantic Scholar serves some 7 million users a month, and its database contains 10 million computer science papers. According to Dan Weld, one of the system's developers, the database will be updated regularly, including with papers from other disciplines.
This is far from the first natural language processing program for summarizing documents. One of two approaches is usually taken to this problem: extractive, which selects representative sentences from the text and uses them verbatim in the summary, or abstractive, which uses natural language generation algorithms to produce a summary in original wording.
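To make the distinction concrete, here is a minimal sketch of the *extractive* approach: score each sentence by the frequency of its words and return the top-scoring sentences verbatim. This is a generic, simplified illustration of extractive summarization in general, not Semantic Scholar's actual method (which, per the article, produces original wording rather than copying sentences).

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=1):
    """Return the n_sentences highest-scoring sentences of text, verbatim."""
    # Naive sentence split on ., !, ? followed by whitespace.
    sentences = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
    # Word frequencies over the whole document.
    freq = Counter(re.findall(r'\w+', text.lower()))

    def score(sentence):
        # Average document-level frequency of the sentence's words.
        toks = re.findall(r'\w+', sentence.lower())
        return sum(freq[t] for t in toks) / max(len(toks), 1)

    # Pick the top sentences, then emit them in their original order.
    chosen = set(sorted(sentences, key=score, reverse=True)[:n_sentences])
    return ' '.join(s for s in sentences if s in chosen)
```

An abstractive system, by contrast, would generate new text (typically with a neural sequence-to-sequence model) instead of copying sentences from the source.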
In contrast, Semantic Scholar achieves an extremely high compression ratio: a 5,000-word article is reduced to a summary of only 21 words, a ratio of about 1:238. Its closest competitor manages roughly 1:36.
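The quoted ratio follows directly from the two word counts given above:

```python
# Compression ratio = article length / summary length, using the
# figures from the article (5,000-word paper, 21-word summary).
article_words = 5000
summary_words = 21
ratio = article_words / summary_words  # about 238, i.e. roughly 1:238
```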
The Allen Institute offers its code completely free to everyone and invites visitors to its demo site scitldr.apps.allenai.org. To date, the Semantic Scholar repository contains only English-language materials, but over time it will add documents in other languages.