
LaTeX Literature Management System

A LaTeX system for managing academic literature with automated sorting, tag-based filtering, and multiple display formats. Built on the datatool package for structured data storage and processing.

What It Does

This system allows you to:

- Add literature entries with title, authors, year, a personal insight, and tags
- Sort entries by year, title, authors, tag, or combined keys
- Filter entries by tag
- Display entries as a detailed table or as compact lists, with or without insights
- View database statistics and the tags in use

Understanding the Insight Field

The Insight field serves as your literature reading notes, capturing the essential value of each paper and your personal understanding of it. It is recommended to record:

- The core method or innovation the paper introduces
- Key conclusions and results
- Implications for your own research or writing

Examples:

% Recording innovative methods
\addliterature{Attention Is All You Need}{Vaswani et al.}{2017}{Introduces the Transformer architecture, based entirely on attention mechanisms, eliminating recurrence and convolution to enable parallelizable training}{NLP,Attention}

% Recording key conclusions
\addliterature{BERT}{Devlin et al.}{2018}{Breakthrough in bidirectional encoding through masked language model pre-training, achieving state-of-the-art results across multiple NLP tasks}{NLP,BERT}

% Recording methodological innovation and implications
\addliterature{GPT-3}{Brown et al.}{2020}{Demonstrates the few-shot learning capabilities of large-scale language models; the 175B-parameter model excels across diverse tasks, pointing toward artificial general intelligence}{NLP,GPT}

Detailed insights let you quickly review a paper's key points later, whether for further research or for citing it in your writing.

Installation

\input{literature-system.tex}

Download

Download the literature-system.tex file, place it in your LaTeX project directory, and include it with \input{literature-system.tex} in your preamble.
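A minimal working document then looks like this (same pattern as the Complete Example below):

\documentclass{article}
\input{literature-system.tex}

\begin{document}
% Add one entry and print it as a compact list
\addliterature{Attention Is All You Need}{Vaswani et al.}{2017}{Transformer architecture}{NLP,Attention}
\showbasiclist
\end{document}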

Core Functions

Data Input

\addliterature{title}{authors}{year}{insight}{tag}

Add a literature entry with full information, including comma-separated tags.

\addliterature{Attention Is All You Need}{Vaswani et al.}{2017}{Transformer architecture}{NLP,Attention}

\addliteratureold{title}{authors}{year}{insight}

Legacy command for backward compatibility (no tags).

\addliteratureold{BERT Paper}{Devlin et al.}{2018}{Bidirectional encoding}

Sorting Functions

\sortbyyear

Sort by publication year (earliest to latest).

\sortbyyear

\sortbytitle

Sort alphabetically by title.

\sortbytitle

\sortbyauthors

Sort alphabetically by author names.

\sortbyauthors

\sortbytag

Sort alphabetically by tags.

\sortbytag

\sortbyyeartitle

Sort by year first, then by title.

\sortbyyeartitle

\sortbytagyear

Sort by tag first, then by year.

\sortbytagyear

\sortby{field}

Sort by any database field.

\sortby{authors}
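The field name presumably matches the database columns created by \addliterature; assuming they are named title, authors, year, insight, and tag, any of them can be passed to \sortby:

\sortby{year}     % assuming the column is named "year"; should behave like \sortbyyear
\sortby{insight}  % sort by insight text (an assumption; no dedicated command exists for this)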

Display Functions

\showfullreview

Display all literature in detailed table format.

\showfullreview

\showbasiclist

Display literature in compact list format.

\showbasiclist

\showbasiclistwithinsight

Display compact list including research insights.

\showbasiclistwithinsight
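The display commands print the database in its current order, so they are usually preceded by a sort. A small sketch combining both:

% Chronological detailed review, followed by a compact summary with insights
\sortbyyear
\showfullreview
\showbasiclistwithinsight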

Tag-Based Filtering

\showfullreviewbytag{tag}

Show a detailed view of papers with a specific tag.

\showfullreviewbytag{NLP}

\showbasiclistbytag{tag}

Show a compact list of papers with a specific tag.

\showbasiclistbytag{AI}

\showbasiclistwithinsightbytag{tag}

Show a compact list with insights for a specific tag.

\showbasiclistwithinsightbytag{DeepLearning}
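Tag filters combine naturally with sorting and ordinary LaTeX sectioning, for example to build one topic section per tag (tags taken from the examples above):

% One section per topic, each listing only matching papers
\sortbyyear
\section{Attention Mechanisms}
\showfullreviewbytag{Attention}
\section{Language Models}
\showbasiclistwithinsightbytag{GPT}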

Statistics and Information

\showstats

Display database statistics and available tags.

\showstats

\getpapercount

Get total number of papers.

Total papers: \getpapercount

\showavailabletags

Display all available tags in the database.

\showavailabletags

\systeminfo

Show system version and feature information.

\systeminfo

Database Management

\clearliterature

Clear all literature from the database.

\clearliterature

\resetpapercount

Reset paper numbering counter.

\resetpapercount
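Used together, these commands start a fresh, independently numbered list, which is useful if each chapter keeps its own reading list (a sketch; per-chapter use is a suggestion, not a documented feature):

% Start a clean literature list for the next chapter
\clearliterature
\resetpapercount
\addliterature{BERT}{Devlin et al.}{2018}{Bidirectional pre-training}{NLP,BERT}
\showbasiclist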

Complete Example

\documentclass{article}
\input{literature-system.tex}

\begin{document}

% Add literature with detailed insights
\addliterature{Attention Is All You Need}{Vaswani et al.}{2017}{Introduces the Transformer architecture, based entirely on attention mechanisms, eliminating recurrence and convolution, enabling parallelizable training and laying the foundation for large language models}{NLP,Attention}
\addliterature{BERT}{Devlin et al.}{2018}{Breakthrough in bidirectional encoding through masked language model pre-training, achieving deep bidirectional representations and setting new records on 11 NLP tasks}{NLP,BERT}
\addliterature{GPT-3}{Brown et al.}{2020}{Demonstrates the few-shot learning capabilities of large-scale language models; the 175B-parameter model performs diverse tasks without fine-tuning, showcasing its potential for artificial general intelligence}{NLP,GPT}

% Sort and display
\sortbyyear
\showstats
\showfullreview

% Filter by tag
\showbasiclistbytag{NLP}

\end{document}

Key Features

- Structured data storage built on the datatool package
- Tagged entries with comma-separated tags per paper
- Sorting by year, title, authors, tag, or combined keys
- Detailed table and compact list display formats, with or without insights
- Tag-based filtering for every display format
- Database statistics and tag overview
- Backward compatibility with the legacy untagged input command

Version 2.0 - Enhanced with tag support and insight documentation