The article proposes a non-parametric alternative to deep neural networks (DNNs) for text classification. The method pairs a simple compressor, gzip, with a k-nearest-neighbor classifier. The researchers find that this approach achieves results competitive with non-pretrained deep learning methods on six in-distribution datasets and outperforms BERT, a popular pretrained model, on all five out-of-distribution datasets, including low-resource languages. The method is particularly effective in the few-shot setting, where labeled data is too scarce for DNNs to train well. The researchers provide the Python code for the method and compare it against a range of deep learning models and other compressor-based methods.
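
Conceptually, the method computes a normalized compression distance (NCD) between a test document and every training document, then takes a majority vote over the labels of the k nearest neighbors. The sketch below illustrates this idea; the function names, the toy data, and the choice of k are illustrative assumptions, not the authors' released code.

```python
import gzip
from collections import Counter

def ncd(x: str, y: str) -> float:
    """Normalized compression distance based on gzip-compressed lengths."""
    cx = len(gzip.compress(x.encode("utf-8")))
    cy = len(gzip.compress(y.encode("utf-8")))
    cxy = len(gzip.compress((x + " " + y).encode("utf-8")))
    return (cxy - min(cx, cy)) / max(cx, cy)

def classify(test_text: str, train_set: list[tuple[str, str]], k: int = 3) -> str:
    """k-nearest-neighbor vote over NCD to the labeled training texts."""
    neighbors = sorted(train_set, key=lambda pair: ncd(test_text, pair[0]))
    top_labels = [label for _, label in neighbors[:k]]
    return Counter(top_labels).most_common(1)[0][0]

# Toy data for illustration only (not from the paper's benchmarks)
train = [
    ("the team won the championship game last night", "sports"),
    ("the striker scored twice in the final match", "sports"),
    ("the central bank raised interest rates again", "finance"),
    ("stock markets fell after the earnings report", "finance"),
]
print(classify("the goalkeeper made a stunning save", train, k=3))
```

Because the only learned component is the neighbor vote, there is no model to fit: classification amounts to repeated compression calls and a sort.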

In their study, the researchers evaluate several compressors to determine their effectiveness for text classification. Among those tested, gzip performs best in terms of both accuracy and compression ratio, indicating that it captures the regularities in text data well enough to support accurate classification. The researchers emphasize that the choice of gzip as the compressor is a key component of the method.
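
As a rough illustration of how compressors might be compared on the compression-ratio axis, the snippet below measures the ratio achieved by a few standard-library compressors on a text sample; the sample text and the set of compressors are assumptions for illustration, not the paper's exact evaluation protocol.

```python
import gzip, bz2, lzma

sample = ("compression-based classification relies on a compressor "
          "capturing the regularities of text ") * 20
raw = sample.encode("utf-8")

# Compression ratio = original size / compressed size (higher means more redundancy captured)
for name, compress in [("gzip", gzip.compress), ("bz2", bz2.compress), ("lzma", lzma.compress)]:
    ratio = len(raw) / len(compress(raw))
    print(f"{name}: ratio = {ratio:.2f}")
```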

One of the main advantages of the proposed method is its simplicity. Unlike deep neural networks (DNNs), which require complex architectures and training procedures, the method is straightforward and needs no preprocessing and no trained parameters. This makes it easy to implement and apply to text classification tasks without extensive computational resources.

The lightweight nature of the method is another notable feature. While DNNs often require millions of parameters and large amounts of labeled data, the method has no training parameters and is not computationally intensive. This makes it a cost-effective option for text classification: there is nothing to optimize, and it transfers to out-of-distribution cases without additional training cost.

Additionally, the researchers highlight the universality of the method. Compressors such as gzip are data-type agnostic: they operate on raw bytes and can be applied to different kinds of text without customization, making the method adaptable to a range of text classification tasks and datasets.
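
Because gzip works on raw bytes, the same distance computation applies unchanged to text in any language or script, with no tokenizer or vocabulary to adapt. The snippet below reuses the NCD idea directly on byte strings; the example sentences are chosen purely for illustration.

```python
import gzip

def ncd_bytes(x: bytes, y: bytes) -> float:
    """NCD over raw bytes; no tokenization or language-specific preprocessing."""
    cx, cy = len(gzip.compress(x)), len(gzip.compress(y))
    cxy = len(gzip.compress(x + b" " + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# The same function handles any UTF-8 text, regardless of language or script
a = "彼はサッカーの試合で得点した".encode("utf-8")        # Japanese, about soccer
b = "ゴールキーパーが見事なセーブをした".encode("utf-8")  # Japanese, about soccer
c = "the central bank raised interest rates".encode("utf-8")  # English, about finance
print(ncd_bytes(a, b), ncd_bytes(a, c))
```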

Based on their evaluation and comparisons with DNNs and other compressor-based methods, the researchers conclude that the proposed method is a viable alternative to DNNs for text classification. They highlight its competitive performance on both in-distribution and out-of-distribution datasets, including low-resource languages. Its simplicity, light weight, and universality make it an attractive option in terms of accuracy, resource efficiency, and ease of use.

Reference: https://aclanthology.org/2023....