The full-fledged AllenNLP is now in maintenance mode; see the original repo for details and alternatives. Although an AllenNLP-light variant keeps part of the library working, it still relies on a large number of dependencies.
We try to keep dependencies minimal:
- Only torch is required for the core.
- transformers, nltk, and scipy are required only for some of the metrics.
We removed some of the original functionality, such as the CLI commands, FromParams, Lazy generics, data readers, token embedders, trainers, cached paths, fairness, and distributed training. The test cases are also removed because they share some of these dependencies.
We follow the semantic versioning of the original repo: the new versions start from 2.11.0, while the original's final version is 2.10.1.
| Module | Description |
| --- | --- |
| allennlp | An open-source NLP research library, built on PyTorch |
| allennlp.commands | Functionality for the CLI |
| allennlp.common | Utility modules that are used across the library |
| allennlp.data | A data processing module for loading datasets and encoding strings as integers for representation in matrices |
| allennlp.fairness | A module for bias mitigation and fairness algorithms and metrics |
| allennlp.modules | A collection of PyTorch modules for use with text |
| allennlp.nn | Tensor utility functions, such as initializers and activation functions |
| allennlp.training | Functionality for training models |
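To illustrate the kind of computation the metrics in `allennlp.training` perform, here is a minimal pure-Python sketch of categorical accuracy. The function name and signature below are illustrative only; the library's actual metric classes operate on PyTorch tensors and accumulate state across batches.

```python
def categorical_accuracy(predictions, gold_labels):
    """Fraction of examples whose argmax prediction matches the gold label.

    predictions: list of per-class score lists, one row per example.
    gold_labels: list of integer class indices.
    Plain-Python stand-in for the tensor-based metric in the library.
    """
    correct = 0
    for scores, gold in zip(predictions, gold_labels):
        # Index of the highest-scoring class for this example
        predicted = max(range(len(scores)), key=lambda i: scores[i])
        correct += int(predicted == gold)
    return correct / len(gold_labels)

# Two examples, one predicted correctly -> accuracy 0.5
print(categorical_accuracy([[0.1, 0.9], [0.8, 0.2]], [1, 1]))
```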
Python 3.11 or later is required, and the library is tested with PyTorch 2.0.1.
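A quick way to confirm the interpreter meets the Python requirement before installing (the helper name here is just for illustration):

```python
import sys

def meets_minimum(minimum=(3, 11)):
    """Return True if the running interpreter is at least `minimum`."""
    return sys.version_info[:2] >= minimum

# This fork expects Python 3.11 or newer:
print(meets_minimum())
```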
Just clone the repo and install it with `pip install -e .`
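Spelled out, a typical editable install looks like the following; `<repo-url>` and the directory name are placeholders, not the actual values.

```shell
# Clone the repository (replace <repo-url> with the actual URL)
git clone <repo-url>
cd allennlp-light   # directory name is an assumption

# Editable install so local changes take effect immediately
pip install -e .

# Optional extras, needed only for some of the metrics
pip install transformers nltk scipy
```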