Notebooks
The notebook reference guide for Argilla tutorials.
This section contains:
- Backup and version Argilla datasets using DVC
- Run Argilla with a Transformer in an active learning loop and a free GPU in your browser
- Initial setup on Google Colab
- Install Elasticsearch
- Start the Argilla localhost in a terminal
- Create a public link to Argilla localhost with ngrok
- Log data to Argilla and start your active learning loop with small-text
- Start annotating in the browser via the ngrok link
- Extract annotated data for downstream use
- Summary
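The loop outlined in the steps above can be sketched generically. This is a toy illustration of uncertainty sampling only; the scoring function, pool, and probabilities below are invented stand-ins, not the small-text or Argilla APIs used in the tutorial:

```python
# Toy sketch of an uncertainty-sampling active learning loop.
# All names here are illustrative; the tutorial uses small-text + Argilla.

def uncertainty(prob: float) -> float:
    """Distance from a confident prediction (0 or 1); higher = more uncertain."""
    return 1.0 - abs(prob - 0.5) * 2

def select_batch(pool, scores, k=2):
    """Pick the k most uncertain unlabelled examples to annotate next."""
    ranked = sorted(pool, key=lambda i: uncertainty(scores[i]), reverse=True)
    return ranked[:k]

pool = [0, 1, 2, 3]
scores = {0: 0.95, 1: 0.52, 2: 0.10, 3: 0.48}  # mock model probabilities
batch = select_batch(pool, scores, k=2)  # the two examples nearest 0.5
```

In a real loop, the selected batch would be logged to Argilla for annotation, and the model retrained on the new labels before the next selection round.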
- Monitor FastAPI model endpoints
- Using LLMs for Text Classification and Summarization Suggestions with spacy-llm
- Add bias-equality features to datasets with disaggregators
- Build and evaluate a zero-shot sentiment classifier with GPT-3
- Label data with semantic search and Sentence Transformers
- Bulk Labeling Multimodal Data
- Augment weak supervision rules with Sentence Transformers
- Introduction
- Running Argilla
- Setup
- Detailed Workflow
- The dataset
- 1. Create an Argilla dataset with unlabelled data and test data
- 2. Defining rules
- 3. Building and analyzing weak labels
- 4. Using the weak labels
- 5. Extending the weak labels
- 6. Training a downstream model
- Summary
- Appendix: Visualize changes
- Appendix: Optimizing the thresholds
- Appendix: Plot extension
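As a generic illustration of what "using the weak labels" in the workflow above involves, here is a minimal majority-vote aggregation over a weak-label matrix. The rules and labels are invented examples for this sketch, not the tutorial's actual rules or its library calls:

```python
from collections import Counter

ABSTAIN = -1  # convention: a rule that doesn't fire abstains

def majority_vote(weak_labels):
    """Aggregate one record's weak labels by majority vote, ignoring abstains."""
    votes = [label for label in weak_labels if label != ABSTAIN]
    if not votes:
        return ABSTAIN  # no rule fired: leave the record unlabelled
    return Counter(votes).most_common(1)[0][0]

# Each row: the outputs of three hypothetical keyword rules on one record.
matrix = [
    [0, ABSTAIN, 0],        # two rules agree on label 0
    [ABSTAIN, 1, ABSTAIN],  # a single rule fires
    [ABSTAIN] * 3,          # no rule fires
]
labels = [majority_vote(row) for row in matrix]  # -> [0, 1, -1]
```

The records that receive a label this way can then serve as training data for a downstream model, while abstained records stay unlabelled.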
- Zero-shot and few-shot classification with SetFit
- Multi-label text classification with weak supervision
- Train a text classifier with weak supervision
- Assign records to your annotation team
- Delete labels from a Token or Text Classification dataset
- Evaluate a zero-shot NER with Flair
- Train a NER model with skweak
- Explore and analyze spaCy NER predictions
- Using LLMs for Few-Shot Token Classification Suggestions with spacy-llm
- Find label errors with cleanlab
- Compare Text Classification Models
- Analyze predictions with explainability methods
- Clean labels using your model's loss
- Fine-tuning a NER model with BERT for Beginners
- Text classification active learning with classy-classification
- Text Classification active learning with ModAL
- Few-shot classification with SetFit
- Train a sentiment classifier with SetFit
- Text Classification: Active Learning with small-text
- Fine-tune a sentiment classifier with your own data
- Introduction
- Running Argilla
- Preliminaries
- 1. Run the pre-trained model over the dataset and log the predictions
- 2. Explore and label data with the pre-trained model
- 3. Fine-tune the pre-trained model
- 4. Testing the fine-tuned model
- 5. Run our fine-tuned model over the dataset and log the predictions
- 6. Explore and label data with the fine-tuned model
- 7. Fine-tuning with the extended training dataset
- Summary
- Train a summarization model with Unstructured and Transformers