Using Adapter Transformers at Hugging Face
In this tutorial, we show how to use the Hugging Face AdapterHub to access and run adapter transformers in Paperspace Notebooks.
In this tutorial, we walk through and implement the pipeline for running zero-shot text classification with Hugging Face on a Gradient Notebook.
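A minimal sketch of zero-shot classification with the Transformers `pipeline` API; `facebook/bart-large-mnli` is a commonly used checkpoint for this task, and the input text and labels here are stand-ins.

```python
# Sketch: zero-shot text classification via the transformers pipeline.
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

# Classify a sentence against labels the model was never trained on.
result = classifier(
    "The new GPU instances launched today.",
    candidate_labels=["technology", "sports", "cooking"],
)
# result["labels"] comes back sorted by descending score.
print(result["labels"][0], result["scores"][0])
```

Because the model scores arbitrary candidate labels, no task-specific fine-tuning or labeled training data is needed.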
In this article, we examine Hugging Face's Accelerate library for multi-GPU deep learning. We apply Accelerate with PyTorch and show how it simplifies adapting raw PyTorch code to run on a distributed machine system.
Aitextgen is a Python library for training text-generation models using GPT-2 and GPT-3/GPT Neo. In this tutorial you'll get aitextgen up and running quickly in a Jupyter notebook on a free GPU instance from Paperspace Gradient!