
GPT-NeoX: A 20 Billion Parameter NLP Model on Gradient Multi-GPU
Follow this guide to learn how to set up and run GPT-NeoX-20B within Paperspace Gradient to generate text from an input prompt.
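As a point of reference before the walkthrough, the snippet below is a minimal sketch of the kind of workflow the guide covers: loading the public EleutherAI/gpt-neox-20b checkpoint through the Hugging Face transformers library and sampling a continuation of a prompt on a multi-GPU machine. The guide itself may instead use the EleutherAI gpt-neox repository and its own launch scripts; the generation parameters and example prompt here are illustrative assumptions.

```python
# Minimal sketch: generate text from GPT-NeoX-20B via Hugging Face transformers.
# Assumes a Gradient multi-GPU instance with torch, transformers, and accelerate installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-neox-20b"

# Tokenizer encodes the prompt and decodes the generated token ids
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Load the 20B-parameter model in half precision and let accelerate
# shard it across the available GPUs (device_map="auto")
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "GPT-NeoX-20B is a 20 billion parameter language model that"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sample a continuation of the prompt
output_ids = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```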