Cover of the book Natural Language Processing with Transformers

Natural Language Processing with Transformers

Lewis Tunstall, Leandro von Werra, Thomas Wolf

Publication date

2022-04-17

ISBN

9781098103248

Rating

★★★★★
Book description
Since their introduction in 2017, Transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or machine learning engineer, this practical book shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library.

Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf use a hands-on approach to teach you how Transformers work and how to integrate them into your applications. You'll quickly learn a variety of tasks they can help you solve:

- Build, debug, and optimize Transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering
- Learn how Transformers can be used for cross-lingual transfer learning
- Apply Transformers in real-world scenarios where labeled data is scarce
- Make Transformer models efficient for deployment using techniques such as distillation, pruning, and quantization
- Train Transformers from scratch and learn how to scale to multiple GPUs and distributed environments
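For context on the tasks listed above, here is a minimal sketch (not taken from the book) of how the Hugging Face Transformers pipeline API is commonly used for text classification, named entity recognition, and question answering; the checkpoint name and example inputs are assumptions for illustration.

```python
# Minimal sketch of Hugging Face Transformers pipelines (illustrative, not from the book).
from transformers import pipeline

# Text classification (sentiment analysis) with a distilled BERT checkpoint.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # assumed public checkpoint
)
print(classifier("Transformers make state-of-the-art NLP surprisingly accessible."))

# Named entity recognition; aggregation_strategy merges sub-word tokens into entity spans.
ner = pipeline("ner", aggregation_strategy="simple")
print(ner("Hugging Face is based in New York City."))

# Extractive question answering over a short context passage.
qa = pipeline("question-answering")
print(qa(
    question="What does the library provide?",
    context="The Transformers library provides pretrained models for common NLP tasks.",
))
```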
User reviews
Used this as a chance to work through Hugging Face from end to end.
This book covers the basic use of Transformers for classification, NER, QA, and other fundamental tasks, but the content is too simple and there aren't many knowledge points overall.
You might as well follow the tutorials Hugging Face publishes instead. Then again, if you're not doing academic work you don't really need to read it; finish learning it and you're out of a job.
Deep Learning from Scratch (《深度学习入门》) and its sequel (《深度学习进阶》) focus mainly on MLP, CNN, and RNN models, whereas this book centers on the Transformer architecture, covering encoder models (DistilBERT, BERT, RoBERTa, XLM, XLM-R, ALBERT, etc.), decoder models (GPT, GPT-2, GPT-3, GPT-Neo, etc.), and encoder-decoder models (T5, BART, BigBird, etc.).
Went through it quickly. The quality is quite high, but most of it overlaps with the tutorials on Hugging Face; only the model optimization material in Chapter 8 is new.
A basic Transformers tutorial. The content is fairly simple, with little on NLP theory or model internals; it is essentially a usage guide.
Followed along and typed out all the code.