
Transformers for natural language processing : build, train, and fine-tune deep neural network architectures for NLP with Python, Hugging Face, and OpenAI's GPT-3, ChatGPT, and GPT-4

Material type: Text
Series: Expert insight
Publisher: Birmingham, England : Packt Publishing, 2022
Edition: Second edition
Description: xxxiii, 565 pages : illustrations ; 24 cm
Content type:
  • text
Media type:
  • unmediated
Carrier type:
  • volume
ISBN:
  • 9781803247335
DDC classification:
  • 006.3 ROT 23
Summary: Transformers are a game-changer for natural language understanding (NLU) and have become one of the pillars of artificial intelligence. Transformers for Natural Language Processing, 2nd Edition, investigates deep learning for machine translation, speech-to-text, text-to-speech, language modeling, question answering, and many more NLP domains with transformers. An Industry 4.0 AI specialist needs to be adaptable; knowing just one NLP platform is no longer enough. Different platforms have different benefits depending on the application, whether it's cost, flexibility, ease of implementation, results, or performance. This book analyzes numerous use cases with Hugging Face, Google Trax, OpenAI, and AllenNLP. It takes transformers' capabilities further by combining multiple NLP techniques, such as sentiment analysis, named entity recognition, and semantic role labeling, to analyze complex use cases, such as dissecting fake news on Twitter. It also shows how transformers can generate code from just a brief description. By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models to various datasets.
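As a rough illustration of the kind of workflow the summary describes, the following is a minimal sketch (not taken from the book) of applying pretrained transformer models with the Hugging Face pipeline API; the default checkpoints downloaded on first run are an assumption, not a recommendation from the book:

    # Minimal sketch: applying pretrained transformer models via the
    # Hugging Face pipeline API (pip install transformers torch).
    # Models used here are library defaults, not the book's choices.
    from transformers import pipeline

    # Sentiment analysis with a default pretrained checkpoint
    sentiment = pipeline("sentiment-analysis")
    print(sentiment("Transformers have become one of the pillars of AI."))

    # Named entity recognition, grouping sub-word tokens into entities
    ner = pipeline("ner", aggregation_strategy="simple")
    print(ner("Hugging Face and OpenAI both publish transformer models."))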
Holdings
Item type           Current library  Collection              Call number  Status        Barcode
Book Non-borrowing  Library D        Information Technology  006.3 ROT    Not For Loan  1004606
Book                Library D        Information Technology  006.3 ROT    Available     1004607
Total holds: 0

Includes bibliographical references and index.

Contents:
  • What are Transformers?
  • Getting Started with the Architecture of the Transformer Model
  • Fine-Tuning BERT Models
  • Pretraining a RoBERTa Model from Scratch
  • Downstream NLP Tasks with Transformers
  • Machine Translation with the Transformer
  • The Rise of Suprahuman Transformers with GPT-3 Engines
  • Applying Transformers to Legal and Financial Documents for AI Text Summarization
  • Matching Tokenizers and Datasets
  • Semantic Role Labeling with BERT-Based Transformers
  • Let Your Data Do the Talking: Story, Questions, and Answers
  • Detecting Customer Emotions to Make Predictions
  • Analyzing Fake News with Transformers
  • Interpreting Black Box Transformer Models
  • From NLP to Task-Agnostic Transformer Models
  • The Consolidation of Suprahuman Transformers with OpenAI’s ChatGPT and GPT-4
  • Other Books You May Enjoy
  • Index

