Transformers for natural language processing : (Record no. 1998)
000 - LEADER | |
---|---|
fixed length control field | 03390cam a22003135i 4500 |
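The leader is 24 characters of fixed-position data. Purely as an illustration (not part of the record), a minimal Python sketch decoding the leader value above; position meanings follow the MARC 21 Bibliographic specification:

```python
# Decode the MARC 21 leader displayed above (illustrative sketch).
LEADER = "03390cam a22003135i 4500"

fields = {
    "record_length":        LEADER[0:5],    # '03390' octets
    "record_status":        LEADER[5],      # 'c' = corrected or revised
    "type_of_record":       LEADER[6],      # 'a' = language material
    "bibliographic_level":  LEADER[7],      # 'm' = monograph/item
    "character_coding":     LEADER[9],      # 'a' = UCS/Unicode
    "indicator_count":      LEADER[10],     # '2'
    "subfield_code_count":  LEADER[11],     # '2'
    "base_address_of_data": LEADER[12:17],  # '00313'
    "encoding_level":       LEADER[17],     # '5' = partial (preliminary)
    "descriptive_form":     LEADER[18],     # 'i' = ISBD punctuation included
    "entry_map":            LEADER[20:24],  # '4500'
}

for name, value in fields.items():
    print(f"{name}: {value!r}")
```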
003 - CONTROL NUMBER IDENTIFIER | |
control field | EG-CaNGU |
005 - DATE AND TIME OF LATEST TRANSACTION | |
control field | 20240131133103.0 |
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION | |
fixed length control field | 231214s2022 enka frb 001 0 eng d |
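The 008 is a 40-character fixed-position field, but this display collapses runs of blanks, so only the unambiguous leading positions can be indexed directly; the later codes are read off the display with their byte offsets stated as assumptions. A minimal sketch:

```python
# Decode the clearly recoverable parts of the 008 field above.
# NOTE: the catalog display collapses blank runs, so the full
# 40-character layout cannot be indexed reliably here.
RAW = "231214s2022 enka frb 001 0 eng d"  # as displayed

date_entered = RAW[0:6]   # '231214' -> record created on 2023-12-14
type_of_date = RAW[6]     # 's' = single known date
date1        = RAW[7:11]  # '2022' = publication date

# Labelled codes read off the display (positions assumed):
place_code = "enk"  # positions 15-17: England
language   = "eng"  # positions 35-37: English
cat_source = "d"    # position 39: other cataloging source

print(date_entered, type_of_date, date1, place_code, language, cat_source)
```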
020 ## - INTERNATIONAL STANDARD BOOK NUMBER | |
International Standard Book Number | 9781803247335 |
040 ## - CATALOGING SOURCE | |
Original cataloging agency | ORMDA |
Language of cataloging | eng |
Description conventions | rda |
-- | pn |
Transcribing agency | ORMDA |
Modifying agency | OCLCO |
-- | EG-CaNGU |
082 04 - DEWEY DECIMAL CLASSIFICATION NUMBER | |
Classification number | 006.3 |
Item number | ROT |
Edition number | 23 |
100 1# - MAIN ENTRY--PERSONAL NAME | |
Personal name | Rothman, Denis, |
Relator term | author. |
9 (RLIN) | 5986 |
245 10 - TITLE STATEMENT | |
Title | Transformers for natural language processing : |
Remainder of title | build, train, and fine-tune deep neural network architectures for NLP with Python, Hugging Face, and OpenAI's GPT-3, ChatGPT, and GPT-4 / |
250 ## - EDITION STATEMENT | |
Edition statement | Second edition. |
264 #1 - PRODUCTION, PUBLICATION, DISTRIBUTION, MANUFACTURE, AND COPYRIGHT NOTICE | |
Place of production, publication, distribution, manufacture | Birmingham, England : |
Name of producer, publisher, distributor, manufacturer | Packt Publishing, |
Date of production, publication, distribution, manufacture, or copyright notice | 2022. |
300 ## - PHYSICAL DESCRIPTION | |
Extent | xxxiii, 565 pages : |
Other physical details | illustrations ; |
Dimensions | 24 cm. |
336 ## - CONTENT TYPE | |
Content type term | text |
Content type code | txt |
Source | rdacontent |
337 ## - MEDIA TYPE | |
Media type term | unmediated |
Media type code | n |
Source | rdamedia |
338 ## - CARRIER TYPE | |
Carrier type term | volume |
Carrier type code | nc |
Source | rdacarrier |
490 0# - SERIES STATEMENT | |
Series statement | Expert insight |
504 ## - BIBLIOGRAPHY, ETC. NOTE | |
Bibliography, etc. note | Includes bibliographical references and index. |
505 00 - FORMATTED CONTENTS NOTE | |
Formatted contents note | What are Transformers? -- Getting Started with the Architecture of the Transformer Model -- Fine-Tuning BERT Models -- Pretraining a RoBERTa Model from Scratch -- Downstream NLP Tasks with Transformers -- Machine Translation with the Transformer -- The Rise of Suprahuman Transformers with GPT-3 Engines -- Applying Transformers to Legal and Financial Documents for AI Text Summarization -- Matching Tokenizers and Datasets -- Semantic Role Labeling with BERT-Based Transformers -- Let Your Data Do the Talking: Story, Questions, and Answers -- Detecting Customer Emotions to Make Predictions -- Analyzing Fake News with Transformers -- Interpreting Black Box Transformer Models -- From NLP to Task-Agnostic Transformer Models -- The Consolidation of Suprahuman Transformers with OpenAI’s ChatGPT and GPT-4 -- Other Books You May Enjoy -- Index. |
520 ## - SUMMARY, ETC. | |
Summary, etc. | Transformers are a game-changer for natural language understanding (NLU) and have become one of the pillars of artificial intelligence. Transformers for Natural Language Processing, 2nd Edition, investigates deep learning for machine translation, speech-to-text, text-to-speech, language modeling, question-answering, and many more NLP domains with transformers. An Industry 4.0 AI specialist needs to be adaptable; knowing just one NLP platform is not enough anymore. Different platforms have different benefits depending on the application, whether it's cost, flexibility, ease of implementation, results, or performance. In this book, we analyze numerous use cases with Hugging Face, Google Trax, OpenAI, and AllenNLP. This book takes transformers' capabilities further by combining multiple NLP techniques, such as sentiment analysis, named entity recognition, and semantic role labeling, to analyze complex use cases, such as dissecting fake news on Twitter. The book also shows how transformers can generate code from just a brief description. By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models to various datasets. |
650 #7 - SUBJECT ADDED ENTRY--TOPICAL TERM | |
Topical term or geographic name entry element | Python (Computer program language). |
Source of heading or term | NGU-sh |
9 (RLIN) | 2383 |
650 #7 - SUBJECT ADDED ENTRY--TOPICAL TERM | |
Topical term or geographic name entry element | Artificial intelligence |
General subdivision | computer programs. |
Source of heading or term | NGU-sh |
9 (RLIN) | 5361 |
650 #7 - SUBJECT ADDED ENTRY--TOPICAL TERM | |
Topical term or geographic name entry element | Cloud computing. |
Source of heading or term | NGU-sh |
9 (RLIN) | 5987 |
700 1# - ADDED ENTRY--PERSONAL NAME | |
Personal name | Gulli, Antonio, |
Relator term | foreword writer. |
9 (RLIN) | 5988 |
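The bibliographic fields above can be read programmatically if the record is exported from Koha as binary MARC. A minimal sketch using the pymarc library; the export file name record.mrc is hypothetical:

```python
# Read the exported record and pull out a few of the fields shown above.
from pymarc import MARCReader

with open("record.mrc", "rb") as fh:  # hypothetical Koha export
    for record in MARCReader(fh):
        # 245 $a + $b: title proper and remainder of title
        title = record.get_fields("245")[0]
        print("Title:", " ".join(title.get_subfields("a", "b")))

        # 020 $a: ISBN
        for field in record.get_fields("020"):
            print("ISBN:", ", ".join(field.get_subfields("a")))

        # 650 $a/$x: topical subject headings with subdivisions
        for field in record.get_fields("650"):
            print("Subject:", " -- ".join(field.get_subfields("a", "x")))
```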
| Withdrawn status | Lost status | Source of classification or shelving scheme | Damaged status | Not for loan | Collection code | Home library | Current library | Date acquired | Source of acquisition | Cost, normal purchase price | Arrivals Code | Full call number | Barcode | Date last seen | Price effective from | Koha item type |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | | Dewey Decimal Classification | | Not For Loan | Information Technology | Library D | Library D | 12/14/2023 | Dar Al Fajr | 1773.00 | ITS202312 | 006.3 ROT | 1004606 | 08/14/2024 | 12/14/2023 | Book Non-borrowing |
| | | Dewey Decimal Classification | | | Information Technology | Library D | Library D | 12/14/2023 | Dar Al Fajr | 1773.00 | ITS202312 | 006.3 ROT | 1004607 | 08/14/2024 | 12/14/2023 | Book |
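The item data above can also be fetched over Koha's REST API (available in recent Koha versions) rather than read from this page. A minimal sketch; the host, credentials, and availability of the endpoint on this installation are assumptions, while biblionumber 1998 comes from the record header above:

```python
# Fetch the item list for biblio 1998 from a Koha REST API (illustrative).
import requests

BASE = "https://koha.example.org/api/v1"  # hypothetical host
resp = requests.get(
    f"{BASE}/biblios/1998/items",
    auth=("api_user", "api_password"),  # hypothetical credentials
    headers={"Accept": "application/json"},
)
resp.raise_for_status()

for item in resp.json():
    # external_id maps to the barcode; callnumber to 'Full call number'
    print(item.get("external_id"), item.get("callnumber"))
```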