NLP
Natural Language Processing with Transformers (Douban)
Author: Lewis Tunstall / Leandro von Werra / Thomas Wolf, O'Reilly Media, April 2022
Since their introduction in 2017, Transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or machine learning engineer, this practical book shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library.
Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf use a hands-on approach to teach you how Transformers work and how to integrate them in your applications. You'll quickly learn a variety of tasks they can help you solve.
Build, debug, and optimize Transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering
Learn how Transformers can be used for cross-lingual transfer learning
Apply Transformers in real-world scenarios where labeled data is scarce
Make Transformer models efficient for deployment using techniques such as distillation, pruning, and quantization
Train Transformers from scratch and learn how to scale to multiple GPUs and distributed environments
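One of the deployment techniques named above, quantization, can be sketched in a few lines of NumPy. This is an illustrative example of a common symmetric int8 scheme, not code from the book or from any particular library:

```python
import numpy as np

def quantize_int8(weights):
    """Affine-quantize a float weight matrix to int8 (symmetric scheme)."""
    scale = np.abs(weights).max() / 127.0   # map the largest magnitude to 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float matrix from the int8 values."""
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
# Each weight now takes 1 byte instead of 4; the reconstruction error
# per weight is at most half a quantization step (scale / 2).
assert np.abs(w - w_hat).max() <= scale / 2 + 1e-6
```

The 4x memory saving is why quantization is paired with distillation and pruning when shrinking Transformer models for production.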
Practical Natural Language Processing
Author: Sowmya Vajjala / Anuj Gupta, O'Reilly Media, June 2020
If you want to build, iterate, and scale NLP systems in a business setting and tailor them for various industry verticals, this is your guide.
Consider the task of building a chatbot or text classification system at your organization. In the beginning, there may be little or no data to work with. At this point, a basic solution that uses rule-based systems or traditional machine learning will be apt. As you accumulate more data, more sophisticated (and often data-intensive) ML techniques can be used, including deep learning. At each step of this journey, there are dozens of alternative approaches you can take. This book helps you navigate this maze of options.
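The low-data starting point described above can be sketched as a keyword-matching intent classifier. The intents and keywords here are hypothetical; this is a minimal illustration of a rule-based baseline, not code from the book:

```python
# A rule-based intent classifier: a reasonable first baseline for a
# chatbot when little or no labeled training data is available yet.
RULES = {
    "greeting": ["hello", "hi", "hey"],
    "refund":   ["refund", "money back", "return"],
    "hours":    ["open", "close", "hours"],
}

def classify(text, rules=RULES, default="unknown"):
    """Return the first intent whose keyword appears in the text."""
    lowered = text.lower()
    for intent, keywords in rules.items():
        if any(kw in lowered for kw in keywords):
            return intent
    return default

print(classify("Hi there!"))             # → greeting
print(classify("I want my money back"))  # → refund
```

As labeled examples accumulate, the same `classify` interface can be swapped for a trained model without changing the rest of the system, which is the incremental path the book describes.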
Explain Me This
Author: Adele E. Goldberg, Princeton University Press, 2019
We use words and phrases creatively to express ourselves in ever-changing contexts, readily extending language constructions in new ways. Yet native speakers also implicitly know when a creative and easily interpretable formulation—such as “Explain me this” or “She considered to go”—doesn’t sound quite right. In this incisive book, Adele Goldberg explores how these creative but constrained language skills emerge from a combination of general cognitive mechanisms and experience.
Shedding critical light on an enduring linguistic paradox, Goldberg demonstrates how words and abstract constructions are generalized and constrained in the same ways. When learning language, we record partially abstracted tokens of language within the high-dimensional conceptual space that is used when we speak or listen. Our implicit knowledge of language includes dimensions related to form, function, and social context. At the same time, abstract memory traces of linguistic usage-events cluster together on a subset of dimensions, with overlapping aspects strengthened via repetition. In this way, dynamic categories that correspond to words and abstract constructions emerge from partially overlapping memory traces, and as a result, distinct words and constructions compete with one another each time we select them to express our intended messages.
While much of the research on this puzzle has favored semantic or functional explanations over statistical ones, Goldberg’s approach stresses that both the functional and statistical aspects of constructions emerge from the same learning mechanisms.
Getting Started with Natural Language Processing
Author: Ekaterina Kochmar, Manning Publications, March 2021
Getting Started with Natural Language Processing is a hands-on guide to NLP with practical techniques you can put into action right away. By following the numerous Python-based examples and real-world case studies, you'll apply NLP to search applications, extracting meaning from text, sentiment analysis, user profiling, and more. When you're done, you'll have a solid grounding in NLP that will serve as a foundation for further learning.
What's inside
Extracting information from raw text
Named entity recognition
Automating summarization of key facts
Topic labeling
January 2, 2020 · Want to read
NLP AI
Deep Learning for Natural Language Processing
Author: Stephan Raaijmakers, Manning Publications, May 2020
Deep Learning for Natural Language Processing teaches you to apply state-of-the-art deep learning approaches to natural language processing tasks. You’ll learn key NLP concepts like neural word embeddings, auto-encoders, part-of-speech tagging, parsing, and semantic inference. Then you’ll dive deeper into advanced topics including deep memory-based NLP, linguistic structure, and hyperparameters for deep NLP. Along the way, you’ll pick up emerging best practices and gain hands-on experience with a myriad of examples, all written in Python and the powerful Keras library. By the time you’re done reading this invaluable book, you’ll be solving a wide variety of NLP problems with cutting-edge deep learning techniques!
What's inside
An overview of NLP and deep learning
One-hot text representations
Word embeddings
Models for textual similarity
Sequential NLP
Semantic role labeling
Deep memory-based NLP
Linguistic structure
Hyperparameters for deep NLP
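The first representation topic in the list above, one-hot text representations, can be built in a few lines of plain Python. This is an illustrative sketch, not the book's Keras code:

```python
def one_hot_encode(tokens):
    """Map each token in a sequence to a one-hot vector over the vocabulary."""
    vocab = sorted(set(tokens))                      # fixed, repeatable order
    index = {tok: i for i, tok in enumerate(vocab)}  # token -> position
    vectors = []
    for tok in tokens:
        vec = [0] * len(vocab)
        vec[index[tok]] = 1                          # single 1 at the token's slot
        vectors.append(vec)
    return vocab, vectors

vocab, vectors = one_hot_encode("the cat sat on the mat".split())
print(vocab)       # → ['cat', 'mat', 'on', 'sat', 'the']
print(vectors[0])  # → [0, 0, 0, 0, 1]  (the vector for "the")
```

These sparse, orthogonal vectors carry no notion of similarity between words, which is exactly the limitation that the word embeddings covered next in the book are designed to fix.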