
DEEP LEARNING FOR CODERS WITH FASTAI AND PYTORCH

Author: HOWARD, J
Publisher: O'REILLY
Publication year: 2020
Subject: ARTIFICIAL INTELLIGENCE - GENERAL
ISBN: 978-1-4920-4552-6
Pages: 624
Price: 67,50 €

Synopsis

Deep learning is often viewed as the exclusive domain of math PhDs and big tech companies. But as this hands-on guide demonstrates, programmers comfortable with Python can achieve impressive results in deep learning with little math background, small amounts of data, and minimal code. How? With fastai, the first library to provide a consistent interface to the most frequently used deep learning applications.

Authors Jeremy Howard and Sylvain Gugger, the creators of fastai, show you how to train a model on a wide range of tasks using fastai and PyTorch. You'll also dive progressively further into deep learning theory to gain a complete understanding of the algorithms behind the scenes.

Train models in computer vision, natural language processing, tabular data, and collaborative filtering
Learn the latest deep learning techniques that matter most in practice
Improve accuracy, speed, and reliability by understanding how deep learning models work
Discover how to turn your models into web applications
Implement deep learning algorithms from scratch
Consider the ethical implications of your work
Gain insight from the foreword by PyTorch cofounder Soumith Chintala
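
To make the "minimal code" claim concrete, the sketch below mirrors the kind of example the book opens with: fine-tuning a pretrained image classifier with fastai. It assumes fastai v2 is installed and a GPU is available, and uses the Oxford-IIIT Pet dataset bundled with the library; treat it as an illustrative approximation rather than an excerpt from the book.

# Minimal fastai training sketch (assumes fastai v2 and a GPU).
from fastai.vision.all import *

# Download the Oxford-IIIT Pet dataset that ships with fastai.
path = untar_data(URLs.PETS)/'images'

# In this dataset, cat images have filenames starting with an uppercase letter.
def is_cat(x): return x[0].isupper()

# Build DataLoaders: 20% validation split, resize every image to 224x224.
dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path), valid_pct=0.2, seed=42,
    label_func=is_cat, item_tfms=Resize(224))

# Fine-tune an ImageNet-pretrained ResNet-34 for one epoch.
learn = cnn_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(1)

These few lines download the data, build the DataLoaders, and fine-tune a pretrained model; this is the workflow the early chapters then unpack layer by layer.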



Table of contents

Preface
Who This Book Is For
What You Need to Know
What You Will Learn
O'Reilly Online Learning
How to Contact Us
Foreword
I. Deep Learning in Practice
1. Your Deep Learning Journey
Deep Learning Is for Everyone
Neural Networks: A Brief History
Who We Are
How to Learn Deep Learning
Your Projects and Your Mindset
The Software: PyTorch, fastai, and Jupyter (And Why It Doesn't Matter)
Your First Model
Getting a GPU Deep Learning Server
Running Your First Notebook
What Is Machine Learning?
What Is a Neural Network?
A Bit of Deep Learning Jargon
Limitations Inherent to Machine Learning
How Our Image Recognizer Works
What Our Image Recognizer Learned
Image Recognizers Can Tackle Non-Image Tasks
Jargon Recap
Deep Learning Is Not Just for Image Classification
Validation Sets and Test Sets
Use Judgment in Defining Test Sets
A Choose Your Own Adventure Moment
Questionnaire
Further Research
2. From Model to Production
The Practice of Deep Learning
Starting Your Project
The State of Deep Learning
The Drivetrain Approach
Gathering Data
From Data to DataLoaders
Data Augmentation
Training Your Model, and Using It to Clean Your Data
Turning Your Model into an Online Application
Using the Model for Inference
Creating a Notebook App from the Model
Turning Your Notebook into a Real App
Deploying Your App
How to Avoid Disaster
Unforeseen Consequences and Feedback Loops
Get Writing!
Questionnaire
Further Research
3. Data Ethics
Key Examples for Data Ethics
Bugs and Recourse: Buggy Algorithm Used for Healthcare Benefits
Feedback Loops: YouTube's Recommendation System
Bias: Professor Latanya Sweeney "Arrested"
Why Does This Matter?
Integrating Machine Learning with Product Design
Topics in Data Ethics
Recourse and Accountability
Feedback Loops
Bias
Disinformation
Identifying and Addressing Ethical Issues
Analyze a Project You Are Working On
Processes to Implement
The Power of Diversity
Fairness, Accountability, and Transparency
Role of Policy
The Effectiveness of Regulation
Rights and Policy
Cars: A Historical Precedent
Conclusion
Questionnaire
Further Research
Deep Learning in Practice: That's a Wrap!
II. Understanding fastai's Applications
4. Under the Hood: Training a Digit Classifier
Pixels: The Foundations of Computer Vision
First Try: Pixel Similarity
NumPy Arrays and PyTorch Tensors
Computing Metrics Using Broadcasting
Stochastic Gradient Descent
Calculating Gradients
Stepping with a Learning Rate
An End-to-End SGD Example
Summarizing Gradient Descent
The MNIST Loss Function
Sigmoid
SGD and Mini-Batches
Putting It All Together
Creating an Optimizer
Adding a Nonlinearity
Going Deeper
Jargon Recap
Questionnaire
Further Research
5. Image Classification
From Dogs and Cats to Pet Breeds
Presizing
Checking and Debugging a DataBlock
Cross-Entropy Loss
Viewing Activations and Labels
Softmax
Log Likelihood
Taking the Log
Model Interpretation
Improving Our Model
The Learning Rate Finder
Unfreezing and Transfer Learning
Discriminative Learning Rates
Selecting the Number of Epochs
Deeper Architectures
Conclusion
Questionnaire
Further Research
6. Other Computer Vision Problems
Multi-Label Classification
The Data
Constructing a DataBlock
Binary Cross Entropy
Regression
Assembling the Data
Training a Model
Conclusion
Questionnaire
Further Research
7. Training a State-of-the-Art Model
Imagenette
Normalization
Progressive Resizing
Test Time Augmentation
Mixup
Label Smoothing
Conclusion
Questionnaire
Further Research
8. Collaborative Filtering Deep Dive
A First Look at the Data
Learning the Latent Factors
Creating the DataLoaders
Collaborative Filtering from Scratch
Weight Decay
Creating Our Own Embedding Module
Interpreting Embeddings and Biases
Using fastai.collab
Embedding Distance
Bootstrapping a Collaborative Filtering Model
Deep Learning for Collaborative Filtering
Conclusion
Questionnaire
Further Research
9. Tabular Modeling Deep Dive
Categorical Embeddings
Beyond Deep Learning
The Dataset
Kaggle Competitions
Look at the Data
Decision Trees
Handling Dates
Using TabularPandas and TabularProc
Creating the Decision Tree