Federal University of Rio Grande do Norte

Technology Center

Graduate Program in Electrical and Computer Engineering

Department of Computer Engineering and Automation

PEEC2318 Machine Learning

References

  • 📚 Godoy, Daniel. Deep Learning with PyTorch Step-by-Step. [Link]
  • 📚 Tam, Adrian. Deep Learning with PyTorch. [Link]
  • 📚 Cristina, Stefania; Saeed, Mehreen. Building Transformer Models with Attention. [Link]
  • 📚 Huyen, Chip. Designing Machine Learning Systems. [Link]

Week 01: Course Outline Open in PDF

  • Detailed breakdown of the course structure and content, exploring various aspects and applications of Machine Learning.
  • Motivation, syllabus, and course logistics.
  • 🎉 GitHub Education Benefits - GitHub Education Pro: Get access to the GitHub Education Pro pack by visiting GitHub Education
    • 📖 Learning Resources
    • AI Python for Beginners: Learn Python programming fundamentals and how to integrate AI tools for data manipulation, analysis, and visualization. Andrew Ng

Week 02: Machine Learning Fundamentals Open in PDF

  • Motivation: how advances in Machine Learning are helping bridge the gap between AI's current capabilities and human cognitive abilities, highlighting limitations and future directions for AI systems.
  • Overview of Machine Learning fundamentals, including an exploration of semi-supervised learning, active learning, and weak supervision.
  • Discussion on Moravec's Paradox: examining the difference in cognitive complexity between tasks easily handled by AI versus tasks natural to humans.
  • Self-supervised learning: Introduction to pretext tasks, where models are trained on unlabeled data, and their application in Natural Language Processing (NLP).

Key Concepts:

  • Semi-supervised Learning: Training a model using both labeled and unlabeled data.
  • Active Learning: A model that actively seeks human-labeled data for improved accuracy.
  • Weak Supervision: Using weakly labeled data generated through heuristics or external knowledge sources.
  • Self-Supervised Learning: Training models on pretext tasks to build representations from unlabeled data, with applications in NLP.
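To make the semi-supervised idea concrete, here is a minimal pseudo-labeling sketch (one common semi-supervised technique, not necessarily the one used in the lecture): train on the small labeled set, label the unlabeled points the model is confident about, and retrain. The data, thresholds, and model are illustrative.

```python
import torch

torch.manual_seed(0)

# Hypothetical toy data: two Gaussian blobs; only a few points carry labels.
n_lab, n_unlab = 20, 200
x_lab = torch.cat([torch.randn(n_lab // 2, 2) - 2, torch.randn(n_lab // 2, 2) + 2])
y_lab = torch.cat([torch.zeros(n_lab // 2), torch.ones(n_lab // 2)])
x_unlab = torch.cat([torch.randn(n_unlab // 2, 2) - 2, torch.randn(n_unlab // 2, 2) + 2])

model = torch.nn.Linear(2, 1)
loss_fn = torch.nn.BCEWithLogitsLoss()
opt = torch.optim.SGD(model.parameters(), lr=0.1)

def fit(x, y, epochs=100):
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x).squeeze(-1), y)
        loss.backward()
        opt.step()

fit(x_lab, y_lab)                             # 1) train on labeled data only
with torch.no_grad():
    probs = torch.sigmoid(model(x_unlab).squeeze(-1))
confident = (probs < 0.1) | (probs > 0.9)     # 2) keep confident predictions
pseudo_y = (probs > 0.5).float()
fit(torch.cat([x_lab, x_unlab[confident]]),   # 3) retrain with pseudo-labels
    torch.cat([y_lab, pseudo_y[confident]]))
```

Active learning inverts step 2: instead of keeping confident predictions, the model asks a human to label the points it is least sure about.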

Week 03: Visualizing Gradient Descent Open in PDF

  • In this week's lesson, we explore the Gradient Descent algorithm, a fundamental method for optimizing machine learning models. The focus is on understanding how gradient descent works and its application in training a linear regression model. We also examine the use of PyTorch for implementing these concepts, visualizing the steps, and critically evaluating key aspects of gradient-based optimization.
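The five steps of gradient descent covered in the lesson can be sketched with PyTorch's autograd on a toy linear regression (the synthetic data and hyperparameters below are illustrative, not the lesson's exact values):

```python
import torch

torch.manual_seed(42)

# Synthetic data for y = 2x + 1 plus noise
x = torch.rand(100, 1)
y = 2 * x + 1 + 0.1 * torch.randn(100, 1)

# Parameters to learn, tracked by autograd
b = torch.zeros(1, requires_grad=True)
w = torch.zeros(1, requires_grad=True)

lr = 0.5
for epoch in range(200):
    yhat = b + w * x                  # 1) forward pass
    loss = ((yhat - y) ** 2).mean()   # 2) compute MSE loss
    loss.backward()                   # 3) compute gradients
    with torch.no_grad():             # 4) update parameters
        b -= lr * b.grad
        w -= lr * w.grad
    b.grad.zero_()                    # 5) reset gradients
    w.grad.zero_()

print(b.item(), w.item())  # approaches the true values 1.0 and 2.0
```

The `torch.no_grad()` context is essential in step 4: the parameter update itself must not become part of the computation graph.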

Week 04: Rethinking the training loop: a simple classification problem Open in PDF

  • Jupyter Rethinking the training loop:
    • build a function to perform training steps, implement our own dataset class, use data loaders to generate mini-batches
    • build a function to perform mini-batch gradient descent, evaluate our model
    • save / checkpoint our model to disk
    • load our model from disk to resume training or to deploy
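The pieces above — a training-step factory, a custom `Dataset`, a `DataLoader` for mini-batches, and checkpointing — can be sketched together as follows (tensor sizes and the checkpoint path are illustrative):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class CustomDataset(Dataset):
    """Minimal dataset wrapping feature/label tensors (hypothetical data)."""
    def __init__(self, x, y):
        self.x, self.y = x, y
    def __len__(self):
        return len(self.x)
    def __getitem__(self, idx):
        return self.x[idx], self.y[idx]

def make_train_step(model, loss_fn, optimizer):
    """Return a function that performs one mini-batch training step."""
    def train_step(x, y):
        model.train()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
        return loss.item()
    return train_step

torch.manual_seed(0)
x, y = torch.rand(64, 1), torch.rand(64, 1)
loader = DataLoader(CustomDataset(x, y), batch_size=16, shuffle=True)

model = torch.nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
train_step = make_train_step(model, torch.nn.MSELoss(), optimizer)

for epoch in range(5):
    for xb, yb in loader:                  # mini-batch gradient descent
        loss = train_step(xb, yb)

# Save a checkpoint to disk, then load it to resume training or deploy
torch.save({"model": model.state_dict(),
            "optimizer": optimizer.state_dict()}, "ckpt.pth")
ckpt = torch.load("ckpt.pth")
model.load_state_dict(ckpt["model"])
```

Saving both the model and optimizer state dicts is what lets training resume exactly where it stopped.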
  • Jupyter Going Classy:
    • define a class to handle model training
    • implement the constructor method
    • understand the difference between public, protected, and private methods of a class
    • integrate the code we’ve developed so far into our class
    • instantiate our class and use it to run a classy pipeline
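A minimal sketch of such a training class is below; the class name and API are illustrative, not the book's exact implementation. The single leading underscore marks `_train_step` as protected by Python convention:

```python
import torch

class Trainer:
    """Illustrative class bundling model, loss function, and optimizer."""
    def __init__(self, model, loss_fn, optimizer):
        self.model = model
        self.loss_fn = loss_fn
        self.optimizer = optimizer
        self.losses = []                  # public attribute

    def _train_step(self, x, y):          # protected helper
        self.model.train()
        loss = self.loss_fn(self.model(x), y)
        loss.backward()
        self.optimizer.step()
        self.optimizer.zero_grad()
        return loss.item()

    def train(self, loader, epochs):      # public entry point
        for _ in range(epochs):
            for xb, yb in loader:
                self.losses.append(self._train_step(xb, yb))

torch.manual_seed(0)
x = torch.rand(32, 1)
y = 3 * x + 1
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(x, y), batch_size=8)

model = torch.nn.Linear(1, 1)
trainer = Trainer(model, torch.nn.MSELoss(),
                  torch.optim.SGD(model.parameters(), lr=0.1))
trainer.train(loader, epochs=10)          # run a "classy" pipeline
```

Keeping the step logic in a protected method lets the public `train` stay a thin loop, which is the main payoff of the class-based design.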
  • Jupyter A simple classification problem:
    • build a model for binary classification
    • understand the concept of logits and how they relate to probabilities
    • use binary cross-entropy loss to train a model
    • weight the loss function to handle imbalanced datasets
    • understand the concepts of decision boundary and separability
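The logits/probabilities relationship and the imbalance weighting can be shown in a few lines. `BCEWithLogitsLoss` consumes raw logits and applies the sigmoid internally; its `pos_weight` argument upweights the positive class (the 9:1 ratio below is an illustrative imbalance, not from the lesson):

```python
import torch

# A logit is a raw score; sigmoid maps it to a probability.
logit = torch.tensor([0.0, 2.0, -2.0])
prob = torch.sigmoid(logit)   # tensor([0.5000, 0.8808, 0.1192])

# With, say, 9 negatives per positive, pos_weight = 9 rebalances the loss.
loss_fn = torch.nn.BCEWithLogitsLoss(pos_weight=torch.tensor([9.0]))
labels = torch.tensor([1.0, 1.0, 0.0])
loss = loss_fn(logit, labels)
```

Using the logits-based loss (rather than applying sigmoid yourself and using `BCELoss`) is also numerically more stable.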

Week 05: Machine Learning and Computer Vision - Part I Open in PDF

  • Jupyter From a shallow to a deep-ish classification model:
    • data generation for image classification
    • transformations using torchvision
    • dataset preparation techniques
    • building and training logistic regression and deep neural network models using PyTorch
    • exploring various activation functions such as Sigmoid, Tanh, and ReLU
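A deep-ish fully connected classifier along these lines might look as follows (the image size and layer widths are illustrative, not the lesson's generated dataset; the activation on the hidden layer is the part to experiment with):

```python
import torch
from torch import nn

torch.manual_seed(0)

# Deep-ish classifier for small grayscale images
model = nn.Sequential(
    nn.Flatten(),       # (N, 1, 10, 10) -> (N, 100)
    nn.Linear(100, 25),
    nn.ReLU(),          # swap in nn.Sigmoid() or nn.Tanh() to compare
    nn.Linear(25, 1),   # single logit for binary classification
)

x = torch.rand(8, 1, 10, 10)   # a mini-batch of 8 fake images
logits = model(x)
print(logits.shape)            # torch.Size([8, 1])
```

Without the nonlinearity between the two `Linear` layers, the model collapses to plain logistic regression — which is exactly the shallow-to-deep point of the lesson.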

Week 06: Machine Learning and Computer Vision - Part II Open in PDF

  • Jupyter Kernel
  • Jupyter Convolutions:
    • In this lesson, we’ve introduced convolutions and related concepts and built a convolutional neural network to tackle a multiclass classification problem.
      • Activation functions, pooling layers, flattening, LeNet-5
      • Softmax, cross-entropy
      • Visualizing the convolutional filters, feature maps, and classifier layers
      • Hooks in PyTorch
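These pieces fit together as in the sketch below: a small LeNet-style CNN (layer sizes are illustrative) trained with cross-entropy for a 3-class problem, plus a forward hook that captures the first layer's feature maps for visualization. Note that `nn.CrossEntropyLoss` takes raw logits and applies log-softmax internally.

```python
import torch
from torch import nn

torch.manual_seed(0)

# Small LeNet-style CNN for 3-class classification on 28x28 grayscale images
model = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),   # -> (6, 12, 12)
    nn.Conv2d(6, 16, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),  # -> (16, 4, 4)
    nn.Flatten(),
    nn.Linear(16 * 4 * 4, 3),   # 3 logits; CrossEntropyLoss adds log-softmax
)

# Forward hook: capture the first conv layer's feature maps on each forward pass
features = {}
def hook(module, inputs, output):
    features["conv1"] = output.detach()
handle = model[0].register_forward_hook(hook)

x = torch.rand(4, 1, 28, 28)
logits = model(x)
loss = nn.CrossEntropyLoss()(logits, torch.tensor([0, 1, 2, 0]))
handle.remove()                 # always remove hooks when done

print(features["conv1"].shape)  # torch.Size([4, 6, 24, 24])
```

The captured `features["conv1"]` tensor is what you would plot to visualize the feature maps.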

Week 07: Machine Learning and Computer Vision - Part III Open in PDF

  • Jupyter Rock, Paper and Scissors:
    • Standardize an image dataset
    • Train a model to predict rock, paper, scissors poses from hand images
    • Use dropout layers to regularize the model
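The key behavior of dropout regularization — random zeroing during training, identity at inference — is easy to verify directly (`p=0.5` below is just an example rate):

```python
import torch
from torch import nn

torch.manual_seed(13)

# Dropout zeroes activations at random during training and is disabled in eval mode.
drop = nn.Dropout(p=0.5)
x = torch.ones(1, 10)

drop.train()
train_out = drop(x)   # some entries zeroed; survivors scaled by 1/(1-p) = 2.0
drop.eval()
eval_out = drop(x)    # identity: dropout is inactive at inference time
```

The 1/(1-p) scaling at training time is what keeps the expected activation magnitude the same in both modes, so no rescaling is needed when switching to `eval()`.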

Week 08: Machine Learning and Computer Vision - Part III (Cont.) Open in PDF

  • Jupyter Rock, Paper and Scissors:
    • Learn how to find a learning rate to train the model
    • Understand the use of adaptive learning rates
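Both ideas can be sketched in a few lines. The first loop is a simplified LR range test (one common way to "find" a learning rate: ramp it up exponentially and watch where the loss starts to blow up — the growth factor and step count below are illustrative); the second part shows adaptive approaches via Adam and a plateau-based scheduler.

```python
import torch

torch.manual_seed(0)
x = torch.rand(128, 1)
y = 2 * x + 1
model = torch.nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4)

# LR range test sketch: grow the learning rate exponentially each step and
# record the loss; a good LR sits just before the loss starts to diverge.
lrs, losses = [], []
for step in range(50):
    loss = torch.nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    lrs.append(optimizer.param_groups[0]["lr"])
    losses.append(loss.item())
    optimizer.param_groups[0]["lr"] *= 1.2   # exponential ramp

# Adaptive learning rates: Adam keeps a per-parameter step size, and
# ReduceLROnPlateau shrinks the LR when the monitored metric stalls.
adam = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(adam, factor=0.5, patience=2)
scheduler.step(losses[-1])   # call once per epoch with the validation loss
```

In practice you would plot `losses` against `lrs` on a log scale and pick a learning rate slightly below the point where the curve turns upward.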