From 2763449fd8deb6dd4b9c9f6cbe642b601617feb2 Mon Sep 17 00:00:00 2001
From: "rafi.ghanbari"
Date: Sun, 4 Jan 2026 23:21:51 +0330
Subject: [PATCH] docs: audit and fix typos in READMEs and notebooks

---
 .../01-Linear_Regression.ipynb               |  6 ++---
 .../Chapter_01_Supervised_Learning/README.md |  8 +++----
 Jupyter_Notebooks/README.md                  | 22 +++++++++----------
 .../01-Linear_Regression.ipynb               |  6 ++---
 README.md                                    |  2 +-
 5 files changed, 22 insertions(+), 22 deletions(-)

diff --git a/Jupyter_Notebooks/Chapter_01_Supervised_Learning/01-Linear Regression/01-Linear_Regression.ipynb b/Jupyter_Notebooks/Chapter_01_Supervised_Learning/01-Linear Regression/01-Linear_Regression.ipynb
index ff6534a0..7ed8f2ba 100644
--- a/Jupyter_Notebooks/Chapter_01_Supervised_Learning/01-Linear Regression/01-Linear_Regression.ipynb
+++ b/Jupyter_Notebooks/Chapter_01_Supervised_Learning/01-Linear Regression/01-Linear_Regression.ipynb
@@ -63,7 +63,7 @@
 },
 "source": [
 "# Regression\n",
- "In this section, we will try to solve the problem of **Regression**. In our first step, we will atack the problem from the analytical view."
+ "In this section, we will try to solve the problem of **Regression**. In our first step, we will attack the problem from the analytical view."
 ]
 },
 {
@@ -377,7 +377,7 @@
 "## Gradient Descent:\n",
 "In this section, we will use the popular iterative method called **Gradient Descent** to solve the regression problem.\n",
 "\n",
- "Assuming we need to find $ w_0\\ and\\ w_1 $ in the problem of linear regression, update rule using gradinet descent will be:\n",
+ "Assuming we need to find $ w_0 $ and $ w_1 $ in the problem of linear regression, the update rule using gradient descent will be:\n",
 "$$\n",
 "\\begin{array}{l}\n",
 "w_0 \\leftarrow w_0 - \\eta \\frac{\\partial J}{\\partial w_0} = w_0 - \\eta \\sum_{i=1}^{m} (h_w(x^{(i)}) - y^{(i)}) \\\\\n",
@@ -660,7 +660,7 @@
 "id": "Rc90d9vW7qPX"
 },
 "source": [
- "As shown in the plots above, choosing a large learning rate leads to divergence. In this example, the update rule keeps making weights larger and larger and the weights will never converge. Choosing a small learning rate on the other hand, leads to slow convergence. In this example, learning $ w_0 $ is happening at a slow time because the update rule is being changed almost minimially !"
+ "As shown in the plots above, choosing a large learning rate leads to divergence. In this example, the update rule keeps making the weights larger and larger, and they never converge. Choosing a small learning rate, on the other hand, leads to slow convergence. In this example, learning $ w_0 $ happens slowly because each update changes it only minimally!"
 ]
 },
 {
diff --git a/Jupyter_Notebooks/Chapter_01_Supervised_Learning/README.md b/Jupyter_Notebooks/Chapter_01_Supervised_Learning/README.md
index 759fc695..1a85105a 100644
--- a/Jupyter_Notebooks/Chapter_01_Supervised_Learning/README.md
+++ b/Jupyter_Notebooks/Chapter_01_Supervised_Learning/README.md
@@ -2,7 +2,7 @@ You should work your way through the notebooks in the following order:
 
 # [1. Linear Regression](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_01_Supervised_Learning/01-Linear%20Regression/01-Linear_Regression.ipynb)
 
-In this notebook we solve the problem of Regression from scratch using only numpy. We also experiment wih Polynomial Regression and use sklearn to solve real-world problems.
+In this notebook we solve the problem of Regression from scratch using only numpy. We also experiment with Polynomial Regression and use sklearn to solve real-world problems.
 
 **Warning!** If you want to run this notebook online, make sure you run the following code to download the dataset:
 ```
@@ -40,7 +40,7 @@ In this notebook we use Regression to solve binary classification problems and u
 
 # [3. Logistic Regression](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_01_Supervised_Learning/03-Logistic%20Regression/03-Logistic_Regression.ipynb)
 
-In this notebook we learn about Logistic Regression and repurpose Regeression to classify data from small datasets.
+In this notebook we learn about Logistic Regression and repurpose Regression to classify data from small datasets.
 
 [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_01_Supervised_Learning/03-Logistic%20Regression/03-Logistic_Regression.ipynb)
 
@@ -49,7 +49,7 @@
 
 # [4. K-Nearest Neighbors](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_01_Supervised_Learning/04-kNN/04-kNN.ipynb)
 
-In this notebook we use implement the kNN algorithm from scratch and use it for classification.
+In this notebook we implement the kNN algorithm from scratch and use it for classification.
 
 [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_01_Supervised_Learning/04-kNN/04-kNN.ipynb)
 
@@ -58,7 +58,7 @@
 
 # [5. Ensemble Learning](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_01_Supervised_Learning/05-Ensemble%20Learning/05-Ensemble_Learning.ipynb)
 
-In this notebook we use implement the Decision Tree Classifiers from scratch and explore how ensembling different models improve classification.
+In this notebook we implement Decision Tree Classifiers from scratch and explore how ensembling different models improves classification.
 
 We also work with Random Forests and XGBoost and compare their results.
 
diff --git a/Jupyter_Notebooks/README.md b/Jupyter_Notebooks/README.md
index 3a346bb4..0aa0c6a8 100644
--- a/Jupyter_Notebooks/README.md
+++ b/Jupyter_Notebooks/README.md
@@ -3,29 +3,29 @@
 This is an overview of notebooks from all chapters.
 ----
 # [Chapter 1: Supervised Learning](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/tree/main/Jupyter_Notebooks/Chapter_01_Supervised_Learning)
-In this chapter we over the basics of Supervised Learning, where we train models using labeled data.
+In this chapter we go over the basics of Supervised Learning, where we train models using labeled data.
 To learn more about notebooks from this chapter check out [here](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_01_Supervised_Learning/README.md).
 
 ## [1. Linear Regression](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_01_Supervised_Learning/01-Linear%20Regression/01-Linear_Regression.ipynb)
-In this notebook we solve the problem of Regression from scratch using only numpy. We also experiment wih Polynomial Regression and use sklearn to solve real-world problems.
+In this notebook we solve the problem of Regression from scratch using only numpy. We also experiment with Polynomial Regression and use sklearn to solve real-world problems.
 
 ## [2. Linear Classification](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_01_Supervised_Learning/02-Linear%20Classification/02-Linear_Classification.ipynb)
 In this notebook we use Regression to solve binary classification problems and understand their limitations.
 
 ## [3. Logistic Regression](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_01_Supervised_Learning/03-Logistic%20Regression/03-Logistic_Regression.ipynb)
-In this notebook we learn about Logistic Regression and repurpose Regeression to classify data from small datasets.
+In this notebook we learn about Logistic Regression and repurpose Regression to classify data from small datasets.
 
 ## [4. K-Nearest Neighbors](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_01_Supervised_Learning/04-kNN/04-kNN.ipynb)
-In this notebook we use implement the kNN algorithm from scratch and use it for classification.
+In this notebook we implement the kNN algorithm from scratch and use it for classification.
 
 ## [5. Ensemble Learning](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_01_Supervised_Learning/05-Ensemble%20Learning/05-Ensemble_Learning.ipynb)
-In this notebook we use implement the Decision Tree Classifiers from scratch and explore how ensembling different models improve classification.
+In this notebook we implement Decision Tree Classifiers from scratch and explore how ensembling different models improves classification.
 We also work with Random Forests and XGBoost and compare their results.
 ----
 # [Chapter 2: Unsupervised Learning](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/tree/main/Jupyter_Notebooks/Chapter_02_Unsupervised_Learning)
-In this chapter we over the basics of Unsupervised Learning, where we train models using unlabeled data.
+In this chapter we go over the basics of Unsupervised Learning, where we train models using unlabeled data.
 To learn more about notebooks from this chapter check out [here](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_02_Unsupervised_Learning/README.md).
 
 ## [1. Clustering](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_02_Unsupervised_Learning/Clustering.ipynb)
@@ -37,7 +37,7 @@ In this notebook we go over different Dimensionality Reduction algorithms such a
 ----
 # [Chapter 3: Neural Networks](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/tree/main/Jupyter_Notebooks/Chapter_03_Neural_Networks)
-In this chapter we over the basics of Neural Networks, which are the building blocks of modern Deep Learning.
+In this chapter we go over the basics of Neural Networks, which are the building blocks of modern Deep Learning.
 To learn more about notebooks from this chapter check out [here](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_03_Neural_Networks/README.md).
 
 ## [1. Neural Networks from Scratch](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_03_Neural_Networks/NNs_from_scratch.ipynb)
@@ -60,8 +60,8 @@ We try to get better results using different optimizers, bigger networks, batch
 ----
-# [Chapter 4: Compuer Vision](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_04_Compuer_Vision)
-In this chapter we over the basics of Computer Vision, exploring how neural networks can be modified to solve different problems in Computer Vision.
+# [Chapter 4: Computer Vision](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_04_Compuer_Vision)
+In this chapter we go over the basics of Computer Vision, exploring how neural networks can be modified to solve different problems in Computer Vision.
 To learn more about notebooks from this chapter check out [here](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_04_Compuer_Vision/README.md).
 
 ## [1. Convolutional Neural Networks from Scratch](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_04_Computer_Vision/CNNs_from_scratch.ipynb)
@@ -86,7 +86,7 @@ In this notebook we introduce the U-Net architecture and investigate its use cas
 ----
 # [Chapter 5: Natural Language Processing](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_05_Natural_Language_Processing)
-In this chapter we over the basics of Natural Language Processing, exploring how neural networks can be modified to solve different problems in NLP.
+In this chapter we go over the basics of Natural Language Processing, exploring how neural networks can be modified to solve different problems in NLP.
 To learn more about notebooks from this chapter check out [here](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_05_Natural_Language_Processing/README.md).
 
 ## [1. Word Embeddings](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_05_Natural_Language_Processing/01-Word%20Embedding/Word%20Embedding.ipynb)
@@ -98,7 +98,7 @@ In this notebook, we are going to implement the Attention layers from scratch us
 ----
 # [Chapter 6: Contrastive Learning](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/tree/main/Jupyter_Notebooks/Chapter_06_Contrastive_Learning)
-In this chapter we over the basics of Contrastive Learning, exploring how models trained on images and languages can be used together to enhance the performance of our models.
+In this chapter we go over the basics of Contrastive Learning, exploring how models trained on images and languages can be used together to enhance the performance of our models.
 To learn more about notebooks from this chapter check out [here](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_06_Contrastive_Learning/README.md).
 
 ## [1. Vision Transformers](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/blob/main/Jupyter_Notebooks/Chapter_06_Contrastive_Learning/ViT.ipynb)
diff --git a/Previous_Semesters/2024/Jupyter_Notebooks/Chapter_01_Supervised_Learning/01-Linear Regression/01-Linear_Regression.ipynb b/Previous_Semesters/2024/Jupyter_Notebooks/Chapter_01_Supervised_Learning/01-Linear Regression/01-Linear_Regression.ipynb
index f514b3c3..5c28a16b 100644
--- a/Previous_Semesters/2024/Jupyter_Notebooks/Chapter_01_Supervised_Learning/01-Linear Regression/01-Linear_Regression.ipynb
+++ b/Previous_Semesters/2024/Jupyter_Notebooks/Chapter_01_Supervised_Learning/01-Linear Regression/01-Linear_Regression.ipynb
@@ -40,7 +40,7 @@
 },
 "source": [
 "# Regression\n",
- "In this section, we will try to solve the problem of **Regression**. In our first step, we will atack the problem from the analytical view."
+ "In this section, we will try to solve the problem of **Regression**. In our first step, we will attack the problem from the analytical view."
 ]
 },
 {
@@ -404,7 +404,7 @@
 "## Gradient Descent:\n",
 "In this section, we will use the popular iterative method called **Gradient Descent** to solve the regression problem.\n",
 "\n",
- "Assuming we need to find $ w_0\\ and\\ w_1 $ in the problem of linear regression, update rule using gradinet descent will be:\n",
+ "Assuming we need to find $ w_0 $ and $ w_1 $ in the problem of linear regression, the update rule using gradient descent will be:\n",
 "$$\n",
 "\\begin{array}{l}\n",
 "w_0 \\leftarrow w_0 - \\eta \\frac{\\partial J}{\\partial w_0} = w_0 - \\eta \\sum_{i=1}^{m} (h_w(x^{(i)}) - y^{(i)}) \\\\\n",
@@ -693,7 +693,7 @@
 "id": "Rc90d9vW7qPX"
 },
 "source": [
- "As shown in the plots above, choosing a large learning rate leads to divergence. In this example, the update rule keeps making weights larger and larger and the weights will never converge. Choosing a small learning rate on the other hand, leads to slow convergence. In this example, learning $ w_0 $ is happening at a slow time because the update rule is being changed almost minimially !"
+ "As shown in the plots above, choosing a large learning rate leads to divergence. In this example, the update rule keeps making the weights larger and larger, and they never converge. Choosing a small learning rate, on the other hand, leads to slow convergence. In this example, learning $ w_0 $ happens slowly because each update changes it only minimally!"
 ]
 },
 {
diff --git a/README.md b/README.md
index 55a65c3a..b919612d 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
 # Machine Learning Course
 
-Welcome to the "Machine Learning" course of [Department of Computer Engineering](https://ce.sharif.edu), [Sharif University of Technology](https://www.sharif.edu).
+Welcome to the "Machine Learning" course at the [Department of Computer Engineering](https://ce.sharif.edu), [Sharif University of Technology](https://www.sharif.edu).
 
 You can access [slides](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/tree/main/Slides), [Jupyter notebooks](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/tree/main/Jupyter_Notebooks), and [exercises](https://github.com/SharifiZarchi/Introduction_to_Machine_Learning/tree/main/Exercises).