From 7838326c61552f4a574ec4e448fb0e663ab71f0c Mon Sep 17 00:00:00 2001 From: Oisin Date: Sun, 9 Feb 2025 14:12:35 +0000 Subject: [PATCH 01/10] Updated qmarkdown reports to be inline with jupyterlab reports --- ...checture.png => AlexNet8_architecture.png} | Bin report/qmarkdown/keras_analysis_results.qmd | 26 +++++-- report/qmarkdown/torch_analysis_results.qmd | 69 ++++++++++++++++++ ..._archecture.png => VGG16_architecture.png} | Bin 4 files changed, 87 insertions(+), 8 deletions(-) rename report/keras/{AlexNet8_archecture.png => AlexNet8_architecture.png} (100%) create mode 100644 report/qmarkdown/torch_analysis_results.qmd rename report/torch/{VGG16_archecture.png => VGG16_architecture.png} (100%) diff --git a/report/keras/AlexNet8_archecture.png b/report/keras/AlexNet8_architecture.png similarity index 100% rename from report/keras/AlexNet8_archecture.png rename to report/keras/AlexNet8_architecture.png diff --git a/report/qmarkdown/keras_analysis_results.qmd b/report/qmarkdown/keras_analysis_results.qmd index 29aa3b2..b46aefe 100644 --- a/report/qmarkdown/keras_analysis_results.qmd +++ b/report/qmarkdown/keras_analysis_results.qmd @@ -1,5 +1,5 @@ --- -title: "Analysis Results" +title: "Keras Analysis Results" format: html: toc: true @@ -14,6 +14,14 @@ jupyter: python3 # Cats vs Dogs Image Classification ```{python} +import sys +import re +import os + +root_dir_re_match = re.findall(string=os.getcwd(), pattern="^.+CatClassifier")[0] +sys.path.append(os.path.join(root_dir_re_match, "model")) +import cons + from tensorflow import keras ``` @@ -21,17 +29,19 @@ This project aims to create a model to classify cat and dog images. The data was ## Example Image -![Random Image](report/keras/random_image.jpg) +![Random Image](../keras/random_image.jpg) ## Data Processing The images were further processed using rotations, scaling, zooming, flipping and shearing prior to the modelling training phase. See example image processing below. -![Generator Plot](report/keras/generator_plot.jpg) +![Generator Plot](../keras/generator_plot.jpg) + +## AlexNet8 Model Architecture -## AlexNet8 Model Archecture +An AlexNet CNN model with 8 layers was trained using the processed images via Keras. See AlexNet diagram below, as well as Keras model summary. -An AlexNet CNN model with 8 layers was trained using the processed images via Keras. See AlexNet diagram below, as well as keras model summary. +![AlexNet Architecture](../report/keras/AlexNet8_architecture.png) ```{python} # load trained keras model @@ -44,12 +54,12 @@ model.summary() The model was trained across 25 epochs. The model accuracy and loss are plotted below across the training and validation sets. -![Model Accuaracy](report/keras/model_accuracy.png) +![Model Accuracy](../keras/model_accuracy.png) -![Model Loss](report/keras/model_loss.png) +![Model Loss](../keras/model_loss.png) ## Model Image Predictions The model predictions were made for the Kaggle test set, see below example model predictions. 
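The augmentation described in the Data Processing section above (rotations, scaling, zooming, flipping and shearing) corresponds to a Keras `ImageDataGenerator` configured roughly as follows. This is a minimal sketch: the parameter values and directory layout are assumptions, and only the 128x128 input size comes from the AlexNet8 model summary.

```python
# Sketch of the augmentation implied by the Data Processing section.
# Parameter values and directory layout are assumptions; only the 128x128
# input size is taken from the AlexNet8 model summary.
from tensorflow.keras.preprocessing.image import ImageDataGenerator

train_datagen = ImageDataGenerator(
    rescale=1.0 / 255,      # scaling
    rotation_range=15,      # rotations
    zoom_range=0.2,         # zooming
    shear_range=0.1,        # shearing
    horizontal_flip=True,   # flipping
)

# flow_from_directory expects one sub-folder per class, e.g. train/cat and train/dog
train_generator = train_datagen.flow_from_directory(
    "data/train",            # assumed directory layout
    target_size=(128, 128),  # matches the (None, 128, 128, 3) input layer
    batch_size=32,
    class_mode="categorical",
)
```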
-![Predicted Images](report/keras/pred_images.jpg) +![Predicted Images](../keras/pred_images.jpg) diff --git a/report/qmarkdown/torch_analysis_results.qmd b/report/qmarkdown/torch_analysis_results.qmd new file mode 100644 index 0000000..15ac74f --- /dev/null +++ b/report/qmarkdown/torch_analysis_results.qmd @@ -0,0 +1,69 @@ +--- +title: "Torch Analysis Results" +format: + html: + toc: true + toc-location: left + toc-depth: 2 + toc-title: Contents + code-fold: false + echo: false +jupyter: python3 +--- + +# Cats vs Dogs Image Classification + +```{python} +import sys +import re +import os + +root_dir_re_match = re.findall(string=os.getcwd(), pattern="^.+CatClassifier")[0] +sys.path.append(os.path.join(root_dir_re_match, "model")) +import cons + +import torch +from model.torch.VGG16_pretrained import VGG16_pretrained +``` + +This project aims to create a model to classify cat and dog images. The data was sourced from the [dogs-vs-cats](https://www.kaggle.com/competitions/dogs-vs-cats/overview) Kaggle competition, and also from [freeimages.com](https://www.freeimages.com/) using a web scraper. Docker containers were used to deploy the application on an EC2 spot instances in order to scale up hardware and computation power. + +## Example Image + +![Random Image](../torch/random_image.jpg) + +## Data Processing + +The images were resized to a uniform dimension prior to the modelling training phase. See example image processing below. + +![Generator Plot](../torch/generator_plot.jpg) + +## VGG16 Model Architecture + +A pre-trained VGG CNN model with 16 layers was trained using the processed images via PyTorch. See VGG16 diagram below, as well as torch model summary. + +![AlexNet Architecture](../torch/VGG16_architecture.png) + +```{python} +# device configuration +device = torch.device('cuda' if torch.cuda.is_available() else 'cpu') +# load trained torch model +model = VGG16_pretrained(num_classes=2).to(device) +model.load(input_fpath=cons.torch_model_pt_fpath) +# print model summary +print(model) +``` + +## Model Performance + +The model was trained across 4 epochs. The model accuracy and loss are plotted below across the training and validation sets. + +![Model Accuracy](../torch/model_accuracy.png) + +![Model Loss](../torch/model_loss.png) + +## Model Image Predictions + +The model predictions were made for the Kaggle test set, see below example model predictions. 
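The uniform resizing mentioned in the Data Processing section above is typically expressed as a `torchvision` transform pipeline. A minimal sketch, assuming the standard 224x224 input size and ImageNet normalisation statistics for the pretrained VGG16 weights (neither value is shown in the report code):

```python
# Sketch of a preprocessing pipeline for the pretrained VGG16.
# The 224x224 size and the ImageNet mean/std are assumptions based on the
# standard torchvision pretrained weights.
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),                    # resize to a uniform dimension
    transforms.ToTensor(),                            # PIL image -> float tensor in [0, 1]
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),  # ImageNet statistics
])

# usage: batch = preprocess(pil_image).unsqueeze(0).to(device)
```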
+ +![Predicted Images](../report/torch/pred_images.jpg) diff --git a/report/torch/VGG16_archecture.png b/report/torch/VGG16_architecture.png similarity index 100% rename from report/torch/VGG16_archecture.png rename to report/torch/VGG16_architecture.png From 43c53d3cdf318d36d816f5270a43d4819ce9d117 Mon Sep 17 00:00:00 2001 From: Oisin Date: Sun, 9 Feb 2025 14:16:48 +0000 Subject: [PATCH 02/10] restructured qmardown and ipython notebooks to read report model files from child directories --- .../keras_analysis_results.ipynb | 0 .../keras_analysis_results.qmd | 12 +- report/notebooks/torch_analysis_results.ipynb | 260 ------------------ report/torch_analysis_results.ipynb | 118 ++++++++ .../torch_analysis_results.qmd | 12 +- 5 files changed, 130 insertions(+), 272 deletions(-) rename report/{notebooks => }/keras_analysis_results.ipynb (100%) rename report/{qmarkdown => }/keras_analysis_results.qmd (85%) delete mode 100644 report/notebooks/torch_analysis_results.ipynb create mode 100644 report/torch_analysis_results.ipynb rename report/{qmarkdown => }/torch_analysis_results.qmd (86%) diff --git a/report/notebooks/keras_analysis_results.ipynb b/report/keras_analysis_results.ipynb similarity index 100% rename from report/notebooks/keras_analysis_results.ipynb rename to report/keras_analysis_results.ipynb diff --git a/report/qmarkdown/keras_analysis_results.qmd b/report/keras_analysis_results.qmd similarity index 85% rename from report/qmarkdown/keras_analysis_results.qmd rename to report/keras_analysis_results.qmd index b46aefe..427341b 100644 --- a/report/qmarkdown/keras_analysis_results.qmd +++ b/report/keras_analysis_results.qmd @@ -29,19 +29,19 @@ This project aims to create a model to classify cat and dog images. The data was ## Example Image -![Random Image](../keras/random_image.jpg) +![Random Image](keras/random_image.jpg) ## Data Processing The images were further processed using rotations, scaling, zooming, flipping and shearing prior to the modelling training phase. See example image processing below. -![Generator Plot](../keras/generator_plot.jpg) +![Generator Plot](keras/generator_plot.jpg) ## AlexNet8 Model Architecture An AlexNet CNN model with 8 layers was trained using the processed images via Keras. See AlexNet diagram below, as well as Keras model summary. -![AlexNet Architecture](../report/keras/AlexNet8_architecture.png) +![AlexNet Architecture](keras/AlexNet8_architecture.png) ```{python} # load trained keras model @@ -54,12 +54,12 @@ model.summary() The model was trained across 25 epochs. The model accuracy and loss are plotted below across the training and validation sets. -![Model Accuracy](../keras/model_accuracy.png) +![Model Accuracy](keras/model_accuracy.png) -![Model Loss](../keras/model_loss.png) +![Model Loss](keras/model_loss.png) ## Model Image Predictions The model predictions were made for the Kaggle test set, see below example model predictions. 
-![Predicted Images](../keras/pred_images.jpg) +![Predicted Images](keras/pred_images.jpg) diff --git a/report/notebooks/torch_analysis_results.ipynb b/report/notebooks/torch_analysis_results.ipynb deleted file mode 100644 index 196438b..0000000 --- a/report/notebooks/torch_analysis_results.ipynb +++ /dev/null @@ -1,260 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "id": "a9219296-311e-4135-8a3a-8c4320624286", - "metadata": {}, - "source": [ - "# Cats vs Dogs Image Classification" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "id": "6424bfa9-adfd-48bc-a203-b6242008dd3a", - "metadata": {}, - "outputs": [], - "source": [ - "import sys\n", - "sys.path.append('..')\n", - "import cons\n", - "\n", - "import torch\n", - "from model.torch.VGG16_pretrained import VGG16_pretrained" - ] - }, - { - "cell_type": "markdown", - "id": "05783f63-8fb2-47d0-b88a-8fc91842bd90", - "metadata": {}, - "source": [ - "This project aims to create a model to classify cat and dog images. The data was sourced from the [dogs-vs-cats](https://www.kaggle.com/competitions/dogs-vs-cats/overview) Kaggle competition, and also from [freeimages.com](https://www.freeimages.com/) using a web scraper. Docker containers were used to deploy the application on an EC2 spot instances in order to scale up hardware and computation power. " - ] - }, - { - "cell_type": "markdown", - "id": "1c739375-4438-4ee1-8889-3bec76b070a7", - "metadata": {}, - "source": [ - "## Example Image" - ] - }, - { - "attachments": {}, - "cell_type": "markdown", - "id": "e9d44483-13ac-4013-8d3e-1573872f005a", - "metadata": {}, - "source": [ - "![Random Image](../report/torch/random_image.jpg)" - ] - }, - { - "cell_type": "markdown", - "id": "93a586a0-abe8-4e87-8125-3a0761ecac49", - "metadata": {}, - "source": [ - "## Data Processing" - ] - }, - { - "cell_type": "markdown", - "id": "cb0ec205-434c-4fe5-a4f0-26f811f25761", - "metadata": {}, - "source": [ - "The images were resized to a uniform dimension prior to the modelling training phase. See example image processing below. " - ] - }, - { - "attachments": {}, - "cell_type": "markdown", - "id": "7d53af71-c2ee-4ed1-8335-5565dce7951a", - "metadata": {}, - "source": [ - "![Generator Plot](../report/torch/generator_plot.jpg)" - ] - }, - { - "cell_type": "markdown", - "id": "9ac34849-f97f-4086-99ca-c8ab2ba5d77e", - "metadata": {}, - "source": [ - "## VGG16 Model Archecture" - ] - }, - { - "cell_type": "markdown", - "id": "a7b1dcff-9319-4973-9659-fda87cb85481", - "metadata": {}, - "source": [ - "A pretrained VGG CNN model with 16 layers was trained using the processed images via pytorch. See VGG16 diagram below, as well as torch model summary." 
- ] - }, - { - "cell_type": "markdown", - "id": "f65e6c9c-c63f-41c2-90f0-2c1321881b81", - "metadata": {}, - "source": [ - "![AlexNet Architecture](../report/torch/VGG16_archecture.png)" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "id": "6f790b84-1255-42fb-b07c-37065ab49c4d", - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "VGG16_pretrained(\n", - " (resnet): VGG(\n", - " (features): Sequential(\n", - " (0): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", - " (1): ReLU(inplace=True)\n", - " (2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", - " (3): ReLU(inplace=True)\n", - " (4): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)\n", - " (5): Conv2d(64, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", - " (6): ReLU(inplace=True)\n", - " (7): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", - " (8): ReLU(inplace=True)\n", - " (9): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)\n", - " (10): Conv2d(128, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", - " (11): ReLU(inplace=True)\n", - " (12): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", - " (13): ReLU(inplace=True)\n", - " (14): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", - " (15): ReLU(inplace=True)\n", - " (16): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)\n", - " (17): Conv2d(256, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", - " (18): ReLU(inplace=True)\n", - " (19): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", - " (20): ReLU(inplace=True)\n", - " (21): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", - " (22): ReLU(inplace=True)\n", - " (23): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)\n", - " (24): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", - " (25): ReLU(inplace=True)\n", - " (26): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", - " (27): ReLU(inplace=True)\n", - " (28): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", - " (29): ReLU(inplace=True)\n", - " (30): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)\n", - " )\n", - " (avgpool): AdaptiveAvgPool2d(output_size=(7, 7))\n", - " (classifier): Sequential(\n", - " (0): Linear(in_features=25088, out_features=4096, bias=True)\n", - " (1): ReLU(inplace=True)\n", - " (2): Dropout(p=0.5, inplace=False)\n", - " (3): Linear(in_features=4096, out_features=4096, bias=True)\n", - " (4): ReLU(inplace=True)\n", - " (5): Dropout(p=0.5, inplace=False)\n", - " (6): Linear(in_features=4096, out_features=1000, bias=True)\n", - " )\n", - " )\n", - " (classifier): Sequential(\n", - " (0): Linear(in_features=1000, out_features=2, bias=True)\n", - " )\n", - ")\n" - ] - } - ], - "source": [ - "# device configuration\n", - "device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')\n", - "# load trained torch model\n", - "model = VGG16_pretrained(num_classes=2).to(device)\n", - "model.load(input_fpath=cons.torch_model_pt_fpath)\n", - "# print model summary\n", - "print(model)" - ] - }, - { - "cell_type": "markdown", - "id": "b81cdf44-b643-400e-8f14-556384ba9ad0", - "metadata": {}, - "source": [ - "## Model Performance" - ] - }, - { - "cell_type": "markdown", - "id": "f771965b-b74f-4363-aa95-242038dcd235", - 
"metadata": {}, - "source": [ - "The model was trained across 4 epochs. The model accuracy and loss are plotted below across the training and validation sets." - ] - }, - { - "attachments": {}, - "cell_type": "markdown", - "id": "52070b1f-4162-4bb8-a082-534a7b858335", - "metadata": {}, - "source": [ - "![Model Accuaracy](../report/torch/model_accuracy.png)" - ] - }, - { - "attachments": {}, - "cell_type": "markdown", - "id": "c9d58fe1-fce7-4754-a35a-10a7325b7bdd", - "metadata": {}, - "source": [ - "![Model Loss](../report/torch/model_loss.png)" - ] - }, - { - "cell_type": "markdown", - "id": "990e17f5-70dc-4542-8c24-83c52bcc3859", - "metadata": {}, - "source": [ - "## Model Image Predictions" - ] - }, - { - "cell_type": "markdown", - "id": "f307af91-0b9f-486f-bc8d-f8acb3845949", - "metadata": {}, - "source": [ - "The model predictions were made for the Kaggle test set, see below example model predictions." - ] - }, - { - "attachments": {}, - "cell_type": "markdown", - "id": "d8856811-a69b-4f4c-a2b6-9d528fcfb75e", - "metadata": {}, - "source": [ - "![Predicted Images](../report/torch/pred_images.jpg)" - ] - }, - { - "cell_type": "markdown", - "id": "ee25817b", - "metadata": {}, - "source": [] - } - ], - "metadata": { - "kernelspec": { - "display_name": "catclass", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.12.7" - } - }, - "nbformat": 4, - "nbformat_minor": 5 -} diff --git a/report/torch_analysis_results.ipynb b/report/torch_analysis_results.ipynb new file mode 100644 index 0000000..7c1cf28 --- /dev/null +++ b/report/torch_analysis_results.ipynb @@ -0,0 +1,118 @@ +{ + "cells": [ + { + "cell_type": "raw", + "metadata": {}, + "source": [ + "---\n", + "title: Torch Analysis Results\n", + "format:\n", + " html:\n", + " toc: true\n", + " toc-location: left\n", + " toc-depth: 2\n", + " toc-title: Contents\n", + " code-fold: false\n", + " echo: false\n", + "---" + ], + "id": "12b5a990" + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Cats vs Dogs Image Classification\n" + ], + "id": "3dd7e904" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "import sys\n", + "import re\n", + "import os\n", + "\n", + "root_dir_re_match = re.findall(string=os.getcwd(), pattern=\"^.+CatClassifier\")[0]\n", + "sys.path.append(os.path.join(root_dir_re_match, \"model\"))\n", + "import cons\n", + "\n", + "import torch\n", + "from model.torch.VGG16_pretrained import VGG16_pretrained" + ], + "id": "9e032473", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This project aims to create a model to classify cat and dog images. The data was sourced from the [dogs-vs-cats](https://www.kaggle.com/competitions/dogs-vs-cats/overview) Kaggle competition, and also from [freeimages.com](https://www.freeimages.com/) using a web scraper. Docker containers were used to deploy the application on an EC2 spot instances in order to scale up hardware and computation power. \n", + "\n", + "## Example Image\n", + "\n", + "![Random Image](torch/random_image.jpg)\n", + "\n", + "## Data Processing\n", + "\n", + "The images were resized to a uniform dimension prior to the modelling training phase. See example image processing below. 
\n", + "\n", + "![Generator Plot](torch/generator_plot.jpg)\n", + "\n", + "## VGG16 Model Architecture\n", + "\n", + "A pre-trained VGG CNN model with 16 layers was trained using the processed images via PyTorch. See VGG16 diagram below, as well as torch model summary.\n", + "\n", + "![AlexNet Architecture](torch/VGG16_architecture.png)\n" + ], + "id": "87a94848" + }, + { + "cell_type": "code", + "metadata": {}, + "source": [ + "# device configuration\n", + "device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')\n", + "# load trained torch model\n", + "model = VGG16_pretrained(num_classes=2).to(device)\n", + "model.load(input_fpath=cons.torch_model_pt_fpath)\n", + "# print model summary\n", + "print(model)" + ], + "id": "504c7d94", + "execution_count": null, + "outputs": [] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Model Performance\n", + "\n", + "The model was trained across 4 epochs. The model accuracy and loss are plotted below across the training and validation sets.\n", + "\n", + "![Model Accuracy](torch/model_accuracy.png)\n", + "\n", + "![Model Loss](torch/model_loss.png)\n", + "\n", + "## Model Image Predictions\n", + "\n", + "The model predictions were made for the Kaggle test set, see below example model predictions.\n", + "\n", + "![Predicted Images](report/torch/pred_images.jpg)" + ], + "id": "51886ec7" + } + ], + "metadata": { + "kernelspec": { + "name": "python3", + "language": "python", + "display_name": "Python 3 (ipykernel)" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} \ No newline at end of file diff --git a/report/qmarkdown/torch_analysis_results.qmd b/report/torch_analysis_results.qmd similarity index 86% rename from report/qmarkdown/torch_analysis_results.qmd rename to report/torch_analysis_results.qmd index 15ac74f..695139f 100644 --- a/report/qmarkdown/torch_analysis_results.qmd +++ b/report/torch_analysis_results.qmd @@ -30,19 +30,19 @@ This project aims to create a model to classify cat and dog images. The data was ## Example Image -![Random Image](../torch/random_image.jpg) +![Random Image](torch/random_image.jpg) ## Data Processing The images were resized to a uniform dimension prior to the modelling training phase. See example image processing below. -![Generator Plot](../torch/generator_plot.jpg) +![Generator Plot](torch/generator_plot.jpg) ## VGG16 Model Architecture A pre-trained VGG CNN model with 16 layers was trained using the processed images via PyTorch. See VGG16 diagram below, as well as torch model summary. -![AlexNet Architecture](../torch/VGG16_architecture.png) +![AlexNet Architecture](torch/VGG16_architecture.png) ```{python} # device configuration @@ -58,12 +58,12 @@ print(model) The model was trained across 4 epochs. The model accuracy and loss are plotted below across the training and validation sets. -![Model Accuracy](../torch/model_accuracy.png) +![Model Accuracy](torch/model_accuracy.png) -![Model Loss](../torch/model_loss.png) +![Model Loss](torch/model_loss.png) ## Model Image Predictions The model predictions were made for the Kaggle test set, see below example model predictions. 
-![Predicted Images](../report/torch/pred_images.jpg) +![Predicted Images](torch/pred_images.jpg) From 904b260af6e4021cc857de8d96fb9ab14350497f Mon Sep 17 00:00:00 2001 From: Oisin Date: Sun, 9 Feb 2025 14:24:11 +0000 Subject: [PATCH 03/10] Fixed file paths in qmarkdown reports --- report/keras_analysis_results.qmd | 7 ++----- report/torch_analysis_results.qmd | 5 +---- 2 files changed, 3 insertions(+), 9 deletions(-) diff --git a/report/keras_analysis_results.qmd b/report/keras_analysis_results.qmd index 427341b..5dcf88a 100644 --- a/report/keras_analysis_results.qmd +++ b/report/keras_analysis_results.qmd @@ -15,11 +15,8 @@ jupyter: python3 ```{python} import sys -import re -import os -root_dir_re_match = re.findall(string=os.getcwd(), pattern="^.+CatClassifier")[0] -sys.path.append(os.path.join(root_dir_re_match, "model")) +sys.path.append("../model") import cons from tensorflow import keras @@ -45,7 +42,7 @@ An AlexNet CNN model with 8 layers was trained using the processed images via Ke ```{python} # load trained keras model -model = keras.models.load_model('data/keras_model.h5') +model = keras.models.load_model(cons.keras_model_pickle_fpath) # print model summary model.summary() ``` diff --git a/report/torch_analysis_results.qmd b/report/torch_analysis_results.qmd index 695139f..bad237a 100644 --- a/report/torch_analysis_results.qmd +++ b/report/torch_analysis_results.qmd @@ -15,11 +15,8 @@ jupyter: python3 ```{python} import sys -import re -import os -root_dir_re_match = re.findall(string=os.getcwd(), pattern="^.+CatClassifier")[0] -sys.path.append(os.path.join(root_dir_re_match, "model")) +sys.path.append("../model") import cons import torch From b7d5c21f2620b872719011a28be2e39d2aeffb24 Mon Sep 17 00:00:00 2001 From: Oisin Date: Sun, 9 Feb 2025 14:34:56 +0000 Subject: [PATCH 04/10] Refreshed notebook analyses --- report/keras_analysis_results.ipynb | 209 ++++++++++++++++++++-------- report/torch_analysis_results.ipynb | 128 +++++++++++------ 2 files changed, 239 insertions(+), 98 deletions(-) diff --git a/report/keras_analysis_results.ipynb b/report/keras_analysis_results.ipynb index 545712d..0fb87d7 100644 --- a/report/keras_analysis_results.ipynb +++ b/report/keras_analysis_results.ipynb @@ -10,11 +10,16 @@ }, { "cell_type": "code", - "execution_count": 3, + "execution_count": 5, "id": "6424bfa9-adfd-48bc-a203-b6242008dd3a", "metadata": {}, "outputs": [], "source": [ + "import sys\n", + "\n", + "sys.path.append(\"../model\")\n", + "import cons\n", + "\n", "from tensorflow import keras" ] }, @@ -89,67 +94,161 @@ "id": "f65e6c9c-c63f-41c2-90f0-2c1321881b81", "metadata": {}, "source": [ - "![AlexNet Architecture](../report/keras/AlexNet8_archecture.png)" + "![AlexNet Architecture](keras/AlexNet8_architecture.png)" ] }, { "cell_type": "code", - "execution_count": 4, + "execution_count": 6, "id": "6f790b84-1255-42fb-b07c-37065ab49c4d", "metadata": {}, "outputs": [ { - "name": "stdout", - "output_type": "stream", - "text": [ - "Model: \"AlexNet8\"\n", - "_________________________________________________________________\n", - " Layer (type) Output Shape Param # \n", - "=================================================================\n", - " input_1 (InputLayer) [(None, 128, 128, 3)] 0 \n", - " \n", - " conv2d (Conv2D) (None, 30, 30, 96) 34944 \n", - " \n", - " max_pooling2d (MaxPooling2D (None, 14, 14, 96) 0 \n", - " ) \n", - " \n", - " conv2d_1 (Conv2D) (None, 14, 14, 256) 614656 \n", - " \n", - " max_pooling2d_1 (MaxPooling (None, 6, 6, 256) 0 \n", - " 2D) \n", - " \n", - " conv2d_2 
(Conv2D) (None, 6, 6, 384) 885120 \n", - " \n", - " conv2d_3 (Conv2D) (None, 6, 6, 384) 1327488 \n", - " \n", - " conv2d_4 (Conv2D) (None, 6, 6, 256) 884992 \n", - " \n", - " max_pooling2d_2 (MaxPooling (None, 2, 2, 256) 0 \n", - " 2D) \n", - " \n", - " flatten (Flatten) (None, 1024) 0 \n", - " \n", - " dense (Dense) (None, 4096) 4198400 \n", - " \n", - " dropout (Dropout) (None, 4096) 0 \n", - " \n", - " dense_1 (Dense) (None, 4096) 16781312 \n", - " \n", - " dropout_1 (Dropout) (None, 4096) 0 \n", - " \n", - " dense_2 (Dense) (None, 2) 8194 \n", - " \n", - "=================================================================\n", - "Total params: 24,735,106\n", - "Trainable params: 24,735,106\n", - "Non-trainable params: 0\n", - "_________________________________________________________________\n" - ] + "data": { + "text/html": [ + "
Model: \"AlexNet8\"\n",
+       "
\n" + ], + "text/plain": [ + "\u001b[1mModel: \"AlexNet8\"\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n",
+       "┃ Layer (type)                     Output Shape                  Param # ┃\n",
+       "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n",
+       "│ input_layer (InputLayer)        │ (None, 128, 128, 3)    │             0 │\n",
+       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
+       "│ conv2d (Conv2D)                 │ (None, 30, 30, 96)     │        34,944 │\n",
+       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
+       "│ max_pooling2d (MaxPooling2D)    │ (None, 14, 14, 96)     │             0 │\n",
+       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
+       "│ conv2d_1 (Conv2D)               │ (None, 14, 14, 256)    │       614,656 │\n",
+       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
+       "│ max_pooling2d_1 (MaxPooling2D)  │ (None, 6, 6, 256)      │             0 │\n",
+       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
+       "│ conv2d_2 (Conv2D)               │ (None, 6, 6, 384)      │       885,120 │\n",
+       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
+       "│ conv2d_3 (Conv2D)               │ (None, 6, 6, 384)      │     1,327,488 │\n",
+       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
+       "│ conv2d_4 (Conv2D)               │ (None, 6, 6, 256)      │       884,992 │\n",
+       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
+       "│ max_pooling2d_2 (MaxPooling2D)  │ (None, 2, 2, 256)      │             0 │\n",
+       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
+       "│ flatten (Flatten)               │ (None, 1024)           │             0 │\n",
+       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
+       "│ dense (Dense)                   │ (None, 4096)           │     4,198,400 │\n",
+       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
+       "│ dropout (Dropout)               │ (None, 4096)           │             0 │\n",
+       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
+       "│ dense_1 (Dense)                 │ (None, 4096)           │    16,781,312 │\n",
+       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
+       "│ dropout_1 (Dropout)             │ (None, 4096)           │             0 │\n",
+       "├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
+       "│ dense_2 (Dense)                 │ (None, 2)              │         8,194 │\n",
+       "└─────────────────────────────────┴────────────────────────┴───────────────┘\n",
+       "
\n" + ], + "text/plain": [ + "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n", + "┃\u001b[1m \u001b[0m\u001b[1mLayer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mOutput Shape \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Param #\u001b[0m\u001b[1m \u001b[0m┃\n", + "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n", + "│ input_layer (\u001b[38;5;33mInputLayer\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m128\u001b[0m, \u001b[38;5;34m128\u001b[0m, \u001b[38;5;34m3\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │\n", + "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", + "│ conv2d (\u001b[38;5;33mConv2D\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m30\u001b[0m, \u001b[38;5;34m30\u001b[0m, \u001b[38;5;34m96\u001b[0m) │ \u001b[38;5;34m34,944\u001b[0m │\n", + "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", + "│ max_pooling2d (\u001b[38;5;33mMaxPooling2D\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m14\u001b[0m, \u001b[38;5;34m14\u001b[0m, \u001b[38;5;34m96\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │\n", + "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", + "│ conv2d_1 (\u001b[38;5;33mConv2D\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m14\u001b[0m, \u001b[38;5;34m14\u001b[0m, \u001b[38;5;34m256\u001b[0m) │ \u001b[38;5;34m614,656\u001b[0m │\n", + "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", + "│ max_pooling2d_1 (\u001b[38;5;33mMaxPooling2D\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m6\u001b[0m, \u001b[38;5;34m6\u001b[0m, \u001b[38;5;34m256\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │\n", + "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", + "│ conv2d_2 (\u001b[38;5;33mConv2D\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m6\u001b[0m, \u001b[38;5;34m6\u001b[0m, \u001b[38;5;34m384\u001b[0m) │ \u001b[38;5;34m885,120\u001b[0m │\n", + "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", + "│ conv2d_3 (\u001b[38;5;33mConv2D\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m6\u001b[0m, \u001b[38;5;34m6\u001b[0m, \u001b[38;5;34m384\u001b[0m) │ \u001b[38;5;34m1,327,488\u001b[0m │\n", + "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", + "│ conv2d_4 (\u001b[38;5;33mConv2D\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m6\u001b[0m, \u001b[38;5;34m6\u001b[0m, \u001b[38;5;34m256\u001b[0m) │ \u001b[38;5;34m884,992\u001b[0m │\n", + "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", + "│ max_pooling2d_2 (\u001b[38;5;33mMaxPooling2D\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m2\u001b[0m, \u001b[38;5;34m2\u001b[0m, \u001b[38;5;34m256\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │\n", + "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", + "│ flatten (\u001b[38;5;33mFlatten\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m1024\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │\n", + "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", + "│ dense (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m4096\u001b[0m) │ \u001b[38;5;34m4,198,400\u001b[0m │\n", + "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", + "│ dropout (\u001b[38;5;33mDropout\u001b[0m) │ 
(\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m4096\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │\n", + "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", + "│ dense_1 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m4096\u001b[0m) │ \u001b[38;5;34m16,781,312\u001b[0m │\n", + "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", + "│ dropout_1 (\u001b[38;5;33mDropout\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m4096\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │\n", + "├─────────────────────────────────┼────────────────────────┼───────────────┤\n", + "│ dense_2 (\u001b[38;5;33mDense\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m2\u001b[0m) │ \u001b[38;5;34m8,194\u001b[0m │\n", + "└─────────────────────────────────┴────────────────────────┴───────────────┘\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
 Total params: 24,735,108 (94.36 MB)\n",
+       "
\n" + ], + "text/plain": [ + "\u001b[1m Total params: \u001b[0m\u001b[38;5;34m24,735,108\u001b[0m (94.36 MB)\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
 Trainable params: 24,735,106 (94.36 MB)\n",
+       "
\n" + ], + "text/plain": [ + "\u001b[1m Trainable params: \u001b[0m\u001b[38;5;34m24,735,106\u001b[0m (94.36 MB)\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
 Non-trainable params: 0 (0.00 B)\n",
+       "
\n" + ], + "text/plain": [ + "\u001b[1m Non-trainable params: \u001b[0m\u001b[38;5;34m0\u001b[0m (0.00 B)\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
 Optimizer params: 2 (12.00 B)\n",
+       "
\n" + ], + "text/plain": [ + "\u001b[1m Optimizer params: \u001b[0m\u001b[38;5;34m2\u001b[0m (12.00 B)\n" + ] + }, + "metadata": {}, + "output_type": "display_data" } ], "source": [ "# load trained keras model\n", - "model = keras.models.load_model('../data/keras_model.h5')\n", + "model = keras.models.load_model(cons.keras_model_pickle_fpath)\n", "# print model summary\n", "model.summary()" ] @@ -212,17 +311,11 @@ "source": [ "![Predicted Images](../report/keras/pred_images.jpg)" ] - }, - { - "cell_type": "markdown", - "id": "ee25817b", - "metadata": {}, - "source": [] } ], "metadata": { "kernelspec": { - "display_name": "Python 3 (ipykernel)", + "display_name": "catclass", "language": "python", "name": "python3" }, @@ -236,7 +329,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.11.3" + "version": "3.12.8" } }, "nbformat": 4, diff --git a/report/torch_analysis_results.ipynb b/report/torch_analysis_results.ipynb index 7c1cf28..1432277 100644 --- a/report/torch_analysis_results.ipynb +++ b/report/torch_analysis_results.ipynb @@ -1,52 +1,32 @@ { "cells": [ - { - "cell_type": "raw", - "metadata": {}, - "source": [ - "---\n", - "title: Torch Analysis Results\n", - "format:\n", - " html:\n", - " toc: true\n", - " toc-location: left\n", - " toc-depth: 2\n", - " toc-title: Contents\n", - " code-fold: false\n", - " echo: false\n", - "---" - ], - "id": "12b5a990" - }, { "cell_type": "markdown", + "id": "3dd7e904", "metadata": {}, "source": [ "# Cats vs Dogs Image Classification\n" - ], - "id": "3dd7e904" + ] }, { "cell_type": "code", + "execution_count": 3, + "id": "9e032473", "metadata": {}, + "outputs": [], "source": [ "import sys\n", - "import re\n", - "import os\n", "\n", - "root_dir_re_match = re.findall(string=os.getcwd(), pattern=\"^.+CatClassifier\")[0]\n", - "sys.path.append(os.path.join(root_dir_re_match, \"model\"))\n", + "sys.path.append(\"../model\")\n", "import cons\n", "\n", "import torch\n", "from model.torch.VGG16_pretrained import VGG16_pretrained" - ], - "id": "9e032473", - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", + "id": "87a94848", "metadata": {}, "source": [ "This project aims to create a model to classify cat and dog images. The data was sourced from the [dogs-vs-cats](https://www.kaggle.com/competitions/dogs-vs-cats/overview) Kaggle competition, and also from [freeimages.com](https://www.freeimages.com/) using a web scraper. Docker containers were used to deploy the application on an EC2 spot instances in order to scale up hardware and computation power. \n", @@ -66,12 +46,71 @@ "A pre-trained VGG CNN model with 16 layers was trained using the processed images via PyTorch. 
See VGG16 diagram below, as well as torch model summary.\n", "\n", "![AlexNet Architecture](torch/VGG16_architecture.png)\n" - ], - "id": "87a94848" + ] }, { "cell_type": "code", + "execution_count": 4, + "id": "504c7d94", "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "VGG16_pretrained(\n", + " (resnet): VGG(\n", + " (features): Sequential(\n", + " (0): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", + " (1): ReLU(inplace=True)\n", + " (2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", + " (3): ReLU(inplace=True)\n", + " (4): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)\n", + " (5): Conv2d(64, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", + " (6): ReLU(inplace=True)\n", + " (7): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", + " (8): ReLU(inplace=True)\n", + " (9): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)\n", + " (10): Conv2d(128, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", + " (11): ReLU(inplace=True)\n", + " (12): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", + " (13): ReLU(inplace=True)\n", + " (14): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", + " (15): ReLU(inplace=True)\n", + " (16): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)\n", + " (17): Conv2d(256, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", + " (18): ReLU(inplace=True)\n", + " (19): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", + " (20): ReLU(inplace=True)\n", + " (21): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", + " (22): ReLU(inplace=True)\n", + " (23): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)\n", + " (24): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", + " (25): ReLU(inplace=True)\n", + " (26): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", + " (27): ReLU(inplace=True)\n", + " (28): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n", + " (29): ReLU(inplace=True)\n", + " (30): MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)\n", + " )\n", + " (avgpool): AdaptiveAvgPool2d(output_size=(7, 7))\n", + " (classifier): Sequential(\n", + " (0): Linear(in_features=25088, out_features=4096, bias=True)\n", + " (1): ReLU(inplace=True)\n", + " (2): Dropout(p=0.5, inplace=False)\n", + " (3): Linear(in_features=4096, out_features=4096, bias=True)\n", + " (4): ReLU(inplace=True)\n", + " (5): Dropout(p=0.5, inplace=False)\n", + " (6): Linear(in_features=4096, out_features=1000, bias=True)\n", + " )\n", + " )\n", + " (classifier): Sequential(\n", + " (0): Linear(in_features=1000, out_features=2, bias=True)\n", + " )\n", + ")\n" + ] + } + ], "source": [ "# device configuration\n", "device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')\n", @@ -80,13 +119,11 @@ "model.load(input_fpath=cons.torch_model_pt_fpath)\n", "# print model summary\n", "print(model)" - ], - "id": "504c7d94", - "execution_count": null, - "outputs": [] + ] }, { "cell_type": "markdown", + "id": "51886ec7", "metadata": {}, "source": [ "## Model Performance\n", @@ -101,18 +138,29 @@ "\n", "The model predictions were made for the Kaggle test set, see below example model predictions.\n", "\n", - "![Predicted Images](report/torch/pred_images.jpg)" - ], - "id": "51886ec7" + 
"![Predicted Images](torch/pred_images.jpg)" + ] } ], "metadata": { "kernelspec": { - "name": "python3", + "display_name": "catclass", "language": "python", - "display_name": "Python 3 (ipykernel)" + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.12.8" } }, "nbformat": 4, "nbformat_minor": 5 -} \ No newline at end of file +} From ac6bff62df9f26e6650126827a7c303db06d9304 Mon Sep 17 00:00:00 2001 From: Oisin Date: Sun, 9 Feb 2025 14:37:25 +0000 Subject: [PATCH 05/10] Removed repo contents and updated analysis notebook location --- README.md | 26 +------------------------- 1 file changed, 1 insertion(+), 25 deletions(-) diff --git a/README.md b/README.md index ec0df2c..0895493 100644 --- a/README.md +++ b/README.md @@ -4,34 +4,10 @@ This git repository contains code and configurations for implementing a Convolutional Neural Network to classify images containing cats or dogs. The data was sourced from the [dogs-vs-cats](https://www.kaggle.com/competitions/dogs-vs-cats/overview) Kaggle competition, and also from [freeimages.com](https://www.freeimages.com/) using a web scraper. Docker containers were used to deploy the application on an EC2 spot instances in order to scale up hardware and computation power. -## Repo Contents - -* The __aws__ subdirectory contains batch and shell scripts for configuring ec2 spot instances and the deploying docker container remotely. -* The __conda__ subdirectory contains batch and shell scripts for creating a local conda environment for the project. -* The __data_prep__ subdirectory contains python utility scripts to data cleansing and processing for modelling. -* The __kaggle__ subdirectory contains python scripts for downloading and unzipping competition data from Kaggle. -* The __model__ subdirectory contains python scripts for initiating and training CNN models. -* The __ref__ subdirectory contains previous analysis and kernals on dogs vs cats classification from Kaggle community members. -* The __report__ subdirectory contains reportable images and plots generated by the application. -* The __webscrapers__ subdirectory contains webscraping tools for downloading cats and dogs images from [freeimages.com](https://www.freeimages.com/). - -## Application Scripts - -The main dog and cat image classification application is contained within the root scripts: - -* The __01_prg_kaggle_data.py__ script downloads / unzips the cat vs dogs competition data. -* The __02_prg_scrape_imgs.py__ script scrapes additional cat and dog images from [freeimages.com](https://www.freeimages.com/). -* The __03_prg_keras_model.py__ script trains, fits and makes image predictions of the cat and dog images using a CNN model. -* The __analysis_results.ipynb__ file contains a high level summary aof the analysis results. -* The __cons.py__ script contains programme constants and configurations. -* The __Dockerfile__ builds the application container for deployment on ec2. -* The __exeDocker.bat__ executes the Docker build process locally on windows. -* The __requirements.txt__ file contains the python package dependencies for the application. - ## Analysis Results See the analysis results notebook for a summary of the project; including image processing, CNN architecture and model performance. 
-* https://nbviewer.org/github/oislen/CatClassifier/blob/main/notebooks/torch_analysis_results.ipynb
+* https://nbviewer.org/github/oislen/CatClassifier/blob/main/report/torch_analysis_results.ipynb

 ## Docker Container

From bb73f0e57d94379dd0487dcb37a7c3bace61bded Mon Sep 17 00:00:00 2001
From: Oisin
Date: Mon, 10 Feb 2025 18:11:52 +0000
Subject: [PATCH 06/10] Added draw.io workflow diagram of the data collection and modelling process

---
 doc/catclassifier.drawio | 94 +++++++++++++++++++++++++++++++++++++++
 doc/catclassifier.jpg    | Bin 0 -> 53815 bytes
 2 files changed, 94 insertions(+)
 create mode 100644 doc/catclassifier.drawio
 create mode 100644 doc/catclassifier.jpg

diff --git a/doc/catclassifier.drawio b/doc/catclassifier.drawio
new file mode 100644
index 0000000..85f692f
--- /dev/null
+++ b/doc/catclassifier.drawio
@@ -0,0 +1,94 @@
+[94 lines of draw.io XML for the data collection and modelling workflow diagram; markup not recoverable from this extract]

diff --git a/doc/catclassifier.jpg b/doc/catclassifier.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..95f2f8687b327c7e55986d56116f48facf52bba7
GIT binary patch
literal 53815
[base85-encoded binary image data omitted]
z$TSkd5=)4V>wv+5D5ft}Pke8XhkS3GD8;EdNViQkrC1!b^S1%6)oHch7i;pq1DIK4 zZpEC+-0dY-FzZilmNN_s1^ap@x!RA8++^=^_CSzmZL}>sQNeA#FgXUCH}?xL8vpDAKN@f%J&EelSDa}ZwQA>K#gQ(*Sgf_{qQ>8{3%4wcXKgY5 zM%;?N!_gYyQ=JhLQ7hDK*E6qr@Zht27-LPLPM!;Gp;~L3#$xbwChR&X_erg{j zk3b27UtX+{0#8CFV(md!Nrqvr%QI)eCQ0F|p>1>SE)m9MOZ=2v*6`P~2&0X~1IE&2 zTMTo30EKSIDrz_;+|=8JUbnu6{*HUC^Z)}()O@3II7{gybN7!xg9@!~Bi93GVj5d0 z=jY81)@oA36l=nCv1^lRq_?J2O18S@N^kQ+Y(Xp_=EK6z_FNPHkn^N%@j zB+bat0IKaf&>+t!`~n@?lG4H{qDk3#fk{(Ti@+Jsccs??$-`VCfQN^ zLSkiA2AKj=x2W&0LJFeU3K6(yMs)G*4y>=+^Jn&VemSL(;u5hzXax)cBEhNqWf zhkN&o6&Rd#57D^0og=?g>Z_ab&`XMSG{2Q+R>H5RiEBP^Pp{HkF;H|wP|mv`n(ll* zau6T-^x|arcL}Ylx}8^+^q5s5ZRYeD(_>(7q6GiMzm6+NaNYaYIyCYBLme7EF%WHN z&C~BR67hY~e2@QLp%XU;&lQD}+X=F*F&NQ%xyYV&b#eQfiU48WNccE;=Y`~Mz>1TM ze+0Ot%dY93P_-D(*D?wE+`MWRVPKsO4@S=j`=?AV5|2t*Nyc>$5HKOU$DVkDVgEwE z@G0wSPG_|H+-{u`C88&u;2PuOh<*4c<)pBjNMAiJSWjUlTI57< zeSQe4_vzA*K&eSZ6H-pi5yX3e8IVm&l44s-8cRH1#7ii+rM z+##(hn|&OvXUr4FNh}Mdu#`Xq*7T%)N>|~biwdRKEDeRYEl6X*~1@yu0*Rj&jM@w-KvU$n` zrqNLQt?I!HQBwKqMzw1sx}V9_3TH-#wpxg@ULFMz363P_p#Tq5CxujV04l(J1h`T> z>`uaEiMl)fuGYt*dPk{uy1J0Ov*T^eH5D$p3f^cf-0}$>iz?i$4{$@dFO{B0WduEK z>k4CvlwkUmEm168g%Zu}$;raniDi$GQc8;c@DQ}{hv9OJZ{EpvdGDA|8=wWsY%_w* z^_p=h-*w7&zd8M$`C;mCB=e$|4>vHlU3Q6%1+`2B2hEs8-lwBrAp6`gi(YoR`Uz54 z2BC6U@p_SL4TR9t9tVlIRPUVDGFi%JhUp#dQmf<}t#nRL{;_e*uB*oss;>ar>{I59 zS?wsNOT5G1zH_CKq^Bj+Tm~$ljl$rxb~jq_D}p>n$7NhMaQblJ)^RH zqTDHH5sna1T0Fn{6JGjGgt`W3BTTGnn{}&&t@%jLyy{+hx2K;MW>w3Oh@P=j%4DgTTiss2v~Yr zPg$b}zbY)puVfhG8yD;QzPz#WFGu2Yc`FAsZ=b%07n2qh7d?d@)2LIrX%`|_51&p7 z2k4+~7OH(tZ_c3}7^tVXBk*)G4LI2wGramv>Ff|Bh`PM2V<;)>nc@`}cna!pPI{25 z6|(FKGXnPP!-RF^m{z+r9wo2VGBdehr-S0(>DO0tX}s4l88M1U4!<5ky3%oqqX@jV z%U2N0ZMBi!U;1L&_`DWDyVBxKHLtDB)DhPTZH4_Xph%*|x?)bh)FZ_)fv~X>_*82e zb6&Fjx*2Cf*PenH<`Q$;qf3Y5^_>H6euX>`WbTi3KagM<_5_UQGM@LY{gBEzds9blr{M|w!{hDLvf&u!{<^jwn)NN&u?uNg}2XYH^ zt7S2fEW?DoExm7{z}!$>*Oj*v<nT|n^ECZz!-#bT;cA>DA z;XN1S1+yOkbI+AZGV}YT3(UKV+_F@aJ)k?wFI?mR!ih4osdi{Auw3m61piAy+f#M_ zXTTI-^FTX{^5v?nY(+Ek;>s0g`4wsg$Ya{E(cX6v)cd;K|+(lEu0_ZFuxXH9{mDSSUnrDw|@0IFzziyo> zbjBukLv`+Of_EBwBB9!3$`VrVZ6X~S!`Ehc{exZKA-8=B_jE*E)W;-O0J#VQJI^f= zPdKN3XD`fhd-tqr+MG_4f-u^Yjf|Dc;gMVo7p=Y5_gYj`u$i)CE7@!Fvzo<8xJ%Qc zn?>sdfBvaGf7Fk>RRc|}O!c1W%nfn2jmr<7W@(I-3atDQrWVDr50?IP;E>@GP#l8^ zCyhT{qWwTAxKfVOJ1v5wp3bgoZIGLUN1NU+%kGHid)te^m4C?qqZ=X)0!%~-u*Apa zck}vr+po{bx-?Zga$2=1RI&9@9LkDNuRmC|c%@DFNE}d?bqLGCcR8aVR zv?gW%zk;)d_p(kU336LlD{iM=i6X0wE|ei+o>_<=;uhkLiSzu<`7YF81eZ8BEuCiL zv8qN%Y4bXZa;Xji>yHJ2ckv}*SJkV&{V;jkQ5VwFI3)9}-O z{JMXiJx2Yrrlk(Ey-=(up1BfHO3_wHgd9yg#>yzI4V#4H4~7qT41Q>5?Il}xeY|^k zAh(i*3!tLv=>6x`{`K`;I2iqMVd65(5Njr8u5ZZXu?h~hxH4E}+=N_y`7V`UTD(RbwXm?=1zlm7?9h(T=~w#f?`?uopVX*>hBz5diz|Mmm5vvU`#$LC-s|B z$b{{D@%s8&$&l^F+o37!L-gPDx!qOc*`uh1#jU}RvWefU9L@n(73Hp;54u(DRjXz^ z%;bVuu=vQ7uVgKUfL1ELi)mL!Ig}9Z3F1DFKn`7~FI;o1ZZGf__#<#XNfjW)7zOg9 z%kDAlFeB|lw47sla_BTFtv<-EEbOlyeb}*0bw}U_d7ZLsR?q=&>eOj<+EZbFd03tIHswZQp_TdVM(~jA0Tj9SbhDztkw_NU7lLj)Ca>=JLkG;UXaMg zsD7V)!Pm?M9o4ZW-SLgXvv=z4^Nw^fZN!R9LO-@@=eMT9trQJeuLa*p^cE+pZvRc? 
zo?I$kyCow;lo7Fy8ErQVz8|-oG$mEK(d0FggYOdZ{UxvP zZlfF^2ptq7qERt+Mnj#TTUgWjrNEi^vLwAQQKU90qa1>&N`|80L+AXP??rz~9>sQky zJrR9R3?w*cQgue^-)S(Q`)*0mCMKrRw_ZG0F?_cXliD~nb*V_Oy&(lyHf;BKOPluB zhKFi&%+xgKDOL;O>0DnbAi*WBTspKvJ^nB3eyCJ8O*=1~S<7HDfaa>U&s=M$k>s??@ZRZ~EMB%W)IL4=SejewEy9_S zENvEid-gsvvrM|GlUB4>AxzzoOOQ6eH6K}DKsJ7RrCa~-D%YT2ogBhoWy#F3j0Iw?4L5MPP-K~j}Dv~Ho&ecqjU zOEIymx(m2y%p~QaK8INlf4Z^lb2vI?qT1n{YpxpBHC$wt90q+Vo!nH)4^5TEwL4jL z0869ijY|UFl^T{Cu=M7vzFW2bDZ8yxMvOJ&BAfX}s;KG;3^c2jxIvehq^qGZ=y&=E z!FX-L2BWHGo25d>t8vwKxG-MOqSK-vfT%J(bJiA&Pme>QmXs~%yF{20Q<&maWycci zCJ*_;!|f8pr|0?VvS^;u^GqRF)# z64A{E$`FTyg=3P41P|$U%3n}X!BYg0GS@-zFBDEA)Io{87wvoz!=mlS@unzGfN8JG zn6To7VpwrkzJkiECx7TX=1vIq;~FnF_XMah+#Kp-)|(KL$NB4Rp1h^Ob^}8yDq7X& zi%MRP`NREbgYc2XC;DZ~FD=H9sM^$cPbm)GMWF4qfY8atx|bp)qp{VGIOH5&c%s>8 zrD!i^2j9o@Om_uk#jQ>S%hc~8ME}xV&w}knZrs)vtQ!rS6^9oKkM@}V(dD+Om@!&y zYSHyI*+=YUjaUt;2}e%|vH%od-%Y~Q5s#ibYQy6JDTK2QbL0C@OED>uBlTm6X2FHP z)Q#|_Qs%(g01yDx7VWXFHfUPyCdAy`Vkj+Z`LICX?P{L->jqqnTOHSd-HUveH7~+N zOZ|~Z#tkmn0*ZOghv62L_I2TjT7SoaqN)NQxntQ!N2C3=N}BWF75!dWN&e9%E}x>^ z)$A(dMu41Mrfn)0bosWyDJ5MN zLj`6F(jg04#ohHXwcTu8T_LA~alzuPygy%*E z;@F61_qy>ryTSdGTkdrTlVY2C8ji0L?gDXG3|vb>0>T`y8yL~?BHU3pD8N#wX*kI8 zY;y4Rip5rAiOV%0e`E;9mw+g=S0OvRA=9P@2`~n}=^ep#Og`~x( z%&iKb?z0~_LYE4ahQ=jWS9H9w2a@v8X|qURY+tp6=2Vx7`W^pZ)3k{NY}Fo;dC3wX zP?@J27D~H|pEV2YXyi9fWb(^gS(NA62up2C^R}ng6T{mtM@m|dU5N4)RiKmFY*kS+%}b}HN{KY zb)!lt5^UoB)^{eI@kJU6^NCL{uu~;?d9=?^80H} z1>XL+fe*_6T>IPZON!|J4(P{F*WXl{H`_Gq{(t*O?jjaT7|ElhWn6^^-+DijO|V{? z!W`yeekD9f-Ok9?GG*d@&Rjo$msA5KO zH(@$F)M93W)RU?aXsY>sea&aDP{4+98UFyTY1HjEh+b@Y|JkkRo_w2tTacCbL@_ZY zd5E$UwiLy=ng7J}*gP6ih%`OEIvLsZ#jvHNWyHVQ4b_4ipXGc=wUvaUhjpGd(_y(I z&}!dl?`tju_N}GiN~JSjQQa2!pQqUEH?pH8#BrevBD2(uYtaF%w?2}E(D{M|MvK&r z$VL@(p+{^jAoz<>NS=U7y1QZmZtS#7ez#`h4iw+KQ{rhZ0smSSv3)`9+#d5q$EEo& zBy1Ywmx>L()vC|n+&Yr_%vFw_kqlq0GB+im78#&rgSfzwOnnpz~^#`***J8Yw98i ze>tqfoe(yN)_RFv2Ldf`MSo+1>(Q5ZbXvaNQ)Lg=K5WiGjUy0=bi;ykLlLB08#KNT{e&4?ugs_uylrW5)BT2BXC$EN|hidtdnd{QSWP z?V3(wl;{DIJhJei*t&!x#88g3*zEGTJ_HB2okZPED;Q-^1L(k8OhE@Vx&&0CkDj^v zoy8e$O@_38>8h9kYqDHVB0D$d6hd%B|c?*~E zLKPeh8DBD4fdy<974&@#rIV+=_pArOolOn$qot*NaNqA=2N?cy>A&ZKOhxs3(T?4* z5R&(*%Es~2;%r&jr&MKjyP-r!8Us!GYnlHFxb0wZb~g2I?uAYO089&%)T;jP;5g&z zgxrblrB?x$8Cqnqna>c{R}{``jWb%5Ziyn)eZUu)oKE)|W*x*lIvREQolf)oP?e`M z%`X$Q{c1aCFuh_bb~O*Y7npg?@(MiXw$~aucgi|0I=O!o@=m>ULt7I+yyE1-)9dtz zuCKV?QXTBZn3qbIG=iMWl>&e`tvcZ9R@@S8+EDw-d zPbZ3GpyX!M1D_hSsml`I+w;fau}dx-&sC0eKHnjH5^p$4d5=4{-O;ELx&brYFRDNa zgwIsV-7T0v|IE+u=+^>u%62cJJ$>)-w2XfBM0qRdINA>PHIE3v8W=iInTRKqdT71Z`GsDi0w%8FNgX475`^m#0S5?AKmNEz{}G zvA^!=z0z6(SnA)4YDbVHxY&#(;(C}Ez1^`ehy0ioS+!!X=%$8454aeIux|laoH5s( z+u};5L4grxIYoiJred{!{E?EN+qMWFxj$!ExuRdpQe_WatgdAD+u|;BRPP2IKxn;1 zLwFo2r8D~+yTwXK7?v)v^x?eI((#n*imkwhr0KY z&(mQ{UERF1GUHkIlA&E$`|#Svv0x-0Yzx|H*w(0!E%SeY`{q3r*HctRtSnQ@mU~4IcYa3jT;4#( z-kMa>{=vl9IpY7QcuISYd+zmWfvslyY+%=Pli`f*Uv>V>eh=eY0 z7OC5cfS9@C6u>mOE5|&*ff6-P2^b1_ZW|g<6gOFnX?wl@AtuJ#A>gg$J29)NO`+XV z7w#VbOEoiQgN8D+4w9RtUz$RP1!zHM+;4G;vi3%-1?30Tj`f-9IyN;GHQy1nO_BY@ zf4e<$L_FuCvXK9tK(_!_V@Ul5MgVFMf^KTdBm>aHJ?*I*zv$)2zRbSa#@=oWQ7g5h z+^hT?dDVci6zNFfy&KCn#xE&4*=^6%^W@sExg;sQFB@y7Hq@K2A1hf9=g>LynqU$l z53xn_P4m(X6=dj(FsF%M**`>s2 zZH!g=t&Rkswa@+B$)uY0y0qtRwhZlYDM9|dDq)|{0D!@8`=&`&iaI>*jc`exu}o0% zt@i!`M5dhG?Ocu#ZQHV7bc(iYHEOBi2;l2}HHDSEhd%Z|^gDAG-bu9-{K_5^6K!G=RGf>8i&JdMIK{z%F_B)fg4E+R zlJq$ylwy#k=zb%Z)`ql&VE|MrV_<6iKX zAy=^MV|>;s-Qty_P0{LYR{#l`tn@1aAIqklBMM^emf_S&aGLsDJms68jL$)=G~Xpw z%o!-D^}}8yf9;7E|KnHJHaRnvqZ%9|t2T^&s6ko|Oia1U#+8p_zA}L>ow?EKyfOVt zq=Es8I6nPD`Jv3wRP4)1=Na=A|EkHFpKK;yoop_nQIXElxf4I+;ekmu$QkB0*_%9R 
z4$09Wf&AkekL>B+O#AD&lM%O`)9HuB2YPU=E9zjBGhMBNG_1hDYwYafN{_qeWb1c} zZS_}Z_brUd9v_5_UKrLmt-k;95NPWXn%zb>4dE9MCSN&ei-s84+7%6nDY5@arSj)* zpMTs{v-bcfPYXY-gU&zz{GuVf=MQLJJ$+>P!Atp_aoTpRHRHjrra$B1Bau$O?)t>) z7VmO*NFIOj-Wxnx()3J^86Of~HTn@p_a-QBB2A=_T~J5vqw1B-v~bldZr*AO2>B70 zYcT1PJCx4OX78D2v@9LbToD5!BKvv+x;j3{S3M^x%+k+Jx*+(-(E0-np{a^l#6Zk& z*4l&b=f1ZZhHc7Z3z4G1G*gY(dbNe~3_PPbTR#)KZj-aZqt6n-H#(o4%i;`s9WY~f zu3TH_XpB|rjOkgalTsfTl8S|2JpyADxz#+4o1QaBI7@F&?EgO6qMq@7+<4d)hnj)= z1K*c-7BlAZjO{b-f`WRAxSqx5$meWT0;rE>Ua-AU=-c0sZF4?!CW2$eMkVH&W+n42 z-PK`O$zjA%x;GN~GK4!;%M6X?lHj;cu@LyVz3m+cnUFrbn}L;_`NLE;=5ZP)lqFOUBBt6LR%}?= z9EwA*wz#`ni#vtl)|;O9zUw>ZyX*b#d)K;Wee14upFfhB$t06!X7)3CX7By`O*3AR ze6~8QQD#5vcp{?3VLm;jjNFOuYbYHfOCY468a6nIYidP(Xe67Fw$C-sWCVTvr$+ss z-v2fkdZ48n^|_bZHWfM$0IV;KH3XV2tg8rM#;)+WrPr%^w|2PDHs5OvHJ=rTVoo;4 zBKWBhkgW-aBXqJOj;rL9;$bA^UcvXV%(I3aU1`q3Tivs^Q@Yh@Bl6<+U`o?0;a@~U zVD<`H3(^A9ZdWAkvK{rNzsIZ{f6B&hC~RZflM0!AX!<5tIxhmpV%|GRk=mUElZlveR-I{N0|f!BA!#!1&e#`#uty(Gw!ZfqYT#Pi_G{ie?Avg0)>9Mzzf%Fjx? z*vWwEo&s@93o#-1^O=f6MM)#`4~0dAm6`K(&^MEBMOdfVuouL1-y$D?#U~muPx{e%pb2TcV0hqXgBN=KxpKLynhn=&~+OsqgjvbGoZGo za(E|%;3wxJauE1%t8MjOLyK14p`$7os@b+Gn!u@1cxl)Xi=0%W+LbGfn+JcfDNdMz zQbD%k)WhA)kw7V-oM>L#-1ESwO}-t4$XvpeSgOWa=Yph>->WZ*dS5@ptO%}fW|lK( zc%GcQsD8tFy8E5VK!o z%O?d-_S<}@S+o5iB968#=SZtUyb!4gL z=c@pM=w-{mXykimakBVFk*3&uuSj#vbHn03+1Ifuyp}2F>`;eXAP@Mo(uVf{;p1Qv zRz;ECjzMig45TK0J@Nl0KAiuwnEWNHs(1Hl)#&OYZwAhx#y{--^Phl*o`vc=uDD1M zu|1M=Zk&<>U--75<-f2?CIldSxLrnZhc09#;!Hv_T*IJJwj=NzSq`(~TZ^1dDb1<4 zigPW8=fjFF`8X?!ri@exFYmtilGD~fINzyE|qV(6(1v{n~tu2WcsDP~iFU zb`?>Sorl%-c~2+uI?B{_!S0NHi>-Z4DDD*Eh#Zwa*yZ?0$aA%PurU>l>B!^moR^JO zn;7fBL0IDRJdwX(VW5uzkQ$KgW`iGY`UeXw4!;A6q@Y>l%ZSLpK6T0A_tfV;JnN(k zj9c%gvga|!ahdKPucEY-syZW5T{ax$aJ01oKM?BdGiF~Gk9ou+DxETnnmzp4*QdZO zU3wQOQe&aYtloZ6Vihf!#S(!H!*lnsx^&?PPK7X|Gs=anzHFsYk*v1JJG{>_vt$4bkmzpzCf;TdJFW=}9*U;kt$X~;0%xk9b ztb^$1U|lQ&b^UxWHN?D%EZc4F`H!XX_lpHb5vRiR_@{7bvrn$oY%}lUKMr3`PwmWw zUn-9or!-{z037eOT{GL1_8F<1y_$OV_|!Z9`{;jJg)sR4;Tkvk8nx2w{QBv$u^FE9 zhL3`s5sx=`&kQPAIvLJ9sh-~Wek4H|hEImH!C>-11=X=-4=e(d+xO23z1@xn+Uh)R zp*$5Nrvxb&vuAsnd1j+_K77b-LztMo9Fu`!3&x2ksaYb>7##$t9H+;IAZ3j9AG{){ zD$?Q$wi#!o#5v6M;6|NWSc^Gw?J)_rZ*>Z|SYz4ApYNRu6uS%M@?(R{5*@CkUz=@YuCEYz+UWFTl_ zbdWHVhs4a9EYp!wcvZ!Mb5)CntH)(>l5rdD(7%u6*W_<+1||1+PIg|WUTf%v zAO7WeyY@0;D7yI^2W3KVUUB>r;8gYr@Y2zo@(MD|cEaE1qD0$b*go5Db3n1LLay`mI_(w;FUc% zq(h`=qLd`64xE*NZyi7c?wk;L#=R%eQ5o>4kY?{QRLUPWpTXZsvrgnY}w zWb85M+HS}&6B6=*9-PUxRz@LHwnH@rJ1-2Om=}y;K!!i?dGaA7Z;WV%(VjlnEFaJ2 z^XV=w26us()?0)uOSw*KUpCF!KdLdWOb6)cLiBdl|cm2A1jkAk@4)c>} zhR69V`mjj)_TJ;Dw6QJ-xVL&)zFjFdWhr_S+@kqtzbGiXpR~yf9cS6XYPr~){B*fP zAk7&9p`78B?n3mXzW6?Qu9b8r%ErOfRZwJ{SzDcdE9EQ@&&iBuYrE|&15bE;I1kJY zWW_RSJjca8$Sc8^1!eT6nL=jBa%mzAw*_k>4OeUBRRX08Q@6xAFSUtXv6H;G37?Mv;U3d>K=E5p?| zcx~zq&n#TsbR3tcFeDN=|}Xl9V@SDF-uc3V)%mwot$gXOI2|1C?-UuEWBD&theWD2BRBZP|3%Sr0HS?%m><6 zpT|Y7J`Q~`yOeMT8Bdb1oTzHC-b`aD|M|FgusEWTlsG1NFvvzA5xz;n&Fp1fposi1 zy22@I*hW&As^)E=k;PX#T+QH|h^JI5TaNT0it*-KfU{IV!{nr+afS;@{&y*3S>5=H zSci{I_tohAlmW|Tozqs@ND86Z2Nxzzu+soZy0JTFL-Q+l!T=jo1q3zD;<}bVF z8p$b&{*L~_{gI(;REg5{X(vfuNWQ;2jh~t&-lL(rNhoKW52aQ-7PH_}6Cs zb*^Ni_;9Pz|F(JSrKx#0NF`S9Qa_16*0nimb#BYNI3X=T1~c~}Ha3=(NlwO;b6!93 zIKbKI$Y|p-Sh-6kNjSV~g(sO7&t@Ui+n)TjGK7d+`4VaF$oYB1Pa$iJjg^8xoM4>K zdYsr+E1ZS>pWrk{d=%#e-+4BrRJW8Qh9~`Ugl0nd!1p1P8gbx=i0IacSH98op>~37 z>^luToy=6uRm8~3WCtjaGf2)*q0<9jPmU&OU9njd0X9oaBG~lkz_ae@W>+}R-*ji` ztUXQnkk@s0qE297-9b0W$T@VANvvHxaIT}WCtBxX967d@75>{XL0ia zNPoNFG5L7Ds9T-2WLzb~fwjY^UEX0TK8`VFP?bF(h_?>HeA!T$)@JnsZN~SJilsFv zEwx8kou~w$E@iuK#(y-e(d$^>peWZy=N`e=Ff>^d8{~#gYxn05dT*=|?w)AbCms@} 
zPZ(mv2=yul9N$&W`~L()OJ3a1xh+9*_xQ_C0O|V+G%O+E_-#N=-^(^VF5ja&LvTW3}{rGbL*>f>Lw|HR?-?sO49S#G{IW?GveH~PN z!O1vW$r9Zf$z*PE5bXrWt1MVtP&wdVXR_r#lgH=D`Bpz$Ii=+F!5J<^;Xg~*mjI!xm!Yw1+WDT&EmFjBB8I9IGMDGR+_ z&m76Y1~ZFi`lW`IVePPiKLG;Es12uaRSXiMU!Bds|8esnXZh-UoCD*41u^o45P;gi zxmsVnq?kqO-YA;om@)Qcaz?{*;RgcRDJGL3_@1k8Nd|4PqrWSdnV$%CreJ+R9WeP*f_* zX)wFSB&s(x#;l~Juli(A@Zi{JMUTEtp>LNFG2(#z2>c~!S!bwMDhO}KaX6%_cGgY5 z1*H2vvkTzs)4U*O+3#W|3b>0`PCP366OgjMGvZf6zh?B&Rx`uEv!d<=@W>BkF=Rp! zaPLD6U-&LXXvhPBeX)v-kf82U51}K)ZsMYhs$27U1;f+014mPrBD>nF`CWmZfS19# zojZzq@KGhU$OE6dWdV-$Yiw4vB8vK|`OCA6p|Cu^Jy6*Ty4MB3*5F}7!nWIA z0|4pxHQ>)q4aW@y6?V#5=L{-H%Cex@ch#jcz+^g7vR3n~$b<$93<%I@)#5&f$I8p0MCRc6h7FNwz%5Z& zl}Lt09&P1od_#*!JM{SMNIxSATh|HIsHR~|&bcDZMCd6FR^lFjDH?K2z@gC4NRzc* zME7$5@qhaQ@qgWc=(l2=N7^p-zzDsbGx^xdHx9bQZfB!`M6VlY*evu}WKvLsAV zg{(uFRShRAU)^F>K7uDo7))ztc4l>NKyJH7=}eA-w_BNHDYp_X@;aoN#2VvST<(tBnFdzMpTwFTHvnqHK|a+bBWy2;i1jCGvAAO~Q6&fE*nSKz0AECECu9ol?_ zDuh8k`7f(QDrPKqz)Nj<5tsLni=Dj#^9P_p^Xu1MXZks~T_mk7DjKR5r2p1K^ z6ON;rp*eU7!#-~Z)N;WE7)^22_l0*cJt2BDRTx-ob!Y%!lkvzOy3@SZ5H&+RGp;2c zIW>|gY3K^9PHQqeK9gA^!eOb5-n-U*y_U4A&_QdsBAZP@QqxioRU?Sy7W)PXjmm!n zz?K!_+Ki|R}cBjQ?%F9rlWA2YM=Ro^xEI=2mij)c4t|z^7wl2&# zDqp~pUMo;$Z`t{FA$g@~H)dP})RNG(8=Ru+5KrD@tb;qs!-?X%`&E4T&wRLT4%Z!v zlZn2Ip6EH>Esv?3XhNvbHJF>rJqwbZ`XRZDw2?@*5)dt%ccklN6khtyz%fC36bmxG z6)g|`6Q{xVC&i5Y>%0vjz{Brh1%`S=#t6A@#EkGB`&+P(EsIS`NmKuVxHbsRb&NOV zo`6-328AZpWx9}xL!SeYWl?{wP27vFPU{0JK#>?UpPWyeV)_ob&E z+q!;4cHd`*CkSd|*9ZCv4F+4$*vz!avX=95hCpcB4(d&3ap4`IDF@}FH?$M2`@~Bn z_5?!XOdQEex5M6O=&b()C^E7!G#PQxF1Wobr5N22=z|A-!v;ngg9bFVz}ej(KZhNw z+E8?M>KD_abTw8M0Dci}PzY&?=lCC+C3`yJHwok7Go>hB1f*s&Ji_zKIpj~AAgEEZ zEyuH`BW-OT+?hoc9%FhRW52O4FV!X$7Y`E&glw^MIJ=K0Asm}Fsq8CL#;fvhL})k{ zVymw*Y(K`pkS-PMNZ+G6n9cfoj%t0kA(4yaF3*t%Jsn}$AEGl|I5CD6*~tV>In53&v1_Y~I|0`TXUtUFSSK-yk*Pr-b9drtd+=e3lcrnMtWBBRd!pK2L6a&PfhwF%%V}Wn$u)Q3c!y{@a4l{*@5?=kmWJod3>!{?Gg3 zG94Lh6;Orac;w^Yhl!p*+-V*wL3L#^IQJ3GnY5Jx%*Mc%OyR)$;kSh7q*W18jskv% zzMkG&ePC_NT_P`8hqZE@TBt%u+*e2{l9BGZ_Pu15Fmh00n9^^W@J{Ujw{A%IouI`p zSpg`qbdEEg0jo&Ad++tdXL9^+!FU#icxmVS22XLy;am1w<)aSdTr21bb=NuM*y-K>a;x>fF)Ob;&rWMEImOjv$;qZv-SI zH4noiQaPWlTeLbsgc2cabE2{1$5XmB7KMi6VJG%2=QF7`@P{JzqdqAyoIrUwVr!r` z=TC%GW32s6eIKO4ClRt^1VYlS$C2G#1I=TGs=Y2Ng$0=mZJMNk7sH`$WJzNUBG_ zjb>k6=>(rkiuTTX-|z;hX5VuHfOb&&@!_`f2a)LP^T5J!A$@+Ybiqiy`7&4acHZHP zNYai_9=MFPFA6gp&k7L~I)o12Gz0vBHcZx4nGJK51?xtoEY@JwN<7^5S16mBU%s7% zK4-8LVK8Mb_;|u=ZDxGpkkOXU^|HW1p_Jjgji!~z@AJv*3a0B^=JH(Fr;SR@g-oGo zktz1R(CvnvlXzdON1)lR+sMe#?$NK%;nCT6Td3TImUj<%KG?Hr_fp_>)HjjsLGXn{~qOn~9o)A0ab71P%6J0c8H5u+Yd z;Fu8wMtab9B62C34-EDKTkNu?jZXnL@TB?vf<{v8{M06OcSQH|M!Ieo{NyDQ^9tH0 z8d9-W^RZL3X?^NF6?VqrU|HE^IjweILLKL);$1N8Ko^R$&-M0wSw{75(anoUb8Y;=BbRs_@3?`bdRKZH`NGRM zSS#k(YZ|XP*TcWSc%PyWdMVWPN40YP#%9d1V^Z@XPr;EtC6aa;Te~pg^UR{~n&CRq zIGFqCFE+j!vK-*X9O{VgTFY)fKJjpd5yzt}Ue9yrpSRsrQ0zA?DEV!Q{<-!L9szyIvk!$_aOI(;vJ%CFIm-{zK`#;%kfgi}_a zU8)c4VQsVvE%kK~rJFiswvoy1Pq}#^T^k=%F<2MPHu!Q`w@SHb`6eb@IdoQ z?&j#9Wd2XdziC&Kb)jm8Blf9|TG9m@_)yk!teShN8Ot-x+3Pg!E?ZpT&W?_8kcCEX zQYZHI2BfbiiJL)t@*J0C`a6=}Chm$=U3>x3*68xZXN5`y^9z#w-t?B(e3Uz;#c|%A zn1;vHMIj_q{2Ap@B3*Ny zBF~}me)}~1)AsjcBi{3?mhHrO212tmwAxVP?<0Za*PQ8nIO^<`I+de4lY4Z{mx3ye zIDSDbC;X1d351}CXpzWE+`jzN$VRB-Y2j?mihjPGuks@Vw<`c0@$HWET z!?JnFfK!L`k&A$_R1s#Cx7tKA_l1jO!oGNkTN_ulZoLt zUG&#w@LTqF&H4DGU)ma7Jelb%ol5i&omIc_*i*p$fwM z*|m24eddYxYZ*Y_je)w}xA(T66jm7C$se$<2=%PFFC<(oOi1U=V4Bk}Ebn|2*MPE2 z6cp?kgbN%BNXQh2ovs26yUjy0VD0HcM`_((6-GG51>ovS&uvM)NH;$Esgy7E2?hI> zc3xB<665PSfj|U=2ech`Waix*Yp&NwcS~Hnt_TwwCg|-_L}8M+1knhr#u&<@D0>{l 
z{Kz6AWh`NQlp0RmCEZ9i?)!T8T-?yX4oq_12KUZ2@lx;RoK_zhe0%Z*^IhWUln}gz856&?b_eAG!J<@cn8iT$`qX25sSZ*|7IdV~}dj zv*#*_L?QuIz~1D@fMJ{$*@S*Whnm^f<0M}GB zJngtz$a*%k;f|iC^>p5o9Zmyp*W});^1|1)0^=MyD+I;)mPora^#OsJNT%XZwU~n# zS!_s+^0jT`lhHQwv&JA`!6<5UYHPdx_m;^DWD$_qw1&c8iM>a4__4L4TfXxwW8Mj0 zQMd`|zy+n+%JGAEzqVW6ov5)f{;sNaHH1WNXG9w(@~ssEGc9cq1JbWgGp$B-y`^&%Hw4l~ z5zG3Im7|qa1bR9{LUQbjS87ow3KDpWFvfFH=3JSaMq>13BY&ypMzbM!h#PRDYwW>a zDCuOuxyBmf*?w3)%eAi6F-e6przfbP_$zDTk2h`wyx`-(m1dm+aivMi`~muA z)IR~1oP93M8pqxvg1CheFZ8+gQ6bggF3rZ4Ywig(YY_H5{h zMwDI3@Yp&Y3y|dj3L1O735=ni^vq5}52i2gDv0+|@@1gm^JSNE+IACPA@nw{@C#>q z(4prVMQ72A{NxO{Z!$mfdH;;j56C_y*HoRi9DW_xuUhpSr#*OdsGVo19ch*xosSKi zMze?yYy*!Z$~Fs)w_s5LtT_{+&4H`0n)W=A7dd;F()<14F!~^BHp|=*ub$TV~W&GMQ z=2&<-zFAv`uWIp>JXZ@(vPs(~;Q6m#TrC;J4aVL)PUp;@tZ4oTP}A|G$)EWNm`>5a zX|1=zC9f@T%mu&H%Ygr>UXD`wC;dYaHF7I;n=|nx%aWQ>OL=gm`I5q&p8ZGeFD>?j z9Ezi$)Z5lw_zB46fL8u{_GKKEajXa0aeVk8_+`^$8}J*#JsSGN!{GAk;o>VnX(yb) zfhdow@cZWQ9~~#Qa|@?-N@{YQMogjosm+6{n~h)4`Oisk(fw!#wt))0?}#Q%V|;0@XiQ!JBd7$RAMfdJ1{?$pnC6BSwXE; zu${k(Y5&cRfkDa)3Vr;p;f*c@$UHtT)Ckn9oK^d0{fQR1zVL_sG*&t<*{uh-xk@MZ z>cR&RP#$e+1>3B`FF+BUV z)njxkY;qusSU-8&P$ko+MQfXNP0NCF_}k>}hnBL8cz%JF`x!DGIC?7KhPjVQ??QrR z9h3sx&<}cS-*!PfnbsIfyR->eT?XjB3;?>P8!*IN5yBW;z za}R;h5;k10oE${rN0>;-+D{~gO4a6t%GOcP zw#x+I5jv;0TKXKG-m~^2NEnEJrJ-(0sTQwFVLGbHE7O@DPpo(WFMm^+H(E*52YBF_;lUfh&5f@Clv9fV77F0a-+zf)MiEo>d8*Ba+@yfWfiGoev51U zRF#mW&D;wiTU;RZz-UBuhULh(j(qj2Hk8(KX8t`;CXZ=7#W3)yLK)5e;xY_E*GZ=J zK=XO51#2C-My$9@gK@26}Oqs`saf2#T|T7*YD$7R~U<>;33#IOnr^ zZFdI^Sc-iKD=JWi9}RQY+l6CBENZVD)S^{v z+zlJFFfGX8PzA4FlP08Ro`;8m2)|}1wf?t?_p#Qp@!s?*elBY!HJo5<2d!>Ostd^u z8un^&)Y3-ft}Z-cpk4}vPN$#q%xVv)JMy7y8VMe>g!vz{vK1Zg!|jp;8HiO{>pLDb zMi&tK#QGll-{)jqp=_CJX*cVkUW4eq6632G^#b{LOzw2O|gDzdWpp|31H^hWPR zJGGDh^A_*j9&gx?6-TZ=>CvrkkdG9Q=t($cv>L3F3rVDYK@pn>LaGm*@G$UlR-4$S z=jSZFz_`M=Cu(5TWbCF!&wp55b5T^dh)IjrFw^Q6Fh~@EWJj7f$vF{cCN0b)(c}e2 zee7O^%0qhDh#RfVWCVg?gPrbovHIyf;|FQK4;*({`pPEzv>x~qd4%lVezint8!6xB zEW~a~#jf8kv{m3OlRNI4=69lHS!1XdF^bOX?d??v9x~}zfwNA1vuIBCYIXwLmFN4r zAjSUKPyf63zvC1a*QR>P)z#U0z|3#j%Z=-`ZNjBjyflw=525bADX;q9ZsL9^d(zw4 zR2Q?<>PO|5dnxkuyy~Lk4E7x0+p_)%aKp$l<;yPkz5K|z5_gZwg=az3O6zc)-t?0Q z6;J(QbxZjJyx+Yfc6K&$^@hgnWz_Z7%q@tB%{C!D~Om*h0j1@Swd>|9_O^$O4h%d zAL7mN@9CJCfDt8`>uB_LXHoa+Wr?rE4cnOsv^<+%s|d<=bwff};iFklg*3Yp2`RO3 zi^+6X#c}yS^CRKguuVP63~hd#5)D@z&VoheLwJAr!{R)f9$Ew_x~C+(=4N|#H>bx4 z7$U8Y!yJK?wk7%6Zz5{-A`Gg^RYSt#S-_@P*Y6-GJW+`V(kBbL&^3#%(@yp;1(0B#l%=@or zoCGcwH8x?{pHbo!{`gNog-4*d2dAOp2);r>sUJ^&es*zc^II=&r5Y9c0Y?~SmHYno z909qax~@C)Rrna=G9Hc~+g=Z4b)wkz<+lX1<7T8*@s6?LIz}^F?=uP7*m|n`UIW1b z+3TkRTW7-ZTn#cpv?(4jj@gvoUKKT8_osF1{sb)7ya;%XeeRM@{^0tFdlWODKP!BG zf(9KwBuQJq z9Oxf5C0ia-Dr*fN{N;WMd&R`R!l<13trR*YI>+}y%i|qd_O9!Cg6@r~x~LwNhQ;~K zJbWG?KZD9+=Fgny$Hh$MS!W8ZXAtPHwH^eb zX7`H;7QcL{LLjC;{c7wsb+5>{RNsV^sR!3c@fzW=qaX#u@&#_d0u!Y+27{9mTWk>% z)a!@NF9QoHjrgrzNgE8!S>|kndb}1T8`-Q%Tq4R%U#Qx9T0-0;1rL?q%K$F;2Y%j> z)*jMHNH$mCBB$uI(i93{)6)>fTUZu^ZAEB(zdYKYteS{!5x0_AzI9ty0^7PFSW(o) zWW=nE(iV|Fsz3S|;2Yw6{AtYG&9%HKo8_9!zAT>n+EydukihlxTJ)Com}G|YYh{&;dEmXTKL)-r zhqus$O5C_3D{sSY6qY}{Z65zMyYgc_9QKXx@k?{71LCH=Ch>>s33+q&+Y^=GLZyrm zpsAZ0o1W`Q!}6K_23I^G$hc{Lq1abc9J@iFwe)==SHOo6hdv>adHI&J(-y%FB->Le zOPIB!;#v@@U`%{{n)LdDeV{6|na{e-yxT^X_Fd8ZBJrd}wMtf0HE)hG%%>wf912X7 zB~G7tJ;3eIKP??L!y8)@=Cyh*ygSw}&3zHNgCkhBZw5Pnv%Vkd^Zl}c;W*>XBWryH zP-!m8&EMj8hfVYa+U+4b6m-}}Y*5CiVpZkwm|8}76e|+Wtr~X9`-2FNAKLG5zI%WU> literal 0 HcmV?d00001 From 48cd4f211691a6f626768d7f1e596f65d10d0dda Mon Sep 17 00:00:00 2001 From: Oisin Date: Mon, 10 Feb 2025 18:12:30 +0000 Subject: [PATCH 07/10] Added the draw.io workflow diagram to the 
 README.md doc

---
 README.md | 16 ++++++++++++++--
 1 file changed, 14 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 0895493..8546727 100644
--- a/README.md
+++ b/README.md
@@ -2,14 +2,26 @@
 
 ## Overview
 
-This git repository contains code and configurations for implementing a Convolutional Neural Network to classify images containing cats or dogs. The data was sourced from the [dogs-vs-cats](https://www.kaggle.com/competitions/dogs-vs-cats/overview) Kaggle competition, and also from [freeimages.com](https://www.freeimages.com/) using a web scraper. Docker containers were used to deploy the application on an EC2 spot instances in order to scale up hardware and computation power.
+This git repository contains code and configurations for implementing a Convolutional Neural Network to classify images containing cats or dogs. The data was sourced from the [dogs-vs-cats](https://www.kaggle.com/competitions/dogs-vs-cats/overview) Kaggle competition, and also from [freeimages.com](https://www.freeimages.com/) using a web scraper.
+
+Two models were trained to classify the images: an AlexNet8 model via Keras and a VGG16 model via Torch.
+
+Docker containers were used to deploy the application on EC2 spot instances in order to scale up hardware and computation power.
+
+![Workflow](doc/catclassifier.jpg)
 
 ## Analysis Results
 
+Both models were trained using a variety of image transformations, early stopping, stochastic gradient descent, learning rate reduction and cross entropy loss criterion.
+
 See the analysis results notebook for a summary of the project; including image processing, CNN architecture and model performance.
 
 * https://nbviewer.org/github/oislen/CatClassifier/blob/main/report/torch_analysis_results.ipynb
 
-## Docker Container
+## Running the Application (Windows)
+
+### Anaconda
+
+### Docker
 
 The application docker container is available on dockerhub here:
 
From 55381c701cea7e14c6aad889a5fb191bf50d6d37 Mon Sep 17 00:00:00 2001
From: Oisin
Date: Mon, 10 Feb 2025 19:59:26 +0000
Subject: [PATCH 08/10] Updated qmarkdown and jupyter notebook reports

---
 report/keras_analysis_results.ipynb |  4 ++--
 report/keras_analysis_results.qmd   |  4 ++--
 report/torch_analysis_results.ipynb | 12 +++++++++---
 report/torch_analysis_results.qmd   |  6 +++---
 4 files changed, 16 insertions(+), 10 deletions(-)

diff --git a/report/keras_analysis_results.ipynb b/report/keras_analysis_results.ipynb
index 0fb87d7..92db525 100644
--- a/report/keras_analysis_results.ipynb
+++ b/report/keras_analysis_results.ipynb
@@ -86,7 +86,7 @@
    "id": "a7b1dcff-9319-4973-9659-fda87cb85481",
    "metadata": {},
    "source": [
-    "An AlexNet CNN model with 8 layers was trained using the processed images via Keras. See AlexNet diagram below, as well as keras model summary."
+    "An AlexNet CNN model with 8 layers was trained using the processed images via Keras. See AlexNet diagram below, as well as Keras model summary. Stochastic gradient descent was implemented to optimize the cross entropy loss training criterion."
    ]
   },
   {
@@ -266,7 +266,7 @@
    "id": "f771965b-b74f-4363-aa95-242038dcd235",
    "metadata": {},
    "source": [
-    "The model was trained across 25 epochs. The model accuracy and loss are plotted below across the training and validation sets."
+    "The model was trained across 25 epochs. Learning rate reduction on plateau and early stopping were implemented as part of the training procedure. The model accuracy and loss are plotted below across the training and validation sets."
    ]
   },
   {
diff --git a/report/keras_analysis_results.qmd b/report/keras_analysis_results.qmd
index 5dcf88a..4fbae08 100644
--- a/report/keras_analysis_results.qmd
+++ b/report/keras_analysis_results.qmd
@@ -36,7 +36,7 @@ The images were further processed using rotations, scaling, zooming, flipping an
 
 ## AlexNet8 Model Architecture
 
-An AlexNet CNN model with 8 layers was trained using the processed images via Keras. See AlexNet diagram below, as well as Keras model summary.
+An AlexNet CNN model with 8 layers was trained using the processed images via Keras. See AlexNet diagram below, as well as Keras model summary. Stochastic gradient descent was implemented to optimize the cross entropy loss training criterion.
 
 ![AlexNet Architecture](keras/AlexNet8_architecture.png)
 
@@ -49,7 +49,7 @@ model.summary()
 
 ## Model Performance
 
-The model was trained across 25 epochs. The model accuracy and loss are plotted below across the training and validation sets.
+The model was trained across 25 epochs. Learning rate reduction on plateau and early stopping were implemented as part of the training procedure. The model accuracy and loss are plotted below across the training and validation sets.
 
 ![Model Accuracy](keras/model_accuracy.png)
 
diff --git a/report/torch_analysis_results.ipynb b/report/torch_analysis_results.ipynb
index 1432277..fca9ada 100644
--- a/report/torch_analysis_results.ipynb
+++ b/report/torch_analysis_results.ipynb
@@ -37,13 +37,13 @@
     "\n",
     "## Data Processing\n",
     "\n",
-    "The images were resized to a uniform dimension prior to the modelling training phase. See example image processing below. \n",
+    "The images were resized to a uniform dimension and the colour channels normalised prior to the modelling training phase. See example image processing below. \n",
     "\n",
     "![Generator Plot](torch/generator_plot.jpg)\n",
     "\n",
     "## VGG16 Model Architecture\n",
     "\n",
-    "A pre-trained VGG CNN model with 16 layers was trained using the processed images via PyTorch. See VGG16 diagram below, as well as torch model summary.\n",
+    "A pre-trained VGG CNN model with 16 layers was trained using the processed images via PyTorch. See VGG16 diagram below, as well as torch model summary. Stochastic gradient descent was implemented to optimize the cross entropy loss training criterion.\n",
     "\n",
     "![AlexNet Architecture](torch/VGG16_architecture.png)\n"
    ]
@@ -128,7 +128,7 @@
    "source": [
     "## Model Performance\n",
     "\n",
-    "The model was trained across 4 epochs. The model accuracy and loss are plotted below across the training and validation sets.\n",
+    "The model was trained across 10 epochs. Learning rate reduction on plateau and early stopping were implemented as part of the training procedure. The model accuracy and loss are plotted below across the training and validation sets.\n",
     "\n",
     "![Model Accuracy](torch/model_accuracy.png)\n",
     "\n",
@@ -140,6 +140,12 @@
     "\n",
     "![Predicted Images](torch/pred_images.jpg)"
    ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "0c7419e8",
+   "metadata": {},
+   "source": []
   }
  ],
  "metadata": {
diff --git a/report/torch_analysis_results.qmd b/report/torch_analysis_results.qmd
index bad237a..28c8aab 100644
--- a/report/torch_analysis_results.qmd
+++ b/report/torch_analysis_results.qmd
@@ -31,13 +31,13 @@ This project aims to create a model to classify cat and dog images. The data was
 
 ## Data Processing
 
-The images were resized to a uniform dimension prior to the modelling training phase. See example image processing below. 
+The images were resized to a uniform dimension and the colour channels normalised prior to the modelling training phase. See example image processing below. 
 
 ![Generator Plot](torch/generator_plot.jpg)
 
 ## VGG16 Model Architecture
 
-A pre-trained VGG CNN model with 16 layers was trained using the processed images via PyTorch. See VGG16 diagram below, as well as torch model summary.
+A pre-trained VGG CNN model with 16 layers was trained using the processed images via PyTorch. See VGG16 diagram below, as well as torch model summary. Stochastic gradient descent was implemented to optimize the cross entropy loss training criterion.
 
 ![AlexNet Architecture](torch/VGG16_architecture.png)
 
@@ -53,7 +53,7 @@ print(model)
 
 ## Model Performance
 
-The model was trained across 4 epochs. The model accuracy and loss are plotted below across the training and validation sets.
+The model was trained across 10 epochs. Learning rate reduction on plateau and early stopping were implemented as part of the training procedure. The model accuracy and loss are plotted below across the training and validation sets.
 
 ![Model Accuracy](torch/model_accuracy.png)
 
From c1b5e7bd851d33aa1544e811db9c1f6bd99680e1 Mon Sep 17 00:00:00 2001
From: Oisin
Date: Mon, 10 Feb 2025 20:07:07 +0000
Subject: [PATCH 09/10] Added report points and images to README file

---
 README.md | 11 +++++++++--
 1 file changed, 9 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 8546727..3aef08b 100644
--- a/README.md
+++ b/README.md
@@ -12,9 +12,16 @@ Docker containers were used to deploy the application on EC2 spot instances in
 
 ## Analysis Results
 
-Both models were trained using a variety of image transformations, early stopping, stochastic gradient descent, learning rate reduction and cross entropy loss criterion.
+The images were further processed using rotations, scaling, zooming, flipping and shearing prior to the modelling training phase.
+
+![Generator Plot](report/torch/generator_plot.jpg)
+
+Models were trained across 10 to 25 epochs using stochastic gradient descent and cross entropy loss. Learning rate reduction on plateau and early stopping were implemented as part of the training procedure.
+
+![Predicted Images](report/torch/pred_images.jpg)
+
+See the analysis results notebook for further details on the analysis, including CNN architecture and model performance.
 
-See the analysis results notebook for a summary of the project; including image processing, CNN architecture and model performance.
 * https://nbviewer.org/github/oislen/CatClassifier/blob/main/report/torch_analysis_results.ipynb
 
 ## Running the Application (Windows)
 
From 19fa5a69f16f83a79eb2cf5f6e90dd206a015e12 Mon Sep 17 00:00:00 2001
From: Oisin
Date: Mon, 10 Feb 2025 20:18:15 +0000
Subject: [PATCH 10/10] Added instructions for running application on windows via anaconda and docker

---
 README.md | 38 ++++++++++++++++++++++++++++++++++++--
 1 file changed, 36 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 3aef08b..192a0fa 100644
--- a/README.md
+++ b/README.md
@@ -28,8 +28,42 @@ See the analysis results notebook for further details on the analysis, includi
 
 ### Anaconda
 
+Create a local conda environment for the Cat Classifier app using [anaconda](https://www.anaconda.com/):
+
+```
+conda create --name CatClassifier python=3.12 --yes
+conda activate CatClassifier
+pip install -r requirements.txt
+```
+
+Execute the webscrapers and model training pipeline in the local conda environment using the following commands:
+
+```
+:: run webscrapers
+python webscrapers/prg_scrape_imgs.py --run_download_comp_data --run_webscraper
+:: run model training pipeline
+python model/prg_torch_model.py --run_model_training --run_testset_prediction
+```
+
+The model training and evaluation report can be opened with:
+
+```
+jupyter lab --ip=0.0.0.0 --allow-root "report/torch_analysis_results.ipynb"
+```
 
 ### Docker
 
-The application docker container is available on dockerhub here:
+The latest version of the Cat Classifier app can be found as a [docker](https://www.docker.com/) image on dockerhub here:
+
+* https://hub.docker.com/repository/docker/oislen/cat-classifier
+
+The image can be pulled from dockerhub using the following command:
+
+```
+docker pull oislen/cat-classifier:latest
+```
+
+The Cat Classifier app can then be started within a jupyter lab session from the docker image using the following command:
 
-https://hub.docker.com/repository/docker/oislen/cat-classifier
+```
+docker run --name cc --shm-size=512m --publish 8888:8888 -it oislen/cat-classifier:latest
+```
\ No newline at end of file
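For reference, the training setup described in the patches above (stochastic gradient descent, a cross entropy loss criterion, learning rate reduction on plateau and early stopping) can be sketched in PyTorch roughly as follows. This is a minimal illustrative sketch only: the `train` function, the `patience` and learning rate values, and the model/data loader objects are hypothetical placeholders, not the repository's actual code.

```
# illustrative sketch: SGD + cross entropy loss + ReduceLROnPlateau + early stopping
import torch

def train(model, train_loader, valid_loader, n_epochs=10, patience=3, device="cpu"):
    # cross entropy loss criterion optimised with stochastic gradient descent
    criterion = torch.nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    # reduce the learning rate when the validation loss plateaus
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode="min", factor=0.5, patience=1)
    best_loss, epochs_without_improvement = float("inf"), 0
    for epoch in range(n_epochs):
        # training pass
        model.train()
        for images, labels in train_loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
        # validation pass
        model.eval()
        valid_loss = 0.0
        with torch.no_grad():
            for images, labels in valid_loader:
                images, labels = images.to(device), labels.to(device)
                valid_loss += criterion(model(images), labels).item()
        valid_loss /= len(valid_loader)
        scheduler.step(valid_loss)
        # simple early stopping on the validation loss
        if valid_loss < best_loss:
            best_loss, epochs_without_improvement = valid_loss, 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break
    return model
```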