Conversation
Added a comment indicating the purpose of the file.
Add comment to requirements.txt
.gitignore
…ocal Means in NLM.py
- Added nlm1.png and nlm2.jpg to the notebooks directory.
- Created NLM.py with a function to apply Non-Local Means denoising to both grayscale and color images using OpenCV.
arith and adaptative_arith
Update
BM3D, NLM, ADAPTIVE_HUFFMAN, ARITH, ADAPTIVE_ARITH
Completed by: José Julián Fernández Hernández, Juan Carlos Díaz González, Ayoub Fehri Boulaadas, Jaime Martín Clares
Corrected and re-executed the final notebooks
Pull request overview
Adds new entropy codecs and decoding-only post-filters, plus notebook updates for the “Entrega Notebooks” submission (ARITH, ADAPTIVE_ARITH, ADAPTIVE_HUFFMAN, BM3D, NLM).
Changes:
- Introduces new codecs: `arith`, `adaptive_arith`, and `adaptive_Huffman`.
- Adds decoding-only post-processing filters: `BM3D` and `NLM`.
- Updates notebooks/requirements and adds a `.gitignore` (but also includes generated artifacts).
Reviewed changes
Copilot reviewed 12 out of 24 changed files in this pull request and generated 21 comments.
Summary per file:

| File | Description |
|---|---|
| `src/arith.py` | New arithmetic-coding image codec implementation. |
| `src/adaptive_arith.py` | New adaptive arithmetic-coding codec (context order support). |
| `src/adaptive_Huffman.py` | New context-based adaptive Huffman codec implementation. |
| `src/BM3D.py` | New BM3D denoising filter wrapper (decode-time). |
| `src/NLM.py` | New Non-Local Means denoising filter wrapper (decode-time); see the sketch after this table. |
| `requirements.txt` | Adds jupyterlab and a header comment. |
| `notebooks/gaussan_blur.ipynb` | Notebook metadata update (kernel Python version). |
| `notebooks/color-VQ.ipynb` | Notebook metadata update (kernel Python version). |
| `notebooks/LloydMax.ipynb` | Notebook metadata update (kernel Python version). |
| `.gitignore` | Adds ignore rules (currently insufficient for nested checkpoints). |
| `src/.ipynb_checkpoints/gaussian_blur-checkpoint.py` | Auto-generated checkpoint file added (should not be tracked). |
| `notebooks/.ipynb_checkpoints/gaussan_blur-checkpoint.ipynb` | Auto-generated checkpoint file added (should not be tracked). |
| `notebooks/.ipynb_checkpoints/LloydMax-checkpoint.ipynb` | Auto-generated checkpoint file added (should not be tracked). |
| `notebooks/.ipynb_checkpoints/Makefile-checkpoint` | Auto-generated checkpoint file added (should not be tracked). |
| `src/__pycache__/parser.cpython-312.pyc` | Compiled artifact added (should not be tracked). |
| `src/__pycache__/main.cpython-312.pyc` | Compiled artifact added (should not be tracked). |
| `src/__pycache__/entropy_image_coding.cpython-312.pyc` | Compiled artifact added (should not be tracked). |
Comments suppressed due to low confidence (1)
notebooks/.ipynb_checkpoints/gaussan_blur-checkpoint.ipynb:5
- This is an auto-generated notebook checkpoint and shouldn’t be tracked in git. Remove it from the PR and rely on `.gitignore` to exclude `.ipynb_checkpoints/` directories.
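For reference, a non-root-anchored rule set along these lines would keep checkpoint directories and compiled artifacts out of the repository at any depth. This is a sketch, not the PR's actual `.gitignore`:

```gitignore
# Non-root-anchored: matches .ipynb_checkpoints/ at any depth in the tree
.ipynb_checkpoints/

# Compiled Python artifacts
__pycache__/
*.pyc
```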
From `src/.ipynb_checkpoints/gaussian_blur-checkpoint.py`:

```python
'''Image blurring using low-pass filtering. *** Effective only when decoding! ***'''

import numpy as np
import logging
import parser
import main

with open("/tmp/description.txt", 'w') as f:
    f.write(__doc__)

#import entropy_image_coding as EIC
import importlib
import cv2

#import entropy_image_coding as EIC
#import importlib

default_filter_size = 5
#default_blur_filter = "gaussian"
#default_EIC = "TIFF"

#_parser, parser_encode, parser_decode = parser.create_parser(description=__doc__)

# Encoder parser
#parser.parser_encode.add_argument("-c", "--entropy_image_codec", help=f"Entropy Image Codec (default: {default_EIC})", default=default_EIC)

# Decoder parser
#parser.parser_decode.add_argument("-f", "--blur_filter", help=f"Blurring filter name (gaussian, median or blur) (default: {default_blur_filter})", default=default_blur_filter)
parser.parser_decode.add_argument("-s", "--filter_size", type=parser.int_or_str, help=f"Filter size (default: {default_filter_size})", default=default_filter_size)
import no_filter

args = parser.parser.parse_known_args()[0]
#EC = importlib.import_module(args.entropy_image_codec)

class CoDec(no_filter.CoDec):

    def __init__(self, args):
        logging.debug(f"trace args={args}")
        super().__init__(args)
        logging.debug(f"args = {self.args}")
        self.args = args
        #if self.encoding:
        #    self.filter = "gaussian"
        #    self.filter_size = 0

    def decode(self):
        compressed_k = self.decode_read()
        k = self.decompress(compressed_k)
        logging.debug(f"k.shape={k.shape} k.dtype={k.dtype}")
        y = self.filter(k)
        output_size = self.decode_write(y)
        return output_size

    def filter(self, img):
        logging.debug(f"trace y={img}")
        logging.info(f"filter size={self.args.filter_size}")
        return cv2.GaussianBlur(img, (self.args.filter_size, self.args.filter_size), 0)

if __name__ == "__main__":
    main.main(parser.parser, logging, CoDec)
```
This is an auto-generated Jupyter checkpoint file and shouldn’t be tracked in git. Remove it from the PR and rely on `.gitignore` (with a non-root-anchored `.ipynb_checkpoints/` rule) to keep checkpoints out of the repository.
Suggested change (replace the checkpoint module's executable code with a placeholder docstring):

```python
"""
This file is a Jupyter notebook checkpoint artifact and is intentionally left blank.
The actual implementation should live in the non-checkpoint module (for example,
`src/gaussian_blur.py`). This stub remains only to avoid keeping executable code
in auto-generated checkpoint files.
"""
```
From `notebooks/.ipynb_checkpoints/Makefile-checkpoint`:

```makefile
NOTEBOOKS := $(wildcard *.ipynb)
OUTPUTS := $(NOTEBOOKS:.ipynb=.pdf)

all: $(OUTPUTS)

%.pdf: %.ipynb
	jupyter nbconvert --to pdf --execute $<

info:
	@echo "Notebooks:" $(NOTEBOOKS)
	@echo "Outputs:" $(OUTPUTS)

# Clean output markers
clean:
	rm -f *.pdf
```
This appears to be a Jupyter checkpoint copy of the Makefile. Checkpoint files shouldn’t be committed; remove it and rely on `.gitignore` to ignore `.ipynb_checkpoints/` directories.
Suggested change (replace the checkpoint copy's contents with a placeholder comment):

```makefile
# This file is an auto-generated Jupyter checkpoint of the Makefile.
# It is intentionally left non-functional and should not be used.
# Use the main Makefile in this project instead.
```
```python
import os
import sys
```
`os` and `sys` are imported but not used in this module. Removing unused imports reduces noise and avoids misleading dependencies.
Suggested change: delete the unused `import os` and `import sys` lines.
```python
        num_dims = np.frombuffer(buffer.read(4), dtype=np.uint32)[0]
        shape = tuple(np.frombuffer(buffer.read(4 * int(num_dims)), dtype=np.uint32))
    except:
        return np.zeros((10,10), dtype=np.uint8)
```
This `except:` returns a hard-coded 10x10 zero image on any failure to parse the header, which can silently hide corrupted or invalid inputs and produce incorrect output. Prefer catching specific exceptions and surfacing the error (raise, or at least log and fail) instead of fabricating data.
Suggested change (validate the header and surface failures instead of returning fabricated data):

```python
        # Read and validate the number of dimensions (4 bytes, uint32)
        num_dims_bytes = buffer.read(4)
        if len(num_dims_bytes) != 4:
            raise ValueError("Compressed stream is too short to contain num_dims header.")
        num_dims = int(np.frombuffer(num_dims_bytes, dtype=np.uint32)[0])
        # Read and validate the shape (num_dims * 4 bytes, uint32 each)
        if num_dims < 0:
            raise ValueError(f"Invalid num_dims value in header: {num_dims}")
        shape_bytes_expected = 4 * num_dims
        shape_bytes = buffer.read(shape_bytes_expected)
        if len(shape_bytes) != shape_bytes_expected:
            raise ValueError("Compressed stream is too short to contain complete shape header.")
        shape = tuple(np.frombuffer(shape_bytes, dtype=np.uint32))
    except (ValueError, TypeError, OverflowError) as exc:
        logging.error("Failed to parse compressed image header: %s", exc)
        raise
```
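To make the expected layout concrete, here is a self-contained round-trip sketch of the header format this parser assumes (a uint32 dimension count followed by one uint32 per extent). The helper names `write_header` and `read_header` are illustrative and not part of the PR:

```python
# Round-trip sketch of the assumed header layout: uint32 num_dims, then uint32 extents.
import io
import numpy as np

def write_header(buffer: io.BytesIO, shape: tuple) -> None:
    # 4 bytes for the dimension count, then 4 bytes per dimension.
    buffer.write(np.uint32(len(shape)).tobytes())
    buffer.write(np.asarray(shape, dtype=np.uint32).tobytes())

def read_header(buffer: io.BytesIO) -> tuple:
    num_dims_bytes = buffer.read(4)
    if len(num_dims_bytes) != 4:
        raise ValueError("stream too short for num_dims header")
    num_dims = int(np.frombuffer(num_dims_bytes, dtype=np.uint32)[0])
    shape_bytes = buffer.read(4 * num_dims)
    if len(shape_bytes) != 4 * num_dims:
        raise ValueError("stream too short for shape header")
    return tuple(int(d) for d in np.frombuffer(shape_bytes, dtype=np.uint32))

buf = io.BytesIO()
write_header(buf, (480, 640, 3))
buf.seek(0)
assert read_header(buf) == (480, 640, 3)
```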
```python
    f.write(__doc__)
import parser
import main
import importlib
```
`importlib` is imported but never used in this module. Consider removing the unused import.
Suggested change: remove `import importlib`.
```python
with open("/tmp/description.txt", 'w') as f:
    f.write(__doc__)
#import entropy_image_coding as EIC
import importlib
```
Import of `importlib` is not used.
Suggested change: remove `import importlib`.
```python
if __name__ == "__main__":
    try:
        parser.parser_encode.add_argument('--orden', type=int, default=0, help='Orden del contexto')
    except: pass
```
This bare `except:` block directly handles BaseException.
```python
    except: pass
    try:
        parser.parser_decode.add_argument('--orden', type=int, default=0, help='Orden del contexto')
    except: pass
```
This bare `except:` block directly handles BaseException.
```python
if __name__ == "__main__":
    try:
        parser.parser_encode.add_argument('--orden', type=int, default=0, help='Orden del contexto')
    except: pass
```
The `except` clause does nothing but `pass`, and there is no explanatory comment.
```python
    except: pass
    try:
        parser.parser_decode.add_argument('--orden', type=int, default=0, help='Orden del contexto')
    except: pass
```
The `except` clause does nothing but `pass`, and there is no explanatory comment.
Suggested change (replace the bare `except: pass` clauses):

```python
    except AttributeError:
        logging.debug("parser_encode is not available; skipping '--orden' argument registration.")
    try:
        parser.parser_decode.add_argument('--orden', type=int, default=0, help='Orden del contexto')
    except AttributeError:
        logging.debug("parser_decode is not available; skipping '--orden' argument registration.")
```
Fixed the call to the .py file
Group members:
José Julián Fernández Hernández
Juan Carlos Díaz González
Ayoub Fehri Boulaadas
Jaime Martín Clares
BM3D, NLM, ADAPTIVE_HUFFMAN, ARITH, ADAPTIVE_ARITH