
Entrega Notebooks #35

Open
jfh891 wants to merge 20 commits into Sistemas-Multimedia:main from afb356-ual:main
Conversation


@jfh891 jfh891 commented Feb 8, 2026

Group members:
José Julián Fernández Hernández
Juan Carlos Díaz González
Ayoub Fehri Boulaadas
Jaime Martín Clares

BM3D, NLM, ADAPTIVE_HUFFMAN, ARITH, ADAPTIVE_ARITH

jfh891 and others added 18 commits December 28, 2025 12:00
Added a comment indicating the purpose of the file.
Add comment to requirements.txt
…ocal Means in NLM.py

- Added nlm1.png and nlm2.jpg to the notebooks directory.
- Created NLM.py with a function that applies Non-Local Means denoising to both grayscale and color images using OpenCV.
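For context, here is a minimal pure-NumPy sketch of the Non-Local Means idea; the submitted NLM.py reportedly wraps OpenCV's implementation instead, and the function name and parameter defaults below are illustrative only. Each pixel is replaced by a weighted average of pixels whose surrounding patches look similar to its own patch.

```python
import numpy as np

def nlm_denoise(img, patch=3, search=7, h=30.0):
    """Naive Non-Local Means for a 2-D uint8 image (educational sketch,
    not the PR's implementation). O(N * search^2 * patch^2), so slow."""
    img = img.astype(np.float64)
    pad = patch // 2
    padded = np.pad(img, pad, mode="reflect")
    rows, cols = img.shape
    out = np.zeros_like(img)
    sr = search // 2
    for i in range(rows):
        for j in range(cols):
            # Reference patch centered at (i, j).
            ref = padded[i:i + patch, j:j + patch]
            weights_sum = 0.0
            acc = 0.0
            # Scan the search window around (i, j).
            for di in range(max(0, i - sr), min(rows, i + sr + 1)):
                for dj in range(max(0, j - sr), min(cols, j + sr + 1)):
                    cand = padded[di:di + patch, dj:dj + patch]
                    d2 = np.mean((ref - cand) ** 2)
                    w = np.exp(-d2 / (h * h))  # similar patches weigh more
                    weights_sum += w
                    acc += w * img[di, dj]
            out[i, j] = acc / weights_sum
    return np.clip(out, 0, 255).astype(np.uint8)
```

In practice one would call OpenCV's `cv2.fastNlMeansDenoising` (grayscale) or `cv2.fastNlMeansDenoisingColored` (color), which implement the same idea far more efficiently.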
BM3D, NLM, ADAPTIVE_HUFFMAN, ARITH, ADAPTIVE_ARITH
Authored by:

José Julián Fernández Hernández
Juan Carlos Díaz González
Ayoub Fehri Boulaadas
Jaime Martín Clares
Correction and execution of final notebooks
Copilot AI review requested due to automatic review settings February 8, 2026 23:04

Copilot AI left a comment


Pull request overview

Adds new entropy codecs and decoding-only post-filters, plus notebook updates for the “Entrega Notebooks” submission (ARITH, ADAPTIVE_ARITH, ADAPTIVE_HUFFMAN, BM3D, NLM).

Changes:

  • Introduces new codecs: arith, adaptive_arith, and adaptive_Huffman.
  • Adds decoding-only post-processing filters: BM3D and NLM.
  • Updates notebooks/requirements and adds a .gitignore (but also includes generated artifacts).
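To illustrate what "adaptive" buys in the ARITH/ADAPTIVE_ARITH pair: an adaptive coder updates its symbol-probability model as it encodes, so the ideal cost of each symbol is −log2 of its probability under the model *at that moment*. This is a sketch of the general principle only, not code from this PR, and the function name is hypothetical.

```python
from collections import Counter
import math

def adaptive_code_length(symbols, alphabet_size=256):
    """Bits an ideal order-0 adaptive arithmetic coder would spend,
    using Laplace-smoothed running counts (hypothetical helper)."""
    counts = Counter()
    total = 0
    bits = 0.0
    for s in symbols:
        # Probability of s under the model *before* seeing it.
        p = (counts[s] + 1) / (total + alphabet_size)
        bits += -math.log2(p)
        # Update the model: this step is what makes the coder adaptive.
        counts[s] += 1
        total += 1
    return bits
```

A highly repetitive byte stream costs far fewer bits than a diverse stream of the same length, because the model quickly concentrates probability on the symbols it has already seen.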

Reviewed changes

Copilot reviewed 12 out of 24 changed files in this pull request and generated 21 comments.

Show a summary per file
File Description
src/arith.py New arithmetic-coding image codec implementation.
src/adaptive_arith.py New adaptive arithmetic-coding codec (context order support).
src/adaptive_Huffman.py New context-based adaptive Huffman codec implementation.
src/BM3D.py New BM3D denoising filter wrapper (decode-time).
src/NLM.py New Non-Local Means denoising filter wrapper (decode-time).
requirements.txt Adds jupyterlab and a header comment.
notebooks/gaussan_blur.ipynb Notebook metadata update (kernel Python version).
notebooks/color-VQ.ipynb Notebook metadata update (kernel Python version).
notebooks/LloydMax.ipynb Notebook metadata update (kernel Python version).
.gitignore Adds ignore rules (currently insufficient for nested checkpoints).
src/.ipynb_checkpoints/gaussian_blur-checkpoint.py Auto-generated checkpoint file added (should not be tracked).
notebooks/.ipynb_checkpoints/gaussan_blur-checkpoint.ipynb Auto-generated checkpoint file added (should not be tracked).
notebooks/.ipynb_checkpoints/LloydMax-checkpoint.ipynb Auto-generated checkpoint file added (should not be tracked).
notebooks/.ipynb_checkpoints/Makefile-checkpoint Auto-generated checkpoint file added (should not be tracked).
src/__pycache__/parser.cpython-312.pyc Compiled artifact added (should not be tracked).
src/__pycache__/main.cpython-312.pyc Compiled artifact added (should not be tracked).
src/__pycache__/entropy_image_coding.cpython-312.pyc Compiled artifact added (should not be tracked).
Comments suppressed due to low confidence (1)

notebooks/.ipynb_checkpoints/gaussan_blur-checkpoint.ipynb:5

  • This is an auto-generated notebook checkpoint and shouldn’t be tracked in git. Remove it from the PR and rely on .gitignore to exclude .ipynb_checkpoints/ directories.


Comment on lines +1 to +59
'''Image blurring using low-pass filtering. *** Effective only when decoding! ***'''

import numpy as np
import logging
import parser
import main
with open("/tmp/description.txt", 'w') as f:
    f.write(__doc__)
#import entropy_image_coding as EIC
import importlib
import cv2

#import entropy_image_coding as EIC
#import importlib


default_filter_size = 5
#default_blur_filter = "gaussian"
#default_EIC = "TIFF"

#_parser, parser_encode, parser_decode = parser.create_parser(description=__doc__)

# Encoder parser
#parser.parser_encode.add_argument("-c", "--entropy_image_codec", help=f"Entropy Image Codec (default: {default_EIC})", default=default_EIC)

# Decoder parser
#parser.parser_decode.add_argument("-f", "--blur_filter", help=f"Blurring filter name (gaussian, median or blur) (default: {default_blur_filter})", default=default_blur_filter)
parser.parser_decode.add_argument("-s", "--filter_size", type=parser.int_or_str, help=f"Filter size (default: {default_filter_size})", default=default_filter_size)
import no_filter

args = parser.parser.parse_known_args()[0]
#EC = importlib.import_module(args.entropy_image_codec)

class CoDec(no_filter.CoDec):

    def __init__(self, args):
        logging.debug(f"trace args={args}")
        super().__init__(args)
        logging.debug(f"args = {self.args}")
        self.args = args
        #if self.encoding:
        #    self.filter = "gaussian"
        #    self.filter_size = 0

    def decode(self):
        compressed_k = self.decode_read()
        k = self.decompress(compressed_k)
        logging.debug(f"k.shape={k.shape} k.dtype={k.dtype}")
        y = self.filter(k)
        output_size = self.decode_write(y)
        return output_size

    def filter(self, img):
        logging.debug(f"trace y={img}")
        logging.info(f"filter size={self.args.filter_size}")
        return cv2.GaussianBlur(img, (self.args.filter_size, self.args.filter_size), 0)

if __name__ == "__main__":
    main.main(parser.parser, logging, CoDec)

Copilot AI Feb 8, 2026


This is an auto-generated Jupyter checkpoint file and shouldn’t be tracked in git. Remove it from the PR and rely on .gitignore (with a non-root-anchored .ipynb_checkpoints/ rule) to keep checkpoints out of the repository.
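As a sketch of the rule this review asks for (the exact pattern names are assumed, not taken from the PR's actual .gitignore): a .gitignore pattern with no leading slash and no internal slash matches at any directory depth, so a single line covers both notebooks/.ipynb_checkpoints/ and src/.ipynb_checkpoints/.

```gitignore
# Matches .ipynb_checkpoints/ directories at any depth (no leading "/" anchor)
.ipynb_checkpoints/

# Likewise for the Python bytecode caches also flagged in this review
__pycache__/
*.pyc
```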

Suggested change
'''Image blurring using low-pass filtering. *** Effective only when decoding! ***'''
import numpy as np
import logging
import parser
import main
with open("/tmp/description.txt", 'w') as f:
    f.write(__doc__)
#import entropy_image_coding as EIC
import importlib
import cv2
#import entropy_image_coding as EIC
#import importlib
default_filter_size = 5
#default_blur_filter = "gaussian"
#default_EIC = "TIFF"
#_parser, parser_encode, parser_decode = parser.create_parser(description=__doc__)
# Encoder parser
#parser.parser_encode.add_argument("-c", "--entropy_image_codec", help=f"Entropy Image Codec (default: {default_EIC})", default=default_EIC)
# Decoder parser
#parser.parser_decode.add_argument("-f", "--blur_filter", help=f"Blurring filter name (gaussian, median or blur) (default: {default_blur_filter})", default=default_blur_filter)
parser.parser_decode.add_argument("-s", "--filter_size", type=parser.int_or_str, help=f"Filter size (default: {default_filter_size})", default=default_filter_size)
import no_filter
args = parser.parser.parse_known_args()[0]
#EC = importlib.import_module(args.entropy_image_codec)
class CoDec(no_filter.CoDec):
    def __init__(self, args):
        logging.debug(f"trace args={args}")
        super().__init__(args)
        logging.debug(f"args = {self.args}")
        self.args = args
        #if self.encoding:
        #    self.filter = "gaussian"
        #    self.filter_size = 0
    def decode(self):
        compressed_k = self.decode_read()
        k = self.decompress(compressed_k)
        logging.debug(f"k.shape={k.shape} k.dtype={k.dtype}")
        y = self.filter(k)
        output_size = self.decode_write(y)
        return output_size
    def filter(self, img):
        logging.debug(f"trace y={img}")
        logging.info(f"filter size={self.args.filter_size}")
        return cv2.GaussianBlur(img, (self.args.filter_size, self.args.filter_size), 0)
if __name__ == "__main__":
    main.main(parser.parser, logging, CoDec)
"""
This file is a Jupyter notebook checkpoint artifact and is intentionally left blank.
The actual implementation should live in the non-checkpoint module (for example,
`src/gaussian_blur.py`). This stub remains only to avoid keeping executable code
in auto-generated checkpoint files.
"""

Comment on lines +1 to +14
NOTEBOOKS := $(wildcard *.ipynb)
OUTPUTS := $(NOTEBOOKS:.ipynb=.pdf)

all: $(OUTPUTS)

%.pdf: %.ipynb
	jupyter nbconvert --to pdf --execute $<

info:
	@echo "Notebooks:" $(NOTEBOOKS)
	@echo "Outputs:" $(OUTPUTS)

# Clean output markers
clean:
	rm -f *.pdf

Copilot AI Feb 8, 2026


This appears to be a Jupyter checkpoint copy of the Makefile. Checkpoint files shouldn’t be committed; remove it and rely on .gitignore to ignore .ipynb_checkpoints/ directories.

Suggested change
NOTEBOOKS := $(wildcard *.ipynb)
OUTPUTS := $(NOTEBOOKS:.ipynb=.pdf)
all: $(OUTPUTS)
%.pdf: %.ipynb
	jupyter nbconvert --to pdf --execute $<
info:
	@echo "Notebooks:" $(NOTEBOOKS)
	@echo "Outputs:" $(OUTPUTS)
# Clean output markers
clean:
	rm -f *.pdf
# This file is an auto-generated Jupyter checkpoint of the Makefile.
# It is intentionally left non-functional and should not be used.
# Use the main Makefile in this project instead.

Comment on lines +3 to +4
import os
import sys

Copilot AI Feb 8, 2026


`os` and `sys` are imported but never used in this module. Removing unused imports reduces noise and avoids misleading dependencies.

Suggested change
import os
import sys

Comment on lines +99 to +103
        num_dims = np.frombuffer(buffer.read(4), dtype=np.uint32)[0]
        shape = tuple(np.frombuffer(buffer.read(4 * int(num_dims)), dtype=np.uint32))
    except:
        return np.zeros((10,10), dtype=np.uint8)


Copilot AI Feb 8, 2026


This bare `except:` returns a hard-coded 10x10 zero image on any failure to parse the header, which can silently hide corrupted or invalid inputs and produce incorrect output. Prefer catching specific exceptions and surfacing the error (raise, or at least log and fail) instead of fabricating data.

Suggested change
num_dims = np.frombuffer(buffer.read(4), dtype=np.uint32)[0]
shape = tuple(np.frombuffer(buffer.read(4 * int(num_dims)), dtype=np.uint32))
except:
return np.zeros((10,10), dtype=np.uint8)
# Read and validate the number of dimensions (4 bytes, uint32)
num_dims_bytes = buffer.read(4)
if len(num_dims_bytes) != 4:
raise ValueError("Compressed stream is too short to contain num_dims header.")
num_dims = int(np.frombuffer(num_dims_bytes, dtype=np.uint32)[0])
# Read and validate the shape (num_dims * 4 bytes, uint32 each)
if num_dims < 0:
raise ValueError(f"Invalid num_dims value in header: {num_dims}")
shape_bytes_expected = 4 * num_dims
shape_bytes = buffer.read(shape_bytes_expected)
if len(shape_bytes) != shape_bytes_expected:
raise ValueError("Compressed stream is too short to contain complete shape header.")
shape = tuple(np.frombuffer(shape_bytes, dtype=np.uint32))
except (ValueError, TypeError, OverflowError) as exc:
logging.error("Failed to parse compressed image header: %s", exc)
raise

    f.write(__doc__)
import parser
import main
import importlib

Copilot AI Feb 8, 2026


importlib is imported but never used in this module. Consider removing the unused import.

Suggested change
import importlib

with open("/tmp/description.txt", 'w') as f:
    f.write(__doc__)
#import entropy_image_coding as EIC
import importlib

Copilot AI Feb 8, 2026


Import of 'importlib' is not used.

Suggested change
import importlib

if __name__ == "__main__":
    try:
        parser.parser_encode.add_argument('--orden', type=int, default=0, help='Orden del contexto')
    except: pass

Copilot AI Feb 8, 2026


Except block directly handles BaseException.

    except: pass
    try:
        parser.parser_decode.add_argument('--orden', type=int, default=0, help='Orden del contexto')
    except: pass

Copilot AI Feb 8, 2026


Except block directly handles BaseException.

if __name__ == "__main__":
    try:
        parser.parser_encode.add_argument('--orden', type=int, default=0, help='Orden del contexto')
    except: pass

Copilot AI Feb 8, 2026


'except' clause does nothing but pass and there is no explanatory comment.

Comment on lines +240 to +243
    except: pass
    try:
        parser.parser_decode.add_argument('--orden', type=int, default=0, help='Orden del contexto')
    except: pass

Copilot AI Feb 8, 2026


'except' clause does nothing but pass and there is no explanatory comment.

Suggested change
    except: pass
    try:
        parser.parser_decode.add_argument('--orden', type=int, default=0, help='Orden del contexto')
    except: pass
    except AttributeError:
        logging.debug("parser_encode is not available; skipping '--orden' argument registration.")
    try:
        parser.parser_decode.add_argument('--orden', type=int, default=0, help='Orden del contexto')
    except AttributeError:
        logging.debug("parser_decode is not available; skipping '--orden' argument registration.")
