
add a solution#184

Open
viktoriamyhailiak wants to merge 4 commits into mate-academy:master from
viktoriamyhailiak:develop

Conversation

@viktoriamyhailiak

No description provided.


@mateacademy-ai-mentor mateacademy-ai-mentor left a comment


Review Summary

Decision: Request Changes — There are two critical blockers that break required functionality.

  • Incorrect filename extension mapping in src/createServer.js (near where extMap is defined). The current map uses algorithm names (gzip, deflate, br), producing file.txt.gzip/file.txt.deflate. Requirement: gzip -> .gz, deflate -> .dfl, br -> .br. Fix by setting const extMap = { gzip: 'gz', deflate: 'dfl', br: 'br' }; and ensure the Content-Disposition filename uses this extension.
  • Stream creation from the uploaded Buffer is wrong. Using Readable.from(fileBuffer) emits numeric bytes (numbers) rather than Buffer chunks, which is incompatible with zlib transforms and can cause runtime errors. Fix by emitting a Buffer chunk, e.g., Readable.from([fileBuffer]), or use a PassThrough and pass.end(fileBuffer) before piping to the compressor.

Minor (non-blocking) suggestions:

  • Wrap the Content-Disposition filename in quotes to handle spaces.

You’ve implemented multipart validation, zlib transforms, and response streaming well—nice work. Address the two issues above to meet the spec and pass tests.


✨ Please rate this review in the Platform's notification center [1]. I will be grateful for your feedback! ✨

Important

Fix the issues, commit the changes, and then re-request my review using the Re-request review button [2] in the Reviewers tab.

Tip

If you have any comments about this review, please leave a message in the Add a comment section below and mention me with the @mateacademy-ai-mentor tag [3].
I will read your message during my next review of your pull request. Please do not reply directly to my comments, as that feature is not supported yet.

Caution

Do not close the pull request until the review is complete. Closing it early can lead to unexpected behavior and disrupt the review process.

Footnotes

  1. Rate AI review example

  2. Re-request AI review example

  3. Add AI review comments example

const zlib = require('node:zlib');
const { Readable } = require('stream');

const extMap = { gzip: 'gzip', deflate: 'deflate', br: 'br' };


Incorrect extension mapping — the extMap currently maps each type to the algorithm name ('gzip', 'deflate', 'br') so the produced filenames will be file.txt.gzip or file.txt.deflate which violates the requirement that the compressed file name must append these exact extensions: gzip -> .gz, deflate -> .dfl, br -> .br (description). Update the map to use the required extensions, for example:

const extMap = { gzip: 'gz', deflate: 'dfl', br: 'br' };

This change is required so that a request with gzip returns file.txt.gz as specified in the task.

`attachment; filename=${compressedFileName}`,
);

const stream = Readable.from(fileBuffer);


Readable.from usage will emit numeric bytes instead of Buffer chunks. Readable.from(fileBuffer) will iterate the Buffer and emit individual numbers (byte values), which is not a valid chunk type for the zlib transform and will likely cause runtime errors. Instead, ensure the stream emits Buffer chunks (e.g. a single Buffer chunk). Two simple approaches:

  • Use Readable.from([fileBuffer]) so the whole Buffer is emitted as a single chunk.
  • Or create a PassThrough stream and write the buffer into it:
const { PassThrough } = require('stream');
const pass = new PassThrough();
pass.end(fileBuffer);
pass.pipe(compressor).pipe(res);

Make this change so the uploaded data actually flows through the zlib transform as required by the task (must use Node.js Streams and zlib).


@mateacademy-ai-mentor mateacademy-ai-mentor left a comment


Review Summary

I’m requesting changes because there are critical blockers that violate the task requirements and affect functionality.

  • Critical: In src/createServer.js, the extension map produces file.txt.gzip/file.txt.deflate. The spec requires .gz, .dfl, .br. Update extMap (e.g., const extMap = { gzip: 'gz', deflate: 'dfl', br: 'br' };) and ensure the filename uses these.
  • Critical: Validation must reject empty uploads. At the validation step, check the file buffer is non-empty (e.g., !fileBuffer || !fileBuffer.length || !type) to return 400 for zero-byte files.
  • Critical: Quote the filename in the Content-Disposition header to handle spaces/special characters: use attachment; filename="${compressedFileName}".

Minor improvement (optional): consider using stream.pipeline for robust error handling across streams.

You’ve done a solid job with multipart parsing, zlib usage, and streaming; fixing the above will align the server with the spec and pass the expected tests.



const zlib = require('node:zlib');
const { Readable } = require('stream');

const extMap = { gzip: 'gzip', deflate: 'deflate', br: 'br' };


Critical: extMap is incorrect. The task requires these output extensions: gzip -> .gz, deflate -> .dfl, br -> .br. Currently the map will produce filenames like file.txt.gzip and file.txt.deflate. Update the map to use the required extensions, for example:

const extMap = { gzip: 'gz', deflate: 'dfl', br: 'br' };

This change is required so the response filename matches the task specification.

}
}

if (!fileBuffer || !type) {


Functional correctness: the code currently treats the presence of fileBuffer as enough to accept the file, but it does not verify the uploaded file contains data. The requirement says the file field must be present and contain data. Replace the check at the validation step with something that ensures non-empty content, e.g.:

if (!fileBuffer || !fileBuffer.length || !type) {
  // respond 400
}

This ensures zero-length uploads are rejected with 400 as required.

Comment on lines +143 to +145
res.setHeader(
'Content-Disposition',
`attachment; filename=${compressedFileName}`,


Please quote the filename in Content-Disposition to safely handle names with spaces or special characters. Change:

'Content-Disposition',
`attachment; filename=${compressedFileName}`,

to

'Content-Disposition',
`attachment; filename="${compressedFileName}"`,

This prevents clients from mis-parsing the header and ensures the correct filename is presented to the user.


@mateacademy-ai-mentor mateacademy-ai-mentor left a comment


Sorry, the pull request review was unsuccessful for the following reasons:

Warning

No changes were made in the pull request since the last review. Please make sure that you have added files to the pull request and committed the changes.
[CODE: NCM]

If the issue is within your control, please address it.
The next review attempt will automatically launch in about 1 hour.
If you believe this is an error or need assistance, please contact the support team.


@mateacademy-ai-mentor mateacademy-ai-mentor left a comment


Review Summary

Educational override: Approving to help you progress while learning. You’ve implemented multipart parsing, used zlib compressors, and streamed the response—great job on the core flow and using streams effectively.

Critical fixes to align with the spec:

  • Filename extensions (src/createServer.js): map to short forms: const extMap = { gzip: 'gz', deflate: 'dfl', br: 'br' } and ensure the final name is originalName.<ext>; tests check this exactly. [CHECKLIST ITEM]
  • Zero-byte validation (src/createServer.js): change if (!fileBuffer || !type) to also check length: if (!fileBuffer || fileBuffer.length === 0 || !type) { ... } to correctly reject empty uploads with 400. [CHECKLIST ITEM]
  • Content-Disposition quoting (src/createServer.js): use attachment; filename="${compressedFileName}" so filenames with spaces/special chars are handled. [CHECKLIST ITEM]
  • GET / should serve the HTML form (src/createServer.js): return index.html with Content-Type: text/html instead of OK so users see the required form posting to /compress.
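The first three checklist items can be sketched as small helpers (validate and contentDisposition are hypothetical names for illustration, not functions in the project):

```javascript
// Required short extensions per the spec.
const extMap = { gzip: 'gz', deflate: 'dfl', br: 'br' };

// Empty Buffers are truthy, so length must be checked explicitly
// to reject zero-byte uploads with 400.
function validate(fileBuffer, type) {
  return Boolean(fileBuffer && fileBuffer.length > 0 && extMap[type]);
}

// Quoting keeps filenames with spaces intact in the header.
function contentDisposition(originalName, type) {
  return `attachment; filename="${originalName}.${extMap[type]}"`;
}
```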

Minor improvement: prefer stream.pipeline(readable, compressor, res, cb) for robust error handling across streams.

You’re very close—tighten these details and you’ll meet the checklist and pass tests. Keep up the solid work!



const zlib = require('node:zlib');
const { Readable } = require('stream');

const extMap = { gzip: 'gzip', deflate: 'deflate', br: 'br' };


The extension map uses long names ('gzip', 'deflate', 'br') instead of the required short extensions. The spec requires mapping: gzip → .gz, deflate → .dfl, br → .br. Update this to e.g. const extMap = { gzip: 'gz', deflate: 'dfl', br: 'br' }; so the response filename matches the checklist. This is critical because tests check the exact filename extension.

Comment on lines +12 to +16
if (req.url === '/' && req.method === 'GET') {
res.statusCode = 200;
res.end('OK');

return;


GET / currently returns the string OK. The task requires serving a page with an HTML form (there is an index.html in the project). Serve that HTML (or its contents) on GET / with Content-Type: text/html so visiting / shows the required form that posts to /compress. Right now consumers don't receive the form expected by the spec.

Comment on lines +124 to +129
if (!fileBuffer || !type) {
res.statusCode = 400;
res.end('Missing file or compressionType');

return;
}


The validation if (!fileBuffer || !type) does not detect zero-byte file uploads because an empty Buffer is truthy. The requirements mandate rejecting missing or empty uploads with HTTP 400. Change the check to ensure the buffer has length, e.g. if (!fileBuffer || fileBuffer.length === 0 || !type) { ... }. This is a critical validation fix.

? zlib.createDeflate()
: zlib.createBrotliCompress();

const compressedFileName = `${path.basename(filename)}.${extMap[type]}`;


When building the compressed filename you use path.basename(filename) and extMap[type]. After fixing extMap ensure the constructed name yields originalName.<ext> with the short extensions (.gz, .dfl, .br). Tests expect that exact naming scheme. Confirm extMap[type] returns the short extension before concatenation.

Comment on lines +143 to +145
res.setHeader(
'Content-Disposition',
`attachment; filename=${compressedFileName}`,


The Content-Disposition header sets the filename without quotes: attachment; filename=${compressedFileName}. Filenames containing spaces or special characters must be quoted. Change to attachment; filename="${compressedFileName}" (or use proper RFC5987 encoding if needed). Tests typically expect the filename to be quoted. This is required to meet the spec about delivering a downloadable file with correct filename.

Comment on lines +150 to +156
stream
.pipe(compressor)
.pipe(res)
.on('error', () => {
res.statusCode = 500;
res.end('Compression error');
});


Error handling for stream piping is fragile. Instead of chaining .pipe(...).pipe(res).on('error', ...), prefer stream.pipeline(readable, compressor, res, (err) => { ... }) from the stream module to propagate errors across all streams reliably and avoid leaking a partial response. This is a recommended robustness improvement (not strictly required), but it helps prevent unhandled stream errors. Example:

const { pipeline } = require('stream');

pipeline(stream, compressor, res, (err) => {
  if (err) {
    res.statusCode = 500;
    res.end('Compression error');
  }
});
