Fix Issue #2583: Dynamic padding in torch.nn.functional.pad #2588
## Description

Fixes #2583

This PR fixes the conversion of `torch.nn.functional.pad` operations with dynamic padding values (e.g., `x.size(-1)`) to Core ML format.

## Problem
When padding values contain runtime-determined sizes, the `_array_construct` function in `ops.py` was returning a plain Python list instead of a proper MIL `Var` object, causing `AttributeError: 'list' object has no attribute 'val'` downstream.
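A minimal reproduction along these lines (model and shapes are illustrative, not taken verbatim from the issue) triggers the error before this fix:

```python
import torch
import coremltools as ct

class DynamicPad(torch.nn.Module):
    def forward(self, x):
        # The pad amount depends on a runtime size, so the traced graph
        # builds the padding list from non-constant values.
        pad = x.size(-1) // 2
        return torch.nn.functional.pad(x, (pad, pad))

traced = torch.jit.trace(DynamicPad(), torch.rand(1, 3, 8))

# With a flexible input dimension the pad amounts stay symbolic; before
# this fix, conversion raised:
#   AttributeError: 'list' object has no attribute 'val'
mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(shape=(1, 3, ct.RangeDim(2, 16)))],
)
```

## Solution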
Modified `_array_construct` to create a proper `Var` object using `mb.concat` for dynamic list construction, maintaining the expected `Var` interface throughout the conversion pipeline.
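A sketch of the approach (paraphrased, not the verbatim patch; the structure follows the existing constant-folding branch in `ops.py`):

```python
import numpy as np
from coremltools.converters.mil import Builder as mb

def _array_construct(context, node, array_type):
    inputs = [context[name] for name in node.inputs]
    if all(inp.val is not None for inp in inputs):
        # Every element is a compile-time constant: fold into a const Var.
        res = array_type([np.array(inp.val) for inp in inputs])
        context.add(mb.const(val=res, name=node.name))
    else:
        # At least one element is runtime-determined. Returning the raw
        # Python list here is what broke downstream callers expecting a
        # Var; instead, expand scalar elements to rank 1 and concatenate
        # them into a single 1-D Var.
        elems = [
            mb.expand_dims(x=inp, axes=[0]) if inp.rank == 0 else inp
            for inp in inputs
        ]
        context.add(mb.concat(values=elems, axis=0, name=node.name))
```

The design point is that `mb.concat` yields a `Var` whose value may still be symbolic, so later pad lowering can consume constant and dynamic lists uniformly.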
## Testing

Added tests in `test_ops.py` covering dynamic, constant, and mixed padding cases; a sketch of the added coverage follows.
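A condensed sketch of what such a test might look like (parameter ids and structure are illustrative; the actual tests follow the existing harness in `test_ops.py`):

```python
import pytest
import torch
import coremltools as ct

@pytest.mark.parametrize(
    "pad_fn",
    [
        lambda x: torch.nn.functional.pad(x, (2, 2)),                  # constant
        lambda x: torch.nn.functional.pad(x, (x.size(-1) // 4,) * 2),  # dynamic
        lambda x: torch.nn.functional.pad(x, (1, x.size(-1) // 4)),    # mixed
    ],
    ids=["constant", "dynamic", "mixed"],
)
def test_pad_with_dynamic_amounts(pad_fn):
    traced = torch.jit.trace(pad_fn, torch.rand(1, 3, 8))
    # A flexible last dimension keeps the dynamic pad amounts symbolic,
    # exercising the non-constant branch of _array_construct.
    mlmodel = ct.convert(
        traced,
        inputs=[ct.TensorType(shape=(1, 3, ct.RangeDim(4, 16)))],
    )
    assert mlmodel is not None
```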
## Checklist for Reviewer(s)