
Conversation

@YifanShenSZ (Collaborator) commented Sep 5, 2025

Torch has released 2.8, but we currently support only up to 2.7.1. Until coremltools is upgraded to be compatible with torch 2.8, let us limit torch to < 2.8 in CI.

CI ✅ https://gitlab.com/coremltools1/coremltools/-/commit/463b494215a5f6f9ef73e913e8d60a25fe8c472d/pipelines
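
For context, a minimal sketch of what this version ceiling means in practice. This is an illustrative runtime check only, not the actual CI change (the PR constrains the torch version installed in CI); the helper name is hypothetical, and it assumes packaging and torch are installed.

```python
# Illustrative only (not the actual CI change): verify at runtime that the installed
# torch satisfies the temporary "< 2.8" ceiling described in this PR.
from packaging.version import Version

import torch

MAX_EXCLUSIVE = Version("2.8")  # coremltools currently supports torch up to 2.7.1

def torch_within_supported_range() -> bool:
    # torch.__version__ (e.g. "2.7.1" or "2.7.1+cpu") parses as a PEP 440 version.
    return Version(torch.__version__) < MAX_EXCLUSIVE

assert torch_within_supported_range(), (
    f"torch {torch.__version__} is not yet supported; install torch<2.8"
)
```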

@junpeiz junpeiz self-requested a review September 5, 2025 20:00
@YifanShenSZ YifanShenSZ merged commit 2256748 into apple:main Sep 5, 2025
noobsiecoder pushed a commit to noobsiecoder/coremltools that referenced this pull request Sep 10, 2025
junpeiz pushed a commit that referenced this pull request Sep 11, 2025
* Fix Issue #2583: Dynamic padding in torch.nn.functional.pad

Modified _array_construct to handle dynamic padding values: it now creates proper Var objects with mb.concat instead of Python lists, which fixes the AttributeError raised when converting models that pad by x.size(-1) (see the padding sketch after this commit message).

* limit torch to older than 2.8 for now (#2591)

Co-authored-by: yifan_shen3 <[email protected]>

* Add RMSNorm operator support for PyTorch to CoreML conversion (#2585) (#2592)

* Add RMSNorm operator support for PyTorch to CoreML conversion (#2585)

* formatted code

* handles FP16 overflow for the RMSNorm operation (see the RMSNorm sketch after this commit message)

* handle dynamic padding without breaking legacy code

---------

Co-authored-by: Yifan Shen <[email protected]>
Co-authored-by: yifan_shen3 <[email protected]>
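
Padding sketch. A minimal illustration of the dynamic-padding pattern behind Issue #2583, where the pad amount is derived from a runtime size via x.size(-1). The model, shapes, and the modulus are hypothetical; the conversion call is left as a comment because it only succeeds on a coremltools build that includes this fix.

```python
# Illustrative only: pad amount depends on the input's runtime size (x.size(-1)).
import torch
import torch.nn.functional as F

class PadToMultipleOf4(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        pad = (-x.size(-1)) % 4          # amount needed to reach the next multiple of 4
        return F.pad(x, [0, pad])        # pad only the end of the last dimension

example = torch.rand(1, 3, 10)
traced = torch.jit.trace(PadToMultipleOf4(), example)   # (1, 3, 10) -> (1, 3, 12)

# Conversion sketch (assumes a coremltools build that includes this fix):
# import coremltools as ct
# mlmodel = ct.convert(traced, inputs=[ct.TensorType(shape=example.shape)])
```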
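RMSNorm sketch. A reference definition of the operation these commits add, y = x / sqrt(mean(x^2) + eps) * weight over the last dimension. The float32 upcast shown here is one common way to avoid FP16 overflow and is an assumption for illustration; it is not necessarily how the coremltools MIL implementation handles it.

```python
# Illustrative only: reference RMSNorm with a float32 upcast as one possible
# FP16-overflow mitigation (assumption; the actual MIL implementation may differ).
import torch

def rms_norm(x: torch.Tensor, weight: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    x32 = x.float()                                    # widen so x**2 cannot overflow fp16
    rms = torch.sqrt(x32.pow(2).mean(dim=-1, keepdim=True) + eps)
    return (x32 / rms).to(x.dtype) * weight

x = torch.full((2, 4), 300.0, dtype=torch.float16)     # 300**2 = 90000 > fp16 max (~65504)
w = torch.ones(4, dtype=torch.float16)
print(rms_norm(x, w))                                  # finite (all ones) instead of inf/NaN
```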