Fix doc for PerceptionLMForConditionalGeneration forward. #40733
Conversation
In src/transformers/models/perception_lm/modular_perception_lm.py:

@@ -317,7 +317,6 @@ def prepare_inputs_for_generation(
         return model_inputs

     @can_return_tuple
-    @auto_docstring
We still need to keep auto_docstring; it will add docs about the forward args.
auto_docstring makes the modeling file's doc stale (it doesn't pick up the modular file's doc change).
Maybe a simple workaround is to remove it, convert, then add it back.
What do you mean, modular doesn't copy the example snippet if we keep it? We can't delete it from the modeling file either, so it is not about modular, and we can't add it back :)
    "content": [
        {
            "type": "image",
            "url": test_image_file,
No need to download; we can put the image link here (https://huggingface.co/datasets/shumingh/perception_lm_test_images/resolve/main/14496_0.PNG) and it will also work.
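The suggested change can be sketched as a chat message that passes the hosted URL directly; this mirrors the "content" structure in the snippet above, and the prompt text is a made-up placeholder:

```python
# Chat message referencing the hosted test image by URL instead of a
# downloaded local file. The prompt text is a made-up placeholder.
IMAGE_URL = (
    "https://huggingface.co/datasets/shumingh/perception_lm_test_images"
    "/resolve/main/14496_0.PNG"
)

messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": IMAGE_URL},
            {"type": "text", "text": "Describe this image."},
        ],
    }
]

# The processor's chat template can fetch the image from the URL itself,
# so the doc example needs no separate download step.
print(messages[0]["content"][0]["url"])
```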
ok. let me test this out. I've already left Meta, let me find a GPU somewhere lol.
Yea, see the PR description:
If @auto_docstring is present in modular, no matter how one updates its docstring, the convert script does NOT update the docstring in modeling.
If @auto_docstring is removed from modular, the convert script updates the docstring in modeling.
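The pinning effect described above can be illustrated with a toy decorator. This is not the real transformers auto_docstring (which operates through the modular convert script, not at runtime); it is only a minimal sketch of how a decorator that injects a generated docstring overrides later manual edits to the source docstring:

```python
# Toy stand-in (NOT the real transformers implementation): a decorator that
# injects a generated docstring, replacing whatever was written in the source.
def toy_auto_docstring(fn):
    fn.__doc__ = "Generated docstring (pinned)."
    return fn

@toy_auto_docstring
def forward_decorated():
    """Manually updated docstring."""

def forward_plain():
    """Manually updated docstring."""

print(forward_decorated.__doc__)  # the generated text wins
print(forward_plain.__doc__)      # the manual edit survives
```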
That is weird, a similar pattern works fine with other models (e.g. src/transformers/models/aria/modular_aria.py, lines 1462 to 1485 at fd2a29d).
Could it be caused by a version mismatch with one of the packages? 🤔
Hmm, I just noticed the "stale" example I mentioned was actually added here, post PLM PR:
add43c4#diff-4bdb20fd22b6d6f61cdb8e54281a289e68ab95c377bebb3fbfa4f23c08204f45
Ah yeah, modular copied from llava because it had no example from Perception itself. Sorry, missed that one. As long as we add the docstring it should be fine, together with the decorator.
but only modeling though, not the modular file
Both, same as in aria, and it should work.
Oh, I meant the "stale" example was in modeling only, not in modular, in that PR.
Ahhh okay, misunderstood then. In any case, we will need to have both files updated, with identical docs, to pass CI :)
Let's merge it, it only had the decorator missing :)
[For maintainers] Suggested jobs to run (before merge): run-slow: perception_lm
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Ah, have to approve first
Awesome, thanks! (I'm getting a laptop from my new employer today)
Merged #40733 into main.
Congrats, hope you got a GPU as well 😆
@auto_docstring for PerceptionLMForConditionalGeneration's forward would pin the doc for the method in the modeling file to a stale version, even if the doc was manually updated in the modular file. Removing it and then running the convert script updated the example in the modeling file's doc.
@zucchini-nlp