
Fix skip of test_training_gradient_checkpointing #34723

Merged
1 commit merged into huggingface:main on Nov 18, 2024

Conversation

dvrogozh
Contributor

Commit 19d58d3 introduced a context manager to manage subtests of test_training_gradient_checkpointing. However, the test body was not moved under the "with" statement. Thus, while the tests are correctly marked as skipped, their bodies were still executed. In some cases, as with Llama, this caused attribute errors.

Fixes: #34722
Fixes: 19d58d3 ("Add MLLama (#33703)")

CC: @amyeroberts, @ArthurZucker
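
To illustrate the fix, below is a minimal, hypothetical sketch of the pattern described above (the class and helper names are illustrative, not the actual Transformers test code): the test body has to sit under the with self.subTest(...) statement, otherwise self.skipTest() only ends the subtest and execution falls through to the body.

import unittest

class ExampleModelTest(unittest.TestCase):
    all_model_classes = ()  # hypothetical stand-in for the real test attribute

    def test_training_gradient_checkpointing_buggy(self):
        for model_class in self.all_model_classes:
            with self.subTest(model_class.__name__):
                if not model_class.supports_gradient_checkpointing:
                    self.skipTest(f"`supports_gradient_checkpointing` is False for {model_class.__name__}.")
            # Body outside the "with": with pytest-subtests installed this still runs
            # for skipped classes and may raise, e.g. an AttributeError.
            self._run_gradient_checkpointing_training(model_class)

    def test_training_gradient_checkpointing_fixed(self):
        for model_class in self.all_model_classes:
            with self.subTest(model_class.__name__):
                if not model_class.supports_gradient_checkpointing:
                    self.skipTest(f"`supports_gradient_checkpointing` is False for {model_class.__name__}.")
                # Body under the "with": skipped classes never reach this point.
                self._run_gradient_checkpointing_training(model_class)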

@dvrogozh
Contributor Author

@amyeroberts: the test failures seem unrelated. Please advise when I should rebase/try again.

_ ERROR at setup of JambaModelIntegrationTest.test_simple_batched_generate_with_padding _
....
/usr/local/lib/python3.10/ssl.py:1163: Failed

Commit 19d58d3 introduced a context manager to manage subtests of
test_training_gradient_checkpointing. However, the test body was not
moved under the "with" statement. Thus, while the tests are correctly
marked as skipped, their bodies were still executed. In some cases,
as with Llama, this caused attribute errors.

Fixes: huggingface#34722
Fixes: 19d58d3 ("Add MLLama (huggingface#33703)")
Signed-off-by: Dmitry Rogozhkin <[email protected]>
@dvrogozh
Contributor Author

@amyeroberts: reran the tests, no unrelated failures now. Please help review.

@LysandreJik
Member

Thanks a lot for your PR @dvrogozh! Pinging @qubvel and @ydshieh

@ydshieh ydshieh self-assigned this Nov 15, 2024
@ydshieh
Collaborator

ydshieh commented Nov 15, 2024

Hi @dvrogozh, thank you for the PR.

When I run things like

python -m pytest -v tests\models\llama\test_modeling_llama.py -k "test_training_gradient_checkpointing"

I see

tests/models/llama/test_modeling_llama.py::LlamaModelTest::test_training_gradient_checkpointing SKIPPED (`supports_gradient_checkpointing` is False for LlamaModel.) [ 33%]
tests/models/llama/test_modeling_llama.py::LlamaModelTest::test_training_gradient_checkpointing_use_reentrant SKIPPED (`supports_gradient_checkpointing` is False for LlamaModel.) [ 66%]
tests/models/llama/test_modeling_llama.py::LlamaModelTest::test_training_gradient_checkpointing_use_reentrant_false SKIPPED (`supports_gradient_checkpointing` is False for LlamaModel.)

and I also checked that the body after the subTest block is not run.

Could you tell us in which cases it enters that body even though self.skipTest is executed?

@ydshieh
Collaborator

ydshieh commented Nov 15, 2024

Could you set breakpoint(s) to check which model_class is not skipped?

Also, is pytorch/pytorch@1a8752b necessary to produce the issue? I am running with 2.6.0.dev20241112+cu121 and it works well.

@dvrogozh
Contributor Author

dvrogozh commented Nov 15, 2024

Could you provide us in which cases it will enter that body while self.skipTest is executed.

In short: it happens if the pytest-subtests package is installed.

@ydshieh: this seems interesting. I was wondering why the HF CI does not see this issue, and your comment above also suggests that you don't see it on your side. It seems I've found the reason. I have 2 systems, one with XPU and another with CUDA. Initially I saw the issue on both systems. Yesterday, I fully cleaned and reinstalled the environment for XPU and the issue was gone. So, the issue I observe is triggered by an environment difference. In particular, it shows up if the following package is installed: pytest-subtests (I have version 0.13.1). I am not sure when this package got into my environment, but I did check that if I install it on the XPU system I can again reproduce the issue.
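
For reference, here is a minimal reproducer of the difference described in this thread (a hypothetical example; how exactly each case is reported with and without the plugin is discussed further below):

import unittest

class T(unittest.TestCase):
    def test_skip_then_body(self):
        with self.subTest("maybe skipped"):
            self.skipTest("not supported")
        # Per this thread: with pytest-subtests installed the subtest is reported as
        # skipped and execution continues here, so the line below raises; without the
        # plugin, pytest reports the whole test as skipped instead.
        raise AttributeError("test body ran after the skipped subtest")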

@dvrogozh
Contributor Author

Most likely I got pytest-subtests installed via HF Accelerate. It's in its dependencies:
https://github.com/huggingface/accelerate/blob/c0552c9012a9bae7f125e1df89cf9ee0b0d250fd/setup.py#L25

@dvrogozh
Contributor Author

Also, is pytorch/pytorch@1a8752b necessary to produce the issue? I am running with 2.6.0.dev20241112+cu121 and it works well.

Most likely no. I am just building pytorch from source on my side since I purposely look for XPU backend issues in the most recent code. I did not try other pytorch versions, but the issue I see does not seem to be related to pytorch.

@ydshieh
Collaborator

ydshieh commented Nov 15, 2024

Hi, thanks for the information. Indeed, our CI environment doesn't have pytest-subtests.

I will check it and see what a fix would look like.

Currently, without pytest-subtests, although we don't see the mentioned issue, the logic is still not good: when a model_class is identified as one to skip and self.skipTest runs, it skips the whole test case (not just that single subTest), and the remaining model_class entries are never checked at all (i.e. the for loop does not continue).

@dvrogozh
Contributor Author

Currently, without pytest-subtests, although we don't see the mentioned issue, the logic is not good: <...> it will skip the whole test case (not just that single subTest) <...>

This sounds like we need to add pytest-subtests to the HF Transformers dependencies list. Let me know if you want me to do that in this PR.

dvrogozh added a commit to dvrogozh/transformers that referenced this pull request Nov 15, 2024
As discussed in [1], pytest-subtests changes the behavior of .skipTest(): with
the module installed, individual subtests are really skipped, while without it
the entire test is skipped. Hugging Face Accelerate has the module in its
dependencies. It makes sense to add it for Transformers as well to avoid
divergent environments between users and CI.

See[1]: huggingface#34723 (comment)
Signed-off-by: Dmitry Rogozhkin <[email protected]>
@ydshieh
Collaborator

ydshieh commented Nov 15, 2024

Although I think installing pytest-subtests is the way to go, I find something quite strange:

import unittest


class T(unittest.TestCase):
    def test_foo(self):
        for i in range(7):
            with self.subTest("custom message"):
                if i < 3:
                    self.skipTest(f"bon {i}")
                # self.assertLess(i, 3)
                assert i < 3

fails with

[custom message] SUBFAIL test_foo.py::T::test_foo - AssertionError: assert 6 < 3
====================================================================================== 1 failed, 1 passed, 3 skipped in 0.67s =======================================================================================

while changing for i in range(7): to for i in range(6): gives

1 passed, 3 skipped in 0.43s

I am not sure I am doing stupid things 😭

@ydshieh
Collaborator

ydshieh commented Nov 15, 2024

without with self.subTest(f"custom message {i}"), it works with

[custom message 3] SUBFAIL test_foo.py::T::test_foo - AssertionError: assert 3 < 3
[custom message 4] SUBFAIL test_foo.py::T::test_foo - AssertionError: assert 4 < 3
[custom message 5] SUBFAIL test_foo.py::T::test_foo - AssertionError: assert 5 < 3
[custom message 6] SUBFAIL test_foo.py::T::test_foo - AssertionError: assert 6 < 3

@dvrogozh
Contributor Author

With range(7) and a print() added right after the range line:

>>> 0
>>> 1
>>> 2
>>> 3
>>> 4
>>> 5
>>> 6

T
 » foo
 » foo
 » foo
 ✗ foo
 ✓ foo
[custom message] SUBFAIL d.py::T::test_foo - AssertionError: assert 6 < 3
============================== 1 failed, 1 passed, 3 skipped in 0.11s ==============================

So, it loops through all 7 cases. It correctly skips the first 3, but after that I don't understand what's going on. It should have reported 4 failures, but it reported 1 failure and 1 pass and swallowed the other 2. And in the end it only printed the assertion failure for the last iteration :).

@ydshieh
Collaborator

ydshieh commented Nov 15, 2024

Yeah. Maybe it's a bug. I will open an issue.
In the meantime, for test_training_gradient_checkpointing, I will talk to @ArthurZucker.

@dvrogozh
Contributor Author

without with self.subTest(f"custom message {i}"), it works with

You mean without the if + skip condition? It works reasonably on my side without that. It behaves strangely with if + skip though. I am checking this now.

import unittest
class T(unittest.TestCase):
    def test_foo(self):
        for i in range(7):
            print(f">>> {i}")
            with self.subTest(i=i):
                assert i < 3

output:

$ python3 -m pytest --capture=no d.py
======================================= test session starts ========================================
platform linux -- Python 3.10.12, pytest-7.4.4, pluggy-1.5.0
rootdir: /home/dvrogozh/tmp
plugins: pspec-0.0.4, timeout-2.3.1, hypothesis-6.118.8, subtests-0.13.1, dash-2.18.2, xdist-3.6.1, rich-0.1.1
collected 1 item

d.py >>> 0
>>> 1
>>> 2
>>> 3
>>> 4
>>> 5
>>> 6
uuuu.

============================================= FAILURES =============================================
_________________________________________ T.test_foo (i=3) _________________________________________

self = <d.T testMethod=test_foo>

    def test_foo(self):
        for i in range(7):
            print(f">>> {i}")
            with self.subTest(i=i):
                #if i < 3:
                #    self.skipTest(f"bon {i}")
                #self.assertLess(i, 3)
>               assert i < 3
E               AssertionError: assert 3 < 3

d.py:12: AssertionError
_________________________________________ T.test_foo (i=4) _________________________________________

self = <d.T testMethod=test_foo>

    def test_foo(self):
        for i in range(7):
            print(f">>> {i}")
            with self.subTest(i=i):
                #if i < 3:
                #    self.skipTest(f"bon {i}")
                #self.assertLess(i, 3)
>               assert i < 3
E               AssertionError: assert 4 < 3

d.py:12: AssertionError
_________________________________________ T.test_foo (i=5) _________________________________________

self = <d.T testMethod=test_foo>

    def test_foo(self):
        for i in range(7):
            print(f">>> {i}")
            with self.subTest(i=i):
                #if i < 3:
                #    self.skipTest(f"bon {i}")
                #self.assertLess(i, 3)
>               assert i < 3
E               AssertionError: assert 5 < 3

d.py:12: AssertionError
_________________________________________ T.test_foo (i=6) _________________________________________

self = <d.T testMethod=test_foo>

    def test_foo(self):
        for i in range(7):
            print(f">>> {i}")
            with self.subTest(i=i):
                #if i < 3:
                #    self.skipTest(f"bon {i}")
                #self.assertLess(i, 3)
>               assert i < 3
E               AssertionError: assert 6 < 3

d.py:12: AssertionError
===================================== short test summary info ======================================
(i=3) SUBFAIL d.py::T::test_foo - AssertionError: assert 3 < 3
(i=4) SUBFAIL d.py::T::test_foo - AssertionError: assert 4 < 3
(i=5) SUBFAIL d.py::T::test_foo - AssertionError: assert 5 < 3
(i=6) SUBFAIL d.py::T::test_foo - AssertionError: assert 6 < 3
=================================== 4 failed, 1 passed in 0.11s ====================================

@ydshieh
Collaborator

ydshieh commented Nov 15, 2024

Yes, sorry. I mean without if ... skipTest it works (as you noted). But with it, things go crazy...

@dvrogozh
Contributor Author

@ydshieh: I filed #34755 to specifically consider what to do with the subtests story. See the breakdown there of how it behaves in different cases. Really confusing...

As for the test_training_gradient_checkpointing story, I think there are 2 possible ways to handle it separately from the subtests story:

  1. Either merge this PR (34723) as is, i.e. with self.subTest() remains, but the actual test body is moved under the clause
  2. Or remove with self.subTest() entirely, i.e. revert to the previous version of the test where skipped cases were handled with just continue in the loop, without explicit markup of what passed or failed (see the sketch below)
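
A minimal, hypothetical sketch of option 2 (the helper name is illustrative, not the actual test code):

def test_training_gradient_checkpointing(self):
    for model_class in self.all_model_classes:
        if not model_class.supports_gradient_checkpointing:
            # Plain "continue": no subtest machinery, but also no per-class skip report.
            continue
        self._run_gradient_checkpointing_training(model_class)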

Collaborator

@ArthurZucker ArthurZucker left a comment


Leaving this up to you @ydshieh 🤗 I think this one makes sense!

Collaborator

@ydshieh ydshieh left a comment


Despite the weird issue from pytest-subtests with subTest + skipTest, this PR itself makes sense.

Furthermore, it doesn't change the current behavior when pytest-subtests is not installed (although that behavior is not desirable).

Therefore LGTM to merge.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@ydshieh ydshieh merged commit 1c471fc into huggingface:main Nov 18, 2024
26 checks passed
BernardZach pushed a commit to BernardZach/transformers that referenced this pull request Dec 5, 2024
BernardZach pushed a commit to innovationcore/transformers that referenced this pull request Dec 6, 2024
Successfully merging this pull request may close these issues.

CI fails on few test_training_gradient_checkpointing tests for LLAMA