Enable flake8 and re-enable it everywhere #11477
Conversation
pytest.ini
Shouldn't this be left at 79 (the default, i.e. remove this line)?
I got this from @cclauss's PR, but I think you're right, so I'm going to remove it.
Sorry, off-by-one bug on my part.
cclauss commented on Jun 22, 2018
LOL... #10940 (comment)
Well, okay, now pyside(2) seems to be broken.
Also, I have no idea why, but flake8 makes the tests ridiculously slow. I ran a non-parallel test locally and it took ~7 minutes, but with the --flake8 flag it has not finished in over an hour.
Maybe related to tholo/pytest-flake8#19?
If #11477 (comment) is still an issue, perhaps replace pytest-flake8 by a direct invocation of flake8? (with e.g. https://github.com/snoack/flake8-per-file-ignores for per-file ignores)
(TBH I don't really see the point of a pytest plugin that just calls another executable (if it is not active by default).)
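For illustration, the per-file rules could then live in the flake8 config itself; a minimal sketch, assuming the linked plugin's per-file-ignores option (the paths and error codes below are made-up examples, not matplotlib's actual rules):
[flake8]
max-line-length = 79
per-file-ignores =
    examples/*.py: E501
    lib/matplotlib/pylab.py: F401, F403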
I ran
pytest -v --pep8 lib/matplotlib/tests/test_axes.py: 143.09 s
pytest -v --flake8 lib/matplotlib/tests/test_axes.py: 131.09 s
Sure, I didn't have the exceptions turned on or anything (the pep8 test failed).
cclauss commented on Jul 6, 2018
Try running PEP8 and flake8 WITHOUT passing through pytest.
Well, I tried conda, virtualenv, and even downloading the Docker image that Travis used, but I cannot reproduce the slow checks anymore. I guess standalone flake8 is the only option; the only annoyance is that configuration will have to go into a new file, as pytest.ini is not a supported file.
From http://flake8.pycqa.org/en/latest/user/invocation.html
$ flake8 . --config=CONFIG # Path to the config file that will be the authoritative config source. This will cause Flake8 to ignore all other configuration files.
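As a rough sketch of how the two tools could share one file (the option values below are invented placeholders, not matplotlib's real settings), pytest would keep reading the [pytest] section of pytest.ini, while flake8 --config=pytest.ini would pick up only the [flake8] section:
[pytest]
testpaths = lib examples

[flake8]
max-line-length = 79
exclude = .git, build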
Is the issue that pytest spins up a new flake8 process for each file, which just takes forever on the Travis VMs?
Watching a running test, it is not hung; from 91% on it just seems to be taking its time (and then we hit the 50-minute timeout).
One minute, 10 seconds to flake8 test the entire repo seems reasonable to me.
$ time flake8 . --config=pytest.ini # Python 3.7 on Travis CI
real 0m26.769s
user 1m10.204s
sys 0m0.953s
cclauss left a comment
Running flake8 from inside pytest has proved to be a bad idea when each file in the repo has its own style rules. Running flake8 standalone with --config=pytest.ini seems the best approach.
(force-pushed from 9d55403 to 6fa1391)
At some point in the near future, I would like to get rid of the ** exceptions, as they cause confusion when linting a single file (per-file-ignores thinks that the exception is unnecessary and raises its X100 error).
Where does the flake8 output show up in the test? I can't see any evidence that it ran, but maybe I'm looking in the wrong spot...
That's because it passed 😉 I'll add a message there to say it's okay.
cclauss commented on Jul 8, 2018
Fixes #11550
.travis.yml
I suspect this is going to have the same issue we discovered in the pytzectomy PR, where if the tests fail but flake8 passes, Travis will report success.
Probably set -e is the way to go? Or add || return 1 at the end of each (relevant) line.
I don't like set -e; it bleeds into Travis' code and causes spurious errors, so you need to remember to turn it off. The last time that happened, we ended up just removing it. I split these into separate entries instead, which does fail correctly.
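A minimal sketch of the separate-entries approach (the commands below are placeholders, not matplotlib's exact .travis.yml): each entry in Travis' script list runs in turn, and a nonzero exit status from any entry marks the build as failed, so a passing flake8 run can no longer hide a pytest failure.
script:
  # Run the test suite; a failure here fails the build on its own.
  - python -m pytest
  # Run the linter as a separate entry so its exit status is also recorded.
  - flake8 . --config=pytest.ini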
This is used as a meta-class, so I am not sure that this change is correct (I am also not sure off the top of my head what self is in this case and when this __init__ gets called).
It may be that the object is a class/type here, but it's still an instance at this point, so I'm not sure calling it something other than self is warranted.
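A minimal Python sketch of the point being made (RegisteringMeta and Base are invented names, not matplotlib code): the metaclass' __init__ runs when a class using the metaclass is defined, and its first argument is the newly created class, which is itself an instance of the metaclass, so whether to call that parameter self or cls is purely a naming convention.
class RegisteringMeta(type):
    registry = []

    def __init__(cls, name, bases, namespace):
        # 'cls' is the class being defined (an instance of RegisteringMeta).
        super().__init__(name, bases, namespace)
        RegisteringMeta.registry.append(cls)

class Base(metaclass=RegisteringMeta):
    # Defining this class calls RegisteringMeta.__init__ exactly once.
    pass

print(RegisteringMeta.registry)  # [<class '__main__.Base'>]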
I think that the way this is called in Travis will mask failures in the test suite from failing the build.
Anyone can clear this if this is fixed or they are sure I am wrong.
cclauss commented on Jul 8, 2018
Do the flake8 tests first and the pytests after?
@cclauss then it won't fail if flake8 fails?
#11597 shows that @tacaswell is indeed correct: a failing test still returns a positive Travis check the way this is set up now.
Doesn't catch failing pytest tests anymore
#11597 shows that adding set -e causes a failure when pytest should fail. Now testing whether it also fails if there is a flake8 error.
Splitting the two commands into separate entries should work fine.
Thanks everyone!
PR Summary
Due to the removal of the examples symlink in #11141, pytest was no longer seeing the examples and running pep8 on them. This adds that directory back to pytest's search.
It also adds a rebase of #10940 to enable flake8.
PR Checklist