
Is it possible to connect pytest assert failures to problems and code views? #120

Closed
DonJayamanne opened this issue Nov 13, 2017 · 6 comments
Labels: area-testing, feature-request (Request for new features or functionality), important (Issue identified as high-priority)

Comments

@DonJayamanne

From @hendrics on May 21, 2017 20:15

Hi. I was wondering if there's an option to integrate assert failures with the Problems/code view.

I managed to hack it together using problemMatchers for tasks; see a detailed example here:
https://github.com/hendrics/python-vscode-pytest-example
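
Roughly, the hack boils down to a tasks.json along these lines (a simplified sketch rather than a verbatim copy of that repo; the pytest flags and the regex are assumptions you may need to adjust for your project's output):

```jsonc
{
  "version": "2.0.0",
  "tasks": [
    {
      "label": "pytest",
      "type": "shell",
      // --tb=line prints one "path:lineno: message" line per failure,
      // which is easy to match with a single-line problem matcher.
      "command": "python -m pytest --tb=line",
      "problemMatcher": {
        "owner": "python",
        // pytest may print absolute or rootdir-relative paths; autoDetect handles both.
        "fileLocation": ["autoDetect", "${workspaceFolder}"],
        "severity": "error",
        "pattern": {
          "regexp": "^(.+\\.py):(\\d+):\\s+(.*)$",
          "file": 1,
          "line": 2,
          "message": 3
        }
      }
    }
  ]
}
```

Running that task ("Tasks: Run Task" → pytest) should populate the Problems panel with one entry per failing assert and let you click through to the offending line.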

I'm also looking for the ability to add links in the Output view to jump to the code.
There are open issues on VS Code to address this via matchers.

Overall, copy-pasting tasks.json around is not scalable, so I'm wondering whether there's scope to add this to the extension, unless it is already implemented.

Environment data

VS Code version: 1.12.2
Python Extension version: 0.6.4 (4 May 2017)
Python version: 2.7.11
OS and version: macOS El Capitan

Copied from original issue: DonJayamanne/pythonVSCode#973

@DonJayamanne (Author)

This does sound like a good suggestion; however, what's interesting is that it hasn't been upvoted by anyone. I'll check this feature with my colleagues and get back to you. There could be other edge cases that need to be considered, e.g. if the unit tests pass, we'll then need to ensure the problems disappear.

@brettcannon changed the title from "Is it possible connect pytest assert failures to problems and code views?" to "Is it possible to connect pytest assert failures to problems and code views?" Nov 14, 2017
@brettcannon added the awaiting 1-decision, area-testing, and feature-request labels Nov 14, 2017
@DonJayamanne (Author)

@brettcannon @qubitron
I think it's a good piece of functionality; we could probably accept a PR for this and leave it as requiring volunteer work.

@DonJayamanne (Author)

[Screenshot: screen shot 2018-12-20 at 3 34 14 pm]

[Screenshot: screen shot 2018-12-20 at 3 34 24 pm]

@DonJayamanne (Author)

[Screenshot: screen shot 2018-12-20 at 3 39 36 pm]

@orn688 commented Jan 29, 2019

This is a great feature, but is there a way to disable the red underlines for failing tests? I think of the red underlines as indicating linting/syntax errors, so I find it a bit confusing that they're overloaded to indicate test failures. Thanks for working on this though!

@DonJayamanne (Author)

Please create a separate issue and we can go from there.

@lock bot locked as resolved and limited conversation to collaborators Feb 26, 2019