
Pytest summary output shows misleading counts (e.g. xfail tests missing) #7665

@vbrozik

Description

Describe the bug

It looks like marimo implements a customized summary output for pytest. Unfortunately, this "Summary" is misleading because it seems to ignore some test outcomes. Below is a test cell and its output; a sketch of where the full counts could come from follows it. Note: to see more information I added addopts = ["-ra"] to pytest.toml.

@pytest.mark.xfail(reason="Testing xfail functionality")
def test_xfail_example() -> None:
    assert False

x                                                                        [100%]
=================================== Overview ===================================

Summary:
Total: 0, Passed: 0, Failed: 0, Errors: 0, Skipped: 0
=========================== short test summary info ============================
XFAIL src/notebooks/marimo_bugs::test_xfail_example - Testing xfail functionality
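
For context, pytest itself already tracks these outcomes: the terminal reporter keeps per-category lists of test reports, including "xfailed" and "xpassed", so a summary built from those categories would count the test above. Below is a minimal conftest.py sketch, not marimo's actual implementation; it assumes only the standard pytest_terminal_summary hook and the terminalreporter.stats mapping:

# conftest.py -- illustration only, not marimo's code.
# terminalreporter.stats maps an outcome category ("passed", "failed",
# "error", "skipped", "xfailed", "xpassed", ...) to the list of reports
# with that outcome; the "" category holds uncounted setup/teardown reports.
def pytest_terminal_summary(terminalreporter, exitstatus, config):
    counts = {
        category: len(reports)
        for category, reports in terminalreporter.stats.items()
        if category  # skip the uncategorized "" bucket
    }
    total = sum(counts.values())
    terminalreporter.write_line(
        f"Total: {total}, "
        + ", ".join(f"{category}: {n}" for category, n in sorted(counts.items()))
    )

For the cell above, such a summary would report Total: 1, xfailed: 1 rather than all zeros.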

The output of the same test in a regular Python file does not contain the misleading "Overview" section:

~/dev/python-marimo$ pytest ./src/notebooks/marimo_bugs/test_xfail_outside_notebook.py 
=================================================== test session starts ====================================================
platform linux -- Python 3.14.0, pytest-9.0.2, pluggy-1.6.0
rootdir: /home/vbrozik/dev/python-marimo/src/notebooks/marimo_bugs
configfile: pytest.toml
plugins: anyio-4.12.0
collected 1 item                                                                                                           

src/notebooks/marimo_bugs/test_xfail_outside_notebook.py x                                                           [100%]

================================================= short test summary info ==================================================
XFAIL src/notebooks/marimo_bugs/test_xfail_outside_notebook.py::test_xfail_example - Testing xfail functionality
==================================================== 1 xfailed in 0.05s ====================================================
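
For reference, the standalone file contains the same test as the notebook cell; its exact contents are an assumption, reconstructed here from the test id and xfail reason in the output above:

# src/notebooks/marimo_bugs/test_xfail_outside_notebook.py (reconstructed)
import pytest


@pytest.mark.xfail(reason="Testing xfail functionality")
def test_xfail_example() -> None:
    assert False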

Will you submit a PR?

  • Yes

Environment

{
  "marimo": "0.18.4",
  "editable": false,
  "location": "/home/vbrozik/dev/python-marimo/.venv/lib/python3.14/site-packages/marimo",
  "OS": "Linux",
  "OS Version": "6.6.87.2-microsoft-standard-WSL2",
  "Processor": "x86_64",
  "Python Version": "3.14.0",
  "Locale": "en_US",
  "Binaries": {
    "Browser": "--",
    "Node": "v24.12.0"
  },
  "Dependencies": {
    "click": "8.3.1",
    "docutils": "0.22.4",
    "itsdangerous": "2.2.0",
    "jedi": "0.19.2",
    "markdown": "3.10",
    "narwhals": "2.14.0",
    "packaging": "25.0",
    "psutil": "7.2.1",
    "pygments": "2.19.2",
    "pymdown-extensions": "10.20",
    "pyyaml": "6.0.3",
    "starlette": "0.50.0",
    "tomlkit": "0.13.3",
    "typing-extensions": "4.15.0",
    "uvicorn": "0.40.0",
    "websockets": "15.0.1"
  },
  "Optional Dependencies": {
    "altair": "6.0.0",
    "anywidget": "0.9.21",
    "mcp": "1.25.0",
    "openai": "2.14.0",
    "pandas": "2.3.3",
    "polars": "1.36.1",
    "pyarrow": "22.0.0",
    "pytest": "9.0.2",
    "python-lsp-server": "1.14.0",
    "ruff": "0.14.10",
    "vegafusion": "2.0.3"
  },
  "Experimental Flags": {
    "chat_modes": true
  }
}

Code to reproduce

import marimo

__generated_with = "0.18.4"
app = marimo.App(width="full")

with app.setup:
    import pytest


# @app.function defines test_xfail_example at the notebook's top level,
# so both marimo's built-in test runner and plain pytest can collect it.
@app.function
@pytest.mark.xfail(reason="Testing xfail functionality")
def test_xfail_example() -> None:
    assert False


if __name__ == "__main__":
    app.run()
