Task ID: task_e_682d2fe780088323a2c5448563899b16
Building scubaduck @ file:///workspace/scubaduck
⠹ Preparing packages... (0/33)
Built scubaduck @ file:///workspace/scubaduck
Prepared 33 packages in 1.64s
░░░░░░░░░░░░░░░░░░░░ [0/33] Installing wheels...
███████████████████░ [32/33] pyright==1.1.400
Installed 33 packages in 83ms
+ blinker==1.9.0
+ certifi==2025.4.26
+ charset-normalizer==3.4.2
+ click==8.2.0
+ duckdb==1.2.2
+ execnet==2.1.1
+ flask==3.1.1
+ greenlet==3.2.2
+ idna==3.10
+ iniconfig==2.1.0
+ itsdangerous==2.2.0
+ jinja2==3.1.6
+ markupsafe==3.0.2
+ nodeenv==1.9.1
+ packaging==25.0
+ playwright==1.52.0
+ pluggy==1.6.0
+ pyee==13.0.0
+ pyright==1.1.400
+ pytest==8.3.5
+ pytest-base-url==2.1.0
+ pytest-playwright==0.7.0
+ pytest-xdist==3.6.1
+ python-dateutil==2.9.0.post0
+ python-slugify==8.0.4
+ requests==2.32.3
+ ruff==0.11.10
+ scubaduck==0.1.0 (from file:///workspace/scubaduck)
+ six==1.17.0
+ text-unidecode==1.3
+ typing-extensions==4.13.2
+ urllib3==2.4.0
+ werkzeug==3.1.3
++ source .venv/bin/activate
+++ '[' -z '' ']'
+++ '[' -n x ']'
+++ SCRIPT_PATH=.venv/bin/activate
+++ '[' .venv/bin/activate = /tmp/3hbE6W-setup_script.sh ']'
+++ deactivate nondestructive
+++ unset -f pydoc
+++ '[' -z '' ']'
+++ '[' -z '' ']'
+++ hash -r
+++ '[' -z '' ']'
+++ unset VIRTUAL_ENV
+++ unset VIRTUAL_ENV_PROMPT
+++ '[' '!' nondestructive = nondestructive ']'
+++ VIRTUAL_ENV=/workspace/scubaduck/.venv
+++ '[' linux-gnu = cygwin ']'
+++ '[' linux-gnu = msys ']'
+++ export VIRTUAL_ENV
+++ '[' -z '' ']'
+++ unset SCRIPT_PATH
+++ _OLD_VIRTUAL_PATH=/root/.cargo/bin:/root/.rbenv/shims:/root/.rbenv/bin:/root/.rbenv/shims:/root/.local/share/swiftly/bin:/root/.bun/bin:/root/.nvm/versions/node/v22.15.1/bin:/root/.pyenv/shims:3437PYENV_ROOT/shims:/root/.pyenv/bin:/usr/local/go/bin:/root/go/bin:/root/.rbenv/bin:/root/.rbenv/shims:/root/.bun/bin:/root/.local/bin:/root/.pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
+++ PATH=/workspace/scubaduck/.venv/bin:/root/.cargo/bin:/root/.rbenv/shims:/root/.rbenv/bin:/root/.rbenv/shims:/root/.local/share/swiftly/bin:/root/.bun/bin:/root/.nvm/versions/node/v22.15.1/bin:/root/.pyenv/shims:3437PYENV_ROOT/shims:/root/.pyenv/bin:/usr/local/go/bin:/root/go/bin:/root/.rbenv/bin:/root/.rbenv/shims:/root/.bun/bin:/root/.local/bin:/root/.pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
+++ export PATH
+++ '[' xscubaduck '!=' x ']'
+++ VIRTUAL_ENV_PROMPT='(scubaduck) '
+++ export VIRTUAL_ENV_PROMPT
+++ '[' -z '' ']'
+++ '[' -z '' ']'
+++ _OLD_VIRTUAL_PS1=
+++ PS1='(scubaduck) '
+++ export PS1
+++ alias pydoc
+++ true
+++ hash -r
++ playwright install chromium
Downloading Chromium 136.0.7103.25 (playwright build v1169) from https://cdn.playwright.dev/dbazure/download/playwright/builds/chromium/1169/chromium-linux.zip
167.7 MiB [] 100% 0.0s
Chromium 136.0.7103.25 (playwright build v1169) downloaded to /root/.cache/ms-playwright/chromium-1169
Downloading FFMPEG playwright build v1011 from https://cdn.playwright.dev/dbazure/download/playwright/builds/ffmpeg/1011/ffmpeg-linux.zip
2.3 MiB [] 100% 0.0s
FFMPEG playwright build v1011 downloaded to /root/.cache/ms-playwright/ffmpeg-1011
Downloading Chromium Headless Shell 136.0.7103.25 (playwright build v1169) from https://cdn.playwright.dev/dbazure/download/playwright/builds/chromium/1169/chromium-headless-shell-linux.zip
101.4 MiB [] 100% 0.0s
Chromium Headless Shell 136.0.7103.25 (playwright build v1169) downloaded to /root/.cache/ms-playwright/chromium_headless_shell-1169
++ echo 'source .venv/bin/activate'
I'll check AGENTS.md for guidance, then open the shell to proceed with the task.
bash: .venv/bin/activate: No such file or directory
root@6c81833b9b82:/workspace# ls
scubaduck
root@6c81833b9b82:/workspace# cd scubaduck
root@6c81833b9b82:/workspace/scubaduck# ls
AGENTS.md codex_setup.sh scubaduck tests
README.md pyproject.toml stubs uv.lock
root@6c81833b9b82:/workspace/scubaduck# cat AGENTS.md
- Use "source .venv/bin/activate" to activate the virtual environment before
running any commands
- ONLY use non-interactive commands like cat, sed, apply_patch to do edits.
Do NOT use interactive editors.
- Do NOT attempt to install packages. Only the packages specified in
pyproject.toml are available. You cannot add new packages. If you
desperately want another package, make a note of it in the final PR
description.
- Use conventional commits to format PR title
- There are no nested AGENTS.md files, this is the only agents file
- When using Playwright, ONLY use chromium browser. This browser is already
installed.
- Use "ruff check" to check lint, "ruff format" to autoformat files and
"pyright" to typecheck.
- When writing the PR description, include the original prompt supplied by the
user.
- When you add functionality to the server, add server tests to
tests/test_server.py. When you add features to the frontend, add frontend
tests to tests/test_web.py
AGENTS.md specifies using certain tools and commands, and I'll inspect the repo directories now.
root@6c81833b9b82:/workspace/scubaduck# ls -R | head
.:
AGENTS.md
README.md
codex_setup.sh
pyproject.toml
scubaduck
stubs
tests
uv.lock
root@6c81833b9b82:/workspace/scubaduck# ls scubaduck
__init__.py sample.csv server.py static
I'm looking over server.py to add the full SQL query and traceback to errors.
root@6c81833b9b82:/workspace/scubaduck# sed -n '1,160p' scubaduck/server.py
from __future__ import annotations
from dataclasses import dataclass, field
from typing import Any, Dict, List, Tuple
import re
from datetime import datetime, timedelta, timezone
import time
from pathlib import Path
import sqlite3
import duckdb
from dateutil import parser as dtparser
from flask import Flask, jsonify, request, send_from_directory
@dataclass
class Filter:
column: str
op: str
value: str | int | float | list[str] | None
@dataclass
class QueryParams:
start: str | None = None
end: str | None = None
order_by: str | None = None
order_dir: str = "ASC"
limit: int | None = None
columns: list[str] = field(default_factory=lambda: [])
filters: list[Filter] = field(default_factory=lambda: [])
derived_columns: dict[str, str] = field(default_factory=lambda: {})
graph_type: str = "samples"
group_by: list[str] = field(default_factory=lambda: [])
aggregate: str | None = None
show_hits: bool = False
def _load_database(path: Path) -> duckdb.DuckDBPyConnection:
ext = path.suffix.lower()
if ext == ".csv":
con = duckdb.connect()
con.execute(
f"CREATE TABLE events AS SELECT * FROM read_csv_auto('{path.as_posix
()}')"
)
elif ext in {".db", ".sqlite"}:
con = duckdb.connect()
sconn = sqlite3.connect(path)
info = sconn.execute("PRAGMA table_info(events)").fetchall()
col_defs = ", ".join(f"{r[1]} {r[2]}" for r in info)
con.execute(f"CREATE TABLE events ({col_defs})")
placeholders = ",".join("?" for _ in info)
for row in sconn.execute("SELECT * FROM events"):
con.execute(f"INSERT INTO events VALUES ({placeholders})", row)
sconn.close()
else:
con = duckdb.connect(path)
return con
_REL_RE = re.compile(
r"([+-]?\d+(?:\.\d*)?)\s*(hour|hours|day|days|week|weeks|fortnight|fortnight
s)",
re.IGNORECASE,
)
def parse_time(val: str | None) -> str | None:
"""Parse an absolute or relative time string into ``YYYY-MM-DD HH:MM:SS``.""
"
if val is None or val == "":
return None
s = val.strip()
if s.lower() == "now":
dt = datetime.now(timezone.utc)
return dt.replace(microsecond=0).strftime("%Y-%m-%d %H:%M:%S")
m = _REL_RE.fullmatch(s)
if m:
qty = float(m.group(1))
unit = m.group(2).lower()
delta: timedelta
if unit.startswith("hour"):
delta = timedelta(hours=qty)
elif unit.startswith("day"):
delta = timedelta(days=qty)
elif unit.startswith("week"):
delta = timedelta(weeks=qty)
elif unit.startswith("fortnight"):
delta = timedelta(weeks=2 * qty)
else: # pragma: no cover - defensive
raise ValueError(f"Unsupported unit: {unit}")
dt = datetime.now(timezone.utc) + delta
return dt.replace(microsecond=0).strftime("%Y-%m-%d %H:%M:%S")
dt = dtparser.parse(s)
return dt.replace(microsecond=0, tzinfo=None).strftime("%Y-%m-%d %H:%M:%S")
def build_query(params: QueryParams) -> str:
select_parts: list[str] = []
if params.group_by:
select_parts.extend(params.group_by)
agg = (params.aggregate or "avg").lower()
def agg_expr(col: str) -> str:
if agg.startswith("p"):
quant = float(agg[1:]) / 100
return f"quantile({col}, {quant})"
if agg == "count distinct":
return f"count(DISTINCT {col})"
return f"{agg}({col})"
for col in params.columns:
if col in params.group_by:
continue
select_parts.append(f"{agg_expr(col)} AS {col}")
if params.show_hits:
select_parts.insert(len(params.group_by), "count(*) AS Hits")
else:
select_parts.extend(params.columns)
for name, expr in params.derived_columns.items():
select_parts.append(f"{expr} AS {name}")
select_clause = ", ".join(select_parts) if select_parts else "*"
query = f"SELECT {select_clause} FROM events"
where_parts: list[str] = []
if params.start:
where_parts.append(f"timestamp >= '{params.start}'")
if params.end:
where_parts.append(f"timestamp <= '{params.end}'")
for f in params.filters:
op = f.op
if op in {"empty", "!empty"}:
val = "''"
else:
if f.value is None:
continue
if isinstance(f.value, list):
if not f.value:
continue
if op == "=":
vals = " OR ".join(f"{f.column} = '{v}'" for v in f.value)
where_parts.append(f"({vals})")
continue
val = f"'{f.value}'" if isinstance(f.value, str) else str(f.value)
if op == "contains":
where_parts.append(f"{f.column} ILIKE '%' || {val} || '%'")
elif op == "!contains":
where_parts.append(f"{f.column} NOT ILIKE '%' || {val} || '%'")
elif op == "empty":
where_parts.append(f"{f.column} = {val}")
elif op == "!empty":
where_parts.append(f"{f.column} != {val}")
else:
where_parts.append(f"{f.column} {op} {val}")
if where_parts:
query += " WHERE " + " AND ".join(where_parts)
if params.group_by:
query += " GROUP BY " + ", ".join(params.group_by)
root@6c81833b9b82:/workspace/scubaduck# sed -n '160,320p' scubaduck/server.py
query += " GROUP BY " + ", ".join(params.group_by)
if params.order_by:
query += f" ORDER BY {params.order_by} {params.order_dir}"
if params.limit is not None:
query += f" LIMIT {params.limit}"
return query
def create_app(db_file: str | Path | None = None) -> Flask:
app = Flask(__name__, static_folder="static")
db_path = Path(db_file or Path(__file__).with_name("sample.csv")).resolve()
con = _load_database(db_path)
column_types: Dict[str, str] = {
r[1]: r[2] for r in con.execute("PRAGMA table_info(events)").fetchall()
}
sample_cache: Dict[Tuple[str, str], Tuple[List[str], float]] = {}
CACHE_TTL = 60.0
CACHE_LIMIT = 200
@app.route("/")
def index() -> Any: # pyright: ignore[reportUnusedFunction]
assert app.static_folder is not None
return send_from_directory(app.static_folder, "index.html")
@app.route("/api/columns")
def columns() -> Any: # pyright: ignore[reportUnusedFunction]
rows = con.execute("PRAGMA table_info(events)").fetchall()
return jsonify([{"name": r[1], "type": r[2]} for r in rows])
def _cache_get(key: Tuple[str, str]) -> List[str] | None:
item = sample_cache.get(key)
if item is None:
return None
vals, ts = item
if time.time() - ts > CACHE_TTL:
del sample_cache[key]
return None
sample_cache[key] = (vals, time.time())
return vals
def _cache_set(key: Tuple[str, str], vals: List[str]) -> None:
sample_cache[key] = (vals, time.time())
if len(sample_cache) > CACHE_LIMIT:
oldest = min(sample_cache.items(), key=lambda kv: kv[1][1])[0]
del sample_cache[oldest]
@app.route("/api/samples")
def sample_values() -> Any: # pyright: ignore[reportUnusedFunction]
column = request.args.get("column")
substr = request.args.get("q", "")
if not column or column not in column_types:
return jsonify([])
ctype = column_types[column].upper()
if "CHAR" not in ctype and "STRING" not in ctype and "VARCHAR" not in ct
ype:
return jsonify([])
key = (column, substr)
cached = _cache_get(key)
if cached is not None:
return jsonify(cached)
rows = con.execute(
f"SELECT DISTINCT {column} FROM events WHERE CAST({column} AS VARCHA
R) ILIKE '%' || ? || '%' LIMIT 20",
[substr],
).fetchall()
values = [r[0] for r in rows]
_cache_set(key, values)
return jsonify(values)
@app.route("/api/query", methods=["POST"])
def query() -> Any: # pyright: ignore[reportUnusedFunction]
payload = request.get_json(force=True)
try:
start = parse_time(payload.get("start"))
end = parse_time(payload.get("end"))
except Exception as exc:
return jsonify({"error": str(exc)}), 400
params = QueryParams(
start=start,
end=end,
order_by=payload.get("order_by"),
order_dir=payload.get("order_dir", "ASC"),
limit=payload.get("limit"),
columns=payload.get("columns", []),
derived_columns=payload.get("derived_columns", {}),
graph_type=payload.get("graph_type", "samples"),
group_by=payload.get("group_by", []),
aggregate=payload.get("aggregate"),
show_hits=payload.get("show_hits", False),
)
for f in payload.get("filters", []):
params.filters.append(Filter(f["column"], f["op"], f.get("value")))
sql = build_query(params)
try:
rows = con.execute(sql).fetchall()
except Exception as exc:
return jsonify({"sql": sql, "error": str(exc)}), 400
return jsonify({"sql": sql, "rows": rows})
return app
app = create_app()
if __name__ == "__main__":
app.run(debug=True)
The exception isn't being caught as expected: the binder error should trigger a 400 response, but it isn't.
The line numbers shifted, so I'll review the whole server code around the query route.
root@6c81833b9b82:/workspace/scubaduck# nl -ba scubaduck/server.py | sed -n '186,252p'
186 def columns() -> Any: # pyright: ignore[reportUnusedFunction]
187 rows = con.execute("PRAGMA table_info(events)").fetchall()
188 return jsonify([{"name": r[1], "type": r[2]} for r in rows])
189
190 def _cache_get(key: Tuple[str, str]) -> List[str] | None:
191 item = sample_cache.get(key)
192 if item is None:
193 return None
194 vals, ts = item
195 if time.time() - ts > CACHE_TTL:
196 del sample_cache[key]
197 return None
198 sample_cache[key] = (vals, time.time())
199 return vals
200
201 def _cache_set(key: Tuple[str, str], vals: List[str]) -> None:
202 sample_cache[key] = (vals, time.time())
203 if len(sample_cache) > CACHE_LIMIT:
204                 oldest = min(sample_cache.items(), key=lambda kv: kv[1][1])[0]
205 del sample_cache[oldest]
206
207 @app.route("/api/samples")
208 def sample_values() -> Any: # pyright: ignore[reportUnusedFunction]
209 column = request.args.get("column")
210 substr = request.args.get("q", "")
211 if not column or column not in column_types:
212 return jsonify([])
213 ctype = column_types[column].upper()
214             if "CHAR" not in ctype and "STRING" not in ctype and "VARCHAR" not in ctype:
215 return jsonify([])
216 key = (column, substr)
217 cached = _cache_get(key)
218 if cached is not None:
219 return jsonify(cached)
220 rows = con.execute(
221                 f"SELECT DISTINCT {column} FROM events WHERE CAST({column} AS VARCHAR) ILIKE '%' || ? || '%' LIMIT 20",
222 [substr],
223 ).fetchall()
224 values = [r[0] for r in rows]
225 _cache_set(key, values)
226 return jsonify(values)
227
228 @app.route("/api/query", methods=["POST"])
229 def query() -> Any: # pyright: ignore[reportUnusedFunction]
230 payload = request.get_json(force=True)
231 try:
232 start = parse_time(payload.get("start"))
233 end = parse_time(payload.get("end"))
234 except Exception as exc:
235 return jsonify({"error": str(exc)}), 400
236
237 params = QueryParams(
238 start=start,
239 end=end,
240 order_by=payload.get("order_by"),
241 order_dir=payload.get("order_dir", "ASC"),
242 limit=payload.get("limit"),
243 columns=payload.get("columns", []),
244 derived_columns=payload.get("derived_columns", {}),
245 graph_type=payload.get("graph_type", "samples"),
246 group_by=payload.get("group_by", []),
247 aggregate=payload.get("aggregate"),
248 show_hits=payload.get("show_hits", False),
249 )
250 for f in payload.get("filters", []):
251             params.filters.append(Filter(f["column"], f["op"], f.get("value")))
252 sql = build_query(params)
root@6c81833b9b82:/workspace/scubaduck# nl -ba scubaduck/server.py | sed -n '252,320p'
252 sql = build_query(params)
253 try:
254 rows = con.execute(sql).fetchall()
255 except Exception as exc:
256 return jsonify({"sql": sql, "error": str(exc)}), 400
257 return jsonify({"sql": sql, "rows": rows})
258
259 return app
260
261
262 app = create_app()
263
264
265 if __name__ == "__main__":
266 app.run(debug=True)
I need to modify the /api/query handler so that error responses include the full SQL query and a stack trace; a rough sketch of the change is below.
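This is only a sketch of what I have in mind, assuming I keep the existing try/except around con.execute(sql) inside the query() route; the "traceback" key and the logging call are my additions, not code that exists yet:

import traceback  # would go with the other imports at the top of server.py

        # inside the /api/query handler, replacing the current except block
        sql = build_query(params)
        try:
            rows = con.execute(sql).fetchall()
        except Exception as exc:
            tb = traceback.format_exc()
            app.logger.error("query failed:\n%s\n%s", sql, tb)  # keep a server-side record too
            return jsonify({"sql": sql, "error": str(exc), "traceback": tb}), 400
        return jsonify({"sql": sql, "rows": rows})

Returning the traceback in the JSON body lets the frontend show it next to the SQL, while the server log keeps a copy for debugging.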
root@6c81833b9b82:/workspace/scubaduck# ls tests
conftest.py test_server.py test_web.py
root@6c81833b9b82:/workspace/scubaduck# sed -n '1,200p' tests/test_server.py
from __future__ import annotations
import json
from pathlib import Path
import duckdb
from scubaduck import server
import pytest
def test_basic_query() -> None:
app = server.app
client = app.test_client()
payload = {
"start": "2024-01-01 00:00:00",
"end": "2024-01-02 00:00:00",
"order_by": "timestamp",
"order_dir": "ASC",
"limit": 10,
"columns": ["timestamp", "event", "value", "user"],
"filters": [],
}
rv = client.post(
"/api/query", data=json.dumps(payload), content_type="application/json"
)
data = rv.get_json()
assert data
rows = data["rows"]
# We expect first three rows (until 2024-01-02 00:00:00)
assert len(rows) == 3
assert rows[0][1] == "login"
assert rows[1][1] == "logout"
def test_filter_multi_token() -> None:
app = server.app
client = app.test_client()
payload = {
"start": "2024-01-01 00:00:00",
"end": "2024-01-02 03:00:00",
"order_by": "timestamp",
"limit": 10,
"columns": ["timestamp", "event", "value", "user"],
"filters": [{"column": "user", "op": "=", "value": ["alice", "charlie"]}
],
}
rv = client.post(
"/api/query", data=json.dumps(payload), content_type="application/json"
)
data = rv.get_json()
assert data
rows = data["rows"]
# Should only return rows for alice and charlie
assert len(rows) == 3
assert rows[0][3] == "alice"
assert rows[-1][3] == "charlie"
def test_empty_filter_is_noop() -> None:
app = server.app
client = app.test_client()
base_payload = {
"start": "2024-01-01 00:00:00",
"end": "2024-01-03 00:00:00",
"order_by": "timestamp",
"limit": 100,
"columns": ["timestamp", "event", "value", "user"],
}
no_filter = {**base_payload, "filters": []}
empty_filter = {
**base_payload,
"filters": [{"column": "user", "op": "=", "value": None}],
}
rv1 = client.post(
"/api/query", data=json.dumps(no_filter), content_type="application/json
"
)
rv2 = client.post(
"/api/query", data=json.dumps(empty_filter), content_type="application/j
son"
)
rows1 = rv1.get_json()["rows"]
rows2 = rv2.get_json()["rows"]
assert rows1 == rows2
def test_select_columns() -> None:
app = server.app
client = app.test_client()
payload = {
"start": "2024-01-01 00:00:00",
"end": "2024-01-03 00:00:00",
"order_by": "timestamp",
"limit": 10,
"columns": ["timestamp", "user"],
"filters": [],
}
rv = client.post(
"/api/query", data=json.dumps(payload), content_type="application/json"
)
data = rv.get_json()
assert data
rows = data["rows"]
assert len(rows[0]) == 2
assert rows[0][1] == "alice"
def test_string_filter_ops() -> None:
app = server.app
client = app.test_client()
base = {
"start": "2024-01-01 00:00:00",
"end": "2024-01-03 00:00:00",
"order_by": "timestamp",
"limit": 100,
"columns": ["timestamp", "event", "value", "user"],
}
contains = {
**base,
"filters": [{"column": "user", "op": "contains", "value": "ali"}],
}
rv = client.post(
"/api/query", data=json.dumps(contains), content_type="application/json"
)
rows = rv.get_json()["rows"]
assert all("ali" in r[3] for r in rows)
regex = {
**base,
"filters": [{"column": "user", "op": "~", "value": "^a.*"}],
}
rv = client.post(
"/api/query", data=json.dumps(regex), content_type="application/json"
)
rows = rv.get_json()["rows"]
assert all(r[3].startswith("a") for r in rows)
assert len(rows) == 2
not_empty = {**base, "filters": [{"column": "user", "op": "!empty"}]}
rv = client.post(
"/api/query", data=json.dumps(not_empty), content_type="application/json
"
)
assert len(rv.get_json()["rows"]) == 4
def _make_payload() -> dict[str, object]:
return {
"start": "2024-01-01 00:00:00",
"end": "2024-01-02 00:00:00",
"order_by": "timestamp",
"order_dir": "ASC",
"limit": 10,
"columns": ["timestamp", "event", "value", "user"],
"filters": [],
}
def test_database_types(tmp_path: Path) -> None:
csv_file = tmp_path / "events.csv"
csv_file.write_text(Path("scubaduck/sample.csv").read_text())
sqlite_file = tmp_path / "events.sqlite"
import sqlite3
conn = sqlite3.connect(sqlite_file)
conn.execute(
"CREATE TABLE events (timestamp TEXT, event TEXT, value INTEGER, user TE
XT)"
)
with open(csv_file) as f:
next(f)
for line in f:
ts, ev, val, user = line.strip().split(",")
conn.execute(
"INSERT INTO events VALUES (?, ?, ?, ?)", (ts, ev, int(val), use
r)
)
conn.commit()
conn.close() # pyright: ignore[reportUnknownMemberType, reportAttributeAccessIssue]
duckdb_file = tmp_path / "events.duckdb"
con = duckdb.connect(duckdb_file)
con.execute(
f"CREATE TABLE events AS SELECT * FROM read_csv_auto('{csv_file.as_posix
()}')"
)
con.close() # pyright: ignore[reportUnknownMemberType, reportAttributeAccessIssue]
for db in (csv_file, sqlite_file, duckdb_file):
app = server.create_app(db)
client = app.test_client()
payload = _make_payload()
rv = client.post(
"/api/query", data=json.dumps(payload), content_type="application/js
on"
)
rows = rv.get_json()["rows"]
assert len(rows) == 3
def test_group_by_table() -> None:
app = server.app
client = app.test_client()
payload = {
"start": "2024-01-01 00:00:00",
root@6c81833b9b82:/workspace/scubaduck# sed -n '200,400p' tests/test_server.py
"start": "2024-01-01 00:00:00",
"end": "2024-01-03 00:00:00",
"order_by": "user",
"limit": 10,
"columns": ["value"],
"group_by": ["user"],
"aggregate": "Sum",
"show_hits": True,
}
rv = client.post(
"/api/query", data=json.dumps(payload), content_type="application/json"
)
rows = rv.get_json()["rows"]
assert rows[0][0] == "alice"
assert rows[0][1] == 2
assert rows[0][2] == 40
def test_relative_time_query(monkeypatch: pytest.MonkeyPatch) -> None:
app = server.app
client = app.test_client()
from datetime import datetime
fixed_now = datetime(2024, 1, 2, 4, 0, 0)
class FixedDateTime(datetime):
@classmethod
def now(cls, tz=None): # type: ignore[override]
return fixed_now if tz is None else fixed_now.astimezone(tz)
monkeypatch.setattr(server, "datetime", FixedDateTime)
payload = {
"start": "-1 hour",
"end": "now",
"order_by": "timestamp",
"limit": 100,
"columns": ["timestamp", "event", "value", "user"],
"filters": [],
}
rv = client.post(
"/api/query", data=json.dumps(payload), content_type="application/json"
)
data = rv.get_json()
assert rv.status_code == 200
assert len(data["rows"]) == 1
assert data["rows"][0][3] == "charlie"
def test_invalid_time_error() -> None:
app = server.app
client = app.test_client()
payload = {
"start": "nonsense",
"end": "now",
"order_by": "timestamp",
"limit": 10,
"columns": ["timestamp"],
"filters": [],
}
rv = client.post(
"/api/query", data=json.dumps(payload), content_type="application/json"
)
data = rv.get_json()
assert rv.status_code == 400
assert "error" in data
root@6c81833b9b82:/workspace/scubaduck# sed -n '400,800p' tests/test_server.py
The test file ends at roughly 300 lines: a test_invalid_time_error test exists, but there is no test for query execution errors. A sketch of the test I might add is below; I'll also check tests/test_web.py.
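Something along these lines is the server test I'd add (the test name and payload are hypothetical, and asserting a "traceback" key assumes the handler change sketched above):

def test_query_error_returns_sql_and_traceback() -> None:
    app = server.app
    client = app.test_client()
    # avg over the text "event" column should make DuckDB raise a binder error
    payload = {
        "start": "2024-01-01 00:00:00",
        "end": "2024-01-03 00:00:00",
        "columns": ["event"],
        "group_by": ["user"],
        "aggregate": "avg",
    }
    rv = client.post(
        "/api/query", data=json.dumps(payload), content_type="application/json"
    )
    assert rv.status_code == 400
    data = rv.get_json()
    assert "sql" in data
    assert "error" in data
    assert "traceback" in data

That exercises exactly the failure path I care about: a query that parses fine but fails inside DuckDB.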
root@6c81833b9b82:/workspace/scubaduck# sed -n '1,160p' tests/test_web.py
from __future__ import annotations
from typing import Any
def run_query(
page: Any,
url: str,
*,
start: str | None = None,
end: str | None = None,
order_by: str | None = None,
order_dir: str | None = "ASC",
limit: int | None = None,
) -> dict[str, Any]:
page.goto(url)
page.wait_for_selector("#order_by option", state="attached")
page.wait_for_selector("#order_dir", state="attached")
page.wait_for_function("window.lastResults !== undefined")
if start is not None:
page.fill("#start", start)
if end is not None:
page.fill("#end", end)
if order_by is not None:
page.select_option("#order_by", order_by)
if order_dir is not None and order_dir == "DESC":
page.click("#order_dir")
if limit is not None:
page.fill("#limit", str(limit))
page.evaluate("window.lastResults = undefined")
page.click("text=Dive")
page.wait_for_function("window.lastResults !== undefined")
return page.evaluate("window.lastResults")
def test_range_filters(page: Any, server_url: str) -> None:
data = run_query(
page,
server_url,
start="2024-01-02 00:00:00",
end="2024-01-02 04:00:00",
order_by="timestamp",
limit=100,
)
assert len(data["rows"]) == 2
from dateutil import parser
timestamps = [parser.parse(row[0]).replace(tzinfo=None) for row in data["rows"]]
assert timestamps == [
parser.parse("2024-01-02 00:00:00"),
parser.parse("2024-01-02 03:00:00"),
]
def test_order_by(page: Any, server_url: str) -> None:
data = run_query(
page,
server_url,
start="2024-01-01 00:00:00",
end="2024-01-03 00:00:00",
order_by="value",
order_dir="DESC",
limit=100,
)
values = [row[2] for row in data["rows"]]
assert values == sorted(values, reverse=True)
def test_limit(page: Any, server_url: str) -> None:
data = run_query(
page,
server_url,
start="2024-01-01 00:00:00",
end="2024-01-03 00:00:00",
order_by="timestamp",
limit=2,
)
assert len(data["rows"]) == 2
def test_simple_filter(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
page.click("text=Add Filter")
filter_el = page.query_selector("#filters .filter:last-child")
assert filter_el
filter_el.query_selector(".f-col").select_option("user")
val_input = filter_el.query_selector(".f-val")
val_input.click()
page.keyboard.type("alice")
page.keyboard.press("Enter")
page.click("text=Dive")
page.wait_for_function("window.lastResults !== undefined")
data = page.evaluate("window.lastResults")
assert len(data["rows"]) == 2
assert all(row[3] == "alice" for row in data["rows"])
def test_default_filter_and_layout(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
count = page.evaluate("document.querySelectorAll('#filters .filter').length"
)
assert count == 1
last_is_button = page.evaluate(
"document.querySelector('#filters').lastElementChild.id === 'add_filter'
"
)
assert last_is_button
position = page.evaluate(
"getComputedStyle(document.querySelector('#filters .filter button.remove
')).position"
)
assert position == "static"
def test_header_and_tabs(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
header = page.text_content("#header")
assert "sample.csv" in header
assert "events" in header
assert page.is_visible("#settings")
assert page.is_hidden("#columns")
page.click("text=Columns")
assert page.is_visible("#columns")
cols = [c.strip() for c in page.locator("#column_groups li").all_inner_texts()]
assert "timestamp" in cols
assert "event" in cols
page.click("text=View Settings")
assert page.is_visible("#settings")
btn_color = page.evaluate(
"getComputedStyle(document.querySelector('#dive')).backgroundColor"
)
assert "rgb(0, 128, 0)" == btn_color
sidebar_overflow = page.evaluate(
"getComputedStyle(document.querySelector('#sidebar')).overflowY"
)
view_overflow = page.evaluate(
"getComputedStyle(document.querySelector('#view')).overflowY"
)
assert sidebar_overflow == "auto"
assert view_overflow == "auto"
def test_graph_type_table_fields(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#graph_type", state="attached")
page.select_option("#graph_type", "table")
assert page.is_visible("#group_by_field")
assert page.is_visible("#aggregate_field")
assert page.is_visible("#show_hits_field")
page.click("text=Columns")
assert not page.is_visible("text=Strings:")
def test_help_and_alignment(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
root@6c81833b9b82:/workspace/scubaduck# sed -n '160,320p' tests/test_web.py
page.wait_for_selector("#order_by option", state="attached")
    titles = page.evaluate(
        "Array.from(document.querySelectorAll('#settings .help')).map(e => e.title)"
    )
assert any("start/end of the time range" in t for t in titles)
text_align = page.evaluate(
"getComputedStyle(document.querySelector('#settings label')).textAlign"
)
assert text_align == "right"
def test_table_sorting(page: Any, server_url: str) -> None:
run_query(
page,
server_url,
start="2024-01-01 00:00:00",
end="2024-01-03 00:00:00",
order_by="timestamp",
order_dir="ASC",
limit=100,
)
# header alignment
align = page.evaluate(
"getComputedStyle(document.querySelector('#results th')).textAlign"
)
assert align == "right"
header = page.locator("#results th").nth(3)
def values() -> list[str]:
return page.locator("#results td:nth-child(4)").all_inner_texts()
orig_rows = values()
assert orig_rows == ["alice", "bob", "alice", "charlie"]
first_sql = page.evaluate("window.lastResults.sql")
header.click()
assert values() == sorted(orig_rows)
assert header.inner_text().endswith("▲")
    color = page.evaluate(
        "getComputedStyle(document.querySelector('#results th:nth-child(4)')).color"
    )
assert "0, 0, 255" in color
assert page.evaluate("window.lastResults.sql") == first_sql
header.click()
assert values() == sorted(orig_rows, reverse=True)
assert header.inner_text().endswith("▼")
header.click()
assert values() == orig_rows
assert header.inner_text() == "user"
    color = page.evaluate(
        "getComputedStyle(document.querySelector('#results th:nth-child(4)')).color"
    )
assert "0, 0, 255" not in color
def test_relative_dropdown(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
btn = page.query_selector('[data-target="start-select"]')
assert btn
btn.click()
page.click("#start-select div:text('-3 hours')")
assert page.input_value("#start") == "-3 hours"
def test_end_dropdown_now(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
page.click('[data-target="end-select"]')
page.click("#end-select div:text('now')")
assert page.input_value("#end") == "now"
def test_invalid_time_error_shown(page: Any, server_url: str) -> None:
data = run_query(
page,
server_url,
start="nonsense",
end="now",
order_by="timestamp",
)
assert "error" in data
msg = page.text_content("#view")
assert "nonsense" in msg
def test_column_toggle_and_selection(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
page.click("text=Columns")
page.wait_for_selector("#column_groups input", state="attached")
count = page.evaluate(
"document.querySelectorAll('#column_groups input:checked').length"
)
assert count == 4
page.click("#columns_none")
count = page.evaluate(
"document.querySelectorAll('#column_groups input:checked').length"
)
assert count == 0
page.click("#columns_all")
count = page.evaluate(
"document.querySelectorAll('#column_groups input:checked').length"
)
assert count == 4
page.uncheck("#column_groups input[value='value']")
page.click("text=View Settings")
page.fill("#start", "2024-01-01 00:00:00")
page.fill("#end", "2024-01-02 00:00:00")
page.select_option("#order_by", "timestamp")
page.fill("#limit", "10")
page.evaluate("window.lastResults = undefined")
page.click("text=Dive")
page.wait_for_function("window.lastResults !== undefined")
data = page.evaluate("window.lastResults")
assert len(data["rows"][0]) == 3
headers = page.locator("#results th").all_inner_texts()
assert "value" not in headers
def test_columns_links_alignment(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
page.click("text=Columns")
page.wait_for_selector("#column_groups input", state="attached")
tag = page.evaluate("document.getElementById('columns_all').tagName")
assert tag == "A"
align = page.evaluate(
"getComputedStyle(document.querySelector('#column_actions')).textAlign"
)
assert align == "right"
def test_column_group_links(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
page.click("text=Columns")
page.wait_for_selector("#column_groups a", state="attached")
tag = page.evaluate("document.querySelector('#column_groups .col-group a').t
agName")
assert tag == "A"
def test_column_group_links_float_right(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
page.click("text=Columns")
page.wait_for_selector("#column_groups .col-group .links", state="attached")
    float_val = page.evaluate(
        "getComputedStyle(document.querySelector('#column_groups .col-group .links')).float"
    )
assert float_val == "right"
inp = f.query_selector(".f-val")
inp.click()
page.wait_for_selector("#filters .filter:last-child .chip-dropdown div")
page.keyboard.type("ali")
page.wait_for_selector("text=alice")
page.keyboard.press("ArrowDown")
page.keyboard.press("Enter")
    chips = page.evaluate(
        "Array.from(document.querySelectorAll('#filters .filter:last-child .chip')).map(c => c.firstChild.textContent)"
    )
assert chips == ["ali"]
page.click("#filters .filter:last-child .chip .x")
page.wait_for_selector(".chip", state="detached")
def test_chip_copy_and_paste(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
    page.evaluate(
        "Object.defineProperty(navigator, 'clipboard', {value:{ _data: '', writeText(t){ this._data = t; }, readText(){ return Promise.resolve(this._data); } }})"
    )
page.click("text=Add Filter")
f = page.query_selector("#filters .filter:last-child")
assert f
f.query_selector(".f-col").select_option("user")
inp = f.query_selector(".f-val")
inp.click()
page.keyboard.type("alice")
page.keyboard.press("Enter")
page.keyboard.type("bob")
page.keyboard.press("Enter")
f.query_selector(".chip-copy").click()
assert page.evaluate("navigator.clipboard._data") == "alice,bob"
    page.evaluate(
        "var f=document.querySelector('#filters .filter:last-child'); f.chips=[]; f.querySelectorAll('.chip').forEach(c=>c.remove())"
    )
page.wait_for_selector("#filters .chip", state="detached")
inp.click()
    page.evaluate(
        "var dt=new DataTransfer(); dt.setData('text/plain','alice,bob'); var e=new ClipboardEvent('paste',{clipboardData:dt}); document.querySelector('#filters .filter:last-child .f-val').dispatchEvent(e);"
    )
    chips = page.evaluate(
        "Array.from(document.querySelectorAll('#filters .filter:last-child .chip')).map(c => c.firstChild.textContent)"
    )
assert chips[:2] == ["alice", "bob"]
    page.evaluate(
        "var f=document.querySelector('#filters .filter:last-child'); f.chips=[]; f.querySelectorAll('.chip').forEach(c=>c.remove())"
    )
    page.wait_for_selector("#filters .chip", state="detached")
    inp.click()
    page.evaluate(
        "var dt=new DataTransfer(); dt.setData('text/plain','alice,bob'); var e=new ClipboardEvent('paste',{clipboardData:dt}); Object.defineProperty(e,'shiftKey',{value:true}); document.querySelector('#filters .filter:last-child .f-val').dispatchEvent(e);"
    )
    chips = page.evaluate(
        "Array.from(document.querySelectorAll('#filters .filter:last-child .chip')).map(c => c.firstChild.textContent)"
    )
assert chips[-1] == "alice,bob"
def test_chip_dropdown_hides_on_outside_click(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
page.click("text=Add Filter")
f = page.query_selector("#filters .filter:last-child")
assert f
f.query_selector(".f-col").select_option("user")
inp = f.query_selector(".f-val")
inp.click()
page.wait_for_selector("#filters .filter:last-child .chip-dropdown div")
page.click("#header")
page.wait_for_selector("#filters .filter:last-child .chip-dropdown", state="
hidden")
def test_chip_input_no_outline(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
page.click("text=Add Filter")
inp = page.query_selector("#filters .filter:last-child .f-val")
assert inp
inp.click()
    outline = page.evaluate(
        "getComputedStyle(document.querySelector('#filters .filter:last-child .f-val')).outlineStyle"
    )
assert outline == "none"
def test_table_enhancements(page: Any, server_url: str) -> None:
run_query(
page,
server_url,
start="2024-01-01 00:00:00",
end="2024-01-03 00:00:00",
order_by="timestamp",
limit=10,
)
border = page.evaluate(
"getComputedStyle(document.querySelector('#results td')).borderStyle"
)
assert border == "solid"
    color1 = page.evaluate(
        "getComputedStyle(document.querySelector('#results tr:nth-child(2) td')).backgroundColor"
    )
    color2 = page.evaluate(
        "getComputedStyle(document.querySelector('#results tr:nth-child(3) td')).backgroundColor"
    )
    assert color1 != color2
    page.hover("#results tr:nth-child(2)")
    hover_color = page.evaluate(
        "getComputedStyle(document.querySelector('#results tr:nth-child(2) td')).backgroundColor"
    )
    assert hover_color != color1
    page.click("#results tr:nth-child(2)")
    selected_color = page.evaluate(
        "getComputedStyle(document.querySelector('#results tr:nth-child(2) td')).backgroundColor"
    )
assert "189, 228, 255" in selected_color
overflow = page.evaluate(
"var v=document.getElementById('view'); v.scrollWidth > v.clientWidth"
)
assert not overflow
def test_timestamp_rendering(page: Any, server_url: str) -> None:
run_query(
page,
server_url,
start="2024-01-01 00:00:00",
end="2024-01-02 00:00:00",
order_by="timestamp",
limit=1,
)
cell = page.text_content("#results td")
assert cell != "Invalid Date"
valid = page.evaluate("v => !isNaN(Date.parse(v))", cell)
assert valid
def test_url_query_persistence(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
page.wait_for_function("window.lastResults !== undefined")
page.fill("#start", "2024-01-01 00:00:00")
page.fill("#end", "2024-01-02 00:00:00")
page.fill("#limit", "1")
page.evaluate("window.lastResults = undefined")
page.click("text=Dive")
page.wait_for_function("window.lastResults !== undefined")
first_url = page.url
first_rows = page.evaluate("window.lastResults.rows.length")
page.fill("#limit", "2")
page.evaluate("window.lastResults = undefined")
page.click("text=Dive")
page.wait_for_function("window.lastResults !== undefined")
second_url = page.url
second_rows = page.evaluate("window.lastResults.rows.length")
assert second_rows != first_rows
assert first_url != second_url
page.go_back()
page.wait_for_function("window.lastResults !== undefined")
assert page.url == first_url
assert page.evaluate("window.lastResults.rows.length") == first_rows
def test_load_from_url(page: Any, server_url: str) -> None:
url = (
f"{server_url}?start=2024-01-01%2000:00:00&end=2024-01-02%2000:00:00"
"&order_by=timestamp&limit=2"
)
page.goto(url)
page.wait_for_selector("#order_by option", state="attached")
page.wait_for_function("window.lastResults !== undefined")
assert page.input_value("#start") == "2024-01-01 00:00:00"
assert page.input_value("#end") == "2024-01-02 00:00:00"
assert page.input_value("#limit") == "2"
assert page.evaluate("window.lastResults.rows.length") == 2
def test_empty_data_message(page: Any, server_url: str) -> None:
data = run_query(
page,
server_url,
start="2025-01-01 00:00:00",
end="2025-01-02 00:00:00",
order_by="timestamp",
limit=100,
)
assert data["rows"] == []
msg = page.text_content("#view")
assert "Empty data provided to table" in msg
root@6c81833b9b82:/workspace/scubaduck# sed -n '640,960p' tests/test_web.py
I'll focus on the new feature: showing error details such as the SQL query and stack trace. The existing tests cover some cases, but I may need to add more for query errors. I’ll check the README for sample dataset info to help design those tests.
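Concretely, the failure payload I want /api/query to hand back is roughly this (a sketch; the exact keys are my assumption until I actually patch server.py):

# Sketch of the 400 response body I'm aiming for when a query fails.
error_response = {
    "sql": "SELECT avg(event) ...",  # the generated SQL that failed
    "error": "Binder Error: ...",  # the database's error message
    "traceback": "Traceback (most recent call last): ...",  # full Python traceback
}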
I really like Scuba (Meta's internal real-time database system). The distributed, real-time database part of Scuba is quite difficult (and expensive) to replicate, but I also really like Scuba's UI for doing queries, and I have found myself wishing that I had access to it even for "small" databases, e.g., I have a sqlite dataset I want to explore.
Pivotal ideas:
* Time series by default. In the dedicated "time series" view, there are many features specifically oriented towards working towards tables that represent events that occurred over time: the start, end, compare, aggregate and granularity fields all specially privilege the timestamp field. In fact, you can't log events to Scuba's backing data store without a timestamp, they always come with one. (Scuba also supports other views that don't presuppose a time series, but the time series is the most beloved and well used view.) This is in contrast to typical software which tries to generalize to arbitrary data first, with time series being added on later.
* It's all about exploration. Scuba is predicated on the idea that you don't know what you're looking for, that you are going to spend time tweaking queries and changing filters/grouping as part of an investigation to figure out why a system behaves the way it does. So the filters/comparisons/groupings you want to edit are always visible on the left sidebar, with the expectation that you're going to tweak the query to look at something else. Similarly, all the parameters of your query get saved into your URL, so your browser history can double up as a query history / you can easily share a query with someone else. This is in contrast to typical software which is often oriented to making pretty dashboards and reports. (This function is important too, but it's not what I want in exploration mode!)
* You can fix data problems in the query editor. It's pretty common to have messed up and ended up with a database that doesn't have exactly the columns you need, or some columns that are corrupted in some way. Scuba has pretty robust support for defining custom columns with arbitrary SQL functions, grouping over them as if they were native functions, and doing so with minimal runtime cost (Scuba aims to turn around your query in milliseconds!) Having to go and run a huge data pipeline to fix your data is a big impediment to exploration; quick and easy custom columns means you can patch over problems when you're investigating and fix them for real later.
I’ll check out index.html to review the user interface.
root@6c81833b9b82:/workspace/scubaduck# ls scubaduck/static
index.html
root@6c81833b9b82:/workspace/scubaduck# sed -n '1,120p' scubaduck/static/index.html
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>ScubaDuck</title>
<style>
    body { margin: 0; font-family: sans-serif; height: 100vh; display: flex; flex-direction: column; }
    #header { padding: 10px; font-weight: bold; border-bottom: 1px solid #ccc; }
    #content { flex: 1; display: flex; height: calc(100vh - 42px); overflow: hidden; }
    #sidebar { width: 450px; padding: 10px; border-right: 3px solid #ccc; overflow-y: auto; display: flex; flex-direction: column; box-sizing: border-box; }
#sidebar-resizer { width: 5px; cursor: col-resize; background: #ccc; }
#view { flex: 1; padding: 10px; overflow-y: auto; overflow-x: auto; }
.field { display: flex; align-items: center; margin-bottom: 10px; }
.field label { width: 80px; text-align: right; margin-right: 5px; }
.help { margin-left: 4px; cursor: help; }
.rel-btn { margin-left: 4px; }
#tabs { display: flex; align-items: center; margin-bottom: 10px; }
    #tabs .tab { margin-right: 5px; background: none; border: 1px solid #ccc; padding: 4px 8px; cursor: pointer; width: 120px; text-align: center; box-sizing: border-box; }
    #tabs .tab.active { background: #eee; font-weight: bold; }
    #dive { margin-left: auto; background: green; color: white; border: none; padding: 5px 10px; cursor: pointer; }
.tab-content { display: none; }
.tab-content.active { display: block; }
#filter_list { display: flex; flex-direction: column; }
#filters .filter {
border: 1px solid #ccc;
padding: 5px;
margin-bottom: 5px;
position: relative;
display: flex;
flex-direction: column;
}
#filters .filter-row { display: flex; margin-bottom: 5px; }
#filters .filter-row .f-col { flex: 1; }
#filters .filter-row .f-op {
margin-left: 5px;
width: fit-content;
flex: 0 0 auto;
}
#filters .filter input.f-val {
border: none;
flex: 1;
min-width: 60px;
margin: 2px;
outline: none;
}
.chip-box { position: relative; }
    .chip-input { display: flex; flex-wrap: wrap; border: 1px solid #ccc; padding: 2px; min-height: 24px; }
    .chip { background: #eee; border: 1px solid #999; padding: 2px 4px; margin: 2px; border-radius: 3px; display: flex; align-items: center; }
    .chip .x { margin-left: 4px; cursor: pointer; }
    .chip-copy { margin-left: 4px; cursor: pointer; background: none; border: none; }
    .chip-dropdown { position: absolute; left: 0; right: 0; top: 100%; background: white; border: 1px solid #ccc; max-height: 120px; overflow-y: auto; z-index: 10; display: none; }
.chip-dropdown div { padding: 2px 4px; cursor: pointer; }
.chip-dropdown div.highlight { background: #bde4ff; }
.rel-box { position: relative; display: flex; }
    .rel-dropdown { position: absolute; left: 0; right: 0; top: 100%; background: white; border: 1px solid #ccc; z-index: 10; display: none; }
.rel-dropdown div { padding: 2px 4px; cursor: pointer; }
.rel-dropdown div:hover { background: #bde4ff; }
#filters .filter button.remove {
margin-left: 5px;
width: 20px;
flex: 0 0 auto;
}
#filters h4 { margin: 0 0 5px 0; }
table { border-collapse: collapse; min-width: 100%; }
th, td { border: 1px solid #ccc; padding: 4px; box-sizing: border-box; }
th { text-align: left; cursor: pointer; position: relative; }
th.sorted { color: blue; }
tr:nth-child(even) td { background: #f9f9f9; }
tr.selected td { background: #bde4ff !important; }
tr:hover:not(.selected) td { background: #eee; }
#column_actions {
text-align: right;
margin-bottom: 5px;
}
#column_actions a {
margin-left: 5px;
}
.col-group-header {
overflow: hidden;
}
.col-group-header .links {
float: right;
}
.col-group-header .links a {
margin-left: 5px;
}
/* Column resizer removed */
</style>
</head>
<body>
<div id="header">sample.csv - events <select id="graph_type"><option value="sa
mples">Samples</option><option value="table">Table</option></select></div>
<div id="content">
<div id="sidebar">
<div id="tabs">
<button class="tab active" data-tab="settings">View Settings</button>
<button class="tab" data-tab="columns">Columns</button>
<button id="dive" onclick="dive()">Dive</button>
</div>
<div id="settings" class="tab-content active">
<div class="field">
<label>Start<span class="help" title="Sets the start/end of the time r
ange to query. Can be any kind of datetime string. For example: 'April 23, 2014'
or 'yesterday'.">[?]</span></label>
<div class="rel-box">
<input id="start" type="text" />
<button type="button" class="rel-btn" data-target="start-select">	
660;</button>
<div id="start-select" class="rel-dropdown" data-input="start">
<div data-value="-1 hour">-1 hour</div>
<div data-value="-3 hours">-3 hours</div>
<div data-value="-12 hours">-12 hours</div>
<div data-value="-1 day">-1 day</div>
<div data-value="-3 days">-3 days</div>
<div data-value="-1 week">-1 week</div>
<div data-value="-1 fortnight">-1 fortnight</div>
<div data-value="-30 days">-30 days</div>
<div data-value="-90 days">-90 days</div>
</div>
</div>
</div>
<div class="field">
<label>End<span class="help" title="Sets the start/end of the time ran
ge to query. Can be any kind of datetime string. For example: 'April 23, 2014' o
r 'yesterday'.">[?]</span></label>
<div class="rel-box">
I'll modify the code to show stack trace and SQL in the console and UI.
I'll add SQL and stack trace printing when an error occurs for debugging and testing.
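Roughly the server-side shape I have in mind (a sketch only; I'll write the real patch after reading server.py, so con and the handler structure here are assumptions about what I'll find):

import traceback

from flask import jsonify


def execute_and_respond(con, sql):  # hypothetical helper, for illustration only
    try:
        rows = con.execute(sql).fetchall()
    except Exception as exc:
        tb = traceback.format_exc()
        # Log the failing SQL and the traceback to the console for debugging.
        print(f"Query failed:\n{sql}\n{tb}")
        return jsonify({"sql": sql, "error": str(exc), "traceback": tb}), 400
    return jsonify({"sql": sql, "rows": rows})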
root@6c81833b9b82:/workspace/scubaduck# grep -n "lastResults" -n scubaduck/stati
c/index.html
640: window.lastResults = undefined;
870: window.lastResults = data;
886: window.lastResults = {error: msg};
root@6c81833b9b82:/workspace/scubaduck# sed -n '820,920p' scubaduck/static/index.html
hour12: true,
timeZoneName: 'short'
});
} else {
if (col === 'Hits') {
const pct = totalHits ? ((v / totalHits) * 100).toFixed(1) : '0';
td.textContent = `${v} (${pct}%)`;
} else {
td.textContent = v;
}
}
td.style.textAlign = isStringColumn(col) ? 'left' : 'right';
tr.appendChild(td);
});
table.appendChild(tr);
});
// ensure table does not overflow unless necessary
const view = document.getElementById('view');
if (table.scrollWidth <= view.clientWidth) {
table.style.width = '100%';
}
}
function handleSort(e) {
const idx = parseInt(e.target.dataset.index, 10);
if (sortState.index !== idx) {
sortState.index = idx;
sortState.dir = 'asc';
} else if (sortState.dir === 'asc') {
sortState.dir = 'desc';
} else if (sortState.dir === 'desc') {
sortState.index = null;
sortState.dir = null;
} else {
sortState.dir = 'asc';
}
let rows = originalRows.slice();
if (sortState.index !== null) {
rows.sort((a, b) => {
const va = a[sortState.index];
const vb = b[sortState.index];
if (va === vb) return 0;
if (sortState.dir === 'desc') return va < vb ? 1 : -1;
return va > vb ? 1 : -1;
});
}
renderTable(rows);
}
function showResults(data) {
window.lastResults = data;
const view = document.getElementById('view');
if (data.rows.length === 0) {
    view.innerHTML =
      '<p id="empty-message">Empty data provided to table</p><table id="results"></table>';
} else {
view.innerHTML = '<table id="results"></table>';
}
originalRows = data.rows.slice();
sortState = {index: null, dir: null};
renderTable(originalRows);
document.getElementById('query_info').textContent =
`Your query took about ${lastQueryTime} ms`;
}
function showError(msg) {
window.lastResults = {error: msg};
const view = document.getElementById('view');
view.innerHTML = `<p id="error-message">${msg}</p>`;
document.getElementById('query_info').textContent = '';
}
</script>
</body>
</html>
I'm checking lines 800 to 840 again to make sure I didn't miss anything.
root@6c81833b9b82:/workspace/scubaduck# sed -n '840,880p' scubaduck/static/index.html
}
}
function handleSort(e) {
const idx = parseInt(e.target.dataset.index, 10);
if (sortState.index !== idx) {
sortState.index = idx;
sortState.dir = 'asc';
} else if (sortState.dir === 'asc') {
sortState.dir = 'desc';
} else if (sortState.dir === 'desc') {
sortState.index = null;
sortState.dir = null;
} else {
sortState.dir = 'asc';
}
let rows = originalRows.slice();
if (sortState.index !== null) {
rows.sort((a, b) => {
const va = a[sortState.index];
const vb = b[sortState.index];
if (va === vb) return 0;
if (sortState.dir === 'desc') return va < vb ? 1 : -1;
return va > vb ? 1 : -1;
});
}
renderTable(rows);
}
function showResults(data) {
window.lastResults = data;
const view = document.getElementById('view');
if (data.rows.length === 0) {
    view.innerHTML =
      '<p id="empty-message">Empty data provided to table</p><table id="results"></table>';
} else {
view.innerHTML = '<table id="results"></table>';
}
originalRows = data.rows.slice();
sortState = {index: null, dir: null};
renderTable(originalRows);
root@6c81833b9b82:/workspace/scubaduck# grep -n "showError" -n scubaduck/static/
index.html
653: showError(err.message);
885:function showError(msg) {
I’ll look around line 640 to investigate how error handling works.
root@6c81833b9b82:/workspace/scubaduck# sed -n '630,660p' scubaduck/static/index.html
let lastQueryTime = 0;
let queryStart = 0;
function dive(push=true) {
const payload = collectParams();
if (push) {
history.pushState(payload, '', paramsToSearch(payload));
}
const view = document.getElementById('view');
view.innerHTML = '<p>Loading...</p>';
window.lastResults = undefined;
queryStart = performance.now();
  fetch('/api/query', {method:'POST', headers:{'Content-Type':'application/json'}, body:JSON.stringify(payload)})
.then(async r => {
const data = await r.json();
if (!r.ok) throw new Error(data.error || 'Error');
return data;
})
.then(data => {
lastQueryTime = Math.round(performance.now() - queryStart);
showResults(data);
})
.catch(err => {
showError(err.message);
});
}
function collectParams() {
updateSelectedColumns();
const payload = {
start: document.getElementById('start').value,
I'll take a look at the showError function now to understand it better.
root@6c81833b9b82:/workspace/scubaduck# sed -n '880,900p' scubaduck/static/index.html
renderTable(originalRows);
document.getElementById('query_info').textContent =
`Your query took about ${lastQueryTime} ms`;
}
function showError(msg) {
window.lastResults = {error: msg};
const view = document.getElementById('view');
view.innerHTML = `<p id="error-message">${msg}</p>`;
document.getElementById('query_info').textContent = '';
}
</script>
</body>
</html>
The error message should now include traceback. Instead of just throwing an error, I'll throw the entire data object to handle both error and traceback.
I’ll change showError to support HTML or newlines, allowing SQL and traceback display.
I'll update showError to handle both string and object, displaying SQL and traceback in <pre>.
from __future__ import annotations
from typing import Any
def run_query(
page: Any,
url: str,
*,
start: str | None = None,
end: str | None = None,
order_by: str | None = None,
order_dir: str | None = "ASC",
limit: int | None = None,
) -> dict[str, Any]:
page.goto(url)
page.wait_for_selector("#order_by option", state="attached")
page.wait_for_selector("#order_dir", state="attached")
page.wait_for_function("window.lastResults !== undefined")
if start is not None:
page.fill("#start", start)
if end is not None:
page.fill("#end", end)
if order_by is not None:
page.select_option("#order_by", order_by)
if order_dir is not None and order_dir == "DESC":
page.click("#order_dir")
if limit is not None:
page.fill("#limit", str(limit))
page.evaluate("window.lastResults = undefined")
page.click("text=Dive")
page.wait_for_function("window.lastResults !== undefined")
return page.evaluate("window.lastResults")
def test_range_filters(page: Any, server_url: str) -> None:
data = run_query(
page,
server_url,
start="2024-01-02 00:00:00",
I need to update run_query to handle errors, ensuring it still works with showError.
I'll check pyproject for appropriate dependencies and to ensure style consistency.
[project]
name = "scubaduck"
version = "0.1.0"
description = "Scuba on DuckDB"
readme = "README.md"
requires-python = ">=3.12"
license = {text = "MIT"}
dependencies = [
"duckdb",
"flask",
"python-dateutil>=2.9.0.post0",
]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.pytest.ini_options]
addopts = "-n auto --tb=native"
[tool.pyright]
# Pyright configuration with strict settings
include = ["scubaduck", "tests"]
exclude = ["**/__pycache__", "dist"]
stubPath = "stubs"
venvPath = "."
venv = ".venv"
reportMissingImports = true
reportMissingTypeStubs = true
pythonVersion = "3.12"
pythonPlatform = "All"
typeCheckingMode = "strict"
reportUnknownMemberType = true
reportUnknownParameterType = true
reportUnknownVariableType = true
reportUnknownArgumentType = true
reportPrivateImportUsage = true
reportUntypedFunctionDecorator = true
reportFunctionMemberAccess = true
reportIncompatibleMethodOverride = true
[tool.ruff]
target-version = "py312"
[tool.uv]
dev-dependencies = [
"pytest>=8.3.5",
"ruff>=0.11.2",
"pytest-xdist>=3.6.1",
"playwright>=1.52.0",
"pyright>=1.1.350",
"pytest-playwright>=0.7.0",
]
The Python version is 3.12, and there's a stub path. I'll open server.py now.
root@6c81833b9b82:/workspace/scubaduck# grep -n "traceback" -n scubaduck/server.
py
I'll adjust the code to enhance error handling with traceback and update the frontend.
root@6c81833b9b82:/workspace/scubaduck# grep -n "function showError" -n scubaduc
k/static/index.html
885:function showError(msg) {
root@6c81833b9b82:/workspace/scubaduck# sed -n '880,890p' scubaduck/static/index.html
renderTable(originalRows);
document.getElementById('query_info').textContent =
`Your query took about ${lastQueryTime} ms`;
}
function showError(msg) {
window.lastResults = {error: msg};
const view = document.getElementById('view');
view.innerHTML = `<p id="error-message">${msg}</p>`;
document.getElementById('query_info').textContent = '';
}
I'll update showError to handle both objects and strings, then update dive() accordingly.
root@6c81833b9b82:/workspace/scubaduck# grep -n "showError(" -n scubaduck/static
/index.html
653: showError(err.message);
885:function showError(msg) {
I'll adjust the code around line 653 where showError(err.message) is used.
root@6c81833b9b82:/workspace/scubaduck# sed -n '640,660p' scubaduck/static/index.html
window.lastResults = undefined;
queryStart = performance.now();
  fetch('/api/query', {method:'POST', headers:{'Content-Type':'application/json'}, body:JSON.stringify(payload)})
.then(async r => {
const data = await r.json();
if (!r.ok) throw new Error(data.error || 'Error');
return data;
})
.then(data => {
lastQueryTime = Math.round(performance.now() - queryStart);
showResults(data);
})
.catch(err => {
showError(err.message);
});
}
function collectParams() {
updateSelectedColumns();
const payload = {
start: document.getElementById('start').value,
I'll apply the patch to server.py, adjusting fetch, catch, and showError functionality.
root@6c81833b9b82:/workspace/scubaduck# sed -n '1,30p' scubaduck/server.py
from __future__ import annotations
from dataclasses import dataclass, field
from typing import Any, Dict, List, Tuple
import re
from datetime import datetime, timedelta, timezone
import time
from pathlib import Path
import sqlite3
import duckdb
from dateutil import parser as dtparser
from flask import Flask, jsonify, request, send_from_directory
@dataclass
class Filter:
column: str
op: str
value: str | int | float | list[str] | None
@dataclass
class QueryParams:
start: str | None = None
end: str | None = None
order_by: str | None = None
order_dir: str = "ASC"
Let me add 'import traceback' to handle any exception tracebacks more effectively.
root@6c81833b9b82:/workspace/scubaduck# grep -n "import traceback" -n scubaduck/
server.py
Time to apply the patch and push it through!
> *** Begin Patch
> *** Update File: scubaduck/server.py
> @@
> -from pathlib import Path
> -import sqlite3
> +from pathlib import Path
> +import sqlite3
> +import traceback
> *** End Patch
> PATCH
Success. Updated the following files:
M scubaduck/server.py
It's time to patch the query function to improve its behavior.
end=end,
order_by=payload.get("order_by"),
order_dir=payload.get("order_dir", "ASC"),
limit=payload.get("limit"),
columns=payload.get("columns", []),
derived_columns=payload.get("derived_columns", {}),
graph_type=payload.get("graph_type", "samples"),
group_by=payload.get("group_by", []),
aggregate=payload.get("aggregate"),
show_hits=payload.get("show_hits", False),
)
for f in payload.get("filters", []):
params.filters.append(Filter(f["column"], f["op"], f.get("value")))
sql = build_query(params)
try:
rows = con.execute(sql).fetchall()
except Exception as exc:
return jsonify({"sql": sql, "error": str(exc)}), 400
return jsonify({"sql": sql, "rows": rows})
return app
I'll update the exception block to store traceback data and log it. Let's patch it.
root@6c81833b9b82:/workspace/scubaduck# apply_patch<<'PATCH'
> *** Begin Patch
> *** Update File: scubaduck/server.py
> @@
> - try:
> - rows = con.execute(sql).fetchall()
> - except Exception as exc:
> - return jsonify({"sql": sql, "error": str(exc)}), 400
> - return jsonify({"sql": sql, "rows": rows})
> + try:
> + rows = con.execute(sql).fetchall()
> + except Exception as exc:
> + tb = traceback.format_exc()
> + print(f"Query failed:\n{sql}\n{tb}")
> + return (
> + jsonify({"sql": sql, "error": str(exc), "traceback": tb}),
> + 400,
> + )
> + return jsonify({"sql": sql, "rows": rows})
> *** End Patch
> PATCH
Success. Updated the following files:
M scubaduck/server.py
I'll apply fixes to showError and dive() in index.html now.
root@6c81833b9b82:/workspace/scubaduck# grep -n "function dive" -n scubaduck/sta
tic/index.html
633:function dive(push=true) {
I'm checking the code near line 633 to see what changes are needed.
root@6c81833b9b82:/workspace/scubaduck# sed -n '630,660p' scubaduck/static/index.html
let lastQueryTime = 0;
let queryStart = 0;
function dive(push=true) {
const payload = collectParams();
if (push) {
history.pushState(payload, '', paramsToSearch(payload));
}
const view = document.getElementById('view');
view.innerHTML = '<p>Loading...</p>';
window.lastResults = undefined;
queryStart = performance.now();
  fetch('/api/query', {method:'POST', headers:{'Content-Type':'application/json'}, body:JSON.stringify(payload)})
.then(async r => {
const data = await r.json();
if (!r.ok) throw new Error(data.error || 'Error');
return data;
})
.then(data => {
lastQueryTime = Math.round(performance.now() - queryStart);
showResults(data);
})
.catch(err => {
showError(err.message);
});
}
function collectParams() {
updateSelectedColumns();
const payload = {
start: document.getElementById('start').value,
I'll adjust the code to handle the async response and error as instructed.
root@6c81833b9b82:/workspace/scubaduck# apply_patch<<'PATCH'
> *** Begin Patch
> *** Update File: scubaduck/static/index.html
> @@
> - fetch('/api/query', {method:'POST', headers:{'Content-Type':'application/json'}, body:JSON.stringify(payload)})
> - .then(async r => {
> - const data = await r.json();
> - if (!r.ok) throw new Error(data.error || 'Error');
> - return data;
> - })
> + fetch('/api/query', {method:'POST', headers:{'Content-Type':'application/json'}, body:JSON.stringify(payload)})
> + .then(async r => {
> + const data = await r.json();
> + if (!r.ok) throw data;
> + return data;
> + })
> @@
> - .catch(err => {
> - showError(err.message);
> - });
> + .catch(err => {
> + showError(err);
> + });
> *** End Patch
> PATCH
Success. Updated the following files:
M scubaduck/static/index.html
Time to make changes to the showError function. Let's dive into that.
root@6c81833b9b82:/workspace/scubaduck# sed -n '884,896p' scubaduck/static/index.html
function showError(msg) {
window.lastResults = {error: msg};
const view = document.getElementById('view');
view.innerHTML = `<p id="error-message">${msg}</p>`;
document.getElementById('query_info').textContent = '';
}
</script>
</body>
</html>
I'm adjusting the code to properly handle an object.
root@6c81833b9b82:/workspace/scubaduck# apply_patch<<'PATCH'
> *** Begin Patch
> *** Update File: scubaduck/static/index.html
> @@
> -function showError(msg) {
> - window.lastResults = {error: msg};
> - const view = document.getElementById('view');
> - view.innerHTML = `<p id="error-message">${msg}</p>`;
> - document.getElementById('query_info').textContent = '';
> -}
> +function showError(err) {
> + window.lastResults = err;
> + const view = document.getElementById('view');
> + let msg = '';
> + if (typeof err === 'string') {
> + msg = err;
> + } else if (err) {
> + msg = err.error || 'Error';
> + if (err.sql) {
> + msg += '\nSQL: ' + err.sql;
> + }
> + if (err.traceback) {
> + msg += '\n' + err.traceback;
> + }
> + }
> + view.innerHTML = `<pre id="error-message">${msg}</pre>`;
> + document.getElementById('query_info').textContent = '';
> +}
> *** End Patch
> PATCH
Success. Updated the following files:
M scubaduck/static/index.html
I'll format the code now, making sure to address any trailing whitespace or similar issues.
root@6c81833b9b82:/workspace/scubaduck# source .venv/bin/activate
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# ruff format scubaduck/server.py scubaduck/static/index.html
error: Failed to parse scubaduck/static/index.html:1:1: Expected a statement
1 file left unchanged
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
Ruff can't format HTML, so I'll run it only on server.py instead.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# ruff format scubaduck/server.py
1 file left unchanged
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
The server.py file remains the same, so I'll proceed with the ruff check.
...
scubaduck/static/index.html:812:24: SyntaxError: Expected ',', found '='
scubaduck/static/index.html:825:70: SyntaxError: Expected ',', found ':'
scubaduck/static/index.html:826:28: SyntaxError: Got unexpected token `
scubaduck/static/index.html:833:1: SyntaxError: unindent does not match any outer indentation level
scubaduck/static/index.html:900:58: E703 Statement ends with an unnecessary semicolon
...
Found 2153 errors.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I ran ruff on index.html, which led to syntax errors. I'll focus on Python files.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# ruff check scubaduck/server.py
All checks passed!
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# pyright
0 errors, 0 warnings, 0 informations
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
Let's run pytest to test test_server.py and test_web.py.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# pytest -k '' -q
bringing up nodes...
.................................. [100%]
34 passed in 6.12s
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
Tests all passed, but I'll add the new server test test_query_error_includes_sql_and_traceback.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# tail -n 30 tests/test_server.py
"limit": 100,
"columns": ["timestamp", "event", "value", "user"],
"filters": [],
}
rv = client.post(
"/api/query", data=json.dumps(payload), content_type="application/json"
)
data = rv.get_json()
assert rv.status_code == 200
assert len(data["rows"]) == 1
assert data["rows"][0][3] == "charlie"
def test_invalid_time_error() -> None:
app = server.app
client = app.test_client()
payload = {
"start": "nonsense",
"end": "now",
"order_by": "timestamp",
"limit": 10,
"columns": ["timestamp"],
"filters": [],
}
rv = client.post(
"/api/query", data=json.dumps(payload), content_type="application/json"
)
data = rv.get_json()
assert rv.status_code == 400
assert "error" in data
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll implement the test for the query error, checking SQL and traceback contents.
> *** Update File: tests/test_server.py
> @@
> def test_invalid_time_error() -> None:
> @@
> data = rv.get_json()
> assert rv.status_code == 400
> assert "error" in data
> +
> +
> +def test_query_error_returns_sql_and_traceback() -> None:
> + app = server.app
> + client = app.test_client()
> + payload = {
> + "start": "2024-01-01 00:00:00",
> + "end": "2024-01-03 00:00:00",
> + "columns": ["event"],
> + "group_by": ["user"],
> + "aggregate": "avg",
> + }
> + rv = client.post(
> + "/api/query", data=json.dumps(payload), content_type="application/json"
> + )
> + data = rv.get_json()
> + assert rv.status_code == 400
> + assert "sql" in data
> + assert "traceback" in data
> + assert "avg(event)" in data["sql"]
> *** End Patch
> PATCH
Success. Updated the following files:
M tests/test_server.py
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll run ruff format to check for code style issues.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# ruff format tests/test_server.py scubaduck/server.py
2 files left unchanged
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll run a ruff check to catch potential issues in the code.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# ruff check scubaduck/server.py tests/test_server.py
All checks passed!
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# pyright
0 errors, 0 warnings, 0 informations
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
Time to execute the tests and see what results I get!
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# pytest -k '' -q
bringing up nodes...
................................... [100%]
35 passed in 5.61s
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I think I need to verify if the request for stack trace validation is covered in the tests. I'll check tests/test_web.py for relevant error handling, focusing on "test_invalid_time_error_shown".
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# grep -n "invalid_time_error_
shown" -n tests/test_web.py
238:def test_invalid_time_error_shown(page: Any, server_url: str) -> None:
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# sed -n '228,248p' tests/test_web.py
def test_end_dropdown_now(page: Any, server_url: str) -> None:
page.goto(server_url)
page.wait_for_selector("#order_by option", state="attached")
page.click('[data-target="end-select"]')
page.click("#end-select div:text('now')")
assert page.input_value("#end") == "now"
def test_invalid_time_error_shown(page: Any, server_url: str) -> None:
data = run_query(
page,
server_url,
start="nonsense",
end="now",
order_by="timestamp",
)
assert "error" in data
msg = page.text_content("#view")
assert "nonsense" in msg
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'm thinking of creating a test for errors shown with SQL or traceback. I'll inspect run_query: since it doesn't support group_by, I'll either drive fetch manually or mimic the UI actions. Alternatively, I could modify run_query for better customization. Let's see where this leads.
I'll use fetch to replicate the query and check if the error appears in the DOM.
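As a sketch, the fetch-based option might look like this (it assumes the /api/query endpoint and the payload keys from collectParams; the column and aggregate values are illustrative):

from typing import Any


def run_bad_query_via_fetch(page: Any, server_url: str) -> dict[str, Any]:
    """Hypothetical helper: POST a failing aggregate query from the browser context."""
    page.goto(server_url)
    page.wait_for_selector("#order_by option", state="attached")
    # Bypass the sidebar UI and call the API directly; page.evaluate awaits the promise.
    return page.evaluate(
        """() => fetch('/api/query', {
            method: 'POST',
            headers: {'Content-Type': 'application/json'},
            body: JSON.stringify({
                start: '2024-01-01 00:00:00',
                end: '2024-01-03 00:00:00',
                columns: ['event'],
                group_by: ['user'],
                aggregate: 'avg',
                filters: [],
            }),
        }).then(r => r.json())"""
    )

That would prove the server returns sql/error/traceback, but it never touches showError or the DOM, which is why extending run_query looks more attractive.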
146: <div id="group_by_field" class="field" style="display:none;">
150: <input id="group_by" class="f-val" type="text">
243: document.getElementById('group_by_field').style.display = show ? 'flex' :
'none';
340: groupBy = document.getElementById('group_by').closest('.field');
667: group_by: groupBy.chips || [],
693: if (params.group_by && params.group_by.length) sp.set('group_by', params.g
roup_by.join(','));
713: if (params.group_by) {
714: groupBy.chips = params.group_by.slice();
757: if (sp.has('group_by')) params.group_by = sp.get('group_by').split(',').fi
lter(c => c);
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll check around line 650 for the group_by input field.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# sed -n '640,680p' scubaduck/static/index.html
window.lastResults = undefined;
queryStart = performance.now();
  fetch('/api/query', {method:'POST', headers:{'Content-Type':'application/json'}, body:JSON.stringify(payload)})
.then(async r => {
const data = await r.json();
if (!r.ok) throw data;
return data;
})
.then(data => {
lastQueryTime = Math.round(performance.now() - queryStart);
showResults(data);
})
.catch(err => {
showError(err);
});
}
function collectParams() {
updateSelectedColumns();
const payload = {
start: document.getElementById('start').value,
end: document.getElementById('end').value,
order_by: document.getElementById('order_by').value,
order_dir: orderDir,
limit: parseInt(document.getElementById('limit').value, 10),
columns: selectedColumns,
graph_type: graphTypeSel.value,
group_by: groupBy.chips || [],
aggregate: document.getElementById('aggregate').value,
show_hits: document.getElementById('show_hits').checked,
    filters: Array.from(document.querySelectorAll('#filters .filter')).map(f => {
      const chips = f.chips || [];
      const op = f.querySelector('.f-op').value;
      let value = null;
      if (op !== 'empty' && op !== '!empty') {
        value = chips.length === 0 ? null : (chips.length === 1 ? chips[0] : chips);
}
return {column: f.querySelector('.f-col').value, op, value};
})
};
return payload;
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll review the run_query function to verify it handles group_by properly now.
b.py
from __future__ import annotations
from typing import Any
def run_query(
page: Any,
url: str,
*,
start: str | None = None,
end: str | None = None,
order_by: str | None = None,
order_dir: str | None = "ASC",
limit: int | None = None,
) -> dict[str, Any]:
page.goto(url)
page.wait_for_selector("#order_by option", state="attached")
page.wait_for_selector("#order_dir", state="attached")
page.wait_for_function("window.lastResults !== undefined")
if start is not None:
page.fill("#start", start)
if end is not None:
page.fill("#end", end)
if order_by is not None:
page.select_option("#order_by", order_by)
if order_dir is not None and order_dir == "DESC":
page.click("#order_dir")
if limit is not None:
page.fill("#limit", str(limit))
page.evaluate("window.lastResults = undefined")
page.click("text=Dive")
page.wait_for_function("window.lastResults !== undefined")
return page.evaluate("window.lastResults")
def test_range_filters(page: Any, server_url: str) -> None:
data = run_query(
page,
server_url,
start="2024-01-02 00:00:00",
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll add group_by and aggregate as optional parameters to run_query without disruptive changes.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# grep -n "groupBy" -n scubadu
ck/static/index.html | head
209:let groupBy = {chips: [], addChip: () => {}, renderChips: () => {}};
340: groupBy = document.getElementById('group_by').closest('.field');
341: initChipInput(groupBy);
384: selectedColumns = groupBy.chips.slice();
667: group_by: groupBy.chips || [],
714: groupBy.chips = params.group_by.slice();
715: groupBy.renderChips();
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I need to check earlier in the code for that variable initialization.
static/index.html
</div>
<script>
const allColumns = [];
const columnTypes = {};
const stringColumns = [];
const integerColumns = [];
const timeColumns = [];
let selectedColumns = [];
let displayType = 'samples';
let groupBy = {chips: [], addChip: () => {}, renderChips: () => {}};
// Sidebar resizing
const sidebar = document.getElementById('sidebar');
const sidebarResizer = document.getElementById('sidebar-resizer');
let sidebarWidth = parseInt(localStorage.getItem('sidebarWidth') || 450, 10);
sidebar.style.width = sidebarWidth + 'px';
let sidebarResize = false;
function startSidebarDrag(e) {
e.preventDefault();
sidebarResize = true;
document.addEventListener('mousemove', onSidebarDrag);
document.addEventListener('mouseup', stopSidebarDrag);
}
function onSidebarDrag(e) {
if (!sidebarResize) return;
  sidebarWidth = Math.max(200, e.clientX - sidebar.getBoundingClientRect().left);
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I’ll patch run_query to handle group_by and aggregate via JS, and check graph_type.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# grep -n "graph_type" -n scub
aduck/static/index.html | head
91: <div id="header">sample.csv - events <select id="graph_type"><option value=
"samples">Samples</option><option value="table">Table</option></select></div>
236:const graphTypeSel = document.getElementById('graph_type');
666: graph_type: graphTypeSel.value,
692: if (params.graph_type) sp.set('graph_type', params.graph_type);
711: graphTypeSel.value = params.graph_type || 'samples';
756: if (sp.has('graph_type')) params.graph_type = sp.get('graph_type');
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
It seems the aggregate is separate from group_by; where's the select element for aggregate?
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# grep -n "aggregate" -n scuba
duck/static/index.html | head
156: <div id="aggregate_field" class="field" style="display:none;">
158: <select id="aggregate">
244: document.getElementById('aggregate_field').style.display = show ? 'flex' :
'none';
668: aggregate: document.getElementById('aggregate').value,
694: if (params.aggregate) sp.set('aggregate', params.aggregate);
717: if (params.aggregate) document.getElementById('aggregate').value = params.
aggregate;
758: if (sp.has('aggregate')) params.aggregate = sp.get('aggregate');
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll check around line 156 for details to set the aggregate field.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# sed -n '140,170p' scubaduck/static/index.html
<button id="order_dir" type="button">ASC \u25B2</button>
</div>
<div class="field">
<label>Limit<span class="help" title="Choose the maximum number of res
ults to show in the chart after any aggregations have been applied. For example,
a limit of 10 will show no more than 10 rows for a table, etc.">[?]</span></lab
el>
<input id="limit" type="number" value="100" />
</div>
<div id="group_by_field" class="field" style="display:none;">
<label>Group By</label>
<div class="chip-box">
<div class="chip-input">
<input id="group_by" class="f-val" type="text">
<button type="button" class="chip-copy">\u2398</button>
</div>
<div class="chip-dropdown"></div>
</div>
</div>
<div id="aggregate_field" class="field" style="display:none;">
<label>Aggregate</label>
<select id="aggregate">
<option>Avg</option>
<option>Count</option>
<option>Sum</option>
<option>Min</option>
<option>Max</option>
<option>Count Distinct</option>
<option>p5</option>
<option>p25</option>
<option>p50</option>
<option>p70</option>
<option>p75</option>
<option>p90</option>
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# sed -n '170,210p' scubaduck/static/index.html
<option>p90</option>
<option>p95</option>
<option>p99</option>
<option>p99.9</option>
<option>p99.99</option>
</select>
</div>
<div id="show_hits_field" class="field" style="display:none;">
<label>Show Hits</label>
<input id="show_hits" type="checkbox" checked>
</div>
<div id="filters">
<h4>Filters<span class="help" title="You can create as many filters as
you want. You can either write a filter using a UI or manual SQL. In the UI, fi
lter consists of a column name, a relation (e.g., =, !=, <, >) and then a text f
ield. The text field is a token input. It accepts multiple tokens for = relation
, in which case we match using an OR for all options.">[?]</span></h4>
<div id="filter_list"></div>
<button id="add_filter" type="button" onclick="addFilter()">Add Filter
</button>
</div>
<div id="query_info" style="margin-top:10px;"></div>
</div>
<div id="columns" class="tab-content">
<div id="column_actions">
<a id="columns_all" href="#">All</a>
<a id="columns_none" href="#">None</a>
</div>
<div id="column_groups"></div>
</div>
</div>
<div id="sidebar-resizer"></div>
<div id="view">
<table id="results"></table>
</div>
</div>
<script>
const allColumns = [];
const columnTypes = {};
const stringColumns = [];
const integerColumns = [];
const timeColumns = [];
let selectedColumns = [];
let displayType = 'samples';
let groupBy = {chips: [], addChip: () => {}, renderChips: () => {}};
// Sidebar resizing
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll add parameters for a group_by list and an aggregate string to run_query, applied before the Dive click.
b.py
from __future__ import annotations
from typing import Any
def run_query(
page: Any,
url: str,
*,
start: str | None = None,
end: str | None = None,
order_by: str | None = None,
order_dir: str | None = "ASC",
limit: int | None = None,
) -> dict[str, Any]:
page.goto(url)
page.wait_for_selector("#order_by option", state="attached")
page.wait_for_selector("#order_dir", state="attached")
page.wait_for_function("window.lastResults !== undefined")
if start is not None:
page.fill("#start", start)
if end is not None:
page.fill("#end", end)
if order_by is not None:
page.select_option("#order_by", order_by)
if order_dir is not None and order_dir == "DESC":
page.click("#order_dir")
if limit is not None:
page.fill("#limit", str(limit))
page.evaluate("window.lastResults = undefined")
page.click("text=Dive")
page.wait_for_function("window.lastResults !== undefined")
return page.evaluate("window.lastResults")
def test_range_filters(page: Any, server_url: str) -> None:
data = run_query(
page,
server_url,
start="2024-01-02 00:00:00",
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll add group_by, aggregate, and optionally graph_type to the query function.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# apply_patch<<'PATCH'
> *** Begin Patch
> *** Update File: tests/test_web.py
> @@
> def run_query(
> page: Any,
> url: str,
> *,
> start: str | None = None,
> end: str | None = None,
> order_by: str | None = None,
> order_dir: str | None = "ASC",
> - limit: int | None = None,
> + limit: int | None = None,
> + group_by: list[str] | None = None,
> + aggregate: str | None = None,
> ) -> dict[str, Any]:
> @@
> - if limit is not None:
> - page.fill("#limit", str(limit))
> + if limit is not None:
> + page.fill("#limit", str(limit))
> + if group_by is not None:
> + page.select_option("#graph_type", "table")
> + page.evaluate("groupBy.chips = arguments[0]; groupBy.renderChips();",
group_by)
> + if aggregate is not None:
> + page.select_option("#graph_type", "table")
> + page.select_option("#aggregate", aggregate)
> *** End Patch
> PATCH
Success. Updated the following files:
M tests/test_web.py
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll re-read run_query after the patch to confirm the changes landed.
b.py
from __future__ import annotations
from typing import Any
def run_query(
page: Any,
url: str,
*,
start: str | None = None,
end: str | None = None,
order_by: str | None = None,
order_dir: str | None = "ASC",
limit: int | None = None,
group_by: list[str] | None = None,
aggregate: str | None = None,
) -> dict[str, Any]:
page.goto(url)
page.wait_for_selector("#order_by option", state="attached")
page.wait_for_selector("#order_dir", state="attached")
page.wait_for_function("window.lastResults !== undefined")
if start is not None:
page.fill("#start", start)
if end is not None:
page.fill("#end", end)
if order_by is not None:
page.select_option("#order_by", order_by)
if order_dir is not None and order_dir == "DESC":
page.click("#order_dir")
if limit is not None:
page.fill("#limit", str(limit))
if group_by is not None:
page.select_option("#graph_type", "table")
page.evaluate("groupBy.chips = arguments[0]; groupBy.renderChips();", gr
oup_by)
if aggregate is not None:
page.select_option("#graph_type", "table")
page.select_option("#aggregate", aggregate)
page.evaluate("window.lastResults = undefined")
page.click("text=Dive")
page.wait_for_function("window.lastResults !== undefined")
return page.evaluate("window.lastResults")
def test_range_filters(page: Any, server_url: str) -> None:
data = run_query(
page,
server_url,
start="2024-01-02 00:00:00",
end="2024-01-02 04:00:00",
order_by="timestamp",
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
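With these parameters in place, a call like the following (illustrative values only; the actual test is added below) should switch to the table view, fill the Group By chips, and pick the aggregate before diving:

data = run_query(
    page,
    server_url,
    start="2024-01-01 00:00:00",
    end="2024-01-03 00:00:00",
    order_by="timestamp",
    group_by=["user"],
    aggregate="Avg",
)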
I'll create a new test after invalid_time_error_shown in tests/test_web.py.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# grep -n "invalid_time_error_
shown" -n tests/test_web.py
246:def test_invalid_time_error_shown(page: Any, server_url: str) -> None:
_web.py
page.wait_for_selector("#order_by option", state="attached")
page.click('[data-target="end-select"]')
page.click("#end-select div:text('now')")
assert page.input_value("#end") == "now"
def test_invalid_time_error_shown(page: Any, server_url: str) -> None:
data = run_query(
page,
server_url,
start="nonsense",
end="now",
order_by="timestamp",
)
assert "error" in data
msg = page.text_content("#view")
assert "nonsense" in msg
def test_column_toggle_and_selection(page: Any, server_url: str) -> None:
page.goto(server_url)
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll add a test to verify the error message and ensure "event" is included as a column.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# apply_patch<<'PATCH'
> *** Begin Patch
> *** Update File: tests/test_web.py
> @@
> def test_invalid_time_error_shown(page: Any, server_url: str) -> None:
> @@
> msg = page.text_content("#view")
> assert "nonsense" in msg
> +
> +
> +def test_query_error_shown(page: Any, server_url: str) -> None:
> + data = run_query(
> + page,
> + server_url,
> + start="2024-01-01 00:00:00",
> + end="2024-01-03 00:00:00",
> + order_by="timestamp",
> + group_by=["user"],
> + aggregate="Avg",
> + )
> + assert "error" in data
> + assert "traceback" in data
> + msg = page.text_content("#view")
> + assert "avg(event)" in msg
> + assert "Traceback" in msg
> *** End Patch
> PATCH
Success. Updated the following files:
M tests/test_web.py
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# ruff format tests/test_web.p
y
1 file left unchanged
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# ruff check tests/test_web.py
tests/test_server.py scubaduck/server.py
All checks passed!
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# pyright
0 errors, 0 warnings, 0 informations
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll try running the tests again to see if they pass this time.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# pytest -k '' -q
bringing up nodes...
.....................F.............. [100%]
=================================== FAILURES ===================================
_______________________ test_query_error_shown[chromium] _______________________
[gw4] linux -- Python 3.12.3 /workspace/scubaduck/.venv/bin/python
Traceback (most recent call last):
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/_pytest/runner.p
y", line 341, in from_call
result: TResult | None = func()
^^^^^^
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/_pytest/runner.p
y", line 242, in <lambda>
lambda: runtest_hook(item=item, **kwds), when=when, reraise=reraise
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/pluggy/_hooks.py
", line 512, in __call__
return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult
)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
^
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/pluggy/_manager.
py", line 120, in _hookexec
return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/pluggy/_callers.
py", line 167, in _multicall
raise exception
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/pluggy/_callers.
py", line 139, in _multicall
teardown.throw(exception)
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/_pytest/threadex
ception.py", line 92, in pytest_runtest_call
yield from thread_exception_runtest_hook()
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/_pytest/threadex
ception.py", line 68, in thread_exception_runtest_hook
yield
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/pluggy/_callers.
py", line 139, in _multicall
teardown.throw(exception)
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/_pytest/unraisab
leexception.py", line 95, in pytest_runtest_call
yield from unraisable_exception_runtest_hook()
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/_pytest/unraisab
leexception.py", line 70, in unraisable_exception_runtest_hook
yield
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/pluggy/_callers.
py", line 139, in _multicall
teardown.throw(exception)
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/_pytest/logging.
py", line 846, in pytest_runtest_call
yield from self._runtest_for(item, "call")
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/_pytest/logging.
py", line 829, in _runtest_for
yield
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/pluggy/_callers.
py", line 139, in _multicall
teardown.throw(exception)
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/_pytest/capture.
py", line 898, in pytest_runtest_call
return (yield)
^^^^^
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/pluggy/_callers.
py", line 139, in _multicall
teardown.throw(exception)
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/_pytest/skipping
.py", line 257, in pytest_runtest_call
return (yield)
^^^^^
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/pluggy/_callers.
py", line 121, in _multicall
res = hook_impl.function(*args)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/_pytest/runner.p
y", line 174, in pytest_runtest_call
item.runtest()
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/_pytest/python.p
y", line 1627, in runtest
self.ihook.pytest_pyfunc_call(pyfuncitem=self)
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/pluggy/_hooks.py
", line 512, in __call__
return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult
)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
^
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/pluggy/_manager.
py", line 120, in _hookexec
return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/pluggy/_callers.
py", line 167, in _multicall
raise exception
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/pluggy/_callers.
py", line 121, in _multicall
res = hook_impl.function(*args)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/_pytest/python.p
y", line 159, in pytest_pyfunc_call
result = testfunction(**testargs)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/scubaduck/tests/test_web.py", line 260, in test_query_error_s
hown
data = run_query(
^^^^^^^^^^
File "/workspace/scubaduck/tests/test_web.py", line 34, in run_query
page.evaluate("groupBy.chips = arguments[0]; groupBy.renderChips();", group_
by)
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/playwright/sync_
api/_generated.py", line 8536, in evaluate
self._sync(
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/playwright/_impl
/_sync_base.py", line 115, in _sync
return task.result()
^^^^^^^^^^^^^
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/playwright/_impl
/_page.py", line 468, in evaluate
return await self._main_frame.evaluate(expression, arg)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/playwright/_impl
/_frame.py", line 278, in evaluate
await self._channel.send(
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/playwright/_impl
/_connection.py", line 61, in send
return await self._connection.wrap_api_call(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/scubaduck/.venv/lib/python3.12/site-packages/playwright/_impl
/_connection.py", line 528, in wrap_api_call
raise rewrite_error(error, f"{parsed_st['apiName']}: {error}") from None
playwright._impl._errors.Error: Page.evaluate: ReferenceError: arguments is not
defined
at eval (eval at evaluate (:313:29), <anonymous>:1:17)
at eval (<anonymous>)
at UtilityScript.evaluate (<anonymous>:313:29)
at UtilityScript.<anonymous> (<anonymous>:1:44)
------------------------------ Captured log call -------------------------------
INFO werkzeug:_internal.py:97 127.0.0.1 - - [21/May/2025 01:48:31] "GET / HT
TP/1.1" 200 -
INFO werkzeug:_internal.py:97 127.0.0.1 - - [21/May/2025 01:48:31] "GET /api
/columns HTTP/1.1" 200 -
INFO werkzeug:_internal.py:97 127.0.0.1 - - [21/May/2025 01:48:31] "POST /ap
i/query HTTP/1.1" 200 -
=========================== short test summary info ============================
FAILED tests/test_web.py::test_query_error_shown[chromium] - playwright._impl._e
rrors.Error: Page.evaluate: ReferenceError: arguments is...
1 failed, 35 passed in 6.23s
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
The error happens because Playwright wraps the evaluated expression in its own function, so there is no `arguments` object in that scope. I'll switch to an arrow function that receives the value as a parameter.
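For reference, this is the argument-passing pattern Playwright's page.evaluate expects; the chip list below is purely illustrative:

# Playwright compiles the expression string into a function and passes the
# second evaluate() argument to it, so the value has to be received as a
# parameter rather than read from a nonexistent `arguments` object.
page.evaluate(
    "g => { groupBy.chips = g; groupBy.renderChips(); }",
    ["user"],  # example chip list
)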
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# grep -n "groupBy.chips" -n t
ests/test_web.py
34: page.evaluate("groupBy.chips = arguments[0]; groupBy.renderChips();",
group_by)
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# sed -n '30,40p' tests/test_w
eb.py
if limit is not None:
page.fill("#limit", str(limit))
if group_by is not None:
page.select_option("#graph_type", "table")
page.evaluate("groupBy.chips = arguments[0]; groupBy.renderChips();", gr
oup_by)
if aggregate is not None:
page.select_option("#graph_type", "table")
page.select_option("#aggregate", aggregate)
page.evaluate("window.lastResults = undefined")
page.click("text=Dive")
page.wait_for_function("window.lastResults !== undefined")
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# apply_patch<<'PATCH'
> *** Begin Patch
> *** Update File: tests/test_web.py
> @@
> - if group_by is not None:
> - page.select_option("#graph_type", "table")
> - page.evaluate("groupBy.chips = arguments[0]; groupBy.renderChips();",
group_by)
> + if group_by is not None:
> + page.select_option("#graph_type", "table")
> + page.evaluate(
> + "g => { groupBy.chips = g; groupBy.renderChips(); }",
> + group_by,
> + )
> *** End Patch
> PATCH
Success. Updated the following files:
M tests/test_web.py
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# ruff format tests/test_web.p
y
1 file left unchanged
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# ruff check tests/test_web.py
tests/test_server.py scubaduck/server.py
All checks passed!
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# pyright
0 errors, 0 warnings, 0 informations
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
Time to run pytest and check if everything is working as expected.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# pytest -k '' -q
bringing up nodes...
.................................... [100%]
36 passed in 6.36s
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
Tests passed successfully. I'll check the git status before committing.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# git status --short
M scubaduck/server.py
M scubaduck/static/index.html
M tests/test_server.py
M tests/test_web.py
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# git add scubaduck/server.py
scubaduck/static/index.html tests/test_server.py tests/test_web.py
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# git status --short
M scubaduck/server.py
M scubaduck/static/index.html
M tests/test_server.py
M tests/test_web.py
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I might have missed committing after staging with git add. I'll double-check.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# git diff --staged --name-onl
y
scubaduck/server.py
scubaduck/static/index.html
tests/test_server.py
tests/test_web.py
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
The files are staged (git status --short lists them with "M" in the staged column), so I'll go ahead and commit.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# git commit -m "feat(server):
include SQL and traceback in query errors" && git status --short
[work 1aa7a0e] feat(server): include SQL and traceback in query errors
4 files changed, 72 insertions(+), 6 deletions(-)
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# git status --short
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
Time to write the PR summary, highlighting changes to server code error handling.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# nl -ba scubaduck/server.py |
sed -n '244,268p'
244 columns=payload.get("columns", []),
245 derived_columns=payload.get("derived_columns", {}),
246 graph_type=payload.get("graph_type", "samples"),
247 group_by=payload.get("group_by", []),
248 aggregate=payload.get("aggregate"),
249 show_hits=payload.get("show_hits", False),
250 )
251 for f in payload.get("filters", []):
252 params.filters.append(Filter(f["column"], f["op"], f.get("va
lue")))
253 sql = build_query(params)
254 try:
255 rows = con.execute(sql).fetchall()
256 except Exception as exc:
257 tb = traceback.format_exc()
258 print(f"Query failed:\n{sql}\n{tb}")
259 return (
260 jsonify({"sql": sql, "error": str(exc), "traceback": tb}
),
261 400,
262 )
263 return jsonify({"sql": sql, "rows": rows})
264
265 return app
266
267
268 app = create_app()
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
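Based on that handler, a failing query should come back as HTTP 400 with the generated SQL, the error message, and the formatted traceback. A rough sketch of the client-side expectation (assuming the client and payload fixtures from the server tests; the real assertions are shown further down):

# Sketch only: verify the shape of the error payload returned by /api/query.
rv = client.post(
    "/api/query", data=json.dumps(payload), content_type="application/json"
)
assert rv.status_code == 400
data = rv.get_json()
assert {"sql", "error", "traceback"} <= set(data.keys())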
I’ll also display the modifications I made to the showError function in JS.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# grep -n "showError" -n scuba
duck/static/index.html
653: showError(err);
885:function showError(err) {
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# nl -ba scubaduck/static/inde
x.html | sed -n '648,664p'
648 .then(data => {
649 lastQueryTime = Math.round(performance.now() - queryStart);
650 showResults(data);
651 })
652 .catch(err => {
653 showError(err);
654 });
655 }
656
657 function collectParams() {
658 updateSelectedColumns();
659 const payload = {
660 start: document.getElementById('start').value,
661 end: document.getElementById('end').value,
662 order_by: document.getElementById('order_by').value,
663 order_dir: orderDir,
664 limit: parseInt(document.getElementById('limit').value, 10),
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# nl -ba scubaduck/static/inde
x.html | sed -n '882,904p'
882 `Your query took about ${lastQueryTime} ms`;
883 }
884
885 function showError(err) {
886 window.lastResults = err;
887 const view = document.getElementById('view');
888 let msg = '';
889 if (typeof err === 'string') {
890 msg = err;
891 } else if (err) {
892 msg = err.error || 'Error';
893 if (err.sql) {
894 msg += '\nSQL: ' + err.sql;
895 }
896 if (err.traceback) {
897 msg += '\n' + err.traceback;
898 }
899 }
900 view.innerHTML = `<pre id="error-message">${msg}</pre>`;
901 document.getElementById('query_info').textContent = '';
902 }
903 </script>
904 </body>
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# nl -ba tests/test_server.py
| sed -n '90,124p'
90 "end": "2024-01-03 00:00:00",
91 "order_by": "timestamp",
92 "limit": 10,
93 "columns": ["timestamp", "user"],
94 "filters": [],
95 }
96 rv = client.post(
97 "/api/query", data=json.dumps(payload), content_type="applicatio
n/json"
98 )
99 data = rv.get_json()
100 assert data
101 rows = data["rows"]
102 assert len(rows[0]) == 2
103 assert rows[0][1] == "alice"
104
105
106 def test_string_filter_ops() -> None:
107 app = server.app
108 client = app.test_client()
109 base = {
110 "start": "2024-01-01 00:00:00",
111 "end": "2024-01-03 00:00:00",
112 "order_by": "timestamp",
113 "limit": 100,
114 "columns": ["timestamp", "event", "value", "user"],
115 }
116
117 contains = {
118 **base,
119 "filters": [{"column": "user", "op": "contains", "value": "ali"}
],
120 }
121 rv = client.post(
122 "/api/query", data=json.dumps(contains), content_type="applicati
on/json"
123 )
124 rows = rv.get_json()["rows"]
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll jump toward the bottom of tests/test_server.py to pull the new test's lines.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# grep -n "test_query_error_re
turns_sql_and_traceback" -n tests/test_server.py
269:def test_query_error_returns_sql_and_traceback() -> None:
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# nl -ba tests/test_server.py
| sed -n '260,284p'
260 }
261 rv = client.post(
262 "/api/query", data=json.dumps(payload), content_type="applicatio
n/json"
263 )
264 data = rv.get_json()
265 assert rv.status_code == 400
266 assert "error" in data
267
268
269 def test_query_error_returns_sql_and_traceback() -> None:
270 app = server.app
271 client = app.test_client()
272 payload = {
273 "start": "2024-01-01 00:00:00",
274 "end": "2024-01-03 00:00:00",
275 "columns": ["event"],
276 "group_by": ["user"],
277 "aggregate": "avg",
278 }
279 rv = client.post(
280 "/api/query", data=json.dumps(payload), content_type="applicatio
n/json"
281 )
282 data = rv.get_json()
283 assert rv.status_code == 400
284 assert "sql" in data
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# nl -ba tests/test_server.py
| sed -n '284,296p'
284 assert "sql" in data
285 assert "traceback" in data
286 assert "avg(event)" in data["sql"]
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll also pull up the new test in tests/test_web.py that verifies the changes.
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# grep -n "test_query_error_sh
own" -n tests/test_web.py
262:def test_query_error_shown(page: Any, server_url: str) -> None:
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# nl -ba tests/test_web.py | s
ed -n '250,272p'
250 data = run_query(
251 page,
252 server_url,
253 start="nonsense",
254 end="now",
255 order_by="timestamp",
256 )
257 assert "error" in data
258 msg = page.text_content("#view")
259 assert "nonsense" in msg
260
261
262 def test_query_error_shown(page: Any, server_url: str) -> None:
263 data = run_query(
264 page,
265 server_url,
266 start="2024-01-01 00:00:00",
267 end="2024-01-03 00:00:00",
268 order_by="timestamp",
269 group_by=["user"],
270 aggregate="Avg",
271 )
272 assert "error" in data
(scubaduck) root@6c81833b9b82:/workspace/scubaduck# nl -ba tests/test_web.py | s
ed -n '272,288p'
272 assert "error" in data
273 assert "traceback" in data
274 msg = page.text_content("#view")
275 assert "avg(event)" in msg
276 assert "Traceback" in msg
277
278
279 def test_column_toggle_and_selection(page: Any, server_url: str) -> None
:
280 page.goto(server_url)
281 page.wait_for_selector("#order_by option", state="attached")
282 page.click("text=Columns")
283 page.wait_for_selector("#column_groups input", state="attached")
284
285 count = page.evaluate(
286 "document.querySelectorAll('#column_groups input:checked').lengt
h"
287 )
288 assert count == 4
(scubaduck) root@6c81833b9b82:/workspace/scubaduck#
I'll summarize the changes: error responses now include the SQL and traceback, and new server and web tests cover the behavior.