logging_strict

joined 1 year ago
[–] logging_strict@programming.dev 2 points 2 days ago (1 children)

Very wise idea. And if you want to up your game, can validate the yaml against a schema.

Check out strictyaml
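A minimal sketch of schema validation with strictyaml (the schema fields here are made up):

from strictyaml import Map, Str, Int, load

schema = Map({"name": Str(), "retries": Int()})
config = load("name: demo\nretries: 3\n", schema)
print(config.data)  # {'name': 'demo', 'retries': 3}

load() raises a validation error, with line numbers, when the document does not match the schema.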

The author is ahead of his time. Uses validated yaml to build stories and weave those into websites.

Unfortunately the author also does the same with strictyaml's tests. Can get frustrating because the tests are too simple.

[–] logging_strict@programming.dev 2 points 2 days ago* (last edited 2 days ago)

Curious to hear your reasoning as to why yaml is less desirable. Would think the opposite.

Your strong opinion surprised me.

Maybe if you would allow, and have a few shot glasses handy, could take a stab at changing your mind.

But first list all your reservations concerning yaml

Relevant packages I wrote that rely on yaml

  • pytest-logging-strict

  • sphinx-external-toc-strict

v1.8.1 was released on 2025-03-17

So pynput is being maintained. BUT there are currently 155 open issues and 23 PRs. That means nothing beyond this being a popular package.

v1.8.1 Changes

  • Remove incorrectly merged line for the Xorg backend. Thanks to sphh!
  • Let events know about the new injected parameter. Thanks to phpjunkie420!

Mentioned that a PR dealt with an Xorg backend issue. Don't know if this addresses your particular issue.

What pynput version do you have in your venv?

Please run this command to get the package version:

python -m pip list | grep pynput

Can understand your issue report would be buried in all the other issues. So help there might not come in a timely manner.

The next step would be to create a pytest file so the tests can be repeated, rather than running them manually each time (sketch below). Obviously logging in and out is not possible. But it's probably also completely unnecessary.
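A sketch of what such a test file could look like, assuming the bug reproduces with a plain press/release round trip (file name, test name, and key choice are mine, untested):

# test_pynput_roundtrip.py -- hypothetical repro skeleton; needs a running X session
import threading

from pynput import keyboard


def test_keypress_roundtrip():
    seen = threading.Event()

    def on_press(key):
        # stop the listener once the expected key arrives
        if key == keyboard.Key.space:
            seen.set()
            return False

    listener = keyboard.Listener(on_press=on_press)
    listener.start()
    listener.wait()  # block until the listener is ready
    ctl = keyboard.Controller()
    ctl.press(keyboard.Key.space)
    ctl.release(keyboard.Key.space)
    assert seen.wait(timeout=2.0), "listener never saw the injected keypress"
    listener.stop()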

I use Xorg and Linux. Competent enough with pytest (author of pytest-logging-strict).

Before rolling up my sleeves, want you to confirm v1.8.1 still has this issue.

[–] logging_strict@programming.dev 1 points 1 week ago* (last edited 1 week ago)

You yourself are a victim

Having good intentions, you tried and found out the hard way that packaging does in fact matter.

You were tricked.

I looked at it, recognized the flaming turd being thrown at the proverbial wall, and dodged.

That is our job when doing code reviews and offering advice. Be kind up to the point where being honest is unavoidable.

A series of scripts does not make a package. Have to put our collective foot down; follow Nancy Reagan's advice: Just say no!

This project cannot be helped. It needs a complete rewrite.

Having minimal expectations is not being mean to noobs. Not getting anywhere in the ballpark of minimal expectations is being mean to potential users ...

[–] logging_strict@programming.dev 2 points 1 week ago* (last edited 1 week ago)

Using match is virtue signaling that there's no intention of creating a working package.

What's next on the list of crap we could all live without?

Why? It's not a package. There are no tests or anything else. It's held together with duct tape, hope, and good intentions. So of course it'll not work as intended.

[–] logging_strict@programming.dev 3 points 1 week ago* (last edited 1 week ago) (2 children)

User friendly LOL!

If you encounter any issues or bugs, let me know so I can fix them!

You shoulda led with that. I love the humor. Can't stop laughing.

The entire project is a bug LOL!

Where to start?

  1. There is no packaging at all

Start with a pyproject.toml and work from there.

  2. no tests

Everything is a bug until it's got test coverage.

  3. screenshots

In the .github folder?! That's gotta be a 1st

  4. no dev environment

Expecting pre-commit as well as isort, flake8, black, and mypy.

  5. print statements galore

Looked into requirements.txt expecting to find a console UI framework. There is none!

A pattern has emerged: many Python coders have spent little to no time learning packaging, the dev toolchain, and CI/CD publishing. When asking folks to test your work, they'll be expecting a published package, not a pile of amateurish scripts and a bash install script.

Should write an advertisement

Please someone skilled at console UI and packaging please please please help in a paid position.

Can confidently say, you need help.

Not writing more features; the OP is good at that. Just packaging, and swapping out the prehistoric console UI for a modern console UI framework.

[–] logging_strict@programming.dev 2 points 2 weeks ago (1 children)

Wow that's neat! Thanks for bringing this up.

The feature flags example should be rewritten to use enum.Flag
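Something along these lines (a sketch; the class name is mine, flag names borrowed from the example below):

from enum import Flag, auto

class Plan(Flag):
    OUTRAGED_BY_NEWS = auto()
    GET_A_COFFEE = auto()
    GO_FOR_A_HIKE = auto()
    GO_FOR_A_RUN = auto()
    GO_HOME = auto()

today = Plan.GET_A_COFFEE | Plan.GO_FOR_A_RUN | Plan.GO_HOME
print(Plan.GO_HOME in today)        # True
print(Plan.GO_FOR_A_HIKE in today)  # False

Membership tests replace the x & FLAG == FLAG dance, and ~ stays bounded to the defined flags.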

[–] logging_strict@programming.dev 3 points 3 weeks ago* (last edited 3 weeks ago)

a use case -- feature flags

Mix and match to plan your day

will i be going home today?

>>> OUTRAGED_BY_NEWS = 0b00000001
>>> GET_A_COFFEE = 0b00000010
>>> GO_FOR_A_HIKE = 0b00000100
>>> GO_FOR_A_RUN = 0b00001000
>>> GO_HOME = 0b00010000
>>> various_flags_ored_together = GET_A_COFFEE | GO_FOR_A_RUN | GO_HOME
>>> various_flags_ored_together & GO_HOME == GO_HOME
True
>>> various_flags_ored_together & GO_FOR_A_HIKE == GO_FOR_A_HIKE
False
>>> bin(various_flags_ored_together)
'0b11010'
>>> various_flags_ored_together & OUTRAGED_BY_NEWS == OUTRAGED_BY_NEWS
False
>>> bin(OUTRAGED_BY_NEWS)
'0b1'
>>> various_flags_ored_together >>= OUTRAGED_BY_NEWS
>>> bin(various_flags_ored_together)
'0b1101'

Guess haven't gone for a hike today...maybe tomorrow

Right shift by OUTRAGED_BY_NEWS shifts by its value (1), dropping the right-most bit, which here happens to be the OUTRAGED_BY_NEWS flag. That trick only works for the right-most bit, and it renumbers every other flag. To clear a flag at any position, mask it off instead: x &= ~FLAG.
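Clearing a flag the position-independent way, continuing the session above:

>>> x = GET_A_COFFEE | GO_FOR_A_RUN | GO_HOME
>>> x &= ~GO_FOR_A_RUN
>>> bin(x)
'0b10010'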

use case -- file access permissions

For those looking to check file access permissions there is the stat module

>>> import stat
>>> from pathlib import Path
>>> path_f = Path.home().joinpath(".bashrc")
>>> stat.S_IRUSR
256
>>> path_f.stat().st_mode
33188
>>> is_owner_read = path_f.stat().st_mode & stat.S_IRUSR == stat.S_IRUSR
>>> is_owner_read
True
>>> path_f = Path("/etc/fstab")
>>> is_other_write = path_f.stat().st_mode & stat.S_IWOTH == stat.S_IWOTH
>>> is_other_write
False

Assumes ~/.bashrc exists; if not, choose a different file you own and have read access to.

path_f.stat().st_mode & stat.S_IRUSR == stat.S_IRUSR

This checks the plain file permissions (not Linux access control lists). All those permission flags are crammed into st_mode. stat.S_IRUSR is 256, i.e. bit 2^8; the expression asks: in st_mode, is that bit on?
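A quick way to eyeball st_mode is stat.filemode, which renders it ls style:

>>> stat.filemode(33188)
'-rw-r--r--'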

Sources

read user access stat.S_IRUSR

write others access stat.S_IWOTH

os.stat_result

pathlib.Path.stat

Sorry

Is that a quote from the Forrest Gump movie?

Slightly healthier words of wisdom

Can only take so much humor on an empty stomach.

[–] logging_strict@programming.dev 1 points 3 weeks ago* (last edited 3 weeks ago)
from multiprocessing import Lock
l = Lock()

flake8 .... way too ambiguous

so what if it's 100x slower /nosarc

8
submitted 1 month ago* (last edited 1 month ago) by logging_strict@programming.dev to c/python@programming.dev
 

Market research

This post is only about dependency management, not package management, not build backends.

You know about these:

  • uv

  • poetry

  • pipenv

You are probably not familiar with:

  • pip-compile-multi

    (toposort, pip-tools)

You are definitely unfamiliar with:

  • wreck

    (pip-tools, pip-requirements-parser)

pip-compile-multi creates lock files. Has no concept of unlock files.

wreck produces both lock and unlock files. venv aware.

Both sync dependencies across requirement files

Both act only upon requirements files, not venv(s)

Up to speed with wreck

You are familiar with .in and .txt requirements files.

.txt is split out into .lock and .unlock. The latter is for packages which are not apps.

Create .in files that are interlinked with -r and -c. No editable builds. No URLs.

(If this is a deal breaker feel free to submit a PR)
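My guess at the shape of the outputs, with invented contents; the exact bounds wreck writes may differ:

# requirements/prod.in (hypothetical input)
attrs
typing-extensions>=4.6

# requirements/prod.lock   --> exact pins, e.g. attrs==25.3.0 (for apps)
# requirements/prod.unlock --> loose bounds, e.g. attrs>=25.3.0 (for libraries)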

pins files

pins-*.in files are for common constraints. The huge advantage here is documenting the why.

Without the documentation, even the devs have no idea whether the constraint is still required.

pins-*.in files are split up so each tackles one issue. The beauty is the issue must be documented in enough detail to bring yourself back up to speed.

Explain the origin of the issue in terms a 6 year old can understand.
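A sketch of such a pins file (the pin and the reason are invented for illustration):

# pins-cffi.in (hypothetical)
# cffi: wheels for our oldest supported interpreter stop at this version;
# remove the pin once that interpreter is dropped.
cffi<2.0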

Configuration

python -m pip install wreck

This is logging-strict's pyproject.toml:


[tool.wreck]
create_pins_unlock = false

[[tool.wreck.venvs]]
venv_base_path = '.venv'
reqs = [
    'requirements/dev',
    'requirements/kit',
    'requirements/pip',
    'requirements/pip-tools',
    'requirements/prod',
    'requirements/manage',
    'requirements/mypy',
    'requirements/tox',
]

[[tool.wreck.venvs]]
venv_base_path = '.doc/.venv'
reqs = [
    'docs/requirements',
]

[project]
dynamic = [
    "optional-dependencies",
    "dependencies",
    "version",
]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements/prod.unlock"] }
optional-dependencies.pip = { file = ["requirements/pip.lock"] }
optional-dependencies.pip_tools = { file = ["requirements/pip-tools.lock"] }
optional-dependencies.dev = { file = ["requirements/dev.lock"] }
optional-dependencies.manage = { file = ["requirements/manage.lock"] }
optional-dependencies.docs = { file = ["docs/requirements.lock"] }
version = {attr = "logging_strict._version.__version__"}

Look how short and simple that is.

The only thing you have to unlearn is being so timid.

More venvs. More constraints and requirements complexity.

Do it

mkdir -p .venv || :;
pyenv version > .venv/python-version
python -m venv .venv

mkdir -p .doc || :;
echo "3.10.14" > .doc/python-version
cd .doc && python -m venv .venv; cd - &>/dev/null

. .venv/bin/activate
# python -m pip install wreck
reqs fix --venv-relpath='.venv'

There will be no avoidable resolution conflicts.

Preferable to do this within tox-reqs.ini

Details

The TOML file format expects paths to be single quoted (literal strings, so nothing gets escaped). The paths are relative and written without the last file suffix.

If pyproject.toml is not in the cwd, pass --path='path to pyproject.toml'

create_pins_unlock = false tells wreck to not produce .unlock files for pins-*.in files.

DANGER

This is not for the faint of heart. Avoid it if you can. This is for the folks who often say, Oh really? Hold my beer!

For pins that span venv, add the file suffix .shared

e.g. pins-typing.shared.in

wreck deals with one venv at a time. Files that span venv have to be dealt with manually and carefully.

Issues

  1. no support for editable builds

  2. no url support

  3. no hashes

  4. your eyes will tire and your brains will splatter on the wall from all the eye rolling, after sifting thru endless posts on uv and poetry and none about pip-compile-multi or wreck

  5. Some folks love having all dependencies managed within pyproject.toml. These folks are deranged and it's impossible to convince them otherwise. pyproject.toml is a config file, not a database. It should be read only.

  6. a docs link on pypi.org is 404. Luckily there are two docs links. Should really just fix that, but it's left like that to see if anyone notices. No one did.

6
submitted 1 month ago* (last edited 1 month ago) by logging_strict@programming.dev to c/python@programming.dev
 

Finally got around to creating a gh profile page

The design is to give activity insights on:

  • what Issues/PRs working on

  • future issues/PRs

  • for fun, show off package mascots

All out of ideas. Any suggestions? How did you improve your github profile?

14
What's in a Python tarball (programming.dev)
submitted 2 months ago* (last edited 2 months ago) by logging_strict@programming.dev to c/python@programming.dev
 

From helping other projects, have run across a fundamental issue for which web searches have not given appropriate answers.

What should go in a tarball and what should not?

Is it only the build files, python code, and package data and nothing else?

Should it include tests/ folder?

Should it include development and configuration files?

Have seven published packages which include almost all the files and folders (roughly the MANIFEST.in sketch after this list). Including:

  • .gitignore

  • .gitattributes

  • .github folder tree

  • docs/

  • tests/

  • Makefile

  • all config files

  • all tox files

  • pre-commit config file
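With a setuptools build backend, that everything-included policy would be expressed in MANIFEST.in; a sketch (directives are illustrative, not copied from any of the seven packages):

include .gitignore .gitattributes Makefile
include *.cfg *.toml *.ini
graft .github
graft docs
graft tests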

My thinking is that the tarball should have everything needed to maintain the package, but this belief has been challenged: the tarball, supposedly, is not the place for that.

Thoughts?

 

PEP 735: what is its goal? Does it solve our dependency hell issue?

A deep dive, and out comes this limitation:

The mutual compatibility of Dependency Groups is not guaranteed.

-- https://peps.python.org/pep-0735/#lockfile-generation

Huh?! Why not?
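For reference, PEP 735 dependency groups live in pyproject.toml like this (group names and specs are illustrative):

[dependency-groups]
dev = ["pytest>=8", "coverage"]
docs = ["sphinx"]
all = [{include-group = "dev"}, {include-group = "docs"}]

Each group resolves on its own; the spec never promises that dev and docs can coexist in the same venv. Hence the limitation above.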

mutual compatibility or go pound sand!

pip install -r requirements/dev.lock
pip install -r requirements/kit.lock -r requirements/manage.lock

The above code, purposefully, does not afford pip a fighting chance. If there are incompatibilities, they'll come out when trying randomized combinations.

Without a means to test for and guarantee mutual compatibility, end users will always find themselves in dependency hell.

Any combination of requirement files (or dependency groups), intended for the same venv, MUST always work!

What if this is scaled further, instead of one package, a chain of packages?!

 

At the top of a requirements-*.in file are lines with -c and -r flags, each followed by a path to another requirements-*.in file. Uses relative paths (ignoring URLs).

Say have docs/requirements-pip-tools.in

-r ../requirements/requirements-prod.in
-c ../requirements/requirements-pins-base.in
-c ../requirements/requirements-pins-cffi.in

...

The intent is that compiling this would produce docs/requirements-pip-tools.txt

But there is confusion as to which flag to use. It's non-obvious.

constraint

Supports a subset of requirements features. Intended to restrict package versions. Does not itself install the package; it only pins the version if a requirement pulls the package in.

Does not support:

  • editable mode (-e)

  • extras (e.g. coverage[toml])

Personal preference

  • always organize requirements files in folder(s)

  • don't prefix requirements files with requirements-, just doing it here

  • DRY principle applies; split out constraints which are shared.
