Compare commits


1007 commits
v0.1 ... master

Author SHA1 Message Date
lilydjwg
fc8d90f46b tests: update 2025-03-01 11:43:32 +08:00
依云
e1882a2778
Merge pull request #295 from Maks1mS/feat/snap-source
feat: add snapcraft source
2025-02-28 18:15:38 +08:00
Maxim Slipenko
b83cbd2ead tests: add tests for snapcraft source 2025-02-28 11:05:31 +03:00
Maxim Slipenko
d4be5189fe fix: handle HTTPError specifically 2025-02-28 11:04:47 +03:00
Maxim Slipenko
cc9001ad88
feat: add snapcraft source 2025-02-22 06:16:54 +00:00
lilydjwg
fca89f7830 Merge remote-tracking branch 'origin/pr/293' 2025-02-11 10:53:07 +08:00
Yichao Yu
92a424f946 Implement use_max_release for github
This follows the logic for use_max_tag but only includes the tags that are part of
a github release. The returned version follows include_prerelease and use_release_name
just like use_latest_release.

This allows waiting for release artifacts to be created after the tag is created on the repo.
2025-02-10 21:15:30 -05:00
lilydjwg
d3bf5b2db5 tests: fix unintended redirection 2025-02-11 10:01:33 +08:00
lilydjwg
bd72ea04d2 Merge branch 'test-ci' 2025-02-10 17:38:52 +08:00
lilydjwg
50d5342586 tests: I don't know why but these tests are flaky 2025-02-10 17:31:01 +08:00
lilydjwg
1e3ed1e4b9 run_cached_tests: output mitmdump logs 2025-02-10 17:31:01 +08:00
lilydjwg
72c0730725 tests: venv mitmproxy to avoid conflict 2025-02-10 17:31:01 +08:00
lilydjwg
6759d2f206 tests: use source distribution of pycurl so that it supports more compression types
Cross-os binary distribution is EVIL!
2025-02-10 16:18:09 +08:00
lilydjwg
17e351f825 support SSL_CERT_FILE env for pycurl http backend 2025-02-10 15:51:05 +08:00
lilydjwg
d58638733e tests: cache http responses 2025-02-10 15:51:05 +08:00
lilydjwg
287282eb2c archpkg: update url 2025-02-10 15:51:05 +08:00
lilydjwg
fa9ca38690 tests: fix type 2025-02-10 12:25:31 +08:00
lilydjwg
5c7918bf7a tests: don't run test_container on GitHub Actions 2025-02-10 11:40:20 +08:00
lilydjwg
c03bec7452 tests: travis is long gone 2025-02-10 11:40:20 +08:00
依云
7cacd9edaf
Merge pull request #292 from JeanChristopheMorinPerso/skip_yanked_pypi_releases
Exclude yanked PyPI releases
2025-02-10 10:26:17 +08:00
Jean-Christophe Morin
af21f93bd1 Explicitly document the behavior around yanked releases
Signed-off-by: Jean-Christophe Morin <jean_christophe_morin@hotmail.com>
2025-02-09 14:58:31 -05:00
Jean-Christophe Morin
ca011221cf Use walrus operator and explicitly require python >= 3.8
Signed-off-by: Jean-Christophe Morin <jean_christophe_morin@hotmail.com>
2025-02-09 14:50:10 -05:00
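The walrus operator referenced in the commit above (PEP 572) is why the change bumps the minimum to Python 3.8: it assigns and tests a value in a single expression. A minimal illustration (variable names are hypothetical, not from the commit):

```python
# Walrus operator (Python 3.8+): bind a value and test it in one expression.
data = [3, 1, 4, 1, 5]
if (n := len(data)) >= 3:
    result = f"{n} items"
else:
    result = "too few"
```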
lilydjwg
745f0decdb remove obsolete entries from sample_config.toml 2025-02-09 15:19:14 +08:00
Jean-Christophe Morin
6f5870787a Exclude yanked PyPI releases
Signed-off-by: Jean-Christophe Morin <jean_christophe_morin@hotmail.com>
2025-02-08 16:02:51 -05:00
lilydjwg
4364759b29 docs: fix example
"is_xml" didn't work and wasn't added.

[skip ci]
2025-02-05 14:59:34 +08:00
lilydjwg
dbf6c4601f docs: add an example on RSS feed
closes #289.

[skip ci]
2025-02-05 14:57:56 +08:00
lilydjwg
4d5e29f26b gitea: use the full date string as commit version
Only using the date part is too coarse. Also, the date can differ
across timezones and thus be confusing.
2024-12-28 21:53:10 +08:00
lilydjwg
ad892b713e docs: add a section about Gogs / Forgejo / Codeberg
closes #287.

[skip ci]
2024-12-28 21:53:10 +08:00
依云
5803237d39
Merge pull request #285 from dpeukert/patch-1
chore: fix typo in comment
2024-12-20 16:49:53 +08:00
Daniel Peukert
71a0002624
chore: fix typo in comment 2024-12-20 09:34:45 +01:00
lilydjwg
2722ccc7fe docs: revision may be missing in the event. 2024-12-18 15:23:21 +08:00
Jamison Lahman
228139ddd4 handle default logger when revision is None 2024-12-17 07:06:44 -08:00
Jamison Lahman
372fce4445 update usage 2024-12-17 06:28:03 -08:00
Jamison Lahman
babb3aa74e feat: include revision in logging output 2024-12-17 06:14:14 -08:00
依云
d44a50c273
Merge pull request #283 from dpeukert/master
feat: add opam source
2024-12-15 16:40:27 +08:00
Daniel Peukert
c0d6568434
feat: add opam source 2024-12-15 00:39:04 +01:00
lilydjwg
2a64f7ab8f bump version to 2.17dev 2024-12-02 23:57:50 +08:00
lilydjwg
af1a855fd5 make mypy happy 2024-12-02 23:56:11 +08:00
Jakub Ružička
eeddd56156 rpmrepo: new RPM repo source 2024-12-02 15:16:55 +01:00
Jakub Ružička
29b0f63103 docs: Fix too short title underline in jq usage
This fixes docs build warning:

	docs/usage.rst:330: WARNING: Title underline too short.
2024-12-02 14:45:09 +01:00
lilydjwg
3a87f920a5 version 2.16 2024-11-30 16:48:57 +08:00
lilydjwg
af7acde251 httpx_httpclient: pass None when params is empty
see https://github.com/encode/httpx/discussions/3428
2024-11-30 16:42:57 +08:00
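The commit above works around httpx's handling of empty query parameters (see the linked discussion) by passing `None` rather than an empty mapping. A sketch of the idea — `normalize_params` is a hypothetical helper, not the actual nvchecker code:

```python
def normalize_params(params):
    # Pass None instead of an empty mapping so the HTTP client
    # does not treat "no parameters" as an empty query string.
    return dict(params) if params else None
```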
Chih-Hsuan Yen
c7f571ae79 update tests 2024-11-30 14:23:18 +08:00
lilydjwg
256bb9dd3b use repr instead of str for errors
Ref #279.
2024-11-05 15:09:50 +08:00
lilydjwg
28f1ab5144 github: make use_release_name=false work with include_prereleases
also make gitref use the full ref path format. (If someone hands this
value to git, it should still work.)
2024-11-03 19:11:37 +08:00
lilydjwg
e1a6517169 tests: disable aiohttp as it always fails; also use Python version 3.13 instead of 3.13-dev 2024-11-03 19:10:42 +08:00
lilydjwg
afad2f08f2 tests: skip hg test as it keeps timing out 2024-11-03 18:51:04 +08:00
lilydjwg
5cead67640 ci: add Python 3.13 2024-11-03 18:51:04 +08:00
lilydjwg
728dcca1bd update tests 2024-11-03 18:36:46 +08:00
lilydjwg
89b4cd90ba tests: use asyncio_default_fixture_loop_scope = "session"
See also https://github.com/pytest-dev/pytest-asyncio/issues/924.
2024-11-03 18:30:54 +08:00
lilydjwg
4eb70a0fbe github: add option to use release name instead of tag
fixes #278
2024-11-03 18:24:54 +08:00
依云
8d3b6adaaa
Merge pull request #277 from ilai-deutel/cratesio-prerelease
crates.io: skip pre-releases, list option
2024-09-27 12:39:34 +08:00
lilydjwg
5b561456ae update tests 2024-09-27 12:36:44 +08:00
Ilaï Deutel
d5cc276000 crates.io: skip pre-releases, list option
This adds a `use_pre_release` option and support for list options for crates.io, similar to PyPI.
2024-09-27 00:10:21 -04:00
lilydjwg
3abe5ad394 Merge remote-tracking branch 'origin/pr/276' 2024-08-08 23:41:52 +08:00
imlonghao
e4cff0b507
anitya: support to use anitya ID 2024-08-08 23:13:47 +08:00
lilydjwg
d4e27c22ac aiohttp: remove unused object causing async context issues 2024-08-08 16:50:01 +08:00
lilydjwg
7848f0907d bump version to 2.16dev 2024-07-14 16:41:43 +08:00
lilydjwg
a1ab77f92a Merge remote-tracking branch 'origin/pr/275' 2024-07-14 16:40:47 +08:00
lilydjwg
5a3ca69c1c update tests 2024-07-14 16:40:25 +08:00
Bert Peters
5677c9595e Add launchpad source 2024-07-13 14:54:46 +02:00
lilydjwg
61ca3c95d0 pypi: fix handling of invalid versions
fixes #273.
2024-06-01 16:34:36 +08:00
lilydjwg
1d55664b31 version 2.15 2024-05-29 20:55:53 +08:00
lilydjwg
c66ca7f2bc fix nvtake name=version corrupted the oldver file
fixes #272.
2024-05-24 15:05:24 +08:00
lilydjwg
2800f392d9 Merge remote-tracking branch 'origin/master' 2024-05-19 15:19:55 +08:00
依云
3cf403fc51
Merge pull request #270 from yan12125/pypi-list-options
pypi: support list options
2024-05-19 15:18:08 +08:00
lilydjwg
6af59aa2ae test_ubuntupkg: update and make it more resilient 2024-05-19 12:16:02 +08:00
Chih-Hsuan Yen
8397cbbce0 pypi: support list options 2024-05-19 12:10:40 +08:00
lilydjwg
d49d07437b aur: datetime.utcfromtimestamp is deprecated 2024-05-11 15:58:26 +08:00
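`datetime.utcfromtimestamp` is deprecated since Python 3.12 because it returns a naive datetime; the documented replacement is `fromtimestamp` with an explicit UTC timezone. A sketch of the migration pattern the commit above describes:

```python
from datetime import datetime, timezone

ts = 1_700_000_000
# Deprecated since Python 3.12 (returns a naive datetime):
#   datetime.utcfromtimestamp(ts)
# Replacement: an aware datetime in UTC.
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
```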
lilydjwg
5db6a89fc3 test_ubuntupkg: update and make it more resilient 2024-05-11 14:15:37 +08:00
lilydjwg
9f6706e0e7 tests: fix dep on httpbin 2024-05-11 14:15:37 +08:00
lilydjwg
2a63ddef7a fix nicelogger for Python 3.12 2024-04-28 17:14:57 +08:00
lilydjwg
fdac4af358 Revert "update nicelogger"
This reverts commit 0f441b0bf2.

nicelogger seems to have local changes and is depended on by nvchecker/tools.py
2024-04-28 17:12:14 +08:00
lilydjwg
0f441b0bf2 update nicelogger 2024-04-28 16:32:38 +08:00
lilydjwg
0890c7e966 make exception from "_process_result" a "no-result" log
so that it follows the doc and tests.
2024-04-26 21:25:56 +08:00
lilydjwg
7e2583be56 emit an error log when an exception is set
also fixes #269, but maybe more
2024-04-26 20:27:45 +08:00
lilydjwg
d13cbeea38 emit a warning if "include_regex" matched no versions
fixes #269.
2024-04-26 20:13:45 +08:00
lilydjwg
5140fa2542 Merge remote-tracking branch 'origin/pr/268' 2024-04-24 10:03:54 +08:00
lilydjwg
c367b49703 logging: filter out taskName that is added by Python 3.12
fixes #267.
2024-04-24 10:02:14 +08:00
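Python 3.12 adds a `taskName` attribute to `logging.LogRecord` (the name of the running asyncio task), which can pollute structured log output. One way to filter it out — a sketch under the assumption that blanking the attribute is acceptable; the class name is hypothetical and nvchecker's actual fix may differ:

```python
import logging

class DropTaskName(logging.Filter):
    """Blank the 'taskName' attribute that Python 3.12 adds to LogRecord."""
    def filter(self, record):
        if hasattr(record, 'taskName'):
            record.taskName = None
        return True  # never suppress the record itself
```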
Daniel Peukert
62fdca8fbf
feat: implement version url support for go 2024-04-23 23:27:06 +02:00
lilydjwg
f1ff604b4c httpx: use "proxy" argument instead of deprecated "proxies" 2024-04-19 12:48:53 +08:00
lilydjwg
61b3705f35 new dev version 2024-04-19 12:46:47 +08:00
lilydjwg
bd2b2f9494 only save new version records with a full run
and keep others only when `--entry` specified on cmdline.

Fixes #263.
2024-04-19 11:54:03 +08:00
依云
0355af5e4c
Merge pull request #264 from marsam/fix-exception-statement
fix: correct try/except statement
2024-04-19 09:56:46 +08:00
Mario Rodas
67ccc10f17 fix: correct try/except statement
Otherwise, it fails because Python can't find the `jq` exception.
2024-04-18 04:20:00 +00:00
lilydjwg
2f6f803c1a version 2.14 2024-04-17 16:49:00 +08:00
lilydjwg
be8967b1d2 tests: missing assert statements 2024-04-17 16:49:00 +08:00
依云
7960541a34
Merge pull request #262 from yan12125/fix-tests
tests: update Debian package versions again
2024-04-08 15:06:18 +08:00
Chih-Hsuan Yen
c306b44711 tests: update Debian package versions again 2024-04-08 14:31:24 +08:00
rocka
47ce6fc2e4
feat: add jq source (#261)
feat: add jq source to parse json
2024-03-31 16:10:09 +08:00
依云
511922e6fb
Merge pull request #260 from yan12125/fix-tests
tests: update Debian package versions
2024-03-24 11:52:00 +08:00
Chih-Hsuan Yen
beceadddee tests: update Debian package versions 2024-03-24 11:45:49 +08:00
lilydjwg
de1a3c6fc2 record rich results in verfile and get rid of Result & VersData types
Use RichResult to replace Result; Result was RichResult plus entry name.
2024-03-14 18:30:15 +08:00
David Arnold
f29bdee6a3
feat: return full results from result processing 2024-03-14 10:40:25 +01:00
David Arnold
9dd7fb1736 fix: #255
this ensures the newver file keeps untouched records
2024-03-11 21:26:04 +08:00
David Arnold
03819388e8 feat: log up-to-date when single entry 2024-03-11 21:26:04 +08:00
lilydjwg
471b66ff4c tests: use pytest-rerunfailures instead of flaky
flaky is no longer compatible with pytest 8.0 and seems unmaintained.

https://github.com/pytest-dev/pytest/issues/12066
2024-03-09 20:23:25 +08:00
lilydjwg
4ec12d30fd mercurial: tests: update asyncio mark 2024-03-09 19:59:01 +08:00
lilydjwg
1005bd5f01 mercurial: update tests 2024-03-09 19:19:06 +08:00
lilydjwg
4ad3bdb95d go: style: unify indentation etc 2024-03-09 18:48:47 +08:00
lilydjwg
35a19f30ae mercurial: minor update 2024-03-09 18:45:19 +08:00
lilydjwg
5227df6f5b Merge remote-tracking branch 'origin/pr/170' 2024-03-09 18:44:04 +08:00
lilydjwg
e19f033f15 Merge remote-tracking branch 'origin/master' 2024-03-09 18:41:47 +08:00
yingziwu
0e0eb2c363
feat: add go pkg support (#259) 2024-03-09 18:41:20 +08:00
lilydjwg
a7e6512e70 use LiteralString (although mypy hasn't supported it yet)
https://github.com/python/mypy/issues/12554
2024-03-09 18:37:05 +08:00
lilydjwg
c53952c75c httpheader: fix GetVersionError usage 2024-03-09 18:28:44 +08:00
lilydjwg
9784e64307 still fix 2024-03-08 20:49:37 +08:00
lilydjwg
5d908b34a6 fixes 2024-03-08 20:39:13 +08:00
lilydjwg
1d19082673 Merge remote-tracking branch 'origin/pr/254' 2024-03-08 20:34:16 +08:00
lilydjwg
370711c878 fix linter warnings 2024-03-08 20:33:31 +08:00
David Arnold
03c2e25bbd
feat: implement more metadata for git{,ea,hub,lab} 2024-03-04 13:10:14 +01:00
David Arnold
3691fa9a51
feat: add richer results 2024-03-04 13:10:13 +01:00
David Arnold
f8a73a5a92
docs: stop mentioning the old keys
While the old way of declaring the keys is still available, it is no longer documented.

This should encourage new users to use the new, netrc-compatible way.
2024-03-04 12:54:13 +01:00
David Arnold
8c82c3c289
feat: add configurable host to github 2024-03-02 10:47:50 +01:00
David Arnold
e3e8415251
feat: add netrc support
backwards compatible
2024-03-02 10:46:46 +01:00
lilydjwg
6b73d8cd87 update github source to handle rate limits according to current github docs 2024-02-29 21:00:13 +08:00
lilydjwg
60d88ac5d2 update github actions 2024-02-27 15:03:38 +08:00
依云
c9017d7e07
Update codeql.yml 2024-02-27 15:03:17 +08:00
lilydjwg
e5751d2894 workaround a pycurl wheel issue to fix ci
https://github.com/pycurl/pycurl/issues/834
2024-02-27 14:55:10 +08:00
lilydjwg
1e5de6cbcb support using prefix and from_pattern/to_pattern at the same time
closes #249.
2024-02-27 13:52:35 +08:00
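The commit above combines two version-transform options that were previously exclusive. A sketch of how applying `prefix` first and then `from_pattern`/`to_pattern` could look — `transform` is a hypothetical helper for illustration, not nvchecker's actual code:

```python
import re

def transform(version, prefix=None, from_pattern=None, to_pattern=None):
    # Strip the literal prefix first, then apply the regex substitution
    # on the remainder.
    if prefix and version.startswith(prefix):
        version = version[len(prefix):]
    if from_pattern:
        version = re.sub(from_pattern, to_pattern or '', version)
    return version
```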
lilydjwg
c944cbcac3 bump version to 2.13.1 2023-12-25 15:43:28 +08:00
lilydjwg
28c3c7076b Revert "nvtake: clean up old records from oldver file on saving"
This reverts commit eaac776a90.

Failed ones shouldn't be removed. Need a way to indicate failures.
2023-12-25 15:43:28 +08:00
依云
07c3704641
Merge pull request #246 from bianjp/bugfix-container-tag-multiarch
Fix container test for multi-arch images
2023-12-18 16:02:16 +08:00
Bian Jiaping
fac30d46d9
Fix container test for multi-arch images 2023-12-18 14:58:56 +08:00
lilydjwg
d35fb3b5bc bump version to 2.13 2023-12-15 15:58:19 +08:00
lilydjwg
6c63d7991f drop support for Python 3.7
It reached end of life on 2023-06-06 and is not supported by pytest-asyncio 0.23+.
2023-12-09 20:38:23 +08:00
lilydjwg
56869a45bc tests: require pytest-asyncio version >=0.23 2023-12-09 20:38:23 +08:00
Chih-Hsuan Yen
4ca61ba11a tests: fix for pytest-asyncio 0.23 2023-12-09 20:38:00 +08:00
moui0
918457afed tests: bump example and xml2 version 2023-12-07 06:01:56 +01:00
lilydjwg
eaac776a90 nvtake: clean up old records from oldver file on saving 2023-12-04 17:49:14 +08:00
Bian Jiaping
4833135f87
container source plugin supports watching update of a specified tag (#243)
The container source plugin now supports watching updates of a specified tag. Resolves #241
2023-12-02 23:45:25 +08:00
Andreas Schleifer
0ba8cd41de
feat(tools): handle symlinks correctly for take command (#240) 2023-11-29 00:20:59 +08:00
lilydjwg
c863d24695 tests: aiohttp isn't ready for 3.12 yet
See also https://github.com/aio-libs/aiohttp/issues/7739
2023-10-27 17:36:22 +08:00
lilydjwg
6da6145350 tests: include Python 3.12 2023-10-27 17:19:09 +08:00
Levente Polyak
9221a476c5
feat(core): resolve symlinks on file write (#233)
* feat(core): resolve symlinks on file write

Instead of using the passed filename as the rename target, use
Path.resolve for resolving any symlinks.

This allows using symlinks for the nvchecker database files that point
somewhere else. Before this commit nvchecker simply replaced the symlink
with an actual file; now we resolve the target first and only replace
the actual file being pointed to.

---------

Signed-off-by: Levente Polyak <levente@leventepolyak.net>
Co-authored-by: Andreas 'Segaja' Schleifer <webmaster@segaja.de>
Co-authored-by: lilydjwg <lilydjwg@gmail.com>
2023-10-27 17:18:05 +08:00
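The symlink-preserving write described above can be demonstrated with `Path.resolve`, which follows symlinks to the real target before the atomic rename. A self-contained sketch (file names are hypothetical):

```python
import os
import tempfile
from pathlib import Path

# Writing through a symlink: resolve it first, then rename onto the
# real target, so the symlink itself survives the write.
d = Path(tempfile.mkdtemp())
real = d / 'real.json'
real.write_text('{}')
link = d / 'link.json'
link.symlink_to(real)

tmp = d / 'new.json'
tmp.write_text('{"v": 1}')
os.replace(tmp, link.resolve())  # replaces real.json, not the symlink
```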
依云
c401d239b2
Merge pull request #239 from dpeukert/feature/source-url-support
Implement URL result support for other sources
2023-10-23 16:18:30 +08:00
Daniel Peukert
55b3f671c8
implement second batch of URL results for sources 2023-10-18 21:58:29 +02:00
Daniel Peukert
592b4a4f6b
update docs for sparkle 2023-10-18 02:03:10 +02:00
Daniel Peukert
6bf34873d3
implement first batch of URL results for sources 2023-10-18 02:00:46 +02:00
Daniel Peukert
5a6fee2817
fix pacman test 2023-10-18 01:59:43 +02:00
Daniel Peukert
a04d6b0fc6
implement rich result support for list-based sources 2023-10-18 01:59:30 +02:00
lilydjwg
5dcb3bc36a tests: update 2023-10-14 15:19:13 +08:00
lilydjwg
d00203d103 fix: Python < 3.10 doesn't have kw_only for dataclass 2023-10-14 14:54:56 +08:00
lilydjwg
bdd025d995 implement url results for AUR and PyPI
see https://github.com/lilydjwg/nvchecker/issues/238.
2023-10-14 14:52:10 +08:00
lilydjwg
cc4a0cd301 support url in results 2023-10-14 14:52:10 +08:00
lilydjwg
0fd35f4458 minor fix of a docstring 2023-10-14 14:52:10 +08:00
依云
3563d71fb2
Merge pull request #237 from stepnem/notifyopt
nvchecker-notify: accept and pass --tries and --failures to nvchecker
2023-10-07 19:06:34 +08:00
Štěpán Němec
4f79058cbd nvchecker-notify: accept and pass --tries and --failures to nvchecker 2023-10-03 17:28:34 +02:00
lilydjwg
b8c0e8a6b3 still fix readthedocs
[skip ci]
2023-08-26 11:54:02 +08:00
lilydjwg
f3da72403e fix readthedocs
[skip ci]
2023-08-26 11:47:34 +08:00
lilydjwg
efc99cb791 add .readthedocs.yaml
https://blog.readthedocs.com/migrate-configuration-v2/

[skip ci]
2023-08-26 11:46:24 +08:00
lilydjwg
2683f47e04 update tests: give up deepin as it times out frequently 2023-08-26 10:23:40 +08:00
lilydjwg
0a90d195d1 update tests 2023-08-26 10:17:40 +08:00
依云
83272dc04a
Merge pull request #236 from dpeukert/github-prereleases
github: implement option to include prereleases
2023-08-26 10:09:00 +08:00
Daniel Peukert
5052d82550
github: implement option to include prereleases 2023-08-25 23:23:30 +02:00
依云
587eeafd65
Merge pull request #235 from yan12125/fix-tests
Fix tests
2023-08-22 20:48:46 +08:00
Chih-Hsuan Yen
386d05f8e3 tests: skip testing with broken Deepin mirror 2023-08-22 19:51:16 +08:00
Chih-Hsuan Yen
07cddd9bd6 tests: fix tests with httpbin 0.10.0
The latest httpbin changes the format of JSON response. For example:
```
{
  "authenticated": true,
  "user": "username"
}
```
2023-08-22 18:44:51 +08:00
Chih-Hsuan Yen
598bb94135 tests: update Android SDK version 2023-08-22 18:40:01 +08:00
依云
ea542d1fa1
Merge pull request #231 from hack3ric/master
tests: bump android-sdk-cmake and xml2 version
2023-07-13 18:00:44 +08:00
Eric Long
fe1342e9fb tests: bump android-sdk-cmake and xml2 version 2023-07-13 16:54:24 +08:00
lilydjwg
5a972cd0d0 alpm: community repo is gone 2023-06-29 14:03:03 +08:00
依云
4e7dce4e93
Merge pull request #229 from yan12125/fix-test-alpmfiles
Fix test_alpmfiles in Arch chroots
2023-06-19 22:18:38 +08:00
Chih-Hsuan Yen
d9888cc49d Fix test_alpmfiles in Arch chroots
In general, there are no /var/lib/pacman/sync/*.files in Arch chroots,
and thus tests in test_alpmfiles.py fail.

The tempfile approach roughly follows tests/test_alpm.py.

Follow-up https://github.com/lilydjwg/nvchecker/pull/226
2023-06-19 22:06:47 +08:00
lilydjwg
b54c5a0cf7 PyPI signature support is gone
[skip ci]
2023-06-17 22:27:21 +08:00
lilydjwg
82dc93ceff bump version to 2.12 2023-06-17 18:52:00 +08:00
lilydjwg
e708d19d35 tests: update test_htmlparser to avoid httpbin.org
it produces a lot of 504 errors.
2023-06-02 00:58:24 +08:00
lilydjwg
d3603fcee8 better documentation on optional dependencies
fixes #225.
[skip ci]
2023-06-02 00:43:49 +08:00
lilydjwg
ac57d8fd15 alpmfiles: fix typing 2023-06-02 00:35:05 +08:00
依云
62ab3b05ef
Merge pull request #226 from pekkarr/alpmfiles
Add alpmfiles source
2023-06-02 00:33:02 +08:00
Pekka Ristola
8157e08b59
Add alpmfiles source
Rewritten from the `archfiles` source in archlinuxcn/lilac
2023-05-31 21:29:17 +03:00
lilydjwg
af77af34f3 fight mypy[1]
The following error will be reported without if:

  error: Name "tomllib" already defined (by an import)

While this can be silenced by a "# type: ignore", in some case[2] mypy
will report the following error:

  error: Unused "type: ignore" comment

[1]: https://github.com/python/mypy/issues/1153
[2]: https://github.com/lilydjwg/nvchecker/actions/runs/4916840821/jobs/8793454970
2023-05-09 13:31:11 +08:00
lilydjwg
62a3f336b8 bump to dev version 2023-05-08 23:28:52 +08:00
lilydjwg
e9ecbd514b silence mypy for new module 2023-05-08 23:28:20 +08:00
Felix Yan
34e87db8f9
Use tomllib on Python 3.11+
Fallback to tomli for compatibility with older Python.
2023-05-08 10:52:27 +03:00
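The tomllib/tomli fallback in the commit above is the standard pattern for TOML parsing across Python versions: `tomllib` entered the standard library in 3.11, and the third-party `tomli` backport exposes the same API for older interpreters. A minimal sketch:

```python
import sys

if sys.version_info >= (3, 11):
    import tomllib  # stdlib since Python 3.11
else:
    import tomli as tomllib  # third-party backport, same API

config = tomllib.loads('key = "value"')
```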
依云
8cd75fd101
Merge pull request #223 from alexfikl/remove-sre-constants
Remove usage of deprecated sre_constants
2023-05-01 17:33:56 +08:00
Alexandru Fikl
fed4fc52ff
test_cran: bump xml2 version to pass test 2023-05-01 09:59:16 +03:00
Alexandru Fikl
1be02fc29c
remove usage of deprecated sre_constants 2023-04-30 14:12:34 +03:00
依云
4a0caf3213
Merge pull request #222 from yan12125/fix-test_alpm
Fix test_alpm in Arch chroots
2023-04-07 21:11:50 +08:00
Chih-Hsuan Yen
166dfb11a4 Fix test_alpm in Arch chroots
Arch chroots with recent devtools has options=(debug) by default, and
thus test_alpm fails:

_________________________ ERROR at setup of test_alpm __________________________

module = <module 'tests.test_alpm' from '/build/nvchecker/src/nvchecker/tests/test_alpm.py'>

    def setup_module(module):
      global temp_dir, db_path
      temp_dir = tempfile.TemporaryDirectory()
      temp_path = pathlib.Path(temp_dir.name)
      pkg_path = temp_path / 'test-pkg'
      pkg_path.mkdir()
      with (pkg_path / 'PKGBUILD').open('w') as f:
        f.write(
          'pkgname=test-pkg\n'
          'pkgver=1.2.3\n'
          'pkgrel=4\n'
          'arch=(any)\n'
          'provides=("test-provides=5.6-7" "test-provides-unversioned")\n'
        )
      subprocess.check_call(['makepkg', '--nosign'], cwd=pkg_path)
      pkg_file = subprocess.check_output(['makepkg', '--packagelist'], cwd=pkg_path, text=True).strip()
      db_path = pkg_path / 'test-db'
      db_path.mkdir()
      repo_path = db_path / 'sync'
      repo_path.mkdir()
>     subprocess.check_call([
        'repo-add',
        repo_path / 'test-repo.db.tar.gz',
        pkg_path / pkg_file
      ])

tests/test_alpm.py:40:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

popenargs = (['repo-add', PosixPath('/tmp/tmp2kl26h_y/test-pkg/test-db/sync/test-repo.db.tar.gz'), PosixPath('/pkgdest/test-pkg-1.2.3-4-any.pkg.tar.zst\n/pkgdest/test-pkg-debug-1.2.3-4-any.pkg.tar.zst')],)
kwargs = {}, retcode = 1
cmd = ['repo-add', PosixPath('/tmp/tmp2kl26h_y/test-pkg/test-db/sync/test-repo.db.tar.gz'), PosixPath('/pkgdest/test-pkg-1.2.3-4-any.pkg.tar.zst\n/pkgdest/test-pkg-debug-1.2.3-4-any.pkg.tar.zst')]

    def check_call(*popenargs, **kwargs):
        """Run command with arguments.  Wait for command to complete.  If
        the exit code was zero then return, otherwise raise
        CalledProcessError.  The CalledProcessError object will have the
        return code in the returncode attribute.

        The arguments are the same as for the call function.  Example:

        check_call(["ls", "-l"])
        """
        retcode = call(*popenargs, **kwargs)
        if retcode:
            cmd = kwargs.get("args")
            if cmd is None:
                cmd = popenargs[0]
>           raise CalledProcessError(retcode, cmd)
E           subprocess.CalledProcessError: Command '['repo-add', PosixPath('/tmp/tmp2kl26h_y/test-pkg/test-db/sync/test-repo.db.tar.gz'), PosixPath('/pkgdest/test-pkg-1.2.3-4-any.pkg.tar.zst\n/pkgdest/test-pkg-debug-1.2.3-4-any.pkg.tar.zst')]' returned non-zero exit status 1.

/usr/lib/python3.10/subprocess.py:369: CalledProcessError
---------------------------- Captured stdout setup -----------------------------
==> Making package: test-pkg 1.2.3-4 (Fri Apr  7 11:53:43 2023)
==> Checking runtime dependencies...
==> Checking buildtime dependencies...
==> Retrieving sources...
==> Extracting sources...
==> Entering fakeroot environment...
==> Tidying install...
  -> Removing libtool files...
  -> Purging unwanted files...
  -> Removing static library files...
  -> Stripping unneeded symbols from binaries and libraries...
  -> Compressing man and info pages...
==> Checking for packaging issues...
==> Creating package "test-pkg"...
  -> Generating .PKGINFO file...
  -> Generating .BUILDINFO file...
  -> Generating .MTREE file...
  -> Compressing package...
==> Leaving fakeroot environment.
==> Finished making: test-pkg 1.2.3-4 (Fri Apr  7 11:53:44 2023)
==> No packages modified, nothing to do.
---------------------------- Captured stderr setup -----------------------------
==> ERROR: File '/pkgdest/test-pkg-1.2.3-4-any.pkg.tar.zst
/pkgdest/test-pkg-debug-1.2.3-4-any.pkg.tar.zst' not found.
2023-04-07 20:38:17 +08:00
lilydjwg
80343945a1 bump version to 2.11 2023-04-06 21:00:56 +08:00
lilydjwg
1bbbdca638 github actions: exclude matrix containing an upstream bug 2023-03-26 20:51:19 +08:00
lilydjwg
414c20a60f update github actions versions 2023-03-26 20:02:39 +08:00
lilydjwg
931b3fdf3c github actions: test against Python 3.11 2023-03-26 18:47:38 +08:00
lilydjwg
577a42ef89 update tests
Microsoft's teams repo has gone.
2023-03-26 18:39:05 +08:00
lilydjwg
addc7d0b52 regex source: cache by url and post_data
so we can search for different regexes for the same web page.

fixes #221.
2023-03-26 17:30:08 +08:00
依云
e71dd309e7
Merge pull request #220 from alexfikl/port-platformdirs
Port from deprecated appdirs to platformdirs
2023-02-02 01:27:05 +08:00
Alexandru Fikl
8366e519d6
port from deprecated appdirs to platformdirs 2023-02-01 17:15:15 +02:00
lilydjwg
2782205efc remove unused # type: ignore 2022-11-20 14:16:33 +08:00
lilydjwg
2107e6ef6f try to fix mypy 2022-11-20 14:14:33 +08:00
lilydjwg
4bace14110 add comment about how to build and upload 2022-11-20 13:50:32 +08:00
lilydjwg
fad7cf631d Merge remote-tracking branch 'origin/pr/212'
Conflicts:
      setup.cfg
      setup.py
2022-11-20 13:42:22 +08:00
lilydjwg
add6951eae new dev version starts 2022-11-20 13:35:54 +08:00
lilydjwg
8ca78f7445 docs: update about default git branch.
[skip ci]
2022-11-08 20:57:10 +08:00
lilydjwg
324f2f5f59 bump version to 2.10 2022-10-30 13:57:00 +08:00
依云
115963ab93
Merge pull request #219 from yan12125/minor-fixes
Several minor fixes for docs and tests
2022-10-29 14:37:55 +08:00
Chih-Hsuan Yen
2fc6be9d4e Update versions in test_{openvsx,vsmarketplace} 2022-10-29 14:22:23 +08:00
Chih-Hsuan Yen
3d3d8c6136 Fix test_{archpkg,pacman}
Apparently the ipw2100-fw package is dropped. I switched to another
package that is rarely updated.
2022-10-29 14:21:02 +08:00
Chih-Hsuan Yen
51ffbbdf61 Fix docs warnings
usage.rst:886: (WARNING/2) Title underline too short.

    Check Visual Studio Code Marketplace
    ~~~~~~~~~~~~~~~
    usage.rst:886: (WARNING/2) Title underline too short.

    Check Visual Studio Code Marketplace
    ~~~~~~~~~~~~~~~
2022-10-29 14:20:07 +08:00
lilydjwg
4c4479d4ea gitea: omit default branch if not specified, fix for blm 2022-08-25 16:46:06 +08:00
依云
a1d2a1325c
Merge pull request #218 from moodyhunter/master
alpm source: support checking packages from multiple repos
2022-08-25 16:32:03 +08:00
Moody Liu
dada15505c
doc: separate 'repo' and 'repos' for alpm source 2022-08-21 15:29:07 +01:00
Moody Liu
706952609e
update alpm documentation: repos 2022-08-21 14:54:34 +01:00
Moody Liu
7b2a464821
alpm: support searching in multiple repos 2022-08-21 14:51:06 +01:00
lilydjwg
13cca483e1 update test_android_sdk.py 2022-08-12 13:52:04 +08:00
lilydjwg
9ee53041b6 catch exceptions from Worker.run 2022-08-12 13:40:33 +08:00
lilydjwg
8b8f2d1d8a nvchecker-notify: try Notify 0.8
fixes #217.
2022-07-16 15:48:59 +08:00
lilydjwg
268bd9125e begin a new version 2022-07-16 15:47:52 +08:00
lilydjwg
2325d2ac74 bump version to 2.9 2022-06-27 20:33:39 +08:00
Vanilla
2de30c0acd vsmarketplace: use better filterType
fixes #216.
2022-06-18 16:27:17 +08:00
lilydjwg
914c6414e8 fix tests trying to raise exc_info=True 2022-06-12 14:22:14 +08:00
lilydjwg
8b32c26d1e htmlparser: handle xpath pointing to element instead of text() 2022-06-12 12:31:11 +08:00
lilydjwg
e683476fb2 avoid the whole process taken down by a single _process_result 2022-06-12 12:22:07 +08:00
lilydjwg
88cbdd925c fix mypy 2022-06-05 14:41:57 +08:00
lilydjwg
ffe84cae08 add support for using awesomeversion to compare versions 2022-06-05 14:31:54 +08:00
lilydjwg
d706961669 update tests 2022-04-21 17:23:20 +08:00
lilydjwg
f2aa4cad20 fix typos 2022-04-21 17:16:34 +08:00
lilydjwg
48f82e819b bump version to 2.9dev 2022-04-21 17:16:23 +08:00
lilydjwg
257b0f63fe packaging issue fixed
It was setuptools-scm that caused a lot of unintended files to be
packaged. That package was required by twine but it's no longer needed
now.
2022-04-11 19:17:39 +08:00
lilydjwg
5605f62269 bump version to 2.8 2022-04-11 18:40:15 +08:00
lilydjwg
69a5d82d25 tests: fix some warnings 2022-04-07 14:06:03 +08:00
lilydjwg
2e042d7576 fix httpx and aiohttp issues
It seems that we no longer need to close the sessions.
2022-04-07 13:44:38 +08:00
lilydjwg
2ea44d3694 make mypy happy 2022-04-02 12:07:20 +08:00
Chih-Hsuan Yen
1235d77e72
Pin werkzeug for httpbin compatibility 2022-04-02 12:03:45 +08:00
Chih-Hsuan Yen
c15e9b7576
android-sdk improvements
* Returns all matched versions to support list options
* Don't hard-code the host OS
* Document the default of `channel`
2022-04-02 11:53:27 +08:00
Dusk Banks
2598b0fd19 tests: pytest-asyncio asyncio_mode -> strict
the `asyncio_mode` of `legacy` (current default) is deprecated.
`asyncio_mode` will be `strict` by default in the future.
2022-03-03 09:56:33 -08:00
Dusk Banks
9005c96d10 introduce tox support 2022-03-03 09:56:33 -08:00
lilydjwg
c6ed37ada1 nvcmp: optionally exit with 4 if there are differences
closes #196.
2022-02-28 14:09:12 +08:00
Dusk Banks
a62866a2d3 tests: spell test_htmlparser correctly 2022-02-22 21:51:47 -08:00
Dusk Banks
15020dfcd6 switch from setup.py to PEP 517 & setup.cfg 2022-02-22 21:51:47 -08:00
Dusk Banks
8b929322cb defensively fix reStructuredText syntax
a ` : ` (space, colon, space) like that can be confused for a definition
list classifier, especially in a definition list, so prefer to
backslash-escape the colon.

see the "reStructuredText Markup Specification",
section "Syntax Details" > "Body Elements" > "Definition Lists".
2022-02-22 21:51:47 -08:00
lilydjwg
f4e9ab80b9 Merge remote-tracking branch 'origin/pr/211' 2022-02-23 11:11:51 +08:00
lilydjwg
ff1c980456 bump to version 2.8dev 2022-02-23 11:11:29 +08:00
Pekka Ristola
e03023bd6c
add CRAN support 2022-02-22 19:06:15 +02:00
Guizai
7739f5c8c0
fix: source sparkle (#210)
fix: source sparkle
2022-02-22 20:13:18 +08:00
lilydjwg
3dff0466d1 bump version to 2.7 2022-02-18 19:36:44 +08:00
lilydjwg
d8c26fa666 minor update 2022-02-13 14:36:13 +08:00
Tom Rathborne
ea01b6d9ee
Add all option to cmp (#209)
Add --all option to nvcmp to get all packages, even with no version change
2022-02-13 14:34:32 +08:00
依云
b61fbe5a37
Merge pull request #208 from trathborne/Add-JSON-output-to-cmp
Split comparison process from output and provide JSON output
2022-02-10 16:59:53 +08:00
Tom Rathborne
6d8f00036d Handle sort=none better 2022-02-10 09:45:43 +01:00
Tom Rathborne
9ea6f37712
Merge branch 'lilydjwg:master' into Add-JSON-output-to-cmp 2022-02-09 16:11:41 +01:00
Tom Rathborne
3e88d74043 Split comparison process from output and provide JSON output for automated tooling 2022-02-09 13:27:20 +01:00
lilydjwg
da1d204307 bump version to 2.7dev 2022-02-08 14:42:04 +08:00
lilydjwg
222e3e9b4c remove old script maintained_in_aur 2022-02-08 14:41:51 +08:00
依云
acb56d7827
Merge pull request #206 from alexfikl/bitbucket-order-tags
Bitbucket: Add support for queries and sorting
2022-02-08 14:40:59 +08:00
Alexandru Fikl
4bf2755b0e
bitbucket: use 'fields' query to restrict returned fields 2022-02-06 10:17:49 -06:00
Alexandru Fikl
a8228bb594
bitbucket: use querying and sorting for improved tag search 2022-02-04 10:31:41 -06:00
Alexandru Fikl
4f06ce7862
bitbucket: add types 2022-02-04 09:53:03 -06:00
lilydjwg
31492c4eb0 bump to version 2.6.1 2022-02-04 16:57:42 +08:00
lilydjwg
d5920efa54 add a test to test the whole command tool 2022-02-04 16:54:13 +08:00
lilydjwg
ac1aafc9f1 use asyncio.get_event_loop() for Python < 3.10
fixes #207.
2022-02-04 16:53:56 +08:00
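The version split in the commit above reflects an asyncio transition: `asyncio.get_event_loop()` is deprecated as an implicit loop creator on newer Pythons, while `asyncio.run()` had issues for nvchecker's use case on older ones (see #207). A sketch of the version-conditional entry point:

```python
import asyncio
import sys

async def main():
    return 42

if sys.version_info >= (3, 10):
    # Modern path: asyncio.run creates and closes the loop itself.
    result = asyncio.run(main())
else:
    # Older path: reuse the implicitly created event loop.
    loop = asyncio.get_event_loop()
    result = loop.run_until_complete(main())
```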
Alexandru Fikl
dd87c0e80c
update bitbucket doc links 2022-02-02 14:27:14 -06:00
Alexandru Fikl
b6eef8a54a
bitbucket: order tags by date 2022-02-02 14:17:51 -06:00
lilydjwg
2f61336ce3 bump version to 2.6 2022-02-02 11:20:52 +08:00
lilydjwg
bc2c2edfde update test_android_sdk.py 2022-01-20 17:22:24 +08:00
lilydjwg
1fb3c970cf fix types and tests 2022-01-20 16:38:10 +08:00
lilydjwg
112c916a6d add --failures cmdline flag to indicate there are errors during checking
closes #205.
2022-01-20 14:45:21 +08:00
lilydjwg
83286263d2 [mypy] remove some unused type: ignore 2021-12-29 17:15:29 +08:00
lilydjwg
5e60a1dc86 replace the toml library with tomli 2021-12-29 16:36:23 +08:00
lilydjwg
8d4982d440 try asyncio.run again
because asyncio.get_event_loop() is deprecated.
2021-12-14 12:58:55 +08:00
lilydjwg
91cbed1f8e httpx: allow_redirects becomes follow_redirects 2021-11-06 10:59:48 +08:00
lilydjwg
b8b6e47d59 update test_apt 2021-11-06 10:57:41 +08:00
lilydjwg
f3be1c585f regex: make sure we don't have more than one group in the regex 2021-11-06 10:44:10 +08:00
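A single capture group picks the version out of each match, but a second group makes the result ambiguous, so rejecting such patterns up front gives a clear error. A hedged sketch of that guard (compile_pattern is a hypothetical name):

```python
import re

def compile_pattern(regex: str) -> re.Pattern:
    # re.Pattern.groups counts capture groups; more than one means
    # it's unclear which group holds the version, so fail early.
    pattern = re.compile(regex)
    if pattern.groups > 1:
        raise ValueError('multi-group regex is not supported')
    return pattern
```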
lilydjwg
f8be3e7418 doc: \ in ` ` should not be escaped
Vim syntax has an error with this.
[skip ci]
2021-11-02 11:23:03 +08:00
lilydjwg
e93f91af57 doc: remove an erroneous `
[skip ci]
2021-11-02 11:20:41 +08:00
lilydjwg
d24f64a4f0 doc: update example to use toml syntax
fixes #204.
[skip ci]
2021-11-02 11:11:55 +08:00
lilydjwg
ceb573f337 Python 3.10 2021-10-07 14:46:13 +08:00
lilydjwg
3ef83b0a95 bump version to 2.5 2021-10-03 16:48:44 +08:00
依云
750999f397
Merge pull request #201 from yuyichao/post
Support HTTP POST request in the htmlparser and regex source
2021-09-07 13:13:19 +08:00
Yichao Yu
c43d4e900f
Support HTTP POST request in the htmlparser and regex source
Example:

    source = "regex"
    regex = "spcm_linux_libs_v[0-9a-zA-Z]*"
    url = "https://spectrum-instrumentation.com/spcm_downloads_downloads_ajax"
    post_data = "series%5B%5D=273&families%5B%5D=475"
2021-09-07 00:40:23 -04:00
Yichao Yu
c65a5343ed
httpclient: Support arbitrary request body
The body will be supplied as a string. This is the only method I can find
that is supported by all the backends as well as the config file.

Make sure all the backends have the same default content types.
2021-09-05 16:23:26 -04:00
依云
4d6e87a975
Merge pull request #200 from yan12125/fix-regex
regex: make sure regex fails if nothing is found
2021-08-31 20:18:52 +08:00
Chih-Hsuan Yen
9acae25c0f
regex: make sure regex fails if nothing is found
This is a regression since 2019 [1].
Before that commit, max() raised ValueError if re.findall returned an
empty list. After that commit, regex fails silently if nothing is found.

[1] 7897317294 (diff-fd40f1295f9ddece86df8d2c385ddb02bd4e2a10cd20bd9a95306d3d35a9b601)
2021-08-31 20:07:06 +08:00
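The regression and its fix come down to how an empty findall result is handled. A minimal sketch (find_version is a hypothetical name; the real source integrates with nvchecker's result plumbing):

```python
import re

def find_version(regex: str, body: str) -> str:
    # re.findall returns [] when nothing matches. Before the 2019
    # refactor, max([]) raised ValueError and surfaced the problem;
    # raising explicitly restores a visible failure.
    matches = re.findall(regex, body)
    if not matches:
        raise RuntimeError('no version found by regex')
    return max(matches)
```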
依云
f443fa8e96
Merge pull request #199 from yan12125/update-tests
Update tests
2021-08-22 23:20:31 +08:00
Chih-Hsuan Yen
b8afc43430
Update tests
* Use a seemingly never-updated container for testing container
  registries other than docker.io
* Update versions for openvsx and vsmarketplace
2021-08-22 22:43:33 +08:00
依云
20183a83e5
Merge pull request #198 from yan12125/relax-test-anitya
anitya: relax the test - shutter is updated often these days
2021-08-07 15:13:18 +08:00
Chih-Hsuan Yen
44b9f1856f
anitya: relax the test - shutter is updated often these days 2021-08-07 14:37:18 +08:00
lilydjwg
02515fc82d cleanup dependencies and update setup.py
closes #197
2021-07-29 13:13:00 +08:00
lilydjwg
b70de0a628 version 2.4 2021-07-28 19:13:08 +08:00
lilydjwg
4643652eaf update keyfile.toml.enc
[skip ci]
2021-07-27 16:06:39 +08:00
lilydjwg
ad24b841cf github actions: run apt update 2021-07-25 19:49:00 +08:00
lilydjwg
494c4ddf67 alpm: make repo option optional 2021-07-25 19:38:26 +08:00
lilydjwg
11aea624e6 update docs about parse_version
[skip ci]
2021-07-18 21:20:20 +08:00
lilydjwg
ddd99bd59f tweaks for mypy 2021-07-18 17:24:03 +08:00
lilydjwg
9a42ed9503 alpm: make sure we get the repo config 2021-07-18 17:14:18 +08:00
lilydjwg
606b3f65fb vendor packaging.version for our own use
because LegacyVersion is deprecated.
closes #180.
2021-07-18 17:12:06 +08:00
lilydjwg
151b03801f update tests 2021-07-14 22:45:13 +08:00
Th3Whit3Wolf
25d7a7498d Add Visual Studio Code Marketplace 2021-07-14 22:45:13 +08:00
Th3Whit3Wolf
ad2dc19415 Add OpenVSX source 2021-07-14 13:05:45 +01:00
lilydjwg
46d724b465 update sparkle 2021-07-01 19:47:47 +08:00
lilydjwg
f8dd1a4212 docs: add an inline sample config
also fix the code block display
2021-07-01 19:38:17 +08:00
依云
0f44759b35
Merge pull request #193 from murlakatamenka/improve-docs
Docs: add sample keyfile.toml
2021-07-01 19:27:23 +08:00
Sergey A
822eb76a1c Docs: add sample keyfile.toml 2021-07-01 10:46:14 +03:00
lilydjwg
92e422c726 add doc for the cratesio source
closes #192
[skip ci]
2021-07-01 15:04:56 +08:00
lilydjwg
6fd3ba95ba apt: handle multiple versions correctly
closes #191
2021-06-25 15:22:40 +08:00
lilydjwg
4c4b770c27 docs: clarify behavior when from_pattern is not matched
[skip ci]
2021-06-24 11:18:35 +08:00
lilydjwg
ffaca8c949 option to not verify HTTPS certificate
fixes #190.
2021-06-23 16:08:47 +08:00
lilydjwg
3eee6480ab docs: remove trailing space
[skip ci]
2021-06-11 16:19:49 +08:00
Maud LAURENT
9deb55806d
Add optional token for regex and httpheader. Add htmlparser source (#182)
add an httptoken option; add htmlparser with xpath source

Co-authored-by: Maud LAURENT <maud.laurent@eurecia.com>
Co-authored-by: lilydjwg <lilydjwg@gmail.com>
2021-06-11 16:18:39 +08:00
lilydjwg
4033c0b9ba fix a typo in the docs
[skip ci]
2021-06-08 15:11:20 +08:00
lilydjwg
3c2abb01f0 update docs for combiner
[skip ci]
2021-06-08 15:07:44 +08:00
lilydjwg
ae506ba9cf add a combiner source and the underlying mechanism to wait for other entries' results 2021-06-08 14:55:57 +08:00
lilydjwg
d83d8d5367 aur: avoid the asyncio.wait warning 2021-06-06 12:26:53 +08:00
lilydjwg
943847da74 container: fix test 2021-06-06 12:26:44 +08:00
lilydjwg
68fcfe3924 container: add pagination
fixes #189
2021-06-06 12:13:07 +08:00
依云
6f7633a93c
Merge pull request #188 from yan12125/fix-test-pacman
Fix test_pacman in some scenarios
2021-06-03 12:38:14 +08:00
Chih-Hsuan Yen
1064f3d948
Update test_anitya.py
Shutter is now actively developed :)
2021-06-03 12:33:07 +08:00
Chih-Hsuan Yen
61bc0c5562
Fix test_pacman in some scenarios
For example, on reproducible.archlinux.org, apparently pacman databases
are not synced before building packages, and thus packages cannot be
found. From https://reproducible.archlinux.org/api/v0/builds/110565/log,

=================================== FAILURES ===================================
_________________________________ test_pacman __________________________________

get_version = <function get_version.<locals>.__call__ at 0x7f2fa5ed1310>

    async def test_pacman(get_version):
>       assert await get_version("ipw2100-fw", {
            "source": "pacman",
        }) == "1.3-10"

tests/test_pacman.py:11:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/conftest.py:46: in __call__
    newvers = await run(entries)
tests/conftest.py:40: in run
    return await main.run(result_coro, runner_coro)
nvchecker/__main__.py:92: in run
    return await result_fu
nvchecker/core.py:361: in process_result
    r1 = _process_result(r)
nvchecker/core.py:322: in _process_result
    logger.error(version.msg, **kw)
/usr/lib/python3.9/site-packages/structlog/_log_levels.py:118: in meth
    return self._proxy_to_logger(name, event, **kw)
/usr/lib/python3.9/site-packages/structlog/_base.py:198: in _proxy_to_logger
    args, kw = self._process_event(method_name, event, event_kw)
/usr/lib/python3.9/site-packages/structlog/_base.py:155: in _process_event
    event_dict = proc(self._logger, method_name, event_dict)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

logger = <PrintLogger(file=<_io.TextIOWrapper name="<_io.FileIO name=6 mode='rb+' closefd=True>" mode='r+' encoding='utf-8'>)>
method_name = 'error'
event_dict = {'cmd': "LANG=C pacman -Si ipw2100-fw | grep -F Version | awk '{print $3}' | head -n 1", 'error': "warning: database f...r: package 'ipw2100-fw' was not found", 'event': 'command exited without output', 'logger_name': 'nvchecker.core', ...}

    def proc(logger, method_name, event_dict):
      if method_name in ('warning', 'error'):
        if 'exc_info' in event_dict:
          raise event_dict['exc_info']
        if not event_dict['event'].startswith(('rate limited', 'no-result')):
>         raise RuntimeError(event_dict['event'])
E         RuntimeError: command exited without output

tests/conftest.py:85: RuntimeError
----------------------------- Captured stdout call -----------------------------
running cmd
__________________________ test_pacman_strip_release ___________________________

get_version = <function get_version.<locals>.__call__ at 0x7f2fa5ed1310>

    async def test_pacman_strip_release(get_version):
>       assert await get_version("ipw2100-fw", {
            "source": "pacman",
            "strip_release": 1,
        }) == "1.3"

tests/test_pacman.py:16:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/conftest.py:46: in __call__
    newvers = await run(entries)
tests/conftest.py:40: in run
    return await main.run(result_coro, runner_coro)
nvchecker/__main__.py:92: in run
    return await result_fu
nvchecker/core.py:361: in process_result
    r1 = _process_result(r)
nvchecker/core.py:322: in _process_result
    logger.error(version.msg, **kw)
/usr/lib/python3.9/site-packages/structlog/_log_levels.py:118: in meth
    return self._proxy_to_logger(name, event, **kw)
/usr/lib/python3.9/site-packages/structlog/_base.py:198: in _proxy_to_logger
    args, kw = self._process_event(method_name, event, event_kw)
/usr/lib/python3.9/site-packages/structlog/_base.py:155: in _process_event
    event_dict = proc(self._logger, method_name, event_dict)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

logger = <PrintLogger(file=<_io.TextIOWrapper name="<_io.FileIO name=6 mode='rb+' closefd=True>" mode='r+' encoding='utf-8'>)>
method_name = 'error'
event_dict = {'cmd': "LANG=C pacman -Si ipw2100-fw | grep -F Version | awk '{print $3}' | head -n 1", 'error': "warning: database f...r: package 'ipw2100-fw' was not found", 'event': 'command exited without output', 'logger_name': 'nvchecker.core', ...}

    def proc(logger, method_name, event_dict):
      if method_name in ('warning', 'error'):
        if 'exc_info' in event_dict:
          raise event_dict['exc_info']
        if not event_dict['event'].startswith(('rate limited', 'no-result')):
>         raise RuntimeError(event_dict['event'])
E         RuntimeError: command exited without output

tests/conftest.py:85: RuntimeError
----------------------------- Captured stdout call -----------------------------
running cmd
2021-06-03 12:11:41 +08:00
lilydjwg
8db5c6a938 doc: we may need a newer rtd theme 2021-05-23 17:47:12 +08:00
lilydjwg
c65513b37d doc: clarify options of the container source
See also: https://github.com/lilydjwg/nvchecker/issues/183
2021-05-23 17:38:19 +08:00
依云
9d2d47ed15
Merge pull request #181 from yan12125/update-tests
Update tests
2021-05-06 13:07:47 +08:00
Chih-Hsuan Yen
134a6885f1
Update tests 2021-05-06 12:44:55 +08:00
依云
d7c553ae94
Merge pull request #179 from marsam/fix-test_httpheader
tests: add needs_net marker to httpheader test
2021-03-21 11:19:20 +08:00
Mario Rodas
7366d82bfc tests: add needs_net marker to httpheader test 2021-03-20 04:20:00 +00:00
lilydjwg
1b7736a549 bump version to 2.4dev and fix docs/*.py being included in dist tarball 2021-03-19 13:51:48 +08:00
lilydjwg
46da41147f version 2.3
update docs for gitlab source.
2021-03-19 13:41:26 +08:00
lilydjwg
f8f261c476 gitlab: omit default branch if not specified, fix for blm 2021-03-18 13:19:05 +08:00
依云
968af3df0b
Merge pull request #178 from yan12125/test-sortversion
Use pytest for sortversion tests
2021-02-21 17:37:07 +08:00
Chih-Hsuan Yen
bf8a4c9298
Use pytest for sortversion tests 2021-02-21 15:24:01 +08:00
依云
bace59de78
Merge pull request #177 from yan12125/patch-1
Fix tests
2021-02-14 08:50:45 +08:00
Chih-Hsuan Yen
e6e8573c33
Fix tests
Upstream commits: https://salsa.debian.org/med-team/golang-github-dataence-porter2/-/commits/master
2021-02-13 23:30:48 +08:00
lilydjwg
de0a5fe02e github actions: omit pypy tests
continue-on-error doesn't work as expected and there is no way to
"allow-failure". See https://github.com/actions/toolkit/issues/399.
2021-02-01 17:58:14 +08:00
lilydjwg
27269534d2 update test status badge 2021-02-01 17:43:55 +08:00
lilydjwg
0114a411f2 switch tests from travis to github actions
because travis queues are too long.
2021-02-01 15:17:44 +08:00
依云
d2a5a1bf79
Merge pull request #176 from aminvakil/master
Correct package name in httpheader test
2021-01-29 16:58:47 +08:00
Amin Vakil
d2130b64c6
Correct package name in httpheader test 2021-01-29 12:18:00 +03:30
lilydjwg
29f2bad400 tests: use another url because the jmeter-plugins.org one has issues with Tornado's simple HTTP client. 2021-01-29 16:11:11 +08:00
lilydjwg
305c329e02 add a httpheader source
see #175 for use cases.
2021-01-28 18:05:58 +08:00
lilydjwg
9c2d2a39d7 httpclient: add follow_redirects argument 2021-01-28 17:56:18 +08:00
lilydjwg
2e069fe1c7 add headers to HTTP Response objects and support for HTTP HEAD method 2021-01-28 17:43:09 +08:00
lilydjwg
8b9bd1ca1c Merge remote-tracking branch 'origin/pr/173' 2021-01-17 20:21:06 +08:00
Chih-Hsuan Yen
91daf5ad4d
Make the generated man page reproducible 2021-01-17 17:20:10 +08:00
lilydjwg
e09750b7a2 pypi source: use packaging.version.Version directly 2021-01-15 15:40:23 +08:00
lilydjwg
491a71add7 git source: support fetching commit hash on a branch 2020-12-24 22:01:22 +08:00
lilydjwg
a0c32ce5f0 travis: test on Python 3.9 2020-12-24 22:01:22 +08:00
Felix Yan
f00c6d163f
Add a mercurial source
This uses the json api of mercurial repositories. Verified to work on
the following hosting sites:

www.mercurial-scm.org/repo
hg.code.sf.net
hg.nginx.org
hg.mozilla.org
2020-11-18 01:28:45 +08:00
依云
d053fc3ba4
Merge pull request #169 from felixonmars/limit-entry
Add a command line option to run nvchecker only on a specific entry
2020-11-17 11:44:06 +08:00
Felix Yan
e6c0ce38e5
Add a command line option to run nvchecker only on a specific entry
This is useful for debugging purposes, when your nvchecker.toml is very
large.
2020-11-17 06:34:53 +08:00
lilydjwg
e5a28b0fb1 bump version to 2.3dev 2020-11-10 23:27:55 +08:00
lilydjwg
372454136f tests/test_alpm: add --nosign and remove an unused import 2020-11-10 23:27:09 +08:00
DDoSolitary
5622d6d0c1
add ALPM source 2020-11-09 05:07:14 +08:00
lilydjwg
630f41c19b docs: a missed - -> _ conversion
[skip ci]
2020-11-05 00:11:03 +08:00
lilydjwg
64994e4a74 version 2.2 2020-11-03 14:29:45 +08:00
lilydjwg
b4689369c8 repology: return a list of versions instead
to support list options. Returning the first isn't usually desirable.
2020-10-30 17:57:31 +08:00
Jakub Ružička
024849c242 repology: add support for subrepo
Without this, querying latest Fedora packages is impossible because they
reside in 'updates' subrepo but old 'release' versions are returned.
Same thing for openSUSE.

Example config to query Fedora 32 latest packages in updates subrepo:

[somepackage]
source = "repology"
repo = "fedora_32"
subrepo = "updates"
2020-10-30 06:06:15 +01:00
lilydjwg
b83f92cc67 give better error messages when failing to load file 2020-10-24 16:31:18 +08:00
lilydjwg
a8073e2ad5 usage.rst: mention the `git source in 2.x changes on removed vcs` one 2020-10-24 16:22:00 +08:00
lilydjwg
cddf866ef6 tests/test_archpkg: use "dbus" to test "provided"
This package provides a package and a shared object (.so).
2020-10-23 17:06:44 +08:00
依云
c8b0e673ca
Merge pull request #166 from DDoSolitary/patch-archpkg-provided
fix the provided option for unversioned "provides"
2020-10-23 17:05:16 +08:00
DDoSolitary
6815b035a1
fix the provided option for non-soname "provides"
this fixes #165
2020-10-23 03:35:11 +08:00
依云
30055a9dcb
Merge pull request #164 from DDoSolitary/patch-gems-list
Add list options support for gems source
2020-10-22 00:35:59 +08:00
DDoSolitary
ab66fc0468
Add list options support for gems source 2020-10-22 00:10:45 +08:00
依云
8d1a079703
Merge pull request #163 from yan12125/android-sdk-channel
Add `channel` for Android SDK packages
2020-10-18 00:16:34 +08:00
Chih-Hsuan Yen
ea1200126b
Add channel for Android SDK packages 2020-10-10 13:19:36 +08:00
lilydjwg
d4b07d67b8 Merge branch 'master' of github.com:lilydjwg/nvchecker into master 2020-10-09 19:14:56 +08:00
lilydjwg
f8566f3f91 also install the nvchecker-notify script 2020-10-09 19:11:56 +08:00
lilydjwg
8a988ba3a3 nvchecker-notify: add the keyfile flag 2020-10-09 19:11:12 +08:00
依云
441f2a9fbd
Merge pull request #162 from yan12125/container
Add container source plugin
2020-10-07 21:14:54 +08:00
Chih-Hsuan Yen
75e72c11b3
Add container source plugin
Closes https://github.com/lilydjwg/nvchecker/issues/59
2020-10-07 19:17:48 +08:00
Chih-Hsuan Yen
8d83d7ac66
Add exception HTTPError for HTTP 4xx errors 2020-10-06 22:21:56 +08:00
Chih-Hsuan Yen
f957acc756
Use my formal English name for previous contributions 2020-10-06 22:17:47 +08:00
lilydjwg
42dfd483c7 add nvchecker-notify
which mimics `nvchecker -n` in 1.x era. See #158.

[skip ci]
2020-10-06 11:55:27 +08:00
lilydjwg
e080e32c89 remove old scripts 2020-10-06 11:54:15 +08:00
lilydjwg
926e75c075 fix versions sorting 2020-10-06 10:18:21 +08:00
lilydjwg
71612ad9b8 sort and indent version record files
See #161.
2020-10-06 10:07:20 +08:00
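Sorting keys and indenting the JSON record keeps the files stable and diff-friendly. A minimal sketch of the idea, assuming a hypothetical dump_versions helper:

```python
import json

def dump_versions(versions: dict) -> str:
    # sort_keys makes repeated runs produce byte-identical output for
    # the same data; indent makes per-package diffs readable (see #161).
    return json.dumps(versions, sort_keys=True, indent=2) + '\n'
```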
Felix Yan
108016aa45 Fix documentation for git source (fixes #159) 2020-10-06 10:01:25 +08:00
lilydjwg
55c4cd9868 release version 2.1 2020-10-05 23:57:30 +08:00
lilydjwg
7d4b2daa65 make `make man` work
[skip ci]
2020-10-05 12:26:48 +08:00
lilydjwg
ccae7d8fcf docs: add a script to convert usage.rst to a man page
[skip ci]
2020-10-04 21:18:27 +08:00
依云
c251af817a Create codeql-analysis.yml 2020-10-02 17:24:34 +08:00
lilydjwg
b3bc7ec049 exclude docs/ in setup.py
[skip ci]
2020-10-02 13:03:12 +08:00
lilydjwg
e9190df7c2 docs: try to make things clearer
[skip ci]
2020-10-02 12:54:03 +08:00
lilydjwg
494c67beb0 Merge remote-tracking branch 'origin/pr/155' into master 2020-10-02 12:38:13 +08:00
lilydjwg
54596bde3f well, let's start version 2.1 2020-10-02 12:37:57 +08:00
lilydjwg
854399a30a rename sample_source.toml to sample_config.toml
source now refers to where to get information from, e.g. "source plugins".
2020-10-02 12:35:37 +08:00
Felix Yan
071b57c580
Allow empty to_pattern
It's useful to have a partially matched pattern replaced by an empty
string, like stripping a suffix. Let's allow it, as the `not to_pattern`
check would reject this.
2020-10-02 04:39:57 +08:00
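The pitfall here is a classic Python truthiness bug: an empty string is falsy, so a bare `not to_pattern` check cannot tell "configured as empty" from "not configured at all". A sketch of the distinction (both function names are hypothetical):

```python
def has_to_pattern_buggy(conf: dict) -> bool:
    # Truthiness check: to_pattern = "" looks the same as missing.
    return bool(conf.get('to_pattern'))

def has_to_pattern_fixed(conf: dict) -> bool:
    # Presence check: an explicit empty string is still a valid value.
    return conf.get('to_pattern') is not None
```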
lilydjwg
0eca4a9b43 releasing version 2.0 2020-10-01 14:41:35 +08:00
lilydjwg
4cb63cbe65 test_gitea: remove some cases that go out of date too frequently 2020-09-28 16:12:22 +08:00
lilydjwg
4f3a900505 setup httpclient from configuration options
closes #76
2020-09-28 15:09:51 +08:00
lilydjwg
254a229401 fix support for Python 3.7 2020-09-24 18:56:14 +08:00
lilydjwg
cd1cbfde30 support per source plugin configuration and support different registry for the npm source 2020-09-24 18:27:30 +08:00
依云
121b94a7eb
Merge pull request #153 from felixonmars/git
Add a git source
2020-09-24 16:44:50 +08:00
Felix Yan
3ffb34257a
Use list comprehension instead of map lambda 2020-09-24 16:13:45 +08:00
Felix Yan
7213d84056
Avoid using regex at all 2020-09-24 05:45:09 +08:00
Felix Yan
d7624defd7
Fix mypy and regex flag 2020-09-24 04:51:28 +08:00
Felix Yan
c9b689d67b
Add a git source
It's a thin wrapper around the cmd source, and reuses its get_cmd
function.
2020-09-24 04:26:12 +08:00
依云
e6ecb753d6
Merge pull request #152 from felixonmars/fix-typo
Correct a typo in usage.rst
2020-09-23 23:11:57 +08:00
Felix Yan
050f440465
Correct a typo in usage.rst 2020-09-23 22:41:14 +08:00
依云
a78c06163c
Merge pull request #151 from felixonmars/npm-range-request
npm: Fetch only first 1KB
2020-09-23 20:54:31 +08:00
Felix Yan
71a63f90f6
npm: Fetch only first 1KB
The result is still large even after specifying
"Accept: application/vnd.npm.install-v1+json". Since the content we need
is always at the beginning of the response, let's make it more violent.
2020-09-23 20:04:02 +08:00
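The trick behind this commit is an HTTP Range request plus a tolerant parse, since the first kilobyte of the registry response is truncated JSON that json.loads would reject. A hedged sketch of the idea (names are hypothetical, not nvchecker's actual code):

```python
import re

# Ask the server for only the first 1KB of the response body.
RANGE_HEADER = {'Range': 'bytes=0-1023'}

def latest_from_prefix(prefix: str) -> str:
    # The dist-tags appear near the start of npm's abbreviated
    # metadata, so a regex over the truncated prefix is enough.
    m = re.search(r'"latest"\s*:\s*"([^"]+)"', prefix)
    if not m:
        raise ValueError('dist-tags not found in the first 1KB')
    return m.group(1)
```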
lilydjwg
07951bd745 httpx update to 0.15.0 2020-09-23 17:11:56 +08:00
Felix Yan
23bcbf8906
npm: Request abbreviated metadata to reduce download size
Ref:
https://github.com/npm/registry/blob/master/docs/responses/package-metadata.md
2020-09-23 16:51:09 +08:00
依云
c03e33db56
Merge pull request #148 from felixonmars/apt-parse
apt: Parse Packages and cache it to improve performance
2020-09-21 19:11:32 +08:00
Felix Yan
82d5ebb630
apt: Parse Packages and cache it to improve performance
It's a very noticeable 20~40s to 1s improvement here.
2020-09-21 18:49:23 +08:00
lilydjwg
ac73106e43 apt: minor update 2020-09-20 23:09:40 +08:00
lilydjwg
185a7e88a9 use asyncio.Semaphore instead of self-made queue 2020-09-20 22:15:54 +08:00
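asyncio.Semaphore gives the same concurrency cap as a hand-rolled task queue with far less code: each check acquires a slot before running and releases it on exit. A minimal sketch under assumed names (gated, run_all):

```python
import asyncio

async def gated(sem: asyncio.Semaphore, coro):
    # At most `limit` coroutines hold the semaphore at once; the rest
    # wait here instead of sitting in a self-made queue.
    async with sem:
        return await coro

async def run_all(coros, limit: int = 20):
    sem = asyncio.Semaphore(limit)
    return await asyncio.gather(*(gated(sem, c) for c in coros))
```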
lilydjwg
f4983eaea3 apt: run decompress in an executor to give other tasks time to run 2020-09-20 22:02:18 +08:00
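Decompressing a Packages file is CPU-bound, so doing it inline would stall every other pending check on the event loop. The standard pattern, sketched here with lzma as an assumed example codec:

```python
import asyncio
import lzma

async def decompress(data: bytes) -> bytes:
    # run_in_executor(None, ...) hands the blocking call to the default
    # thread pool, so the event loop stays free to service other tasks.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, lzma.decompress, data)
```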
lilydjwg
450fa7de10 update test_gitea 2020-09-17 13:20:38 +08:00
依云
2502b0babc
Merge pull request #147 from felixonmars/nvcmp-sort-and-newer
Add --sort and --newer to nvcmp
2020-09-17 13:17:55 +08:00
依云
ef808b7517
Merge pull request #146 from felixonmars/apt
Add an APT source
2020-09-17 13:17:33 +08:00
Felix Yan
d380e31881
Fix srcpkg test 2020-09-17 12:55:30 +08:00
Felix Yan
a018f71186
Make mirror and suite mandatory 2020-09-17 12:53:03 +08:00
Felix Yan
a2b03867a9 Add --sort and --newer to nvcmp
The arrow is shown as backwards if the new version is older than the old
version. Also added a switch --newer to hide those with an older new
version.
2020-09-17 12:49:26 +08:00
Felix Yan
da5b489173
Use srcpkg instead of source_pkg 2020-09-16 21:19:38 +08:00
Felix Yan
b620ed4e90
Support source package too 2020-09-16 18:57:50 +08:00
Felix Yan
819a8461a4
Use deb.debian.org instead of ftp.debian.org 2020-09-16 17:41:45 +08:00
Felix Yan
ae563d007f
Add an APT source
This source follows the same flow apt itself uses to check for a
package's version, so it can be used to check for any APT repository
(not necessarily Debian or Ubuntu).
2020-09-16 17:20:17 +08:00
lilydjwg
e744a27572 use dedicated exception for loading errors
fixes #145
2020-09-11 15:11:50 +08:00
依云
f28cc7ae61
Merge pull request #144 from felixonmars/patch-2
Correct a typo in util.py
2020-09-09 07:01:39 +08:00
Felix Yan
f8444926ba
Correct a typo in util.py 2020-09-09 02:39:44 +08:00
依云
61e53b6698
Merge pull request #141 from felixonmars/nvtake-to-specific-version
Allow nvtake to update to a specific version
2020-09-03 19:30:59 +08:00
lilydjwg
fbb7484b32 docs: try newer sphinx
type annotations don't link. :meta private: doesn't hide. async is
absent.

[ci skip]
2020-09-03 17:52:30 +08:00
lilydjwg
daae6b5341 readthedocs doesn't have curl-config, try without 2020-09-03 17:02:08 +08:00
lilydjwg
bf8baa2c21 httpclient: fix headers being modified across invocations 2020-09-03 17:00:04 +08:00
lilydjwg
b9ac0794aa add docs/requirements.txt for readthedocs 2020-09-03 16:55:18 +08:00
lilydjwg
cf42eab8c2 README: add documentation badge 2020-09-03 16:53:29 +08:00
lilydjwg
03f99944cf remove the NEW file
because it's done!
2020-09-03 16:42:48 +08:00
lilydjwg
1ceea05e15 trim down README and link the docs from readthedocs
It doesn't link well on GitHub.
2020-09-03 16:39:51 +08:00
lilydjwg
7bae8ece16 tests/test_debianpkg: update 2020-09-03 16:30:32 +08:00
lilydjwg
717c3e2367 Merge branch 'master' of github.com:lilydjwg/nvchecker into master 2020-09-03 16:27:57 +08:00
lilydjwg
8c99ae8eea add documentation for plugins 2020-09-03 16:27:35 +08:00
依云
0397ea5cc5
Merge pull request #143 from felixonmars/f-string
Fix f-string in github source
2020-09-03 10:33:43 +08:00
Felix Yan
52ab39d92e
Fix f-string in github source 2020-09-03 04:38:14 +08:00
依云
8e277e1528
Merge pull request #142 from felixonmars/f-string
Use f-string for better readability
2020-09-03 02:36:36 +08:00
Felix Yan
30fe6e6d49
Use f-string for better readability
Since we are targeting 3.7+, it should be safe to use them now.
2020-09-03 00:51:46 +08:00
Felix Yan
3591bd8ce8
Allow nvtake to update to a specific version
This is useful when the new version is somehow broken and you currently
can only update the software to a version in between. It's also useful
when you ran nvtake by accident and want to reset it back to the correct
old version.
2020-09-03 00:17:14 +08:00
lilydjwg
0eb5860584 Merge remote-tracking branch 'origin/pr/140' into master 2020-09-02 23:15:34 +08:00
Felix Yan
2604b8377c
Make nvcmp output colorful
Also refactored nicelogger to reuse the logic there.
2020-09-02 17:37:48 +08:00
lilydjwg
36358c889a Merge remote-tracking branch 'origin/pr/139' into master 2020-09-02 14:45:35 +08:00
lilydjwg
61daa29d48 fix aiohttp_httpclient not posting json 2020-09-02 14:40:38 +08:00
lilydjwg
8c4b5bbe57 travis-ci: use keyfile when KEY is set 2020-09-01 21:51:47 +08:00
依云
8a892f3909
Merge pull request #138 from felixonmars/fix-typo
Fix typos in README
2020-09-01 21:40:43 +08:00
Felix Yan
042217eee2
Complete only those having different versions and merge into nvcmp 2020-09-01 20:33:41 +08:00
Felix Yan
465b9ee4dd
Add bash completion support for nvtake 2020-09-01 20:15:45 +08:00
Felix Yan
d238a219f3
Fix typos in README 2020-09-01 18:59:41 +08:00
依云
ade7d53a1a
Merge pull request #137 from felixonmars/pagure
Add support for Pagure
2020-09-01 18:28:42 +08:00
Felix Yan
7f64ffa12a
Fine tunes 2020-09-01 18:13:56 +08:00
Felix Yan
a7da4207ae
Add support for Pagure 2020-09-01 18:11:57 +08:00
lilydjwg
0cfaac774d update README
[ci skip]
2020-09-01 17:02:38 +08:00
lilydjwg
0d8b65a9c0 missing updates about the new "appdirs" dep 2020-08-31 17:50:11 +08:00
lilydjwg
25b3eb5261 fix not using keyfile in configuration file when --keyfile not given
fixes #136.
2020-08-31 17:38:53 +08:00
lilydjwg
bc1f9df8c6 use appdirs for the default configuration file
closes #97.
2020-08-31 17:34:11 +08:00
lilydjwg
2a2c67090c forgot to remove test_vcs.py 2020-08-30 18:34:51 +08:00
lilydjwg
307c0db0f8 remove vcs source from here and update README
It's cumbersome and prevents the project from being zip_safe. Move it to where
it's used (archlinuxcn/lilac) so that it can evolve with that project.
2020-08-30 18:22:58 +08:00
lilydjwg
a2a239bdc6 nvchecker_source/manual: convert to string
because it's often written as an int or float, especially when
converting from YAML.
2020-08-30 17:49:39 +08:00
lilydjwg
290240eafb README: fix badge
[ci-skip]
2020-08-27 23:53:29 +08:00
lilydjwg
b8d5bc151e merge nvcheckver2 (the 2.0 dev branch) into master 2020-08-27 23:30:03 +08:00
lilydjwg
5689bc0a3f update README and make new version master 2020-08-27 23:28:59 +08:00
lilydjwg
3d90c02273 fix README 2020-08-27 01:55:54 +08:00
lilydjwg
c2b4e51f56 README: update wording
[ci-skip]
2020-08-27 01:55:11 +08:00
lilydjwg
b54ebe6bcf some setup.py fixes for #135
* use `find_namespace_packages` to find recursively
* install the nvchecker-ini2toml script
2020-08-27 01:49:19 +08:00
lilydjwg
c8c7bc5d4e README: Python 3.7+ is required
[ci-skip]
2020-08-26 20:44:49 +08:00
lilydjwg
376e5ad499 update test_android_sdk 2020-08-26 20:38:55 +08:00
lilydjwg
275e502be3 update setup.py 2020-08-26 20:19:13 +08:00
lilydjwg
027b8f9b46 write versions to record files even if they're up-to-date.
This matches the old behavior, and the record file means what we get,
not what is updated (that's what nvcmp does).
2020-08-26 20:18:08 +08:00
lilydjwg
6a6d5df682 port tools.py and change record files to use json format
because special characters like spaces broke the old format.
2020-08-26 20:06:13 +08:00
lilydjwg
4f515d75db fix type annotations
[ci skip]
2020-08-21 18:53:09 +08:00
lilydjwg
6c2e31193c fix README: toml has tables rather than sections 2020-08-21 18:24:43 +08:00
lilydjwg
7f8310e685 update README and some more fixes 2020-08-21 18:16:44 +08:00
lilydjwg
e59766f839 add the ini2toml script 2020-08-21 17:42:31 +08:00
lilydjwg
57626faecf fix aiohttp_httpclient 2020-08-20 19:29:00 +08:00
lilydjwg
58a7ae05ca hackage is flaky 2020-08-20 17:07:16 +08:00
lilydjwg
e6d0d158f0 drop tests for pypy because it doesn't support the annotations future import 2020-08-20 16:10:28 +08:00
lilydjwg
8ed445d238 aiohttp_httpclient: fix 2020-08-20 15:58:08 +08:00
lilydjwg
04a95de7af gitea.com is slow on travis, try harder 2020-08-20 15:50:36 +08:00
lilydjwg
649799389f bitbucket no longer supports hg 2020-08-20 15:42:50 +08:00
lilydjwg
e339827436 aiohttp: fix request arguments 2020-08-20 15:41:35 +08:00
lilydjwg
2d0d3cfcee bitbucket: minor cleanup 2020-08-20 15:39:21 +08:00
lilydjwg
f6ff697d21 dash doesn't have $RANDOM 2020-08-20 15:31:00 +08:00
lilydjwg
ce828df545 requires http2 for httpx 2020-08-20 15:30:43 +08:00
lilydjwg
eb6833475e partial updates to README 2020-08-20 15:23:05 +08:00
lilydjwg
e228e31b84 final tweak of keyfile 2020-08-20 15:18:43 +08:00
lilydjwg
95150fa8e9 port github source and tests (the last one!)
`include_tags_pattern` and `ignored_tags` removed, see #99
2020-08-20 15:18:28 +08:00
lilydjwg
61a67a4a5b port more tests 2020-08-18 16:45:20 +08:00
lilydjwg
7de923c1e1 remove ignored_tags from vcs and bitbucket sources
See #99.
2020-08-18 16:35:26 +08:00
lilydjwg
2f4629fb22 port gitlab source
also remove `ignored_tags` as per #99
2020-08-18 16:32:47 +08:00
lilydjwg
71015be582 port the gitea source 2020-08-18 16:22:14 +08:00
lilydjwg
c0d836ba65 github actions: try run directly on ubuntu to speed up and get latest Python packages via pip 2020-08-17 22:09:54 +08:00
lilydjwg
870fb99182 port more sources 2020-08-17 22:04:44 +08:00
lilydjwg
6737400a4c update NEW 2020-08-17 17:01:41 +08:00
lilydjwg
34f7de18c0 try run mypy with github actions 2020-08-17 16:56:54 +08:00
lilydjwg
894706819a update mypy.ini 2020-08-17 16:56:54 +08:00
lilydjwg
0f9099aae5 port more tests 2020-08-17 16:31:59 +08:00
lilydjwg
4887ec66d9 port more tests 2020-08-17 16:21:02 +08:00
lilydjwg
616cf2bc20 fix a typo causing the proxy not to be used 2020-08-17 16:20:34 +08:00
lilydjwg
10f6b1561e start porting tests to new version 2020-08-17 15:11:14 +08:00
lilydjwg
1f7552bbf7 support setting keyfile on command line 2020-08-17 14:32:39 +08:00
lilydjwg
8520a62271 update httpclient implementations 2020-08-15 17:02:19 +08:00
lilydjwg
5e209cc9ad update typing 2020-08-14 21:23:31 +08:00
lilydjwg
19553c3564 create an api for modules 2020-08-14 20:04:05 +08:00
lilydjwg
72d1d27f89 port more sources, add cache.get_json 2020-08-14 18:25:48 +08:00
lilydjwg
5dcd13306a fix ci badge, refer to the master branch
[skip ci]
2020-08-13 22:36:09 +08:00
lilydjwg
961c1315ef better and simpler caching mechanism 2020-08-13 20:42:24 +08:00
lilydjwg
b76bfb5606 fixes 2020-08-13 20:03:07 +08:00
lilydjwg
435edf8589 support simpler sources with only one get_version function
also caching is working now
2020-08-13 20:03:04 +08:00
lilydjwg
0232d0fb4f aur: optimize requests 2020-08-13 20:03:00 +08:00
lilydjwg
14b3863f11 version 2: MVP 2020-08-11 17:43:03 +08:00
lilydjwg
fbdde9bf58 Merge remote-tracking branch 'origin/pr/131' into master 2020-08-04 16:42:31 +08:00
techknowlogick
2b3e88be13
update readme 2020-08-03 12:15:42 -04:00
techknowlogick
4022a375d5
Update readme 2020-08-03 12:14:57 -04:00
Matti R
a35a35a784
Support Gitea as source 2020-08-02 21:33:29 -04:00
lilydjwg
d120f61b11 aur: minor cleanup 2020-08-01 13:28:57 +08:00
lilydjwg
273764c2a4 bump version to dev 2020-07-29 15:49:44 +08:00
lilydjwg
145af9f0e5 aiohttp: enable proxy environment handling 2020-07-29 15:49:06 +08:00
lilydjwg
f367d04287 travis: still need libgnutls28-dev 2020-07-25 15:21:07 +08:00
lilydjwg
ddc21adad5 travis: try ubuntu 18.04 again
because pypy3 + httpx is not happy with an SSL library lacking ALPN support
https://travis-ci.org/github/lilydjwg/nvchecker/jobs/711675911
2020-07-25 15:21:07 +08:00
lilydjwg
44687da8a6 update tests 2020-07-25 15:21:07 +08:00
lilydjwg
6101dde1e5 github: fix keyman may not be present 2020-07-25 15:21:07 +08:00
lilydjwg
fa42c573b6 release version 1.7 2020-07-02 20:57:29 +08:00
lilydjwg
1bcd02aad2 tests: session-scoped ioloop is needed for aiohttp support 2020-07-02 16:10:39 +08:00
lilydjwg
cbd7e13867 tornado: construct AsyncHTTPClient each time so that it uses the right ioloop
This is a better fix for #129.
2020-07-02 15:07:19 +08:00
lilydjwg
2fc1ec3355 remove code deprecated in tornado 5
Tornado<5 has already been dropped.
2020-07-02 00:47:59 +08:00
lilydjwg
36c995eef6 tests: set event_loop fixture to be session scoped
fixes https://github.com/lilydjwg/nvchecker/issues/129.
2020-07-02 00:45:55 +08:00
lilydjwg
e176078226 all nightly tests are allowed to fail 2020-06-14 14:05:11 +08:00
lilydjwg
7a2d3d226b GitHub: add use_latest_tag 2020-06-14 13:55:51 +08:00
lilydjwg
ac3e2beef9 bump to new dev version 2020-05-05 20:40:43 +08:00
依云
3dfbca494c
Merge pull request #127 from guizai/add-sparkle-source
Fix duplicated version and shortVersionString in sparkle
2020-05-05 17:29:40 +08:00
Guizai
e7634a625c fix: duplicated version and shortVersionString in sparkle 2020-05-05 16:37:02 +08:00
依云
4109091955
Merge pull request #126 from guizai/add-sparkle-source
Add sparkle source
2020-05-05 13:41:17 +08:00
Guizai
f0869aa1c1 feat: Add sparkle to sample source 2020-05-05 12:23:50 +08:00
Guizai
dd15f68033 feat: Add sparkle source 2020-05-05 12:20:36 +08:00
lilydjwg
bc2fcd148d fix: released wrong files to PyPI 2020-04-13 12:44:13 +08:00
lilydjwg
4fc62d3588 tests: update test_archpkg 2020-04-08 18:17:32 +08:00
lilydjwg
430ce48e6f release version 1.6 2020-04-08 17:25:59 +08:00
lilydjwg
eba37c47a4 tests: update gitlab token 2020-04-02 21:28:37 +08:00
lilydjwg
89fe2988cc pypi: fix sorting with use_pre_release=True 2020-04-02 20:12:33 +08:00
lilydjwg
de6fc3b5ba update .travis.yml to match latest syntax 2020-02-29 00:57:24 +08:00
lilydjwg
249e6b8f9b fix wrong version 2020-02-29 00:38:00 +08:00
lilydjwg
09f65b2dc3 add a test for cmd with % 2020-02-28 16:02:16 +08:00
lilydjwg
04101c11e0 bump version to dev 2020-02-28 15:55:18 +08:00
lilydjwg
31f618e586 no interpolation should be done for the ini file 2020-02-28 15:53:56 +08:00
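The interpolation issue fixed here is standard configparser behavior; as a sketch of what this commit guards against (plain Python stdlib, not nvchecker code), a raw `%` in a cmd value breaks value lookup unless interpolation is disabled:

```python
import configparser

INI = '[pkg]\ncmd = date +%Y\n'

# Default BasicInterpolation treats '%' as special, so a raw '%' in a
# value (e.g. a cmd entry using strftime) blows up when read back:
default = configparser.ConfigParser()
default.read_string(INI)
try:
    default.get('pkg', 'cmd')
    error = None
except configparser.InterpolationSyntaxError as e:
    error = type(e).__name__
print(error)  # InterpolationSyntaxError

# With interpolation disabled the value comes back verbatim:
plain = configparser.ConfigParser(interpolation=None)
plain.read_string(INI)
print(plain.get('pkg', 'cmd'))  # date +%Y
```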
lilydjwg
374ef20ad5 tornado<5 is no longer supported
because it doesn't support Python 3.9
2020-02-18 16:16:44 +08:00
依云
47c434ae3e
Merge pull request #125 from kyrias/vcs-list-options
vcs: Add support for list options when using use_max_tag
2020-02-18 16:11:40 +08:00
Johannes Löthberg
add890fbea vcs: Add support for list options when using use_max_tag
Signed-off-by: Johannes Löthberg <johannes@kyriasis.com>
2020-02-18 03:06:57 +01:00
lilydjwg
fe2ff0aa74 make it clearer that github releases are not those tags
closes #123.
2019-12-09 12:52:05 +08:00
依云
9c9e7f89d9
Merge pull request #122 from yan12125/fix-debian-ubuntu-tests
Fix Debian & Ubuntu tests
2019-12-07 14:24:30 +08:00
Chih-Hsuan Yen
d9daa17a50
Fix Debian & Ubuntu tests 2019-12-07 11:23:44 +08:00
依云
912e3cb758
Merge pull request #121 from yan12125/fix-gitlab-tests
Fix gitlab tests
2019-11-29 23:20:14 +08:00
Chih-Hsuan Yen
b6d26e68fb
Fix gitlab tests
The repo https://gitlab.com/gitlab-org/gitlab-test gets a new tag
v1.1.1, which is for testing a new GitLab feature [1][2].

[1] https://gitlab.com/gitlab-org/gitlab/merge_requests/17773
[2] https://gitlab.com/gitlab-org/gitlab-test/merge_requests/40
2019-11-29 22:42:08 +08:00
依云
a9639eee98
Merge pull request #120 from yan12125/pacman-fixes-and-updates
Pacman fixes and updates
2019-11-20 00:12:15 +08:00
Chih-Hsuan Yen
7055ad555e
Update for upcoming ipw2100-fw 1.3-10
It's rebuilt during the last [core] mass rebuilds [1].

[1] https://lists.archlinux.org/pipermail/arch-dev-public/2019-November/029721.html
2019-11-18 16:58:44 +08:00
Chih-Hsuan Yen
3546821471
pacman: fix two issues
* Correct cache key
* Return only one version when a package is available in two or more
  repos (e.g., stable and testing)
2019-11-18 16:56:53 +08:00
依云
1cb902d691
Merge pull request #118 from marsam/master
tests: add needs_net marker to ubuntupkg tests
2019-11-05 13:57:47 +08:00
Mario Rodas
d10f3fc5e1
tests: add needs_net marker to ubuntupkg tests 2019-11-04 21:00:00 -05:00
lilydjwg
a86c4fdc13 new release 2019-11-04 17:44:13 +08:00
lilydjwg
374f7c3d42 vcs: use a timeout of 20 seconds 2019-11-03 23:57:16 +08:00
lilydjwg
38c1a4ce9d Merge commit '058b7d9679e1e9712baaa4380819e0de7f6dc50a' 2019-11-03 22:39:55 +08:00
lilydjwg
058b7d9679 more README update 2019-11-03 22:38:47 +08:00
oldherl
761c944f19
explain use_latest_release and use_max_tag in README.rst 2019-11-03 00:31:53 +08:00
lilydjwg
3e47944a5a travis-ci: 3.8 is available 2019-10-29 23:51:16 +08:00
lilydjwg
f677eda656 update README for repology source 2019-10-29 23:43:28 +08:00
lilydjwg
2fca9e025c add repology source, close #115 2019-10-29 23:40:24 +08:00
lilydjwg
be7e54404e archpkg: support provides 2019-10-29 00:19:07 +08:00
lilydjwg
7e44dd5767 travis: don't run test_aur 2019-10-15 18:59:16 +08:00
lilydjwg
eafeaca789 travis: revert to xenial because no pypy available for bionic
travis only has good support for pretty old distributions QAQ
2019-10-15 18:09:41 +08:00
lilydjwg
b7b7a08ae4 travis: add comment about libgnutls28-dev 2019-10-15 18:04:31 +08:00
lilydjwg
b45ac1831d travis: Ubuntu has changed the package name? 2019-10-15 17:57:29 +08:00
lilydjwg
6fb9ba590a travis: retry 2019-10-15 17:50:11 +08:00
lilydjwg
737c7b9448 fix travis script
also update dist to bionic
2019-10-15 17:35:18 +08:00
lilydjwg
6f48b774ae tests: add a needs_net mark for tests that require Internet access
use pytest-httpbin for test_regex

closes #113
2019-10-15 17:27:34 +08:00
lilydjwg
484778218d bump version 2019-10-15 17:22:39 +08:00
lilydjwg
893836fb09 rename nvchecker.main to nvchecker.__main__
closes #116
2019-10-15 16:54:31 +08:00
lilydjwg
5577e2cbd8 bitbucket: fix not returning a list of versions when use_max_tag is set
fixes #112
2019-10-12 15:40:08 +08:00
lilydjwg
6674a875bf README: minor fix 2019-08-30 15:09:49 +08:00
lilydjwg
1f36032a28 tests: allow nightly+aiohttp failure 2019-07-14 22:41:32 +08:00
lilydjwg
1e1f431f9b fix test_regex 2019-07-08 11:40:08 +08:00
lilydjwg
981143f62b update DEFAULT_USER_AGENT and bump version 2019-07-07 17:42:11 +08:00
lilydjwg
ed37892c87 tests: test default user agent in test_regex 2019-07-06 14:23:00 +08:00
lilydjwg
8b0186bd25 Revert "Use pytest-httpbin and add a test for default User-Agent"
This reverts commit 851e141f3d.

pytest-httpbin doesn't work with nightly. Also it brings in too many
dependencies.
2019-07-06 14:20:43 +08:00
依云
e12a9c4250
Merge pull request #110 from yan12125/investigate-crates-io-403
Fix 403 errors from crates.io
2019-07-06 14:15:45 +08:00
Chih-Hsuan Yen
851e141f3d
Use pytest-httpbin and add a test for default User-Agent 2019-07-06 13:39:59 +08:00
Chih-Hsuan Yen
629e82ac8a
Add default User-Agent
crates.io requires it since https://github.com/rust-lang/crates.io/pull/1696
2019-07-06 13:39:58 +08:00
Chih-Hsuan Yen
8720d59f79
Use the latest pypy3
Ref: https://github.com/pyenv/pyenv/tree/master/plugins/python-build/share/python-build
2019-07-05 09:22:09 +08:00
Chih-Hsuan Yen
51e2348d55
Install libgnutls-dev
Well, installing packages for only some builds seems complicated.
https://github.com/travis-ci/travis-ci/issues/3505
2019-07-05 09:22:09 +08:00
lilydjwg
5e324ef6ff travis: fix environmental variables 2019-07-04 22:35:36 +08:00
lilydjwg
077a752045 bitbucket: actually use max_page parameter 2019-07-04 17:59:19 +08:00
lilydjwg
93d1ca16db update tests 2019-07-04 16:47:33 +08:00
lilydjwg
3194ef8e6c bitbucket: 1.0 API is gone, let's use 2.0
fixes #109
2019-07-04 16:39:49 +08:00
lilydjwg
b1856f7426 test: don't raise RuntimeErrors for no-result warnings 2019-05-01 17:26:10 +08:00
lilydjwg
3481013f78 bump version to dev 2019-05-01 17:19:33 +08:00
lilydjwg
b014759f81 don't print tracebacks for networking errors
closes #108.
2019-05-01 17:16:17 +08:00
lilydjwg
8ade0f68b0 Merge branch 'test-ci', remote-tracking branch 'origin/pr/107' 2019-04-08 12:25:27 +08:00
lilydjwg
cc763205e8 test: don't raise RuntimeError for rate limited logs 2019-04-08 12:15:49 +08:00
lilydjwg
0f101bda4d test: is pytest-xdist necessary? 2019-04-08 11:45:19 +08:00
lilydjwg
05848d1bc2 test: has the pytest-asyncio fix been released?
https://github.com/pytest-dev/pytest-asyncio/issues/104
2019-04-08 11:42:40 +08:00
Haochen Tong
c6ef132ebb
Fix copy-paste mistake
BitBucket -> GitLab
2019-04-07 20:54:55 +02:00
lilydjwg
9a271c030b bump version to 1.4.3 and update doc 2019-03-24 15:31:40 +08:00
Chih-Hsuan Yen
7dab7c3b6a
Get rid of warnings from logging.warn
logging.warn is deprecated [1].

[1] https://docs.python.org/3/library/logging.html#logging.Logger.warning
2019-03-23 23:59:13 +08:00
lilydjwg
2677d29f1d github: use a better API for tags so we make fewer requests 2019-03-22 23:47:39 +08:00
lilydjwg
7370b8d23b github: fix next_page_url
fixes #103
2019-03-22 23:42:35 +08:00
lilydjwg
c64766cc0e bump version again to 1.4.2
because 1.4.1 was bad (forgot to git pull)
2019-03-22 21:04:02 +08:00
lilydjwg
445ae6452b bump version to 1.4.1 2019-03-22 21:03:46 +08:00
依云
34ce4b0678
Merge pull request #102 from donfranio/fix-wrong-config-param-type-max_page
read 'max_page' as int
2019-03-22 18:58:15 +08:00
Frank Scherrer
e2f68f764d
add casting 'max_page' to int 2019-03-22 10:34:01 +01:00
lilydjwg
a7f7a632a2 README: fix warnings 2019-03-16 21:32:00 +08:00
lilydjwg
c34c2e70ee version 1.4 2019-03-16 21:13:49 +08:00
lilydjwg
056fe65e79 use is not None for xml.etree.Element 2019-03-07 17:23:36 +08:00
依云
e58142e37f
Merge pull request #100 from yan12125/improve-tests
Improve tests
2019-03-07 17:08:05 +08:00
Chih-Hsuan Yen
2a9b60155b
Re-raise previous exceptions 2019-03-07 16:45:14 +08:00
Chih-Hsuan Yen
be7618988e
github: Fix "No tag found in upstream repository" with use_max_tag 2019-03-07 16:26:58 +08:00
Chih-Hsuan Yen
ab1ecc231d
Raise exceptions on all warnings and errors in tests 2019-03-07 16:25:54 +08:00
lilydjwg
5437100f08 fix github test 2019-03-06 22:45:23 +08:00
lilydjwg
ca68c44201 github: try as few API calls as possible 2019-03-06 22:37:18 +08:00
lilydjwg
c23ef737d4 add a `max_page` option to github source
also update README for other minor issue.
2019-03-06 22:33:58 +08:00
lilydjwg
e5d52a9762 add the exclude_regex list option 2019-03-06 22:15:50 +08:00
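A hedged sketch of how the two new github options from these commits might be combined in an entry (section name, repository, and regex are illustrative, not from the source):

```ini
[nvchecker]
github = lilydjwg/nvchecker
use_max_tag = true
# stop paging through the tag list after 3 pages
max_page = 3
# drop tags matching this regex before picking the max
exclude_regex = .*-(alpha|beta|rc).*
```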
lilydjwg
5b47391af0 fix gitlab test 2019-03-06 22:03:35 +08:00
lilydjwg
de381bf26d fix github test and update the doc 2019-03-06 22:01:07 +08:00
lilydjwg
7897317294 add the "list options" feature
closes #83
old options will be removed in #99
2019-03-06 21:51:04 +08:00
lilydjwg
58a68957e9 update document for the new list options 2019-03-06 21:33:21 +08:00
lilydjwg
6e60b84740 README: minor fix 2019-03-06 21:18:07 +08:00
lilydjwg
5a8577dec6 minor usage update 2019-02-24 19:49:43 +08:00
lilydjwg
87a55c6cf9 fix README link and remove unused section 2019-02-15 20:32:43 +08:00
lilydjwg
a6315985d0 support for retry on failure
resolves #94
2019-02-15 20:16:49 +08:00
依云
f482ee6499
Merge pull request #93 from petronny/master
added path to github source
2019-01-28 20:07:10 +08:00
Jingbei Li
620c7a89cd added path to github source 2019-01-28 19:36:41 +08:00
依云
2f24822c88
Merge pull request #92 from yan12125/fix-pypi
pypi: fix valueless items
2019-01-07 14:19:50 +08:00
Chih-Hsuan Yen
d2d0279d56
Fix tests
Workaround pytest-asyncio incompatibility with pytest 4.1 [1]

[1] https://github.com/pytest-dev/pytest-asyncio/issues/104
2019-01-07 14:08:18 +08:00
Chih-Hsuan Yen
b108d9624a
pypi: fix valueless items
Fix checking for such an nvchecker.ini:

[logcat-color]
pypi =
2019-01-07 02:19:30 +08:00
lilydjwg
ae011170af update README for pypi use_pre_release config 2019-01-01 22:06:33 +08:00
Bruce Zhang
2e55a7db60 pypi: use_pre_release support
Signed-off-by: Bruce Zhang <zttt183525594@gmail.com>
2019-01-01 22:05:16 +08:00
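As an illustration, an entry opting into pre-releases might look like this (package name hypothetical; option name per the commit message):

```ini
[python-prerelease-example]
pypi = python-prerelease-example
use_pre_release = true
```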
依云
22c7b5db20
Merge pull request #90 from yan12125/fix-for-python38
Fix running on Python 3.8
2018-12-27 16:29:45 +08:00
Chih-Hsuan Yen
6002e273a9
Fix running on Python 3.8
Fix the following 2 errors when running on Python 3.8

```
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/yen/Projects/nvchecker/nvchecker/main.py", line 33, in main
    if core.process_common_arguments(args):
  File "/home/yen/Projects/nvchecker/nvchecker/core.py", line 46, in process_common_arguments
    nicelogger.enable_pretty_logging(
  File "/home/yen/Projects/nvchecker/nvchecker/lib/nicelogger.py", line 93, in enable_pretty_logging
    formatter = TornadoLogFormatter(color=color)
  File "/home/yen/Projects/nvchecker/nvchecker/lib/nicelogger.py", line 16, in __init__
    super().__init__(self, *args, **kwargs)
  File "/usr/lib/python3.8/logging/__init__.py", line 589, in __init__
    self._style.validate()
  File "/usr/lib/python3.8/logging/__init__.py", line 441, in validate
    if not self.validation_pattern.search(self._fmt):
TypeError: expected string or bytes-like object
```

```
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/yen/Projects/nvchecker/nvchecker/main.py", line 42, in main
    ioloop.run_until_complete(s.check())
  File "/usr/lib/python3.8/asyncio/base_events.py", line 589, in run_until_complete
    return future.result()
  File "/home/yen/Projects/nvchecker/nvchecker/core.py", line 177, in check
    self.print_version_update(name, result)
  File "/home/yen/Projects/nvchecker/nvchecker/core.py", line 190, in print_version_update
    logger.info('updated', name=name, version=version, old_version=oldver)
  File "/home/yen/.local/lib/python3.8/site-packages/structlog/_base.py", line 191, in _proxy_to_logger
    args, kw = self._process_event(method_name, event, event_kw)
  File "/home/yen/.local/lib/python3.8/site-packages/structlog/_base.py", line 151, in _process_event
    event_dict = proc(self._logger, method_name, event_dict)
  File "/home/yen/Projects/nvchecker/nvchecker/slogconf.py", line 48, in stdlib_renderer
    getattr(logger, level)(
  File "/usr/lib/python3.8/logging/__init__.py", line 1441, in info
    self._log(INFO, msg, args, **kwargs)
  File "/usr/lib/python3.8/logging/__init__.py", line 1572, in _log
    fn, lno, func, sinfo = self.findCaller(stack_info, stacklevel)
TypeError: findCaller() takes from 1 to 2 positional arguments but 3 were given
```

To fix the second issue, I backported relevant CPython changes [1] to slogconf.py

Test command: `python3.8 -c 'from nvchecker.main import main; main()' sample_source.ini`

[1] dde9fdbe45 (diff-c11fc567cf5b6e518eb855c23b5599bcR1400)
2018-12-26 11:39:23 +08:00
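The first traceback boils down to a classic `super().__init__` slip; a minimal reproduction against the stdlib logging module (not nvchecker's actual nicelogger code):

```python
import logging

class BadFormatter(logging.Formatter):
    def __init__(self, *args, **kwargs):
        # Passing `self` explicitly hands it to logging.Formatter as the
        # `fmt` argument; Python 3.8's fmt validation then raises TypeError.
        super().__init__(self, *args, **kwargs)

class GoodFormatter(logging.Formatter):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

try:
    BadFormatter()
    bad_result = 'ok'
except TypeError:
    bad_result = 'TypeError'

# The fixed formatter works as usual (default fmt is '%(message)s'):
record = logging.LogRecord('x', logging.INFO, __file__, 1, 'hello', None, None)
good_result = GoodFormatter().format(record)
print(bad_result, good_result)
```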
lilydjwg
77de8da557 make README look better 2018-12-16 14:17:58 +08:00
lilydjwg
13441ea88a 1.3 release, bump version 2018-12-16 13:30:10 +08:00
lilydjwg
d6067a3e86 add repology badge
thanks to @farseerfc
2018-12-16 13:28:40 +08:00
依云
377242f70b
Merge pull request #89 from yan12125/cache-vcs-by-name
Use the name as the fallback cache key for vcs packages
2018-12-16 13:24:37 +08:00
Chih-Hsuan Yen
0301155177
Use the name as the fallback cache key for vcs packages
Fixes #88
2018-12-15 21:45:11 +08:00
lilydjwg
5848aa39d9 travis-ci: fix pypy3, and python 3.7 works now 2018-12-05 17:14:46 +08:00
lilydjwg
b5fce0a3b9 tests: maybe newer pytest works now? 2018-12-05 16:46:23 +08:00
lilydjwg
eb75dc8305 Revert "tests: disable bitbucket tests because Travis-CI chooses the obsolete TLS 1.0 protocol"
This reverts commit 1da0e785db.

Trying other methods...
2018-12-05 16:38:45 +08:00
lilydjwg
5d3b5ed10c travis: use xenial 2018-12-05 16:38:09 +08:00
lilydjwg
1da0e785db tests: disable bitbucket tests because Travis-CI chooses the obsolete TLS 1.0 protocol 2018-12-05 16:34:57 +08:00
Chih-Hsuan Yen
6c6aff99b1
Workaround some pycurl.error issues
In my experiments, the pycurl.error issues [1][2] do not appear when no 30x redirection is involved.

Tested with https://github.com/tornadoweb/tornado/files/2624772/nv.log

[1] https://github.com/tornadoweb/tornado/issues/2542
[2] https://github.com/curl/curl/issues/3324
2018-12-04 22:23:04 +08:00
lilydjwg
5af558d214 tests: example.net certificate has expired, use http instead 2018-11-29 13:31:08 +08:00
Felix Yan
c1e5822f07
Handle graceful exit and timeout
Timeout was refactored and the defaults work correctly here.
2018-11-29 06:39:14 +08:00
Felix Yan
e3d62d4169
Ignore content type checking for aiohttp
Pass content_type=None as the documentation suggests:

https://docs.aiohttp.org/en/stable/client_advanced.html#disabling-content-type-validation-for-json-responses
2018-11-29 06:37:47 +08:00
lilydjwg
c966e6faf8 vcs: allow the value to be None
This is a better fix for https://github.com/archlinuxcn/lilac/pull/91.
2018-11-24 12:18:39 +08:00
lilydjwg
d0ee359209 update tests and bump version to dev 2018-11-19 15:16:29 +08:00
lilydjwg
1adc38099a fix PyPI long_description display 2018-11-19 15:08:06 +08:00
lilydjwg
a6c3b49cc2 bump version to 1.2 2018-11-19 14:25:29 +08:00
lilydjwg
64703e8b2d tests: add test_cache 2018-10-10 17:35:18 +08:00
lilydjwg
e2fb91d3fb tests: fix clear_cache arg 2018-10-10 17:22:02 +08:00
lilydjwg
10e3478b12 fix test_keyfile
keyfile is not part of cache keys so the expectations are wrong.

We don't make it part of cache keys because we don't expect keyfile to
affect the version as long as we can get it.
2018-10-10 17:17:02 +08:00
lilydjwg
26eaef92aa handle deprecated warning, reindent conftest.py 2018-10-10 17:08:28 +08:00
lilydjwg
903b414183 cache result during a run
so that different entries can refer to the same software without much overhead.

This is needed because entries may be generated, and only nvchecker can
tell if two entries are expecting the same result because the name may
or may not be relevant.

Closes #81.
2018-10-10 17:03:20 +08:00
lilydjwg
8cc902909e update doc for AUR use_last_modified config 2018-09-30 19:51:55 +08:00
lilydjwg
faf14b64c4 Merge branch 'master' of github.com:lilydjwg/nvchecker 2018-09-30 19:50:27 +08:00
依云
0848d0a5d7
Merge pull request #82 from petronny/master
added use_last_modified for aur
2018-09-30 19:49:30 +08:00
Jingbei Li
ddbe30db8c added use_last_modified for aur 2018-09-30 17:42:42 +08:00
lilydjwg
b8812f4ae6 fix typo in comments 2018-09-27 21:40:21 +08:00
lilydjwg
81fb17ab25 nvtake: log for ignored names 2018-09-25 12:54:56 +08:00
lilydjwg
2d9bbadd15 add --ignore-nonexistent arg to nvtake 2018-09-25 12:36:17 +08:00
lilydjwg
033440da84 gitlab: fix rate limiting check 2018-09-25 12:33:37 +08:00
lilydjwg
9937378876 structlog PrintLogger doesn't have the 'fatal' synonym. 2018-09-22 21:27:17 +08:00
lilydjwg
60afa62d89 Revert "cache get_version results".
Some sources will use the name.

This reverts commits aabf9f5037
and f75a156d46.
2018-09-22 13:36:13 +08:00
lilydjwg
f75a156d46 fix get_version cache key 2018-09-22 13:28:59 +08:00
lilydjwg
aabf9f5037 cache get_version results
closes #81
2018-09-22 13:12:25 +08:00
lilydjwg
db376ccfed fix referring to wrong conf when logging no-result messages 2018-09-22 12:17:27 +08:00
lilydjwg
bfc4b76124 fix logging message module info pointing to structlog 2018-09-22 12:11:25 +08:00
lilydjwg
5bb2c960e5 declare that we support python 3.7 2018-09-21 19:41:42 +08:00
依云
68d9cd5d61
Merge pull request #80 from yan12125/add-missing_ok
Add missing ok
2018-09-21 16:10:58 +08:00
Chih-Hsuan Yen
e6d0449bc4
Fix tests 2018-09-21 12:18:21 +08:00
Chih-Hsuan Yen
1d16726c90
Add missing_ok, which suppresses no-result warnings 2018-09-21 12:17:21 +08:00
lilydjwg
42a02efec8 pytest 3.7.0 doesn't work with pytest-asyncio 0.9.0. pin to old pytest for now
Also see: https://github.com/pytest-dev/pytest-asyncio/issues/89
2018-08-01 16:41:19 +08:00
lilydjwg
0c459c1c03 Update README.rst about installation and running 2018-08-01 16:10:04 +08:00
lilydjwg
3e915a6a39 travis-ci: Python 3.7 isn't ready yet? 2018-07-22 23:17:16 +08:00
lilydjwg
431d80c61e test Python 3.7 2018-07-22 23:04:10 +08:00
lilydjwg
af368112ed Merge remote-tracking branch 'origin/pr/77' 2018-07-22 23:03:53 +08:00
lilydjwg
9f5028bbf0 tornado: set max_clients to 20
we'd better find out a way to specify this in the config file though.
2018-07-22 23:02:07 +08:00
Chih-Hsuan Yen
d9872dad6b Relax tornado requirement
I guess nobody is using tornado<4
2018-07-22 21:59:40 +08:00
Chih-Hsuan Yen
6afa9a551a Fix test_anitya 2018-07-22 21:08:00 +08:00
Chih-Hsuan Yen
7face004e4 Also test tornado<5 2018-07-22 21:08:00 +08:00
lilydjwg
00f7dccb39 update setup.py so that users can pip install nvchecker without other steps 2018-06-27 16:01:51 +08:00
lilydjwg
c537cf20fc bump version 2018-06-27 16:01:33 +08:00
lilydjwg
18d4e02977 fix max_concurrent config not being respected 2018-06-27 15:39:32 +08:00
lilydjwg
2e4f5d23a0 new version 2018-06-22 15:32:30 +08:00
lilydjwg
be6f9dc9e9 github & gitlab: exc.response may be None when timed out 2018-06-19 18:29:38 +08:00
lilydjwg
9b8cfc6f6f github: avoid TypeError when github returns 502 or similar
...without the ratelimit header
2018-06-12 15:15:58 +08:00
依云
6848c49dfb
Merge pull request #73 from Xuanwo/gitlab-without-token
Add rate limit check support for gitlab
2018-06-10 22:12:12 +08:00
Xuanwo
c9f2fcdb62 tests: Add local test for over limit check
Signed-off-by: Xuanwo <xuanwo.cn@gmail.com>
2018-06-10 21:24:33 +08:00
Xuanwo
bb7afab0a7 gitlab: Support getting tags without a token
Signed-off-by: Xuanwo <xuanwo.cn@gmail.com>
2018-06-10 21:24:22 +08:00
依云
07a97e04c5
Merge pull request #72 from Xuanwo/fix-gitlab-support
Fix gitlab support
2018-06-09 00:08:51 +08:00
Xuanwo
9d32a25c55 gitlab: Upgrade to API v4
Signed-off-by: Xuanwo <xuanwo.cn@gmail.com>
2018-06-09 00:02:02 +08:00
Xuanwo
f1c55296af misc: Ignore .pytest_cache
Signed-off-by: Xuanwo <xuanwo.cn@gmail.com>
2018-06-09 00:01:55 +08:00
lilydjwg
116e426d5a vcs: collect errors into log
This fixes #39.
2018-05-29 14:46:53 +08:00
lilydjwg
e61f9dcd76 vcs: fix name conflicts 2018-05-17 17:29:41 +08:00
lilydjwg
9340db2663 vcs: should return oldver rather than generate a new one 2018-05-16 14:10:41 +08:00
lilydjwg
c1c531d707 debian API becomes flaky recently 2018-05-15 22:37:39 +08:00
lilydjwg
5383fb30e0 get_version should always return the current version
not None when it's not updated.
2018-05-15 22:25:35 +08:00
lilydjwg
8c0f7b5680 log extra arguments to console 2018-05-13 19:07:00 +08:00
lilydjwg
ae58042a3e fix arguments mismatch for on_no_result
fixes #71.
2018-05-10 20:29:38 +08:00
lilydjwg
b3897a1235 update README on JSON logging 2018-05-09 14:34:32 +08:00
lilydjwg
79c3c934ce log no-result events 2018-05-09 14:34:20 +08:00
lilydjwg
d321beec36 travis-ci: fix allow_failures instruction 2018-05-08 20:27:41 +08:00
lilydjwg
f4ce0bdab0 travis-ci: use allow_failures instead of exclude 2018-05-08 20:19:29 +08:00
lilydjwg
6abc482c87 travis-ci: enable pip cache 2018-05-08 20:13:20 +08:00
lilydjwg
77781b3920 Merge branch 'keyfile' 2018-05-08 20:00:54 +08:00
lilydjwg
a31470df0b tests: aur: try harder 2018-05-08 19:59:57 +08:00
lilydjwg
d8536f4390 tests: aiohttp and pypy3 don't work well 2018-05-08 19:55:45 +08:00
lilydjwg
ddfc569ba3 tests: remove more f-strings 2018-05-08 19:44:48 +08:00
lilydjwg
960f76e333 tests: fix test_proxy 2018-05-08 19:41:11 +08:00
lilydjwg
ded2bcd8c5 tests: httpbin seems to be overloaded, disabled temporarily 2018-05-08 19:27:45 +08:00
lilydjwg
9273f30036 remove usage of f-strings, Python 3.5 is still supported 2018-05-08 19:27:28 +08:00
lilydjwg
3da3e356fa github: give an explicit error message when rate limited
also update tests.
2018-05-08 19:15:36 +08:00
lilydjwg
8f2d64d353 some debugging additions 2018-05-08 18:35:12 +08:00
lilydjwg
6c15ee8517 expose HTTPError, and raise it if status >= 400 for aiohttp 2018-05-08 18:34:32 +08:00
lilydjwg
c86f8820b7 fix Source.keymanager not defined when __config__ is absent 2018-05-08 18:18:17 +08:00
lilydjwg
6fcf5869ad add test_keyfile 2018-05-08 18:18:10 +08:00
lilydjwg
cf817ffe27 add Source.on_exception event hook 2018-05-08 18:17:45 +08:00
lilydjwg
de1634a587 ConfigParser.get doesn't return None on missing options 2018-05-06 12:18:28 +08:00
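A quick sketch of the stdlib behavior this commit works around: `ConfigParser.get` raises on a missing option, unlike mapping-style access (plain Python stdlib, not nvchecker code):

```python
import configparser

cp = configparser.ConfigParser()
cp.read_string('[pkg]\npresent = 1\n')

# Mapping-style SectionProxy.get behaves like dict.get and returns None:
missing_via_section = cp['pkg'].get('missing')

# ConfigParser.get(section, option) raises instead of returning None:
try:
    cp.get('pkg', 'missing')
    raised = False
except configparser.NoOptionError:
    raised = True
print(missing_via_section, raised)
```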
lilydjwg
26e7160a90 add an empty "keys" section 2018-05-06 11:52:10 +08:00
lilydjwg
54f474cfb2 not zip_safe 2018-05-06 11:48:24 +08:00
lilydjwg
3d69ad4ad6 update README for keyfile 2018-05-06 00:01:27 +08:00
lilydjwg
6bcef4e307 gitlab: tweak key logic 2018-05-06 00:00:48 +08:00
lilydjwg
b12f98ae6e gitlab: use keyfile 2018-05-05 23:52:13 +08:00
lilydjwg
39e79631fc fix configparser usage bug 2018-05-05 23:49:33 +08:00
lilydjwg
82dba4a965 github: use keyfile 2018-05-05 23:44:50 +08:00
lilydjwg
a73314dc64 make keyfile path relative to the config file 2018-05-05 23:44:35 +08:00
lilydjwg
d0a197115c support more kwargs for get_version implementations 2018-05-05 23:40:04 +08:00
lilydjwg
6c044b010e add KeyManager 2018-05-05 23:37:07 +08:00
lilydjwg
2e4f2bc424 test: test github use_max_tag with paging and include_tags_pattern 2018-05-04 11:46:40 +08:00
lilydjwg
23e9464cda github: support paging for use_max_tag 2018-05-04 11:46:26 +08:00
lilydjwg
081266e807 structlog: a rendering processor is always needed 2018-05-04 11:29:52 +08:00
lilydjwg
317ff6d97a github: implement the include_tags_pattern option, closes #70 2018-05-04 11:05:25 +08:00
lilydjwg
1be2f5fe80 github: add docs about new include_tags_pattern option 2018-05-04 11:01:23 +08:00
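A hedged example of the option these commits add (repository illustrative; option name per the commit titles):

```ini
[linux]
github = torvalds/linux
use_max_tag = true
# only consider tags matching this regex
include_tags_pattern = ^v
```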
lilydjwg
7676628b57 json logging to stdout instead 2018-04-14 13:44:33 +08:00
lilydjwg
8b2c933d02 Merge branch 'structlog'
closes #67
2018-04-14 00:09:01 +08:00
lilydjwg
a8f8a41c46 update tests 2018-04-14 00:01:57 +08:00
lilydjwg
d7ce75a2fd source.cmd: fix typo 2018-04-13 23:59:18 +08:00
lilydjwg
bb629ec561 source.cmd: reject empty results
also better logging
2018-04-13 23:48:12 +08:00
lilydjwg
07fe08c639 support use both loggers 2018-04-13 23:30:05 +08:00
lilydjwg
85288583e1 rename --logger argument to indicate usage rather than implementation 2018-04-12 13:52:05 +08:00
lilydjwg
e9357f490d fix name info on exception 2018-04-11 17:03:50 +08:00
lilydjwg
8d2690e344 try Python 3.7 again 2018-04-11 16:50:13 +08:00
lilydjwg
cb6d38ed43 update travis config 2018-04-11 16:47:48 +08:00
lilydjwg
a5510113aa update README and setup.py 2018-04-11 16:45:19 +08:00
lilydjwg
9f1f769738 fix throttling
The old implementation would wait for all of the last max_concurrent tasks
to complete.
2018-04-11 16:40:33 +08:00
lilydjwg
79c36641a2 switch from logging to structlog 2018-04-11 16:18:41 +08:00
lilydjwg
ac1e248aac update tests 2018-04-09 14:07:19 +08:00
lilydjwg
e8235ee2a2 Merge remote-tracking branch 'origin/pr/68' 2018-04-09 14:05:56 +08:00
Felix Yan
fd6abcaeee
Fix a typo in nvchecker/get_version.py 2018-04-09 13:58:52 +08:00
lilydjwg
b24f4ae5fe try five times for AUR tests 2018-02-07 22:05:14 +08:00
lilydjwg
6cf7867428 travis: update 2018-02-07 21:58:30 +08:00
lilydjwg
1be7b57ea0 update ubuntupkg test 2018-02-07 21:56:45 +08:00
lilydjwg
ee1b8267d0 fix github tests 2018-02-07 21:42:10 +08:00
lilydjwg
c150b77b61 github: don't use master as default branch 2018-02-07 18:54:09 +08:00
lilydjwg
7133ffc1a0 [test] yield_fixture is deprecated 2018-01-23 15:39:59 +08:00
依云
96b3d83849
Merge pull request #66 from felixonmars/fix-typo
Fix a typo in README
2018-01-15 09:40:38 +08:00
Felix Yan
e1ca67442f
Fix a typo in README 2018-01-15 00:47:16 +08:00
依云
48c394b693
Merge pull request #65 from felixonmars/handle-exception
Handle exception for no-tag situation
2018-01-09 14:23:16 +08:00
Felix Yan
9ad63287eb
Handle exception for no-tag situation
Before:
```
[E 01-09 11:26:03.736 core:132] unexpected error happened with dnscrypt-proxy
    Traceback (most recent call last):
      File "/usr/lib/python3.6/site-packages/nvchecker/core.py", line 128, in future_done
        version = fu.result()
      File "/usr/lib/python3.6/site-packages/nvchecker/get_version.py", line 43, in get_version
        version = await func(name, conf)
      File "/usr/lib/python3.6/site-packages/nvchecker/source/github.py", line 43, in get_version
        version = data[-1]
    IndexError: list index out of range
```

After:
```
[E 01-09 11:32:35.308 github:50] dnscrypt-proxy: No tag found in upstream repository.
```
2018-01-09 11:32:50 +08:00
lilydjwg
1693d82fd4 declare to be stable 2017-12-08 15:45:53 +08:00
lilydjwg
e3d7a6c29a v1.0, yay! 2017-12-08 15:32:21 +08:00
lilydjwg
0ef27aaffd debianpkg: use sources.debian.org 2017-12-05 21:58:26 +08:00
lilydjwg
1021d5f7e4 manual: no empty values 2017-12-01 18:14:04 +08:00
lilydjwg
587203d9ec catch CURLE_UNSUPPORTED_PROTOCOL errors
fixes #63
2017-11-22 18:23:41 +08:00
lilydjwg
75873b5281 update test_aur 2017-11-05 16:41:06 +08:00
Yen Chi Hsuan
8a17ccfd6b Use asyncio locks to ensure cache are always used 2017-11-05 01:38:51 +08:00
Yen Chi Hsuan
5df11f7b1a Improve android_sdk version extractor
* Add README
* Update author info
* Cache repo manifests
2017-11-05 01:29:12 +08:00
Yen Chi Hsuan
bb3fa58481 Add tests for Android SDK packages 2017-10-29 22:57:59 +08:00
Yen Chi Hsuan
02601c0b58 Support checking versions from Android SDK 2017-10-13 00:19:57 +08:00
依云
3083ef6a1f Merge pull request #61 from yan12125/substitute-versions
Add support for substitutions of version strings
2017-10-13 00:15:10 +08:00
Yen Chi Hsuan
a1222a0820 Catch exceptions with version substitutions and log them 2017-10-12 22:56:24 +08:00
Yen Chi Hsuan
05b604530c Refine docs of substitution options and add a test for option priority 2017-10-12 22:48:06 +08:00
Yen Chi Hsuan
78fef9d34d Fix README 2017-10-11 01:58:21 +08:00
Yen Chi Hsuan
b834f9a81c Add a link for the new section 2017-10-11 01:57:35 +08:00
Yen Chi Hsuan
ffa6eb7b8f Add documents and tests for substitution commands 2017-10-11 01:50:44 +08:00
Yen Chi Hsuan
3628505ea7 Support substitution of version strings 2017-10-10 00:39:24 +08:00
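A hedged sketch of the version-substitution feature this series introduces (option and section names are assumptions based on nvchecker's documentation of the feature, not taken from these commits):

```ini
[example]
github = lilydjwg/nvchecker
use_max_tag = true
# strip the leading "v" from tags like v2.0
prefix = v
```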
lilydjwg
f3653c79c8 update tests (3rd time) 2017-08-30 17:53:25 +08:00
lilydjwg
fd5879a4a2 update tests again 2017-08-30 17:36:55 +08:00
lilydjwg
fcd38b0bf7 update tests 2017-08-30 17:34:05 +08:00
lilydjwg
156e281a82 nvtake: add support for '--all'
fixes #60
2017-08-30 14:27:00 +08:00
lilydjwg
dfbc91f22e ignore when version records file does not exist 2017-08-30 14:26:25 +08:00
lilydjwg
e25f0bebce update sample_source.ini 2017-08-30 14:26:03 +08:00
lilydjwg
ac6616afd6 fix get_version returning None 2017-08-13 18:49:36 +08:00
lilydjwg
f98ef68949 replace linebreaks in version strings
linebreaks in version strings break nvchecker
2017-08-13 11:54:22 +08:00
依云
0912747416 Merge pull request #58 from felixonmars/improve-error
Improve error message for nonexistent nvtake targets
2017-08-05 08:38:26 +08:00
Felix Yan
39d1895b2b
Improve error message for nonexistent nvtake targets
`nvtake foo` was giving the following error:

```
Traceback (most recent call last):
  File "/usr/bin/nvtake", line 11, in <module>
    load_entry_point('nvchecker==0.5', 'console_scripts', 'nvtake')()
  File "/usr/lib/python3.6/site-packages/nvchecker/tools.py", line 34, in take
    oldvers[name] = newvers[name]
KeyError: 'foo'
```

This is unclear and too verbose. With the change it is now:

```
[C 08-05 02:51:46.767 tools:38] foo doesn't exist in 'newver' set.
```
2017-08-05 02:51:14 +08:00
lilydjwg
933df7da4c update README for anitya, change version to dev
closes #57
2017-07-25 15:36:40 +08:00
lilydjwg
2d2f7149aa add Anitya support 2017-07-25 15:27:49 +08:00
lilydjwg
c3503c34ac bump version 2017-07-11 14:08:01 +08:00
依云
0115089279 Merge pull request #56 from felixonmars/fix-typo
Fix a typo: seperated -> separated
2017-07-10 16:18:38 +08:00
Felix Yan
0202eaa404
Fix a typo: seperated -> separated 2017-07-10 15:49:36 +08:00
lilydjwg
b46cea9dda fix pkg_resources dep checking 2017-07-09 08:48:52 +08:00
lilydjwg
189847ffdc Revert "make AUR tests more flaky"
This reverts commit 61df628bd8.

It doesn't help :-(
2017-07-08 16:15:17 +08:00
lilydjwg
61df628bd8 make AUR tests more flaky 2017-07-08 16:11:16 +08:00
lilydjwg
c81a826c57 github wants the User-Agent header properly set 2017-07-08 16:06:33 +08:00
lilydjwg
1bb14397ed no need to return name now; we track this on the futures 2017-07-08 15:59:54 +08:00
lilydjwg
774c36f4fe fix bitbucket source 2017-07-08 15:52:30 +08:00
lilydjwg
f0f841db71 fix test_proxy skipping logic 2017-07-08 15:48:43 +08:00
lilydjwg
19be32164f fix that aiohttp uses 'read()' on response, but tornado doesn't 2017-07-08 15:48:03 +08:00
lilydjwg
3226a76cfe fix __config__ without oldver and newver 2017-07-08 15:18:34 +08:00
lilydjwg
dfefcd4db9 max_concurrent setting 2017-07-08 15:05:17 +08:00
lilydjwg
7c534aec18 fix tests 2017-07-08 14:59:14 +08:00
lilydjwg
fdcd84ea2a update dependencies in README 2017-07-08 14:58:46 +08:00
lilydjwg
85cad99662 tests for both tornado and aiohttp 2017-07-08 14:55:34 +08:00
lilydjwg
139afdaa59 support both tornado and aiohttp
also fix throttling with aiohttp
2017-07-08 14:42:18 +08:00
lilydjwg
f6e836c16a use HTTPS for links 2017-07-06 16:21:05 +08:00
lilydjwg
ad583dd59a remove Google Code sources 2017-07-06 16:21:05 +08:00
依云
a82c18c376 Merge pull request #55 from felixonmars/ubuntupkg
Add ubuntupkg source
2017-07-06 15:47:42 +08:00
Felix Yan
028cf52b2d
Mark ubuntupkg tests as flaky 2017-07-06 13:31:24 +08:00
Felix Yan
9085dceeb3 Add ubuntupkg source 2017-07-06 13:27:44 +08:00
依云
c10e88206b Merge pull request #54 from felixonmars/asyncio
Add global proxy setting to README
2017-07-06 12:51:54 +08:00
Felix Yan
1448d9edcd
Add global proxy setting to README 2017-07-06 12:49:36 +08:00
lilydjwg
3489e4e2d1 call fsync on saving 2017-07-06 12:44:27 +08:00
Felix Yan
84df2716b5 AsyncIO rewrite 2017-07-06 12:24:45 +08:00
lilydjwg
c423bdee0a update Contents 2017-07-04 10:47:56 +08:00
lilydjwg
92aedb3680 bump version to dev 2017-07-04 10:46:39 +08:00
lilydjwg
ca2dbd5949 Merge branch 'master' of github.com:lilydjwg/nvchecker 2017-07-04 10:45:55 +08:00
依云
02fbb63657 Merge pull request #52 from felixonmars/pytest
Use new pytest-runner command
2017-07-04 10:37:23 +08:00
依云
08975b3302 Merge pull request #51 from felixonmars/debianpkg
Add Debian package source
2017-07-04 10:36:58 +08:00
Felix Yan
967a4c2f7f
Use new pytest-runner command
`ptr` is considered deprecated and will be removed in a future version.
2017-07-04 01:18:42 +08:00
Felix Yan
2b5f2d190d
Add Debian package source 2017-07-04 01:14:55 +08:00
依云
780f75eaf7 Merge pull request #50 from felixonmars/patch-1
Update archpkg and pacman test cases to match new package version
2017-06-27 10:23:18 +08:00
Felix Yan
af982c19dd
Update archpkg and pacman test cases to match new package version 2017-06-27 01:32:48 +08:00
lilydjwg
e0765d6ce6 bump version 2017-06-26 22:52:28 +08:00
lilydjwg
018a2957ad update tokens for travis 2017-03-24 22:18:35 +08:00
依云
fcd2ff29e4 Merge pull request #47 from felixonmars/metacpan-update
CPAN: Use metacpan fastapi
2017-03-24 20:41:25 +08:00
Felix Yan
02809b703f
CPAN: Use metacpan fastapi
All docs in metacpan.org currently point to this new API endpoint. The
old API endpoint returns wrong data for e.g. TermReadKey (latest 2.37 vs
API 2.14), and the new API endpoint doesn't have this issue.
2017-03-24 16:50:56 +08:00
lilydjwg
8e2be647e0 add license declaration to each file 2017-02-28 15:27:24 +08:00
依云
1998238594 Merge pull request #46 from romab/more_proxy
Use proxy settings if set for github and cpan repos
2017-02-09 21:32:01 +08:00
Gabriel Kihlman
aa10e20dcc Update documentation with newly added proxy settings for github and cpan 2017-02-09 13:48:38 +01:00
Gabriel Kihlman
224e9f4b08 Use proxy settings if set for github and cpan repos 2017-02-09 11:31:05 +01:00
lilydjwg
bd61be6981 compare oldver and newver using == instead of version comparison
closes #45 and archlinuxcn/repo#351
2017-02-06 15:54:21 +08:00
lilydjwg
63a0161acf can't test against PyPy3 2017-02-04 16:38:58 +08:00
lilydjwg
d62d49a1d5 add MIT LICENSE 2017-02-04 16:36:48 +08:00
lilydjwg
22e1ca9ed1 travis: use more Python versions 2017-02-04 16:33:27 +08:00
lilydjwg
82ca0cff9c skip archpkg test on Travis-CI 2017-02-04 16:29:59 +08:00
lilydjwg
5c54947b42 bump version 2017-01-25 15:47:45 +08:00
lilydjwg
aa418aba20 remove support for GitCafe, which has died
closes #42
2016-09-01 20:40:51 +08:00
依云
375fb4f14f add support for crates.io 2016-07-22 14:24:07 +08:00
Jiachen Yang
b9f5a1e476 add pyalpm.vercmp as an alternative to parse_version
`pkg_resources.parse_version` has different results from
`pyalpm.vercmp`. This allows using `pyalpm.vercmp` as the sorting key.

This closes #40.
2016-04-24 16:05:52 +08:00
lilydjwg
8850babe71 add README for the lib directory 2016-04-04 18:39:31 +08:00
依云
9df0bb5433 [github] strip the final 'Z' of the date and update the test 2016-03-14 15:20:04 +08:00
依云
3c97b5c693 [github] uniform indentation 2016-03-14 15:13:31 +08:00
依云
89acf137a0 [github] more accurate version string 2016-03-14 15:12:41 +08:00
lilydjwg
65ed4ada17 bump version 2016-02-29 17:55:28 +08:00
lilydjwg
316dcd18be [aur]: update to AUR API version 5 2016-02-29 17:53:51 +08:00
lilydjwg
7209189753 Merge pull request #38 from felixonmars/git-tag
Remove unnecessary chdir, and avoid the need to create the dir when there is no need to parse PKGBUILD
2016-01-01 12:02:12 +08:00
Felix Yan
c90e8fc07b Remove unnecessary chdir, and avoid the need to create the dir when
there is no need to parse PKGBUILD
2015-12-31 20:28:47 +08:00
lilydjwg
f7e8f3be7b Merge pull request #37 from felixonmars/git-tag
Add use_max_tag and ignored_tags support to vcs source (git only)
2015-12-31 12:01:48 +08:00
Felix Yan
aaf375f0bd Add use_max_tag and ignored_tags support to vcs source (git only) 2015-12-29 12:15:19 +08:00
lilydjwg
e3aae23225 [tests] bye, slow GitCafe! 2015-12-24 21:36:09 +08:00
lilydjwg
7e5d4aa83a fix rst syntax 2015-12-23 22:00:07 +08:00
lilydjwg
70832eb6c8 README: update phrasing 2015-12-23 21:59:53 +08:00
Felix Yan
4b375866b6 Forcefully upgrade pytest on Travis 2015-12-23 21:50:42 +08:00
Felix Yan
e723053c41 Add ignored_tags support
Ignore certain tags while sorting. This option must be used together with
use_max_tag. This can be useful to avoid some known badly versioned tags,
so the newer tags won't be "overridden" by the old broken ones.
2015-12-23 21:50:38 +08:00
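As a sketch of the described usage in the old ini config format (the section, URL, and tag names are made up; the `vcs` key and the space-separated list syntax for `ignored_tags` are assumptions for illustration):

```ini
[example]
; the vcs URL key name is an assumption for illustration
vcs = git+https://example.org/example.git
use_max_tag = true
; skip known badly versioned tags so they don't mask newer ones
ignored_tags = v2.0-broken v1.9-bad
```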
lilydjwg
b34861f9cd Merge pull request #35 from felixonmars/master
Add support for Packagist (Composer)
2015-12-03 11:39:36 +08:00
Felix Yan
f02e6e2c31 Add support for Packagist (Composer) 2015-12-03 11:25:10 +08:00
lilydjwg
ac7ed64118 tests: gitlab-test has updated, let's check loosely 2015-11-29 22:48:51 +08:00
lilydjwg
3fd661ef74 fix links for README 2015-11-29 22:39:46 +08:00
lilydjwg
83c99f1edf minor update for scripts/gen_contents 2015-11-29 22:37:30 +08:00
lilydjwg
a977a508d2 add scripts/gen_contents 2015-11-29 22:31:29 +08:00
lilydjwg
b6df5d6f2d add "Contents" to README 2015-11-29 22:30:33 +08:00
lilydjwg
d22ca1dde0 README: use svg image for PyPI 2015-11-11 14:31:25 +08:00
lilydjwg
fbbc7cb1e4 bump version for PyPI
also adjust test_gitcafe retry times
2015-11-11 14:28:25 +08:00
lilydjwg
af14b39833 update for releasing to PyPI 2015-11-11 14:21:58 +08:00
lilydjwg
986773c814 bump version to 0.4 2015-11-11 14:21:17 +08:00
lilydjwg
2e6749021f Merge pull request #34 from felixonmars/master
Don't install tests directory
2015-11-08 01:21:57 +08:00
Felix Yan
8cfcb5897a Don't install tests directory 2015-11-08 00:50:52 +08:00
lilydjwg
9df22e807e Merge pull request #33 from felixonmars/master
Fix a small typo in README
2015-11-07 22:01:37 +08:00
Felix Yan
6c661bb2e0 Fix a small typo in README 2015-11-07 21:38:35 +08:00
依云
697ba410d5 Merge pull request #32 from felixonmars/flaky
Make test_gitcafe flaky
2015-11-05 23:46:49 +08:00
Felix Yan
a9fb82b03c Actually pytest-runner need not be in tests_require 2015-11-05 23:33:13 +08:00
Felix Yan
b8987edaff Skip github tests if NVCHECKER_GITHUB_TOKEN not present 2015-11-05 23:23:21 +08:00
Felix Yan
03765db683 Make test_gitcafe flaky 2015-11-05 23:16:30 +08:00
lilydjwg
694d8d6bea travis: still fixing env... 2015-11-05 23:09:31 +08:00
lilydjwg
e6ece60c7d travis: fix env setting 2015-11-05 23:02:25 +08:00
lilydjwg
4d46b494f4 fix typo 2015-11-05 22:55:48 +08:00
lilydjwg
8b34569e01 update for GitLab support 2015-11-05 22:53:22 +08:00
Felix Yan
19d78411a5 Add GitLab support 2015-11-05 22:47:35 +08:00
依云
ed39e35c4a Merge pull request #29 from felixonmars/bitbucket-tags
Add use_max_tag support for bitbucket source
2015-11-05 22:31:40 +08:00
依云
29b0fe7e2d Merge pull request #31 from felixonmars/tests
Add more tests
2015-11-05 22:02:19 +08:00
Felix Yan
ce9819f49d Add testcases for VCS source 2015-11-05 20:59:07 +08:00
Felix Yan
5bde44a3c2 Add testcases for pacman 2015-11-05 20:47:46 +08:00
Felix Yan
7065df1b32 Add a testcase for GitCafe 2015-11-05 20:41:53 +08:00
lilydjwg
132bbdc559 travis: add NVCHECKER_GITHUB_TOKEN env 2015-11-05 19:45:19 +08:00
lilydjwg
1d48d813e3 Merge remote-tracking branch 'origin/pr/27' 2015-11-05 19:43:39 +08:00
lilydjwg
52c09f1e4a archpkg: be a little nicer :-) 2015-11-05 19:33:09 +08:00
Felix Yan
07d75c4c5d Add use_max_tag support for GitHub source 2015-11-05 12:18:50 +08:00
Felix Yan
d28d869add Add use_max_tag support for bitbucket source 2015-11-05 12:18:15 +08:00
Felix Yan
2c17c40206 Fix archpkg and add tests 2015-11-05 11:42:07 +08:00
lilydjwg
4be2d6c09d fix github.py 2015-11-05 01:29:05 +08:00
依云
f924b973d4 Merge pull request #26 from felixonmars/tests
Testcases for Regex and Manual
2015-11-05 00:08:24 +08:00
Felix Yan
d77cff9cc7 Add a testcase for Manual 2015-11-04 23:44:23 +08:00
Felix Yan
7595011cbb Add a testcase for Regex 2015-11-04 23:43:35 +08:00
lilydjwg
d36b11321d Merge remote-tracking branch 'origin/pr/25' 2015-11-04 23:31:40 +08:00
Felix Yan
0047c6ddac add testcases for AUR 2015-11-04 23:30:25 +08:00
lilydjwg
2992a75686 move sample_source.ini 2015-11-04 23:26:22 +08:00
lilydjwg
3d36589d1a remove unnecessary config files 2015-11-04 23:24:37 +08:00
Felix Yan
5db3b93dcd test helper: generate a real ConfigParser to make all its methods available 2015-11-04 23:11:59 +08:00
lilydjwg
d909983f0d update gitignore for pytest 2015-11-04 22:43:12 +08:00
lilydjwg
1d3abf73b2 use a common module for sources with simple JSON API
closes #19
2015-11-04 22:42:32 +08:00
依云
13c1253c19 Merge pull request #24 from felixonmars/cpan-fix
CPAN fix: version can be an integer and crash nvchecker
2015-11-04 22:28:28 +08:00
依云
18f7eeb825 Merge pull request #23 from felixonmars/tests
Move global IOLoop hack to helper
2015-11-04 22:27:59 +08:00
Felix Yan
b32952e66c CPAN fix: version can be an integer and crash nvchecker 2015-11-04 22:09:36 +08:00
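The crash class named in this commit (a JSON `version` field that may arrive as a number rather than a string) is typically handled by coercing early. A minimal, hypothetical sketch (the function name is invented, not nvchecker's actual code):

```python
def normalize_version(value):
    # the CPAN/metacpan API may return `version` as a number instead of a
    # string; coercing to str keeps later string handling from crashing
    return str(value)
```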
Felix Yan
f1cca7388b Move global IOLoop hack to helper 2015-11-04 21:25:01 +08:00
lilydjwg
f48fdbc441 Merge remote-tracking branch 'origin/pr/22' 2015-11-04 21:00:02 +08:00
lilydjwg
088b73ebde remove the only explicit io_loop argument
This requires Tornado>=4.1.
2015-11-04 20:57:21 +08:00
lilydjwg
b047840c04 Merge remote-tracking branch 'origin/pr/22' 2015-11-04 20:52:43 +08:00
Felix Yan
a76e283918 Clean up sync_get_version 2015-11-04 20:50:22 +08:00
Felix Yan
f5fb79cc63 Add a testcase for Command 2015-11-04 20:47:39 +08:00
Felix Yan
f7287bdf56 Add a testcase for CPAN 2015-11-04 20:20:25 +08:00
Felix Yan
10ac7ea7a6 Add a testcase for RubyGems 2015-11-04 20:18:37 +08:00
Felix Yan
5ae52b9e16 Add a testcase for Hackage 2015-11-04 20:17:37 +08:00
Felix Yan
793b1ad0da Add a testcase for PyPI 2015-11-04 20:15:43 +08:00
lilydjwg
d2049aa3ce update README 2015-11-04 20:13:32 +08:00
lilydjwg
512399cef5 add support for GitHub releases
Original author is isamazingfate, in this pull request:
https://github.com/lilydjwg/nvchecker/pull/15

This closes #15.
2015-11-04 20:10:13 +08:00
lilydjwg
15b800d5a8 more fix for README.rst 2015-11-04 19:43:10 +08:00
lilydjwg
7852ea4d7e fix README.rst 2015-11-04 19:36:28 +08:00
依云
66123fbe9f Merge pull request #21 from felixonmars/tests
Add testcases
2015-11-04 19:32:15 +08:00
Felix Yan
5fc778c6e5 Add testcases 2015-11-04 19:18:03 +08:00
lilydjwg
03e77aa1af Merge remote-tracking branch 'origin/pr/18' 2015-11-04 18:23:04 +08:00
lilydjwg
0cd587c6ba Merge remote-tracking branch 'origin/pr/17' 2015-11-04 18:21:26 +08:00
依云
a60e493170 Merge pull request #16 from felixonmars/master
Add support for NPM Registry
2015-11-04 18:18:56 +08:00
Felix Yan
2431922619 Add support for CPAN 2015-11-04 18:15:40 +08:00
Felix Yan
0a3d62986a Add support for Hackage 2015-11-04 18:01:47 +08:00
Felix Yan
49ec750772 Add support for NPM Registry 2015-11-04 17:39:30 +08:00
lilydjwg
3dce06ca21 add strip-release config, closing #11 2015-08-25 00:18:19 +08:00
lilydjwg
8a6fbf4a9a Revert "try aur4 url first"
This reverts commit fd0c7d57ae.

AUR4 has merged into current AUR.
2015-08-20 01:38:27 +08:00
lilydjwg
011ae94c4d update notify.py, fixing for x86_64 2015-08-05 23:35:13 +08:00
lilydjwg
9de0eb289e add bitbucket support 2015-08-04 16:12:46 +08:00
依云
ddf9169074 Merge pull request #10 from petronny/master
modified get_version.py for gitcafe support
2015-06-29 09:21:02 +08:00
Jingbei Li
c6c647ac56 modified get_version.py for gitcafe support 2015-06-28 22:07:14 +08:00
依云
108dfeea53 Merge pull request #9 from petronny/master
simple gitcafe.com support
2015-06-28 21:54:29 +08:00
Jingbei Li
6cf385a73a simple gitcafe.com support 2015-06-28 21:37:03 +08:00
依云
00978b2a5b Merge pull request #8 from petronny/master
fixed no user-agent error for github.py
2015-06-28 11:22:34 +08:00
Jingbei Li
7390202d87 fixed no user-agent error for github.py 2015-06-28 00:19:33 +08:00
依云
34a1a88f63 Merge pull request #7 from farseerfc/aur4
try aur4 url first
2015-06-15 21:12:58 +08:00
Jiachen Yang
fd0c7d57ae try aur4 url first 2015-06-15 16:58:23 +09:00
依云
0093a40957 Merge pull request #6 from farseerfc/master
handle github repo redirects using preview api
2015-05-12 15:18:05 +08:00
Jiachen Yang
bdf5efdb76 use preview redirect api regardless if access token is set 2015-05-12 08:28:27 +09:00
Jiachen Yang
41df426ca5 handle github repo redirects using preview api 2015-05-12 08:07:08 +09:00
依云
4b5b037b9e Merge pull request #5 from farseerfc/master
escape aurname in url
2015-05-11 12:46:45 +08:00
Jiachen Yang
c3c16247b7 escape aurname in url 2015-05-11 02:40:51 +09:00
lilydjwg
fa927dbf50 fix missing import 2015-04-18 18:59:23 +08:00
lilydjwg
0c362d907d nvtake: backup oldver before overwriting 2015-04-18 16:39:43 +08:00
lilydjwg
819d958461 update README 2015-04-15 13:46:42 +08:00
依云
388889a27e Merge pull request #4 from farseerfc/master
add access_token for github api, use NVCHECKER_GITHUB_TOKEN env var
2015-04-15 13:28:26 +08:00
Jiachen Yang
7f081f4a79 add access_token for github api, use NVCHECKER_GITHUB_TOKEN env var 2015-04-15 14:12:26 +09:00
lilydjwg
e4830dd162 add 'user_agent' config for the regex source 2015-03-14 15:54:26 +08:00
lilydjwg
ca931633aa fix gcode_hg source 2014-12-18 15:27:44 +08:00
lilydjwg
be7e9105c0 add gcode_svn source: for Google Code svn repositories 2014-12-18 15:27:22 +08:00
lilydjwg
0bf4559c3c add missing dependency, fixing #3 2014-12-12 14:57:38 +08:00
lilydjwg
82c85f0402 fix README syntax 2014-11-03 22:18:51 +08:00
lilydjwg
3194ec374b Merge branch 'source-vcs' 2014-11-03 22:09:23 +08:00
lilydjwg
369f75949d minor update for README 2014-11-03 22:08:26 +08:00
lilydjwg
0863ff5adf fix coding style for vcs.py 2014-11-03 21:41:56 +08:00
Yichao Yu
5457360858 Update README 2014-11-03 07:39:56 -05:00
Yichao Yu
ff7f3c65f9 add vcs version checker 2014-11-03 01:42:17 -05:00
Yichao Yu
f85909e6ee ignore package files 2014-11-02 23:20:12 -05:00
lilydjwg
ab9f453e3a add conf['oldver'] for get_version 2014-11-03 11:52:48 +08:00
lilydjwg
ad31a14c75 correct version number 2014-10-19 14:35:17 +08:00
lilydjwg
4b0bff467e update README 2014-10-19 13:44:47 +08:00
lilydjwg
57ae4c54f1 fix a NameError 2014-10-16 11:35:52 +08:00
lilydjwg
5cb229a26f archpkg: don't take testing repo into account 2014-10-15 11:46:56 +08:00
lilydjwg
2c3ba193e3 add support for RubyGems 2014-09-29 16:38:44 +08:00
lilydjwg
b258adfc13 source update 2014-09-29 16:35:55 +08:00
lilydjwg
8c8c0364f7 update README 2014-09-14 16:28:38 +08:00
lilydjwg
73b9f1bb1b add branch support for GitHub source 2014-09-14 16:23:36 +08:00
lilydjwg
fbd046af34 sources update 2014-09-11 15:21:50 +08:00
lilydjwg
b5ff67fa67 add 'manual' source to specify versions by hand 2014-09-11 15:21:14 +08:00
lilydjwg
819a2c69d3 add new source: archpkg 2014-09-08 00:10:06 +08:00
lilydjwg
81a17ef690 sources update 2014-03-11 21:34:36 +08:00
lilydjwg
13f26b83d6 arch_cn source update 2014-02-25 23:34:54 +08:00
lilydjwg
7a2883a092 update sources and readme 2014-02-25 23:22:50 +08:00
lilydjwg
906254e6b5 do a .task_dec() when an exception happens 2014-02-25 23:21:20 +08:00
lilydjwg
21a9eec31b sources update 2014-02-20 15:42:24 +08:00
lilydjwg
e71d56b671 nicelogger update 2014-01-31 17:26:12 +08:00
lilydjwg
dcdfaa7d25 sources update 2014-01-31 17:26:03 +08:00
lilydjwg
60894a43fa sources update 2014-01-08 22:28:42 +08:00
lilydjwg
a4ce7e3b66 update README 2014-01-08 21:25:43 +08:00
lilydjwg
2eea113a66 add nvcmp command 2014-01-08 21:24:13 +08:00
lilydjwg
653324ae6f set 'oldver' and 'newver' to none if not provided 2014-01-08 21:18:02 +08:00
lilydjwg
5dd8a4df68 nvtake: exit on unexpected config 2014-01-08 21:16:40 +08:00
lilydjwg
a767da8a51 sources update 2013-12-08 21:50:17 +08:00
lilydjwg
dfaf858951 version 0.4: simpler config, simpler usage
and simpler code
2013-12-04 00:34:14 +08:00
lilydjwg
9f0e030958 fix README.rst format 2013-11-18 00:37:48 +08:00
lilydjwg
b7534c9091 remove troublesome colon from record files 2013-11-17 23:43:07 +08:00
lilydjwg
e5d89e8ce8 rename func; it isn't test any more 2013-11-17 23:26:05 +08:00
lilydjwg
3a1960425f reraise AUR HTTP Error 2013-11-17 19:20:05 +08:00
lilydjwg
659ff63fe0 [sources/lilydjwg] zbar goes to [community] now 2013-11-17 19:19:08 +08:00
lilydjwg
44d0e8ee84 [sources] fix arch_cn lib32-fcitx* 2013-11-14 11:38:43 +08:00
lilydjwg
e06963ad50 minor typo 2013-11-09 21:45:22 +08:00
lilydjwg
f92a10ddfb sources update 2013-11-03 21:05:39 +08:00
lilydjwg
e36c74d639 sources/arch_lilydjwg.ini updated 2013-11-03 18:50:47 +08:00
lilydjwg
16ef913b88 fix exception when AUR package disappears 2013-11-03 18:50:19 +08:00
lilydjwg
dbc35aa901 refactor get_version.py 2013-11-03 18:21:50 +08:00
lilydjwg
a2ed894c2d complete update for arch_cn.ini 2013-11-03 00:22:25 +08:00
lilydjwg
a147f22fc4 support 'pacman' source; README update 2013-11-03 00:21:46 +08:00
lilydjwg
c150ec27ac add scripts/maintained_in_aur; update arch_aur.ini 2013-10-30 22:21:18 +08:00
lilydjwg
abeefb6426 sources update 2013-09-26 18:31:30 +08:00
lilydjwg
8b0105d491 sources/arch_aur.ini update 2013-09-25 14:53:55 +08:00
lilydjwg
76c834d3d2 add helper scripts: arch_repopkg_unmonitored 2013-09-25 14:53:49 +08:00
lilydjwg
7c20a2ff21 add sources/arch_aur.ini 2013-09-08 16:02:37 +08:00
lilydjwg
2cc5de34ab more logging 2013-09-08 15:18:06 +08:00
lilydjwg
312f99edf9 update README 2013-09-08 15:11:23 +08:00
lilydjwg
bce4b1af41 add Google Code (hg variant) support 2013-09-08 15:08:11 +08:00
lilydjwg
27df5909e3 add PyPI support
API: https://wiki.python.org/moin/PyPIJSON?action=show&redirect=PyPiJson
2013-09-08 14:56:22 +08:00
lilydjwg
110d083bb9 arch_lilydjwg.ini update 2013-09-05 21:14:00 +08:00
lilydjwg
b58e3f70e5 add the nvtake command utility 2013-09-05 16:53:19 +08:00
lilydjwg
fd99a076e9 add config file support 2013-09-05 16:27:44 +08:00
lilydjwg
e1457aadd3 rearrange argument parsing 2013-08-21 17:38:43 +08:00
lilydjwg
b736675784 arch_lilydjwg.ini update 2013-08-09 21:56:05 +08:00
lilydjwg
0e53128981 fix 2013-08-09 21:55:11 +08:00
lilydjwg
6d4e3c84b8 add back arch_lilydjwg.ini 2013-08-09 19:01:54 +08:00
lilydjwg
ff46f33752 use argparse instead of tornado.options and change a few terms 2013-08-09 18:50:41 +08:00
144 changed files with 8562 additions and 679 deletions

.github/workflows/codeql.yml (new file)

@@ -0,0 +1,84 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: "CodeQL"
on:
push:
branches: [ "master" ]
pull_request:
branches: [ "master" ]
schedule:
- cron: '34 14 * * 4'
jobs:
analyze:
name: Analyze
# Runner size impacts CodeQL analysis time. To learn more, please see:
# - https://gh.io/recommended-hardware-resources-for-running-codeql
# - https://gh.io/supported-runners-and-hardware-resources
# - https://gh.io/using-larger-runners
# Consider using larger runners for possible analysis time improvements.
runs-on: ${{ (matrix.language == 'swift' && 'macos-latest') || 'ubuntu-latest' }}
timeout-minutes: ${{ (matrix.language == 'swift' && 120) || 360 }}
permissions:
# required for all workflows
security-events: write
# only required for workflows in private repositories
actions: read
contents: read
strategy:
fail-fast: false
matrix:
language: [ 'python' ]
# CodeQL supports [ 'c-cpp', 'csharp', 'go', 'java-kotlin', 'javascript-typescript', 'python', 'ruby', 'swift' ]
# Use only 'java-kotlin' to analyze code written in Java, Kotlin or both
# Use only 'javascript-typescript' to analyze code written in JavaScript, TypeScript or both
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
steps:
- name: Checkout repository
uses: actions/checkout@v4
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.
# For more details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
# queries: security-extended,security-and-quality
# Autobuild attempts to build any compiled languages (C/C++, C#, Go, Java, or Swift).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v3
# Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
# If the Autobuild fails above, remove it and uncomment the following three lines.
# modify them (or add more) to build your code if your project, please refer to the EXAMPLE below for guidance.
# - run: |
# echo "Run, Build Application using script"
# ./location_of_script_within_repo/buildscript.sh
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v3
with:
category: "/language:${{matrix.language}}"

.github/workflows/mypy.yaml (new file)

@@ -0,0 +1,28 @@
name: run mypy
on: [push, pull_request]
jobs:
build:
name: run mypy
runs-on: ubuntu-latest
steps:
- name: Checkout Code
uses: actions/checkout@v4
- name: Setup Cache
uses: actions/cache@v4
env:
cache-name: cache-pipmypy
with:
path: ~/.cache/pip
key: ${{ runner.os }}-${{ env.cache-name }}-${{ hashFiles('setup.py') }}
restore-keys: |
${{ runner.os }}-${{ env.cache-name }}-
${{ runner.os }}-cache-pip-
- name: Install deps
run: pip3 install -U tornado pytest pytest-asyncio pytest-httpbin pytest-rerunfailures structlog tomli platformdirs aiohttp httpx mypy awesomeversion
- name: Run mypy for --install-types
run: PATH=$HOME/.local/bin:$PATH mypy --namespace-packages --explicit-package-bases nvchecker nvchecker_source tests
continue-on-error: true
- name: Install types
run: PATH=$HOME/.local/bin:$PATH yes | mypy --install-types
- name: Run mypy
run: PATH=$HOME/.local/bin:$PATH mypy --namespace-packages --explicit-package-bases nvchecker nvchecker_source tests

.github/workflows/tests.yaml (new file)

@@ -0,0 +1,93 @@
name: run tests
on: [push, pull_request]
jobs:
tests:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
python-version:
- "3.8"
- "3.9"
- "3.10"
- "3.11"
- "3.12"
- "3.13"
# pypy fails in some cases but we don't care much about that
# with github actions we can't mark some jobs to not affect the overall
# conclusion so we have to omit "allow-failure" tests.
# See https://github.com/actions/toolkit/issues/399
# - pypy-3.7
deps:
- tornado pycurl
# timer runs when loop is closed, see https://github.com/lilydjwg/nvchecker/actions/runs/11650699759/job/32439742210
# - aiohttp
- tornado
- httpx[http2]>=0.14.0
exclude: []
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Setup Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Setup Cache
uses: actions/cache@v4
env:
cache-name: cache-pip
with:
path: ~/.cache/pip
key: ${{ runner.os }}-${{ env.cache-name }}-${{ matrix.deps }}-${{ hashFiles('pyproject.toml', 'setup.cfg') }}
restore-keys: |
${{ runner.os }}-${{ env.cache-name }}-${{ matrix.deps }}-
${{ runner.os }}-${{ env.cache-name }}-
- name: Install pycurl deps
if: ${{ contains(matrix.deps, 'pycurl') }}
run: |
sudo apt update
sudo apt install -y libcurl4-openssl-dev
# werkzeug is pinned for httpbin compatibility https://github.com/postmanlabs/httpbin/issues/673
- name: Install Python deps
env:
# use env to avoid `>` being redirection
deps: ${{ matrix.deps }}
run: pip install -U $deps pytest 'pytest-asyncio>=0.24' pytest-httpbin pytest-rerunfailures structlog tomli platformdirs lxml jq 'werkzeug<2.1' awesomeversion
# don't use binary distribution because:
# hardcoded cacert path doesn't work on Ubuntu (should have been resolved?)
# limited compression support (only deflate & gzip)
- name: Install pycurl
if: ${{ contains(matrix.deps, 'pycurl') }}
run: |
pip uninstall -y pycurl
pip install -U pycurl --no-binary :all:
- name: Decrypt keys
env:
KEY: ${{ secrets.KEY }}
run: if [[ -n $KEY ]]; then openssl enc -d -aes-256-ctr -pbkdf2 -k $KEY -in keyfile.toml.enc -out keyfile.toml; fi
- name: Setup mitmproxy cache
uses: actions/cache@v4
env:
cache-name: cache-mitm
with:
path: ~/.mitmproxy
key: ${{ env.cache-name }}
restore-keys: |
${{ env.cache-name }}-
- name: Install mitmproxy
run: |
/usr/bin/python -m venv --system-site-packages ~/.mitmproxy/venv
. ~/.mitmproxy/venv/bin/activate
pip install -U mitmproxy
# https://github.com/DevToys-app/DevToys/issues/1373#issuecomment-2599820594
sudo sysctl -w kernel.apparmor_restrict_unprivileged_unconfined=0
sudo sysctl -w kernel.apparmor_restrict_unprivileged_userns=0
# - name: Setup upterm session
# uses: lhotari/action-upterm@v1
- name: Run pytest
env:
mitmdump: /home/runner/.mitmproxy/venv/bin/mitmdump
run: scripts/run_cached_tests

.gitignore

@@ -1,2 +1,12 @@
nvchecker.egg-info/
versions/
*.egg-info/
__pycache__/
/build/
/dist/
.cache/
.eggs/
*.pyc
*.pyo
.travis.pub
.pytest_cache/
.tox/
keyfile.toml

.readthedocs.yaml (new file)

@@ -0,0 +1,10 @@
version: 2
build:
os: ubuntu-22.04
tools:
python: "3.11"
sphinx:
configuration: docs/conf.py
python:
install:
- requirements: docs/requirements.txt

.typos.toml (new file)

@@ -0,0 +1,2 @@
[default.extend-words]
mis = "mis"

LICENSE (new file)

@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2013-2017 lilydjwg <lilydjwg@gmail.com>, et al.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.


@@ -1,96 +1,60 @@
**nvchecker** (short for *new version checker*) is for checking if a new version of some software has been released.
nvchecker is now **in development**.
This is the version 2.0 branch. For the old version 1.x, please switch to the ``v1.x`` branch.
.. image:: https://github.com/lilydjwg/nvchecker/workflows/run%20tests/badge.svg?branch=master
:alt: Test Status
:target: https://github.com/lilydjwg/nvchecker/actions?query=workflow%3A%22run+tests%22
.. image:: https://badge.fury.io/py/nvchecker.svg
:alt: PyPI version
:target: https://badge.fury.io/py/nvchecker
.. image:: https://readthedocs.org/projects/nvchecker/badge/?version=latest
:target: https://nvchecker.readthedocs.io/en/latest/?badge=latest
:alt: Documentation Status
|
.. image:: https://repology.org/badge/vertical-allrepos/nvchecker.svg
:alt: Packaging status
:target: https://repology.org/metapackage/nvchecker/versions
.. contents::
:local:
Dependency
==========
- Python 3
- Tornado
- All commands used in your configuration files
----------
- Python 3.8+
- Python library: structlog, platformdirs, tomli (on Python < 3.11)
- One of these Python library combinations (ordered by preference):
* tornado + pycurl
* aiohttp
* httpx with http2 support (experimental; only latest version is supported)
* tornado
- All commands used in your software version configuration files
Install and Run
---------------
To install::
pip3 install nvchecker
To use the latest code, you can also clone this repository and run::
pip install .
Running
=======
To see available options::
./nvchecker --help
nvchecker --help
Run with one or more configuration files::
Run with one or more software version files::
./nvchecker config_file_1 config_file_2 ...
nvchecker -c config_file
You will normally want to specify some "version files"; see below.
You will normally want to specify some "version record files"; see below.
Version Files
=============
Version files record which version of the software you know or is available. They are simple key-value pairs of ``(name, version)`` seperated by ``:`` ::
Documentation
-------------
fcitx: 4.2.7
google-chrome: 27.0.1453.93-200836
vim: 7.3.1024
Say you've got a version file called ``old_ver.txt`` which records all your watched software and their versions. To update it using ``nvchecker``::
./nvchecker --oldverfile=old_ver.txt --verfile=new_ver.txt config.ini
Compare the two files for updates (assuming they are sorted alphabetically; files generated by ``nvchecker`` are already sorted)::
comm -13 old_ver.txt new_ver.txt
# or say that in English:
comm -13 old_ver.txt new_ver.txt | sed 's/:/ has updated to version/;s/$/./'
# show both old and new versions
join old_ver.txt new_ver.txt | awk '$2 != $3'
Configuration Files
===================
The configuration files are in ini format. Each *section name* is the name of a piece of software. The following fields tell nvchecker how to determine the current version of that software.
See ``sample_config.ini`` for an example.
Search in a Webpage
-------------------
Search through a specific webpage for the version string. This type of version finding has these fields:
url
The URL of the webpage to fetch.
encoding
(*Optional*) The character encoding of the webpage, if ``latin1`` is not appropriate.
regex
A regular expression used to find the version string.
It can have zero or one capture group. The capture group or the whole match is the version string.
When multiple version strings are found, the maximum of those is chosen.
proxy
The HTTP proxy to use. The format is ``host:port``, e.g. ``localhost:8087``.
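A hypothetical entry using these fields (the section name, URL, and pattern are invented for illustration, in the style of the sample configs elsewhere in this repository):

```ini
[example-tool]
url = https://example.org/downloads/
regex = example-tool-([\d.]+)\.tar\.gz
proxy = localhost:8087
```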
Find with a Command
-------------------
Use a shell command line to get the version. The output is stripped first, so trailing newlines are not a problem.
cmd
The command line to use. This will run with the system's standard shell (e.g. ``/bin/sh``).
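As an illustration (the package name is arbitrary), mirroring the ``pacman -Si`` style used in this repository's sample configs:

```ini
[fcitx]
cmd = LANG=C pacman -Si fcitx | grep -F Version | awk '{print $3}'
```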
Check AUR
---------
Check `Arch User Repository <https://aur.archlinux.org/>`_ for updates.
aur
The package name in the AUR. If empty, the name of the software (the *section name*) is used.
Check GitHub
------------
Check `GitHub <https://github.com/>`_ for updates. The version returned is in date format ``%Y%m%d``, e.g. ``20130701``.
github
The GitHub repository, including the owner, e.g. ``lilydjwg/nvchecker``.
Other
-----
More to come. Send me a patch or pull request if you can't wait and have written one yourself :-)
Bugs
----
* Finish writing results even on Ctrl-C or other interruption.
Documentation
-------------
For detailed documentation, see `the documentation on Read the Docs <https://nvchecker.readthedocs.io/en/latest/>`_.


@ -1,207 +0,0 @@
[3to2]
aur
[android-apktool]
aur
[android-docs]
aur
[android-ndk]
aur
[android-sdk]
aur
; [aufs3-util-lily-git]
[cgdb]
aur
[coffee-script]
aur
[dcron]
aur
[dmg2img]
aur
[dnscrypt-proxy-git]
; my aur
github = jedisct1/dnscrypt-proxy
[elflibviewer]
aur
[evince-nodbus]
cmd = LANG=C pacman -Si evince | grep -F Version | awk '{print $3}'
[fcitx-lilydjwg-git]
github = fcitx/fcitx
[google-appengine-python]
aur
; [gvim-lily]
[hfsprogs]
aur
[kingsoft-office]
aur
[latencytop]
aur
[libsodium]
aur
; [linux-lily]
; [linux-lily-headers]
[msdl]
aur
[nautilus-dropbox]
aur
[nodejs-jake]
aur
[openresty-dev]
; my aur
url = http://openresty.org/
regex = ngx_openresty-([\d.]+)\.tar\.gz\|
[perl-data-random]
aur
[perl-goo-canvas]
aur
[perl-gtk2-imageview]
aur
[perl-gtk2-unique]
aur
[perl-mouse]
aur
[perl-net-dropbox-api]
aur
[perl-net-oauth]
aur
[perl-proc-processtable]
aur
[perl-yaml-tiny]
aur
[perl4-corelibs]
aur
[python-autopep8]
aur
[python-bitstring]
aur
[python-blist]
aur
[python-cffi]
; my aur
url = https://pypi.python.org/pypi/cffi
regex = cffi-([\d.]+)\.tar\.gz
[python-pycparser-git]
aur
[python-setproctitle]
aur
; [python-you-get-git]
; RSS'ed
[python2-netlib-git]
aur
[python3-pycurl]
aur
[reaver-wps-svn]
aur
[ruby-gettext]
aur
[ruby-levenshtein]
aur
[ruby-locale]
aur
[ruby-maruku]
aur
[ruby-sass]
aur
[ruby-yard]
aur
[shutter]
aur
[spideroak]
aur
[sqlite-manager]
aur
[ssed]
; my aur
url = http://sed.sourceforge.net/grabbag/ssed/
regex = The current version is ([\d.]+)\.
proxy = localhost:8087
[tp_smapi-dkms]
aur
[ttf-ume]
aur
[urlview]
aur
; [vim-lily]
[wdiff]
aur
[wireshark-gtk2]
cmd = LANG=C pacman -Si wireshark-gtk | grep -F Version | awk '{print $3}'
[xf86-input-wizardpen]
aur
[xkbset]
aur
[xmind]
aur
[zbar]
aur
; [zhcon]
; my aur
; last update is six years ago
[zint]
aur


@ -1,31 +0,0 @@
[fcitx]
url = https://code.google.com/p/fcitx/
regex = fcitx-([\d.]+)\.tar\.xz
[vim]
url = http://ftp.vim.org/pub/vim/patches/7.3/
regex = 7\.3\.\d+
; [badone]
; url = http://www.baidu.com/
; regex = baidu (\d+)
[google-chrome]
cmd = wget -qO- http://dl.google.com/linux/chrome/rpm/stable/x86_64/repodata/other.xml.gz | zgrep "google-chrome-stable" | awk -F\" '{print $10"-"$12}'
[you-get]
aur = python-you-get-git
[fbcat]
aur
[winterpy]
github = lilydjwg/winterpy
[nvchecker]
github = lilydjwg/nvchecker
[ssed]
url = http://sed.sourceforge.net/grabbag/ssed/
regex = The current version is ([\d.]+)\.
proxy = localhost:8087

1
docs/.gitignore vendored Normal file

@ -0,0 +1 @@
_build/

26
docs/Makefile Normal file

@ -0,0 +1,26 @@
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = .
BUILDDIR = _build
# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile man

man: $(BUILDDIR)/man/nvchecker.1

$(BUILDDIR)/man/nvchecker.1: usage.rst
	mkdir -p $(BUILDDIR)/man
	./myrst2man.py $< > $@

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%:
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

26
docs/api.rst Normal file

@ -0,0 +1,26 @@
``nvchecker.api`` --- The source plugin API
===========================================
.. automodule:: nvchecker.api
   :members:
   :imported-members:
   :undoc-members:

.. py:data:: session
   :type: nvchecker.httpclient.base.BaseSession

   The object to send out HTTP requests, respecting various options in the configuration entry.

.. automodule:: nvchecker.httpclient.base
   :members: BaseSession, Response
   :undoc-members:

.. autodata:: nvchecker.api.proxy
.. autodata:: nvchecker.api.user_agent
.. autodata:: nvchecker.api.tries
.. autodata:: nvchecker.api.verify_cert

.. py:data:: nvchecker.api.entry_waiter
   :type: contextvars.ContextVar

   This :class:`ContextVar <contextvars.ContextVar>` contains an :class:`EntryWaiter <nvchecker.api.EntryWaiter>` instance for waiting on other entries.

47
docs/conf.py Normal file

@ -0,0 +1,47 @@
import os
import sys
sys.path.insert(0, os.path.abspath(".."))
import nvchecker
master_doc = "index"
project = "nvchecker"
copyright = "lilydjwg, et al."
version = release = nvchecker.__version__
extensions = [
"sphinx.ext.autodoc",
"sphinx.ext.doctest",
"sphinx.ext.intersphinx",
"sphinx.ext.viewcode",
]
primary_domain = "py"
default_role = "py:obj"
autodoc_member_order = "bysource"
autoclass_content = "both"
autodoc_inherit_docstrings = False
# Without this line sphinx includes a copy of object.__init__'s docstring
# on any class that doesn't define __init__.
# https://bitbucket.org/birkenfeld/sphinx/issue/1337/autoclass_content-both-uses-object__init__
autodoc_docstring_signature = False
intersphinx_mapping = {"python": ("https://docs.python.org/3", None)}
html_theme = "sphinx_rtd_theme"
on_rtd = os.environ.get("READTHEDOCS", None) == "True"
# On RTD we can't import sphinx_rtd_theme, but it will be applied by
# default anyway. This block will use the same theme when building locally
# as on RTD.
if not on_rtd:
  import sphinx_rtd_theme
  html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]

html_theme_options = {
  'collapse_navigation': False,
}

22
docs/index.rst Normal file

@ -0,0 +1,22 @@
.. nvchecker documentation master file, created by
   sphinx-quickstart on Thu Sep 3 00:19:02 2020.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

Welcome to nvchecker's documentation!
=====================================

.. toctree::
   :maxdepth: 2

   usage
   plugin
   api
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

35
docs/make.bat Normal file

@ -0,0 +1,35 @@
@ECHO OFF
pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=.
set BUILDDIR=_build
if "%1" == "" goto help
%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.http://sphinx-doc.org/
exit /b 1
)
%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end
:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
:end
popd

74
docs/myrst2man.py Executable file

@ -0,0 +1,74 @@
#!/usr/bin/python3

import time
import locale
import os
import sys

try:
  locale.setlocale(locale.LC_ALL, '')
except:
  pass

sys.path.insert(0, '..')
import nvchecker

from docutils.core import publish_cmdline, default_description
from docutils import nodes
from docutils.writers import manpage
from docutils.parsers.rst import roles

def ref_role(
  role, rawtext, text, lineno, inliner,
  options={}, content=[],
):
  node = nodes.reference(rawtext, text.title(), **options)
  return [node], []

def doc_role(
  role, rawtext, text, lineno, inliner,
  options={}, content=[],
):
  node = nodes.reference(rawtext, text, **options)
  return [node], []

roles.register_local_role('ref', ref_role)
roles.register_local_role('doc', doc_role)

class MyTranslator(manpage.Translator):
  def visit_image(self, node):
    raise nodes.SkipNode

  def visit_topic(self, node):
    self.body.append('\n')
    raise nodes.SkipNode

  def visit_title(self, node):
    try:
      super().visit_title(node)
    except nodes.SkipNode:
      if self.section_level == 0:
        self._docinfo['title'] = 'nvchecker'
        self._docinfo['subtitle'] = 'New version checker for software releases'
        self._docinfo['title_upper'] = 'nvchecker'.upper()
        self._docinfo['manual_section'] = '1'
        # Make the generated man page reproducible. Based on the patch from
        # https://sourceforge.net/p/docutils/patches/132/#5333
        source_date_epoch = os.environ.get('SOURCE_DATE_EPOCH')
        if source_date_epoch:
          self._docinfo['date'] = time.strftime('%Y-%m-%d', time.gmtime(int(source_date_epoch)))
        else:
          self._docinfo['date'] = time.strftime('%Y-%m-%d')
        self._docinfo['version'] = nvchecker.__version__
      raise

class MyWriter(manpage.Writer):
  def __init__(self):
    super().__init__()
    self.translator_class = MyTranslator

def main():
  description = ("Generates plain unix manual documents. " + default_description)
  publish_cmdline(writer=MyWriter(), description=description)

if __name__ == '__main__':
  main()

96
docs/plugin.rst Normal file

@ -0,0 +1,96 @@
How to develop a source plugin for nvchecker
============================================
.. contents::
:local:
Source plugins enable nvchecker to discover software version strings in
additional ways.
Where to put the plugins
------------------------
They are Python modules placed in any directory named ``nvchecker_source`` on
``sys.path``. These are *namespace packages* as introduced by
`PEP 420 <https://www.python.org/dev/peps/pep-0420/>`_. For local use,
``~/.local/lib/pythonX.Y/site-packages/nvchecker_source`` is a good place, or
you can define the ``PYTHONPATH`` environment variable and put nvchecker source
plugins there inside a ``nvchecker_source`` directory.
Plugins are referenced by their names in the configuration file (``source = "xxx"``).
If multiple plugins have the same name, the first one in ``sys.path`` will be used.
How to write a simple plugin
----------------------------
For simple situations, you need to define an async function with the following signature::

  async def get_version(
    name: str, conf: Entry, *,
    cache: AsyncCache, keymanager: KeyManager,
    **kwargs,
  ) -> VersionResult:
    ...
Those types are imported from :mod:`nvchecker.api`.
``name`` is the table keys in the configuration file, and ``conf`` is a dict of
the content of that table. You should not modify this dict.
``cache`` is an :class:`AsyncCache <nvchecker.api.AsyncCache>` object that
caches results for you. Every plugin has its own ``cache`` object so that cache
keys won't conflict.
``keymanager`` is a :class:`KeyManager <nvchecker.api.KeyManager>` object that
you can call :meth:`.get_key(name) <nvchecker.api.KeyManager.get_key>` to get
the key (token) from the keyfile.
There may be additional keyword arguments in the future so ``**kwargs`` should be used.
If you want to send an HTTP request, prefer :meth:`cache.get_json <nvchecker.api.AsyncCache.get_json>`
or the :data:`nvchecker.api.session` object. They use the auto-selected HTTP
backend and handle the ``proxy`` option automatically.
For details about these objects, see :mod:`the API documentation <nvchecker.api>`,
or take existing source plugins as examples.
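A trivial plugin following the shape above might look like this. This sketch deliberately avoids importing ``nvchecker.api`` so it can run standalone; a real plugin would use the ``Entry``/``AsyncCache``/``KeyManager`` types from :mod:`nvchecker.api` and typically fetch something through ``cache``. The ``manual`` key is a made-up example option:

```python
import asyncio
from typing import Any, Dict

# Hypothetical "manual"-style source: simply report the version
# given in the configuration table, ignoring cache and keymanager.
async def get_version(
    name: str, conf: Dict[str, Any], *,
    cache: Any = None, keymanager: Any = None,
    **kwargs: Any,
) -> str:
    return conf['manual']

# nvchecker would call this with the entry's table as `conf`:
version = asyncio.run(get_version('demo', {'manual': '1.2.3'}))
```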
How to write a more powerful plugin
-----------------------------------
You may want more control in your source plugin, e.g. to do batch requests. To
do this, provide a class instead::

  class Worker(BaseWorker):
    async def run(self) -> None:
      ...

You will have the following attributes available::

  token_q: Queue[bool]
  result_q: Queue[RawResult]
  tasks: List[Tuple[str, Entry]]
  keymanager: KeyManager
You are expected to process :attr:`tasks <nvchecker.api.BaseWorker.tasks>` and
put results in :attr:`result_q <nvchecker.api.BaseWorker.result_q>`. See
``nvchecker_source/none.py`` for the simplest example, and
``nvchecker_source/aur.py`` for a complete, batching example.
For details about these objects, see :mod:`the API documentation <nvchecker.api>`.
You can also receive a configuration section from the configuration as
``__config__.source.SOURCE_NAME``, where ``SOURCE_NAME`` is what your plugin is
called. This can be used to specify a mirror site for your plugin to use, e.g.
the ``npm`` plugin accepts the following config::

  [__config__.source.npm]
  registry = "https://registry.npm.taobao.org"
When such a configuration exists for your plugin, you need to define a function
named ``configure`` to receive it::

  def configure(config):
    '''use the "config" dict in some way'''
    ...
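A minimal ``configure`` implementation could store the mirror for later use by ``get_version``. The ``registry`` key mirrors the ``npm`` example above; the default URL and module-level variable are illustrative assumptions, not nvchecker API:

```python
# Hypothetical module-level default, overridden by configure().
REGISTRY = 'https://registry.npmjs.org'

def configure(config):
    # Called once with the [__config__.source.SOURCE_NAME] table.
    global REGISTRY
    REGISTRY = config.get('registry', REGISTRY)

# nvchecker would invoke this with the user's source config:
configure({'registry': 'https://registry.npm.taobao.org'})
```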

6
docs/requirements.txt Normal file

@ -0,0 +1,6 @@
structlog
platformdirs
tornado>=6
sphinx>=3.2
# <5 has strange bottom margins for p, and no list indicators
sphinx-rtd-theme>=0.5

1145
docs/usage.rst Normal file

File diff suppressed because it is too large

1
keyfile.toml.enc Normal file

@ -0,0 +1 @@
Salted__ÇßKÊ]å¯õ´äó<C3A4>(¯J×ýºŒv4/ÆáôLÔ<4C>µ(Ó“O|ôNTÌ¡O @”¬” ×Ûª¨8—ïû‡Jz cÈOüžË²úäkd·—æ/œ÷ÛZ)<29>Q}Þá4×µËÒW§@í*´ ÃW¹%¢·Bäðo¤¨wDA<44>Ú\W´µÏ$:Ø– ß…´µ9Èß½0"1yE53«UÍwÅ"â0j!µéâœq^ ®ë½¼J$oÜÛn±hñ—u`ÅÚ0GÝÛôËNHµX®¼l53×D ÿ€Œ

31
mypy.ini Normal file

@ -0,0 +1,31 @@
[mypy]
warn_unused_configs = True
warn_redundant_casts = True
warn_unused_ignores = True
show_error_context = True
show_column_numbers = True
no_implicit_optional = True
[mypy-structlog]
ignore_missing_imports = True
[mypy-pyalpm]
ignore_missing_imports = True
[mypy-flaky]
ignore_missing_imports = True
[mypy-pytest_httpbin]
ignore_missing_imports = True
[mypy-lxml]
ignore_missing_imports = True
[mypy-tomllib]
ignore_missing_imports = True
[mypy-jq]
ignore_missing_imports = True
[mypy-tomli]
ignore_missing_imports = True


@ -0,0 +1,4 @@
# MIT licensed
# Copyright (c) 2013-2024 lilydjwg <lilydjwg@gmail.com>, et al.
__version__ = '2.17dev'

112
nvchecker/__main__.py Executable file

@ -0,0 +1,112 @@
#!/usr/bin/env python3
# MIT licensed
# Copyright (c) 2013-2024 lilydjwg <lilydjwg@gmail.com>, et al.
from __future__ import annotations
import sys
import argparse
import asyncio
from typing import Coroutine, Tuple
from pathlib import Path
import structlog
from . import core
from .util import ResultData, RawResult, KeyManager, EntryWaiter
from .ctxvars import proxy as ctx_proxy
logger = structlog.get_logger(logger_name=__name__)
def main() -> None:
  parser = argparse.ArgumentParser(description='New version checker for software')
  parser.add_argument('-k', '--keyfile',
                      metavar='FILE', type=str,
                      help='use specified keyfile (override the one in configuration file)')
  parser.add_argument('-t', '--tries', default=1, type=int, metavar='N',
                      help='try N times when network errors occur')
  parser.add_argument('--failures', action='store_true',
                      help='exit with code 3 if failures / errors happen during checking')
  parser.add_argument('-e', '--entry', type=str,
                      help='only execute on specified entry (useful for debugging)')
  core.add_common_arguments(parser)
  args = parser.parse_args()
  if core.process_common_arguments(args):
    return

  try:
    entries, options = core.load_file(
      args.file, use_keymanager=not bool(args.keyfile))
    if args.entry:
      if args.entry not in entries:
        sys.exit('Specified entry not found in config')
      entries = {args.entry: entries[args.entry]}
    if args.keyfile:
      keymanager = KeyManager(Path(args.keyfile))
    else:
      keymanager = options.keymanager
  except core.FileLoadError as e:
    sys.exit(str(e))

  if options.proxy is not None:
    ctx_proxy.set(options.proxy)

  task_sem = asyncio.Semaphore(options.max_concurrency)
  result_q: asyncio.Queue[RawResult] = asyncio.Queue()
  dispatcher = core.setup_httpclient(
    options.max_concurrency,
    options.httplib,
    options.http_timeout,
  )
  entry_waiter = EntryWaiter()
  try:
    futures = dispatcher.dispatch(
      entries, task_sem, result_q,
      keymanager, entry_waiter,
      args.tries,
      options.source_configs,
    )
  except ModuleNotFoundError as e:
    sys.exit(f'Error: {e}')

  if options.ver_files is not None:
    oldvers = core.read_verfile(options.ver_files[0])
  else:
    oldvers = {}

  result_coro = core.process_result(oldvers, result_q, entry_waiter, verbose=bool(args.entry))
  runner_coro = core.run_tasks(futures)

  if sys.version_info >= (3, 10):
    # Python 3.10 has deprecated asyncio.get_event_loop
    results, has_failures = asyncio.run(run(result_coro, runner_coro))
  else:
    # Python < 3.10 will create an eventloop when asyncio.Queue is initialized
    results, has_failures = asyncio.get_event_loop().run_until_complete(run(result_coro, runner_coro))

  if options.ver_files is not None:
    newverf = options.ver_files[1]
    if args.entry:
      # don't remove other entries when only one entry is specified on cmdline
      vers = core.read_verfile(newverf)
    else:
      vers = {}
    vers.update(results)
    core.write_verfile(newverf, vers)

  if args.failures and has_failures:
    sys.exit(3)

async def run(
  result_coro: Coroutine[None, None, Tuple[ResultData, bool]],
  runner_coro: Coroutine[None, None, None],
) -> Tuple[ResultData, bool]:
  result_fu = asyncio.create_task(result_coro)
  runner_fu = asyncio.create_task(runner_coro)
  await runner_fu
  result_fu.cancel()
  return await result_fu

if __name__ == '__main__':
  main()

11
nvchecker/api.py Normal file
View file

@ -0,0 +1,11 @@
# MIT licensed
# Copyright (c) 2020 lilydjwg <lilydjwg@gmail.com>, et al.
from .httpclient import session, TemporaryError, HTTPError
from .util import (
  Entry, BaseWorker, RawResult, VersionResult, RichResult,
  AsyncCache, KeyManager, GetVersionError, EntryWaiter,
)
from .sortversion import sort_version_keys
from .ctxvars import tries, proxy, user_agent, httptoken, entry_waiter, verify_cert

461
nvchecker/core.py Normal file

@ -0,0 +1,461 @@
# MIT licensed
# Copyright (c) 2013-2020, 2024 lilydjwg <lilydjwg@gmail.com>, et al.

from __future__ import annotations

import os
import sys
import asyncio
from asyncio import Queue
import logging
import argparse
from typing import (
  Tuple, NamedTuple, Optional, List, Union,
  cast, Dict, Awaitable, Sequence, Any,
  TYPE_CHECKING,
)
import types
from pathlib import Path
from importlib import import_module
import re
import contextvars
import json
import dataclasses

import structlog

if TYPE_CHECKING:
  import tomli as tomllib
else:
  try:
    import tomllib
  except ModuleNotFoundError:
    import tomli as tomllib
import platformdirs

from .lib import nicelogger
from . import slogconf
from .util import (
  Entry, Entries, KeyManager, RawResult, RichResult, ResultData,
  FunctionWorker, GetVersionError,
  FileLoadError, EntryWaiter,
)
from . import __version__
from .sortversion import sort_version_keys
from .ctxvars import tries as ctx_tries
from .ctxvars import entry_waiter as ctx_entry_waiter
from . import httpclient

logger = structlog.get_logger(logger_name=__name__)

def get_default_config() -> str:
  confdir = platformdirs.user_config_dir(appname='nvchecker')
  file = os.path.join(confdir, 'nvchecker.toml')
  return file

def add_common_arguments(parser: argparse.ArgumentParser) -> None:
  parser.add_argument('-l', '--logging',
                      choices=('debug', 'info', 'warning', 'error'), default='info',
                      help='logging level (default: info)')
  parser.add_argument('--logger', default='pretty',
                      choices=['pretty', 'json', 'both'],
                      help='select which logger to use')
  parser.add_argument('--json-log-fd', metavar='FD',
                      type=lambda fd: os.fdopen(int(fd), mode='w'),
                      help='specify fd to send json logs to. stdout by default')
  parser.add_argument('-V', '--version', action='store_true',
                      help='show version and exit')
  default_config = get_default_config()
  parser.add_argument('-c', '--file',
                      metavar='FILE', type=str,
                      default=default_config,
                      help=f'software version configuration file [default: {default_config}]')

def process_common_arguments(args: argparse.Namespace) -> bool:
  '''return True if should stop'''
  processors = [
    slogconf.exc_info,
    slogconf.filter_exc,
    slogconf.filter_nones,
    slogconf.filter_taskname,
  ]
  logger_factory = None

  if args.logger in ['pretty', 'both']:
    slogconf.fix_logging()
    nicelogger.enable_pretty_logging(
      getattr(logging, args.logging.upper()))
    processors.append(slogconf.stdlib_renderer)
    if args.logger == 'pretty':
      logger_factory=structlog.PrintLoggerFactory(
        file=open(os.devnull, 'w'),
      )
      processors.append(slogconf.null_renderer)
  if args.logger in ['json', 'both']:
    processors.extend([
      structlog.processors.format_exc_info,
      slogconf.json_renderer,
    ])
    if logger_factory is None:
      logfile = args.json_log_fd or sys.stdout
      logger_factory = structlog.PrintLoggerFactory(file=logfile)

  structlog.configure(
    processors = processors,
    logger_factory = logger_factory,
  )

  if args.version:
    progname = os.path.basename(sys.argv[0])
    print(f'{progname} v{__version__}')
    return True
  return False

def safe_overwrite(file: Path, data: Union[bytes, str], *,
                   method: str = 'write', mode: str = 'w', encoding: Optional[str] = None) -> None:
  # FIXME: directory has no read perm
  # FIXME: hard links
  resolved_path = file.resolve()
  tmpname = str(resolved_path) + '.tmp'
  # if not using "with", write can fail without exception
  with open(tmpname, mode, encoding=encoding) as f:
    getattr(f, method)(data)
    # see also: https://thunk.org/tytso/blog/2009/03/15/dont-fear-the-fsync/
    f.flush()
    os.fsync(f.fileno())
  # if the above write failed (because disk is full etc), the old data should be kept
  os.rename(tmpname, resolved_path)

def read_verfile(file: Path) -> ResultData:
  try:
    with open(file) as f:
      data = f.read()
  except FileNotFoundError:
    return {}

  try:
    v = json.loads(data)
  except json.decoder.JSONDecodeError:
    # old format
    v = {}
    for l in data.splitlines():
      name, ver = l.rstrip().split(None, 1)
      v[name] = ver

  if v.get('version') is None:
    v = {k: RichResult(version=a) for k, a in v.items()}
  elif v['version'] == 2:
    v = {k: RichResult(**a) for k, a in v['data'].items()}
  else:
    raise Exception('unknown verfile version', v['version'])

  return v

def write_verfile(file: Path, versions: ResultData) -> None:
  d = {
    'version': 2,
    # sort and indent to make it friendly to human and git
    'data': dict(sorted(versions.items())),
  }
  data = json.dumps(
    d,
    indent = 2,
    ensure_ascii = False,
    default = json_encode,
  ) + '\n'
  safe_overwrite(file, data)

def json_encode(obj):
  if isinstance(obj, RichResult):
    d = {k: v for k, v in dataclasses.asdict(obj).items() if v is not None}
    return d
  raise TypeError(obj)

class Options(NamedTuple):
  ver_files: Optional[Tuple[Path, Path]]
  max_concurrency: int
  proxy: Optional[str]
  keymanager: KeyManager
  source_configs: Dict[str, Dict[str, Any]]
  httplib: Optional[str]
  http_timeout: int

def load_file(
  file: str, *,
  use_keymanager: bool,
) -> Tuple[Entries, Options]:
  try:
    with open(file, 'rb') as f:
      config = tomllib.load(f)
  except (OSError, tomllib.TOMLDecodeError) as e:
    raise FileLoadError('version configuration file', file, e)

  ver_files: Optional[Tuple[Path, Path]] = None
  keymanager = KeyManager(None)
  source_configs = {}

  if '__config__' in config:
    c = config.pop('__config__')
    d = Path(file).parent

    if 'oldver' in c and 'newver' in c:
      oldver_s = os.path.expandvars(
        os.path.expanduser(c.get('oldver')))
      oldver = d / oldver_s
      newver_s = os.path.expandvars(
        os.path.expanduser(c.get('newver')))
      newver = d / newver_s
      ver_files = oldver, newver

    if use_keymanager:
      keyfile = c.get('keyfile')
      if keyfile:
        keyfile_s = os.path.expandvars(
          os.path.expanduser(c.get('keyfile')))
        keyfile = d / keyfile_s
        keymanager = KeyManager(keyfile)

    if 'source' in c:
      source_configs = c['source']

    max_concurrency = c.get('max_concurrency', 20)
    proxy = c.get('proxy')
    httplib = c.get('httplib', None)
    http_timeout = c.get('http_timeout', 20)
  else:
    max_concurrency = 20
    proxy = None
    httplib = None
    http_timeout = 20

  return cast(Entries, config), Options(
    ver_files, max_concurrency, proxy, keymanager,
    source_configs, httplib, http_timeout,
  )

def setup_httpclient(
  max_concurrency: int = 20,
  httplib: Optional[str] = None,
  http_timeout: int = 20,
) -> Dispatcher:
  httplib_ = httplib or httpclient.find_best_httplib()
  httpclient.setup(
    httplib_, max_concurrency, http_timeout)
  return Dispatcher()

class Dispatcher:
  def dispatch(
    self,
    entries: Entries,
    task_sem: asyncio.Semaphore,
    result_q: Queue[RawResult],
    keymanager: KeyManager,
    entry_waiter: EntryWaiter,
    tries: int,
    source_configs: Dict[str, Dict[str, Any]],
  ) -> List[asyncio.Future]:
    mods: Dict[str, Tuple[types.ModuleType, List]] = {}
    ctx_tries.set(tries)
    ctx_entry_waiter.set(entry_waiter)
    root_ctx = contextvars.copy_context()

    for name, entry in entries.items():
      source = entry.get('source', 'none')
      if source not in mods:
        mod = import_module('nvchecker_source.' + source)
        tasks: List[Tuple[str, Entry]] = []
        mods[source] = mod, tasks
        config = source_configs.get(source)
        if config and getattr(mod, 'configure'):
          mod.configure(config)
      else:
        tasks = mods[source][1]
      tasks.append((name, entry))

    ret = []
    for mod, tasks in mods.values():
      if hasattr(mod, 'Worker'):
        worker_cls = mod.Worker
      else:
        worker_cls = FunctionWorker

      ctx = root_ctx.copy()
      worker = ctx.run(
        worker_cls,
        task_sem, result_q, tasks, keymanager,
      )
      if worker_cls is FunctionWorker:
        func = mod.get_version
        ctx.run(worker.initialize, func)

      ret.append(ctx.run(worker._run_maynot_raise))

    return ret

def substitute_version(
  version: str, conf: Entry,
) -> str:
  '''
  Substitute the version string via defined rules in the configuration file.

  See usage.rst#global-options for details.
  '''
  prefix = conf.get('prefix')
  if prefix:
    if version.startswith(prefix):
      version = version[len(prefix):]

  from_pattern = conf.get('from_pattern')
  if from_pattern:
    to_pattern = conf.get('to_pattern')
    if to_pattern is None:
      raise ValueError("from_pattern exists but to_pattern doesn't")
    version = re.sub(from_pattern, to_pattern, version)

  return version

def apply_list_options(
  versions: List[Union[str, RichResult]],
  conf: Entry,
  name: str,
) -> Optional[Union[str, RichResult]]:
  pattern = conf.get('include_regex')
  if versions and pattern:
    re_pat = re.compile(pattern)
    versions2 = [x for x in versions
                 if re_pat.fullmatch(str(x))]
    if not versions2:
      logger.warning('include_regex matched no versions',
                     name=name, versions=versions, regex=pattern)
      return None
    versions = versions2

  pattern = conf.get('exclude_regex')
  if pattern:
    re_pat = re.compile(pattern)
    versions = [x for x in versions
                if not re_pat.fullmatch(str(x))]

  ignored = set(conf.get('ignored', '').split())
  if ignored:
    versions = [x for x in versions
                if str(x) not in ignored]

  if not versions:
    return None

  sort_version_key = sort_version_keys[
    conf.get("sort_version_key", "parse_version")]
  versions.sort(key=lambda version: sort_version_key(str(version)))  # type: ignore

  return versions[-1]

def _process_result(r: RawResult) -> Union[RichResult, Exception]:
  version = r.version
  conf = r.conf
  name = r.name

  url = None
  revision = None
  gitref = None
  if isinstance(version, GetVersionError):
    kw = version.kwargs
    kw['name'] = name
    logger.error(version.msg, **kw)
    return version
  elif isinstance(version, Exception):
    logger.error('unexpected error happened',
                 name=r.name, exc_info=r.version)
    return version
  elif isinstance(version, list):
    version_str = apply_list_options(version, conf, name)
    if isinstance(version_str, RichResult):
      url = version_str.url
      gitref = version_str.gitref
      revision = version_str.revision
      version_str = version_str.version
  elif isinstance(version, RichResult):
    version_str = version.version
    url = version.url
    gitref = version.gitref
    revision = version.revision
  else:
    version_str = version

  if version_str:
    version_str = version_str.replace('\n', ' ')
    try:
      version_str = substitute_version(version_str, conf)
      return RichResult(
        version = version_str,
        url = url,
        gitref = gitref,
        revision = revision,
      )
    except (ValueError, re.error) as e:
      logger.exception('error occurred in version substitutions', name=name)
      return e
  else:
    return ValueError('no version returned')

def check_version_update(
  oldvers: ResultData,
  name: str,
  r: RichResult,
  verbose: bool,
) -> None:
  if old_result := oldvers.get(name):
    oldver = old_result.version
  else:
    oldver = None

  if not oldver or oldver != r.version:
    logger.info(
      'updated',
      name = name,
      version = r.version,
      revision = r.revision,
      old_version = oldver,
      url = r.url,
    )
  else:
    # provide visible user feedback if it was the only entry
    level = logging.INFO if verbose else logging.DEBUG
    logger.log(level, 'up-to-date', name=name, version=r.version, url=r.url)

async def process_result(
  oldvers: ResultData,
  result_q: Queue[RawResult],
  entry_waiter: EntryWaiter,
  verbose: bool = False,
) -> Tuple[ResultData, bool]:
  ret = {}
  has_failures = False
  try:
    while True:
      r = await result_q.get()
      try:
        r1 = _process_result(r)
      except Exception as e:
        logger.exception('error processing result', result=r)
        r1 = e
      if isinstance(r1, Exception):
        entry_waiter.set_exception(r.name, r1)
        # no versions are returned from "apply_list_options"?
        logger.error('no-result', name=r.name, error=repr(r1))
        has_failures = True
        continue
      check_version_update(oldvers, r.name, r1, verbose)
      entry_waiter.set_result(r.name, r1.version)
      ret[r.name] = r1
  except asyncio.CancelledError:
    return ret, has_failures

async def run_tasks(
  futures: Sequence[Awaitable[None]]
) -> None:
  for fu in asyncio.as_completed(futures):
    await fu

21
nvchecker/ctxvars.py Normal file

@ -0,0 +1,21 @@
# MIT licensed
# Copyright (c) 2020 lilydjwg <lilydjwg@gmail.com>, et al.
from __future__ import annotations
from contextvars import ContextVar
from typing import Optional, TYPE_CHECKING
from . import __version__
DEFAULT_USER_AGENT = f'lilydjwg/nvchecker {__version__}'
if TYPE_CHECKING:
from .util import EntryWaiter
tries = ContextVar('tries', default=1)
proxy: ContextVar[Optional[str]] = ContextVar('proxy', default=None)
user_agent = ContextVar('user_agent', default=DEFAULT_USER_AGENT)
httptoken = ContextVar('httptoken', default=None)
entry_waiter: ContextVar[EntryWaiter] = ContextVar('entry_waiter')
verify_cert = ContextVar('verify_cert', default=True)


@ -1,124 +0,0 @@
import re
import sre_constants
import logging
from functools import partial
import queue
import json
import urllib.parse
from pkg_resources import parse_version
from tornado.httpclient import AsyncHTTPClient
import tornado.process
from tornado.ioloop import IOLoop
logger = logging.getLogger(__name__)
handler_precedence = ('github', 'aur', 'cmd', 'regex')
try:
import pycurl
AsyncHTTPClient.configure("tornado.curl_httpclient.CurlAsyncHTTPClient")
except ImportError:
pycurl = None
def get_version(name, conf, callback):
g = globals()
for key in handler_precedence:
if key in conf:
funcname = 'get_version_by_' + key
g[funcname](name, conf, callback)
break
else:
logger.error('%s: no idea to get version info.', name)
callback(name, None)
def get_version_by_regex(name, conf, callback):
try:
r = re.compile(conf['regex'])
except sre_constants.error:
logger.warn('%s: bad regex, skipped.', name, exc_info=True)
callback(name, None)
return
encoding = conf.get('encoding', 'latin1')
httpclient = AsyncHTTPClient()
kwargs = {}
if conf.get('proxy'):
if pycurl:
host, port = urllib.parse.splitport(conf['proxy'])
kwargs['proxy_host'] = host
kwargs['proxy_port'] = int(port)
else:
logger.warn('%s: proxy set but not used because pycurl is unavailable.', name)
httpclient.fetch(conf['url'], partial(
_get_version_by_regex, name, r, encoding, callback
), **kwargs)
def _get_version_by_regex(name, regex, encoding, callback, res):
body = res.body.decode(encoding)
try:
version = max(regex.findall(body), key=parse_version)
except ValueError:
logger.error('%s: version string not found.', name)
callback(name, None)
else:
callback(name, version)
AUR_URL = 'https://aur.archlinux.org/rpc.php?type=info&arg='
def get_version_by_aur(name, conf, callback):
aurname = conf.get('aur') or name
url = AUR_URL + aurname
AsyncHTTPClient().fetch(url, partial(_aur_done, name, callback))
def _aur_done(name, callback, res):
data = json.loads(res.body.decode('utf-8'))
version = data['results']['Version']
callback(name, version)
GITHUB_URL = 'https://api.github.com/repos/%s/commits'
def get_version_by_github(name, conf, callback):
repo = conf.get('github')
url = GITHUB_URL % repo
AsyncHTTPClient().fetch(url, user_agent='lilydjwg/nvchecker',
callback=partial(_github_done, name, callback))
def _github_done(name, callback, res):
data = json.loads(res.body.decode('utf-8'))
version = data[0]['commit']['committer']['date'].split('T', 1)[0].replace('-', '')
callback(name, version)
cmd_q = queue.Queue()
cmd_q.running = False
def get_version_by_cmd(name, conf, callback):
cmd = conf['cmd']
cmd_q.put((name, cmd, callback))
if not cmd_q.running:
_run_command()
def _run_command():
cmd_q.running = True
try:
name, cmd, callback = cmd_q.get_nowait()
except queue.Empty:
cmd_q.running = False
return
p = tornado.process.Subprocess(cmd, shell=True, io_loop=IOLoop.instance(),
stdout=tornado.process.Subprocess.STREAM)
p.set_exit_callback(partial(_command_done, name, callback, p))
def _command_done(name, callback, process, status):
if status != 0:
logger.error('%s: command exited with %d.', name, status)
callback(name, None)
else:
process.stdout.read_until_close(partial(_got_version_from_cmd, callback, name))
_run_command()
def _got_version_from_cmd(callback, name, output):
output = output.strip().decode('latin1')
callback(name, output)

@@ -0,0 +1,55 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
from typing import Optional
from .base import TemporaryError, HTTPError
class Proxy:
_obj = None
def set_obj(self, obj):
super().__setattr__('_obj', obj)
def __getattr__(self, name):
return getattr(self._obj, name)
def __setattr__(self, name, value):
return setattr(self._obj, name, value)
session = Proxy()
def setup(
which: Optional[str] = None,
concurreny: int = 20,
timeout: int = 20,
) -> None:
if which is None:
which = find_best_httplib()
m = __import__(
'%s_httpclient' % which, globals(), locals(), level=1)
session.set_obj(m.session)
session.setup(concurreny, timeout)
def find_best_httplib() -> str:
try:
import tornado, pycurl
# connection reuse, http/2
which = 'tornado'
except ImportError:
try:
import aiohttp
which = 'aiohttp'
# connection reuse
except ImportError:
try:
import httpx
which = 'httpx'
except ImportError:
import tornado
which = 'tornado'
# fallback
return which
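`find_best_httplib` above selects a backend by attempting imports in preference order, falling back to tornado unconditionally at the end. The same pattern, reduced to a generic helper (the module names in the demo call are illustrative; `no_such_module_xyz` deliberately does not exist):

```python
# Sketch of preference-ordered backend selection via import attempts,
# as in find_best_httplib above.
import importlib

def pick_backend(preferences):
    """Return the first importable module name; the last entry is the
    unconditional fallback, mirroring the tornado default above."""
    for modname in preferences[:-1]:
        try:
            importlib.import_module(modname)
            return modname
        except ImportError:
            continue
    return preferences[-1]

# 'json' always exists in the stdlib, so it wins over the missing module:
chosen = pick_backend(['no_such_module_xyz', 'json', 'os'])
```

Probing by import keeps the decision cheap and side-effect free: nothing is configured until `setup()` later imports the chosen backend module for real.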

@@ -0,0 +1,85 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
import asyncio
from typing import Optional, Dict
import structlog
import aiohttp
from .base import BaseSession, TemporaryError, Response, HTTPError
__all__ = ['session']
logger = structlog.get_logger(logger_name=__name__)
class AiohttpSession(BaseSession):
session = None
def setup(
self,
concurreny: int = 20,
timeout: int = 20,
) -> None:
self._concurreny = concurreny
self._timeout = timeout
async def request_impl(
self, url: str, *,
method: str,
proxy: Optional[str] = None,
headers: Dict[str, str] = {},
follow_redirects: bool = True,
params = (),
json = None,
body = None,
verify_cert: bool = True,
) -> Response:
if self.session is None:
# need to create in async context
self.session = aiohttp.ClientSession(
connector = aiohttp.TCPConnector(limit=self._concurreny),
timeout = aiohttp.ClientTimeout(total=self._timeout),
trust_env = True,
)
kwargs = {
'headers': headers,
'params': params,
'allow_redirects': follow_redirects,
}
if not verify_cert:
kwargs['ssl'] = False
if proxy is not None:
kwargs['proxy'] = proxy
if body is not None:
# Make sure all backends have the same default encoding for post data.
if 'Content-Type' not in headers:
headers = {**headers, 'Content-Type': 'application/x-www-form-urlencoded'}
kwargs['headers'] = headers
kwargs['data'] = body.encode()
elif json is not None:
kwargs['json'] = json
try:
logger.debug('send request', method=method, url=url, kwargs=kwargs)
res = await self.session.request(
method, url, **kwargs)
except (
asyncio.TimeoutError, aiohttp.ClientConnectorError,
) as e:
raise TemporaryError(599, repr(e), e)
err_cls: Optional[type] = None
if res.status >= 500:
err_cls = TemporaryError
elif res.status >= 400:
err_cls = HTTPError
if err_cls is not None:
raise err_cls(res.status, res.reason, res)
body = await res.content.read()
return Response(res.headers, body)
session = AiohttpSession()
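Both the aiohttp backend above and the httpx backend apply the same normalization when a raw `body` is posted: default the `Content-Type` to form encoding, and do it on a copy so the caller's headers dict is never mutated. That step in isolation (a sketch; `prepare_body` is a hypothetical helper, not part of nvchecker):

```python
# Sketch of the shared body/header normalization: default Content-Type
# for raw POST bodies without mutating the caller's dict.

def prepare_body(headers, body):
    if 'Content-Type' not in headers:
        headers = {**headers, 'Content-Type': 'application/x-www-form-urlencoded'}
    return headers, body.encode()

h, b = prepare_body({}, 'a=1&b=2')
```

Copy-on-write here matters because `BaseSession.request` reuses one headers dict across retries; mutating it in the backend would leak state between attempts.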

@@ -0,0 +1,124 @@
# MIT licensed
# Copyright (c) 2019-2020 lilydjwg <lilydjwg@gmail.com>, et al.
import structlog
from typing import Optional, Dict, Mapping
import json as _json
from ..ctxvars import tries, proxy, user_agent, httptoken, verify_cert
logger = structlog.get_logger(logger_name=__name__)
class Response:
'''The response of an HTTP request.
.. py:attribute:: body
:type: bytes
.. py:attribute:: headers
:type: Mapping[str, str]
'''
def __init__(
self,
headers: Mapping[str, str],
body: bytes,
) -> None:
self.headers = headers
self.body = body
def json(self):
'''Convert response content to JSON.'''
return _json.loads(self.body.decode('utf-8'))
class BaseSession:
'''The base class for the different HTTP backends.'''
def setup(
self,
concurreny: int = 20,
timeout: int = 20,
) -> None:
pass
async def head(self, *args, **kwargs):
'''Shortcut for ``HEAD`` request.'''
return await self.request(
method='HEAD', *args, **kwargs)
async def get(self, *args, **kwargs):
'''Shortcut for ``GET`` request.'''
return await self.request(
method='GET', *args, **kwargs)
async def post(self, *args, **kwargs):
'''Shortcut for ``POST`` request.'''
return await self.request(
method='POST', *args, **kwargs)
async def request(
self, url: str, *,
method: str,
headers: Dict[str, str] = {},
follow_redirects: bool = True,
params = (),
json = None,
body = None,
) -> Response:
t = tries.get()
p = proxy.get()
ua = user_agent.get()
httpt = httptoken.get()
verify = verify_cert.get()
headers = headers.copy()
headers.setdefault('User-Agent', ua)
if httpt is not None:
headers.setdefault('Authorization', httpt)
for i in range(1, t+1):
try:
return await self.request_impl(
url,
method = method,
headers = headers,
params = params,
follow_redirects = follow_redirects,
json = json,
body = body,
proxy = p or None,
verify_cert = verify,
)
except TemporaryError as e:
if i == t:
raise
else:
logger.warning('temporary error, retrying',
tries = i, exc_info = e)
continue
raise Exception('should not reach')
async def request_impl(
self, url: str, *,
method: str,
proxy: Optional[str] = None,
headers: Dict[str, str] = {},
follow_redirects: bool = True,
params = (),
json = None,
body = None,
verify_cert: bool = True,
) -> Response:
''':meta private:'''
raise NotImplementedError
class BaseHTTPError(Exception):
def __init__(self, code, message, response):
self.code = code
self.message = message
self.response = response
class TemporaryError(BaseHTTPError):
'''A temporary error (e.g. network error) happens.'''
class HTTPError(BaseHTTPError):
'''An HTTP 4xx error happens.'''
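`BaseSession.request` above retries `request_impl` up to `tries` times, but only for `TemporaryError`; a 4xx `HTTPError` propagates immediately, and the final temporary failure is re-raised rather than swallowed. The retry skeleton, reduced to plain functions (`with_retries` and the flaky demo callable are hypothetical, not nvchecker API):

```python
# Minimal sketch of the retry policy in BaseSession.request: retry on
# TemporaryError, re-raise once attempts are exhausted.

class TemporaryError(Exception):
    pass

def with_retries(func, tries=3):
    for i in range(1, tries + 1):
        try:
            return func()
        except TemporaryError:
            if i == tries:
                raise  # exhausted: propagate to the caller
            # otherwise log and retry (logging omitted in this sketch)
    raise AssertionError('should not reach')

calls = []
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise TemporaryError('transient')
    return 'ok'

result = with_retries(flaky, tries=5)  # succeeds on the third attempt
```

Keeping permanent errors out of the retried exception type is the key design choice: a 404 will never succeed on retry, so it is raised as `HTTPError` and bypasses the loop entirely.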

@@ -0,0 +1,78 @@
# MIT licensed
# Copyright (c) 2020-2022,2024 lilydjwg <lilydjwg@gmail.com>, et al.
from typing import Dict, Optional, Tuple
import httpx
from .base import BaseSession, TemporaryError, Response, HTTPError
__all__ = ['session']
class HttpxSession(BaseSession):
def setup(
self,
concurreny: int = 20,
timeout: int = 20,
) -> None:
self.clients: Dict[Tuple[Optional[str], bool], httpx.AsyncClient] = {}
self.timeout = timeout
async def request_impl(
self, url: str, *,
method: str,
proxy: Optional[str] = None,
headers: Dict[str, str] = {},
follow_redirects: bool = True,
params = (),
json = None,
body = None,
verify_cert: bool = True,
) -> Response:
client = self.clients.get((proxy, verify_cert))
if not client:
client = httpx.AsyncClient(
timeout = httpx.Timeout(self.timeout, pool=None),
http2 = True,
proxy = proxy,
verify = verify_cert,
)
self.clients[(proxy, verify_cert)] = client
try:
if body is not None:
# Make sure all backends have the same default encoding for post data.
if 'Content-Type' not in headers:
headers = {**headers, 'Content-Type': 'application/x-www-form-urlencoded'}
body = body.encode()
r = await client.request(
method, url, json = json, content = body,
headers = headers,
follow_redirects = follow_redirects,
# httpx checks for None but not ()
params = params or None,
)
err_cls: Optional[type] = None
if r.status_code >= 500:
err_cls = TemporaryError
elif r.status_code >= 400:
err_cls = HTTPError
if err_cls is not None:
raise err_cls(
r.status_code,
r.reason_phrase,
r,
)
except httpx.TransportError as e:
raise TemporaryError(599, repr(e), e)
body = await r.aread()
return Response(r.headers, body)
async def aclose(self):
for client in self.clients.values():
await client.aclose()
del self.clients
session = HttpxSession()
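`HttpxSession` above keeps one `AsyncClient` per `(proxy, verify_cert)` pair, because in httpx those options are fixed at client construction time and cannot vary per request. The caching pattern on its own, with a stand-in client class instead of `httpx.AsyncClient`:

```python
# Sketch of the client cache keyed on (proxy, verify_cert), as in
# HttpxSession.request_impl above. DummyClient stands in for
# httpx.AsyncClient, whose proxy/verify settings are per-client.

class DummyClient:
    def __init__(self, proxy, verify):
        self.proxy = proxy
        self.verify = verify

class ClientCache:
    def __init__(self):
        self.clients = {}

    def get(self, proxy=None, verify_cert=True):
        key = (proxy, verify_cert)
        client = self.clients.get(key)
        if client is None:
            client = DummyClient(proxy, verify_cert)
            self.clients[key] = client
        return client

cache = ClientCache()
a = cache.get()
b = cache.get()                       # same key -> same cached client
c = cache.get(proxy='http://p:8080')  # new key -> new client
```

Reusing a client per key preserves connection pooling (and HTTP/2 sessions) across requests that share the same proxy and TLS settings.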

@@ -0,0 +1,103 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
import json as _json
from urllib.parse import urlencode
from typing import Optional, Dict, Any
import os
from tornado.httpclient import AsyncHTTPClient, HTTPRequest
try:
import pycurl
except ImportError:
pycurl = None # type: ignore
from .base import BaseSession, TemporaryError, Response, HTTPError
__all__ = ['session']
HTTP2_AVAILABLE = None if pycurl else False
SSL_CERT_FILE = os.environ.get('SSL_CERT_FILE')
def setup_curl(curl):
global HTTP2_AVAILABLE
if HTTP2_AVAILABLE is None:
try:
curl.setopt(pycurl.HTTP_VERSION, 4)  # 4 == CURL_HTTP_VERSION_2TLS (HTTP/2 over TLS)
HTTP2_AVAILABLE = True
except pycurl.error:
HTTP2_AVAILABLE = False
elif HTTP2_AVAILABLE:
curl.setopt(pycurl.HTTP_VERSION, 4)
if SSL_CERT_FILE:
curl.setopt_string(pycurl.CAINFO, SSL_CERT_FILE)
curl.setopt_string(pycurl.ACCEPT_ENCODING, "")
class TornadoSession(BaseSession):
def setup(
self,
concurreny: int = 20,
timeout: int = 20,
) -> None:
impl: Optional[str]
if pycurl:
impl = "tornado.curl_httpclient.CurlAsyncHTTPClient"
else:
impl = None
AsyncHTTPClient.configure(
impl, max_clients = concurreny)
self.timeout = timeout
async def request_impl(
self, url: str, *,
method: str,
proxy: Optional[str] = None,
headers: Dict[str, str] = {},
follow_redirects: bool = True,
params = (),
json = None,
body = None,
verify_cert: bool = True,
) -> Response:
kwargs: Dict[str, Any] = {
'method': method,
'headers': headers,
'request_timeout': self.timeout,
'follow_redirects': follow_redirects,
'validate_cert': verify_cert,
}
if body:
# By default the content type is already 'application/x-www-form-urlencoded'
kwargs['body'] = body
elif json:
kwargs['body'] = _json.dumps(json)
kwargs['prepare_curl_callback'] = setup_curl
if proxy:
host, port = proxy.rsplit(':', 1)
kwargs['proxy_host'] = host
kwargs['proxy_port'] = int(port)
if params:
q = urlencode(params)
url += '?' + q
r = HTTPRequest(url, **kwargs)
res = await AsyncHTTPClient().fetch(
r, raise_error=False)
err_cls: Optional[type] = None
if res.code >= 500:
err_cls = TemporaryError
elif res.code >= 400:
err_cls = HTTPError
if err_cls is not None:
raise err_cls(
res.code, res.reason, res
)
return Response(res.headers, res.body)
session = TornadoSession()
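`setup_curl` above probes HTTP/2 support lazily: the first request tries the option, the outcome is cached module-wide, and later requests either apply it directly or skip it. The probe-once pattern in isolation (`configure` and the injected `enable_http2` callable are illustrative stand-ins for the pycurl `setopt` call, with `RuntimeError` standing in for `pycurl.error`):

```python
# Sketch of the lazy capability probe in setup_curl above: try a feature
# once, cache the result in a module-level flag, and skip it thereafter
# if unsupported.

HTTP2_AVAILABLE = None  # None means "not probed yet"

def configure(enable_http2):
    global HTTP2_AVAILABLE
    if HTTP2_AVAILABLE is None:
        try:
            enable_http2()
            HTTP2_AVAILABLE = True
        except RuntimeError:  # stand-in for pycurl.error
            HTTP2_AVAILABLE = False
    elif HTTP2_AVAILABLE:
        enable_http2()

calls = []
def ok():
    calls.append(1)

configure(ok)   # first call probes and caches the result
configure(ok)   # later calls take the fast path
```

The tri-state flag (`None`/`True`/`False`) is what distinguishes "never tried" from "tried and unsupported", so a failing probe is not repeated on every request.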

nvchecker/lib/README.md (new file, 4 lines)
@@ -0,0 +1,4 @@
This directory contains code from other places:
* `nicelogger.py`: from my [winterpy](https://github.com/lilydjwg/winterpy)
* `packaging_version.py`: from python-packaging 20.9, modified

nvchecker/lib/nicelogger.py (new file, 113 lines)
@@ -0,0 +1,113 @@
# MIT licensed
# Copyright (c) 2013-2017 lilydjwg <lilydjwg@gmail.com>, et al.
'''
A Tornado-inspired logging formatter that displays timestamps with millisecond accuracy.
FYI: pyftpdlib also has a Tornado-style logger.
'''
import sys
import time
import logging
class Colors:
def __init__(self, color=None):
if color is None:
color = support_color()
if color:
import curses
curses.setupterm()
if sys.hexversion < 0x30203f0:
fg_color = str(curses.tigetstr("setaf") or
curses.tigetstr("setf") or "", "ascii")
else:
fg_color = curses.tigetstr("setaf") or curses.tigetstr("setf") or b""
self.blue = str(curses.tparm(fg_color, 4), "ascii")
self.yellow = str(curses.tparm(fg_color, 3), "ascii")
self.green = str(curses.tparm(fg_color, 2), "ascii")
self.red = str(curses.tparm(fg_color, 1), "ascii")
self.bright_red = str(curses.tparm(fg_color, 9), "ascii")
self.normal = str(curses.tigetstr("sgr0"), "ascii")
else:
self.blue = self.yellow = self.green = self.red = self.bright_red = self.normal = ""
class TornadoLogFormatter(logging.Formatter):
def __init__(self, color, *args, **kwargs):
super().__init__(*args, **kwargs)
self._color = color
if color:
colors = Colors(color=color)
self._colors = {
logging.DEBUG: colors.blue,
logging.INFO: colors.green,
logging.WARNING: colors.yellow,
logging.ERROR: colors.red,
logging.CRITICAL: colors.bright_red,
}
self._normal = colors.normal
def format(self, record):
try:
record.message = record.getMessage()
except Exception as e:
record.message = "Bad message (%r): %r" % (e, record.__dict__)
record.asctime = time.strftime(
"%m-%d %H:%M:%S", self.converter(record.created))
prefix = '[%(levelname)1.1s %(asctime)s.%(msecs)03d %(module)s:%(lineno)d]' % \
record.__dict__
if self._color:
prefix = (self._colors.get(record.levelno, self._normal) +
prefix + self._normal)
formatted = prefix + " " + record.message
formatted += ''.join(
' %s=%s' % (k, v) for k, v in record.__dict__.items()
if k not in {
'levelname', 'asctime', 'module', 'lineno', 'args', 'message',
'filename', 'exc_info', 'exc_text', 'created', 'funcName',
'processName', 'process', 'msecs', 'relativeCreated', 'thread',
'threadName', 'name', 'levelno', 'msg', 'pathname', 'stack_info',
'taskName',
})
if record.exc_info:
if not record.exc_text:
record.exc_text = self.formatException(record.exc_info)
if record.exc_text:
formatted = formatted.rstrip() + "\n" + record.exc_text
return formatted.replace("\n", "\n ")
def support_color(stream=sys.stderr):
if stream.isatty():
try:
import curses
curses.setupterm()
if curses.tigetnum("colors") > 0:
return True
except:
import traceback
traceback.print_exc()
return False
def enable_pretty_logging(level=logging.DEBUG, handler=None, color=None):
'''
handler: specify a handler instead of default StreamHandler
color: boolean, force color to be on / off. Default to be on only when
``handler`` isn't specified and the term supports color
'''
logger = logging.getLogger()
if handler is None:
h = logging.StreamHandler()
else:
h = handler
if color is None and handler is None:
color = support_color()
formatter = TornadoLogFormatter(color=color)
h.setLevel(level)
h.setFormatter(formatter)
logger.setLevel(level)
logger.addHandler(h)
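The interesting trick in `TornadoLogFormatter.format` above is appending any non-standard `LogRecord` attributes (i.e. fields passed via `extra=`) to the message as `key=value` pairs. A self-contained sketch of that idea using only the stdlib (`KVFormatter` and `_STANDARD` are illustrative names, and the standard-attribute set is computed rather than hand-listed as in the original):

```python
# Sketch of the "extra fields as key=value" idea in TornadoLogFormatter:
# anything on the record that isn't a standard LogRecord attribute gets
# appended to the formatted message.
import logging

# Attributes present on every LogRecord, plus ones Formatter.format adds.
_STANDARD = set(
    logging.LogRecord('', 0, '', 0, '', (), None).__dict__
) | {'message', 'asctime', 'taskName'}

class KVFormatter(logging.Formatter):
    """Append any `extra=` fields to the message as key=value pairs."""
    def format(self, record):
        base = super().format(record)
        extras = ' '.join(
            '%s=%s' % (k, v) for k, v in record.__dict__.items()
            if k not in _STANDARD
        )
        return base + ' ' + extras if extras else base

record = logging.LogRecord('demo', logging.INFO, __file__, 1, 'hello', (), None)
record.pkg = 'vim'
formatted = KVFormatter('%(message)s').format(record)
```

Computing the exclusion set from a freshly constructed record avoids the maintenance cost of the hard-coded name list in the original (which had to grow `taskName` when Python 3.12 added it).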

@@ -0,0 +1,629 @@
# This file comes from python-packaging 20.9 and is modified
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License.
from __future__ import annotations
import collections
import itertools
import re
from typing import TYPE_CHECKING
class InfinityType(object):
def __repr__(self):
# type: () -> str
return "Infinity"
def __hash__(self):
# type: () -> int
return hash(repr(self))
def __lt__(self, other):
# type: (object) -> bool
return False
def __le__(self, other):
# type: (object) -> bool
return False
def __eq__(self, other):
# type: (object) -> bool
return isinstance(other, self.__class__)
def __ne__(self, other):
# type: (object) -> bool
return not isinstance(other, self.__class__)
def __gt__(self, other):
# type: (object) -> bool
return True
def __ge__(self, other):
# type: (object) -> bool
return True
def __neg__(self):
# type: (object) -> NegativeInfinityType
return NegativeInfinity
Infinity = InfinityType()
class NegativeInfinityType(object):
def __repr__(self):
# type: () -> str
return "-Infinity"
def __hash__(self):
# type: () -> int
return hash(repr(self))
def __lt__(self, other):
# type: (object) -> bool
return True
def __le__(self, other):
# type: (object) -> bool
return True
def __eq__(self, other):
# type: (object) -> bool
return isinstance(other, self.__class__)
def __ne__(self, other):
# type: (object) -> bool
return not isinstance(other, self.__class__)
def __gt__(self, other):
# type: (object) -> bool
return False
def __ge__(self, other):
# type: (object) -> bool
return False
def __neg__(self):
# type: (object) -> InfinityType
return Infinity
NegativeInfinity = NegativeInfinityType()
if TYPE_CHECKING: # pragma: no cover
from typing import Callable, Iterator, List, Optional, SupportsInt, Tuple, Union
InfiniteTypes = Union[InfinityType, NegativeInfinityType]
PrePostDevType = Union[InfiniteTypes, Tuple[str, int]]
SubLocalType = Union[InfiniteTypes, int, str]
LocalType = Union[
NegativeInfinityType,
Tuple[
Union[
SubLocalType,
Tuple[SubLocalType, str],
Tuple[NegativeInfinityType, SubLocalType],
],
...,
],
]
CmpKey = Tuple[
int, Tuple[int, ...], PrePostDevType, PrePostDevType, PrePostDevType, LocalType
]
LegacyCmpKey = Tuple[int, Tuple[str, ...]]
VersionComparisonMethod = Callable[
[Union[CmpKey, LegacyCmpKey], Union[CmpKey, LegacyCmpKey]], bool
]
__all__ = ["parse", "Version", "LegacyVersion", "InvalidVersion", "VERSION_PATTERN"]
_Version = collections.namedtuple(
"_Version", ["epoch", "release", "dev", "pre", "post", "local"]
)
def parse(version):
# type: (str) -> Union[LegacyVersion, Version]
"""
Parse the given version string and return either a :class:`Version` object
or a :class:`LegacyVersion` object depending on if the given version is
a valid PEP 440 version or a legacy version.
"""
try:
return Version(version)
except InvalidVersion:
return LegacyVersion(version)
class InvalidVersion(ValueError):
"""
An invalid version was found, users should refer to PEP 440.
"""
class _BaseVersion(object):
_key = None # type: Union[CmpKey, LegacyCmpKey]
def __hash__(self):
# type: () -> int
return hash(self._key)
# Please keep the duplicated `isinstance` check
# in the six comparisons hereunder
# unless you find a way to avoid adding overhead function calls.
def __lt__(self, other):
# type: (_BaseVersion) -> bool
if not isinstance(other, _BaseVersion):
return NotImplemented
return self._key < other._key
def __le__(self, other):
# type: (_BaseVersion) -> bool
if not isinstance(other, _BaseVersion):
return NotImplemented
return self._key <= other._key
def __eq__(self, other):
# type: (object) -> bool
if not isinstance(other, _BaseVersion):
return NotImplemented
return self._key == other._key
def __ge__(self, other):
# type: (_BaseVersion) -> bool
if not isinstance(other, _BaseVersion):
return NotImplemented
return self._key >= other._key
def __gt__(self, other):
# type: (_BaseVersion) -> bool
if not isinstance(other, _BaseVersion):
return NotImplemented
return self._key > other._key
def __ne__(self, other):
# type: (object) -> bool
if not isinstance(other, _BaseVersion):
return NotImplemented
return self._key != other._key
class LegacyVersion(_BaseVersion):
def __init__(self, version):
# type: (str) -> None
self._version = str(version)
self._key = _legacy_cmpkey(self._version)
def __str__(self):
# type: () -> str
return self._version
def __repr__(self):
# type: () -> str
return "<LegacyVersion({0})>".format(repr(str(self)))
@property
def public(self):
# type: () -> str
return self._version
@property
def base_version(self):
# type: () -> str
return self._version
@property
def epoch(self):
# type: () -> int
return -1
@property
def release(self):
# type: () -> None
return None
@property
def pre(self):
# type: () -> None
return None
@property
def post(self):
# type: () -> None
return None
@property
def dev(self):
# type: () -> None
return None
@property
def local(self):
# type: () -> None
return None
@property
def is_prerelease(self):
# type: () -> bool
return False
@property
def is_postrelease(self):
# type: () -> bool
return False
@property
def is_devrelease(self):
# type: () -> bool
return False
_legacy_version_component_re = re.compile(r"(\d+ | [a-z]+ | \.| -)", re.VERBOSE)
_legacy_version_replacement_map = {
"pre": "c",
"preview": "c",
"-": "final-",
"rc": "c",
"dev": "@",
}
def _parse_version_parts(s):
# type: (str) -> Iterator[str]
for part in _legacy_version_component_re.split(s):
part = _legacy_version_replacement_map.get(part, part)
if not part or part == ".":
continue
if part[:1] in "0123456789":
# pad for numeric comparison
yield part.zfill(8)
else:
yield "*" + part
# ensure that alpha/beta/candidate are before final
yield "*final"
def _legacy_cmpkey(version):
# type: (str) -> LegacyCmpKey
# We hardcode an epoch of -1 here. A PEP 440 version can only have an epoch
# greater than or equal to 0. This effectively sorts a LegacyVersion,
# which uses the de facto standard originally implemented by setuptools,
# before all PEP 440 versions.
epoch = -1
# This scheme is taken from pkg_resources.parse_version in setuptools prior
# to its adoption of the packaging library.
parts = [] # type: List[str]
for part in _parse_version_parts(version.lower()):
if part.startswith("*"):
# remove "-" before a prerelease tag
if part < "*final":
while parts and parts[-1] == "*final-":
parts.pop()
# remove trailing zeros from each series of numeric parts
while parts and parts[-1] == "00000000":
parts.pop()
parts.append(part)
return epoch, tuple(parts)
# Deliberately not anchored to the start and end of the string, to make it
# easier for 3rd party code to reuse
VERSION_PATTERN = r"""
v?
(?:
(?:(?P<epoch>[0-9]+)!)? # epoch
(?P<release>[0-9]+(?:\.[0-9]+)*) # release segment
(?P<pre> # pre-release
[-_\.]?
(?P<pre_l>(a|b|c|rc|alpha|beta|pre|preview))
[-_\.]?
(?P<pre_n>[0-9]+)?
)?
(?P<post> # post release
(?:-(?P<post_n1>[0-9]+))
|
(?:
[-_\.]?
(?P<post_l>post|rev|r)
[-_\.]?
(?P<post_n2>[0-9]+)?
)
)?
(?P<dev> # dev release
[-_\.]?
(?P<dev_l>dev)
[-_\.]?
(?P<dev_n>[0-9]+)?
)?
)
(?:\+(?P<local>[a-z0-9]+(?:[-_\.][a-z0-9]+)*))? # local version
"""
class Version(_BaseVersion):
_regex = re.compile(r"^\s*" + VERSION_PATTERN + r"\s*$", re.VERBOSE | re.IGNORECASE)
def __init__(self, version):
# type: (str) -> None
# Validate the version and parse it into pieces
match = self._regex.search(version)
if not match:
raise InvalidVersion("Invalid version: '{0}'".format(version))
# Store the parsed out pieces of the version
self._version = _Version(
epoch=int(match.group("epoch")) if match.group("epoch") else 0,
release=tuple(int(i) for i in match.group("release").split(".")),
pre=_parse_letter_version(match.group("pre_l"), match.group("pre_n")),
post=_parse_letter_version(
match.group("post_l"), match.group("post_n1") or match.group("post_n2")
),
dev=_parse_letter_version(match.group("dev_l"), match.group("dev_n")),
local=_parse_local_version(match.group("local")),
)
# Generate a key which will be used for sorting
self._key = _cmpkey(
self._version.epoch,
self._version.release,
self._version.pre,
self._version.post,
self._version.dev,
self._version.local,
)
def __repr__(self):
# type: () -> str
return "<Version({0})>".format(repr(str(self)))
def __str__(self):
# type: () -> str
parts = []
# Epoch
if self.epoch != 0:
parts.append("{0}!".format(self.epoch))
# Release segment
parts.append(".".join(str(x) for x in self.release))
# Pre-release
if self.pre is not None:
parts.append("".join(str(x) for x in self.pre))
# Post-release
if self.post is not None:
parts.append(".post{0}".format(self.post))
# Development release
if self.dev is not None:
parts.append(".dev{0}".format(self.dev))
# Local version segment
if self.local is not None:
parts.append("+{0}".format(self.local))
return "".join(parts)
@property
def epoch(self):
# type: () -> int
_epoch = self._version.epoch # type: int
return _epoch
@property
def release(self):
# type: () -> Tuple[int, ...]
_release = self._version.release # type: Tuple[int, ...]
return _release
@property
def pre(self):
# type: () -> Optional[Tuple[str, int]]
_pre = self._version.pre # type: Optional[Tuple[str, int]]
return _pre
@property
def post(self):
# type: () -> Optional[Tuple[str, int]]
return self._version.post[1] if self._version.post else None
@property
def dev(self):
# type: () -> Optional[Tuple[str, int]]
return self._version.dev[1] if self._version.dev else None
@property
def local(self):
# type: () -> Optional[str]
if self._version.local:
return ".".join(str(x) for x in self._version.local)
else:
return None
@property
def public(self):
# type: () -> str
return str(self).split("+", 1)[0]
@property
def base_version(self):
# type: () -> str
parts = []
# Epoch
if self.epoch != 0:
parts.append("{0}!".format(self.epoch))
# Release segment
parts.append(".".join(str(x) for x in self.release))
return "".join(parts)
@property
def is_prerelease(self):
# type: () -> bool
return self.dev is not None or self.pre is not None
@property
def is_postrelease(self):
# type: () -> bool
return self.post is not None
@property
def is_devrelease(self):
# type: () -> bool
return self.dev is not None
@property
def major(self):
# type: () -> int
return self.release[0] if len(self.release) >= 1 else 0
@property
def minor(self):
# type: () -> int
return self.release[1] if len(self.release) >= 2 else 0
@property
def micro(self):
# type: () -> int
return self.release[2] if len(self.release) >= 3 else 0
def _parse_letter_version(
letter, # type: str
number, # type: Union[str, bytes, SupportsInt]
):
# type: (...) -> Optional[Tuple[str, int]]
if letter:
# We consider there to be an implicit 0 in a pre-release if there is
# not a numeral associated with it.
if number is None:
number = 0
# We normalize any letters to their lower case form
letter = letter.lower()
# We consider some words to be alternate spellings of other words and
# in those cases we want to normalize the spellings to our preferred
# spelling.
if letter == "alpha":
letter = "a"
elif letter == "beta":
letter = "b"
elif letter in ["c", "pre", "preview"]:
letter = "rc"
elif letter in ["rev", "r"]:
letter = "post"
return letter, int(number)
if not letter and number:
# We assume if we are given a number, but we are not given a letter
# then this is using the implicit post release syntax (e.g. 1.0-1)
letter = "post"
return letter, int(number)
return None
_local_version_separators = re.compile(r"[\._-]")
def _parse_local_version(local):
# type: (str) -> Optional[LocalType]
"""
Takes a string like abc.1.twelve and turns it into ("abc", 1, "twelve").
"""
if local is not None:
return tuple(
part.lower() if not part.isdigit() else int(part)
for part in _local_version_separators.split(local)
)
return None
def _cmpkey(
epoch, # type: int
release, # type: Tuple[int, ...]
pre, # type: Optional[Tuple[str, int]]
post, # type: Optional[Tuple[str, int]]
dev, # type: Optional[Tuple[str, int]]
local, # type: Optional[Tuple[SubLocalType]]
):
# type: (...) -> CmpKey
# When we compare a release version, we want to compare it with all of the
# trailing zeros removed. So we'll reverse the list, drop all the now-leading
# zeros until we come to something non-zero, then re-reverse it back into the
# correct order, make it a tuple, and use that for our sorting key.
_release = tuple(
reversed(list(itertools.dropwhile(lambda x: x == 0, reversed(release))))
)
# We need to "trick" the sorting algorithm to put 1.0.dev0 before 1.0a0.
# We'll do this by abusing the pre segment, but we _only_ want to do this
# if there is not a pre or a post segment. If we have one of those then
# the normal sorting rules will handle this case correctly.
if pre is None and post is None and dev is not None:
_pre = NegativeInfinity # type: PrePostDevType
# Versions without a pre-release (except as noted above) should sort after
# those with one.
elif pre is None:
_pre = Infinity
else:
_pre = pre
# Versions without a post segment should sort before those with one.
if post is None:
_post = NegativeInfinity # type: PrePostDevType
else:
_post = post
# Versions without a development segment should sort after those with one.
if dev is None:
_dev = Infinity # type: PrePostDevType
else:
_dev = dev
if local is None:
# Versions without a local segment should sort before those with one.
_local = NegativeInfinity # type: LocalType
else:
# Versions with a local segment need that segment parsed to implement
# the sorting rules in PEP440.
# - Alpha numeric segments sort before numeric segments
# - Alpha numeric segments sort lexicographically
# - Numeric segments sort numerically
# - Shorter versions sort before longer versions when the prefixes
# match exactly
_local = tuple(
(i, "") if isinstance(i, int) else (NegativeInfinity, i) for i in local
)
return epoch, _release, _pre, _post, _dev, _local
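The trailing-zero normalization inside `_cmpkey` above is what makes `1.0` and `1.0.0` compare equal: reverse the release tuple, drop the now-leading zeros, and reverse back. That step in isolation (`normalize_release` is an illustrative name for the inline expression):

```python
# The trailing-zero normalization used in _cmpkey above, isolated.
import itertools

def normalize_release(release):
    """Drop trailing zeros so (1, 0, 0) and (1,) compare equal."""
    return tuple(
        reversed(list(itertools.dropwhile(lambda x: x == 0, reversed(release))))
    )
```

Only trailing zeros are removed; interior zeros such as in `1.0.1` are significant and survive the round trip.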

@@ -1,110 +0,0 @@
#!/usr/bin/env python3
# vim:fileencoding=utf-8
import os
import sys
import configparser
import logging
from functools import partial
from pkg_resources import parse_version
from tornado.ioloop import IOLoop
from tornado.options import parse_command_line, define, options
from nvchecker.get_version import get_version
from nvchecker import notify
logger = logging.getLogger(__name__)
notifications = []
g_counter = 0
g_oldver = {}
g_curver = {}
define("notify", type=bool,
help="show desktop notifications when a new version is available")
define("oldverfile", type=str, metavar="FILE",
help="a text file listing current version info in format 'name: version'")
define("verfile", type=str, metavar="FILE",
help="write a new version file")
def task_inc():
global g_counter
g_counter += 1
def task_dec():
global g_counter
g_counter -= 1
if g_counter == 0:
IOLoop.instance().stop()
write_verfile()
def load_config(*files):
config = configparser.ConfigParser(
dict_type=dict, allow_no_value=True
)
for file in files:
with open(file) as f:
config.read_file(f)
return config
def load_oldverfile(file):
v = {}
with open(file) as f:
for l in f:
name, ver = [x.strip() for x in l.split(':', 1)]
v[name] = ver
return v
def write_verfile():
if not options.verfile:
return
with open(options.verfile, 'w') as f:
# sort using only alphanums, as done by the sort command, and needed by
# comm command
for item in sorted(g_curver.items(), key=lambda i: (''.join(filter(str.isalnum, i[0])), i[1])):
print('%s: %s' % item, file=f)
def print_version_update(name, version):
oldver = g_oldver.get(name, None)
if not oldver or parse_version(oldver) < parse_version(version):
logger.info('%s: updated version %s', name, version)
_updated(name, version)
else:
logger.info('%s: current version %s', name, version)
task_dec()
def _updated(name, version):
g_curver[name] = version
if options.notify:
msg = '%s updated to version %s' % (name, version)
notifications.append(msg)
notify.update('nvchecker', '\n'.join(notifications))
def get_versions(config):
task_inc()
for name in config.sections():
task_inc()
get_version(name, config[name], print_version_update)
task_dec()
def main():
files = parse_command_line()
if not files:
return
def run_test():
config = load_config(*files)
if options.oldverfile:
g_oldver.update(load_oldverfile(options.oldverfile))
g_curver.update(g_oldver)
get_versions(config)
ioloop = IOLoop.instance()
ioloop.add_callback(run_test)
ioloop.start()
if __name__ == '__main__':
main()
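`write_verfile` above sorts entries by the alphanumeric characters of the name only, to line up with the ordering produced by the `sort` command (so the output stays usable with `comm`). The sort key on its own, with a small illustrative data set:

```python
# Sketch of the version-file sort key used by write_verfile above:
# compare names by their alphanumeric characters only.

def alnum_key(item):
    name, version = item
    return ''.join(filter(str.isalnum, name)), version

entries = {'foo-bar': '1.0', 'a_b': '0.1', 'foobaz': '2.0'}
ordered = sorted(entries.items(), key=alnum_key)
```

Stripping punctuation means `foo-bar` sorts as `foobar`, which (under typical locale collation that skips punctuation) matches what `sort` emits for the same lines.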

@@ -1,98 +0,0 @@
'''
Wrapper around libnotify via ctypes
'''
__all__ = ["set", "show", "update", "set_timeout", "set_urgency"]
from ctypes import *
from threading import Lock
import atexit
NOTIFY_URGENCY_LOW = 0
NOTIFY_URGENCY_NORMAL = 1
NOTIFY_URGENCY_CRITICAL = 2
UrgencyLevel = {NOTIFY_URGENCY_LOW, NOTIFY_URGENCY_NORMAL, NOTIFY_URGENCY_CRITICAL}
libnotify = None
gobj = None
libnotify_lock = Lock()
libnotify_inited = False
class obj: pass
notify_st = obj()
def set(summary=None, body=None, icon_str=None):
with libnotify_lock:
init()
if summary is not None:
notify_st.summary = summary.encode()
notify_st.body = notify_st.icon_str = None
if body is not None:
notify_st.body = body.encode()
if icon_str is not None:
notify_st.icon_str = icon_str.encode()
libnotify.notify_notification_update(
notify_st.notify,
c_char_p(notify_st.summary),
c_char_p(notify_st.body),
c_char_p(notify_st.icon_str),
c_void_p()
)
def show():
libnotify.notify_notification_show(notify_st.notify, c_void_p())
def update(summary=None, body=None, icon_str=None):
if not any((summary, body)):
raise TypeError('at least one argument please')
set(summary, body, icon_str)
show()
def set_timeout(self, timeout):
'''set `timeout` in milliseconds'''
libnotify.notify_notification_set_timeout(notify_st.notify, int(timeout))
def set_urgency(self, urgency):
if urgency not in UrgencyLevel:
raise ValueError
libnotify.notify_notification_set_urgency(notify_st.notify, urgency)
def init():
global libnotify_inited, libnotify, gobj
if libnotify_inited:
return
try:
libnotify = CDLL('libnotify.so')
except OSError:
libnotify = CDLL('libnotify.so.4')
gobj = CDLL('libgobject-2.0.so')
libnotify.notify_init('pynotify')
libnotify_inited = True
notify_st.notify = libnotify.notify_notification_new(
c_void_p(), c_void_p(), c_void_p(),
)
atexit.register(uninit)
def uninit():
global libnotify_inited
try:
if libnotify_inited:
gobj.g_object_unref(notify_st.notify)
libnotify.notify_uninit()
libnotify_inited = False
except AttributeError:
# libnotify.so has already been unloaded
pass
if __name__ == '__main__':
from time import sleep
notify = __import__('__main__')
notify.set('This is a test', '测试一下。')
notify.show()
sleep(1)
notify.update(body='再测试一下。')

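The `init()` above tries `libnotify.so` first and falls back to the versioned soname `libnotify.so.4`. That fallback pattern can be sketched generically; the `loader` parameter here is injected purely for illustration (with ctypes it would be `CDLL`), and `load_first` is not a name from the original code:

```python
from typing import Any, Callable, Sequence

def load_first(names: Sequence[str], loader: Callable[[str], Any]) -> Any:
    """Try each soname in order and return the first library that loads.

    Mirrors the CDLL fallback in init() above; `loader` would normally
    be ctypes.CDLL.
    """
    last_exc: Exception = OSError('no names given')
    for name in names:
        try:
            return loader(name)
        except OSError as e:
            last_exc = e
    raise last_exc
```

Trying the unversioned name first keeps the code working on systems that ship only the `-dev` symlink, while the explicit `.so.4` fallback covers runtime-only installs.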
nvchecker/slogconf.py

@@ -0,0 +1,141 @@
# vim: se sw=2:
# MIT licensed
# Copyright (c) 2018-2020,2023-2024 lilydjwg <lilydjwg@gmail.com>, et al.
import logging
import os
import io
import traceback
import sys
import structlog
from .httpclient import TemporaryError
def _console_msg(event):
evt = event['event']
if evt == 'up-to-date':
msg = 'up-to-date, version %s' % event['version']
del event['version']
elif evt == 'updated':
if event.get('old_version'):
msg = 'updated from %(old_version)s to %(version)s' % event
else:
msg = 'updated to %(version)s' % event
del event['version'], event['old_version']
else:
msg = evt
if 'revision' in event and not event['revision']:
del event['revision']
if 'name' in event:
msg = f"{event['name']}: {msg}"
del event['name']
event['msg'] = msg
return event
def exc_info(logger, level, event):
if level == 'exception':
event['exc_info'] = True
return event
def filter_nones(logger, level, event):
if 'url' in event and event['url'] is None:
del event['url']
return event
def filter_taskname(logger, level, event):
# added in Python 3.12, not useful to us, but appears as a normal KV.
if 'taskName' in event:
del event['taskName']
return event
def filter_exc(logger, level, event):
exc_info = event.get('exc_info')
if not exc_info:
return event
if exc_info is True:
exc = sys.exc_info()[1]
else:
exc = exc_info
if isinstance(exc, TemporaryError):
if exc.code == 599: # network issues
del event['exc_info']
event['error'] = exc
return event
def stdlib_renderer(logger, level, event):
# return event unchanged for further processing
std_event = _console_msg(event.copy())
try:
logger = logging.getLogger(std_event.pop('logger_name'))
except KeyError:
logger = logging.getLogger()
msg = std_event.pop('msg', std_event.pop('event'))
exc_info = std_event.pop('exc_info', None)
if 'error' in std_event:
std_event['error'] = repr(std_event['error'])
getattr(logger, level)(
msg, exc_info = exc_info, extra=std_event,
)
return event
_renderer = structlog.processors.JSONRenderer(ensure_ascii=False)
def json_renderer(logger, level, event):
event['level'] = level
return _renderer(logger, level, event)
def null_renderer(logger, level, event):
return ''
class _Logger(logging.Logger):
_my_srcfile = os.path.normcase(
stdlib_renderer.__code__.co_filename)
_structlog_dir = os.path.dirname(structlog.__file__)
def findCaller(self, stack_info=False, stacklevel=1):
"""
Find the stack frame of the caller so that we can note the source
file name, line number and function name.
"""
f = logging.currentframe()
#On some versions of IronPython, currentframe() returns None if
#IronPython isn't run with -X:Frames.
if f is not None:
f = f.f_back
orig_f = f
while f and stacklevel > 1:
f = f.f_back
stacklevel -= 1
if not f:
f = orig_f
rv = "(unknown file)", 0, "(unknown function)", None
while hasattr(f, "f_code"):
co = f.f_code
filename = os.path.normcase(co.co_filename)
if filename in [logging._srcfile, self._my_srcfile] \
or filename.startswith(self._structlog_dir):
f = f.f_back
continue
sinfo = None
if stack_info:
sio = io.StringIO()
sio.write('Stack (most recent call last):\n')
traceback.print_stack(f, file=sio)
sinfo = sio.getvalue()
if sinfo[-1] == '\n':
sinfo = sinfo[:-1]
sio.close()
rv = (co.co_filename, f.f_lineno, co.co_name, sinfo)
break
return rv
def fix_logging():
logging.setLoggerClass(_Logger)

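A minimal, self-contained sketch of the event-dict transformation `_console_msg` performs (simplified: it omits the `up-to-date` and `revision` handling): the `event` key becomes a human-readable `msg`, prefixed with the entry name when present.

```python
def console_msg(event: dict) -> dict:
    # Simplified stand-in for _console_msg above, for illustration only.
    evt = event['event']
    if evt == 'updated':
        if event.get('old_version'):
            msg = 'updated from %(old_version)s to %(version)s' % event
        else:
            msg = 'updated to %(version)s' % event
        event.pop('version', None)
        event.pop('old_version', None)
    else:
        msg = evt
    if 'name' in event:
        msg = f"{event['name']}: {msg}"
        del event['name']
    event['msg'] = msg
    return event
```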
nvchecker/sortversion.py

@@ -0,0 +1,34 @@
# MIT licensed
# Copyright (c) 2013-2021 lilydjwg <lilydjwg@gmail.com>, et al.
'''
Sort versions using the deprecated pkg_resources / packaging parse_version, pyalpm.vercmp, or awesomeversion
'''
__all__ = ["sort_version_keys"]
from .lib.packaging_version import parse as parse_version
try:
import pyalpm
from functools import cmp_to_key
vercmp = cmp_to_key(pyalpm.vercmp)
vercmp_available = True
except ImportError:
def vercmp(k):
raise NotImplementedError("Using vercmp but pyalpm can not be imported!")
vercmp_available = False
try:
from awesomeversion import AwesomeVersion
awesomeversion_available = True
except ImportError:
def AwesomeVersion(k): # type: ignore
raise NotImplementedError("Using awesomeversion but it can not be imported!")
awesomeversion_available = False
sort_version_keys = {
"parse_version": parse_version,
"vercmp": vercmp,
"awesomeversion": AwesomeVersion,
}

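Each entry in `sort_version_keys` is a key function for `sorted()`/`max()`. A toy key illustrates the contract (this simplified key handles numeric dotted versions only; `parse_version` covers pre-releases, epochs, and much more):

```python
def numeric_key(v: str):
    # Illustrative stand-in for parse_version: numeric dotted versions only.
    return tuple(int(part) for part in v.split('.'))

versions = ['1.2', '1.10', '1.9']
assert max(versions, key=numeric_key) == '1.10'
assert max(versions) == '1.9'  # plain string comparison gets it wrong
```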
nvchecker/tools.py

@@ -0,0 +1,199 @@
# vim: se sw=2:
# MIT licensed
# Copyright (c) 2013-2024 lilydjwg <lilydjwg@gmail.com>, et al.
import sys
import argparse
import shutil
import structlog
import json
import os.path
from . import core
from .util import RichResult
logger = structlog.get_logger(logger_name=__name__)
def take() -> None:
parser = argparse.ArgumentParser(description='update version records of nvchecker')
core.add_common_arguments(parser)
parser.add_argument('--all', action='store_true',
help='take all updates')
parser.add_argument('--ignore-nonexistent', action='store_true',
help='ignore nonexistent names')
parser.add_argument('names', metavar='NAME', nargs='*',
help='software name to be updated. use NAME=VERSION to update '
'to a specific version instead of the new version.')
args = parser.parse_args()
if core.process_common_arguments(args):
return
opt = core.load_file(args.file, use_keymanager=False)[1]
if opt.ver_files is None:
logger.critical(
"doesn't have 'oldver' and 'newver' set.",
source=args.file,
)
sys.exit(2)
else:
oldverf = opt.ver_files[0]
newverf = opt.ver_files[1]
oldvers = core.read_verfile(oldverf)
newvers = core.read_verfile(newverf)
if args.all:
oldvers.update(newvers)
else:
name: str
for name in args.names:
if "=" in name:
name, newver = name.split("=", 1)
oldvers[name] = RichResult(version=newver)
else:
try:
oldvers[name] = newvers[name]
except KeyError:
if args.ignore_nonexistent:
logger.warning('nonexistent in newver, ignored', name=name)
continue
logger.critical(
"doesn't exist in 'newver' set.", name=name,
)
sys.exit(2)
try:
if os.path.islink(oldverf):
shutil.copy(oldverf, oldverf.with_name(oldverf.name + '~'))
else:
oldverf.rename(
oldverf.with_name(oldverf.name + '~'),
)
except FileNotFoundError:
pass
core.write_verfile(oldverf, oldvers)
def cmp() -> None:
parser = argparse.ArgumentParser(description='compare version records of nvchecker')
core.add_common_arguments(parser)
parser.add_argument('-j', '--json', action='store_true',
help='Output JSON array of dictionaries with {name, newver, oldver, [delta]} '
'(or array of names if --quiet)')
parser.add_argument('-q', '--quiet', action='store_true',
help="Quiet mode, output only the names.")
parser.add_argument('-a', '--all', action='store_true',
help="Include unchanged versions.")
parser.add_argument('-s', '--sort',
choices=('parse_version', 'vercmp', 'awesomeversion', 'none'),
default='parse_version',
help='Version comparison method used to determine '
'the arrow direction (default: parse_version)')
parser.add_argument('-n', '--newer', action='store_true',
help='Shows only the newer ones according to --sort.')
parser.add_argument('--exit-status', action='store_true',
help="exit with status 4 if there are updates")
args = parser.parse_args()
if core.process_common_arguments(args):
return
opt = core.load_file(args.file, use_keymanager=False)[1]
if opt.ver_files is None:
logger.critical(
"doesn't have 'oldver' and 'newver' set.",
source=args.file,
)
sys.exit(2)
else:
oldverf = opt.ver_files[0]
newverf = opt.ver_files[1]
oldvers = {k: v.version for k, v in core.read_verfile(oldverf).items()}
newvers = {k: v.version for k, v in core.read_verfile(newverf).items()}
differences = []
for name, newver in sorted(newvers.items()): # accumulate differences
oldver = oldvers.get(name, None)
diff = {
'name': name,
'oldver': oldver,
'newver': newver
}
if oldver is not None and newver is not None:
if oldver == newver:
diff['delta'] = 'equal'
elif args.sort == "none":
diff['delta'] = 'new' # assume it's a new version if we're not comparing
else:
from .sortversion import sort_version_keys
version = sort_version_keys[args.sort]
if version(oldver) > version(newver): # type: ignore
if args.newer:
continue # don't store this diff
diff['delta'] = 'old'
else:
diff['delta'] = 'new'
elif oldver is None:
diff['delta'] = 'added'
elif newver is None:
if args.newer:
continue # don't store this diff
diff['delta'] = 'gone'
if args.all or diff['delta'] != 'equal':
differences.append(diff)
if args.json:
if args.quiet:
print(json.dumps([diff['name'] for diff in differences], separators=(',', ':')))
else:
print(json.dumps(differences, sort_keys=True, separators=(',', ':')))
elif args.quiet:
for diff in differences:
print(diff['name'])
else:
from .lib.nicelogger import Colors, support_color
c = Colors(support_color(sys.stdout))
diffstyles = {
'new': {
'symbol': '->',
'oldc': c.red
},
'old': {
'symbol': f'{c.red}<-{c.normal}',
'oldc': c.red
},
'added': {
'symbol': '++',
'oldc': c.red
},
'gone': {
'symbol': f'{c.red}--{c.normal}',
'oldc': c.green
},
'equal': {
'symbol': '==',
'oldc': c.green
}
}
for diff in differences:
style = diffstyles[diff.get('delta', 'equal')] # type: ignore # mypy has issues with this line
print(f'{diff["name"]} {style["oldc"]}{diff["oldver"]}{c.normal} {style["symbol"]} {c.green}{diff["newver"]}{c.normal}')
if args.exit_status and any(
diff.get('delta') != 'equal' for diff in differences
):
sys.exit(4)

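The `delta` classification in `cmp()` above can be distilled into a single function; `classify` and the toy key `k` are illustrative names, not part of nvchecker (the real code uses `sort_version_keys`):

```python
from typing import Callable, Optional

def classify(oldver: Optional[str], newver: Optional[str],
             key: Callable[[str], tuple]) -> str:
    # Mirrors the per-entry delta logic of cmp() above.
    if oldver is None:
        return 'added'
    if newver is None:
        return 'gone'
    if oldver == newver:
        return 'equal'
    return 'old' if key(oldver) > key(newver) else 'new'

def k(v):  # toy comparison key for the demo
    return tuple(int(p) for p in v.split('.'))
```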
nvchecker/util.py

@@ -0,0 +1,330 @@
# MIT licensed
# Copyright (c) 2020 lilydjwg <lilydjwg@gmail.com>, et al.
from __future__ import annotations
import sys
import asyncio
from asyncio import Queue
from typing import (
Dict, Optional, List, NamedTuple, Union,
Any, Tuple, Callable, Coroutine, Hashable,
TYPE_CHECKING,
)
from pathlib import Path
import contextvars
import abc
import netrc
from dataclasses import dataclass
if TYPE_CHECKING:
import tomli as tomllib
else:
try:
import tomllib
except ModuleNotFoundError:
import tomli as tomllib
import structlog
from .httpclient import session
from .ctxvars import tries as ctx_tries
from .ctxvars import proxy as ctx_proxy
from .ctxvars import user_agent as ctx_ua
from .ctxvars import httptoken as ctx_httpt
from .ctxvars import verify_cert as ctx_verify_cert
logger = structlog.get_logger(logger_name=__name__)
Entry = Dict[str, Any]
Entry.__doc__ = '''The configuration `dict` for an entry.'''
Entries = Dict[str, Entry]
if sys.version_info[:2] >= (3, 11):
from typing import LiteralString
else:
LiteralString = str
if sys.version_info[:2] >= (3, 10):
@dataclass(kw_only=True)
class RichResult:
version: str
gitref: Optional[str] = None
revision: Optional[str] = None
url: Optional[str] = None
def __str__(self):
return self.version
else:
@dataclass
class RichResult:
version: str
gitref: Optional[str] = None
revision: Optional[str] = None
url: Optional[str] = None
def __str__(self):
return self.version
VersionResult = Union[None, str, RichResult, List[Union[str, RichResult]], Exception]
VersionResult.__doc__ = '''The result of a `get_version` check.
* `None` - No version found.
* `str` - A single version string is found.
* `RichResult` - A version string with additional information.
* `List[Union[str, RichResult]]` - Multiple version strings with or without additional information are found. :ref:`list options` will be applied.
* `Exception` - An error occurred.
'''
class FileLoadError(Exception):
def __init__(self, kind, filename, exc):
self.kind = kind
self.filename = filename
self.exc = exc
def __str__(self):
return f'failed to load {self.kind} {self.filename!r}: {self.exc}'
class KeyManager:
'''Manages data in the keyfile.'''
def __init__(
self, file: Optional[Path],
) -> None:
if file is not None:
try:
with file.open('rb') as f:
keys = tomllib.load(f)['keys']
except (OSError, tomllib.TOMLDecodeError) as e:
raise FileLoadError('keyfile', str(file), e)
else:
keys = {}
self.keys = keys
try:
netrc_file = netrc.netrc()
netrc_hosts = netrc_file.hosts
except (FileNotFoundError, netrc.NetrcParseError):
netrc_hosts = {}
self.netrc = netrc_hosts
def get_key(self, name: str, legacy_name: Optional[str] = None) -> Optional[str]:
'''Get the named key (token) in the keyfile.'''
keyfile_token = self.keys.get(name) or self.keys.get(legacy_name)
netrc_passwd = (e := self.netrc.get(name)) and e[2]
return keyfile_token or netrc_passwd
class EntryWaiter:
def __init__(self) -> None:
self._waiting: Dict[str, asyncio.Future] = {}
async def wait(self, name: str) -> str:
'''Wait on the ``name`` entry and return its result (the version string)'''
fu = self._waiting.get(name)
if fu is None:
fu = asyncio.Future()
self._waiting[name] = fu
return await fu
def set_result(self, name: str, value: str) -> None:
fu = self._waiting.get(name)
if fu is not None:
fu.set_result(value)
def set_exception(self, name: str, e: Exception) -> None:
fu = self._waiting.get(name)
if fu is not None:
fu.set_exception(e)
class RawResult(NamedTuple):
'''The unprocessed result from a check.'''
name: str
version: VersionResult
conf: Entry
RawResult.name.__doc__ = 'The name (table name) of the entry.'
RawResult.version.__doc__ = 'The result from the check.'
RawResult.conf.__doc__ = 'The entry configuration (table content) of the entry.'
ResultData = Dict[str, RichResult]
class BaseWorker:
'''The base class for defining `Worker` classes for source plugins.
.. py:attribute:: task_sem
:type: asyncio.Semaphore
This is the rate-limiting semaphore. Workers should acquire it while doing one unit of work.
.. py:attribute:: result_q
:type: Queue[RawResult]
Results should be put into this queue.
.. py:attribute:: tasks
:type: List[Tuple[str, Entry]]
A list of tasks for the `Worker` to complete. Every task consists of
a tuple for the task name (table name in the configuration file) and the
content of that table (as a `dict`).
.. py:attribute:: keymanager
:type: KeyManager
The `KeyManager` for retrieving keys from the keyfile.
'''
def __init__(
self,
task_sem: asyncio.Semaphore,
result_q: Queue[RawResult],
tasks: List[Tuple[str, Entry]],
keymanager: KeyManager,
) -> None:
self.task_sem = task_sem
self.result_q = result_q
self.keymanager = keymanager
self.tasks = tasks
@abc.abstractmethod
async def run(self) -> None:
'''Run the `tasks`. Subclasses should implement this method.'''
raise NotImplementedError
async def _run_maynot_raise(self) -> None:
try:
await self.run()
except Exception:
# don't let an exception tear down the whole process
logger.exception('exception raised by Worker.run')
class AsyncCache:
'''A cache for use with async functions.'''
cache: Dict[Hashable, Any]
lock: asyncio.Lock
def __init__(self) -> None:
self.cache = {}
self.lock = asyncio.Lock()
async def _get_json(
self, key: Tuple[str, str, Tuple[Tuple[str, str], ...]],
) -> Any:
_, url, headers = key
res = await session.get(url, headers=dict(headers))
return res.json()
async def get_json(
self, url: str, *,
headers: Dict[str, str] = {},
) -> Any:
'''Get specified ``url`` and return the response content as JSON.
The returned data will be cached for reuse.
'''
key = '_jsonurl', url, tuple(sorted(headers.items()))
return await self.get(
key, self._get_json) # type: ignore
async def get(
self,
key: Hashable,
func: Callable[[Hashable], Coroutine[Any, Any, Any]],
) -> Any:
'''Run async ``func`` and cache its return value by ``key``.
The ``key`` should be hashable, and the function will be called with it as
its sole argument. For multiple simultaneous calls with the same key, only
one will actually be called, and others will wait and return the same
(cached) value.
'''
async with self.lock:
cached = self.cache.get(key)
if cached is None:
coro = func(key)
fu = asyncio.create_task(coro)
self.cache[key] = fu
if asyncio.isfuture(cached): # pending
return await cached
elif cached is not None: # cached
return cached
else: # not cached
r = await fu
self.cache[key] = r
return r
if TYPE_CHECKING:
from typing_extensions import Protocol
class GetVersionFunc(Protocol):
async def __call__(
self,
name: str, conf: Entry,
*,
cache: AsyncCache,
keymanager: KeyManager,
) -> VersionResult:
...
else:
GetVersionFunc = Any
class FunctionWorker(BaseWorker):
func: GetVersionFunc
cache: AsyncCache
def initialize(self, func: GetVersionFunc) -> None:
self.func = func
self.cache = AsyncCache()
async def run(self) -> None:
futures = []
for name, entry in self.tasks:
ctx = contextvars.copy_context()
fu = ctx.run(self.run_one, name, entry)
futures.append(fu)
for fu2 in asyncio.as_completed(futures):
await fu2
async def run_one(
self, name: str, entry: Entry,
) -> None:
assert self.func is not None
tries = entry.get('tries', None)
if tries is not None:
ctx_tries.set(tries)
proxy = entry.get('proxy', None)
if proxy is not None:
ctx_proxy.set(proxy)
ua = entry.get('user_agent', None)
if ua is not None:
ctx_ua.set(ua)
httpt = entry.get('httptoken', None)
if httpt is None:
httpt = self.keymanager.get_key('httptoken_'+name)
if httpt is not None:
ctx_httpt.set(httpt)
verify_cert = entry.get('verify_cert', None)
if verify_cert is not None:
ctx_verify_cert.set(verify_cert)
try:
async with self.task_sem:
version = await self.func(
name, entry,
cache = self.cache,
keymanager = self.keymanager,
)
await self.result_q.put(RawResult(name, version, entry))
except Exception as e:
await self.result_q.put(RawResult(name, e, entry))
class GetVersionError(Exception):
'''An error occurred while getting version information.
Raise this when a known bad situation happens.
:param msg: The error message.
:param kwargs: Arbitrary additional context for the error.
'''
def __init__(self, msg: LiteralString, **kwargs: Any) -> None:
self.msg = msg
self.kwargs = kwargs

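The key property of `AsyncCache.get` is deduplication: concurrent calls with the same key run the fetch function only once, with late callers awaiting the in-flight task. A distilled, self-contained demo of that pattern (the names `demo`/`fetch` are illustrative):

```python
import asyncio
from collections import Counter

async def demo():
    # Three simultaneous get() calls for one key run fetch() only once.
    calls = Counter()
    cache: dict = {}
    lock = asyncio.Lock()

    async def get(key, func):
        async with lock:
            cached = cache.get(key)
            if cached is None:
                fu = asyncio.create_task(func(key))
                cache[key] = fu
        if asyncio.isfuture(cached):   # someone else started the fetch
            return await cached
        elif cached is not None:       # already resolved
            return cached
        else:                          # we started the fetch
            r = await fu
            cache[key] = r
            return r

    async def fetch(key):
        calls[key] += 1
        await asyncio.sleep(0)
        return key.upper()

    results = await asyncio.gather(*(get('k', fetch) for _ in range(3)))
    return results, calls['k']
```

Note that the awaits happen outside the lock, so a slow fetch for one key does not block lookups for other keys.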
nvchecker_source/alpm.py

@@ -0,0 +1,44 @@
# MIT licensed
# Copyright (c) 2020-2021 DDoSolitary <DDoSolitary@gmail.com>, et al.
from nvchecker.api import GetVersionError
from pyalpm import Handle
async def open_db(info):
dbpath, repo = info
handle = Handle('/', dbpath)
db = handle.register_syncdb(repo, 0)
return handle, db
async def get_version(name, conf, *, cache, **kwargs):
pkgname = conf.get('alpm', name)
dbpath = conf.get('dbpath', '/var/lib/pacman')
strip_release = conf.get('strip_release', False)
provided = conf.get('provided')
repo = conf.get('repo')
if repo is None:
repos = conf.get('repos') or ['core', 'extra', 'multilib']
else:
repos = [repo]
for repo in repos:
db = (await cache.get((dbpath, repo), open_db))[1]
pkg = db.get_pkg(pkgname)
if pkg is not None:
break
if pkg is None:
raise GetVersionError('package not found in the ALPM database')
if provided is None:
version = pkg.version
else:
provides = dict(x.split('=', 1) for x in pkg.provides if '=' in x)
version = provides.get(provided)
if version is None:
raise GetVersionError('provides element not found')
if strip_release:
version = version.split('-', 1)[0]
return version

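How the `provided` lookup above resolves a version: each ALPM provides entry of the form `NAME=VERSION` is split once and looked up by name. The entry values below are examples, not real package data:

```python
provides = ['libgudev-1.0.so=1.0-64', 'systemd=255.4']  # example values
table = dict(x.split('=', 1) for x in provides if '=' in x)

version = table.get('systemd')
assert version == '255.4'
assert 'libgudev-1.0.so' in table  # split('=', 1) keeps embedded data intact

# strip_release removes the pkgrel suffix:
assert '255.4-1'.split('-', 1)[0] == '255.4'
```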

@@ -0,0 +1,51 @@
# MIT licensed
# Copyright (c) 2023 Pekka Ristola <pekkarr [at] protonmail [dot] com>, et al.
from asyncio import create_subprocess_exec
from asyncio.subprocess import PIPE
import re
from typing import Tuple, List
from nvchecker.api import GetVersionError
async def get_files(info: Tuple[str, str]) -> List[str]:
dbpath, pkg = info
# there's no pyalpm bindings for the file databases
cmd = ['pacman', '-Flq', '--dbpath', dbpath, pkg]
p = await create_subprocess_exec(*cmd, stdout = PIPE, stderr = PIPE)
stdout, stderr = await p.communicate()
if p.returncode == 0:
return stdout.decode().splitlines()
else:
raise GetVersionError(
'pacman failed to get file list',
pkg = pkg,
cmd = cmd,
stdout = stdout.decode(errors='replace'),
stderr = stderr.decode(errors='replace'),
returncode = p.returncode,
)
async def get_version(name, conf, *, cache, **kwargs):
pkg = conf['pkgname']
repo = conf.get('repo')
if repo is not None:
pkg = f'{repo}/{pkg}'
dbpath = conf.get('dbpath', '/var/lib/pacman')
regex = re.compile(conf['filename'])
if regex.groups > 1:
raise GetVersionError('multi-group regex')
strip_dir = conf.get('strip_dir', False)
files = await cache.get((dbpath, pkg), get_files)
for f in files:
fn = f.rsplit('/', 1)[-1] if strip_dir else f
match = regex.fullmatch(fn)
if match:
groups = match.groups()
return groups[0] if len(groups) > 0 else fn
raise GetVersionError('no file matches specified regex')

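The `filename` option above is a regex with at most one capture group; the group, if present, becomes the version, otherwise the whole matching filename does. A sketch with a hypothetical `filename` value:

```python
import re

regex = re.compile(r'gcc-(\d+\.\d+\.\d+)')  # hypothetical 'filename' value
assert regex.groups <= 1  # the source rejects multi-group regexes

files = ['usr/lib/gcc-14.2.1', 'usr/bin/gcc']  # hypothetical pacman -Flq output
strip_dir = True
version = None
for f in files:
    fn = f.rsplit('/', 1)[-1] if strip_dir else f
    m = regex.fullmatch(fn)
    if m:
        version = m.group(1) if regex.groups else fn
        break
assert version == '14.2.1'
```

Note `fullmatch` is used, so the pattern must cover the entire (possibly directory-stripped) filename, not just a substring.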

@@ -0,0 +1,71 @@
# MIT licensed
# Copyright (c) 2020 lilydjwg <lilydjwg@gmail.com>, et al.
# Copyright (c) 2017,2020 Chih-Hsuan Yen <yan12125 at gmail dot com>
import os
import re
from xml.etree import ElementTree
from nvchecker.api import session
_ANDROID_REPO_MANIFESTS = {
'addon': 'https://dl.google.com/android/repository/addon2-1.xml',
'package': 'https://dl.google.com/android/repository/repository2-1.xml',
}
# See <channel> tags in Android SDK XML manifests
_CHANNEL_MAP = {
'stable': 'channel-0',
'beta': 'channel-1',
'dev': 'channel-2',
'canary': 'channel-3',
}
async def _get_repo_manifest(repo):
repo_xml_url = _ANDROID_REPO_MANIFESTS[repo]
res = await session.get(repo_xml_url)
data = res.body.decode('utf-8')
repo_manifest = ElementTree.fromstring(data)
return repo_manifest
async def get_version(name, conf, *, cache, **kwargs):
repo = conf['repo']
pkg_path_prefix = conf['android_sdk']
channels = [_CHANNEL_MAP[channel]
for channel in conf.get('channel', 'stable').split(',')]
repo_manifest = await cache.get(repo, _get_repo_manifest)
versions = []
for pkg in repo_manifest.findall('.//remotePackage'):
if not pkg.attrib['path'].startswith(pkg_path_prefix):
continue
channelRef = pkg.find('./channelRef')
if channelRef.attrib['ref'] not in channels:
continue
for archive in pkg.findall('./archives/archive'):
host_os = archive.find('./host-os')
if host_os is not None and host_os.text != conf.get('host_os', 'linux'):
continue
archive_url = archive.find('./complete/url').text
# revision
rev = pkg.find('./revision')
rev_strs = []
for part in ('major', 'minor', 'micro'):
part_node = rev.find('./' + part)
if part_node is not None:
rev_strs.append(part_node.text)
# release number
filename, ext = os.path.splitext(archive_url)
rel_str = filename.rsplit('-')[-1]
mobj = re.match(r'r\d+', rel_str)
if mobj:
rev_strs.append(rel_str)
versions.append('.'.join(rev_strs))
# A package suitable for the target host OS is found - skip remaining
break
return versions

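How the version string is assembled above: the `<revision>` parts are joined with dots, and an `rNN` release suffix is parsed from the archive file name when present. The archive URL and revision parts below are hypothetical:

```python
import os
import re

archive_url = 'emulator-linux_x64-r12.zip'  # hypothetical archive URL
rev_strs = ['35', '0']                      # major, minor from <revision>

filename, _ext = os.path.splitext(archive_url)
rel_str = filename.rsplit('-')[-1]          # last hyphen-separated chunk
if re.match(r'r\d+', rel_str):
    rev_strs.append(rel_str)

assert '.'.join(rev_strs) == '35.0.r12'
```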

@@ -0,0 +1,17 @@
# MIT licensed
# Copyright (c) 2017-2020 lilydjwg <lilydjwg@gmail.com>, et al.
from nvchecker.api import RichResult
URL = 'https://release-monitoring.org/api/project/{pkg}'
async def get_version(name, conf, *, cache, **kwargs):
pkg = conf.get('anitya_id')
if pkg is None:
pkg = conf.get('anitya')
url = URL.format(pkg = pkg)
data = await cache.get_json(url)
return RichResult(
version = data['version'],
url = f'https://release-monitoring.org/project/{data["id"]}/',
)

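The entry accepts either `anitya_id` (preferred) or the legacy `anitya` key; the chosen value is interpolated into the release-monitoring.org API URL. The entry dict below is a hypothetical configuration:

```python
URL = 'https://release-monitoring.org/api/project/{pkg}'

conf = {'anitya': 'shutter'}  # hypothetical entry using the legacy key
pkg = conf.get('anitya_id')
if pkg is None:
    pkg = conf.get('anitya')

assert URL.format(pkg=pkg) == 'https://release-monitoring.org/api/project/shutter'
```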
nvchecker_source/apt.py

@@ -0,0 +1,189 @@
# MIT licensed
# Copyright (c) 2020 Felix Yan <felixonmars@archlinux.org>, et al.
from __future__ import annotations
import re
import asyncio
from typing import Dict, Tuple
import itertools
import functools
from collections import defaultdict
from nvchecker.api import (
session, GetVersionError, VersionResult,
RichResult, Entry, AsyncCache, KeyManager,
)
APT_RELEASE_URL = "%s/dists/%s/Release"
APT_PACKAGES_PATH = "%s/binary-%s/Packages%s"
APT_PACKAGES_URL = "%s/dists/%s/%s"
APT_PACKAGES_SUFFIX_PREFER = (".xz", ".gz", "")
DpkgVersion = Tuple[int, str, str]
def parse_version(s: str) -> DpkgVersion:
try:
epoch_str, rest = s.split(':', 1)
except ValueError:
epoch = 0
rest = s
else:
epoch = int(epoch_str)
try:
ver, rev = rest.split('-', 1)
except ValueError:
ver = rest
rev = ''
return epoch, ver, rev
def _compare_part(a: str, b: str) -> int:
sa = re.split(r'(\d+)', a)
sb = re.split(r'(\d+)', b)
for idx, (pa, pb) in enumerate(itertools.zip_longest(sa, sb)):
if pa is None:
return -1
elif pb is None:
return 1
if idx % 2 == 1:
ret = int(pa) - int(pb)
if ret != 0:
return ret
else:
if pa < pb:
return -1
elif pa > pb:
return 1
return 0
def compare_version_parsed(a: DpkgVersion, b: DpkgVersion) -> int:
ret = a[0] - b[0]
if ret != 0:
return ret
ret = _compare_part(a[1], b[1])
if ret != 0:
return ret
return _compare_part(a[2], b[2])
def compare_version(a: str, b: str) -> int:
va = parse_version(a)
vb = parse_version(b)
return compare_version_parsed(va, vb)
def _decompress_data(url: str, data: bytes) -> str:
if url.endswith(".xz"):
import lzma
data = lzma.decompress(data)
elif url.endswith(".gz"):
import gzip
data = gzip.decompress(data)
return data.decode('utf-8')
async def get_url(url: str) -> str:
res = await session.get(url)
data = res.body
loop = asyncio.get_running_loop()
return await loop.run_in_executor(
None, _decompress_data,
url, data)
async def parse_packages(key: Tuple[AsyncCache, str]) -> Tuple[Dict[str, str], Dict[str, str], Dict[str, str]]:
cache, url = key
apt_packages = await cache.get(url, get_url) # type: ignore
pkg_map = defaultdict(list)
srcpkg_map = defaultdict(list)
pkg_to_src_map = defaultdict(list)
pkg = None
srcpkg = None
for line in apt_packages.split('\n'):
if line.startswith("Package: "):
pkg = line[9:]
elif line.startswith("Source: "):
srcpkg = line[8:]
elif line.startswith("Version: "):
version = line[9:]
if pkg is not None:
pkg_map[pkg].append(version)
pkg_to_src_map["%s/%s" % (pkg, version)] = srcpkg if srcpkg is not None else pkg
if srcpkg is not None:
srcpkg_map[srcpkg].append(version)
pkg = srcpkg = None
pkg_map_max = {pkg: max(vs, key=functools.cmp_to_key(compare_version))
for pkg, vs in pkg_map.items()}
srcpkg_map_max = {pkg: max(vs, key=functools.cmp_to_key(compare_version))
for pkg, vs in srcpkg_map.items()}
pkg_to_src_map_max = {pkg: pkg_to_src_map["%s/%s" % (pkg, vs)]
for pkg, vs in pkg_map_max.items()}
return pkg_map_max, srcpkg_map_max, pkg_to_src_map_max
async def get_version(
name: str, conf: Entry, *,
cache: AsyncCache, keymanager: KeyManager,
**kwargs,
) -> VersionResult:
srcpkg = conf.get('srcpkg')
pkg = conf.get('pkg')
mirror = conf['mirror']
suite = conf['suite']
repo = conf.get('repo', 'main')
arch = conf.get('arch', 'amd64')
strip_release = conf.get('strip_release', False)
if srcpkg and pkg:
raise GetVersionError('Setting both srcpkg and pkg is ambiguous')
elif not srcpkg and not pkg:
pkg = name
apt_release = await cache.get(
APT_RELEASE_URL % (mirror, suite), get_url) # type: ignore
for suffix in APT_PACKAGES_SUFFIX_PREFER:
packages_path = APT_PACKAGES_PATH % (repo, arch, suffix)
if " " + packages_path in apt_release:
break
else:
raise GetVersionError('Packages file not found in APT repository')
pkg_map, srcpkg_map, pkg_to_src_map = await cache.get(
(cache, APT_PACKAGES_URL % (mirror, suite, packages_path)), parse_packages) # type: ignore
if pkg and pkg in pkg_map:
version = pkg_map[pkg]
changelog_name = pkg_to_src_map[pkg]
elif srcpkg and srcpkg in srcpkg_map:
version = srcpkg_map[srcpkg]
changelog_name = srcpkg
else:
raise GetVersionError('package not found in APT repository')
# Get Changelogs field from the Release file
changelogs_url = None
for line in apt_release.split('\n'):
if line.startswith('Changelogs: '):
changelogs_url = line[12:]
break
# Build the changelog URL (see https://wiki.debian.org/DebianRepository/Format#Changelogs for spec)
changelog = None
if changelogs_url is not None and changelogs_url != 'no':
changelog_section = changelog_name[:4] if changelog_name.startswith('lib') else changelog_name[:1]
changelog = changelogs_url.replace('@CHANGEPATH@', f'{repo}/{changelog_section}/{changelog_name}/{changelog_name}_{version}')
if strip_release:
version = version.split("-")[0]
if changelog is not None:
return RichResult(
version = version,
url = changelog,
)
else:
return version

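A sketch of the dpkg-style version handling above: a version splits into `(epoch, upstream, revision)`, and each part is compared by alternating non-digit/digit runs. The `parse` helper below is an illustrative condensation of `parse_version` (like the original, it splits on the first `:` and the first `-`):

```python
import re

def parse(s):
    epoch, sep, rest = s.partition(':')
    if not sep:
        epoch, rest = '0', s
    ver, _, rev = rest.partition('-')
    return int(epoch), ver, rev

assert parse('1:2.30-1') == (1, '2.30', '1')
assert parse('5.10') == (0, '5.10', '')

# re.split with a capturing group yields the alternating runs that
# _compare_part walks (odd indices are the numeric runs):
assert re.split(r'(\d+)', '1.2~rc3') == ['', '1', '.', '2', '~rc', '3', '']
```

Comparing the numeric runs as integers is what makes `1.10` sort after `1.9`, which a plain string comparison would get wrong.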

@@ -0,0 +1,37 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
from nvchecker.api import session, RichResult, GetVersionError
URL = 'https://archlinux.org/packages/search/json/'
async def request(pkg):
res = await session.get(URL, params={"name": pkg})
return res.json()
async def get_version(name, conf, *, cache, **kwargs):
pkg = conf.get('archpkg') or name
strip_release = conf.get('strip_release', False)
provided = conf.get('provided')
data = await cache.get(pkg, request)
if not data['results']:
raise GetVersionError('Arch package not found')
r = [r for r in data['results'] if r['repo'] != 'testing'][0]
if provided:
provides = dict(x.split('=', 1) for x in r['provides'] if '=' in x)
version = provides.get(provided, None)
if strip_release:
version = version.split('-', 1)[0]
elif strip_release:
version = r['pkgver']
else:
version = r['pkgver'] + '-' + r['pkgrel']
return RichResult(
version = version,
url = f'https://archlinux.org/packages/{r["repo"]}/{r["arch"]}/{r["pkgname"]}/',
)

nvchecker_source/aur.py

@@ -0,0 +1,109 @@
# MIT licensed
# Copyright (c) 2013-2020,2024 lilydjwg <lilydjwg@gmail.com>, et al.
from datetime import datetime, timezone
import asyncio
from typing import Iterable, Dict, List, Tuple, Any, Optional
from nvchecker.api import (
session, GetVersionError, VersionResult, RichResult,
Entry, BaseWorker, RawResult,
)
AUR_URL = 'https://aur.archlinux.org/rpc/'
class AurResults:
cache: Dict[str, Optional[Dict[str, Any]]]
def __init__(self) -> None:
self.cache = {}
async def get_multiple(
self,
aurnames: Iterable[str],
) -> Dict[str, Optional[Dict[str, Any]]]:
params = [('v', '5'), ('type', 'info')]
params.extend(('arg[]', name) for name in aurnames
if name not in self.cache)
res = await session.get(AUR_URL, params=params)
data = res.json()
new_results = {r['Name']: r for r in data['results']}
cache = self.cache
cache.update(new_results)
cache.update(
(name, None)
for name in set(aurnames) - new_results.keys()
)
return {name: cache[name] for name in aurnames
if name in cache}
class Worker(BaseWorker):
# https://wiki.archlinux.org/index.php/Aurweb_RPC_interface#Limitations
batch_size = 100
async def run(self) -> None:
tasks = self.tasks
n_batch, left = divmod(len(tasks), self.batch_size)
if left > 0:
n_batch += 1
aur_results = AurResults()
ret = []
for i in range(n_batch):
s = i * self.batch_size
batch = tasks[s : s+self.batch_size]
fu = self._run_batch(batch, aur_results)
ret.append(fu)
await asyncio.gather(*ret)
async def _run_batch(
self,
batch: List[Tuple[str, Entry]],
aur_results: AurResults,
) -> None:
task_by_name: Dict[str, Entry] = dict(self.tasks)
async with self.task_sem:
results = await _run_batch_impl(batch, aur_results)
for name, version in results.items():
r = RawResult(name, version, task_by_name[name])
await self.result_q.put(r)
async def _run_batch_impl(
batch: List[Tuple[str, Entry]],
aur_results: AurResults,
) -> Dict[str, VersionResult]:
aurnames = {conf.get('aur', name) for name, conf in batch}
results = await aur_results.get_multiple(aurnames)
ret: Dict[str, VersionResult] = {}
for name, conf in batch:
aurname = conf.get('aur', name)
use_last_modified = conf.get('use_last_modified', False)
strip_release = conf.get('strip_release', False)
result = results.get(aurname)
if result is None:
ret[name] = GetVersionError('AUR upstream not found')
continue
version = result['Version']
if use_last_modified:
dt = datetime.fromtimestamp(result['LastModified'], timezone.utc)
version += '-' + dt.strftime('%Y%m%d%H%M%S')
if strip_release and '-' in version:
version = version.rsplit('-', 1)[0]
ret[name] = RichResult(
version = version,
url = f'https://aur.archlinux.org/packages/{name}',
)
return ret

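The `Worker` above splits its tasks into batches of at most 100 names, matching the aurweb RPC limit linked in the comment. The batching arithmetic, with dummy tasks standing in for real entries:

```python
batch_size = 100
tasks = list(range(250))  # 250 dummy tasks

n_batch, left = divmod(len(tasks), batch_size)
if left > 0:
    n_batch += 1  # a partial final batch still needs a request

batches = [tasks[i * batch_size:(i + 1) * batch_size] for i in range(n_batch)]
assert [len(b) for b in batches] == [100, 100, 50]
```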

@@ -0,0 +1,73 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
from typing import Any, List, Union
from urllib.parse import urlencode
from nvchecker.api import VersionResult, RichResult, Entry, AsyncCache
# doc: https://developer.atlassian.com/cloud/bitbucket/rest/api-group-commits/#api-repositories-workspace-repo-slug-commits-get
BITBUCKET_URL = 'https://bitbucket.org/api/2.0/repositories/%s/commits/%s'
# doc: https://developer.atlassian.com/cloud/bitbucket/rest/api-group-refs/#api-repositories-workspace-repo-slug-refs-tags-get
BITBUCKET_MAX_TAG = 'https://bitbucket.org/api/2.0/repositories/%s/refs/tags'
async def get_version(
name: str, conf: Entry, *,
cache: AsyncCache,
**kwargs: Any,
) -> VersionResult:
repo = conf['bitbucket']
br = conf.get('branch', '')
use_max_tag = conf.get('use_max_tag', False)
use_sorted_tags = conf.get('use_sorted_tags', False)
if use_sorted_tags or use_max_tag:
parameters = {'fields': 'values.name,values.links.html.href,next'}
if use_sorted_tags:
parameters['sort'] = conf.get('sort', '-target.date')
if 'query' in conf:
parameters['q'] = conf['query']
if use_sorted_tags:
url = BITBUCKET_MAX_TAG % repo
url += '?' + urlencode(parameters)
return await _get_tags(url, max_page=1, cache=cache)
elif use_max_tag:
url = BITBUCKET_MAX_TAG % repo
url += '?' + urlencode(parameters)
max_page = conf.get('max_page', 3)
return await _get_tags(url, max_page=max_page, cache=cache)
else:
url = BITBUCKET_URL % (repo, br)
data = await cache.get_json(url)
return RichResult(
version = data['values'][0]['date'].split('T', 1)[0].replace('-', ''),
url = data['values'][0]['links']['html']['href'],
)
async def _get_tags(
url: str, *,
max_page: int,
cache: AsyncCache,
) -> VersionResult:
ret: List[Union[str, RichResult]] = []
for _ in range(max_page):
data = await cache.get_json(url)
ret.extend([
RichResult(
version = tag['name'],
url = tag['links']['html']['href'],
) for tag in data['values']
])
if 'next' in data:
url = data['next']
else:
break
return ret
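The no-tag branch above turns the latest commit's ISO date into a `YYYYMMDD` version string; a minimal sketch of that transformation, using a fabricated date:

```python
def commit_date_to_version(iso_date: str) -> str:
    # Keep the date part of an ISO 8601 timestamp and drop the dashes,
    # e.g. '2025-02-28T10:00:00+00:00' -> '20250228'
    return iso_date.split('T', 1)[0].replace('-', '')
```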

40 nvchecker_source/cmd.py Normal file
@@ -0,0 +1,40 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
import asyncio
import structlog
from nvchecker.api import GetVersionError
logger = structlog.get_logger(logger_name=__name__)
async def run_cmd(cmd: str) -> str:
logger.debug('running cmd', cmd=cmd)
p = await asyncio.create_subprocess_shell(
cmd,
stdout=asyncio.subprocess.PIPE,
stderr=asyncio.subprocess.PIPE,
)
output, error = await p.communicate()
output_s = output.strip().decode('latin1')
error_s = error.strip().decode(errors='replace')
if p.returncode != 0:
raise GetVersionError(
'command exited with error',
cmd=cmd, error=error_s,
returncode=p.returncode)
elif not output_s:
raise GetVersionError(
'command exited without output',
cmd=cmd, error=error_s,
returncode=p.returncode)
else:
return output_s
async def get_version(
name, conf, *, cache, keymanager=None
):
cmd = conf['cmd']
return await cache.get(cmd, run_cmd)

@@ -0,0 +1,21 @@
# MIT licensed
# Copyright (c) 2021 lilydjwg <lilydjwg@gmail.com>, et al.
import asyncio
import string
from nvchecker.api import entry_waiter
class CombineFormat(string.Template):
idpattern = '[0-9]+'
async def get_version(
name, conf, *, cache, keymanager=None
):
t = CombineFormat(conf['format'])
from_ = conf['from']
waiter = entry_waiter.get()
entries = [waiter.wait(name) for name in from_]
vers = await asyncio.gather(*entries)
versdict = {str(i+1): v for i, v in enumerate(vers)}
return t.substitute(versdict)
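The `CombineFormat` class above restricts `string.Template` placeholders to numeric indices, so `$1`, `$2`, … refer to the `from` entries in order. A standalone sketch with made-up version values:

```python
import string

class CombineFormat(string.Template):
    # Only digit sequences are valid placeholder names
    idpattern = '[0-9]+'

# Versions gathered from two hypothetical source entries
vers = ['1.2.3', '20250228']
versdict = {str(i + 1): v for i, v in enumerate(vers)}
combined = CombineFormat('$1-$2').substitute(versdict)
```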

@@ -0,0 +1,164 @@
# MIT licensed
# Copyright (c) 2020 Chih-Hsuan Yen <yan12125 at gmail dot com>
from typing import Dict, List, NamedTuple, Optional, Tuple
from urllib.request import parse_http_list
from urllib.parse import urljoin
import json
from nvchecker.api import session, HTTPError
class AuthInfo(NamedTuple):
service: Optional[str]
realm: str
def parse_www_authenticate_header(header: str) -> Tuple[str, Dict[str, str]]:
'''
Parse WWW-Authenticate header used in OAuth2 authentication for container
registries. This is NOT RFC-compliant!
Simplified from http.parse_www_authenticate_header in Werkzeug (BSD license)
'''
auth_type, auth_info = header.split(None, 1)
result = {}
for item in parse_http_list(auth_info):
name, value = item.split("=", 1)
if value[:1] == value[-1:] == '"':
value = value[1:-1]
result[name] = value
return auth_type, result
# Inspired by https://stackoverflow.com/a/51921869
# Reference: https://github.com/containers/image/blob/v5.6.0/docker/docker_client.go
class UnsupportedAuthenticationError(NotImplementedError):
def __init__(self):
super().__init__('Only Bearer authentication supported for now')
async def get_registry_auth_info(registry_host: str) -> AuthInfo:
auth_service = auth_realm = None
try:
await session.get(f'https://{registry_host}/v2/')
raise UnsupportedAuthenticationError # No authentication needed
except HTTPError as e:
if e.code != 401:
raise
auth_type, auth_info = parse_www_authenticate_header(e.response.headers['WWW-Authenticate'])
if auth_type.lower() != 'bearer':
raise UnsupportedAuthenticationError
# Although 'service' is needed as per https://docs.docker.com/registry/spec/auth/token/,
# ghcr.io (GitHub container registry) does not provide it
auth_service = auth_info.get('service')
auth_realm = auth_info['realm']
return AuthInfo(auth_service, auth_realm)
async def get_container_tags(info: Tuple[str, str, AuthInfo]) -> List[str]:
image_path, registry_host, auth_info = info
token = await get_auth_token(auth_info, image_path)
tags = []
url = f'https://{registry_host}/v2/{image_path}/tags/list'
while True:
res = await session.get(url, headers={
'Authorization': f'Bearer {token}',
'Accept': 'application/json',
})
tags += res.json()['tags']
link = res.headers.get('Link')
if link is None:
break
else:
url = urljoin(url, parse_next_link(link))
return tags
async def get_auth_token(auth_info, image_path):
auth_params = {
'scope': f'repository:{image_path}:pull',
}
if auth_info.service:
auth_params['service'] = auth_info.service
res = await session.get(auth_info.realm, params=auth_params)
token = res.json()['token']
return token
def parse_next_link(value: str) -> str:
ending = '>; rel="next"'
if value.endswith(ending):
return value[1:-len(ending)]
else:
raise ValueError(value)
async def get_container_tag_update_time(info: Tuple[str, str, str, AuthInfo]):
'''
Find the update time of a container tag.
In fact, it's the creation time of the image ID referred to by the tag. The tag itself does not carry an update time.
'''
image_path, image_tag, registry_host, auth_info = info
token = await get_auth_token(auth_info, image_path)
# HTTP headers
headers = {
'Authorization': f'Bearer {token}',
# Prefer Image Manifest Version 2, Schema 2: https://distribution.github.io/distribution/spec/manifest-v2-2/
'Accept': ', '.join([
'application/vnd.oci.image.manifest.v1+json',
'application/vnd.oci.image.index.v1+json',
'application/vnd.docker.distribution.manifest.v2+json',
'application/vnd.docker.distribution.manifest.list.v2+json',
'application/json',
]),
}
# Get tag manifest
url = f'https://{registry_host}/v2/{image_path}/manifests/{image_tag}'
res = await session.get(url, headers=headers)
data = res.json()
# Schema 1 returns the creation time in the response
if data['schemaVersion'] == 1:
return json.loads(data['history'][0]['v1Compatibility'])['created']
# For schema 2, we have to fetch the config's blob
# For multi-arch images, multiple manifests are bound to the same tag. We have to choose one and then request
# that manifest's details
if data.get('manifests'):
# It's quite hard to find the manifest matching the current CPU architecture and OS.
# For now we just choose the first one; that should work for most cases
image_digest = data['manifests'][0]['digest']
url = f'https://{registry_host}/v2/{image_path}/manifests/{image_digest}'
res = await session.get(url, headers=headers)
data = res.json()
digest = data['config']['digest']
url = f'https://{registry_host}/v2/{image_path}/blobs/{digest}'
res = await session.get(url, headers=headers)
data = res.json()
return data['created']
async def get_version(name, conf, *, cache, **kwargs):
image_path = conf.get('container', name)
image_tag = None
# image tag is optional
if ':' in image_path:
image_path, image_tag = image_path.split(':', 1)
registry_host = conf.get('registry', 'docker.io')
if registry_host == 'docker.io':
registry_host = 'registry-1.docker.io'
auth_info = await cache.get(registry_host, get_registry_auth_info)
# if a tag is given, return the tag's update time, otherwise return the image's tag list
if image_tag:
key = image_path, image_tag, registry_host, auth_info
return await cache.get(key, get_container_tag_update_time)
key = image_path, registry_host, auth_info
return await cache.get(key, get_container_tags)
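The pagination helper `parse_next_link` used above expects the raw `Link` header value in the form `<url>; rel="next"`; replicated standalone with a made-up registry URL:

```python
def parse_next_link(value: str) -> str:
    # Strip the surrounding '<...>; rel="next"' from a Link header value
    ending = '>; rel="next"'
    if value.endswith(ending):
        return value[1:-len(ending)]
    raise ValueError(value)
```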

15 nvchecker_source/cpan.py Normal file
@@ -0,0 +1,15 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
from nvchecker.api import RichResult
# Using metacpan
CPAN_URL = 'https://fastapi.metacpan.org/release/%s'
async def get_version(name, conf, *, cache, **kwargs):
key = conf.get('cpan', name)
data = await cache.get_json(CPAN_URL % key)
return RichResult(
version = str(data['version']),
url = f'https://metacpan.org/release/{data["author"]}/{data["name"]}',
)

29 nvchecker_source/cran.py Normal file
@@ -0,0 +1,29 @@
# MIT licensed
# Copyright (c) 2022 Pekka Ristola <pekkarr [at] protonmail [dot] com>, et al.
from nvchecker.api import session, RichResult, GetVersionError
CRAN_URL = 'https://cran.r-project.org/package=%s/DESCRIPTION'
VERSION_FIELD = 'Version: '
async def request(pkg):
url = CRAN_URL % pkg
res = await session.get(url)
return res.body.decode('utf-8', errors='ignore')
async def get_version(name, conf, *, cache, **kwargs):
package = conf.get('cran', name)
desc = await cache.get(package, request)
for line in desc.splitlines():
if line.startswith(VERSION_FIELD):
version = line[len(VERSION_FIELD):]
break
else:
raise GetVersionError('Invalid DESCRIPTION file')
return RichResult(
version = version,
url = f'https://cran.r-project.org/web/packages/{package}/',
)

@@ -0,0 +1,40 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
import re
import structlog
from nvchecker.api import RichResult
logger = structlog.get_logger(logger_name=__name__)
API_URL = 'https://crates.io/api/v1/crates/%s'
# https://semver.org/#is-there-a-suggested-regular-expression-regex-to-check-a-semver-string
VERSION_PATTERN = r'^(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)(?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+(?P<buildmetadata>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$'
async def get_version(name, conf, *, cache, **kwargs):
name = conf.get('cratesio') or name
use_pre_release = conf.get('use_pre_release', False)
data = await cache.get_json(API_URL % name)
results = []
for v in data['versions']:
if v['yanked']:
continue
version = v['num']
match = re.fullmatch(VERSION_PATTERN, version)
if match is None:
logger.warning('ignoring invalid version', version=version)
continue
if not use_pre_release and match.group('prerelease'):
continue
results.append(
RichResult(
version=version,
url=f'https://crates.io/crates/{name}/{version}',
)
)
return results
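The semver regex above (taken from semver.org) drives both the validity check and the pre-release filter; a quick illustration with hypothetical version strings:

```python
import re

# Same pattern as VERSION_PATTERN in the module above
VERSION_PATTERN = r'^(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)(?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+(?P<buildmetadata>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$'

stable = re.fullmatch(VERSION_PATTERN, '1.2.3')        # matches, no prerelease
pre = re.fullmatch(VERSION_PATTERN, '1.2.3-alpha.1')   # matches, prerelease set
invalid = re.fullmatch(VERSION_PATTERN, 'v1.2.3')      # leading 'v' is rejected
```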

@@ -0,0 +1,28 @@
# MIT licensed
# Copyright (c) 2020 lilydjwg <lilydjwg@gmail.com>, et al.
# Copyright (c) 2017 Felix Yan <felixonmars@archlinux.org>, et al.
from nvchecker.api import RichResult, GetVersionError
URL = 'https://sources.debian.org/api/src/%(pkgname)s/?suite=%(suite)s'
async def get_version(name, conf, *, cache, **kwargs):
pkg = conf.get('debianpkg') or name
strip_release = conf.get('strip_release', False)
suite = conf.get('suite') or "sid"
url = URL % {"pkgname": pkg, "suite": suite}
data = await cache.get_json(url)
if not data.get('versions'):
raise GetVersionError('Debian package not found')
r = data['versions'][0]
if strip_release:
version = r['version'].split("-")[0]
else:
version = r['version']
return RichResult(
version = version,
url = f'https://sources.debian.org/src/{data["package"]}/{r["version"]}/',
)

16 nvchecker_source/gems.py Normal file
@@ -0,0 +1,16 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
from nvchecker.api import RichResult
GEMS_URL = 'https://rubygems.org/api/v1/versions/%s.json'
async def get_version(name, conf, *, cache, **kwargs):
key = conf.get('gems', name)
data = await cache.get_json(GEMS_URL % key)
return [
RichResult(
version = item['number'],
url = f'https://rubygems.org/gems/{key}/versions/{item["number"]}',
) for item in data
]

41 nvchecker_source/git.py Normal file
@@ -0,0 +1,41 @@
# MIT licensed
# Copyright (c) 2020 Felix Yan <felixonmars@archlinux.org>, et al.
from .cmd import run_cmd
from nvchecker.api import RichResult
async def get_version(
name, conf, *, cache, keymanager=None
):
git = conf['git']
use_commit = conf.get('use_commit', False)
if use_commit:
ref = conf.get('branch')
if ref is None:
ref = 'HEAD'
gitref = None
else:
ref = 'refs/heads/' + ref
gitref = ref
cmd = f"git ls-remote {git} {ref}"
data = await cache.get(cmd, run_cmd)
version = data.split(None, 1)[0]
return RichResult(
version = version,
revision = version,
gitref = gitref,
)
else:
cmd = f"git ls-remote --tags --refs {git}"
data = await cache.get(cmd, run_cmd)
versions = []
for line in data.splitlines():
revision, version = line.split("\trefs/tags/", 1)
versions.append(RichResult(
version = version,
revision = revision,
gitref = f"refs/tags/{version}",
))
return versions
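Each line of `git ls-remote --tags --refs` output is `<sha>\trefs/tags/<name>`; the split used in the tag branch above can be exercised on a fabricated line (the SHA is made up):

```python
# One fabricated output line from `git ls-remote --tags --refs`
line = 'a1b2c3d4e5f6a7b8c9d0a1b2c3d4e5f6a7b8c9d0\trefs/tags/v1.4.0'
# Splitting on the tab + ref prefix yields revision and tag name
revision, version = line.split('\trefs/tags/', 1)
```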

57 nvchecker_source/gitea.py Normal file
@@ -0,0 +1,57 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
from __future__ import annotations
import urllib.parse
GITEA_URL = 'https://%s/api/v1/repos/%s/commits'
GITEA_MAX_TAG = 'https://%s/api/v1/repos/%s/tags'
from nvchecker.api import (
VersionResult, RichResult, Entry,
AsyncCache, KeyManager,
)
async def get_version(
name: str, conf: Entry, *,
cache: AsyncCache, keymanager: KeyManager,
) -> VersionResult:
repo = urllib.parse.quote(conf['gitea'])
br = conf.get('branch')
host = conf.get('host', 'gitea.com')
use_max_tag = conf.get('use_max_tag', False)
if use_max_tag:
url = GITEA_MAX_TAG % (host, repo)
else:
url = GITEA_URL % (host, repo)
if br:
url += '?sha=' + br
# Load token from config
token = conf.get('token')
# Load token from keyman
if token is None:
token = keymanager.get_key(host.lower(), 'gitea_' + host.lower())
# Set private token if token exists.
headers = {}
if token:
headers["Authorization"] = f'token {token}'
data = await cache.get_json(url, headers = headers)
if use_max_tag:
return [
RichResult(
version = tag['name'],
revision = tag['id'],
url = f'https://{host}/{conf["gitea"]}/releases/tag/{tag["name"]}',
) for tag in data
]
else:
return RichResult(
version = data[0]['commit']['committer']['date'],
revision = data[0]['sha'],
url = data[0]['html_url'],
)

287 nvchecker_source/github.py Normal file
@@ -0,0 +1,287 @@
# MIT licensed
# Copyright (c) 2013-2020, 2024 lilydjwg <lilydjwg@gmail.com>, et al.
import time
from urllib.parse import urlencode
from typing import List, Tuple, Union, Optional
import asyncio
import structlog
from nvchecker.api import (
VersionResult, Entry, AsyncCache, KeyManager,
HTTPError, session, RichResult, GetVersionError,
)
logger = structlog.get_logger(logger_name=__name__)
ALLOW_REQUEST = None
RATE_LIMITED_ERROR = False
GITHUB_URL = 'https://api.%s/repos/%s/commits'
GITHUB_LATEST_RELEASE = 'https://api.%s/repos/%s/releases/latest'
# https://developer.github.com/v3/git/refs/#get-all-references
GITHUB_MAX_TAG = 'https://api.%s/repos/%s/git/refs/tags'
GITHUB_MAX_RELEASE = 'https://api.%s/repos/%s/releases'
GITHUB_GRAPHQL_URL = 'https://api.%s/graphql'
async def get_version(name, conf, **kwargs):
global RATE_LIMITED_ERROR, ALLOW_REQUEST
if RATE_LIMITED_ERROR:
raise RuntimeError('rate limited')
if ALLOW_REQUEST is None:
ALLOW_REQUEST = asyncio.Event()
ALLOW_REQUEST.set()
for _ in range(2): # retry once
try:
await ALLOW_REQUEST.wait()
return await get_version_real(name, conf, **kwargs)
except HTTPError as e:
if e.code in [403, 429]:
if n := check_ratelimit(e, name):
ALLOW_REQUEST.clear()
await asyncio.sleep(n+1)
ALLOW_REQUEST.set()
continue
RATE_LIMITED_ERROR = True
raise
QUERY_LATEST_TAG = '''
{{
repository(name: "{name}", owner: "{owner}") {{
refs(refPrefix: "refs/tags/", first: 1,
query: "{query}",
orderBy: {{field: TAG_COMMIT_DATE, direction: DESC}}) {{
edges {{
node {{
name
target {{
oid
}}
}}
}}
}}
}}
}}
'''
QUERY_LATEST_RELEASE_WITH_PRERELEASES = '''
{{
repository(name: "{name}", owner: "{owner}") {{
releases(first: 1, orderBy: {{field: CREATED_AT, direction: DESC}}) {{
edges {{
node {{
name
url
tag {{
name
}}
tagCommit {{
oid
}}
}}
}}
}}
}}
}}
'''
async def get_latest_tag(key: Tuple[str, str, str, str]) -> RichResult:
host, repo, query, token = key
owner, reponame = repo.split('/')
headers = {
'Authorization': f'bearer {token}',
'Content-Type': 'application/json',
}
q = QUERY_LATEST_TAG.format(
owner = owner,
name = reponame,
query = query,
)
res = await session.post(
GITHUB_GRAPHQL_URL % host,
headers = headers,
json = {'query': q},
)
j = res.json()
refs = j['data']['repository']['refs']['edges']
if not refs:
raise GetVersionError('no tag found')
version = refs[0]['node']['name']
revision = refs[0]['node']['target']['oid']
return RichResult(
version = version,
gitref = f"refs/tags/{version}",
revision = revision,
url = f'https://github.com/{repo}/releases/tag/{version}',
)
async def get_latest_release_with_prereleases(key: Tuple[str, str, str, str]) -> RichResult:
host, repo, token, use_release_name = key
owner, reponame = repo.split('/')
headers = {
'Authorization': f'bearer {token}',
'Content-Type': 'application/json',
}
q = QUERY_LATEST_RELEASE_WITH_PRERELEASES.format(
owner = owner,
name = reponame,
)
res = await session.post(
GITHUB_GRAPHQL_URL % host,
headers = headers,
json = {'query': q},
)
j = res.json()
refs = j['data']['repository']['releases']['edges']
if not refs:
raise GetVersionError('no release found')
tag_name = refs[0]['node']['tag']['name']
if use_release_name:
version = refs[0]['node']['name']
else:
version = tag_name
return RichResult(
version = version,
gitref = f"refs/tags/{tag_name}",
revision = refs[0]['node']['tagCommit']['oid'],
url = refs[0]['node']['url'],
)
async def get_version_real(
name: str, conf: Entry, *,
cache: AsyncCache, keymanager: KeyManager,
**kwargs,
) -> VersionResult:
repo = conf['github']
host = conf.get('host', "github.com")
# Load token from config
token = conf.get('token')
# Load token from keyman
if token is None:
token = keymanager.get_key(host.lower(), 'github')
use_latest_tag = conf.get('use_latest_tag', False)
if use_latest_tag:
if not token:
raise GetVersionError('token not given but it is required')
query = conf.get('query', '')
return await cache.get((host, repo, query, token), get_latest_tag) # type: ignore
use_latest_release = conf.get('use_latest_release', False)
include_prereleases = conf.get('include_prereleases', False)
use_release_name = conf.get('use_release_name', False)
if use_latest_release and include_prereleases:
if not token:
raise GetVersionError('token not given but it is required')
return await cache.get(
(host, repo, token, use_release_name),
get_latest_release_with_prereleases) # type: ignore
br = conf.get('branch')
path = conf.get('path')
use_max_tag = conf.get('use_max_tag', False)
use_max_release = conf.get('use_max_release', False)
if use_latest_release:
url = GITHUB_LATEST_RELEASE % (host, repo)
elif use_max_tag:
url = GITHUB_MAX_TAG % (host, repo)
elif use_max_release:
url = GITHUB_MAX_RELEASE % (host, repo)
else:
url = GITHUB_URL % (host, repo)
parameters = {}
if br:
parameters['sha'] = br
if path:
parameters['path'] = path
url += '?' + urlencode(parameters)
headers = {
'Accept': 'application/vnd.github.quicksilver-preview+json',
}
if token:
headers['Authorization'] = f'token {token}'
data = await cache.get_json(url, headers = headers)
if use_max_tag:
tags: List[Union[str, RichResult]] = [
RichResult(
version = ref['ref'].split('/', 2)[-1],
gitref = ref['ref'],
revision = ref['object']['sha'],
url = f'https://github.com/{repo}/releases/tag/{ref["ref"].split("/", 2)[-1]}',
) for ref in data
]
if not tags:
raise GetVersionError('No tag found in upstream repository.')
return tags
if use_max_release:
releases: List[Union[str, RichResult]] = [
RichResult(
version = ref['name'] if use_release_name else ref['tag_name'],
gitref = f"refs/tags/{ref['tag_name']}",
url = ref['html_url'],
) for ref in data if include_prereleases or not ref['prerelease']
]
if not releases:
raise GetVersionError('No release found in upstream repository.')
return releases
if use_latest_release:
if 'tag_name' not in data:
raise GetVersionError('No release found in upstream repository.')
if use_release_name:
version = data['name']
else:
version = data['tag_name']
return RichResult(
version = version,
gitref = f"refs/tags/{data['tag_name']}",
url = data['html_url'],
)
else:
return RichResult(
# YYYYMMDD.HHMMSS
version = data[0]['commit']['committer']['date'].rstrip('Z').replace('-', '').replace(':', '').replace('T', '.'),
revision = data[0]['sha'],
url = data[0]['html_url'],
)
def check_ratelimit(exc: HTTPError, name: str) -> Optional[int]:
res = exc.response
if not res:
raise exc
if v := res.headers.get('retry-after'):
n = int(v)
logger.warning('retry-after', n=n)
return n
# default -1 is used to re-raise the exception
n = int(res.headers.get('X-RateLimit-Remaining', -1))
if n == 0:
reset = int(res.headers.get('X-RateLimit-Reset'))
logger.error(f'rate limited, resetting at {time.ctime(reset)}. '
'Or get an API token to increase the allowance if not yet',
name = name,
reset = reset)
return None
raise exc
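In the plain-commit branch above, the committer date is compressed into a `YYYYMMDD.HHMMSS` version; the same transformation applied to a sample timestamp:

```python
def commit_date_version(date: str) -> str:
    # '2025-02-10T09:01:02Z' -> '20250210.090102'
    return date.rstrip('Z').replace('-', '').replace(':', '').replace('T', '.')
```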

@@ -0,0 +1,80 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
import urllib.parse
import structlog
from nvchecker.api import (
VersionResult, RichResult, Entry,
AsyncCache, KeyManager, TemporaryError,
)
GITLAB_URL = 'https://%s/api/v4/projects/%s/repository/commits'
GITLAB_MAX_TAG = 'https://%s/api/v4/projects/%s/repository/tags'
logger = structlog.get_logger(logger_name=__name__)
async def get_version(name, conf, **kwargs):
try:
return await get_version_real(name, conf, **kwargs)
except TemporaryError as e:
check_ratelimit(e, name)
async def get_version_real(
name: str, conf: Entry, *,
cache: AsyncCache, keymanager: KeyManager,
**kwargs,
) -> VersionResult:
repo = urllib.parse.quote_plus(conf['gitlab'])
br = conf.get('branch')
host = conf.get('host', "gitlab.com")
use_max_tag = conf.get('use_max_tag', False)
if use_max_tag:
url = GITLAB_MAX_TAG % (host, repo)
else:
url = GITLAB_URL % (host, repo)
if br:
url += '?ref_name=%s' % br
# Load token from config
token = conf.get('token')
# Load token from keyman
if token is None:
token = keymanager.get_key(host.lower(), 'gitlab_' + host.lower())
# Set private token if token exists.
headers = {}
if token:
headers["PRIVATE-TOKEN"] = token
data = await cache.get_json(url, headers = headers)
if use_max_tag:
return [
RichResult(
version = tag['name'],
revision = tag['commit']['id'],
url = f'https://{host}/{conf["gitlab"]}/-/tags/{tag["name"]}',
) for tag in data
]
else:
return RichResult(
version = data[0]['created_at'].split('T', 1)[0].replace('-', ''),
revision = data[0]['id'],
url = data[0]['web_url'],
)
def check_ratelimit(exc, name):
res = exc.response
if not res:
raise
# default -1 is used to re-raise the exception
n = int(res.headers.get('RateLimit-Remaining', -1))
if n == 0:
logger.error('gitlab rate limited. Wait some time '
'or get an API token to increase the allowance if not yet',
name = name)
else:
raise

40 nvchecker_source/go.py Normal file
@@ -0,0 +1,40 @@
# MIT licensed
# Copyright (c) 2024 bgme <i@bgme.me>.
from lxml import html
from nvchecker.api import (
RichResult, Entry, AsyncCache, KeyManager,
session, GetVersionError,
)
GO_PKG_URL = 'https://pkg.go.dev/{pkg}?tab=versions'
GO_PKG_VERSION_URL = 'https://pkg.go.dev/{pkg}@{version}'
async def get_version(
name: str, conf: Entry, *,
cache: AsyncCache, keymanager: KeyManager,
**kwargs,
) -> RichResult:
key = tuple(sorted(conf.items()))
return await cache.get(key, get_version_impl)
async def get_version_impl(info) -> RichResult:
conf = dict(info)
pkg_name = conf.get('go')
url = GO_PKG_URL.format(pkg=pkg_name)
res = await session.get(url)
doc = html.fromstring(res.body.decode())
elements = doc.xpath("//div[@class='Version-tag']/a/text()")
try:
version = elements[0] # type: ignore
return RichResult(
version = version, # type: ignore
url = GO_PKG_VERSION_URL.format(pkg=pkg_name, version=version),
)
except IndexError:
raise GetVersionError("parse error", pkg_name=pkg_name)

@@ -0,0 +1,15 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
from nvchecker.api import RichResult
HACKAGE_URL = 'https://hackage.haskell.org/package/%s/preferred.json'
async def get_version(name, conf, *, cache, **kwargs):
key = conf.get('hackage', name)
data = await cache.get_json(HACKAGE_URL % key)
version = data['normal-version'][0]
return RichResult(
version = version,
url = f'https://hackage.haskell.org/package/{key}-{version}',
)

@@ -0,0 +1,41 @@
# MIT licensed
# Copyright (c) 2020 Ypsilik <tt2laurent.maud@gmail.com>, et al.
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
from lxml import html, etree
from nvchecker.api import session, GetVersionError
async def get_version(name, conf, *, cache, **kwargs):
key = tuple(sorted(conf.items()))
return await cache.get(key, get_version_impl)
async def get_version_impl(info):
conf = dict(info)
encoding = conf.get('encoding')
parser = html.HTMLParser(encoding=encoding)
data = conf.get('post_data')
if data is None:
res = await session.get(conf['url'])
else:
res = await session.post(conf['url'], body = data, headers = {
'Content-Type': conf.get('post_data_type', 'application/x-www-form-urlencoded')
})
doc = html.fromstring(res.body, base_url=conf['url'], parser=parser)
try:
els = doc.xpath(conf.get('xpath'))
except ValueError:
if not conf.get('missing_ok', False):
raise GetVersionError('version string not found.')
except etree.XPathEvalError as e:
raise GetVersionError('bad xpath', exc_info=e)
version = [
str(el)
if isinstance(el, str)
else str(el.text_content())
for el in els
]
return version

@@ -0,0 +1,42 @@
# MIT licensed
# Copyright (c) 2021 lilydjwg <lilydjwg@gmail.com>, et al.
import re
from nvchecker.api import session, GetVersionError
async def get_version(name, conf, *, cache, **kwargs):
key = tuple(sorted(conf.items()))
return await cache.get(key, get_version_impl)
async def get_version_impl(info):
conf = dict(info)
url = conf['url']
header = conf.get('header', 'Location')
follow_redirects = conf.get('follow_redirects', False)
method = conf.get('method', 'HEAD')
try:
regex = re.compile(conf['regex'])
except re.error as e:
raise GetVersionError('bad regex', exc_info=e)
res = await session.request(
url,
method = method,
follow_redirects = follow_redirects,
)
header_value = res.headers.get(header)
if not header_value:
raise GetVersionError(
'header not found or is empty',
header = header,
value = header_value,
)
try:
version = regex.findall(header_value)
except ValueError:
raise GetVersionError('version string not found.')
return version

42 nvchecker_source/jq.py Normal file
@@ -0,0 +1,42 @@
# MIT licensed
# Copyright (c) 2024 Rocket Aaron <i@rocka.me>, et al.
import json
import jq
from nvchecker.api import session, GetVersionError
async def get_version(name, conf, *, cache, **kwargs):
key = tuple(sorted(conf.items()))
return await cache.get(key, get_version_impl)
async def get_version_impl(info):
conf = dict(info)
try:
program = jq.compile(conf.get('filter', '.'))
except ValueError as e:
raise GetVersionError('bad jq filter', exc_info=e)
data = conf.get('post_data')
if data is None:
res = await session.get(conf['url'])
else:
res = await session.post(conf['url'], body = data, headers = {
'Content-Type': conf.get('post_data_type', 'application/json')
})
try:
obj = json.loads(res.body)
except json.decoder.JSONDecodeError as e:
raise GetVersionError('bad json string', exc_info=e)
try:
version = program.input(obj).all()
if version == [None] and not conf.get('missing_ok', False):
raise GetVersionError('version string not found.')
version = [str(v) for v in version]
except ValueError as e:
raise GetVersionError('failed to filter json', exc_info=e)
return version

@@ -0,0 +1,20 @@
# MIT Licensed
# Copyright (c) 2024 Bert Peters <bertptrs@archlinux.org>, et al.
from __future__ import annotations
from nvchecker.api import AsyncCache, Entry, RichResult
PROJECT_INFO_URL = "https://api.launchpad.net/1.0/{launchpad}"
async def get_version(name: str, conf: Entry, *, cache: AsyncCache, **kwargs):
launchpad = conf["launchpad"]
project_data = await cache.get_json(PROJECT_INFO_URL.format(launchpad=launchpad))
data = await cache.get_json(project_data['releases_collection_link'])
return [
RichResult(version=entry["version"], url=entry["web_link"])
for entry in data["entries"]
]

@@ -0,0 +1,5 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
async def get_version(name, conf, **kwargs):
return str(conf.get('manual')).strip() or None

@@ -0,0 +1,10 @@
# MIT licensed
# Copyright (c) 2020 Felix Yan <felixonmars@archlinux.org>, et al.
async def get_version(name, conf, *, cache, **kwargs):
url = conf['mercurial'] + '/json-tags'
data = await cache.get_json(url)
version = [tag['tag'] for tag in data['tags']]
return version

16 nvchecker_source/none.py Normal file
@@ -0,0 +1,16 @@
# MIT licensed
# Copyright (c) 2020 lilydjwg <lilydjwg@gmail.com>, et al.
from __future__ import annotations
from nvchecker.api import (
BaseWorker, GetVersionError, RawResult,
)
class Worker(BaseWorker):
async def run(self) -> None:
exc = GetVersionError('no source specified')
async with self.task_sem:
for name, conf in self.tasks:
await self.result_q.put(
RawResult(name, exc, conf))

38 nvchecker_source/npm.py Normal file
@@ -0,0 +1,38 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
import json
import re
from nvchecker.api import session, RichResult
NPM_URL = 'https://registry.npmjs.org/%s'
def configure(config):
global NPM_URL
url = config.get('registry')
if url:
NPM_URL = f'{url.rstrip("/")}/%s'
async def get_first_1k(url):
headers = {
"Accept": "application/vnd.npm.install-v1+json",
"Range": "bytes=0-1023",
}
res = await session.get(url, headers=headers)
return res.body
async def get_version(name, conf, *, cache, **kwargs):
key = conf.get('npm', name)
data = await cache.get(NPM_URL % key, get_first_1k)
dist_tags = json.loads(re.search(b'"dist-tags":({.*?})', data).group(1))
version = dist_tags['latest']
# There is no standardised URL scheme, so we only return a URL for the default registry
if NPM_URL.startswith('https://registry.npmjs.org/'):
return RichResult(
version = version,
url = f'https://www.npmjs.com/package/{key}/v/{version}',
)
else:
return version
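The module only fetches the first kilobyte of registry metadata and pulls `dist-tags` out with a regex instead of parsing the whole document; the same extraction on a fabricated response body:

```python
import json
import re

# A fabricated prefix of an npm registry response (package name is made up)
body = b'{"name":"example-pkg","dist-tags":{"latest":"2.0.1","beta":"3.0.0-beta.2"},"versions":{}}'
# Same regex as in get_version above: grab the dist-tags object lazily
dist_tags = json.loads(re.search(b'"dist-tags":({.*?})', body).group(1))
latest = dist_tags['latest']
```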

71 nvchecker_source/opam.py Normal file
@@ -0,0 +1,71 @@
# MIT licensed
# Copyright (c) 2024 Daniel Peukert <daniel@peukert.cc>, et al.
import asyncio
from io import BytesIO
import tarfile
from typing import List
from nvchecker.api import (
session, VersionResult,
Entry, AsyncCache,
KeyManager, RichResult
)
OPAM_REPO_INDEX_URL = "%s/index.tar.gz"
OPAM_VERSION_PATH_PREFIX = "packages/%s/%s."
OPAM_VERSION_PATH_SUFFIX = "/opam"
OPAM_DEFAULT_REPO = 'https://opam.ocaml.org'
OPAM_DEFAULT_REPO_VERSION_URL = "%s/packages/%s/%s.%s"
def _decompress_and_list_files(data: bytes) -> List[str]:
# Convert the bytes to a file object and get a list of files
archive = tarfile.open(mode='r', fileobj=BytesIO(data))
return archive.getnames()
async def get_files(url: str) -> List[str]:
# Download the file and get its contents
res = await session.get(url)
data = res.body
# Get the file list of the archive
loop = asyncio.get_running_loop()
return await loop.run_in_executor(None, _decompress_and_list_files, data)
async def get_package_versions(files: List[str], pkg: str) -> List[str]:
# Prepare the filename prefix based on the package name
  prefix = OPAM_VERSION_PATH_PREFIX % (pkg, pkg)
# Only keep opam files that are relevant to the package we're working with
filtered_files = []
for filename in files:
if filename.startswith(prefix) and filename.endswith(OPAM_VERSION_PATH_SUFFIX):
filtered_files.append(filename[len(prefix):-1*len(OPAM_VERSION_PATH_SUFFIX)])
return filtered_files
async def get_version(
name: str, conf: Entry, *,
cache: AsyncCache, keymanager: KeyManager,
**kwargs,
):
pkg = conf.get('pkg', name)
repo = conf.get('repo', OPAM_DEFAULT_REPO).rstrip('/')
# Get the list of files in the repo index (see https://opam.ocaml.org/doc/Manual.html#Repositories for repo structure)
files = await cache.get(OPAM_REPO_INDEX_URL % repo, get_files) # type: ignore
# Parse the version strings from the file names
raw_versions = await get_package_versions(files, pkg)
# Convert the version strings into RichResults
versions = []
for version in raw_versions:
versions.append(RichResult(
version = version,
      # There is no standardised URL scheme, so we only return a URL for the default registry
url = OPAM_DEFAULT_REPO_VERSION_URL % (repo, pkg, pkg, version) if repo == OPAM_DEFAULT_REPO else None,
))
return versions
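The version strings come purely from slicing archive member names between the `packages/<pkg>/<pkg>.` prefix and the `/opam` suffix. A small sketch of that filtering and slicing, on hypothetical index member names:

```python
# Hypothetical member names from an opam repository index.tar.gz
files = [
    'packages/dune/dune.3.16.0/opam',
    'packages/dune/dune.3.15.3/opam',
    'packages/ocamlfind/ocamlfind.1.9.6/opam',
]
pkg = 'dune'
prefix = 'packages/%s/%s.' % (pkg, pkg)
suffix = '/opam'
# Keep only this package's opam files and slice out the version part
versions = [
    f[len(prefix):-len(suffix)]
    for f in files
    if f.startswith(prefix) and f.endswith(suffix)
]
```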

@@ -0,0 +1,18 @@
# MIT licensed
# Copyright (c) 2013-2021 Th3Whit3Wolf <the.white.wolf.is.1337@gmail.com>, et al.
from nvchecker.api import RichResult
API_URL = 'https://open-vsx.org/api/%s/%s'
async def get_version(name, conf, *, cache, **kwargs):
name = conf.get('openvsx') or name
splitName = name.split('.')
publisher = splitName[0]
extension = splitName[1]
data = await cache.get_json(API_URL % (publisher, extension))
version = data['version']
return RichResult(
version = version,
url = f'https://open-vsx.org/extension/{publisher}/{extension}/{version}',
)

@@ -0,0 +1,23 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
from nvchecker.api import RichResult
PACKAGIST_URL = 'https://packagist.org/packages/%s.json'
async def get_version(name, conf, *, cache, **kwargs):
key = conf.get('packagist', name)
data = await cache.get_json(PACKAGIST_URL % key)
versions = {
version: details
for version, details in data["package"]['versions'].items()
if version != "dev-master"
}
  if versions:
version = max(versions, key=lambda version: versions[version]["time"])
return RichResult(
version = version,
url = f'https://packagist.org/packages/{data["package"]["name"]}#{version}',
)
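The latest version is selected not by comparing version strings but by the `time` field of each release. Since Packagist timestamps are ISO-8601 strings with a fixed offset format, plain string comparison orders them correctly; a sketch on hypothetical data:

```python
# Hypothetical Packagist version map; 'time' values are ISO-8601 strings
versions = {
    '1.0.0': {'time': '2023-01-01T00:00:00+00:00'},
    '1.1.0': {'time': '2024-06-01T00:00:00+00:00'},
    '1.0.1': {'time': '2023-03-15T00:00:00+00:00'},
}
# max() over the keys, ordered by release timestamp
latest = max(versions, key=lambda v: versions[v]['time'])
```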

@@ -0,0 +1,16 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
from nvchecker_source import cmd
async def get_version(name, conf, **kwargs):
referree = conf.get('pacman') or name
c = "LANG=C pacman -Si %s | grep -F Version | awk '{print $3}' | head -n 1" % referree
conf['cmd'] = c
strip_release = conf.get('strip_release', False)
version = await cmd.get_version(name, conf, **kwargs)
if strip_release and '-' in version:
version = version.rsplit('-', 1)[0]
return version
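`strip_release` drops only the trailing pkgrel: pacman versions have the shape `[epoch:]pkgver-pkgrel`, so splitting on the *last* hyphen preserves any epoch or hyphens inside pkgver. A one-step sketch:

```python
# Hypothetical pacman -Si output value: epoch:pkgver-pkgrel
version = '1:2.0-1'
if '-' in version:
    version = version.rsplit('-', 1)[0]  # drop only the trailing pkgrel
```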

@@ -0,0 +1,32 @@
# MIT licensed
# Copyright (c) 2020 Felix Yan <felixonmars@archlinux.org>, et al.
import urllib.parse
import structlog
from nvchecker.api import (
VersionResult, RichResult, Entry, AsyncCache, KeyManager,
)
PAGURE_URL = 'https://%s/api/0/%s/git/tags?with_commits=true'
logger = structlog.get_logger(logger_name=__name__)
async def get_version(
name: str, conf: Entry, *,
cache: AsyncCache, keymanager: KeyManager,
**kwargs,
) -> VersionResult:
repo = conf['pagure']
host = conf.get('host', "pagure.io")
url = PAGURE_URL % (host, repo)
data = await cache.get_json(url)
return [
RichResult(
version = version,
url = f'https://{host}/{repo}/tree/{version_hash}',
) for version, version_hash in data["tags"].items()
]

nvchecker_source/pypi.py (new file)
@@ -0,0 +1,43 @@
# MIT licensed
# Copyright (c) 2013-2021,2023-2024 lilydjwg <lilydjwg@gmail.com>, et al.
import structlog
from packaging.version import Version, InvalidVersion
from nvchecker.api import RichResult
logger = structlog.get_logger(logger_name=__name__)
async def get_version(name, conf, *, cache, **kwargs):
ret = []
package = conf.get('pypi') or name
use_pre_release = conf.get('use_pre_release', False)
url = 'https://pypi.org/pypi/{}/json'.format(package)
data = await cache.get_json(url)
for version in data['releases'].keys():
# Skip versions that are marked as yanked.
if (vers := data['releases'][version]) and vers[0]['yanked']:
continue
try:
parsed_version = Version(version)
except InvalidVersion:
if data['releases'][version]:
# emit a warning if there is something under the invalid version
# sympy has an empty "0.5.13-hg" version
logger.warning('ignoring invalid version', version=version)
continue
if not use_pre_release and parsed_version.is_prerelease:
continue
ret.append(RichResult(
version = version,
url = f'https://pypi.org/project/{package}/{version}/',
))
return ret
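The yanked check inspects the per-version file list: a version is skipped when it has files and the first one is flagged `yanked`, while a version with no uploaded files passes this filter (and is only warned about later if its version string is invalid). A reduced sketch of that filter alone, on hypothetical release data (the `packaging.Version` validity check is omitted here):

```python
# Hypothetical 'releases' mapping from a PyPI JSON response
releases = {
    '1.0': [{'yanked': False}],
    '1.1': [{'yanked': True}],   # yanked release: skipped
    '0.5-hg': [],                # no files uploaded: nothing to check here
}
kept = []
for version in releases:
    files = releases[version]
    if files and files[0]['yanked']:
        continue
    kept.append(version)
```

In the real source, `0.5-hg` would subsequently be dropped by the `InvalidVersion` branch.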

nvchecker_source/regex.py (new file)
@@ -0,0 +1,39 @@
# MIT licensed
# Copyright (c) 2013-2020 lilydjwg <lilydjwg@gmail.com>, et al.
import re
from nvchecker.api import session, GetVersionError
async def get_version(name, conf, *, cache, **kwargs):
try:
regex = re.compile(conf['regex'])
except re.error as e:
raise GetVersionError('bad regex', exc_info=e)
if regex.groups > 1:
raise GetVersionError('multi-group regex')
key = (
conf['url'],
conf.get('encoding', 'latin1'),
conf.get('post_data'),
conf.get('post_data_type', 'application/x-www-form-urlencoded'),
)
body = await cache.get(key, get_url)
versions = regex.findall(body)
if not versions and not conf.get('missing_ok', False):
raise GetVersionError('version string not found.')
return versions
async def get_url(info):
url, encoding, post_data, post_data_type = info
if post_data is None:
res = await session.get(url)
else:
res = await session.post(url, body = post_data, headers = {
'Content-Type': post_data_type,
})
body = res.body.decode(encoding)
return body
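The source requires exactly one capture group and then returns every match of that group via `findall`. A short sketch with a hypothetical page body and pattern:

```python
import re

# Hypothetical fetched page body and a single-group version pattern
body = 'The current version is 4.9.2. See the changelog.'
regex = re.compile(r'The current version is ([\d.]+)\.')
assert regex.groups == 1  # patterns with more groups are rejected above
versions = regex.findall(body)  # list of the captured group for each match
```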

@@ -0,0 +1,33 @@
# MIT licensed
# Copyright (c) 2019 lilydjwg <lilydjwg@gmail.com>, et al.
from nvchecker.api import RichResult, GetVersionError
API_URL = 'https://repology.org/api/v1/project/{}'
async def get_version(name, conf, *, cache, **kwargs):
project = conf.get('repology') or name
repo = conf.get('repo')
subrepo = conf.get('subrepo')
if not repo:
raise GetVersionError('repo field is required for repology source')
url = API_URL.format(project)
data = await cache.get_json(url)
pkgs = [pkg for pkg in data if pkg['repo'] == repo]
if not pkgs:
raise GetVersionError('package is not found', repo=repo)
if subrepo:
pkgs = [pkg for pkg in pkgs if pkg.get('subrepo') == subrepo]
if not pkgs:
raise GetVersionError('package is not found in subrepo',
repo=repo, subrepo=subrepo)
return [
RichResult(
version = pkg['version'],
url = f'https://repology.org/project/{project}/packages',
) for pkg in pkgs
]

@@ -0,0 +1,84 @@
# MIT licensed
# Copyright (c) 2024 Jakub Ružička <jru@debian.org>, et al.
import asyncio
import gzip
import pathlib
import urllib
from typing import Set
import lxml.etree
from nvchecker.api import session, AsyncCache, Entry, KeyManager, VersionResult
# XML namespaces used in repodata (the URLs are namespace identifiers, not live links)
NS = {
    'common': 'http://linux.duke.edu/metadata/common',
    'repo': 'http://linux.duke.edu/metadata/repo',
    'rpm': 'http://linux.duke.edu/metadata/rpm'
}
async def get_version(
name: str, conf: Entry, *,
cache: AsyncCache, keymanager: KeyManager,
**kwargs,
) -> VersionResult:
repo = conf['repo']
arch = conf.get('arch', 'binary')
pkg = conf.get('pkg')
if not pkg:
pkg = conf.get('rpmrepo', name)
repo_url = urllib.parse.urlparse(repo)
repo_path = pathlib.PurePosixPath(repo_url.path)
# get the url of repomd.xml
repomd_path = repo_path / 'repodata' / 'repomd.xml'
repomd_url = repo_url._replace(path=str(repomd_path)).geturl()
# download repomd.xml (use cache)
repomd_body = await cache.get(repomd_url, get_file) # type: ignore
# parse repomd.xml
repomd_xml = lxml.etree.fromstring(repomd_body)
# get the url of *primary.xml.gz
primary_element = repomd_xml.find('repo:data[@type="primary"]/repo:location', namespaces=NS)
primary_path = repo_path / primary_element.get('href') # type: ignore
primary_url = repo_url._replace(path=str(primary_path)).geturl()
# download and decompress *primary.xml.gz (use cache)
primary_body = await cache.get(primary_url, get_file_gz) # type: ignore
# parse *primary.xml metadata
metadata = lxml.etree.fromstring(primary_body)
# use set to eliminate duplication
versions_set: Set[str] = set()
# iterate package metadata
for el in metadata.findall(f'common:package[common:name="{pkg}"]', namespaces=NS):
pkg_arch = el.findtext('common:arch', namespaces=NS)
    # filter by arch
if arch == 'binary':
if pkg_arch == 'src':
continue
elif arch != 'any':
if pkg_arch != arch:
continue
version_info = el.find('common:version', namespaces=NS)
version = version_info.get('ver') # type: ignore
versions_set.add(version) # type: ignore
versions = list(versions_set)
return versions # type: ignore
async def get_file(url: str) -> bytes:
res = await session.get(url)
return res.body
async def get_file_gz(url: str) -> bytes:
res = await session.get(url)
loop = asyncio.get_running_loop()
return await loop.run_in_executor(
None, gzip.decompress, res.body)
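The module uses lxml, but the namespace-qualified lookups it performs on the decompressed metadata can be sketched with the stdlib `ElementTree` on a hypothetical `*primary.xml` fragment:

```python
from xml.etree import ElementTree

NS = {'common': 'http://linux.duke.edu/metadata/common'}

# Hypothetical fragment of a repodata *primary.xml file
xml = b'''<metadata xmlns="http://linux.duke.edu/metadata/common">
  <package><name>bash</name><arch>x86_64</arch><version ver="5.2.37"/></package>
  <package><name>bash</name><arch>src</arch><version ver="5.2.37"/></package>
</metadata>'''

root = ElementTree.fromstring(xml)
versions_set = set()
for el in root.findall('common:package', namespaces=NS):
    if el.findtext('common:arch', namespaces=NS) == 'src':
        continue  # the default arch == 'binary' filter skips source packages
    versions_set.add(el.find('common:version', namespaces=NS).get('ver'))
```

The set collapses the same version appearing for multiple architectures into one entry.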

@@ -0,0 +1,33 @@
# MIT licensed
# Copyright (c) 2025 Maxim Slipenko <maxim@slipenko.com>, et al.
from nvchecker.api import (
GetVersionError
)
from nvchecker.httpclient.base import HTTPError
URL = "https://api.snapcraft.io/v2/snaps/info/%(snap)s"
async def get_version(
name: str, conf, *,
cache, keymanager,
**kwargs,
):
try:
snap = conf.get("snap")
channel = conf.get("channel")
result = await cache.get_json(
URL % { "snap": snap },
headers={
"Snap-Device-Series": "16",
},
)
except HTTPError:
raise GetVersionError(f"Failed to request snap info for {snap}")
for c in result['channel-map']:
if c['channel']['name'] == channel:
return c['version']
raise GetVersionError(f"Failed to find version for {snap}")
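The version is picked out of the response's `channel-map` by matching the configured channel name. A sketch of that lookup on a hypothetical (abbreviated) response shape:

```python
# Hypothetical shape of the snap info response's channel map
result = {
    'channel-map': [
        {'channel': {'name': 'latest/edge'}, 'version': '2.64+git'},
        {'channel': {'name': 'latest/stable'}, 'version': '2.63'},
    ],
}
channel = 'latest/stable'
version = None
for c in result['channel-map']:
    if c['channel']['name'] == channel:
        version = c['version']
        break
```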

@@ -0,0 +1,60 @@
# MIT licensed
# Copyright (c) 2020 lilydjwg <lilydjwg@gmail.com>, et al.
# Copyright (c) 2020 Sunlei <guizaicn@gmail.com>
from xml.etree import ElementTree
from nvchecker.api import session, RichResult
XML_NAMESPACE = 'http://www.w3.org/XML/1998/namespace'
SPARKLE_NAMESPACE = 'http://www.andymatuschak.org/xml-namespaces/sparkle'
async def get_version(name, conf, *, cache, **kwargs):
sparkle = conf['sparkle']
release_notes_language = conf.get('release_notes_language', 'en')
return await cache.get((sparkle, release_notes_language), get_version_impl)
async def get_version_impl(info):
sparkle, release_notes_language = info
res = await session.get(sparkle)
root = ElementTree.fromstring(res.body).find('./channel/item[1]')
item = root.find('./enclosure')
version_string = item.get(f'{{{SPARKLE_NAMESPACE}}}shortVersionString')
build_number = item.get(f'{{{SPARKLE_NAMESPACE}}}version')
if (version_string and version_string.isdigit()) and (
build_number and not build_number.isdigit()
):
version_string, build_number = build_number, version_string
version = []
if version_string:
version.append(version_string)
if build_number and (build_number not in version):
version.append(build_number)
version_str = '-'.join(version) if version else None
release_notes_link = None
for release_notes in root.findall(f'./{{{SPARKLE_NAMESPACE}}}releaseNotesLink'):
language = release_notes.get(f'{{{XML_NAMESPACE}}}lang')
# If the release notes have no language set, store them, but keep looking for our preferred language
if language is None:
release_notes_link = release_notes.text.strip()
# If the release notes match our preferred language, store them and stop looking
if language == release_notes_language:
release_notes_link = release_notes.text.strip()
break
if release_notes_link is not None:
return RichResult(
version = version_str,
url = release_notes_link,
)
else:
return version_str
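The version assembly above handles feeds that put the build number in `shortVersionString` and the human-readable version in `version`: if the former is all digits and the latter is not, the two are swapped, then both are joined with a hyphen unless they are equal. Extracted as a standalone helper for illustration (`combine` is a hypothetical name):

```python
def combine(version_string, build_number):
    # Swap when shortVersionString looks like a bare build number
    # while the Sparkle 'version' attribute carries the real version
    if (version_string and version_string.isdigit()) and (
        build_number and not build_number.isdigit()
    ):
        version_string, build_number = build_number, version_string
    parts = []
    if version_string:
        parts.append(version_string)
    if build_number and build_number not in parts:
        parts.append(build_number)
    return '-'.join(parts) if parts else None
```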

@@ -0,0 +1,48 @@
# MIT licensed
# Copyright (c) 2020 lilydjwg <lilydjwg@gmail.com>, et al.
# Copyright (c) 2017 Felix Yan <felixonmars@archlinux.org>, et al.
from nvchecker.api import RichResult, GetVersionError
URL = 'https://api.launchpad.net/1.0/ubuntu/+archive/primary?ws.op=getPublishedSources&source_name=%s&exact_match=true'
async def get_version(name, conf, *, cache, **kwargs):
pkg = conf.get('ubuntupkg') or name
strip_release = conf.get('strip_release', False)
suite = conf.get('suite')
url = URL % pkg
if suite:
suite = "https://api.launchpad.net/1.0/ubuntu/" + suite
releases = []
while not releases:
data = await cache.get_json(url)
if not data.get('entries'):
raise GetVersionError('Ubuntu package not found')
releases = [r for r in data["entries"] if r["status"] == "Published"]
if suite:
releases = [r for r in releases if r["distro_series_link"] == suite]
if "next_collection_link" not in data:
break
url = data["next_collection_link"]
  if not releases:
    raise GetVersionError('Ubuntu package not found')
if strip_release:
version = releases[0]['source_package_version'].split("-")[0]
else:
version = releases[0]['source_package_version']
return RichResult(
version = version,
url = f'https://packages.ubuntu.com/{releases[0]["distro_series_link"].rsplit("/", 1)[-1]}/{pkg}',
)
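The `while` loop above pages through Launchpad's collection until a `Published` entry turns up or the `next_collection_link` chain ends. The same pagination pattern, sketched against hypothetical pre-fetched pages keyed by URL:

```python
# Hypothetical pre-fetched Launchpad responses keyed by URL
pages = {
    'page1': {
        'entries': [{'status': 'Superseded', 'source_package_version': '1.0-1'}],
        'next_collection_link': 'page2',
    },
    'page2': {
        'entries': [{'status': 'Published', 'source_package_version': '1.2-3ubuntu1'}],
    },
}

url = 'page1'
releases = []
while not releases:
    data = pages[url]
    releases = [r for r in data['entries'] if r['status'] == 'Published']
    if 'next_collection_link' not in data:
        break  # no more pages to try
    url = data['next_collection_link']
version = releases[0]['source_package_version']
```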

@@ -0,0 +1,57 @@
# MIT licensed
# Copyright (c) 2013-2021 Th3Whit3Wolf <the.white.wolf.is.1337@gmail.com>, et al.
from nvchecker.api import (
VersionResult, Entry, AsyncCache, KeyManager,
TemporaryError, session, RichResult, GetVersionError,
)
API_URL = 'https://marketplace.visualstudio.com/_apis/public/gallery/extensionquery'
HEADERS = {
'Accept': 'application/json;api-version=6.1-preview.1',
'Content-Type': 'application/json'
}
async def get_version(name: str, conf: Entry, *, cache: AsyncCache, **kwargs):
name = conf.get('vsmarketplace') or name
q = {
'filters': [
{
'criteria': [
{
'filterType': 8,
'value': 'Microsoft.VisualStudio.Code'
},
{
'filterType': 7,
'value': name
},
{
'filterType': 12,
'value': '4096'
}
],
'pageNumber': 1,
'pageSize': 2,
'sortBy': 0,
'sortOrder': 0
}
],
'assetTypes': [],
'flags': 946
}
res = await session.post(
API_URL,
headers = HEADERS,
json = q,
)
j = res.json()
version = j['results'][0]['extensions'][0]['versions'][0]['version']
return RichResult(
version = version,
url = f'https://marketplace.visualstudio.com/items?itemName={name}',
)

pyproject.toml (new file)
@@ -0,0 +1,11 @@
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
[tool.pytest.ini_options]
# addopts = -n auto
asyncio_mode = "strict"
asyncio_default_fixture_loop_scope = "session"
# build and upload
# rm -rf dist && python -m build --no-isolation && twine check dist/* && twine upload dist/*

sample_config.toml (new file)
@@ -0,0 +1,32 @@
[__config__]
oldver = "old_ver.json"
newver = "new_ver.json"
[google-chrome]
source = "cmd"
cmd = '''wget -qO- http://dl.google.com/linux/chrome/rpm/stable/x86_64/repodata/other.xml.gz | zgrep -A1 "google-chrome-stable" | awk -F\" '/version/ {print $4"-"$6}' '''
[fbcat]
source = "aur"
[winterpy]
source = "github"
github = "lilydjwg/winterpy"
[nvchecker]
source = "github"
github = "lilydjwg/nvchecker"
[ssed]
source = "regex"
regex = "The current version is ([\\d.]+)\\."
url = "https://sed.sourceforge.net/grabbag/ssed/"
proxy = "http://localhost:8087"
[PySide]
source = "pypi"
pypi = "nvchecker"
[test]
source = "manual"
manual = "0.1"

scripts/README.rst (new file)
@@ -0,0 +1 @@
Additional scripts that may help someone.

scripts/nvchecker-ini2toml (new executable file)
@@ -0,0 +1,85 @@
#!/usr/bin/python3
# MIT licensed
# Copyright (c) 2020 lilydjwg <lilydjwg@gmail.com>, et al.
import argparse
import configparser
import toml
_handler_precedence = (
'github', 'aur', 'pypi', 'archpkg', 'debianpkg', 'ubuntupkg',
'gems', 'pacman',
'cmd', 'bitbucket', 'regex', 'manual', 'vcs',
'cratesio', 'npm', 'hackage', 'cpan', 'gitlab', 'packagist',
'repology', 'anitya', 'android_sdk', 'sparkle', 'gitea'
)
BOOL_KEYS = [
'strip_release', 'use_last_modified',
'use_latest_release', 'use_latest_tag',
'use_max_release', 'use_max_tag', 'use_pre_release',
]
INT_KEYS = [
'max_page',
]
def main():
parser = argparse.ArgumentParser(description='convert 1.x ini file to 2.x toml file')
parser.add_argument('ini', type=argparse.FileType(),
help='the old ini file')
parser.add_argument('toml', type=argparse.FileType(mode='w'),
                      help='the new toml file')
args = parser.parse_args()
old = configparser.ConfigParser(
dict_type=dict, allow_no_value=True, interpolation=None,
)
old.read_file(args.ini)
if '__config__' in old:
c = old['__config__']
newconf = dict(c)
x = newconf.pop('max_concurrent', None)
if x is not None:
newconf['max_concurrency'] = x
confs = {'__config__': newconf}
else:
confs = {}
for section in old.sections():
if section == '__config__':
continue
conf = old[section]
newconf = {}
for key in _handler_precedence:
if key not in conf:
continue
newconf['source'] = key
if conf.get(key):
newconf[key] = conf.get(key)
break
dconf = dict(conf)
for k, v in dconf.items():
if '-' in k:
k = k.replace('-', '_')
if k in BOOL_KEYS:
newconf[k] = conf.getboolean(k)
elif k in INT_KEYS:
newconf[k] = conf.getint(k)
elif v != '':
newconf[k] = v
confs[section] = newconf
toml.dump(confs, args.toml)
args.toml.flush()
if __name__ == '__main__':
main()
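For each section, the converter normalizes hyphenated ini keys to underscores and coerces values listed in `BOOL_KEYS` / `INT_KEYS` to proper types, since toml is typed while ini values are all strings. A reduced sketch of that per-section conversion on a hypothetical ini snippet:

```python
import configparser

# Hypothetical 1.x-style ini section
ini = '''
[nvchecker]
github = lilydjwg/nvchecker
use-max-tag = true
max-page = 2
'''
cp = configparser.ConfigParser(interpolation=None)
cp.read_string(ini)
conf = cp['nvchecker']

newconf = {}
for k in conf:
    key = k.replace('-', '_')
    if key in ('use_max_tag',):    # stands in for BOOL_KEYS membership
        newconf[key] = conf.getboolean(k)
    elif key in ('max_page',):     # stands in for INT_KEYS membership
        newconf[key] = conf.getint(k)
    else:
        newconf[key] = conf[k]
```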

scripts/nvchecker-notify (new executable file)
@@ -0,0 +1,73 @@
#!/usr/bin/env python3
# MIT licensed
# Copyright (c) 2020,2022 lilydjwg <lilydjwg@gmail.com>, et al.
'''
A simple wrapper to show desktop notifications while running nvchecker.
'''
import os
import subprocess
import json
import gi
try:
gi.require_version('Notify', '0.8')
except ValueError:
gi.require_version('Notify', '0.7')
from gi.repository import Notify
def get_args():
import argparse
parser = argparse.ArgumentParser(description='show desktop notifications while running nvchecker')
parser.add_argument('-c', '--file',
metavar='FILE', type=str,
help='software version configuration file if not default')
parser.add_argument('-k', '--keyfile',
metavar='FILE', type=str,
help='use specified keyfile (override the one in configuration file)')
parser.add_argument('-t', '--tries', default=1, type=int, metavar='N',
help='try N times when network errors occur')
parser.add_argument('--failures', action='store_true',
help='exit with code 3 if failures / errors happen during checking')
return parser.parse_args()
def main():
args = get_args()
Notify.init('nvchecker')
notif = Notify.Notification()
updates = []
rfd, wfd = os.pipe()
cmd = [
'nvchecker', '--logger', 'both', '--json-log-fd', str(wfd),
]
if args.file:
cmd.extend(['-c', args.file])
if args.keyfile:
cmd.extend(['-k', args.keyfile])
if args.tries:
cmd.extend(['-t', str(args.tries)])
if args.failures:
cmd.append('--failures')
process = subprocess.Popen(cmd, pass_fds=(wfd,))
os.close(wfd)
output = os.fdopen(rfd)
for l in output:
j = json.loads(l)
event = j['event']
if event == 'updated':
updates.append('%(name)s updated to version %(version)s' % j)
notif.update('nvchecker', '\n'.join(updates))
notif.show()
ret = process.wait()
if ret != 0:
raise subprocess.CalledProcessError(ret, cmd)
if __name__ == '__main__':
main()

@@ -0,0 +1,16 @@
# MIT licensed
# Copyright (c) 2020 Felix Yan <felixonmars@archlinux.org>, et al.
_nvtake() {
local cur _nvchecker_conf_mtime
_init_completion || return
_nvchecker_conf_mtime="$(stat -c %Y $HOME/.config/nvchecker/*)"
if [ -z "$_nvtake_completion_cache" -o "$_nvchecker_conf_mtime" != "$_nvchecker_conf_mtime_cached" ]; then
_nvtake_completion_cache="$(nvcmp -q)"
_nvchecker_conf_mtime_cached="$_nvchecker_conf_mtime"
fi
COMPREPLY=( $(compgen -W "$_nvtake_completion_cache" -- "$cur") )
} &&
complete -F _nvtake nvtake

scripts/run_cached_tests (new executable file)
@@ -0,0 +1,40 @@
#!/bin/bash -e
mitmdump=${mitmdump:-mitmdump}
if [[ -f ~/.mitmproxy/nvdump ]]; then
$mitmdump -S ~/.mitmproxy/nvdump -p 7890 --ignore-hosts '127\.0\.0\.1' --server-replay-reuse --server-replay-extra=forward -w newdump >mitmdump_output &
else
$mitmdump -w ~/.mitmproxy/nvdump -p 7890 --ignore-hosts '127\.0\.0\.1' >mitmdump_output &
fi
mitm_pid=$!
on_exit () {
kill -INT $mitm_pid
if [[ -s newdump ]]; then
cat newdump >> ~/.mitmproxy/nvdump
fi
cat mitmdump_output
}
trap on_exit EXIT
if [[ -f keyfile.toml ]]; then
export KEYFILE=keyfile.toml
fi
for _ in {1..10}; do
if [[ -s ~/.mitmproxy/mitmproxy-ca-cert.pem ]]; then
break
fi
sleep 1
done
export SSL_CERT_FILE=$HOME/.mitmproxy/mitmproxy-ca-cert.pem
export GIT_SSL_CAINFO=$SSL_CERT_FILE
export http_proxy=http://localhost:7890 https_proxy=http://localhost:7890
pytest

setup.cfg (new file)
@@ -0,0 +1,79 @@
# The complex upload command:
# rm -rf dist && python -m build --sdist && twine check dist/* && twine upload -s dist/*
[metadata]
name = nvchecker
version = attr: nvchecker.__version__
author = lilydjwg
author_email = lilydjwg@gmail.com
description = New version checker for software
license = MIT
keywords = new, version, build, check
url = https://github.com/lilydjwg/nvchecker
long_description = file: README.rst
long_description_content_type = text/x-rst
platforms = any
classifiers =
Development Status :: 5 - Production/Stable
Environment :: Console
Intended Audience :: Developers
Intended Audience :: System Administrators
License :: OSI Approved :: MIT License
Operating System :: OS Independent
Programming Language :: Python
Programming Language :: Python :: 3
Programming Language :: Python :: 3 :: Only
Programming Language :: Python :: 3.8
Programming Language :: Python :: 3.9
Programming Language :: Python :: 3.10
Programming Language :: Python :: 3.11
Programming Language :: Python :: 3.12
Programming Language :: Python :: 3.13
Topic :: Internet
Topic :: Internet :: WWW/HTTP
Topic :: Software Development
Topic :: System :: Archiving :: Packaging
Topic :: System :: Software Distribution
Topic :: Utilities
[options]
zip_safe = True
python_requires = >=3.8
packages = find_namespace:
install_requires =
tomli; python_version<"3.11"
structlog
platformdirs
tornado>=6
pycurl
scripts =
scripts/nvchecker-ini2toml
scripts/nvchecker-notify
[options.packages.find]
exclude = tests, build*, docs*
[options.extras_require]
vercmp =
pyalpm
awesomeversion =
awesomeversion
pypi =
packaging
htmlparser =
lxml
rpmrepo =
lxml
jq =
jq
[options.entry_points]
console_scripts =
nvchecker = nvchecker.__main__:main
nvtake = nvchecker.tools:take
nvcmp = nvchecker.tools:cmp
[flake8]
ignore = E111, E302, E501

@@ -1,23 +0,0 @@
#!/usr/bin/env python3
# vim:fileencoding=utf-8
from setuptools import setup, find_packages
setup(
name = 'nvchecker',
version = '0.1',
packages = find_packages(),
install_requires = ['tornado'],
entry_points = {
'console_scripts': [
'nvchecker = nvchecker.main:main',
],
},
author = 'lilydjwg',
author_email = 'lilydjwg@gmail.com',
description = 'New version checker for software',
license = 'MIT',
keywords = 'new version build check',
url = 'https://github.com/lilydjwg/nvchecker',
)

tests/__init__.py (new file)
@@ -0,0 +1,3 @@
# MIT licensed
# Copyright (c) 2020 lilydjwg <lilydjwg@gmail.com>, et al.

tests/conftest.py (new file)
@@ -0,0 +1,111 @@
# MIT licensed
# Copyright (c) 2020, 2024 lilydjwg <lilydjwg@gmail.com>, et al.
import asyncio
import structlog
import os
from pathlib import Path
from typing import TYPE_CHECKING, Dict
if TYPE_CHECKING:
import tomli as tomllib
else:
try:
import tomllib
except ModuleNotFoundError:
import tomli as tomllib
import pytest
import pytest_asyncio
from nvchecker import core
from nvchecker import __main__ as main
from nvchecker.util import Entries, ResultData, RawResult
use_keyfile = False
async def run(
entries: Entries, max_concurrency: int = 20,
) -> Dict[str, str]:
task_sem = asyncio.Semaphore(max_concurrency)
result_q: asyncio.Queue[RawResult] = asyncio.Queue()
keyfile = os.environ.get('KEYFILE')
if use_keyfile and keyfile:
filepath = Path(keyfile)
keymanager = core.KeyManager(filepath)
else:
keymanager = core.KeyManager(None)
dispatcher = core.setup_httpclient()
entry_waiter = core.EntryWaiter()
futures = dispatcher.dispatch(
entries, task_sem, result_q,
keymanager, entry_waiter, 1, {},
)
oldvers: ResultData = {}
result_coro = core.process_result(oldvers, result_q, entry_waiter)
runner_coro = core.run_tasks(futures)
results, _has_failures = await main.run(result_coro, runner_coro)
return {k: r.version for k, r in results.items()}
@pytest_asyncio.fixture(scope="session")
async def get_version():
async def __call__(name, config):
entries = {name: config}
newvers = await run(entries)
return newvers.get(name)
return __call__
@pytest_asyncio.fixture(scope="session")
async def run_str():
async def __call__(str):
entries = tomllib.loads(str)
newvers = await run(entries)
return newvers.popitem()[1]
return __call__
@pytest_asyncio.fixture(scope="session")
async def run_str_multi():
async def __call__(str):
entries = tomllib.loads(str)
newvers = await run(entries)
return newvers
return __call__
@pytest.fixture(scope="session", autouse=True)
def raise_on_logger_msg():
def proc(logger, method_name, event_dict):
if method_name in ('warning', 'error'):
if 'exc_info' in event_dict:
exc = event_dict['exc_info']
if isinstance(exc, Exception):
raise exc
else: # exc_info=True
raise
if not event_dict['event'].startswith(('rate limited', 'no-result')):
raise RuntimeError(event_dict['event'])
return event_dict['event']
structlog.configure([proc])
def pytest_configure(config):
# register an additional marker
config.addinivalue_line(
'markers', 'needs_net: mark test to require Internet access',
)
@pytest.fixture
def keyfile():
global use_keyfile
if 'KEYFILE' not in os.environ:
pytest.skip('KEYFILE not set')
return
use_keyfile = True
yield
use_keyfile = False

tests/test_alpm.py (new file)
@@ -0,0 +1,113 @@
# MIT licensed
# Copyright (c) 2020 DDoSolitary <DDoSolitary@gmail.com>, et al.
import pathlib
import shutil
import subprocess
import tempfile
import pytest
pytestmark = [
pytest.mark.asyncio,
pytest.mark.skipif(shutil.which('makepkg') is None, reason='requires makepkg command'),
pytest.mark.skipif(shutil.which('repo-add') is None, reason='requires repo-add command')
]
global temp_dir, db_path
def setup_module(module):
global temp_dir, db_path
temp_dir = tempfile.TemporaryDirectory()
temp_path = pathlib.Path(temp_dir.name)
pkg_path = temp_path / 'test-pkg'
pkg_path.mkdir()
with (pkg_path / 'PKGBUILD').open('w') as f:
f.write(
'pkgname=test-pkg\n'
'pkgver=1.2.3\n'
'pkgrel=4\n'
'arch=(any)\n'
'provides=("test-provides=5.6-7" "test-provides-unversioned")\n'
'options=(!debug)\n'
)
subprocess.check_call(['makepkg', '--nosign'], cwd=pkg_path)
pkg_file = subprocess.check_output(['makepkg', '--packagelist'], cwd=pkg_path, text=True).strip()
db_path = pkg_path / 'test-db'
db_path.mkdir()
repo_path = db_path / 'sync'
repo_path.mkdir()
subprocess.check_call([
'repo-add',
repo_path / 'test-repo.db.tar.gz',
pkg_path / pkg_file
])
def teardown_module(module):
temp_dir.cleanup()
async def test_alpm(get_version):
assert await get_version('test-pkg', {
'source': 'alpm',
'dbpath': str(db_path),
'repo': 'test-repo'
}) == '1.2.3-4'
async def test_alpm_strip(get_version):
assert await get_version('test-pkg', {
'source': 'alpm',
'dbpath': str(db_path),
'repo': 'test-repo',
'strip_release': True
}) == '1.2.3'
async def test_alpm_provided(get_version):
assert await get_version('test-pkg', {
'source': 'alpm',
'dbpath': str(db_path),
'repo': 'test-repo',
'provided': 'test-provides'
}) == '5.6-7'
async def test_alpm_provided_strip(get_version):
assert await get_version('test-pkg', {
'source': 'alpm',
'dbpath': str(db_path),
'repo': 'test-repo',
'provided': 'test-provides',
'strip_release': True
}) == '5.6'
async def test_alpm_missing_repo(get_version):
with pytest.raises(RuntimeError):
await get_version('test-pkg', {
'source': 'alpm',
'dbpath': str(db_path),
'repo': 'wrong-repo'
})
async def test_alpm_missing_pkg(get_version):
with pytest.raises(RuntimeError):
await get_version('wrong-pkg', {
'source': 'alpm',
'dbpath': str(db_path),
'repo': 'test-repo'
})
async def test_alpm_missing_provides(get_version):
with pytest.raises(RuntimeError):
await get_version('test-pkg', {
'source': 'alpm',
'dbpath': str(db_path),
'repo': 'test-repo',
'provided': 'wrong-provides'
})

tests/test_alpmfiles.py (new file)
@@ -0,0 +1,53 @@
# MIT licensed
# Copyright (c) 2023 Pekka Ristola <pekkarr [at] protonmail [dot] com>, et al.
import pathlib
import shutil
import subprocess
import tempfile
import pytest
pytestmark = [
pytest.mark.asyncio,
pytest.mark.skipif(shutil.which('pacman') is None, reason='requires pacman command'),
pytest.mark.skipif(shutil.which('fakeroot') is None, reason='requires fakeroot command'),
]
global temp_dir, db_path
def setup_module(module):
global temp_dir, db_path
temp_dir = tempfile.TemporaryDirectory()
temp_path = pathlib.Path(temp_dir.name)
db_path = temp_path / 'test-db'
db_path.mkdir(exist_ok=True)
cmd = ['fakeroot', 'pacman', '-Fy', '--dbpath', db_path]
subprocess.check_call(cmd)
def teardown_module(module):
temp_dir.cleanup()
async def test_alpmfiles(get_version):
assert await get_version('test', {
'source': 'alpmfiles',
'pkgname': 'libuv',
'filename': 'usr/lib/libuv\\.so\\.([^.]+)',
'dbpath': db_path,
}) == '1'
async def test_alpmfiles_strip(get_version):
assert await get_version('test', {
'source': 'alpmfiles',
'pkgname': 'glibc',
'repo': 'core',
'filename': 'libc\\.so\\.[^.]+',
'strip_dir': True,
'dbpath': db_path,
}) == 'libc.so.6'

tests/test_android_sdk.py (new file)
@@ -0,0 +1,55 @@
# MIT licensed
# Copyright (c) 2020 lilydjwg <lilydjwg@gmail.com>, et al.
# Copyright (c) 2017 Chih-Hsuan Yen <yan12125 at gmail dot com>
import pytest
pytestmark = [pytest.mark.asyncio, pytest.mark.needs_net]
@pytest.mark.flaky(reruns=10)
async def test_android_addon(get_version):
assert await get_version("android-google-play-apk-expansion", {
"source": "android_sdk",
"android_sdk": "extras;google;market_apk_expansion",
"repo": "addon",
}) == "1.r03"
async def test_android_package(get_version):
version = await get_version("android-sdk-cmake", {
"source": "android_sdk",
"android_sdk": "cmake;",
"repo": "package",
})
assert version.startswith("3.")
async def test_android_package_channel(get_version):
assert await get_version("android-sdk-cmake", {
"source": "android_sdk",
"android_sdk": "ndk;",
"repo": "package",
"channel": "beta,dev,canary",
}) == "26.0.10636728"
async def test_android_list(get_version):
assert await get_version("android-sdk-cmake-older", {
"source": "android_sdk",
"android_sdk": "cmake;",
"repo": "package",
"include_regex": r"3\.10.*",
}) == "3.10.2"
async def test_android_package_os(get_version):
assert await get_version("android-usb-driver", {
"source": "android_sdk",
"android_sdk": "extras;google;usb_driver",
"repo": "addon",
"host_os": "windows"
}) == "13"
async def test_android_package_os_missing(get_version):
assert await get_version("android-usb-driver", {
"source": "android_sdk",
"android_sdk": "extras;google;usb_driver",
"repo": "addon",
"host_os": "linux"
  }) is None

Some files were not shown because too many files have changed in this diff.