Blog: https://addshore.com
Twitter: https://twitter.com/addshore
Meta: https://meta.wikimedia.org/wiki/User:Addshore
Wikitech: https://wikitech.wikimedia.org/wiki/User:Addshore
🦓🐝🐢
From my vague memories, it was split out of Wikibase at that point in time; the Cognate extension was created to manage interlanguage links on Wiktionary.
The interwiki-storing code that was part of Wikibase would not run on Wiktionary for the sitelinks that Cognate provided.
Hence the split.
Generally speaking I'd say yes, the container should just have Python in it.
However currently mwcli doesn't control the images, they are releng / mw developer images.
It's something I have always sat on the fence about, in terms of whether mwcli should just build its own images too, tbh!
And a separate one while just making an API call: {"error":{"code":"internal_api_error_DBConnectionError","info":"[cb61a3d938a16e2eb840f474] Caught exception of type Wikimedia\\Rdbms\\DBConnectionError","errorclass":"Wikimedia\\Rdbms\\DBConnectionError"}}
Currently, if we followed Adam's lead and didn't create a proper extension, but just hooks that insert those installed-software rows, we wouldn't get anything back from the ActionAPI for this.
Yep, this one is very easy to reproduce.
I wrote some more thoughts here today, while sat on a train: https://www.wikidata.org/wiki/User:Addshore/EditGroups. Edit groups came up on https://www.wikidata.org/wiki/Wikidata_talk:Requests_for_comment/Mass-editing_policy again, along with the mention of making them "first party".
Thanks for the report,
I definitely need to change something here soon (if not just revert the change depending on the dashboard for service resolution).
The proposal should be on wiki at mediawiki.org so as to include @Addshore (the creator of mwcli) and other volunteers.
Basically, with how discussiontools parses comments, the part between the heading and the first signature is interpreted as the "first post".
Is this constrained by the host or client download speed?
What speed can https://files.scatter.red/orb/2025/12/ allow?
And what's your connection speed?
It will not currently work for custom services, though that should be easy enough to add as well, and would also be very neat.
Can confirm today this still happens
$wgHooks['SoftwareInfo'][] = function( &$software ) {
$software['[https://www.mediawiki.org/wiki/Wikibase/Suite Wikibase Suite]'] = '0.0.0';
$software['[https://www.mediawiki.org/wiki/Wikibase/Suite/Deploy Wikibase Suite Deploy]'] = '0.0.0';
return true;
};

This is ultimately fixed in https://gitlab.wikimedia.org/repos/releng/cli/-/merge_requests/635, pending release.
Which switches to local.wmftest.net instead of *.localhost
Some part of this will actually come in https://gitlab.wikimedia.org/repos/releng/cli/-/merge_requests/638 to play around with
In the form of a status page that tells you 1) what sites you have and 2) what services are running.
We will trial an alternative approach to things in https://gitlab.wikimedia.org/repos/releng/cli/-/merge_requests/638
There is a "dashboard" with API that "knows" the status of the services.
MediaWiki just checks this API for the state, and also caches it for 5s.
This is missing the addition of the updatelog https://github.com/wikimedia/mediawiki-extensions-Wikibase/blob/c23338f0b71ea18c4defc24c5f7b606d04ccca67/repo/includes/Store/Sql/DatabaseSchemaUpdater.php#L249-L252
Needs fixing in Wikibase :)
So I think rebuildPropertyTerms is there from an earlier version
It looks like the rebuildItemTerms update log row is only added via update.php, and not when the maintenance script that actually does this job is run directly.
So rebuilding the terms storage in SQL
Right cc @Lydia_Pintscher
It sounds like the "right" way is indeed to expect Wikibases to install https://www.mediawiki.org/wiki/Extension:ShortUrl, so that they themselves can generate short URLs for their own query services, for a consistent feature set across the ecosystem.
I believe the only way to use the new API is with a plan, which has limits
In T411634#11429411, @dena wrote: Done in 1 h 23 min 29 s.
logs: https://cloudlogging.app.goo.gl/RnsF4JhzTRxJybRWA
I wonder if we could run out of memory here.
Screenshot from the status page side of things showing the period of time it couldn't edit for
So yes, CI for mwcli currently fails due to this issue with PHP version highlighted by @hoo
So, chatting on this with @karapayneWMDE briefly yesterday.
A short-term "fix", so that this doesn't actually immediately break, might be to make an mwcli image that is based on the newer mediawiki image, but with curl reverted to a previous version / pinned back, so that the expectations around how mwcli works remain the same in the short term.
However I'm not sure if the "libcurl" vs "curl" issue described in the description of this task means this wouldn't actually work.
This would need a brief investigation / attempt.
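Purely as a hypothetical sketch of that pinning idea (the base image name, tag, and curl version below are placeholders, not real values; the actual releng images may be structured quite differently):

```dockerfile
# HYPOTHETICAL sketch: base on the newer mediawiki dev image,
# then pin curl back so mwcli's expectations hold in the short term.
# The image name/tag and the version string are placeholders.
FROM docker-registry.example/dev/mediawiki:newer
RUN apt-get update \
    && apt-get install -y --allow-downgrades curl=<older-pinned-version> \
    && apt-mark hold curl
```

As noted above, whether this would help depends on the libcurl vs curl distinction: if PHP links against libcurl directly, pinning only the curl binary would not be enough.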
Great, I'll look at merging this and including it in the next release too :)
Is x3 accessible via either 1) Quarry or 2) the stat / analytics clusters?
I can see that analytics-mysql on the stat hosts has a --use-x1 option, but nothing for x3
Urgff
Looking encouraging
How can one find the latest deployed container images for various services these days? TLDR: I'm trying to find out what version of https://docker-registry.wikimedia.org//repos/data-engineering/eventgate-wikimedia/tags/ is actively used right now / if this image is actually used, or if there is a different one for eventgate
10:54 AM <Lucas_WMDE> Lucas Werkmeister
addshore: my guess would’ve been deployment-charts but it looks like https://gerrit.wikimedia.org/g/operations/deployment-charts/+/5f57cf991e/charts/eventgate/values.yaml#92 isn’t pinned to a specific version
10:54 AM
— Lucas_WMDE might be misunderstanding a lot of things
10:56 AM <addshore>
i was confused when it wasn't just showing up in codesearch :D but yes, this already seems like the pointer I need, which is, I'm looking at and/or using the wrong image :D
10:57 AM
it changed here it seems https://gerrit.wikimedia.org/r/plugins/gitiles/operations/deployment-charts/+/6834d5331aaebccbb069f10d1e2250c2281c1be4%5E%21/#F1
10:57 AM
Lucas_WMDE: tyvm
In T406317#11396486, @SuzanneWood-WMDE wrote: I tried adding the setting EVENTLOGGING_IMAGE=docker-registry.wikimedia.org/wikimedia/eventgate-wikimedia:ffd68c0de41e3395e2f8ba9422fbe8824c2a49ff (the latest from here) to .config/mwcli/mwdd/default, as it is otherwise overwritten with an old image name in eventlogging.yml (it'd be good to update it in eventlogging.yml)
Still, the schema cannot be loaded (even though it does exist at https://schema.wikimedia.org/repositories/secondary/jsonschema/analytics/product_metrics/web/base/1.5.0 ). This time the error is "unable to get local issuer certificate"
eventlogging-1 | {"name":"eventgate-wikimedia","hostname":"XXX","pid":1,"level":50,"err":{"message":"Failed loading schema at /analytics/product_metrics/web/base/1.5.0","name":"EventSchemaLoadError","stack":"EventSchemaLoadError: Failed loading schema at /analytics/product_metrics/web/base/1.5.0\n at loadSchema.catch (/srv/service/node_modules/eventgate/lib/EventValidator.js:229:23)\n at tryCatcher (/srv/service/node_modules/bluebird/js/release/util.js:16:23)\n at Promise._settlePromiseFromHandler (/srv/service/node_modules/bluebird/js/release/promise.js:547:31)\n at Promise._settlePromise (/srv/service/node_modules/bluebird/js/release/promise.js:604:18)\n at Promise._settlePromise0 (/srv/service/node_modules/bluebird/js/release/promise.js:649:10)\n at Promise._settlePromises (/srv/service/node_modules/bluebird/js/release/promise.js:725:18)\n at _drainQueueStep (/srv/service/node_modules/bluebird/js/release/async.js:93:12)\n at _drainQueue (/srv/service/node_modules/bluebird/js/release/async.js:86:9)\n at Async._drainQueues (/srv/service/node_modules/bluebird/js/release/async.js:102:5)\n at Immediate.Async.drainQueues [as _onImmediate] (/srv/service/node_modules/bluebird/js/release/async.js:15:14)\n at runCallback (timers.js:705:18)\n at tryOnImmediate (timers.js:676:5)\n at processImmediate (timers.js:658:5)","originalError":{"name":"HTTPError","message":"unable to get local issuer certificate","status":504,"headers":{"content-type":"application/problem+json"},"body":{"type":"internal_http_error","detail":"unable to get local issuer certificate","internalStack":"Error: unable to get local issuer certificate\n at TLSSocket.onConnectSecure (_tls_wrap.js:1055:34)\n at TLSSocket.emit (events.js:189:13)\n at TLSSocket._finishInit (_tls_wrap.js:633:8)","internalURI":"https://schema.wikimedia.org/repositories/secondary/jsonschema/analytics/product_metrics/web/base/1.5.0","internalErr":"unable to get local issuer certificate","internalMethod":"get"}},"uri":"/analytics/product_metrics/web/base/1.5.0"},"msg":"event encountered an error: Failed loading schema at /analytics/product_metrics/web/base/1.5.0","time":"2025-11-21T14:59:48.168Z","v":0}

I guess it leads here for the error: https://gitlab.wikimedia.org/repos/data-engineering/eventgate/-/blob/master/lib/EventValidator.js#L229
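For context on that error: "unable to get local issuer certificate" is what OpenSSL reports when the TLS client cannot build a trusted chain for the server certificate, typically because the CA store inside the container is missing or stale. A minimal Python sketch of the idea (the eventgate service is Node.js, so this is an illustration of the failure mode and the usual fix, not the actual service code; the CA bundle path is an assumption):

```python
import ssl

def make_ssl_context(ca_bundle=None):
    """Build a verifying TLS context. With an empty or stale CA store,
    handshakes fail with 'unable to get local issuer certificate'."""
    ctx = ssl.create_default_context()
    if ca_bundle:
        # Pointing the client at an up-to-date CA bundle (e.g. the
        # container's /etc/ssl/certs/ca-certificates.crt) is the usual fix.
        ctx.load_verify_locations(cafile=ca_bundle)
    return ctx
```

In container terms the equivalent fix is usually installing/refreshing the ca-certificates package in the image, or mounting a CA bundle and telling the runtime where it is.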
I believe I'm still technically capable of helping out here, if there is still a full graph test node of some sort / something that is / can be depooled.
Flaky is probably the wrong word, though as install.php and update.php are controlled entirely externally, they could of course break for any number of unknown reasons.
So one part of the decision around not running them is that then they can't break.
I have pinged petrb elsewhere.
If I don't get a reply, I'll offload what I can only assume are some DB backups from the NFS store and put them somewhere else (until I can confirm they are not needed)
Success!
ty
The link itself on wikiba.se still uses http://
It would be great if it could be updated to https:// too
Hey @Unnati_kesarwani
I'm not sure you can help out on this one, as I don't believe the source code for this site is open and editable anywhere
Also some napkin maths
Another view to look at this from (as I see the back pressure increasing on your graph)
My status monitoring agrees that this is indeed resulting in user impact again
Just some notes regarding sizes (as I looked)
Likely differentiating "SPARQL API endpoint" and "SPARQL user interface (optional)" or some such.
I guess when I filled out the form I expected whoever adjusted the whitelist to validate what was being added, hence I didn't think too hard about how to interpret "endpoint"
The URL for queries is likely meant to be https://query.portal.mardi4nfdi.de/sparql?...
Looks like the form I filled out was unclear, as I entered https://query.portal.mardi4nfdi.de for where the query endpoint is (which is its location for humans)
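The distinction here is between the human-facing query UI and the machine SPARQL endpoint. A tiny sketch of guessing one from the other, assuming the common WDQS-style convention that the endpoint lives at /sparql under the UI's host (this is a convention, not a guarantee, so the guessed URL would still need verifying):

```python
from urllib.parse import urlsplit, urlunsplit

def guess_sparql_endpoint(ui_url):
    """Guess the machine SPARQL endpoint from a human-facing query UI URL
    by appending the conventional /sparql path if it is not already there."""
    parts = urlsplit(ui_url)
    path = parts.path.rstrip("/")
    if not path.endswith("/sparql"):
        path += "/sparql"
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))
```

So the human location https://query.portal.mardi4nfdi.de would map to the guessed endpoint https://query.portal.mardi4nfdi.de/sparql, matching the URL mentioned above.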
Archived for now, can always be brought back if needed by the WD teams.
Currently, the concept of Property Creator as a group ( propertycreator ) only exists on Wikidata, I believe.
Based on comparing https://www.wikidata.org/wiki/Special:ListGroupRights and https://addshore-alpha.wikibase.cloud/wiki/Special:ListGroupRights