12:01:50 <dviroel> #startmeeting watcher
12:01:50 <opendevmeet> Meeting started Thu Nov  6 12:01:50 2025 UTC and is due to finish in 60 minutes.  The chair is dviroel. Information about MeetBot at http://wiki.debian.org/MeetBot.
12:01:50 <opendevmeet> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
12:01:50 <opendevmeet> The meeting name has been set to 'watcher'
12:02:16 <dviroel> hi all o/
12:02:24 <jgilaber> o/
12:02:36 <morenod> o/
12:02:39 <sean-k-mooney> o/
12:03:15 <dviroel> courtesy ping: amoralej chandankumar rlandy
12:03:22 <rlandy> o/
12:03:22 <chandankumar> o/
12:03:23 <amoralej> o/
12:03:37 <dviroel> oh, the ping works :)
12:03:49 <dviroel> thank you all for joining :)
12:03:53 <dviroel> let's start with today's meeting agenda
12:04:10 <dviroel> #link https://etherpad.opendev.org/p/openstack-watcher-irc-meeting#L27 (Meeting agenda)
12:04:32 <dviroel> we have a couple of topics to cover today
12:04:39 <dviroel> feel free to add your own topics to the agenda
12:04:46 <dviroel> let's start
12:05:00 <dviroel> #topic Announcements
12:05:15 <dviroel> first one is about last week PTG
12:05:49 <dviroel> we had a full week of discussion around different topics and across multiple projects
12:06:10 <dviroel> in watcher sessions we covered:
12:06:45 <dviroel> tech debt, the future of integrations, known bugs/issues, improvements to the project, improvements to our testing, and new feature proposals, among others
12:07:23 <dviroel> the link to the etherpad is
12:07:36 <dviroel> #link  https://etherpad.opendev.org/p/watcher-2026.1-ptg
12:07:52 <dviroel> you can also take a look at the summary that I recently sent to the ML instead
12:08:02 <dviroel> #link https://lists.openstack.org/archives/list/openstack-discuss@lists.openstack.org/thread/CQDEIZKBW6JF4WTE4U5JCIVDNA7FKD7B/
12:08:27 <dviroel> if you look at the ptg etherpad, on line ~#57
12:08:49 <dviroel> you will find a compilation of action items that was built based on all topics discussed during the week
12:09:12 <dviroel> if you want to help us on any of these topics, please add your name on it
12:09:36 <dviroel> if the item already has a person assigned, you may want to reach out to that person and see how you can help with that effort
12:10:23 <dviroel> it is very likely that we will bring new discussions to this weekly meeting, as a follow up from our PTG sessions
12:10:42 <sean-k-mooney> i have a second summary here https://gist.github.com/SeanMooney/8a5e8bfc3538917804dfff819c69de10 as well
12:11:02 <sean-k-mooney> i probably won't do a blog on the ptg sessions this release like i did last year
12:11:32 <dviroel> sean-k-mooney: thanks for sharing :)
12:11:47 <sean-k-mooney> it takes quite a lot of energy to do that well. dviroel: thanks for posting the summary to the list
12:12:05 <dviroel> ++
12:12:12 <dviroel> anyone want to highlight something about the PTG?
12:12:31 <amoralej> thanks for working on the summary dviroel
12:12:44 <sean-k-mooney> i had one thought on reflection after the event
12:13:15 <sean-k-mooney> we chose 3 hours over 3 days, and that left things quite compressed
12:13:30 <sean-k-mooney> the overlap with nova was also not ideal
12:14:00 <sean-k-mooney> i wonder if we should consider either more, shorter sessions or starting earlier (on the monday) next time
12:14:03 <dviroel> right, most of the teams were kind of using the same timeslots
12:14:15 <dviroel> sean-k-mooney: yes we can
12:14:40 <dviroel> ptg on monday was quiet in the end
12:15:00 <sean-k-mooney> it was a public holiday in ireland, so while i could have attended i chose not to
12:15:07 <sean-k-mooney> if there were watcher sessions i would have, obviously
12:15:33 <sean-k-mooney> but i don't know how many other countries had a similar holiday and if teams avoided it as a result
12:15:38 <dviroel> so yeah, next ptg we can try that
12:15:52 <amoralej> i'd vote for doing more sessions, not shorter sessions, i think the conversations were good and i think it's good to give time
12:17:15 <dviroel> we may include one more day at the end
12:18:01 <dviroel> we may avoid conflicts by starting earlier and reserving less time on each day
12:18:11 <dviroel> ok
12:18:43 <dviroel> we will have that discussion again next time, when we start booking them?
12:19:05 <dviroel> but that's important feedback
12:19:12 <dviroel> anything else folks?
12:19:31 <dviroel> we may want to move forward, we have more topics
12:19:49 <dviroel> next on the announcement, a small one
12:19:52 <sean-k-mooney> yep we can move on
12:19:58 <dviroel> i just created a new status etherpad
12:20:06 <dviroel> still working on adding links to it
12:20:13 <dviroel> #link https://etherpad.opendev.org/p/watcher-gazpacho-status
12:20:36 * dviroel created this 5 min before the meeting started
12:20:46 <dviroel> so we can track our reviews there
12:20:56 <dviroel> any improvement to that etherpad is welcome too
12:21:10 <dviroel> we may want to track backports there too
12:21:12 <dviroel> let's see
12:21:19 <sean-k-mooney> em, can i make a request
12:21:23 <dviroel> sure
12:21:35 <sean-k-mooney> can we use 2026.1 instead
12:21:53 <dviroel> in the etherpad name?
12:21:55 <sean-k-mooney> that's technically the official release name and gazpacho is just the code name
12:21:58 <sean-k-mooney> yes
12:22:01 <dviroel> ack
12:22:07 <dviroel> correct
12:22:08 <sean-k-mooney> it's much easier to find the related ones if we just use the number
12:22:28 <dviroel> since Antelope, the release number is the official name
12:22:34 <sean-k-mooney> it's why i use them for the ptg etherpads, as i often go back years later and reference them
12:22:48 <sean-k-mooney> i'm not sure we will do that for the status ones but findability is high on my list
12:22:49 <dviroel> i did have to search how to properly write gazpacho, for instance
12:23:00 <dviroel> ack, agree
12:23:11 <dviroel> I can fix that after the meeting
12:23:15 <sean-k-mooney> yes, also that i can't spell gazpacho consistently
12:23:21 <sean-k-mooney> +1
12:23:30 <dviroel> thanks for the feedback
12:23:32 <dviroel> ok
12:23:37 <dviroel> moving to the next
12:23:47 <dviroel> #topic Unmaintained branch cleanup
12:23:52 <dviroel> hey jgilaber o/
12:24:11 <dviroel> #link https://lists.openstack.org/archives/list/openstack-discuss@lists.openstack.org/thread/YK4FRR6LBKZNS3PXFSYH3P3P6HQL4HCS/
12:24:43 <dviroel> jgilaber recently sent this mail to openstack-discuss ml
12:24:45 <jgilaber> we were talking about unmaintained branches last week with chandankumar and sean-k-mooney and realized we still have the old 2024.1 branch
12:25:03 <jgilaber> so I sent the email to see if anyone was using that branch or had any need of it
12:25:17 <jgilaber> otherwise we can remove it since it's been 2 years since the last patch
12:25:32 <sean-k-mooney> yep
12:25:48 <sean-k-mooney> the default is to remove unless an unmaintained branch liaison requests it
12:26:00 <jgilaber> I wanted to bring it here in case anyone had any objection, if not I'll propose a patch to remove it
12:26:10 <sean-k-mooney> in this case no one has been doing maintenance (it does not have the security bug fix, for example)
12:26:15 <sean-k-mooney> so i think we should proceed
12:26:22 <amoralej> +1
12:26:34 <dviroel> +1 on proceeding with the proposal
12:27:18 <jgilaber> ack, I'll propose the patch after the meeting
12:27:25 <dviroel> ack, thanks jgilaber
12:27:47 <dviroel> since there are no objections or concerns, let's move to the next topic
12:28:03 <dviroel> #topic  Functional tempest jobs for 2025.1 and 2024.2 broken
12:28:09 <dviroel> which is also from jgilaber
12:28:28 <jgilaber> yep, yesterday I submitted a patch to drop the tempest functional job for 2024.1
12:28:41 <jgilaber> #link https://review.opendev.org/c/openstack/watcher-tempest-plugin/+/966146
12:28:53 <jgilaber> and noticed that some functional jobs were consistently failing
12:29:07 <jgilaber> I've created a bug for it https://bugs.launchpad.net/watcher-tempest-plugin/+bug/2130783
12:29:11 <jgilaber> #link https://bugs.launchpad.net/watcher-tempest-plugin/+bug/2130783
12:29:37 <jgilaber> and submitted a patch to make them non-voting while I work on a fix to unblock other patches
12:29:40 <jgilaber> #link https://review.opendev.org/c/openstack/watcher-tempest-plugin/+/966256
12:30:16 <jgilaber> the tldr is that it seems that this commit https://github.com/openstack/tempest/commit/f7470781222524a6a65848721e7f64c6dd5cb8aa is making tox recreate the tempest venv
12:30:35 <jgilaber> and the recreated venv does not have the watcher-tempest-plugin installed, so it does not find the tests
12:31:37 <sean-k-mooney> ya
12:31:48 <sean-k-mooney> so this feels like a combination of things
12:32:01 <sean-k-mooney> newer versions of tox effectively hash the venv
12:32:19 <sean-k-mooney> so if it sees that we installed a package into it, it will recreate it when we execute a command
12:32:50 <sean-k-mooney> there may be an interaction with setuptools as well, but this feels like a tempest/devstack bug that we should report to the qa team
12:33:04 <sean-k-mooney> and then fix it either in our plugin or in tempest or devstack so that we do this properly
12:33:39 <sean-k-mooney> i know there was discussion in devstack about how we are currently creating the venv
12:34:38 <sean-k-mooney> let's pick this up with gmaan when they are online later
12:34:49 <sean-k-mooney> and bring it to the #openstack-qa channel
12:34:58 <sean-k-mooney> we are likely not the only team impacted
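For context, a rough sketch of how the plugin normally ends up in tempest's tox venv under devstack; the paths, variable values, and test regex below are assumptions for illustration, not taken from the actual job definitions:

    # local.conf fragment: devstack pip-installs the plugin into tempest's tox venv
    [[local|localrc]]
    enable_plugin watcher https://opendev.org/openstack/watcher
    TEMPEST_PLUGINS='/opt/stack/watcher-tempest-plugin'

    # if tox later decides that venv is stale and recreates it, the pip-installed
    # plugin is gone and a run like this matches no tests
    tox -e venv-tempest -- tempest run --regex watcher_tempest_plugin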
12:35:24 <sean-k-mooney> i need to step away for a few minutes so continue without me
12:35:29 <dviroel> ack, do we agree on making 2025.1 and 2024.2 non-voting for now? or wait for a fix?
12:35:52 <sean-k-mooney> maybe also 2025.2, i saw that fail on one patch
12:35:57 <sean-k-mooney> but that is my proposal, yes
12:35:59 <chandankumar> +1 for non-voting
12:36:02 <dviroel> i think that it will depend on how quickly this fix will land?
12:36:06 <sean-k-mooney> we just need to be careful with what we merge
12:36:11 <dviroel> yep
12:36:15 <chandankumar> it is failing on my watcher tempest plugin patches also
12:36:48 <dviroel> asking based on how urgent it is for these patches to land, today or in 1 or 2 days
12:37:04 <amoralej> is it affecting only the stable tests in watcher-tempest-plugin, or also the stable branches of watcher?
12:37:04 <jgilaber> dviroel, maybe it's something simple, but I don't know enough about devstack to guess
12:37:10 <dviroel> ack
12:37:27 <dviroel> let's continue this conversation in the #openstack-qa channel
12:37:38 <dviroel> and we can decide here in the channel later today
12:38:20 <dviroel> thanks for working on this issue jgilaber
12:38:31 <dviroel> and reporting it
12:38:40 <jgilaber> amoralej, I would expect it to also affect the watcher stable branches, but I don't think we've had anything recent running there
12:38:50 <amoralej> ack
12:39:01 <dviroel> yeah, ack
12:39:22 <dviroel> ok, so let's move on and continue to track this after the meeting, ok?
12:39:28 <jgilaber> +1
12:39:33 <dviroel> #topic Delete/bulk delete operation on audits/actionplan
12:39:52 <chandankumar> let me take it from here
12:39:59 <dviroel> hey chandankumar, this is a follow-up from the dashboard session at the ptg
12:40:05 <dviroel> :)
12:40:08 <chandankumar> yes correct
12:40:11 <chandankumar> Currently we have openstack optimize audit delete aud1 aud2 aud3 or openstack optimize actionplan delete ap1 ap2 ap3.
12:40:20 <chandankumar> In both cases, the openstack cli sends a single delete request multiple times, once for each audit or actionplan.
12:40:31 <chandankumar> Code-wise, it performs a soft deletion of the audit/actionplan; then we need to manually run the watcher db purge to delete the audit/actionplan permanently from the DB.
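For reference, a minimal sketch of the flow described above; the purge invocation is an assumption about the usual entry point, not something agreed in the meeting:

    # the client sends one DELETE request per audit; each is only a soft delete
    openstack optimize audit delete aud1 aud2 aud3
    # the rows stay in the DB until the purge is run separately on the controller
    watcher-db-manage purge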
12:40:47 <chandankumar> I saw that when we delete an audit, it does not delete the actionplan linked with the audit. Is that expected? or do we want to extend it to perform
12:40:47 <chandankumar> a soft delete on the action plan also?
12:41:36 <dviroel> I would not expect it to delete them all together
12:41:49 <amoralej> yeah, i'd keep current behavior
12:41:54 <dviroel> the action plan could still exist I think
12:42:05 <dviroel> if we think about a future rollback mechanism
12:42:21 <dviroel> user could use that action plan to rollback something
12:42:33 <dviroel> note: this does not exist today
12:42:53 <dviroel> and would not be associated with the audit itself in this case
12:43:00 <amoralej> even for audit purposes, one may want to keep the actionplans visible
12:43:01 <dviroel> only with the action plan
12:43:14 <dviroel> amoralej: yes
12:44:02 <chandankumar> dviroel: amoralej: thank you for clarifying it, it makes sense to keep it as it is.
12:44:07 <chandankumar> Moving to bulk delete topic
12:44:10 <dviroel> is there a link from action plan to the audit?
12:44:47 <dviroel> in the dashboard? that may break?
12:45:00 <chandankumar> https://paste.openstack.org/raw/bh4xt1eZG8F4GgF2Amo6/ - this is what I have in the cli
12:45:16 <chandankumar> once we delete an audit, in the actionplan list it is set to None
12:45:21 <amoralej> so Audit is set as None
12:45:54 <amoralej> it'd be interesting to check if it's removed in the db or managed in the api
12:46:24 <chandankumar> https://github.com/openstack/watcher/blob/b5725d6ea60d3b7fb2d2b808b261ccdc547df7c4/watcher/api/controllers/v1/audit.py#L741
12:46:42 <chandankumar> based on the code, it performs soft_delete, I assume it just updates the status to DELETED
12:48:24 <dviroel> when it is soft deleted, I guess that it will not appear anymore
12:48:34 <dviroel> but it will be on the db
12:48:40 <amoralej> yes, it is
12:49:04 <amoralej> my question was if the audit field for that actionplan was removed in the db, sorry
12:49:08 <amoralej> in soft_delete
12:49:14 <amoralej> given that cli shows None
12:49:25 <dviroel> ah ok
12:49:37 <amoralej> or it's still in db, but the api is filtering it as it is soft_deleted
12:49:47 <amoralej> just curiosity, np
12:49:47 <chandankumar> amoralej: I will check that and get back on this.
12:50:00 <sean-k-mooney> back
12:50:47 <sean-k-mooney> i would expect it to still be in the db
12:50:49 <dviroel> this should be handled by the db
12:50:54 <sean-k-mooney> but it depends on how it was hooked up
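As a rough illustration of the soft-delete behaviour being discussed, this follows the common oslo.db SoftDeleteMixin convention rather than watcher's exact code, which would still need checking:

    # Python sketch: the row is only flagged as deleted, not removed; the API
    # layer then filters flagged rows out of normal listings
    from oslo_utils import timeutils

    def soft_delete(session, audit):
        audit.deleted = audit.id               # mark the row as deleted
        audit.deleted_at = timeutils.utcnow()  # record when it happened
        session.add(audit)
        session.commit()
        # the record remains in the DB until a purge permanently removes it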
12:52:22 <dviroel> but the tl;dr here is to not delete the Action Plan when the audit is deleted; I think that we agree with the current implementation
12:52:34 <chandankumar> yes correct!
12:52:57 <dviroel> chandankumar: you were about to bring another point in this topic
12:53:01 <dviroel> ?
12:53:04 <chandankumar> yes coming to that
12:53:11 <chandankumar> Coming to the bulk delete topic: currently I found a reference to bulk delete in the swift API, and in the rest of the projects there is no reference to it.
12:53:24 <chandankumar> reference from swift https://github.com/openstack/swift/blob/master/swift/common/middleware/bulk.py
12:53:36 <chandankumar> Since we have a requirement to do bulk archive for audits/actionplans, how do we want to proceed with the implementation?
12:53:45 <sean-k-mooney> ya so it's not a common operation that needs api support
12:54:00 <sean-k-mooney> well it depends
12:54:05 <amoralej> would it be an asynchronous task?
12:54:06 <chandankumar> If a user passes openstack optimize audit delete aud1 aud2 aud3, will it call a bulk delete api that does a single api call to delete all the passed audits, or will it delete each audit individually and perform a soft delete in the db?
12:54:30 <sean-k-mooney> archiving an action plan should archive the actions associated with the plan
12:54:57 <sean-k-mooney> archiving a one-shot audit could do the same, but it is less clear that it should
12:55:12 <sean-k-mooney> certainly if we want the behavior to be the same for continuous audits we would not want it to
12:55:35 <amoralej> archiving an actionplan already archives the actions
12:55:40 <sean-k-mooney> but yes, at the api level we could take the uuids as a list in the query string
12:55:59 <sean-k-mooney> from an http point of view the delete method is not expected to have a body
12:56:29 <sean-k-mooney> so if we don't want to hit the query string length limit
12:57:05 <sean-k-mooney> the other option would be a post to a new api endpoint with the audits or action plans to archive listed in json in the body
12:57:54 <sean-k-mooney> the final option i see is taking a `cascade` or similar query arg to the audit delete
12:58:04 <sean-k-mooney> to opt into archiving the action plans as well
12:58:22 <chandankumar> currently we do not archive the actionplan if we archive the audit
12:59:29 <dviroel> we could, if the user decides to do that with an option, but yeah, we don't do it by default
12:59:43 <amoralej> i like the idea of making it an option
12:59:44 <dviroel> i mean, we don't support that
12:59:57 <sean-k-mooney> so that's 3 related options: `DELETE /audit/<uuid>?cascade=true`, `DELETE /actionplan/?uuids=....`, or `POST /archive` with a json body
13:00:39 <sean-k-mooney> the other option would be to do it client side only, which i'm not sure is correct
13:01:37 <sean-k-mooney> if we were to just do it in the watcher dashboard it would have to be server side in the plugin, not in javascript
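A rough sketch of what the three request shapes above could look like; the paths and payload fields are illustrative only, nothing here is an agreed watcher API:

    # option 1: cascade flag on the existing delete
    DELETE /v1/audits/<uuid>?cascade=true

    # option 2: uuid list in the query string (may hit the length limit)
    DELETE /v1/action_plans?uuids=<uuid1>,<uuid2>,<uuid3>

    # option 3: new endpoint taking the list in a json body
    POST /v1/archive
    {"audits": ["<uuid1>", "<uuid2>"], "action_plans": ["<uuid3>"]}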
13:01:39 * dviroel time check
13:01:57 <sean-k-mooney> i think there is enough discussion here to show that in any case this needs a spec
13:02:01 <sean-k-mooney> do we agree?
13:02:09 <amoralej> i'd expect horizon to provide some support for bulk?
13:02:10 <dviroel> yes
13:02:21 <chandankumar> from the dashboard point of view, the user can click on a bulk archive button to archive all audits?
13:02:28 <dviroel> yes -> agree that we need a spec
13:02:41 <amoralej> +1 to spec
13:02:48 <sean-k-mooney> i'm not sure we need this at all, by the way
13:03:08 <sean-k-mooney> my general preference would be to have an expiry time on the audits/action plans
13:03:14 <amoralej> i see it as a nice-to-have requirement, tbh
13:03:28 <sean-k-mooney> both for how long a pending action plan is retained and for completed ones
13:03:45 <sean-k-mooney> so a time to live for unapproved action plans
13:03:57 <sean-k-mooney> and an expiry time before they are auto-archived
13:04:10 <sean-k-mooney> but we can talk about the usecase in the spec
13:04:16 <dviroel> expiry time is also a good idea from my pov
13:04:17 <sean-k-mooney> and then design the correct feature or features to address that
13:05:04 <dviroel> chandankumar: yeah, we need more time to think about it, and the spec would be a good starting point, describing the use cases and possible solutions
13:05:22 <chandankumar> +1 for expiry time. Is it going to be a separate feature?
13:05:33 <dviroel> and maybe we can revisit in future meetings if needed
13:05:47 <chandankumar> I got more info for the spec now, I will start working on that
13:06:05 <sean-k-mooney> we can scope that down in the spec
13:06:13 <sean-k-mooney> if it's too complicated we can split it
13:06:19 <dviroel> +1
13:06:22 <chandankumar> +1
13:06:26 <chandankumar> thank you all !
13:06:29 <sean-k-mooney> if not we can have one spec for audit/action lifetimes
13:06:34 <dviroel> thanks chandankumar
13:06:38 <sean-k-mooney> and cover the watcher ui and cli impacts
13:06:46 <dviroel> so we are out of time
13:07:00 <dviroel> there are 2 links in the reviews topic
13:07:18 <dviroel> #topic Reviews
13:07:25 <dviroel> #link https://review.opendev.org/c/openstack/watcher/+/958766: Remove watcher_notifications from default value
13:07:26 <dviroel> and
13:07:33 <dviroel> #link https://review.opendev.org/q/topic:%22bug-2126767%22+and+status:open service_monitor for decision-engine
13:07:43 <dviroel> we don't have time to go into details
13:07:56 <dviroel> but we should be looking at those as requested
13:08:15 <dviroel> amoralej: we can get you a topic at the start of next meeting if needed
13:08:16 <amoralej> the second one was just to bring those to your attention
13:08:22 <dviroel> amoralej: ack, thanks
13:08:29 <amoralej> we can discuss in the reviews
13:08:37 <dviroel> #topic Volunteers to chair next meeting
13:08:51 <jgilaber> I can do it dviroel
13:08:57 <dviroel> jgilaber: TY
13:09:10 <dviroel> so
13:09:10 <dviroel> let's wrap up for today
13:09:14 <dviroel> we will meet again next week
13:09:23 <dviroel> thank you all for participating
13:09:26 <dviroel> #endmeeting