13:00:52 #startmeeting barbican
13:00:53 Meeting started Tue Jul 14 13:00:52 2020 UTC and is due to finish in 60 minutes. The chair is redrobot. Information about MeetBot at http://wiki.debian.org/MeetBot.
13:00:54 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
13:00:57 The meeting name has been set to 'barbican'
13:01:09 #topic Roll Call
13:01:19 o/
13:01:21 o/
13:01:21 o/
13:01:38 Hello everyone!
13:01:42 iurygregory, around?
13:01:51 moguimar, yup
13:02:10 hey y'all
13:02:15 Courtesy ping for ade_lee dave-mccowan hrybacki jamespage lxkong mhen raildo rm_work xek nearyo
13:02:23 hi iurygregory!
13:02:40 hey redrobot o/
13:03:18 As usual, our agenda can be found here:
13:03:20 #link https://etherpad.opendev.org/p/barbican-weekly-meeting
13:03:39 #topic Review Past Meeting Action Items
13:04:14 #link http://eavesdrop.openstack.org/meetings/barbican/2020/barbican.2020-07-07-13.00.html
13:04:25 First action item: moguimar to update Core Team roster in wiki
13:04:30 done
13:04:41 moguimar, nice!
13:05:04 Here is the link:
13:05:06 #link https://wiki.openstack.org/wiki/Barbican#Core_Team
13:05:25 Next action item:
13:05:27 tosky to check if barbican/volume tests work if moved into cinder-tempest-plugin
13:05:36 tosky, did you get a chance to work on that?
13:05:41 not yet, sorry
13:05:54 but I will come to that
13:06:07 no worries
13:06:14 We'll check in again next week
13:06:16 #action tosky to check if barbican/volume tests work if moved into cinder-tempest-plugin
13:06:41 #topic Liaison Updates
13:06:56 moguimar, any news from Oslo?
13:07:04 nope
13:08:17 we had a good meeting with plenty of topics, but none that I recall involves castellan or barbican
13:08:55 moguimar, cool
13:09:06 tosky, any news from the testing side of things?
13:09:57 not much (I guess it's linked to the point above)
13:10:14 I rechecked the barbican grenade job but I'm still not sure about the reason for the failure
13:10:33 Gotcha
13:10:39 OK, moving on
13:10:51 iurygregory also checked the barbican grenate
13:10:53 grenade
13:11:04 #topic Grenade Gate
13:11:08 I guess you two could join forces
13:11:59 the failure seems to be related to cinder (and since the grenade job didn't test barbican before, it's a bit tricky) XD
13:12:41 I was looking at the cinder job with barbican to get an idea of the configuration and I'm trying to test locally =)
13:12:41 right now the grenade job on master doesn't test barbican
13:12:56 hence my patch to add barbican
13:12:56 tosky, yep, it never tested XD
13:13:07 but barbican fails at some point during the upgrade process
13:13:34 this is the patch: https://review.opendev.org/#/c/724258/
13:13:49 it fails with Order creation attempt not allowed - please review your user/project privileges: oslo_policy.policy.PolicyNotAuthorized: orders:post is disallowed by policy
13:14:06 this is while trying to create an encrypted volume
13:14:31 so maybe it's just a misconfiguration on Cinder/Barbican
13:14:43 Oh, interesting... sounds like a mismatch in oslo.policy
13:15:02 hummm
13:15:12 this would be solved by configuration or nope?
13:15:20 i.e. cinder policy is allowing the operation, but barbican policy is not
13:15:37 here is the error: https://zuul.opendev.org/t/openstack/build/896eb6b045ce4b48880340b92007b413/log/controller/logs/screen-barbican-svc.txt#67
13:15:49 we'll have to check the role assignments for the user credentials that are being used
13:15:56 * redrobot makes a note to look into that patch
13:16:35 the timestamp matches the failed operation (well, it's the only error, I'd be surprised if it wasn't the case): https://zuul.opendev.org/t/openstack/build/896eb6b045ce4b48880340b92007b413/log/controller/logs/grenade.sh_log.txt#1424
13:18:54 maybe something changed between ussuri and master around policies? Because I think this error should be visible also with a normal devstack job
13:18:59 but I don't know too much about that part
13:19:20 🤔
13:19:36 I am not sure... but I'll try to look into it this week, and ping you if I find anything.
13:20:17 probably we can run with a different tempest role
13:20:52 iirc I tried two of them
13:21:01 maybe I still missed the right one
13:21:28 humm I will do some tests locally to see how it goes
13:24:25 another thing to check is to see whether we need barbican-specific roles to be added?
13:24:52 e.g. "creator"
13:24:59 the config we have has only creator
13:25:14 on the cinder job I saw they set member, creator
13:25:20 so maybe we would need both?
13:26:16 I am not sure. I think I need to learn a little bit more about tempest to be helpful. 😞
13:26:44 I will give it a try locally with both options in the tempest.conf
13:27:10 * iurygregory was checking the cinder job and will check diffs from cinder/barbican.conf
13:27:26 awesome, thank you for the help iurygregory!
13:27:33 np o/
13:27:34 and tosky! :D
13:28:05 OK, moving on
13:28:09 #topic Kanban Review
13:28:34 #link https://tree.taiga.io/project/dmend-openstack-barbican/kanban
13:28:55 moguimar, any updates on the HVAC rewrite for Castellan
13:28:57 ?
13:28:59 yep
13:29:01 https://review.opendev.org/#/c/739698/
13:29:13 got some good reviews from herve and stephen
13:29:21 Nice!
13:29:25 with this one in, the next one is hvac
13:29:29 I need to give it a good review as well.
13:29:51 very cool, moguimar! :D
13:30:10 I need to add a card on here to track the Secret Consumers progress.
13:31:38 OK, moving on ...
13:31:46 #topic Bug Review
13:32:05 First up, Barbican Storyboards
13:32:08 #link https://storyboard.openstack.org/#!/project_group/barbican
13:32:18 Looks like we have not had any new bugs this week
13:32:20 so that's good
13:33:05 shouldn't openstack/ansible-role-atos-hsm and openstack/ansible-role-thales-hsm be listed too?
13:33:31 moguimar, great question ... I think we may not have created those projects in Storyboard
13:33:37 I can look into getting that added
13:33:54 #action redrobot to look into ansible-role-thales-hsm and ansible-role-atos-hsm bug projects
13:34:13 those two are still co-owned by the OpenStack Ansible team
13:34:27 and they're listed under that project in governance
13:34:42 and I'm not sure they're using Storyboard to track bugs on those
13:35:16 Next, the Castellan Launchpad
13:35:19 #link https://bugs.launchpad.net/castellan/
13:35:43 Looks like no new bugs there either
13:35:47 OK, moving on
13:36:03 #topic Wayward Reviews
13:36:10 and this week we have a brand new link thanks to moguimar :D
13:36:25 #link https://tinyurl.com/ya2qgkw6
13:36:51 I'd like to focus on the stable branches first today
13:37:13 what about we tackle all S/T/U patches?
13:37:18 lots of -1s at the gate :(
13:37:22 yep
13:37:47 this one I got the gate fixed
13:37:49 https://review.opendev.org/#/c/738396/
13:37:52 with tosky's help
13:38:20 m m m m merged!
13:38:21 :D
13:39:46 we have many of them failing docs
13:39:50 Douglas Mendizábal proposed openstack/barbican stable/queens: Make broken fedora_latest job n-v https://review.opendev.org/695327
13:40:07 I'll backport the docs fix we had
13:40:14 and rebase them
13:40:16 ^^ rebased to see if gate is still passing on that one (last run was from December 2019)
13:40:22 moguimar, awesome
13:40:53 I think maybe we should do them in LIFO order
13:41:07 e.g. do U backports first, then T, then S
13:41:14 ok
13:41:35 Doesn't look like there's any we can merge right now
13:41:56 we have 2 patches by the release bot
13:42:01 and one by you redrobot
13:42:03 this one
13:42:05 Use Zuulv3 devstack jobs
13:42:40 but they are also failing devstack
13:42:51 not just docs
13:43:04 K, maybe we can tag team some of those this week
13:44:33 moguimar, here's one for master from the translation bot: https://review.opendev.org/#/c/735772/
13:45:28 shipped
13:45:48 OK, let's move on
13:45:52 #topic Open Discussion
13:45:57 Anything else y'all want to talk about?
13:46:21 let me see the dashboard one last time
13:47:14 any tips on why this one is failing?
13:47:16 https://review.opendev.org/#/c/740698/
13:49:11 moguimar, looks like a database integrity issue: https://zuul.opendev.org/t/openstack/build/b6dd7183a15c42f0a5fcc48e50d89c71/log/controller/logs/screen-barbican-svc.txt#202
13:49:20 which sounds super crappy
13:49:44 I'm not sure how you'd end up with duplicate rows during SecretOrder processing :(
13:50:50 is it some flakiness?
13:52:00 yeah, let's try a recheck and if it fails consistently with the same error we may need to dig into it some more
13:52:34 ok
13:53:12 cause I had a +1 from Zuul before
13:53:48 and then I got a -1
13:53:58 * redrobot shakes fist at Zuul
13:54:15 ok, I guess that'
13:54:18 s all
13:54:24 thanks moguimar
13:54:25 on my end
13:54:34 We're just about out of time too
13:54:44 * redrobot can't remember the last time that happened
13:54:53 Thanks for joining everyone!
13:55:01 #endmeeting
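
For reference, a minimal oslo.policy sketch of the orders:post failure discussed under the Grenade Gate topic. This is an illustration only, not Barbican code: the check strings below assume Barbican's legacy "admin_or_creator" default for orders:post, and the role and project names are made up; the actual defaults and credentials used by the grenade job may differ.

# Hypothetical sketch of the policy mismatch: a token with only the
# "member" role fails Barbican's orders:post check, while one that also
# carries the barbican-specific "creator" role passes.
from oslo_config import cfg
from oslo_policy import policy

conf = cfg.ConfigOpts()
conf(args=[])  # no config files; rely only on the registered defaults below

enforcer = policy.Enforcer(conf)
enforcer.register_defaults([
    policy.RuleDefault('admin_or_creator', 'role:admin or role:creator'),
    policy.RuleDefault('orders:post', 'rule:admin_or_creator'),
])

target = {'project_id': 'demo-project'}

# Credentials carrying only "member": the check fails (PolicyNotAuthorized
# if enforce() were called with do_raise=True, as Barbican does).
member_creds = {'roles': ['member'], 'project_id': 'demo-project'}
print(enforcer.enforce('orders:post', target, member_creds))   # False

# Adding "creator" satisfies the rule.
creator_creds = {'roles': ['member', 'creator'], 'project_id': 'demo-project'}
print(enforcer.enforce('orders:post', target, creator_creds))  # True

This is the code-level analogue of the configuration idea raised in the meeting: the cinder job reportedly sets both member and creator as tempest roles, while the barbican grenade job only had creator, so the role assignments on the test credentials are the first thing to compare.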