Wednesday, 2020-02-19

*** igordc has quit IRC  00:06
*** slaweq has joined #openstack-meeting-3  00:11
*** slaweq has quit IRC  00:15
*** slaweq has joined #openstack-meeting-3  01:11
*** slaweq has quit IRC  01:15
*** jamesmcarthur has joined #openstack-meeting-3  01:17
*** jamesmcarthur has quit IRC  01:39
*** jamesmcarthur has joined #openstack-meeting-3  01:51
*** yamamoto has joined #openstack-meeting-3  01:53
*** yamamoto has quit IRC  01:53
*** yamamoto has joined #openstack-meeting-3  01:57
*** jamesmcarthur has quit IRC  02:01
*** slaweq has joined #openstack-meeting-3  02:11
*** yamamoto has quit IRC  02:15
*** slaweq has quit IRC  02:16
*** yamamoto has joined #openstack-meeting-3  02:16
*** purplerbot has quit IRC  02:34
*** purplerbot has joined #openstack-meeting-3  02:36
*** yamamoto has quit IRC  02:38
*** yamamoto has joined #openstack-meeting-3  02:42
*** yamamoto has quit IRC  02:43
*** yamamoto has joined #openstack-meeting-3  02:45
*** jamesmcarthur has joined #openstack-meeting-3  02:48
*** yamamoto has quit IRC  02:49
*** hongbin has joined #openstack-meeting-3  02:53
*** slaweq has joined #openstack-meeting-3  03:11
*** slaweq has quit IRC  03:16
*** yamamoto has joined #openstack-meeting-3  03:21
*** hongbin has quit IRC  03:23
*** jamesmcarthur has quit IRC  03:26
*** psachin has joined #openstack-meeting-3  03:35
*** jamesmcarthur has joined #openstack-meeting-3  03:48
*** slaweq has joined #openstack-meeting-3  04:11
*** slaweq has quit IRC  04:16
*** yamamoto has quit IRC  04:36
*** jamesmcarthur has quit IRC  04:58
*** jamesmcarthur has joined #openstack-meeting-3  04:59
*** macz_ has quit IRC  05:00
*** yamamoto has joined #openstack-meeting-3  05:07
*** jamesmcarthur has quit IRC  05:07
*** jamesmcarthur has joined #openstack-meeting-3  05:10
*** slaweq has joined #openstack-meeting-3  05:11
*** yamamoto has quit IRC  05:12
*** yamamoto has joined #openstack-meeting-3  05:15
*** slaweq has quit IRC  05:16
*** jamesmcarthur has quit IRC  05:22
*** jamesmcarthur has joined #openstack-meeting-3  05:28
*** jamesmcarthur has quit IRC  05:32
*** links has joined #openstack-meeting-3  05:35
*** jamesmcarthur has joined #openstack-meeting-3  06:08
*** slaweq has joined #openstack-meeting-3  06:11
*** jamesmcarthur has quit IRC  06:14
*** slaweq has quit IRC  06:15
*** hongbin has joined #openstack-meeting-3  06:23
*** hongbin has quit IRC  06:28
*** jamesmcarthur has joined #openstack-meeting-3  06:30
*** jamesmcarthur has quit IRC  06:35
*** yamamoto has quit IRC  06:39
*** yamamoto has joined #openstack-meeting-3  06:41
*** openstack has joined #openstack-meeting-3  07:19
*** ChanServ sets mode: +o openstack  07:19
*** openstack has joined #openstack-meeting-3  07:43
*** ChanServ sets mode: +o openstack  07:43
*** slaweq has joined #openstack-meeting-3  07:51
*** yamamoto has quit IRC  08:18
*** links has quit IRC  08:24
*** jamesmcarthur has joined #openstack-meeting-3  08:31
*** jamesmcarthur has quit IRC  08:36
*** yamamoto has joined #openstack-meeting-3  08:53
*** ralonsoh has joined #openstack-meeting-3  08:54
*** yamamoto has quit IRC  08:58
*** links has joined #openstack-meeting-3  09:01
*** yamamoto has joined #openstack-meeting-3  09:34
*** yamamoto has quit IRC  09:37
*** e0ne has joined #openstack-meeting-3  09:41
*** yamamoto has joined #openstack-meeting-3  10:51
*** yamamoto has quit IRC  10:52
*** slaweq has quit IRC  11:06
*** yamamoto has joined #openstack-meeting-3  11:09
*** slaweq has joined #openstack-meeting-3  11:11
*** yamamoto has quit IRC  11:13
*** slaweq has quit IRC  11:32
*** slaweq has joined #openstack-meeting-3  11:34
*** raildo has joined #openstack-meeting-3  11:42
*** yamamoto has joined #openstack-meeting-3  11:59
*** yamamoto has quit IRC  12:53
*** yamamoto has joined #openstack-meeting-3  13:15
*** jamesmcarthur has joined #openstack-meeting-3  13:17
*** liuyulong has joined #openstack-meeting-3  13:29
*** jamesmcarthur has quit IRC  13:36
*** jamesmcarthur has joined #openstack-meeting-3  13:48
*** jamesmcarthur has quit IRC  14:14
*** links has quit IRC  14:18
*** slaweq has quit IRC  14:27
*** slaweq has joined #openstack-meeting-3  14:30
*** jamesmcarthur has joined #openstack-meeting-3  14:32
*** jamesmcarthur_ has joined #openstack-meeting-3  14:33
*** jamesmcarthur has quit IRC  14:36
*** yamamoto has quit IRC  14:48
*** irclogbot_3 has quit IRC  14:56
<slaweq> #startmeeting neutron_ci  15:00
<openstack> Meeting started Wed Feb 19 15:00:57 2020 UTC and is due to finish in 60 minutes.  The chair is slaweq. Information about MeetBot at http://wiki.debian.org/MeetBot.  15:00
<openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.  15:00
*** openstack changes topic to " (Meeting topic: neutron_ci)"  15:01
<openstack> The meeting name has been set to 'neutron_ci'  15:01
<ralonsoh> hi  15:01
<njohnston> o/  15:01
*** irclogbot_1 has joined #openstack-meeting-3  15:02
<slaweq> ping bcafarel for CI meeting :)  15:02
<bcafarel> o/  15:02
<slaweq> ok, so I think we can start  15:02
<bcafarel> one day I will get used to this new chan :)  15:02
<slaweq> Grafana dashboard: http://grafana.openstack.org/dashboard/db/neutron-failure-rate  15:02
<slaweq> bcafarel: LOL  15:02
<slaweq> then we will change it again :P  15:02
<slaweq> #topic Actions from previous meetings  15:03
*** openstack changes topic to "Actions from previous meetings (Meeting topic: neutron_ci)"  15:03
<slaweq> first one: ralonsoh to check issues with unauthorized ping and ncat commands in functional tests  15:03
<ralonsoh> a bit messy this week  15:03
<ralonsoh> I pushed a patch and then reverted it  15:03
<ralonsoh> I have also pushed a patch to mark the ncat tests unstable  15:04
<ralonsoh> I still don't know why the rootwrap filters are sometimes valid and in other tests aren't  15:04
<ralonsoh> (that's all)  15:05
<slaweq> but it's working for some tests and not working for other tests in the same job?  15:05
<slaweq> or in different jobs?  15:05
<ralonsoh> same job  15:06
<ralonsoh> and this is the worst scenario  15:06
<ralonsoh> there is no reason for this  15:06
<slaweq> when neutron-functional was a legacy zuulv2 job, it ran tests with "sudo"  15:08
<slaweq> sudo -H -u stack tox -e dsvm-functional  15:08
<ralonsoh> but ncat is using the rootwrap command  15:09
<ralonsoh> and the filter list  15:09
<ralonsoh> (that's OK)  15:09
<slaweq> ok, maybe it's some oslo.rootwrap bug then?  15:09
*** psachin has quit IRC  15:09
<slaweq> maybe we should ask someone from the oslo team to take a look at such a failed run?  15:10
<slaweq> what do You think?  15:10
<ralonsoh> I think so. mjozefcz detected that, during a small window, the filters were not present  15:10
<ralonsoh> I'll ping oslo folks  15:10
<slaweq> ralonsoh: thx  15:10
*** lpetrut has quit IRC  15:11
<slaweq> #action ralonsoh to talk with oslo people about our functional tests rootwrap issue  15:11
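The intermittent failures discussed above come down to whether a spawned command matches a filter that rootwrap has loaded at that moment. A minimal sketch of that allow/deny step, with a hypothetical in-memory filter table (real deployments load CommandFilter entries from rootwrap.d `*.filters` files, so this is an illustration of the suspected race, not oslo.rootwrap's implementation):

```python
# Hypothetical filter table: executable name -> allowed.
# Real filters are parsed from /etc/neutron/rootwrap.d/*.filters.
FILTERS = {
    "ncat": True,
    "ping": True,
}

def is_allowed(cmd, filters=FILTERS):
    """Return True if the command's executable matches a loaded filter.

    If the filter files are read while absent or empty (the race
    suspected in the meeting), every command is rejected with an
    'unauthorized command' error, even though the identical command
    passed in an earlier test of the same job.
    """
    if not cmd:
        return False
    return filters.get(cmd[0], False)

# With filters loaded, ncat is authorized; with an empty table
# (simulating the race), the same command is rejected.
assert is_allowed(["ncat", "-l", "1234"]) is True
assert is_allowed(["ncat", "-l", "1234"], filters={}) is False
```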
*** maciejjozefczyk has joined #openstack-meeting-3  15:12
<slaweq> ok, next one  15:12
<slaweq> slaweq to report issue with ssh timeout on dvr jobs and check logs there  15:12
<slaweq> I reported the bug here: https://bugs.launchpad.net/neutron/+bug/1863858  15:12
<openstack> Launchpad bug 1863858 in neutron "socket.timeout error in dvr CI jobs cause SSH issues" [Critical,Confirmed]  15:12
<slaweq> and I looked at it a bit today  15:12
<slaweq> From what I saw, it seems that it's failing simply due to an ssh timeout. So I guess there is some problem with the FIP configuration, but I don't know what exactly.  15:12
<slaweq> as this started happening only a few weeks ago, I checked what we merged recently  15:13
<slaweq> and I found https://review.opendev.org/#/c/606385/ which seems suspicious to me  15:13
<slaweq> but so far I don't have any strong evidence that this is the culprit of the issue  15:13
<slaweq> I proposed a revert of this patch https://review.opendev.org/#/c/708624/ just to recheck it a couple of times and see if it gets better  15:14
<slaweq> currently the neutron-tempest-dvr job is failing pretty often due to this bug  15:15
<slaweq> if we don't find anything to fix this issue in the next few days, maybe we should temporarily switch this job to be non-voting  15:15
<slaweq> what do You think?  15:15
<ralonsoh> but it's failing only one job  15:15
<ralonsoh> one test  15:16
<ralonsoh> test_resize_volume_backed_server_confirm  15:16
<slaweq> no, I saw other tests failing too  15:16
<ralonsoh> ok  15:16
<slaweq> like here: https://19574e4665a40f62095e-6b9500683e6a67d31c1bad572acf67ba.ssl.cf1.rackcdn.com/705982/6/check/neutron-tempest-dvr/8f3fbd0/testr_results.html  15:16
<ralonsoh> ok, we can mark it as unstable for now (non-voting)  15:16
<ralonsoh> yes, I saw this too  15:16
<slaweq> but that's true, it only happens in the dvr job  15:16
<bcafarel> https://review.opendev.org/#/c/606385/ seems "old", no? we have had it since stein  15:17
<slaweq> ahh, yes  15:17
<slaweq> so it's not that patch for sure  15:18
<slaweq> it was on my list of recent patches because it was recently merged to the stein branch  15:18
<slaweq> so there was a comment in the original patch in gerrit  15:18
<slaweq> so, this isn't the culprit for sure :/  15:18
<slaweq> thx bcafarel for pointing this out  15:18
<bcafarel> np :)  15:18
<slaweq> I will try to reproduce this issue locally, maybe this week  15:19
<slaweq> #action slaweq to try to reproduce and debug the neutron-tempest-dvr ssh issue  15:19
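A simple way to chase the symptom locally is to poll the instance's floating IP on port 22 and see whether a TCP connection ever opens before the deadline, which is roughly the connectivity check tempest performs before SSH-ing in. A minimal sketch (plain stdlib Python; the host and port values are placeholders for a real floating IP):

```python
import socket
import time

def wait_for_ssh(host, port=22, timeout=10.0, interval=0.5):
    """Poll until a TCP connection to host:port succeeds, or give up.

    Returns True as soon as the port accepts a connection, False once
    the deadline passes. A False result against a VM's floating IP
    points at the FIP plumbing (DVR fip namespace, NAT rules) rather
    than at the guest's SSH daemon.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:
            time.sleep(interval)
    return False
```

Running this against the floating IP while capturing traffic in the router and fip namespaces should show where the packets stop.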
<slaweq> ok, so that's all from my side about this issue  15:20
<slaweq> next one  15:20
<slaweq> ralonsoh to check periodic neutron-ovn-tempest-ovs-master-fedora job's failures  15:21
<maciejjozefczyk> slaweq, thanks  15:21
<ralonsoh> slaweq, I didn't have time for this one  15:21
<ralonsoh> sorry  15:21
<ralonsoh> (it's still on my todo list)  15:21
<slaweq> ralonsoh: no problem :)  15:21
<slaweq> #action ralonsoh to check periodic neutron-ovn-tempest-ovs-master-fedora job's failures  15:21
<slaweq> maciejjozefczyk: for what? :)  15:21
<slaweq> ok, lets move on  15:22
<slaweq> next topic  15:22
<slaweq> #topic Stadium projects  15:22
*** openstack changes topic to "Stadium projects (Meeting topic: neutron_ci)"  15:22
<slaweq> standardize on zuul v3  15:23
<slaweq> there was only slow progress this week  15:23
<slaweq> we almost merged the patch for bagpipe, but some new bug blocked it in the gate  15:23
<slaweq> anything else regarding stadium projects' ci?  15:24
<njohnston> nope  15:25
<njohnston> like you said, slow progress  15:25
<slaweq> ok, so lets move on  15:25
<slaweq> #topic Grafana  15:25
*** openstack changes topic to "Grafana (Meeting topic: neutron_ci)"  15:25
<slaweq> #link http://grafana.openstack.org/dashboard/db/neutron-failure-rate  15:25
<slaweq> I think we still have the same problems there:  15:27
<slaweq> 1. functional tests  15:27
<slaweq> 2. neutron-tempest-dvr  15:27
<slaweq> 3. grenade jobs  15:27
<slaweq> other than that, I think it looks good  15:27
<njohnston> agreed  15:27
<slaweq> e.g. neutron-tempest-plugin jobs (the voting ones) are at very low failure rates  15:27
<slaweq> tempest jobs are fine too  15:28
<slaweq> even fullstack jobs are pretty stable now  15:28
<slaweq> and that conclusion was confirmed when I was looking at some specific failures today  15:30
<slaweq> most of the time, it's those repeated problems  15:30
<slaweq> ok, but do You have anything else regarding grafana to add?  15:30
<ralonsoh> no  15:31
<njohnston> no  15:31
<bcafarel> all good  15:31
<slaweq> ok, so lets continue then  15:32
<slaweq> #topic fullstack/functional  15:32
*** openstack changes topic to "fullstack/functional (Meeting topic: neutron_ci)"  15:32
<slaweq> as I said, I only found those issues with the ncat and ping commands: https://storage.bhs.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_e30/705237/7/check/neutron-functional/e301c8d/testr_results.html  15:32
<slaweq> and https://2a0154cb9a3e47bde3ed-4a9629bf7847ad9c8b03c9755148c549.ssl.cf1.rackcdn.com/705660/4/check/neutron-functional/2e5030b/testr_results.html  15:33
<slaweq> ralonsoh: but shouldn't this issue with the ping command be solved by https://review.opendev.org/#/c/707452/ ?  15:33
<ralonsoh> IMO, this could be a similar problem to the ncat rootwrap one  15:34
<slaweq> ahh, ok  15:34
<ralonsoh> but of course, this patch modifies the ping commands and filters to match  15:34
<ralonsoh> (regardless of a possible problem in oslo rootwrap)  15:34
<slaweq> so if we hopefully resolve the rootwrap issue with the ncat command, this one should be fine too  15:34
<ralonsoh> probably  15:34
<ralonsoh> but the patch, IMO, is legit  15:35
<slaweq> ralonsoh: do You have a list of tests which are using the ping/ncat command and are failing due to that?  15:35
<slaweq> maybe we could mark them as unstable for now, if it's not many tests?  15:36
<ralonsoh> no, I don't have this list  15:36
<ralonsoh> what I did was find any "ping" usage in the code  15:36
<ralonsoh> and I defined a uniform way of calling this command  15:36
<slaweq> ok, lets wait for help from the oslo folks with that for now - maybe they will find the root cause of the problem  15:38
<slaweq> anything else related to functional/fullstack tests?  15:38
<ralonsoh> yes  15:38
<ralonsoh> https://bugs.launchpad.net/neutron/+bug/1828205/comments/4  15:39
<openstack> Launchpad bug 1828205 in neutron ""network-segment-ranges" doesn't return the project_id" [Medium,In progress] - Assigned to Rodolfo Alonso (rodolfo-alonso-hernandez)  15:39
<ralonsoh> (sorry, this is a tempest one)  15:39
<slaweq> ralonsoh: no problem  15:39
<slaweq> if we don't have anything else related to functional/fullstack, we can move to the tempest topic now :)  15:39
<slaweq> #topic Tempest/Scenario  15:40
*** openstack changes topic to "Tempest/Scenario (Meeting topic: neutron_ci)"  15:40
<slaweq> and please continue  15:40
<slaweq> :)  15:40
<ralonsoh> thanks  15:40
<ralonsoh> patch: https://review.opendev.org/#/c/707898/  15:40
<ralonsoh> please, take a look at c#4  15:40
<ralonsoh> not now, the comment is a bit long  15:40
<ralonsoh> (that's all)  15:40
<slaweq> ok, I will read it later today  15:41
<slaweq> and this patch of Yours, https://review.opendev.org/#/c/707898/, should hopefully resolve the mystery of the missing project_id field, right?  15:42
<ralonsoh> yes  15:42
<slaweq> great :)  15:42
<slaweq> at least one down :)  15:43
<slaweq> thx ralonsoh  15:43
<ralonsoh> this is due to the unfinished tenant_id->project_id migration  15:43
<ralonsoh> yw  15:43
<bcafarel> heh, it had been some time since the last issue about that migration  15:43
<slaweq> from other things, I pushed a patch to tempest to increase the timeout for the tempest-ipv6-only job: https://review.opendev.org/708635  15:44
<slaweq> I hope that the QA team will accept it :)  15:45
<slaweq> if not, I will reopen the patch in the neutron repo  15:45
<slaweq> and last thing:  15:45
<slaweq> I found one new, "interesting" failure: https://storage.gra.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_d53/708009/4/check/tempest-ipv6-only/d53c891/testr_results.html  15:46
<slaweq> did You see such issues before?  15:46
<ralonsoh> Details: {'type': 'RouterInUse', 'message': 'Router 86ed5e27-7dc9-4974-ac7c-4e2e1f0558f0 still has ports', 'detail': ''}  15:46
<ralonsoh> yes, a couple of times  15:46
<ralonsoh> the resource was not cleaned up properly  15:47
<slaweq> I don't think it's related to the patch on which it was run  15:47
<ralonsoh> I don't think so either  15:47
<slaweq> ralonsoh: so it seems that it's a tempest cleanup issue, right?  15:47
<ralonsoh> I think so  15:47
<slaweq> ok, so I will report it as a tempest bug  15:48
<slaweq> #action slaweq to report tempest bug with routers cleanup  15:48
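The RouterInUse failure above is an ordering problem: teardown attempted to delete the router while it still had ports. A minimal sketch of children-first cleanup ordering (pure Python; the resource names and the `children` mapping are hypothetical illustrations, not the tempest cleanup API):

```python
def deletion_order(children):
    """Return resources ordered so every child is deleted before its parent.

    `children` maps a resource name to the resources that must be removed
    first (e.g. a router's ports). Hypothetical data shape for
    illustration only.
    """
    order = []
    seen = set()

    def visit(res):
        if res in seen:
            return
        seen.add(res)
        for child in children.get(res, []):  # recurse into children first
            visit(child)
        order.append(res)                    # parent goes last

    for res in children:
        visit(res)
    return order

# Ports come out before the router itself, so a
# "Router ... still has ports" error cannot occur.
resources = {"router": ["port-1", "port-2"], "port-1": [], "port-2": []}
print(deletion_order(resources))  # ['port-1', 'port-2', 'router']
```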
<slaweq> and that's all from my side for today  15:48
<slaweq> anything else regarding scenario jobs?  15:48
<bcafarel> I saw a few timeouts on tempest plugin runs in stable branches today, but it may just be an infra/slow node issue  15:49
<bcafarel> rechecks in progress :)  15:49
<bcafarel> it was mostly the designate job  15:49
<slaweq> bcafarel: I saw some timeouts on the designate job on the master branch too  15:50
<slaweq> but as it was a timeout, I didn't check it further  15:50
<slaweq> lets hope it's just a slow nodes issue :)  15:50
<bcafarel> fingers crossed  15:51
<slaweq> :)  15:51
<slaweq> bcafarel: I think You wanted to raise some other topic at this meeting too, right?  15:51
<slaweq> #topic Open discussion  15:51
*** openstack changes topic to "Open discussion (Meeting topic: neutron_ci)"  15:51
<slaweq> if so, the floor is Yours :)  15:51
<ralonsoh> nothing from me  15:52
<bcafarel> There was the https://bugs.launchpad.net/neutron/+bug/1863830 bug, but liuyulong updated it after my initial run - it was a dup of the ncat issue in the end  15:52
<openstack> Launchpad bug 1863213 in neutron "duplicate for #1863830 Spawning of DHCP processes fail: invalid netcat options" [Undecided,New] - Assigned to Rodolfo Alonso (rodolfo-alonso-hernandez)  15:52
<slaweq> ahh, ok  15:53
<bcafarel> I just wasn't up to date on recent bugs :) (holidays have this effect)  15:53
<slaweq> lucky You :P  15:53
<bcafarel> and all good apart from that, glad to see the stable branches back in working order  15:53
<slaweq> ok, so I think we can finish a bit early today  15:53
<slaweq> thx for attending  15:54
<bcafarel> +1 :)  15:54
<bcafarel> o/  15:54
<slaweq> #endmeeting  15:54
*** openstack changes topic to "OpenStack Meetings || https://wiki.openstack.org/wiki/Meetings/"  15:54
<openstack> Meeting ended Wed Feb 19 15:54:12 2020 UTC.  Information about MeetBot at http://wiki.debian.org/MeetBot . (v 0.1.4)  15:54
<slaweq> o/  15:54
<openstack> Minutes:        http://eavesdrop.openstack.org/meetings/neutron_ci/2020/neutron_ci.2020-02-19-15.00.html  15:54
<openstack> Minutes (text): http://eavesdrop.openstack.org/meetings/neutron_ci/2020/neutron_ci.2020-02-19-15.00.txt  15:54
<openstack> Log:            http://eavesdrop.openstack.org/meetings/neutron_ci/2020/neutron_ci.2020-02-19-15.00.log.html  15:54
<ralonsoh> bye  15:54
<njohnston> o/  15:54
*** liuyulong has quit IRC  16:24
*** igordc has joined #openstack-meeting-3  16:59
*** igordc has quit IRC  17:46
*** e0ne has quit IRC  18:00
*** igordc has joined #openstack-meeting-3  18:22
*** jamesmcarthur has joined #openstack-meeting-3  18:39
*** jamesmcarthur_ has quit IRC  18:41
*** ralonsoh has quit IRC  18:52
*** maciejjozefczyk has quit IRC  19:02
*** jamesmcarthur has quit IRC  20:06
*** jamesmcarthur has joined #openstack-meeting-3  21:07
*** jamesmcarthur has quit IRC  21:07
*** jamesmcarthur has joined #openstack-meeting-3  21:07
*** jamesmcarthur has quit IRC  21:19
*** jamesmcarthur has joined #openstack-meeting-3  21:21
*** jamesmcarthur has quit IRC  21:26
*** raildo has quit IRC  21:28
*** jamesmcarthur has joined #openstack-meeting-3  21:56
*** jamesmcarthur has quit IRC  21:58
*** jamesmcarthur has joined #openstack-meeting-3  21:58
*** jamesmcarthur_ has joined #openstack-meeting-3  22:09
*** jamesmcarthur has quit IRC  22:13
*** slaweq has quit IRC  22:29
*** jamesmcarthur_ has quit IRC  23:32
*** jamesmcarthur has joined #openstack-meeting-3  23:33
*** jamesmcarthur has quit IRC  23:39

Generated by irclog2html.py 2.15.3 by Marius Gedminas - find it at mg.pov.lt!