Wednesday, 2020-02-05

00:45 *** jamesmcarthur has joined #openstack-meeting-3
01:00 *** jamesmcarthur has quit IRC
01:10 *** jamesmcarthur has joined #openstack-meeting-3
01:11 *** slaweq_ has joined #openstack-meeting-3
01:16 *** slaweq_ has quit IRC
01:22 *** jamesmcarthur has quit IRC
01:38 *** vesper has joined #openstack-meeting-3
01:39 *** vesper11 has quit IRC
01:45 *** jamesmcarthur has joined #openstack-meeting-3
01:52 *** igordc has joined #openstack-meeting-3
02:22 *** jamesmcarthur has quit IRC
03:11 *** slaweq_ has joined #openstack-meeting-3
03:12 *** apetrich has quit IRC
03:16 *** slaweq_ has quit IRC
04:43 *** links has joined #openstack-meeting-3
04:53 *** bnemec has joined #openstack-meeting-3
04:55 *** hongbin has joined #openstack-meeting-3
04:59 *** igordc has quit IRC
05:11 *** slaweq_ has joined #openstack-meeting-3
05:15 *** slaweq_ has quit IRC
05:28 *** hongbin has quit IRC
07:11 *** slaweq_ has joined #openstack-meeting-3
07:16 *** slaweq_ has quit IRC
08:00 *** slaweq_ has joined #openstack-meeting-3
08:09 *** slaweq__ has joined #openstack-meeting-3
08:10 *** slaweq_ has quit IRC
08:14 *** slaweq has joined #openstack-meeting-3
08:16 *** slaweq__ has quit IRC
08:42 *** ralonsoh has joined #openstack-meeting-3
08:48 *** apetrich has joined #openstack-meeting-3
08:48 *** slaweq_ has joined #openstack-meeting-3
08:48 *** slaweq has quit IRC
09:06 *** slaweq__ has joined #openstack-meeting-3
09:07 *** slaweq_ has quit IRC
09:25 *** slaweq__ is now known as slaweq
10:40 *** slaweq_ has joined #openstack-meeting-3
10:42 *** slaweq has quit IRC
10:57 *** e0ne has joined #openstack-meeting-3
11:01 *** slaweq__ has joined #openstack-meeting-3
11:03 *** slaweq_ has quit IRC
11:05 *** lkoranda has joined #openstack-meeting-3
11:17 *** bobmel has joined #openstack-meeting-3
11:30 *** psachin has joined #openstack-meeting-3
11:37 *** pcaruana has quit IRC
11:42 *** e0ne has quit IRC
11:50 *** pcaruana has joined #openstack-meeting-3
12:19 *** slaweq__ has quit IRC
12:23 *** slaweq__ has joined #openstack-meeting-3
12:27 *** e0ne has joined #openstack-meeting-3
12:43 *** raildo has joined #openstack-meeting-3
12:53 *** e0ne has quit IRC
13:05 *** e0ne has joined #openstack-meeting-3
13:11 *** slaweq has joined #openstack-meeting-3
13:13 *** slaweq__ has quit IRC
13:20 *** jamesmcarthur has joined #openstack-meeting-3
13:24 *** lpetrut has joined #openstack-meeting-3
13:37 *** jamesmcarthur has quit IRC
13:47 *** jamesmcarthur has joined #openstack-meeting-3
13:58 *** liuyulong has joined #openstack-meeting-3
13:58 *** lkoranda has quit IRC
13:58 *** bobmel has quit IRC
14:11 *** lkoranda has joined #openstack-meeting-3
14:17 *** pcaruana has quit IRC
14:25 *** psachin has quit IRC
14:37 *** bobmel has joined #openstack-meeting-3
14:50 *** links has quit IRC
15:00 <slaweq> #startmeeting neutron_ci
15:00 <openstack> Meeting started Wed Feb  5 15:00:13 2020 UTC and is due to finish in 60 minutes.  The chair is slaweq. Information about MeetBot at http://wiki.debian.org/MeetBot.
15:00 <slaweq> hi
15:00 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
15:00 *** openstack changes topic to " (Meeting topic: neutron_ci)"
15:00 <openstack> The meeting name has been set to 'neutron_ci'
15:00 <njohnston> o/
15:01 <slaweq> ralonsoh: bcafarel haleyb: CI meeting, are You around?
15:01 *** lkoranda has quit IRC
15:01 <ralonsoh> hi
15:01 <bcafarel> o/
15:01 <ralonsoh> I was waiting in the wrong channel
15:01 <bcafarel> slaweq: thanks for the ping, I was looking for the correct window :)
15:01 <haleyb> slaweq: i'm in another meeting too, have one eye here :)
15:01 <slaweq> :)
15:01 <slaweq> ok, let's start
15:01 <slaweq> Grafana dashboard: http://grafana.openstack.org/dashboard/db/neutron-failure-rate
15:02 <slaweq> please open it and we can move on
15:02 <slaweq> #topic Actions from previous meetings
15:02 *** openstack changes topic to "Actions from previous meetings (Meeting topic: neutron_ci)"
15:02 <slaweq> slaweq to talk with gmann about vpnaas jobs on rocky
15:02 <slaweq> tbh I forgot about it :/
15:02 <gmann> slaweq: I sent a summary of what amotoki and we discussed on the ML
15:02 <slaweq> maybe gmann is around now so we can ask him how to fix this issue with the vpnaas rocky tempest jobs
15:03 <slaweq> gmann: ok, so I need to find this email then :)
15:03 <gmann> #link http://lists.openstack.org/pipermail/openstack-discuss/2020-January/012241.html
15:03 <gmann> 4th point
15:04 <gmann> slaweq: ping me once you are done, then we can discuss further
15:07 <slaweq> gmann: so basically we should backport https://review.opendev.org/#/c/695834/ to rocky
15:07 <slaweq> or at least "partially" backport it
15:07 <slaweq> ?
15:07 <gmann> slaweq: yeah, that will be good for long-term maintenance and for how the py2 EOL things are happening
15:08 <slaweq> ok, and this will not be a problem if we have those tests in-tree but actually run them from the neutron-tempest-plugin repo?
15:08 <gmann> I am fixing the current stable branches with stable u-c to use in the tempest tox run, which solves the issue, but backporting that is what I will suggest in case of another issue
15:09 <gmann> true
15:09 <slaweq> ok, thx for the explanation
15:09 <gmann> summary of fixing the current stable branches - http://lists.openstack.org/pipermail/openstack-discuss/2020-February/012371.html
15:09 <slaweq> #action slaweq to backport https://review.opendev.org/#/c/695834/ to stable branches in neutron-vpnaas
15:09 <gmann> FYI, all stable branches till rocky are broken now
15:10 <bcafarel> sigh
15:10 <slaweq> :/
15:11 <slaweq> ok, let's move on
15:11 <slaweq> next action was
15:11 <slaweq> slaweq to update grafana dashboard with missing jobs
15:11 <slaweq> and I also forgot about it :/
15:11 <slaweq> #action slaweq to update grafana dashboard with missing jobs
15:11 <slaweq> I will do it this week
15:12 <slaweq> any questions/comments on this topic?
15:12 <njohnston> nope
15:13 <ralonsoh> no
15:13 <slaweq> so we can move on to the next topic
15:13 <slaweq> #topic Stadium projects
15:13 *** openstack changes topic to "Stadium projects (Meeting topic: neutron_ci)"
15:13 <slaweq> migration to zuulv3
15:13 <slaweq> https://etherpad.openstack.org/p/neutron-train-zuulv3-py27drop
15:13 <slaweq> I was checking this etherpad a few days ago, and I even sent some small patches related to it
15:13 <slaweq> (but it still needs some work)
15:14 <slaweq> and generally we are pretty good there
15:14 <slaweq> most of the legacy jobs already have patches in review
15:14 <njohnston> +1
15:15 *** e0ne has quit IRC
15:15 <slaweq> huge thx to bcafarel for sending many related patches :)
15:15 <bcafarel> np, some of them are still not working properly
15:16 <bcafarel> slaweq: as you know neutron-functional well, if you have some time please take a look at https://review.opendev.org/#/c/703601/
15:16 <bcafarel> I can't seem to convince it to install/find neutron :(
15:16 <bcafarel> (nothing urgent of course)
15:16 <slaweq> bcafarel: ok, I will take a look
15:17 <bcafarel> thanks :)
15:17 <slaweq> np
15:18 <slaweq> anything else related to the stadium projects' CI?
15:18 <njohnston> nope
15:18 <ralonsoh> no
15:19 <slaweq> ok, let's move on then
15:19 <slaweq> #topic Grafana
15:19 *** openstack changes topic to "Grafana (Meeting topic: neutron_ci)"
15:19 <slaweq> http://grafana.openstack.org/dashboard/db/neutron-failure-rate
15:21 <slaweq> first of all, our gate jobs have not been running for a few days
15:21 <slaweq> and it's mostly due to the broken neutron-ovn-tempest-ovs-release job
15:22 <slaweq> the second issue is that we have a gap in the data during last weekend and the beginning of this week
15:22 <slaweq> but that's probably an infra issue
15:23 <njohnston> Is neutron-ovn-tempest-ovs-release one of the missing jobs that needs to be updated in grafana?  I can't find it.
15:23 <slaweq> njohnston: yes
15:23 <slaweq> sorry for that :/
15:23 <njohnston> ok
15:23 <njohnston> np :-)
15:24 <slaweq> also it seems we had some problem yesterday, as many jobs got high numbers there
15:25 <slaweq> but I don't know about any specific issue from yesterday
15:26 <slaweq> but this spike can also be due to the low number of running jobs (or data stored) the day before
15:28 <slaweq> other than that I don't see anything really wrong
15:28 *** lpetrut has quit IRC
15:29 <slaweq> ok, let's move on then
15:29 <slaweq> #topic fullstack/functional
15:29 *** openstack changes topic to "fullstack/functional (Meeting topic: neutron_ci)"
15:29 <slaweq> I have a couple of issues in fullstack tests for today
15:29 <slaweq> Error when connecting to the placement service (same as last week):
15:29 <slaweq> https://storage.bhs.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_f66/703143/3/check/neutron-fullstack/f667c93/controller/logs/dsvm-fullstack-logs/TestPlacementBandwidthReport.test_configurations_are_synced_towards_placement_NIC-Switch-agent_/neutron-server--2020-02-04--12-09-12-759753_log.txt
15:30 <slaweq> maybe lajoskatona or rubasov could take a look at it
15:30 <slaweq> I pinged rubasov to join this meeting
15:31 <slaweq> maybe he will join soon
15:31 *** rubasov has joined #openstack-meeting-3
15:31 <rubasov> hi
15:31 <slaweq> hi rubasov
15:32 <slaweq> recently we spotted a few times an issue like in https://storage.bhs.cloud.ovh.net/v1/AUTH_dcaab5e32b234d56b626f72581e3644c/zuul_opendev_logs_f66/703143/3/check/neutron-fullstack/f667c93/controller/logs/dsvm-fullstack-logs/TestPlacementBandwidthReport.test_configurations_are_synced_towards_placement_NIC-Switch-agent_/neutron-server--2020-02-04--12-09-12-759753_log.txt
15:32 <slaweq> neutron-server can't connect to the (fake) placement service
15:32 <slaweq> did You maybe see something like that before?
15:33 <slaweq> or do You know why it could happen?
15:33 <rubasov> did not see this before
15:35 <slaweq> rubasov: can You try to take a look at that?
15:35 <rubasov> I don't really have ideas right now
15:35 <slaweq> not today of course :) but if You have some time
15:35 <rubasov> sure, we'll look into it with lajoskatona
15:35 <rubasov> he wrote that fake placement service originally IIRC
15:36 <rubasov> how frequent is this?
15:36 <slaweq> it's not very frequent, I saw it once per week or something like that
15:37 <slaweq> rubasov: I will open an LP to track it
15:37 <rubasov> okay, I'll put it on my todo list
15:37 <slaweq> and I will send it to You and Lajos - maybe You will have some time to take a look
15:38 <rubasov> that's even better, thank you
15:38 <slaweq> #action slaweq to open LP related to fullstack placement issue
15:38 <slaweq> thx rubasov
15:38 <slaweq> ok, another issue
15:38 <slaweq> in https://c355270b22583c2d2af0-42801c5a43c64ea303a559bec7f7cdd7.ssl.cf5.rackcdn.com/705903/2/check/neutron-fullstack/79cf197/controller/logs/dsvm-fullstack-logs/TestBwLimitQoSOvs.test_bw_limit_qos_policy_rule_lifecycle_egress_/neutron-server--2020-02-05--11-00-19-067345_log.txt
15:38 <rubasov> thanks
15:38 <slaweq> it seems to me like neutron-server just hung and that caused the test timeout
15:39 <slaweq> actually no, it was also a connection issue: https://c355270b22583c2d2af0-42801c5a43c64ea303a559bec7f7cdd7.ssl.cf5.rackcdn.com/705903/2/check/neutron-fullstack/79cf197/controller/logs/dsvm-fullstack-logs/TestBwLimitQoSOvs.test_bw_limit_qos_policy_rule_lifecycle_egress_.txt
15:39 <slaweq> this time to neutron-server
15:40 *** ianychoi_ is now known as ianychoi
15:40 <njohnston> well, if the neutron server crashed hard then the log would end abruptly like that and ECONNREFUSED is what clients would see
15:41 <njohnston> if it was a hang then the clients would have timeouts
15:41 <slaweq> njohnston: but at the end of the logs You can see
15:41 <slaweq> 2020-02-05 11:01:19.086 22341 DEBUG neutron.agent.linux.utils [-] Running command: ['kill', '-15', '3180'] create_process /home/zuul/src/opendev.org/openstack/neutron/neutron/agent/linux/utils.py:87
15:41 <slaweq> 2020-02-05 11:01:19.280 22341 DEBUG neutron.tests.fullstack.resources.process [-] Process stopped: neutron-server stop /home/zuul/src/opendev.org/openstack/neutron/neutron/tests/fullstack/resources/process.py:85
15:42 <slaweq> so it seems that neutron-server was properly stopped at the end
15:42 <slaweq> if it had crashed earlier, wouldn't this be an error?
15:42 <ralonsoh> I think so...
15:43 <njohnston> which log is that in?  I don't see it in https://c355270b22583c2d2af0-42801c5a43c64ea303a559bec7f7cdd7.ssl.cf5.rackcdn.com/705903/2/check/neutron-fullstack/79cf197/controller/logs/dsvm-fullstack-logs/TestBwLimitQoSOvs.test_bw_limit_qos_policy_rule_lifecycle_egress_/neutron-server--2020-02-05--11-00-19-067345_log.txt
15:43 <slaweq> njohnston: it's in https://c355270b22583c2d2af0-42801c5a43c64ea303a559bec7f7cdd7.ssl.cf5.rackcdn.com/705903/2/check/neutron-fullstack/79cf197/controller/logs/dsvm-fullstack-logs/TestBwLimitQoSOvs.test_bw_limit_qos_policy_rule_lifecycle_egress_.txt
15:43 <slaweq> this is the "test log"
15:44 <njohnston> ok
15:46 <ralonsoh> slaweq, do you have a bug for this one?
15:46 <ralonsoh> I can review it later
15:46 <slaweq> ralonsoh: nope, I saw it only once so far and I didn't open a bug for it
15:46 <slaweq> but I can
15:46 <ralonsoh> (at least this is not a QoS error)
15:47 <slaweq> I will ping You when I open an LP for that
15:47 <ralonsoh> thanks for the "present"
15:47 <slaweq> #action slaweq to open LP related to "hanging" neutron-server
15:47 <slaweq> ralonsoh: yw :D
15:47 <slaweq> and that's all I have for today for functional/fullstack jobs
15:48 <slaweq> anything else You have maybe?
15:48 <ralonsoh> slaweq, https://review.opendev.org/#/c/705760/ is almost merged
15:48 <ralonsoh> I'll abandon https://review.opendev.org/#/c/705903/
15:49 <slaweq> ralonsoh: great
15:49 <bcafarel> so, recheck time once 705760 is in?
15:49 <slaweq> that should hopefully unblock our gate
15:49 *** pcaruana has joined #openstack-meeting-3
15:50 <njohnston> excellent
15:51 <slaweq> ok, so we move on to scenario/tempest tests now
15:51 <slaweq> #topic Tempest/Scenario
15:51 *** openstack changes topic to "Tempest/Scenario (Meeting topic: neutron_ci)"
15:51 <slaweq> we already mentioned the broken neutron-ovn-tempest-ovs-release job, which should be fixed with 705760
15:53 <slaweq> among other issues I have one with test_show_network_segment_range https://9f9aee74b45263b2a9d8-795792c1f104e79962e44448ab55e3f1.ssl.cf1.rackcdn.com/681466/2/check/neutron-tempest-plugin-api/b340c8e/testr_results.html
15:53 *** e0ne has joined #openstack-meeting-3
15:53 <slaweq> and I think I have seen something similar a couple of times already
15:54 <slaweq> I'm not sure if it was always the same test, but it was a similar issue for sure
15:54 <ralonsoh> again?
15:54 <ralonsoh> it is the same one
15:54 <bcafarel> KeyError on project_id??
15:54 <ralonsoh> I really don't understand why this specific key is not present
15:54 <ralonsoh> and this is not a trivial one, but project_id
15:55 <slaweq> yes, I also don't understand it
15:55 <slaweq> wait, actually this one was from 21.01
15:56 <slaweq> so maybe it's an old issue
15:56 <slaweq> sorry for the noise then :)
15:57 <ralonsoh> no, but this test error is recurrent
15:57 <slaweq> yes, and actually I don't understand it exactly
15:58 <slaweq> if You look at the code: https://github.com/openstack/neutron-tempest-plugin/blob/master/neutron_tempest_plugin/api/admin/test_network_segment_range.py#L201
15:58 <slaweq> it failed after checking "id", "name" and other attributes
15:59 <slaweq> so it's not like the dict is empty
15:59 <ralonsoh> one question
15:59 <slaweq> there is "only" project_id missing from it
15:59 <ralonsoh> this test is using neutron-client
15:59 <ralonsoh> not os-client
15:59 <ralonsoh> is that correct?
15:59 <slaweq> idk
16:00 <slaweq> but IMO those tests are using tempest clients, no?
16:01 <ralonsoh> Ok, I'll check it
16:01 <slaweq> thx ralonsoh
16:01 <slaweq> #action ralonsoh to check missing project_id issue
16:01 <ralonsoh> we've run out of time
16:01 <slaweq> ok, we are out of time today
16:01 <slaweq> thx for attending
16:01 <slaweq> o/
16:01 <ralonsoh> bye
16:01 <njohnston> \o
16:01 <slaweq> #endmeeting
16:01 *** openstack changes topic to "OpenStack Meetings || https://wiki.openstack.org/wiki/Meetings/"
16:01 <openstack> Meeting ended Wed Feb  5 16:01:35 2020 UTC.  Information about MeetBot at http://wiki.debian.org/MeetBot . (v 0.1.4)
16:01 <openstack> Minutes:        http://eavesdrop.openstack.org/meetings/neutron_ci/2020/neutron_ci.2020-02-05-15.00.html
16:01 <openstack> Minutes (text): http://eavesdrop.openstack.org/meetings/neutron_ci/2020/neutron_ci.2020-02-05-15.00.txt
16:01 <openstack> Log:            http://eavesdrop.openstack.org/meetings/neutron_ci/2020/neutron_ci.2020-02-05-15.00.log.html
16:01 <bcafarel> o/
16:04 *** bobmel has quit IRC
16:04 *** ralonsoh has left #openstack-meeting-3
16:05 *** bobmel has joined #openstack-meeting-3
16:10 *** bobmel has quit IRC
16:17 *** e0ne has quit IRC
16:23 *** slaweq_ has joined #openstack-meeting-3
16:24 *** jamesmcarthur has quit IRC
16:25 *** slaweq has quit IRC
16:38 *** psachin has joined #openstack-meeting-3
16:38 *** jamesmcarthur has joined #openstack-meeting-3
16:41 *** jamesmcarthur_ has joined #openstack-meeting-3
16:44 *** jamesmcarthur has quit IRC
16:55 *** jamesmcarthur_ has quit IRC
16:56 *** e0ne has joined #openstack-meeting-3
16:58 *** psachin has quit IRC
16:58 *** jamesmcarthur has joined #openstack-meeting-3
17:05 *** e0ne has quit IRC
17:05 *** jamesmcarthur has quit IRC
17:09 *** e0ne has joined #openstack-meeting-3
17:27 *** jamesmcarthur has joined #openstack-meeting-3
17:34 *** igordc has joined #openstack-meeting-3
17:56 *** jamesmcarthur has quit IRC
18:03 *** jamesmcarthur has joined #openstack-meeting-3
18:27 *** e0ne has quit IRC
18:54 *** jamesmcarthur has quit IRC
19:15 *** jamesmcarthur has joined #openstack-meeting-3
19:45 *** bobmel has joined #openstack-meeting-3
20:21 *** jamesmcarthur has quit IRC
20:30 *** raildo has quit IRC
20:33 *** raildo has joined #openstack-meeting-3
20:36 *** e0ne has joined #openstack-meeting-3
20:44 *** e0ne has quit IRC
20:44 *** slaweq_ has quit IRC
20:47 *** slaweq_ has joined #openstack-meeting-3
21:15 *** slaweq_ has quit IRC
21:41 *** liuyulong has quit IRC
21:47 *** bobmel has quit IRC
22:37 *** raildo has quit IRC
22:55 *** bobmel has joined #openstack-meeting-3
22:56 *** bobmel has quit IRC
23:03 *** jamesmcarthur has joined #openstack-meeting-3
23:06 *** jamesmcarthur has quit IRC
