Wednesday, 2018-10-10

*** agopi|brb has quit IRC00:01
*** rlandy is now known as rlandy|bbl00:01
*** itlinux has joined #tripleo00:03
*** hamzy_ has joined #tripleo00:05
*** mjturek has quit IRC00:06
*** ooolpbot has joined #tripleo00:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION00:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671000:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675600:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]00:10
*** ooolpbot has quit IRC00:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host='localhost', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)00:10
*** agopi|brb has joined #tripleo00:25
*** agopi|brb is now known as agopi00:27
*** ade_lee has joined #tripleo00:28
*** rh-jelabarre has quit IRC00:35
*** rh-jelabarre has joined #tripleo00:40
*** rh-jelabarre has quit IRC00:54
*** ooolpbot has joined #tripleo01:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION01:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671001:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675601:10
*** ooolpbot has quit IRC01:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]01:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host='localhost', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)01:10
*** bugzy_ has joined #tripleo01:30
*** mrsoul has joined #tripleo01:32
*** bugzy has quit IRC01:33
*** flaper87 has quit IRC01:37
*** ade_lee has quit IRC01:37
*** d0ugal has quit IRC01:37
*** tristanC has quit IRC01:37
*** dtrainor has quit IRC01:37
*** mugsie has quit IRC01:37
*** jroll has quit IRC01:37
*** Tengu has quit IRC01:37
*** ajo has quit IRC01:37
*** odyssey4me has quit IRC01:37
*** bugzy_ has quit IRC01:37
*** agopi has quit IRC01:37
*** dmellado has quit IRC01:37
*** rlandy|bbl has quit IRC01:37
*** jhebden has quit IRC01:37
*** dmacpher_ has quit IRC01:37
*** Hobbestigrou has quit IRC01:37
*** zaneb has quit IRC01:37
*** csatari has quit IRC01:37
*** jbadiapa has quit IRC01:37
*** tzumainn has quit IRC01:37
*** spsurya has quit IRC01:37
*** sdake has quit IRC01:37
*** errr has quit IRC01:37
*** homegrown has quit IRC01:37
*** vkmc has quit IRC01:37
*** sdoran has quit IRC01:37
*** mgagne has quit IRC01:37
*** Ng has quit IRC01:37
*** StevenK has quit IRC01:37
*** hamzy_ has quit IRC01:37
*** sileht has quit IRC01:37
*** gouthamr has quit IRC01:37
*** mrunge has quit IRC01:37
*** ssbarnea_ has quit IRC01:37
*** KingJ has quit IRC01:37
*** zzzeek_ has quit IRC01:37
*** fpan has quit IRC01:37
*** bandini has quit IRC01:37
*** dr_gogeta86 has quit IRC01:37
*** beagles has quit IRC01:37
*** aedc has quit IRC01:37
*** sai_p has quit IRC01:37
*** florianf has quit IRC01:37
*** sshnaidm|afk has quit IRC01:37
*** jistr has quit IRC01:37
*** jpena|off has quit IRC01:37
*** PhilSliderS has quit IRC01:37
*** dtantsur|afk has quit IRC01:37
*** mfedosin has quit IRC01:37
*** thrash has quit IRC01:37
*** mwhahaha has quit IRC01:37
*** EmilienM has quit IRC01:37
*** Hazelesque has quit IRC01:37
*** rajinir has quit IRC01:37
*** panda has quit IRC01:37
*** agurenko has quit IRC01:37
*** zul has quit IRC01:37
*** artom has quit IRC01:37
*** abishop has quit IRC01:37
*** hjensas has quit IRC01:37
*** tbarron has quit IRC01:37
*** dhellmann has quit IRC01:37
*** Damjanek has quit IRC01:37
*** marios has quit IRC01:37
*** redrobot has quit IRC01:37
*** tonyb has quit IRC01:37
*** lennyb has quit IRC01:37
*** bnemec has quit IRC01:37
*** cgoncalves has quit IRC01:37
*** rook has quit IRC01:37
*** lhinds has quit IRC01:37
*** egonzalez has quit IRC01:37
*** fungi has quit IRC01:37
*** shadower has quit IRC01:37
*** mnasiadka has quit IRC01:37
*** melwitt has quit IRC01:37
*** marios|rover has quit IRC01:37
*** faceman has quit IRC01:37
*** therve has quit IRC01:37
*** jamesdenton has quit IRC01:37
*** PagliaccisCloud has quit IRC01:37
*** mnaser has quit IRC01:37
*** TheJulia has quit IRC01:37
*** itlinux has quit IRC01:37
*** chem` has quit IRC01:37
*** jaosorior has quit IRC01:37
*** bdodd_ has quit IRC01:37
*** iurygregory has quit IRC01:37
*** safchain has quit IRC01:37
*** dalvarez has quit IRC01:37
*** lucasagomes has quit IRC01:37
*** kopecmartin|ruck has quit IRC01:37
*** akrzos has quit IRC01:37
*** mrsoul has quit IRC01:37
*** mburned_out has quit IRC01:37
*** jjoyce has quit IRC01:37
*** ssbarnea has quit IRC01:37
*** mcarden has quit IRC01:37
*** weshay has quit IRC01:37
*** mmedvede has quit IRC01:37
*** rodrigods has quit IRC01:37
*** rnoriega has quit IRC01:37
*** nhicher has quit IRC01:37
*** percevalbot has quit IRC01:37
*** mandre has quit IRC01:37
*** spotz has quit IRC01:37
*** jillr has quit IRC01:37
*** Jeffrey4l has quit IRC01:37
*** pliu has quit IRC01:37
*** dansmith has quit IRC01:37
*** quiquell|off has quit IRC01:37
*** mschuppert has quit IRC01:37
*** owalsh_away has quit IRC01:37
*** mmethot has quit IRC01:37
*** matbu has quit IRC01:37
*** mjblack has quit IRC01:37
*** radez has quit IRC01:37
*** amoralej|off has quit IRC01:37
*** cschwede has quit IRC01:37
*** leifmadsen has quit IRC01:37
*** arxcruz has quit IRC01:37
*** toure has quit IRC01:37
*** gchamoul has quit IRC01:37
*** honza has quit IRC01:37
*** rickflare has quit IRC01:37
*** ansmith has quit IRC01:37
*** rcernin has quit IRC01:37
*** stevebaker has quit IRC01:37
*** jidar has quit IRC01:37
*** chandankumar has quit IRC01:37
*** ianw has quit IRC01:37
*** dsneddon_away has quit IRC01:37
*** numans has quit IRC01:37
*** bcafarel has quit IRC01:37
*** haleyb has quit IRC01:37
*** dpeacock has quit IRC01:37
*** mpjetta has quit IRC01:37
*** trown|outtypewww has quit IRC01:37
*** andreaf has quit IRC01:37
*** lblanchard has quit IRC01:37
*** naturalblue has quit IRC01:37
*** adrianreza has quit IRC01:37
*** rascasoft has quit IRC01:37
*** ericyoung has quit IRC01:37
*** ChanServ has quit IRC01:37
*** trown|outtypewww has joined #tripleo01:43
*** andreaf has joined #tripleo01:43
*** mpjetta has joined #tripleo01:43
*** dpeacock has joined #tripleo01:43
*** haleyb has joined #tripleo01:43
*** bcafarel has joined #tripleo01:43
*** dsneddon_away has joined #tripleo01:43
*** numans has joined #tripleo01:43
*** ianw has joined #tripleo01:43
*** chandankumar has joined #tripleo01:43
*** jidar has joined #tripleo01:43
*** stevebaker has joined #tripleo01:43
*** rcernin has joined #tripleo01:43
*** ansmith has joined #tripleo01:43
*** odyssey4me has joined #tripleo01:43
*** ajo has joined #tripleo01:43
*** Tengu has joined #tripleo01:43
*** jroll has joined #tripleo01:43
*** dtrainor has joined #tripleo01:43
*** d0ugal has joined #tripleo01:43
*** mugsie has joined #tripleo01:43
*** ade_lee has joined #tripleo01:43
*** honza has joined #tripleo01:43
*** toure has joined #tripleo01:43
*** arxcruz has joined #tripleo01:43
*** rickflare has joined #tripleo01:43
*** gchamoul has joined #tripleo01:43
*** leifmadsen has joined #tripleo01:43
*** amoralej|off has joined #tripleo01:43
*** cschwede has joined #tripleo01:43
*** radez has joined #tripleo01:43
*** mjblack has joined #tripleo01:43
*** matbu has joined #tripleo01:43
*** mmethot has joined #tripleo01:43
*** owalsh_away has joined #tripleo01:43
*** quiquell|off has joined #tripleo01:43
*** dansmith has joined #tripleo01:43
*** pliu has joined #tripleo01:43
*** Jeffrey4l has joined #tripleo01:43
*** jillr has joined #tripleo01:43
*** spotz has joined #tripleo01:43
*** mandre has joined #tripleo01:43
*** percevalbot has joined #tripleo01:43
*** nhicher has joined #tripleo01:43
*** rnoriega has joined #tripleo01:43
*** rodrigods has joined #tripleo01:43
*** mmedvede has joined #tripleo01:43
*** weshay has joined #tripleo01:43
*** mcarden has joined #tripleo01:43
*** ssbarnea has joined #tripleo01:43
*** jjoyce has joined #tripleo01:43
*** mburned_out has joined #tripleo01:43
*** mrsoul has joined #tripleo01:43
*** Ng has joined #tripleo01:44
*** StevenK has joined #tripleo01:44
*** card.freenode.net sets mode: +v Ng01:44
*** egonzalez has joined #tripleo01:44
*** bnemec has joined #tripleo01:44
*** cgoncalves has joined #tripleo01:44
*** rook has joined #tripleo01:44
*** lhinds has joined #tripleo01:44
*** fungi has joined #tripleo01:44
*** shadower has joined #tripleo01:44
*** mnasiadka has joined #tripleo01:44
*** melwitt has joined #tripleo01:44
*** hamzy_ has joined #tripleo01:44
*** sileht has joined #tripleo01:44
*** gouthamr has joined #tripleo01:44
*** mrunge has joined #tripleo01:44
*** ssbarnea_ has joined #tripleo01:44
*** KingJ has joined #tripleo01:44
*** zzzeek_ has joined #tripleo01:44
*** bandini has joined #tripleo01:44
*** fpan has joined #tripleo01:44
*** dr_gogeta86 has joined #tripleo01:44
*** beagles has joined #tripleo01:44
*** flaper87 has joined #tripleo01:44
*** lblanchard has joined #tripleo01:45
*** naturalblue has joined #tripleo01:45
*** adrianreza has joined #tripleo01:45
*** rascasoft has joined #tripleo01:45
*** ericyoung has joined #tripleo01:45
*** aedc has joined #tripleo01:45
*** sai_p has joined #tripleo01:45
*** mfedosin has joined #tripleo01:45
*** florianf has joined #tripleo01:45
*** sshnaidm|afk has joined #tripleo01:45
*** jistr has joined #tripleo01:45
*** jpena|off has joined #tripleo01:45
*** PhilSliderS has joined #tripleo01:45
*** dtantsur|afk has joined #tripleo01:45
*** rajinir has joined #tripleo01:45
*** thrash has joined #tripleo01:45
*** mwhahaha has joined #tripleo01:45
*** EmilienM has joined #tripleo01:45
*** card.freenode.net sets mode: +ovov mwhahaha mwhahaha EmilienM EmilienM01:45
*** Hazelesque has joined #tripleo01:45
*** zaneb has joined #tripleo01:45
*** csatari has joined #tripleo01:45
*** bugzy_ has joined #tripleo01:46
*** agopi has joined #tripleo01:46
*** dmellado has joined #tripleo01:46
*** rlandy|bbl has joined #tripleo01:46
*** jhebden has joined #tripleo01:46
*** dmacpher_ has joined #tripleo01:46
*** Hobbestigrou has joined #tripleo01:46
*** jamesdenton has joined #tripleo01:46
*** PagliaccisCloud has joined #tripleo01:46
*** mnaser has joined #tripleo01:46
*** TheJulia has joined #tripleo01:46
*** marios|rover has joined #tripleo01:46
*** faceman has joined #tripleo01:46
*** therve has joined #tripleo01:46
*** panda has joined #tripleo01:46
*** agurenko has joined #tripleo01:46
*** zul has joined #tripleo01:46
*** artom has joined #tripleo01:46
*** abishop has joined #tripleo01:46
*** hjensas has joined #tripleo01:46
*** tbarron has joined #tripleo01:46
*** dhellmann has joined #tripleo01:46
*** Damjanek has joined #tripleo01:46
*** marios has joined #tripleo01:46
*** redrobot has joined #tripleo01:46
*** tonyb has joined #tripleo01:46
*** lennyb has joined #tripleo01:46
*** chem` has joined #tripleo01:46
*** jaosorior has joined #tripleo01:46
*** bdodd_ has joined #tripleo01:46
*** safchain has joined #tripleo01:46
*** lucasagomes has joined #tripleo01:46
*** dalvarez has joined #tripleo01:46
*** kopecmartin|ruck has joined #tripleo01:46
*** akrzos has joined #tripleo01:46
*** card.freenode.net sets mode: +o jaosorior01:46
*** dmacpher_ has quit IRC01:46
*** agopi has quit IRC01:47
*** rlandy|bbl has quit IRC01:47
*** ChanServ has joined #tripleo01:47
*** card.freenode.net sets mode: +o ChanServ01:47
*** agopi has joined #tripleo01:49
*** mgagne has joined #tripleo01:49
*** rlandy has joined #tripleo01:49
*** dmacpher has joined #tripleo01:49
*** mcornea has joined #tripleo01:51
*** itlinux has joined #tripleo01:54
*** ooolpbot has joined #tripleo02:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION02:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671002:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]02:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675602:10
*** ooolpbot has quit IRC02:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host='localhost', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)02:10
*** rlandy has quit IRC02:21
*** mcornea has quit IRC02:34
*** lblanchard has quit IRC02:56
*** ramishra has joined #tripleo02:57
*** psachin has joined #tripleo02:57
*** dtrainor has quit IRC02:59
*** ooolpbot has joined #tripleo03:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION03:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671003:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675603:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]03:10
*** ooolpbot has quit IRC03:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host='localhost', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)03:10
*** jaganathan has joined #tripleo03:26
*** dtrainor has joined #tripleo03:35
*** dtrainor_ has joined #tripleo03:42
*** dtrainor_ has quit IRC03:43
*** dtrainor_ has joined #tripleo03:43
*** dtrainor has quit IRC03:43
*** udesale has joined #tripleo03:52
*** ooolpbot has joined #tripleo04:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION04:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671004:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]04:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675604:10
*** ooolpbot has quit IRC04:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host='localhost', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)04:10
*** janki has joined #tripleo04:29
*** gkadam has joined #tripleo04:42
*** spsurya has joined #tripleo04:45
*** agurenko has quit IRC04:58
*** apetrich has joined #tripleo05:09
*** ooolpbot has joined #tripleo05:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION05:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671005:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675605:10
*** ooolpbot has quit IRC05:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]05:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host='localhost', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)05:10
*** shyam89 has joined #tripleo05:11
*** ratailor has joined #tripleo05:11
*** slaweq has joined #tripleo05:11
*** quiquell|off is now known as quiquell05:22
quiquellGood morning05:24
ratailorjaosorior, you around ?05:27
*** jtomasek has joined #tripleo05:32
Tenguhello there05:34
quiquellTengu: o/05:34
Tenguhow are you today, quiquell ?05:34
quiquellTengu: Waiting for stuff to merge, but gates are broken now05:36
Tenguerf05:36
Tenguagain05:36
quiquellTengu: As always you can be of good help, did the coffee already arrive to your brain ?05:36
Tengunope, wait, getting one XD05:36
quiquellTengu: pasting random stuff here, you can start to read when you are coffed up05:37
quiquellSo the issue at gates is at tripleo-puppet with nova_libvirt http://logs.openstack.org/45/560445/160/check/tripleo-ci-centos-7-scenario002-multinode-oooq-container/5d0f2b8/logs/undercloud/home/zuul/overcloud_deploy.log.txt.gz#_2018-10-10_02_02_4605:38
quiquellI see stuff like this Exec[set libvirt sasl credentials](provider=posix): Cannot understand environment setting \"TLS_PASSWORD=\"",05:39
Tenguok, so that augeas thingy apparently05:40
ratailorCan anybody suggest whether I should backport this https://review.openstack.org/#/c/557323/ to queens, as on this https://review.openstack.org/#/c/608581 lint is failing in queens because of a variable being used without an explicit namespace in line#58305:40
Tenguquiquell: lemme check the code, I see the Error is more in Error: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]: Could not evaluate: Saving failed, see debug05:40
ratailoror should I just remove the if block on line#583 and move on ?05:40
Tenguthis is the failing thingy. the exec is just a warning.05:40
quiquellTengu: maybe around this http://git.openstack.org/cgit/openstack/puppet-nova/tree/manifests/compute/libvirt/qemu.pp#n96 ?05:41
quiquellTengu: this ssl thingy is new https://github.com/openstack/puppet-nova/commit/aa2893d7e07e0dc37935e602fcdbca99b102d35e05:41
*** shyam89 has quit IRC05:42
Tenguquiquell: well, there are 2 blocks with the same name, might be line 102 as well05:43
*** agurenko has joined #tripleo05:43
quiquellTengu: So it cannot change qemu conf05:44
Tengubasically, yes.05:44
quiquellTengu: we have also this '2018-10-10 01:58:35,216 ERROR: 25956 -- + mkdir -p /etc/puppet", '05:44
quiquellTengu: Maybe it cannot generated because there is no tls_password var05:45
Tengu2s05:47
Tenguhttps://github.com/openstack/puppet-tripleo/blob/master/manifests/profile/base/nova/libvirt.pp#L11305:48
Tenguwondering if this isn't an issue - but as said it's a warning.05:48
Tenguthe real issue is the "Error" just following it.05:48
quiquellTengu: Maybe it's a transitive dependency on it05:49
Tenguandof course we can't get the debug of augeas as is.05:49
*** jbadiapa has joined #tripleo05:49
* Tengu starts his Beast05:49
TenguI'll re-play that run.05:50
quiquellTengu: do you have like a tripleo friendly Beast under your table ?05:50
Tenguyep05:50
Tenguquiquell: https://share.tengu.ch/workplace/photo5906792561853443946.jpg05:50
Tengumeet the epyc and its 64G ram plus some hard drives/ssd05:51
quiquellBlack Beast... it should wake up all your neighbours05:51
Tengunope - really silent :).05:51
Tenguthe noctua thingy on the CPU is a breeze05:52
Tenguand all the fans in the case have the owl-like shape for silent work.05:52
quiquellTengu: so, this sasl command cannot be executed because tls_password is missing: 'saslpasswd2 -d -a libvirt -u overcloud migration'05:52
Tengureally convenient.05:52
Tenguquiquell: nah05:52
Tenguquiquell: the saslpasswd2 is executed as expected.05:52
Tenguit just doesn't pass the environment TLS_PASSWORD as it's empty05:53
quiquellTengu: but without TLS_PASSWORD it will do nothing ?05:53
Tenguquiquell: https://github.com/openstack/puppet-tripleo/blob/master/manifests/profile/base/nova/libvirt.pp#L10705:53
Tenguvs https://github.com/openstack/puppet-tripleo/blob/master/manifests/profile/base/nova/libvirt.pp#L9505:53
*** janki has quit IRC05:53
Tengufirst one is without password, second one is with tls_password thingy05:53
Tenguso basically the commands are different.05:53
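
A minimal bash sketch of the two command shapes being compared here; the exact Exec bodies live in the linked libvirt.pp, and the password value below is only a placeholder:

    # Without a TLS password the SASL credential is simply deleted,
    # as seen in the pasted deploy log:
    saslpasswd2 -d -a libvirt -u overcloud migration

    # With a TLS password the command runs with TLS_PASSWORD in its
    # environment and feeds the password on stdin (-p = read from stdin).
    # 'changeme' is a placeholder value:
    TLS_PASSWORD='changeme' \
        sh -c 'printf "%s" "$TLS_PASSWORD" | saslpasswd2 -p -a libvirt -u overcloud migration'
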
TenguI would not obsess on that warning, but get more details on the Augeas issue right after that.05:54
quiquellTengu: ack, so red herring05:54
*** janki has joined #tripleo05:54
Tenguif the exec was a failure, puppet would crash at that point, issue an "Error" and show the command output.05:54
Tenguunfortunately, it's augeas failing, and that nasty boy doesn't output anything -.-05:54
quiquellTengu: what's augeas ?05:55
Tenguquiquell: a wonderful yet horrible thing05:55
quiquellTengu: And it kind of sarcastic "See debug" :-)05:55
Tenguquiquell: its goal is to allow seamless editing of any file using "lenses", basically a file describing the file format we want05:55
Tenguit's truly powerful. but at the same time, a real pain in the a** to debug :'(05:56
TenguI'd rather have the Exec fail rather than augeas, even if I don't know sasl XD05:56
Tengusoo.05:57
*** shyam89 has joined #tripleo05:58
Tengucome here little beast :). I'm launching the reproducer, that's the best thing to do.05:58
Tenguquiquell: although..... I doubt the issue is related to https://github.com/openstack/puppet-nova/commit/0c54e9becb362c24e4e322ab75b885fbb6691e4e#diff-e9f81c8cca336ab747bef432d275854f06:01
Tenguquiquell: reason: 29 days old. Might be more related to some qemu update?06:02
Tenguor something like that06:02
Tenguis there a way to get some versions for it?06:02
quiquellTengu: sure06:02
quiquellTengu: puppet-nova-14.0.0-0.20181009210802.aa2893d.el7.noarch06:03
quiquellTengu: puppet-tripleo-10.0.0-0.20181007013831.d212ee6.el7.noarch06:03
Tenguquiquell: what about qemu and its friends?06:03
quiquellTengu: qemu-img-ev-2.10.0-21.el7_5.4.1.x86_6406:04
quiquellTengu: Does it have friends ?06:04
Tenguthere are some deps and libs I think, yeah. anyway, do you know when that one was introduced?06:04
quiquellTengu: let me compare with a good one06:05
Tenguthanks :)06:05
TenguI'll ensure the file actually exists in the container - might be something to do with permissions, in the end.06:05
Tenguthe "augeas failed ..." means many things: wrong format, permission denied, and probably some other issues I can't think of.06:06
quiquellTengu: Could be related to this https://bugs.launchpad.net/tripleo/+bug/1796764  ?06:06
openstackLaunchpad bug 1796764 in tripleo "undercloud install failing for containers-multinode in the gate "OperationalError: (sqlite3.OperationalError) database is locked "" [Critical,Fix released]06:06
Tenguhmmm nope, I don't think so. mwhahaha talked about it with weshay yesterday I think, root cause seems to be ARA accessing/locking the sqlite3 DB06:07
Tenguand augeas has nothing to do with sqlite3 in this case anyway06:07
quiquellTengu: has to be really really new06:07
quiquellTengu: 9 oct 10:41 UTC it was working06:08
Tenguquiquell: is it a transient error? already rechecked it?06:08
*** ksambor has joined #tripleo06:08
quiquellTengu: not transient, I see it in other reviews, and the noop change too06:08
quiquellTengu: it's massive, even standalone is failing there06:08
Tenguyeah, one of mine has the same issue.06:08
Tenguquiquell: care to open an LP for tracking?06:09
Tenguand mark it as blocker, urgent, critical, whatever06:09
marios|roverTengu: quiquell is that undercloud install failing?06:09
marios|roveror something new?06:09
Tengumarios|rover: overcloud06:09
quiquellmarios|rover: It's new, even standalone is failing06:09
*** sai_p has quit IRC06:09
quiquellmarios|rover: check noop https://review.openstack.org/#/c/560445/06:09
Tengumarios|rover: 2018-10-10 05:02:09 |         "Error: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]: Could not evaluate: Saving failed, see debug",06:09
marios|roverquiquell: well fantastic then06:10
quiquellTengu: maybe you want to exercise standalone at your reproducer since it's faster06:10
*** ooolpbot has joined #tripleo06:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION06:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671006:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675606:10
*** ooolpbot has quit IRC06:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]06:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host='localhost', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)06:10
quiquellmarios|rover: Yep it's going to be like impossible to merge the OVB fix :-(06:10
Tenguquiquell: well, it's already running, undercloud is getting installed.06:10
Tenguquiquell: do you want me to open the LP?06:10
chandankumarGood Morning Guys, Back from Grand PyCon India 2018 :-)06:11
Tenguchandankumar: hey, welcome back :)06:11
chandankumarTengu: \o/06:11
Tenguquiquell: I'm opening the LP :)06:11
quiquellTengu: ups sorry marios|rover  ^06:11
quiquellTengu: have to drop for a few06:12
marios|roverthanks Tengu06:12
*** quiquell is now known as quiquell|brb06:12
*** Petersingh has joined #tripleo06:12
marios|roverERROR configuring nova_libvirt ?06:12
*** shyam89 has quit IRC06:12
Tengumarios|rover: it's a "promotion-blocker" right?06:14
Tengumarios|rover: https://bugs.launchpad.net/tripleo/+bug/179703506:15
openstackLaunchpad bug 1797035 in tripleo "Gate fails on Augeas for qemu-conf-limits" [Critical,Triaged]06:15
marios|roverTengu: well that isn't a promotion job itself failing in the check now06:16
Tengumarios|rover: so "just" a gate blocker ?06:17
chandankumarTengu: I need some help on these two changes https://review.openstack.org/#/c/605980/ and https://review.openstack.org/#/c/605356/06:17
marios|roverTengu: yeah master gate is pretty serious and i haven't seen where else it fails like that anyway06:17
marios|roverthanks for bug Tengu06:17
chandankumarTengu: I am doing something wrong, directory is not getting mounted06:17
Tenguchandankumar: the ones for mistral+podman?06:17
chandankumarTengu: nope it is related to tempest06:17
Tenguwell, tempest, yes06:17
Tenguchandankumar: EmilienM told me you might want help on that :). well, I can spare a few minutes waiting for the reproducer to fail. gimme a minute.06:18
*** gkadam has quit IRC06:18
chandankumarTengu: sure06:18
Tenguchandankumar: so the modification in here are OK: https://review.openstack.org/#/c/605980/10/docker/services/tempest.yaml  it should create the correct paths on the host with the right setype, and then mount two of the directories in the container, at step2 (ONLY step2)06:19
chandankumarTengu: yup, I think something wrong I am doing here06:20
chandankumarTengu: https://review.openstack.org/#/c/605356/1706:20
Tenguyeah, reviewing that one, wait06:20
Tenguchandankumar: where does this script run from? https://review.openstack.org/#/c/605356/20/roles/validate-tempest/templates/configure-tempest.sh.j206:21
Tengumarios|rover: don't hesitate to correct tags if needed. I'm not that used to this kind of issue ;).06:21
chandankumarTengu: https://review.openstack.org/#/c/605356/20/roles/validate-tempest/templates/configure-tempest.sh.j2@74 when the tempest format is container, we create a separate script; that script is copied into the container and invoked during the container run06:22
Tenguchandankumar: if it's from the host, the setype would not be correct, although the host_prep_tasks should kick in far earlier than this script, thus you should get the right permissions.06:22
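
A quick bash sketch of how to verify that host-side part, assuming /var/lib/tempest is the directory created by the host_prep_tasks (the expected SELinux type is whatever the review sets for container-mounted paths, so compare what you see against the template rather than a hard-coded value):

    # Does the directory exist on the host, and with which SELinux context?
    ls -ldZ /var/lib/tempest

    # Any recent SELinux denials that would explain a mount/permission issue?
    sudo ausearch -m avc -ts recent
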
chandankumarTengu: yes06:23
Tenguchandankumar: do you have any error we can investigate and work? because right now I don't see what might be wrong with that.06:23
chandankumarTengu: http://logs.openstack.org/56/605356/20/check/tripleo-ci-centos-7-undercloud-containers/786a06d/logs/undercloud/home/zuul/tempest_container.sh.txt.gz06:24
chandankumarTengu: http://logs.openstack.org/56/605356/20/check/tripleo-ci-centos-7-undercloud-containers/786a06d/logs/undercloud/home/zuul/tempest.log.txt.gz#_2018-10-09_14_03_5006:24
Tenguah, yeah.06:24
chandankumarTengu: within the container, it is failing to find the /var/lib/tempest directory06:24
Tenguwait...06:24
chandankumarTengu: It is the parent script http://logs.openstack.org/56/605356/20/check/tripleo-ci-centos-7-undercloud-containers/786a06d/logs/undercloud/home/zuul/tempest-setup.sh06:25
*** rdopiera has joined #tripleo06:25
Tenguah, great.06:25
Tenguso, line 23.06:25
marios|roverTengu: updated the title/description fyi06:26
*** shyam89 has joined #tripleo06:26
marios|roverTengu: (added more from the check job)06:26
Tengumarios|rover: no problem - you can assign it to me (for now) if you want, as I'm trying to get it reproduced on my infra.06:26
chandankumarTengu: at runtime, tempest needs an empty directory inside the container, where tempest-related stuff gets created automatically06:26
Tenguchandankumar: well.06:27
marios|roverTengu: ack but which one ('cedjo7?'06:27
Tenguchandankumar: $TEMPEST_DIR is apparently not set... ?06:27
Tengumarios|rover: cjeanner06:27
marios|roverTengu: or Cedric Jeanneret deactivated06:27
Tenguchandankumar: unless it's passed as a container var?06:27
marios|roverTengu: gotcha06:27
chandankumarTengu: https://review.openstack.org/#/c/605356/20/roles/validate-tempest/templates/configure-tempest.sh.j2@6806:27
chandankumarTengu: you mean I need to pass it as a docker/podman run argument?06:28
Tenguchandankumar: at least in the generated file here: http://logs.openstack.org/56/605356/20/check/tripleo-ci-centos-7-undercloud-containers/786a06d/logs/undercloud/home/zuul/tempest-setup.sh it's not present06:28
Tenguyou can look for the EOF string, this will generate the inner script06:28
chandankumarTengu: yes got it06:29
Tenguand there isn't any mention of the exported var06:29
Tenguso I guess that's the main issue.06:29
chandankumarTengu: export TEMPEST_DIR='/var/lib/tempest'06:29
chandankumarTengu: it is getting done in the main script06:29
chandankumarTengu: Let me pass with docker and see what happens06:29
*** shyam89 has quit IRC06:31
*** shyam89 has joined #tripleo06:31
Tenguchandankumar: hm, in order to get it into the container you should pass it with -e TEMPEST_DIR='...'06:31
Tengui.e. container_cli -e TEMPEST_DIR=... run foo bar06:32
*** kopecmartin|ruck is now known as kopecmartin|scho06:32
*** mschuppert has joined #tripleo06:33
chandankumarTengu: updated, https://review.openstack.org/#/c/605356/21/roles/validate-tempest/templates/run-tempest.sh.j2@10206:35
chandankumarI hope it will do the job06:36
Tengulemme check that.06:36
Tenguchandankumar: -e TEMPEST_DIR="${TEMPEST_DIR}"06:36
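
Putting the suggestion together, a minimal sketch with podman as the container_cli; the image name and inner-script path are placeholders, not the values from the review:

    export TEMPEST_DIR='/var/lib/tempest'
    TEMPEST_IMAGE='docker.io/tripleorocky/centos-binary-tempest:current-tripleo'  # assumed image name/tag

    # Double quotes so the calling shell expands the variable before podman sees it:
    podman run --rm \
        -e TEMPEST_DIR="${TEMPEST_DIR}" \
        "${TEMPEST_IMAGE}" /var/lib/tempest/tempest_container.sh  # inner script path is illustrative
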
*** janki has quit IRC06:38
chandankumarTengu: done, thanks :-)06:38
Tengunp :)06:38
*** janki has joined #tripleo06:38
Tengumarios|rover quiquell|brb running a standalone deploy in parallel with the reproducer itself.06:39
*** aufi_ has joined #tripleo06:41
Tenguguess the standalone will be faster - I allocate "some" resources to the VM by default.06:42
*** mburned_out is now known as mburned06:43
*** apetrich has quit IRC06:45
chandankumarTengu: If standalone is going to take less than 2 hours with full tempest api and scenario tests, we can move this job to all upstream projects06:45
Tenguchandankumar: can't say for this case - I don't think tempest is deployed with standalone by default. I didn't investigate it very long, I just used it in order to validate my podman+selinux work :)06:47
chandankumarTengu: once this patch merged, I will do a cleanup of the role06:47
Tenguok :)06:48
*** pcaruana has joined #tripleo06:50
*** cylopez has joined #tripleo06:54
*** quiquell|brb is now known as quiquell06:54
*** cylopez has left #tripleo06:54
quiquellTengu: found anything ?06:55
Tenguquiquell: still running06:55
quiquellTengu: puppet nova version is new06:55
TenguI'm also taking this opportunity to test the beast :]. 1 reproducer + 1 standalone in parallel. seems to work.06:55
Tenguquiquell: puppet-nova? hmmm. so the patch for the sasl thingy wasn't packaged before yesterday or something like that?06:56
quiquellTengu: There is something weird, we didn't have a master promotion but puppet-nova is new06:56
Tenguo_O06:56
Tengujanki: don't obsess on gate, there's a major failure preventing the runs.06:57
*** aufi_ has quit IRC06:57
Tenguand I can't change the topic as I'm no channel operator :(06:58
*** rcernin has quit IRC06:58
quiquellTengu: The difference is https://github.com/openstack/puppet-nova/commit/aa2893d7e07e0dc37935e602fcdbca99b102d35e06:58
quiquellTengu: at least in puppet nova, but I want to understand why we had new version there06:58
Tenguyeah so that famous thingy.06:58
quiquellTengu: Humm we are having current from "puppet-*"06:59
quiquellTengu: so puppet has to be aligned with the latest of each06:59
quiquellTengu: going to test a revert on that07:00
Tenguok07:00
quiquellTengu: the patch add new changes to qemu-conf-limits07:01
quiquellTengu: So it can be related07:01
*** jfrancoa has joined #tripleo07:02
*** ykarel has joined #tripleo07:03
quiquellmschuppert: Are you there ?07:05
*** agopi has quit IRC07:06
*** ade_lee has quit IRC07:06
*** agopi has joined #tripleo07:07
*** iurygregory has joined #tripleo07:07
*** odyssey4me has quit IRC07:07
*** ajo has quit IRC07:08
*** odyssey4me has joined #tripleo07:08
*** ajo has joined #tripleo07:08
quiquellTengu: -logdest syslog07:08
quiquellTengu: we are running puppet with syslog activated07:08
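
Since puppet is run with -logdest syslog, the missing "see debug" details generally have to be fished out of the system log on the node; a rough sketch, assuming a CentOS 7 host with the default log locations:

    # The Augeas/puppet messages land in syslog rather than in the captured stdout:
    sudo grep -iE 'augeas|puppet' /var/log/messages | tail -n 100

    # Or via the journal, without needing to know the exact syslog tag:
    sudo journalctl --since "1 hour ago" | grep -i augeas
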
*** mugsie has quit IRC07:08
mschuppertquiquell: yes07:09
*** mgagne has quit IRC07:09
quiquellmschuppert: We have a massive gate failure, could be related to this patch https://github.com/openstack/puppet-nova/commits/master07:09
quiquellmschuppert: http://logs.openstack.org/45/560445/160/check/tripleo-ci-centos-7-standalone/961edf2/logs/undercloud/home/zuul/standalone_deploy.log.txt.gz07:09
*** mgagne has joined #tripleo07:10
*** jroll has quit IRC07:10
*** Tengu has quit IRC07:10
*** ooolpbot has joined #tripleo07:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION07:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671007:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]07:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675607:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179703507:10
*** ooolpbot has quit IRC07:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host='localhost', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)07:10
quiquellmschuppert: One of the qemu-conf-augeas is failing07:10
openstackLaunchpad bug 1797035 in tripleo "scenario multinode jobs failling during overcloud deploy - ERROR: 18918 -- Failed running docker-puppet.py for nova_libvirt" [Critical,Triaged] - Assigned to Cédric Jeanneret (cjeanner)07:10
*** Tengu has joined #tripleo07:10
Tenguquiquell: ah, standalone just failed.07:10
*** janki has quit IRC07:11
Tenguquiquell: so yeah, can reproduce it in standalone. much faster :)07:11
*** janki has joined #tripleo07:11
quiquellTengu: try just to revert https://github.com/openstack/puppet-nova/commit/aa2893d7e07e0dc37935e602fcdbca99b102d35e07:11
quiquellTengu: to check this out07:11
Tenguquiquell: 2s. Inspecting docker-puppet-nova_libvirt07:12
quiquellTengu: I am just trying to find which one of the qemu-conf-augeas is failing07:12
Tenguit's the failed container.07:12
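
For reference, the usual way to look at such a failed one-shot docker-puppet container (container name taken from the log above; substitute podman if that is the configured container_cli):

    docker ps -a --filter name=docker-puppet-nova_libvirt          # is it there, and with which status?
    docker logs docker-puppet-nova_libvirt 2>&1 | tail -n 100      # puppet output from the failed run
    docker inspect --format '{{.State.ExitCode}}' docker-puppet-nova_libvirt
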
quiquellTengu: ack, can we see which one ?07:12
jankiTengu, but 2 of my patches got merged in the same gate run so I rechecked07:12
*** jroll has joined #tripleo07:12
Tengujanki: failure's recent ;). anyway. we're trying to shoot it.07:13
quiquellmschuppert: Do you know if we need newer versions of other openstack projects for the patch to work ?07:13
jankiTengu, ack. Should I abandon the patches to clear up the queue?07:13
Tengujanki: not yet. we're investi-gating07:14
mschuppertquiquell: yes lets revert that one as it sets a qemu parameter which might require a newer libvirt version. that might result in the error07:14
jankiTengu, good one :P07:14
Tengu;)07:14
quiquellmschuppert: so maybe it depends on promotion ?07:14
Tengumschuppert: apparently it's the nbd_tls one07:14
Tengunot present at all in the qemu.conf.07:14
* quiquell is writing 'investi-gating' in stone07:14
Tenguquiquell: ;)07:15
mschuppertTengu: yes that is the new one.07:15
mschuppertwondering why it passed ci ...07:15
quiquellmschuppert: Humm that's true...07:15
Tenguah, yeah, right, got mixed up with the -/+ due to the comma.07:15
quiquellmschuppert: let's check that07:15
Tengumschuppert: btw, lists in puppet should get a "," for their last element07:15
Tenguit's not like json :)07:16
mschuppertTengu: ack thx07:16
Tenguanyway. quiquell when you have a patch I can put it in use in my env.07:16
*** shyam89 has quit IRC07:17
quiquellmschuppert: puppet-nova has a tripleo-undercloud job at the gates, weird that it passed07:17
*** shyam89 has joined #tripleo07:17
*** amoralej|off is now known as amoralej07:17
quiquellmschuppert: Maybe it uses newer versions07:17
quiquell:-(07:17
Tengubrb - breakfast.07:17
quiquellmarios|rover: going to revert mschuppert change, looks like it is the problem07:18
marios|roverquiquell: ack thanks mate been tracking nice one07:18
marios|roverkopecmartin|scho: ^^ fyi07:19
quiquellmarios|rover: Just checking that we were not screwing it with the new promotion jobs and all07:19
quiquellmschuppert: do we have to backport the revert ?07:20
*** jpich has joined #tripleo07:21
mschuppertquiquell: no its only in master07:21
quiquellTengu, mschuppert, mschuppert: https://review.openstack.org/60928907:22
quiquellmarios|rover: testing patch https://review.openstack.org/#/c/609290/07:25
quiquellmarios|rover: standalone is fast so we will be able to check it out soon07:25
marios|roverack quiquell07:26
quiquellmarios|rover: what else do we have that prevents the OVB fix from being merged ?07:28
quiquellmarios|rover: I want to have a master/rocky promotion to check that it's working with the new stuff07:28
marios|roverquiquell: ovb fix is the ssl one you mean?07:30
marios|roverquiquell: or the ironic one?07:30
quiquellmarios|rover: https://review.openstack.org/#/c/608589/07:30
*** panda has quit IRC07:30
quiquellmschuppert: we have to find why the tripleo job at puppet-nova failed07:31
*** salmankhan has joined #tripleo07:31
quiquellmschuppert: I mean passed at the review07:31
quiquellmschuppert: which libvirt package does your change depend on ?07:31
marios|roverquiquell: yeah ssl one not aware of something else there but it kept hitting the different issues on each run07:31
quiquellmarios|rover: does the ssl one only affect promotions or does it also affect gates ?07:32
*** panda has joined #tripleo07:32
marios|roverquiquell: promotions too07:32
Tenguquiquell: ah, yeah, we can test it even with containers as they mount the puppet thingy into.07:32
marios|roverquiquell: in fact it was the legacy ovb fs1/35 jobs that were failing07:33
quiquellmarios|rover: new ovb ones are failing too there07:33
Tenguquiquell: re-running the standalone with your patch.07:34
Tenguchandankumar: is the correction OK for your tempest work?07:34
*** mgagne has quit IRC07:34
mschuppertquiquell: libvirt-4.507:35
quiquellmschuppert: argg, undercloud-container works, that's what we are running at puppet-nova07:35
quiquellmschuppert: it's all the others that fail :-/(07:35
mschuppertquiquell: :( ok07:35
quiquellmschuppert: Make sense to add standalone to puppet-nova ?07:35
mschuppertquiquell: yes I'd say so07:36
quiquellmschuppert: will do07:36
*** shyam89 has quit IRC07:36
mschuppertquiquell: thx. sorry for the mess07:36
*** mgagne has joined #tripleo07:36
*** salmankhan has quit IRC07:37
quiquellmschuppert: no problem, now that we have standalone we have to start gating stuff with it07:37
quiquellmschuppert: it does not consume much resources07:38
mschuppertquiquell: cool, makes sense07:38
quiquellmschuppert: Maybe we can replace the tripleo-puppet-undercloud07:38
Tenguquiquell: lemme check that with the standalone - that's the best way. currently running, it's "installing tripleo client", then it will build the puppet-nova package, and banzai.07:41
chandankumarTengu: yes07:41
Tenguchandankumar: \o/07:41
*** tosky has joined #tripleo07:42
*** Petersingh is now known as Petersingh|lunch07:46
*** shyam89 has joined #tripleo07:48
quiquellmschuppert, Tengu, marios|rover: https://review.openstack.org/#/c/609301/07:50
quiquellTo run standalone at puppet-nova. Can we add it to the rest of the puppet- projects ?07:51
*** jchhatbar has joined #tripleo07:51
quiquelljaosorior: Are you there ?07:52
*** janki has quit IRC07:52
mschuppertquiquell: I'd vote for yes07:52
Tenguquiquell: I think it should be done, yeah. at least for projects used in standalone. Not sure we use all of them.07:52
Tengubut in that precise case, puppet-nova is used, so +1 (can't +2 :'( )07:53
*** jchhatbar has quit IRC07:53
*** janki has joined #tripleo07:53
Tenguquiquell: standalone is currently deploying with your revert - will give you feedback once it has either failed or succeeded07:53
quiquellmschuppert, Tengu: going to add to all puppet- projects that run tripleo-undercloud07:54
*** gkadam has joined #tripleo07:55
*** shyam89 has quit IRC07:55
*** apetrich has joined #tripleo07:56
quiquellTengu: do you know if we are planning on removing puppet from standalone ?07:57
Tenguquiquell: no idea - guess mwhahaha might know more about that.07:57
Tenguor dprince, when he connects.07:57
quiquellTengu: do you know what puppet- projects are used by standalone ?07:59
*** janki has quit IRC08:01
*** janki has joined #tripleo08:01
Tenguquiquell: not really. as said, I didn't go far in investigating what it actually does. There are at least keystone, nova, swift, redis, mysql/galera, neutron, horizon08:04
quiquellTengu: ack08:04
quiquellTengu: I suppose ironic is not08:05
Tenguquiquell: hmm yeah, no ironic ( dtantsur|afk ?)08:05
*** ooolpbot has joined #tripleo08:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION08:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671008:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]08:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675608:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179703508:10
*** ooolpbot has quit IRC08:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host='localhost', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)08:10
openstackLaunchpad bug 1797035 in tripleo "scenario multinode jobs failling during overcloud deploy - ERROR: 18918 -- Failed running docker-puppet.py for nova_libvirt" [Critical,In progress] - Assigned to Cédric Jeanneret (cjeanner)08:10
*** janki has quit IRC08:10
*** ramishra has quit IRC08:12
Tenguquiquell: at least the patch allows the standalone deploy to get to step 2 - I therefore think it's OK.08:12
Tenguquiquell: ah, wrong - apparently the file isn't modified yet. damn.08:13
*** ramishra has joined #tripleo08:13
Tengulet's see what step should fail. I didn't check that with the previous run :/. bad.08:13
quiquellTengu: do we have puppet for galera or redis ?08:17
quiquellTengu: Don't find them08:17
Tenguwait.08:17
Tengupuppet-mysql-6.0.1-0.20180802105733.204cfd4.el7.noarch08:18
quiquellTengu: but that's not a openstack project08:19
Tenguquiquell: http://github.com/puppetlabs/puppetlabs-mysql we're using upstream08:19
quiquellTengu: not sure we can do standalone there08:19
Tengualthough it's in DLRN repo08:19
quiquellTengu: same with puppet-redis08:19
*** shardy has joined #tripleo08:19
Tenguyeah, it makes sense for such modules.08:19
Tenguso I'd say "put standalone zuul for openstack modules"08:20
*** bogdando has joined #tripleo08:20
Tengualthough we might want to ensure they are actually in use on standalone. for that we might need some more help from DF team.08:20
Tengulemme check something.08:20
Tenguquiquell: https://github.com/openstack/tripleo-heat-templates/blob/master/environments/standalone.yaml08:21
Tenguso I was pretty close - I just missed "cinder" and "glance" :)08:22
quiquellTengu: https://review.openstack.org/#/q/topic:standalone-puppet-gates+(status:open+OR+status:merged)08:22
Tenguquiquell: cool - might want to add glance and cinder according to the tmpl I just linked.08:22
*** sshnaidm|afk is now known as sshnaidm08:22
Tenguquiquell: at least the revert DOES work, I'm at step 5 now.08:22
Tenguso now we need some cores for https://review.openstack.org/609289 ( jaosorior shardy please :) )08:23
*** skramaja has joined #tripleo08:24
quiquellTengu: check if I missing any https://review.openstack.org/#/q/topic:standalone-puppet-gates ?08:25
quiquellTengu: I think it's ok now08:25
*** phuongnh has joined #tripleo08:25
Tenguquiquell: at least we get the beefy part.08:25
Tenguquiquell: a second check from one of the "standalone guru" would still be good ;).08:26
quiquellTengu: we don't use heat at standalone ?08:28
quiquellTengu: who can I ask to the reviews ?08:28
quiquells/ask/add/08:29
Tenguquiquell: I'd say dprince and mwhahaha mainly. maybe bogdando08:29
chandankumarTengu: http://logs.openstack.org/56/605356/22/check/tripleo-ci-centos-7-undercloud-containers/cbaf8d5/logs/undercloud/home/zuul/tempest.log.txt.gz#_2018-10-10_08_02_3608:30
chandankumarTengu: unable to pull TEMPEST_DIR=${TEMPEST_DIR}: error getting default registries to try: invalid reference format08:30
bogdandowoa, that https://review.openstack.org/#/q/topic:standalone-puppet-gates+(status:open+OR+status:merged) is really nice to see08:30
bogdandobut where is the job logs?..08:30
quiquellbogdando: we were hit by merged stuff because we were not covering standalone, so it's a reality now08:30
Tenguchandankumar: well, err, you don't know how to pass an env var to docker/podman, do you?08:30
quiquellbogdando: Can you help to land those ?08:31
chandankumarTengu: one in08:31
bogdandoquiquell: yes, but would be nice to have build logs08:31
bogdandosome DNM patches perhaps08:31
chandankumarTengu: sorry I missed -e08:31
quiquellbogdando: Why DNM patches, the reviews are already triggering standalone08:31
bogdandohmmm08:32
Tenguchandankumar: and you want to put that new env with the others.08:32
Tenguchandankumar: order does matter with CLI params ;).08:32
bogdandoquiquell: not here https://review.openstack.org/#/c/609308/ ?08:32
Tenguso just put your -e TEMPEST_DIR="${TEMPEST_DIR}" near the CURL_CA_BUNDLE for example.08:32
bogdandoquiquell: I think it should replace undercloud08:32
Tenguchandankumar: so that we get a grouped thingy, more readable08:33
quiquellbogdando: It's running there08:33
quiquellbogdando: Yep, about undercloud I didn't know if it's worth it now08:33
*** derekh has joined #tripleo08:33
bogdandoIMO it does08:33
quiquellbogdando: Can you check that I don't miss any puppet-foobar that standalone is using ?08:33
bogdandobtw, I only have +1 there08:33
Tenguquiquell: Deployment successfull!08:33
Tenguquiquell: that's with the revert ( mschuppert as well )08:34
quiquellTengu: yei !!! marios|rover ^08:34
quiquellbogdando, Tengu: So what reviewers do we add: mwhahaha, dprince... someone else ?08:34
Tenguchandankumar: double-quotes for variable08:35
Tenguchandankumar: "${TEMPEST_DIR}"  not '${TEMPEST_DIR}'08:35
quiquellbogdando: Is there any puppet- that is run at undercloud and not standalone ?08:35
bogdandogood question08:35
marios|roverquiquell: ack great stuff08:36
*** aufi_ has joined #tripleo08:36
bogdandoafaik, standalone includes all that undercloud and overcloud has08:36
bogdandomwhahaha: am I right? ^^08:36
Tengubogdando: not really in fact. at least not according to the standalone.yaml08:36
quiquellmarios|rover: I want to have standalone stuff at all possible gates before my next ruck/rovering08:36
bogdandoTengu: ;(08:36
Tenguquiquell: how surprising :D08:36
bogdandoI thought Alex included the world there08:37
Tengubogdando: as you're here, care to vote on that one? https://review.openstack.org/#/c/609289/  it will unlock gate.08:37
Tengubogdando: https://github.com/openstack/tripleo-heat-templates/blob/master/environments/standalone.yaml#L15-L2208:37
bogdandoI failed to find any correlation to the failure root cause08:37
bogdandothus didn't vote08:37
Tengubogdando: apparently the libvirt version we have doesn't support the new param in qemu.conf.08:37
quiquellTengu, bogdando: We don't run standalone at puppet-tripleo ? ...08:37
Tengubrb, will do the disk install in the beast.08:39
*** fhubik has joined #tripleo08:39
*** fhubik has quit IRC08:39
bogdandoquiquell: one more concern for the topic is metadata.json08:39
bogdandomaybe we should also test against the dependency changes08:39
jaosoriorquiquell, ratailor: Now I am08:40
quiquelljaosorior: Gate consistent blocked, fixed here https://review.openstack.org/#/c/609289/08:41
ratailorjaosorior, I had a query regarding backport, which is resolved now. Thanks :)08:41
*** aufi_ has quit IRC08:41
quiquelljaosorior: We are planning on running standalone at puppet-projects https://review.openstack.org/#/q/topic:standalone-puppet-gates08:41
quiquelljaosorior: standalone was not running in the merged review08:41
jaosoriorquiquell: bummer, I don't have +2 there.08:42
jaosoriordid you poke the folks in puppet-openstack?08:42
bogdandoquiquell: so we'll need one more for Heat and Mistral08:42
quiquelljaosorior: not yet, adding world to the reviews, who can I add08:42
bogdandopuppet-*08:42
bogdandoand ironic08:43
quiquellbogdando: ironic ?08:43
*** shyam89 has joined #tripleo08:43
quiquellbogdando: standalone is using ironic ?08:43
bogdandolet's add it there :D08:43
bogdandowe can modify in quickstart08:43
bogdandonot in tht08:43
Tenguquiquell: no use of ironic in standalone08:43
bogdandooverriding roles as we want it for CI08:43
bogdandoyou can deploy anything in standalone08:43
quiquellTengu: bogdando want to add it08:43
bogdandoand horizon perhaps, or it worth not adding it to puppet-horizon08:44
quiquellbogdando: Ahh ok, so maybe we can  modify later the jobs at differnt modules to exercise them, that's it ?08:44
bogdandoquiquell: I believe we can08:44
bogdandooooq can override things being deployed easily08:44
*** hjensas has quit IRC08:44
quiquell#join #openstack-puppet08:46
quiquellups08:46
quiquelljeje08:46
quiquellNot much people there08:46
quiquellbandini: Are you a openstack-puppet master ?08:46
bogdandoquiquell: I think a topic for openstack-dev may be a good idea08:46
bogdandoto follow up one that weshay created08:46
bogdandowith CI standalone roadmap08:47
quiquellbogdando: don't know what you mean08:47
bogdandoand you could ask for replacing the undercloud jobs with composable standalone jobs08:47
chandankumarstandalone deployment is failing, it is a known issue08:47
quiquellbogdando: I just want to have easy ruck rovering next time :-P08:47
quiquellchandankumar: yep, we have a fix for it08:47
quiquellchandankumar: https://review.openstack.org/#/c/609289/08:47
bogdandoquiquell: this http://lists.openstack.org/pipermail/openstack-dev/2018-October/135396.html08:48
chandankumarquiquell: thanks :-)08:49
bogdandoso just a follow up for the standalone topic08:49
quiquellbogdando: ack will talk with wes later on08:49
quiquellbogdando: we are also preparing the standalone over fedora28 thingy08:49
bogdandoI mean you could go straight and push an update with those questions for customizable standalone jobs in puppet-foo08:50
bogdandoand replacing the UC jobs with those08:50
mschuppertquiquell: cool08:51
bogdandoit also makes sense to not deploy more than is really needed to check the foo component08:51
* mschuppert was afk for some time08:51
bogdandoso we do not deploy world for every puppet module08:51
bogdandobut only, say keystone for puppet-keystone08:51
bogdandoand limiting tempest to that08:51
bogdandoand so on08:51
quiquellbogdando: But we have interrelations too08:51
bogdandoweshay: ^^08:51
quiquellbogdando: We want to test that at puppet-foo08:52
bogdandosure, just need to pick wise08:52
chandankumarbogdando: limiting tempest tests means we want to run test for a particular component only?08:52
bogdandoyes08:52
bogdandoor a few08:52
quiquellbogdando: I mean we were already running undercloud, replacing it with standalone gives good coverage08:52
bogdandoif we want to test some integrations08:52
*** ykarel_ has joined #tripleo08:52
quiquellbogdando: First let's merge the blocker08:52
bogdandoquiquell: ofc, but I cannot, not a core there08:53
chandankumarbogdando: then can be doable just like i have done for https://github.com/openstack/tripleo-quickstart/blob/master/config/general_config/featureset027.yml#L8808:53
chandankumarquiquell: ^^08:53
bogdandochandankumar: yes, exactly08:53
bogdandoso for puppet-mistral we could deploy a standalone job with only DB/MQ/keystone and mistral08:53
chandankumari think we can also customize the job and reuse the same feature set08:53
bogdandoand test only mistral with tempest08:53
bogdandothat's the idea08:54
bogdandoweshay: quiquell: ^^08:54
quiquelljaosorior: Where is the list of a project's users with superpowers ?08:54
quiquellbogdando: So we need special fs for each ?08:54
bogdandoquiquell: seems so08:55
quiquellbogdando: Humm... we have introduced a featureset override feature with the new workflow08:55
chandankumarquiquell: I think we can customize using {% if lookup('env', 'ZUUL_PROJECT') == "openstack/mistral" -%}08:55
bogdandoor that08:55
*** ykarel has quit IRC08:55
quiquellbogdando: so we can use zuul job 'vars:' block, but I think you cannot use it with template08:55
bogdandooverriding a single featureset with simple extensions may be a good idea08:55
bandiniquiquell: not really. what's the prob?08:55
quiquellbandini: we have a fix at puppet-nova that unblock gates, don't know who to ping08:56
bogdandogit blame? :)08:56
*** Petersingh|lunch is now known as Petersingh08:56
quiquellbogdando: I always get the same answer :-)08:56
bogdandoheh08:56
quiquellbogdando: who is 'zuul'? does it have super powers ?08:57
quiquellsshnaidm: Do you know who can help us merge https://review.openstack.org/#/c/609289/  ?08:58
bogdandoquiquell: I think it's sub-zero, he can freeze your jobs in queues to death08:58
quiquellbogdando: Respect08:59
sshnaidmquiquell, I can merge it, np09:00
quiquellsshnaidm: Really ! cool09:00
sshnaidmquiquell, oh, no, sorry09:00
sshnaidmquiquell, puppet is not my favorite puppet..09:00
quiquellsshnaidm: btw I have put in place this https://review.openstack.org/#/q/topic:standalone-puppet-gates09:00
* quiquell does NOT write this joke in stone09:01
sshnaidmneed to wait for EmilienM and mwhahaha09:01
quiquellsshnaidm: ack09:01
sshnaidmthey have powers in puppet afaik09:01
quiquellsshnaidm: no one else, where is the file with that info ?09:01
sshnaidmquiquell, https://review.openstack.org/#/admin/groups/134,members09:02
quiquellsshnaidm: I think we have to help to get master/rocky promotions this sprint to test the new RDO stuff09:02
sshnaidmactually chem` can help ;)09:02
quiquellchem`: you there ?09:03
quiquelliurygregory: too09:03
quiquellmnaser: ?09:03
*** ykarel_ is now known as ykarel|lunch09:03
iurygregoryme?09:03
quiquelliurygregory: we need to merge this https://review.openstack.org/#/c/609289/ to unblock gates09:04
quiquelliurygregory: can you help there ?09:04
iurygregoryquiquell, sure o/09:04
*** akrivoka has joined #tripleo09:04
mnaserhi quiquell09:05
iurygregoryquiquell, can I wait for the zuul result to +2?09:05
quiquelliurygregory: also we are adding standalone to openstack-nova09:05
mnaserLet me have a look09:05
quiquelliurygregory: sure09:05
quiquellmnaser: you can help too09:05
bandiniquiquell: I'd go in #openstack-puppet and ping folks there, I don't have +2 there09:05
iurygregoryits #puppet-openstack09:06
iurygregorythe channel09:06
quiquellbandini: They are helping us now09:06
quiquelliurygregory: I have joined both :-)09:06
quiquellTengu: standalone has deployed already in the test review, we are all good09:07
quiquellTengu: https://zuul.openstack.org/stream/82dd657bf09b4c62960b4c189323e8f0?logfile=console.log09:07
quiquellTengu: thanks mate09:07
quiquelliurygregory, mnaser: We are also activating tripleo standalone jobs at puppet https://review.openstack.org/#/q/topic:standalone-puppet-gates09:08
mnaserquiquell: be good to ping tobiash who’s the current PTL too09:08
iurygregoryyeah09:09
Tenguquiquell: didn't do much ;)09:09
Tengujust pointed to the "real" issue :).09:10
*** ooolpbot has joined #tripleo09:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION09:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671009:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]09:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675609:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179703509:10
*** ooolpbot has quit IRC09:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host='localhost', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)09:10
openstackLaunchpad bug 1797035 in tripleo "scenario multinode jobs failling during overcloud deploy - ERROR: 18918 -- Failed running docker-puppet.py for nova_libvirt" [Critical,In progress] - Assigned to Cédric Jeanneret (cjeanner)09:10
*** kopecmartin|scho is now known as kopecmartin|ruck09:10
Tengubtw we might also drop the warning with the env I think.09:10
Tenguit's a pretty easy thing to do.09:10
Tenguwill have a look later today.09:10
EmilienMsshnaidm: can you tldr me please, what do you need? In my phone now09:11
sshnaidmEmilienM, need to +w https://review.openstack.org/#/c/609289/ , quiquell can provide details09:13
EmilienMsshnaidm: doing it.09:13
quiquellEmilienM: gate blocker09:13
quiquellEmilienM: adding standalone at puppet- gates https://review.openstack.org/#/q/topic:standalone-puppet-gates09:13
quiquellEmilienM: standalone was failing but it was not at puppet-nova gate09:13
EmilienMDone09:14
EmilienMAnything else?09:14
quiquellEmilienM: nope, thanks09:14
EmilienMGood ttyl09:14
iurygregoryi was waiting for zuul, Emilien is always fast XD09:15
*** jpich has quit IRC09:16
*** jpich has joined #tripleo09:17
Tengu:)09:26
Tenguiurygregory: well, for the gate unlocker, we actually did test it on the side ;).09:26
iurygregoryTengu, nice o/09:26
Tenguhaving a monster under the desk helps a lot in this kind of situation.09:27
*** salmankhan has joined #tripleo09:32
chandankumararxcruz: Hello09:33
*** ykarel|lunch is now known as ykarel09:33
chandankumararxcruz: please update this spec file https://review.rdoproject.org/r/1621609:34
*** jpich has quit IRC09:36
*** jpich has joined #tripleo09:36
quiquellmarios|rover: They asked me at openstack-infra for PTL confirmation to prioritize the blocker :-) Is it worth it?09:48
quiquelljaosorior: ^09:48
marios|roverquiquell: well it is blocking all the things right now, including the fixes to the open bugs09:48
marios|roverquiquell: but maybe given the recent discussion about us using resources we can't push too much on these things09:49
quiquellmarios|rover: That was kind of my fault :-(09:49
quiquellmarios|rover: We needed that for the migration09:49
quiquelljaosorior: Maybe you can help us with that09:50
quiquellmarios|rover: Ahh the fix is at another empty gate :-) we have been lucky for once :-)09:55
*** ykarel_ has joined #tripleo09:58
arxcruzchandankumar: hey, you just did...09:59
arxcruzchandankumar: i'm doing some medic exams, i'll not be 100% available today :/09:59
*** ykarel has quit IRC10:00
*** ykarel__ has joined #tripleo10:02
*** aufi_ has joined #tripleo10:02
*** ykarel_ has quit IRC10:05
*** ykarel_ has joined #tripleo10:06
*** ykarel__ has quit IRC10:09
*** ooolpbot has joined #tripleo10:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION10:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671010:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]10:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675610:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179703510:10
*** ooolpbot has quit IRC10:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host=\'localhost\', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)10:10
openstackLaunchpad bug 1797035 in tripleo "scenario multinode jobs failling during overcloud deploy - ERROR: 18918 -- Failed running docker-puppet.py for nova_libvirt" [Critical,In progress] - Assigned to Cédric Jeanneret (cjeanner)10:10
*** shyam89 has quit IRC10:11
*** apetrich has quit IRC10:11
*** shyam89 has joined #tripleo10:12
chandankumararxcruz: no problem, let me take care of that10:13
bogdandobeagles: hi. Around? I'm testing root wrap containers in podman... Now I have a little issue to solve http://pastebin.test.redhat.com/65564010:13
bogdandonot sure how come a privileged container can't access it10:14
*** ykarel__ has joined #tripleo10:14
*** ykarel_ has quit IRC10:17
*** apetrich has joined #tripleo10:18
*** ykarel_ has joined #tripleo10:18
*** akrivoka_ has joined #tripleo10:20
*** ykarel__ has quit IRC10:21
*** akrivoka has quit IRC10:22
*** ykarel__ has joined #tripleo10:23
jaosoriorquiquell: sorry, went for lunch. back now10:24
quiquelljaosorior: All good for the review10:24
*** ykarel_ has quit IRC10:25
chandankumarsshnaidm: bogdando please have a look at this one https://review.openstack.org/#/c/602347/10:27
*** ykarel__ has quit IRC10:29
*** ykarel has joined #tripleo10:31
*** apetrich has quit IRC10:33
*** shyam89 has quit IRC10:34
*** ykarel_ has joined #tripleo10:35
*** ykarel__ has joined #tripleo10:37
*** ykarel has quit IRC10:37
*** hjensas has joined #tripleo10:38
*** aufi_ has quit IRC10:39
*** ykarel_ has quit IRC10:40
*** jaganathan has quit IRC10:45
*** kopecmartin|ruck is now known as kopecmartin|scho10:47
sshnaidmchandankumar, commented10:49
*** sri_ has joined #tripleo10:51
*** yprokule has joined #tripleo10:54
*** radez has quit IRC10:56
*** dalvarez has quit IRC10:56
*** Petersingh is now known as Petersingh|afk10:58
*** jpena|off has quit IRC10:58
*** amoralej has quit IRC11:00
quiquellmschuppert: standalone failing https://review.openstack.org/#/c/609305/ :-)11:02
quiquellbogdando: ^11:02
quiquellSo we really cover it11:02
quiquellIt's failing at all the puppet projects11:03
Tenguquiquell: your revert is about to pass the gate :). less than 30 minutes apparently.11:04
quiquellTengu: puppet-nova is small compared with the tripleo stuff11:04
*** sshnaidm is now known as sshnaidm|afk11:04
Tengu^^11:04
quiquellTengu: maybe we need more than one gate queue for tripleo projects :-/11:05
EmilienMiurygregory: hey welcome here11:05
iurygregorytks EmilienM o/11:05
*** morazi has joined #tripleo11:06
*** ooolpbot has joined #tripleo11:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION11:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671011:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675611:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179703511:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]11:10
*** ooolpbot has quit IRC11:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host=\'localhost\', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)11:10
openstackLaunchpad bug 1797035 in tripleo "scenario multinode jobs failling during overcloud deploy - ERROR: 18918 -- Failed running docker-puppet.py for nova_libvirt" [Critical,In progress] - Assigned to Cédric Jeanneret (cjeanner)11:10
*** udesale has quit IRC11:11
mschuppertquiquell: right now thats ok, we have not merged the revert, right?11:14
quiquellmschuppert: soon11:15
mschuppertquiquell: cool. right time for lunch then :)11:15
*** dtantsur|afk is now known as dtantsur11:16
dtantsurTengu: hey, what was the question?11:17
*** phuongnh has quit IRC11:19
Tengudtantsur: hey :). about ironic being used in "standalone" deploy.11:21
dtantsurso, what's the question? I guess it depends on which templates you include11:23
*** sshnaidm|afk is now known as sshnaidm11:24
Tengudtantsur: we were wondering if ironic, by default, is included in the standalone - imho nope because no use of it11:24
dtantsurTengu: I think it should not be11:24
Tengugood :).11:24
*** morazi has quit IRC11:32
Tenguquiquell: puppet-nova merged.11:32
*** shyam89 has joined #tripleo11:32
quiquellmarios|rover, mschuppert: ^ \o/11:32
*** amoralej` has joined #tripleo11:32
Tenguquiquell: so I think we can close the LP?11:33
*** ratailor has quit IRC11:33
quiquellTengu: yep11:33
quiquellDone11:33
Tenguah, ok, was about to do it11:33
*** abishop has quit IRC11:34
quiquellTengu: testing also here https://review.openstack.org/#/c/60930111:34
quiquellTengu: directly in the puppet-nova to see if it works11:34
Tengu\o/11:34
mschuppertquiquell: perfect \o/11:35
*** Petersingh|afk is now known as Petersingh11:37
bogdandoquiquell: nice! We should really think about a composable standalone, as there is not much value in blocking puppet-keystone because tripleo fails with "Failed running docker-puppet.py for nova_libvirt"11:37
bogdandointegration should only be covered in tripleo, which uses all of those modules11:37
bogdandoimo11:37
*** dhill_ has joined #tripleo11:38
quiquellbogdando: Not sure about it... if keystone is breaking tripleo it would be nice to discover it at puppet-keystone11:39
quiquellbogdando: I see the issue is others can break puppet- reviews11:39
bogdandoyes11:40
quiquellbogdando: You have to know that you are breaking one of the "installers"11:40
bogdandoas I said, puppet-nova shall not block puppet-keystone11:40
*** jpich has quit IRC11:40
bogdandoright, just not sure each puppet module should be running the same integration case11:41
*** jpich has joined #tripleo11:41
bogdandoand put red herrings for non relevant modules11:41
quiquellbogdando: I don't know about it11:41
bogdandoquiquell: about what?11:41
quiquellbogdando: About what composition is needed for each of them11:41
bogdandoI mean, some day that job may be voting, like devstack11:41
bogdandoor tempest11:42
bogdandoand it shall not fail the keystone puppets cuz of nova puppets blockers11:42
quiquellbogdando: Agree11:44
quiquellbogdando: But if you narrow down too much you end up not really testing tripleo11:45
*** rh-jelabarre has joined #tripleo11:45
*** ansmith has quit IRC11:47
bogdandoquiquell: my point is that integration testing of all puppet modules only makes sense for tripleo, and not a particular puppet module11:48
*** boazel has joined #tripleo11:48
bogdandoand if a puppet module foo breaks that integration, it would as well break the puppet-foo module deploying that foo in minimal configuration11:49
bogdandomakes sense?11:49
bogdandowe need a mail thread :)11:49
bogdandoplease, want to start it?11:49
bogdandoas an initiator of the topic ;]11:49
*** trown|outtypewww is now known as trown11:49
quiquellbogdando: So if we know that there is a tripleo interaction like puppet-foo -> puppet-bar the test at puppet-foo has to cover that11:50
quiquellbogdando: But discovering these interactions is nearly impossible11:50
bogdandowell, indeed11:50
bogdandonot sure...11:50
quiquellbogdando: I am not sure too11:50
bogdandoat least we can try to do for the most simple cases11:51
quiquellbogdando: I miss a lot of background11:51
bogdandowhen it is easy to discover11:51
quiquellbogdando: sure low hanging fruit11:51
quiquellbogdando: for puppet-nova, is it good as it is?11:52
*** raildo has joined #tripleo11:52
bogdandoI think it only needs keystone, and all ::Nova in tripleo11:52
bogdandowell and the common base like ntp/kernel/db/mq all that is needed for any deployment11:53
bogdandoI mean those Tripleo::ServiceFoo11:53
quiquellbogdando: I see it's kind of complicated and you have to maintain it too11:53
bogdandohttp://git.openstack.org/cgit/openstack/tripleo-heat-templates/tree/environments/standalone.yaml11:53
bogdandoI can help with that yes11:53
bogdandolet's just first get a blessing for the approach11:54
bogdandoso we'll need a mail thread11:54
quiquellbogdando: Maybe we can start with puppet-nova11:54
quiquellbogdando: yep11:54
*** leanderthal has joined #tripleo11:54
bogdandoquiquell: http://git.openstack.org/cgit/openstack/tripleo-heat-templates/tree/environments/standalone/standalone-overcloud.yaml#n1211:55
*** holser_ has joined #tripleo11:55
*** dalvarez has joined #tripleo11:55
bogdandoso we'll need to compose the shorter version of that11:55
bogdandowith Nova and dependencies11:55
quiquellbogdando: It's rather the opposite: at puppet-nova we have to run the stuff that depends on nova11:56
quiquellWe have to test the dependents11:56
bogdandoI mean db, mq, keystone w/o that nova can't pass tempest11:56
bogdandoI hope mwhahaha could help with that also11:57
quiquellbogdando: yep, we have just started11:57
bogdandoI still find it cryptic to get a clear picture of how those resource_registry mappings translate into what actually gets activated11:57
bogdandodeeper in the heat magic11:58
bogdandolike one can include a service in the resource_registry but assign it to OS::Heat::None further down the road11:58
bogdandomagic, really11:58
mwhahahaWhat are you people talking about?11:58
bogdandomwhahaha: https://review.openstack.org/#/q/topic:standalone-puppet-gates+(status:open+OR+status:merged)11:59
bogdandobut making each module, whenever it is doable, get tested with the minimal config needed to pass tempest11:59
mwhahahaThe default is fine for those repos11:59
bogdandow/o making the puppet-keystone standalone job fail cuz of irrelevant puppet-nova issues11:59
mwhahahaNo need to adjust it11:59
bogdandoMy point is to make it truly composable, and minimal12:00
quiquellmwhahaha: We have something merged that could have been prevented with standalone jobs voting12:00
bogdandoand only run full integration for tripleo jobs12:00
quiquellmwhahaha: So we start with the task of using it at puppet- project gates12:00
*** jrist has joined #tripleo12:00
bogdandomwhahaha: default may not be good for puppet-ironic or puppet-horizon12:00
sshnaidmbogdando, if puppet-nova has an issue, most likely we'll have problems in all tripleo jobs, and we want to detect this12:00
*** abishop has joined #tripleo12:01
bogdandoso we'll have to add coverage either to those modules only, or bring the whole world in for all of the puppet-* modules12:01
mwhahahaYou folks are talking about over optimizations12:01
jristweshay: too early12:01
bogdandomwhahaha: nope12:01
mwhahahaWe don't need that right now12:01
bogdandobut do we need to test puppet-ironic and horizon?12:01
bogdandoit's not in defaults for standalone12:01
bogdandonor can I see Nova in http://git.openstack.org/cgit/openstack/tripleo-heat-templates/tree/environments/standalone/standalone-overcloud.yaml#n1212:02
mwhahahaIronic is tested with the undercloud job12:02
bogdandothose cryptic resource registries12:02
mwhahahaBecause that's where we use it12:02
bogdandoI suggested removing that job entirely12:02
bogdandook, what tests horizon?12:02
mwhahahaNo12:02
mwhahahaStandalone12:02
bogdandohm12:02
*** gfidente has joined #tripleo12:02
*** paramite has joined #tripleo12:02
mwhahahaHorizon is deployed by default on the standalone12:02
*** aufi has joined #tripleo12:03
bogdandook I give up on reading those http://git.openstack.org/cgit/openstack/tripleo-heat-templates/tree/environments/standalone/standalone-overcloud.yaml#n4512:03
bogdandoI can see neither Nova nor Horizon there12:03
bogdando:)12:03
mwhahahaBecause those are enabled by default12:03
bogdandoplease do the needful, folks, you have my blessing for the default standalone for all puppet modules!12:03
mwhahahaI'll point you at the defaults later12:03
mwhahahaNo not all12:03
mwhahahaJust what is there12:04
bogdandomwhahaha: one more thing, https://review.openstack.org/#/c/609300/3/.zuul.yaml wdyt of testing metadata changes?12:04
mwhahahaI'll explain later when I get to my desk12:04
bogdandolike puppet/xyz12:04
weshayhttps://review.openstack.org/#/c/607987/12:04
quiquellmwhahaha: Is the one listed here ok ? https://review.openstack.org/#/q/topic:standalone-puppet-gates12:04
weshaydoes that help cover the current issue?12:04
mwhahahaWe do not want to test metadata.json12:05
weshaymarios|rover, ^12:05
bogdandowhy?12:05
bogdandoit may break tripleo via missing RDO dependencies for some puppet modules?..12:05
mwhahahaWe dont use it12:05
bogdandoor pinned versions12:05
mwhahahaIt won't pick up anything12:05
bogdandohm12:05
bogdandobut we do pin some versions around12:05
mwhahahaAlso prevents us from bumping versions12:05
quiquellweshay: we are proposing adding standalone to puppet- projects https://review.openstack.org/#/q/topic:standalone-puppet-gates12:06
weshayyes12:06
bogdandoright, and don't we want to run tests against bumped metadata versions?12:06
weshaythanks for the patches12:06
mwhahahaquiquell: doesn't make sense on puppet-openstack-integration12:06
weshayya12:06
mwhahahaquiquell: the rest are fine12:06
quiquellmwhahaha: it's not the place for the template ?12:06
mwhahahaOh maybe12:06
weshaymwhahaha, I don't think we have the tempest tests merged for standalone12:06
weshaychandankumar, is back12:06
quiquellmwhahaha: I am just adding the template there for reuse purposes, not running it12:06
mwhahahaweshay: we do12:07
weshayscenario?12:07
mwhahahaquiquell: k that's fine12:07
* weshay looks12:07
chandankumarweshay: yup I am!12:07
mwhahahaweshay: we are just skipping some, but the basics are there12:07
mwhahahaWe need to fix the tests to include more12:07
quiquellmwhahaha: Do we have to replace undercloud jobs ?12:07
mwhahahaquiquell: for some repos yes12:08
mwhahahaquiquell: not all12:08
mwhahahaUndercloud uses things like zaqar, Mistral, heat, ironic12:08
mwhahahaSo we get coverage there12:08
mwhahahaWe don't really get coverage on nova12:08
weshaymwhahaha, my thought.. merge https://review.openstack.org/#/c/607987/  --> merge https://review.openstack.org/#/c/606046/8/config/general_config/featureset052.yml --> merge  https://review.openstack.org/#/q/topic:standalone-puppet-gates --> remove undercloud job12:08
mwhahahaWhich is why we should add the standalone there12:09
quiquellmwhahaha: which ones need the undercloud?12:09
mwhahahaI'll let you know in a bit when I'm not reading from my phone12:10
quiquellmwhahaha: ack, thanks mate12:10
*** ooolpbot has joined #tripleo12:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION12:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671012:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675612:10
*** ooolpbot has quit IRC12:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]12:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host=\'localhost\', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)12:10
marios|roverweshay: ack *you mean /#/c/607987/ ?12:10
* mwhahaha runs away and should have gone back to sleep12:10
chandankumarmwhahaha: http://logs.openstack.org/46/606046/8/check/tripleo-ci-centos-7-standalone/a263401/logs/tempest.html.gz more failure12:10
*** dciabrin_ has joined #tripleo12:10
*** quiquell is now known as quiquell|lunch12:11
chandankumarmwhahaha: is it expected in standalone http://logs.openstack.org/46/606046/8/check/tripleo-ci-centos-7-standalone/a263401/logs/undercloud/var/log/extra/errors.txt.gz#_2018-10-10_10_03_15_583 ?12:12
mwhahahachandankumar: likely because of the puppet-nova problems, recheck12:12
chandankumarwhere is openstackgerrit bot?12:13
mwhahahaMissing in action12:13
mwhahahaDon't ask, it's silly12:13
* Tengu asks anyway12:14
Tengubtw zuul new interface is faster.12:15
chandankumarTengu: All hail to tristan :-)12:15
*** dprince has joined #tripleo12:15
quiquell|lunchTengu: but the json stream url is broken :-/ the greasemonkey script is broken by it12:16
Tenguquiquell|lunch: not using greasemonkey, so I'm safe :D12:17
*** paramite has quit IRC12:17
*** paramite has joined #tripleo12:18
quiquell|lunchTengu: man you should use this https://github.com/mpeterson/openstack-greasemonkey-helpers12:19
quiquell|lunchTengu: Opening a gerrit review with this makes your eyes pop out12:19
*** fultonj has joined #tripleo12:19
beaglesbogdando: is that from tne agent logs?12:22
bogdandobeagles: yes12:22
bogdandoI keep debugging other wrappers... noticed that ip netns identify returns an empty result12:22
*** paramite has quit IRC12:23
bogdandothat's prolly cuz I'm starting the wrapper from the agent container directly, but should do that from some netns12:23
weshayquiquell|lunch, curl http://zuul.openstack.org/status right?12:24
beaglesbogdando - I think the agent runs the command as 'ip netns exec' already so this is pretty strange12:24
beaglesbogdando: it's as if the agent container no longer has the access it once did12:25
*** morazi has joined #tripleo12:29
bogdandothat's nsenter breaking the ip netns exec https://pastebin.com/CWxankLa12:30
bogdando;(12:30
*** quiquell|lunch is now known as quiquell12:30
bogdandobeagles: ^^12:30
*** radez has joined #tripleo12:30
quiquellweshay: curl http://zuul.openstack.org/api/status12:31
bogdandoso that container needs to be started from a netns?12:31
quiquellweshay: the url for running jobs are wrong12:31
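
For context on the zuul status exchange above, here is a minimal sketch (not part of the log) of polling the new status endpoint quiquell quotes. The endpoint URL comes from the conversation; the JSON layout ("pipelines" entries with "name" and "change_queues") is an assumption based on the Zuul v3 status page and may differ between Zuul versions.

    import json
    import urllib.request

    STATUS_URL = "http://zuul.openstack.org/api/status"

    def pipeline_summary(url=STATUS_URL):
        """Return a mapping of pipeline name -> number of change queues."""
        with urllib.request.urlopen(url, timeout=30) as resp:
            status = json.load(resp)
        return {
            pipeline.get("name", "?"): len(pipeline.get("change_queues", []))
            for pipeline in status.get("pipelines", [])
        }

    if __name__ == "__main__":
        for name, queues in pipeline_summary().items():
            print(f"{name}: {queues} change queue(s)")
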
beaglesbogdando: technically no, but that's how neutron will execute it12:31
beaglesbogdando: what's a bit weird though is that we are calling ip netns identify from within a call to nsenter12:32
beaglesbogdando: but also that wouldn't be the agent logs that would be coming from elsewhere12:33
bogdandobeagles: no, we do not, apparently it is my bad testing12:34
bogdandowe only do for podman12:34
beaglesbogdando: it looks like the agent can't run its own operations for namespaces - this doesn't even touch sidecar stuff yet12:34
beaglesbogdando: unless calling nsenter at some point breaks the namespaces (we've seen fragility in network namespaces before)12:35
bogdandobut that is done at the end of the wrapper script, as a forking call12:35
bogdandopodman forks and exits12:35
beaglesbogdando: I don't know -I guess what I'm thinking is that neutron_dhcp_agent is itself broken, not the sidecars12:37
*** ykarel__ is now known as ykarel12:38
bogdandooh, well, then I'll need some help with testing the topic, beagles12:38
beaglesbogdando: one test would be to restart the neutron_dhcp_agent without the wrappers mounted12:38
bogdandowill try that12:38
beaglesbogdando: if it works then we can blame the wrappers12:38
*** amoralej` is now known as amoralej|lunch12:38
beaglesbogdando: if not then maybe something funky with paunch/how the current containers are run12:39
*** rlandy has joined #tripleo12:39
*** vkmc has joined #tripleo12:40
*** sileht has left #tripleo12:41
*** fultonj has quit IRC12:42
*** bdodd_ has quit IRC12:42
*** bdodd has joined #tripleo12:43
*** udesale has joined #tripleo12:45
*** mcornea has joined #tripleo12:51
*** pdeore has joined #tripleo12:56
bogdandobeagles: still having perm denied w/o wrappers :(12:59
beaglesbogdando: ahh13:00
bogdandoso is it podman related prolly...13:00
bogdandowill try with docker13:00
beaglesbogdando: yeah.13:01
beaglesbogdando: are the podman agent containers not run as privileged?13:01
*** ansmith has joined #tripleo13:02
beaglesbogdando: if not, then we might need to add some additional capabilities, etc13:02
* beagles doesn't recall exactly what happened there - will go hunting for the paunch patch13:02
bogdandothey do13:02
*** rbrady has joined #tripleo13:02
bogdandobeagles: not really more than alias docker=paunch :)13:02
beaglesbogdando: ah okay13:03
bogdandobeagles: hmm, same stuff13:04
bogdandodid it ever work? :)13:04
beaglesbogdando: with docker it doesn't work ?13:05
beaglesfricking weird13:05
bogdandobeagles: same permission denies13:05
beaglesselinux?13:05
bogdandopermissive...13:05
bogdandomaybe a netns got hosed?13:05
bogdandohow can I clean it up?13:05
beaglesbogdando: yeah13:06
beaglesbogdando: reboot basically I think13:06
bogdandoum okay )13:06
beaglesikr13:06
*** lblanchard has joined #tripleo13:07
*** aufi has quit IRC13:08
*** shyam89 has quit IRC13:08
*** tzumainn has joined #tripleo13:08
*** shyam89 has joined #tripleo13:08
*** aufi has joined #tripleo13:09
bogdandobeagles: ugh, it has 000013:10
bogdandonice access pattern13:10
*** ooolpbot has joined #tripleo13:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION13:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671013:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675613:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]13:10
*** ooolpbot has quit IRC13:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host=\'localhost\', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)13:10
bogdandobeagles: does that normally happen to those qdhcp-* ? :)13:10
beaglesbogdando: nope13:10
beaglesbogdando: we used to see weird namespace issues when we weren't mounting /var/run/netns properly13:11
beaglesbogdando: can you delete the namespace manually?13:11
*** kopecmartin|scho is now known as kopecmartin|ruck13:13
bogdandochmoded it, rebooting now13:14
beaglesmarios|rover: bandini : where is the task for updating all the packages during update/upgrade these days?13:14
beaglesbogdando: ack13:14
*** shyam89 has quit IRC13:14
* beagles declares ns's magic, burns some sage in support of bogdando13:15
marios|roverbeagles: in upgrade tasks last i knew :)?13:15
beaglesmarios|rover: k I'll look there :)13:15
marios|roverhttps://github.com/openstack/tripleo-heat-templates/blob/ec227891bd5508a13312449d27265344f831f97c/puppet/services/tripleo-packages.yaml#L9513:15
marios|roverbeagles: ^13:15
* bogdando sees the sage burning in screams13:15
*** hkominos has joined #tripleo13:16
bogdandoI hope that works for god of ns's13:16
*** holser_ has quit IRC13:16
beagles:)13:16
*** aufi has quit IRC13:21
bogdandobeagles: yes, worked well, even works now with podman wrappers13:22
beaglesbogdando: woo hoo13:22
bogdandoso it's ready for review and testing :)13:22
beaglesbogdando: thanks 10^6 man13:22
bogdandothanks for helping with debugs beagles13:23
*** holser_ has joined #tripleo13:23
*** aufi has joined #tripleo13:25
hkominosHi all. Quick question. Is there an easy way to bypass the default container source (docker.io/tripleomaster) when trying to create the undercloud  ??13:27
hjensasbogdando: Do we have support for retries in docker_config? I.e https://github.com/openstack/tripleo-heat-templates/blob/master/docker/services/ironic-inspector.yaml#L158 ?13:29
*** holser_ has quit IRC13:30
bogdandohjensas: AFAICT docker-puppet.py does not retry13:30
bogdandodprince: around perchance? ^^13:30
*** gfidente has quit IRC13:31
dprincebogdando: hi, whats up13:31
dprincebogdando: docker-puppet.py retries the 'docker pull'13:31
dprincebogdando: but not the actual 'puppet apply' command13:31
bogdandoyeah, dprince thanks for info! hjensas^^13:32
dprinceI'm not sure we want it to retry that. If it fails once it is very likely to fail again, and retrying just wastes time before reporting the underlying failure13:32
* bogdando tends to agree13:32
bogdandohjensas: you can put retries into curl13:33
bogdandoor the script itself?..13:34
weshaymarios|rover, kopecmartin|ruck have we seen this before? http://logs.openstack.org/30/607530/11/check/tripleo-ci-centos-7-containers-multinode/438b762/logs/undercloud/home/zuul/overcloud_deploy.log.txt.gz13:34
hjensasbogdando: yes, curl can retry if it gets a response i.e 4xx or 5xx, but not for http://paste.openstack.org/show/731851/13:34
hjensasbogdando: curl in Fedora can retry on ECONNREFUSED as well, but not RHEL.13:35
weshaymarios|rover, kopecmartin|ruck /me looking at the results for https://review.openstack.org/60753013:35
weshaywhich we neeeed to merge13:35
bogdandooops13:35
weshayhttp://logs.openstack.org/30/607530/11/check/tripleo-ci-centos-7-containers-multinode/438b762/logs/undercloud/home/zuul/overcloud_deploy.log.txt.gz#_2018-10-10_12_59_1313:35
*** trown is now known as trown|afk13:35
bogdandomaybe the task can be converted into ansible? with proper retrying13:35
bogdandohjensas:13:35
weshay2018-10-10 12:59:13 |         "2018-10-10 12:57:22,363 ERROR: 19482 -- Failed running docker-puppet.py for nova_libvirt",13:35
weshay2018-10-10 12:59:13 |         "2018-10-10 12:57:22,363 ERROR: 19482 -- Notice: hiera(): Cannot load backend module_data: cannot load such file -- hiera/backend/module_data_backend",13:35
weshaythat is the other fix for puppet I guess13:36
weshayso yes we have seen it13:36
*** fultonj has joined #tripleo13:36
weshaymarios|rover, kopecmartin|ruck is this the fix? https://review.openstack.org/#/c/609289/13:36
weshayyup13:37
hjensasbogdando: yes, good idea. Thanks! Will try that.13:37
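
As an aside on the retry discussion above: curl on RHEL at the time would not retry on ECONNREFUSED, so one option is a small polling wrapper around the request. This is a hedged sketch under assumed timings and with an illustrative URL, not the actual tripleo-heat-templates task.

    import time
    import urllib.error
    import urllib.request

    def wait_for_service(url, attempts=30, delay=10):
        """Poll `url`, retrying on connection-level errors until it answers."""
        last_error = None
        for _ in range(attempts):
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    return resp.status
            except (urllib.error.URLError, ConnectionError) as exc:
                last_error = exc  # e.g. connection refused while the API starts
                time.sleep(delay)
        raise RuntimeError(f"{url} never came up: {last_error}")

    # Illustrative usage (address is an assumption, not taken from the log):
    # wait_for_service("http://192.168.24.1:5050/v1")
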
weshayI'm up to speed now13:37
marios|roverweshay: sorry was momentarily afk reading back13:38
weshaymarios|rover, I think I got the flow now :)13:38
*** holser_ has joined #tripleo13:38
marios|roverweshay: yeah that one was fixed by quiquell++ with help from Tengu++ very early today13:38
marios|roveri was like 'good morning' quiquell was like 'gates are broken'13:39
Tengu:)13:39
Tenguand in the end we were like "woohooo we fixed it"13:40
Tengu:)13:40
weshaymarios|rover, Tengu kopecmartin|ruck so now we just need https://review.openstack.org/#/c/608589/ to land13:40
* weshay is resetting the job13:40
marios|roverweshay: yeah ssl one13:40
*** ade_lee has joined #tripleo13:40
marios|roverweshay: thats the one the ovb jobs were hitting13:41
marios|roverweshay: so blocking promotion13:41
weshayright13:41
*** hkominos has quit IRC13:41
matbumarios|rover: weshay hey o/ can i have a quick review on this plz https://review.openstack.org/#/c/606087 ?13:41
*** holser_ has quit IRC13:41
* Tengu just found some really "nice" issue in mistral containers using podman+skopeo13:41
marios|rovermatbu: ack added to my reviews list13:42
d0ugalSounds interesting!13:42
Tengu-.-' took some time, but in the end, I have something.13:42
d0ugalTengu: What did you find?13:42
Tengud0ugal: https://bugs.launchpad.net/tripleo/+bug/1797114  if you're interested :)13:42
openstackLaunchpad bug 1797114 in tripleo "running skopeo in podman fails with "Error inspecting image"" [Medium,Triaged] - Assigned to Emilien Macchi (emilienm)13:42
d0ugalTengu: thanks13:42
Tenguin the end it's a "simple" permission denied, although it's probably not "only" that.13:42
Tengubut at least I get the right error now, need to understand why it works without the /run:/run though.13:43
*** amoralej|lunch is now known as amoralej13:44
*** holser_ has joined #tripleo13:45
*** apetrich has joined #tripleo13:46
mwhahahabogdando: so the default services for a deploy are, https://github.com/openstack/tripleo-heat-templates/blob/master/overcloud-resource-registry-puppet.j2.yaml#L104-L35813:48
*** vinaykns has joined #tripleo13:48
mwhahahabogdando: if they are present in the role13:48
d0ugalTengu: Good work tracking it down, looks confusing to me :)13:49
mwhahahabogdando: we turn off services that are enabled by default in the standalone via, https://github.com/openstack/tripleo-heat-templates/blob/master/environments/standalone/standalone-tripleo.yaml#L52-L10413:49
Tengud0ugal: well, debugging containers is a pain ;).13:49
Tengud0ugal: and I kind of updated the issue as I found things, so the whole process of tracking it down is recorded there.13:49
mwhahahabogdando: so if you wanted to enable/disable one of the services, you would provide an environment file that overrides the None mapping, or sets it to None13:49
*** mjturek has joined #tripleo13:50
*** ykarel is now known as ykarel|afk13:51
*** aufi has quit IRC13:52
bogdandomwhahaha: thanks! I knew of that file, it's just that this cryptic kind of knowledge needs constant refreshing! :)13:52
mwhahahayea, love that complexity :D13:52
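
To illustrate the mechanism mwhahaha describes above: a service is enabled or disabled by remapping its resource_registry entry, with OS::Heat::None meaning "not deployed". The sketch below just writes such an override environment; the service names and relative template path are illustrative assumptions rather than an exact copy of the tripleo-heat-templates files.

    import yaml  # PyYAML

    override_env = {
        "resource_registry": {
            # Disable a service the defaults would otherwise deploy:
            "OS::TripleO::Services::SwiftProxy": "OS::Heat::None",
            # Re-enable one that the standalone environment maps to None
            # (path shown relative to the environment file, as an example):
            "OS::TripleO::Services::IronicApi":
                "../../docker/services/ironic-api.yaml",
        }
    }

    with open("custom-services.yaml", "w") as handle:
        yaml.safe_dump(override_env, handle, default_flow_style=False)
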
*** holser_ has quit IRC13:55
*** rbrady has quit IRC14:01
*** agurenko has quit IRC14:01
*** trozet has joined #tripleo14:01
Tengud0ugal: in the end, I'm pretty sure it's a skopeo/libpod bug :). Will discuss that on #podman.14:01
d0ugalTengu: k14:01
*** slagle has joined #tripleo14:04
*** lblanchard has quit IRC14:05
*** ykarel_ has joined #tripleo14:06
quiquellmwhahaha, bogdando: for standalone at puppet-nova, do I remove the undercloud job?14:07
mwhahahayea it would be fine14:08
quiquellmwhahaha: Do we cover the same things in tempest?14:08
mwhahahamore so in the standalone14:09
quiquellmwhahaha: fair enough14:09
*** ykarel|afk has quit IRC14:09
*** ykarel_ is now known as ykarel|afk14:09
*** ooolpbot has joined #tripleo14:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION14:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671014:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675614:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]14:10
*** ooolpbot has quit IRC14:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host=\'localhost\', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)14:10
quiquellmwhahaha, bogdando: Done https://review.openstack.org/#/c/60930114:11
quiquellmwhahaha: Should I remove the ^metadata.json$ line in the Depends-On change?14:12
mwhahahaquiquell: no leave it14:12
quiquellmwhahaha: ack14:13
*** dtantsur is now known as dtantsur|brb14:13
*** quiquell is now known as quiquell|off14:14
chandankumarkopecmartin|ruck: Hello14:15
chandankumarkopecmartin|ruck: for above tempest bug http://zuul.openstack.org/builds?job_name=tripleo-ci-centos-7-undercloud-containers14:15
chandankumarkopecmartin|ruck: is the failure coming multiple times?14:16
*** fultonj has quit IRC14:17
kopecmartin|ruckchandankumar, i have no idea, define multiple times :D i can see a failure on the page you sent three times14:21
bogdandoquiquell|off: let's please collect all under https://review.openstack.org/#/q/topic:standalone-puppet-gates+(status:open+OR+status:merged)14:22
bogdandoI wanted to start that mail thread...14:22
bogdandoI'm not happy with the default standalone bombing all the puppet modules with the full setup :)14:22
chandankumarkopecmartin|ruck: the other two failed due to different reasons14:22
jaosoriorbogdando: what does the full setup include?14:23
bogdandoto know that for sure you'll need a superability to merge yamls in your head14:23
chandankumarkopecmartin|ruck: I will dig into the bug once it appears one more time14:23
* mwhahaha pokes weshay 14:24
mwhahahaweshay: you free?14:24
bogdandojaosorior:14:24
bogdando3:48:35 PM GMT+2 - mwhahaha: bogdando: so the default services for a deploy are, https://github.com/openstack/tripleo-heat-templates/blob/master/overcloud-resource-registry-puppet.j2.yaml#L104-L35814:24
bogdando3:48:46 PM GMT+2 - mwhahaha: bogdando: if they are present in the role14:24
bogdando3:49:21 PM GMT+2 - mwhahaha: bogdando: we turn off services that are enabled by default in the standalone via, https://github.com/openstack/tripleo-heat-templates/blob/master/environments/standalone/standalone-tripleo.yaml#L52-L10414:24
bogdando3:49:58 PM GMT+2 - mwhahaha: bogdando: so you if you wanted to enable/disable one of the services then you would provide an environment file that overrides the none setting or set it to none14:24
weshaymwhahaha, sorry.. talking to amnan14:24
mwhahahak14:24
*** roger2 has joined #tripleo14:24
mwhahahaweshay: maybe we can chat after cix meeting14:25
bogdandojaosorior: so I'd say for each particular module, most likely, unrelated things get deployed, and, less likely, there is a chance that the module itself is not covered at all14:26
dpeacockmwhahaha: I'm attending cix today14:26
*** fultonj has joined #tripleo14:26
*** fultonj has quit IRC14:26
weshayk14:27
jaosoriorbogdando: understood. What's your suggestion?14:29
roger2My "openstack undercloud install" failed with "ERROR: Network create/update failed." and "504 Gateway Time-out". Should I just re-run the install command and see if it will get past it the next time?14:32
*** trown|afk is now known as trown14:35
*** mfedosin has quit IRC14:38
roger2undercloud.conf and last 100 lines of install-undercloud.log: https://pastebin.com/T9Pwdec314:41
Tenguhmm is this known already, or transient, or... ? http://logs.openstack.org/57/607557/10/check/tripleo-ci-centos-7-undercloud-containers/9d80c65/logs/undercloud/home/zuul/tempest.log.txt.gz14:45
Tenguchandankumar: maybe? -^^  (as related to tempest apparently)14:45
*** rbrady has joined #tripleo14:46
Tenguah. well. ok. it's known :D14:47
*** abishop has quit IRC14:51
*** skramaja has quit IRC14:51
bogdandomwhahaha, quiquell|off, weshay: http://lists.openstack.org/pipermail/openstack-dev/2018-October/135644.html14:53
bogdandojaosorior: ^^14:53
*** chandankumar is now known as chkumar|off14:55
bogdandojistr, shardy: ^^ please share your ideas :)14:55
bogdandoIt's such a pity to see the composability not being used14:56
*** iurygregory has quit IRC14:57
*** iurygregory has joined #tripleo14:59
*** ramishra has quit IRC15:03
*** pdeore has quit IRC15:05
*** agurenko has joined #tripleo15:07
weshaymwhahaha, k.. sorry.. ready when ever15:07
*** ooolpbot has joined #tripleo15:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION15:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671015:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675615:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]15:10
*** ooolpbot has quit IRC15:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host=\'localhost\', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)15:10
kopecmartin|ruckmarios|rover, weshay it looks like  tripleo-ci-centos-7-undercloud-oooq gate job is affected by this old issue https://bugs.launchpad.net/tripleo/+bug/176477715:13
openstackLaunchpad bug 1764777 in tripleo "queens-uc-newton-oc undercloud install fails on creating default plan" [High,Triaged]15:13
*** panda has quit IRC15:13
*** panda has joined #tripleo15:14
marioskopecmartin|ruck: ack15:14
weshaykopecmartin|ruck, on this patch? https://review.openstack.org/#/c/608324/115:15
kopecmartin|ruckweshay, yes15:15
mariosweshay: yeah in the gate but passed check15:15
weshayya15:16
weshaykopecmartin|ruck, nice job..  so if it fails the gate, my advice is to add alert, promotion-blocker15:17
weshayblocks queens15:17
*** holser_ has joined #tripleo15:18
kopecmartin|ruckweshay, ok, I've added the tags15:19
weshayapetrich, you available to look at a mistral bug?15:19
d0ugalweshay: I can take a look15:19
weshaythanks d0ugal https://bugs.launchpad.net/tripleo/+bug/176477715:19
openstackLaunchpad bug 1764777 in tripleo "queens-uc-newton-oc undercloud install fails on creating default plan" [High,Triaged]15:19
weshaythanks kopecmartin|ruck15:19
mwhahahaweshay: i got like 10 mins, you still free?15:22
weshayya15:22
*** jrist has quit IRC15:23
bogdandobeagles, Tengu: attempted to test the rootwrappers with Enforcing SELinux, failed miserably, put it back to permissive, rebooted and observed a miracle https://pastebin.com/x91f002d15:23
*** agopi is now known as agopi|lunch15:23
bogdandoisn't that god of ns'es a tricky guy?15:24
*** Petersingh is now known as Petersingh|gone15:24
*** Petersingh|gone has quit IRC15:24
*** holser_ has quit IRC15:27
bogdandohow come that /run/netns disappears w/o being recreated? :D15:27
*** aufi has joined #tripleo15:28
beaglesbogdando: pretty cool... don't think I've ever heard of /run/netns not existing before though15:28
bogdando:D15:29
d0ugalweshay: The second report (comment 7) is a different error than the one in the original description15:29
d0ugalI don't have enough details to figure out why the original error happened15:30
weshayah k15:30
d0ugalbut the more recent one is just a swift timeout15:30
weshayd0ugal, ok.. kopecmartin|ruck can we open a new bug for the job that currently failed15:30
weshayd0ugal, we're focused on the gate failure for queens15:30
d0ugalweshay: k15:30
weshaysorry about that15:30
d0ugalnp15:30
d0ugalweshay: The tripleo-common code should probably retry the swift upload on failure. I can try and improve that tomorrow (running out of time today)15:31
d0ugalbut I guess it is unlikely to be a consistent failure15:31
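
A minimal sketch of the "retry the swift upload" idea d0ugal raises above, using the tenacity library; the upload helper, container, and object names are placeholders, and this is not the actual tripleo-common change.

    from tenacity import retry, stop_after_attempt, wait_exponential

    @retry(stop=stop_after_attempt(3),
           wait=wait_exponential(multiplier=2, max=30),
           reraise=True)
    def upload_with_retry(swift_client, container, name, contents):
        """Push an object to swift, retrying a few times on transient errors."""
        swift_client.put_object(container, name, contents=contents)

    # Illustrative usage, assuming a swiftclient Connection object:
    # upload_with_retry(swift, "overcloud", "plan-environment.yaml", data)
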
kopecmartin|ruckweshay, d0ugal ok, sorry,  I'll create a new bug15:33
*** mburned is now known as mburned_out15:33
*** aufi has quit IRC15:33
bogdandodo you know something of that poor snmp thing failing rspecs? mwhahaha? http://logs.openstack.org/07/606907/1/check/puppet-openstack-unit-4.8-centos-7/fdaa62d/job-output.txt.gz15:33
mwhahahabogdando: maybe a bump in the snmp module broke something?15:34
mwhahahai'll look later15:34
bogdandothanks a bunch15:35
*** iurygregory has quit IRC15:35
*** iurygregory has joined #tripleo15:36
*** jfrancoa has quit IRC15:37
*** ksambor has quit IRC15:38
*** jcoufal has joined #tripleo15:39
*** yprokule has quit IRC15:46
*** agurenko has quit IRC15:46
kopecmartin|ruckweshay, ok, I've copied it to a new bug https://bugs.launchpad.net/tripleo/+bug/179716715:49
openstackLaunchpad bug 1797167 in tripleo "swift timeout during undercloud deploy" [Undecided,New]15:49
kopecmartin|ruckweshay, can you please set it as Triaged? I need to ping someone because of the permissions to do it15:50
*** leanderthal has quit IRC15:50
*** jcoufal has quit IRC15:50
*** morazi has quit IRC15:51
*** naturalblue has quit IRC15:52
*** udesale has quit IRC15:57
*** sshnaidm has quit IRC15:58
d0ugalkopecmartin|ruck: I updated the status15:58
*** sshnaidm has joined #tripleo15:59
*** rdopiera has quit IRC16:01
*** bogdando has quit IRC16:03
*** akrivoka_ has quit IRC16:08
*** akrivoka_ has joined #tripleo16:09
*** ooolpbot has joined #tripleo16:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION16:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671016:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675616:10
*** ooolpbot has quit IRC16:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]16:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host=\'localhost\', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)16:10
*** ykarel|afk is now known as ykarel16:13
*** abishop has joined #tripleo16:13
*** gfidente has joined #tripleo16:15
*** dtantsur|brb is now known as dtantsur16:15
weshaykopecmartin|ruck, k16:20
chkumar|offTengu: the tempest failure seems odd to me, is it coming up continuously?16:22
*** akrivoka_ has quit IRC16:29
*** gfidente[m] has joined #tripleo16:29
*** ykarel_ has joined #tripleo16:30
apetrichweshay, aye16:30
apetrichoh d0ugal has looked at it16:31
*** naturalblue has joined #tripleo16:31
weshaythanks16:32
*** ykarel has quit IRC16:32
*** jpich has quit IRC16:33
*** hjensas has quit IRC16:38
*** vinaykns has quit IRC16:41
*** shardy has quit IRC16:44
*** cgoncalves has quit IRC16:49
*** cgoncalves has joined #tripleo16:49
*** ykarel__ has joined #tripleo16:52
*** ykarel_ has quit IRC16:56
*** ykarel_ has joined #tripleo16:56
*** ykarel__ has quit IRC16:58
*** trown is now known as trown|lunch16:59
*** ykarel_ has quit IRC17:02
*** derekh has quit IRC17:03
*** ooolpbot has joined #tripleo17:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION17:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671017:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]17:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675617:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179716717:10
*** ooolpbot has quit IRC17:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host=\'localhost\', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)17:10
openstackLaunchpad bug 1797167 in tripleo "swift timeout during undercloud deploy" [Critical,Triaged]17:10
*** agopi|lunch is now known as agopi17:25
*** psachin has quit IRC17:28
*** dtantsur is now known as dtantsur|afk17:34
mwhahahaqueens undercloud upgrades job is broken, do we have a bug for that? http://logs.openstack.org/24/608324/1/check/tripleo-ci-centos-7-undercloud-upgrades/1d06342/logs/undercloud/home/zuul/undercloud_upgrade.log.txt.gz#_2018-10-10_13_16_0817:35
mwhahahatherve: -^ fyi do you know about the policy.json issue?17:35
*** morazi has joined #tripleo17:37
*** salmankhan has quit IRC17:39
*** Vorrtex has joined #tripleo17:40
weshayhttps://review.rdoproject.org/zuul/status17:45
*** salmankhan has joined #tripleo17:52
*** pcaruana has quit IRC17:55
nhicherweshay, rlandy: the job on vexxhost failed with a tls-cert-inject.yaml error https://logs.rdoproject.org/96/15896/37/check/legacy-tripleo-ci-centos-7-ovb-3ctlr_1comp-featureset001-master-vexxhost/a891b3f/job-output.txt.gz#_2018-10-10_13_56_47_936340. Do you have the same error on other jobs?17:58
*** salmankhan has quit IRC17:58
weshaynhicher, I sent you that review yesterday I thought17:59
nhicherweshay: yes, I checked tripleo-quickstart-extra on the nodepool slave, the tls review is merged, but job failed17:59
nhicherweshay: same error on rdocloud too https://review.rdoproject.org/r/#/c/15896/18:01
*** apetrich has quit IRC18:03
*** apetrich has joined #tripleo18:04
roger2I'm trying to deploy the rocky release but it fails. Is there a simple way to give up and try queens without reinstalling a clean OS? I assume the answer is similar to following the upgrade instructions here https://docs.openstack.org/tripleo-docs/latest/install/installation/installation.html#updating-undercloud-components18:07
weshaynhicher, not merged https://review.openstack.org/#/c/608589/18:07
weshayroger2, you don't need to reinstall the os18:09
roger2weshay: ok thx.18:10
*** ooolpbot has joined #tripleo18:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION18:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671018:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675618:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]18:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179716718:10
*** ooolpbot has quit IRC18:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host=\'localhost\', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)18:10
openstackLaunchpad bug 1797167 in tripleo "swift timeout during undercloud deploy" [Critical,Triaged]18:10
nhicherweshay: thanks, I will add a depends-on18:10
weshaytoure, what happened here? http://logs.openstack.org/89/608589/5/check/tripleo-ci-centos-7-scenario002-multinode-oooq-container/3a5c058/logs/undercloud/home/zuul/overcloud_deploy.log.txt.gz#_2018-10-10_15_50_0218:11
*** pcaruana has joined #tripleo18:12
*** itlinux has quit IRC18:13
* toure looks18:14
*** lblanchard has joined #tripleo18:14
weshaymwhahaha, any idea wtf this is? http://logs.openstack.org/89/608589/5/check/tripleo-ci-centos-7-scenario002-multinode-oooq-container/3a5c058/logs/undercloud/home/zuul/overcloud_deploy.log.txt.gz#_2018-10-10_15_50_0218:15
weshaybefore I recheck18:15
toureweshay that bug should be fixed18:16
weshaywhat bug?18:17
tourewebsocket timeout with mistral18:17
toureit was due to log-rotate18:17
tourelogrotate postscript18:17
*** raildo has quit IRC18:17
weshayah.. I wonder18:17
* toure shakes fist at container management18:18
toureweshay https://bugs.launchpad.net/tripleo/+bug/178968018:19
openstackLaunchpad bug 1789680 in tripleo "mistral MessagingTimeout correlates with containerized undercloud uptime" [Critical,Fix committed] - Assigned to Toure Dunnon (toure)18:19
*** panda has quit IRC18:20
toureweshay so the proposed fix was pushed to the client, but I had a suspicion that this fix should have been placed into the api service directly18:21
*** panda has joined #tripleo18:23
weshaytoure, hrm.. wondering if the proper version of mistral has not been promoted18:24
* toure looking at mistral version18:25
*** raildo has joined #tripleo18:26
toureweshay looks like it is an older package 2018080611544618:28
weshaytoure, ya.. so we need a promotion18:29
toureweshay just missed the fix, the job has mistral 7.0.1 and the fix went into 7.0.318:30
toureyeah18:30
d0ugalWhy would a mistral promotion help?18:30
d0ugalI think I'm missing something18:30
toureto include the merged fix18:30
d0ugaltoure: ah, which fix?18:30
tourerpc reset18:31
d0ugalgotcha18:31
tourehttps://review.openstack.org/60563318:31
*** trown|lunch is now known as trown18:34
*** roger2 has quit IRC18:34
d0ugaltoure, weshay: it could also just be that it took longer than the timeout18:35
d0ugalas the message says :)18:35
d0ugalhttps://github.com/openstack/python-tripleoclient/blob/master/tripleoclient/workflows/deployment.py#L5218:35
*** roger2 has joined #tripleo18:36
d0ugalFrom my reading of the mistral logs it took almost exactly 10 mins18:36
d0ugaland I think we only wait 6 mins?18:36
toured0ugal ah, I thought we waited longer than that18:40
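
For reference on the timeout d0ugal points at: the client waits a fixed period for messages from the deploy workflow and gives up if Mistral takes longer. The sketch below only illustrates that pattern with d0ugal's rough figures; it is not the real tripleoclient.workflows.deployment code.

    import time

    def wait_for_messages(poll_fn, timeout=360, interval=5):
        """Call `poll_fn()` until it returns a message or the timeout expires."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            message = poll_fn()
            if message is not None:
                return message
            time.sleep(interval)
        raise TimeoutError(
            "Timed out waiting for the deployment workflow; the Mistral "
            "execution may still be running server-side.")
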
toureweshay18:41
toured0ugal weshay do you know if latency is captured in the logs? I see route tables and ports18:43
d0ugalno idea18:43
weshaytoure, all that kind of data is in /var/log/extras18:44
weshaybut not sure if we have anything that would capture latency18:44
weshaymaybe the dstat18:44
tourelink status is fine18:44
toureby link stats I mean are there any errors on the tx or rx side18:45
*** pcaruana has quit IRC19:03
*** ooolpbot has joined #tripleo19:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION19:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671019:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675619:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179716719:10
*** ooolpbot has quit IRC19:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]19:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host=\'localhost\', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)19:10
openstackLaunchpad bug 1797167 in tripleo "swift timeout during undercloud deploy" [Critical,Triaged]19:10
*** kopecmartin|ruck has quit IRC19:27
*** ssbarnea has quit IRC19:29
*** dciabrin_ has quit IRC19:40
*** Vorrtex has quit IRC19:46
*** colonwq has joined #tripleo19:48
*** zaneb has quit IRC19:48
*** pcaruana has joined #tripleo19:55
*** pcaruana has quit IRC20:06
*** ooolpbot has joined #tripleo20:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION20:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671020:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]20:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675620:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179716720:10
*** ooolpbot has quit IRC20:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host=\'localhost\', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)20:10
openstackLaunchpad bug 1797167 in tripleo "swift timeout during undercloud deploy" [Critical,Triaged]20:10
*** divergence has joined #tripleo20:16
*** rh-jelabarre has quit IRC20:17
*** rh-jelabarre has joined #tripleo20:17
*** aedc has quit IRC20:20
*** dprince has quit IRC20:22
*** mcornea has quit IRC20:24
*** mcornea has joined #tripleo20:27
*** aedc has joined #tripleo20:32
*** ansmith has quit IRC20:35
*** abishop has quit IRC20:46
*** itlinux has joined #tripleo20:46
*** aedc has quit IRC20:47
*** aedc has joined #tripleo20:47
*** lblanchard has quit IRC20:49
*** aedc has quit IRC20:56
*** boazel has quit IRC21:00
*** mjturek has quit IRC21:00
*** slaweq has quit IRC21:04
*** dmsimard has joined #tripleo21:08
*** dmsimard has left #tripleo21:08
*** dtrainor_ has quit IRC21:09
*** ooolpbot has joined #tripleo21:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION21:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671021:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]21:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675621:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179716721:10
*** ooolpbot has quit IRC21:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host=\'localhost\', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)21:10
openstackLaunchpad bug 1797167 in tripleo "swift timeout during undercloud deploy" [Critical,Triaged]21:10
*** slaweq has joined #tripleo21:11
*** trown is now known as trown|outtypewww21:13
*** dhill_ has quit IRC21:13
*** raildo has quit IRC21:14
*** kopecmartin has joined #tripleo21:14
*** slaweq has quit IRC21:16
*** ssbarnea has joined #tripleo21:18
*** dhill_ has joined #tripleo21:19
*** dtrainor_ has joined #tripleo21:29
bnemecWorking from a local checkout of the templates and getting <urlopen error [Errno 2] No such file or directory: '/home/centos/templates/environments/network-isolation.yaml'>21:31
*** kopecmartin has quit IRC21:32
bnemecShouldn't that file be generated during template processing?21:32
bnemecIf not, how do you use the jinja-fied net-iso environments?21:32
itlinuxis there an upgrade process from undercloud to undercloud container?21:32
*** sai_p has joined #tripleo21:41
*** mcornea has quit IRC21:41
mwhahahabnemec: process-templates.py i think, you can pass a roles data and network data in21:41
*** roger2 has quit IRC21:43
*** ssbarnea has quit IRC21:45
bnemecThat's a pain. So you have to remember to manually re-run that every time you change anything with your networking or roles?21:47
mwhahahai'm not completely sure, but i want to say yes? though it should find the .j2 and process it21:47
mwhahahathough i haven't tried it recently21:48
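
For reference on the jinja templating question above: the .j2 files in tripleo-heat-templates are rendered with roles/network data before files like environments/network-isolation.yaml exist, which is what tools/process-templates.py does. The sketch below only illustrates that rendering step; the exact variables the real template expects are an assumption and may differ.

    import yaml
    from jinja2 import Template

    with open("roles_data.yaml") as handle:
        roles = yaml.safe_load(handle)
    with open("network_data.yaml") as handle:
        networks = yaml.safe_load(handle)

    with open("environments/network-isolation.j2.yaml") as handle:
        rendered = Template(handle.read()).render(roles=roles, networks=networks)

    with open("environments/network-isolation.yaml", "w") as handle:
        handle.write(rendered)
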
*** raildo has joined #tripleo21:53
*** raildo has quit IRC21:54
*** salmankhan has joined #tripleo21:59
*** toure is now known as toure|gone22:00
*** gfidente has quit IRC22:06
*** ooolpbot has joined #tripleo22:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION22:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671022:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]22:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675622:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179716722:10
*** ooolpbot has quit IRC22:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host=\'localhost\', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)22:10
openstackLaunchpad bug 1797167 in tripleo "swift timeout during undercloud deploy" [Critical,Triaged]22:10
*** boazel has joined #tripleo22:11
*** apetrich has quit IRC22:12
*** tosky has quit IRC22:19
*** apetrich has joined #tripleo22:31
*** rlandy has quit IRC22:33
*** sshnaidm is now known as sshnaidm|afk22:33
*** jtomasek has quit IRC22:36
weshayd0ugal, toure|gone another gate reset http://logs.openstack.org/19/605419/14/gate/tripleo-ci-centos-7-scenario001-multinode-oooq-container/8783d8e/logs/undercloud/home/zuul/overcloud_deploy.log.txt.gz#_2018-10-10_21_55_1922:37
*** rcernin has joined #tripleo22:41
*** lblanchard has joined #tripleo22:44
*** salmankhan has quit IRC22:48
*** lblanchard has quit IRC22:57
*** lblanchard has joined #tripleo23:09
*** ooolpbot has joined #tripleo23:10
ooolpbotURGENT TRIPLEO TASKS NEED ATTENTION23:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/178968023:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179671023:10
openstackLaunchpad bug 1789680 in tripleo "mistral MessagingTimeout correlates with containerized undercloud uptime" [Critical,Triaged] - Assigned to Toure Dunnon (toure)23:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179675623:10
ooolpbothttps://bugs.launchpad.net/tripleo/+bug/179716723:10
*** ooolpbot has quit IRC23:10
openstackLaunchpad bug 1796710 in tripleo "Tempest tests failed with Read timed out error on tripleo-ci-centos-7-undercloud-containers" [Critical,Triaged]23:10
openstackLaunchpad bug 1796756 in tripleo "Error searching for image docker.io/tripleorocky/centos-binary-ceilometer-compute - UnixHTTPConnectionPool(host=\'localhost\', port=None): Read timed out." [Critical,In progress] - Assigned to Alex Schultz (alex-schultz)23:10
openstackLaunchpad bug 1797167 in tripleo "swift timeout during undercloud deploy" [Critical,Triaged]23:10
*** lblanchard has quit IRC23:12
*** zaneb has joined #tripleo23:18
*** lblanchard has joined #tripleo23:40
*** sai_p has quit IRC23:50
