22:01:07 <corvus> #startmeeting zuul
22:01:08 <openstack> Meeting started Mon Jan 8 22:01:07 2018 UTC and is due to finish in 60 minutes. The chair is corvus. Information about MeetBot at http://wiki.debian.org/MeetBot.
22:01:09 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
22:01:09 <dmsimard> \o
22:01:11 <openstack> The meeting name has been set to 'zuul'
22:01:27 <Shrews> hey
22:01:31 <corvus> #link agenda https://wiki.openstack.org/wiki/Meetings/Zuul
22:02:01 * mordred waves to humans
22:02:15 <corvus> #link release roadmap https://storyboard.openstack.org/#!/board/53
22:02:51 <corvus> let's go through the 3.0 things real quick --
22:03:15 <corvus> did someone pick up the github ingestion via zuul-web yet?
22:03:34 <corvus> i guess that's the last thing that's still in the old webapp, and once we get rid of that we can drop it entirely?
22:04:04 <tobiash> I picked this up before Christmas
22:04:20 <corvus> tobiash: ah great -- it's https://review.openstack.org/504267 right?
22:04:30 <tobiash> What's still missing is an end-to-end test
22:04:39 <pabelanger> corvus: I thought it was assigned to jlk, but I've seen he's moved on to a new job recently.
22:05:03 <corvus> pabelanger: jlk handed it off to tobiash
22:05:07 <tobiash> corvus: yes
22:05:08 <SpamapS> o/
22:05:14 <pabelanger> ack, I missed that
22:05:21 <corvus> tobiash: i agree with your last comment there -- we probably only need one or a few explicit tests for this
22:05:32 <corvus> tobiash: are you planning on finishing it up like that?
22:06:12 <corvus> (i updated the assignment in storyboard)
22:06:24 <tobiash> Probably not this week as we're going into production this week
22:06:34 <tobiash> But I hope next week
22:06:39 <corvus> tobiash: congrats and good luck :)
22:06:44 <corvus> tobiash: that sounds great, thanks
22:06:47 <corvus> dmsimard: are you still planning on fixing the zuul_json issue?
22:07:10 <tobiash> thanks
22:07:23 <Shrews> oh, we lost jlk as a contributor? :(
22:07:24 <tobiash> :-)
22:07:30 <dmsimard> yes, in fact I rebased the patch last week in order to reproduce the issue (which would have been easier to troubleshoot, now that I'm root) but for some reason it did not trigger
22:07:49 <corvus> Shrews: i haven't heard that from jlk
22:08:00 <dmsimard> I've looked around and saw some fixes in upstream ansible for that issue, but it is confusing since we should not have those fixes yet in our version of Ansible.
22:08:11 * dmsimard gets link
22:08:16 <pabelanger> Shrews: he just posted on twitter he is working as an SRE at github.com :) So, hope not.
22:08:22 <SpamapS> I think jlk is just busy getting inserted into his new job matrix.
22:08:32 <SpamapS> Last we talked, he hoped to continue Zuul'ing in some capacity.
22:08:48 <mordred> that is also what I have heard from jlk
22:08:54 <dmsimard> https://github.com/ansible/ansible/commit/c30ee42fe1f0a9666a90f4d63121780f2a186c54 should fix our issue
22:09:05 <fungi> hopefully he's busy inserting zuul into his new job matrix ;)
22:09:06 <corvus> i see no reason why working at github would preclude contributing to zuul -- in fact, it's a position uniquely suited to doing so. i hope he will continue. :)
22:09:13 <dmsimard> it also claims performance and memory improvements but I dunno.
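
For background on the zuul_json discussion above: zuul_json is an Ansible callback plugin that aggregates per-task results into a JSON file for the job. A minimal sketch of that kind of aggregate callback, using the standard CallbackBase hooks, might look like the following (the plugin name and output path are placeholders for illustration, not Zuul's actual module):

    import json

    from ansible.plugins.callback import CallbackBase


    class CallbackModule(CallbackBase):
        """Collect per-task results and dump them as JSON at the end of the run.

        Illustrative sketch only -- this is not Zuul's actual zuul_json plugin.
        """

        CALLBACK_VERSION = 2.0
        CALLBACK_TYPE = 'aggregate'
        CALLBACK_NAME = 'example_json'  # hypothetical plugin name

        def __init__(self):
            super(CallbackModule, self).__init__()
            self.results = []

        def v2_runner_on_ok(self, result):
            # TaskResult exposes the host, the task and the raw result dict.
            self.results.append({
                'host': result._host.get_name(),
                'task': result._task.get_name(),
                'result': result._result,
            })

        def v2_runner_on_failed(self, result, ignore_errors=False):
            self.v2_runner_on_ok(result)

        def v2_runner_on_unreachable(self, result):
            self.v2_runner_on_ok(result)

        def v2_playbook_on_stats(self, stats):
            # Write the aggregate output once the playbook finishes.
            with open('job-output.json', 'w') as f:  # output path is illustrative
                json.dump(self.results, f, indent=2)
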
22:09:21 <mordred> we have a human on the inside now to bug with API questions :)
22:09:41 <pabelanger> mordred: ++
22:09:49 <dmsimard> didn't know he was at github now, grats to him
22:09:51 <Shrews> yes, let us all infiltrate all the places
22:10:06 <corvus> also, the big thing jlk is working on right now is getting github3.py released so we can release. that seems very likely to meet with github.com supervisory approval, i'd think :)
22:10:08 <mordred> dmsimard: well, today is his first day, so not knowing is totally fair :)
22:10:16 <mordred> corvus: I certainly hope so!
22:10:36 <dmsimard> corvus: in summary, I don't know whether or not there is something to fix yet, since I am now unable to reproduce the issue and there has been an (unreleased) fix upstream.
22:10:41 <dmsimard> I'll add my findings in the story.
22:11:14 <corvus> dmsimard: do you mean you've been unable to repro using our production zuul, or locally?
22:11:33 <dmsimard> doing a recheck/rebase on https://review.openstack.org/#/c/504238/ typically reproduced the issue
22:11:35 <dmsimard> it no longer does
22:11:43 * fungi has to disappear now, but will catch up on the rest of the meeting from the log later tonight
22:12:03 <corvus> dmsimard: oh interesting. okay. so maybe we can drop this from the 3.0 blockers?
22:12:11 <dmsimard> it's possible a change that occurred in zuul-stream fixed it
22:12:34 <dmsimard> corvus: I don't know yet -- I could try reproducing the issue without the patch (that was supposed to address the issue...) and see what happens.
22:12:42 <dmsimard> should be fairly easy
22:13:08 <corvus> dmsimard: okay. if you decide it's no longer an issue, go ahead and drop the zuulv3.0 tag from the story when you update it, please
22:13:13 <dmsimard> ack
22:13:32 <corvus> mordred: did i see you post something about a zuul-stream refactor?
22:13:50 <corvus> https://review.openstack.org/531171 looks like
22:14:12 <corvus> mordred: is that a start to addressing the 'refactor zuul-stream and add testing' story?
22:14:52 <corvus> the story for that is "It's currently largely untested and difficult to make changes to."
22:15:57 <corvus> mordred: please let me know when you are back
22:16:06 <mordred> corvus: it's sort of a patch related to that - but it's a little bit more about changing how the log stream data gets schlepped around, so that it's potentially more something we could upstream
22:16:11 <corvus> oh hai!
22:16:44 <mordred> I also wrote up a proposal here: https://github.com/ansible/proposals/issues/92
22:17:13 <clarkb> and I guess it could help testing, because we could have it log to a file/buffer in tests and check the contents rather than needing to build up a full-on daemon process and listener and all that
22:18:08 <corvus> clarkb: well, spinning things up and tearing them down isn't really a problem for the test runner
22:18:52 <corvus> i think it's more that there's a structural issue with the fact that most of this behaves differently if you are sshing into a remote node vs locally
22:19:13 <mordred> yah - and also things like rebooting a test node will kill the zuul_console process
22:19:30 <corvus> mordred: well, i'm talking about unit tests
22:19:40 <mordred> so the aim here is that we won't need a zuul_console process, and also that the surface area for things breaking will be reduced
22:19:50 <mordred> corvus: ah- yes, well, that as well :)
22:20:46 <mordred> amongst the issues with the current system are surprise, fear and an almost fanatical devotion to the pope ...
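
For background on the streaming discussion above: today a zuul_console process runs on each remote node and serves the console log over a socket, which is why a node reboot kills the stream. A rough sketch of that general pattern might look like the following (the log path and port are placeholders; this is not Zuul's actual implementation):

    import socketserver

    # Hypothetical values for illustration; the real daemon differs.
    CONSOLE_LOG = '/tmp/console.log'
    LISTEN_PORT = 19999


    class ConsoleStreamHandler(socketserver.StreamRequestHandler):
        """Send the contents of the console log to whoever connects."""

        def handle(self):
            with open(CONSOLE_LOG, 'rb') as f:
                while True:
                    chunk = f.read(4096)
                    if not chunk:
                        break
                    self.wfile.write(chunk)


    if __name__ == '__main__':
        # A long-lived per-node daemon like this is exactly what goes away
        # if the node reboots mid-job, which is one of the problems noted above.
        with socketserver.TCPServer(('', LISTEN_PORT), ConsoleStreamHandler) as srv:
            srv.serve_forever()
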
22:20:57 <corvus> the reason this story was created was because there are, even now, still a lot of times we put something in the console log that we shouldn't, or don't put something in that we should. and we have zero testing for any of that, and any time we try to change it, we break production, so it's effectively frozen.
22:21:11 <corvus> so that's the thing that we need to get unblocked before the v3.0 release
22:21:29 <corvus> we need to be able to accept a bug report from someone saying "this should have been in the console log" and act on it
22:21:31 <mordred> yup. and that refactor is still on the todo-list - and is largerly orthogonal to the above patch
22:21:46 <Shrews> mmm, largerly
22:21:47 <corvus> okay that's helpful
22:21:50 <mordred> or, I *believe* it's orthogonal
22:22:21 <mordred> it's possible we'll discover that the other patch enables the refactor in important ways
22:22:25 <corvus> mordred: yeah, i could see it intersecting if testing somehow becomes more feasible with the above; otherwise, i'd be terrified to land the above anyway
22:22:33 <mordred> ++
22:22:52 <SpamapS> +1 for not needing a zuul_console process! I've been having reboot issues actually. :)
22:23:06 <dmsimard> Do we no longer have a zuulv3-dev node that we could test sensitive things on?
22:23:24 <pabelanger> no, we deleted it
22:23:37 <dmsimard> We could hook it up to openstack-dev/sandbox or something
22:23:57 <corvus> pabelanger, leifmadsen_: i see you're working on the docs -- anything blocking you there?
22:24:14 <clarkb> dmsimard: for stuff like this I think the effort is likely better spent adding tests to the test suite
22:24:42 <corvus> yeah, we're not landing anything else without tests :)
22:24:52 <pabelanger> corvus: just discussing where we want the example config project repo to live, that was about it. We should have some rst docs up this week I believe
22:25:04 <corvus> pabelanger: example config project repo?
22:26:05 <pabelanger> corvus: some of the discussion was just about having something existing online, so we can use the git driver connection for it. But it's possible we want the docs to also include more details on how to build that out
22:26:12 <corvus> pabelanger: since we don't expect anyone to actually reuse that, how about we just inline the content into the docs and walk people through creating their own?
22:26:35 <pabelanger> corvus: yah, that is an option for sure.
22:26:59 <corvus> i mean, we should document using zuul-base-jobs (which will need to be a config repo -- but it only holds jobs). and zuul-jobs, of course.
22:27:21 <corvus> but i don't think we have plans for a reusable repo with, say, pipelines at the moment.
22:27:45 <corvus> that's still something everyone will need to create locally. for now. :)
22:28:18 <pabelanger> okay, I'll bring it up with leifmadsen_ tomorrow and we can discuss it more in #zuul
22:28:26 <tobiash> Making that more generic would be cool
22:28:42 <corvus> yep, but a very long-term task :)
22:28:51 <corvus> pabelanger, mordred: what's the status of reporting on github from openstack-infra?
22:29:56 <pabelanger> I last tested it before the holiday break, but haven't progressed more due to the upcoming refactor of things into zuul-web. But it was working, except for 1 thing I cannot remember ATM
22:30:20 <corvus> pabelanger: were there any errors in the logs?
22:30:25 <pabelanger> corvus: nope
22:30:46 <mordred> corvus: I was waiting for the ingest patches to land before moving forward
22:30:47 <corvus> pabelanger: ok. mordred: maybe you want to turn on your shade test?
22:30:55 <mordred> corvus: it's possible I didn't communicate that to anyone ...
22:30:59 <corvus> why is that a blocker?
22:31:34 <mordred> corvus: it's not - I just figured that scalability was one of the larger concerns and the ingest patch changes the structure of how that works...
22:31:47 <mordred> corvus: BUT- can *totally* turn on the shade patch very easily and we can see how it goes
22:32:19 <pabelanger> yah, I think getting more data from the shade patch works for me
22:32:21 <corvus> mordred: i'm not going to say 'no' to more data :) i say go for it whenever you're ready :)
22:32:39 <corvus> and hopefully we'll have the ingest patch in next week too
22:33:11 <corvus> i'm far enough with the cross-source deps work (which is the next item on the list) to say i think we can have it landed in the next few days / end of week at latest.
22:33:22 <mordred> corvus: kk. I'll get that going
22:33:39 <corvus> so once that lands, we'll have some *really fun* things to test
22:33:57 <pabelanger> Yay
22:34:25 <corvus> mordred: istr you said your shade job could be used to verify the cross-source work as well?
22:36:38 <corvus> mordred: also, i think you were 90% of the way through js tooling patches before the holidays -- i've added a patch to your series which we can use to validate that all the URLs that we expect to work do work
22:36:39 <Shrews> would that be the whole "changes to ansible openstack modules triggers shade tests and reporting" thing? b/c that would be exciting
22:36:46 <corvus> Shrews: yep
22:37:02 * Shrews is giddy with excitement
22:37:26 <corvus> mordred: so when you pick that up, we can test the three different ways of serving the webapp that we know about
22:37:45 <mordred> corvus: yup!
22:37:54 <mordred> (to all of the things)
22:37:58 <corvus> cool
22:38:06 <corvus> the nodepool static driver from tristanC is in review
22:38:26 <corvus> Shrews: are we running the finger gw in openstack prod now?
22:38:46 <Shrews> corvus: yes
22:38:51 <corvus> Shrews: \o/
22:38:53 <pabelanger> Oh, nice!
22:39:06 <Shrews> finger UUID@zuulv3.openstack.org works
22:39:08 <clarkb> I think we are still running the executors on a low port right?
22:39:09 <pabelanger> are we at a point to remove the root user from zuul-executor now?
22:39:14 <Shrews> pabelanger: yes
22:39:15 <corvus> Shrews: do you want to delete all the special user handling code from the executors, and have them serve finger on port XX79 by default?
22:39:22 <pabelanger> Shrews: awesome
22:39:23 <Shrews> corvus: sure
22:39:34 <Shrews> i love deleting code
22:39:38 <corvus> Shrews: cool, i think that's probably the last thing there and we can clear that from the list
22:39:54 <corvus> pabelanger, leifmadsen_: ^ fyi quickstart docs change :)
22:40:06 <pabelanger> corvus: ++
22:40:25 <corvus> as mentioned, jlk is working on getting github3.py released...
22:40:36 <corvus> and finally, clarkb put 2 new things on the list about secrets and branches
22:41:23 <clarkb> these came out of the zuulv3-issues etherpad
22:41:42 <clarkb> my understanding of this is that without fixing this you cannot use the same secret name on different branches?
22:41:48 <clarkb> which seems like something we should fix
22:42:25 <clarkb> I want to say it was kolla that ran into this and they worked around it by using different names
22:42:25 <corvus> clarkb: yeah, i agree these seem like release-blocking bugs
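
Picking up the finger gateway exchange a few lines up ("finger UUID@zuulv3.openstack.org works"): the finger protocol amounts to opening a TCP connection, sending the query followed by CRLF, and reading until EOF. A minimal client sketch of that might look like the following (host, port and UUID are placeholders, not a statement about Zuul's actual gateway):

    import socket


    def stream_build_log(host, build_uuid, port=79):
        """Finger a build UUID at a finger gateway and print the stream."""
        with socket.create_connection((host, port)) as sock:
            # The finger protocol is just "send the query, then CRLF".
            sock.sendall(build_uuid.encode('utf-8') + b'\r\n')
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                print(data.decode('utf-8', errors='replace'), end='')


    # Usage with placeholder values:
    # stream_build_log('zuulv3.openstack.org', '<build-uuid>')
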
22:43:38 <corvus> okay. that's the list. i won't go through it again next meeting, but i thought it would be good to reset after the holiday break.
22:44:08 <corvus> #topic RAM governor for the executors
22:44:43 <corvus> dmsimard: i think this is your topic?
22:46:32 <corvus> dmsimard: lemme know when you're back
22:46:37 <corvus> #topic mailing lists
22:47:12 <corvus> i'll send out an email about this, but while (most of us) are here... we should now have the infrastructure in place to host mailing lists at lists.zuul-ci.org
22:47:46 <corvus> i would like to have a zuul-announce list, where we make release announcements and notifications about changes to job syntax/deprecation/etc (which will be important for things like the shared zuul-jobs repo)
22:48:03 <SpamapS> +1
22:48:22 <corvus> and obviously, we need at least one other list for discussion... should we create a zuul-discuss list? or a zuul-dev list? or both?
22:48:37 <mordred> I kind of think both
22:48:57 <corvus> (ie, do we need a dev/other split, or can we start without that and just have a combined list?)
22:48:58 <pabelanger> zuul-discuss for end user support?
22:49:26 <mordred> corvus: but I could see starting with zuul-discuss and splitting a zuul-dev later if it gets too much for people
22:49:29 * fungi returns early and catches up
22:50:05 <pabelanger> what about zuul-users
22:50:21 <clarkb> I really don't like the dev/general split we have for openstack
22:50:28 <clarkb> and also ops.
22:50:44 <clarkb> It creates a ton of cross posting confusion and also easy ways to ignore important threads
22:50:45 <corvus> there's a lot of overlap -- any dev work we do will affect users and could benefit from general discussion, so for that, i like having one list. i don't want people to feel like they're "bothering the devs" though if they have questions like "how should i write a job that does ..."
22:51:30 <mordred> corvus: yes - those are both sides of my thoughts :)
22:52:05 <mordred> I think there are at *least* 3 personas "I am running a zuul" "I am writing jobs for a zuul" and "I am developing zuul" - many of us here are all three
22:52:23 <corvus> i feel like maybe it's safer to start with one, and create more if needed -- so maybe we should have "zuul-announce" and "zuul-discuss" for now?
22:52:31 <mordred> wfm
22:52:34 <clarkb> sounds good
22:52:47 <pabelanger> sure
22:53:07 <corvus> ok, i'll email the infra list with that as a proposal, get feedback from folks not here ("we decided in irc what the mailing lists should be!" is especially ironic), then make lists in a day or 2.
22:53:46 <corvus> #topic open discussion
22:54:31 <SpamapS> I had one question come into my mind this last week
22:54:42 <mordred> corvus: mentioned in #openstack-infra, but as a followup from earlier, https://review.openstack.org/#/q/topic:turn-on-ansible has the patches to start running stuff on ansible patches
22:54:47 <clarkb> https://etherpad.openstack.org/p/zuulv3-issues is almost cleared out of active items. I've marked things as fixed that were fixed, gotten a few things fixed that weren't, and filed bugs for items that need longer term tracking
22:54:54 <corvus> mordred: thx
22:55:03 <SpamapS> IIRC we had some kind of plan to merge feature/zuulv3 into master before the 3.0 release. I don't recall what the prereqs for that were.
22:55:05 <clarkb> it would be great if everyone could take a look over that and make sure their items are accurate, and possibly open bugs for them if they need longer term tracking
22:55:16 <clarkb> then I can unpin the tab in my browser :)
22:55:41 <clarkb> SpamapS: we need to update the puppet-openstackci deployment tooling to handle master as v3
22:55:48 <corvus> SpamapS: yes! it's still imminent -- we want to land something to puppet-openstackci like https://review.openstack.org/523951 first
22:56:12 <corvus> so that we don't introduce zuulv3 to the world of openstack third-party ci ops by automatically upgrading them
22:56:13 <SpamapS> Ok, just looking for targets of opportunity. :)
22:56:35 <SpamapS> but a puppeteer... I am not
22:56:45 <corvus> afaik that's the only thing blocking that, as soon as it lands we'll merge
22:57:17 <corvus> and then i will need to unlearn 2 years of "git reset --hard origin/feature/zuulv3"
22:57:40 <clarkb> that's ok, I always check out master first and then have to switch to feature/zuulv3
22:57:50 <fungi> that'll unlearn itself pretty quickly when we delete the branch
22:58:12 <fungi> fatal: ambiguous argument 'origin/feature/zuulv3': unknown revision or path not in the working tree.
22:59:53 <corvus> the stuff of dreams
23:00:11 <SpamapS> :-D
23:00:25 <corvus> time's up, thanks everyone!
23:00:28 <corvus> #endmeeting