19:01:10 <clarkb> #startmeeting infra
19:01:11 <openstack> Meeting started Tue Sep 5 19:01:10 2017 UTC and is due to finish in 60 minutes. The chair is clarkb. Information about MeetBot at http://wiki.debian.org/MeetBot.
19:01:12 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
19:01:14 <openstack> The meeting name has been set to 'infra'
19:01:20 <clarkb> #link https://wiki.openstack.org/wiki/Meetings/InfraTeamMeeting#Agenda_for_next_meeting
19:01:27 <pabelanger> o/
19:01:30 <ianw> o/
19:01:44 <fungi> i guess it's that time again
19:01:55 <clarkb> #topic Announcements
19:02:09 <clarkb> #info The PTG is next week
19:02:20 <clarkb> #link https://etherpad.openstack.org/p/infra-ptg-queens
19:02:31 <clarkb> #link https://ttx.re/queens-ptg.html for general PTG info on what to expect
19:02:40 <mordred> clarkb: I don't believe in PTGs
19:03:02 <clarkb> mordred: good news! it is happening either way :)
19:03:22 <fungi> the tooth fairy will be there
19:03:26 <clarkb> Also, I completely failed at signing our new Queens release key so will remind everyone about that again now
19:03:29 <clarkb> #info Queens Cycle signing key ready for attestation
19:03:36 <clarkb> #link https://sks-keyservers.net/pks/lookup?op=vindex&search=0x4c8b8b5a694f612544b3b4bac52f01a3fbdb9949&fingerprint=on Queens Cycle signing key
19:03:41 <clarkb> #link http://docs.openstack.org/infra/system-config/signing.html#attestation attestation process
19:03:47 <fungi> thanks!
19:03:57 <fungi> i mentioned it to the release team in their meeting last week as well
19:04:01 <clarkb> if you are an infra-root and haven't done that yet, please do that soon as fungi wants to swap keys out after the trailing cycle releases happen
19:04:16 <fungi> yeah, basically immediately post-ptg
19:04:22 <fungi> or as soon as i sober up ;)
19:04:39 <fungi> (just kidding, we have a gerrit maintenance as soon as i get home!)
19:04:47 <clarkb> #topic Actions from last meeting
19:05:18 <clarkb> fungi: I am expecting we still don't have switchport counts? not surprising considering Houston has bigger problems to worry about now
19:05:36 <fungi> i think we can drop that from the perpetual action items and discuss at the ptg how we want to go forward
19:05:43 <clarkb> fungi: ok that wfm
19:06:07 <clarkb> #action ianw upgrade mirror-update server and bandersnatch
19:06:07 <fungi> i don't suppose it's really doing much good to bring it up every meeting
19:06:27 <ianw> still pending, didn't want to touch before release
19:06:28 <clarkb> ianw: ^ I expect that can actually happen nowish since most of the release stuff is done tomorrow/thursday
19:06:46 <clarkb> ianw: yup np. I just don't want to forget it and I think we are finally to a point where it is mostly safe to do
19:06:46 <ianw> yep, will look into
19:07:02 <clarkb> #action clarkb update infracloud docs to include ssl setup info
19:07:18 <clarkb> I still need to write ^ that change but am hoping to do that today before the details escape me
19:07:35 <clarkb> #topic Specs approval
19:07:50 <clarkb> #link https://review.openstack.org/#/c/492287/
19:08:49 <clarkb> This is a cleanup change and all of the dependencies are complete for it now. Can we open that to voting for those interested in Gerrit and the Gerrit upgrade? I'll look at getting it in on Thursday assuming there aren't items that need addressing
19:09:26 <clarkb> That was really the only spec I saw that was ready. Please let me know if I missed any
19:09:48 <clarkb> #topic Priority Efforts
19:09:52 <fungi> i'm good with opening that for council vote (thought you had done that last week actually)
19:10:05 <clarkb> #undo
19:10:06 <openstack> Removing item from minutes: #topic Priority Efforts
19:10:37 <clarkb> fungi: I don't think the dependent changes had merged by the time I looked at the specs again on thursday? I did approve the ssh keys for zuulv3 spec though
19:10:53 <clarkb> #topic Priority Efforts
19:11:23 <clarkb> ok really quickly before getting into zuulv3. Just wanted to remind everyone that Gerrit upgrade to 2.13 is happening Monday after the PTG
19:11:35 <clarkb> #link https://etherpad.openstack.org/p/gerrit-2.13.-upgrade-steps
19:11:56 <clarkb> reviewing ^ would be great if you have time (ha)
19:12:08 <clarkb> #topic Zuulv3
19:12:43 <jeblair> hi! it's less than 1 week before our planned cutover to zuulv3
19:12:46 * clarkb hands meeting baton to jeblair
19:12:55 <jeblair> here's the list of outstanding items we've been working from: https://etherpad.openstack.org/p/zuulv3-pre-ptg
19:13:04 <jeblair> going down that list...
19:13:07 <clarkb> #link https://etherpad.openstack.org/p/zuulv3-pre-ptg
19:13:29 <jeblair> i've been working on the devstack jobs, and i think the devstack-legacy job, which is the job that we'll base the automatically converted devstack jobs on is about ready
19:13:34 <jeblair> well, it is ready
19:13:44 <fungi> lgtm
19:14:00 <jeblair> it's probably going to need slight revision in order to plug into what the migration script outputs
19:14:14 <jeblair> but it demonstrates all the mechanical things it needs to do
19:14:32 <jeblair> it's also actually really close to a framework we can use for *most* automatically converted jobs
19:15:00 <jeblair> at any rate, i think the next step for that is now waiting on the migration script
19:15:09 <jeblair> so i'll move part of that to the done section
19:15:21 <jeblair> the other part of the devstack work is the zuulv3 native devstack job
19:16:07 <jeblair> we're making progress on that -- it's actually really cool to look at -- you can see how things are shaping up for reuse in zuulv3 and how devstack jobs can be understood by mere humans
19:16:13 <mordred> speaking of migration script, I'm shifting my attention to migration script today
19:16:24 <jeblair> that's here: https://review.openstack.org/500202
19:16:32 <mordred> jeblair: ++ it's super cool
19:16:47 <jeblair> mordred: ya, let's talk about migration script in just a min
19:17:16 <clarkb> #link https://review.openstack.org/#/c/500202/ devstack zuulv3 native job
19:17:19 <jeblair> re the devstack job -- unfortunately, we've run into an issue possibly related to zuul_console or zuul_stream where we stop getting output after devstack runs wget
19:17:43 <Shrews> weird
19:17:48 <mordred> yah. it's ...
19:17:51 <jeblair> digging into that is now at the top of my list (except that some emergency things jumped ahead of it this morning)
19:17:56 <mordred> one of the weirdest things ever
19:18:01 <jeblair> so hopefully i'll start looking at it in earnest this afternoon
19:18:09 <fungi> does sound very bizarre
19:18:32 <jeblair> i'm considering it a blocker for 2 reasons -- i still think we need the native devstack job in place before the migration so we actually have something to show and something for folks to build off of
19:18:39 <jeblair> (otherwise, chaos and anger ensues)
19:18:43 <mordred> yup
19:18:57 <jeblair> and also, it may represent a general problem that may appear in other contexts
19:19:05 <fungi> well, and people continue to cargo-cult old cruft
19:19:26 <jeblair> (though i rank that a little unlikely because devstack surpasses everything else with what it does with file handle redirection :)
19:19:28 <clarkb> does it cause the jobs to fail too? or is it just breaking the UI?
19:19:29 <mordred> yah - the second thing is the most troubling, because it's a very confusing thing to occur that makes no sense
19:19:39 <jeblair> clarkb: it causes the job to hang
19:19:54 <clarkb> gotcha so not something we could attempt to live with if we wanted to
19:20:00 <mordred> although it does get FURTHER than the wget before it hangs
19:20:02 <mordred> yah
19:20:05 <jeblair> (ansible proceeds somewhat (unsure how far, for obvious reasons) past the point it stops logging)
19:20:22 <mordred> it's worth noting that it's not just the remote log streaming that hangs ...
19:20:23 <pabelanger> could we switch to curl --silent?
19:20:27 <fungi> wonder if wget security mechanisms are getting overzealous closing extra file descriptors or something
19:20:31 <mordred> the local writing of content to the files in /tmp also stops
19:20:45 <jeblair> pabelanger: if our ci system can't run wget we must hide our faces in paper bags
19:21:02 <fungi> i have some paper bags
19:21:11 <pabelanger> jeblair: agree
19:21:19 <mordred> yah. I would consider not being able to run wget a showstopper - largely because whatever breaks wget is gonna break something else too
19:21:30 <fungi> completely agree
19:21:37 <jeblair> anyway, i'll start in on that soon and keep folks updated on it
19:21:42 <jeblair> next thing on the list is jobs that use special slaves
19:21:52 <jeblair> #link jobs that use special slaves https://etherpad.openstack.org/p/zuulv3-special-jobs
19:21:56 <jeblair> mordred: ^
19:21:59 <mordred> so, they don't work (for obvious reasons) but if folks want to see POC jobs using the new devstack base job: https://review.openstack.org/#/c/500365/ is a rewrite of all of shade's dsvm jobs using the new base job
19:22:19 <jeblair> mordred: (cool thanks! good to exercise the new api)
19:22:33 <mordred> I have the majority of these done, including a spectacular stack related to wheel mirrors that you should all enjoy and also run away from screaming
19:22:59 <clarkb> #link https://etherpad.openstack.org/p/zuulv3-special-jobs subset of zuulv3 cutover prep work, specifically for jobs that run on special nodes
19:23:03 <jeblair> mordred: that last one means we drop the special builders, right?
19:23:07 <mordred> yup!
19:23:12 <jeblair> sweet
19:23:35 <pabelanger> yay
19:23:40 <jeblair> mordred: what's still tbd?
19:23:48 <mordred> there's still a few hangers on - pabelanger was just asking about next things to work on, so I think I may hand-off the last of that to him while I shit to migration script
19:23:56 * jeblair giggles
19:24:42 <mordred> jeblair: I'll update the etherpad after the meeting with a short-list of review links / todo
19:24:54 <mordred> oh - also - I have two other changes related to not-migration it's worth mentioning
19:24:55 <jeblair> mordred, pabelanger: thanks!
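[Editor's note: the hang discussed above was only a hypothesis at meeting time. As a minimal sketch of fungi's file-descriptor theory (not the confirmed root cause), the snippet below shows what happens when a log writer keeps streaming into a pipe whose reader end has gone away -- CPython ignores SIGPIPE at startup, so the write surfaces as BrokenPipeError rather than killing the process; a writer that swallows that error would simply stop producing output, which looks like the silent hang described.]

```python
import os

# Hedged illustration: the streaming side of zuul_console/zuul_stream
# writes log lines into a pipe. If the far end closes "extra" inherited
# file descriptors (fungi's wget hypothesis), later writes fail.
read_fd, write_fd = os.pipe()
os.close(read_fd)  # simulate the reader end being closed out from under us
try:
    os.write(write_fd, b"devstack log line\n")
    outcome = "write succeeded"
except BrokenPipeError:
    outcome = "BrokenPipeError"
finally:
    os.close(write_fd)
print(outcome)  # BrokenPipeError
```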
19:25:08 <mordred> that I need some folks to review
19:25:09 <pabelanger> exciting
19:25:28 <mordred> https://review.openstack.org/#/c/500320 and https://review.openstack.org/#/c/489719
19:25:50 <jeblair> oh yeah, if we find time for those, that will be really nice
19:25:58 <mordred> https://review.openstack.org/#/c/500320 is a must-have - or a different impl is - it's a thing we missed in our original tox job - which is upper-constraints handling
19:26:06 <jeblair> it may actually end up being somewhat migration related, once we start thinking about things like neutron plugin unit tests
19:26:12 <mordred> like, we cannot go live without either https://review.openstack.org/#/c/500320 or an alternate
19:26:15 <clarkb> #link https://review.openstack.org/#/c/500320 constraints handling in tox jobs
19:26:25 <mordred> the thing I want to discuss about it explicitly
19:26:42 <mordred> is the approach taken of adding openstack/requirements to the required-projects list of the base job
19:27:12 <jeblair> hrm
19:27:24 <jeblair> why can't it be in the base unittest job + base devstack job?
19:27:27 <mordred> currently every job in v2 that runs tox is cloning openstack/requirements itself in the job, so I don't think doing that is any MORE burdensome than the current thing, but it feels weird
19:27:36 <mordred> jeblair: because unittest is in zuul-jobs
19:27:55 <jeblair> mordred: it feels like maybe the right thing to do is make openstack-unittest then
19:27:58 <mordred> and I'll consider it a crying shame if openstack cannot use the tox-py27 job
19:28:05 <pabelanger> tox-py35-constraints has it I think
19:28:12 <mordred> i'd like to explore every possibility available before we do that
19:28:20 <clarkb> mordred: one thing to consider is we don't need the entire repo, we just need the single file (not sure if that makes anything easier though)
19:28:40 <mordred> clarkb: well - we also honestly need to be able to do depends-on with upper-constraints values too
19:29:07 <mordred> jeblair: can we make a unittest job in project-config that shadows the one in zuul-jobs?
19:29:10 <jeblair> pabelanger: are you saying tox-py35-constraints is a child job of tox-py35, and tox-py35-constraints has openstack/requirements?
19:29:18 <clarkb> mordred: ya so some repo manipulation needs to happen
19:29:28 <fungi> can the base tox job archetype grow a mechanism to add -c /some/path/constraints.txt if a file exists at that location, and then all we have to do is add some task to drop the right content there?
19:29:36 <mordred> yes. tox-py35-constraints is what https://review.openstack.org/#/c/500320 is aiming to remove the need for
19:29:47 <mordred> fungi: right -so that's what's in that patch
19:29:48 <pabelanger> jeblair: yes to both
19:29:49 <fungi> though i guess the set problem means that adding that task still leaves us with a subclass of the tox jobs
19:29:49 <jeblair> mordred: oh, why do we want to remove it?
19:30:02 <mordred> let me back up real quick
19:30:38 <jeblair> mordred: (or we could take this as something to hash out in #zuul after meeting)
19:30:38 <mordred> first - what the patch does is adds logic to the tox role to look for an upper-constraints file, if given, and if so it will set the environment variable UPPER_CONSTRAINTS_FILE
19:31:08 <mordred> all of the openstack tox jobs that need/understand UC files respond to that env var
19:31:35 <mordred> so that part of the logic allows the tox job itself to be used by people who care or don't care about upper-constraints files and for both groups it'll DTRT
19:31:52 <mordred> the second part of the equation is getting a file on disk and telling the tox job to look for it
19:32:14 <jeblair> mordred: i think that part is fine; the thing that rubs me the wrong way is cloning openstack/requirements everywhere; there are *a lot* of jobs that don't need that
19:32:14 <mordred> that is the part where the current patch sets a variable and a required-projects
19:32:20 <mordred> jeblair: indeed
19:32:33 <mordred> jeblair: we could perhaps just make it a project-template
19:32:47 <mordred> jeblair: that adds the repo to required-projects and also sets the variable
19:33:06 <mordred> so people can use python-jobs-with-constraints perhaps?
19:33:16 <pabelanger> I think I must have missed something, I thought tox jobs that needed requirements, we'd just create tox-py35-constraints and tox-py27-constraints in openstack-zuul-jobs, everything else could use tox-py27 from zuul-jobs
19:33:19 <jeblair> i'd love to see if we can make it a proper job
19:33:24 <mordred> and then for individual job consumption, people can always set the variable and add the repo if they want
19:33:34 <jeblair> can we pause for a minute though?
19:33:37 <mordred> sure
19:33:57 <jeblair> i'd like to ask whether we want to continue this conversation here, or just note that this is something we need to work out after the meeting?
19:34:04 <mordred> sure. we can do that
19:34:09 <jeblair> we have 5 items after this in the zuul list alone
19:34:17 <clarkb> ya I think we can sort that out after meeting (this is going to be a full meeting)
19:34:18 <mordred> I just need people to engage on this topic
19:34:31 <jeblair> let's resume this immediately after the meeting in #zuul
19:34:34 <mordred> kk
19:34:41 <jeblair> next up: migration script
19:34:44 <mordred> we can also talk about the other patch I mentioned then too
19:35:05 <jeblair> mordred: you mentioned earlier you're going to start working on this
19:35:29 <jeblair> mordred: is this still at one-person stage, or do you need/want anything from other folks?
19:35:36 <mordred> yes. I'm shifting attention to that today - I think the other special jobs are far enough along, we can grab stragglers at the end if we need to
19:35:38 <jeblair> s/start/resume/
19:35:57 <mordred> I think it's one person for the rest of today
19:36:13 <mordred> and maybe part of tomorrow
19:36:14 <jeblair> cool, i reckon you'll let us know when there's more stuff to jump on
19:36:34 <jeblair> next up: migration docs
19:36:36 <mordred> but at that point I'm expecting it'll be to the point where it can be run locally, someone can find a problem with a migration and add a workaround
19:36:49 <jeblair> #link infra-manual zuulv3 migration docs https://review.openstack.org/500218
19:37:03 <jeblair> that change and its parents remove all of the TODO items from the zuulv3 infra manual page
19:37:17 <jeblair> it would be great if folks can review it, and also read the whole thing and identify gaps
19:37:42 <jeblair> in some form or other, it at least roughly covers most of the things i would want to communicate to folks making the jjb -> zuul transition
19:37:54 <jeblair> it's not complete documentation, but it should get us on the same page
19:38:10 <jeblair> i think once we have the migration script, we may want to add things related to that
19:38:13 <Shrews> jeblair: i'll review those today
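[Editor's note: the conditional env-var behavior mordred describes above can be sketched as follows. The real change is an Ansible role inside the tox job, not Python; this hypothetical helper only illustrates the logic: set UPPER_CONSTRAINTS_FILE when a constraints file is provided and exists, otherwise leave the environment untouched so jobs that don't care about constraints run the same tox job unchanged.]

```python
import os

def tox_environment(base_env, constraints_path=None):
    """Sketch (assumed helper name) of the tox role's conditional logic:
    only export UPPER_CONSTRAINTS_FILE when a constraints file actually
    exists, so the one tox job "DTRT" for both kinds of consumers."""
    env = dict(base_env)
    if constraints_path and os.path.isfile(constraints_path):
        env["UPPER_CONSTRAINTS_FILE"] = constraints_path
    return env
```

Tox environments that understand the variable then feed it to pip (e.g. `-c $UPPER_CONSTRAINTS_FILE`); everything else simply ignores it.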
19:38:29 <jeblair> like "the migration script output some stuff that looks like this, here's what you should do"
19:38:34 <jeblair> can't write that yet
19:39:07 <jeblair> we already talked about zuul_console breakage... so next is docs job incorrectly publishing
19:39:27 <jeblair> we published the zuulv3 docs over top of the zuulv2 docs
19:39:34 <jeblair> i think pabelanger just checked in on this before the meeting, and there's a patch that should fix that
19:39:46 <jeblair> so hopefully this is mostly taken care of already, just needs a review and some republishing
19:39:59 <pabelanger> yes
19:40:15 <jeblair> pabelanger: can you take care of making sure that patch gets merged and appropriate jobs re-run to get the right content back in place (and the new version exercised)?
19:41:05 <clarkb> I'll review this immediately after the meeting too
19:41:14 <clarkb> (since I started looking at it just before)
19:41:18 <pabelanger> jeblair: yes, I'll do that after the meeting
19:41:38 <jeblair> cool thx
19:41:41 <jeblair> finally -- zuul-cloner shim; i sent an email out about this and Shrews started working on it
19:42:28 <clarkb> idea here is just that if you run zuul-cloner it doesn't break on you to easy migration?
19:42:35 <clarkb> *ease
19:42:55 <jeblair> clarkb: also it translates the golang-style paths to zuul-cloner style paths
19:43:05 <jeblair> src/git.o.o/openstack/foo -> openstack/foo
19:43:25 <jeblair> maybe even supports clonemap files so jobs that rely on that work as-is
19:43:37 <Shrews> working on the clonemap login now
19:43:50 <Shrews> logic*
19:43:56 <jeblair> Shrews: w00t thanks!
19:44:04 <clarkb> nice
19:44:07 <jeblair> that's all the things on the list -- anything missing?
19:44:16 <jeblair> (blockers for migration/cutover)
19:44:30 <mordred> oh - I noticed a thing, I should add it to the list ...
19:44:41 <pabelanger> new servers we need for saturday?
19:44:53 <jeblair> pabelanger: yes!
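[Editor's note: the path translation jeblair gives above (src/git.o.o/openstack/foo -> openstack/foo) can be sketched as below. The shim itself was Shrews' work in progress at meeting time; the function name and regex here are illustrative assumptions, not the shipped implementation.]

```python
import re

def to_cloner_name(workspace_path):
    """Hypothetical sketch: strip the golang-style 'src/<hostname>/'
    prefix so a legacy job sees the old zuul-cloner style
    '<namespace>/<project>' name; paths without the prefix pass through
    unchanged (clonemap handling would layer on top of this)."""
    match = re.match(r"^src/[^/]+/(?P<project>.+)$", workspace_path)
    return match.group("project") if match else workspace_path
```

For example, `to_cloner_name("src/git.openstack.org/openstack/foo")` yields `openstack/foo`.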
19:45:01 <pabelanger> eg: nl02.o.o is online, and we should cut over to it asap
19:45:07 <jeblair> pabelanger: did you spin down those logstash servers?
19:45:08 <pabelanger> nl01.o.o is trusty
19:45:21 <mordred> I noticed this weekend while working on some base job stuff that our configure-mirrors role does not have parity with the current configure-mirror script
19:45:32 <jeblair> mordred: nice thanks :)
19:45:34 <pabelanger> jeblair: I just stopped logstash services first, so if we are okay to stop them, we can do that now
19:46:03 <pabelanger> logstash-worker016.o. to logstash-worker20.o.o
19:46:06 <mordred> I think we can likely make a legacy-openstack-configure-mirrors thing in project-config that does a dumb version of a direct translation
19:46:08 <jeblair> pabelanger: cool, sounds like you, fungi, and clarkb can probably handle that
19:46:08 <clarkb> mordred: jeblair fwiw the existing script is written such that you execute the same script that nodepool executes to load that data up
19:46:14 <clarkb> for mirrors
19:46:16 <mordred> and make it nicer moving forward
19:46:16 <fungi> yup
19:46:37 <pabelanger> kk
19:46:48 <mordred> clarkb: yup. I think we need to have legacy-openstack-configure-mirrors do that for now
19:47:14 <clarkb> pabelanger: ya logstash job queue count looks ok so should be fine to turn those servers off/delete them
19:47:16 <mordred> clarkb: and then we can circle back and talk about how/if to refactor it to be more 'native' later
19:47:50 <pabelanger> clarkb: ack
19:48:51 <jeblair> okay, the next event(s) in our schedule are to do trial cutovers this weekend -- saturday and/or sunday evenings. obviously that will only happen if these blockers are sufficiently resolved by then.
19:49:25 <jeblair> maybe let's discuss details on that friday?
19:49:48 <fungi> sounds great
19:49:54 <Shrews> wfm
19:49:58 <pabelanger> ++
19:50:10 <jeblair> clarkb: eot from me
19:50:33 <clarkb> ok
19:50:47 <clarkb> there is also a new thread on the mailing list about Zuul UI things
19:51:13 <clarkb> I've sent a quick reply but definitely don't have enough background on everything going on in Zuul to say much. Would be nice if someone could respond to that properly
19:51:30 <clarkb> #topic PTG team dinner
19:51:38 <jeblair> clarkb: will do
19:52:06 <clarkb> Tuesday evening looks like the best night for us because it is when we have the most overlap without conflicts. I haven't heard objections yet but basically said lets do Tuesday evening at beer garden
19:52:15 <jeblair> ++
19:52:35 <pabelanger> Mmm, garden beer
19:52:45 <clarkb> If this doesn't work please propose alternatives but as of today that is the plan :)
19:53:00 <clarkb> Once in Denver I'll send mail/ping people with details on how we are transporting
19:53:16 <clarkb> #topic Project renames
19:53:27 <clarkb> #link https://review.openstack.org/#/c/500768
19:53:47 <clarkb> we have one project rename. I don't expect we'll get to that this week for obvious reasons, or next
19:54:16 <clarkb> Assuming gerrit upgrade goes well after PTG maybe we can plan a rename for that week or week after? This is mostly a heads up, but lets not worry about it until later
19:54:18 <fungi> i expected it would happen after the gerrit upgrade
19:54:32 <fungi> sounds good
19:54:53 <clarkb> In theory we'll be knocking out a couple long standing todo items for infra over the next couple weeks so lets focus on doing that first
19:55:16 <clarkb> #topic Open Discussion
19:55:24 <clarkb> woo managed a couple minutes leftover for this
19:55:29 <fungi> #link https://etherpad.openstack.org/p/GoqkPTucMK infra addition to tc's top 5 help wanted
19:55:29 <mordred> \o/
19:55:37 <fungi> the tc has suggested we propose something along those lines to help increase visibility with companies looking for effective ways to get more involved
19:55:54 <fungi> please hack that up with whatever edits you think it deserves
19:56:09 <fungi> but i'd like to get it pushed up to gerrit in the next day or two
19:56:33 * clarkb makes note to review that
19:56:35 <pabelanger> fungi: nice, was just talking to some downstream people about how to get more involved upstream
19:56:56 <fungi> i borrowed a little wording from the summary in system-config doc, though with tweaks and most of that is wholly new prose
19:59:13 <clarkb> doesn't look like there is much else. Thank you everyone
19:59:17 <clarkb> #endmeeting