19:01:19 <clarkb> #startmeeting infra
19:01:19 <openstack> Meeting started Tue Apr 17 19:01:19 2018 UTC and is due to finish in 60 minutes.  The chair is clarkb. Information about MeetBot at http://wiki.debian.org/MeetBot.
19:01:19 <ianw> morning!
19:01:20 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
19:01:22 <openstack> The meeting name has been set to 'infra'
19:01:35 <clarkb> #link https://wiki.openstack.org/wiki/Meetings/InfraTeamMeeting#Agenda_for_next_meeting
19:02:04 <clarkb> #topic Announcements
19:02:35 <clarkb> I don't have any
19:02:41 <corvus> that's one
19:03:10 <clarkb> #topic Actions from last meeting
19:03:22 <clarkb> #link http://eavesdrop.openstack.org/meetings/infra/2018/infra.2018-04-10-19.01.txt minutes from last meeting
19:03:46 <clarkb> There were no officially recorded #actions but mordred and pabelanger were planning to update some specs
19:04:06 <clarkb> I know mordred didn't get to it; I think the combination of pip and meetings and dib updates meant that many of us were distracted by the fire of the day
19:04:14 <clarkb> but we can talk about that more during priority efforts
19:04:53 <ianw> i was going to write a third-party spec, but yeah, have been pip-ing and dib-ing
19:04:59 <clarkb> #topic Specs approval
19:05:20 <clarkb> Similarly I haven't even looked at the specs repo for other specs recently
19:05:22 <corvus> omg, pip and dib are typographic vertical mirror images
19:05:30 <corvus> [sorry]
19:05:33 <fungi> everyone's been pipping the dibs and dibbing the pips so far this week
19:05:42 <clarkb> anyone have a spec they want to call out for review before we move on?
19:06:54 <clarkb> #topic Priority Efforts
19:07:09 <clarkb> Good news is I hear fungi fixed storyboard bug updates via the gerrit its plugin magic
19:07:23 <clarkb> #topic Storybaord
19:07:33 <clarkb> I'm leaving the typo
19:07:51 <fungi> yeah, seems like the implicit configuration piggybacking on commentlinks stopped working for some reason
19:08:05 <fungi> docs say it was deprecated but it just flat out stopped being considered i think
19:08:29 <fungi> once i was able to translate the docs into human, i figured out the explicit config for it
19:08:53 <fungi> anyway, as of the production gerrit restart on saturday it's working again
19:09:00 <clarkb> fungi: was that the only known issue between gerrit and storyboard? the other behavior is working as expected?
19:09:14 <fungi> yep
19:09:22 <corvus> oh, so it's not tied to commentlinks now, it's an explicit option?
19:09:28 <fungi> right
19:09:35 <corvus> cool.  that sounds like an improvement :)
19:09:39 <fungi> and now that i see how to interpret the plugin documentation, there are some other behaviors we could turn on if we want
19:10:27 <fungi> #link https://gerrit.googlesource.com/plugins/its-storyboard/ its-storyboard configuration docs
19:11:10 <fungi> #undo
19:12:10 <fungi> #link https://gerrit.googlesource.com/plugins/its-storyboard/+/stable-2.13/src/main/resources/Documentation/quick-install-guide.md#its_connection_its_connectionconnection-configuration its-storyboard configuration docs
19:13:06 <fungi> anyway, that's all there is to say about that, i think
19:13:13 <clarkb> any other storyboard updates? looks like migrations are proceeding and that was a blocker for some projects iirc so yay
19:13:31 <fungi> right, openstackclient and openstacksdk migrated last week
19:13:39 <fungi> as did the tripleo validations subteam
19:14:14 <fungi> tripleo is struggling to figure out how their task tracking workflows (which previously involved considering tripleo one giant fluid project on lp) map to per-repository task tracking in storyboard
19:15:06 <fungi> i was going to start tinkering with search engine indexability next, but mordred pointed out that the angularjs upgrade he's working on comes with a feature to do that automagically
19:15:27 <clarkb> fungi: interesting the js gives bots what they need to index properly?
19:15:48 <fungi> server-side javascript client that prerenders page content
19:16:23 <clarkb> huh
19:16:39 <fungi> something called "universal pages"
19:17:14 <fungi> #link https://universal.angular.io// Server-side rendered javascript
19:17:33 <frickler> we may need to look at performance some time, https://storyboard.openstack.org/#!/story/list?status=active&tags=blocking-storyboard-migration just took like 40 seconds to load for me
19:17:58 <fungi> yeah, a few people have noted that storytag-based queries are starting to take forever
19:18:12 <fungi> i have a feeling there's an inefficient db operation there somewhere
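For anyone who later picks up the storytag query slowness mentioned above: a minimal sketch of one way to find the offending statement, assuming the StoryBoard API server uses SQLAlchemy; the threshold, logger name, and wiring point here are illustrative only, not StoryBoard's actual code:

    # Log any SQL statement that takes longer than a (hypothetical) threshold,
    # using SQLAlchemy's documented cursor-execute events.
    import logging
    import time

    from sqlalchemy import event
    from sqlalchemy.engine import Engine

    LOG = logging.getLogger('slow_queries')
    SLOW_QUERY_SECONDS = 1.0  # illustrative threshold

    @event.listens_for(Engine, 'before_cursor_execute')
    def _start_timer(conn, cursor, statement, parameters, context, executemany):
        conn.info.setdefault('query_start_time', []).append(time.time())

    @event.listens_for(Engine, 'after_cursor_execute')
    def _log_if_slow(conn, cursor, statement, parameters, context, executemany):
        elapsed = time.time() - conn.info['query_start_time'].pop()
        if elapsed > SLOW_QUERY_SECONDS:
            LOG.warning('query took %.2fs: %s', elapsed, statement)

Once the slow statement is identified, running EXPLAIN against the production schema would show whether a missing index on the tag join is to blame.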
19:19:15 <clarkb> ok anything else before we move on?
19:19:24 <fungi> profilnig that is probably a little beyond my skillset though
19:19:28 <fungi> er, profiling
19:19:32 <fungi> nope, nothing from me at least
19:19:37 <clarkb> #topic Modernizing Config Management
19:20:04 * cmurphy lurks
19:20:07 <clarkb> I'm sort of operating under the assumption that this will be the next priority effort. Whether that is puppet4/5 or ansible or mordred's to-be-written use-containers spec
19:20:25 <clarkb> given the rate of fires showing up over the last week I don't think we've really had any time to write down the thoughts on this in order to compare them directly
19:20:46 <clarkb> I'd like to shoot for next week but depending on how pip 10 related stuff goes I can see that being difficult too :/
19:20:59 <fungi> maybe we can switch to pip-based configuration
19:21:16 <clarkb> but if you are involved with one of the proposals updating the existing specs would be helpful so that we can compare them directly with up to date info across the board
19:21:53 <cmurphy> i've started finishing up the last bits of http://specs.openstack.org/openstack-infra/infra-specs/specs/puppet_4_prelim_testing.html
19:22:07 <clarkb> I don't really want to have a dig-in conversation on this stuff until we have that information available (and hopefully everyone can read through it at least once before we do the major considering)
19:22:27 <clarkb> cmurphy: cool, do we need a next step spec for an actual migration plan that we can compare to the others?
19:22:40 <clarkb> "this is what a puppet4 migration looks like in infra" type of doc
19:23:04 <cmurphy> clarkb: yeah i started working on that a while ago https://review.openstack.org/#/c/449933/
19:23:04 <patchbot> patch 449933 - openstack-infra/infra-specs - Rename and expand Puppet 4 Preliminary Testing
19:23:14 <cmurphy> and then decided to aim lower first
19:23:22 <cmurphy> and then got very very distracted
19:23:59 <clarkb> cmurphy: cool, probably want to make sure that is up to date if anything has changed. Let me know if you need someone else to do that
19:24:25 <cmurphy> clarkb: sure, will prioritize it this week
19:24:50 <clarkb> pabelanger: mordred: like I said not surprised that fires and other obligations kept people away from updating specs. Let me know if you need help or otherwise ENOTIME
19:25:42 <clarkb> I'd like to keep moving so that we have time to talk about some of those fires and just do a quick sanity check on where we are at on them
19:25:47 <clarkb> #topic General Topics
19:26:01 <clarkb> we'll start with frickler's question about advertising irc highlights for infra folks
19:26:30 <clarkb> one suggestion is to put it in the topic, which isn't a bad idea since it's close to where people would be expected to use the highlights
19:26:49 <frickler> I just wanted some feedback on whether folks want that or rather want to stay more hidden
19:26:50 <clarkb> I just worry the topic is quite full already
19:27:37 <corvus> 19:27 -!- Topic for #openstack-infra: Discussion of OpenStack Developer and Community Infrastructure | docs http://docs.openstack.org/infra/ | bugs https://storyboard.openstack.org/ | source
19:27:38 <corvus> https://git.openstack.org/cgit/openstack-infra/ | channel logs http://eavesdrop.openstack.org/irclogs/%23openstack-infra/
19:27:39 <frickler> how to place it into the topic would be a separate question
19:27:40 <corvus> for reference ^
19:27:52 <frickler> or maybe where else to document it
19:28:26 <ianw> do we really miss that many questions coming along in #openstack-infra?  infra-root, etc certainly calls attention for people who know it, but i'm not sure people are failing to get help without it
19:28:57 <frickler> maybe a mention in the top yellow box on https://docs.openstack.org/infra/manual/ would be enough, too
19:28:59 <corvus> ianw: agreed
19:29:11 <corvus> frickler: or https://docs.openstack.org/infra/ ?
19:29:17 <corvus> or both :)
19:29:18 <fungi> i don't always read scrollback, but when i do i try to follow up with people</dos equis commercial>
19:29:23 <frickler> ianw: not miss them, but certainly my responses may be delayed by an hour or more
19:29:30 <corvus> that's the first link in the topic
19:30:16 <clarkb> ya updating the docs seems like a good first step
19:30:26 <corvus> we could put basically the same yellow box in both docs
19:30:29 <clarkb> the docs are linked in the topic, so if following from the topic to the docs you hopefully find it
19:30:34 <frickler> o.k., I can propose a patch for that
19:31:34 <frickler> and with that I'd be o.k. to continue, when I added the topic, there wasn't much else listed
19:31:46 <clarkb> sounds like a plan then, thanks
19:31:48 <frickler> but now we have other topics to handle I guess
19:31:52 <fungi> people wait until the last minute to add topics
19:31:56 * fungi is part of the problem
19:32:02 <clarkb> also fires
19:32:12 <clarkb> The next item on the list was reorganizing the openstack-ci storyboard group to drop zuul specific projects and move them into a zuul specific group (from fungi)
19:32:21 <fungi> this should be real fast
19:32:47 <fungi> i asked in #zuul and there seemed to be some interest in putting the zuul family of repositories in a zuul project group for ease of overview
19:33:04 <corvus> i'm +1 on creating/adding them to zuul group.  i'm +/-0 on removing them from openstack-ci
19:33:15 <fungi> just trying to figure out whether i should drop any or possibly all of them from the openstack-ci project group (we may also want to rename that at some point?)
19:33:19 <corvus> maybe +0
19:33:43 <fungi> for example, zone_zuul-ci.org maybe ought to appear in both project groups
19:33:46 <clarkb> I think I'm leaning towards the change if only to make it a little more clear there is a distinction now
19:33:55 <clarkb> and there are different groups responsible for each (even if membership overlaps)
19:34:07 <fungi> i honestly don't know how many people rely on sb project grouping anyway
19:34:31 <clarkb> fungi: maybe not in sb directly but the projects.yaml file I think gets viewed in weird ways in places
19:34:41 <clarkb> but I also don't think it is a major issue
19:34:46 <fungi> keeping zuul-jobs in both project groups may also make sense, but e.g. zuul-base-jobs (which openstack's deployment doesn't use) probably is fine being only in the zuul project group
19:37:05 <clarkb> doesn't seem to be any strong opinions either way
19:37:10 <fungi> okay, i'll take a stab at this under the impression that we can mostly move zuul repos to a new zuul project group but that there are a handful of repos we might want to stick in both for convenience
19:37:33 <fungi> and with that i have nothing else on this topic
19:37:40 <clarkb> onward!
19:37:45 <clarkb> Pip 10 released over the weekend
19:38:17 <clarkb> as expected this created problems for a non-zero number of jobs. In some cases pip has been pinned to version 9 to get around this; in others we've worked to accommodate the new version's behaviors
19:38:49 <clarkb> I bring this up because I think we need to be careful where we pin pip
19:38:57 <fungi> in some cases pip has also inadvertently been pinned as a side effect of image generation problems and pauses
19:39:17 <clarkb> it's ok if openstack specifically wants to pin pip, but places where we produce code or artifacts that are consumed outside of openstack likely shouldn't rely on that
19:39:40 <fungi> but yes, capping pip should in general be viewed as a sad compromise we need to unwind as early as we can
19:39:45 <clarkb> zuul-jobs and dib in particular are places where I think we need to be careful to roll forward and continue to function rather than pinning pip
19:39:56 <ianw> from devstack pov, i think the main job is pip10 safe, but i need to look into some sadness in the "platform" (!xenial) jobs -> https://review.openstack.org/#/c/561597/
19:41:07 <clarkb> I've not gone looking too hard yet but on the infra side of things we should probably dig through puppet logs and see if we have any failures related to this as well
19:41:11 <corvus> ianw: do we not need a venv fix for devstack?
19:41:47 <fungi> curious how we're dealing with, e.g., the python-psutil ubuntu package not being uninstallable by pip
19:41:56 <fungi> in devstack context i mean
19:42:00 <clarkb> right now aiui we are relying on pip pins
19:42:17 <ianw> corvus: i don't think we *need* it ... that's not to say it is not a good idea
19:42:33 <ianw> fungi: so in that package's case, we don't actually need that package installed
19:42:47 <frickler> fyi I tried to list all the qa + infra issues here
19:42:51 <frickler> #link https://etherpad.openstack.org/p/pip10-mitigation
19:42:58 <fungi> ianw: great news ;)
19:43:03 <corvus> i thought clarkb showed us that pip10 requires a venv.  but ianw just linked to a change which lifts the pin and the only failing jobs are "platform" jobs
19:43:16 <corvus> so i'm missing something :)
19:43:35 <clarkb> corvus: ianw in the general case we can't control what distros package with (distutils or not) or what python system packages other packages we need will depend on
19:43:44 <fungi> corvus: _if_ we can avoid preinstalling any python libraries via distro packages that have been built with distutils (some are built with setuptools instead) then pip 10 is able to "upgrade" from those fine
19:43:50 <frickler> corvus: note that this currently checked only the plain devstack job. things may still be failing for specific projects like glance
19:44:13 <clarkb> so there may be specific scenarios where we can set things up such that they work
19:44:16 <clarkb> but others they don't
19:44:32 <ianw> clarkb: ++ yes, it's whack-a-mole, fixing each issue
19:44:34 <clarkb> one particular concern of mine is that our images use glean and not cloud-init, so they avoid a large amount of system python that many other users will have
19:44:34 <corvus> i'm still missing what changed between clarkb's pip10 failure and ianw's pip10 success
19:44:49 <clarkb> so we can test on our images and say "it works. not a bug" but no one else will be able to run it sort of situation
19:44:55 <fungi> corvus: so it's mainly an intersection of specific distro packages built on distutils and specific python packages we want to pip upgrade (primarily due to constraints version specifications)
19:45:10 <clarkb> corvus: I'm guessing they removed python-psutil from the installation list since ianw says it isn't required
19:45:30 <frickler> https://review.openstack.org/#/c/561426/
19:45:37 <fungi> corvus: ianw is saying the workaround for the python-psutil failure was to stop preinstalling the distro-packaged python-psutil (if i understood correctly)
19:45:56 <ianw> fungi / corvus: yes -> https://review.openstack.org/#/c/561426/
19:45:56 <corvus> frickler: ah, and that's a dep of 561597
19:46:01 <corvus> got it
19:46:17 <corvus> so that fixes the current problem, clarkb's venv solution still may be better long-term because of the points he just raised
19:46:22 <clarkb> corvus: right
19:46:41 <clarkb> however switching to venvs is likely also to be whack-a-mole of fixing random assumptions made in plugins and so on
19:46:50 <corvus> cool... can we pretend i'm not devstack-core for a second and tell me what a 'devstack-platform' job is?
19:46:50 <fungi> right, however we can manage to stop mixing distro package managers and pip is going to be more future-proof, i imagine
19:46:51 <clarkb> I've already had to address some of that in ironic with grenade
19:47:15 <ianw> corvus: "platform" is just what i've called !xenial-based jobs
19:47:25 <ianw> centos/suse/fedora
19:47:44 <corvus> ianw: so they may have other random python packages installed (a la what clarkb was saying earlier)
19:47:57 <ianw> corvus: that was the sentiment roughly expressed in http://lists.openstack.org/pipermail/openstack-dev/2018-April/129384.html too
19:48:04 <clarkb> corvus: yup, and because the packagers are different they may use distutils on centos where ubuntu uses setuptools
19:48:07 <clarkb> and vice versa
19:48:22 <corvus> okay, i think i grok
19:48:35 <ianw> corvus: yes, and so as a first approach small hacks, such as possibly manually removing a .egg-info file so pip continues to do what it was doing before, might work best
19:49:22 <clarkb> I actually looked at the debian psutil package to see what sort of effort would be required to start untangling this at a distro level and I think it would not be easy
19:49:36 <clarkb> it is very package specific
19:50:05 <ianw> clarkb: yeah, in some cases i've had packagers add the flag that writes the manifest file out and it's all good ... let me find that bug
19:50:47 <clarkb> I do think we are making reasonable progress. Its just a lot of situation specific cases of determining what the best way forward is
19:50:47 <fungi> some distros have switched most of their packages to use something akin to 'pip install .'
19:50:56 <clarkb> and trying to avoid getting cornered into needing pip9 long term
19:51:05 <fungi> as opposed to the problematic 'setup.py install'
19:51:53 <ianw> clarkb: https://bugzilla.redhat.com/show_bug.cgi?id=1255206 for reference
19:51:54 <openstack> bugzilla.redhat.com bug 1255206 in python-cffi "Consider adding installed-files.txt during build" [Unspecified,Closed: errata] - Assigned to ppicka
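To make the failure mode above concrete: pip 10 refuses to uninstall a distribution whose only metadata is distutils-style .egg-info with no RECORD or installed-files.txt manifest, because it cannot tell which files it would need to remove. A minimal sketch (not pip's actual implementation) that scans sys.path for such distributions, which can help spot which distro packages will trip the new behavior on a given image:

    # List installed distributions whose metadata lacks an uninstall manifest,
    # i.e. the ones pip 10 will refuse to upgrade in place.
    import glob
    import os
    import sys

    def suspect_distributions(paths=None):
        for path in paths or sys.path:
            for egg_info in glob.glob(os.path.join(path, '*.egg-info')):
                if os.path.isfile(egg_info):
                    # a bare .egg-info file means distutils wrote no manifest at all
                    yield egg_info
                elif not any(os.path.exists(os.path.join(egg_info, name))
                             for name in ('RECORD', 'installed-files.txt')):
                    yield egg_info

    if __name__ == '__main__':
        for hit in suspect_distributions():
            print(hit)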
19:52:05 <clarkb> Briefly before we run out of time I also wanted to follow up on the xenial initramfs situation since it is related
19:52:43 <clarkb> Our xenial images are continuing to use global pip 9 on boot because we aren't able to build working xenial images that have pip 10 due to initramfs changes that prevent xenial from booting on rax and I guess other clouds
19:53:03 <clarkb> So be prepared for a potential new set of failures as soon as that gets fixed and we upload new xenial images
19:53:27 <clarkb> ianw: pabelanger: sounds like you have it largely under control as far as debugging goes?
19:54:06 <ianw> clarkb: that might be a little strong :)
19:54:08 <fungi> though worth noting, we got tox-based jobs in general working with pip 10 according to my ad hoc tests upgrading pip before invoking the rest of the job
19:54:21 <clarkb> ianw: well at least we've identified the probable source of the problem, now to identify why it broke :)
19:54:29 <fungi> clarkb fixed the one known issue there (where tox-siblings relied on pip internals)
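For reference, the tox-siblings issue noted above was the other known pip 10 hazard: the library internals all moved under pip._internal, so anything doing "import pip" and reaching into it stopped working. A minimal sketch of the more future-proof pattern, shelling out to pip and parsing its JSON output instead of importing it (the helper name here is just illustrative):

    # Query installed packages without touching pip's internal modules.
    import json
    import subprocess
    import sys

    def installed_packages():
        out = subprocess.check_output(
            [sys.executable, '-m', 'pip', 'list', '--format=json'])
        return {pkg['name']: pkg['version'] for pkg in json.loads(out.decode('utf-8'))}

    if __name__ == '__main__':
        print(installed_packages())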
19:54:38 <clarkb> ianw: lsinitramfs was where I ended up before needing to do meeting things (and comparing the output across builds)
19:54:39 <ianw> but i think we are narrowing down ...
19:55:07 <clarkb> #topic Open Discussion
19:55:20 <clarkb> Before we run out of time anything else?
19:55:29 <ianw> clarkb: cool, yeah i thought i compared the initramfs's yesterday, but now i'm not sure if i was comparing apples and oranges with the versions i grabbed.  confirmation on that will be good
19:55:43 <clarkb> I'm visiting the dentist tomorrow at 1900UTC so will be out and about for that
19:56:14 <fungi> i'm out all thursday
19:56:30 <frickler> just mentioning that patchbot is an effort by notmyname. noticed it in #swift earlier and said I liked it
19:56:57 <frickler> admittedly might need further discussion where to run it, though
19:57:36 <clarkb> I think there was concern about its chattiness at one time? I tend to just smartfilter things out in weechat if I need to
19:58:03 <fungi> yeah, it keeps finding its way back onto the meeting channels. project-specific channels who have some consensus in favor are of course free to have whatever bots they want join their channels
19:58:17 <corvus> i think bots that run in official channels should be driven through the community maintained infrastructure
19:59:01 <clarkb> ya its nice to not lose features if someone can't keep running a bot for example
19:59:24 <fungi> that i agree with
20:00:05 <clarkb> and we are at time
20:00:08 <clarkb> #endmeeting