16:00:12 <adrian_otto> #startmeeting Solum Team Meeting
16:00:13 <openstack> Meeting started Tue Sep 23 16:00:12 2014 UTC and is due to finish in 60 minutes.  The chair is adrian_otto. Information about MeetBot at http://wiki.debian.org/MeetBot.
16:00:14 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
16:00:17 <openstack> The meeting name has been set to 'solum_team_meeting'
16:00:19 <adrian_otto> #link https://wiki.openstack.org/wiki/Meetings/Solum#Agenda_for_2014-09-23_1600_UTC Our Agenda
16:00:26 <adrian_otto> #topic Roll Call
16:00:28 <adrian_otto> Adrian Otto
16:00:33 <roshanagr> Roshan Agrawal
16:00:35 <datsun180b> Ed Cranford
16:00:36 <julienvey> Julien Vey (have to leave at :30)
16:00:40 <mkam> Melissa Kam
16:00:47 <devkulkarni1> Devdatta Kulkarni
16:00:51 <adrian_otto> julienvey: Acknowledged, thanks.
16:00:54 <gpilz> Gilbert Pilz
16:01:14 <noorul> Noorul Islam
16:01:23 <stannie> Pierre Padrixe
16:02:14 <adrian_otto> Welcome everyone
16:02:16 <muralia> murali allada
16:02:26 <ravips> Ravi Sankar Penta
16:02:44 <adrian_otto> #topic Announcements
16:02:55 <adrian_otto> would any members of the team like to make announcements today?
16:03:48 <adrian_otto> #topic Review Action Items
16:04:07 <adrian_otto> dimtruck (with help from PaulCzar) will investigate using wsme
16:04:24 <dimtruck> this is still in progress
16:04:32 <adrian_otto> I'm not sure exactly what that is about
16:04:39 <dimtruck> oh context - sorry
16:05:13 <dimtruck> during our testing we found an issue with wsgiref where it creates a thread that doesn't complete for requests that aren't mapped in wsgi
16:05:32 <dimtruck> in our example, if we don't have a method in pecan for POST /, PUT /, DELETE /
16:05:34 <devkulkarni1> I am looking for the bug. I think we have added it
16:05:36 <adrian_otto> do we need a ticket for that in the bug system, or is tracking it as an #action here suitable?
16:05:46 <dimtruck> then you can simply curl to it and DoS our application
16:05:52 <dimtruck> there's a bug out there already
16:05:58 <adrian_otto> we can add a bug in an Incomplete state if we don't know yet how to reproduce, etc.
16:06:13 <adrian_otto> ok, let's reference that here with a #link
16:06:21 <dimtruck> it's reproducible...one sec.  getting it
16:06:29 <devkulkarni1> #link https://bugs.launchpad.net/solum/+bug/1367473
16:06:30 <uvirtbot> Launchpad bug 1367473 in solum "PATCH requests are not supported in documented apis" [Undecided,New]
16:06:32 <stannie> dimtruck: so it's more pecan related right?
16:06:44 <devkulkarni1> sorry that is the wrong link
16:06:46 <dimtruck> well, it's more wsgiref simple_server
16:06:56 <dimtruck> not really pecan per se
16:07:07 <stannie> ok
16:07:15 <dimtruck> in the bug i listed a link where the suggestion is to not use wsgiref in production applications
16:07:35 <devkulkarni1> #link https://bugs.launchpad.net/solum/+bug/1367470
16:07:38 <adrian_otto> ok, and I see a review posted against that bug
16:07:40 <uvirtbot> Launchpad bug 1367470 in solum "Solum api hangs on non GET root requests" [Undecided,New]
16:08:05 <dimtruck> that's the one!
16:08:24 <devkulkarni1> the correct bug is 1367470
16:08:26 <adrian_otto> aha, I see, thanks for bringing me up to speed
16:08:58 <adrian_otto> so it looks like we can drop this as an action item for next week, and consider this one complete, since it is tracked elsewhere. Agreed?
16:09:13 <devkulkarni1> sure
16:09:28 <dimtruck> agreed
16:09:33 <adrian_otto> we can always look at it during the BP/Task/Bug section if we want cross team discussion
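For reference, a minimal reproduction sketch of bug 1367470 as dimtruck describes it (host and port here are assumptions, not necessarily Solum's defaults):

    # The API is served by wsgiref's simple_server; only GET / has a routed handler.
    curl http://127.0.0.1:9777/            # mapped handler, returns normally
    curl -X POST http://127.0.0.1:9777/    # no mapped handler, the request never completes
    # Each hung request ties up a serving thread, hence the DoS concern and the
    # upstream advice against using wsgiref in production.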
16:09:55 <adrian_otto> ok, cool, let's look at the next AI
16:09:57 <adrian_otto> ravips will investigate f20 gate for failing barbican tests and come back with a suggestion for whether to make f20 non-voting
16:10:11 <ravips> sure,  I did some experiments on F20 yesterday
16:10:12 <adrian_otto> #link https://review.openstack.org/#/c/122782/ Patch for f20
16:10:38 <ravips> some background on the problem: plan create involving private repo is failing on f20 during barbican secret deletion
16:10:57 <ravips> devstack + barbican => No issues (tested create/delete secret using python barbican client)
16:11:15 <ravips> devstack + barbican + solum => able to reproduce the exact issue (stacktrace: http://paste.openstack.org/show/114446/)
16:11:32 <ravips> I need to debug further to narrow down the issue
16:12:17 <adrian_otto> ok, we have seen intermittent issues over a relatively long history with gate tests failing on delete actions
16:12:35 <adrian_otto> but it does not always happen, so we might have a lurking bug somewhere
16:13:03 <ravips> something we use in solum is triggering a bug in barbican
16:13:26 <ravips> I will update once i have more details
16:13:33 <adrian_otto> ravips: do you feel equipped to continue with the troubleshooting, or do you need help from another team member or members?
16:14:10 <ravips> i started this yesterday, I think I should be able to narrow down the issue today
16:14:28 <adrian_otto> ok, thanks ravips
16:14:42 <adrian_otto> that concludes our action item review
16:14:46 <adrian_otto> #topic Blueprint/Task Review
16:14:54 <adrian_otto> check_uptodate.sh handling (devkulkarni)
16:15:03 <adrian_otto> #link https://bugs.launchpad.net/solum/+bug/1372959
16:15:07 <uvirtbot> Launchpad bug 1372959 in solum "check_uptodate handling" [Undecided,New]
16:15:12 <devkulkarni1> please take a look at the bug description
16:15:43 <devkulkarni1> the gist is: can we do something about check_uptodate so that we don't get -1 votes for things that might have changed upstream?
16:16:36 <gpilz> I'm not sure why we even have a static solum.conf if we can generate the default
16:16:50 <datsun180b> ^^
16:17:21 <adrian_otto> ok, so if we had a separate gate test for the config test, and it were a nonvoting job that might work well enough
16:17:36 <adrian_otto> so that we get a clue when the configuration goes stale, but it does not halt all work
16:18:39 <devkulkarni1> sure.. I don't have a preference as long as work can continue
16:18:48 <ravips> yeah that works as well
16:18:50 <datsun180b> would it be right to compare that to our "update from global requirements" reviews at least in tone?
16:18:56 <adrian_otto> gpilz: It's traditional for Linux software to include a <project>.conf file with the various options listed
16:19:30 <devkulkarni1> datsun180b: sure.. where are you going with that?
16:19:44 <datsun180b> just to get a feel for severity/priority
16:19:48 <gpilz> adrian: I understand the need to include a <project>.conf file
16:19:50 <devkulkarni1> are you saying we treat the failing non-voting check_uptodate check as the trigger to go generate one
16:20:10 <gpilz> but, if we have the ability to generate a conf file with the default settings from our code, why not just go with that?
16:20:14 <adrian_otto> devkulkarni1: yes
16:20:39 <adrian_otto> gpilz: I think that should be part of the gate test, actually
16:20:47 <adrian_otto> the generation of the file
16:21:36 <devkulkarni1> adrian_otto: but we are not generating the thing to compare to by hand, so what does the gate test achieve?
16:21:54 <ravips> btw, generate_sample_conf on vagrant doesn't match what the gate is expecting; 'host' on vagrant is returning solum and the gate expects localhost..anyone experienced this case?
16:22:45 <datsun180b> right, our sample conf gets modified during setup and we'd probably do well to stem that if we can help it
16:22:58 <adrian_otto> devkulkarni1: we are detecting the case where configuration settings change in other projects, right?
16:23:12 <datsun180b> simply spinning up the vagrant environment and doing nothing else is enough to change the sample conf
16:23:13 <stannie> adrian_otto: yes
16:23:14 <devkulkarni1> ravips: good to know.. I think we need to change vagrant env so that it is consistent with the gate
16:23:49 <adrian_otto> devkulkarni1: +1
16:24:30 <ravips> adrian_otto: so will we move the check_uptodate script to some existing non-voting job, or are we going to create a new one?
16:25:08 <datsun180b> i say new job, and take the sqlalchemy check with it so our pep8 env only runs flake8
16:25:21 <devkulkarni1> adrian_otto: yes, that is the purpose of the gate test. what I was saying was, do we want to use gate to tell us that our conf is stale vs
16:25:25 <datsun180b> sorry, alembic branches
16:25:40 <devkulkarni1> say, adding documentation to our release notes that tells the operator to generate the file
16:26:14 <adrian_otto> we should just add it to the setup_install.py code
16:26:24 <devkulkarni1> ravips: we should create a new tox environment and new job.. let the conf checking be its own thing
16:26:44 <devkulkarni1> if we want to check it at all in the first place
16:27:06 <adrian_otto> maybe we don't need to check it if we always auto-generate it at install time
16:27:21 <gpilz> +1
16:27:32 <adrian_otto> so you just get a current solum.conf file each time you install Solum
16:27:34 <ravips> adding to non-voting job may not be effective, I don't know how many of us will look at the failing tests for the non-voting jobs
16:27:48 <gpilz> all we are really checking for is the ability of contributors to properly cut & paste
16:27:54 <devkulkarni1> ravips: you make a valid point :)
16:28:05 <ravips> I like adding doc to our release notes
16:28:27 <ravips> or at least a blocker bug every release to generate the conf file
16:28:43 <adrian_otto> ravips: so that you have to RTFM to find out how to make a config file? That seems awkward to me.
16:28:59 <adrian_otto> oh, so that I do it when I tag releases?
16:29:17 <ravips> yes
16:29:33 <devkulkarni1> is there a way to create such bugs?
16:29:47 <adrian_otto> that occurred to me, but I thought that might not be as good between releases when the upstream changes happen
16:29:53 <devkulkarni1> or is it just a mental note that the release manager has to keep
16:30:09 <adrian_otto> it might actually cause gate tests to fail for apparently unknown reasons, which is probably why we have this to begin with, right?
16:30:19 <ravips> we can create a new tag, release-essential or some other better name
16:30:24 <devkulkarni1> good point about upstream changes between releases
16:30:59 <adrian_otto> the static conf file is a way to check what the config was last time
16:31:14 <adrian_otto> and the generated content is a way to check what is expected now
16:31:58 <adrian_otto> but if we auto-generate config upon every install and every gate test run, then this would not matter
16:32:19 <devkulkarni1> right (to the last sentence)
16:32:51 <adrian_otto> so we have a pre-test hook for every gate job, right?
16:33:02 <devkulkarni1> what do you mean?
16:33:17 <adrian_otto> we can specify scripts to run before tests do, right?
16:33:24 <devkulkarni1> oh!
16:33:40 <devkulkarni1> don't know enough to comment on it
16:33:45 <dimtruck> it does
16:33:46 <dimtruck> we can
16:33:59 <dimtruck> in devstack_gate
16:34:07 <adrian_otto> assuming that's possible, we can make the config generate script a pre-test hook script.
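For reference, devstack-gate will call a pre_test_hook function if the job defines one; a sketch of wiring the config generation in there (paths and script flags assumed from the oslo-incubator tooling of the time, not verified against Solum's tree):

    function pre_test_hook {
        # Regenerate the sample config from the options registered in code
        # before any tests run, so the gate never tests against a stale file.
        cd /opt/stack/new/solum
        tools/config/generate_sample.sh -b . -p solum -o etc/solum
    }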
16:34:10 <ravips> most of the time, new changes to conf file are not related to solum..we only care about the fields of other openstack projects that we use (like keystone). I don't think we need to run this script frequently..if something has changed in the upstream that affects us, our tests will catch it (assuming we have good coverage)
16:34:56 <adrian_otto> ravips: yes, but we might not know why it's failing
16:35:08 <adrian_otto> having the automated config test may save us a lot of research time
16:35:34 <adrian_otto> that's where a non-voting gate test might be handy
16:35:47 <adrian_otto> if the other func tests fail, we can look to see if the config test failed
16:35:58 <devkulkarni1> actually yeah.. even if there is a stale conf, a failing job will point us in the right direction
16:35:59 <adrian_otto> and if it did, we can resolve that first
16:37:08 <adrian_otto> and if no tests failed, then who cares until it causes a problem
16:37:15 <adrian_otto> ;-)
16:37:35 <ravips> yeah, non-voting job might be useful in case of voting job failures
16:37:37 <devkulkarni1> I am leaning towards keeping the check in a non-voting gate
16:37:58 <adrian_otto> devkulkarni1: +1
16:37:59 <ravips> +1
16:38:09 <adrian_otto> any alternate points of view to consider?
16:38:22 <datsun180b> agree, make it non-voting
16:38:28 <muralia> yup
16:38:57 <gpilz> +1
16:39:01 <adrian_otto> #agreed to resolve bug 1372959 we will use a non-voting gate test for configuration file testing.
16:39:02 <uvirtbot> Launchpad bug 1372959 in solum "check_uptodate handling" [Undecided,New] https://launchpad.net/bugs/1372959
16:39:07 <devkulkarni1> cool
16:39:35 <adrian_otto> devkulkarni1: please update the bug accordingly, referencing team meeting on 2014-09-23
16:39:44 <devkulkarni1> yeah, will do that
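For reference, a sketch of what the agreed non-voting check could amount to (script name, flags, and paths assumed from oslo-incubator conventions, not verified against Solum's tree):

    # Regenerate the sample config, then diff it against the committed copy;
    # a non-empty diff means upstream option changes have left it stale.
    tools/config/generate_sample.sh -b . -p solum -o /tmp
    diff -u etc/solum/solum.conf.sample /tmp/solum.conf.sample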
16:40:05 <adrian_otto> ok, that brings us to our next sub-topic which is similar in nature
16:40:08 <adrian_otto> strategy to sync up openstack/common in solum with upstream (devkulkarni)
16:40:20 <devkulkarni1> yes.. so this came up today with my discussion with stannie
16:40:20 <adrian_otto> so some backstory here
16:40:35 <adrian_otto> devkulkarni1:  you can go first if you like
16:40:53 <devkulkarni1> actually, I am myself interested in the story.. so you go first
16:41:05 <devkulkarni1> as in listening to it.. don't know the context
16:41:08 <adrian_otto> ok, so Solum used to be listed in the openstack projects.txt
16:41:26 <adrian_otto> meaning that it was forced to use only requirements that other openstack projects were allowed to use
16:41:49 <adrian_otto> when we began using mistral's client and barbican client, that became problematic
16:42:06 <adrian_otto> because these were dependencies that other projects were not yet allowed to use
16:42:35 <adrian_otto> so the only solution that would unjam our project was to break that link with the check against global-requirements.txt and proceed with what we needed
16:43:23 <adrian_otto> so that leaves us without the convenience of the automated requirements bot that comes around and finds dependencies that are out of sync, and submits reviews to update them. aka: "reviewbot" or something?
16:43:59 <noorul> what is the link between this and openstack/common (oslo) ?
16:44:04 <adrian_otto> so we would like an equivalent of that for our own use that disregards our unique list of exceptions from the global-requirements.txt list.
16:44:31 <adrian_otto> noorul: in all honesty I don't know.
16:44:50 <devkulkarni1> noorul: exactly my question
16:45:00 <adrian_otto> I think there is something that can generate diffs, and can submit those as reviews as well, but I'm not completely sure
16:45:05 <noorul> they are different
16:45:12 <devkulkarni1> adrian_otto: I was referring to the python code that we have in solum/openstack/common/*
16:45:14 <stannie> the question is when should we (what frequency) sync oslo openstack/common
16:45:46 <devkulkarni1> we have that code since the beginning of the project
16:45:51 <noorul> We can bring in a policy for this
16:45:57 <noorul> I mean for oslo sync
16:46:13 <devkulkarni1> how/when we sync that up? what is the repo from which it is copied? is there a better way than copying over all the code?
16:46:24 <stannie> we didn't sync openstack.common for a long time which leaves us to have some bug not fixed etc
16:46:36 <stannie> there is a script to sync the modules
16:46:38 <adrian_otto> ok, so look
16:46:39 <noorul> long time?
16:46:56 <noorul> I think Angus synced it recently
16:46:59 <stannie> ok
16:47:00 <adrian_otto> what we can do on an immediate basis is to have one of us submit such a patch as a review to our project
16:47:04 <devkulkarni1> stannie: exactly.. we will be in that situation as long as we are maintaining that code on our side as well
16:47:05 <noorul> especially the DB part and oslotest
16:47:23 <noorul> and I did sync python-solumclient
16:47:26 <adrian_otto> then we can decide if we want an automated updating thing (one may exist that we can leverage)
16:47:36 <devkulkarni1> noorul: true.. but I don't think we have synced up everything
16:48:08 <noorul> may be for solum some modules are left
16:48:21 <stannie> noorul do you know what is the policy on other projects?
16:48:29 <devkulkarni1> noorul: but the basic question I have is, is this the only approach for us?
16:48:30 <noorul> stannie: not sure
16:48:36 <adrian_otto> I could check in with the Oslo team to ask what they recommend, and ask what we should be reading about if there is already written advice for this.
16:48:44 <noorul> here either me or Angus used to sync
16:48:48 <devkulkarni1> that is, to keep that code in our repo
16:49:30 <adrian_otto> if there is a way to shed that code, and just use a requirement on an Oslo release, that would be my strong preference.
16:49:58 <devkulkarni1> +1 .. but I believe there might not be, otherwise we would have already pursued that option
16:50:26 <devkulkarni1> and that is why I want to hear from noorul or others if there are any technical roadblocks preventing us from pursuing that option
16:50:31 <adrian_otto> ok, I am willing to take an action item to do some research and report back next week
16:50:57 <devkulkarni1> sounds good adrian_otto
16:51:17 <noorul> I think for core projects someone from oslo team syncs it
16:51:34 <noorul> but for stackforge I think it is upto us
16:51:39 <adrian_otto> #action adrian_otto to investigate using alternatives to openstack/common in Solum, and report back to the team with options.
16:51:49 <devkulkarni1> noorul: that is good to know.. but the question is, is keeping that code in each project's repo the only option?
16:51:53 <adrian_otto> noorul: makes sense
16:52:06 <noorul> devkulkarni1: Yes
16:52:18 <adrian_otto> ok, we are running low on time
16:52:21 <noorul> devkulkarni1: But some of them will be factored out to other packages
16:52:29 <gpilz> shoot
16:52:30 <adrian_otto> so let's touch on the last sub-topic before Open Discussion
16:52:34 <gpilz> I wanted to discuss https://review.openstack.org/#/c/117056/
16:52:39 <noorul> devkulkarni1: Like test module was factored out to oslotest
16:52:48 <adrian_otto> gpilz: hang on, we will revisit that
16:52:51 <adrian_otto> Etherpad for things to discuss at Paris summit (devkulkarni)
16:53:07 <noorul> devkulkarni1: As and when things mature, they will create new packages for different modules
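For reference, the sync stannie mentions is driven by oslo-incubator's update.py, which reads the consuming project's openstack-common.conf; a sketch (the module list here is illustrative):

    # openstack-common.conf in the solum tree declares what gets copied:
    #   [DEFAULT]
    #   module=log
    #   module=db
    #   base=solum
    # Then, run from an oslo-incubator checkout:
    python update.py ../solum   # copies the listed modules into solum/openstack/common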
16:53:09 <devkulkarni1> so I added that last week.. do we see a need for such an etherpad?
16:53:15 <adrian_otto> devkulkarni1: Did we start one, or are you suggesting we start one?
16:53:26 <devkulkarni1> I haven't started one. I am asking should we start one?
16:53:34 <adrian_otto> I should have mentioned this in the Announcements section
16:53:39 <devkulkarni1> I think it will be good to have
16:54:01 <adrian_otto> I applied for Solum to be in the Design Summit, and our application was accepted yesterday
16:54:08 <adrian_otto> so we will be on the official program
16:54:16 <devkulkarni1> oh cool!! congratulations all
16:54:16 <dimtruck> awesome!
16:54:18 <ravips> nice!
16:54:28 <roshanagr> good news!
16:54:31 <muralia> woohoo!
16:54:31 <datsun180b> good to hear
16:54:36 <gpilz> woot!
16:55:01 <noorul> Wow
16:55:04 <noorul> cool
16:55:32 <noorul> how many of you will be there?
16:55:49 <adrian_otto> so although many of us will not be able to attend for travel budgeting reasons, we will hold design sessions with those who can attend.
16:56:07 <adrian_otto> I will attend.
16:56:20 <gpilz> I just got my travel approval
16:56:35 <ravips> mine is still pending
16:56:44 <adrian_otto> Rackers will not know until later
16:56:56 <adrian_otto> but I will go even if Rackspace does not send me
16:57:54 <adrian_otto> #action adrian_otto to email a link to a Paris Summit topics etherpad to the ML
16:58:02 <adrian_otto> #topic Open Discussion
16:58:24 <adrian_otto> Gil, you asked about https://review.openstack.org/#/c/117056/
16:58:30 <datsun180b> oh so the topic in #solum mentions our next meeting is at the end of last June
16:59:09 <gpilz> yes - Devdatta has a -1 against it
16:59:18 <ravips> adrian_otto:  any updates on solum incubation request?
16:59:36 <devkulkarni1> gpilz: yeah, I thought there were some concerns
16:59:49 <devkulkarni1> will be reviewing it if those are resolved
16:59:50 <adrian_otto> datsun180b: thanks. Fixed.
17:00:19 <gpilz> devdatta: adrian assures me that those concerns have been addressed
17:00:24 <adrian_otto> ravips: OpenStack is having an identity crisis about the integrated release right now
17:00:31 <adrian_otto> so let's talk about it in Paris
17:00:37 <adrian_otto> or in #solum
17:00:43 <ravips> okay
17:00:45 <adrian_otto> thanks everyone for attending today
17:00:47 <devkulkarni1> gpilz: okay
17:00:49 <adrian_otto> #endmeeting