17:04:17 #startmeeting
17:04:18 Meeting started Wed Nov 9 17:04:17 2011 UTC. The chair is jaypipes. Information about MeetBot at http://wiki.debian.org/MeetBot.
17:04:19 Useful Commands: #action #agreed #help #info #idea #link #topic.
17:04:37 #topic Bug fixes and unit tests going into stable/diablo
17:04:59 Yes, we are almost finished writing the unit tests.
17:05:02 nati2: OK, we need to discuss how the bug fixes/new tests are being proposed into stable/diablo
17:05:06 nati2: so...
17:05:21 Yes.. I got the point. Essex first, Diablo second
17:05:24 nati2: Fixes/patches should FIRST be proposed to Essex trunk.
17:05:31 nati2: Ah, OK, good
17:05:33 jaypipes: yes. I got it
17:05:39 nati2: sounds like someone got to you already :)
17:05:50 jaypipes: the problem is manpower. X(
17:05:55 nati2: understood.
17:06:04 nati2: what can this team do to assist you?
17:06:07 jaypipes: we have a very strict milestone in December
17:06:19 nati2: is the problem that it's just you doing the translation from Japanese?
17:06:37 jaypipes: It would be very helpful to have someone who helps cherry-pick our commits to Essex
17:06:47 jaypipes: Because we write tests based on diablo/stable
17:06:52 nati2: hmm...
17:06:58 jaypipes: no, no Japanese translation problem occurred
17:07:10 nati2: so, is there any chance of getting those bug fixes targeted at Essex first?
17:08:45 jaypipes: hmm, my team doesn't have enough manpower right now to do that. We want to fix it the following way: stable/diablo internally first -> modify it for Essex -> review request for Essex -> review request for Diablo
17:09:28 nati2: but wouldn't it be more efficient to do the fix against Essex and have Mark and his team do the backports to stable/diablo?
17:09:42 nati2: that way your team doesn't have to do double work
17:11:03 jaypipes: Ah, however we need all bugs fixed internally this week, and my team has already started working.
17:11:22 jaypipes: And I couldn't get it merged this week.
17:11:28 nati2: k, understood.
17:11:50 nati2: OK, so we need a plan for when your team will switch from working on Diablo directly to working on trunk.
17:12:07 nati2: and then I can work on the "forwardports" of your existing fixes to the essex trunk.
17:12:23 nati2: but we need a hard date that your team will stop developing against Diablo
17:12:36 nati2: can we decide on a hard date for that?
17:12:52 nati2: because I need to coordinate with MarkMc on it
17:14:21 jaypipes: hmm, I can't decide on a hard date now... If QA of diablo/stable succeeds and there is no problem, we can move to Essex.
17:15:16 nati2: unfortunately, that won't work :) We will need a hard date so that we can plan the switch. Because up until that hard date, someone is going to have to be responsible for forward-porting all those fixes to Essex
17:15:52 nati2: it doesn't have to be decided right now, but before the end of this week a decision should be made IMO
17:16:18 nati2: I will take this conversation offline with you and markmc
17:16:25 jaypipes: OK. I got it :)
17:16:47 nati2: alrighty, want to give us a quick status report on the progress your team has made on bug fixes and unit tests in diablo/stable?
17:16:53 jaypipes: Thanks for your coordination
17:16:56 np
17:17:40 our team wrote 1000 bugs, and found 60 bugs on that. Exception handling and input value checking policies were applied.
17:18:01 The exception messages are more helpful now.
17:18:06 nnnoo I made a typo
17:18:14 our team wrote test cases
17:18:19 nati2: 1000 bugs or 1000 unit tests?
17:18:25 1000 unit tests
17:18:50 I pushed about half of them for review.
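(A rough sketch of the per-branch forward-port jaypipes offers to handle above, assuming a git remote that carries both Essex trunk and the team's diablo-based fix branches; the branch name, bug number, and use of git-review are illustrative, not the project's actual setup.)

```sh
# Fetch the latest state of both trees.
git fetch origin

# Start a forward-port branch from Essex trunk (master at the time).
git checkout -b forwardport/bug-123456 origin/master

# Cherry-pick the fix from the diablo-based branch; <sha> is the commit
# that carries the fix there. -x records the original commit id.
git cherry-pick -x <sha>

# Resolve any conflicts, re-run the new unit tests, then send the change
# to Gerrit for review against trunk.
git review master
```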
17:19:03 Our other branches are under my team's review.
17:19:08 wow, that is a lot :)
17:19:34 Take a look: https://github.com/ntt-pf-lab/nova/branches
17:19:42 you can see all the branches here
17:19:59 And they are also linked with the bug reports on the openstack-qa team.
17:20:26 OK, excellent. That will be very helpful in the process of forward-porting
17:21:15 alright, anybody have questions for nati2 before we move on?
17:21:39 OK, moving on.
17:21:42 Thanks. There are about 60 branches. And I think 1 branch takes 10-20 min on average to forward-port
17:22:06 so about 1200 min of manpower needed.
17:22:09 #topic dwalleck, bcwaldon and wwkeyboard to give status report on integration tests
17:22:20 and westmaas if he's around ;)
17:22:27 hello :)
17:22:32 There he is
17:22:40 o/ btw
17:22:51 #link https://review.openstack.org/#q,status:open+project:openstack/openstack-integration-tests,n,z
17:23:08 at the moment we are waiting on one review: https://review.openstack.org/#change,1251
17:23:22 which we agreed last time would be the basis for future tests to go in
17:23:31 looks like we are close on that one
17:23:41 yep, I'll add a review on that too, shortly.
17:23:47 dwalleck: I think it's waiting on one last comment from you, no more code necessary
17:24:03 just see if you agree with the general path, and then we can push that through
17:24:06 westmaas: did we finalize a new name for the project?
17:24:13 jaypipes: no :(
17:24:18 oh, poop.
17:24:24 storm is taken on launchpad, so that's a no go
17:24:27 ah
17:24:28 will make a new poll today
17:24:31 westmaas: From bcwaldon? I think I replied to that. But I can reply again and agree :)
17:24:35 boo
17:24:40 ahh
17:24:40 dwalleck: from blamar
17:24:46 Ahh, gotcha
17:24:55 will make a new poll today with a new pre-checked list of names
17:25:06 if anyone has any suggestions, email them to me
17:25:13 what's your email address?
17:25:15 westmaas: what about the patch from blamar?
17:25:20 https://review.openstack.org/1296
17:25:42 bcwaldon: gabe.westmaas@rackspace.com
17:26:13 jaypipes: that can make it in, the test will be migrated over
17:26:21 k, cool
17:26:55 mtaylor, jeblair: want to give a quick status update on the state of the CI job that should be kicking off the openstack-integration-tests against the RAX CI cluster?
17:27:06 sure.
17:27:11 I think daryl's should make it in with the name storm, and when we get the final name in, we can do a quick change, does that seem reasonable?
17:27:26 westmaas: ++
17:27:29 I know that dean troyer has done some legwork to make it possible to fire openstack-integration-tests against a devstack
17:27:33 ++
17:27:42 westmaas: ++
17:28:04 westmaas: stackstorm? :)
17:28:06 jaypipes: yep, I've been talking to him a bit
17:28:15 jaypipes: will add it :)
17:28:26 And I just called blamar blamer in my comments, so sorry about that :)
17:28:32 bcwaldon: talking to Dean?
17:28:38 dwalleck: that's what we call him
17:28:38 jaypipes: yep
17:28:38 westmaas: openstuck
17:28:41 k
17:28:44 dwalleck: no worries, it's happened a couple times in my life
17:28:46 nati2: LOL, that
17:28:49 is awesome
17:29:02 jaypipes: :p
17:29:14 raid: it kills bugs :)
17:29:20 :)
17:29:22 jk
17:29:27 Ah, I want to discuss CI stuff
17:29:33 I heard the jenkins team will use devstack
17:29:41 nati2: I think mtaylor is giving an update...
17:30:04 jaypipes: gotcha. mtaylor: please
17:30:25 or jeblair :)
17:30:49 yes.
we've got devstack successfully running in cloud servers now and running exercise.sh against stable/diablo
17:31:46 we're working on getting a few things done so that it's possible to safely build a gating job that uses that
17:32:08 and then next week jesse and rcb are going to get devstack working on trunk
17:32:11 mtaylor: what's exercise.sh?
17:32:19 is that a devstack script?
17:32:25 mtaylor: great. Is that a single node configuration?
17:32:32 s/exercise/jazzercise?
17:32:51 westmaas: it's a placeholder script for testing in devstack. it's intended to be replaced by stormstack
17:32:59 mtaylor: got it
17:33:27 mtaylor: is that a single node configuration?
17:33:40 it is
17:33:50 mtaylor: what happened to the 10-machine cluster? is that not being used now? :(
17:34:00 mtaylor: Do you have any multinode milestone?
17:34:27 jaypipes: it's still there and still in service - but getting devstack working on it in that context is a little more work
17:34:56 mtaylor: I'd like to point out that we *had* puppet-based deploy jobs already working on that...
17:35:05 no. we did not
17:35:17 mtaylor: how so? openstack-deploy-rax?
17:35:24 we had puppet based jobs that installed software. that installation did not actually _work_
17:35:36 ahh
17:35:39 and we still have those jobs, and at the moment they deploy using devstack
17:35:44 OK, so we have a different deploy method now that also doesn't work.
17:36:00 mtaylor: jaypipes: oh! - we want to try chef and puppet on HP cloud servers
17:36:04 but the openstack that they install can't run the simple exercise.sh test case
17:36:08 Ravikumar_hp: yes, I know :)
17:36:09 so what's the next milestone? extend devstack for multinode?
17:36:20 Ravikumar_hp: yes. so - we'd love to have a cloud that deploys using chef or puppet as well
17:36:29 Ravikumar_hp: Both puppet and chef?
17:36:37 mtaylor: working with Ravikumar_hp and Nayna on that... we'll discuss offline.
17:36:57 nati2: chef first, then later puppet...
17:36:58 cool
17:36:58 alternate in CI: one time puppet deploy, next time chef
17:37:05 jaypipes: Gotcha.
17:37:08 I do not want to do that...
17:37:19 mtaylor: do what?
17:37:22 I would rather have two sets of machines and deploy them the same way every time
17:37:25 Ravikumar_hp: ah, do you want puppet first?
17:37:38 there isn't really a good way to express "one time use puppet then next time use chef" in jenkins
17:37:46 nati2: chef first
17:37:51 mtaylor: yes, that's fine... I was just saying that HP prefers Chef, so that is what they will be working on first.
17:37:54 ok, let's use chef
17:37:55 but I'd LOVE to deploy using both every time
17:38:00 ah. sure. that's fine
17:38:25 so, from a CI perspective, we're just working on getting SOMETHING gating on integration tests, which at the moment is devstack based
17:38:27 So the chef-based CI will be the multinode test
17:38:40 as I've always said, let us please just get SOMETHING running in a production-similar environment, then iterate for other deployment methods...
17:38:41 And it can also cover deployment testing and packaging testing
17:38:45 we'd LOVE to get some labs hooked in doing chef or puppet deploys/tests
17:38:54 yes. I think we all agree there :)
17:39:32 mtaylor: OK, so final word on this: where are you with the devstack-based multi-node installation? how many more days do you think?
17:40:19 jaypipes: the multi-node deploy _works_ ...but it seems there are some config issues with the cloud that is deployed, and thus far we've been unable to get those resolved
17:40:40 so once that's sorted, we'll be in good shape there - although then we'll still be waiting on devstack to be ported to trunk
17:40:43 mtaylor: are there bugs logged (or issues on GH)?
17:40:44 however
17:41:14 we chatted with jesse about getting devstack sorted, and we'll have a stable/diablo branch of it for gating stable/diablo and then a trunk branch for trunk
17:41:21 mtaylor: where can I go to see the progress of devstack and what's coming up in the next few months?
17:41:22 jaypipes: no bugs yet, because we don't know what's wrong
17:42:01 jaypipes: right now nowhere that I know of. as soon as we turn on the trunk gating job we'll also be taking over the home of devstack and gating it as well
17:42:26 but for now, you just have to look at github.com/cloudbuilders/devstack
17:42:45 oh - I should mention...
17:43:04 the client libs decision yesterday is going to allow us to clean up a few install issues... so that will be nice :)
17:43:31 mtaylor: where are the Jenkins jobs that are attempting to build this multi-node setup with devstack? The only ones I see up there for the old puppet-based stuff are all disabled.
17:43:50 jaypipes: those are the same jobs. there are no puppet-based jobs
17:43:58 mtaylor: they are all disabled.
17:44:08 yes. because they don't do anything useful yet
17:44:16 they are still under development
17:44:25 mtaylor: how can we see the output of the jobs so that we can help diagnose the issues with them?
17:44:33 one sec... lemme give you a link
17:45:15 https://jenkins.openstack.org/view/All/job/dev-openstack-deploy-rax/50/
17:45:31 it has syslog output from all three nodes as well as the jenkins console output
17:46:01 mtaylor: ok, good.
17:46:03 mtaylor: ty
17:46:07 sure!
17:46:15 mtaylor: what is the timeframe for getting this working?
17:46:43 jeblair has a few additional things he's going to do there to get better logging, but we've been focused on getting the cloud-server gating job up and going because that's a short-term possible win
17:46:55 jeblair: see jaypipes' question above?
17:47:18 mtaylor: cloud-server gating job?
17:47:21 mtaylor: link?
17:47:40 jaypipes: also still under development :)
17:48:02 mtaylor: that's fine... but some of us are interested in helping :) link?
17:48:05 hi
17:48:09 jaypipes: and I believe jim is working on the glue needed to build that job
17:48:14 k
17:48:21 yay, it's jeblair
17:49:20 https://jenkins.openstack.org/job/dev-openstack-deploy-rax/
17:49:36 that's the job to deploy devstack on bare metal and run exercise.sh
17:49:40 it does not work
17:50:29 jeblair: and the cloud-server one?
17:50:56 not checked in yet; I got it working last night
17:51:16 it still needs some things:
17:51:17 jeblair: k
17:52:01 it needs to be pointed at the thing that's being gated, and it needs to use a stable/diablo branch of devstack to test stable/diablo changes to openstack
17:52:21 and it needs to pull logs back to jenkins
17:52:43 jesse says they'll be ready to move devstack to gerrit for gating early next week
17:53:18 so that's when i plan to set up trunk gating jobs for stable/diablo
17:53:49 excellent
17:54:14 we've got a pretty cool feature planned too:
17:54:56 if the devstack vm gating job fails, it will install the ssh key of the dev who proposed the change and turn the vm over to them so they can log in, resume screen sessions, and figure out what's wrong.
17:55:50 very cool.
17:56:20 hope so! :)
17:58:33 OK, so does anyone have any questions for mtaylor or jeblair on the deployment jobs that openstack-integration-tests will eventually be run against?
17:58:56 I want to know how to create packages
17:59:03 for the chef-based tests
17:59:50 uhm
17:59:53 zul: ping
18:00:13 yep?
18:00:38 mtaylor: pongish
18:00:43 nati2: so, we're still working on the mechanics of automated trunk packages (although we are currently producing them)
18:01:03 zul: nati2 was asking about making packages so that he can test chef-based deploys based on packages
18:01:11 mtaylor: ok
18:01:22 mtaylor: zul: would you share the current progress?
18:01:40 nati2: well, _currently_ we are publishing packages to a PPA on trunk commits
18:01:41 mtaylor: zul: And if you have any repos, would you please share them?
18:01:56 I believe we've decided to continue doing this
18:02:10 nati2: sure, the packaging branches can be found at lp:~openstack-ubuntu-packagers/xxx/ubuntu, where xxx is either nova/swift/glance/keystone/etc
18:02:10 but those are only built after commits are made to trunk
18:02:42 Do you have a stable/diablo version?
18:02:53 zul: Thanks
18:03:25 so if you want to test packages _before_ a commit lands (for gating) you'll need to make a tarball, then grab the packaging branch, then build packages
18:04:06 I got it. I should learn it.
18:04:23 yes. I believe that's a good idea
18:05:04 mtaylor: zul: Thanks
18:05:25 we might just want to have a script at the beginning of your tests which makes packages for the commit you are testing ... let's keep chatting about that
18:05:31 nati2: n
18:05:35 er...np
18:06:08 so basically the script branches the packaging code, then replaces the tarball
18:06:24 then runs the packaging command. right?
18:06:49 basically
18:07:05 #link http://wiki.openstack.org/Packaging/Ubuntu#Release_PPA
18:07:33 I'll put something on my list to see if we can have a template job in jenkins that will do this so that it's easy to add into a jenkins job
18:07:49 mtaylor++
18:09:17 k, are there any other topics?
18:09:30 I think we're good to wrap up :) 9 minutes over..
18:09:39 #endmeeting
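(A rough sketch of the pre-commit package build zul outlines near the end of the meeting: make a tarball, grab the packaging branch, build. The project name, paths, and the assumption that the packaging branch carries only a debian/ directory are illustrative; the authoritative steps are on the wiki page linked in the meeting.)

```sh
# Build a source tarball from the change under test (run inside the nova checkout).
python setup.py sdist

# Grab the Ubuntu packaging branch mentioned above (nova used as the example project).
bzr branch lp:~openstack-ubuntu-packagers/nova/ubuntu nova-packaging

# Unpack the tarball, lay the debian/ directory from the packaging branch
# on top of it, and build unsigned binary packages.
tar xzf dist/nova-*.tar.gz -C /tmp
cp -r nova-packaging/debian /tmp/nova-*/
cd /tmp/nova-*/ && dpkg-buildpackage -us -uc
```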