17:00:07 #startmeeting qa
17:00:08 Meeting started Thu Aug 15 17:00:07 2013 UTC and is due to finish in 60 minutes. The chair is mtreinish. Information about MeetBot at http://wiki.debian.org/MeetBot.
17:00:09 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
17:00:11 The meeting name has been set to 'qa'
17:00:19 who do we have here?
17:00:24 Hi
17:00:26 hi
17:00:43 hi
17:00:46 today's agenda:
17:00:49 #link https://wiki.openstack.org/wiki/Meetings/QATeamMeeting
17:01:12 it's pretty short today, just the usual things; no one added anything extra
17:01:22 so I guess let's dive into it
17:01:27 #topic testr status
17:01:41 so my topic is up first
17:02:02 earlier this week I got the fix for the last big blocking race condition merged
17:02:09 and the runs seem fairly stable
17:02:22 so I added a parallel run nonvoting to the gate queue on zuul
17:02:36 I'm going to watch it for a while to see how stable it seems
17:02:46 cool
17:02:51 and hopefully make it the default sometime in the next week or 2
17:03:04 right now we're tracking down some other races that have been popping up
17:03:21 and I hope to get tenant isolation on swift and keystone testing before we green light it
17:03:31 but we're really close here
17:03:33 is there a plan to delete all the nose things in the code?
17:04:07 mkoderer: yeah at some point, just right now I've been too distracted with trying to get it gating parallel to push the patch
17:04:30 mtreinish: ok cool
17:04:32 mkoderer: feel free to push it yourself if you'd like
17:04:51 mtreinish: ok, shouldn't be hard
17:05:17 ok, does anyone have anything else on testr?
17:05:59 mtreinish, I saw you got the tenant_isolation for swift
17:06:20 do you still need to get the swift tempest tests locked up ?
17:06:20 adalbas: yeah I pushed it out, but it's probably going to need a devstack change to get it working
17:06:44 adalbas: probably not if it was a user conflict that was causing the fail
17:06:55 ok
17:07:08 adalbas: we can pick this up on the qa channel after the meeting though
17:07:13 sure
17:07:20 ok, then moving on to the next topic
17:07:26 #topic stress tests status
17:07:32 mkoderer: you're up
17:07:49 so I want to introduce this decorator "stresstest"
17:08:01 it will automatically set attr type=stress
17:08:24 with this I could use subunit to discover all stress tests inside the tempest tests
17:08:36 that's my plan and I am working on that
17:09:13 if we have this we could go through the tests and search for good candidates
17:09:32 ok, cool so things are in motion with that then
17:09:45 yes
17:09:56 about this https://review.openstack.org/#/c/38980/
17:09:56 I imagine a lot of tests won't be good candidates for stress tests
17:10:02 like the negative tests :)
17:10:18 afazekas: ^^^ are you around?
17:10:21 mtreinish: that's right.. but I am quite sure we will have some good ones
17:10:58 could be that https://review.openstack.org/#/c/38980/ is not needed after my fix
17:11:13 but anyway we could use it in the meanwhile
17:11:26 mkoderer: ok, yeah might be, you should coordinate with afazekas about that
17:11:33 but he doesn't seem to be around right now
17:11:37 I think I will chat with afazekas when he's around
17:11:45 np
17:12:05 any other questions?
17:12:09 mkoderer: ok all sounds good to me. Anything else on the stress test front?
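A minimal sketch, assuming a nose-style attribute tag, of what the "stresstest" decorator described above could look like; the attribute name, sample class, and discovery filter are illustrative assumptions, not the actual Tempest code:

    def stresstest(func):
        """Tag the decorated test method as a stress-test candidate."""
        # Set attr type=stress in the style nose's attrib plugin expects,
        # so a later discovery pass (e.g. over a test listing or a subunit
        # stream) can filter stress candidates out of the normal suite.
        func.type = 'stress'
        return func


    class SampleServerTest(object):  # stand-in for a tempest test class
        @stresstest
        def test_create_delete_server(self):
            pass  # placeholder body for the sketch


    # Rough idea of how discovery could filter on the tag:
    stress_tests = [name for name in dir(SampleServerTest)
                    if getattr(getattr(SampleServerTest, name), 'type', None) == 'stress']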
17:12:16 mkoderer: nothing from me :)
17:12:39 ok cool :)
17:13:02 re
17:13:02 ok then we'll move on to the next topic
17:13:10 mtreinish: I want to report status on https://blueprints.launchpad.net/tempest/+spec/fix-gate-tempest-devstack-vm-quantum-full
17:13:20 #topic other blueprint status
17:13:31 mlavalle: ok you're up
17:13:55 mtreinish: we have achieved good progress on this. Several of the items in the BP are already fixed
17:14:16 mtreinish: we have a shot at getting this done by H-3
17:14:37 mlavalle: ok cool, that would be great to get the quantum full jobs passing
17:14:45 mtreinish: I am working now on a nova patch that is required by one of the tests I am fixing
17:14:53 mlavalle: do you know about a bug related to the failing fixed ip or interface tests ?
17:15:23 mtreinish: any help I can get with this would be great: https://review.openstack.org/#/c/41329
17:15:35 afazekas: no I don't
17:15:59 mlavalle: sure I'll take a look
17:16:09 that's it
17:16:09 and point some nova cores at it too
17:16:55 mlavalle: ok cool, one quick thing about neutron is last week we had to skip another neutron test
17:17:03 https://bugs.launchpad.net/neutron/+bug/1189671 another interesting related bug
17:17:06 Launchpad bug 1189671 in neutron "default quota driver not suitable for production" [Wishlist,In progress]
17:17:25 because a broken change got merged while the gate was messed up and passing all the tests
17:17:26 mtreinish: ok, I'll take a look
17:17:52 afazekas: I'll take a look
17:18:23 unfortunately when that test case was skipped several additional issues were introduced..
17:18:31 mlavalle: I can dig up a link but it's one of the bugs marked as critical
17:18:45 marun was working on it yesterday I think
17:18:54 though the known ones are fixed, something is still not ok with the ssh connectivity
17:19:11 afazekas: I am looking at it
17:19:11 I'll ping marun
17:19:24 It's not ssh that's the problem - it's the metadata service.
17:19:33 marun: can you reproduce it ?
17:19:40 afazekas: trivially
17:19:46 but I don't know why it's happening.
17:19:56 working on it now
17:20:18 marun: cool
17:20:20 marun: I'll let you run with it ...
17:20:59 mlavalle: ok, is there anything else on neutron status?
17:21:07 I'm done
17:21:37 ok then are there any other blueprints that need to be discussed
17:21:59 leaking
17:22:14 afazekas: ok what's going on with that?
17:22:34 looks like the original approach was not liked, so I will introduce a different one
17:23:17 which will be designed to clean up at run time, and also report the issues
17:23:45 but it will not cover the leakage that is not visible via the api
17:24:01 and it will rely on the tenant isolation
17:24:22 afazekas: will it ensure that the isolated tenants will be cleaned up too?
17:24:50 just the resources in the tenant
17:25:10 afazekas: ok
17:26:01 is there anything else about resource leakage detection?
17:27:01 I can restore the previous patch if anybody is interested in the global leakage
17:27:22 afazekas: do you have a link?
17:28:49 https://review.openstack.org/#/c/35516/
17:29:08 afazekas: ok, I'll take a look at it later and let you know
17:29:33 thx
17:29:55 ok then, moving on:
17:30:00 #topic critical reviews
17:30:13 does anyone have any reviews that they'd like to bring up?
17:30:51 I have one concern
17:30:55 on the review process
17:31:02 please bear with me
17:31:16 I want to put up my point
17:31:25 I want to raise one concern.
17:31:25 Test development and contribution seems to be a real pain.
17:31:25 It takes 10 patches to get through even for people contributing to Tempest for more than one year.
17:31:25 We need to have some policy on reviews.
17:31:27 It appears many times that some late entrants offer new comments/suggestions when it seems like the code review is done, adding one more review cycle.
17:31:30 we need to refine the process .. otherwise it slows down the test development cycle and makes it difficult to maintain contributors' motivation
17:32:32 Ravikumar_hp: I agree, do you have a recommendation how to do it?
17:32:33 Ravikumar_hp: I understand that point
17:32:41 but it's hard to solve that
17:32:47 Ravikumar_hp: I understand what you're saying but the review process is needed
17:32:53 we need to ensure code quality
17:32:59 and review resources are limited
17:33:11 as a group, we need to fix this
17:33:12 so sometimes it takes time
17:33:21 nowadays, all the reviews
17:33:33 I can agree for framework design
17:33:52 if test development takes 10 reviews, then something is wrong
17:33:58 It is not working well
17:34:06 mtreinish: the small nits can be fixed later
17:34:12 disagree
17:34:20 You fix it before merge or it doesn't get fixed
17:34:33 marun: +1 that has been my experience as well
17:34:42 we cannot move forward if one test contribution takes one month / 10 patch reviews
17:34:47 My suggestion is to have a guide for what needs to be done, so at least the criteria is clear.
17:34:55 If the submitter does not follow the guide, then it takes 10 reviews.
17:34:56 this is over-engineering test code
17:34:57 the question is why does it take 10 patchsets
17:35:05 If they do follow the guide, it gets in faster.
17:35:15 in your own company, does QA have 10 or more reviews?
17:35:24 if it's just because of nits.. then it's too much
17:35:26 marun: IMHO it's not the 10 or 100 reviews that is the issue, it's the 1 month
17:35:27 many times, reviewers change from patch to patch
17:35:33 In your own company, do you have clear criteria for what is good?
17:35:42 marun: waiting on review reply
17:35:42 we need to be agile in our test development process... I suggest we have 2 to 3 reviews
17:35:55 That's a different issue, though.
17:36:02 if two reviewers take care of one code submission, we can finish in max 3 patches
17:36:10 afazekas: timely review vs quality of review
17:36:19 yes - we trust our QA
17:36:31 Well, we need that same criteria in tempest.
17:36:41 just 20 lines of code takes one month - ten patches
17:36:42 We need it in writing, so it isn't just in some people's heads.
17:36:58 marun: yes
17:37:01 so let's do 2 things to improve this process: (a) define what a code review consists of (a checklist) (b) reduce the reviewer to 2 to 3
17:37:02 Ravikumar_hp: So think of a way to fix the problem that does not result in lower review quality.
17:37:15 I don't think that limiting the number of reviewer will be a good solution
17:37:27 ^reviewers
17:37:28 it is
17:37:30 IMHO every typical review issue should be in the HACKING.rst
17:37:38 +1000
17:37:46 marun: my suggestion - only TWO reviewers per one submission
17:37:47 And over time that list should evolve
17:37:48 afazekas: yes that should be the case
17:37:52 does anyone know how many reviews dev code goes through before merge?
17:37:58 but I think we might have some gaps, I'm not sure
17:38:01 Ravikumar_hp: only two core reviewers?
17:38:02 Ravikumar_hp: I don't think that's workable.
17:38:02 some additional style issues can be tested by flake8
17:38:06 Two core people, fine.
17:38:18 +1 Ravikumar_hp
17:38:20 But there are often stakeholders outside of those two
17:38:29 Look, tempest is not unique
17:38:32 -1
17:38:36 There are tons of core projects that have the same challenges.
17:38:38 patelna, Ravikumar_hp: limiting the number of reviews is not the solution
17:38:40 we have fewer QA contributors
17:38:44 Thinking that we are special and need to do something different? Just silly.
17:39:22 mtreinish: we want good quality, but we need to refine the process so as to minimize patches/cycles and duration
17:39:33 How about maximizing patch quality?
17:39:36 do you want to add more coverage, or do fewer tests with more reviews -- and then turn people away because they've been frustrated
17:39:49 Tough
17:40:00 If we don't screen for quality, things fall apart.
17:40:09 ok I think that this topic has been played out enough. I think we should move on
17:40:09 we really need to draft a guideline/checklist for reviewers
17:40:11 So let's focus on improving patch quality
17:40:13 NOT
17:40:18 reducing review quantity
17:40:19 does anyone have any reviews they want to bring up
17:40:32 no one is saying don't screen for quality -- you are missing the point
17:40:35 I think if someone feels frustrated the best way is to have direct communication via IRC...
17:40:48 https://review.openstack.org/#/c/35165/
17:40:49 mtreinish: Thanks
17:41:39 looks like I got an opposite review response at the end, can I return to something closer to the original version ?
17:41:40 afazekas: that's marked as abandoned
17:42:27 are there any other reviews? Otherwise we'll move on
17:42:43 mtreinish: restored
17:43:05 afazekas: ok cool
17:43:07 ;)
17:44:54 I will move back to the original new module / function style unless otherwise requested
17:44:57 afazekas: I'll have to take a look in detail after the meeting
17:45:14 ok if there aren't any other reviews that people want to bring attention to then let's move on to the open discussion
17:45:21 #topic open discussion
17:45:27 I have a testr question
17:45:31 marun: ok
17:45:44 As per discussion yesterday, it appears tempest is broken on py26
17:46:01 The fix would be moving away from setupClass?
17:46:20 Is there a plan/effort underway to accomplish that?
17:46:26 marun: IMHO the fixing patch is merged
17:46:35 since yesterday?
17:46:46 marun: not currently, we use setupClass fairly extensively throughout tempest
17:46:59 reworking things to avoid using it would be a huge undertaking
17:47:14 marun: the patch was older, but it contained the py 2.6 compatibility step
17:47:35 afazekas: are you running successfully on 2.6 then?
17:47:46 This is really important for Red Hat.
17:48:15 We need to run Tempest on RHEL 6.4 with py26.
17:48:36 If anyone is running on 2.6 then I'd be happy to talk offline about what I might be doing wrong.
17:49:04 But if not, then Red Hat will likely want to see a move away from setupClass and will devote resources to making it happen.
17:49:42 marun: I know there have been troubles with py26 lately, especially after we've been moving to testr
17:49:59 but I don't have py26 on any of my systems so I haven't been able to test things
17:50:03 marun: are you using nosetests or testr ?
17:50:33 afazekas: testr + py26 is broken because it doesn't seem to invoke setupClass
17:50:50 afazekas: nose appears to work, but has to be manually invoked
17:50:52 marun: but if you switch back to nose would it work?
17:50:57 marun: you can run it with nose on py 2.6
17:51:25 So manually run?
17:51:26 marun: ok, then would adding a nondefault job in tox that runs the tests with nose solve this
17:51:35 mtreinish: +1
17:51:49 It's not so much for me, but allowing non-developers to be able to run tempest trivially.
17:52:00 marun: ok that's simple enough
17:52:03 Ok, cool.
17:52:29 #action mtreinish to add options to use nose instead of testr for py26 compat
17:52:38 mkoderer: so much for pulling out all the nose references then
17:52:48 :)
17:52:49 mtreinish: yeah
17:53:12 rhel'd again ;)
17:53:31 ok, are there any other topics to discuss in the last 7 min?
17:53:48 do we really want both testr and nose to be invoked, guessing they do not cover exactly the same stuff
17:54:14 malini1: I think the alternative is removing the use of setupClass, which is desirable but costly.
17:54:39 malini1: it would be either/or. Testr is still the default but if you run with py26 you'll have to use nose
17:55:00 got it -- thanks
17:55:07 is it a know bug in testr?
17:55:13 ^known
17:55:14 mkoderer: Error: "known" is not a valid command.
17:55:36 mkoderer: yes, I talked with lifeless about it yesterday.
17:56:11 ok - so if this gets fixed we could switch to testr
17:56:28 marun: the numbered tests also should be fixed before they start working
17:57:18 mkoderer: if setupClass was not used, then testr + py26 would play nicely
17:57:31 afazekas: Ah, yes.
17:57:31 marun: it is required to run as a stress test
17:57:51 afazekas: Why required for a stress test?
17:58:04 afazekas: I thought the way to handle order was simply to inline the functionality?
17:58:27 because you specify exactly one test case, i.e. test method
17:58:51 so we've got ~1 min left. afazekas, marun: do you want to pick this up on qa?
17:59:00 mtreinish: sounds good
17:59:04 ok
17:59:22 ok I guess that's as good a point as any to stop
17:59:38 #endmeeting
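A rough sketch of what the non-default tox environment from the #action above could look like, assuming it lands as an extra tox env; the environment name, requirements file names, and options are illustrative assumptions, not the merged change:

    [testenv:py26-nose]
    # Hypothetical fallback environment for python 2.6, where testr does not
    # invoke setUpClass; runs the suite with nose instead of testr.
    deps = nose
           -r{toxinidir}/requirements.txt
    commands = nosetests {posargs} tempest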