19:59:41 #startmeeting heat
19:59:42 Meeting started Wed Feb 6 19:59:41 2013 UTC. The chair is sdake. Information about MeetBot at http://wiki.debian.org/MeetBot.
19:59:43 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
19:59:45 The meeting name has been set to 'heat'
19:59:58 #topic rollcall
20:00:06 sdake here
20:00:10 o/
20:00:14 shardy here
20:00:17 hidey zaneb
20:00:18 o/
20:00:25 \O
20:00:44 jpeeler here
20:00:54 \<>
20:01:02 ascii art ftw
20:01:10 here
20:01:13 no asalkeld?
20:01:15 hi there
20:01:23 ok looks like we got enough to get started
20:01:38 here
20:01:42 #topic action review from last meeting
20:02:22 * sdake asalkeld, shardy, stevebaker, jpeeler to set implementation field in BPs
20:02:27 that looks done
20:02:39 * sdake sdake to set all VPC blueprints to high
20:02:41 that looks done
20:02:44 sdake: do you run the ttx.py script?
20:02:54 * sdake heat devs to sort out if updatestack can update rather then delete in coming week
20:02:58 i dont run that
20:03:06 have a link?
20:03:15 sdake: that's done for instance and autoscaling now
20:03:32 which was the objective for G IIRC
20:03:40 ok so done then?
20:03:43 yup
20:03:45 sdake: https://github.com/ttx/bp-issues <- his sanity check on LP fields
20:04:05 #info all actions from 1-23-2013 meeting completed
20:04:22 for those that weren't here last week, we cancelled the meeting since most were travelling
20:04:38 I'd like to go through the blueprints and bugs and make sure we are set for g3
20:04:42 #topic blueprint review
20:05:14 #link https://launchpad.net/heat/+milestone/grizzly-3
20:05:32 stevebaker still blocked on the vpc work?
20:06:04 I should be able to start soon, just waiting for some network ports to arrive in the mail
20:06:23 https://blueprints.launchpad.net/heat/+spec/raw-template-db
20:06:41 that might have to be bumped
20:06:56 might or should, lets make a decision here ;)
20:07:17 doesn't look needed
20:07:21 OK, bump it. Any spare cycles will be spent on the vpc blueprints
20:07:23 looks low priority to me
20:07:31 +1
20:07:54 #info bump https://blueprints.launchpad.net/heat/+spec/raw-template-db to H cycle
20:07:57 sdake: will we be doing a db migrations reset for G?
20:08:05 i think we should
20:08:11 +1
20:08:14 but should probably have a vote on it
20:08:23 ok, so that should probably be done at the same time
20:08:28 (or before)
20:08:31 what about current users?
20:08:43 they have to reinstall?
20:08:56 maybe talk to ppetit
20:09:14 no, just restart the numbering where it is
20:09:22 #action sdake to bring up thread on openstack-dev about dumping the db migrations
20:09:45 asalkeld: if nova can get away with it, we certainly can
20:10:17 #link https://blueprints.launchpad.net/heat/+spec/resource-properties-schema
20:10:38 lets have discussion about it on ml and see what pops up
20:10:56 zaneb your assigned to that last link
20:11:10 feb 21 is deadline for blueprints
20:11:11 bump
20:11:19 that's low priority
20:11:23 its a nice-too-have, but low priority
20:11:34 #info bump https://blueprints.launchpad.net/heat/+spec/resource-properties-schema to H cycle
20:11:36 at least until horizon ui picks up
20:11:58 stevebaker: +1
20:12:09 +1
20:12:23 yea
20:12:32 #link https://blueprints.launchpad.net/heat/+spec/aws-cloudformation-init
20:13:02 jpeeler?
20:13:14 thanks to pfreund, i think the only thing left is implementing configsets
20:13:27 so i'm working on that
20:13:38 ok, so looks goot for feb 21?
20:13:43 goot/good ;)
20:13:48 yep, question about cfntools though
20:13:49 gut
20:14:01 are we trying to maintain the copy in heat-jeos with the separate repo or what?
20:14:02 #info https://blueprints.launchpad.net/heat/+spec/aws-cloudformation-init on target for g3 deadline
20:14:19 (i can wait on that question)
20:14:21 just seperate cfn
20:14:33 jpeeler: I plan to rip cfntools out of heat-jeos and replace with a pip install
20:15:32 in the meantime, any changes will be manually synced between heat-jeos and heat-cfntools
20:15:34 we will try to get through as many bugs as possible - but may not be able to tackle them all in this meeting
20:15:39 #topic bug review
20:15:50 #link https://bugs.launchpad.net/bugs/1072906
20:15:52 Launchpad bug 1072906 in heat "Handle XML as well as JSON in ReST API" [Medium,Confirmed]
20:16:18 that's more of a bluprint
20:16:33 ok, so make blueprint and remove as bug?
20:16:34 if we used wsme it comes for free
20:16:48 (what is in ceilometer)
20:16:51 zaneb: have you looked at the controller code, I think it could be quite simple?
20:16:55 sdague: probably
20:17:00 sdake: probably
20:17:18 #action zaneb to move 1072906 to blueprint
20:17:19 I think selecting between the two is not that hard
20:17:36 we already have JSON and XML serializer/deserializer implementations IIRC, so we just have to fix the content type detection
20:17:36 so as a blueprint, that going to make 21?
20:17:36 actually getting the output in an acceptable XML format... possibly hard
20:17:48 sdake: no
20:17:50 assuming same wsgi middleware as the CFN api that is
20:18:05 ok - file as a blueprint and we can take up in h cycle and close the bug please ;)
20:18:13 ok
20:18:26 #link https://bugs.launchpad.net/bugs/1072905
20:18:30 Launchpad bug 1072905 in heat "make template functions "Fn::" pluggable." [Medium,In progress]
20:18:47 another blueprint ;)
20:19:13 tho not a big job
20:19:15 * stevebaker notes that its Mark Shuttleworth's fault that blueprints are a separate thing in launchpad
20:19:31 it is a confusing distinction
20:19:32 #action zaneb refile 1072905 as blueprint
20:19:45 so as a blueprint that going to make 21?
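[Editor's note: the discussion of bug 1072906 above mentions that JSON and XML serializers already exist and only the content-type detection needs fixing. A minimal, hypothetical sketch of that selection logic follows; the function names (`serialize_json`, `serialize_xml`, `negotiate`) are illustrative, not actual Heat API code.]

```python
# Hypothetical sketch: choose between existing JSON and XML serializers
# based on the request's Accept header. Illustrative only.
import json
from xml.etree import ElementTree as ET


def serialize_json(data):
    return json.dumps(data)


def serialize_xml(data, root="result"):
    # Flat dict -> one child element per key; real XML mapping of nested
    # API responses is the "possibly hard" part noted in the meeting.
    elem = ET.Element(root)
    for key, value in data.items():
        child = ET.SubElement(elem, key)
        child.text = str(value)
    return ET.tostring(elem, encoding="unicode")


def negotiate(accept_header, data):
    # Default to JSON; emit XML only on an explicit request.
    if "application/xml" in (accept_header or ""):
        return serialize_xml(data), "application/xml"
    return serialize_json(data), "application/json"
```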
20:20:34 let's wait and see?
20:20:54 would like blueprints to be stabilized so we know what work we have on our plates
20:21:02 Fn:: pluggableness is actually a pretty big job
20:21:05 if we aren't going to finish something for g3, no sense spending effort on it now
20:21:15 need to focus on g3 specific tasks
20:21:22 we could just prioritize?
20:21:22 as in, Havana cycle job
20:21:32 ok - well still please file blueprint and close bug ;)
20:21:34 and do what we can
20:21:39 #link https://bugs.launchpad.net/bugs/1072917
20:21:41 Launchpad bug 1072917 in heat "heat cli: Template Body maximum length problem." [Medium,Triaged]
20:21:58 I need to investigate this, it's broken with qpid but works with rabbit
20:22:12 probably something simple, but not sure yet
20:22:28 hmm, so not much chance of replicating it in a unit test
20:22:34 planning to look into it next week
20:22:41 #link https://bugs.launchpad.net/bugs/1072940
20:22:43 Launchpad bug 1072940 in heat "backtrace on console 3-5 minutes after HA test completes" [Medium,In progress]
20:22:59 I'm pretty certain this is fixed now
20:23:20 The backtrace was because the create greenthread didn't get deleted, which is now does
20:23:32 I just need to re-run the integration test to prove
20:23:33 ok - so state of bug is wrong?
20:23:45 I was just going to re-test then close it as invalid
20:23:47 #action shardy to rerun ha test for 1072940 and fix state
20:23:53 it was a valid bug tho ;)
20:23:58 so close as fix commited i think
20:24:03 Yeah, it's not notabug anymore
20:24:11 was valid previously :)
20:24:25 ok will do
20:24:31 #link https://bugs.launchpad.net/bugs/1072948
20:24:32 Launchpad bug 1072948 in heat "some resource create and delete operations could block in failure scenarios" [Medium,In progress]
20:24:43 so this is fixed, but shardy had some problems with the patch
20:24:48 what is your thinking here shardy
20:25:12 I think if nova is broken, heat is broken, introducing some hard-coded timeout logic is just wrong IMO
20:25:23 +1
20:25:28 I think there are enough safeguards on create already
20:25:28 delete bug
20:25:39 some on delete would be good though
20:25:47 we have a stack create timeout
20:26:01 because state may go from ERROR -> ERROR and we have no way of detecting the transition
20:26:21 don't we have a timeout on delete?
20:26:38 asalkeld: should use the same one as for create
20:26:48 i.e. timeout for whole stack
20:26:57 yip
20:26:58 but has same value as create timeout
20:27:00 however if nova breaks, there is no way for the heat user to identify nova is broken, and they may then think heat is broken
20:27:05 so, pretty long wait
20:27:36 sdake: I think you need timeout on nova only if instance starts in ERROR state
20:27:38 sdake, you can adjust the timeout
20:27:49 wedging heat entirely because nova misbehaves at some point seems wrong
20:28:05 but if nova (or any other core service we rely on) is broken, everything is broken
20:28:08 sdake, it won't wedge heat
20:28:19 Is this a real problem, ie do we have any sort of reproducer?
20:28:23 it will timeout and fail
20:28:48 i have seen it happen but no reproducer
20:28:56 but didn't wait for a timeout from heat proper
20:29:09 ok well we can close bug then if everyone feels current functionality is ok
20:29:11 The real problem is the while True loops in the instance resource
20:29:13 ya, the timeout is big
20:29:31 we should just make that a specific number of retries
20:29:37 #action sdake 1072948 should be closed
20:29:47 we need parallel resource startup!
20:29:54 #action shardy to file bug with what he thinks are necessary steps to deal with problems in 1072948
20:30:01 ;)
20:30:06 shardy: +1
20:30:16 sdake: Ok, will do
20:30:21 ok #link https://bugs.launchpad.net/bugs/1072952
20:30:23 Launchpad bug 1072952 in heat "Implement Rollback feature of AWS API" [Medium,Triaged]
20:30:49 blueprint?
20:30:52 I'm planning to do that next week
20:31:02 can convert to blueprint if you prefer
20:31:08 looks like one to me
20:31:13 ok, will do
20:31:18 #action shardy to convert 1072952 to blueprint
20:31:36 #link https://bugs.launchpad.net/bugs/1072955
20:31:37 Launchpad bug 1072955 in heat "Implement Fn::Base64" [Low,Triaged]
20:31:44 I dont think this is needed
20:31:59 blueprint?
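[Editor's note: the suggestion above for bug 1072948 is to replace the unbounded "while True" polling loops in the instance resource with a specific number of retries. A minimal sketch of that idea follows; it is illustrative only, not the actual Heat patch, and `poll_until_active`/`check_state` are hypothetical names.]

```python
# Illustrative sketch: bound the state-polling loop so a wedged backend
# (e.g. nova) makes the resource fail instead of blocking forever.
def poll_until_active(check_state, max_retries=3):
    """check_state() returns the backend's current state string."""
    for _attempt in range(max_retries):
        state = check_state()
        if state == "ACTIVE":
            return True
        if state == "ERROR":
            # Fail fast: an ERROR -> ERROR "transition" is undetectable,
            # as noted in the discussion, so don't keep spinning.
            return False
    # Retries exhausted: treat as failure rather than wedging the stack.
    return False
```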
20:32:04 noop
20:32:06 yeah, extremely low priority
20:32:08 i think it is a noop
20:32:14 currently, yeah
20:32:19 I did implemenet this once, and it broke heat
20:32:33 because it passed base64 to cloudinit which didn't expect it
20:32:49 could set the mime-type to cloudinit to base64
20:33:01 er, mime-encoding
20:33:05 #action sdake to close 1072955
20:33:26 #link https://bugs.launchpad.net/bugs/1087530
20:33:27 Launchpad bug 1087530 in heat "sendfile possibly problematic with eventlet" [Medium,In progress]
20:33:34 sdake: don't close it, people might want to use Base64 elsewhere in their template
20:33:47 https://review.openstack.org/#/c/21184/
20:34:13 I realise now it is client side code only, but no harm in removing it
20:34:27 zaneb do you plan to fix this before 21st then?
20:34:35 or bump to h
20:34:38 #undo
20:34:38 Removing item from minutes:
20:34:39 #undo
20:34:40 Removing item from minutes:
20:34:44 sdake: no, just leave it open and bump imo
20:34:57 #action zaneb to turn 1072955 into blueprint and bump to h
20:35:12 #link https://bugs.launchpad.net/bugs/1087530
20:35:13 Launchpad bug 1087530 in heat "sendfile possibly problematic with eventlet" [Medium,In progress]
20:35:41 ok that review looks good stevebaker, i'll approve after meeting
20:36:08 ok the rest are unassigned
20:36:28 lets go through those and see if any interest folks ;)
20:36:45 https://bugs.launchpad.net/heat/+bug/1096013 is what I'll be working on next
20:36:45 also, zane is a bit busy with another task unrelated to heat for 2-4 weeks so wont be able to help with our final g push
20:36:46 Launchpad bug 1096013 in heat "Instance resource doesn't allow IP assignment to VPC/quantum network" [High,Triaged]
20:37:16 #link https://bugs.launchpad.net/bugs/1072935
20:37:18 Launchpad bug 1072935 in heat "interrupting a nosetest results in backtrace on future creations of stacks" [Low,Triaged]
20:37:45 developer focused bug, can probably bump to h
20:37:53 sdake: can we add some more milestones? current buckets are only "grizzly-3" or "everything else"
20:38:07 h
20:38:16 ya i'll see if i can yet
20:38:36 #action sdake to bump 1072935 to h
20:38:53 #link https://bugs.launchpad.net/bugs/1072937
20:38:54 Launchpad bug 1072937 in heat "odd error from heat list after a delete" [Low,Triaged]
20:39:16 this is super hard to reproduce
20:39:22 and may not be present any longer
20:39:35 that should be heat-cfn list, not heat list
20:39:42 old bug
20:39:47 from v4 days i think ;)
20:40:07 should still be around, hard to reproduce though
20:40:21 just delete it and if it happens again re-create?
20:40:29 fix is there in the comments
20:40:29 prefer to keep a record
20:40:31 i say delete
20:40:34 but doens't mean we have to fix for h
20:41:09 well bump it then
20:41:19 why delete when we have collected all that info about how to fix it in the bug
20:41:46 I mean close, it is still searchable
20:42:04 it's still a bug too, so why close
20:42:09 yar, you can't actually delete
20:42:21 #link https://bugs.launchpad.net/bugs/1072958
20:42:22 Launchpad bug 1072958 in heat "Create a getting started guide for scaling out" [Medium,Triaged]
20:42:34 this seems like h material
20:43:20 #link https://bugs.launchpad.net/bugs/1072957
20:43:23 Launchpad bug 1072957 in heat "Add role to users, similar to nova's user role assignment" [Medium,Triaged]
20:43:48 #action shardy to speak with keystone folks about this
20:44:08 I think this is an internal user->role mapping, not a keystone role?
20:44:22 wasn't this the bug we were speaking about earlier in the week shardy?
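[Editor's note: the Fn::Base64 discussion earlier (bug 1072955) mentions that a naive implementation broke heat by handing raw base64 to cloud-init, and floats setting a base64 MIME transfer encoding instead. A hedged sketch of that idea, using only stdlib MIME tools, follows; `wrap_base64_userdata` is a hypothetical name and this is not what heat actually did.]

```python
# Illustrative sketch: wrap base64-encoded user-data in a MIME part
# whose Content-Transfer-Encoding says "base64", so the consumer knows
# to decode it rather than treating it as a literal script.
import base64
from email.mime.text import MIMEText


def wrap_base64_userdata(script):
    encoded = base64.b64encode(script.encode()).decode()
    part = MIMEText(encoded, "x-shellscript")
    # Replace the default transfer encoding so the payload is marked
    # as base64 for whatever processes this part.
    del part["Content-Transfer-Encoding"]
    part["Content-Transfer-Encoding"] = "base64"
    return part
```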
20:44:28 #undo
20:44:28 not high priority IMO, no users have asked for this
20:44:29 Removing item from minutes:
20:44:51 no, the one we discussed is the create stack as non-admin when it creates User/AccessKey resources
20:44:58 right
20:45:11 #link https://bugs.launchpad.net/bugs/1096001
20:45:12 Launchpad bug 1096001 in heat "Parser Fn::GetAZs intrinsic function returns hard-coded value" [Medium,Triaged]
20:45:18 I contacted ayoung about it today, he thinks trusts should help us address it for h
20:45:25 nice
20:46:40 Our AZ handling is all a bit broken I think (we don't pass AZ's to nova), so I say bump to H and have an multi AZ test/fix then
20:47:41 #link https://bugs.launchpad.net/bugs/1096017
20:47:42 Launchpad bug 1096017 in heat "AutoScalingGroup missing VPCZoneIdentifier property" [Medium,Triaged]
20:48:07 Unless the VPC features are landing for G, we can bump this
20:48:27 i am hopeful the vpc features land for g
20:48:43 ok well thats all the high/med bugs
20:48:57 there are still a few unassigned - feel free to take those up if your bored ;)
20:49:14 #topic integrated status
20:49:34 So basically we need to present our case as to why we are integrated ready
20:49:39 (which is like core)
20:49:57 serious?
20:50:07 i'd suggest everyone send me a couple paragraphs and i'll sort it out into one email
20:50:10 ya serious
20:50:24 ok
20:50:25 I thought it was a review not marketing exercise
20:50:32 weird
20:50:51 well i could be wrong on what i read in the meeting but thats what it looked like to me
20:50:55 I assumed we'd be incubated for ~6months then reviewed
20:51:00 so there is "core" and "integrated" now?
20:51:14 ya, i think they passed new rules just yesterday on this point
20:51:34 stevebaker: core is a subset of integrated
20:52:01 seems sensible, depending on the specifics of what each means
20:52:05 #topic open items
20:52:19 stevebaker: the only difference is trademarks, effectively
20:52:35 can we talk heat-horizon?
20:52:36 ok guys 3 weeks left
20:52:47 * stevebaker invokes mordred
20:52:54 please wrap up blueprints and bugs
20:53:09 ya what do you want to discuss about heat-horizon
20:53:22 Monty mentioned that HP need a working horizon UI to heat, and have a developer ready to pick up some heat-horizon work
20:54:23 cool
20:54:31 I'm thinking we should at least get heat-horizon into gerrit and launchpad
20:54:54 as a plugin you mean?
20:54:57 sure, stackforge?
20:55:24 it could stay in github/heat for now?
20:55:50 sure
20:55:54 github/heat is easy to work with
20:56:04 there is alot of overhead to gerrit integration
20:56:05 sdake: as a horizon plugin. Once we're "integrated" we could propose that it goes into horizon
20:56:08 but we can do it
20:56:20 ya just thinking about short circuiting the need for a plugin :)
20:56:28 ie propose directly in a few weeks
20:56:38 vs setup a bunch of infrastructure for 3-4 weeks of time
20:56:45 +1
20:57:04 true. I guess there is a risk that horizon devs will want it to stay as a plugin
20:57:07 tho I think you only get integrated after 8 months?
20:57:35 but the core of the issue is the hp dev needs a way to develop on the codebase
20:57:40 and that is easy to solve
20:57:48 may make more sense as a plugin
20:57:54 that is for horizon devs to decide imo ;)
20:58:15 https://github.com/heat-api/heat-horizon fyi
20:58:28 yup
20:58:32 1 min...
20:58:40 thats all from me
20:59:18 ok thanks guys
20:59:26 #endmeeting