17:01:13 #startmeeting nova notification
17:01:14 Meeting started Tue Jun 6 17:01:13 2017 UTC and is due to finish in 60 minutes. The chair is gibi. Information about MeetBot at http://wiki.debian.org/MeetBot.
17:01:15 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
17:01:18 The meeting name has been set to 'nova_notification'
17:01:21 almost missed the start time :)
17:01:58 o/
17:03:36 soo I'm back
17:03:51 piles of mails to read
17:04:05 reviews to go through
17:04:36 actually we have a lot of transformation patches that look complete and ready
17:04:50 mriedem: what is the deadline for such commits to be merged?
17:05:04 FF
17:05:07 which is...
17:05:13 #link https://review.openstack.org/#/q/status:open+project:openstack/nova+branch:master+topic:bp/versioned-notification-transformation-pike+label:Code-Review%253E%253D%252B1+label:Verified%253E%253D1+AND+NOT+label:Code-Review%253C0
17:05:21 july 27
17:05:33 mriedem: thanks, that's still plenty of time
17:06:25 I will rebase the searchlight additions this week as they are now in conflict
17:06:45 I also saw your comment about binary=nova-api
17:06:50 so I opened a bug report
17:07:15 i've forgotten what that comment was - that nova-api isn't a binary?
17:07:19 it's nova-osapi_compute
17:07:20 right?
17:07:21 #link https://bugs.launchpad.net/nova/+bug/1696152
17:07:22 Launchpad bug 1696152 in OpenStack Compute (nova) "nova notifications use nova-api as binary name instead of nova-osapi_compute" [Undecided,New]
17:07:28 mriedem: right
17:08:01 seems easy to fix
17:08:12 so I marked it as low hanging fruit
17:08:13 but, is it worth fixing at this point
17:08:19 it would be a version bump wouldn't it?
17:08:59 good question.
I thought that this is a bugfix so no bump
17:09:08 it's a change in the payload
17:09:11 the value i mean
17:09:11 also this is part of the envelope not the payload
17:09:24 as binary ends up as publisher_id
17:09:26 hmm, well, it still seems like a change
17:09:36 i don't know if/how anyone is consuming this
17:09:47 aaand there is no version on the envelope as that is created by oslo
17:09:47 and if they use that binary to correlate anything
17:10:27 https://github.com/openstack/nova/blob/master/doc/notification_samples/aggregate-create-end.json#L18
17:10:38 so we have basically no number to bump
17:10:55 only the stuff under the payload key is versioned by nova
17:11:49 sure, but
17:12:01 any client side tooling that was expecting nova-api before and using that for something,
17:12:04 would now be broken
17:12:09 true
17:12:37 http://git.openstack.org/cgit/openstack/monasca-agent/tree/monasca_setup/detection/plugins/nova.py#n22
17:12:47 can we somehow signal deprecation? maybe with a release note?
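The envelope-vs-payload distinction discussed above can be sketched as follows. This is a minimal, illustrative layout (the field values are hypothetical, loosely modeled on the aggregate-create-end.json sample linked in the log), showing why there is no version number to bump for the `binary`: `publisher_id` lives in the envelope that oslo.messaging builds, and only the dict under the `payload` key is versioned by nova.

```python
# Hypothetical sketch of a versioned notification on the wire.
notification = {
    # --- envelope: created by oslo.messaging, carries no nova version ---
    "publisher_id": "nova-api:fake-mini",   # "<binary>:<host>"
    "event_type": "aggregate.create.end",
    "priority": "INFO",
    # --- payload: the only part versioned by nova ---
    "payload": {
        "nova_object.name": "AggregatePayload",
        "nova_object.namespace": "nova",
        "nova_object.version": "1.1",
        "nova_object.data": {"name": "my-aggregate", "hosts": []},
    },
}

# Renaming binary from "nova-api" to "nova-osapi_compute" would change
# publisher_id, yet no envelope version exists to signal that change.
binary = notification["publisher_id"].split(":")[0]
assert binary == "nova-api"
```

This is why the meeting concluded a silent rename could break consumers (like the monasca agent) that correlate on the `nova-api` name.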
17:12:59 i doubt people would notice
17:13:30 i think i would just mark the bug as won't fix and consider it an alias
17:13:42 we could put a note in the code if we cared enough,
17:13:53 and/or make the binary kwarg for the notifications an enum
17:14:02 the monasca agent talks about the name of the process which is nova-api in the ubuntu distro at least
17:14:04 so we don't end up with nova-api and nova-osapi_compute both happening
17:14:59 i left some comments in the bug report
17:15:09 but i'd just do an enum validation for now and stick with the existing binary
17:15:11 yeah, I think this nova-osapi_compute is an edge case
17:15:12 https://github.com/openstack/nova/blob/master/setup.cfg#L48
17:15:27 here ^^ we also create nova-api as a script
17:15:33 yes honestly nova-osapi_compute as the name has messed us up many times even within nova
17:15:36 checking for service versions
17:16:18 OK, let's keep nova-api, and add an enum to make clear this is intentional
17:18:48 besides this you mentioned that at the summit an issue was raised regarding the frequency of instance.exists
17:18:57 but I haven't seen a bug report yet
17:19:12 me neither
17:19:21 OK then I will wait patiently
17:19:21 :)
17:19:22 i think that was klindgren from godaddy that said it
17:19:27 we could ask them or harlowja
17:19:36 * gibi making notes
17:19:46 i remember the issue, let me check the code quickly
17:20:09 so it's sent from here right?
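The "enum validation for the binary kwarg" agreed on above could look roughly like this. The set of service names and the helper function are purely illustrative (not nova's actual code); the point is that emit-time validation keeps both `nova-api` and `nova-osapi_compute` from ever appearing on the wire at once.

```python
# Illustrative sketch of enum-style validation for the notification
# "binary" kwarg; the names and helper below are hypothetical.
VALID_BINARIES = frozenset(
    ["nova-api", "nova-compute", "nova-conductor", "nova-scheduler"])

def validate_binary(binary):
    """Return the binary unchanged, or raise if it is not a known name.

    Intentionally lists "nova-api" (not "nova-osapi_compute") so the
    existing wire name is kept and the choice is explicit in code.
    """
    if binary not in VALID_BINARIES:
        raise ValueError("unknown notification binary: %s" % binary)
    return binary

validate_binary("nova-api")          # accepted: the intentional alias
# validate_binary("nova-osapi_compute")  # would raise ValueError
```

Keeping the check as an allow-list means a future rename has to be a deliberate, reviewable change rather than an accidental drift.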
https://github.com/openstack/nova/blob/master/nova/compute/manager.py#L6157
17:20:45 and every time that periodic task runs (default once per minute), we get the instances for that host within some audit period window https://github.com/openstack/nova/blob/master/nova/compute/manager.py#L6130
17:20:56 yes, this is the periodic source of this notification, and there are some other places we send it
17:21:27 i think the complaint was that we get all of those instances for each periodic, so once per minute, rather than like the heal_instance_info task that pulls the instances, puts them in a queue and processes 1 instance per minute
17:21:43 https://github.com/openstack/nova/blob/master/nova/compute/manager.py#L5891
17:22:18 would need to investigate last_completed_audit_period
17:22:52 oh dtp might also be able to help here, he works on notifications stuff at godaddy too
17:22:59 the default of CONF.instance_usage_audit is a month
17:23:36 https://github.com/openstack/nova/blob/master/nova/utils.py#L325
17:24:14 ok, so i think what it's doing is getting all instances on that host within the last month
17:26:04 anyway, dtp is going to ask about it
17:26:08 i asked him in -nova
17:26:15 OK
17:26:37 It seems pretty complicated to me so I need to read the code a bit deeper
17:27:04 at least there seems to be a return statement here https://github.com/openstack/nova/blob/master/nova/compute/manager.py#L6126
17:27:08 could be a shortcut
17:27:55 i'm not totally sure how that works either
17:27:56 anyhow if dtp gives us some hints then I can try to reproduce the problem
17:29:49 I think that was all I have in mind for today
17:29:54 ok
17:30:08 do you have anything else to discuss?
17:31:09 nope
17:31:37 mriedem what's up
17:31:43 harlowja: see dtp in -nova
17:31:50 * mriedem needs to eat lunch
17:31:53 k
17:34:25 gibi: are you going to end the meeting
17:34:27 ?
17:34:37 ohh yeah :D
17:34:55 have a nice lunch :)
17:34:58 #endmeeting
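The audit-window behavior the meeting was trying to pin down can be sketched as follows. This is a simplified, illustrative version of the idea behind `nova.utils.last_completed_audit_period` for the default monthly period only (the real function also handles hour/day/year units and offsets): each run of the `instance.exists` periodic task ends up looking at instances on the host within the previous calendar month.

```python
import datetime

# Simplified sketch of computing the last completed monthly audit window;
# not nova's actual implementation, which supports more period units.
def last_completed_monthly_audit_period(now=None):
    if now is None:
        now = datetime.datetime.utcnow()
    # The window ends at the start of the current month...
    end = now.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
    # ...and begins at the start of the previous month.
    if end.month == 1:
        begin = end.replace(year=end.year - 1, month=12)
    else:
        begin = end.replace(month=end.month - 1)
    return begin, end

# For the date of this meeting:
begin, end = last_completed_monthly_audit_period(
    datetime.datetime(2017, 6, 6, 17, 1))
# → begin = 2017-05-01 00:00, end = 2017-06-01 00:00
```

Note the window is the same for every periodic run within a month, which matches the complaint that all instances in the window are re-fetched on each run (default once per minute) instead of being queued and drained one at a time like the heal_instance_info task.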