13:02:40 <chaconpiza> #startmeeting monasca
13:02:40 <openstack> Meeting started Tue May  5 13:02:40 2020 UTC and is due to finish in 60 minutes.  The chair is chaconpiza. Information about MeetBot at http://wiki.debian.org/MeetBot.
13:02:41 <openstack> Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
13:02:43 <witek> :)
13:02:44 <openstack> The meeting name has been set to 'monasca'
13:02:50 <chaconpiza> Thanks for the reminder
13:02:59 <chaconpiza> let's start!
13:03:13 <chaconpiza> #topic Summary: Investigation: Alarm Definition update
13:03:46 <bandorf> 1.1-1.3: Just a summary, as already reported last week. Any questions on this?
13:04:27 <witek> more or less clear for me so far
13:04:39 <bandorf> If not, let me continue with 1.4/1.5: Update of det. alarm defs.
13:04:53 <bandorf> We continued investigating the topic I mentioned last week.
13:05:20 <bandorf> Results: Alarms get more or less "destroyed" if the alarm definition is updated.
13:05:39 <bandorf> This happens with any update, e.g. an update of the operator.
13:05:54 <bandorf> It's a different bug than the one we know already.
13:06:27 <bandorf> In database, there's a table sub_alarm. In this table, expression is stored as well.
13:06:47 <bandorf> After an update of the alarm definition, deterministic is no longer part of this expression.
13:07:21 <bandorf> Thus, a deterministic alarm can reach status "UNDETERMINED"
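A quick way to see this in the database (a sketch only; the sub_alarm table and its expression column are the ones mentioned above, while the database name `mon` is an assumption about the DevStack setup):
  mysql -u root -p mon -e "SELECT expression FROM sub_alarm;"
After the alarm definition update, the stored expressions no longer contain the deterministic keyword.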
13:08:02 <bandorf> Any questions related to this?
13:08:11 <chaconpiza> I would like to add that we have the list of commands to reproduce it in DevStack.
13:09:01 <witek> sounds plausible that this is another bug
13:09:08 <bandorf> Yes
13:09:13 <witek> do you think the bug is only in API?
13:10:18 <chaconpiza> I am not sure yet
13:10:38 <bandorf> I assume that it's a bug in monasca-thresh, but I need to investigate further. However, this will take some time.
13:11:06 <bandorf> I'm currently working on a different, monasca-thresh-related topic (metrics too old in log)
13:11:18 <chaconpiza> we can create a Story with tag `bug` and assign it for now to the monasca-api project
13:11:26 <witek> and what's the priority? are you impacted more by `update function, period` or deterministic alarms?
13:11:42 <witek> chaconpiza: +1
13:12:08 <bandorf> In our customer environment, they're not using det. alarms.
13:12:31 <bandorf> However, they can live with the work-around for update function, period.
13:13:04 <bandorf> Thus, from our side, "metrics too old" has the highest priority
13:13:53 <witek> do we have a story for `update function, period` bug?
13:14:55 <bandorf> No, not yet, I wanted to finish the investigation first - done now.  If everybody agrees with the proposal in 1.6.2, I can create such a story
13:15:14 <chaconpiza> +1
13:15:28 <witek> sounds good to me, that would be validation in API, right?
13:16:12 <bandorf> Yes, this will impact the API only
13:16:54 <bandorf> So, I will create 2 stories, ok.
13:17:06 <witek> +1
13:17:10 <bandorf> That's it from my side
13:17:18 <chaconpiza> alright, then let's move to the next topic
13:17:35 <chaconpiza> thanks Matthias!
13:17:40 <chaconpiza> #topic Ussuri release
13:17:40 <witek> thanks
13:18:05 <witek> we're approaching the release deadlines
13:18:19 <witek> on Thu is the final RC deadline
13:18:47 <witek> I've created new RC for monasca-persister already
13:19:01 <witek> including the simplejson requirement
13:19:32 <witek> do we know of any other bugs on stable/ussuri which have to be fixed?
13:20:05 <chaconpiza> I don't recall any
13:20:39 <witek> I also think other components are good
13:21:34 <witek> the official final release will be announced next week
13:22:02 <chaconpiza> is it announced in the mailing list?
13:22:58 <adriancz> I have a small fix
13:22:59 <adriancz> https://review.opendev.org/#/c/724155/
13:23:15 <witek> yes, it will be; for now they're sending release countdown emails
13:23:16 <adriancz> would be nice to have this in ussuri
13:23:55 <witek> http://lists.openstack.org/pipermail/openstack-discuss/2020-May/014577.html
13:24:17 <chaconpiza> thanks
13:25:15 <chaconpiza> I added myself as a reviewer for Adrian's fix
13:25:40 <adriancz> thanks
13:25:45 <chaconpiza> Ok, let's go to the next topic
13:25:56 <chaconpiza> #topic Bump librdkafka to 1.4.0
13:25:56 <witek> adriancz: I think it's not release critical, so we can review and merge on master and then cherry-pick to stable/ussuri after the official release
13:26:17 <chaconpiza> We are now using confluent-kafka-python v1.4.1
13:26:36 <adriancz> yes we can merge this later
13:26:48 <chaconpiza> for Docker we need to compile the source code
13:27:22 <chaconpiza> I started an Alpine container to do this fix https://review.opendev.org/#/c/725258/
13:28:01 <chaconpiza> because it breaks the Zuul check build-monasca-docker-image
13:29:06 <chaconpiza> Then we found a new broken Zuul check: monasca-tempest-python3-influxdb
13:29:22 <witek> looks good to me, the failing tempest tests are not related
13:29:45 <chaconpiza> I will paste what I wrote one hour ago, for the record.
13:29:52 <chaconpiza> We have a small issue in monasca-api getting data from influxdb.
13:29:58 <chaconpiza> Just after stacking:  vagrant@devstack:~$ monasca metric-list --name zookeeper.out_bytes
13:30:09 <chaconpiza> The repository was unable to process your request (HTTP 500) (Request-ID: req-d6106c5d-3bf9-4adc-af93-cd14efddac13)
13:30:09 <chaconpiza> adriancz and I found the root cause:
13:30:09 <chaconpiza> The problem is with the new library:  influxdb 5.3.0
13:30:09 <chaconpiza> Downgrading to the previous version:
13:30:09 <chaconpiza> vagrant@devstack:~$ pip install influxdb==5.2.3
13:30:10 <chaconpiza> vagrant@devstack:~$ sudo systemctl restart devstack@monasca-api
13:30:10 <chaconpiza> vagrant@devstack:~$ monasca metric-list --name zookeeper.out_bytes
13:30:21 <chaconpiza> +---------------------+----------------------+
13:30:22 <chaconpiza> | name                | dimensions           |
13:30:22 <chaconpiza> +---------------------+----------------------+
13:30:22 <chaconpiza> | zookeeper.out_bytes | component: zookeeper |
13:30:22 <chaconpiza> |                     | hostname: devstack   |
13:30:22 <chaconpiza> |                     | mode: standalone     |
13:30:22 <chaconpiza> |                     | service: zookeeper   |
13:30:23 <chaconpiza> +---------------------+----------------------+
13:30:23 <chaconpiza> As witek found, influxdb was upgraded in u-c (upper-constraints) during the weekend
13:30:37 <chaconpiza> 😉
13:31:33 <chaconpiza> witek also found an issue related to the new version of influxdb that could be the reason
13:31:56 <chaconpiza> https://github.com/influxdata/influxdb-python/issues/820
13:33:30 <chaconpiza> what to do now? 1. try to update our code to be compatible with the new influxdb client
13:33:35 <openstackgerrit> Adrian Czarnecki proposed openstack/monasca-api master: Fix incorrect old  log-api tempest test configuration  https://review.opendev.org/718512
13:33:54 <chaconpiza> 2. send a change to the u-c to avoid the new version
13:33:55 <chaconpiza> ??
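For option 2, the change would roughly be a one-line pin in upper-constraints.txt in openstack/requirements (a sketch only; pinning back to 5.2.3 is an assumption based on the downgrade test above):
  influxdb===5.2.3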
13:34:31 <witek> I vote for 2
13:35:13 <adriancz> i vote for 1
13:35:51 <chaconpiza> I will vote for 2, to keep CI working, and in parallel try to make our code compatible with the new version
13:36:13 <chaconpiza> because there is a high possibility that the new version of the influxdb client is buggy
13:36:42 <adriancz> I think this is the best option
13:36:46 <chaconpiza> adriancz, is it ok with you?
13:36:54 <chaconpiza> ok
13:37:01 <witek> well, right now we don't really know if that's a bug in the influxdb client or our code
13:37:12 <chaconpiza> :D
13:37:47 <witek> but +1 for investigating the problem and trying to solve the root cause
13:38:11 <adriancz> and also we don't know what we need to do to make our code compatible with the new influxdb
13:39:32 <chaconpiza> well, then it seems we are choosing option 1
13:39:38 <bandorf> +1 for investigation of root cause
13:40:08 <witek> let's follow up next week then
13:40:08 <chaconpiza> ok, I will continue digging into it
13:40:16 <chaconpiza> alright
13:40:29 <chaconpiza> let's jump to the final topic
13:40:47 <chaconpiza> #topic Docker images
13:41:29 <witek> oh, that's mine
13:41:57 <witek> as we just discussed, building Python Docker images requires compiling from sources
13:42:11 <witek> because installing from wheels is disabled in Alpine
13:42:26 <witek> the following blog article describes it
13:42:32 <witek> https://pythonspeed.com/articles/alpine-docker-python/
13:42:56 <witek> it causes problems for some requirements, like for example confluent-kafka
13:43:31 <witek> also, the build process gets really long, and the resulting image isn't small
13:43:58 <witek> the author suggests other base images for Python
13:44:17 <witek> which at the first glance makes sense to me
13:44:40 <witek> just wanted to share
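For illustration, a minimal sketch of the kind of base-image switch the article suggests (not our current Dockerfiles; a Debian-slim base can install prebuilt manylinux wheels such as confluent-kafka instead of compiling them):
  # Debian-based Python image: manylinux wheels are usable, no compile step
  FROM python:3.8-slim
  RUN pip install --no-cache-dir confluent-kafka==1.4.1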
13:44:43 <chaconpiza> witek, I vaguely recall that there is another option called Eggs, is it possible in Alpine?
13:45:20 <witek> don't understand
13:45:28 <chaconpiza> instead of wheels
13:45:52 <chaconpiza> https://packaging.python.org/discussions/wheel-vs-egg/
13:45:55 <witek> pip in Alpine installs from source package
13:46:27 <chaconpiza> I see
13:47:02 <witek> at least for confluent-kafka
13:47:44 <chaconpiza> Choosing a new base image or continuing with Alpine is an important decision. Would you like to add it to the etherpad for the PTG?
13:48:37 <witek> yes, I think we can discuss it during PTG
13:48:46 <witek> but it's not my priority
13:48:46 <chaconpiza> +1 thanks
13:49:35 <chaconpiza> #topic AOB
13:50:28 <chaconpiza> is there any other topic you would like to discuss?
13:50:44 <witek> I have one, I could have added it to the agenda actually
13:51:22 <chaconpiza> #topic Fix monasca-log-api CI
13:51:23 <witek> we've got two competing approaches to fixing CI in monasca-log-api
13:51:30 <witek> https://review.opendev.org/718512
13:51:36 <witek> https://review.opendev.org/704536
13:52:03 <witek> adriancz and I chatted yesterday and he convinced me of his approach
13:52:11 <witek> seems cleaner
13:52:51 <witek> so, the concept is, we've deprecated monasca-log-api and its DevStack plugin in Ussuri
13:53:35 <witek> we still maintain the monasca-log-api code but stop maintaining its DevStack plugin from Ussuri onwards
13:54:16 <witek> so for testing, we run the new DevStack plugin in monasca-api for changes in monasca-log-api master and stable/ussuri
13:54:56 <witek> for stable/train and older branches we test with the old DevStack plugin in monasca-log-api
13:55:22 <witek> I know, it's a little complicated but I think it makes sense
13:56:01 <chaconpiza> I would vote for Adrian's way, since we already have the infrastructure to proceed, I mean: USE_OLD_LOG_API
13:56:37 <witek> oh, I've missed one review for that:
13:56:48 <witek> https://review.opendev.org/720240
13:56:58 <adriancz> I created a change for that
13:56:59 <adriancz> https://review.opendev.org/#/c/720240/
13:57:06 <adriancz> oh, sorry witek
13:58:22 <witek> adriancz: could you please rebase on top of https://review.opendev.org/724658 ?
13:58:45 <witek> also, please add a topic for both your changes, so it's easier to find
13:58:49 <adriancz> yes
13:59:25 <chaconpiza> for https://review.opendev.org/724658 I need the Docker fix (Bump librdkafka to 1.4.0)
13:59:38 <chaconpiza> and for that, the fix for the new influxdb version
13:59:58 <chaconpiza> we have a long chain
14:00:13 <witek> yes, influxdb client seems to be prio 1 today
14:00:25 <chaconpiza> Ok, lets go with Adrian's way
14:00:30 <witek> +1
14:00:34 <chaconpiza> I will dig into influxdb
14:00:42 <chaconpiza> we ran out of time
14:00:57 <witek> thanks, good meeting
14:01:02 <chaconpiza> Thanks for your ideas
14:01:08 <chaconpiza> +1
14:01:18 <chaconpiza> bye bye
14:01:26 <bandorf> Bye, everybody
14:01:31 <witek> bye, see you next week
14:01:37 <chaconpiza> #endmeeting