21:02:54 #startmeeting scientific-wg
21:02:55 Meeting started Tue Jul 26 21:02:54 2016 UTC and is due to finish in 60 minutes. The chair is b1airo. Information about MeetBot at http://wiki.debian.org/MeetBot.
21:02:56 Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
21:02:59 The meeting name has been set to 'scientific_wg'
21:03:01 #chair oneswig
21:03:02 Current chairs: b1airo oneswig
21:03:16 Greetings
21:03:19 hi all o/
21:03:21 o/
21:03:58 hello
21:04:10 Where to begin?
21:04:13 hi leong, is that Yih Leong Sun from Intel?
21:04:17 oneswig: I misread the meeting start and thought that the topic was "oneswig" and I thought I'd clearly missed the addition of a project
21:04:23 yes b1airo
21:04:31 isunil: this is the scientific working group meeting
21:04:34 mordred, lol
21:04:39 isunil: you were an hour early
21:04:41 * mordred goes back into his hole
21:04:47 mordred: my life is a project but comparison to openstack is debatable...
21:04:59 lol mordred
21:05:00 catchy names are required to stand out in the big tent
21:05:28 oneswig: more of a peg than a tent
21:05:31 leong, great so we can talk about openhpc today then!
21:05:36 need a mascot now too b1airo
21:05:36 oh, man, mordred, OneSwig would be a great project name!
21:05:55 or is it OnesWig
21:06:01 sure! b1airo
21:06:10 rockyg: In the Queen's English, indeed, One's Wig?
21:06:46 ahh dear me, let's get into it then
21:06:47 oneswig, that gives the project with that name even possibly a reason to be!
21:06:47 two's complement
21:07:27 so this week we want to talk about:
21:07:32 1) SC16 activities
21:07:38 2) OpenHPC
21:07:52 3) any news in our focus areas
21:08:06 #topic SC16 activities
21:08:57 quick recap
21:09:14 we have a panel session already scheduled at SC
21:09:38 oneswig, I can't recall - did we end up getting confirmation from everyone listed?
21:09:59 I don't know, I think confirmation went to a person at SC
21:10:04 I confirmed...
21:10:09 Sure you did too
21:10:43 not sure whether we heard from Kate or not?
21:10:51 There must have been similar responses from others?
21:11:09 You have not -- is it OK to confirm now?
21:11:12 our team is also planning a BoF, maybe we can coordinate?
21:11:20 Hi Kate!
21:11:31 hey, Kate's here
21:11:42 hi katekeahey, yes great
21:11:42 Hello -- do we know yet when the panel is scheduled?
21:12:01 no i don't think the programme is out yet
21:12:37 i also notice that i have email soliciting "Final Panel Information" by 1st Aug
21:13:03 Michaela Taufer?
21:13:33 Did you get mail from her Kate?
21:14:07 Mail from Michaela? (I get a lot of mail from her but I don't think I got anything about the panel)
21:14:11 oneswig, yes
21:14:12 Do you want me to ask her?
21:14:40 Aha, digging further back: "Your panel has been tentatively scheduled for Thu Nov 17, 3:30pm-5pm."
21:14:45 it's basically just a confirmation of the submission we already made, but there is a chance to add panelist bios
21:14:57 oneswig, nice find
21:15:49 OK, excellent so it looks like we have a date -- I should be able to make it
21:16:40 This is Chris from the OSF. I'll be at SC this year, and I'm available for any booth duty or events.
21:16:49 katekeahey: I think things continue as they are by default, from a quick scan. I don't think you need to explicitly confirm but perhaps it wouldn't hurt to do so
21:17:29 Hi Chris - did you see the mail on user-committee from jmlowe on Indiana University's booth?
21:17:34 presumably submissions.supercomputing.org will tell you if there are any outstanding actions
21:17:37 I should confirm to Bill then?
21:18:13 oneswig: I did, I will reply to it
21:18:25 I was just pinged again today about booth talks, so the sooner the better, it's not a lot, just a line or two so the schedule can be made
21:18:32 katekeahey: if you don't have any mails from SC's organising committee about this, probably worth figuring out
21:19:11 both small, I'm going to say up to 10 people, and large, 30 or so
21:19:20 OK, let me send mail then
21:20:34 I think we have confirmation (email or chat) from everyone currently on the panel programme: katekeahey, Jon Mills, jmlowe, Robert Budden, oneswig, b1airo
21:20:40 jmlowe: your deadline for submissions is 1st August?
21:20:59 friday, if at all possible
21:21:13 6/29
21:21:38 oneswig, did you say you were going to follow up to the panel thread?
21:21:42 7/29
21:21:43 jmlowe: I'll do a lightning talk on our OpenStack project at Cambridge - have to do it for the BoF (if it goes through) anyway
21:22:16 b1airo: sorry, you mean on the wording of the BoF?
21:22:32 that's great, sounds like a thing for a small talk, you can expect a big talk on jetstream in our booth of course
21:22:43 oh no sorry, confused myself, that was hogepodge
21:23:11 oneswig, do you see any "action" required in the submissions interface?
21:23:11 anteaya: yep, thanks, where did the time go?
21:23:22 jmlowe: my thoughts exactly
21:23:30 jmlowe: A talk on how we might use the work from Jetstream at Cambridge for example might tick some boxes?
21:23:46 sure
21:24:05 jmlowe: I'll get to work on it :-)
21:24:32 do we have a BoF for SC16?
21:24:36 ok so...
21:25:02 I'm thinking some sort of "cloud and hpc w/ openstack helps you collaborate and easily move your work around" theme
21:25:10 #action all: to follow up with jmlowe asap re. booth talks
21:25:45 #action b1airo: ask SC16 panelists for bio details
21:25:45 I can probably sell that
21:25:58 If we can encourage some academic clouds to consider adding identity federation to help share resources, that would be great
21:26:13 along the lines of collaborating and moving workloads
21:26:14 leong, we are writing a BoF proposal at the moment, already have a panel
21:27:02 hogepodge, there is a reasonable amount of interest in that but i think many of us are unclear about the technical requirements
21:27:18 our team is interested in a BoF as well, maybe we can collaborate?
21:27:24 hogepodge: rbudden and I along with some others are working on setting up an xsede keystone with the express purpose of federating it with anybody and everybody
21:28:01 jmlowe, sounds like something NeCTAR would be interested in
21:28:17 leong, what topic were you thinking about?
21:28:22 b1airo: I can help out
21:28:27 jmlowe: that sounds great
21:28:32 something along the lines of OpenHPC + OpenStack
21:28:33 leong: Do you have a proposal in the works? Makes sense to join them if so, increase the weighting
21:29:06 I added this to the BoF submission today: "It is the intent of this BOF to provide the broader HPC community with examples of HPC work being done, lessons learned, and best practices from members of the OpenStack community."
21:29:08 oneswig: nice direction
21:29:18 Is there a specific objective for the BOF? (such as for example to hear from the HPC community to what extent they would be interested in adopting either OpenStack or more generally an IaaS-type model?)
21:29:26 * anteaya is so heartened when witnessing collaboration
21:29:34 thank you scientific group
21:29:39 leong: Wouldn't a BoF be about something people are already established with using? OpenHPC + OpenStack sounds like something on the event horizon
21:31:01 oneswig, i will discuss more details with my OpenHPC team
21:31:11 oneswig, that's what i was thinking too, but certainly relevant in the context of the OpenStack BoF
21:31:43 katekeahey, we'll send you the draft link...
21:32:03 Ah, fantastic, I figured there must be some shared context -- thank you!
21:32:04 leong: Not to say I'm not highly interested in OpenHPC, it's the kind of topic I'd be very interested to discuss (among others)
21:32:26 i understand :-)
21:32:47 so there is one question on the BoF submission that we need to sort out
21:32:55 that can also be a potential topic to discuss at the Barcelona summit
21:32:59 (other than paring it down)
21:33:25 we need a "primary session leader"
21:33:41 straws?
21:34:03 also katekeahey - the intent with the BoF is for folks interested in HPC on OpenStack
21:34:03 Bill?
21:34:48 so it's sort of continuing on from the one Jon Mills led last year, which was generally cloud focused I believe, and this time being opinionated about using OpenStack
21:35:51 oneswig, yes could be Bill i guess. best if it is someone recognisable in the SC/HPC community
21:36:45 jmlowe, you are most welcome to lead it if you like, no need for straws :-)
21:37:27 though we haven't heard from Jon for a while, he might expect to be doing it, so need to ping him
21:37:45 I thought it was convention that the short straw lost and would have to lead
21:38:39 jmlowe - haha, you've exposed my graciousness as an act of laziness :-)
21:39:12 I'd do it if you had exhausted the list of better candidates
21:39:13 Is everyone holding a straw? This one looks long
21:40:09 ok, let's take this to email then oneswig?
21:40:30 b1airo: good plan, we can't speak for absent friends
21:41:35 #action oneswig, b1airo: email lists and members re. BoF leaders and participants
21:41:55 we need a rough idea of the numbers too, but i'm guessing it's going to be large
21:42:52 ok, any objections to moving on so we can hear about what leong and Intel are up to in the OpenHPC - OpenStack integration space?
21:43:14 go ahead
21:43:28 #topic OpenHPC - OpenStack Integration
21:43:50 leong, want to give us a rundown?
21:43:54 sure..
21:44:18 we are starting a project for OpenHPC and OpenStack
21:44:40 investigating how to integrate these two technologies together
21:45:03 leong: what do you see as the points of contact, and is there conflict/overlap at all?
21:45:12 have existing projects been evaluated to see if any are a close fit?
21:45:28 we are now looking at OpenStack Ironic
21:45:36 leong: awesome thank you
21:45:41 jroll: ^^
21:46:03 hi
21:46:06 it is still at the early stages of investigation..
21:46:14 jroll: thought you would want to be here for this
21:46:20 but Ironic is identified as our initial project to integrate with
21:46:28 wonderful
21:46:36 leong: please meet jroll the Ironic PTL
21:46:38 that is great
21:46:42 jroll: leong
21:46:53 isunil is on my team, he just joined this discussion
21:47:00 Hello
21:47:05 wonderful
21:47:33 leong: Is OpenHPC expected to image+boot HPC node instances on demand in this scenario?
21:47:46 there is nothing much to report today, i just want to bring up the conversation and interest in this group
21:48:14 I think meeting some ironic folks is a good first step
21:48:17 oneswig, that can be one scenario, but we haven't got into that level of detail yet
21:48:29 I'm sure they can help you evaluate if ironic is a good fit
21:48:34 sure, I bet Bridges at PSC would be really interested
21:48:41 leong: plenty of interest. If your focus is on Ironic, I wonder if OpenHPC might contribute a scalable way of imaging x thousand nodes simultaneously
21:49:32 we will keep this team updated on our progress
21:49:36 I think I remember rbudden (sends his regrets as he is on vacation) saying that it took 8 hours to image all of his ironic nodes
21:49:38 oneswig, isn't that a problem the Rocks guys have had a pretty good solution to for some years?
21:49:57 i'm wondering if we can have further discussion at Barcelona for this OpenHPC topic
21:50:07 jmlowe: with an image that was already built? or does that include building the image?
21:50:08 we will probably have more results to share over the next few months
21:50:09 oneswig: I am new to OpenStack, excuse me for the dumb question: is scalable provisioning via ironic a concern?
21:50:11 b1airo: question is how to fit it into a multi-tenant environment...
21:50:33 anteaya: not sure, wasn't paying attention
21:50:57 jmlowe: okay, image builds often take the infra team a considerable amount of time
21:50:59 jmlowe: anteaya: my recollection was more like 2 days to image the whole lot (but was jetlagged at the time)
21:51:04 there is some scalability concern when scheduling x-thousand deployments
21:51:06 for the record: http://www.rocksclusters.org/rocks-doc/papers/two-pager/paper.pdf
21:51:09 oneswig: /nod
21:51:26 oneswig: ironic gained multitenant networking this cycle, there's just one nova patch left to go, I hope to see that in the newton release
21:51:42 b1airo: thanks for the pdf
21:51:43 isunil: my understanding is that ironic as a whole doesn't scale well, single threaded and has trouble walking a large number of nodes
21:51:45 jroll: that is so great, well done
21:52:09 jmlowe: define scale? ironic handles thousands of nodes pretty well
21:52:25 that is one area where we need to work closely with the Ironic team, on the scalability
21:52:46 I'll admit there's work to do on simultaneous deployments, multicasting images and such
21:52:48 jroll: I don't run it, just foggy beer-soaked recollections from war stories in Austin
21:53:12 jmlowe: if you can recollect any data with links, do share
21:53:16 leong: if you came looking for ideas for work, I think you found it :-)
21:53:28 jmlowe: hm, data would be nice :)
21:53:28 jmlowe: folks in the #openstack-ironic channel are most friendly
21:53:47 indeed
21:54:11 jroll, will ping you further when we get into OpenHPC integration with Ironic
21:54:15 leong, so one thing i'd suggest is that you focus on things a little higher up the stack. to me openhpc looks promising from the perspective of an HPC SOE (insofar as that is an achievable thing). whatever you're doing you'd presumably want it to work for bare-metal and virtualised clusters?
21:54:18 jroll: has any work been done on using pull models based on swift urls within tripleo for ironic?
21:54:36 leong: cool
21:54:37 sure b1airo
21:54:39 leong: better yet, join the #openstack-ironic channel and be aware of the chat
21:54:59 oneswig: I don't work on tripleo, but ironic certainly supports pulling images from swift for deployment
21:55:10 * devananda notices all the chatter about ironic, perks up a bit late to the conversation
21:55:12 so the OpenStack integration could e.g. use Heat as the integration point, with Nova or Ironic behind it?
21:55:17 I assume tripleo supports it, just needs swift and the right driver selected
21:55:19 devananda: welcome
21:55:46 Hi devananda
21:56:13 hi devananda
21:56:16 we have 5 mins to the hour.. i will work with our OpenHPC team and come up with a plan on how to move this forward
21:56:30 jroll: is the pull-based deployment documented in ironic dev docs? Perhaps I should start at the tripleo end.
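[Editor's aside: the scalability concern raised above — scheduling x-thousand simultaneous deployments — is often mitigated client-side by issuing deploy requests in bounded batches rather than all at once. A minimal sketch of that pattern follows; `deploy_node` here is a hypothetical stub standing in for whatever call actually triggers a deploy (e.g. a python-ironicclient provision-state request), not a real Ironic API.]

```python
# Sketch: bounded-concurrency deployment of many bare-metal nodes.
# At most `wave_size` deploy requests are in flight at any time, so
# the conductor/scheduler is never hit with thousands at once.
from concurrent.futures import ThreadPoolExecutor, as_completed


def deploy_node(node_id):
    """Hypothetical stand-in for a real deploy call.

    In practice this would ask Ironic to provision the node and poll
    until it reaches the 'active' state (or times out).
    """
    return (node_id, "active")


def deploy_in_waves(node_ids, wave_size=50):
    """Deploy all nodes, keeping at most `wave_size` in flight."""
    results = {}
    with ThreadPoolExecutor(max_workers=wave_size) as pool:
        futures = {pool.submit(deploy_node, n): n for n in node_ids}
        for fut in as_completed(futures):
            node_id, state = fut.result()
            results[node_id] = state
    return results
```

Called as `deploy_in_waves(["node-%d" % i for i in range(2000)], wave_size=50)`, this returns a node-to-state mapping; the throttle is purely client-side and says nothing about server-side bottlenecks like image distribution, which the multicasting discussion above concerns.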
21:56:40 leong, do you have a goal for this project at this stage?
21:56:56 leong: here is some information on the ironic weekly meeting: http://eavesdrop.openstack.org/#Ironic_(Bare_Metal)_Team_Meeting
21:56:57 or is it not yet rubber-stamped by "the business"
21:57:24 oneswig: http://docs.openstack.org/developer/ironic/drivers/ipa.html
21:57:47 working with an existing project would really reduce your project maintenance overhead
21:57:51 oneswig: tripleo is a means to deploy openstack -- it's not specifically about HPC or about Ironic
21:57:57 Thanks jroll, looks clear as always.
21:58:02 :)
21:58:09 devananda: no but it does use it (the other way)
21:58:09 helping a current project with testing and docs is far easier than setting it all up yourself
21:58:11 b1airo: we are right now at a very early stage for openHPC & OpenStack integration.
21:58:46 is there documentation or a reference somewhere I could read that explains what openHPC is, what the project goals are, and what the current implementation status is?
21:58:48 hi Sunil
21:59:00 isunil: leong: got a blog we can track?
21:59:09 devananda, there is code of course :-)
21:59:19 https://github.com/openhpc/ohpc
21:59:49 http://openhpc.community
22:00:17 The bell chimes
22:00:45 We are out of time
22:00:50 indeed
22:01:07 sorry no time for AOB today!
22:01:13 let me follow up with my team and we can discuss next time
22:01:24 Thanks everyone, leong looking forward to it
22:01:26 please take it to the list if there is anything to discuss
22:01:46 b1airo: +1
22:01:56 anteaya, jroll devananda - thanks for jumping in on-demand
22:02:04 leong: while you are here, I definitely want to meet up in Barcelona
22:02:06 blogan: welcome :)
22:02:12 sure!
22:02:17 sure, thanks for collaborating
22:02:17 jmlowe
22:02:22 so heartening
22:02:36 Until next week...
22:02:38 jroll: second character is the number one
22:02:46 jroll: took me a few tries too
22:02:57 #endmeeting
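[Editor's aside: the pull-based deployment jroll describes above — the agent (ironic-python-agent, per the linked IPA driver docs) fetching the image directly from Swift over a temporary URL instead of having it streamed through the conductor — is driven by a handful of ironic.conf options. A minimal sketch follows; option names are taken from the Mitaka/Newton-era configuration reference and the endpoint/key values are placeholders, so verify both against your release.]

```ini
[glance]
# Ironic generates time-limited Swift "temp URLs" so the deploy agent
# can pull the instance image straight from object storage over HTTP.
swift_endpoint_url = http://swift.example.com:8080
swift_account = AUTH_example
swift_container = glance
# Key must match the Temp-URL-Key set on the Swift account.
swift_temp_url_key = example-secret
# Lifetime (seconds) of each generated temp URL.
swift_temp_url_duration = 1200
temp_url_endpoint_type = swift
```

This offloads image distribution from the conductor to Swift, which is one reason the agent-based driver is the usual answer to the mass-imaging scalability worries discussed earlier in the meeting.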