19:01:37 #startmeeting Cloud SIG
19:01:37 Meeting started Fri Jun 1 19:01:37 2012 UTC. The chair is rbergeron. Information about MeetBot at http://wiki.debian.org/MeetBot.
19:01:37 Useful Commands: #action #agreed #halp #info #idea #link #topic.
19:01:41 #meetingname Cloud SIG
19:01:41 The meeting name has been set to 'cloud_sig'
19:02:24 #chair maxamillion
19:02:24 Current chairs: maxamillion rbergeron
19:02:29 #chair gholms
19:02:29 Current chairs: gholms maxamillion rbergeron
19:03:41 rbergeron: Quit slacking!
19:03:44 #topic Roll call
19:03:49 Who's here today?
19:04:04 * maxamillion is here
19:04:13 I'm a chair? ... that's scary
19:05:00 i'm here.
19:05:09 gholms: :)
19:05:13 :D
19:05:25 feels like the week after release ;)
19:05:55 * mdomsch
19:06:03 Exhausted?
19:06:55 yes.
19:06:57 still.
19:07:00 mdomsch: hi there :)
19:07:02 jdarcy :)
19:07:37 * jdarcy :)
19:08:04 okay, we'll start!
19:08:09 #topic Plans for F18
19:08:27 rbergeron: I thought you found a minion to handle feature process now, no?
19:08:28 So i plugged this a bit in the agenda - I mostly want to see if anyone's got anything interesting they're either doing, planning, or would love to see this release
19:08:34 mdomsch: HA HA HA HAHAAHAHAHAHAH
19:08:36 WHEW
19:08:37 good one
19:08:48 mdomsch: there's a req.
19:08:49 :-)
19:09:09 mdomsch: these things move at, you know, the speed of wood petrification at red hat :)
19:09:19 * mdomsch wants to see CloudStack finished getting into the distro
19:09:38 SeTo see for F18?
19:09:41 and wants us to think hard about creating separate EPEL-like repos for each cloud management system
19:09:47 mdomsch: indeed.
19:10:47 I tried to get FESCO town hall candidates to bite at that yesterday, with no luck
19:11:24 mdomsch: yeah, i think fesco as a whole is still sort of like... cloud, that thing
19:11:37 i don't think anyone has any concretely formed opinions
19:11:46 gholms: seto see?
19:11:51 What?
19:11:56 Oh
19:12:04 mdomsch: you also didn't get a lot of bites on the mailing list either, iirc with that discussion
19:12:16 yeah, not so much
19:12:16 #idea cloudstack finished for F18
19:12:30 rbergeron: That was me typing, hitting backspace a bunch of times, typing a new sentence, and having the wireless drop.
19:12:31 #idea think about creating separate epel-like repos for each cloud management system
19:12:38 gholms: epic :)
19:12:55 #idea Euca for F18 too :)
19:13:02 :)
19:13:09 gholms: any idears?
19:13:29 * gholms hrms
19:13:32 mdomsch: I might have missed something, but what's the motivation for the different repos?
19:14:01 maxamillion: EPEL requires more long-term stability/support than any extant cloud platform is able to provide.
19:14:18 maxamillion: original request was for a user running OpenStack Diablo, got forcibly upgraded to Essex when the bits landed in EPEL
19:14:32 mdomsch: You mean something more formal than repos.fp.o?
19:14:42 gholms: ideally, yes
19:14:45 mdomsch: ahhhh, ok
19:15:33 * rbergeron hands gholms a dime for his use of the word extant
19:15:36 ;)
19:15:41 Heh
19:15:46 mdomsch: maybe branch openstack-swift-diablo and openstack-swift-essex separately? (as well as for all the other stuff)
19:16:00 That was certainly proposed.
19:16:09 But it means a new review of every package for every series.
19:16:13 I don't want to solve it here
19:16:19 gholms: oh right
19:16:28 mdomsch: right, sorry ... just a random thought :)
19:16:36 I want us to think of the experience we want our end users to have
19:16:42 and then reverse engineer from there
19:17:00 rgr
19:17:36 I'd like to see the SIG make dgilmore less a SPOF
19:17:43 mdomsch: are there any specific fesco people that might be interested in helping with such a thing? or do we have the brainpower here to do it on our own (i suspect we do)
19:17:44 Yes, please
19:18:18 re: the packaging stuf
19:18:19 stuff
19:18:26 #idea Recruit volunteers to build Fedora cloud images
19:18:34 rbergeron: unclear - likely. Just like when EPEL launched I hope
19:19:56 * rbergeron nods
19:20:20 #idea define/envision end-user experience(s)
19:20:39 Cloud SIG FAD? :-)
19:20:50 Yes, please!
19:21:00 I kind of want to do one of those late this summer.
19:21:01 #idea Cloud SIG FAD (with lots of yes, plz)
19:21:14 gholms: by late summer please say early fall? lol
19:21:16 I know, let's do it in San Diego the last week of August
19:21:23 mdomsch: LOL
19:21:39 mdomsch: i think we could probably do something, but i don't know if we could actually stay focused there :)
19:21:48 j/k
19:21:56 tdawson mentioned that a bunch of the openshift people would probably be interested in something around August.
19:22:02 gholms: ahhh
19:22:04 hrmmm
19:22:21 as in, "once they have it packaged"?
19:22:35 or what's the august... dependency, i guess
19:22:39 Not sure, he just said "in a couple months."
19:22:46 s/,/;/
19:22:48 August sounds good for me, I have a couple weekends taken up in August, but other than that
19:23:02 (from an openshift standpoint) :P
19:23:14 I should probably check with $dayjob. :)
19:23:15 gholms: have you poked anyone else about it?
19:23:32 maybe a mail on the mailing list would be useful - esp. if we have some specific problems defined, etc.
19:23:36 or tasks defined :)
19:23:40 rbergeron: It was part of a blog post that landed on planet.fp.o. Does that count?
19:23:54 We did a Docs FAD at OLF in September last year, that worked really well for us (sorry to interrupt)
19:24:27 god, summer, my kids are asking inane questions, sorry
19:24:45 There's a fairly nice co-working space in Durham that should be within driving distance of people in the RDU area.
19:25:04 gholms: mailing list might be useful nonetheless :)
19:25:06 And then agrimm and gregdek would have no choice but to attend. :)
19:25:08 Indeed.
19:25:10 gholms: LOL
19:25:51 and I think we have lots of people within driving range anyway - there are some cloudstack peeps in the "area" (10m to 4h drive), openshift folks i suspect, etc.
19:26:46 gholms: if you want to do that, i can point you at the fad planning pages - i think we really just need to brainstorm on it a bit between "what's everyone's availability" and "what do we want to solve"
19:27:08 https://s3.amazonaws.com/devrandom/imgs/wfm.png
19:27:33 gholms: awesome
19:27:45 awesome
19:28:04 #action gholms to stir up fad planning action on the cloud sig mailing list
19:28:19 perhaps a wiki page of just ideas and the who and the whens possibilities would be a good spot to get started
19:28:23 err, good place
19:28:23 damnit
19:28:29 rbergeron: Don't summon him!
19:28:31 * rbergeron hates that she hails tom unnecessarily all the time :)
19:28:38 * spot emerges from the darkness
19:28:43 Oh noes!
19:28:44 i think he's going to make me start providing a beer every time i do that
19:28:50 BEEEEEEER
19:29:06 which means a lot of beer for him next weekend
19:29:10 anyway: OTHER IDEARS?
19:29:39 Well now we have to start a beer fund for rbergeron's mishaps.
19:29:42 I know the "wouldn't it be nice if i could build a shift on top of a stack on top of fedora" is looming there.
19:29:45 Or something.
19:29:50 help in testing out live-media-creator to make images would rock
19:30:24 #idea help in testing out live-media-creator to make images would rock
19:30:26 rbergeron: I think "making $paas work on $iaas" would be a great thing to do at a FAD and also in general.
19:30:36 Hmm...
19:30:44 * rbergeron hands gholms the idea button
19:30:44 how is that opensuse studio - like - project coming along? "here's a kickstart file, give me a cloud instance please"
19:30:59 gholms: >1 day :-)
19:31:00 mdomsch: oh, you know, in my copious spare time i've gone nowhere with it
19:31:20 rbergeron: there was a planet post just last week about it - some intern...
19:31:21 #idea Make $PaaS work on $IaaS
19:31:26 mdomsch: unless you're referring to just boxgrinder alone
19:31:37 +1 to gholms' idea.
19:31:42 mdomsch: ah, yes, i think i saw that at some point
19:31:58 though i can't remember if it's actually intern-y or more like... GSOC
19:32:02 agrimm favors boxgrinder, fwiw.
19:32:14 gregdek: i think a couple projects favor it
19:32:25 (actually he favors ami-creator.)
19:32:35 or at least recommend it here and there
19:32:54 mdomsch: would you be interested in reaching out to $person and inviting them to the cloud sig mailing list? :)
19:33:26 rbergeron: I'm looking for the post
19:34:08 other ideas! or I'm movin' on to poking at people :)
19:34:19 people/projects
19:34:55 * ke4qqq shows up very late
19:35:49 ke4qqq: quick! ideas for f18
19:35:50 go :)
19:35:57 mdomsch already covered packaging CS
19:35:59 :)
19:36:25 ....or not. okay... moving on
19:36:30 unless someone objects :)
19:36:36 #topic Gluster
19:36:44 jdarcy: HEY! i hear y'all just had a release
19:37:52 * rbergeron taps her microphone
19:38:07 #link http://www.gluster.org/2012/05/introducing-glusterfs-3-3/
19:38:17 okay, i'll just plug that for you guys.
19:38:19 and move onwards
19:38:21 rbergeron: Yes, we did. :)
19:38:24 oh.
19:38:28 Now we get to do all the interesting stuff.
19:38:28 Anything to add to that?
19:38:33 do tell.
19:38:47 Well, there's just a ton of stuff that got blocked behind 3.3.
19:39:03 Biggest item for me is multi-way active/active replication (the "and a pony" kind).
19:39:23 That should be good for cloudy folks because migration in/out of clouds is getting to be a hot issue.
19:40:14 Kaleb is tearing his hair out (or would be) trying to deal with GlusterFS 3.2 + HekaFS vs. GlusterFS 3.3 vs. RHS vs. EPEL etc.
19:40:26 jdarcy: aye :)
19:40:44 Thank God we have him.
19:41:43 indeedy
19:41:53 okay. anything else?
19:42:01 * rbergeron will move on if not :)
19:42:10 Nothing for me.
19:42:12 #topic openstack
19:42:18 rustlebee: yo, if you're still about
19:42:31 * rbergeron is pretty sure there's no huge news here atm
19:43:05 okay! we'll assume that's the case
19:43:07 #topic openshift
19:43:14 maxamillion: tell me you're here :)
19:43:58 ke4qqq: if you're here maybe you could pipe in about the kind brave soul who is working on chef packaging also :) if there's anything to say there about it
19:44:29 * rbergeron pouts
19:44:35 some intrepid victim has undertaken chef - I've volunteered to sponsor him - and pushed him to doing informal reviews on openshift
19:45:01 that was great to see
19:45:22 chef? oo!
19:45:43 rustlebee: the new nick is even more disturbing than the drumkilla > russellb
19:45:43 rbergeron: sorry, have like 3 meetings going on right now
19:45:47 #info a kind-hearted soul has undertaken chef packaging, ke4qqq has volunteered to sponsor him
19:45:54 maxamillion: ah, gotcha
19:46:10 rustlebee: i stand yet again in my "i totally like it" corner :)
19:46:17 :)
19:46:35 #info (and robyn is bad at meetings, chef packaging has nothing to do with openshift, just randomly wandering through topics)
19:46:48 gregdek: yes, ooooo!
19:46:51 * jdarcy <- even worse at meetings
19:47:54 rbergeron: so, openshift has some fun challenges going on right now because we are currently based on ruby 1.8 and we're working to get things in line with ruby 1.9 for the F18 openshift feature
19:48:21 ahh, yes. ;)
19:49:41 maxamillion: well, people are looking forward to it :)
19:49:53 rbergeron: we're also in the process of getting all our m-collective scaling code open sourced and up on github and out to the community in a consumable form ... details on the state of that are on our FAQ https://openshift.redhat.com/community/wiki/faq-frequently-asked-questions under "how do I scale to more than one node with OpenShift Origin?"
19:50:04 rbergeron: we're looking forward to it! :D
19:50:08 oh!
19:50:24 r-e-o
19:50:26 mmmm
19:50:27 oreos
19:50:34 * rbergeron waits for maxamillion's oh!
19:50:43 we'll be updating our OpenShift Origin LiveCD Fedora Remix (LONGEST NAME EVAR)
19:50:48 ohhhhhh!!!!
19:50:54 o.O;
19:51:10 OpenShift Origin CloudStack oVirt LiveCD Fedora Remix
19:51:13 updating it to... be based on a beefy miracle?
19:51:22 mdomsch: you forgot euca
19:51:40 lol
19:51:51 oh! also, I've launched a Fedora Nightly (only Fedora 16) right now for the openshift origin bits that are hosted on the github openshift/crankcase repo --> http://mirror.openshift.com/pub/crankcase/nightly/fedora-16/
19:52:07 maxamillion: wow, you've been busy :)
19:52:08 note however that there are no builds for last night ... that was my mistake :(
19:52:13 rbergeron: certainly so :)
19:52:26 that is awesome though
19:53:44 okay. i'm moving on!
19:53:48 #topic Open Floor
19:53:55 now's the time to bring up what i forgot!
19:54:07 * mdomsch got s3-mirror-us-west-1 functional on Wednesday
19:54:16 now both us-east-1 and us-west-1 have private S3 mirrors
19:54:27 and I've started building one for us-west-2
19:54:55 * mdomsch _needs_ someone who knows about S3 log files, processing, and reporting, to do something useful with the bucket logs
19:54:59 mdomsch: i got the account moved back out of being consolidated
19:55:07 rbergeron: great, thanks!
19:55:18 mdomsch: good uptake of f17?
19:55:22 mdomsch: i have no idea if it's back to *free* but...
19:55:29 dgilmore: no idea - someone needs to look at the stats
19:55:52 fwiw, stats are downloaded daily to log02, where they could use to be processed
19:56:35 s/stats/logs/
19:56:58 #idea need someone who knows about s3 log files, processing, and reporting, to do something useful with the bucket logs
19:57:08 this is our best method of telling how many people actually use our images in EC2
19:57:38 #link http://ihasabucket.com/
19:57:59 * ke4qqq has done some s3 log processing in the past - if no one gets to it before me, and I ever get some time freed up I might take a look at it.
19:58:22 there are some for-pay services like http://www.s3stat.com we could use too
19:58:26 ke4qqq: at least having some documentation might be helpful or a "here's what to do" type of thing
19:58:50 ie: i wouldn't even know where to get started (surprising, i know)
19:59:27 at $5/month, that might be the easiest way to make use of the logs we have
19:59:36 indeed
19:59:59 mdomsch: hmm, interesting
20:00:24 mdomsch: can you elaborate on what data we'd want to glean out of that - other than "people are using this stuff"
20:00:34 rbergeron: that's most of it honestly.
20:00:48 some idea of number of VMs
20:00:48 mdomsch: okay
20:00:53 by version and region
20:00:54 fair 'nuff :)
20:01:01 yeah, i think the version, region info would be helpful
20:01:04 right now it's "we hope we have some users"
20:01:10 * gholms reappears
20:01:28 i am apparently late for a meeting.
20:01:42 * rbergeron is happy to pass the baton off to someone else
20:02:14 mdomsch: i agree
20:02:36 oay. anyone else?
20:02:41 * rbergeron slaps her k key
20:03:20 [Meanwhile, back at the ranch...]
20:03:24 gholms: lol
20:03:52 thanks for coming, all :)
20:04:00 * rbergeron looks forward to fad discussions :)
20:04:02 #endmeeting
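
Addendum: a minimal sketch of the kind of S3 bucket-log processing discussed under Open Floor (19:54:55-20:01:04) - tallying unique client IPs per mirror bucket and Fedora release as a rough proxy for "how many instances use our EC2 mirrors, by version and region". It assumes the standard S3 server access log format; the key layout matched by RELEASE_RE, the bucket naming, and the script itself are illustrative guesses, not anything that exists in Fedora infrastructure.

    #!/usr/bin/env python
    # Sketch: count unique clients per (bucket, Fedora release) from S3
    # server access logs. Field layout follows the documented S3 access
    # log format; the repo key layout below is a hypothetical example.
    import re
    import sys
    from collections import defaultdict

    # Leading fields of an S3 access log record: bucket owner, bucket,
    # [time], remote IP, requester, request ID, operation, key, ...
    LOG_RE = re.compile(
        r'^(?P<owner>\S+) (?P<bucket>\S+) \[(?P<time>[^\]]+)\] (?P<ip>\S+) '
        r'(?P<requester>\S+) (?P<reqid>\S+) (?P<operation>\S+) (?P<key>\S+) '
    )

    # Hypothetical mirror key layout, e.g. "releases/17/Everything/x86_64/os/..."
    RELEASE_RE = re.compile(r'(?:releases|updates)/(\d+)/')

    def main(paths):
        clients = defaultdict(set)  # (bucket, release) -> set of client IPs
        for path in paths:
            with open(path) as fh:
                for line in fh:
                    m = LOG_RE.match(line)
                    if not m or not m.group('operation').startswith('REST.GET'):
                        continue  # only count successful-looking GET traffic
                    rel = RELEASE_RE.search(m.group('key'))
                    if rel:
                        clients[(m.group('bucket'), rel.group(1))].add(m.group('ip'))
        for (bucket, release), ips in sorted(clients.items()):
            print('%-40s F%s  %d unique clients' % (bucket, release, len(ips)))

    if __name__ == '__main__':
        main(sys.argv[1:])

Run against a batch of downloaded log files (file names hypothetical), e.g. "python s3_mirror_stats.py logs/2012-06-*"; region then falls out of the bucket name (s3-mirror-us-east-1 vs. s3-mirror-us-west-1) and version from the requested key, which is essentially the breakdown mdomsch asked for at 20:00:53.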