19:00:24 #startmeeting Cloud SIG (17 Feb 2012)
19:00:24 Meeting started Fri Feb 17 19:00:24 2012 UTC. The chair is gholms. Information about MeetBot at http://wiki.debian.org/MeetBot.
19:00:24 Useful Commands: #action #agreed #halp #info #idea #link #topic.
19:00:34 #topic Roll call
19:00:40 #meetingname cloud
19:00:40 The meeting name has been set to 'cloud'
19:00:43 #chair rbergeron
19:00:44 Current chairs: gholms rbergeron
19:01:03 rbergeron is currently driving, so you will have to settle for me for the time being. :)
19:01:06 Who's here?
19:01:35 * rbergeron is here
19:01:43 O... HAI
19:01:46 HAI!
19:02:09 Anyone else?
19:02:12 Joe Breu from Rackspace Cloud Builders
19:02:23 Welcome, rackerjoe
19:02:34 wait, are you related to rackerhacker by first name :)
19:03:08 Related by place of employment only :)
19:03:32 * kkeithley is here
19:03:34 rackerjoe: :)
19:03:37 kkeithley! hi :)
19:03:46 * rbergeron will let the crowd gather a minute
19:04:12 I am on the team inside of Rackspace that performs installations of OpenStack inside of our DCs and also at customer DCs.
19:04:26 obino and mull :)
19:04:37 rackerjoe: ah, nice, well. welcome :)
19:04:57 Well: I'll get started, and we'll see who else shows up.
19:05:04 #topic Eucalyptus
19:05:11 I like putting mull on the spot.
19:05:12 :)
19:05:18 lol
19:05:19 mull: how's packaging-land going?
19:05:24 * rbergeron waves at obino
19:05:24 You just hilighted spot, too.
19:05:29 oh.
19:05:31 rbergeron, it's awesome
19:05:33 :)
19:05:38 well, he's used to me bugging him anyway :)
19:05:39 I have totally rewritten our page
19:05:40 rbergeron: you rang?
19:05:56 spot: not intentionally, but i send my hugs and care for wh :)
19:06:03 aww, thanks.
19:06:11 and we really only have a dozen or so packages left
19:06:11 mull: orly
19:06:21 i saw you have applied for provenpackager
19:06:33 and some of those are just blocked on existing reviews
19:06:34 #info only a dozen or so packages left for euca till it's fully baked
19:06:34 Bwuh?
19:06:34 * aarapov is lurking
19:07:00 also, the ovirt team has most of the same reqs we do (spring, gwt, jboss), so they are now helping
19:07:05 i don't know if that's supposed to be sekrit and if it is i just failed, but oh well, i'm sure y'all can forgive me
19:07:08 mull: yeah
19:07:23 * rbergeron wonders how jboss is going, haven't talked to mgoldmann in a bit
19:07:27 rbergeron, the fesco issue is certainly not sekrit
19:07:33 but the commentary is
19:07:37 #info
19:07:50 mull: indeed
19:08:02 mull: did you rewrite the feature page or something else?
19:08:04 * rbergeron is looking now
19:08:08 * ke4qqq shows up
19:08:16 rbergeron, sorry, the packaging progress page
19:08:23 https://fedoraproject.org/wiki/Eucalyptus
19:08:35 I rewrote it to reflect remaining work
19:08:40 * tdawson sneaks into the meeting.
19:08:43 instead of listing all 150 jars we depend on
19:08:45 #info packaging progress page for euca has been rewritten to reflect remaining work
19:09:07 * rbergeron waves hi to ke4qqq and tdawson
19:09:33 mull: so i guess the million dollar question is "how soonish do you think?" (while bearing in mind i'm not like OMG HURRY)
19:09:42 just... putting my finger in the wind and all
19:09:49 so, the goal is to get spring and mule submitted next week, and at that point the only thing stopping us will be gholms's axis2c and rampartc packages. :-)
19:10:05 I pinged spot about axis2c today.
19:10:06 mull: ehcache-core?
19:10:16 ke4qqq, yeah, that'll get done by next week
19:10:19 gholms: axis2c has issues?
19:10:31 jforbes, FHS issues, I think
19:10:45 * ke4qqq and Sparks will be happy to help esp wrt ehcache-core and its deps as that's our last dep.
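The "symbolic links and duct tape" approach to FHS problems discussed here can be sketched roughly as follows. This is a hypothetical illustration only — the directory layout, module name (`libmod_addr.so`), and prefix (`/opt/axis2c`) are assumptions, not the actual axis2c package contents:

```shell
# Sketch: move a library installed under a non-FHS prefix to an FHS
# location, leaving a compatibility symlink behind so code that still
# dlopens the module by its old path keeps working. Paths are made up
# for demonstration; we build them under mktemp rather than touching /.
set -e
olddir=$(mktemp -d)/opt/axis2c/lib      # pretend non-FHS install location
newdir=$(mktemp -d)/usr/lib64/axis2c    # pretend FHS-compliant location
mkdir -p "$olddir" "$newdir"

touch "$olddir/libmod_addr.so"          # stand-in for a dlopen'd module
mv "$olddir/libmod_addr.so" "$newdir/"  # real file goes to the FHS path
ln -s "$newdir/libmod_addr.so" "$olddir/libmod_addr.so"  # compat symlink

ls -l "$olddir/libmod_addr.so"
```

The point of the symlink is that anything resolving the old path (including a hard-coded `dlopen()` path, which is the remaining concern mentioned below) transparently reaches the relocated file.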
19:10:48 I solved the FHS issues using a large amount of symbolic links and duct tape.
19:10:55 gholms, cool
19:11:02 The only remaining concern is the .so files that it dlopens.
19:11:05 s/is/are/
19:11:13 #info ehcache-core expected by next week; axis2c has some possible FHS issues that gholms is working with s p o t
19:11:22 Heh
19:11:32 :D
19:11:42 mull: you're rockin', dude.
19:11:52 so, I should be clear that what I expect next week is that we should be able to build a functional euca package with the web ui and reporting module disabled
19:11:58 but it will be functional
19:12:02 dare i ask if there will be a test day or not yet :)
19:12:07 UI will get enabled when gwt lands
19:12:09 $dayjob prioritization be damned, I am going to do the wss4j review today.
19:12:24 rbergeron, ask me next week. :)
19:12:37 #info expectation is that by next weekish we should be able to build a functional euca pkg with web ui and reporting module disabled; UI will get enabled when gwt lands
19:13:00 I think that's all my updates for this week
19:13:01 #action rbergeron to harass, er, ask mull gently about euca test day in the next meeting
19:13:06 mull: thanks, yo! :)
19:13:19 #topic Cloudstack
19:13:31 * rbergeron spins the wheel of fate in the general direction of ke4qqq and sparks
19:13:55 * rbergeron hopes that fate involves a response ;)
19:13:57 bleh - we have one remaining dep - it just has plenty of deps itself. fortunately it's a shared dep with euca, so we are collaborating on the deps/reviews
19:14:13 ehcache-core is the remaining dep
19:14:27 #info one remaining dep - with lots of deps itself, but is shared with euca, so they are collaborating. (ehcache-core)
19:14:45 so the hope there is also in the next few weeks?
19:15:04 I had hoped to make more packaging progress this week, but will be spending next week on packaging as well in hopes of at least having cloudstack packaged and in a decent shape to be reviewed next week.
19:15:36 ke4qqq: cool.
thank you :)
19:15:58 #info hoping to make more progress on packaging next week, to the point of being packaged and in decent shape to be reviewed.
19:16:22 #topic gluster/heka-land
19:16:35 * rbergeron looks at kkeithley and jdarcy to see what's up
19:16:40 (if anything)
19:16:55 Not much concrete at this end (I'm at FAST).
19:17:22 Lots of good discussions, ideas about things like dedup and erasure codes, but that's about it.
19:17:35 #action jdarcy to tell johnmark that he should take jdarcy to dinner
19:17:46 :D
19:17:52 Oh, and I might have found some people willing to write translators.
19:17:59 jdarcy: ah, very cool.
19:18:08 you know that might make some interesting documentation to have in fedora-land as well :)
19:18:15 I hear Pittsburgh is lovely at other times of the year. ;)
19:18:53 lol
19:18:59 rbergeron: Yeah, I really want to get it off hekafs.org, if it's useful to gluster.org and/or fedoraproject.org that's great.
19:19:12 documentation? In fedora-land?
19:19:21 jforbes: little-known fact:
19:19:34 jforbes: Don't worry, GlusterFS documentation sucks just as hard.
19:19:38 http://docs.fedoraproject.org/en-US/Fedora_Draft_Documentation/0.1/html/Cloud_Guide/index.html
19:20:16 * rbergeron sees it may need work as sheepdog and gluster/heka fs are listed as IaaS
19:20:35 and unless those are "incrediblyawesome as a service" i'm pretty sure it's not the right spot
19:20:40 er
19:20:40 place
19:20:46 (god, he's going to kick my ass)
19:20:59 you'd think the docs czar would know better - working for a cloud project and all
19:20:59 Hand Waving as a Service
19:21:24 ke4qqq: i am not sure if jsmith did this or sparks
19:21:28 or some combo of both
19:21:34 Seriously, nothing from kkeithley? When the cat's away...
19:21:37 ahhh figured it was sparks, but could be mistaken
19:21:44 I'm here
19:21:54 fighting with gerrit and jenkins
19:22:10 Ew. Anything I might be able to help with?
19:22:19 I'm going to Connectathon next week, but I already reported that a week or two ago
19:22:37 no, I finally got it to build on jenkins, gcc-3.4 :-(
19:22:41 kkeithley: if you need any fedora swag, lmk, i can probably point you to a person who has some in westford :)
19:23:03 * rbergeron is always happy to help y'all pimp us out
19:23:04 Crap, we really do need to update those build machines.
19:23:18 yeah, tell me now because I'll be headed home shortly
19:23:22 okay. /me is going to move onwards ;)
19:23:38 jdarcy, kkeithley: thanks, and enjoy the rest of fast and the whole of connectathon, respectively
19:23:44 #topic EC2
19:23:45 ty
19:23:52 I don't know that we have anything huge to report here.
19:23:57 mdomsch: pingy?
19:24:25 * rbergeron holds the ping for a moment
19:24:52 I think things are decent here; we'll probably want to give alpha a run when it's aliiiiive and make sure that it's still functioning after all the /usrmove fun and etc
19:25:37 and since nobody has any commentary i'll move on to my next subject
19:25:39 :D
19:25:46 #topic OpenStack
19:25:59 * rbergeron is not sure if any of the openstack peeps are around, but
19:26:05 btw, I loaded f17 in a kvm guest and built glusterfs on it
19:26:12 Just me from the RCB team at Rackspace
19:26:15 kkeithley: did it blend?
19:26:22 rackerjoe: hello, again! :)
19:26:43 Our team currently is building packages for our deployments but we want to focus more on operational fixes to OpenStack and less on packaging
19:26:52 * rbergeron just wanted to check in on some of the progress of https://fedoraproject.org/wiki/Features/OpenStack_Essex
19:27:04 rackerjoe: we have a, um, small army of people working on getting it all in fedora.
19:27:12 so we want to get involved with getting good stable packages out there that we can start deploying in customer environments
19:27:14 diablo is in, but kind of prepping for essex, and some other related things.
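The "/usrmove fun" check rbergeron mentions for the alpha EC2 images could be sketched like this. After Fedora 17's UsrMove change, `/bin`, `/sbin`, `/lib`, and `/lib64` become symlinks into `/usr`; the function name and scope below are illustrative assumptions, not the actual test-day procedure, and it is demonstrated against a fabricated image root rather than the live system:

```shell
# Minimal post-UsrMove sanity check for an image root: verify that the
# top-level bin/sbin/lib/lib64 entries are symlinks into the merged /usr.
check_usrmove() {
    root=$1
    for d in bin sbin lib lib64; do
        [ -L "$root/$d" ] || { echo "FAIL: $root/$d is not a symlink"; return 1; }
        case $(readlink "$root/$d") in
            usr/$d|/usr/$d) ;;                       # points into /usr: good
            *) echo "FAIL: $root/$d points elsewhere"; return 1 ;;
        esac
    done
    echo "usrmove layout looks OK"
}

# Demonstrate on a fake root so we never depend on the host's layout.
fake=$(mktemp -d)
mkdir -p "$fake/usr/bin" "$fake/usr/sbin" "$fake/usr/lib" "$fake/usr/lib64"
for d in bin sbin lib lib64; do ln -s "usr/$d" "$fake/$d"; done
check_usrmove "$fake"   # prints: usrmove layout looks OK
```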
19:27:29 * russellb is here
19:27:35 russellb: HI
19:27:42 we have updated Nova and Glance to Essex so far
19:28:05 rackerjoe: we'd love to collaborate on the packages :-)
19:28:45 russellb: i know you guys have a test day planned for 3/8, is there any help you need with that, be it advertising (i'm good at that) or otherwise (others might be more helpful, lol) ?
19:28:47 russellb: I see apevec building stuff for swift (bringing it closer to up to date, but don't see any Fedora builds, just epel.) what's up with that (or am I just missing things?)
19:29:09 #info Nova and Glance are updated to Essex so far
19:29:17 Mind if I ask a silly question related to that?
19:29:25 ke4qqq: hm, I don't know. he may have just been working on some dependency issues. I don't know of any reason he wouldn't update Fedora.
19:29:31 gholms: go for it
19:29:37 russellb: bah I take it back, I just searched and saw some f17 builds
19:29:41 Where do I find what actual version corresponds to $release_name?
19:29:41 /ignore me
19:29:42 ah, k
19:29:43 though i'm sure it's not me you're minding?
19:29:55 gholms: heh, I guess it's not obvious ...
19:29:58 People say "essex" and I have no idea how to tell what a package actually is.
19:30:02 Essex == 2012.1
19:30:10 Is there a wiki page or something for that?
19:30:22 gholms: magic 8 ball :)
19:30:29 wiki.russellbsbrain.awesome
19:30:32 :D
19:30:36 ke4qqq: Ask again later
19:30:55 gholms: not that i know of ...
19:31:02 Do you have a working packaged keystone and dashboard as of yet?
19:31:02 Okay. :(
19:31:25 rackerjoe: working on it very hard. keystone is packaged ... but not updated to the new keystone yet, since that was *just* merged, but we will have it
19:31:33 gholms, It's epoch + (ord($letter_release) * (release cycle)) right?
19:31:40 easy as pie
19:31:48 actually, I think horizon was added already
19:31:53 for Fedora 17
19:32:06 rackerjoe: https://fedoraproject.org/wiki/OpenStack has most of the ..
details on what's in, out, in progress (assuming the page is up to date), etc.
19:32:30 http://koji.fedoraproject.org/koji/buildinfo?buildID=297684
19:32:34 horizon build ^^^
19:32:45 well, top level package is actually openstack-horizon
19:33:10 so yeah, I haven't tried it yet, but it's in :)
19:33:50 * rbergeron notes she is thinking of trying to obtain a fedora booth at the conference portion of the openstack summit/conference (assuming i could do so for an amount that is not TEN THOUSAND DOLLARS)
19:34:02 since others have a booth full of shiny and collateral there
19:34:12 excellent. I'll grab those packages and do a local install to test operability
19:34:20 (and i'm not kidding about ten thousand dollars either :()
19:34:36 rackerjoe: great! please email me with how it goes. rbryant@redhat.com
19:35:15 so to what degree are you guys using RHEL / Fedora?
19:36:13 At this time we are using Ubuntu for our deployments but almost every install that we have done customers have asked for a rhel option which we haven't been able to provide yet
19:36:27 And Rackspace is a pretty big user of RHEL to start with
19:36:54 "rackspace: we have more RHCEs than red hat does"
19:37:03 cool, well give the EPEL packages a try.
19:37:18 Our ultimate goal is to work with the distros and get their packages to a state where we can deploy on any OS. I personally prefer RedHat myself :)
19:37:40 I don't think we can claim we have more RHCEs anymore :)
19:37:51 well we are certainly working hard toward that same goal
19:38:02 lol
19:38:31 so any feedback you guys have on the packages would be great to hear.
19:38:57 Will do.
19:38:58 horizon isn't in EPEL yet, but it will be once we update EPEL to essex.
19:40:11 okay. ready to move on?
19:40:42 * rbergeron will move on to...
19:40:56 #topic Any other business? aka Open Floor
19:42:27 anyone, anyone.
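gholms's earlier question — how a codename like "essex" maps to an actual version — can be answered with a tiny lookup. There was no wiki page for this at the time; the table below only covers releases that existed as of this meeting, and the helper function itself is a hypothetical sketch, not any real tool:

```shell
# Map an OpenStack release codename to its year-based version number.
# Codenames were assigned alphabetically; versions are YYYY.N per release
# cycle (so "Essex == 2012.1", as stated in the meeting).
openstack_version() {
    case $1 in
        austin)  echo 2010.1 ;;
        bexar)   echo 2011.1 ;;
        cactus)  echo 2011.2 ;;
        diablo)  echo 2011.3 ;;
        essex)   echo 2012.1 ;;
        *)       echo "unknown release: $1" >&2; return 1 ;;
    esac
}

openstack_version essex   # prints 2012.1
```

This is also why "ord($letter_release) * (release cycle)" got a laugh: the codenames advance alphabetically, but the version numbers track calendar years and per-year release counts, so there is no clean arithmetic formula.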
19:42:28 :)
19:42:34 one thing
19:43:06 if you are interested in hacking on Xen - there's a Xen hackathon at the Oracle Campus March 6-8
19:43:09 * ke4qqq looks for link
19:43:18 http://blog.xen.org/index.php/2012/01/05/oracle-hosted-xen-hackathon/
19:43:35 so if you are in the bay area and are interested - feel free to come along
19:43:49 * ke4qqq thinks glusterfs folks are showing up
19:44:18 EOF
19:44:39 #info Xen hackathon at the Oracle Campus March 6 - 8
19:45:26 #info http://blog.xen.org/index.php/2012/01/05/oracle-hosted-xen-hackathon/
19:45:33 * rbergeron thinks links get sucked into meetbot anyway
19:45:52 #info feel free to come along, says ke4qqq
19:46:04 annnnnyone else?
19:46:09 * rbergeron wonders if XAPI will be getting hacked on there?
19:46:18 (or packaging thereof)
19:47:10 [Everyone stares at rbergeron]
19:47:17 hey now, don't do that ;)
19:47:18 it's already in progress - but yes that's one of the tasks
19:47:20 * rbergeron looks around
19:47:31 well then.
19:47:34 THANKS FOR COMING YO
19:47:43 * rbergeron salutes you all for hard work and kicking ass
19:47:52 see you guys next week.
19:47:55 #endmeeting