02:36:21 #startmeeting
02:36:21 Meeting started Fri Jan 15 02:36:21 2010 UTC. The chair is mchua. Information about MeetBot at http://wiki.debian.org/MeetBot.
02:36:21 Useful Commands: #action #agreed #halp #info #idea #link #topic.
02:36:26 Seemed like a good idea to log.
02:36:57 dgilmore, jds2001: rossand is one of the folks working on freeseer, an open source events recording project
02:37:00 #link http://github.com/fosslc/freeseer/
02:37:15 rossand: dgilmore and jds2001 are our infrastructure gurus for the event
02:37:29 Hi dgilmore, jds2001. Nice to meet you.
02:37:38 rossand: same :)
02:37:51 needing rpmfusion sorta makes this a no-go
02:38:03 is there a way to record in open formats? (ogg)?
02:38:08 freeseer in a nutshell... captures vga output and microphone sound, mixes them to produce a video capture of any vga signal.
02:38:17 awesome
02:38:35 they use mencoder and ffmpeg, so we can get to ogg (theora/vorbis/whatever)... just not directly.
02:38:39 rossand: and herlo is "He Who Records Tons Of Stuff At Events"
02:38:51 cool. Hi herlo
02:38:57 hello everyone
02:39:09 zxiiro is the lead developer of freeseer.
02:39:36 We created it in October to drop the cost of recording by an order of magnitude.
02:39:38 hey zxiiro - me and dgilmore are fedora infrastructure guys :)
02:40:24 The extend-a-mix version of the history is here: http://www.fosslc.org/drupal/node/596
02:40:52 hi rossand
02:40:57 Thus far, we must have recorded about 250 hours/275 talks with it.
02:41:03 Hi dgilmore
02:41:24 hi jds2001, dgilmore
02:41:31 hi zxiiro
02:41:50 I agree with jds2001, we can't use things from rpmfusion
02:42:13 So the situation - rossand, feel free to jump in and correct me here - is that we might actually want to have the freeseer folks sprint remotely, because (1) plane tickets are incredibly expensive, and (2) development is basically done, and what's needed is testing, making sure docs are good, etc. I'm not sure exactly what infra would be needed to make this work for events - just bandwidth/space capacity?
02:42:20 so development is done, EXCEPT that the "need rpmfusion" bit is a blocker... And solving that problem would be *incredibly* valuable - let us package it for Fedora (aiui - I think that's the only blocker?), make it use all free/unencumbered/yay/etc stuff... but I'm not sure how we tackle that in a 3-day sprint.
02:42:55 rossand: hi, sorry I can't participate more...
02:42:56 yeah, exactly
02:43:10 i have looked at licensing, but that's generally another big area
02:43:17 mchua: spot on.
02:43:56 freeseer is gplv3. the components (mencoder, ffmpeg) are likely to cause a bit of heartburn. We're open minded to options/suggestions of course.
02:44:30 So the question is "do we spend 3 days working on this, and if so, how, and what/who would we need to do it?" If we need to find an ogg guru, etc.
02:44:46 I'd really like to contribute to this conversation...
02:44:49 herlo: thanks. No worries. We're pretty easy going of course.
02:45:08 rossand: if we could convince ffmpeg upstream to support plugins then we could ship with unencumbered codecs in fedora, and the patent-encumbered ones could ship in rpmfusion and be added in
02:45:12 so it can't go into the fedora repos - can it go in rpmfusion?
02:45:30 VileGent: it can, but we can't use it in fedora
02:45:36 but we wouldn't be willing to use it in infra that way
02:45:43 ok
02:45:47 but for a general thing, yes
02:45:51 so, the problem is going to be the issues with the tools you are suggesting... but I have quite a bit of experience with gstreamer on the command line. I'm working on learning the python bindings atm and have a desire to use it as the way we get those things done
02:45:57 this is something we want in fedora to use in fudcons and other events
02:46:00 dgilmore: that's a great idea. ffmpeg has a very strong culture if you know what I mean. ;-)
02:46:11 since most everything in gstreamer is not restricted by patent or licensing issues
02:46:39 rossand: :) yeah. when i was at olpc some of the guys i worked with took that idea to ffmpeg and they flat out rejected it
02:47:12 dgilmore: like most suggestions brought to them. It's almost charming in the certainty. :)
02:48:30 rossand: how hard would it be to use gstreamer or xine?
02:49:02 dgilmore: I was taking a look as you typed. zxiiro, are you ahead of me and have any thoughts there?
02:49:46 stickster_afk: in case you become afk in the near future, read the scrollback right around here (2:45:00).
02:49:50 er, non-afk that is.
02:50:16 i'm not too familiar with gstreamer or xine, but i know the hardware we use, vga2usb, needs to be used with something that supports v4l input
02:50:33 while y'all are talking I'm trying to see if we have any gstreamer ninjas at the RDU office or nearby.
02:51:05 zxiiro: I see evidence it might be able to do so.
02:51:32 herlo: looking at the code now; if we came up with the right gstreamer command line options i bet we could make it work
02:51:49 zxiiro: v4l is supported in gstreamer
02:51:51 dgilmore: yep. The beauty of freeseer is there ain't much to it.
02:52:02 dgilmore: I've done it
02:52:06 I have the right options
02:52:15 the command line is simple now...
02:52:18 freeseer itself is a Qt gui leveraging the video software to do the heavy lifting.
02:52:43 i'm open to anything, the reason mencoder was chosen was mainly that it was what was suggested by vga2usb's README file
02:52:52 rossand: yeah, I'm writing in pygtk atm
02:52:56 if we can get the same thing working with any other backend i'd love to try it
02:53:04 I'm guessing there's going to be a bit of conversation regarding those things...
02:53:04 ditto
02:53:48 Any guesses on the scope of the work needed for this? Would this be something we'd want to try sprinting on at the FAD?
02:54:12 Should that be our technical goal - to get that to the same level of usability/packaged-up-ness/documentation as we want for the current freeseer release?
02:54:13 zxiiro: rossand: I think we can do both/either easily
02:54:16 mchua: if herlo doesn't mind priming us with some cli examples, we can take a swipe at moving it over.
02:54:33 well if we can do it pre-FAD that's even better ;)
02:54:36 * dgilmore likes that it's qt based :)
02:54:39 rossand: I can give you some of those this week probably. I'll get my examples working again...
02:54:42 regarding usability, it's transparent.
02:54:43 :)
02:54:46 but the FAD is dedicated hackin' time, so if we need it, we've got it.
02:54:48 herlo: awesome, thank you
02:55:13 * herlo does think that we can do a sprint
02:55:18 there should be enough time
02:55:22 the user won't notice what's doing the video slurping under the covers.
02:55:25 Right.
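(Sketch, not part of the log: the claim above that v4l is supported in gstreamer is easy to check on a given machine with gst-inspect, which comes up again later in this log. Which source elements are available depends on the GStreamer plugin packages installed, so treat these commands as illustrative.)

    # list any v4l-related elements, then show details for the v4l2 source
    gst-inspect | grep -i v4l
    gst-inspect v4l2src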
02:55:25 problem is, I don't think there's enough time to do both
02:55:42 but I'm not opposed to doing both...
02:56:24 just don't know how feasible it would be in 3 days
02:56:44 it sounds like we might want to reconvene tomorrow or monday to figure the engineering gameplan out.
02:57:01 either works for me
02:57:08 since this meeting was *totally* random and ad-hoc and we might want to tinker a bit, do background research, etc. before we go "YEAH! 3 DAY SPRINT!"
02:58:22 herlo, rossand, zxiiro: what would we need to know to scope both options out and decide on one, the other, or both (and know what resources, etc. we need to do it well, and boil it into a buncha tickets to whale on at the end of the month)?
02:58:42 may I make a suggestion?
02:58:51 mchua: once we have the cli from herlo, we're enabled to do some testing.
02:59:02 herlo: please do
02:59:26 I think it's not feasible for fedora to use ffmpeg, so I want to suggest that we work toward two guis (one qt, one gtk) to manage the same libs
02:59:44 making it easy to just write a good library to work with, because gst libs are *fun*
03:00:35 herlo: I understand the concerns about ffmpeg/mencoder. Are there reservations about qt? Look and feel consistency with Gnome?
03:00:44 qt is fine
03:01:16 I guess I'm confused about the 2 gui part.
03:01:18 rossand: I'm not familiar with qt
03:01:20 that's one
03:01:43 another is that there are plenty of people who want the choice of both qt and gtk because they like one over the other
03:02:10 right, but that's more of a long term goal
03:02:35 jds2001: so my big concern there is that I'd spend the entire weekend learning qt
03:02:37 i think a gtk gui option down the road is fine
03:02:43 herlo: may I suggest we focus on qt for now, being mindful of being skinnable in the future?
03:02:45 and waste time doing that...
03:03:06 if we can get a commandline to do the video recording
03:03:10 you shouldn't need to learn qt at all
03:03:12 I guess I could work on a library or something... but... I kind of already have a good mockup in glade...
03:03:20 zxiiro: agreed
03:03:25 zxiiro: that's like a bash commandline
03:03:53 freeseer actually just calls a commandline to do the video recording
03:04:01 oh?
03:04:03 the rest of it is just gui to make it easy to configure
03:04:04 interesting
03:04:21 herlo: check out the code
03:04:25 it's very small
03:04:25 I don't know, the python api is simple enough we could make that clean...
03:04:34 dgilmore: yeah, I was working on checking it out atm
03:05:55 the only code we really need to change to make it use a different video recording backend is
03:05:56 CMD_MENCODER = ("mencoder -tv driver=%(VODRIVER)s:outfmt=bgr24:device=%(VODEVICE)s:forceaudio:alsa -fps 10 tv:// -oac lavc -ovc lavc -lavcopts vcodec=mpeg4:keyint=100:vbitrate=8000:vhq:acodec=vorbis -o \"%(FILENAME)s\"")
03:06:13 assuming whatever recording backend we use is commandline, of course
03:06:34 oh, wow, yeah
03:06:36 I just saw that
03:06:47 the rest of the code is just a pretty gui that figures out what to fill in
03:06:47 let me get you a command line that I think works...
03:07:12 * dgilmore will make sure that he brings his webcam with him
03:07:13 herlo: cool. We should be able to test that pretty quick.
03:07:44 gst-launch istximagesrc name=videosource startx=0 starty=0 ! ffmpegcolorspace ! theoraenc ! oggmux ! filesink location=test.ogg <-- this does video only from your desktop
03:07:48 * rossand likes the idea of fewer dependencies and codec hopping.
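(Sketch, not from the meeting: herlo's example above grabs the desktop; a rough GStreamer counterpart to what CMD_MENCODER does - V4L2 video plus ALSA audio encoded to Theora/Vorbis in an Ogg file - might look like the following. The device path and filename are placeholders, and the exact pipeline would need testing against the vga2usb hardware.)

    # hypothetical drop-in for CMD_MENCODER: v4l2 video + alsa audio -> Ogg (Theora/Vorbis)
    gst-launch oggmux name=mux ! filesink location=output.ogg \
      v4l2src device=/dev/video0 ! ffmpegcolorspace ! theoraenc ! queue ! mux. \
      alsasrc ! audioconvert ! vorbisenc ! queue ! mux.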
03:08:06 gst-launch ximagesrc name=videosource startx=0 starty=0 ! ffmpegcolorspace ! theoraenc ! oggmux ! filesink location=test.ogg <-- this does video only from your desktop (if you don't have istanbul installed, use this)
03:08:18 note that the ffmpegcolorspace is not ffmpeg
03:08:54 herlo: do you know if gstreamer will mix in the audio as well?
03:09:00 That's important.
03:09:06 I know istanbul can do that.
03:09:07 rossand: absolutely
03:09:07 * mchua has to pop out in the next few minutes; battery is making angry red blinky lights
03:09:09 I have another
03:09:12 I'll chair you folks so you can keep the meeting going just in case.
03:09:14 Is this a "hack over the weekend, reconvene sometime on Monday to figure out what to do before and during the FAD to finish this" thing?
03:09:14 herlo: ok cool.
03:09:14 rossand: istanbul uses gstreamer
03:09:29 I've been looking at the code for a while
03:09:30 herlo: ah, that explains it then. ;-)
03:09:36 #chair rossand zxiiro herlo dgilmore jds2001
03:09:36 Current chairs: dgilmore herlo jds2001 mchua rossand zxiiro
03:10:07 #note we need to work with options in fedora only
03:10:27 can gstreamer let you both record, and preview the video you're recording at the same time?
03:10:27 dgilmore: I believe all of the ones I put up are in fedora only
03:10:29 dgilmore: yep. This should enable us to get there.
03:10:34 zxiiro: yes, it will
03:10:38 there's a tee option
03:10:42 herlo: just recording it in the meeting notes
03:10:44 the ! works like a | in shell
03:10:57 ok
03:11:12 * rossand steps away for a moment... packing for a flight that's far too early in the morning
03:11:25 so you can ! tee filesink location=afile.ogg ! shout2send ...
03:13:30 gst-launch is part of what package?
03:14:05 gstreamer-tools
03:15:27 oh i do have it, for some reason archlinux calls it gst-launch-0.10
03:15:39 yeah, it's named differently in different distros
03:15:51 I think the -0.10 name is what it's called upstream
03:16:12 note: next FAD check-in meeting is Monday, 1700-1800 UTC (noon-1pm EST) in this channel, so I'll read backlog when I get to a building later tonight (writing this from a now-extremely-freezing car) and try to catch folks before the Monday meeting to find out where we stand, but it sounds like we're assuming remote sprinting for the freeseer folks atm.
03:16:17 * mchua battery on verge of death, heading out
03:16:17 (thanks, folks - this is awesome to watch... if we can get around the codecs issue, it's a huge win all around)
03:16:37 sounds like it's really easy to fix
03:18:54 herlo: can you explain your gst-launch parameters a bit? i'm not too sure how it works
03:19:21 zxiiro: sure
03:19:44 so the way that it works is you have to think about it similar to pipes in bash
03:19:59 so each component is either a source, a filter, or an output
03:20:08 so the istximagesrc name=videosource startx=0 starty=0
03:20:33 says, grab the x screen and call it videosource. Start the video at the top left of the screen
03:21:03 the next one says, convert the video from yuv to rgb iirc, I can't recall, but I know it's required
03:21:23 then you encode the video in theora, then put it in the ogg container
03:21:31 finally, write it to the file test.ogg
03:21:36 zxiiro: make better sense?
03:22:03 herlo: so if i run it, should it capture my desktop?
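(Sketch, not from the meeting: herlo's tee remark above is terse; a fuller example of recording and previewing at the same time could look like this, with a queue on each branch after the tee. The device path and filename are placeholders.)

    # hypothetical preview-while-recording pipeline: one branch to an X window, one to an Ogg file
    gst-launch v4l2src device=/dev/video0 ! tee name=t \
      t. ! queue ! ffmpegcolorspace ! xvimagesink \
      t. ! queue ! ffmpegcolorspace ! theoraenc ! oggmux ! filesink location=test.ogg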
03:22:10 dgilmore: I think so
03:22:21 * herlo has a python program that does the same on the commandline atm
03:22:32 a bit, so if i want to record from /dev/video0
03:22:34 it's really basic and doesn't control the timeframe...
03:22:42 zxiiro: what is /dev/video0?
03:22:53 you have to think about it slightly differently...
03:22:58 either a v4l device or a v4l2 device
03:23:04 there are a million sources
03:23:10 zxiiro: check out v4lsrc
03:23:14 or something like that
03:23:20 you'll want to learn gst-inspect :)
03:24:43 zxiiro: start with this http://www.cin.ufpe.br/~cinlug/wiki/index.php/Introducing_GStreamer
03:25:18 herlo: /dev/videox is what you get for webcams, tv tuner cards etc
03:25:55 dgilmore: yeah, but it's not called /dev/videox in gstreamer, and thus you have to kind of extrapolate a little...
03:26:33 speaking of v4l and all this stuff, what piece(s) of hardware are required to make this solution work?
03:26:52 a camera, obviously
03:27:10 connected via firewire or usb or whatnot is my guess
03:28:08 jds2001: we've been using this to record fosslc events http://www.epiphan.com/products/frame-grabbers/vga2usb/
03:29:21 basically, a 2-way vga splitter: plug one end into the presenter's laptop, 1 into the projector for the audience, and 1 into the recording laptop to record the slides
03:30:00 and this device tags the vga, turns it into a usb signal, and outputs it using v4l
03:30:14 s/tags/takes
03:31:17 and for audio we use wireless mics, plug one into the mic jack on the recording laptop, and the other on the presenter
03:31:37 zxiiro: are we planning to stream this video or just capture it to make it available via download later?
03:32:29 the software just records, it does not currently stream
03:32:34 ok
03:33:55 zxiiro: I want to see the vga2usb device thing, that looks cool
03:34:15 will we have one at the fad?
03:34:24 not if they don't come
03:34:26 :)
03:34:37 though i guess we could buy one or something
03:35:30 we would be bringing these with us if we attend so that we could show it to you in action
03:36:06 it's actually pretty cool, i thought it was pretty amazing the first time i saw it
03:40:46 yeah, I've seen framegrabbers ...
03:41:18 in fact, my buddy has this exact one. I just asked him
04:08:00 ok i figured out how to use gstreamer a bit and can record my desktop
04:08:15 now to the real problem, getting it to work with the framegrabber
04:08:56 i know it needs specific settings set, like in mencoder i needed to set the image format with outfmt=bgr24
04:09:19 how would i do that with gst-launch?
04:15:19 * dgilmore goes and grabs a webcam to try to work it out
04:22:36 zxiiro: gst-launch v4l2src ! xvimagesink
04:22:45 that opened up my webcam
04:23:58 unfortunately that does not open up the vga2usb device, i actually get this error: "Not enough buffers. We got 1, we want at least 2"
04:24:07 herlo: can you bring it?
04:28:28 gst-launch v4l2src ! ffmpegcolorspace ! theoraenc ! oggmux ! filesink location=test.ogg
04:28:35 zxiiro: does that work?
04:29:15 no, but i figured it out :)
04:29:25 :) awesome
04:29:29 what was it?
04:29:52 the vga2usb module needs to be loaded with the num_frame_buffers=2 option
04:30:01 ahh
04:30:24 just did a quick test with gst-launch v4lsrc ! xvimagesink and i see video, so it looks promising
04:32:26 anyway i need to get some sleep. I'll play with this some more tomorrow.
04:33:51 good night all
04:33:53 o/
04:35:11 cool
04:35:18 ahh, end the meeting
04:35:24 #endmeeting
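(Sketch, not from the log: the fix zxiiro describes above would normally be applied by reloading the framegrabber driver with that option; the module name vga2usb is assumed from the Epiphan device name, so confirm it with lsmod first.)

    # reload the driver with the option zxiiro mentions
    modprobe -r vga2usb
    modprobe vga2usb num_frame_buffers=2
    # or make it persistent across reboots
    echo "options vga2usb num_frame_buffers=2" > /etc/modprobe.d/vga2usb.conf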