project statement | download |
---|---|
PIAVE is a render and effect engine, developed and currently running on GNU/Linux.
PIAVE can be used in various ways. You can link your
own project against libpiave and use its features; the Modesto GUI does this.
As a standalone application,
PIAVE can read the description of a project in VEML format (XML)
and start rendering. In server mode, you can connect a client
(e.g. a GUI) to PIAVE via a unix or tcp socket and control the engine
remotely; KdenLive uses this feature (a minimal client sketch is shown below).
PIAVE is part of the Modesto project; check it out at modesto/sourceforge. A very nice GUI frontend can be found here: KdenLive (or at kdenlive/sourceforge). In case of problems, bugs, or comments, please use the KdenLive SourceForge mailing list kdenlive-devel@lists.sourceforge.net and have a look at the archives.
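To give an idea of what the server mode above looks like from the client side, here is a minimal sketch. Only the socket plumbing is real; the socket path and the idea of pushing a raw VEML document over the connection are assumptions for illustration, not PIAVE's actual protocol (the PIAVE and KdenLive sources are the reference for that).

    // Purely illustrative client sketch for PIAVE's server mode.
    // "/tmp/piave.sock" and the message format are hypothetical.
    #include <sys/socket.h>
    #include <sys/un.h>
    #include <unistd.h>
    #include <cstdio>
    #include <cstring>
    #include <string>

    int main()
    {
        const char *path = "/tmp/piave.sock";        /* hypothetical socket path */
        int fd = socket( AF_UNIX, SOCK_STREAM, 0 );
        if ( fd < 0 ) { perror( "socket" ); return 1; }

        sockaddr_un addr;
        std::memset( &addr, 0, sizeof( addr ) );
        addr.sun_family = AF_UNIX;
        std::strncpy( addr.sun_path, path, sizeof( addr.sun_path ) - 1 );

        if ( connect( fd, (sockaddr *) &addr, sizeof( addr ) ) < 0 ) {
            perror( "connect" );
            return 1;
        }

        /* hand the engine a (tiny) VEML project description */
        std::string veml =
            "<storyboard>\n"
            "  <scene duration=\"0.52\">\n"
            "    <file name=\"/scratch/samples/4x3-PAL-1.dv\" inpoint=\"0\" />\n"
            "  </scene>\n"
            "</storyboard>\n";
        if ( write( fd, veml.c_str(), veml.size() ) < 0 )
            perror( "write" );

        close( fd );
        return 0;
    }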
Latest version (Mar 2004): piave-0.2.4.tar.gz
Installation instructions: INSTALL
HTML source documentation: piave v0.2.3
CVS repository: modesto CVS
news |
---|
introduction |
---|
PIAVE is a C++ library. The main objects of PIAVE are the class "StoryBoard"
and the plugin manager "PluginMgr". A StoryBoard represents the root of the
render tree. The class itself is actually very simple; only the available
collection of nodes, effects, transitions, etc. makes it interesting. Each node
in the render tree contains inputs and effects. Unary nodes contain a single
input and a single effect. Binary nodes contain two inputs A+B and a single
effect which operates on both inputs; this can be a transition from A to B or
anything else that involves two video inputs. Inputs can either be video files
or other nodes. This way you can construct a complete render tree. In principle
it is possible to construct any kind of production this way. To simplify the
tree, it is possible to split up a production into separate scenes with no
overlapping effects.

For many purposes it can be useful to print out the render tree (e.g. for
debugging or saving to a file). The natural format to do that is XML. Here is
an example of a very simple project that just joins several chunks from input
files. The object hierarchy in memory is translated almost 1:1 to XML:

    <storyboard>
      <scene duration="0.52">
        <file name="/scratch/samples/4x3-PAL-1.dv" inpoint="0" />
      </scene>
      <scene duration="0.36">
        <file name="/scratch/samples/4x3-PAL-4.dv" inpoint="0.52" />
      </scene>
      <scene duration="0.44">
        <file name="/scratch/samples/4x3-PAL-2.dv" inpoint="0.88" />
      </scene>
    </storyboard>

A more complex example is the transition from one input to another with an
alpha blend. Here is the XML translation for a single-scene setup:

    <storyboard>
      <scene duration="10.0">
        <effect name="AlphaBlend" inpoint="4.0" outpoint="6.5">
          <input name="A" start="0.0" duration="6.5">
            <file name="/scratch/samples/4x3-PAL-1.dv" inpoint="0" />
          </input>
          <input name="B" start="4.0" duration="6.0">
            <file name="/scratch/samples/4x3-PAL-5.dv" inpoint="23.7" />
          </input>
          <fade>
            <keyframe time="0.0" value="0.0" />
            <keyframe time="1.0" value="1.0" />
          </fade>
        </effect>
      </scene>
    </storyboard>

The inpoint/outpoint/start times are always relative to the local time of the
parent node. Time values for keyframes are always relative to the duration
(i.e. outpoint minus inpoint) of the effect. The example would show 4 seconds
of input A, then make a 2.5 second crossfade to input B, and show 3.5 more
seconds of input B.

The plugin manager handles plugins, of course. PIAVE knows input and output
plugins, which can be seen as "file formats", and codec plugins, which handle
specific video codecs. Effects are plugins as well.
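To make the timing rules concrete, here is a small, self-contained sketch (not libpiave code) that resolves the "fade" parameter of the AlphaBlend example at a few scene times. It assumes exactly what the text describes: keyframe times in 0..1 are mapped onto the effect's inpoint/outpoint window, and values are interpolated linearly between keyframes.

    // Illustrative sketch only: not PIAVE's actual classes or API.
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    struct Keyframe { double time; double value; };   /* time is relative, 0..1 */

    /* Map a scene time onto the effect window and interpolate the keyframes. */
    double evalParam( const std::vector<Keyframe> &keys,
                      double inpoint, double outpoint, double sceneTime )
    {
        double rel = ( sceneTime - inpoint ) / ( outpoint - inpoint );
        if ( rel <= keys.front().time ) return keys.front().value;
        if ( rel >= keys.back().time )  return keys.back().value;
        for ( std::size_t i = 1; i < keys.size(); ++i )
            if ( rel <= keys[i].time ) {
                double f = ( rel - keys[i-1].time ) / ( keys[i].time - keys[i-1].time );
                return keys[i-1].value + f * ( keys[i].value - keys[i-1].value );
            }
        return keys.back().value;
    }

    int main()
    {
        /* the <fade> track of the example: 0.0 -> 1.0 across the effect */
        std::vector<Keyframe> fade = { { 0.0, 0.0 }, { 1.0, 1.0 } };
        double inpoint = 4.0, outpoint = 6.5;          /* effect window in the scene */

        const double times[] = { 2.0, 4.0, 5.25, 6.5, 8.0 };
        for ( double t : times )
            std::printf( "t=%.2fs  fade=%.2f\n",
                         t, evalParam( fade, inpoint, outpoint, t ) );
        /* before 4.0s only A is visible (fade 0), 4.0-6.5s crossfades,
         * from 6.5s on only B is visible (fade 1) */
        return 0;
    }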
PIAVE is supposed to be used by a GUI to handle the low-level video
manipulation. Since a central design goal was to completely separate GUI and
render engine, it is necessary to communicate the capabilities of the
available plugins to the GUI. PIAVE uses a very simple but powerful
"property tree" for this purpose. The configuration is very similar to an XML
tree. First, a plugin builds a tree describing its capabilities; the SDL output
plugin, for example, does something like this:

    PropertyNode * caps = new PropertyNode( "outstream" );
    caps->addAttr( "name", getName() );
    PropertyNode * visual = caps->addDaughter( "screen" );
    PropertyNode * display = visual->addDaughter( "X11" );
    display->addAttr( "visual", "xv" );
    display->addAttr( "fallback", "rgb" );
    display = visual->addDaughter( "X11" );
    display->addAttr( "visual", "rgb" );
    PropertyNode * about = caps->addDaughter( "about" );
    about->setContent( "Use SDL to display video. Uses xv overlay if available.\n" );

Translated to XML this would look like:

    <outstream name="SDLStream">
      <screen>
        <X11 visual="xv" fallback="rgb" />
        <X11 visual="rgb" />
      </screen>
      <about>
        Use SDL to display video. Uses xv overlay if available.
      </about>
    </outstream>

And second, somewhere in the GUI or whatever tool you are implementing to work
with libpiave, you can use something like the following to get a plugin that
provides output to the screen. This can be the SDL plugin from above, or any
other plugin that provides the requested features, e.g. a plugin that makes
direct use of X extensions to create a YUV overlay.

    /* setup request */
    PropertyNode * screen_caps = new PropertyNode( "outstream" );
    PropertyNode * screen = screen_caps->addDaughter( "screen" );
    PropertyNode * display = screen->addDaughter( "X11" );
    display->addAttr( "visual", "xv" );

    /* try to find a plugin with the specified capabilities */
    Plugin * p = PluginMgr::find( screen_caps );
    if ( p ) {
        /* we found a plugin that provides screen output.
         * now we can reuse the screen property to instantiate
         * a plugin of the requested kind. */
        screen->addAttr( "width", 720 );
        screen->addAttr( "height", 576 );
        PluginBase * b = p->newElement( screen );
    }

The GUI can ask the plugin manager to report the available plugins and to ask
for the capabilities of each plugin. An effect plugin reports which parameters
it has. Possible parameters include simple integer or floating point numbers or
more complex things like "box on screen" or "color". It is the responsibility
of the GUI to display dialogs for these parameters; the visualization of a
"box on screen" might be very different from that of four generic floats. This
"property tree" can simply be translated to XML. For a simple two-track effect
which can render an alpha blend of two video sources it looks like this:

    <effect name="AlphaBlend">
      <input name="A" video="yes" audio="yes" />
      <input name="B" video="yes" audio="yes" />
      <parameter max="1.0" min="0.0" name="fade" type="double">
        Controls the transparency of video B. At 0.0, video A is completely
        on screen. At 1.0, video B is completely visible.
      </parameter>
      <preset name="crossfade">
        <fade>
          <keyframe time="0.0" value="0.0" />
          <keyframe time="1.0" value="1.0" />
        </fade>
      </preset>
      <preset name="overlay">
        <fade>
          <keyframe time="0.0" value="0.5" />
          <keyframe time="1.0" value="0.5" />
        </fade>
      </preset>
      <preset name="inandout">
        <fade>
          <keyframe time="0.0" value="0.0" />
          <keyframe time="0.2" value="1.0" />
          <keyframe time="0.8" value="1.0" />
          <keyframe time="1.0" value="0.0" />
        </fade>
      </preset>
      <about>
        This binary operator can be used to overlay still images, fade in
        text, cross fade transitions, etc, etc...
      </about>
    </effect>

You can see that parameters are not static: you can set keyframes with values,
and in between the keyframes PIAVE will interpolate the parameter. Times are
given relative to the length of the effect. The separation of GUI and render
engine is brought to a maximum with KdenLive, where PIAVE is used as a
completely separate process, a standalone program that executes the commands
sent by the GUI.
It would also be possible to link against libpiave and use the features in a more direct way.
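The request/capability matching that PluginMgr::find performs on these property trees can be pictured as a recursive subtree test: every node and attribute in the request must be present in the plugin's capability tree. The following is a self-contained sketch of that idea using a toy node type; it is an assumption about the general principle, not PIAVE's actual PropertyNode or PluginMgr code.

    // Toy sketch of capability matching (not PIAVE's implementation).
    #include <map>
    #include <string>
    #include <vector>

    struct Node {
        std::string name;
        std::map<std::string, std::string> attrs;
        std::vector<Node> daughters;
    };

    /* A request matches a capability node if the names agree, every requested
     * attribute is present with the same value, and every requested daughter
     * is matched by some capability daughter. */
    static bool matches( const Node &request, const Node &caps )
    {
        if ( request.name != caps.name )
            return false;
        for ( const auto &a : request.attrs ) {
            auto it = caps.attrs.find( a.first );
            if ( it == caps.attrs.end() || it->second != a.second )
                return false;
        }
        for ( const Node &rd : request.daughters ) {
            bool found = false;
            for ( const Node &cd : caps.daughters )
                if ( matches( rd, cd ) ) { found = true; break; }
            if ( !found )
                return false;
        }
        return true;
    }

    int main()
    {
        /* capability tree of the SDL output plugin from the example above */
        Node caps;
        caps.name = "outstream";
        caps.attrs["name"] = "SDLStream";
        Node screen;  screen.name = "screen";
        Node xv;      xv.name = "X11";  xv.attrs["visual"] = "xv";  xv.attrs["fallback"] = "rgb";
        Node rgb;     rgb.name = "X11"; rgb.attrs["visual"] = "rgb";
        screen.daughters.push_back( xv );
        screen.daughters.push_back( rgb );
        caps.daughters.push_back( screen );

        /* request: any outstream that can put an xv overlay on an X11 screen */
        Node request;
        request.name = "outstream";
        Node rscreen; rscreen.name = "screen";
        Node rx11;    rx11.name = "X11"; rx11.attrs["visual"] = "xv";
        rscreen.daughters.push_back( rx11 );
        request.daughters.push_back( rscreen );

        return matches( request, caps ) ? 0 : 1;     /* exit code 0 means "found" */
    }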
screen shots |
---|
features | |
---|---|
90% | runtime-loadable plugin interface for iostreams and effects |
90% | video input - raw DV files at present |
95% | exporting to raw DV files (PAL only) - no mixed-format input |
90% | still images are handled by gdk-pixbuf; most common formats are supported (not available in YUV mode) |
80% | transitions and effects are available, but since they are not yet supported by KdenLive, not all parameters are implemented. The transitions and filters are in a proof-of-principle state, but what is there works nicely. The infrastructure is tested and working. |
90% | communication with KdenLive via VEML is working and the basic functionality is implemented. With this GUI it is possible to cut and arrange DV files. Capabilities are not yet reported. Effects and effect parameters are not yet specified. |
todo (help!) | |
---|---|
0% | ieee1394 to control DV cams and grab video |
10% | ffmpeg / xine-lib / aRts::video - support decoding of other file formats |
10% | ffmpeg - support encoding of other file formats |
0% | wav / mp3 / etc. support for sound tracks |
20% | handle sound independently of video |
0% | field-based effect rendering |
links |
---|
documentation