\input texinfo @c -*- texinfo -*-

@settitle FFserver Documentation
@titlepage
@sp 7
@center @titlefont{FFserver Documentation}
@sp 3
@end titlepage


@chapter Introduction

@c man begin DESCRIPTION
FFserver is a streaming server for both audio and video. It supports
several live feeds, streaming from files and time shifting on live feeds
(you can seek to positions in the past on each live feed, provided you
specify a big enough feed storage in ffserver.conf).

This documentation covers only the streaming aspects of ffserver /
ffmpeg. Questions about ffmpeg parameters, codec choices and the like
are not covered here. Read @file{ffmpeg-doc.html} for more information.
@c man end

@chapter QuickStart

[Contributed by Philip Gladstone, philip-ffserver at gladstonefamily dot net]

@section What can this do?

When properly configured and running, you can capture video and audio in real
time from a suitable capture card, and stream it out over the Internet to
either Windows Media Player or RealAudio player (with some restrictions).

It can also stream from files, though that is currently broken. Very often, a
web server can be used to serve up the files just as well.

It can stream prerecorded video from .ffm files, though it is somewhat tricky
to make it work correctly.

@section What do I need?

I use Linux on a 900 MHz Duron with a cheapo Bt848-based TV capture card. I'm
using stock Linux 2.4.17 with the stock drivers. [Actually that isn't true,
I needed some special drivers for my motherboard-based sound card.]

I understand that FreeBSD systems work just fine as well.

@section How do I make it work?

First, build the kit. It *really* helps to have installed LAME first. Then,
when you run ./configure for ffserver, make sure that the --enable-mp3lame
flag is turned on.
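
As a rough sketch (assuming LAME is already installed somewhere ./configure
can find it), the build boils down to:

@example
./configure --enable-mp3lame
make
@end example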

LAME is important as it allows streaming of audio to Windows Media Player.
Don't ask why the other audio types do not work.

As a simple test, just run the following two command lines (assuming that you
have a V4L video capture card):

@example
./ffserver -f doc/ffserver.conf &
./ffmpeg http://localhost:8090/feed1.ffm
@end example

At this point you should be able to go to your Windows machine and fire up
Windows Media Player (WMP). Go to Open URL and enter

@example
http://<linuxbox>:8090/test.asf
@end example

You should see (after a short delay) video and hear audio.

WARNING: trying to stream test1.mpg doesn't work with WMP as it tries to
transfer the entire file before starting to play. The same is true of AVI
files.

@section What happens next?

You should edit the ffserver.conf file to suit your needs (in terms of
frame rates, etc.). Then install ffserver and ffmpeg, write a script to
start them up, and off you go.
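
To give a feel for what that involves, here is a minimal configuration
sketch. The directive names are the standard ffserver.conf ones, but the
values, file names and sizes below are only illustrative; the supplied
doc/ffserver.conf is the authoritative starting point:

@example
# Port on which ffserver listens for HTTP requests
Port 8090
MaxClients 100
MaxBandwidth 1000

# The feed that ffmpeg pushes into
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 200K
</Feed>

# The stream that players fetch
<Stream test.asf>
Feed feed1.ffm
Format asf
VideoFrameRate 15
VideoSize 352x288
AudioBitRate 32
AudioSampleRate 22050
</Stream>
@end example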

@section Troubleshooting

@subsection I don't hear any audio, but video is fine

Maybe you didn't install LAME, or didn't get your ./configure statement
right. Check the ffmpeg output to see if a line referring to mp3 is present.
If not, then your configuration was incorrect. If it is, then maybe your
wiring is not set up correctly. Maybe the sound card is not getting data
from the right input source. Maybe you have a really awful audio interface
(like I do) that only captures in stereo and also requires that one channel
be flipped. If you are one of these people, then export 'AUDIO_FLIP_LEFT=1'
before starting ffmpeg.
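
For example (reusing the feed command from the quick start above):

@example
export AUDIO_FLIP_LEFT=1
./ffmpeg http://localhost:8090/feed1.ffm
@end example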

@subsection The audio and video lose sync after a while.

Yes, they do.

@subsection After a long while, the video update rate goes way down in WMP.

Yes, it does. Who knows why?

@subsection WMP 6.4 behaves differently to WMP 7.

Yes, it does. Any thoughts on this would be gratefully received. These
differences extend to embedding WMP into a web page. [There are two
different object IDs that you can use: one of them -- the old one -- cannot
play very well, and the new one works well (both on the same system).
However, I suspect that the new one is not available unless you have
installed WMP 7.]

@section What else can it do?

You can replay video from .ffm files that were recorded earlier. However,
there are a number of caveats, including the fact that the ffserver
parameters must match the original parameters used to record the file. If
they do not, then ffserver deletes the file before recording into it. (Now
that I write this, it seems broken.)

You can fiddle with many of the codec choices and encoding parameters, and
there are a bunch more parameters that you cannot control. Post a message
to the mailing list if there are some 'must have' parameters. Look in
ffserver.conf for a list of the currently available controls.
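
For illustration, the encoding controls live inside the @code{<Stream>}
sections of ffserver.conf; the directive names below are the standard ones,
but the values are made up:

@example
# Inside a <Stream> section:
VideoBitRate 256
VideoGopSize 12
VideoSize 352x288
AudioBitRate 32
AudioChannels 1
AudioSampleRate 22050
@end example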

It will automatically generate the .ASX or .RAM files that are often used
in browsers. These files are actually redirections to the underlying .ASF
or .RM file. The reason for this is that the browser often fetches the
entire file before starting up the external viewer. The redirection files
are very small and can be transferred quickly. [The stream itself is
often 'infinite' and thus the browser tries to download it and never
finishes.]
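
For the curious, an .ASX redirection is just a few lines of markup pointing
at the real stream, roughly along these lines (the exact text that ffserver
emits may differ slightly):

@example
<ASX version="3.0">
  <Entry>
    <Ref href="http://<linuxbox>:8090/test.asf"/>
  </Entry>
</ASX>
@end example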

@section Tips

* When you connect to a live stream, most players (WMP, RA, etc.) want to
buffer a certain number of seconds of material so that they can display the
signal continuously. However, ffserver (by default) starts sending data
in real time. This means that there is a pause of a few seconds while the
buffering is being done by the player. The good news is that this can be
cured by adding '?buffer=5' to the end of the URL. This says that the
stream should start 5 seconds in the past -- and so the first 5 seconds
of the stream are sent as fast as the network will allow. It will then
slow down to real time. This noticeably improves the startup experience.

You can also add a 'Preroll 15' statement into the ffserver.conf that will
add 15 seconds of prebuffering to all requests that do not otherwise
specify a time. In addition, ffserver will skip frames until a key_frame
is found. This further reduces the startup delay by not transferring data
that will be discarded. (See the example after these tips.)

* You may want to adjust the MaxBandwidth setting in ffserver.conf to limit
the amount of bandwidth consumed by live streams.
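
To make these two tips concrete, here is a sketch (the 5-second buffer, the
15-second preroll and the bandwidth cap are all illustrative values):

@example
# Ask for 5 seconds of prebuffered data on a single request:
http://<linuxbox>:8090/test.asf?buffer=5

# Or set defaults in ffserver.conf -- Preroll goes inside the relevant
# <Stream> section, MaxBandwidth is global and is given in kbit/s:
Preroll 15
MaxBandwidth 1000
@end example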

@section Why does the ?buffer / Preroll stop working after a time?

It turns out that (on my machine at least) the number of frames successfully
grabbed is marginally less than the number that ought to be grabbed. This
means that the timestamp in the encoded data stream gets behind real time.
So if you say 'Preroll 10', then when the stream gets 10 or more seconds
behind, there is no preroll left.

Fixing this requires a change in the internals of how timestamps are
handled.

@section Does the @code{?date=} stuff work?

Yes (subject to the caution above). Also note that whenever you start
ffserver, it deletes the ffm file (if any parameters have changed), thus
wiping out what you had recorded before.

The format of the @code{?date=xxxxxx} is fairly flexible. You should use one
of the following formats (the 'T' is literal):

@example
* YYYY-MM-DDTHH:MM:SS (localtime)
* YYYY-MM-DDTHH:MM:SSZ (UTC)
@end example

You can omit the YYYY-MM-DD, and then it refers to the current day. However,
note that @samp{?date=16:00:00} refers to 4PM on the current day -- this may
be in the future and so is unlikely to be useful.

You use this by adding ?date= to the end of the URL for the stream.
For example: @samp{http://localhost:8080/test.asf?date=2002-07-26T23:05:00}.

@chapter Invocation
@section Syntax
@example
@c man begin SYNOPSIS
ffserver [options]
@c man end
@end example

@section Options
@c man begin OPTIONS
@table @option
@item -L
print the license
@item -h
print the help
@item -f configfile
use @file{configfile} instead of @file{/etc/ffserver.conf}
@end table
@c man end

@ignore

@setfilename ffserver
@settitle FFserver video server

@c man begin SEEALSO
ffmpeg(1), ffplay(1) and the HTML documentation of @file{ffmpeg}.
@c man end

@c man begin AUTHOR
Fabrice Bellard
@c man end

@end ignore

@bye