NAME
ffserver - FFserver video server
SYNOPSIS
ffserver [options]
DESCRIPTION
FFserver is a streaming server for both audio and video. It supports
several live feeds, streaming from files and time shifting on live
feeds (you can seek to positions in the past on each live feed,
provided you specify a big enough feed storage in ffserver.conf).
FFserver runs in daemon mode by default; that is, it puts itself in the
background and detaches from its TTY, unless it is launched in debug
mode or a NoDaemon option is specified in the configuration file.
This documentation covers only the streaming aspects of ffserver /
ffmpeg. Questions about ffmpeg parameters, codecs and so on are not
covered here; read ffmpeg-doc.html for more information.
How does it work?
FFserver receives prerecorded files or FFM streams from some ffmpeg
instance as input, then streams them over RTP/RTSP/HTTP.
An ffserver instance will listen on some port as specified in the
configuration file. You can launch one or more instances of ffmpeg and
send one or more FFM streams to the port where ffserver is expecting to
receive them. Alternately, you can make ffserver launch such ffmpeg
instances at startup.
Input streams are called feeds, and each one is specified by a <Feed>
section in the configuration file.
For each feed you can have different output streams in various formats,
each one specified by a <Stream> section in the configuration file.
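To make this concrete, here is a minimal configuration sketch; the
port, feed name, file path and stream name are only illustrative
placeholders, not recommendations:
Port 8090
BindAddress 0.0.0.0
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 200K
</Feed>
<Stream test.asf>
Feed feed1.ffm
Format asf
</Stream>
With a configuration like this, ffmpeg would send its output to
http://localhost:8090/feed1.ffm and clients would connect to
http://localhost:8090/test.asf.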
Status stream
FFserver supports an HTTP interface which exposes the current status of
the server.
Simply point your browser to the address of the special status stream
specified in the configuration file.
For example if you have:
<Stream status.html>
Format status
# Only allow local people to get the status
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
</Stream>
then the server will post a page with the status information when the
special stream status.html is requested.
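For instance, with the section above and ffserver listening on port
8090 (the port used throughout the examples below), the status page
would be available at:
http://localhost:8090/status.html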
What can this do?
When properly configured and running, you can capture video and audio
in real time from a suitable capture card, and stream it out over the
Internet to either Windows Media Player or RealAudio player (with some
restrictions).
It can also stream from files, though that is currently broken. Very
often, a web server can be used to serve up the files just as well.
It can stream prerecorded video from .ffm files, though it is somewhat
tricky to make it work correctly.
What do I need?
I use Linux on a 900 MHz Duron with a cheapo Bt848 based TV capture
card. I’m using stock Linux 2.4.17 with the stock drivers. [Actually
that isn’t true, I needed some special drivers for my motherboard-based
sound card.]
I understand that FreeBSD systems work just fine as well.
How do I make it work?
First, build the kit. It *really* helps to have installed LAME first.
Then, when you run ./configure for ffserver, make sure that the
"--enable-libmp3lame" flag is turned on.
LAME is important as it allows for streaming audio to Windows Media
Player. Don’t ask why the other audio types do not work.
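As a sketch, the build might look like this (only --enable-libmp3lame
matters here; any other configure flags are up to you):
./configure --enable-libmp3lame
make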
As a simple test, just run the following two command lines where
INPUTFILE is some file which you can decode with ffmpeg:
./ffserver -f doc/ffserver.conf &
./ffmpeg -i INPUTFILE http://localhost:8090/feed1.ffm
At this point you should be able to go to your Windows machine and fire
up Windows Media Player (WMP). Go to Open URL and enter
http://<linuxbox>:8090/test.asf
You should (after a short delay) see video and hear audio.
WARNING: trying to stream test1.mpg doesn’t work with WMP as it tries
to transfer the entire file before starting to play. The same is true
of AVI files.
What happens next?
You should edit the ffserver.conf file to suit your needs (in terms of
frame rates etc). Then install ffserver and ffmpeg, write a script to
start them up, and off you go.
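A startup script can be as simple as the following sketch; the
configuration path and the feed URL are assumptions carried over from
the test above:
#!/bin/sh
# Start ffserver with the edited configuration, in the background.
ffserver -f /etc/ffserver.conf &
# Give it a moment to open its listening port.
sleep 2
# Feed it; replace INPUTFILE with a real source (file or capture device).
ffmpeg -i INPUTFILE http://localhost:8090/feed1.ffm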
Troubleshooting
I don't hear any audio, but video is fine.
Maybe you didn’t install LAME, or got your ./configure statement wrong.
Check the ffmpeg output to see if a line referring to MP3 is present.
If not, then your configuration was incorrect. If it is, then maybe
your wiring is not set up correctly. Maybe the sound card is not
getting data from the right input source. Maybe you have a really awful
audio interface (like I do) that only captures in stereo and also
requires that one channel be flipped. If you are one of these people,
then export ’AUDIO_FLIP_LEFT=1’ before starting ffmpeg.
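In other words, something like this before launching ffmpeg (using the
same test command as above):
export AUDIO_FLIP_LEFT=1
./ffmpeg -i INPUTFILE http://localhost:8090/feed1.ffm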
The audio and video lose sync after a while.
Yes, they do.
After a long while, the video update rate goes way down in WMP.
Yes, it does. Who knows why?
WMP 6.4 behaves differently from WMP 7.
Yes, it does. Any thoughts on this would be gratefully received. These
differences extend to embedding WMP into a web page. [There are two
object IDs that you can use: The old one, which does not play well, and
the new one, which does (both tested on the same system). However, I
suspect that the new one is not available unless you have installed WMP
7].
What else can it do?
You can replay video from .ffm files that was recorded earlier.
However, there are a number of caveats, including the fact that the
ffserver parameters must match the original parameters used to record
the file. If they do not, then ffserver deletes the file before
recording into it. (Now that I write this, it seems broken).
You can fiddle with many of the codec choices and encoding parameters,
and there are a bunch more parameters that you cannot control. Post a
message to the mailing list if there are some ’must have’ parameters.
Look in ffserver.conf for a list of the currently available controls.
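As an illustration, a <Stream> section exposing some of those controls
might look like this; the values are arbitrary examples, not
recommendations:
<Stream test.asf>
Feed feed1.ffm
Format asf
VideoBitRate 256
VideoFrameRate 25
VideoSize 320x240
VideoGopSize 12
AudioBitRate 64
AudioSampleRate 22050
</Stream>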
It will automatically generate the ASX or RAM files that are often used
in browsers. These files are actually redirections to the underlying
ASF or RM file. The reason for this is that the browser often fetches
the entire file before starting up the external viewer. The redirection
files are very small and can be transferred quickly. [The stream itself
is often ’infinite’ and thus the browser tries to download it and never
finishes.]
Tips
* When you connect to a live stream, most players (WMP, RA, etc) want
to buffer a certain number of seconds of material so that they can
display the signal continuously. However, ffserver (by default) starts
sending data in realtime. This means that there is a pause of a few
seconds while the buffering is being done by the player. The good news
is that this can be cured by adding a ’?buffer=5’ to the end of the
URL. This means that the stream should start 5 seconds in the past --
and so the first 5 seconds of the stream are sent as fast as the
network will allow. It will then slow down to real time. This
noticeably improves the startup experience.
You can also add a ’Preroll 15’ statement into the ffserver.conf that
will add the 15 second prebuffering on all requests that do not
otherwise specify a time. In addition, ffserver will skip frames until
a key_frame is found. This further reduces the startup delay by not
transferring data that will be discarded (see the example after these
tips).
* You may want to adjust the MaxBandwidth in the ffserver.conf to limit
the amount of bandwidth consumed by live streams.
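Putting both tips together (stream name and port carried over from the
test above, numbers purely illustrative), a client would request:
http://localhost:8090/test.asf?buffer=5
and the corresponding ffserver.conf statements would look like:
# Overall cap on streaming bandwidth, in kbit/s.
MaxBandwidth 1000
<Stream test.asf>
Feed feed1.ffm
Format asf
# Start each new client 15 seconds in the past.
Preroll 15
</Stream>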
Why does the ?buffer / Preroll stop working after a time?
It turns out that (on my machine at least) the number of frames
successfully grabbed is marginally less than the number that ought to
be grabbed. As a result, the timestamps in the encoded data stream fall
behind real time. So if you say ’Preroll 10’, then once the stream gets
10 or more seconds behind, there is no Preroll left.
Fixing this requires a change in the internals of how timestamps are
handled.
Does the "?date=" stuff work?
Yes (subject to the limitation outlined above). Also note that whenever
you start ffserver, it deletes the ffm file (if any parameters have
changed), thus wiping out what you had recorded before.
The format of the "?date=xxxxxx" is fairly flexible. You should use one
of the following formats (the ’T’ is literal):
* YYYY-MM-DDTHH:MM:SS (localtime)
* YYYY-MM-DDTHH:MM:SSZ (UTC)
You can omit the YYYY-MM-DD, and then it refers to the current day.
However note that ?date=16:00:00 refers to 16:00 on the current day --
this may be in the future and so is unlikely to be useful.
You use this by adding the ?date= to the end of the URL for the stream.
For example: http://localhost:8080/test.asf?date=2002-07-26T23:05:00.
OPTIONS
Generic options
These options are shared amongst the ff* tools.
-L Show license.
-h, -?, -help, --help
Show help.
-version
Show version.
-formats
Show available formats.
The fields preceding the format names have the following meanings:
D Decoding available
E Encoding available
-codecs
Show available codecs.
The fields preceding the codec names have the following meanings:
D Decoding available
E Encoding available
V/A/S
Video/audio/subtitle codec
S Codec supports slices
D Codec supports direct rendering
T Codec can handle input truncated at random locations instead of
only at frame boundaries
-bsfs
Show available bitstream filters.
-protocols
Show available protocols.
-filters
Show available libavfilter filters.
-pix_fmts
Show available pixel formats.
-loglevel loglevel
Set the logging level used by the library. loglevel is a number or
a string containing one of the following values:
quiet
panic
fatal
error
warning
info
verbose
debug
Main options
-f configfile
Use configfile instead of /etc/ffserver.conf.
-n Enable no-launch mode. This option disables all the Launch
directives within the various <Stream> sections. FFserver will not
launch any ffmpeg instance, so you will have to launch them
manually.
-d Enable debug mode. This option increases log verbosity, directs log
messages to stdout and causes ffserver to run in the foreground
rather than as a daemon.
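For example, a quick way to try out a freshly edited configuration is
to combine -d with -f (the path is only an illustration):
ffserver -d -f /path/to/ffserver.conf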
SEE ALSO
ffmpeg(1), ffplay(1), the ffmpeg/doc/ffserver.conf example and the HTML
documentation of ffmpeg.
AUTHOR
Fabrice Bellard