The bit_rate field has type int64_t since commit
7404f3bdb9.
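For reference, a minimal sketch of handling the 64-bit field on the
caller side (the structure accessed here is an assumption, since the
message does not name which one the fix touches):

    #include <inttypes.h>
    #include <libavcodec/avcodec.h>

    static void log_bit_rate(AVCodecContext *avctx)
    {
        /* bit_rate is int64_t, so avoid truncating it into an int and
         * use the 64-bit printf specifier when logging it. */
        int64_t br = avctx->bit_rate;
        av_log(avctx, AV_LOG_INFO, "bit rate: %"PRId64" b/s\n", br);
    }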
Reviewed-by: Paul B Mahol <onemda@gmail.com>
Signed-off-by: Andreas Cadhalpun <Andreas.Cadhalpun@googlemail.com>
Currently, AVStream contains an embedded AVCodecContext instance, which
is used by demuxers to export stream parameters to the caller and by
muxers to receive stream parameters from the caller. It is also used
internally as the codec context that is passed to parsers.
In addition, it is widely used by callers as the decoding (when
demuxing) or encoding (when muxing) context, though this has been
officially discouraged since Libav 11.
There are multiple important problems with this approach:
- the fields in AVCodecContext are in general one of
* stream parameters
* codec options
* codec state
However, it's not clear which ones are which. It is consequently
unclear which fields a demuxer is allowed to set or a muxer is allowed
to read. This leads to erratic behaviour depending on whether or not
decoding or encoding is being performed (and whether it uses the
AVStream-embedded codec context).
- various synchronization issues arising from the fact that the same
context is used by several different APIs (muxers/demuxers,
parsers, bitstream filters and encoders/decoders) simultaneously, with
no clear rules for who can modify what, and with the different processes
typically being delayed with respect to each other.
- avformat_find_stream_info() making it necessary to support opening
and closing a single codec context multiple times, thus
complicating the semantics of freeing various allocated objects in the
codec context.
Those problems are resolved by replacing the AVStream embedded codec
context with a newly added AVCodecParameters instance, which stores only
the stream parameters exported by the demuxers or read by the muxers.
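As an illustration (a sketch of the intended caller-side workflow, not
part of the change itself), a demuxing application now reads the
exported parameters from AVStream.codecpar and sets up its own decoding
context from them:

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    /* Open a decoder for stream `idx` of an already opened AVFormatContext.
     * The stream's AVCodecParameters are only read here; the decoder gets
     * its own AVCodecContext, so demuxer and decoder share no state. */
    static AVCodecContext *open_decoder(AVFormatContext *fmt, int idx)
    {
        AVCodecParameters *par = fmt->streams[idx]->codecpar;
        const AVCodec *dec = avcodec_find_decoder(par->codec_id);
        AVCodecContext *avctx;

        if (!dec)
            return NULL;
        avctx = avcodec_alloc_context3(dec);
        if (!avctx)
            return NULL;
        /* Copy the demuxed stream parameters into the fresh context. */
        if (avcodec_parameters_to_context(avctx, par) < 0 ||
            avcodec_open2(avctx, dec, NULL) < 0) {
            avcodec_free_context(&avctx);
            return NULL;
        }
        return avctx;
    }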
Electronic Arts VP6 files may contain two video streams: one for the
primary video and another for the alpha mask. The file format uses
identical data structures for both streams.
Signed-off-by: Peter Ross <pross@xvid.org>
Signed-off-by: Michael Niedermayer <michaelni@gmx.at>
At least one FATE sample contains such chunks and happens to work simply
by accident (due to find_stream_info() swallowing the error).
CC: libav-stable@libav.org
According to its description, it is supposed to be the LCM of all the
frame durations. The usability of such a thing is vanishingly small,
especially since we cannot determine it with any amount of reliability.
Therefore get rid of it after the next bump.
Replace it with the average framerate where it makes sense.
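For callers that derived a per-frame duration from the removed field, a
hedged sketch of the replacement based on avg_frame_rate (the helper
itself is illustrative, only the AVStream/AVRational API is real):

    #include <libavutil/mathematics.h>
    #include <libavutil/rational.h>
    #include <libavformat/avformat.h>

    /* Derive a nominal per-frame duration (in stream time-base units)
     * from AVStream.avg_frame_rate; 0 means the framerate is unknown. */
    static int64_t nominal_frame_duration(const AVStream *st)
    {
        if (st->avg_frame_rate.num <= 0 || st->avg_frame_rate.den <= 0)
            return 0;
        /* duration = (1 / fps) rescaled into st->time_base units */
        return av_rescale_q(1, av_inv_q(st->avg_frame_rate), st->time_base);
    }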
FATE results for the wtv and xmv demux tests change. In the wtv case
this is caused by the file being corrupted (or possibly badly cut) and
containing invalid timestamps. This results in lavf estimating the
framerate incorrectly and making up wrong frame durations.
In the xmv case the file contains pts jumps, so again the estimated
framerate is far from anything sane and lavf again makes up different
frame durations.
In some other tests, lavf starts making up frame durations from a
different frame.
The time base is 1 / sample_rate, not 1 / 90000.
Several more codecs encode the sample count in the first 4 bytes of the
chunk, so we set the durations accordingly. Also, we can set start_time and
packet duration instead of keeping track of the sample count in the demuxer.
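A sketch of the pattern described above (the little-endian 32-bit
sample count and the chunk layout are assumptions for illustration;
avpriv_set_pts_info() is the in-tree helper available to demuxers):

    #include "avformat.h"
    #include "internal.h"   /* avpriv_set_pts_info(), in-tree only */

    /* read_header(): express timestamps in audio samples. */
    static void sketch_set_audio_timebase(AVStream *st, int sample_rate)
    {
        avpriv_set_pts_info(st, 64, 1, sample_rate); /* time base = 1/sample_rate */
        st->start_time = 0;
    }

    /* read_packet(): with the time base above, the per-chunk sample
     * count is directly the packet duration. */
    static int sketch_read_chunk(AVFormatContext *s, AVStream *st,
                                 AVPacket *pkt, int chunk_size)
    {
        unsigned nb_samples = avio_rl32(s->pb); /* first 4 bytes of the chunk */
        int ret = av_get_packet(s->pb, pkt, chunk_size - 4);
        if (ret < 0)
            return ret;
        pkt->stream_index = st->index;
        pkt->duration     = nb_samples;
        return 0;
    }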
The container has no timestamps and the framerate isn't stored in the
data either.
The decoder sets the codec time base to the experimentally found value
of 1/15. Do the same in the demuxer; it should at least be better than
the default of 1/90000.
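A minimal sketch of the demuxer side, using the same in-tree helper as
above (setting avg_frame_rate here is an assumption, not something the
message states):

    #include "avformat.h"
    #include "internal.h"   /* avpriv_set_pts_info(), in-tree only */

    /* read_header(): advertise the experimentally determined rate
     * instead of leaving the 1/90000 default in place. */
    static void sketch_set_frame_rate(AVStream *st)
    {
        avpriv_set_pts_info(st, 64, 1, 15);         /* time base = 1/15 s */
        st->avg_frame_rate = (AVRational){ 15, 1 }; /* assumed nominal 15 fps */
    }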
This fixes the video frame pts (off by one for each MVIh)
and makes the "key frames" decode stand-alone (MVIh contains
only a palette, and marking such a palette-only frame as a
key frame is not really correct).
Signed-off-by: Reimar Döffinger <Reimar.Doeffinger@gmx.de>