The RFC draft only specifies the "H265" name - there is no
specification saying how to interpret "HEVC" (if such a packet
format ever is specified, it could be an entirely different format).
Since this is a very new standard (still a draft), there is little
need for compatibility with existing, broken implementations. Therefore
remove the extra alias, to avoid the risk of encouraging incorrect
usage.
Intentionally keep the ff_hevc_dynamic_handler name for the
handler, to use "hevc" consistently as the name for the codec
instead of "h265" within the library internals, as long as there
is only one variant in actual use.
Signed-off-by: Martin Storsjö <martin@martin.st>
In practice this hint is ignored - the rtp muxer always overwrites
the stream time base without taking the hint into account. But as
a general practice this is the correct way to pass a time base hint
on to a chained muxer.
This avoids deprecation warnings about using the codec time base
as the hint.
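A minimal sketch of the pattern, with made-up function and variable
names (as noted above, the chained RTP muxer currently overrides the
hint anyway):

    #include <libavformat/avformat.h>

    /* Hypothetical helper: copy the source stream's time base onto
     * the stream of a chained muxer as a hint before writing the
     * header.  The chained muxer is free to override it. */
    static int chain_stream_time_base(AVFormatContext *rtp_ctx,
                                      AVStream *src)
    {
        AVStream *st = rtp_ctx->streams[0];

        st->time_base = src->time_base;   /* the hint */
        return avformat_write_header(rtp_ctx, NULL);
    }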
Signed-off-by: Martin Storsjö <martin@martin.st>
The size variable is (correctly) unsigned, but is passed to several functions
which take signed parameters, such as avio_read, sometimes after having
numbers added to it. So ensure that size remains within the bounds that
these functions can handle.
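An illustrative bound check; the constant and function names below are
made up and only show the shape of the fix, not the actual demuxer code:

    #include <limits.h>
    #include <stdint.h>
    #include <libavformat/avio.h>
    #include <libavutil/error.h>

    /* "size" is unsigned, but avio_read() takes an int, and a few
     * extra header bytes may still be added to it later.  Reject
     * anything that could overflow the signed range. */
    #define HEADER_PAD 16                 /* made-up extra bytes */

    static int read_chunk(AVIOContext *pb, uint8_t *buf, unsigned size)
    {
        if (size > INT_MAX - HEADER_PAD)
            return AVERROR_INVALIDDATA;
        return avio_read(pb, buf, size);
    }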
CC: libav-stable@libav.org
Signed-off-by: Diego Biurrun <diego@biurrun.de>
Previously, the returned error codes were intentionally ignored
(see fadd3a6821), to avoid aborting if the directory already
existed. If the mkdir actually failed, this was caught later anyway,
when opening files within the directory failed.
By handling the error code here (but explicitly ignoring EEXIST),
the error messages and return codes in these cases are more
appropriate and less confusing.
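A POSIX-style sketch of the pattern (not the actual segmenter code):

    #include <sys/stat.h>
    #include <errno.h>
    #include <libavutil/error.h>

    /* Create the directory, but treat "already exists" as success;
     * any other failure is reported immediately. */
    static int make_output_dir(const char *path)
    {
        if (mkdir(path, 0777) == -1 && errno != EEXIST)
            return AVERROR(errno);
        return 0;
    }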
Signed-off-by: Martin Storsjö <martin@martin.st>
Convert the Matroska stereo format to the Stereo3D format, and add
Stereo3D side data to the stream.
Bump the doctype version supported.
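A rough sketch of attaching the side data; the mapping from the
Matroska StereoMode value shown here is only an example, not the full
table used by the demuxer:

    #include <errno.h>
    #include <string.h>
    #include <libavformat/avformat.h>
    #include <libavutil/stereo3d.h>

    static int set_stereo3d(AVStream *st, int side_by_side)
    {
        AVStereo3D *stereo = av_stereo3d_alloc();
        uint8_t *sd;

        if (!stereo)
            return AVERROR(ENOMEM);

        stereo->type = side_by_side ? AV_STEREO3D_SIDEBYSIDE
                                    : AV_STEREO3D_2D;

        sd = av_stream_new_side_data(st, AV_PKT_DATA_STEREO3D,
                                     sizeof(*stereo));
        if (!sd) {
            av_freep(&stereo);
            return AVERROR(ENOMEM);
        }
        memcpy(sd, stereo, sizeof(*stereo));
        av_freep(&stereo);
        return 0;
    }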
Bug-Id: 728 / https://bugs.debian.org/757185
If the remote end of a connection-oriented socket hangs up, generating
an EPIPE error is preferable to an unhandled SIGPIPE signal.
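One common way to get this behaviour (a sketch, not necessarily how it
is implemented here) is to ignore SIGPIPE for the whole process, so a
write to a closed socket fails with EPIPE instead of killing the
process:

    #include <signal.h>

    static void ignore_sigpipe(void)
    {
    #ifdef SIGPIPE
        /* With SIGPIPE ignored, send()/write() on a socket whose peer
         * has hung up returns -1 with errno == EPIPE, which the caller
         * can translate into an error code. */
        signal(SIGPIPE, SIG_IGN);
    #endif
    }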
Signed-off-by: Martin Storsjö <martin@martin.st>
At least one FATE sample contains such chunks and happens to work simply
by accident (due to find_stream_info() swallowing the error).
CC: libav-stable@libav.org
Update mxf_set_audio_pts to use the container-provided information.
The UL is marked as "to be changed in the future", but the current
samples in the wild do use it.
Prevent out-of-array writes.
Similar to what Michael Niedermayer did to address the same issue.
Bug-Id: CVE-2014-2263
CC: libav-stable@libav.org
Signed-off-by: Diego Biurrun <diego@biurrun.de>
It is basically a wrapper around av_get_audio_frame_duration(), with a
fallback to AVCodecContext.frame_size. However, that field is set only
when the stream codec context is actually used for encoding or decoding,
which is discouraged.
For muxing, it is generally the responsibility of the caller to set the
packet duration.
For demuxing, if the duration is not stored at the container level, it
should be set by the parser.
Therefore, removing the frame_size fallback should not break any
important case.
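For illustration, a caller-side sketch of computing a packet duration
directly from the codec parameters and packet size, instead of relying
on AVCodecContext.frame_size being filled in (the helper name here is
made up):

    #include <libavcodec/avcodec.h>

    /* Returns the duration in samples, or 0 if it cannot be derived. */
    static int packet_duration(AVCodecContext *avctx, int pkt_size)
    {
        int duration = av_get_audio_frame_duration(avctx, pkt_size);
        return duration > 0 ? duration : 0;
    }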
Prior to 6a463e7fb, the cur_*auth_type variables were set before
the http_connect call - their sole purpose is to record the
authentication type used for the latest request, since parsing
the HTTP response sets the new type in the auth state.
CC: libav-stable@libav.org
Signed-off-by: Martin Storsjö <martin@martin.st>
Originally, AVFormatContext and a metadata dict were provided to ff_vorbis_comment(),
but this presented issues if an AVStream was being updated or the metadata on
AVFormatContext wasn't actually being updated. To remedy this, ff_vorbis_stream_comment()
explicitly updates a stream's metadata and sets any necessary flags.
ff_vorbis_comment() does not modify any flags, and any calls to it that update
AVFormatContext's metadata (just a single call) must also update
AVFormatContext.event_flags after detecting any metadata changes to the provided
dictionary, as signaled by a positive return value.
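A sketch of that calling convention; parse_comments() stands in for
ff_vorbis_comment(), whose real argument list differs:

    #include <stdint.h>
    #include <libavformat/avformat.h>

    /* Stand-in for ff_vorbis_comment(): updates the dictionary and
     * returns > 0 if any metadata changed, < 0 on error. */
    int parse_comments(AVFormatContext *s, AVDictionary **m,
                       const uint8_t *buf, int size);

    static int update_ctx_metadata(AVFormatContext *s,
                                   const uint8_t *buf, int size)
    {
        int changed = parse_comments(s, &s->metadata, buf, size);

        if (changed < 0)
            return changed;
        if (changed > 0)
            s->event_flags |= AVFMT_EVENT_FLAG_METADATA_UPDATED;
        return 0;
    }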
Signed-off-by: Anton Khirnov <anton@khirnov.net>
Currently, only onMetaData is used, but some providers (wrongly)
put metadata into onCuePoint events, and it's still nice to be
able to use that data.
onCuePoint events also present metadata slightly differently than
onMetaData events: all metadata is found inside an object called
"parameters". In order to extract this metadata, it's easiest to
recurse through the object tree and pull out anything found in
child objects and put it in the top-level metadata.
Reference: http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/2/help.html?content=00001404.html
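The idea, sketched on a made-up node type rather than the actual AMF
parser structures:

    #include <libavutil/dict.h>

    /* Hypothetical tree node standing in for a parsed AMF object. */
    typedef struct Node {
        const char   *key;
        const char   *value;     /* NULL for object (non-leaf) nodes */
        struct Node **children;
        int           nb_children;
    } Node;

    /* Walk the object tree recursively and hoist every leaf key/value
     * pair into the single top-level metadata dictionary. */
    static int flatten_metadata(const Node *node, AVDictionary **meta)
    {
        for (int i = 0; i < node->nb_children; i++) {
            const Node *child = node->children[i];
            int ret;

            if (child->value)
                ret = av_dict_set(meta, child->key, child->value, 0);
            else
                ret = flatten_metadata(child, meta);
            if (ret < 0)
                return ret;
        }
        return 0;
    }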
Signed-off-by: Anton Khirnov <anton@khirnov.net>
If any option named "metadata" is set inside the context, it is pulled up to
the context and then the option is cleared.
Signed-off-by: Anton Khirnov <anton@khirnov.net>
The only flag, for now, indicates whether metadata was updated, and it is
set after each call to av_read_frame(). This comes with the caveat that,
on stream start, it might not be set properly, as packets might be
buffered in AVFormatContext.packet_buffer before being given to the user
in av_read_frame().
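From the application side, the flag can be consumed roughly like this
(a sketch with error handling trimmed):

    #include <libavformat/avformat.h>

    static void read_loop(AVFormatContext *s)
    {
        AVPacket pkt;

        while (av_read_frame(s, &pkt) >= 0) {
            if (s->event_flags & AVFMT_EVENT_FLAG_METADATA_UPDATED) {
                /* Clear the flag so the next update can be detected,
                 * then re-read s->metadata (or a stream's metadata). */
                s->event_flags &= ~AVFMT_EVENT_FLAG_METADATA_UPDATED;
            }
            av_packet_unref(&pkt);
        }
    }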
Signed-off-by: Anton Khirnov <anton@khirnov.net>