weixin_39686353
2021-01-01 11:28

USB Camera

This patch adds USB Camera functionality to the ffmpeg producer. It supports syntax like:

PLAY 1-10 "device://video=Some Camera" or PLAY 1-10 DEVICE "video=Some Camera"

This patch depends on (and includes) the ffmpeg patch and the parameters patch submitted previously. It is not entirely clean: it also includes my VCExpress modifications on this branch, so watch for those when merging.

This patch also addresses instability and crashes seen in previous USB Camera attempts, where the AVPacket's lifetime was too short: it was freed before being consumed in the call to write_frame. A PacketFrame structure has been introduced to ensure the AVPacket is not freed until after it is used in the write_frame call in the frame_muxer.
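The lifetime fix can be pictured with a minimal sketch (the names here are illustrative, not the patch's actual types; the real code would hold an FFmpeg AVPacket and release it with av_free_packet): the decoded frame keeps shared ownership of its source packet, so the packet's buffer survives until write_frame has consumed the frame.

```cpp
#include <cassert>
#include <memory>
#include <vector>

// Stand-in for FFmpeg's AVPacket; in the real patch this would be an
// AVPacket whose deleter calls av_free_packet.
struct packet
{
    std::vector<unsigned char> data;
};

// PacketFrame-style pairing: with raw video the frame's pixel data
// aliases the packet buffer, so freeing the packet early leaves the
// frame dangling. Bundling them ties the two lifetimes together.
struct packet_frame
{
    std::shared_ptr<packet> source; // keeps the packet alive
    const unsigned char*    pixels; // view into source->data
};

// Simulates the muxer consuming the frame after the decode loop has
// already dropped its own reference to the packet.
const unsigned char* write_frame(const packet_frame& frame)
{
    return frame.pixels; // safe: frame.source still owns the buffer
}
```

The point of the sketch is that once the decode loop resets its own shared_ptr, the frame's copy is the sole owner, so the read in write_frame cannot hit freed memory.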

This question originates from the open-source project: CasparCG/server


19 replies

  • weixin_39686353 4 months ago

    Closing this in favour of #131 and #134

  • weixin_39709194 4 months ago

What do the command-line arguments for achieving the same thing with ffmpeg look like?

  • weixin_39709194 4 months ago

I'm no git expert, but generally wouldn't it be better to make a pull request with a single feature addition? This one seems to contain several: msbuild, parameters, ffmpeg updates, etc. That makes it a bit confusing to review.

  • weixin_39686353 4 months ago

In the case of the RAW_VIDEO codec, the data in the AVFrame is the same as that in the source packet; I would think that is not the case for any other codec. RAW_VIDEO is the codec used for USB cameras more often than not.

    ffplay -f dshow -i "video=Some Camera"

    Should work.

Without the PacketFrame addition, the AVPacket is freed too early, causing write_frame to crash with an access violation on read.

  • weixin_39686353 4 months ago

Regarding git practice: sure, ideally the feature would be standalone. However, while developing it I needed the parameters and ffmpeg upgrades, which I've submitted separately; someone can review those on their own if they like. I have kept this separate from the chroma patch, which can also be reviewed separately. Making a usbcam branch independent of the other patches was more work than I have time for at present.

  • weixin_39709194 4 months ago

    I also don't like the way you need to initialize the web camera. Maybe "PLAY 1-1 video="some camera" format dshow" would be more consistent with existing syntax?

  • weixin_39709194 4 months ago

    Or maybe "PLAY 1-1 dshow://video="some camera""

  • weixin_39709194 4 months ago

I also noted the "STREAM" token, which seems redundant. Why is that needed?

  • weixin_39686353 4 months ago

I went for device:// to indicate the physical connection to the computer, whereas dshow:// or (I guess) v4l2:// implies more knowledge about ffmpeg and the nature of the underlying implementation than I'd expect most users to have. Of the two options you give, I'd prefer the dshow:// syntax.

I did include support for FILE or DEVICE followed by the protocol-independent part of the URI, e.g.

    DEVICE "video=Some Camera"

    in a bid to be similar to the use of FILE in the ffmpeg consumer. As you say it is quite redundant.

The STREAM token allows opening a stream. I draw a distinction between file, device, and stream in the flags that are set when opening the input. For example, for device:// I set the format to dshow, which wouldn't be done for files. STREAM may want to set 'listen', for example, which I haven't done yet, but there may be a need for that in the future.

    For the syntax there are currently two ways of doing it:

FILE ... DEVICE ... STREAM ...

    or

some file device://video=Some Camera rtp://Some Stream (or http://... or rtps://...)

    If you'd rather just have the one syntax, that's fine with me.
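    As a sketch of the per-type flag selection described above (illustrative only: the names are hypothetical, and the real producer would pass these as options and an input-format hint when opening the input with ffmpeg):

```cpp
#include <cassert>
#include <map>
#include <string>

enum class resource_type { file, device, stream };

// Illustrative mapping from resource type to the options applied when
// opening the input. device:// forces the dshow input format; stream
// might one day set 'listen' for incoming connections; plain files
// need no special flags.
std::map<std::string, std::string> open_options(resource_type type)
{
    switch (type)
    {
    case resource_type::device:
        return {{"format", "dshow"}};
    case resource_type::stream:
        return {}; // e.g. {{"listen", "1"}} in a future revision
    case resource_type::file:
    default:
        return {};
    }
}
```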

  • weixin_39709194 4 months ago

I think the last option is much more practical (i.e. without STREAM, FILE, etc.).

Regarding device:// vs dshow://: it's not so much about knowledge of ffmpeg as about knowledge of how you connect to the webcam. "device" is a bit ambiguous; what if you want to go through e.g. "vfwcap"? It's still a "device" but not a "device"...?

Also, see my follow-up comment on "14abbd3".

  • weixin_39709194 4 months ago

The more I think about it, the more I think that "parameters" should be replaced by "boost::property_tree".

  • weixin_39686353 4 months ago

"parameters" should be replaced by "boost::property_tree"

I agree. Though 'parameters' may have some value at this time: an adapter class like it allows you to stage the refactoring over a period of time, until the class can finally be removed and replaced with boost::property_tree.

I think the syntax parsing should live in the protocol layer rather than in the producers themselves. The producers should have some kind of model for their parameters; if that is a generic property_tree, that's OK.

  • weixin_39710966 4 months ago

The wiki currently never actually suggests using FILE with PLAY; the only place it shows up is the Disk Consumer. Therefore DEVICE is indeed unnecessary, especially as it doesn't even seem to be required to trigger the different source-filename parsing.

    How does this patch know not to add the config's media folder as a prefix?

  • weixin_39686353 4 months ago

    How does this patch know not to add the config's media folder as a prefix?

It checks the string against supported 'protocol' components as if it were a URI. If it sees a supported protocol (currently dshow, rtp, rtps, http) then it:

      • uses the original case
      • does not call prob_stem to get a complete path
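    That check might look something like the following sketch (the helper name is hypothetical; the actual patch may structure this differently):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Returns true when the string begins with one of the supported
// 'protocol' components, i.e. it should be treated as a URI: keep the
// original case and skip the media-folder path completion.
bool has_supported_protocol(const std::string& str)
{
    static const std::vector<std::string> protocols =
        {"dshow", "rtp", "rtps", "http"};

    for (const auto& protocol : protocols)
        if (str.compare(0, protocol.size() + 3, protocol + "://") == 0)
            return true;

    return false;
}
```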

  • weixin_39709194 4 months ago

    Starting to look good. Good job!

The only thing I'd comment on (apart from the above) is that I don't think "resource_type" should be needed; it could probably be solved in a more generic way. Though that could possibly wait for future patches.

  • weixin_39668470 4 months ago

    Please note that RTMP and UDP protocols should also work (they did with the STREAM command in earlier testing with Cambell's fork).

  • weixin_39709194 4 months ago

They will kind of work, sometimes. There is still no clock nor video/audio sync, which is required for proper playback of IP streams.

  • weixin_39689347 4 months ago

I think this is a great idea, especially with UDP/RTMP input possibilities. Just putting it out there: how difficult would it be to output a channel to UDP as an SPTS?

  • weixin_39686353 4 months ago

    Please note that RTMP and UDP protocols should also work

I've changed the protocol detection so that no protocol presumes a file, 'dshow' indicates a camera, and anything else is presumed to be a stream.
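    A sketch of that revised detection (hypothetical names, not the patch's actual code):

```cpp
#include <cassert>
#include <string>

enum class resource_type { file, device, stream };

// Revised detection: no protocol scheme means a file, dshow:// means a
// camera device, and any other scheme is presumed to be a stream.
resource_type classify(const std::string& str)
{
    auto pos = str.find("://");
    if (pos == std::string::npos)
        return resource_type::file;
    if (str.compare(0, pos, "dshow") == 0)
        return resource_type::device;
    return resource_type::stream;
}
```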

