How do I use a USB camera with WebRTC on Android?

How do I use a USB camera with WebRTC on Android?
I couldn't make sense of the write-up I found online, and the GitHub project I tried doesn't work either. My guess is that a custom VideoSource should be enough, but I don't know how to write one.

WebRTC and UVCCamera each run fine on their own, but I don't know how to combine the two.

I implemented a custom VideoCapturer on top of UVCCamera. The local preview works, but the remote side only shows a black screen and I can't figure out why.

Could someone send me Android WebRTC source code that works with a USB camera? Thanks a lot.

My email: 771175048@qq.com

I'm not sure whether this is exactly what you're after, but this blog post is clearly organized and well reasoned, so it should serve as a reference: https://juejin.cn/post/7139488477892050975
Below is part of the source code the author shared; it should help with writing your program.

[screenshot of the shared source code]

It's not a custom VideoSource you need, it's a custom VideoCapturer. After Android WebRTC opens the preview with Camera1 or Camera2, the frames are passed down to the lower layers through the CapturerObserver (from the point of view of a custom capturer in the upper layer, you can think of it as sending the video stream out through the CapturerObserver). Read the source of Camera1Capturer or Camera2Capturer and you should be able to write a UsbCapturer that works for you.
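In case a concrete starting point helps, here is a minimal sketch of such a UsbCapturer. The org.webrtc side (VideoCapturer, CapturerObserver, NV21Buffer, VideoFrame) follows the public API; the UVCCamera side (an already-opened UVCCamera passed into the constructor, setFrameCallback with PIXEL_FORMAT_NV21) is an assumption based on the saki4510t/UVCCamera library, so adapt it to however you actually open the device. Permission handling, error handling, and changeCaptureFormat are left out.

import java.nio.ByteBuffer;

import android.content.Context;
import android.os.SystemClock;

import org.webrtc.CapturerObserver;
import org.webrtc.NV21Buffer;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoFrame;

import com.serenegiant.usb.IFrameCallback;
import com.serenegiant.usb.UVCCamera;

public class UsbCapturer implements VideoCapturer {
    private final UVCCamera uvcCamera;   // assumed to be opened via USBMonitor elsewhere
    private CapturerObserver capturerObserver;
    private int width;
    private int height;

    public UsbCapturer(UVCCamera uvcCamera) {
        this.uvcCamera = uvcCamera;
    }

    @Override
    public void initialize(SurfaceTextureHelper surfaceTextureHelper, Context context,
                           CapturerObserver capturerObserver) {
        // WebRTC hands us the observer here; every captured frame goes out through it.
        this.capturerObserver = capturerObserver;
    }

    @Override
    public void startCapture(int width, int height, int framerate) {
        this.width = width;
        this.height = height;
        capturerObserver.onCapturerStarted(true);
        // Ask UVCCamera for NV21 frames and forward each one as a VideoFrame.
        uvcCamera.setPreviewSize(width, height);
        uvcCamera.setFrameCallback(new IFrameCallback() {
            @Override
            public void onFrame(ByteBuffer frame) {
                byte[] data = new byte[frame.remaining()];
                frame.get(data);
                NV21Buffer buffer = new NV21Buffer(data, UsbCapturer.this.width,
                        UsbCapturer.this.height, null);
                VideoFrame videoFrame = new VideoFrame(buffer, 0 /* rotation */,
                        SystemClock.elapsedRealtimeNanos());
                capturerObserver.onFrameCaptured(videoFrame);
                videoFrame.release();
            }
        }, UVCCamera.PIXEL_FORMAT_NV21);
        uvcCamera.startPreview();
    }

    @Override
    public void stopCapture() {
        uvcCamera.stopPreview();
        capturerObserver.onCapturerStopped();
    }

    @Override
    public void changeCaptureFormat(int width, int height, int framerate) {
        // Restarting the preview with a new size is left out of this sketch.
    }

    @Override
    public void dispose() {
        uvcCamera.destroy();
    }

    @Override
    public boolean isScreencast() {
        return false;
    }
}

The key point is that every frame UVCCamera hands you must end up in CapturerObserver.onFrameCaptured(). If the remote side stays black even though the local preview works, check that onFrameCaptured() is actually being called and that the buffer size matches the preview size you configured.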

You can also take a look at https://www.jianshu.com/p/0a98e5733c6a

Steps
Create a PeerConnectionFactory
Create and start a VideoCapturer
Create a VideoSource from the PeerConnectionFactory
Create a VideoTrack from the PeerConnectionFactory and the VideoSource
Initialize the SurfaceViewRenderer video view
Render the VideoTrack into the SurfaceViewRenderer


import android.os.Bundle;

import androidx.appcompat.app.AppCompatActivity;

import org.webrtc.AudioSource;
import org.webrtc.AudioTrack;
import org.webrtc.Camera1Enumerator;
import org.webrtc.EglBase;
import org.webrtc.MediaConstraints;
import org.webrtc.PeerConnectionFactory;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.SurfaceViewRenderer;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoSource;
import org.webrtc.VideoTrack;

public class MainActivity extends AppCompatActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        // create PeerConnectionFactory
        PeerConnectionFactory.InitializationOptions initializationOptions =
                PeerConnectionFactory.InitializationOptions.builder(this).createInitializationOptions();
        PeerConnectionFactory.initialize(initializationOptions);
        PeerConnectionFactory peerConnectionFactory = PeerConnectionFactory.builder().createPeerConnectionFactory();

        // create AudioSource and AudioTrack (give it an id distinct from the video track)
        AudioSource audioSource = peerConnectionFactory.createAudioSource(new MediaConstraints());
        AudioTrack audioTrack = peerConnectionFactory.createAudioTrack("102", audioSource);

        EglBase.Context eglBaseContext = EglBase.create().getEglBaseContext();

        SurfaceTextureHelper surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", eglBaseContext);
        // create VideoCapturer (the CAMERA runtime permission must already be granted)
        VideoCapturer videoCapturer = createCameraCapturer();
        VideoSource videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
        videoCapturer.initialize(surfaceTextureHelper, getApplicationContext(), videoSource.getCapturerObserver());
        videoCapturer.startCapture(480, 640, 30);

        SurfaceViewRenderer localView = findViewById(R.id.localView);
        localView.setMirror(true);
        localView.init(eglBaseContext, null);

        // create VideoTrack
        VideoTrack videoTrack = peerConnectionFactory.createVideoTrack("101", videoSource);
        // display in localView
        videoTrack.addSink(localView);
    }

    private VideoCapturer createCameraCapturer() {
        Camera1Enumerator enumerator = new Camera1Enumerator(false);
        final String[] deviceNames = enumerator.getDeviceNames();

        // First, try to find front facing camera
        for (String deviceName : deviceNames) {
            if (enumerator.isFrontFacing(deviceName)) {
                VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);

                if (videoCapturer != null) {
                    return videoCapturer;
                }
            }
        }

        // Front facing camera not found, try something else
        for (String deviceName : deviceNames) {
            if (!enumerator.isFrontFacing(deviceName)) {
                VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);

                if (videoCapturer != null) {
                    return videoCapturer;
                }
            }
        }

        return null;
    }

}
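For reference, here is how the sample above could be pointed at a USB camera instead of the built-in ones. This assumes a UsbCapturer like the sketch in the earlier answer and an already-opened UVCCamera instance, neither of which is part of the original sample. Also note that this sample only renders locally; for the remote side to see anything, the VideoTrack still has to be added to your PeerConnection.

// Hypothetical: swap the built-in camera capturer for the custom USB one.
// uvcCamera is assumed to be a UVCCamera that has already been opened elsewhere.
VideoCapturer videoCapturer = new UsbCapturer(uvcCamera);
VideoSource videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
videoCapturer.initialize(surfaceTextureHelper, getApplicationContext(), videoSource.getCapturerObserver());
videoCapturer.startCapture(640, 480, 30);

VideoTrack videoTrack = peerConnectionFactory.createVideoTrack("101", videoSource);
videoTrack.addSink(localView);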