Live streaming with WebRTC Render Streaming: how do I render the video stream with VideoPlayer?

I'm live streaming with WebRTC Render Streaming and want to render the video stream through a VideoPlayer, but I don't know how. Using a RawImage I can play the stream, but a RawImage can't be wrapped onto a sphere the way a VideoPlayer's output can, so I can't get the VR-player feel. Also, the received data is of type Texture, which VideoPlayer doesn't accept as a source. How do I render it?

In short: I can play the video stream with a RawImage, and I can build a VR video player with a VideoPlayer. How do I combine the two? The combination is what I actually want.

Unity version
2021.1.22
Platform
Vive VR headset


    // Approach based on https://blog.csdn.net/Wenliuyong/article/details/119870304

    // Playback via RawImage (this works)
    public SingleConnection singleConnection;
    public ReceiveVideoViewer videoViewer;
    public RawImage videoView;

    void Start()
    {
        var connectId = "111111";
        singleConnection.CreateConnection(connectId);
        videoViewer.OnUpdateReceiveTexture += (tex) => {
            videoView.texture = tex;
        };
    }

        // The received frame arrives as a Texture (parameter tex).
        // Attempt: play it through a VideoPlayer. This doesn't work,
        // and I don't really understand why.

        var videoPlayer = camera.AddComponent<UnityEngine.Video.VideoPlayer>();
        videoPlayer.playOnAwake = true;
        videoPlayer.isLooping = true;
        //videoPlayer.source = UnityEngine.Video.VideoSource.Url;
        videoPlayer.source = UnityEngine.Video.VideoSource.VideoClip;
        // videoPlayer.clip = Application.dataPath + "/Scenes/123.mp4";
        videoPlayer.renderMode = UnityEngine.Video.VideoRenderMode.MaterialOverride;
        videoPlayer.audioOutputMode = UnityEngine.Video.VideoAudioOutputMode.None;
        videoPlayer.SetDirectAudioMute(0, true);
        videoViewer.OnUpdateReceiveTexture += (tex) => {
            // TextureToTexture2D came from a blog post; I don't follow its logic
            var r2d = TextureToTexture2D(tex);
            RenderTexture rt = new RenderTexture(tex.width / 2, tex.height / 2, 0);
            RenderTexture.active = rt;
            Graphics.Blit(tex, rt);
            videoPlayer.targetTexture = rt;
        };

https://zhuanlan.zhihu.com/p/412870658
Not sure whether this helps.

Answering my own question: a kind person pointed me in the right direction. The idea is to skip the VideoPlayer entirely and assign the received texture to the Sphere's material directly.
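A minimal sketch of that approach, reusing the same `SingleConnection` / `ReceiveVideoViewer` sample components from the code above. The `sphereRenderer` field, the class name, and the connection id are my own illustrative assumptions, not from the original:

```csharp
using UnityEngine;

// Assumes the SingleConnection and ReceiveVideoViewer sample scripts from
// the Unity Render Streaming package are present in the project.
public class SphereVideoReceiver : MonoBehaviour
{
    public SingleConnection singleConnection;
    public ReceiveVideoViewer videoViewer;
    public Renderer sphereRenderer;   // Renderer of the Sphere the VR camera sits inside

    void Start()
    {
        var connectId = "111111";
        singleConnection.CreateConnection(connectId);
        // Each time a new frame texture arrives, set it as the sphere
        // material's main texture instead of a RawImage's texture.
        videoViewer.OnUpdateReceiveTexture += (tex) => {
            sphereRenderer.material.mainTexture = tex;
        };
    }
}
```

One caveat: Unity's default sphere is only visible from the outside, so for the VR-player feel the camera must sit inside the sphere and the material needs a shader that renders inward-facing geometry (e.g. one with `Cull Front`, or a sphere mesh with flipped normals).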