  • Pushing an h264 stream over rtmp with gstreamer


    Preface

    This example uses Tencent Cloud Live as the receiving end, so the stream can be watched online.

    1. Register a Tencent Cloud Live account (logging in with WeChat is enough) at the following address:

    https://console.cloud.tencent.com/live

    Follow the console prompts to generate the corresponding live push URL.

    2. Start pushing with the command gst-launch-1.0 videotestsrc ! x264enc ! flvmux ! rtmpsink location='rtmp://57591.livepush.myqcloud.com/live/1234?txSecret=7f22b52deadd35b2a09f38ffd32f1f84&txTime=5D62B07F'

    The original post's screenshots show the command running on Ubuntu and the stream arriving on the live console; once frames appear there, the h264 data encoded by x264enc has been pushed to the cloud successfully.
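
    For reference, the same pipeline can be driven from code. The sketch below builds it with gst_parse_launch(); the rtmp URL is a placeholder for the push address generated by the console, and error handling is kept minimal:

    #include <gst/gst.h>

    int main(int argc, char *argv[])
    {
        gst_init(&argc, &argv);
        GError *err = NULL;
        /* Placeholder URL: substitute the push address from the console. */
        GstElement *pipeline = gst_parse_launch(
            "videotestsrc ! x264enc ! flvmux ! rtmpsink "
            "location=rtmp://example.com/live/streamkey", &err);
        if (pipeline == NULL) {
            g_printerr("Parse error: %s\n", err->message);
            g_clear_error(&err);
            return -1;
        }
        gst_element_set_state(pipeline, GST_STATE_PLAYING);
        /* Block until an error or end-of-stream is posted on the bus. */
        GstBus *bus = gst_element_get_bus(pipeline);
        GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
            (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
        if (msg != NULL)
            gst_message_unref(msg);
        gst_object_unref(bus);
        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(pipeline);
        return 0;
    }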
  • RTSP to rtmp with gstreamer


    VideoPusher.h

    #pragma once
    #include <iostream>
    #include <gst/gst.h>
    #include <gst/app/gstappsink.h>
    #include <glib.h>
    #include <boost/shared_ptr.hpp>
    #include <mutex>
    
    #ifndef INT64_C 
    #define INT64_C(c) (c ## LL) 
    #define UINT64_C(c) (c ## ULL) 
    #endif 
    
    class VideoPusher
    {
        struct Exception : std::exception{};
        template<typename T>
        T* chk(T* pointer) {
            if (pointer == nullptr) {
                throw Exception();
            }
            return pointer;
        }
    
        template<typename T>
        void delptr(T*& pointer)  // by reference, so the caller's pointer is actually reset
        {
            if (pointer != nullptr)
            {
                delete pointer;
                pointer = nullptr;
            }
        }
    
    
    public:
        VideoPusher();
        VideoPusher(std::string strlocation, std::string strCode);
        virtual ~VideoPusher();
        bool Run(std::string strlocation, std::string strCode, std::string strrequest);
        
    protected:
        static void _onPadAdded(GstElement *src, GstPad *src_pad, gpointer user_data);
        void SetElesNull();
    protected:
        GMainLoop  *_loop;
        GstBus*     _bus;
        GstElement* _pipeline;
        GstElement* _source;
        GstElement* _depay;
        GstElement* _parse;
        GstElement* _capsfilter;
        GstElement* _queue1;
        GstElement* _rtmpsink;
        GstElement* _flvmux;
        std::mutex _mutex;
        bool _bStopPush;
    };
    
    typedef boost::shared_ptr<VideoPusher> VideoPusherPtr;
    

    VideoPusher.cpp

    #include "VideoPusher.h"
    #include <sstream>

    // Counter used to give each retry's pipeline a unique name.
    static int g_pipelinenum = 0;

    // rtspsrc creates its source pads dynamically once the stream is
    // negotiated, so the link to the depayloader's sink pad is made here.
    void VideoPusher::_onPadAdded(GstElement *src, GstPad *src_pad, gpointer user_data)
    {
        GstPad* sink_pad = (GstPad*)user_data;
        gst_pad_link(src_pad, sink_pad);
    }
    
    void VideoPusher::SetElesNull()
    {
        _loop = nullptr;
        _bus = nullptr;
        _pipeline = nullptr;
        _source = nullptr;
        _depay = nullptr;
        _parse = nullptr;
        _capsfilter = nullptr;
        _queue1 = nullptr;
        _rtmpsink = nullptr;
        _flvmux = nullptr;
    }
    
    bool VideoPusher::Run(std::string strLocation,
        std::string strCode, std::string strrequest)
    {
    	_bStopPush = false;
      
    	while (!_bStopPush)
    	{
    		SetElesNull();
    
    		gboolean terminate = FALSE;
    
    		gst_init(nullptr, nullptr);
    
    		std::stringstream stream;
    		stream << "pipeline" << g_pipelinenum++;
    		std::string strname;
    		strname = stream.str();
    
    		_pipeline = chk(gst_pipeline_new(strname.c_str()));
    		_source = chk(gst_element_factory_make("rtspsrc", "src"));
    		_depay = chk(gst_element_factory_make(("rtp" + strCode + "depay").c_str(), "depay"));
    		_parse = chk(gst_element_factory_make((strCode + "parse").c_str(), "parse"));
    		_flvmux = chk(gst_element_factory_make("flvmux", "flvmux"));
    		_queue1 = chk(gst_element_factory_make("queue", "queue"));
    		_capsfilter = chk(gst_element_factory_make("capsfilter", "filter"));
    		_rtmpsink = chk(gst_element_factory_make("rtmpsink", "sink"));
    
    		//g_object_set(_source, "protocols", 0x00000004, NULL);
    		g_object_set(_source, "latency", 0, NULL);
    		g_object_set(_capsfilter, "caps-change-mode", 1, NULL);
    		g_object_set(_rtmpsink, "location", strLocation.c_str(), NULL);
    
    		gst_bin_add_many(GST_BIN(_pipeline), _source, _depay, _parse, _flvmux, _capsfilter, _queue1, _rtmpsink, NULL);
    		g_signal_connect(_source, "pad-added", G_CALLBACK(&_onPadAdded), gst_element_get_static_pad(_depay, "sink"));
    		gboolean bsuccess = gst_element_link_many(_depay, _parse, _flvmux, _capsfilter, _queue1, _rtmpsink, NULL);
    		if (!bsuccess) {
    			g_print("Failed to link one or more elements!\n");
    			gst_element_unlink_many(_depay, _parse, _flvmux, _capsfilter, _queue1, _rtmpsink, NULL);
    			g_usleep(1000 * 1000);  /* wait one second before retrying */
    			continue;
    		}
    		g_object_set(_source, "location", strrequest.c_str(), NULL);
    		/* NOTE: as linked above, this capsfilter sits downstream of flvmux,
    		 * which outputs video/x-flv, so raw-video caps here are unlikely to
    		 * negotiate; a raw-video filter normally belongs before the encoder.
    		 * GStreamer also spells raw formats in upper case ("RGB"). */
    		GstCaps* caps = gst_caps_new_simple(
    			"video/x-raw",
    			"format", G_TYPE_STRING, "RGB",
    			"width", G_TYPE_INT, 426,
    			"height", G_TYPE_INT, 240,
    			"framerate", GST_TYPE_FRACTION, 25, 1,
    			NULL);
    		g_object_set(_capsfilter, "caps", caps, NULL);
    		gst_caps_unref(caps);
    		GstStateChangeReturn res = gst_element_set_state(_pipeline, GST_STATE_PLAYING);
    		if (res == GST_STATE_CHANGE_FAILURE)
    		{
    			g_printerr("Unable to set the pipeline to the playing state.\n");
    			gst_object_unref(_pipeline);
    			g_usleep(1000 * 1000);  /* wait one second before retrying */
    			continue;
    		}
    		GstMessage *msg;
    		/* Listen to the bus */
    		//_bus = gst_element_get_bus(_pipeline);
    		_bus = gst_pipeline_get_bus(GST_PIPELINE(_pipeline));
    		do
    		{
    			msg = gst_bus_timed_pop_filtered(_bus, GST_CLOCK_TIME_NONE,
    				GstMessageType(GST_MESSAGE_STATE_CHANGED |
    				GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
    			/* Parse message */
    			if (msg != NULL)
    			{
    				GError *err;
    				gchar *debug_info;
    				switch (GST_MESSAGE_TYPE(msg))
    				{
    				case GST_MESSAGE_ERROR:
    				{
    					gst_message_parse_error(msg, &err, &debug_info);
    					g_printerr("Error received from element %s: %s\n", GST_OBJECT_NAME(msg->src), err->message);
    					g_printerr("Debugging information: %s\n", debug_info ? debug_info : "none");
    					g_clear_error(&err);
    					g_free(debug_info);
    					terminate = TRUE;
    				}
    				break;
    				case GST_MESSAGE_EOS:
    				{
    					g_print("End-Of-Streamreached.\n");
    					terminate = TRUE;
    				}
    				break;
    				case GST_MESSAGE_STATE_CHANGED:
    				{
    					/* We are only interested in state-changed messages from the pipeline */
    					if (GST_MESSAGE_SRC(msg) == GST_OBJECT(_pipeline))
    					{
    						GstState old_state, new_state, pending_state;
    						gst_message_parse_state_changed(msg,
    							&old_state,
    							&new_state,
    							&pending_state);
    						g_print("Pipeline state changed from %s to %s:\n",
    							gst_element_state_get_name(old_state),
    							gst_element_state_get_name(new_state));
    						if (pending_state == GST_STATE_NULL)
    						{
    							terminate = TRUE;
    						}
    					}
    				}
    				break;
    				default:
    				{
    					/* We should not reach here */
    					g_printerr("Unexpected message received.\n");
    					break;
    				}
    				}
    				gst_message_unref(msg);
    				//  std::this_thread::sleep_for(std::chrono::milliseconds(200));
    			}
    
    		} while (!terminate);
    
    		/* Free resources */
    		try
    		{
    			std::lock_guard<std::mutex> lock(_mutex);
    			gst_object_unref(_bus);
    			gst_element_set_state(_pipeline, GST_STATE_PAUSED);
    			gst_element_set_state(_pipeline, GST_STATE_READY);
    			gst_element_set_state(_pipeline, GST_STATE_NULL);
    			gst_object_unref(_pipeline);
    		}
    		catch (std::exception &e)
    		{
    			std::cout << e.what();
    			return true;
    		}
    		catch (...)
    		{
    			return true;
    		}
    	}
    
       
    
        return true;
    }
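
    A hypothetical usage sketch (the URLs below are placeholders; Run() loops until _bStopPush is set, so it is normally driven from a worker thread):

    #include <thread>
    #include "VideoPusher.h"

    int main()
    {
        VideoPusherPtr pusher(new VideoPusher());
        std::thread worker([pusher]() {
            pusher->Run("rtmp://example.com/live/stream",              // rtmp push address
                        "h264",                                        // selects rtph264depay / h264parse
                        "rtsp://user:pass@192.168.1.100/main_stream"); // rtsp source to pull
        });
        worker.join();  // blocks here; a real program would set _bStopPush to end the loop
        return 0;
    }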
    
    
  • Reading h264 frames from a USB camera with gstreamer and pushing them over rtmp



    I need rtmp streaming on an embedded device. The three approaches I know of are ffmpeg, gstreamer, and librtmp, and each would have to be ported to the embedded platform, so I am starting the verification with gstreamer, the one I know best.
    The gstreamer build on my embedded platform currently has no rtmp element, so I first test on a PC running Ubuntu 16.04 and will port a gstreamer build that includes the rtmp element afterwards.
    Ubuntu 16.04 already ships with the gstreamer-1.0 libraries, rtmp element included, so testing needs no porting at all.
    Note: my USB camera outputs h264 frames directly, so no encoder element is needed.

    rtmp streaming with the gstreamer command line

    First, test with the command-line tool:

    gst-launch-1.0 -v v4l2src device=/dev/video0 ! 'video/x-h264, width=1280, height=720, framerate=30/1' ! queue !  h264parse ! flvmux ! rtmpsink location='rtmp://192.168.1.102/live'
    

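    The quoted caps between v4l2src and queue select the camera's direct h264 output at 1280x720, 30 fps; h264parse then repackages the stream so flvmux can consume it.
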
    With this command running, the stream can be watched through a media server on the PC at 192.168.1.102. Use nginx or SRS as the streaming server and create an HTML page to view it in a browser:

    	<h1>01</h1>
    	<object width='640' height='377' id='SampleMediaPlayback' name='SampleMediaPlayback' type='application/x-shockwave-flash' classid='clsid:d27cdb6e-ae6d-11cf-96b8-444553540000' >
    		<param name='movie' value='swfs/SampleMediaPlayback.swf' /> 
    		<param name='quality' value='high' /> 
    		<param name='bgcolor' value='#000000' /> 
    		<param name='allowfullscreen' value='true' /> 
    		<embed src='SampleMediaPlayback.swf' width='640' height='377' id='SampleMediaPlayback' quality='high' bgcolor='#000000' name='SampleMediaPlayback' allowfullscreen='true' pluginspage='http://www.adobe.com/go/getflashplayer' flashvars='&src=rtmp://192.168.1.102:1935/live&autoHideControlBar=true&streamType=live&autoPlay=true&verbose=true' type='application/x-shockwave-flash'> 
    		</embed>
    	</object>
    

    rtmp streaming from gstreamer code

    #include <string.h>
    #include <gst/gst.h>
    #include <signal.h>
    #include <unistd.h>
    #include <stdlib.h>
    #include <stdio.h>
    
    //gst-launch-1.0 -v v4l2src device=/dev/video0 ! 'video/x-h264, width=640, height=360, framerate=30/1' ! queue !  h264parse ! flvmux ! rtmpsink location='rtmp://192.168.1.102/live'
    
    typedef struct _GstDataStruct
    {
    	GstElement *pipeline;
    	GstElement *v4l2src;
    	GstElement *queue;
    	GstElement *h264parse;
    	GstElement *flvmux;
    	GstElement *rtmpsink;
    	GstBus *bus;
    	guint bus_watch_id;
    	guint sourceid;        /* To control the GSource */
    	GMainLoop *loop;  /* GLib's Main Loop */
    } GstDataStruct;
    
    static GstDataStruct GstData;
    static unsigned int frame_width;
    static unsigned int frame_height;
    static unsigned int frame_rate;
    static unsigned int frame_bps;
    static char devname[32] = {0};
    
    gboolean bus_msg_call(GstBus *bus, GstMessage *msg, GstDataStruct *pGstData)
    {
    	gchar *debug;
    	GError *error;
    	GMainLoop *loop = pGstData->loop;
    
    	GST_DEBUG ("got message %s",gst_message_type_get_name (GST_MESSAGE_TYPE (msg)));
    	switch (GST_MESSAGE_TYPE(msg))
    	{
    		case GST_MESSAGE_EOS:
    			printf("End of stream\n");
    			g_main_loop_quit(loop);
    			break;
    		case GST_MESSAGE_ERROR:
    			gst_message_parse_error(msg, &error, &debug);
    			g_free(debug);
    			g_printerr("Error: %s\n", error->message);
    			g_error_free(error);
    			g_main_loop_quit(loop);
    			break;
    		default:
    			break;
    	}
    	return TRUE;
    }
    
    int main(int argc, char *argv[])
    {
    	if(argc != 6)
    	{
    		frame_width = 1280;
    		frame_height = 720;
    		frame_rate = 30;
    		frame_bps = 1500000;
    		sprintf(devname, "%s", "/dev/video0");
    	}
    	else
    	{
    		frame_width = atoi(argv[2]);
    		frame_height = atoi(argv[3]);
    		frame_rate = atoi(argv[4]);
    		frame_bps = atoi(argv[5]);
    		sprintf(devname, "%s", argv[1]);
    	}
    	printf("width:%d, height:%d, rate:%d, bps:%d, dev:%s\n", frame_width, frame_height, frame_rate, frame_bps, devname);
    
    	printf("============= v4l2 rtmp gst init start ============\n");
    	gst_init (NULL, NULL);
    	printf("=========== create v4l2 rtmp pipeline =============\n");
    	GstData.pipeline           	= gst_pipeline_new ("v4l2_rtmp");
    	GstData.v4l2src        	   	= gst_element_factory_make ("v4l2src",      "v4l2src");
    	GstData.queue      		   	= gst_element_factory_make ("queue",  		"queue");
    	GstData.h264parse      	   	= gst_element_factory_make ("h264parse",	"h264parse");
    	GstData.flvmux           	= gst_element_factory_make ("flvmux",      	"flvmux");
    	GstData.rtmpsink            = gst_element_factory_make ("rtmpsink",     "rtmpsink");
    
    	if (!GstData.pipeline || !GstData.v4l2src || !GstData.queue ||
    		!GstData.h264parse || !GstData.flvmux || !GstData.rtmpsink)
    	{
    		g_printerr ("One element could not be created... Exit\n");
    		return -1;
    	}
    
    	printf("============ link v4l2 rtmp pipeline ==============\n");
    	GstCaps *caps_v4l2src;
    	caps_v4l2src = gst_caps_new_simple("video/x-h264", "stream-format", G_TYPE_STRING,"byte-stream",
    									   "alignment", G_TYPE_STRING, "au",
    									   "width", G_TYPE_INT, frame_width,
    									   "height", G_TYPE_INT, frame_height,
    									   "framerate",GST_TYPE_FRACTION, frame_rate, 1, NULL);
    	GstCaps *caps_flv_sink;
    	caps_flv_sink = gst_caps_new_simple("video/x-h264", "stream-format", G_TYPE_STRING,"avc",
    									    "alignment", G_TYPE_STRING, "au",
    									    "width", G_TYPE_INT, frame_width,
    									    "height", G_TYPE_INT, frame_height,
    									    "framerate",GST_TYPE_FRACTION, frame_rate, 1, NULL);
    
    	g_object_set(G_OBJECT(GstData.v4l2src), "device", devname, NULL);
    	g_object_set(G_OBJECT(GstData.rtmpsink), "location", "rtmp://192.168.1.102/live", NULL);
    	// NOTE: the location property holds the rtmp URL; it must match the rtmp URL in the HTML page above, or the video cannot be watched.
    	GstData.bus = gst_pipeline_get_bus(GST_PIPELINE(GstData.pipeline));
    	GstData.bus_watch_id = gst_bus_add_watch(GstData.bus, (GstBusFunc)bus_msg_call, (gpointer)&GstData);
    	gst_object_unref(GstData.bus);
    
    	gst_bin_add_many(GST_BIN(GstData.pipeline), GstData.v4l2src, GstData.queue,
    					 GstData.h264parse, GstData.flvmux, GstData.rtmpsink,NULL);
    
    	if(gst_element_link_filtered(GstData.v4l2src, GstData.queue, caps_v4l2src) != TRUE)
    	{
    		g_printerr ("GstData.v4l2src could not link GstData.queue\n");
    		gst_object_unref (GstData.pipeline);
    		return -1;
    	}
    	gst_caps_unref (caps_v4l2src);
    
    	if(gst_element_link(GstData.queue, GstData.h264parse) != TRUE)
    	{
    		g_printerr ("GstData.queue could not link GstData.h264parse\n");
    		gst_object_unref (GstData.pipeline);
    		return -1;
    	}
    
    	if(gst_element_link_filtered(GstData.h264parse, GstData.flvmux, caps_flv_sink) != TRUE)
    	{
    		g_printerr ("GstData.h264parse could not link GstData.flvmux\n");
    		gst_object_unref (GstData.pipeline);
    		return -1;
    	}
    	gst_caps_unref (caps_flv_sink);
    
    	if(gst_element_link(GstData.flvmux, GstData.rtmpsink) != TRUE)
    	{
    		g_printerr ("GstData.h264parse could not link GstData.flvmux\n");
    		gst_object_unref (GstData.pipeline);
    		return -1;
    	}
    
    	printf("========= link v4l2 rtmp pipeline running ==========\n");
    	gst_element_set_state (GstData.pipeline, GST_STATE_PLAYING);
    	GstData.loop = g_main_loop_new(NULL, FALSE);	// Create gstreamer loop
    	g_main_loop_run(GstData.loop);					// Loop will run until receiving EOS (end-of-stream), will block here
    	printf("g_main_loop_run returned, stopping rtmp!\n");
    	gst_element_set_state (GstData.pipeline, GST_STATE_NULL);		// Stop pipeline to be released
    	printf("Deleting pipeline\n");
    	gst_object_unref (GstData.pipeline);							// This will also delete all pipeline elements
    	g_source_remove(GstData.bus_watch_id);
    	g_main_loop_unref(GstData.loop);
    
    	return 0;
    }
    

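    Given the argument handling in main(), the program runs either with no arguments (defaulting to /dev/video0 at 1280x720, 30 fps) or with five: ./v4l2_rtmp <device> <width> <height> <framerate> <bitrate>, for example ./v4l2_rtmp /dev/video0 1280 720 30 1500000.
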
    This code is compiled with gcc on Ubuntu 16.04 using the makefile below.
    Copy the system's gstreamer libraries into a libs_x86 directory under the current directory. The system's gstreamer library symlinks all end in .so.0; strip the trailing .0 and keep the name up to .so.

    CFLAGS = -v -g -Wall -Wno-shift-count-overflow -I./include
    LDFLAGS = -L./libs_x86
    CC = gcc
    EXTRA_LIBS = -lstdc++ -lm -lpthread -lgstreamer-1.0 -lgstbase-1.0 -lgobject-2.0 -lgmodule-2.0 -lglib-2.0 -lpcre -lrt
    SRC = v4l2_rtmp.c
    TARGET = v4l2_rtmp
    ALL:
    	$(CC) $(CFLAGS) $(LDFLAGS) $(SRC) -o $(TARGET) $(EXTRA_LIBS) 
    clean:
    	rm v4l2_rtmp *.raw *.mp4 *.wav -rf
    
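    On a stock Ubuntu install, the same build can usually be done without hand-copied libraries by querying pkg-config instead, e.g. gcc v4l2_rtmp.c -o v4l2_rtmp $(pkg-config --cflags --libs gstreamer-1.0).
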

    This code is only a minimal proof of feasibility: it does nothing beyond the basics and has not been verified for network stability over long runs. Audio has not been added yet; I will add it in a few days.

  • gstreamer live sources (rtmp, rtsp)


    Test a live source with the following command:

    gst-launch-1.0 playbin uri=rtmp://58.200.131.2:1935/livetv/hunantv

    RTMP live sources

    1. Hunan TV: rtmp://58.200.131.2:1935/livetv/hunantv

    2. Hubei TV: rtmp://58.200.131.2:1935/livetv/hbtv

    3. Guangxi TV: rtmp://58.200.131.2:1935/livetv/gxtv

    4. Guangdong TV: rtmp://58.200.131.2:1935/livetv/gdtv

    RTSP live sources

    1. Big Buck Bunny (VOD): rtsp://184.72.239.149/vod/mp4://BigBuckBunny_175k.mov

    HTTP source

    gst-launch-1.0 playbin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm

    Playing through waylandsink (setting the display resolution)

    gst-launch-1.0 playbin uri=rtmp://58.200.131.2:1935/livetv/hbtv video-sink="waylandsink window-resolution=1280x720"

    Displaying at a specific resolution

    gst-launch-1.0 videotestsrc ! waylandsink window-resolution=1280x720

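    These sources can also be played from code. A minimal sketch that points the playbin element at the first rtmp source above:

    #include <gst/gst.h>

    int main(int argc, char *argv[])
    {
        gst_init(&argc, &argv);
        GstElement *playbin = gst_element_factory_make("playbin", "play");
        if (playbin == NULL) {
            g_printerr("playbin element not available\n");
            return -1;
        }
        g_object_set(playbin, "uri", "rtmp://58.200.131.2:1935/livetv/hunantv", NULL);
        gst_element_set_state(playbin, GST_STATE_PLAYING);
        /* Wait until an error occurs or the stream ends. */
        GstBus *bus = gst_element_get_bus(playbin);
        GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
            (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
        if (msg != NULL)
            gst_message_unref(msg);
        gst_object_unref(bus);
        gst_element_set_state(playbin, GST_STATE_NULL);
        gst_object_unref(playbin);
        return 0;
    }
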
  • gstreamer audio/video synchronization: earlier Gstreamer articles here operated on audio only or on video only; now both have to be handled at once, to mux audio and video into a file and to push them as a stream. In the article on reading the USB camera's audio data...
  • Start the nginx-rtmp-module rtmp server (download: https://download.csdn.net/download/qq_23350817/12680515) with nginx.exe -c conf/nginx-win.conf, then open VLC and play the address rtmp://127.0.0.1:1935/live/1
  • A proof of concept for a media pipeline that takes three media sources, mixes their audio and video tracks, and streams the content over rtmp. It uses the gstreamer library (1.18.4). How to build: the repository contains docker files for building the solution and running the tests. To build...
  • gst-launch-1.0 v4l2src device="/dev/video0" ! "video/x-raw, format=... ./objs/srs -c conf/rtmp.conf starts the rtmp server; ffplay rtmp://192.168.137.143/live/123 pulls and plays the stream
  • Installing gstreamer from source on ubuntu 16.04

    I am still fairly new to gstreamer; here I share the steps for installing gstreamer and its plugins from source on Ubuntu 16.04, along with a record of the problems I hit. There are plenty of tutorials online that install gstreamer via apt-get, which I have not...
  • Gstreamer errors

    https://gstreamer.freedesktop.org/documentation/tutorials/basic/dynamic-pipelines.html?gi-language=c Use gst_message_parse_error() from this tutorial to parse and print the error and see its type. 1. Fix: add gst_init() gst_...
  • rtmp video streaming in practice

    ffmpeg and gstreamer are rather heavyweight, so using them just for streaming is shooting a mosquito with a cannon. For clients, thin clients in particular, the librtmp (rtmpdump) approach is leaner and more efficient. The basic idea of this approach: download and build librtmp. Download address:...
  • gstreamer summary

    Learning materials: Chinese:... official: https://gstreamer.freedesktop.org/documentation/index.html?gi-language=c Command-line tools: gst-inspect-1.0 prints information about a plugin or element ...
  • gstreamer application notes

    gstreamer site: https://gstreamer.freedesktop.org/ application manual: https://gstreamer.freedesktop.org/documentation/index.html 1. Installing gstreamer (ubuntu): the gstreamer0.10 and gstreamer1.0 versions are easy to mix up. sudo...
  • Shows an example of opening a camera with gstreamer from Python, with the capture driven from a separate thread
  • RTSP to rtmp with hardware decoding on the TX2

    References:... ... 1. Goal: use the TX2 for hardware decoding, grab a camera's rtsp video in real time, and push the decoded stream as rtmp to a server. This article covers the two methods used, FFmpeg commands and GStreamer commands; which works better needs...
  • Gstreamer introduction, following on from the previous article, "FFmpeg push/pull streaming problems at a glance", and continuing from the CPU soft-encode/soft-decode issues raised there. Gstreamer references: Gstreamer doc: Official code lib (the Issues come from gitlab; github does not keep that information). Install...
  • Publishing an h264 file over rtmp

    Wrap the raw h264 stream in the flv format, then publish it to an rtmp streaming server over the rtmp protocol
  • No code needed: build an rtmp live-streaming service from a USB camera on a Raspberry Pi with OpenResty and ffmpeg. ...
  • GStreamer for Android Demo

    GStreamer for Android: working notes from going through the official GStreamer demo. Contents: 1. the build process; 1.1 environment setup; 1.2 JNI mk file configuration; 1.2.1 gradle configuration; 1.2.2 Application.mk; 1.2.3 Android.mk ...
  • Tuning gstreamer command-line parameters

    Original command: gst-launch-1.0 rtspsrc latency=8 location=rtsp://admin:aIlab1234@192.168.2.101/h264/ch1/main/av_stream !...rtmpsink location="rtmp://192.168.2.15:1935/myapp/test1" Source of the parameters:...
  • GStreamer 1.0 series examples: an OpenEmbedded layer for GStreamer 1.0. This layer provides unofficial support for the GStreamer 1.0 framework on OpenEmbedded/Yocto. It is used for GStreamer recipe backports, bringing newer...
  • #include <gst/gst.h> int main(int argc,char **argv) { GstElement *playbin2,*fakesink,*mfw_v4lsink; GMainLoop *loop; gst_init(&argc,&argv); loop=g_main_loop_new(NULL,FALSE);... mfw_v4lsink=gst_elemen
  • A recent project uses nginx to push one camera's video stream to the front end, with the requirement of supporting three streaming methods at the same time: rtmp, ... I did this on a TX2, so the example pushes with gstreamer; other methods such as ffmpeg or OBS would also work. ...
  • Approach: opencv reads the video -> split the video into frames -> process each frame as required -> object detection -> write the frame into a pipe -> ... rtmpUrl = "'rtmp://localhost:1935/live_original/4" camera_path
  • The job the company gave me is deepstream deployment, and it is tough going; gstreamer makes my scalp tingle. The latest task is real-time rtsp streaming: feed the camera input through a gstreamer pipeline and push it out over rtsp. 1. Code: without further ado, here it is; if you want to know...
