Real-time streaming with FFmpeg's server component, ffserver
A complete setup involves four parts:
1. ffmpeg: handles the transcoding work, converting the source media files on your server into the stream that will be sent out.
2. ffserver: responds to clients' streaming requests and sends the stream data to them.
3. ffserver.conf: the configuration file ffserver reads at startup; it mainly sets the network options, the cache file feed1.ffm (described below), and the format parameters of the streams to be served.
4. feed1.ffm: can be thought of as a cache file for the stream data; ffmpeg sends the transcoded data to ffserver, and if no client is connected, ffserver buffers the data in this file. A minimal end-to-end sketch follows this list.
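A minimal end-to-end run, assuming ffserver.conf lives at /etc/ffserver.conf, listens on port 8090, and defines feed1.ffm and a test1.mpg stream as in the official example below (paths and names are placeholders):

ffserver -f /etc/ffserver.conf &                      # start the server; it creates /tmp/feed1.ffm
ffmpeg -i input.mp4 http://localhost:8090/feed1.ffm   # transcode a source file and push it into the feed
ffplay http://server-ip:8090/test1.mpg                # on any client, request a stream defined in ffserver.conf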
Official example: the ffserver.conf configuration file
# Port on which the server is listening. You must select a different
# port from your standard HTTP web server if it is running on the same
# computer.
Port 8090

# Address on which the server is bound. Only useful if you have
# several network interfaces.
BindAddress 0.0.0.0

# Number of simultaneous HTTP connections that can be handled. It has
# to be defined *before* the MaxClients parameter, since it defines the
# MaxClients maximum limit.
MaxHTTPConnections 2000

# Number of simultaneous requests that can be handled. Since FFServer
# is very fast, it is more likely that you will want to leave this high
# and use MaxBandwidth, below.
MaxClients 1000

# This the maximum amount of kbit/sec that you are prepared to
# consume when streaming to clients.
MaxBandwidth 1000

# Access log file (uses standard Apache log file format)
# '-' is the standard output.
CustomLog -

# Suppress that if you want to launch ffserver as a daemon.
NoDaemon

##################################################################
# Definition of the live feeds. Each live feed contains one video
# and/or audio sequence coming from an ffmpeg encoder or another
# ffserver. This sequence may be encoded simultaneously with several
# codecs at several resolutions.

<Feed feed1.ffm>

# You must use 'ffmpeg' to send a live feed to ffserver. In this
# example, you can type:
#
# ffmpeg http://localhost:8090/feed1.ffm

# ffserver can also do time shifting. It means that it can stream any
# previously recorded live stream. The request should contain:
# "http://xxxx?date=[YYYY-MM-DDT][[HH:]MM:]SS[.m...]". You must specify
# a path where the feed is stored on disk. You also specify the
# maximum size of the feed, where zero means unlimited. Default:
# File=/tmp/feed_name.ffm FileMaxSize=5M
File /tmp/feed1.ffm
FileMaxSize 200K

# You could specify
# ReadOnlyFile /saved/specialvideo.ffm
# This marks the file as readonly and it will not be deleted or updated.

# Specify launch in order to start ffmpeg automatically.
# First ffmpeg must be defined with an appropriate path if needed,
# after that options can follow, but avoid adding the http:// field
#Launch ffmpeg

# Only allow connections from localhost to the feed.
ACL allow 127.0.0.1

</Feed>

##################################################################
# Now you can define each stream which will be generated from the
# original audio and video stream. Each format has a filename (here
# 'test1.mpg'). FFServer will send this stream when answering a
# request containing this filename.

<Stream test1.mpg>

# coming from live feed 'feed1'
Feed feed1.ffm

# Format of the stream : you can choose among:
# mpeg       : MPEG-1 multiplexed video and audio
# mpegvideo  : only MPEG-1 video
# mp2        : MPEG-2 audio (use AudioCodec to select layer 2 and 3 codec)
# ogg        : Ogg format (Vorbis audio codec)
# rm         : RealNetworks-compatible stream. Multiplexed audio and video.
# ra         : RealNetworks-compatible stream. Audio only.
# mpjpeg     : Multipart JPEG (works with Netscape without any plugin)
# jpeg       : Generate a single JPEG image.
# asf        : ASF compatible streaming (Windows Media Player format).
# swf        : Macromedia Flash compatible stream
# avi        : AVI format (MPEG-4 video, MPEG audio sound)
Format mpeg

# Bitrate for the audio stream. Codecs usually support only a few
# different bitrates.
AudioBitRate 32

# Number of audio channels: 1 = mono, 2 = stereo
AudioChannels 1

# Sampling frequency for audio. When using low bitrates, you should
# lower this frequency to 22050 or 11025. The supported frequencies
# depend on the selected audio codec.
AudioSampleRate 44100

# Bitrate for the video stream
VideoBitRate 64

# Ratecontrol buffer size
VideoBufferSize 40

# Number of frames per second
VideoFrameRate 3

# Size of the video frame: WxH (default: 160x128)
# The following abbreviations are defined: sqcif, qcif, cif, 4cif, qqvga,
# qvga, vga, svga, xga, uxga, qxga, sxga, qsxga, hsxga, wvga, wxga, wsxga,
# wuxga, woxga, wqsxga, wquxga, whsxga, whuxga, cga, ega, hd480, hd720,
# hd1080
VideoSize 160x128

# Transmit only intra frames (useful for low bitrates, but kills frame rate).
#VideoIntraOnly

# If non-intra only, an intra frame is transmitted every VideoGopSize
# frames. Video synchronization can only begin at an intra frame.
VideoGopSize 12

# More MPEG-4 parameters
# VideoHighQuality
# Video4MotionVector

# Choose your codecs:
#AudioCodec mp2
#VideoCodec mpeg1video

# Suppress audio
#NoAudio

# Suppress video
#NoVideo

#VideoQMin 3
#VideoQMax 31

# Set this to the number of seconds backwards in time to start. Note that
# most players will buffer 5-10 seconds of video, and also you need to allow
# for a keyframe to appear in the data stream.
#Preroll 15

# ACL:
# You can allow ranges of addresses (or single addresses)
#ACL ALLOW <first address> <last address>
# You can deny ranges of addresses (or single addresses)
#ACL DENY <first address> <last address>
# You can repeat the ACL allow/deny as often as you like. It is on a per
# stream basis. The first match defines the action. If there are no matches,
# then the default is the inverse of the last ACL statement.
#
# Thus 'ACL allow localhost' only allows access from localhost.
# 'ACL deny 1.0.0.0 1.255.255.255' would deny the whole of network 1 and
# allow everybody else.

</Stream>

##################################################################
# Example streams

# Multipart JPEG
#<Stream test.mjpg>
#Feed feed1.ffm
#Format mpjpeg
#VideoFrameRate 2
#VideoIntraOnly
#NoAudio
#Strict -1
#</Stream>

# Single JPEG
#<Stream test.jpg>
#Feed feed1.ffm
#Format jpeg
#VideoFrameRate 2
#VideoIntraOnly
##VideoSize 352x240
#NoAudio
#Strict -1
#</Stream>

# Flash
#<Stream test.swf>
#Feed feed1.ffm
#Format swf
#VideoFrameRate 2
#VideoIntraOnly
#NoAudio
#</Stream>

# ASF compatible
<Stream test.asf>
Feed feed1.ffm
Format asf
VideoFrameRate 15
VideoSize 352x240
VideoBitRate 256
VideoBufferSize 40
VideoGopSize 30
AudioBitRate 64
StartSendOnKey
</Stream>

# MP3 audio
#<Stream test.mp3>
#Feed feed1.ffm
#Format mp2
#AudioCodec mp3
#AudioBitRate 64
#AudioChannels 1
#AudioSampleRate 44100
#NoVideo
#</Stream>

# Ogg Vorbis audio
#<Stream test.ogg>
#Feed feed1.ffm
#Title "Stream title"
#AudioBitRate 64
#AudioChannels 2
#AudioSampleRate 44100
#NoVideo
#</Stream>

# Real with audio only at 32 kbits
#<Stream test.ra>
#Feed feed1.ffm
#Format rm
#AudioBitRate 32
#NoVideo
#NoAudio
#</Stream>

# Real with audio and video at 64 kbits
#<Stream test.rm>
#Feed feed1.ffm
#Format rm
#AudioBitRate 32
#VideoBitRate 128
#VideoFrameRate 25
#VideoGopSize 25
#NoAudio
#</Stream>

##################################################################
# A stream coming from a file: you only need to set the input
# filename and optionally a new format. Supported conversions:
#    AVI -> ASF

#<Stream file.rm>
#File "/usr/local/httpd/htdocs/tlive.rm"
#NoAudio
#</Stream>

#<Stream file.asf>
#File "/usr/local/httpd/htdocs/test.asf"
#NoAudio
#Author "Me"
#Copyright "Super MegaCorp"
#Title "Test stream from disk"
#Comment "Test comment"
#</Stream>

##################################################################
# RTSP examples
#
# You can access this stream with the RTSP URL:
#   rtsp://localhost:5454/test1-rtsp.mpg
#
# A non-standard RTSP redirector is also created. Its URL is:
#   http://localhost:8090/test1-rtsp.rtsp

#<Stream test1-rtsp.mpg>
#Format rtp
#File "/usr/local/httpd/htdocs/test1.mpg"
#</Stream>

# Transcode an incoming live feed to another live feed,
# using libx264 and video presets

#<Stream live.h264>
#Format rtp
#Feed feed1.ffm
#VideoCodec libx264
#VideoFrameRate 24
#VideoBitRate 100
#VideoSize 480x272
#AVPresetVideo default
#AVPresetVideo baseline
#AVOptionVideo flags +global_header
#
#AudioCodec libfaac
#AudioBitRate 32
#AudioChannels 2
#AudioSampleRate 22050
#AVOptionAudio flags +global_header
#</Stream>

##################################################################
# SDP/multicast examples
#
# If you want to send your stream in multicast, you must set the
# multicast address with MulticastAddress. The port and the TTL can
# also be set.
#
# An SDP file is automatically generated by ffserver by adding the
# 'sdp' extension to the stream name (here
# http://localhost:8090/test1-sdp.sdp). You should usually give this
# file to your player to play the stream.
#
# The 'NoLoop' option can be used to avoid looping when the stream is
# terminated.

#<Stream test1-sdp.mpg>
#Format rtp
#File "/usr/local/httpd/htdocs/test1.mpg"
#MulticastAddress 224.124.0.1
#MulticastPort 5000
#MulticastTTL 16
#NoLoop
#</Stream>

##################################################################
# Special streams

# Server status
<Stream stat.html>
Format status

# Only allow local people to get the status
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255

#FaviconURL http://pond1.gladstonefamily.net:8080/favicon.ico
</Stream>

# Redirect index.html to the appropriate site
<Redirect index.html>
URL http://www.ffmpeg.org/
</Redirect>
(1) Live streaming over HTTP
To stream a file from disk (inputfile below stands for your source file; the port in the feed URL must match the Port directive in your ffserver.conf, so this example uses 10535 whereas the sample configuration above listens on 8090):
ffserver -f myfile/ffmpeg0.8.9/ffserver.conf & ffmpeg -i inputfile http://localhost:10535/feed1.ffm
To stream a live capture from a camera:
ffserver -f myfile/ffmpeg0.8.9/ffserver.conf & ffmpeg -f video4linux2 -framerate 30 -i /dev/video0 http://127.0.0.1:8090/feed1.ffm
Start ffserver and then ffmpeg. ffserver must be started before ffmpeg, and it needs the -f option to point at its configuration file. Once ffserver is up, feed1.ffm is created. If you open feed1.ffm at this point you will find that its beginning has already been written: you can spot the FFM keyword along with the configuration of the streams to be delivered to clients. When feed1.ffm is later used as a buffer, this portion is never overwritten; think of it as the header of the feed1.ffm file.
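To take a quick look at that header, dumping the first bytes of the feed file is enough (the /tmp/feed1.ffm path is the one from the example configuration; hexdump -C works if xxd is not installed):

head -c 64 /tmp/feed1.ffm | xxd   # the FFM tag written by ffserver appears right at the start; the stream parameters follow in the first block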
Once ffserver is running, start ffmpeg. Its key argument is the feed URL "http://ip:10535/feed1.ffm", where ip is the address of the host running ffserver (localhost also works when ffmpeg and ffserver run on the same machine). After starting, ffmpeg opens a first, short-lived connection to ffserver; through this first connection it retrieves the output-stream configuration from ffserver and adopts it as its own encoding settings. ffmpeg then drops this connection and opens a second, long-lived connection to ffserver, over which it sends the encoded data.
If you watch ffserver's output during this period you will see two HTTP 200 responses, one for each of the two connections.
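A quick way to confirm that the second, long-lived connection stays up (port 8090 assumed; ss -tnp is an alternative on systems without net-tools):

netstat -tnp | grep 8090   # one ESTABLISHED connection from ffmpeg to ffserver should remain while the feed is being pushed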
After ffmpeg grabs data from the camera, it encodes it according to the output-stream settings and sends it to ffserver. When ffserver receives the data and there is no playback request on the network, it buffers the data in feed1.ffm: it adds some header information and splits the data into 4096-byte blocks (each block has its own internal structure). Once feed1.ffm reaches the size specified in ffserver.conf, writing wraps around to the start of the file (skipping the header) and overwrites the old data. As soon as a playback request arrives, ffserver reads data back out of feed1.ffm and sends it to the client.
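This wrap-around behaviour can be observed while ffmpeg is pushing data (FileMaxSize is 200K in the example configuration above):

watch -n 1 'ls -l /tmp/feed1.ffm'   # the file grows until FileMaxSize is reached, then its size stops increasing as old blocks are overwritten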
Streaming a local file with ffmpeg:
./ffmpeg -i ./1.mov -vcodec libx264 -qmin 3 -qmax 31 -qdiff 4 -me_range 16 -keyint_min 25 -qcomp 0.6 -b 9000K http://localhost:8090/feed1.ffm
Capturing data from a local camera:
./ffmpeg -f video4linux2 -i /dev/video0 http://localhost:8090/feed1.ffm
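With either command feeding feed1.ffm, playback can be checked against any stream defined in ffserver.conf; assuming the official example configuration above (Port 8090, stream test1.mpg, status page stat.html):

ffplay http://localhost:8090/test1.mpg   # play the MPEG stream built from feed1
# http://localhost:8090/stat.html shows the server status page defined by <Stream stat.html>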
(2) Local files over HTTP
ffserver -f /etc/ffserver.conf
Start ffserver with the command above, then play the stream with ffplay http://ip:port/test.flv, or enter the same URL in VLC.
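The official example configuration does not define a test.flv stream, so a section for it has to be added to ffserver.conf before this URL will work. A sketch of such a section, built from the feed like the other examples (the flv muxer's default codecs are used; all rates and sizes here are illustrative, not from the source):

<Stream test.flv>
Feed feed1.ffm
Format flv
VideoFrameRate 25
VideoBitRate 512
VideoSize 640x480
AudioBitRate 64
AudioChannels 2
AudioSampleRate 44100
</Stream>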
(3) Local files over RTSP
ffserver -f /etc/ffserver.conf
Start ffserver with the command above, then play the stream with ffplay rtsp://ip:port/rtsp.mpg, or enter the same URL in VLC.
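For RTSP playback, ffserver also needs an RTSP listening port and a stream defined with Format rtp. The rtsp.mpg name used above is not in the official example, so a possible addition to ffserver.conf would look like this (port number and file path are placeholders):

RTSPPort 5454

<Stream rtsp.mpg>
Format rtp
File "/usr/local/httpd/htdocs/test1.mpg"
</Stream>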
Note: during testing, FLV files could not be streamed over RTSP.