- Is there a useful way on Ubuntu, from the command line, to separate video and audio for streaming: the video streamed to a network port (client media players connecting via `udp://`, `tcp://`, or `http://`) while the audio is redirected to one of the local audio hardware devices (as listed by `aplay -l`)? (If both outputs are served from a single ffmpeg input, the video and audio streams should start synchronized, AFAIK.)
- Can the audio output device be changed dynamically, redirecting the audio between the connected audio hardware?

The `ffmpeg` suite would be the preferred tool on this installation (other available options are `mplayer` and `mencoder`).
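If ffmpeg was built with ALSA support, a single process could in principle demux one input into both outputs: video muxed to MPEG-TS on a UDP port, audio written straight to an ALSA device. A minimal sketch, assuming the input file name, the port `1234`, and the device `hw:0,0` as placeholders:

```shell
#!/bin/sh
# Sketch only: requires an ffmpeg build with the ALSA output device enabled.
# "input.mp4", port 1234, and "hw:0,0" are placeholders, not tested values.
ffmpeg -re -i input.mp4 \
    -map 0:v -c:v libx264 -preset fast -f mpegts udp://localhost:1234 \
    -map 0:a:0 -f alsa hw:0,0
```

Since both outputs are fed from one demuxed input, they should start together; whether they stay in sync over time still depends on the ALSA device clock and the network.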
The following redirects the audio to a local stdout pipe for `ffplay`, but the media player reading from the UDP streaming port showed increased delay:

media upstream: `ffmpeg -i h264x_mp3.mp4 -preset fast -vcodec libx264 -f mpeg -map 0:v udp://localhost:port -map 0:a:0 -f mp3 - | ffplay -nodisp -i -`

downstream client: `mplayer udp://localhost:port`
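Part of the delay on the `ffplay` side may come from its default probing and buffering rather than from the pipe itself. As something to experiment with (not a verified fix for this setup), `ffplay` can be told to buffer less; port `1234` stands in for the actual port:

```shell
#!/bin/sh
# Sketch only: same pipeline, with ffplay's input buffering reduced.
# -fflags nobuffer and -flags low_delay trade robustness for latency.
ffmpeg -i h264x_mp3.mp4 \
    -map 0:v -c:v libx264 -preset fast -f mpegts udp://localhost:1234 \
    -map 0:a:0 -f mp3 - \
  | ffplay -nodisp -fflags nobuffer -flags low_delay -i -
```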
The following example plays the audio locally and should transfer the video stream over a named pipe, but the media player client connecting to the named pipe shows no video output:

media server: `mplayer -ao alsa:device=hw=0.0 -vo mpegpes:namedpipe h264x_mp3.mp4`

media client: `mplayer -ao none namedpipe`
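As an alternative to mplayer's `-vo mpegpes`, the video could be written into an explicitly created FIFO by ffmpeg while the audio goes to the ALSA device; the pipe path and device below are assumptions, and `-c:v copy` assumes the input video (h264, per the file name) can be re-muxed into MPEG-TS without re-encoding:

```shell
#!/bin/sh
# Sketch only: split via an explicit FIFO instead of mplayer's mpegpes output.
# /tmp/videopipe and hw:0,0 are placeholders; requires ffmpeg with ALSA output.
mkfifo /tmp/videopipe

# Server: video re-muxed to MPEG-TS into the pipe, audio to the ALSA device.
# ffmpeg blocks on the FIFO until a reader connects.
ffmpeg -re -i h264x_mp3.mp4 \
    -map 0:v -c:v copy -f mpegts /tmp/videopipe \
    -map 0:a:0 -f alsa hw:0,0 &

# Client: read the video stream from the pipe.
mplayer -ao none /tmp/videopipe
```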