The problem is the following: one source (an IP camera) produces a video stream (H.264/MP4), and another source (a Linux machine with a sound card) produces an audio stream (AAC/MP4). Is it possible to merge these two sources into a new stream/file using FFmpeg, and to synchronize the audio with the video using the timestamps inside each stream?
The clocks on both sources (the IP camera and the Linux machine) are synchronized against the same local NTP server, so both streams should carry accurate timestamps. The goal is simply to tell FFmpeg to use those timestamps to sync the audio stream with the video stream.
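For reference, a plain mux of the two recordings (file names are placeholders) would look something like the sketch below. This copies both streams into one container but does not align them by wall-clock time, which is exactly the part I am asking about:

```shell
# Mux the camera video and the separately captured audio into one file.
# -map picks the video from input 0 and the audio from input 1;
# -c copy avoids re-encoding. Note this starts both streams at their
# own time zero, so it does NOT sync them by NTP/wall-clock timestamps.
ffmpeg -i camera_video.mp4 -i audio_capture.mp4 \
  -map 0:v:0 -map 1:a:0 -c copy merged.mp4
```

I am aware that `-itsoffset` can shift one input by a fixed amount, but I would prefer FFmpeg to derive the offset from the timestamps already present in the streams rather than me measuring it by hand.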
Thanks for any comments.