This is similar to the problems we face doing real-time virtual video
enhancements:
We have to log camera data (to know where things are pointed) keyed by
video timecode, since the camera data and the video arrive asynchronously
(especially in replay).
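
For concreteness, a rough sketch of what one such timecode-keyed record
could look like; the struct name and fields are illustrative guesses, not
our actual on-disk format:

#include <stdint.h>

/* Hypothetical layout of one camera-state sample in the log, written as
 * the camera data arrives and keyed by the video timecode so it can be
 * matched up with frames later. */
struct camera_record {
        uint32_t timecode;   /* video timecode (frame) this sample maps to */
        float    pan, tilt;  /* where the camera is pointed */
        float    zoom, focus;
};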
These (mmapped) logs can get relatively large (100+ MB each), and access
into them is essentially random (i.e., determined by the director of the
show), so the code reading the log (and taking the page faults) runs in a
separate thread so that it does not stall more important tasks such as
video output.
(Mis-estimating the position for the enhancement is much less of an issue than
dropping the video frame itself. We don't want 10,000,000 people seeing
pure-green frames popping up in the middle of the broadcast.)
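
To make the structure concrete, here is a rough sketch of the arrangement,
assuming POSIX mmap() and pthreads; the function names, the linear scan,
and the single-slot request/result handoff are simplifications standing in
for our real index and queueing code:

#include <fcntl.h>
#include <pthread.h>
#include <stddef.h>
#include <stdint.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

struct camera_record { uint32_t timecode; float pan, tilt, zoom, focus; };

static struct camera_record *log_base;  /* mmap'd log */
static size_t log_count;                /* records in the log */

static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  cond = PTHREAD_COND_INITIALIZER;
static uint32_t wanted_tc;              /* timecode the video thread wants */
static int have_request, have_result;
static struct camera_record latest;     /* most recent completed lookup */

/* Map the whole log read-only; pages are faulted in on demand. */
static int open_log(const char *path)
{
        struct stat st;
        int fd = open(path, O_RDONLY);
        if (fd < 0 || fstat(fd, &st) < 0)
                return -1;
        log_base = mmap(NULL, st.st_size, PROT_READ, MAP_SHARED, fd, 0);
        close(fd);
        if (log_base == MAP_FAILED)
                return -1;
        log_count = st.st_size / sizeof(struct camera_record);
        return 0;
}

/* Worker: does the mmap'd lookup and absorbs any page-fault latency. */
static void *lookup_thread(void *arg)
{
        (void)arg;
        for (;;) {
                uint32_t tc;
                pthread_mutex_lock(&lock);
                while (!have_request)
                        pthread_cond_wait(&cond, &lock);
                tc = wanted_tc;
                have_request = 0;
                pthread_mutex_unlock(&lock);

                /* Touching log_base[] may fault a page in from disk; that
                 * stall happens here, not in the video thread.  A linear
                 * scan over records (assumed sorted by timecode) stands in
                 * for the real index. */
                struct camera_record rec = {0};
                for (size_t i = 0; i < log_count && log_base[i].timecode <= tc; i++)
                        rec = log_base[i];

                pthread_mutex_lock(&lock);
                latest = rec;
                have_result = 1;
                pthread_mutex_unlock(&lock);
        }
        return NULL;
}

/* Video thread: post a request and take whatever result is ready.
 * Never blocks on the log, so a slow fault can't drop a frame. */
static int request_camera_state(uint32_t tc, struct camera_record *out)
{
        int ok;
        pthread_mutex_lock(&lock);
        wanted_tc = tc;
        have_request = 1;
        if ((ok = have_result))
                *out = latest;
        pthread_cond_signal(&cond);
        pthread_mutex_unlock(&lock);
        return ok;
}

The video thread would call request_camera_state() once per output frame
(after starting the worker with pthread_create()); if the lookup hasn't
finished in time it just reuses the previous estimate, so any page-fault
stall lands in the worker and never in the frame-output path.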
-Roberto JP
robertopeon@sportvision.com