I know that generally, it's no problem to overlay HTML (and even do advanced compositing operations) on native HTML5 video. I've seen cool tricks with keying out green screens in realtime, in the browser, for example.
What I haven't seen yet, though, is something that tracks in-video content, perhaps at the pixel level, and modifies the composited overlay accordingly. Motion tracking, basically. A good example would be an augmented reality sort of app (though for simplicity's sake, let's say augmenting an overlay over on-demand video rather than live video).
Has anyone seen any projects like this, or even better, any frameworks for HTML5 video overlaying (other than transport controls)?
If we use the canvas element to capture frames of the video, we can get at the pixel-level information. From there we can do the motion tracking ourselves, I think. HTML5's role is probably limited to grabbing the pixel information; detecting the things we need is then up to our own code.
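As a minimal sketch of that approach (the element IDs here are just placeholders for illustration), you can draw the current video frame onto a canvas and read back the raw pixels each animation frame, then feed them to whatever tracking logic you write:

    // Placeholder element IDs -- adjust to your own markup.
    const video = document.getElementById("myVideo") as HTMLVideoElement;
    const canvas = document.getElementById("myCanvas") as HTMLCanvasElement;
    const ctx = canvas.getContext("2d")!;

    function processFrame(): void {
      // Copy the current video frame onto the canvas...
      ctx.drawImage(video, 0, 0, canvas.width, canvas.height);

      // ...then read back raw RGBA values for analysis
      // (e.g. as input to a motion-tracking routine).
      const frame: ImageData = ctx.getImageData(0, 0, canvas.width, canvas.height);
      const pixels: Uint8ClampedArray = frame.data; // [r, g, b, a, r, g, b, a, ...]

      // Your tracking logic would go here; this sketch just logs the first pixel.
      console.log(pixels[0], pixels[1], pixels[2], pixels[3]);

      // Keep processing while the video is playing.
      if (!video.paused && !video.ended) {
        requestAnimationFrame(processFrame);
      }
    }

    video.addEventListener("play", () => requestAnimationFrame(processFrame));

Note that getImageData will throw a security error if the video comes from another origin without CORS headers, so the video needs to be same-origin or served with appropriate CORS settings.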
And I haven't found any such frameworks for the HTML5 video tag, probably because there is no single video format supported by all browsers.