DirectShow

  • Airslash
    New Member
    • Nov 2007
    • 221

    DirectShow

    Hello,

    I'm currently experimenting with DirectShow in the hope of replacing part of our Delphi application.

    We currently have a program written in Delphi that relies on the components from ASTASip to create an AVI stream from jpg files.
    The idea behind this is that one application constantly overwrites a jpg file with a new image, while another application reads that file and builds an AVI stream from it to send over TCP.

    I've been reading up on DirectShow, and it is able to transmit AVI files and work with various media types.
    However, I cannot find an example or explanation anywhere of whether it is possible to have one thread constantly read the jpg file, feed it into the DirectShow graph, and build an AVI movie from it.

    Anyone have an idea if this is possible, or where I can find some information about it? MSDN is rather vague about the whole DirectShow stuff...
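
    In DirectShow terms, the "thread that constantly reads the jpg" maps onto a push source filter: the base classes spin up a worker thread that calls FillBuffer for every frame, and the pin can simply reload the latest file there and hand it on as an MJPEG sample. Below is a minimal sketch of such a pin, assuming the DirectShow base classes (streams.h); the class name, file path, frame rate, resolution, and buffer size are illustrative, and the owning CSource filter plus COM registration are not shown.

    #include <streams.h>   // DirectShow base classes (CSource/CSourceStream)
    #pragma comment(lib, "strmiids.lib")
    #include <fstream>
    #include <vector>

    class CJpegPushPin : public CSourceStream
    {
    public:
        CJpegPushPin(HRESULT *phr, CSource *pFilter)
            : CSourceStream(NAME("JPEG push pin"), phr, pFilter, L"Out"),
              m_rtFrameLength(UNITS / 25),   // 25 fps, assumed
              m_iFrame(0) {}

        // Runs on the base-class worker thread for every frame the graph pulls.
        HRESULT FillBuffer(IMediaSample *pSample)
        {
            // Re-read the file that the capture application keeps overwriting
            // (path is illustrative). NOTE: no synchronisation with the writer
            // here; real code would guard against reading a half-written file.
            std::ifstream file("C:\\frames\\current.jpg", std::ios::binary);
            std::vector<char> jpeg((std::istreambuf_iterator<char>(file)),
                                    std::istreambuf_iterator<char>());

            BYTE *pData = NULL;
            pSample->GetPointer(&pData);
            long cb = (long)jpeg.size();
            if (cb > pSample->GetSize()) cb = pSample->GetSize();
            if (cb > 0)
                CopyMemory(pData, jpeg.data(), cb);
            pSample->SetActualDataLength(cb);

            // Timestamp the frame so the AVI mux can build a sensible index.
            REFERENCE_TIME rtStart = m_iFrame * m_rtFrameLength;
            REFERENCE_TIME rtStop  = rtStart + m_rtFrameLength;
            pSample->SetTime(&rtStart, &rtStop);
            pSample->SetSyncPoint(TRUE);      // every JPEG is a key frame
            ++m_iFrame;
            return S_OK;
        }

        // Advertise a single MJPEG video type so the AVI mux accepts the pin.
        HRESULT GetMediaType(CMediaType *pMediaType)
        {
            VIDEOINFOHEADER *pvi = (VIDEOINFOHEADER *)
                pMediaType->AllocFormatBuffer(sizeof(VIDEOINFOHEADER));
            ZeroMemory(pvi, sizeof(VIDEOINFOHEADER));
            pvi->bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
            pvi->bmiHeader.biWidth       = 640;   // assumed camera resolution
            pvi->bmiHeader.biHeight      = 480;
            pvi->bmiHeader.biCompression = MAKEFOURCC('M','J','P','G');
            pvi->AvgTimePerFrame         = m_rtFrameLength;
            pMediaType->SetType(&MEDIATYPE_Video);
            pMediaType->SetSubtype(&MEDIASUBTYPE_MJPG);
            pMediaType->SetFormatType(&FORMAT_VideoInfo);
            return S_OK;
        }

        // Ask the allocator for buffers big enough to hold one JPEG (assumed size).
        HRESULT DecideBufferSize(IMemAllocator *pAlloc, ALLOCATOR_PROPERTIES *pRequest)
        {
            pRequest->cBuffers = 1;
            pRequest->cbBuffer = 512 * 1024;
            ALLOCATOR_PROPERTIES actual;
            return pAlloc->SetProperties(pRequest, &actual);
        }

    private:
        REFERENCE_TIME m_rtFrameLength;   // duration of one frame (100 ns units)
        LONGLONG       m_iFrame;          // running frame counter
    };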
  • mac11
    Contributor
    • Apr 2007
    • 256

    #2
    I worked with DirectShow (more specifically, the DirectShow.NET wrapper library) to make a program using webcams some time ago.

    I don't know exactly how to go about converting a bunch of .jpg files to .avi using DirectShow, but since nobody has replied I figure I have to ask:

    Why not just skip the whole .jpg step and make the first program simply stream the video from its source? It doesn't solve the problem, but it sure would remove some complexity.


    • Airslash
      New Member
      • Nov 2007
      • 221

      #3
      Because that's not really an option, I'm afraid.

      We support two types of cameras: an analogue type that delivers an MJPEG stream directly to the hardware grabber card, which in turn breaks it up into jpgs and saves them to the hard disk, and digital cameras from which we manually grab jpg images.
      Either way, due to technical limits it's not always possible to grab a continuous video stream from the camera.

      I've currently figured out something along these lines (a rough graph-wiring sketch follows the list):
      - a source filter uses UDP to communicate with our application and asks on every run where the latest image has been stored
      - that image gets added to the stream of data and passed to the AVI mux
      - the AVI mux converts it into an AVI stream
      - a custom output filter puts the stream/frames on a socket
      - the client reads the frames and displays them on the screen
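
      A rough sketch of how that graph could be wired with the DirectShow C++/COM API is below; CLSID_JpegUdpSource and CLSID_TcpNetSink are placeholder CLSIDs standing in for the two custom filters, CoInitialize is assumed to have been called already, and most error checks are trimmed.

      #include <dshow.h>
      #pragma comment(lib, "strmiids.lib")
      #pragma comment(lib, "ole32.lib")

      // Placeholder CLSIDs for the two custom filters described above.
      static const GUID CLSID_JpegUdpSource =
          { 0x00000000, 0x0000, 0x0000, { 0, 0, 0, 0, 0, 0, 0, 1 } };
      static const GUID CLSID_TcpNetSink =
          { 0x00000000, 0x0000, 0x0000, { 0, 0, 0, 0, 0, 0, 0, 2 } };

      HRESULT BuildJpegToAviGraph(IGraphBuilder **ppGraph)
      {
          IGraphBuilder *pGraph = NULL;
          ICaptureGraphBuilder2 *pBuild = NULL;
          IBaseFilter *pSource = NULL, *pMux = NULL, *pSink = NULL;

          HRESULT hr = CoCreateInstance(CLSID_FilterGraph, NULL,
              CLSCTX_INPROC_SERVER, IID_IGraphBuilder, (void **)&pGraph);
          if (FAILED(hr)) return hr;

          CoCreateInstance(CLSID_CaptureGraphBuilder2, NULL,
              CLSCTX_INPROC_SERVER, IID_ICaptureGraphBuilder2, (void **)&pBuild);
          pBuild->SetFiltergraph(pGraph);

          // 1. Custom source filter: asks the application over UDP where the
          //    latest jpg is stored and pushes it into the graph as a frame.
          CoCreateInstance(CLSID_JpegUdpSource, NULL, CLSCTX_INPROC_SERVER,
              IID_IBaseFilter, (void **)&pSource);
          pGraph->AddFilter(pSource, L"JPEG UDP source");

          // 2. Standard AVI Mux turns the frame stream into an AVI stream.
          CoCreateInstance(CLSID_AviDest, NULL, CLSCTX_INPROC_SERVER,
              IID_IBaseFilter, (void **)&pMux);
          pGraph->AddFilter(pMux, L"AVI Mux");

          // 3. Custom sink filter: writes the muxed stream to a TCP socket.
          CoCreateInstance(CLSID_TcpNetSink, NULL, CLSCTX_INPROC_SERVER,
              IID_IBaseFilter, (void **)&pSink);
          pGraph->AddFilter(pSink, L"TCP sink");

          // Connect source -> AVI mux -> sink.
          hr = pBuild->RenderStream(NULL, NULL, pSource, pMux, pSink);

          pBuild->Release();
          *ppGraph = pGraph;   // caller runs it via IMediaControl::Run
          return hr;
      }

      On the client side, the mirror image would be a second graph: a custom network source reading from the socket, feeding a decoder and a video renderer to display the frames.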

