How to create and display a 48-bit image?

  • manontheedge
    New Member
    • Oct 2006
    • 175


    I've been creating 24-bit images (8 bits per color channel) from raw data, which works fine using the Bitmap class.

    But now I need to create a 48-bit image. The problem is that apparently only 8 bits per color are actually being used when creating it, so I'm losing a great deal of detail in each pixel. When I go to use "SetPixel", it gives the message, "the value is limited to 8 bits". I need all 16 bits.

    Anyone have any idea of what I can do?
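    For reference, here's a stripped-down sketch of what I'm running into (the sample value here is made up; my real code reads it from the dump):

```csharp
using System;
using System.Drawing;
using System.Drawing.Imaging;

class SetPixelLimit
{
    static void Main()
    {
        // Even with a 48bpp pixel format, SetPixel goes through
        // System.Drawing.Color, which only holds 8 bits per channel.
        var bmp = new Bitmap(16, 16, PixelFormat.Format48bppRgb);

        int sample = 1000; // a 16-bit channel value from the raw data

        try
        {
            // Color.FromArgb rejects anything outside 0..255, so a
            // full 16-bit value can never reach the bitmap this way.
            bmp.SetPixel(0, 0, Color.FromArgb(sample, sample, sample));
        }
        catch (ArgumentException ex)
        {
            Console.WriteLine(ex.Message);
        }
    }
}
```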
  • GaryTexmo
    Recognized Expert Top Contributor
    • Jul 2009
    • 1501

    #2
    Are there 16-bits-per-channel images? I don't know; I'm only familiar with the 8-bit-per-channel kind, where the R, G, B, and A values each range from 0 to 255. In fact, I'm relatively sure this is limited by the graphics card even...

    What kind of data do you need to store? Can you provide an example, or some sample code?

    Sorry, I'm trying to understand the problem :)


    • manontheedge
      New Member
      • Oct 2006
      • 175

      #3
      Thanks for the reply.

      What I'm working with are video dumps: hundreds of individual frames, and the format of one type of file is 14-bit grayscale. I've figured out reading all the different odd file types and converting each frame to a 24- or 32-bit Bitmap (8 bits per color) in Windows Forms, but now I need to deal with one that's 14-bit monochrome. When I create a new Bitmap and fill in the values (using SetPixel), it turns the 14-bit image I want to create into an 8-bit image ... it chops off the second byte.

      I tried normalizing the data so I only need one byte (0-255), but it just doesn't look good enough; it needs the detail that it's losing by not using the full 14 bits.

      If you need me to explain further, or think code would help, let me know. I appreciate the help.
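      In case it helps, here's roughly the workaround I've been trying: writing the 16-bit values directly with LockBits instead of going through SetPixel. This is a simplified sketch (the names are made up, and I'm assuming the 48bpp channel order is BGR like 24bpp; GDI+'s 48bpp support seems spotty, so no guarantees it displays everywhere):

```csharp
using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;

class Gray14ToBitmap
{
    // Expand a 14-bit grayscale frame into a 48bpp RGB bitmap, writing
    // the 16-bit channel values directly via LockBits (SetPixel can't).
    public static Bitmap ToBitmap48(ushort[] frame, int width, int height)
    {
        var bmp = new Bitmap(width, height, PixelFormat.Format48bppRgb);
        var rect = new Rectangle(0, 0, width, height);
        BitmapData data = bmp.LockBits(rect, ImageLockMode.WriteOnly,
                                       PixelFormat.Format48bppRgb);
        try
        {
            // Three 16-bit channels per pixel (assumed B, G, R order).
            var row = new short[width * 3];
            for (int y = 0; y < height; y++)
            {
                for (int x = 0; x < width; x++)
                {
                    // Shift 14-bit data (0..16383) up into 16-bit range.
                    short v = unchecked((short)(frame[y * width + x] << 2));
                    row[x * 3 + 0] = v;
                    row[x * 3 + 1] = v;
                    row[x * 3 + 2] = v;
                }
                // Stride may be padded past width*6 bytes, so copy per row.
                Marshal.Copy(row, 0, IntPtr.Add(data.Scan0, y * data.Stride),
                             row.Length);
            }
        }
        finally
        {
            bmp.UnlockBits(data);
        }
        return bmp;
    }
}
```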


      • GaryTexmo
        Recognized Expert Top Contributor
        • Jul 2009
        • 1501

        #4
        I'll admit up front, I'm not familiar with video formats, so I actually have no idea how they work. I don't know what 14-bit grayscale is, but maybe I can still help if we work through it together. Can you post an example of what the inputs are? If it's grayscale, what is the 14-bit number representing?

        As I understand it, the graphics card still only displays RGB values that are 0 to 255, so I don't see how you can get away from that (I may be wrong here).

        I guess my question is this... on a standard display you've got 256 levels of gray (0 to 255, assuming R = G = B), is this not the case with your input?
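        One thing that might help even on an 8-bit display: instead of squashing the whole 14-bit range down to 0-255 at once, map a chosen sub-range (a "window") onto the full 8 bits, the way medical image viewers do, so the detail inside that window survives. A rough sketch (the window numbers are just examples):

```csharp
using System;

class Windowing
{
    // Map a 14-bit sample to 8 bits with a window/level transform:
    // values below the window clamp to 0, values above clamp to 255,
    // and the window itself gets stretched across the full 8-bit range.
    public static byte WindowTo8Bit(ushort sample, int center, int width)
    {
        int lo = center - width / 2;
        int scaled = (sample - lo) * 255 / Math.Max(1, width);
        return (byte)Math.Min(255, Math.Max(0, scaled));
    }

    static void Main()
    {
        // Window centered at 8192 (mid-scale for 14-bit data), 2048 wide.
        Console.WriteLine(WindowTo8Bit(7168, 8192, 2048)); // bottom of window -> 0
        Console.WriteLine(WindowTo8Bit(8192, 8192, 2048)); // center -> 127
        Console.WriteLine(WindowTo8Bit(9216, 8192, 2048)); // top -> 255
    }
}
```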
