SQL XML Bulk Load varbinary(8000) to varbinary(MAX)

  • dbudry
    New Member
    • Mar 2010
    • 4

    SQL XML Bulk Load varbinary(8000) to varbinary(MAX)

    I have a program I developed that builds an XML file storing base64Binary data from images held in SQL Server. The XML file, along with its schema, gets transferred; another program then picks them up and uses the SQLXMLBulkLoad library to insert the XML data, mapped via the schema file, back into SQL Server.

    The problem is that when the data is imported into SQL Server, the column is created as varbinary(8000) instead of varbinary(MAX). That cuts off some of my data, and thus my images will not show up on the website. How do I get it to create the table with varbinary(MAX)?

    Help?

    Below is an example of my schema file:
    Code:
    <xs:schema id="ROOT" xmlns="" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:msdata="urn:schemas-microsoft-com:xml-msdata">
      <xs:element name="tblBioApproved">
        <xs:complexType>
          <xs:attribute name="BioPictureA" type="xs:base64Binary" />
          <xs:attribute name="BioPictureTypeA" type="xs:string" />
        </xs:complexType>
      </xs:element>
    </xs:schema>
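    (For reference: SQLXML's annotated mapping schemas accept annotations from the urn:schemas-microsoft-com:mapping-schema namespace, and its sql:datatype annotation is meant to pin down the SQL type a generated column gets. A sketch of the schema above annotated that way, untested here, might look like this:)
    Code:
    <xs:schema id="ROOT" xmlns="" xmlns:xs="http://www.w3.org/2001/XMLSchema"
               xmlns:sql="urn:schemas-microsoft-com:mapping-schema">
      <xs:element name="tblBioApproved">
        <xs:complexType>
          <!-- sql:datatype asks bulk load to create this column as varbinary(max)
               rather than defaulting to varbinary(8000) for xs:base64Binary -->
          <xs:attribute name="BioPictureA" type="xs:base64Binary" sql:datatype="varbinary(max)" />
          <xs:attribute name="BioPictureTypeA" type="xs:string" />
        </xs:complexType>
      </xs:element>
    </xs:schema>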
  • dbudry
    New Member
    • Mar 2010
    • 4

    #2
    The solution that I ended up choosing

    In my program, I check whether it is updating the table that needs the varbinary(MAX) data type. When the program reaches that XML file, I configure SQLXMLBulkLoad not to drop the table; then I clear out the SQL table, alter the data type to varbinary(MAX) just to be sure, and import my XML file with the schema.
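    In T-SQL terms, the pre-import step amounts to something like the sketch below (table and column names are taken from the schema in the first post; adjust to taste):
    Code:
    -- Keep the table rather than letting the bulk load drop and recreate it,
    -- but clear out the previous rows.
    TRUNCATE TABLE tblBioApproved;

    -- Widen the column so the base64 payload is no longer truncated at 8000 bytes.
    ALTER TABLE tblBioApproved
        ALTER COLUMN BioPictureA varbinary(MAX);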

    This is not the way I wanted it to work, because I had to hard-code the table name in the program. My hope was to have everything dynamically driven so I didn't have to hard-code any table names.

    Though I found a solution that worked for me, if anyone is able to answer my original question, I would redesign the program.

    Thanks


    • NeoPa
      Recognized Expert Moderator MVP
      • Oct 2006
      • 32637

      #3
      You seem to be implying (I can't see where you say much about your problem explicitly) that you are using an approach where the data is uploaded into a new table. Is there any reason why you wouldn't be loading into a previously created, and potentially cleared down, table?

      Excuse me if this is a dumb question. I am no ASP.NET programmer, but I have a fair amount of database experience, with even some limited SQL Server thrown in.


      • dbudry
        New Member
        • Mar 2010
        • 4

        #4
        No, that is an excellent question. I developed it this way to avoid as much programming and manual work as possible. My objective was to change only the locally stored code, not the remotely stored program, so that no matter what tables or data I add in the future, as long as I pass the needed data and schema, the remote program will pick it up and import it into the SQL database.

        Using the existing table and sizing down the fields would offer better performance; I just didn't want the extra work of maintaining the tables on the other side.

        Unfortunately though, every time you use the bulk load and clear out the table, a schema with a type of base64Binary will not create the table with varbinary(MAX), so I had to make one exception for that table, unlike the others.

        Did that make sense? Sorry it took so long for me to get back, but thank you for your reply.


        • Frinavale
          Recognized Expert Expert
          • Oct 2006
          • 9749

          #5
           I don't understand what you're doing, probably because I cannot picture what your application is.

           Is it a Web Service?
           Do you have a desktop client application consuming the web service?
           Or is this an ASP.NET web application?


          • dbudry
            New Member
            • Mar 2010
            • 4

            #6
             I have a set of programs that run on a virtual machine locally. Every day at 3 AM, a program runs that collects data from a local SQL server in XML format; then, using WinSCP, I transfer the XML files with the schemas I build over SFTP to my hosting server and drop the contents off in a folder. At around 3:30 AM, another set of programs that I designed runs on the hosting server, grabbing the XML files with their schemas and bulk loading them into the SQL server on the host.


            • NeoPa
              Recognized Expert Moderator MVP
              • Oct 2006
              • 32637

              #7
               Originally posted by dbudry
               No, that is an excellent question. ...
              My turn to apologise for my delayed response.

               Your answer's a good one, but I'm afraid it tells me, more than anything else, that I can't help you. Sorry, and good luck.


              • Jerry Winston
                Recognized Expert New Member
                • Jun 2008
                • 145

                #8
                 I think you're just missing a few planning steps. Your problem is creating tables, not necessarily importing (which it seems you have mastered) or formatting. If your table name is inconsistent but your table structure is always the same, e.g. (ID BIGINT, myXMLData VARBINARY(MAX), descData VARCHAR(1024)), you need a table-creating function on either the database side or the application side. In an application-side solution you would add something like this before your upload statement:
                Code:
                fn_buildTable(myTableName,myDBConnection)
                Where the application function takes the table name as an argument.

                 The T-SQL solution would involve stored procedures and go something like this (parameter types here are placeholders):
                 Code:
                 CREATE PROCEDURE sp_uploadXML
                     @newTableName sysname,
                     @field1 NVARCHAR(MAX),
                     @field2 NVARCHAR(MAX),
                     @field3 NVARCHAR(MAX)
                 AS
                 BEGIN
                     EXEC sp_createTable @newTableName;
                     EXEC sp_saveXMLData @newTableName, @field1, @field2, @field3;
                 END
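                 sp_createTable and sp_saveXMLData above are just stand-ins. One possible implementation of the first, assuming the fixed (ID, myXMLData, descData) structure mentioned earlier, is a dynamic-SQL procedure along these lines:
                 Code:
                 -- Sketch only: creates the fixed-structure table under a caller-supplied name.
                 CREATE PROCEDURE sp_createTable
                     @newTableName sysname
                 AS
                 BEGIN
                     DECLARE @sql nvarchar(max);
                     -- QUOTENAME guards the dynamic SQL against malformed or malicious names.
                     SET @sql = N'IF OBJECT_ID(' + QUOTENAME(@newTableName, '''') + N') IS NULL
                                  CREATE TABLE ' + QUOTENAME(@newTableName) + N' (
                                      ID        BIGINT IDENTITY(1,1) PRIMARY KEY,
                                      myXMLData VARBINARY(MAX),
                                      descData  VARCHAR(1024)
                                  );';
                     EXEC sp_executesql @sql;
                 END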

                Can you share the basic gist of what your current table-creating code looks like?


                 I don't know if you're terribly interested in SQL Server, but I wonder if you've considered implementing, scheduling, and automating your Extract, Transform and Load operations using SQL Agent. There's a wealth of XML and ETL support in SQL 2005 that I think you could benefit from. Your ETL process isn't broken, so it doesn't need fixing, but you might want to check out OPENXML, OPENROWSET, and DECLARE @x XML. They make my life so much easier when I'm designing ETL processes with XML.
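                 As a taste of that approach, here's a minimal sketch (the file path is made up, and the root element and column types are guesses based on the schema in the first post):
                 Code:
                 -- Read the whole XML file into an XML variable, then shred it into rows.
                 DECLARE @x XML;

                 SELECT @x = BulkColumn
                 FROM OPENROWSET(BULK 'C:\inbox\tblBioApproved.xml', SINGLE_BLOB) AS src;

                 INSERT INTO tblBioApproved (BioPictureA, BioPictureTypeA)
                 SELECT n.value('@BioPictureA',     'varbinary(max)'),  -- base64 is decoded by the cast
                        n.value('@BioPictureTypeA', 'varchar(50)')
                 FROM @x.nodes('/ROOT/tblBioApproved') AS t(n);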

