alternative for eval in JavaScript?

  • ammu86
    New Member
    • Jan 2010
    • 6

    alternative for eval in JavaScript?

    I need to convert a JSON string containing more than 1 lakh elements into a JSON object. When I use eval in IE, it throws an out-of-memory error.
    Is there any other approach for this?
  • Dormilich
    Recognized Expert Expert
    • Aug 2008
    • 8694

    #2
    JSON.parse() or the parse()/decode() method of any JSON library.
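
    As a minimal sketch of that advice: prefer the native parser when the browser provides it (IE 8+ and all modern browsers), and fall back to including a library such as json2.js for older IE. The function name `parseJson` below is just illustrative.

    ```javascript
    // Sketch: use the native JSON parser where available, which is both
    // faster and safer than eval. In very old browsers, including a
    // library such as json2.js defines JSON.parse itself, so this same
    // branch then succeeds.
    function parseJson(text) {
      if (typeof JSON !== "undefined" && JSON.parse) {
        return JSON.parse(text); // native (or library-provided) parser
      }
      throw new Error("No JSON parser available; include json2.js");
    }

    var obj = parseJson('{ "x": "Hello, World!", "y1": [1, 2, 3] }');
    // obj.x is "Hello, World!"
    ```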


    • ammu86
      New Member
      • Jan 2010
      • 6

      #3
      all these are working fine as long as the records do not exceed 70000


      • Dormilich
        Recognized Expert Expert
        • Aug 2008
        • 8694

        #4
        70000 what? mice? cats? dollars?


        • ammu86
          New Member
          • Jan 2010
          • 6

          #5
          read my question and answer, please. I hope you're used to storing cats and mice in JSON.


          • Dormilich
            Recognized Expert Expert
            • Aug 2008
            • 8694

            #6
            I am not familiar with the Indian (?) numbering system. I have no idea what 1 lakh is, and neither do others.


            • ammu86
              New Member
              • Jan 2010
              • 6

              #7
              the country has no importance in creating a JSON structure. var myJson = '{ "x": "Hello, World!", "y1": [1, 2, 3], "y2": [1, 2, 3], "y3": [1, 2, 3], "y4": [1, 2, 3] }';
              eval throws an out-of-memory error (in IE) if this exceeds 70k records
              Last edited by acoder; Mar 15 '12, 04:00 PM.


              • Dormilich
                Recognized Expert Expert
                • Aug 2008
                • 8694

                #8
                yes, objects require a bit of memory.


                • acoder
                  Recognized Expert MVP
                  • Nov 2006
                  • 16032

                  #9
                  For the record, 1 lakh = 100,000

                  ammu86, can you not split this data into chunks when it gets too large?
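
                  A rough sketch of the chunking idea, assuming the server can be made to send the 100,000 records as several smaller JSON array strings instead of one huge string (the function and variable names here are illustrative, not from the thread). Parsing each chunk separately keeps peak memory well below a single giant eval/parse in old IE:

                  ```javascript
                  // Parse several small JSON array strings and concatenate
                  // the results, instead of parsing one huge string at once.
                  function parseChunks(chunks) {
                    var records = [];
                    for (var i = 0; i < chunks.length; i++) {
                      var part = JSON.parse(chunks[i]);  // one small array
                      for (var j = 0; j < part.length; j++) {
                        records.push(part[j]);           // accumulate rows
                      }
                    }
                    return records;
                  }

                  // Example with two small chunks:
                  var all = parseChunks(['[1, 2, 3]', '[4, 5]']);
                  // all is [1, 2, 3, 4, 5]
                  ```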


                  • gits
                    Recognized Expert Moderator Expert
                    • May 2007
                    • 5390

                    #10
                    i would suggest that too - 70k is a lot of data, and in case those records are for a grid/table or similar, it is better to use a kind of paging mechanism. from experience it is rather useless to present a user with more than 100-150 lines in a table, since a) he/she cannot overview it and has to scroll a lot, and b) performance is bad, since it often implies lots of created DOM nodes.

                    if it's not for a table but for a complex object, implement a 'lazy loading' mechanism (a kind of 'paged object' where the pages are semantically structured properties) and load data when it is needed.

                    regards
                    Last edited by gits; Mar 19 '12, 07:18 AM.
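
                    The table-paging idea above can be sketched like this (all names here are illustrative placeholders, not an API from the thread): keep the parsed records in memory and hand the renderer only one page of rows at a time, so the DOM never holds more than ~100 nodes per column.

                    ```javascript
                    // Simple client-side paging: slice one page out of the
                    // full record set; only this slice gets rendered as rows.
                    var PAGE_SIZE = 100;

                    function getPage(records, pageIndex) {
                      var start = pageIndex * PAGE_SIZE;
                      return records.slice(start, start + PAGE_SIZE);
                    }

                    // Example: 250 records -> page index 2 holds the last 50.
                    var records = [];
                    for (var i = 0; i < 250; i++) { records.push(i); }
                    var lastPage = getPage(records, 2);
                    // lastPage.length is 50, lastPage[0] is 200
                    ```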

