What does '64 bit' mean? Lame question, but hear me out :)

  • keith

    #31
    Re: What does '64 bit' mean? Lame question, but hear me out :)

    On Sat, 22 Jan 2005 17:41:36 -0500, George Macdonald wrote:

    > On Sat, 22 Jan 2005 12:32:06 -0500, keith <krw@att.bizzzz> wrote:
    >
    >>On Sat, 22 Jan 2005 08:32:32 +0100, Christoph Nahr wrote:
    >>
    >>> On Fri, 21 Jan 2005 19:58:57 -0500, Yousuf Khan <bbbl67@ezrs.com>
    >>> wrote:
    >>>
    >>>>Actually, I think the word size is always the same size, 16-bit. 32-bit
    >>>>is called double word (dword), and 64-bit is called quadword (qword).
    >>>
    >>> Yeah, it's become common usage to refer to 16 bits as a "word" but
    >>> originally the "word size" of a CPU means the width of its data and/or
    >>> address registers. The terminology kind of ossified in the 16-bit
    >>> days, hence the usage of "word" == 16 bits has stuck...
    >>
    >>Only in the x86 world. In the world of 'z's and PPCs a "word" is still
    >>32 bits.
    >
    > How much is this "16-bit word" definition due to M$'s pollution of the
    > computer vocabulary?... not sure how things stand in the Unix world at
    > present... but yes we've had computers with 16, 24, 32, 36, 60, 64 bit
    > words over the years that I've worked with. I've always thought of the
    > word size as the integer register width.

    That's the classical definition (as I've noted earlier in this thread).
    I'm sure you've missed a bunch too. The fact is that anyone
    assuming any results from size_of(word) is simply asking for a rude
    awakening.
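Keith's warning carries over directly to modern code. A minimal sketch (using Python's `struct` module as one illustration): native "word-ish" sizes depend on the platform's ABI, while explicitly sized fields do not.

```python
import struct

# Native ("@") sizes are whatever the platform ABI says -- the very
# assumption Keith warns against. For example, "l" (a C long) is
# 4 bytes on 64-bit Windows but 8 bytes on 64-bit Linux.
native_long = struct.calcsize("@l")  # platform-dependent: don't rely on it

# Standard ("=") sizes are fixed by the struct module itself -- the
# moral equivalent of using int16_t/int32_t/int64_t instead of "word".
assert struct.calcsize("=h") == 2   # always 16 bits
assert struct.calcsize("=i") == 4   # always 32 bits
assert struct.calcsize("=q") == 8   # always 64 bits

print(f"native long here: {native_long} bytes; explicit widths are portable")
```

The design point is the same one the thread is circling: if a format or API needs a particular width, spell the width out rather than assuming what "word" means.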

    --
    Keith


    • Yousuf Khan

      #32
      Re: What does '64 bit' mean? Lame question, but hear me out :)

      George Macdonald wrote:
      >>Well, actually the whole idea of DLL's is outdated in .NET isn't it? The
      >>idea of .NET was to create a framework that is independent of
      >>architecture (albeit mostly limited to Microsoft operating systems). So
      >>a program once compiled doesn't care if it's on a 32-bit processor or a
      >>64-bit one, or even care if it's running on an x86-compatible processor
      >>for that matter. There is no dependence on bittedness or instruction set.
      >
      >
      > Huh? They call that "compiled" nowadays?

      Well, it's compiled into a byte-code of some sort, just not machine
      code. It's just like Java, only Microsoft-oriented.
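Python's own toolchain makes a handy analogy here (an analogy only, not how .NET itself works): source is "compiled", but the result is portable byte-code that a virtual machine interprets, not machine code the CPU runs directly.

```python
import dis

# Compile a trivial statement to a code object.
code = compile("x = 2 + 3", "<example>", "exec")

# The result is byte-code: a bytes object the Python VM interprets,
# independent of the host CPU architecture.
assert isinstance(code.co_code, bytes)

# Show the byte-code mnemonics (LOAD_CONST, STORE_NAME, ...).
dis.dis(code)
```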

      Yousuf Khan


      • Yousuf Khan

        #33
        Re: What does '64 bit' mean? Lame question, but hear me out :)

        George Macdonald wrote:
        > On Sat, 22 Jan 2005 12:32:06 -0500, keith <krw@att.bizzzz> wrote:
        >>Only in the x86 world. In the world of 'z's and PPCs a "word" is still
        >>32 bits.
        >
        >
        > How much is this "16-bit word" definition due to M$'s pollution of the
        > computer vocabulary?... not sure how things stand in the Unix world at
        > present... but yes we've had computers with 16, 24, 32, 36, 60, 64 bit
        > words over the years that I've worked with. I've always thought of the
        > word size as the integer register width.

        Well, we got the bits, the nibbles, the bytes, the words, etc. The first
        three are completely standardized values (remember the nibble? It's
        4 bits, in case you don't). Then everything after the word is
        nebulous, but thank god they didn't decide to create a new bit-size term
        based around human language, like the clause or the sentence! We already
        have the paragraph, and the page, and that's more than enough.

        BTW, in the Unix world, these days they always preface /word/ with an
        actual bit-size description, such as "32-bit word" or "64-bit word".
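The fixed small units are easy to demonstrate with plain bit masking; a minimal sketch:

```python
value = 0xCAFE  # a 16-bit "word" in the x86 sense

low_nibble = value & 0xF           # 4 bits
low_byte   = value & 0xFF          # 8 bits
high_byte  = (value >> 8) & 0xFF   # the other 8 bits

assert low_nibble == 0xE
assert low_byte == 0xFE
assert high_byte == 0xCA
# Reassembling the two bytes recovers the original word.
assert (high_byte << 8) | low_byte == value
```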

        Yousuf Khan



        • George Macdonald

          #34
          Re: What does '64 bit' mean? Lame question, but hear me out :)

          On Sun, 23 Jan 2005 03:22:47 -0500, Yousuf Khan <bbbl67@ezrs.com> wrote:

          >George Macdonald wrote:
          >>>Well, actually the whole idea of DLL's is outdated in .NET isn't it? The
          >>>idea of .NET was to create a framework that is independent of
          >>>architecture (albeit mostly limited to Microsoft operating systems). So
          >>>a program once compiled doesn't care if it's on a 32-bit processor or a
          >>>64-bit one, or even care if it's running on an x86-compatible processor
          >>>for that matter. There is no dependence on bittedness or instruction set.
          >>
          >>
          >> Huh? They call that "compiled" nowadays?
          >
          >Well, it's compiled into a byte-code of some sort, just not machine
          >code. It's just like Java, only Microsoft-oriented.

          It's just not real code and its source is not real software. :-) This
          abuse of blurring the difference is going too far. What's the point of
          faster and faster processors if they just get burdened with more and more
          indirection? Neither Java, nor any other language, *has* to produce
          interpretive object code.

          Such languages have their place and reasons for use -- from security to
          laziness, or just toy applications -- but to suggest that DLLs, which
          already have the burden of symbolic runtime linkage, are now "outdated" is
          scary.

          --
          Rgds, George Macdonald


          • George Macdonald

            #35
            Re: What does '64 bit' mean? Lame question, but hear me out :)

            On Sat, 22 Jan 2005 19:25:56 -0500, Robert Myers <rmyers1400@comcast.net>
            wrote:

            >On Sat, 22 Jan 2005 17:07:59 -0500, George Macdonald
            ><fammacd=!SPAM^nothanks@tellurian.com> wrote:
            >
            >>On Fri, 21 Jan 2005 19:18:58 -0500, Yousuf Khan <bbbl67@ezrs.com> wrote:
            >>
            >>>
            >>>Well, actually the whole idea of DLL's is outdated in .NET isn't it? The
            >>>idea of .NET was to create a framework that is independent of
            >>>architecture (albeit mostly limited to Microsoft operating systems). So
            >>>a program once compiled doesn't care if it's on a 32-bit processor or a
            >>>64-bit one, or even care if it's running on an x86-compatible processor
            >>>for that matter. There is no dependence on bittedness or instruction set.
            >>
            >>Huh? They call that "compiled" nowadays?
            >
            >That language is at least as old as Pascal isn't it? One spoke of
            >compiling to p-code...no?

            Pseudo code and interpretive execution go back much further than Pascal -
            many proprietary languages existed as such. I've worked on a couple of
            "compilers" which produced interpretive code myself, and even the end user
            knew the importance of the difference - IOW if they wanted to do real work,
            then a p-code Pascal was the wrong choice... same with Basic. I guess I'm
            objecting more to the notion that it can replace real machine code... i.e.
            "whole idea of DLLs is outdated".

            --
            Rgds, George Macdonald


            • George Macdonald

              #36
              Re: What does '64 bit' mean? Lame question, but hear me out :)

              On Sun, 23 Jan 2005 03:37:04 -0500, Yousuf Khan <bbbl67@ezrs.com> wrote:

              >George Macdonald wrote:
              >> On Sat, 22 Jan 2005 12:32:06 -0500, keith <krw@att.bizzzz> wrote:
              >>>Only in the x86 world. In the world of 'z's and PPCs a "word" is still
              >>>32 bits.
              >>
              >>
              >> How much is this "16-bit word" definition due to M$'s pollution of the
              >> computer vocabulary?... not sure how things stand in the Unix world at
              >> present... but yes we've had computers with 16, 24, 32, 36, 60, 64 bit
              >> words over the years that I've worked with. I've always thought of the
              >> word size as the integer register width.
              >
              >Well, we got the bits, the nibbles, the bytes, the words, etc. The first
              >three are completely standardized values (remember the nibble? It's
              >4 bits in case you don't). Then everything after the word is
              >nebulous, but thank god they didn't decide to create a new bit-size term
              >based around human language, like the clause or the sentence! We already
              >have the paragraph, and the page, and that's more than enough.

              There was also the dibit, which I've never been sure how to pronounce :-) and
              the "movement" to use octet instead of byte seems to be gaining strength,
              especially in Europe (French revisionism? :-))... remembering that the
              first computers I used had 6-bit bytes. I don't recall what Univac called
              their 9-bit field... "quarter-word"??
              >BTW, in the Unix world, these days they always preface /word/ with an
              >actual bit-size description, such as "32-bit word" or "64-bit word".

              Which is how it should be... but I'd hope it doesn't use "word" for a
              16-bit field on, say, an Athlon64. ;-)

              As I recall IBM introduced the concept of a variable sized word with the
              System/360s but they have always been considered to have a 32-bit word size
              - that's the size of the integer registers and the most efficient working
              unit of integer data.

              --
              Rgds, George Macdonald


              • keith

                #37
                Re: What does '64 bit' mean? Lame question, but hear me out :)

                On Sun, 23 Jan 2005 03:37:04 -0500, Yousuf Khan wrote:

                > George Macdonald wrote:
                >> On Sat, 22 Jan 2005 12:32:06 -0500, keith <krw@att.bizzzz> wrote:
                >>>Only in the x86 world. In the world of 'z's and PPCs a "word" is still
                >>>32 bits.
                >>
                >>
                >> How much is this "16-bit word" definition due to M$'s pollution of the
                >> computer vocabulary?... not sure how things stand in the Unix world at
                >> present... but yes we've had computers with 16, 24, 32, 36, 60, 64 bit
                >> words over the years that I've worked with. I've always thought of the
                >> word size as the integer register width.
                >
                > Well, we got the bits, the nibbles, the bytes, the words, etc. The first
                > three are completely standardized values (remember the nibble? It's
                > 4 bits in case you don't).

                Actually it's spelled "nybble". ;-) "Byte" does *not* mean 8 bits.
                It's the size of a character. Just because character = 8 bits for all
                machines we care to remember doesn't change the meaning of "byte". The
                correct term for a general eight-bit entity is "octet".
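Keith's character-vs-octet distinction is alive again in modern encodings, where one character often occupies several octets; a small illustration:

```python
# "Byte" historically meant "the size of a character"; "octet" is
# unambiguously 8 bits. With UTF-8 the two ideas visibly diverge:
# one character may occupy one, two, three, or four octets.
assert len("A".encode("utf-8")) == 1   # one octet
assert len("é".encode("utf-8")) == 2   # two octets
assert len("€".encode("utf-8")) == 3   # three octets
```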

                --
                Keith


                • Robert Myers

                  #38
                  Re: What does '64 bit' mean? Lame question, but hear me out :)

                  On Sun, 23 Jan 2005 08:24:20 -0500, George Macdonald
                  <fammacd=!SPAM^nothanks@tellurian.com> wrote:

                  >On Sat, 22 Jan 2005 19:25:56 -0500, Robert Myers <rmyers1400@comcast.net>
                  >wrote:
                  >
                  >>On Sat, 22 Jan 2005 17:07:59 -0500, George Macdonald
                  >><fammacd=!SPAM^nothanks@tellurian.com> wrote:
                  >>
                  >>>On Fri, 21 Jan 2005 19:18:58 -0500, Yousuf Khan <bbbl67@ezrs.com> wrote:
                  >>>
                  >>>>
                  >>>>Well, actually the whole idea of DLL's is outdated in .NET isn't it? The
                  >>>>idea of .NET was to create a framework that is independent of
                  >>>>architecture (albeit mostly limited to Microsoft operating systems). So
                  >>>>a program once compiled doesn't care if it's on a 32-bit processor or a
                  >>>>64-bit one, or even care if it's running on an x86-compatible processor
                  >>>>for that matter. There is no dependence on bittedness or instruction set.
                  >>>
                  >>>Huh? They call that "compiled" nowadays?
                  >>
                  >>That language is at least as old as Pascal isn't it? One spoke of
                  >>compiling to p-code...no?
                  >
                  >Pseudo code and interpretive execution go back much further than Pascal -
                  >many proprietary languages existed as such. I've worked on a couple of
                  >"compilers" which produced interpretive code myself and even the end user
                  >knew the importance of the difference - IOW if they wanted to do real work,
                  >then a p-code Pascal was the wrong choice... same with Basic. I guess I'm
                  >objecting more to the notion that it can replace real machine code... i.e.
                  >"whole idea of DLLs is outdated".

                  "The whole idea of DLLs is outdated" sounds really attractive. It's
                  also a train that's been coming down the track for a long time, if
                  it's the same idea as virtualized architecture.

                  I wouldn't include tokenized Basic source, but I guess there's a good
                  bit of old mainframe code running on a virtual machine. Anybody
                  venture a guess as to how much?

                  I've kind of lost track of the .NET thing. It's better than Java, I
                  gather, and there is an open source version, mono, which is attractive
                  enough for open source types to work under the proprietary gunsight of
                  Microsoft.

                  Big-endian, little-endian, 64-bit, 32-bit. Yuk. Bring on the virtual
                  machines.

                  Except for us number-crunching types, I guess, but more and more number
                  crunching takes place in an interpreted environment like Matlab,
                  anyway.

                  RM


                  • keith

                    #39
                    Re: What does '64 bit' mean? Lame question, but hear me out :)

                    On Sun, 23 Jan 2005 11:10:09 -0500, Robert Myers wrote:

                    > On Sun, 23 Jan 2005 08:24:20 -0500, George Macdonald
                    > <fammacd=!SPAM^nothanks@tellurian.com> wrote:
                    >
                    >>On Sat, 22 Jan 2005 19:25:56 -0500, Robert Myers <rmyers1400@comcast.net>
                    >>wrote:
                    >>
                    >>>On Sat, 22 Jan 2005 17:07:59 -0500, George Macdonald
                    >>><fammacd=!SPAM^nothanks@tellurian.com> wrote:
                    >>>
                    >>>>On Fri, 21 Jan 2005 19:18:58 -0500, Yousuf Khan <bbbl67@ezrs.com> wrote:
                    >>>>
                    >>>>>
                    >>>>>Well, actually the whole idea of DLL's is outdated in .NET isn't it? The
                    >>>>>idea of .NET was to create a framework that is independent of
                    >>>>>architecture (albeit mostly limited to Microsoft operating systems). So
                    >>>>>a program once compiled doesn't care if it's on a 32-bit processor or a
                    >>>>>64-bit one, or even care if it's running on an x86-compatible processor
                    >>>>>for that matter. There is no dependence on bittedness or instruction set.
                    >>>>
                    >>>>Huh? They call that "compiled" nowadays?
                    >>>
                    >>>That language is at least as old as Pascal isn't it? One spoke of
                    >>>compiling to p-code...no?
                    >>
                    >>Pseudo code and interpretive execution go back much further than Pascal -
                    >>many proprietary languages existed as such. I've worked on a couple of
                    >>"compilers" which produced interpretive code myself and even the end user
                    >>knew the importance of the difference - IOW if they wanted to do real work,
                    >>then a p-code Pascal was the wrong choice... same with Basic. I guess I'm
                    >>objecting more to the notion that it can replace real machine code... i.e.
                    >>"whole idea of DLLs is outdated".
                    >
                    > "The whole idea of DLLs is outdated" sounds really attractive. It's
                    > also a train that's been coming down the track for a long time, if
                    > it's the same idea as virtualized architecture.

                    Well, that's one way of getting rid of DLL-Hell.

                    > I wouldn't include tokenized Basic source, but I guess there's a good
                    > bit of old mainframe code running on a virtual machine. Anybody
                    > venture a guess as to how much?

                    All of it? ...and not only the "old" stuff. Mainframes have been
                    virtualized for decades. ...though perhaps in a slightly different
                    meaning of "virtualized".

                    Looking at it another way, I'd propose that most modern processors
                    are virtualized, including x86. The P4/Athlon (and many before) don't
                    execute the x86 ISA natively, rather "interpret" it to a RISCish
                    processor.
                    > I've kind of lost track of the .NET thing. It's better than Java, I
                    > gather, and there is an open source version, mono, which is attractive
                    > enough for open source types to work under the proprietary gunsight of
                    > Microsoft.

                    I don't see it as "better" in any meaning of the word. Java's purpose in
                    life is to divorce the application from the processor and OS. I can't
                    see how .net is "better" at this. If platform independence isn't wanted,
                    why would anyone use Java?
                    > Big-endian, little-endian, 64-bit, 32-bit. Yuk. Bring on the virtual
                    > machines.

                    They are. You still have to decide on a data format.

                    > Except for us number-crunching types, I guess, but more and more number
                    > crunching takes place in an interpreted environment like Matlab, anyway.
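The "you still have to decide on a data format" point is easy to demonstrate: the same 32-bit value serializes to different octet orders, so any interchange format has to pick one. A short sketch:

```python
import sys

n = 0x01020304

big    = n.to_bytes(4, "big")     # network (big-endian) order
little = n.to_bytes(4, "little")  # x86 (little-endian) order

assert big == b"\x01\x02\x03\x04"
assert little == b"\x04\x03\x02\x01"
# Either order round-trips, as long as both sides agree on it.
assert int.from_bytes(big, "big") == int.from_bytes(little, "little") == n

print("this host is", sys.byteorder, "endian")
```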

                    --
                    Keith


                    • GSV Three Minds in a Can

                      #40
                      Re: What does '64 bit' mean? Lame question, but hear me out :)

                      Bitstring <pan.2005.01.23.02.17.47.75792@att.bizzzz>, from the wonderful
                      person keith <krw@att.bizzzz> said
                      <snip>
                      >>.. but yes we've had computers with 16, 24, 32, 36, 60, 64 bit
                      >> words over the years that I've worked with. I've always thought of the
                      >> word size as the integer register width.
                      >
                      >That's the classical definition (as I've noted earlier in this thread).
                      >I'm sure you've missed a bunch too.

                      ISTR PDPx's (7s? 15s?) had 12-bit words. Atlas/Titan mainframes were 48,
                      again IIRC .. it's a heck of a long time ago. [No, please don't kick the
                      Mercury delay line memory tank .... Arrghhh.]

                      > The fact is that anyone
                      >assuming any results from size_of(word) is simply asking for a rude
                      >awakening.

                      Indeed. Even sizeof(char) was not guaranteed on all machines. We
                      remember 5-track Flexowriters too. 8>.

                      --
                      GSV Three Minds in a Can
                      Outgoing Msgs are Turing Tested, and indistinguishable from human typing.


                      • Robert Myers

                        #41
                        Re: What does '64 bit' mean? Lame question, but hear me out :)

                        On Sun, 23 Jan 2005 11:59:54 -0500, keith <krw@att.bizzzz> wrote:

                        >On Sun, 23 Jan 2005 11:10:09 -0500, Robert Myers wrote:
                        >
                        >> I wouldn't include tokenized Basic source, but I guess there's a good
                        >> bit of old mainframe code running on a virtual machine. Anybody
                        >> venture a guess as to how much?
                        >
                        >All of it? ...and not only the "old" stuff. Mainframes have been
                        >virtualized for decades. ...though perhaps in a slightly different
                        >meaning of "virtualized".
                        >
                        >Looking at it another way, I'd propose that most modern processors
                        >are virtualized, including x86. The P4/Athlon (and many before) don't
                        >execute the x86 ISA natively, rather "interpret" it to a RISCish
                        >processor.
                        >
                        I take your point, but including microcode stretches the notion of
                        virtualization too far on one end, the way that including tokenized
                        Basic stretches it too far on the other. I'm too lazy to try to come
                        up with a bullet-proof definition, but there is a class of virtual
                        machines that could naturally be implemented in hardware but are
                        normally implemented in software: p-code, Java byte-code, m-code, and
                        I would put executing 360 instructions on x86 in that class.
                        Interpreting of x86 to microcode is done in hardware, of course.
                        MSIL, the intermediate code for .NET, actually does compile to machine
                        code, apparently, and is not implemented on a virtual machine.

                        The term "virtualize" is pretty broad. One kind of virtualization,
                        the kind that vmware does or that I think Power5 servers do, virtualizes
                        the processor to its own instruction set, and I expect _that_ kind of
                        virtualization to become essentially universal for purposes of
                        security. You get the security and compartmentalization benefits of
                        that kind of virtualization for free when you do instruction
                        translation by running on a virtual machine in software.
                        >> I've kind of lost track of the .NET thing. It's better than Java, I
                        >> gather, and there is an open source version, mono, which is attractive
                        >> enough for open source types to work under the proprietary gunsight of
                        >> Microsoft.
                        >
                        >I don't see it as "better" in any meaning of the word. Java's purpose in
                        >life is to divorce the application from the processor and OS. I can't
                        >see how .net is "better" at this. If platform independence isn't wanted,
                        >why would anyone use Java?
                        >

                        I barely know Java, and C# not at all. C# is reputed to be nicer for
                        programming.

                        RM


                        • Yousuf Khan

                          #42
                          Re: What does '64 bit' mean? Lame question, but hear me out :)

                          George Macdonald wrote:
                          > It's just not real code and its source is not real software. :-) This
                          > abuse of blurring the difference is going too far. What's the point of
                          > faster and faster processors if they just get burdened with more and more
                          > indirection. Neither Java, nor any other language, *has* to produce
                          > interpretive object code.
                          >
                          > Such languages have their place and reasons for use -- from security to
                          > laziness, or just toy applications -- but to suggest that DLLs, which
                          > already have the burden of symbolic runtime linkage, are now "outdated" is
                          > scary.

                          Not sure why you're so married to the concept of DLLs. They had their
                          purpose a few years ago; they were much better than the static-linked
                          libraries they replaced because they were brought into memory only
                          when they were needed, not all at once at the beginning. But now the
                          requirement is for code that isn't dependent on the underlying processor
                          architecture, and we have Java and .NET. These aren't exactly the same
                          as the old-fashioned interpreted code either: they are decoded
                          only once, on the fly, and then exist cached as machine code while
                          they run.
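The "decoded once, then cached" model Yousuf describes can be sketched very loosely (in Python rather than real machine code, and for a made-up toy byte-code, not .NET's MSIL): translate a program once, cache the translation, and reuse it on every later run.

```python
# A toy "JIT": translate a tiny stack byte-code into a Python function
# once, cache it keyed by the program, and reuse the translation on
# later calls. Real JITs emit machine code; the decode-once-and-cache
# structure is the analogous part.
_cache = {}

def run(program, arg):
    key = tuple(program)
    if key not in _cache:                     # translate only on first sight
        ops = []
        for opcode, operand in program:
            if opcode == "add":
                ops.append(lambda x, n=operand: x + n)
            elif opcode == "mul":
                ops.append(lambda x, n=operand: x * n)
        def compiled(x, ops=ops):
            for op in ops:
                x = op(x)
            return x
        _cache[key] = compiled
    return _cache[key](arg)                   # later calls skip decoding

prog = [("add", 1), ("mul", 10)]
assert run(prog, 4) == 50    # translated on first call: (4 + 1) * 10
assert run(prog, 0) == 10    # served from the cache:    (0 + 1) * 10
assert len(_cache) == 1
```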

                          Yousuf Khan


                          • George Macdonald

                            #43
                            Re: What does '64 bit' mean? Lame question, but hear me out :)

                            On Sun, 23 Jan 2005 16:59:25 -0500, Yousuf Khan <bbbl67@ezrs.com> wrote:

                            >George Macdonald wrote:
                            >> It's just not real code and its source is not real software. :-) This
                            >> abuse of blurring the difference is going too far. What's the point of
                            >> faster and faster processors if they just get burdened with more and more
                            >> indirection. Neither Java, nor any other language, *has* to produce
                            >> interpretive object code.
                            >>
                            >> Such languages have their place and reasons for use -- from security to
                            >> laziness, or just toy applications -- but to suggest that DLLs, which
                            >> already have the burden of symbolic runtime linkage, are now "outdated" is
                            >> scary.
                            >
                            >Not sure why you're so married to the concept of DLLs, they had their
                            >purpose a few years ago, they were much better than the static-linked
                            >libraries they replaced because they were brought into memory only
                            >when they were needed, not all at once at the beginning. But now the
                            >requirement is for code that isn't dependent on underlying processor
                            >architecture, and we have Java and .NET. These aren't exactly the same
                            >as the old-fashioned interpreted code either, these ones are decoded
                            >only once on the fly and then they exist cached as machine code while
                            >they run.

                            DLLs are just the way it's done with Windows - nothing to do with being
                            married to anything; DLLs only got out of hand because of the fluff burden.
                            What irks me is machine cycles being pissed away on the indirection of
                            pseudo code. To me any suggestion that you can do serious computing with
                            this stuff, and do away with real machine code for system level library
                            functions, is madness.

                            --
                            Rgds, George Macdonald


                            • keith

                              #44
                              Re: What does '64 bit' mean? Lame question, but hear me out :)

                              On Sun, 23 Jan 2005 15:47:09 -0500, Robert Myers wrote:

                              > On Sun, 23 Jan 2005 11:59:54 -0500, keith <krw@att.bizzzz> wrote:
                              >
                              >>On Sun, 23 Jan 2005 11:10:09 -0500, Robert Myers wrote:
                              >>
                              >>> I wouldn't include tokenized Basic source, but I guess there's a good
                              >>> bit of old mainframe code running on a virtual machine. Anybody
                              >>> venture a guess as to how much?
                              >>
                              >>All of it? ...and not only the "old" stuff. Mainframes have been
                              >>virtualized for decades. ...though perhaps in a slightly different
                              >>meaning of "virtualized".
                              >>
                              >>Looking at it another way, I'd propose that most modern processors
                              >>are virtualized, including x86. The P4/Athlon (and many before) don't
                              >>execute the x86 ISA natively, rather "interpret" it to a RISCish
                              >>processor.
                              >>
                              > I take your point, but including microcode stretches the notion of
                              > virtualization too far on one end the way that including tokenized
                              > Basic stretches it too far on the other. I'm too lazy to try to come
                              > up with a bullet-proof definition,

                              I understand. It's impossible to categorize such things because there is
                              such a continuum of architectures that have been tried. However, you are
                              pretty loosey-goosey with your term "virtual". Remember VM/360?

                              [color=blue]
                              > but there is a class of virtual
                              > machines that could naturally be implemented in hardware but are
                              > normally implemented in software: p-code, java byte-code, m-code, and I
                              > would put executing 360 instructions on x86 in that class.[/color]

                              Ok, a better example of your class of "virtualization" would be the 68K on
                              PPC. I call that emulation, not virtualization. I call what VM/360,
                              and later, did "virtualization". The processor virtualized itself.

                              Ok, if you don't like microcode (what is your definition of "microcode",
                              BTW) as a virtualizer, now it's your turn to tell me why you think
                              "emulation" is "virtualization". ;-)
                              [color=blue]
                              > Interpreting
                              > of x86 to microcode is done in hardware, of course. MSIL, the
                              > intermediate code for .NET, actually does compile to machine code,
                              > apparently, and is not implemented on a virtual machine.[/color]

                              Ok, what would you call a Java byte-code machine?
                              [color=blue]
                              > The term "virtualize" is pretty broad.[/color]

                              Indeed, but it helps if we all get our terms defined if we're going
                              to talk about various hardware and feechurs.
                              [color=blue]
                              > One kind of virtualization, the
                              > kind that vmware does or that I think Power5 servers do virtualize the
                              > processor to its own instruction set, and I expect _that_ kind of
                              > virtualization to become essentially universal for purposes of security.[/color]

                              Too bad x86 is soo late to that table. M$ wanted no part of that, though.
                              This brand of virtualization would have put them out of business a decade
                              ago. BTW, I call the widget that allows this brand of "virtualization" a
                              "hypervisor" (funny, so does IBM ;-).
                              [color=blue]
                              > You get the security and compartmentaliz ation benefits of that kind of
                              > virtualization for free when you do instruction translation by running
                              > on a virtual machine in software.[/color]

                              Free?
                              [color=blue][color=green][color=darkred]
                              >>> I've kind of lost track of the .NET thing. It's better than Java, I
                              >>> gather, and there is an open source version, mono, which is attractive
                              >>> enough for open source types to work under the proprietary gunsight of
                              >>> Microsoft.[/color]
                              >>
                              >>I don't see it as "better" in any meaning of the word. Java's purpose
                              >>in life is to divorce the application from the processor and OS. I
                              >>can't see how .net is "better" at this. If platform independence isn't
                              >>wanted, why would anyone use Java?
                              >>
                              >>[/color]
                              > I barely know Java, and c# not at all. c# is reputed to be nicer for
                              > programming.[/color]

                              Perhaps, if you want to be forever wedded to Billy.

                              --
                              Keith

                              Comment

                              • YKhan

                                #45
                                Re: What does '64 bit' mean? Lame question, but hear me out :)

                                George Macdonald wrote:[color=blue]
                                > DLLs are just the way it's done with Windows - nothing to do with being
                                > married to anything; DLLs only got out of hand because of the fluff burden.
                                > What irks me is machine cycles being pissed away on the indirection of
                                > pseudo code. To me any suggestion that you can do serious computing with
                                > this stuff, and do away with real machine code for system level library
                                > functions, is madness.[/color]

                                Machine cycles aren't so precious anymore; the software side hasn't
                                kept up with the developments on the hardware side for quite some time
                                now. Now's as good a time as any to try out these indirection
                                techniques. It will more than likely help out in the future, as it will
                                probably mean we're less tied down to one processor architecture.
                                Piss away a couple of machine cycles for machine independence?
                                Sure, sounds good to me.

                                Yousuf Khan

                                Comment
