Python vs. Lisp -- please explain

  • Torsten Bronger

    #61
    Re: Python vs. Lisp -- please explain

    Hallöchen!

    claird@lairds.us (Cameron Laird) writes:

    > In article <eq7pc3-ml5.ln1@lairds.us>, I wondered:
    >
    >> [...] Do you truly believe that fewer people would use Python if
    >> its execution were faster?
    >
    > I think I can answer my own question: yes. Since posting, I came
    > across a different follow-up where Alexander explains that he sees
    > healthy elements of the Python ethos--focus on a reliable, widely-
    > used library, willingness to make Python-C partnerships, and so
    > on--as results at least in part of early acceptance of Python as
    > intrinsically slow. That's too interesting an argument for me to
    > respond without more thought.

    I was rather stunned, too, when I read his line of thought.
    Nevertheless, I think it's not pointless, albeit formulated in an
    awkward way. Of course, Python has not been deliberately slowed
    down.

    I don't know how strong the effect is by which a language design that
    doesn't allow for (easily implemented) fast execution speed makes you
    write cleaner code; however, I must say that I feel so, too. When I
    came from C++ to Python I had to find my Pythonic style, and part of
    it was that I threw away those ubiquitous little optimisations and
    concentrated on formulating my idea in a clear and expressive way.

    By the way, this is my main concern about optional static typing: It
    may change the target group, i.e. it may move Python closer to those
    applications where speed really matters, which again would have an
    effect on what will be considered Pythonic.

    Tschö,
    Torsten.

    --
    Torsten Bronger, aquisgrana, europa vetus ICQ 264-296-646


    • Kay Schluehr

      #62
      Re: Python vs. Lisp -- please explain

      Steven D'Aprano wrote:
      > On Mon, 20 Feb 2006 05:18:39 -0800, Kay Schluehr wrote:
      >
      > >> What's far more interesting to me, however, is that I think there are good
      > >> reasons to suspect python's slowness is more of a feature than a flaw: I'd not
      > >> be surprised if on the whole it greatly increases programmer productivity and
      > >> results in clearer and more uniform code.
      > >
      > Yes, it's Guido's master plan to lock programmers into a slow language
      > in order to dominate them for decades. Do you also believe that Al
      > Quaida is a phantom organization of the CIA founded by neocons in the
      > early '90s who planned to invade Iraq?
      >
      > Of course not. The alternative, that Osama has been able to lug his
      > dialysis machine all over the Pakistan and Afghan mountains without being
      > detected for four years is *much* more believable. *wink*

      Osama? Who is Osama? A media effect, a CNN invention.
      > I don't think it was the poster's implication that Guido deliberately
      > created a slow language for the sake of slowness. I think the implication
      > was more that Guido made certain design choices that increased
      > productivity and code clarity. (That much is uncontroversial .) Where the
      > poster has ruffled some feathers is his suggestion that if Guido had only
      > known more about the cutting edge of language design from CS, Python would
      > have been much faster, but also much less productive, clear and popular.
      >
      > I guess the feather ruffling is because of the suggestion that Guido
      > merely _didn't_know_ about language features that would have increased
      > Python's speed at the cost of productivity, rather than deliberately
      > choose to emphasis productivity at the expense of some speed.
      >
      >
      >
      > --
      > Steven.

      Alexander's hypothesis is completely absurd. It turned out over the
      years that the optimization capabilities of Python are lower than those of
      Lisp and Smalltalk. But it's a system effect and epiphenomenon of
      certain design decisions. This might change with PyPy - who knows? The
      Lisp/Smalltalk design is ingenious, radical and original and both
      languages were considered too slow for many real-world-applications
      over decades. But no one has ever claimed that Alan Kay intentionally
      created a slow language in order to hold the herd together - and it
      accidentally turned out to be reasonably fast with JIT technology in
      the late 90s.

      Smalltalk was killed by Java while Lisp was killed by the paradigm
      shift to OO in the early 90s. The IT world has absolutely no interest
      in the hobby horses of computer scientists or language lovers (like
      me). It consolidates in the direction of a small set of Algol successors,
      namely C, C++, Java and C#, and some dynamically typechecked languages
      like Perl, Python and Ruby that play nicely with the bold mainstream
      languages as their more flexible addition. A conspiracy-like theory
      used to explain what's going on is needless.

      Kay


      • Steven D'Aprano

        #63
        Re: Python vs. Lisp -- please explain

        On Mon, 20 Feb 2006 16:54:34 +0000, Donn Cave wrote:
        > The reason this isn't just an abstruse philosophical argument where it
        > makes sense for us to obtusely cling to some indefensible point of view,
        > is that as the man points out, there are differences that we can't hide
        > forever from potential Python users. The most obvious to me is that
        > your Python program essentially includes its interpreter - can't go anywhere
        > without it, and any change to the interpreter is a change to the program.
        > There are various strategies to address this, but pretending that Python
        > isn't interpreted is not one of them.

        Python programs (.py files) don't contain an interpreter. Some of those
        files are *really* small, you can't hide a full blown Python interpreter
        in just half a dozen lines.

        What you mean is that Python programs are only executable on a platform
        which contains the Python interpreter, and if it is not already installed,
        you have to install it yourself.

        So how exactly is that different from the fact that my compiled C program
        is only executable on a platform that already contains the correct machine
        code interpreter, and if that interpreter is not already installed, I have
        to install a machine code interpreter (either the correct CPU or a
        software emulator) for it?

        Moving compiled C programs from one system to another one with a different
        CPU also changes the performance (and sometimes even the semantics!) of
        the program. It is easier for the average user to change the version of
        Python than it is to change the CPU.

        Nobody denies that Python code running with no optimization tricks is
        (currently) slower than compiled C code. That's a matter of objective
        fact. Nobody denies that Python can be easily run in interactive mode.
        Nobody denies that *at some level* Python code has to be interpreted.

        But ALL code is interpreted at some level or another. And it is equally
        true that at another level Python code is compiled. Why should one take
        precedence over the other?

        The current state of the art is that the Python virtual machine is slower
        than the typical machine code virtual machine built into your CPU. That
        speed difference is only going to shrink, possibly disappear completely.

        But whatever happens in the future, it doesn't change two essential facts:

        - Python code must be compiled to execute;
        - machine code must be interpreted to execute.

        Far from being indefensible, philosophically there is no difference
        between the two. Python and C are both Turing Complete languages, and both
        are compiled, and both are interpreted (just at different places).
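
        A minimal sketch (in modern Python 3 syntax, which this thread
        predates) shows that compilation step directly:

```python
# CPython compiles source to bytecode before its virtual machine
# interprets it: compile() returns a code object, and dis exposes
# the bytecode instructions the VM will execute.
import dis

code = compile("for i in range(100): total = i", "<example>", "exec")

# The loop was compiled exactly once; what the VM runs repeatedly is
# this bytecode, not the source text.
opnames = [ins.opname for ins in dis.get_instructions(code)]
assert "GET_ITER" in opnames   # loop setup, emitted once by the compiler
assert "FOR_ITER" in opnames   # the bytecode-level loop instruction
```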

        Of course, the difference between theory and practice is that in theory
        there is no difference between theory and practice, but in practice there
        is. I've already allowed that in practice Python is slower than machine
        code. But Python is faster than purely interpreted languages like bash.

        Consider that Forth code can be as fast (and sometimes faster) than the
        equivalent machine code despite being interpreted. I remember an
        Apple/Texas Instruments collaborative PC back in the mid-to-late 1980s
        with a Lisp chip. Much to their chagrin, Lisp code interpreted on a
        vanilla Macintosh ran faster than compiled Lisp code running on their
        expensive Lisp machine. Funnily enough, the Apple/TI Lisp machine sank
        like a stone.

        So speed in and of itself tells you nothing about whether a language is
        interpreted or compiled. A fast interpreter beats a slow interpreter,
        that's all, even when the slow interpreter is in hardware.

        Describing C (or Lisp) as "compiled" and Python as "interpreted" is to
        paint with an extremely broad brush, both ignoring what actually happens
        in fact, and giving a false impression about Python. It is absolutely true
        to say that Python does not compile to machine code. (At least not yet.)
        But it is also absolutely true that Python is compiled. Why emphasise the
        interpreter, and therefore Python's similarity to bash, rather than the
        compiler and Python's similarity to (say) Java or Lisp?


        --
        Steven.


        • Alexander Schmolck

          #64
          Re: Python vs. Lisp -- please explain

          Torsten Bronger <bronger@physik.rwth-aachen.de> writes:
          > I was rather stunned, too, when I read his line of thought.
          > Nevertheless, I think it's not pointless, albeit formulated in an
          > awkward way. Of course, Python has not been deliberately slowed
          > down.

          Indeed -- and I'm really not sure what defect in someone's reading skills or
          my writing skills would make anyone think I suggested this.

          'as


          • Paul Boddie

            #65
            Re: Python vs. Lisp -- please explain

            Torsten Bronger wrote:
            >
            > By the way, this is my main concern about optional static typing: It
            > may change the target group, i.e. it may move Python closer to those
            > applications where speed really matters, which again would have an
            > effect on what will be considered Pythonic.

            Yes, I think that with optional static typing, it's quite likely that
            we would see lots of unnecessary declarations and less reusable code
            ("ints everywhere, everyone!"), so I think the point about not
            providing people with certain features is a very interesting one, since
            people have had to make additional and not insignificant effort to
            optimise for speed. One potential benefit is that should better tools
            than optional static typing be considered and evaluated, the "ints
            everywhere!" line of thinking could prove to be something of a dead end
            in all but the most specialised applications. Consequently, the Python
            platform could end up better off, providing superior tools for
            optimising performance whilst not compromising the feel of the language
            and environment.

            Paul


            • Alexander Schmolck

              #66
              Re: Python vs. Lisp -- please explain

              "Kay Schluehr" <kay.schluehr@gmx.net> writes:

              > Alexander's hypothesis is completely absurd.

              You're currently not in the best position to make this claim, since you
              evidently misunderstood what I wrote (I certainly did not mean to suggest that
              Guido *deliberately* chose to make python slow; quite the opposite in fact).

              Maybe I wasn't sufficiently clear, so if rereading my original post doesn't
              bring about enlightenment, I'll try a restatement.
              [color=blue]
              > It turned out over the years that the optimization capabilities of Python are
              > lower than those of Lisp and Smalltalk. But it's a system effect and
              > epiphenomenon of certain design decisions.

              The point is that the design decisions, certainly for Common Lisp, scheme and
              particularly for dylan, were also informed by what could be done
              *efficiently*, because the people who designed these languages knew a lot
              about advanced compiler implementation strategies for dynamic languages and
              thought that they could achieve high levels of expressiveness whilst retaining
              the possibility of very fast implementations (IIRC dylan specifically was
              meant to be something like within 90% of C performance). CL and dylan were
              also specifically designed for building very large and sophisticated systems,
              whereas it seems Guido originally thought that python would scale to about 500
              LOC.
              > This might change with PyPy - who knows? The Lisp/Smalltalk design is
              > ingenious, radical and original and both languages were considered too slow
              > for many real-world-applications over decades. But no one has ever claimed
              > that Alan Kay intentionally created a slow language in order to hold the
              > herd together - and it accidentally turned out to be reasonably fast with
              > JIT technology in the late 90s.

              I'm pretty sure this is wrong. Smalltalk and Lisp were both quite fast and
              capable before JIT technology came along in the '90s -- just not necessarily
              on hardware optimized for C-like languages, partly because no one anticipated
              that the x86 and co. would become so dominant (I also roughly remember Alan
              Kay expressing his frustration not too long ago over the fact that despite a
              50000-fold increase in processing speed, current hardware would only run early
              Smalltalk 100x faster than the Lisa -- I almost certainly misremember the
              details but you get the picture).
              > A conspiracy-like theory used to explain what's going on is needless.

              Indeed.

              'as


              • Paul Rubin

                #67
                Re: Python vs. Lisp -- please explain

                63q2o4i02@sneakemail.com writes:
                > I'm wondering if someone can explain to me please what it is about
                > Python that is so different from Lisp that it can't be compiled into
                > something as fast as compiled Lisp? From this above website and
                > others, I've learned that compiled Lisp can be nearly as fast as C/C++,
                > so I don't understand why Python can't also eventually be as efficient?
                > Is there some *specific* basic reason it's tough?

                The issues of compiling Python and compiling Lisp are similar. Lisp
                implementers tend to care about performance more, so Lisp tends to be
                compiled. There's a Python compiler called Psyco which can be used
                with CPython and which will be part of PyPy. I'd expect its output
                code to be comparable to compiled Lisp code.


                • Paul Rubin

                  #68
                  Re: Python vs. Lisp -- please explain

                  Steven D'Aprano <steve@REMOVETHIScyber.com.au> writes:
                  > > > efficient? Is there some *specific* basic reason it's tough? Or is it
                  > > > that this type of problem in general is tough, and Lisp has 40+ years
                  > > > vs Python's ~15 years?
                  > >
                  > > It is by design.
                  > Python is not slow by design. Python is dynamically typed by design, and
                  > relative slowness is the trade-off that has to be made to give dynamic
                  > types.

                  I think both of you are missing the point of the question, which is
                  that Lisp is dynamically typed exactly the way Python is and maps to
                  Python almost directly; yet good Lisp implementations are much faster
                  than CPython.
                  > The Python developers have also done marvels at speeding up Python since
                  > the early days, with the ultimate aim of the PyPy project to make Python
                  > as fast as C, if not faster. In the meantime, the question people should
                  > be asking isn't "Is Python fast?" but "Is Python fast enough?".

                  That is the real answer: CPython doesn't reach performance parity with
                  good Lisp implementations, but is still fast enough for lots of
                  purposes. Psyco and PyPy are ongoing efforts to close the performance
                  gap, and they are showing promise of success.


                  • Alexander Schmolck

                    #69
                    Re: Python vs. Lisp -- please explain

                    "Michele Simionato" <michele.simionato@gmail.com> writes:

                    > Alexander Schmolck wrote:
                    > > As common lisp and scheme demonstrate you can have high level of dynamism (and
                    > > in a number of things both are more dynamic than python) and still get very
                    > > good performance (in some cases close to or better than C).
                    >
                    > Just for personal enlightenment, where do you think Lisp is more dynamic
                    > than Python?
                    > Can you name a few features?

                    Sure (I'm assuming that by lisp you mean common lisp):

                    - development model

                    - by default the development model is highly interactive and you can
                    redefine all sorts of stuff (functions, methods, classes, even the syntax)
                    without having to start over from scratch (you can also save the current
                    state of affairs as an image). By contrast changing modules or classes in
                    interactive python sessions is rather tricky to do without screwing things
                    up, and it doesn't support images, so sessions tend to be much more
                    short-lived.

                    Emacs+slime also offers more powerful functionality than emacs+python mode
                    (autocompletion, jumping to definitions, various parallel sessions or
                    remotely connecting to running applications via sockets etc.)

                    - hot-patching

                    - partly as a consequence of the above, you can also arrange to hot-patch
                    some running application without too much trouble (IIRC Graham claims in
                    one of his essays that one of his favorite pranks was fixing a bug in the
                    application whilst on the phone to the customer as she reported it and
                    then telling her that everything in fact seemed to work fine and to try
                    again)

                    - error handling:

                    - by default you end up in the debugger if something goes wrong and in many
                    cases you can correct it in place and just continue execution (if you've
                    ever had some long numerical computation die on plotting the results
                    because of a underflow as you exponentiate to plot in transformed
                    coordinates, you'd appreciate being able to just return 0 and continue)

                    - Apart from "usual" exception handling CL has much more powerful resumable
                    exceptions that offer far more fine grained error handling possibiliies.

                    For more info let me recommend the chapter from Peter Seibel's excellent
                    "practical common lisp" (available in print, and also freely online):

                    <http://www.gigamonkeys.com/book/beyond-exception-handling-conditions-and-restarts.html>

                    (The whole book is a great resource for people who want to have a quick
                    play with common lisp and see how its features can be leveraged for real
                    applications (such as html creation, or writing an mp3 server or id3
                    parser). Peter also makes an easy to install, preconfigured "lispbox"
                    bundle of Emacs+a free lisp implementation available on his web-page).

                    - OO:

                    - You know this already, but I'd argue that multimethods and method
                    combinations give you more dynamism than class-centric OO.

                    - Also, if you change the definition of a class all existing instances will
                    be updated automatically. You can get a similar effect in python by
                    laboriously mutating the class, provided it doesn't use __slots__ etc, but
                    that is more brittle and also more limited -- for example, in CL you can specify what
                    happens to instances when the class is updated.
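
                    A minimal Python 3 sketch of that mutation technique (the
                    Point class here is illustrative, not from the thread):

```python
# Method lookup in Python goes through the class at call time, so a
# method added to the class later is visible to pre-existing instances.
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

p = Point(3, 4)                 # instance created *before* the change

def norm(self):
    return (self.x ** 2 + self.y ** 2) ** 0.5

Point.norm = norm               # mutate the class in place
assert p.norm() == 5.0          # the old instance sees the new method
```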

                    - chameleon like nature:

                    It's much easier to make CL look like something completely different (say prolog
                    or some indentation based, class-centric language like python) than it would
                    be with python. In particular:

                    - there are no reserved keywords

                    - you can e.g. implement new syntax like python style """-strings easily
                    (I've done so in fact) with reader macros. Indentation based syntax should
                    also be possible along the same lines, although I haven't tried.

                    - you can introduce pretty much any sort of control structure you might
                    fancy and you can carry out very sophisticated code transformations behind
                    the scenes.

                    - finally, with Lispmachines there at least used to be whole operating systems
                    written all the way down in lisp and according to all the testimony you
                    could interactively modify quite basic system behaviour. The only extant
                    systems that come even remotely close would be smalltalks, but even squeak
                    which is incredibly malleable and a cross-platform mini-OS in its own right
                    uses the host OS for many basic tasks.


                    'as


                    • Donn Cave

                      #70
                      Re: Python vs. Lisp -- please explain

                      Quoth Steven D'Aprano <steve@REMOVETHIScyber.com.au> :
                      ....
                      | Nobody denies that Python code running with no optimization tricks is
                      | (currently) slower than compiled C code. That's a matter of objective
                      | fact. Nobody denies that Python can be easily run in interactive mode.
                      | Nobody denies that *at some level* Python code has to be interpreted.
                      |
                      | But ALL code is interpreted at some level or another. And it is equally
                      | true that at another level Python code is compiled. Why should one take
                      | precedence over the other?

                      I have no idea, what precedence? All I'm saying is that Python matches
                      what people think of as an interpreted language. You can deny it, but
                      it's going to look like you're playing games with words, and to no
                      real end, since no one could possibly be deceived for very long. If you
                      give me a Python program, you have 3 choices: cross your fingers and
                      hope that I have the required Python interpreter version, slip in a
                      25Mb Python interpreter install and hope I won't notice, or come clean
                      and tell me that your program needs an interpreter and I should check to
                      see that I have it.

                      Donn Cave, donn@drizzle.com

                      Comment

                      • Steven D'Aprano

                        #71
                        Re: Python vs. Lisp -- please explain

                        Donn Cave wrote:
                        [color=blue]
                        > Quoth Steven D'Aprano <steve@REMOVETHIScyber.com.au> :
                        > ...
                        > | Nobody denies that Python code running with no optimization tricks is
                        > | (currently) slower than compiled C code. That's a matter of objective
                        > | fact. Nobody denies that Python can be easily run in interactive mode.
                        > | Nobody denies that *at some level* Python code has to be interpreted.
                        > |
                        > | But ALL code is interpreted at some level or another. And it is equally
                        > | true that at another level Python code is compiled. Why should one take
                        > | precedence over the other?
                        >
                        > I have no idea, what precedence?

                        There seem to be two positions in this argument:

                        The "Python is interpreted and not compiled" camp, who
                        appear to my eyes to dismiss Python's compilation stage
                        as a meaningless technicality.

                        The "Python is both interpreted and compiled" camp, who
                        believe that both steps are equally important, and to
                        raise one over the other in importance is misleading.

                        > All I'm saying is that Python matches
                        > what people think of as an interpreted language.

                        Most people in IT I know of still think of
                        "interpreted" as meaning that every line of source code
                        is parsed repeatedly every time the code is executed.
                        Even when they intellectually know this isn't the case,
                        old habits die hard -- they still think of
                        "interpreted" as second class.

                        If you think that Python has to parse the line "print
                        x" one hundred times in "for i in range(100): print x"
                        then you are deeply, deeply mistaken.

                        That's why Sun doesn't describe Java as interpreted,
                        but as byte-code compiled. They did that before they
                        had JIT compilers to compile to machine code.
                        Consequently nobody thinks of Java source having to be
                        parsed, and parsed, and parsed, and parsed again.

                        > You can deny it, but
                        > but it's going to look like you're playing games with words, and to no
                        > real end, since no one could possibly be deceived for very long.

                        Pot, meet kettle.

                        A simple question for you: does Python compile your
                        source code before executing it? If you need a hint,
                        perhaps you should reflect on what the "c" stands for
                        in .pyc files.
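
                        That compilation step is easy to demonstrate (a small
                        sketch using the stdlib py_compile module, in modern
                        Python 3 syntax):

```python
# py_compile byte-compiles a source file to disk; the resulting .pyc
# file holds the compiled bytecode that the interpreter executes.
import os
import py_compile
import tempfile

src = os.path.join(tempfile.mkdtemp(), "hello.py")
with open(src, "w") as f:
    f.write("x = 1 + 1\n")

pyc_path = py_compile.compile(src)   # returns the path of the .pyc file
assert pyc_path.endswith(".pyc")
assert os.path.exists(pyc_path)
```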

                        > If you
                        > give me a Python program, you have 3 choices: cross your fingers and
                        > hope that I have the required Python interpreter version, slip in a
                        > 25Mb Python interpreter install and hope I won't notice, or come clean
                        > and tell me that your program needs an interpreter and I should check to
                        > see that I have it.

                        Hey Donn, here is a compiled program for the PowerPC,
                        or an ARM processor, or one of IBM's Big Iron
                        mainframes. Or even a Commodore 64. What do you think
                        the chances are that you can execute it on your
                        x86-compatible PC? It's compiled, it should just
                        work!!! Right?

                        No of course not. If your CPU can't interpret the
                        machine code correctly, the fact that the code is
                        compiled makes NO difference at all.

                        In other words, I have three choices:

                        - cross my fingers and hope that you have the required
                        interpreter (CPU);

                        - slip in an interpreter install (perhaps an emulator)
                        and hope you won't notice;

                        - or come clean and tell you that my program needs an
                        interpreter ("Hey Donn, do you have a Mac you can run
                        this on?") and you should check to see that you have it.



                        --
                        Steven.


                        • Ben Sizer

                          #72
                          Re: Python vs. Lisp -- please explain

                          Steven D'Aprano wrote:
                          > The "Python is both interpreted and compiled" camp, who
                          > believe that both steps are equally important, and to
                          > raise one over the other in importance is misleading.
                          > That's why Sun doesn't describe Java as interpreted,
                          > but as byte-code compiled. They did that before they
                          > had JIT compilers to compile to machine code.
                          > Consequently nobody thinks of Java source having to be
                          > parsed, and parsed, and parsed, and parsed again.

                          They also described it that way to help marketing, and I don't think
                          that should be overlooked. They would have known full well that calling
                          their language "interpreted" would have affected public perceptions.

                          It's interesting to see how culture affects things. You talked of 'IT
                          people' in your post and hold up Java as an example of how byte-code
                          doesn't mean slow, the implication being that Python uses the same
                          mechanisms as Java and therefore is good enough. In the general IT
                          world, Java is quite popular (to make a bit of an understatement) and
                          it would often be used as some sort of benchmark.

                          On the other hand, the backgrounds I have familiarity with are computer
                          game development and embedded development. In these areas, we would
                          point to Java as evidence that 'interpreted' bytecode is too slow and
                          that anything using a similar technology is likely to be a problem.

                          I'm not saying you're wrong, just highlighting that comparisons
                          themselves always sit in some wider context which can make the
                          comparison unimportant.

                          I think it's also important to note that 'interpreted' doesn't
                          necessarily mean 'parsed repeatedly'. Many older machines which came
                          with BASIC installed would store their statements in a tokenised form -
                          arguably bytecode with a large instruction set, if you look at it a
                          certain way. This isn't necessarily so far from what Python does, yet
                          few people would argue that those old forms of BASIC weren't
                          interpreted.
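
                          For what it's worth, CPython makes this easy to see for yourself:
                          the standard library's `dis` module shows the bytecode the compiler
                          produces for a function. A minimal sketch, using only the standard
                          library (the exact opcode names vary between CPython versions):

```python
import dis

def add(a, b):
    # CPython compiles this body to bytecode ahead of execution,
    # much like a tokenised BASIC statement or a Java method body.
    return a + b

# List the opcode names the compiler produced for add().
opnames = [ins.opname for ins in dis.Bytecode(add)]
print(opnames)
```

                          Whatever you call that, it's clearly not machine code, and it's
                          clearly not re-parsing the source on every call either.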

                          --
                          Ben Sizer

                          Comment

                          • Kay Schluehr

                            #73
                            Re: Python vs. Lisp -- please explain


                            Alexander Schmolck wrote:
                            > "Kay Schluehr" <kay.schluehr@gmx.net> writes:
                            >
                            > > Alexanders hypothesis is completely absurd.
                            >
                            > You're currently not in the best position to make this claim, since you
                            > evidently misunderstood what I wrote (I certainly did not mean to suggest that
                            > Guido *deliberately* chose to make python slow; quite the opposite in fact).

                            Like everyone else. It's sometimes hard to extract the intended
                            meaning, in particular if it's opposed to the published one. I
                            apologize if I overreacted.

                            > Maybe I wasn't sufficiently clear, so if rereading my original post doesn't
                            > bring about enlightenment, I'll try a restatement.
                            >
                            > > It turned out over the years that capabilities of Python optimization are
                            > > lower than those of Lisp and Smalltalk. But its a system effect and
                            > > epiphenomenon of certain design decisions.
                            >
                            > The point is that the design decisions, certainly for Common Lisp, scheme and
                            > particularly for dylan were also informed by what could be done
                            > *efficiently*, because the people who designed these languages knew a lot
                            > about advanced compiler implementation strategies for dynamic languages and
                            > thought that they could achieve high levels of expressiveness whilst retaining
                            > the possibility of very fast implementations (IIRC dylan specifically was
                            > meant to be something like within 90% of C performance). CL and dylan were
                            > also specifically designed for building very large and sophisticated systems,
                            > whereas it seems Guido originally thought that python would scale to about 500
                            > LOC.

                            O.K. To repeat it in a more accurate manner: Python was originally
                            designed by Guido to be a scripting language for a new OS, a more
                            complete version of a shell scripting language. Unlike those, its
                            design was strongly influenced by the usability ideas of the ABC
                            development team. Speed considerations were therefore not the
                            primary concern, but rather an open model that was easily extendable
                            both on the C-API level and the language level. So a VM architecture
                            was chosen to achieve this. Adding new opcodes should have been as
                            simple as interfacing with the C-API. After the language grew
                            strongly in the late 90s, large-scale projects emerged such as Zope,
                            and many users started to request more Python performance since they
                            wanted to escape from the dual-language model. Writing C code was
                            not self-evident for a new generation of programmers who grew up
                            with Java, and the FFI turned out to be a hurdle. After the
                            remodeling of the object core ( "new style classes" ), progressive
                            optimizations took hold. In 2002 a new genius programmer entered the
                            scene, namely Armin Rigo, who came up with Psyco and launched the
                            PyPy project together with a few other Python hackers in order to
                            aggressively optimize Python using Python's introspective
                            capabilities. That's where we still are.

                            Remembering the historical context, we might draw some parallels to
                            other contexts and language design intentions. We might also figure
                            out parallels and differences between the motives of language
                            designers and the leading persons who drive language evolution.
                            Python is not just Guido, although his signature is quite pervasive.
                            In his latest musings he comes back to his central idea of language
                            design as a kind of user interface design. It's probably this shift
                            in perspective that can be attributed as original to him and which
                            goes beyond making things just "simple" or "powerful" or "efficient"
                            ( at least he made this shift public and visible ). It is also the
                            most controversial aspect of the language because it is still
                            inseparable from technical decisions ( non-true closures, explicit
                            self, statement-expression distinction, anonymous functions as
                            expressions with limited abilities, etc. )
                            Kay

                            Comment

                            • Chris Mellon

                              #74
                              Re: Python vs. Lisp -- please explain

                              On 2/20/06, Donn Cave <donn@drizzle.com> wrote:
                              > Quoth Steven D'Aprano <steve@REMOVETHIScyber.com.au> :
                              > ...
                              > | Nobody denies that Python code running with no optimization tricks is
                              > | (currently) slower than compiled C code. That's a matter of objective
                              > | fact. Nobody denies that Python can be easily run in interactive mode.
                              > | Nobody denies that *at some level* Python code has to be interpreted.
                              > |
                              > | But ALL code is interpreted at some level or another. And it is equally
                              > | true that at another level Python code is compiled. Why should one take
                              > | precedence over the other?
                              >
                              > I have no idea, what precedence? All I'm saying is that Python matches
                              > what people think of as an interpreted language. You can deny it, but
                              > it's going to look like you're playing games with words, and to no
                              > real end, since no one could possibly be deceived for very long. If you
                              > give me a Python program, you have 3 choices: cross your fingers and
                              > hope that I have the required Python interpreter version, slip in a
                              > 25Mb Python interpreter install and hope I won't notice, or come clean
                              > and tell me that your program needs an interpreter and I should check to
                              > see that I have it.


                              You're correct as far as it goes, but can you provide a reasonable
                              definition for "interpreted" that matches the common usage? Most
                              people can't.

                              When asked to name some interpreted (or scripting) languages, they'll
                              name some off - perl, python, ruby, javascript, basic...

                              They won't say Java. Ask them why Python is interpreted and Java isn't
                              and you'll have a hard time getting a decent technical answer, because
                              Python isn't all that different from Java in that regard, especially
                              pre-JIT versions of Java.

                              Probably the most accurate definition of "interpreted" as it is used
                              in the wild is "one of these languages: perl, python, ruby, etc".
                              That is, you're essentially claiming that Python is interpreted
                              because everyone thinks of it that way, technical correctness be
                              damned.

                              There is an obvious difference between Python and C. Nobody would deny
                              that. But it's a fairly hard thing to *quantify*, which is why people
                              make sloppy categorizations. That's not a problem as long as there
                              isn't prejudice associated with the categorization, which there is.

                              I wonder how "interpreted" people would think Python is if the
                              automagic compilation to .pyc was removed and you had to call
                              "pythonc" first.
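
                              That explicit step actually exists today: the standard library's
                              `py_compile` module is more or less that "pythonc". A small sketch
                              (the throwaway module name is made up for illustration):

```python
import os
import py_compile
import tempfile

# Write a throwaway module, then compile it explicitly -- roughly the
# step CPython performs automatically when a module is first imported.
src = os.path.join(tempfile.mkdtemp(), "hello.py")
with open(src, "w") as f:
    f.write("print('hello')\n")

# Returns the path of the generated .pyc (bytecode) file.
pyc_path = py_compile.compile(src, doraise=True)
print(pyc_path)
```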

                              > Donn Cave, donn@drizzle.com
                              > --
                              > http://mail.python.org/mailman/listinfo/python-list

                              Comment

                              • Paul Boddie

                                #75
                                Re: Python vs. Lisp -- please explain

                                Chris Mellon wrote:
                                >
                                > You're correct as far as it goes, but can you provide a reasonable
                                > definition for "interpreted" that matches the common usage? Most
                                > people can't.

                                I thought Torsten's definition was good enough: if the instructions
                                typically produced when preparing your programs for execution can be
                                handled directly by the CPU then let's call it a "compiled language";
                                otherwise, let's call it an "interprete d language". I think we all know
                                about the subtleties of different levels of virtual machines, but if
                                you want an arbitrary definition that lots of people feel is intuitive
                                then that's the one to go for.

                                > When asked to name some interpreted (or scripting) languages, they'll
                                > name some off - perl, python, ruby, javascript, basic...

                                Right: compiled Perl and Python instructions typically aren't executed
                                directly by the hardware; Ruby visits the parse tree when executing
                                programs (see [1] for some casual usage of "interpreted" and "compiled"
                                terms in this context), although other virtual machines exist [2];
                                JavaScript varies substantially, but I'd imagine that a lot of the
                                implementations also do some kind of parse tree walking (or that the
                                developers don't feel like documenting their bytecodes), although you
                                can also compile JavaScript to Java class files [3]; BASIC varies too
                                much for any kind of useful summary here, but I'd imagine that early
                                implementations have tainted the language's "compiled" reputation
                                substantially.

                                > They won't say Java. Ask them why Python is interpreted and Java isn't
                                > and you'll have a hard time getting a decent technical answer, because
                                > Python isn't all that different from Java in that regard, especially
                                > pre-JIT versions of Java.

                                That's why I put Java and Python in the same category elsewhere in this
                                thread. Bear in mind, though, that Java's just-in-time compilation
                                features were hyped extensively, and I imagine that many or most
                                implementations have some kind of native code generation support,
                                either just-in-time or ahead-of-time.

                                > Probably the most accurate definition of "interpreted" as it is used
                                > in the wild is "one of these languages: perl, python, ruby, etc".
                                > That is, you're essentially claiming that Python is interpreted
                                > because everyone thinks of it that way, technical correctness be
                                > damned.

                                Well, I think Torsten's definition was more objective and yet arrives
                                at the same result. Whether we're happy with that result, I have my
                                doubts. ;-)

                                > There is an obvious difference between Python and C. Nobody would deny
                                > that. But it's a fairly hard thing to *quantify*, which is why people
                                > make sloppy categorizations. That's not a problem as long as there
                                > isn't prejudice associated with the categorization, which there is.

                                I refer you again to Torsten's definition.

                                > I wonder how "interpreted" people would think Python is if the
                                > automagic compilation to .pyc was removed and you had to call
                                > "pythonc" first.

                                Well, such things might have a psychological impact, but consider
                                removing Python's interactive mode in order to enhance Python's
                                non-interpreted reputation, and then consider Perl (an interpreted
                                language according to the now-overly-referenced definition) which
                                doesn't have an interactive mode (according to [4] - I don't keep up
                                with Perl matters, myself), but which allows expression evaluation at
                                run-time. No-one would put Perl together with C in a compiled vs.
                                interpreted categorisation. Removing the automatic compilation support
                                might strengthen the compiled feel of both languages further, but
                                with knowledge of the technologies employed, both languages (at least
                                in their mainstream forms) are still on the other side of the fence
                                from C.
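
                                Python has the same run-time evaluation property, of course: source
                                text can be compiled into a code object and executed long after the
                                program started. A minimal sketch using only built-ins:

```python
# Compile a source string to a code object at run time, then run it.
code = compile("x * 2 + 1", "<string>", "eval")

# What came out is VM bytecode, not machine code; eval() hands it
# to the interpreter loop with the given namespace.
result = eval(code, {"x": 20})
print(result)  # -> 41
```

                                That run-time compilation step, always available and invisible to
                                the user, is a large part of what keeps both languages on the
                                interpreted side of Torsten's fence.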

                                Paul

                                [1] http://www.rubygarden.org/faq/entry/show/126
                                [2] http://www.atdot.net/yarv/
                                [3] http://www.mozilla.org/rhino/doc.html
                                [4] http://dev.perl.org/perl6/rfc/184.html

                                Comment
