Re: A serious drawback of Tables/CSS over frames [was: A serious drawback of CSS]
On Wed, 25 Jan 2006, Martin Lucas-Smith wrote:
[color=blue]
> Personally the drawbacks of frames are so incredibly great[/color]
One can agree with you up to that point!
[color=blue]
> that server-side processing becomes a must on any site above a very
> small number of pages.[/color]
That depends on quite what people understand by "server-side
processing". Server-side processing *at the point of delivery* tends
to produce documents which have no usable last-modified information,
causing problems with cacheability. One needs a certain minimum level
of expertise with the various techniques (SSI, PHP, whatever) to
achieve the levels of cacheability that come "for free" when documents
are served-out from plain files.
Except where there is a genuine requirement for particular page(s) to
be kept up-to-the-minute by incorporating content which is frequently
changing, I would tend to recommend doing that processing at the point
where the materials are published /to the server/. I.e. run your
scripts to assemble the finished web page (maybe outputting the result
to your development server running on 127.0.0.1), review the results,
then transfer the finished product to the production server. Don't
change the files on the production server (i.e. don't change their
last-modified stamp) unless and until their content really changes.
There are many ways of achieving that. The internal process could be
XML-based. Or indeed it could be based on familiar server-side syntax
such as SSI, PHP etc.
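The publish-time approach can be sketched as follows (a minimal illustration, not a recommended tool; the SSI-style include marker and file names are assumptions for the example):

```python
import re
from pathlib import Path

# SSI-style include directive, expanded once at publishing time
# rather than on every request.
INCLUDE = re.compile(r'<!--#include file="([^"]+)"\s*-->')

def assemble(template_path, include_dir):
    """Expand include directives in a template, returning the finished page."""
    text = Path(template_path).read_text()
    return INCLUDE.sub(
        lambda m: (Path(include_dir) / m.group(1)).read_text(), text)

def publish(template_path, include_dir, out_path):
    """Write the assembled page only if its content actually changed,
    so the file's last-modified stamp (and hence its cacheability)
    survives a rebuild that produced identical output."""
    new = assemble(template_path, include_dir)
    out = Path(out_path)
    if out.exists() and out.read_text() == new:
        return False   # unchanged: leave the mtime alone
    out.write_text(new)
    return True        # updated: a new Last-Modified is now honest
```

Run the script against a development copy, review the output, and only then transfer the changed files to the production server.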
[color=blue]
> It's why such facilities (SSI, includes, etc) exist:[/color]
Those using SSI on Apache could be advised to use "XBitHack full"
instead (and, on unix platforms, set the ug+x bits). That can go some
way to remedying the cacheability problem, although it's still not as
good as serving-out plain files (it sets Last-Modified, but doesn't
generate Content-Length or ETag data, which a plain file would).
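The Apache side of that could look like the following sketch (the directives are real Apache ones; where they go depends on the server's configuration):

```apache
# httpd.conf or .htaccess
Options +Includes   # enable mod_include (SSI) processing
XBitHack full       # parse .html files whose user-execute bit is set;
                    # with "full", files whose group-execute bit is also
                    # set get a Last-Modified header from the file mtime
```

Then, on a unix platform, `chmod ug+x page.html` for each page that uses includes.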
Those using PHP can be advised to read up about PHP's facilities for
setting HTTP headers, such as last-modified.
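The conditional-GET handshake those headers enable looks roughly like this (sketched here in Python purely for illustration; in PHP it would be done with header() and a check of the If-Modified-Since request header, and the function below is hypothetical, not part of any framework):

```python
import email.utils

def respond(resource_mtime, if_modified_since=None):
    """Send Last-Modified, and answer 304 Not Modified when the
    client's cached copy is still current."""
    last_modified = email.utils.formatdate(resource_mtime, usegmt=True)
    if if_modified_since:
        try:
            cached = email.utils.parsedate_to_datetime(
                if_modified_since).timestamp()
        except (TypeError, ValueError):
            cached = None
        if cached is not None and resource_mtime <= cached:
            # Client's copy is up to date: no body needed.
            return 304, {"Last-Modified": last_modified}, b""
    return 200, {"Last-Modified": last_modified}, b"<html>...</html>"
```

A cache that receives the 304 reuses its stored copy, which is exactly the responsiveness benefit discussed below.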
[color=blue]
> On Mon, 23 Jan 2006, Dario de Judicibus wrote:
>[color=green]
> > Often I have to code client-based pages, and in such a case I have
> > to duplicate a lot of stuff. That's crazy. XHTML has no import
> > statement, nor there is a way in CSS to specify division content
> > from another file.[/color]
>
> Yep, that's the way it's always been. For that you *have* to use
> server-side processing.[/color]
The processing has to be done before the document is served-out,
that's clear - and in that sense it's "server side" indeed. But in
most cases there is no *need* to perform that processing at the
instant of serving the document out. It can be done once, at
publishing time, and then left alone until another update is needed.
Obviously this approach also represents some saving in server
resources - but that isn't the main motivation for it - more to the
point, as I say, is the benefit of cacheability, which can make a web
site seem significantly more responsive to the user.
Routinely recommended tutorial: http://www.mnot.net/cache_docs/
(and see the author's "cacheability engine", which will report on
the cacheability of a web site).
regards