Re: Future reuse of code
In article <20030818202734.4b497140.spamtrap@flash-gordon.me.uk>, Mark Gordon wrote:
[snipping just a few points]
>> Hmm. Prototypes etc? Afaik K&R had no modules or prototypes at all,
>> the separate compilation were just externals, and parameters didn't
>> even have to match. (iow everything happened at linker, not language,
>> level)
>
> K&R did not have prototypes, however the specification *did* allow for
> the separate compilation of modules. That was standardised by C89.
> However the language *did* specify that you could compile separate
> modules (even having an extern keyword for specifying that an object
> was external). It just does not specify how you invoke *any* of the
> tools.
If that is separate compilation of _modules_, then you are right. IMHO it
isn't, since there is no interaction between the modules whatsoever at the
compiler level, only a simple linker trick and a modifier.
One is trying to get several programs into one binary, not making one
program consist of several modules.
But maybe my concept of a module is a bit different from yours. Blame it
on my Modula-2 years.
Even with prototypes I have a bit of a problem with it, but maybe I'm not
understanding prototypes right. At least there is some interaction.
> The original definition of Pascal did *not* specify any support for
> separate modules and provided no mechanism for specifying that an object
> was external, so any support for separate compilation of modules was an
> extension.
It still doesn't, I think. "External", "mangling", "calling conventions" and
the like are all beyond the scope of the language, being compiler specific.
>> > The Texas Instruments implementation of C for the TMS320C1x/2X/5X
>> > was a fully conforming implementation and even came with a copy of
>> > K&R2 as part of the documentation set. This was at the same time I
>> > was dealing with all those different Pascals...
>>
>> K&R is no standard. ANSI is.
>
> The second edition (K&R2) is for ANSI standard C and is one of the most
> commonly used and recommended reference books for it.
IIRC that (K&R2) is quite late, isn't it? 1989 or so?
The extended standard dates from only a year later (1990), which is already
years after Borland introduced units. (And TP/BP was actually at its zenith
during those years, 1987-1992.)
So the only thing that remains is that you used non-standard Pascals
while using standard Cs?
That sounds like a cheap shot from an advocate, but it is actually the usual
pattern in Pascal vs C discussions.
C was simply lucky that it had a foundation in Unix; there is not much more
to say.
>> > compiler that supports ANSI C than Pascal written for any of the
>> > compilers I mentioned.
>>
>> That's not true afaik (but I'm no expert, just from the old BSD days).
>> Most K&R code mismatches or omits parameters between declaration and
>> definition. This was only fixed with the formal prototypes in some
>> later standard.
>
> Prototypes were added with the ANSI standard, but
> extern int foo();
> extern int bar;
> were valid ways of declaring function foo() and variable bar without
Declaring them, as in: the other module could check them?
>> > This is irrespective of whether it was embedded or
>> > non-embedded C or Pascal.
>>
>> Embedded versions are often simplified, and therefore often based
>> on older versions. Judging the state of Pascal by those is a bit odd.
>
> All the embedded C compilers I have used implemented the full
> specification for a free-standing implementation. All of the Pascals,
> including those that were *not* targeting embedded systems, used
> non-standard, non-portable mechanisms for allowing access to symbols
> defined in separately compiled modules.
Possible. Yet I had the same experience.
I didn't choose my compilers well (VC++, gcc (with BSD libc code) and C51),
and couldn't get the code to work universally.
Is it actually the standard that is the problem, or your choice of
compilers and code? Maybe it is just that Pascal had its finest hour a bit
early, and you had legacy code bases from pre-standard times to maintain?
And why do you put so much emphasis on that one item, which is severely
limited (in all implementations) and actually broken (in the original K&R)?
>> > C & C++ are two distinct languages, yet you can link them relatively
>> > easily...
>>
>> C++ nearly includes C, even though they are formally separate
>> languages.
>
> No, C++ will report diagnostics and probably abort compilation when
> trying to compile a *lot* of ANSI standard C. For example, if the result
> of a malloc call is not cast, C++ will reject it, whereas not casting it
> is the recommended approach in C.
That's why I said "nearly".
>> portable :-) You probably mean that C++ and C FROM THE SAME VENDOR
>> link reasonably well.
>
> The C++ standard explicitly defines some of how the linking of C and C++
> is to be handled, such as the 'extern "C"' stuff you see in a lot of
> headers. The C standard cooperates to the extent of guaranteeing a way
> of identifying at compile time whether a file is being compiled as C or
> C++ to allow you to share header files.
Ah. So it works if I take a totally isolated C compiler? How much name
mangling is defined (and guaranteed) by the standard?
You only signal something with extern "C". It's up to the implementation
to do something with it *and* both *implementations* actually have to match.
>> OTOH that is not a problem. Most Pascal compilers also link to C. They
>> probably also can link to each other, only on a deeper level (directly
>> passing file handles instead of file descriptors etc)
>
> However, there is no way defined to specify that external objects are
> external C objects, unlike with C++. There is also no way to include a C
> header file from a Pascal source file.
(Is there a way to import a Pascal module into C then?)
That's because Pascal doesn't include nearly the entire C language, as I
already said.
Are you btw sure it is C++ compatibility with C, or is it simply C++ and C
usually being the same compiler, or variants of the same compiler, that
makes this work?
IOW, does it work if I choose a C++ compiler "B" that has different
structure alignment rules than C compiler "A"? And will their runtimes be
compatible?
Anyway, so one has to convert the C headers into the Pascal syntax, and
either use non-standard (but pretty common) extensions like modifiers that
flag it for a certain "C" compiler (again e.g. alignment, but also base
type sizes), or put it in separate modules and compile them with
special parameters. There are some other ways too (like including a "C"
module that does this automatically). Frankly this is not really the
problem.
Yes, that wouldn't be much of a problem if C were clean and parsable, so
that one could convert headers automatically. However it isn't. It has a
macro processor in which a game of Tetris was implemented. Nuff said.
If you preprocess to kill the macros, you effectively don't have a header
anymore.
(Btw, you have IMHO hit a major problem with Unix here.
You can't determine the API without having a full C compiler conforming to
the exact implementation of the rest of the system. No wonder configure
is such an ugly hack. Microsoft tries to work on this by imposing strict
guidelines, and specifying a third in IDL. On *BSD and Linux it is near
impossible to process/convert headers. I looked into Solaris headers today,
and my first impression was that they were on the same level as *BSD.)
>> In practice yes. But the Borland group, while it far outnumbers ANSI,
>> is x86 (and often even x86/win32) centric. So if you go outside x86,
>> you'll encounter quite a lot of ANSI.
>
> Whatever the processor, you will find a lot of ANSI standard C, even for
> the x86/DOS/Win world.
I was speaking of ANSI Pascal above. And I was a typical Borland DOS user
before migrating to Unix in the early/mid nineties. Believe me, the
bulk was Borland/Microsoft specific code, and it still is.
Even if the language is close enough to ANSI C(++), the amount of extensions
and libs used in the average code is simply flabbergasting.
>> > I bet you can use a lot of conforming C code not written for the
>> > 8051 on an 8051, since I bet there is a conforming implementation.
>>
>> Maybe, but that would really strain those 256 bytes of memory.
>
> I just checked, and version 7 of C51 claims ANSI conformance.
Nice, do you have a URL? I can't check what version I have right now.
And is it for the actual 8051, or for compatibles that are a lot "richer"?
>>
>> Yes. It is a very limited help in trying to verify that. But in
>> practice, the standard is often not enough to build an average
>> application.
>
> I've written large C applications for real world problems as part of my
> job where only a small amount of isolated code was implementation
> dependent.
Ah sure, but _new_ code is never the problem. Pick the formal standard (or
in Pascal sometimes the de facto standard, Borland), and gone is the
problem.
Post in a Pascal group, and anybody will tell you exactly the same long
story for Pascal. And they all have codebases they port without problems.
But somehow, each time I'm called in to work on a codebase, be it C(++),
Java (!) or Pascal, and the project is non-trivial, it is a mess. Who knows,
maybe it is karma, and I was Charles Babbage in a former life, being
punished for never finishing the Analytical Engine.
I'm actually pretty deeply involved in porting a Pascal compiler (Delphi
dialect, so pretty advanced) to as many platforms as possible. That alone
is a codebase of 3 MB of pretty portable Pascal. (That's the compiler;
total project size is 50-100 MB, but there are a lot of headers in there.)
>> fix nearly every program when I got an Alpha machine)
>>
>> But except for that (and that original K&R code), you are somewhat
>> right; the problem is more that a fully standard C program is often
>> trivial, and no real app.
>>
>> Something like that is nice to show to students, but not something for
>> IRL.
>
> As I say, I work on large *real* applications using standard C for 90%
> (or more) of my work. Therefore it is useful for *real* work on *large*
> projects.
Sure. I know they exist. But it is the same for every other language. Java,
Pascal, you name it. One can be lucky (especially if you set it up yourself
at a time you already had some clue), but that is not the average situation.
The average situation is either code so old that it probably originates from
the Analytical Engine, and/or uses every dirty trick in the book.
See the Java discussion. Nice on paper, but in practice it gets you nowhere.
At least not as far as it claims.
>> > So do the majority of modern Pascal implementations support ANSI
>> > standard Pascal?
>>
>> Either that or Borland. Borland is proprietary, but so dominant that
>> smaller vendors (TMT, VP) follow it, and it also has a following in the
>> open source community.
>
> So, you still can't share code with embedded systems. I can and *have*
> done so for *real* work on *complex* applications.
Depends on your view of embedded. That 8051 with its 256 bytes would be hard
(though I'd want to see an average C app compiled for it too).
But I run Delphi code on my m68030. The 68k implementer is still tweaking
the code generator a bit to work on plain 68000s, but it would work.
I also have a Pascal compiler for my C64 that allows small programs to be
made, and here is the "51" variant:
(it has an extension for external procs, I saw)
However Pascal _is_ a bit more high level. Usually it requires more memory,
both for code and runtime, at least if you maintain your own programming
style. Don't forget Pascal is older than C.
However you can get by that if your compiler is somewhat smart (dead code
optimisation, inlining small funcs), if you don't mind being set back
to the C level. The only real limitation I can quickly think of is local
variables in the middle of a block. (Though in theory you could try to
put that part of the code in an inner procedure and have that inlined. Not
guaranteed, but C doesn't guarantee compilation efficiency either.)
Delphi is even worse in that department (compared to C++), and much
>>
>> Sure, but that was not what I said. I said the compilers can't compile
>> an average program from the other.
>
> Berkeley DB (the one used to drive the Amazon web site, amongst others)
> is built using gcc for Linux and VC++ for Windows. Maybe that is a well
> written program rather than an average program.
Is it the original version? I might actually have an old one on my BSD 4.3
tapes, if I had a device to read them. Let's see if gcc3 eats that. I can
adapt any codebase through time.
> The application I work on, several hundred thousand lines of code, used
> to be built for Windows using VC++, for Linux using gcc and for HPUX,
> SCO, AIX and Solaris using the standard compilers from the relevant OSes.
> This is using the *same* source files in all cases. It is also software
> that my company earns millions from annually.
See comments earlier. I have met codebases like that too. (Actually
in Modula-2, but that's close enough.)
> I changed this to standardising on gcc because I wanted to and gcc is
> available for all the targets we want to support.
Why standardise if you could already compile with all those compilers?
If you don't have to change a single char to have a code base for target
A run on target B, why would you?
>> > (something I always try to do) then the bulk of your code will
>> > compile and run correctly on both.
>>
>> And be fairly trivial.
>
> Do you think an application several hundred thousand lines long is
> trivial?
No. There are such cases. A compiler is a good example. BDB also, because
it only uses standard files.
> Or Berkeley DB, which is used to power a lot of major web sites?
Is that actually plain standard BDB, or something that only has its origins
in that? Hard to do that with plain C; you can't even create a critical
section or the like. Deadlocks all over the place :-)
Or are you confusing POSIX and ANSI C?
[time's up. Sorry]