I think you need to revise a good number of these optimizations. Many of
them are incomplete, targeted at very specific situations, or outright
wrong. Several could also be considered "premature" optimizations.
For example:
Instead of For.Each construct within the iteration you must use Normal For
Loop Construct as For.Each creates an overhead.
Yes, it does, but have you measured how much of an overhead it creates?
It's minimal, at best, and provides a MUCH better way of iterating through
resources, as well as establishes a well-known interface which is now used
everywhere. It's an optimization that I could never recommend in good
conscience.
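To make the trade-off concrete, the two loop styles are easy to put side by side. A minimal sketch, in Java for illustration, since its foreach is built on the same iterator pattern as .NET's For Each/IEnumerable:

```java
import java.util.Arrays;
import java.util.List;

public class ForEachDemo {
    // Indexed loop: only works when the collection supports positional access.
    static int sumIndexed(List<Integer> items) {
        int total = 0;
        for (int i = 0; i < items.size(); i++) {
            total += items.get(i);
        }
        return total;
    }

    // foreach: works on any Iterable and lets the collection pick its own
    // best traversal strategy via its Iterator.
    static int sumForEach(Iterable<Integer> items) {
        int total = 0;
        for (int value : items) {
            total += value;
        }
        return total;
    }

    public static void main(String[] args) {
        List<Integer> data = Arrays.asList(1, 2, 3, 4, 5);
        System.out.println(sumIndexed(data)); // 15
        System.out.println(sumForEach(data)); // 15
    }
}
```

Both produce the same result; the foreach version is the one that keeps working when the collection is anything other than a random-access list.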
Well, you're just setting yourself up now... and note that your article
contains almost nothing that would actually optimise code... nor does it
tell the reader which code would be usefully optimised (leaving them open
to wasting time on premature optimisation).
Just my opinion...
1: and if I want to do parallel database access, perhaps on different
threads? in fact, I would suggest let the connection pool worry about this,
and create / dispose connections as you need them; this doesn't remove them
from the pool (urban myth)
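A minimal sketch of that "create/dispose as you need them" pattern; the Pool and PooledConnection classes here are hypothetical stand-ins for a real provider's pool (in Java for illustration):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class PoolDemo {
    // Hypothetical connection: "closing" it returns it to the pool
    // rather than tearing down the physical connection.
    static class PooledConnection implements AutoCloseable {
        private final Pool owner;
        PooledConnection(Pool owner) { this.owner = owner; }
        @Override public void close() { owner.release(this); }
    }

    // Hypothetical pool: hands out idle connections, creating new ones
    // only when none are available.
    static class Pool {
        private final Deque<PooledConnection> idle = new ArrayDeque<>();
        int created = 0;

        synchronized PooledConnection acquire() {
            if (idle.isEmpty()) {
                created++;
                return new PooledConnection(this);
            }
            return idle.pop();
        }

        synchronized void release(PooledConnection c) { idle.push(c); }
    }

    public static void main(String[] args) {
        Pool pool = new Pool();
        // Two sequential units of work: each acquires and "disposes" a
        // connection, yet only one physical connection is ever created.
        for (int i = 0; i < 2; i++) {
            try (PooledConnection c = pool.acquire()) {
                // ... run a query; each thread can do the same safely ...
            }
        }
        System.out.println(pool.created); // 1
    }
}
```

Disposing the connection just hands it back to the pool; it does not destroy it, which is why the "keep one connection open forever" advice buys you nothing and costs you parallelism.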
2: erm... not very useful without a "by..."; on the occasion you need to do
different things under different circumstances, you need *some* kind of
branching construct
3: not always true; on an array, maybe "for" with indexer access is
quicker - but without knowing what the underlying item is this is just
wrong. On a linked list, for instance, you could be changing the performance
from O(n) to O(n^2), since each indexed access has to walk the list from
one end [the total is n(n+1)/2 node hops, an arithmetic series, which is
O(n^2)]. Also, you can't use "for" on an
IEnumerable - only "foreach". Your assumption should be that the iterator is
well designed and knows the best way to enumerate that structure.
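The linked-list case can be sketched like this (again in Java for illustration, whose LinkedList has the same O(n) indexer problem):

```java
import java.util.LinkedList;
import java.util.List;

public class TraversalDemo {
    // Indexed access: on a LinkedList, each get(i) walks the nodes from
    // an end, so the whole loop does roughly n(n+1)/2 hops -- O(n^2).
    static long sumByIndex(List<Integer> items) {
        long total = 0;
        for (int i = 0; i < items.size(); i++) {
            total += items.get(i);
        }
        return total;
    }

    // foreach delegates to the list's own iterator, which just follows
    // the next pointers: n hops total -- O(n).
    static long sumByIterator(List<Integer> items) {
        long total = 0;
        for (int value : items) {
            total += value;
        }
        return total;
    }

    public static void main(String[] args) {
        List<Integer> list = new LinkedList<>();
        for (int i = 1; i <= 1000; i++) list.add(i);
        System.out.println(sumByIndex(list));    // 500500
        System.out.println(sumByIterator(list)); // 500500
    }
}
```

Same answer either way, but only the iterator version scales; that is exactly the "trust the well-designed iterator" point.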
4: doesn't mean much to me I must admit
5: "towards XML so that Application can be made lightweight"... ahh, what
thedailywtf would say about this... xml, although versatile, is *not* a
lightweight data structure. DOMs are worse, but even raw xml is vastly more
costly than proper objects. Not sure what the choice of XML has to do with
the number of roundtrips - these are disparate concerns.
6: And you expect DataSets to improve performance?
7: When used correctly, agreed
8: whatever
9: UDF vs SP is odd. you can't call UDF directly, so... and you have less
access to certain features. I'm not saying they are bad, just that they do
different things to SPs, so it doesn't necessarily make sense to compare
them directly like this
10: Not sure what this means, but not sure I agree
11: How the heck does making the "Code More Generic" help with the need to
loop or branch? It doesn't. It also doesn't make the code shorter or
improved.
Limit the number of variables? Not a chance. Reusing "non-temp" variables
leads to confusing code. Also, local variable allocation always takes space
on the stack, but reserving that space is a constant-time operation: when a
stack frame is entered, the space for all local variables is reserved with
a single machine-language instruction, which on the x86 architecture is
"SUB ESP, <bytes needed>". <bytes needed> is determined by the compiler at
compile time, not by the runtime at runtime. Deferring initialization of
objects with the "new" keyword, on the other hand, may improve performance,
because the runtime won't allocate memory until the object is actually
initialized.
If the number of variables is causing performance problems, the only real
solution is to add memory to the system.
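That lazy-allocation point can be sketched like this (in Java for illustration; the Expensive class is a hypothetical stand-in for any costly object):

```java
public class LazyDemo {
    static int allocations = 0;

    // Stands in for an object whose construction is expensive.
    static class Expensive {
        Expensive() { allocations++; }
    }

    private Expensive cached; // declared, but nothing allocated yet

    // Allocate only on first use; the declaration alone reserves just a
    // reference slot, no heap memory.
    Expensive get() {
        if (cached == null) {
            cached = new Expensive();
        }
        return cached;
    }

    public static void main(String[] args) {
        LazyDemo demo = new LazyDemo();
        System.out.println(allocations); // 0 -- declaring costs nothing
        demo.get();
        demo.get();
        System.out.println(allocations); // 1 -- created once, on first use
    }
}
```

Declaring the variable is effectively free; it is the "new" that costs, and only when you actually reach it.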
Mike Ober.
"softwareakash" <softwareakash@gmail.com> wrote in message
news:1155305734.482545.222460@75g2000cwc.googlegroups.com...
> Hi All
>
> Some tips for optimizing your .Net Code.
> You can add more by leaving comments
>
Thanks for your replies. I agree that the tips were very immature.
I am temporarily removing the tips and will post them again after
careful reworking.