|Bounding memory usage firstname.lastname@example.org (Fernando) (2009-09-08)|
|Re: Bounding memory usage email@example.com (BGB / cr88192) (2009-09-09)|
|Re: Bounding memory usage firstname.lastname@example.org (Fernando) (2009-09-10)|
|Re: Bounding memory usage email@example.com (=?ISO-8859-1?Q?Bj=F6rn_Franke?=) (2009-09-11)|
|Re: Bounding memory usage firstname.lastname@example.org (Mike Burrell) (2009-09-11)|
|From:||"BGB / cr88192" <email@example.com>|
|Date:||Wed, 9 Sep 2009 09:42:44 -0700|
|Posted-Date:||10 Sep 2009 04:08:40 EDT|
"Fernando" <firstname.lastname@example.org> wrote in message
> could some of you guys give me pointers to some static analyzes
> that are able to determine when a program consumes a bounded amount of
> heap space? I know that this problem will be undecidable in general,
> but I would appreciate even a very conservative estimation. I am
> mostly interested in compiling Java in this case, but pointers to
> general analyzes would be welcome.
In general, with Java, the number of cases where this can actually be done is likely so small as to make it not even worth the bother of trying.
However, I will note one possible option: one could implement a sort of specialized interpreter that attempts to simulate the program and then estimates its typical memory usage...
The great problem with this is that one would need to provide implementations, or at least dummies, for any "native" methods located within the class library (or else provide a dummy class library).
Or, much simpler, one could just run the app, use the "java console" to get the status of things like memory use, ... and infer bounds from that.
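As a minimal sketch of this run-it-and-measure approach, a program can also sample its own heap usage directly through the standard `Runtime` API, rather than reading it off the console. The class and method names here are illustrative, not from any real tool:

```java
public class HeapSample {
    // Currently used heap in bytes: total heap minus the free portion.
    public static long usedHeapBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        long before = usedHeapBytes();
        byte[] block = new byte[1 << 20]; // allocate roughly 1 MB
        long after = usedHeapBytes();
        System.out.println("used before: " + before + " bytes");
        System.out.println("used after:  " + after + " bytes");
        // Keep 'block' live so the allocation is observable.
        System.out.println("block length: " + block.length);
    }
}
```

Note that GC activity makes such samples noisy, so this gives an empirical estimate for one run, not a bound.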
However, I personally have strong doubts that static analysis would be capable of providing much of anything useful here.
Any straightforward approach is likely either to estimate far too low (for example, by ignoring recursion and loops), or to run into cases where the memory use comes out as "infinite" or "unboundedly large" (typically where allocation takes place in a loop, where recursion and 'new' are detected in the same place, or even just where there is recursion).
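To illustrate the "allocation in a loop" case, here is a hedged example of two methods a naive analyzer might treat very differently (the names are made up for illustration): the first has a statically bounded footprint, while the second allocates inside a loop whose trip count is unknown at compile time, so a conservative analysis has to report it as unbounded.

```java
import java.util.ArrayList;
import java.util.List;

public class NaiveBounds {
    // Bounded: exactly one 16-element array, regardless of input.
    public static int[] allocFixed() {
        return new int[16];
    }

    // Unbounded from a static point of view: 'n' is a runtime value,
    // and each iteration performs a fresh allocation.
    public static List<int[]> allocLoop(int n) {
        List<int[]> chunks = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            chunks.add(new int[16]); // 'new' inside a loop: flagged
        }
        return chunks;
    }

    public static void main(String[] args) {
        System.out.println(allocFixed().length); // 16
        System.out.println(allocLoop(3).size()); // 3
    }
}
```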
And for the cases where it does apply, the best tool is likely to be "intuition" / "common sense": one can estimate approximate memory use by visual inspection.
Granted, a tool could operate in terms of a "per-branch" analysis, which could either rule out unbounded memory use or selectively identify where it could occur, and allow an estimate for particular branches.
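One plausible way such a per-branch analysis could combine estimates (a sketch under my own assumptions, not anything from an existing tool): sequential code adds costs, an if/else takes the maximum of its arms, and a loop with allocation saturates to "unbounded", modeled here as Long.MAX_VALUE.

```java
public class BranchEstimate {
    public static final long UNBOUNDED = Long.MAX_VALUE;

    // a; b  -- costs add, and unbounded is absorbing.
    public static long seq(long a, long b) {
        if (a == UNBOUNDED || b == UNBOUNDED) return UNBOUNDED;
        return a + b;
    }

    // if/else -- a conservative bound is the worse of the two arms.
    public static long branch(long thenCost, long elseCost) {
        return Math.max(thenCost, elseCost);
    }

    // Loop with unknown trip count: any allocation in the body is unbounded.
    public static long loop(long bodyCost) {
        return bodyCost == 0 ? 0 : UNBOUNDED;
    }

    public static void main(String[] args) {
        // if (c) new int[64] else new int[16]; then new int[8]
        long e = seq(branch(64 * 4, 16 * 4), 8 * 4);
        System.out.println(e); // 288
        System.out.println(loop(16) == UNBOUNDED); // true
    }
}
```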
In this case, one solution would be to engage in AST-level stepping (which would require the source for the entire classpath, if the classpath is to be considered, as well as writing a parser, ...).
Alternatively, one could resort to partially simulating the bytecode (similar to the specialized-interpreter case), differing in that, for the most part, the code would not actually be "executed" (making the process vaguely similar to implementing a compiler+optimizer).
so, this specialized simulator would step along, tracing into method calls,
... and probably report "unbounded" whenever it runs across recursive cases.
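The "report unbounded on recursion" rule can be sketched as a simple reachability check over a call graph. Here the graph is hand-written for illustration; a real simulator would extract it while stepping through the bytecode:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class RecursionCheck {
    // True if method 'm' can reach itself through the call graph,
    // i.e. it participates in direct or mutual recursion.
    public static boolean reachesSelf(Map<String, List<String>> calls, String m) {
        Deque<String> work = new ArrayDeque<>(calls.getOrDefault(m, List.of()));
        Set<String> seen = new HashSet<>();
        while (!work.isEmpty()) {
            String cur = work.pop();
            if (cur.equals(m)) return true; // cycle back to m: recursive
            if (seen.add(cur))
                work.addAll(calls.getOrDefault(cur, List.of()));
        }
        return false;
    }

    public static void main(String[] args) {
        Map<String, List<String>> calls = new HashMap<>();
        calls.put("main", List.of("parse"));
        calls.put("parse", List.of("parseExpr"));
        calls.put("parseExpr", List.of("parse")); // mutual recursion
        System.out.println(reachesSelf(calls, "parse")); // true
        System.out.println(reachesSelf(calls, "main"));  // false
    }
}
```

Any method for which this check fires would then get its memory estimate forced to "unbounded", matching the conservative behavior described above.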
note that such a simulator would likely at least have to partially take
exceptions into account, as the JVM's exception handling has many subtle
impacts on the structure and behavior of the bytecode (among many other
pieces of "terror" one may discover lurking within the subtle details of the
JVM / JBC...).