root/OOPy/openopt/examples/oofun/introduction.py

Revision 1, 2.8 kB (checked in by dmitrey, 8 months ago)

init commit

1 """
oofun (OpenOpt Function) is an alternative to the classic syntax for numerical optimization problems (like nlp_1.py).
Requires openopt v > 0.18 (latest tarball or svn is ok).

The classic syntax was shaped by the Fortran and C languages,
while Python, with its object-oriented features (and lots of other benefits),
allows a more powerful and convenient style.

oofun provides more powerful capabilities for writing optimization programs:
 - Clean separate handling of deeply nested recursive funcs: F(G(H(...(Z(x)))))
        The situation is rather common in engineering problems: for example,
        Z(x) is the mass of a spoke,
        H(Z, other data) is the mass of a wheel,
        G(H, other data) is the mass of a bicycle.
        You can provide derivatives for some of the funcs F, G, H, ...,
        the others will be approximated via finite differences,
        and derivatives of the whole superposition are formed automatically
        (see the first sketch after this list).
 - Preventing recalculation of identical code blocks
        For example, H() can be used in some non-linear constraints, in objFun, or in their derivatives.
 - Reducing func calls by declaring what the block inputs are
        For example, if you have H(z) = a*z**2 + b*z, only 2 calls are needed,
        H(z) and H(z+dz), no matter how many optimization variables the problem involves.
        Normally H(x+dx_i) has to be called nVars times, because H = H(Z) = H(Z(x)),
        so even providing a sparsity pattern isn't helpful here (see the second sketch after this list).
 - etc.

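Here is a minimal, self-contained sketch of the first point in plain Python/numpy. It is not
the actual oofun API: the Block class, the fd_scalar helper and the toy formulas are
illustrative assumptions. It only shows how analytic and finite-difference block derivatives
can be mixed and combined by the chain rule.

    import numpy as np

    def fd_scalar(f, t, h=1e-7):
        # central finite-difference derivative of a scalar -> scalar block
        return (f(t + h) - f(t - h)) / (2.0 * h)

    class Block:
        # one block of the superposition: a scalar -> scalar func
        # with an optional user-supplied analytic derivative
        def __init__(self, f, d=None):
            self.f, self.d = f, d
        def deriv(self, t):
            return self.d(t) if self.d is not None else fd_scalar(self.f, t)

    # Z: spoke mass, H: wheel mass, G: bicycle mass (toy formulas)
    Z = Block(lambda x: 0.1 * x**2, d=lambda x: 0.2 * x)     # analytic derivative supplied
    H = Block(lambda z: 36.0 * z + 0.5)                      # finite differences will be used
    G = Block(lambda h: 2.0 * h + 7.0, d=lambda h: 2.0)      # analytic derivative supplied

    def value_and_derivative(blocks, x):
        # evaluate G(H(Z(x))) and d/dx via the chain rule, innermost block first
        val, der = x, 1.0
        for b in blocks:
            der = b.deriv(val) * der
            val = b.f(val)
        return val, der

    print(value_and_derivative([Z, H, G], 3.0))   # ~ (72.8, 43.2)
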
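The call-count claim from the third item can be checked with another toy sketch (again plain
Python, not oofun code; the coefficients and the call counter are made up for illustration):
differencing H in its own scalar input z needs 2 calls, while differencing H(Z(x)) through the
full x-space needs nVars + 1 calls.

    import numpy as np

    ncalls = {'H': 0}
    a, b = 2.0, -1.0

    def H(z):
        ncalls['H'] += 1
        return a * z**2 + b * z

    def Z(x):
        return x.sum()                  # some intermediate scalar z = Z(x)

    x = np.ones(1000)                   # nVars = 1000
    step = 1e-7

    # derivative of H w.r.t. its declared input z: 2 calls, independent of nVars
    z = Z(x)
    dH_dz = (H(z + step) - H(z)) / step
    print(ncalls['H'])                  # 2

    # derivative of H(Z(x)) w.r.t. each component of x: nVars + 1 calls
    ncalls['H'] = 0
    base = H(Z(x))
    grad = np.array([(H(Z(x + step * e)) - base) / step for e in np.eye(x.size)])
    print(ncalls['H'])                  # 1001
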
I call this style "all-included", because all available info is stored in a single place.
Fields to be added to oofun in the future: convex (true/false), unimodal (true/false), d2 (2nd derivatives), etc.
Changing something in a function (or turning it on/off in a prob instance) doesn't require
immediately fixing lots of other files that calculate derivatives, supply dependency patterns, etc.;
you can just turn it off temporarily and openopt will fall back to finite-difference derivatives
until you provide the updated info.
And there is no ugly manual construction of derivatives like
for <...>
    r[last_ind1:last_ind1+4, last_ind2:last_ind2+5] = <...>
they are all gathered automatically (see the sketch below).
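
For illustration only (the block shapes here are invented and this is not openopt internals),
the contrast is between hand-maintained index bookkeeping and simply gathering the per-block
derivatives and stacking them:

    import numpy as np

    # per-block Jacobians, e.g. of two constraint blocks w.r.t. 5 variables
    block_derivs = [np.ones((4, 5)), 2.0 * np.ones((3, 5))]

    # manual style: preallocate and place each block by hand-kept indices
    r = np.zeros((7, 5))
    last_ind = 0
    for J in block_derivs:
        r[last_ind:last_ind + J.shape[0], :] = J
        last_ind += J.shape[0]

    # automatic style: just gather and stack
    r_auto = np.vstack(block_derivs)
    assert np.array_equal(r, r_auto)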

Note also that oofun has a mechanism for preventing the assigned funcs from being recalculated
twice (i.e. for the same x as a previous call); the idea is sketched below.
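
This is only a generic memoization sketch of that idea, not the actual oofun mechanism
(cache_last and the toy H are assumptions for illustration): the last (x, result) pair is
remembered, so a repeated call with the same x, e.g. from objFun and from a constraint,
reuses the stored value.

    import numpy as np

    def cache_last(f):
        last = {'x': None, 'val': None}
        def wrapped(x):
            x = np.asarray(x, dtype=float)
            if last['x'] is None or not np.array_equal(x, last['x']):
                last['x'], last['val'] = x.copy(), f(x)
            return last['val']
        return wrapped

    nevals = {'H': 0}

    @cache_last
    def H(x):
        nevals['H'] += 1
        return (x**2).sum()

    x0 = np.array([1.0, 2.0, 3.0])
    H(x0); H(x0)                # the second call is served from the cache
    print(nevals['H'])          # 1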

I intend to continue developing the oofun class; there are still many other ideas to be implemented
(named outputs, fixed variables (not present in the OO classic style yet either),
inner input-output dependency patterns (currently only a 1-dim Python list),
oovar (OpenOpt Variable), etc.).
Some of these ideas are being researched by members of our optimization department
using Visual C++ and Rational Rose.

Also, some ideas similar to my intentions for openopt oofun can be found at
http://control.ee.ethz.ch/~joloef/yalmip.php
YALMIP is a free MATLAB toolbox for modeling optimization problems.

"""