'*******************************************************
'* Program to demonstrate the use of multi-dimensional *
'* Steepest Descent Optimization subroutine            *
'* --------------------------------------------------- *
'* Reference: BASIC Scientific Subroutines, Vol. II    *
'* By F.R. Ruckdeschel, BYTE/McGRAW-HILL, 1981 [1].    *
'* --------------------------------------------------- *
'* Example: Find a local maximum of:                   *
'*          F(x,y,z) = sin(x)+2*cos(y)-sin(z)          *
'*                                                     *
'* SAMPLE RUN:                                         *
'*                                                     *
'* How many dimensions: 3                              *
'*                                                     *
'* Convergence criterion: 1e-9                         *
'*                                                     *
'* Maximum number of iterations: 50                    *
'*                                                     *
'* Starting constant: 1                                *
'*                                                     *
'* Input the starting point:                           *
'*                                                     *
'*    X( 1) = 1                                        *
'*    X( 2) = 1                                        *
'*    X( 3) = 1                                        *
'*                                                     *
'* The results are:                                    *
'*                                                     *
'*    X( 1) =  1.5707963                               *
'*    X( 2) = -0.0000435                               *
'*    X( 3) = -1.5707963                               *
'*                                                     *
'* Local maximum found =  4.0000000                    *
'*                                                     *
'* The number of iterations was: 30                    *
'*                                                     *
'*******************************************************
defint i-n
defdbl a-h,o-z

cls
print
input " How many dimensions: ", l
print
dim D(l),X(l),X1(l),Y(3)
input " Convergence criterion: ", e
print
input " Maximum number of iterations: ", m
print
input " Starting constant: ", xk
print
print " Input the starting point:"
print
for i=1 to l
  print "    X(";i;") = "; : input X(i)
next

gosub 2000  'Call steepest descent optimization subroutine

print : print
print " The results are:"
print
for i=1 to l
  print "    X(";i;") = ";
  print using "##.#######"; X(i)
next
print
print " Local maximum found = ";
print using "##.#######"; yy
print
print " The number of iterations was: ";n
print
end

'******************************************************
1000 'Function subroutine: evaluate F and its gradient
yy = SIN(X(1))+2#*COS(X(2))-SIN(X(3))
D(1)=COS(X(1)) : D(2)=-2#*SIN(X(2)) : D(3)=-COS(X(3))
return
'******************************************************

'***************************************************
'* Steepest descent optimization subroutine        *
'* ----------------------------------------------- *
'* This routine finds the local maximum or minimum *
'* of an L-dimensional function using the method   *
'* of steepest descent (the gradient method).      *
'* The function and its L partial derivatives must *
'* be available in subroutine 1000.                *
'* ----------------------------------------------- *
'* INPUTS:                                         *
'*   l    - The dimension of the function to study *
'*   e    - The convergence criterion              *
'*   m    - The maximum number of iterations       *
'*   xk   - A starting constant                    *
'*   X(i) - Initial values of the variables        *
'* OUTPUTS:                                        *
'*   X(i) - The locally optimum set                *
'*   yy   - The local maximum found                *
'*   n    - The number of iterations performed     *
'***************************************************
2000 n=0
'Start initial probe
for j=1 to 3
  'Obtain yy and D(i)
  gosub 1000
  Y(j)=yy
  'Update X(i)
  gosub 3000
next j
'We now have a history to base the subsequent search on
'Accelerate search if approach is monotonic
2050 if Y(2)-Y(1)=0 then goto 2051
if (Y(3)-Y(2))/(Y(2)-Y(1))>0 then xk=xk*1.2
'Decelerate if heading the wrong way
2051 if Y(3)>Y(2) then goto 2100
xk=xk/2#
'Restore the X(i)
for i=1 to l
  X(i)=X1(i)
next
goto 2200
2100 Y(1)=Y(2) : Y(2)=Y(3)
'Obtain new values
2200 gosub 1000 : Y(3)=yy
'Update X(i)
gosub 3000
'Check for maximum iterations and convergence
n=n+1
if n>=m then return
if ABS(Y(3)-Y(2))<e then return
goto 2050
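'******************************************************
'The optimization routine above calls an update routine
'at line 3000 that is missing from this listing. The
'sketch below is an assumption, reconstructed from the
'calls and comments above: save the current point in
'X1(i), then take a step of length xk along the
'normalized gradient D(i). The variable name dd and the
'zero-gradient guard are additions not in the original.
3000 'Update X(i): step along the normalized gradient
'Find the magnitude of the gradient
dd=0
for i=1 to l
  dd=dd+D(i)*D(i)
next
dd=SQR(dd)
'Guard against a zero gradient (stationary point)
if dd=0 then return
'Save the old point, then step in the gradient direction
for i=1 to l
  X1(i)=X(i)
  X(i)=X(i)+xk*D(i)/dd
next
return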