tensorflow:Optimization terminated with: Message: CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH #1855
kanak-buet19 started this conversation in General
-
There seems to be a problem with L-BFGS and TF; take a look at #1819.
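In the meantime, one thing to try (an untested sketch, not a confirmed fix) is tightening the stopping tolerances that `set_LBFGS_options` exposes, so the relative-reduction test doesn't fire after a handful of iterations:

```python
import deepxde as dde

# Untested sketch: push the convergence tolerances toward zero so
# L-BFGS keeps iterating up to maxiter instead of stopping early on
# the REL_REDUCTION_OF_F test.
dde.optimizers.config.set_LBFGS_options(
    maxiter=3000,
    ftol=1e-15,  # relative-reduction threshold
    gtol=1e-15,  # projected-gradient threshold
)
model.compile("L-BFGS")
losshistory, train_state = model.train()
```

Switching DeepXDE to the pytorch backend may also sidestep this, since it uses a different L-BFGS implementation.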
-
I was trying to solve the Navier-Stokes equations using the deepxde library (following one of your NS examples). I used Adam (10k epochs) and then L-BFGS as optimizers. While Adam ran as expected, L-BFGS threw this message:
```
Compiling model...
'compile' took 0.623285 s

Training model...

Step    Train loss    Test loss    Test metric
10000   [3.30e-03, 4.83e-04, 2.34e-03, 1.48e-02, 5.65e-03, 1.50e-02, 5.13e-03, 1.01e-03, 1.56e-05]    [2.73e-03, 4.61e-04, 1.11e-03, 1.48e-02, 5.65e-03, 1.50e-02, 5.13e-03, 1.01e-03, 1.56e-05]    []

INFO:tensorflow:Optimization terminated with:
  Message: CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH
  Objective function value: 0.047715
  Number of iterations: 1
  Number of functions evaluations: 31
10017   [3.30e-03, 4.83e-04, 2.34e-03, 1.48e-02, 5.65e-03, 1.50e-02, 5.13e-03, 1.01e-03, 1.56e-05]
```
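If I'm reading scipy's L-BFGS-B documentation correctly, this message means the run stopped because the relative reduction of the loss between successive iterates fell below the `ftol` threshold (`factr * eps`, where `eps` is machine precision):

```
(f^k - f^{k+1}) / max{|f^k|, |f^{k+1}|, 1} <= factr * eps
```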
My L-BFGS code was:

```python
dde.optimizers.config.set_LBFGS_options(maxiter=3000)
model.compile("L-BFGS")
losshistory, train_state = model.train()
dde.saveplot(losshistory, train_state, issave=False, isplot=False)
```
As you can see, I set maxiter to 3000, but L-BFGS ran for only 17 steps (stopping at step 10017). I want L-BFGS to run for the full 3000 iterations. Is there any way to do that?
I tried looking for a solution in previous discussions, but no luck.
Here is the full code to reproduce the problem:
```python
import deepxde as dde
import numpy as np
import matplotlib.pyplot as plt

# Fluid properties and channel dimensions
rho = 1
mu = 1
u_in = 1
D = 1
L = 2

geom = dde.geometry.Rectangle(xmin=[-L / 2, -D / 2], xmax=[L / 2, D / 2])

# Boundary indicator functions
def boundary_wall(X, on_boundary):
    on_wall = np.logical_and(
        np.logical_or(
            np.isclose(X[1], -D / 2, rtol=1e-05, atol=1e-08),
            np.isclose(X[1], D / 2, rtol=1e-05, atol=1e-08),
        ),
        on_boundary,
    )
    return on_wall

def boundary_inlet(X, on_boundary):
    return np.logical_and(np.isclose(X[0], -L / 2, rtol=1e-05, atol=1e-08), on_boundary)

def boundary_outlet(X, on_boundary):
    return np.logical_and(np.isclose(X[0], L / 2, rtol=1e-05, atol=1e-08), on_boundary)

# Dirichlet BCs: no-slip walls, uniform inlet velocity, zero pressure at the outlet
bc_wall_u = dde.DirichletBC(geom, lambda X: 0.0, boundary_wall, component=0)
bc_wall_v = dde.DirichletBC(geom, lambda X: 0.0, boundary_wall, component=1)
bc_inlet_u = dde.DirichletBC(geom, lambda X: u_in, boundary_inlet, component=0)
bc_inlet_v = dde.DirichletBC(geom, lambda X: 0.0, boundary_inlet, component=1)
bc_outlet_p = dde.DirichletBC(geom, lambda X: 0.0, boundary_outlet, component=2)
bc_outlet_v = dde.DirichletBC(geom, lambda X: 0.0, boundary_outlet, component=1)

def pde(X, Y):
    # First derivatives of u, v, p
    du_x = dde.grad.jacobian(Y, X, i=0, j=0)
    du_y = dde.grad.jacobian(Y, X, i=0, j=1)
    dv_x = dde.grad.jacobian(Y, X, i=1, j=0)
    dv_y = dde.grad.jacobian(Y, X, i=1, j=1)
    dp_x = dde.grad.jacobian(Y, X, i=2, j=0)
    dp_y = dde.grad.jacobian(Y, X, i=2, j=1)
    # Second derivatives of u and v
    du_xx = dde.grad.hessian(Y, X, component=0, i=0, j=0)
    du_yy = dde.grad.hessian(Y, X, component=0, i=1, j=1)
    dv_xx = dde.grad.hessian(Y, X, component=1, i=0, j=0)
    dv_yy = dde.grad.hessian(Y, X, component=1, i=1, j=1)
    u, v = Y[:, 0:1], Y[:, 1:2]
    # Steady incompressible Navier-Stokes residuals:
    # x-momentum, y-momentum, continuity
    momentum_x = rho * (u * du_x + v * du_y) + dp_x - mu * (du_xx + du_yy)
    momentum_y = rho * (u * dv_x + v * dv_y) + dp_y - mu * (dv_xx + dv_yy)
    continuity = du_x + dv_y
    return [momentum_x, momentum_y, continuity]

data = dde.data.PDE(
    geom,
    pde,
    [bc_wall_u, bc_wall_v, bc_inlet_u, bc_inlet_v, bc_outlet_p, bc_outlet_v],
    num_domain=2000,
    num_boundary=200,
    num_test=200,
)

# Visualize the training points
plt.figure(figsize=(10, 8))
plt.scatter(data.train_x_all[:, 0], data.train_x_all[:, 1], s=0.5)
plt.xlabel("x")
plt.ylabel("y")
plt.show()

net = dde.maps.FNN([2] + [64] * 5 + [3], "tanh", "Glorot uniform")
model = dde.Model(data, net)

# Train with Adam first, then refine with L-BFGS
model.compile("adam", lr=1e-3)
losshistory, train_state = model.train(epochs=10000)

dde.optimizers.config.set_LBFGS_options(maxiter=3000)
model.compile("L-BFGS")
losshistory, train_state = model.train()
dde.saveplot(losshistory, train_state, issave=False, isplot=False)

# Predict u, v, p on random points and plot each field
samples = geom.random_points(500000)
result = model.predict(samples)
color_legend = [[0, 1.5], [-0.3, 0.3], [0, 35]]
for idx in range(3):
    plt.figure(figsize=(20, 4))
    plt.scatter(samples[:, 0], samples[:, 1], c=result[:, idx], cmap="jet", s=2)
    plt.colorbar()
    plt.clim(color_legend[idx])
    plt.xlim((-L / 2, L / 2))
    plt.ylim((-D / 2, D / 2))
    plt.tight_layout()
    plt.show()
```