
What is target in the losses function? #358

Closed
cam1681 opened this issue Aug 19, 2021 · 5 comments

Comments


cam1681 commented Aug 19, 2021

Hello, thanks for your wonderful code, especially the efforts for PyTorch!

What is the variable "target" in the "losses" function? I ran Poisson_Lshape.py and printed "targets" inside the losses function, and it is always None. So what role does the target play in the losses? Also, if I want the residual rather than the loss, i.e., the vector [PDE_residual, boundary_difference], how can I modify the code?

Thanks again!


cam1681 commented Aug 20, 2021

It seems that the answer is buried in data/pde.py, where the losses function is defined: error_f and the error on the boundary give the residual vector and the boundary difference, respectively. But why does the size of the input tensor seem to be num_domain + 2*num_boundary?

def losses(self, targets, outputs, loss, model):
    # `targets` is not used here: the PDE residual f is later compared
    # against zeros, and each BC computes its own error.
    f = []
    if self.pde is not None:
        if get_num_args(self.pde) == 2:
            # can be optimized by not putting boundary points into pde
            f = self.pde(model.net.inputs, outputs)
        elif get_num_args(self.pde) == 3:
            if self.auxiliary_var_fn is None:
                raise ValueError("Auxiliary variable function not defined.")
            f = self.pde(model.net.inputs, outputs, model.net.auxiliary_vars)
        if not isinstance(f, (list, tuple)):
            f = [f]
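
If the goal is the residual vector itself rather than the loss, one option that avoids modifying losses at all is to evaluate the PDE operator on chosen points after training with Model.predict. A minimal sketch, assuming a single-equation PDE and that model, pde, and geom are the trained dde.Model, the residual function passed to dde.data.PDE, and the problem geometry from one's own script (all placeholders here):

    import numpy as np

    # Assumptions (placeholders): `model` is a trained dde.Model, `pde` is the
    # residual function given to dde.data.PDE, and `geom` is the geometry.
    X = geom.random_points(1000)               # sample points inside the domain
    residual = model.predict(X, operator=pde)  # PDE residual evaluated at X
    print(np.mean(np.abs(residual)))           # e.g. mean absolute residual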


lululxvi commented Aug 20, 2021

The points on the boundary are used twice: once for the PDE residual, and once for the boundary loss. See #39
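
A quick way to see this is to print the shape of the assembled training points. A minimal sketch with a throwaway 1D Laplace setup (the equation, numbers, and zero Dirichlet BC are made up purely to inspect the layout):

    import deepxde as dde

    def pde(x, y):
        # Placeholder residual (Laplace equation u'' = 0), only for inspecting shapes
        return dde.grad.hessian(y, x)

    geom = dde.geometry.Interval(0, 1)
    bc = dde.DirichletBC(geom, lambda x: 0, lambda x, on_boundary: on_boundary)
    data = dde.data.PDE(geom, pde, bc, num_domain=16, num_boundary=2)

    # The BC points are stacked in front of the collocation points, which already
    # contain the boundary points, hence num_domain + 2 * num_boundary rows.
    print(data.train_x.shape)  # expected: (16 + 2 * 2, 1) = (20, 1)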


cam1681 commented Aug 20, 2021

The points on the boundary are used twice: once for the PDE residual, and once for the boundary loss.

Thanks! I have another question: are the training points randomly chosen at each iteration when using L-BFGS or Adam? If not, how could I modify the code to randomly resample the training points in the domain?

lululxvi commented

You can use dde.callbacks.PDEResidualResampler to resample the points every few iterations; see https://github.com/lululxvi/deepxde/blob/master/examples/diffusion_1d_resample.py
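
For reference, a minimal usage sketch following the linked example (the period, learning rate, and iteration count are illustrative, and `data`/`net` stand for whatever problem and network one has built):

    import deepxde as dde

    # ... build `data` and `net` as in any DeepXDE example, then:
    model = dde.Model(data, net)
    model.compile("adam", lr=1e-3)

    # Resample the PDE collocation points every 100 iterations.
    resampler = dde.callbacks.PDEResidualResampler(period=100)
    model.train(epochs=10000, callbacks=[resampler])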


cam1681 commented Aug 23, 2021

Thank you very much for your reply, I have got what I wanted.
