
[Bug] Inconsistent definition of the inputs parameter of operators #93

Closed
soodoshll opened this issue Feb 12, 2023 · 1 comment

@soodoshll
Collaborator

Currently, operators have different constructor signatures according to different numbers of input tensors. For example:

  • UnaryOp(Tensor)
  • BinaryOp(Tensor, Tensor)
  • ReduceOp(List[Tensor])

This causes a problem in

return cls(*inputs, **attributes).run()

No matter which signature the constructor uses, the input tensors are flattened and stored in the inputs attribute of the Operator (with type List[Tensor]). When we need to reuse the saved inputs, it is hard to handle each signature separately. Maybe it would be better to use a unified signature, like Op(List[Tensor])?
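A minimal sketch of the problem (the classes here are illustrative stand-ins, not Hidet's actual API): splatting the flattened inputs list back into the constructor works for operators that take tensors positionally, but breaks for one that expects a single List[Tensor] argument.

```python
class Tensor:
    pass

class UnaryOp:
    def __init__(self, x: Tensor):
        self.inputs = [x]            # flattened storage: List[Tensor]

class ReduceOp:
    def __init__(self, xs: list):    # expects one List[Tensor] argument
        self.inputs = list(xs)

def rerun(cls, inputs, attributes):
    # The pattern from the issue: splat the flattened inputs back in.
    return cls(*inputs, **attributes)

a = Tensor()
op = rerun(UnaryOp, [a], {})         # fine: one tensor -> one positional arg
try:
    rerun(ReduceOp, [a], {})         # ReduceOp receives a Tensor, not a List[Tensor]
except TypeError:
    print("ReduceOp cannot be rebuilt from flattened inputs")
```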

@yaoyaoding
Member

We require each operator to have the following constructor signature:

class AwesomeOperator:
    def __init__(self, a: Tensor, b: Tensor, ..., d: Tensor, param1, param2, ..., paramk):
        ...

And the inputs are [a, b, ..., d].
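A hedged sketch of this convention (the class and helper below are illustrative, not Hidet's real code): tensor arguments come first and are collected into inputs in order, non-tensor parameters go into attributes, so reconstruction is uniform across all operators.

```python
class Tensor:
    pass

class AwesomeOperator:
    # Tensors first, then parameters, per the convention above.
    def __init__(self, a: Tensor, b: Tensor, axis: int = 0):
        self.inputs = [a, b]              # the tensor args, in order
        self.attributes = {'axis': axis}  # the non-tensor params

def rebuild(op):
    # With the unified signature, this works for every operator.
    return type(op)(*op.inputs, **op.attributes)

x, y = Tensor(), Tensor()
op = AwesomeOperator(x, y, axis=1)
clone = rebuild(op)
assert clone.inputs == [x, y] and clone.attributes == {'axis': 1}
```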

I have fixed the inconsistency for the minimum/maximum operators, and now all operators should satisfy the above design.

yaoyaoding pushed a commit that referenced this issue Apr 3, 2024
Add some necessary module components used frequently in Stable
Diffusion's UNet.

Includes fixes to module attribute access from LLM branch and work
arounds for torch weight copying.

Towards #57.