Trying to add an operator myself #155
Comments
I copied the Copy operator and edited the Mod.cpp file in the dialect as follows (excerpt; my paste is truncated here):

```cpp
//===----------------------------------------------------------------------===//

#include "tpu_mlir/Support/Module.h"

LogicalResult top::ModOp::init(InferenceParameter &p) { return success(); }

void top::ModOp::deinit(InferenceParameter &p) {}

LogicalResult top::ModOp::inference(InferenceParameter &p) {
  auto shape = module::getI64Array(this->getShape());
  for (int n = 0; n < shape_4[0]; n++) {

void top::ModOp::shape_inference() {
```

But I get an error when compiling.
Maybe you forgot to add the op definition in the ODS file: include/tpu_mlir/Dialect/Tpu/IR/TpuOps.td?
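For reference, a top-level op in MLIR is declared in TableGen (ODS) before its C++ methods can be compiled. A minimal sketch of what such an entry might look like is below; the base class name, traits, and argument list are illustrative, modeled on how other binary ops in the dialect are typically declared, so check the existing entries in the .td file for the exact pattern:

```tablegen
// Hypothetical sketch, not the actual tpu-mlir definition.
// Follow the pattern of neighboring ops (e.g. the Div op) in the .td file.
def Top_ModOp : Top_Op<"mod"> {
  let summary = "mod operator";
  let description = [{
    Element-wise floating-point remainder of two input tensors.
  }];
  let arguments = (ins
    Variadic<AnyTensor>:$inputs
  );
  let results = (outs AnyTensor:$output);
}
```

Without this declaration, the generated header never contains a `top::ModOp` class, which is why the C++ definitions in Mod.cpp fail to compile.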
I am trying to add a mod operator myself. In the dialect, the core code of the Div operator looks like this:
```cpp
LogicalResult top::DivOp::init(InferenceParameter &p) {
  auto binary = new Binary();
  int index0 = 0, index1 = 1;
  if (getIsReverse()) {
    index0 = 1, index1 = 0;
  }
  auto lhs_shape = module::getShape(getInputs()[index0]);
  auto rhs_shape = module::getShape(getInputs()[index1]);
  (*binary)
      .hs(p.inputs[index0], p.inputs[index1], lhs_shape, rhs_shape)
      .dst(p.outputs[0], module::getShape(getOutput()))
      .do_relu(getDoRelu())
      .relu_limit(getReluLimit().convertToDouble())
      .algorithem(algorithm::binary_div)
      .setup();
  p.handle = (void *)binary;
  return success();
}
```
I could not find the implementation of `algorithm::binary_div`. It does not seem to be a function from the C++ standard library's `<algorithm>` header, yet the file does include that standard header directly. How should I go about replacing this division with the `std::fmod` function?