- Refactor (see the API sketch after this list)
  - Layer: `LinearLayer(batch, in_features, out_features, bias=True, dtype=torch.float16)`
    - `forward(input)`
    - ref: https://docs.pytorch.org/docs/stable/generated/torch.nn.Linear.html
  - Function: `matmul(input, other)`
  - Op: `GemmOp(M, N, K, trans_A=False, trans_B=False, dtype=torch.float16, kernel_map, tune=False)`
  - Kernel
- Layer: `LinearLayer(batch, in_features, out_features, bias=True, dtype=torch.float16)`
  - Test (test sketch below)
  - Benchmark (benchmark sketch below)
    - Baselines: `torch.matmul`, `triton`
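A minimal sketch of how the three levels could stack, using only the names and signatures listed above; everything else is an assumption. The kernel dispatch is stubbed out (`kernel_map` is given a `None` default so the signature is valid Python, `tune` is stored but unused), and `GemmOp` falls back to `torch.matmul` as a stand-in for the real kernel.

```python
import torch


class GemmOp:
    """Op level (sketch): C[M, N] = A[M, K] @ B[K, N] in the requested dtype.

    kernel_map / tune are accepted but unused placeholders for the real
    kernel-dispatch and autotuning machinery; torch.matmul stands in here.
    """

    def __init__(self, M, N, K, trans_A=False, trans_B=False,
                 dtype=torch.float16, kernel_map=None, tune=False):
        self.M, self.N, self.K = M, N, K
        self.trans_A, self.trans_B = trans_A, trans_B
        self.dtype = dtype
        self.kernel_map = kernel_map or {}
        self.tune = tune

    def __call__(self, A, B):
        if self.trans_A:
            A = A.transpose(-2, -1)
        if self.trans_B:
            B = B.transpose(-2, -1)
        return torch.matmul(A.to(self.dtype), B.to(self.dtype))


def matmul(input, other):
    """Function level: build a GemmOp from the runtime shapes and run it."""
    M, K = input.shape[-2], input.shape[-1]
    N = other.shape[-1]
    return GemmOp(M, N, K, dtype=input.dtype)(input, other)


class LinearLayer(torch.nn.Module):
    """Layer level: y = x @ W^T + b, mirroring torch.nn.Linear."""

    def __init__(self, batch, in_features, out_features, bias=True,
                 dtype=torch.float16):
        super().__init__()
        self.batch = batch
        w = torch.empty(out_features, in_features)
        torch.nn.init.kaiming_uniform_(w, a=5 ** 0.5)
        self.weight = torch.nn.Parameter(w.to(dtype))
        self.bias = torch.nn.Parameter(
            torch.zeros(out_features, dtype=dtype)) if bias else None

    def forward(self, input):
        out = matmul(input, self.weight.t())
        return out if self.bias is None else out + self.bias
```

Called as `LinearLayer(8, 256, 512)(x)` with `x` of shape `(8, 256)` on a GPU (fp16 GEMM is primarily a GPU path), this should match `torch.nn.functional.linear(x, layer.weight, layer.bias)` up to fp16 rounding; the test sketch below checks exactly that kind of agreement against the reference.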
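For the Test item, one plausible check is to compare the fp16 GEMM path against PyTorch's own linear reference with relaxed tolerances, since fp16 accumulates more rounding error than fp32. This sketch uses plain `torch.matmul` as the unit under test; the refactored `matmul` / `LinearLayer` would be substituted in once they exist.

```python
import torch


def test_linear_matches_torch_reference(batch=8, in_features=256, out_features=512):
    """Compare a matmul-based linear against torch.nn.functional.linear."""
    device = "cuda" if torch.cuda.is_available() else "cpu"
    dtype = torch.float16 if device == "cuda" else torch.float32  # fp16 GEMM wants a GPU

    x = torch.randn(batch, in_features, dtype=dtype, device=device)
    w = torch.randn(out_features, in_features, dtype=dtype, device=device)
    b = torch.randn(out_features, dtype=dtype, device=device)

    out = torch.matmul(x, w.t()) + b           # unit under test (stand-in)
    ref = torch.nn.functional.linear(x, w, b)  # reference implementation

    torch.testing.assert_close(out, ref, rtol=1e-2, atol=1e-2)


test_linear_matches_torch_reference()
```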
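For the Benchmark item, `torch.utils.benchmark.Timer` is enough to get a `torch.matmul` baseline number; the refactored `GemmOp` (and a Triton kernel) would be timed the same way and reported side by side. The problem size and run time below are arbitrary placeholders.

```python
import torch
import torch.utils.benchmark as benchmark

device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32

M, N, K = 1024, 1024, 1024  # placeholder problem size
a = torch.randn(M, K, dtype=dtype, device=device)
b = torch.randn(K, N, dtype=dtype, device=device)

# Baseline: torch.matmul. Swap the stmt for the refactored op / Triton kernel
# to produce comparable numbers.
timer = benchmark.Timer(
    stmt="torch.matmul(a, b)",
    globals={"a": a, "b": b, "torch": torch},
)
print(timer.blocked_autorange(min_run_time=1.0))
```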