fix cross entropy axis param #2641

Merged: awni merged 2 commits into main from cross_entropy_axis on Oct 1, 2025
Conversation

@awni (Member) commented Oct 1, 2025:

Minor fix to use axis parameter correctly in losses.cross_entropy_loss.
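For context, the pitfall being fixed is using a hard-coded axis (typically the last) instead of the caller's `axis` argument: both the log-sum-exp reduction and the target-class gather must use the same axis. A minimal NumPy sketch of an axis-aware cross entropy (a hypothetical illustration, not the mlx implementation):

```python
import numpy as np

def cross_entropy(logits, targets, axis=-1):
    """Cross entropy from integer class targets, reducing over `axis`."""
    # Numerically stable log-sum-exp over the class axis.
    m = np.max(logits, axis=axis, keepdims=True)
    lse = np.squeeze(m, axis=axis) + np.log(np.sum(np.exp(logits - m), axis=axis))
    # Gather the logit of the target class along the *same* axis.
    picked = np.take_along_axis(logits, np.expand_dims(targets, axis=axis), axis=axis)
    return lse - np.squeeze(picked, axis=axis)

logits = np.random.randn(4, 3, 5)              # classes on axis 1
targets = np.random.randint(0, 3, size=(4, 5))
loss = cross_entropy(logits, targets, axis=1)  # shape (4, 5)
```

Moving the class axis to the end and calling with `axis=-1` gives the same result, which is exactly the invariant that breaks if any internal reduction ignores the parameter.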

@awni awni requested a review from angeloskath October 1, 2025 20:26
@awni (Member, Author) commented Oct 1, 2025:

Also added a speedup for gradient clipping:

Pre: 3.96 s
Post: 2.57 s

Benchmark:

import time
import mlx.core as mx
import mlx.optimizers as opt

grads = [mx.random.uniform(shape=(2048, 2048)) for _ in range(10)]

def fn(grads):
    for _ in range(10):
        grads, norm = opt.clip_grad_norm(grads, 1.0)
    return grads

# Warmup so compilation/allocation doesn't pollute the timing
for _ in range(100):
    mx.eval(fn(grads))

# Timed run (seconds)
tic = time.time()
for _ in range(100):
    mx.eval(fn(grads))
toc = time.time()
print(toc - tic)

@angeloskath (Member) left a comment:

👍


-clipped_grads = tree_map(clipper, grads)
+normalizer = mx.minimum(max_norm / (total_norm + 1e-6), 1.0)
+clipped_grads = tree_map(lambda g: g * normalizer, grads)
A member commented on the diff:

Why do a multiplication when you can do a multiplication and a useless conditional... 🙈
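The idea behind the speedup, as the diff suggests, is to compute the global norm once and apply a single scale clamped to 1.0 to every gradient, rather than branching per leaf. A minimal NumPy sketch of that pattern (a plain list stands in for `tree_map` over a pytree; this is an illustration, not the mlx code):

```python
import numpy as np

def clip_grad_norm(grads, max_norm):
    """Scale a list of gradient arrays so their global L2 norm is at most max_norm."""
    # Global L2 norm across all gradient arrays.
    total_norm = np.sqrt(sum(float(np.sum(g * g)) for g in grads))
    # Unconditional multiply: the scale is clamped to 1.0, so gradients
    # already under the threshold are left (numerically) unchanged,
    # with no data-dependent branch per element.
    normalizer = min(max_norm / (total_norm + 1e-6), 1.0)
    return [g * normalizer for g in grads], total_norm
```

With `grads = [np.full((4,), 3.0), np.full((4,), 4.0)]` the global norm is 10, so clipping to `max_norm=1.0` rescales everything by roughly 0.1, while a `max_norm` above 10 leaves the gradients untouched.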

@awni awni merged commit e88f2d4 into main Oct 1, 2025
6 checks passed
@awni awni deleted the cross_entropy_axis branch October 1, 2025 23:49
faisalmemon pushed a commit to faisalmemon/mlx that referenced this pull request Oct 30, 2025
* fix cross entropy axis param

* faster grad clipping