
make rnn.fsx example device agnostic. #428


Status: Open · wants to merge 1 commit into base: dev
Conversation

nhirschey (Contributor)

Some minor changes that allow rnn.fsx to run when dsharp.config is changed to a GPU device.

Thanks for your incredible work this past month incorporating my change requests into the dev branch! I'm looking forward to using the new version once it's released on the NuGet feed.

@@ -50,13 +50,14 @@ let modelFileName = "rnn_language_model.params"
  if File.Exists(modelFileName) then
      printfn "Resuming training from existing model params found: %A" modelFileName
      languageModel.state <- dsharp.load(modelFileName)
+     languageModel.move(Device.Default)
dsyme (Collaborator) · Oct 11, 2022

Shouldn't the load happen on the default device in any case? Is this a problem with dsharp.load?

nhirschey (Contributor, Author)

Yes, I think having dsharp.load load onto the default device makes sense, and that appears to be the goal of PR #430.

PyTorch behaves a little differently: torch.load restores tensors to whatever device they were saved from: "They are first deserialized on the CPU and are then moved to the device they were saved from" (link).
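For context, the load-then-move pattern this PR adds can be sketched as below. This is a minimal sketch, not runnable standalone: it assumes the DiffSharp library and the RNN model definition from rnn.fsx (the `languageModel` value and the `open` lines beyond `System.IO`/`DiffSharp` are assumptions standing in for that surrounding script):

```fsharp
open System.IO
open DiffSharp

// Configure the default backend/device once at the top of the script.
// Switching Device.CPU to a GPU device is the scenario this PR makes work.
dsharp.config(backend = Backend.Torch, device = Device.CPU)

let modelFileName = "rnn_language_model.params"

// ... languageModel is constructed earlier in rnn.fsx ...

if File.Exists(modelFileName) then
    printfn "Resuming training from existing model params found: %A" modelFileName
    // The loaded parameters may not land on the current default device,
    // so after restoring the state, explicitly move the model there.
    languageModel.state <- dsharp.load(modelFileName)
    languageModel.move(Device.Default)
```

The explicit `move` makes the script device-agnostic: whatever device `dsharp.config` selects, the restored parameters end up on it.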
