
add stricteffects support #549

Merged
merged 3 commits into mratsim:master
Jan 21, 2022
Conversation

@ringabout (Contributor)
@mratsim (Owner) commented Jan 20, 2022

Shouldn't all higher order functions be concerned?

Like

proc map*[T; U](t: Tensor[T], f: T -> U): Tensor[U] {.noInit.} =
  ## Apply a unary function in an element-wise manner on Tensor[T],
  ## returning a new Tensor.
  ##
  ## Usage with Nim's ``future`` module:
  ## .. code:: nim
  ##    a.map(x => x+1) # Map the anonymous function x => x+1
  ##
  ## Usage with named functions:
  ## .. code:: nim
  ##    proc plusone[T](x: T): T =
  ##      x + 1
  ##    a.map(plusone) # Map the function plusone
  ##
  ## Note: for basic operations, you can use implicit broadcasting instead,
  ## with operators prefixed by a dot:
  ## .. code:: nim
  ##    a +. 1
  ##
  ## ``map`` is especially useful to do multiple element-wise operations
  ## on a tensor in a single loop over the data.
  ##
  ## For types that are not mem-copyable (ref, string, etc.) a non-OpenMP
  ## accelerated version of `apply2_inline` is used internally!
  result = newTensorUninit[U](t.shape)
  result.apply2_inline(t, f(y))

proc apply*[T](t: var Tensor[T], f: T -> T) =
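For context on the review comment: under Nim's `--experimental:strictEffects` (Nim 1.6+), a higher-order proc is expected to declare that its effects are inherited from its callback parameter via the `effectsOf` pragma. A minimal sketch of the idea on a plain `seq` (illustrative only, not Arraymancer's Tensor implementation):

```nim
import std/sugar   # provides the `->` type sugar and `=>` lambdas

# Under --experimental:strictEffects, the compiler tracks side effects
# strictly. A higher-order proc declares that its effects come from its
# callback parameter `f` via the `effectsOf` pragma, so callers passing
# an effect-free `f` still see an effect-free `map`.
proc map[T, U](s: seq[T], f: T -> U): seq[U] {.effectsOf: f.} =
  result = newSeq[U](s.len)
  for i, x in s:
    result[i] = f(x)

echo map(@[1, 2, 3], x => x + 1)   # @[2, 3, 4]
```

The same annotation would apply to every higher-order proc (`map`, `apply`, `fold`, etc.), which is the point of the comment above.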

@ringabout (Contributor, author)

Makes sense, thanks!

@mratsim (Owner) commented Jan 21, 2022

@ringabout (Contributor, author)

My mistake, thank you!

@mratsim mratsim merged commit c4807e5 into mratsim:master Jan 21, 2022