How to publish runtime-specific code? #233
Just my thoughts... First point, types aren't runtime code; they reflect how runtime code behaves. If your implementation actually detects the shape of things at runtime, you are always supporting that code, even when running under Deno or Node, therefore your types should be a superset of that. Actually swapping out runtime code is a different story. In both oak and acorn I have to swap out runtime code to run under Node.js, Deno, and Bun, and have created abstractions that are dynamically imported at runtime. There are also certain polyfills that need to be loaded. In the cases where I depend on runtime-specific types, I have to reimplement those types in the code. You can see that pretty clearly in Bun on acorn. Things like dnt used to make that easier, and it is a bit of a pain at the moment to swap out code just via dynamic imports. Things like #179 are also friction in the current process. |
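The feature detection this comment alludes to might look roughly like the following. This is a minimal sketch, not oak's actual code; the probed globals are illustrative, and the point is to check for the API itself rather than sniff the runtime's name.

```typescript
// Sketch: branch on the presence of a capability, not on the runtime's
// identity. All identifiers here are illustrative.
function hasGlobal(name: string): boolean {
  return name in globalThis;
}

// e.g. prefer a web-standard API when present, fall back otherwise
const canUseWebCrypto =
  hasGlobal("crypto") &&
  typeof (globalThis as any).crypto?.subtle?.digest === "function";
```

The advantage of this style is that it keeps working when a runtime gains a capability it didn't have before, which name-based detection cannot do.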
It doesn't. If you take a look at grammY (linked in original post), it ships a different implementation of |
grammY uses deno2node to achieve this, building the Node package by swapping out
|
With oak prior to JSR, I was using dnt which does effectively what deno2node does, and there were two things that dnt did for oak that I had to rework to get it to move to JSR:
While it isn't super low friction at the moment, I was able to accomplish both, and ended up dropping dnt for now. The first was accomplished by simple feature detection, and the second via dynamic imports. While the DX isn't as straightforward and easy as it could be, it is possible. |
There is no way we can detect at runtime whether the passed argument is of type |
So your recommendation is to ship all code for all platforms, rely on platform detection to switch implementations, and have types be a superset of all platforms, is that it?
For the sake of discussion, what would you imagine the easier DX would look like? Perhaps that'd make a case for something JSR could implement. |
Since I want to export the function, this would cause slow types, right? |
I will make the obvious suggestion that nobody (including me) will like.

```jsonc
// jsr.json
{
  "name": "@luca/greet",
  "version": "1.0.0",
  "exports": {
    "deno": "./mod.ts",
    "node": "./mod.node.ts",
    "bun": "./mod.bun.ts"
  }
}
```
|
I don't like it because it restricts the package to having one entry-point. I have packages that would like to expose |
Yeah I'm aware that this exact syntax would clash with entrypoints, it isn't really an option. Also, the promise of JSR is to remove the complexity around targeting multiple runtimes. Adding more complexity to the configuration torpedoes this a bit. |
At the moment, yes... as that is what works with oak. Yes, code that never gets loaded gets sent as part of the package when installing via the npm ecosystem, but it won't get loaded unless the runtime needs it.
My thoughts on how to improve the multi-platform DX in general are already in #179. I love the simplicity dnt offered in making those two things, polyfilling and module substitution, easy.
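For the polyfilling side, a hedged sketch of what that amounts to at runtime (the inline shim below is a deliberately lossy stand-in for illustration, not a real polyfill package):

```typescript
// Sketch: install a shim only when the global is missing, so runtimes
// that already provide it keep their native implementation.
if (typeof (globalThis as any).structuredClone !== "function") {
  (globalThis as any).structuredClone = (value: unknown) =>
    JSON.parse(JSON.stringify(value)); // lossy stand-in, illustration only
}
```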
I haven't experimented with exporting dynamic imports. Having different API surfaces always feels like a problem. For example, while it might seem a bit counterintuitive, having an implementation that supported both |
I agree. I am currently thinking that this would probably be the thing that I have to change about the library if I wanted to support JSR. We have already unified the API surface significantly in the past, and we might have to go another step now. Perhaps the JSR take on the cross-runtime topic needs to be “one package, one codebase, many runtimes” but this still feels like a time bomb. I might be able to unify the builds for my packages, but sooner or later there will be people building stuff that can't do this. I hope I'm wrong, though. |
I know this is the current entrypoints feature, but something similar could be made for runtimes. JSR could understand where the package "starts" for each runtime and, in the same way it transpiles TS for Node, provide the correct main file for each runtime.

```jsonc
// jsr.json
{
  "runtimes": {
    "deno": "main.ts",
    "bun": "main.bun.ts",
    "node": "main.node.ts"
  }
}
```

Or, alternatively:

```jsonc
// jsr.json
{
  "runtimes": {
    "deno": {
      "/": "main.ts",
      "/types": "types.ts"
    },
    "node": {
      "/": "main.node.ts",
      "/types": "types.node.ts"
    }
    // ...
  }
}
```

That would allow support for multiple entrypoints on multiple runtimes while keeping imports identical between runtimes, meaning end users would also benefit from this. |
And then deliver different builds based on the user agent header? |
Precisely. |
You're all getting awfully close to conditional exports here. I don't want to introduce those to JSR - conditional exports are too complex for users, and are a worse way of doing conditional imports than just doing it in code. My take on this:
|
Disappointing. I imagine JSR already delivers different bundles to Deno and npm-like clients; a little more control over that process might have supported this use case much more nicely. |
I will write a blog post on this one day, but "conditional exports" (or more recently "conditional imports") are, and always will be, a bad idea. You should express conditions in code. Conditions in metadata files essentially bring us back to the days of user agent sniffing in browsers, where instead of checking if |
In this very specific case, you could consider using web/JS standards only; though it would be a breaking change, I don't think it would be a big inconvenience in the API. e.g. if you change your signature to

```typescript
| string
| Blob
| Response
| URL
| URLLike
| Uint8Array
| ReadableStream<Uint8Array>
| Iterable<Uint8Array>
| AsyncIterable<Uint8Array>
```

it's already going to be broadly compatible with almost anything. It wouldn't accept

Edit: Actually it occurs to me that you can also accept the types |
There is no need to specify And if |
```typescript
if ("Deno" in globalThis) {
  // @ts-ignore Deno-only
  const file = await Deno.open(path);
  return file.readable;
} else {
  const { createReadStream } = await import("node:fs");
  return createReadStream(path);
}
```
|
Actually, JSR delivers npm-like packages over npm.jsr.io, does it not? So there's no need for UA-sniffing; we'd just need a little control over what files are delivered for the different endpoints. I disagree that you should ship all code for all runtimes and have a dynamic switch at runtime. If you can know ahead of time that you will be shipping dead code, you can always choose to avoid this.
We don't want to start off shipping code designed for bundlers and crippled for native web imports. So essentially what you're saying by that is, "We completely discourage importing from JSR in the browser. Always have a build step." |
I think the point is not everything that consumes packages over npm.jsr.io is the same runtime, and the analogy of browser UA sniffing is that it is folly to make assumptions about the runtime's capabilities, because any assumptions made today about what a runtime can do will invariably be invalid in the future.
Again, the point is that dynamic imports work effectively, but if shipping all the possible code to the runtime is a concern, bundlers are smart enough to optimise that code for you. Importing 100+ modules into a browser is inefficient; even with the promise of H2, the reality is that you build for browsers not because of their capabilities but out of the need to be efficient over the wire, and that is unlikely to change. There are those who still want to tilt at those windmills, and yeah, loading directly from a package repository should still be supported. |
Yes, it is already the way a lot of packages do it. |
Reading this discussion, it seems there is a general question of whether JSR packages should support platform-specific code in the first place. If yes, there should be some way to control it. If not, stuff like

Currently, being able to (or

In order to be platform-independent, code that uses only ES specifications (like

But this problem is not even a problem for backend runtimes, as issues like #164 suggest… |
I agree that inline conditions and dynamic imports can solve many simple cases, but there are other cases where you'd want to switch one implementation completely for another based on runtime or some other environment consideration, and in such situations you don't want the baggage or risk of every possible implementation being dragged in.

Import maps provide a neat way to shift this problem entirely out of the code path. In an application you can abstract away your runtime/environment-specific variants to a set of modules, and utilise the import map to remap to one of these variants. Then at runtime the unused variants are completely out of the picture.

JSR could potentially support publishing/building of package variants based on variant import maps, where this kind of isolation is more desirable than the inline condition or dynamic import. There unfortunately is no official spec for import map composition, but I think the simplest case of a common import map and a single variant override could suffice. The issue then would be how to distinguish between the variants during module resolution; I'm not going to suggest a solution to that here, but I don't think it's insurmountable.

I think without this kind of first-class support for variants, we'll find many packages working around it in various differing manners, maybe publishing separate packages: It would be great if JSR nailed this from the outset rather than compromising as an afterthought or leaving it up to the ecosystem to diverge on their own solutions. |
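To make the variant idea concrete, a minimal sketch of a common import map plus a per-runtime override (the specifier, file names, and the override mechanism are purely hypothetical; no such JSR feature exists today):

```jsonc
// Common import map: code always imports the abstract specifier.
{
  "imports": {
    "pkg/fs": "./fs/fs.deno.ts"
  }
}
```

```jsonc
// Hypothetical Node variant override, applied at publish/build time,
// remapping the same specifier to the Node implementation.
{
  "imports": {
    "pkg/fs": "./fs/fs.node.ts"
  }
}
```

With this arrangement, source code contains a single `import ... from "pkg/fs"` and never references a runtime-specific module directly, so unused variants are fully outside the module graph.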
I think this has been answered. TL;DR: Publish a single code base, branch at runtime. Do not detect the platform, just check for the existence of the respective functions. Thanks! Closing. |
I need to resurrect this as I'm struggling to get conditional exports to work. My case is a library to add

Only Bun and Cloudflare Workers have

The naive approach as mentioned above is to check for the existence of

Unfortunately, when doing so, the shim code is still run by

I got it to work across runtimes with conditional exports defined in |
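For reference, the npm-side conditional exports that commonly cover a "no-op on runtimes that already have the feature" case look roughly like this. File names are hypothetical; `bun` and `workerd` (Cloudflare Workers) are real export conditions recognised by those runtimes' resolvers, with `default` as the fallback:

```jsonc
// package.json (sketch)
{
  "name": "example-shim",
  "exports": {
    ".": {
      "workerd": "./dist/noop.js",
      "bun": "./dist/noop.js",
      "node": "./dist/shim.node.js",
      "default": "./dist/shim.js"
    }
  }
}
```

Because the condition is resolved at module-resolution time, the shim module is never even loaded on runtimes that match an earlier condition, which is exactly what runtime `if`-checks cannot guarantee.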
Context
I currently build my library in two different ways, once for Deno and once for Node. Since my library is able to work with files from various sources, the Node build should accept the Node-specific `Buffer` objects. It also still supports `Deno.Reader` (although this might be dropped for Deno 2).

For Deno, it makes no sense for me to support the old Node-specific things from the Node compat layer (I only want to support web standards and/or Deno-specific things). For Node, I cannot support the Deno-specific things.
In the end, I have a function that is something like this:
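The actual function is not shown in the issue text. Purely as a hypothetical illustration of the shape being described, with all names invented for this sketch, the Node build's accepted inputs might look something like:

```typescript
// Hypothetical illustration only, not the author's actual code.
// A Node build's input union might mix web-standard and Node-specific types;
// the Deno build would instead allow e.g. Deno.Reader.
type NodeSource = string | URL | Uint8Array | NodeJS.ReadableStream;

function classifySource(source: string | URL | Uint8Array): string {
  if (typeof source === "string") return "string";
  if (source instanceof URL) return "URL";
  return "bytes"; // Buffer lands here too, since Buffer extends Uint8Array
}
```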
Question
I need a way to export two different type signatures, depending on the platform from which my library is consumed.
Is JSR for me? If yes, how can this be done?