However, going the full cycle (type -> value -> type) is not as trivial, because we won't get to ride on TypeScript's existing language server support, and solutions such as shipping our own patched tsserver, etc., are too hacky for my liking.
Generic types as parameters to comptime functions, as in Zig, are also not possible.
Happy to discuss more comptime use cases though. Feel free to raise an issue if you'd like to discuss; we can look into feasibility.
```
const MyComponent = () => jsx!(<div></div>)
```
rather than a .tsx file.
That, or for WASM to be usable enough that I can just write my web apps in Rust.
Case in point: I use Rust/WASM in all of my web apps to great effect, and memory is never a consideration. In Rust you pretty much never think about freeing memory.
On top of that, when objects are moved across to be owned by JS, FinalizationRegistry is able to clean them up pretty much perfectly, so they're GC-ed as normal.
The actual management of memory (allocating, reclaiming, etc.) is all handled automagically for you.
Now, with all the desire for WASM to have DOM access I wonder if we'll end up finding ourselves back in that position again.
If A owns B, then that is as expected, but if A merely references B, then it should hold a WeakRef.
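To make that concrete, a minimal sketch of the pattern (assumption: `freeOnWasmSide` is a made-up stand-in for whatever exported free function the generated bindings actually provide; this is not wasm-bindgen's real generated code):

```
// Free the Rust/WASM-side allocation once the JS wrapper becomes unreachable.
declare function freeOnWasmSide(ptr: number): void; // hypothetical binding export

const registry = new FinalizationRegistry<number>((ptr) => {
  freeOnWasmSide(ptr);
});

class Handle {
  constructor(public readonly ptr: number) {
    // When `this` is GC-ed, the registry callback receives `ptr`.
    registry.register(this, ptr, this);
  }
}

// A parent that merely references another handle holds a WeakRef, so it does
// not keep the child (and its WASM allocation) alive on its own.
class Parent {
  child: WeakRef<Handle>;
  constructor(child: Handle) {
    this.child = new WeakRef(child);
  }
}
```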
https://hn.algolia.com/?type=comment&query=typescript%20soun...
It's kinda exhausting to use TypeScript and run into situations where the type system is more of a suggestion than a rule. Passing around values [1] that have a type annotation but aren't the type they're annotated as is... in many ways worse than not typing them in the first place.
[1]: not even deserialized ones - ones that only moved within the language!
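For illustration (examples mine), two ways a value can end up carrying a type annotation it doesn't satisfy purely within the language:

```
// Unchecked index access: `first` is annotated as string but is undefined at
// runtime (under the default settings, i.e. without noUncheckedIndexedAccess).
const names: string[] = [];
const first: string = names[0];
first.toUpperCase(); // TypeError: Cannot read properties of undefined

// Type assertions assert rather than check:
const user = {} as { id: number };
user.id.toFixed(); // TypeError at runtime, despite the compile-time type
```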
Again, not all websites need to be usable on low-end hardware or have a 1 MB memory footprint, but there are a lot of use cases that would benefit.
Think browser extensions that load on every tab, consume 150 MB+ times the number of tabs open, and share the main thread with the website.
ServiceWorkers that sit as background processes in your OS even when the browser is closed, that sort of thing.
We sort of get around this today using template literals and eval, but it's janky. https://github.com/developit/htm
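Roughly what that tagged-template workaround looks like, sketched with htm's Preact binding (details from memory, so treat the specifics as illustrative):

```
import { html } from 'htm/preact';
import { render } from 'preact';

function App(props: { name: string }) {
  // Tagged template literal instead of JSX, so no compile step is required.
  return html`<div class="app">Hello ${props.name}!</div>`;
}

render(html`<${App} name="World" />`, document.body);
```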
A generic macro system could open the door to a framework like Svelte, Angular, Vue, etc being able to embed their template compilers (with LSP support) without wrapper compilers and IDE extensions.
e.g. imagine syntax like this being possible (not saying it's good)
```
export class MyComponent {
  template = Vue.template!(<div>{{ this.foo }}</div>)

  #[Vue.reactive]
  foo = 'Hello World'

  constructor() {
    setTimeout(() => this.foo = 'Updated', 1000)
  }
}

svelte.init(MyComponent, document.body)
```
Where the `template!` macro instructs the engine how to translate the tokens into their JavaScript syntax and the `#[reactive]` macro converts the class member into a getter/setter that triggers a re-render calculation.
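As a rough sketch of what that `#[reactive]` expansion might produce (purely illustrative; `scheduleRender` is a made-up stand-in for the framework's re-render queue):

```
declare function scheduleRender(component: object): void; // hypothetical framework hook

class MyComponent {
  #foo = 'Hello World';

  get foo() {
    return this.#foo;
  }

  set foo(value: string) {
    this.#foo = value;
    scheduleRender(this); // assignment now triggers the re-render calculation
  }
}
```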
It would need to be adopted by TC39, of course, and the expectation would be that a JavaScript engine could handle the preprocessing if provided at runtime; however, transpilers should be able to pre-compute the outputs so they don't need to be evaluated at runtime.
I, too, eventually gave up on React <> WASM <> Rust, but I was able to port all my existing React over into Leptos in a few hours.
Thunking everything through JavaScript and not being able to take advantage of fearless concurrency severely restrict the use cases. May as well just use TypeScript and React at that point.
We had sweet-js macros as a library many years ago, but it looks like it went nowhere, especially after an incompatible rewrite that (afaik) remains broken for even basic cases. (Caveat: it's been a while since I looked at it.)
> May as well just use TypeScript and React at that point.
The dream is to be able to specify only a WASM file in an HTML script tag, have the tab consume under 1 MB of memory, and maximise the use of client hardware to produce a flawless user experience across all types of hardware.
`import {sum} from './sum.js' with {type: 'comptime'};` is an unfortunate abuse of the `type` import attribute. `type` is the one spec-defined attribute, and it's supposed to correspond to the MIME type of the imported module; thus the two web-platform-supported types are "json" and "css". The MIME type of the imported file in this case is still `application/javascript`, so if this module had a type it would be "js". It would have been better to choose a different import attribute altogether.
> This proposal does not specify behavior for any particular attribute key or value. The JSON modules proposal will specify that type: "json" must be interpreted as a JSON module, and will specify common semantics for doing so. It is expected the type attribute will be leveraged to support additional module types in future TC39 proposals as well as by hosts.
I know it's a cursed idea, but I often find myself wishing TypeScript had a C++-style preprocessor.
To be clear, what you're asking for is basically:
```
const X = comptime(condition ? A : B);
```
and have it compile down to
```
const X = A;
```
without attempting to serialise the functions themselves. Is this correct? The way comptime.ts currently works is that it runs the expression in a constructed block. But perhaps a new primitive, like
```
import { conditional } from "comptime.ts" with { type: "comptime" };
const X = conditional(condition, A, B);
```
Might work, though! I'm also interested in conditional comptime code removal, but I'm not sure about the API design there. I know bundlers already do it, but I'd like for it to be possible in source->source transformations too, for example shipping a version of a library with debug/trace code.
Feel free to open an issue if you'd like to discuss ideas.
I believe both Vite and Bun's bundler would apply the optimization to eliminate constant conditionals when you use comptime.ts as a plugin.
Almost any minifier will automatically do this, for example, and most can be configured so that they only do constant folding/dead code elimination, so the result will be a file that looks like the one you've written, but with these comptime conditions removed/inlined.
Obviously with C preprocessor macros you've got one tool that evaluates the condition and removes the dead code, but with comptime you have more flexibility, and your conditions are all written in JavaScript rather than a mix of JS and preprocessor macros.
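A rough before/after sketch of that pipeline (assuming a `comptime(...)` wrapper like the one in the snippet above, which may or may not match comptime.ts's actual export, plus a minifier with constant folding and dead code elimination enabled):

```
// Source: the condition is evaluated at build time.
import { comptime } from "comptime.ts" with { type: "comptime" };

const DEBUG = comptime(process.env.NODE_ENV !== "production");

export function connect(url: string) {
  if (DEBUG) {
    console.log(`connecting to ${url}`); // trace-only code
  }
  // ...
}

// After the comptime pass in a production build, the emitted source is roughly:
//
//   const DEBUG = false;
//   export function connect(url) {
//     if (DEBUG) { console.log(`connecting to ${url}`); }
//   }
//
// and the minifier's constant folding / dead code elimination then drops the
// whole `if (DEBUG)` branch from the shipped bundle.
```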
no_wizard•6mo ago
I do wonder if this makes importable GETs (via type: json) a reality, like assert was going to [0].
[0]: https://v8.dev/features/import-assertions
throwitaway1123•6mo ago
Yes, the JSON modules proposal is finished.
https://github.com/tc39/proposal-json-modules
https://caniuse.com/mdn-javascript_statements_import_import_...
no_wizard•6mo ago
I'm talking about, in place of a fetch call, simply importing a JSON response from an endpoint, thereby bypassing the need to call fetch; you'd get the response as if it were imported.
It certainly won't replace all GET calls, but I can think of quite a few first-load ones that could simply be import statements once this happens.
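A sketch of the idea, with a made-up endpoint URL (this assumes the server responds with a JSON MIME type, and cross-origin module imports would still be subject to CORS):

```
// Hypothetical: pull first-load data in as a JSON module instead of fetching it.
import config from 'https://api.example.com/bootstrap-config' with { type: 'json' };

console.log(config.featureFlags);

// Roughly the fetch call it would replace:
// const config = await (await fetch('https://api.example.com/bootstrap-config')).json();
```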
porridgeraisin•6mo ago