
For WASM, there ought to be a package-management/registry mechanism for installation (unless there already is? It might get complicated, but reusing code/plugins would seem a good idea)... or, as below, there ought to be some caching-priority mechanism.

Then for HTML assets (and CSS ones too), perhaps asset-linking tags (a, script, link, img, audio, video, etc.) ought to have an offline-priority attribute to help the browser decide what to throw away when clearing the cache the regular way or evicting items from it, while leaving things deemed vital in place when not nuking the entire cache. Yes, websites could be goofy and game the mechanism, marking everything "vital" like zero-pixel image cookies, but I'm sure someone would make an "RBL" (real-time blackhole list) system of which sites' priorities to ignore.
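A sketch of what that hint might look like, keeping in mind no such attribute exists today; the attribute name and values here are invented purely for illustration:

```html
<!-- Hypothetical "offline-priority" hint: tells the browser which
     cached assets to evict last vs. first. Not part of any standard. -->
<link rel="stylesheet" href="/app.css" offline-priority="vital">
<script src="/framework.min.js" offline-priority="vital"></script>
<img src="/banner.jpg" offline-priority="disposable">
```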

Related aside: there are a lot of common frameworks, libraries, and snippets that could be cached user-side. The trick is either to a) herd web devs into de-fragmenting their CDNs, which could create SPOFs, or b) change the standard to allow multiple SRCs or HREFs for high availability/less bitrot, preserving choice while encouraging de-duplication of common assets. [0]

0. https://html.spec.whatwg.org/multipage/links.html#attr-hyper...
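The multiple-source idea (option b) might look something like this; the space-separated fallback syntax is invented for illustration and is not valid HTML today (srcset solves a different problem, resolution selection, not failover):

```html
<!-- Hypothetical fallback list: the browser tries each URL in order,
     so one dead CDN doesn't break the page, and identical assets
     fetched from different hosts could be de-duplicated in cache. -->
<script src="https://cdn-a.example/lib.min.js
             https://cdn-b.example/lib.min.js
             /local/lib.min.js"></script>
```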



Here's an idea: what if you could put a `hash` attribute on a script tag? After downloading whatever it links to, the browser checks that the hash matches the one you provided. Then it could also cache the result and reuse it whenever the hash matches, even if the link pointed elsewhere.
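The verification half of this already ships as Subresource Integrity (the `integrity` attribute): the browser hashes the downloaded file and refuses to execute it on mismatch. The second half, content-addressed caching keyed on the hash regardless of URL, is the new part being proposed here and is not current browser behavior:

```html
<!-- Subresource Integrity as it exists today. The hash value below is
     a placeholder, not a real digest. Cross-origin cache reuse keyed
     on this hash is the proposed extension, not shipped behavior. -->
<script src="https://cdn.example.com/lib.min.js"
        integrity="sha384-PLACEHOLDER"
        crossorigin="anonymous"></script>
```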



