weixin_39624360
2020-12-26 21:24

WASM Compile Target?

Hello all! As some of you might be aware, there has been a lot of progress in the web world in developing WebAssembly (https://webassembly.org/). Many environments and languages can now compile to the wasm binary format. Projects are doing things like putting full Python interpreters and environments in the browser (https://github.com/iodide-project/pyodide), etc. There is even a nascent movement to come up with a consistent system interface standard for WebAssembly (WASI, https://hacks.mozilla.org/2019/03/standardizing-wasi-a-webassembly-system-interface/), allowing developers to run wasm code on any platform -- not just the web.

My question is this: what would it take to get the opensmalltalk-vm to compile to WebAssembly, and has anyone involved with it considered the option so far?

There would be a great many advantages to running a Smalltalk in the browser this way.

This question comes from the open-source project OpenSmalltalk/opensmalltalk-vm (https://github.com/OpenSmalltalk/opensmalltalk-vm/issues/387).

6 replies

  • weixin_39801158 2020-12-26 21:24

    $10,000,000 usa

  • weixin_39534121 2020-12-26 21:24

    On 2019-03-28, at 2:06 PM, johnmci wrote:

    $10,000,000 usa

    At least.

    tim

    tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim
    World Ends at Ten! Pictures at 11 on Fox News!

  • weixin_39635648 2020-12-26 21:24

    Yes, I've been following WASM since it attained full browser coverage in 2017. I've built the OpenSmalltalk stack interpreter with emscripten, with no major gotchas (a rough sketch of loading such a build in the browser follows this reply). Of course we want a WASM Cog. We're waiting for WASM to support garbage collection and to provide an API for running your own generated native code; I expect both to arrive in the next two years.

    In the meantime, I'm getting very good results running Squeak in the web browser and on Node with SqueakJS. The simple bytecode-to-JS dynamic translator that Bert wrote for it is a big win. You get a decent translation of bytecodes to native code via the underlying JS engine (V8 et al.).

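    For a sense of what the browser side of such an emscripten build could look like, here is a minimal sketch of loading and calling a wasm-compiled interpreter with the standard WebAssembly JS API. The file name sqvm.wasm, the import object, and the exported interpret entry point are invented for illustration; emscripten normally generates its own JS glue for this step.

    ```typescript
    // Minimal sketch: load a hypothetical wasm build of the interpreter in a
    // browser. Names (sqvm.wasm, host_log, interpret) are placeholders, not
    // part of the actual OpenSmalltalk build.
    async function loadInterpreter(): Promise<void> {
      // Hand-written import object for illustration; a real emscripten build
      // ships generated glue code that supplies these imports itself.
      const imports: WebAssembly.Imports = {
        env: {
          // hypothetical host callback the VM might use for logging
          host_log: (ptr: number, len: number) =>
            console.log(`vm log at ${ptr} (${len} bytes)`),
        },
      };

      const { instance } = await WebAssembly.instantiateStreaming(
        fetch("sqvm.wasm"),
        imports
      );

      // hypothetical exported entry point that runs the interpreter loop
      const interpret = instance.exports.interpret as () => void;
      interpret();
    }

    loadInterpreter().catch(console.error);
    ```
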
  • weixin_39534121 2020-12-26 21:24

    On Fri, 29 Mar 2019 at 05:17, tim Rowledge wrote:

    On 2019-03-28, at 2:06 PM, johnmci wrote:

    $10,000,000 usa

    At least.

    Really? So ten years for 5 x $200k programmers? I'd hope we could do better than that ;)

    On Fri, 29 Mar 2019 at 08:16, Craig Latta wrote:

    In the meantime, I'm getting very good results running Squeak in the web browser and on Node with SqueakJS. The simple bytecode-to-JS dynamic translator that Bert wrote for it is a big win. You get a decent translation of bytecodes to native code via the underlying JS engine (V8 et al.).

    btw, is that then a Slang-to-SqueakJS writer? Or is the bytecode-to-JS happening at some other level?

    cheers -ben

  • weixin_39635648 2020-12-26 21:24

    It takes a Smalltalk CompiledMethod object, transcribes its instructions (bytecodes) into a JavaScript method, and makes sure to run that instead the next time the CompiledMethod would normally run. Then V8 optimizes the hell out of the transcribed JS method. It's pretty neat, and it makes a huge difference for lots of things, including the UI. (A rough sketch of the idea follows this reply.)

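    As a rough illustration of the technique described above, and not SqueakJS's actual code, a transcriber of this kind walks a CompiledMethod's bytecodes once, emits equivalent JavaScript source, and hands that to the host engine to compile and optimize. The bytecode numbers, the vm interface, and the CompiledMethod shape below are invented for the example.

    ```typescript
    // Rough sketch of a bytecode-to-JS transcriber; SqueakJS's real translator
    // is considerably richer.
    interface CompiledMethod {
      bytecodes: number[];   // the method's instruction stream
      literals: unknown[];   // literal frame (constants, selectors, ...)
    }

    function transcribe(method: CompiledMethod): (vm: any) => unknown {
      const js: string[] = [];
      const code = method.bytecodes;
      for (let pc = 0; pc < code.length; pc++) {
        switch (code[pc]) {
          case 0x70:                  // hypothetical "push receiver" bytecode
            js.push("vm.push(vm.receiver);");
            break;
          case 0x20:                  // hypothetical "push literal 0" bytecode
            js.push("vm.push(vm.literal(0));");
            break;
          case 0x7c:                  // hypothetical "return top of stack"
            js.push("return vm.pop();");
            break;
          default:                    // anything else falls back to the interpreter
            js.push(`vm.interpretBytecode(${code[pc]});`);
        }
      }
      // Build a real JavaScript function from the emitted source; the host JS
      // engine (V8 et al.) can then optimize it like ordinary JavaScript.
      return new Function("vm", js.join("\n")) as (vm: any) => unknown;
    }

    // The VM runs the transcribed function the next time the CompiledMethod
    // would normally be interpreted, e.g.:
    //   const jsMethod = transcribe(someMethod);
    //   jsMethod(vmState);
    ```
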
  • weixin_39534121 2020-12-26 21:24

    Do you want good performance? Do you want to use the ManagedObject layer provided by WASM, or OpenSmalltalk-VM's own GC? The latter would require keeping a shadow stack, since the WASM stack cannot be walked easily for GC (a conceptual sketch follows this reply). Do you want to JIT to WASM, or just an interpreter?

    My main concern is that implementing high-level languages on top of WASM is only just starting to be possible in a nice way, and I think we should wait for other people to struggle and renegotiate some APIs with the WASM people before trying. Each WASM API change has to be prototyped and then validated by multiple people from different companies, so it can take time.

    Depending on what you want it may not be $10M, but if you want a solid, high-performance, low-memory-footprint runtime and you don't want to wait multiple years for other people to struggle through implementing other languages first, that would be a fair initial investment.

    -- Clément Béra https://clementbera.github.io/ https://clementbera.wordpress.com/

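    To make the shadow-stack point concrete, here is a conceptual sketch of the discipline such a runtime would follow, written in TypeScript for readability; all names are invented, and in a real wasm build the shadow stack would live in linear memory and its slots would be updated when the collector moves objects.

    ```typescript
    // Conceptual sketch of a shadow stack for GC roots; names are invented.
    type ObjRef = number;              // an object pointer into the VM heap

    const shadowStack: ObjRef[] = [];  // roots the collector is allowed to scan

    // Each compiled function spills the object references it holds into the
    // shadow stack, because the wasm value stack itself cannot be walked.
    function withRoots<T>(refs: ObjRef[], body: () => T): T {
      const base = shadowStack.length;
      shadowStack.push(...refs);       // push this frame's roots on entry
      try {
        return body();                 // any allocation in here may trigger GC
      } finally {
        shadowStack.length = base;     // pop the frame on exit, even on throw
      }
    }

    // The collector treats every slot currently on the shadow stack as a root.
    function collectGarbage(markRoot: (ref: ObjRef) => void): void {
      for (const ref of shadowStack) markRoot(ref);
      // ...then mark transitively, sweep or compact, and rewrite shadow-stack
      // slots if the collector moves objects.
    }
    ```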
