How to make the Rust Game of Life WebAssembly work as a static website?

I have gone through the tutorial for the Rust Game of Life and have a working game in a web browser, but it only works from the demo web server that comes bundled with it. I can start the server with npm start and it runs the webpack-dev-server on port 8080. When I access the site through that port, it works fine. However, if I try to copy the site to a web server like Apache, it does not load correctly. The error I am currently getting from it is:
Error importing `index.js`: TypeError: Error resolving module specifier “wasm-game-of-life”. Relative module specifiers must start with “./”, “../” or “/”. bootstrap.js:5:23
<anonymous> http://www.north-winds.org/gol/bootstrap.js:5
From the tutorial, the root of the website is a folder called www/ in the repository, and the wasm module generated from the Rust program is placed under pkg/. There is a symbolic link at www/node_modules/wasm-game-of-life that points up to ../../pkg/. I replaced that symlink with an actual copy of the top-level pkg/ folder so that the website is entirely contained inside www/, and then placed that folder on my website at http://www.north-winds.org/gol/. However, accessing it returns the error above. What do I need to modify to make it work stand-alone?
As I understand it, this WebAssembly Game of Life is basically a stand-alone client-side app and should not require anything beyond a web server that can serve static files with the appropriate MIME types. I don't see anything special that should be required. I did see mention of WebSockets somewhere, but I don't know why that would be needed for this app. I compared this to the "Hello, World" WebAssembly example for C from https://webassembly.org/, which ended up as a .wasm file generated from the C source code plus a single JavaScript and HTML supporting file to execute it. Those files worked correctly when simply copied to a static web server location. This is what I'd like for the Rust example.
Some relevant code from the Rust Game-of-Life is as follows. The top-level HTML file includes this among other lines:
<script src="./bootstrap.js"></script>
The bootstrap JavaScript file contains only this:
import("./index.js")
.catch(e => console.error("Error importing `index.js`:", e));
And the index.js file that it references has this among other glue logic for the Wasm:
import { Universe, Cell } from "wasm-game-of-life";
// Import the WebAssembly memory at the top of the file.
import { memory } from "wasm-game-of-life/wasm_game_of_life_bg";
What's missing to make this work standalone?

The www/ and pkg/ folders contain the source files you need, but they are not a static site yet. The bare module specifier "wasm-game-of-life" in index.js can only be resolved by a bundler; a browser loading the raw files has no way to resolve it, which is exactly the error you are seeing. The create-wasm-app template uses webpack for this, so you need to build the final output by running npm run build in the www folder. That creates a subfolder named dist which contains the actual static files (the bundled JavaScript, the .wasm file, and index.html) to place on your web server; it is also worth checking that your server serves .wasm files with the application/wasm MIME type.
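For reference, the webpack configuration that ships with the create-wasm-app template looks roughly like this (check your copy, it may differ slightly); it shows that bootstrap.js is the bundle entry point and that the build output lands in dist/:
const CopyWebpackPlugin = require("copy-webpack-plugin");
const path = require("path");

module.exports = {
  // bootstrap.js is the entry point; webpack resolves the bare
  // "wasm-game-of-life" specifier from node_modules at build time
  entry: "./bootstrap.js",
  output: {
    // everything the static site needs is emitted here by `npm run build`
    path: path.resolve(__dirname, "dist"),
    filename: "bootstrap.js",
  },
  mode: "development",
  plugins: [
    // copy index.html next to the generated bundle
    new CopyWebpackPlugin(["index.html"]),
  ],
};
Copy the contents of dist/ (rather than www/) to your web server; those files are self-contained and need nothing beyond static hosting.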

Related

How to host SPA files and embed too with axum and rust-embed

I'm having a hard time understanding how to embed SPA (single-page application) files with rust-embed and axum.
I have no trouble without rust-embed using a single line of code with axum (from here):
app.fallback(get_service(ServeDir::new("./app/static")).handle_error(error_handler))
It works because all files are correctly downloaded. But:
FIRST PROBLEM
What is missing for proper SPA handling is the fallback to index.html when, for example, the user reloads the page on a nested SPA route.
Example: I'm on the page /home/customers, which is neither a file nor a directory but just a client-side JavaScript route, and if I reload the page axum gives me a 404 (Not Found).
SECOND PROBLEM
I need to embed those files in my final executable. In Go this is "native" using the embed directive.
I saw that in Rust this is well handled by rust-embed, but I cannot get it working for a SPA.
The need is that every path typed by the user (and that is not an existing file such as a .js or .css asset, which obviously must be downloaded by the browser) leads to the index.html file in the root of my static dir.
If I use the example axum code I can see the route:
.route("/dist/*file", static_handler.into_service())
which matches /dist/*file, but I don't need that /dist prefix because index.html requests many files at custom paths, such as /_works, menu, images.
If I remove the dist part I get this error:
thread 'main' panicked at 'Invalid route: insertion failed due to conflict with previously registered route: /index.html'
Can you help me understand how to properly accomplish this task?
Thanks.
I had a similar issue, building with Vue and Axum/Rust.
Here's how I solved problem one:
Install the tower-http crate.
Use axum::routing::get_service to serve the built SPA.
// example implementation
use std::path::PathBuf;
use axum::{http::StatusCode, routing::get_service};
use tower_http::services::ServeDir;
...
// static file mounting
let assets_dir = PathBuf::from(env!("CARGO_MANIFEST_DIR")).join("views");
let static_files_service = get_service(
    // serve the directory and append index.html for directory requests
    ServeDir::new(assets_dir).append_index_html_on_directories(true),
)
.handle_error(|error: std::io::Error| async move {
    (
        StatusCode::INTERNAL_SERVER_ERROR,
        format!("Unhandled internal error: {}", error),
    )
});
...
Mount the static file service as the fallback:
// mount the app routes and middleware
let app = Router::new()
    .fallback(static_files_service)
    .nest("/api/v1/", routes::root::router())
    .layer(cors)
    .layer(TraceLayer::new_for_http())
    .layer(Extension(database));
Check out the full source code here. Another thing: Axum has had breaking changes in subsequent versions, as I found out here, so you might need to check the docs/examples that correspond to the version of Axum you are using :)

AppImage from electron-builder with file system not working

I'm using Electron Builder to compile my Electron app to an .AppImage file, and I'm using the fs module to write to a .json file, but it's not working in the AppImage build (it works fine in the normal version not made with Electron Builder). I can still read from the file.
The code (preload):
setSettings: (value) => {fs.writeFileSync(path.join(__dirname, "settings.json"), JSON.stringify(value), "utf8")}
The code (on the website):
api.setSettings(settings);
The project: https://github.com/Nils75owo/crazyshit
That's not a problem with AppImage or Electron Builder but with the way you're packaging your app. Since you didn't post your package.json*, I can only guess what's wrong, but probably you haven't changed Electron Builder's default behaviour regarding packing your application.
By default, Electron Builder compiles your application, including all resources, into a single archive file in the ASAR format (think of it like the TAR format). Electron includes a patched version of the fs module to be able to read from the ASAR file, but writing to it is obviously not supported.
You have two options to mitigate this problem: Either you store your settings somewhere in the user's directory (which is the way I'd go, see below) or you refrain from packing your application to an ASAR file, but that will leave all your JavaScript code outside the executable in a simple folder. (Note that ASAR is not capable of keeping your code confidential, because there are applications which can extract such archives, but it makes it at least a little harder for attackers or curious eyes to get a copy of your code.)
To disable packing to ASAR, simply tell Electron Builder that you don't want it to compile an archive. Thus, in your package.json, include the following:
{
  // ... other options
  "build": {
    // ... other build options
    "asar": false
  }
}
However, as I mentioned above, it's probably wiser to store settings in a common place where advanced users can actually find (and probably edit, mostly for troubleshooting) them. On Linux, one such folder would be ~/.config, where you could create a subdirectory for your application.
To get the specific application data path on a cross-platform basis, you can query Electron's app module from within the main process. You can do so like this:
const { app } = require("electron");
const path = require("path");

let configPath;

try {
  configPath = path.join(app.getPath("appData"), "your-app-name");
} catch (error) {
  console.error(error);
  app.quit();
}
If, however, you have correctly set your application's name (by using app.setName("...")), you can instead simply use app.getPath("userData") and omit the path joining. Take a look at the documentation!
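As a minimal sketch, assuming the application name has already been set (via app.setName or the name field in package.json), the lookup collapses to a single call:
const { app } = require("electron");

// Resolves to the per-user data directory for this app,
// e.g. ~/.config/your-app-name on Linux or %APPDATA%\your-app-name on Windows.
const configPath = app.getPath("userData");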
For my Electron applications, I typically store settings in a common hidden directory in the user's home directory (one example for its name could be the brand under which you plan to market the application), with separate subdirectories for each application. But how you organise this is entirely up to you.
* For the future, please refrain from directing us to a GitHub repository and instead include all information (and code is information too) needed to replicate/understand your problem in your question. That'd save us a lot of time and could potentially get you answers faster. Thanks!

React dynamic image importing in development

I am building a React application which needs to display images dynamically which are stored, by the thousands, on a server-side file system. All of my attempts to successfully implement this have failed, including many which were taken from responses to similar questions.
Some details:
I used create-react-app to initialize my application. I am running in development mode (I have not run npm run build). I'm using Express.js (Node.js) as a web server, which I interact with through a proxy (only '/api' HTTP requests use the proxy). My JS code which attempts to 'require' the images is in the 'src' folder. The images are located in an 'images' folder in the default 'public' folder.
I thought I had found the solution when reading this page from create-react-app, as it states to use the public folder when 'You have thousands of images and need to dynamically reference their paths'. The page further instructs to use '%PUBLIC_URL%' or 'process.env.PUBLIC_URL' to access the 'public' folder. When using either of these I receive an 'Error: Cannot find module' message. Upon checking I notice that 'process.env.PUBLIC_URL' contains an empty string, and quickly notice that PUBLIC_URL is ignored in development mode.
I find this tremendously confusing, given that the 'Using the Public Folder' page is apparently describing the development phase, and yet it advises the use of something which is meaningless during development. Adding to my confusion, it appears as if the contents of that page resolved the issue for nearly all of those who have encountered a similar requirement in the past (example: 1, example: 2; both fail for me). Likewise, all attempts to construct relative paths to the 'public' folder from the 'src' folder have yielded error messages. Failed code example:
let img = process.env.PUBLIC_URL + '/images/Team.jpg';
<img src={require(`${img}`)} alt="X" />
Error: Cannot find module '/images/Team.jpg'
I never imagined showing images in React would be so difficult. Any help is truly very much appreciated.
I think you are correct; you just don't need the require. Return <img src={process.env.PUBLIC_URL + '/img/logo.png'} />; as you can see in their docs.
If you open http://localhost:PORT/images/Team.jpg in your browser, it should load.
That's the reason process.env.PUBLIC_URL is empty in development: the dev server serves everything inside the public folder directly from the site root.
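A minimal sketch, reusing the Team.jpg path from the question (the component name is made up): build the URL at runtime and pass it straight to the img tag, with no require() involved.
// Files under public/ are served as-is by create-react-app; no require()/import is needed.
function TeamPhoto() {
  const src = process.env.PUBLIC_URL + "/images/Team.jpg";
  return <img src={src} alt="Team" />;
}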

Cloud Foundry - Folder structure and relative paths

This is somewhat related to an issue I'm having with CF on IBM Cloud here. My question, after playing around with the folder structures, is: how exactly does CF build the app when it comes to relative paths?
For example, if I have the following folder structure
when I add <script type = 'text/javascript' src = '../index.js'></script> to the index.html file, I get GET https://simple-toolchain-20190320022356947.mybluemix.net/index.js net::ERR_ABORTED 404. This error does not happen when I move index.js into the public folder and change the tag to <script type = 'text/javascript' src = 'index.js'></script>.
The problem I have then is that when I try to require() any modules while the index.js file is in a sub-directory, it returns a "require is not defined" error, indicating that it is not getting the module from the node_modules cache which CF is supposed to build. Requiring any files in the same sub-directory also throws the same error. This does not seem to be a problem when require() is used in the default app.js, as the application loads without any errors.
I'm relatively new to the IBM Cloud Foundry tool, but I'm following the same structure as when I pushed apps via the Cloud9 IDE and didn't have any such issues there. I feel I might be missing something ridiculously simple, like an endpoint configuration or a package.json setting. However, I've been searching around for days and can't seem to find a solution.
I'd appreciate any pointers. Thanks!
Due to my lack of understanding, I was trying to use require() on the client side, hence the errors. Going to figure out how to use Browserify now. ;)
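For anyone hitting the same thing: require() is a Node.js (CommonJS) function and does not exist in browsers, which is exactly what the "require is not defined" error means. A rough sketch of the browser side (the "./utils.js" module is a made-up example):
// Client code served to the browser cannot use require(); it is a Node.js API.
// Load code with plain <script> tags, native ES modules, or a bundler instead.
import { formatDate } from "./utils.js"; // needs <script type="module"> or a bundler such as Browserify/webpack

console.log(formatDate(new Date()));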

Sharing TypeScript classes between client and server

I have a Node.js project written in TypeScript. In my project, I have a folder named "public" which contains the client-side code & HTML, and also a file named classes.ts which is supposed to be shared with the server side.
The problem is that I need to add "export" before the class declarations in order to make them accessible on the server, but then in the browser I get this error:
Uncaught ReferenceError: exports is not defined
I found these questions:
https://github.com/Microsoft/TypeScript/issues/5094,
Setup a Typescript project with classes shared between client and server apps?,
Share module between client and server with TypeScript,
which suggest using CommonJS on the server but AMD in the client. The problem with this solution is that they use 3 different projects (server, client and shared), whereas I only have one project, in which I use CommonJS.
Another suggestion is:
the other option, which is more convoluted and will require a post
build step to massage the code; if you can not use module loaders in
your client code, is to isolate all module dependencies in your server
code, then in the shared, they are just classes. Build the shared
files without --module, and no exports or imports, but all inside a
single namespace, say namespace MyApp { ... }; in your client code,
you include them directly, and emit using --out. in your server code,
you first emit the shared code to a single file, shared.js, and a
single .d.ts shared.d.ts, augment these with some code to export them
as a module, e.g. append exports = MyApp at the end of your shared.js
and shared.d.ts, then import them from your server code.
But I don't want to deal with updating .d.ts files all the time, and I'm also not sure it will work in one project.
Any suggestion how to make a TypeScript class accessible both in browser and server?
Any help will be profoundly appreciated!
This is absolutely possible.
I have a project containing both a SPA client application that runs in the browser and a server running in Node.js, and they share common TypeScript classes. For all of this I have just one tsconfig.json file (I am still not sure this is the best approach, but for now it works just fine).
Here are parts of my setup:
Use modules (previously called external modules). There is no need for namespaces or d.ts files for your own modules.
Set "module": "commonjs" in tsconfig.json.
On the client side, use System.js as the module loader (this will solve your 'Uncaught ReferenceError: exports is not defined'). You can use the Angular 2 5-minute quickstart as a reference for how to set up System.js.
It works like a charm.
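As a rough sketch of that setup (the file names and the User class are just for illustration), a shared class is an ordinary exported module that both sides import; only the way the compiled output is loaded differs:
// shared/classes.ts -- plain exported classes, no namespaces, no hand-written .d.ts
export class User {
  constructor(public id: number, public name: string) {}
}

// server.ts -- compiled with "module": "commonjs" and run under Node.js
import { User } from "./shared/classes";
console.log(new User(1, "Alice").name);

// client.ts -- same import; in the browser the compiled file is loaded
// through a module loader such as System.js (or a bundler)
import { User } from "./shared/classes";
document.title = new User(2, "Bob").name;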
