emscripten pure opengl (no sdl2 or glfw3) - android-ndk

I want to make a multiplatform application; it currently works on PC and Android, and I want to port it to the web.
I don't want to use SDL2 or GLFW3.
All I want is to create an OpenGL context and then just run my OpenGL ES code (similar to Android, where one can have a very basic app just by creating a GLSurfaceView instance and implementing two callbacks, onDraw and onCreate, in the NDK).
Do you know of any texts/tutorials covering this problem?

You can use the createContext method of the Module object (from Emscripten's browser library) to create the WebGL context. After that, simply call your compiled OpenGL code and you're good to go :)
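For the native side, here is a minimal sketch, not taken from the answer above, that uses Emscripten's <emscripten/html5.h> C API (an alternative to calling Module.createContext from JavaScript) to create a WebGL context and drive a render loop without SDL2 or GLFW3:

```cpp
// Minimal sketch: create a WebGL context with Emscripten's HTML5 API
// and hand the render loop to the browser. No SDL2/GLFW3 involved.
#include <emscripten.h>
#include <emscripten/html5.h>
#include <GLES2/gl2.h>

static void draw_frame() {
    // Your existing OpenGL ES drawing code goes here.
    glClearColor(0.2f, 0.3f, 0.4f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
}

int main() {
    EmscriptenWebGLContextAttributes attrs;
    emscripten_webgl_init_context_attributes(&attrs);  // sensible defaults
    attrs.majorVersion = 1;                             // WebGL 1 ~ GLES 2

    // "#canvas" is the default canvas in Emscripten's generated shell page.
    EMSCRIPTEN_WEBGL_CONTEXT_HANDLE ctx =
        emscripten_webgl_create_context("#canvas", &attrs);
    emscripten_webgl_make_context_current(ctx);

    // Let the browser schedule frames instead of a blocking while(1) loop.
    emscripten_set_main_loop(draw_frame, 0, 1);
    return 0;
}
```

Building with something like emcc main.cpp -o main.html emits a shell page with a canvas for the context to attach to.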

Related

Can I integrate a native app in a web browser?

I need to display a native app (in my case, a video game built with Unity) in a web browser page.
Local video and input streaming would be a solution, but video encoding consumes too many computer resources.
Is there a way to display a native app in a web browser page?
Sure, you need to compile your application for WebAssembly. Or, to be more precise, cross-compile it, since WebAssembly binary code runs on a virtual stack machine.
You take your native code, compile it into a .wasm module, and then load that .wasm module with WebAssembly.instantiateStreaming().
There are many toolchains that can have WebAssembly as the compilation target. I think the 2 most popular ones are Emscripten and wasm-pack.
There is also wabt, but that is rather a set of lower level tools, not quite a toolchain.
As for Unity: I have no experience with it, but there is some official documentation on WebAssembly.
Here is a game that was developed in C# and cross-compiled to WebAssembly.
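As a rough illustration of that flow, here is a tiny sketch (the file and function names are hypothetical, and with the default Emscripten output the generated .js glue performs the WebAssembly.instantiateStreaming() step for you):

```cpp
// add.cpp -- hypothetical example of native code exported to WebAssembly.
//
// Build: emcc add.cpp -O2 -o add.js
//        (emits add.js plus add.wasm; the .js glue fetches and
//         instantiates the .wasm module in the browser)
#include <emscripten/emscripten.h>

extern "C" {

// EMSCRIPTEN_KEEPALIVE keeps the symbol exported so the page can call it,
// e.g. Module._add(2, 3) once the module has finished loading.
EMSCRIPTEN_KEEPALIVE
int add(int a, int b) {
    return a + b;
}

}
```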

Is it possible to embed xamarin part into an existing native app?

I have existing iOS and Android native apps. Is it possible to extend the apps with a Xamarin-coded part?
Both Xamarin.iOS and Xamarin.Android are currently set up to take control of your application, so you need to make your main program be written in C# and then call into the existing code.
The way that you would do this is to bind your existing Objective-C or Java code as C# libraries, and then consume those libraries from C#. The binding technology is precisely what drives both the Xamarin.iOS and Xamarin.Android tools, so you would effectively be doing the same.
Once you have bindings, the interoperability works both ways: you can call native code, and native code can call C#.
The bad news is that instead of starting to enjoy writing code with both tools from day zero, the first thing you have to do is the bindings, which is, in general, not as fun as watching colored squares move on your screen.
OK, I found an answer here: http://www.whitneyland.com/2013/05/why-i-dont-recommend-xamarin-for-mobile-development.html
For example, code written in Xamarin cannot be used in native or HTML5 apps.

Is it possible to build Android games in Go using the NDK with cgo and/or SWIG or similar?

Is it possible to use Go to build Android games at all? I'm not wedded to the technologies mentioned in the subject line. I know that some people have built some Android programs in Go, but they may have been headless.
No, it is not possible right now. Go and C programs can interoperate via cgo, but in that case the Go program has to start and initialize its runtime before passing control to the C-based part of the program.
When using the NDK in Android, your C code is called by the Dalvik virtual machine via dlopen. So the Go runtime would not have a chance to initialize itself.
If you want non-headless apps, my advice would be to use cgo for the GUI. That sounds counterintuitive, but if the NDK supports C Android GUI libraries, it would probably be easiest to write the GUI using those calls. Of course, you don't have to write all the logic in C. You could simply write cgo wrappers for each of the GUI calls and then write the GUI in Go, except that each GUI call would be translated through cgo.

Accessing hardware with Android NDK

I need to extend the functionality of the android.hardware.Camera class and so I have written my own class and companion JNI library to meet my needs. If I place my JNI code and Android.mk file in the Android source tree and build the OS, my library builds and I can use it and the Java class in an application without any problems (on an evaluation module at least).
The problem is that I would prefer to build my JNI library with the NDK but I need several libraries that are not in the NDK (e.g. libandroid_runtime and libcamera_client).
Is it possible to use the NDK to access hardware such as the camera? If so, what is the proper way to get access to OS libraries?
You can access non-standard shared libraries from the NDK, but that is undocumented and not guaranteed to work across devices; vendors like HTC, Samsung and others can simply implement them differently.
The only proper way to use functionality that is not available in the NDK is to wrap it with Java classes/functions and then call them from native code.
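As a minimal sketch of that "wrap it in Java, call it from native code" approach via JNI (the helper class and method names below are hypothetical):

```cpp
// Hypothetical example: a Java helper wraps android.hardware.Camera,
// and native code calls it through JNI.
//
// Assumed Java side:
//   package com.example.camera;
//   public class CameraHelper {
//       public static void setTorch(boolean on) { /* Camera API calls */ }
//   }
#include <jni.h>

void set_torch_from_native(JNIEnv* env, bool on) {
    jclass cls = env->FindClass("com/example/camera/CameraHelper");
    if (cls == nullptr) return;  // class not found; a Java exception is pending

    jmethodID mid = env->GetStaticMethodID(cls, "setTorch", "(Z)V");
    if (mid == nullptr) {
        env->DeleteLocalRef(cls);
        return;
    }

    env->CallStaticVoidMethod(cls, mid, on ? JNI_TRUE : JNI_FALSE);
    env->DeleteLocalRef(cls);
}
```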

Android NDK GLES v1.1

I have a game written in C/C++ using GLES v1.1 (yes, it was an iPhone game). As I'm porting it to Android, I see logs in Eclipse's LogCat telling me that the GL functionality I want to use (like glGet*) is not implemented.
Digging on Google, I found that you can cast a GL10 context to GL11; however, since my API calls are all native, I cannot use that...
Is there a way to initialize a GL11 context in Java and then use the native GL11 API call?
If you are running on the emulator, be aware that currently (SDK Tools Rev9, ndk-r5b) the emulator (no matter the platform version) only implements the GLES 1.0 interfaces. It won't matter that your context is for GL11. You'll need to test on hardware to successfully call the unimplemented API entry points.
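If it helps while debugging, here is a small sketch (not from the answer) that logs the version and renderer strings from native code, so you can confirm what the context you are actually running on provides before calling GLES 1.1 entry points:

```cpp
// Sketch: log GL_VERSION / GL_RENDERER from the NDK side.
// Call this on the GL thread (e.g. from onSurfaceCreated via JNI).
// Link with -lGLESv1_CM -llog.
#include <GLES/gl.h>
#include <android/log.h>

#define LOG_TAG "GLCheck"

void log_gl_context_info() {
    const GLubyte* version  = glGetString(GL_VERSION);
    const GLubyte* renderer = glGetString(GL_RENDERER);

    __android_log_print(ANDROID_LOG_INFO, LOG_TAG, "GL_VERSION:  %s",
                        version ? reinterpret_cast<const char*>(version) : "(null)");
    __android_log_print(ANDROID_LOG_INFO, LOG_TAG, "GL_RENDERER: %s",
                        renderer ? reinterpret_cast<const char*>(renderer) : "(null)");
}
```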