OpenFL bitmapData.draw crashes - Haxe

I have a problem with OpenFL. When I call somebitmapdata.draw(someObject), the build just crashes. I've checked the BitmapData class's draw function; it crashes on this line: __framebuffer = new FilterTexture (gl, width, height, smoothing); What can I do to fix it?

Related

Telling PixiJS that the WebGL state has been modified externally

I am trying to integrate PixiJS with an existing custom WebGL engine. The existing custom engine is the host and hands control to PixiJS every frame. The existing custom engine configures the WebGL state to an "almost" default state and then calls into PixiJS; after PixiJS is done, the existing custom engine does a full reset of the WebGL state.
In code:
onFrame() {
  resetWebGLStateToDefault(gl);
  gl.bindFramebuffer(...);
  gl.viewport(...);
  thenWeUsePixiJSToDoSomeAdvancedStuff();
  resetWebGLStateToDefault(gl);
}
My question
In thenWeUsePixiJSToDoSomeAdvancedStuff(), how can I tell PixiJS that the state is not what it was the last time it ran? Pretty much everything has been reset; PixiJS should assume that everything is at its default, and I would also like to tell PixiJS what the current viewport and framebuffer are.
I tried Renderer.reset, StateSystem.reset, and StateSystem.forceState, but I guess that's not enough; PixiJS keeps assuming that some textures it set previously are still bound (they are not, the existing custom engine unbinds everything) and I get lots of [.WebGL-0x7fd105545e00]RENDER WARNING: there is no texture bound to the unit ? warnings, for pretty much all texture units (1-15) except the first one.
Edit
It's probably worth mentioning that I am calling into the renderer directly; I think I need to because the existing custom engine owns the render loop. I am basically trying something like this, but I am getting the WebGL texture errors after the first frame.
renderer.reset();
renderer.render(sprite);
renderer.reset();
Edit
I tried the same thing with an autoStart: false application, and I get the same error.
pixiApp.renderer.reset();
pixiApp.render();
pixiApp.renderer.reset();
The issue appears to be that I was calling into PixiJS with a currently bound FBO; I fixed all my problems by creating a separate PIXI.RenderTexture, rendering there, and then compositing on top of my WebGL engine using a fullscreen quad.
// Create a render texture
const renderTexture = PIXI.RenderTexture.create(640, 360);
// Render with PixiJS
renderer.reset();
renderer.render(this.stage, renderTexture);
renderer.reset();
// Retrieve the raw WebGL texture
const texture = renderTexture.baseTexture._glTextures[renderer.texture.CONTEXT_UID].texture;
// Now composite on top of the other engine
gl.bindFramebuffer(gl.FRAMEBUFFER, theFramebufferWhereINeededPixiJSToRenderInTheFirstPlace);
gl.useProgram(quadProgram);
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.uniform1i(u_Texture, 0);
gl.bindBuffer(gl.ARRAY_BUFFER, quadBuffer);
gl.vertexAttribPointer(0, 2, gl.BYTE, false, 2, 0);
gl.enableVertexAttribArray(0);
gl.bindBuffer(gl.ARRAY_BUFFER, null);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
gl.useProgram(null);
You may need to resize() the renderer and/or the render texture, depending on your actual setup.

How to load an SVG into a CanvasVirtualControl in C++/WinRT?

I've been loading and drawing SVG images successfully in a UWP app built with C++/WinRT, but in the latest release I find that the call to load the SVG throws an exception.
The crash happens in an IAsyncAction method that is hard to trace, but I've narrowed the problem down to the one line that loads the SVG.
This normally loops to read lots of files, but a very simple reduction of the problem still shows the issue:
winrt::Microsoft::Graphics::Canvas::UI::Xaml::CanvasVirtualControl resourceCreator = nullptr;
winrt::Windows::Storage::StorageFile nextFile = nullptr;
winrt::Microsoft::Graphics::Canvas::Svg::CanvasSvgDocument nextSvg = nullptr;
winrt::Windows::Storage::Streams::IRandomAccessStream fileStream = nullptr;

// This is called from within the lambda handling CreateResources for the CanvasVirtualControl.
// The VirtualControl is provided as the resourceCreator argument.
IAsyncAction loadSVGs(winrt::Microsoft::Graphics::Canvas::UI::Xaml::CanvasVirtualControl resourceCreator)
{
    // i is the index of the current file in the full, looping version
    nextFile = m_symbol_resource_files.GetAt(i);
    fileStream = co_await nextFile.OpenReadAsync();
    nextSvg = co_await CanvasSvgDocument::LoadAsync(resourceCreator, fileStream);
}
The call to LoadAsync fails with
exception: winrt::hresult_invalid_argument at memory location 0x0C93F1CB
with the debugger stopped here:
void resume() const {
->  _coro_resume(_Ptr);
}
If I continue past the exception I find that the resource has in fact loaded and is usable. But when running outside Visual Studio the app will often quit at this line. Could it be that the CanvasVirtualControl is not acceptable as the resource creator?
Or is there a better way to load the SVG from a file? I haven't been able to work out the CanvasSvgDocument's LoadFromXml, as it takes a System.String argument that isn't available in C++/WinRT, and std::wstring is not acceptable as a substitute. [Correction: in fact hstring will work as the argument, and it can be created from a std::wstring.]
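To illustrate that correction, here is a tiny sketch (xmlText and how it is produced are placeholders):
// LoadFromXml takes an hstring; hstring can be constructed from a std::wstring.
std::wstring xmlText = L"<svg xmlns='http://www.w3.org/2000/svg'/>";  // placeholder XML
winrt::hstring xml{ xmlText };
auto svg = CanvasSvgDocument::LoadFromXml(resourceCreator, xml);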
[Update] I have not yet created a simple demo project displaying this behavior, but I think I have a line on the cause and would like to extend my question. I've tried both the LoadFromXml and the LoadAsync methods of the CanvasSvgDocument; the second of these used to be fine, but now both fail the same way, and it seems to me that the resourceCreator argument may be the trouble. I am creating these resources in the CreateResources handler and I pass the sender - a CanvasVirtualControl - as the resourceCreator. The declared argument type for both CanvasSvgDocument calls, however, is ICanvasResourceCreator. I had thought this was a convenience and that the CanvasVirtualControl could be passed directly for that argument (and so it used to be). But perhaps this is incorrect, maybe always was, and is only now being noticed as incorrect? If so, how should the sender in the CreateResources handler properly be passed to the CanvasSvgDocument method?
I created a blank app that uses CanvasVirtualControl to load an SVG, and it worked well. Can you provide a simple sample that reproduces the problem? I also checked the LoadFromXml method; the parameter it needs is an hstring, not a System.String.
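For reference, here is a minimal sketch of how the CreateResources handler can pass the control straight through to CanvasSvgDocument::LoadAsync (class and method names such as MainPage, OnCreateResources and LoadSvgsAsync are illustrative, not code from either project); Win2D's documented pattern for asynchronous resource loading is to register the coroutine with args.TrackAsyncAction so the control waits for it:
using winrt::Windows::Foundation::IAsyncAction;
using winrt::Microsoft::Graphics::Canvas::Svg::CanvasSvgDocument;
using winrt::Microsoft::Graphics::Canvas::UI::CanvasCreateResourcesEventArgs;
using winrt::Microsoft::Graphics::Canvas::UI::Xaml::CanvasVirtualControl;

// Member coroutine of the page that owns m_symbol_resource_files.
IAsyncAction MainPage::LoadSvgsAsync(CanvasVirtualControl resourceCreator)
{
    auto lifetime = get_strong();   // keep the page alive across the co_awaits
    // One file for brevity; the real code loops over m_symbol_resource_files.
    nextFile = m_symbol_resource_files.GetAt(0);
    fileStream = co_await nextFile.OpenReadAsync();
    nextSvg = co_await CanvasSvgDocument::LoadAsync(resourceCreator, fileStream);
}

void MainPage::OnCreateResources(CanvasVirtualControl const& sender,
                                 CanvasCreateResourcesEventArgs const& args)
{
    // TrackAsyncAction keeps resource creation pending until the load completes,
    // instead of letting the coroutine run unobserved.
    args.TrackAsyncAction(LoadSvgsAsync(sender));
}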
In addition, regarding loading the SVG from a file: do you have to use Win2D? If not, you could try using an Image control with an SvgImageSource, like below:
.xaml:
<Image x:Name="MyImage" Width="300" Height="200"></Image>
.cpp:
// Inside an IAsyncAction coroutine; SvgImageSource comes from winrt::Windows::UI::Xaml::Media::Imaging.
winrt::Windows::Storage::StorageFile nextFile = nullptr;
winrt::Windows::Storage::Streams::IRandomAccessStream fileStream = nullptr;
nextFile = co_await KnownFolders::PicturesLibrary().GetFileAsync(L"Freesample.svg");
fileStream = co_await nextFile.OpenReadAsync();
SvgImageSource sourcee = SvgImageSource();
co_await sourcee.SetSourceAsync(fileStream);
MyImage().Source(sourcee);

OpenCL clCreateFromGLTexture using a different texture target

The aim of my project is to get a live camera feed from an Android device, use OpenCL to perform real-time filtering on those images, and render the output on the display.
I aim to do this in real time, which is why I am using OpenCL-OpenGL interop.
I have successfully managed to create a shared context using EGLContext and EGLDisplay. Now I am trying to use clCreateFromGLTexture so I can access these images in an OpenCL kernel. The problem, however, is that Android requires the texture to be bound with the target GL_TEXTURE_EXTERNAL_OES, as described here (http://developer.android.com/reference/android/graphics/SurfaceTexture.html), but this is not a valid texture target for clCreateFromGLTexture (https://www.khronos.org/registry/cl/sdk/1.1/docs/man/xhtml/clCreateFromGLTexture2D.html).
So I am not sure how to go about this.
This is how I create a GL Texture in android:
GLES20.glGenTextures(1, texture_id, 0);
GLES20.glBindTexture(texture_target, texture_id[0]);
and this is how I am trying to create a cl memory object:
glTexImage2D(texture_target, 0, GL_RGBA, 640, 480, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
cl_mem camera_image = clCreateFromGLTexture(m_context, CL_MEM_READ_WRITE, texture_target, 0, texture_id, &err);
The error I get when I try to create cl memory object from GL texture is CL_INVALID_VALUE.
I am pretty new to OpenGL so there could be something basic I might have over looked.
The texture you receive from the camera is not the usual texture you'd expect. You even have to specify the extension in a shader if you read from it.
You need to perform an additional copy from the GL_TEXTURE_EXTERNAL_OES target to another texture that is created in the usual way. With luck you can bind both of them to FBOs and just issue a blit. If that doesn't work, you can always use the normal texture as a render target and simply draw a quad textured with the camera image.
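A rough sketch of that second path (sizes and names such as quad_program, camera_texture_id and m_context are placeholders, and the quad geometry/attribute setup is assumed to exist already): draw the external texture into an FBO backed by a plain GL_TEXTURE_2D, then share that texture with OpenCL.
// Fragment shader that samples the camera image: note the required extension
// directive and the samplerExternalOES sampler type.
static const char* kCopyFragSrc =
    "#extension GL_OES_EGL_image_external : require\n"
    "precision mediump float;\n"
    "varying vec2 v_uv;\n"
    "uniform samplerExternalOES u_camera;\n"
    "void main() { gl_FragColor = texture2D(u_camera, v_uv); }\n";

// 1. Create the regular texture that OpenCL will see.
GLuint shared_tex;
glGenTextures(1, &shared_tex);
glBindTexture(GL_TEXTURE_2D, shared_tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 640, 480, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

// 2. Attach it to an FBO and draw a fullscreen quad that samples the
//    GL_TEXTURE_EXTERNAL_OES camera texture with the shader above.
GLuint fbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, shared_tex, 0);
glViewport(0, 0, 640, 480);
glUseProgram(quad_program);                     // program compiled with kCopyFragSrc
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_EXTERNAL_OES, camera_texture_id);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);          // quad VBO/attributes bound elsewhere
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glFinish();                                     // make sure GL is done before CL touches the texture

// 3. The copy now lives in a plain GL_TEXTURE_2D, which is a valid target for
//    clCreateFromGLTexture; acquire it with clEnqueueAcquireGLObjects before
//    using it in a kernel.
cl_int err = CL_SUCCESS;
cl_mem camera_image = clCreateFromGLTexture(m_context, CL_MEM_READ_ONLY, GL_TEXTURE_2D, 0, shared_tex, &err);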

GetBitmap() not working in release mode

I am working on an MFC application. I have to get the height and width of a BITMAP image. The code I'm using works in Debug mode only, but because of another problem I have to use Release mode, and in Release mode the code is not working. Please help me out!
CBitmap bmp;
bmp.LoadBitmap(IDB_BITMAP1);
BITMAP bm;
bmp.GetBitmap(&bm);
CBitmap bmp;
Don't use a local variable for a bitmap you want to draw later. It will be destroyed as soon as the function returns.
Use a member variable.
For instance:
m_Background.LoadBitmap (IDB_BITMAP1);
BITMAP bm;
m_Background.GetBitmap (&bm);
m_BitmapSize = CSize (bm.bmWidth, bm.bmHeight);
Invalidate(1);
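For completeness, a sketch of where those members would live (the class name is illustrative):
// In the dialog/view class declaration
class CMyDialog : public CDialogEx
{
    // ...
    CBitmap m_Background;   // keeps the bitmap alive after loading
    CSize   m_BitmapSize;   // cached dimensions from GetBitmap()
};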

How to cause creation of an EGL context?

I have an Android NDK application that is doing all of its rendering in software.
Now I want to use OpenGL ES to do the rendering.
I've got unit tests running by calling EGL and creating a PBuffer.
Now I want to do everything in a window instead of a PBuffer.
I adapted the code from the hello-gl2 example.
I created a new java file that uses a GLSurfaceView instead of a SurfaceView.
I have created a few native functions for GLSurfaceView.
I have successfully called C from Java, and have successfully called Java from C.
Still, no pictures.
I traced through with Eclipse and got an error that says that GL calls are being made without having a current context. I am doing
setEGLContextFactory(new ContextFactory());
setEGLConfigChooser(translucent ? new ConfigChooser(8, 8, 8, 8, 0, 0)
                                : new ConfigChooser(5, 6, 5, 0, 0, 0));
setRenderer(new Renderer());
However,
ConfigChooser.chooseConfig()
never gets called. Who is supposed to call this? The sample code gives no clue.
Do I also need to make some change in an XML file?
Please give me some ideas of paths to pursue. I'm only running into dead ends.
It turns out that there was a problem with threads: the GL rendering thread and the graphics database thread were deadlocking. Here is how I solved it. I reduced the number of threads by one, and managed GL myself:
1. Derive MyGLSurfaceView from SurfaceView instead of GLSurfaceView.
2. When MyGLSurfaceView.surfaceCreated() is called, squirrel away the ANativeWindow (from the main thread) in a global.
3. Initialize EGL in the database thread, using the ANativeWindow to create an EGLContext.
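A minimal sketch of step 3, assuming the ANativeWindow saved in surfaceCreated() is reachable through a global pointer (g_nativeWindow and the function name are illustrative):
#include <EGL/egl.h>
#include <android/native_window.h>

extern ANativeWindow* g_nativeWindow;   // stashed in surfaceCreated() on the main thread

// Called on the database thread; after this, GL calls are valid on that thread only.
bool initEglOnThisThread(EGLDisplay* outDisplay, EGLSurface* outSurface, EGLContext* outContext)
{
    EGLDisplay display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    if (display == EGL_NO_DISPLAY || !eglInitialize(display, nullptr, nullptr))
        return false;

    const EGLint configAttribs[] = {
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
        EGL_NONE
    };
    EGLConfig config;
    EGLint numConfigs = 0;
    if (!eglChooseConfig(display, configAttribs, &config, 1, &numConfigs) || numConfigs == 0)
        return false;

    // The window surface wraps the ANativeWindow captured earlier.
    EGLSurface surface = eglCreateWindowSurface(display, config, g_nativeWindow, nullptr);

    const EGLint contextAttribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    EGLContext context = eglCreateContext(display, config, EGL_NO_CONTEXT, contextAttribs);

    if (surface == EGL_NO_SURFACE || context == EGL_NO_CONTEXT)
        return false;

    // Bind the context to this thread; this is what makes subsequent GL calls legal here.
    if (!eglMakeCurrent(display, surface, surface, context))
        return false;

    *outDisplay = display;
    *outSurface = surface;
    *outContext = context;
    return true;
}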
