The aim of my project is to get a live camera feed from an Android device, use OpenCL to perform real-time filtering on those images, and render the output to the display.
I aim to do this in real time, which is why I am using OpenCL-OpenGL interop.
I have successfully managed to create a shared context using EGLContext and EGLDisplay. Now I am trying to use clCreateFromGLTexture so I can access these images in an OpenCL kernel. The problem, however, is that Android requires the texture target to be GL_TEXTURE_EXTERNAL_OES when binding the camera texture, as documented here: (http://developer.android.com/reference/android/graphics/SurfaceTexture.html), yet this is not a valid texture target for clCreateFromGLTexture (https://www.khronos.org/registry/cl/sdk/1.1/docs/man/xhtml/clCreateFromGLTexture2D.html).
So I am not sure how to go about this.
This is how I create a GL texture in Android:
GLES20.glGenTextures(1, texture_id, 0);
GLES20.glBindTexture(texture_target, texture_id[0]);
and this is how I am trying to create a cl memory object:
glTexImage2D(texture_target, 0, GL_RGBA, 640, 480, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
cl_mem camera_image = clCreateFromGLTexture(m_context, CL_MEM_READ_WRITE, texture_target, 0, texture_id, &err);
The error I get when I try to create cl memory object from GL texture is CL_INVALID_VALUE.
I am pretty new to OpenGL, so there could be something basic I might have overlooked.
The texture you receive from the camera is not the usual texture you'd expect; you even have to enable an extension in the shader to sample from it.
You need to perform an additional copy from the GL_TEXTURE_EXTERNAL_OES target into another texture created in the usual way. With luck you can bind both of them to FBOs and just issue a blit. If that doesn't work, you can always use the normal texture as a render target and simply draw a quad textured with the camera image.
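As a sketch of the quad approach (the uniform and varying names are illustrative, not from the original post): render a fullscreen quad into an FBO whose color attachment is a regular GL_TEXTURE_2D, sampling the camera texture with a fragment shader along these lines:

```glsl
#extension GL_OES_EGL_image_external : require
precision mediump float;

// Sampler bound to the GL_TEXTURE_EXTERNAL_OES camera texture
uniform samplerExternalOES u_camera;
varying vec2 v_texCoord;

void main() {
    gl_FragColor = texture2D(u_camera, v_texCoord);
}
```

The GL_TEXTURE_2D attached to the FBO can then be handed to clCreateFromGLTexture with the GL_TEXTURE_2D target.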
I am trying to integrate PixiJS with an existing custom WebGL engine. The existing custom engine is the host and hands control to PixiJS every frame. The existing custom engine configures the WebGL state to an "almost" default state and then calls into PixiJS; after PixiJS is done, the existing custom engine does a full reset of the WebGL state.
In code:
onFrame() {
    resetWebGLStateToDefault(gl);
    gl.bindFramebuffer(...)
    gl.viewport(...)

    thenWeUsePixiJSToDoSomeAdvancedStuff();

    resetWebGLStateToDefault(gl);
}
My question
In thenWeUsePixiJSToDoSomeAdvancedStuff(), how can I tell PixiJS that the state is not what it used to be the previous time that it ran? Pretty much everything has been reset; PixiJS should assume that everything is default and I would also like to tell PixiJS what the current viewport and framebuffer are.
I tried Renderer.reset, StateSystem.reset, StateSystem.forceState but I guess that's not enough; PixiJS keeps assuming that some textures that it has set previously are still bound (they are not, the existing custom engine unbinds everything) and I get lots of [.WebGL-0x7fd105545e00]RENDER WARNING: there is no texture bound to the unit ?. Pretty much for all texture units, 1-15, except the first one.
Edit
It's probably worth mentioning that I am calling into the renderer directly; I think I need to because the existing custom engine owns the render loop. I am basically trying something like this, but I am getting the WebGL texture errors after the first frame.
renderer.reset();
renderer.render(sprite);
renderer.reset();
Edit
I tried the same thing with an autoStart: false application, and I get the same error.
pixiApp.renderer.reset();
pixiApp.render();
pixiApp.renderer.reset();
The issue appears to be that I was calling into PixiJS with a currently bound FBO; I fixed all my problems by creating a separate PIXI.RenderTexture, rendering there, and then compositing on top of my WebGL engine using a fullscreen quad.
// Create a render texture
const renderTexture = PIXI.RenderTexture.create(640, 360);
// Render with PixiJS
renderer.reset();
renderer.render(this.stage, renderTexture);
renderer.reset();
// Retrieve the raw WebGL texture
const texture = renderTexture.baseTexture._glTextures[renderer.texture.CONTEXT_UID].texture;
// Now composite on top of the other engine
gl.bindFramebuffer(gl.FRAMEBUFFER, theFramebufferWhereINeededPixiJSToRenderInTheFirstPlace);
gl.useProgram(quadProgram);
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.uniform1i(u_Texture, 0);
gl.bindBuffer(gl.ARRAY_BUFFER, quadBuffer);
gl.vertexAttribPointer(0, 2, gl.BYTE, false, 2, 0);
gl.enableVertexAttribArray(0);
gl.bindBuffer(gl.ARRAY_BUFFER, null);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
gl.useProgram(null);
You may need to resize() the renderer and/or the render texture, depending on your actual setup.
I have spent the afternoon looking over the documentation on the contexts / surfaces and followed quite a few guides but I just do not understand how this is done.
All I want is to use a bitmap (already loaded) and to put it into my scene as the background.
I heard that I have to use a surface and draw it first but I have absolutely no idea how to obtain the surface or how to assign the bitmap to it.
Any help is appreciated.
Yes, one method is to use a Surface; however, I would recommend the following approach instead.
I am not sure how you have loaded the bitmap, but in any case you can use a bitmap as the background this way:
// Make a texture object
LPDIRECT3DTEXTURE9 m_myBitmapTexture;

// During initialization, load the texture from file
if (FAILED(D3DXCreateTextureFromFileEx(device, "filepath\\file.bmp", 0, 0, 0, 0,
        D3DFMT_UNKNOWN, D3DPOOL_DEFAULT, D3DX_DEFAULT, D3DX_DEFAULT,
        0x00000000, NULL, NULL, &m_myBitmapTexture)))
    return E_FAIL;

// During rendering, set the texture and draw
device->SetTexture(0, m_myBitmapTexture);
device->SetStreamSource(0, yourBuffer, 0, sizeof(YourBufferStruct));
device->SetFVF(yourTextureFVF); // set the flexible vertex format
device->DrawPrimitive(topologyType, startVertex, primitiveCount);
You just need to make sure your vertex buffer contains texture coordinates, and that your vertex format (and shader, if you use one) declares them:
struct YourBufferStruct
{
    D3DXVECTOR3 position;
    D3DXVECTOR2 textureCoord;
};

// Define your flexible vertex format; I am just adding position and texture,
// but you can add color, normal, or whatever extra you want
#define yourTextureFVF (D3DFVF_XYZ | D3DFVF_TEX1)
Now add the texture coordinates to your shader too.
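For completeness, creating and filling yourBuffer with a background quad could look something like this (a sketch only; the vertex positions, pool choice, and variable names are assumptions, not from the original answer):

```cpp
// Hypothetical background quad covering the viewport in clip-ish space,
// with texture coordinates spanning the whole bitmap.
YourBufferStruct quad[4] = {
    { D3DXVECTOR3(-1.0f, -1.0f, 0.0f), D3DXVECTOR2(0.0f, 1.0f) },
    { D3DXVECTOR3(-1.0f,  1.0f, 0.0f), D3DXVECTOR2(0.0f, 0.0f) },
    { D3DXVECTOR3( 1.0f, -1.0f, 0.0f), D3DXVECTOR2(1.0f, 1.0f) },
    { D3DXVECTOR3( 1.0f,  1.0f, 0.0f), D3DXVECTOR2(1.0f, 0.0f) },
};

LPDIRECT3DVERTEXBUFFER9 yourBuffer;
device->CreateVertexBuffer(sizeof(quad), 0, yourTextureFVF,
                           D3DPOOL_MANAGED, &yourBuffer, NULL);

// Copy the vertices into the buffer
void* data;
yourBuffer->Lock(0, sizeof(quad), &data, 0);
memcpy(data, quad, sizeof(quad));
yourBuffer->Unlock();
```

This layout would then be drawn with DrawPrimitive(D3DPT_TRIANGLESTRIP, 0, 2).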
For more details you can consult this link https://msdn.microsoft.com/en-us/library/windows/desktop/bb153262(v=vs.85).aspx
I am new to Vuforia SDK. I have an image which acts as a target. I want to place this image on to the Imagemarker. In real time the size of the Imagemarker varies. Is there any method where I can get the width and height of the Imagemarker so that the target image fits exactly on the Imagemarker?
Since you did not specify if you are using the Unity or native APIs I will assume you are using Unity.
This is how you would go about it using the Vuforia API, placing this in a script attached to your ImageTarget GameObject.
IEnumerator Start()
{
    Vuforia.ImageTarget img = GetComponent<Vuforia.ImageTargetBehaviour>().ImageTarget;

    // This is rounded off in the console display,
    // so the individual components are printed afterwards
    Debug.Log(img.GetSize());
    Debug.Log(img.GetSize().x);
    Debug.Log(img.GetSize().y);
    Debug.Log(img.GetSize().z);

    yield break; // an IEnumerator method must contain a yield statement
}
Alternatively you can directly use the Bounds of the renderer.
void Start()
{
    Renderer r = GetComponent<Renderer>();
    Debug.Log(r.bounds.size.x);
    Debug.Log(r.bounds.size.y);
    Debug.Log(r.bounds.size.z);
}
Needless to say, this is just a quick solution; depending on the situation you might want to use this to create content dynamically at runtime.
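For instance, once you have the size, fitting content to the marker could look like the following sketch (the "ContentQuad" child object is a hypothetical name, not from the original answer):

```csharp
void Start()
{
    Vuforia.ImageTarget img = GetComponent<Vuforia.ImageTargetBehaviour>().ImageTarget;
    Vector3 size = img.GetSize();

    // "ContentQuad" is an assumed child of the ImageTarget GameObject;
    // scale it to the marker's width and height.
    Transform quad = transform.Find("ContentQuad");
    quad.localScale = new Vector3(size.x, size.y, 1f);
}
```

Depending on how the ImageTarget itself is scaled, you may need to divide by the parent's scale so the child ends up at the marker's real-world size.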
Yes, you can.
Place the image on the Image Marker at the relative size you want it to be; when you run it, you will see that the size of the image stays relative to the marker you placed it on.
I need to port a C++ application to Windows Phone 8 (it already runs on Android, iOS, WinCE and Win32). Currently I need to solve how to display graphics. I can get a rendered bitmap from the core application, and after successfully initializing DirectXTK I am able to render a DDS texture (DirectXTK::SpriteBatch). Now I need to turn my bitmap into a texture and render it. Can you help me with this? Or is there some way to put the bitmap directly into the backbuffer and show it on the display without SpriteBatch?
Thank you very much
Tomas
The DirectX Tool Kit has WICTextureLoader. You can use it instead of DDSTextureLoader to load a .bmp (bitmap) file. Hope this helps!
http://directxtk.codeplex.com/wikipage?title=WICTextureLoader&referringTitle=DirectXTK
Since WICTextureLoader is not supported on Windows Phone 8, the only way to render a bitmap to a texture is to map the texture for CPU access and copy your bitmap onto the mapped texture's resource.
ID3D11DeviceContext::Map()
http://msdn.microsoft.com/en-us/library/windows/desktop/ff476457(v=vs.85).aspx
D3D11_MAPPED_SUBRESOURCE mappedBuffer;
HRESULT hr = pContext->Map(pTexture, 0, D3D11_MAP_WRITE_DISCARD, 0, &mappedBuffer);
if (hr == S_OK)
{
    // copy your bitmap onto mappedBuffer.pData,
    // row by row, respecting mappedBuffer.RowPitch
    ...
    pContext->Unmap(pTexture, 0);
}
I have a Bitmap that I want to enlarge programmatically to ~1.5x or 2x its original size. Is there an easy way to do that under .NET CF 2.0?
One "normal" way would be to create a new Bitmap of the desired size, create a Graphics for it and then draw the old image onto it with Graphics.DrawImage(Point, Rectangle). Are any of those calls not available on the Compact Framework?
EDIT: Here's a short but complete app which works on the desktop:
using System;
using System.Drawing;

class Test
{
    static void Main()
    {
        using (Image original = Image.FromFile("original.jpg"))
        using (Bitmap bigger = new Bitmap(original.Width * 2,
                                          original.Height * 2,
                                          original.PixelFormat))
        using (Graphics g = Graphics.FromImage(bigger))
        {
            g.DrawImage(original, new Rectangle(Point.Empty, bigger.Size));
            bigger.Save("bigger.jpg");
        }
    }
}
Even though this works, there may well be better ways of doing it in terms of interpolation etc. If it works on the Compact Framework, it would at least give you a starting point.
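On the full framework, one such improvement is setting Graphics.InterpolationMode before the DrawImage call; I have not verified which modes, if any, the Compact Framework supports, so treat this as a desktop-only sketch:

```csharp
using (Graphics g = Graphics.FromImage(bigger))
{
    // Smoother scaling on the desktop framework; may not exist on CF.
    g.InterpolationMode = System.Drawing.Drawing2D.InterpolationMode.HighQualityBicubic;
    g.DrawImage(original, new Rectangle(Point.Empty, bigger.Size));
}
```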
The CF has access to the standard Graphics and Bitmap objects like the full framework.
Get the original image into a Bitmap
Create a new Bitmap of the desired size
Associate a Graphics object with the NEW Bitmap
Call g.DrawImage() with the old image and the overload to specify width/height
Dispose of things
Versions:
.NET Compact Framework
Supported in: 3.5, 2.0, 1.0