I need something like this, but for Phaser

I need something for adding an object in Phaser. This is something similar, but in WADE:
wade.addSceneObject(new SceneObject(dotSprite, 0, dotPosition.x, dotPosition.y));

Adding objects (usually called sprites) in Phaser is super simple. Just load the image in the preload function:
function preload() {
    game.load.image('mushroom', 'assets/sprites/mushroom2.png');
}
And then add the sprite in the create function:
function create() {
    // This simply creates a sprite using the mushroom image we loaded above and positions it at 200 x 200
    var test = game.add.sprite(200, 200, 'mushroom');
}
Phaser has a ton of documentation on how to do things like this. I got the code from this example.
If you are completely new to Phaser, I highly recommend going through their tutorial.

Related

How to create a Mapbox Marker OnClick EventListener in Android Studio?

Hello Stack Overflow community,
I am trying to develop an Android application with Mapbox.
I followed this guide to create markers on the map.
https://docs.mapbox.com/android/maps/examples/default-point-annotation/
Thus my code is the following:
public fun createMarker(id: String, lon: Double, lat: Double) {
    // Create an instance of the Annotation API and get the PointAnnotationManager.
    var marker: PointAnnotation? = bitmapFromDrawableRes(
        drawercontext,
        R.drawable.red_marker
    )?.let {
        val annotationApi = binding.mapBoxView.mapView?.annotations
        val pointAnnotationManager =
            annotationApi?.createPointAnnotationManager(binding.mapBoxView.mapView!!)
        // Set options for the resulting symbol layer.
        val pointAnnotationOptions: PointAnnotationOptions = PointAnnotationOptions()
            // Define a geographic coordinate.
            .withPoint(Point.fromLngLat(lon, lat))
            // Specify the bitmap you assigned to the point annotation
            // The bitmap will be added to map style automatically.
            .withIconImage(it)
        // Add the resulting pointAnnotation to the map.
        pointAnnotationManager?.create(pointAnnotationOptions)
    }
}
Unfortunately, I cannot find any solution for adding a click listener to markers (to show extra information outside of the map). In my opinion, this should be an important event, so I don't get why there is so little support. I want to replicate something like this:
https://bl.ocks.org/chriswhong/8977c0d4e869e9eaf06b4e9fda80f3ab
But in Android Studio with Kotlin.
One workaround I have seen is to add a click listener to the map and from there determine the marker with the closest coordinates, but I think that would not be as nice of a solution. Do you know any solutions or workarounds to my problem?
Thanks for the help in advance!
try this:
pointAnnotationManager.apply {
    addClickListener(
        OnPointAnnotationClickListener {
            Toast.makeText(this@MainActivity, "id: ${it.id}", Toast.LENGTH_LONG).show()
            false
        }
    )
}

Can I use HaxeUI with HaxeFlixel?

I tried to use both HaxeUI and HaxeFlixel, but what I get is HaxeUI's interface over a white background, covering everything underneath. Moreover, even if it were possible to somehow make HaxeUI and HaxeFlixel work together, it's not clear how to change the HaxeUI interface when the state changes in HaxeFlixel. Here is the code I used:
private function setupGame():Void {
    Toolkit.theme = new GradientTheme();
    Toolkit.init();

    var stageWidth:Int = Lib.current.stage.stageWidth;
    var stageHeight:Int = Lib.current.stage.stageHeight;

    if (zoom == -1) {
        var ratioX:Float = stageWidth / gameWidth;
        var ratioY:Float = stageHeight / gameHeight;
        zoom = Math.min(ratioX, ratioY);
        gameWidth = Math.ceil(stageWidth / zoom);
        gameHeight = Math.ceil(stageHeight / zoom);
    }
    trace('stage: ${stageWidth}x${stageHeight}, game: ${gameWidth}x${gameHeight}, zoom=$zoom');

    addChild(new FlxGame(gameWidth, gameHeight, initialState, zoom, framerate, framerate, skipSplash, startFullscreen));

    Toolkit.openFullscreen(function(root:Root) {
        var view:IDisplayObject = Toolkit.processXmlResource("assets/xml/haxeui-resource.xml");
        root.addChild(view);
    });
}
I can guess that both HaxeUI and HaxeFlixel probably have their own main loop and that their event handling might not be compatible, but just in case, does someone have a more definitive answer?
Edit:
Actually, it's much better when using openPopup:
Toolkit.openPopup( { x:20, y:150, width:100, height:100 }, function(root:Root) {
    var view:IDisplayObject = Toolkit.processXmlResource("assets/xml/haxeui-naming.xml");
    root.addChild(view);
});
It's possible to interact with the rest of the screen (managed by HaxeFlixel), but the mouse pointer in the HaxeFlixel-managed part of the screen remains under the HaxeUI user interface elements.
When using Flixel and HaxeUI together, it's almost like running two applications at once. However, they both rely on OpenFL as a back-end, and each attaches itself to its display tree.
One technique I'm experimenting with right now is to open a Flixel sub state, and within the sub state, call Toolkit.openFullscreen(). From inside of this, you can set the alpha of the root's background to 0, which allows you to see through it onto the underlying bitmap that Flixel uses to render.
Here is a minimal example of how you might "embed" an editor interface inside a Flixel sub state:
import haxe.ui.toolkit.core.Toolkit;
import haxe.ui.toolkit.core.RootManager;
import haxe.ui.toolkit.themes.DefaultTheme;
import flixel.FlxG;
import flixel.FlxSubState;

// This would typically be a Haxe UI XMLController
import app.MainEditor;

class HaxeUIState extends FlxSubState
{
    override public function create()
    {
        super.create();

        // Flixel uses a sprite-based cursor by default,
        // so you need to enable the system cursor to be
        // able to see what you're clicking.
        FlxG.mouse.useSystemCursor = true;

        Toolkit.theme = new DefaultTheme();
        Toolkit.init();
        Toolkit.openFullscreen(function (root) {
            var editor = new MainEditor();

            // Allows you to see what's going on in the sub state
            root.style.backgroundAlpha = 0;
            root.addChild(editor.view);
        });
    }

    override public function destroy()
    {
        super.destroy();

        // Switch back to Flixel's cursor
        FlxG.mouse.useSystemCursor = false;

        // Not sure if this is the "correct" way to close the UI,
        // but it works for my purposes. Alternatively you could
        // try opening the editor in advance, but hiding it
        // until the sub-state opens.
        RootManager.instance.destroyAllRoots();
    }

    // As far as I can tell, the update function continues to get
    // called even while Haxe UI is open.
    override public function update() {
        super.update();

        if (FlxG.keys.justPressed.ESCAPE) {
            // This will implicitly trigger destroy().
            close();
        }
    }
}
In this way, you can associate different Flixel states with different Haxe UI controllers. (NOTE: they don't strictly have to be sub-states; that's just what worked best in my case.)
When you open a fullscreen or popup with HaxeUI, the program flow will be blocked (your update() and draw() functions won't be called). You should probably have a look at flixel-ui instead.
In my experience, HaxeFlixel and HaxeUI work well together, but they are totally independent projects, and as such, any coordination between Flixel states and the displayed UI must be added by the coder.
I don't recall having the white background problem you mention; it shouldn't happen unless the HaxeUI root sprite has a solid background, in which case it should be reported to the HaxeUI project maintainer.

CPaintDC in CStatic not cutting off drawing

As far as I know, my OnPaint() method in a derived CStatic control is supposed to cut off the parts of the drawing that are bigger than the control.
However, it doesn't do this.
void CGraph::OnPaint ()
{
    CPaintDC dc(this);
    dc.SetViewportOrg (0, 400);
    dc.SetMapMode(MM_ISOTROPIC);
    dc.SetWindowExt(1000, 800);
    dc.SetViewportExt(1000, -800);
    // MessageBox(L"OnPaint");
    ProcessData ();
    DrawCoordinateSystem (&dc);
    DrawGrid (&dc);
    DrawGraph (&dc);
}
Depends on the definition of a method.
OnPaint is not really a method; it is a member function used to handle the WM_PAINT message by mapping it in the message map array.
For C++, I personally prefer to call it a member function for clarity.
You can create your own handler using the ON_MESSAGE macro and call it whatever you desire.
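As an illustration, here is a minimal sketch of how such a mapping could look; the custom message (WM_APP + 1) and the OnRedrawGraph handler name are placeholders for illustration, not taken from your code:
BEGIN_MESSAGE_MAP(CGraph, CStatic)
    ON_WM_PAINT()                                   // routes WM_PAINT to OnPaint()
    ON_MESSAGE(WM_APP + 1, &CGraph::OnRedrawGraph)  // hypothetical custom message
END_MESSAGE_MAP()

// Declared in the class as: afx_msg LRESULT OnRedrawGraph(WPARAM, LPARAM);
LRESULT CGraph::OnRedrawGraph(WPARAM /*wParam*/, LPARAM /*lParam*/)
{
    Invalidate();   // marks the control dirty; Windows then sends WM_PAINT
    return 0;
}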
The code snippet does not tell much, since it does not show the code for drawing (painting). You should include the code that actually performs the painting.
It would be best if you could attach a project demonstrating your problem.
Did you try to paint a simple bitmap that is bigger than the window?
@JohnCz Got it now.
CDC* pDC = GetDC();
CRect rClient;
GetClientRect(&rClient);
CRgn ClipRgn;
if (ClipRgn.CreateRectRgnIndirect(&rClient))
{
    pDC->SelectClipRgn(&ClipRgn);
}
// Drawing content
pDC->SelectClipRgn(NULL);
ReleaseDC(pDC);
Thanks for your answer
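For reference, the same clipping can also be set directly on the CPaintDC inside OnPaint(), which avoids the separate GetDC()/ReleaseDC() pair. This is only a minimal sketch, reusing the ProcessData/Draw helpers from the question; SelectClipRgn works in device units, so the client rectangle can be used as-is:
void CGraph::OnPaint()
{
    CPaintDC dc(this);

    // Restrict all drawing to the control's client area so nothing
    // spills outside the CStatic.
    CRect rClient;
    GetClientRect(&rClient);
    CRgn clipRgn;
    if (clipRgn.CreateRectRgnIndirect(&rClient))
        dc.SelectClipRgn(&clipRgn);

    dc.SetViewportOrg(0, 400);
    dc.SetMapMode(MM_ISOTROPIC);
    dc.SetWindowExt(1000, 800);
    dc.SetViewportExt(1000, -800);

    ProcessData();
    DrawCoordinateSystem(&dc);
    DrawGrid(&dc);
    DrawGraph(&dc);
}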

Extending the YUI Panel

I have a requirement to extend the YUI Panel with some custom functionality that will be in a new file and shared across multiple views.
I am at a bit of a loss as to how best to go about this; can anyone give me any pointers, please?
Let's say you want to extend a Panel to create one that has a list in its body. I usually use Y.Base.create for this. It's a more declarative way of extending YUI classes than using a constructor and Y.extend. But I'll stay closer to your example in the YUI forums.
There are a couple of tricks dealing with WidgetStdMod (one of the components of Y.Panel), but mostly it's just about using Y.extend and following the YUI inheritance patterns. I'll try to answer with an example:
function MyPanel() {
    MyPanel.superclass.constructor.apply(this, arguments);
}

// hack: call it the same so you get the same css class names
// this is good for demos and tests. probably not for real life
MyPanel.NAME = 'panel';

MyPanel.ATTRS = {
    listItems: {
        // YUI now clones this array, so all's right with the world
        value: []
    },
    bodyContent: {
        // we want this so that WidgetStdMod creates the body node
        // and we can insert our list inside it
        value: ''
    }
};

Y.extend(MyPanel, Y.Panel, {
    // always a nice idea to keep templates in the prototype
    LIST_TEMPLATE: '<ul class="yui3-panel-list"></ul>',

    initializer: function (config) {
        // you'll probably want to use progressive enhancement here
        this._listContainer = Y.Node.create(this.LIST_TEMPLATE);

        // initializer is also the place where you'll want to instantiate other
        // objects that will live inside the panel
    },

    renderUI: function () {
        // you're inheriting from Panel, so you'll want to keep its rendering logic
        // renderUI/bindUI/syncUI don't call the superclass automatically like
        // initializer and destructor
        MyPanel.superclass.renderUI.call(this);

        // Normally we would append stuff to the body in the renderUI method
        // Unfortunately, as of 3.5.0 YUI still removes all content from the body
        // during renderUI, so we either hack it or do everything in syncUI
        // Hacking WidgetStdModNode is doable but I don't have the code around
        // and I haven't memorized it
        //var body = this.getStdModNode('body');
    },

    syncUI: function () {
        // same here
        MyPanel.superclass.syncUI.call(this);

        // insert stuff in the body node
        var listContainer = this._listContainer.appendTo(this.getStdModNode('body'));

        Y.Array.each(this.get('listItems'), function (item) {
            listContainer.append('<li>' + item + '</li>');
        });
    }
});

Ogre material LOD: How do I set it up?

I have an Ogre material and, since I'm not happy with the GPU filtering, I want to do manual mipmapping, i.e., I create all the textures and then set up a LOD-based strategy to load the correct texture.
The problem is: no matter which strategy I use, nor which lod_value, my material does not change the texture. What should I do?
I've been reading the manual, but it really didn't help.
Here is my code:
material shader/content
{
    lod_values 100.0

    technique t1
    {
        lod_index 0
        pass
        {
            scene_blend alpha_blend
            depth_write off
            texture_unit
            {
                filtering none
                texture menu_image.png
            }
        }
    }

    technique t2
    {
        lod_index 1
        pass
        {
            scene_blend alpha_blend
            depth_write off
            texture_unit
            {
                filtering none
                texture menutest.png
            }
        }
    }
}
What you have posted seems correct.
Which image is shown? I assume menu_image.png.
Things to try:
Are you sure menutest.png is loaded along with your other Ogre assets?
Are there any relevant messages in your Ogre logs?
What version of Ogre are you using? Try lod_distances instead of lod_values.
Could there be code modifying the material/techniques at runtime?
