Zoom on scroll interaction in Figma - frontend

I want to build a Figma prototype that lets users zoom in and out from one photo to another using scrolling/pinching on a trackpad.
I could also make one huge photo and zoom in and out of that.
The idea is to make an interface in which zooming in and out changes the photo.
Any thoughts on how to do it?
Wireframe demonstration below:
[wireframe image]
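Not a Figma-specific answer, but if a quick code prototype helps communicate the interaction, the mechanic itself is small: accumulate wheel/pinch deltas into a zoom level, scale the current photo, and swap to the next photo once the zoom crosses a threshold. A rough TypeScript sketch for the browser; the #viewer container, the .photo class, and the threshold values are all hypothetical choices made for illustration:

```typescript
// Hypothetical markup: <div id="viewer"><img class="photo" src="1.jpg"> ... </div>
const photos = Array.from(document.querySelectorAll<HTMLImageElement>('#viewer .photo'));
let current = 0;              // index of the photo being shown
let zoom = 1;                 // current zoom factor for that photo

const ZOOM_IN_LIMIT = 3;      // past this, jump to the next photo
const ZOOM_OUT_LIMIT = 1 / 3; // past this, jump back to the previous photo

function render(): void {
  photos.forEach((img, i) => {
    img.style.display = i === current ? 'block' : 'none';
    if (i === current) img.style.transform = `scale(${zoom})`;
  });
}

document.getElementById('viewer')!.addEventListener('wheel', (e) => {
  e.preventDefault();
  // Trackpad pinch arrives as a wheel event (with ctrlKey set) in most browsers.
  zoom *= Math.exp(-e.deltaY * 0.002);

  if (zoom > ZOOM_IN_LIMIT && current < photos.length - 1) {
    current += 1;             // zoomed far enough in: switch to the next photo
    zoom = ZOOM_OUT_LIMIT;    // start it zoomed out so the transition feels continuous
  } else if (zoom < ZOOM_OUT_LIMIT && current > 0) {
    current -= 1;             // zoomed far enough out: switch back
    zoom = ZOOM_IN_LIMIT;
  }
  render();
}, { passive: false });

render();
```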

Related

ImageView like Google Maps in Android Studio

So I want to make an ImageView in Android Studio that can pan, zoom in, and zoom out like Google Maps. In the ImageView I only want to display one picture, but I am not able to figure out a way to accomplish that.
Please be as detailed as possible.
I appreciate the help.
Check this library out: https://github.com/stfalcon-studio/FrescoImageViewer
You can also zoom with a double tap.

How to use "pan screen" in Android Studio?

I recently found a function in Android Studio named "pan screen". It is located on the right side of the design window, but I didn't find any description of it.
If you zoom into your design enough that it no longer fits in your design area in Android Studio, you can hit the pan icon and drag your design around.
The image you have shown is the Zoom and Pan Controls, which allow you to zoom in and out of the design. The Pan button helps you pan around to reach the areas that are not visible when zoomed in.
Pan means to swing in a horizontal or vertical plane, typically to give a panoramic effect or to follow a subject.
To pan in Android Studio, you can hold the space bar and drag the screen to view the areas that are currently not visible.
When you zoom into the layout so much that you can no longer see all of its items, you can click the pan option and then move the layout around without accidentally moving the items in it.

What's the official way to zoom and pan the canvas in FabricJS?

I see that for zooming and panning in FabricJS there are the methods canvas.setZoom, absolutePan, and relativePan, but can anyone show with example code how to use them, especially how to drive them from both mouse and touch events?
Thanks
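A common pattern is to zoom toward the cursor from a mouse:wheel handler with canvas.zoomToPoint (which is essentially setZoom plus a compensating pan, so the point under the cursor stays fixed), and to pan from mouse:move with canvas.relativePan while a drag is in progress. A minimal TypeScript sketch, assuming Fabric v5 with the @types/fabric typings and a <canvas id="c"> element; the Alt-key-to-pan gesture and the zoom limits are arbitrary choices:

```typescript
import { fabric } from 'fabric';

const canvas = new fabric.Canvas('c');

// Zoom toward the cursor with the mouse wheel.
canvas.on('mouse:wheel', (opt) => {
  const e = opt.e as WheelEvent;
  let zoom = canvas.getZoom();
  zoom *= Math.pow(0.999, e.deltaY);          // smooth exponential zoom
  zoom = Math.min(20, Math.max(0.1, zoom));   // clamp to sensible limits
  canvas.zoomToPoint(new fabric.Point(e.offsetX, e.offsetY), zoom);
  e.preventDefault();
  e.stopPropagation();
});

// Drag to pan while the Alt key is held (any modifier or mode toggle works).
let isPanning = false;
let lastX = 0;
let lastY = 0;

canvas.on('mouse:down', (opt) => {
  const e = opt.e as MouseEvent;
  if (e.altKey) {
    isPanning = true;
    lastX = e.clientX;
    lastY = e.clientY;
    canvas.selection = false;                 // don't rubber-band-select while panning
  }
});

canvas.on('mouse:move', (opt) => {
  if (!isPanning) return;
  const e = opt.e as MouseEvent;
  canvas.relativePan(new fabric.Point(e.clientX - lastX, e.clientY - lastY));
  lastX = e.clientX;
  lastY = e.clientY;
});

canvas.on('mouse:up', () => {
  isPanning = false;
  canvas.selection = true;
});
```

For touch, Fabric's own gesture events (touch:gesture, touch:drag) are only available in the gesture-enabled build; otherwise you can listen for native pointer/touch events on the underlying canvas element and feed the computed pinch scale into zoomToPoint in the same way.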

MergEXT MergZXing layer and barcode not reading

Just started working with this awesome external but have a couple of questions.
When the control is invoked, is it always the top layer, or can I have a transparent background image on top of it so I can frame the control nicely?
Also, my testing reads most barcodes, but when it comes to reading the barcodes on hard drives, the control does not want to decode them. Is the barcode pattern too dense?
I am very impressed thus far with the ease of use of your externals. Makes me want to code more for mobile devices!
An overlaying transparent image is not possible, as far as I know.
But couldn't you use
command mergZXingControlSetRect pLeft,pTop,pRight,pBottom
to define the rect of the scanner after creation, or
command mergZXingControlCreate pLeft,pTop,pRight,pBottom
to create the scanner control in the specified rect?
Set the rect smaller than the width and the height of the screen.
You could then use an underlying image, displayed outside of the scanner rect, to show a frame around the scanner control. I did not test it myself, but I would assume this should work.
Unfortunately, the native controls in externals and the ones the engine provides are added as views on top of the LiveCode view. That means you can't intermingle LiveCode controls with them. One thing some users have done is add a web view with a transparent background and load a PNG image into it. If you create the barcode view first and the web view second, the web view will be on top.

UIImageView doesn't obey its content mode

I am designing an iOS app with a view controller that has an embedded view controller at the bottom and an image view at the top. However, I'm having problems with the image view. Without the image view, everything is OK. When I add the image view, the image doesn't fit into it correctly. Here is how it looks in Interface Builder (set to Scale to Fill):
When I run it on my iPod Touch (iOS 6), this is how it looks:
The image's aspect ratio is different from what Interface Builder shows. Then I try setting the content mode to Aspect Fill (which is actually the one I want):
This is exactly what I want, but when I run it, the image fills almost the entire screen:
Notice how my friend picker view controller is left behind the image view. The layout of the entire view controller is correct (the embedded view controller is positioned correctly; notice that where the image ends in the last screenshot, the list starts from J, meaning it continues up behind the image). It's only the image view that is taking up too much space. I've tried different arrangements of the frame rectangle and tried toggling Auto Layout in iOS 6; neither helped. Am I missing an obvious point, or is it a bug in the iOS layout system? (Probably the latter, since WYSIWYG fails between Interface Builder and the actual app.) Are there any workarounds, other than cropping the image in Photoshop to exactly the size I want?
Thanks,
Can.
UPDATE: A workaround I've found for now is to put the image view inside another view and tick the parent view's Clip Subviews property. But the image view itself is still not behaving correctly, and even though this solves my immediate problem, it is not a real solution, so proper answers are still welcome.
