Automatic pin placement with Layout XL in Cadence Virtuoso?

I have a large mixed-signal design with 363 pins. Layout XL knows the position of the pins (a green flight line connects each pin to its correct position while I drag it around).
My question is: how can I avoid spending a week on such a tedious activity and make pin placement automatic? I have always done it manually, but this time the design is too big. There MUST be a way to run a script, or issue a command from some menu, to save time and effort.

When you are in Layout XL, navigate to Connectivity/Update/Components And Nets. Click the I/O Pins tab and select all the pins you wish to have auto-placed. Choose the pin and label layers you want, click Update at the bottom right of the "Specify Pins to be Generated" tab, then click OK.
The pins you just generated will appear around the origin of the layout. To auto-place them according to the instances in your schematic, navigate to Place/Analog/Adjust Cell Pins.

Related

Make mouse move via a virtual mouse device (non-Xorg)

My goal is to remap a physical device (an Aimtrak lightgun), which acts like a mouse, to different button input (the X/Y axes are fine).
Currently the trigger registers as a right mouse click, but I want it to become a left mouse click.
I followed the evdev tutorial, adjusting the "create uinput" and "injecting" parts.
There is no error, but the mouse does not move and the button is not clicked.
It would also be fine to grab the device and control all of its input.
But I am unable to make it work.
Any other approach is also welcome.
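One way to sketch this with the python-evdev library: grab the physical device so the desktop no longer sees its raw events, clone it as a uinput device, and rewrite BTN_RIGHT to BTN_LEFT while forwarding everything else unchanged. The device path below is a hypothetical placeholder (find the real node with `python -m evdev.evtest` or by listing /dev/input/event*); the numeric event constants come from <linux/input-event-codes.h>.

```python
# Linux input event constants (values from <linux/input-event-codes.h>)
EV_KEY = 0x01
BTN_LEFT = 0x110
BTN_RIGHT = 0x111

def remap_code(ev_type, code):
    """Map a right-button key event to a left-button one; pass others through."""
    if ev_type == EV_KEY and code == BTN_RIGHT:
        return BTN_LEFT
    return code

def main(device_path="/dev/input/event5"):  # hypothetical event node
    from evdev import InputDevice, UInput
    dev = InputDevice(device_path)
    dev.grab()  # exclusive access: X/Wayland no longer sees the raw device
    # Create a virtual uinput device with the same capabilities as the source.
    ui = UInput.from_device(dev, name="aimtrak-remapped")
    try:
        for ev in dev.read_loop():
            # Forward every event, remapping only the trigger button.
            ui.write(ev.type, remap_code(ev.type, ev.code), ev.value)
            # read_loop() also delivers the device's SYN_REPORT events, so
            # they are forwarded like any other event; no explicit syn() call.
    finally:
        dev.ungrab()
        ui.close()
```

Note that forwarding must include the SYN events, otherwise the compositor never flushes the injected motion, which matches the "no error, but nothing moves" symptom.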

How can I configure MRTK to work with touch input in editor and on mobile devices?

I'm building an application that will run on both HoloLens and mobile devices (iOS/Android). I'd like to be able to use the same manipulation handlers on all devices with the goals:
Use ARFoundation for mobile device tracking and input
Use touch input with MRTK with ManipulationHandler and otherwise use touch input as normal (UI)
Simulate touch input in the editor (using a touch screen or mouse) but retain the keyboard/mouse controller for camera positioning.
So far I've tried/found:
MixedRealityPlayspace always parents the camera, so I added the ARSessionOrigin to that component, and all the default AR components to the camera (ARCameraManager, TrackedPoseDriver, ARRaycastManager, etc.)
Customizing the MRTK pointer profile to contain only MousePointer and TouchPointer.
Removing superfluous input data providers.
Disabling Hand Simulation in the InputSimulationService
Generally speaking, the method of adding the ARSessionOrigin to the MixedRealityPlayspace works as expected and ARFoundation is trivial to set up. However, I am struggling to understand how to get the ManipulationHandler to respond to touch input.
I've run into the following issues:
Dragging on a touch screen with a finger moves the camera (editor). Disabling the InputSimulationService fixes this, but then I'm unable to move the camera...
Even with the camera disabled, clicking and dragging does not affect the ManipulationHandler.
The debug rays are drawn in the correct direction, but the default touchpointer rays draw in strange positions.
I've attached a .gif explaining this. This is using touch input in the editor. The same effect is observed running on device (Android).
This also applies to Unity UI (world space canvas) whereby clicking on a UI element does not trigger (on device or in editor), which suggests to me that this is a pointer issue not a handler issue.
I would appreciate some advice on how to correctly configure the touch input and mouse input both in editor and on device, with the goal being a raycast from the screen point using the projection matrix to create the pointer, and use two-finger touch in the same way that two hand rays are used.
Interacting with Unity UI in world space on a mobile phone is supposed to work in MRTK, but there are a few bugs in the input system preventing it from working. The issue is tracked here: https://github.com/microsoft/MixedRealityToolkit-Unity/issues/5390.
The fix has not been checked in, but you can apply a workaround for now (thanks largely to the work you yourself did, newske!). The workaround is posted in the issue. Please see https://gist.github.com/julenka/ccb662c2cf2655627c95ffc708cf5a69. Just replace each file in MRTK with the version in the gist.

Active X Command Buttons with Second Display

I put ActiveX command buttons in an Excel workbook. They work just fine when the workbook is displayed on the primary monitor, but not when it is displayed on the secondary monitor. How do I fix this?
I'm not sure if it is hardware specific, but I run an MS Surface Pro 5. At home it is connected to a Surface Dock with two monitors; the device screen is duplicated to one monitor (primary) and the desktop is extended to the other. On the go I use a portable monitor through the Mini DisplayPort. The problem occurs in both configurations.
Go to File -> Options -> General, and under 'When using multiple displays' select 'Optimize for compatibility (application restart required)'.
Brad, I had the same problem: the ActiveX buttons became unclickable when connected to the secondary display. The problem is solely due to differing resolutions. Ensure that the secondary monitor has the same resolution as the primary monitor. In my case, my secondary monitor was a projector, and I changed its resolution to 1366x768, which is what the laptop uses. It then worked.
Unfortunately I had the same issue. It's not related to hardware but to software. If you change the resolution of the screen or projector (in my case), the ActiveX buttons start to grow and deform.
The solution I used is to run this code on every ActiveX action (just call the sub), to make sure the buttons/labels etc. are in their right place.
Sub AlignBTN()
    ' Snap the ActiveX button "BTN" on worksheet "WS1" back onto its
    ' anchor cell and resize it to the A14:F15 range, undoing the
    ' drift/deformation caused by a resolution change.
    Dim ws As Worksheet
    Dim Rng As Range
    Set ws = ThisWorkbook.Worksheets("WS1")
    Set Rng = ws.Range("A14:F15")
    With ws.OLEObjects("BTN")
        .Width = Rng.Width
        .Height = Rng.Height
        .Left = .TopLeftCell.Left
        .Top = .TopLeftCell.Top
    End With
End Sub
What this code does: in this example we have an ActiveX button named 'BTN' on the worksheet named 'WS1'.
Once this sub runs, it aligns BTN to the A14:F15 range.
So if you change the resolution and then run this code, your ActiveX controls will be re-adjusted.
You can of course loop over all controls with a Do ... Loop Until procedure.
I dealt with the same issue on a computer a couple of months ago and ended up changing the ActiveX buttons to form controls. A couple of days ago the same thing happened, BUT this time I found, together with IT support at the company I work for, that you have to match the SCREEN SCALE on all the monitors you have connected, and that should solve the problem.
I know it sounds silly, but that's the solution.

How to toggle the external screen in Windows 10

I have a laptop connected to an external monitor. I need to do a presentation where I have PowerPoint open on one screen and Excel on the other, and toggle what the external screen shows.
So: the laptop screen should always show Excel, but the external screen should show either PowerPoint or Excel.
I put the display into Extend mode and put PowerPoint on the second screen. So far so good. But then I need to switch the external screen to Excel. If I change to Duplicate mode, my PowerPoint moves to the main screen and I cannot easily switch back.
Is there a way to quickly and easy switch only the external monitor between main and extended screens?
You need to be on Extended Mode always. Based on your requirement, you can manually drag the respective Application (in your case Excel/PowerPoint) to whichever screen you want.
You can also tweak the PowerPoint Slide Show settings to define which monitor to use and whether to use Presenter View (see the screenshot).
Hope this helps!

How to put a button into each card?

I'd like to put a 'next' button onto each card of a stack. In HyperCard I could put buttons into either cards or backgrounds. I have not yet seen how to make a button appear on each card.
In LiveCode you can create a background group that appears on every card. First create your button(s), select it (them), and press "Group" in the toolbar; that creates a group. In the inspector for the group, select "Behave like a background". All new cards will then include your button(s) automatically. If you have already created a bunch of cards, you can add the group to them by selecting "Object => Place Group" in the menu.
Backgrounds are probably the single biggest thing to unlearn if you're coming from a HyperCard background because they act a bit differently in LiveCode. But they have a lot more power than the old HC backgrounds did, so the pain in making the transition pays off well in the long run.