I am running a simple Suitelet with a form, to which I am adding a client script.
form.clientScriptModulePath = './clientScript.js';
It works fine as long as the Suitelet is run from the 'normal' URL.
But if the External URL is used, the client script seems to be completely ignored: no error, just ignored.
Are Client Scripts not available for External URLs in NetSuite? Or is there some workaround for it?
I didn't find any documentation on External URL restrictions.
When you select Available Without Login and then save the Script Deployment record, an External URL
field is displayed on the Script Deployment page (see following figure). Use this URL for Suitelets you want
to make available to users who do not have an active NetSuite session.
Note: The Website feature must be enabled for Client Scripts to work in externally available Suitelets.
Please go to Setup > Company > Enable Features > Web Presence > Website.
The Suitelet deployment should be in Released status to avoid any other errors.
The following table shows how you can specify the localization context based on the script type; for the client script type it reads:
Script Type: SuiteScript 2.0 Client Script Type
Defining Localization Context Filtering: Complete the following steps to add localization context filtering to client scripts:
1. Use the localizationContextEnter and localizationContextExit entry points in your script.
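For illustration, a minimal client-script skeleton using those entry points might look like the sketch below. This is only an assumption of how you would wire it up; the function bodies are placeholders, and the script deployment still needs its localization context filter configured.

/**
 * @NApiVersion 2.1
 * @NScriptType ClientScript
 */
define([], function () {
    function localizationContextEnter(scriptContext) {
        // Placeholder: runs when the user enters a localization context the deployment is filtered for
        console.log('Entered localization context');
    }
    function localizationContextExit(scriptContext) {
        // Placeholder: runs when the user leaves that localization context
        console.log('Exited localization context');
    }
    return {
        localizationContextEnter: localizationContextEnter,
        localizationContextExit: localizationContextExit
    };
});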
Please let me know how this goes!! Happy coding :)
It's been a while, but I think your clientScriptModulePath needs to be absolute for it to work externally. I ran into this a couple of years ago and that turned out to be the solution.
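For reference, a minimal sketch of what that looks like in a SuiteScript 2.x Suitelet; the File Cabinet folder and file name below are assumptions, so adjust them to wherever your client script actually lives.

/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */
define(['N/ui/serverWidget'], function (serverWidget) {
    function onRequest(context) {
        var form = serverWidget.createForm({ title: 'My Form' });
        // Use an absolute File Cabinet path instead of './clientScript.js' so the
        // client script also resolves when the Suitelet is opened via its External URL.
        form.clientScriptModulePath = '/SuiteScripts/clientScript.js';
        context.response.writePage(form);
    }
    return { onRequest: onRequest };
});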
How do I correctly handle the login/authentication scenario for an Azure web app in my VS2015 web performance test?
I created an XML file as a data source for the WAAD username and password. I bind the username and password to the form post parameters login and passwd, respectively, on the request to https://login.microsoftonline.com/xxxx/login
But when I run the test, the Web Browser tab shows this error:
We can't sign you in
Your browser is currently set to block JavaScript. You need to allow
JavaScript to use this service.
To learn how to allow JavaScript or to find out whether your browser
supports JavaScript, check the online help in your web browser.
I also get a number of errors like this:
The value of the ExpectedResponseUrl property
Validation xxxx.azurewebsites.net/xxxx/docs/xxxx.aspx does
not equal the actual response URL
login.microsoftonline.com/xxxx/wsfed. QueryString
parameters were ignored.
Any idea how I can successfully log in to the Azure web app via the web performance test?
There are several methods of login and authentication that can be used. Just binding values to form post parameters may not be sufficient or correct. You will find the login form has hidden session identities that must be passed along with the login data. I find that recording the test twice, using the same inputs and doing the same activities as closely as possible, helps. These two recordings can then be compared to find the dynamic data that needs to be handled.
In a comment the questioner added "I noticed these parameters, n1-43 are different but I have no idea what they represent. How do I handle them?". I can have no idea what they represent as I do not know the website you are testing. You could ask the website developers. Or, better, treat them as dynamic data. Find where the values come from, save them into context variables and use them as needed. This is basic web test development. Here and here are two good articles on what to do.
The message about JavaScript not being supported can be ignored. Visual Studio web tests do not support JavaScript or any other "active" parts of a web page, they only support the html part. Your job as a tester is to simulate what the JavaScript does for the specific user journeys you are testing. That simulation is generally just filling in the correct values (via context parameters) in the recorded requests.
Unexpected response URLs can be due to earlier failures, such as the login not working. I suggest not worrying about them until all of the other test problems are solved. Then, if you need help, ask another new question.
We have SoapUI (Open Source Edition) installed on a Windows jumpbox. Many users can log in with their accounts, open SoapUI, import a WADL/WSDL from a dozen projects, and perform testing.
Since the IP is always the same, we are unable to find out who sent a request, which is a problem when destructive requests are made that cause lots of recovery issues (only authorized users have access).
Now we want to add an HTTP header like user : ${=System.getenv("USERNAME")} to the request. It can be a new header property or even part of the user agent.
We tried to put the property inside HTTP Preferences as part of the user agent string, but the expression is passed through as a literal string.
We also set a global property, but couldn't find a way to insert it into HTTP calls by default.
The only ways we have found so far were:
going to the SoapUI settings of each user and adding headers to all requests one by one (problem: what if the user imports more WSDLs/WADLs later?)
adding a startup script to created projects, so it adds the header by default to everything (problem: users can create new projects at any time; note that each SoapUI instance is individual)
This requirement can easily be fulfilled by SoapUI's Pro edition using the feature called Events.
For example, add the header for each request before submitting the web service / rest service call.
However, you mentioned that the free version is being used. I wrote an extension some time ago which allows us to do the same in the free edition of SoapUI. There is a readme available explaining how to use it. Basically, this extension implements some of the listeners of SoapUI's API while giving end users the flexibility to decide what code should run (in the form of an external file) when the respective event occurs.
Complete the instructions mentioned in the readme.
Then all you need to do is write a Groovy script (already given below) to implement your requirement, i.e., add the header to the request. That needs to be done in a file with a specific name located in a specific directory (details are available there).
In your case, the required code (a mostly working sample is below) should go into a file called RequestStepBeforeSubmit.groovy, in order to add the user name to each request's headers automatically.
The below code snippet should also work if you use the Pro software for the same requirement, when the SubmitListener.beforeSubmit event happens.
// Change the condition if required; this should work for both SOAP and REST request steps
if (context.getProperty("wsdlRequest")) {
    // Get the underlying HTTP request of the test step being submitted
    def request = context.getProperty("wsdlRequest").testStep.httpRequest
    def existingHeaders = request.requestHeaders
    // Windows account name of the person running this SoapUI instance
    def username = System.getProperty('user.name')
    // Add a "user" header carrying that name
    existingHeaders['user'] = [(username)]
    request.requestHeaders = existingHeaders
}
In a system I have to maintain (didn't build it, just inherited it) we have a Foursquare implementation that hasn't been used in quite a while. Trying to revive it failed, because our page is now loaded via HTTPS, which it didn't used to be.
We are using the "Save to Foursquare" button as well as the API request to retrieve the number of Check-ins. I already switched all the JS includes and intent links from http to https and at least now it shows the number and the button correctly.
However, I can't click the button and checking the browser's console I found that it added a script tag to the head of this page which tries to access http://platform.foursquare.com/js/modules/widgets.asyncbundle.js. The browser obviously blocks this, because it's not using HTTPS.
The file we are explicitly loading is https://platform.foursquare.com/js/widgets.js. It seems to me like this script is not reacting correctly to HTTP vs. HTTPS. There is probably a very simple solution to this, so what am I missing?
I don't know if you've tried it yet, but the Foursquare website says this on the matter:
Change the source of the JavaScript file to https://platform-s.foursquare.com/js/widgets.js
Add {"secure":true} to the global configuration block (window.___fourSq)`
The same link (see below) shows all the different ways to call the Save to Foursquare widget via its .saveTo() function.
https://developer.foursquare.com/overview/widgets
I hope this information and links helps! Cheers.
I am having trouble with script link custom actions. I am building a SharePoint app, and I successfully added a site-scope custom action pointing to a script file in the Style Library, as I want this particular script to be injected to all the pages of my SharePoint site.
While it works in certain situations, the script link injection breaks without apparent reason under certain conditions. For example, when I arrive on my root web, the script will be injected. But if I go to a certain link within this web (for example Home or Site Contents), the file that is supposed to be injected will simply not be fetched from the Style Library and therefore never be injected, resulting in an uncaught ReferenceError when I try to call one of the script's functions. The weirdest part is that a page refresh through Ctrl+F5 will fetch the script file without any problem, regardless of whether the page originally fetched the script file when first accessed. It will keep the script until the page is accessed through a link again.
I've read up on SharePoint caching, thinking it may be the cause of my problem, but the trouble is that these articles mostly talk about cache-induced errors when updating a file, while I am only trying to access it.
One thing to note is that, due to limitations, I am adding the script link custom action through code. Here's an example of what this kind of call currently looks like in my app:
context.Load(context.Site.UserCustomActions);
context.ExecuteQuery();

// Add a new site-collection-scoped custom action pointing at the script in the Style Library
var customAction = context.Site.UserCustomActions.Add();
customAction.Name = "MyScriptLink";
customAction.Location = "ScriptLink";
customAction.Sequence = 100;
customAction.ScriptSrc = "~SiteCollection/Style Library/MySite/MyScript.js";
customAction.Update();
context.ExecuteQuery();
So, what's going on here? Why is my script not injected on certain pages? And why does a refresh on these exact same pages manage to fetch the file without any problem?
Found it! Three words: Minimal Download Strategy. Disable it (either through code or through site settings); it messes with page navigation behavior within a SharePoint site.
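If you go the code route, a hedged JSOM sketch is below (the same can be done with the managed CSOM you are already using). The GUID is the commonly cited feature id for the MDS web feature, so treat it as an assumption and verify it in your environment first.

// Sketch: deactivate the Minimal Download Strategy web feature via JSOM.
// Assumes sp.js is already loaded on the page.
var ctx = SP.ClientContext.get_current();
var features = ctx.get_web().get_features();
// Commonly documented MDS feature id -- verify before relying on it
features.remove(new SP.Guid('87294c72-f260-42f3-a41b-981a2ffce37a'), true);
ctx.executeQueryAsync(
    function () { console.log('MDS feature deactivated'); },
    function (sender, args) { console.log('Failed: ' + args.get_message()); }
);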
Edit: If you still want MDS enabled on your site, there is a solution
The new Facebook JavaScript SDK can let any website log a user in with Facebook and fetch that user's data...
So www.example.com will include some JavaScript from Facebook, but as I recall, that script is considered to be of the origin of www.example.com and cannot fetch data from facebook.com, because that would violate the same-origin policy. Isn't that correct? If so, how does the script fetch data?
From here: https://developer.mozilla.org/en/Same_origin_policy_for_JavaScript
The same origin policy prevents a document or script loaded from one origin from getting or setting properties of a document from another origin. This policy dates all the way back to Netscape Navigator 2.0.
and explained slightly differently here: http://docs.sun.com/source/816-6409-10/sec.htm
The same origin policy works as follows: when loading a document from one origin, a script loaded from a different origin cannot get or set specific properties of specific browser and HTML objects in a window or frame (see Table 14.2).
The Facebook script is not attempting to interact with scripts from your domain or to read DOM objects. It's just going to do its own post to Facebook. It gets your site name not by interacting with your page or scripts from your site, but because the script itself is generated when you fill out the form to get the "Like" button. I registered a site named "http://www.bogussite.com" and got the code to put on my website. The first thing in this code was
iframe src="http://www.facebook.com/plugins/like.php?href=http%3A%2F%2Fwww.bogussite.com&
so the script is clearly getting your site info from hard-coded URL parameters in the link to the iframe.
Facebook is far from alone in having you use scripts hosted on their servers. There are plenty of other scripts that work this way. All of the Google APIs, for example, including Google Gears, Google Analytics, etc., require you to use a script hosted on their server. Just last week, while I was trying to figure out how to do geolocation for our store finder in a mobile-friendly web app, I found a whole slew of geolocation services that had you use scripts hosted on their servers, rather than copying the script to your server.
I think, but am not sure, that they use the iframe method. At least the cross-domain receiver and XFBML stuff for canvas apps uses that. Basically, the JavaScript on your page creates an iframe within the facebook.com domain. That iframe then has permission to do whatever it needs with Facebook. Communication back with the parent can be done with one of several methods, for example the URL hash. But I'm not sure which method, if any, they use for that part.
If I recall, they use script tag insertion. So when a JS SDK call needs to call out to Facebook, it inserts a <script src="http://graph.facebook.com/whatever?params...&callback=some_function"> tag into the current document. Then Facebook returns the data in JSON format as some_function({...}), where the actual data is inside the ... . This results in the function some_function being called in the origin of example.com using data from graph.facebook.com.
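As an illustration of that pattern (a rough sketch only, not Facebook's actual internals), a script-tag/JSONP round trip looks roughly like this; the endpoint, parameters, and callback name are illustrative.

// Define the callback the wrapped response will invoke; it runs in www.example.com's origin
function some_function(data) {
    console.log(data); // the JSON payload the server wrapped in the callback
}

// Insert a script tag pointing at the cross-origin endpoint; script tags are exempt
// from the same-origin restriction on XMLHttpRequest-style reads
var script = document.createElement('script');
script.src = 'https://graph.facebook.com/whatever?params=...&callback=some_function';
document.head.appendChild(script);
// The response body arrives as executable JavaScript: some_function({...});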