I'm a Twitch streamer and I'm running a bot named "Nightbot", which can interact with users in my stream's chat area. They can type a command such as "!hello" and, in response to that, I can tell Nightbot to load a URL and post the text from that URL into the chat.
But the text needs to change each time I play a new game, so the text must be editable. And it can't be a file, because Nightbot expects the URL to return just plain text.
So I can't use a file-hosting service. Please don't recommend that I save a text file to some free hosting service and put my text into the file.
What I need is a very simple string of text that is hosted online, which can be edited, and which can be accessed by a URL. Why the literal *eck is that so impossible or unreasonable? I thought we lived in 2018.
I spent the entire day trying to learn Heroku, and when that turned out to be unreasonably complicated, I spent some hours trying Microsoft's Azure. Holy moly, it turned into connecting storage services, picking price tiers, choosing between a Windows or Linux server, figuring out how many gigs of space I need, and whether I'll be paying by the second. Come on, I just need to save an editable string of text online, probably just 100 characters long! Why so difficult?
I guess what I'm looking for is something as easy as TinyURL, but for editable text strings online: just go there, type in a name for my variable, and boom, it gives me a URL to update it and a URL to download it. Total time required: less than one minute.
WARNING: both solutions are publicly accessible and thus also editable. You don't want inappropriate text to display on your stream, so keep the link secret. Still, there are no guarantees it stays secret.
Solution 1 (simple and editable via the web UI if you create an account)
You could just use pastebin.com, where you can store public or unlisted text.
When you use pastebin.com/raw/ plus the ID of your paste, you get the text back as plain text.
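Hooking that raw URL up to Nightbot is then one chat command. Nightbot's $(urlfetch ...) variable is the piece that pulls the text in; the paste ID below is a placeholder, so substitute your own:

```
!commands add !hello $(urlfetch https://pastebin.com/raw/XXXXXXXX)
```

After that, typing !hello in chat makes Nightbot post whatever the raw paste currently contains.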
Solution 2 (a bit more complicated, but more advanced)
You can use JSON Blob
This website lets you host a JSON document and create, edit, and retrieve a string from it. The content has to be valid JSON, but if you wrap your text in "" it is. Note that this requirement only applies when you edit the text through the website; if you change it with a curl command, it doesn't have to be valid JSON.
First off, create your string and save it. Then you can access the string by making a GET request to a URL of the form https://jsonblob.com/api/ + blob ID
Example:
https://jsonblob.com/api/758d88a3-5e59-11e8-a54b-2b3610209abd
To edit your text, make a PUT request to the same URL with the text you want to change it to.
Example command to change text (I used curl, because that's easy for me):
curl -i -X "PUT" -d 'This is new text' -H "Content-Type: application/json" -H "Accept: application/json" https://jsonblob.com/api/jsonBlob/758d88a3-5e59-11e8-a54b-2b3610209abd
You could also use a tool like POSTMAN to do the PUT request.
For more in-depth instructions on how to use JSON Blob, see their website: https://jsonblob.com/api
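If you'd rather script this from Python than from curl, a minimal standard-library sketch of the same GET and PUT calls looks like this (the blob ID is the one from the example above; swap in your own, and note the endpoint is copied from the curl command, so verify it against JSON Blob's docs):

```python
import json
import urllib.request

API_BASE = "https://jsonblob.com/api/jsonBlob/"  # same endpoint as the curl example

def blob_url(blob_id):
    """Build the JSON Blob API URL for a given blob ID."""
    return API_BASE + blob_id

def get_text(blob_id):
    """GET the blob and return the stored string (assumes the blob holds a bare JSON string)."""
    with urllib.request.urlopen(blob_url(blob_id)) as resp:
        return json.loads(resp.read().decode("utf-8"))

def set_text(blob_id, new_text):
    """PUT new text to the blob; json.dumps adds the quotes that keep it valid JSON."""
    req = urllib.request.Request(
        blob_url(blob_id),
        data=json.dumps(new_text).encode("utf-8"),
        headers={"Content-Type": "application/json", "Accept": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

For example, set_text("758d88a3-5e59-11e8-a54b-2b3610209abd", "This is new text") is the Python equivalent of the curl command above.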
I would like to get images from a search engine, to run some automated tests without the need to go online and pick them by hand.
I found an old example from 5 years ago (ajax.googleapis.com/ajax/services/search/images), which sadly no longer works. What is the current method to do this in Python 3? Ideally I would like to be able to pass a string with the search term and retrieve a set number of images at full size.
I don't really mind which search engine is used; I just want to be sure it will stay supported for the time being. Also, I would like to avoid Selenium; I plan to run this without any UI or browser, entirely from the terminal.
Have you heard of pixabay? There is a nice python wrapper for working with it as well.
Found a pretty good solution using BeautifulSoup.
It does not work on Google, since I get a 403, but when faking the header in the request it is sometimes possible to get data. I will have to experiment with different other sites.
So far the workflow is to search in the browser so I can get the URL to pass to BeautifulSoup. Once I have the URL in code, I replace the query part with a variable so I can set it programmatically.
Then I parse the output of beautifulsoup to extract the links to the images, and retrieve them using requests.
I wish there were a public API that also exposed parameters like picture size, but I found nothing that currently works.
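The link-extraction step of that workflow can be sketched with nothing but the standard library's html.parser (BeautifulSoup's find_all("img") does the same job more comfortably); the example HTML below is made up:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class ImgExtractor(HTMLParser):
    """Collect the src attribute of every <img> tag on a page."""

    def __init__(self, base_url=""):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                # Resolve relative paths against the page URL.
                self.links.append(urljoin(self.base_url, src))

def extract_image_links(html, base_url=""):
    """Return all image URLs found in the given HTML."""
    parser = ImgExtractor(base_url)
    parser.feed(html)
    return parser.links
```

Each URL this returns can then be downloaded with requests.get (or urllib.request.urlretrieve) as the workflow above describes.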
I want to create a new tab and send data to it in a POST request from my Chrome extension. I could use a GET request, but the data I am going to send could be arbitrarily long, and I want to JSON-encode it (which also means I cannot use forms). I've only found two questions on SO about this, but both of them are about using a form, which is not desirable.
The reason I want to do this is that I want to request further input from the user before I do what I do with the data. I am totally lost here and have no idea how to do this, hence I cannot include any code samples I've tried.
If I cannot do this with a URL, I could inject a script into the page and have a popup, but that is something of a last resort.
Any ideas on how this could be done?
I have a URL with an output that needs to be reformatted and entered back in the browser.
We have a server that passes caller ID and we can specify a URL to launch with the caller ID included.
E.g.: googledotcom/search?="{callerID}". If this is set in the URL manager, it returns a Google search for "Jackson Steve" when a call is received from Steve Jackson.
**edit: the {callerID} tag passed from our server cannot be edited in any way because of Asterisk dial-plan issues.
The issue is that our customer database only handles name searches in the format "Jackson, Steve". Without the comma the search comes back empty.
How would I take the name passed from caller ID, create a script to insert a comma and resubmit that URL in the browser?
Basically I need a way to convert "https://www.google.com/#q=name+name" to "https://www.google.com/#q=name,+name" via an automatic script or process. The change needed is the comma after the first name.
Should this be sent to a website running javascript/html where it formats the caller id name then resubmits or should this somehow be handled by a local script on a computer with something along the lines of autohotkey?
Possibly use some sort of redirect on a web page? Send "Name Name" to mywebsiteDOTcom/urlformat/, write a script that inserts a comma after the first name, and redirect the user to myuserdatabaseDOTcom/search?"Name, Name"
Replace the space with a comma; that should be the easiest solution.
Using Firefox, you could write a Greasemonkey script that runs automatically when you hit the target site (Chrome, I believe, has a similar add-on named Tampermonkey; not sure about IE). In the JavaScript, you could examine the URL and either do a straight comma-for-space replacement as Danyal suggested, or do something a little more elegant with some regex matching, then auto-navigate the browser to the corrected URL. Once the script was installed, this process would happen automatically.
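The substitution itself is the easy part; the userscript would do it in JavaScript, but the transform is the same in any language. Here is a Python sketch of the regex approach, assuming the URL format from the question (everything between "#q=" and the first "+" is the last name):

```python
import re

def add_comma(url):
    """Insert a comma after the first name in a ...#q=Last+First style URL."""
    # Capture everything from "#q=" up to (but not including) the first "+",
    # then re-insert it with a trailing comma.
    return re.sub(r"(#q=[^+&]+)\+", r"\1,+", url, count=1)
```

For example, add_comma("https://www.google.com/#q=Jackson+Steve") returns "https://www.google.com/#q=Jackson,+Steve".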
PS. What process launches the browser? Can't you capture the name and format it before you launch the browser window? That would seem a much better approach than trying to format it after the fact.
After looking around, I have been unable to find a Python 3 module or method for pulling exchange-rate data from the internet into my program.
Any ideas?
Thanks
Well, I'm not sure what the location of your data source is or what format you need the data in, but from your description I can make the following recommendation:
Use a service like:
https://openexchangerates.org/
and afterwards parse the response with a JSON parser (like the official one):
http://docs.python.org/3.3/library/json.html
and voilà, you have the currency.
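A sketch of that flow in Python 3, using only the standard library. The app_id is a placeholder (openexchangerates.org issues one per account), and the endpoint and response fields are taken from their public docs as I recall them, so double-check against the site:

```python
import json
import urllib.request

def latest_rates(app_id):
    """Fetch the latest exchange rates and return the 'rates' dict from the JSON response."""
    url = "https://openexchangerates.org/api/latest.json?app_id=" + app_id
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))["rates"]

# The parsing step on its own, with a made-up response body:
sample = '{"base": "USD", "rates": {"EUR": 0.92, "GBP": 0.79}}'
rates = json.loads(sample)["rates"]
```

Once you have the rates dict, converting is a single multiplication, e.g. amount_in_eur = amount_in_usd * rates["EUR"].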
UPDATE
If you really have the time, and as time passes the patience to maintain it, you can take any site with a visible exchange rate and write an HTML parser:
http://www.diveintopython.net/html_processing/extracting_data.html
This solution gives you freedom (you can query basically whatever you want), but if a site changes... well, you have to update your code.
UPDATE 2
You can use a very simple trick with a very simple HTML page from Google. Try calling the following link:
http://www.google.com/finance/converter?a=1&from=BGN&to=AED
and you will get a response for free, without any key. But be aware that this is not fair play, especially if you poll the service many times a minute!
On a website to which I have access, there are some input fields. In the sixth field I need to enter an input string from a list of 10000 strings; a new page then appears, for which I just need to count the number of lines. Finally, I would like to end up with a two-column table of input string and number of resulting lines. Since I would otherwise have to enter all 10000 strings manually, I wonder what the best approach is to enter a string into a generic form field and get the resulting text. I have heard about curl, but I am not sure whether it is the easiest option.
P.S.
Example of the interactive way: I type some string of words into Google search and then I get a new page with the search results. Having previously entered my Google username and password, the results will probably be filtered according to my profile.
Example of the non-interactive way: a script somehow supplies my user information and search query, and saves the search results to a text file. Imagine the same idea, but for a more complicated website like this one.
What you want to do is send an HTTP POST with specific data. This can be done with any proper HTTP client, one of which is libcurl (or the pycurl binding, or even the curl command-line tool). On the response to the POST you probably get a redirect and then the results, or you need to make a separate request for the results; then you're done and move on to the next POST. Repeat until all POSTs are done.
What you may need to take into account is that you may have to deal with cookies and possibly follow a redirect from the POST. A good approach is to record a "manual session" as done with a browser (use Firebug or LiveHTTPHeaders, etc.) and then use that recording to help you repeat the same thing with an HTTP client.
A decent tutorial to get some starting up details on this kind of work can be found here: http://curl.haxx.se/docs/httpscripting.html
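A rough sketch of that loop in Python, using only the standard library: POST each string, keep cookies across requests, and count the lines of the response. The URL and the form-field name are hypothetical; record a real session as suggested above to find the actual ones:

```python
import http.cookiejar
import urllib.parse
import urllib.request

# Cookie-aware opener, since the site may track a logged-in session;
# redirects after the POST are followed automatically.
jar = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

def post_form(url, fields):
    """POST the form fields and return the response body as text."""
    data = urllib.parse.urlencode(fields).encode("ascii")
    with opener.open(url, data) as resp:
        return resp.read().decode("utf-8", "replace")

def count_lines(text):
    """Number of lines in the resulting page."""
    return len(text.splitlines())

# The main loop would then look like (field name "field6" is made up):
# for s in strings:                       # the 10000 input strings
#     page = post_form("https://example.com/form", {"field6": s})
#     print(s, count_lines(page))
```

Printing the string and the line count side by side gives exactly the two-column table the question asks for.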
You could also use JMeter to run all the POSTs. You can use the CSV input to supply the 10000 strings, then save the result as XML and extract the necessary data.