How to implement 2 distinct timeouts with Pyramid Authentication policy? - pyramid

I'm writing a Pyramid application with both web pages and an API called from a mobile device.
For the web pages, I set a timeout of 2 hours:
authn_policy = AuthTktAuthenticationPolicy(
    settings['secret'],
    callback=groupfinder,
    hashalg='sha512',
    timeout=3600*2,
    reissue_time=360*2
)
However, when the user logs in via the mobile app, I would like the timeout to be much longer (i.e. several days). Is there a simple way to do this?
I didn't see a way to vary the timeout, but I saw that the remember() function can take a max_age parameter. So maybe I could remove the timeout (or set it much longer) and use the max_age parameter instead: 2 hours for the web, and 2 days for the mobile app. Is that a good idea? Would there be a notable difference, or any vulnerability?
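For concreteness, here is roughly what I have in mind in my login view. This is only a sketch: authenticate() and is_mobile_client() are placeholders for my own credential check and for however the mobile app identifies itself, and the 'home' route is made up. As far as I can tell, remember() forwards max_age to AuthTktAuthenticationPolicy.remember(), where it sets the cookie's Max-Age.

from pyramid.httpexceptions import HTTPFound
from pyramid.security import remember

def login_view(request):
    # Placeholders: my own credential check, and a check for a header
    # (or similar) that the mobile app sends to identify itself.
    userid = authenticate(request)
    if is_mobile_client(request):
        # Longer-lived cookie for the mobile app: ~2 days.
        headers = remember(request, userid, max_age=3600 * 24 * 2)
    else:
        # Regular web session: 2 hours.
        headers = remember(request, userid, max_age=3600 * 2)
    return HTTPFound(location=request.route_url('home'), headers=headers)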

Related

Database and Server solutions for a multiroom application in ReactJS

Context
I am working on a React webapp which has to do the following:
Users access a webpage using one common URL (about 10 thousand users max).
There's a second URL for "admins".
Each admin can create a "room" and transmit the room ID to some users (max 40 users per room). The room will be open for 2 hours max.
Once users access the room, they can transmit information by clicking a button (e.g. marking Task 1 as done by clicking the button next to Task 1).
I'm also thinking about adding a chat where users can talk with the admin of the room.
This webapp will be hosted on an Apache server which already hosts other apps.
Issue / Solutions ?
I'm trying to figure out which tools (database, WebSocket) I should use for this project, and I would like some feedback on the relevance of my initial plan (and suggestions for improvements).
Server: using Node.js and the ws library (I'd like to avoid socket.io, which seems a bit heavy), I can handle the communication between users and admins. From what I remember, I will have to do some configuration on the Apache server for Node.js to work.
Database: unfortunately I can't use databases like Firebase, so I was thinking of using the database I already run on my server (MariaDB) to create a database for each room, storing which task was done by which user, as well as the messages sent from users to the admin (a rough sketch of the records I have in mind follows this list). But I've seen a tutorial on how to make a chat application in React that used LocalStorage instead of a database.
Security: I'd say this kind of app can have serious security issues, but I have no idea whether some tools can prevent the most common attacks...
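To make the data concrete, here is a rough sketch of the records I'd need to store, using Python's built-in sqlite3 purely as a stand-in for MariaDB. Table and column names are illustrative, and whether this ends up as one database per room or a single schema keyed by room ID is exactly the kind of thing I'm unsure about.

import sqlite3

conn = sqlite3.connect("rooms_demo.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS task_events (
    id      INTEGER PRIMARY KEY,
    room_id TEXT NOT NULL,
    user_id TEXT NOT NULL,
    task    TEXT NOT NULL,
    done_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE IF NOT EXISTS messages (
    id      INTEGER PRIMARY KEY,
    room_id TEXT NOT NULL,
    user_id TEXT NOT NULL,
    body    TEXT NOT NULL,
    sent_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
""")

# Example: a user in room "abc123" marks Task 1 as done.
conn.execute(
    "INSERT INTO task_events (room_id, user_id, task) VALUES (?, ?, ?)",
    ("abc123", "user42", "Task 1"),
)
conn.commit()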
Conclusion
So, all in all, I'm thinking about using ReactJS + Node.js + the ws library + LocalStorage instead of a DB, all of this on an Apache server. Does that seem like a reasonable solution for this project?
I realize that my question is a bit broad, but I don't have the experience to know the right tools for such a project.
Thank you all in advance.

Logic apps - Get response time of a http request

I am trying to use Logic Apps to ping our website every 10 minutes. I would like to know how to get the response time of that call, to make sure the website is not slow.
Currently I am doing this:
Recurrence (Every 10 minutes)
Get Current Time
Http GET Call
Get Current time 2
Difference of (Current time 2 - Current time)
Condition to see if it is greater than threshold.
This doesn't look like a clean solution. I'm wondering if there is an easier way to get the time/latency of the HTTP call in step 3.
According to the official docs, it is not possible to get the response time with the connector you're using. You'd be better off using Azure Functions for that. More info:
https://learn.microsoft.com/en-us/azure/connectors/connectors-native-http
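If you go the Azure Functions route, the core of the function is just timing the request yourself. Here is a minimal sketch of that timing logic (the trigger scaffolding is omitted, and the URL and threshold are placeholders):

import time
import urllib.request

URL = "https://example.com/"   # placeholder: your website
THRESHOLD_SECONDS = 5.0        # placeholder: whatever counts as "slow"

def measure_response_time(url):
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=30) as response:
        response.read()        # read the body so the full transfer is timed
    return time.monotonic() - start

if __name__ == "__main__":
    elapsed = measure_response_time(URL)
    print(f"{URL} responded in {elapsed:.2f}s")
    if elapsed > THRESHOLD_SECONDS:
        print("Site is slow; raise an alert here (mail, Teams, etc.)")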
You can use Azure Application Insights for this kind of situation; its availability tests can ping a URL on a schedule and record the response time.
https://learn.microsoft.com/en-us/azure/azure-monitor/app/app-insights-overview

Best way to request API and store every minute

I have an app that is hitting the rate limit for an API, which is hurting the user experience. I have an idea to solve this, but I have no idea if this is how it should ideally be solved. Does this idea make sense, and is it a good way to solve the issue? And how should I go about implementing it? I'm using React Native and Node.js.
Here is the idea:
My app will request the data from a "middleman" API that I make. The middleman API will request data once per minute from the main API that I'm having the rate-limit problem with (which should solve the rate-limit issue), then store it for the minute until it updates again. I was thinking the best way to do this is to spin up a server on AWS that requests from the other API every minute (is this the easiest way to make a request every minute?) and then store the result on a bare middleman webpage (or do I need to store it in a database like MongoDB?). Then my app will call that middleman webpage/API.
Your idea is good.
Your middleman would be a caching proxy. It would act just as you stated. Have a look at https://github.com/active-video/caching-proxy - it does almost what you want. It creates a server that will receive requests for URLs, fetch and cache them, and serve the cached version from then on.
The only downside is that it does not have a lifetime option for the cache. You could either fork it to add the option, or run a daemon that deletes files that are too old, forcing a re-fetch.
EDIT:
A very interesting addition to the caching proxy would be a HEAD request to know whether the result has changed. This is not provided by every API, but it could become useful if yours exposes such info - and only if HEAD requests do not count toward your API limits...
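If you'd rather roll your own than fork caching-proxy, the middleman is only a few lines. Here is a minimal sketch in Python (you mentioned Node.js, but the structure translates directly) using Flask and requests; UPSTREAM_URL is a placeholder for the rate-limited API. Note that it refreshes lazily on the first request of each 60-second window rather than on a fixed timer, which still guarantees at most one upstream call per minute.

import time
import requests
from flask import Flask, jsonify

app = Flask(__name__)

UPSTREAM_URL = "https://api.example.com/data"  # placeholder: the rate-limited API
TTL_SECONDS = 60

_cache = {"data": None, "fetched_at": 0.0}

@app.route("/data")
def data():
    now = time.time()
    # Refresh at most once per TTL window, no matter how many clients ask.
    if _cache["data"] is None or now - _cache["fetched_at"] > TTL_SECONDS:
        upstream = requests.get(UPSTREAM_URL, timeout=10)
        upstream.raise_for_status()
        _cache["data"] = upstream.json()
        _cache["fetched_at"] = now
    return jsonify(_cache["data"])

if __name__ == "__main__":
    app.run(port=8000)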

IIS worker threads issue

I have my site hosted on IIS. The site has a feature that calls a WCF service and then returns the result. The issue is that while the site is processing a call to the WCF service, another request to the site freezes and does not return content quickly (even though it is just static content). I set up two Chrome instances with different iMacros scripts: one calls the page that requests the WCF service, and the other calls a page that is just static content. I can see that when the first page (the one that calls the WCF service) freezes, the second page also freezes, and when the first is released, so is the second.
Do I need to reconfigure something in my Web.config, or should I do something else to make it possible to get static content immediately?
I think that there are two separate problems here:
Why does the page that uses the WCF service freeze
Why does the static content page freeze
On the page that calls the WCF service, a common problem is that the WCF client is not closed. By default there are 10 WCF connections with a timeout of 1 minute. The first 10 calls go fine (say they execute in 2 seconds each); then the 11th call comes, there are no free WCF connections, and it must therefore wait about 58 seconds for a connection to time out and become available.
As for why your static page freezes: it could be that your client only allows one connection to the site, so the request for the static page is not sent until the request for the page with the WCF service is complete.
You should check the IIS logs to see how much time IIS reports the request is taking.
I would say that this is a threading issue. This MSDN KB article has some suggestions on how to tune your ASP.NET threading behavior:
http://support.microsoft.com/kb/821268
From the article: ...you can tune the following parameters in your Machine.config file to best fit your situation:
maxWorkerThreads
minWorkerThreads
maxIoThreads
minFreeThreads
minLocalRequestFreeThreads
maxconnection
executionTimeout
To successfully resolve these problems, do the following:
Limit the number of ASP.NET requests that can execute at the same time to approximately 12 per CPU.
Permit Web service callbacks to freely use threads in the ThreadPool.
Select an appropriate value for the maxconnections parameter. Base your selection on the number of IP addresses and AppDomains that are used.
etc...
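For orientation, those settings live in Machine.config roughly as sketched below. The numbers are only an example for a hypothetical 2-CPU machine; derive your own values from the per-CPU formulas in the linked KB article.

<configuration>
  <system.net>
    <connectionManagement>
      <!-- maxconnection: roughly 12 per CPU, per the article's guidance -->
      <add address="*" maxconnection="24" />
    </connectionManagement>
  </system.net>
  <system.web>
    <processModel maxWorkerThreads="100" maxIoThreads="100" minWorkerThreads="50" />
    <httpRuntime minFreeThreads="176"
                 minLocalRequestFreeThreads="152"
                 executionTimeout="90" />
  </system.web>
</configuration>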
Consider this scenario: when you make a request to IIS, your app changes, deletes, or creates a file outside of the App_Data folder. This often tends to be a log file that was mistakenly put in the app's bin folder. The file-system change leads to an AppDomain reload by IIS, as it thinks the app was changed, hence the delay you experience. This may or may not apply to your issue, but it is a common mistake in ASP.NET apps.
Well, maybe there is no problem...
It may just be the browser's same-domain simultaneous request limit.
Until the browser has finished the request to the first page (the WCF page), it won't send the request to the second page (the static one).
Try this:
Use a different browser for each page (for example Chrome/Firefox).
Or open the second page in a Chrome incognito window (Ctrl + Shift + N).
Or try to access each page from a different computer.
You could try using AppFabric to see what is wrong with your WCF services: http://msdn.microsoft.com/en-us/windowsserver/ee695849

Distributing Twitter Widget Among Application Users

Hey guys, is there any way to work around the Twitter rate limit by using a Twitter widget and embedding it in the end user's browser? In other words, I would use the Twitter Search widget as part of the user's browser session (while they are using my app) so that their calls to Twitter are made from their own IP address (and not the IP address of my app). I would do this to avoid getting my app's IP blacklisted. Is that fine, or would it violate Twitter's terms of use?
I was planning to use the Twitter Search widget. Would using the Twitter stream be a better idea?
Depending on your implementation, you may want to consider the Streaming API for this purpose. It's probably considered more "kosher". You can query for a particular set of phrases and open what's called a firehose, and Twitter will push updates to your application; it's not really bound by the normal rate limits, although there is a rate-limiting system in place there too. For my particular use case this didn't work, and I had to do what you described in your question. But if you want to use the Twitter Streaming API and are using PHP in conjunction with it, I would highly recommend looking at the 140 Twitter Server framework; it will make it a lot easier to implement the Streaming API from the get-go.
This is fine, and this is the solution I'm using. Use jQuery or something similar for the Ajax calls and send the response to your server for processing. The load will be on each of the IPs that use your application, so if a user is spamming Twitter with requests, they would get blacklisted, not your application.
