I need to log in to a site that I am developing with four different users at the same time.
Currently, I am using four different browsers, or two browsers (Chrome and Firefox) twice each (one private/incognito window and one normal window).
But this arrangement is tedious. I do not want to have four windows. I want only one window with four tabs. Using tabs is easier and faster for me because I will be switching between tabs constantly.
The Yoono plugin for Firefox seems to do what I want, but I prefer to stay away from a service that makes me register with a social network.
I am not constrained to any of the major browsers. I could use Firefox, Chrome, or IE (for Windows).
Is there a Windows browser that has a session per tab? Or a plugin for one of them that accomplishes the same task?
A bit late I know, but I found that the "Private Tab" plugin does the trick. It opens a new tab in Private Browsing Mode; these tabs are private and separated from each other.
https://addons.mozilla.org/en-US/firefox/addon/private-tab/
There is now also the Brave browser, which allows creating tabs with different sessions.
https://addons.mozilla.org/en-GB/firefox/addon/multi-account-containers/ is what you are after.
Under the hood, it separates website storage into tab-specific Containers. Cookies downloaded by one Container are not available to other Containers. With the Firefox Multi-Account Containers extension, you can:
- Sign in to two different accounts on the same site (for example, you could sign in to work email and home email in two different Container tabs).
- Keep different kinds of browsing far away from each other (for example, you might use one Container tab for managing your checking account and a different Container tab for searching for new songs by your favorite band).
- Avoid leaving social-network footprints all over the web (for example, you could use a Container tab for signing in to a social network, and use a different tab for visiting online news sites, keeping your social identity separate from tracking scripts on news sites).
This is an old thread, but the topic is still relevant to a valuable workflow.
SessionBox is exactly what you need in Chrome.
https://chrome.google.com/webstore/detail/sessionbox-multi-login-to/megbklhjamjbcafknkgmokldgolkdfig
I am working on a scraping project for a company. I used Python libraries such as Selenium, mechanize, and BeautifulSoup4, and succeeded in putting the data into a MySQL database and generating the reports they wanted.
But I am curious: why is there no standardization of website structure? Every site has a different name/id for its username/password fields. I looked at the Facebook and Google login pages, and even they name their username/password fields differently. Other elements are likewise named arbitrarily and placed anywhere.
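To illustrate the point, here is a minimal sketch (the two HTML snippets are invented stand-ins, not the real Facebook or Google markup) of how the same logical login fields end up with different names on different sites:

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

# Invented stand-ins for two login pages; real sites differ in the same way.
SITE_A = '<form><input name="email" type="text"><input name="pass" type="password"></form>'
SITE_B = '<form><input name="identifier" type="text"><input name="Passwd" type="password"></form>'

def login_field_names(html):
    """Return the name attributes of the text and password inputs."""
    soup = BeautifulSoup(html, "html.parser")
    user = soup.find("input", {"type": "text"})
    pwd = soup.find("input", {"type": "password"})
    return user["name"], pwd["name"]

for label, html in (("site A", SITE_A), ("site B", SITE_B)):
    print(label, login_field_names(html))
# site A ('email', 'pass')
# site B ('identifier', 'Passwd')
```

A scraper therefore needs per-site selectors even for the most common form on the web, which is exactly the friction the question is about.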
One obvious reason I can see is that bots would eat up a lot of bandwidth, and websites are basically targeted at human users. A second reason may be that websites want to show advertisements. There may be other reasons too.
Would it not be better if websites did not have to provide APIs and there were instead a single framework for bot/scraper login? For example, every website could have a scraper-friendly version that is structured and named according to a universally agreed standard specification, along with a page that provides help for the scraper. To access this version of the website, a bot/scraper would have to register itself.
This would open up an entirely different kind of internet to programmers. For example, someone could write a scraper that monitors vulnerability and exploit listing websites and automatically closes the security holes on the user's system. (For this, those websites would have to publish a version containing data that can be applied directly, such as patches and where they should be applied.)
All of this could easily be done by an average programmer. On the dark side, one could write malware that updates itself with new attack strategies.
I know it is possible to use Facebook or Google login on other websites via Open Authentication (OAuth), but that is only a small part of scraping.
My question boils down to: why is there no such effort out there in the community? And if there is one, kindly refer me to it.
I searched Stack Overflow but could not find a similar question. I am not sure whether this kind of question is appropriate for Stack Overflow; if not, please refer me to the correct Stack Exchange site.
I will edit the question if something in it does not meet community criteria, but it is a genuine question.
EDIT: I got the answer thanks to #b.j.g. There is such an effort by the W3C, called the Semantic Web. (Anyway, I am sure Google will hijack the whole internet one day and make it possible, within my lifetime.)
EDIT: I think what you are looking for is The Semantic Web
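In practice, the closest widely deployed piece of that vision is structured data embedded in ordinary pages (schema.org vocabulary expressed as JSON-LD, RDFa, or microdata). A minimal sketch of pulling JSON-LD out of a page with BeautifulSoup, using an invented snippet in place of a real product page:

```python
import json
from bs4 import BeautifulSoup  # pip install beautifulsoup4

# Invented example of the machine-readable block many sites already embed.
HTML = """
<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product",
 "name": "Example Widget", "offers": {"@type": "Offer", "price": "19.99"}}
</script>
</head><body>...</body></html>
"""

soup = BeautifulSoup(HTML, "html.parser")
for tag in soup.find_all("script", type="application/ld+json"):
    data = json.loads(tag.string)
    print(data["@type"], data["name"], data["offers"]["price"])
# Product Example Widget 19.99
```

Pages that carry this markup are effectively the "scraper-friendly version" the question asks for, without requiring bots to register.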
You are assuming people want their data to be scraped. In actuality, the data people scrape is usually proprietary to the publisher, and when it is scraped... they lose exclusivity on the data.
I had trouble scraping yoga schedules in the past, and I concluded that the developers were consciously making it difficult to scrape so third parties couldn't easily use their data.
I have been looking into the pros and cons of browsers, specifically their security properties. Please share if you know which browser is more secure than the others and why.
Each browser has had different security features, vulnerabilities, and maybe even NSA backdoors at some point in time, but... http://www.infosecurity-magazine.com/view/33645/there-is-no-single-most-secure-browser/
You might want to look here for additional insight: http://slashdot.org/story/13/06/23/0317243/ask-slashdot-most-secure-browser-in-an-age-of-surveillance
No web browser is more secure than the others by a big margin; the reason is that most of today's browsers follow largely the same standards and differ mainly in details such as whether JavaScript is allowed or disabled by default, how tracking and sharing are handled, and how your IP is exposed. Because this question does not have a single proper answer, here is an example of how to make a web browser as secure as possible if needed:
In this example I will use Mozilla Firefox.
The first step is disabling JavaScript in the browser (manually or by installing a plugin that does it, for example "NoScript").
Disabling JavaScript will stop many web pages from displaying or working properly, because almost every website today uses JavaScript. But we are talking about security now.
The second step should be disabling tracking and sharing, again manually or with some plugin.
The third should be using a proxy server to hide your IP.
There are too many different things that could be done to list them all. Note again that JavaScript, which is required for properly displaying page content and interacting with it on almost all modern websites, can be a big security hole: it enables, for example, session hijacking, forcing the browser to report your geolocation, and many other things.
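As a rough sketch of which Firefox preferences these steps correspond to, here they are set programmatically through Selenium (this is only to name the preferences; for everyday browsing you would toggle them in about:config or via the plugins mentioned above, and the local SOCKS proxy address is an assumption):

```python
from selenium import webdriver
from selenium.webdriver.firefox.options import Options

options = Options()
options.set_preference("javascript.enabled", False)                # step 1: disable JavaScript
options.set_preference("privacy.donottrackheader.enabled", True)   # step 2: tracking protection
options.set_preference("network.proxy.type", 1)                    # step 3: manual proxy settings
options.set_preference("network.proxy.socks", "127.0.0.1")         # assumed local SOCKS proxy
options.set_preference("network.proxy.socks_port", 9050)

driver = webdriver.Firefox(options=options)   # starts Firefox with the hardened profile
driver.get("https://example.org")
driver.quit()
```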
My recommendation is to first determine exactly what you would like to protect, and then search Google for how to do that.
Is it possible to share a computer screen with someone else over the internet?
I have attended a few IBM sessions where the presenter shared his computer screen with the participants.
How should I go about including this feature in my project? Is there any open-source API available to do the same?
There are a lot of options when it comes to screen sharing. Wikipedia has a nice table comparing lots of remote desktop software.
One that looks fairly promising is FreeRDP. The source is on GitHub, and it seems to be a pretty active project.
If you want to learn more, you can search for info about the Remote Desktop Protocol (RDP) which is what is being implemented in many of these programs.
If you are looking for something more web-specific (like Chrome's remote desktop), check out WebRTC. It is what the Chrome team is using to accomplish their screen sharing.
If you don't mind being limited to Chrome:
Chromoting (Chrome Remote Desktop) - Chromoting API
Google Hangouts - Hangout API
Skype also has a fantastic screen-sharing function. It lets you share an entire screen during a call for free, which makes working with someone on a project incredibly easy.
Go with TeamViewer; you can share your computer screen with it. Alternatively, go with Skype for screen sharing, video calls, and so on.
I personally like http://join.me
It's fast, small, and easy to use, and takes only a few seconds to set up!
I was thinking of creating an on-screen keyboard to protect against keyloggers. The main problem is that I have found there is a category of keyloggers, called screenshot keyloggers, which are able to take a screenshot of the screen every time the mouse button is clicked.
For this reason, I feel that my approach of creating an on-screen keyboard does not protect against this category of keyloggers. Is there a way of coding the application so that it does not allow screenshots to be taken, or so that it alerts the user if they are being taken without his permission?
Edit
I am assuming that only the user is present in the room. Therefore, I am not trying to protect against other people taking photos with their digital cameras. I only want to protect against screenshot keyloggers.
This is an issue that Trusted Computing can potentially address, but not on any system you'd likely be deploying this for. Beyond screenshots, remember that if a device or piece of code has local access, screenshots are only one way to capture that data; another would be to capture the data as it passes through memory or other parts of the system. It's a very hard thing to prevent entirely.
If you are so paranoid that you cannot trust the computer you are working on, I would highly advise introducing a second "factor" into the authentication.
"Google Authenticator" is an open-source two-factor security system (like a software version of an RSA token). It means a user would have to have it running on their smartphone, but it also means that anyone who does NOT have the phone cannot log in, even if they have successfully sniffed the username and password. Google "Google Authenticator".
Other 2-factor methods involve sending one-time login codes to one's cell phone (which again must be held), using a hardware token key, or a list of one-time-only passwords.
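For a sense of how those one-time codes work, here is a minimal sketch of the TOTP algorithm (RFC 6238) that Google Authenticator implements, using only the Python standard library (the secret below is a made-up example):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute the current time-based one-time password for a base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval              # 30-second time step
    msg = struct.pack(">Q", counter)                    # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                          # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Example shared secret; the server and the phone app agree on it at enrollment.
print(totp("JBSWY3DPEHPK3PXP"))
```

The relevant property for the screenshot-keylogger scenario is that the code changes every 30 seconds, so capturing one code on screen does not help an attacker log in later.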
I actually created an Apache-specific port of this if you want to use it: https://code.google.com/p/google-authenticator-apache-module/
I have found the internet to be a massive time sink for me.
My efforts to block the websites that are utterly useless to me have been in vain, for the simple reason that, if I am bored enough, I will bypass the block.
All I can think of is to use the hosts file, plus a file monitor to ensure the loopback entries are put back in place every time it is edited.
Note: I run Linux and Mac.
StayFocusd is a productivity extension for Google Chrome that helps you stay focused on work by restricting the amount of time you can spend on time-wasting websites. Once your allotted time has been used up, the sites you have blocked will be inaccessible for the rest of the day.
It is highly configurable, allowing you to block or allow entire sites, specific subdomains, specific paths, specific pages, even specific in-page content (videos, games, images, forms, etc).
You could block the websites on your router, assuming your firmware allows for it. You could set a long, not easily typed router password, so that changing the setting would (hopefully) be so inconvenient that you wouldn't bother when you're bored. On the other hand, you could just not go to these sites.
Try creating a crontab task that checks and updates the hosts file every few minutes. You can obfuscate the job and the script to make them more time-consuming to remove.
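A minimal sketch of such a script (the blocked domains and the script path are just examples); a root crontab entry like `*/5 * * * * python3 /usr/local/bin/reblock.py` would re-apply the block every five minutes:

```python
# reblock.py - re-add loopback entries for distracting sites to /etc/hosts.
# Needs root to write the file; the domain list below is only an example.
BLOCKED = ["www.reddit.com", "news.ycombinator.com"]
HOSTS = "/etc/hosts"

def main():
    with open(HOSTS, "r", encoding="utf-8") as f:
        existing = set(f.read().splitlines())
    missing = [f"127.0.0.1 {d}" for d in BLOCKED if f"127.0.0.1 {d}" not in existing]
    if missing:
        with open(HOSTS, "a", encoding="utf-8") as f:
            f.write("\n" + "\n".join(missing) + "\n")

if __name__ == "__main__":
    main()
```

Obfuscating the script name and the cron entry, as suggested above, just raises the effort needed to defeat the block when boredom strikes.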
Check out: https://www.rescuetime.com/
Supposedly their product is designed for this purpose.