I know this is not a technical question, but since someone has posted a similar question here, I thought it should be OK.
What I want to do is to measure the user experience of any website. Ideally, I would like to use some type of algorithm to get a number with a corresponding metric to evaluate the user experience.
I can think of some heuristics, e.g. if a user gets a 404 error, the user experience is very low. On the other hand, if he or she buys something in an online store, the user experience is high. Of course, this would not work if the page is, e.g., a news site.
Does anyone know how I could calculate the user experience for websites?
Thanks in advance,
enne
What I understand from your question is that you want to measure the user experience in technical terms, e.g. the number of views for a specific page, the number of error pages shown to users, how many times a link has been clicked, where in the world users come from, and so on.
So, I think you are asking about website analytics, which you can customize to measure whatever you want. Well-known examples of such tools are:
Splunk
Google Analytics
Open Web Analytics (open source)
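If you also want to boil those measurements down to a single number, as the question asks, here is a minimal heuristic sketch in Python. The event names and weights are purely hypothetical assumptions for illustration, not something these tools provide:

```python
# Hypothetical heuristic: score each logged event and average the results.
# Event names and weights below are assumptions, not any tool's standard.
EVENT_WEIGHTS = {
    "purchase_completed": 1.0,   # strong positive signal
    "page_view": 0.1,            # weak positive signal
    "http_404": -1.0,            # strong negative signal
    "form_abandoned": -0.5,      # moderate negative signal
}

def ux_score(events):
    """Return a rough UX score in [-1, 1] for a list of event names."""
    scored = [EVENT_WEIGHTS[e] for e in events if e in EVENT_WEIGHTS]
    if not scored:
        return 0.0  # no signal either way
    return sum(scored) / len(scored)

print(ux_score(["page_view", "http_404", "purchase_completed"]))  # ~0.03
```

A news site would simply use different events and weights (e.g. time on page instead of purchases), which addresses the concern in the question.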
Please let me know if I answered your question.
I need to list all the website links (domains) in India on one website, organized by category.
You just cannot list 100% of Indian websites. If I understand correctly, you want something like a web crawler for Indian domains; another "Google". But even the big spiders, like Google's, Yahoo's, and Bing's, can't build a database of all websites. They do this with (partly published) algorithms, but I think you need even more, because you need a 100% accurate database; you would have to get it from all domain registrars, and you obviously can't do that.
Even if it were possible, I definitely would not do this for a client; I would found a company around it. But I would say it's practically impossible.
What you can do is search for a few thousand sites and categorize them manually, or build a small "spider".
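A minimal sketch of such a small spider, using only the Python standard library. The seed URL is a placeholder, and the `.in` suffix check is an assumption about what counts as an Indian domain:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collect href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed, max_pages=50):
    """Breadth-first crawl from seed, collecting .in hostnames."""
    seen, queue, domains = set(), [seed], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip unreachable or broken pages
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            host = urlparse(absolute).hostname or ""
            if host.endswith(".in"):  # assumption: Indian domain = .in TLD
                domains.add(host)
                queue.append(absolute)
    return sorted(domains)

print(crawl("https://example.in/"))  # seed URL is a placeholder
```

Categorizing the discovered domains would still be manual work or need a classifier, which is exactly why this only scales to a few thousand sites.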
I'm launching a startup website, and what I would like to know is how to start. I mean, is it better to use invitations first of all?
Then, how do I send invitations, and to whom?
How can I plan invitations? What are the best practices?
Has anyone been through this step with their own site?
Any experience to share?
Thanks
Whether you create a beta version of the site first is completely up to you.
It really depends on what type of website you're planning to make. Betas are obviously a good way to gain feedback on your website and its functionality before releasing it to everyone, allowing you to make improvements and fix bugs before the full launch.
In terms of actually getting users for the beta, it's very much a case of marketing your website and its existence well (through social media, advertising, etc.), and then providing some kind of 'sign up for the beta' page. You could then close registration for the beta once you have enough users, and devise some method of gaining feedback from those users.
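If you do gate the beta behind invitations, here is a minimal sketch of issuing and redeeming single-use invitation codes. The in-memory dict is a stand-in; a real site would persist this in its database:

```python
import secrets

_invites = {}  # code -> redeemed flag; a real site would use its database

def issue_invite():
    """Create a new single-use invitation code."""
    code = secrets.token_urlsafe(8)  # hard-to-guess random code
    _invites[code] = False
    return code

def redeem_invite(code):
    """Mark a code as used; reject unknown or already-used codes."""
    if _invites.get(code) is not False:
        return False
    _invites[code] = True
    return True

code = issue_invite()
print(redeem_invite(code))  # True: first use succeeds
print(redeem_invite(code))  # False: codes are single-use
```

Emailing each code to an invitee also gives you a built-in way to contact them later for feedback.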
I haven't personally created a beta myself, but if I were to do it, I would do the above.
Hope that's of some help.
I am building a question-and-answer site by myself.
I want Google to index this site as a Q&A site or forum, so that it can be retrieved via the "Discussions" filter in Google search. In my personal experience, Google's Discussions search is a pretty useful feature when I want to get others' real opinions or experiences.
However, I have no idea how Google determines that a site or page is a Q&A site or forum. I searched a lot on Google, but there is little related information discussing this issue. Do you have any ideas or references on that?
Thanks!
Best,
Jing
Use rich snippets, and make Google aware of your traffic by using Webmaster Tools or Analytics. Use a sitemap.xml to invite revisits and fast indexing, and disable archiving (e.g. Google Cache) with a meta robots noarchive tag. If you have high traffic and fast content growth, search engines will then recognize the site by themselves.
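A minimal sketch of generating such a sitemap.xml with the Python standard library; the URLs are placeholders, and the noarchive tag mentioned above is shown in a comment:

```python
import xml.etree.ElementTree as ET

# The archiving opt-out goes into each page's <head>:
#   <meta name="robots" content="noarchive">

def write_sitemap(urls, path="sitemap.xml"):
    """Write a minimal sitemap.xml listing the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Placeholder URLs; list your real question pages here.
write_sitemap(["https://example.com/questions/1",
               "https://example.com/questions/2"])
```

Submitting the resulting file in Webmaster Tools helps Google revisit new questions quickly.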
What is the best strategy for protecting against "registration bots", the ones that just POST registration forms to my server, creating dummy users?
For my application, it started with just a few fake accounts per day, but now it has become a real problem.
I would like to avoid confirmation emails as much as possible. What strategies can prevent this?
You can use a variety of techniques here:
Use a CAPTCHA such as reCAPTCHA.
Present the user with a trivial problem like "2+2=?". A human will be able to respond correctly, whereas a bot won't.
Add a hidden text field to your form. Bots are programmed to fill in every field they can. If you find that the hidden field contains data when the form is submitted, discard the request (see the sketch after this list).
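A minimal sketch of the hidden-field (honeypot) check on the server side. The field name "website" and the plain-dict form are assumptions for illustration:

```python
# The form contains a field humans never see, e.g. hidden via CSS:
#   <input type="text" name="website" style="display:none" autocomplete="off">

def is_probably_bot(form):
    """Return True if the hidden honeypot field was filled in.

    Humans never see the "website" field, so any value in it means
    the form was almost certainly submitted by an automated script.
    """
    return bool(form.get("website", "").strip())

print(is_probably_bot({"username": "alice", "website": ""}))        # False
print(is_probably_bot({"username": "bot", "website": "spam.com"}))  # True
```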
Any kind of CAPTCHA will do, e.g. reCAPTCHA, but for common bots a simple check like "from the checkboxes below, please select the nth one" will also work.
Also, if you use a popular app like phpBB, just a little tweaking of the registration page will do it.
If your site is very popular, then it's a different story altogether; there will always be a way to write bots specifically designed for your site. But these basic tricks should be enough to stop generic bots.
You could log the IPs of those bots and block them. That is, if they are not rotating through lots of IPs.
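A minimal sketch of that approach, blocking further signups from an IP that exceeds a rate limit; the window and threshold are arbitrary choices:

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 3600  # look at the last hour (arbitrary)
MAX_SIGNUPS = 5        # allowed registrations per IP per window (arbitrary)

_signups = defaultdict(deque)  # ip -> timestamps of recent registrations

def allow_registration(ip, now=None):
    """Return False once an IP exceeds the signup rate limit."""
    now = time.time() if now is None else now
    recent = _signups[ip]
    while recent and now - recent[0] > WINDOW_SECONDS:
        recent.popleft()  # drop timestamps outside the window
    if len(recent) >= MAX_SIGNUPS:
        return False
    recent.append(now)
    return True
```

This is cheap to add, but as noted, it does nothing against bots that rotate through many IPs.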
Can anyone recommend any cloud-based alternatives to SharePoint? I have seen a couple of good ones on sites like www.sharepointalternative.com and www.topsharepointalternatives.com, but does anyone have experience with ones they have used before?
We are a small company of 16 people, but we are looking to expand to around 30 by the end of the year, so the solution should be easily scalable. We would need to be able to easily share and edit files and to have version control.
It also has to work as an internal and external portal, as we want to share with clients as well as internally.
Check out http://www.alfresco.com/. We tried it, but it was not a good solution for us since we have hundreds of users and our groups needed many sub-sites with their own permissions.
Google Apps, for sure!
http://www.ilovefreesoftware.com/09/articles/sharepoint-vs-google-apps.html
It's free for up to 20 users, I believe, so you can easily try it out without extra costs and then pay a little if you expand enough.
CMSWire has a great list of SharePoint alternatives. Glasscubes seems to be the most viable option for what you are looking for; however, I have zero experience with it.