I want to create a SharePoint Server setup that allows applications to be highly available. Say we have a portal in SharePoint and I want it to be available at all times. I know this involves WFEs (web front-ends). Can someone point me to an article or to the architecture that needs to be set up for this?
Having multiple WFEs (web front-ends) will make the web tier of your SharePoint farm more reliable: if one goes down, your load balancer can stop sending requests to it. There is no way to ensure 100% uptime; reliability is a combination of redundancy (in hardware and services), monitoring, 24x7 staff to fix problems, and so on.
Some things to look at:
Plan for Redundancy
http://technet.microsoft.com/en-us/library/cc263044.aspx
Plan for Availability
http://technet.microsoft.com/en-us/library/cc748832.aspx
There are third-party products that can help with fail-over, but I haven't used any that I could recommend.
See Lou's links. You can have redundant WFEs, query servers, and application servers as well as cluster your database.
Note that you cannot have a redundant index server unless you have two SSPs that basically index the same content. The query servers get the index replicated onto them, so if the index server goes down you can still perform queries; the index just won't be updated until the index server comes back online. If you can't get it back online, you will need to rebuild your index (full crawls).
This is probably far-fetched, but can Spark (or any advanced "ETL" technology you know of) connect directly to SQL Server's transaction log file (the .ldf) and extract its data?
The goal is to get SQL Server's real-time operational data without replicating the whole database first (and without selecting directly from it).
Appreciate your thoughts!
Rea
To answer your question: I have never heard of any technology that reads an LDF directly, but there are several products on the market that can "link-clone" a database almost instantly using some internal tricks. Keep in mind that the data is not copied by these tools; they simply allow near-instant access for use cases like yours.
There may be some free ways to do this, especially using cloud functions or the linked-clone features that virtual machines offer, but the only ones I know of at the moment are paid products such as Dell EMC's, Redgate's, and Windocks.
The easiest ones to try that are not in the cloud are:
- Red Gate SQL Clone, which has a 14-day free trial: Red Gate SQL Clone Link
- Windocks.com (free for some use cases, but harder to get started with)
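Once you have a clone to point at, reading it from Spark is then an ordinary JDBC read. A minimal PySpark sketch, assuming the Microsoft JDBC driver (mssql-jdbc) is on Spark's classpath; the host, credentials, and table name below are placeholders:

```python
# Read from a link-cloned SQL Server database over JDBC. The clone behaves
# like any SQL Server instance, so the standard JDBC source works.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("read-from-clone").getOrCreate()

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://clone-host:1433;databaseName=MyDbClone")
    .option("dbtable", "dbo.Orders")            # placeholder table
    .option("user", "etl_user")                 # placeholder credentials
    .option("password", "secret")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)

df.show(10)
```

Because the clone is isolated from the production instance, heavy Spark scans don't compete with operational queries, which is the main appeal of this approach.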
I found some technical articles which mentioned that I need three separate servers for a SharePoint production environment:
- the first for the database,
- the second for the application server,
- and the third for the web front-end.
But in my case I am planning to have only two servers: one for the database and the other for the application and front-end combined. Is it still valid to have two servers, bearing in mind that my deployment is fairly small, with around 60 internal users and around 100 external users?
You can set it up this way. The difference will be in how many SharePoint Service Applications you start on each box.
In environments that have three machines, you will see that one box is dedicated to the web front end and another runs the desired SharePoint Service Applications such as Search, Excel Services, PerformancePoint, etc. Since those applications are memory- and processor-intensive, it is best to keep them on a separate machine.
Your performance may vary based on the scale of the hardware in your boxes and how many of those Service Applications you need to kick off.
Some Service Applications, such as Excel Services and PerformancePoint, can cause a lot of load and need to be finely tuned. I recommend looking into each one you plan on starting to determine whether it will put too much load on your machine.
I am thinking of moving a website from a VPS to Windows Azure Web Sites. After doing a load test, I accidentally took down my test website by going around 30 MB over the daily bandwidth limit.
This made me wonder what would happen if my website were suddenly hit by a DDoS attack. I'm pretty sure it would max out the daily and hourly limits in no time and, even worse, redirect all users to the Azure over-limit notification.
Is there anything that can be done about that? I know the daily bandwidth limit will be harder to reach once I put all the images on a CDN, but I'm worried about what would happen if there's a traffic spike or something similar.
Sorry if this question is a bit all over the place; I hope you understand what I'm asking.
Windows Azure has built-in load balancers that will stave off most (if not all) DoS-type attacks. The truth is, Microsoft is very hush-hush about the specifics of how their load balancers protect against malicious attacks (as they should be).
An added benefit of hosting your applications in the cloud is that you can take advantage of auto-scaling when you get heavy loads (malicious or otherwise), so your site won't go down.
You might want to check out the Security Best Practices For Developing Windows Azure Applications document for more information on this.
I have quite a few domains that I manage (100+), and I'm getting tired of GoDaddy's management tools. Whenever I need to make changes, shifting things around to DreamHost, or from Heroku to Google App Engine, or to my own VPS and private servers, things eventually get hairy, and it's tiresome to have to go to multiple places to manage everything.
I was curious whether there is a solid option for developers who need robust domain management. I don't really see an answer in the DynDNS or EasyDNS options (and please correct me if I'm wrong). Perhaps I'm overlooking something.
I'm really looking for a single console to rule them all (i.e., register wherever and point the NS entries at the master service), and then be able to go into a domain and, by using a template, split everything out to where I want it to go. In other words, by setting up my own DNS templates I could, in one fell swoop, set up Google Apps subdomains, development DynDNS CNAMEs, AWS CDN entries, etc.
Anyone aware of such a comprehensive solution?
I'm quite happy with DynDNS, but I'm equally satisfied with Zerigo: templates, an AJAX interface, migration tools, an API...
Short of deploying your own infrastructure or piggybacking off something like Dynect, I'd hazard that Zerigo should do everything you want. The fact that it's recently been acquired by 8x8 suggests other people agree.
[I don't work for them if this sounds like a plug ;)]
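For a concrete picture of the template workflow the question describes, here is a hypothetical sketch against a generic provider REST API. Zerigo and similar services expose record-CRUD APIs, but the endpoint, token, and payload shape below are invented for illustration; adapt them to the provider's real API:

```python
# Apply one DNS "template" (a fixed set of records) to many domains in one
# sweep. The URL, token, and JSON shape are hypothetical placeholders.
import requests

API = "https://dns.example-provider.com/api/zones"   # hypothetical endpoint
TOKEN = "YOUR-API-TOKEN"                             # placeholder

# One template: Google Apps mail, a dev DynDNS alias, and an AWS CDN alias.
TEMPLATE = [
    {"type": "CNAME", "name": "mail", "data": "ghs.google.com."},
    {"type": "CNAME", "name": "dev",  "data": "myhost.dyndns.org."},
    {"type": "CNAME", "name": "cdn",  "data": "d111111abcdef8.cloudfront.net."},
]

def apply_template(domain):
    # POST each record of the template into the domain's zone.
    for record in TEMPLATE:
        requests.post(
            f"{API}/{domain}/records",
            json=record,
            headers={"Authorization": f"Bearer {TOKEN}"},
            timeout=30,
        ).raise_for_status()

for domain in ["example.com", "example.net"]:
    apply_template(domain)
```

The point is less the specific API than the shape of the workflow: register anywhere, delegate NS to one service, and script the per-domain setup once.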
Simply put, I have a domain, xyz.com.
I want pc.xyz.com to point to my PC's IP address (which is dynamic).
Are there any available solutions?
I need a Mac client to report the changing IP, and a service running on my domain to receive those updates.
Something like http://www.dyndns.com/.
(I have the domain through DreamHost, if that helps.)
You can set up a CNAME entry so that pc.xyz.com is an alias for a DynDNS name. I know that doesn't strictly answer the question of how to run a DynDNS-like service yourself, but it will achieve the effect you described with a minimum of effort.
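If you go this route, a quick way to confirm the alias is working is a sketch like the following, using the dnspython package (pip install dnspython), with pc.xyz.com as a placeholder:

```python
# Verify that pc.xyz.com is a CNAME to the DynDNS name and resolves to
# the current dynamic IP.
import dns.resolver

# Follow the CNAME: shows which DynDNS hostname pc.xyz.com aliases.
for rr in dns.resolver.resolve("pc.xyz.com", "CNAME"):
    print("pc.xyz.com is an alias for", rr.target)

# Resolving the A record then yields the current dynamic IP.
for rr in dns.resolver.resolve("pc.xyz.com", "A"):
    print("current IP:", rr.address)
```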
How to set up DNS service dynamic / static is a good place to start. Technically, the concepts are not difficult, but it is much easier if you use a DNS server that can read from MySQL or some other database. For example: the MySQL BIND SDB driver...
That project was started so that sub-domains for users' homepages could be created automatically on account creation.
This is by far the easiest approach, and it lets you write a very thin client that sends a quick web request to your system to update the DNS with your new IP... Maybe even build your own REST API, as in the sketch below...
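As a concrete illustration of that thin-client/REST idea, here is a minimal sketch using Flask and PyMySQL. It assumes a BIND SDB-style MySQL backend with a records table holding name/type/data columns; the actual schema depends on how your SDB driver was built, so treat those names as placeholders, and add authentication before exposing anything like this:

```python
# Minimal "roll your own dynamic DNS" endpoint: the client only has to hit
# /update, and we record the caller's source address as the new A record
# in the table BIND's SDB driver reads from. Schema/credentials are
# placeholders.
import pymysql
from flask import Flask, request

app = Flask(__name__)

@app.route("/update", methods=["POST"])
def update_record():
    new_ip = request.remote_addr  # take the caller's address as the new IP
    conn = pymysql.connect(host="localhost", user="dns",
                           password="secret", database="bind")
    try:
        with conn.cursor() as cur:
            cur.execute(
                "UPDATE records SET data = %s "
                "WHERE name = %s AND type = 'A'",
                (new_ip, "pc.xyz.com"),
            )
        conn.commit()
    finally:
        conn.close()
    return f"pc.xyz.com -> {new_ip}\n"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

The Mac-side client then reduces to a one-line scheduled web request against this endpoint.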
You could combine cron (or, since you're using a Mac, launchd) and the DreamHost API to achieve the result you want, as described here.
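A sketch of what that scheduled job might look like in Python: it discovers the public IP and pushes it to DreamHost. The dns-list_records/dns-remove_record/dns-add_record commands come from DreamHost's published API, but check the current docs before relying on them; the key and hostname are placeholders:

```python
# Run this from cron or launchd: look up the current public IP and update
# the A record via the DreamHost API when it has changed.
import requests

API = "https://api.dreamhost.com/"
KEY = "YOUR-DREAMHOST-API-KEY"   # placeholder
HOST = "pc.xyz.com"              # placeholder

def dh(cmd, **params):
    # Small helper for DreamHost's key/cmd-style API.
    params.update(key=KEY, cmd=cmd, format="json")
    return requests.get(API, params=params, timeout=30).json()

# Discover the public IP via a third-party echo service (assumption: any
# "what is my IP" endpoint works here).
new_ip = requests.get("https://api.ipify.org", timeout=30).text.strip()

# Find the existing A record, if any, and replace it when the IP changed.
records = dh("dns-list_records")["data"]
current = [r for r in records if r["record"] == HOST and r["type"] == "A"]

if current and current[0]["value"] == new_ip:
    print("IP unchanged:", new_ip)
else:
    if current:
        dh("dns-remove_record", record=HOST, type="A",
           value=current[0]["value"])
    dh("dns-add_record", record=HOST, type="A", value=new_ip)
    print("Updated", HOST, "->", new_ip)
```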