Creating more than one site in Kentico?

I'm going to use Kentico to create more than one store (site) and assign a user to each store who can add/modify/delete their own products. I've created two stores, the first one on the domain localhost:8080 and the second on storeone.localhost:8080, as described in the Kentico documentation. I can open the first site with no problem, but when I try to switch to the second site it gives me "Bad Request - Invalid Hostname". Can anyone help me with this? I would also appreciate help with extracting product data using the Kentico APIs, as the documentation only shows me how to update/modify/remove data in the database, and I want to know how to display a product along with the attachments (images, PDFs) that I've uploaded.

The best approach is to use two different ports. The reason is that IIS binds to port 80 by default. So what I'd do is leave one site on port 80 and put the other on another port, say 2. Make these bindings in IIS, then go to Kentico and add your second site as localhost:2 instead of :8080. Otherwise there's a conflict with port numbers: Kentico and IIS are "confused" and don't know which one to serve. The only way it will work with the same port is to start and stop sites within Kentico.

Brenden is correct - there cannot be two sites running on the same domain. What you need to do is configure the IIS bindings. What I often do is edit my hosts file (C:\Windows\System32\drivers\etc\hosts) and add a few more entries, for example:
127.0.0.1 localhost2
127.0.0.1 localhost3
Then I can bind my Kentico sites to these domains. Don't forget to also change the domain names in the Kentico -> Sites application.
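The bindings themselves are usually added in IIS Manager, but if you prefer to script it, here is a rough sketch using Microsoft.Web.Administration (the site name "Default Web Site" and port 8080 are placeholders; it needs administrator rights and a reference to Microsoft.Web.Administration.dll):
using Microsoft.Web.Administration;

// Sketch: add host-name bindings for the extra hosts-file entries to an existing IIS site.
using (var serverManager = new ServerManager())
{
    var site = serverManager.Sites["Default Web Site"]; // placeholder site name
    site.Bindings.Add("*:8080:localhost2", "http");
    site.Bindings.Add("*:8080:localhost3", "http");
    serverManager.CommitChanges();
}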
As for your second question:
It depends on whether you want to get only the SKUInfo object, or the page object where the custom data (page type fields) is stored. If you just need the SKUInfo, you can use something like:
// Gets only the corresponding SKUInfo object
var singleProduct = SKUInfoProvider.GetSKUInfo(1); // SKUID from the COM_SKU table
if (singleProduct != null)
{
    var name = singleProduct.SKUName;
    var price = singleProduct.SKUPrice;
}
If you need to get the product with all of its custom fields, you need to use the Pages API as you would with any other page. A simple example:
// Gets the SKU with all custom properties
var tree = new TreeProvider(MembershipContext.AuthenticatedUser);
var singleProduct = tree.SelectSingleDocument(2); // DocumentID from the CMS_Document table
if (singleProduct != null)
{
    // work with the product
}

// Or, for multiple products:
var products = tree.SelectNodes("custom.myProductType");
foreach (var product in products)
{
    // work with the products/pages
}
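If you also need to read the page type's custom fields, or the SKU linked to a product page, you can do that from the returned nodes. A rough sketch - the field name "MyCustomField" is hypothetical, and "custom.myProductType" is just the example page type from above:
// Sketch: read a custom page type field and the linked SKU record from product pages
var tree = new TreeProvider(MembershipContext.AuthenticatedUser);
foreach (TreeNode product in tree.SelectNodes("custom.myProductType"))
{
    // Custom page type field defined on the product document (hypothetical field name)
    var customText = ValidationHelper.GetString(product.GetValue("MyCustomField"), "");

    // SKU record linked to this product page (price, name, ...)
    var sku = SKUInfoProvider.GetSKUInfo(product.NodeSKUID);
    if (sku != null)
    {
        var price = sku.SKUPrice;
    }
}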
For the purpose of retrieving pages, I would highly recommend checking the documentation article on the topic, which contains a lot of examples.

Related

How does Tumblr approach custom domain mapping?

I searched all over but can't find a clear answer, or even an engineering blog post, describing how companies map custom domains to their applications.
For example, let's say I have a Tumblr page with the URL www.ashley.tumblr.com. The site allows you to add a custom domain, so that visiting www.ashley.com renders www.ashley.tumblr.com with full support for additional pages and directories.
What is the technical name for developing this?
There's no single name for what they're doing, which is engineering their HTTP/web-server code to handle requests with an arbitrary HTTP Host: header and map them to existing Tumblr accounts. It has nothing to do with DNS, other than requiring the owner of a custom domain name to point their A, AAAA, or CNAME records at the same host as the non-custom domain (to guarantee this, it's usual to make the custom domain name a CNAME for the non-custom domain, in case the non-custom domain's IP address is subject to change).
Exposition time! Most conventional web servers (Apache, IIS) are built around the concept of a "website": a physical directory mapped to requests that match a predefined list of HTTP Host: header values (or some wildcard pattern) plus protocol and port bindings. For example, you'd add an entry called "MyWebsite.com" (the website name) that accepts requests to mywebsite.com and www.mywebsite.com (two distinct Host: header values), and maybe some more, like secure.mywebsite.com over HTTPS on port 443.
More modern lightweight web servers and reverse proxies (like nginx and Node.js's Express) dispense with physical directory mapping and let the application code decide entirely how to route requests within the application's logic (this is what a "router" and/or "demultiplexer" (demux) does in web-application terminology). This comes at the expense of having to handle all of that routing logic yourself (to be fair, these web servers ship with the tools to configure them much like the older conventional servers; it just isn't the default).
...but the advantage is that you can make it work exactly like you want.
In pseudocode their program probably looks something like this:
void handleRequest(Request request) {
    String hostHeader = request.getHeader("Host")
    RegexMatch nonCustomDomainMatch = hostHeader.match( "^([^.]+)\.tumblr\.com$" )
    if nonCustomDomainMatch.success {
        String accountName = nonCustomDomainMatch.groups[1] // the captured subdomain
        showAccount( accountName )
    }
    else {
        // Look up the custom domain name in a database or other mutable data store:
        String accountName = db.execQuery( "SELECT accountName FROM accounts WHERE accounts.customDomainName = #cdn", new { cdn: hostHeader } )
        if accountName == null {
            showHttp404Error()
        }
        else {
            showAccount( accountName )
        }
    }
}
In reality, given their size and scale, it would likely be some custom logic inside hardware load-balancers or some other lightweight frontend service - and always with aggressive caching (database lookups are expensive!).
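For a concrete (if simplified) illustration, here is roughly how that dispatch could look as an ASP.NET Core minimal API (Program.cs, assuming the Web SDK's implicit usings). This is a sketch, not Tumblr's actual code: the dictionary stands in for the custom-domain lookup, and the domain and account names are made up.
using System.Text.RegularExpressions;

// Stand-in for the custom-domain table; in practice a database behind an aggressive cache.
var customDomains = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
{
    ["www.ashley.com"] = "ashley" // hypothetical mapping
};

var app = WebApplication.Create(args);

app.Run(async context =>
{
    var host = context.Request.Host.Host; // e.g. "ashley.tumblr.com" or "www.ashley.com"
    var match = Regex.Match(host, @"^(?:www\.)?([^.]+)\.tumblr\.com$", RegexOptions.IgnoreCase);

    var account = match.Success
        ? match.Groups[1].Value                  // account taken from the subdomain
        : customDomains.GetValueOrDefault(host); // account mapped from a custom domain

    if (account is null)
    {
        context.Response.StatusCode = 404;
        await context.Response.WriteAsync("Unknown host");
        return;
    }

    await context.Response.WriteAsync($"Rendering blog for account '{account}'");
});

app.Run();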

ASP.NET MVC 5 custom RazorViewEngine for multiple portal structure

I set up my MVC 5 site by category, with controller, model, and view sub-folders in each category; i.e. the root folders \Home and \Products each have these three sub-folders, and there is also a root \Shared\Views folder. I followed a terrific article by Matthew Renz, Clean Architecture in ASP.NET MVC 5. It's done in part by creating a custom RazorViewEngine, specifically:
public CustomRazorViewEngine()
{
    ViewLocationFormats = new string[]
    {
        "~/{1}/Views/{0}.cshtml",
    };

    PartialViewLocationFormats = new string[]
    {
        "~/Shared/Views/{0}.cshtml"
    };
}
There aren't many changes beyond that. I was wondering if I could build on this idea and set up a website project with a \Portals root folder and a sub-folder for each portal, keyed by some identifier (name or number) - similar to DNN. The changes to the custom Razor view engine code might look something like:
public CustomRazorViewEngine()
{
    ViewLocationFormats = new string[]
    {
        "~/Portals/{2}/{1}/Views/{0}.cshtml",
    };

    PartialViewLocationFormats = new string[]
    {
        "~/Portals/{2}/Shared/Views/{0}.cshtml"
    };
}
I am not sure where the values {0} and {1} come from, however. I could find a means for obtaining {2}, the portal website name. The relative paths for the rest of the site, such as \Content, \Scripts, etc. I believe I could structure myself.
The purpose for this approach is to deliver to the client a solution in which common code can be reused to support a number of portals with unique skins and features. Thank you for your time and consideration and let me know if you have any questions.
John
These are placeholders in the view location format strings that the view engine fills in when resolving a view: {0} is the view (action) name, {1} is the controller name, and {2} is the area name.
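For what it's worth, the engine from the question is typically registered at startup in place of the defaults. A sketch of that registration in Global.asax.cs (the class and route-config names are the standard MVC 5 template ones):
using System.Web.Mvc;
using System.Web.Routing;

public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        AreaRegistration.RegisterAllAreas();
        RouteConfig.RegisterRoutes(RouteTable.Routes);

        // Replace the default view engines so only the custom locations are searched.
        ViewEngines.Engines.Clear();
        ViewEngines.Engines.Add(new CustomRazorViewEngine());
    }
}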
You may also be interested to know that when using ASP.NET Core it's easy to get the standard Razor view engine to locate views in custom locations via a view location expander, rather than needing to create a new view engine that inherits from RazorViewEngine. I only mention this because you added the asp.net-core-mvc tag to your question.
Here is a Stack Overflow answer that shows how:
How to specify the view location in asp.net core mvc when using custom locations?
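Roughly, such an expander might look like the sketch below; the class name and the "portal" route value are illustrative, not part of any framework convention:
using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc.Razor;

// Adds per-portal view locations ({1} = controller, {0} = view name) ahead of the defaults.
public class PortalViewLocationExpander : IViewLocationExpander
{
    public void PopulateValues(ViewLocationExpanderContext context)
    {
        // Make the portal part of the view-location cache key.
        context.Values["portal"] = context.ActionContext.RouteData.Values["portal"]?.ToString() ?? string.Empty;
    }

    public IEnumerable<string> ExpandViewLocations(
        ViewLocationExpanderContext context, IEnumerable<string> viewLocations)
    {
        if (context.Values.TryGetValue("portal", out var portal) && !string.IsNullOrEmpty(portal))
        {
            yield return $"/Portals/{portal}/{{1}}/Views/{{0}}.cshtml";
            yield return $"/Portals/{portal}/Shared/Views/{{0}}.cshtml";
        }

        foreach (var location in viewLocations)
        {
            yield return location;
        }
    }
}

// Registration (Startup.ConfigureServices or Program.cs):
// services.Configure<RazorViewEngineOptions>(o =>
//     o.ViewLocationExpanders.Add(new PortalViewLocationExpander()));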

Microsoft SharePoint 2010 - Connecting to a remote site and fetching change logs

My problem is simple. I have a registered SharePoint site/domain (say https://secretText-my.sharepoint.com/personal/blabla) and I want to fetch the change logs as described in the SharePoint change log documentation.
So my question boils down to: how can I use this change log API to fetch data from a remote SharePoint site?
How can I achieve this? I have tried the Client Object Model and everything related to it, but my goal is to use the SharePoint change log.
I am hoping for something like:
using (ClientContext ctx = ClaimClientContext.GetAuthenticatedContext("https://secretText-my.sharepoint.com/personal/blabla"))
{
    if (ctx != null)
    {
        ctx.Load(ctx.Web); // Query for Web
        ctx.ExecuteQuery(); // Execute
        ctx.Load(ctx.Site);
        ctx.ExecuteQuery();

        SPSite site = new SPSite(ctx.Site.Id);
        SPContentDatabase db = site.ContentDatabase;

        // Get the first batch of changes
        SPChangeCollection changes = db.GetChanges();
        // Use this 'site' object to fetch the change logs
        .
        .
        .
My aim is to somehow instantiate this SPSite object, which would then help me get the data I want. This code may seem a bit too ambitious (or totally wrong), but please don't hold it against me - I couldn't find any solution to this.
Much appreciated!
After a lot of Google searches and reading many answers, I have come to learn that it isn't possible to connect to a remote SharePoint server through the server API, as that API only works when the SharePoint server is on the same network (same machine or intranet).
The only solution is to use the Client Object Model. It provides (maps) quite a lot of the operations that the server API offers.
To connect to the remote site I used the samples provided on MSDN for the Client Object Model.
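The client API also exposes the change log itself, so you don't need an SPSite at all. A hedged sketch with CSOM (authentication is elided; use whatever approach works in your environment, e.g. the claims sample or SharePointOnlineCredentials):
using System;
using Microsoft.SharePoint.Client;

// Sketch: query the change log of a remote site collection via CSOM.
using (var ctx = new ClientContext("https://secretText-my.sharepoint.com/personal/blabla"))
{
    // ctx.Credentials = ...; // supply credentials appropriate for your environment

    var query = new ChangeQuery(true, true) // all object types, all change types
    {
        FetchLimit = 100                    // read the changes in batches
    };

    ChangeCollection changes = ctx.Site.GetChanges(query);
    ctx.Load(changes);
    ctx.ExecuteQuery();

    foreach (Change change in changes)
    {
        Console.WriteLine("{0} at {1} (token {2})", change.ChangeType, change.Time, change.ChangeToken.StringValue);
    }
}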

Reverse lookup of Terminal Services Home Folder with Directory Services

I'm looking for a way to query the Terminal Services Home Folder property of user objects in Active Directory. My goal is to be able to perform a reverse lookup, finding the user(s) that use a particular home folder.
Normally to perform a search I would do something like this:
using (var search = new DirectorySearcher())
{
    // Find a user based on their telephone number
    search.Filter = "(telephoneNumber=999)";
    search.PropertiesToLoad.Add("displayName");

    var result = search.FindOne();
    if (result != null) { .... }
}
But the Terminal Services settings don't seem to have a referable LDAP attribute name - in the past, to set these values, I've had to use the IADsTSUserEx interface with an existing DirectoryEntry to manipulate the TS profile and home folder properties. However, this is only useful when I already have the user account in question - it's not very practical to step through every user in a domain and create a DirectoryEntry object for each one just to check their TS profile path.
Is there any practical way to perform a "WHERE User.TerminalServicesProfilePath=X" query in .NET?
Nobody else has answered, so I'll attempt an explanation.
In Windows Server 2008 (and R2), the Terminal Services Terminal Server Runtime Interface takes the user parameters from the user's Active Directory attribute called userParameters. As explained in the Microsoft documentation, userParameters contains the Terminal Server parameters as a binary blob - which is why there is no separate LDAP attribute you can filter on.
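Because of that, a reverse lookup has to enumerate candidate users and read the decoded properties per account (essentially the walk the question calls impractical). A sketch using the ADSI/IADsTSUserEx extension via InvokeGet - the profile path below is hypothetical, and the TS extension must be available on the machine running the code:
using System;
using System.DirectoryServices;

// Sketch: find users whose TS profile path matches a given value by checking each account.
string wanted = @"\\server\share\profiles\someone"; // hypothetical path to look for

using (var search = new DirectorySearcher("(&(objectCategory=person)(objectClass=user)(userParameters=*))"))
{
    search.PageSize = 500; // page through large domains

    foreach (SearchResult result in search.FindAll())
    {
        using (DirectoryEntry user = result.GetDirectoryEntry())
        {
            var profilePath = user.InvokeGet("TerminalServicesProfilePath") as string;
            if (string.Equals(profilePath, wanted, StringComparison.OrdinalIgnoreCase))
            {
                Console.WriteLine(user.Properties["distinguishedName"].Value);
            }
        }
    }
}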

WSS 3.0 Site Provisioning

Is there any way to do WSS 3.0 site provisioning? My client's requirement is that attributes such as organization name, logo, address, and user and role information are defined as variables in an XML file. The client should be able to install this web application on any WSS production server just by defining the attributes in the XML file.
Is it possible to write a utility that parses that well-defined XML and provisions the site accordingly?
It's possible to provision sites from the object model, but creating entirely customized sites is beyond the scope of a single question. To get you started, you should take a look at SPWebCollection.Add as well as SPSiteCollection.Add.
To create a site collection and some subsites in one of your web applications, you could use something like this:
var farm = SPFarm.Local;
var solution = farm.Solutions.GetValue<SPSolution>("YourSolution.wsp");
var application = solution.DeployedWebApplications.First();
var sites = application.Sites;

using (var site = sites.Add("/", "Root Site", "Description", 1033, "YOURTEMPLATE#1",
                            @"YOURDOMAIN\SiteCollectionAdmin", "Site Collection Admin", "admin@yourcompany.example"))
{
    using (var rootWeb = site.RootWeb)
    {
        // Code customizing the root site goes here

        using (var subSite = rootWeb.Webs.Add("SubSite", "Sub Site", "Description", 1033, "YOURTEMPLATE#2", false, false))
        {
            // Code customizing the sub site goes here
        }
    }
}
Yes, there is more than one way.
Take a look at the SharePoint Solution Generator, which is part of Windows SharePoint Services 3.0 Tools: Visual Studio 2005 Extensions.
You may create a site with all of your requirements (pages, lists, document libraries...) and then generate a VS project that packages the whole site as a SharePoint feature. You may then deploy that feature to any WSS production server.
You may alter the VS project to implement the logic that reads your attributes from an additional XML file.
If the structure of your site is plain, or you can save it as a template, you may also write a small console application that reads the attribute XML file and creates the site.
Create a regular solution, or use the aforementioned Solution Generator, to produce the .wsp file. Then create a small console application that expects the variables you mentioned as parameters.
With the code listed above, provision the new site collection from that solution and store the entered parameters (company name, etc.) in a list in the site, or in the SPSite.Properties property bag, from which you can then read them in custom web parts and so on.
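A bare-bones sketch of such a console application is below; the XML element names are made up, it stores the values in the root web's property bag, and it assumes an already-created site collection whose URL is passed on the command line:
using System;
using System.Xml.Linq;
using Microsoft.SharePoint;

// Sketch: read provisioning attributes from an XML file and apply them to a site.
class Provisioner
{
    static void Main(string[] args)
    {
        var settings = XDocument.Load(args[0]).Root;                 // e.g. provisioning.xml
        var orgName = (string)settings.Element("OrganizationName");  // hypothetical element
        var address = (string)settings.Element("Address");           // hypothetical element

        using (var site = new SPSite(args[1]))                        // URL of the provisioned site collection
        using (var web = site.RootWeb)
        {
            web.Title = orgName;

            // Keep the raw attributes around for web parts etc. via the property bag.
            web.Properties["OrganizationName"] = orgName;
            web.Properties["Address"] = address;
            web.Properties.Update();

            web.Update();
        }
    }
}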
The SharePoint Data Population Tool available on CodePlex allows you to define sites with XML.
