Tried the following, as per one suggestion:
static void Main()
{
    string url = "http://localhost:8080/";
    StartOptions options = new StartOptions();
    options.Urls.Add("http://localhost:8080");
    options.Urls.Add("http://127.0.0.1:8080");
    options.Urls.Add(string.Format("http://{0}:8080", Environment.MachineName));

    using (WebApp.Start<Startup>(options))
    {
        HttpClient client = new HttpClient();
        var response = client.GetAsync(url + "api/Values").Result;
        Console.WriteLine(response);
        Console.WriteLine(response.Content.ReadAsStringAsync().Result);
        Console.ReadLine();
    }
}
Can connect with a local browser, but remote machines get 'webpage is not available'.
Set a reservation with 'add urlacl url=http://*:8080/ user=EVERYONE'.
Port 8080 is open; I have tried many other ports with the same results.
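(For reference, the reservation plus an inbound firewall rule, both run from an elevated prompt, would look something like the following; the rule name is arbitrary.)

netsh http add urlacl url=http://*:8080/ user=EVERYONE
netsh advfirewall firewall add rule name="OWIN 8080" dir=in action=allow protocol=TCP localport=8080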
I tried the suggestion below as well, with the same result:
var options = new StartOptions("http://*:8080")
{
    ServerFactory = "Microsoft.Owin.Host.HttpListener"
};
Nothing works remotely, only locally. Is there anything else I can try?
I'm a beginner when it comes to HTTP connections.
Currently I'm working with the SAAJ API for my SOAP-based client, and to handle timeouts I ended up using a URLStreamHandler to set HTTP connection properties on the endpoints.
The problem is that the timeout works on my Windows-based system, but it isn't working on the Linux server it is going to go live on.
Below is the code for building the endpoint with the properties set. It is an HTTP POST connection.
URL endpoint = new URL(null, url, new URLStreamHandler() {
    protected URLConnection openConnection(URL url) throws IOException {
        // The url is the parent of this stream handler, so must create a clone
        URL clone = new URL(url.toString());
        HttpURLConnection connection = (HttpURLConnection) clone.openConnection();
        connection.setRequestProperty("Content-Type", "text/xml");
        connection.setRequestProperty("Accept", "application/soap+xml, text/*");
        // If we cast to HttpURLConnection, we can set redirects
        // connection.setInstanceFollowRedirects(false);
        connection.setDoOutput(true);
        connection.setConnectTimeout(3 * 1000);
        connection.setReadTimeout(3 * 1000);
        return connection;
    }
});
For the SAAJ API part, below is the implementation; it's a pretty basic one:
SOAPConnectionFactory soapConnectionFactory = SOAPConnectionFactory.newInstance();
soapConnection = soapConnectionFactory.createConnection();

is = new ByteArrayInputStream(command.getBytes());
SOAPMessage request = MessageFactory.newInstance(SOAPConstants.SOAP_1_1_PROTOCOL)
        .createMessage(new MimeHeaders(), is);
MimeHeaders headers = request.getMimeHeaders();
headers.addHeader("Content-Type", "text/xml");
request.saveChanges();

ByteArrayOutputStream out = new ByteArrayOutputStream();
request.writeTo(out);

soapResponse = soapConnection.call(request, endpoint);
Could system properties be affecting the connect or read timeout? If so, please let me know what could cause this behavior.
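(As an aside, and only a guess: the JVM does have default networking timeouts controlled by system properties, which apply only when no explicit timeout has been set on a connection. If the Linux server launches Java with different flags than the Windows box, that could change behavior. The jar name below is hypothetical:)

java -Dsun.net.client.defaultConnectTimeout=3000 -Dsun.net.client.defaultReadTimeout=3000 -jar client.jar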
The following error started coming up suddenly! It worked for months after the application was deployed on the server, but it has suddenly stopped working.
The code I am using is as follows:
private void PostData()
{
    // Form fields to post
    var c = new System.Collections.Specialized.NameValueCollection();
    c.Add("publicid", "xxxxxxxxxxxxxxxxxxxxxx");
    c.Add("name", "ABC");
    c.Add("label:Affiliate_ID", "207");
    c.Add("website", "websitename.com");
    c.Add("label:IP_Address", Request.ServerVariables["REMOTE_ADDR"]);
    c.Add("firstname", txtFirstName.Text.Trim());
    c.Add("phone", txtPhone.Text.Trim());
    c.Add("lastname", txtLastName.Text.Trim());
    c.Add("email", txtEmail.Text.Trim());
    c.Add("label:Date_of_Birth", String.Format("{0:yyyy-MM-dd}", Convert.ToDateTime(txtDob.Text.Trim())));
    c.Add("lane", txtStreetAddress.Text.Trim());
    c.Add("code", txtZip.Text.Trim());
    c.Add("city", txtCity.Text.Trim());
    c.Add("Province", txtProvince.Text.Trim());
    c.Add("label:Time_At_Address", stay.ToString(CultureInfo.InvariantCulture));
    c.Add("label:Known_Credit_Issues", "yes");
    c.Add("label:Net_Monthly_Income", txtMothlyIncome.Text.Trim());
    c.Add("label:Occupation", txtOccopation.Text.Trim());
    c.Add("label:Employer_Name", txtEmployer.Text.Trim());
    c.Add("label:Employment_Length", "");
    c.Add("label:Bankruptcy", "No");
    c.Add("label:Employer_Phone_Number", "");
    c.Add("label:Employer_Postal_Code", "");
    c.Add("label:Employer_Province", "");
    c.Add("label:Employer_City", "");
    c.Add("label:Employer_Address", "");

    var myWebClient = new System.Net.WebClient();
    const string postingUrl = "https://xxxx.com/modules/Webforms/capture.php";
    byte[] responseArray = myWebClient.UploadValues(postingUrl, "POST", c);
    var responseData = Encoding.ASCII.GetString(responseArray);
}
Please help! It is a live application :(
Thanks in advance.
I'm going to take a shot in the dark and say that a system-wide proxy has been set on your server, and since WebClient instances use it by default, it's now failing.
After:
var myWebClient = new System.Net.WebClient();
Add:
myWebClient.Proxy = null;
Recompile/restart the app and let me know if my voodoo debugging sense was right or utter BS.
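If recompiling a live app is awkward, the same effect can usually be achieved from configuration alone. A minimal sketch using the standard .NET defaultProxy element in Web.config:

<configuration>
  <system.net>
    <!-- Ignore any machine-wide proxy for outbound requests from this app -->
    <defaultProxy enabled="false" />
  </system.net>
</configuration>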
I've set up a development server with AD and I'm trying to figure out how to connect to it via .NET. I'm working on the same machine that AD is installed on. I've gotten the DC name from AD, and the name of the machine, but the connection just does not work. I'm using the same credentials I used to connect to the server.
Any suggestions?
DirectoryEntry directoryEntry = new DirectoryEntry("LDAP://[dc.computername.com]", "administrator", "[adminpwd]");
Can you connect to the RootDSE container?
DirectoryEntry rootDSE = new DirectoryEntry("LDAP://RootDSE", "administrator", "[adminpwd]");
If that works, you can then read out some of the properties stored in that root container:
if (rootDSE != null)
{
    Console.WriteLine("RootDSE Properties:\n\n");
    foreach (string propName in rootDSE.Properties.PropertyNames)
    {
        Console.WriteLine("{0,-20}: {1}", propName, rootDSE.Properties[propName][0]);
    }
}
This will show you some information about what LDAP paths are present in your installation.
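If RootDSE answers, a handy next step is to read its defaultNamingContext property and bind to the domain root it names. A minimal sketch, reusing the placeholder credentials from above:

string defaultContext = rootDSE.Properties["defaultNamingContext"][0].ToString();
// Bind to the naming context that RootDSE reports for this domain
DirectoryEntry domainRoot = new DirectoryEntry("LDAP://" + defaultContext, "administrator", "[adminpwd]");
Console.WriteLine(domainRoot.Name);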
Try something like this, using the System.DirectoryServices.Protocols namespace:
// Define your connection (host:port; 389 is the default unsecured LDAP port)
LdapConnection ldapConnection = new LdapConnection("123.456.789.10:389");
try
{
    // Authenticate the username and password
    using (ldapConnection)
    {
        // Pass in the network creds and the domain
        var networkCredential = new NetworkCredential(Username, Password, Domain);

        // Since we're using unsecured port 389, set this to false.
        // If using port 636 over SSL, set it to true.
        ldapConnection.SessionOptions.SecureSocketLayer = false;
        ldapConnection.SessionOptions.VerifyServerCertificate += delegate { return true; };

        // To force NTLM/Kerberos use AuthType.Negotiate; for non-TLS and unsecured, use AuthType.Basic
        ldapConnection.AuthType = AuthType.Basic;
        ldapConnection.Bind(networkCredential);
    }
}
catch (LdapException ldapException)
{
    // Authentication failed; the exception will dictate why
}
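Once the Bind succeeds, the same connection can issue queries. A minimal sketch, placed inside the using block right after Bind (the base DN, filter, and attribute names here are placeholders, not values from the question):

var searchRequest = new SearchRequest(
    "DC=example,DC=com",            // placeholder base DN
    "(objectClass=user)",           // placeholder LDAP filter
    SearchScope.Subtree,
    "displayName", "mail");         // attributes to return
var searchResponse = (SearchResponse)ldapConnection.SendRequest(searchRequest);
foreach (SearchResultEntry entry in searchResponse.Entries)
{
    Console.WriteLine(entry.DistinguishedName);
}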
I'm trying to use a headless browser for crawling purposes, to add SEO features to an open source project I'm developing.
The project's sample site is deployed via Azure Websites.
I tried several ways to get the task working, using different solutions like Selenium .NET (PhantomJSDriver, HTMLUnitDriver, ...) or even the standalone PhantomJS .exe file.
I'm using a headless browser because the site is based on DurandalJS, so it needs to execute scripts and wait for a condition to be true in order to return the generated HTML. For this reason, I can't use things like the WebClient/WebResponse classes or HtmlAgilityPack, which work just fine for non-JavaScript sites.
All the above methods work in my devbox localhost environment, but the problem comes when uploading the site to Azure Websites. When using standalone PhantomJS, the site freezes when accessing the URL endpoint and after a while returns an HTTP 502 error. In the case of the Selenium WebDriver, I'm getting:
OpenQA.Selenium.WebDriverException: Unexpected error. System.Net.WebException: Unable to connect to the remote server ---> System.Net.Sockets.SocketException: No connection could be made because the target machine actively refused it 127.0.0.1:XXXX
I think the problem is with running .exe files in Azure and not with the code. I know it's possible to run .exe files in Azure Cloud Services via WebRoles/WorkerRoles, but I need to stay in Azure Websites to keep things simple.
Is it possible to run a headless browser in Azure Websites? Does anyone have experience with this type of situation?
My code for the standalone PhantomJS solution is:
// ASP.NET MVC ActionResult
public ActionResult GetHTML(string url)
{
    string appRoot = Server.MapPath("~/");
    var startInfo = new ProcessStartInfo
    {
        Arguments = String.Format("{0} {1}", Path.Combine(appRoot, "Scripts\\seo\\renderHTML.js"), url),
        FileName = Path.Combine(appRoot, "bin\\phantomjs.exe"),
        UseShellExecute = false,
        CreateNoWindow = true,
        RedirectStandardOutput = true,
        RedirectStandardError = true,
        RedirectStandardInput = true,
        StandardOutputEncoding = System.Text.Encoding.UTF8
    };
    var p = new Process();
    p.StartInfo = startInfo;
    p.Start();
    string output = p.StandardOutput.ReadToEnd();
    p.WaitForExit();
    ViewData["result"] = output;
    return View();
}
// PhantomJS script
var resourceWait = 300,
    maxRenderWait = 10000;

var page = require('webpage').create(),
    system = require('system'),
    count = 0,
    forcedRenderTimeout,
    renderTimeout;

page.viewportSize = { width: 1280, height: 1024 };

function doRender() {
    console.log(page.content);
    phantom.exit();
}

page.onResourceRequested = function (req) {
    count += 1;
    //console.log('> ' + req.id + ' - ' + req.url);
    clearTimeout(renderTimeout);
};

page.onResourceReceived = function (res) {
    if (!res.stage || res.stage === 'end') {
        count -= 1;
        //console.log(res.id + ' ' + res.status + ' - ' + res.url);
        if (count === 0) {
            renderTimeout = setTimeout(doRender, resourceWait);
        }
    }
};

page.open(system.args[1], function (status) {
    if (status !== "success") {
        //console.log('Unable to load url');
        phantom.exit();
    } else {
        forcedRenderTimeout = setTimeout(function () {
            //console.log(count);
            doRender();
        }, maxRenderWait);
    }
});
and for the Selenium option:
public ActionResult GetHTML(string url)
{
    using (IWebDriver driver = new PhantomJSDriver())
    {
        driver.Navigate().GoToUrl(url);
        WebDriverWait wait = new WebDriverWait(driver, TimeSpan.FromSeconds(30));
        IWebElement myDynamicElement = wait.Until<IWebElement>((d) =>
        {
            return d.FindElement(By.CssSelector("#compositionComplete"));
        });
        var content = driver.PageSource;
        driver.Quit();
        return Content(content);
    }
}
Thanks!!
You cannot execute .exe files in the shared website environment; you either have to use Cloud Services or set up a proper (Azure) virtual machine.
The free shared website service is really basic and won't cut it when you need more advanced functionality.
See this question and its accepted answer for more detail: Can we run windowservice or EXE in Azure website or in Virtual Machine?
I am not sure about the shared and basic website environments, but I have successfully run ffmpeg.exe from the standard website environment. Despite that, PhantomJS and even ChromeDriver itself still do not work.
However, I am able to run the Firefox driver successfully. In order to do that,
I copied the latest Firefox directory from my local machine to the website, and the code below worked well.
var binary = new FirefoxBinary("/websitefolder/blabla/firefox.exe");
var driver = new FirefoxDriver(binary, new FirefoxProfile());
driver.Navigate().GoToUrl("http://www.google.com");
I have a simple personal MVC4 web app hosted in Windows Azure.
This web app sees very minimal use, and the initial call is very slow, especially when I first try it in the morning.
I suspect that IIS is sleeping and needs to wake up. I found this article mentioning that this is a bug in IIS (http://social.msdn.microsoft.com/Forums/en-US/wcf/thread/8b3258e7-261c-49a0-888c-0b3e68b2af13) that requires a setting in IIS, but my web app is hosted in Azure. Is there any way to do this sort of setting in the Web.config file?
All succeeding calls are fast.
Here is my personal page: javierdelacruz.com
Thanks.
Two options:
Startup Tasks
OnStart Code
For startup tasks, see this link.
For OnStart code, try a function like this (this function does a few more things, too):
private const string _web_app_project_name = "Web";

public static void SetupDefaultEgConfiguration(int idleTimeoutInMinutes = 1440, int recycleTimeoutInMinutes = 1440, string appPoolName = "My Azure App Pool", bool enableCompression = true)
{
    if (!RoleEnvironment.IsEmulated)
    {
        Trace.TraceWarning("Changing IIS settings upon role's OnStart. Inputs: ({0}, {1}, {2}, {3})", idleTimeoutInMinutes, recycleTimeoutInMinutes, appPoolName, enableCompression);

        // Tweak IIS settings
        using (var iisManager = new ServerManager())
        {
            try
            {
                var roleSite = iisManager.Sites[RoleEnvironment.CurrentRoleInstance.Id + "_" + _web_app_project_name];

                if (enableCompression)
                {
                    //================ Enable or disable static/dynamic compression ==================//
                    var config = roleSite.GetWebConfiguration();
                    var urlCompressionSection = config.GetSection("system.webServer/urlCompression");
                    urlCompressionSection["doStaticCompression"] = true;
                    urlCompressionSection["doDynamicCompression"] = true;
                    Trace.TraceWarning("Changing IIS settings to enable static and dynamic compression");
                }

                //================ Change the ApplicationPool name ===============================//
                var app = roleSite.Applications.First();
                app.ApplicationPoolName = appPoolName;

                //================ Change the ApplicationPool recycle timeout ====================//
                var appPool = iisManager.ApplicationPools[app.ApplicationPoolName];
                appPool.Recycling.PeriodicRestart.Time = new TimeSpan(0, recycleTimeoutInMinutes, 0);

                //================ Idle timeout ==================================================//
                var defaultIdleTimeout = iisManager.ApplicationPoolDefaults.ProcessModel.IdleTimeout;
                var newIdleTimeout = new TimeSpan(0, idleTimeoutInMinutes, 0);
                if ((int)newIdleTimeout.TotalMinutes != (int)defaultIdleTimeout.TotalMinutes)
                {
                    appPool.ProcessModel.IdleTimeout = newIdleTimeout;
                }

                // Commit the changes made through the server manager
                iisManager.CommitChanges();
            }
            catch (Exception e)
            {
                // Trim the message so a very long exception doesn't overflow the trace
                Trace.TraceError("Failure when configuring IIS in Azure: " + new string(e.ToString().Take(63000).ToArray()));
            }
        }
    }
}
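To wire this up, the role entry point can call it during startup. A minimal sketch, assuming the class derives from RoleEntryPoint and the role runs elevated so ServerManager can commit changes:

public override bool OnStart()
{
    // Apply the IIS tweaks before the instance starts taking traffic
    SetupDefaultEgConfiguration();
    return base.OnStart();
}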
Source and some more details for the function I included here - there are some dependencies you'll likely need to accomplish this.