I need help using the configuration below, in an XML file, to change my web service URL:
<configuration>
<appSettings>
<add key="UserName" value="ova"/>
<add key="UserPassword" value="ova"/>
<add key="ServiceName" value="xe"/>
<add key="ServerName" value="localhost"/>
<add key="WebService" value="/FDC_Service/FDC_Service.asmx"/>
</appSettings>
</configuration>
Here is the code I need to change so that the server name and web service come from that configuration when calling the service in my application's source code, like this:
FDC_Service.FDC_ServiceClass asd = new FDC_Service.FDC_ServiceClass();
retval = asd.FDC_Command(database.UserName, database.UserPassword, database.ServiceName, str);
FDC_Service is my web service. I need help, thanks.
I hope this answers your question
In your .cs code, add:
using System.Configuration;
Then in your method, add:
var username = ConfigurationManager.AppSettings["UserName"];
var password = ConfigurationManager.AppSettings["UserPassword"];
var serviceName = ConfigurationManager.AppSettings["ServiceName"];
FDC_Service.FDC_ServiceClass asd = new FDC_Service.FDC_ServiceClass();
retval = asd.FDC_Command(username, password, serviceName, str);
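If you also need to change the web service URL itself (which is what the question asks for), you can additionally read the ServerName and WebService keys and point the generated proxy at them before the FDC_Command call. A minimal sketch, assuming FDC_ServiceClass is a classic ASMX proxy (derived from SoapHttpClientProtocol) and that the service is reached over plain HTTP:
var serverName = ConfigurationManager.AppSettings["ServerName"];
var webServicePath = ConfigurationManager.AppSettings["WebService"];
// The Url property is inherited from SoapHttpClientProtocol on ASMX-generated proxies.
// The "http://" scheme is an assumption here - adjust it (and any port) to match your deployment.
asd.Url = "http://" + serverName + webServicePath;
With that in place, changing ServerName or WebService in the config file changes which server the existing FDC_Command call talks to, without recompiling.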
This is working for me. Set the appropriate URI when instantiating the client:
ws = new wsPedidosWeb.OperacionesTiendaPortTypeClient(
new wsPedidosWeb.OperacionesTiendaPortTypeClient.EndpointConfiguration(),
new System.ServiceModel.EndpointAddress("http://whatever")
);
In case someone is looking for the full code, like I did:
// Requires: using System.Linq; using System.Web.Configuration;
//           using System.ServiceModel.Configuration;
private static string GetServiceAddressUrlByContractName(string contractFullName)
{
    var clientSection = WebConfigurationManager
        .GetSection("system.serviceModel/client") as ClientSection;
    var endpointList = clientSection?.Endpoints.Cast<ChannelEndpointElement>().ToList();
    return endpointList?.FirstOrDefault(e => e.Contract == contractFullName)?.Address.ToString();
}
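A hypothetical usage example (the contract name below is a placeholder - use the contract attribute from your own <client>/<endpoint> element in web.config):
// Returns null when no client endpoint with that contract is configured.
var address = GetServiceAddressUrlByContractName("MyNamespace.IMyServiceContract");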
I have a simple ASP.NET Core 3.1 app deployed on an Azure App Service, configured with a .NET Core 3.1 runtime. One of my endpoints is expected to receive a simple JSON payload with a single "data" property, which is a base64-encoded string of a file. It can be quite long; I'm running into the following issue when the JSON payload is around 1.6 MB.
On my local workstation, when I call my API from Postman, everything works as expected: my breakpoint in the controller's action method is reached, the data is populated, all good. The problem only appears once I deploy the app (via Azure DevOps CI/CD pipelines) to the Azure App Service. Whenever I try to call the deployed API from Postman, no HTTP response is received, only this: "Error: write EPIPE".
I've tried modifying the web.config to include both the maxRequestLength and maxAllowedContentLength properties:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<location path="." inheritInChildApplications="false">
<system.web>
<httpRuntime maxRequestLength="204800" ></httpRuntime>
</system.web>
<system.webServer>
<security>
<requestFiltering>
<requestLimits maxAllowedContentLength="419430400" />
</requestFiltering>
</security>
<handlers>
<add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModuleV2" resourceType="Unspecified" />
</handlers>
<aspNetCore processPath="dotnet" arguments=".\MyApp.API.dll" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout" hostingModel="inprocess" />
</system.webServer>
</location>
</configuration>
In the app's code, I've added this to Startup.cs:
services.Configure<IISServerOptions>(options => {
options.MaxRequestBodySize = int.MaxValue;
});
In the Program.cs, I've added:
.UseKestrel(options => { options.Limits.MaxRequestBodySize = int.MaxValue; })
In the controller, I've tried both of these attributes: [DisableRequestSizeLimit], [RequestSizeLimit(40000000)]
However, nothing has worked so far. I'm fairly sure it has to be something configured on the App Service itself, rather than in my code, since everything works locally; none of the web.config changes have helped either.
It turned out to be related to the fact that my App Service was configured (under Configuration) to allow incoming client certificates. It turns out client certificates and large payloads don't mix well in IIS (apparently for more than a decade now): https://learn.microsoft.com/en-us/archive/blogs/waws/posting-a-large-file-can-fail-if-you-enable-client-certificates
None of the workarounds proposed in that blog post fixed my issue, so I had to work around it myself: I created an Azure Function (still using .NET Core 3.1 as the runtime stack) on a Consumption Plan, which is able to receive both the large payload and the incoming client certificate (I assume it doesn't use IIS under the hood?).
On the original backend, I added the original API's route to the App Service's "Certificate exclusion paths", so requests don't get stuck waiting and eventually time out with "Error: write EPIPE".
I've used Managed Identity to authenticate between my App Service and the new Azure Function (through a System Assigned identity in the Function).
The Azure Function takes the received certificate, and adds it to a new "certificate" property in the JSON body, next to the original "data" property, so my custom SSL validation can stay on the App Service, but the certificate is not being taken from the X-ARR-ClientCert header, but from the received payload's "certificate" property.
The Function:
#r "Newtonsoft.Json"
using System.Net;
using System.IO;
using System.Net.Http;
using System.Text;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;
using System.Security.Cryptography.X509Certificates;
private static HttpClient httpClient = new HttpClient();
public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
var requestBody = string.Empty;
using (var streamReader = new StreamReader(req.Body))
{
requestBody = await streamReader.ReadToEndAsync();
}
dynamic deserializedPayload = JsonConvert.DeserializeObject(requestBody);
var data = deserializedPayload?.data;
var originalUrl = $"https://original-backend.azurewebsites.net/api/inbound";
var certificateString = string.Empty;
StringValues cert;
if (req.Headers.TryGetValue("X-ARR-ClientCert", out cert))
{
certificateString = cert;
}
var newPayload = new {
data = data,
certificate = certificateString
};
var response = await httpClient.PostAsync(
originalUrl,
new StringContent(JsonConvert.SerializeObject(newPayload), Encoding.UTF8, "application/json"));
var responseContent = await response.Content.ReadAsStringAsync();
try
{
response.EnsureSuccessStatusCode();
return new OkObjectResult(new { message = "Forwarded request to the original backend" });
}
catch (Exception e)
{
return new ObjectResult(new { response = responseContent, exception = JsonConvert.SerializeObject(e)})
{
StatusCode = 500
};
}
}
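On the App Service side, the certificate forwarded in the payload's "certificate" property can then be rehydrated for the custom SSL validation. A minimal sketch, assuming the property carries the same base64-encoded certificate bytes the X-ARR-ClientCert header would have contained:
using System;
using System.Security.Cryptography.X509Certificates;
// certificateBase64 is the "certificate" property forwarded by the Function.
public static X509Certificate2 ReadForwardedCertificate(string certificateBase64)
{
    if (string.IsNullOrEmpty(certificateBase64))
        return null;
    // X-ARR-ClientCert carries a base64-encoded certificate, so decoding it
    // and handing the bytes to X509Certificate2 reconstructs the certificate.
    return new X509Certificate2(Convert.FromBase64String(certificateBase64));
}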
I am new to ASP.NET Core; I'm used to Visual Studio 2017 and how publishing web apps works with that.
I rewrote a simple login endpoint in ASP.NET Core using VS Code (it was previously written in Visual Studio 2017 using MVC) and tried to publish it following articles I googled. I'm getting an error when I browse the app in IIS. Something else looks odd too: when I view the published files, they look incomplete. I'm new to creating Web API Core apps with VS Code, so maybe I'm wrong. I'll attach a screenshot of the published files.
The error I am getting is:
HTTP Error 502.3 - Bad Gateway
There was a connection error while trying to route the request.
I have installed the .NET Core SDK and the other required components on the IIS machine, and set up the App Pool with "No Managed Code" as well.
Web Config
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<location path="." inheritInChildApplications="false">
<system.webServer>
<handlers>
<add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModule" resourceType="Unspecified" />
</handlers>
<aspNetCore processPath="dotnet" arguments=".\LB_CONNECT_API_2.dll" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout" hostingModel="InProcess" />
</system.webServer>
</location>
</configuration>
Screenshot of published files
I tried editing web.config and removing the "V2" from the AspNetCore module name.
[Route("api/[controller]")]
[ApiController]
public class LoginController : ControllerBase
{
// POST: api/Login
public User Post(User user)
{
SqlDataReader reader = null;
SqlConnection myConnection = new SqlConnection
{
ConnectionString = @"Server=ServerName;Database=DB;User ID=User;Password=password;"
};
SqlCommand sqlCmd = new SqlCommand
{
CommandType = CommandType.StoredProcedure,
CommandText = "lb_Login",
Connection = myConnection
};
sqlCmd.Parameters.AddWithValue("@Email", user.Email);
sqlCmd.Parameters.AddWithValue("@Password", user.Password);
myConnection.Open();
reader = sqlCmd.ExecuteReader();
User _user = null;
while (reader.Read())
{
_user = new User
{
UserID = Guid.Parse(reader["UserID"].ToString()),
FirstName = reader["FirstName"].ToString(),
LastName = reader["LastName"].ToString(),
GroupName = reader["GroupName"].ToString(),
Email = reader["Email"].ToString(),
Cell = reader["Cell"].ToString()
};
}
return _user;
    }
}
I need it to return the user object in JSON
I am looking to use Auth0 as the authentication provider for ServiceStack. There is a great sample application documented at Auth0 which applies & works well when working with ServiceStack and using ServiceStack.Host.MVC: https://auth0.com/docs/quickstart/webapp/servicestack/01-login.
However, I am at a loss as to how to construct the authorization URL and redirect the user to it in a scenario where I am NOT using MVC and the AccountController. How can I construct the redirect URLs using the ServiceStack Auth plugin if I want to replicate the logic of the MVC sample code below?
public class AccountController : Controller
{
public ActionResult Login()
{
string clientId = WebConfigurationManager.AppSettings["oauth.auth0.AppId"];
string domain = WebConfigurationManager.AppSettings["oauth.auth0.OAuthServerUrl"].Substring(8);
var redirectUri = new UriBuilder(this.Request.Url.Scheme, this.Request.Url.Host, this.Request.Url.IsDefaultPort ? -1 : this.Request.Url.Port, "api/auth/auth0");
var client = new AuthenticationApiClient(new Uri($"https://{domain}"));
var authorizeUrlBuilder = client.BuildAuthorizationUrl()
.WithClient(clientId)
.WithRedirectUrl(redirectUri.ToString())
.WithResponseType(AuthorizationResponseType.Code)
.WithScope("openid profile")
.WithAudience($"https://{domain}/userinfo");
return Redirect(authorizeUrlBuilder.Build().ToString());
}
}
For all who are interested, here is the solution I ended up adopting.
Steps:
1) Create an Auth0 plugin (see gist here)
2) Register the Plugin in your AppHost.
Plugins.Add(new AuthFeature(() => new Auth0UserSession(), new IAuthProvider[] {
new Auth0Provider(appSettings, appSettings.GetString("oauth.auth0.OAuthServerUrl"))
}));
3) Add the relevant keys in your Web.Config.
<appSettings>
<add key="oauth.auth0.OAuthServerUrl" value="https://xxxxxxx.auth0.com" />
<add key="oauth.auth0.AppId" value="xxxxxx" />
<add key="oauth.auth0.AppSecret" value="xxxxxxx" />
</appSettings>
I am new to SharePoint and the Client Object Model. I am stuck with a problem and have not been able to fix it. I want to upload files larger than 10 MB using the Client Object Model in SharePoint 2013. I get the following exception:
The request message is too large. The server does not allow messages
that are larger than 2097152 bytes.
I have tried everything. Here is the list of things that I did:
1- Changed the settings in the web.config file of my local web application:
<system.web>
<httpRuntime useFullyQualifiedRedirectUrl="true" maxRequestLength="2147483647" requestLengthDiskThreshold="2147483647" executionTimeout="18000"/>
</system.web>
<system.webServer>
<modules runAllManagedModulesForAllRequests="true"/>
<security>
<requestFiltering>
<requestLimits maxAllowedContentLength="2147483647" />
</requestFiltering>
</security>
</system.webServer>
2- In PowerShell on my server, I ran the following commands and restarted the application in IIS. I even restarted IIS entirely.
$ws = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$ws.ClientRequestServiceSettings.MaxReceivedMessageSize = 2147483647
$ws.Update()
Here is my code:
private void UploadDataToSharepointTest(List<UploadData> pDataObjList)
{
string lServerUrl = @"http://xxxxxxx:2000/";
string lFolderName = DateTime.Now.ToString(@"yyyyMMddHHmmss");
ClientContext context = new ClientContext(lServerUrl);
context.AuthenticationMode = ClientAuthenticationMode.Default;
context.Credentials = new System.Net.NetworkCredential("user", "password", "domain");
Web web = context.Web;
List docs = web.Lists.GetByTitle("ABC");
Folder lNewFolder = web.Folders.Add(lServerUrl + "ABC/" + lFolderName + "/");
docs.Update();
int fileIndex = 1;
foreach (var item in pDataObjList)
{
FileCreationInformation newFile = new FileCreationInformation();
newFile.Content = System.IO.File.ReadAllBytes(item.CompleteFilePath);
newFile.Url = fileIndex.ToString() + "-" + item.fileName;
fileIndex++;
Microsoft.SharePoint.Client.File uploadFile = lNewFolder.Files.Add(newFile);
context.Load(uploadFile);
context.ExecuteQuery();
Dictionary<string, string> metadata = new Dictionary<string, string>();
metadata.Add("Comments", item.comments);
metadata.Add("Plan_x0020_Size", item.planSize);
metadata.Add("Density", item.density);
metadata.Add("First_x0020_Name", txtFirstName.Text.Trim());
metadata.Add("Last_x0020_Name", txtLastName.Text.Trim());
metadata.Add("Company", txtCompany.Text.Trim());
metadata.Add("Contact", txtContact.Text.Trim());
metadata.Add("Additional_x0020_Comments", txtAdditionalComments.Text.Trim());
Microsoft.SharePoint.Client.ListItem items = uploadFile.ListItemAllFields;
context.Load(items);
context.ExecuteQuery();
foreach (KeyValuePair<string, string> metadataitem in metadata)
{
items[metadataitem.Key.ToString()] = metadataitem.Value.ToString();
}
items.Update();
context.ExecuteQuery();
}
}
Note: I am able to upload small files.
There is a file size limit if you use the built-in upload function.
To upload a large file, upload it with a file stream instead.
Take a look at the article below:
http://blogs.msdn.com/b/sridhara/archive/2010/03/12/uploading-files-using-client-object-model-in-sharepoint-2010.aspx
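For illustration, here is a minimal stream-based sketch (the site URL, credentials, target path, and local file path are placeholders). SaveBinaryDirect streams the file instead of sending it inside a CSOM XML request, so the 2 MB MaxReceivedMessageSize limit should not apply:
using System.IO;
using Microsoft.SharePoint.Client;
using (var context = new ClientContext("http://xxxxxxx:2000/"))
{
    context.Credentials = new System.Net.NetworkCredential("user", "password", "domain");
    using (var fileStream = new FileStream(@"C:\temp\largefile.bin", FileMode.Open))
    {
        // Server-relative URL of the target document library folder plus the file name.
        Microsoft.SharePoint.Client.File.SaveBinaryDirect(
            context, "/ABC/largefile.bin", fileStream, true);
    }
}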
SharePoint allows you to configure this via Central Admin; I'd stick with that to make sure it makes all the appropriate changes for you. You need farm-level permissions. Also, in SharePoint 2013 you can set different maximum file sizes for different file types, so make sure your file type's limit wasn't changed by anyone. Different Max based on File Types
Accessing SharePoint Webapp properties via central Admin
My 'LocalClient' app is in a corporate LAN behind an HTTP proxy server (ISA). The first Azure API call I make - CloudQueue.CreateIfNotExist() - causes an exception: (407) Proxy Authentication Required. I tried the following things:
Added the <System.Net> defaultProxy element to app.config, but it doesn't seem to be working (Reference: http://geekswithblogs.net/mnf/archive/2006/03/08/71663.aspx).
I configured 'Microsoft Firewall Client for ISA Server', but that did not help either.
Used a custom proxy handler as suggested here: http://dunnry.com/blog/2010/01/25/SupportingBasicAuthProxies.aspx. I am not able to get this working - getting a Configuration initialization exception.
As per MSDN, an HTTP proxy server can be specified in the connection string only in case of Development Storage (see http://msdn.microsoft.com/en-us/library/ee758697.aspx):
UseDevelopmentStorage=true;DevelopmentStorageProxyUri=http://myProxyUri
Is there any way to connect to the Azure Storage thru a proxy server?
I actually found that the custom proxy solution was not required.
Adding the following to app.config (just before the </configuration>) did the trick for me:
<system.net>
<defaultProxy enabled="true" useDefaultCredentials="true">
<proxy usesystemdefault="true" />
</defaultProxy>
</system.net>
The custom proxy solution (the third thing I tried, as mentioned in my original question) worked perfectly. The mistake I was making earlier was not putting the <configSections> element at the beginning of <configuration> in app.config, as required. Once I did that, the custom proxy solution given here solved my problem.
To bypass the proxy, use the code below; it works as expected and has been tested.
import java.io.File;
import java.io.FileInputStream;
import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.blob.*;

public class AzureUpload {
// Define the connection-string with your values
/*public static final String storageConnectionString =
"DefaultEndpointsProtocol=http;" +
"AccountName=your_storage_account;" +
"AccountKey=your_storage_account_key";*/
public static final String storageConnectionString =
"DefaultEndpointsProtocol=http;" +
"AccountName=test2rdrhgf62;" +
"AccountKey=1gy3lpE7Du1j5ljKiupjhgjghjcbfgTGhbntjnRfr9Yi6GUQqVMQqGxd7/YThisv/OVVLfIOv9kQ==";
// Define the path to a local file.
static final String filePath = "D:\\Project\\Supporting Files\\Jar's\\Azure\\azure-storage-1.2.0.jar";
static final String file_Path = "D:\\Project\\Healthcare\\Azcopy_To_Azure\\data";
public static void main(String[] args) {
try
{
// Retrieve storage account from connection-string.
//String storageConnectionString = RoleEnvironment.getConfigurationSettings().get("StorageConnectionString");
//Proxy httpProxy = new Proxy(Proxy.Type.HTTP,new InetSocketAddress("132.186.192.234",8080));
System.setProperty("http.proxyHost", "102.122.15.234");
System.setProperty("http.proxyPort", "80");
System.setProperty("https.proxyUser", "ad001\\empid001");
System.setProperty("https.proxyPassword", "pass!1");
// Retrieve storage account from connection-string.
CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
// Create the blob client.
CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
// Get a reference to a container.
// The container name must be lower case
CloudBlobContainer container = blobClient.getContainerReference("rpmsdatafromhospital");
// Create the container if it does not exist.
container.createIfNotExists();
// Create a permissions object.
BlobContainerPermissions containerPermissions = new BlobContainerPermissions();
// Include public access in the permissions object.
containerPermissions.setPublicAccess(BlobContainerPublicAccessType.CONTAINER);
// Set the permissions on the container.
container.uploadPermissions(containerPermissions);
// Create or overwrite the new file to blob with contents from a local file.
/*CloudBlockBlob blob = container.getBlockBlobReference("azure-storage-1.2.0.jar");
File source = new File(filePath);
blob.upload(new FileInputStream(source), source.length());*/
String envFilePath = System.getenv("AZURE_FILE_PATH");
//upload list of files/directory to blob storage
File folder = new File(envFilePath);
File[] listOfFiles = folder.listFiles();
for (int i = 0; i < listOfFiles.length; i++) {
if (listOfFiles[i].isFile()) {
System.out.println("File " + listOfFiles[i].getName());
CloudBlockBlob blob = container.getBlockBlobReference(listOfFiles[i].getName());
File source = new File(envFilePath+"\\"+listOfFiles[i].getName());
blob.upload(new FileInputStream(source), source.length());
System.out.println("File " + listOfFiles[i].getName()+ " upload successful");
}
//directory upload
/*else if (listOfFiles[i].isDirectory()) {
System.out.println("Directory " + listOfFiles[i].getName());
CloudBlockBlob blob = container.getBlockBlobReference(listOfFiles[i].getName());
File source = new File(file_Path+"\\"+listOfFiles[i].getName());
blob.upload(new FileInputStream(source), source.length());
}*/
}
}catch (Exception e)
{
// Output the stack trace.
e.printStackTrace();
}
}
}
For .NET or C#, add the code below to App.config:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
<startup>
<supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5.2" />
</startup>
<system.net>
<defaultProxy enabled="true" useDefaultCredentials="true">
<proxy usesystemdefault="true" />
</defaultProxy>
</system.net>
</configuration>
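If you would rather set the proxy in code than in App.config, here is a minimal C# sketch (the proxy address and connection string are placeholders, and it assumes the classic Microsoft.WindowsAzure.Storage client library, where the method is CreateIfNotExists rather than CreateIfNotExist):
using System.Net;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

class Program
{
    static void Main()
    {
        // Route outgoing HTTP(S) traffic through the corporate proxy and
        // authenticate with the current Windows credentials.
        WebRequest.DefaultWebProxy = new WebProxy("http://your-proxy:8080")
        {
            UseDefaultCredentials = true
        };

        // Placeholder connection string - replace with your storage account.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=your_account;AccountKey=your_key");

        var queueClient = account.CreateCloudQueueClient();
        var queue = queueClient.GetQueueReference("myqueue");
        queue.CreateIfNotExists();
    }
}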