Sending an array of primitive types over a ClientRpc call

I am trying to send an array of various basic types over the network using the ClientRpc attribute. The documentation states that I can send these over the network without problems:
basic types (byte, int, float, string, UInt64, etc)
arrays of basic types
However, holding them mixed together in an object[] array does not seem to work. I have the following example:
[Command]
void CmdForwardEvent(string eventName, object[] args) {
    Debug.Log("Broadcasting event: " + eventName);
    foreach (var o in args) {
        Debug.Log("arg-class: " + o.GetType() + ": " + o);
    }
    RpcForwardEvent(eventName, args);
}

[ClientRpc]
void RpcForwardEvent(string eventName, object[] args) {
    Debug.Log("Received event " + eventName);
    foreach (var o in args) {
        Debug.Log("arg-class: " + o.GetType() + ": " + o);
    }
}

void Update() {
    if (Input.GetKeyDown(KeyCode.P)) {
        CmdForwardEvent("Testevent", new object[]{"some string", 1, false});
    }
}
On the server, I get the output
Broadcasting event: Testevent
arg-class: System.String: some string
arg-class: System.Int32: 1
arg-class: System.Boolean: false
On the client, this arrives without any errors:
Received event: Testevent
arg-class: System.Object: System.Object
arg-class: System.Object: System.Object
arg-class: System.Object: System.Object
How can I send a variable number of arguments of different basic types over a ClientRpc call?

A possible workaround is to serialize the array manually and send it as a byte[]:
// new code (requires: using System.IO; and using System.Runtime.Serialization.Formatters.Binary;)
private BinaryFormatter bf = new BinaryFormatter();

private byte[] objectToBytes(object os) {
    MemoryStream stream = new MemoryStream();
    bf.Serialize(stream, os);
    return stream.ToArray();
}

private object bytesToObject(byte[] bytes) {
    MemoryStream stream = new MemoryStream(bytes);
    return bf.Deserialize(stream);
}
// modified code:
[Command]
void CmdForwardEvent(string eventName, byte[] argsAsBytes) {
    object[] args = bytesToObject(argsAsBytes) as object[]; // deserialize
    Debug.Log("Broadcasting event: " + eventName);
    foreach (var o in args) {
        Debug.Log("arg-class: " + o.GetType() + ": " + o);
    }
    RpcForwardEvent(eventName, argsAsBytes);
}

[ClientRpc]
void RpcForwardEvent(string eventName, byte[] argsAsBytes) {
    object[] args = bytesToObject(argsAsBytes) as object[]; // deserialize
    Debug.Log("Received event " + eventName);
    foreach (var o in args) {
        Debug.Log("arg-class: " + o.GetType() + ": " + o);
    }
}

void Update() {
    if (Input.GetKeyDown(KeyCode.P)) {
        // serialize
        byte[] bytes = objectToBytes(new object[]{"some string", 1, false});
        CmdForwardEvent("Testevent", bytes);
    }
}
The output on the client and the server is then identical, as expected.
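A note on this workaround: BinaryFormatter is heavyweight on the wire, and Microsoft advises against using it on untrusted input, which networked game messages are. Below is a minimal sketch of a hand-rolled, type-tagged codec for the same handful of basic types; the class name ArgCodec and the chosen set of supported types are my own choices for illustration, not part of UNet:

using System;
using System.IO;

// Sketch: encode each element as a one-byte type tag followed by its payload,
// so the receiver can rebuild the values with their original runtime types.
static class ArgCodec {
    public static byte[] Pack(object[] args) {
        MemoryStream ms = new MemoryStream();
        BinaryWriter w = new BinaryWriter(ms);
        w.Write(args.Length);
        foreach (object o in args) {
            if (o is string)     { w.Write((byte)0); w.Write((string)o); }
            else if (o is int)   { w.Write((byte)1); w.Write((int)o); }
            else if (o is bool)  { w.Write((byte)2); w.Write((bool)o); }
            else if (o is float) { w.Write((byte)3); w.Write((float)o); }
            else throw new NotSupportedException(o.GetType().Name);
        }
        return ms.ToArray();
    }

    public static object[] Unpack(byte[] bytes) {
        BinaryReader r = new BinaryReader(new MemoryStream(bytes));
        object[] args = new object[r.ReadInt32()];
        for (int i = 0; i < args.Length; i++) {
            byte tag = r.ReadByte();
            if (tag == 0)      args[i] = r.ReadString();
            else if (tag == 1) args[i] = r.ReadInt32();
            else if (tag == 2) args[i] = r.ReadBoolean();
            else if (tag == 3) args[i] = r.ReadSingle();
            else throw new InvalidDataException("unknown type tag " + tag);
        }
        return args;
    }
}

With this in place, objectToBytes/bytesToObject can be swapped for ArgCodec.Pack/ArgCodec.Unpack without touching the Command/Rpc signatures.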

Related

Azure Cognitive Search, how to iterate over facets to bypass 100K limit

I understand the limitation of the API with the 100K limit, and Microsoft's site says as a workaround "you can work around this limitation by adding code to iterate over, and filter on, a facet with less than 100K documents per facet value."
I'm using the "Back up and restore an Azure Cognitive Search index" sample solution provided by Microsoft (https://github.com/Azure-Samples/azure-search-dotnet-samples).
But can someone tell me where or how to implement this "iterate loop" on a facet? The facetable field I'm trying to use is "tributekey", but I don't know where to place the code below. Any help would be greatly appreciated.
// This is a prototype tool that allows for extraction of data from a search index
// Since this tool is still under development, it should not be used for production usage
using Azure;
using Azure.Search.Documents;
using Azure.Search.Documents.Indexes;
using Azure.Search.Documents.Models;
using Microsoft.Extensions.Configuration;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;

namespace AzureSearchBackupRestore
{
    class Program
    {
        private static string SourceSearchServiceName;
        private static string SourceAdminKey;
        private static string SourceIndexName;
        private static string TargetSearchServiceName;
        private static string TargetAdminKey;
        private static string TargetIndexName;
        private static string BackupDirectory;

        private static SearchIndexClient SourceIndexClient;
        private static SearchClient SourceSearchClient;
        private static SearchIndexClient TargetIndexClient;
        private static SearchClient TargetSearchClient;

        private static int MaxBatchSize = 500;     // JSON files will contain this many documents / file and can be up to 1000
        private static int ParallelizedJobs = 10;  // Output content in parallel jobs

        static void Main(string[] args)
        {
            // Get source and target search service info and index names from appsettings.json file
            // Set up source and target search service clients
            ConfigurationSetup();

            // Backup the source index
            Console.WriteLine("\nSTART INDEX BACKUP");
            BackupIndexAndDocuments();

            // Recreate and import content to target index
            //Console.WriteLine("\nSTART INDEX RESTORE");
            //DeleteIndex();
            //CreateTargetIndex();
            //ImportFromJSON();
            //Console.WriteLine("\r\n  Waiting 10 seconds for target to index content...");
            //Console.WriteLine("  NOTE: For really large indexes it may take longer to index all content.\r\n");
            //Thread.Sleep(10000);

            //// Validate all content is in target index
            //int sourceCount = GetCurrentDocCount(SourceSearchClient);
            //int targetCount = GetCurrentDocCount(TargetSearchClient);
            //Console.WriteLine("\nSAFEGUARD CHECK: Source and target index counts should match");
            //Console.WriteLine("  Source index contains {0} docs", sourceCount);
            //Console.WriteLine("  Target index contains {0} docs\r\n", targetCount);

            //Console.WriteLine("Press any key to continue...");
            //Console.ReadLine();
        }
        static void ConfigurationSetup()
        {
            IConfigurationBuilder builder = new ConfigurationBuilder().AddJsonFile("appsettings.json");
            IConfigurationRoot configuration = builder.Build();

            SourceSearchServiceName = configuration["SourceSearchServiceName"];
            SourceAdminKey = configuration["SourceAdminKey"];
            SourceIndexName = configuration["SourceIndexName"];
            TargetSearchServiceName = configuration["TargetSearchServiceName"];
            TargetAdminKey = configuration["TargetAdminKey"];
            TargetIndexName = configuration["TargetIndexName"];
            BackupDirectory = configuration["BackupDirectory"];

            Console.WriteLine("CONFIGURATION:");
            Console.WriteLine("\n  Source service and index: {0}, {1}", SourceSearchServiceName, SourceIndexName);
            Console.WriteLine("\n  Target service and index: {0}, {1}", TargetSearchServiceName, TargetIndexName);
            Console.WriteLine("\n  Backup directory: " + BackupDirectory);

            SourceIndexClient = new SearchIndexClient(new Uri("https://" + SourceSearchServiceName + ".search.windows.net"), new AzureKeyCredential(SourceAdminKey));
            SourceSearchClient = SourceIndexClient.GetSearchClient(SourceIndexName);
            // TargetIndexClient = new SearchIndexClient(new Uri("https://" + TargetSearchServiceName + ".search.windows.net"), new AzureKeyCredential(TargetAdminKey));
            // TargetSearchClient = TargetIndexClient.GetSearchClient(TargetIndexName);
        }
        static void BackupIndexAndDocuments()
        {
            // Backup the index schema to the specified backup directory
            Console.WriteLine("\n  Backing up source index schema to {0}\r\n", BackupDirectory + "\\" + SourceIndexName + ".schema");
            File.WriteAllText(BackupDirectory + "\\" + SourceIndexName + ".schema", GetIndexSchema());

            // Extract the content to JSON files
            int SourceDocCount = GetCurrentDocCount(SourceSearchClient);
            WriteIndexDocuments(SourceDocCount);  // Output content from index to json files
        }

        static void WriteIndexDocuments(int CurrentDocCount)
        {
            // Write document files in batches (per MaxBatchSize) in parallel
            string IDFieldName = GetIDFieldName();
            int FileCounter = 0;
            for (int batch = 0; batch <= (CurrentDocCount / MaxBatchSize); batch += ParallelizedJobs)
            {
                List<Task> tasks = new List<Task>();
                for (int job = 0; job < ParallelizedJobs; job++)
                {
                    FileCounter++;
                    int fileCounter = FileCounter;  // capture a copy so the closure below sees a stable value
                    if ((fileCounter - 1) * MaxBatchSize < CurrentDocCount)
                    {
                        Console.WriteLine("  Backing up source documents to {0} - (batch size = {1})", BackupDirectory + "\\" + SourceIndexName + fileCounter + ".json", MaxBatchSize);
                        tasks.Add(Task.Factory.StartNew(() =>
                            ExportToJSON((fileCounter - 1) * MaxBatchSize, IDFieldName, BackupDirectory + "\\" + SourceIndexName + fileCounter + ".json")
                        ));
                    }
                }
                Task.WaitAll(tasks.ToArray());  // Wait for all export tasks in this group to complete
            }
        }
        static void ExportToJSON(int Skip, string IDFieldName, string FileName)
        {
            // Extract all the documents from the selected index to JSON files in batches of 500 docs / file
            string json = string.Empty;
            try
            {
                SearchOptions options = new SearchOptions()
                {
                    SearchMode = SearchMode.All,
                    Size = MaxBatchSize,
                    Skip = Skip
                    // ,IncludeTotalCount = true
                    // ,Filter = Azure.Search.Documents.SearchFilter.Create('%24top=2&%24skip=0&%24orderby=tributeId%20asc')
                    // ,Filter = String.Format("&search=*&%24top=2&%24skip=0&%24orderby=tributeId%20asc")
                    // ,Filter = "%24top=2&%24skip=0&%24orderby=tributeId%20asc"
                    // ,Filter = "tributeKey eq '5'"
                };

                SearchResults<SearchDocument> response = SourceSearchClient.Search<SearchDocument>("*", options);
                foreach (var doc in response.GetResults())
                {
                    json += JsonSerializer.Serialize(doc.Document) + ",";
                    // Rewrite geo-point fields into valid GeoJSON
                    json = json.Replace("\"Latitude\":", "\"type\": \"Point\", \"coordinates\": [");
                    json = json.Replace("\"Longitude\":", "");
                    json = json.Replace(",\"IsEmpty\":false,\"Z\":null,\"M\":null,\"CoordinateSystem\":{\"EpsgId\":4326,\"Id\":\"4326\",\"Name\":\"WGS84\"}", "]");
                    json += "\r\n";
                }

                // Output the formatted content to a file
                json = json.Substring(0, json.Length - 3);  // remove the trailing ",\r\n"
                File.WriteAllText(FileName, "{\"value\": [");
                File.AppendAllText(FileName, json);
                File.AppendAllText(FileName, "]}");
                Console.WriteLine("  Total documents: {0}", response.GetResults().Count().ToString());
            }
            catch (Exception ex)
            {
                Console.WriteLine("Error: {0}", ex.Message);
            }
        }
        static string GetIDFieldName()
        {
            // Find the id field of this index
            string IDFieldName = string.Empty;
            try
            {
                var schema = SourceIndexClient.GetIndex(SourceIndexName);
                foreach (var field in schema.Value.Fields)
                {
                    if (field.IsKey == true)
                    {
                        IDFieldName = Convert.ToString(field.Name);
                        break;
                    }
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine("Error: {0}", ex.Message);
            }
            return IDFieldName;
        }

        static string GetIndexSchema()
        {
            // Extract the schema for this index
            // We use REST here because we can take the response as-is
            Uri ServiceUri = new Uri("https://" + SourceSearchServiceName + ".search.windows.net");
            HttpClient HttpClient = new HttpClient();
            HttpClient.DefaultRequestHeaders.Add("api-key", SourceAdminKey);

            string Schema = string.Empty;
            try
            {
                Uri uri = new Uri(ServiceUri, "/indexes/" + SourceIndexName);
                HttpResponseMessage response = AzureSearchHelper.SendSearchRequest(HttpClient, HttpMethod.Get, uri);
                AzureSearchHelper.EnsureSuccessfulSearchResponse(response);
                Schema = response.Content.ReadAsStringAsync().Result;
            }
            catch (Exception ex)
            {
                Console.WriteLine("Error: {0}", ex.Message);
            }
            return Schema;
        }
        private static bool DeleteIndex()
        {
            Console.WriteLine("\n  Delete target index {0} in {1} search service, if it exists", TargetIndexName, TargetSearchServiceName);
            // Delete the index if it exists
            try
            {
                TargetIndexClient.DeleteIndex(TargetIndexName);
            }
            catch (Exception ex)
            {
                Console.WriteLine("  Error deleting index: {0}\r\n", ex.Message);
                Console.WriteLine("  Did you remember to set your SearchServiceName and SearchServiceApiKey?\r\n");
                return false;
            }
            return true;
        }

        static void CreateTargetIndex()
        {
            Console.WriteLine("\n  Create target index {0} in {1} search service", TargetIndexName, TargetSearchServiceName);
            // Use the schema file to create a copy of this index
            // I like using REST here since I can just take the response as-is
            string json = File.ReadAllText(BackupDirectory + "\\" + SourceIndexName + ".schema");

            // Do some cleaning of this file to change index name, etc
            json = "{" + json.Substring(json.IndexOf("\"name\""));
            int indexOfIndexName = json.IndexOf("\"", json.IndexOf("name\"") + 5) + 1;
            int indexOfEndOfIndexName = json.IndexOf("\"", indexOfIndexName);
            json = json.Substring(0, indexOfIndexName) + TargetIndexName + json.Substring(indexOfEndOfIndexName);

            Uri ServiceUri = new Uri("https://" + TargetSearchServiceName + ".search.windows.net");
            HttpClient HttpClient = new HttpClient();
            HttpClient.DefaultRequestHeaders.Add("api-key", TargetAdminKey);

            try
            {
                Uri uri = new Uri(ServiceUri, "/indexes");
                HttpResponseMessage response = AzureSearchHelper.SendSearchRequest(HttpClient, HttpMethod.Post, uri, json);
                response.EnsureSuccessStatusCode();
            }
            catch (Exception ex)
            {
                Console.WriteLine("  Error: {0}", ex.Message);
            }
        }
        static int GetCurrentDocCount(SearchClient searchClient)
        {
            // Get the current doc count of the specified index
            try
            {
                SearchOptions options = new SearchOptions()
                {
                    SearchMode = SearchMode.All,
                    IncludeTotalCount = true
                };

                SearchResults<Dictionary<string, object>> response = searchClient.Search<Dictionary<string, object>>("*", options);
                return Convert.ToInt32(response.TotalCount);
            }
            catch (Exception ex)
            {
                Console.WriteLine("  Error: {0}", ex.Message);
            }
            return -1;
        }

        static void ImportFromJSON()
        {
            Console.WriteLine("\n  Upload index documents from saved JSON files");
            // Take JSON file and import this as-is to target index
            Uri ServiceUri = new Uri("https://" + TargetSearchServiceName + ".search.windows.net");
            HttpClient HttpClient = new HttpClient();
            HttpClient.DefaultRequestHeaders.Add("api-key", TargetAdminKey);

            try
            {
                foreach (string fileName in Directory.GetFiles(BackupDirectory, SourceIndexName + "*.json"))
                {
                    Console.WriteLine("  -Uploading documents from file {0}", fileName);
                    string json = File.ReadAllText(fileName);
                    Uri uri = new Uri(ServiceUri, "/indexes/" + TargetIndexName + "/docs/index");
                    HttpResponseMessage response = AzureSearchHelper.SendSearchRequest(HttpClient, HttpMethod.Post, uri, json);
                    response.EnsureSuccessStatusCode();
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine("  Error: {0}", ex.Message);
            }
        }
    }
}
I tried adding a filter option in the ExportToJSON method, but the request fails.
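One possible shape for Microsoft's suggested workaround, sketched under some assumptions: the field tributekey is facetable and has fewer than 1000 distinct values, and each value covers fewer than 100K documents. The idea is to fetch the distinct facet values first, then run one filtered, paged export per value, so Skip never has to exceed 100K within any slice. The method name BackupByFacet is my own; it would sit in the same Program class and be called from Main in place of the plain WriteIndexDocuments pass:

        static void BackupByFacet()
        {
            // 1) Ask only for the facets (Size = 0 returns no documents)
            SearchOptions facetOptions = new SearchOptions() { Size = 0 };
            facetOptions.Facets.Add("tributekey,count:1000");  // assumption: < 1000 distinct values

            SearchResults<SearchDocument> facetResponse =
                SourceSearchClient.Search<SearchDocument>("*", facetOptions);

            int fileCounter = 0;
            foreach (FacetResult facet in facetResponse.Facets["tributekey"])
            {
                string value = facet.AsValueFacetResult<string>().Value;
                long docsInFacet = facet.Count ?? 0;  // assumption: each value covers < 100K docs

                // 2) Page through this facet slice; Skip stays below the 100K cap per slice
                for (int skip = 0; skip < docsInFacet; skip += MaxBatchSize)
                {
                    fileCounter++;
                    SearchOptions options = new SearchOptions()
                    {
                        SearchMode = SearchMode.All,
                        Size = MaxBatchSize,
                        Skip = skip,
                        Filter = SearchFilter.Create($"tributekey eq {value}")
                    };

                    SearchResults<SearchDocument> page = SourceSearchClient.Search<SearchDocument>("*", options);
                    string json = string.Empty;
                    foreach (var doc in page.GetResults())
                    {
                        json += JsonSerializer.Serialize(doc.Document) + ",\r\n";
                    }
                    File.WriteAllText(
                        BackupDirectory + "\\" + SourceIndexName + fileCounter + ".json",
                        "{\"value\": [" + json.TrimEnd(',', '\r', '\n') + "]}");
                }
            }
        }

If some facet value still exceeds 100K documents, Microsoft's note implies applying the same trick again within that value, filtering on a second facetable field.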

Vala Image Base64 in Web

I would like to know whether, in Vala (Soup.Server), I can serve an image that I have as a base64-encoded string.
private static void default_handler (Soup.Server server, Soup.Message msg, string path, GLib.HashTable? query, Soup.ClientContext client) {
    var imgStr = (string) Base64.decode ("iVBORw0....");
    msg.set_response ("image/jpeg", Soup.MemoryUse.COPY, "%s".printf (imgStr).data);
}
Solution:
void handle_static_file (Soup.Server server, Soup.Message message,
        string path, HashTable? query, Soup.ClientContext context) {
    server.pause_message (message);
    handle_static_file_async.begin (server, message, path, query, context);
}

async void handle_static_file_async (Soup.Server server,
        Soup.Message message, string path, HashTable? query,
        Soup.ClientContext context) {
    if (path == "/" || path == "") {
        path = "index.html";
    }
    var file = File.new_for_path ("static/" + path);
    try {
        var info = yield file.query_info_async ("*", FileQueryInfoFlags.NONE);
        var io = yield file.read_async ();
        Bytes data;
        while ((data = yield io.read_bytes_async ((size_t) info.get_size ())).length > 0) {
            message.response_body.append (Soup.MemoryUse.COPY, data.get_data ());
        }
        string content_type = info.get_content_type ();
        message.set_status (Soup.Status.OK);
        message.response_headers.set_content_type (content_type, null);
    } catch (IOError.NOT_FOUND e) {
        message.set_status (404);
        message.set_response ("text/plain", Soup.MemoryUse.COPY,
            ("File " + file.get_path () + " does not exist.").data);
    } catch (Error e) {
        if (debug) {
            stderr.printf ("Failed to read file %s: %s\n", file.get_path (), e.message);
        }
        message.set_status (500);
        message.set_response ("text/plain", Soup.MemoryUse.COPY, e.message.data);
    } finally {
        server.unpause_message (message);
    }
}

Can't get test to run with NUnit on Monodevelop in Linux

I'm trying to write some tests for a server I made, but I can't get them to run.
This is the code:
[SetUp]
public override void setUp()
{
    base.setUp();
    tcpClient = new TcpClient("132.72.214.127", 6666);
}

[Test]
public void TestSuperUserConnection()
{
    Console.WriteLine("Starting SuperUser Test");
    string ans = sendAndReceive("" + 0, "admin,1234");
    Assert.IsTrue(ans.Contains("{"));
}

public string sendAndReceive(string type, string args)
{
    try
    {
        string message = buildJson(type, args);
        Console.WriteLine("Building Json: {0}", message);
        Byte[] data = System.Text.Encoding.ASCII.GetBytes(message);
        NetworkStream stream = tcpClient.GetStream();
        Console.WriteLine("Sending data");
        stream.Write(data, 0, data.Length);

        data = new Byte[1024];
        StringBuilder ans = new StringBuilder();
        int bytes;
        string responseData;
        Console.WriteLine("Receiving data");
        while ((bytes = stream.Read(data, 0, data.Length)) > 0)
        {
            responseData = System.Text.Encoding.ASCII.GetString(data, 0, bytes);
            ans.Append(responseData);
        }
        ServerResponse sr = JsonConvert.DeserializeObject<ServerResponse>(ans.ToString());
        Console.WriteLine("Received: {0}", ans.ToString());
        return ans.ToString();
    }
    catch (Exception e)
    {
        Console.WriteLine("Error received: {0}", e);
    }
    return "";
}
I tried running it in debug mode and in release mode, and also under sudo. The server always receives my request, but I never see anything in my output and the test never ends.
Could there be a problem with Console.WriteLine?
Any thoughts?
I'm using MonoDevelop 4.2.2 with NUnit 2.6.0.0.
The output is just "Running test ServerTest...." and nothing more.
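An observation, offered as an assumption since the server code isn't shown: stream.Read blocks until data arrives and only returns 0 after the server closes the connection. If the server keeps the socket open after responding, the while loop never exits, the test never finishes, and NUnit never flushes the test's console output. A small sketch of how to make the hang visible with a receive timeout (the 5000 ms value is arbitrary):

// In setUp: make blocking reads give up instead of waiting forever.
tcpClient = new TcpClient("132.72.214.127", 6666);
tcpClient.ReceiveTimeout = 5000;  // Read() throws an IOException after 5 s of silence

// In sendAndReceive: stop reading once a response has arrived,
// rather than waiting for the server to close the connection.
while ((bytes = stream.Read(data, 0, data.Length)) > 0)
{
    ans.Append(System.Text.Encoding.ASCII.GetString(data, 0, bytes));
    if (!stream.DataAvailable)
        break;  // assumption: the server sends one complete response per request
}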

Use of dbms_lock inconsistent using jdbc

Why does the protected Java code execute in more than one thread at a time?
Thread-1 - protected code iteration 0
Thread-0 - protected code iteration 0
Thread-1 - protected code iteration 1
Thread-0 - protected code iteration 1
Thread-0 - protected code iteration 2
Thread-1 - protected code iteration 2
Thread-1 - protected code iteration 3
Thread-0 - protected code iteration 3
If you run it enough times, it occasionally executes as expected.
public class DbmsLockTest implements Runnable {

    Connection con;
    String key;
    int timeout;

    public DbmsLockTest(Connection con, String key, int timeout) {
        this.con = con;
        this.key = key;
        this.timeout = timeout;
    }

    public static void log(String str) {
        System.out.println(new Date() + " - "
                + Thread.currentThread().getName() + " - " + str);
    }

    @Override
    public void run() {
        lockKey(con, key, timeout);
    }

    public static void lockKey(Connection con, String key, int timeout) {
        log("start lockKey " + " key: " + key);
        CallableStatement cStmt = null;
        int rc = 0;
        try {
            StringBuilder sql = new StringBuilder(500);
            sql.append("DECLARE");
            sql.append("  v_lockhandle VARCHAR2(200);");
            sql.append("BEGIN");
            sql.append("  dbms_lock.allocate_unique(lockname => ?, lockhandle => v_lockhandle);");
            sql.append("  ? := dbms_lock.request(lockhandle => v_lockhandle, lockmode => 6,");
            sql.append("       timeout => ?, release_on_commit => true);");
            sql.append("  ? := v_lockhandle;");
            sql.append("END;");
            String lockKey = "LockKey-" + key;
            cStmt = con.prepareCall(sql.toString());
            cStmt.setString(1, lockKey);
            cStmt.registerOutParameter(2, Types.NUMERIC);
            cStmt.setInt(3, timeout);
            cStmt.registerOutParameter(4, Types.VARCHAR);
            log("executeUpdate start: " + lockKey + "] ");
            cStmt.executeUpdate();
            log("executeUpdate end: " + lockKey + "] ");
            rc = cStmt.getInt(2);
            log("return value from request=[" + rc + "] ");
            if (rc != 0) {
                System.out.println("6001 lock obtained: "
                        + Thread.currentThread().getName());
                throw new RuntimeException("lock acquisition failed with code=" + rc);
            }
            log("v_lockhandle=[" + cStmt.getString(4) + "] ");
            for (int i = 0; i < 4; i++) {
                try {
                    Thread.sleep(1000 * 1);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
                log("***** protected code iteration ***** " + i);
            }
            con.commit();
        } catch (SQLException e) {
            log("int timeout: " + Thread.currentThread().getName());
            throw new RuntimeException("SQLException locking balance for user "
                    + key, e);
        } finally {
            try {
                if (cStmt != null) {
                    cStmt.close();
                }
                if (con != null) {
                    con.close();
                }
            } catch (SQLException e) {
                e.printStackTrace();
            }
        }
        log("Exiting=[" + rc + "] ");
    }

    public static void main(String[] args) throws Exception {
        new oracle.jdbc.OracleDriver();
        List<Thread> list = new ArrayList<Thread>();
        for (int i = 0; i < 2; i++) {
            Connection connection = DriverManager
                    .getConnection("jdbc:oracle:thin:@localhost:1521:xe",
                            "<username>", "password");
            Thread t = new Thread(new DbmsLockTest(connection, "mykey", 10));
            list.add(t);
        }
        for (Thread t : list) {
            t.start();
        }
    }
}
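A hedged guess at the cause, since no answer is included here: JDBC connections default to autoCommit=true, and the lock is requested with release_on_commit => true. With auto-commit on, the commit that fires as soon as the call completes releases the lock immediately, so both threads can enter the protected section at once. If that is the issue, calling con.setAutoCommit(false) on each connection before running the PL/SQL block should make the threads serialize as expected.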

Strange behavior of threads

I'm writing an application that communicates with some hardware using the MODBUS protocol.
I'm using this sample from Code Project.
While trying to optimize the code (mainly the PollFunction method), I've run into a very strange threading problem.
Instead of sending each string line to the DoGUIUpdate delegate individually, I construct a string array and send it as a whole.
Doing so causes the application to crash with a System.Reflection.TargetParameterCountException: Parameter count mismatch error.
The original code:
public delegate void GUIUpdate(string paramString);

public void DoGUIUpdate(string paramString)
{
    if (InvokeRequired)
        BeginInvoke(new GUIUpdate(DoGUIUpdate), paramString);
    else
        lstRegisterValues.Items.Add(paramString);
}

private void PollFunction()
{
    ...
    string itemString;
    for (int i = 0; i < pollLength; i++)
    {
        itemString = "[" + Convert.ToString(pollStart + i + 40001) + "] , MB[" + Convert.ToString(pollStart + i) + "] = " + values[i];
        DoGUIUpdate(itemString);
    }
}
My code:
public delegate void GUIUpdate2(string[] paramString);

public void DoGUIUpdate2(string[] paramString)
{
    if (InvokeRequired)
        BeginInvoke(new GUIUpdate2(DoGUIUpdate2), paramString);
    else
    {
        lstRegisterValues.Items.Clear();
        lstRegisterValues.Items.AddRange(paramString);
    }
}

string[] valuesStrings;

private void PollFunction()
{
    ...
    valuesStrings = new string[pollLength];
    for (int i = 0; i < pollLength; i++)
    {
        valuesStrings[i] = "[" + Convert.ToString(pollStart + i + 40001) + "] , MB[" + Convert.ToString(pollStart + i) + "] = " + values[i];
    }
    DoGUIUpdate2(valuesStrings);
}
Any advice will be welcome.
I think BeginInvoke(new GUIUpdate2(DoGUIUpdate2), paramString); is the problem.
The second parameter of BeginInvoke is params object[], and a string[] converts to object[] by array covariance, so your array is expanded into individual arguments. That results in the call DoGUIUpdate2(string1, string2, string3), which is not what you want.
Try wrapping it so the array is passed as a single argument:
BeginInvoke(new GUIUpdate2(DoGUIUpdate2), new[] { paramString });
Here new[] { paramString } has type string[][], so BeginInvoke sees a one-element argument list whose single element is your string[].
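A minimal standalone illustration of the same pitfall, outside WinForms; Delegate.DynamicInvoke has the same params object[] shape as Control.BeginInvoke:

using System;

class ParamsArrayPitfall
{
    delegate void Handler(string[] values);

    static void Main()
    {
        Handler h = v => Console.WriteLine("got {0} strings", v.Length);
        string[] data = { "a", "b", "c" };

        // Throws TargetParameterCountException: string[] converts to object[]
        // by covariance, so each element becomes a separate argument.
        try { h.DynamicInvoke(data); }
        catch (Exception e) { Console.WriteLine(e.GetType().Name); }

        // Works: a one-element object[] whose single element is the string[].
        h.DynamicInvoke(new object[] { data });
    }
}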
