Tyrus - pass object from client to server

Is it possible to pass a custom object from client to server using the Tyrus project for WebSocket communication? I want to build a simple desktop application using JavaFX. How can I pass data that I "collect" on the client side (e.g. a Person object with name and lastName fields) so I can save that data to a database (in my server logic)?

It is possible, and the form of the transferred data is completely your choice.
WebSocket can transfer text or binary data, that's it. You can serialize your object with an ObjectOutputStream and send the data as a binary stream, or you can use JAXB to marshal and unmarshal data to/from XML, or JSON-P for JSON (note that there are lots of other possibilities, like GSON, Jackson, ...).
If I were in your position, I'd use JSON with whatever library I find usable - this way, when you extend the application scope to JavaScript clients, you'll (hopefully) be able to reuse everything.
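For example, here is a minimal sketch of the JSON route on the client side, using Jackson (the Person DTO, the endpoint URI and the class name are assumptions made purely for illustration; any JSON library works the same way):

import java.net.URI;
import javax.websocket.ClientEndpoint;
import javax.websocket.ContainerProvider;
import javax.websocket.Session;
import javax.websocket.WebSocketContainer;
import com.fasterxml.jackson.databind.ObjectMapper;

@ClientEndpoint
public class PersonClient {

    // Hypothetical DTO matching the question's example
    public static class Person {
        public String name;
        public String lastName;
    }

    public static void main(String[] args) throws Exception {
        Person person = new Person();
        person.name = "John";
        person.lastName = "Doe";

        // Serialize the object to a JSON string
        String json = new ObjectMapper().writeValueAsString(person);

        // Connect with the javax.websocket client API (Tyrus) and send the JSON as text;
        // the server URI below is an assumption for illustration
        WebSocketContainer container = ContainerProvider.getWebSocketContainer();
        try (Session session = container.connectToServer(PersonClient.class,
                URI.create("ws://localhost:8025/app/person"))) {
            session.getBasicRemote().sendText(json);
        }
        // On the server, an @OnMessage(String message) method can map it back with:
        // Person p = new ObjectMapper().readValue(message, Person.class);
    }
}

On the server side you can then persist the deserialized Person with whatever persistence layer you use.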

In addition to Pavel Bucek's explanation, here is some sample code.
Base64 for conversion
import java.util.Base64;
Server endpoint
ArrayList<String> listobj = new ArrayList<>();
listobj.add("data1");
listobj.add("data2");
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
ObjectOutputStream objectOutputStream = new ObjectOutputStream(byteArrayOutputStream);
objectOutputStream.writeObject(listobj);
String str = Base64.getEncoder().encodeToString(byteArrayOutputStream.toByteArray());
session.getBasicRemote().sendText(str);
Client (Tyrus)
@OnMessage
public void onMessage(Session session, final String message) throws IOException {
    ByteArrayInputStream bis = null;
    ObjectInputStream ois = null;
    try {
        byte[] data = Base64.getDecoder().decode(message);
        bis = new ByteArrayInputStream(data);
        ois = new ObjectInputStream(bis);
        ArrayList<?> list = (ArrayList<?>) ois.readObject();
        for (int i = 0; i < list.size(); i++) {
            System.out.println(list.get(i));
        }
    } catch (Exception e) {
        System.out.println("error : " + e.getMessage());
    } finally {
        if (bis != null) {
            bis.close();
        }
        if (ois != null) {
            ois.close();
        }
    }
}

Related

How to make outgoing request or webhook in Acumatica?

I'm integrating an ASP.NET application with Acumatica that needs to update shipping information (tracking #, carrier, etc.) when it becomes available in Acumatica. Is there a way to have Acumatica call an endpoint on my ASP.NET app when a shipment is created? I've searched through a lot of the docs (available here), but I haven't come across anything to send information OUT from Acumatica to another web service.
Ideally, this outgoing call would send the shipment object in the payload.
This wasn't available when you asked the question, but push notifications seem to be exactly what you're looking for:
Help - https://help.acumatica.com/(W(9))/Main?ScreenId=ShowWiki&pageid=d8d2835f-5450-4b83-852e-dbadd76a5af8
Presentation - https://adn.acumatica.com/content/uploads/2018/05/Push-Notifications.pdf
In my answer I assume that you know how to call an outside service from C# code, and that the challenge for you is how to send the notification from Acumatica.
I propose extending the Persist method of each Acumatica graph from which you expect to send a notification when an object is persisted to the database. IMHO the best option for this is to override the Persist method (btw, overriding the Persist method is well described in T300). In the code of the extension class you can do the following:
public void Persist(PersistDelegate baseMethod)
{
baseMethod(); // calling this method will preserve your changes in db
//here goes your code that will send the push/pop/delete etc. web request to your ASP.NET application - in other words, your webhook
}
If you don't have Acumatica 2017 R2, then you have to create your own extension project, and then you can call it from your Acumatica code:
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;
namespace MyApp
{
public static class Utility
{
private static WebRequest CreateRequest(string url, Dictionary<string, string> headers)
{
if (Uri.IsWellFormedUriString(url, UriKind.Absolute))
{
WebRequest req = WebRequest.Create(url);
if (headers != null)
{
foreach (var header in headers)
{
if (!WebHeaderCollection.IsRestricted(header.Key))
{
req.Headers.Add(header.Key, header.Value);
}
}
}
return req;
}
else
{
throw(new ArgumentException("Invalid URL provided.", "url"));
}
}
public static string MakeRequest(string url, Dictionary<string, string> headers = null)
{
WebResponse resp = CreateRequest(url, headers).GetResponse();
StreamReader reader = new StreamReader(resp.GetResponseStream());
string response = reader.ReadToEnd();
reader.Close();
resp.Close();
return response;
}
public static byte[] MakeRequestInBytes(string url, Dictionary<string, string> headers = null)
{
byte[] rb = null;
WebResponse resp = CreateRequest(url, headers).GetResponse();
using (BinaryReader br = new BinaryReader(resp.GetResponseStream()))
{
rb = br.ReadBytes((int)resp.ContentLength);
br.Close();
}
resp.Close();
return rb;
}
}
}
You can then call it like this:
try
{
Utility.MakeRequest(theUrl, anyHeadersYouNeed);
}
catch(System.Net.WebException ex)
{
throw(new PXException("There was an error.", ex));
}

Google Custom Search API - Search Results

I have somewhat lost touch with custom search engines ever since Google switched from its legacy search engine API in favor of the Google Custom Search API. I'm hoping someone might be able to tell me whether a (pretty simple) goal can be accomplished with the new framework, and any starting help would be great.
Specifically, I am looking to write a program which will read in text from a text file, then use five words from said document in a Google search - the point being to figure out how many results accrue from said search.
An example input/output would be:
Input: "This is my search term" -- quotations included in the search!
Output: there were 7 total results
Thanks so much, all, for your time/help
First you need to create a Google Custom Search project inside your Google account.
From this project you must obtain a Custom Search Engine ID, known as the cx parameter. You must also obtain an API key. Both of these are available from your Google Custom Search API project inside your Google account.
Then, if you prefer Java, here's a working example:
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URI;
import java.net.URISyntaxException;
import java.net.URL;

public class GoogleCustomSearchAPI {
    public static void main(String[] args) throws Exception {
        String key = "your_key";
        String qry = "your_query";
        String cx = "your_cx";
        // Build the request URL; the fields parameter restricts the response
        // to queries.request.totalResults
        URL url = new URL(
                "https://www.googleapis.com/customsearch/v1?key=" + key + "&cx=" + cx + "&q=" + qry
                + "&alt=json&fields=queries(request(totalResults))");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        conn.setRequestProperty("Accept", "application/json");
        BufferedReader br = new BufferedReader(new InputStreamReader(conn.getInputStream()));
        // Uncomment if you need to print the raw JSON response
        /*String output;
        System.out.println("Output from Server .... \n");
        while ((output = br.readLine()) != null) {
            System.out.println(output);
        }*/
        // Print the urls and domains from the Google Custom Search response
        String searchResult;
        while ((searchResult = br.readLine()) != null) {
            int startPos = searchResult.indexOf("\"link\": \"") + ("\"link\": \"").length();
            int endPos = searchResult.indexOf("\",");
            if (searchResult.contains("\"link\": \"") && (endPos > startPos)) {
                String link = searchResult.substring(startPos, endPos);
                if (link.contains(",")) {
                    String tempLink = "\"";
                    tempLink += link;
                    tempLink += "\"";
                    System.out.println(tempLink);
                } else {
                    System.out.println(link);
                }
                System.out.println(getDomainName(link));
            }
        }
        conn.disconnect();
    }

    public static String getDomainName(String url) throws URISyntaxException {
        URI uri = new URI(url);
        String domain = uri.getHost();
        return domain.startsWith("www.") ? domain.substring(4) : domain;
    }
}
The "&queriefields=queries(request(totalResults))" is what makes the difference and gives sou what you need. But keep in mind that you can perform only 100 queries per day for free and that the results of Custom Search API are sometimes quite different from the those returned from Google.com search
If anybody still needs an example of the CSE (Google Custom Search Engine) API, this is a working method:
public static List<Result> search(String keyword){
Customsearch customsearch= null;
try {
customsearch = new Customsearch(new NetHttpTransport(),new JacksonFactory(), new HttpRequestInitializer() {
public void initialize(HttpRequest httpRequest) {
try {
// set connect and read timeouts
httpRequest.setConnectTimeout(HTTP_REQUEST_TIMEOUT);
httpRequest.setReadTimeout(HTTP_REQUEST_TIMEOUT);
} catch (Exception ex) {
ex.printStackTrace();
}
}
});
} catch (Exception e) {
e.printStackTrace();
}
List<Result> resultList=null;
try {
Customsearch.Cse.List list=customsearch.cse().list(keyword);
list.setKey(GOOGLE_API_KEY);
list.setCx(SEARCH_ENGINE_ID);
Search results=list.execute();
resultList=results.getItems();
}
catch ( Exception e) {
e.printStackTrace();
}
return resultList;
}
This method returns a List of Result objects, so you can iterate through it:
List<Result> results = new ArrayList<>();
try {
results = search(QUERY);
} catch (Exception e) {
e.printStackTrace();
}
for(Result result : results){
System.out.println(result.getDisplayLink());
System.out.println(result.getTitle());
// all attributes
System.out.println(result.toString());
}
I use these Gradle dependencies:
dependencies {
compile 'com.google.apis:google-api-services-customsearch:v1-rev57-1.23.0'
}
Don't forget to define your own GOOGLE_API_KEY, SEARCH_ENGINE_ID (cx), QUERY and HTTP_REQUEST_TIMEOUT (i.e. private static final int HTTP_REQUEST_TIMEOUT = 3 * 600000;).
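If, as in the original question, you need the total hit count rather than the items, the same Search response should also carry it. A hedged sketch, assuming the generated model in this library revision exposes getSearchInformation() and getTotalResults() (verify the getter names and return type against your revision); this goes right after list.execute() in the search(...) method above:

// Hedged: getter names assumed from the CSE v1 response schema
// (searchInformation.totalResults); check them against your library revision.
Search results = list.execute();
Long total = results.getSearchInformation().getTotalResults();
System.out.println("there were " + total + " total results");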

How can I store Objects in cassandra using the blob datatype

I tried with the blob data type, but that gives a DataStax exception. I tried the object itself and a byte array; still no good:
Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: Invalid STRING constant ([B#547248ad) for user_object of type blob
This is the failing INSERT:
executeSting.append("INSERT INTO htadb.objecttable (object_id, bucket_name, object_key, link, user_status, user_object) ")
.append("VALUES (")
.append(objectId).append(",'")
.append(bucketName).append("','")
.append(key).append("','")
.append(link).append("','")
.append("online").append("','")
.append(serializer(register)).append("')"
+ ";");
From the documentation:
blob | blobs | Arbitrary bytes (no validation), expressed as hexadecimal
so what you need is provided by the Bytes class. The following is an interface I use to serialize/deserialize Java objects that I need to save in Cassandra:
public interface Bufferable extends Serializable {
static final Logger LOGGER = LoggerFactory.getLogger(Bufferable.class);
default ByteBuffer serialize() {
try (ByteArrayOutputStream bytes = new ByteArrayOutputStream();
ObjectOutputStream oos = new ObjectOutputStream(bytes);) {
oos.writeObject(this);
String hexString = Bytes.toHexString(bytes.toByteArray());
return Bytes.fromHexString(hexString);
} catch (IOException e) {
LOGGER.error("Serializing bufferable object error", e);
return null;
}
}
public static Bufferable deserialize(ByteBuffer bytes) {
String hx = Bytes.toHexString(bytes);
ByteBuffer ex = Bytes.fromHexString(hx);
try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(ex.array()));) {
return (Bufferable) ois.readObject();
} catch (ClassNotFoundException | IOException e) {
LOGGER.error("Deserializing bufferable object error", e);
return null;
}
}
}
HTH,
Carlo
You can wrap your byte[] array using java.nio.ByteBuffer.wrap().
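For example (a minimal sketch assuming the DataStax Java driver 3.x, the table from the question, and a numeric object_id column), bind the wrapped bytes to a prepared statement instead of splicing them into the CQL string, which is what causes the "Invalid STRING constant" error:

import java.nio.ByteBuffer;
import com.datastax.driver.core.PreparedStatement;
import com.datastax.driver.core.Session;

public class BlobInsertExample {
    // "serialized" is the byte[] produced by your serializer(register) call
    public static void saveObject(Session session, long objectId, byte[] serialized) {
        PreparedStatement ps = session.prepare(
                "INSERT INTO htadb.objecttable (object_id, user_object) VALUES (?, ?)");
        // ByteBuffer.wrap() turns the byte[] into the ByteBuffer the blob column expects
        session.execute(ps.bind(objectId, ByteBuffer.wrap(serialized)));
    }
}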
Use Cassandra 4's ByteUtils
Super easy. For example you can use ByteUtils.toHexString(bytes).

Custom FileResult on Azure: Browser Waits forever

I have an action that returns an Excel file as a custom FileResult. My solution is based on the ClosedXml library (internally using OpenXml).
My XlsxResult class uses a read-only .xlsx file on the server as a template. It then copies the template into a memory stream that gets manipulated and saved back with ClosedXml. In the end the memory stream gets written to the response.
This works fine both on Cassini and IIS Express but fails when deployed to Azure with no error whatsoever. The only effect I am experiencing is that the request sent to the server never gets any response. I am still waiting for something to happen after 60 minutes or so...
My action:
[OutputCache(Location= System.Web.UI.OutputCacheLocation.None, Duration=0)]
public FileResult Export(int year, int month, int day) {
var date = new DateTime(year, month, day);
var filename = string.Format("MyTemplate_{0:yyyyMMdd}.xlsx", date);
//return new FilePathResult("~/Content/templates/MyTemplate.xlsx", "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet");
var result = new XlsxExportTemplatedResult("MyTemplate.xlsx", filename, (workbook) => {
var ws = workbook.Worksheets.Worksheet("My Export Sheet");
ws.Cell("B3").Value = date;
// Using a OpenXML's predefined formats (15 stands for date)
ws.Cell("B3").Style.NumberFormat.NumberFormatId = 15;
ws.Columns().AdjustToContents(); // You can also specify the range of columns to adjust, e.g.
return workbook;
});
return result;
}
My FileResult
public class XlsxExportTemplatedResult : FileResult
{
// default buffer size as defined in BufferedStream type
private const int BufferSize = 0x1000;
public static readonly string TEMPLATE_FOLDER_LOCATION = @"~\Content\templates";
public XlsxExportTemplatedResult(string templateName, string fileDownloadName, Func<XLWorkbook, XLWorkbook> generate)
: base("application/vnd.openxmlformats-officedocument.spreadsheetml.sheet") {
this.TemplateName = templateName;
this.FileDownloadName = fileDownloadName;
this.Generate = generate;
}
public string TemplateName { get; protected set; }
public Func<XLWorkbook, XLWorkbook> Generate { get; protected set; }
protected string templatePath = string.Empty;
public override void ExecuteResult(ControllerContext context) {
templatePath = context.HttpContext.Server.MapPath(System.IO.Path.Combine(TEMPLATE_FOLDER_LOCATION, this.TemplateName));
base.ExecuteResult(context);
}
//http://msdn.microsoft.com/en-us/library/office/ee945362(v=office.11).aspx
protected override void WriteFile(System.Web.HttpResponseBase response) {
FileStream fileStream = new FileStream(templatePath, FileMode.Open, FileAccess.Read);
using (MemoryStream memoryStream = new MemoryStream()) {
CopyStream(fileStream, memoryStream);
using (var workbook = new XLWorkbook(memoryStream)) {
Generate(workbook);
workbook.Save();
}
// At this point, the memory stream contains the modified document.
// grab chunks of data and write to the output stream
Stream outputStream = response.OutputStream;
byte[] buffer = new byte[BufferSize];
while (true) {
int bytesRead = memoryStream.Read(buffer, 0, BufferSize);
if (bytesRead == 0) {
// no more data
break;
}
outputStream.Write(buffer, 0, bytesRead);
}
}
fileStream.Dispose();
}
static private void CopyStream(Stream source, Stream destination) {
byte[] buffer = new byte[BufferSize];
int bytesRead;
do {
bytesRead = source.Read(buffer, 0, buffer.Length);
destination.Write(buffer, 0, bytesRead);
} while (bytesRead != 0);
}
}
So am I missing something (apparently I am).
Please Note:
There are no DLLs missing from Azure because I checked using the RemoteAccess feature of the Windows Azure Tools 1.7
My export is not a heavy long running task.
When I changed the action to just return a FilePathResult with the template xlsx, it worked on Azure. But I need to process the file before returning it, as you might suspect :-)
Thanks.
UPDATE:
After adding extensive logging to my code, I found the execution hangs, with no error, at the ClosedXml Save method call. Extract from the WADLogsTable:
Opening template file from path: E:\sitesroot\0\Content\templates\MyTemplate.xlsx
Opened template from path: E:\sitesroot\0\Content\templates\MyTemplate.xlsx
Just copied template to editable memory stream. Bytes copied: 15955, Position: 15955
Modified the excel document in memory.
Here it hangs when it calls workbook.Save(), which is a ClosedXml method call.
I was facing the exact same error situation as you. I can't offer a fix in your specific situation, and I know you switched tracks, but after going through the same frustrating steps you had faced, I'd like to "pave the way" for an answer for you (or others).
Drop into your package manager console in Visual Studio and install Elmah with the MVC goodies (routing):
Install-Package elmah.MVC
Now, in your root web.config, update your Elmah entry. It's likely at the end of the file, looking like this:
<elmah></elmah>
Update that bad boy to allow remote access and set up your log path:
<elmah>
<security allowRemoteAccess="1" />
<errorLog type="Elmah.XmlFileErrorLog, Elmah" logPath="~/app_data/elmah" />
</elmah>
Now, push that up to Azure.
Finally, visit your site, force the error then navigate to http://your-site-here.azurewebsites.net/elmah and you'll see the exact cause of the error.
Elmah is so the awesome.
Sheepish confession: the error for me wasn't in the third-party code; it turned out to be in my connection string, for which I hadn't set MultipleActiveResultSets to true. The other fix I had to do was pass my entities into one of the internal methods of that library after calling ToList(); leaving it as an IQueryable borked the method.

Does ServiceStack support binary responses?

Is there any mechanism in ServiceStack services to return streaming/large binary data? WCF's MTOM support is awkward but effective in returning large amounts of data without text conversion overhead.
I love ServiceStack; this little bit of code was enough to return an Excel report from a memory stream:
public class ExcelFileResult : IHasOptions, IStreamWriter
{
private readonly Stream _responseStream;
public IDictionary<string, string> Options { get; private set; }
public ExcelFileResult(Stream responseStream)
{
_responseStream = responseStream;
Options = new Dictionary<string, string> {
{"Content-Type", "application/octet-stream"},
{"Content-Disposition", "attachment; filename=\"report.xls\";"}
};
}
public void WriteTo(Stream responseStream)
{
if (_responseStream == null)
return;
_responseStream.WriteTo(responseStream);
responseStream.Flush();
}
}
From a bird's-eye view, ServiceStack can return any of:
Any DTO object -> serialized to Response ContentType
HttpResult, HttpError, CompressedResult (IHttpResult) for Customized HTTP response
The following types are not converted and get written directly to the Response Stream:
String
Stream
IStreamWriter
byte[] - with the application/octet-stream Content Type.
Details
In addition to returning plain C# objects, ServiceStack allows you to return any Stream or IStreamWriter (which is a bit more flexible on how you write to the response stream):
public interface IStreamWriter
{
void WriteTo(Stream stream);
}
Both though allow you to write directly to the Response OutputStream without any additional conversion overhead.
If you want to customize the HTTP headers at the same time, you just need to implement IHasOptions, where any dictionary entry is written to the response HttpHeaders.
public interface IHasOptions
{
IDictionary<string, string> Options { get; }
}
Further than that, IHttpResult allows even finer-grained control of the HTTP output, where you can supply a custom HTTP response status code. You can refer to the implementation of the HttpResult object for a real-world implementation of these interfaces.
I had a similar requirement which also required me to track progress of the streaming file download. I did it roughly like this:
server-side:
service:
public object Get(FooRequest request)
{
var stream = ...//some Stream
return new StreamedResult(stream);
}
StreamedResult class:
public class StreamedResult : IHasOptions, IStreamWriter
{
public IDictionary<string, string> Options { get; private set; }
Stream _responseStream;
public StreamedResult(Stream responseStream)
{
_responseStream = responseStream;
long length = -1;
try { length = _responseStream.Length; }
catch (NotSupportedException) { }
Options = new Dictionary<string, string>
{
{"Content-Type", "application/octet-stream"},
{ "X-Api-Length", length.ToString() }
};
}
public void WriteTo(Stream responseStream)
{
if (_responseStream == null)
return;
using (_responseStream)
{
_responseStream.WriteTo(responseStream);
responseStream.Flush();
}
}
}
client-side:
string path = Path.GetTempFileName();//in reality, wrap this in try... so as not to leave hanging tmp files
var response = client.Get<HttpWebResponse>("/foo/bar");
long length;
if (!long.TryParse(response.GetResponseHeader("X-Api-Length"), out length))
length = -1;
using (var fs = System.IO.File.OpenWrite(path))
fs.CopyFrom(response.GetResponseStream(), new CopyFromArguments(new ProgressChange((x, y) => { Console.WriteLine(">> {0} {1}".Fmt(x, y)); }), TimeSpan.FromMilliseconds(100), length));
The "CopyFrom" extension method was borrowed directly from the source code file "StreamHelper.cs" in this project here: Copy a Stream with Progress Reporting (Kudos to Henning Dieterichs)
And kudos to mythz and any contributor to ServiceStack. Great project!
