Extra backslash in URLs - foursquare

Has anyone noticed that the service started sending the image's URL with an extra backslash before every slash?
e.g.:
"icon": "https:\ /\ /foursquare.com\ /img\ /categories\ /food\ /default.png".
instead of:
https://foursquare.com/img/categories/food/default.png
Is this normal? Thanks in advance.

We (foursquare) made a change yesterday to the way we serialize JSON output, but we did so in a way that shouldn't break any modestly mature JSON handler. We may tweak how we do serialization in the future (always in compliance with the JSON spec), and we recommend you build your system to be robust against such changes.
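For what it's worth, escaped forward slashes are legal JSON, so a spec-compliant parser returns the same string either way. A minimal PHP sketch (the URL is just the one from the question):

$escaped = '{"icon":"https:\/\/foursquare.com\/img\/categories\/food\/default.png"}';
$unescaped = '{"icon":"https://foursquare.com/img/categories/food/default.png"}';
// Both decode to identical values, so a compliant handler never notices the difference.
var_dump(json_decode($escaped, true) === json_decode($unescaped, true)); // bool(true)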

Related

How do I resolve "Illuminate\Queue\InvalidPayloadException: Unable to JSON encode payload. Error code: 5"

Trying out the queue system for a better user upload experience with Laravel-Excel.
The .env queue setting was changed from 'sync' to 'database' and the migrations were run. All the necessary use statements are in place, yet the error above persists.
The exact error happens here:
Illuminate\Queue\Queue.php:97
$payload = json_encode($this->createPayloadArray($job, $queue, $data));
if (JSON_ERROR_NONE !== json_last_error()) {
throw new InvalidPayloadException(
If I drop ShouldQueue, the file imports perfectly in-session (it's a large file, so there's a long wait for the user).
I've read many Stack Overflow, GitHub, etc. comments on this, but I don't have the technical skills to deep-dive and fix my particular situation (most of them speak of UTF-8, but I don't know if that's an issue here; I changed the Excel save format to UTF-8, but it didn't fix it).
P.S. While running the migration, I got the error:
SQLSTATE[42000]: Syntax error or access violation: 1071 Specified key was too long; max key length is 767 bytes (SQL: alter table `jobs` add index `jobs_queue_index`(`queue`))
I bypassed it by dropping the 'add index' part, so my jobs table is not indexed on queue, but I don't think this is the cause.
One thing you can do when looking into json_encode() errors is use the json_last_error_msg() function, which will give you a bit more of a readable error message.
In your case you're getting a '5' back, which is the JSON_ERROR_UTF8 error code. The error message back for this is a slightly more informative one:
'Malformed UTF-8 characters, possibly incorrectly encoded'
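If it helps, a minimal sketch of pulling that message out (here $payload stands in for whatever createPayloadArray() built in the Laravel source above):

$json = json_encode($payload);
if (json_last_error() !== JSON_ERROR_NONE) {
    // For error code 5 this prints:
    // 5: Malformed UTF-8 characters, possibly incorrectly encoded
    echo json_last_error() . ': ' . json_last_error_msg();
}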
So we know it's encountering non-UTF-8 characters, even though you're saving the file specifically with UTF-8 encoding. At first glance you might think you need to convert the encoding yourself in code (like this answer), but in this case, I don't think that'll help. For Laravel-Excel, this seems to be a limitation of trying to queue-read .xls files - from the Laravel-Excel docs:
You currently cannot queue xls imports. PhpSpreadsheet's Xls reader contains some non-utf8 characters, which makes it impossible to queue.
In this case you might be stuck with a slow, non-queueable option, or need to convert your spreadsheet into a queueable format e.g. .csv.
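If converting to .csv is an option, a queued import is then straightforward. A hedged sketch, assuming the maatwebsite/laravel-excel package; the UsersImport name and column mapping are invented for illustration:

<?php

namespace App\Imports;

use App\Models\User;
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class UsersImport implements ToModel, WithChunkReading, ShouldQueue
{
    // Each row becomes a model; implementing ShouldQueue makes the
    // chunks run as queued jobs instead of blocking the request.
    public function model(array $row)
    {
        return new User(['name' => $row[0], 'email' => $row[1]]);
    }

    public function chunkSize(): int
    {
        return 1000; // rows per queued job
    }
}

Calling Excel::import(new UsersImport, 'users.csv') then dispatches jobs to the queue rather than importing in-session.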
The key length error on running the migration is unrelated. It has been around for a while and is a side-effect of using an older version of MySQL/MariaDB. Check out this answer and the Laravel documentation around index lengths - you need to add this to your AppServiceProvider::boot() method:
Schema::defaultStringLength(191);
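For context, a minimal sketch of where that call goes, following the Laravel docs:

<?php

namespace App\Providers;

use Illuminate\Support\Facades\Schema;
use Illuminate\Support\ServiceProvider;

class AppServiceProvider extends ServiceProvider
{
    public function boot()
    {
        // Keep default indexed string columns within the 767-byte key limit
        // of older MySQL/MariaDB (191 chars * 4 bytes under utf8mb4 = 764).
        Schema::defaultStringLength(191);
    }
}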

Skipping Sendmail's Queue

I've set up Sendmail so that all messages are delivered to /dev/null instead of actually being stored anywhere else. I'm trying to reduce the number of unnecessary disk writes, and since those messages are essentially discarded, I want to skip writing them to mqueue if possible. Is there any way to do that?
The closest I could think of is mounting a nullfs filesystem on the mqueue directory, but I'd like a "cleaner" approach using sendmail only. Is this possible?
Thanks!
Most likely you chose the wrong way to solve your problem, but anyway:
You can select the discard mailer for all recipients in the check_rcpt (Local_check_rcpt) rule set. It will act as the equivalent of DISCARD in the access table.
Add the following lines to the sendmail.mc file, generate a new sendmail.cf file, and restart or HUP the sendmail daemon.
LOCAL_RULESETS
SLocal_check_rcpt
# PUT TAB (\t) BEFORE $# !!!
R$*		$#discard $: discard

The XML parser detected error code 302

I am using the XML-INTO op-code to parse a web service request. Every now and then I get errors in the logs
(RNX0351 - "The XML parser detected error code 302").
The help for a 302 is
302 The parser does not support the requested CCSID value or
the first character of the XML document was not '<'
To the best of my knowledge, the first character is "<", and the request is generated from a previous web service call, so I would be very surprised if the CCSID has changed.
The error is repeatable for the specific query, so it is almost certainly data related; I am just unsure how I would go about identifying the offending item.
Any thoughts on how to determine the issue, or better yet, how to overcome it?
cheers
CCSID is an AS400/iSeries/Power Systems attribute, and it applies to the whole IFS. It's like a declaration of what is inside the file, or in other words what its internal encoding "should be".
The data content encoding and the file's own encoding (the envelope) are supposed to match, and the box uses this attribute to display and handle the corresponding characters.
It sounds like you receive data in one encoding, but the file's CCSID doesn't match it.
Try changing the CCSID on your file (only the envelope), e.g. 37 (American), 500 (Latin-1), 819 (UTF-8), 850 (DOS), or 1252 (Windows), and display the file afterwards. You can check first using ls -Sla yourfile in QSH or QP2TERM, or EDTF as well. CHGATTR allows you to change the CCSID, as does setccsid in QSH (again).
This approach helped me find related issues. Remember that although data may be visible on the four hundred, it may not be visible through a shared folder in Windows; that means the file's CCSID and the content encoding don't match.
Hope it helps.
Hi, I've seen this error with XML data uploaded to AS400/iSeries/IBM i with FTP and CCSID 819 (ISO 8859-1 ASCII), where the file had some binary garbage in its first few positions. Changing the encoding to CCSID 1208 (UTF-8 with IBM PUA) using the FTP command "quote type c 1208" cleared the problem and XML-INTO was successful.
So, my suggestion for XML parser error 302 received when using XML-INTO is to look at the file (wrklnk ...), and if the first character is not "<" but instead some binary garbage, try CCSID 1208 for UTF-8.
The statements in this answer about what 819 is and which CCSID represents UTF-8 do not agree with the previous answer, but they are correct according to IBM documentation:
https://www-01.ibm.com/software/globalization/ccsid/ccsid819.html
https://www-01.ibm.com/software/globalization/ccsid/ccsid1208.html
I worked on this problem for a couple of hours;
for me, the solution was to use the option ccsid=UCS2 when using a data structure or variable to store the XML.
Something like this:
XML-INTO customer %XML( xmlSource : 'ccsid=UCS2');
I have the program running at CCSID 870; every CCSID conversion on the xmlSource field didn't work.
The strange thing is that when I use a file with CCSID 850, everything works fine.
I mention that because this is the first page that comes up when you search for this problem.
Maybe this helps someone.

How to encode a PHP file with base64

:)
I have one ridiculously silly question, and most of you would like to refer me to Google right away, but that didn't help me out within the first hour. I suppose I didn't know what to look for. I have a PHP file and I'd like to have it in base64, yet I can't get it to work anyhow.
1) I encoded my PHP script to base64 (and included the PHP tags). It looks like the following: JTNDJTNGcGhwJTIwVGhpcyUyMGlzJTIwdGhlJTIwUEhQJTIwY29kZSUyMCUzRiUzRQ==
This kind of base64 won't execute, so I added the PHP tags to it, although the encoded file already had them. Still didn't work. Removed the tags from the base64 and tried again, but still no luck. Then I tried adding the PHP tags and inside of them added:
eval(gzinflate(base64_decode('base64 here')));
Still didn't work. Is anyone here kind enough to tell the kiddo how to run a base64-encoded PHP file properly?
Would be really appreciated. :)
Some simple code:
$source = "JTNDJTNGcGhwJTIwVGhpcyUyMGlzJTIwdGhlJTIwUEhQJTIwY29kZSUyMCUzRiUzRQ==";
$code = base64_decode($source);
eval($code); // note: eval() expects PHP statements without the opening <?php tag
or even shorter:
eval(base64_decode("JTNDJTNGcGhwJTIwVGhpcyUyMGlzJTIwdGhlJTIwUEhQJTIwY29kZSUyMCUzRiUzRQ=="));
Do you want to encrypt your code? If so, this is not the right way. Use an accelerator like this one or this one. They will encrypt your code and can even make it faster!
If you are going to use base64 to encode your PHP file, then the encoded text needs to sit between the PHP tags, including the base64_decode call.
Example:
If your code is:
JTNDJTNGcGhwJTIwVGhpcyUyMGlzJTIwdGhlJTIwUEhQJTIwY29kZSUyMCUzRiUzRQ
Then your code should look like:
<?php eval("?>".base64_decode("JTNDJTNGcGhwJTIwVGhpcyUyMGlzJTIwdGhlJTIwUEhQJTIwY29kZSUyMCUzRiUzRQ")); ?>
Basically, your code will look like this:
<?php eval("?>".base64_decode("Code Goes here")); ?>
There are simpler tools that can give you this option.
Check this out: PHP Encoder & Decoder with Domain Lock

curl chunky parser error

"Received problem 3 in the chunky parser"
I can't for the life of me find what "problem 3" in curl refers to. I'm sure it has to do with the format of the chunk I'm sending from the app server to curl, but I can't figure out what is wrong with the chunk because I can't tell what "problem 3" is.
Any ideas?
The number you see there is CHUNKE_BAD_CHUNK from the CHUNKcode enum in lib/http_chunks.h in the libcurl source code. From a quick look, it seems to be used mostly when a CR or LF is missing from the chunked data.
I would recommend you investigate on the raw HTTP content stream to see what the problem is with the chunked format. RFC2616 section 3.6.1 documents it.
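For reference, a minimal sketch of that framing in PHP: each chunk is the payload length in hex, CRLF, exactly that many bytes, CRLF, with a zero-length chunk marking the end. A one-byte mismatch between the declared and actual length is enough to trigger the error above.

function chunk($data) {
    // hex size, CRLF, payload, CRLF
    return dechex(strlen($data)) . "\r\n" . $data . "\r\n";
}
echo chunk("Hello, "), chunk("world!"), "0\r\n\r\n";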
There is a similar post to yours. Again, I'm not sure what you're trying to send across, so I can't point out the problem, but have a look at this:
Why is this warning being shown: "Received problem 2 in the chunky parser"?
Hope this helps!
So, I ran into this with a CGI program.
Long story short, the CGI script was using Python, printing the chunk header using the length of the string, and then sending the data to the client using:
print data,
This appends a space, making the data one byte longer than the chunk header says it is. I fixed this by changing that line to:
sys.stdout.write(data)
A hexdump of the data out of the CGI script was the tool that finally told me what was going on.
