Node usb, no data received - node.js

I use the following code to receive data from an ANT+ dongle on OS X. Other software (Zwift) confirms that the dongle actually receives data:
/* Load the .env config (VENDOR_ID / PRODUCT_ID) */
require('dotenv').config();
/* Require the usb module */
const usb = require('usb');
/* Find the device; env vars are strings, so convert them to numbers */
const device = usb.findByIds(Number(process.env.VENDOR_ID), Number(process.env.PRODUCT_ID));
/* Open the device */
device.open(true);
/* Claim the interface so this process gets exclusive access to its endpoints */
const [iface] = device.interfaces; // `interface` is a reserved word in strict mode
iface.claim();
/* Start polling the IN endpoint */
const [inEndpoint, outEndpoint] = iface.endpoints;
inEndpoint.startPoll();
// inEndpoint.transferType = 2;
inEndpoint.on('data', (d) => {
  console.log('usb data:', d, ' len:', d.length);
});
inEndpoint.on('error', (error) => {
  console.log('on error', error);
});
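One thing the snippet never does is send anything to the stick: ANT+ dongles typically stay silent until the host issues commands (at minimum a reset, then network key and channel setup). A hedged sketch of building such a command, assuming the documented ANT serial framing (sync byte 0xA4, payload length, message ID, payload, XOR checksum) — the helper name is mine:

```javascript
// Hypothetical helper: frame an ANT message.
// Framing: sync (0xA4), payload length, message ID, payload bytes,
// then a checksum that XORs every preceding byte.
function antMessage(msgId, payload) {
  const bytes = [0xa4, payload.length, msgId, ...payload];
  const checksum = bytes.reduce((c, b) => c ^ b, 0);
  return Buffer.from([...bytes, checksum]);
}

// 0x4A = Reset System; after this the stick should start answering
// on the IN endpoint.
const resetSystem = antMessage(0x4a, [0x00]);
// outEndpoint.transfer(resetSystem, (err) => { if (err) console.error(err); });
```

If the dongle really is mute because it was never commanded, sending the reset (and then the network key) over `outEndpoint` should be the first thing to try.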
Am I doing something wrong?
Below you find some logging; as far as I can tell everything is set up correctly, so it must have something to do with Node.
[timestamp] [threadID] facility level [function call] <message>
--------------------------------------------------------------------------------
[ 0.003821] [00000307] libusb: debug [libusb_get_device_list]
[ 0.003925] [00000307] libusb: debug [libusb_get_device_descriptor]
[ 0.004005] [00000307] libusb: debug [libusb_get_device_descriptor]
[ 0.004021] [00000307] libusb: debug [libusb_get_device_descriptor]
[ 0.004033] [00000307] libusb: debug [libusb_get_device_descriptor]
[ 0.004125] [00000307] libusb: debug [libusb_open] open 20.9
[ 0.004269] [00000307] libusb: debug [usbi_add_pollfd] add fd 26 events 1
[ 0.004275] [00000307] libusb: debug [darwin_open] device open for access
[ 0.004301] [00004903] libusb: debug [handle_events] poll() returned 1
[ 0.004306] [00004903] libusb: debug [handle_events] caught a fish on the control pipe
[ 0.004319] [00004903] libusb: debug [libusb_get_next_timeout] no URBs, no timeout!
[ 0.004322] [00004903] libusb: debug [libusb_try_lock_events] someone else is modifying poll fds
[ 0.004324] [00004903] libusb: debug [libusb_event_handler_active] someone else is modifying poll fds
[ 0.004327] [00004903] libusb: debug [libusb_handle_events_timeout_completed] another thread is doing event handling
[ 0.004368] [00004903] libusb: debug [libusb_get_next_timeout] no URBs, no timeout!
[ 0.004380] [00004903] libusb: debug [libusb_handle_events_timeout_completed] doing our own event handling
[ 0.004384] [00004903] libusb: debug [handle_events] poll fds modified, reallocating
[ 0.004389] [00004903] libusb: debug [handle_events] poll() 3 fds with timeout in 60000ms
[ 0.004736] [00000307] libusb: debug [libusb_claim_interface] interface 0
[ 0.005084] [00000307] libusb: debug [get_endpoints] building table of endpoints.
[ 0.005100] [00000307] libusb: debug [get_endpoints] interface: 0 pipe 1: dir: 1 number: 1
[ 0.005107] [00000307] libusb: debug [get_endpoints] interface: 0 pipe 2: dir: 0 number: 1
[ 0.005133] [00000307] libusb: debug [darwin_claim_interface] interface opened
[ 0.005445] [00000307] libusb: debug [ep_to_pipeRef] converting ep address 0x81 to pipeRef and interface
[ 0.005454] [00000307] libusb: debug [ep_to_pipeRef] pipe 1 on interface 0 matches
[ 0.005540] [00000307] libusb: debug [ep_to_pipeRef] converting ep address 0x81 to pipeRef and interface
[ 0.005547] [00000307] libusb: debug [ep_to_pipeRef] pipe 1 on interface 0 matches
[ 0.005572] [00000307] libusb: debug [ep_to_pipeRef] converting ep address 0x81 to pipeRef and interface
[ 0.005576] [00000307] libusb: debug [ep_to_pipeRef] pipe 1 on interface 0 matches
[60.006611] [00004903] libusb: debug [handle_events] poll() returned 0
[60.006708] [00004903] libusb: debug [libusb_get_next_timeout] no URB with timeout or all handled by OS; no timeout!
[60.006727] [00004903] libusb: debug [libusb_handle_events_timeout_completed] doing our own event handling
[60.006742] [00004903] libusb: debug [handle_events] poll() 3 fds with timeout in 60000ms
[120.011876] [00004903] libusb: debug [handle_events] poll() returned 0
[120.011973] [00004903] libusb: debug [libusb_get_next_timeout] no URB with timeout or all handled by OS; no timeout!
[120.011994] [00004903] libusb: debug [libusb_handle_events_timeout_completed] doing our own event handling
[120.012007] [00004903] libusb: debug [handle_events] poll() 3 fds with timeout in 60000ms
[180.012077] [00004903] libusb: debug [handle_events] poll() returned 0
[180.012106] [00004903] libusb: debug [libusb_get_next_timeout] no URB with timeout or all handled by OS; no timeout!
[180.012112] [00004903] libusb: debug [libusb_handle_events_timeout_completed] doing our own event handling
[180.012116] [00004903] libusb: debug [handle_events] poll() 3 fds with timeout in 60000ms

Related

Receiver reloads when connecting from Chrome on Android

I'm trying to make a custom receiver for a multi-player Cast game. I'm also writing the sender app to run in the browser.
The problem I'm running into is that I can't connect to an already running receiver from Chrome on Android. If I launch the receiver from an Android device, I can connect to that receiver session from a desktop browser. However, if I launch the receiver from a desktop browser and try to connect from an Android browser, the receiver is restarted. The same happens if I launch the receiver from one Android browser and attempt to connect to it from a different Android device.
Both sender and receiver are written with Angular, using the CAF framework.
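One setting that governs whether a web sender joins an existing session or launches a fresh one is the auto-join policy passed to the CAF sender's CastContext. A minimal sketch of the sender init, for reference (the application ID is the one from the logs; everything else is an assumption about the sender setup, and the function name is mine):

```javascript
// Hedged sketch of web-sender initialization. Assumes the Cast sender
// library (cast_sender.js with CAF) is loaded, so `cast` and `chrome.cast`
// exist as globals. ORIGIN_SCOPED lets any page on the same origin join an
// existing session instead of relaunching the receiver.
function initCastSender() {
  cast.framework.CastContext.getInstance().setOptions({
    receiverApplicationId: '880C031C', // app id seen in the logs
    autoJoinPolicy: chrome.cast.AutoJoinPolicy.ORIGIN_SCOPED,
  });
}
```

Whether the Android-browser sender honors the same policy is exactly the kind of thing worth verifying here; this only shows where the knob lives on the web side.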
--Edit--
Here are some logs; I stripped the "cast_receiver_framework.js:77" prefix from the beginning of every line.
This first one shows launching the receiver from Chrome on Android, then connecting from a desktop browser.
[ 0.043s] [cast.receiver.CastReceiverManager] Version: 2.0.0.0060
vendor.js:45467 Angular is running in the development mode. Call enableProdMode() to enable the production mode.
[ 9.301s] [cast.framework.common.libraryloader] library(//www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js) is loaded
[ 9.426s] [cast.framework.common.libraryloader] library(//ajax.googleapis.com/ajax/libs/shaka-player/2.3.5/shaka-player.compiled.js) is loaded
[ 9.527s] [cast.receiver.IpcChannel] Received message: {"data":"{\"applicationId\":\"880C031C\",\"applicationName\":\"The Game\",\"closedCaption\":{},\"deviceCapabilities\":{\"bluetooth_supported\":false,\"display_supported\":true,\"focus_state_supported\":true,\"hi_res_audio_supported\":false},\"launchingSenderId\":\"68137637-265f-7e53-098b-043b5891b89e.14:com.android.chrome-7839\",\"messagesVersion\":\"1.0\",\"sessionId\":\"b2258c08-bada-4b4e-adc0-275a7b338e70\",\"type\":\"ready\",\"version\":\"1.29.104827\"}","namespace":"urn:x-cast:com.google.cast.system","senderId":"SystemSender"}
[ 9.540s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 9.546s] [cast.receiver.CastReceiverManager] Dispatching CastReceiverManager system ready event
[ 9.555s] [cast.framework.Application] onReady
[ 9.577s] [cast.receiver.IpcChannel] Received message: {"data":"{\"level\":1.0,\"muted\":false,\"type\":\"volumechanged\"}","namespace":"urn:x-cast:com.google.cast.system","senderId":"SystemSender"}
[ 9.581s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 9.585s] [cast.receiver.CastReceiverManager] Dispatching system volume changed event [1, false]
[ 9.589s] [cast.receiver.IpcChannel] Received message: {"data":"{\"type\":\"visibilitychanged\"}","namespace":"urn:x-cast:com.google.cast.system","senderId":"SystemSender"}
[ 9.590s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 9.595s] [cast.receiver.CastReceiverManager] Ignoring visibility changed event, state is already null
[ 9.597s] [cast.receiver.IpcChannel] Received message: {"data":"{\"type\":\"standbychanged\"}","namespace":"urn:x-cast:com.google.cast.system","senderId":"SystemSender"}
[ 9.601s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 9.604s] [cast.receiver.IpcChannel] Received message: {"data":"{\"hdrType\":\"sdr\",\"type\":\"hdroutputtypechanged\"}","namespace":"urn:x-cast:com.google.cast.system","senderId":"SystemSender"}
[ 9.606s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 9.613s] [cast.receiver.MediaManager] Sending broadcast status message
[ 9.629s] [cast.receiver.IpcChannel] IPC message sent: {"namespace":"urn:x-cast:com.google.cast.media","senderId":"*:*","data":"{\"type\":\"MEDIA_STATUS\",\"status\":[],\"requestId\":0}"}
[ 9.637s] [cast.receiver.IpcChannel] Received message: {"data":"{\"state\":\"IN_FOCUS\",\"type\":\"FOCUS_STATE\"}","namespace":"urn:x-cast:com.google.cast.cac","senderId":"SystemSender"}
[ 9.644s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 9.662s] [cast.framework.common.EventHandler] onEvent for FOCUS_STATE
[ 9.667s] [cast.receiver.IpcChannel] Received message: {"data":"{\"senderId\":\"68137637-265f-7e53-098b-043b5891b89e.14:com.android.chrome-7839\",\"type\":\"senderconnected\",\"userAgent\":\"Android CastSDK,12529050,Pixel 2,walleye,8.1.0\"}","namespace":"urn:x-cast:com.google.cast.system","senderId":"SystemSender"}
[ 9.673s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 9.679s] [cast.receiver.CastReceiverManager] Dispatching CastReceiverManager sender connected event [68137637-265f-7e53-098b-043b5891b89e.14:com.android.chrome-7839]
[ 9.685s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.debugoverlay, 68137637-265f-7e53-098b-043b5891b89e.14:com.android.chrome-7839]
[ 9.687s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.cac, 68137637-265f-7e53-098b-043b5891b89e.14:com.android.chrome-7839]
[ 9.688s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.broadcast, 68137637-265f-7e53-098b-043b5891b89e.14:com.android.chrome-7839]
[ 9.692s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.media, 68137637-265f-7e53-098b-043b5891b89e.14:com.android.chrome-7839]
[ 9.693s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.inject, 68137637-265f-7e53-098b-043b5891b89e.14:com.android.chrome-7839]
[ 9.695s] [cast.receiver.IpcChannel] Received message: {"data":"{\"requestId\":1,\"type\":\"GET_STATUS\"}","namespace":"urn:x-cast:com.google.cast.media","senderId":"68137637-265f-7e53-098b-043b5891b89e.14:com.android.chrome-7839"}
[ 9.700s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 9.718s] [cast.receiver.MediaManager] MediaManager message received
[ 9.724s] [cast.receiver.MediaManager] Dispatching MediaManager getStatus event
[ 9.726s] [cast.framework.common.EventHandler] onEvent for getstatus
[ 9.731s] [cast.receiver.MediaManager] onGetStatus
[ 9.740s] [cast.receiver.MediaManager] onGetStatus: {"requestId":1}
[ 9.742s] [cast.receiver.MediaManager] Sending status message to 68137637-265f-7e53-098b-043b5891b89e.14:com.android.chrome-7839
[ 9.745s] [cast.receiver.IpcChannel] IPC message sent: {"namespace":"urn:x-cast:com.google.cast.media","senderId":"68137637-265f-7e53-098b-043b5891b89e.14:com.android.chrome-7839","data":"{\"type\":\"MEDIA_STATUS\",\"status\":[],\"requestId\":1}"}
[ 27.436s] [cast.receiver.IpcChannel] Received message: {"data":"{\"requestId\":1,\"type\":\"GET_STATUS\"}","namespace":"urn:x-cast:com.google.cast.media","senderId":"68137637-265f-7e53-098b-043b5891b89e.16:com.google.android.gms-7840"}
[ 27.437s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 27.438s] [cast.receiver.MediaManager] MediaManager message received
[ 27.439s] [cast.receiver.MediaManager] Dispatching MediaManager getStatus event
[ 27.439s] [cast.framework.common.EventHandler] onEvent for getstatus
[ 27.440s] [cast.receiver.MediaManager] onGetStatus
[ 27.441s] [cast.receiver.MediaManager] onGetStatus: {"requestId":1}
[ 27.441s] [cast.receiver.MediaManager] Sending status message to 68137637-265f-7e53-098b-043b5891b89e.16:com.google.android.gms-7840
[ 27.442s] [cast.receiver.IpcChannel] IPC message sent: {"namespace":"urn:x-cast:com.google.cast.media","senderId":"68137637-265f-7e53-098b-043b5891b89e.16:com.google.android.gms-7840","data":"{\"type\":\"MEDIA_STATUS\",\"status\":[],\"requestId\":1}"}
[ 49.809s] [cast.receiver.IpcChannel] Received message: {"data":"{\"type\":\"GET_STATUS\",\"requestId\":401804041}","namespace":"urn:x-cast:com.google.cast.media","senderId":"68137637-265f-7e53-098b-043b5891b89e.10:sender-h2j2ft55n4ty"}
[ 49.814s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 49.820s] [cast.receiver.MediaManager] MediaManager message received
[ 49.825s] [cast.receiver.MediaManager] Dispatching MediaManager getStatus event
[ 49.830s] [cast.framework.common.EventHandler] onEvent for getstatus
[ 49.837s] [cast.receiver.MediaManager] onGetStatus
[ 49.842s] [cast.receiver.MediaManager] onGetStatus: {"requestId":401804041}
[ 49.847s] [cast.receiver.MediaManager] Sending status message to 68137637-265f-7e53-098b-043b5891b89e.10:sender-h2j2ft55n4ty
[ 49.852s] [cast.receiver.IpcChannel] IPC message sent: {"namespace":"urn:x-cast:com.google.cast.media","senderId":"68137637-265f-7e53-098b-043b5891b89e.10:sender-h2j2ft55n4ty","data":"{\"type\":\"MEDIA_STATUS\",\"status\":[],\"requestId\":401804041}"}
[ 51.151s] [cast.receiver.IpcChannel] Received message: {"data":"{\"senderId\":\"68137637-265f-7e53-098b-043b5891b89e.10:152574013695890545\",\"type\":\"senderconnected\",\"userAgent\":\"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.139 Safari/537.36\"}","namespace":"urn:x-cast:com.google.cast.system","senderId":"SystemSender"}
[ 51.157s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 51.162s] [cast.receiver.CastReceiverManager] Dispatching CastReceiverManager sender connected event [68137637-265f-7e53-098b-043b5891b89e.10:152574013695890545]
[ 51.167s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.debugoverlay, 68137637-265f-7e53-098b-043b5891b89e.10:152574013695890545]
[ 51.174s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.cac, 68137637-265f-7e53-098b-043b5891b89e.10:152574013695890545]
[ 51.179s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.broadcast, 68137637-265f-7e53-098b-043b5891b89e.10:152574013695890545]
[ 51.184s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.media, 68137637-265f-7e53-098b-043b5891b89e.10:152574013695890545]
[ 51.189s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.inject, 68137637-265f-7e53-098b-043b5891b89e.10:152574013695890545]
[ 51.194s] [cast.receiver.IpcChannel] Received message: {"data":"{\"type\":\"GET_STATUS\",\"requestId\":401804042}","namespace":"urn:x-cast:com.google.cast.media","senderId":"68137637-265f-7e53-098b-043b5891b89e.10:sender-h2j2ft55n4ty"}
[ 51.199s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 51.204s] [cast.receiver.MediaManager] MediaManager message received
[ 51.209s] [cast.receiver.MediaManager] Dispatching MediaManager getStatus event
[ 51.214s] [cast.framework.common.EventHandler] onEvent for getstatus
[ 51.219s] [cast.receiver.MediaManager] onGetStatus
[ 51.225s] [cast.receiver.MediaManager] onGetStatus: {"requestId":401804042}
[ 51.229s] [cast.receiver.MediaManager] Sending status message to 68137637-265f-7e53-098b-043b5891b89e.10:sender-h2j2ft55n4ty
[ 51.234s] [cast.receiver.IpcChannel] IPC message sent: {"namespace":"urn:x-cast:com.google.cast.media","senderId":"68137637-265f-7e53-098b-043b5891b89e.10:sender-h2j2ft55n4ty","data":"{\"type\":\"MEDIA_STATUS\",\"status\":[],\"requestId\":401804042}"}
And this log shows launching the receiver from desktop first, then trying to connect to it with an Android browser. There's no error when the Android browser attempts to connect, just "Dispatching shutdown event".
[ 0.049s] [cast.receiver.CastReceiverManager] Version: 2.0.0.0060
vendor.js:45467 Angular is running in the development mode. Call enableProdMode() to enable the production mode.
[ 11.726s] [cast.framework.common.libraryloader] library(//www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js) is loaded
[ 11.861s] [cast.framework.common.libraryloader] library(//ajax.googleapis.com/ajax/libs/shaka-player/2.3.5/shaka-player.compiled.js) is loaded
[ 11.921s] [cast.receiver.IpcChannel] Received message: {"data":"{\"applicationId\":\"880C031C\",\"applicationName\":\"The Game\",\"closedCaption\":{},\"deviceCapabilities\":{\"bluetooth_supported\":false,\"display_supported\":true,\"focus_state_supported\":true,\"hi_res_audio_supported\":false},\"launchingSenderId\":\"68137637-265f-7e53-098b-043b5891b89e.10:sender-h2j2ft55n4ty\",\"messagesVersion\":\"1.0\",\"sessionId\":\"a3b50537-b0d1-413f-9638-aab10b105253\",\"type\":\"ready\",\"version\":\"1.29.104827\"}","namespace":"urn:x-cast:com.google.cast.system","senderId":"SystemSender"}
[ 11.936s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 11.949s] [cast.receiver.CastReceiverManager] Dispatching CastReceiverManager system ready event
[ 11.955s] [cast.framework.Application] onReady
[ 11.974s] [cast.receiver.IpcChannel] Received message: {"data":"{\"level\":1.0,\"muted\":false,\"type\":\"volumechanged\"}","namespace":"urn:x-cast:com.google.cast.system","senderId":"SystemSender"}
[ 11.978s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 11.984s] [cast.receiver.CastReceiverManager] Dispatching system volume changed event [1, false]
[ 11.990s] [cast.receiver.IpcChannel] Received message: {"data":"{\"type\":\"visibilitychanged\"}","namespace":"urn:x-cast:com.google.cast.system","senderId":"SystemSender"}
[ 11.994s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 12.000s] [cast.receiver.CastReceiverManager] Ignoring visibility changed event, state is already null
[ 12.005s] [cast.receiver.IpcChannel] Received message: {"data":"{\"type\":\"standbychanged\"}","namespace":"urn:x-cast:com.google.cast.system","senderId":"SystemSender"}
[ 12.010s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 12.015s] [cast.receiver.IpcChannel] Received message: {"data":"{\"hdrType\":\"sdr\",\"type\":\"hdroutputtypechanged\"}","namespace":"urn:x-cast:com.google.cast.system","senderId":"SystemSender"}
[ 12.019s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 12.026s] [cast.receiver.MediaManager] Sending broadcast status message
[ 12.037s] [cast.receiver.IpcChannel] IPC message sent: {"namespace":"urn:x-cast:com.google.cast.media","senderId":"*:*","data":"{\"type\":\"MEDIA_STATUS\",\"status\":[],\"requestId\":0}"}
[ 12.048s] [cast.receiver.IpcChannel] Received message: {"data":"{\"state\":\"IN_FOCUS\",\"type\":\"FOCUS_STATE\"}","namespace":"urn:x-cast:com.google.cast.cac","senderId":"SystemSender"}
[ 12.053s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 12.059s] [cast.framework.common.EventHandler] onEvent for FOCUS_STATE
[ 12.068s] [cast.receiver.IpcChannel] Received message: {"data":"{\"senderId\":\"68137637-265f-7e53-098b-043b5891b89e.10:152574013695890545\",\"type\":\"senderconnected\",\"userAgent\":\"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.139 Safari/537.36\"}","namespace":"urn:x-cast:com.google.cast.system","senderId":"SystemSender"}
[ 12.073s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 12.080s] [cast.receiver.CastReceiverManager] Dispatching CastReceiverManager sender connected event [68137637-265f-7e53-098b-043b5891b89e.10:152574013695890545]
[ 12.086s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.debugoverlay, 68137637-265f-7e53-098b-043b5891b89e.10:152574013695890545]
[ 12.092s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.cac, 68137637-265f-7e53-098b-043b5891b89e.10:152574013695890545]
[ 12.097s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.broadcast, 68137637-265f-7e53-098b-043b5891b89e.10:152574013695890545]
[ 12.102s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.media, 68137637-265f-7e53-098b-043b5891b89e.10:152574013695890545]
[ 12.107s] [cast.receiver.CastMessageBus] Registering sender [urn:x-cast:com.google.cast.inject, 68137637-265f-7e53-098b-043b5891b89e.10:152574013695890545]
[ 12.114s] [cast.receiver.IpcChannel] Received message: {"data":"{\"type\":\"GET_STATUS\",\"requestId\":401804049}","namespace":"urn:x-cast:com.google.cast.media","senderId":"68137637-265f-7e53-098b-043b5891b89e.10:sender-h2j2ft55n4ty"}
[ 12.118s] [cast.receiver.CastMessageBus] Dispatching CastMessageBus message
[ 12.130s] [cast.receiver.MediaManager] MediaManager message received
[ 12.135s] [cast.receiver.MediaManager] Dispatching MediaManager getStatus event
[ 12.141s] [cast.framework.common.EventHandler] onEvent for getstatus
[ 12.148s] [cast.receiver.MediaManager] onGetStatus
[ 12.153s] [cast.receiver.MediaManager] onGetStatus: {"requestId":401804049}
[ 12.158s] [cast.receiver.MediaManager] Sending status message to 68137637-265f-7e53-098b-043b5891b89e.10:sender-h2j2ft55n4ty
[ 12.165s] [cast.receiver.IpcChannel] IPC message sent: {"namespace":"urn:x-cast:com.google.cast.media","senderId":"68137637-265f-7e53-098b-043b5891b89e.10:sender-h2j2ft55n4ty","data":"{\"type\":\"MEDIA_STATUS\",\"status\":[],\"requestId\":401804049}"}
[ 16.458s] [cast.receiver.CastReceiverManager] Dispatching shutdown event

Can't connect to scassandra (stubbed Cassandra) using the DataStax driver

I'm having trouble connecting to Cassandra.
I'm trying to connect to scassandra (a stubbed Cassandra, as can be reviewed here) with the DataStax Node.js Cassandra driver.
For some reason, passing "127.0.0.1:8042" as a contact point to the driver
results in a DriverInternalError (though sometimes it works, seemingly at random, and I haven't figured out why it sometimes does and sometimes doesn't).
The DriverInternalError I get:
{
  "name": "DriverInternalError",
  "stack": "...",
  "message": "Local datacenter could not be determined",
  "info": "Represents a bug inside the driver or in a Cassandra host."
}
This is what I see in the Cassandra driver's log:
log event: info -- Adding host 127.0.0.1:8042
log event: info -- Getting first connection
log event: info -- Connecting to 127.0.0.1:8042
log event: verbose -- Socket connected to 127.0.0.1:8042
log event: info -- Trying to use protocol version 4
log event: verbose -- Sending stream #0
log event: verbose -- Sent stream #0 to 127.0.0.1:8042
{"name":"application-storage","hostname":"Yuris-MacBook-Pro.local","pid":1338,"level":30,"msg":"Kafka producer is initialized","time":"2016-08-05T12:53:53.124Z","v":0}
log event: verbose -- Received frame #0 from 127.0.0.1:8042
log event: info -- Protocol v4 not supported, using v2
log event: verbose -- Done receiving frame #0
log event: verbose -- disconnecting
log event: info -- Connection to 127.0.0.1:8042 closed
log event: info -- Connecting to 127.0.0.1:8042
log event: verbose -- Socket connected to 127.0.0.1:8042
log event: info -- Trying to use protocol version 2
log event: verbose -- Sending stream #0
log event: verbose -- Sent stream #0 to 127.0.0.1:8042
log event: verbose -- Received frame #0 from 127.0.0.1:8042
log event: info -- Connection to 127.0.0.1:8042 opened successfully
log event: info -- Connection pool to host 127.0.0.1:8042 created with 1 connection(s)
log event: info -- Control connection using protocol version 2
log event: info -- Connection acquired to 127.0.0.1:8042, refreshing nodes list
log event: info -- Refreshing local and peers info
log event: verbose -- Sending stream #1
log event: verbose -- Done receiving frame #0
log event: verbose -- Sent stream #1 to 127.0.0.1:8042
log event: verbose -- Received frame #1 from 127.0.0.1:8042
log event: warning -- No local info provided
log event: verbose -- Sending stream #0
log event: verbose -- Done receiving frame #1
log event: verbose -- Sent stream #0 to 127.0.0.1:8042
log event: verbose -- Received frame #0 from 127.0.0.1:8042
log event: info -- Peers info retrieved
log event: error -- Tokenizer could not be determined
log event: info -- Retrieving keyspaces metadata
log event: verbose -- Sending stream #1
log event: verbose -- Done receiving frame #0
log event: verbose -- Sent stream #1 to 127.0.0.1:8042
log event: verbose -- Received frame #1 from 127.0.0.1:8042
log event: verbose -- Sending stream #0
log event: verbose -- Done receiving frame #1
log event: verbose -- Sent stream #0 to 127.0.0.1:8042
log event: verbose -- Received frame #0 from 127.0.0.1:8042
log event: info -- ControlConnection connected to 127.0.0.1:8042 and is up to date
I've tried playing with the firewall and application permissions, but that didn't help.
I'm on Mac OS X El Capitan.
The solution that helped me:
I needed to prime the system.local table with a prime-query-single request:
{
  query: 'prime-query-single',
  header: {'Content-Type': 'application/json'},
  body: {
    "when": {
      "query": "SELECT * FROM system.local WHERE key='local'"
    },
    "then": {
      "rows": [
        {
          "cluster_name": "custom cluster name",
          "partitioner": "org.apache.cassandra.dht.Murmur3Partitioner",
          "data_center": "dc1",
          "rack": "rc1",
          "tokens": ["1743244960790844724"],
          "release_version": "2.0.1"
        }
      ],
      "result": "success",
      "column_types": {
        "tokens": "set<text>"
      }
    }
  }
}

Spark job cannot acquire resources from Mesos cluster

I am using Spark Job Server (SJS) to create contexts and submit jobs.
My cluster includes 4 servers.
master1: 10.197.0.3
master2: 10.197.0.4
master3: 10.197.0.5
master4: 10.197.0.6
But only master1 has a public IP.
First of all, I set up ZooKeeper on master1, master2 and master3, with ZooKeeper IDs from 1 to 3.
I intend master1, master2 and master3 to be the masters of the cluster,
so I set quorum=2 for the 3 masters.
The ZooKeeper connect string is zk://master1:2181,master2:2181,master3:2181/mesos.
On each server I also start mesos-slave, so I have 4 slaves and 3 masters.
As you can see, all slaves are connected.
But the strange thing is that when I create a job to run, it cannot acquire any resources.
From the logs I see that it keeps DECLINEing the offers. These logs are from the master:
I0523 15:01:00.116981 32513 master.cpp:3641] Processing DECLINE call for offers: [ dc18c89f-d802-404b-9221-71f0f15b096f-O4264 ] for framework dc18c89f-d802-404b-9221-71f0f15b096f-0001 (sql_context-1) at scheduler-f5196abd-f420-48c6-b2fe-0306595601d4#10.197.0.3:28765
I0523 15:01:00.117086 32513 master.cpp:3641] Processing DECLINE call for offers: [ dc18c89f-d802-404b-9221-71f0f15b096f-O4265 ] for framework dc18c89f-d802-404b-9221-71f0f15b096f-0001 (sql_context-1) at scheduler-f5196abd-f420-48c6-b2fe-0306595601d4#10.197.0.3:28765
I0523 15:01:01.460502 32508 replica.cpp:673] Replica in VOTING status received a broadcasted recover request from (914)#127.0.0.1:5050
I0523 15:01:02.117753 32510 master.cpp:5324] Sending 1 offers to framework dc18c89f-d802-404b-9221-71f0f15b096f-0000 (sql_context) at scheduler-9b4637cf-4b27-4629-9a73-6019443ed30b#10.197.0.3:28765
I0523 15:01:02.118099 32510 master.cpp:5324] Sending 1 offers to framework dc18c89f-d802-404b-9221-71f0f15b096f-0001 (sql_context-1) at scheduler-f5196abd-f420-48c6-b2fe-0306595601d4#10.197.0.3:28765
I0523 15:01:02.119299 32508 master.cpp:3641] Processing DECLINE call for offers: [ dc18c89f-d802-404b-9221-71f0f15b096f-O4266 ] for framework dc18c89f-d802-404b-9221-71f0f15b096f-0000 (sql_context) at scheduler-9b4637cf-4b27-4629-9a73-6019443ed30b#10.197.0.3:28765
I0523 15:01:02.119858 32515 master.cpp:3641] Processing DECLINE call for offers: [ dc18c89f-d802-404b-9221-71f0f15b096f-O4267 ] for framework dc18c89f-d802-404b-9221-71f0f15b096f-0001 (sql_context-1) at scheduler-f5196abd-f420-48c6-b2fe-0306595601d4#10.197.0.3:28765
I0523 15:01:02.900946 32509 http.cpp:312] HTTP GET for /master/state from 10.197.0.3:35778 with User-Agent='Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.102 Safari/537.36' with X-Forwarded-For='113.161.38.181'
I0523 15:01:03.118147 32514 master.cpp:5324] Sending 1 offers to framework dc18c89f-d802-404b-9221-71f0f15b096f-0001 (sql_context-1) at scheduler-f5196abd-f420-48c6-b2fe-0306595601d4#10.197.0.3:28765
On one of my slaves I see:
W0523 14:53:15.487599 32681 status_update_manager.cpp:475] Resending status update TASK_FAILED (UUID: 3c3a022c-2032-4da1-bbab-c367d46e07de) for task driver-20160523111535-0003 of framework a9871c4b-ab0c-4ddc-8d96-c52faf0e66f7-0019
W0523 14:53:15.487773 32681 status_update_manager.cpp:475] Resending status update TASK_FAILED (UUID: cfb494b3-6484-4394-bd94-80abf2e11ee8) for task driver-20160523112724-0001 of framework a9871c4b-ab0c-4ddc-8d96-c52faf0e66f7-0020
I0523 14:53:15.487820 32680 slave.cpp:3400] Forwarding the update TASK_FAILED (UUID: 3c3a022c-2032-4da1-bbab-c367d46e07de) for task driver-20160523111535-0003 of framework a9871c4b-ab0c-4ddc-8d96-c52faf0e66f7-0019 to master#10.197.0.3:5050
I0523 14:53:15.488008 32680 slave.cpp:3400] Forwarding the update TASK_FAILED (UUID: cfb494b3-6484-4394-bd94-80abf2e11ee8) for task driver-20160523112724-0001 of framework a9871c4b-ab0c-4ddc-8d96-c52faf0e66f7-0020 to master#10.197.0.3:5050
I0523 15:02:24.120436 32680 http.cpp:190] HTTP GET for /slave(1)/state from 113.161.38.181:63097 with User-Agent='Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.102 Safari/537.36'
W0523 15:02:24.165690 32685 slave.cpp:4979] Failed to get resource statistics for executor 'driver-20160523111535-0003' of framework a9871c4b-ab0c-4ddc-8d96-c52faf0e66f7-0019: Container 'cac7667c-3309-4380-9f95-07d9b888e44e' not found
W0523 15:02:24.165771 32685 slave.cpp:4979] Failed to get resource statistics for executor 'driver-20160523112724-0001' of framework a9871c4b-ab0c-4ddc-8d96-c52faf0e66f7-0020: Container '9c661311-bf7f-4ea6-9348-ce8c7f6cfbcb' not found
From the SJS logs:
[2016-05-23 15:04:10,305] DEBUG oarseMesosSchedulerBackend [] [] - Declining offer: dc18c89f-d802-404b-9221-71f0f15b096f-O4565 with attributes: Map() mem: 63403.0 cpu: 8
[2016-05-23 15:04:10,305] DEBUG oarseMesosSchedulerBackend [] [] - Declining offer: dc18c89f-d802-404b-9221-71f0f15b096f-O4566 with attributes: Map() mem: 47244.0 cpu: 8
[2016-05-23 15:04:10,305] DEBUG oarseMesosSchedulerBackend [] [] - Declining offer: dc18c89f-d802-404b-9221-71f0f15b096f-O4567 with attributes: Map() mem: 47244.0 cpu: 8
[2016-05-23 15:04:10,366] WARN cheduler.TaskSchedulerImpl [] [akka://JobServer/user/context-supervisor/sql_context] - Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
[2016-05-23 15:04:10,505] DEBUG cheduler.TaskSchedulerImpl [] [akka://JobServer/user/context-supervisor/sql_context] - parentName: , name: TaskSet_0, runningTasks: 0
[2016-05-23 15:04:11,306] DEBUG oarseMesosSchedulerBackend [] [] - Declining offer: dc18c89f-d802-404b-9221-71f0f15b096f-O4568 with attributes: Map() mem: 47244.0 cpu: 8
[2016-05-23 15:04:11,306] DEBUG oarseMesosSchedulerBackend [] [] - Declining offer: dc18c89f-d802-404b-9221-71f0f15b096f-O4569 with attributes: Map() mem: 63403.0 cpu: 8
[2016-05-23 15:04:11,505] DEBUG cheduler.TaskSchedulerImpl [] [akka://JobServer/user/context-supervisor/sql_context] - parentName: , name: TaskSet_0, runningTasks: 0
[2016-05-23 15:04:12,308] DEBUG oarseMesosSchedulerBackend [] [] - Declining offer: dc18c89f-d802-404b-9221-71f0f15b096f-O4570 with attributes: Map() mem: 47244.0 cpu: 8
[2016-05-23 15:04:12,505] DEBUG cheduler.TaskSchedulerImpl [] [akka://JobServer/user/context-supervisor/sql_context] - parentName: , name: TaskSet_0, runningTasks: 0
In the master2 logs:
May 23 08:19:44 ants-vps mesos-master[1866]: E0523 08:19:44.273349 1902 process.cpp:1958] Failed to shutdown socket with fd 28: Transport endpoint is not connected
May 23 08:19:54 ants-vps mesos-master[1866]: I0523 08:19:54.274245 1899 replica.cpp:673] Replica in VOTING status received a broadcasted recover request from (1257)#127.0.0.1:5050
May 23 08:19:54 ants-vps mesos-master[1866]: E0523 08:19:54.274533 1902 process.cpp:1958] Failed to shutdown socket with fd 28: Transport endpoint is not connected
May 23 08:20:04 ants-vps mesos-master[1866]: I0523 08:20:04.275291 1897 replica.cpp:673] Replica in VOTING status received a broadcasted recover request from (1260)#127.0.0.1:5050
May 23 08:20:04 ants-vps mesos-master[1866]: E0523 08:20:04.275512 1902 process.cpp:1958] Failed to shutdown socket with fd 28: Transport endpoint is not connected
From master3:
May 23 08:21:05 ants-vps mesos-master[22023]: I0523 08:21:05.994082 22042 recover.cpp:193] Received a recover response from a replica in EMPTY status
May 23 08:21:15 ants-vps mesos-master[22023]: I0523 08:21:15.994051 22043 recover.cpp:109] Unable to finish the recover protocol in 10secs, retrying
May 23 08:21:15 ants-vps mesos-master[22023]: I0523 08:21:15.994529 22036 replica.cpp:673] Replica in EMPTY status received a broadcasted recover request from (1282)#127.0.0.1:5050
How can I find the cause of these issues and fix them?

Open tty Serial USB port

I am using a Sierra AirCard modem.
While configuring the dial/PPP port, I open the port (/dev/ttyUSB3) like this:
struct termios tio;
memset(&tio, 0, sizeof(tio));
/* Open the data port; O_NONBLOCK avoids blocking on carrier detect */
if ((fdDataPort = open(portName, O_RDWR | O_NOCTTY | O_SYNC | O_NONBLOCK)) != -1)
{
    printf("After OpenDataPort call\n");
    /* Start from raw mode, then adjust flags */
    cfmakeraw(&tio);
    tio.c_iflag = 0; // was IGNCR
    tio.c_cflag |= CLOCAL | CREAD;
    /* Set the baud rate before applying the attributes */
    cfsetispeed(&tio, B115200);
    cfsetospeed(&tio, B115200);
    tcflush(fdDataPort, TCIOFLUSH);
    if (tcsetattr(fdDataPort, TCSANOW, &tio) != 0)
        perror("tcsetattr");
    printf("After tcsetattr call\n");
    return true;
}
else
{
    return false;
}
This configuration has worked perfectly fine so far for connection establishment, reconnecting, etc.
But I have one problem with this method: if I remove the dongle while this operation is in progress (it only takes a few milliseconds), my physical-device-manager process (which handles device management, mode switching, etc.) is not able to detect the dongle removal, because no message is received from the kernel layer. Also, after I remove the dongle, /dev/ttyUSB3 still persists (ttyUSB0, 1 and 2 are released). Kindly let me know if this is the right way to open the port, or if another method is available. I appreciate your help.
EDIT
Below is the error log from dmesg:
[ 49.463282] 5864 slab pages
[ 49.463286] 943924 pages shared
[ 49.463291] 0 pages swap cached
[ 49.465229] FAT-fs (sda1): utf8 is not a recommended IO charset for FAT filesystems, filesystem will be case sensitive!
[ 49.511839] FAT-fs (sda1): Volume was not properly unmounted. Some data may be corrupt. Please run fsck.
[ 51.120554] usb 1-1: USB disconnect, device number 4
[ 51.153175] sierra ttyUSB0: Sierra USB modem converter now disconnected from ttyUSB0
[ 51.153546] sierra 1-1:1.0: device disconnected
[ 51.185779] sierra ttyUSB1: Sierra USB modem converter now disconnected from ttyUSB1
[ 51.186091] sierra 1-1:1.1: device disconnected
[ 51.233531] sierra ttyUSB2: Sierra USB modem converter now disconnected from ttyUSB2
[ 51.233888] sierra 1-1:1.3: device disconnected
[ 51.242018] sierra ttyUSB3: sierra_submit_rx_urbs: submit urb failed: -19
[ 51.242032] sierra ttyUSB3: sierra_submit_rx_urbs: submit urb failed: -19
[ 51.242040] sierra ttyUSB3: sierra_submit_rx_urbs: submit urb failed: -19
[ 51.242047] sierra ttyUSB3: sierra_submit_rx_urbs: submit urb failed: -19
[ 51.242054] sierra ttyUSB3: sierra_submit_rx_urbs: submit urb failed: -19
[ 51.242060] sierra ttyUSB3: sierra_submit_rx_urbs: submit urb failed: -19
[ 51.242066] sierra ttyUSB3: sierra_submit_rx_urbs: submit urb failed: -19
[ 51.242073] sierra ttyUSB3: sierra_submit_rx_urbs: submit urb failed: -19
[ 51.617553] sd 1:0:0:0: [sda] Unhandled error code
[ 51.617569] sd 1:0:0:0: [sda]
[ 51.617575] Result: hostbyte=0x07 driverbyte=0x00
[ 51.617582] sd 1:0:0:0: [sda] CDB:
[ 51.617587] cdb[0]=0x28: 28 00 00 00 0d 27 00 00 01 00
[ 51.617619] end_request: I/O error, dev sda, sector 3367
[ 51.617674] sd 1:0:0:0: [sda] Unhandled error code
[ 51.617682] sd 1:0:0:0: [sda]
[ 51.617687] Result: hostbyte=0x07 driverbyte=0x00
[ 51.617693] sd 1:0:0:0: [sda] CDB:
[ 51.617698] cdb[0]=0x28: 28 00 00 00 0d 28 00 00 01 00
I am stuck; please help.

passenger not spawning app process

I compiled nginx with Passenger support, but after I start nginx (with Passenger), my Node.js app is not started.
Here are some more details of my configuration.
nginx configuration file:
server {
    listen 80;
    server_name example.com www.example.com;

    location / {
        root /var/www/nodejs;
        index index.html index.htm index.php;
    }

    location ~ ^/letsplay(/.*|$) {
        alias /var/www/nodejs/letsplay/public$1;
        passenger_base_uri /letsplay;
        passenger_app_root /var/www/nodejs/letsplay;
        passenger_document_root /var/www/nodejs/letsplay/public;
        passenger_enabled on;
        passenger_startup_file restserver.js;
    }
}
Output of passenger-status:
Version : 4.0.45
Date : 2014-06-30 18:19:47 +0000
Instance: 19879
----------- General information -----------
Max pool size : 6
Processes : 0
Requests in top-level queue : 0
----------- Application groups -----------
This is the nginx error log:
2014/06/30 18:33:42 [notice] 20046#0: using the "epoll" event method
[ 2014-06-30 18:33:42.0225 20047/7fd9f3cec780 agents/Base.cpp:1599 ]: Random seed: 1404153222
[ 2014-06-30 18:33:42.0226 20047/7fd9f3cec780 agents/Watchdog/Main.cpp:698 ]: Starting Watchdog...
[ 2014-06-30 18:33:42.0231 20047/7fd9f3cec780 agents/Watchdog/Main.cpp:538 ]: Options: { 'analytics_log_user' => 'nobody', 'default_group' => 'nobody', 'default_python' => 'python', 'default_ruby' => 'ruby', 'default_user' => 'nobody', 'log_level' => '2', 'max_pool_size' => '6', 'passenger_root' => '/home/danny/programms/passenger', 'passenger_version' => '4.0.45', 'pool_idle_time' => '300', 'temp_dir' => '/tmp', 'union_station_gateway_address' => 'gateway.unionstationapp.com', 'union_station_gateway_port' => '443', 'user_switching' => 'true', 'web_server_passenger_version' => '4.0.45', 'web_server_pid' => '20046', 'web_server_type' => 'nginx', 'web_server_worker_gid' => '996', 'web_server_worker_uid' => '997' }
[ 2014-06-30 18:33:42.0280 20050/7fd6340c0780 agents/Base.cpp:1599 ]: Random seed: 1404153222
[ 2014-06-30 18:33:42.0281 20050/7fd6340c0780 agents/HelperAgent/Main.cpp:642 ]: Starting PassengerHelperAgent...
[ 2014-06-30 18:33:42.0297 20050/7fd6340c0780 agents/HelperAgent/Main.cpp:649 ]: PassengerHelperAgent online, listening at unix:/tmp/passenger.1.0.20046/generation-0/request
[ 2014-06-30 18:33:42.0367 20058/7f856ed59880 agents/Base.cpp:1599 ]: Random seed: 1404153222
[ 2014-06-30 18:33:42.0369 20058/7f856ed59880 agents/LoggingAgent/Main.cpp:333 ]: Starting PassengerLoggingAgent...
[ 2014-06-30 18:33:42.0377 20058/7f856ed59880 agents/LoggingAgent/Main.cpp:321 ]: PassengerLoggingAgent online, listening at unix:/tmp/passenger.1.0.20046/generation-0/logging
[ 2014-06-30 18:33:42.0379 20047/7fd9f3cec780 agents/Watchdog/Main.cpp:728 ]: All Phusion Passenger agents started!
2014/06/30 18:33:42 [notice] 20046#0: nginx/1.6.0
2014/06/30 18:33:42 [notice] 20046#0: built by gcc 4.8.2 20131212 (Red Hat 4.8.2-7) (GCC)
2014/06/30 18:33:42 [notice] 20046#0: OS: Linux 3.14.5-x86_64-linode42
2014/06/30 18:33:42 [notice] 20046#0: getrlimit(RLIMIT_NOFILE): 1024:4096
2014/06/30 18:33:42 [notice] 20065#0: start worker processes
2014/06/30 18:33:42 [notice] 20065#0: start worker process 20066
[ 2014-06-30 18:33:42.6525 20000/7f1a8e1ea880 agents/LoggingAgent/Main.cpp:344 ]: Logging agent exiting with code 0.
[ 2014-06-30 18:33:42.6562 19992/7fd7b6257780 agents/HelperAgent/Main.cpp:605 ]: It's now 5 seconds after all clients have disconnected. Proceeding with graceful exit.
[ 2014-06-30 18:33:42.6563 19992/7fd7b6257780 agents/HelperAgent/Main.cpp:506 ]: Shutting down helper agent...
[ 2014-06-30 18:33:42.6566 19992/7fd7b6257780 agents/HelperAgent/Main.cpp:513 ]: Destroying application pool...
[ 2014-06-30 18:33:42.6745 20045/7f8b0fc0e780 agents/Watchdog/Main.cpp:388 ]: All Phusion Passenger agent processes have exited. Forcing all subprocesses to shut down.
[ 2014-06-30 18:33:42.6745 20045/7f8b0fc0e780 agents/Watchdog/Main.cpp:390 ]: Sending SIGTERM
[ 2014-06-30 18:33:43.6748 20045/7f8b0fc0e780 agents/Watchdog/Main.cpp:395 ]: Sending SIGKILL
[ 2014-06-30 18:33:45.0296 20050/7fd6340ad700 Pool2/Pool.h:827 ]: Analytics collection time...
[ 2014-06-30 18:33:45.0300 20050/7fd6340ad700 Pool2/Pool.h:930 ]: Analytics collection done; next analytics collection in 4.970 sec