I'm trying to retrieve my sensor data on my Raspberry Pi using an nRF24L01+ as the network receiver. I'm sending it from an Arduino Nano board. Here is the configuration dump from my Arduino:
STATUS = 0x0e RX_DR=0 TX_DS=0 MAX_RT=0 RX_P_NO=7 TX_FULL=0
RX_ADDR_P0-1 = 0xcccccc3ccc 0xcccccc3c3c
RX_ADDR_P2-5 = 0x33 0xce 0x3e 0xe3
TX_ADDR = 0xcccccccc3c
RX_PW_P0-6 = 0x20 0x20 0x20 0x20 0x20 0x20
EN_AA = 0x3e
EN_RXADDR = 0x3f
RF_CH = 0x5a
RF_SETUP = 0x07
CONFIG = 0x0f
DYNPD/FEATURE = 0x3f 0x04
Data Rate = 1MBPS
Model = nRF24L01+
CRC Length = 16 bits
PA Power = PA_MAX
My Raspberry Pi is connected to the nRF24L01+ through GPIO. I made sure the wiring is OK by running the C++ example from https://github.com/TMRh20/RF24:
RF24 radio(RPI_BPLUS_GPIO_J8_15,RPI_BPLUS_GPIO_J8_24, BCM2835_SPI_SPEED_8MHZ);
The data comes through fine. Now I want to use a Node.js program to read the same data. I'm using this library: https://github.com/natevw/node-nrf
The code is very simple, but somehow it is not working (the console stays silent):
var spiDev = "/dev/spidev0.0";
var cePin = 15; //RPI_BPLUS_GPIO_J8_15
var irqPin = null;
var channel = 0x5a; //90
var radio = require('nrf').connect(spiDev, cePin, irqPin);
radio
.channel(channel)
.dataRate('1Mbps')
.crcBytes(1)
// .autoRetransmit({count:15, delay:4000})
;
radio.begin(function () {
    var rx = radio.openPipe('rx', 0xcccccccc3c);
    rx.pipe(process.stdout);
});
I'm wondering what I'm doing wrong. The hardware is OK and the settings seem right; what do you think?
Thanks
Usually, to find out what is wrong with an nRF24 link, you should start from the basics:
Try a simpler nRF configuration to test whether the link works at all, especially with no CRC bytes etc.
Try it without dynamic payloads, using a fixed payload size on both ends.
Auto-acknowledge can also be an issue (note that when auto-ack is enabled, CRC cannot be disabled, as it is used to verify the acknowledgement in this mode).
Ensure that the CRC lengths match on both ends. In your example, the Arduino reports CRC Length = 16 bits, whereas the Raspberry Pi is configured with radio.crcBytes(1), i.e. an 8-bit CRC.
Don't rely on default values; always provide the same full configuration on both ends.
These steps can considerably reduce the time needed to locate the problem, especially when different libraries and platforms are involved.
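As a concrete starting point for your pair, here is a minimal sketch of the receiver side with the CRC mismatch fixed (same node-nrf calls and the same spiDev/cePin/irqPin variables as in your code; crcBytes(2) matches the Arduino's 16-bit CRC):

var radio = require('nrf').connect(spiDev, cePin, irqPin);
radio
    .channel(0x5a)     // must match the Arduino's RF_CH
    .dataRate('1Mbps') // must match the Arduino's data rate
    .crcBytes(2)       // 16-bit CRC, matching "CRC Length = 16 bits"
    ;
radio.begin(function () {
    var rx = radio.openPipe('rx', 0xcccccccc3c); // the Arduino's TX_ADDR
    rx.pipe(process.stdout);
});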
I've been working with the PIC18F55K42 chip for a while. The PIC is set up as an I2C slave and it receives bytes correctly, but I've encountered a few problems.
For example, when I do:
i2cset -y 1 0x54 0x80 0x01
On the controller side it looks correct: I can see the address 0x80 (the data address) and the byte value 0x01.
When I send in block mode like:
i2cset -y 1 0x54 0x80 0x01 0x02 0x03 0x04 i
I see spurious bytes appearing on the controller. More precisely, it looks like this:
ADDRESS 80 6c 00 2f 01 02 03 04 STOP
At first I thought this was something to do with my controller, and I even tried digging into its clock settings. I used a Saleae logic analyser too. There's nothing wrong with the controller or its setup. The only place left I can think of is the complex onion of driver layers in Linux.
I'd like to know why Linux is sending the 3 extra bytes (6c 00 2f). Why does i2c_smbus_write_block_data send extra bytes, and how can that be avoided?
It's a bug in the i2cset implementation in Busybox. See miscutils/i2c_tools.c:
/* Prepare the value(s) to be written according to current mode. */
switch (mode) {
case I2C_SMBUS_BYTE_DATA:
val = xstrtou_range(argv[3], 0, 0, 0xff);
break;
case I2C_SMBUS_WORD_DATA:
val = xstrtou_range(argv[3], 0, 0, 0xffff);
break;
case I2C_SMBUS_BLOCK_DATA:
case I2C_SMBUS_I2C_BLOCK_DATA:
for (blen = 3; blen < (argc - 1); blen++)
block[blen] = xstrtou_range(argv[blen], 0, 0, 0xff);
val = -1;
break;
default:
val = -1;
break;
}
It should be block[blen - 3] = xstrtou_range(argv[blen], 0, 0, 0xff);. The bug results in 3 extra garbage bytes from the stack being sent.
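With that fix applied, the block is filled from index 0 and only the user-supplied bytes go on the wire:

case I2C_SMBUS_BLOCK_DATA:
case I2C_SMBUS_I2C_BLOCK_DATA:
	for (blen = 3; blen < (argc - 1); blen++)
		block[blen - 3] = xstrtou_range(argv[blen], 0, 0, 0xff);
	val = -1;
	break;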
Also, use i2c_smbus_write_i2c_block_data for raw I2C transfers. i2c_smbus_write_block_data transfers data using the SMBus block protocol, which inserts an extra count byte after the command byte.
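For illustration, the two calls differ on the wire roughly like this (a sketch; the fd handling is assumed, not from the question):

#include <linux/i2c-dev.h>
#include <i2c/smbus.h>  /* libi2c from i2c-tools; on older systems these helpers come from linux/i2c-dev.h alone */

/* fd: descriptor for /dev/i2c-1, already bound to the slave with ioctl(fd, I2C_SLAVE, 0x54) */
static void demo_block_writes(int fd)
{
    __u8 data[4] = { 0x01, 0x02, 0x03, 0x04 };

    /* SMBus block write: 0x80, then a count byte (0x04), then the data */
    i2c_smbus_write_block_data(fd, 0x80, 4, data);

    /* plain I2C block write: 0x80 followed directly by the data bytes */
    i2c_smbus_write_i2c_block_data(fd, 0x80, 4, data);
}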
I have started working on I2C communication by experimenting with the ADXL345 sensor. I wrote some basic code to test whether my code works or not. According to the ADXL345 technical documentation, register 0x00 should return the device ID, which is 0xE5. When I read this register, the return value is 0. This application should be basic, but I guess I'm still missing something. Besides my own attempts, I also searched this community for ADXL345 problems, but I could not find an answer. I would appreciate it if you could guide me with this problem. I've attached my code.
void SysTick_Handler(void) {
    HAL_IncTick();
    HAL_SYSTICK_IRQHandler();
}

void SysClockEn();

/* System configuration: PA8 -> I2C clock, PC9 -> I2C data line */
int main() {
    SysClockEn();
    HAL_Init();

    /*------GPIO configuration for I2C3------*/
    __GPIOA_CLK_ENABLE();
    GPIO_InitTypeDef *ptrB6, addrB6;
    ptrB6 = &addrB6;
    ptrB6->Alternate = GPIO_AF4_I2C3;
    ptrB6->Pin = GPIO_PIN_8;
    ptrB6->Pull = GPIO_NOPULL;
    ptrB6->Speed = GPIO_SPEED_FREQ_HIGH;
    ptrB6->Mode = GPIO_MODE_AF_OD;
    HAL_GPIO_Init(GPIOA, ptrB6);

    __GPIOC_CLK_ENABLE();
    GPIO_InitTypeDef *ptrC, addrC;
    ptrC = &addrC;
    ptrC->Alternate = GPIO_AF4_I2C3;
    ptrC->Mode = GPIO_MODE_AF_OD;
    ptrC->Pin = GPIO_PIN_9;
    ptrC->Pull = GPIO_NOPULL;
    ptrC->Speed = GPIO_SPEED_FREQ_HIGH;
    HAL_GPIO_Init(GPIOC, ptrC);

    /*-----I2C configuration-----*/
    //__HAL_RCC_I2C3_CLK_ENABLE();
    __I2C3_CLK_ENABLE();
    I2C_HandleTypeDef *ptrI2C, addrI2C;
    ptrI2C = &addrI2C;
    ptrI2C->Instance = I2C3;
    ptrI2C->Init.ClockSpeed = 100000; // 100 kHz
    ptrI2C->Init.DutyCycle = I2C_DUTYCYCLE_2;
    ptrI2C->Init.AddressingMode = I2C_ADDRESSINGMODE_7BIT;
    ptrI2C->Mode = HAL_I2C_MODE_MASTER;
    //ptrI2C->Init.GeneralCallMode = I2C_GENERALCALL_DISABLE;
    //ptrI2C->Init.NoStretchMode = I2C_NOSTRETCH_DISABLE;
    HAL_I2C_Init(ptrI2C);
    __HAL_I2C_ENABLE(ptrI2C);

    uint8_t data = 0x00;
    unsigned char buffer[2];
    uint8_t *buf;
    unsigned char pt;
    uint32_t ptr;
    uint8_t val;

    while (1) {
        val = HAL_I2C_IsDeviceReady(ptrI2C, 0x1D, 0xe5, 1000);
        pt = HAL_I2C_GetState(ptrI2C);
        //HAL_I2C_Master_Transmit(ptrI2C,0x1d,0x00,1,0);
        //HAL_I2C_Master_Receive(ptrI2C,0x1d,buffer,1,100);
        //HAL_Delay(2);
        HAL_I2C_Mem_Read(ptrI2C, SensAddr, 0x00, 1, buffer, 2, 1000);
        ptr = HAL_I2C_GetError(ptrI2C);
    }
}

void SysClockEn() {
    __PWR_CLK_ENABLE();
}
void SysClockEn(){
__PWR_CLK_ENABLE();
}
The documentation of the sensor says:
the 7-bit I2C address for the device is 0x1D
So in your code you should write:
#define SensAddr (0x1D<<1)
...
HAL_I2C_Mem_Read(ptrI2C,SensAddr,0x00,1,buffer,2,1000);
...
This is because the ST HAL expects the 7-bit address to be passed left-shifted by one bit.
The documentation also says:
An alternate I2C address of 0x53 (followed by the R/W bit) can be chosen by grounding the SDO/ALT ADDRESS pin
If this is the case for your hardware, change the code to:
#define SensAddr (0x53<<1)
...
HAL_I2C_Mem_Read(ptrI2C,SensAddr,0x00,1,buffer,2,1000);
...
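With the shifted address in place, a quick sanity check is to read the DEVID register and compare it against 0xE5 (a sketch reusing the ptrI2C handle and the SensAddr define from above):

uint8_t devid = 0;
HAL_I2C_Mem_Read(ptrI2C, SensAddr, 0x00, 1, &devid, 1, 1000);
if (devid == 0xE5) {
    /* the ADXL345 is responding on the bus */
}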
I have a Bluetooth module connected to an AVR (ATmega32A) via UART. Some bytes transmitted from the Bluetooth module to the AVR are not received properly.
For example, these bytes are transmitted/received correctly (UTF-8):
Bluetooth module transmits byte X -> received byte X'
'w'->'w'
's'->'s'
'z'->'z'
'm'->'m'
Bytes that are not received properly:
'q'->'y'
'p'->'~'
'1'->'9'
Bluetooth connection settings:
Bps/Par/Bits: 115200 8N1
init UART:
#define F_CLK 16000000
#define BAUD 115200
uint16_t ubrr_value = (uint16_t) (((F_CLK)/(16 * BAUD)) - 1);
UBRRL = ubrr_value;
UBRRH = (ubrr_value>>8);
// 8 bit frame, async mode
UCSRC=(1<<URSEL) | (3<<UCSZ0);
//recive and transmit mode
UCSRB = (1<<TXEN) | (1 << RXEN);
Transmit/receive a byte over UART:
char USART_ReceiveByte()
{
while(!(UCSRA & (1<<RXC)));
return UDR;
}
void uart_sendRS(char VALUE)
{
while(!(UCSRA & (1<<UDRE)));
UDR = VALUE;
}
main loop:
while(1)
{
recivedByte = USART_ReceiveByte();
uart_sendRS(recivedByte);
}
I would be glad to know why it does not work properly.
EDIT: if I change the order, the result is:
'y'->'y'
'~'->'~'
'9'->'9'
EDIT2: probably there is something wrong with how UBRRL and UBRRH are set (ubrr_value = 7 in this case). Can someone confirm whether this is correct, and whether the microcontroller can handle such a high baud rate?
#define F_CLK 16000000
#define BAUD 115200
uint16_t ubrr_value = (uint16_t) (((F_CLK)/(16 * BAUD)) - 1);
UBRRL = ubrr_value;
UBRRH = (ubrr_value>>8);
The problem here is that you are not initialising the UART properly: at 16 MHz, normal-speed mode cannot hit 115200 baud accurately, so you need to set the U2X bit in UCSRA to get the rate you configured. If you are using avr-libc, you can use the following code to compute the baud-rate registers properly.
void uart0_init(void) {
# define BAUD 115200
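/* setbaud.h also requires F_CPU to be defined (16000000UL for this board), typically passed on the compiler command line */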
# include <util/setbaud.h>
UBRRH = UBRRH_VALUE;
UBRRL = UBRRL_VALUE;
# if USE_2X
UCSRA |= _BV(U2X);
# else
UCSRA &= ~_BV(U2X);
# endif
# undef BAUD
/* other uart stuff you may need */
}
If you look at the datasheet for your microcontroller, section 20.12, you will find a table with this information precomputed for you. Cheers.
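To answer EDIT2 directly, the arithmetic (from the datasheet formulas) is:

Normal speed: UBRR = 16000000 / (16 * 115200) - 1 = 7 (truncated), actual baud = 16000000 / (16 * (7 + 1)) = 125000, error = (125000 - 115200) / 115200 ≈ +8.5%.
Double speed (U2X = 1): UBRR = 16000000 / (8 * 115200) - 1 = 16 (truncated), actual baud = 16000000 / (8 * (16 + 1)) ≈ 117647, error ≈ +2.1%.

An error of +8.5% is far beyond what the receiver tolerates (the datasheet recommends staying within roughly ±2%), while +2.1% is the figure the datasheet's own baud-rate table lists for 115200 at 16 MHz with U2X set. Note that 'q' (0x71) arriving as 'y' (0x79) is a single bit sampled wrong, which is typical of marginal baud timing.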
Specification given in the manual of a meter that uses Modbus as its communication protocol:
“dword” refers to 32-bit unsigned integer using two data addresses and 4 bytes
of memory with high word at the front and low word at the end, it varies from 0
to 4294967295. rx = high word *65536+low word
I have this hexValue: 00003ef5,
where the high word = 0000
and the low word = 3ef5,
so rx would be 0x3ef5, which converted to decimal gives 16117.
For now I have used this:
var rx = hexValue.substr(0,4)*65536 + hexValue.substr(4,8);
console.log(parseInt(rx,16));
Is there a way to convert my hex value directly to a dword using the Node.js Buffer library, or any other method better than this?
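Node's Buffer can do this directly; a minimal sketch (assuming hexValue is always 8 hex digits in the big-endian order the manual describes):

var hexValue = '00003ef5';
// parse the hex string into 4 raw bytes, then read them as one
// 32-bit unsigned big-endian integer (high word * 65536 + low word)
var rx = Buffer.from(hexValue, 'hex').readUInt32BE(0);
console.log(rx); // 16117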
I'm trying to port some Windows code that uses HidD_GetInputReport to Linux using libusb. From what I can tell I need to make a call to usb_control_msg, but I'm having trouble figuring out what parameters to pass in.
The report ID I'm after is 0x01. Here is what I have so far.
#define HID_GET_REPORT 0x01
#define HID_REPORT_TYPE_INPUT 0x01
#define INTERFACE_NUMBER 0x00
int reportId = 0x01;
int bytesSent = usb_control_msg(
devHandle,
USB_ENDPOINT_IN | USB_TYPE_CLASS | USB_RECIP_INTERFACE,
HID_GET_REPORT,
(HID_REPORT_TYPE_INPUT << 8) | reportId,
INTERFACE_NUMBER,
buf,
sizeof(buf),
10000);
I'm really not sure about HID_GET_REPORT, HID_REPORT_TYPE_INPUT, and INTERFACE_NUMBER. I found them in an example on the web. Changing the various values does result in different return codes but those don't appear to be documented anywhere either.
Looks like you need to detach the kernel driver and claim the interface before making the other calls. I made calls to the following two functions and that fixed the problem:
usb_detach_kernel_driver_np
usb_claim_interface
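Put together, the sequence looks roughly like this (a sketch against the libusb-0.1 API used in the question; error handling omitted):

/* hand the interface over from the kernel HID driver to this process */
usb_detach_kernel_driver_np(devHandle, INTERFACE_NUMBER);
usb_claim_interface(devHandle, INTERFACE_NUMBER);

/* the GET_REPORT control transfer from the question, unchanged */
int bytesRead = usb_control_msg(
    devHandle,
    USB_ENDPOINT_IN | USB_TYPE_CLASS | USB_RECIP_INTERFACE,
    HID_GET_REPORT,
    (HID_REPORT_TYPE_INPUT << 8) | reportId,
    INTERFACE_NUMBER,
    buf,
    sizeof(buf),
    10000);

usb_release_interface(devHandle, INTERFACE_NUMBER);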