I have come across the following SPI slave code at this site:
#include <SPI.h>

char buf[100];
volatile byte pos;
volatile bool process_it;

void setup (void)
{
  Serial.begin (115200);   // debugging

  // turn on SPI in slave mode
  SPCR |= bit (SPE);

  // have to send on master in, *slave out*
  pinMode (MISO, OUTPUT);

  // get ready for an interrupt
  pos = 0;              // buffer empty
  process_it = false;

  // now turn on interrupts
  SPI.attachInterrupt();
}  // end of setup

// SPI interrupt routine
ISR (SPI_STC_vect)
{
  byte c = SPDR;  // grab byte from SPI Data Register

  // add to buffer if room
  if (pos < sizeof buf)
    {
    buf [pos++] = c;

    // example: newline means time to process buffer
    if (c == '\n')
      process_it = true;
    }  // end of room available
}  // end of interrupt routine SPI_STC_vect

// main loop - wait for flag set in interrupt routine
void loop (void)
{
  if (process_it)
    {
    buf [pos] = 0;
    Serial.println (buf);
    pos = 0;
    process_it = false;
    }  // end of flag set
}  // end of loop
How can I modify the above code so that it responds to an input clock signal, as in this SSI protocol? An Uno or a Nano can be used.
Basically, I want an Arduino to act as an absolute encoder, transmitting 13-bit binary data (it can be any fixed binary pattern) as shown in the SSI protocol above. So the Arduino should output the 13-bit word with the start bit and the timeout tm taken into account, as shown in the document above, without parity.
The master's code I plan to use:
const int CLOCK_PIN = 5;  // PD5: must match the direct port writes in shiftIn()
const int DATA_PIN = 3;
const int BIT_COUNT = 16;

void setup() {
  pinMode(DATA_PIN, INPUT);
  pinMode(CLOCK_PIN, OUTPUT);
  digitalWrite(CLOCK_PIN, HIGH);
  Serial.begin(115200);
}

void loop() {
  float reading = readPosition();
  Serial.println(reading, 2);
  delay(25);
}

// read the current angular position
float readPosition() {
  unsigned long graysample = shiftIn(DATA_PIN, CLOCK_PIN, BIT_COUNT);
  delayMicroseconds(100);  // clock must be high for 20 microseconds before a new sample can be taken
  unsigned long binarysample = grayToBinary32(graysample);
  return ((binarysample * 360UL) / 65536.0);  // output value from 0 to 360 with two-decimal precision
}

// read bit_count bits of data from the digital input of the board
unsigned long shiftIn(const int data_pin, const int clock_pin, const int bit_count) {
  unsigned long data = 0;
  for (int i = 0; i < bit_count; i++) {
    data <<= 1;  // shift all previously read data left one bit
    //digitalWrite(clock_pin, LOW);
    PORTD &= ~(1 << 5);  // clock pin (digital 5 = PD5) goes low
    delayMicroseconds(1);
    //digitalWrite(clock_pin, HIGH);
    PORTD |= (1 << 5);   // clock pin goes high
    delayMicroseconds(1);
    data |= digitalRead(data_pin);  // append the newly read bit to the data
  }
  return data;
}

// use a 32-bit type: on AVR, int is 16 bits, so the >> 16 stage
// would be lost with unsigned int
unsigned long grayToBinary32(unsigned long num)
{
  num = num ^ (num >> 16);
  num = num ^ (num >> 8);
  num = num ^ (num >> 4);
  num = num ^ (num >> 2);
  num = num ^ (num >> 1);
  return num;
}
1 Answer
You may try to bit-bang the protocol using direct port access. This is less convenient, but way faster than digitalWrite(). Below is an example, untested program. I am assuming the pinout is
- data = PB0 = digital 8
- clock = PB1 = digital 9
- pins PB2 – PB5 (10 – 13) are unused
Edit: This updated program uses the clock as an input, and the data as an output. The previous version used both lines as outputs, which was due to a misunderstanding of mine.
#include <avr/io.h>

int main(void)
{
    uint16_t data = 0x155;
    DDRB |= _BV(PB0);
    PORTB |= _BV(PB0);
    TCCR1B = _BV(CS10);  // start timer 1 @ F_CPU
    loop_until_bit_is_clear(PINB, PB1);
    for (;;) {
        loop_until_bit_is_set(PINB, PB1);
        PORTB = data & 1;
        data >>= 1;
        TCNT1 = -304;        // 20 us * 16 MHz - 16 cycles
        TIFR1 |= _BV(TOV1);  // clear overflow flag
        while (bit_is_set(PINB, PB1)) {
            if (bit_is_set(TIFR1, TOV1)) {  // timer overflowed
                PORTB = 1;                  // data high
                data = 0x155;
                loop_until_bit_is_clear(PINB, PB1);
                break;
            }
        }
    }
}
This should be able to operate at frequencies up to about 200 – 250 kHz. There is a delay of about 0.5 μs between the rising edge of the clock and the data being sent out, with a jitter of 3 clock cycles.
The data word is sent least significant bit first, contrary to what the protocol diagram shows. However, if you send arbitrary data, this should not be an issue.
Note that this is a pure avr-libc program, which does not use the Arduino core. It uses polling to check the inputs and the timer. The interrupts are never enabled.
- Thank you very much! I will try this with the master code I have. One Arduino will be the slave and the other the master. Your code simulates the slave, if I'm not wrong, but there's one thing I'm a bit confused about. In this protocol the slave is supposed to receive the clock signal, but when I check the digital 9 pin with a scope, it outputs a bit stream very similar to digital 8. I thought the slave would only receive clock signals as inputs from the master, yet in your case the slave seems to be outputting a bit stream. Am I getting something wrong here? – floppy380, Oct 22, 2019 at 9:49
- Re "In this protocol slave supposed to receive clock signal": Oh, I see, I wasn't aware of this. – Edgar Bonet, Oct 22, 2019 at 10:03
- Please see that at the end of my question I have now added the master code I found for a 16-bit encoder. The code is written for the master and converts the Gray code to binary. (Usually absolute encoders output Gray code for better efficiency.) But the slave side is supposed to receive the clock in such a way that on each clock pulse it outputs one bit. More info about this protocol here: posital.com/media/posital_media/documents/… My aim is to try this master code with a simulated slave. The slave can send any angle data; constant data is better for checking the output. – floppy380, Oct 22, 2019 at 10:19
- This works great! I just need to add, in the master code after the read, `PORTD &= ~(1 << 5); delayMicroseconds(1); PORTD |= (1 << 5); delayMicroseconds(1);` (one extra clock pulse) to exactly mimic the required pattern. I checked, and it works great now, thank you. – floppy380, Oct 22, 2019 at 14:51