Question: Is it possible to parse this bit layout? #346
This (unsurprisingly) parses correctly, but loses the bitfield definitions for the u32 type; that's the one I can't figure out.

```rust
#[derive(Debug, Copy, Clone, PartialEq, DekuRead, DekuWrite)]
#[deku(endian = "little")]
struct TestPacket {
    f0: u32,
    #[deku(bits = "10", pad_bits_after = "6")]
    f4: u16,
    f5: u16,
    f6: u16,
    f7: u16,
}
```
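In the meantime the three 10-bit subfields can still be recovered from `f0` by hand; a minimal sketch (the accessor names and the LSb-first packing are assumptions based on the expected values given later in this thread):

```rust
impl TestPacket {
    // Hypothetical accessors: each subfield is 10 bits, packed
    // LSb-first into the little-endian u32 `f0`.
    fn f1(&self) -> u32 {
        self.f0 & 0x3FF
    }
    fn f2(&self) -> u32 {
        (self.f0 >> 10) & 0x3FF
    }
    fn f3(&self) -> u32 {
        (self.f0 >> 20) & 0x3FF
    }
}
```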
Sorry @LeonardMH, there hasn't been any progress on this. My current recommendation is to massage your LSB data into MSB, if possible, before feeding it to deku. However, this may not be possible/ideal. An alternative is to reverse the bits and have your struct defined in reverse... also not ideal. Finally, you could use a custom reader/writer.
To help debug, there's a logging feature: enable deku's `logging` cargo feature and call `env_logger::init();` at the start of your program.
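For reference, a minimal sketch of that setup (it assumes deku is built with its `logging` cargo feature and that `env_logger` is available; run with `RUST_LOG=trace` to see the parse steps):

```rust
use deku::prelude::*;

// Assumes Cargo.toml contains something like:
//   deku = { version = "...", features = ["logging"] }
//   env_logger = "..."
#[derive(Debug, PartialEq, DekuRead, DekuWrite)]
struct Probe {
    #[deku(bits = 10)]
    a: u16,
    #[deku(bits = 6)]
    b: u16,
}

fn main() {
    env_logger::init(); // deku's internal trace output is now visible

    let data: &[u8] = &[0x01, 0xDC];
    let ((_rest, _bit_offset), probe) = Probe::from_bytes((data, 0)).unwrap();
    println!("{:?}", probe);
}
```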
I'm a bit confused by your test case; for example, how are the first 10 bits 1?
Thanks for the quick response; hopefully I will be able to figure this out now that I know how to access the logging feature. As for "the first 10 bits being 1", that's not exactly what I expect. In my mind, how this would work is that the first 32 bits would be interpreted as a little-endian u32.
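Concretely, this is just the arithmetic of that mental model, using the test data and expected values that appear elsewhere in this thread:

```rust
fn main() {
    // First four wire bytes, read as a little-endian u32.
    let word1 = u32::from_le_bytes([0x01, 0xDC, 0x88, 0x23]);
    assert_eq!(word1, 0x2388DC01);

    // The three 10-bit fields, taken from the least significant end up.
    assert_eq!(word1 & 0x3FF, 0x001);         // f1: the "first 10 bits"
    assert_eq!((word1 >> 10) & 0x3FF, 0x237); // f2
    assert_eq!((word1 >> 20) & 0x3FF, 0x238); // f3
}
```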
If you take the "first 10 bits" of that u32, from the least significant end, you do get the 1 I expect. With that in mind, one of the variants I tried (fields reversed for MSb-first parsing) was:

```rust
#[derive(Debug, Copy, Clone, PartialEq, DekuRead, DekuWrite)]
#[deku(endian = "little")]
struct TestPacket {
    // Word 1: Bytes 1-4
    #[deku(bits = 10, pad_bits_before = "2")]
    f3: u32,
    #[deku(bits = 10)]
    f2: u32,
    #[deku(bits = 10)]
    f1: u32,
    // Bytes 5-6
    #[deku(bits = 10, pad_bits_after = "6")]
    f4: u16,
    f5: u16, // Bytes 7-8
    f6: u16, // Bytes 9-10
    f7: u16, // Bytes 11-12
}
```

But that fails with:
I'll keep poking around with the logging and see if I can wrap my head around what order the individual bits are being passed in. As for this:

> Sorry @LeonardMH, there hasn't been any progress on this.
No need for apologies; this library is very impressive and almost exactly what I want. I have hundreds of these types of structs to define, so if I can get this working you'll save me hours and hours of writing manual bitshifts and masks by hand. I can deal with a little bit of weirdness; it would just be a nice quality-of-life upgrade.
I forked the repo and changed all instances of Msb0 to Lsb0 just to test that out; I am starting to think Msb0 is fundamentally incompatible with how I want to parse this data. If I can get that working, I will look at what you have already done on this.

Unfortunately, I'm seeing one more weird thing I hope you can help with when I switch to Lsb0.
And here's the output:
Just looking at the output, something already looks wrong.
Ok, so that issue is probably a result of the change from Msb0 to Lsb0. I added a bunch of debug tracing in the bit-reading code. As it was:

```rust
// Create a new BitVec from the slice and pad un-aligned chunks
// i.e. [10010110, 1110] -> [10010110, 00001110]
let bits: BitVec<u8, Lsb0> = {
    let mut bits = BitVec::with_capacity(bit_slice.len() + pad);

    // Copy bits to new BitVec
    bits.extend_from_bitslice(bit_slice);

    // Force align
    // i.e. [1110, 10010110] -> [11101001, 0110]
    bits.force_align();

    // Some padding to next byte
    let index = if input_is_le {
        bits.len() - (8 - pad)
    } else {
        0
    };
    for _ in 0..pad {
        bits.insert(index, false);
    }

    // Pad up-to size of type
    for _ in 0..(MAX_TYPE_BITS - bits.len()) {
        if input_is_le {
            bits.push(false);
        } else {
            bits.insert(0, false);
        }
    }

    bits
};
```

The issue is specifically in the bit-padding `for` loop; changing it to:

```rust
for _ in pad..0 {
    bits.insert(index, false);
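    // Note: `pad..0` is an empty range in Rust (a range whose start is
    // not less than its end yields no iterations), so this loop body
    // never runs and the padding insertion is skipped entirely.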
}
```

results in my test passing. I'm just going to proceed with my forked version for now, but I will see if I can understand why that change is needed.

EDIT: To be clear on what exactly is passing: I have changed all instances of Msb0 to Lsb0 and changed the direction of the bit-padding for loop as shown above. With that, and this struct definition:

```rust
#[derive(Debug, Copy, Clone, PartialEq, DekuRead, DekuWrite)]
struct TestPacket {
    // Word 1: Bytes 1-4
    #[deku(bits = 10)]
    f1: u32,
    #[deku(bits = 10)]
    f2: u32,
    #[deku(bits = 10, pad_bits_after = "2")]
    f3: u32,
    // Bytes 5-6
    #[deku(bits = 10, pad_bits_after = "6")]
    f4: u16,
    f5: u16, // Bytes 7-8
    f6: u16, // Bytes 9-10
    f7: u16, // Bytes 11-12
}
```

everything is parsed as expected.
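As a sketch of what "parsed as expected" means here, using the test data from elsewhere in this thread (this assumes the forked Lsb0 build described above):

```rust
use std::convert::TryFrom;

fn main() {
    let test_data: Vec<u8> = vec![
        0x01, 0xDC, 0x88, 0x23, // Word 1: Bytes 1-4
        0x73, 0x00, // Bytes 5-6
        0x00, 0x00, // Bytes 7-8
        0x01, 0x00, // Bytes 9-10
        0x01, 0x00, // Bytes 11-12
    ];

    let packet = TestPacket::try_from(test_data.as_ref()).unwrap();

    assert_eq!(packet.f1, 0x001);
    assert_eq!(packet.f2, 0x237);
    assert_eq!(packet.f3, 0x238);
    assert_eq!(packet.f4, 0x073);
    assert_eq!((packet.f5, packet.f6, packet.f7), (0, 1, 1));
}
```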
I don't have time to read your entire message, but the following hack works, although it looks like you have a sort of solution.

```rust
use bitvec::view::BitView;
use bitvec::{prelude::Msb0, slice::BitSlice, vec::BitVec};
use deku::prelude::*;
use std::convert::TryFrom;

#[derive(Debug, Copy, Clone, PartialEq, DekuRead, DekuWrite)]
#[deku(endian = "big")]
struct Word1 {
    #[deku(bits = 10, pad_bits_before = "2")]
    f3: u16,
    #[deku(bits = 10)]
    f2: u16,
    #[deku(bits = 10)]
    f1: u16,
}

impl Word1 {
    fn read(rest: &BitSlice<u8, Msb0>) -> Result<(&BitSlice<u8, Msb0>, Word1), DekuError> {
        let (rest, value) = u32::read(rest, ())?;
        let bytes = value.to_be_bytes();
        let (_, word) = Word1::from_bytes((&bytes, 0))?;
        Ok((rest, word))
    }

    fn write(output: &mut BitVec<u8, Msb0>, word: &Word1) -> Result<(), DekuError> {
        todo!();
    }
}

#[derive(Debug, Copy, Clone, PartialEq, DekuRead, DekuWrite)]
#[deku(endian = "little")]
struct TestPacket {
    // Word 1: Bytes 1-4
    #[deku(
        reader = "Word1::read(deku::rest)",
        writer = "Word1::write(deku::output, &self.word)"
    )]
    word: Word1,
    // Bytes 5-6
    #[deku(bits = 10, pad_bits_after = "6")]
    f4: u16,
    f5: u16, // Bytes 7-8
    f6: u16, // Bytes 9-10
    f7: u16, // Bytes 11-12
}

fn main() {
    // Test Data as Hex
    let test_data = vec![
        0x01, 0xDC, 0x88, 0x23, // Word 1: Bytes 1-4
        0x73, 0x00, // Bytes 5-6
        0x00, 0x00, // Bytes 7-8
        0x01, 0x00, // Bytes 9-10
        0x01, 0x00, // Bytes 11-12
    ];

    let test_packet = TestPacket::try_from(test_data.as_ref()).unwrap();

    // Word 1: Bytes 1-4 become u32 -> word1 = 0x2388DC01
    let expect_packet = TestPacket {
        word: Word1 {
            f3: 0x238, // (word1 >> 20) & 0x3FF
            f2: 0x237, // (word1 >> 10) & 0x3FF
            f1: 1,     // word1 & 0x3FF
        },
        f4: 0x73, // 0x0073 & 0x3FF
        f5: 0,    // 0x0000 & 0xFFFF
        f6: 1,    // 0x0001 & 0xFFFF
        f7: 1,    // 0x0001 & 0xFFFF
    };

    assert_eq!(test_packet, expect_packet);
}
```
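The `todo!()` in `Word1::write` is left unimplemented above; if writing were needed, one way to mirror `read` might be the following (a sketch, not tested, assuming the same deku API the hack already uses):

```rust
fn write(output: &mut BitVec<u8, Msb0>, word: &Word1) -> Result<(), DekuError> {
    // Rebuild the big-endian u32 view of Word1 (2 pad bits, then f3,
    // f2, f1 at 10 bits each), then emit it exactly the way `read`
    // consumed it: as a plain u32.
    let value: u32 =
        ((word.f3 as u32) << 20) | ((word.f2 as u32) << 10) | (word.f1 as u32);
    value.write(output, ())
}
```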
Yeah, I definitely don't want to do custom implementations; I have a lot of these types of structs to define and I'd like to cut out boilerplate as much as possible. Switching to LSb-first parsing and fixing the one bug that introduces in the padding code works for me, but first-class Lsb0 support would be a great addition.
Agree! It would be a great addition.
Hey @sharksforarms, considering that this library is for working with binary data, could the bit order be made configurable?
Is there still interest in moving this forward? I'm currently using @LeonardMH's deku fork for a parsing project where bits are Lsb0-ordered, and I would love to see this functionality included in the main version of deku. If there's interest, I could try figuring out a PR.
I have an update on that in this issue: #134 (comment). This has no MR, is based on a couple of un-merged MRs, and doesn't support writing yet. So far, in my limited testing, it works for reading!
That being said, I haven't tested this issue's example.
I did see that issue as well, but the changes that your fork introduces are much larger, and I haven't looked at them in too much detail yet :)
I pushed the changes to an MR: #367
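If something like that lands, the struct from this issue might reduce to a plain derive. A sketch of the attribute shape (the `bit_order = "lsb"` spelling is an assumption based on that MR, not verified here):

```rust
use deku::prelude::*;

// Hypothetical: a container-level bit-order attribute.
#[derive(Debug, PartialEq, DekuRead, DekuWrite)]
#[deku(endian = "little", bit_order = "lsb")]
struct TestPacket {
    #[deku(bits = 10)]
    f1: u32,
    #[deku(bits = 10)]
    f2: u32,
    #[deku(bits = 10, pad_bits_after = "2")]
    f3: u32,
    #[deku(bits = 10, pad_bits_after = "6")]
    f4: u16,
    f5: u16,
    f6: u16,
    f7: u16,
}
```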
Hello, I was looking forward to using `deku` and hit my first roadblock when I learned (the hard way) that bitfields are parsed in MSb-first order (as referenced in issue #134). This isn't exactly the issue I am trying to resolve, but I bring it up because:

So on to the real problem: I am trying to decode a data stream coming from a microcontroller. The data comes to me directly over the wire as it is represented in memory on the MCU, which is described by the following (C) bitfield struct:
To simplify the issue, let's look at one failing case. The data I get over the wire is `[0x01, 0xDC, 0x88, 0x23, 0x73, 0x00, 0x00, 0x00, 0x01, 0x00, 0x01, 0x00]`. I have written a minimal test case in Rust which, other than being a bit over-prescriptive with endianness, is pretty close to how I would "ideally" define this struct. This fails with the following:
But OK, that's somewhat expected given that the parsing happens in MSb-first order. I'm just not sure at what scale that MSb-first applies, so I tried various combinations of reordering the bitfields, changing the `endian` setting, changing between `pad_bits_after` and `pad_bits_before`, explicitly defining `bits` even on the u16 types that don't need it, etc. My testing probably hasn't been exhaustive, but it has been exhausting, and at this point I think I have tried everything that "seems like it should work". Still, I haven't been able to find the magic.

Can anyone help me wrap my head around this? Am I just missing something obvious? If nothing else, is it possible to do any debugging of how `deku` is parsing the data sent to it?