Test new sequencer
==================

Some statistics on PAS test files for the new sequencer, sent by Gennaro.

=== __2024-09-06__ Log-13-48-7.7z

105 3D records (9 uncompressed + 96 compressed)

* Sampling size : 48 x 7 x 11 x int16 = 7392 bytes
* Compressed size : 4068 + 556 => 4624 bytes
* Compression ratio : 1.6
* Uncompressed size : 33280 bytes rather than 7392

[NOTE]
====
Huge buffer overflow: 33280 bytes produced while uncompressing, where 7392 bytes were expected (more than 4 times greater).
Bad compression ratio: 1.6.
Each PAS 3D sampling needs one HEADER and 2 compressed DATA packets.
====

=== __2024-09-09__ alain.z

143 3D records (26 uncompressed + 117 compressed)

* Sampling size : 64 x 7 x 11 x int16 = 9856 bytes
* Compressed size : 1110 bytes (CDS = 54)
* Compression ratio : 8.8
* Uncompressed size : 9856 bytes = sampling size

[NOTE]
====
One HEAD + one DATA packet.
Compression ratio = 8.8.
====

=== __2024-09-09__ alain2.z

67 3D records (all compressed)

* Sampling size : 64 x 9 x 11 x int16 = 12672 bytes
* Compressed size : 4062 + 4068 + 4080 + 384 = 12594 bytes (CDS = 32 + 32 + 32 + 3)
* Compression ratio : 1.01
* Uncompressed size : 19584 bytes > 12672 expected

[NOTE]
====
* Worst compression ratio = 1.01
* For each compressed PAS 3D sampling, one HEAD packet + 4 DATA packets are needed!
* Buffer overflow (about 7000 extra bytes)
====

=== __2024-09-09__ Log-8_48_5_bis.7z

284 3D records (2 uncompressed + 282 compressed)

* Sampling size : 48 x 5 x 11 x int16 = 5280 bytes
* Compressed size : 3338 bytes (CDS = 42)
* Compression ratio : 1.58
* Uncompressed size : 5376 bytes > 5280 expected

[NOTE]
====
A single HEAD and a single DATA packet, but a bad compression ratio of 1.58.
====

== Conclusion

There were 4 different files, in an unusual text format.

* How were these files generated?
+
--
They are not the usual TM output generated by SIIS or from flight data.
--
* Compression ratio
+
--
In all these files except alain.7z, the compression ratio was very low (1.58 and even 1.01), whereas we get ratios from 8 to 30 with in-flight or SIIS data.
In these test files, ~96% of the values are 0 and ~3% are 1000, so the compression ratio should be much greater.
--
* Buffer overflow
+
--
For PAS, the maximum possible sampling size is 96 x 9 x 11 x int16 = 19008 bytes, so we allocated an uncompression buffer of 20000 bytes.
With these files, we got huge buffer overflows while uncompressing.
Generally we can get some extra 00 padding, up to 64 bytes, at the end of the uncompressed buffer.
Here we got up to 33280 bytes when uncompressing 7392 bytes, which is more than 4 times the original size.
--

I think there is a problem with these input files. Some compression parameters?

== Buffer overflow

To uncompress PAS data, we use a function uncompress_data(), provided by a Linux shared library (.so) and declared as follows:

.eRince_Decompressor.h
----
#ifndef eRICE_DECODER_H
#define eRICE_DECODER_H 1

int uncompress_data(
    const unsigned char* input_buffer,
    const int input_size,
    const int LCI,
    unsigned char* output_buffer,
    int* output_size);

#endif
----

This function accepts an input buffer and its size, and fills an output buffer and its size.

We know exactly the size of the expected output (the original data size): it is defined in the FULL3D_COMPR_HEAD packet and computed as

  nb.energy x nb.elevation x nb.CEM x int16

However, the library has no knowledge of the size of the original data, and in some cases it generates huge buffer overflows.
The last tests generated an output buffer of 33280 bytes, whereas only 7392 bytes were expected (more than 4 times bigger).

We would like to add a new parameter to uncompress_data():

----
int expected_size;   // or original_size
----

and a guard inside the uncompress_data() library, for example:

----
if (*output_size > expected_size)
    return -1;   /* error: output exceeds the original data size */
----