Nanopolish error while calling methylation using VBZ-compressed fast5 #5
Comments
Hi Dale, can I ask what platform/operating system you are running on? We are aware of some issues on Ubuntu Bionic where HDF5 doesn't load the plugin automatically. Can you also try adding:
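For context, HDF5 looks for dynamically loaded filter plugins in the directories listed in the HDF5_PLUGIN_PATH environment variable. A minimal sketch of pointing it at an unpacked vbz plugin (the path below is a placeholder; use whichever directory actually contains the libvbz_hdf_plugin library on your system):

```
# Placeholder path - replace with the directory that contains the
# vbz plugin library (libvbz_hdf_plugin.so on Linux)
export HDF5_PLUGIN_PATH=/path/to/vbz/plugin/dir
```

The variable has to be exported in the same shell (or job script) that launches nanopolish, so that the process inherits it.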
Thanks for your reply. My platform is Linux version 3.10.0-693.11.1.el7.x86_64. Another question: how can I decompress the vbz-compressed fast5? Can I get the same raw fast5 files back after decompression? Best~
You can - we are working on some tools in our fast5 api to do this for you; in the meantime you could use the repacking tool in ont_fast5_api, which should repack using gzip, not vbz.
https://github.com/nanoporetech/ont_fast5_api has tools to repack fast5 files with or without vbz.
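As a sketch of that route (assuming ont_fast5_api is installed, e.g. via pip install ont-fast5-api, and that its compress_fast5 helper accepts these options in your version - check compress_fast5 --help):

```
# Repack vbz-compressed fast5 files using gzip so that tools without
# the vbz plugin can read them; the input files are left untouched.
compress_fast5 \
    --input_path vbz_fast5/ \
    --save_path gzip_fast5/ \
    --compression gzip \
    --threads 4 \
    --recursive
```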
Hi there - I'm getting the error "The fast5 file is compressed with VBZ but the required plugin is not loaded. Please read the instructions here: #5". Sorry if I'm overlooking something obvious here, but I'd appreciate any input you might have on this issue. Thanks!
It sounds like you may have hit an issue with the vbz plugin on M1 platforms. Are you running python in a Rosetta environment? A short-term solution would be to build the VBZ plugin yourself on M1 - I will look to get M1 support for vbz rolled out in a future release.
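A rough sketch of what a from-source build could look like; the authoritative steps and CMake options are in the vbz_compression README and may differ between releases, so treat this as an outline rather than exact instructions:

```
# Rough outline only - consult the vbz_compression README for the
# authoritative build steps on your platform.
git clone --recursive https://github.com/nanoporetech/vbz_compression.git
cd vbz_compression
mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
make -j4

# Locate the built plugin library and point HDF5 at its directory
# (the exact output path inside the build tree may vary).
find . -name 'libvbz_hdf_plugin*'
export HDF5_PLUGIN_PATH=/path/printed/by/find
```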
@mmullistb can you try with the M1 Mac build?
thanks @jorj1988 and @iiSeymour - I downloaded the M1 Mac build and added the directory containing the binary to my path. I repeated the nanopolish polya command and the output was the following:
HDF5-DIAG: Error detected in HDF5 (1.13.0) thread 1:
#000: H5D.c line 1021 in H5Dread(): can't synchronously read data
major: Dataset
minor: Read failed
#1: H5D.c line 970 in H5D__read_api_common(): can't read data
major: Dataset
minor: Read failed
#2: H5VLcallback.c line 2079 in H5VL_dataset_read(): dataset read failed
major: Virtual Object Layer
minor: Read failed
#3: H5VLcallback.c line 2046 in H5VL__dataset_read(): dataset read failed
major: Virtual Object Layer
minor: Read failed
#4: H5VLnative_dataset.c line 294 in H5VL__native_dataset_read(): can't read data
major: Dataset
minor: Read failed
#5: H5Dio.c line 262 in H5D__read(): can't read data
major: Dataset
minor: Read failed
#6: H5Dchunk.c line 2575 in H5D__chunk_read(): unable to read raw data chunk
major: Low-level I/O
minor: Read failed
#7: H5Dchunk.c line 3943 in H5D__chunk_lock(): data pipeline read failed
major: Dataset
minor: Filter operation failed
#8: H5Z.c line 1359 in H5Z_pipeline(): required filter 'vbz' is not registered
major: Data filters
minor: Read failed
#9: H5PLint.c line 257 in H5PL_load(): search in path table failed
major: Plugin for dynamically loaded library
minor: Can't get value
#10: H5PLpath.c line 804 in H5PL__find_plugin_in_path_table(): search in path /usr/local/hdf5/lib/plugin encountered an error
major: Plugin for dynamically loaded library
minor: Can't get value
#11: H5PLpath.c line 857 in H5PL__find_plugin_in_path(): can't open directory: /usr/local/hdf5/lib/plugin
major: Plugin for dynamically loaded library
minor: Can't open directory or file
The fast5 file is compressed with VBZ but the required plugin is not loaded. Please read the instructions here: https://github.com//issues/5
HDF5-DIAG: Error detected in HDF5 (1.13.0) thread 2:
#000: H5A.c line 1044 in H5Aread(): can't synchronously read data
major: Attribute
minor: Read failed
#1: H5A.c line 1008 in H5A__read_api_common(): not an attribute
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.13.0) thread 2:
#000: H5A.c line 2251 in H5Aclose(): decrementing attribute ID failed
major: Attribute
minor: Unable to decrement reference count
#1: H5Iint.c line 1157 in H5I_dec_app_ref(): can't decrement ID ref count
major: Object ID
minor: Unable to decrement reference count
#2: H5Iint.c line 1109 in H5I__dec_app_ref(): can't decrement ID ref count
major: Object ID
minor: Unable to decrement reference count
#3: H5Iint.c line 1012 in H5I__dec_ref(): can't locate ID
major: Object ID
minor: Unable to find ID information (already closed?)
HDF5-DIAG: Error detected in HDF5 (1.13.0) thread 2:
#000: H5G.c line 889 in H5Gclose(): decrementing group ID failed
major: Symbol table
minor: Unable to decrement reference count
#1: H5Iint.c line 1157 in H5I_dec_app_ref(): can't decrement ID ref count
major: Object ID
minor: Unable to decrement reference count
#2: H5Iint.c line 1109 in H5I__dec_app_ref(): can't decrement ID ref count
major: Object ID
minor: Unable to decrement reference count
#3: H5Iint.c line 1012 in H5I__dec_ref(): can't locate ID
major: Object ID
minor: Unable to find ID information (already closed?)
HDF5-DIAG: Error detected in HDF5 (1.13.0) thread 2:
#000: H5D.c line 397 in H5Dopen2(): unable to synchronously open dataset
major: Dataset
minor: Can't open object
#1: H5D.c line 353 in H5D__open_api_common(): can't set object access arguments
major: Dataset
minor: Can't set value
#2: H5VLint.c line 2669 in H5VL_setup_acc_args(): invalid location identifier
major: Invalid arguments to routine
minor: Inappropriate type
#3: H5VLint.c line 1779 in H5VL_vol_object(): invalid identifier
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.13.0) thread 2:
#000: H5G.c line 438 in H5Gopen2(): unable to synchronously open group
major: Symbol table
minor: Unable to create file
#1: H5G.c line 395 in H5G__open_api_common(): can't set object access arguments
major: Symbol table
minor: Can't set value
#2: H5VLint.c line 2669 in H5VL_setup_acc_args(): invalid location identifier
major: Invalid arguments to routine
minor: Inappropriate type
#3: H5VLint.c line 1779 in H5VL_vol_object(): invalid identifier
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.13.0) thread 2:
#000: H5F.c line 1061 in H5Fclose(): decrementing file ID failed
major: File accessibility
minor: Unable to close file
#1: H5Iint.c line 1157 in H5I_dec_app_ref(): can't decrement ID ref count
major: Object ID
minor: Unable to decrement reference count
#2: H5Iint.c line 1109 in H5I__dec_app_ref(): can't decrement ID ref count
major: Object ID
minor: Unable to decrement reference count
#3: H5Iint.c line 1012 in H5I__dec_ref(): can't locate ID
major: Object ID
minor: Unable to find ID information (already closed?)
HDF5-DIAG: Error detected in HDF5 (1.13.0) thread 3:
#000: H5D.c line 1021 in H5Dread(): can't synchronously read data
major: Dataset
minor: Read failed
The line "The fast5 file is compressed with VBZ but the required plugin is not loaded. Please read the instructions here: #5" suggests that nanopolish is still not able to access the VBZ plugin, right?
@mmullistb did you use HDF5_PLUGIN_PATH?
@iiSeymour haha yes, my mistake. It's working now when using HDF5_PLUGIN_PATH instead of PATH.
Hi there, I'm running nanopolish eventalign on some of my data and the error I'm getting is directing me to this thread.
I am suspicious of this error, as eventalign runs smoothly when given the example data found in the guide. Fixes attempted:
I find it difficult to believe that my fast5 files are corrupted, as I am able to complete basecalling and mapping, and no obvious errors show up when running FastQC. I think the program is right with its first error message and that it is failing to decompress the files correctly; hopefully you can help me configure my installation to process these files properly. Thanks, Finnlay Lambert
Hi @finn-rpl, what version of the vbz plugin do you have installed? What OS (+ version) are you running? Thanks,
Hi George, I'm running Ubuntu 20.04, but the issue is solved and the error was mine. When I tried the fix detailed here, the link downloaded the Mac version of the plugin rather than directing me to the releases page. After installing the correct version, everything ran as expected. Thanks for your assistance; I only realised there were multiple distributions after you asked me which version was appropriate. Finnlay Lambert
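For anyone else hitting this on Linux x86_64, a sketch of the manual route using the v1.0.1 release linked elsewhere in this thread (newer releases may exist on the vbz_compression releases page):

```
# Download and unpack the pre-built Linux x86_64 plugin
wget https://github.com/nanoporetech/vbz_compression/releases/download/v1.0.1/ont-vbz-hdf-plugin-1.0.1-Linux-x86_64.tar.gz
tar -xzf ont-vbz-hdf-plugin-1.0.1-Linux-x86_64.tar.gz

# Find libvbz_hdf_plugin.so inside the unpacked tree and point HDF5 at
# the directory that contains it.
find . -name 'libvbz_hdf_plugin.so'
export HDF5_PLUGIN_PATH=/path/printed/by/find
```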
Perfect - thanks for updating me!
Hello, I am using the nanoseq pipeline to do m6A analysis on an HPC cluster. I installed https://github.com/nanoporetech/vbz_compression/releases/download/v1.0.1/ont-vbz-hdf-plugin-1.0.1-Linux-x86_64.tar.gz in my conda virtual env. I used the following command to run the pipeline: nextflow run
My custom config file has
but I keep getting the error
Hi @vetmohit89, can you confirm the architecture of the system you are running on? What is the content of that directory? Have you tried running the same experiment outside of your container?
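A few generic commands that can answer those questions from inside the job or container (they only inspect the environment; nothing here is specific to nanoseq):

```
uname -m                                        # architecture; the Linux-x86_64 plugin build needs x86_64
echo "$HDF5_PLUGIN_PATH"                        # where HDF5 will look for plugins, if set
ls -l "$HDF5_PLUGIN_PATH"                       # should list libvbz_hdf_plugin.so
file "$HDF5_PLUGIN_PATH"/libvbz_hdf_plugin.so   # check the library matches the machine architecture
```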
Dear professors:
After compressing the raw fast5 files with ont_fast5_api, I tried to use the vbz-compressed fast5 to call methylation with the Nanopolish software. The error occurs as below:
HDF5-DIAG: Error detected in HDF5 (1.8.14) thread 47866940143360:
#000: H5Dio.c line 173 in H5Dread(): can't read data
major: Dataset
minor: Read failed
#1: H5Dio.c line 550 in H5D__read(): can't read data
major: Dataset
minor: Read failed
#2: H5Dchunk.c line 1872 in H5D__chunk_read(): unable to read raw data chunk
major: Low-level I/O
minor: Read failed
#3: H5Dchunk.c line 2902 in H5D__chunk_lock(): data pipeline read failed
major: Data filters
minor: Filter operation failed
#4: H5Z.c line 1357 in H5Z_pipeline(): required filter 'vbz' is not registered
major: Data filters
minor: Read failed
#5: H5PL.c line 298 in H5PL_load(): search in paths failed
major: Plugin for dynamically loaded library
minor: Can't get value
#6: H5PL.c line 402 in H5PL__find(): can't open directory
major: Plugin for dynamically loaded library
minor: Can't open directory or file
I guess the vbz library was not installed correctly?
And how can I use the vbz-compressed fast5 to call methylation? Should I decompress it first, or can the vbz-compressed fast5 be read directly in some other way?
Best~
dale wong
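To answer the question directly: the vbz-compressed fast5 files do not need to be decompressed; nanopolish can read them as long as the vbz plugin is visible to HDF5 at run time. A minimal sketch (the file names are placeholders and the flags follow standard nanopolish call-methylation usage; adapt them to your data):

```
# Make the vbz plugin visible to HDF5 in this shell
export HDF5_PLUGIN_PATH=/path/to/plugin/dir   # directory containing libvbz_hdf_plugin.so

# Link reads to the (vbz-compressed) fast5 directory, then call methylation
nanopolish index -d fast5_vbz/ reads.fastq
nanopolish call-methylation -t 8 -r reads.fastq -b reads.sorted.bam -g genome.fa > methylation_calls.tsv
```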