Hi,
I am having difficulties running the Picard tool "MarkDuplicates on data": the job wrapper fails because a metadata_temp file is reported as missing, although the file exists in the job directory and contains data.
We are running Galaxy v16.10 on a separate host and submitting jobs to a cluster running PBS Pro; data is shared through sshfs mounts, and tool dependencies are resolved via the module system.
Any ideas? Thanks!
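
For what it's worth, as far as I can tell from the traceback, the failing step in the object store's update_from_file boils down to a plain file copy on the Galaxy host. Here is a minimal sketch of that copy (the paths are copied verbatim from the log below; the shutil.copy is my assumption about what the disk object store effectively does), which could be run as the Galaxy user on the Galaxy host to check whether that host can see the path at all:

    import os
    import shutil

    # Paths taken verbatim from the error log below.
    src = '/lustre/scratch/projects/csf_biocomp_common/galaxy_shared/jobs_directory/000/221/metadata_temp_file_xccoop'
    dst = '/home/GMI/biocomp.pacbio/galaxy/galaxy_shared/file_database/_metadata_files/000/metadata_26.dat'

    # ENOENT here means the host running this snippet cannot resolve src,
    # even if the cluster node that wrote the file can.
    print(os.path.exists(src))
    shutil.copy(src, dst)

If os.path.exists(src) returns False on the Galaxy host while the file is visible on the cluster node, the sshfs mount is presumably not exposing the job directory under the same path on both sides.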
galaxy.objectstore CRITICAL 2017-03-28 10:04:44,926 Error copying /lustre/scratch/projects/csf_biocomp_common/galaxy_shared/jobs_directory/000/221/metadata_temp_file_xccoop to /home/GMI/biocomp.pacbio/galaxy/galaxy_shared/file_database/_metadata_files/000/metadata_26.dat: [Errno 2] No such file or directory: u'/lustre/scratch/projects/csf_biocomp_common/galaxy_shared/jobs_directory/000/221/metadata_temp_file_xccoop'
galaxy.jobs.runners ERROR 2017-03-28 10:04:44,926 (221/1146406.pbs0.ice.gmi.oeaw.ac.at) Job wrapper finish method failed
Traceback (most recent call last):
  File "/home/GMI/biocomp.pacbio/galaxy/galaxy/lib/galaxy/jobs/runners/__init__.py", line 611, in finish_job
    job_state.job_wrapper.finish( stdout, stderr, exit_code )
  File "/home/GMI/biocomp.pacbio/galaxy/galaxy/lib/galaxy/jobs/__init__.py", line 1303, in finish
    dataset.metadata.from_JSON_dict( output_filename, path_rewriter=path_rewriter )
  File "/home/GMI/biocomp.pacbio/galaxy/galaxy/lib/galaxy/model/metadata.py", line 168, in from_JSON_dict
    dataset._metadata[ name ] = param.from_external_value( external_value, dataset, **from_ext_kwds )
  File "/home/GMI/biocomp.pacbio/galaxy/galaxy/lib/galaxy/model/metadata.py", line 605, in from_external_value
    alt_name=os.path.basename(mf.file_name) )
  File "/home/GMI/biocomp.pacbio/galaxy/galaxy/lib/galaxy/objectstore/__init__.py", line 430, in update_from_file
    raise ex
IOError: [Errno 2] No such file or directory: u'/lustre/scratch/projects/csf_biocomp_common/galaxy_shared/jobs_directory/000/221/metadata_temp_file_xccoop'