- Type: Bug
- Resolution: Fixed
- Priority: Major
- Affects Version/s: 4.5.3
- Component/s: statistics, visualization and debugging tools
This happens when the workflow fails because DAGMAN_LOG_ON_NFS_IS_ERROR is set and the submit directory is on NFS. It looks like this may have been triggered by an RPM update that changed a Condor default. The dag.dagman.out log contains:
ERROR: log file /path/to/workflow-0.dag.nodes.log is on NFS.
The workflow starts, and some tables in the DB get populated, but no jobs are submitted. In particular, the job_instance, jobstate, and invocation tables are all empty.
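On the Condor side, a likely workaround (an assumption based on the error message, not verified against this installation) is to relax the NFS check in the local HTCondor configuration:

DAGMAN_LOG_ON_NFS_IS_ERROR = False

That only avoids the DAGMan failure; pegasus-analyzer should still cope with the empty tables, as the traceback below shows it currently does not.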
Running pegasus-analyzer against the failed workflow then fails with:
Traceback (most recent call last):
  File "/sw/redhat6/pegasus/4.5.3/bin/pegasus-analyzer", line 1572, in <module>
    analyze_db(options.config_properties)
  File "/sw/redhat6/pegasus/4.5.3/bin/pegasus-analyzer", line 1021, in analyze_db
    unsubmitted = total - success - failed
TypeError: unsupported operand type(s) for -: 'int' and 'NoneType'
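The TypeError indicates that at least one of the success/failed counts is None rather than an integer, presumably because an aggregate query over the empty job_instance table returns NULL. A minimal sketch of the kind of guard that avoids the crash (count_unsubmitted is an illustrative name, not the actual pegasus-analyzer code):

def count_unsubmitted(total, success, failed):
    # Aggregate queries over an empty job_instance table can return None
    # instead of 0; coerce missing counts before doing arithmetic.
    success = success if success is not None else 0
    failed = failed if failed is not None else 0
    return total - success - failed

# Example: no jobs were ever submitted, so both counts come back missing.
print(count_unsubmitted(total=42, success=None, failed=None))  # prints 42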