[Ncep.list.fv3-announce] fv3gfs r96274: reintegrate EXP-cyc branch to trunk, including updat...

Samuel.Trahan@noaa.gov
Fri Aug 4 03:43:16 UTC 2017


Friendly fv3gfs developers,

This is an automated email about a fv3gfs commit.

Project: fv3gfs
URL: https://svnemc.ncep.noaa.gov/projects/fv3gfs/trunk
Revision: 96274
Author:   fanglin.yang@noaa.gov
Date:     2017-08-04T03:29:01.317657Z
Message:
Reintegrate the EXP-cyc branch to trunk. This includes updates to the Rocoto workflow to run forecast-only experiments, to the chgres driver to handle the new operational NEMS GFS initial conditions, and to exglobal_fcst_nemsfv3gfs.sh to properly set up the parameters for running gcycle within the forecast.


See attached file for full differences.
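For context, part of this refactor moves duplicated logic into workflow_utils (imported as wfu). For example, the inline interval strings in get_gfs_cyc_dates are replaced by a call to wfu.get_gfs_interval. The actual implementation lives in workflow_utils.py (not shown in this diff); a minimal sketch consistent with the removed inline assignments (gfs_cyc of 1, 2, or 4 cycles per day mapping to 24-, 12-, or 6-hour intervals) might look like:

```python
# Hypothetical sketch of wfu.get_gfs_interval, inferred from the inline
# interval strings removed from get_gfs_cyc_dates in this commit.
def get_gfs_interval(gfs_cyc):
    """Map the number of GFS cycles per day to a Rocoto interval string."""
    intervals = {1: '24:00:00', 2: '12:00:00', 4: '06:00:00'}
    try:
        return intervals[gfs_cyc]
    except KeyError:
        raise KeyError('Unsupported gfs_cyc = %s, ABORT!' % gfs_cyc)
```

This centralizes the mapping so get_gfs_cyc_dates only has to compute the start/end hour offsets.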


-------------- next part --------------
Index: checkout/gfs_workflow.v15.0.0/ush
===================================================================
--- checkout/gfs_workflow.v15.0.0/ush	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/ush	(revision 96274)

Property changes on: checkout/gfs_workflow.v15.0.0/ush
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gfs_workflow.v15.0.0/ush:r96049-96273
Index: checkout/gfs_workflow.v15.0.0/scripts
===================================================================
--- checkout/gfs_workflow.v15.0.0/scripts	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/scripts	(revision 96274)

Property changes on: checkout/gfs_workflow.v15.0.0/scripts
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gfs_workflow.v15.0.0/scripts:r96049-96273
Index: checkout/gfs_workflow.v15.0.0/bin
===================================================================
--- checkout/gfs_workflow.v15.0.0/bin	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/bin	(revision 96274)

Property changes on: checkout/gfs_workflow.v15.0.0/bin
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gfs_workflow.v15.0.0/bin:r96049-96273
Index: checkout/gfs_workflow.v15.0.0/exp
===================================================================
--- checkout/gfs_workflow.v15.0.0/exp	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/exp	(revision 96274)

Property changes on: checkout/gfs_workflow.v15.0.0/exp
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gfs_workflow.v15.0.0/exp:r96049-96273
Index: checkout/gfs_workflow.v15.0.0/jobs
===================================================================
--- checkout/gfs_workflow.v15.0.0/jobs	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/jobs	(revision 96274)

Property changes on: checkout/gfs_workflow.v15.0.0/jobs
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gfs_workflow.v15.0.0/jobs:r96049-96273
Index: checkout/gfs_workflow.v15.0.0/util
===================================================================
--- checkout/gfs_workflow.v15.0.0/util	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/util	(revision 96274)

Property changes on: checkout/gfs_workflow.v15.0.0/util
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gfs_workflow.v15.0.0/util:r96049-96273
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/ush/setup_workflow.py
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/ush/setup_workflow.py	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/ush/setup_workflow.py	(revision 96274)
@@ -26,159 +26,42 @@
 
 import os
 import sys
-import glob
-from distutils.spawn import find_executable
 from datetime import datetime, timedelta
 from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter
-import shellvars
 import rocoto
+import workflow_utils as wfu
 
+gfs_tasks = ['prep', 'anal', 'fcst', 'post', 'vrfy', 'arch']
+hyb_tasks = ['eobs', 'eomg', 'eupd', 'ecen', 'efcs', 'epos', 'earc']
+
 def main():
     parser = ArgumentParser(description='Setup XML workflow and CRONTAB for a GFS parallel.', formatter_class=ArgumentDefaultsHelpFormatter)
     parser.add_argument('--expdir',help='full path to experiment directory containing config files', type=str, required=False, default=os.environ['PWD'])
     args = parser.parse_args()
 
-    configs = get_configs(args.expdir)
+    configs = wfu.get_configs(args.expdir)
 
-    dict_configs = source_configs(args.expdir, configs)
+    _base = wfu.config_parser([wfu.find_config('config.base', configs)])
 
-    # First create workflow XML
-    create_xml(dict_configs)
-
-    # Next create the crontab
-    create_crontab(dict_configs['base'])
-
-    return
-
-
-def get_configs(expdir):
-    '''
-        Given an experiment directory containing config files,
-        return a list of configs minus the ones ending with ".default"
-    '''
-
-    configs = glob.glob('%s/config.*' % expdir)
-
-    # remove any defaults from the list
-    for c, config in enumerate(configs):
-        if config.endswith('.default'):
-            configs.pop(c)
-
-    return configs
-
-
-def find_config(config_name, configs):
-
-    for config in configs:
-        if config_name == os.path.basename(config):
-            return config
-
-    # no match found
-    raise IOError("%s does not exist, ABORT!" % config_name)
-
-
-def source_configs(expdir, configs):
-    '''
-        Given list of config files, source them
-        and return a dictionary for each task
-    '''
-
-    dict_configs = {}
-
-    # First read "config.base", gather basic info
-    print 'sourcing config.base'
-    dict_configs['base'] = config_parser(find_config('config.base', configs))
-    base = dict_configs['base']
-
-    if expdir != base['EXPDIR']:
+    if args.expdir != _base['EXPDIR']:
         print 'MISMATCH in experiment directories!'
-        print 'config.base: EXPDIR = %s' % base['EXPDIR']
-        print 'input arg:     --expdir = %s' % expdir
+        print 'config.base: EXPDIR = %s' % _base['EXPDIR']
+        print 'input arg: --expdir = %s' % expdir
         sys.exit(1)
 
-    # GDAS/GFS tasks
-    for task in ['prep', 'anal', 'fcst', 'post', 'vrfy', 'arch']:
+    tasks = gfs_tasks + hyb_tasks if _base['DOHYBVAR'] == 'YES' else gfs_tasks
 
-        print 'sourcing config.%s' % task
+    dict_configs = wfu.source_configs(configs, tasks)
 
-        files = []
-        files.append(find_config('config.base', configs))
-        files.append(find_config('config.%s' % task, configs))
+    # First create workflow XML
+    create_xml(dict_configs)
 
-        dict_configs[task] = config_parser(files)
+    # Next create the crontab
+    wfu.create_crontab(dict_configs['base'])
 
-    # Hybrid tasks
-    if dict_configs['base']['DOHYBVAR'] == 'YES':
+    return
 
-        for task in ['eobs', 'eupd', 'ecen', 'efcs', 'epos', 'earc']:
 
-            files = []
-            files.append(find_config('config.base', configs))
-            if task in ['eobs', 'eomg']:
-                files.append(find_config('config.anal', configs))
-                files.append(find_config('config.eobs', configs))
-            elif task in ['eupd']:
-                files.append(find_config('config.anal', configs))
-                files.append(find_config('config.eupd', configs))
-            elif task in ['efcs']:
-                files.append(find_config('config.fcst', configs))
-                files.append(find_config('config.efcs', configs))
-            else:
-                files.append(find_config('config.%s' % task, configs))
-
-            print 'sourcing config.%s' % task
-            dict_configs[task] = config_parser(files)
-
-        dict_configs['eomg'] = dict_configs['eobs']
-
-    return dict_configs
-
-
-def config_parser(files):
-    """
-    Given the name of config file, key-value pair of all variables in the config file is returned as a dictionary
-    :param files: config file or list of config files
-    :type files: list or str or unicode
-    :return: Key value pairs representing the environment variables defined
-            in the script.
-    :rtype: dict
-    """
-    sv = shellvars.ShellVars(files)
-    varbles = sv.get_vars()
-    for key,value in varbles.iteritems():
-        if any(x in key for x in ['CDATE','SDATE','EDATE']): # likely a date, convert to datetime
-            varbles[key] = datetime.strptime(value,'%Y%m%d%H')
-            continue
-        if '.' in value: # Likely a number and that too a float
-            try:
-                varbles[key] = float(value)
-            except ValueError:
-                varbles[key] = value
-        else: # Still could be a number, may be an integer
-            try:
-                varbles[key] = int(value)
-            except ValueError:
-                varbles[key] = value
-
-    return varbles
-
-
-def get_scheduler(machine):
-    '''
-        Determine the scheduler
-    '''
-
-    if machine in ['ZEUS', 'THEIA']:
-        return 'moabtorque'
-    elif machine in ['WCOSS']:
-        return 'lsf'
-    elif machine in ['WCOSS_C']:
-        return 'lsfcray'
-    else:
-        msg = 'Unknown machine: %s, ABORT!' % machine
-        Exception.__init__(self,msg)
-
-
 def get_gfs_cyc_dates(base):
     '''
         Generate GFS dates from experiment dates and gfs_cyc choice
@@ -191,14 +74,14 @@
     sdate = base['SDATE']
     edate = base['EDATE']
 
+    interval_gfs = wfu.get_gfs_interval(gfs_cyc)
+
     # Set GFS cycling dates
     hrdet = 0
     if gfs_cyc == 1:
-        interval_gfs = '24:00:00'
         hrinc = 24 - sdate.hour
         hrdet = edate.hour
     elif gfs_cyc == 2:
-        interval_gfs = '12:00:00'
         if sdate.hour in [0, 12]:
             hrinc = 12
         elif sdate.hour in [6, 18]:
@@ -206,7 +89,6 @@
         if edate.hour in [6, 18]:
             hrdet = 6
     elif gfs_cyc == 4:
-        interval_gfs = '06:00:00'
         hrinc = 6
     sdate_gfs = sdate + timedelta(hours=hrinc)
     edate_gfs = edate - timedelta(hours=hrdet)
@@ -264,9 +146,6 @@
     strings = []
 
     strings.append('\n')
-    strings.append('\t<!-- User definitions -->\n')
-    strings.append('\t<!ENTITY LOGNAME "%s">\n' % os.environ['USER'])
-    strings.append('\n')
     strings.append('\t<!-- Experiment parameters such as name, starting, ending dates -->\n')
     strings.append('\t<!ENTITY PSLOT "%s">\n' % base['PSLOT'])
     strings.append('\t<!ENTITY SDATE "%s">\n' % base['SDATE'].strftime('%Y%m%d%H%M'))
@@ -286,8 +165,11 @@
     strings.append('\t<!ENTITY ACCOUNT    "%s">\n' % base['ACCOUNT'])
     strings.append('\t<!ENTITY QUEUE      "%s">\n' % base['QUEUE'])
     strings.append('\t<!ENTITY QUEUE_ARCH "%s">\n' % base['QUEUE_ARCH'])
-    strings.append('\t<!ENTITY SCHEDULER  "%s">\n' % get_scheduler(base['machine']))
+    strings.append('\t<!ENTITY SCHEDULER  "%s">\n' % wfu.get_scheduler(base['machine']))
     strings.append('\n')
+    strings.append('\t<!-- Toggle HPSS archiving -->\n')
+    strings.append('\t<!ENTITY ARCHIVE_TO_HPSS "YES">\n')
+    strings.append('\n')
     strings.append('\t<!-- ROCOTO parameters that control workflow -->\n')
     strings.append('\t<!ENTITY CYCLETHROTTLE "3">\n')
     strings.append('\t<!ENTITY TASKTHROTTLE  "20">\n')
@@ -324,7 +206,7 @@
     strings.append('\t<!-- BEGIN: Resource requirements for %s part of the workflow -->\n' % cdump.upper())
     strings.append('\n')
 
-    base = dict_configs['base']
+    machine = dict_configs['base']['machine']
 
     tasks = ['prep', 'anal', 'fcst', 'post', 'vrfy', 'arch']
     for task in tasks:
@@ -331,36 +213,11 @@
 
         cfg = dict_configs[task]
 
-        if cdump in ['gfs'] and 'wtime_%s_gfs' % task in cfg.keys():
-            walltime = cfg['wtime_%s_gfs' % task]
-        else:
-            walltime = cfg['wtime_%s' % task]
+        wtimestr, resstr, queuestr = wfu.get_resources(machine, cfg, task, cdump=cdump)
+        taskstr = '%s_%s' % (task.upper(), cdump.upper())
 
-        tasks = cfg['npe_%s' % task]
-        ppn = cfg['npe_node_%s' % task]
-        nodes = tasks / ppn
-        memory = cfg['memory_%s' % task] if 'memory_%s' % task in cfg.keys() else None
-
-        if base['machine'] in ['ZEUS', 'THEIA']:
-            resstr = '<nodes>%d:ppn=%d</nodes>' % (nodes, ppn)
-
-        elif base['machine'] in ['WCOSS_C']:
-            resstr = '<nodes>%d:ppn=%d</nodes>' % (nodes, ppn)
-            if task in ['arch']:
-                resstr += '<shared></shared>'
-            else:
-                if memory is not None:
-                    resstr += '<memory>%s</memory>' % str(memory)
-
-        elif base['machine'] in ['WCOSS']:
-            resstr = '<cores>%d</cores>' % (tasks)
-
-
-        taskstr = '%s_GFS' % task.upper() if cdump == 'gfs' else '%s' % task.upper()
-        queuestr = '&QUEUE_ARCH;' if task in ['arch'] else '&QUEUE;'
-
         strings.append('\t<!ENTITY QUEUE_%s     "%s">\n' % (taskstr, queuestr))
-        strings.append('\t<!ENTITY WALLTIME_%s  "%s">\n' % (taskstr, walltime))
+        strings.append('\t<!ENTITY WALLTIME_%s  "%s">\n' % (taskstr, wtimestr))
         strings.append('\t<!ENTITY RESOURCES_%s "%s">\n' % (taskstr, resstr))
         strings.append('\t<!ENTITY NATIVE_%s    "">\n'   % (taskstr))
 
@@ -382,45 +239,18 @@
     strings.append('\t<!-- BEGIN: Resource requirements for hybrid part of the workflow -->\n')
     strings.append('\n')
 
-    base = dict_configs['base']
+    machine = dict_configs['base']['machine']
 
-    hybrid_tasks = ['eobs', 'eomg', 'eupd', 'ecen', 'efcs', 'epos', 'earc']
-    for task in hybrid_tasks:
+    tasks = ['eobs', 'eomg', 'eupd', 'ecen', 'efcs', 'epos', 'earc']
+    for task in tasks:
 
         cfg = dict_configs['eobs'] if task in ['eomg'] else dict_configs[task]
 
-        if task in ['eomg']:
-            tasks = cfg['npe_eobs']
-            ppn = cfg['npe_node_eobs']
-            walltime = cfg['wtime_eomg']
-            memory = cfg['memory_eobs'] if 'memory_%s' % task in cfg.keys() else None
-        else:
-            tasks = cfg['npe_%s' % task]
-            ppn = cfg['npe_node_%s' % task]
-            walltime = cfg['wtime_%s' % task]
-            memory = cfg['memory_%s' % task] if 'memory_%s' % task in cfg.keys() else None
+        wtimestr, resstr, queuestr = wfu.get_resources(machine, cfg, task, cdump=cdump)
+        taskstr = '%s_%s' % (task.upper(), cdump.upper())
 
-        nodes = tasks / ppn
-
-        if base['machine'] in ['ZEUS', 'THEIA']:
-            resstr = '<nodes>%d:ppn=%d</nodes>' % (nodes, ppn)
-
-        elif base['machine'] in ['WCOSS_C']:
-            resstr = '<nodes>%d:ppn=%d</nodes>' % (nodes, ppn)
-            if task in ['earc']:
-                resstr += '<shared></shared>'
-            else:
-                if memory is not None:
-                    resstr += '<memory>%s</memory>' % str(memory)
-
-        elif base['machine'] in ['WCOSS']:
-            resstr = '<cores>%d</cores>' % (tasks)
-
-        taskstr = task.upper()
-        queuestr = '&QUEUE_ARCH;' if task in ['earc'] else '&QUEUE;'
-
         strings.append('\t<!ENTITY QUEUE_%s     "%s">\n' % (taskstr, queuestr))
-        strings.append('\t<!ENTITY WALLTIME_%s  "%s">\n' % (taskstr, walltime))
+        strings.append('\t<!ENTITY WALLTIME_%s  "%s">\n' % (taskstr, wtimestr))
         strings.append('\t<!ENTITY RESOURCES_%s "%s">\n' % (taskstr, resstr))
         strings.append('\t<!ENTITY NATIVE_%s    "">\n'   % (taskstr))
 
@@ -431,82 +261,6 @@
     return ''.join(strings)
 
 
-def create_wf_task(task, cdump='gdas', envar=None, dependency=None, \
-                   metatask=None, varname=None, varval=None):
-
-    if metatask is None:
-        taskstr = '%s' % task
-    else:
-        taskstr = '%s#%s#' % (task, varname)
-        metataskstr = '%s' % metatask
-        metatask_dict = {'metataskname': metataskstr, \
-                         'varname': '%s' % varname, \
-                         'varval': '%s' % varval}
-
-    if cdump in ['gfs']:
-        cdumpstr = '_GFS'
-        taskstr = 'gfs%s' % taskstr
-    elif cdump in ['gdas']:
-        cdumpstr = ''
-
-    task_dict = {'taskname': '%s' % taskstr, \
-                 'cycledef': '%s' % cdump, \
-                 'maxtries': '&MAXTRIES;', \
-                 'command': '&JOBS_DIR;/%s.sh' % task, \
-                 'jobname': '&PSLOT;_%s_@H' % taskstr, \
-                 'account': '&ACCOUNT;', \
-                 'queue': '&QUEUE_%s%s;' % (task.upper(), cdumpstr), \
-                 'walltime': '&WALLTIME_%s%s;' % (task.upper(), cdumpstr), \
-                 'native': '&NATIVE_%s%s;' % (task.upper(), cdumpstr), \
-                 'resources': '&RESOURCES_%s%s;' % (task.upper(), cdumpstr), \
-                 'log': '&ROTDIR;/logs/@Y@m@d@H/%s.log' % taskstr, \
-                 'envar': envar, \
-                 'dependency': dependency}
-
-    if metatask is None:
-        task = rocoto.create_task(task_dict)
-    else:
-        task = rocoto.create_metatask(task_dict, metatask_dict)
-    task = ''.join(task)
-
-    return task
-
-
-def create_firstcyc_task():
-    '''
-    This task is needed to run to finalize the first half cycle
-    '''
-
-    task = 'firstcyc'
-    taskstr = '%s' % task
-
-    deps = []
-    data = '&EXPDIR;/logs/@Y@m@d@H.log'
-    dep_dict = {'type':'data', 'data':data, 'offset':'24:00:00'}
-    deps.append(rocoto.add_dependency(dep_dict))
-    dep_dict = {'condition':'not', 'type':'cycleexist', 'offset':'-06:00:00'}
-    deps.append(rocoto.add_dependency(dep_dict))
-    dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
-
-    task_dict = {'taskname': '%s' % taskstr, \
-                 'cycledef': 'first', \
-                 'maxtries': '&MAXTRIES;', \
-                 'final' : 'true', \
-                 'command': 'sleep 1', \
-                 'jobname': '&PSLOT;_%s_@H' % taskstr, \
-                 'account': '&ACCOUNT;', \
-                 'queue': '&QUEUE_ARCH;', \
-                 'walltime': '&WALLTIME_ARCH;', \
-                 'native': '&NATIVE_ARCH;', \
-                 'resources': '&RESOURCES_ARCH;', \
-                 'log': '&ROTDIR;/logs/@Y@m@d@H/%s.log' % taskstr, \
-                 'dependency': dependencies}
-
-    task = rocoto.create_task(task_dict)
-
-    return ''.join(task)
-
-
 def get_gdasgfs_tasks(cdump='gdas', dohybvar='NO'):
     '''
         Create GDAS or GFS tasks
@@ -520,95 +274,76 @@
     tasks = []
 
     # prep
-    taskname = 'prep'
     deps = []
-    dep_dict = {'name':'post', 'type':'task', 'offset':'-06:00:00'}
+    dep_dict = {'type':'task', 'name':'%spost' % cdump, 'offset':'-06:00:00'}
     deps.append(rocoto.add_dependency(dep_dict))
     data = '&DMPDIR;/@Y@m@d@H/%s/%s.t@Hz.updated.status.tm00.bufr_d' % (cdump, cdump)
     dep_dict = {'type':'data', 'data':data}
     deps.append(rocoto.add_dependency(dep_dict))
     dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
+    task = wfu.create_wf_task('prep', cdump=cdump, envar=envars, dependency=dependencies)
 
     tasks.append(task)
     tasks.append('\n')
 
     # anal
-    taskname = 'anal'
     deps = []
-    if cdump in ['gdas']:
-        dep_dict = {'name':'prep', 'type':'task'}
-        deps.append(rocoto.add_dependency(dep_dict))
-    elif cdump in ['gfs']:
-        dep_dict = {'name':'gfsprep', 'type':'task'}
-        deps.append(rocoto.add_dependency(dep_dict))
+    dep_dict = {'type':'task', 'name':'%sprep' % cdump}
+    deps.append(rocoto.add_dependency(dep_dict))
     if dohybvar in ['y', 'Y', 'yes', 'YES']:
-        dep_dict = {'name':'epos', 'type':'task', 'offset':'-06:00:00'}
+        dep_dict = {'type':'task', 'name':'%sepos' % cdump, 'offset':'-06:00:00'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
     else:
         dependencies = rocoto.create_dependency(dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
+    task = wfu.create_wf_task('anal', cdump=cdump, envar=envars, dependency=dependencies)
 
     tasks.append(task)
     tasks.append('\n')
 
     # fcst
-    taskname = 'fcst'
     deps = []
+    dep_dict = {'type':'task', 'name':'%sanal' % cdump}
+    deps.append(rocoto.add_dependency(dep_dict))
     if cdump in ['gdas']:
-        dep_dict = {'name':'anal', 'type':'task'}
+        dep_dict = {'type':'cycleexist', 'condition':'not', 'offset':'-06:00:00'}
         deps.append(rocoto.add_dependency(dep_dict))
-        dep_dict = {'condition':'not', 'type':'cycleexist', 'offset':'-06:00:00'}
-        deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='or', dep=deps)
     elif cdump in ['gfs']:
-        dep_dict = {'name':'gfsanal', 'type':'task'}
-        deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
+    task = wfu.create_wf_task('fcst', cdump=cdump, envar=envars, dependency=dependencies)
 
     tasks.append(task)
     tasks.append('\n')
 
     # post
-    taskname = 'post'
     deps = []
-    if cdump in ['gdas']:
-        dep_dict = {'name':'fcst', 'type':'task'}
-    elif cdump in ['gfs']:
-        dep_dict = {'name':'gfsfcst', 'type':'task'}
+    dep_dict = {'type':'task', 'name':'%sfcst' % cdump}
     deps.append(rocoto.add_dependency(dep_dict))
     dependencies = rocoto.create_dependency(dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
+    task = wfu.create_wf_task('post', cdump=cdump, envar=envars, dependency=dependencies)
 
     tasks.append(task)
     tasks.append('\n')
 
     # vrfy
-    taskname = 'vrfy'
     deps = []
-    if cdump in ['gdas']:
-        dep_dict = {'name':'post', 'type':'task'}
-    elif cdump in ['gfs']:
-        dep_dict = {'name':'gfspost', 'type':'task'}
+    dep_dict = {'type':'task', 'name':'%spost' % cdump}
     deps.append(rocoto.add_dependency(dep_dict))
     dependencies = rocoto.create_dependency(dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
+    task = wfu.create_wf_task('vrfy', cdump=cdump, envar=envars, dependency=dependencies)
 
     tasks.append(task)
     tasks.append('\n')
 
     # arch
-    taskname = 'arch'
     deps = []
-    if cdump in ['gdas']:
-        dep_dict = {'name':'vrfy', 'type':'task'}
-    elif cdump in ['gfs']:
-        dep_dict = {'name':'gfsvrfy', 'type':'task'}
+    dep_dict = {'type':'task', 'name':'%svrfy' % cdump}
     deps.append(rocoto.add_dependency(dep_dict))
-    dependencies = rocoto.create_dependency(dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
+    dep_dict = {'type':'streq', 'left':'&ARCHIVE_TO_HPSS;', 'right':'YES'}
+    deps.append(rocoto.add_dependency(dep_dict))
+    dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
+    task = wfu.create_wf_task('arch', cdump=cdump, envar=envars, dependency=dependencies)
 
     tasks.append(task)
     tasks.append('\n')
@@ -631,99 +366,88 @@
     tasks = []
 
     # eobs
-    taskname = 'eobs'
     deps = []
-    dep_dict = {'name':'prep', 'type':'task'}
+    dep_dict = {'type':'task', 'name':'%sprep' % cdump}
     deps.append(rocoto.add_dependency(dep_dict))
-    dep_dict = {'name':'epos', 'type':'task', 'offset':'-06:00:00'}
+    dep_dict = {'type':'task', 'name':'%sepos' % cdump, 'offset':'-06:00:00'}
     deps.append(rocoto.add_dependency(dep_dict))
     dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
+    task = wfu.create_wf_task('eobs', cdump=cdump, envar=envars, dependency=dependencies)
 
     tasks.append(task)
     tasks.append('\n')
 
     # eomn, eomg
-    metataskname = 'eomn'
     varname = 'grp'
     varval = EOMGGROUPS
-    taskname = 'eomg'
     deps = []
-    dep_dict = {'name':'eobs', 'type':'task'}
+    dep_dict = {'type':'task', 'name':'%seobs' % cdump}
     deps.append(rocoto.add_dependency(dep_dict))
     dependencies = rocoto.create_dependency(dep=deps)
     eomgenvars = envars + [ensgrp]
-    task = create_wf_task(taskname, cdump=cdump, envar=eomgenvars, dependency=dependencies, \
-           metatask=metataskname, varname=varname, varval=varval)
+    task = wfu.create_wf_task('eomg', cdump=cdump, envar=eomgenvars, dependency=dependencies, metatask='eomn', varname=varname, varval=varval)
 
     tasks.append(task)
     tasks.append('\n')
 
     # eupd
-    taskname = 'eupd'
     deps = []
-    dep_dict = {'name':'eomn', 'type':'metatask'}
+    dep_dict = {'type':'metatask', 'name':'%seomn' % cdump}
     deps.append(rocoto.add_dependency(dep_dict))
     dependencies = rocoto.create_dependency(dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
+    task = wfu.create_wf_task('eupd', cdump=cdump, envar=envars, dependency=dependencies)
 
     tasks.append(task)
     tasks.append('\n')
 
     # ecen
-    taskname = 'ecen'
     deps = []
-    dep_dict = {'name':'anal', 'type':'task'}
+    dep_dict = {'type':'task', 'name':'%sanal' % cdump}
     deps.append(rocoto.add_dependency(dep_dict))
-    dep_dict = {'name':'eupd', 'type':'task'}
+    dep_dict = {'type':'task', 'name':'%seupd' % cdump}
     deps.append(rocoto.add_dependency(dep_dict))
     dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
+    task = wfu.create_wf_task('ecen', cdump=cdump, envar=envars, dependency=dependencies)
 
     tasks.append(task)
     tasks.append('\n')
 
-    # efmn
-    metataskname = 'efmn'
+    # efmn, efcs
     varname = 'grp'
     varval = EFCSGROUPS
-    taskname = 'efcs'
     deps = []
-    dep_dict = {'name':'ecen', 'type':'task'}
+    dep_dict = {'type':'task', 'name':'%secen' % cdump}
     deps.append(rocoto.add_dependency(dep_dict))
-    dep_dict = {'condition':'not', 'type':'cycleexist', 'offset':'-06:00:00'}
+    dep_dict = {'type':'cycleexist', 'condition':'not', 'offset':'-06:00:00'}
     deps.append(rocoto.add_dependency(dep_dict))
     dependencies = rocoto.create_dependency(dep_condition='or', dep=deps)
     efcsenvars = envars + [ensgrp]
-    task = create_wf_task(taskname, cdump=cdump, envar=efcsenvars, dependency=dependencies, \
-           metatask=metataskname, varname=varname, varval=varval)
+    task = wfu.create_wf_task('efcs', cdump=cdump, envar=efcsenvars, dependency=dependencies, \
+           metatask='efmn', varname=varname, varval=varval)
 
     tasks.append(task)
     tasks.append('\n')
 
     # epos
-    taskname = 'epos'
     deps = []
-    dep_dict = {'name':'efmn', 'type':'metatask'}
+    dep_dict = {'type':'metatask', 'name':'%sefmn' % cdump}
     deps.append(rocoto.add_dependency(dep_dict))
     dependencies = rocoto.create_dependency(dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
+    task = wfu.create_wf_task('epos', cdump=cdump, envar=envars, dependency=dependencies)
 
     tasks.append(task)
     tasks.append('\n')
 
-    # eamn
-    metataskname = 'eamn'
+    # eamn, earc
     varname = 'grp'
     varval = EARCGROUPS
-    taskname = 'earc'
     deps = []
-    dep_dict = {'name':'epos', 'type':'task'}
+    dep_dict = {'type':'task', 'name':'%sepos' % cdump}
     deps.append(rocoto.add_dependency(dep_dict))
     dependencies = rocoto.create_dependency(dep=deps)
     earcenvars = envars + [ensgrp]
-    task = create_wf_task(taskname, cdump=cdump, envar=earcenvars, dependency=dependencies, \
-           metatask=metataskname, varname=varname, varval=varval)
+    task = wfu.create_wf_task('earc', cdump=cdump, envar=earcenvars, dependency=dependencies, \
+           metatask='eamn', varname=varname, varval=varval)
 
     tasks.append(task)
     tasks.append('\n')
@@ -767,47 +491,6 @@
     return ''.join(strings)
 
 
-def create_crontab(base, cronint=5):
-    '''
-        Create crontab to execute rocotorun every cronint (5) minutes
-    '''
-
-    # No point creating a crontab if rocotorun is not available.
-    rocotoruncmd = find_executable('rocotorun')
-    if rocotoruncmd is None:
-        print 'Failed to find rocotorun, crontab will not be created'
-        return
-
-    cronintstr = '*/%d * * * *' % cronint
-    rocotorunstr = '%s -d %s/%s.db -w %s/%s.xml' % (rocotoruncmd, base['EXPDIR'], base['PSLOT'], base['EXPDIR'], base['PSLOT'])
-
-    # On WCOSS, rocoto module needs to be loaded everytime cron runs
-    if base['machine'] in ['WCOSS']:
-        rocotoloadstr = '. /usrx/local/Modules/default/init/sh; module use -a /usrx/local/emc_rocoto/modulefiles; module load rocoto/20170119-master)'
-        rocotorunstr = '(%s %s)' % (rocotoloadstr, rocotorunstr)
-
-    strings = []
-
-    strings.append('# This is a basic crontab file and will execute rocotorun every 5 minutes\n')
-    strings.append('# Usage: crontab %s.crontab\n' % base['PSLOT'])
-    strings.append('#   list all crontabs:      crontab -l\n')
-    strings.append('#   remove current crontab: crontab -r %s.crontab\n' % base['PSLOT'])
-    strings.append('\n')
-
-    try:
-        REPLYTO = os.environ['REPLYTO']
-    except:
-        REPLYTO = ''
-    strings.append('MAILTO="%s"\n' % REPLYTO)
-    strings.append('%s %s\n' % (cronintstr, rocotorunstr))
-
-    fh = open(os.path.join(base['EXPDIR'], '%s.crontab' % base['PSLOT']), 'w')
-    fh.write(''.join(strings))
-    fh.close()
-
-    return
-
-
 def create_xml(dict_configs):
     '''
         Given a dictionary of sourced config files,
@@ -872,7 +555,7 @@
         xmlfile.append(hyb_tasks)
     if base['gfs_cyc'] != 0:
         xmlfile.append(gfs_tasks)
-    xmlfile.append(create_firstcyc_task())
+    xmlfile.append(wfu.create_firstcyc_task())
     xmlfile.append(workflow_footer)
 
     # Write the XML file
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/ush/workflow_utils.py
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/ush/workflow_utils.py	(nonexistent)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/ush/workflow_utils.py	(revision 96274)
@@ -0,0 +1,293 @@
+#!/usr/bin/env python
+
+###############################################################
+# < next few lines under version control, D O  N O T  E D I T >
+# $Date$
+# $Revision$
+# $Author$
+# $Id$
+###############################################################
+'''
+    Module containing functions all workflow setups require
+'''
+
+import os
+import sys
+import glob
+from distutils.spawn import find_executable
+from datetime import datetime, timedelta
+import shellvars
+import rocoto
+
+
+def get_configs(expdir):
+    '''
+        Given an experiment directory containing config files,
+        return a list of configs minus the ones ending with ".default"
+    '''
+
+    configs = glob.glob('%s/config.*' % expdir)
+
+    # remove any defaults from the list; popping while enumerating
+    # would skip the entry after each removal, so filter instead
+    configs = [config for config in configs if not config.endswith('.default')]
+
+    return configs
+
+
+def find_config(config_name, configs):
+
+    for config in configs:
+        if config_name == os.path.basename(config):
+            return config
+
+    # no match found
+    raise IOError("%s does not exist, ABORT!" % config_name)
+
+
+def source_configs(configs, tasks):
+    '''
+        Given a list of config files, source them
+        and return a dictionary for each task
+        Every task depends on config.base
+    '''
+
+    dict_configs = {}
+
+    # Return config.base as well
+    dict_configs['base'] = config_parser([find_config('config.base', configs)])
+
+    # Source the list of input tasks
+    for task in tasks:
+
+        files = []
+
+        files.append(find_config('config.base', configs))
+
+        if task in ['eobs', 'eomg']:
+            files.append(find_config('config.anal', configs))
+            files.append(find_config('config.eobs', configs))
+        elif task in ['eupd']:
+            files.append(find_config('config.anal', configs))
+            files.append(find_config('config.eupd', configs))
+        elif task in ['efcs']:
+            files.append(find_config('config.fcst', configs))
+            files.append(find_config('config.efcs', configs))
+        else:
+            files.append(find_config('config.%s' % task, configs))
+
+        print 'sourcing config.%s' % task
+        dict_configs[task] = config_parser(files)
+
+    return dict_configs
+
+
+def config_parser(files):
+    """
+    Given a config file or a list of config files, return the key-value pairs of all variables defined in them as a dictionary
+    :param files: config file or list of config files
+    :type files: list or str or unicode
+    :return: Key value pairs representing the environment variables defined
+            in the script.
+    :rtype: dict
+    """
+    sv = shellvars.ShellVars(files)
+    varbles = sv.get_vars()
+    for key,value in varbles.iteritems():
+        if any(x in key for x in ['CDATE','SDATE','EDATE']): # likely a date, convert to datetime
+            varbles[key] = datetime.strptime(value,'%Y%m%d%H')
+            continue
+        if '.' in value: # value contains a '.', so try a float first
+            try:
+                varbles[key] = float(value)
+            except ValueError:
+                varbles[key] = value
+        else: # otherwise it may still be a number, likely an integer
+            try:
+                varbles[key] = int(value)
+            except ValueError:
+                varbles[key] = value
+
+    return varbles
+
+
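For reference, the coercion rules in `config_parser` can be exercised standalone; the helper name `coerce_value` below is illustrative, not part of this commit:

```python
from datetime import datetime

# Minimal sketch of config_parser's value coercion: date-like keys become
# datetime objects, values with a '.' are tried as floats, the rest as ints,
# and anything that fails coercion stays a string.
def coerce_value(key, value):
    if any(x in key for x in ['CDATE', 'SDATE', 'EDATE']):
        return datetime.strptime(value, '%Y%m%d%H')
    if '.' in value:
        try:
            return float(value)
        except ValueError:
            return value
    try:
        return int(value)
    except ValueError:
        return value
```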
+def get_scheduler(machine):
+    '''
+        Determine the scheduler
+    '''
+
+    if machine in ['ZEUS', 'THEIA']:
+        return 'moabtorque'
+    elif machine in ['WCOSS']:
+        return 'lsf'
+    elif machine in ['WCOSS_C']:
+        return 'lsfcray'
+    else:
+        msg = 'Unknown machine: %s, ABORT!' % machine
+        raise Exception(msg)
+
+
+def create_wf_task(task, cdump='gdas', envar=None, dependency=None, \
+                   metatask=None, varname=None, varval=None, final=False):
+
+    if metatask is None:
+        taskstr = '%s' % task
+    else:
+        taskstr = '%s#%s#' % (task, varname)
+        metataskstr = '%s%s' % (cdump, metatask)
+        metatask_dict = {'metataskname': metataskstr, \
+                         'varname': '%s' % varname, \
+                         'varval': '%s' % varval}
+
+    taskstr = '%s%s' % (cdump, taskstr)
+
+    task_dict = {'taskname': '%s' % taskstr, \
+                 'cycledef': '%s' % cdump, \
+                 'maxtries': '&MAXTRIES;', \
+                 'command': '&JOBS_DIR;/%s.sh' % task, \
+                 'jobname': '&PSLOT;_%s_@H' % taskstr, \
+                 'account': '&ACCOUNT;', \
+                 'queue': '&QUEUE_%s_%s;' % (task.upper(), cdump.upper()), \
+                 'walltime': '&WALLTIME_%s_%s;' % (task.upper(), cdump.upper()), \
+                 'native': '&NATIVE_%s_%s;' % (task.upper(), cdump.upper()), \
+                 'resources': '&RESOURCES_%s_%s;' % (task.upper(), cdump.upper()), \
+                 'log': '&ROTDIR;/logs/@Y@m@d@H/%s.log' % taskstr, \
+                 'envar': envar, \
+                 'dependency': dependency, \
+                 'final': final}
+
+    if metatask is None:
+        task = rocoto.create_task(task_dict)
+    else:
+        task = rocoto.create_metatask(task_dict, metatask_dict)
+    task = ''.join(task)
+
+    return task
+
+
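The naming convention `create_wf_task` enforces can be illustrated with a toy helper (not part of this commit, and independent of the rocoto module): the cdump prefixes both the task and the metatask name, and metatasks embed the `#varname#` placeholder Rocoto expands per group.

```python
# Illustration of create_wf_task's name composition.
def task_names(task, cdump='gdas', metatask=None, varname=None):
    if metatask is None:
        return '%s%s' % (cdump, task), None
    # metatasks get a per-group placeholder in the task name
    return '%s%s#%s#' % (cdump, task, varname), '%s%s' % (cdump, metatask)
```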
+def create_firstcyc_task(cdump='gdas'):
+    '''
+    This task is needed to finalize the first half cycle
+    '''
+
+    task = 'firstcyc'
+    taskstr = '%s' % task
+
+    deps = []
+    data = '&EXPDIR;/logs/@Y@m@d@H.log'
+    dep_dict = {'type':'data', 'data':data, 'offset':'24:00:00'}
+    deps.append(rocoto.add_dependency(dep_dict))
+    dep_dict = {'type':'cycleexist', 'condition':'not', 'offset':'-06:00:00'}
+    deps.append(rocoto.add_dependency(dep_dict))
+    dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
+
+    task_dict = {'taskname': '%s' % taskstr, \
+                 'cycledef': 'first', \
+                 'maxtries': '&MAXTRIES;', \
+                 'final' : True, \
+                 'command': 'sleep 1', \
+                 'jobname': '&PSLOT;_%s_@H' % taskstr, \
+                 'account': '&ACCOUNT;', \
+                 'queue': '&QUEUE_ARCH;', \
+                 'walltime': '&WALLTIME_ARCH_%s;' % cdump.upper(), \
+                 'native': '&NATIVE_ARCH_%s;' % cdump.upper(), \
+                 'resources': '&RESOURCES_ARCH_%s;' % cdump.upper(), \
+                 'log': '&ROTDIR;/logs/@Y@m@d@H/%s.log' % taskstr, \
+                 'dependency': dependencies}
+
+    task = rocoto.create_task(task_dict)
+
+    return ''.join(task)
+
+
+def get_gfs_interval(gfs_cyc):
+    '''
+        return interval in hours based on gfs_cyc
+    '''
+
+    # Map gfs_cyc to a cycle interval
+    interval = None  # gfs_cyc == 0 (or unrecognized) means no gfs cycles
+    if gfs_cyc == 1:
+        interval = '24:00:00'
+    elif gfs_cyc == 2:
+        interval = '12:00:00'
+    elif gfs_cyc == 4:
+        interval = '06:00:00'
+
+    return interval
+
+
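The mapping above is equivalent to a small table: `gfs_cyc` is the number of gfs cycles per day and the returned string becomes the Rocoto `<cycledef>` step.

```python
# Table form of get_gfs_interval (sketch, not the committed code).
GFS_CYC_INTERVAL = {1: '24:00:00', 2: '12:00:00', 4: '06:00:00'}

def gfs_interval(gfs_cyc):
    # None for gfs_cyc == 0 (no gfs cycles) or unknown values
    return GFS_CYC_INTERVAL.get(gfs_cyc)
```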
+def get_resources(machine, cfg, task, cdump='gdas'):
+
+    if cdump in ['gfs'] and 'wtime_%s_gfs' % task in cfg.keys():
+        wtimestr = cfg['wtime_%s_gfs' % task]
+    else:
+        wtimestr = cfg['wtime_%s' % task]
+
+    ltask = 'eobs' if task in ['eomg'] else task
+
+    memory = cfg.get('memory_%s' % ltask, None)
+    tasks = cfg['npe_%s' % ltask]
+    ppn = cfg['npe_node_%s' % ltask]
+
+    nodes = -(-tasks // ppn)  # ceiling division so a partial node is not dropped
+
+    if machine in ['ZEUS', 'THEIA']:
+        resstr = '<nodes>%d:ppn=%d</nodes>' % (nodes, ppn)
+
+    elif machine in ['WCOSS_C']:
+        resstr = '<nodes>%d:ppn=%d</nodes>' % (nodes, ppn)
+        if task in ['arch', 'earc', 'getic']:
+            resstr += '<shared></shared>'
+        else:
+            if memory is not None:
+                resstr += '<memory>%s</memory>' % str(memory)
+
+    elif machine in ['WCOSS']:
+        resstr = '<cores>%d</cores>' % tasks
+
+    queuestr = '&QUEUE_ARCH;' if task in ['arch', 'earc', 'getic'] else '&QUEUE;'
+
+    return wtimestr, resstr, queuestr
+
+
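Deriving the node count from the MPI task count and tasks-per-node is a ceiling division; a minimal sketch (the function name is illustrative):

```python
# Number of whole nodes needed for `tasks` ranks at `ppn` ranks per node.
def nodes_needed(tasks, ppn):
    return -(-tasks // ppn)  # ceil(tasks / ppn) without floats
```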
+def create_crontab(base, cronint=5):
+    '''
+        Create a crontab entry to execute rocotorun every cronint (default 5) minutes
+    '''
+
+    # No point creating a crontab if rocotorun is not available.
+    rocotoruncmd = find_executable('rocotorun')
+    if rocotoruncmd is None:
+        print 'Failed to find rocotorun, crontab will not be created'
+        return
+
+    cronintstr = '*/%d * * * *' % cronint
+    rocotorunstr = '%s -d %s/%s.db -w %s/%s.xml' % (rocotoruncmd, base['EXPDIR'], base['PSLOT'], base['EXPDIR'], base['PSLOT'])
+
+    # On WCOSS, the rocoto module needs to be loaded every time cron runs
+    if base['machine'] in ['WCOSS']:
+        rocotoloadstr = '. /usrx/local/Modules/default/init/sh; module use -a /usrx/local/emc_rocoto/modulefiles; module load rocoto/20170119-master'
+        rocotorunstr = '(%s %s)' % (rocotoloadstr, rocotorunstr)
+
+    try:
+        REPLYTO = os.environ['REPLYTO']
+    except KeyError:
+        REPLYTO = ''
+
+    strings = []
+
+    strings.append('\n')
+    strings.append('#################### %s ####################\n' % base['PSLOT'])
+    strings.append('MAILTO="%s"\n' % REPLYTO)
+    strings.append('%s %s\n' % (cronintstr, rocotorunstr))
+    strings.append('#################################################################\n')
+    strings.append('\n')
+
+    fh = open(os.path.join(base['EXPDIR'], '%s.crontab' % base['PSLOT']), 'w')
+    fh.write(''.join(strings))
+    fh.close()
+
+    return
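The crontab line `create_crontab` writes has a simple shape; a sketch with illustrative paths (not the committed code):

```python
# Build the '*/N * * * * rocotorun -d <db> -w <xml>' crontab line.
def crontab_entry(rocotoruncmd, expdir, pslot, cronint=5):
    cronintstr = '*/%d * * * *' % cronint
    rocotorunstr = '%s -d %s/%s.db -w %s/%s.xml' % (
        rocotoruncmd, expdir, pslot, expdir, pslot)
    return '%s %s' % (cronintstr, rocotorunstr)
```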

Property changes on: checkout/gfs_workflow.v15.0.0/fv3gfs/ush/workflow_utils.py
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Added: svn:keywords
## -0,0 +1 ##
+URL Author Id Revision Date
\ No newline at end of property
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/ush/setup_workflow_fcstonly.py
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/ush/setup_workflow_fcstonly.py	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/ush/setup_workflow_fcstonly.py	(revision 96274)
@@ -27,153 +27,44 @@
 
 import os
 import sys
-import glob
-from distutils.spawn import find_executable
 from datetime import datetime
 from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter
-import shellvars
 import rocoto
+import workflow_utils as wfu
 
 
-taskplan = ['getics', 'chgres', 'fcst', 'post', 'vrfy', 'arch']
+taskplan = ['getic', 'fv3ic', 'fcst', 'post', 'vrfy', 'arch']
 
 def main():
     parser = ArgumentParser(description='Setup XML workflow and CRONTAB for a forecast only experiment.', formatter_class=ArgumentDefaultsHelpFormatter)
     parser.add_argument('--expdir',help='full path to experiment directory containing config files', type=str, required=False, default=os.environ['PWD'])
+    parser.add_argument('--cdump',help='cycle to run forecasts', type=str, choices=['gdas', 'gfs'], default='gfs', required=False)
 
     args = parser.parse_args()
 
-    configs = get_configs(args.expdir)
+    configs = wfu.get_configs(args.expdir)
 
-    dict_configs = source_configs(args.expdir, configs)
+    _base = wfu.config_parser([wfu.find_config('config.base', configs)])
 
-    # First create workflow XML
-    create_xml(dict_configs)
-
-    # Next create the crontab
-    create_crontab(dict_configs['base'])
-
-    return
-
-
-def get_configs(expdir):
-    '''
-        Given an experiment directory containing config files,
-        return a list of configs minus the ones ending with ".default"
-    '''
-
-    configs = glob.glob('%s/config.*' % expdir)
-
-    # remove any defaults from the list
-    for c, config in enumerate(configs):
-        if config.endswith('.default'):
-            print 'dropping %s' % config
-            configs.pop(c)
-
-    return configs
-
-
-def find_config(config_name, configs):
-
-    for config in configs:
-        if config_name == os.path.basename(config):
-            return config
-
-    # no match found
-    raise IOError("%s does not exist, ABORT!" % config_name)
-
-
-def source_configs(expdir, configs):
-    '''
-        Given alist of config files
-        source the configs and return a dictionary for each task
-    '''
-
-    dict_configs = {}
-
-    # First read "config.base", gather basic info
-    print 'sourcing config.%s' % 'base'
-    dict_configs['base'] = config_parser(find_config('config.base', configs))
-    base = dict_configs['base']
-
-    if expdir != base['EXPDIR']:
+    if args.expdir != _base['EXPDIR']:
         print 'MISMATCH in experiment directories!'
-        print 'config.base: EXPDIR = %s' % base['EXPDIR']
-        print 'input arg:     --expdir = %s' % expdir
+        print 'config.base: EXPDIR = %s' % _base['EXPDIR']
+        print 'input arg: --expdir = %s' % args.expdir
         sys.exit(1)
 
-    # GDAS/GFS tasks
-    for task in taskplan:
+    dict_configs = wfu.source_configs(configs, taskplan)
 
-        files = []
-        files.append(find_config('config.base', configs))
-        files.append(find_config('config.%s' % task, configs))
+    dict_configs['base']['CDUMP'] = args.cdump
 
-        print 'sourcing config.%s' % task
-        dict_configs[task] = config_parser(files)
+    # First create workflow XML
+    create_xml(dict_configs)
 
-    return dict_configs
+    # Next create the crontab
+    wfu.create_crontab(dict_configs['base'])
 
+    return
 
-def config_parser(filename):
-    """
-    Given the name of config file, key-value pair of all variables in the config file is returned as a dictionary
-    :param filename: config file
-    :type filename: str or unicode
-    :return: Key value pairs representing the environment variables defined
-            in the script.
-    :rtype: dict
-    """
-    sv = shellvars.ShellVars(filename)
-    varbles = sv.get_vars()
-    for key,value in varbles.iteritems():
-        if any(x in key for x in ['CDATE','SDATE','EDATE']): # likely a date, convert to datetime
-            varbles[key] = datetime.strptime(value,'%Y%m%d%H')
-            continue
-        if '.' in value: # Likely a number and that too a float
-            try:
-                varbles[key] = float(value)
-            except ValueError:
-                varbles[key] = value
-        else: # Still could be a number, may be an integer
-            try:
-                varbles[key] = int(value)
-            except ValueError:
-                varbles[key] = value
 
-    return varbles
-
-
-def get_scheduler(machine):
-    '''
-        Determine the scheduler
-    '''
-
-    if machine in ['ZEUS', 'THEIA']:
-        return 'moabtorque'
-    elif machine in ['WCOSS']:
-        return 'lsf'
-    elif machine in ['WCOSS_C']:
-        return 'lsfcray'
-    else:
-        msg = 'Unknown machine: %s, ABORT!' % machine
-        Exception.__init__(self,msg)
-
-
-def get_interval(cyc_input):
-    '''
-        return interval in hours based on gfs_cyc like input
-    '''
-
-    # Get interval from cyc_input
-    if cyc_input == 1:
-        return '24:00:00'
-    elif cyc_input == 2:
-        return '12:00:00'
-    elif cyc_input == 4:
-        return '06:00:00'
-
-
 def get_preamble():
     '''
         Generate preamble for XML
@@ -207,19 +98,23 @@
     strings = []
 
     strings.append('\n')
-    strings.append('\t<!-- User definitions -->\n')
-    strings.append('\t<!ENTITY LOGNAME "%s">\n' % os.environ['USER'])
-    strings.append('\n')
-    strings.append('\t<!-- Experiment parameters such as name, starting, ending dates -->\n')
+    strings.append('\t<!-- Experiment parameters such as name, cycle, resolution -->\n')
     strings.append('\t<!ENTITY PSLOT    "%s">\n' % base['PSLOT'])
     strings.append('\t<!ENTITY CDUMP    "%s">\n' % base['CDUMP'])
+    strings.append('\t<!ENTITY CASE     "%s">\n' % base['CASE'])
+    strings.append('\n')
+    strings.append('\t<!-- Experiment parameters such as starting, ending dates -->\n')
     strings.append('\t<!ENTITY SDATE    "%s">\n' % base['SDATE'].strftime('%Y%m%d%H%M'))
     strings.append('\t<!ENTITY EDATE    "%s">\n' % base['EDATE'].strftime('%Y%m%d%H%M'))
+    if base['INTERVAL'] is None:
+        print 'cycle INTERVAL cannot be None'
+        sys.exit(1)
     strings.append('\t<!ENTITY INTERVAL "%s">\n' % base['INTERVAL'])
     strings.append('\n')
-    strings.append('\t<!-- Experiment and Rotation directory -->\n')
+    strings.append('\t<!-- Experiment related directories -->\n')
     strings.append('\t<!ENTITY EXPDIR "%s">\n' % base['EXPDIR'])
     strings.append('\t<!ENTITY ROTDIR "%s">\n' % base['ROTDIR'])
+    strings.append('\t<!ENTITY ICSDIR "%s">\n' % base['ICSDIR'])
     strings.append('\n')
     strings.append('\t<!-- Directories for driving the workflow -->\n')
     strings.append('\t<!ENTITY JOBS_DIR "%s/fv3gfs/jobs">\n' % base['BASE_WORKFLOW'])
@@ -228,8 +123,11 @@
     strings.append('\t<!ENTITY ACCOUNT    "%s">\n' % base['ACCOUNT'])
     strings.append('\t<!ENTITY QUEUE      "%s">\n' % base['QUEUE'])
     strings.append('\t<!ENTITY QUEUE_ARCH "%s">\n' % base['QUEUE_ARCH'])
-    strings.append('\t<!ENTITY SCHEDULER  "%s">\n' % get_scheduler(base['machine']))
+    strings.append('\t<!ENTITY SCHEDULER  "%s">\n' % wfu.get_scheduler(base['machine']))
     strings.append('\n')
+    strings.append('\t<!-- Toggle HPSS archiving -->\n')
+    strings.append('\t<!ENTITY ARCHIVE_TO_HPSS "YES">\n')
+    strings.append('\n')
     strings.append('\t<!-- ROCOTO parameters that control workflow -->\n')
     strings.append('\t<!ENTITY CYCLETHROTTLE "2">\n')
     strings.append('\t<!ENTITY TASKTHROTTLE  "20">\n')
@@ -246,93 +144,32 @@
 
     strings = []
 
+    strings.append('\t<!-- BEGIN: Resource requirements for the workflow -->\n')
     strings.append('\n')
 
-    base = dict_configs['base']
+    machine = dict_configs['base']['machine']
 
     for task in taskplan:
 
         cfg = dict_configs[task]
 
-        if cdump in ['gfs'] and 'wtime_%s_gfs' % task in cfg.keys():
-            walltime = cfg['wtime_%s_gfs' % task]
-        else:
-            walltime = cfg['wtime_%s' % task]
+        wtimestr, resstr, queuestr = wfu.get_resources(machine, cfg, task, cdump=cdump)
 
-        tasks = cfg['npe_%s' % task]
-        ppn = cfg['npe_node_%s' % task]
-        nodes = tasks / ppn
-        memory = cfg['memory_%s' % task] if 'memory_%s' % task in cfg.keys() else None
+        taskstr = '%s_%s' % (task.upper(), cdump.upper())
 
-        if base['machine'] in ['THEIA']:
-            resstr = '<nodes>%d:ppn=%d</nodes>' % (nodes, ppn)
-
-        elif base['machine'] in ['WCOSS_C']:
-            resstr = '<nodes>%d:ppn=%d</nodes>' % (nodes, ppn)
-            if task in ['getics', 'arch']:
-                resstr += '<shared></shared>'
-            else:
-                if memory is not None:
-                    resstr += '<memory>%s</memory>' % str(memory)
-
-        elif base['machine'] in ['WCOSS']:
-            resstr = '<cores>%d</cores>' % (tasks)
-
-        taskstr = '%s' % task.upper()
-        queuestr = '&QUEUE_ARCH;' if task in ['getics', 'arch'] else '&QUEUE;'
-
         strings.append('\t<!ENTITY QUEUE_%s     "%s">\n' % (taskstr, queuestr))
-        strings.append('\t<!ENTITY WALLTIME_%s  "%s">\n' % (taskstr, walltime))
+        strings.append('\t<!ENTITY WALLTIME_%s  "%s">\n' % (taskstr, wtimestr))
         strings.append('\t<!ENTITY RESOURCES_%s "%s">\n' % (taskstr, resstr))
         strings.append('\t<!ENTITY NATIVE_%s    "">\n'   % (taskstr))
 
         strings.append('\n')
 
+    strings.append('\t<!-- END: Resource requirements for the workflow -->\n')
+
     return ''.join(strings)
 
 
-def create_wf_task(task, cdump='gdas', envar=None, dependency=None, \
-                   metatask=None, varname=None, varval=None):
-
-    if metatask is None:
-        taskstr = '%s' % task
-    else:
-        taskstr = '%s#%s#' % (task, varname)
-        metataskstr = '%s' % metatask
-        metatask_dict = {'metataskname': metataskstr, \
-                         'varname': '%s' % varname, \
-                         'varval': '%s' % varval}
-
-    if cdump in ['gfs']:
-        cdumpstr = '_GFS'
-        taskstr = 'gfs%s' % taskstr
-    elif cdump in ['gdas']:
-        cdumpstr = ''
-
-    task_dict = {'taskname': '%s' % taskstr, \
-                 'cycledef': '%s' % cdump, \
-                 'maxtries': '&MAXTRIES;', \
-                 'command': '&JOBS_DIR;/%s.sh' % task, \
-                 'jobname': '&PSLOT;_%s_@H' % taskstr, \
-                 'account': '&ACCOUNT;', \
-                 'queue': '&QUEUE_%s%s;' % (task.upper(), cdumpstr), \
-                 'walltime': '&WALLTIME_%s%s;' % (task.upper(), cdumpstr), \
-                 'native': '&NATIVE_%s%s;' % (task.upper(), cdumpstr), \
-                 'resources': '&RESOURCES_%s%s;' % (task.upper(), cdumpstr), \
-                 'log': '&ROTDIR;/logs/@Y@m@d@H/%s.log' % taskstr, \
-                 'envar': envar, \
-                 'dependency': dependency}
-
-    if metatask is None:
-        task = rocoto.create_task(task_dict)
-    else:
-        task = rocoto.create_metatask(task_dict, metatask_dict)
-    task = ''.join(task)
-
-    return task
-
-
-def get_fcstonly_workflow(cdump='gdas'):
+def get_workflow(cdump='gdas'):
     '''
         Create tasks for forecast only workflow
     '''
@@ -340,69 +177,76 @@
     envars = []
     envars.append(rocoto.create_envar(name='EXPDIR', value='&EXPDIR;'))
     envars.append(rocoto.create_envar(name='CDATE', value='<cyclestr>@Y@m@d@H</cyclestr>'))
-    envars.append(rocoto.create_envar(name='CDUMP', value='%s' % cdump))
+    envars.append(rocoto.create_envar(name='CDUMP', value='&CDUMP;'))
 
     tasks = []
 
     # getics
-    taskname = 'getics'
-    task = create_wf_task(taskname, cdump=cdump, envar=envars)
-
+    deps = []
+    data = '&ICSDIR;/@Y@m@d@H/&CDUMP;/siganl.&CDUMP;.@Y@m@d@H'
+    dep_dict = {'type':'data', 'data':data}
+    deps.append(rocoto.add_dependency(dep_dict))
+    data = '&ICSDIR;/@Y@m@d@H/&CDUMP;/sfcanl.&CDUMP;.@Y@m@d@H'
+    dep_dict = {'type':'data', 'data':data}
+    deps.append(rocoto.add_dependency(dep_dict))
+    deps = rocoto.create_dependency(dep_condition='and', dep=deps)
+    dependencies = rocoto.create_dependency(dep_condition='not', dep=deps)
+    task = wfu.create_wf_task('getic', cdump=cdump, envar=envars, dependency=dependencies)
     tasks.append(task)
     tasks.append('\n')
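The getic dependency above fires the task only when NOT all initial-condition files are already present; the nesting can be illustrated without the real rocoto module (helper names are illustrative):

```python
# Toy builders for the nested Rocoto dependency XML shape used by getic.
def datadep(path):
    return '<datadep>%s</datadep>' % path

def and_dep(deps):
    return '<and>%s</and>' % ''.join(deps)

def not_dep(dep):
    return '<not>%s</not>' % dep
```

For example, `not_dep(and_dep([datadep('siganl'), datadep('sfcanl')]))` mirrors the "fetch only if either analysis file is missing" condition.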
 
     # chgres
-    taskname = 'chgres'
     deps = []
-    dep_dict = {'name':'getics', 'type':'task'}
+    data = '&ICSDIR;/@Y@m@d@H/&CDUMP;/siganl.&CDUMP;.@Y@m@d@H'
+    dep_dict = {'type':'data', 'data':data}
     deps.append(rocoto.add_dependency(dep_dict))
-    dependencies = rocoto.create_dependency(dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
-
+    data = '&ICSDIR;/@Y@m@d@H/&CDUMP;/sfcanl.&CDUMP;.@Y@m@d@H'
+    dep_dict = {'type':'data', 'data':data}
+    deps.append(rocoto.add_dependency(dep_dict))
+    dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
+    task = wfu.create_wf_task('fv3ic', cdump=cdump, envar=envars, dependency=dependencies)
     tasks.append(task)
     tasks.append('\n')
 
     # fcst
-    taskname = 'fcst'
     deps = []
-    dep_dict = {'name':'chgres', 'type':'task'}
+    data = '&ICSDIR;/@Y@m@d@H/&CDUMP;/&CASE;/INPUT/gfs_data.tile6.nc'
+    dep_dict = {'type':'data', 'data':data}
     deps.append(rocoto.add_dependency(dep_dict))
-    dependencies = rocoto.create_dependency(dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
-
+    data = '&ICSDIR;/@Y@m@d@H/&CDUMP;/&CASE;/INPUT/sfc_data.tile6.nc'
+    dep_dict = {'type':'data', 'data':data}
+    deps.append(rocoto.add_dependency(dep_dict))
+    dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
+    task = wfu.create_wf_task('fcst', cdump=cdump, envar=envars, dependency=dependencies)
     tasks.append(task)
     tasks.append('\n')
 
     # post
-    taskname = 'post'
     deps = []
-    dep_dict = {'name':'fcst', 'type':'task'}
+    dep_dict = {'type':'task', 'name':'%sfcst' % cdump}
     deps.append(rocoto.add_dependency(dep_dict))
     dependencies = rocoto.create_dependency(dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
-
+    task = wfu.create_wf_task('post', cdump=cdump, envar=envars, dependency=dependencies)
     tasks.append(task)
     tasks.append('\n')
 
     # vrfy
-    taskname = 'vrfy'
     deps = []
-    dep_dict = {'name':'post', 'type':'task'}
+    dep_dict = {'type':'task', 'name':'%spost' % cdump}
     deps.append(rocoto.add_dependency(dep_dict))
     dependencies = rocoto.create_dependency(dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
-
+    task = wfu.create_wf_task('vrfy', cdump=cdump, envar=envars, dependency=dependencies)
     tasks.append(task)
     tasks.append('\n')
 
     # arch
-    taskname = 'arch'
     deps = []
-    dep_dict = {'name':'vrfy', 'type':'task'}
+    dep_dict = {'type':'task', 'name':'%svrfy' % cdump}
     deps.append(rocoto.add_dependency(dep_dict))
-    dependencies = rocoto.create_dependency(dep=deps)
-    task = create_wf_task(taskname, cdump=cdump, envar=envars, dependency=dependencies)
-
+    dep_dict = {'type':'streq', 'left':'&ARCHIVE_TO_HPSS;', 'right':'YES'}
+    deps.append(rocoto.add_dependency(dep_dict))
+    dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
+    task = wfu.create_wf_task('arch', cdump=cdump, envar=envars, dependency=dependencies, final=True)
     tasks.append(task)
     tasks.append('\n')
 
@@ -409,7 +253,7 @@
     return ''.join(tasks)
 
 
-def get_workflow_body():
+def get_workflow_body(cdump='gdas'):
     '''
         Create the workflow body
     '''
@@ -424,9 +268,9 @@
     strings.append('\t<log verbosity="10"><cyclestr>&EXPDIR;/logs/@Y@m@d@H.log</cyclestr></log>\n')
     strings.append('\n')
     strings.append('\t<!-- Define the cycles -->\n')
-    strings.append('\t<cycledef group="fcstonly">&SDATE; &EDATE; &INTERVAL;</cycledef>\n')
+    strings.append('\t<cycledef group="%s">&SDATE; &EDATE; &INTERVAL;</cycledef>\n' % cdump)
     strings.append('\n')
-    strings.append(get_fcstonly_workflow())
+    strings.append(get_workflow(cdump=cdump))
     strings.append('\n')
     strings.append('</workflow>\n')
 
@@ -433,48 +277,6 @@
     return ''.join(strings)
 
 
-def create_crontab(base, cronint=5):
-    '''
-        Create crontab to execute rocotorun every cronint (5) minutes
-    '''
-
-    # No point creating a crontab if rocotorun is not available.
-    rocotoruncmd = find_executable('rocotorun')
-    if rocotoruncmd is None:
-        print 'Failed to find rocotorun, crontab will not be created'
-        return
-
-
-    cronintstr = '*/%d * * * *' % cronint
-    rocotorunstr = '%s -d %s/%s.db -w %s/%s.xml' % (rocotoruncmd, base['EXPDIR'], base['PSLOT'], base['EXPDIR'], base['PSLOT'])
-
-    # On WCOSS, rocoto module needs to be loaded everytime cron runs
-    if base['machine'] in ['WCOSS']:
-        rocotoloadstr = '. /usrx/local/Modules/default/init/sh; module use -a /usrx/local/emc_rocoto/modulefiles; module load rocoto/20170119-master)'
-        rocotorunstr = '(%s %s)' % (rocotoloadstr, rocotorunstr)
-
-    strings = []
-
-    strings.append('# This is a basic crontab file and will execute rocotorun every 5 minutes\n')
-    strings.append('# Usage: crontab %s.crontab\n' % base['PSLOT'])
-    strings.append('#   list all crontabs:      crontab -l\n')
-    strings.append('#   remove current crontab: crontab -r %s.crontab\n' % base['PSLOT'])
-    strings.append('\n')
-
-    try:
-        REPLYTO = os.environ['REPLYTO']
-    except:
-        REPLYTO = ''
-    strings.append('MAILTO="%s"\n' % REPLYTO)
-    strings.append('%s %s\n' % (cronintstr, rocotorunstr))
-
-    fh = open(os.path.join(base['EXPDIR'], '%s.crontab' % base['PSLOT']), 'w')
-    fh.write(''.join(strings))
-    fh.close()
-
-    return
-
-
 def create_xml(dict_configs):
     '''
         Given an experiment directory containing config files and
@@ -482,13 +284,13 @@
     '''
 
 
-    dict_configs['base']['INTERVAL'] = get_interval(dict_configs['base']['gfs_cyc'])
+    dict_configs['base']['INTERVAL'] = wfu.get_gfs_interval(dict_configs['base']['gfs_cyc'])
     base = dict_configs['base']
 
     preamble = get_preamble()
     definitions = get_definitions(base)
     resources = get_resources(dict_configs, cdump=base['CDUMP'])
-    workflow = get_workflow_body()
+    workflow = get_workflow_body(cdump=base['CDUMP'])
 
     # Start writing the XML file
     fh = open('%s/%s.xml' % (base['EXPDIR'], base['PSLOT']), 'w')
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/ush/setup_expt.py
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/ush/setup_expt.py	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/ush/setup_expt.py	(revision 96274)
@@ -15,16 +15,14 @@
 from datetime import datetime
 from argparse import ArgumentParser,ArgumentDefaultsHelpFormatter
 
+global machines
+global expdir, configdir, comrot, pslot, resdet, resens, nens, cdump, idate, edate, gfs_cyc
 
 machines = ['THEIA', 'WCOSS_C']
 
 
-def create_EXPDIR(expdir, configdir):
+def create_EXPDIR():
 
-    if configdir is None:
-        msg = 'Input argument --configdir is required to create EXPDIR'
-        raise IOError(msg)
-    if os.path.exists(expdir): shutil.rmtree(expdir)
     os.makedirs(expdir)
     configs = glob.glob('%s/config.*' % configdir)
     if len(configs) == 0:
@@ -36,35 +34,37 @@
     return
 
 
-def create_COMROT(comrot, icsdir, idate, cdump='gdas'):
+def create_COMROT():
 
     idatestr = idate.strftime('%Y%m%d%H')
     cymd = idate.strftime('%Y%m%d')
     chh = idate.strftime('%H')
 
-    if os.path.exists(comrot): shutil.rmtree(comrot)
     os.makedirs(comrot)
 
     # Link ensemble member initial conditions
-    enkfdir = 'enkf.%s.%s/%s' % (cdump,cymd,chh)
-    os.makedirs(os.path.join(comrot,enkfdir))
-    for i in range(1,nens+1):
-        os.makedirs(os.path.join(comrot,enkfdir,'mem%03d'%i))
-        os.symlink(os.path.join(icsdir,idatestr,'C%d'%resens,'mem%03d'%i,'INPUT'),os.path.join(comrot,enkfdir,'mem%03d'%i,'INPUT'))
+    enkfdir = 'enkf.%s.%s/%s' % (cdump, cymd, chh)
+    os.makedirs(os.path.join(comrot, enkfdir))
+    for i in range(1, nens+1):
+        os.makedirs(os.path.join(comrot, enkfdir, 'mem%03d'%i))
+        os.symlink(os.path.join(icsdir, idatestr, 'C%d'%resens, 'mem%03d'%i, 'INPUT'),
+                   os.path.join(comrot, enkfdir, 'mem%03d'%i, 'INPUT'))
 
     # Link deterministic initial conditions
-    detdir = '%s.%s/%s' % (cdump,cymd,chh)
-    os.makedirs(os.path.join(comrot,detdir))
-    os.symlink(os.path.join(icsdir,idatestr,'C%d'%resdet,'control','INPUT'),os.path.join(comrot,detdir,'INPUT'))
+    detdir = '%s.%s/%s' % (cdump, cymd, chh)
+    os.makedirs(os.path.join(comrot, detdir))
+    os.symlink(os.path.join(icsdir, idatestr, 'C%d'%resdet, 'control', 'INPUT'),
+               os.path.join(comrot,detdir,'INPUT'))
 
     # Link bias correction and radiance diagnostics files
     for fname in ['abias','abias_pc','abias_air','radstat']:
-        os.symlink(os.path.join(icsdir,idatestr,'%s.t%sz.%s'%(cdump,chh,fname)),os.path.join(comrot,detdir,'%s.t%sz.%s'%(cdump,chh,fname)))
+        os.symlink(os.path.join(icsdir, idatestr, '%s.t%sz.%s' % (cdump, chh, fname)),
+                   os.path.join(comrot, detdir, '%s.t%sz.%s' % (cdump, chh, fname)))
 
     return
 
 
-def edit_baseconfig(expdir, comrot, machine, pslot, resdet, resens, nens, idate):
+def edit_baseconfig():
 
     base_config = '%s/config.base' % expdir
 
@@ -88,6 +88,7 @@
 
     lines = [l.replace('@PSLOT@', pslot) for l in lines]
     lines = [l.replace('@SDATE@', idate.strftime('%Y%m%d%H')) for l in lines]
+    lines = [l.replace('@EDATE@', edate.strftime('%Y%m%d%H')) for l in lines]
     if expdir is not None:
         lines = [l.replace('@EXPDIR@', os.path.dirname(expdir)) for l in lines]
     if comrot is not None:
@@ -95,6 +96,10 @@
     lines = [l.replace('@CASECTL@', 'C%d'%resdet) for l in lines]
     lines = [l.replace('@CASEENS@', 'C%d'%resens) for l in lines]
     lines = [l.replace('@NMEM_ENKF@', '%d'%nens) for l in lines]
+    lines = [l.replace('@gfs_cyc@', '%d'%gfs_cyc) for l in lines]
+    lines_to_remove = ['ICSDIR']
+    for l in lines_to_remove:
+        lines = [ x for x in lines if "%s" % l not in x ]
     fh = open(base_config,'w')
     fh.writelines(lines)
     fh.close()
@@ -117,16 +122,18 @@
 
     parser = ArgumentParser(description=description,formatter_class=ArgumentDefaultsHelpFormatter)
     parser.add_argument('--machine', help='machine name', type=str, choices=machines, default='WCOSS_C', required=False)
-    parser.add_argument('--pslot', help='parallel experiment name', type=str, required=True)
-    parser.add_argument('--configdir', help='full path to directory containing the config files', type=str, required=False, default=None)
-    parser.add_argument('--idate', help='date of initial conditions', type=str, required=False, default='2016100100')
-    parser.add_argument('--icsdir', help='full path to initial condition directory', type=str, required=False,default='/scratch4/NCEPDEV/da/noscrub/Rahul.Mahajan/ICS')
+    parser.add_argument('--pslot', help='parallel experiment name', type=str, required=False, default='test')
     parser.add_argument('--resdet', help='resolution of the deterministic model forecast', type=int, required=False, default=384)
     parser.add_argument('--resens', help='resolution of the ensemble model forecast', type=int, required=False, default=192)
     parser.add_argument('--comrot', help='full path to COMROT', type=str, required=False, default=None)
     parser.add_argument('--expdir', help='full path to EXPDIR', type=str, required=False, default=None)
+    parser.add_argument('--idate', help='starting date of experiment, initial conditions must exist!', type=str, required=True)
+    parser.add_argument('--edate', help='ending date of experiment', type=str, required=True)
+    parser.add_argument('--icsdir', help='full path to initial condition directory', type=str, required=True)
+    parser.add_argument('--configdir', help='full path to directory containing the config files', type=str, required=True)
     parser.add_argument('--nens', help='number of ensemble members', type=int, required=False, default=80)
     parser.add_argument('--cdump', help='CDUMP to start the experiment', type=str, required=False, default='gdas')
+    parser.add_argument('--gfs_cyc', help='GFS cycles to run', type=int, choices=[0, 1, 2, 4], default=1, required=False)
 
     args = parser.parse_args()
 
@@ -134,6 +141,7 @@
     pslot = args.pslot
     configdir = args.configdir
     idate = datetime.strptime(args.idate,'%Y%m%d%H')
+    edate = datetime.strptime(args.edate,'%Y%m%d%H')
     icsdir = args.icsdir
     resdet = args.resdet
     resens = args.resens
@@ -141,33 +149,37 @@
     expdir = args.expdir if args.expdir is None else os.path.join(args.expdir,pslot)
     nens = args.nens
     cdump = args.cdump
+    gfs_cyc = args.gfs_cyc
 
     if not os.path.exists(icsdir):
         msg = 'Initial conditions do not exist in %s' % icsdir
         raise IOError(msg)
 
-    create_comrot = False if comrot is None else True
-    if create_comrot and os.path.exists(comrot):
+    # COMROT directory
+    create_comrot = True
+    if os.path.exists(comrot):
         print
         print 'COMROT already exists in %s' % comrot
         print
         overwrite_comrot = raw_input('Do you wish to over-write COMROT [y/N]: ')
         create_comrot = True if overwrite_comrot in ['y', 'yes', 'Y', 'YES'] else False
+        if create_comrot: shutil.rmtree(comrot)
 
-    create_expdir = False if expdir is None else True
-    if create_expdir and os.path.exists(expdir):
+    if create_comrot:
+        create_COMROT()
+
+    # EXP directory
+    create_expdir = True
+    if os.path.exists(expdir):
         print
         print 'EXPDIR already exists in %s' % expdir
         print
         overwrite_expdir = raw_input('Do you wish to over-write EXPDIR [y/N]: ')
         create_expdir = True if overwrite_expdir in ['y', 'yes', 'Y', 'YES'] else False
+        if create_expdir: shutil.rmtree(expdir)
 
-    # Create COMROT directory
-    if create_comrot:
-        create_COMROT(comrot, icsdir, idate, cdump=cdump)
+    if create_expdir:
+        create_EXPDIR()
+        edit_baseconfig()
 
-    # Create EXP directory, copy config and edit config.base
-    if create_expdir:
-        create_EXPDIR(expdir, configdir)
-        edit_baseconfig(expdir, comrot, machine, pslot, resdet, resens, nens, idate)
     sys.exit(0)
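The edit_baseconfig routine in the hunk above works by line-wise token substitution followed by filtering out lines that no longer apply. A minimal standalone sketch of that pattern (placeholder names and values here are illustrative, not the actual config.base contents):

```python
# Sketch of the substitute-and-filter pattern used by edit_baseconfig.
# The lines and tokens below are made up for illustration.
lines = [
    'export PSLOT="@PSLOT@"\n',
    'export SDATE=@SDATE@\n',
    'export ICSDIR="/path/to/ics"\n',
]

# Replace @KEY@ tokens with user-supplied values
lines = [l.replace('@PSLOT@', 'test') for l in lines]
lines = [l.replace('@SDATE@', '2016100100') for l in lines]

# Drop any line mentioning a variable that no longer applies
lines_to_remove = ['ICSDIR']
for key in lines_to_remove:
    lines = [x for x in lines if key not in x]

print(''.join(lines))
```

The same substitute-then-write flow is applied to every `@...@` token before `fh.writelines(lines)` rewrites config.base.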
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/ush/setup_expt_fcstonly.py
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/ush/setup_expt_fcstonly.py	(nonexistent)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/ush/setup_expt_fcstonly.py	(revision 96274)
@@ -0,0 +1,139 @@
+#!/usr/bin/env python
+
+###############################################################
+# < next few lines under version control, D O  N O T  E D I T >
+# $Date$
+# $Revision$
+# $Author$
+# $Id$
+###############################################################
+
+import os
+import sys
+import glob
+import shutil
+from datetime import datetime
+from argparse import ArgumentParser,ArgumentDefaultsHelpFormatter
+
+
+global machines
+global machine, pslot, sdate, edate, expdir, comrot, res, configdir, gfs_cyc
+
+machines = ['THEIA', 'WCOSS_C']
+
+
+def edit_baseconfig():
+
+    base_config = '%s/config.base' % expdir
+
+    # make a copy of the default before editing
+    shutil.copy(base_config, base_config+'.default')
+
+    fh = open(base_config,'r')
+    lines = fh.readlines()
+    fh.close()
+
+    lines = [l.replace('@MACHINE@', machine.upper()) for l in lines]
+
+    # Only keep current machine information, remove others
+    # A better way would be to cat from another machine specific file
+    for m in machines:
+        if m in [machine.upper()]:
+            continue
+        ind_begin = lines.index('# BEGIN: %s\n' % m)
+        ind_end = lines.index('# END: %s\n' % m)
+        lines = lines[:ind_begin] + lines[ind_end+1:]
+
+    lines = [l.replace('@PSLOT@', pslot) for l in lines]
+    lines = [l.replace('@SDATE@', sdate.strftime('%Y%m%d%H')) for l in lines]
+    lines = [l.replace('@EDATE@', edate.strftime('%Y%m%d%H')) for l in lines]
+    if expdir is not None:
+        lines = [l.replace('@EXPDIR@', os.path.dirname(expdir)) for l in lines]
+    if comrot is not None:
+        lines = [l.replace('@ROTDIR@', os.path.dirname(comrot)) for l in lines]
+    lines = [l.replace('@CASECTL@', 'C%d'%res) for l in lines]
+    lines = [l.replace('@gfs_cyc@', '%d'%gfs_cyc) for l in lines]
+    lines_to_remove = ['@CASEENS@', '@NMEM_ENKF@', 'RECENTER_ENKF']
+    for l in lines_to_remove:
+        lines = [ x for x in lines if "%s" % l not in x ]
+    fh = open(base_config,'w')
+    fh.writelines(lines)
+    fh.close()
+
+    print ''
+    print 'EDITED:  %s/config.base as per user input.' % expdir
+    print 'DEFAULT: %s/config.base.default is for reference only.' % expdir
+    print 'Please verify and delete the default file before proceeding.'
+    print ''
+
+    return
+
+
+if __name__ == '__main__':
+
+    description = '''Setup files and directories to start a GFS parallel.
+Create EXPDIR, copy config files and edit config.base
+Create empty COMROT experiment directory'''
+
+    parser = ArgumentParser(description=description,formatter_class=ArgumentDefaultsHelpFormatter)
+    parser.add_argument('--machine', help='machine name', type=str, choices=machines, default='WCOSS_C', required=False)
+    parser.add_argument('--pslot', help='parallel experiment name', type=str, required=False, default='test')
+    parser.add_argument('--configdir', help='full path to directory containing the config files', type=str, required=True)
+    parser.add_argument('--sdate', help='starting date of experiment', type=str, required=False, default='2016100100')
+    parser.add_argument('--edate', help='ending date of experiment', type=str, required=False, default='2016100200')
+    parser.add_argument('--res', help='resolution of the model', type=int, required=False, default=384)
+    parser.add_argument('--comrot', help='full path to COMROT', type=str, required=True)
+    parser.add_argument('--expdir', help='full path to EXPDIR', type=str, required=True)
+    parser.add_argument('--gfs_cyc', help='GFS cycles to run', type=int, choices=[0, 1, 2, 4], default=1, required=False)
+
+    args = parser.parse_args()
+
+    machine = args.machine
+    pslot = args.pslot
+    expdir = os.path.join(args.expdir, pslot)
+    comrot = os.path.join(args.comrot, pslot)
+    sdate = datetime.strptime(args.sdate,'%Y%m%d%H')
+    edate = datetime.strptime(args.edate,'%Y%m%d%H')
+    res = args.res
+    configdir = args.configdir
+    gfs_cyc = args.gfs_cyc
+
+    if machine not in machines:
+        print 'supported machines are %s' % ' '.join(machines)
+        print 'machine %s is unsupported, ABORT!' % machine
+        sys.exit(1)
+
+    # COMROT directory
+    create_comrot = True
+    if os.path.exists(comrot):
+        print
+        print 'COMROT already exists in %s' % comrot
+        print
+        overwrite_comrot = raw_input('Do you wish to over-write COMROT [y/N]: ')
+        create_comrot = True if overwrite_comrot in ['y', 'yes', 'Y', 'YES'] else False
+        if create_comrot: shutil.rmtree(comrot)
+
+    if create_comrot:
+        os.makedirs(comrot)
+
+    # EXPDIR directory
+    create_expdir = True
+    if os.path.exists(expdir):
+        print
+        print 'EXPDIR already exists in %s' % expdir
+        print
+        overwrite_expdir = raw_input('Do you wish to over-write EXPDIR [y/N]: ')
+        create_expdir = True if overwrite_expdir in ['y', 'yes', 'Y', 'YES'] else False
+        if create_expdir: shutil.rmtree(expdir)
+
+    if create_expdir:
+        os.makedirs(expdir)
+        configs = glob.glob('%s/config.*' % configdir)
+        if len(configs) == 0:
+            msg = 'no config files found in %s' % configdir
+            raise IOError(msg)
+        for config in configs:
+            shutil.copy(config, expdir)
+        edit_baseconfig()
+
+    sys.exit(0)

Property changes on: checkout/gfs_workflow.v15.0.0/fv3gfs/ush/setup_expt_fcstonly.py
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Added: svn:keywords
## -0,0 +1 ##
+Author Id Revision Date URL
\ No newline at end of property
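The new script's command-line interface can be exercised without creating COMROT or EXPDIR; a reduced sketch of its argument handling (a subset of the options, with the defaults shown in the diff) is:

```python
# Reduced sketch of setup_expt_fcstonly.py's argument parsing; only a
# subset of its options is reproduced here, for illustration.
from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter
from datetime import datetime

parser = ArgumentParser(description='Setup a forecast-only GFS parallel',
                        formatter_class=ArgumentDefaultsHelpFormatter)
parser.add_argument('--pslot', type=str, required=False, default='test')
parser.add_argument('--sdate', type=str, required=False, default='2016100100')
parser.add_argument('--edate', type=str, required=False, default='2016100200')
parser.add_argument('--gfs_cyc', type=int, choices=[0, 1, 2, 4], default=1)

# choices=[0, 1, 2, 4] rejects invalid cycle counts at parse time
args = parser.parse_args(['--gfs_cyc', '2'])
sdate = datetime.strptime(args.sdate, '%Y%m%d%H')
print(args.pslot, args.gfs_cyc, sdate.strftime('%Y%m%d%H'))
```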
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/ush/rocoto.py
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/ush/rocoto.py	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/ush/rocoto.py	(revision 96274)
@@ -65,7 +65,7 @@
     taskname = task_dict['taskname'] if 'taskname' in task_dict else 'demotask'
     cycledef = task_dict['cycledef'] if 'cycledef' in task_dict else 'democycle'
     maxtries = task_dict['maxtries'] if 'maxtries' in task_dict else 3
-    final = task_dict['final'] if 'final' in task_dict else None
+    final = task_dict['final'] if 'final' in task_dict else False
     command = task_dict['command'] if 'command' in task_dict else 'sleep 10'
     jobname = task_dict['jobname'] if 'jobname' in task_dict else 'demojob'
     account = task_dict['account'] if 'account' in task_dict else 'batch'
@@ -78,7 +78,7 @@
     dependency = task_dict['dependency'] if 'dependency' in task_dict else None
 
     str_maxtries = str(maxtries)
-    str_final = '' if final is None else ' final="%s"' % final
+    str_final = ' final="true"' if final else ''
     envar = envar if isinstance(envar, list) else [envar]
 
     strings = []
@@ -126,46 +126,30 @@
     :rtype: str
     '''
 
-    dep_condition= dep_dict['condition'] if 'condition' in dep_dict else None
-    dep_name = dep_dict['name'] if 'name' in dep_dict else None
-    dep_type = dep_dict['type'] if 'type' in dep_dict else None
-    dep_offset = dep_dict['offset'] if 'offset' in dep_dict else None
-    dep_data = dep_dict['data'] if 'data' in dep_dict else None
+    dep_condition = dep_dict.get('condition', None)
+    dep_type = dep_dict.get('type', None)
 
-    if dep_type in [None]:
-        string = '<'
-    elif dep_type in ['task']:
-        string = '<taskdep task="%s"' % dep_name
-    elif dep_type in ['metatask']:
-        string = '<metataskdep metatask="%s"' % dep_name
-    elif dep_type in ['cycleexist']:
-        if dep_offset is None:
-            msg = 'dep_offset cannot be None if dep_type is cycleexist'
-            raise msg
-        string = '<cycleexistdep'
+    if dep_type in ['task', 'metatask']:
+
+        string = add_task_tag(dep_dict)
+
     elif dep_type in ['data']:
-        if dep_data is None:
-            msg = 'dep_data cannot be None if dep_type is data'
-            raise msg
-        string = '<datadep>'
-    else:
-        msg = 'unknown dependency type = %s' % dep_type
-        raise msg
 
-    if dep_type in ['data']:
-        if dep_offset is not None:
-            string += '<cyclestr offset="%s">%s</cyclestr>' % (dep_offset, dep_data)
-        else:
-            if '@' in dep_data:
-                string += '<cyclestr>%s</cyclestr>' % dep_data
-            else:
-                string += '%s' % (dep_data)
+        string = add_data_tag(dep_dict)
 
-    if dep_type in ['data']:
-        string += '</datadep>'
+    elif dep_type in ['cycleexist']:
+
+        string = add_cycle_tag(dep_dict)
+
+    elif dep_type in ['streq', 'strneq']:
+
+        string = add_streq_tag(dep_dict)
+
     else:
-        string += '/>' if dep_offset is None else ' cycle_offset="%s"/>' % dep_offset
 
+        msg = 'Unknown dependency type %s' % dep_dict['type']
+        raise KeyError(msg)
+
     if dep_condition is not None:
         string = '<%s>%s</%s>' % (dep_condition, string, dep_condition)
 
@@ -172,6 +156,116 @@
     return string
 
 
+def add_task_tag(dep_dict):
+    '''
+    create a simple task or metatask tag
+    :param dep_dict: dependency key-value parameters
+    :type dep_dict: dict
+    :return: Rocoto simple task or metatask dependency
+    :rtype: str
+    '''
+
+    dep_type = dep_dict.get('type', None)
+    dep_name = dep_dict.get('name', None)
+    dep_offset = dep_dict.get('offset', None)
+
+    if dep_name is None:
+        msg = 'a %s name is necessary for %s dependency' % (dep_type, dep_type)
+        raise KeyError(msg)
+
+    string = '<'
+    string += '%sdep %s="%s"' % (dep_type, dep_type, dep_name)
+    if dep_offset is not None:
+        string += ' cycle_offset="%s"' % dep_offset
+    string += '/>'
+
+    return string
+
+def add_data_tag(dep_dict):
+    '''
+    create a simple data tag
+    :param dep_dict: dependency key-value parameters
+    :type dep_dict: dict
+    :return: Rocoto simple task or metatask dependency
+    :rtype: str
+    '''
+
+    dep_type = dep_dict.get('type', None)
+    dep_data = dep_dict.get('data', None)
+    dep_offset = dep_dict.get('offset', None)
+
+    if dep_data is None:
+        msg = 'a data value is necessary for %s dependency' % dep_type
+        raise KeyError(msg)
+
+    if dep_offset is None:
+        if '@' in dep_data:
+            offset_string_b = '<cyclestr>'
+            offset_string_e = '</cyclestr>'
+        else:
+            offset_string_b = ''
+            offset_string_e = ''
+    else:
+        offset_string_b = '<cyclestr offset="%s">' % dep_offset
+        offset_string_e = '</cyclestr>'
+
+    string = '<datadep>'
+    string += '%s%s%s' % (offset_string_b, dep_data, offset_string_e)
+    string += '</datadep>'
+
+    return string
+
+def add_cycle_tag(dep_dict):
+    '''
+    create a simple cycle exist tag
+    :param dep_dict: dependency key-value parameters
+    :type dep_dict: dict
+    :return: Rocoto simple task or metatask dependency
+    :rtype: str
+    '''
+
+    dep_type = dep_dict.get('type', None)
+    dep_offset = dep_dict.get('offset', None)
+
+    if dep_offset is None:
+        msg = 'an offset value is necessary for %s dependency' % dep_type
+        raise KeyError(msg)
+
+    string = '<cycleexistdep cycle_offset="%s"/>' % dep_offset
+
+    return string
+
+def add_streq_tag(dep_dict):
+    '''
+    create a simple string comparison tag
+    :param dep_dict: dependency key-value parameters
+    :type dep_dict: dict
+    :return: Rocoto simple task or metatask dependency
+    :rtype: str
+    '''
+
+    dep_type = dep_dict.get('type', None)
+    dep_left = dep_dict.get('left', None)
+    dep_right = dep_dict.get('right', None)
+
+    fail = False
+    msg = ''
+    if dep_left is None:
+        msg += 'a left value is necessary for %s dependency' % dep_type
+        fail = True
+    if dep_right is None:
+        if fail:
+            msg += '\n'
+        msg += 'a right value is necessary for %s dependency' % dep_type
+        fail = True
+    if fail:
+        raise KeyError(msg)
+
+    string = '<%s><left>%s</left><right>%s</right></%s>' % (dep_type, dep_left, dep_right, dep_type)
+
+    return string
+
+
 def create_dependency(dep_condition=None, dep=None):
     '''
     create a compound dependency given a list of dependencies, and a compounding condition
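Each refactored helper in rocoto.py now emits exactly one Rocoto XML fragment. A self-contained reimplementation of two of them (mirroring the logic in the hunk above rather than importing rocoto.py; the task name and data path in the calls are made up) shows the strings produced:

```python
# Standalone reimplementation of add_task_tag and add_data_tag from the
# rocoto.py diff above, for illustration only.

def add_task_tag(dep_dict):
    dep_type = dep_dict.get('type', None)
    dep_name = dep_dict.get('name', None)
    dep_offset = dep_dict.get('offset', None)
    if dep_name is None:
        raise KeyError('a %s name is necessary for %s dependency' % (dep_type, dep_type))
    # e.g. <taskdep task="..."/> or <metataskdep metatask="..."/>
    string = '<%sdep %s="%s"' % (dep_type, dep_type, dep_name)
    if dep_offset is not None:
        string += ' cycle_offset="%s"' % dep_offset
    string += '/>'
    return string

def add_data_tag(dep_dict):
    dep_data = dep_dict.get('data', None)
    dep_offset = dep_dict.get('offset', None)
    if dep_data is None:
        raise KeyError('a data value is necessary for data dependency')
    if dep_offset is None:
        # cycle-dependent paths (containing @ tokens) need <cyclestr>
        body = '<cyclestr>%s</cyclestr>' % dep_data if '@' in dep_data else dep_data
    else:
        body = '<cyclestr offset="%s">%s</cyclestr>' % (dep_offset, dep_data)
    return '<datadep>%s</datadep>' % body

print(add_task_tag({'type': 'task', 'name': 'anal', 'offset': '-06:00:00'}))
print(add_data_tag({'type': 'data', 'data': '@Y@m@d@H/gdas.atmanl'}))
```

add_dependency then wraps whichever fragment it gets in an optional condition tag, e.g. `<not>...</not>`, via the `dep_condition` key.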
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.arch
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.arch	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.arch	(revision 96274)
@@ -12,9 +12,7 @@
 
 echo "BEGIN: config.arch"
 
-# Task and thread configuration
-export wtime_arch="01:00:00"
-export npe_arch=1
-export npe_node_arch=1
+# Get task specific resources
+. $EXPDIR/config.resources arch
 
 echo "END: config.arch"
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.prepbufr
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.prepbufr	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.prepbufr	(revision 96274)
@@ -47,7 +47,8 @@
 
 # These variables largely eliminate the need for explicitly setting
 # USH directories, FIX files, PARM files, EXECutables below
-# The USER can overwrite components that why wish
+# The USER can overwrite components that they wish
+# e.g. PRVT is used from the GSI
 export HOMEobsproc_prep=$BASE_PREP
 export EXECPREP="$BASE_PREP/exec"
 export FIXPREP="$BASE_PREP/fix"
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.getic
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.getic	(nonexistent)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.getic	(revision 96274)
@@ -0,0 +1,28 @@
+#!/bin/ksh -x
+###############################################################
+# < next few lines under version control, D O  N O T  E D I T >
+# $Date$
+# $Revision$
+# $Author$
+# $Id$
+###############################################################
+
+########## config.getic ##########
+# Settings specific to fetching GFS initial conditions
+
+echo "BEGIN: config.getic"
+
+# Get task specific resources
+. $EXPDIR/config.resources getic
+
+# This is messy business, and the logic needs cleanup.
+# We should be supporting the NEMSGFS only
+export ictype="opsgfs" # initial conditions from opsgfs or nemsgfs
+
+# Provide a parallel experiment name and path to HPSS archive
+if [ $ictype = "nemsgfs" ]; then
+    export parexp="prnemsrn"
+    export HPSS_PAR_PATH="/5year/NCEPDEV/emc-global/emc.glopara/WCOSS_C/$parexp"
+fi
+
+echo "END: config.getic"

Property changes on: checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.getic
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Added: svn:keywords
## -0,0 +1 ##
+URL Author Id Revision Date
\ No newline at end of property
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.fv3ic
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.fv3ic	(nonexistent)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.fv3ic	(revision 96274)
@@ -0,0 +1,22 @@
+#!/bin/ksh -x
+###############################################################
+# < next few lines under version control, D O  N O T  E D I T >
+# $Date$
+# $Revision$
+# $Author$
+# $Id$
+###############################################################
+
+########## config.fv3ic ##########
+# Convert GFS initial conditions into FV3 initial conditions
+
+echo "BEGIN: config.fv3ic"
+
+# Task and thread configuration
+export wtime_fv3ic="00:30:00"
+export npe_fv3ic=24
+export npe_node_fv3ic=24
+
+export CHGRESTHREAD=$npe_node_fv3ic
+
+echo "END: config.fv3ic"

Property changes on: checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.fv3ic
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Added: svn:keywords
## -0,0 +1 ##
+URL Author Id Revision Date
\ No newline at end of property
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.base
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.base	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.base	(revision 96274)
@@ -114,7 +114,7 @@
 # Build paths relative to $BASE_SVN
 export BASE_WORKFLOW="$BASE_SVN/fv3gfs/trunk/gfs_workflow.v15.0.0"
 export BASE_GSM="$BASE_SVN/fv3gfs/trunk/global_shared.v15.0.0"
-export BASE_GSI="$BASE_SVN/gsi/branches/EXP-fv3gsi"
+export BASE_GSI="$BASE_SVN/gsi/trunk"
 export BASE_NEMSfv3gfs="$BASE_SVN/nems/apps/NEMSfv3gfs/trunk"
 export BASE_POST="$BASE_SVN/fv3gfs/tags/post4fv3"
 export BASE_PREP="$BASE_SVN/obsproc/releases/obsproc_prep_RB-4.0.0"
@@ -136,7 +136,7 @@
 # EXPERIMENT specific environment parameters
 export REALTIME="NO"
 export SDATE=@SDATE@
-export EDATE=@SDATE@
+export EDATE=@EDATE@
 export assim_freq=6
 export PSLOT="@PSLOT@"
 export EXPDIR="@EXPDIR@/$PSLOT"
@@ -143,6 +143,7 @@
 export ROTDIR="@ROTDIR@/$PSLOT"
 export RUNDIR="$STMP/RUNDIRS/$PSLOT"
 export ARCDIR="$NOSCRUB/archive/$PSLOT"
+export ICSDIR="$PTMP/FV3ICS"
 export ATARDIR="/NCEPDEV/emc-global/1year/$USER/$machine/scratch/$PSLOT"
 
 # Resolution specific parameters
@@ -150,8 +151,8 @@
 export CASE="@CASECTL@"
 export CASE_ENKF="@CASEENS@"
 
-# Surface cycle update frequence
-export FHCYC=6
+# Surface cycle update frequency
+export FHCYC=24
 
 # Output frequency of the forecast model (for cycling)
 export FHMIN=0
@@ -159,7 +160,7 @@
 export FHOUT=3
 
 # GFS cycle info
-export gfs_cyc=0 # 0: no GFS cycle, 1: 00Z only, 2: 00Z and 12Z only, 4: all 4 cycles.
+export gfs_cyc=@gfs_cyc@ # 0: no GFS cycle, 1: 00Z only, 2: 00Z and 12Z only, 4: all 4 cycles.
 
 # GFS output and frequency
 export FHMIN_GFS=0
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.resources
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.resources	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/config/config.resources	(revision 96274)
@@ -27,9 +27,9 @@
 
 if [ $step = "prep" -o $step = "prepbufr" ]; then
 
-    export wtime_prep="00:10:00"
-    export npe_prep=12
-    export npe_node_prep=12
+    eval "export wtime_$step='00:10:00'"
+    eval "export npe_$step=12"
+    eval "export npe_node_$step=12"
 
 elif [ $step = "anal" ]; then
 
@@ -42,7 +42,7 @@
 
     export wtime_fcst="00:10:00"
     export wtime_fcst_gfs="03:00:00"
-    export npe_fcst=`echo "$layout_x * $layout_y * 6" | bc`
+    export npe_fcst=$(echo "$layout_x * $layout_y * 6" | bc)
     export npe_node_fcst=12
     export memory_fcst="1024M"
 
@@ -62,11 +62,11 @@
     export npe_node_vrfy=1
     export memory_vrfy="3072M"
 
-elif [ $step = "arch" ]; then
+elif [ $step = "arch" -o $step = "earc" -o $step = "getic" ]; then
 
-    export wtime_arch="01:00:00"
-    export npe_arch=1
-    export npe_node_arch=1
+    eval "export wtime_$step='06:00:00'"
+    eval "export npe_$step=1"
+    eval "export npe_node_$step=1"
 
 elif [ $step = "eobs" -o $step = "eomg" ]; then
 
@@ -93,7 +93,7 @@
 elif [ $step = "efcs" ]; then
 
     export wtime_efcs="01:00:00"
-    export npe_efcs=`echo "$layout_x * $layout_y * 6" | bc`
+    export npe_efcs=$(echo "$layout_x * $layout_y * 6" | bc)
     export npe_node_efcs=24
     export memory_efcs="254M"
 
@@ -104,11 +104,10 @@
     export npe_node_epos=12
     export memory_epos="254M"
 
-elif [ $step = "earc" ]; then
+else
 
-    export wtime_earc="06:00:00"
-    export npe_earc=1
-    export npe_node_earc=1
+    echo "Invalid step = $step, ABORT!"
+    exit 2
 
 fi
 
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/env/THEIA.env
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/env/THEIA.env	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/env/THEIA.env	(revision 96274)
@@ -111,7 +111,8 @@
     [[ $NTHREADS_ECEN -gt $nth_max ]] && export NTHREADS_ECEN=$nth_max
     export APRUN_ECEN="$launcher ${npe_ecen:-$PBS_NP}"
 
-    export APRUN_CHGRES=""
+    [[ $CHGRESTHREAD -gt $npe_node_max ]] && export CHGRESTHREAD=$npe_node_max
+    export APRUN_CHGRES="time"
 
 elif [ $step = "epos" ]; then
 
@@ -121,4 +122,9 @@
     [[ $NTHREADS_EPOS -gt $nth_max ]] && export NTHREADS_EPOS=$nth_max
     export APRUN_EPOS="$launcher ${npe_epos:-$PBS_NP}"
 
+elif [ $step = "fv3ic" ]; then
+
+    [[ $CHGRESTHREAD -gt $npe_node_max ]] && export CHGRESTHREAD=$npe_node_max
+    export APRUN_CHGRES="time"
+
 fi
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/env/WCOSS_C.env
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/env/WCOSS_C.env	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/env/WCOSS_C.env	(revision 96274)
@@ -126,6 +126,7 @@
     [[ $NTHREADS_ECEN -gt $nth_max ]] && export NTHREADS_ECEN=$nth_max
     export APRUN_ECEN="$launcher -j 1 -n $npe_ecen -N $npe_node_ecen -d $NTHREADS_ECEN -cc depth"
 
+    [[ $CHGRESTHREAD -gt $npe_node_max ]] && export CHGRESTHREAD=$npe_node_max
     export APRUN_CHGRES="$launcher -j 1 -n 1 -N 1 -d $CHGRESTHREAD -cc depth"
 
 elif [ $step = "epos" ]; then
@@ -136,4 +137,9 @@
     [[ $NTHREADS_EPOS -gt $nth_max ]] && export NTHREADS_EPOS=$nth_max
     export APRUN_EPOS="$launcher -j 1 -n $npe_epos -N $npe_node_epos -d $NTHREADS_EPOS -cc depth"
 
+elif [ $step = "fv3ic" ]; then
+
+    [[ $CHGRESTHREAD -gt $npe_node_max ]] && export CHGRESTHREAD=$npe_node_max
+    export APRUN_CHGRES="$launcher -j 1 -n 1 -N 1 -d $CHGRESTHREAD -cc depth"
+
 fi
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/jobs/earc.sh
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/jobs/earc.sh	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/jobs/earc.sh	(revision 96274)
@@ -31,8 +31,8 @@
 # Run relevant tasks
 
 # CURRENT CYCLE
-cymd=`echo $CDATE | cut -c1-8`
-chh=`echo  $CDATE | cut -c9-10`
+cymd=$(echo $CDATE | cut -c1-8)
+chh=$(echo  $CDATE | cut -c9-10)
 APREFIX="${CDUMP}.t${chh}z."
 ASUFFIX=".nemsio"
 
@@ -67,7 +67,7 @@
         restart_dir="$memdir/RESTART"
         if [ -d $restart_dir ]; then
             mkdir -p RESTART
-            files=`ls -1 $restart_dir`
+            files=$(ls -1 $restart_dir)
             for file in $files; do
                 $NCP $restart_dir/$file RESTART/$file
             done
@@ -161,21 +161,21 @@
     ###############################################################
     # Clean up previous cycles; various depths
     # PRIOR CYCLE: Leave the prior cycle alone
-    GDATE=`$NDATE -$assim_freq $CDATE`
+    GDATE=$($NDATE -$assim_freq $CDATE)
 
     # PREVIOUS to the PRIOR CYCLE
     # Now go 2 cycles back and remove the directory
-    GDATE=`$NDATE -$assim_freq $GDATE`
-    gymd=`echo $GDATE | cut -c1-8`
-    ghh=`echo  $GDATE | cut -c9-10`
+    GDATE=$($NDATE -$assim_freq $GDATE)
+    gymd=$(echo $GDATE | cut -c1-8)
+    ghh=$(echo  $GDATE | cut -c9-10)
 
     COMIN_ENS="$ROTDIR/enkf.$CDUMP.$gymd/$ghh"
     [[ -d $COMIN_ENS ]] && rm -rf $COMIN_ENS
 
     # PREVIOUS day 00Z remove the whole day
-    GDATE=`$NDATE -48 $CDATE`
-    gymd=`echo $GDATE | cut -c1-8`
-    ghh=`echo  $GDATE | cut -c9-10`
+    GDATE=$($NDATE -48 $CDATE)
+    gymd=$(echo $GDATE | cut -c1-8)
+    ghh=$(echo  $GDATE | cut -c9-10)
 
     COMIN_ENS="$ROTDIR/enkf.$CDUMP.$gymd"
     [[ -d $COMIN_ENS ]] && rm -rf $COMIN_ENS
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/jobs/fv3ic.sh
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/jobs/fv3ic.sh	(nonexistent)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/jobs/fv3ic.sh	(revision 96274)
@@ -0,0 +1,73 @@
+#!/bin/ksh -x
+###############################################################
+# < next few lines under version control, D O  N O T  E D I T >
+# $Date$
+# $Revision$
+# $Author$
+# $Id$
+###############################################################
+
+###############################################################
+## Author: Rahul Mahajan  Org: NCEP/EMC  Date: August 2017
+
+## Abstract:
+## Create FV3 initial conditions from GFS initial conditions
+## EXPDIR : /full/path/to/config/files
+## CDATE  : current date (YYYYMMDDHH)
+## CDUMP  : cycle name (gdas / gfs)
+export EXPDIR=${1:-$EXPDIR}
+export CDATE=${2:-$CDATE}
+export CDUMP=${3:-$CDUMP}
+###############################################################
+
+###############################################################
+# Source relevant configs
+configs="base fv3ic"
+for config in $configs; do
+    . $EXPDIR/config.${config}
+    status=$?
+    [[ $status -ne 0 ]] && exit $status
+done
+
+###############################################################
+# Source machine runtime environment
+. $BASE_ENV/${machine}.env fv3ic
+status=$?
+[[ $status -ne 0 ]] && exit $status
+
+# Temporary runtime directory
+export DATA="$RUNDIR/$CDATE/$CDUMP/fv3ic$$"
+[[ -d $DATA ]] && rm -rf $DATA
+
+# Input GFS initial condition files
+export INIDIR="$ICSDIR/$CDATE/$CDUMP"
+export ATMANL="$ICSDIR/$CDATE/$CDUMP/siganl.${CDUMP}.$CDATE"
+export SFCANL="$ICSDIR/$CDATE/$CDUMP/sfcanl.${CDUMP}.$CDATE"
+# global_chgres_driver.sh still uses "opsgfs" to mean the GFS as it was
+# before the NEMS GFS was implemented. Even though the NEMS GFS is now
+# operational, this workaround is needed until Fanglin agrees to the updates
+# for global_chgres_driver.sh and global_chgres.sh. Until then, default
+# ictype to opsgfs, and switch it to nemsgfs only if an NSST file is found.
+export ictype="opsgfs"
+if [ -f $ICSDIR/$CDATE/$CDUMP/nstanl.${CDUMP}.$CDATE ]; then
+    export NSTANL="$ICSDIR/$CDATE/$CDUMP/nstanl.${CDUMP}.$CDATE"
+    export ictype="nemsgfs"
+fi
+
+# Output FV3 initial condition files
+export OUTDIR="$ICSDIR/$CDATE/$CDUMP/$CASE/INPUT"
+
+export OMP_NUM_THREADS_CH=$CHGRESTHREAD
+export APRUNC=$APRUN_CHGRES
+
+# Call global_chgres_driver.sh
+$BASE_GSM/ush/global_chgres_driver.sh
+status=$?
+if [ $status -ne 0 ]; then
+    echo "global_chgres_driver.sh returned with a non-zero exit code, ABORT!"
+    exit $status
+fi
+
+###############################################################
+# Exit cleanly
+exit 0

Property changes on: checkout/gfs_workflow.v15.0.0/fv3gfs/jobs/fv3ic.sh
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Added: svn:keywords
## -0,0 +1 ##
+URL Author Id Revision Date
\ No newline at end of property
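The new fv3ic.sh above defaults `ictype` to `opsgfs` and flips it to `nemsgfs` only when an NSST analysis file exists. A minimal standalone sketch of that probe (the function name is hypothetical, not in the commit):

```shell
# Hypothetical helper mirroring the fv3ic.sh ictype probe: default to
# "opsgfs", and report "nemsgfs" only when the NSST analysis file exists.
detect_ictype() {
    nstanl=$1                      # path to nstanl.${CDUMP}.${CDATE}
    if [ -f "$nstanl" ]; then
        echo "nemsgfs"
    else
        echo "opsgfs"
    fi
}

detect_ictype "/no/such/nstanl.gfs.2017080400"   # prints "opsgfs"
```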
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/jobs/getic.sh
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/jobs/getic.sh	(nonexistent)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/jobs/getic.sh	(revision 96274)
@@ -0,0 +1,178 @@
+#!/bin/ksh -x
+###############################################################
+# < next few lines under version control, D O  N O T  E D I T >
+# $Date$
+# $Revision$
+# $Author$
+# $Id$
+###############################################################
+
+###############################################################
+## Author: Rahul Mahajan  Org: NCEP/EMC  Date: August 2017
+
+## Abstract:
+## Get GFS initial conditions
+## EXPDIR : /full/path/to/config/files
+## CDATE  : current date (YYYYMMDDHH)
+## CDUMP  : cycle name (gdas / gfs)
+###############################################################
+
+###############################################################
+# Source relevant configs
+configs="base getic"
+for config in $configs; do
+    . $EXPDIR/config.${config}
+    status=$?
+    [[ $status -ne 0 ]] && exit $status
+done
+
+###############################################################
+# Source machine runtime environment
+. $BASE_ENV/${machine}.env getic
+status=$?
+[[ $status -ne 0 ]] && exit $status
+
+###############################################################
+# Set script and dependency variables
+
+yyyy=$(echo $CDATE | cut -c1-4)
+mm=$(echo $CDATE | cut -c5-6)
+dd=$(echo $CDATE | cut -c7-8)
+hh=$(echo $CDATE | cut -c9-10)
+cymd=$(echo $CDATE | cut -c1-8)
+
+###############################################################
+
+target_dir=$ICSDIR/$CDATE/$CDUMP
+mkdir -p $target_dir
+cd $target_dir
+
+# Save the files as legacy EMC filenames
+ftanal[1]="siganl.${CDUMP}.$CDATE"
+ftanal[2]="sfcanl.${CDUMP}.$CDATE"
+ftanal[3]="nstanl.${CDUMP}.$CDATE"
+
+# Initialize return code to 0
+rc=0
+
+if [ $ictype = "opsgfs" ]; then
+
+    # Handle nemsio and pre-nemsio GFS filenames
+    if [ $CDATE -gt "2017072000" ]; then
+        nfanal=3
+        fanal[1]="./${CDUMP}.t${hh}z.atmanl.nemsio"
+        fanal[2]="./${CDUMP}.t${hh}z.sfcanl.nemsio"
+        fanal[3]="./${CDUMP}.t${hh}z.nstanl.nemsio"
+        flanal="${fanal[1]} ${fanal[2]} ${fanal[3]}"
+        tarpref="gpfs_hps_nco_ops_com"
+    else
+        nfanal=2
+        [[ $CDUMP = "gdas" ]] && str1=1
+        fanal[1]="./${CDUMP}${str1}.t${hh}z.sanl"
+        fanal[2]="./${CDUMP}${str1}.t${hh}z.sfcanl"
+        flanal="${fanal[1]} ${fanal[2]}"
+        tarpref="com2"
+    fi
+
+    # First check the COMROOT for files, if present copy over
+    if [ $machine = "WCOSS_C" ]; then
+
+        # Need COMROOT
+        module load prod_envir >> /dev/null 2>&1
+
+        comdir="$COMROOT/$CDUMP/prod/$CDUMP.$cymd"
+        for i in `seq 1 $nfanal`; do
+            if [ -f $comdir/${fanal[i]} ]; then
+                $NCP $comdir/${fanal[i]} ${ftanal[i]}
+            else
+                rb=1 ; ((rc+=rb))
+            fi
+        done
+
+        # If all files were found in COMROOT, exit successfully
+        [[ $rc -eq 0 ]] && exit 0
+
+    fi
+
+    # Get initial conditions from HPSS
+    hpssdir="/NCEPPROD/hpssprod/runhistory/rh$yyyy/$yyyy$mm/$cymd"
+    if [ $CDUMP = "gdas" ]; then
+        tarball="$hpssdir/${tarpref}_gfs_prod_${CDUMP}.${CDATE}.tar"
+    elif [ $CDUMP = "gfs" ]; then
+        tarball="$hpssdir/${tarpref}_gfs_prod_${CDUMP}.${CDATE}.anl.tar"
+    fi
+
+    # check if the tarball exists
+    hsi ls -l $tarball
+    rc=$?
+    if [ $rc -ne 0 ]; then
+        echo "$tarball does not exist and should, ABORT!"
+        exit $rc
+    fi
+    # get the tarball
+    htar -xvf $tarball $flanal
+    rc=$?
+    if [ $rc -ne 0 ]; then
+        echo "untarring $tarball failed, ABORT!"
+        exit $rc
+    fi
+
+    # Move the files to legacy EMC filenames
+    for i in `seq 1 $nfanal`; do
+        $NMV ${fanal[i]} ${ftanal[i]}
+    done
+
+    # If unable to obtain the files, abort
+    if [ $rc -ne 0 ]; then
+        echo "Unable to obtain operational GFS initial conditions, ABORT!"
+        exit 1
+    fi
+
+elif [ $ictype = "nemsgfs" ]; then
+
+    # Filenames in parallel
+    nfanal=3
+    fanal[1]="gfnanl.${CDUMP}.$CDATE"
+    fanal[2]="sfnanl.${CDUMP}.$CDATE"
+    fanal[3]="nsnanl.${CDUMP}.$CDATE"
+    flanal="${fanal[1]} ${fanal[2]} ${fanal[3]}"
+
+    # Get initial conditions from HPSS from retrospective parallel
+    tarball="$HPSS_PAR_PATH/${CDATE}${CDUMP}.tar"
+
+    # check if the tarball exists
+    hsi ls -l $tarball
+    rc=$?
+    if [ $rc -ne 0 ]; then
+        echo "$tarball does not exist and should, ABORT!"
+        exit $rc
+    fi
+    # get the tarball
+    htar -xvf $tarball $flanal
+    rc=$?
+    if [ $rc -ne 0 ]; then
+        echo "untarring $tarball failed, ABORT!"
+        exit $rc
+    fi
+
+    # Move the files to legacy EMC filenames
+    for i in `seq 1 $nfanal`; do
+        $NMV ${fanal[i]} ${ftanal[i]}
+    done
+
+    # If unable to obtain the files, abort
+    if [ $rc -ne 0 ]; then
+        echo "Unable to obtain parallel GFS initial conditions, ABORT!"
+        exit 1
+    fi
+
+else
+
+    echo "ictype = $ictype, is not supported, ABORT!"
+    exit 1
+
+fi
+
+###############################################################
+# Exit cleanly
+exit 0

Property changes on: checkout/gfs_workflow.v15.0.0/fv3gfs/jobs/getic.sh
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Added: svn:keywords
## -0,0 +1 ##
+URL Author Id Revision Date
\ No newline at end of property
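Both branches of the new getic.sh follow the same HPSS pattern: verify the tarball with `hsi ls -l` before extracting the wanted members with `htar -xvf`, aborting on either failure. A hedged sketch of that pattern; `hsi` and `htar` are the real HPSS clients, but the wrapper function name is illustrative:

```shell
# Hypothetical wrapper for the check-then-extract HPSS pattern in getic.sh.
# Returns non-zero if the tarball is missing or extraction fails.
fetch_from_hpss() {
    tarball=$1
    shift                          # remaining args: members to extract
    if ! hsi ls -l "$tarball"; then
        echo "$tarball does not exist and should, ABORT!"
        return 1
    fi
    if ! htar -xvf "$tarball" "$@"; then
        echo "untarring $tarball failed, ABORT!"
        return 1
    fi
}
```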
Index: checkout/gfs_workflow.v15.0.0/fv3gfs/jobs/arch.sh
===================================================================
--- checkout/gfs_workflow.v15.0.0/fv3gfs/jobs/arch.sh	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0/fv3gfs/jobs/arch.sh	(revision 96274)
@@ -30,8 +30,8 @@
 # Run relevant tasks
 
 # CURRENT CYCLE
-cymd=`echo $CDATE | cut -c1-8`
-chh=`echo  $CDATE | cut -c9-10`
+cymd=$(echo $CDATE | cut -c1-8)
+chh=$(echo  $CDATE | cut -c9-10)
 APREFIX="${CDUMP}.t${chh}z."
 ASUFFIX=".nemsio"
 
@@ -50,7 +50,7 @@
 restart_dir="$COMIN/RESTART"
 if [ -d $restart_dir ]; then
     mkdir -p RESTART
-    files=`ls -1 $restart_dir`
+    files=$(ls -1 $restart_dir)
     for file in $files; do
         $NCP $restart_dir/$file RESTART/$file
     done
@@ -144,12 +144,12 @@
 ###############################################################
 # Clean up previous cycles; various depths
 # PRIOR CYCLE: Leave the prior cycle alone
-GDATE=`$NDATE -$assim_freq $CDATE`
+GDATE=$($NDATE -$assim_freq $CDATE)
 
 # PREVIOUS to the PRIOR CYCLE
-GDATE=`$NDATE -$assim_freq $GDATE`
-gymd=`echo $GDATE | cut -c1-8`
-ghh=`echo  $GDATE | cut -c9-10`
+GDATE=$($NDATE -$assim_freq $GDATE)
+gymd=$(echo $GDATE | cut -c1-8)
+ghh=$(echo  $GDATE | cut -c9-10)
 
 # Remove the TMPDIR directory
 COMIN="$RUNDIR/$GDATE"
@@ -160,13 +160,19 @@
 [[ -d $COMIN ]] && rm -rf $COMIN
 
 # PREVIOUS 00Z day; remove the whole day
-GDATE=`$NDATE -48 $CDATE`
-gymd=`echo $GDATE | cut -c1-8`
-ghh=`echo  $GDATE | cut -c9-10`
+GDATE=$($NDATE -48 $CDATE)
+gymd=$(echo $GDATE | cut -c1-8)
+ghh=$(echo  $GDATE | cut -c9-10)
 
 COMIN="$ROTDIR/$CDUMP.$gymd"
 [[ -d $COMIN ]] && rm -rf $COMIN
 
+# Remove archived quarter degree GRIB1 files that are (48+$FHMAX_GFS) hrs behind
+if [ $CDUMP = "gfs" ]; then
+    GDATE=$($NDATE -$FHMAX_GFS $GDATE)
+    $RM $VFYARC/pgbq*.${CDUMP}.${GDATE}
+fi
+
 ###############################################################
 # Exit out cleanly
 exit 0
Index: checkout/gfs_workflow.v15.0.0
===================================================================
--- checkout/gfs_workflow.v15.0.0	(revision 96191)
+++ checkout/gfs_workflow.v15.0.0	(revision 96274)

Property changes on: checkout/gfs_workflow.v15.0.0
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gfs_workflow.v15.0.0:r96049-96273
Index: checkout/gdas.v15.0.0/ush
===================================================================
--- checkout/gdas.v15.0.0/ush	(revision 96191)
+++ checkout/gdas.v15.0.0/ush	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/ush
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/ush:r96049-96273
Index: checkout/gdas.v15.0.0/sorc/enkf_update.fd/constants.f90
===================================================================
--- checkout/gdas.v15.0.0/sorc/enkf_update.fd/constants.f90	(revision 96191)
+++ checkout/gdas.v15.0.0/sorc/enkf_update.fd/constants.f90	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/sorc/enkf_update.fd/constants.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/sorc/enkf_update.fd/constants.f90:r96049-96273
Index: checkout/gdas.v15.0.0/sorc/enkf_update.fd/read_aerosol.f90
===================================================================
--- checkout/gdas.v15.0.0/sorc/enkf_update.fd/read_aerosol.f90	(revision 96191)
+++ checkout/gdas.v15.0.0/sorc/enkf_update.fd/read_aerosol.f90	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/sorc/enkf_update.fd/read_aerosol.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/sorc/enkf_update.fd/read_aerosol.f90:r96049-96273
Index: checkout/gdas.v15.0.0/sorc/enkf_update.fd/normal_rh_to_q.f90
===================================================================
--- checkout/gdas.v15.0.0/sorc/enkf_update.fd/normal_rh_to_q.f90	(revision 96191)
+++ checkout/gdas.v15.0.0/sorc/enkf_update.fd/normal_rh_to_q.f90	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/sorc/enkf_update.fd/normal_rh_to_q.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/sorc/enkf_update.fd/normal_rh_to_q.f90:r96049-96273
Index: checkout/gdas.v15.0.0/sorc/enkf_update.fd/read_files.f90
===================================================================
--- checkout/gdas.v15.0.0/sorc/enkf_update.fd/read_files.f90	(revision 96191)
+++ checkout/gdas.v15.0.0/sorc/enkf_update.fd/read_files.f90	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/sorc/enkf_update.fd/read_files.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/sorc/enkf_update.fd/read_files.f90:r96049-96273
Index: checkout/gdas.v15.0.0/sorc/enkf_update.fd/gesinfo.f90
===================================================================
--- checkout/gdas.v15.0.0/sorc/enkf_update.fd/gesinfo.f90	(revision 96191)
+++ checkout/gdas.v15.0.0/sorc/enkf_update.fd/gesinfo.f90	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/sorc/enkf_update.fd/gesinfo.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/sorc/enkf_update.fd/gesinfo.f90:r96049-96273
Index: checkout/gdas.v15.0.0/sorc/enkf_update.fd/bkgvar.f90
===================================================================
--- checkout/gdas.v15.0.0/sorc/enkf_update.fd/bkgvar.f90	(revision 96191)
+++ checkout/gdas.v15.0.0/sorc/enkf_update.fd/bkgvar.f90	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/sorc/enkf_update.fd/bkgvar.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/sorc/enkf_update.fd/bkgvar.f90:r96049-96273
Index: checkout/gdas.v15.0.0/sorc/enkf_update.fd/setupbend.f90
===================================================================
--- checkout/gdas.v15.0.0/sorc/enkf_update.fd/setupbend.f90	(revision 96191)
+++ checkout/gdas.v15.0.0/sorc/enkf_update.fd/setupbend.f90	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/sorc/enkf_update.fd/setupbend.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/sorc/enkf_update.fd/setupbend.f90:r96049-96273
Index: checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2model_ad.f90
===================================================================
--- checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2model_ad.f90	(revision 96191)
+++ checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2model_ad.f90	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2model_ad.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2model_ad.f90:r96049-96273
Index: checkout/gdas.v15.0.0/sorc/enkf_update.fd/ncepnems_io.f90
===================================================================
--- checkout/gdas.v15.0.0/sorc/enkf_update.fd/ncepnems_io.f90	(revision 96191)
+++ checkout/gdas.v15.0.0/sorc/enkf_update.fd/ncepnems_io.f90	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/sorc/enkf_update.fd/ncepnems_io.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/sorc/enkf_update.fd/ncepnems_io.f90:r96049-96273
Index: checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2state_ad.f90
===================================================================
--- checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2state_ad.f90	(revision 96191)
+++ checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2state_ad.f90	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2state_ad.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2state_ad.f90:r96049-96273
Index: checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2model.f90
===================================================================
--- checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2model.f90	(revision 96191)
+++ checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2model.f90	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2model.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2model.f90:r96049-96273
Index: checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2state.f90
===================================================================
--- checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2state.f90	(revision 96191)
+++ checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2state.f90	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2state.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/sorc/enkf_update.fd/ensctl2state.f90:r96049-96273
Index: checkout/gdas.v15.0.0/scripts/exglobal_enkf_update.sh.ecf
===================================================================
--- checkout/gdas.v15.0.0/scripts/exglobal_enkf_update.sh.ecf	(revision 96191)
+++ checkout/gdas.v15.0.0/scripts/exglobal_enkf_update.sh.ecf	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/scripts/exglobal_enkf_update.sh.ecf
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/scripts/exglobal_enkf_update.sh.ecf:r96049-96273
Index: checkout/gdas.v15.0.0/scripts
===================================================================
--- checkout/gdas.v15.0.0/scripts	(revision 96191)
+++ checkout/gdas.v15.0.0/scripts	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/scripts
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/scripts:r96049-96273
Index: checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_INFLATE_RECENTER
===================================================================
--- checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_INFLATE_RECENTER	(revision 96191)
+++ checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_INFLATE_RECENTER	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_INFLATE_RECENTER
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/jobs/JGDAS_ENKF_INFLATE_RECENTER:r96049-96273
Index: checkout/gdas.v15.0.0/jobs/JGDAS_EMCSFC_SFC_PREP
===================================================================
--- checkout/gdas.v15.0.0/jobs/JGDAS_EMCSFC_SFC_PREP	(revision 96191)
+++ checkout/gdas.v15.0.0/jobs/JGDAS_EMCSFC_SFC_PREP	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/jobs/JGDAS_EMCSFC_SFC_PREP
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/jobs/JGDAS_EMCSFC_SFC_PREP:r96049-96273
Index: checkout/gdas.v15.0.0/jobs/JGDAS_ANALYSIS_HIGH
===================================================================
--- checkout/gdas.v15.0.0/jobs/JGDAS_ANALYSIS_HIGH	(revision 96191)
+++ checkout/gdas.v15.0.0/jobs/JGDAS_ANALYSIS_HIGH	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/jobs/JGDAS_ANALYSIS_HIGH
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/jobs/JGDAS_ANALYSIS_HIGH:r96049-96273
Index: checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_SELECT_OBS
===================================================================
--- checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_SELECT_OBS	(revision 96191)
+++ checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_SELECT_OBS	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_SELECT_OBS
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/jobs/JGDAS_ENKF_SELECT_OBS:r96049-96273
Index: checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_UPDATE
===================================================================
--- checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_UPDATE	(revision 96191)
+++ checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_UPDATE	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_UPDATE
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/jobs/JGDAS_ENKF_UPDATE:r96049-96273
Index: checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_POST
===================================================================
--- checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_POST	(revision 96191)
+++ checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_POST	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_POST
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/jobs/JGDAS_ENKF_POST:r96049-96273
Index: checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_INNOVATE_OBS
===================================================================
--- checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_INNOVATE_OBS	(revision 96191)
+++ checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_INNOVATE_OBS	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_INNOVATE_OBS
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/jobs/JGDAS_ENKF_INNOVATE_OBS:r96049-96273
Index: checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_FCST
===================================================================
--- checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_FCST	(revision 96191)
+++ checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_FCST	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/jobs/JGDAS_ENKF_FCST
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/jobs/JGDAS_ENKF_FCST:r96049-96273
Index: checkout/gdas.v15.0.0/jobs
===================================================================
--- checkout/gdas.v15.0.0/jobs	(revision 96191)
+++ checkout/gdas.v15.0.0/jobs	(revision 96274)

Property changes on: checkout/gdas.v15.0.0/jobs
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0/jobs:r96049-96273
Index: checkout/gdas.v15.0.0
===================================================================
--- checkout/gdas.v15.0.0	(revision 96191)
+++ checkout/gdas.v15.0.0	(revision 96274)

Property changes on: checkout/gdas.v15.0.0
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gdas.v15.0.0:r96049-96273
Index: checkout/gfs.v15.0.0/ush
===================================================================
--- checkout/gfs.v15.0.0/ush	(revision 96191)
+++ checkout/gfs.v15.0.0/ush	(revision 96274)

Property changes on: checkout/gfs.v15.0.0/ush
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gfs.v15.0.0/ush:r96049-96273
Index: checkout/gfs.v15.0.0/sorc/build_wafs_wcoss.sh
===================================================================
--- checkout/gfs.v15.0.0/sorc/build_wafs_wcoss.sh	(revision 96191)
+++ checkout/gfs.v15.0.0/sorc/build_wafs_wcoss.sh	(revision 96274)

Property changes on: checkout/gfs.v15.0.0/sorc/build_wafs_wcoss.sh
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gfs.v15.0.0/sorc/build_wafs_wcoss.sh:r96049-96273
Index: checkout/gfs.v15.0.0/scripts
===================================================================
--- checkout/gfs.v15.0.0/scripts	(revision 96191)
+++ checkout/gfs.v15.0.0/scripts	(revision 96274)

Property changes on: checkout/gfs.v15.0.0/scripts
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gfs.v15.0.0/scripts:r96049-96273
Index: checkout/gfs.v15.0.0/modulefiles/gfs
===================================================================
--- checkout/gfs.v15.0.0/modulefiles/gfs	(revision 96191)
+++ checkout/gfs.v15.0.0/modulefiles/gfs	(revision 96274)

Property changes on: checkout/gfs.v15.0.0/modulefiles/gfs
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gfs.v15.0.0/modulefiles/gfs:r96049-96273
Index: checkout/gfs.v15.0.0/jobs/JGFS_EMCSFC_SFC_PREP
===================================================================
--- checkout/gfs.v15.0.0/jobs/JGFS_EMCSFC_SFC_PREP	(revision 96191)
+++ checkout/gfs.v15.0.0/jobs/JGFS_EMCSFC_SFC_PREP	(revision 96274)

Property changes on: checkout/gfs.v15.0.0/jobs/JGFS_EMCSFC_SFC_PREP
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gfs.v15.0.0/jobs/JGFS_EMCSFC_SFC_PREP:r96049-96273
Index: checkout/gfs.v15.0.0/jobs
===================================================================
--- checkout/gfs.v15.0.0/jobs	(revision 96191)
+++ checkout/gfs.v15.0.0/jobs	(revision 96274)

Property changes on: checkout/gfs.v15.0.0/jobs
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gfs.v15.0.0/jobs:r96049-96273
Index: checkout/gfs.v15.0.0
===================================================================
--- checkout/gfs.v15.0.0	(revision 96191)
+++ checkout/gfs.v15.0.0	(revision 96274)

Property changes on: checkout/gfs.v15.0.0
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/gfs.v15.0.0:r96049-96273
Index: checkout/global_shared.v15.0.0/ush/emcsfc_snow.sh
===================================================================
--- checkout/global_shared.v15.0.0/ush/emcsfc_snow.sh	(revision 96191)
+++ checkout/global_shared.v15.0.0/ush/emcsfc_snow.sh	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/ush/emcsfc_snow.sh
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/ush/emcsfc_snow.sh:r96049-96273
Index: checkout/global_shared.v15.0.0/ush/emcsfc_ice_blend.sh
===================================================================
--- checkout/global_shared.v15.0.0/ush/emcsfc_ice_blend.sh	(revision 96191)
+++ checkout/global_shared.v15.0.0/ush/emcsfc_ice_blend.sh	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/ush/emcsfc_ice_blend.sh
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/ush/emcsfc_ice_blend.sh:r96049-96273
Index: checkout/global_shared.v15.0.0/ush/global_chgres.sh
===================================================================
--- checkout/global_shared.v15.0.0/ush/global_chgres.sh	(revision 96191)
+++ checkout/global_shared.v15.0.0/ush/global_chgres.sh	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/ush/global_chgres.sh
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/ush/global_chgres.sh:r96049-96273
Index: checkout/global_shared.v15.0.0/ush/global_chgres_driver.sh
===================================================================
--- checkout/global_shared.v15.0.0/ush/global_chgres_driver.sh	(revision 96191)
+++ checkout/global_shared.v15.0.0/ush/global_chgres_driver.sh	(revision 96274)
@@ -26,7 +26,6 @@
 export CDUMP=${CDUMP:-gfs}                   # gfs or gdas
 export LEVS=${LEVS:-65}
 export LSOIL=${LSOIL:-4}
-export ictype=${ictype:-nemsgfs}             # nemsgfs for q3fy17 gfs with new land datasets; opsgfs for q2fy16 gfs.
 export nst_anl=${nst_anl:-".false."}         # false means to do sst orographic adjustment for lakes
 
 export VERBOSE=YES
@@ -100,7 +99,22 @@
 export ymd=`echo $CDATE | cut -c 1-8`
 export cyc=`echo $CDATE | cut -c 9-10`
 
-if [ $ictype = opsgfs ]; then
+# Determine if we are current operations with NSST or the one before that
+if [ ${ATMANL:-"NULL"} = "NULL" ]; then
+ if [ -s ${INIDIR}/nsnanl.${CDUMP}.$CDATE -o -s ${INIDIR}/${CDUMP}.t${cyc}z.nstanl.nemsio ]; then
+  ictype='opsgfs'
+ else
+  ictype='oldgfs'
+ fi
+else
+ if [ ${NSTANL:-"NULL"} = "NULL" ]; then
+  ictype='oldgfs'
+ else
+  ictype='opsgfs'
+ fi
+fi
+
+if [ $ictype = oldgfs ]; then
  nst_anl=".false."
  if [ ${ATMANL:-"NULL"} = "NULL" ]; then
   if [ -s ${INIDIR}/siganl.${CDUMP}.$CDATE ]; then
@@ -111,6 +125,7 @@
    export SFCANL=$INIDIR/${CDUMP}.t${cyc}z.sfcanl
   fi
  fi
+ export NSTANL="NULL"
  export SOILTYPE_INP=zobler
  export SOILTYPE_OUT=zobler
  export VEGTYPE_INP=sib
@@ -128,7 +143,7 @@
   export LATB_ATM=1536
  fi
 
-elif [ $ictype = nemsgfs ]; then
+elif [ $ictype = opsgfs ]; then
  if [ ${ATMANL:-"NULL"} = "NULL" ]; then
   if [ -s ${INIDIR}/gfnanl.${CDUMP}.$CDATE ]; then
    export ATMANL=$INIDIR/gfnanl.${CDUMP}.$CDATE
@@ -171,8 +186,8 @@
 $CHGRESSH
 rc=$?
 if [[ $rc -ne 0 ]] ; then
-  echo "***ERROR*** rc= $rc"
-  exit $rc
+ echo "***ERROR*** rc= $rc"
+ exit $rc
 fi
 
 mv ${DATA}/gfs_data.tile*.nc  $OUTDIR/.
@@ -194,12 +209,11 @@
  $CHGRESSH
  rc=$?
  if [[ $rc -ne 0 ]] ; then
-   echo "***ERROR*** rc= $rc"
-   exit $rc
+  echo "***ERROR*** rc= $rc"
+  exit $rc
  fi
  mv ${DATA}/out.sfc.tile${tile}.nc $OUTDIR/sfc_data.tile${tile}.nc
-tile=`expr $tile + 1 `
+ tile=`expr $tile + 1 `
 done
 
 exit 0
-
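The global_chgres_driver.sh hunk above drops the exported `ictype` default and instead infers the type: unset or empty `ATMANL`/`NSTANL` variables are treated as `"NULL"`, and the presence of an NSST analysis selects the NSST-era `opsgfs` type over the pre-NSST `oldgfs` type. A hypothetical reduction of just the `NSTANL` branch of that test:

```shell
# Hypothetical reduction of the driver's new ictype selection: an unset or
# empty NSTANL (treated as "NULL") means the pre-NSST "oldgfs" input type;
# a set NSTANL means the NSST-era "opsgfs" type.
choose_ictype() {
    nstanl=$1
    if [ "${nstanl:-NULL}" = "NULL" ]; then
        echo "oldgfs"
    else
        echo "opsgfs"
    fi
}

choose_ictype ""                       # prints "oldgfs"
```

Note this mirrors only the `NSTANL` check; the real driver also probes `$INIDIR` for analysis files when `ATMANL` is unset.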
Index: checkout/global_shared.v15.0.0/ush/syndat_qctropcy.sh
===================================================================
--- checkout/global_shared.v15.0.0/ush/syndat_qctropcy.sh	(revision 96191)
+++ checkout/global_shared.v15.0.0/ush/syndat_qctropcy.sh	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/ush/syndat_qctropcy.sh
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/ush/syndat_qctropcy.sh:r96049-96273
Index: checkout/global_shared.v15.0.0/ush/parse-storm-type.pl
===================================================================
--- checkout/global_shared.v15.0.0/ush/parse-storm-type.pl	(revision 96191)
+++ checkout/global_shared.v15.0.0/ush/parse-storm-type.pl	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/ush/parse-storm-type.pl
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/ush/parse-storm-type.pl:r96049-96273
Index: checkout/global_shared.v15.0.0/ush
===================================================================
--- checkout/global_shared.v15.0.0/ush	(revision 96191)
+++ checkout/global_shared.v15.0.0/ush	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/ush
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/ush:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/emcsfc_ice_blend.fd
===================================================================
--- checkout/global_shared.v15.0.0/sorc/emcsfc_ice_blend.fd	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/emcsfc_ice_blend.fd	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/emcsfc_ice_blend.fd
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/emcsfc_ice_blend.fd:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/emcsfc_snow2mdl.fd
===================================================================
--- checkout/global_shared.v15.0.0/sorc/emcsfc_snow2mdl.fd	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/emcsfc_snow2mdl.fd	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/emcsfc_snow2mdl.fd
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/emcsfc_snow2mdl.fd:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/global_chgres.fd
===================================================================
--- checkout/global_shared.v15.0.0/sorc/global_chgres.fd	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/global_chgres.fd	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/global_chgres.fd
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/global_chgres.fd:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/global_cycle.fd
===================================================================
--- checkout/global_shared.v15.0.0/sorc/global_cycle.fd	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/global_cycle.fd	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/global_cycle.fd
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/global_cycle.fd:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/gsi.fd/constants.f90
===================================================================
--- checkout/global_shared.v15.0.0/sorc/gsi.fd/constants.f90	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/gsi.fd/constants.f90	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/gsi.fd/constants.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/gsi.fd/constants.f90:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/gsi.fd/read_aerosol.f90
===================================================================
--- checkout/global_shared.v15.0.0/sorc/gsi.fd/read_aerosol.f90	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/gsi.fd/read_aerosol.f90	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/gsi.fd/read_aerosol.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/gsi.fd/read_aerosol.f90:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/gsi.fd/normal_rh_to_q.f90
===================================================================
--- checkout/global_shared.v15.0.0/sorc/gsi.fd/normal_rh_to_q.f90	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/gsi.fd/normal_rh_to_q.f90	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/gsi.fd/normal_rh_to_q.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/gsi.fd/normal_rh_to_q.f90:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/gsi.fd/read_files.f90
===================================================================
--- checkout/global_shared.v15.0.0/sorc/gsi.fd/read_files.f90	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/gsi.fd/read_files.f90	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/gsi.fd/read_files.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/gsi.fd/read_files.f90:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/gsi.fd/gesinfo.f90
===================================================================
--- checkout/global_shared.v15.0.0/sorc/gsi.fd/gesinfo.f90	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/gsi.fd/gesinfo.f90	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/gsi.fd/gesinfo.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/gsi.fd/gesinfo.f90:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/gsi.fd/bkgvar.f90
===================================================================
--- checkout/global_shared.v15.0.0/sorc/gsi.fd/bkgvar.f90	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/gsi.fd/bkgvar.f90	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/gsi.fd/bkgvar.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/gsi.fd/bkgvar.f90:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/gsi.fd/setupbend.f90
===================================================================
--- checkout/global_shared.v15.0.0/sorc/gsi.fd/setupbend.f90	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/gsi.fd/setupbend.f90	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/gsi.fd/setupbend.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/gsi.fd/setupbend.f90:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2model_ad.f90
===================================================================
--- checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2model_ad.f90	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2model_ad.f90	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2model_ad.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/gsi.fd/ensctl2model_ad.f90:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/gsi.fd/ncepnems_io.f90
===================================================================
--- checkout/global_shared.v15.0.0/sorc/gsi.fd/ncepnems_io.f90	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/gsi.fd/ncepnems_io.f90	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/gsi.fd/ncepnems_io.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/gsi.fd/ncepnems_io.f90:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2state_ad.f90
===================================================================
--- checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2state_ad.f90	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2state_ad.f90	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2state_ad.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/gsi.fd/ensctl2state_ad.f90:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2model.f90
===================================================================
--- checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2model.f90	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2model.f90	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2model.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/gsi.fd/ensctl2model.f90:r96049-96273
Index: checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2state.f90
===================================================================
--- checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2state.f90	(revision 96191)
+++ checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2state.f90	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/sorc/gsi.fd/ensctl2state.f90
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/sorc/gsi.fd/ensctl2state.f90:r96049-96273
Index: checkout/global_shared.v15.0.0/scripts/exemcsfc_global_sfc_prep.sh.ecf
===================================================================
--- checkout/global_shared.v15.0.0/scripts/exemcsfc_global_sfc_prep.sh.ecf	(revision 96191)
+++ checkout/global_shared.v15.0.0/scripts/exemcsfc_global_sfc_prep.sh.ecf	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/scripts/exemcsfc_global_sfc_prep.sh.ecf
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/scripts/exemcsfc_global_sfc_prep.sh.ecf:r96049-96273
Index: checkout/global_shared.v15.0.0/scripts/exglobal_fcst_nems.sh.ecf
===================================================================
--- checkout/global_shared.v15.0.0/scripts/exglobal_fcst_nems.sh.ecf	(revision 96191)
+++ checkout/global_shared.v15.0.0/scripts/exglobal_fcst_nems.sh.ecf	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/scripts/exglobal_fcst_nems.sh.ecf
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/scripts/exglobal_fcst_nems.sh.ecf:r96049-96273
Index: checkout/global_shared.v15.0.0/scripts/exglobal_analysis.sh.ecf
===================================================================
--- checkout/global_shared.v15.0.0/scripts/exglobal_analysis.sh.ecf	(revision 96191)
+++ checkout/global_shared.v15.0.0/scripts/exglobal_analysis.sh.ecf	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/scripts/exglobal_analysis.sh.ecf
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/scripts/exglobal_analysis.sh.ecf:r96049-96273
Index: checkout/global_shared.v15.0.0/scripts/exglobal_fcst_nemsfv3gfs.sh
===================================================================
--- checkout/global_shared.v15.0.0/scripts/exglobal_fcst_nemsfv3gfs.sh	(revision 96191)
+++ checkout/global_shared.v15.0.0/scripts/exglobal_fcst_nemsfv3gfs.sh	(revision 96274)
@@ -66,9 +66,9 @@
 FIX_DIR=${FIX_DIR:-$BASE_GSM/fix}
 FIX_AM=${FIX_AM:-$FIX_DIR/fix_am}
 FIX_FV3=${FIX_FV3:-$FIX_DIR/fix_fv3}
-DATA=${DATA:-$pwd/fv3tmp$$}    #temporary running directory
-ROTDIR=${ROTDIR:-$pwd}         #rotating archive directory
-IC_DIR=${IC_DIR:-$pwd}         #cold start initial conditions
+DATA=${DATA:-$pwd/fv3tmp$$}    # temporary running directory
+ROTDIR=${ROTDIR:-$pwd}         # rotating archive directory
+ICSDIR=${ICSDIR:-$pwd}         # cold start initial conditions
 DMPDIR=${DMPDIR:-$pwd}         # global dumps for seaice, snow and sst analysis
 
 # Model resolution specific parameters
@@ -131,8 +131,8 @@
 restart_interval=${restart_interval:-0}
 
 if [ $warm_start = ".false." ]; then
-  if [ -d $IC_DIR/${CASE}_$CDATE ]; then
-    $NCP $IC_DIR/${CASE}_$CDATE/* $DATA/INPUT/.
+  if [ -d $ICSDIR/$CDATE/$CDUMP/$CASE/INPUT ]; then
+    $NCP $ICSDIR/$CDATE/$CDUMP/$CASE/INPUT/* $DATA/INPUT/.
   else
     for file in $memdir/INPUT/*.nc; do
       file2=$(echo $(basename $file))
@@ -237,7 +237,7 @@
 FNSLPC=${FNSLPC:-"$FIX_AM/global_slope.1x1.grb"}
 FNABSC=${FNABSC:-"$FIX_AM/global_mxsnoalb.uariz.t1534.3072.1536.rg.grb"}
 
-# Warm start and read increment, update surface variables
+# Warm start and read increment, update surface variables for GDAS cycle only
 # since we do not have SST, SNOW or ICE via global_cycle
 if [ $CDUMP = "gdas" -a $warm_start = ".true." -a $read_increment = ".true." ]; then
   FNTSFA=${FNTSFA:-"$DMPDIR/$CDATE/$CDUMP/${CDUMP}.t${chh}z.rtgssthr.grb"}
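
[Editor's note] The hunk above renames `IC_DIR` to `ICSDIR` and moves cold-start initial conditions from a flat `${CASE}_$CDATE` directory to the nested `$ICSDIR/$CDATE/$CDUMP/$CASE/INPUT` layout. A minimal sketch of that staging step, assuming the same variables (`ICSDIR`, `CDATE`, `CDUMP`, `CASE`, `DATA`); the helper name and the error message are illustrative, not from the workflow:

```shell
#!/bin/sh
# Sketch of the new cold-start staging convention merged above:
# ICs live under $ICSDIR/$CDATE/$CDUMP/$CASE/INPUT.
stage_cold_start() {
  icdir="$ICSDIR/$CDATE/$CDUMP/$CASE/INPUT"
  if [ -d "$icdir" ]; then
    # Copy every staged IC file into the run directory's INPUT/.
    cp "$icdir"/* "$DATA/INPUT/."
  else
    echo "no cold-start ICs found under $icdir" >&2
    return 1
  fi
}
```

In the actual script the `else` branch instead falls back to chgres output under `$memdir/INPUT`, so a missing `ICSDIR` tree is not necessarily fatal there.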
Index: checkout/global_shared.v15.0.0/scripts
===================================================================
--- checkout/global_shared.v15.0.0/scripts	(revision 96191)
+++ checkout/global_shared.v15.0.0/scripts	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/scripts
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/scripts:r96049-96273
Index: checkout/global_shared.v15.0.0/docs/Release_Notes.global_shared_gsm.v14.1.0.txt
===================================================================
--- checkout/global_shared.v15.0.0/docs/Release_Notes.global_shared_gsm.v14.1.0.txt	(revision 96191)
+++ checkout/global_shared.v15.0.0/docs/Release_Notes.global_shared_gsm.v14.1.0.txt	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0/docs/Release_Notes.global_shared_gsm.v14.1.0.txt
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0/docs/Release_Notes.global_shared_gsm.v14.1.0.txt:r96049-96273
Index: checkout/global_shared.v15.0.0
===================================================================
--- checkout/global_shared.v15.0.0	(revision 96191)
+++ checkout/global_shared.v15.0.0	(revision 96274)

Property changes on: checkout/global_shared.v15.0.0
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc/global_shared.v15.0.0:r96049-96273
Index: checkout
===================================================================
--- checkout	(revision 96191)
+++ checkout	(revision 96274)

Property changes on: checkout
___________________________________________________________________
Modified: svn:mergeinfo
## -0,0 +0,1 ##
   Merged /fv3gfs/branches/Rahul.Mahajan/EXP-cyc:r96049-96273