netpyne.batchtools.search

Classes:

LocalGridDispatcher([fs, cmd, submit, ...])

SSHGridDispatcher([connection, fs, cmd, ...])

study

alias of Study

constructors(dispatcher, submit)

Functions:

ray_optuna_search(dispatcher_constructor, ...)

#TODO: fold this into the ray_search object later -- ray_optuna_search(...)

prune_dataframe(results)

ray_search(dispatcher_constructor, ...[, ...])

load_search(path[, prune_metadata])

generate_constructors(job_type, comm_type, ...)


generate_parameters(params, algorithm, **kwargs)

Returns a dictionary of parameters for ray_search based on the input dictionary, e.g.:

    params = {
        'synMechTau2': [3.0, 5.0, 7.0],  # assumes a list of values by default for grid search-like algorithms
        #'synMechTau2': [3.0, 7.0],      # assumes lower/upper bounds by default for evolution-like algorithms
        'connWeight': paramtypes.sample_from(lambda _: numpy.random.uniform(0.005, 0.15)),
    }  # any of the paramtypes (= ray.tune data types) can optionally be passed

shim([dispatcher_constructor, ...])

search(...)

param dispatcher_constructor:

class netpyne.batchtools.search.LocalGridDispatcher(fs=None, cmd=None, submit=None, project_path=None, output_path='.', env=None, label=None, **kwargs)[source]

Bases: LocalDispatcher

Methods:

start()

Creates and submits a job through the submit instance (calls .create_job() and .submit_job()).

connect()

Method for accepting a connection from a peer (runner) if bidirectional communication is implemented (see runtk.UNIX_Dispatcher, runtk.INET_Dispatcher, runtk.SocketRunner) If it is implemented, it will be a blocking call.

recv(interval)

Method for receiving data from the host (dispatcher).

start()[source]

Creates and submits a job through the submit instance (calls .create_job() and .submit_job()).

connect()[source]

Method for accepting a connection from a peer (runner) if bidirectional communication is implemented (see runtk.UNIX_Dispatcher, runtk.INET_Dispatcher, runtk.SocketRunner). If it is implemented, it will be a blocking call; otherwise it simply passes.

recv(interval)[source]

Method for receiving data from the host (dispatcher). To be implemented by inherited classes. If implemented, the method is a blocking call that waits until the data is received; otherwise it is a non-blocking function returning None. Returns: data, the data sent from the dispatcher (data = runner.recv() <- dispatcher.send(data)).

class netpyne.batchtools.search.SSHGridDispatcher(connection=None, fs=None, cmd=None, submit=None, project_path=None, output_path='.', env=None, label=None, **kwargs)[source]

Bases: SSHDispatcher

Methods:

start()

Creates and submits a job through the submit instance (calls .create_job() and .submit_job()).

connect()

Method for accepting a connection from a peer (runner) if bidirectional communication is implemented (see runtk.UNIX_Dispatcher, runtk.INET_Dispatcher, runtk.SocketRunner) If it is implemented, it will be a blocking call.

recv(interval)

Method for receiving data from the host (dispatcher).

start()[source]

Creates and submits a job through the submit instance (calls .create_job() and .submit_job()).

connect()[source]

Method for accepting a connection from a peer (runner) if bidirectional communication is implemented (see runtk.UNIX_Dispatcher, runtk.INET_Dispatcher, runtk.SocketRunner). If it is implemented, it will be a blocking call; otherwise it simply passes.

recv(interval)[source]

Method for receiving data from the host (dispatcher). To be implemented by inherited classes. If implemented, the method is a blocking call that waits until the data is received; otherwise it is a non-blocking function returning None. Returns: data, the data sent from the dispatcher (data = runner.recv() <- dispatcher.send(data)).

netpyne.batchtools.search.ray_optuna_search(...)[source]

#TODO: fold this into the ray_search object later -- ray_optuna_search(...)

Parameters:
  • dispatcher_constructor (Callable, # constructor for the dispatcher (e.g. INETDispatcher))

  • submit_constructor (Callable, # constructor for the submit (e.g. SHubmitSOCK))

  • run_config (Dict, # batch configuration, (keyword: string pairs to customize the submit template))

  • params (Dict, # search space (dictionary of parameter keys: tune search spaces))

  • label (Optional[str] = 'optuna_search', # label for the search)

  • output_path (Optional[str] = '../batch', # directory for storing generated files)

  • checkpoint_path (Optional[str] = '../ray', # directory for storing checkpoint files)

  • max_concurrent (Optional[int] = 1, # number of concurrent trials to run at one time)

  • batch (Optional[bool] = True, # whether concurrent trials should run synchronously or asynchronously)

  • num_samples (Optional[int] = 1, # number of trials to run)

  • metric (Optional[str] = "loss", # metric to optimize (this should match some key: value pair in the returned data))

  • mode (Optional[str] = "min", # either 'min' or 'max' (whether to minimize or maximize the metric))

  • optuna_config (Optional[dict] = None, # additional configuration for the optuna search algorithm (incl. sampler, seed, etc.))

  • ray_config (Optional[dict] = None, # additional configuration for the ray initialization)

  • Creates: <label>.csv (file containing the results of the search)

Returns:

Study

Return type:

namedtuple('Study', ['algo', 'results'])(algo, results), # named tuple containing the created algorithm and the results of the search
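As a sketch of how the parameters above fit together, the following assembles a hypothetical keyword dictionary for ray_optuna_search; the run_config keys and params values are illustrative assumptions rather than values from this reference, and the call itself is commented out because it would launch jobs:

```python
# Hypothetical kwargs for ray_optuna_search; run_config keys and params values
# are illustrative assumptions, not taken from this reference.
optuna_kwargs = dict(
    run_config={'command': 'python init.py'},  # keyword: string pairs to customize the submit template
    params={'connWeight': [0.005, 0.15]},      # lower/upper bounds for an optimization-style search
    label='optuna_search',
    output_path='../batch',
    checkpoint_path='../ray',
    max_concurrent=1,
    num_samples=10,
    metric='loss',  # must match a key: value pair in the data returned by the simulation
    mode='min',     # minimize the metric
)
# from netpyne.batchtools.search import ray_optuna_search
# study = ray_optuna_search(dispatcher_constructor, submit_constructor, **optuna_kwargs)
```

The dispatcher and submit constructors would come from the surrounding API (e.g. generate_constructors), so they are left as placeholders here.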

netpyne.batchtools.search.prune_dataframe(results: DataFrame) → DataFrame[source]
netpyne.batchtools.search.study

alias of Study

Attributes:

data

Alias for field number 1

results

Alias for field number 0
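The field aliases above can be illustrated with a plain namedtuple mirroring the documented layout (field 0 aliases .results, field 1 aliases .data); this is a standalone sketch, not the class defined in netpyne.batchtools.search:

```python
from collections import namedtuple

# Standalone sketch of the documented study namedtuple:
# field 0 aliases .results, field 1 aliases .data.
Study = namedtuple('Study', ['results', 'data'])
study = Study(results='<tune.ResultGrid>', data='<pandas.DataFrame>')

# positional and attribute access refer to the same fields
assert study[0] == study.results
assert study[1] == study.data
```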

class netpyne.batchtools.search.constructors(dispatcher, submit)

Bases: tuple

Attributes:

dispatcher

Alias for field number 0

submit

Alias for field number 1

dispatcher

Alias for field number 0

submit

Alias for field number 1

netpyne.batchtools.search.generate_constructors(job_type, comm_type, **kwargs)[source]

Returns the (dispatcher, submit) constructor pair for ray_search based on the job_type and comm_type inputs.
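The constructors pair documented above behaves like a standard namedtuple; the sketch below mirrors that layout (field 0 aliases .dispatcher, field 1 aliases .submit) and shows a hypothetical call against the real API, commented out because the valid job_type/comm_type strings are not enumerated in this reference:

```python
from collections import namedtuple

# Standalone sketch of the constructors pair returned by generate_constructors:
# field 0 aliases .dispatcher, field 1 aliases .submit.
constructors = namedtuple('constructors', ['dispatcher', 'submit'])
pair = constructors(dispatcher='LocalGridDispatcher', submit='<submit class>')  # names illustrative

assert pair[0] == pair.dispatcher
assert pair[1] == pair.submit

# hypothetical usage against the real API ('sh'/'socket' values are assumptions):
# from netpyne.batchtools.search import generate_constructors
# dispatcher_constructor, submit_constructor = generate_constructors(job_type='sh', comm_type='socket')
```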

netpyne.batchtools.search.generate_parameters(params, algorithm, **kwargs)[source]

Returns a dictionary of parameters for ray_search based on the input dictionary, e.g.:

    params = {
        'synMechTau2': [3.0, 5.0, 7.0],  # assumes a list of values by default for grid search-like algorithms
        #'synMechTau2': [3.0, 7.0],      # assumes lower/upper bounds by default for evolution-like algorithms
        'connWeight': paramtypes.sample_from(lambda _: numpy.random.uniform(0.005, 0.15)),
    }  # any of the paramtypes (= ray.tune data types) can optionally be passed
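A runnable version of the docstring's example, minus the paramtypes.sample_from entry (which requires ray.tune at import time); how each list is interpreted depends on the chosen algorithm, as noted above:

```python
# params for a grid search-like algorithm: each list enumerates candidate values
grid_params = {
    'synMechTau2': [3.0, 5.0, 7.0],
}

# params for an evolution-like algorithm: each list is read as [lower, upper] bounds
evol_params = {
    'synMechTau2': [3.0, 7.0],
}

# a grid search over grid_params would yield one trial per listed value
assert len(grid_params['synMechTau2']) == 3
```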

netpyne.batchtools.search.shim(dispatcher_constructor: Callable | None = None, submit_constructor: Callable | None = None, job_type: str | None = None, comm_type: str | None = None, run_config: Dict | None = None, params: Dict | None = None, algorithm: str | None = 'variant_generator', label: str | None = 'search', output_path: str | None = './batch', checkpoint_path: str | None = './checkpoint', max_concurrent: int | None = 1, batch: bool | None = True, num_samples: int | None = 1, metric: str | None = None, mode: str | None = 'min', sample_interval: int | None = 15, algorithm_config: dict | None = None, ray_config: dict | None = None, attempt_restore: bool | None = True, clean_checkpoint: bool | None = True, report_config=('path', 'config', 'data'), prune_metadata: bool | None = True, remote_dir: str | None = None, host: str | None = None, key: str | None = None, file_cleanup: bool | None = True, advanced_logging: bool | str | None = True) → Dict[source]
netpyne.batchtools.search.search(...)[source]
Parameters:
  • dispatcher_constructor (Callable, # constructor for the dispatcher (e.g. INETDispatcher))

  • submit_constructor (Callable, # constructor for the submit (e.g. SHubmitSOCK))

  • job_type (str, # the submission engine to run a single simulation (e.g. 'sge', 'sh'))

  • comm_type (Optional[str], # the method of communication between host dispatcher and the simulation (e.g. 'socket', 'filesystem', None), if None, expects a non-optimization based search (grid/random/etc.))

  • run_config (Dict, # batch configuration, (keyword: string pairs to customize the submit template))

  • params (Dict, # search space (dictionary of parameter keys: tune search spaces))

  • algorithm (Optional[str] = "variant_generator", # search algorithm to use, see SEARCH_ALG_IMPORT for available options)

  • label (Optional[str] = 'search', # label for the search)

  • output_path (Optional[str] = './batch', # directory for storing generated files)

  • checkpoint_path (Optional[str] = './ray', # directory for storing checkpoint files)

  • max_concurrent (Optional[int] = 1, # number of concurrent trials to run at one time)

  • batch (Optional[bool] = True, # whether concurrent trials should run synchronously or asynchronously)

  • num_samples (Optional[int] = 1, # number of trials to run)

  • metric (Optional[str] = None, # metric to optimize (this should match some key: value pair in the returned data, or None if no optimization is desired))

  • mode (Optional[str] = "min", # either 'min' or 'max' (whether to minimize or maximize the metric))

  • sample_interval (Optional[int] = 15, # interval to check for new results (in seconds))

  • algorithm_config (Optional[dict] = None, # additional configuration for the search algorithm)

  • ray_config (Optional[dict] = None, # additional configuration for the ray initialization)

  • attempt_restore (Optional[bool] = True, # whether to attempt to restore from a checkpoint)

  • clean_checkpoint (Optional[bool] = True, # whether to clean the checkpoint directory after the search)

  • prune_metadata (Optional[bool] = True, # whether to prune the metadata from the results.csv)

  • remote_dir (Optional[str] = None, # absolute path for directory to run the search on (for submissions over SSH))

  • host (Optional[str] = None, # host to run the search on (for submissions over SSH))

  • key (Optional[str] = None, # key for TOTP generator (for submissions over SSH))

  • file_cleanup (Optional[bool] = True, # whether to clean up accessory files after the search is completed)

  • advanced_logging (Optional[bool] = True, # enables advanced logging features, checkpoint_db and log_file.)

  • checkpoint_db (Optional[str] = None, # path for checkpoint db file.)

  • log_file (Optional[str] = None, # path for the log file)

  • Creates (upon completed fitting): <label>.csv (file containing the results of the search)

Returns:

.results : tune.ResultGrid # raw data yielded from the search

.data : pandas.DataFrame # pandas DataFrame containing the results of the search

Return type:

study instance with two attributes
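Tying the parameters above together, a hypothetical grid-search invocation might look like the following; the job_type string and run_config keys are assumptions rather than values documented here, and the call is commented out because it submits jobs:

```python
# Hypothetical kwargs for search(); values are illustrative assumptions.
search_kwargs = dict(
    job_type='sh',             # submission engine for a single simulation
    comm_type=None,            # None => non-optimization search (grid/random/etc.)
    run_config={'command': 'python init.py'},
    params={'synMechTau2': [3.0, 5.0, 7.0]},
    algorithm='variant_generator',
    label='grid_search',
    output_path='./batch',
    checkpoint_path='./checkpoint',
    max_concurrent=1,
    num_samples=1,
    metric=None,               # no metric: nothing to optimize in a pure grid search
)
# from netpyne.batchtools.search import search
# study = search(**search_kwargs)
# study.data  # pandas DataFrame of results; also written to <label>.csv
```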