1 Introduction

Along with implementations of several differential co-expression analysis methods, this package provides an evaluation framework to benchmark them. Real data are rarely available for evaluating differential co-expression methods, as such experiments are difficult to perform and, in almost all cases, the true/gold-standard network is unknown. Simulations are therefore the only means of performing a rigorous comparison and evaluation. Data from 812 simulations were generated, with settings and parameters described in Bhuva et al. (manuscript in preparation), and the results are included here. Along with the data, a suite of visualisation functions and performance metrics is implemented.

This vignette describes functions in this package that enable comparative evaluation of inference methods. All methods implemented in the package can be evaluated and novel methods can be integrated into the evaluation framework with relative ease.

2 Simulation setup used to create the data

A dynamical systems simulator was used to generate synthetic expression data with differential associations. Regulatory networks of 150 genes were sampled from an S. cerevisiae regulatory network. Perturbation experiments were performed when simulating data to induce differential associations. Simulation parameters and networks were sampled to produce 812 distinct simulations. The dataset included in dcanr, sim102, is an example of one such simulation. Two knock-downs were performed simultaneously but independently of each other; therefore, some samples may have both knock-downs while others may have either one or neither. Details of the simulation procedure can be found in Bhuva et al. (manuscript in preparation). Up to 500 observations are sampled in each simulation.

3 Download the full simulated dataset

As the simulation is computationally intensive, data from the 812 simulations have been precomputed and are available at https://melbourne.figshare.com/articles/812_simulated_expression_datasets_for_differential_co-expression_analysis/8010176. The downloaded file contains a list of simulation results, including sim102, which is packaged with dcanr. Each simulation can be accessed as shown below.

#Not evaluated
simdata <- readRDS('simdata_directory/sim812.rds')
sim10 <- simdata[[10]]

4 Running a pipeline on a simulation

Evaluations in the package are performed by creating an analysis pipeline and packaging it into a function. There are three ways to do this:

  1. Using standard in-built pipelines
  2. Using custom pipelines
  3. Retrieving pre-computed results from the standard pipelines

All three are performed through a single function, dcPipeline, with the behaviour determined by the arguments specified.
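The three invocation styles can be sketched as follows. This is an illustrative sketch, not evaluated here: the character-method and precomputed forms follow the package's dcPipeline interface, while myPipeline is a hypothetical user-defined function whose exact required signature should be checked against the dcPipeline documentation.

#Not evaluated
#1. standard pipeline: pass a method name from dcMethods()
nets_std <- dcPipeline(sim102, dc.func = 'zscore')
#2. custom pipeline: pass a function implementing the analysis
#   (myPipeline is a hypothetical user-defined function)
nets_custom <- dcPipeline(sim102, dc.func = myPipeline)
#3. retrieve precomputed results for a standard pipeline
nets_pre <- dcPipeline(sim102, dc.func = 'zscore', precomputed = TRUE)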

4.1 Standard pipelines

A standard pipeline runs the in-built inference methods with their default parameters. All four steps of an analysis are performed in sequence as described in the respective publications. To run a standard pipeline on a simulation, simply pass in a simulation and a method name from dcMethods().


#load the package and the data: a simulation
library(dcanr)
data(sim102)
#run a standard pipeline with the z-score method
dcnets <- dcPipeline(sim102, dc.func = 'zscore')
#plot the source network, true differential network and inferred networks
op <- par(no.readonly = TRUE)
par(mfrow = c(2, 2))
plotSimNetwork(sim102, main = 'Regulatory network')
plotSimNetwork(sim102, what = 'association', main = 'True differential association network')
plot(dcnets$ADR1, main = 'ADR1 KD predicted network')
plot(dcnets$UME6, main = 'UME6 KD predicted network')