This R package calculates PRObabilistic Pathway Scores (PROPS), which are pathway-based features, from gene-based data. For more information, see:
Lichy Han, Mateusz Maciejewski, Christoph Brockel, William Gordon, Scott B. Snapper, Joshua R. Korzenik, Lovisa Afzelius, Russ B. Altman. A PRObabilistic Pathway Score (PROPS) for Classification with Applications to Inflammatory Bowel Disease.
Example healthy data and patient data are included in the package. Note that each data frame is samples x genes, and the columns are named by Entrez gene ID.
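The expected input layout can be sketched with a small base-R example (the values and sample names below are illustrative, not the package's bundled data):

```r
# Build a samples-x-genes expression data frame.
# Column names are Entrez gene IDs (as character strings); rows are samples.
set.seed(1)
expr <- as.data.frame(matrix(runif(10, min = 1, max = 9), nrow = 2, ncol = 5))
colnames(expr) <- c("10", "100", "1000", "10000", "10005")  # Entrez IDs
rownames(expr) <- c("HealthySample1", "HealthySample2")

dim(expr)  # 2 samples x 5 genes
```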
| | 10 | 100 | 1000 | 10000 | 10005 |
|---|---|---|---|---|---|
| HealthySample1 | 6.993072 | 7.890769 | 1.187037 | 3.186920 | 2.243749 |
| HealthySample2 | 7.170742 | 5.786940 | 6.681429 | 4.765805 | 5.404951 |
| HealthySample3 | 4.458645 | 4.302536 | 3.893047 | 1.921196 | 8.437868 |
| HealthySample4 | 3.215118 | 5.515139 | 4.178101 | 6.063425 | 6.152115 |
| HealthySample5 | 4.456360 | 8.889095 | 6.342727 | 8.625195 | 2.956368 |
| | 10 | 100 | 1000 | 10000 | 10005 |
|---|---|---|---|---|---|
| Sample1 | 3.848834 | 3.485361 | 5.650030 | 7.020021 | 3.996484 |
| Sample2 | 4.165301 | 1.808756 | 5.927112 | 10.229086 | 6.963043 |
| Sample3 | 4.874789 | 4.627511 | 6.410964 | 6.036542 | 4.334305 |
| Sample4 | 1.295941 | 4.173276 | 2.408115 | 4.329023 | 3.800146 |
| Sample5 | 8.924654 | 4.229810 | 3.939608 | 7.241261 | 5.378624 |
KEGG pathway edges are included as part of this package and are used by default. To run PROPS, call the `props` function with the healthy data and the disease data.
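A minimal call might look like the following; the object names match the package's bundled example data, and the run is guarded so it is simply skipped when PROPS is not installed:

```r
# Compute pathway-level PROPS features from healthy and disease expression data.
# example_healthy and example_data ship with the package; KEGG edges are the default.
if (requireNamespace("PROPS", quietly = TRUE)) {
  library(PROPS)
  props_features <- props(example_healthy, example_data)
  head(props_features)  # one row per KEGG pathway, one column per disease sample
}
```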
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 87178 were 0 or NaN.
## (analogous warnings for many other nodes omitted)
| pathway_ID | Sample1 | Sample2 | Sample3 | Sample4 |
|---|---|---|---|---|
| 00010.xml | -156.17346 | -157.15489 | -146.58073 | -147.62154 |
| 00020.xml | -66.07960 | -62.15872 | -68.82764 | -63.20575 |
| 00030.xml | -66.14898 | -64.39339 | -62.35845 | -66.02942 |
| 00040.xml | -75.16114 | -58.61277 | -64.90399 | -60.53265 |
| 00051.xml | -67.80608 | -68.76991 | -71.95501 | -72.96688 |
This package also includes an optional flag to apply ComBat batch correction via the sva package. Suppose we have two batches of data, where the first 25 samples of example_healthy and the first 20 samples of example_data are from batch 1, and the remaining samples are from batch 2. Run the props command with batch_correct set to TRUE, supplying the batch assignments.
```r
healthy_batches <- c(rep(1, 25), rep(2, 25))
dat_batches <- c(rep(1, 20), rep(2, 30))
props_features_batchcorrected <- props(example_healthy, example_data,
                                       batch_correct = TRUE,
                                       healthy_batches = healthy_batches,
                                       dat_batches = dat_batches)
```
## Found 2 batches
## Adjusting for 0 covariate(s) or covariate level(s)
## Standardizing Data across genes
## Fitting L/S model and finding priors
## Finding parametric adjustments
## Adjusting the Data
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 87178 were 0 or NaN.
## (analogous warnings for many other nodes omitted)
## probabilities for node 3673 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3674 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3675 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3676 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3678 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3679 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3680 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3685 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 8515 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 8516 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3716 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3718 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 7297 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 116379 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 1271 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 1438 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 1439 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 1441 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 149233 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 163702 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 2057 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 2690 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3454 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3455 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3459 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3460 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3559 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3560 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3561 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3563 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3566 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3568 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3570 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3572 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3575 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3581 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3587 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3588 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3590 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3594 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3595 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3597 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3598 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3601 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3953 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 3977 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 4352 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 50615 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 53832 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 53833 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 5618 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 58985 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 64109 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 9180 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 9466 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 2774 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 2782 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 2788 were 0 or NaN.
## Warning in entropy.loss(fitted = object, data = data, keep = nodes, by.sample =
## by.sample): 50 observations were dropped because the corresponding
## probabilities for node 51764 were 0 or NaN.
| pathway_ID | Sample1 | Sample2 | Sample3 | Sample4 |
|---|---|---|---|---|
| 00010.xml | -154.30816 | -156.75131 | -144.21497 | -142.97078 |
| 00020.xml | -66.00799 | -61.88380 | -68.12112 | -62.64071 |
| 00030.xml | -65.63966 | -63.86724 | -62.03746 | -65.93344 |
| 00040.xml | -74.86549 | -58.33588 | -64.11862 | -60.10108 |
| 00051.xml | -67.53138 | -67.46415 | -71.16755 | -72.69114 |
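The table above shows the first rows of the default run's output. A minimal sketch of the call that produces it, using the KEGG edges bundled with the package and the included example data:

```r
library(PROPS)

# Default run: the KEGG edges shipped with the package are used
# when no pathway_edges argument is given.
props_features <- props(example_healthy, example_data)

# Rows are KEGG pathways (by pathway ID); columns are disease samples.
props_features[, 1:5]
```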
Users can also supply their own pathways, so the package is compatible with additional pathway databases as well as pathways hand-curated from the literature or from data. To do so, format the pathways as a three-column table, where the first column is the source ("from") node of each edge, the second column is the sink ("to") node, and the third column is the pathway ID (e.g. "glucose_metabolism").
| from | to | pathway_ID |
|---|---|---|
| 7476 | 8322 | pathway1 |
| 3913 | 3690 | pathway1 |
| 26060 | 836 | pathway1 |
| 163688 | 5532 | pathway1 |
| 84812 | 10423 | pathway1 |
| 57104 | 1056 | pathway1 |
| 9651 | 8396 | pathway1 |
| 3976 | 3563 | pathway1 |
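Such an edge table can be assembled directly in R. A minimal sketch, reusing the first few Entrez IDs from the rows above (the name `my_edges` is illustrative; any data frame with this three-column layout works):

```r
# Three-column edge table: "from" node, "to" node, pathway ID
my_edges <- data.frame(
  from       = c(7476, 3913, 26060),
  to         = c(8322, 3690, 836),
  pathway_ID = c("pathway1", "pathway1", "pathway1")
)
```

This data frame is then passed to `props` via the `pathway_edges` argument.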
Run props with the user-specified edges as follows:

```r
props_features_userpathways <- props(example_healthy, example_data, pathway_edges = example_edges)
props_features_userpathways[, 1:5]
```
| pathway_ID | Sample1 | Sample2 | Sample3 | Sample4 |
|---|---|---|---|---|
| pathway1 | -370.3345 | -372.7717 | -397.3746 | -370.7386 |
| pathway2 | -355.6405 | -343.6026 | -354.3261 | -339.1284 |
| pathway3 | -357.4726 | -354.8797 | -352.8343 | -353.7832 |