AsciiDoc Example

Table (Playing with Kaggle; uses AsciiDoc includes)

--- snip ---

Model           | Epochs | Batch size | Learning rate | Momentum | Result (local) | Result (Kaggle) | Remarks
SimpleNet       | 50     | 20         | 0.007         | 0.9      | ~97            |                 |
ConvNet         | 50     | 25         | 0.008         | 0.9      | 99.257         |                 |
"               | 50     | 17         | 0.008         | 0.9      | 99.1964        |                 | augmented
"               | 50     | 17         | 0.008         | 0.9      | 99.3143        | 99.342          | augmented bn
Binary Ensemble | 25     | 17         | 0.007         | 0.9      | >99            |                 |
"               | 22     | 17         | 0.0085        | 0.9      | 99.23928       | 99.328          | augmented
"               | 22     | 17         | 0.0085        | 0.9      | 99.34643       | 99.357          | augmented bn

--- snap ---

(1) augmented: one additional variant per image, rotated randomly by an angle in [-5, 5] degrees (see the sketch below).
(2) bn: batch-norm
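
The augmentation itself is not shown here; the following is a minimal sketch of how one rotated variant per image could be generated. The function name, array shapes, and scipy-based approach are assumptions for illustration, not the project's actual code.

# Sketch: append one randomly rotated copy (angle in [-5, 5] degrees) per image.
# Assumes images come as an (N, 28, 28) float array.
import numpy as np
from scipy.ndimage import rotate


def augment_with_rotations(images, max_deg=5.0, seed=0):
    rng = np.random.default_rng(seed)
    rotated = np.stack([
        rotate(img, angle=rng.uniform(-max_deg, max_deg),
               reshape=False, order=1, mode='constant', cval=0.0)
        for img in images
    ])
    return np.concatenate([images, rotated], axis=0)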

Mathematics

Inline Math: f_{i}(x) = \int_{\tau(\pi)}^\infty e(x,y)\,dy

Display Math:

\begin{bmatrix}
a_{11} & a_{12} & \dots \\
\vdots & \ddots & \\
a_{n1} & & a_{nn}
\end{bmatrix}

Inline Math: (\Omega,\mathcal{F},P) \coloneqq (\Omega^+ \times \Omega^-, \mathcal{F}^+ \otimes \mathcal{F}^-, P^+ \times P^-)

Display Math:

The Wiener process in (\Omega,\mathcal{F},P) is defined by

\omega(t) \coloneqq \left\{ \begin{array}{cc} (\omega^+(t),\,0) & t \geq 0 \\ (0,\,\omega^-(t)) & t < 0 \end{array} \right.
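
A minimal numerical sketch of this gluing construction, sampling the two independent one-sided paths on a grid. The step size, horizon, and indexing the negative branch by |t| are assumptions made only for illustration.

import numpy as np

rng = np.random.default_rng(0)
dt, n = 1e-3, 10_000  # grid step and number of steps per half-axis

# Two independent one-sided Wiener paths, each started at 0
omega_plus = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n))))
omega_minus = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n))))


def omega(t):
    # Piecewise: (omega^+(t), 0) for t >= 0, (0, omega^-(t)) for t < 0
    i = min(int(round(abs(t) / dt)), n)
    return (omega_plus[i], 0.0) if t >= 0 else (0.0, omega_minus[i])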

More math…

\tag{3.1a}
\begin{aligned}
dy_t &= \sum_{i=1}^n \frac{x_t^i}{\|x_t\|^2} \left( \sum_{j=1}^n u_{ij} x_t^i\,dt + \sum_{k=1}^m \sum_{j=1}^n v_{ij}^k x_t^j \circ dW_t^k \right) + \cdots \\
\cdots &+ \frac{1}{2} \sum_{i,j}^n \left( \frac{\delta_{ij}}{\|x_t\|^2} - \frac{2 x_t^i x_t^j}{\|x_t\|^4} \right) \sum_k^m \sum_{l,p}^n v_{il}^k v_{jp}^k\, x_t^l x_t^p \; dt
\end{aligned}

With z_t \coloneqq \frac{x_t}{\|x_t\|} we get the following differential equation on the unit sphere:

\tag{3.1b}
\begin{aligned}
y_t &= y_0 + \int_0^t z_t^T U z_t - \|z_t\|^{-2} z_t^T \hat{V} \hat{V}^T z_t + \frac{1}{2}\,\text{trace}(\hat{V}\hat{V}^T)\,dt + \cdots \\
\cdots &+ \sum_{k=1}^m \int_0^t z_t^T V^k z_t \circ dW_t^k
\end{aligned}

Outline as partial TeX file inclusion

Line block: includes only the inner part of a LaTeX display-math environment by specifying line delimiters for the included LaTeX file:

--- snip ---

\begin{aligned}
z &= z(x, y) \\
x &= x(s_1, s_2) \\
y &= y(t_1, t_2) \\
s_i &= s_i(w) \; \forall i \in \{1,2\} \\
t_i &= t_i(w) \; \forall i \in \{1,2\}
\end{aligned}

--- snap ---

Single line (as inline): start include → {\partial z\over\partial w}={\partial z\over\partial x}\cdot\Bigg({\partial x\over\partial s_1}\cdot{\partial s_1\over\partial w} + {\partial x\over\partial s_2}\cdot{\partial s_2\over\partial w}\Bigg) + {\partial z\over\partial y}\cdot\Bigg({\partial y\over\partial t_1}\cdot{\partial t_1\over\partial w} + {\partial y\over\partial t_2}\cdot{\partial t_2\over\partial w}\Bigg) ← stop include
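
The included formula is the multivariable chain rule applied to the composition above. A small SymPy check with concrete, made-up choices for z, x, y, s_i, t_i confirms it for one instance:

import sympy as sp

w = sp.symbols('w')

# Made-up smooth inner functions -- any choice works for this check
s1, s2 = w**2, sp.sin(w)
t1, t2 = sp.exp(w), w + 1
x = s1 * s2              # x = x(s1, s2)
y = t1 + t2**2           # y = y(t1, t2)
z = x * y + x**2         # z = z(x, y)

# dz/dw computed directly on the fully composed expression
lhs = sp.diff(z, w)

# dz/dw assembled term by term, as in the included formula
dz_dx = y + 2 * x        # partial of z = x*y + x^2 w.r.t. x
dz_dy = x                # partial of z = x*y + x^2 w.r.t. y
rhs = (dz_dx * (s2 * sp.diff(s1, w) + s1 * sp.diff(s2, w))
       + dz_dy * (1 * sp.diff(t1, w) + 2 * t2 * sp.diff(t2, w)))

assert sp.simplify(lhs - rhs) == 0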

Footnotes

Footnotes are possible, for example [1] and [2].

Citations

SVG image

[Image: neuron]
Caption (Neuron image - 50%)

Code

Verbatim file inclusion

# include example

from typing import Tuple

import numpy as np
import pandas as pd

from torch.utils.data import TensorDataset
from torch import Tensor, LongTensor, FloatTensor


def loadData(path) -> Tuple[np.ndarray, np.ndarray]:
    '''
    Load data from the Kaggle MNIST set.
    '''
    # Read
    df = pd.read_csv(str(path))  # 40,000 entries
    # tdata = pd.read_csv(data_raw_dir + sep + 'train.csv') # 28,000 entries

    has_labels = 'label' in df.columns

Normal code block

from typing import Tuple

import numpy as np
import pandas as pd

from torch.utils.data import TensorDataset
from torch import Tensor, LongTensor, FloatTensor


def loadData(path) -> Tuple[np.ndarray, np.ndarray]:
    '''
    Load data from the Kaggle MNIST set.

    path -- input csv

    Return scaled images in [0,1] and labels (if available)
    as numpy arrays (dtype: float32, int64).
    '''
    # Read
    df = pd.read_csv(str(path))  # 40,000 entries
    # tdata = pd.read_csv(data_raw_dir + sep + 'train.csv') # 28,000 entries

    has_labels = 'label' in df.columns
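
The excerpt imports TensorDataset but stops before using it. Below is a sketch of how the loaded arrays might be wrapped for training, assuming loadData is completed as its docstring describes; the file name is a placeholder and the batch size is taken from the table above.

import torch
from torch.utils.data import TensorDataset, DataLoader

images, labels = loadData('train.csv')             # placeholder path
dataset = TensorDataset(torch.from_numpy(images),  # float32 -> FloatTensor
                        torch.from_numpy(labels))  # int64   -> LongTensor
loader = DataLoader(dataset, batch_size=17, shuffle=True)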

Some inline code.


1. First footnote
2. Second footnote
3. Interesting in terms of content, by the way

References

Giovanni Dematteis, Tobias Grafke, and Eric Vanden-Eijnden. Rogue waves and large deviations in deep sea. Proceedings of the National Academy of Sciences, 115(5):855–860, January 2018. doi:10.1073/pnas.1710670115.

Hugo Touchette. A basic introduction to large deviations: Theory, applications, simulations. arXiv:1106.4146 [cond-mat, physics:math-ph], February 2012. arXiv:1106.4146.