An Example Post
Here is some extra detail about the post.
Informal justifications
In Auto-Encoding Variational Bayes (missing reference), Kingma presents an unbiased, differentiable, and scalable estimator for the ELBO in variational inference. A key idea behind this estimator is the reparameterization trick. But why do we need this trick in the first place? When I first learned about variational autoencoders (VAEs), I tried to find an answer online, but the explanations I found were too informal…
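A minimal sketch of the trick in JAX may help (the names and the toy objective are illustrative, not from the paper): instead of sampling z ~ N(mu, sigma²) directly, which blocks gradients, we write z = mu + sigma · eps with eps ~ N(0, 1), so the randomness is pushed into eps and gradients flow through mu and sigma.

```python
import jax.numpy as jnp
from jax import grad, random

def sample_z(mu, log_sigma, key):
    # Reparameterization: z = mu + sigma * eps, with eps ~ N(0, 1).
    eps = random.normal(key, mu.shape)
    return mu + jnp.exp(log_sigma) * eps

def objective(params, key):
    # Toy stand-in for a Monte Carlo ELBO term.
    mu, log_sigma = params
    z = sample_z(mu, log_sigma, key)
    return jnp.sum(z ** 2)

key = random.PRNGKey(0)
params = (jnp.zeros(3), jnp.zeros(3))
g = grad(objective)(params, key)  # gradients w.r.t. mu and log_sigma
print(g)
```

Because the sample is a deterministic function of (mu, log_sigma) given eps, `grad` differentiates straight through it; sampling z directly inside the objective would give no gradient path to the parameters.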
Cool!
import numpy as np
import jax.numpy as jnp
from jax import grad, jit, vmap
from jax import random
import os
import sys
# check executable path
print(sys.executable)
Here is something else you can do:
<div id='posts' class='section'>
  {% for post in site.posts %}
    <div class='post-row'>
      <p class='post-title'>
        <a href="{{ post.url }}">
          {{ post.title }}
        </a>
      </p>
      <p class='post-date'>
        {{ post.date | date_to_long_string }}
      </p>
    </div>
    <p class='post-subtitle'>
      {{ post.subtitle }}
    </p>
  {% endfor %}
</div>
Citations
According to (Bishop, 2006), machine learning … You can cite another one (Vaswani et al., 2017). You can also do inline citation, Bishop (2006) argues that …
Figures
Without zoom

Figure 1. Diagram of rejection sampling. The scaled proposal density $kq(z)$ must always be greater than the unnormalized target $\tilde{p}(z)$. A new sample is rejected if it falls in the gray region and accepted otherwise. These accepted samples are distributed according to $p(z)$. This is achieved by sampling $z_0$ from $q(z)$, and then sampling $u_0$ uniformly from $[0, kq(z_0)]$. Samples under the curve are accepted.
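The procedure in the caption can be sketched in a few lines of NumPy (a toy example, assuming an unnormalized standard-normal target and a uniform proposal on $[-5, 5]$):

```python
import numpy as np

rng = np.random.default_rng(0)

def p_tilde(z):
    # Unnormalized target density (standard normal up to a constant).
    return np.exp(-z ** 2 / 2)

# Proposal q = Uniform(-5, 5); pick k so that k * q(z) >= p_tilde(z) everywhere.
q_density = 1 / 10
k = 1.0 / q_density  # k * q(z) = 1 >= p_tilde(z)

def rejection_sample(n):
    samples = []
    while len(samples) < n:
        z = rng.uniform(-5, 5)             # sample z from q
        u = rng.uniform(0, k * q_density)  # sample u uniformly from [0, k q(z)]
        if u <= p_tilde(z):                # accept if the point falls under the curve
            samples.append(z)
    return np.array(samples)

zs = rejection_sample(10_000)
print(zs.mean(), zs.std())  # both should be close to the standard normal's 0 and 1
```

The acceptance rate equals the area under $\tilde{p}$ divided by the area under $kq$, so a tighter envelope wastes fewer proposals.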
With zoom

Figure 2. OECD HAN database structure: there are four tables, of which HAN_PERSON is the correspondence table that includes all cleaned names.
Tables
Table 1. First six rows of HAN_PATENTS, which starts at HAN_ID = 4.
| HAN_ID | HARM_ID | Appln_id | Publn_auth | Patent_number |
|---|---|---|---|---|
| 4 | 4 | 311606173 | US | US8668089 |
| 7 | 7 | 439191607 | US | US9409947 |
| 7 | 7 | 518367793 | US | US10836794 |
| 10 | 10 | 365204276 | US | US8513480 |
| 14 | 14 | 336903179 | WO | WO2011112122 |
| 14 | 14 | 363622722 | WO | WO2012064218 |
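Since HAN_PERSON is the correspondence table, a typical use is joining patents to their harmonized applicant names on HAN_ID. A small pandas sketch, using the HAN_ID and Patent_number values from Table 1 (the `Clean_name` column and its values are invented for illustration):

```python
import pandas as pd

# Toy slice of HAN_PATENTS, mimicking Table 1.
han_patents = pd.DataFrame({
    "HAN_ID": [4, 7, 7],
    "Patent_number": ["US8668089", "US9409947", "US10836794"],
})

# Hypothetical slice of HAN_PERSON, the correspondence table of cleaned names.
han_person = pd.DataFrame({
    "HAN_ID": [4, 7],
    "Clean_name": ["ACME CORP", "GLOBEX LTD"],  # illustrative names
})

# Join each patent to its harmonized applicant name via HAN_ID.
merged = han_patents.merge(han_person, on="HAN_ID", how="left")
print(merged)
```

A left join keeps every patent row even when a HAN_ID has no match in the correspondence table.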
Table and figure aligned
| Top 5 IPC class | Count |
|---|---|
| B29C70-38 | 14 |
| B29C70-54 | 13 |
| G08G5-00 | 13 |
| B64C39-02 | 13 |
| G05D1-00 | 12 |

Figure 4. The IPC distribution of Airbus Defence (DE)'s patents; section B covers performing operations and transporting; section H is electricity; section G is physics. Remark: the total number of granted patents is 538, but the total count of IPC classes is 1763, which means some patents are assigned to multiple classes.
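The remark's gap between patent count and class count comes from the one-to-many patent–IPC relation, which is easy to see with a tiny pandas example (the patent IDs and class assignments here are hypothetical):

```python
import pandas as pd

# Each patent can carry several IPC classes, so summed class counts
# exceed the patent count (538 patents vs. 1763 assignments in Figure 4).
patents = pd.DataFrame({
    "Patent_number": ["EP1", "EP2"],  # hypothetical IDs
    "IPC": [["B29C70-38", "B64C39-02"], ["G05D1-00"]],
})

# explode() turns each list entry into its own row before counting.
counts = patents.explode("IPC")["IPC"].value_counts()
print(len(patents), counts.sum())  # 2 patents, 3 class assignments
```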
Table overflow, if you need it
| Analysis | Patent statistics | Purposes |
|---|---|---|
| Citation analysis | Forward citations | Measure technological value |
| | Backward citations | Find knowledge source |
| Patent counts analysis | Patent counts | Observe patent portfolio |
| | RTA (Revealed Technology Advance) | Identify core technological competence |
| | PS (Patent Share) | |
| Technology class analysis | Generality | Measure endogenous applicability to different technological fields |
| | Originality | Measure knowledge absorption from different technological fields |
| Inventor analysis | Inventor counts | Measure invention quality |
| | | Measure absorptive capability |
| | Inventor | Identify specific inventors' info such as star engineers |
| | | Follow mobility of R&D personnel |
- Bishop, C. M. (2006). Pattern Recognition and Machine Learning. Springer.
- Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., & Polosukhin, I. (2017). Attention is all you need. Advances in Neural Information Processing Systems, 30.