title: Hosting on Vultr VPS
date: February 16, 2021 18:10
---
<p>
This website was originally hosted for free with GitLab Pages. A few weeks ago
I moved it to a Vultr VPS, where it's now served by Nginx. Here are a few
reasons why I made the switch:

<ul>
    <li>
        I get a bit of insight into website traffic from Nginx logs, without
        needing to resort to something like Google Analytics.
    </li>
    <li>
        I gain some internet independence. Sure, now I'm relying on Vultr
        instead of GitLab, but I can always switch to a different VPS provider
        (or, with enough grit, use my own hardware) that offers the same
        environment, while GitLab Pages is a setup specific to GitLab.
    </li>
    <li>
        I've wanted to play around with self-hosting various services for a
        while now.
    </li>
</ul>
</p>

<h3>Current setup</h3>

<p>
As mentioned, the website is served with Nginx. It's still statically generated
with my Ruby script. I build locally and use <code>rsync</code> to incrementally
update the hosted files.
</p>
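
<p>
The deploy step is a single <code>rsync</code> invocation. A minimal sketch,
assuming the generated site lands in a local <code>public/</code> directory and
Nginx serves from <code>/var/www/m-chrzan.xyz</code> (both paths are
hypothetical):

<pre>
# -a preserves permissions and timestamps, -v lists transferred files,
# -z compresses in transit. --delete removes remote files that no longer
# exist locally, so the server mirrors the local build exactly.
rsync -avz --delete public/ deploy@m-chrzan.xyz:/var/www/m-chrzan.xyz/
</pre>

Since <code>rsync</code> only transfers files whose size or modification time
changed, redeploying after a small edit is nearly instant.
</p>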

<p>
I use <a href='https://certbot.eff.org/'>certbot</a> with the Nginx plugin to
get an SSL certificate. A cron job should ensure the certificate is renewed
automatically before it expires.
</p>
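
<p>
For reference, the rough shape of that setup &mdash; a sketch, not my exact
commands. The <code>--nginx</code> plugin both answers the domain-validation
challenge and rewrites the matching server block to use the new certificate:

<pre>
# Obtain a certificate and update the Nginx config in one step.
certbot --nginx -d m-chrzan.xyz

# A crontab entry along these lines handles renewal. certbot only renews
# certificates nearing expiry, so running it daily is harmless.
0 3 * * * certbot renew --quiet
</pre>

Running <code>certbot renew --dry-run</code> once is a handy sanity check that
renewal will succeed long before the certificate actually expires.
</p>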

<h3>Website traffic analysis</h3>
<p>
I've already gleaned some interesting statistics from the server logs. The day
I posted a Hacker News comment linking to one of my blog posts, the site
received four times as many HTTP requests as usual, including requests from
around 200 unique IPs with <code>news.ycombinator.com</code> as the referrer.
</p>

<p>
So far my log analysis has been very ad hoc &mdash; just manually parsing the
log files with command-line tools and Vim. For example, to get the
200-unique-IPs figure above, I ran

<pre>
awk '/ycombinator/ { print $1 }' logs | sort | uniq | wc -l
</pre>

I wonder if there are any good tools for parsing and analyzing Nginx logs, or if
I should build something simple of my own.
</p>
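
<p>
In the same ad hoc spirit, most questions reduce to similar one-liners.
Assuming Nginx's default combined log format &mdash; field 1 is the client IP
and field 7 is the request path &mdash; the most requested pages fall out of:

<pre>
# Count requests per path, then show the ten most popular.
awk '{ print $7 }' logs | sort | uniq -c | sort -rn | head
</pre>
</p>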

<h3>Self-hosting</h3>
<p>
In addition to this website, I'm also using the VPS to host a personal email
server and some <a href='https://git.m-chrzan.xyz'>git repos</a>. The email
server is based on Postfix and Dovecot and was painlessly installed and
configured thanks to
<a href='https://github.com/LukeSmithxyz/emailwiz'>emailwiz</a>. The git
frontend is <a href='https://git.zx2c4.com/cgit/'>cgit</a>. I might write a post
about setting it up and configuring it later. Overall, I'm quite happy with how
it looks and what it offers.
</p>