1) Incoming JSON events
2) Events copied to event log, time-series stream, etc.
3) Dimension table for time bucketing
4) Join on the Epoch field; finest granularity is one day.
5) Epoch/Count JSON fields match the Highcharts input JSON.
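The steps above can be sketched in plain Node.js. This is a minimal illustration under assumed names, not the production code: `bucketEpochToDay` stands in for the Time dimension join at day granularity, and `toHighchartsSeries` shapes epoch/count rows into the `[timestamp_ms, value]` pairs that Highcharts accepts as series data.

```javascript
// Floor a Unix epoch (seconds) to the start of its UTC day --
// a stand-in for the Time dimension join at one-day granularity.
function bucketEpochToDay(epochSeconds) {
  const SECONDS_PER_DAY = 86400;
  return Math.floor(epochSeconds / SECONDS_PER_DAY) * SECONDS_PER_DAY;
}

// Aggregate raw events into per-day counts, keyed by bucket epoch.
function countByDay(events) {
  const counts = new Map();
  for (const ev of events) {
    const bucket = bucketEpochToDay(ev.epoch);
    counts.set(bucket, (counts.get(bucket) || 0) + 1);
  }
  return counts;
}

// Shape the epoch/count pairs into Highcharts-compatible series data:
// an array of [timestamp_in_milliseconds, count] pairs, sorted by time.
function toHighchartsSeries(counts) {
  return [...counts.entries()]
    .sort((a, b) => a[0] - b[0])
    .map(([epoch, count]) => [epoch * 1000, count]);
}
```

In the real system the bucketing happens in Postgres via the dimension-table join; the JS above only illustrates the same epoch-to-day-to-Highcharts flow end to end.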
I wrote this simple time-series analytics system in eight weeks with no prior Postgres experience and a few weeks of Node.js experience. Originally we prototyped with the InfluxDB time-series database, but after seeing the design, I realized I could do something similar in the pre-existing Postgres environment using the data-warehousing scripts.
It's simple, effective, and environment-appropriate: it should handle 15K-20K concurrent users and is easily scaled 10-fold by sharding incoming events at the Node.js entry URL. Node.js hosts the incoming and outgoing URLs, Postgres stores JSON events in a single table, and a Time dimension table time-slices the epoch/count output, which is impedance-matched to the Highcharts JSON input.
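The 10-fold scaling claim rests on sharding incoming events at the entry URL. A hedged sketch of one way to do that (the shard count and the hash scheme are my own illustration, not the system's actual routing): hash the event's key to pick which backend receives the insert.

```javascript
// Hypothetical shard router for incoming events. The shard count and
// the 31-multiplier string hash are illustrative choices only.
const SHARD_COUNT = 10;

// Deterministically hash any key (e.g. a host id) to a shard index.
function shardFor(key, shardCount = SHARD_COUNT) {
  let h = 0;
  for (const ch of String(key)) {
    h = (h * 31 + ch.charCodeAt(0)) >>> 0; // keep as unsigned 32-bit
  }
  return h % shardCount;
}

// The entry-URL handler would forward the insert to the Postgres
// instance behind the returned shard index.
function routeEvent(event) {
  return shardFor(event.host);
}
```

Because the hash is deterministic, all events for one host land on the same shard, so per-host graphs never need a cross-shard query.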
Many programmers couldn't have written this system, not because it's too complex but because it's too simple.
Live System (redacted subset of real system)
Median Response Time for All Hosts (uses Postgres stats library)
Event Traffic for All Hosts, raw data (140K events)
Event Traffic for All Hosts, accumulated data (140K events)
Event Traffic for Single Host, raw data
Event Traffic for Single Host, accumulated data
(Valid host params are 242 to 258, 261 to 272, and 275 to 282.
Add a "showall=false" parameter to the URL to turn off graphing, then click an individual host button to see that host's graph.)
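The raw vs. accumulated graphs above differ only by a running-total transform, and the median response time can come straight from Postgres's ordered-set aggregates. Below is a minimal sketch: `accumulate` is an assumed-name JS version of the raw-to-accumulated transform, and the SQL string shows the `percentile_cont(0.5)` form Postgres provides for medians (the `events` table and `response_ms` column are hypothetical names, not the live schema).

```javascript
// Turn a raw [epoch_ms, count] series into an accumulated
// (running-total) series, as used for the "accumulated data" graphs.
function accumulate(series) {
  let total = 0;
  return series.map(([ts, count]) => {
    total += count;
    return [ts, total];
  });
}

// Median in Postgres via the ordered-set aggregate percentile_cont.
// Table and column names here are illustrative only.
const MEDIAN_SQL = `
  SELECT percentile_cont(0.5) WITHIN GROUP (ORDER BY response_ms) AS median_ms
  FROM events;
`;
```

Doing the accumulation on the client keeps the server queries identical for both graph types; only the presentation changes.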