If you’re just interested in the code, it can be found over at GitLab.

Why?

As some people might know, I play an MMORPG called RuneScape. Like most MMORPGs, it has a fully active economy. Most trading goes through an in-game location referred to as the Grand Exchange. Luckily the prices of items can also be browsed from a web browser at https://secure.runescape.com/m=itemdb_rs/. Sadly, this interface often feels slow and I would have to open a new page for every single item, which makes tracking prices, especially over time, cumbersome and annoying.

First approach

Enter Grafana, a popular tool to visualize data from all kinds of data sources, like Prometheus, which I already use mostly to track server resources anyway. Initially I looked at https://grafana.com/grafana/plugins/simpod-json-datasource for a data source plugin, and I quickly had something up and running with a simple Go HTTP server combined with a tiny Go library that already gives me API access to most of the Grand Exchange API endpoints anyway. This did work, but due to the lack of API endpoints to resolve item names to IDs (the only way to do this was basically scraping the search feature of the website, in which case you would often run into Cloudflare pages) it was quite cumbersome. More importantly, this plugin completely lacked support for alerts, meaning I couldn’t set up an alert for when a certain item went above or below a certain threshold, a feature I kind of wanted.

Back to the drawing board

So, back to the drawing board. I decided to look at writing a data source myself for more flexibility. I knew that Grafana itself is written in Go, and after a few web searches I quickly ran into https://grafana.com/docs/grafana/latest/developers/plugins/backend/grafana-plugin-sdk-for-go/. This seemed perfect, so off to the races. It really was a lot easier than I initially thought: the entire logic of fetching the API and returning the data in a format Grafana likes was just a matter of implementing a simple func QueryData(ctx context.Context, req *backend.QueryDataRequest) (*backend.QueryDataResponse, error) function. And sure enough, not much later I had the core functionality working.

[Image: Grafana graph of item prices]

I sadly still had the same problem as with the first approach: item name to ID resolution being cumbersome and error prone due to Cloudflare. In this instance however I had full control over the UI, meaning I could polish it up quite a bit. After looking at the API endpoints again I ran into the catalogue. With this I could at least build up a local ‘database’ to map item names to IDs and the other way around (which, due to rate limits, would take 45 minutes to an hour). Annoyingly enough, this did mean we would have to have this file with all items accessible on the machine running Grafana. And slightly more annoyingly, we would have to keep it in sync, as items are regularly added to the game and therefore to the Grand Exchange.

As I already had the tool for exactly this and already ran it in Gitlab CI anyway, I ended up doing this in CI as well. This was really just a matter of restricting the job to schedules only and adding a weekly schedule on the same day that most updates happen (Monday in this case), and of course afterwards publishing the result to Gitlab Pages. The relevant .gitlab-ci.yml can be found here (the only reason for the “external” tag on that job is so it runs on my own server and doesn’t eat into precious shared Gitlab CI runner minutes). After that I could simply make the data source plugin fetch this list from Gitlab Pages on start up and build its internal database from that. As I regularly update and restart the plugin anyway, keeping it in sync isn’t exactly an issue, but I suppose I could make it refresh automatically in the future.
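The in-memory ‘database’ itself is then just two maps built from that published list. A sketch of the idea, assuming for illustration a simple JSON object of name to ID (the actual file format and the item ID below are hypothetical):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// itemDB maps item names to Grand Exchange IDs and back, so both
// directions (query editor: name -> ID, display: ID -> name) are cheap.
type itemDB struct {
	byName map[string]int
	byID   map[int]string
}

// newItemDB builds both lookup directions from the raw JSON that
// would normally be fetched from Gitlab Pages on start up.
func newItemDB(raw []byte) (*itemDB, error) {
	var names map[string]int
	if err := json.Unmarshal(raw, &names); err != nil {
		return nil, err
	}

	db := &itemDB{byName: names, byID: make(map[int]string, len(names))}
	for name, id := range names {
		db.byID[id] = name
	}
	return db, nil
}

func main() {
	// Hypothetical excerpt of the published item list.
	db, err := newItemDB([]byte(`{"Bandos chestplate": 25016}`))
	if err != nil {
		panic(err)
	}
	fmt.Println(db.byName["Bandos chestplate"], db.byID[25016])
}
```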

Next up: caching. I didn’t exactly want to hit the Jagex servers every time Grafana refreshed, mostly because I wanted it to scale and not run into rate limits with many items in the same graph. Besides, the prices only update once per day anyway, and sometimes their servers have some hiccups. The graph API responses kind of imply that updates happen at UTC midnight; they look roughly like {"daily":{"1419897600000":15633853,"1419984000000":15475988,"1420070400000":15379017},"average":{"1419897600000":14708793,"1419984000000":14764787,"1420070400000":148288055}}, where the timestamp is always midnight UTC. I however remembered that the actual updates happen at random times throughout the day: years ago there was a bot on IRC that would announce whenever an update was detected, and the time was always different. After writing a quick script and running it for a few days, I confirmed that this is still the case.

package main

import (
	"log"
	"net/http"
	"time"
	// plus the Grand Exchange client library mentioned above, imported as "ge"
)

func main() {
	var prev time.Time

	g := ge.Ge{
		Client: http.DefaultClient,
	}

	// Poll the price graph of a single item once a minute and log
	// whenever the reported latest data point changes.
	for range time.Tick(time.Minute) {
		graph, err := g.PriceGraph(31725)
		if err != nil {
			continue
		}

		latest, _ := graph.LatestPrice()
		if !latest.Equal(prev) { // time.Time should be compared with Equal, not !=
			prev = latest
			log.Printf("Price updated! Reported time is %s\n", latest)
		}
	}
}

//2020/12/26 05:07:09 Price updated! Reported time is 2020-12-26 01:00:00 +0100 CET
//2020/12/27 08:33:09 Price updated! Reported time is 2020-12-27 01:00:00 +0100 CET
//2020/12/28 11:14:09 Price updated! Reported time is 2020-12-28 01:00:00 +0100 CET

Because of these random update times, the cache duration is slightly variable: it makes no sense to keep hitting the API if we know it already updated today. So I quickly settled on caching until the end of the day in UTC if the latest update was today, and otherwise just for 1 minute, to at least be nice: https://gitlab.com/schoentoon/rsge-grafana/-/blob/master/pkg/plugin.go#L98-115.

All in all this was a lot easier than I thought. If you’re interested in running this yourself, the repository can be found at https://gitlab.com/schoentoon/rsge-grafana, including installation instructions. Do however note that it currently only supports the RS3 Grand Exchange and has no support for Old School, mostly because I don’t play that personally. Adding support for it should be trivial though; just open a ticket and I might add it.

And since writing a Grafana data source is apparently this easy: next up, replacing the script that polls data from https://exchangeratesapi.io/ into a local PostgreSQL database with a Grafana data source instead.