Architecture for the future

After that EVE-centric post on scalability (thanks to everyone who linked it in, hope it was an interesting read), I figured it was time to return to EVE Metrics and my other sites, accVIEW and ISKsense.

In the next week we will be migrating to a new server. It’s with the same host, in the same datacenter; the machine is slightly faster, has four times as much RAM (8GB), and adds another 10kRPM hard drive. As part of the migration we’ll be making some changes to the software architecture running the show.

The main difference is that we’re moving away from Passenger, also known as mod_rails. It has some advantages in low-memory conditions, but it’s been more trouble than it’s worth, so we’ll be moving back to running application servers manually as daemons, using the excellent Thin application server. For the sites running PHP on the server (this blog, for example), we’ll keep using PHP-FPM as we do currently; we’ve had no issues with that. Both will sit behind nginx as reverse-proxied backends. Nginx has done very well as a web server: it’s very fast and easy to configure.
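To make that concrete, here’s a hypothetical sketch of the nginx side of such a setup: a Thin cluster as an upstream, and PHP handed to PHP-FPM over FastCGI. All the ports, hostnames, and paths here are illustrative, not our actual config.

```nginx
# A Thin cluster: one daemon per port, started separately
upstream thin_cluster {
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;
    server 127.0.0.1:3002;
}

server {
    listen 80;
    server_name eve-metrics.example;
    root /var/www/eve-metrics/public;

    # Serve static files directly; hand everything else to Thin
    try_files $uri @thin;

    location @thin {
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_pass http://thin_cluster;
    }
}

server {
    listen 80;
    server_name blog.example;
    root /var/www/blog;

    # PHP goes to PHP-FPM over FastCGI
    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass 127.0.0.1:9000;
    }
}
```

The nice part of this shape is that adding capacity is just a matter of starting another Thin daemon and adding a line to the upstream block.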

There’s only one other major change: we’ll be putting Varnish, a high-performance HTTP cache, in front of nginx. This will let us leverage HTTP caching in our applications far more efficiently and speed up requests dramatically. Right now we don’t use HTTP caching much; we’d like to change that, particularly in EVE Metrics’ API, so Varnish can handle a good portion of the thousands of API calls we get asking for the price of trit or what have you. All in all it’ll mean reduced load on the application cluster, which means we can keep that smaller and lighter, which in turn means more room for the database in memory.
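Roughly, the VCL for that looks something like the sketch below: cache anonymous GETs, pass everything else straight through to nginx, and give API responses a short TTL so price data stays reasonably fresh. This is an illustration of the idea, not our actual config; ports and TTLs are made up.

```vcl
backend nginx {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # Only cache safe, anonymous requests; anything with a session
    # cookie goes straight to the backend
    if (req.request != "GET" && req.request != "HEAD") {
        return (pass);
    }
    if (req.http.Cookie) {
        return (pass);
    }
    return (lookup);
}

sub vcl_fetch {
    # Short TTL on API responses: thousands of "price of trit" calls
    # collapse into one backend hit every few minutes
    if (req.url ~ "^/api/") {
        set beresp.ttl = 5m;
    }
    return (deliver);
}
```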

That translates to better performance on the more complex parts of the site: market pages, your account page, corporate pages. Better performance also means we can build more. We’re waiting for the new capacity before we add asset support, one of the things we’re really looking forward to, since it will unlock a whole new level of functionality by feeding far more information to processes like our inferred trade detector and our planned fulfilled-orders listings. Plus we’ll be adding asset valuation tools, of course.

The architecture I’ve described above will basically be ‘it’ for now. There’s more complication at the application and database layers (we still run a tiny MySQL server for a few legacy applications), but at the app layer it mainly consists of background processing tools and, for EVE Metrics, tasks that are executed on a VPS with the results uploaded back to the server (we now do all the major CSV dumps on Makurid’s VPS).

As the guy who ends up fixing all this when it goes wrong, simplicity is always my main priority, but the added complexity of Thin and Varnish should be well worth it in the long run.

Dust 514, some EM2 teasers and statistics

As you might have heard, CCP announced Dust 514, their console MMO shooter, and then announced it was to be integrated with EVE, especially with alliances and corporations. I just don’t see how it can work, honestly.

Half the point of EVE is that its userbase is a very mature one, the sort of crowd who sit in their room playing with internet spaceships. Does EVE really want the Halo players of the world contributing to the game? I honestly don’t think so. It’s a very snobbish view, I know, but the average console gamer probably doesn’t want to spend their time liaising with alliances and planning the takeover of space from other alliances with internet spaceships they can’t see. What’s in it for the Dust 514 players, anyway? A mission system that could easily be done with some clever AI, from what I hear. I remain highly sceptical and look forward to seeing what CCP has planned in further detail; if they make it work it’ll be fantastic. But it’s a big if.

On a much lighter note, I’ve got some snippets from EM2’s new API integration. We’re being really thorough with this so far: we have separate workers to download and to process the API, meaning we can get around the API-being-slow bottleneck by having 3-5 workers just downloading and a few doing the processing (which is quick). We also wanted to give you, the user, tons of control over what information we load into EVE Metrics. Read on for detailed information.
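The downloader/processor split can be sketched in plain Ruby with threads and queues. This is a deliberately simplified stand-in for what the real workers do; the names, counts, and fake “fetch” are all illustrative.

```ruby
# Simulated pipeline: several slow "download" workers feed one fast
# "process" worker through a queue, so the slow API fetches overlap.
def run_pipeline(keys, downloaders: 3)
  download_queue = Queue.new
  result_queue   = Queue.new
  keys.each { |k| download_queue << k }
  downloaders.times { download_queue << :done }  # one stop token per worker

  workers = downloaders.times.map do
    Thread.new do
      while (key = download_queue.pop) != :done
        sleep 0.01                               # stand-in for a slow API fetch
        result_queue << { key: key, xml: "<orders for=#{key}>" }
      end
    end
  end

  processed = []
  processor = Thread.new do
    loop do
      doc = result_queue.pop
      break if doc == :done
      processed << doc[:key]                     # parsing is fast vs. fetching
    end
  end

  workers.each(&:join)
  result_queue << :done
  processor.join
  processed
end
```

With a handful of downloaders and one processor, several slow fetches are in flight at once while the parsing side keeps up comfortably, which is the whole point of the split.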

Too long, more on EM2

Blimey. Twitter has distracted me from actually keeping this blog up to date. Still, better late than never, eh?

Since my last post a lot has happened. Temperatures have risen to record highs, reducing productivity as I persevered in building my own air conditioning on the cheap (which looks like a probable failure, pending a cheap pump). Charactr has managed to produce all sorts of interesting bugs in my inbox, delayed_job has broken in some new and interesting ways, and EVE Metrics 2 development has steamed ahead.

Let’s look at the last one quickly.

So far we’ve built most of the basic backend for EM2 in a way that avoids scalability issues as much as possible. We’re using PostgreSQL instead of MySQL, we’re using table partitioning, our upload processor does most of the log parsing with a custom C extension we’ve put together (0.06 seconds, as opposed to nearly 2 doing it natively in Ruby), and we’ve added more features and tools for statistics like inferred trade tracking and per-upload statistics.
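For the curious, the partitioning side looks roughly like the sketch below, using PostgreSQL’s inheritance-based approach: one child table per month of market history, with CHECK constraints so the planner can skip partitions that can’t match a query. Table and column names here are hypothetical, not our actual schema.

```sql
CREATE TABLE history (
    type_id     integer   NOT NULL,
    region_id   integer   NOT NULL,
    price       numeric   NOT NULL,
    volume      bigint    NOT NULL,
    recorded_at timestamp NOT NULL
);

-- One child per month; the CHECK constraint is what lets the planner
-- exclude partitions outside the queried date range
CREATE TABLE history_2009_07 (
    CHECK (recorded_at >= '2009-07-01' AND recorded_at < '2009-08-01')
) INHERITS (history);

CREATE TABLE history_2009_08 (
    CHECK (recorded_at >= '2009-08-01' AND recorded_at < '2009-09-01')
) INHERITS (history);

-- With constraint_exclusion enabled, a one-month query touches one table:
-- SELECT avg(price) FROM history
--  WHERE recorded_at >= '2009-08-01' AND recorded_at < '2009-09-01';
```

The win is that old months become effectively cold storage: queries over recent data never have to wade through years of history.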

So, a few things we’re pretty sure will be in EM2 at release:

  • Inferred trade statistics and display – By observing changing or vanishing orders, we can make educated guesses about how much of an item is being bought, and at what price. It’s not perfect, but it’s mostly accurate.
  • Performance. Pages will load quickly.
  • Better APIs for developers – Including movement, historic price data (the same stuff you can see in-game on the graph tab) and other oft-requested APIs.
  • Trade Finder. If you’ve got a hauler and some time, and want to make some ISK, this will tell you the optimal way to do so.
  • API integration. You’ll be able to put in your API key to have your market orders automatically updated on the site, optionally fed into the main market display for more accurate data overall, and so on.
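The inferred-trades idea from the list above boils down to diffing two snapshots of the same order book: sell volume that shrank or vanished between snapshots was probably bought. Here’s a heavily simplified Ruby sketch of that; the real logic has to cope with reposts, partial updates, and so on, and all the order IDs and prices below are invented.

```ruby
# Diff two snapshots of an order book: any positive drop in an order's
# remaining volume is treated as an inferred trade at that order's price.
def inferred_trades(before, after)
  trades = []
  before.each do |id, order|
    remaining = after.key?(id) ? after[id][:volume] : 0
    sold = order[:volume] - remaining
    trades << { order_id: id, price: order[:price], volume: sold } if sold > 0
  end
  trades
end

before = {
  1 => { price: 4.05, volume: 1_000_000 },  # partially bought out
  2 => { price: 4.10, volume: 500_000 },    # vanished entirely
  3 => { price: 4.20, volume: 750_000 },    # untouched
}
after = {
  1 => { price: 4.05, volume: 400_000 },
  3 => { price: 4.20, volume: 750_000 },
}

inferred_trades(before, after)
# orders 1 and 2 show inferred sales of 600,000 and 500,000 units
```

The obvious failure mode, and why it’s “mostly accurate”, is that a cancelled order is indistinguishable from a filled one in this view; aggregated over enough orders the noise mostly washes out.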

Stuff we’re looking to implement, but aren’t sure will make the first version, includes Science and Industry integration, location-based filtering, and so on. We’re also looking at a way to reward uploaders; after all, the site is only as accurate as the data that gets regularly uploaded, and while uploading with EMU isn’t much of a chore, we’d like to reward the uploaders who cover whole swathes of the market, or the bits that aren’t so regularly covered.

The team working on EM2 is at it more or less every day, and with Makurid’s determination to make stuff fast, we’re getting scarily close to having a really, really nice framework on which to build some exceedingly powerful tools. We’re now at the phase of finalising the feature list and implementing it. Hopefully we’ll be looking at a release in a few weeks’ time, assuming nothing goes horribly wrong; watch this space!

Edit: This post was originally titled ‘Too long, new ideas’. Kinda drifted off from my original post and renamed it. Ho hum. New ideas can get talked about later.