Architecture for the future

After that EVE-centric post on scalability (thanks to HighScalability.com for linking in; hope it was an interesting read), I figured it was time to return to EVE Metrics and the other sites, accVIEW and ISKsense.

In the next week we will be migrating to a new server. It’s in the same datacenter with the same host and is a slightly faster machine, but it has four times as much RAM (8GB) and an additional 10kRPM hard drive. As part of the migration we’ll be making some changes to the software architecture running the show.

The main difference is that we’re moving away from Passenger, also known as mod_rails. It has some advantages in low-memory conditions, but it’s been more trouble than it’s worth, so we’ll be moving back to running application servers manually as daemons, using the excellent Thin application server. For the sites running PHP on the server (this blog, for example), we’ll keep using PHP FPM as we do currently; we’ve had no issues with that. Both of those will sit behind nginx acting as a reverse proxy. Nginx has done very well as a web server: it’s very fast and easy to configure.
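For the curious, the nginx side of that really isn’t much config. Here’s a minimal sketch of the idea (the ports, paths and server name are made up for illustration, not our actual setup): nginx serves static files straight off disk and hands everything else to a small pool of Thin daemons.

```nginx
# Hypothetical sketch, not our real config: a pool of Thin daemons
# behind nginx, with nginx serving static files itself.
upstream eve_metrics {
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;
    server 127.0.0.1:3002;
}

server {
    listen 80;
    server_name eve-metrics.example;
    root /var/www/eve-metrics/public;

    location / {
        # Serve files straight off disk if they exist, otherwise
        # hand the request to the Thin cluster.
        try_files $uri @thin;
    }

    location @thin {
        proxy_set_header Host $http_host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_pass http://eve_metrics;
    }
}
```

The PHP sites work the same way, except nginx talks to PHP FPM over FastCGI rather than proxying HTTP.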

There is only one other major change: we’ll be sitting nginx itself behind Varnish, a high-performance HTTP cache. This will let us leverage HTTP caching in our applications far more efficiently and speed up requests dramatically. Right now we don’t use HTTP caching all that much; we’d like to change that, particularly in the EVE Metrics API, so that Varnish can handle a good portion of the thousands of API calls asking for the price of trit or what have you. All in all it’ll mean reduced load on the application cluster, which means we can keep that smaller and lighter, which in turn means more room for the database in memory.
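To give an idea of what that means on the Rails side, here’s a rough sketch; the controller and names are invented for illustration rather than taken from the actual EVE Metrics code. The key point is that a public Cache-Control header lets Varnish answer repeated requests for the same item itself, without touching Thin at all until the cached copy expires.

```ruby
# Hypothetical API controller sketch, not the real EVE Metrics code.
class PricesController < ApplicationController
  def show
    @item = Item.find(params[:id])

    # A public Cache-Control header tells Varnish (and anything else
    # along the way) it may keep and share this response for 5 minutes.
    expires_in 5.minutes, :public => true

    respond_to do |format|
      format.xml { render :xml => @item }
    end
  end
end
```

However many clients are asking for the price of Tritanium at any given moment, the application only has to be asked once per cache lifetime.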

That translates to better performance on the more complex parts of the site, i.e. market pages, your account page and corporate pages, and that better performance means we can build more. We’re waiting for the new capacity before we add asset support, one of the things we’re really looking forward to adding, since it will unlock a whole new level of functionality by feeding much more information to processes like our inferred trade detector and our planned fulfilled-orders listings. Plus we’ll be adding asset valuation tools, of course.

The architecture I’ve described above will basically be ‘it’ for now. There is more complexity at the application and database layers: we still run a tiny MySQL server for a few legacy applications, and the application layer includes background processing tools plus, for EVE Metrics, tasks that are executed on a VPS with the results uploaded back to the server (we now do all the major CSV dumps on Makurid’s VPS).

As the guy who ends up fixing all this when it goes wrong, I treat simplicity as my main priority, but the added complexity of Thin and Varnish should be well worth it in the long run.

Moondoggie & Market Browsing

OK. EVE Metrics is my big market browsing project. It’s very complex and it handles a lot of data, but it all basically comes down to this: people browse the market with a small program running on their computer; when any market data is viewed, EVE Online writes it to a cache file, and the program decodes that and fires it at the server. We collect all these reports and build a single picture of the market in EVE.

There’s the top-down view for you. We’ve never really been short of data: we get good market coverage in most regions and we’re fairly up to date in the grand scheme of things. But compare the actual market of EVE to EVE Metrics and we’re still a long way off having a truly accurate picture. EVE moves quickly; in some markets, orders shuffle around, change price and get bought out from minute to minute.

With Dominion we got a new ingame browser. That means you can now use the full EVE Metrics website ingame, and (through some Javascript client hook additions) it also lets us provide a fantastic new tool to help us get an even better picture of the market in EVE.

If you fire up the IGB and head over to the upload suggestions page, you’ll be given a list of 10 items, plus a few options for automatic checking. Choosing one of those will ask EVE Metrics for a list of items to check and then automatically go and look at each of them in the market. It’s slow, but it works: in the space of a few hours, a single user can get data for an entire region across all the items on the market. This is utterly fantastic and we’re really looking forward to the larger volume of data it’s bringing to the site.

So, if you’ve got a spare moment, or you need to go AFK for an hour, or you want to help out while you’re mining, or you’re just tired of clicking the next item in the list, install the uploader and visit the page ingame to get started. Every upload counts and helps us build the biggest, best picture of EVE’s market we can manage. Uploads to EVE Metrics are also syndicated to other websites and tools, of course. Your uploads and your time help the hundreds of users of the site, and the tens of thousands more who rely on our pricing, history and order APIs for their applications.

Oh, and if you’re a developer, we now have a server status API with all the information you could possibly want on TQ, Sisi and the API servers. It can be found here (docs here). Enjoy!
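If you want to poke at it from code, something along these lines will do; note that the URL below is just a placeholder for illustration, so grab the real endpoint and response format from the docs.

```ruby
# Quick sketch of polling the server status API from Ruby.
# The endpoint URL is a placeholder -- check the docs for the real one.
require 'net/http'
require 'uri'

uri = URI.parse('http://www.eve-metrics.com/api/server-status')
response = Net::HTTP.get_response(uri)
puts response.body if response.is_a?(Net::HTTPSuccess)
```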