Sunday, June 22, 2014

Crius stuff - Getting Ahead in Code Development

Crius isn't that far off, and I'm starting to get a little anxious about getting started on new industry tool code.  Though CCP_Foxfour has been extremely outspoken on Twitter and Reddit about the API rollouts, it's been harder to get the underlying base math needed to make calculations.

Beta Data Dump

Thanks to FuzzySteve and CCP_Nullarbor for dropping this pro-tip in my stream.  You can take a look at the current BPO data inside the Singularity client files:
<sisi client path>\bin\staticdata\blueprint.db
There you can find a SQLite db of the individual BPO data, in roughly the same format as will be delivered in the SDE when Crius launches.  I had a little trouble exporting the whole thing because I'm a noob and SQLite TEXT/BLOB rules are weird, so I ended up whipping together a little Python script to dump it as csv.  The final version will be published as YAML, but the blueprint data should still be a JSON-string type for extraction.
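For what it's worth, a dump script like that needs nothing outside the standard library.  Here's a minimal sketch along the lines of what I used — it discovers table names from sqlite_master rather than assuming anything about the db's schema:

```python
import csv
import sqlite3

def dump_sqlite_to_csv(db_path, out_dir="."):
    """Dump every table in a SQLite database to <out_dir>/<table>.csv."""
    conn = sqlite3.connect(db_path)
    # TEXT columns (including JSON-string blobs) come back as str;
    # true BLOB columns would need extra decoding.
    tables = [row[0] for row in
              conn.execute("SELECT name FROM sqlite_master WHERE type='table'")]
    for table in tables:
        cursor = conn.execute('SELECT * FROM "%s"' % table)
        with open("%s/%s.csv" % (out_dir, table), "w", newline="") as csvfile:
            writer = csv.writer(csvfile)
            # first row: column names pulled from the cursor description
            writer.writerow([col[0] for col in cursor.description])
            writer.writerows(cursor)
    conn.close()
    return tables
```

Point it at the blueprint.db path above and you get one csv per table, JSON strings intact.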

Just to reiterate, we probably won't have a real set of tables for attributes like before.  Instead, each blueprint will be an individual object with all the attributes inside it.  There's nothing stopping you from expanding/transforming this data back into tables, but don't expect CCP or Fuzzwork to do it for you (EDIT: maybe after release, but it will be a custom project).
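As a sketch of that expand-back-into-tables transformation: assuming the per-blueprint layout I'm seeing in the beta dump (an activities map, where each activity may carry a materials list of typeID/quantity pairs — double-check the field names against the real file), you can flatten everything into ramTypeRequirements-style rows:

```python
def blueprints_to_material_rows(blueprints):
    """Flatten {blueprintTypeID: blueprint-object} data back into
    (blueprintTypeID, activity, materialTypeID, quantity) rows."""
    rows = []
    for bp_id, bp in blueprints.items():
        # each blueprint object carries its activities inline
        for activity, details in bp.get("activities", {}).items():
            # not every activity has materials (e.g. copying may not)
            for mat in details.get("materials", []):
                rows.append((bp_id, activity, mat["typeID"], mat["quantity"]))
    return rows
```

From there it's one executemany() away from living in your own relational table.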

New APIs

The API is dead, long live the API

Until SSO rolls out, we're still going to have the classic-API for private data, and CREST for public/global data.  I will need more time to whip up real guides, and I will probably just update dev wikis rather than post wall-o-text here on the blog.

Classic-API Feeds

  • [char|corp]/IndustryJobs: updated format, will break classic tools
  • [char|corp]/IndustryJobsHistory: updated format, will break classic tools
  • corp/Facilities: will list corp-controlled facilities like POS/Outpost

CREST Feeds

  • /industry/
    • more info soon(tm)

Personally, I continue to use Entity's eveapi Python module for classic-API queries, mostly because Python + XML kinda sucks.  Thankfully, CREST is JSON and much easier to handle.  I still haven't seen an all-in-one CREST module, so you're going to be stuck writing handlers for each new API feed.  This shouldn't be too much trouble, since the only things that need to change between requests are the address and perhaps the HTTP request headers.
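Because only the address and headers vary, one generic stdlib handler can cover every feed.  A minimal sketch (the opener parameter is just there so you can swap in a fake transport for testing — the default hits the network):

```python
import json
import urllib.request

def crest_get(url, headers=None, opener=urllib.request.urlopen):
    """Generic CREST GET handler: fetch a URL and decode the JSON body.

    Only the URL and (maybe) a header change between feeds, so this one
    function stands in for a per-feed handler.
    """
    request = urllib.request.Request(url, headers=headers or {})
    with opener(request) as response:
        return json.loads(response.read().decode("utf-8"))
```

Call it with whatever Accept header a given feed wants; everything else stays the same from feed to feed.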