Sunday, June 22, 2014

Crius stuff - Getting Ahead in Code Development

Crius isn't that far off, and I'm starting to get a little anxious about getting started on new industry tool code.  Though CCP_Foxfour has been extremely outspoken on Twitter and Reddit about API rollouts, it's been harder to get the other base math information needed to make calculations.

Beta Data Dump

Thanks to FuzzySteve and CCP_Nullarbor for dropping this pro-tip in my stream.  You can take a look at the current BPO data inside the Singularity client files:
<sisi client path>\bin\staticdata\blueprint.db
There you'll find a SQLite database of the individual BPO data, in roughly the same format that will be delivered in the SDE when Crius launches.  I had a little trouble exporting the whole thing because I'm a noob and SQLite TEXT/BLOB rules are weird.  I ended up whipping together a little Python script to dump it as CSV.  The final version will be published as YAML, but should still be the JSON-string type for extraction.
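A minimal sketch of such a dump script (not my exact code): it discovers the table names from sqlite_master instead of assuming a schema, and the lenient text_factory sidesteps the TEXT/BLOB decode weirdness.

```python
import csv
import sqlite3

# Point this at a copy of the Singularity client's blueprint db
DB_PATH = "blueprint.db"

conn = sqlite3.connect(DB_PATH)
# Some cells don't decode cleanly as UTF-8; replace bad bytes
# instead of letting sqlite3 raise on fetch
conn.text_factory = lambda b: b.decode("utf-8", "replace")
cur = conn.cursor()

# Discover the tables rather than guessing the schema
tables = [row[0] for row in cur.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]

for table in tables:
    cur.execute('SELECT * FROM "%s"' % table)
    headers = [col[0] for col in cur.description]
    with open(table + ".csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(headers)   # column names first
        writer.writerows(cur)      # then every row

conn.close()
```

Run it next to the db copy and you get one CSV per table.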

Just to reiterate, we probably won't have a real set of tables for attributes like before.  Instead, each blueprint will be an individual object with all the attributes inside it.  There's nothing stopping you from expanding/transforming this data back into tables, but don't expect CCP or Fuzzwork to do it for you (EDIT: maybe after release, but will be custom project).
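If you do want tables, the expansion is straightforward. The key names below (blueprintTypeID, activities, materials) are my guesses from the beta dump and may change before release:

```python
import json

# A hypothetical blueprint object; the real Crius key names may differ
raw = json.loads("""
{
  "blueprintTypeID": 691,
  "activities": {
    "manufacturing": {
      "materials": [
        {"typeID": 34, "quantity": 32000},
        {"typeID": 35, "quantity": 6000}
      ],
      "time": 12000
    }
  }
}
""")

def to_material_rows(bp):
    """Expand one blueprint object into
    (blueprint, activity, material, quantity) rows."""
    rows = []
    for activity, details in bp.get("activities", {}).items():
        for mat in details.get("materials", []):
            rows.append((bp["blueprintTypeID"], activity,
                         mat["typeID"], mat["quantity"]))
    return rows

for row in to_material_rows(raw):
    print(row)
```

Feed the rows into whatever table store you already use.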

New APIs

The API is dead, long live the API

Until SSO rolls out, we're still going to have the classic-API for private data, and CREST for public/global data.  I will need more time to whip up real guides, and I will probably just update dev wikis rather than post wall-o-text here on the blog.

Classic-API Feeds

  • [char|corp]/IndustryJobs: updated format, will break classic tools
  • [char|corp]/IndustryJobsHistory: updated format, will break classic tools
  • corp/Facilities: will list corp-controlled facilities like POS/Outpost


CREST Feeds

  • /industry/
    • more info soon(tm)
Personally, I continue to use Entity's eveapi Python module for classic API queries, mostly because Python + XML kinda sucks.  Thankfully CREST is JSON and much easier to handle.  I still haven't seen an all-in-one CREST module, so you're gonna be stuck writing handlers for each new API feed.  This shouldn't be too much trouble, since the only things that need to change between requests are the address and perhaps the HTTP request headers.
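A minimal handler in that spirit, standard library only; the /industry/ path in the usage comment is a placeholder until the real feeds are documented.

```python
import json
import urllib.request

# Public CREST root; no auth needed for public/global data
CREST_BASE = "https://public-crest.eveonline.com"

def crest_get(path, headers=None):
    """Fetch one CREST resource and return the decoded JSON document.

    Only the address and (optionally) the headers change between
    feeds, so one small handler covers every public endpoint.
    """
    request = urllib.request.Request(CREST_BASE + path)
    for name, value in (headers or {}).items():
        request.add_header(name, value)
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))

# Usage (endpoint path is a guess until the feeds go live):
# facilities = crest_get("/industry/facilities/")
```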


Steve Ronuken said...

I may decompose it into tables.

There are a few edge cases where it's useful. That may not be done for the release of Crius though.

Lukas Rox said...

A very nice post, but it would be immensely helpful to get some sample outputs from the new industry endpoints. I no longer have access to Sisi and can't download them myself...

Unknown said...

It's on the to-do list. But it will be better to just update the existing references than to expect people to come here for reference.

I'll poke you when I get a chance to tool with it.

Filz said...

Are you able to point me in the direction of any resources that show how to get that data back into tables? I have an industry spreadsheet that grabs data from a SQLExpress database and imports it into pivot tables - I'd like to be able to update that spreadsheet with the new data, but I don't know enough about serialized data to know how to do it. Or is there a better way of doing things now?

Unknown said...

I can't think of an automatic method. In the short term, you'll need to parse the JSON to get a three-column table: blueprint, material, quantity. From there you should be back to normal.

I will try to whip up CSV or SQL with the current scrapes. Expect full versions on Fuzzwork within a couple weeks of release.
