Monday, July 28, 2014

Weekend Update

Crunched on some work this weekend.  I'm getting dangerously close to complete on the spreadsheet front, and pushed some updates out.  Just posting this as a quick update for those who may have missed them.

Crius Industry Tools for Google Drive - Update

I pushed some updates to the script this weekend.  These should address some bugs with the getPOS() routine, and pick up some features that @CCP_Nullarbor added to the Teams CREST feeds.
  • Fixed issue with getPOS(): It was trying to fetch location names with the wrong function call.  This has been resolved, and all valid TQ corp keys should return good data now
  • Fixed issue with getIndustryJobs(): It was not fetching personal jobs correctly.  Key validation is now correct, and it can fetch corporate or personal jobs automatically off a single key/vcode
  • Added teamName/activity to the Teams and TeamsAuction feeds: These were "TODO" placeholders.  They now report in-game names and valid activity information
Go fetch a fresh copy from either source, and keep reporting bugs.
I still need to clean up some errors and hanging bits.  I'd like to push this tool to proper add-on status, but I am a little unclear on what stands between what I have now and "release ready".  Also, it's probably time to bite the bullet and buy some icon/banner designs.

More Crius Utilities for Devs

This one won't be as useful for the veteran 3rd-party devs, but should be invaluable to the amateurs.  I put together some SQL queries and pushed the resulting CSVs to the GitHub repo.  If you're trying to update your spreadsheets with new data, these files should save you a boatload of time.  If you're not sure what to do with the CSV files, I suggest you read up on pivot tables.

Let's Build a Spreadsheet Series

I did a pretty long set of streams on Sunday.  The goal was to show off some spreadsheet-fu and help people learn to build better spreadsheets with the tools at hand.  I will have the raw streams up on YouTube and Twitch tonight, but I want to boil the lesson plans down into 5-15 minute function lessons.  I still need to write up those lesson plans and get my head around video editing, but my goal for August is to put together a series that will help people build their own calculators.

Sunday, July 20, 2014

Building Better Spreadsheets - Crius Toolset

Crius releases on Tuesday, and most industrialists are scrambling to replace their spreadsheets (I know I am).  What's worse, the job-cost equation requires live data from the game, which can be difficult for someone relying on spreadsheet feeds.  Fear not!  I have great news!

Crius Feeds In Google Spreadsheets

These functions read the CREST/API feeds of common calls and import them directly into your spreadsheet.  Furthermore, triggers are set up to refresh the data periodically, so your spreadsheet will always be up to date.  Though this is not an exhaustive API tool, and it could still use some more features, it should be a huge leg up for any spreadsheet jockey.

Most of the feeds are designed to dump the entire feed, and don't offer much filtering in the call.  Instead, they are meant to populate a reference sheet that can then be leveraged with VLOOKUP() or QUERY().  This might lead to some complexity issues down the line, so I intend to eventually add finer-grained calls that return single-line results.
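For example, the pattern looks something like this (the sheet name and column positions here are hypothetical, so check the actual feed headers):
  • In cell A1 of a sheet named Prices: =AllItemPrices(TRUE)
  • In your working sheet: =VLOOKUP(34, Prices!A:C, 3, FALSE) to pull a single value for typeID 34 (Tritanium)
  • Or: =QUERY(Prices!A:C, "select * where A = 34") for the same lookup with more filtering power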

Getting Started

Method 1: Clone the Master Spreadsheet

This will give you the tools and triggers, but it will not stay up to date with the master sheet.  Until I can wrap up the code as a stand-alone Drive app, this will be the most foolproof way to get a copy:
  1. Open the spreadsheet
  2. Go to File -> Make a Copy
  3. Set the name of your copy (do not share with collaborators)
  4. Remove any extra sheets/calls you don't need, and start developing your spreadsheet
This method is the easiest to start with, but it will not keep current with updates.

Method 2: Copy-Paste from Github

The codebase is free and open source, and is designed to be copy-pasted into the gdoc script interface.  This method is a little more tedious, but it makes it easy to copy in updates as they come out.
  1. Get plain-text code from the GitHub repo
  2. In your spreadsheet, go to Tools -> Script Editor...
  3. This opens a new window.  Select "Blank Project" from the initialization prompt
  4. Paste the raw code into the Code.gs space
  5. Set the name of the project
  6. Save the changes
  7. Configure the app triggers.  Set get/All functions to 1hr timers

This will give you all the utilities in a fresh or existing codebase.  Configuring the triggers appropriately will also keep the data up-to-date automatically.  The triggers are technically optional, but without them the sheet will only pull fresh data when it is opened.

Also, as updates come out, you'll be able to drop in the new code.  I expect to keep this project backwards compatible, so each drop-in should ADD features.  Of course, if you go editing the code yourself, you will need to be more careful about dropping in changes.

Function List

  • getPOS(keyID, vCode, header_bool, verbose_bool, test_server_bool)
  • getFacilities(keyID, vCode, header_bool, verbose_bool, test_server_bool)
  • getIndustryJobs(keyID, vCode, header_bool, verbose_bool, test_server_bool)
  • getAvgVolume(days, item_id, region_id)
  • getVolumes(days, item_id, region_id)
  • AllItemPrices(header_bool, test_server_bool)
  • AllSystemIndexes(header_bool, test_server_bool)
  • AllTeams(header_bool, verbose_bool, test_server_bool)
  • AllAuctions(header_bool, verbose_bool, test_server_bool)
The functions are designed to be referenced as simply as possible.  CREST feeds like AllItemPrices and AllSystemIndexes can be called without arguments if desired.  The classic API feeds are designed to return as much information as they can, with internal switches that try to use the /corp/Locations feeds where possible.  Most feeds also take a "verbose_bool" flag to add/remove ugly or useless raw-ID data.  Lastly, the test_server_bool has been left in the release; for TQ this value can be either blank or false.
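A couple of hedged examples of what the calls look like in a cell (the keyID/vCode values are fake placeholders):
  • =AllSystemIndexes() -- full CREST dump, no arguments needed
  • =getIndustryJobs(1234567, "aBcDeFg", TRUE) -- header row on; corp vs personal is resolved off the key automatically
  • =getPOS(1234567, "aBcDeFg", TRUE, FALSE) -- verbose_bool off to hide the raw-ID columns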

Function guide below the cut

Wednesday, July 16, 2014

Can I Play With MA(C)Dness?

Blame this one entirely on @K162space.  He got me playing with Quantmod in R.  Also, shout out to CCP Quant for showing off the original bits, and CCP Foxfour for the CREST market history.

After all my EVE data experiments, I've had a very hard time finding predictive correlations between datasets.  Even bringing in destruction data showed no predictive properties, instead only corroborating market volumes.  Also, I've tried my best to get into Varakoh's analysis, but I have never been able to internalize his methods enough to roll them into my own work (also, getting reliable candlesticks for EVE data is a pain).

Then Blake over at K162space.com sent me a tidbit about Quantmod, which opened a whole new realm of tools I never knew about (full listing here).  The most interesting to me has been Moving Average Convergence Divergence (MACD).  I wouldn't be so bold as to say "use this to make money", but the results are extremely interesting when run against IRL stock data.

INTC - YTD
For those who have never seen a MACD plot, the theory is relatively simple: different line crosses are signals for buy/sell operations (specifics here and here).  Though the signals are imperfect, and will by no means catch every peak and valley, they can be an excellent sanity check on the longer trend.  For many IRL stocks the correlation can be very strong, which makes MACD a popular tool among amateurs.  It is less useful for minute-to-minute daytrading or arbitrage, but it can be a powerful data source for the long-term investor.

Can this be useful for an EVE trader?  Well....

Let's look at a few plots:
PLEX - YTD - click to embiggen
Tritanium - YTD - click to embiggen
Phenolic Composites - YTD
I had to do a little fudging to get open/close bars to work correctly (though setting high = close = avg is roughly the same).  The trend information is interesting, but the crossover signals aren't lining up well.  Using the traditional 12,26,9 configuration, many of the crossover signals arrive 2-3 days late.  If you used these charts as-is, I think you'd at best break even.  There are some secondary plays, like buying and selling the swings, but these charts aren't going to be immediately useful.

I then started playing with shorter windows, and pairing a fast and a slow chart might be a better way forward.  Unfortunately, I'm blindly fiddling knobs right now, but I'm actively hunting down documentation to better fine-tune the charts.  I'm thinking a 10-15 day window might be more accurate for the "fast" chart, and a 25-30 day window would serve well as the "slow" one.
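For anyone who wants to fiddle the same knobs outside of R, the underlying math is simple enough to sketch in a few lines of Python/pandas (quantmod is what I'm actually using; the CSV name and column here are hypothetical stand-ins for CREST market history):

import pandas as pd

def macd(prices, fast=12, slow=26, signal=9):
    # Classic MACD: fast EMA minus slow EMA, plus a signal line.
    # prices: a pandas Series of daily prices (daily averages in our case,
    # since EVE market history has no true close).
    ema_fast = prices.ewm(span=fast, adjust=False).mean()
    ema_slow = prices.ewm(span=slow, adjust=False).mean()
    macd_line = ema_fast - ema_slow
    signal_line = macd_line.ewm(span=signal, adjust=False).mean()
    return pd.DataFrame({
        'macd': macd_line,
        'signal': signal_line,
        'hist': macd_line - signal_line,  # crossovers show up as sign flips here
    })

# hypothetical usage: history.csv holds date,avgPrice columns
history = pd.read_csv('history.csv', index_col='date', parse_dates=True)
standard = macd(history['avgPrice'])                  # traditional 12,26,9
faster = macd(history['avgPrice'], fast=5, slow=15)   # shorter-window experiment

Pairing the standard frame against a shorter-window one is exactly the fast/slow experiment described above.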

Also, I think the weakness has a bit to do with the quality of the data.  Where MACD wants close prices, we're feeding it daily averages.  This might be a good excuse to finally work on a better snapshot tool using eve-central data.  Though I had a lot of trouble processing down the raw archive, a stupid-script that pulls periodically from the traditional API would be a quick solution that could start crunching in the background.  Once I get my industry tools refactored, I expect to get back into the data mining game pretty seriously.
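The "stupid-script" really is only a dozen lines.  Something like this sketch is what I have in mind -- the eve-central endpoint and parameters are written from memory, so verify them before trusting it:

import time
import requests

URL = 'http://api.eve-central.com/api/marketstat'   # endpoint from memory
PARAMS = {'typeid': 34, 'regionlimit': 10000002}    # Tritanium, The Forge

while True:
    resp = requests.get(URL, params=PARAMS, timeout=30)
    stamp = time.strftime('%Y%m%d-%H%M%S')
    with open('marketstat-%s.xml' % stamp, 'w') as fh:
        fh.write(resp.text)   # hoard the raw XML now, crunch it into candles later
    time.sleep(3600)          # one snapshot per hour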

In the meantime, be sure to subscribe to my channel on Twitch, and follow on Twitter.  I will be doing some random streams over the next week as I get back into a reasonable routine again.

Sunday, June 22, 2014

Crius stuff - Getting Ahead in Code Development

http://community.eveonline.com/news/dev-blogs/upcoming-api-changes-for-industry/

Crius isn't that far off, and I'm starting to get a little anxious about getting started on new industry tool code.  Though CCP_Foxfour has been extremely outspoken on Twitter and Reddit about the API rollout, it's been harder to get the base math information needed to make calculations.

Beta Data Dump

Thanks to FuzzySteve and CCP_Nullarbor for dropping this pro-tip in my stream.  You can take a look at the current BPO data inside the Singularity client files:
<sisi client path>\bin\staticdata\blueprint.db
There you can find a SQLite db of the individual BPO data, in roughly the same format as will be delivered in the SDE when Crius launches.  I had a little trouble exporting the whole thing because I'm a noob and SQLite TEXT/BLOB rules are weird, so I ended up whipping together a little Python script to dump it as CSV.  The final version will be published as YAML, but the values should still be the same JSON-string style for extraction.
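For reference, the dump boils down to something like this (a minimal sketch; the table and column names are guesses, hence the sqlite_master peek first):

import csv
import json
import sqlite3

conn = sqlite3.connect('blueprint.db')
conn.text_factory = bytes   # sidestep the weird TEXT/BLOB decoding rules

# list the real table names first -- the ones below are hypothetical
for (name,) in conn.execute("SELECT name FROM sqlite_master WHERE type='table'"):
    print(name.decode())

with open('blueprints.csv', 'w') as fh:
    writer = csv.writer(fh)
    writer.writerow(['blueprintTypeID', 'activity', 'activity_json'])
    for type_id, blob in conn.execute('SELECT blueprintTypeID, data FROM blueprints'):
        bp = json.loads(blob.decode('utf-8'))   # each row is one JSON blueprint object
        for activity, details in bp.get('activities', {}).items():
            writer.writerow([type_id, activity, json.dumps(details)])

The inner loop is also basically how you'd flatten the JSON objects back into tables, per the note below.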

Just to reiterate, we probably won't get a real set of attribute tables like before.  Instead, each blueprint will be an individual object with all of its attributes inside it.  There's nothing stopping you from expanding/transforming this data back into tables, but don't expect CCP or Fuzzwork to do it for you (EDIT: maybe after release, but it would be a custom project).

New APIs

The API is dead, long live the API

Until SSO rolls out, we're still going to have the classic-API for private data, and CREST for public/global data.  I will need more time to whip up real guides, and I will probably just update dev wikis rather than post wall-o-text here on the blog.

Classic-API Feeds

  • [char|corp]/IndustryJobs: updated format, will break classic tools
  • [char|corp]/IndustryJobsHistory: updated format, will break classic tools
  • corp/Facilities: will list corp-controlled facilities like POS/Outpost

CREST Feeds

  • /industry/
    • more info soon(tm)
Personally, I continue to use Entity's eveapi Python module for classic API queries, mostly because Python + XML kinda sucks.  Thankfully CREST is JSON and much easier to handle.  I still haven't seen an all-in-one CREST module, so you're going to be stuck writing handlers for each new API feed.  This shouldn't be too much trouble, since the only things that change between requests are the address and perhaps the HTTP request headers.
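To put that in perspective, a hand-rolled handler can be this small.  A minimal sketch using the requests library (the endpoint path is from memory, so verify it against the CREST root):

import requests

def crest_get(url, accept=None):
    # Generic CREST fetch: only the address (and maybe an Accept header
    # for versioning) changes from feed to feed.
    headers = {'User-Agent': 'my-industry-tool/0.1 (your contact here)'}
    if accept:
        headers['Accept'] = accept
    resp = requests.get(url, headers=headers)
    resp.raise_for_status()   # fail loudly on 4xx/5xx instead of caching garbage
    return resp.json()

# illustrative usage against the public CREST price feed
prices = crest_get('http://public-crest.eveonline.com/market/prices/')
for entry in prices['items']:
    print(entry['type']['name'], entry.get('adjustedPrice'))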

Monday, June 16, 2014

Crius Industry - Features You May Have Missed

On Saturday, I had my stream, and crashed +Ali Aras's stream afterwards along with Mynnna, Salpun, and a few others I didn't immediately recognize.  Also, +Chella Ranier dropped into my Friday stream (and I will have her segment isolated into its own video soon).  It's been exceptionally helpful to get all the various perspectives on the industry changes, and we're still only scratching the surface.

If you're relying purely on the dev blogs for info about the changes, you're going to have a bad time.  Mynnna showed me how grossly out of sync I was when I missed important information about CCP Greyscale's BP migration on the forums.  Also, I almost missed some of the POS stuff in the most recent devblog.  Lastly, it's worth messing with the new UI on every device you expect to play EVE on; it's a pretty big window, and I anticipate issues with shorter form factors such as laptops.

Note, none of the following are set in stone yet.  Provide feedback if you don't like the changes!  It's also worth noting the entire CSM is incredibly active on this Crius front.  Feel free to reach out to your personal favorite(s) and include them in your feedback!

Invention Math Changes


There are three big changes for Invention that might have gone unmentioned elsewhere.  
  • Invention will only consume 1 run off a copy per attempt
  • Invention times are far more dynamic to match up with manufacturing times
    • Copy + invent = 0.5x T2 build time (roughly research = build for T2; e.g., a module with a 4hr T2 build should take about 2hrs of combined copy + invention time)
  • Invention results will all be +ME/+TE.  No more negative results
    • Base materials will be increased to keep prices roughly flat
These changes are pretty big for T2 producers.  Reduced copy consumption clears up two problems for most inventors: BPC clutter, and errors with T2 outcomes.  Though the copy-time to build-time ratio is being reduced, it works out to a big increase in copy times for modules; that ends up balanced by the one-run-per-invention feature.  You will still need multiple copies for parallelism, but each batch should last quite a while.

Personally, I find the invention time balance far more interesting.  The goal is noble, making invention time a far more meaningful choice, but I think this change misses the mark in its current state on Singularity.  For periodic gaming, I imagine a few different time groups:
  • <=1 hr: multiple per session, up to 6x/day
  • < 3 hrs: once per session, up to 3x/day
  • < 10 hrs: twice per day
  • < 22hrs: once per day
  • n-days - 2hrs: once per n days
Lastly, the ME/TE changes should make the T2 BPO tinfoilers jump for joy.  Until the material quantities balance out (they currently use a modified refining equation, causing weird material costs) and the SDE comes out so we can run the bulk numbers, this will be a very hard thing to judge.  BPOs will still have a slight advantage thanks to the requested-runs cost scaling, but I think you will be able to get better ME/TE than most BPOs in the end.  I'm particularly interested in how decryptors will drive the margins, since we probably won't see big per-step savings like the 6-8% we saw between -ME steps.

What I Don't Like

Personally, I only care whether something is >2hrs, because I won't be able to do more than one run per day at that rate (without the mobile app).  I also care that jobs hit at 22hrs rather than 24, because it can be hard to keep up with a strictly 24 or 25 hour cycle time.  The whole spectrum is interesting if you're going to optimize by sharing BPs across timezones, but as long as jobs are a per-character mechanic, the times can be blocked into the above categories IMO.

Also, I'd rather the invention time balance be 0.4x or 0.35x, to account for specializing in the science skills; I'd like a tangible benefit for taking some of the obscure science skills to 5.  Lastly, another "between the lines" message here is wider decryptor use, because of the reduced research-time per run.

Currently the POS math is broken, so I am not sure how this will wash out in the end.  I still need to play with some of the corner cases to check the balance.  Lastly, the decryptor math is worth checking with a fine-toothed comb: on TQ today, decryptors pay for themselves off the ME savings, and the difference between positive steps is much smaller than the difference between negative steps.  I have a hard time imagining that better research/build ratios or lower datacore costs will drive consumption.

POS Changes


There are a bunch of little things that changed between pre-Kronos and now.  
  • Additional labs/arrays will reduce job install costs
  • "Invention" and "Research" labs have been split:
    • Design Lab: able to copy/invent
    • Research Lab: able to ME/TE research
    • Hyasyoda Lab = meta Research Lab
  • Lowsec love
    • Intensive refining array = mid-grade null outpost refinery
    • Lowsec-only Thukker component array
      • ME better than outpost
      • Built from ghost site materials?
  • EDIT: moon mining/reactions will be available in 0.4 sec systems
As I covered previously, the people crying the biggest tears were solo capital producers.  With the new POS array changes, the playing field becomes much more even between LS/NS.  There's still the risk vs. reward metric, and a supply chain bottleneck around compressed ore, but the ability to participate in the capital game is still there (even if it's not entirely solo any more).

Also, I really like the flat differentiation between "invention" and "research" labs.  This makes POS design far more straightforward for the amateur.  It's still hard to judge the job-cost benefits in general, because the math and UI elements aren't solidified yet.  I'd also like to see the fitting requirements for the two labs be equivalent (or swapped), and a meta Design Lab to match the Hyasyoda offering.  With the BP@POS rules, I'd like the choice to use blingy equipment for both Design and Research labs.

Other Notes

The features are designed to let an incredibly wide swath of EVE participate in industry.  Though I am not wholly convinced it's great for high-powered, cooperative industry, it's a big step up from TQ historically.  Also, a lot of the bigger-scale horsepower will rely on +Regner Blok-Andersen getting the CREST/API/SDE stuff out so developers can start moving the every-day calculations outside the client.

Lastly, I have this grandiose idea that job fees will be the bar that separates the men from the boys, though right now costs still account for <5% of the business.  I can see where taxes + fees start to eat your lunch, but it's still a death-by-inches mechanic, and that makes it hard to justify a lot of effort to minimize it.  Also, with the job-cost UI lacking a lot of the promised breakdown, it's hard to feel like it represents a meaningful choice yet.  If the costs are going to be this complex, I need a fighting chance to make good decisions.