Thursday, October 16, 2014

EVE Prosper Market Review - EP000

Not sure yet how I'll be parsing out these show notes to automate posting them onto the blog here, but see my quick notes below for the CliffsNotes version.

This first show went much better than I expected.  I had worried that I'd be hard-pressed to hit my 1-hr mark, instead running wildly over and droning on and on.  Thankfully, I came in at ~50mins, which leaves me plenty of space to boil down some analysis and add more topics down the line.

Also, I'd like to keep the project as open source as possible.  Unfortunately, some of the setup I've chosen so far is not really user-friendly (yet).  You can grab my scripts and do your own analysis (or cut me out entirely), but they come as-is and will require a little hacking to get going.

EP000 - Show Stuff:

Don't forget to subscribe!


Jump ranges are being reduced, and jump fatigue is being introduced.  Though the initial proposal was pretty bold, especially with the exponential nature of the fatigue numbers, a lot of the riskier features were scaled back into a far more reasonable, but still tough, change.

Personally, my corp and I have had our eyes on the Jump Freighter changes, since combat logistics is a particular specialty of ours.  Though the original 5ly plan was going to be somewhat painful, putting our HQ out of range of both Jita and Dodixie, the new 10ly jump limit means we can keep our established supply lines running.  Also, my original predictions about capital/POS fuel getting hoarded probably won't come true.

The best illustration of the changes is still this gem from /r/eve:

Uedama Freighter Ganking

"As an act of protest", the CFC and CODE have teamed up to gank freighters in Uedama.  Though I think the excuse is thinly veiled, and it's traditionally the time of year that the CFC comes to highsec to cause drama, it is having effects on the EVE economy.  Chiefly, Red Frog is raising its rates across the board and adding a surcharge for transport through Uedama.  Details can be found in their mini press release.

Though the general rate hike should be regarded like IRL postage (the cost of stamps only ever rises), I think the Uedama fee is interesting.  If this washes out into any sizable effect, I expect pent-up supply to come out of Gallente space once the ganks subside, though I doubt the rubber-banding will be as drastic as we've seen after Burn Jita events.

Sleeper Research Event

I was just going to make a couple of bullet points about the spike in demand for "blue loot" (thanks Ravas for that).  Instead, after recording Hydrostatic Podcast with guest Ravas, I learned about a far more interesting outcome.

In his piece "They're Lying to You", Ravas outlines a lot of the player-driven WH lore generated since the space was released.  As an act of defiance, to try and revitalize an old player-dev partnership project, the WH community is rallying around Gillome Renard to stick it to the empires.  If I get my hands on any of the designated loot for this event, I will definitely be throwing it Gillome's way!

Outlier Report

Tonight, I'm going to point readers to the show notes for more information about specific outlier reporting.  I should dedicate this section of the blog to more texty analysis, but I am running out of time to publish this post.  Instead, I'll give you some info about HOW I generated this episode's outlier report, and the methodology I'll be using going forward.

To automate finding interesting things, I went ahead and filtered items based on weekly sales volume.  My script pulls each item, profiles the distribution of daily sales, and saves off some summary statistics, assuming the data fits a normal distribution curve.  Continuing with that assumption, I then use "sigmas" to pick out specific percentile levels.  If the weekly average volume for an item crosses one of those percentile levels, it gets flagged for further analysis.
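In code, that sigma-based flag boils down to something like the sketch below.  The `volume_flags` helper and the sample numbers are my illustration of the method, not the actual script:

```python
import statistics

def volume_flags(daily_volumes, weekly_avg, n_sigmas=2):
    """Profile daily sales volume assuming it is roughly normal,
    then flag items whose weekly average crosses a sigma threshold."""
    mu = statistics.mean(daily_volumes)
    sigma = statistics.stdev(daily_volumes)
    upper = mu + n_sigmas * sigma
    lower = mu - n_sigmas * sigma
    if weekly_avg > upper:
        return "high"   # unusually heavy trading -- worth a look
    if weekly_avg < lower:
        return "low"    # demand drying up -- also worth a look
    return None         # inside the normal band, ignore

# Hypothetical history: ~100 units/day with modest noise
history = [95, 102, 98, 110, 97, 101, 99, 103, 96, 104]
print(volume_flags(history, weekly_avg=150))  # 150 sits well above the 2-sigma band
```

The choice of `n_sigmas` sets which percentile trips the flag; 2 sigmas corresponds to roughly the 97.7th percentile under the normality assumption.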

I know that it would be better to look at some sort of price-based flagging, but I don't have a great methodology for that yet.  Long bouts of growth or decline will break the method I'm using today.  I will probably look to leverage more of the methods enabled by Quantmod, but we'll start with volumes today, and work on better methods over time.


I am reasonably pleased by this first test show.  The tools work well to enable reasonably quick prep, and general interest seems high.  We'll see over the next week how the twitch/youtube subscriptions/hits go, but I'm pretty excited.  Also, I will be trying my best not to be all emo about not being able to make it to EVE Vegas.

Monday, October 13, 2014

Back From Hiatus

I haven't posted here since Crius launched.  This summer has been characterized by some pretty serious burnout, which you could trace back to financial troubles that have taken most of the summer to clean up.  In all, I've been trying to figure out what's worth investing my time into, and keeping up appearances in EVE hasn't really made the list.

The long and short of it is that though Crius represented a significant and important improvement in quality of life for industry, personally, my heart isn't in grinding space money.  This is largely due to the need to rebuild tools, coupled with the lowered direct margins.  It would be surmountable if I had any interest in growing my personal wallet above a 10B waterline.


Computer Case Painting: #PCMASTERRACE

Instead, I spent the summer trying out some new projects.  First, I've been working on my general geek cred with something a little more creative.  It should be finished this week just a little after the Borderlands: The Pre-Sequel launch:

More pictures should be out this next week.  I'll probably do a whole /r/diy post here once it's totally complete.

More Data Projects

I have some pretty big data problems at work I've been trying to digest.  I'd like to build new methodologies to explore that data, but hammering at work data after-hours is a great way to go completely mad.  Instead, I've been working on some IRL/EVE market data scripting.  Specifically, I've been playing with Quandl a bunch, and that has reignited the spark on a lot of half-baked scripts I had sitting idle.

This has sparked a two-way development scheme that is paying dividends on a lot of fronts.  Quandl/IRL data is serving as the test bed to write really "good" code; the kind that is clean and extensible.  EVE/CREST data is giving me the sandbox to try "new" code and work out mistakes and structure in a better behaved environment.  

The end goals are:
  • Become familiar with plotting software
  • Become familiar with neural network tools/methodology
  • Learn highly parallel code: threading/GPU

On the Horizon

Hydrostatic Podcast has been one big pull back to the game, and I have some ambitious plans for a new show in addition to the podcast.  During the Alliance Tournament (grats Camel Empire), there was an ad for an EVE market show.  After checking it out, I said to myself, "I bet I could do something like that".

To commit to a regular stream like that, I knew the pre-stream work was going to be the breaking point.  So, I've been working on a batch of scripts to help scrape data and generate the visual aids automatically, so I can just broadcast without much issue.

Sample outputs:

I plan to have episode 0 of the yet-unnamed project airing this upcoming Thursday night US (Fri, Oct 17 0200 in game).  I expect the show to be ~1hr weekly.  I'm targeting Thursday night so people can have the info they need to speculate over the weekend.  Check out the draft show notes and let me know what you think!

Monday, July 28, 2014

Weekend Update

Crunched on some work this weekend.  Getting dangerously close to complete on the spreadsheet front, and pushed some updates out.  Just posting this as a quick update for those that may have missed them.

Crius Industry Tools for Google Drive - Update

I pushed some updates to the script this weekend.  This should address some bugs with the getPOS() routine, as well as some features that @CCP_Nullarbor added to the Teams CREST feeds.
  • Fixed issue with getPOS(): Was trying to fetch location names with the wrong function call.  This issue has been resolved and all valid TQ corp keys should return good data now
  • Fixed issue with getIndustryJobs(): Was not fetching personal jobs correctly.  Changed key validation to be correct now.  Can fetch corporate or personal jobs automatically off a single key/vcode
  • Added teamName/activity to Teams and TeamsAuction feeds: Were "TODO" placeholders.  Now reports in-game names and valid activity information
Go fetch a fresh copy from either source, and keep reporting bugs.
I still need to clean up some errors and hanging bits.  I'd like to push this tool to proper add-in status, but I am a little unclear on what to do between what I have now and what it takes to be considered "release ready".  Also, probably time to bite the bullet and buy some icon/banner designs.

More Crius Utilities for Devs

This one won't be as useful for the veteran 3rd party devs, but should be invaluable to the amateurs.  I put together some SQL queries and the resultant CSVs into the github.  If you're trying to update your spreadsheets with new data, these feeds should help you save a boatload of time.  If you're not sure what to do with the CSV files, I suggest you read up on pivot tables.

Let's Build a Spreadsheet Series

I did a pretty long set of streams on Sunday.  The goal was to help show off the spreadsheet fu and help people learn to build better spreadsheets with the tools at hand.  I will have the raw streams up on YouTube and Twitch tonight, but I want to boil down the lesson plans into 5-15min function lessons.  I still need to write up lesson plans and get my hands around video editing, but it's a goal for this August to put together a series that will help people build their own calculators.

Sunday, July 20, 2014

Building Better Spreadsheets - Crius Toolset

Crius releases on Tuesday, and most industrialists are scrambling to replace their spreadsheets (I know I am).  What's worse, the job-cost equation requires live data from the game, which can be difficult for someone relying on spreadsheet feeds.  Fear not!  I have great news!

Crius Feeds In Google Spreadsheets

These functions let you read in the CREST/API feeds of common calls and import them directly into your spreadsheet.  Furthermore, the triggers are set up to automatically refresh the data periodically so your spreadsheet will always be up-to-date.  Though this is not an exhaustive API tool, and still could use some more features, it should be a huge leg up for any spreadsheet jockey.

Most of the feeds are designed to dump the entire feed and don't offer much filtering in the call.  Instead, they are meant to be used in a reference sheet that can then be leveraged using VLOOKUP() or QUERY().  This might lead to some complexity issues down the line, so I intend to eventually add some finer-grained calls that just return single-line data.

Getting Started

Method 1: Clone the Master Spreadsheet

This will give you the tools and triggers, but will not stay up-to-date with the master sheet.  Until I can wrap up the code as a stand-alone Drive app, this will be the most foolproof way to get a copy:
  1. Open the spreadsheet
  2. Go to File -> Make a Copy
  3. Set the name of your copy (do not share with collaborators)
  4. Remove any extra sheets/calls you need to and start developing your spreadsheet
This method is the easiest to start with, but has the issue that it will not keep current with updates.  

Method 2: Copy-Paste from Github

The codebase is free and open source, and is designed to be copy-pasted into the gdoc script interface.  This method is a little more tedious, but will be easy to copy updates as they come out.
  1. Get plain-text code from the GitHub repo
  2. In your spreadsheet, go to Tools -> Script Editor...
  3. This opens a new window.  Select "Blank Project" from the initialization prompt
  4. Copy the raw code into the Code.gs space
  5. Set the name of the project
  6. Save the changes
  7. Configure the app triggers.  Set get/All functions to 1hr timers

This will give you all the utilities in a fresh or existing codebase.  Also, configuring the triggers appropriately will keep the data up-to-date automatically.  The triggers are technically optional, but without them the sheet will require a fresh open to pull fresh data.

Also, as updates come out, you'll be able to drop in the new code.  I expect to keep this project backwards compatible, so each drop in should ADD features.  Though, of course, if you go editing the code, you will need to be more careful about dropping in changes.  

Function List

  • getPOS(keyID, vCode, header_bool, verbose_bool, test_server_bool)
  • getFacilities(keyID, vCode, header_bool, verbose_bool, test_server_bool)
  • getIndustryJobs(keyID, vCode, header_bool, verbose_bool, test_server_bool)
  • getAvgVolume(days, item_id, region_id)
  • getVolumes(days, item_id, region_id)
  • AllItemPrices(header_bool, test_server_bool)
  • AllSystemIndexes(header_bool, test_server_bool)
  • AllTeams(header_bool, verbose_bool, test_server_bool)
  • AllAuctions(header_bool, verbose_bool, test_server_bool)
The functions are designed to be referenced as simply as possible.  CREST feeds like AllItemPrices and AllSystemIndexes can be referenced without arguments if desired.  The classic API feeds are designed to return as much information as they can, with internal switches that try to use the /corp/Locations feeds when possible.  Most feeds also come with a "verbose_bool" switch to add/remove ugly or useless raw-ID data.  Lastly, the test_server_bool has been left in the release; for TQ this value can be either blank or false.

Function guide below the cut

Wednesday, July 16, 2014

Can I Play With MA(C)Dness?

Blame this one entirely on @K162space.  He got me playing with Quantmod in R.  Also, shout out to CCP Quant for showing off the original bits, and CCP Foxfour for the CREST market history.

After all my EVE data experiments, I've had a very hard time finding predictive correlations between datasets.  Even bringing in destruction data showed no predictive properties, instead only correlating with, and corroborating, market volumes.  Also, I've tried my best to get into Varakoh's analysis, but have never been able to internalize his methods enough to roll them into my own (also, getting reliable candlesticks for EVE data is a pain).

Then Blake over at K162space sent me this tidbit:

Quantmod opened a whole new realm of tools I never knew about (full listing here).  The most interesting to me has been Moving Average Convergence Divergence (MACD).  I wouldn't be so bold as to say "use this to make money", but the results are extremely interesting when using IRL stock data.

For those who have never seen an MACD plot, the theory is relatively simple: different line crosses are signals for buy/sell operations (specifics here and here).  Though the signals are imperfect, and will by no means catch every peak and valley, they can be an excellent sanity check on the longer trend.  For many IRL stocks these signals correlate strongly with price action and are a popular tool among amateurs.  MACD is less useful for minute-to-minute daytrading or arbitrage, but can be a powerful data source for the long-term investor.
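For reference, the underlying math is simple: the MACD line is a fast exponential moving average of the price minus a slow one, and the signal line is an EMA of the MACD line itself.  A minimal pure-Python sketch of the traditional 12,26,9 setup (the function names are my own illustration, not Quantmod's API):

```python
def ema(series, span):
    """Exponential moving average with smoothing factor 2/(span+1)."""
    alpha = 2.0 / (span + 1)
    out = [series[0]]
    for price in series[1:]:
        out.append(alpha * price + (1 - alpha) * out[-1])
    return out

def macd(prices, fast=12, slow=26, signal=9):
    """Return (macd_line, signal_line); line crossings are the buy/sell hints."""
    macd_line = [f - s for f, s in zip(ema(prices, fast), ema(prices, slow))]
    signal_line = ema(macd_line, signal)
    return macd_line, signal_line
```

A bullish crossover is where the MACD line rises above the signal line; a bearish one is the reverse.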

Can this be useful for an EVE trader?  Well....

Let's look at a few plots:
PLEX - YTD - click to embiggen
Tritanium - YTD - click to embiggen
Phenolic Composites - YTD
I had to do a little fudging to get open/close bars to work correctly (though high=close=avg is roughly the same).  The trend information is interesting, but the crossover signals aren't lining up well.  Using the traditional 12,26,9 configuration, many of the crossover signals arrive 2-3 days late.  If you ended up using these charts as-is, I think you'd end up at best breaking even.  Though there are some secondary things you could do like buy and sell in the swings up and down, these charts aren't going to be immediately useful.

I then started playing with shorter windows; pairing a fast and a slow chart might be a better way forward.  Unfortunately, I'm blindly fiddling with knobs right now, but I'm actively hunting down documentation to better fine-tune the charts.  I was thinking a 10-15d window might be more accurate, and a 25-30d window would serve well as a "slow" chart.

Also, I think the weakness has a bit to do with the quality of the data here.  Where MACD expects close prices, we're feeding it daily averages.  This might be a good excuse to finally build a better snapshot tool using eve-central data.  Though I had a lot of trouble processing down the raw archive, standing up a stupid-script that pulls periodically from the traditional API would be a quick solution that could start crunching in the background.  Once I can get my industry tools refactored, I expect to get back into the data mining game pretty seriously.
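That stupid-script really can be stupid: a loop that appends timestamped rows to a CSV.  A sketch, with the actual market call injected as a callback since the endpoint details are the part being glossed over (`snapshot` and `poll_forever` are hypothetical names, not an existing script):

```python
import csv
import time
from datetime import datetime, timezone

def snapshot(fetch_price, type_ids, out_path):
    """Append one timestamped price row per item to a CSV file.
    `fetch_price` is whatever call hits the market API, injected so the
    polling cadence and storage stay separate from the endpoint details."""
    stamp = datetime.now(timezone.utc).isoformat()
    with open(out_path, "a", newline="") as fh:
        writer = csv.writer(fh)
        for type_id in type_ids:
            writer.writerow([stamp, type_id, fetch_price(type_id)])

def poll_forever(fetch_price, type_ids, out_path, interval_s=3600):
    """Take a snapshot every `interval_s` seconds, forever."""
    while True:
        snapshot(fetch_price, type_ids, out_path)
        time.sleep(interval_s)
```

Swap in a real `fetch_price` that queries the market API and the resulting CSV gives true point-in-time prices to feed the charts, instead of daily averages.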

In the meantime, be sure to subscribe to my channel on Twitch, and follow on Twitter.  I will be doing some random streams over the next week as I get back into a reasonable routine again.