EVE Technology Lab

 
 

Why are we not combining market data, player devs?

Author
Syn Fatelyng
Redanni
#1 - 2011-10-07 03:53:45 UTC  |  Edited by: Syn Fatelyng
Eve-Marketdata, EVE-Central, EVE Marketeer, and Jitonomic. All four provide a valuable resource to the community: online, multiregional market information and organization.

The largest issue, however, is the separation of the playerbase, which results in the outdated or unused information that plagues each site. I understand that each website has different frontend goals and design choices that every developer is entitled to, but what I fail to understand is why there isn't a collaboration on data so that developers of any breed can choose to use it however they see fit. Some may remain nothing more than information displays, while others could develop more accurate and extensive route algorithms. Honestly, once you remove the hassle of requiring extensive information from an already divided group of players, your imagination is the limit.

Decide on a standard. Consolidate the playerbase. Upload to the same pool. Create a core where the competition between websites isn't the data itself, but the way that data is displayed or used. It would make the lives of EVE pilots easier while providing developers with accurate information, reduced CPU load, and possibly reduced bandwidth.

What needs to be done so this can happen, and what can I do to encourage or support the process?

Update - Many things have progressed into full-scale projects. Details are below.
stu2000
Perkone
Caldari State
#2 - 2011-10-07 06:29:47 UTC
Hey,
Yes, those services provide a very important resource/tool. Call me a noob, but how exactly is the data provided to these sites? Does it come directly from the EVE database, or does it rely on players manually uploading data exported from the game? If it's the latter, then I think we have found the real issue; you wouldn't really need data collaboration if the former were the case. Does anyone know of a way to do market lookups using the API?
Lutz Major
Austriae Est Imperare Orbi Universo
#3 - 2011-10-07 07:06:10 UTC
stu2000 wrote:
Hey,
Yes, those services provide a very important resource/tool. Call me a noob, but how exactly is the data provided to these sites? Does it come directly from the EVE database, or does it rely on players manually uploading data exported from the game? If it's the latter, then I think we have found the real issue; you wouldn't really need data collaboration if the former were the case. Does anyone know of a way to do market lookups using the API?

Neither the API nor the 'eve-database' provides information about market prices.

When you browse your market screen, the data is fetched from the EVE servers and stored locally in a cache. That cache is read by the uploader software you download from EVE-Central and the other sites.

The more people use the uploader of one site, the better its prices, and the more people will use that particular page. It's PvP Big smile
Elegbara
White Wolf Enterprises
Harmonious Ascent
#4 - 2011-10-07 09:10:21 UTC
I did once start writing some code that would fetch prices from different sources and merge them in a configurable way, like this: "({EveCentral_AmarrSell} + {EveCentral_JitaSell})/2*1.01"
Unfortunately I didn't have enough time to finish it, and currently I have no incentive to work on it either. The code is in JS (XPCOM, Mozilla stuff) and can be found here (split between several files and projects, unfortunately):
The module itself
An XPCOM module for working with eve-central
An example of usage
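The quoted merge formula is simple to evaluate mechanically: substitute each `{Token}` from a table of fetched prices, then evaluate the arithmetic. A short sketch of the idea, in Python for illustration (the code linked above was JS/XPCOM; the token names are just the ones from the example):

```python
import re

def merge_price(formula, prices):
    """Evaluate a configurable merge formula such as
    "({EveCentral_AmarrSell} + {EveCentral_JitaSell})/2*1.01",
    substituting each {Token} from the `prices` dict."""
    expr = re.sub(r"\{(\w+)\}",
                  lambda m: repr(float(prices[m.group(1)])), formula)
    # After substitution only numbers and arithmetic should remain;
    # reject anything else before evaluating (eval on raw user input
    # would otherwise be unsafe).
    if not re.fullmatch(r"[0-9eE+\-*/(). ]*", expr):
        raise ValueError("unexpected characters in formula")
    return eval(expr, {"__builtins__": {}})

prices = {"EveCentral_AmarrSell": 100.0, "EveCentral_JitaSell": 110.0}
result = merge_price(
    "({EveCentral_AmarrSell} + {EveCentral_JitaSell})/2*1.01", prices)
# result ≈ 106.05
```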

Open your eyes. And awaken.

Scrapyard Bob
EVE University
Ivy League
#5 - 2011-10-07 13:07:19 UTC
The EVEMarketeer uploader feeds both EVE-Marketdata and EVEMarketeer. There was also a project floating around where someone wrote an uploader to feed those two plus EVE-Central (but that project is possibly dead?).

As others have said, the upload tools watch the cache files on the hard drive (CCP has specifically allowed this in past statements). As you browse the market, your EVE client creates a cache file for each market item you look up; those files contain the current market snapshot (orders, prices, volumes, etc.) for that item.
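The watching half of that design is just a polling loop over the cache directory. A minimal sketch in Python for illustration (the directory layout and file names here are placeholders, and decoding the proprietary cache format is a separate step, e.g. via a library such as Reverence mentioned later in the thread):

```python
import os
import tempfile

def scan_once(cache_dir, seen):
    """Return cache files new or modified since the last scan;
    `seen` maps path -> mtime and is updated in place."""
    changed = []
    for name in sorted(os.listdir(cache_dir)):
        path = os.path.join(cache_dir, name)
        mtime = os.path.getmtime(path)
        if seen.get(path) != mtime:
            seen[path] = mtime
            changed.append(path)
    return changed

# Quick demonstration on a throwaway directory standing in for the
# EVE client's cache folder.
demo_dir = tempfile.mkdtemp()
open(os.path.join(demo_dir, "orders.cache"), "w").close()
seen = {}
new_files = scan_once(demo_dir, seen)   # first scan sees the file
again = scan_once(demo_dir, seen)       # nothing changed since
```

An uploader would run `scan_once` in a loop with a short sleep, handing each changed file to a decoder and then to the upload step.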
Callean Drevus
Perkone
Caldari State
#6 - 2011-10-07 17:24:22 UTC
Aside from uploading to both sources, I also gather the data that EVE-Central and EVE Marketdata expose via their APIs. Unless they keep data hidden, I have all the data in the universe. I in turn expose an API so they can gather my uploads; the fact that they don't is not my fault :P

In short, I aim to be a centralized hub. Due to the intricacies of the massive amount of data, it is not easy however. I'm open to anyone willing to help out.

Developer/Creator of EVE Marketeer

Syn Fatelyng
Redanni
#7 - 2011-10-07 18:37:22 UTC  |  Edited by: Syn Fatelyng
Callean Drevus wrote:
Aside from uploading to both sources, I also gather the data that EVE-Central and EVE Marketdata expose via their APIs. Unless they keep data hidden, I have all the data in the universe. I in turn expose an API so they can gather my uploads; the fact that they don't is not my fault :P

In short, I aim to be a centralized hub. Due to the intricacies of the massive amount of data, it is not easy however. I'm open to anyone willing to help out.
I think this is an excellent first step. Allowing an individual upload tool to propagate to multiple sites is a band-aid for the larger problem, but noble nonetheless, and I'm genuinely glad to see how well you support such a feature.

Perhaps you can help me understand something: Why do so many third party programs, such as EVE Refinery and EFT, seem to pull from EVE-Central and not any of the other market websites? Is the API to interact with EVE-Central less of a hassle than EVE Marketdata or EVEMarketeer? Until the issue of divided users and information is resolved, I'd love to pick which website to use when I feel one is more accurate than the other.
Zaepho
GoonWaffe
Goonswarm Federation
#8 - 2011-10-07 19:10:32 UTC
Syn Fatelyng wrote:
I think this is an excellent first step. Allowing an individual upload tool to propagate to multiple sites is a band-aid for the larger problem.

I disagree: this is the best solution. A universal uploader relies on only a single point of failure (and, if open source, should be more maintainable after the inevitable dev fade sets in).

Ideally all market sites would support a common API for uploading market data. The EVE Market Universal Uploader (EMUU) could then be very simple, and would be able to upload to an 'infinite' number of sites configurable by the end user (with a published set of defaults supporting all the major sites).

With a single uploader it would be easier to get more people to run it all the time. Once you have a critical mass of users, the data should increase in volume and accuracy; ideally orders could be matched and verified across multiple EMUU users' uploads.

Having one uploader that uploads to all sites also helps prevent loss of data when any single acceptor site is down or overloaded for any reason.

Now, to get closer to utopia, it would be useful if, say, every hour, sites with raw-data swapping agreements in place batched up their raw (or minimally processed, to optimize the exchange) data and sent it to their peers. This would help each site's accuracy without sacrificing the distributed nature of the data collection.

So the items that need to occur are:

  • Develop and publish a standardized market data upload API (as generic as possible, to be as "patch proof" as possible)
  • Create the EMUU client libraries, application, and auto-updater to perform the cache scraping and data upload from the client side
  • Convince EVEMon to embed the cache scraper and EMUU as a component (can be disabled) of EVEMon to leverage their large install base to prime the pump
  • Convince EVEHQ to embed the cache scraper and EMUU as a component (can be disabled) of EVEHQ to leverage their large install base to prime the pump
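The fan-out part of the EMUU idea above is small enough to sketch. A Python illustration (the endpoint URLs and payload shape here are entirely made up, not any site's real API; a real uploader would load the endpoint list from user config):

```python
import json
from urllib import request

# Hypothetical endpoints: in practice a user-editable list with
# published defaults covering the major market sites.
ENDPOINTS = [
    "https://marketdata.example/upload",
    "https://marketeer.example/upload",
]

def http_post(url, body):
    """Default transport: POST the JSON body to one site."""
    req = request.Request(url, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req, timeout=10) as resp:
        return resp.status

def upload_snapshot(orders, endpoints=ENDPOINTS, post=http_post):
    """Fan one market snapshot out to every configured site; a failure
    at one acceptor site must not stop uploads to the others."""
    body = json.dumps({"version": 1, "orders": orders}).encode()
    results = {}
    for url in endpoints:
        try:
            results[url] = post(url, body)
        except Exception as exc:        # site down or overloaded
            results[url] = exc
    return results

# Demonstration with a fake transport (no network involved): one site
# "fails" but the other is still attempted.
calls = []
def fake_post(url, body):
    calls.append(url)
    if "marketeer" in url:
        raise OSError("site down")
    return 200

results = upload_snapshot([{"typeID": 34, "price": 4.55}], post=fake_post)
```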
Scrapyard Bob
EVE University
Ivy League
#9 - 2011-10-07 19:22:10 UTC
Syn Fatelyng wrote:
Perhaps you can help me understand something: Why do so many third party programs, such as EVE Refinery and EFT, seem to pull from EVE-Central and not any of the other market websites? Is the API to interact with EVE-Central less of a hassle than EVE Marketdata or EVEMarketeer? Until the issue of divided users and information is resolved, I'd love to pick which website to use when I feel one is more accurate than the other.


More likely because EVE-Central has been around the longest. (jEVEAssets has the option to pull from EVE-MarketData, for instance.) EMD and EMK are both fairly new to the scene (EMD was specifically started after the old eve-metrics site shut its doors).

The API calls really aren't all that different between the sites: you feed them a region ID, an item ID (or a dozen), and maybe a third URL argument to pull only the sell-price average (or the 5% buy-sim price), and you get back XML with a price or prices.
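For illustration, consuming such a response is a few lines with a stock XML parser. The sample below is only modeled on the shape of EVE-Central-style marketstat output; the exact element names should be checked against each site's actual docs:

```python
import xml.etree.ElementTree as ET

# Illustrative response in the rough shape of a marketstat reply.
SAMPLE = """<evec_api version="2.0" method="marketstat_xml">
  <marketstat>
    <type id="34">
      <sell><min>4.20</min><avg>4.55</avg></sell>
      <buy><max>4.10</max><avg>3.95</avg></buy>
    </type>
  </marketstat>
</evec_api>"""

def parse_marketstat(xml_text):
    """Map typeID -> a dict of the prices a client typically wants."""
    stats = {}
    for t in ET.fromstring(xml_text).iter("type"):
        stats[int(t.get("id"))] = {
            "sell_min": float(t.findtext("sell/min")),
            "buy_max": float(t.findtext("buy/max")),
        }
    return stats

stats = parse_marketstat(SAMPLE)
# stats[34] holds the lowest sell and highest buy for Tritanium
```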
Syn Fatelyng
Redanni
#10 - 2011-10-07 19:23:39 UTC
Zaepho wrote:
I disagree. This is the best solution. A universal uploader is the best solution as it relies on only 1 single point of failure
This is true.
Quote:
Ideally all market sites would support a common API for uploading market data. In this the EVE Market Universal Uploader (EMUU) could be very simple, and would be able to upload to an 'infinite" number of sites configurable by the end user (with a published set of defaults supporting all the major sites).
I enjoy the thought of this idea, but am concerned about how much bandwidth load that places on users. Instead of one site, they may be uploading to four (if not more); that's a fourfold increase in client-side bandwidth.
Quote:
Having 1 Uploader that uploads to all also helps prevent loss of data due to any single acceptor site being down, overloaded, etc for any reason.
Again, agreed.
Quote:
So the items that need to occur are:
Once again I find myself agreeing with your viewpoint and vision.

I'm not a programmer, but I'm willing to put ISK and word-of-mouth support towards an open-source project that would handle multiple sites (beyond what the current EVE Marketeer uploader supports). Eventually, programs such as EFT and EVE Refinery may be able to let users choose which website to pull pricing data from.
Zaepho
GoonWaffe
Goonswarm Federation
#11 - 2011-10-07 20:04:16 UTC
Syn Fatelyng wrote:
I enjoy the thought of this idea, but am concerned about how much bandwidth load that places on users. Instead of one site, they may be uploading to four (if not more); that's a fourfold increase in client-side bandwidth.


This is quite true, but under normal/ideal circumstances most people will simply be uploading items they look at in the course of playing EVE. Those who upload via the various automated web pages, to gather prices for less commonly viewed items in their region, would be most impacted here.

I would also posit that relaying from the server side could become a major bandwidth issue for the server (and possibly a major price issue for the hosting), whereas, as with BitTorrent, the summed bandwidth of the clients is significantly greater than any single server's would likely be. Certainly less efficient when looking at the whole, but more convenient and simpler from an implementation perspective.
Scrapyard Bob
EVE University
Ivy League
#12 - 2011-10-08 04:45:31 UTC
The other reason why you want to feed to different sites from the client instead of a central clearing house is so that the data still flows to at least some of the sites if one of the sites is having some downtime.

And at 1,200 market scans per hour, you're still talking about a minuscule amount of bandwidth. Those uploads are probably a handful of kilobytes each, which would mean something like 20-40 megabytes per hour of traffic at the upper end. That's tiny compared to what the EVE client receives from the server.

(Even with 3 clients running the market scanner at the moment, my inbound traffic is about 900 Kbps and outbound traffic is about 40-50 Kbps, which means my outbound traffic is about 18 megabytes per hour.)
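Those figures check out. Converting a sustained rate in kilobits per second (note bits, not bytes) to megabytes per hour:

```python
def kbps_to_mb_per_hour(kbps):
    """Sustained kilobits/second -> megabytes/hour (decimal units):
    kilobits -> bits -> bytes -> per hour -> megabytes."""
    return kbps * 1000 / 8 * 3600 / 1_000_000

low = kbps_to_mb_per_hour(40)    # 40 Kbps sustained -> 18.0 MB/hour
high = kbps_to_mb_per_hour(50)   # 50 Kbps sustained -> 22.5 MB/hour
```

So the observed 40-50 Kbps outbound matches the quoted ~18 MB per hour.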
Callean Drevus
Perkone
Caldari State
#13 - 2011-10-08 07:16:09 UTC  |  Edited by: Callean Drevus
Regarding bandwidth: the server already has enough trouble dealing with all the uploads. The client only needs to upload to four sites; the sites need to process uploads from ALL clients. That might take a little while.

Quote:
Now, to get closer to utopia, it would be useful if, say, every hour, sites with raw-data swapping agreements in place batched up their raw (or minimally processed, to optimize the exchange) data and sent it to their peers. This would help each site's accuracy without sacrificing the distributed nature of the data collection.

So the items that need to occur are:

Develop and publish a standardized market data upload API (as generic as possible, to be as "patch proof" as possible)
Create the EMUU client libraries, application, and auto-updater to perform the cache scraping and data upload from the client side
Convince EVEMon to embed the cache scraper and EMUU as a component (can be disabled) of EVEMon to leverage their large install base to prime the pump
Convince EVEHQ to embed the cache scraper and EMUU as a component (can be disabled) of EVEHQ to leverage their large install base to prime the pump


The original EVE Metrics had a system in place where data would automatically be propagated to 'subscribers' as soon as it came in. This made use of AMQP, if I recall correctly. It will be my goal to implement that as well eventually, since it is probably the easiest solution, and realtime, which I'm always in favour of.
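The push model described here (each upload forwarded to subscribers as soon as it arrives) boils down to publish/subscribe. A toy in-memory sketch in Python; a real deployment would sit on a message broker such as AMQP, as EVE Metrics reportedly did, which adds queuing, retries, and durable subscriptions:

```python
class MarketFeed:
    """Toy publisher/subscriber hub: every snapshot published is pushed
    to all subscribers immediately, instead of being held for an
    hourly batch swap."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        """Register a peer site's delivery callback."""
        self.subscribers.append(callback)

    def publish(self, snapshot):
        """Forward one market snapshot to every subscriber.
        A real broker would queue and retry on failure."""
        for deliver in self.subscribers:
            deliver(snapshot)

received = []
feed = MarketFeed()
feed.subscribe(received.append)
feed.publish({"typeID": 34, "regionID": 10000002, "orders": []})
# received now holds the snapshot, delivered in realtime
```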

I can tell you now that swapping data every 5 minutes, as EVE-Central does, is much better than swapping every hour; hourly swaps would make each batch of data MASSIVE.

Regarding publishing a standardized API and an EMUU, I am all in favour. Getting people to adopt it is going to be the most difficult part, but perhaps a nicely formatted specification and the promise of integration will make it more agreeable. I'll see what I can do in this regard this weekend.

Don't get me wrong, I WANT to support uploading to EVE-Central. There are only two minor reasons keeping me from it: 1. EVE-Central is not actively updated anymore; 2. I cannot figure out what to send where.

Well, point two has become moot as of today, since I found out what to send where.
I still cannot understand how his database (EVE-Central's) is able to handle so much more brute-force abuse than mine, while he undoubtedly gets a much higher data volume... someone tell me again that this is not simply due to him using PostgreSQL Roll

Developer/Creator of EVE Marketeer

Zaepho
GoonWaffe
Goonswarm Federation
#14 - 2011-10-10 17:01:54 UTC
Callean Drevus, glad to hear you were able to reverse engineer the EVE-Central upload format!

The hourly suggestion was just a nice round number that wasn't "Instant Propagation" to reduce the level of effort etc involved.

I think getting the major sites on board with a defined standard format, with a working uploader that they don't have to maintain individually, is just about a no-brainer. Comments from said sites are most definitely welcome.

So short term steps that I see are:
1) Define and publish Universal upload format (choose an existing format for simplicity!)
2) Build Uploader that handles unlimited sites with Upload type modules (Universal, EVE-MarketData, EVE-Marketeer , Eve-Central?) specified for each endpoint
3) Get the uploader on an open source site so it can be peer reviewed and contributions from the community are possible
4) Convince active market sites to drop their own uploaders in favor of the EMUU

Long Term
1) Build Plugins/Addons for EVEMon/EVEHQ to increase footprint of passive uploaders
2) Build Server-side data "replication" system (Publisher/Subscriber model with only 1 level of distribution)
3) Build some form of aggregation of "wanted items" by region to build Auto-Upload page for IGB from local EMUU app
4) Build local IGB app Auto-Uploader support for Custom upload lists
5) ???



Desmont McCallock
#15 - 2011-10-10 17:40:33 UTC
EVEMon would welcome any effort to build an EVE cache scraper as a library, and even a .NET wrapper for it. I wouldn't mind the library being written in any programming language, as long as it has understandable documentation, so that building the wrapper doesn't become a PITA.

I know that Entity has released Reverence, but I haven't had the time to look at it and explore the possibility of building a .NET wrapper around it. If anyone has any ideas about it, please be so kind as to share them.
Zaepho
GoonWaffe
Goonswarm Federation
#16 - 2011-10-10 18:07:43 UTC
Desmont McCallock wrote:
EVEMon would welcome any effort to build an EVE cache scraper as a library, and even a .NET wrapper for it.

I'm assuming you're also not against the possibility of adding general public market data uploading to market analysis websites as well?

I'm sure there are some definite benefits to be had from a cache scraper in EVEMon (instant updates of skill training/queue prior to API validation, etc.)
Dragonaire
Here there be Dragons
#17 - 2011-10-10 18:08:34 UTC
I like XML formats myself Blink but they have too much overhead and use too much bandwidth. Something like CSV might work and is very lightweight, but it's really limited and doesn't support any form of metadata, which will probably be needed. I'd suggest using JSON: almost every programming or scripting language either has a library for it or lets you write a simple one yourself if needed. It's also an open standard with little overhead that is fairly easy to understand, but lets you do some complex things if you need to.
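As a concrete sketch of what such a JSON upload record might look like (every field name here is hypothetical, not any site's actual schema), including the "required core plus optional extras" split discussed below:

```python
import json

# Hypothetical record: required core fields plus an optional "meta"
# block that a receiving site may use or simply ignore.
order = {
    "typeID": 34,
    "regionID": 10000002,
    "price": 4.55,
    "volRemaining": 1000000,
    "bid": False,
    "meta": {"uploaderVersion": "0.1"},   # optional site/client extras
}
line = json.dumps(order, separators=(",", ":"))  # compact wire form
parsed = json.loads(line)

# Unknown optional fields are harmless to consumers that ignore them,
# but archiving the full record lets a site replay it later once it
# learns to use those fields.
unknown = parsed.get("someFutureField")   # absent -> None, no error
```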

The only other thing would be coming up with a list of what data is required and what can be optionally included, plus a standard set of names/labels for everything. The reason for allowing optional information is that it makes sure everyone has something in place to handle newer stuff they don't understand, and it lets each site add some additional metadata for internal use if they feel it is really needed. It also allows testing new features and making them available to others before all of the other sites add them themselves.

To go with the above, all the sites should keep the full data, including the parts they don't understand, so that if they decide to take advantage of it in the future they can simply rerun the old data. This would also allow sites to relay the info to another site that might be able to use everything. I could see the relay feature being very useful for things like asking other sites for what you missed during an outage or planned downtime. Killboards have had something like this for a while, and there's no reason it can't be done for market data as well.

Finds camping stations from the inside much easier. Designer of Yapeal for the Eve API. Check out the Yapeal PHP API Library thread.

Zaepho
GoonWaffe
Goonswarm Federation
#18 - 2011-10-10 18:30:55 UTC
Dragonaire wrote:

The only other thing would be coming up with a list for what data is require and what can be optional included and a standard set of names/labels for everything. The reason for allowing optional information is it makes sure everyone has something in place to handle newer stuff it doesn't understand plus allow each site to add some additional meta-data for their internal use if they feel it is really needed. It also allows the testing of new features and making it available to others before all of the other sites add it themselves.


I think the idea would be to send the rawest data reasonably possible, i.e. the data from the cache file directly, with some anonymous/semi-anonymous indicator of source (cryptographic signing would be awesome, but I'm not sure how to make that work in a way that wouldn't be massively exploitable) to be able to attribute uploads to users and potentially ban people trying to exploit the system by sending bad/false data.

For this, I suppose JSON would be a good serialization format for the record data, unless I'm missing something about the raw data format.
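The signing idea floated above can be sketched with a plain HMAC, which gives attribution without revealing a username. Python for illustration; the genuinely hard parts (key issuance, replay protection, keeping keys out of a tampered client) are not addressed here:

```python
import hashlib
import hmac
import json

def sign_upload(payload, user_key):
    """Attach an HMAC-SHA256 tag so a site can attribute an upload to
    a key holder and ban keys caught sending falsified data.  A full
    scheme also needs key issuance and replay protection; this is
    only the core idea."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(user_key, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": tag}

def verify_upload(message, user_key):
    """Recompute the tag and compare in constant time."""
    body = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(user_key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["sig"])

msg = sign_upload({"typeID": 34, "price": 4.55}, b"per-user-secret")
ok = verify_upload(msg, b"per-user-secret")          # genuine upload
forged = dict(msg, payload={"typeID": 34, "price": 0.1})
bad = verify_upload(forged, b"per-user-secret")      # tampered payload
```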
Dragonaire
Here there be Dragons
#19 - 2011-10-10 20:46:40 UTC
I agree everything should be kept to a minimum, but as great as a common cache reader feeding data to the sites would be, I was thinking it could be expanded into something that also lets people feed their API data in anonymously. I can see many marketers liking a way to make all of Eve aware of what they have up for sale or are buying, and I could see this helping to fill in the under-scanned areas. Why wait until they or someone else scans the market in their area to make that data available?

I know it's not hard now to just make a CAK and give it to a site, but one thing the market provides is that you don't know whom you're buying from or selling to until after you've done so; if you give them the CAK it is traceable, and there's some additional data there beyond what you might want to share with them. Making the purely public part of that data available, just like anyone can see in game, might be something people would be willing to do if they had a way to do so.

This was just a random idea; I don't know if people would be interested, but I thought I'd throw it out there for people to chew on.

Finds camping stations from the inside much easier. Designer of Yapeal for the Eve API. Check out the Yapeal PHP API Library thread.

Zaepho
GoonWaffe
Goonswarm Federation
#20 - 2011-10-10 21:09:25 UTC
Dragonaire wrote:
Why have to wait until they or someone else scans the market in their area to make that data available?

This makes quite a bit of sense. I wonder how well the API data and/or the cache data for the orders matches up with the market orders data. If either can't be spoofed the way the original market data can, there's definitely some value here! Any way the data can be grabbed is a good thing in my opinion.

I'd really push to primarily use the current methods (due to the large volume of data available) and try to work with the other "always on" EVE tools to broaden the number of passive uploaders. This is likely the most effective way of increasing data integrity and overall volume. More distinct uploaders mean less impact from, and easier spotting of, false-data manipulation attempts.