Pages: [1] :: one page |
|
Author |
|
CCP Spitfire
C C P C C P Alliance
102
|
Posted - 2011.09.06 05:39:00 -
[1] - Quote
Attention, market addicts! Dr.EyjoG's newest dev blog gives us information about a new initiative: providing historic market data to capsuleers. This is a test run, so please share your feedback with both Research & Statistics and Team Sleeper Cell in this thread. |
|
Dierdra Vaal
Veto. Veto Corp
13
|
Posted - 2011.09.06 12:58:00 -
[2] - Quote
I believe the general consensus from the old thread was that shorter delay = better. Most people seemed ok with a 24h delay.
(also first)
Veto #205 * * * Director Emeritus at EVE University * * * CSM1 delegate, CSM3 chairman and CSM5 vice-chairman |
Ix Forres
Righteous Chaps
5
|
Posted - 2011.09.06 13:21:00 -
[3] - Quote
I've come all the way from the old forums to say: hourly or faster. EVE Metrics did it; there's no reason you guys can't. You could reasonably have the dump act more like a real API and cache it for a minute or two. Completely within your ability and utterly simple to achieve.
Edit: Specifically, we did daily dumps of all market history for every type and every region on the market, and we had an API that let you query any combination of the above and always returned the most up-to-date data, cached if someone else had made the same request before anything changed.
Edit 2, to quote Stillman:
Quote:From a technical point of view, this is completely impossible I'm afraid. The amount of data it takes to generate this sort of data is absolutely huge. There has to be a delay, because of that. And it has to be measured in days.
From a design point of view, there's also the concern about people getting a huge advantage from getting this sort of information.
I believe the response here should be something like 'lol'. Crowdsourced data repositories based on cache scraping will continue to kick the ass of this service, and the huge advantage will never arise because the evil cache hackers will have the edge all the time.
Points for trying, none for the execution. |
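The "cache it for a minute or two" idea above can be sketched as a simple time-based cache: the expensive dump is regenerated only when the cached copy is stale. This is a minimal illustration, and `generate_dump()` is a hypothetical stand-in for the real (expensive) export query, not anything EVE Metrics or CCP actually ships:

```python
import time

CACHE_TTL = 120  # seconds: "a minute or two"

_cache = {"data": None, "stamp": 0.0}

def generate_dump():
    # Placeholder for the costly database export.
    return {"generated_at": time.time(), "rows": []}

def get_dump():
    # Regenerate only if the cached copy is missing or older than the TTL;
    # every request inside the window is served from memory.
    now = time.time()
    if _cache["data"] is None or now - _cache["stamp"] > CACHE_TTL:
        _cache["data"] = generate_dump()
        _cache["stamp"] = now
    return _cache["data"]
```

However many clients hit the endpoint, the backend pays for at most one regeneration per TTL window.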
|
CCP Stillman
C C P C C P Alliance
54
|
Posted - 2011.09.06 19:14:00 -
[4] - Quote
Ix Forres wrote:I've come all the way from the old forums to say: Hourly or faster. EVE Metrics did it, there's no reason you guys can't. You could reasonably have the dump act more like a real API and cache it for a minute or two. Completely within your ability and utterly simple to achieve. Edit: Specifically we did daily dumps for all market history for every type on the market for every region on the market, and we had an API that let you query any combination of the above and which would retrieve, always, the most up to date data, cached if someone else had made a request before anything changed. Edit 2, to quote Stillman: Quote:From a technical point of view, this is completely impossible I'm afraid. The amount of data it takes to generate this sort of data is absolutely huge. There has to be a delay, because of that. And it has to be measured in days.
From a design point of view, there's also the concern about people getting a huge advantage from getting this sort of information. I believe the response here should be something like 'lol'. Crowdsourced data repositories based on cache scraping will continue to kick the ass of this service, and the huge advantage will never arise because the evil cache hackers will have the edge all the time. Points for trying, none for the execution.
I'd suggest you read the old thread. I already addressed why it's completely feasible for third-party sites to do this sort of thing, but we can't. It comes down to:
1. The data you acquire is just snapshots of orders.
2. There's a one-to-many relationship between orders and transactions.
3. Our data is generated from transactions.
So there's a difference in how it's calculated and in the output data. You have to realize we're working with non-trivial data sets in terms of size, and we'd have to do this while also ensuring there's no performance impact on EVE as a game, because that's unacceptable. Hello World! |
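The one-to-many relationship between orders and transactions that Stillman describes can be illustrated with a toy data set (the order IDs, prices, and quantities are made up for illustration): CCP's averages and medians come from the individual fills, while a scraper snapshotting the order book sees only the open orders, so the two are computed from different inputs.

```python
from statistics import mean, median

# One sell order (1001) is filled twice; a second order (1002) fills once.
# A snapshot scraper would see two orders; the statistics see three fills.
transactions = [
    {"order_id": 1001, "price": 100.0, "qty": 50},
    {"order_id": 1001, "price": 100.0, "qty": 30},  # same order, second fill
    {"order_id": 1002, "price": 105.0, "qty": 20},
]

prices = [t["price"] for t in transactions]
avg_price = mean(prices)     # transaction-based average
med_price = median(prices)   # transaction-based median
```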
|
Sciencegeek deathdealer
StratEgic TechnologIeS
0
|
Posted - 2011.09.08 08:05:00 -
[5] - Quote
I'm beginning work on a pretty comprehensive analysis tool/website based on what was given to us, but designed to analyze the API data (EDIT: when it's released...). I have a few ideas of what I want it to do, but any suggestions? (No promises this will work; it's a massive amount of data.)
-Geek
EDIT 2: This isn't designed to give you magical answers for that exact moment; it's a general analysis tool showing where the market is and where it is heading. My specialty is programming, not marketing; this just seems like quite a useful and interesting idea.
EDIT 3: Well... after 4 hours of pre-pre-processing, this is starting to come together on the data side. Now I just need to figure out how to analyze the data and get the website tidied up. I'll post some general pictures this weekend, maybe. This could be an interesting historical view of the EVE economy, and it should plug into the API nicely. |
Diomedes Calypso
Aetolian Armada
10
|
Posted - 2011.09.09 18:16:00 -
[6] - Quote
1) Can you confirm that the average and median prices are based on TRANSACTIONS, not on open orders?
(That may seem self-evident, but there are so many examples of terms used differently that I'd like to hear it explicitly.)
For me, true sales price information (even using averages) could show different information than the data scrapers currently aim for, with their emphasis on buy and sell _orders_ and their use of the "average" price we see in the game interface, which we know is massaged in all sorts of strange ways to get rid of outliers etc.
2) Will you be giving us unfiltered average and median transaction price numbers, or will you be using an undisclosed system for removing outlier-type transactions as you do now?
3) REQUEST: It would be super interesting to get a regular (weekly, say) release of numbers tracking the economy:
- total bounties paid during the period
- total ships destroyed, as insurance is paid on both insured and uninsured ships. I'd think it wouldn't be too hard to increment a counter each time a killmail is generated (ideally ships destroyed rather than sold, as that would be new information to us and an interesting immersive feel for how game combat drama connects to game market drama)
- total market volume in the period. While this information is scrapable, it would just be a nice thing to have in a brief "leading economic indicators" weekly release. I'd ask for a number on ships constructed (like new home starts), but I'd guess that might be harder for you to collect.
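On question 2, one common outlier rule (an assumption for illustration only; CCP has not disclosed its actual method) is interquartile-range trimming: drop transactions more than 1.5 IQR outside the quartiles, then report the average and median of what remains.

```python
from statistics import quantiles

def trim_outliers(prices, k=1.5):
    # Compute quartiles, then keep only prices within k * IQR of them.
    q1, _, q3 = quantiles(prices, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [p for p in prices if lo <= p <= hi]

# Six ordinary sales plus one fat-finger sale at 5000 ISK.
prices = [98.0, 100.0, 101.0, 102.0, 99.0, 100.0, 5000.0]
clean = trim_outliers(prices)
```

Whether CCP's in-game "average" uses something like this, a fixed percentile cut, or something else entirely is exactly what the question above asks them to disclose.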
|
Diomedes Calypso
Aetolian Armada
10
|
Posted - 2011.09.09 18:26:00 -
[7] - Quote
New thought that I got thinking about after my last post.
Ships destroyed, two ways:
1 - # of killmails generated
2 - # of secured commerce ship loss letters generated.
The first is for PvP; the second can be for either PvE or PvP. The absolute movement and relative movement of the two numbers would be interesting in different ways.
Understanding the relation of the two and the ebb and flow between them would really be fun on a weekly basis.
That sort of stuff really creates a greater tie between the economic and political portions of the game in an imaginative sense... while we know they are ultimately connected, the color is a bit lacking....
... industrialists sitting in their posh corner offices could peruse the war news, relishing the profit potential in re-supplying both sides. A bit more granularity, down to losses by ship class, would be even better, but I'd hate to have the idea abandoned or postponed by making it more difficult to implement initially.
EDIT: I want to be clear that I'm not looking for live information via the API on ship losses. That would be unnecessarily cumbersome on resources and queries. What I'd like is for you to do a count once each day (at downtime, perhaps?) and make only that number (or the sum of 7 of those numbers, if only disclosed weekly) available.
I don't know much about computers, but I'm pretty sure that making indexed information available is far less resource-demanding than live queries. The indexed information may be less current (as in, it doesn't work for live gameplay, which you might be accustomed to thinking about), but economic analysis uses historical information as a standard. |
Tyollo
ELITE BR Vera Cruz Alliance
0
|
Posted - 2011.09.14 23:18:00 -
[8] - Quote
I like the initiative.
For me, the best option is the CSV file, because it gives greater freedom to the developer community.
A good use of this information would be something similar to the program MetaStock (http://www.equis.com/), which is used for technical analysis of stock markets around the world.
At the end of each day, data from the past day's transactions is released. This lets investors make investment decisions for the next day.
In the case of EVE, I think a weekly update is sufficient.
Thanks |
Diomedes Calypso
Aetolian Armada
12
|
Posted - 2011.09.15 18:04:00 -
[9] - Quote
Tyollo wrote:I like the initiative.
For me, the best option is the CSV file, because it gives greater freedom to the developer community.
A good use of this information would be something similar to the program MetaStock (http://www.equis.com/), which is used for technical analysis of stock markets around the world.
At the end of each day, data from the past day's transactions is released. This lets investors make investment decisions for the next day.
In the case of EVE, I think a weekly update is sufficient.
Thanks
I also think that a daily or weekly aggregation of numbers is sufficient. There's really little need to create the server load (or whatever load) of calculating and reporting for each API call. Weekly should be a standard Monday-to-Sunday week, not a trailing 7 days.
.... and even more, I'd like numbers based on calendar days, and you could make that information available on a very simple web page that people could use without a third-party program, leaving far fewer people wanting tools to customize ranges. |
Tyollo
ELITE BR Vera Cruz Alliance
0
|
Posted - 2011.09.15 19:21:00 -
[10] - Quote
Another option is to release summarised data instead of all of the day's transactions, which would be a much larger data set. You could release only a few already-compiled values: opening value, closing value, maximum value, minimum value, and average for each product by region or system. The same format is used on stock exchanges.
The exchanges work with a much larger volume of information and still release the data at the end of the day. If they use this format, they must have a good reason, so I think following the model of the world's stock exchanges is a good initial choice: it is a format already tested with large amounts of data and users.
The full data could then be released weekly or every 15 days, with the summarised data released daily. |
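The summarised "exchange style" row described above (open, close, high, low, and average per item per region per day) can be sketched like this. The field names are illustrative, not CCP's actual schema, and the input is assumed to already be one item's transactions for one region and one day:

```python
from statistics import mean

def daily_bar(transactions):
    # transactions: list of (timestamp, price) pairs for one item/region/day.
    ordered = sorted(transactions)          # order fills by time
    prices = [p for _, p in ordered]
    return {
        "open": prices[0],                  # first fill of the day
        "close": prices[-1],                # last fill of the day
        "high": max(prices),
        "low": min(prices),
        "average": mean(prices),
    }

bar = daily_bar([(1, 100.0), (2, 104.0), (3, 98.0), (4, 101.0)])
```

One such row per item per region per day is tiny compared with the raw transaction log, which is exactly why exchanges publish in this shape.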
|
Kandreath
De Re Metallica Epsilon Shimmy Alliance
2
|
Posted - 2011.09.17 04:22:00 -
[11] - Quote
Bahhhh lost my last post for some reason.
First, thanks for the data! This is great stuff.
Second, I don't give a rat's what format it is in. I'll do whatever I need to get it into a format I like. That said, do we really have to choose? Why not continue with both the SQL and CSV formats? I reckon those two will keep most people happy.
For the record I used the SQL format so I could use a query to pre-filter the data before giving it to OO Calc. (It's faster and easier working with a smaller data set).
Third, I think there is an error in a column heading in the dump. The blog says I should have a column "regionID", but the dump has "stationID". The numbers in the column look to me like regionIDs. Can you confirm?
Thanks again for the data. |
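Kandreath's approach of pre-filtering in SQL before handing a smaller data set to a spreadsheet can be sketched with an in-memory SQLite table. The table and column names here are assumptions loosely based on the dev blog (including the disputed regionID column), not the actual dump schema; regionID 10000002 is The Forge.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE market_history (
    regionID INTEGER, typeID INTEGER, historyDate TEXT,
    lowPrice REAL, highPrice REAL, avgPrice REAL, volume INTEGER)""")
conn.executemany(
    "INSERT INTO market_history VALUES (?,?,?,?,?,?,?)",
    [
        (10000002, 34, "2011-09-01", 4.1, 4.6, 4.3, 1000000),  # The Forge
        (10000043, 34, "2011-09-01", 4.0, 4.8, 4.4, 500000),   # Domain
    ],
)

# Pre-filter to one region so the spreadsheet only sees the rows it needs.
rows = conn.execute(
    "SELECT typeID, historyDate, avgPrice FROM market_history "
    "WHERE regionID = ?", (10000002,)
).fetchall()
```

Exporting `rows` to CSV at this point gives OO Calc a fraction of the full dump to chew on, which is the "faster and easier" effect described above.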
|