Pages: [1] :: one page |
|
Author |
Thread Statistics | Show CCP posts - 0 post(s) |

Amira Silvermist
The Aegis Militia | Aegis Militia
|
Posted - 2007.11.01 12:20:00 -
[1]
Is it true that you cannot access Journal data that's older than a month via the API? And if so: why?
|

Dan Treva
Raptor Services LTD
|
Posted - 2007.11.01 16:14:00 -
[2]
It's limited to 30d or 1,000 entries, whichever is greater.
I have to download twice a day to stay current with my wallets :(
|

Ambo
2nd Outcasters
|
Posted - 2007.11.01 19:09:00 -
[3]
Originally by: Dan Treva It's limited to 30d or 1,000 entries, whichever is greater.
I have to download twice a day to stay current with my wallets :(
hehe, that makes no sense. 
As long as you download every 29 days you'll never miss anything.
To the OP: Garthagk has stated that this will not be changing any time soon I'm afraid, kinda sucks but there we go 
|

Dan Treva
Raptor Services LTD
|
Posted - 2007.11.14 01:58:00 -
[4]
Ahh right. Sorry.
1,000 entries or 30d, whichever comes first.
1,000 entries fills up pretty fast in a larger/active corp. We have to download 3-4 times per day to capture it all.
|

Matthew
Caldari BloodStar Technologies
|
Posted - 2007.11.14 12:36:00 -
[5]
Edited by: Matthew on 14/11/2007 12:44:47
Originally by: Dan Treva 1,000 entries or 30d which ever comes first.
Still not quite true. If there are more than 1,000 entries, you can use data walking, as described in the docs, to get back to previous blocks of 1,000 entries. However, it will only let you access a previous block if it contains a transaction up to one week old. Depending on where the "last record within the week" falls, you may get some older data at the end of the last page, just to make it up to 1,000 records, but you won't be able to page back any further than that.
Personally, I aim to download once every 6 days (just to ensure a bit of overlap so nothing falls through the cracks), walk back as far as it lets me, and let my DB handle removal of duplicates.
I've not encountered the 30d limit, but maybe that kicks in when you don't have more than 1,000 entries in 30 days (not a situation I've ever been in), as a stop to the query having to go back through the entire history for someone with very few transactions.
edit: just as a hint on how to do the data walking. Query the first page using something like http://api.eve-online.com/char/WalletJournal.csv.aspx?userID=wxy&apiKey=wxy&characterID=wxy
(I've replaced all my personal details with wxy), which is the link accessible from the My Character section. Then look at that first set of 1,000 records, find the lowest value of the refID field it contains, and add that to the URL. For example, if your lowest refID is 12345, you would edit your original URL to read:
http://api.eve-online.com/char/WalletJournal.csv.aspx?userID=wxy&apiKey=wxy&characterID=wxy&beforerefid=12345
which would give you the 1,000 transactions that came just before the 1,000 you downloaded first. Repeat this process until it gives you the error that you've already been returned a week of data.
If you're too lazy to search your download for the lowest refID, you can cheat a bit. If you deliberately go to the first URL a second time (before the cache timer expires and lets you get that first page again), it will give you an error message. Part of that message will give you the value it's expecting for beforeRefID.
You can do the same with the wallet transactions, you just need to use the lowest value of transID, and the name beforeTransID in the URL. ------- There is no magic Wand of Fixing, and it is not powered by forum whines. |
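Matthew's walking procedure above can be sketched in code. This is a minimal, illustrative sketch, not an official client: the function names (`lowest_ref_id`, `build_next_url`) and the `wxy` placeholders are made up for the example, and the CSV here is fake sample data. It only shows the bookkeeping (find the smallest refID on a page, append it as `beforerefid` to the next request); actually fetching each page and stopping on the "week of data" error is left out.

```python
import csv
import io

# Placeholder URL in the same shape as the one quoted above; wxy stands in
# for real credentials, exactly as in Matthew's post.
BASE_URL = ("http://api.eve-online.com/char/WalletJournal.csv.aspx"
            "?userID=wxy&apiKey=wxy&characterID=wxy")

def lowest_ref_id(csv_text):
    """Return the smallest refID found in one CSV page of journal entries."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return min(int(row["refID"]) for row in reader)

def build_next_url(base_url, ref_id):
    """Ask for the block of entries that came just before ref_id."""
    return f"{base_url}&beforerefid={ref_id}"

# Tiny fake page standing in for the first 1,000 downloaded records:
sample = "refID,amount\n12347,100.0\n12345,-50.0\n12346,25.0\n"
oldest = lowest_ref_id(sample)
next_url = build_next_url(BASE_URL, oldest)
```

For wallet transactions the same loop applies, but keyed on the lowest `transID` and passed as `beforeTransID`, as noted above.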

tornpain
|
Posted - 2007.11.14 18:22:00 -
[6]
I've been banging my head against how exactly this works, and your explanation filled in the blanks perfectly. Thanks, Matthew.
Kinda disappointed though -- as the data is essentially static, it would be nice to allow even a once-a-month d/l of a character/corp's entire wallet history.
|